massive data logging - Printable Version

+- Logic Machine Forum (https://forum.logicmachine.net)
+-- Forum: LogicMachine eco-system (https://forum.logicmachine.net/forumdisplay.php?fid=1)
+--- Forum: Gateway (https://forum.logicmachine.net/forumdisplay.php?fid=10)
+--- Thread: massive data logging (/showthread.php?tid=4217)
massive data logging - gdimaria - 05.09.2022

Hi,
I need to trend-log about 250-300 KNX objects. I know that's beyond the LM's capabilities due to the 21 MB archive limit. Can you suggest an appropriate device for this job?
I already know the BAB TECHNOLOGIE GmbH DATALOGGER V2 (https://bab-technologie.com/datalogger-v2/?lang=en); it could be OK. But I trust your experience and would appreciate a suggestion.
Thanks
Peppe

RE: massive data logging - admin - 06.09.2022

With low resolution / retention settings you could fit 300 trends into the LM. An alternative solution is to use Grafana - either cloud-hosted or installed locally on a small PC. Installation can be a bit tricky, but it's a very flexible solution for storing time-series data.

RE: massive data logging - gdimaria - 07.09.2022

Grafana seems very cool! And how do I connect the LM to it - by sending telegrams via remote access?

RE: massive data logging - admin - 07.09.2022

The LM sends data to InfluxDB over HTTP. This thread has working examples: https://forum.logicmachine.net/showthread.php?tid=1531
Another option is MQTT: https://grafana.com/blog/2021/08/12/streaming-real-time-sensor-data-to-grafana-using-mqtt-and-grafana-live/

RE: massive data logging - gdimaria - 14.09.2022

(07.09.2022, 07:47)admin Wrote: The LM sends data to InfluxDB over HTTP. This thread has working examples: https://forum.logicmachine.net/showthread.php?tid=1531

Things have changed a little on the InfluxDB side: https://docs.influxdata.com/influxdb/cloud/organizations/buckets/create-bucket/#create-a-bucket-using-the-influxdb-api
It seems a little too complicated to me... is there no shorter way to implement massive data logging?
Peppe

RE: massive data logging - admin - 14.09.2022

You only need to create a bucket once. It will store your data for a certain amount of time. There are examples of sending data to InfluxDB v2 in the topic I linked to.
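For reference, the one-time bucket creation from the linked InfluxDB docs can also be done straight from an LM script instead of the UI. A minimal sketch, assuming a self-hosted InfluxDB v2 instance; the host, token, org ID, bucket name and retention below are placeholders, not values from this thread:

```lua
-- One-time creation of an InfluxDB v2 bucket via the /api/v2/buckets endpoint.
-- Host, token, orgID, bucket name and retention are placeholders - adjust to your setup.
local https = require('ssl.https')
local ltn12 = require('ltn12')
local json = require('json')

local token = 'YOUR_API_TOKEN' -- API token generated in the Influx UI

local body = json.encode({
  orgID = 'YOUR_ORG_ID',
  name = 'lm-trends',
  retentionRules = {
    { type = 'expire', everySeconds = 30 * 24 * 3600 } -- keep data for 30 days
  }
})

local resp = {}
local res, code = https.request({
  url = 'https://influx.local:8086/api/v2/buckets',
  method = 'POST',
  headers = {
    ['Authorization'] = 'Token ' .. token,
    ['Content-Type'] = 'application/json',
    ['Content-Length'] = #body,
  },
  source = ltn12.source.string(body),
  sink = ltn12.sink.table(resp),
})

-- 201 means the bucket was created; this only has to succeed once
log(code, table.concat(resp))
```

Run it once from Scripting > Tools, check the log for a 201 response, then delete or disable the script.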
The setup part is a bit complicated, but you only have to do it once.

RE: massive data logging - gdimaria - 15.09.2022

(14.09.2022, 11:33)admin Wrote: You only need to create a bucket once. It will store your data for a certain amount of time. There are examples of sending data to InfluxDB v2 in the topic I linked to.

I accepted the challenge! But right now I get error messages:

* arg: 1
* string: error sending to influx
* arg: 2
* string: {"code":"invalid","message":"at line 1:36: cannot parse value for field key \"value\": invalid float value syntax (check rejected_points in your _monitoring bucket for further information)"}
* arg: 3
* number: 400
* arg: 4
* string: rawdata,name=ora,addr=33/1/9 value=10:57

and

* arg: 1
* string: error sending to influx
* arg: 2
* string: {"code":"invalid","message":"at line 1:47: cannot parse value for field key \"value\": invalid bool value \"Thu\" (check rejected_points in your _monitoring bucket for further information)"}
* arg: 3
* number: 400
* arg: 4
* string: rawdata,name=giornomeseanno,addr=33/1/8 value=Thu 15-09-2022

The script I used is the following - can you help me?

Code: local socket = require('socket')

RE: massive data logging - admin - 15.09.2022

You're sending string data (10:57 and Thu 15-09-2022), which Influx does not understand. String values must be enclosed in quotes. Try replacing lines 35..42 with this:

Code: local body

Currently this script sends values for all objects. Some filtering should be implemented so that only the required data is pushed.

RE: massive data logging - gdimaria - 15.09.2022

(15.09.2022, 09:23)admin Wrote: You're sending string data (10:57 and Thu 15-09-2022), which Influx does not understand. String values must be enclosed in quotes.

OK, now it's collecting data... and yes, I need to filter it. I guess I have to modify the function knx_callback(event), but my object addresses are not in a specific range, so I need to implement a sort of list of addresses to send.
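The quoting rule admin describes can be captured in a small helper: numbers and booleans go into the line protocol bare, everything else is written as a quoted string with embedded quotes and backslashes escaped. A minimal sketch - the fmtvalue name is my own, not from the original script:

```lua
-- Format a KNX object value for the InfluxDB line protocol:
-- numbers and booleans are written bare, everything else as a quoted string.
local function fmtvalue(value)
  local vtype = type(value)
  if vtype == 'number' then
    return tostring(value)
  elseif vtype == 'boolean' then
    return value and 'true' or 'false'
  else
    -- escape embedded backslashes and double quotes, then wrap in quotes
    local str = tostring(value):gsub('([\\"])', '\\%1')
    return '"' .. str .. '"'
  end
end

-- With this, the two rejected points from above become valid:
--   rawdata,name=ora,addr=33/1/9 value="10:57"
--   rawdata,name=giornomeseanno,addr=33/1/8 value="Thu 15-09-2022"
```

The error messages quote the rejected line verbatim, so the unquoted `value=10:57` and `value=Thu 15-09-2022` payloads were the giveaway.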
RE: massive data logging - admin - 15.09.2022

This example will push only objects that have the influx tag attached.

Code: local http = require('socket.http')

RE: massive data logging - gdimaria - 16.09.2022

(15.09.2022, 12:55)admin Wrote: This example will push only objects that have the influx tag attached.

Everything is fine (although I still have to learn how to handle the data to create a decent report). Now for the final step: connecting InfluxDB to Grafana. I don't know how to choose the right authentication method. Where do I put the token and org name?

RE: massive data logging - admin - 19.09.2022

If you switch to the Flux query language, you can specify the access token, org and bucket. These might be helpful:
https://medium.com/@nanditasahu031/integration-of-influxdb2-with-grafana-28b4aebb3368
https://blog.devgenius.io/grafana-influxdb-as-sql-data-visualization-561e89b207
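Admin's tag-based approach can be sketched roughly as follows: fetch every group object tagged influx via the LM grp API and push them to the v2 write endpoint in one line-protocol batch. This is a sketch under assumptions, not admin's actual script - the host, org, bucket and token are placeholders, and the rawdata measurement with name/addr tags simply mirrors the error messages earlier in the thread:

```lua
-- Sketch: push only objects carrying the 'influx' tag to InfluxDB v2.
-- Host, org, bucket and token below are placeholders - adjust to your setup.
local https = require('ssl.https')
local ltn12 = require('ltn12')

local url = 'https://influx.local:8086/api/v2/write?org=YOUR_ORG&bucket=lm-trends&precision=s'
local token = 'YOUR_API_TOKEN'

local lines = {}

-- grp.tag returns all group objects carrying the given tag
for _, obj in ipairs(grp.tag('influx')) do
  local value = obj.data -- decoded object value
  if type(value) == 'number' or type(value) == 'boolean' then
    value = tostring(value)
  else
    -- strings must be quoted in the line protocol
    value = '"' .. tostring(value):gsub('([\\"])', '\\%1') .. '"'
  end
  -- measurement,tag_set field_set; spaces in tag values must be escaped
  lines[#lines + 1] = 'rawdata,name=' .. obj.name:gsub('%s', '\\ ') ..
    ',addr=' .. obj.address .. ' value=' .. value
end

if #lines > 0 then
  local body = table.concat(lines, '\n')
  local _, code = https.request({
    url = url,
    method = 'POST',
    headers = {
      ['Authorization'] = 'Token ' .. token,
      ['Content-Length'] = #body,
    },
    source = ltn12.source.string(body),
    sink = ltn12.sink.null(),
  })
  if code ~= 204 then
    log('error sending to influx', code)
  end
end
```

InfluxDB answers 204 No Content on a successful write. Run periodically as a resident/scheduled script, or adapt the loop body for an event script that pushes a single object on change.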