06.04.2017, 18:21 (This post was last modified: 06.04.2017, 18:39 by buuuudzik.)
Thanks Erwin,
but I know about this API and I am using it, but it is very low-level: e.g. it has no timestamps, only a series of values from start to end. I've also tested that fetch does not work properly when you try to get data from a partial range, and fetchone has some bugs or is not as complete as the library from the old trends.
The other problem I've found with the current version of the trend API is that when I want to check only data from Monday 12:00 to Tuesday 7:00, I must download the full Monday and Tuesday and do the whole calculation on a not-so-small table in Lua. Also, when I try something like this (fetching only a part of a day):
Code:
require('trends')

name = 'Słońce_Azymut'
dates = {}
dates['start'] = { year = 2017, month = 4, day = 6, hour = 7 }
dates['end'] = { year = 2017, month = 4, day = 6, hour = 12 }

-- resolution (in seconds) is optional
-- default trend resolution is used when not set
resolution = 3600 -- hourly data

res, a, b = trends.fetch(name, dates, resolution)
log(res)
The result is an empty table. The only working way is to download data from day=6 to day=7 and then do the whole job in Lua.
For me it would be perfect if I could analyse trend data freely and reliably, so I could check every trend and find e.g. the yearly maximum, or how much time an object spent over a limit and exactly when, etc.
It would be perfect if this worked like SQL, because for me that is the best way to do such analysis.
I think a good option would also be a tool to convert the rrd db to SQL (though I can do this myself by fetching all data via the trend API, adding timestamps, saving it as .csv and then importing such a table into DB Browser).
06.04.2017, 22:00 (This post was last modified: 06.04.2017, 22:48 by Erwin van der Zwart.)
Hi Buuudzik,
Unfortunately that option is not available, but you can easily create a similar result, including timestamps, by using this:
Code:
require('trends')

name = 'Słońce_Azymut'
dates = {}
dates['start'] = { year = 2017, month = 4, day = 6 }
dates['end'] = { year = 2017, month = 4, day = 7 }

-- default trend resolution is used when not set
resolution = 3600 -- hourly data

res, a, b = trends.fetch(name, dates, resolution)

-- Create new table with timestamp and only values between 7 and 13
newtable = {}
for i, row in ipairs(res) do
  if (i - 1) >= 7 and (i - 1) <= 13 then
    table.insert(newtable, { time = (i - 1) .. ':00', value = row })
  end
end

log(newtable)
If your resolution is 5 minutes you can convert it to hourly data like this:
Code:
require('trends')

name = 'Słońce_Azymut'
dates = {}
dates['start'] = { year = 2017, month = 4, day = 6 }
dates['end'] = { year = 2017, month = 4, day = 7 }

-- fetch at the trend's 5-minute resolution (288 samples per day)
resolution = 300

res, a, b = trends.fetch(name, dates, resolution)

-- Average every 12 samples into one hourly value, keeping hours 7 to 13
newtable = {}
if #res == 288 then
  counter = 0
  total = 0
  for i, row in ipairs(res) do
    total = total + row
    counter = counter + 1
    if counter == 12 then
      if ((i / 12) - 1) >= 7 and ((i / 12) - 1) <= 13 then
        table.insert(newtable, { time = ((i / 12) - 1) .. ':00', value = (total / 12) })
      end
      total = 0
      counter = 0
    end
  end
end

log(newtable)
If your resolution is 10 minutes, change 288 to 144 and all 12s to 6; for 15-minute resolution change 288 to 96 and all 12s to 4; for 20-minute resolution change 288 to 72 and all 12s to 3; for 30-minute resolution change 288 to 48 and all 12s to 2.
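Instead of hand-editing those constants, the samples-per-hour factor can be derived from the resolution itself. A minimal sketch (my own generalisation, not part of the trends API), assuming `res` holds one full day of values fetched at `resolution` seconds:

```lua
-- Sketch: average any sub-hourly resolution up to hourly values.
-- Assumes res holds one full day of samples at `resolution` seconds
-- (e.g. 288 samples for resolution = 300).
function to_hourly(res, resolution)
  local perHour = 3600 / resolution -- e.g. 12 for 5-minute data
  local hourly = {}
  local total = 0
  for i, row in ipairs(res) do
    total = total + row
    if i % perHour == 0 then
      table.insert(hourly, { time = ((i / perHour) - 1) .. ':00', value = total / perHour })
      total = 0
    end
  end
  return hourly
end
```

The hour filter (7 to 13 in the example above) can then be applied to the returned table.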
-- This function prepares an export table or a .csv file from a defined range
function trend_export(name, dates, mode, timestampType)
  require('trends')

  -- detect a DST time shift between two timestamps
  function detectTimeShift(t0, t1)
    t0_isdst, t1_isdst = os.date('*t', t0).isdst, os.date('*t', t1).isdst
    if (t0_isdst ~= t1_isdst) then
      if (t1_isdst) then
        return 1 -- change to summertime
      else
        return 2 -- change to wintertime
      end
    else
      return false
    end
  end

  -- Convert keywords in "start" to a date table
  if type(dates['start']) == "string" then
    if dates['start'] == "now" then
      dates['start'] = os.date("*t", os.time())
    elseif dates['start'] == "yesterday" then
      dates['start'] = os.date("*t", os.time() - 24 * 3600)
    elseif dates['start'] == "week_ago" then
      dates['start'] = os.date("*t", os.time() - 7 * 24 * 3600)
    elseif dates['start'] == "month_ago" then
      dates['start'] = os.date("*t", os.time() - 30 * 24 * 3600)
    elseif dates['start'] == "year_ago" then
      dates['start'] = os.date("*t", os.time() - 365 * 24 * 3600)
    end
  end

  -- Convert keywords in "end" to a date table
  if type(dates['end']) == "string" then
    if dates['end'] == "now" then
      dates['end'] = os.date("*t", os.time())
    elseif dates['end'] == "yesterday" then
      dates['end'] = os.date("*t", os.time() - 24 * 3600)
    elseif dates['end'] == "week_ago" then
      dates['end'] = os.date("*t", os.time() - 7 * 24 * 3600)
    elseif dates['end'] == "month_ago" then
      dates['end'] = os.date("*t", os.time() - 30 * 24 * 3600)
    elseif dates['end'] == "year_ago" then
      dates['end'] = os.date("*t", os.time() - 365 * 24 * 3600)
    end
  end

  -- Start and end as unix time
  starttime = os.time(dates['start'])
  endtime = os.time(dates['end'])

  -- Swap "start" and "end" when "end" is smaller than "start"
  if starttime > endtime then
    _ = dates['start']
    dates['start'] = dates['end']
    dates['end'] = _
  end

  -- Read trend resolution from DB
  resolution = db:getlist("SELECT resolution*60 FROM trends WHERE name='" .. name .. "'")[1]

  -- Prepare a table which includes the whole data from the specified range
  if resolution then
    -- Expand the range with one more day if there is some data in the last day
    if (os.date("%H", endtime) * 3600 + os.date("%M", endtime) * 60) >= resolution then
      dates['end'].day = dates['end'].day + 1
      raw_data = trends.fetch(name, dates, resolution)
    else
      raw_data = trends.fetch(name, dates, resolution)
    end

    start = 1
    data = {}
    start_timestamp = os.time(dates['start']) + resolution

    -- Cut the data and keep only the records from the specified range
    offset = 0
    skip = 0
    k = 1
    while k <= #raw_data do
      v = raw_data[k]
      if k == 1 then
        last_timestamp = start_timestamp
      else
        last_timestamp = current_timestamp
      end
      current_timestamp = start_timestamp + (k - 1) * resolution + offset
      timeShiftStatus = detectTimeShift(last_timestamp, current_timestamp)
      if timeShiftStatus then
        if timeShiftStatus == 1 then -- winter -> summer
          skip = 6 -- skip 1 hour of empty data
          offset = offset - 3600
        elseif timeShiftStatus == 2 then -- summer -> winter
          offset = offset + 3600
        end
      end
      -- Add only records from the specified range to the new table
      if current_timestamp >= starttime and current_timestamp <= endtime then
        if timestampType == 'unixepoch' then
          data[start] = { current_timestamp, v }
        else
          data[start] = { os.date('%Y.%m.%d %H:%M:%S', current_timestamp), v }
        end
        start = start + 1
      end
      k = k + 1 + skip
      skip = 0
    end

    if mode == 'csv' then
      -- Prepare .csv file
      for k, v in ipairs(data) do
        data[k] = table.concat(data[k], ",")
      end
      csv = "timestamp, value\r\n" .. table.concat(data, "\r\n")
      return name, csv, resolution
    elseif mode == 'table' then
      return name, data, resolution
    end
  else
    alert('There is no trend with such a name')
  end
end
Function:
trend_export(name, dates, mode, timestampType)
Function parameters:
name - trend name (must be the same as in the DB)
dates - a table with: year, month, day, hour, or a string ("now", "yesterday", "week_ago", "month_ago", "year_ago")
mode - 'table' for a Lua table, 'csv' for e.g. sending a mail and using it later in Excel
timestampType - 'unixepoch' for a unix timestamp, or false to use a local human-readable timestamp like '2017.03.26 07:20:00'
Function returns:
name - name of the trend
data - csv string or Lua table, based on the mode parameter
resolution - trend resolution in seconds, e.g. 300 means 300/60 = 5 minutes
require('user.trends_pro')

-- Export data
name = 'Kitchen_Temperature'
dates = {}
dates['start'] = { year = 2016, month = 4, day = 5, hour = 0 }
dates['end'] = "now"

name, csv = trend_export(name, dates, 'csv', 'unixepoch')

-- Send an email with data
attachement = { { filename = name .. '.csv', filedata = csv, mimetype = 'text/csv' } }
mailattach('someone@example.com', 'Report ' .. name, 'CSV file attached', attachement)

-- Calculate maximum value in specified period and log it
name, data = trend_export(name, dates, 'table')

minValue, maxValue = -50, 100
curMax = nil
occurrences = {}

for k, v in ipairs(data) do
  curValue = v[2]
  if not curMax then
    curMax = curValue
    table.insert(occurrences, v)
  elseif curValue > minValue and curValue > curMax and curValue < maxValue then
    curMax = curValue
    occurrences = {}
    table.insert(occurrences, v)
  elseif curValue == curMax then
    table.insert(occurrences, v)
  end
end

max = curMax

if #occurrences == 0 then
  log('There were no occurrences. Probably there was no data in the table.')
else
  message = 'Maximum value of ' .. name .. ' equals ' .. max .. ' and there was ' .. #occurrences .. ' occurrence'
  if #occurrences > 1 then
    message = message .. 's.'
  else
    message = message .. '.'
  end
  log(message, occurrences)
end
After export it is possible to import the .csv file into e.g. DB Browser for SQLite and then use SQL commands freely to analyse the data, to find e.g.:
- how many days in the year were sunny enough (temp > 20 and brightness > 70 klux), and which month was the best for a holiday?
- what would be the best angle for PV based on the last year?
- when was there no internet in the house?
- how many times was the current temperature very different from the setpoint, and what was the reason?
- how many times in the last year were there people in the house?
- how many people were in the house daily in the last year, based on doors (door openings / 2) or sensors (2 presence sensors in the hallway)?
and others.
I've updated the above library: it now corrects the data after a time shift and also gives the possibility to select the type of timestamp, unixepoch or a more human-readable one.
But unfortunately the timestamp is not human readable.
I've updated the above library a few times yesterday, so maybe you don't have the last version; please also check the last version of the examples. Changing the return values was one of the last changes. Now I'm thinking about also adding some filter function which can be created by the user, e.g. check if a value is from a Monday, and if so, find the min value from all Mondays in the specified period.
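Such a filter could be sketched as a user-supplied callback applied to the exported table. This is only an illustration of the idea, not part of the library; it assumes `data` holds `{unix_timestamp, value}` pairs as returned by `trend_export` with timestampType = 'unixepoch':

```lua
-- Sketch: find the minimum value among all samples matching a user filter.
-- Assumes rows are { unix_timestamp, value } pairs (timestampType = 'unixepoch').
function filtered_min(data, filter)
  local min = nil
  for _, row in ipairs(data) do
    if filter(row[1], row[2]) then
      if not min or row[2] < min then
        min = row[2]
      end
    end
  end
  return min
end

-- Example filter: keep only samples from Mondays
isMonday = function(ts, value)
  return os.date('*t', ts).wday == 2 -- wday: 1 = Sunday, 2 = Monday
end

-- mondayMin = filtered_min(data, isMonday)
```

Any predicate on timestamp and value works the same way, so the same helper covers limits, weekdays, or hour ranges.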
Yes, because you've specified 'unixepoch'. Delete it and you will see the human-readable format.
And how are you using it? Unfortunately the demo LM from openrb.com is currently unavailable, so I cannot show you this on a real LM.
Sometimes there are issues with reloading a library, so please clear the library, clear and disable the script, and after all of these tasks try using the library and script again, because it works for sure, also with the human-readable timestamp.