Posts: 42 | Threads: 11 | Joined: Apr 2018 | Reputation: 1
	
	
Hi.
I need help with a scheduled script.
I want to read 4-byte floating point values through a tag or multiple tags and store the values on the local LM FTP in a CSV file.

The value from each object should contain: Date, Time, Name, Value and unit/suffix.

Is this possible?
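A minimal sketch of the general idea: a scheduled script that reads every object carrying a chosen tag and appends one CSV row per object. The tag name and file path are placeholders, and it assumes grp.tag returns the tagged objects (grp.getvalue and grp.alias are used the same way as later in this thread):

Code:
-- sketch: read all objects with a given tag and append one CSV row per object
tagname = 'energy'                      -- placeholder tag
csvpath = '/home/ftp/meters.csv'        -- placeholder file on the local FTP

rows = {}
for _, object in ipairs(grp.tag(tagname)) do
  value = grp.getvalue(object.address)
  logdate = os.date('%Y.%m.%d %H:%M:%S')
  rows[ #rows + 1 ] = string.format('%q,%q,%q',
    logdate, grp.alias(object.address), tostring(value))
end

if #rows > 0 then
  file = io.open(csvpath, 'a')
  if file then
    file:write(table.concat(rows, '\r\n') .. '\r\n')
    file:close()
  end
end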
 
		
	 
	
	
	
		
	
	
	
		
Posts: 42 | Threads: 11 | Joined: Apr 2018 | Reputation: 1
	
	
I was thinking of a script like this, but it only works in an event script.
 
 -- Write data to a file in the local FTP
 
 value = event.getvalue()
 
 
 logdate = os.date('%Y.%m.%d %H:%M:%S', os.time())
 csv = string.format('%q,%q,%q,%q\r\n', logdate, event.dst, grp.alias(event.dst), tostring(value))
 
 log(csv)
 
 
 file, error = io.open("home/ftp/outdoor_temp1.txt","a")
 log(file,error)
 file:write(csv)
 file:close()
 
		
	 
	
	
	
		
Posts: 8413 | Threads: 45 | Joined: Jun 2015 | Reputation: 481
	
	
You need to provide an absolute path:
Code:
file, error = io.open("/home/ftp/outdoor_temp1.txt","a")
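A small hedged addition: io.open returns nil plus an error message when the path is wrong, so it is safer to check the result before writing:

Code:
file, err = io.open('/home/ftp/outdoor_temp1.txt', 'a')
if file then
  file:write(csv)
  file:close()
else
  -- io.open failed: wrong path, missing directory, read-only filesystem, ...
  log('cannot open file: ' .. tostring(err))
end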
		
	 
	
	
	
		
Posts: 42 | Threads: 11 | Joined: Apr 2018 | Reputation: 1
	
	
Hi.
I got this script to work, but I have two questions.

Is it possible to skip the past-hour data and get the actual time instead?

How do I read the unit/suffix and add it to the result?

Result:
-- Date, Time, Address, Name, Value
"2018.10.04 07:34:10","33/1/3","Meter 2","1495"
 
 --------Script write to FTP----------
 
-- get data from the past 10 seconds
logtime = os.time() - 10 * 1
 
 -- list of objects by id
 objects = {'Mätare'}
 
 -- objects with logging enabled
 query = 'SELECT address, datatype, name FROM objects WHERE disablelog=0'
 for _, object in ipairs(db:getall(query)) do
 objects[ tonumber(object.address) ] = {
 datatype = tonumber(object.datatype),
 name = tostring(object.name or ''),
 }
 end
 
 -- csv buffer
 buffer = { }
 
 -- get object logs
 query = 'SELECT src, address, datahex, logtime, eventtype FROM objectlog WHERE logtime >= ? ORDER BY id DESC'
 for _, row in ipairs(db:getall(query, logtime)) do
 object = objects[ tonumber(row.address) ]
 
 -- found matching object and event type is group write
 if object and row.eventtype == 'write' then
 datatype = object.datatype
 
 -- check that object datatype is set
 if datatype then
 -- decode data
 data = knxdatatype.decode(row.datahex, datatype)
 
 -- remove null chars from char/string datatype
 if datatype == dt.char or datatype == dt.string then
 data = data:gsub('%z+', '')
 -- date to DD.MM.YYYY
 elseif datatype == dt.date then
 data = string.format('%.2d.%.2d.%.2d', data.day, data.month, data.year)
 -- time to HH:MM:SS
 elseif datatype == dt.time then
 data = string.format('%.2d:%.2d:%.2d', data.hour, data.minute, data.second)
 end
 else
 data = ''
 end
 
 -- format csv row
 logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)
 csv = string.format('\r%q,%q,%q,%q', logdate, knxlib.decodega(row.address), object.name, tostring(data))
 
 -- add to buffer
 table.insert(buffer, csv)
 end
 end
 
 ftpfile = string.format('/home/ftp/%s.csv', os.date('%Y-%m-%d_%H-%M'))
 
 if #buffer > 1 then
 data = table.concat(buffer, '\r\n')
 io.writefile(ftpfile, data)
 end
 log(data)
 
		
	 
	
	
	
		
Posts: 8413 | Threads: 45 | Joined: Jun 2015 | Reputation: 481
	
	
		What do you mean by the actual time? 
You can get units like this, then use object.units when generating CSV:
Code:
query = 'SELECT address, datatype, name, units FROM objects WHERE disablelog=0'

for _, object in ipairs(db:getall(query)) do
  objects[ tonumber(object.address) ] = {
    datatype = tonumber(object.datatype),
    name = tostring(object.name or ''),
    units = tostring(object.units or ''),
  }
end
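To connect this to the unit/suffix question: with the units column added as above, the CSV row inside the loop could carry it as an extra field, for example:

Code:
-- format csv row, now including the unit/suffix from the objects table
logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)
csv = string.format('%q,%q,%q,%q,%q',
  logdate, knxlib.decodega(row.address), object.name, tostring(data), object.units)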
		
	 
	
	
	
		
Posts: 42 | Threads: 11 | Joined: Apr 2018 | Reputation: 1
	
	
Could I get this script to work as a resident script?
 --Write data to a file in the local FTP
 
 value = event.getvalue()
 
 
 logdate = os.date('%Y.%m.%d %H:%M:%S', os.time())
 csv = string.format('%q,%q,%q,%q\r\n', logdate, event.dst, grp.alias(event.dst), tostring(value))
 
 log(csv)
 
 
 file, error = io.open("home/ftp/outdoor_temp1.txt","a")
 log(file,error)
 file:write(csv)
 file:close()
 
		
	 
	
	
	
		
Posts: 5284 | Threads: 29 | Joined: Aug 2017 | Reputation: 237
	
	
Yes, you only need to change value = event.getvalue() to value = grp.getvalue('1/1/1').
This will work, but if you write to the FTP (which is on the SD card) every 60 s, you will wear it out quite fast.
 
 
		
	 
	
	
	
		
Posts: 8413 | Threads: 45 | Joined: Jun 2015 | Reputation: 481
	
	
The path is incorrect, it must start with "/", so it should be /home/ftp/outdoor_temp1.txt. Another issue is that you are constantly appending data to a file. If this is done very often, the file can grow quickly and fill the whole partition. And, as Daniel mentioned, constant writes can degrade the SD card.
If you want to send data to external sources, look into MQTT. If you want to log data externally, InfluxDB can be used (it has a very simple integration with LM).
 
		
	 
	
	
	
		
Posts: 42 | Threads: 11 | Joined: Apr 2018 | Reputation: 1
	
		
		
		
	 
I am going to use a scheduled script. Every 6 hours it should read the meters, store the result on the FTP and then send a mail. Then I want to delete the file after the email is sent.

I tried this script and made the changes, but I got this error:
 
 Error log:
 test 11.01.2021 17:16:00
 User script:7: attempt to index global 'event' (a nil value)
 stack traceback:
 User script:7: in main chunk
 
 --Write data to a file in the local FTP
 
 value = grp.getvalue('16/1/1')
 
 
 logdate = os.date('%Y.%m.%d %H:%M:%S', os.time())
 csv = string.format('%q,%q,%q,%q\r\n', logdate, event.dst, grp.alias(event.dst), tostring(value))
 
 log(csv)
 
 
 file, error = io.open("/home/ftp/outdoor_temp1.txt","a")
 log(file,error)
 file:write(csv)
 file:close()
 
		
	 
	
	
	
		
Posts: 8413 | Threads: 45 | Joined: Jun 2015 | Reputation: 481
	
	
Replace event.dst in your script with '16/1/1'; the event variable is not defined in resident and scheduled scripts.
If you want to send data as an attachment then you don't need to write it to file at all. See this example: https://forum.logicmachine.net/showthread.php?tid=394 
If you need to write some temporary data to a file then use tmp storage (for example "/tmp/myfile.txt"). tmp storage uses RAM so it does not degrade the SD card.
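Putting those points together, a minimal corrected version of the scheduled script might look like this (group address and file name taken from the posts above, with /tmp used instead of /home/ftp):

Code:
-- scheduled script: there is no event table, so read the object directly
addr = '16/1/1'
value = grp.getvalue(addr)

logdate = os.date('%Y.%m.%d %H:%M:%S', os.time())
csv = string.format('%q,%q,%q,%q\r\n', logdate, addr, grp.alias(addr), tostring(value))
log(csv)

-- tmp storage uses RAM, so repeated writes do not wear the SD card
file = io.open('/tmp/outdoor_temp1.txt', 'a')
if file then
  file:write(csv)
  file:close()
end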
	
		
	 
	
	
	
		
Posts: 42 | Threads: 11 | Joined: Apr 2018 | Reputation: 1
	
	
Instead of getting the past-hour data, I'd like to just read every address with the tag "energy".
Code:
-- Mail attachment
subject = 'Energy Values'
--to = 'mail@mail.com'
to = 'mail@mail.com'
 
 
 -- get past hour data (3600 seconds)
 logtime = os.time() - 60 * 60
 
 -- list of objects by id
 objects = {'Energy'}
 
 -- objects with logging enabled
 query = 'SELECT address, datatype, name FROM objects WHERE disablelog=0'
 for _, object in ipairs(db:getall(query)) do
 objects[ tonumber(object.address) ] = {
 datatype = tonumber(object.datatype),
 name = tostring(object.name or ''),
 }
 end
 
 -- csv buffer
 buffer = { '"date","address","name","value"' }
 
 -- get object logs
 query = 'SELECT src, address, datahex, logtime, eventtype FROM objectlog WHERE logtime >= ? ORDER BY id DESC'
 for _, row in ipairs(db:getall(query, logtime)) do
 object = objects[ tonumber(row.address) ]
 
 -- found matching object and event type is group write
 if object and row.eventtype == 'write' then
 datatype = object.datatype
 
 -- check that object datatype is set
 if datatype then
 -- decode data
 data = knxdatatype.decode(row.datahex, datatype)
 
 -- remove null chars from char/string datatype
 if datatype == dt.char or datatype == dt.string then
 data = data:gsub('%z+', '')
 -- date to DD.MM.YYYY
 elseif datatype == dt.date then
 data = string.format('%.2d.%.2d.%.2d', data.day, data.month, data.year)
 -- time to HH:MM:SS
 elseif datatype == dt.time then
 data = string.format('%.2d:%.2d:%.2d', data.hour, data.minute, data.second)
 end
 else
 data = ''
 end
 
 -- format csv row
 logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)
 csv = string.format('%q,%q,%q,%q', logdate, knxlib.decodega(row.address), object.name, tostring(data))
 
 -- add to buffer
 table.insert(buffer, csv)
 end
 end
 
 -- transform table to string for csv file
 data1 = table.concat(buffer, '\r\n')
 
-- make sure these settings are correct
 local settings = {
 -- "from" field, only e-mail must be specified here
from = 'mail@mail.com',
-- smtp username
user = 'mail@mail.com',
 -- smtp password
 password = 'Password',
 -- smtp server
 server = 'smtp.gmail.com',
 -- smtp server port
 port = 465,
 -- enable ssl, required for gmail smtp
 secure = 'sslv23',
 }
 
 mime = require("mime")
 ltn12 = require("ltn12")
 local smtp = require('socket.smtp')
 local escape = function(v)
 return '<' .. tostring(v) .. '>'
 end
 
 settings.from = escape(settings.from)
 settings.rcpt = { escape(to) }
 
 settings.source = smtp.message{
 headers = {
 to = to,
 subject = subject,
 },
 body = {
preamble = "This email contains 1 attachment.",
[1] = {
body = mime.eol(0, [["This message is automatically sent, do not reply to this e-mail."]])
 },
 [2] = {
 headers = {
 ["content-type"] = 'text/plain; name="Objectlog.csv"',
 ["content-disposition"] = 'attachment; filename="Objectlog.csv"',
 },
 body = data1
 },
 epilogue = "End of mail"
 }
 }
 
 r, e = smtp.send(settings)
 
 if (e) then
 io.stderr:write("Could not send email: ", e, "\n")
 end
		
	 
	
	
	
		
Posts: 42 | Threads: 11 | Joined: Apr 2018 | Reputation: 1
	
	
I have the function working now. Is it possible to skip the address in the CSV file?
 I just want Date, Time, Name and Value
 
2021.01.13 18:38:17,"16/1/3","Mät 3","3400502" 
2021.01.13 18:38:12,"16/1/2","Mät 2","200302" 
2021.01.13 18:38:08,"16/1/1","Mät 1","10202" 
Br
Code:
-- recipient and subject (placeholders; these were defined in the previous version of the script)
subject = 'Energy Values'
to = 'gmail@gmail.com'

-- get past hour data (3600 seconds)
logtime = os.time() - 60 * 60
 
 -- list of objects by id
 objects = {'Mätare'}
 
 -- objects with logging enabled
 query = 'SELECT address, datatype, name FROM objects WHERE disablelog=0'
 for _, object in ipairs(db:getall(query)) do
 objects[ tonumber(object.address) ] = {
 datatype = tonumber(object.datatype),
 name = tostring(object.name or ''),
 }
 end
 
 -- csv buffer
buffer = { '"date","address","name","value"' }
 
 -- get object logs
 query = 'SELECT src, address, datahex, logtime, eventtype FROM objectlog WHERE logtime >= ? ORDER BY id DESC'
 for _, row in ipairs(db:getall(query, logtime)) do
 object = objects[ tonumber(row.address) ]
 
 -- found matching object and event type is group write
 if object and row.eventtype == 'write' then
 datatype = object.datatype
 
 -- check that object datatype is set
 if datatype then
 -- decode data
 data = knxdatatype.decode(row.datahex, datatype)
 
 -- remove null chars from char/string datatype
 if datatype == dt.char or datatype == dt.string then
 data = data:gsub('%z+', '')
 -- date to DD.MM.YYYY
 elseif datatype == dt.date then
 data = string.format('%.2d.%.2d.%.2d', data.day, data.month, data.year)
 -- time to HH:MM:SS
 elseif datatype == dt.time then
 data = string.format('%.2d:%.2d:%.2d', data.hour, data.minute, data.second)
 end
 else
 data = ''
 end
 
 -- format csv row
 logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)
 csv = string.format('%q,%q,%q,%q', logdate, knxlib.decodega(row.address), object.name, tostring(data))
 
 -- add to buffer
 table.insert(buffer, csv)
 end
 end
 
 -- transform table to string for csv file
 data1 = table.concat(buffer, '\r\n')
 
-- make sure these settings are correct
 local settings = {
 -- "from" field, only e-mail must be specified here
 from = 'gmail@gmail.com',
 -- smtp username
 user = 'gmail@gmail.com',
 -- smtp password
 password = 'Password!',
 -- smtp server
 server = 'smtp.gmail.com',
 -- smtp server port
 port = 465,
 -- enable ssl, required for gmail smtp
 secure = 'sslv23',
 }
 
 mime = require("mime")
 ltn12 = require("ltn12")
 local smtp = require('socket.smtp')
 local escape = function(v)
 return '<' .. tostring(v) .. '>'
 end
 
 settings.from = escape(settings.from)
 settings.rcpt = { escape(to) }
 
 settings.source = smtp.message{
 headers = {
 to = to,
 subject = subject,
 },
 body = {
preamble = "This email contains 1 attachment.",
[1] = {
body = mime.eol(0, [["This message is automatically sent, do not reply to this e-mail."]])
 },
 [2] = {
 headers = {
 ["content-type"] = 'text/plain; name="Objectlog.csv"',
 ["content-disposition"] = 'attachment; filename="Objectlog.csv"',
 },
 body = data1
 },
 epilogue = "End of mail"
 }
 }
 
 r, e = smtp.send(settings)
 
 if (e) then
 io.stderr:write("Could not send email: ", e, "\n")
 end
		
	 
	
	
	
		
Posts: 8413 | Threads: 45 | Joined: Jun 2015 | Reputation: 481
	
	
Use this to generate CSV for a certain tag and a certain time:
Code:
tagname = 'log'
logtime = os.time() - 60 * 60

query = [[
  SELECT ol.*, o.name, o.datatype
  FROM objectlog ol
  JOIN objects o ON ol.address=o.id
  JOIN objecttags t ON o.id=t.object
  WHERE t.tag=? AND logtime>=?
  ORDER BY id DESC
]]

buffer = { '"date","name","address","value"' }

items = db:getall(query, tagname, logtime)
for _, item in ipairs(items) do
  id = tonumber(item.address) or 0
  logdate = os.date('%Y.%m.%d %H:%M:%S', math.floor(item.logtime))
  etype = item.eventtype

  if (etype == 'write' or etype == 'response') and item.datatype then
    value = grp.decodevalue(item.datahex, item.datatype)
  else
    value = ''
  end

  buffer[ #buffer + 1 ] = string.format('%q,%q,%q,%q',
    logdate, item.name or '', buslib.decodega(id), tostring(value))
end

csv = table.concat(buffer, '\n')
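If the result should also end up as a file (for the FTP or as a mail attachment), one hedged option is to write it to tmp storage as suggested earlier, with a dated file name:

Code:
-- write the generated CSV to tmp storage (RAM) under a dated file name
tmpfile = string.format('/tmp/%s.csv', os.date('%Y-%m-%d_%H-%M'))
io.writefile(tmpfile, csv)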
		
	 
	
	
	
		
Posts: 42 | Threads: 11 | Joined: Apr 2018 | Reputation: 1
	
	
Hi.
I still get the group address in the CSV file.
 
		
	 
	
	
	
		
Posts: 8413 | Threads: 45 | Joined: Jun 2015 | Reputation: 481
	
	
Like this:
Code:
tagname = 'log'
logtime = os.time() - 60 * 60

query = [[
  SELECT ol.*, o.name, o.datatype
  FROM objectlog ol
  JOIN objects o ON ol.address=o.id
  JOIN objecttags t ON o.id=t.object
  WHERE t.tag=? AND logtime>=?
  ORDER BY id DESC
]]

buffer = { '"date","name","value"' }

items = db:getall(query, tagname, logtime)
for _, item in ipairs(items) do
  id = tonumber(item.address) or 0
  logdate = os.date('%Y.%m.%d %H:%M:%S', math.floor(item.logtime))
  etype = item.eventtype

  if (etype == 'write' or etype == 'response') and item.datatype then
    value = grp.decodevalue(item.datahex, item.datatype)
  else
    value = ''
  end

  buffer[ #buffer + 1 ] = string.format('%q,%q,%q',
    logdate, item.name or '', tostring(value))
end

csv = table.concat(buffer, '\n')
		
	 
	
	
	
		
Posts: 42 | Threads: 11 | Joined: Apr 2018 | Reputation: 1
	
	
Hi.
And if I want to e-mail this CSV file?
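A hedged sketch, reusing the socket.smtp pattern posted earlier in this thread: the csv string from the script above can be attached directly (recipient, sender and password below are placeholders):

Code:
-- mail the csv string as an attachment
subject = 'Energy Values'
to = 'recipient@example.com'        -- placeholder recipient

local settings = {
  from = 'sender@example.com',      -- placeholder sender
  user = 'sender@example.com',
  password = 'Password',
  server = 'smtp.gmail.com',
  port = 465,
  secure = 'sslv23',
}

mime = require('mime')
local smtp = require('socket.smtp')

local escape = function(v)
  return '<' .. tostring(v) .. '>'
end

settings.from = escape(settings.from)
settings.rcpt = { escape(to) }

settings.source = smtp.message{
  headers = {
    to = to,
    subject = subject,
  },
  body = {
    preamble = 'This email contains 1 attachment.',
    [1] = {
      body = mime.eol(0, 'CSV export attached.')
    },
    [2] = {
      headers = {
        ['content-type'] = 'text/plain; name="Objectlog.csv"',
        ['content-disposition'] = 'attachment; filename="Objectlog.csv"',
      },
      body = csv
    },
    epilogue = 'End of mail'
  }
}

r, e = smtp.send(settings)

if e then
  log('Could not send email: ' .. tostring(e))
end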
 
		
	 
	
	
	
		
	
	
	
		
Posts: 42 | Threads: 11 | Joined: Apr 2018 | Reputation: 1
	
	
		Hi.
 In the csv file it looks like this:
 
 "date","address","name","value"
 "2021.02.02 05:00:07","1/6/17","N13A33","11491"
 "2021.02.02 05:00:07","1/6/16","N10A25","12894"
 
Could I change it so it looks like this instead?
 
 date;name;value
 2021-02-02;4400A1FS6B;591616,24
 2021-02-02;4400A1FS6C;290030,96
 
		
	 
	
	
	
		
Posts: 8413 | Threads: 45 | Joined: Jun 2015 | Reputation: 481
	
	
This should produce what you need. Are you sure that you don't need the time to be included?
Code:
tagname = 'log'
logtime = os.time() - 60 * 60

query = [[
  SELECT ol.*, o.name, o.datatype
  FROM objectlog ol
  JOIN objects o ON ol.address=o.id
  JOIN objecttags t ON o.id=t.object
  WHERE t.tag=? AND logtime>=?
  ORDER BY id DESC
]]

buffer = { 'date;name;value' }

items = db:getall(query, tagname, logtime)
for _, item in ipairs(items) do
  id = tonumber(item.address) or 0
  logdate = os.date('%Y-%m-%d', math.floor(item.logtime))
  etype = item.eventtype

  if (etype == 'write' or etype == 'response') and item.datatype then
    value = grp.decodevalue(item.datahex, item.datatype)
    if type(value) == 'number' then
      value = tostring(value):gsub('%.', ',')
    end
  else
    value = ''
  end

  buffer[ #buffer + 1 ] = string.format('%s;%s;%s',
    logdate, item.name or '', tostring(value))
end

csv = table.concat(buffer, '\n')
		