manos@dynamitec
Posts: 141 · Threads: 19 · Joined: Apr 2018 · Reputation: 0

04.10.2023, 20:38
(This post was last modified: 04.10.2023, 20:38 by manos@dynamitec.)
		Hello Admin,
I can understand that, but we have an LM5Lp2 and use it on various sites as a logger when required. It does not run for very long periods; I just want to overcome the 50k data points limit. In the projects we have (not logging everything, of course), I would guess that 50k telegrams are enough for one day. If the LM logs for 7-15 days, the file would be written about 15 times. When it is installed on a new site, for example, we can delete all files from the FTP server and start the logging over.
 
 Is this possible to do?
 
Regards,

admin
Posts: 8413 · Threads: 45 · Joined: Jun 2015 · Reputation: 481

05.10.2023, 06:52
		See this example: https://openrb.com/example-export-last-h...-from-lm2/ 
It has both local file and remote FTP options.
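For the remote FTP option, here is a minimal sketch of the idea (an illustration under my own assumptions, not the linked example verbatim): build the CSV string as usual, then push it with LuaSocket's socket.ftp module, which should be available on the LM via require('socket.ftp'). The host, credentials and file name below are placeholders.

Code:
-- sketch: upload a CSV string to a remote FTP server (placeholder host/credentials)
local ftp = require('socket.ftp')
local ltn12 = require('ltn12')

csvdata = '"date","address","name","value"\r\n' -- replace with the real CSV buffer

res, err = ftp.put({
  host = '192.168.1.100',   -- remote FTP server (placeholder)
  user = 'ftpuser',         -- placeholder
  password = 'ftppassword', -- placeholder
  command = 'stor',
  argument = os.date('%Y-%m-%d_%H-%M') .. '.csv',
  source = ltn12.source.string(csvdata),
})

log(res, err) -- 1 on success, nil plus an error message otherwise

The local-file option simply writes the same buffer under /home/ftp with io.writefile so it can be fetched from the LM's own FTP server, as the script in the next post does.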
	
		
	 
	
	
	
		
manos@dynamitec
Posts: 141 · Threads: 19 · Joined: Apr 2018 · Reputation: 0

16.01.2024, 11:59
(This post was last modified: 16.01.2024, 12:03 by manos@dynamitec.)
(05.10.2023, 06:52) admin Wrote: See this example: https://openrb.com/example-export-last-h...-from-lm2/ It has both local file and remote FTP options.
 
Hello Admin,
 
I have applied the solution from the example in the link, but unfortunately I just noticed that not all fields are included in the CSV file. I would like to export all of the data that can be seen in the Object logs utility. At the moment only the timestamp, address, name and value are included in the CSV export...
 
I am using the following script: 
Code:
-- ftp file
ftpfile = string.format('/home/ftp/%s.csv', os.date('%Y-%m-%d_%H-%M'))

-- get past 12 hours data (43200 seconds)
logtime = os.time() - 12 * 60 * 60

-- list of objects by id
objects = {}

-- objects with logging enabled
query = 'SELECT address, datatype, name FROM objects WHERE disablelog=0'
for _, object in ipairs(db:getall(query)) do
  objects[ tonumber(object.address) ] = {
    datatype = tonumber(object.datatype),
    name = tostring(object.name or ''),
  }
end

-- csv buffer
buffer = { '"date","address","name","value"' }

-- get object logs
query = 'SELECT src, address, datahex, logtime, eventtype FROM objectlog WHERE logtime >= ? ORDER BY id DESC'
for _, row in ipairs(db:getall(query, logtime)) do
  object = objects[ tonumber(row.address) ]

  -- found matching object and event type is group write
  if object and row.eventtype == 'write' then
    datatype = object.datatype

    -- check that object datatype is set
    if datatype then
      -- decode data
      data = knxdatatype.decode(row.datahex, datatype)

      -- remove null chars from char/string datatype
      if datatype == dt.char or datatype == dt.string then
        data = data:gsub('%z+', '')
      -- date to DD.MM.YYYY
      elseif datatype == dt.date then
        data = string.format('%.2d.%.2d.%.2d', data.day, data.month, data.year)
      -- time to HH:MM:SS
      elseif datatype == dt.time then
        data = string.format('%.2d:%.2d:%.2d', data.hour, data.minute, data.second)
      end
    else
      data = ''
    end

    -- format csv row
    logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)
    csv = string.format('%q,%q,%q,%q', logdate, knxlib.decodega(row.address), object.name, tostring(data))

    -- add to buffer
    table.insert(buffer, csv)
  end
end

-- upload to ftp only when there's data in buffer
if #buffer > 1 then
  data = table.concat(buffer, '\r\n')
  io.writefile(ftpfile, data)
end
How can this be modified to include the type (read/write) and source address fields?
 
Kind Regards,
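One possible modification (a sketch only, my own assumption rather than a confirmed answer from this thread): the objectlog query above already selects src and eventtype, so it is mostly a matter of adding two CSV columns and not filtering out non-write events. knxlib.decodeia is assumed here to decode the source individual address, and the char/date/time formatting branches from the script above are omitted for brevity.

Code:
-- sketch: same export with "eventtype" and "source" columns added
ftpfile = string.format('/home/ftp/%s.csv', os.date('%Y-%m-%d_%H-%M'))
logtime = os.time() - 12 * 60 * 60

objects = {}
for _, object in ipairs(db:getall('SELECT address, datatype, name FROM objects WHERE disablelog=0')) do
  objects[ tonumber(object.address) ] = {
    datatype = tonumber(object.datatype),
    name = tostring(object.name or ''),
  }
end

buffer = { '"date","address","name","value","eventtype","source"' }

query = 'SELECT src, address, datahex, logtime, eventtype FROM objectlog WHERE logtime >= ? ORDER BY id DESC'
for _, row in ipairs(db:getall(query, logtime)) do
  object = objects[ tonumber(row.address) ]
  if object then
    -- group reads carry no payload, so only decode when data is present
    -- note: apply the same char/date/time handling as in the script above if such datatypes are logged
    data = ''
    if object.datatype and row.datahex and #row.datahex > 0 then
      data = knxdatatype.decode(row.datahex, object.datatype)
    end

    -- assumption: knxlib.decodeia converts the raw source address to dotted form;
    -- src can be 0 when the telegram was generated locally by the LM
    src = tonumber(row.src) or 0
    src = src ~= 0 and knxlib.decodeia(src) or 'local'

    logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)
    buffer[ #buffer + 1 ] = string.format('%q,%q,%q,%q,%q,%q',
      logdate, knxlib.decodega(row.address), object.name, tostring(data),
      tostring(row.eventtype), tostring(src))
  end
end

if #buffer > 1 then
  io.writefile(ftpfile, table.concat(buffer, '\r\n'))
end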
	 
		
	 
	
	
	
		
	
	
	
		
domotiqa
Posts: 266 · Threads: 39 · Joined: Feb 2016 · Reputation: 1
(17.01.2022, 08:17) admin Wrote: Escaping of the "to" address was missing. Here's a rewritten script with a clearer structure. You only need to modify the to/settings/subject variables at the top of the script. SSL settings do not matter anymore because the newer SSL library automatically negotiates the highest supported SSL/TLS version between server and client.
 
Code:
to = 'to@example.com'
settings = {
  -- "from" field, only e-mail must be specified here
  from = 'sender@example.com',
  -- smtp username
  user = 'sender@example.com',
  -- smtp password
  password = '12345678',
  -- smtp server
  server = 'smtp.gmail.com',
  -- smtp server port
  port = 465,
  -- enable ssl, required for gmail smtp
  secure = 'sslv23',
}

subject = 'CSV logs'

logtime = os.time() - 60 * 60 -- last hour (3600 seconds)

buffer = {'"date","address","name","value"'}

query = [[
  SELECT o.id, o.datatype, o.name, ol.datahex, ol.logtime, ol.eventtype
  FROM objectlog ol
  JOIN objects o ON ol.address=o.id
  WHERE ol.logtime >= ?
  ORDER BY ol.id DESC
]]

for _, row in ipairs(db:getall(query, logtime)) do
  if row.datatype and row.eventtype == 'write' then
    data = grp.decodevalue(row.datahex, row.datatype)
    logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)

    buffer[ #buffer + 1 ] = string.format('%q,%q,%q,%q',
      logdate, knxlib.decodega(row.id), row.name, tostring(data))
  end
end

csv = table.concat(buffer, '\r\n')

smtp = require('socket.smtp')
mime = require('mime')
ltn12 = require('ltn12')

function escape(v)
  return '<' .. tostring(v) .. '>'
end

to = escape(to)

msgt = {
  headers = {
    to = to,
    ['content-type'] = 'text/csv',
    ['content-disposition'] = 'attachment; filename="logs.csv"',
    ['content-transfer-encoding'] = 'BASE64',
    subject = subject,
  },
  body = ltn12.source.chain(
    ltn12.source.string(csv),
    ltn12.filter.chain(mime.encode('base64'), mime.wrap('base64'))
  )
}

settings.source = smtp.message(msgt)
settings.from = escape(settings.from)
settings.rcpt = { to }

res, err = smtp.send(settings)
log(res, err)
 
Hi, thanks for the script.

Is there an easy way to separate the logged objects into different CSV files and send them in one email?
	 
-----------FRANCE SMARTHOME & SMARTBUILDING INTEGRATION
 SE ECO EXPERT
 
		
	 
	
	
	
		
domotiqa
Posts: 266 · Threads: 39 · Joined: Feb 2016 · Reputation: 1

20.02.2024, 11:49
(This post was last modified: 20.02.2024, 12:07 by domotiqa.)
		
	 
I tested it, but I think my customer's SMTP output is blocked (result 1, nil in the log). I will try it on my test server.

Code:
smtp = require('socket.smtp')
mime = require('mime')
ltn12 = require('ltn12')

to = 'xxx@gmail.com'
subject = 'CSV logs'
message = 'export log'
logtime = os.time() - 60 * 60 --*24*7 -- last hour (3600 seconds)

-------------- find the logged objects (counters)
query = [[
  SELECT ol2.name
  FROM objects ol2
  WHERE ol2.disablelog = 0
]]

csv = {}
nombre = 0
nom = {}

for _, objectlog in ipairs(db:getall(query)) do
  nombre = nombre + 1
  nom[ nombre ] = objectlog.name

  ----------------

  --buffer = {'"date","address","name","value"'}
  buffer = {'"date","name","value"'}

  query2 = [[
    SELECT o.id, o.datatype, o.name, ol.datahex, ol.logtime, ol.eventtype
    FROM objectlog ol
    JOIN objects o ON ol.address=o.id
    WHERE (ol.logtime >= ?)
    ORDER BY ol.id DESC
  ]]

  for _, row in ipairs(db:getall(query2, logtime)) do
    if row.datatype and row.eventtype == 'write' and row.name == nom[ nombre ] then
      data = grp.decodevalue(row.datahex, row.datatype)
      logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)

      buffer[ #buffer + 1 ] = string.format('%q,%q,%q',
        --logdate, knxlib.decodega(row.id), row.name, tostring(data))
        logdate, string.split(row.name, '-')[5], tostring(data))
    end
  end

  csv[ nombre ] = table.concat(buffer, '\r\n')
end -- for each logged object

attachement = {}
nombre = 0

for _, nomCpt in ipairs(nom) do
  nombre = nombre + 1
  -- create file
  table.insert(attachement, {filename = nomCpt .. '.csv', filedata = csv[ nombre ], mimetype = 'text/csv'})
end

mailattach(to, subject, message, attachement)
and this function in Common functions:
Code:
--********************************
-- send an e-mail with attachments
function mailattach(to, subject, message, files)
  local settings = {
    -- "from" field, only e-mail must be specified here
    from = 'notification@xxx.com',
    -- smtp username
    user = 'notification@xxx.com',
    -- smtp password
    password = 'xxx',
    -- smtp server
    server = 'xxx.net',
    --server = 'xxxx.net',

    -- smtp server port
    port = 465,
    --port = 587,
    -- enable ssl, required for gmail smtp
    secure = 'sslv23',
    --secure = 'tlsv1_2',
  }
  -- ssl0.ovh.net on port 465
  -- ns0.ovh.net on ports 25, 5025, 587, well, everything I have tried...

  local smtp = require('socket.smtp')
  -- mime and ltn12 are needed for the attachment encoding
  local mime = require('mime')
  local ltn12 = require('ltn12')

  if type(to) ~= 'table' then
    to = { to }
  end

  for index, email in ipairs(to) do
    to[ index ] = '<' .. tostring(email) .. '>'
  end

  -- escape double quotes in file names
  for _, object in ipairs(files) do
    --files[_].filename = files[_].filename:gsub('"', '\\"')
  end

  -- fixup from field
  local from = '<' .. tostring(settings.from) .. '>'

  -- message headers and body
  email = {headers = {}, body = {}}
  email.headers = {
    to = table.concat(to, ', '),
    subject = subject,
    ['From'] = from,
    --['Content-type'] = 'text/html; charset=utf-8',
  }
  email.body[1] = {
    headers = { ['Content-Type'] = 'text/html; charset=utf-8' },
    body = mime.eol(0, message),
  }

  -- add attachments
  for _, object in ipairs(files) do
    email.body[_ + 1] = {
      headers = {
        ['Content-Type'] = files[_].mimetype or 'text/plain',
        ['Content-Disposition'] = 'attachment; filename="' .. files[_].filename .. '"',
        ['Content-Transfer-Encoding'] = 'BASE64',
      },
      body = ltn12.source.chain(
        ltn12.source.string(files[_].filedata),
        ltn12.filter.chain(mime.encode('base64'), mime.wrap())
      ),
    }
  end

  settings.source = smtp.message(email)

  settings.from = from
  settings.rcpt = to

  return smtp.send(settings)
end
-----------FRANCE SMARTHOME & SMARTBUILDING INTEGRATION
 SE ECO EXPERT
 
		