
Read values in a scheduled script and store them on FTP
#1
Hi.

I need help with a scheduled script.
I want to read 4-byte floating point values through a tag or multiple tags
and store the values on the local LM FTP in a CSV file.

The row for each object should contain: Date, Time, Name, Value and unit/suffix.

Is this possible?
#2
You can use this example as a starting point: http://openrb.com/example-export-last-ho...-from-lm2/
#3
I was thinking of a script like this.

But this script only works in an event-based script.

-- Write data to a file in the local FTP

value = event.getvalue()  


logdate = os.date('%Y.%m.%d %H:%M:%S', os.time())
csv = string.format('%q,%q,%q,%q\r\n', logdate, event.dst, grp.alias(event.dst), tostring(value))

log(csv)  
  

file, error = io.open("home/ftp/outdoor_temp1.txt","a")
log(file,error)
file:write(csv)
file:close()
#4
You need to provide an absolute path:
Code:
file, error = io.open("/home/ftp/outdoor_temp1.txt","a")
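With the absolute path in place, the open call can still fail (missing directory, full partition), so a defensive version checks the result before writing. This sketch reuses the `csv` variable and the `log()` helper from the script above:

```lua
-- Sketch: append a CSV row only if the file actually opened.
local file, err = io.open('/home/ftp/outdoor_temp1.txt', 'a')

if file then
  file:write(csv)
  file:close()
else
  -- log() is the LogicMachine logging helper used in the original script
  log('cannot open log file: ' .. tostring(err))
end
```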
#5
Hi. 

I got this script to work, but I have two questions.

Is it possible to skip the past-hour data and use the actual time instead?

How do I read the unit/suffix and add it to the result?

Result:
--Date/Time, Address, Name, Value
"2018.10.04 07:34:10","33/1/3","Meter 2","1495"

--------Script write to FTP----------

-- get data from the past 10 seconds
logtime = os.time() - 10

-- map of objects indexed by address
objects = {}

-- objects with logging enabled
query = 'SELECT address, datatype, name FROM objects WHERE disablelog=0'
for _, object in ipairs(db:getall(query)) do
 objects[ tonumber(object.address) ] = {
   datatype = tonumber(object.datatype),
   name = tostring(object.name or ''),
 }
end

-- csv buffer
buffer = { }

-- get object logs
query = 'SELECT src, address, datahex, logtime, eventtype FROM objectlog WHERE logtime >= ? ORDER BY id DESC'
for _, row in ipairs(db:getall(query, logtime)) do
 object = objects[ tonumber(row.address) ]

 -- found matching object and event type is group write
 if object and row.eventtype == 'write' then
   datatype = object.datatype

   -- check that object datatype is set
   if datatype then
     -- decode data
     data = knxdatatype.decode(row.datahex, datatype)

     -- remove null chars from char/string datatype
     if datatype == dt.char or datatype == dt.string then
       data = data:gsub('%z+', '')
     -- date to DD.MM.YYYY
     elseif datatype == dt.date then
       data = string.format('%.2d.%.2d.%.2d', data.day, data.month, data.year)
     -- time to HH:MM:SS
     elseif datatype == dt.time then
       data = string.format('%.2d:%.2d:%.2d', data.hour, data.minute, data.second)
     end
   else
     data = ''
   end

   -- format csv row
   logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)
   csv = string.format('%q,%q,%q,%q', logdate, knxlib.decodega(row.address), object.name, tostring(data))

   -- add to buffer
   table.insert(buffer, csv)
 end
end

ftpfile = string.format('/home/ftp/%s.csv', os.date('%Y-%m-%d_%H-%M'))

if #buffer > 0 then
 data = table.concat(buffer, '\r\n')
 io.writefile(ftpfile, data)
 log(data)
end
#6
What do you mean by the actual time?

You can get units like this, then use object.units when generating CSV:
Code:
query = 'SELECT address, datatype, name, units FROM objects WHERE disablelog=0'
for _, object in ipairs(db:getall(query)) do
objects[ tonumber(object.address) ] = {
   datatype = tonumber(object.datatype),
   name = tostring(object.name or ''),
   units = tostring(object.units or ''),
}
end
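With the `units` field stored as above, the row-formatting step of the earlier script could append the unit as a fifth column. A sketch, reusing the variables from that loop:

```lua
-- Sketch: include the unit/suffix as a fifth CSV column.
-- Assumes the objects table was filled with a units field as shown above.
logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)
csv = string.format('%q,%q,%q,%q,%q',
  logdate, knxlib.decodega(row.address), object.name,
  tostring(data), object.units)
```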
#7
Could I get this script to work as a resident script?

--Write data to a file in the local FTP

value = event.getvalue()


logdate = os.date('%Y.%m.%d %H:%M:%S', os.time())
csv = string.format('%q,%q,%q,%q\r\n', logdate, event.dst, grp.alias(event.dst), tostring(value))

log(csv)


file, error = io.open("home/ftp/outdoor_temp1.txt","a")
log(file,error)
file:write(csv)
file:close()
#8
Yes, you only need to change value = event.getvalue() to value = grp.getvalue('1/1/1')
This will work, but if you write to the FTP storage on the SD card every 60 seconds you will wear the card out quite fast.
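For a scheduled script that logs several meters, the fixed address can also be replaced by a tag lookup. A hedged sketch, assuming grp.tag() returns the tagged object records with address, name and data fields (as grp.find() does):

```lua
-- Sketch: read the current value of every object tagged 'Energy'
-- instead of a single hard-coded group address.
buffer = {}
logdate = os.date('%Y.%m.%d %H:%M:%S')

for _, obj in ipairs(grp.tag('Energy')) do
  buffer[ #buffer + 1 ] = string.format('%q,%q,%q,%q',
    logdate, obj.address, obj.name, tostring(obj.data))
end

csv = table.concat(buffer, '\r\n')
log(csv)
```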
#9
The path is incorrect, it must start with a "/", so it should be /home/ftp/outdoor_temp1.txt
Another issue is that you are constantly appending data to a file. If this is done very fast then the file can grow in size quickly and fill the whole partition. And as Daniel mentioned this can degrade the SD card due to constant writes.
If you want to send data to external sources look into MQTT. If you want to log data externally then InfluxDB can be used (it has very simple integration with LM).
#10
I am going to use a scheduled script. Every 6 hours it should read the meters, store the result on FTP and then send an e-mail.
Then I want to delete the file after the e-mail is sent.

I tried this script with the changes, but I got this error:

Error log:
test 11.01.2021 17:16:00
User script:7: attempt to index global 'event' (a nil value)
stack traceback:
User script:7: in main chunk

--Write data to a file in the local FTP

value = grp.getvalue('16/1/1')


logdate = os.date('%Y.%m.%d %H:%M:%S', os.time())
csv = string.format('%q,%q,%q,%q\r\n', logdate, event.dst, grp.alias(event.dst), tostring(value))

log(csv)


file, error = io.open("/home/ftp/outdoor_temp1.txt","a")
log(file,error)
file:write(csv)
file:close()
#11
Replace event.dst in your script with '16/1/1'; the event variable is not defined in resident and scheduled scripts.
If you want to send data as an attachment then you don't need to write it to file at all. See this example: https://forum.logicmachine.net/showthread.php?tid=394
If you need to write some temporary data to a file then use tmp storage (for example "/tmp/myfile.txt"). tmp storage uses RAM so it does not degrade the SD card.
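A minimal sketch of that tmp-storage workflow, assuming the CSV text is already in a csv variable and using the io.writefile helper seen earlier in this thread:

```lua
-- Sketch: write the CSV to RAM-backed /tmp, attach/send it, then delete it
-- so nothing accumulates and the SD card is never touched.
local path = '/tmp/energy.csv'

io.writefile(path, csv)     -- io.writefile is the LM helper used in post #5

-- ... send the e-mail with the file attached here ...

os.remove(path)             -- clean up; /tmp lives in RAM, not on the SD card
```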
#12
Instead of getting the past hour's data, I just want to read every address with the tag 'Energy'.

Code:
-- Mail Attachment
subject = 'Energy Values'
to = 'mail@mail.com'


-- get past hour data (3600 seconds)
logtime = os.time() - 60 * 60

-- map of objects indexed by address
objects = {}

-- objects with logging enabled
query = 'SELECT address, datatype, name FROM objects WHERE disablelog=0'
for _, object in ipairs(db:getall(query)) do
  objects[ tonumber(object.address) ] = {
    datatype = tonumber(object.datatype),
    name = tostring(object.name or ''),
  }
end

-- csv buffer
buffer = { '"date","address","name","value"' }

-- get object logs
query = 'SELECT src, address, datahex, logtime, eventtype FROM objectlog WHERE logtime >= ? ORDER BY id DESC'
for _, row in ipairs(db:getall(query, logtime)) do
  object = objects[ tonumber(row.address) ]

  -- found matching object and event type is group write
  if object and row.eventtype == 'write' then
    datatype = object.datatype

    -- check that object datatype is set
    if datatype then
      -- decode data
      data = knxdatatype.decode(row.datahex, datatype)

      -- remove null chars from char/string datatype
      if datatype == dt.char or datatype == dt.string then
        data = data:gsub('%z+', '')
      -- date to DD.MM.YYYY
      elseif datatype == dt.date then
        data = string.format('%.2d.%.2d.%.2d', data.day, data.month, data.year)
      -- time to HH:MM:SS
      elseif datatype == dt.time then
        data = string.format('%.2d:%.2d:%.2d', data.hour, data.minute, data.second)
      end
    else
      data = ''
    end

    -- format csv row
    logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)
    csv = string.format('%q,%q,%q,%q', logdate, knxlib.decodega(row.address), object.name, tostring(data))

    -- add to buffer
    table.insert(buffer, csv)
  end
end

-- transform table to string for csv file
data1 = table.concat(buffer, '\r\n')

  -- make sure these settings are correct
  local settings = {
    -- "from" field, only the e-mail address must be specified here
  from = 'mail@mail.com',
    -- smtp username
  user = 'mail@mail.com',
    -- smtp password
  password = 'Password',
    -- smtp server
    server = 'smtp.gmail.com',
    -- smtp server port
    port = 465,
    -- enable ssl, required for gmail smtp
    secure = 'sslv23',
  }

  mime = require("mime")
  ltn12 = require("ltn12")
  local smtp = require('socket.smtp')
  local escape = function(v)
  return '<' .. tostring(v) .. '>'
  end

settings.from = escape(settings.from)
settings.rcpt = { escape(to) }

settings.source = smtp.message{
   headers = {
    to = to,
      subject = subject,
    },
   body = {
      preamble = "This email contains 1 attachment.",
      [1] = {
      body = mime.eol(0, [["This message is sent automatically, do not reply to this e-mail."]])
      },
      [2] = {
     headers = {
      ["content-type"] = 'text/plain; name="Objectlog.csv"',
      ["content-disposition"] = 'attachment; filename="Objectlog.csv"',
     },
      body = data1
      },
    epilogue = "End of mail"
   }
}

r, e = smtp.send(settings)

if (e) then
  io.stderr:write("Could not send email: ", e, "\n")
end
#13
I have the function working now. Is it possible to skip the address in the CSV file?

I just want Date, Time, Name and Value.

2021.01.13 18:38:17,"16/1/3","Mät 3","3400502"
2021.01.13 18:38:12,"16/1/2","Mät 2","200302"
2021.01.13 18:38:08,"16/1/1","Mät 1","10202"
Br


Code:
-- get past hour data (3600 seconds)
logtime = os.time() - 60 * 60

-- map of objects indexed by address
objects = {}

-- objects with logging enabled
query = 'SELECT address, datatype, name FROM objects WHERE disablelog=0'
for _, object in ipairs(db:getall(query)) do
  objects[ tonumber(object.address) ] = {
    datatype = tonumber(object.datatype),
    name = tostring(object.name or ''),
  }
end

-- csv buffer
buffer = { '"date","address","name","value"' }

-- get object logs
query = 'SELECT src, address, datahex, logtime, eventtype FROM objectlog WHERE logtime >= ? ORDER BY id DESC'
for _, row in ipairs(db:getall(query, logtime)) do
  object = objects[ tonumber(row.address) ]

  -- found matching object and event type is group write
  if object and row.eventtype == 'write' then
    datatype = object.datatype

    -- check that object datatype is set
    if datatype then
      -- decode data
      data = knxdatatype.decode(row.datahex, datatype)

      -- remove null chars from char/string datatype
      if datatype == dt.char or datatype == dt.string then
        data = data:gsub('%z+', '')
      -- date to DD.MM.YYYY
      elseif datatype == dt.date then
        data = string.format('%.2d.%.2d.%.2d', data.day, data.month, data.year)
      -- time to HH:MM:SS
      elseif datatype == dt.time then
        data = string.format('%.2d:%.2d:%.2d', data.hour, data.minute, data.second)
      end
    else
      data = ''
    end

    -- format csv row
    logdate = os.date('%Y.%m.%d %H:%M:%S', row.logtime)
    csv = string.format('%q,%q,%q,%q', logdate, knxlib.decodega(row.address), object.name, tostring(data))

    -- add to buffer
    table.insert(buffer, csv)
  end
end

-- transform table to string for csv file
data1 = table.concat(buffer, '\r\n')

  -- make sure these settings are correct
  local settings = {
    -- "from" field, only e-mail must be specified here
  from = 'gmail@gmail.com',
    -- smtp username
  user = 'gmail@gmail.com',
    -- smtp password
  password = 'Password!',
    -- smtp server
    server = 'smtp.gmail.com',
    -- smtp server port
    port = 465,
    -- enable ssl, required for gmail smtp
    secure = 'sslv23',
  }

  mime = require("mime")
  ltn12 = require("ltn12")
  local smtp = require('socket.smtp')
  local escape = function(v)
  return '<' .. tostring(v) .. '>'
  end

settings.from = escape(settings.from)
settings.rcpt = { escape(to) }

settings.source = smtp.message{
   headers = {
    to = to,
      subject = subject,
    },
   body = {
      preamble = "This email contains 1 attachment.",
      [1] = {
      body = mime.eol(0, [["This message is sent automatically, do not reply to this e-mail."]])
      },
      [2] = {
     headers = {
      ["content-type"] = 'text/plain; name="Objectlog.csv"',
      ["content-disposition"] = 'attachment; filename="Objectlog.csv"',
     },
      body = data1
      },
    epilogue = "End of mail"
   }
}

r, e = smtp.send(settings)

if (e) then
  io.stderr:write("Could not send email: ", e, "\n")
end
#14
Use this to generate CSV for a certain tag and a certain time:
Code:
tagname = 'log'
logtime = os.time() - 60 * 60

query = [[
SELECT ol.*, o.name, o.datatype
FROM objectlog ol
JOIN objects o ON ol.address=o.id
JOIN objecttags t ON o.id=t.object
WHERE t.tag=? AND logtime>=?
ORDER BY id DESC
]]

buffer = { '"date","name","address","value"' }

items = db:getall(query, tagname, logtime)
for _, item in ipairs(items) do
  id = tonumber(item.address) or 0
  logdate = os.date('%Y.%m.%d %H:%M:%S', math.floor(item.logtime))
  etype = item.eventtype

  if (etype == 'write' or etype == 'response') and item.datatype then
    value = grp.decodevalue(item.datahex, item.datatype)
  else
    value = ''
  end

  buffer[ #buffer + 1 ] = string.format('%q,%q,%q,%q',
    logdate, item.name or '', buslib.decodega(id), tostring(value))
end

csv = table.concat(buffer, '\n')
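If the result should also end up in the FTP storage (as in post #5), the generated csv string can be written to a timestamped file with the same io.writefile helper used there. A sketch:

```lua
-- Sketch: save the CSV produced above to a timestamped file in FTP storage.
ftpfile = string.format('/home/ftp/%s.csv', os.date('%Y-%m-%d_%H-%M'))

if #buffer > 1 then           -- buffer[1] is the header row
  io.writefile(ftpfile, csv)
end
```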
#15
Hi.

I still get the group address in the CSV file.
#16
Like this:
Code:
tagname = 'log'
logtime = os.time() - 60 * 60

query = [[
SELECT ol.*, o.name, o.datatype
FROM objectlog ol
JOIN objects o ON ol.address=o.id
JOIN objecttags t ON o.id=t.object
WHERE t.tag=? AND logtime>=?
ORDER BY id DESC
]]

buffer = { '"date","name","value"' }

items = db:getall(query, tagname, logtime)
for _, item in ipairs(items) do
  id = tonumber(item.address) or 0
  logdate = os.date('%Y.%m.%d %H:%M:%S', math.floor(item.logtime))
  etype = item.eventtype

  if (etype == 'write' or etype == 'response') and item.datatype then
    value = grp.decodevalue(item.datahex, item.datatype)
  else
    value = ''
  end

  buffer[ #buffer + 1 ] = string.format('%q,%q,%q',
    logdate, item.name or '', tostring(value))
end

csv = table.concat(buffer, '\n')
#17
Hi.

And what if I want to e-mail this CSV file?
#18
See this thread: https://forum.logicmachine.net/showthread.php?tid=394
#19
Hi.

In the CSV file it currently looks like this:

"date","address","name","value"
"2021.02.02 05:00:07","1/6/17","N13A33","11491"
"2021.02.02 05:00:07","1/6/16","N10A25","12894"

Could I change it so it looks like this instead?

date;name;value
2021-02-02;4400A1FS6B;591616,24
2021-02-02;4400A1FS6C;290030,96
#20
This should produce what you need. Are you sure that you don't need the time to be included?
Code:
tagname = 'log'
logtime = os.time() - 60 * 60

query = [[
SELECT ol.*, o.name, o.datatype
FROM objectlog ol
JOIN objects o ON ol.address=o.id
JOIN objecttags t ON o.id=t.object
WHERE t.tag=? AND logtime>=?
ORDER BY id DESC
]]

buffer = { 'date;name;value' }

items = db:getall(query, tagname, logtime)
for _, item in ipairs(items) do
  id = tonumber(item.address) or 0
  logdate = os.date('%Y-%m-%d', math.floor(item.logtime))
  etype = item.eventtype

  if (etype == 'write' or etype == 'response') and item.datatype then
    value = grp.decodevalue(item.datahex, item.datatype)
    if type(value) == 'number' then
      value = tostring(value):gsub('%.', ',')
    end
  else
    value = ''
  end

  buffer[ #buffer + 1 ] = string.format('%s;%s;%s',
    logdate, item.name or '', tostring(value))
end

csv = table.concat(buffer, '\n')

