Posts: 176
Threads: 42
Joined: Jul 2015
Reputation: 2
Hi. Do you have a script example that exports a CSV log to an FTP server?
I have tried this, but the file gets too big as it logs all changes.
Is it possible to send only once an hour, with the trend value for the past hour or minute? That way I get 24 values a day, or one value per minute if I want higher resolution on some addresses.
Posts: 7758
Threads: 42
Joined: Jun 2015
Reputation: 447
Try this; it should fetch values for the given trend for the last hour with one-minute resolution:
Code:
require('genohm-scada.trends')

-- fetch the last hour of trend data
time = os.time()
date = {
  stime = time - 3600,
  etime = time,
}
values = trends.fetch('My Trends Name', 'hour', date)

-- one sample per minute, labelled from the start of the fetched hour
time = date.stime
for _, data in ipairs(values) do
  value = data[ 2 ]
  date = os.date('%Y.%m.%d %H:%M', time)
  log(date, value)
  time = time + 60
end
Posts: 176
Threads: 42
Joined: Jul 2015
Reputation: 2
Ok. And if I would like to have 15-minute or 60-minute resolution, is that also possible?
Can I also use an FTP server or something to access the memory with a file manager and find the CSV files manually?
Posts: 176
Threads: 42
Joined: Jul 2015
Reputation: 2
01.10.2015, 19:33
(This post was last modified: 02.10.2015, 06:48 by admin.)
I tried to integrate this into your example, but it only writes the data to the logs of the LM.
How do I change this script to get it to work?
I need the LM to export the log with 30-minute resolution to my web server once a day.
When fetching the log I also need the values to be accurate. How is this data produced? For instance, is the 13:00 log result the average of 12:45 to 13:15? Is that correct?
Code:
require('socket.ftp')

logtime = os.time() - 60 * 60

objects = {}
query = 'SELECT address, datatype, name FROM objects WHERE disablelog=0'
for _, object in ipairs(db:getall(query)) do
  objects[ tonumber(object.address) ] = {
    datatype = tonumber(object.datatype),
    name = tostring(object.name or ''),
  }
end

buffer = {}

require('genohm-scada.trends')

time = os.time()
date = {
  stime = time - 3600,
  etime = time,
}
values = trends.fetch('watt lys', 'hour', date)
for _, data in ipairs(values) do
  value = data[ 2 ]
  date = os.date('%Y.%m.%d %H:%M', time)
  log(date, value)
  time = time + 60
end

if #buffer > 1 then
  data = table.concat(buffer, '\r\n')
  res, err = socket.ftp.put({
    host = 'x',
    user = 'x',
    password = 'x',
    command = 'appe',
    argument = 'x',
    source = ltn12.source.string(data)
  })
end

if err then
  alert('FTP upload error: %s', tostring(err))
end
Posts: 7758
Threads: 42
Joined: Jun 2015
Reputation: 447
Your script does not work because you are not putting any data into buffer, so there is nothing to upload.
Try this:
Code:
require('socket.ftp')
require('genohm-scada.trends')

time = os.time()
date = {
  stime = time - 3600,
  etime = time,
}
values = trends.fetch('watt lys', 'hour', date)

buffer = {}
count = 0
average = 0
resolution = 30 -- in minutes

-- label rows from the start of the fetched hour
time = date.stime

-- average the one-minute samples into one CSV row per resolution period
for _, data in ipairs(values) do
  count = count + 1
  average = average + data[ 2 ]
  if count == resolution then
    date = os.date('%Y.%m.%d %H:%M', time)
    value = average / resolution
    row = string.format('%q,%q', date, value)
    table.insert(buffer, row)
    time = time + 60 * resolution
    count = 0
    average = 0
  end
end

if #buffer > 0 then
  data = table.concat(buffer, '\r\n')
  res, err = socket.ftp.put({
    host = 'x',
    user = 'x',
    password = 'x',
    command = 'appe', -- append to the remote file
    argument = 'x',
    source = ltn12.source.string(data)
  })
end

if err then
  alert('FTP upload error: %s', tostring(err))
end
Posts: 176
Threads: 42
Joined: Jul 2015
Reputation: 2
Hi, I have tested it and it works. You can see the published data at http://passivpluss.no/WP/?page_id=537
I have another request.
Husbanken, the client I have to deliver data to, wants 1-minute data for some logs. What is the resolution of the built-in trend data?
Is it possible to log on to the Logic Machine over FTP and fetch the log file, or do we need a script for this?
The trend on my page has a resolution of 1 hour now; I use the example you made for me, but with a resolution of 60 minutes. Is that a good way to do it, or does it eat resources? I have a lot of logging to export.
Posts: 7758
Threads: 42
Joined: Jun 2015
Reputation: 447
By default there is some one-minute data available; you can change the resolution variable value from 30 to 1 and it should work.
Posts: 176
Threads: 42
Joined: Jul 2015
Reputation: 2
How much is "some"? How often do I have to run this script to get 1-minute data?
Posts: 7758
Threads: 42
Joined: Jun 2015
Reputation: 447
"Some" depends on the "1 minute data" setting for each trend. The script above should be run once an hour.
Posts: 176
Threads: 42
Joined: Jul 2015
Reputation: 2
This logging works perfectly.
1. If I would like to do the same but with hours or days to build the CSV, could you help me with this?
2. I see that there is a bug in the trend viewer. When using the trend viewer to look at days, the 8th of October, for instance, shows as the 9th of October in the month view of the trend.
3. Is there an easy-to-use accumulate function? In my case I have a lot of energy and water readings, and I would like a function that uses the trend files once an hour to add the last hour's value, to get an accumulated energy value over many months. For example: I start this at 15:00 and use 500 W in an hour. At 16:00 I would like to add 500 to group address x\x\x so that the new value is 500. Say I then turn down the demand and only use 200 W in an hour; at 17:00 I would like to add 200 to x\x\x, making it 700, and so on.
4. Below is the hourly logging I use at the moment.
This produces a CSV file that looks like:
"Tid(Time)","Aktiv effekt L1(Power usage L1)"
"10.okt 10:00","500"
"11.okt 11:00","400"
and so on for the last 24 hours.
If I want to implement another trend's values, is that possible? Like:
"Tid(Time)","Aktiv effekt L1(Power usage L1)","Aktiv effekt L2(Power usage L2)","Aktiv effekt L3(Power usage L3)"
"10.okt 10:00","500","400","200"
"11.okt 11:00","400","500","250"
Code:
require('socket.ftp')
require('genohm-scada.trends')

time = os.time()
date = {
  stime = time - 86400, -- last 24 hours
  etime = time,
}
values = trends.fetch('aktiv effekt L1', 'hour', date)

buffer = {}
count = 0
average = 0
resolution = 60 -- in minutes

row = string.format('%q,%q', "Tid", "aktiv L1")
table.insert(buffer, row)

for _, data in ipairs(values) do
  count = count + 1
  average = average + data[ 2 ]
  if count == resolution then
    -- label rows starting from the beginning of the fetched day
    date = os.date('%d.%b %H:%M', time - 86400)
    value = average / resolution
    row = string.format('%q,%q', date, value)
    table.insert(buffer, row)
    time = time + 60 * resolution
    count = 0
    average = 0
  end
end

if #buffer > 0 then
  data = table.concat(buffer, '\r\n')
  res, err = socket.ftp.put({
    host = 'x',
    user = 'x',
    password = 'x',
    command = 'appe',
    argument = 'x',
    source = ltn12.source.string(data)
  })
end
Thanks for the helping hand.
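Regarding point 3, the accumulate function could be sketched as a script scheduled to run once an hour. This is only a rough sketch, assuming the standard LM calls trends.fetch, grp.getvalue and grp.write; the trend name 'watt lys' and the group address '1/1/1' are placeholders for your own:
Code:
require('genohm-scada.trends')

-- fetch the past hour of one-minute samples
time = os.time()
dates = {
  stime = time - 3600,
  etime = time,
}
values = trends.fetch('watt lys', 'hour', dates) -- trend name is a placeholder

-- average the samples to get the hour's usage
total, count = 0, 0
for _, data in ipairs(values) do
  total = total + data[ 2 ]
  count = count + 1
end

if count > 0 then
  hourly = total / count
  -- add the last hour's usage to the running total
  -- '1/1/1' is a placeholder for your accumulator group address
  current = grp.getvalue('1/1/1') or 0
  grp.write('1/1/1', current + hourly)
end

Run as a scheduled script once an hour, the target object then grows by each hour's usage (500, then 700 in your example).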
Posts: 1764
Threads: 6
Joined: Jul 2015
Reputation: 117
Hi,
You currently use the trend type "Counter" for the power usage, which results in a total usage (as it should). If you use a second trend of type "Absolute value" on the same object, you get the trend you described: it will show only the absolute value per hour.
You can use your trends.fetch example to fetch the (absolute) power usage and write it to another object. The CSV can be grabbed like in the example you already have; just read out both trends and mail them to your account.
BR,
Erwin
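Following Erwin's suggestion, reading out several trends into one CSV with a column per trend could be sketched like this. It is only a sketch, assuming all trends return one-minute samples for the same period; the trend names are examples:
Code:
require('genohm-scada.trends')

-- trends to export, one CSV column each (names are examples)
names = { 'aktiv effekt L1', 'aktiv effekt L2', 'aktiv effekt L3' }
resolution = 60 -- minutes per CSV row

time = os.time()
dates = { stime = time - 86400, etime = time } -- last 24 hours

-- average each trend's one-minute samples into one value per row
columns = {}
for i, name in ipairs(names) do
  local values = trends.fetch(name, 'hour', dates)
  local col, sum, count = {}, 0, 0
  for _, data in ipairs(values) do
    sum = sum + data[ 2 ]
    count = count + 1
    if count == resolution then
      table.insert(col, sum / resolution)
      sum, count = 0, 0
    end
  end
  columns[ i ] = col
end

-- header row
header = { string.format('%q', 'Tid') }
for _, name in ipairs(names) do
  table.insert(header, string.format('%q', name))
end
buffer = { table.concat(header, ',') }

-- one row per period, one value column per trend
rowtime = dates.stime
for j = 1, #columns[ 1 ] do
  local row = { string.format('%q', os.date('%d.%b %H:%M', rowtime)) }
  for i = 1, #names do
    table.insert(row, string.format('%q', columns[ i ][ j ]))
  end
  table.insert(buffer, table.concat(row, ','))
  rowtime = rowtime + 60 * resolution
end

csv = table.concat(buffer, '\r\n')
-- csv can now be uploaded with socket.ftp.put as in the scripts above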
Posts: 176
Threads: 42
Joined: Jul 2015
Reputation: 2
(02.11.2015, 07:28)Erwin van der Zwart Wrote: …
Hi, and thanks for the feedback. I currently use the absolute value trend, and that gives me the total watts per hour. I just activated a counter trend for the same group address and got the following results:
Absolute value at 2.1.15 19:00 = 1805 W, while the counter trend for the same time basis gives me 18507 W.
Therefore I am a little uncertain what the counter value means, as it is approximately the absolute value times 10.