Hi!
In a big project we have a heavy traffic issue (high CPU/IO load).
For example, we have a function roughly like the sketch below.
We have a special KNX scene which turns on lights with a brightness that depends on the time of day.
For example, the light brightness changes between 9 am and 1 pm.
We use a scheduler script to store the brightness value of each light,
and we fetch these values and execute grp.write() for each light from the event script (as we cannot use the scene function for this).
The number of lights associated with this script is sometimes over 30.
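Roughly, the event script looks like this (a simplified sketch, not our exact code: the storage key 'light_levels' and the address-to-brightness table format are placeholders):

-- simplified sketch of our event script; 'light_levels' is a placeholder key
-- a scheduler script has stored a table of { [group address] = brightness }
local levels = storage.get('light_levels', {})

for addr, brightness in pairs(levels) do
  -- one telegram per light; with 30+ lights this bursts onto the bus at once
  grp.write(addr, brightness)
end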
Ideally we want to limit the KNX telegrams to at most 20 per second, but I am not sure how to control that from an event script.
In such cases, which option is preferable?
・Use the os.sleep() function in the execution loop, as in the first sketch after this list (the 2016 LM manual recommended against using the sleep function, but can we do this now that your 2022 manual no longer mentions it?)
・Use the Redis functions to build a queue system, as in the second sketch below
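To make the question concrete, this is roughly what we mean by option 1 (assuming os.sleep() accepts fractional seconds; 0.05 s is simply 1/20 s):

-- option 1 sketch: throttle inside the loop
local levels = storage.get('light_levels', {})

for addr, brightness in pairs(levels) do
  grp.write(addr, brightness)
  os.sleep(0.05) -- at most 20 telegrams per second; 30 lights take about 1.5 s
end

Our worry here is that the event script then stays busy for the whole duration, which is presumably why the old manual warned against it.

And for option 2, something like a queue drained by a resident script (this assumes storage.exec() can pass raw Redis list commands through, which we are not sure about; the key name 'tx_queue' is our own):

-- option 2 sketch, event script: enqueue instead of writing directly
local levels = storage.get('light_levels', {})
for addr, brightness in pairs(levels) do
  storage.exec('rpush', 'tx_queue', addr .. '=' .. tostring(brightness))
end

-- option 2 sketch, resident script with a 1-second sleep interval:
-- drain at most 20 queued writes per run
for i = 1, 20 do
  local item = storage.exec('lpop', 'tx_queue')
  if not item then
    break
  end
  local addr, value = item:match('^(.-)=(.+)$')
  if addr then
    grp.write(addr, tonumber(value))
  end
end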
If there is any other way you would recommend, we would really appreciate it!
Thank you in advance.