I have some analog sensor threshold constants defined in my code. Even though the sensor value is delivered as a 16-bit integer, I normally like to define the thresholds using REAL data types, in engineering units (example: 112.5 °F). This makes the code easier to read and modify.
However, in the code, I need the threshold converted to the 16-bit integer for the algorithm to use.
In C, this was a pretty simple thing to do: I could use a #define macro to perform the conversion, and as long as my coding style kept the value const/read-only, the compiler would fold it into the 16-bit int at compile time (no REAL operations at run time, and no REAL data type stored in memory).
Is there something similar to this in structured text?
Anonymous - 2017-09-24
Originally created by: scott_cunningham
There is no equivalent to C's #define in Structured Text. The closest you can come is to define the conversion factor as a constant first (if you can) and then define your 16-bit constants as REAL × conversion.
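A sketch of that idea in Structured Text, assuming a hypothetical scaling of 10 raw counts per °F (the names and factor are made up for illustration). Whether a conversion call like REAL_TO_INT is accepted inside a constant initializer depends on the compiler/CODESYS version, so treat this as a syntax sketch, not a guarantee:

```
VAR_GLOBAL CONSTANT
    // Assumed scaling: sensor delivers 10 counts per °F (hypothetical)
    COUNTS_PER_DEGF  : REAL := 10.0;

    // Threshold in readable engineering units
    TEMP_LIMIT_DEGF  : REAL := 112.5;

    // Derived 16-bit threshold for the algorithm.
    // Constant-expression initializers with conversion functions
    // may not compile on every target/version.
    TEMP_LIMIT_RAW   : INT  := REAL_TO_INT(TEMP_LIMIT_DEGF * COUNTS_PER_DEGF);
END_VAR
```

If the initializer is rejected, you can still assign TEMP_LIMIT_RAW once at startup and treat it as read-only by convention.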
But I would suggest just defining the REAL and doing the conversion during code execution with a function. Unless you are running a low-power embedded system, you won't notice a problem, and even there you probably wouldn't notice anything, since most targets have floating-point hardware anyway.
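The run-time-conversion approach could look like this small ST function (names and the 10-counts-per-°F scaling are assumptions for illustration; note REAL_TO_INT rounds to the nearest integer):

```
FUNCTION DegFToRaw : INT
VAR_INPUT
    degF : REAL;    // threshold in engineering units, e.g. 112.5
END_VAR
VAR CONSTANT
    COUNTS_PER_DEGF : REAL := 10.0;  // assumed sensor scaling
END_VAR

// Rounds to nearest; on a target with an FPU this costs almost nothing
DegFToRaw := REAL_TO_INT(degF * COUNTS_PER_DEGF);
END_FUNCTION
```

Then the algorithm simply compares the raw sensor word against DegFToRaw(112.5), and the readable REAL constant stays in one place.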
In my machine programs, I mostly convert all my variables to REALs first: fieldbus values get converted to REAL every PLC scan. That confines overflow/underflow errors to one small area, and it's much easier to debug when I see DriveTemp = 57.3 (°C) instead of 573.
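That per-scan scaling might look like the following (the raw variable name and the factor-of-10 scaling are assumptions matching the 573 → 57.3 example):

```
VAR
    rawDriveTemp : INT;   // raw fieldbus word from the drive, e.g. 573
    DriveTemp    : REAL;  // temperature in °C for the rest of the program
END_VAR

// Called once per PLC cycle, right after the fieldbus inputs are read:
DriveTemp := INT_TO_REAL(rawDriveTemp) / 10.0;  // 573 -> 57.3
```

Everything downstream then works in engineering units, and any scaling bug lives in this one block.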
Famous quote: "The real problem is that programmers have spent far too much time worrying about efficiency in the wrong places and at the wrong times; premature optimization is the root of all evil (or at least most of it) in programming." (Donald Knuth)