There is a migratory bird that comes down to our Sydney area every late spring. It's a native cuckoo. I know this bird now but have never seen it. I only hear it, particularly at 3am, with its demented calling echoing in the night. In daylight, though, there is no order in the bush: all the trees are crooked, and the branches too. I went looking for the bird. That's what I did! I couldn't find anything. Lots of trees, sometimes sticks. I had my binoculars up there until my arms and eyebrows hurt, looking through all the scraggly gum trees with their branches going every which way. Couldn't see a thing. Then... a movement. I saw it. The picture changed, and suddenly I could hear it, see it, and understand that bird on the branch way over there. An instant later it had my full attention. I slowly raised the binoculars.
In monitoring large fields of seemingly disorganised and random data especially, we want to react fast to a change in any of it. We know what to do with it, and all of our reactions are pre-programmed. If nothing is changing in the field of view, then we don't waste time thinking or looking, just seeing; our eyes are there, just comparing things from moment to moment. Today I'm going to introduce the Intrinsic Data Functions. These intrinsic functions are different from the usual programming loop functions. Outside of engineering our SCD5200/RTU50 equipment, not much is known about them. One of the cool aspects of Intrinsic Data Functions is that they are configured and bound to each data point as required, and they are processed only if there is a change in their source data references. Not difficult conceptually, but a necessary component of where this series of blogs is going. Have patience and read on...
Imagine I have a piece of data and I happen to be an Intrinsic Data Function machine. As a machine I only respond to changes in data, and all my functions for each point of data are pre-configured. I've got to respond to a change in any data fast, at any time. I am monitoring thousands of points of data every second. Normally nothing much changes, so I just read the data and look for a change, and if nothing has changed since last time I don't do anything special with it. I just store it away with a time tag for the audit trail and for the next time I check.
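To make that "only act on change" idea concrete, here is a minimal sketch in Python. The names (DataPoint, on_scan) and the structure are my own illustration, not the actual SCD5200/RTU50 implementation; the point is simply that the time-tagged record is always kept, but the bound functions only run when the value has actually changed.

```python
# Illustrative sketch of change-driven processing with a time-tagged audit trail.
# DataPoint and on_scan are invented names for this example.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable, List, Optional, Tuple


@dataclass
class DataPoint:
    name: str
    last_value: Optional[float] = None
    # audit trail of (timestamp, value) pairs
    history: List[Tuple[datetime, float]] = field(default_factory=list)
    # intrinsic functions bound to this point, run only on change
    intrinsics: List[Callable[["DataPoint", float], None]] = field(default_factory=list)

    def on_scan(self, value: float) -> None:
        changed = value != self.last_value
        # always keep the time-tagged record for the audit trail
        self.history.append((datetime.now(timezone.utc), value))
        self.last_value = value
        if changed:
            # only now do we spend any effort: run whatever was configured
            for fn in self.intrinsics:
                fn(self, value)
```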
OK, but then one time the data changes. This is a measurement perhaps. An analog measurement. Now here is the divergence for you. In a PLC, for example, running in IEC 1131 style, the IO will all be read, all be processed, and all be written EVERY cycle. This takes a lot of processing and a lot of programming. What if we could configure the differential action on a piece of data in the general configuration of that data? Then, when the data changed, the machine would just do what was configured for it. All manner of simple functions could be programmed by configuration tick boxes rather than having to actually write up ladder logic or a sequential function chart.
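As a rough sketch of what "tick box" configuration might replace, consider a per-point record of simple options that a generic engine applies only when the value changes. The fields here (deadband, scale, high_alarm) are invented for the example and are not the product's actual configuration items.

```python
# Illustrative contrast: configuration-driven handling on change, instead of
# re-processing every point on every PLC-style scan cycle.
from dataclasses import dataclass
from typing import Optional


@dataclass
class PointConfig:
    deadband: float = 0.0               # suppress changes smaller than this
    scale: float = 1.0                  # raw-to-engineering scaling
    high_alarm: Optional[float] = None  # alarm limit, if configured


def handle_change(cfg: PointConfig, old: Optional[float], raw: float) -> Optional[float]:
    """Return the new engineering value, or None if the change is not significant."""
    value = raw * cfg.scale
    if old is not None and abs(value - old) < cfg.deadband:
        return None                     # inside the deadband: nothing to do
    if cfg.high_alarm is not None and value > cfg.high_alarm:
        print(f"ALARM: {value} exceeds {cfg.high_alarm}")
    return value
```

The interesting part is that the "program" for each point is just data in its configuration record, so adding behaviour to a point means ticking options rather than editing and re-deploying logic.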
What are the characteristics of this kind of implicit behaviour? Why use it?
The benefit of an IDF is that I can configure causal transfer functions between inputs whose relationship is well known at the time of database configuration. What might fit into this category?
An obvious case, with the development of more and more requirements for data crossovers and redundancy, is the need to come up with a representation of an input based on many relevant measurements and sources. Let's say for example that my input [P] is the best representation of the data as measured by, say, two measurements, {p1, p2}. What change in the set of measurements, if anything, could trigger a re-calculation of [P] in the database?
- A change in p1 or p2 value that takes it into a different quality.
- A change in p1 and/or p2 value itself.
- A change in p1 or p2 reachability.
The [IDF] is configured to look at many different conditions of {p1, p2}, including value, quality and reachability (sometimes included in quality), and functionally comes up with the BEST value for [P] given any of the listed changes.
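A minimal sketch of what a BEST-style intrinsic could look like is below. The quality model (GOOD/SUSPECT/BAD) and the tie-breaking rule (average equally good sources) are assumptions for the example, not the actual SCD5200/RTU50 behaviour; the point is that one small function, re-evaluated on any change to p1 or p2, is all that needs to be configured against [P].

```python
# Sketch of a BEST selection over two redundant sources; re-run whenever the
# value, quality or reachability of either source changes. Quality levels and
# tie-breaking are assumed for illustration.
from dataclasses import dataclass
from enum import IntEnum
from typing import Optional


class Quality(IntEnum):
    BAD = 0
    SUSPECT = 1
    GOOD = 2


@dataclass
class Measurement:
    value: float
    quality: Quality
    reachable: bool


def best(p1: Measurement, p2: Measurement) -> Optional[float]:
    """Return the best representation of [P] from {p1, p2}."""
    candidates = [p for p in (p1, p2) if p.reachable and p.quality != Quality.BAD]
    if not candidates:
        return None                      # no usable source: [P] goes invalid
    # prefer the highest-quality source; average if both are equally good
    top = max(c.quality for c in candidates)
    usable = [c.value for c in candidates if c.quality == top]
    return sum(usable) / len(usable)
```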
Note that all of this can be pre-configured for the point [P] at configuration time. It is possible, of course, to write a program to calculate the function BEST, but with an Intrinsic Data Function listed as BEST this is done intrinsically whenever there is a data change. A nice thing about IDFs is their ability to simplify programming by removing all of the pre-processing and transformations required to make the data 'sensible' - more from the philosopher Kant in later posts...
Since I'm travelling I have run out of time, but I leave you to think over this idea and what it means before I return. Any questions or comments are welcome as always. Chris [Shanghai, 4 Dec 2012]