Historians: Read Value Changes
Obtain and react to all value changes from a historian.
What Does This Article Cover?
Some solutions require obtaining all value changes from a historian, including late-arriving data. This article provides starter solutions for this specific scenario.
What is late-arriving data?
Late-arriving data refers to data changes that arrive at the historian with a timestamp that is earlier than the most recent data change already stored for a given point or tag. Late data may arrive due to the following.
- Network latency or outages
- Buffering at the source (e.g., a PLC, RTU, or edge device)
- Store-and-forward
- Time synchronization issues
- Edits made in the historian or by an external application writing to the historian
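The practical consequence is that a query of the form "return everything newer than the last timestamp I received" will silently skip late-arriving values. As a minimal, hypothetical sketch (the tag name, timestamps, and in-memory index below are made up for illustration), the comparison that defines late arrival looks like this:

```python
from datetime import datetime, timezone

# Hypothetical in-memory stand-in for "latest stored timestamp per tag".
latest_seen = {"Tank1.Level": datetime(2024, 5, 1, 12, 5, tzinfo=timezone.utc)}

def is_late_arriving(tag: str, value_timestamp: datetime) -> bool:
    """A value change is late-arriving when its timestamp is earlier than the
    most recent timestamp already stored for the same point or tag."""
    last = latest_seen.get(tag)
    return last is not None and value_timestamp < last

# A buffered value forwarded after an outage lands with an older timestamp.
print(is_late_arriving("Tank1.Level", datetime(2024, 5, 1, 11, 58, tzinfo=timezone.utc)))  # True
```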
General Intelligence Hub design considerations for obtaining value changes
- This design approach should only be used when it is necessary to obtain all value changes, including late-arriving data.
- Obtaining data by polling or aggregation may return less data and yield better data pipeline performance.
- The overall data pipeline solution might involve polling at a set interval or calling the pipeline externally.
- The resulting data payload might be written to an MQTT Broker, database, data lake, or data warehouse, or made available by the Intelligence Hub REST Data Server.
- When creating a solution, start with a small number of points or tags and a short time span.
- Avoid long-duration queries.
- Attempt to use a query that executes in a few seconds.
- Avoid Pipeline Debug mode and enabling Tracking Activity if the Intelligence Hub Pipeline is processing more than a few dozen value changes.
- This solution type is conducive to a narrow table format with one column for values (which may hold both numeric and non-numeric data) and little or no additional context carried with each value.
- Consider how data types will be normalized in the narrow table and the logic required to do so.
- It might be practical to have a column for numeric values and a separate column for non-numeric values (see the sketch after this list).
- Model the desired output schema using an Intelligence Hub Model.
- Asset, point, or tag context might be obtained infrequently and stored in a different table in the destination system.
- In this scenario, the time obtained by the Intelligence Hub Connection Input is the time that the respective value was written to the historian.
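As a minimal sketch of the narrow table idea referenced in the list above, the following example splits each value change into a numeric column and a non-numeric column. The record layout, field names, and helper function are hypothetical and are not Intelligence Hub constructs; the same shaping could be performed in a pipeline or in the destination system.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class NarrowRow:
    """One value change per row: minimal context, split numeric/string value columns."""
    point_id: str
    timestamp: datetime
    numeric_value: Optional[float] = None
    string_value: Optional[str] = None
    quality: Optional[str] = None

def to_narrow_row(point_id: str, timestamp: datetime, value, quality=None) -> NarrowRow:
    """Route the incoming value into the numeric or non-numeric column."""
    if isinstance(value, bool):
        # Booleans are stored as text here; storing 0/1 in the numeric column is another option.
        return NarrowRow(point_id, timestamp, string_value=str(value), quality=quality)
    if isinstance(value, (int, float)):
        return NarrowRow(point_id, timestamp, numeric_value=float(value), quality=quality)
    return NarrowRow(point_id, timestamp, string_value=str(value), quality=quality)
```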
AVEVA PI Data Archive design considerations for obtaining value changes
These considerations pertain to obtaining value changes for PI Data Archive PI Points.
- The Intelligence Hub AVEVA PI System Connection Point Changes Input is used to obtain value changes. It returns all adds, updates, inserts, and deletes for all PI Points identified in the query.
- A Reference list of PI Points or specific named PI Points may be used.
- Consider using the Point Browse Connection to obtain the list of Point Names and optimize the Cache Lifetime setting.
- The Connection Input creates a subscription via PI Data Pipes.
- The subscription remains active for ten minutes and is reactivated each time it is queried.
- The Connection uses the query as defined at the time of subscription creation.
- If PI Points are added or deleted, the query and/or subscription ID may need to be changed.
- The starter solution captures the PI Point's Point ID.
- The assumption is that the narrow table is linked to a table containing Point metadata (see the sketch after this list).
- A project file may be downloaded [here].
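As a rough illustration of how the change events described above might land in a destination system, the sketch below appends each event to a narrow value table keyed by Point ID, with Point metadata held in a separate table that is refreshed infrequently. The event dictionary shape, field names, and action labels are assumptions made for this example, not the actual payload of the Point Changes Input; inspect the payload in your own environment before relying on any field names.

```python
from datetime import datetime, timezone

# Hypothetical destination tables: values keyed by Point ID, metadata kept separately.
value_rows: list[dict] = []
point_metadata = {1001: {"name": "Sinusoid", "engunits": "%"}}  # refreshed infrequently

def handle_change_event(event: dict) -> None:
    """Append every change to the narrow table, preserving the action so inserts
    and deletes of historical values are not lost."""
    value_rows.append({
        "point_id": event["point_id"],
        "timestamp": event["timestamp"],
        "value": event.get("value"),
        "action": event["action"],  # e.g. "Add", "Update", "Insert", "Delete"
    })

handle_change_event({
    "point_id": 1001,
    "timestamp": datetime(2024, 5, 1, 11, 58, tzinfo=timezone.utc),
    "value": 42.7,
    "action": "Insert",  # a late-arriving value typically appears as an insert
})
```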
AVEVA PI System Asset Framework design considerations for obtaining value changes
These considerations pertain to obtaining value changes for PI Asset Framework Attributes.
- The Intelligence Hub PI Connection Asset Changes Input enables you to capture all additions, updates, inserts, and deletions for every asset attribute, including those linked to PI Points and those that are not.
- This method is recommended if you need to monitor changes to Static Values (asset attributes not mapped to PI Points) or require asset context to be part of the payload.
- If only PI Point value changes are needed, you may obtain the PI Points directly and leverage the PI Data Archive approach.
- The Connection Input creates a subscription via PI Data Pipes. The subscription remains active for ten minutes and is reactivated each time it is queried.
- The Connection uses the query as defined at the time of subscription creation. If assets are added or deleted in PI, the query and/or subscription ID may need to be changed.
- The starter solution includes the attribute name, asset name, and asset path to simplify understanding and implementation.
- For more robust designs, consider incorporating the element ID and attribute ID (see the sketch after this list). This approach assumes the narrow table is associated with destination system tables that store asset and attribute metadata.
- A project file may be downloaded [here].
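To make the last two bullets concrete, the hypothetical sketch below contrasts a starter-style row that repeats the attribute name, asset name, and asset path on every value change with a leaner row that carries only element and attribute IDs and relies on separate metadata tables. All identifiers, paths, and field names are illustrative.

```python
# Starter-style row: human-readable context repeated on every value change.
starter_row = {
    "asset_path": r"\\AF-SERVER\Plant\Area1\Pump01",
    "asset_name": "Pump01",
    "attribute_name": "Flow Rate",
    "timestamp": "2024-05-01T11:58:00Z",
    "value": 12.4,
}

# Leaner row for more robust designs: only IDs travel with the value.
robust_row = {
    "element_id": "elem-0001",    # hypothetical element ID
    "attribute_id": "attr-0007",  # hypothetical attribute ID
    "timestamp": "2024-05-01T11:58:00Z",
    "value": 12.4,
}

# Asset and attribute metadata live in separate tables that are refreshed
# infrequently and joined to the narrow value table in the destination system.
attribute_metadata = {
    "attr-0007": {
        "element_id": "elem-0001",
        "attribute_name": "Flow Rate",
        "asset_path": r"\\AF-SERVER\Plant\Area1\Pump01",
    },
}
```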
Additional Resources