
Tutorial: Buffer Pipeline

What Does This Article Cover?

This tutorial will walk you through building a simple Pipeline for use cases that require buffering data over a time window or up to a window size. The tutorial also covers additional Pipeline stages that are common in buffer use cases: Buffer, Write New, Transform, and CSV. This tutorial includes the following sections.

  • Tutorial Prerequisites
  • Tutorial Preparation
  • Tutorial Instructions
  • Tutorial Summary

Tutorial Prerequisites

  1. This tutorial assumes you have completed the initial Installation of the Intelligence Hub
  2. This tutorial assumes you have applied your Intelligence Hub license to your environment
  3. It is recommended to read the Getting Started Series before completing this tutorial exercise
  4. It is recommended to complete "Tutorial: Connections", "Tutorial: Models and Instances" and "Tutorial: Templating"

Tutorial Preparation

  1. Enable Intelligence Hub MQTT broker

    • In the left-hand navigation panel, navigate to Manage, and click Settings
    • Under the MQTT Broker section, enable the broker. If ports 1885 and 1886 are already in use on your Intelligence Hub server, update them to ports of your choosing; otherwise accept the defaults. Click Save
  2. Import the required Connections

    • In the left-hand navigation panel, navigate to Manage, and click Project
    • Within the Import screen, ensure Full Project is off (otherwise your existing project will be overwritten)
    • Change the Import Type to JSON, paste the following code block into the Project box, and click the Import button

Configuration Download

  3. Update the imported Connections as required

    • Navigate to Configure and click Connections. Click Tutorial_File and update the File Settings to a directory accessible from your environment - note this is currently set to /files

    • Navigate to Configure and click Connections. Click Tutorial_MQTT and update the MQTT settings as required based on preparation step #1 above

    • Navigate to Configure and click Connections, Click "Tutorial_CMMS_servicelogs" and type the following within the password field and click "Save"

                                password
      
  4. Set up the UNS Client

    • In the left-hand navigation panel, navigate to Tools, right-click UNS Client, and open the link in a new tab
    • Enter login information
    • For Connection select Tutorial_MQTT
    • For Subscribed Topics, remove the default wildcard entry # and subscribe to the topic "Tutorial/#"
    • Click Add
    • Click Connect and confirm UNS client says "Connected to Tutorial_MQTT"
    • Return to the previous tab
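    • NOTE: The "#" character is the MQTT multi-level wildcard, so subscribing to "Tutorial/#" matches every topic published under Tutorial. For example, once the Pipeline built below is running, you should see topics similar to the following (the AssetID values shown are illustrative):

      Tutorial/PipelineBuffer/CNC_01
      Tutorial/PipelineBuffer/CNC_02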

Tutorial Instructions

  1. Review the source data

    • Navigate to Instances
    • Open CNC_Asset_Info_Instance_Collection
    • Click Test Instance to review the source data
  2. Build a new Pipeline

    • Navigate to Pipelines
    • Click New Pipeline in the upper right corner
    • Within the Pipeline Start stage, specify "Tutorial_Buffer_Pipeline" for the Name
  3. Add Stages to the Pipeline

    • Within the Stages panel, add a Transform stage by dragging it to the working space

    • Enter the following and connect to the Pipeline Start

      stage.setMetadata("AssetID", event.value.AssetID);
      
    • NOTE: This stage uses the setMetadata function to present the "AssetID" attribute value from the incoming source payload to the Pipeline as metadata. We'll use the AssetID for our BufferKey; specifying a BufferKey ensures this Pipeline treats the incoming payloads from each AssetID separately.
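    • For illustration, assume the Instance publishes a payload similar to the sketch below (every attribute other than AssetID is hypothetical). After this stage runs, later stages and templates can read the value as event.metadata.AssetID:

      // Hypothetical incoming payload: event.value
      //   { "AssetID": "CNC_01", "Temperature": 72.4, "SpindleSpeed": 1200 }
      stage.setMetadata("AssetID", event.value.AssetID);
      // event.metadata.AssetID is now "CNC_01" for this event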

  • Within the Stages panel, add a Time Buffer

  • Set the Window Interval to 10 seconds

  • Update your Window Expression to the following

    • NOTE: This expression ensures each incoming payload is indexed by its AssetID (see the sketch below)

      stage.setBufferKey(event.value.AssetID);
      
  • Connect the Transform Stage with the TimeBuffer Stage
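
  • To illustrate the effect of the buffer key, here is a rough sketch assuming two hypothetical assets, CNC_01 and CNC_02, report during the same 10-second window. When the window elapses, the buffer emits one buffered event per key, each containing an array of the payloads collected for that key:

      // Window expression (configured above), evaluated for each incoming event:
      //   stage.setBufferKey(event.value.AssetID);
      // Illustrative output when the 10-second window closes (one event per key):
      //   bufferKey "CNC_01" -> [ { "AssetID": "CNC_01", ... }, { "AssetID": "CNC_01", ... } ]
      //   bufferKey "CNC_02" -> [ { "AssetID": "CNC_02", ... } ]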

  • Within the Stages panel, add a Write New Stage

  • For the new Write New Stage, enter "WriteNew_BufferMQTT" as the name

  • Using the References panel, map Tutorial_MQTT to the Write New Use Connection attribute

  • Within the Write New Stage, enter "Tutorial/PipelineBuffer/{{event.metadata.bufferKey}}" as the Topic (an illustrative expansion of this template follows these steps)

  • Within the Write New Stage, enable the Retain setting

  • Connect the TimeBuffer Stage with the WriteNew_BufferMQTT Stage

  • Click Submit
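
  • NOTE: As an illustration of the dynamic Topic, for a buffered event whose bufferKey is the hypothetical AssetID "CNC_01", the template resolves as shown below. Because Retain is enabled, the broker keeps the last array published on each resolved topic for new subscribers:

      // Topic template configured on the WriteNew_BufferMQTT stage:
      //   Tutorial/PipelineBuffer/{{event.metadata.bufferKey}}
      // Illustrative resolved topic for bufferKey "CNC_01":
      //   Tutorial/PipelineBuffer/CNC_01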

  4. Create a Flow to the Buffer Pipeline

    • Navigate to Flows
    • Click New Flow
    • Within the New Flow details view, enter "Tutorial_PipelineFlow" for the Flow name
    • Click Next
    • Within the New Flow Sources & Targets view, use the References panel and set the Type to Instance
    • Map CNC_Asset_Info_Instance_Collection to the Sources panel
    • From the References panel, set the Type to Pipeline
    • Map Tutorial_Buffer_Pipeline to the Targets panel
    • Click Next
    • Set the interval to 2 seconds, enable the Flow, and click Submit
    • Review the results within your UNS client
  5. Add additional Stages to the Pipeline

    • Navigate to Pipelines

    • Click Tutorial_Buffer_Pipeline

    • Within the Stages panel, add a CSV stage by dragging it to the working space

    • From the existing TimeBuffer stage, connect to the CSV file format stage

    • From the Stages panel, add another Write New stage by dragging it to the working area

    • Within the new Write New stage, set the name to "WriteNew_File"

    • Using the References Panel, map Tutorial_File to the WriteNew_File Use Connection setting

    • Within the WriteNew_File stage, set the file name to "{{event.metadata.bufferKey}}.csv". This uses the previously configured bufferKey as the file name and ensures each Asset's payload is written out to its own file

    • From the existing CSV stage, connect to the WriteNew_File stage

    • Click Save

    • Navigate to the previously specified file directory and review the results

      • NOTE: The specified buffer window must elapse before the files are created
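
      • For example, assuming an Instance whose attributes are AssetID, Temperature, and SpindleSpeed (the attribute names and values here are illustrative), the hypothetical asset CNC_01 would produce a file named CNC_01.csv containing a header row and one row per buffered payload:

        AssetID,Temperature,SpindleSpeed
        CNC_01,72.4,1200
        CNC_01,72.6,1195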

Tutorial Summary

This tutorial covered several Pipeline stages: Transform, Buffer, CSV (File Format), and Write New. You created a Pipeline that takes an incoming payload, sets the AssetID as the BufferKey, and performs a Time Buffer to build an array of data over the specified buffer period. The Pipeline has two output destinations: the UNS, where the buffered array data can be viewed, and the file directory. The Pipeline was built to accept many Asset sources, assuming each source contains an AssetID attribute.