
Tutorial: Buffer Pipeline

This step-by-step interactive tutorial provides an introduction to Intelligence Hub Pipelines using the Buffer Stage.

What Does This Article Cover?

This tutorial will walk you through building a simple Pipeline to facilitate use cases that require buffering data for a time interval or window size. The tutorial also covers additional Pipeline stages that are common in buffer use cases: Buffer, Write New, Transform, and CSV. This tutorial includes the following sections.

  • Tutorial Prerequisites
  • Tutorial Preparation
  • Tutorial Instructions
  • Tutorial Summary

Tutorial Prerequisites

  1. This tutorial assumes you have completed the initial Installation of the Intelligence Hub.
  2. It is recommended to read the Getting Started Series before completing this tutorial exercise.
  3. It is recommended to complete "Tutorial: Connections", "Tutorial: Models and Instances", "Tutorial: Static Templating", and "Tutorial: Dynamic Templating".

Tutorial Preparation

  • Enable Intelligence Hub MQTT broker

    • In the left-hand navigation panel, navigate to Manage, and click Settings.
    • Under the MQTT Broker section, enable the broker. If ports 1885 and 1886 are already in use on your Intelligence Hub server, update them to ports of your choosing; otherwise, accept the defaults. Click the Save button.
  • Import the required Connections

    • In the left-hand navigation panel, navigate to Manage, and click Project.
    • Within the Import screen, ensure Full Project is off (otherwise your existing project will be overwritten).
    • Copy the JSON provided here. Project JSON.
    • Change the Import Type to JSON, paste the copied JSON into the Project box, and click the Import button.
  • Update the imported Connections as required

    • Navigate to Configure, select Connections, and select "HB_Tutorial_File", then update the File Settings to a directory accessible in your environment.  Note this is currently set to /files.

    • Navigate to Configure, select Connections, and select "HB_Tutorial_MQTT", then update the MQTT settings as required based on the broker settings from the first preparation step.

    • Navigate to Configure, click Connections, and select "HB_Tutorial_SQL_Server", then enter the following in the Password field and click the Save button.

      • password
  • Setup UNS Client

    • In the left-hand navigation panel, navigate to Tools and right click UNS Client and select Open Link in New Tab.
    • Enter your login information.
    • For Connection select "HB_Tutorial_MQTT".
    • For Subscribed Topics enter [Tutorial/#].  Do not include the brackets.
    • Click Add
    • Click the "x" to remove topic "#".

    • Click the Connect button.

    • Confirm the UNS Client says "Connected to HB_Tutorial_MQTT".

    • Return to the previous web browser tab.
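The `Tutorial/#` subscription uses MQTT's multi-level wildcard, which is why the UNS Client will receive every topic the Pipeline later publishes under `Tutorial/`. As a minimal sketch (this is illustrative JavaScript, not Intelligence Hub code), topic-filter matching works like this:

```javascript
// Minimal sketch of MQTT topic-filter matching, illustrating why the
// "Tutorial/#" subscription receives every topic published under
// "Tutorial/". Illustrative only; not Intelligence Hub code.
function topicMatches(filter, topic) {
  const f = filter.split("/");
  const t = topic.split("/");
  for (let i = 0; i < f.length; i++) {
    if (f[i] === "#") return true;                    // "#" matches all remaining levels
    if (i >= t.length) return false;                  // topic ran out of levels
    if (f[i] !== "+" && f[i] !== t[i]) return false;  // "+" matches exactly one level
  }
  return f.length === t.length;
}

console.log(topicMatches("Tutorial/#", "Tutorial/PipelineBuffer/CNC_01")); // true
console.log(topicMatches("Tutorial/#", "Enterprise/Site/Line1"));          // false
```

Removing the default `#` topic (as in the step above) simply narrows the client from "everything" to only the `Tutorial/` hierarchy.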

Tutorial Instructions

  • Review the source data

    • Navigate to Instances.
    • Open "HB_Tutorial_CNC_Asset_Info_Instance".
    • Click the Test Instance button to review the source data.
  • Build a new Pipeline

    • Navigate to Pipelines
    • Click the New Pipeline button in the upper right corner.
    • Within the Pipeline Start stage, specify "HB_Tutorial_Buffer_Pipeline" for the Name.
    • Leave the Empty Pipeline selection enabled.
    • Click the Finish button.
  • Add Stages to the Pipeline

    • Add a Flow Trigger stage to the Pipeline.
    • Select the "FlowTrigger" stage and use the Reference Panel to add the "HB_Tutorial_CNC_Asset_Info_Instance" Instance to the FlowTrigger stage References.
    • Change the polling interval to 2 seconds.
    • Add a Transform stage by dragging it to the working space

    • Enter the following and connect the Transform stage to the Pipeline Start stage.

      stage.setMetadata("AssetID", event.value.AssetID);
      
    • NOTE: This stage uses the setMetadata function to provide the Pipeline with the source's AssetID attribute. In this stage we are presenting the "AssetID" attribute value from the incoming source payload to the Pipeline as metadata. We'll use the AssetID as our BufferKey. Having a specified BufferKey ensures this Pipeline treats each incoming payload uniquely by AssetID.

    • Within the Stages panel, add a Timed Buffer stage.

    • Set the Window Interval to 10 seconds.

    • Update your Window Expression to the following.

    • NOTE: This expression ensures each incoming payload will be indexed by the AssetID.

      stage.setBufferKey(event.value.AssetID);
      
    • Connect the Transform Stage to the TimedBuffer Stage.

    • Within the Stages panel, add a Write New Stage.

    • For the new Write New Stage, enter "WriteNew_BufferMQTT" as the name.

    • Select "HB_Tutorial_MQTT" from the Use Connection drop-down list.

    • Within the Write New Stage, enter [Tutorial/PipelineBuffer/{{event.metadata.bufferKey}}] as the Topic.  Do not include the brackets.

    • Connect the TimedBuffer stage to the WriteNew_BufferMQTT stage.
    • Enable the Pipeline using the FlowTrigger stage.
    • Click the Save button.

    • View the result in the UNS Client.
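The Transform and Timed Buffer stages above can be sketched in plain JavaScript. This is a simplified model of the buffering behavior, assuming event payloads shaped like the tutorial's CNC instances (the `Temp` field is a hypothetical attribute for illustration); it is not the Intelligence Hub runtime.

```javascript
// Simplified model of the Transform + Timed Buffer stages: each event is
// keyed by its AssetID (the BufferKey), and each key accumulates its own
// array of payloads until the window elapses. Not Intelligence Hub code;
// payload fields other than AssetID are hypothetical.
function bufferByKey(events) {
  const buffers = new Map();
  for (const event of events) {
    const key = event.value.AssetID;        // mirrors stage.setBufferKey(event.value.AssetID)
    if (!buffers.has(key)) buffers.set(key, []);
    buffers.get(key).push(event.value);
  }
  // When the 10-second window elapses, each key is flushed as one array,
  // so downstream stages receive one combined payload per Asset.
  return buffers;
}

const flushed = bufferByKey([
  { value: { AssetID: "CNC_01", Temp: 71 } },
  { value: { AssetID: "CNC_02", Temp: 68 } },
  { value: { AssetID: "CNC_01", Temp: 73 } },
]);
console.log(flushed.get("CNC_01").length); // 2 readings buffered for CNC_01
```

This is why each Asset appears under its own `Tutorial/PipelineBuffer/<AssetID>` topic in the UNS Client: the BufferKey keeps each source's window separate.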

 

  • Add additional Stages to the Pipeline

    • Navigate to Pipelines

    • Select the "HB_Tutorial_Buffer_Pipeline".

    • Within the Stages panel, add a CSV stage by dragging it to the working space.

    • From the existing TimedBuffer stage, connect it to the CSV stage.

    • From the Stages panel, drag a Write New stage to the working area.

    • Within the new WriteNew stage, configure the name to WriteNew_File.

    • In the WriteNew_File stage, select "HB_Tutorial_File" from the drop-down list.

    • Within the WriteNew_File stage, configure the file name to "{{event.metadata.bufferKey}}.csv". This uses the previously configured BufferKey as the file name and ensures each Asset's payload is output independently.

    • From the existing CSV stage, connect to the WriteNew_File stage.

    • Click the Save button.

    • Navigate to the previously specified file directory and review the results.

      • NOTE: The specified buffer window will need to elapse before the files will be created.
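The CSV and WriteNew_File stages take each flushed buffer array and emit one file per BufferKey (e.g. `CNC_01.csv`). A minimal sketch of that conversion, with illustrative field names rather than the tutorial's actual schema:

```javascript
// Minimal sketch of the CSV stage: turn a buffered array of objects
// into CSV text, which WriteNew_File would then save as one file per
// BufferKey (e.g. "CNC_01.csv"). Field names are illustrative.
function toCsv(rows) {
  if (rows.length === 0) return "";
  const headers = Object.keys(rows[0]);          // header row from the first object
  const lines = [headers.join(",")];
  for (const row of rows) {
    lines.push(headers.map((h) => String(row[h])).join(","));
  }
  return lines.join("\n");
}

const buffered = [
  { AssetID: "CNC_01", Temp: 71 },
  { AssetID: "CNC_01", Temp: 73 },
];
console.log(toCsv(buffered));
// AssetID,Temp
// CNC_01,71
// CNC_01,73
```

Because the buffer flushes once per window, each file holds one window's worth of rows for a single Asset.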

 

Tutorial Summary

This tutorial covered several Pipeline stages: Transform, Buffer, CSV (File Format), and Write New. You created a Pipeline that takes an incoming payload, stages the AssetID as the BufferKey, and performs a Timed Buffer to create an array of data over the specified buffer period. This Pipeline has two separate output destinations: the first is MQTT and the second is a file directory. The Pipeline was built to accept many Asset sources, assuming each source contains an AssetID attribute.
