
Tutorial: Transform Pipeline

This step-by-step interactive tutorial provides an introduction to Intelligence Hub Pipelines.

What Does This Article Cover?

This tutorial walks you through building a Pipeline for use cases that require transformations to the data structure. It features the Intelligence Hub Pipeline Flatten, Breakup, and Write New stages.

  • Tutorial Prerequisites
  • Tutorial Preparation
  • Tutorial Instructions
  • Tutorial Summary

Tutorial Prerequisites

  1. You have completed the initial Installation of the Intelligence Hub.
  2. It is recommended to read the Getting Started Series before completing this tutorial exercise.
  3. It is recommended to complete "Tutorial: Connections", "Tutorial: Models and Instances", "Tutorial: Static Templating", and "Tutorial: Dynamic Templating".

Tutorial Preparation

  • Enable Intelligence Hub MQTT broker

    • In the left-hand navigation panel, navigate to Manage, and click Settings.
    • Under the MQTT Broker section, enable the broker. If ports 1885 and 1886 are already in use on your Intelligence Hub server, update them to ports of your choosing; otherwise, accept the defaults. Click the Save button.
  • Import the required Connections

    • In the left-hand navigation panel, navigate to Manage, and click Project.
    • Within the Import screen, ensure Full Project is off (otherwise your existing project will be overwritten).
    • Copy the JSON provided here: Project JSON.
    • Change the Import Type to JSON, paste the copied JSON into the Project box, and click the Import button.
  • Update the imported Connections as required

    • Navigate to Configure, click Connections, select "HB_Tutorial_MQTT", and update the MQTT settings as required based on the broker configuration from the first preparation step.

    • Navigate to Configure, click Connections, select "HB_Tutorial_SQL_Server", enter the following in the password field, and click the Save button.

        password

  • Setup UNS Client

    • In the left-hand navigation panel, navigate to Tools, right-click UNS Client, and select Open Link in New Tab.
    • Enter login information.
    • For Connection select "HB_Tutorial_MQTT".
    • For Subscribed Topics enter [Tutorial/#].  Do not include the brackets.
    • Click Add.
    • Click the "x" to remove topic "#".
    • Click the Connect button.
    • Confirm the UNS Client says "Connected to HB_Tutorial_MQTT".
    • Return to the previous web browser tab.
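
If you prefer to verify the broker and the Tutorial/# subscription from outside the UNS Client, the following is a minimal sketch using the Python paho-mqtt package (version 2.x assumed). The host, port, and anonymous access are assumptions based on the default broker settings above; adjust them to match your configuration.

    # Minimal MQTT subscriber sketch (assumes paho-mqtt 2.x: pip install paho-mqtt).
    # Host, port, and anonymous access are assumptions based on the default
    # broker settings above; adjust to match your Intelligence Hub broker settings.
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, message):
        # Print every message published under the tutorial topic namespace.
        print(f"{message.topic}: {message.payload.decode('utf-8')}")

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
    client.on_message = on_message
    client.connect("localhost", 1885)   # default non-TLS port from the preparation step
    client.subscribe("Tutorial/#")      # same subscription as the UNS Client
    client.loop_forever()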

Tutorial Instructions

  • Review the source data

    • Navigate to Instances

    • Open "HB_Tutorial_Models_Enterprise_CNC_Instance"

    • Click the Test Instance button to review the source data. Note that the payload is nested as a result of the parent and child model configuration; a hypothetical example of this structure is sketched below.
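
For reference, the nested payload produced by this parent and child model configuration looks roughly like the sketch below. Only the child attribute names "Values" and "ProgramInfo" come from this tutorial; every other name and value is a hypothetical placeholder, and your Test Instance output will differ.

    # Hypothetical shape of the nested instance payload. Attribute names other
    # than "Values" and "ProgramInfo" are illustrative placeholders only.
    cnc_instance = {
        "Machine": "CNC_01",        # placeholder parent attribute
        "Values": {                 # child model instance
            "SpindleSpeed": 1200,
            "FeedRate": 85.5,
        },
        "ProgramInfo": {            # child model instance
            "ProgramName": "PART_44",
            "StepNumber": 7,
        },
    }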

  • Build a new Pipeline

    • Navigate to Pipelines
    • Click the New Pipeline button in the upper right corner.
    • Within the Pipeline Start stage, specify "HB_Tutorial_Transform_To_MQTT" for the Name.
    • Leave the Empty Pipeline selection enabled.
    • Click the Finish button.
  • Add Stages to the Pipeline.
    • Add a Flow Trigger stage to the Pipeline.
    • Select the "FlowTrigger" stage and use the Reference Panel to add the "HB_Tutorial_Models_Enterprise_CNC_Instance" Instance to the Flow stage References. 
    • Change the polling interval to 5 seconds.
    • Add a WriteNew stage by dragging it to the working space.
    • Connect the Pipeline start with the WriteNew stage.
    • Update the WriteNew stage name to "WriteNew_Original".
    • Select "HB_Tutorial_MQTT" from the Use Connection drop down list.
    • Within the new Write New stage, enter [Tutorial/Transform/Original] as the Topic.  Do not include the brackets.
    • Enable the Pipeline using the FlowTrigger stage.
    • Click the Save button.
    • View the results in the UNS client.
    • Navigate to Pipelines.
    • Click "HB_Tutorial_Transform_To_MQTT".

    • Add a Flatten Stage by dragging it to the working space.

    • Connect the Pipeline start with the Flatten stage.

    • Add a WriteNew stage by dragging it to the working space.

    • Update the WriteNew stage name to "WriteNew_Flatten".

    • Select "HB_Tutorial_MQTT" from the Use Connection drop down list.
    • Within the new Write New stage, enter [Tutorial/Transform/Flattened] as the Topic.  Do not include the brackets.
    • Connect the Flatten stage with the WriteNew_Flatten stage.
    • Enable the Pipeline using the FlowTrigger stage.
    • Click the Save button.
    • View the results in the UNS client. Note that the child instance attributes "Values" and "ProgramInfo" have been flattened into a single JSON object; a conceptual sketch of this flattening appears below.
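
Conceptually, the Flatten stage collapses nested child objects into a single level of keys. The sketch below is only an illustration of that idea, not the Intelligence Hub implementation; in particular, the delimiter used to join parent and child attribute names is an assumption.

    # Illustrative flattening of a nested payload into a single-level object.
    # The "_" delimiter is an assumption; the Flatten stage's actual key naming
    # may differ.
    def flatten(obj, prefix=""):
        flat = {}
        for key, value in obj.items():
            name = f"{prefix}_{key}" if prefix else key
            if isinstance(value, dict):
                flat.update(flatten(value, name))   # recurse into child objects
            else:
                flat[name] = value
        return flat

    example = {"Values": {"SpindleSpeed": 1200}, "ProgramInfo": {"ProgramName": "PART_44"}}
    print(flatten(example))
    # {'Values_SpindleSpeed': 1200, 'ProgramInfo_ProgramName': 'PART_44'}
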
    • Navigate to Pipelines

    • Click "HB_Tutorial_Transform_To_MQTT".

    • Add a Breakup Stage by dragging it to the working space

    • Connect the Flatten stage with the Breakup stage.

    • Within the Breakup Stage, set the Breakup Type to All.

    • Within the Stages panel, add a WriteNew stage by dragging it to the working space.

    • Update the WriteNew stage name to "WriteNew_Breakup".

    • Select "HB_Tutorial_MQTT" from the Use Connection drop down list.
    • Within the new Write New stage, enter [Tutorial/Transform/Breakup/{{event.metadata.breakupName}}] as the Topic.  Do not include the brackets.
    • Connect the Breakup stage to the WriteNew_Breakup stage.

    • Enable the Pipeline using the FlowTrigger stage.
    • Click the Save button.
    • Review the results within your UNS client. NOTE: The values are now broken up into individual MQTT topics using the {{event.metadata.breakupName}} value in the topic; this includes the flattened child attributes. A conceptual sketch of the breakup behavior follows this list.
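
Conceptually, when the Breakup Type is set to All, the Breakup stage splits a single event into one event per attribute, and {{event.metadata.breakupName}} resolves to each attribute's name, so every value is written to its own topic. The sketch below illustrates that behavior under those assumptions (the attribute names are hypothetical placeholders); it is not the Intelligence Hub implementation.

    # Illustrative breakup of a flattened payload into one topic per value,
    # mirroring Tutorial/Transform/Breakup/{{event.metadata.breakupName}}.
    # The attribute names below are hypothetical placeholders.
    flattened = {
        "Values_SpindleSpeed": 1200,
        "ProgramInfo_ProgramName": "PART_44",
    }

    for breakup_name, value in flattened.items():
        topic = f"Tutorial/Transform/Breakup/{breakup_name}"
        print(topic, "->", value)
    # Tutorial/Transform/Breakup/Values_SpindleSpeed -> 1200
    # Tutorial/Transform/Breakup/ProgramInfo_ProgramName -> PART_44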

 

Here is the completed Project JSON for this tutorial section.

Tutorial Summary

In this tutorial you built a Pipeline composed of Flatten, Breakup, and Write New stages. These stages are commonly used in use cases that require changes to the data structure. By using Pipelines, Data Engineers can ensure that data is output to target system(s) in the required or expected format.