
Introducing Dynamic Variables and Batching

 

January 24, 2019  |  By Ludwig Tirazona

Recently, we’ve introduced two new capabilities that make an integration engineer’s life even easier when using Reekoh: Dynamic Variables and Batching. In this post I’m going to go into detail on each of them and show how they’re used in Reekoh’s Pipeline Studio.

 

Dynamic Variables

Dynamic Variables allow a plugin’s configuration fields to be populated automatically, based either on date-related variables or on properties of the data being sent to the plugin.

To demonstrate, let’s set up a simple Pipeline with an HTTP Gateway and SFTP Connector (SFTP Connectors were just released in our last product update):

Then, let’s send two data points to the Pipeline. We want to use the JSON field called filename as the name of the file created on the SFTP server.

{
  "device": "",
  "content": "1 2 3 4 5 6 7 8 9 10",
  "filename": "numbers"
}

{
  "device": "",
  "content": "A B C D E F G H I J",
  "filename": "letters"
}
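For illustration, the two data points could be posted to the Pipeline from a short script. The gateway URL below is a placeholder, not a real Reekoh endpoint, and the helper names are our own:

```python
import json
from urllib.request import Request, urlopen

# Hypothetical gateway address -- substitute your HTTP Gateway's endpoint.
GATEWAY_URL = "http://localhost:8080/ingest"

def build_payload(device, content, filename):
    """Serialise the JSON body the Pipeline expects."""
    return json.dumps({
        "device": device,
        "content": content,
        "filename": filename,
    }).encode("utf-8")

def post(payload):
    """POST one message to the HTTP Gateway."""
    req = Request(GATEWAY_URL, data=payload,
                  headers={"Content-Type": "application/json"})
    return urlopen(req)

# With a running gateway, the two data points would be sent like so:
# post(build_payload("", "1 2 3 4 5 6 7 8 9 10", "numbers"))
# post(build_payload("", "A B C D E F G H I J", "letters"))
```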

 

Fields that support Dynamic Variables can be configured using double curly braces (i.e. {{ }}). In this case we want to use the field called filename to dynamically set the file name. The configuration of the field should have the following syntax.

{{filename}}

 

The result is that whenever we post a JSON message with a filename field, the value of that field is used as the name of the file created on the SFTP server.
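Conceptually, the substitution works like a simple template render: each {{field}} token in the configuration is replaced with the matching property of the incoming message. A minimal sketch of that idea (illustrative only, not Reekoh's actual implementation):

```python
import re

def render(template, message):
    """Replace each {{field}} token with the matching message property.

    Tokens with no matching property are left untouched.
    """
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(message.get(m.group(1), m.group(0))),
        template,
    )

# A configuration of "{{filename}}.txt" applied to the first data point:
print(render("{{filename}}.txt",
             {"device": "", "content": "1 2 3", "filename": "numbers"}))
# numbers.txt
```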

Examples of Date Based Dynamic Variables

Time formatted as Unix epoch microseconds:

  • {{now_epoch_microseconds}} Time now
  • {{1m_epoch_microseconds}} 1 minute ago
  • {{5m_epoch_microseconds}} 5 minutes ago
  • {{15m_epoch_microseconds}} 15 minutes ago
  • {{1h_epoch_microseconds}} 1 hour ago
  • {{6h_epoch_microseconds}} 6 hours ago
  • {{12h_epoch_microseconds}} 12 hours ago
  • {{24h_epoch_microseconds}} 24 hours ago

ISO 8601 format (no timezone):

  • {{now_iso_6801}} Time now
  • {{1m_iso_6801}} 1 minute ago
  • {{5m_iso_6801}} 5 minutes ago
  • {{15m_iso_6801}} 15 minutes ago
  • {{1h_iso_6801}} 1 hour ago
  • {{6h_iso_6801}} 6 hours ago
  • {{12h_iso_6801}} 12 hours ago
  • {{24h_iso_6801}} 24 hours ago
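The values behind these variables can be reproduced with standard date arithmetic. A rough sketch of what each family of variables resolves to (the function names are ours, not Reekoh's):

```python
from datetime import datetime, timedelta, timezone

def epoch_microseconds(minutes_ago=0):
    """Unix epoch time in microseconds, offset into the past."""
    t = datetime.now(timezone.utc) - timedelta(minutes=minutes_ago)
    return int(t.timestamp() * 1_000_000)

def iso_no_timezone(minutes_ago=0):
    """ISO 8601 timestamp with no timezone suffix."""
    t = datetime.now(timezone.utc) - timedelta(minutes=minutes_ago)
    return t.replace(tzinfo=None, microsecond=0).isoformat()

# Rough equivalents of the variables above:
# {{now_epoch_microseconds}} -> epoch_microseconds()
# {{15m_epoch_microseconds}} -> epoch_microseconds(15)
# {{24h_iso_6801}}           -> iso_no_timezone(24 * 60)
```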

 

Current Plugins Supporting Dynamic Variables

  • InfluxDB Stream
  • FTP Connector
  • SFTP Connector

Q1 2019 Roadmap (Plugins that will have Dynamic Variables enabled)

  • Webhooks Connector
  • Azure SQL Connector
  • PowerBI Connector
  • Splunk Connector

 

Batching

Not all external services support real-time data streaming, so batching is a method that allows Reekoh to work with these services. Some key concepts in Reekoh’s batching paradigm are:

  • A storage mechanism for the Batch while real-time data is coming in. Currently, Reekoh uses a file-based storage mechanism (with SQL and MongoDB back-ends next on the roadmap).
  • A schedule/interval for when a new Batch is created. For example, this could be at 11pm every day or every 4 hours.
  • A unique ID for each Batch so data for that Batch can be retrieved.
  • When a Batch is ready (either because the scheduled time has arrived or the interval has elapsed), the Batching Service will emit a special message containing the Batch ID.
  • This message can be used by a Converter to download the Batch and manipulate it before sending it to a Connector that supports Batching.
  • By default, most Connectors don’t know how to process a Batch ID or a Batch ID message; for a Connector to process the batch message, support has to be added to its code.
  • Reekoh is updating commonly used Connectors to act in either Streaming Mode (send data as it arrives) or Batching Mode, without customers having to configure anything extra.
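To make the moving parts concrete, here is a minimal file-backed sketch of the concepts above (batch storage, a unique Batch ID, and the special Batch ID message). It is illustrative only and not Reekoh's implementation; the class and field names are our own:

```python
import json
import os
import tempfile
import uuid

class FileBatcher:
    """Sketch of a file-based batching service."""

    def __init__(self, directory=None):
        self.directory = directory or tempfile.mkdtemp()
        self._new_batch()

    def _new_batch(self):
        # A unique ID so the batch's data can be retrieved later.
        self.batch_id = uuid.uuid4().hex
        self.path = os.path.join(self.directory, self.batch_id + ".jsonl")

    def add(self, message):
        """Append a real-time message to the current batch file."""
        with open(self.path, "a") as f:
            f.write(json.dumps(message) + "\n")

    def close_batch(self):
        """Called on the schedule/interval.

        Returns the special Batch ID message and starts a fresh batch.
        """
        emitted = {"batchId": self.batch_id}
        self._new_batch()
        return emitted
```

A Converter downstream would receive the emitted message, look up the file by its batchId, and manipulate the contents before handing them to a batching-aware Connector.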

 

Let’s update our pipeline to include a Batching Service plugin:

Then we can send four messages like this one to the HTTP Gateway:

{
  "device": "",
  "content": "A B C D E F G H I J",
  "filename": "letters"
}

 

The Batching Service will batch these into a file and then emit the unique Batch ID as a message:

 

A Connector joined to the Batching Service must support batching; if it does not, it will not be able to process the Batch ID message or send the batch file on.

Current Connectors Supporting Batching

  • FTP Connector
  • SFTP Connector

Roadmap for Batching Connectors (Q1 2019)

  • SMTP Connector
  • Webhooks Connector

Roadmap for New Batching Services (Q1 2019)

  • SQL Based Batches (SQL Table for batching)
  • MongoDB Based Batches (collections)