Script Data Targets

Overview

DataBlend supports users who want to create their own scripts within the DataBlend Script Data Target. The Script Data Target is a unique connector in that the connection is established, and the data is sent, from within the DataBlend application itself. Establishing the connection is simpler than for other DataBlend connections because this Data Target does not require a Credential to be created first. Scripts are written in JavaScript, with tie-ins to .NET provided by Jint. Any asynchronous .NET methods should use .Result to wait for the method to complete.
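As a minimal sketch of that pattern (the URL is only a placeholder), the script below calls an asynchronous .NET method and blocks on .Result:

const http = importNamespace('System.Net.Http');
const client = new http.HttpClient();
// GetStringAsync returns a Task<string>; .Result blocks until the call completes.
const body = client.GetStringAsync('https://example.com').Result;
log(`Fetched ${body.length} characters`);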

Configuration

Query (Required) - Select the query that will generate the data for this target. See the relevant target type documentation for additional details on what query columns are required for your target.

Query Mode (Required) - Select New unless you have a reason for choosing Latest or Specific.

Script (Required) - Enter any script to push data to the desired target.

Limits

Scripts are only allowed to run for one hour before being terminated. A script may also not allocate over 2GB of memory total during its execution.

Jobs

Parameters

Job parameters are interpolated before the script runs, so referencing a parameter in the script injects its value directly. For example, if a job has a parameter called “name” with a value of “test”, the script line

log('Script Data Targets');

will write “test” to a log entry.

Context

All job types listed below have the same three objects injected automatically into the JavaScript context

job - The current job execution object. The object is the same as what would be returned by the API for that job. For example, job.parent.name would return the name of the Collector when running a collection job.

log(string) - A function used to output messages to the job logs.

checkCancellation() - Asynchronous function that can be called at any time to see if the user has cancelled the job. This should be used between long-running tasks (e.g., loading data from multiple API calls); otherwise the job won’t be cancelled until the very end.
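A short illustrative sketch that uses all three objects (the paging loop is hypothetical; whether checkCancellation needs .Result depends on what it returns, per the note on asynchronous methods in the Overview):

log(`Starting ${job.parent.name}`);    // job metadata, in the same shape the API returns
for (let page = 0; page < 5; page++) {
    checkCancellation();               // call between long-running steps so cancellation takes effect early
    // ... load one page of data from an external API here ...
}
log('Finished');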

Additional context values unique to each job type, if any, are outlined below.

Collectors

schemaName - The schema name set in the parent collector.

createSchema(object) - Shortcut function used to create a schema from a JSON object.

startUpload(schema, collection, totalRecords) - Function used to signal the start of data ingestion. The schema can be created using the createSchema function, and the collection parameter is the job object mentioned above. totalRecords should only be set if the total number of records is known before all the data is uploaded; otherwise use null. This function returns a CollectionItem used in the next two functions.

sendData(collectionItem, records) - Function to upload partial or all data. The collectionItem parameter is the value returned from startUpload. records is an array of JSON objects to store. This method returns a CollectionItem that should be used for subsequent calls to sendData and finishUpload.

finishUpload(collectionItem) - Function used to signal the end of the stream of data. collectionItem is the value returned from sendData, or from startUpload if no data was sent.
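Putting these together, a complete collector script might look like the following sketch. The records are made up for illustration, and it assumes createSchema can build the schema from a sample record and that the helpers can be called synchronously as written:

const rows = [
    { id: 1, name: 'Alpha' },
    { id: 2, name: 'Beta' }
];
const schema = createSchema(rows[0]);              // build a schema from a sample JSON object
let item = startUpload(schema, job, rows.length);  // the collection parameter is the job object described above
item = sendData(item, rows);                       // upload the records; can be called repeatedly for batches
finishUpload(item);                                // signal that the stream of data is complete
log(`Uploaded ${rows.length} records to ${schemaName}`);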

Data Targets

records - Array of objects from the query execution converted into JSON objects.
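As a small illustrative sketch, a data target script can start by inspecting the records produced by the query before pushing them anywhere:

log(`The query produced ${records.length} records`);
if (records.length > 0) {
    log(JSON.stringify(records[0]));   // peek at the shape of the first record
}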

Tasks

No additional values.

Additional .NET objects available

In addition to the standard JavaScript objects, a few other .NET libraries are available to augment JavaScript’s native capabilities.

HttpClient

Any objects found within the System.Net.Http assembly can be used.

const http = importNamespace('System.Net.Http');
const client = new http.HttpClient();
const message = new http.HttpRequestMessage(http.HttpMethod.Get, 'some url');
const response = client.SendAsync(message).Result;
const content = JSON.parse(response.Content.ReadAsStringAsync().Result);

SftpClient

SFTP support is provided by Renci.SshNet

const ssh = importNamespace('Renci.SshNet');
const client = new ssh.SftpClient(new ssh.ConnectionInfo('host', 22, 'username', [new ssh.PasswordAuthenticationMethod('username', 'password')]));
client.Connect();

JSON

JSON support is provided by Newtonsoft.Json

const json = importNamespace('Newtonsoft.Json');
json.JsonConvert.SerializeObject(someObject);

Crypto

Cryptographic support is provided by System.Security.Cryptography

const sys = importNamespace('System');
const crypto = importNamespace('System.Security.Cryptography');
const text = importNamespace('System.Text');
const hmac = new crypto.HMACSHA256(text.Encoding.UTF8.GetBytes('key'));
const hash = hmac.ComputeHash(text.Encoding.UTF8.GetBytes('test'));
const hex = sys.BitConverter.ToString(hash);
log(hex.replaceAll('-', '').toLowerCase());

XML

XML support is provided by System.Xml

const xml = importNamespace('System.Xml');
const text = '<root attr="test"><test>value1</test><test>value2</test></root>';
const doc = new xml.XmlDocument();
doc.LoadXml(text);

CSV

CSV support is provided by CsvHelper

const sys = importNamespace('System');
const global = importNamespace('System.Globalization');
const io = importNamespace('System.IO');
const csv = importNamespace('CsvHelper');
const stream = new io.MemoryStream();
// load data into stream
stream.Seek(0, io.SeekOrigin.Begin);
const streamReader = new io.StreamReader(stream);
const csvReader = new csv.CsvReader(streamReader, global.CultureInfo.InvariantCulture, false);
const records = csvReader.GetRecords(new sys.Object().GetType()).ToList();

Microsoft Graph

Microsoft Graph support is provided by Microsoft.Graph
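The sketch below is illustrative only and makes several assumptions: a valid Graph access token is already available as accessToken, the installed SDK exposes the GraphServiceClient(HttpClient) constructor, and it uses the older .Request()-style request builders (newer SDK versions call GetAsync() directly on the builder):

const http = importNamespace('System.Net.Http');
const headers = importNamespace('System.Net.Http.Headers');
const graph = importNamespace('Microsoft.Graph');

// accessToken is a placeholder; acquiring a Graph token is not shown here.
const httpClient = new http.HttpClient();
httpClient.DefaultRequestHeaders.Authorization = new headers.AuthenticationHeaderValue('Bearer', accessToken);

const client = new graph.GraphServiceClient(httpClient);
const me = client.Me.Request().GetAsync().Result;
log(me.DisplayName);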

DataBlend API Clients

Most DataBlend objects have built-in API clients injected into the job context. For example, listing the first ten collectors available to the user running the job can be accomplished via the Collectors client.

const json = importNamespace('Newtonsoft.Json');
const collectors = Collectors
    .Search({limit: 10})
    .Result
    .Results
    .Select(c => c.Name);
log(json.JsonConvert.SerializeObject(collectors));

List of available clients

AgentPings
Agents
CollectionItems
CollectionLogs
Collections
Collectors
Credentials
CredentialTests
DataQualityReportExecutionLogs
DataQualityReportExecutions
DataQualityReports
DatasinkExecutionLogs
DatasinkExecutions
Datasinks
Datasources
DatasourceSchemas
PluginSchemas
Queries
QueryExecutionLogs
QueryExecutions
SimpleTaskExecutionLogs
SimpleTaskExecutions
SimpleTasks
Streams
StreamUploadLogs
StreamUploads
UnpivotExecutionLogs
UnpivotExecutions
Unpivots
UserProfiles
WorkflowExecutionLogs
WorkflowExecutions
Workflows
WorkflowTasks

Examples

Create Customer

const json = importNamespace('Newtonsoft.Json');
const http = importNamespace('System.Net.Http');
const text = importNamespace('System.Text');
const body = json.JsonConvert.SerializeObject({first_name: 'John', last_name: 'Doe', email: 'Test@mail.com'});
const message = new http.HttpRequestMessage(http.HttpMethod.Post, 'https://api.example.io/v1/contacts');
message.Content = new http.StringContent(body, text.Encoding.UTF8, 'application/json');
// apiKey is assumed to be available to the script (for example, via an interpolated job parameter)
message.Headers.Add('Authorization', `Bearer ${apiKey.CalculatedValue}`);
const client = new http.HttpClient();
const response = client.SendAsync(message).Result;
log(response.Content.ReadAsStringAsync().Result);

Advanced

History Retention

History Retention (Days) allows users to decide how long they want the information from their data targets to be stored. This field is optional.

Timeout (seconds)

The Timeout setting allows users to time out executions that take longer than a set number of seconds. This field is optional.

Skip If No Records Found

The optional “Skip If No Records Found” toggle prevents data from being sent to the Data Target unnecessarily: when enabled, the target is skipped if the query returns no records.

Agent

The Agent drop-down allows users to select any agent they have established. This setting is optional.

Run As

Run As allows users to select, from a drop-down list, the user the Workflow runs as. This is optional. Please note that Run As is only available to Admin users. If a user is set as the Run As user and is later demoted to a Member, the user who demoted them will be set as the Run As user instead.

Schedule & Presets        

Link to Schedule and Presets