Unifi 4.3 User Documentation
Dataset Extras

Import Logging

Inbound Datasets use ServiceNow Import Sets, so you can view Dataset imports as you would any other data import.

For detailed Activity Logs, navigate to the Transform page on a Dataset and enable the Import logging option.

Datasets with Import logging enabled will generate one Activity Log for each Import Set Row.

Dataset Cleanup

Dataset Requests record when each Dataset import and export occurred. They are linked directly to Transactions belonging to a Bond that is exclusive to the Dataset. The records are automatically deleted when the corresponding Transaction is deleted.

The number of Transactions to keep during the daily cleanup is controlled by the Cleanup option on each Dataset; this value specifies how many Transactions remain after the cleanup runs. The default retention is 10 records.

Any orphaned Dataset Requests, meaning those with no Dataset or no Transaction specified, are removed based on the Orphan Record Cleanup Age system property (x_snd_eb.orphan_record.cleanup.age). The default orphan cleanup age is 7 days.
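The orphan cleanup decision can be sketched in plain JavaScript. This is a simplified illustration only; the helper name and record shape are hypothetical, and the actual cleanup runs as a scheduled job inside Unifi.

```javascript
// Simplified sketch of the orphan cleanup decision (hypothetical helper,
// not the actual Unifi implementation).
function isOrphanEligibleForCleanup(request, cleanupAgeDays, now) {
  // A Dataset Request is orphaned when it has no Dataset or no Transaction.
  var isOrphan = !request.dataset || !request.transaction;
  if (!isOrphan) return false;

  // Only remove orphans older than the configured cleanup age (default 7 days).
  var ageMs = now - request.created;
  return ageMs > cleanupAgeDays * 24 * 60 * 60 * 1000;
}
```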

Export Tuning

When a Dataset export runs, the system automatically exports the records in small batches, depending on the Dataset's configuration.

Batch size options are:

  • Max size: set the maximum file size to export. There is an internal maximum size of 5MB.

  • Max rows: limit batches to a specific number of records.

  • Max time: prevent long running exports by setting a maximum export time.

The final export size is determined by whichever limit is reached first. If more records are available to be processed, additional Dataset Requests will be created until all records have been exported.
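The "whichever limit is reached first" behaviour can be sketched as follows. This is an illustrative helper in plain JavaScript with hypothetical names and limit values, not the real export code, which also enforces the 5MB internal maximum.

```javascript
// Sketch of batch building: stop at whichever limit is reached first
// (hypothetical helper, not the actual Unifi export code).
function buildBatch(records, limits) {
  var batch = [];
  var size = 0;
  var start = Date.now();
  for (var i = 0; i < records.length; i++) {
    var recordSize = JSON.stringify(records[i]).length;
    if (batch.length >= limits.maxRows) break;        // Max rows reached
    if (size + recordSize > limits.maxSize) break;    // Max size reached
    if (Date.now() - start > limits.maxTimeMs) break; // Max time reached
    batch.push(records[i]);
    size += recordSize;
  }
  // Leftover records would be exported in further Dataset Requests.
  return { batch: batch, remaining: records.slice(batch.length) };
}
```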

Create a Dataset Field Map

New Field Maps can be configured for handling specific types of data in new ways. We recommend using the "Dataset" prefix when creating your new field map so it is easy to identify.

Dataset specific Field Maps are slightly different to eBonding integration field maps. The main difference is that the Stage to Target script is executed as a ServiceNow Transform Map script. This means that certain objects are not available, including transaction, bond, and request.

These Field Maps should reference the Import Set Row fields using the standard $stage object. Field name conversion from the Import Set Row source object to the Field Map $stage object is handled automatically, i.e. "u_" prefixes are removed.
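The automatic field name conversion can be illustrated with a small sketch in plain JavaScript. The helper is illustrative only; Unifi performs this conversion internally.

```javascript
// Sketch of Import Set Row -> $stage field name conversion
// (illustrative only; Unifi handles this automatically).
function toStageObject(importSetRow) {
  var stage = {};
  Object.keys(importSetRow).forEach(function (name) {
    // Import Set Row columns are prefixed with "u_"; the prefix is removed
    // so Field Maps can use the original field names on $stage.
    var stageName = name.indexOf('u_') === 0 ? name.substring(2) : name;
    stage[stageName] = importSetRow[name];
  });
  return stage;
}
```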

The target object which represents the record being created or updated can be used normally.

The log object is updated to reference ws_console, meaning all logs are written to the Activity Log generated for each imported record. If required, logs can still be written to the Import Set Row using the source object.

Disable Dataset Import

By default, Datasets are automatically configured for both import and export. If you require a one-way data export without import, you can prevent inbound messages from being processed.

  1. From Integration Designer, navigate to the Dataset in your integration.

  2. Find the Send message and click the clickthrough button to open it.

  3. Change the Direction from Bidirectional to Outbound.

  4. Click Save.

Multiple Tables

Datasets are designed to work with one table at a time. If you need to import or export more than one table, set up a Dataset for each table.

Sending Data from a Third Party

Externally generated files can be processed by a Dataset. Files should match the file type the Dataset expects, e.g. CSV or JSON, and be streamed to the Dataset import endpoint. The maximum file size will depend on your instance configuration.

POST https://<instance>.service-now.com/api/x_snd_eb/unifi/<api_name>

Send a file to be processed by a Dataset.

Example: https://acme.service-now.com/api/x_snd_eb/unifi/incident/dataset?file_name=cmdb_ci.csv&reference=Sync%20Server

Query Parameters

  • file_name* (String): Name of the file

  • reference* (String): The name of the dataset

Headers

  • Content-Type* (String): text/csv OR application/json

  • x-snd-eb-message-name* (String): The name of the Send message, e.g. Send_<table>
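Putting the endpoint, query parameters, and headers together, a request might be composed like this. The instance name, API path, and message name below are placeholders taken from the example above; in a real client the file content would be streamed as the request body.

```javascript
// Sketch of composing a Dataset import request (placeholder values;
// the file itself would be streamed as the POST body).
function buildDatasetRequest(instance, apiName, fileName, reference, messageName, contentType) {
  var url = 'https://' + instance + '.service-now.com/api/x_snd_eb/unifi/' + apiName +
    '?file_name=' + encodeURIComponent(fileName) +
    '&reference=' + encodeURIComponent(reference);
  return {
    method: 'POST',
    url: url,
    headers: {
      'Content-Type': contentType,          // text/csv or application/json
      'x-snd-eb-message-name': messageName  // e.g. Send_<table>
    }
  };
}
```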

Using IRE (Identification and Reconciliation Engine)

It is possible to add support for ServiceNow IRE to your Dataset imports. This will push data through the reconciliation engine rather than directly updating the target records from the Import Set.

Follow these steps to configure your Dataset for use with IRE.

  1. Create a new Data Source.

  2. Create a Field Map for setting up the import for IRE.

  3. Create a Field Map to give the data to IRE.

  4. Add header and footer fields to the Dataset.

  5. Modify Field Maps being used by the Dataset.

Create a new Data Source

A new Data Source is required for IRE rules to be configured. You can add this manually to the sys_choice table on the discovery_source field, or create it using this fix script.

We recommend the data source name matches the integration name. This is how the IRE header Field Map is setup in this example.

If you are packaging Datasets in Scoped Applications, you should create a Fix Script that runs when the application is installed.

// Register the integration as a CMDB discovery source
var dsUtil = new global.CMDBDataSourceUtil();
dsUtil.addDataSource("ACME CMDB");
Create a Field Map for setting up the import for IRE.

Name

Use IRE - Header

Description

Sets up a Dataset Import for using the ServiceNow IRE (Identification and Reconciliation Engine) to import CMDB data.

Do not map fields that use this field map to a CMDB attribute. Ensure it is the first field to be processed by setting the field order to an appropriately small number.

This field map will create an $ire_item object on the $stage which will store all the values to give to IRE.

Stage to Target

// Setup an IRE item for this record
$stage.$ire_item = {
  className: target.getTableName(),
  internal_id: '' + target.sys_id,
  values: {},
  lookup: [],
  sys_object_source_info: {
    source_feed: '$[field.integration.name] Dataset ' + target.getTableName(),
    source_name: '$[field.integration.name]',
    source_native_key: '' + target.sys_id,
    source_recency_timestamp: '' + new GlideDateTime()
  }
};
Create a Field Map to give the data to IRE.

Name

Use IRE - Footer

Description

Use the ServiceNow IRE (Identification and Reconciliation Engine) to import CMDB data.

Do not map fields that use this field map to a CMDB attribute. Ensure it is the last field to be processed by setting the field order to an appropriately large number.

This field map will convert the inbound payload to the IRE format, call the createOrUpdateCI() function, and ignore the current import set row.

Stage to Target

var input = JSON.stringify({ items: [$stage.$ire_item] }, null, 2);

x_snd_eb.ws_console.debug('Payload for IRE is:\n\n %0', [input]);

var source_name = $stage.$ire_item.sys_object_source_info.source_name;
var output = sn_cmdb.IdentificationEngine.createOrUpdateCI(source_name, input);

x_snd_eb.ws_console.debug('Output from IRE:\n\n {0}', [output]);

ignore = true;
Add header and footer Fields to the Dataset.
  1. Navigate to the Process message from the Dataset.

  2. Add a new Field

    • Map to field is false

    • Field map is Use IRE – Header

    • Call the property IRE_Header

    • Set the Order to be 1 or less (it needs to be the first field on the message)

  3. Add a new Field

    • Map to field is false

    • Field map is Use IRE – Footer

    • Call the property IRE_Footer

    • Set the Order to be 10000 or more (it needs to be the last field on the message)

Modify Field Maps

Each Field needs to be registered for processing by IRE. This is achieved by adding the following snippets (depending on field type) to the end of the Stage to Target script for each Field Map being used in the Dataset.

Standard Field Map - Stage to Target

Most fields only need to tell IRE the field name and the value.

// Store field values for IRE
if ($stage.$ire_item) {
  $stage.$ire_item.values['$[field.element]'] = target.getDisplayValue('$[field.element]');
}

Reference Field Map - Stage to Target

Reference fields need an additional lookup object to pass into IRE.

// Store field values for IRE
if ($stage.$ire_item) {
  $stage.$ire_item.values['$[field.element]'] = target.getDisplayValue('$[field.element]');

  // Add the lookup object for reference fields
  $stage.$ire_item.lookup.push({
    className: target.$[field.element].getReferenceTable(), 
    values: {
      name: '' + target.$[field.element].getDisplayValue()
    }
  });
}

Remember to Build the Message/Integration when you have finished configuring the Fields.

Publishing with Run As user

When publishing Datasets (or any scheduled script) to an update set or application, especially if submitting to the ServiceNow Store, ensure the "Run as" field is emptied, since the user will likely not exist on the target system.
