Here you will find details of what's changed in this release, including new features & improvements, deprecated features and general fixes.
Welcome to the release notes for Unifi - Version 2.2. Please have a read through to see new features and fixes that have been added.
Please note that, as with every release, there may be some changes that are not entirely compatible with your existing integrations. While we do everything we can to make sure you won’t have to fix your integrations when upgrading, we strongly encourage all our customers to perform full end-to-end tests of their integrations before upgrading Unifi in Production.
We also highly recommend aligning your Unifi upgrade with your ServiceNow upgrade. This means you only need to test your integrations one time rather than once for the ServiceNow upgrade and once for the Unifi upgrade.
We really appreciate feedback on what we’re doing - whether it’s right or wrong! We take feedback very seriously, so if you feel you’d give us anything less than a 5 star rating, we’d love to hear from you so we can find out what we need to do to improve!
If you would rate us 5 stars, and haven’t left a review on the ServiceNow Store yet, we’d be grateful if you would head over there to leave us your feedback. It only takes a few minutes and really does help us a lot. Go on, you know you want to leave a 5-star review on the Store!
We’ve tested and verified that Unifi continues to work as expected on the latest Paris release of ServiceNow.
We’ve introduced a brand new feature that makes exporting and migrating integrations seriously easy, 1-click easy in fact. You can now click the “Package” button on an integration in the Portal and the system will gather all the related files that make that integration work and put them in a brand new update set and download it to your desktop. This feature does require the latest version of the Unifi Global Utility.
An API name is automatically generated when creating a new Process. [UN-192]
Integration Diagnostic now checks for global utility installation and version. [UN-369]
Millisecond time fields have been added to transactional records. [UN-541]
Integration Diagnostic now checks for hotfix version. [UN-682]
Activity Log will show “No logs.” if no logs are generated. [UN-768]
Errors are no longer duplicated in Activity Log to make it easier to debug. [UN-769]
Improved documentation interface in UID. [UN-772]
The display value for Connections now includes the Integration. [UN-347]
Bond reference method is no longer available for Response and Receipt type Messages. [UN-603]
Integration no longer controls the number of attachments per message as this is done by the message itself. [UN-651]
Added the connection variable page to Portal. [UN-656]
Copying a Connection Variable now works. [UN-657]
Fixed icon alignment on Messages list in UID. [UN-755]
Fixed typo with Event Action. [UN-659]
Message Scripts now only throw their own errors (e.g. with Fields). [UN-749]
Improved spacing with UID Dashboard grid items. [UN-770]
Here you will find details of what's changed in this release, including new features & improvements, deprecated features and general fixes.
The ShareLogic Unifi 3.0 release includes a range of new features, improvements and fixes. Read on to find out about exciting new capabilities that will further enhance your experience.
Please note that, as with every release, there may be some changes that are not entirely compatible with your existing integrations. While we do everything we can to make sure you won’t have to fix your integrations when upgrading, we strongly encourage all our customers to perform full end-to-end tests of their integrations before upgrading Unifi in Production.
We also highly recommend aligning your Unifi upgrade with your ServiceNow upgrade. This means you only need to test your integrations one time rather than once for the ServiceNow upgrade and once for the Unifi upgrade.
We really appreciate feedback on what we’re doing - whether it’s right or wrong! We take feedback very seriously, so if you feel you’d give us anything less than a 5 star rating, we’d love to hear from you so we can find out what we need to do to improve!
If you would rate us 5 stars, and haven’t left a review on the ServiceNow Store yet, we’d be grateful if you would head over there to leave us your feedback. It only takes a few minutes and really does help us a lot. Go on, you know you want to leave a 5-star review on the Store!
Unifi 3.0 introduces the Unifi Test Assistant as a way to view, run and analyse automated tests in your instance. Test Assistant helps to remedy the issues of regression testing large numbers of integrations when you come to upgrade your ServiceNow instance or Unifi itself. Tests are created with one click from existing bonds, and can be tested at any time. Exporting tests to your other instances is a breeze as the tests are packaged with your integrations.
The integration documentation automatically generated for every integration has had a major overhaul with a new layout, better navigation, and major performance improvements.
UN-114 Added a Connection Test button to perform a simple high-level test of the connection to the specified endpoint.
UN-187 Automate creation of the Business Rule on any table that has a Unifi message assigned to it.
UN-783 The table field is now cleared when copying Receipt or Response messages.
UN-794 Added Run Conditions Section to Event Actions in Unifi Designer.
UN-796 Added checks for Unifi properties in Diagnostic.
UN-797 Attachment trigger Business Rule is now wrapped by console so it generates Activity Logs.
UN-807 Improved Activity Log performance within loops.
UN-809 Added a system property to control the maximum age of Activity Logs.
UN-810 Modified attachment processing so that it will only be processed when the related record is already bonded.
UN-811 Improvements to Activity Log performance.
UN-812 Inbound REST methods are automatically created when Processes are defined/updated.
UN-813 Warnings during outbound message processing (e.g. "Ignoring messages for integration ...") have been changed to debug messages.
UN-818 Editing a field map now causes the Integration Build button to be shown.
UN-828 Transaction Replay from list view now informs the user if replays have been unsuccessful.
UN-890 Connections can be assigned an instance to prevent production integrations running in sub-production instances when cloned.
UN-908 Add fields and field maps to documentation.
UN-912 Added links to various configuration records in platform so they can be easily opened in Unifi Designer.
UN-915 Show number of active integrations to make it easy for people to know how many licences they have used.
UN-916 Add Response Actions to Unifi documentation generator.
UN-917 Add Poll Processors to Unifi documentation generator.
UN-918 Add Scheduled Jobs to Unifi documentation generator.
UN-924 Update to Transaction and HTTP Request numbers to show decimal suffix on first attempt.
UN-930 Allow message scripts to be toggled on dynamic documentation.
UN-952 Bond ownership can now be controlled from Message fields.
UN-953 Bond state can now be controlled from Message fields.
UN-967 Added information box to Connections page in Unifi Designer to show the inbound endpoints of the integration.
UN-968 Process page now shows all fields in Unifi Designer.
UN-969 Attempt number is now shown on inbound HTTP requests.
UN-972 Related Event Actions are now shown on Messages.
UN-973 Messages are no longer prefixed with 'Copy of' if copying to another Integration.
UN-976 Messages are now ordered by name in Unifi Designer.
UN-977 Inbound transactions now set the source connection on the bond.
UN-978 Field inheritance is now active by default.
UN-988 Authorization header is masked in the headers on HTTP Request.
UN-992 Add support for headers and request in Payload to Stage message scripts.
UN-1000 Enhance Activity Log with success messages.
Other minor portal improvements.
UN-830 Bond history has been removed from the Bond form for performance.
UN-882 Updated the top navigation menu links in portals.
UN-889 Updated the 'Get help' portal page to show new branding and information.
UN-978 Field inheritance is now active by default and the form is locked down if inherit box is checked.
UN-771 Fixed auto-documentation printing layout.
UN-795 Fixed an issue where sending an attachment on new would incorrectly commit a new empty bond when the attachment failed to send.
UN-798 Fixed the Event Actions table field so it shows Unifi tables.
UN-799 Fixed an issue with the Event Action advanced condition also running the standard condition.
UN-800 Fixed an issue where no table being set on an Event Action would cause it to break.
UN-803 Fixed an issue where inactive Event Actions were being processed.
UN-808 Fixed domain value during HTTP Request retry.
UN-816 The Response Action field is now cleared when replaying a Transaction.
UN-820 Modified the attachment business rule condition to prevent an error when 'previous' object does not exist.
UN-822 Fixed the Diagnostic and Self-Test roles so they are accessible by the Unifi admin role.
UN-829 Fixed an issue preventing a Unifi admin user from replaying Transactions.
UN-831 Fixed the inbound processing so that the transaction and bond domain are set correctly from the domain on the request.
UN-833 Added measures to prevent the increased async rule performance from causing blank tickets.
UN-865 Fixed an issue with the domain not being set when replaying a transaction or request.
UN-867 Fixed an issue with a successful Transaction not changing the Integration Status from "Awaiting" to "Up".
UN-868 Prevented a situation where an outbound sync response that updated the process record could cause an update loop.
UN-870 Fixed JellyRunnerLite leaking a global 'result' object.
UN-880 Fixed an issue where copying an integration and clearing the target scope would result in the new integration being in the Unifi scope.
UN-909 Fixed an issue where the HTTP Request retry count failed when it gets to number 10.
UN-910 Fixed an issue where the HTTP Request attempt number was not updated when Replay is used.
UN-914 Introduced auto-refresh to the Operations portal to prevent the connection from being dropped.
UN-946 Fixed an issue with the Resume integration worker popup showing the wrong number of transactions.
UN-962 Fixed a bug where copying a Message between Integrations showed 'Build Integration' in the originating Integration and not the target Integration.
UN-971 Fixed an issue where copying a Message to another integration resulted in fields in a different integration.
UN-987 Fixed issue with form choice fields becoming unresponsive in portal forms.
UN-993 Fixed Resume and Repair worker counters being wrong when processing more than 100 records.
Release date: 12 September, 2022
Upgrade Notice
Please note that, as with every release, there may be some changes that are not entirely compatible with your existing integrations. While we do everything we can to make sure you won’t have to fix your integrations when upgrading, we strongly encourage all our customers to perform full end-to-end tests of their integrations before upgrading Unifi in Production.
We also highly recommend aligning your Unifi upgrade with your ServiceNow upgrade. This means you only need to test your integrations one time rather than once for the ServiceNow upgrade and once for the Unifi upgrade.
Feedback and Reviews
We really appreciate feedback on what we’re doing - whether it’s right or wrong! We take feedback very seriously, so if you feel you’d give us anything less than a 5 star rating, we’d love to hear from you so we can find out what we need to do to improve!
If you would rate us 5 stars, and haven’t left a review on the ServiceNow Store yet, we’d be grateful if you would head over there to leave us your feedback. It only takes a few minutes and really does help us a lot. Go on, you know you want to leave a 5-star review on the Store!
We are extremely proud to introduce official support for mass data import and export with a brand new feature called Datasets.
Datasets allow you to easily configure imports and exports of large amounts of data on a schedule, with transformation being achieved through the standard use of Unifi Fields and Field Maps, and transportation via automatically generated attachments sent within Unifi Messages. Data is automatically chunked and filtered for delta updates, with full sync being available at any time with one button click. Datasets have been built so you still get the benefit of Unifi operational data but with huge efficiency gains for large amounts of data.
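The delta-filter-and-chunk idea described above can be sketched in plain JavaScript. This is an illustration only; the function and field names (buildDeltaChunks, updated, lastExport) are assumptions, not the Dataset implementation.

```javascript
// Sketch: select only rows changed since the last export (delta update),
// then split them into fixed-size chunks for transport.
function buildDeltaChunks(rows, lastExport, chunkSize) {
  // Delta update: keep only rows updated after the last export time.
  const delta = rows.filter(r => r.updated > lastExport);
  const chunks = [];
  for (let i = 0; i < delta.length; i += chunkSize) {
    chunks.push(delta.slice(i, i + chunkSize));
  }
  return chunks;
}

// A "full sync" is the same operation with an empty lastExport,
// so every row is included.
```

A real Dataset would stream each chunk as an attachment on a Unifi Message rather than holding everything in memory.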
Diagnose and manage your integrations with the new Integration Dashboard, containing stats, diagnostics, and access to common integration development actions.
We're continuing to refine the portal interfaces and this update includes a significant performance boost to the Designer experience, along with several developments and updates to make integrations easier to manage and maintain.
The Snapshot table was introduced in a previous release as both a mechanism to support automated testing and a precursor to a more efficient method of processing outbound messages which we intend to move towards in a future release. Unfortunately, no cleanup job was included and this table has been known to become very large.
While this update includes a cleanup job to rectify this issue, we strongly recommend the table be emptied before updating to this version since this will be much faster than relying on the job. Note: Once the table is emptied, Unifi will not be able to create tests for existing bonds.
Pre-upgrade requirement
For existing customers using Unifi 3.1, we strongly recommend the Snapshot [x_snd_eb_snapshot] table be emptied before upgrading to version 4.0. An administrator can do this as follows:
1. Change to the Unifi application scope.
2. Navigate to the Snapshot table object (/sys_db_object.do?sys_id=cbe6eb84db856010d78869091396198b).
3. Click "Delete All Records".
Previously, transactional data relating to bonds was removed by an event specifically for that bond that was created when the bond was closed. For customers with long data retention requirements, this could result in thousands of queued cleanup events.
The cleanup logic has now been updated to improve this. A new field called Cleanup date has been added to Bond records. The Cleanup date is set when the bond is closed and specifies the time the transactional data for the bond should be cleaned. A scheduled job triggers an event and script action once per day. Any records which have a cleanup date before the current date are cleaned. (Note: the job is designed to chunk the processing to prevent excessive system resource consumption for systems with high cleanup volumes.)
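The chunked selection described above can be sketched in plain JavaScript. This is an illustrative model only; the names (findDueBonds, CHUNK_SIZE, cleanup_date) are assumptions, not Unifi APIs.

```javascript
// Sketch: a daily job selects bonds whose cleanup date has passed and
// processes them in chunks to avoid excessive resource consumption.
const CHUNK_SIZE = 3; // a real job would use a much larger batch size

function* findDueBonds(bonds, now, chunkSize) {
  // Bonds are due for cleanup when their cleanup date is before "now".
  const due = bonds.filter(b => b.cleanup_date < now);
  for (let i = 0; i < due.length; i += chunkSize) {
    // Yield one chunk at a time rather than the whole result set.
    yield due.slice(i, i + chunkSize);
  }
}
```

Each yielded chunk would then have its transactional data removed before the next chunk is fetched.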
A new diagnostic test has been added to ensure the inbound user specified on a Connection has the x_snd_eb.integration role.
The dashboard in Unifi Designer has been improved and updated with a number of new features including:
New “All Integrations” pane with tabs for switching between processes and integrations
Search bar to allow contextual search for all integrations or just those in the selected process
Reversible sorting for Name and Updated date
A field that has its configuration inherited by child fields now shows a list of those inherited fields.
The list of fields in Designer now shows more columns to make it easier to find and identify fields.
The Scenarios tab within a test in Test Assistant now shows the direction of each scenario, making it easier to see which way the messages are flowing.
Connections have the ability to specify the name of the instance the connection belongs to. This value is now prepopulated with the instance name when the connection is created, taken from the system property instance_name.
Asynchronous integrations use a request-receipt pair of messages, where the receipt is a new message sent by the other system to confirm it has processed the original request. Unifi allows a timeout to be specified so that transactions with requests that do not receive a receipt within a given duration can be flagged for investigation with the Timed Out state. By default, Unifi follows best practice and discards receipts for messages that have timed out; however, this update introduces a new setting on the integration to change this behaviour. Setting Process timed out transactions to true on the integration will allow receipts to be processed regardless of how long they take.
Previously, when an inbound receipt was rejected because the Transaction state was not “Awaiting Receipt”, the response error would just indicate that it is because it has been “processed already”. This is not always the case so the error message has been updated to explain the actual reason which should be more helpful in testing and operational investigations.
Response errors for inbound requests that are generated by Unifi have been updated with a [Unifi] prefix so that users can more easily determine whether the error is a platform issue (such as an invalid user/password) or a Unifi issue (such as no connection being found).
The Unifi Data Preservers have been updated so the Include in system clone field is now true, allowing system clones to keep existing data on the target instance.
Pollers generally need to know the last successful request time so future queries can filter the data efficiently. A new Data Store called $last_completed, containing an ISO date-time, is now automatically stored on Pollers and is set when a Poll Request state changes to Complete. The Poller object has a new method getLastCompleted() which returns a GlideDateTime object of the $last_completed value, or a new GlideDateTime object if any validation of the date-time fails.
A new dashboard has been added to Integrations in Designer to make it easier to manage and diagnose the integration. The dashboard is now the default view of the integration, with the integration settings being moved to a new "Integration" menu option.
Updated copy logic so "Copy of" is only prepended when the name and integration have not been changed by the user or the copy process.
Minor label update to show that ${code} and {code} are valid in the Message path field for running inline code.
Executing a Poller in Designer will now open the new Poll Request.
Unifi has always used POST as the default REST method when no method has been specified on a Message by the user. This value is now set by default for REST integrations so it is explicitly defined and easy to see.
A new cascading job has been introduced to cleanup Snapshot records.
ServiceNow have deprecated async business rules with the introduction of new async processing. All Unifi async rules have been updated to use the new version.
Best practice in Unifi is to turn off unnecessary logging and payload storage in Production since these are generally only necessary in development. A new diagnostic check has been added to find active pollers belonging to active integrations in a production instance which have the Attach response payload flag set to true. The recommendation is that this flag is disabled in Production instances.
Unifi now stores 30 recent items that have been viewed in Designer for each user so they can be accessed easily. These are accessed through a new icon and menu system which has been added on the top-right of Designer.
Unifi Diagnostic has been fully updated with new diagnostic tests, updated messages, and the ability for many issues to be fixed at the click of a button.
Forms now render much faster in Designer, in many cases almost immediately, with fewer round trips to the server and less work for the server to do.
Introduced the new Dataset feature to allow mass data import and export within Unifi.
A new Message type called Dataset has been added to support Dataset processing. Dataset messages are not processed by Unifi trigger rules.
Datasets are packaged with integrations for export.
Payload to Stage and Stage to Request scripts now have access to a new variable called query. This is a key-value object that represents the search query parameters in the URL and operates in the same way as the headers object. It can be used to read query parameter values inbound and write them outbound. When writing outbound in the Stage to Request script, the query object is compiled and automatically added to the search query of the URL.
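The outbound compile step described above amounts to turning a key-value object into a URL search query. The sketch below illustrates the idea in plain JavaScript; compileQuery is a hypothetical helper, not a Unifi API.

```javascript
// Sketch: compile a key-value query object into the URL search query,
// encoding keys and values so special characters are safe.
function compileQuery(url, query) {
  const parts = Object.keys(query).map(
    k => encodeURIComponent(k) + "=" + encodeURIComponent(query[k])
  );
  // An empty query object leaves the URL untouched.
  return parts.length ? url + "?" + parts.join("&") : url;
}
```

In a Stage to Request script this would happen automatically after the script runs; you would simply set `query.sysparm_limit = "10"` and let Unifi do the rest.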
It is now much easier to select one script when editing message scripts in Designer. The menu bar has been updated, with a new "Only" option being added to Scripts shown in the "Change view" menu.
Integration is now always shown in Response Action and Event Action lists, making it easier to distinguish between integration specific and Unifi specific records.
A Bond is now set to an Error state when the current Transaction gets set to the Timed Out state, reflecting the fact that no more Transactions can be processed until the current one has been addressed.
Introduced a new rule to prevent more than one Heartbeat message being active at one time.
Fixed a bug that occurred when copying a field map in the Designer portal. The request would hang and the system would be blocked; the only way to resume was to manually call cancel_my_transaction.do. This was caused by a scope issue with Field Maps, Response Actions and Event Actions; their scope is now set automatically to match the integration scope.
The embedded list of fields on a message in Designer now shows all integration level fields, not just those that specify a table.
Creating a new record in Designer using a modal will now always have default values reset.
Added caching support for new records so the user does not need to wait for a full form load from the server in order to create a new record.
Fixed a bug where fields were not being checked for readonly.
Field name changes are now inherited by all child fields.
If a field has a map to the payload containing a reserved word, e.g. class, then the script will fail silently and no error will show in the Activity Log. Since ServiceNow suppresses script issues wherever possible and simply stops working when it cannot, a new diagnostic test has been added to ensure that message scripts pass JavaScript validation.
Fixed a bug with Activity Log rendering so it shows the user that created the Activity Log instead of the user viewing it.
Fixed notifications and other display issues introduced with form caching by adding manual support for display business rules.
Fixed an issue when creating a new Process where the corresponding REST service details were not automatically populated, meaning they would not be packaged.
In newer versions of ServiceNow, several customers have reported that they very rarely see an issue with an empty bond which ends up stopping transactions from flowing. This was not an issue until the introduction of the new async business rule processing, which executes rules almost immediately; the inherent latency of the previous event processing meant the issue was entirely hidden and therefore not a problem. New logic has been added which checks for the bond before the transaction is allowed to move into the processing phase.
Previously, when a request was replayed, ServiceNow executed it as the user who inserted the new request. This is fixed by using user impersonation to execute the request as it was originally, and is supported by the latest version of the Unifi Global Utility.
Systems using bidirectional asynchronous integrations between two ServiceNow instances where both instances are using OAuth 2.0 can experience connection problems with receipt processing. This is fixed with the use of user impersonation when sending the receipt and is supported by the latest version of Unifi Global Utility.
Removed the redundant log section (e.g. The integration is "TXN0000098040.01".) as the integration and transaction number are already mentioned.
Resuming the integration now sets the status to "Awaiting". Once a transaction completes, the integration is set to "Up".
Since the deprecation of the integration attach logs debug feature, successful poll requests no longer have responses stored as attachments. A new field called Attach response payload has been added to the Poller and this is what now controls the payloads.
Fixed an issue where records in a hierarchy (e.g. CMDB) that have been updated on the root table would create bonds for the root table record instead of using the existing bond on the extended table record.
Fixed navigation in Designer so that the browser history is better maintained.
Fixed an issue where strings were being compared to numbers which would prevent the triage condition from working.
In a previous release, Unifi introduced auth header masking on the HTTP Request. While intended only for inbound requests, it was a blanket mask since there should never be any auth headers stored. However, we have since found out that there are some custom auth solutions which manually add the auth header which would then be masked by Unifi and break the integration. This release fixes this issue so custom outbound auth headers are allowed.
When an error is thrown in an inbound Payload to Stage script for a synchronous message, the system still attempts to identify the bond reference and the HTTP Request is marked as OK even though an error has occurred. With this update, the original logic is used by default (since fixing this would be a breaking change for all customers using synchronous messaging), but the new logic that exits when an error occurs can be enabled by setting request.$abort_process_with_request_error = true; in the Payload to Stage message script. This will cause the request to exit and prevent further inbound sync logic from running.
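The opt-in behaviour can be modelled with a small simulation. Only the $abort_process_with_request_error flag name comes from the release note; the processInbound function and request shape below are illustrative assumptions.

```javascript
// Simulation: run a Payload to Stage script; if it throws and the abort
// flag was set on the request, stop processing with an error status.
function processInbound(request, payloadToStage) {
  try {
    payloadToStage(request);
  } catch (e) {
    request.error = String(e);
    if (request.$abort_process_with_request_error) {
      // New opt-in behaviour: exit and mark the request as errored.
      return { status: "error" };
    }
  }
  // Original behaviour: continue and mark the request OK despite the error.
  return { status: "ok" };
}
```

The default path preserves backwards compatibility; only scripts that explicitly set the flag change the outcome.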
Fixed attachment errors so that options.type is set to "error", making them more easily recognisable by the Integration feedback script.
Added a check to prevent this issue from occurring.
Some instances of ServiceNow handle UI Script URLs differently, with inline forward slashes working most of the time, but not all of the time. This meant the UI Script x_snd_eb.keymap/sublime.min.js might not load, which would prevent scripts from loading. The issue has been fixed by changing the slash to a dot: x_snd_eb.keymap.sublime.min.js.
Fixed an issue with internal configuration not being updated so the custom stage table was not known.
ServiceNow have introduced a new URL flag to Service Portal which causes the URL to be rewritten after it's been requested, meaning it essentially does a double load. The issue has been fixed by manually adding the flag before ServiceNow does.
Fixed by checking for the second parameter before using it.
Unifi Designer displays the name of the instance in the header bar, but this looks terrible for instances that have HTML in the instance name. HTML is now stripped from the instance name in Designer so it renders nicely.
Fixed the company table reference so that the links work properly in the operations dashboard.
Fixed a bug with field level ACLs so they render correctly in Designer. The field order cannot be written to when it is an inherited field.
Added a semi-colon to the checkpoint definition in automatically generated Message Scripts. The script is updated when it is next built.
Fixed Designer so full screen mode in script editors now renders nicely.
Added support for GlideTime fields in Designer which are not natively supported in Service Portal. This allows Pollers and Datasets to use proper scheduling.
Fixed an issue in Designer where target link generation would still occur even if there was no sys_id on the reference.
Fixed invalid CSS in the Activity Log renderer.
Added manual display rule support so that forms can be rendered quickly using caching but new values are still populated as expected.
Fields now use the Message table if related to a Message, otherwise they will default to the Process table.
Unifi is an integration platform built specifically to support complex ticket exchange integrations to and from your ServiceNow instance. From no-code to pro-code, Unifi gives you the flexibility, insight, and control to create exceptional integration experiences for you, your partners and your customers.
Unifi is a ServiceNow application that is delivered and managed through the ServiceNow Store. You can find out more information and request a demo through our website.
Follow these steps to install or upgrade Unifi.
Unifi is exclusively available through the ServiceNow Store with a limited trial or paid subscription.
To install Unifi on your instance, you will need to ensure it has been given entitlement to the application. You must do this through the ServiceNow Store.
Once you have set up entitlement, you can use the Application Manager in your ServiceNow instance to install and upgrade Unifi. You can find more information on how to Install a ServiceNow Store Application in the ServiceNow Product Documentation.
When upgrading, make sure you also install the latest versions of Global Utility and Hotfix.
Unifi has some powerful features that will only work with access to the global scope. Access is granted through the Unifi Global Utility.
Full details can be found on the Global Utility page.
Unifi has the ability to be patched between releases by using a special Script Include called hotfix.
Full details can be found on the Hotfixes page.
You're good to start building your first integration or import one you already have.
If you’re new to Unifi then you might like to follow one of our integration guides. You can access the Integration Guides from the menu.
Release date: 17 November, 2021
Please note that, as with every release, there may be some changes that are not entirely compatible with your existing integrations. While we do everything we can to make sure you won’t have to fix your integrations when upgrading, we strongly encourage all our customers to perform full end-to-end tests of their integrations before upgrading Unifi in Production.
We also highly recommend aligning your Unifi upgrade with your ServiceNow upgrade. This means you only need to test your integrations one time rather than once for the ServiceNow upgrade and once for the Unifi upgrade.
We really appreciate feedback on what we’re doing - whether it’s right or wrong! We take feedback very seriously, so if you feel you’d give us anything less than a 5 star rating, we’d love to hear from you so we can find out what we need to do to improve!
If you would rate us 5 stars, and haven’t left a review on the ServiceNow Store yet, we’d be grateful if you would head over there to leave us your feedback. It only takes a few minutes and really does help us a lot. Go on, you know you want to leave a 5-star review on the Store!
These fields were previously showing millisecond values. They now show system date-times which will make them easier to read and sort.
Fixed an issue where multiple bonds could be created. When running multiple updates in the same thread, e.g. a GlideRecord loop, it was possible for the target record to be changed, which could result in new bonds being created unnecessarily. This was due to a new feature of GlideScopedEvaluator which replaces the object in memory, thus breaking internal Unifi script references.
Due to native Service Portal modifications, script fields in Designer would no longer appear locked even though they were read-only. This fix addresses the styling.
The latest performance improvements to Unifi Designer introduced a caching bug for system fields which is now fixed.
The Execute Now button on Pollers has been fixed so that Pollers are executed as the specified user and not the user who is logged in.
We've seen some issues caused by attachment records being updated multiple times in rapid succession (thousandths of a second apart) by the system user, which occasionally results in multiple Bonded Attachment records being created for the same attachment. We suspect that this is due to ServiceNow Antivirus scanning and updating the attachment record. To account for this, we've split attachment sending into two business rules: one for inserts and one for updates.
Fixed an issue with the licence report not identifying if the Unifi global utility was installed.
Release date: 20 September, 2021
Heartbeat messages can be set up to automatically monitor and manage the integration. When a heartbeat message fails, the integration status will be marked as "Down" and all outbound messages will be queued. As soon as another heartbeat is successful, the integration status is marked as "Up" and the integration is automatically resumed.
Unifi Integration Designer benefits from a significant performance upgrade with lists and forms loading up to 10x faster than previous versions.
Users can now create Messages with type heartbeat which will be automatically sent periodically to the target endpoint and help to prevent failed outbound transactions.
Response Actions now benefit from a scripted condition field so complex logic can be used to determine whether or not a Response Action should be used during outbound response processing.
A new Unifi Response Action has been added to support the logic for heartbeat messages pausing and resuming the integration.
A scheduled job is automatically created and managed based on the presence and state of a heartbeat message within an integration and the integration itself.
Connection records now benefit from a UI Action/button that can be used to generate OAuth tokens. The button only appears when the connection authentication is set to OAuth 2.0.
Response Actions are now able to handle group codes rather than just one code per action. For example, setting 5xx will match all 500-range response codes.
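As an illustration of how group-code matching behaves (this is a sketch of the rule described above, not Unifi's actual implementation), each "x" in the pattern can be thought of as a single-digit wildcard:

```javascript
// Illustrative sketch: match an HTTP status code against a Response
// Action code pattern such as "200" or "5xx". Not Unifi's real code.
function matchesResponseCode(pattern, statusCode) {
  var code = String(statusCode);
  var p = String(pattern).toLowerCase();
  if (p.length !== code.length) return false;
  for (var i = 0; i < p.length; i++) {
    // "x" acts as a wildcard for a single digit
    if (p[i] !== 'x' && p[i] !== code[i]) return false;
  }
  return true;
}
```

Under this rule, "5xx" matches 500, 503 and 599, while "200" matches only an exact 200 response.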
We've made it easier to find what version of Unifi you are using and included the number of active integrations in the About modal, available from the support menu.
Fields available to add to a Message are now filtered by the Message table in Designer, meaning you won't see fields that aren't relevant anymore.
We've added support for heartbeat messages to be run both synchronously and asynchronously, meaning they work the same way as any other message in Unifi.
Heartbeat transactions are automatically cleaned up every hour by the "[Unifi] Cleanup old heartbeat transactions" Scheduled Script. Two properties have been added to control the retention for complete vs failed heartbeat transactions.
Two identifying headers are automatically added to every heartbeat message that is sent: the identifying message header X-SND-EB-Message-Name and the heartbeat indicator X-SND-EB-Heartbeat.
As above. Heartbeat messages are sent only when the integration is active - this includes sending when the integration is paused.
Heartbeat transactions are not queued in the same way other transactions are queued in Unifi.
We've added some measures to help prevent integrations and other content from being created in the Unifi scope. This is disabled by default in 3.1.2.
Response Actions can now make use of the headers which were provided with the response to an outbound request.
Failed heartbeat messages are ignored when repairing an integration.
To support heartbeat functionality, we allow transactions to be created and queued for sending when the integration is flagged as down, not just when it is manually paused by an administrator.
Customer global portal announcements will no longer be shown in the Unifi portal.
This is a performance improvement to the portal to prevent "scope changed" UI messages from being shown in the native platform view.
Numerous performance improvements made to the portal, including optimising widget dependencies, preventing global UI scripts from loading with every widget request, and reducing the use of $sp.getForm().
Addresses a bug with ServiceNow that allows all UI Scripts that are not "Desktop" to be downloaded with every widget request.
Fixed a bug in the portal when activating fields on a message, where the button would be enabled, then rapidly disabled and enabled again.
Updated the copy logic to remove field map references from fields that are copied between integrations. The user needs to specify which field map to use from the target integration.
Updated the Connection Test interface so that large response bodies are contained properly.
Fixed a bug with script fields not being editable in Designer when they were the only thing on the page (e.g. Message XML Template).
Fixed a bug with test assistant not properly updating the URL when porting tests between instances.
Fixed a JavaScript TypeError that only showed for Unifi Admins on the Process page in Designer.
Fixed an issue with the about modal logo colour in Designer.
Fixed a JavaScript TypeError that only showed for Unifi Admins on the Test Assistant Dashboard.
Fixed the record counter for Event Actions related list on Message in Designer.
Fixed an issue where creating a new Message from the list view in Designer would always result in a Message that was set to async=true instead of it being controlled by the type of Message being created, e.g. response messages should not be async.
Improved the logic for displaying inbound endpoints on the Connections list in Designer so that similarly named APIs are no longer shown (previously, an API of incident would also show inbound endpoints for an API of unifi_incident).
When the property glide.itil.assign.number.on.insert is set to true, numbers are not generated for Transaction and HTTP Request records on insert, but because we are adding ".01" to the end, the value was being populated as "null.01" and the number was never generated. We now account for this during record initialisation so numbers are generated correctly.
Fixed an issue where errors would be hidden if the payload was null or there was an error in the Reference lookup script on a Message. Errors are now shown correctly in Activity Logs.
When viewing a list of fields in the embedded list on a message in Designer, the Fields counter would show a different number of records than the number of fields actually shown. This was because of a query to show only table-related Fields. Now, all fields related to the table or no table at all are shown.
Removed "use strict" from two portal widgets because of incompatibilities with the Rhino engine and false errors being thrown from prototype based objects.
Creating a new Dataset for exporting and importing data is a simple four-step process.
Create a new Dataset
Configure the Process Message
Configure the Send Message
Set up an export schedule
Datasets require the Global Utility in order for ServiceNow platform components to be created automatically.
Open Integration Designer and select or create an Integration.
From the main integration menu, select Datasets.
Click New.
Fill in the required information and choose your configuration.
Click Submit and view.
Creating a new Dataset will automatically configure several dependencies in your instance. These are:
A Message called Process_<table>
A Message called Send_<table>
A Scheduled Import Set with the same name as the Dataset
A Transform Map
Once the Dataset has been created, data is configured using Messages, Fields and Field Maps.
The Process Message handles the data mapping. Use the Fields list to configure which fields are being imported/exported.
From the Dataset details tab, navigate to the Process message and use the clickthrough button to open it.
From the Message, open the Fields list and add the fields that should be imported/exported. Each field will need to have a Dataset specific Field Map for transforming the data.
Ensure at least one field will Coalesce so the import can match to existing records.
Click Build Message.
Field Maps designed for an eBonding integration may not work for Datasets. Use the included "Dataset..." Field Maps or create your own using these as an example.
The Send Message handles the data to be imported or exported as an attachment.
The Path will need to be configured depending on where the data is being sent. If you are connecting to another ServiceNow instance using a ShareLogic endpoint, you will likely need to configure the Path to use the dataset web service at path "/dataset".
From the Dataset details tab, navigate to the Send message and use the clickthrough button to open it.
Open the Outbound > Settings page.
Configure the Path as required, e.g. "/dataset".
Click Save.
Datasets have the same schedule logic as any other scheduled job and can be configured to export data whenever needed.
Open the Dataset.
Click Scheduling to open the scheduling tab.
Configure the desired schedule.
Click Save.
Turning off the schedule will prevent data from being automatically exported.
Poll Processors are used to set up, execute and process Poll Requests.
The Poll Processor is a configuration record which contains the logic that will be applied when polling a remote system for data. There are three main scripts which are used to set up, execute and process the Poll Requests.
The Setup Script is the first script to run (it runs at the point in time the Poll Request is created). It is used to build the environment for the poll and define what it will do (for example, create/set up the URL that will be called). It also supplies a params object as one of the arguments, which will be passed through to the subsequent scripts (and on to further Pollers, if required).
The Request Script is used to reach into the remote system and execute the request. This is usually done by making a REST call to the URL defined in the Setup Script.
The Response Script is used to process the information returned from the remote system. This could include converting the data from its proprietary format to a more usable format, sending the data to Unifi, or even kicking off a new poll.
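The three scripts can be pictured as a pipeline, with the params object threading through each stage. The sketch below is purely illustrative: the function names, the stubbed request and the example URL are all assumptions, and in Unifi each script lives in its own field on the Poll Processor record, with the Request Script typically making a real REST call.

```javascript
// Purely illustrative sketch of the Poll Processor pipeline.

// Setup Script: build the environment and the params object that is
// passed through to the subsequent scripts.
function setupScript(params) {
  params.url = 'https://example.com/api/incidents?updated_after=' + params.since;
  return params;
}

// Request Script: execute the request against the remote system.
// A real implementation would make an HTTP call to params.url;
// the response is stubbed here so the sketch is self-contained.
function requestScript(params) {
  return { status: 200, body: JSON.stringify([{ number: 'INC0001' }]) };
}

// Response Script: convert the returned data into a usable format,
// e.g. before handing it to Unifi or kicking off a further poll.
function responseScript(params, response) {
  return JSON.parse(response.body).map(function (rec) { return rec.number; });
}

var params = setupScript({ since: '2021-01-01' });
var numbers = responseScript(params, requestScript(params));
```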
Global system settings for Unifi.
The Properties module is where you will find the global system settings for the Unifi application.
Control all integrations using this master switch.
Allow users with this role to read bond records for informational purposes. Intended for giving access using related bond lists on the process tables. It does not expose the application menu. For greater control use the x_snd_eb.user role.
Choose the required logging level for the application: 'debug' writes all logs to the database, 'info' writes informational, warning and error logs to the database.
Debug mode is extremely resource intensive. Only choose 'debug' for troubleshooting when session debugging is not adequate.
Use for general integration debugging to capture debug messages in the console in addition to info, warn and error.
Enable input/output trace logs for functions that have been wrapped for tracing.
Requires Debug mode to be enabled.
Enable input/output trace logs for core application methods. This produces extremely verbose logs from all core scripts which will help ShareLogic Ltd if you ever run into trouble with the app.
Not required for general integration debugging.
The choices are:
Off: No debug mode is active. Performance is not affected.
Silent: Enhanced exception handling and error logs. Performance is slightly affected.
Trace: Functional profiling in addition to enhanced exception handling and error logs. Performance is affected. Requires Trace mode to be enabled.
Turns on Activity Logs for Service Portal interfaces.
How many days should Activity Logs be kept before being removed?
Default: 7.
Configure which columns are always shown on lists in the Unifi Integration Designer portal. Multiple columns should be separated by a comma.
How many days should orphaned transactions be kept before they are removed?
Default: 7.
What is the maximum number of orphaned transactions to remove at any one time?
Default: 1000.
How many hours should successful heartbeat transactions be kept for before they are removed?
Default: 24
How many days should failed heartbeat transactions be kept for before they are removed?
Default: 90
Enables logs and payloads generated by Unifi transaction processing to be attached to transactional records.
Logging attachments this way is now deprecated and has been replaced by the Unifi portal and Activity Log.
Control whether to allow files to be created in the Unifi scope on Unifi tables (prefix x_snd_eb_).
Default: false.
The Integration is where most configuration and settings are stored.
An Integration defines the connection between a Process and the single system it’s connecting with. It is a record that contains all of the properties and configurations for that unique connection.
Multiple Integrations can exist for one Process and each unique Integration will define the way in which the Process connects with that particular system. (Unifi defines an integration as the connection of two systems to transfer or exchange data for one process. This is represented by one integration record in Unifi.)
Incident Process - JIRA = one Integration
Incident Process - ATOS = one Integration
Incident Process - SAP = one Integration
Incident Process total = three Integrations
Unifi will automatically create a Trigger (Business Rule) for the Process being integrated (if one doesn't already exist) when you run 'Build' either on the Integration or Message once your Create Message is configured.
For step-by-step instructions on how to configure Integrations (and other Integration components) see the Integration Guides section of the documentation.
The Details fields to be configured for the Integration record are as follows:
Name
String
The name of the integration.
Description
String
A description of what this Integration is for and what it does.
Service type
String
The type of web service this integration is using (SOAP/REST).
Message format
String
Automatically pre-process incoming messages for simpler message scripting.
Application
Reference
Application containing this record.
Process
Reference
The process this integration belongs to.
Company
Reference
The company this integration belongs to.
Message format choices: XML, JSON, Advanced
The choice selected here will determine the object that's available to the 'Identify message script'.
Company
This is usually the name of the service provider being connected to, as opposed to the name of the manufacturer of the software.
Messages are central to the functionality of Unifi. Upon receipt of an inbound request, Unifi will be able to identify the Message, know how to process it and subsequently what actions to perform based on the Message configurations. For that reason, it is very important that each Message within an integration be unique (more on that in the Messages section).
The Message Identification fields to be configured for the Integration record are as follows:
Identify message script
Script plain
The script used to extract the unique message identifier from the incoming request payload. It identifies which message record is used to process the request.
Unifi automatically looks for a header called X-SND-EB-Message-Name to use as the message name. If the header is found then the Identify Message Script is not executed.
Both the payload and headers are passed into the script. The following example will parse the XML payload and identify the message name from the first child element of the body node:
The following is an example which returns the message name from a JSON payload:
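A minimal sketch of such a script is shown below. The payload shape (message.name) and the function wrapper are assumptions for illustration; in Unifi the script body receives payload and headers directly and should return the message name:

```javascript
// Sketch of an Identify message script for a JSON integration.
// Assumes the message name is carried at message.name in the body --
// adjust the lookup to match your own payload structure.
function identifyMessage(payload, headers) {
  var data = JSON.parse(payload);
  return data.message && data.message.name;
}
```

For example, a payload of {"message":{"name":"CreateIncident"}} would identify the CreateIncident message.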
The Attachment Settings fields to be configured for the Integration record are as follows:
Send existing attachments
Boolean
Set to true to send attachments which were added to the record before it was bonded.
Allowed content types
String
Comma separated list of attachment content types that are allowed to be sent to this integration.
Max attachments per bond
Integer
The maximum number of attachments allowed to be sent per bond.
Max attachments size per message
Decimal
The maximum size of all the attachments in a single message in MB.
Max attachments size per bond
Decimal
The maximum size of all the attachments in a single bond in MB.
Max attachments
Setting any attachment value to -1 means there is no limit.
Allowed content types
OOTB you can send any content type (i.e. this field is empty). You may wish to limit content types by whitelisting, i.e. explicitly specifying the file types that are allowed (e.g. TXT, PNG).
The Bond Settings fields to be configured for the Integration record are as follows:
Bond cleanup
Integer
Number of days after bond closure that all its associated Transactions, HTTP Requests and Bond Attachment records are deleted.
Bond cleanup
Although the associated transaction and attachment record data is deleted, Bond history will still be available. Setting the Bond cleanup value to 0 means the cleanup is never performed.
The Feedback Settings fields to be configured for the Integration record are as follows:
Enable UI messages
Boolean
Allow information and error messages to be shown to the user as UI Notifications. Only applies to certain notifications.
Note bond history
Boolean
Use the ‘Note bond history’ to process bond history updates.
Note attachment history
Boolean
Use the ‘Note attachment history’ to process attachment updates.
Add note script
Script plain
Script for adding integration updates to the target record. There is no need to call update() on the target.
Note bond history/Note attachment history
When the Note bond history/Note attachment history checkbox is checked, the history will be promoted to the work notes fields of the record being integrated for the user to view.
The General Error Handling fields to be configured for the Integration record are as follows:
Sync error message
Reference
The message to use when an inbound message cannot be processed synchronously.
Async error message
Reference
The message to use when an inbound message cannot be processed asynchronously.
Sync error message/Async error message
In the case of a catastrophic failure (e.g. the inability to identify, access, or read an inbound request, or the inability to process the request asynchronously), this will be the message that is sent in response (which can be standard or customised to suit).
The Timeouts fields to be configured for the Integration record are as follows:
Sync timeout
Integer
The amount of time in seconds to wait for a request to be accepted by the external system.
Async timeout
Integer
The amount of time in seconds to wait for an asynchronous receipt.
MID server timeout
Integer
The amount of time in seconds to wait for the MID server to respond (only applies to connections using MID servers).
Sync/Async/Mid server timeout
If there is no response/receipt within the time stipulated, then the request is errored. These errored requests can be rolled up to the bonded record for users to deal with or escalate accordingly. Such insight allows the sender to remain informed of the condition of the request.
The Retry fields to be configured for the Integration record are as follows:
Retry delay
Integer
The amount of time in seconds to wait before retrying a failed outbound request.
Retry limit
Integer
The number of times sending an outbound request is attempted.
There are some fields which are visible in native ServiceNow. These can be viewed by clicking the hamburger menu & selecting 'Open in platform'.
The following non-selectable Integration fields are visible on the platform Integration record (and are controlled by Unifi):
State
String
The current state of the integration. It can be Active, Off, or Paused.
Status
String
The current status of the integration. It can be Up, Down, or Awaiting.
Build date
Glide Date Time
The last time the integration components were generated using the Build action.
Active
Boolean
The integration is enabled when a connection is made active.
Active
This field is not selectable and is controlled by the Connections. There can only be one active Connection at any time. If there is an active Connection, the Integration will be active. If there is no active Connection, the Integration will be inactive.
The Properties fields to be configured for the platform Integration record are as follows:
Run event actions
Boolean
Should this integration run Event Actions for transactional events?
Heartbeat frequency
Integer
The amount of time in seconds to wait between trying an outbound heartbeat request.
Heartbeat message
Reference
The message that should be used to periodically test the integration.
These are used for debugging purposes.
*Attach logs/payloads:
The attached logs were previously controlled by the Attach logs and Attach payloads checkboxes on Integration, but they are now overridden off by the Enable Integration Attachments system property (logging attachments this way is now deprecated and has been replaced by the Unifi Portal and Activity Log). This property acts as the master switch and effectively disables the checkboxes on the Integration.
Note: the checkboxes have been removed from the Integration form but can still be edited from the list view.
NOTE: When copying Integrations, the ServiceNow System Administrator (admin) role is required to see and to select Application Scope.
Field Maps define template code into which details from Field records are substituted. The resulting blocks of code are wrapped and compiled together into their nominated Message Scripts.
A Field Map defines the template code used by a Field in the various Message Scripts. It has a template script field for each of the Message Script types (Source to Stage, Stage to Request, Payload to Stage, Stage to Target). The template scripts are compiled during the Build activity where details of a Field record are substituted into the template, producing a Message Script specific to the Message and Field.
The Field Map defines the behaviour, or ‘type’ of a Field. Below is a table of the Field Maps packaged with Unifi:
Boolean
Map true/false values.
Choice
Perform value mapping between internal & external values.
Date
Map date values.
DateTime
Convert a GlideDateTime to and from an ISO 8601 date string.
Journal field
Transform journal fields into objects added to an array to handle multiple journal entry sending.
Message Header*
Compose or consume the message protocol header.
Message Name
Map message names.
Receipt Status
Send and process the status data sent/received in a receipt. This is the transaction process error.
Reference
Map reference values.
Requested Item
Map Requested Items.
Source ID
Extract and add the unique message identifier for THIS instance, i.e. your transaction ID.
Source Reference
Extract and add the unique record identifier for THIS instance, i.e. your task number, user_id etc.
String
Map String values
Target ID
Extract and add the unique message identifier for THEIR instance, i.e. the external message ID.
Target Reference
Extract and add the unique record identifier for THEIR instance, i.e. the other system's task number, user_id, etc.
Time
Map time values.
Unifi - SOAP Control
Scripts to control SOAP messages.
*Message Header consists of the following payload object:
message : {
    name : The message name
    time_stamp : The time the message was generated in the source system
    source_reference : The reference of the bonded task in the source system
    target_reference : The reference of the bonded task in the target system (not present in Create messages)
    source_id : The transactional identifier of this message
    target_id : The transactional identifier of the original request message (only present in Receipt messages)
}
As stated above, Field Choice is strictly a ‘type’ of Field defined by a Field Map. However, it is worth going into a little more detail here. A Field Choice record represents the mapping of an internal value to/from an external value and a set of Field Choice records can be defined for an Integration level Field.
Internal values are used and stored by the source and target records.
External values are carried in the payload and exchanged between integrated systems.
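As an illustration, a hypothetical Field Choice set for an incident state Field might map values like this (the values below are invented for the example; Unifi performs the substitution as the payload crosses the boundary):

```javascript
// Hypothetical Field Choice set for an incident state Field:
// internal values are what ServiceNow stores, external values are
// what the payload carries to/from the other system.
var fieldChoices = [
  { internal: '1', external: 'New' },
  { internal: '2', external: 'In Progress' },
  { internal: '6', external: 'Resolved' }
];

// Outbound: internal -> external.
function toExternal(value) {
  var match = fieldChoices.filter(function (c) { return c.internal === value; })[0];
  return match ? match.external : value;
}

// Inbound: external -> internal.
function toInternal(value) {
  var match = fieldChoices.filter(function (c) { return c.external === value; })[0];
  return match ? match.internal : value;
}
```

Unmapped values simply pass through unchanged in this sketch; how unmapped values should be handled is a design decision for each integration.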
If, for example, you wanted to map the Description field as part of the CreateIncident Message, you would only need to configure the incident.description Field (see previous page) and allocate it the String ‘type’ Field Map. There would be no need to configure the Field Map record. This would be the case in the majority of instances, using the shipped Field Maps for standard integrations.
If, however, you had a custom requirement to manipulate or transform data in some unique way, then a custom Field Map can be created.
Best practice is to only use Field Maps that belong to an Integration. You can do this by copying the OOTB Field Maps which are shipped with Unifi and labelling them as belonging to a particular Integration. This should be done for each Integration. That way any future updates which may be made to any of the OOTB Field Maps will not impact any of your Integrations and each Integration will comprise its own stack of self-contained records.
In this example we will copy the OOTB String Field and rename it to be used for the Incident Guide Integration.
To copy the String Field Map, navigate to the Field Maps page.
Click on the ellipsis to the right of the String Field Map & then click Copy.
On the Copy Field Map modal, update the Name and Integration as appropriate & click Copy.
*Name: We have chosen to prefix the existing Field Map Name with the initials of our Integration (you are free to choose any appropriate means of identifying/differentiating).
The following is an example of a Source to Stage Message Script template from the String Field Map. The code between the dollar braces, e.g. $[field.element], is executed as JavaScript during the compile phase. The field variable is the Field GlideRecord object.
The following block of code is extracted from the Source to Stage Message Script belonging to a CreateIncident Message, which was generated using the above String Field Map template against the incident.description Field:
To help you understand and give some context as to where these configuration records fit, we thought it would be helpful to layout the phases of an Integration as we see them.
Field records are configured and linked to Messages (and/or Integrations) and Field Maps.
Field Maps can be edited or created as needed.
Field Choice records are added/edited as needed.
The configurations defined by Fields and Field Maps are compiled into executable code and stored in Message Scripts. Existing Message Scripts simply have their code updated. Code outside of the demarcation comments in the Message Scripts is retained.
Messages and Message Scripts (and Field Choices) are interpreted and executed.
Operational records such as Bonds, Stages, Transactions and Requests are generated and maintained.
Message Scripts are where the request processing and data mapping occurs.
There are four different types of script which are used either before or after the Stage. Having these script types in relation to the Stage allows us to find any potential errors/discrepancies in data more easily.
Payload to Stage (Inbound)
Data coming in is going to be translated (moved ‘as is’) from the Request to the Stage.
Source to Stage (Outbound)
Data going out is going to be transformed (moved and changed) from the source to the Stage.
The following example extract from the Source to Stage Message Script shows location data being mapped:
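A sketch of what such a mapping can look like is shown below. The location field and the stub objects are assumptions for illustration; in a real Message Script, source and stage are provided by Unifi:

```javascript
// Stub objects standing in for ServiceNow's source record and stage.
var source = {
  getDisplayValue: function (name) {
    return { location: 'London' }[name];
  }
};
var stage = {};

// Source to Stage: copy the location from the source record onto the stage.
stage.location = source.getDisplayValue('location');
```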
Stage to Target (Inbound)
Data coming in is going to be transformed (moved and changed) from the Stage to the target.
Stage to Request (Outbound)*
Data going out is going to be translated (moved ‘as is’) from the Stage to the Request.
*The Stage to Request script is not always necessary as the translation can also occur in the XML Template.
For a list of the available variables for each of the four Message Scripts, click here.
In Unifi Integration Designer, you have visibility of your message scripts in the one pane; this makes scripting so much more efficient.
Although you are able to script directly in the Message Scripts, much of the scripting can now be done by configuring Fields & Field Maps (see below).
The Message Scripts can also be the place where Bond state & ownership are set manually (without using the control fields on Message).
The following Stage to Target Message Script from the AcknowledgeCreateIncident Message sets the Bond state to Open:
The following extract from the Stage to Target Message Script for a CreateIncident Message shows that ownership of the Bond is with the internal system (target.u_ebond_owner = true;); therefore, Bond ownership cannot be with the external system (bond.setValue('owner', false);). It also shows the source of the Bond being identified as the unique Integration with the external system sending the Message (target.u_ebond_source = transaction.integration;), which can be particularly useful with multi-vendor integrations.
Bond state & ownership can now be configured without scripting by simply setting the values in the following fields:
Set bond owner inbound
Set bond state inbound
Set bond owner outbound
Set bond state outbound
For more details see the Message page.
Much of the scripting (including mapping of fields on the source/target record & other Transaction specific data e.g. Name, Time stamp, Source reference, Target reference) can now be done using Fields & Field Maps (see those pages for further details).
The configurations defined by Fields and Field Maps are compiled into executable code and stored in the Message Scripts (between demarcation comments). Existing Message Scripts simply have their code updated. Code outside of the demarcation comments in the Message Scripts is retained.
Use Connections to manage all your environment details for each integration.
A Connection is a property of an Integration. You must have a Connection set up to allow messages to be sent and received for your Integration.
The Connection stores all the authentication details of the Integration specific to a single environment. You can set up many Connections so you can easily switch between environments as necessary.
The Integration is controlled by the Connections. Making a Connection active will make its Integration active (and vice versa).
Although you can have multiple Connections per Integration, only one Connection can be active for an Integration at a time. Activating a different Connection will deactivate other Connections (for the Integration).
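The activation rule above can be sketched as follows (illustrative only; Unifi enforces this internally, and the connection names below are invented):

```javascript
// Illustrative sketch of the one-active-Connection rule: activating a
// Connection deactivates every other Connection on the same Integration.
function activateConnection(connections, name) {
  connections.forEach(function (c) {
    c.active = (c.name === name);
  });
  // The Integration is active whenever it has an active Connection.
  return connections.some(function (c) { return c.active; });
}

var connections = [
  { name: 'Dev', active: true },
  { name: 'Prod', active: false }
];
var integrationActive = activateConnection(connections, 'Prod');
```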
You can perform a basic connection test which will make a simple request to the endpoint to check if the user is authorized.
A failure is shown when the status is:
400 Bad Request
401 Unauthorized
403 Forbidden
404 Not Found
405 Method Not Allowed
Unifi will assume all other status codes mean the user is authorized even if the request was bad.
This is a basic connection test and the results do not guarantee connectivity.
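The pass/fail rule described above can be expressed as a simple check (a sketch of the documented behaviour, not Unifi's actual code):

```javascript
// The five statuses listed above count as failures; per the rule,
// every other status code is treated as authorized.
var FAILURE_CODES = [400, 401, 403, 404, 405];

function connectionTestPassed(statusCode) {
  return FAILURE_CODES.indexOf(statusCode) === -1;
}
```

Note that under this rule even a 500 response counts as "authorized", which is why the results are only a sanity check and not a guarantee of connectivity.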
To perform the test:
In Unifi Integration Designer, navigate to and open the Connection.
Click 'Connection Test'.
On the Connection Test Modal, click 'Test'.
When complete, the modal will display the results. They will show a maximum of two connection attempts, with the second attempt only running if the first attempt was not successful.
At this time, Unifi is simply showing the HTTP Response as a form of connection “sanity check”.
After viewing the results, click 'Done' to close the modal.
You can assign a Connection to a specific environment to make it easy to see what you are connecting to.
Available environment choices are:
Production
Pre-Production
Test
Development
Sandbox
The Connection is the link between the ServiceNow endpoint that is receiving messages and the Integration that is used to process them. If you want to receive messages from a remote system, you must specify an Inbound user which the remote system will use to authenticate with.
The Inbound endpoints for your Integration are clearly displayed in a widget at the top of the Connections page. Clicking the link to the right of the endpoint URL will open the Scripted REST Resource (in native ServiceNow) in a new window.
The Web Service (REST method) is automatically created when the Process is configured.
In order for a Connection to send messages to remote systems, you must provide an Endpoint URL and the method of authentication.
Connections support several types of authentication: Basic, Mutual and OAuth.
Basic authentication is achieved by providing a username and password.
Mutual Authentication (MAuth) using SSL certificates is available by selecting the Mutual auth checkbox and configuring certificates on a ServiceNow Protocol Profile record.
You can connect using OAuth by configuring a ServiceNow OAuth Entity Profile.
If the system you are connecting to is behind a firewall (such as an internal service) you can specify a MID Server for the integration to communicate over.
The following table is a summary of the fields to be configured for the Connection record:
*Instance name:
If this value matches the property "instance_name" then requests will be sent, otherwise the HTTP Request will be errored with an explanation of why. Leave this value blank for legacy connection processing, i.e. always send when the connection is active.
**OAuth profile:
This field is visible when ‘OAuth’ has been selected as the choice from the Authentication field.
***Protocol profile:
This field is visible when Mutual auth is checked/set to ‘true’.
****Active:
Visible on the native ServiceNow platform record. Set to true by clicking the 'Enable' button. Set to false by clicking 'Disable'.
A Connection Variable is simply a key-value pair. If you have multiple connection environments (e.g. Dev, Test, Prod) - each containing different data, or information that needs to be passed between environments but changes - then use Connection Variables to provide a consistent entry point in the code.
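As a plain-JavaScript illustration of the idea (the variable names and values here are made up, and the actual lookup is handled by Unifi), a Connection Variable gives your scripts one stable key whose value changes per environment:

```javascript
// Hypothetical per-connection variable sets; in Unifi these would live on
// each Connection record and be exposed to scripts as key-value pairs.
var devVariables  = { assignment_group: 'Dev Integrations' };
var prodVariables = { assignment_group: 'Service Desk' };

// Script code references one stable key regardless of environment,
// so switching the active Connection switches the data automatically.
function getAssignmentGroup(variables) {
  return variables.assignment_group;
}
```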
They can be used to define static data as variables that can change between endpoints. The following example is from the Outbound Settings for a Message:
They could also be used instead of scripting data values directly into the code of Poll Processor scripts. See the following Guides for examples:
Inbound Datasets use ServiceNow Import Sets giving you the ability to view dataset imports as you would any other data import.
For detailed Activity Logs, navigate to the Transform page on a Dataset and enable the Import logging option.
Datasets with Import logging enabled will generate one Activity Log for each Import Set Row.
Dataset Requests are a record of when all Dataset imports and exports occurred. They are linked directly to Transactions that belong to a Bond which is exclusive to the Dataset. The records are automatically deleted when the corresponding Transaction is deleted.
The number of Transactions to keep during the daily cleanup can be managed using the Cleanup option configured on each Dataset; this value specifies how many transactions should remain after the cleanup process runs. The default retention is 10 records.
Any Dataset Requests which are somehow orphaned, meaning they have no Dataset or no Transaction specified, will be removed based on the Orphan Record Cleanup Age (x_snd_eb.orphan_record.cleanup.age) system property. The default orphan cleanup age is 7 days.
When a Dataset export runs, the system will automatically export the records in small batches depending on its configuration.
Batch size options are:
Max size: set the maximum file size to export. There is an internal maximum size of 5MB.
Max rows: limit batches to a specific number of records.
Max time: prevent long running exports by setting a maximum export time.
The final export size is determined by whichever limit is reached first. If more records are available to be processed, additional Dataset Requests will be created until all records have been exported.
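The "whichever limit is reached first" rule can be sketched as follows (illustrative only; the names and shapes are assumptions, not Unifi's internals):

```javascript
// Decide whether the current batch should be closed, given the three
// configurable limits described above. A batch closes as soon as any
// one limit is reached; remaining records go into further Dataset Requests.
function shouldCloseBatch(batch, limits) {
  var FIVE_MB = 5 * 1024 * 1024; // internal hard cap on export size
  return (
    batch.bytes >= Math.min(limits.maxSize, FIVE_MB) ||
    batch.rows >= limits.maxRows ||
    batch.elapsedMs >= limits.maxTimeMs
  );
}
```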
New Field Maps can be configured for handling specific types of data in new ways. We recommend using the "Dataset" prefix when creating your new field map so it is easy to identify.
Dataset specific Field Maps are slightly different to eBonding integration field maps. The main difference is that the Stage to Target script is executed as a ServiceNow Transform Map script. This means that certain objects are not available, including transaction, bond, and request.
These Field Maps should reference the Import Set Row fields using the standard $stage object. Field name conversion from the Import Set Row source object to the Field Map $stage object is handled automatically, i.e. "u_" prefixes are removed.
The target object which represents the record being created or updated can be used normally.
The log object is updated to reference ws_console, meaning all logs will be written to the Activity Log generated for each imported record. If required, logs can continue to be written to the Import Set Row using the source object.
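The automatic field name conversion can be pictured like this (a sketch of the behaviour described above, not Unifi's actual code):

```javascript
// Convert an Import Set Row field name (source) to a $stage key by
// removing the "u_" prefix that ServiceNow adds to custom import columns.
function toStageFieldName(sourceFieldName) {
  return sourceFieldName.indexOf('u_') === 0
    ? sourceFieldName.substring(2)
    : sourceFieldName;
}
```

So an Import Set Row column u_short_description would be read from $stage.short_description in the Field Map.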
By default, Datasets are automatically configured for both import and export. If you require a one-way data export without import, you can prevent inbound messages from being processed.
From Integration Designer, navigate to the Dataset in your integration.
Find the Send message and click the clickthrough button to open it.
Change the Direction from Bidirectional to Outbound.
Click Save.
Datasets are designed to work with one table at a time. If you need to import/export more than one table, setup a Dataset for each table.
Externally generated files can be processed by a Dataset. Files should match the file type the Dataset expects, e.g. CSV or JSON, and be streamed to the Dataset import endpoint. The maximum file size will depend on your instance configuration.
POST
https://<instance>.service-now.com/api/x_snd_eb/unifi/<api_name>
Send a file to be processed by a Dataset.
Example: https://acme.service-now.com/api/x_snd_eb/unifi/incident/dataset?file_name=cmdb_ci.csv&reference=Sync%20Server
It is possible to add support for ServiceNow IRE to your Dataset imports. This will push data through the reconciliation engine rather than directly updating the target records from the Import Set.
Follow these steps to configure your Dataset for use with IRE.
Create a new Data Source.
Create a Field Map for setting up the import for IRE.
Create a Field Map to give the data to IRE.
Add header and footer fields to the Dataset.
Modify Field Maps being used by the Dataset.
Remember to Build the Message/Integration when you have finished configuring the Fields.
Attachments in ServiceNow are normally attached to the records they belong to, making it easy to identify and share those attachments over an integration. Embedded attachments require a bit more effort to integrate since they can belong to other records, or in some cases, no record at all. They are typically found inside journal fields.
The use case being looked at in this example is embedding attachments in the Additional Comments field using HTML inside [code] tags so lengthy comments such as operational checks can be shared with third parties directly from the instance.
To recreate this scenario:
Open a record with a journal field, e.g. an Incident.
From the More options selector, choose Email. The Compose Email window will open.
Copy/paste or drag an image into the email composer.
(In Windows, you can use Windows Key + Shift + S to take a screenshot and then Ctrl + V to paste it into the formatter).
Click the Source code button to view the HTML source code.
Select all the source code (Ctrl + A) and copy it (Ctrl + C).
Back in your Incident, go to a journal field, e.g. Additional Comments and paste in the HTML.
Wrap the comment in [code]...[/code] tags to tell ServiceNow to treat this comment as code.
If the record and field are integrated, the field will be shared over the integration, but the other system will not have the attachment so the image will not show.
Note: [code] tags will only be rendered if the glide.ui.security.allow_codetag system property is enabled.
We can allow the integration to share embedded attachments with some modification to the outbound mapping scripts.
In this example, we are sharing a journal field so will modify the Field Map for Journal fields. We want the attachment data to be automatically embedded as base64 instead of sending the sys_attachment reference.
Open Integration Designer and navigate to the Field Map you want to modify. Find the Source to Stage mapping script in the Outbound panel.
Copy the function below into your Source to Stage mapping script. This function accepts some text and updates all <img> tags that reference the sys_attachment table.
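The original function is not reproduced here, but a sketch of the general shape it could take is shown below. The regex, the data URI content type, and the getBase64 lookup callback are all assumptions; in a real Message Script the base64 content would come from the ServiceNow attachment API rather than a callback.

```javascript
// Sketch: replace <img> tags that point at sys_attachment download URLs
// with embedded base64 data URIs. getBase64(sysId) is a placeholder for
// however the attachment content is actually retrieved.
function embedImageAttachments(text, getBase64) {
  var imgPattern = /<img([^>]*?)src="[^"]*sys_attachment\.do\?sys_id=([0-9a-f]{32})"([^>]*)>/g;
  return String(text).replace(imgPattern, function (tag, before, sysId, after) {
    var base64 = getBase64(sysId);
    if (!base64) return tag; // leave the tag untouched if lookup fails
    return '<img' + before + 'src="data:image/png;base64,' + base64 + '"' + after + '>';
  });
}
```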
The final step is to update the same Field Map to use the new embedImageAttachments() function that you've just added.
Add the code to the bottom of the same Source to Stage mapping script and modify as required.
This code is designed for the out of box Journal field map.
Build your integration to push the Field Map changes to all the messages. You can now test.
Embedding attachments in this way means the maximum string length imposed by ServiceNow of 5MB will be easy to reach. Since base64 encoding inflates data by a 4/3 ratio, you have a realistic limit of around 3MB of images which can be transferred this way in any single request.
This approach is not considered secure since it is possible for malicious JavaScript to be rendered in the [code] tags.
The glide.ui.security.allow_codetag system property is responsible for rendering [code] tags. This is typically disabled as part of Instance Security Hardening.
If the property is disabled then [code] tags will not be rendered. Both systems must have this property enabled for this solution to work.
Some alternative approaches to embedding attachments in this way are:
Create a document containing the images and upload it as an attachment on the record. If you are streaming attachments, you will be able to share a file up to 50MB in size.
Upload the images separately as attachments and reference them in your comments. If you are streaming attachments, each image could be up to 50MB in size.
The current hotfix version is 4.0.2.3.
Unifi can be patched between releases by using a special Script Include called hotfix. If you find a bug in Unifi we may issue an update to hotfix so you can get the features you need without having to upgrade.
When upgrading Unifi, you can revert to the latest version of hotfix included in the upgrade. We reset hotfix with each release when the fixes become part of the main application.
We occasionally release a hotfix when minor issues are found. Simply replace the script in the hotfix Script Include with the one shown below and you will instantly have access to the fixes.
These hotfixes will be shipped as real fixes with the next version of Unifi, so make sure you have the correct hotfix for your version.
Unifi Integration Designer is the primary and recommended interface used for creating and managing integrations in Unifi.
Designer can be accessed by clicking Unifi Integration Designer ➚ from the navigator menu.
Process is the top level configuration element which contains all Integrations for the Process concerned (e.g. Incident, Problem, Change, Request etc.).
Process is at the top of the configuration chain in Unifi. It is a mechanism which provides functionality not provided OOTB in ServiceNow. The purpose of a process is to keep like integrations in one place (e.g. Incident process integrations, or Change process integrations).
One Process may contain multiple Integrations which can each be configured separately and uniquely (more on that on the Integrations page).
From v3.0, when you create a Process, Unifi will automatically create* the corresponding Web Service (REST method).
If you change either the Process Name or the API Name**, Unifi will automatically update the corresponding Web Service (REST method).
Please note: deleting a Process will not remove the Web Services or Business Rule. They will need to be removed manually.
*Requires Global Utility
**If you change the API Name for a Process created in versions prior to v3.0, ensure that all scripts which use it are also updated. It will break the integration if they are not.
The fields to be configured in the Process record are as follows:
Description
String
A description of what this Process is for and/or how it works.
Target table
Table name
The primary target or process table that this integration uses.
Reference field
Table name
The field on the target table that is used as the reference for the external system.
Stage table*
Table name
The table for storing staged data.
API Name**
String
The unique name of this process for use with the API.
SOAP Service***
Reference
The Scripted SOAP Service that external systems will connect to.
REST Service****
Reference
The Scripted REST API that external systems will connect to.
Message method****
Reference
The REST interface to send messages to.
Action method****
Reference
The REST interface to send attachments to.
Application
Reference
Application containing this record.
*Stage table
This value is automatically populated. All Integrations for this Process will leverage the one Stage table. The Stage is the root staging table for all data mapping.
Stages are created dynamically at the time of data being sent/received. For more information see the Stages page.
**API Name
The API Name is how we identify which Process we are integrating with. The Scripted SOAP/REST Service will reference the API Name (which is why it is important for this to be a unique reference).
***SOAP/REST Service
When building an Integration & after configuring the relevant SOAP Service, populate this field with the relevant value to enable Unifi to include that Service in the packaged Integration.
****REST Service/Message method/Action method
Unifi will automatically populate these fields with the relevant values. The 'Unifi' Scripted REST API is shipped with Unifi. The relevant Message method & Action method (Scripted REST Resources) are automatically created/updated when the Process is created/updated based on the templates contained in the shipped 'Unifi' Scripted REST API.
Here you will find details of what's changed in this release, including new features & improvements, deprecated features and general fixes.
Welcome to the release notes for Unifi - Version 2.0. Please have a read through to see new features and fixes that have been added.
Upgrade Notice
Please note that this is a major release which may not be entirely compatible with your existing integrations. While we do everything to make sure you won't have to fix your integrations when upgrading, we strongly encourage all our customers to perform full end-to-end tests of their integrations before upgrading Unifi in Production.
Feedback and Reviews
We really appreciate feedback on what we're doing - whether it's right or wrong! We take feedback very seriously, so if you feel you'd give us anything less than a 5 star rating, we'd love to hear from you so we can find out what we need to do to improve!
If you would rate us 5 stars, and haven't left a review on the ServiceNow Store yet, we'd really appreciate you heading over there to leave us your feedback. It only takes a few minutes and really does help us a lot. Go on, you know you want to leave a 5-star review on the Store!
The following items are the major talking points for this release.
Now that Unifi has a solid and proven foundation for building and running integrations, we have built a new Service Portal interface to make it simpler and easier than ever to build an integration. This new interface has been developed from the ground up to be scalable, intuitive and capable of supporting advanced features.
Unifi version 2.0 marks the first release of the portal, with the ability to build Processes, Integrations and Messages, but we will be adding to it in future releases.
Also of importance is the new Dashboard which gives a clear overview of what is happening in your instance right now. A live view of the last 20 transactions to be processed and the bonds that have been worked on today really helps you to see what's going on.
Unifi is tested to be compatible with all current ServiceNow releases: Kingston, London and Madrid.
A lot of effort has been put into upgrading the queue management options that are available. Things like replaying Transactions and Requests now use brand new technology which allows them to be much more robust and as close to a clone of the original as possible. It's also now possible to repair all the outstanding Transactions directly from the Integration with the click of a button.
This release introduces the concept of Field Maps. These allow you to configure your messages with Fields and reusable Field Maps which then automatically generate all your scripts, so you don't need to write any code.
Field Maps have no negative performance impact - in fact, they might be slightly faster - because all the logic to be executed is copied into the Message Scripts when the integration is built. Any existing code in your Message Scripts is retained so you always have the option of doing things manually.
Anyone that's used Unifi knows that a major benefit of building integrations in Unifi is the logging and profiling capability. We've now expanded this capability to make it more easily and widely accessible, so even background processes have complex logs available to see.
With the introduction of the Activity Log table, this becomes your first port of call when looking for issues.
[UN-124] - Add description to connection variables to allow explanation of their use
It is now possible to add a description to the Connection Variable (up to 4000 characters) so notes or explanations can be stored.
[UN-127] - Replay All errored transactions
The Integration record now features a 'Repair' button which will automatically batch process all errored transactions. A background process is executed, with status displayed through a progress bar.
[UN-152] - Sending attachments
This is a breaking change
Some improvements have been made to the attachment sending.
Two new fields are now available on Message:
Send attachments [true/false]
Send attachments max [Integer - default 1]
In addition to these new fields, the method AttachmentHandler#getAttachmentsToSend(message, transaction) now takes two arguments where previously it took one. It is this method modification that may break some existing installations that manually call this method (e.g. in a UI Macro or Script Include). Please check usage of this before upgrading.
[UN-170] - Add reprocess/resend to HTTP requests
Replaying an HTTP Request now works properly for both inbound and outbound requests. Previously, complexities with the process prevented this from being fully implemented.
[UN-178] - Activity Log
Unifi has a very smart profiling and logging tool called console which allows detailed logs to be attached to contextually relevant records. While this has been incredibly useful it fell short in two areas. The first is that it would add lots of attachments to the database over time, and the second is that it wasn't easily possible to log non-record generating processes, such as background clean-up jobs.
This update addresses both of these issues through the addition of a new table called Activity Log. The table means that attachments are no longer needed because all transactions are automatically logged. It also means that background processes can be tracked because they do not need a record to belong to. The logs are stored in the table and rendered without needing to download or open an attachment.
Activity Logs will also track (where possible) the integration they belong to and the contextual record it is about. Related lists have been added in places like Bond, Transaction and Request so that Activity Logs can be easily referenced.
By default, Activity Logs are stored for 7 days.
[UN-210] - Dynamic Stage object so we don’t need to extend
Traditionally, local staging tables have been required for each process to capture data at a point in time. This data would then be available to process accurately, while also providing a log for future reference. The problem was that a new Stage table would need to be created before the Process could be created. Then, any time a data point was needed, a new field would have to be added to the custom stage table.
Dynamic Stage does away with the need for custom stages per process and removes the need to create fields by storing data dynamically as you use it. New Processes only need to reference the Unifi Stage table, and Dynamic Stage will take care of everything else.
This change is fully backwards compatible with the traditional one stage table per process.
Start using Dynamic Stage in your Message Scripts by assigning to and from the $stage object instead of stage. It is possible to add not only strings to $stage, but also complex objects and arrays. Just make sure that things like field objects are coerced to strings (or use getValue()) otherwise you'll see something like [object Object] instead of the string value you expect.
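A quick plain-JavaScript illustration of the coercion pitfall (the values here are made up; in a Message Script the object would be a GlideRecord field):

```javascript
// A record field is an object; concatenating a plain object into a string
// yields "[object Object]". An object with a proper toString (or a value
// retrieved via getValue()) keeps the string you expect.
var fieldObject = { toString: function () { return 'INC0010001'; } };

var bad  = '' + {};           // "[object Object]" - what you see without coercion
var good = '' + fieldObject;  // "INC0010001" - explicit string coercion
```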
Custom stage tables will work with Dynamic Stage, but you will need to add the Stage Data Render formatter to the form layout.
[UN-266] - Clone cleanup should disable all non-instance specific connections
By default, the Integrations Enabled system property (x_snd_eb.active) is disabled when an instance is cloned. This update also disables all Connections to prevent accidental connection to the wrong endpoint.
[UN-268] - Add automatic domain setting for Bond and Transaction stack
The domain field is now automatically set for transactional records using the following logic:
Bond domain comes from Target domain (e.g. Incident).
Transaction domain comes from Bond domain.
Stage domain comes from Transaction domain.
HTTP Request domain comes from Transaction domain.
Bonded Attachment domain comes from Bond domain.
[UN-269] - Modify response action override message
The response action error message has been modified to be more descriptive.
e.g. Socket timeout handled by response action "Default 0 Script Errors"
[UN-272] - Add indexes
All tables have been indexed according to the data and queries being used. This should improve performance, especially when loading/searching lists.
[UN-276] - Allow multiple ignore and ignore from right click (Transaction)
Transactions can now be ignored from the list context menu and by a list button.
[UN-299] - Add description fields to configuration records
Add description field to all configuration type records.
Process
Integration
Message
Response Action
Poller
Poll Processor
[UN-304] - Break out GlideRecord#operation() to Model method
It is now possible to check the record operation from the Unifi Model class.
[UN-307] - Add link to app log
The System Log module has been updated to point to the scoped log rather than the system log.
[UN-309] - Update sync overdue transaction job BR on integration
The business rule that generates the overdue transaction jobs for each integration is now wrapped so it appears in the Activity Log.
[UN-317] - Refine Message.processOutbound
The Message.processOutbound() method has been improved to be more efficient and provide better visibility of processing.
[UN-318] - Add method to console for executing a function
A new method ws_console.execute() wraps an inline function in console logic and prevents errors from leaking (errors are captured by the Activity Log and automatically passed to gs.error()).
It should be used in all business rules and scheduled jobs, and anywhere else a script runs in Unifi.
[UN-320] - Add cleanup job for pollers
A scheduled job has been added which runs every hour and uses a new cleanup field on the Poller record to determine how many days to keep Poll Requests for.
[UN-326] - Poll Requests module should filter by created on today
Viewing Poll Requests from the application navigator will now filter to show only those created today.
[UN-340] - Sync responses do not have automatic access to internal reference
When an inbound sync request was made, the internal stage did not get auto-populated with the internal_reference field for use by the response message (e.g. to give the Incident number that was created). Unifi now checks if it has a value and, if not, auto-populates it.
[UN-352] - Copy message
A Copy action is now available on the Message form which copies the Message and its Scripts.
[UN-353] - Copy connection
A Copy action is now available on the Connection form which copies the Connection and its Variables.
[UN-354] - Add support for streaming an attachment over REST
Unifi now supports sending a whole attachment as the request instead of having it sent as part of the body. This method streams the attachment rather than embedding it which means attachments larger than the string limit (5MB) can now be sent.
To use this feature, simply set the payload of the request to be sys_attachment:<sys_id> and Unifi will automatically send the specified attachment.
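The payload convention described above can be sketched as follows (illustrative only; the parsing function is hypothetical and not part of Unifi's API):

```javascript
// Recognise the streaming convention: a payload of "sys_attachment:<sys_id>"
// means stream that attachment as the request body instead of embedding it.
function parseStreamedAttachment(payload) {
  var match = /^sys_attachment:([0-9a-fA-F]{32})$/.exec(payload);
  return match ? match[1] : null; // the attachment sys_id, or null
}
```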
[UN-355] - Improve environment objects for identify_message script
The Identify message script on the Integration is now provided with integration and connection GlideRecord objects. The objects now available are payload, headers, integration, connection and variables.
[UN-361] - Add link parsing to console
Console logs now replace text in the format table.sys_id with a link to the record in HTML format. If the record is found then the table label and record display value are also shown. This allows users to easily open records used during the process.
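A simplified sketch of that substitution (the real implementation may differ; the function name and link format here are assumptions):

```javascript
// Replace "table.sys_id" references in log text with HTML anchors,
// e.g. "incident.<32-char sys_id>" becomes a link to that record.
function linkifyRecordReferences(text) {
  return text.replace(
    /\b([a-z0-9_]+)\.([0-9a-f]{32})\b/g,
    '<a href="/$1.do?sys_id=$2">$1.$2</a>'
  );
}
```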
[UN-363] - Add link to broken transactions on integration
A View broken transactions related link is now available on the integration to easily view all transactions that are errored or timed out.
[UN-366] - Process stage table should default to x_snd_eb_stage
With the introduction of Dynamic Stage, new Process records default to use the Unifi Stage table.
[UN-367] - Allow request error to show in Transaction instead of 'Final retry failed.'
The "Final retry failed" error shown in transactions was not helpful and has now been replaced with the error generated by the latest request that failed.
[UN-368] - If transaction is ignored, prevent in flight request from sending
Ignoring a transaction midway through a request being processed would not prevent the request from being sent. Requests now check if the transaction is cancelled just before sending and will automatically cancel themselves if the transaction is cancelled.
[UN-374] - Remove irrelevant debug statements
Irrelevant debug statements have been tidied up so they either do not show or only show when trace logging is enabled.
[UN-376] - Replace snd_console with ws_console
When Unifi was first created, snd_console was also created to be used for profiling and debugging. With the company transitioning from the old name SN Developer to Whitespace Studios, we have replaced snd_console with ws_console. You might even see ws_console become available for your own apps in future!
This change is backwards compatible. snd_console still exists in Unifi so any code in your instance that uses this method will still work.
[UN-377] - Message path should allow ${} variable format
The Path on Message would only allow inline scripts using the {} format, but this is inconsistent with the rest of ServiceNow so it now accepts ${} or {} to define inline scripts. e.g.
/upload?id={variables.uid}
or
/upload?id=${variables.uid}
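The two accepted formats can be illustrated with a simple substitution sketch (not Unifi's actual parser, which evaluates full inline scripts; this version only resolves dotted property paths):

```javascript
// Substitute ${expr} or {expr} placeholders in a message path using a
// context object, e.g. { variables: { uid: 'abc' } }.
function resolvePath(path, context) {
  return path.replace(/\$?\{([^}]+)\}/g, function (whole, expr) {
    var value = expr.split('.').reduce(function (obj, key) {
      return obj == null ? obj : obj[key];
    }, context);
    return value == null ? '' : String(value);
  });
}
```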
[UN-150] - Replay request/transaction needs to run as the original creator.
When replaying requests or transactions, the replay would run as the logged in user instead of the original user. It now runs as the original user to allow functionality that relies on the identity of the user to run correctly.
[UN-228] - Overhaul pending transactions counter
It is possible for the Pending transactions counter on the Bond record to get out of sync. Pending transactions are now calculated differently to try to prevent this from happening.
[UN-257] - Message processOutbound is processing messages for inactive integrations
Integrations that were inactive were still being processed for messages to send when an insert/update was made to a record. This fix prevents inactive integrations from having their messages processed.
[UN-260] - Payload object is not automatically converted to JSON for Stage to Request
Integrations using JSON payloads had to manually stringify the payload object in the Stage to Request script. You can now just set the global variable payload to your payload object and it will automatically be converted to JSON.
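The behaviour can be pictured as the following sketch (the function name is hypothetical; Unifi performs this conversion internally):

```javascript
// If the payload variable is set to an object, convert it to a JSON
// string automatically; strings are passed through unchanged.
function normalisePayload(payload) {
  return typeof payload === 'string' ? payload : JSON.stringify(payload);
}
```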
[UN-274] - Poller doesn't check for integration off property
Pollers would still run even if the integration was disabled. This fix means Pollers will not run if the integration is disabled.
[UN-275] - RestHelper wraps with 'result'
This is a breaking change
When using a scripted REST API with the Unifi RestHelper, ServiceNow automatically wraps any non-streamed response in a result object, which makes it difficult to integrate with other systems that have specific response requirements. This fix uses the streaming method to send the response payload, which means the message controls the whole response and no result wrapper is added.
[UN-283] - ITIL users cannot see the message names when viewing bond/transactions.
Security rules have been created to allow ITIL users to see the names of the messages that have been used when viewing from transactional data such as bonds and transactions.
[UN-285] - Attachment added can only be true on one message
Previously, only one message per integration could be configured with the 'Attachment added' trigger condition. This update allows one message per table per integration instead.
[UN-289] - Attachment business rule does not run on update
Some processes will update attachment records and these updates were not captured by Unifi. This fix addresses that so updates to attachments will check integrations.
[UN-290] - REST XML does not work
This fix allows XML to work properly with scripted REST APIs using the Unifi RestHelper.
[UN-303] - When getting the bond need to order by created at ascending
This fix addresses issues finding the correct bond for integrations that can create more than one bond per ticket, where transactions could be spread between different (incorrect) bonds.
[UN-306] - ITIL users cannot see the integration names when viewing bond/transactions.
Security rules have been created to allow ITIL users to see the names of the integrations that have been used when viewing from transactional data such as bonds and transactions.
[UN-308] - Data Store checkOrphans method causing errors for empty table name
The scheduled orphan record checker for Data Stores would cause errors when the record that owned the data store was deleted, causing the getRefRecord() method to fail.
[UN-314] - Receiving errors in processing poll requests due to string being too big
Pollers that return strings outside the string size limit (usually > 5MB for scoped applications) will generate an exception. A try/catch has been added to avoid the exception, allowing the script to keep running and exit gracefully.
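The guard added here can be sketched as follows. This is an illustrative standalone version, not Unifi's actual code: the function name, the store object, and the configurable limit parameter are assumptions made for the example.

```javascript
// Sketch of guarding poll payload storage against the scoped-app string
// size limit. All names here are illustrative, not Unifi APIs.
var DEFAULT_MAX_LENGTH = 5 * 1024 * 1024; // typical scoped string limit (~5MB)

function savePollPayload(payload, store, maxLength) {
  maxLength = maxLength || DEFAULT_MAX_LENGTH;
  try {
    if (payload.length > maxLength) {
      // mimic the platform exception thrown for oversized strings
      throw new Error('String too large: ' + payload.length);
    }
    store.payload = payload;
    return true;
  } catch (e) {
    // record the error and exit gracefully instead of killing the poll run
    store.error = e.message;
    return false;
  }
}
```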
[UN-319] - Repeating an inbound create request causes multiple bonds
When replaying an inbound create request, it would run as the user who replayed the request. Unifi saw this as a new create as it wasn't created by the Integration User. This has been fixed by replaying using a scheduled job that runs as the user that created the original request.
[UN-328] - Outbound attachment sending loop
Sending an outbound attachment that is rejected could result in the attachment being picked up with the next transaction which could also fail, thus causing an infinite loop. Bonded attachments are now only picked up in the Ready state where previously it was Ready or Rejected.
[UN-334] - Sync integration doesn't update bond external reference from stage, but async does.
The External reference on the Bond had to be written to manually for a synchronous message, where it happened automatically for an asynchronous message. It now works automatically for both types of message.
[UN-343] - Multi-table attachment processing
If an integration belonged to a process which pointed to a different table to the attachment sending message, the attachment listener on sys_attachment would not find the message. This is fixed by passing an option to Message.processOutbound so that the table hierarchy is searched.
[UN-348] - Affected CI logic updating incident is causing double updates
A caching issue meant it was possible for a work note to be added to the Incident whenever a CI was added/removed from the form. This has been fixed with a more intelligent mechanism of referencing the target record.
[UN-350] - Unifi message name header needs to be case insensitive
Message identification can be done by the request passing a specific header that Unifi recognises rather than adding the message name to the payload. However, headers in ServiceNow Scripted REST are always converted to lowercase which meant the Unifi header name, which used uppercase characters, did not work. This has been fixed by using case-insensitive matching on the Unifi message name header.
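The fix can be illustrated with a simple case-insensitive lookup. This is a sketch, not Unifi's implementation, and the header name used in the test is a placeholder rather than the real Unifi header.

```javascript
// Illustrative sketch: ServiceNow Scripted REST lowercases inbound header
// names, so the message-name header must be matched case-insensitively.
function getHeaderCaseInsensitive(headers, name) {
  var target = name.toLowerCase();
  for (var key in headers) {
    if (key.toLowerCase() === target) {
      return headers[key];
    }
  }
  return null;
}
```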
[UN-351] - Inbound user should not be mandatory
The Inbound user field on Connection is no longer mandatory. This is important for outbound-only integrations which do not require or permit any inbound requests.
[UN-356] - Add condition to the sys_attachment Unifi trigger
There were scenarios where Unifi log attachments could be sent by Unifi via the integration. Additional conditions on the sys_attachment business rule now prevent this from happening.
[UN-375] - Request URLs have duplicate host
If an integration required two endpoints (such as a ServiceNow Table API integration which also uses the Attachment API), the secondary host defined in a connection variable would not replace the host defined on the connection. The resulting URL had the connection endpoint followed by the variable endpoint.
e.g. Connection URL: https://test.service-now.com/api/now/table
Connection Variable value (attachment_api): https://test.service-now.com/api/now/attachment
Message path: {variables.attachment_api}
Resulting URL: https://test.service-now.com/api/now/tablehttps://test.service-now.com/api/now/attachment
The workaround was to put both hosts in variables, but it has now been fixed so that the message path is pre-processed first and the connection path is only prepended if the resulting URL does not contain a scheme (i.e. ://).
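The fixed behaviour can be sketched as follows, using the example URLs above. buildRequestUrl is an illustrative name for this example, not a Unifi API; it assumes the message path has already had its variables substituted.

```javascript
// Sketch of the fixed URL resolution: a pre-processed path that already
// carries its own scheme wins outright; otherwise the connection URL is
// prepended as before.
function buildRequestUrl(connectionUrl, resolvedPath) {
  if (resolvedPath.indexOf('://') > -1) {
    // path is absolute - do not prepend the connection endpoint
    return resolvedPath;
  }
  return connectionUrl + resolvedPath;
}
```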
[UN-379] - Inbound attachment does not trigger outbound attachments for other bonds
In the multi-bond scenario where one ticket is bonded to many integrations, an inbound attachment will automatically create Bonded Attachment records for all the other Bonds. The problem was it didn't trigger the outbound message processing so those attachments didn't get sent automatically. This is now fixed so the outbound messages are processed when the bonded attachments are created.
A Message contains all the configuration required to send, receive, and process a request.
Messages are central to the functionality of Unifi. A Message brings together all the disparate configuration parts required to send, receive, and process a request.
Unifi will automatically create a Trigger (Business Rule) for the Process being integrated (if one doesn't already exist) when you run 'Build' on either the Integration or the Message once your Create Message is configured.
For step-by-step instructions on how to configure Messages (and other Integration components) see the Integration Guides section of the documentation.
The Details fields to be configured for the Message record are as follows:
Message name
String
The message name that is unique for this integration.
Description
String
The description for this message and the requirement it is meeting.
Direction*
Choice
The direction(s) this message is configured to support.
Type
Choice
The primary purpose of the message.
Table
Table name
The table this message will be triggered from. This is typically the same as the target table defined on the process.
Application
Reference
The application containing this record.
Integration
Reference
The integration this record belongs to.
*Direction choices: Inbound, Outbound, or Bidirectional.
The 'Disable' button will set the Active flag to 'false'.
From here you can configure the relevant Field records for the Message. A list of the available and active Fields is displayed*. For more information about Field records click here.
*The Fields displayed in this list are automatically filtered based on the table referenced on the Message (so you will only see relevant Fields).
The Response fields to be configured for the Message record are as follows:
Response
Reference
The immediate synchronous response to this message.
Async
Boolean
Turn this option on if you want inbound processing to occur asynchronously or this message is the first of an asynchronous message pair (make sure to set a receipt message).
Async receipt
Reference
The asynchronous receipt to this message. Leaving this blank will cause the message to be processed asynchronously without sending a receipt.
The Bond fields to be configured for the Message record are as follows:
Bond ownership condition*
String
Determines whether the sender must own the bond for this message to be processed. Use ‘Ignore’ to process regardless of the owner flag.
Bond condition type
String
The type of conditional check made on the bond. None: no checks are made. State: checks against the state are made using the conditional checkboxes. Scripted: the ‘Bond condition script' is used.
Bond condition script
Script plain
The script used to make advanced conditional checks on the bond. Visible if Bond condition type is 'Scripted'.
Bond new
Boolean
Process this message when a new bond is required.
Bond pending
Boolean
Process this message when the bond state is Pending.
Bond open
Boolean
Process this message when the bond state is Open.
Bond suspended
Boolean
Process this message when the bond state is Suspended (internal suspend).
Bond vendor suspended
Boolean
Process this message when the bond state is Vendor suspended (external suspend).
Set bond owner* inbound
String
Set the Bond Owner when receiving this message. Use 'None' to leave the Bond Owner alone or to modify it via a Message/Field Stage to Target script.
Set bond state* inbound
String
Set the Bond State when receiving this message. Use 'None' to leave the Bond State alone or to modify it via a Message/Field Stage to Target script.
Set bond owner* outbound
String
Set the Bond Owner when sending this message. Use 'None' to leave the Bond Owner alone or to modify it via a Message/Field Source to Stage script.
Set bond state* outbound
String
Set the Bond State when sending this message. Use 'None' to leave the Bond State alone or to modify it via a Message/Field Source to Stage script.
*Bond ownership condition choices: Ignore, Must own, Must not own.
*Set bond owner choices: None, Internal, External.
'External' sets the bond.owner flag to true. 'Internal' sets it to false.
These settings will take precedence over any which are scripted in the Message Scripts.
*Set bond state choices: None, Pending, Open, Suspended, Vendor Suspended, Closed.
These settings will take precedence over any which are scripted in the Message Scripts.
The Outbound Trigger fields to be configured for the Message record are as follows:
Table
Table name
The table this message will be triggered from. This is typically the same as the target table defined on the process.
Use advanced condition
Boolean
Use a script to evaluate the trigger condition for this message.
Advanced condition*
Script plain
The script that must be met for the message to be processed. Use current to get access to the triggering record.
Outbound condition
Conditions
The condition that the ServiceNow record must meet to trigger this message being processed.
Send to self*
Boolean
Enable this option to allow this message to be sent even when the integration it belongs to has caused the update.
*Advanced condition:
This field is made visible when the 'Use advanced condition' field is set to true.
*Send to self:
By default, if an integration updates a target record it will not trigger messages to be sent back to itself, preventing feedback loops.
The Template fields to be configured for the Message record are as follows:
XML Template
String
The template to be compiled when sending a message for an integration that is configured to use XML. Also handy for storing an example message which can be used as a reference on inbound messages.
It is possible to use CDATA within an XML message, but because of the way ServiceNow handles XML it can be a little tricky. When ServiceNow sees the CDATA tag, it processes it and the tag ends up being removed from the final output. We need to use a little trick to get the CDATA into the final result.
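One common workaround is to emit the CDATA markers from script by concatenating the strings, so the platform's XML handling never sees a literal tag to strip. This is a hedged sketch of that trick, not official Unifi template syntax:

```javascript
// Assumed workaround sketch: build the CDATA markers by concatenation at
// runtime so the XML processor cannot match and remove a literal tag.
function wrapInCdata(text) {
  return '<![' + 'CDATA[' + text + ']]' + '>';
}
```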
The Outbound Attachments fields to be configured for the Message record are as follows:
Send attachments
Boolean
Mark this message as being enabled for sending attachments.
Maximum attachments to send
Integer
Set the maximum number of attachments this message can send using the AttachmentSender helper class.
Attachment added*
Boolean
Use this message to immediately send new attachments regardless of the trigger conditions.
*Attachment added:
This field is made visible when the 'Send attachments' field is set to true.
The Outbound Settings fields to be configured for the Message record are as follows:
Path
URL
A path to append to the URL defined in the connection. Specify a full URL to override the connection. Define inline scripts to reference Stage to Request script variables by wrapping code in braces {}, e.g. /{transaction.message_id}.
Action method
String
The SOAP Action or the REST Method to use for this message. If this field is empty the SOAP Action will default to the message name and the REST Method will default to POST.
Order
Integer
The order that outbound messages are processed in.
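The defaulting rules for the Action method field above can be sketched like this. resolveActionMethod is an illustrative name for the example, not a Unifi function:

```javascript
// Sketch of the documented defaulting: an empty Action method means the
// SOAP Action falls back to the message name and the REST Method to POST.
function resolveActionMethod(actionMethod, messageName, isRest) {
  if (actionMethod) {
    return actionMethod;
  }
  return isRest ? 'POST' : messageName;
}
```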
The Inbound Settings fields to be configured for the Message record are as follows:
Bond reference method
String
Method of searching for and validating an existing bond for incoming messages. Internal: lookup using the internal reference only. External: lookup using the external reference only. Both: lookup using both the internal and external references.
Reference lookup script
Script plain
The script containing functions for extracting internal and external references from the request payload.
Extract attachments
Boolean
Use the Extract attachments script to extract attachments from the inbound payload and prevent attachment data from being stored in the Request payload field.
Extract attachments script*
Script plain
The script used to extract attachment data from a request payload. Used to prevent writing attachment data to the request payload field (full payload can be attached using integration ‘Attach payloads’ option).
*Extract attachments script
This field is made visible when the Extract attachments box is checked. The script must always return a string; if an object is used it needs to be JSON encoded (i.e. JSON.stringify()). The following is an example:
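The sketch below is a minimal standalone illustration of an Extract attachments script. Real scripts run inside Unifi with its own context available; here we assume a JSON payload carrying base64 data in an "attachment" property, and every property name is illustrative only.

```javascript
// Hypothetical Extract attachments script sketch: capture attachment data
// from the payload, strip it so it is not stored in the Request payload
// field, and return the remaining payload as a string.
function extractAttachments(payloadString, attachments) {
  var payload = JSON.parse(payloadString);
  if (payload.attachment && payload.attachment.data) {
    attachments.push({
      file_name: payload.attachment.file_name,
      content_type: payload.attachment.content_type,
      data: payload.attachment.data
    });
    // replace the raw data so it never reaches the Request payload field
    payload.attachment.data = '[extracted]';
  }
  // must always return a string (JSON encode objects)
  return JSON.stringify(payload);
}
```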
From here you can view each of the Message Scripts for the Message (Source to Stage, Stage to Request, Payload to Stage, Stage to Target). The auto-generated code displayed is configured using Fields & Field Maps. For more information on Message Scripts click here.
You MUST NOT edit the code between the Begin & End Comments, or the comments themselves.
If you wish to manually script any code, that must be done outside of those comments.
From here you can view a list of the relevant Response Actions on the Integration. You can also configure a New Response Action from the list. For more information about Response Actions click here.
From here you can view a list of the relevant Event Actions on the Integration. You can also configure a New Event Action from the list. For more information about Event Actions click here.
The ServiceNow Administrator [admin] role is required to access Response/Event Actions.
With the introduction of Datasets, it is now possible for large amounts of user, hardware, network and other supporting data to be sent to and received from remote systems. While this has been possible for some time using Pollers, they are not specific to handling large amounts of data in the way that Datasets are.
Datasets provide easy configuration that is supported by platform intelligence and automation, making it incredibly easy to set up a robust and efficient mechanism for handling large sets of data.
Datasets require the latest Unifi Global Utility for them to build the necessary configuration correctly.
Each Dataset will create the following configuration:
The Send message is used for sending the data in an attachment to the other system. The name is automatically generated using the prefix "Send" and the name of the table.
The Process message is used for processing the record data both inbound and outbound. The name is automatically generated using the prefix "Process" and the name of the table.
The Import set table is used for staging inbound data. The name is automatically generated using the prefix "dataset" with the sys_id of the dataset.
The Scheduled import record is automatically created for processing import data. It is executed when an inbound Dataset Request has created the import set rows and is ready to be processed.
The Transform map and associated coalesce fields are used for processing the import set.
Data is automatically collected, transformed, packaged and sent to another system on a schedule. Each export creates one or more Dataset Requests depending on the number of records and a series of limits configurable on the Dataset.
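The splitting of an export into multiple Dataset Requests can be illustrated with a simple sketch. chunkRecords is an assumed name; the real limits are configured on the Dataset itself.

```javascript
// Illustrative sketch: split an export into batches, one per Dataset
// Request, according to a configurable per-request record limit.
function chunkRecords(records, maxPerRequest) {
  var requests = [];
  for (var i = 0; i < records.length; i += maxPerRequest) {
    requests.push(records.slice(i, i + maxPerRequest));
  }
  return requests;
}
```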
Unifi automatically creates a Process message which can be used with outbound Fields and Field maps (via the Source to Stage and Stage to Request Message Scripts) to extract the data from the specified records.
The extracted data is written to an attachment which is then sent using the Send message which is also automatically created by Unifi.
When an inbound Send message (as specified on the Dataset) with a data attachment is received, the attachment will be processed with each record being inserted into an import set table directly related to the Dataset. The import set table is automatically created and maintained during the Integration Build process.
Unifi uses ServiceNow Import Sets since they offer a well understood mechanism for importing data with performance benefits. The mapping is automatically managed by Unifi and handled through a transform script which uses inbound Fields and Field maps (via the Stage to Target Message Script) to transform the data and write it to the target records.
The process message related to the Dataset is used for inbound and outbound mapping and transformation.
Simply create the fields that you want to export/import and configure them in the same way you would for any other Unifi message. Note: the field maps will likely need to be specific to the dataset fields.
Use the Coalesce field (specific to Dataset messages) to indicate which field should be used for identifying existing records to update during inbound processing.
Response Actions are interceptors that allow you to customise the way responses are handled.
By default, a successful request is determined by the response HTTP Code being in the info (1xx) or success (2xx) ranges and anything outside of these is treated as a request failure.
Sometimes, an API will return a response that needs to be treated differently to how it would normally be handled. For example, an API might return a 400 error to indicate a part of the request contains invalid data. We can use a response action to catch that error and do something about it.
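Matching a response code against a value like 500 or a pattern like 5xx can be sketched as follows. The wildcard handling shown here is an assumption made for illustration, not Unifi's actual matching code:

```javascript
// Sketch of response-code matching: exact codes match directly, and "x"
// acts as a single-digit wildcard so "5xx" matches any 500-range code.
function codeMatches(code, pattern) {
  if (String(code) === pattern) {
    return true;
  }
  var regex = new RegExp('^' + pattern.replace(/x/gi, '\\d') + '$');
  return regex.test(String(code));
}
```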
The Response Action executes the following steps:
Run retry logic. Retry logic takes precedence and will cancel the Response Action if possible. If no more retries can be made (or no retry is required), the Response Action is executed.
If no retry error has occurred, the request and transaction states are updated according to the Response Action.
Notify the user by adding an integration note. A note will only be added according to the notification script on the Integration.
If a script is specified, execute the script.
The Details fields that can be configured for the Response Action record are as follows:
Name
String
The quick reference name of this Response Action.
Description
String
An explanation of what the response action is intended to do.
Integration
Reference
The Integration this action applies to.
Message
Reference
(Visible when Integration is populated). Only run this Response action for this message.
Code
String
The status code to match on the request. This could be 500 for a specific match, or 5xx to match all 500-range errors.
Type
Choice
The type of response according to the code.
Application
Reference
Application containing this record.
The Settings fields that can be configured for the Response Action record are as follows:
Retry^
Boolean
Retry the request to the limit specified by the Integration before running this response action.
Notify user*
Boolean
Add a note to the Bond and, if the Integration permits, the target record.
Transaction state**
Choice
Override the Transaction State on the Transaction.
Process state**
Choice
Override the Process State on the Transaction.
Request state**
Choice
Override the Request State on the HTTP Request.
Run script***
Boolean
Run a script when this action is executed.
^Retry
When selected, an additional Advanced retry boolean field becomes available (which, if selected, reveals a further Retry script field in which to configure a scripted method of retrying).
*Notify user
When selected, an additional Notify message string field becomes available in which to enter the required notification text.
**Transaction/Process/Request state
When ’–None–’ is selected, the state is left as is and not overridden.
***Run script
When selected, an additional Script field becomes available in which to configure the script to execute for the Response Action.
The Active field can be configured for the Response Action record as follows:
Active
Boolean
Enable/Disable this action.
This flag is actually set via the Enable/Disable buttons that are available on the editable action.
Unifi has some functionality that requires access to methods not available to scoped applications. We grant Unifi access to those methods through a single global utility Script Include which you can install via Update Set.
With version 4.0, the Update Set also includes a queue processing job, Unifi dataset events process, which is necessary for Datasets to be processed.
It is strongly advised that you install this utility to get the most out of Unifi.
Download Unifi Global Utility.
Import the file as an update set, then preview and commit it. You can find more information on how to Load customizations from a single XML file in the ServiceNow Product Documentation.
Method: snd_eb_util.executeNow(job)
This is necessary for Datasets to execute their associated ServiceNow Import Job.
Method: snd_eb_util.getSoapResponseElement(xml)
When working with Scripted SOAP Services, it’s important to be able to set the soapResponseElement directly in order to preserve the exact payload to be sent back to the calling system. This can only be done with the Global Utility.
Method: snd_eb_util.moveAttachments(attachment_ids, record)
The ServiceNow scoped attachment API does not support moving attachments from one record to another. This is necessary for inbound attachments which initially reside on the HTTP Request and are then moved to the Target record.
Method: snd_eb_util.runAsUser(user, fn)
On occasion, Unifi needs to run some code as a given user. This is necessary for things like replaying requests.
Method: snd_eb_util.runJelly(jelly_code, vars)
Jelly processing is not supported in the ServiceNow scoped APIs; however, it is very useful for XML processing. If you are working with XML payloads, the Global Utility drastically improves Unifi's XML capabilities.
Method: snd_eb_util.validateScript(script, scope)
Used by the Unifi Integration Diagnostic to check scripts for errors.
Method: snd_eb_util.writeAttachment(record, filename, content_type, data)
The Scoped Attachment API does not support writing binary attachments, which can cause problems when receiving things like Word documents or PDFs. This method allows Unifi to use the global attachment API to write those files to the database, meaning they will work properly.
Method: snd_eb_util.packager.*
The packager methods included in the global utility allow Unifi to automatically export all the components of an integration in one easy step. The packager methods will allow Unifi to create an update set for the integration, add all the configuration records to that update set, and export it as a file download from the Integration page on the Unifi Integration Designer portal.
Method: snd_eb_util.web_service.*
The web service methods included in the global utility allow Unifi to automatically create and update REST Methods used by Unifi integrations.
Method: snd_eb_util.trigger_rule.*
The trigger methods included in the global utility allow Unifi to automatically create a Trigger Business Rule on a table if one doesn't already exist.
Method: snd_eb_util.datasetCreateDataSource(dataset, attachment)
Create a data source for datasets to use when importing data.
Method: snd_eb_util.datasetDeleteScheduledImport(scheduled_import_id)
Delete a scheduled import that belongs to and was created by a Dataset. Used when deleting a Dataset.
Method: snd_eb_util.datasetDeleteTransformMap(transform_map_id)
Delete a transform map that belongs to and was created by a Dataset. Used when deleting a Dataset.
Method: snd_eb_util.datasetEnsureTransformMap(dataset)
Update a transform map for a Dataset to use when importing data. Used when building a Dataset.
Method: snd_eb_util.datasetUpdateTransformMapFields(dataset)
Update the coalesce fields for a transform map. Used when building a Dataset.
Field and Field Map configuration records are elements that simplify and bring additional functionality when configuring Messages & Message Scripts and facilitate the auto-generation of documentation.
Adding these elements to the Unifi configuration records brings a wealth of advantages. In the past, each Message Script has been maintained individually and manually. However, the introduction of Fields and Field Maps allows Message Scripts to be broken down into smaller, reusable components. Having these Field records also means we can easily view and manage the use of fields on Messages and Integrations.
A Field record defines the processing of a discrete component of a Message. Although the Field record is distinct from the field element of a form, it often represents the handling of an individual field on the source/target record (e.g. Short description). A Field record is also used to define the objects that carry other Transaction-specific data (e.g. Name, Time stamp, Source reference, Target reference).
A Field record identifies the field of a source/target record (optional - as stated above), the element used to stage the information and the property of the payload exchanged between systems. It can define the direction in which it is applied, any default values to use in each direction and whether it is mandatory. It can exist at the Integration or Message level (more on that in the Inheritance section). The behaviour, or ‘type’ of a Field is defined by the Field Map to which it is linked.
Field inheritance allows a Field record to be created at the Integration level rather than at an individual Message level. Individual, Message level Field records can then be configured to inherit their behaviour from the Integration level Field record.
For example, if a field is used by ten Messages, then we would define ten Field records and link them to the Messages (a separate Field record must be created for each of the Messages where the field is used). If we configure those Field records to inherit their behaviour from the Integration level Field record, then modifications made to the Integration level would immediately be available to the Message level Field records.
A Field record is shown to be an Integration level record when it is not linked to a specific Message, i.e. the ‘Message’ field is left blank.
The ‘Integration’ field is mandatory for all Field records, meaning that a Field record must always be linked to an Integration.
A Message level Field record can inherit from its Integration level counterpart by simply checking the ‘Inherit’ field.
Field inheritance is set to true by default. This means the record will be updated with integration-level Field values when saved (except for Active, Inherit and Message values). Uncheck the Inherit field to configure locally.
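The inheritance behaviour described above can be sketched as a simple merge. The property names are illustrative; the preserved values follow the rule stated above (Active, Inherit and Message are kept on the message-level record):

```javascript
// Sketch of inheritance on save: an inheriting message-level Field takes
// the integration-level values, except Active, Inherit and Message.
var PRESERVED = ['active', 'inherit', 'message'];

function applyInheritance(messageField, integrationField) {
  if (!messageField.inherit) {
    return messageField; // locally configured, leave untouched
  }
  for (var key in integrationField) {
    if (PRESERVED.indexOf(key) === -1) {
      messageField[key] = integrationField[key];
    }
  }
  return messageField;
}
```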
The image below is an example of an ‘incident.short_description’ Field record:
We can see an example from this list view below that the incident.short_description Field record has been defined for seven different Messages on this Integration (plus the Integration level record):
When configuring a Field record, it is best practice to only link it with a Field Map that belongs to the same Integration as the Field (for details, see the Field Maps page).
As an aid to help identify any issues, in the auto-generated Documentation a warning triangle will appear next to the Field Map and a message will be shown in the Document if the Field Map doesn't belong to the Integration.
To maintain best practice when copying Field records from one Integration to another, if the value of the Integration changes on the Copy Field modal, Unifi automatically clears the value of the Field Map. This means you will need to populate this with the value of the equivalent Field Map on the Integration it was copied to.
In this example, we will copy the incident.impact Field from one Integration to another.
To copy the Field record, navigate to the Fields page. Click on the ellipsis to the right of the incident.impact Field & then click Copy.
On the Copy Field modal, note the value of the Integration and Field Map.
Clear the value of the Integration and the Field Map is automatically cleared.
Select the Integration you wish to copy the Field record to. You are then able to select the equivalent Field Map for that Integration (which you have already created, per Field Maps Best Practice).
Click Copy to save the new Field record.
The Details fields to be configured for the Field record are as follows:
Description
String
Describe what this field is for and any specific details that might help you in future.
Inherit
Boolean
Should this field inherit its configuration from the integration-level field of the same name?
Integration
Reference
The Integration this Field record belongs to.
Message
Reference
The Message this Field record is linked with.
Field map
Reference
The Field Map that knows how to process the data in this Field.
Domain
Reference
The domain that this Field applies to.
The Settings fields to be configured for the Field record are as follows:
Field map
Reference
The Field Map that knows how to process the data in this Field.
Path
String
A path to the node that contains the property. Can be dot or slash notation e.g. parent/child.
Property
String
The name of the property this field represents in the payload.
Map to field
Boolean
Should this field map to a specific field on a record?
Table*
Table name
The primary source/target table that this Field record is mapped to.
Element*
Field name
The database element, or field, being represented.
Inbound
Boolean
Is this Field received from another system?
Outbound
Boolean
Is this Field sent to another system?
Mandatory
Boolean
Is this Field mandatory?
Order
Integer
The order at which this Field is processed.
Depends on
Glide list
Control the order of processing. Specify the Field records that must be processed before this Field record is processed.
*These fields are visible when ‘Map to field’ is set to true
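Resolving the Path setting above in dot or slash notation can be sketched like this. resolvePath is an illustrative helper written for this example, not part of Unifi:

```javascript
// Sketch of walking a payload to the node named by a Path in either
// dot or slash notation, e.g. "parent/child" or "parent.child".
function resolvePath(payload, path) {
  var parts = path.split(/[./]/);
  var node = payload;
  for (var i = 0; i < parts.length && node != null; i++) {
    node = node[parts[i]];
  }
  return node;
}
```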
The Defaults fields to be configured for the Field record are as follows:
Default inbound*
Script plain
Generate a default value that can be used when an inbound request does not contain a value for this field.
Default outbound*
Script plain
Generate a default value that can be used when an outbound request does not contain a value for this field.
*Default inbound/outbound: For a list of available variables, click here.
Field Choice records can be defined for an Integration-level Field. Rather than configuring choices for each Message, you can define them once at the Integration level.
These are used when you’re mapping choice field elements with static values that don't change per Message (e.g. State, Impact, Urgency).
When you click on Generate field choices, Unifi will go to the Choice [sys_choice] table and automatically pull back the value and label for each of the elements where the table matches the primary source/target table that this Field record is mapped to - creating Field Choice records for each.
You can also create your own internal/external mappings when you click New. For more on Field Choices click here.
There are some fields which are visible in native ServiceNow. These can be viewed by clicking the hamburger menu & selecting 'Open in platform'.
The following fields are visible on the platform Field record:
Application
Reference
Application containing this record.
Active
Boolean
Set to true to use this Field record for processing (Inactive Fields will be ignored when building the Message Scripts).
Name
String
The name of the Field record (auto-populated from the Table & Property fields).
Fields and Field Map records are not processed during the operational phase (run time) of the Integration. They are processed by a Build process (triggered by a UI action) which takes the information in the records and produces the code which defines the Message Scripts. The Build activity can be performed on an individual Message, or for the entire Integration.
At run time only the Message Scripts are executed and the Field and Field Map records are not accessed at all. This means there is no operational difference between an integration built using Fields and Field Maps and an integration built without.
Next, we will look at Field Maps in more detail.
Here you will find details of what's changed in this release, including new features & improvements, deprecated features and general fixes.
Welcome to the release notes for Unifi - Version 2.1. Please have a read through to see new features and fixes that have been added.
Please note that, as with every release, there may be some changes that are not entirely compatible with your existing integrations. While we do everything we can to make sure you won't have to fix your integrations when upgrading, we strongly encourage all our customers to perform full end-to-end tests of their integrations before upgrading Unifi in Production.
We also highly recommend aligning your Unifi upgrade with your ServiceNow upgrade. This means you only need to test your integrations one time rather than once for the ServiceNow upgrade and once for the Unifi upgrade.
We really appreciate feedback on what we're doing - whether it's right or wrong! We take feedback very seriously, so if you feel you'd give us anything less than a 5 star rating, we'd love to hear from you so we can find out what we need to do to improve!
If you would rate us 5 stars, and haven't left a review on the ServiceNow Store yet, we'd be grateful if you would head over there to leave us your feedback. It only takes a few minutes and really does help us a lot. Go on, you know you want to!
The biggest change in this release is the update to the Portal interface, now known as Unifi Integration Designer.
The old operations dashboard has now been moved to the new Unifi Operations Portal, ready for lots of exciting updates around operations in the near future. It's been replaced with a brand new configuration dashboard, allowing you to view, create, copy and manage your processes and integrations with ease.
There are lots of improvements across the board when configuring an integration, too, not least of all including support for all the components you need to build any Ticket Exchange integration together in an intuitive way.
We've introduced a brand new capability to pause an integration. This allows you to temporarily prevent outbound messages from being sent, instead allowing them to queue, ready for sending in the future. This is different to deactivating the integration, which prevents message processing altogether. You might want to do this if you are seeing a lot of errors from the external system, or perhaps if it has gone down for maintenance. Once you know the system is back up and online, you can resume the integration with the click of a button and it will automatically start processing the queued messages.
The introduction of Event Actions is part of our ongoing strategy to improve operations management of integrations. By creating an Event Action, you can easily triage events such as transaction errors and perform actions to handle or report on them. For example, you might want to create an Incident record if you receive more than 5 transaction errors in 15 minutes.
With the introduction of Activity Logs you no longer need to attach logs to requests/transactions. Activity Logs are present in related lists on all relevant Transaction stack records.
The attached logs were previously controlled by the Attach logs and Attach payloads checkboxes on Integration, but they are now overridden off by a new Integration Enable Attachments system property. This property acts as the master switch and effectively disables the checkboxes on the Integration. Note: the checkboxes have been removed from the Integration form but can still be edited from the list view.
Admins can now force close a bond. [UN-106]
Description field added to Data Stores. [UN-123]
New option on Integration to send attachments with Create messages. [UN-184]
Activity Log now references the Target (e.g. Incident). [UN-288]
Added a cleaner job, controlled by system properties, to remove orphaned Transactions. [UN-544]
New Poll Request states (Retrieving, Processing) are set while a Poller is processing. [UN-293]
Field style added to Active field on Poller. [UN-298]
Activity Log is now able to show the table that called it. [UN-305]
Field style added to Poll Request state. [UN-313]
Queued Transactions are now ignored when the bond is closed. [UN-370]
Fields can now be copied. [UN-389]
The Integration field is now auto-populated when creating a Field from a Message. [UN-395]
The Table field is now auto-populated when creating a Field. [UN-396]
Added Domain field to Transaction stack forms (Bond, Transaction, Stage, and HTTP Request). [UN-401]
Copy Message now copies all its Fields. [UN-420]
Copying an Integration now copies all the Fields. [UN-421]
Copy Integration is now available in the Portal. [UN-422]
Connection is now shown on Transaction. [UN-426]
Integration Status (Up, Down, Awaiting) is now tracked on the Integration. [UN-427]
Integration State (Active, Paused, Off) is now tracked on the Integration. [UN-428]
Added option to Run Event Actions on an Integration. [UN-431]
Bond now references the Connection it was created for. [UN-438]
Dynamic Stage ($stage) now has a getValue() method. [UN-441]
Fields and Field Maps can now be configured in the Portal. [UN-443]
Message level Fields can inherit from Integration level Fields. [UN-449]
Pollers are now listed on Integration. [UN-456]
Bonded Attachment records now track direction (Inbound or Outbound). [UN-461]
Added console object to all scripts for ease of logging. [UN-477]
Updated Activity Logs on the bond so it shows all Activity Logs for the Document record. [UN-513]
Improved Activity Log titles. [UN-514]
Admins can now force re-open a closed Bond. [UN-517]
Improved Execute Now button for Pollers. [UN-521]
Added Activity Logs to Poll Requests. [UN-522]
Added Integration field to Poll Request. [UN-523]
Improved naming for Portal based Activity Logs. [UN-529]
HTTP Request and Transaction now show the Response Action that was used. [UN-533]
Cascade delete now applies to Field Maps belonging to an Integration. [UN-545]
Cascade delete now applies to Fields belonging to an Integration. [UN-546]
Cascade delete now applies to Pollers belonging to an Integration. [UN-547]
Cascade delete now applies to Pollers belonging to a Poll Processor. [UN-547]
Added Description field to Connection. [UN-565]
Added Description field to Scheduled Scripts. [UN-570]
Added Hints to Field Map fields. [UN-571]
Added execution URL to Activity Log. [UN-574]
Modified security on Activity Logs so they are visible to Unifi Managers. [UN-586]
New Unifi REST service and integration role added to simplify REST API configuration. [UN-633]
Changed Integration Format and Message Service defaults to be REST/JSON. [UN-634]
Bond closed field has been removed from Message. [UN-332]
The XML Template on Message no longer has a default value. [UN-398]
Changing the Integration on a Field now clears the Message. [UN-399]
Poll Processors are now copied when the Integration is copied. [UN-468]
Improved the Field "Depends on" logic. [UN-476]
Data Stores now inherit the domain from the parent document. [UN-481]
Message.processOutbound() will not send anything if the action on the current record has been aborted using setAbortAction(true). [UN-490]
Replay Request is now available to the Unifi manager role. [UN-494]
Most system properties are now Private and not transferable by Update Set. [UN-652]
HTTP Request URL length limit increased to 2048 characters. [UN-118]
Transaction count on bond has been deprecated. [UN-141]
Number field has been removed from Messages and Message Scripts. [UN-208]
Modified Bond Reference Method field visibility on Message. [UN-292]
Ignore Transaction button now redirects to the Transaction being ignored. [UN-338]
Deprecated ECC Queue attachment support for attaching logs. [UN-405]
Deactivated the fix script x_snd_eb eBonding Upgrade. [UN-406]
Response is no longer mandatory on Message. [UN-439]
Integration properties which are numbers are now mandatory. [UN-440]
Endpoint URL is now visible on inbound requests. [UN-511]
Deprecated log attachments. [UN-540]
Bonded Attachment fields are now locked down. [UN-577]
Only Unifi Admins or Managers can replay HTTP Requests. [UN-581]
Inbound Create messages are now rejected when a Bond with the same reference exists. [UN-136]
Inbound Bonded Attachments now reference the Transaction they came in on. [UN-151]
Fixed issue with connection name generation. [UN-159]
Replaying a Transaction with attachments will now replay the attachments as well. [UN-226]
The Bond state will now always reflect the overall status of its transactions. [UN-230]
Bond status is now updated correctly when processing numerous queued messages. [UN-273]
A new field, Deferred count, has been added to Bonded Attachment. It tracks how many times the attachment was not sent, allowing attachment metrics on the Bond to be calculated properly. [UN-287]
Bond numbers are no longer used unnecessarily. [UN-315]
Fixed issues with assigning headers in Stage to Request Message Scripts. [UN-333]
Fixed an issue where calling a Unifi endpoint with GET would throw an error. [UN-380]
UTF-8 encoded XML is now checked before processing to prevent parsing issues. [UN-382]
The Name value on Field is now set even when not mapping to a real field. [UN-397]
Fixed an issue where the Field form would always show the alert for losing changes. [UN-451]
Fixed duplicate attachment messages when using ServiceNow Antivirus scanning. [UN-465]
Inbound transactions that are replayed are now executed as the integration instead of the current user. [UN-473]
Fixed an issue with Field Choice generation not having the table set. [UN-475]
Replaying an HTTP Request now increments the number correctly. [UN-478]
Errors thrown in Field Map scripts will now show up in the Transaction. [UN-479]
Fixed an issue with replaying Transactions created by system. [UN-482]
Fixed Bond History state logging. [UN-485]
Fixed an issue with false-positive updates from Transactions. [UN-505]
The "Generate field choices" action on Field now checks for existing Field Choice entries. [UN-516]
Reference fields in Messages are now correctly updated when copying an Integration. [UN-530]
Fixed an issue where full Activity Logs would be sent to system log. [UN-534]
Logging verbosity is now allowed at error level. [UN-535]
Fixed an issue where queued Transactions would not be processed. [UN-543]
Connection is now updated when the Integration name is changed. [UN-552]
API names are now allowed to use a zero (0). [UN-553]
Clicking the "Generate field choices" action on Field will save the record first to ensure choices are generated correctly. [UN-556]
Response Action user notifications now work. [UN-562]
Fixed undefined error with the Ignore Transaction button. [UN-592]
Fixed issue in generating messages without a Connection URL. [UN-638]
Transactions are now ordered by date created in the live feed. [UN-452]
Side menu options are now highlighted when selected. [UN-453]
Reference links now navigate to the Portal pages instead of native ServiceNow. [UN-454]
Changed the style on inactive Boolean fields to differentiate them from read-only Boolean fields. [UN-564]
Embedded fonts/icons to improve page loading times. [UN-572]
Added Enable/Disable button to form headers for easy access. [UN-641]
Fixed read error when attempting to view a record in Portal as a non-admin. [UN-403]
Fixed Portal font size for instances running Madrid onwards. [UN-404]
Fixed an issue with the operations dashboard truncating the live transaction feed when there are fewer than 20 transactions. [UN-407]
Attachment sending can now be controlled from the portal. [UN-415]
Condition widgets now work in Portal. [UN-434]
Fixed an issue with the payload and headers variables not being assignable in Message Scripts. (Closures have been replaced with instructive comments.) [UN-463]
Outbound Attachment properties are no longer visible for inbound only messages. [UN-472]
Bond scripted condition type only shows once instead of twice. [UN-558]
Long text fields in Portal lists are now truncated to improve usability. [UN-569]
Fixed render issue with reference fields that have no value in Portal. [UN-578]
References in lists now show "(empty)" if there is no display value. [UN-640]
Fixed issue with forms appearing read-only in Portal view when navigating directly to the URL. [UN-642]
Fixed issue with Portal form loads from an external link [UN-643]
Added support for day-of-week type fields. [UN-566]
Added support for time fields. [UN-567]
Added support for interval fields. [UN-568]
Lists and pages now update the title in Portal. [UN-637]
A Poller makes a scheduled request to a remote system.
In cases where it is not possible for a remote system to send us the data, we can make a scheduled request for it using Pollers. All Pollers belong to an integration. Although a Poller belongs to only one integration, an integration can have multiple Pollers.
A Poller is a configuration record which defines the frequency of polling and which logic to use (the logic itself is defined in the Poll Processor). Each time it runs, it creates a corresponding Poll Request record.
Depending on the use case, using a Poller to collect data from a remote system poses some development challenges which need to be considered. Namely, there is additional responsibility and workload placed on the host system to store and check the returned data in order to evaluate what has changed.
For example, in order to decide whether the state has changed, what comments have been added, or even which system made the updates (we don't want to pull back data we have changed ourselves), checks have to be built into the scripts. This is aided by holding a copy of the relevant returned data using Data Stores (see the relevant page in the Administration section).
Event Actions are a means of triggering an action from an event.
In Unifi, we throw events for Transaction State changes. Like standard ServiceNow Script Actions, Event Actions respond to those events. Unlike Script Actions, however (where there is a simple one-to-one relationship between the event and the action), Event Actions provide an added level of abstraction in that they offer functionality that enables you to triage the events and make decisions about how you deal with them according to the conditions you configure.
Their primary use case would be in dealing with Transaction errors. For example, you may choose to create an incident after receiving five errors in fifteen minutes, or when a specific error code occurs. The available configuration options make their use case extremely flexible, so you could decide, for example, to auto-close previously opened incidents when the Transaction is complete.
The Event Action executes under the following conditions:
Run Conditions. These stipulate the conditions which cause the Event Action to fire. They specify which Integration* and which Event* the Event Action is responding to (*both Integration & Event are mandatory). They can even be dependent on the status of the integration and apply to specific messages.
Triage Conditions. Triage conditions determine whether or not we take any action i.e. the conditions that must apply for the Action script to run.
Action. This is where you will find the Action script, which is the actual script that will run.
The Details fields that can be configured for the Event Action record are as follows:
The Run Conditions fields that can be configured for the Event Action record are as follows:
*Integration / Event:
Both Integration & Event are mandatory.
**Integration status:
For example, set the Integration status to 'Up' to trigger the Event Action only when the Integration is Up, and use the code in the Action script to set the Integration to 'Down'. Subsequent Transaction events (triggered whilst the Integration is 'Down') would then not cause the Event Action to run, because the run condition would not match.
The Triage Conditions fields that can be configured for the Event Action record are as follows:
The Action fields that can be configured for the Event Action record are as follows:
Example script:
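The example script itself is not reproduced here, so the following is a minimal sketch reconstructed from the description below. It assumes standard Incident fields; the short_description value used as the unique identifier is purely illustrative, so adjust it to your own environment.

```javascript
// Hypothetical Action script sketch: raise or escalate an Incident for
// transaction errors. 'current' is the Transaction that triggered the
// event; 'action' is the Event Action record.
var summary = 'Unifi transaction errors: ' + action.getDisplayValue();

// Use short_description as a unique identifier to find an existing incident
var incident = new GlideRecord('incident');
incident.addQuery('short_description', summary);
incident.addQuery('active', true);
incident.query();

if (!incident.next()) {
  // No matching incident - create a new one and set some values
  incident.initialize();
  incident.short_description = summary;
  incident.impact = 3;
  incident.urgency = 3;
  incident.work_notes = 'Created for transaction ' + current.getDisplayValue();
  incident.insert();
} else {
  // Matching incident found - increment the impact and urgency
  // (lower numeric value means higher impact/urgency in ServiceNow)
  if (incident.impact > 1) incident.impact--;
  if (incident.urgency > 1) incident.urgency--;
  incident.work_notes = 'Another error: transaction ' + current.getDisplayValue();
  incident.update();
}
```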
Script Variables (available in both the Action script and the Advanced Condition script):
current: the current Transaction that triggered the event.
action: the Event Action record (GlideRecord object).
In the above example code, we query the incident table for records that match (using short_description as a unique identifier). If one isn’t found, we create a new one and set some values. Otherwise we increment the impact and urgency. Finally, we add a work note to the incident.
The Active field can be configured for the Event Action record as follows:
This flag is actually set via the Enable/Disable buttons that are available on the editable action.
A Data Store is simply a key-value pair stored in the database that is available to all records in the system.
Use Data Stores to easily make variables persistent and retrievable.
Data Stores are used primarily in integration scripts when you want to get and set functional data that is relevant to the integration but does not belong on the target record.
This is particularly handy in a uni-directional integration where polling the remote system is necessary. You can store data such as:
The last time a request was made
Identifiers that have been seen before
Watermarks
Use getData and setData to work with simple string-like data.
You can also work with objects just as easily using getDataObject and setDataObject. These functions automatically take care of encoding the object for database storage and decoding it for JavaScript usage again.
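Conceptually, the object variants wrap the string API with JSON encoding and decoding. The following is a plain JavaScript, in-memory illustration of the idea, not the Unifi API itself (which persists to the database):

```javascript
// Minimal in-memory stand-in for a key-value Data Store.
var store = {};

function setData(key, value) { store[key] = String(value); }
function getData(key) { return store[key]; }

// Object variants: encode for storage, decode for JavaScript usage
function setDataObject(key, obj) { setData(key, JSON.stringify(obj)); }
function getDataObject(key) { return JSON.parse(getData(key)); }

// Typical polling use case: remember the last request time and seen IDs
setData('last_request_time', '2024-01-01 00:00:00');
setDataObject('seen_ids', { INC0010001: true, INC0010002: true });

var seen = getDataObject('seen_ids'); // decoded back to a live object
```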
The System Logs module is a quick link to the ServiceNow system logs and shows errors and warnings from the current day.
The link to the ServiceNow System Logs shows errors and warnings from the current day, and is the place to look in the case of something catastrophic happening outside of Unifi, or something that isn't captured in the Activity Logs (effectively providing an additional backup).
Attachments can be extracted from payloads and saved before the payload itself is saved. This is highly recommended as it avoids storing the attachment data in the payload on the HTTP Request record.
Attachment data should be extracted, saved as an attachment, and replaced with the attachment sys_id in the format <x-attachment-data sys_id="...">. Unifi will automatically collect the attachment ids, create Bonded Attachments for each of them, and finally move the attachments to the target record.
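As a sketch of the collection step, the sys_id can be pulled from each placeholder tag like this. Unifi does this internally; the helper below is purely illustrative:

```javascript
// Collect attachment sys_ids from a payload containing
// <x-attachment-data sys_id="..."> placeholders.
function collectAttachmentIds(payload) {
  var pattern = /<x-attachment-data sys_id="([0-9a-f]{32})"/g;
  var ids = [];
  var match;
  while ((match = pattern.exec(payload)) !== null) {
    ids.push(match[1]); // the captured 32-character sys_id
  }
  return ids;
}

var sample = '{"file":"<x-attachment-data sys_id=\"0123456789abcdef0123456789abcdef\">"}';
var ids = collectAttachmentIds(sample);
```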
We strongly recommend streaming attachments when possible. Attachments can be streamed into Unifi which bypasses the need for extraction, supports binary file sharing, and also allows for sizes up to 50MB (providing your instance and integration configuration supports this).
This is available for all new integrations from Unifi 3.0.
For more information on configuring inbound streaming, please see our guide.
Sometimes attachment details are provided within another message and require additional API calls to fetch them. We recommend breaking this out using a Poller, which will asynchronously fetch the attachments and then push them into Unifi for full tracking. This has a lot of benefits and follows Unifi best practice.
The basic flow is:
Receive a message containing a list of attachment details to fetch from the other system.
Process the message as normal and store the attachment details in the Stage just like any other field.
In the Stage to Target script, loop through the attachments and execute an On-demand asynchronous Poller for each one.
The Poller fetches the attachment and pushes it into Unifi just like any other inbound message. We use a Poller because it has its own queue management and provides visibility of the transactions.
Unifi processes the message and moves the attachment to the target record. The attachment is also recorded against the bond and the full transaction stack is created.
This example is based on receiving a message which contains a list of attachments that need to be fetched. You may need to adjust to your specific environment and requirements.
Create a new Poll Processor (we recommend <Integration name> - Get Attachment). It will fetch a single attachment as required and push it into your integration attachment message.
Name: <Integration name> - Get Attachment
Integration: <Integration name>
Active: true
Run parallel: true
Create a new Poller (we recommend <Integration name> - Get Attachment) and set it to use the Poll Processor we've just created. For scheduling, set it to run On Demand.
Name: <Integration name> - Get Attachment
Poll processor: <Integration name> - Get Attachment
Integration: <Integration name>
Active: true
Run: On Demand
You'll need a Message to handle the incoming attachment. You can either use an existing one you already have or create a new one. We recommend calling it AddAttachment.
Since Unifi will automatically recognise the attachment provided by the Poll Processor, the only mapping you need to configure here is the internal and external reference mapping.
We need to tell the integration which Poller to use for processing the attachments. We do this with the Poller record sys_id. To prevent hard-coding it, we specify it in a connection variable. You'll need to make sure this variable is present on all connections you use.
Description: The sys_id of the Poller record that we execute to retrieve an attachment.
Key: get_attachment_poller_id
Value: <The Poller sys_id>
This section describes what the message that receives the attachment URLs needs to do. You may consider doing this in a new Field and Field Map, or just updating the relevant Message Scripts.
When the attachment details are received, they are treated like any other field. Store them in the Stage for visibility and later processing.
In the Stage to Target script (either Message Script or Field Map), loop through the attachments and execute a Poller.
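As a sketch, the loop might look like the following (plain JavaScript; executePoller is a hypothetical stand-in for however your instance triggers an on-demand Poller using the sys_id held in the get_attachment_poller_id connection variable):

```javascript
// Trigger one on-demand Poller run per attachment captured on the stage.
function processStagedAttachments(stagedAttachments, pollerId, executePoller) {
  var triggered = 0;
  stagedAttachments.forEach(function (att) {
    // Give the Poller the details it needs to fetch this one attachment
    executePoller(pollerId, { url: att.url, file_name: att.file_name });
    triggered++;
  });
  return triggered;
}
```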
Now you've configured your inbound message, Poll Processor, Poller and AddAttachment message, it's time to test!
Scheduled Scripts are automatically created by the system to perform tasks.
The Scheduled Scripts module displays the scripts which are automatically created by the system in order to perform tasks. These scripts generate entries in the Activity Log each time they run.
Useful snippets of code given as examples to help with various scripting needs.
Use this script to add a SOAP endpoint to a Unifi Process. You will need to update the api_name to be the same as the one you set in the Process record.
Sometimes it's necessary to remove the namespaces sent in by other systems to make it easier to handle the payload in ServiceNow. The easiest place to do this is within the Scripted SOAP Service, but bear in mind that your payloads will no longer be identical to what the other system sent you. This example shows you how:
Use this script to add a REST endpoint to a Unifi Process. You will need to update the api_name to be the same as the one you set in the Process record.
By wrapping the code in console, we give context to the Activity Log and prevent multiple database updates.
This script shows how to find a bond based on an external reference and store some data.
The Activity Logs module is a quick link to the Unifi Activity Logs and shows all entries from the current day.
Not to be confused with the System Logs, the Activity Logs module displays all the entries made to the Unifi Activity Log table during the current day. This reduces clutter in the System Log table by bringing all the Unifi log entries together in one place, with contextual links to the records.
Clicking into the records, you will very quickly discover that the level of detail and clarity provided in the Activity Logs makes them of even greater value. It is the place to look when debugging.
Below is an excerpt from one of the log entries in the Activity Logs:
Here you will find a description of the variables that are available to you when scripting in Unifi.
The Add Note Script is used to notify the end user about integration events. These notes are automatically added during processing and the script allows the target of the note to be customised. E.g., notes are usually added to work notes on Task-based tables, but a custom table might have a different field for this.
The Identify Message Script is executed when an inbound request is received. It returns the name of the message that should be used to process the request. Typically, message names are embedded within the request payload, but it's possible to use the other variables available for more complex identification.
The Endpoint URL will be prepended to Message Path values (unless the Message Path overrides it). Inline scripts using curly braces {} can be used to construct more advanced endpoints.
We recommend using dynamic endpoints only within the same environment. New environments should have their own Connections so they can be managed more easily.
The Advanced condition is used to script complex trigger logic into the message. Only use this if you cannot use the Outbound condition filters and the single line Outbound condition.
The Path is used to modify the endpoint for the message. It can be used with inline script evaluation to construct more advanced endpoints. Inline scripts should be wrapped with dollar curly braces ${...}.
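As a rough illustration of how ${...} evaluation works, the helper below substitutes expressions into a path. It is illustrative only, not Unifi's implementation; real inline scripts evaluate JavaScript with the Message Script variables in scope, whereas this sketch supports only simple property lookups:

```javascript
// Substitute ${...} expressions in a path using values from a context object.
function evaluatePath(path, context) {
  return path.replace(/\$\{([^}]+)\}/g, function (whole, expr) {
    // Resolve dotted property lookups like ${source.number}
    return expr.split('.').reduce(function (obj, key) {
      return obj ? obj[key] : '';
    }, context);
  });
}

var url = evaluatePath('/api/incident/${source.number}', {
  source: { number: 'INC0010001' }
});
// → '/api/incident/INC0010001'
```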
The outbound condition is an inline script field useful for simple one-line conditions to be used in conjunction with the filter.
The Reference lookup script is used to extract the internal and external message IDs when an inbound request is received. These IDs are used to locate the bond (and therefore the target) the request applies to.
The XML Template is evaluated in a similar way to a UI Macro and is extremely useful in constructing advanced XML based payloads using Jelly. Other types of payload can be constructed here, however it's normally easier to do this with the Fields and Field Maps or directly in the Message Scripts.
The XML Template will only be evaluated if the payload has not already been set within the Stage to Request message script.
The Source to Stage script is used to capture data from the source record, e.g., an Incident, and save it to the stage record where it is ready to be used to generate a payload.
Source data values should be captured in full (including referenced values) in the Source to Stage script.
The Stage to Request script is used to generate a payload using the data captured on the stage.
Payload generation and request configuration should be done in the Stage to Request script.
The Payload to Stage script is used to capture data from the inbound request payload and save it to the stage record where it is ready to be used to update the target record.
Data should be extracted from the inbound payload and headers in the Payload to Stage script.
The Stage to Target script is used to update the target record, e.g., an Incident, with the data and references given in the stage.
Reference lookups, data validation, and business logic should be done in the Stage to Target script.
The default inbound script can be used to set the default value used by the Field Map when no value is found on the stage.
The default outbound script can be used to set the default value used by the Field Map when no value is found on the source.
Field Maps are compiled during Build operations with the Field as an input, and the resulting code is added to the respective Message Scripts.
Only code contained within dollar-square brackets $[...] will be compiled. During inbound/outbound processing, the standard Message Script variables apply.
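Conceptually, the Build step splices each field's details into the Field Map template wherever $[...] appears. The following is a simplified illustration of that idea, not Unifi's actual compiler; a real build evaluates the bracketed code, whereas this sketch supports only simple lookups like $[field.element]:

```javascript
// Replace each $[...] section with a value looked up on the field record.
function compileFieldMap(template, field) {
  return template.replace(/\$\[([\s\S]*?)\]/g, function (whole, code) {
    var key = code.trim().replace(/^field\./, '');
    return field[key];
  });
}

var template = 'target.$[field.element] = stage.$[field.element];';
var compiled = compileFieldMap(template, { element: 'state' });
// → 'target.state = stage.state;'
```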
The Response Action script is executed when Run Script is checked. It can be used to do anything based on a response to an outbound request.
Poll Processor scripts have the following variables available.
Some useful data values can be fetched from the Poller data store.
e.g. poller.getData("$last_execution_time")
Follow these steps to test the AddAttachment Message.
If necessary, create at least one bonded record in order to test the AddAttachment Message.
In the originating instance, select one bonded ticket and add an attachment to the record.
From the bonded record, navigate to the Unifi Integrations related list and click to open the Bond.
The External reference is populated. The State is "Open" and the Status is "OK".
From the Bond, navigate to the following related lists:
Navigate to the Bonded Attachments related list and verify that a Bonded Attachment record has been created.
The State is "Complete".
Navigate to the Transactions related list and click to open the AddAttachment Transaction.
The Transaction state is "Complete" and the Process state is "Accepted".
From the Transaction, navigate to the HTTP Requests related list and click to open the HTTP Request.
The Request state is "OK".
Note the following about the HTTP Request:
Request headers*: Contains the mime_type.
Request payload*: The payload contains the "sys_attachment:<sysid>" (the format which Unifi expects & automatically triggers outbound streaming).
You will never see the attachment data in the payload itself.
In the receiving instance, navigate to the corresponding bonded ticket and verify the attachment has been added to the record.
The attachment has been added to the bonded record.
From the bonded record, navigate to the Unifi Integrations related list and click to open the Bond.
The External reference is populated (note: the Internal/External reference values are opposite to the originating bond). The State is "Open" and the Status is "OK".
From the Bond, navigate to the following related lists:
Navigate to the Bonded Attachments related list and verify that a Bonded Attachment record has been created.
The State is "Complete".
Navigate to the Transactions related list and click to open the AddAttachment Transaction.
The Transaction state is "Complete" and the Process state is "Accepted".
From the Transaction, navigate to the HTTP Requests related list and click to open the HTTP Request.
The Request state is "OK".
Note the following about the HTTP Request:
Endpoint URL: This is defined on the Resource path of the automatically generated Scripted REST Resource.
Congratulations! You have successfully configured & tested a dedicated Message to stream inbound and outbound attachments.
Documenting your integration build is made simple and easy with Unifi's auto-documentation features. From any integration, select the Documentation button to see exactly what has been configured in an easy to read format.
By default, the documentation is viewable as a list of separate pages for easy navigation.
To view the documentation as a single document that can be printed or exported, click the Paginated View button to change it to Inline View. Depending on the size of your integration, this can take some time to generate.
Click the Message Scripts button to toggle having Message Scripts included with each Message definition. Enabling this option with the Inline View can significantly increase the time to generate the documentation.
Unifi will automatically generate the Scripted REST Resource to cater for inbound streaming. If using a Resource created on a release prior to v3.0 this will need to be updated manually.
When you create a Process, Unifi will automatically create the corresponding Scripted REST Resources (see ).
This guide is supplementary to the Bidirectional Asynchronous Incident Guide and assumes the Process (& subsequent Scripted REST Resources) are already in place (See the page of that guide for details).
We will now examine the automatically generated Attachment Resource. If you are manually updating an existing Resource to cater for inbound streaming, ensure that it looks like this.
The following code is extracted from the automatically generated Attachment Scripted REST Resource:
Note: the above script processes the inbound attachment using the "AddAttachment" message. If your message is named differently then simply update it.
You may need to update the api_name in the x_snd_eb.RestHelper() function. It must be the same as the one you set in the Process record.
With the Message & Scripted REST Resource in place, you are now ready to Test the AddAttachment Message.
Use keytool to generate a private key pair and matching self-signed certificate, which are stored in a keystore file. keytool is included with the Java Development Kit (JDK), which you will need to download and install.
For detailed instructions on how to setup mutual authentication, see the ServiceNow Knowledge Base article .
For information about mutual authentication for inbound web services, see the ServiceNow documentation on .
To configure an OAuth Connection, follow the .
They are accessed via the variables object and can be used in most scripts. See the page for details.
See parameters for available variables.
See parameters for available variables.
Please refer to .
More advanced conditions can be made in the script field.
Install the for full support of Jelly within XML Templates.
See for available variables.
See for available variables.
Endpoint URL: A concatenation of the Connection URL appended with the defined on the Message.
*These objects were defined in the .
Request payload: The payload structure is defined in the .
Name
String
The name of this event action.
Description
String
An explanation of what the event action is intended to do.
Integration*
Reference
The Integration this action applies to.
Message
Reference
(Visible when Integration is populated). Only run this Event action for this message.
Integration status**
Choice
The Integration status this Event Action has to match. (Choices: Up, Down, Ignore)
Event*
Choice
Which event triggers this Event Action to run. Note, the Triage conditions still need to match as well; this event determines whether the triage conditions are run at all.
Advanced
Boolean
Use a script to determine if this Event Action should run.
Table*
Reference
Table to run a filter against.
Filter
Condition
The condition used to find matching records, e.g. errored transactions in the last 15 minutes.
Limit
Integer
The number of records found using the specified filter that will trigger this Event Action’s Action script to be run. Zero will always be true.
Action Script
Script
Run this script if the Triage conditions have matched.
Active
Boolean
Enable/Disable this action.
Environment
Reference
The environment this connection applies to.
Integration
Reference
The integration this record belongs to.
Description
String
Document variable usage, connection nuances etc. Used in auto documentation generation.
Endpoint URL
URL
The external system’s access URL.
Instance name*
String
Safety check to ensure production integrations are not sent updates from development instances and vice versa.
Authentication
String
The authentication method to use for this connection.
OAuth profile**
Reference
The OAuth Entity Profile to authenticate with.
User
String
The username used in basic authentication.
Password
Password2
The password used in basic authentication.
Inbound user
Reference
The user profile used by the external system for authentication. An active connection must be found for the user to gain access.
Mutual auth
Boolean
Use mutual authentication with each request sent to this connection when true.
Protocol profile***
Reference
The protocol profile to use with this connection.
MID server
Reference
The MID server this connection will use to send messages.
Application
Reference
The application containing this record.
Active****
Boolean
Use this connection for the integration when true.
file_name*
String
Name of the file
reference*
String
The name of the dataset
Content-Type*
String
text/csv OR application/json
x-snd-eb-message-name*
String
The name of the Send message, e.g. Send_<table>
payload
Any
The payload string or object. Pre-processing can be configured on the Integration to automatically convert JSON to Object or XML to XMLDocument2.
headers
Object
An object containing the request headers.
params
Object
A key-value pair object of URL parameters.
integration
GlideRecord
The Integration [x_snd_eb_integration] record.
connection
GlideRecord
The Connection [x_snd_eb_connection] record used to receive the request.
variables
Object
Object used to contain Connection Variables.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
payload
Any
The payload string or object. Pre-processing can be configured on the Integration to automatically convert JSON to Object or XML to XMLDocument2.
headers
Object
An object containing the request headers.
request
GlideRecord
The HTTP Request [x_snd_eb_http_request] record.
stage
GlideRecord
The Stage [x_snd_eb_stage] record.
$stage
Object
The dynamic stage which is automatically stored on the stage record.
transaction
GlideRecord
The current Transaction [x_snd_eb_transaction] record.
bond
Object
Instance of Unifi Bond class.
message
GlideRecord
The record of the Message being used.
scratchpad
Object
An object that can be used to pass variables between scripts.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
current
GlideRecord
The record that triggers the message. The actual table will differ between Processes.
message
GlideRecord
The record of the Message being used.
variables
Object
Object used to contain Connection Variables.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
payload
Any
The payload string or object. Pre-processing can be configured on the Integration to automatically convert JSON to Object or XML to XMLDocument2.
request
GlideRecord
The record of the HTTP Request [x_snd_eb_http_request] being used.
answer
Any
The result of the script being called.
current
GlideRecord
The record that triggers the message. The actual table will differ between Processes.
message
GlideRecord
The record of the Message being used.
variables
Object
Object used to contain Connection Variables.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
payload
Any
The payload string or object. Pre-processing can be configured on the Integration to automatically convert JSON to Object or XML to XMLDocument2.
headers
Object
An object containing the request headers.
request
GlideRecord
The HTTP Request [x_snd_eb_http_request] record.
stage
GlideRecord
The Stage [x_snd_eb_stage] record.
$stage
Object
The dynamic stage which is automatically stored on the stage record.
transaction
GlideRecord
The current Transaction [x_snd_eb_transaction] record.
bond
Object
Instance of Unifi Bond class.
message
GlideRecord
The record of the Message being used.
scratchpad
Object
An object that can be used to pass variables between scripts.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
source
GlideRecord
The record that is being integrated.
stage
GlideRecord
The record of the Stage [x_snd_eb_stage] being used. The actual table will differ between Processes.
$stage
Object
The dynamic stage object.
transaction
GlideRecord
The record of the Transaction [x_snd_eb_transaction] being used.
bond
Object
Instance of Unifi Bond class.
message
GlideRecord
The record of the Message being used.
variables
Object
Object used to contain Connection Variables.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
payload
Any
The payload string or object. Pre-processing can be configured on the Integration to automatically convert JSON to Object or XML to XMLDocument2.
headers
Object
An object containing the request headers.
query
Object
An object containing the request URL query parameters. This object is compiled and automatically added to the search query of the URL.
request
GlideRecord
The HTTP Request [x_snd_eb_http_request] record.
stage
GlideRecord
The Stage [x_snd_eb_stage] record.
$stage
Object
The dynamic stage which is automatically stored on the stage record.
transaction
GlideRecord
The current Transaction [x_snd_eb_transaction] record.
bond
Object
Instance of Unifi Bond class.
message
GlideRecord
The record of the Message being used.
scratchpad
Object
An object that can be used to pass variables between scripts.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
payload
Any
The payload string or object. Pre-processing can be configured on the Integration to automatically convert JSON to Object or XML to XMLDocument2.
headers
Object
An object containing request headers keyed by header name.
query
Object
An object containing the request URL query parameters.
request
GlideRecord
The HTTP Request [x_snd_eb_http_request] record.
stage
GlideRecord
The Stage [x_snd_eb_stage] record.
$stage
Object
The dynamic stage which is automatically stored on the stage record.
transaction
GlideRecord
The current Transaction [x_snd_eb_transaction] record.
message
GlideRecord
The record of the Message being used.
variables
Object
Object used to contain Connection Variables.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
target
GlideRecord
The record that is being integrated.
stage
GlideRecord
The record of the Stage [x_snd_eb_stage] being used. The actual table will differ between Processes.
$stage
Object
The dynamic stage object.
transaction
GlideRecord
The record of the Transaction [x_snd_eb_transaction] being used.
bond
Object
Instance of Unifi Bond class.
message
GlideRecord
The record of the Message being used.
variables
Object
Object used to contain Connection Variables.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
field
GlideRecord
The record of the Field being compiled.
default_value
Any
This is the value defined by the Field in its Default Inbound and Default Outbound scripts. It applies only to Source to Stage (Outbound) and Stage to Target (Inbound) scripts.
action
GlideRecord
The current Response Action [x_snd_eb_response_action] GlideRecord.
bond
Object
Instance of Unifi Bond class.
integration
GlideRecord
The record of the Integration [x_snd_eb_integration] being used.
message
GlideRecord
The record of the Message being used.
request
GlideRecord
The record of the HTTP Request [x_snd_eb_http_request] being used.
response_code
String
The response status_code.
response_headers
Object
An object containing response headers.
response_payload
Any
A pre-processed payload string or object.
transaction
GlideRecord
The record of the Transaction [x_snd_eb_transaction] being used.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
current
GlideRecord
The record of the Transaction [x_snd_eb_transaction] that triggered the event.
action
GlideRecord
The current Event Action [x_snd_eb_event_action] GlideRecord.
connection
Connection
An instance of the Connection object that is being used.
params
Object
An object that allows values to be passed between poll processor scripts.
poll_request
GlideRecord
The Poll Request [x_snd_eb_poll_request] record being used.
poller
Poller
An instance of the Poller object that is being used.
variables
Object
Object used to contain Connection Variables.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
ignore
Boolean
Set ignore=true to prevent the poll processor from running. *Setup Script only.
answer
Any
It's possible to set the result of the request script to the answer variable instead of returning it in the function. This value is empty by default. *Request Script only.
response
Any
The response from the request script. *Response Script only.
$execute_counter
Integer
The number of times the poller has been executed.
$last_completed
Date Time
The last date/time the poller was executed successfully. Use with poller.getLastCompleted() to always get a valid date.
$last_execution_time
Date Time
The last date/time the poller was executed, whether successful or not.
payload
String, Stream
The raw inbound payload object to process. Attachment data should be removed and replaced with the attachment sys_id in the format "<x-attachment-data sys_id=...>".
headers
Object
An object containing the request headers.
params
Object
A key-value pair object of URL parameters.
request
GlideRecord
The record of the HTTP Request [x_snd_eb_http_request] being used.
options
Object
An object containing specific properties for processing.
options.is_stream
Boolean
True if the inbound payload is a stream.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
target
GlideRecord
The target record to update, e.g. an Incident record.
note
String
The update message.
options
Object
An object containing specific properties depending on the note being added.
options.integration_name
String
The name of the integration.
[options.response_action]
GlideRecord
The Response Action [x_snd_eb_response_action] record being used. Only provided with response action notification.
[options.type]
String
The type of note: info or error. Only provided with Bond specific notes.
variables
Object
Object used to contain Connection Variables.
log, console
Object
Object containing several functions that can be used for logging: info, warn, error and debug.
error
String
The error message to return from the script. Alternatively you can simply throw a string or Error and the system will take care of it.
Follow this guide to learn how to set up an OAuth Connection in Unifi.
This guide gives step-by-step instructions on how to set up an OAuth Connection for your Unifi Integration (ServiceNow to ServiceNow).
This document will guide you through the process of configuring an OAuth Connection for your Unifi Integration (ServiceNow to ServiceNow). This will involve making configuration changes in both the identity provider and identity consumer instances. As such, this guide will examine the changes for each instance separately on the subsequent pages.
In this guide, you will configure an additional OAuth Connection to another ServiceNow instance as part of the Incident Guide Integration (created when following the Bidirectional Asynchronous Incident Guide). The external instance will act as the Identity Provider whilst the original instance will act as the Identity Consumer.
It is assumed that the Integration has been configured, packaged and moved to the external instance (see here for details). Therefore, the Process, Web Service & Integration are already in place (if not, please ensure that at least those elements are in place before continuing).
Follow this guide to learn how to configure a dedicated Message in Unifi to handle streamed attachments.
Attachments can be sent either as a stream or embedded in the payload as Base64.
In this guide, you will configure a dedicated Message as part of the Incident Guide Integration (created when following the Bidirectional Asynchronous Incident Guide). It is assumed that the Process, Web Service, Integration & Connection are already in place (if not, please ensure that those elements are in place before continuing).
The components to configure are as follows:
Message
Scripted REST Resource
If you're building a polling integration that has response payloads larger than 5MB, you'll need to save them as attachments and adapt your response processing.
This documentation is specific to handling large response payloads in a Poller integration.
Sometimes a polling integration that is fetching data from another system is required to handle response payloads larger than the 5MB limit imposed by ServiceNow. The normal setup for a Poll Processor includes the response payload being returned as a string in the Request script. Here we look at an alternative approach which avoids handling the response payload as a string and so avoids the 5MB string limit.
Alternative approach for response payloads > 5MB:
Make the request (using RESTMessageV2()).
Save the response as an attachment on a record (using saveResponseBodyAsAttachment()).
Pass the attachment sys_id through (using getResponseAttachmentSysid()).
Fetch it and do something with the newly generated attachment - passing the stream of the attachment on to be processed however required, e.g. text/xml processing.
Don't pass the response body from the request script to the response script. Use saveResponseBodyAsAttachment() and getResponseAttachmentSysid() instead.
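The approach can be sketched in a Poll Processor Request script. This is an illustrative sketch only, not the exact script from the guide; the endpoint value, the record the attachment is saved against and the file name are assumptions.

```javascript
// Illustrative sketch: fetch a large response without holding it as a string.
var request = new sn_ws.RESTMessageV2();
request.setEndpoint(params.url); // assumes the URL was built in the Setup script
request.setHttpMethod('get');

// Save the response body straight to an attachment on the Poll Request record
// (parameters: table name, record sys_id, file name).
request.saveResponseBodyAsAttachment('x_snd_eb_poll_request',
  poll_request.getUniqueValue(), 'response_payload.json');

var response = request.execute();

// Pass only the attachment sys_id on to the Response script.
answer = response.getResponseAttachmentSysid();
```

Setting `answer` hands the sys_id to the Response script as its `response` variable, so the payload itself never exists as a server-side string.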
The following example is taken from our Incident Attachment Poller Guide.
Request script: This script uses the ServiceNow RESTMessageV2() web service to make a REST call to the endpoint URL created in the Setup script. It returns the body of the request as an attachment, whose sys_id it passes to the Response script.
saveResponseBodyAsAttachment(): This method takes three parameters:
tableName - the table that contains the record you want to attach the saved file to.
recordSysId - the sys_id of the record you want to attach the saved file to.
fileName - the file name to give to the saved file.
request.execute(): The response object returned by request.execute() provides a method called getResponseAttachmentSysid().
getResponseAttachmentSysid(): This method returns the sys_id of the attachment generated by the REST call.
Response script: This script sets up some objects to help us; this includes the essential PollHelper() function (which we initialise from the poll_request) along with the info array.
After that it sets params.attachment.data to the sys_id of the created attachment, setting up a payload object and submitting it to Unifi by calling the processInbound() method.
After processing the single result, it is logged to the Response status field of the Poll Request.
In our example we've taken the inbound attachment, built a payload and passed it to Unifi to process.
Some integrations will not have a Message that closes the Bond. In these situations, it is preferable to close any open Bonds manually from a business rule on the Target table, e.g. when an Incident is closed. The following business rule script will close all the Bonds for a given record:
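As a rough illustration of this pattern, a minimal after-update business rule might look like the sketch below. The Bond table name (x_snd_eb_bond) and its document reference field are assumptions based on the x_snd_eb_* naming used elsewhere, so check them against your instance before using anything like this.

```javascript
// Illustrative sketch of an after-update business rule, e.g. on Incident
// when its state changes to Closed. Table and field names are assumptions.
(function executeRule(current, previous) {
  var bond = new GlideRecord('x_snd_eb_bond'); // assumed Bond table name
  bond.addQuery('document', current.getUniqueValue()); // assumed reference field to the bonded record
  bond.addQuery('state', '!=', 'Closed');
  bond.query();
  while (bond.next()) {
    bond.state = 'Closed'; // message exchange is terminated for this bond
    bond.update();
  }
})(current, previous);
```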
Heartbeat messages can be used to help identify if the integration is up or down. Both inbound and outbound heartbeat messages are supported.
Create a new Message and select type Heartbeat. Although specific configuration is not required, you can configure Heartbeat messages as you would any other message. The only difference is the Heartbeat transactions are bonded to the active connection.
You should only have one active Heartbeat message per integration. Activating an outbound Heartbeat message will create a scheduled job which sends the message according to the frequency set on the Integration.
No response message is required for Heartbeat messages unless you would like to customise how responses are sent and received.
When an outbound heartbeat message is sent and fails, the integration will be paused and its Status marked as "Down". This will force all outbound messages to be queued. When a future outbound heartbeat message is successful, the integration will be resumed and its Status marked as "Up".
The frequency of outbound heartbeat messages is controlled by the Heartbeat Frequency setting on the Integration. By default, this is set to 300 seconds (5 minutes), meaning a heartbeat is sent every 5 minutes to the endpoint specified by the connection and message.
A heartbeat is seen to fail when it does not receive a 2xx status code response, e.g. 200. If you need to support other status codes, you can use Response Actions to intercept and modify the request response.
To help identify heartbeat messages, all outbound heartbeat messages contain the header x-snd-eb-heartbeat with a value of 1.
You can allow third party systems to send a heartbeat request. A Heartbeat message configured for Inbound processing is required. It can be configured to operate similarly to any other message, although no additional configuration is required.
Inbound heartbeats will run through the full connection process for the integration. By default, a 200 response is returned when authentication and Heartbeat message processing has been successful.
POST
https://myinstancename.service-now.com/api/x_snd_eb/unifi/incident
Example to send a Heartbeat message to check if the API is available.
x-snd-eb-message-name*
String
Your heartbeat message name, e.g. Heartbeat
x-snd-eb-heartbeat
Integer
1
Authorization*
String
Set up authorization as required by your connection.
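Putting the endpoint and headers together, an inbound heartbeat request could look like the following sketch. The instance name, message name and credentials are placeholders; the authorization used must match whatever your Connection is configured to accept.

```shell
curl -X POST "https://myinstancename.service-now.com/api/x_snd_eb/unifi/incident" \
  -H "x-snd-eb-message-name: Heartbeat" \
  -H "x-snd-eb-heartbeat: 1" \
  -H "Content-Type: application/json" \
  -u "inbound.user:password" \
  -d "{}"
```

A 200 response indicates authentication and Heartbeat message processing were successful.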
Care needs to be taken when cloning an instance that uses Unifi, as the connections from that instance could remain active in the clone target. For example, if you clone your production instance over a development instance, then without managing the connections your development instance could send updates to your production integration targets.
Unifi comes with ways to deal with these scenarios:
Unifi has clone data preservers to prevent connection records in clone targets from being overwritten, thereby preventing production credentials and endpoints making it to sub-production instances. However, care must be taken here as you will need to add these preservers to any clone profiles you use.
This is the best way to guarantee a production connection cannot be triggered from a sub-production instance, or indeed that any connection cannot be triggered from an instance it is not specified for. The Connection record contains a string field called Instance name. In this field, add the name of the instance this connection should run against, i.e. if ACME's production instance is acme.service-now.com then the production connection's Instance name field should be set to acme, and if ACME's development instance is acmedev.service-now.com then the development connection's Instance name field should be set to acmedev. A message will only be sent when the connection's Instance name matches the instance it is actually being triggered from. If an attempt is made to trigger an integration from an instance that doesn't match, the corresponding transaction will be errored, with the reason provided.
There are clone clean-up scripts which will deactivate Unifi in the clone target instance, as well as turn off all active connections. The same issue with data preservers affects clone clean-up scripts, so you need to make sure that any clone profiles you use reference them. Also make sure to enable these scripts if you want to use them; they are disabled by default (as we prefer the data preservers).
Make sure you add Unifi to your post-clone checklist. Even if you employ the methods highlighted above, it's prudent to double check Unifi integrations after a clone.
A Bonded Attachment is created for every synchronised attachment, storing relevant data.
For every attachment that is sent or received, a Bonded Attachment record is created that links to the attachment. This allows the system to automatically make size calculations and track if the attachment has been sent.
The attachment sits on the bond, but is also linked to and can be viewed from the transaction that sent it.
For implementations using multiple integrations with one record, one attachment can be added and it will automatically be sent and tracked across all the relevant bonds.
Unifi has automated checks that prevent attachments and updates being sent back to the originator (controlled by the Connection).
For help on setting up attachments, please see the How to Handle Attachments guide.
The following table gives a description of the fields that are visible on the Bonded Attachment record:
Number
String
The unique Bonded Attachment identifier.
State
Choice
The attachment synchronisation state.
Bond
Reference
The Bond this record belongs to.
Transaction
Reference
The Transaction this attachment was sent with.
Size (bytes)
Integer
The size in bytes of the attachment.
Content type
String
The attachment content type.
Created
Glide date time
The time the attachment record was created.
Created by
String
The person/system who created the attachment record.
Attachment
Reference
The system attachment.
File name
String
The file name of the attachment.
Status
String
Information message about the attachment synchronisation.
The state is set in conjunction with the information contained on the Integration & Bond (concerning permitted file types & attachment sizes etc.) and is used to determine whether or not to send the attachment, or where it sits in the process. The Bonded Attachment lifecycle can be seen in the following table, which describes the State field value choices:
Ready
0
The attachment is queued ready for sending.
Sending
1
The attachment is in the process of being sent.
Sent
2
The attachment has been sent successfully.
Complete
3
The attachment has been sent & we have received a receipt acknowledging it.
Rejected
4
An attempt was made to send the attachment, but it was rejected.
Failed
5
There was a technical reason sending the attachment failed.
Ignored
6
The attachment was ignored (e.g. because it exceeded the maximum allowed file size).
Each Dataset Request stores the details and outcomes of a dataset import or export.
The Dataset Request is an operational record which stores the details and outcomes of an incoming dataset import or a scheduled dataset export.
A Dataset export can create many Dataset Requests depending on the dataset query and batch size.
A Dataset import is created when an incoming Dataset message containing an attachment with data is received.
Bonds store the data necessary to link your ServiceNow record to an external system.
In most integrations each system knows just enough about the other to facilitate correlation and a semblance of process. On task records in ServiceNow this has normally been done using the Correlation Display and Correlation ID fields. If you want to integrate non-task records, then you need to add fields like these to the table. If you want to use one record with many integrations, then you have even more to do.
In Unifi, correlation has been normalised for global use through the Bond record.
Bond records are used for much more than just storing correlation data; they are the glue that links everything together and ensures that you can maintain a high level of control in all your integrations. In addition to this, all transactions and attachments reference a bond making it easy to track communications.
A Bond record’s primary functions are:
To store the internal and external references
To give state control to each integration instance
To provide insight into the health of the bond
To track which system ‘owns’ the bond
To store a running commentary of the bond history
To ‘contain’ all transactions and attachments for the lifetime of the bond
To track attachment totals
The references necessary for integration are stored on the bond. You do not need to store any integration data on the integrated record itself.
The display value of the reference field from the ServiceNow record. This is automatically captured based on the Reference field specified in the Process which the integration belongs to.
The reference used by the external system.
You must have at least one reference (either internal or external) to form a bond to another system. It is recommended to store both references wherever possible.
In complex integration processes it is often useful (or necessary) to track the state of the bond. This is mainly used to allow or prevent messages from being sent and received, but is also helpful to analysts who can quickly gain insight into what is happening without any additional configuration.
New
The bond is created but not yet inserted.
Pending
The bond has been created and the system is waiting for confirmation from the external system.
Open
Message exchange is available.
Suspended
Message exchange has been suspended by the internal process.
Vendor suspended
Message exchange has been suspended by the external process.
Closed
Message exchange has been terminated.
Every Message can be conditioned using the Bond state. Messages will check that the bond meets the bond condition specified before processing.
The Bond state can be set using the Set bond state inbound and Set bond state outbound fields on the Message. It can also be set manually in the Source to Stage (outbound) and Stage to Target (inbound) Message Scripts.
Bond health can be determined by the Status field. This is constantly updated as transactions are flowing between the integrated systems.
OK
All transactions have completed.
Awaiting
A request has been made and is awaiting completion.
Error
A transaction has failed.
Ownership is not always used, but is extremely helpful in complex ticket integrations, especially where many systems might be involved (SIAM/Multi-vendor).
The Owner flag is used to track which system is currently in charge of the ticket. This allows complex security and process logic to take place, creating a much more robust integration.
The Owner flag can be set inbound and/or outbound using the Set bond owner inbound and Set bond owner outbound fields on the Message.
Unifi has a unique transactional management approach that is accomplished through the bond: all transactional data is stored against the bond until the bond state changes to Closed, at which point it is deleted. This balance gives integration admins full insight to view and debug integration issues while maintaining table efficiency.
Once the bond is closed, a scheduled job is created in the future using the Integration property Bond cleanup, which specifies how many days to wait before running the cleanup process.
The cleanup process deletes all Transaction and Bonded Attachment records.
Some integrations may need to maintain an open bond for a significant amount of time, or even indefinitely. In this case, a traditional scheduled cleanup job should be created to run periodically to clear out historic records.
Transaction
x_snd_eb_transaction
Bonded Attachment
x_snd_eb_attachment
Deleting the Transaction record will cascade delete all Stage and HTTP Request records.
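For long-lived bonds, the periodic cleanup mentioned above could be sketched as a scheduled script like the one below. The 90-day cut-off and the use of sys_created_on as the age filter are assumptions; adapt both to your retention requirements.

```javascript
// Illustrative scheduled-job sketch: clear historic transactional records
// for integrations whose bonds stay open indefinitely.
var cutoff = gs.daysAgoStart(90); // assumed retention window

// Deleting Transactions cascade deletes their Stage and HTTP Request records.
var tx = new GlideRecord('x_snd_eb_transaction');
tx.addQuery('sys_created_on', '<', cutoff);
tx.query();
tx.deleteMultiple();

// Remove the corresponding Bonded Attachment records as well.
var att = new GlideRecord('x_snd_eb_attachment');
att.addQuery('sys_created_on', '<', cutoff);
att.query();
att.deleteMultiple();
```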
You can view the Unifi logs from the Bond record by navigating to the 'Unifi Activity Logs' related list. Activity Logs are particularly helpful to ShareLogic when debugging Unifi itself and will be populated if logging is enabled when debugging.
The logs that can be viewed from the Bond are for the Bond itself and the Transactions that relate to that Bond.
In addition to the Transaction logs (for details of those click here), you will also see the following:
These logs are defined in the Trigger on the process table being integrated.
A Transaction is an instance of a Message occurrence. It contains and tracks the processing of a Message using Stages and HTTP Requests.
A Transaction is a vehicle used in the flow of Messages on a bonded ticket. It is essentially a container for a Request/Receipt pair. The Transaction record contains the necessary fields to facilitate and track both the movement and the processing of the pair of Messages.
A Transaction’s primary functions are:
To contain the Request/Receipt pair of Messages.
To handle the queueing of Requests and Receipts.
To track the state of the Transaction from two perspectives:
Transaction state: The condition of the Transaction from a movement (transactional) perspective (i.e. the condition of the transport of the data).
Process state: The condition of the Transaction from a process perspective (i.e. its relation to the business logic that’s in place).
Example:
System A sends a Request to system B to update the value of the urgency
field on a ticket without having ‘ownership’ of that ticket. The Request is properly formed and authenticated and is received in system B.
System B sends a response acknowledging the Request and then subsequently processes the Request asynchronously. If, on processing the Request from system A, system B discovers that system A did not have the authority to change the value of the urgency
field, they would send a Receipt Message stating the error. That Receipt is also properly formed and authenticated and is received in system A without any problems.
The Transaction state in this case would be set to ‘Complete’ as both the Request and the Receipt were transported successfully. However, the Process state would be set to ‘Rejected’ as system A did not have the authority to update the value of the urgency
field.
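The outcome of this example can be sketched as a simple two-perspective resolution (plain JavaScript for illustration; the function and field names here are hypothetical, not Unifi's actual API):

```javascript
// Sketch of the two-perspective outcome from the example above.
function resolveStates(transportOk, withinBusinessLogic) {
  return {
    transaction_state: transportOk ? 'Complete' : 'Error',
    process_state: withinBusinessLogic ? 'Accepted' : 'Rejected'
  };
}

// Both Messages were transported successfully, but system A lacked
// the authority to change urgency, so the process side is Rejected.
var result = resolveStates(true, false);
// result = { transaction_state: 'Complete', process_state: 'Rejected' }
```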
One of the many defining features of Unifi is its ability to track the state of the Transaction from both a transport (What’s happening with the movement of the message?) and a process (How does the content of the message align with business logic?) perspective.
The combination of these two states gives a far more precise understanding of the condition of the Transaction and opens up more options. For example, an analyst can see and fix issues caused by a process error without the need for technical support (Unifi won’t break an Integration for a process error). For transactional errors, Unifi can throw events which can be correlated to send notifications or trigger the system’s Incident process as appropriate.
*Timed Out: This is configured on the Integration by the value in the 'Async timeout' field.
**Ignored: This can be used when a Transaction does not complete (i.e. when it is in an ‘Error’ or ‘Timed Out’ state), or at any time when a Transaction is in progress or ‘Queued’ (e.g. when a system error occurs outside our control).
The following table gives a description of the fields that are visible on the Transaction record:
*External/Message ID: These are used in Unifi to match the related Requests and Receipts (i.e. the Message ID of one system will be used as the External message ID of the other system, and vice versa).
**Direction: The direction of the initiating (or first) Message in the Request/Receipt pair determines the direction of the Transaction.
***Process error: The value in this field can be rolled up to the bonded ticket, giving the analyst insight and the opportunity to resolve process errors.
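The External/Message ID pairing amounts to a simple cross-match: an inbound Receipt belongs to the Transaction whose Message ID equals the Receipt's External message ID. A hedged sketch of that lookup (plain objects for illustration, not Unifi's actual implementation):

```javascript
// Match an inbound receipt to its originating outbound transaction.
function findTransactionForReceipt(transactions, receipt) {
  for (var i = 0; i < transactions.length; i++) {
    // Our Message ID is the other system's External message ID.
    if (transactions[i].message_id === receipt.external_message_id) {
      return transactions[i];
    }
  }
  return null;
}

var txns = [{ message_id: 'MSG001' }, { message_id: 'MSG002' }];
var match = findTransactionForReceipt(txns, { external_message_id: 'MSG002' });
// match is the transaction with message_id 'MSG002'
```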
You can view the transaction logs from the Transaction record by navigating to the 'Unifi Activity Logs' related list, or from the Unifi Operations Manager portal. This hugely significant feature offers considerably more detail than is available with the OOTB ServiceNow system logs and puts that information right where you need it - saving an inordinate amount of time hunting through the [sys_log]
table.
Activity Logs are populated when logging is enabled. They are particularly helpful to ShareLogic when debugging Unifi itself.
Clicking the 'View Logs' icon will open a list view of the logs in a new window.
Once the initial Transaction has completed, Unifi checks if there are any other transactions queued and processes them.
Here Unifi is processing the outbound HTTP Request and sending it to the integrated system.
Unifi is picking up the generated Stage data and generating an outbound HTTP Request.
A payload has been received from the external system and Unifi is processing that data and updating the target record within ServiceNow.
An inbound request has been received and generated an inbound HTTP Request.
The logs that can be viewed from the Transaction are for the Transaction itself and the Requests that relate to that Transaction.
Each of the logs relate to a specific part of the process flow, so depending on the direction of the Transaction the logs will occur in a different sequence. From a process perspective, the logs would occur as follows:
Inbound request: RestHelper (relates to the inbound HTTP Request), Transaction receive message (relates to the inbound Transaction).
Corresponding outbound receipt: Transaction sending (relates to the outbound Transaction Receipt), Request sending (relates to the outbound Request).
When Transaction completes: Transaction process next queued.
Outbound: Unifi Business Rule (relates to the logs defined in the Trigger on the process table being integrated), Transaction sending (relates to the outbound Transaction), Request sending
(relates to the outbound Request).
For a synchronous integration that would be it. For an asynchronous integration Unifi would also expect the following:
Corresponding inbound receipt: RestHelper (relates to the inbound Transaction Receipt), Transaction receive message (relates to the Receipt processing).
You can click the Replay button on the Transaction record to replay that transaction.
This will be used when testing/debugging. For example, if there is a transactional error, after you have investigated the cause & made any required changes you will want to replay the Transaction to check if the error clears.
These are the configuration changes to be made in the identity consumer instance when setting up an OAuth Connection.
In native ServiceNow, navigate to System OAuth > Application Registry and click New.
On the interceptor page, click Connect to a third party OAuth Provider.
The fields to be configured for the Application Registry record are as follows:
Name: Name of the OAuth app. Value: <Your Unique Name>
Client ID: The client ID of the OAuth app. Value: the Client ID from the Identity Provider Instance
Client Secret: The client secret of the OAuth app. Value: the Client Secret from the Identity Provider Instance
Default Grant type: The Default Grant Type used to establish the OAuth token. Value: 'Resource Owner Password Credentials'
Refresh Token Lifespan*: The number of seconds a refresh token issued will be good for. Value: 8,640,000 (default value, automatically populated)
Token URL: OAuth token endpoint to retrieve access and refresh tokens. Value: 'https://<your-provider-instance>.service-now.com/oauth_token.do'
Comments: Comments about the OAuth app. Value: <Your description of the purpose of the OAuth entity>
*This value is to be left as-is.
Token URL: Replace the <your-provider-instance>
element of the URL with that of the Identity Provider Instance.
Your Application Registries New Record should look like this:
Right-click and Save to remain on the record.
Validate that the OAuth Entity Profiles related list has been populated with the following values:
Name: <Your Unique Name> default_profile
Is default: true
Grant type: Resource Owner Password Credentials
This is the profile which will be selected when configuring the Connection.
In Unifi Integration Designer, navigate to Connections and click New.
We are going to configure a Connection for the Pre-Production environment because we have already configured connections for both the Development and Test environments. Choose whichever environment is appropriate for your requirements.
The fields to be configured for the New Connection modal are as follows:
Environment: The environment this connection applies to. Value: 'Pre-Production'
Endpoint URL: The external system's access URL. Value: <External system Endpoint URL>
Active: Use this connection for the integration when true. Value: <true>
The format of the Endpoint URL is as follows:
https://<your_provider_instance>.service-now.com/<your_provider_process_api>
The entire Endpoint URL can be easily obtained from the automatically created Message Resource on the Unifi Scripted REST API (displayed in the widget at the top of the Connections page in the other instance).
Your New Connection modal should look like this:
Submit and view to further configure the Connection.
The fields to be configured for the Details form are as follows:
Authentication: The authentication method to use for this connection. Value: 'OAuth 2.0'
OAuth Profile: The OAuth Entity Profile to authenticate with. Value: '<Your Unique Name> default_profile' (as created/validated above)
Inbound user: The user profile used by the external system for authentication. An active connection must be found for the user to gain access. Value: lookup <Your Inbound User>
Your Details form should look like this:
Save the Connection.
Once you have saved the Connection, the 'Get OAuth Token' button is available.
Click Get OAuth Token.
On the modal that pops up, enter the Username & Password (for the Inbound user of the Identity Provider Instance).
Click Get OAuth Token.
The 'OAuth token flow completed successfully' info message is displayed. Close the modal.
Congratulations. You have successfully configured both halves of the OAuth Connection.
Now all that remains is to test each of the Scenarios in turn. See the following pages of the Bidirectional Asynchronous Incident Guide for examples:
The AddAttachment Message will be used to process inbound and outbound attachments.
In Unifi Integration Designer, click on the 'Messages' icon & then New to begin configuring the AddAttachment Message.
The fields to be configured for the AddAttachment New Message modal are as follows:
Message name: The message name that is unique for this integration. Value: 'AddAttachment'
Type: The primary purpose of the message. Value: 'Update'
Direction: The direction(s) this message is configured to support. Value: 'Bidirectional'
Description: Describe the message and the function it is performing. Value: <Your description>
Your AddAttachment New Message modal should look like this:
Submit and view to further configure the Message.
Navigate to Message > Response.
The Response fields to be configured are as follows:
Response: The immediate synchronous response to this message. Value: lookup 'Response'
Your Response form should look like this:
Navigate to Message > Bond.
The Bond fields to be configured are as follows:
Bond ownership condition*: Determines whether the sender must own the bond for this message to be processed. Use 'Ignore' to process regardless of the owner flag. (Choices: Ignore, Must own, Must not own.) Value: 'Ignore'
Bond condition type*: The type of conditional check made on the bond. (None: no checks are made. State: checks against the state are made using the conditional checkboxes. Scripted: the 'Bond condition' script is used.) Value: 'State'
Bond pending: Process this message when the bond state is Pending. Value: <true>
Bond open: Process this message when the bond state is Open. Value: <true>
*These fields are automatically populated.
Your Bond form should look like this:
Navigate to Outbound > Trigger.
The Outbound Trigger fields to be configured (as required)* are as follows:
Outbound condition*: The condition that the ServiceNow record must meet to trigger this message being processed. Value: update the Outbound condition script field with the code below
*Outbound condition (as required):
It is not always necessary for you to enter a condition. The value given is an example. You may create any condition (or not) to align with your business process requirements. In this case, it makes sense to set the Outbound Condition to false because we do not want Unifi to send this message on any update of the source record. Typically, we would configure different messages to align with state changes (e.g. Close, Resolve).
The code in the 'Outbound condition' script field should look like this:
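As a hedged sketch, an Outbound condition that is never met (so the message is never triggered by ordinary record updates) can simply evaluate to false; the `answer` variable name here is an assumed ServiceNow condition-script convention, not confirmed by this guide:

```javascript
// Hypothetical sketch: always evaluate to false so this message is never
// triggered by a normal update of the source record. New attachments are
// sent via the 'Attachment added' option instead.
var answer = false;
```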
Your Outbound Trigger form should look like this:
Navigate to Outbound > Attachments.
The Outbound Attachments fields to be configured are as follows:
Send attachments: Mark this message as being enabled for sending attachments. Value: <true>
Maximum attachments to send*: Set the maximum number of attachments this message can send using the AttachmentSender helper class. Value: '1'
Attachment added: Use this message to immediately send new attachments regardless of the trigger conditions. Value: <true>
*This field is automatically populated.
Your Outbound Attachments form should look like this:
Navigate to Outbound > Settings.
The Outbound Setting fields to be configured are as follows:
Path: A path to append to the URL defined in the connection. Specify a full URL to override the connection. Define inline scripts to reference Stage to Request script variables by wrapping code in braces {}, e.g. /{transaction.message_id}. Value: update the Path field with the code below
Action method: The SOAP Action or the REST Method to use for this message. If this field is empty, the SOAP Action defaults to the message name and the REST Method defaults to POST. Value: 'POST'
The code in the 'Path' field should look like this:
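A hedged sketch of a Path value for an attachment message (the exact value for your integration may differ, and inline script elements in braces could be appended if required):

```
/attachment
```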
The /attachment
element of the Path is required when using the Unifi Scripted REST API to automatically generate the Scripted REST Resource. We shall discuss this further on the Scripted REST Resource page.
Please note the following when using the Unifi Scripted REST API to automatically generate the Scripted REST Resource:
If you have attachment messages which were configured before the auto-generated Scripted REST Resources, you must add the /attachment
element to the Path of those messages. Otherwise, they may generate "Content type not allowed" errors.
Your Outbound Settings form should look like this:
Navigate to Inbound > Settings.
The Inbound Settings fields to be configured are as follows:
Bond reference method: Method of searching for and validating an existing bond for incoming messages. Value: 'Internal'*
*Bond reference method value choices: Internal - lookup using the internal reference only. External - lookup using the external reference only. Both - lookup using both the internal and external references.
Choose the value depending on what is needed.
For attachments we only use the Internal lookup because we are not passing both references. Lookup could be done on External or Both, but the references would need to be added to the inbound request.
Make sure to set either stage.internal_reference
or stage.external_reference
(or both) to match in the Payload to Stage Script (see below).
Your Inbound Settings form should look like this:
Navigate to Advanced > Script Editor.
When you first open Script Editor, it should look like this:
The Script Editor fields to be configured are as follows:
Stage to Request: The script used to convert the Stage record into the outbound Request. Value: update the code in the Stage to Request script field so that it looks like the code below
Payload to Stage: The script containing functions for extracting internal and external references from the request payload. Value: update the code in the Payload to Stage script field so that it looks like the code below
The code in the ‘Stage to Request’ script field should look like this:
file_name: It is necessary to encode the file_name
so that the characters are converted into a format that can be transmitted and understood in the endpoint URL. Not doing so could cause the Transaction to fail (particularly if the file_name
contains special characters).
payload: To stream outbound, you will need to set the payload to be "sys_attachment:<sysid>". Unifi will recognise this and automatically stream the attachment out.
You would configure other message data within the URL or the headers.
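Drawing the notes above together, a minimal sketch of a 'Stage to Request' script might look like the following. The `stage` and `request` objects are stubbed here so the sketch is self-contained; the exact script variables and field names exposed by Unifi are assumptions based on this guide's descriptions, not a definitive implementation:

```javascript
// Hedged sketch only: 'stage' and 'request' are assumed script variables,
// stubbed here with hypothetical field names for illustration.
var stage = { file_name: 'error report #1.pdf', attachment_sys_id: 'abc123' };
var request = { url: '', payload: '' };

// Encode the file name so special characters survive in the endpoint URL.
request.url = '/attachment/' + encodeURIComponent(stage.file_name);

// Setting the payload to "sys_attachment:<sysid>" tells Unifi to stream
// the attachment binary out instead of sending this literal string.
request.payload = 'sys_attachment:' + stage.attachment_sys_id;
```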
The code in the ‘Payload to Stage’ script field should look like this:
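A hedged sketch of a 'Payload to Stage' script that sets the internal reference; the `payload` and `stage` variables are stubbed for illustration, and the inbound payload field names are hypothetical:

```javascript
// Hedged sketch only: 'payload' and 'stage' are assumed script variables,
// stubbed here with hypothetical field names for illustration.
var payload = { internal_reference: 'INC0010001', file_name: 'report.pdf' };
var stage = {};

// Bond reference method is 'Internal', so only the internal reference
// needs to be set for Unifi to find the bonded record.
stage.internal_reference = payload.internal_reference;
stage.file_name = payload.file_name;
```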
We have chosen to set the internal reference on stage as we have set the Bond reference method to 'Internal' (see above). If we had set the Bond reference method to 'External', we would need to set the stage external reference here (or both, if we had set the Bond reference method to 'Both').
Your Script Editor fields should now look like this:
Save the message.
Next, we shall configure the Scripted REST Resource.
These are the configuration changes to be made in the identity provider instance when setting up an OAuth Connection.
These instructions apply to ServiceNow. Other identity providers may vary.
In native ServiceNow, navigate to System OAuth > Application Registry and click New.
On the interceptor page, click Create an OAuth API endpoint for external clients.
The fields to be configured for the Application Registry record are as follows:
Name: Name of the OAuth app. Value: <Your Unique Name>
Client ID*: The client ID of the OAuth app. Value: [read-only] (automatically generated)
Client Secret*: The client secret of the OAuth app. Value: leave [blank] to automatically generate
Refresh Token Lifespan*: The number of seconds a refresh token issued will be good for. Value: 8,640,000 (default value, automatically populated)
Access Token Lifespan*: The number of seconds an access token issued will be good for. Value: 1,800 (default value, automatically populated)
Comments: Comments about the OAuth app. Value: <Your description of the purpose of the OAuth entity>
*These values are to be left as-is.
Your Application Registries New Record should look like this:
Submit the record.
If you re-open the record after submitting it, you will see that the Client Secret has been populated.
If you haven't already done so, you will need to create an Inbound user in this instance. See here for details.
In Unifi Integration Designer, navigate to Connections and click New.
We have chosen to configure a Connection for the Pre-Production environment because we have already configured connections in the Consumer Instance for both the Development and Test environments. Choose whichever environment is appropriate for your requirements.
The fields to be configured for the New Connection modal are as follows:
Environment: The environment this connection applies to. Value: 'Pre-Production'
Endpoint URL: The external system's access URL. Value: <External system Endpoint URL>
Active: Use this connection for the integration when true. Value: <true>
The format of the Endpoint URL is as follows:
https://<your_consumer_instance>.service-now.com/<your_consumer_resource_path>
The entire Endpoint URL can be easily obtained from the automatically created Message Resource on the Unifi Scripted REST API (displayed in the widget at the top of the Connections page) in the other instance.
Your New Connection modal should look like this:
Submit and view to further configure the Connection.
Although we will be providing an OAuth Token for the external instance to consume when connecting to this instance, we will use Basic authentication to connect outbound with the Consumer Instance.
The fields to be configured for the Details form are as follows:
Authentication: The authentication method to use for this connection. Value: 'Basic'
User: The username used in basic authentication. Value: <external.system.user>
Password: The password used in basic authentication. Value: <External system user password>
Inbound user: The user profile used by the external system for authentication. An active connection must be found for the user to gain access. Value: lookup <Your Inbound User>
Your Details form should look like this:
Save the Connection.
At this point you can perform a basic Connection test. For instructions, see here.
Next, configure the Identity Consumer Instance.
Follow these instructions to Package your Integration, ready for migration between ServiceNow instances.
How to package and export an Integration ready to be imported to another instance:
In Unifi Integration Designer, navigate to and open < Your Integration >.
Click the 'Integration' icon to open the Details page.
Click Package.
Confirm by clicking Package.
Notice: the modal tells us how many records were processed and the name of the Update Set that was created (the integration name prepended with the date/time).
Copy the name of the Update Set.
Click Done to close the modal.
Navigate to the downloaded Update Set and Show in folder*.
Congratulations! You now have the Update Set containing the complete Integration available to be imported to your other instance.
You can find more information on how to Load customizations from a single XML file in the ServiceNow Product Documentation.
Every request is handled and tracked by a HTTP Request record.
HTTP Request records are created every time an HTTP Request is generated for sending to or receiving from an external system. They contain all the low-level data about the individual request; the headers, the payload, the URL, timings, etc.
These records are extremely useful for developing and debugging integrations because of the immediate availability and contextual relevance to the integration you are developing. You will often find that it is easier to debug an integration from within Unifi than it is from any external system.
The Request will be inserted in a ‘Ready’ state. Unifi will then asynchronously pick it up and process it. The following table defines the state
field value choices:
*Pending: After a failure the HTTP Request is created straight away, but remains in a ‘Pending’ state until the timer has finished.
**Ready: Inbound HTTP Requests are currently processed synchronously and therefore will only ever be ‘OK’ or ‘Error’.
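The retry behaviour implied above (and by the 'Attempt number' field) can be sketched as: after a failure, a new request waits in 'Pending' until the retry timer elapses, up to the maximum attempts configured on the Integration. A hedged illustration of the decision only (not Unifi's internal code):

```javascript
// Hedged sketch of the outbound retry decision; state names mirror the
// documentation, the function itself is hypothetical.
function nextStateAfterFailure(attemptNumber, maxAttempts) {
  // Retry while attempts remain: the new request waits in 'Pending'
  // until the timer finishes, then becomes 'Ready' for sending.
  return attemptNumber < maxAttempts ? 'Pending' : 'Error';
}

// With a maximum of 3 attempts, the second failure still retries...
var afterSecond = nextStateAfterFailure(2, 3);
// ...but the third failure leaves the request in 'Error'.
var afterThird = nextStateAfterFailure(3, 3);
```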
The following table gives a description of the fields that are visible on the HTTP Request record:
The following table gives a description of the Request fields that are visible on the HTTP Request record:
It is often far quicker and easier to tell what has been sent from the integrated system by looking at the values in the Response fields in Unifi than it is to get the same information from the system that has sent it.
The following table gives a description of the Response fields that are visible on the HTTP Request record:
Unifi logs can be viewed by navigating to the 'Unifi Activity Logs' related list.
The logs that relate to the HTTP Request are as follows:
An inbound request has been received and generated an inbound HTTP Request.
Here Unifi is processing the outbound HTTP Request and sending it to the integrated system.
The logs that can be viewed from the HTTP Request record relate to the HTTP Request only.
You can click the Replay Request button on the HTTP Request record to replay that request.
This will be used when testing/debugging. For example, if the Request errors you may want to edit the payload & then replay the Request to check if the error clears.
The Stage is the root staging table for all data mapping. Stages are created dynamically at the time of data being sent/received. The Staged Data fields will vary dependent on the data being sent.
Stages are a snapshot of the data either at the time of being sent or received. The Stage stores the snapshot of data, which is what facilitates the asynchronous exchange of Messages. Having that data called out into separate fields provides a clean & clear view of the data being sent/received, which in turn can provide a clearer picture of the cause of any potential discrepancies in the mapping of data that is being translated*/transformed**.
*Translated: a movement of data
**Transformed: a movement and a change of data
The data being sent for each Message is mapped using the Field records.
The following table gives a description of the fields that are visible on the Stage record:
The following is an example of a Stage record:
Whenever there is an update to a bonded record, Unifi will take a snapshot of it and capture it in a Snapshot record. The Snapshot is the key to facilitating automated Integration testing.
A Snapshot is a representation of the bonded record and data from the relevant message that created/updated it.
A snapshot is created in one of two scenarios:
A process record is updated and one or more integrations are interested in it (outbound).
A process record is updated by an integration (inbound).
In the Unifi Transport Data Flow process (see the diagram), the Snapshot record sits between the Stage and the source/target record. In order for it to be an accurate representation of the source/target record, the Snapshot is taken on the same side of the relevant Message Script:
A snapshot is taken of the source record prior to running the Source to Stage Message Script (before transforming any data).
A snapshot is taken of the target record after running the Stage to Target Message Script (after transforming any data).
Snapshots can be viewed by navigating to Unifi > Transport > Snapshots.
The top of the Snapshot record looks like this:
The bottom of the Snapshot record looks like this:
The following table gives a description of the fields that are visible on the Snapshot record.
The Package Integration modal is displayed.
Unifi packages the Integration into an Update Set which is automatically downloaded.
The Integration Package Worker modal is displayed.
*Example shown in Chrome. Other browsers may vary.
Then rename the file (using your Update Set name in order to easily identify it when uploading to the other instance).
Pending* (0): The HTTP Request is waiting to be processed asynchronously, but in a retry scenario it is waiting for the synchronous timeout specified on the Integration.
Ready** (1): The HTTP Request is ready to be processed outbound.
OK (2): There are no errors with the HTTP Request.
Error (3): There was an error with the HTTP Request.
Cancelled (4): The HTTP Request was cancelled.
Number (String): The unique HTTP Request identifier.
Integration (Reference): The integration this record belongs to.
Connection (Reference): The connection this request will use.
Transaction (Reference): The transaction this record belongs to.
Message (Reference): The message used to process this record.
Response Action (Reference): The Response Action that handled this request.
Direction (Choice): The direction this record is travelling in.
Request state (Choice): The state of the request.
Attempt number (Integer): The number of HTTP Request attempts. Failed requests are retried up to the maximum attempts number as configured on the Integration.
Source type (String): The Source type that created this request.
Endpoint URL (URL): The external system’s access URL.
Action method (String): The SOAP Action or the REST Method to use for this request.
Request headers (String): A JSON object containing the headers sent with this request.
Request payload (String): The payload of the request being sent or received.
Time (Glide date time): The time this request was processed.
Size (bytes) (Integer): The size in bytes of the request payload.
Mid server (Reference): The MID server used to send this request.
Status code (String): The response status code.
Time to send (Integer): The time taken in milliseconds to receive a response.
Status text (String): The HTTP status text.
Response headers (String): A JSON object containing the headers sent in response to this request.
Response payload (String): The payload of the response being sent or received.
Sending (0): An outbound Message is being sent to the integrated system.
Received (1): An inbound Message has been received from the integrated system (but not yet processed).
Awaiting Receipt (2): An outbound Message has been sent to the integrated system and we are awaiting an asynchronous Receipt.
Sending Receipt (3): We are sending an asynchronous Receipt in response to receiving and processing an inbound Message from the integrated system.
Received Receipt (4): An inbound asynchronous Receipt has been received from the integrated system.
Complete (5): The Message exchange was completed.
Queued (6): The Message exchange has been temporarily delayed because of incomplete transactions.
Timed Out* (7): An outbound Message has been sent to the integrated system, but we have not received the asynchronous Receipt in the expected timeframe.
Error (8): There was a transactional error with the Message exchange (e.g. invalid message format, unknown endpoint, code error, etc.).
Ignored** (9): Transactions can be ignored manually to allow the integration to proceed.
Pending (0): The Message is being processed and is awaiting a decision as to whether it falls within or outside the scope of the business logic.
Accepted (1): The Message has been processed and accepted as within the scope of the business logic.
Rejected (2): The Message has been processed and rejected as outside the scope of the business logic.
Message ID* (String): The unique internal message identifier.
External message ID* (String): The external system’s unique message identifier.
Table (Table name): The target table this record applies to.
Document (Reference): The integrated ServiceNow ticket.
Integration (Reference): The integration this record belongs to.
Bond (Reference): The bond this record belongs to.
Message (Reference): The message used to process this record.
Direction** (Choice): The direction this record is travelling in.
Transaction state (Choice): The state of communication for this transaction.
Process state (Choice): The business logic process state.
Created (Glide date time): The date and time the transaction was created.
Receipt due by (Glide date time): The date a receipt must be received before this transaction times out. Only applies to asynchronous transactions.
Error (String): The internal communication error.
Process error*** (String): The business logic error in processing this transaction.
Number (String): The system generated unique identifier for this record.
Direction (Choice): The direction of the Snapshot. Choices: None, Inbound, Outbound.
Table (Table Name): The source/target table of the bonded record.
Document (Document ID): The source/target bonded record.
Messages (String): Object containing details of the Message that initiated the Transaction and its Integration.
Previous (String): The JSON representation of the bonded record before any update was made.
Current (String): The JSON representation of the bonded record at the time Unifi was invoked. Fields may differ from the previous values here through form updates, business rules, etc.
After (String): The JSON representation of the bonded record after Unifi processing is complete.
Each Poll Request stores the details and outcomes of a scheduled poll.
The Poll Request is an operational record which stores the details and outcomes of a scheduled poll. It represents a single instance of a Poller being run (a Poll Request record is created each time a Poller is run).
Follow this guide to learn how to generate and run automated Integration Tests in Unifi.
With the introduction of Unifi Test Assistant, you can now generate Integration Tests and perform regression testing at scale. Create, manage and execute tests, view and act on results, and save enormous amounts of time during upgrades and release cycles.
Integration Tests are created directly from the Bond record. At the click of a button, Unifi will generate a test which comprises each of the Transaction scenarios on that Bond.
The generated tests are used to check Unifi's processing of the data i.e. to compare whether it is behaving in the same manner and producing the same results when processing the generated test as it did when processing the original records. It checks not only the data itself, but also the Unifi processes that trigger, transport and respond to that data moving through Unifi.
The Unifi Admin [x_snd_eb.admin] role is required to generate Integration Tests.
When running a test, no connection is made to the other system. Instead, Unifi calls a mock web service which responds with results from the original scenario. Unifi then tests what happens with that response. Doing this helps to ensure the accuracy of the test (testing the functionality of the Unifi process in your instance), without relying on input from an external instance (potentially adding further variables to the test).
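The mock-response idea can be sketched as follows. This is a hypothetical illustration of the concept, not Unifi's actual implementation; the function and record names are invented for the example.

```javascript
// Hypothetical sketch: instead of calling the external system, a test run
// looks up the response captured from the original scenario and replays it.
function makeMockEndpoint(recordedScenarios) {
  // recordedScenarios maps a scenario key to the response captured when
  // the original Transaction was processed.
  return function respond(scenarioKey) {
    var recorded = recordedScenarios[scenarioKey];
    if (!recorded) {
      throw new Error('No recorded response for scenario: ' + scenarioKey);
    }
    // The test exercises Unifi's handling of this response, so it is
    // returned verbatim -- no live connection is made.
    return { status: recorded.status, body: recorded.body };
  };
}

// Example usage with a captured "CreateIncident" response:
var mock = makeMockEndpoint({
  CreateIncident: { status: 200, body: { result: 'OK', reference: 'INC0010001' } }
});
var response = mock('CreateIncident');
```

Because the response is the one recorded from the original scenario, the test measures only how your instance processes it, with no external variables.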
In the current release, automated testing only supports REST and JSON payloads (not SOAP or XML). Also, we currently do not support automated testing of attachment messages.
Exploring the results of the Integration Test is intuitive, efficient and informative using Unifi Test Assistant.
Whenever you package your Integration (for details, see the Packager Feature Guide), any Integration Tests you create will also be included along with the other elements of your packaged Integration.
Because tests are generated from real-world data in your instance, in order for your tests to work in other instances, the data that you use has to exist in those instances as well (i.e. the data contained on the bonded record e.g. Caller, Assignment group etc.).
If you change your process (e.g. change the structure of data objects being exchanged), you will need to generate new tests.
The Unifi Test Assistant Process Model shows how Unifi Test Assistant has been built to work with your integration and platform release process.
Unifi Test Assistant is designed to be used with integrations that are already working. It is not a replacement for unit testing.
A) Once UAT has been completed for the integration, Integration Tests can be generated from the resulting Bonds and imported back into Dev.
B) The Integration Tests can be executed as many times as required to perform regression testing for new platform upgrades, patch releases or for any other reason.
C) Integration Tests are packaged with the Integration and can be executed as part of UAT if required.
D) Integration Tests can be generated in Production to allow new or unforeseen scenarios to be captured and tested against in future release cycles.
The dedicated portal interface for running and exploring automated Integration Tests.
Integration Test is the overarching record containing all the elements of the automated Integration Test. It correlates to the Bond from which it was created and comprises each of the Transaction scenarios on that Bond.
Integration Test Scenarios are the elements that make up an Integration Test. Each Scenario will correlate to the relevant Transaction on the Bond from which the test was created. Each contains the relevant Test Scenario Data objects for the particular Scenario.
Test Scenario Data is a JSON representation of all the relevant records created during the processing of a Transaction (e.g. HTTP Request, Transaction, Bond, Snapshot) and is used to both generate the test and ascertain the results of each test run.
Whenever you run an Integration Test, the results are captured in an Integration Test Result record. The record links to and contains a summary of each of the individual Test Scenario Results.
Whenever you run an Integration Test Scenario, the results are captured in an Integration Test Scenario Result record. The results of each Test Scenario are tallied and rolled up to the parent Integration Test Result record.
We will give step-by-step instructions on how to generate, run and explore automated Integration Tests.
Test Scenario Data is a snapshot of all the relevant records created during the processing of a Transaction and is used to both generate the test and ascertain the results of each test run.
Integration Test Scenario data is used to initially create the new test record (the record created during the running of the test) and then to subsequently compare the test results to the expected results.
This is the final step in the generation and setup of Integration Tests. There are various 'Types' of Test Scenario Data which represent the different records used at different points in the processing of data through Unifi. Each 'Type' of data is actually a JSON representation of one of those records; they are discussed below.
This is a JSON representation of the Snapshot record taken when processing the original Transaction. This data is used to create or update the new test record.
This is a JSON representation of the Stage created when processing the original Transaction (both inbound and outbound Stages). It is used to compare to the results of the test run.
This is a JSON representation of the Bond created when processing the original Transaction. It is used to compare to the results of the test run.
This is a JSON representation of the Bonded Attachment created when processing the original Transaction. It is used to compare to the results of the test run.
This is a JSON representation of the Transaction created when it was processed originally. It is used to compare to the results of the test run.
This is a JSON representation of the HTTP Request created when processing the original Transaction (both inbound and outbound Requests). It is used to compare to the results of the test run.
The screenshot below is a Snapshot 'Type' Integration Test Scenario Data record, but is representative of each of the various types.
The following table describes the fields which are visible on the Integration Test Scenario Data record.
Name
String
The Number (unique identifier) of the original record.
Scenario
Reference
The Integration Test Scenario this data object belongs to.
Type
Choice
Choices correlate to the different transport stack records.
Direction
Choice
The direction of the original record. Choices: None, Inbound, Outbound
Data
String
The JSON representation of the original record.
Number
String
The unique Stage identifier.
Internal reference
String
The ServiceNow ticket reference.
External reference
String
The external system’s ticket reference.
Direction
Choice
The direction this record is travelling in.
Integration
Reference
The integration this record belongs to.
Transaction
Reference
The transaction this record belongs to.
Message
Reference
The message used to process this record.
Domain
Reference
The domain this record belongs to.
Staged Data fields
String/Object
The fields that correlate to the data being sent/received. They are uniquely and dynamically configured per Message.
You can now Pause & Resume an Integration. This will cause Transactions to queue and then be processed in the order they were queued.
Pause is a UI Action on the Integration. Clicking it will pause the Integration, causing all subsequently created Transactions to queue in the order they were created.
Pause is different to deactivating the Connection. Deactivating the Connection would stop all processing completely, whereas Pause simply prevents the Messages from being sent until the Integration is resumed.
To pause an Integration, click the Pause UI Action in the header of the Integration record in native ServiceNow.
An Info Message will be displayed, stating the Integration is paused.
Unifi Admins can pause Integrations if the state is 'Active'
Once an Integration has been paused, the Resume UI Action is available on the Integration. Clicking it will restart the queue, processing the queued Transactions in order.
To resume an Integration, click the Resume UI Action in the header of the Integration record in native ServiceNow.
A modal is displayed asking to confirm Resume Integration. Click OK.
The Resume Integration Worker modal is displayed, showing progress.
Unifi Admins can resume Integrations if the state is 'Paused'
This feature is not intended for dealing with errors (use 'Ignore' in that case). The primary use case for Pause and Resume is planned outages - for example, when the other system is undergoing planned maintenance.
Setting Transactions to Ignored stops the queue from processing.
Ignore is a UI Action on the Transaction record which allows administrators to manually ignore Transactions. Clicking it prevents queued Transactions being processed.
You can ignore a Transaction either from the record itself, or the list view:
Click the Ignore UI Action in the header of the Transaction record.
Click the Ignore Transaction UI Action from the list view.
Unifi Managers can ignore Transactions that aren't Complete or already ignored.
You would normally use Ignore for Transactions that are Queued, Timed Out or in an Error state. You would perhaps ignore a Transaction that cannot be processed because it is broken and you know it won't work.
Another example might be that you choose to ignore an update because the other system has processed it but not responded correctly (putting the Transaction in Error); rather than replaying and duplicating the update, it would be better to ignore it. You may even have a number of Transactions that you want to ignore (perhaps where the other system was unavailable).
We've already said that setting Transactions to Ignored stops the queue from processing. Unifi doesn't automatically continue processing subsequent Transactions - they remain Queued. This is because it is more beneficial to have a focused environment when debugging - to not have Transactions automatically firing off (potentially blurring issues).
UI Action is visible to Unifi Managers when the state is Queued.
In the case of an Outbound Transaction with an attachment being ignored, the associated Bonded Attachment record will be set to Rejected. This means that it will be available to be sent with the next Transaction.
Unifi can create an automated Integration Test which will capture and replay the Transaction scenarios from an existing Bond.
An Integration Test is the overarching automated test record and correlates to the Bond. At the click of a button, Unifi will generate an Integration Test directly from the Bond record. The test comprises each of the Integration Test Scenarios (which correlate to the Transactions on that Bond). This means you can create an automated test based on real-world data in your instance which can be re-used to test the functionality of the Unifi process.
Because tests are generated from real-world data in your instance, in order for your tests to work in other instances, the data that you use has to exist in those instances as well (i.e. the data contained on the bonded record e.g. Caller, Assignment group etc.).
Instead of repeating the same manual process each test cycle (i.e. manually running through different scenarios), automated Integration Tests reduce a task which may have taken several hours to a matter of seconds.
If you change your process (e.g. change the structure of data objects being exchanged), you will need to generate new tests.
The Unifi Admin [x_snd_eb.admin] role is required to generate Integration Tests.
For instructions on generating, running and exploring Integration Tests, see our Unifi Test Assistant Feature Guide.
These tests are used to check Unifi's processing of the data i.e. to compare whether it is behaving in the same manner and producing the same results when processing the generated test compared with processing the original records. It checks not only the data itself, but also the Unifi processes that trigger, transport and respond to that data moving through Unifi.
When running a test, no connection is made to the other system. Instead, Unifi calls a mock web service which responds with results from the original scenario. Unifi then tests what happens with that response. Doing this helps to ensure the accuracy of the test (testing the functionality of the Unifi process in your instance), without relying on input from an external instance (potentially adding further variables to the test).
In the current release, automated testing supports only REST and JSON payloads (not SOAP or XML). Automated testing of attachment messages is also not supported.
Whenever you package your Integration (for details, see the Packager Feature Guide), any Integration Tests you create will also be included along with the other elements of your packaged Integration.
The following table describes the fields which are visible on the Integration Test record.
Name*
String
The name of the Integration Test. Automatically populated from the Integration, Unifi version and the Bond used to generate the test.
Integration
Reference
The Integration this test belongs to.
Unifi version
String
The licensed version of Unifi at the time the test was created.
Application
Reference
Application scope of the test.
Created
Glide date time
The time the test was created.
Active
Boolean
Set to true to use this test.
Description
String
Use this to enter a meaningful description for your test e.g. Inbound asynchronous test, or Outbound synchronous test
*Name: Though automatically populated, this value can be edited to suit.
The 'Integration Test Scenarios' and 'Integration Test Results' related lists are also visible on the Integration Test record.
Retry logic is configurable per Integration and controls how Unifi will automatically retry errored HTTP Requests.
The retry functionality is like a first line of defence when it comes to error handling. It builds in time to allow the system to deal with potential issues in the first instance and saves the analyst or administrator having to step in at the first sign of any problems. This can be useful in scenarios where, perhaps the external system is temporarily preoccupied with other tasks and is unable to respond in sufficient time.
Rather than fail or error a Transaction at the first unsuccessful attempt, Unifi will automatically retry and process it again. The number of times it attempts to do so and how long it waits (both for a response and before attempting to retry again) are configurable parameters.
Although the retry logic itself is applied at the HTTP Request level, the settings to configure it can be found on the Integration. This means that they can be configured uniquely and specifically for each Integration. Unifi will automatically retry errored Requests according to those settings.
The fields that can be configured are as follows:
In Unifi Integration Designer, navigate to and open < The Integration you wish to configure >.
Click the ‘Integration’ icon (this will open the Details page).
Navigate to Error Handling > Timeouts.
The Timeout fields that can be configured for the Integration are as follows:
Sync timeout
The amount of time in seconds to wait for a request to be accepted by the external system.
Async timeout
The amount of time in seconds to wait for an asynchronous receipt.
MID server timeout
The amount of time in seconds to wait for the MID server to respond (only applies to connections using MID servers).
Navigate to Error Handling > Retry.
The Retry fields that can be configured for the Integration are as follows:
Retry delay
The amount of time in seconds to wait before trying an outbound request again.
Retry limit
The number of times sending an outbound request is attempted.
Retry is automated in Unifi. Should the number of retries be exhausted, the Transaction will be errored and any subsequent Transactions are queued. This prevents Transactions from being sent out of sync and updates being made to bonded records in the wrong sequence.
It will require a user with the Unifi Manager role to intervene, investigate and correct the error before manually restarting the queue.
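The retry behaviour described above can be sketched as follows. This is an illustrative simplification, not Unifi's actual code: `sendFn` stands for one attempt to deliver an outbound request, and the parameters mirror the Retry delay and Retry limit settings on the Integration.

```javascript
// Illustrative retry-with-limit sketch (hypothetical names throughout).
function sendWithRetry(sendFn, retryLimit, retryDelayMs, waitFn) {
  for (var attempt = 1; attempt <= retryLimit; attempt++) {
    try {
      // A successful attempt completes the Transaction.
      return { state: 'Complete', attempts: attempt, result: sendFn() };
    } catch (e) {
      // Wait before the next attempt; in practice this wait would be
      // scheduled rather than blocking.
      if (attempt < retryLimit) waitFn(retryDelayMs);
    }
  }
  // Retries exhausted: the Transaction is errored and subsequent
  // Transactions remain queued until a Unifi Manager intervenes.
  return { state: 'Error', attempts: retryLimit };
}

// Example: a request that fails twice before being accepted.
var calls = 0;
var outcome = sendWithRetry(function () {
  calls += 1;
  if (calls < 3) throw new Error('timeout');
  return 'accepted';
}, 3, 60000, function noop() {});
```

With a Retry limit of 3, the third attempt succeeds and the Transaction completes; with a limit of 2 the same request would end in Error.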
There are a number of UI Actions available to help and subsequent sections will look at each of those in turn.
In the next section, we'll look at the first of those UI Actions, the Replay feature.
The Repair feature allows you to manually replay all Transactions on an Integration which are in either an Error or Timed Out state.
Repair is a UI Action on the Integration. Clicking it will cause a bulk replay of all its broken Transactions (those in an Error or Timed Out state).
You can repair an Integration either from Unifi Integration Designer, or native ServiceNow:
Click the Repair UI action on the Integration record in the Unifi Integration Designer portal.
Click the Repair UI Action in the header of the Integration record in native ServiceNow.
Unifi Admins can repair Integrations.
As a further aid, there is also a 'View broken transactions' Related Link on the Integration record which will take you to a list of all Transactions in an Error or Timed Out state.
Hints & Tips: It is possible for Unifi to be proactive and raise Incidents about Integrations (see Event Actions). In such a case, the repair functionality could be tied to the resolution of the Incident, whereby Unifi automatically runs 'Repair' on the Integration the Incident was raised against.
When a Transaction is replayed (whether individually, or in bulk), the original record is set to 'Ignored' and a new Transaction (with a decimal suffix) is generated (taking the same Stage data & reprocessing it) and sent.
You can easily replay Requests and Transactions directly from the records themselves. This is an invaluable tool for debugging and error handling.
Unlike retries (which are automated), replays are a user initiated action. You can replay both Requests and Transactions directly from their respective records by simply clicking the UI Action.
There are differences between what happens when you replay a Request and when you replay a Transaction; these, along with the benefits of each, are discussed later.
You can replay a single HTTP Request (whether inbound or outbound). Having the ability to focus in on and replay the specific Request which has errored allows you to identify and correct errors more accurately, quickly and easily.
To replay a Request, click the Replay Request UI Action in the header of the Request record.
A Transaction comprises a Request/Receipt pair of HTTP Requests and represents an instance of a Message being sent/received. You can replay the whole Transaction again (not just a single Request). Replaying a Transaction works differently to replaying a Request and each would be used at a different stage of the debugging process - though both are equally simple to perform.
You can replay a Transaction either from the record itself, or the list view:
Click the Replay UI Action in the header of the Transaction record
Click the Replay Transaction UI Action from the list view.
Unifi Admins can replay a Transaction at any time - except when it is 'Sending';
Unifi Managers can replay a Transaction when it is either Complete, Errored, or Ignored.
Firstly, having the ability to replay errored Requests/Transactions can save a significant amount of time and effort when debugging and error handling. For example, typically after spotting an error, you would have to step into the config & make a change, step out to the bonded record and send data again, step into the logs to check what happened and continue around until rectified. Compare that to being on the errored HTTP Request, making a change to the data in that request, replaying it and getting immediate feedback (seeing the response codes) all from within the same record.
Not only that, Unifi automatically replays as the originating user*. You don't have to impersonate to replay. This saves further time & effort, and lets you debug with confidence - replaying as the original sender (not the user replaying) allows functionality that relies on the identity of the user to run correctly.
*(Originating user: Inbound - the Integration user; Outbound - the user who initiated the Transaction.)
When you replay a Request, you replay that specific instance of that data (as-is) at that time. When you replay a Transaction, Unifi takes the Stage data and reprocesses it before sending - building the payload/HTTP Request again.
You would normally replay a Request during development & testing, when debugging (making changes during investigation).
You would normally replay a Transaction once you have completed your investigation and made configuration changes, as reprocessing the data would take those changes into account when building a new payload/HTTP Request. Operationally, you are perhaps more likely to replay a Transaction in cases where, for example, the other system was unavailable and the attachment was not sent - so you want to reprocess and attempt to resend the attachment.
Each new attempt to replay either a Transaction or a Request will be incremented with a decimal suffix (.1, .2 etc.)*. This means you can easily identify which replay relates to which record and in which order they were replayed. For example, TX00123 will replay as TX00123.1 and then TX00123.2 etc.
*(Retries will also be incremented in the same manner.)
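The decimal-suffix numbering can be illustrated with a small helper. This is a hypothetical sketch for clarity - Unifi generates these numbers internally.

```javascript
// Illustrative helper for the decimal-suffix numbering described above.
function nextReplayNumber(baseNumber, existingReplays) {
  // existingReplays is how many replays have already been made for this
  // record, e.g. 0 before the first replay of TX00123.
  return baseNumber + '.' + (existingReplays + 1);
}

var first = nextReplayNumber('TX00123', 0);  // 'TX00123.1'
var second = nextReplayNumber('TX00123', 1); // 'TX00123.2'
```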
Whenever an Integration Test Scenario is run as part of an Integration Test, the results are captured in an Integration Test Scenario Result record.
Here you will find the results of the individual Integration Test Scenario.
Whenever you run an Integration Test, for each Integration Test Scenario, Unifi creates an Integration Test Scenario Result record to capture the results of the test run. Each Integration Test Scenario Result record will correlate to the Integration Test Scenario that was run.
The contents of each individual Integration Test Scenario Result will differ relative to the Integration Test Scenario that was run. An example can be seen below.
Navigate to Unifi > Testing > Test Scenario Results to open a list view, then click to open the desired result.
Alternatively, select the desired result from the Integration Test Scenario Results related list on the Integration Test Result record.
Select the desired Integration Test Scenario Result from the list displayed on the Integration Test Result (this will open the platform view of the record in a new window).
The example extract of an Integration Test Scenario Result below is for a 'CreateIncident' scenario, but is representative of a Result record for any Test Scenario.
The assertions that have passed are green; the assertions that have warnings are orange (with the discrepancies called out); links to the documents created during the test run are called out and highlighted blue.
The following table describes the fields which are visible on the Integration Test Scenario Result record.
Test result
Reference
The Test Result this Scenario Result record relates to.
Integration test
Reference
The Integration Test this Scenario Result record relates to.
Scenario
Reference
The Test Scenario this Scenario Result record relates to.
Transaction
Reference
The Transaction created during the test run.
Order
Integer
The order in which each Test Scenario Result is processed (as per the order on the Test Scenario).
State
Choice
The State of the test. Choices: Pending, Ready, Running, Complete.
Status
Choice
The Status of the test. Choices: Passed, Failed, Error, Skipped.
Total
Integer
The total number of tests performed.
Passed
Integer
The total number of tests which passed.
Failed
Integer
The total number of tests which failed.
Pending
Integer
The total number of tests which are pending.
Warnings
Integer
The total number of tests with warnings.
How to generate an automated Integration Test.
Before you can run an automated Integration Test you must first create one. Unifi makes that extremely easy for you.
Integration Tests are created directly from the Bond record. At the click of a button, Unifi will generate a test which comprises each of the Transaction scenarios on that Bond.
Navigate to the Bond you wish to create the automated Integration Test from and click 'Create Test'.
On the modal that pops up, you will need to confirm by clicking OK.
The Create Integration Test Worker modal is displayed, showing progress. Close the modal.
The automated Integration Tests have been created and can be viewed by navigating to Unifi > Testing > Integration Tests.
We recommend you add your own meaningful Description to the Test and Update.
Note the following about the Test that was created.
Name: Automatically concatenated value [Integration] - [Unifi Version] - [Bond] (this can be edited to suit).
Integration: The Integration the Test belongs to and will be packaged with.
Unifi version: The licensed version of Unifi installed at the time the Test was created.
More information about each of the records created can be found in the Testing section of our Documentation.
If you're the kind of person that likes to know how things work, we've included this information just for you.
When you click 'Create Test', Unifi will create an Integration Test record for that Bond.
It will then take the first Transaction on the Bond and create an Integration Test Scenario record for that Transaction.
Once that is done, it will look for all the relevant transport stack records (Snapshot, Stage, Bond, HTTP Request, Transaction) that pertain to that specific Transaction, create the relevant Integration Test Scenario Data objects for each record, and add them to that Integration Test Scenario.
It will then loop through each of the subsequent Transactions on the Bond, repeating the process for each (creating an Integration Test Scenario record and adding the relevant Integration Test Scenario Data objects).
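The generation steps above can be outlined in code. This is a hypothetical sketch of the process flow only - the record shapes and helper names are invented for the example and are not Unifi's internal API.

```javascript
// Hypothetical outline of Integration Test generation from a Bond.
function generateIntegrationTest(bond) {
  var test = { name: bond.integration + ' - ' + bond.number, scenarios: [] };
  // Loop through each Transaction on the Bond, creating a Scenario per
  // Transaction.
  bond.transactions.forEach(function (tx) {
    var scenario = { transaction: tx.number, data: [] };
    // Gather the transport-stack records for this Transaction and store
    // each as a Test Scenario Data object (a JSON copy of the record).
    ['snapshot', 'stage', 'httpRequest'].forEach(function (type) {
      if (tx[type]) {
        scenario.data.push({ type: type, json: JSON.stringify(tx[type]) });
      }
    });
    test.scenarios.push(scenario);
  });
  return test;
}

// Example: a Bond with two Transactions yields a test with two Scenarios.
var test = generateIntegrationTest({
  integration: 'Incident',
  number: 'BND0001001',
  transactions: [
    { number: 'TX0001', snapshot: { short_description: 'Test' }, stage: { state: 'Complete' } },
    { number: 'TX0002', snapshot: { state: '2' } }
  ]
});
```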
Integration Test Scenarios are the elements that make up an Integration Test. Each Scenario will correlate to the relevant Transaction on the Bond from which the test was created.
An Integration Test Scenario represents a Transaction on a Bond. Each contains the relevant Test Scenario Data objects for the particular Scenario.
The following table describes the fields which are visible on the Integration Test Scenario record.
Name
String
The name of the Test Scenario. Automatically populated from the Transaction and the Message used to generate the test.
Integration test
Reference
The Integration Test this Test Scenario belongs to.
Type
Choice
Automatically populated. Choices include: Inbound Async, Inbound Sync, Outbound Async, Outbound Sync
Order
Integer
The order in which each Test Scenario is run (defaulted to match the order of Transactions on the originating Bond).
Description
String
Use this to enter a description for your Test Scenario.
The 'Integration Test Scenario Data' related list is also visible on the Integration Test Scenario record.
How to run an automated Integration Test.
Once you have generated the automated Integration Test it is available for you to use and re-use over and over again.
Unifi has a dedicated portal interface for automated Integration Testing. To open it from native ServiceNow navigate to Unifi > Unifi Test Assistant.
Once open, you will initially be greeted with the Dashboard, which provides an overview of the Tests and their Results.
If you navigate to Processes you will be presented with an overview of the Tests and their Results grouped by each Process in the instance.
From there, you can either click the appropriate tile in the main pane, or select the equivalent Process listed underneath 'Processes' in the sidebar to view and run the relevant Tests for that Process.
Tests can be run either in native ServiceNow or Unifi Test Assistant.
To run an Integration Test directly from the record itself in native ServiceNow, simply click the 'Run' button.
To run an Integration Test in Unifi Test Assistant, once you have selected the appropriate Process, navigate to and select the relevant Test (listed underneath its integration)...
...click 'Run'...
...and Confirm.
If you're the kind of person that likes to know how things work, we've included this information just for you.
When you click 'Run', Unifi will create/update (depending on the scenario) a test record from the data in the Snapshot record. E.g. in the case of a create scenario it will create a test record, add the values from the Snapshot and save the record. (In one sense Unifi doesn't know it's being tested - it isn't concerned with how the record was created; it sees that a record has been created and values added - it behaves as it normally would if the record was created manually).
It will then check whether it performed as expected (triggering the Integration, creating the relevant transport stack records etc.) and if so, compare the results of the test with those of the original Transaction (using the relevant Integration Test Scenario Data objects as reference).
Whenever you run an Integration Test, the results are captured in an Integration Test Result record.
When you run an Integration Test, Unifi creates an Integration Test Result record to capture the results of the test run. Each Integration Test Result record will correlate to the Integration Test that was run. The Test Result contains the individual 'Integration Test Scenario Results', each of which is tallied and rolled up to the parent record and grouped according to various categories.
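The tally-and-roll-up idea can be sketched as follows. The counter field names follow the tables in this section, but the function itself is an illustrative assumption, not Unifi's code.

```javascript
// Illustrative roll-up of Scenario Result counts to the parent Test Result.
function rollUp(scenarioResults) {
  var totals = { total: 0, passed: 0, failed: 0, pending: 0, warnings: 0 };
  scenarioResults.forEach(function (r) {
    totals.total += r.total;
    totals.passed += r.passed;
    totals.failed += r.failed;
    totals.pending += r.pending;
    totals.warnings += r.warnings;
  });
  // The overall Status reflects the rolled-up counts (simplified here).
  totals.status = totals.failed > 0 ? 'Failed' : 'Passed';
  return totals;
}

// Example: two Scenario Results rolled up to one Test Result.
var result = rollUp([
  { total: 4, passed: 4, failed: 0, pending: 0, warnings: 1 },
  { total: 3, passed: 2, failed: 1, pending: 0, warnings: 0 }
]);
```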
Whilst Test Results can be viewed from native ServiceNow, it is envisaged that tests will be run and the results subsequently viewed from Unifi Test Assistant.
Navigate to Unifi > Testing > Test Results to open a list view, then click to open the desired result.
When you navigate to and open Unifi > Unifi Test Assistant, you have a number of ways you are able to view the results.
Unifi Test Assistant opens to the Dashboard. From there you can simply click on the desired result from the Recent Test Results tile.
You can also select Test Results & then navigate to the desired Integration Test (which will open displaying the latest result). If you know which Test Result you want to see, you can even type it directly in the Search box.
Or from the relevant Process, you can also navigate to the desired Integration and Integration Test (which will open displaying the latest result).
Depending on where you are viewing the results, the presentation will differ (though the content is obviously the same).
For more information on exploring Test Results, see the Unifi Test Assistant Feature Guide.
The following table describes the fields which are visible on the Integration Test Result record (native ServiceNow).
Number
String
The system generated unique identifier for this record.
Integration test
Reference
The Integration Test this Result record relates to.
Bond
Reference
The Bond created during the running of the test.
Table
Table Name
The target table of the bonded record.
Target
Reference
The test record created during the running of the test.
State
Choice
The State of the test. Choices: Pending, Ready, Running, Complete.
Status
Choice
The Status of the test. Choices: Passed, Failed, Error, Skipped.
Total
Integer
The total number of tests performed.
Passed
Integer
The total number of tests which passed.
Failed
Integer
The total number of tests which failed.
Pending
Integer
The total number of tests which are pending.
Warnings
Integer
The total number of tests with warnings.
These tests have been skipped. Either a test has failed and had to be skipped (e.g. an unsupported Test Scenario was run - such as adding attachments - or an earlier dependent test failed), or there was an error (perhaps during development of the test) and they had to be skipped.
Unifi will log a warning for test results where a discrepancy has been detected. These warnings should be investigated. However, in the majority of cases they should require no further action. For instance, where date/time values exist in the payload, there will always be a discrepancy between the time in the original payload and the time in that of the Test Scenario Result.
Date/time values for the records themselves are ignored because Unifi knows the structure of those records and expects there to be differences for certain defined elements. However, because Unifi has no way of knowing the structure of the payloads (each integration is able to be defined/structured as required), Unifi cannot choose to ignore those elements.
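The distinction above can be illustrated with a comparison sketch: record-level date/time fields with known names can be skipped, while any difference inside an opaque payload surfaces as a warning. The field names and function here are assumptions for illustration only.

```javascript
// Record fields with known, expected-to-differ values (assumed names).
var IGNORED_RECORD_FIELDS = ['sys_created_on', 'sys_updated_on'];

// Illustrative comparison: returns a warning per unexpected difference.
function compareRecords(expected, actual) {
  var warnings = [];
  Object.keys(expected).forEach(function (key) {
    if (IGNORED_RECORD_FIELDS.indexOf(key) !== -1) return; // known dynamic field
    if (expected[key] !== actual[key]) {
      // Payload contents are opaque, so any difference (such as an
      // embedded date/time) is reported for investigation.
      warnings.push(key + ': expected "' + expected[key] + '", got "' + actual[key] + '"');
    }
  });
  return warnings;
}

// Example: only the payload difference is flagged; the record's own
// created date is ignored.
var warnings = compareRecords(
  { state: '2', payload: '{"opened_at":"2023-01-01 09:00:00"}', sys_created_on: '2023-01-01 09:00:01' },
  { state: '2', payload: '{"opened_at":"2024-06-01 10:30:00"}', sys_created_on: '2024-06-01 10:30:01' }
);
```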
The 'Integration Test Scenario Results' related list is visible on the Integration Test Result record.
This Guide utilises the Unifi Integration Designer portal interface which allows you to configure and manage integrations much more intuitively and with greater efficiency.
From within native ServiceNow, open the Unifi Integration Designer portal by navigating to Unifi > Unifi Integration Designer.
You will be greeted with the following Dashboard (which opens in a new window):
Any existing Processes will be listed as tiles, showing the number of Active Integrations for each. The total number of Active Integrations is also shown on the Dashboard.
Hovering over the tile of an existing Process will display 'Show integrations'. Clicking it will take you to the Integrations page (for that Process).
If appropriate, you can either edit the Settings of, or add a New Integration to an existing Process by clicking the ellipsis (at the bottom right of the tile).
In the next section, we shall look at configuring the Process.
How to navigate, view and interpret the results of an automated Integration Test.
In Unifi Test Assistant, once you click 'Run' you will see the Integration Test Result record being populated with each of the Integration Test Scenario Results in real time as they happen.
Once complete, the results for each of the Integration Test Scenarios are tallied and rolled up to the Integration Test Result.
The top of the pane displays a graph and various counts showing the overall Status and numbers of Tests grouped by Status. You can also link out to the Integration Test along with the Bond & Target Record created during the test run.
The Details tab shows the description (as entered after generating the test), along with the date/time and the version of Unifi installed when created. It also links out to the Integration and Process records (opening Integration Designer in a new window).
Target Version: The licensed version of Unifi installed at the time the Test was created. Tracking which version of Unifi was used to create the Test may be useful for compatibility testing after upgrading.
The Scenarios tab shows each of the Integration Test Scenario Results. Clicking the Scenario link will open the Result for that Integration Test Scenario. Clicking the Transaction link will open the Transaction record created during the test run.
The Warnings tab shows all the warnings grouped by each Integration Test Scenario Result. From here you can step into each of the relevant Results and the relevant 'Transport Stack' records (e.g. Stage, Request, Bond, Snapshot).
Unifi will log a warning for test results where a discrepancy has been detected. These warnings should be investigated; however, in the majority of cases they will require no further action. For instance, where date/time values exist in the payload, there will always be a discrepancy between the time in the original payload and the time in the Test Scenario Result's payload.
The History tab shows a list of the results of each test run. Each time a Test is run, the Result will be added to the top of this list. Clicking the value in the Number column will display the relevant Integration Test Result above.
You can step into each Integration Test Scenario Result from either the Scenarios or Warnings tabs. Clicking the link will open it in a new window. An example is shown below.
The assertions that have passed are green; the assertions that have warnings are orange (with the discrepancies called out); links to the documents created during the test run are called out and highlighted blue.
The Test Results are then rolled up to the Process...
...and the Dashboard.
This is a summary of all the tests that have been run on the instance. Each graph segment is a clickable link to a list of the Test Result records matching the filter criteria (i.e. Passed without warning, Passed with warning and Pending).
This shows the number of Tests relative to the number of Integrations on the instance (Unifi expects at least one Test to exist for each Integration; coverage is calculated as the percentage of Integrations containing at least one Test). The chart is a graphical representation of the test coverage percentage, and its segments are clickable links to a list of the relevant Integrations. Note: in the example above, 25% coverage means that only four of the sixteen Integrations have at least one Integration Test associated.
Example Test Coverage:
One Integration containing one Test = 100% coverage
One Integration containing two Tests = 100% coverage
One Integration containing two tests plus one containing none = 50% coverage
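The coverage rule above can be sketched as a small function. This is an illustrative reconstruction of the calculation as described, not Unifi's actual code; the input is simply the number of Tests per Integration:

```javascript
// Coverage = (Integrations with at least one Test) / (total Integrations) * 100.
// Multiple Tests on one Integration do not increase coverage.
function testCoverage(testCountsPerIntegration) {
  var covered = testCountsPerIntegration.filter(function (n) {
    return n > 0;
  }).length;
  return (covered / testCountsPerIntegration.length) * 100;
}

console.log(testCoverage([1]));          // one Integration, one Test -> 100
console.log(testCoverage([2]));          // one Integration, two Tests -> 100
console.log(testCoverage([2, 0]));       // one with two Tests, one with none -> 50
console.log(testCoverage([1, 0, 0, 0])); // four Integrations, one with a Test -> 25
```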
This displays a list of the most recent Test results. Clicking the Number will open the Test Result in the Unifi Test Assistant window; clicking the Integration Test will open the Integration Test in the platform in a new browser window.
This displays the Integrations on the instance grouped by Company. It displays a range of messages about the status of those Integrations in terms of Tests (e.g. '12 integrations without a test', or 'No integration tests found' etc.). Clicking the Company will open a new window containing a list of the Tests for that Company. This will be of particular value if you are a Managed Service Provider (MSP).
The first element to configure is the Process, which is the top level configuration element where all Integrations are contained.
The first thing to do when creating a new integration is to assign it to its Process. For instance, a new Incident integration would be created within an Incident Process. If you do not yet have a Process defined, then you will need to create a new Process
From the Unifi Integration Designer Dashboard, click on New Process.
On the 'New Process' modal, the fields to be configured are as follows:
*API Name
The API Name is how we identify which Process we are integrating with. The Scripted SOAP/REST Service will reference the API Name (which is why it is important for this to be a unique reference).
Your 'New Process' modal should look like this:
Click Create.
You will be redirected to your Process Dashboard:
Click either the '+' tile or 'New Integration' in preparation to configure the Integration.
When you create a Process, Unifi will automatically create the corresponding Web Service (REST methods).
This is given for your information only, as in this Guide we are only concerned with sending outbound messages to the table API of another ServiceNow instance (i.e. your Personal Developer Instance, ‘PDI’). However, it will be of value should you go on to configure polling integrations following our Poller Guides (which will utilize this same Process).
Follow this guide to configure a simple outbound integration to the table API of your Personal Developer Instance. It is given as an aid for those new to Unifi, or to play as part of a trial.
Congratulations on your decision to use Unifi, the only integration platform you need for ServiceNow. We are sure you will be more than satisfied with this extremely powerful, versatile and technically capable solution.
We have created this Outbound Incident Guide as an aid to customers who are beginning their journey in deploying the Unifi integration platform. We would not want you to be overwhelmed by exploring all that Unifi has to offer here, so we have deliberately limited the scope of this document. It will guide you through an example of how to configure a basic Incident integration, sending outbound messages via the REST API to the table API of another ServiceNow instance (i.e. your Personal Developer Instance, ‘PDI’).
We do not recommend synchronous integrations for enterprise ticket exchange. This Guide is purely here for you to have a play as part of a trial. It is designed to connect to a PDI without Unifi being installed on the other side.
For more technical information on how to use Unifi, please see our .
Do not build integrations directly within the Unifi application scope. This can create issues with upgrades and application management.
The prerequisite to configuring Unifi is to have it installed on your instance. As with any other ServiceNow scoped application, Unifi must be purchased via the ServiceNow Store before installation.
We recommend you follow the Setup instructions prior to configuring your Integration.
The Connection allows messages to be sent and received and stores all the authentication details of the Integration specific to a single environment.
Before configuring the Connection, you need to ensure you have a user in the instance to use as the Inbound user for the Integration. To configure your Inbound user:
In the native ServiceNow window, navigate to User Administration > Users. Click New.
The fields to be configured for the User record are as follows:
The x_snd_eb.integration role gives access to the Unifi web services. You may need to assign additional roles depending on the process functionality used, e.g. checks such as gs.hasRole('itil') in business rules or scripts.
Although you can set up many connections to enable switching between environments (one connection per environment), it is worth noting that only one connection can be active for an Integration at a time.
We will, however, set up only one connection in the 'Development' environment.
Back in the Unifi Integration Designer window, after clicking the 'Connections' icon, the first thing you will notice is a widget at the top of the page which clearly displays the inbound endpoints (REST Resources) which were automatically created when the Process was configured.
This is given for your information only, as in this Guide we are only concerned with sending outbound messages to the table API of another ServiceNow instance (i.e. your Personal Developer Instance, ‘PDI’). However, it will be of value should you go on to configure polling integrations following our Poller Guides (which will utilize this same Connection).
Click New.
The fields to be configured for the New Connection modal are as follows:
The format of the Endpoint URL for the ServiceNow Table API is as follows:
https://<your_developer_instance>.service-now.com/api/now/table/<table_name>
If you are going to configure this Outbound Integration only, then use the full URL e.g. https://<your_developer_instance>.service-now.com/api/now/table/incident
If you are going to configure the Pollers also, then truncate the URL as follows https://<your_developer_instance>.service-now.com/api/now
In which case, the /table/incident part of the URL will need to be added to the 'Path' of the outbound messages (and to the Endpoint URLs of the relevant Pollers).
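The two options resolve to the same endpoint; the only difference is where the /table/incident element lives. A small sketch (using a hypothetical instance name, 'dev12345') shows how the Connection's Endpoint URL and the Message's Path combine:

```javascript
// Hypothetical PDI name -- substitute your own instance.
var instance = "dev12345";

// Option 1: full Endpoint URL on the Connection (outbound-only setup),
// with no Path needed on the Message.
var fullUrl = "https://" + instance + ".service-now.com/api/now/table/incident";

// Option 2: truncated Endpoint URL on the Connection (if you will also
// configure Pollers), with '/table/incident' set as the Message Path.
var baseUrl = "https://" + instance + ".service-now.com/api/now";
var path = "/table/incident";

console.log(fullUrl === baseUrl + path); // -> true: both reach the same endpoint
```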
Your New Connection modal should look like this:
Click 'Submit and view'.
Clicking 'Submit' will redirect you to the list view of the record you're creating. Clicking 'Submit and view' will redirect you to the newly created record.
The fields to be configured for the Details form are as follows:
*(External) User/Password: As created/set in your PDI.
**Inbound user: As created above.
Your Details form should look like this:
Save the Connection.
At this stage you can carry out a basic connection test, which verifies whether the user is authorized (i.e. whether you've configured the user/password/roles correctly). To do this, click Connection Test.
Then, on the Connection Test modal, click Test.
The results (Pass/Fail) will be displayed.
If you attempt the test against the truncated URL you will receive this error.
Click Done.
The main, manually configured elements are now in place for our Integration to work. We are now ready to configure and test each of our Scenarios in turn.
There is no need to manually configure a Trigger (Business Rule) on the Process table being integrated as Unifi will automatically create one for us (if one doesn't already exist).
This is what defines the connection between a Process and the single system it's connecting with. It is also where most of the configuration and settings are stored.
In the Unifi Integration Designer window, after clicking either on the '+' tile or 'New Integration', you are given a 'New Integration' modal to complete.
The fields to be configured for the New Integration modal are as follows:
*Service type/Message format: these values are defaulted.
Your 'New Integration' modal should look like this:
We have chosen to name this integration 'Push-Pull Incident' as it will also later form the basis for the Poller Guides (where we pull data from your Personal Developer Instance, ‘PDI’).
Click Create.
You will be redirected to the Details page of the newly created Integration.
Before continuing we would like to draw your attention to some of the relevant icons that are now visible down the left hand navigation strip.
The icons are:
a) 'Integration' icon: Opens the current integration's Details page.
b) 'Messages' icon: Opens the current integration's Messages page.
c) 'Fields' icon: Opens the current integration's Fields page.
d) 'Field Maps' icon: Opens the current integration's Field Maps page.
e) 'Documentation' icon: Opens the automatically generated documentation for the current integration. (Another awesome feature in Unifi.)
f) 'Connections' icon: Opens the current integration's Connections page.
The Details page of your Integration form should look like this:
Navigate to Settings > Feedback.
The Feedback fields to be configured for the Integration record are as follows:
The Feedback Settings fields should look like this:
The remaining 'Integration' values are to be left as-is:
Message Identification
All of the remaining 'Settings' values are to be left as-is:
Attachments Settings
Bond Settings
All of the 'Error handling' values are to be left as-is:
General
Timeouts
Retry
Click Save.
Click the 'Connections' icon to move on and configure the Connection.
User intervention is required to manually restart the queue and process the subsequent Transactions. This is done via the 'Process Now' UI Action.
In the case of a major outage (where perhaps the other system is down, or the authentication user credentials have been updated), you might have a number of failed Transactions. Rather than stepping into each Transaction and replaying them individually, you can simply replay all the broken Transactions on the Integration. This could represent a significant time-saving.
This will allow the same connection to be used for both the Table API and the Attachment API (as per the ).
Name
The name of the ServiceNow process being integrated.
<SN Process Name> (e.g. Incident)
API Name*
The unique name of this process for use with the API.
<your_unique_api>
Target table
The primary target or process table that this integration uses.
'Incident' [incident]
Reference field
The field on the target table that is used as the reference for the external system.
'Number'
Description
Describe what this Process is for.
<Your description>
Environment
The environment this connection applies to.
'Development'
Endpoint URL
The external system's access URL.
<External system Endpoint URL>
Active
Use this connection for the integration when true.
<true>
Authentication
The authentication method to use for this connection.
'Basic'
User*
The username used in basic authentication.
<external.system.user>
Password*
The password used in basic authentication.
<External system user password>
Inbound user**
The user profile used by the external system for authentication. An active connection must be found for the user to gain access.
lookup: <Your Inbound user>
Enable UI messages
Allow information and error messages to be shown to the user as UI Notifications. Only applies to certain notifications.
<true>
Note bond history
Use the 'Note bond history' to process bond history updates. (Set to true for the history to be promoted to the work notes fields of the record we're integrating - for the analyst to view)
<true>
User ID
The id of the user (to be used by the external system for authentication).
<your.integration_user>
First name
The integration user's first name.
<Your First Name>
Last name
The integration user's last name.
<Your Last Name>
Password
The user's password (to be used in basic authentication).
<Your Password>
Roles
The role required for access to the integrated records.
x_snd_eb.integration
Name
The name of the integration.
<Your Name>
Service type*
The type of web service this integration is using (Choices: SOAP/REST).
'REST'
Message format*
Automatically pre-process incoming messages for simpler message scripting. (Choices: XML, JSON, Advanced)
'JSON'
We will utilise the Field & Field Map records to configure the Message Scripts for the CreateIncidentResponse Message.
It is worth copying all the OOTB Field Maps your integration requires before using any of them in your Field records, thereby mitigating the risk of issues with future upgrades.
The Field Map we shall use for our CreateIncidentResponse Field record is:
Source Reference
To copy the Source Reference Field Map, navigate to the 'Field Maps' icon.
Click on the ellipsis to the right of the Source Reference Field Map & click Copy.
The fields to edit for the Copy Field Map modal are as follows:
Name*
The name of your field map. (If left unedited, it will append the word 'Copy' to the existing name.)
<Your Name>
*Name: We have chosen to prefix the existing Field Map Name with the initials of our Integration (you are free to choose any appropriate means of identifying/differentiating your copy).
Your Copy Field Map modal should look like this:
Integration should be automatically populated.
Click Copy.
You will be redirected to the Details page of the newly created Field Map.
In Unifi Integration Designer, from the CreateIncidentResponse page, navigate to Message > Fields. Click New.
The fields to be configured for the sys_id New Field modal are as follows:
Message*
The Message this Field record is linked with.
'CreateIncidentResponse'
Description
Describe what this field is for and any specific details that might help you in future.
'Extract returned sys_id & store in stage.external_reference'
Active*
Set to true to use this Field record for processing.
<true>
Field map
The Field Map this Field record is linked with.
'PI - Source Reference'**
Map to field*
Use this Field record to represent a field on a source/target table.
<false>
Path
Where in the payload the data will be placed.
'result'
Property
The property in the payload the data will be written to.
'sys_id'
Inbound*
Set to true to use for inbound Messages.
<true>
Outbound
Set to true to use for outbound Messages.
<false>
*These fields are automatically defaulted to true, or automatically populated.
**Field map: Value may vary. Choose the copy Field Map you created for your Integration.
Property: We are setting 'sys_id' as the property because that is what is required by the table API. If it were possible, it would be better to use something more meaningful, like the Number of the ticket integrated with, as this aids in debugging.
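The Path/Property pairing maps directly onto the shape of the ServiceNow Table API response, which nests the created record under a 'result' element. A sketch (with an illustrative sys_id value) of the response body and the value these settings extract:

```javascript
// Illustrative Table API create response -- the record is nested under 'result',
// which is why Path is 'result' and Property is 'sys_id' in the Field record.
var responseBody = {
  result: {
    sys_id: "46e18c0fa9fe19810066a0083f76bd56", // hypothetical value
    number: "INC0010001"
  }
};

// Path ('result') + Property ('sys_id') together locate the value to be
// stored in stage.external_reference:
var externalReference = responseBody.result.sys_id;
console.log(externalReference); // -> 46e18c0fa9fe19810066a0083f76bd56
```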
The 'result.sys_id' New Field modal should look like this:
Submit the record.
You will be redirected back to the Fields page of the CreateIncidentResponse Message.
Now that we’ve configured the Field records for the CreateIncidentResponse message, we are ready to build our message scripts.
The following Field record should now be in place for your CreateIncidentResponse message:
Feature Alert: In the picture above you will notice that a 'Build Integration' button has appeared in the banner at the top of the page. Whenever a change is made to a Field record that is associated with a Message (whether it is created, updated, or deleted), the button will be available and acts as a visual reminder that changes have been made and Message Script(s) need to be built. We will talk more about this feature in the Build Integration Level page.
Navigate to Advanced > Script Editor.
When you first open the Script Editor, you will see the following:
Having visibility of your message scripts in the one pane makes scripting so much more efficient.
Click on Build Message.
You will see the 'Message build successful' Info Message.
Your Script Editor fields should now look like this:
You can click View to adjust the layout and change the view to show various combinations of, or individual script fields.
The newly auto-generated code will appear between a Begin & End Comment immediately prior to any code that may already be there (pre-existing code will be retained).
We will now examine our new, auto-generated Message Script.
Payload to Stage:
Both the stage & bond external reference are being set to the sys_id only because we are integrating with the table API and that's what it requires. If possible, it is better to use something more meaningful, like the Number of the ticket integrated with, as this aids in debugging.
Once you have finished examining the code, click 'Close' to navigate back to the Fields page of the CreateIncidentResponse Message.
Next, we will configure the CreateIncident Message.
For each of our Scenarios we will need to configure the relevant Messages & Fields. This scenario will need to be tested before moving on to the next.
The Messages we shall be configuring for the Update Scenario are:
Response
UpdateIncident
We will define which Field records require configuring for each of those Messages at the appropriate time.
The scenario will need to be successfully tested before we can say it is complete.
We shall look in detail at each of the Messages and their respective Fields in turn over the next few pages, before moving on to Test.
The Trigger is a Business Rule which stipulates the conditions under which Messages will be sent for the process concerned.
There is no need for you to manually create a Trigger (Business Rule); if more than one exists, duplicate updates will be sent.
Unifi will automatically create a Trigger (Business Rule) for the Process being integrated (if one doesn't already exist) when you run 'Build' either on the Integration or Message once your Create Message is configured.
In native ServiceNow, navigate to System Definition > Business Rules. Find and navigate to the automatically generated Business Rule.
The format of the name will be '[S] Unifi ' + <Table Name> + ' trigger rule'.
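As a quick sketch of that naming format, for the Incident table used in this Guide the generated rule would be named as follows (illustrative only; the rule itself is created by Unifi):

```javascript
// The generated Business Rule name format, expressed as the concatenation
// described above. 'incident' is the table integrated in this Guide.
var tableName = "incident";
var ruleName = "[S] Unifi " + tableName + " trigger rule";

console.log(ruleName); // -> [S] Unifi incident trigger rule
```

Searching for this name in the Business Rules list is the quickest way to locate the trigger.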
The top section of your Business Rule record should look like this:
Your 'When to run' tab should look like this:
The code in the script field should look like this:
Your 'Advanced' tab should look like this:
We have confirmed the main elements are in place for our Integration to work. We are now ready to Test our CreateIncident Message.
For each of our Scenarios we will need to configure the relevant Messages & Fields. This scenario will need to be tested before moving on to the next.
The Messages we shall be configuring for the Create Scenario are:
CreateIncidentResponse
CreateIncident
We will define which Field records require configuring for each of those Messages at the appropriate time.
The scenario will need to be successfully tested before we can say it is complete.
We shall look in detail at each of the Messages and their respective Fields in turn over the next few pages, before moving on to Test.
The CreateIncident Message will create a ticket on the target table of the integrated system.
After clicking the 'Messages' icon, you will see the following screen (note: the previously configured message is visible in the list):
Click New.
The fields to be configured for the CreateIncident New Message modal are as follows:
Message name
The message name that is unique for this integration.
'CreateIncident'
Type
The primary purpose of the message.
'Create'
Direction
The direction(s) this message is configured to support.
'Outbound'
Your CreateIncident New Message modal should look like this:
Submit and view to further configure the Message.
Navigate to Message > Response.
The Response fields to be configured are as follows:
Response
The immediate synchronous response to this message.
lookup: 'CreateIncidentResponse'
Async*
Turn this option on if you want inbound processing to occur asynchronously, or if this message is the first of an asynchronous message pair.
<false>
*This field is automatically defaulted to true.
Your Response form should look like this:
Navigate to Message > Bond.
The Bond fields to be configured are as follows:
Bond ownership*
Determines whether the sender must own the bond for this message to be processed. Use 'Ignore' to process regardless of the owner flag. (Choices: Ignore, Must own, Must not own.)
'Ignore'
Bond condition type*
The type of conditional check made on the bond. (None: no checks are made. State: checks against the state are made using the conditional checkboxes. Scripted: the 'Bond condition' script is used.)
'State'
Bond new
Process this message when a new bond is required.
<true>
*These fields are automatically populated.
Your Bond form should look like this:
Navigate to Outbound > Trigger.
The Outbound Trigger fields to be configured (as required)* are as follows:
Outbound condition*
The condition that the ServiceNow record must meet to trigger this message being processed.
<Your condition> e.g. 'Short description contains Push-Pull Integration'
*Outbound condition (as required):
It is not necessary for you to enter a condition. The value given is an example. You may create any condition (or not) to align with your business process requirements.
Your Outbound Trigger form should look like this:
Navigate to Outbound > Settings.
The Outbound Settings fields to be configured are as follows:
Path*
A path to append to the URL defined in the connection. Specify a full URL to override the connection. Define inline scripts to reference Stage to Request script variables by wrapping code in braces {}, e.g. /{transaction.message_id}.
'/table/incident'
Action method
The SOAP Action or the REST Method to use for this message. If this field is empty the SOAP Action will default to the message name and the REST Method will default to POST.
'POST'
*Path
Only add this value if you have used the truncated Endpoint URL in the Connection. If you have used the full Endpoint URL, this step can be skipped.
Your Outbound Settings form should look like this:
Click Save.
We are now ready to configure the Fields for our CreateIncident Message.
The UpdateIncident Message is an update type message that sends updates to the bonded record.
After submitting the 'Response' Message, you were redirected to the Messages page (note: the three previously configured messages are now visible in the list):
Click New.
The fields to be configured for the UpdateIncident New Message modal are as follows:
Message name
The message name that is unique for this integration.
'UpdateIncident'
Type
The primary purpose of the message.
'Update'
Direction
The direction(s) this message is configured to support.
'Outbound'
Your UpdateIncident New Message modal should look like this:
Submit and view to further configure the Message.
Navigate to Message > Response.
The Response fields to be configured are as follows:
Response
The immediate synchronous response to this message.
lookup: 'Response'
Async*
Turn this option on if you want inbound processing to occur asynchronously, or if this message is the first of an asynchronous message pair.
<false>
*This field is automatically defaulted to true.
Your Response form should look like this:
Navigate to Message > Bond.
The Bond fields to be configured are as follows:
Bond ownership*
Determines whether the sender must own the bond for this message to be processed. Use 'Ignore' to process regardless of the owner flag. (Choices: Ignore, Must own, Must not own.)
'Ignore'
Bond condition type*
The type of conditional check made on the bond. (None: no checks are made. State: checks against the state are made using the conditional checkboxes. Scripted: the 'Bond condition' script is used.)
'State'
Bond open
Process this message when the bond state is Open.
<true>
*These fields are automatically populated.
Your Bond form should look like this:
Navigate to Outbound > Trigger.
The Outbound Trigger fields to be configured (as required)* are as follows:
Outbound condition*
The condition that the ServiceNow record must meet to trigger this message being processed.
<Your condition> e.g. 'Work notes changes' OR 'Additional comments changes'
*Outbound condition (as required):
It is not necessary for you to enter a condition. The value given is an example. You may create any condition (or not) to align with your business process requirements.
Your Outbound Trigger form should look like this:
Navigate to Outbound > Settings.
The Outbound Settings fields to be configured are as follows:
Path*
A path to append to the URL defined in the connection. Specify a full URL to override the connection. Define inline scripts to reference Stage to Request script variables by wrapping code in braces {}, e.g. /{transaction.message_id}.
'/table/incident/{bond.getValue("external_reference")}'
Action method
The SOAP Action or the REST Method to use for this message. If this field is empty the SOAP Action will default to the message name and the REST Method will default to POST.
'PUT'
*Path
Only include the /table/incident
element of this value if you have used the truncated Endpoint URL in the Connection. If you have used the full Endpoint URL, do not include that element of it here.
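The braces in the Path above contain script that Unifi evaluates against Stage to Request script variables. A simplified sketch of that substitution (a hypothetical `resolvePath` helper, not part of the Unifi API, which looks expressions up in a plain object rather than evaluating script):

```javascript
// Hypothetical helper illustrating how {…} placeholders in a Path resolve.
// In Unifi the braces hold script evaluated against Stage to Request variables;
// here we simply look the expression up in a context object.
function resolvePath(template, context) {
  return template.replace(/\{([^}]+)\}/g, function (match, expr) {
    return context[expr];
  });
}

// Illustrative bond external reference (the sys_id stored when the bond was created):
var resolved = resolvePath("/table/incident/{external_reference}", {
  external_reference: "46e18c0fa9fe19810066a0083f76bd56"
});

console.log(resolved); // -> /table/incident/46e18c0fa9fe19810066a0083f76bd56
```

The resulting path targets the specific bonded record on the other instance, which is why the UpdateIncident Message uses the PUT method.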
Your Outbound Settings form should look like this:
Click Save.
We are now ready to configure the Fields for our UpdateIncident Message.
The CreateIncidentResponse Message is the immediate, synchronous response that is sent after processing the CreateIncident Message.
In Unifi Integration Designer, navigate to the 'Messages' icon. Click New.
The fields to be configured for the CreateIncidentResponse New Message modal are as follows:
Message name
The message name that is unique for this integration.
'CreateIncidentResponse'
Type
The primary purpose of the message.
'Response'
Direction
The direction(s) this message is configured to support. (Choices: Inbound, Outbound, Bidirectional)
'Inbound'
Your CreateIncidentResponse New Message modal should look like this:
Click Submit and view to further configure the Message.
Navigate to Message > Bond.
The Bond fields to be configured are as follows:
Set bond state inbound*
Set the Bond State when receiving this message. Use 'None' to leave the Bond State alone or to modify it via a Message/Field Stage to Target script.
'Open'
*Set bond state choices: None, Pending, Open, Suspended, Vendor suspended, Closed
Your Bond form should look like this:
Setting the Bond state to Open allows messages to be sent/received. See the 'Bonds' page for details.
Click Save.
We are now ready to configure the Fields for our CreateIncidentResponse Message.
We will utilise the Field & Field Map records to configure the Message Scripts for the CreateIncident Message.
Depending on your requirements, you will need to create Field records for each of the relevant Incident record field elements (see the table below for an example). For the sake of brevity, this Guide will focus on a select few. If you wish, however, you are free to continue & configure the remaining Field records. The table below lists an example of the Incident record field elements you may wish to map and the relevant Field Maps required to configure each Field record. For a fuller definition of available Field Maps, please see the relevant page in our .
*caller_id: we have chosen String type here because we are integrating with the table API. This will return the sys_id of the caller as a string value. Note: If we were integrating Unifi to Unifi we may use a Reference type, which would return the caller as an object with "value", "link" & "display_value" elements.
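To make the String vs Reference distinction concrete, here is a sketch of the two shapes the caller might take in a payload. Both values are illustrative (a hypothetical user sys_id and instance name), not taken from a real response:

```javascript
// String type (what this Guide uses): the table API returns the caller's
// sys_id as a plain string.
var callerAsString = "5137153cc611227c000bbd1bd8cd2005";

// Reference type (e.g. Unifi to Unifi): the caller arrives as an object
// with "value", "link" & "display_value" elements. Values are hypothetical.
var callerAsReference = {
  value: "5137153cc611227c000bbd1bd8cd2005",
  link: "https://dev12345.service-now.com/api/now/table/sys_user/5137153cc611227c000bbd1bd8cd2005",
  display_value: "Fred Luddy"
};

console.log(callerAsReference.value === callerAsString); // -> true
```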
The Field records we will focus on will be for Caller (String), Short description (String) and State (Choice).
If you haven't already, you will need to copy the relevant additional Field Maps for the CreateIncident Field records as follows:
String
Choice
In Unifi Integration Designer, from the CreateIncident page, navigate to Message > Fields. Click New.
The fields to be configured for the 'incident.caller_id' New Field modal are as follows:
*These fields are automatically defaulted to true, or automatically populated.
**Field map: Value may vary. Choose the copy Field Map you created for your Integration.
Your 'incident.caller_id' New Field modal should look like this:
Submit the record.
You will be redirected back to the Fields page of the CreateIncident Message.
Because the incident.short_description Field record is the same 'type' (PI - String) & the majority of the settings are the same as the previously configured Field, it will be quicker to copy the incident.caller_id Field & make a few minor changes.
Feature Alert: The 'result.sys_id' Field is visible and inactive. It is an integration level Field which was automatically created by Unifi at the time we created the Message level record (in CreateIncidentResponse Fields). We'll talk in more detail about this feature in the following section.
Click the ellipsis next to the incident.caller_id Field record & click Copy.
The fields to edit for the Copy Field modal are as follows:
*This field is automatically populated.
Your Copy Field modal should look like this:
Click Copy.
You will be redirected to the Details page of the newly created incident.short_description Field record.
The following info message is also displayed (which can be closed):
Field records can exist at both the Integration and the Message level (a Field record exists at the Integration level if it isn't linked to a Message i.e. the 'Message' field element is left blank). We noted previously that an Integration level Field record is automatically created when we create one at the Message level. We will utilise and configure both when mapping 'incident.state'.
The 'incident.state' Field record is a Choice 'type' Field. These are used when you’re mapping choice field elements with static values that don't change per Message (e.g. State, Impact, Urgency) i.e. you're not going to have one set of choices/values for create and another for update.
Rather than configuring choices for each Message, we configure them once at the Integration level. The Field Map takes care of the rest: any 'incident.state' Field records set at the Message level (i.e. with a value in the 'Message' field) will use the choices configured at the Integration level.
We'll first configure the Message level Field and then move on to configure the choices on its Integration level counterpart.
There is no need to 'Generate field choices' for Message level Field records because the Field Map always looks for them on an Integration level Field which has the same name.
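The lookup the Field Map performs can be pictured like this (a plain-JavaScript sketch; the names and structures are illustrative, not Unifi internals):

```javascript
// Integration level Field Choices, defined once and shared by every
// Message level 'incident.state' Field (the values here are illustrative).
const integrationFieldChoices = {
  'incident.state': { '1': 'New', '2': 'In Progress', '6': 'Resolved', '7': 'Closed' }
};

// A Message level Field carries no choices of its own; it resolves them
// from the Integration level Field with the same name.
function resolveChoice(fieldName, internalValue) {
  const choices = integrationFieldChoices[fieldName] || {};
  return choices[internalValue];
}
```

This is why defining the choices once at the Integration level is enough: every Message that maps the same field name resolves against the same set.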
To quickly navigate to the CreateIncident Message from the Details page of the newly created incident.short_description Field record...
...click the 'Preview' icon to the left of the Message field.
From the CreateIncident Message, navigate to Message > Fields. Click New.
The fields to be configured for our 'incident.state' (Message level) New Field modal are as follows:
*These fields are automatically defaulted to true, or automatically populated.
**Field map: Value may vary. Choose the copy Field Map you created for your Integration.
Your 'incident.state' (Message level) New Field modal should look like this:
Submit the record.
You will be redirected back to the Fields page of the CreateIncident Message.
We will need to 'Generate field choices' for this Integration level Choice 'type' Field.
Navigate to the 'Fields' icon to open the Fields page.
Click to open the incident.state (Integration level) Field record (the one where Message is empty).
The incident.state Field record opens to the Details page.
Navigate to Field > Field Choices.
Click Generate field choices.
Click Generate on the 'Generate field choices' modal which displays.
The Field Choices are generated & now visible in the list.
At this stage, you could carry on and configure the remaining Field records for the rest of the Incident fields (as per the table at the top of this section). However, we will now run the Build process to auto-generate our Message Scripts.
Now that we’ve configured the Field records for the CreateIncident message, we are ready to build our message scripts.
From the CreateIncident Message, navigate to Message > Fields.
The following Field records should now be in place for your CreateIncident Message:
Click on Build Message.
You will see the 'Message build successful' Info Message.
Navigate to Advanced > Script Editor to view the auto-generated code.
Your Script Editor fields should look like this:
The newly auto-generated code will appear between a Begin & End Comment immediately prior to any code that may already be there (pre-existing code will be retained).
We will now examine our new, auto-generated Message Scripts.
Source to Stage:
Stage to Request:
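The generated scripts are specific to your instance and mappings, but conceptually they do the following (a plain-JavaScript sketch with mock records, not the actual auto-generated code):

```javascript
// Source to Stage: copy the mapped fields from the source record onto the Stage.
function sourceToStage(source, stage) {
  stage.caller_id = String(source.caller_id || '');
  stage.short_description = String(source.short_description || '');
  stage.state = String(source.state || '');
  return stage;
}

// Stage to Request: build the outbound request payload from the Stage data.
function stageToRequest(stage, payload) {
  payload.caller_id = stage.caller_id;
  payload.short_description = stage.short_description;
  payload.state = stage.state;
  return payload;
}
```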
We are now ready to Test our CreateIncident Message.
Before we do, let's view the Trigger which Unifi automatically created when we ran 'Build Message'.
We will test our CreateIncident Message.
Navigate to Incident > Create New.
The Incident fields to configure are as follows:
*Note: We have chosen these fields as these were the ones we mapped when creating our Field records for the CreateIncident Message. Your choices may differ depending on which Field records you created.
Your Incident form should look like this:
Right-click & Save.
Note the values entered (including State & Priority), so that you can check them against the mapped fields on the corresponding record in the instance being integrated to.
You should see an Info Message, confirming the CreateIncident Message is being sent to your Integration:
You should also see a note in the Activities stream:
When you scroll down to the 'Unifi Integrations' related list (you may have to configure the related lists to add it to your Incident form), notice:
A Bond has been created. The State remains 'Pending' until the list is refreshed.
Refresh the list by clicking Bonds.
You should see the External reference populated & the State changed to 'Open':
We are using a sys_id for the External reference in our example because we are integrating with the table API. If possible, it is better to use something more meaningful, like the Number of the ticket integrated with, as this aids in debugging.
Click the Bond Number link to open the Bond record.
Your Bond record should look like this:
Bond details:
Integration: < Your Integration >
Connection: < Your Connection >
Table: 'Incident [incident]'
Document: < Your Incident >
State: 'Open' (Message exchange is available)
Status: 'OK' (All transactions have completed)
Internal reference: < ServiceNow ticket reference > (Same as 'Document')
External reference: < External system's ticket reference >
Transaction:
Message: 'CreateIncident'
Direction: 'Outbound'
Transaction state: 'Complete' (The data has been successfully transported)
Process state: 'Accepted' (The transaction was accepted as within the scope of the business logic that's in place)
You are able to view the logs in the 'Unifi Activity Logs' related list.
Transaction process next queued: Logs from checking whether there are any other transactions queued and processing those.
Transaction sending: Logs from taking the Stage data, building the Request record & sending the Request to the integrated system.
Business rule: < Your Trigger >: Logs from the Business Rule that triggers Unifi.
Click through to the Transaction record from the related list on the Bond.
Your Transaction record should look like this:
Transaction details:
Table: 'Incident [incident]'
Document: < Your Incident >
Integration: < Your Integration >
Connection: < Your Connection >
Bond: < Your Bond >
Message: 'CreateIncident'
Direction: 'Outbound'
Transaction state: 'Complete' (The data has been successfully transported)
Process state: 'Accepted' (The transaction was accepted as within the scope of the business logic that's in place)
Errors:
Error: (If there was a transactional error the Transaction state would show as 'Error' and the details would be captured here).
Process error: (If there was a process error the Process state would show as 'Rejected' and the details would be captured here)
Stage:
Direction: 'Outbound'
Message: 'CreateIncident'
Internal reference: < ServiceNow ticket reference > (Same as 'Document')
External reference: < External system's ticket reference >
Click through to the Stage record from the related list on the Transaction.
Check the values in the fields match what you expect.
Your Stage record should look like this:
Stage details:
Direction: 'Outbound'
External reference: < External system's ticket reference >
Internal reference: < ServiceNow ticket reference >
Snapshot: < Snapshot record reference >
Message: 'CreateIncident'
Transaction: < Your Transaction >
Integration: < Your Integration >
Mapped Staged Data fields (yours may differ depending on which Field records you created):
Caller ID: < Your caller_id >
Short description: < Your Short description >
State: '1'
Click through to the HTTP Request record from the related list on the Transaction.
This is where you will find details of the data that was transported between the systems being integrated. (These records are extremely useful for developing and debugging integrations because of the immediate availability and contextual relevance to the integration you are developing.)
Your HTTP Request record should look like this:
HTTP Request details:
Integration: < Your Integration >
Connection: < Your Connection >
Transaction: < Your Transaction >
Message: 'CreateIncident'
Direction: 'Outbound'
Request state: 'OK' (There are no errors with the HTTP Request.)
Attempt number: < Number of HTTP Request attempts > (Failed requests are retried up to the maximum attempts number as configured on the Integration.)
Endpoint URL: < The external system’s access URL >
Action Method: 'POST'
Request headers: < The header of the request being sent >
Request payload: < The payload of the request being sent >
Response details:
Status code: '200'
Response headers: < The header of the response being received >
Response payload: < The payload of the response being received >
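For reference, the request and response for this scenario might look something like this (the values and endpoint are illustrative; yours will differ, though the table API does wrap its response body in a 'result' element, which is where the 'result.sys_id' Field mentioned earlier comes from):

```javascript
// Illustrative outbound HTTP Request to the table API (values are made up).
const httpRequest = {
  method: 'POST',
  endpointUrl: 'https://external-instance.service-now.com/api/now/table/incident',
  headers: { 'Content-Type': 'application/json' },
  payload: {
    caller_id: '62826bf03710200044e0bfc8bcbe5df1',
    short_description: 'Printer down',
    state: '1'
  }
};

// A 200 status code confirms the request succeeded; the table API returns
// the created record inside a 'result' element.
const httpResponse = {
  statusCode: 200,
  payload: { result: { sys_id: 'c5554cc837102000...' } }
};
```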
Navigate to the corresponding Incident in the external system.
Check the values in the fields match those you noted when you saved the Incident in the internal system.
Your external system's Incident record should look like this (depending on the system you're integrating with, your record may look different; the important matter is that the values match):
Caller: < Your Caller >
State: < Your State >
Short description: < Your Short description >
Activities: < Note showing activity on the Incident > (Opened by < your.external.system.user > configured in the Connection)
We are now ready to move on to the Update Scenario.
The Response Message is the immediate synchronous response that is sent to acknowledge the successful transport of another Message.
As before, after clicking the 'Messages' icon, you will see the following screen (note: both the previously configured messages are now visible in the list):
Click New.
The fields to be configured for the Response New Message modal are as follows:
Your Response New Message modal should look like this:
Click Submit.
You will be redirected to the Messages page. You need not configure the Response Message any further.
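Because the Response is the immediate synchronous acknowledgement of another Message, it needs no field mapping of its own. Conceptually it looks like this (a plain-JavaScript sketch; the function and payload shape are hypothetical, not Unifi's actual format):

```javascript
// Illustrative: a Response simply acknowledges successful transport of
// another Message, carrying no mapped field data of its own.
function buildResponse(receivedMessageName) {
  return {
    message: 'Response',
    inResponseTo: receivedMessageName,
    status: 'OK'
  };
}
```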
Now it's time to move on and configure the UpdateIncident Message.
See the 'CreateIncidentResponse Fields' page for details.
Field inheritance is set to true by default. This means the record will be updated with integration-level Field values when saved (except for Active, Inherit and Message values). Uncheck the Inherit field to configure locally. For more information, see Field Inheritance.
Message*
The Message this Field record is linked with.
'CreateIncident'
Description
Describe what this field is for and any specific details that might help you in future.
'The caller on this incident'
Active*
Set to true to use this Field record for processing.
<true>
Field map
The Field Map this Field record is linked with.
'PI - String'**
Map to field*
Use this Field record to represent a field on a source/target table.
<true>
Table*
The primary source/target table that this Field record is mapped to.
'Incident [incident]'
Element
The field on the source/target table this Field record is mapped to.
'Caller'
Property*
The property in the payload the data will be written to.
Automatically populated
Inbound*
Set to true to use for inbound Messages.
<false>
Outbound*
Set to true to use for outbound Messages.
<true>
Element
The field on the source/target table this Field record is mapped to.
'Short description'
Property*
The property in the payload the data will be written to.
Automatically populated
Description
Describe what this field is for and any specific details that might help you in future.
'The short description of this incident'
Message*
The Message this Field record is linked with.
'CreateIncident'
Description
Describe what this field is for and any specific details that might help you in future.
'The incident lifecycle state'
Active*
Set to true to use this Field record for processing.
<true>
Field map
The Field Map this Field record is linked with.
'PI - Choice'**
Map to field*
Use this Field record to represent a field on a source/target table.
<true>
Table*
The primary source/target table that this Field record is mapped to.
'Incident [incident]'
Element
The field on the source/target table this Field record is mapped to.
'State'
Property*
The property in the payload the data will be written to.
Automatically populated
Inbound*
Set to true to use for inbound Messages.
<false>
Outbound*
Set to true to use for outbound Messages.
<true>
Message name
The message name that is unique for this integration.
'Response'
Type
The primary purpose of the message.
'Response'
Direction
The direction(s) this message is configured to support. (Choices: Inbound, Outbound, Bidirectional)
'Inbound'
caller_id*: String
short_description: String
description: String
category: Choice
impact: Choice
urgency: Choice
state: Choice
comments: String
work_notes: String
Caller*
Person who reported or is affected by this incident.
<Your Caller>
State*
The Incident lifecycle state.
'New' - Default (Automatically populated)
Short description*
A brief description of the incident.
<Your Short description>