Follow this guide to learn how to generate and run automated Integration Tests in Unifi.
With the introduction of Unifi Test Assistant, you can now generate Integration Tests and perform regression testing at scale. Create, manage and execute tests, view and act on results, and save enormous amounts of time during upgrades and release cycles.
This document will guide you through the process of generating, running and exploring automated Integration Tests in Unifi. It will demonstrate how straightforward and efficient those tasks can be, saving huge amounts of time when it comes to testing your Integrations.
We will look at testing from the following perspectives.
Integration Tests are created directly from the Bond record. At the click of a button, Unifi will generate a test which comprises each of the Transaction scenarios on that Bond.
The generated tests are used to check Unifi's processing of the data i.e. to compare whether it is behaving in the same manner and producing the same results when processing the generated test as it did when processing the original records. It checks not only the data itself, but also the Unifi processes that trigger, transport and respond to that data moving through Unifi.
The Unifi Admin [x_snd_eb.admin] role is required to generate Integration Tests.
When running a test, no connection is made to the other system. Instead, Unifi calls a mock web service which responds with results from the original scenario. Unifi then tests what happens with that response. Doing this helps to ensure the accuracy of the test (testing the functionality of the Unifi process in your instance), without relying on input from an external instance (potentially adding further variables to the test).
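Unifi's internal implementation is not public, but the mock-response idea described above can be sketched as follows. All names here are illustrative, not Unifi's actual API: the point is that the "endpoint" simply replays the response captured from the original scenario instead of contacting the external system.

```python
# Illustrative sketch only: models a mock endpoint that replays the
# response recorded from the original Transaction, so the external
# system is never contacted during a test run.

def make_mock_endpoint(recorded_responses):
    """Return a callable that replays recorded responses by scenario id."""
    def respond(scenario_id, request_payload):
        # The request payload is ignored for routing purposes; the
        # stored response for this scenario is simply replayed.
        return recorded_responses[scenario_id]
    return respond

# Replay the response originally captured for a hypothetical scenario.
mock = make_mock_endpoint(
    {"create_incident": {"status": 200, "body": {"result": "ok"}}}
)
response = mock("create_incident", {"short_description": "Test"})
```

Because the response is deterministic, the test exercises only the Unifi logic in your instance, with no variability introduced by a remote endpoint.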
In the current release, automated testing only supports REST and JSON payloads (not SOAP or XML). Also, we currently do not support automated testing of attachment messages.
Exploring the results of the Integration Test is intuitive, efficient and informative using Unifi Test Assistant.
Because tests are generated from real-world data in your instance, that data (i.e. the data contained on the bonded record, e.g. Caller, Assignment group etc.) must also exist in any other instance where you want the tests to run.
If you change your process (e.g. change the structure of data objects being exchanged), you will need to generate new tests.
Whenever you package your Integration (for details, see the Packager Feature Guide), any Integration Tests you create will also be included along with the other elements of your packaged Integration.
The Unifi Test Assistant Process Model shows how Unifi Test Assistant has been built to work with your integration and platform release process.
Unifi Test Assistant is designed to be used with integrations that are already working. It is not a replacement for unit testing.
A) Once UAT has been completed for the integration, Integration Tests can be generated from the resulting Bonds and imported back into Dev.
B) The Integration Tests can be executed as many times as required to perform regression testing for new platform upgrades, patch releases or for any other reason.
C) Integration Tests are packaged with the Integration and can be executed as part of UAT if required.
D) Integration Tests can be generated in Production to allow new or unforeseen scenarios to be captured and tested against in future release cycles.
The dedicated portal interface for running and exploring automated Integration Tests.
Integration Test is the overarching record containing all the elements of the automated Integration Test. It correlates to the Bond from which it was created and comprises each of the Transaction scenarios on that Bond.
Integration Test Scenarios are the elements that make up an Integration Test. Each Scenario will correlate to the relevant Transaction on the Bond from which the test was created. Each contains the relevant Test Scenario Data objects for the particular Scenario.
Test Scenario Data is a JSON representation of all the relevant records created during the processing of a Transaction (e.g. HTTP Request, Transaction, Bond, Snapshot) and is used to both generate the test and ascertain the results of each test run.
Whenever you run an Integration Test, the results are captured in an Integration Test Result record. The record links to and contains a summary of each of the individual Test Scenario Results.
Whenever you run an Integration Test Scenario, the results are captured in an Integration Test Scenario Result record. The results of each Test Scenario are tallied and rolled up to the parent Integration Test Result record.
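The tally-and-roll-up behaviour described above can be sketched like this. This is a hypothetical model, not Unifi's actual schema: it assumes each Scenario Result carries a simple status and that the parent Test Result summarises them.

```python
# Hypothetical sketch of rolling Scenario Results up to a parent
# Integration Test Result. Status names are assumed for illustration.
from collections import Counter

def roll_up(scenario_results):
    """Tally scenario statuses and derive an overall result."""
    counts = Counter(r["status"] for r in scenario_results)
    overall = "passed"
    if counts.get("failed"):
        overall = "failed"
    elif counts.get("warning"):
        overall = "passed_with_warnings"
    return {"counts": dict(counts), "overall": overall}

result = roll_up([
    {"scenario": "Create", "status": "passed"},
    {"scenario": "Update", "status": "warning"},
])
# result["overall"] is "passed_with_warnings"
```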
We will give step-by-step instructions on how to generate, run and explore automated Integration Tests.
How to navigate, view and interpret the results of an automated Integration Test.
In Unifi Test Assistant, once you click 'Run' you will see the Integration Test Result record being populated with each of the Integration Test Scenario Results in real-time as they happen.
Once complete, the results for each of the Integration Test Scenarios are tallied and rolled up to the Integration Test Result.
The top of the pane displays a graph and various counts showing the overall Status and numbers of Tests grouped by Status. You can also link out to the Integration Test along with the Bond & Target Record created during the test run.
The Details tab shows the description (as entered after generating the test), along with the date/time and the version of Unifi installed when created. It also links out to the Integration and Process records (opening Integration Designer in a new window).
Target Version: The licensed version of Unifi installed at the time the Test was created. Tracking which version of Unifi was used to create the Test may be useful for compatibility testing after upgrading.
The Scenarios tab shows each of the Integration Test Scenario Results. Clicking the Scenario link will open the Result for that Integration Test Scenario. Clicking the Transaction link will open the Transaction record created during the test run.
The Warnings tab shows all the warnings grouped by each Integration Test Scenario Result. From here you can step into each of the relevant Results and the relevant 'Transport Stack' records (e.g. Stage, Request, Bond, Snapshot).
Unifi will log a warning for test results where a discrepancy has been detected. These warnings should be investigated. However, in the majority of cases they should require no further action. For instance, where date/time values exist in the payload, there will always be a discrepancy between the time in the original payload and the time in that of the Test Scenario Result.
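The date/time behaviour described above can be illustrated with a minimal comparison sketch. The field names here are assumed examples, not Unifi's actual payload fields: discrepancies in known date/time fields are surfaced as warnings, while any other mismatch is treated as a failure.

```python
# Illustrative sketch: comparing a replayed payload against the original.
# Date/time values will always differ between the original run and the
# test run, so those discrepancies are flagged as warnings, not failures.

DATETIME_FIELDS = {"opened_at", "updated_at"}  # assumed example fields

def compare_payloads(original, replayed):
    """Return (warnings, failures) listing the fields that differ."""
    warnings, failures = [], []
    for key in original:
        if original[key] == replayed.get(key):
            continue
        target = warnings if key in DATETIME_FIELDS else failures
        target.append(key)
    return warnings, failures

# A differing timestamp produces a warning; everything else matches.
warnings, failures = compare_payloads(
    {"opened_at": "2024-01-01 09:00:00", "state": "New"},
    {"opened_at": "2024-06-01 14:30:00", "state": "New"},
)
```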
The History tab shows a list of the results of each test run. Each time a Test is run, the Result will be added to the top of this list. Clicking the value in the Number column will display the relevant Integration Test Result above.
You can step into each Integration Test Scenario Result from either the Scenarios or Warnings tabs. Clicking the link will open it in a new window. An example is shown below.
The assertions that have passed are green; the assertions that have warnings are orange (with the discrepancies called out); links to the documents created during the test run are called out and highlighted blue.
The Test Results are then rolled up to the Process...
...and the Dashboard.
This is a summary of all the tests that have been run on the instance. Each graph segment is a clickable link to a list of the relevant Test Result records matching the filter criteria (i.e. Passed without warning, Passed with warning and Pending).
This shows the number of Tests in relation to the number of Integrations on the instance (Unifi expects at least one Test to exist for each Integration & coverage is calculated as the percentage of Integrations containing at least one Test). The Chart is a graphical representation of the test coverage percentage & the segments are clickable links to a list of the relevant Integrations. Note: in the example above, 25% coverage means that only four of the sixteen Integrations have at least one associated Integration Test.
Example Test Coverage:
One Integration containing one Test = 100% coverage
One Integration containing two Tests = 100% coverage
One Integration containing two tests plus one containing none = 50% coverage
This displays a list of the most recent Test results. Clicking the Number will open the Test Result in the Unifi Test Assistant window; clicking the Integration Test will open the Integration Test in the platform in a new browser window.
This displays the Integrations on the instance grouped by Company. It displays a range of messages about the status of those Integrations in terms of Tests (e.g. '12 integrations without a test', or 'No integration tests found' etc.). Clicking the Company will open a new window containing a list of the Tests for that Company. This will be of particular value if you are a Managed Service Provider (MSP).
How to generate an automated Integration Test.
Before you can run an automated Integration Test you must first create one. Unifi makes that extremely easy for you.
Integration Tests are created directly from the Bond record. At the click of a button, Unifi will generate a test which comprises each of the Transaction scenarios on that Bond.
Navigate to the Bond you wish to create the automated Integration Test from and click 'Create Test'.
On the modal that pops up, you will need to confirm by clicking OK.
The Create Integration Test Worker modal is displayed, showing progress. Close the modal.
The automated Integration Tests have been created and can be viewed by navigating to Unifi > Testing > Integration Tests.
We recommend you add your own meaningful Description to the Test and then click Update.
Note the following about the Test that was created.
Name: Automatically concatenated value [Integration] - [Unifi Version] - [Bond] (this can be edited to suit).
Integration: The Integration the Test belongs to and will be packaged with.
Unifi version: The licensed version of Unifi installed at the time the Test was created.
More information about each of the records created can be found in the Testing section of our Documentation.
If you're the kind of person that likes to know how things work, we've included this information just for you.
When you click 'Create Test', Unifi will create an Integration Test record for that Bond.
It will then take the first Transaction on the Bond and create an Integration Test Scenario record for that Transaction.
Once that is done, it will look for all the relevant transport stack records (Snapshot, Stage, Bond, HTTP Request, Transaction) that pertain to that specific Transaction, create the relevant Integration Test Scenario Data objects for each record, and add them to that Integration Test Scenario.
It will then loop through each of the subsequent Transactions on the Bond, repeating the process for each (creating an Integration Test Scenario record and adding the relevant Integration Test Scenario Data objects).
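The generation steps above can be sketched as a simple loop. The record shapes and field names here are hypothetical, not Unifi's actual schema: one Scenario per Transaction, with one Scenario Data object per transport stack record.

```python
# Hypothetical sketch of Integration Test generation from a Bond.
# Record and field names are illustrative only.

STACK_RECORDS = ["Snapshot", "Stage", "Bond", "HTTP Request", "Transaction"]

def generate_integration_test(bond):
    """Build one Scenario per Transaction, each with its Scenario Data."""
    test = {"bond": bond["id"], "scenarios": []}
    for transaction in bond["transactions"]:
        scenario = {"transaction": transaction["id"], "data": []}
        # Capture a Test Scenario Data object for each transport
        # stack record pertaining to this specific Transaction.
        for record_type in STACK_RECORDS:
            scenario["data"].append({
                "type": record_type,
                "json": transaction["stack"].get(record_type, {}),
            })
        test["scenarios"].append(scenario)
    return test
```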
How to run an automated Integration Test.
Once you have generated the automated Integration Test it is available for you to use and re-use over and over again.
Unifi has a dedicated portal interface for automated Integration Testing. To open it from native ServiceNow navigate to Unifi > Unifi Test Assistant.
Once open, you will initially be greeted with the Dashboard, which provides an overview of the Tests and their Results.
If you navigate to Processes you will be presented with an overview of the Tests and their Results grouped by each Process in the instance.
From there, you can either click the appropriate tile in the main pane, or select the equivalent Process listed underneath 'Processes' in the sidebar to view and run the relevant Tests for that Process.
Tests can be run either in native ServiceNow or Unifi Test Assistant.
To run an Integration Test directly from the record itself in native ServiceNow, simply click the 'Run' button.
To run an Integration Test in Unifi Test Assistant, once you have selected the appropriate Process, navigate to and select the relevant Test (listed underneath its integration)...
...click 'Run'...
...and Confirm.
If you're the kind of person that likes to know how things work, we've included this information just for you.
When you click 'Run', Unifi will create/update (depending on the scenario) a test record from the data in the Snapshot record. E.g. in the case of a create scenario it will create a test record, add the values from the Snapshot and save the record. (In one sense Unifi doesn't know it's being tested - it isn't concerned with how the record was created; it sees that a record has been created and values added - it behaves as it normally would if the record was created manually).
It will then check whether it performed as expected (triggering the Integration, creating the relevant transport stack records etc.) and if so, compare the results of the test with those of the original Transaction (using the relevant Integration Test Scenario Data objects as reference).
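A single run of a "create" scenario, as described above, can be modelled with this sketch. Everything here is hypothetical (Unifi's actual run logic is internal): a record is built from the Snapshot values, the normal integration logic produces an outbound payload, and that payload is compared against the one captured in the Scenario Data.

```python
# Hypothetical sketch of running one "create" test scenario.
# Names and shapes are illustrative, not Unifi's actual implementation.

def run_create_scenario(snapshot, expected_payload, build_payload):
    # 1. Create the target record from the Snapshot values; from the
    #    integration's point of view this looks like a manual creation.
    record = dict(snapshot)
    # 2. Let the normal integration logic build the outbound payload.
    actual = build_payload(record)
    # 3. Compare against the payload captured from the original
    #    Transaction and report any discrepancies.
    discrepancies = [
        k for k in expected_payload if actual.get(k) != expected_payload[k]
    ]
    return {"passed": not discrepancies, "discrepancies": discrepancies}

# Usage with a trivial stand-in for the integration's payload builder.
result = run_create_scenario(
    snapshot={"state": "New", "short_description": "Printer down"},
    expected_payload={"state": "New"},
    build_payload=lambda record: {"state": record["state"]},
)
```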