Testing Your Integration

When testing your data integration, the main goal is to determine whether it allows your staff to do what they need to do with the applicant data after loading it into your local systems. With that in mind, there are two primary facets of a data integration that should be tested:

  • Does it work? Test the extraction, transit, transformation, and load of the data, and the automation of those steps if applicable.
  • Does it load data properly? Test that the expected records, with all expected data points, are loaded to your local systems correctly.

It's necessary to clarify some of the logistical details before diving into those two main areas of testing. After reviewing the information below, feel free to use our Sample Testing Tracker to get started with testing your integration.

What does testing look like?

Testing your data integration involves sending real data through each and every element. Starting from a submitted application, data should successfully flow through extraction from Liaison's systems, transit to local systems, transformation to necessary formats, and load to local systems. If your CAS is live and applications have already been submitted, you can use those applications to test, as long as you load them into an environment in your local systems dedicated to testing. If no applications have been submitted to your CAS, you'll have to submit test applications yourself; a member of your account team can provide guidance on how to do so.

If you cannot submit test applications, you can still test parts of your data integration: the transform and load elements. Once you've identified what data points you need to move and which CAS fields contain those data points, you're positioned to create your own test data. Start with the CAS fields you need, then fill in data for each field. You can use the data dictionary (check with a member of your account team for which one to use) to fill in values for fields with defined options. For free-text fields, enter any values you like, though it's usually a good idea to include the word "TEST" somewhere.
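
If it helps to see this in practice, here is a minimal sketch of hand-built test data written to a CSV that your transform and load steps could consume. The field names and values are placeholders, not actual CAS export fields; substitute the fields and data dictionary values your own integration uses.

```python
import csv

# Hypothetical CAS field names -- replace with the fields from your own
# CAS export and the values from your data dictionary.
TEST_RECORDS = [
    {
        "cas_id": "900000001",
        "first_name": "TEST Avery",          # free-text fields flagged with "TEST"
        "last_name": "TEST Applicant",
        "program_code": "BIO-MS",            # defined-option field from the data dictionary
        "application_status": "Completed",
    },
    {
        "cas_id": "900000002",
        "first_name": "TEST Jordan",
        "last_name": "TEST Applicant",
        "program_code": "CHEM-PHD",
        "application_status": "Submitted",
    },
]

def write_test_file(path: str) -> None:
    """Write the hand-built test records to a CSV the transform step can read."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=TEST_RECORDS[0].keys())
        writer.writeheader()
        writer.writerows(TEST_RECORDS)

if __name__ == "__main__":
    write_test_file("cas_test_extract.csv")
```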

Who should do the testing?

Testing involves every element of the data integration, so it either needs to be done by someone who can run and monitor each element or by a team of people who, together, have access to each element. The most critical part of testing your data integration is determining whether the records created in your local systems have all the information necessary to perform the intended business process. Make sure that the testing team includes people with the access and knowledge required to review records created in your local systems and confirm they are usable for the intended purpose.

How should testing be done?

Testing your data integration is accomplished by gathering data in the CAS (either by submitting test applications or using live applications) and then running each element of the data integration: extract, transit, transform, and load. Run each of these elements independently to determine whether it functions correctly. If your data integration is automated, the automation is a separate element: once you've determined that each component works properly on its own, test that the automation successfully strings them together.

The key to a successful testing session is keeping track of your expectations. Before testing your data integration, gather information about all of the applications you intend to test. The first thing you'll want to check is that the correct number of records moved from Liaison's systems to your local systems. Having a list of the records you expect to move will allow you to check for that. For each application you intend to test, manually record what you expect the records to look like in your local system. Will the application create a new person record when loaded or should it merge into an existing person record? What data points do you expect to be loaded? In other words, make yourself a guide so that you'll know exactly what's gone wrong, if anything.
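
For example, here is a minimal sketch of an expectations check, assuming you can export or query the list of CAS IDs that actually landed in your local system. The get_loaded_ids function below is a hypothetical placeholder for that export or query.

```python
# Records you expect to move in this test run (hypothetical IDs).
EXPECTED_IDS = {"900000001", "900000002", "900000003"}

def get_loaded_ids() -> set[str]:
    # Placeholder: replace with a query or report from your local system.
    return {"900000001", "900000002"}

def check_counts() -> None:
    loaded = get_loaded_ids()
    missing = EXPECTED_IDS - loaded
    unexpected = loaded - EXPECTED_IDS
    print(f"Expected {len(EXPECTED_IDS)} records, found {len(loaded)}.")
    if missing:
        print("Missing from local system:", sorted(missing))
    if unexpected:
        print("Loaded but not expected:", sorted(unexpected))

if __name__ == "__main__":
    check_counts()
```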

To help keep track of your testing, try using our Sample Testing Tracker.

Where should testing be done?

Before diving into what to test, consider where you'll be doing your testing. Liaison provides non-production environments where test applications can be submitted and managed. Contact your Liaison representative if you need help accessing and using these non-production environments. Consider whether your local systems make non-production environments available where you can load data without worrying about the consequences. If so, it's probably best to do your initial testing in that environment. If your local systems don't make non-production environments available, take special care to plan out how your test data will be clearly marked so that you can delete it once testing is complete.

Does it work?

With your intended use for the applicant data in mind, you'll want to test that your data integration successfully extracts, moves, transforms, and loads CAS data into your local systems and, if you've automated your integration, that all of those steps are completed automatically. When testing this broad functionality, set aside questions about whether the data itself came through correctly and focus on the mechanics of the data movement: does it get from the CAS to your local systems?

The first step to testing that your data integration successfully moves data is to run it and see if data shows up in your local systems. After that, you'll want to dive into each step individually:

  • Extract: for testing, consider saving a copy of the extracted CAS data as a file to a local directory to confirm that it was successfully extracted (see the sketch after this list).
  • Transit: local systems differ on how they ingest data, but no matter what system you're loading data to, you'll have to get it there one way or another. For programmatic delivery (e.g., JSON via an ingestion API), consider logging successful delivery to the console. For file delivery (e.g., CSV via SFTP), consider retaining the uploaded file on the SFTP server after ingestion.
  • Transformation: for testing, consider saving a copy of the transformed CAS data as a file to a local directory to confirm that it was successfully transformed. Note that some local systems allow for transformation to occur after the point of loading, in which case reviewing the loaded data is the best way to determine if data is being successfully transformed.
  • Load: while local systems may differ in how they ingest data, they all have some tool for reviewing what data was loaded, when it was loaded, and whether the load was successful. You'll want to use this tool to check that your data integration is successfully loading data.
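
To illustrate the first two bullets, here is a minimal sketch of an extract step that archives a timestamped copy of the pulled data and logs the result. The extract_from_cas function is a hypothetical placeholder for however you actually pull data from Liaison's systems (API export, report download, etc.).

```python
import json
import logging
from datetime import datetime
from pathlib import Path

logging.basicConfig(level=logging.INFO)

def extract_from_cas() -> list[dict]:
    # Placeholder: replace with your real extract step; the record below is sample data.
    return [{"cas_id": "900000001", "application_status": "Completed"}]

def archive_extract(records: list[dict], directory: str = "extracts") -> Path:
    """Save a timestamped copy of the extracted data so each test run is auditable."""
    Path(directory).mkdir(exist_ok=True)
    path = Path(directory) / f"cas_extract_{datetime.now():%Y%m%d_%H%M%S}.json"
    path.write_text(json.dumps(records, indent=2))
    logging.info("Extracted %d records; archived to %s", len(records), path)
    return path

if __name__ == "__main__":
    archive_extract(extract_from_cas())
```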

If you've automated your data integration, you'll want to test that the automation works. Each of the four steps listed above can be triggered manually, and you'll likely want to test each of them individually by doing just that. The automation itself is a separate element that must also be tested: turn it on and confirm that each task completes successfully.

When testing that your data integration successfully moves data, you'll want to consider the following details:

  • Timing: test that your data integration moves applications when expected. For example, if you intend to move applications at the point of completion, be sure this happens on time and not before.
  • Frequency: test that your data integration occurs as frequently as planned. For example, if you've built an automated integration you intend to run three times a day, turn it on and make sure it runs at those times. Or, if you've built a manual data integration based on batch processing that you intend to run once a day, have the staff responsible for running the data integration run it daily for a couple of days to make sure things go smoothly and that the process isn't too onerous for staff.
  • Volume: test your data integration on large and small volumes of data. Test loading just a few records and loading over 100 records; a small sketch for generating both batch sizes follows this list.
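
As a rough illustration of the volume test, here is a sketch that generates a small batch and a large batch of test records as CSV files. The field names and values are placeholders for the CAS fields your integration actually moves.

```python
import csv

def make_record(n: int) -> dict:
    # Placeholder fields -- substitute the CAS fields your integration uses.
    return {
        "cas_id": f"9{n:08d}",
        "last_name": "TEST Applicant",
        "program_code": "BIO-MS",
        "application_status": "Completed",
    }

def write_batch(path: str, size: int) -> None:
    records = [make_record(i) for i in range(size)]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=records[0].keys())
        writer.writeheader()
        writer.writerows(records)

if __name__ == "__main__":
    write_batch("volume_test_small.csv", 5)    # just a few records
    write_batch("volume_test_large.csv", 150)  # well over 100 records
```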

In the real world, unexpected things happen. You'll want to test your data integration over a few days or weeks to see that it continues to run successfully even outside of your carefully curated testing environment. Variables like internet connectivity, automated job schedulers, and staff availability can change from day to day, and those changes can affect your data integration. Simulating real-world conditions by running your data integration for some time (with test data, of course) can help uncover issues before they become a problem in production.

Does it load data properly?

The quality control measures you build should allow you to quickly and easily answer this question. While robust quality control measures will allow you to review the integrated data quickly, they won't tell you what data you should be running through the integration to test. Each institution's situation is different: different CASs, different local systems, and different post-integration uses for applicant data. Therefore, it's impossible to provide a one-size-fits-all recipe for creating test data that will effectively determine whether your data integration is loading data properly. That being said, with a solid understanding of your admissions business process, the information below can guide you in creating effective test data.

Start with your intended use for the applicant data once it's been loaded into your local systems. At this point in the construction of your data integration, you should be well aware of what you're trying to do with the data and of the data points required to accomplish that goal. With your knowledge of the data points you're loading, consider variations in those data points that you typically see in your applicant pool from year to year. Do you get many international applicants? Does state of residency matter for domestic applicants? Do you have detailed coding in your local systems for CAS programs that require complex mappings? Using historical applicant data can help you understand what sort of data you can reasonably expect to run through your data integration. Catalog the possible variations in each data point.

With a firm grasp on the variety of data you can expect to move through your data integration, you'll next want to create test records covering every possible combination of those data points. Local systems are complex, so it's important to test all possible variations of data points on incoming records to make sure nothing throws a wrench into the whole integration.
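
One way to enumerate those combinations is with a short script. The sketch below assumes you've already cataloged the possible values for each data point; the field names and values shown are placeholders for your own catalog.

```python
from itertools import product

# Hypothetical catalog of variations -- replace with the data points and values
# you actually expect in your applicant pool.
VARIATIONS = {
    "citizenship": ["US", "International"],
    "state_of_residence": ["CA", "NY", "Other"],
    "program_code": ["BIO-MS", "CHEM-PHD"],
    "application_status": ["Submitted", "Completed"],
}

def generate_combinations() -> list[dict]:
    """Build one test record per combination of cataloged values."""
    fields = list(VARIATIONS)
    return [dict(zip(fields, combo)) for combo in product(*VARIATIONS.values())]

if __name__ == "__main__":
    combos = generate_combinations()
    print(f"{len(combos)} test records to create")  # 2 * 3 * 2 * 2 = 24
    for record in combos[:3]:
        print(record)
```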

You'll also want to consider applicant data that you don't want to integrate. For example, if you're loading applicant data at the point of completion, you don't want to load applications that have only been submitted. Be sure to include some test data that you don't want to load to make sure your data integration handles those cases properly as well.
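
A small sketch of that negative check, assuming your integration should load only completed applications: the IDs and the get_loaded_ids placeholder below are hypothetical and stand in for a query or report from your local system.

```python
# Test records deliberately left at "Submitted" that should NOT be loaded.
SHOULD_NOT_LOAD = {"900000010", "900000011"}

def get_loaded_ids() -> set[str]:
    # Placeholder: replace with an export of IDs actually loaded during the test.
    return {"900000001", "900000002"}

if __name__ == "__main__":
    leaked = SHOULD_NOT_LOAD & get_loaded_ids()
    print("Records that should not have loaded:", sorted(leaked) or "none")
```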

In addition to testing all possible combinations of normal variations in your required data points, you should test extremes in your data. For example, while most applicants will have just a few colleges to report in the prior education section, you should create some test records with outlandish numbers of prior colleges (think 15).

If your CAS allows individual applicants to submit multiple applications per academic year, you should take care to test records with multiple applications. Applicants with multiple applications can be hard to handle: you need to be sure that the different applications load to the same person record but to separate application records, without duplicating either person or application records. Consider the many ways in which an applicant can have multiple applications across programs and entry terms: applications to the same program in different entry terms (e.g., fall, spring, summer), applications to different programs in the same entry term, etc. Here, as well, it's important to test extremes (think of an applicant with 10 different applications).
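
Here is a minimal sketch of a post-load check for applicants with multiple applications. It assumes you can export rows describing each loaded application (person record ID, CAS applicant ID, program, entry term) from your local system after the test load; the export function, field names, and IDs are placeholders.

```python
from collections import defaultdict

def export_loaded_applications() -> list[dict]:
    # Placeholder: replace with an export or query from your local system.
    return [
        {"person_id": "P001", "cas_applicant_id": "900000001", "program": "BIO-MS", "entry_term": "Fall"},
        {"person_id": "P001", "cas_applicant_id": "900000001", "program": "BIO-MS", "entry_term": "Spring"},
    ]

def check_multi_application_loads() -> None:
    by_applicant = defaultdict(list)
    for row in export_loaded_applications():
        by_applicant[row["cas_applicant_id"]].append(row)
    for applicant_id, rows in by_applicant.items():
        persons = {r["person_id"] for r in rows}
        applications = {(r["program"], r["entry_term"]) for r in rows}
        if len(persons) > 1:
            print(f"{applicant_id}: applicant split across person records {sorted(persons)}")
        if len(applications) < len(rows):
            print(f"{applicant_id}: duplicate application records detected")
    print(f"Checked {len(by_applicant)} applicants.")

if __name__ == "__main__":
    check_multi_application_loads()
```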

Finally, consider that applicant data isn't static. Depending on your intended use for applicant data in local systems, you may be transferring the same application from the CAS to your local systems many times as information is added to it. For example, if applications are being loaded to local systems as soon as they've been created, you'll be loading those applications regularly so the local systems know when they're complete and all required materials have been received. The time dimension becomes important in data integrations of this type, and you'll want to test that your data integration handles changes over time properly. To do that, load the same application record several times, with changes to the record each time it's loaded. So beyond every possible combination of your required data points, you'll need to think through how those data points can vary over time. A common example is application status: if you load applications as soon as they've been "Submitted," you can expect many applications to progress to the "Completed" application status.
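
To exercise the time dimension, you might run the same application through the integration several times as a series of snapshots. The sketch below is a rough illustration only; the field names, counts, and the load_to_local_system placeholder are assumptions to adapt to your own setup.

```python
# Time-sequenced snapshots of one application, from "Submitted" to "Completed".
SNAPSHOTS = [
    {"cas_id": "900000001", "application_status": "Submitted", "transcripts_received": 0},
    {"cas_id": "900000001", "application_status": "Submitted", "transcripts_received": 2},
    {"cas_id": "900000001", "application_status": "Completed", "transcripts_received": 3},
]

def load_to_local_system(record: dict) -> None:
    # Placeholder: replace with your actual load step (API call, file drop, etc.).
    print(f"Loading snapshot: {record}")

if __name__ == "__main__":
    for snapshot in SNAPSHOTS:
        load_to_local_system(snapshot)
        # After each load, verify that the local record reflects the latest
        # snapshot and that no duplicate application records were created.
```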

Sample Testing Tracker

After reviewing this information, use our Sample Testing Tracker to start testing your integration.

[Screenshot: Sample Testing Tracker]
