Testing@LMAX – Compatibility Tests

LMAX Exchange

Once an application goes live, it is absolutely essential that any future changes are able to work with the existing data in production, typically by migrating it as changes are required. That existing data and the migrations applied to it are often the riskiest and least tested parts of the system. Mistakes in a migration will at best cause a multi-hour outage while backups are restored, and more likely will subtly corrupt data, producing incorrect results that may go unnoticed for long periods, making it impossible to roll back. At LMAX Exchange we reduce that risk by running compatibility tests.

Sanitised Data

The first requirement for testing data migrations is test data that is as production-like as possible. It’s almost never a good idea to bring actual production data into development, and it’s definitely never going to happen for a finance company like LMAX Exchange. Instead, we have a sanitiser that runs in production and generates a very heavily sanitised version of production data that still has the same “shape”. For example, it has the same number of accounts, the same distribution of open positions across instruments, null values in the same places there are null values in production, and so on. However, it replaces anything that’s even vaguely personally identifiable or sensitive. If in doubt, sanitise it out.
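To make “same shape, different content” concrete, a single sanitising step might look something like the sketch below, assuming a PostgreSQL-style database. The connection details, table and column names are purely illustrative, and the real sanitiser does a great deal more than this:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

// A minimal sketch of one sanitising step; connection details, tables and columns are illustrative.
public class AccountSanitiserSketch {
  public static void main(String[] args) throws SQLException {
    try (Connection connection = DriverManager.getConnection("jdbc:postgresql://localhost/sanitised_copy");
         Statement statement = connection.createStatement()) {
      // Derive replacement values from the primary key so accounts remain distinct,
      // but keep NULLs exactly where production has NULLs so the data keeps its shape.
      statement.executeUpdate(
          "UPDATE account " +
          "SET name = CASE WHEN name IS NULL THEN NULL ELSE 'account-' || account_id END, " +
          "    email = CASE WHEN email IS NULL THEN NULL ELSE 'account-' || account_id || '@example.invalid' END");
    }
  }
}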

Despite being heavily sanitised, we still treat it as data that should be secured, but it can be brought into our development, testing and staging environments.

We have multiple production environments so we have sanitised data that matches the shape of each of those environments.

Can Migrations Run?

Once we have sanitised data, the most basic check is to confirm that the migration process will actually complete. This ensures we can release successfully but doesn’t give us any real confidence that the migration did the right thing. For such a primitive check it’s surprisingly effective, as it picks up the common errors of changing a column to NOT NULL when the production data actually does have null values, or adding a unique constraint to a table when the content isn’t actually unique.
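Conceptually the check is nothing more than replaying the release’s migration scripts against a restore of the sanitised data and failing the job on any error. The sketch below shows the idea; the script location, the ordering and the assumption of one statement per script file are for illustration rather than how our migration tooling actually works:

import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.List;
import java.util.stream.Stream;

// A minimal sketch of the "can migrations run?" check; paths and connection details are illustrative.
public class CanMigrationsRunSketch {
  public static void main(String[] args) throws Exception {
    List<Path> scripts;
    try (Stream<Path> paths = Files.list(Path.of("migrations"))) {
      scripts = paths.sorted().toList(); // apply scripts in release order
    }
    try (Connection connection = DriverManager.getConnection("jdbc:postgresql://localhost/sanitised_copy");
         Statement statement = connection.createStatement()) {
      for (Path script : scripts) {
        // Any failure here - for example setting NOT NULL on a column that still has nulls,
        // or adding a unique constraint over duplicate values - fails the whole job.
        statement.execute(Files.readString(script));
      }
    }
  }
}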

Did Migrations Work?

The obvious next step is to write tests to confirm that migrations actually worked. Our compatibility test jobs are set up so that we can easily write a JUnit test that will be run after the migrations complete, so we can verify the resulting state.

The most direct form of test is an example-based test. We select some examples from the production dataset and write assertions to check that a specific bit of the data migrated in the way we expect. The downside is that we’re dealing with live production data, which is regularly updated, so it’s possible that our examples will change after we’ve written the test and then, correctly, migrate to something different from what we expect. Still, these tests are often useful to run once as a sanity check while developing the migration, then delete.
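An example-based test in this style might look roughly like the one below, written against the assertNoRowsMatch helper described later in this post. The base class name, the account ID and the table and column names are all made up for illustration; real examples are picked from the sanitised dataset for the migration in hand:

import org.junit.Test;

// A hypothetical example-based test; the account ID, tables and columns are illustrative.
public class PaymentRoleExampleTest extends CompatibilityTest {

  @Test
  public void shouldHaveGrantedPaymentRoleToAKnownRealMoneyAccount() {
    // Account 1234567 was a real-money account when this test was written, so the
    // migration should have granted it the new payment role. If the account changes
    // in production this may legitimately start failing, which is why example-based
    // tests get deleted once the migration has shipped.
    assertNoRowsMatch(
        "SELECT a.account_id " +
        "FROM account a " +
        "WHERE a.account_id = 1234567 " +
        "AND NOT EXISTS (" +
        "  SELECT 1 FROM account_role r " +
        "  WHERE r.account_id = a.account_id AND r.role = 'PAYMENT')");
  }
}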

Slightly more generic, we can write tests that assert constraints which must be true immediately after the migration completes. For example, when we made permissions more fine-grained we needed to assign a new type of payment role to every account that used real money, but not to any demo accounts (which use pretend money). We can write a test to verify that the migration worked correctly quite easily; however, once the migration goes out, admin users may add or remove the role for different accounts and the constraint would no longer hold. For cases like that we simply delete the test once the migration has gone live, at which point it’s done its job anyway. We also mark the test with a @ValidUntil annotation that makes it clear the test has a limited lifetime, in case we forget to delete it.
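For the payment-role example above, such a test might look something like this sketch. The @ValidUntil annotation is the real mechanism, but its exact form, the base class and the table and column names are assumptions for the sake of the example:

import org.junit.Test;

// Only holds until admins start changing roles by hand, hence the @ValidUntil marker.
public class PaymentRoleMigrationTest extends CompatibilityTest {

  @Test
  @ValidUntil("2016-06-30") // illustrative value; delete once the migration has gone live
  public void shouldAssignPaymentRoleToRealMoneyAccountsOnly() {
    // Every real-money account should have picked up the new role...
    assertNoRowsMatch(
        "SELECT a.account_id FROM account a " +
        "WHERE a.account_type = 'REAL_MONEY' " +
        "AND NOT EXISTS (" +
        "  SELECT 1 FROM account_role r " +
        "  WHERE r.account_id = a.account_id AND r.role = 'PAYMENT')");

    // ...and no demo account should have it.
    assertNoRowsMatch(
        "SELECT a.account_id FROM account a " +
        "JOIN account_role r ON r.account_id = a.account_id AND r.role = 'PAYMENT' " +
        "WHERE a.account_type = 'DEMO'");
  }
}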

Finally, we can often identify constraints that should always be valid and write permanent tests for them. These are extremely powerful, testing not just that our current migration works correctly but that no future migration breaks that expectation.

Did Something Unexpected Happen?

The compatibility tests that should always hold true have an additional benefit: they give us early feedback that the production data has diverged from expectations for some reason, typically because a bug slipped through. We can then investigate, fix the bug to prevent any more data issues, and work out how to clean up the problem.

Obviously, finding issues only after production data has gone wrong is not something we ever want to do, but it’s still a useful safety net if something slips through all our pre-release testing and the database schema constraints we use. Typically, when we do find issues they are minor inconsistencies that don’t cause any problems now, but are like little time bombs just waiting for a future release that assumes they can’t possibly happen. So even getting feedback that late in the process often allows us to avoid any user-noticeable effects.

Making Them Easy

We have a base class we extend our compatibility tests from, which makes it easy to get a connection to the database and has a few handy utilities for asserting things. By far the most useful, however, is the assertNoRowsMatch method. It does exactly what it says: it takes an SQL query and asserts that no rows match. If any do, it prints them out, so you get really useful debug information to start investigating the problem. For example:

@Test
public void shouldHaveAPrincipalForEveryAccount() {
  assertNoRowsMatch(
    "SELECT a.account_id, a.name " +
    "FROM account a " +
    "LEFT JOIN principal p ON a.principal_id = p.principal_id " +
    "WHERE p.principal_id IS NULL");
}

If we’ve somehow wound up with an account with no principal that could log into it, the test will print the account ID and name so we can investigate what happened and clean up the data.
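The helper itself doesn’t need to be much code. The sketch below is a rough approximation of what an assertNoRowsMatch implementation could look like, assuming the base class provides a JDBC connection; it isn’t our actual implementation:

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;

import static org.junit.Assert.fail;

// A rough sketch of the assertNoRowsMatch helper; the real base class may well differ.
public abstract class CompatibilityTest {

  protected abstract Connection connection();

  protected void assertNoRowsMatch(String sql) {
    try (Statement statement = connection().createStatement();
         ResultSet results = statement.executeQuery(sql)) {
      StringBuilder matches = new StringBuilder();
      ResultSetMetaData metaData = results.getMetaData();
      while (results.next()) {
        // Collect every column of every matching row so the failure message tells
        // you exactly which data to go and investigate.
        for (int column = 1; column <= metaData.getColumnCount(); column++) {
          matches.append(metaData.getColumnLabel(column)).append('=')
                 .append(results.getObject(column)).append("  ");
        }
        matches.append('\n');
      }
      if (matches.length() > 0) {
        fail("Expected no rows to match but found:\n" + matches);
      }
    } catch (SQLException e) {
      throw new RuntimeException("Failed to run query: " + sql, e);
    }
  }
}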

 
