
Cost of poor test data management

by Mark Rowe

If there is one thing that makes a business sit up and listen, it is money, especially when it comes to avoidable fines. So, with the Financial Conduct Authority (FCA) listing over £800m of fines imposed this year on businesses that breached financial principles, you would think the industry would be more careful about its conduct, writes Andrew Crouch, head of data at software company SQS.

Concerns about fines should also be high on the agenda of organisations ahead of the new EU General Data Protection Regulation (GDPR), which is due to be finalised in October this year and come into force in 2017. The new regulation represents the most significant change to data protection in the UK and EU since 1995. For the first time, businesses will have to tell customers, in plain language, what information about them is collected and how it is used. Customers will also have the ‘right to be forgotten,’ making it possible to have personal information about themselves deleted from online services. The new rules, of course, significantly increase penalties for non-compliance and pose problems for how financial institutions manage their customer data.

With the new regulations looming, and bearing in mind the industry’s evident track record for breaching principles, IT professionals have pointed the finger at banks, naming them as the most likely to be first hit by the maximum GDPR fines. In response, banks are rushing to put the relevant data management processes in place, but there is a noticeable lack of thought when it comes to their test environments.

With a constant influx of new data sets, the result of rapid digitisation within the industry, many financial services organisations are in a state of continuous digital development, placing vast amounts of data, including personal customer data, in test environments. However, with GDPR fines from the Information Commissioner’s Office (ICO) currently planned at 2 per cent of a business’s global revenue (capped at €100 million) and the FCA planning unlimited fines, it is time test data management was given the same priority as other data programmes. It is simply not worth getting it wrong.
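To put the proposed ICO formula in concrete terms, the sketch below simply applies a 2 per cent rate with a €100 million ceiling, the draft figures cited above; the revenue figure is an invented example, not a real bank’s.

```python
def max_ico_gdpr_fine(global_revenue_eur: float,
                      rate: float = 0.02,
                      cap_eur: float = 100_000_000) -> float:
    """Return the lesser of rate * global revenue and the cap, using the
    draft figures quoted in the article (assumptions, not final law)."""
    return min(global_revenue_eur * rate, cap_eur)

# An illustrative bank with EUR 20bn of global revenue: 2 per cent would be
# EUR 400m, so the EUR 100m ceiling applies instead.
print(max_ico_gdpr_fine(20_000_000_000))  # 100000000.0
```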

To ensure banks are not jeopardising their customer data in testing, they need to understand their data model. Many organisations, banks included, actually have an incomplete picture of their data. This is particularly true of testing, where different data sets are saved at different times and in different locations. Although some of this data may be structured, much of it will not be, and a data discovery exercise is necessary to locate personal data before banks can start to put a data management process in place.
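As a rough illustration of what a first discovery pass might look like, the sketch below scans a directory of test data files for patterns that commonly indicate personal data. The directory name, file types and regular expressions are assumptions chosen for the example; real discovery tooling profiles databases and unstructured stores far more thoroughly.

```python
import re
from pathlib import Path

# Hypothetical patterns that often indicate personal data in test data dumps.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_sort_code": re.compile(r"\b\d{2}-\d{2}-\d{2}\b"),
    "card_number": re.compile(r"\b\d{16}\b"),
}

def scan_file(path: Path) -> dict:
    """Count likely personal-data matches per pattern in a single file."""
    text = path.read_text(errors="ignore")
    return {name: len(rx.findall(text)) for name, rx in PATTERNS.items()}

def discover(test_data_dir: str) -> None:
    """Flag every CSV in a test environment dump that appears to hold personal data."""
    for path in Path(test_data_dir).rglob("*.csv"):
        hits = scan_file(path)
        if any(hits.values()):
            print(f"{path}: {hits}")

# discover("/data/test_dumps")  # directory name is an assumption for illustration
```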

Once an organisation is familiar with its data model, and has located all personal data, it can bring its data processes into line with the new EU regulation by working with anonymised live data during testing. Using anonymised data can be an ideal solution for banks looking to test new software functionality. It also has the advantage of revealing no personal details and gives banks peace of mind when it comes to compliance.
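In practice, anonymising live data for testing typically means masking or replacing the identifying fields while keeping each record’s shape, so that tests still behave realistically. The snippet below is a minimal sketch; the field names and masking choices are assumptions rather than any particular bank’s schema.

```python
import hashlib

def anonymise_customer(record: dict, salt: str) -> dict:
    """Return a copy of a customer record with identifying fields masked.

    Names are replaced with a salted hash (consistent across runs, so joins
    between test tables still line up), account numbers keep only their last
    four digits, and free-text fields are dropped entirely.
    """
    anon = dict(record)
    anon["name"] = hashlib.sha256((salt + record["name"]).encode()).hexdigest()[:12]
    anon["account_number"] = "****" + record["account_number"][-4:]
    anon.pop("notes", None)  # free text is very hard to anonymise reliably
    return anon

live = {"name": "Jane Example", "account_number": "12345678", "notes": "called re: mortgage"}
print(anonymise_customer(live, salt="test-env-2016"))
```

Strictly speaking, consistent hashing of this kind is closer to pseudonymisation than full anonymisation, which is one more reason the discovery and governance steps described above still matter.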

However, anonymised live data may only cover around 30 per cent of a bank’s test coverage requirements. If that is the case, synthetic data (otherwise known as dummy data) can be used to plug the gaps. Mature synthetic data tools are still few and far between, but a combination of anonymised data and synthetic data can provide an accurate test data model while mitigating the risk of non-compliance.
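Where synthetic records are needed, they can be generated to the same schema as the live data so the test suite cannot tell the difference. The sketch below invents its schema and value ranges purely for illustration.

```python
import random
import string

def synthetic_customer(i: int) -> dict:
    """Generate a dummy customer record that matches a test schema
    but contains no real personal data at all."""
    return {
        "customer_id": f"TEST{i:06d}",
        "name": "Customer " + "".join(random.choices(string.ascii_uppercase, k=5)),
        "account_number": "".join(random.choices(string.digits, k=8)),
        "balance_gbp": round(random.uniform(-5_000, 250_000), 2),
        "overdraft_flag": random.random() < 0.2,
    }

# A thousand dummy customers to fill the coverage that anonymised data cannot reach.
test_set = [synthetic_customer(i) for i in range(1_000)]
print(test_set[0])
```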

Needless to say, the security risks associated with using personal data within a live test environment, particularly in highly regulated industries like banking, are countless. Businesses operating in this sector need to ensure their test environments are entirely secure. After all, companies that suffer data breaches will be liable to provide compensation to those affected, as well as facing reputational damage.

It has always been important to protect personal data, but for banks the potential fines posed by the GDPR have made business leaders sit up and listen. In today’s ever-changing digital environment, banks need to fully understand their data models; get to grips with their glut of potentially unstructured, potentially poorly managed data; and put processes in place to comply. Carrying out testing efficiently, whilst remaining compliant, will enable the institutions that get it right to produce quality software and avoid potentially catastrophic financial penalties.
