The Hidden System Killer & How To Find It

Ian Parsonage

Senior Director, Global Management & IS Consulting

I have been working at a senior level in business systems management for nearly three decades. During this time, I’ve seen and fixed many poorly implemented systems and processes, in addition to delivering more than 20 major change projects. Across that time, one particular issue stands out that has often, and generally unfairly, led a system to be labelled as poor or badly implemented.


Master Data Management (or MDM)

It can be argued (correctly, in my view) that the implementation of a new system should be linked to extensive training of the users. However, where many companies fall short is in explaining why data is critical or, more importantly, the consequences of careless or unthinking entry.

I have also implemented systems that rely on information from older source systems which were not implemented with sufficient rigour in terms of data accuracy, user understanding or training. If this is not spotted prior to implementation, the new system gets the blame (as it’s the change people see). In reality, behind the scenes, numerous corrections are needed to old and seemingly stable systems in order to compensate for these historical errors.

It really is not surprising that 7 out of 10 system implementations continue to fail when these factors are considered, let alone everything else that could go wrong.

In reality, helpdesks and support staff chasing poor master data entries, and the downstream consequences of those entries, regularly feature in the top-five issues list (where the perennial number one is “I’ve forgotten my password”). Master data is not always the obvious cause of a problem, either; it can manifest itself in many varied ways, each requiring a different type and complexity of correction:

– Program terminates unexpectedly

– Wrong information, or no information at all, presented in another process

– Production line stops due to an incorrect value

– Product is shipped to the wrong place

– Customer is charged the wrong price

– and so on.

There have been some very high-profile MDM issues in the news, too – such as Cathay Pacific’s $675 seat bargains (they should have been $16,000), or United Airlines selling $4,000 first-class seats for under $100. Both were caused by an incorrect exchange-rate entry in a source system.

Despite billions of dollars being lost globally each year to MDM errors, it is still virtually impossible to build a system that can guarantee 100% data entry accuracy. After all, a date is a date – not necessarily the correct one. If a delivery house number exists, but the actual target address is a different number, how do you guard against that programmatically? A valid entry is invalid – smoke comes out of your ears. 🙂
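To make the limitation concrete, here is a minimal sketch of why simple programmatic validation can only go so far. The function names and ranges are hypothetical, not taken from any particular system:

```python
from datetime import date

def check_exchange_rate(rate: float, low: float = 0.05, high: float = 20.0) -> bool:
    """Range check: catches wildly wrong rates, but a plausible-yet-wrong
    entry (say 0.675 keyed in instead of 6.75) sails straight through."""
    return low <= rate <= high

def check_ship_date(d: date) -> bool:
    """A date is a date: we can reject the impossible, not the incorrect."""
    return d >= date.today()

# Both the mistake and the intended value are "valid":
print(check_exchange_rate(0.675), check_exchange_rate(6.75))  # True True
```

The checks reject the absurd (a negative rate, a shipment dated last year) but cannot tell a correct in-range value from an incorrect one – which is exactly the gap that caused the airline pricing incidents above.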

As a result, a common approach – in addition to the obvious repeated education of staff, and the complexity and cost of running everything through a second manual check – is to employ members of staff as “Data Police”, running a series of analyses across databases to check that entries are sensible and producing statistics to demonstrate the issues. Sadly, this approach, being human-driven, is still prone to error and to missed events or risks.
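Much of that “is this entry sensible?” sweep can be automated. The sketch below uses a standard median/MAD outlier test purely as an illustration – it is not any vendor’s actual engine, and the sample fares are invented:

```python
import statistics

def flag_outliers(values, threshold=3.5):
    """Flag entries far from the median (in robust MAD terms) - the kind
    of sanity sweep a human 'Data Police' officer would run by hand."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return []  # no spread to judge against
    return [(i, v) for i, v in enumerate(values)
            if 0.6745 * abs(v - med) / mad > threshold]

fares = [16000, 15800, 15900, 675, 16100, 16200]  # one mis-keyed fare
print(flag_outliers(fares))  # -> [(3, 675)]
```

A median-based test is used rather than a mean-based one because a single grossly wrong entry drags the mean (and standard deviation) towards itself, which can hide the very error being hunted.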

However, there is light at the end of the tunnel now that we are living in (or progressing towards) the enlightened world of Industry 4.0. Employing Artificial Intelligence (AI) in combination with Big Data analytics and data automation can produce great results in preventing MDM errors from causing process-performance, customer-service or financial issues.

These techniques have been used successfully by CXV Global customers to analyse data, determine the success or failure of the result and communicate it onwards. These analytical engines have reduced analysis lead-time by more than 95% with 100% accuracy, completely removing the human element.

In some cases, the MDM error results can be fed directly back into the host system as corrections. Initially, it may be easier to send the results back to the originating data-entry person so they can correct their own mistake. The cycle time for this could be as low as a few minutes, and certainly within 24 hours. The analysing system would also keep track of metrics by user, field, table and so on, to direct potential re-training or other activities.
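The per-user, per-field metrics mentioned above can be as simple as a running tally of flagged entries. A hypothetical sketch (the field and user names are illustrative only):

```python
from collections import Counter

def tally_errors(error_log):
    """Count flagged entries by (user, field) so re-training can be
    targeted at whoever makes a given mistake most often."""
    return Counter((e["user"], e["field"]) for e in error_log)

log = [
    {"user": "alice", "field": "exchange_rate"},
    {"user": "alice", "field": "exchange_rate"},
    {"user": "bob",   "field": "ship_date"},
]
print(tally_errors(log).most_common(1))  # -> [(('alice', 'exchange_rate'), 2)]
```

Even this trivial breakdown turns a vague “we have data quality problems” into a specific, actionable list of who needs help with which field.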

If you’d like to learn more about how these advanced techniques can help prevent a data catastrophe within your business, reach out to CXV Global for more information.