Data Quality: Set to Extra Rinse and Cycle Dry Heavy

May 28, 2014 Bill Thomas


In our line of business, enterprise data quality is of the utmost importance. We live and breathe data so often that, in a sense, we are constantly wearing data wherever we go. Each morning we go through the same routine, some of us in a different order than others: we get dressed, we eat breakfast, we leave for work, and at the end of the day we discard our worn clothes into a pile of dirty laundry.

Data is put to everyday use, just like the clothes you wear on your back, and at the end of the day those clothes need to be washed. Not a normal wash, but one with an extra rinse, and with the volume of data being produced today, you might as well set the dryer to heavy cycle too.

In a recent TechRepublic article, Mary Shacklett describes data quality as the ugly duckling of big data. She cites “a big data survey in 2013 that revealed 60 percent of IT leaders believed their organizations lacked accountability for data quality, and more than 50 percent of IT leaders questioned the validity of their data.”

An astonishing number, but nevertheless very believable, because when you operate a business you should be prepared to get dirty. Dirt will come from all directions and in all shapes and sizes, from different ERP interfaces to your business and IT users; everyone adds dirt to your data, most of it unintentionally, of course.

With all of this dirt, how do you keep your data clean?

Where data migration is event oriented, information governance is a continuous business process, and it relies heavily on data being clean now.

There are several ways to approach data quality management after a data migration, but I will cover two, one clearly recommended over the other: the reactive approach and the proactive approach.

  • Reactive Data Quality Management – If you don’t have a proactive data quality management program in place, then by default you have a reactive one. As opposed to a proactive approach, a reactive approach waits for a costly business process interruption to occur before anyone acts.
  • Proactive Data Quality Management – A proactive approach combines a wealth of knowledge of the business rules, access to the data and written analytical reports that monitor exceptions to those rules (see the sketch after this list). The differentiator is catching data problems BEFORE they become visible. More a process than a project, your proactive program must stay alive and active, because the dynamic nature of your business and your customers will keep the data, the business rules and the service level agreements around those business processes in a constant state of change.

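To make the proactive idea concrete, here is a minimal sketch in Python of rule-based exception monitoring. The CustomerRecord fields and the three rules are illustrative assumptions, not rules from any particular ERP; the point is that each business rule becomes a named, testable check that runs against the data before a downstream process ever trips over it.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical customer record; the field names are illustrative assumptions.
@dataclass
class CustomerRecord:
    customer_id: str
    country_code: str
    payment_terms: str
    credit_limit: float

# Each business rule is a named predicate that must hold for a clean record.
RULES: dict[str, Callable[[CustomerRecord], bool]] = {
    "country_code_present": lambda r: len(r.country_code) == 2,
    "payment_terms_known": lambda r: r.payment_terms in {"NET30", "NET60", "COD"},
    "credit_limit_non_negative": lambda r: r.credit_limit >= 0,
}

def exception_report(records: list[CustomerRecord]) -> list[tuple[str, str]]:
    """Return (customer_id, failed_rule) pairs -- the exceptions a
    proactive program reviews before any business process breaks."""
    return [
        (r.customer_id, name)
        for r in records
        for name, rule in RULES.items()
        if not rule(r)
    ]

if __name__ == "__main__":
    sample = [
        CustomerRecord("C001", "US", "NET30", 50000.0),
        CustomerRecord("C002", "", "NET90", -100.0),  # fails all three rules
    ]
    for customer_id, failed_rule in exception_report(sample):
        print(f"{customer_id}: failed {failed_rule}")
```

Run on a schedule, a report like this surfaces exceptions daily instead of waiting for an invoice, shipment or month-end close to fail first.
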
If you are in a state of Reactive Data Quality Management, here are a few questions to get you thinking about a Proactive Data Quality Management program:

  1. Do you publish a monthly scorecard focused on the quality and timeliness of your data? (A minimal scorecard sketch follows this list.)
  2. If there is a business interruption caused by data, is a report written to trap the error in the future?
  3. Who writes your reports, who is each report assigned to, and who monitors it on a daily basis?
  4. Do you have SLAs (Service Level Agreements) in place regarding the creation of your data?
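
As a starting point for question 1, here is a minimal Python sketch of a monthly scorecard that reports completeness and timeliness percentages. The material-master field names and the two-day approval SLA are illustrative assumptions; substitute the fields and SLAs your own business rules define.

```python
from datetime import date, timedelta

# Hypothetical material-master rows; field names are illustrative assumptions.
ROWS = [
    {"material": "M-100", "description": "Bolt M6",
     "created": date(2014, 5, 1), "approved": date(2014, 5, 2)},
    {"material": "M-101", "description": "",
     "created": date(2014, 5, 3), "approved": date(2014, 5, 9)},
]

# Assumed SLA: records must be approved within 2 days of creation.
SLA = timedelta(days=2)

def scorecard(rows: list[dict]) -> dict[str, float]:
    """Compute two example scorecard metrics as percentages:
    completeness (description filled in) and timeliness (SLA met)."""
    total = len(rows)
    complete = sum(1 for r in rows if r["description"].strip())
    on_time = sum(1 for r in rows if r["approved"] - r["created"] <= SLA)
    return {
        "completeness_pct": 100.0 * complete / total,
        "timeliness_pct": 100.0 * on_time / total,
    }

if __name__ == "__main__":
    for metric, value in scorecard(ROWS).items():
        print(f"{metric}: {value:.1f}%")
```

Published monthly and trended over time, even two simple metrics like these give the business a shared, visible definition of "clean."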

As always, your program cannot be complete without assigning the right people to help wash the data. Proactively assigning qualified people to the roles that establish a clear and meaningful information governance policy, and to the data steward roles that enforce it, is essential to keeping your data clean, reliable and relevant.

While you ponder and determine the best program for your desired outcomes, keep in mind that the first step in keeping your house clean and free of dirt is to leave your dirty clothes at the door, before the dirt has a chance to get inside.

The Road to SAPPHIRE NOW Blog Series features posts on data quality, information governance and master data management from BackOffice Associates industry experts. Check back weekly for new content related to the 2014 SAPPHIRE NOW and ASUG Annual Conference, and join us at the event June 3-5 in Orlando, FL at Booth #224.

Join the conversation #SAPPHIRENOW.

About the Author

Bill Thomas

Bill Thomas joined BackOffice Associates in 1997 as a Team Lead. In 2009, Bill created the DataFactory, which delivers a global, dual-shore data migration augmentation strategy. Bill brings over 30 years of software programming experience to BackOffice and now serves as SVP of the Global Delivery Center.
