How bad data can break your business

Today, anyone with a “C” in their title knows that, while they don’t need to be a data scientist, they should know what data they have and how it’s used. But now, in the wake of recent security breaches, they also need to know whether that data is safe, accurate and trustworthy. For most enterprises and organizations, it is none of those things.

Enough has been written lately about security and how everyone, everywhere is exposed. I’m going to focus on the area that is often overlooked in the data and security discussions: IT data.

Four sobering stats on data

Here are four stats, all having to do with data and business-critical decision making:

  • More than 40 percent of executives make a major decision at least once a month (2014 PwC survey).
  • The data on which these decisions are based is growing in volume at a rate upwards of 40 percent a year (2014 PwC survey).
  • 40 percent of businesses will fail to achieve their business objectives due to poor data quality (Gartner).
  • At any given moment, 40 percent of an enterprise’s data is missing, wrong or incomplete (Gartner).

Sobering news, isn’t it? And without corrective action, it’s only going to get worse. Currently, we’re in the final days of Data 1.0, where the velocity, variety and volume of data are increasing at an alarming rate. As we enter the world of Data 2.0 (what is called big data), those three Vs will only increase. In the world of Data 3.0 (when the Internet of Things comes full circle), the business and data ecosystems will be moving at speeds we can’t even fathom.

Why and how data goes bad

Data is not born bad. Like food in your refrigerator, it decays over time. The causes are many: human error (incorrect data entry by an employee or customer; undocumented changes); disconnected management systems; siloing of key data; failed data migration or integration projects; the move from single-source to multiple vendors; and the complications stemming from M&As.

The bottom line with bad data: what previously was a mere annoyance is now a veritable emergency. Bad data can cripple an enterprise. It can also end careers. While security breaches get all the headlines, decisions made with bad data are fundamentally flawed, and flawed decisions are a luxury no enterprise can afford. So, while cleaning up our data once ranked with resolving to get to the gym more often, it is now an ongoing challenge involving an organization’s entire hierarchy, including the C-level.

The solution: Pull over for a full tune-up

These days, it may feel like your company is rushing down the information highway at speeds barely under your control. However, as noted earlier, these speeds, and the accompanying loss of control, are only going to intensify.

The good news: it’s not too late. The technology is finally here to turn your 40 percent bad data into 100 percent certifiably correct data: the data you can use to make confident decisions of consequence. The key to unlocking it all is not to treat the effort as a “data decision” but as a business one.

Total data quality: A business initiative

Complete, comprehensive data quality begins by bringing the “data people” out of the lab and into the boardroom and explaining to them the interplay between their data challenges and the company’s business objectives. Incomplete or unreliable data has bottom-line ramifications, and scientists, engineers and management teams have to work together to solve the issue. The data people need to provide management with a unified, consistent, accurate view of the entire IT environment — something that hasn’t been achieved since the days of the single mainframe and central data repository.

This process begins with identifying, accessing and organizing all of an organization’s data. The average enterprise is working with up to 50 different data types. Find the tool or platform that can identify and sync with all of these data types, as well as the new sources that will inevitably emerge. Then purify that data by applying Master Data Management (MDM) principles, consolidating it into a single, authoritative version of the truth.
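
To make that concrete, here is a minimal sketch, in Python with pandas, of what MDM-style consolidation can look like: records for the same asset arrive from two silos, matching keys are normalized so the records line up, and a survivorship rule produces a single “golden” record. The systems, field names and rule used here (a CMDB feed, a monitoring feed, “keep the most recently seen value”) are hypothetical illustrations, not a prescription.

    # Hypothetical example: the same asset described by two different silos.
    import pandas as pd

    cmdb = pd.DataFrame([
        {"asset_id": "A100", "hostname": "web-01 ", "os": "RHEL 7",  "last_seen": "2015-01-10"},
        {"asset_id": "A101", "hostname": "db-01",   "os": "Windows", "last_seen": "2015-02-01"},
    ])
    monitoring = pd.DataFrame([
        {"asset_id": "A100", "hostname": "WEB-01", "os": "RHEL 7.1", "last_seen": "2015-03-02"},
    ])

    def normalize(df):
        # Standardize the matching keys so the same asset lines up across silos.
        df = df.copy()
        df["hostname"] = df["hostname"].str.strip().str.lower()
        df["last_seen"] = pd.to_datetime(df["last_seen"])
        return df

    combined = pd.concat([normalize(cmdb), normalize(monitoring)])

    # Survivorship rule: for each asset, keep the most recently observed values.
    golden = (combined.sort_values("last_seen")
                      .groupby("asset_id", as_index=False)
                      .last())
    print(golden)

The point is less the specific library than the pattern: normalize, match, then apply an agreed survivorship rule, so every downstream report starts from the same record.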

Put your data to work

Those clean, aligned data sets are now ready for analysis. For example, comparing data where silos intersect exposes process breakdowns, catalyzing the root-cause analysis that, in turn, forms the foundation for continued improvement. It is this analytic capability that enables what many data scientists and business analysts consider the Holy Grail: the ability both to analyze today’s data and to predict and optimize for the future. It’s not an easy step, but it is a logical one. With the recent surge in machine learning, prescriptive analytics and artificial intelligence, this capability will be within reach next year and commonplace in the next two.
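
As an illustration of that cross-silo comparison, the sketch below (again in Python with pandas, with hypothetical system and field names) flags assets that appear in only one system and attributes that disagree where the records do overlap; each flagged row is a candidate for root-cause analysis.

    # Hypothetical example: an outer join between two silos surfaces gaps
    # and disagreements that point at broken processes.
    import pandas as pd

    cmdb = pd.DataFrame([
        {"asset_id": "A100", "os": "RHEL 7"},
        {"asset_id": "A101", "os": "Windows 2012"},
    ])
    monitoring = pd.DataFrame([
        {"asset_id": "A100", "os": "RHEL 7.1"},
        {"asset_id": "A102", "os": "Ubuntu 14.04"},
    ])

    merged = cmdb.merge(monitoring, on="asset_id", how="outer",
                        suffixes=("_cmdb", "_monitoring"), indicator=True)

    # Assets known to only one system: often a failed hand-off or decommission.
    missing = merged[merged["_merge"] != "both"]

    # Assets known to both systems, but with conflicting attributes.
    conflicts = merged[(merged["_merge"] == "both") &
                       (merged["os_cmdb"] != merged["os_monitoring"])]

    print(missing[["asset_id", "_merge"]])
    print(conflicts[["asset_id", "os_cmdb", "os_monitoring"]])

Counting how often each class of discrepancy recurs, by source system or by process, is what turns a one-off cleanup into the continuous improvement loop described above.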

How does an organization proceed in eliminating the bad data that risks bringing a company down? The first step is a “data intervention.” Get everyone, especially the IT teams, to recognize the problem. Then, bring IT into the business fold, explaining that data purification isn’t an academic exercise; rather, it is a crucial determinant of financial profit or loss.

Apply the processes and principles outlined above. The technology is there; what is needed is the will to address the problem and the resources to apply the technology.
 
A disclaimer: this article focuses on IT data, but its lessons apply to all corporate data.

As CEO, Gary is responsible for leading strategic direction and execution at Blazent. He has over 25 years of experience both in IT executive roles and in leading high-growth IT software organizations.