
Data Pipeline Problems: Fixing Data at the Source Beats Being a Data Medic

Don’t be a “data medic” who manually fixes the fallout of your company’s poor data and business-system processes. There is a better way.

When I ran business systems and data analytics for Nortek, we were plagued, as most companies are, with suboptimal business processes that resulted in dirty data. Analysts and executives would encounter this data when compiling reports. “Why is there such a big outlier here? That doesn’t seem correct.” As a result, they’d busy themselves with excluding erroneous transactions or fixing incorrect entries in the analysis.
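
To make that work concrete, here is a minimal sketch of the manual-exclusion pattern, in Python with pandas. The file name, column names, and the three-sigma outlier rule are all hypothetical illustrations, not Nortek’s actual schema or logic:

    import pandas as pd

    # A minimal sketch of the "data medic" pattern. The file and column
    # names are hypothetical, and the three-sigma rule is just one common
    # heuristic for flagging suspect entries.
    transactions = pd.read_csv("gl_transactions.csv")

    # Compute the typical amount per account.
    stats = (
        transactions.groupby("account")["amount"]
        .agg(["mean", "std"])
        .reset_index()
    )
    merged = transactions.merge(stats, on="account")

    # Flag entries far outside the typical range for their account.
    is_outlier = (merged["amount"] - merged["mean"]).abs() > 3 * merged["std"]

    # The "fix" lives only in this report script; the source system is
    # untouched, so every downstream analysis must repeat this cleanup.
    clean = merged.loc[~is_outlier, transactions.columns]
    print(f"Excluded {int(is_outlier.sum())} suspect rows out of {len(merged)}")

Note that nothing here corrects the source system; the cleanup must be rediscovered and re-run in every report that touches the same data.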

Over time, these issues accumulate. The people who know how to correct them become the business’s critical “data medics,” and the business comes to depend on them to constantly fix bad data.

Eventually, even the data medics realize that an extremely large share of their own analysis and reporting time is consumed by this triage work. And then someone comes up with the bright idea of creating a master data warehouse where all of the data can be scrubbed, fixed, and normalized so that analysts can pull good, clean data.

Nortek went down this path. Great in concept, but a nightmare in practice. We spent millions of dollars creating, and especially trying to maintain, such a data warehouse. And in true “no good deed goes unpunished” fashion, the effort had four disastrous consequences:

  • The “secret sauce” that transforms poor processes and dirty data into a clean master data warehouse is concentrated in the hands of a very few people, who are expensive and who have highly specialized knowledge
  • Because those specialists become a bottleneck, the lead time and backlog for getting anything done tend to grow dramatically
  • Passing financial, SOX, or IT audits takes ever more time, expense, and difficulty
  • The care and attention required to maintain such an endeavor becomes unexpectedly, even exponentially, costly

We realized that our data medics were actually building a mountain of expensive technical debt. Only when Nortek selected Incorta as our Unified Data Analytics Platform did we see a better way:

  • Incorta can load data lightning fast. Make a change in the source system, and in a couple of minutes you see that change on your analytics platform and reports.
  • Incorta replicates the data and structure of the source systems. No complicated SQL or ETL, no flattened data, no views, no cubes. CFOs love that the analytics numbers tie out exactly to the source system’s general ledger, and auditors’ work is quick and straightforward.
  • Incorta is the most powerful data analytics platform on the planet. From any insight, you can drill down to transaction-line detail virtually instantaneously, even across billions of records. If a number looks suspect, you can explore what underlies it at whatever granularity you wish; a generic sketch of this drill-down pattern follows this list.
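
For readers unfamiliar with the pattern, the sketch below shows what “drill down from an aggregate to transaction-line detail” means in generic terms. It uses plain Python with pandas over hypothetical file and column names; it illustrates the idea only and is not Incorta’s engine, API, or query syntax:

    import pandas as pd

    # Generic illustration of drilling from an aggregate to line-level
    # detail. File and column names are hypothetical; this is plain
    # pandas, not Incorta.
    lines = pd.read_parquet("invoice_lines.parquet")

    # Top-level insight: revenue by region. Suppose one region's total
    # looks suspiciously large.
    by_region = lines.groupby("region")["line_amount"].sum()
    print(by_region)

    # Drill straight to the underlying transaction lines for the suspect
    # region, sorted so the biggest contributors surface first.
    suspect = (
        lines[lines["region"] == "EMEA"]
        .sort_values("line_amount", ascending=False)
    )
    print(suspect[["invoice_id", "line_number", "customer", "line_amount"]].head(20))

Because the drill-down lands on actual source transactions, the natural next step is to correct the offending entries in the source system rather than in the report, which is exactly the shift described next.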

The “aha!” moment was the virtually instantaneous turnaround time to fix a problem at its source. That kind of instant gratification was dramatically different from any process we had previously attempted. With executive support, we focused our efforts on fixing our processes and data at the source.

Through this process, our data medics gave way to data and application innovators who drove incredible transformation at Nortek. I joined Incorta as CIO to help our customers ride the innovation train that Incorta is helping to power today.

 
