By Jason English
Tricentis sponsored this post.
Enter the war room. The whole C-level team is on deck because your latest quarterly figures are telling a different story than expected. The BizOps report shows certain regions achieving greater-than-expected top-line growth, yet global news of an economic slowdown suggests the opposite should be true.
After reviewing the information carefully, senior executives begin to suspect that the reports are wrong. However, pinpointing where the problem lies is daunting:
- Is the report rendering incorrectly?
- Is there a problem in the report logic — or the business logic around it?
- Is there a problem in the data feed into the business intelligence (BI) system?
- Is the data being drawn from the appropriate data warehouse instance(s)?
- Is there an issue in the load, transformation, extraction or source of the various feeds populating the data warehouse? (A reconciliation check like the one sketched after this list can help isolate that layer.)
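Each of those layers can be probed with an automated reconciliation check that compares what a source feed sent with what the warehouse actually received. Here is a minimal sketch of the idea in Python, using an in-memory SQLite database and hypothetical table and column names (`stg_orders`, `dw_orders`, `amount`); a real pipeline would point the same logic at its own staging and warehouse tables:

```python
# A minimal reconciliation sketch: compare a staging (source) table against
# the warehouse table it feeds. Table and column names are hypothetical.
import sqlite3


def reconcile(conn, source_table: str, target_table: str, amount_col: str,
              tolerance: float = 0.01) -> list[str]:
    """Compare row counts and summed amounts between two tables.

    Returns a list of human-readable discrepancies; an empty list means
    the two layers agree within the given tolerance.
    """
    issues = []
    cur = conn.cursor()

    # Check 1: did every record from the source land in the target?
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    if src_count != tgt_count:
        issues.append(f"row count mismatch: {source_table}={src_count}, "
                      f"{target_table}={tgt_count}")

    # Check 2: do the monetary totals still agree after the load?
    src_sum = cur.execute(
        f"SELECT COALESCE(SUM({amount_col}), 0) FROM {source_table}").fetchone()[0]
    tgt_sum = cur.execute(
        f"SELECT COALESCE(SUM({amount_col}), 0) FROM {target_table}").fetchone()[0]
    if abs(src_sum - tgt_sum) > tolerance:
        issues.append(f"{amount_col} total mismatch: {src_sum} vs {tgt_sum}")

    return issues


# Example: a row silently dropped during the load step surfaces immediately.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 100.0), (2, 250.0), (3, 75.0);
    INSERT INTO dw_orders  VALUES (1, 100.0), (2, 250.0);  -- row 3 lost in load
""")
for issue in reconcile(conn, "stg_orders", "dw_orders", "amount"):
    print("DISCREPANCY:", issue)
```

Running checks like this at each hop (source to extract, extract to transform, transform to warehouse, warehouse to BI) narrows a report discrepancy down to the layer where the numbers first diverge.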
The potential for complexity is immense. Just 10 years ago, most large enterprises claimed to have only a handful of core data services (three, on average). Even if that picture was a bit rosy at the time, the number has since ballooned. With the advance of cloud, process outsourcing to SaaS providers and partners, mobile work and connected IoT devices, the total number of possible enterprise data sources now reaches into the millions, feeding specialized data aggregators and warehouses that can sit just about anywhere in a hybrid IT infrastructure.
We’re in the midst of a data explosion, and mission-critical data has burst far beyond the scope of traditional data-quality checks. A new approach is needed to ensure decision integrity: trust in high-stakes decisions that span multiple data sets, systems and business units.