Don’t Lose your Digital Transformation Energy to Data Gravity

An Intellyx BrainBlog by Jason English, for Model9

The limited resources of our modern world have placed a premium value on energy, especially for powering our homes and buildings. Government and business leaders talk constantly about ‘transforming the power grid’ and finding more sustainable, reliable, and cost-efficient sources of electricity.

The initiatives always sound like great ideas, and technologies for harnessing alternative sources like wind and solar have indeed come a long way. So why do so many such projects still encounter limitations?

Our challenges go beyond how we generate electricity at its source. We overlook the difficulty and cost of transforming, moving, and holding that potential energy as close as possible to its point of use, in batteries or elsewhere – the transmission and storage cost.

From a technology perspective, data is the resource that fuels a modern enterprise – and parallels between the difficulties of moving energy and moving enterprise data are becoming all too clear.

The inertia of data gravity prevents us from exploiting the true value of all of our enterprise data for transformative change, just when we need it most.

When the lift-and-shift of ETL isn’t strong enough

ETL stands for Extract, Transform and Load. For as long as we have been exporting data from one core enterprise system and importing it into another for business intelligence, transactional, and compliance reasons, ETL has been the expected norm for making that happen.

In a world where batch reconciliation processes and all-night backups off the mainframe were the norm, the ETL approach served enterprises well for decades – until it didn’t.
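To picture what that classic approach looks like in practice, here is a minimal sketch of a batch ETL job (illustrative Python; the file, field, and table names are hypothetical, and any production pipeline would be far more involved):

```python
# A minimal, illustrative ETL sketch: pull records exported from a core
# system, reshape them, and load them into an analytics store.
# File paths, field names, and table names here are hypothetical.
import csv
import sqlite3

def extract(path):
    # Extract: read the nightly export (e.g., a flat file dumped off the mainframe)
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Transform: normalize fields into the shape the target system expects
    return [
        {"account_id": r["ACCT_NO"].strip(),
         "balance_usd": round(float(r["BALANCE"]) / 100, 2)}
        for r in rows
    ]

def load(rows, db_path="analytics.db"):
    # Load: import the reshaped records into the reporting database
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS balances (account_id TEXT, balance_usd REAL)")
    con.executemany("INSERT INTO balances VALUES (:account_id, :balance_usd)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("nightly_export.csv")))
```

Note that every record makes the full round trip out of the source system and into the target; that wholesale movement is exactly what becomes a bottleneck as data volumes grow.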

The rise of service-based architectures and on-demand cloud infrastructure raised expectations for faster delivery of new application features that can scale rapidly and respond in an instant to business demands.

A new generation of cloud-based data lakes and warehouses started filling up, providing a constant firehose of data to power advanced applications like AI-based fraud detection, inference-driven customer recommendations, and analytics.

There are hundreds of ETL tools that can be used to lift-and-shift chunks of legacy data from one system to another – but in this real-time application world, that approach is quickly becoming a constraint on progress.

Until now, many overlooked a critical flaw in modernization plans: the time and cost of moving so much data from mainframes, silos, and other sources to where it can be productively exploited.

Reframing data gravity on the mainframe

Data has gravity. Therefore, we want to keep it as close as possible to the application functionality it serves.

Fortunately, mainframe system vendors like IBM are already thinking about this, introducing zIIP capabilities on the mainframe and new zSeries systems that allow distributed workloads to run on excess mainframe capacity, as well as bursting workloads to cloud infrastructure…

 


Jason English is Principal Analyst & CMO, Intellyx. Twitter: @bluefug