How can we trust hybrid cloud for managing and storing mainframe data?

An Intellyx BrainBlog for BMC, by Jason English

We’ve read this script before, so there’s no need to make it into a movie. 

Every plot to modernize enterprise applications starts out with a team heroically attempting to move a portfolio of core business applications, data and workloads off-premises, in order to attain the agility levels of nimbler born-in-the-cloud startups.

Inevitably, our protagonists find themselves challenged by hybrid cloud complications, as different parts of the application portfolio in different regions have unique data management expectations and data protection policies.

With so much business value riding on mainframe data, IT leaders often choose the path of least risk and end up living through a ‘worst of both worlds’ scenario. 

Mistrust in the cloud migration process leads to piecemeal projects, and the anticipation of failure causes teams to maintain duplicate instances of data on the mainframe and in various cloud instances at greater expense.

Why is cloud trust in short supply?

Let’s face it, we call mainframes ‘big iron’ — so that already makes them sound too heavy for the cloud. 

These workhorses are engineered to continuously handle the enterprise’s most critical work with almost zero downtime, as they churn out massive quantities of data to storage volumes that have their own gravitational pull.

Enterprises want to develop new, innovative functionality atop this valuable data, from advanced real-time analytics to training machine learning models that drive new AI-enhanced features.

At the same time, they can’t afford to make mistakes that could put mainframe data in jeopardy, whether it happens because of a system or process failure, or a cyberattack.

The conventional ETL (extract, transform, load) method of migrating mainframe data to other systems isn’t an easy task. Backing up a mainframe data store to virtual tape can take a day or two, and extracting data for productive use in applications might require an engineer or analyst to take off for the evening while a batch runs.

How can we modernize the way we handle mainframe data, so it is better suited for hybrid cloud infrastructures, and trust that it will integrate into our overall application portfolio strategy?

Merging mainframe intelligence with optimized cloud data delivery

Mainframes are quickly becoming first-class citizens of cloud architectures, as new solutions safely expose their data for modern workloads.

A prime example of this movement came when BMC recently announced its acquisition of Model9, an innovator in mainframe cloud data management and storage.

The company’s unique approach flipped the script on ETL and instead leveraged ELT (extract, load, transform), using on-premises agents running on the mainframe’s zIIP engines to push data in massively parallel chunks to the cloud, where processing workloads can be elastically spun up and discarded when done.
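
BMC doesn’t publish the agent internals, but the pattern is easy to picture. Here’s a minimal Python sketch of the “extract and load” half of that pipeline, assuming an S3-compatible landing bucket and a staged copy of the dataset; the bucket name, object layout, and chunking helper are all hypothetical, and transformation is deliberately deferred to cloud compute.

```python
# Minimal sketch of a parallel-chunk "EL" push to object storage.
# Assumptions (not from the source): an S3-compatible bucket named
# "mainframe-landing-zone", a staged binary copy of the dataset, and a
# chunk-per-object layout. The real agents run on zIIP engines and are
# proprietary; this only illustrates the parallel-chunk pattern.
import concurrent.futures

import boto3

s3 = boto3.client("s3")
BUCKET = "mainframe-landing-zone"  # hypothetical landing bucket

def read_chunks(path, chunk_size=64 * 1024 * 1024):
    """Yield (index, bytes) blocks from the staged dataset."""
    with open(path, "rb") as f:
        index = 0
        while data := f.read(chunk_size):
            yield index, data
            index += 1

def upload_chunk(indexed_chunk):
    index, data = indexed_chunk
    key = f"dataset-0042/part-{index:06d}"  # hypothetical object naming
    s3.put_object(Bucket=BUCKET, Key=key, Body=data)
    return key

# Many concurrent streams, not one fast stream, shrink the transfer window;
# transformation happens later on elastic cloud compute, once the data lands.
with concurrent.futures.ThreadPoolExecutor(max_workers=32) as pool:
    for key in pool.map(upload_chunk, read_chunks("/staging/dataset-0042.bin")):
        print("uploaded", key)
```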

When combined with modern mainframe data management and intelligent data processing capabilities within the new BMC AMI Cloud suite, business trust in cloud mainframe data is restored, exemplified in three key use cases: object storage, analytics transformation, and cyber resiliency.

Fast cloud object storage at lower costs

The most common constraint to moving mainframe data into cloud resources is the age-old challenge of speeds and feeds: the time and expense of moving so much data away from its center of gravity to a place where it can be productively exploited for advanced work, such as training a machine learning model to recognize fraud.

Conventional processes can be especially slow and incur unplanned cloud storage costs, so time and budget constraints mean data archives are run less often and fall out of sync with each other. Many companies just give up and settle for parallel storage volumes within the same data center.

With BMC AMI Cloud Data, a faster ELT approach opens up the throttle for cloud export: once data arrives, elastic cloud resources can transform proprietary data formats into open formats like JSON or CSV for ingestion into a cloud data warehouse. If needed, data can move just as quickly from cloud back to mainframe when a workload, such as a trained AI inference engine that detects fraud, is better suited to run on-premises.
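
What that cloud-side “transform” step looks like depends entirely on the dataset’s record layout. As an illustration only, here is a small Python sketch that decodes fixed-width EBCDIC records into CSV; the 80-byte record length, field layout, and file names are invented for the example, and a real job would derive them from the dataset’s COBOL copybook.

```python
# Illustrative cloud-side transform: fixed-width EBCDIC records -> CSV
# for warehouse ingestion. Record length, field layout, and file names
# are hypothetical stand-ins for what a copybook would define.
import csv

RECORD_LEN = 80
LAYOUT = [  # (column name, start byte, end byte) -- hypothetical
    ("account_id", 0, 10),
    ("tx_amount", 10, 22),
    ("tx_date", 22, 30),
]

with open("landed/part-000000", "rb") as src, \
     open("transactions.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow([name for name, _, _ in LAYOUT])
    while record := src.read(RECORD_LEN):
        text = record.decode("cp037")  # cp037 = common US EBCDIC code page
        writer.writerow([text[start:end].strip() for _, start, end in LAYOUT])
```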

Expose and transform mainframe data for analytics

Companies need the freshest possible data delivered from core mainframes and applications to make better decisions and gain an edge over the competition.

The data layer of a real-time analytics application needs to scale to support thousands of concurrent application processes and user queries. If incoming mainframe data takes too long to arrive and get processed for useful work, sub-optimal decisions will be made on stale data.

The BMC AMI Cloud Analytics solution allows fast queries and massive concurrency across both event-based and historical data sources, while eliminating unnecessary development and maintenance labor as well as cloud storage and compute costs.

One large US transportation company was limited to scheduling ETL imports of only 20 DB2 tables a day from its mainframe directly to Snowflake. By parallelizing the transport, it was able to move 2,000 DB2 tables into AWS during off-hours every night for cloud analytics work in Snowflake, meaning it could plan against a complete business model every day.
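
The interesting part of that result is the shape of the change, not any particular API: a serialized nightly job became a fan-out across tables. A rough sketch, with a stubbed unload_table() standing in for whatever DB2 unload utility and S3 upload the real pipeline used:

```python
# Sketch of the change in shape: a serialized nightly export becomes a
# worker-pool fan-out across tables. unload_table() is a stub standing in
# for the real DB2 unload plus an upload to the S3 landing area that
# Snowflake ingests from; table names here are placeholders.
import concurrent.futures

def unload_table(table: str) -> str:
    # Real version: run the DB2 unload for `table`, then push the result
    # to s3://landing/<table>/ for Snowflake to COPY from.
    return table

tables = [f"SALES.TABLE_{i:04d}" for i in range(2000)]  # placeholder names

with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
    for done in pool.map(unload_table, tables):
        print("unloaded", done)
```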

Reduce ransomware threats with cyber resilient backups

Ransomware, in all of its insidious forms, is on the tip of every CISO’s tongue right now. On average, a successful ransomware breach sets a company back $1.85 million, and we can expect several million attacks a year.

In many ways, our old non-cloud data stores were safer from such threats, because with a secure access perimeter in place, it was much harder for hackers to learn how to break in and execute remote actions on a mainframe.

Now, fully functional automated attack tools are readily available on the dark web to professional and amateur saboteurs alike. Attackers won’t stop at mere data encryption or exfiltration: 90% of ransomware attacks also go after backups.

Enterprises are more conscious than ever of the need for safe, frequently updated data archival. To maintain business continuity, the most critical operational and transaction data is saved in secondary or tertiary backups in multiple geographic regions.

Auditors will want to know that mainframe backups are secure from accidental deletion, corruption, or tampering, so some mainframe data is destined for physical drives or virtual tape solutions that are read-only in nature and physically air-gapped from the internet.

BMC AMI Cloud Vault can take the customer’s mainframe data that was destined for secondary storage and archive it directly to secure, read-only cloud object storage without the need for virtual tape. The ability to run archival jobs across hybrid cloud and remote drives at tighter intervals keeps the volumes in sync with each other, which is especially valuable for multinational businesses and government data estates.
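
The source doesn’t detail Vault’s storage mechanics, but cloud object stores do offer native write-once semantics that match the read-only requirement above. As one illustrative, non-authoritative example, here is how an S3 Object Lock retention could be applied with boto3; bucket and file names are hypothetical, and the bucket must have been created with Object Lock enabled.

```python
# Illustrative only: write-once backup objects via S3 Object Lock in
# compliance mode. Not a description of BMC AMI Cloud Vault's internals.
from datetime import datetime, timedelta, timezone

import boto3

s3 = boto3.client("s3")
retain_until = datetime.now(timezone.utc) + timedelta(days=365)

with open("backup-20230601.bin", "rb") as f:
    s3.put_object(
        Bucket="mainframe-vault",            # hypothetical vault bucket
        Key="archives/backup-20230601.bin",
        Body=f,
        ObjectLockMode="COMPLIANCE",         # retention cannot be shortened, even by the account root
        ObjectLockRetainUntilDate=retain_until,
    )
```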

The Intellyx Take

The old model of keeping data volumes and backups in the data center where they are directly adjacent to the mainframe was shattered forever when the cloud took off.

Application developers, data scientists and security analysts still want to take advantage of the compute power and vital data provided by mainframes, but they also want to be able to use it in the cloud for innovative work. 

Hopefully the moral of this story isn’t too preachy, as we head into a hybrid IT future where we unlock the value of mainframe data, so it can safely go wherever it is needed most.

 

©2023 Intellyx LLC. Intellyx is solely responsible for the content of this document, and no AI bots were used to write it. At the time of writing, BMC is an Intellyx subscriber. Image source: craiyon.ai.


Principal Analyst & CMO, Intellyx. Twitter: @bluefug