An Intellyx BrainBlog for BMC, by Jason English
Everywhere you turn, enterprises are actively looking for ways to get on board the generative AI bandwagon, not just for fear of getting left behind, but because of its promise to help every member of the team eliminate non-value-add work and focus on more strategic goals.
There’s a huge array of LLMs, RAG pipelines, ML feature stores, and autonomous agents out there now, which can be combined in unique deployment models. Alongside tall tales of massive ROI figures, there are plenty of horror stories about AI hallucinations, intellectual property snafus, and user experience issues.
Even if your company has already established an AI/ML working group and set aside a war chest for merging GenAI into the application estate, where should it start?
You can invest in showy things like customer service chatbots and sentiment analysis, but if you really want to make an impact, this is the best time to start optimizing the beating heart of the enterprise: the mainframe.
The newly released BMC AMI Assistant is a GenAI-powered capability being infused within the BMC AMI suite of mainframe solutions, using open source LLMs augmented by the firm’s decades-long track record managing some of the world’s largest implementations.
Here are some pragmatic plays a company can make right now using GenAI capabilities for mainframe development, operations, and modernization that can eliminate toil and provide new insights to both experienced SMEs and newer mainframe team members.
1. Team Knowledge Transfer
When long-tenured mainframers retire or move on from their development and operations roles, organizations need to backfill their capabilities with newer mainframe engineers, who can be hard to find and train.
I recently sat in on an enlightening session of The Modern Mainframe Podcast where BMC’s Priya Doty brought up the 2024 BMC Mainframe Survey, which showed a 23% drop since 2019 in respondents with 20 or more years of mainframe experience, and a 14% increase in respondents with 6-10 years of experience. A generational shift toward more mid-career professionals is already under way.
When veteran developers depart, their tribal knowledge departs with them. It might be tempting for newer team members to lean too heavily on GenAI solutions here, but if they simply follow every suggestion, they will miss out on understanding what’s actually happening within the environment and the business context they are trying to support.
One useful tactic to bridge this gap is a staged rollout—where the most experienced engineers start interacting with GenAI first to automate repetitive tasks and support decisions—in order to better train the model and validate the predictive accuracy of the suggested results.
But even if everyone starts using a capability like BMC AMI Assistant simultaneously, if there’s one thing GenAI does really well, it’s documentation. As changes are made to modules, and insights are gathered from team events and communications, GenAI models can produce advice, summaries and documentation that pass on that earned tribal knowledge to the next cohort of mainframers.
2. Coding Assistance
Web-native developers coming into a mainframe group face a steep learning curve when trying to make changes and understand a complex COBOL codebase, with all of its procedures and interdependencies.
Even highly experienced mainframers can be flummoxed when joining a new group, or making updates to a legacy code region that was produced long ago—even if it was their own!
BMC AMI Assistant provides GenAI code explanation functionality within BMC AMI DevX Code Insights, with look-ahead recommendations that appear directly within the developer’s Eclipse or VS Code IDE as they code.
This functionality can not only explain the code to the developer, but also recommend documenting the explanation as a comment, helping users make sense of information and directing them toward the best course of action, while augmenting the model with feedback.
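The "explain, then document" loop described above can be sketched in a few lines. The toy Python function below is not BMC's API; `annotate_cobol`, the snippet, and the explanation text are hypothetical stand-ins. It shows the general pattern: an assistant produces a plain-language explanation of a code region, and the tooling persists it as `*>` comment lines so the knowledge stays with the code.

```python
def annotate_cobol(snippet: str, explanation: str) -> str:
    """Prepend an assistant-generated explanation to a COBOL region
    as comment lines (asterisk indicator in column 7)."""
    comment_lines = ["      *> " + line for line in explanation.splitlines()]
    return "\n".join(comment_lines + [snippet])

# Hypothetical example: a one-line COMPUTE statement and its explanation.
cobol = "           COMPUTE WS-TOTAL = WS-PRICE * WS-QTY."
explained = annotate_cobol(cobol, "Multiplies unit price by quantity.")
print(explained)
```

In a real assistant the explanation would come from the model rather than a hard-coded string, but the round trip, explain a region and write the explanation back as a comment, is the same.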
“AI assistants are now helping developers navigate and understand legacy mainframe code much more efficiently. Instead of manually going through lines of code, these tools can automatically annotate code snippets, generate test cases, and even create corresponding data sets. This is a huge time saver, especially in the QA process,” said Liat Sokolov, GenAI product manager for BMC AMI solutions.
“Think of it as a real-time coding companion offering suggestions based on the organization’s best practices and design patterns.”
3. Data Concierge
From a business analyst or mainframe ops perspective, GenAI models can take much of the non-value-added work out of searching through and correlating records from both on-prem data stores and cloud data warehouses.
Most mid-to-large-sized companies face a shortage of expert DBAs that mirrors the mainframe talent crunch, so any capability that helps team members understand the database structure and the context of a set of Db2 records, without writing complex queries and joins, would eliminate countless hours of toil.
In this sense, GenAI becomes a “data concierge” that gets better over time at reliably predicting the mainframe team member’s needs based on simple prompts and providing results and accompanying content and explanations in return.
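To make the "complex queries and joins" point concrete, here is a minimal sketch of the kind of cross-table lookup a data concierge could generate from a plain-language prompt such as "show total transaction value per customer." The schema, table names, and data are hypothetical, and Python's built-in sqlite3 stands in for a Db2 environment:

```python
import sqlite3

# Hypothetical two-table schema standing in for a Db2 data store.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (acct_id INTEGER PRIMARY KEY, customer TEXT);
    CREATE TABLE transactions (txn_id INTEGER PRIMARY KEY,
                               acct_id INTEGER REFERENCES accounts,
                               amount REAL);
    INSERT INTO accounts VALUES (1, 'ACME'), (2, 'Globex');
    INSERT INTO transactions VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
""")

# The join-and-aggregate an analyst would otherwise hand-write:
# "show total transaction value per customer."
rows = conn.execute("""
    SELECT a.customer, SUM(t.amount) AS total
    FROM accounts a JOIN transactions t ON t.acct_id = a.acct_id
    GROUP BY a.customer ORDER BY total DESC
""").fetchall()
print(rows)  # [('ACME', 350.0), ('Globex', 75.0)]
```

The value proposition is not the SQL itself but that the analyst never has to write it: the concierge translates the prompt, runs the query, and returns results with accompanying explanation.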
Further, as more data is fed through GenAI agents, they can assist in automating the processing and cleansing of model training data, for instance during a data meta tagging process or migration. But where that training data comes from matters.
“The quality of LLMs in the generative AI space is all about the quality of the data that was used for training purposes,” said Anthony DiStauro, R&D Solution Architect at BMC. “We’ve got to be very, very careful about appropriately scrubbing data that we’re using for training material.”
4. Rapid Response & Recovery
AIOps practices and solutions have been around for a few years throughout the broader hybrid cloud application estate—where metrics and alerts from observability platforms are filtered, then either translated into automated actions or escalated as an incident to an SRE (site reliability engineer) to resolve performance and security issues.
The mission-critical, highly sensitive nature of mainframe environments makes them opaque to such born-in-the-cloud solutions, meaning a generalist SRE would have to throw issues “over the wall” for mainframe SysProgs to resolve.
GenAI models trained within a private enterprise mainframe environment can simplify operations for system programmers and automation engineers, for instance, in explaining complex REXX rules or identifying the root causes of incidents in real time.
Using BMC AMI Assistant in the context of BMC AMI Ops Insight gives SysProgs of all experience levels natural language explanations of potential outages or performance issues, with a dossier of actionable insights that speed up resolution processes.
5. Compliance and Policy Guardrails
In the mainframe world, trust matters deeply, so sending enterprise data to a SaaS-based GenAI solution is a no-go for data security and operational resilience reasons.
The notion that an LLM chat model or AI inference feature must include billions or trillions of parameters like the latest Microsoft or Google offering is outdated. After all, it’s far more efficient to train a GenAI model on COBOL syntax and CICS in a specific local environment than it is to teach an LLM to communicate in English on any topic with the nuance of a native speaker!
Better still, because GenAI is so good at assisting with knowledge transfer and documentation, it can also make compliance and security auditing exercises much less of a headache.
Fortunately, you don’t have to trust a SaaS service with training or operational data from your trusted systems. Best-in-class enterprises don’t just set KPI targets and financial goals for their AI initiatives; they also set forth clear policies for evaluating and adopting GenAI within their most critical environments.
The Intellyx Take
There’s no shortage of new and emerging LLMs and machine learning tools on the frontier, and therefore no one-size-fits-all GenAI solution.
Modern mainframes have the power and capacity to handle specialized GenAI workloads such as code reviews and data filtering alongside their ongoing operations, and they can do so with far lower latency and fewer data security concerns.
BMC’s approach of infusing GenAI training, automation and insights into the everyday work environments of mainframe professionals of all seniority and skill levels offers a low-risk way to get beyond the talent gap and start thinking bigger about the future.
©2024 Intellyx B.V. Intellyx is editorially responsible for this document. At the time of writing, BMC is an Intellyx subscriber. None of the other organizations mentioned here are Intellyx customers. No AI bots were used to write this content. Image sources: Adobe Image Express (author composited).