Modern Portals To Quality For Mainframe Transformation

BrainBlog for CloudFrame by Jason English

Mainframe modernization can feel like a scene from Poltergeist, or any other thriller movie where the protagonist is trapped in an endlessly stretching hallway with a door that keeps receding into the distance.

Ask anyone who’s survived a major enterprise upgrade. Transforming thousands of files of COBOL and JCL code into a form that will play nicely with modern Java architecture and cloud computing infrastructure seems to be an endless concern with few off-ramps.

Even if the application code and the mainframe supporting it are translated, how will we know when the job is really done? How will we know if it is high quality? And how should we define quality, anyway?

Let’s explore these questions of quality and see if we can find a safe passage for mainframe transformation without carrying forward any of the mainframe baggage.

From old code to new gold

The most obvious difficulty of refactoring mainframe code to an object-oriented language is that it is unlikely to work as it once did.

Even if it does work, you could end up with results professionals lovingly refer to as “JOBOL” – which is either a job-oriented language started in the 1970s or a chaotic leprechaun at the end of the rainbow with ‘Just o’Bunch O’Lines’ of code.

Unlike object-oriented programming and relational databases, compute and data operations can be highly intertwined within legacy mainframes. For instance, the result of a CICS transaction in COBOL may depend upon its location relative to other commands and may reference a record in a specific location to complete its operations.

Mainframes also have their own flavor of utilities – for instance, an older IBM mainframe can have its own file transfer and sort utilities. Some vendors like Micro Focus have encapsulated these common utilities into their own COBOL platforms. Still, these proprietary elements must also become maintainable code if and when they are transformed.

Procedural legacy code can’t simply be translated to Java; it must be transliterated so that it makes logical sense when it arrives, using a framework that understands how to split code that was never object-oriented in the first place into components useful to developers working in Java today.

Spring Batch is an open-source batch processing framework that provides a solid substrate for converting existing COBOL into a more declarative format ready for Java developers to redeploy into Spring Boot or microservices-ready architectures without having to learn legacy coding concepts.
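To make that idea a little more concrete, here is a minimal sketch (assuming a Spring Boot application that supplies the JobRepository and transaction manager beans) of what one converted batch program might look like as a chunk-oriented Spring Batch job. The AccountRecord layout, file name, and business logic are hypothetical placeholders for illustration, not CloudFrame’s actual output.

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.job.builder.JobBuilder;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.FixedLengthTokenizer;
import org.springframework.batch.item.file.transform.Range;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class AccountBatchConfig {

    // Hypothetical record standing in for a COBOL copybook layout.
    public record AccountRecord(String accountId, String holderName, long balanceCents) {}

    @Bean
    public FlatFileItemReader<AccountRecord> accountReader() {
        // Fixed-width columns mirror the positional fields a copybook would define.
        FixedLengthTokenizer tokenizer = new FixedLengthTokenizer();
        tokenizer.setColumns(new Range(1, 10), new Range(11, 40), new Range(41, 52));
        tokenizer.setNames("accountId", "holderName", "balanceCents");

        DefaultLineMapper<AccountRecord> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(fields -> new AccountRecord(
                fields.readString("accountId").trim(),
                fields.readString("holderName").trim(),
                fields.readLong("balanceCents")));

        FlatFileItemReader<AccountRecord> reader = new FlatFileItemReader<>();
        reader.setName("accountReader");
        reader.setResource(new FileSystemResource("accounts.dat")); // placeholder input file
        reader.setLineMapper(lineMapper);
        return reader;
    }

    @Bean
    public Step accountStep(JobRepository jobRepository, PlatformTransactionManager txManager) {
        // Chunk-oriented processing: read, transform, and write 100 records per transaction,
        // roughly analogous to a COBOL program looping over a sequential file with periodic commits.
        return new StepBuilder("accountStep", jobRepository)
                .<AccountRecord, AccountRecord>chunk(100, txManager)
                .reader(accountReader())
                .processor(item -> item) // converted business rules would go here
                .writer(chunk -> chunk.forEach(r -> System.out.println(r.accountId())))
                .build();
    }

    @Bean
    public Job accountJob(JobRepository jobRepository, Step accountStep) {
        // A declarative job definition taking the place of what JCL once described.
        return new JobBuilder("nightlyAccountJob", jobRepository)
                .start(accountStep)
                .build();
    }
}
```

The point of the sketch is the shape, not the specifics: the job, step, reader, processor, and writer are discrete, testable components that a Java developer can reason about and redeploy independently.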

From there, intelligent rules-based automation should ensure that existing mainframe workflows are replicated in the new, decoupled architecture. Then, you can just sit back and watch the cost savings on eliminated MIPS roll in, right? It looks like we still need to make sure…

Read the entire BrainBlog here.


Principal Analyst & CMO, Intellyx. Twitter: @bluefug