The Challenge of Enterprise Batch Processing
Since the dawn of enterprise computing, managers have relied on computer operators to run batch data-processing jobs. The goal: generate reports that managers could use to decide how to run their companies or government agencies.
Today, those large binders filled with perforated pages striped in glorious green and white may be long gone, but batch processing remains an important tool in the modern IT executive’s tool belt. What has changed: the quantity of data to process and the complexity of the jobs.
Modern technology makes easy work of the simpler tasks, to be sure. The jobs that remain fall into the domain of high-performance computing (HPC), a specialized area of IT that leverages high-end technology to complete these massive jobs.
According to Gartner, HPC includes batch computing as well as data analytics and other one-time (but potentially recurring), short-term, large-scale, scale-out workloads. Furthermore, because of its high capacity but sporadic usage patterns, batch computing is particularly well-suited to public cloud IaaS, where it can be exceptionally cost-effective. Many HPC workloads depend on a high degree of automation, further reinforcing their suitability for public clouds.
HPC is perhaps most familiar in the context of scientific applications like genome research and seismology. However, there are significant differences between such applications and enterprise use cases. Scientific applications focus on answering a single question, and they typically have little impact on the bottom line. Enterprise HPC jobs, in contrast, run on a daily, monthly, or quarterly basis in order to support recurring, often mission-critical business processes.
Read the entire paper at http://offers.2ndwatch.com/Website-HPC-White-Paper.html.
2nd Watch is an Intellyx client. Intellyx retains full editorial control over the content of this paper.