In a technology world where many computer hardware platforms are replaced by new generations at lightning speed, mainframes may be the exception. Big iron is seeing a resurgence thanks in large part to big data requirements.
A 2012 study of 623 mainframe users commissioned by CA Technologies and conducted by Decipher Research found that 51 percent of U.S. respondents planned to increase spending on mainframe software in the next 12 to 18 months. What’s more, an impressive 81 percent of the global respondents viewed the mainframe as highly strategic.
A primary driver behind this revival is the sheer data volume and the organizational hurdles posed by today’s real-time information needs -- especially when it comes to enterprise-level concerns such as reliability and security. “The heavy lifting is still done by the mainframe and System z,” Scott Fagen, Distinguished Engineer and Chief Architect for the mainframe business at CA Technologies, told me recently.
“At the end of the day, about 70 percent of all the world’s transactions are happening on the System z platform.”
As a result, businesses are realizing that since much data is already on their in-house mainframe, they might as well keep big data applications there, too. “We are seeing a great amount of interest among customers with [mainframe] Linux capacity, in taking that data and doing some very quick processing, and not having to worry about security or reliability,” Fagen added. “They can run Hadoop and analytics on-premises in the data center.”
Virtualization and Bandwidth
Boosting the move to the mainframe for big data processing and analytics are IBM’s newest System z servers, the zEC12 and zBC12. “The virtual network provided inside an IBM z box is perfect for big data needs,” Fagen said. These System z platforms offer massive virtualization and huge networking bandwidth, and they can handle enormous data-crunching projects.
Others are recognizing big iron’s essential role in big data, too. Syncsort, for example, offers software to speed the movement of data from the mainframe to the Hadoop Distributed File System. “Most corporate data is still stored on the mainframe, and a lot of hard work is involved in accessing that mainframe data,” explained Lonne Jaffe, Syncsort’s CEO, a former head of strategy at CA Technologies and an ex-IBMer.
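Jaffe’s “hard work” is easy to see in miniature: classic mainframe datasets are typically EBCDIC-encoded with fixed-length records, so the data has to be transcoded before Hadoop tools can read it. The Python sketch below illustrates that general problem -- the dataset dump, record length and HDFS path are hypothetical, and this is a toy illustration, not Syncsort’s product:

    # Minimal sketch (file names, record length and HDFS path are assumed):
    # decode a fixed-record EBCDIC dataset dump, then load it into HDFS.
    import subprocess

    EBCDIC = "cp037"  # a common U.S. EBCDIC code page

    def transcode_dataset(src_path, dest_path, lrecl=80):
        """Decode a RECFM=FB (fixed-length record) EBCDIC dump to UTF-8 lines."""
        with open(src_path, "rb") as src, \
             open(dest_path, "w", encoding="utf-8") as dest:
            while True:
                record = src.read(lrecl)  # one fixed-length record per read
                if not record:
                    break
                dest.write(record.decode(EBCDIC).rstrip() + "\n")

    def put_in_hdfs(local_path, hdfs_dir):
        """Load the converted file into HDFS via the standard Hadoop CLI."""
        subprocess.run(["hadoop", "fs", "-put", local_path, hdfs_dir], check=True)

    transcode_dataset("cust_master.ebcdic", "cust_master.txt")
    put_in_hdfs("cust_master.txt", "/data/mainframe/")

Even this toy version has to know the record length and code page up front, which is exactly the kind of metadata wrangling that purpose-built transfer tools automate.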
Of course, using the mainframe as a platform for analytical processing has its drawbacks. From a cost standpoint, doing your predictive analytics in the cloud may be cheaper. According to a recent IBM Systems Magazine article, “If you do a Fit for Purpose analysis for Hadoop on the mainframe, it will be difficult to make the hardware costs work out compared to commodity x86 processors, but you certainly would come out ahead on issues like reliability or performance.”
Another potential downside to mainframe dependency is the relative dearth of mainframe computing skills at many enterprises today. “While IT professionals recognize there is a mainframe skills shortage, no one seems to be making the link to big data,” writes Pedro Pereira in a blog post on the SHARE site.
“Grooming mainframe talent is a priority for a growing number of enterprises,” Pereira adds. SHARE is an association that provides enterprise professionals with education, training and networking.
The bottom line is that as companies embrace their internal mainframe capabilities, they will also need to reassess their staffing to maximize the benefits.