

The Mainframe: Reports of Its Demise Are Premature

Mainframes are back; well, they never really went away.

What goes around surely comes around. A recent announcement by SAS Institute that its Enterprise Intelligence Platform software (data integration and business intelligence, including OLAP and advanced analytics) runs on IBM System z mainframe platforms, coupled with the company’s introduction of mainframe-friendly (sub-capacity) pricing, serves to remind us that rumors of the mainframe’s death have been exaggerated.

Most enterprise computing environments entail a combination of hardware and software technologies. As SAS clearly recognizes, mainframes are suited to more than large-volume, mission-critical OLTP applications (e.g., airline and hotel reservation systems, banking and credit card applications) or computationally intensive problems for science and other types of research.

In fact, mainframes can be appropriate in data warehousing and business intelligence environments as well. Important characteristics of mainframes (including their high availability and reliability, as well as their ability to handle massive amounts of input and output) are certainly appropriate for today’s enterprise-wide business intelligence needs. Most mainframes are general-purpose machines and thus can host both operational and analytic applications simultaneously.

Today’s mainframes have clearly evolved from their early days, when they utilized punched cards or paper tape for input (and sometimes even output). Back then, they typically ran a single proprietary operating system, and some even relied upon a single arithmetic processing unit that could only add or subtract (multiplication and division were accomplished by software routines using repetitive additions and subtractions, respectively).
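To illustrate the kind of software routine those early add/subtract-only machines depended on, here is a minimal sketch (in Python, purely for illustration; the originals would have been assembly-language subroutines) of multiplication by repeated addition and division by repeated subtraction:

```python
def multiply(a, b):
    """Multiply non-negative integers using only addition,
    as software routines did on add/subtract-only hardware."""
    result = 0
    count = 0
    while count < b:
        result += a
        count += 1
    return result

def divide(dividend, divisor):
    """Integer division via repeated subtraction;
    returns (quotient, remainder)."""
    quotient = 0
    while dividend >= divisor:
        dividend -= divisor
        quotient += 1
    return quotient, dividend
```

Note the cost: multiplying by b takes b additions, which is one reason hardware multiply units were such a welcome advance.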

Fast forward to the modern world, and we have virtualization techniques that allow mainframes to host multiple operating systems and emulate other hardware platforms. The extreme air-conditioning (or even water-cooling) requirements and high electrical consumption that were a critical part of the mainframe’s “glass house” have been drastically reduced. Tubes have given way to transistors, followed by integrated circuits; memory has evolved from electro-mechanical relays to magnetic cores to solid-state arrays, and now encompasses optical as well as electrical and magnetic technologies.

Devices that can accommodate more than two stable states, or perhaps use quantum technology to store multiple states simultaneously, may someday replace the traditional binary 0/1 memory states. In both absolute terms and relative to performance, mainframe prices have continued to decline dramatically. For example, in the late 1960s and early 1970s, IBM System/360 memory cost approximately $1 per byte, and the amount that could be attached was limited (e.g., IBM limited directly attached memory on the 360 Model 30 to less than 100,000 bytes). By 2003, a megabyte of memory for the IBM zSeries mainframe was priced at approximately $10 (and probably even less today), and memory capacity is now measured in gigabytes.
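A quick back-of-envelope calculation, using only the approximate figures quoted above, shows just how steep that per-megabyte decline is:

```python
# Rough comparison using the article's approximate figures.
price_1970_per_byte = 1.00                              # ~$1 per byte, System/360 era
price_1970_per_mb = price_1970_per_byte * 1024 * 1024   # ~$1,048,576 per megabyte

price_2003_per_mb = 10.00                               # ~$10 per megabyte, zSeries (2003)

decline_factor = price_1970_per_mb / price_2003_per_mb
print(f"Roughly {decline_factor:,.0f}x cheaper per megabyte")
```

In other words, a roughly 100,000-fold drop in the per-megabyte price over about three decades, before even accounting for today's lower prices.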

In general, although technology is continuously changing, products utilizing the technology can evolve to take advantage of such changes. Despite two decades of predictions concerning their demise, mainframes aren’t going anywhere. IT organizations and the vendors targeting them need to keep their options open and recognize that mainframes are part of the overall computing environment and will likely remain so for a long, long time. Organizations should select a computing environment that is suitable for their needs and, if appropriate, not be afraid to consider a mainframe as part of that environment. If an organization is primarily mainframe focused, it should also consider how other hardware technologies might support the mainframe.

Hardware technologies such as mainframes, mid-range servers, appliances, and PCs can all have their place in data warehouse architectures. Organizations should consider their strengths and weaknesses and how they best complement each other and then deploy them accordingly.

About the Author

Michael A. Schiff is founder and principal analyst of MAS Strategies, which specializes in formulating effective data warehousing strategies. With more than four decades of industry experience as a developer, user, consultant, vendor, and industry analyst, Mike is an expert in developing, marketing, and implementing solutions that transform operational data into useful decision-enabling information.

His prior experience as an IT director and systems and programming manager provides him with a thorough understanding of the technical, business, and political issues that must be addressed for any successful implementation. With Bachelor's and Master of Science degrees from MIT's Sloan School of Management, and as a certified financial planner, Mike can address both the technical and financial aspects of data warehousing and business intelligence.
