On a conference call with reporters this week, analysts Matthew Lawton of Framingham, Mass.-based IDC and Andreas Antonopoulos, a senior vice president at Mokena, Ill.-based Nemertes Research Group Inc., sought to connect the dots between these technologies and show how the combination affects IT management in the areas that matter most: budget, operational efficiency and application recoverability.

Linux and open source sitting in an ROI tree
Lawton laid out the case for Linux and open source; research indicates that the key drivers for implementing open source in mission-critical areas have begun to overshadow the downsides.
For example, Lawton cited an April 2007 IDC report that asked IT managers about top decision-making and purchasing criteria. The functionality, scalability and reliability of open source software topped the list. "The ones at the bottom of the list were protection from vendor lock-in, indemnification concerns, source code access and the ability to redistribute code," Lawton said.
In short, end users today care less about whether they can tweak code and more about what the software does and how, Lawton said. The good thing for Linux is that a growing majority of end users still agree that the use of Linux results in considerably lower total cost of ownership (TCO) when compared with closed source proprietary alternatives.
"[IT managers] believe that open source saves their organization money," Lawton said. "[Users] also stated they were actively searching out open source alternatives and that a high percentage of their budgets were focused on the use of open software."
Overall, today's data centers are "bullish around cost opportunity," Lawton said, but they also view product performance and functionality as other important evaluation areas.

The negatives subside
As open source infrastructure such as Linux has matured in the data center on both cost and functionality, IDC concluded that the inhibitors to adoption have decreased. The main inhibitors, Lawton said, were copyright issues and support concerns.
SearchEnterpriseLinux.com's own 2006 user survey showed Linux support remained a thorn in the side of today's data center manager, but IDC's report found that this concern had waned considerably by April. "[Patent and support concerns] did not rate as highly on the inhibitors scale as the key drivers did on their scale. Overall this is good news for [Linux and open source]," Lawton said.
IDC's report also found that 86% of infrastructure projects in the enterprise based on Linux were of critical or high importance to the organization. On the application side, open source applications were of critical or high importance 77% of the time.

Server virtualization matures
Antonopoulos of Nemertes addressed how virtualization has evolved from a simple tool for commodity server consolidation into a broader and more influential role. "Virtualization is driving subtle changes in the enterprise environment, but it is often misunderstood," he said. "There is a growing maturity in the way virtualization is being adopted today."
The "sardine in a can" approach to virtualization, cramming as many virtual machines into a server as possible, has given way to innovative approaches and tremendous ROI opportunities, Antonopoulos said.
"Certainly consolidation as a trend has been a primary driver behind virtualization; it was the highest-rated driver among enterprises in our research on the next-generation data center in 2006," Antonopoulos said.

Today, however, virtualization has moved past consolidation into at least two other areas: (1) application development and deployment and (2) recoverability.

Rapid application deployment and recovery
As commodity virtualization becomes better understood, IT managers have begun to use it for its vertical scaling abilities, Antonopoulos said. "[Users] test on a smaller server under a lighter load and then move that application to a larger server with a heavier load. While the scaling would change, none of the configurations would change," he said.
Using virtualization this way has allowed IT managers to essentially hide the underlying differences between servers, including size, brand (HP or IBM, for example) and the number of processors. It "allows enterprises to have heterogeneous data centers, use them in a more flexible manner, and deploy applications to more suitable servers without worrying about making configuration changes," Antonopoulos said.
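The hardware abstraction Antonopoulos describes shows up concretely in how a virtual machine is defined. As a minimal, hypothetical sketch using libvirt's domain XML format (the name, sizes and disk path are illustrative, not from the article):

```xml
<!-- Hypothetical libvirt domain definition: the guest is described only
     in terms of virtual resources (vCPUs, memory, a disk image), never
     the host's brand, socket count or storage layout. -->
<domain type='kvm'>
  <name>webserver</name>
  <memory>2097152</memory>  <!-- guest RAM in KiB (2 GB) -->
  <vcpu>2</vcpu>
  <os>
    <type arch='x86_64'>hvm</type>
  </os>
  <devices>
    <disk type='file' device='disk'>
      <source file='/var/lib/libvirt/images/webserver.img'/>
      <target dev='vda' bus='virtio'/>
    </disk>
  </devices>
</domain>
```

Because the definition names only virtual resources, the same file could in principle be loaded with `virsh define` on an HP or IBM host without configuration changes, which is the flexibility the analysts point to.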
Virtualization has also had a hand in reshaping application recovery. Traditionally, protecting an application meant configuring it with special software agents and pathways to recover it from a remote data center in the event of a disaster. With virtualization, however, many companies today are syncing storage infrastructure between data centers and then spawning secondary applications in a remote data center without worrying about changes in hardware or density, Nemertes Research found.
"By putting applications and operating systems [like Linux] inside a virtual machine … businesses can launch new copies of their Web server, etc., in a very short period of time: minutes versus weeks," Antonopoulos said.
"This represents a fundamental shift in IT culture," he continued, "a shift to a pool of virtual servers that makes applications available based on service-level agreements, [letting businesses] model their IT more dynamically and respond to changes on demand."
Email Jack Loftus with your comments and suggestions.