IBM and the Changing Face of Virtualization
04/07/2011

IBM’s recent announcement of new “virtualization” capabilities had the usual quota of significant improvements in customer value-add. Its real significance, however, was as a signpost to a further evolution in the meaning of “virtual”, a step forward that, like each step before it, drives new user efficiencies and effectiveness.


The focus of the announcement was Intel-based hardware, and the IBM briefing pointed out ways in which the IBM solution went beyond so-called “basic” virtualization on PC-server-style Intel platforms, with resulting 25-300% improvements in costs, TCO (total cost of ownership), utilization, power, and time expended. The extensions included the usual (for those following IBM) suspects: Tivoli (especially Systems Director), System x with blades, CloudBurst, SONAS, IBM Service Delivery Manager, and popular third-party software from the likes of VMware, CA, BMC, and Microsoft. The emphasis was on improvements in consolidation, workload management, process automation, and solution delivery. Nothing there to signal a significant step forward in virtualization.


Or is there? Here’s what I see as the key step forward, and the key value-add. Sorry, it’s going to require a little history.


Whither Virtual?

The word “virtual”, used to mean a disconnect from physical computers, has actually changed quite a bit since its beginnings in the 1960s, and so virtualization, meaning changing existing architectures in the direction of such a disconnect, is a far bigger job now than it was then. It started with “virtual memory”: the idea that a small amount of RAM could be made to look like a massive amount of main memory, at 80-90% of the performance of physical RAM, by judicious access to greater-capacity disk. Underlying this notion, now enshrined in just about every computer, was the idea of a “veneer” or “false face” presented to the user, application, or administrator, under which software desperately labored to make the virtual appear physical.
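
To make that “veneer” concrete, here is a minimal demand-paging sketch in Python. All names and sizes are invented for illustration; no real memory manager works at this toy scale.

```python
# Minimal demand-paging sketch: a few physical frames "front" a much larger
# backing store, evicting the least-recently-used page on a page fault.
from collections import OrderedDict

PAGE_SIZE = 4096
NUM_FRAMES = 4           # a tiny "RAM": only 4 frames of physical memory

disk = {}                # page_number -> bytes (the "greater-capacity disk")
frames = OrderedDict()   # page_number -> bytes, kept in LRU order (the "RAM")

def read_byte(virtual_address: int) -> int:
    """The veneer: callers see one big, flat memory."""
    page, offset = divmod(virtual_address, PAGE_SIZE)
    if page not in frames:                                # page fault
        if len(frames) >= NUM_FRAMES:
            victim, data = frames.popitem(last=False)     # evict oldest page
            disk[victim] = data                           # write it back to disk
        frames[page] = disk.get(page, bytes(PAGE_SIZE))   # load (or zero-fill)
    frames.move_to_end(page)                              # mark as recently used
    return frames[page][offset]
```

A caller can read bytes anywhere in a huge virtual range while only four pages ever occupy “RAM”, which is exactly the false face the 1960s designers pulled off.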


Shortly afterwards, in the late 1960s and early 1970s, the “virtual machine” appeared, in such products as IBM’s CP-67 and VM/370. Here, the initial meaning of “virtual” was flipped on its head: instead of placing a huge “face” on a small physical amount of memory, a VM put a smaller “face” mimicking a physical computer over a larger physical one. At the time, with efficient performance the byword of vendors, VMs were for the upper end of the market, where running multiple “machines” on one large computer provided a concurrency and isolation that in certain cases made it worthwhile to use VMs, instead of a monolithic operating system, to run multiple applications on a single machine. And so it remained, pretty much, until the 1990s.


In the 1990s, the Internet and gaming brought to consumers’ attention the idea of “virtual reality”: essentially, a “false face” over an entire computer, a set of computers, or even the Internet itself, creating full-fledged fantasy worlds à la Second Life. At almost the same time, Sun’s espousal of Java brought the notion of VMs as portable “single-application computers” across platforms and vendors. Both of these added the notion of virtuality across multiple physical machines.


The key addition to the meaning of “virtual” over the last decade has been the notion of “storage virtualization”, and more recently a variant popularized by Composite Software, “data virtualization”. In this case, the disconnect is not so much between one physical part of a machine and another, or even between two machines, but between programs and data, each spread across physical machines. The “veneer” here presents physical storage of data (even across the Internet, in cloud computing) as one gigantic “data store”, to be accessed by one or more “services” that are themselves collections of applications disconnected from specific physical computers.
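
As a rough illustration of that “gigantic data store”, here is a minimal data-virtualization sketch. The interfaces (DataSource, VirtualStore) are hypothetical, not Composite Software’s or any vendor’s actual API.

```python
# Minimal data-virtualization sketch: one logical "veneer" over several
# physical stores. All class and method names are illustrative only.
from typing import Callable, Dict, List

class DataSource:
    """One physical store; stands in for a database, file system, or cloud bucket."""
    def __init__(self, records: Dict[str, dict]):
        self.records = records

class VirtualStore:
    """The veneer: services query one logical store, never a physical machine."""
    def __init__(self, sources: Dict[str, DataSource]):
        self.sources = sources

    def get(self, key: str) -> dict:
        for source in self.sources.values():
            if key in source.records:          # caller never learns which machine answered
                return source.records[key]
        raise KeyError(key)

    def query(self, predicate: Callable[[dict], bool]) -> List[dict]:
        return [rec for src in self.sources.values()
                for rec in src.records.values() if predicate(rec)]

# One logical store spanning two "machines":
store = VirtualStore({
    "on_prem": DataSource({"cust1": {"region": "east"}}),
    "cloud":   DataSource({"cust2": {"region": "west"}}),
})
print(store.query(lambda r: r["region"] == "west"))   # [{'region': 'west'}]
```

The point of the design is the single query surface: callers get records back without ever naming, or knowing, the physical store that holds them.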


Note that at each stage, an extension of the meaning of virtual meant major cost, performance, and efficiency benefits for users – not to mention an increasing ability to develop widely-used new applications for competitive advantage. Virtual memory cost much less than physical memory. The virtual machine, as it evolved, allowed consolidation and movement onto cheaper remote platforms and, ultimately, the cloud. Storage virtualization has provided enormous data-management cost savings and performance improvements, especially as it allows better workload and data-access parallelization and data compression. And the latter two have played a key role in the second-generation business success of the Web.


So what’s next in the saga of virtualization? And will this, too, bring major benefits to users?


Tying It All Together

One can imagine a few significant ways to extend the meaning of “virtual” at this point: by applying it to sensor-driven data streams, for example, as in virtual event-stream processing. What is significant to me about IBM’s announcement, however, is that it includes features that tie the existing meanings of virtual together.


Specifically, it appears that IBM seeks to create a common “virtual reality” incorporating virtual machines, storage virtualization, and the “virtual reality” of the cloud. It provides a common “veneer” above these (the “virtual reality” part) for administrators, including some common management of VMs, services, and storage. Underneath that, it provides specific links between them for greater integration: links between virtualized storage (SONAS), virtualized applications (VMware, KVM), and virtual services (IBM’s cloud support), all disconnected from physical machines. These links cover specific activities, including the consolidation, workload management, process automation, and solution delivery tasks cited above. Added to other IBM or third-party software, they can provide a fully disconnected virtual infrastructure for any of today’s key IT or consumer needs.
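
In the abstract, that integration looks something like the sketch below: one administrative veneer fanning a single request out to separate VM, storage, and cloud-service managers. This is a thought experiment with invented interfaces, not IBM’s (or anyone’s) actual design.

```python
# Thought-experiment sketch of "tying it together": one administrative veneer
# over three formerly separate kinds of "virtual". All interfaces are invented;
# this is not IBM Systems Director's or any product's real API.
from dataclasses import dataclass, field
from typing import List

@dataclass
class VMManager:                      # stands in for VMware/KVM management
    vms: List[str] = field(default_factory=list)
    def provision(self, name: str) -> None:
        self.vms.append(name)

@dataclass
class StorageManager:                 # stands in for SONAS-style virtual storage
    volumes: List[str] = field(default_factory=list)
    def allocate(self, name: str) -> None:
        self.volumes.append(name)

@dataclass
class CloudManager:                   # stands in for cloud service delivery
    services: List[str] = field(default_factory=list)
    def expose(self, name: str) -> None:
        self.services.append(name)

class UnifiedAdminVeneer:
    """One console; the administrator never touches the three silos directly."""
    def __init__(self) -> None:
        self.vm, self.storage, self.cloud = VMManager(), StorageManager(), CloudManager()

    def deliver_solution(self, name: str) -> None:
        # The integration step: one request spans storage, compute, and cloud.
        self.storage.allocate(f"{name}-vol")
        self.vm.provision(f"{name}-vm")
        self.cloud.expose(f"{name}-svc")

admin = UnifiedAdminVeneer()
admin.deliver_solution("payroll")
print(admin.vm.vms, admin.storage.volumes, admin.cloud.services)
```

The value claimed for such a veneer is exactly the one IBM cites: consolidation, workload management, process automation, and solution delivery become one coordinated action instead of three separate administrative chores.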


So, as I see it, that’s the significant step forward: not coming up with a new meaning of “virtual”, but integrating the use cases of the various existing meanings, tying it all together. And the benefits? The usual benefits of such integration, as evidenced by IBM’s “success stories”: greater efficiency across a wider range of existing infrastructure, leading to major cost, performance, and ultimately revenue improvements. These will show up especially in the cloud, since that is what everyone is paying attention to these days; but they go beyond the cloud, to infrastructures that may delay cloud integration, defer it forever, or move directly to the cloud’s successor architectures.


Users’ Bottom Lines

The key message for users, to my mind, is to treat this new manifestation of virtualization as an added reason to upgrade one’s flavor of virtualization, though not necessarily a sufficient reason in and of itself. Rather, it should make one’s near-term, virtualization-related IT budget plans more of a slam dunk, and cause IT planners to dust off some “nice to haves”. And for those seeking a short-cut to current cloud technology, wrapping all sorts of virtualization in one bundle seems like a good bet.


When users make a choice of vendor, I would say that in most cases IBM ought to be at least in the conversation. This is not to say that one vendor’s approach is clearly superior to another’s on its face (remember, virtualization is a “false face”!). However, in this announcement IBM has pointed to specific use cases where its technology has been deployed and has achieved significant benefits for the user, so much of the implementation risk is already reduced.

Above all, IT buyers should be conscious that in buying the new virtualization, they are tapping further into an extremely powerful current in the computer-technology river, one that is right now in the main stream. There is, therefore, much less risk in riding that current via new-virtualization investment than in being edged slowly into a backwater through inaction. As virtualization enables the cloud, and the cloud changes virtualization’s face, the smart IT buyer will add that new face to customer-facing apps, using vendors like IBM as makeup artists supreme.

Wayne Kernochan