December 29, 2006
Vol. 28 Issue 52
Page(s) 23 in print issue
Drive The Server Huggers To Abstraction
Virtualization is hot right now. More precisely, x86 processor virtualization is hot because a lot of enterprises have gotten serious about server consolidation. One obstacle to virtualization has proved to be more political than technical. So-called server huggers need to get their heads around a new layer of computing abstraction.
John Sloan is a senior research analyst at Info-Tech Research Group. He has more than 15 years of experience as a technology writer, as well as experience in corporate communication, Web site development, and electronic publishing. His research interests include electronic publishing and communications (including e-learning), disaster recovery planning, desktop computing, and Web application development platforms. Sloan holds a master's degree in journalism from The University of Western Ontario.
A server hugger is someone who opposes virtualization because he will lose ownership and/or control over a physical server asset. Server hugging is most often identified in business units outside of IT that resist centralization and consolidation. However, server-hugging tendencies can be exhibited within IT departments, especially if the pressure to consolidate and virtualize servers is coming from outside.
Server huggers are a legacy of the distributed processing revolution. In the days before distributed processing, when hardware was centralized and terrifically expensive, only those with the keys to the golden room had authority over what could be done with a computer. With the relatively inexpensive and standardized PC, processing was distributed to the masses—both on desktops and eventually as servers.
It's The Software, Stupid
Owning a processor became synonymous with authority over the process. From this point of view, any attempt to take my box away can be interpreted as a vengeful evil empire striking back. This view is deeply flawed. It's not about hardware. It's about software. It has always been about software.
Steve Jobs, current head of Apple Computer, was able to exploit this confusion to the point of creating a hardware fetish. Hardcore Mac users were so convinced that there was something special about the physical reality of the Mac that they reacted negatively to recent moves to base the Mac on x86 processors. Yet it has always been the Macs user interface and applications that provided the power to be your best.
Without the software organizing all those binary 0s and 1s, my desktop computer would be little more than a rather large, humming paperweight. Yet, though we intuitively understand that it is all about the software, we still tend to confuse the bottle with the wine.
Been There, Done That
Back in the 90s, it just seemed to make sense. I was involved in a corporate communications project to exploit this new-fangled World Wide Web thing. We had done some of the initial exploratory work locally on a Windows NT workstation but wanted to go big time—building and maintaining a corporate Web presence. We bought an x86-based NT server and colocated it in the central data center.
My department got the software and support we felt we needed for our project through the purchase of a server—our process, our software, our Windows NT box. We knew we were getting a certain level of processing power, memory, and storage because it was in the box. Beyond the box, we had service agreements with IT for the provision of power, network connectivity, maintenance, and backup.
Owning the box simplified things, but it was also inefficient. Under-utilized capacity in the box is wasted because it cannot be transferred or shared. When capacity is fully utilized, the only way to expand is to buy an even bigger and more powerful box, which requires acquisition, configuration, testing, migration, and an inevitable interruption of service.
How To Win Over The Server Huggers
Virtualization should be viewed as a win-win for all involved. The department or business unit—even the user of a virtual PC—still gets the application(s) and OS environment it wants for its process needs. But now that virtual machine can have its processing power and available storage increased without the immediate need to purchase and configure a new physical server.
The virtual machine uses only the capacity it needs. Unused capacity can be shuttled to another virtual machine. More efficient utilization and hosting multiple virtual machines per box mean fewer physical servers need to be purchased and maintained. Availability can also be improved if the infrastructure allows virtual machines to be dynamically moved from one physical box (or blade) to another.
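The consolidation arithmetic behind this claim can be sketched in a few lines. The numbers below are purely illustrative assumptions (not from the column): ten dedicated boxes each running at 15% of capacity, packed onto physical hosts that IT caps at 80% utilization to leave headroom.

```python
# Hypothetical sketch: estimating how many physical hosts a set of
# under-utilized servers would need after virtualization.
# All utilization figures are illustrative assumptions.

def consolidate(vm_loads, host_capacity):
    """First-fit packing: place each VM's load on the first host
    with room, opening a new host when none fits."""
    hosts = []  # remaining free capacity on each physical host
    for load in vm_loads:
        for i, free in enumerate(hosts):
            if load <= free:
                hosts[i] = free - load
                break
        else:
            hosts.append(host_capacity - load)  # bring a new host online
    return len(hosts)

# Ten dedicated boxes, each idling at 15% utilization...
vm_loads = [0.15] * 10
# ...fit on just two virtualized hosts capped at 80% utilization.
print(consolidate(vm_loads, 0.80))  # prints 2
```

The same function also shows the flip side the column mentions: if one workload's load grows, only that virtual machine's allocation changes, and the packing is recomputed rather than a new box purchased.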
The catch is that the owners of the virtual machines have to get their heads around the idea that they can own a server without owning a physical machine. This is not easy: with the fixed physical asset, certain levels of processing, memory, and storage capacity, and even the kind of OS and software, were guaranteed by the box itself. Now all of those elements are abstracted and fluid.
The guarantee now has to take the form of a clear agreement between the service provider (for example, IT) and the user. Making virtualization work will require transparency, accountability, and communication from the IT department. Essentially, it will be about building trust. Trust can often be a harder thing to build than a server.
Send your comments to email@example.com