By Anne Stuart, June 2006
‘Virtualization’ may sound like the title
of William Gibson’s latest futuristic thriller, but
it’s actually a business technology whose time has
come.
The approach is, in essence, an IT trick that can yield
plenty of tangible benefits, all centered on the efficient,
highly flexible use of computing resources. At HP Labs,
researchers are
exploring virtualization as a method of creating real change
in corporate data centers – which are, of course,
notorious time and money guzzlers.
“At the highest level, our research is aimed at
helping customers manage risk and cost,” says Tom
Christian, a researcher based in Fort Collins, CO.
Christian and others at the lab are working to help make
virtualization a widespread reality. They’re exploring ways
in which virtualization can streamline and add value to
data-center operations, contributing innovations to HP
virtualization solutions and participating in a groundbreaking
industry-wide effort to develop standards and an open-source
platform for the technology. (See “Introducing Xen.”)
What does virtualization do?
“Virtualization is a construct that allows you to
provide an abstraction that’s different from the
physical machine,” explains John Janakiraman, a Palo
Alto, CA-based HP Labs researcher. “For example,
it can make the computer system appear to have a lot more
memory than it has.”
That opens the door to a whole new universe of computing
capabilities. “We can make a physical machine look like
many machines, or we can make many physical machines look
like a single machine – and many other things in between,”
Janakiraman says.
Such capability lets organizations pool, share and reallocate
their IT resources – manually at first, then automatically.
Data centers can, for instance, shift available processing
power to where it’s most needed or quickly open up
new storage space – and shut down machines that aren’t
being used.
“You kill two birds with one stone,” Janakiraman
says. “It allows you to run more applications on
fewer machines, which improves efficiency, but you’re
also getting more flexibility.”
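To make that concrete, here is a minimal Python sketch – purely an
illustration, not HP or Xen code, with every name invented for the
example – of one physical server being carved into several virtual
machines whose shares of the hardware can be resized without anyone
touching the machine room.

    # Illustrative sketch only: one physical server carved into several
    # virtual machines, with capacity shifted between them on the fly.
    class PhysicalHost:
        def __init__(self, cpus, memory_gb):
            self.cpus = cpus
            self.memory_gb = memory_gb
            self.guests = {}          # name -> (cpus, memory_gb)

        def free_cpus(self):
            return self.cpus - sum(c for c, _ in self.guests.values())

        def free_memory(self):
            return self.memory_gb - sum(m for _, m in self.guests.values())

        def create_guest(self, name, cpus, memory_gb):
            if cpus > self.free_cpus() or memory_gb > self.free_memory():
                raise RuntimeError("not enough physical capacity")
            self.guests[name] = (cpus, memory_gb)

        def resize_guest(self, name, cpus, memory_gb):
            # give a guest more (or less) of the host without re-cabling anything
            del self.guests[name]
            self.create_guest(name, cpus, memory_gb)

    host = PhysicalHost(cpus=8, memory_gb=32)
    host.create_guest("web", cpus=2, memory_gb=4)
    host.create_guest("db", cpus=4, memory_gb=16)
    host.resize_guest("web", cpus=4, memory_gb=8)   # shift spare capacity to "web"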
Virtualization reduces risk by creating redundancies,
mirroring physical systems both to back up information
and to provide an alternative if the physical system fails.
At the same time, a single physical server can be securely
divided into multiple virtual servers, isolating and shielding
individual pockets of information as if they were stored
on separate physical servers.
“Things can fail independently,” Janakiraman
says. “One operating system can go down while another
operating system on the machine continues to run.”
Virtualization also makes it possible to avoid incompatibility
problems between applications by running them in separate,
isolated virtual servers on the same physical server.
And while virtualization may sound complex, it actually
simplifies matters: Because all those virtual servers share
a common architecture and interface, researchers say, they
can be easier to manage and maintain.
Not surprisingly, those tantalizing capabilities have
prompted just about every major high-tech vendor (and many
lesser-known ones) to launch virtualization initiatives.
What’s less predictable is that, in an effort spearheaded
by HP, these otherwise-fierce competitors are collaborating
to develop a standard industry-wide virtualization platform
that will ultimately benefit all their customers.
“It’s one powerful tool,” Janakiraman
says of that effort, known as Xen. “That’s
why all these players are interested.”
Virtualization isn’t a new idea. “Thirty years ago, people
were talking about ways to escape the physical machine,” Christian
says. “Virtualization was commonly used to develop and test new
architectures and software, an application for which it’s still
well suited today.”
In creating those virtual layers, researchers also realized
they had created a layer of protection – a space where
programs could run in isolation from the rest of the system.
Unfortunately, early virtual machines performed poorly,
Christian says, running too sluggishly for widespread commercial
use. But researchers convinced of virtualization’s
vast potential kept experimenting with the concept.
“They realized that you can take some services,
abstract them and provide access to them through a programmatic
interface,” Christian notes, “and paravirtualization
was born. There’s a lot of buzz about paravirtualization
today, but only the word is new.”
These days, the performance of some benchmarks on virtual
machines can approach the results seen on native hardware.
Once the next-generation secure I/O hardware is available,
performance of virtual machines should be, well, virtually
identical to the real thing, Christian says.
HP already offers a range of virtualization solutions and services. But
there’s more room for discovery.
HP Labs' work is focused on making corporate data centers
more efficient. In a 2002 study, researchers found that
most of the 1,000 servers in six corporate data centers
were using only 10 to 35 percent of their available processing
power. Virtualization can help a company tap that unused
supply of computing capacity as needed to meet business
demands.
A typical corporate data center has hundreds of machines
running thousands of applications, with demand for those
applications constantly fluctuating. Virtualization simplifies
the adjustment of resources allocated to applications,
enabling staff to meet demand changes in real
time.
Current research is aimed at making such optimization
happen automatically: resources are managed dynamically
and transparently to deliver a contracted level of service
to an application. Virtualization also offers applications
the opportunity to participate in resource management.
“Applications can use the fact that they’re running
in a virtual environment to their benefit,” Christian
explains. “For example, applications can return resources
that aren’t going to be needed for awhile to the
system to reduce operating costs, allowing that capacity
to be reallocated elsewhere.”
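A rough sketch of that idea, with the names invented for illustration
rather than drawn from any HP or Xen interface: a guest hands memory it
will not need for a while back to a shared pool, and the management
layer grants it to whichever guest needs it next.

    # Illustrative sketch: a guest returns memory it won't need for a while
    # to a shared pool, where it can be granted to another guest.
    class ResourcePool:
        def __init__(self):
            self.available_gb = 0

        def reclaim(self, guest, gigabytes):
            guest.memory_gb -= gigabytes
            self.available_gb += gigabytes

        def grant(self, guest, gigabytes):
            gigabytes = min(gigabytes, self.available_gb)
            self.available_gb -= gigabytes
            guest.memory_gb += gigabytes
            return gigabytes

    class Guest:
        def __init__(self, name, memory_gb):
            self.name = name
            self.memory_gb = memory_gb

    pool = ResourcePool()
    batch = Guest("nightly-batch", memory_gb=16)
    web = Guest("storefront", memory_gb=8)
    pool.reclaim(batch, 8)   # the batch job is idle until tonight; give back 8 GB
    pool.grant(web, 8)       # the storefront absorbs a traffic spike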
Meanwhile, virtualization can alleviate another data-center problem – high
energy costs.
"One way you can save electricity is by turning things
off," Christian notes. "You can do that if you
have a model like virtualization that lets you compact
your services to run on a minimum set of physical servers.”
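As a toy picture of that compaction – a simple first-fit packing, not
the placement policy any HP product actually uses – the sketch below
squeezes the running guests onto as few servers as possible so the rest
can be switched off.

    # Illustrative sketch: pack guests onto as few physical servers as
    # possible (first-fit decreasing), so idle servers can be powered down.
    def consolidate(guests, host_capacity):
        # guests: list of (name, load); host_capacity: load one server can carry
        hosts = []          # each entry is the list of guests on one powered-on server
        loads = []          # remaining capacity of each powered-on server
        for name, load in sorted(guests, key=lambda g: -g[1]):
            for i, remaining in enumerate(loads):
                if load <= remaining:       # fits on a server that is already on
                    hosts[i].append(name)
                    loads[i] -= load
                    break
            else:                           # otherwise power on one more server
                hosts.append([name])
                loads.append(host_capacity - load)
        return hosts

    placement = consolidate(
        [("web", 0.30), ("mail", 0.20), ("db", 0.50), ("test", 0.10)],
        host_capacity=1.0,
    )
    print(len(placement), "servers stay on; the rest can be turned off")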
Then there’s the need for high availability, which
traditionally calls for having a second system and a cluster
of servers ready to go on demand. Virtualization allows
companies to quietly stash away similar computing capability
until it’s needed. “That way,” Christian
says, “you get high availability at low cost.”
Several HP Labs projects are designed to help make virtualization a widespread
reality. First, of course, there is the lab’s ongoing collaboration with others
in the industry – including some competitors – to develop
Xen, the common virtualization platform.
In addition, researchers are exploring options for automating
virtual data center management.
“Right now, a lot of load-balancing is manual. A
person has to initiate it,” Christian says. “The
goal is to get people out of that equation to the extent
that you can.”
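Purely as an illustration of what taking people out of that equation
might look like – the threshold and the migrate() callback below are
placeholders, not a real HP or Xen interface – here is a sketch of the
kind of control loop an automated balancer could run.

    # Illustrative sketch: watch per-server utilization and move a guest
    # off any server that is running hot onto the least-loaded server.
    HIGH_WATER = 0.85       # start shedding load above this utilization

    def rebalance(hosts, migrate):
        # hosts maps server -> {guest: load}; migrate(guest, src, dst) does the move
        utilization = {h: sum(guests.values()) for h, guests in hosts.items()}
        for src in list(hosts):
            if utilization[src] <= HIGH_WATER:
                continue
            # move the smallest guest to the least-loaded server, if it fits there
            guest, load = min(hosts[src].items(), key=lambda kv: kv[1])
            dst = min(utilization, key=utilization.get)
            if dst != src and utilization[dst] + load <= HIGH_WATER:
                migrate(guest, src, dst)
                hosts[dst][guest] = hosts[src].pop(guest)
                utilization[src] -= load
                utilization[dst] += load

    servers = {
        "rack1": {"db": 0.60, "mail": 0.30},
        "rack2": {"test": 0.10},
    }
    rebalance(servers, migrate=lambda g, s, d: print("moving", g, "from", s, "to", d))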
But eliminating human intervention involves a critical
social aspect – and a third area of emphasis for
the research team: assuring those responsible for data center
operations that the automated process is as trustworthy
as possible.
“We haven’t quite reached the point where
people are ready to relinquish control,” Christian
says. “It’s like offering somebody a car that
will drive itself in heavy traffic. Would you trust it?”
The researchers say the success of current virtualization
technology has convinced people that virtualization is
ready for the enterprise data center. The next step, they
say, is automation, which will require earning the same
degree of trust.
Researchers are focusing on locking in reliability, with
the goal of building users’ confidence that the solution
will work.
Researchers don’t hesitate to prognosticate about virtualization’s
short- and long-term potential.
“It’s a key technology for the future, and
it will become ubiquitous,” says Christian. He predicts
that eventually, virtualization – with its obvious
benefits for data centers – will be a standard firmware
layer on every business IT system. “And you’ll
see it even on your home machine,” he says, “where
the isolation will be used to enable secure delivery of
movies, music and other proprietary content.”
And Christian finds none of virtualization’s current
capability – or its seemingly limitless potential – surprising.
“We’ve known all the time that it could do
these things,” he says. “We’re just working
on a common platform so that we can do even more.”
Anne Stuart is a Boston-based freelance journalist who has written about business, technology, and the Internet for more than a decade.