The data center in 2012: It's self-managed

The data center of 2012 will be as different from the data centers of today as today's data centers are from those of the pre-Web days.

By Jamie Beckett, June 2007

It's not just the Internet and how we use it that's being revolutionized by Web 2.0. The computing infrastructure itself is changing to meet soaring demands for computation, bandwidth and storage, the enormous growth of rich media online and the explosion of dynamically created content.

That creates tremendous challenges for IT service providers, says Rich Friedrich, director of HP's Enterprise Systems and Storage Lab.

"Web 2.0 will 'break' the back end," says Friedrich, who is addressing some of the challenges posed by Web 2.0 in his keynote at this week's IEEE International Conference on Autonomic Computing. "The implications on the back end will be as significant as the Web was in the 1990s."


Users control the content

One key challenge has to do with the dynamics of content creation. In the past, site owners typically controlled content on their pages - not only the text, images and general appearance of content, but who could create it, when and where it would appear and how it was organized.

That's all changed. Users upload their own images, video or other types of content to sites like MySpace and YouTube. On del.icio.us and other social bookmarking sites, users tag content to determine for themselves how it is organized. The growth of mashups, which combine and remix data and services of unrelated sites, means that content owners no longer control their own data or even the performance of their own sites.

"All of this material lives and breathes outside the enterprise now, and this has enormous implications for the computing infrastructure," Friedrich says. "There will be 10 times the demand for bandwidth, computation and storage, and we're going to have to manage it in a way that's scalable, predictable and secure."


Automation is essential

In his talk, "Self-managing Data Centers: The Information Engine for Next Generation Internet Applications," Friedrich will discuss the essential role automation will continue to play in managing the back end.

This may involve dynamically adjusting workloads to meet changing demands, monitoring compliance with government regulations or company policies, managing security-protection mechanisms, managing access to information and much more.
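
To make that concrete, here is a minimal sketch, in Python, of the kind of monitor-and-adjust loop such automation implies. The Host class, the utilization thresholds and the rebalancing policy are illustrative assumptions for this article, not HP's implementation:

    from dataclasses import dataclass

    @dataclass
    class Host:
        name: str
        capacity: float   # normalized compute capacity
        load: float       # demand currently placed on the host

        @property
        def utilization(self) -> float:
            return self.load / self.capacity

    def rebalance(hosts, high=0.85, low=0.30):
        """Shift load from hosts above the high-water mark to lightly loaded ones."""
        for hot in [h for h in hosts if h.utilization > high]:
            for cold in [h for h in hosts if h.utilization < low]:
                excess = hot.load - high * hot.capacity   # load to shed
                room = high * cold.capacity - cold.load   # headroom on the target
                moved = max(0.0, min(excess, room))
                hot.load -= moved
                cold.load += moved
                if hot.utilization <= high:
                    break

    hosts = [Host("h1", capacity=100, load=95), Host("h2", capacity=100, load=10)]
    rebalance(hosts)
    print([(h.name, round(h.utilization, 2)) for h in hosts])  # [('h1', 0.85), ('h2', 0.2)]

A production system would of course act on live telemetry and enforce policy constraints; the point of the loop is simply that the decision to move work is made by software, not by an operator.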

Automation could have tremendous implications for data center costs and efficiency. For example, HP Labs' work on automating power and cooling - released as HP's Dynamic Smart Cooling product - has the potential to reduce electricity use in data centers by 25 to 40 percent. Researchers estimate that, applied worldwide to some 10,000 data centers, the energy savings would provide enough electricity to power four million U.S. homes.
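
As a rough sanity check of that estimate: assuming an average draw of about 1.5 megawatts per data center and roughly 11,000 kilowatt-hours per year for a U.S. home (both figures are assumptions, not from the article), the arithmetic lands in the same ballpark:

    # Back-of-envelope check of the savings estimate. The per-data-center draw
    # and per-home consumption below are illustrative assumptions.
    DATA_CENTERS = 10_000
    AVG_DC_POWER_MW = 1.5        # assumed average draw per data center
    SAVINGS_FRACTION = 0.30      # midpoint of the quoted 25-40 percent
    HOME_KWH_PER_YEAR = 11_000   # rough U.S. household average

    dc_kwh_per_year = AVG_DC_POWER_MW * 1_000 * 24 * 365   # MW -> kWh per year
    saved_kwh = DATA_CENTERS * dc_kwh_per_year * SAVINGS_FRACTION
    homes_powered = saved_kwh / HOME_KWH_PER_YEAR
    print(f"{homes_powered / 1e6:.1f} million homes")      # ~3.6 million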


Virtualized services

A particular IT challenge is dynamically creating and deploying new services, Friedrich notes. Virtualization plays a key role here because it can create a shared resource pool that maximizes infrastructure utilization - essential when you can't predict demand - while making it possible to isolate individual applications and to protect against security threats and software or hardware failures.
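
A toy sketch of that idea: hosts are pooled, and each workload is placed wherever capacity is free while keeping its own reserved slice. The first-fit policy and all names here are hypothetical illustrations, not a description of HP's technology:

    from dataclasses import dataclass, field

    @dataclass
    class PoolHost:
        name: str
        cpu_free: float
        mem_free: float
        guests: list = field(default_factory=list)

    def place(pool, app, cpu, mem):
        """Place an app on the first host with room; each guest keeps its own slice."""
        for host in pool:
            if host.cpu_free >= cpu and host.mem_free >= mem:
                host.cpu_free -= cpu
                host.mem_free -= mem
                host.guests.append(app)
                return host.name
        raise RuntimeError(f"no capacity in pool for {app}")

    pool = [PoolHost("hostA", cpu_free=8, mem_free=32),
            PoolHost("hostB", cpu_free=8, mem_free=32)]
    for app, cpu, mem in [("web", 4, 8), ("db", 6, 24), ("batch", 4, 8)]:
        print(app, "->", place(pool, app, cpu, mem))

Because the pool is shared, three workloads fit on two hosts; because each placement reserves its own CPU and memory, a misbehaving guest cannot starve its neighbors.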

"What we're trying to do is create a set of virtualized services that can be shared. We share hardware through virtualization; we now want to do the same thing with applications," he says. "You don't tie specific applications to particular machine so they are easier to move around."

The ultimate goal, says Friedrich, is a self-managing data center.

"The data center of 2012 will be as different from the data centers of today, as today's data centers are from those in the pre-Web days," he says. "HP Labs researchers are vigorously pursuing state-of-the-art technologies to provide the economical, flexible and predictable infrastructures that enterprises will require to handle the demands of the future."


HP Labs at ICAC 2007

Friedrich is one of a number of HP Labs presenters at this year's conference, held June 11-15 in Jacksonville, Fla. Kumar Goswami, who manages the Utility Infrastructure Department at HP Labs, is the conference's program co-chair and will address the opening session. Dejan Milojicic will chair the poster session and the panel session "Web 2.0 for Autonomic Computing: a Panacea or a Replacement."

Other HP Labs papers and presenters include:

Papers

SLA Decomposition: Translating Service Level Objectives to System Level Thresholds, Yuan Chen, Subu Iyer, Xue Liu, Dejan Milojicic, Akhil Sahai (HP Labs). (A similar version of this paper appeared as HP Labs technical report HPL-2007-17.)

A Regression-Based Analytic Model for Dynamic Resource Provisioning of Multi-Tier Applications, Qi Zhang (College of William and Mary), Lucy Cherkasova (HP Labs), Evgenia Smirni (College of William and Mary)

Posters

Prato: Databases on Demand, Soila Pertet, Priya Narasimhan (Carnegie Mellon University), John Wilkes, Jay Wylie (HP Labs)

Exhibits

Adaptive HP Infrastructure for SAP, Sven Graupner (HP Labs/SAP Research)

SLA Decomposition: Translating Service Level Objectives to System Level Thresholds, Yuan Chen (HP Labs)
