February 2005

Agent of change: Automated tools from HP research aim to reduce cost, difficulty of IT changes


By reducing the cost and difficulty of IT changes, these tools could make it easier for businesses and other large organizations to explore and adopt better business processes.

by Steve Towns*

"Nothing endures like change." Heraclitus, a Greek philosopher, said it in the 6th century, but it could easily be any business manager speaking today. What's different now is that change in business involves more than people -- it involves IT systems. And those don't change so easily.

Introducing a new product? Making an acquisition? Merging two divisions? Ready for a costly IT headache?

"Organizations today are constantly in the middle of change,” says Sharad Singhal, a researcher in HP Labs. “So the question for us was: How can we make IT more responsive to those changes?”

Researchers may have an answer in an experimental tool that aims to turn IT systems into flexible resource utilities by delivering on-demand computational resources to applications or services.

Research contributes to new products

The research is already making its way into products. The HP OpenView Automation Manager, announced in November, is the first automation product to use critical business performance information to allocate IT resources.

The software, based in part on business intelligence technology developed in HP Labs, adjusts the configuration of IT services and applications based on changes in demand, automatically reacting to peaks in order processing, for example, or adapting to a surge in e-mail traffic.
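The shape of that kind of demand-driven reallocation can be sketched in a few lines of Python. This is an illustration of the general idea, not Automation Manager's implementation; the service names, thresholds and the `rebalance` routine are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    servers: int    # servers currently assigned to the service
    load: float     # observed utilization per server, 0.0 - 1.0

def rebalance(services, spare_pool, high=0.8, low=0.3):
    """Grow services running hot, shrink idle ones, using a shared spare pool."""
    for svc in services:
        if svc.load > high and spare_pool > 0:
            svc.servers += 1        # react to a peak, e.g. in order processing
            spare_pool -= 1
        elif svc.load < low and svc.servers > 1:
            svc.servers -= 1        # return unneeded capacity to the pool
            spare_pool += 1
    return spare_pool

services = [Service("order-processing", 4, 0.92), Service("e-mail", 6, 0.25)]
spare = rebalance(services, spare_pool=2)
print([(s.name, s.servers) for s in services], "spare:", spare)
```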

Automation Manager and future products emerging from the research could reduce the risk of complex IT projects and shave weeks from the deployment process.

Reducing time to deploy new applications

Researchers in the lab designed the software to capture user requirements, assess the available IT resources, account for the existing workloads and other constraints, and create system configurations that meet those requirements.

Easier said than done. The tools must not only understand the capacity and behavior of sophisticated hardware such as servers and storage, but also determine the capabilities of multiple operating systems and sort out conflicts among them -- all while complying with hundreds of constraints that protect the operation of key systems.
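In miniature, the front end of that pipeline -- matching stated requirements against an inventory while honoring protection constraints -- might look like the Python sketch below. The field names and the `reserved` flag are assumptions made for illustration, not HP's actual data model:

```python
requirements = {"os": "hp-ux", "cpus": 8, "storage_gb": 500}

inventory = [
    {"host": "srv01", "os": "hp-ux", "cpus": 16, "storage_gb": 800,  "reserved": False},
    {"host": "srv02", "os": "linux", "cpus": 32, "storage_gb": 2000, "reserved": False},
    {"host": "srv03", "os": "hp-ux", "cpus": 8,  "storage_gb": 600,  "reserved": True},
]

def satisfies(host, req):
    """A host is a candidate only if it meets every stated requirement
    and violates no protection constraint (here: it must not be reserved)."""
    return (host["os"] == req["os"]
            and host["cpus"] >= req["cpus"]
            and host["storage_gb"] >= req["storage_gb"]
            and not host["reserved"])

candidates = [h["host"] for h in inventory if satisfies(h, requirements)]
print(candidates)   # ['srv01']
```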

“Typically you would need to gather a team of experts -- on networking, servers, storage and applications -- to figure out what needs to be done,” says Singhal. “Using that approach, designing and deploying something like a new Web application could take about 12 weeks.”

“These new tools look at all of that, reconcile the information and quickly bring the system together,” he adds. “We would like to bring that time -- including all the human processes that go with testing -- down to a week or less.”

Masking complexity

To accomplish that, the researchers developed a collection of integrated technologies that perform three key functions:

• Policy-based resource composition - allows users to model system composition rules and best practices, and gives them the ability to automatically create custom configurations that conform to those policies. This shortens the time spent designing IT environments and reduces the likelihood of error.

• Capacity management - provides scheduling and capacity management algorithms that track complex resource demands and react accordingly. This lets operators manage infrastructure use and meet quality-of-service requirements.

• Resource assignment - uses mathematical programming techniques to ensure that resource requirements such as network bandwidth are met for new applications, and that adding those applications to shared resources does not cause system bottlenecks. (A toy version of this step is sketched below.)
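For a feel of the resource-assignment problem, the toy Python example below brute-forces placements of new applications onto hosts, rejecting any placement that oversubscribes a host's network bandwidth. The hosts, applications and bandwidth figures are invented, and a real system would use mathematical programming rather than enumeration:

```python
from itertools import product

hosts = {"srv01": 1000, "srv02": 600}                  # available Mbps per host
apps = {"web": 400, "reports": 300, "feeds": 500}      # required Mbps per app

def feasible(placement):
    """True if the summed bandwidth demand on every host fits its capacity."""
    used = {h: 0 for h in hosts}
    for app, host in placement.items():
        used[host] += apps[app]
    return all(used[h] <= hosts[h] for h in hosts)

# Enumerate every app-to-host mapping and keep the bottleneck-free ones.
names = list(apps)
solutions = [dict(zip(names, combo))
             for combo in product(hosts, repeat=len(names))
             if feasible(dict(zip(names, combo)))]
print(solutions[0] if solutions else "no bottleneck-free assignment")
```

Even this toy case generates eight candidate placements to check; at data-center scale the search space explodes, which is why optimization techniques matter here.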

Added flexibility for business managers

These components mask the intricacy of large IT environments by giving IT managers a relatively simple drag-and-drop interface for designing new functions. This doesn’t completely eliminate the need for skilled people, however. The tool assembles the pieces; IT professionals check the results.

If the results aren’t quite right, operators tweak the requirements to get a different answer.

By reducing the cost and difficulty of IT changes, these tools could make it easier for businesses and other large organizations to explore and adopt better business processes.

“The amount of time managers spend agonizing over business-level change will shrink because they know if they make the wrong decision, they can go back and change it tomorrow,” Singhal notes. “They haven’t set into motion a bunch of wheels that can’t be stopped.”

Designing for data centers

The new technology grew from Utility Data Center research at HP Labs. The Utility Data Center concept treats IT resources in an organization’s data center as one pool of computing power. Computing resources are configured and allocated to handle a particular task, then reconfigured on the fly to tackle the next job.

Instead of operating dedicated payroll systems or accounting systems, for example, a business could apply computing power to those tasks when needed, and direct those resources elsewhere when finished. The experimental software facilitates a Utility Data Center approach by automating the entire life cycle of computing tasks -- the design, deployment, operation and decommissioning of each computing job.
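That life cycle -- design, deploy, operate, decommission -- can be caricatured as checking servers in and out of a common pool. The Python below is a schematic of the utility-computing idea, not HP's software; the class and method names are invented:

```python
class ResourcePool:
    def __init__(self, servers):
        self.free = set(servers)

    def allocate(self, n):
        """Design/deploy: carve n servers out of the shared pool."""
        if len(self.free) < n:
            raise RuntimeError("pool exhausted")
        return {self.free.pop() for _ in range(n)}

    def release(self, servers):
        """Decommission: return capacity to the pool for the next job."""
        self.free |= servers

pool = ResourcePool({"s1", "s2", "s3", "s4"})

payroll = pool.allocate(2)      # run payroll when needed...
print("payroll on", sorted(payroll))
pool.release(payroll)           # ...then redirect the hardware elsewhere

accounting = pool.allocate(3)   # the same servers now handle accounting
print("accounting on", sorted(accounting))
```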

The software is just as effective with traditional, dedicated systems, however, because it collects data about the underlying infrastructure and configures systems based on an organization’s IT capabilities, taking into account whether or not resources can be shared.

IT agility

Ultimately, the HP Labs research may allow enterprises to use their IT resources with unprecedented effectiveness.

For instance, businesses could rapidly allocate resources to create custom infrastructure for marketing promotions, or speed up test and development environments where infrastructure must be constantly reconfigured. What's more, understanding the demands that different workloads place on applications could make it possible to reassign applications to run on infrastructure that would otherwise sit idle.

The tool could also facilitate IT consolidation efforts and help businesses make more effective use of their computing infrastructure, because it would allow them to determine what proportion of resources each application requires and whether it's possible to juggle those resources to operate more efficiently.

How HP uses the tool

Internally, HP is using the software's capacity management components to consolidate multiple servers into a shared utility to support business applications. Like many large companies, HP maintains hundreds of internal business applications, each of which traditionally required its own server. Treating these separate servers as a single, shared computing resource reduces the number of servers needed, which lowers licensing, support, management and hardware costs.
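The arithmetic behind that consolidation is essentially bin packing: many lightly loaded, one-application-per-box servers fit onto far fewer shared machines. The Python sketch below uses a first-fit-decreasing heuristic with made-up utilization figures; it illustrates the principle, not the capacity management components themselves:

```python
def consolidate(app_loads, server_capacity=1.0):
    """Pack applications onto servers, heaviest first; returns one
    list of (app, load) pairs per shared server."""
    servers = []
    for app, load in sorted(app_loads.items(), key=lambda kv: -kv[1]):
        for srv in servers:
            if sum(l for _, l in srv) + load <= server_capacity:
                srv.append((app, load))   # fits on an existing server
                break
        else:
            servers.append([(app, load)])  # open a new server
    return servers

# Eight one-app-per-box workloads, mostly idle (invented figures)...
loads = {"hr": 0.2, "payroll": 0.3, "crm": 0.5, "wiki": 0.1,
         "billing": 0.4, "reports": 0.25, "mail": 0.35, "docs": 0.15}

packed = consolidate(loads)
print(f"{len(loads)} dedicated servers -> {len(packed)} shared servers")
```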

HP researchers also used the technology to help design an IT consolidation initiative for a large financial services firm. The firm plans to move users from standard PCs to terminal-based “virtual” desktops supported by centralized servers and storage. Using components of the software, researchers modeled how the proposed system would react to various demands and how resources could be allocated most effectively.

“We have to simplify the process of making changes so that systems quickly accommodate new business requirements,” says Singhal. “We’re trying to make sure that IT resources support the kinds of things an enterprise needs to do.”


* Steve Towns, editor of HP Government Solutions magazine, wrote an earlier version of this story for the magazine's Winter 2005 issue. The original story was modified for HP Labs.

Related links

» HP OpenView Automation Manager
» Technical report: A capacity management system for resource pools
» Technical report: A resource utility system
