by Quentin Hardy
March 17, 2003
This article originally appeared on Forbes.com:
The biggest names in computing have a new mantra: You only have to pay for what you use. It may sound like a radical pitch, but think of it as the old time-sharing scheme reincarnated.
Hewlett-Packard's futuristic lab has produced something truly visionary: time-sharing. That's right. The hot computing concept of the 1960s has been brought back to life. At HP's bright, pristine data center in Palo Alto, Calif., customers see 324 computer servers, each shrunk to a circuit board and nested in a stack with 30 to 40 others in 6-foot-6-inch-high cabinets. As separate machines they would need 2,500 wires to link them. This bulk installation needs only 12 cables, plus a piece of software that makes sure each server is doing all it can before firing up another. The targeted air-conditioning blows 70-degree air only on the servers that are serving up data.
During the time-sharing wave of 35 years ago the main purpose of sharing was to maximize hardware. Back then an IBM 7094, with all of 2 kilobytes of memory--one-millionth of what you get in a mail-order Dell these days--might rent for $70,000 a month. The time-share operator would queue up an assortment of customers so that not a second of processor time went to waste.
Nowadays an equally important objective is to save on labor. The sales pitch is this: Pool your computing needs with those of dozens of other customers and you won't have to keep your own programmers and troubleshooters on the payroll. Shared computing, it is said, will knock 20% off your data-processing budget. And that's in writing. "Big firms have 8,000 to 10,000 servers running at 20% capacity," says Shane Robison, HP's chief technology officer. "Our intention is to lower costs, add speed and remove risk."
At a time when a recovery in corporate tech spending is being pushed out to 2005, the giants of computing may have finally hit on something customers want to buy: simplicity. The biggest names in the business have their brightest engineers rolling out ways to unify the great hardware hodgepodge, rationalize wasteful resources and automate the labor-intensive side of technology. HP, Sun Microsystems, International Business Machines and Microsoft are desperate to counter their customers' anger and loss of faith by proving they can make computing cheaper and more efficient.
"Most customers don't trust equipment manufacturers to be open about integration," says Sun Chief Scott McNealy. "We didn't deserve their trust--the industry didn't deliver."
HP envisions a future in which corporations use centrally managed pools of processing or storage. Big tasks will be shared in grids of machines perhaps thousands of miles apart, assembled on-the-fly for a job. Customers will use and pay only for what they need at any moment. Hardware sellers will get paid not for their silicon but for their ability to hide complexity from the customer. A crucial fact: Nowadays maintenance and services eat up 70% to 80% of a company's info tech budget.
Every new technology wave comes with its own catchphrase: distributed computing, client-server, e-business. The new time-sharing has competing slogans: HP calls it "adaptive computing"; IBM calls it "e-business on-demand"; others call it "don't-care-ware."
It has yet to be shown that the new time-sharing will save money for customers while making money for the suppliers. Products are still being rushed out the door, and a big deal takes months to sell. But HP is well along on one part of the solution: server consolidation. That accounts for $500 million, or 25%, of HP's consulting business.
HP's first big customer for its product is Royal Philips Electronics' semiconductor division in the Netherlands. In January Philips began consolidating 400 servers into two data centers that can allocate storage space on-the-fly to those departments that need more without having to buy more disk drives. Some storage drives were sitting around half-empty. By midyear Philips' data centers will be able to "sell" printing, computing and file management services to employees. Wim Verkuijlen, the chip group's vice president for infrastructure, is confident he'll trim 20% out of his budget by automating tasks once done manually.
HP's big plans have nothing on the visions at IBM. "If you just talk about server consolidation and you don't talk about remaking a business, you're not seeing the whole picture," says Irving Wladawsky-Berger, general manager of IBM's on-demand business. His company's recent $3.5 billion acquisition of PricewaterhouseCoopers' tech consulting arm was designed to add specific industry expertise to IBM's technical know-how so IBM could remotely manage, say, the inventory data of an auto company.

One early customer for time-sharing is Mobil Travel Guide, the restaurant-and-hotel rater partly owned by ExxonMobil. This month the company will extend its print offerings to the Internet, with services that let you locate an eatery and book a hotel room from a cell phone. Mobil hired IBM to do the data processing, shutting off Microsoft databases in favor of Oracle database software running on a shared IBM z900 mainframe. Mobil will pay on a per-usage basis that gives it the coverage it needs in high-demand summer months, while costing little when business drops in February. Mobil figures it will spend 25% less this way than it would with a conventional outsourcing contract.

In April J.P. Morgan Chase will begin a $5 billion, seven-year outsourcing deal for IBM's on-demand system, moving 4,000 of its employees and consultants to IBM's payroll.
Microsoft, with a $2 billion business in server software, will promote "automated provisioning" as part of an overall strategy to further "march into the data center." The Redmond giant will introduce an updated tool kit for independent developers working with Microsoft data center software. Using this, developers will come up with apps that can "introduce themselves," automatically installing themselves on 1 or 100 servers at once using the Internet-era language XML. Such automation and, of course, the power of cheap chips from Intel is supposed to be the last nail in the mainframe's coffin. "IBM wants to keep mainframes relevant, but you want to write software for the cheaper, more efficient new systems," says Bill L. Veghte, vice president of Microsoft's Windows server division.
Sun, now an underdog with its proprietary technology and withered market capitalization, can't mount the consulting-heavy approaches of HP or IBM nor match the marketing clout of Microsoft. It is pushing its network skills as a solution to complexity. Sun's N1 (it stands for "Network One") is largely aimed at network management, where it figures 80% of computing budgets go.
Sun plans to act as a general contractor. Customers will go to its lab, called Iforce, where engineers will build and test a custom system that users pay for only when they are satisfied. The idea: Buy the hardware you need and don't be stuck with excessive gear. Of course, the more mileage Sun gets out of hardware and software, the less of both it may end up selling. "The hardware people at Sun feel like this is a threat," says Steven MacKay, the head of Sun's N1 project, "but if we don't do it, someone else will."
As big as constant computing is, the big guys can't do it all themselves. Newer companies will have a place, but not easy riches.
Under both Sun's N1 and HP's product is software from a startup called Terraspring. The company started life in 1999 to manage corporate data centers. After raising $60 million in venture capital, its backers realized it needed much more to compete in the big time. Terraspring licensed its management software to HP in 2001, then sold itself to Sun last November. "I was naive to think a startup could pull it off," says founder Ashar Aziz, who is now the chief technical officer of N1. "This is a play for big systems."
Companies that were quick to spot the move to constant computing but misjudged the resources needed are now scrambling to size themselves correctly. Netscape cofounder Marc Andreessen started a data-center management firm called Loudcloud in 1999, later selling the service business to concentrate on automation software. Now called Opsware, the firm trades below $2 a share. Andreessen, unrepentant, says his little company will prevail. "The customers won't be fooled," he says. "They know the big hardware companies don't want to interoperate, they want to keep their proprietary hardware." Besides, he says, he has a deal with Sun to operate part of its constant computing offering.
Others may fare independently by staying inside specialties. A five-year-old company called VMware supplies virtual machine software to both IBM and HP and is annually doubling its revenues. "We're realizing this with our partners," says cofounder and Chief Executive Diane Greene. "We don't want to boil the whole ocean."
Don't expect a steady parade of stocks going public on the back of constant computing. Venture capitalists know that most of their portfolio companies will end up as sources of R&D for the big fish. In this world "acquisitions will be the exit for VCs," says Mayfield managing partner Kevin Fong.