Monday, June 18, 2007

Column: Smile--You're Always On Camera


Letter From Silicon Valley

Burlingame, Calif.--Last week, I got a glimpse of the future--and even captured a couple of minutes of it on video.

It happened during the opening dinner of a three-day powwow on media and entertainment, convened by the Paley Center for Media (formerly the Museum of Television and Radio) and the technology governors of the World Economic Forum. The event glittered with executives and political wonks: California Gov. Arnold Schwarzenegger spoke; so did Google's Eric Schmidt and Yahoo Chief Terry Semel.

Even more memorable than the speakers, however, were the scores of guests holding up little white boxes, about the size of a BlackBerry, videotaping the event. Each dinner guest had been given a "Flip" video camcorder--and instantly put it to use.

Sitting next to me, Jonathan Kaplan, the spirit and chief executive behind the company that developed the Flip, beamed--and pointed his Flip toward center stage as well.

Welcome to a world where everything you do may well be recorded. Flip clips bring special significance to the phrase "global village." Several generations ago, the old ladies of the village kept a sharp eye on everyone; today, an electronic eye will do it for us.

Every event. Any time. Anywhere. This thing fits into a purse or pocket. The quality of the video? Not as great as your high-end camcorder, for sure, but sweet enough--and certainly handy. It took my 10-year-old about two seconds to find the "on" button and start filming breakfast the next morning.

It won't all be great video. (OK, my Arnold footage isn't brilliant. I did get a shot of the green snakeskin boots, though.) There will be plenty of chances to accidentally delete footage. (Yeah, I did that too.)

Kaplan's company, Pure Digital Technologies, based in San Francisco, is at the front of this wave. In May, it rolled out two models: one captures 30 minutes of video, the other 60 minutes. Prices range from roughly $120 to $150--maybe less, depending on where you shop.

Such technology is essentially software, wrapped in a package of plastic and metal. Kaplan says Pure Digital spent about six years building the software inside the Flip. At first, the start-up licensed its technology to a number of companies: RCA, for instance, makes the "Small Wonder," a similar camcorder. These devices plug directly into your computer or your television. Drugstore chain CVS/pharmacy sells a $30 single-use camcorder based on Pure Digital's technology. (Those video clips have to be transferred to a DVD at CVS, a $13 operation.)

But the opportunity seemed too juicy to leave to everyone else. About six months ago, Pure Digital, which has a raft of high-stakes financial backers including Mike Moritz of Sequoia Capital and Benchmark Capital, decided to go for broke by selling Flip video devices itself. "Designed in America, made in China," Kaplan says.

Expect to see these kinds of recorders--whether using Pure Digital's technology or someone else's--everywhere. In your cellphone too.

There's enormous power in this technology. I saw video footage from China last year, taken by a journalism student who did an internship with Forbes, documenting appalling working conditions in factories making high-tech products. Imagine if the Chinese workers who build the Flip handed out a few boxfuls of devices to their friends--and shared with the rest of the world some of the heartbreaking images of the human and environmental cost of economic progress.

Imagine how a Flip would have changed the history of Watergate--if Woodward and Bernstein had used them, or if Nixon's gang had.

Imagine how schooling changes: Will my boys videotape future classes instead of taking notes?

Jonathan Schwartz, the pony-tailed chief executive of Sun Microsystems, got his five minutes of stage time too. He described how he had recently asked a fresh-faced new hire at Sun what he thought of the company.

"Well, it's kind of an old-fashioned company," the 20-something conceded.

"Old-fashioned?" fumed Schwartz. "In what way?"

"You use e-mail," replied the younger man.

"What's wrong with that?" demanded Schwartz.

"My parents send me e-mail," he answered.

So uncool, so 1990s.

Just keep your nose powdered.

Your comments are most welcome. Write to me at ecorcoran@forbes.com. Please also note whether I can share your comments with readers.

http://www.forbes.com/technology/2007/06/15/flip-camera-video-tech-cx_ec_0618valleyletter.html


Thursday, June 14, 2007

Column: Fellowship Of The Ring


Letter From Silicon Valley

BURLINGAME, CA--Sam's Grill, in downtown San Francisco, is a bit like a time machine. It sits on a corner, just off busy Bush Street, where buses and motorcycles and Priuses vie for road space, pedestrians scream into their cellphones and neon signs blink. Once through the heavy wooden doors of Sam's, the light seems a bit softer, the daily commotion muffled by the curious rabbit-warren architecture of the place. Every table is cloistered in its own wood-paneled nook. It's a bit like dining in a private car on a train.

Last week, I joined my mother-in-law and 12 of her college friends at Sam's to celebrate their 60th college reunion. The boisterous octogenarians had traveled from various corners of Canada for the event. They retold stories of mischievous classmates and dour professors, of picnics in the snow and even of a peculiar old horse.

One, Lloyd Rodway, who sports a dapper silver mustache, leaned across the table toward me. "Do you know what this ring is?" he asked. He stretched out his right hand; on its little finger he wore a dull gray band. He took the ring off and handed it to me.


It was a simple gray metal ring, roughly faceted.

"It’s an iron ring," Rodway said. I handed it back, and he slipped it back on his finger. "When I became an engineer, I took part in the 'Kipling ritual.' I promised to use my engineering skills to the best of my ability to help the rest of the world."

"Like the Hippocratic oath?" I asked.

"Just like that," he answered.

It turns out that in the early 1920s, seven of the past presidents of the Engineering Institute of Canada got together in Montreal and considered how to encourage Canadian engineers to feel closer, part of a special guild. One offered to ask his friend, British writer Rudyard Kipling, to draft an appropriate oath.

Kipling was charmed by the idea. In 1923, he delivered "The Ritual of the Calling of an Engineer," with the caveat that it remain an exclusively Canadian affair. Even today, the ritual is intensely private--never recorded and only attended by other ring wearers. The ceremony calls for the inductee to lay a hand on an iron ring and pledge to use his or her knowledge to serve society.

Kipling even got involved with designing the ring itself: "It is rough as the mind of the young. It is not smoothed at the edges, any more than the character of the young. It is hand-hammered all around, and the young have their hammerings coming to them. It has neither beginning nor end, any more than the work of an Engineer, or as we know Space itself. It will cut into a gold ring if worn next to it: thus showing that one had better keep one's money-getting quite separate."

Since the first ring ceremony in 1925, exactly 312,956 Canadian engineers have joined the fellowship of the ring. Malcolm McGrath, who helps administer the program at the University of Toronto, says that students eagerly anticipate the ceremony, even dropping into his office to try on rings. Some 13,000 Canadians took the ring last year. (In the early 1970s, engineers in Ohio started a similar group inspired by the Canadian rites. About 10,000 American engineers join annually.)

An iron ring won't get anyone a job; it doesn't convey any accreditation (although you have to have graduated from recognized programs to qualify). But it has genuine meaning: Ring wearers try to do the right thing.

That thirst for purpose reminded me of the famous letter written by Google's co-founders when they registered their company for an initial public offering in April 2004. The starting principle of Google, wrote Sergey Brin and Larry Page, was, "Don't be evil."

"We believe strongly that in the long term, we will be better served--as shareholders and in all other ways--by a company that does good things for the world even if we forgo some short-term gains," wrote Google's founders. "We aspire to make Google an institution that makes the world a better place. With our products, Google connects people and information all around the world for free."

But can a corporate pledge ever be as intimate or as genuine as a personal vow?

The most contemporary technology is intensely personal: We carry our own music library in our pocket, our work life on a laptop or PDA, our most important connections burned into our cellphones.

It's nice that Google has adopted an ethical pose--though precisely what it means for a company to "do no evil" becomes hazier the bigger and more complex a corporation becomes.

Critics of Google's policies in China argued that the company tacitly backed evil by yielding to censorship demands by that country's central government. Privacy advocates are growing leery of Google as it makes readily available Web pictures so detailed that you can see who has snuck out of work for a coffee break. Newspapers, starting with The Wall Street Journal, may even grouse that Google's adroit control of Web advertising has crippled the press's ability to remain financially independent--and so may eventually hurt the free flow of information.

I'm not sure I know what it means for a company to "not be evil." When individuals decide to try to use their talents to improve the world, they can make a difference.

Your comments are most welcome. Write to me at ecorcoran@forbes.com. Please also note whether I can share your comments with readers.

http://www.forbes.com/technology/2007/06/13/engineers-ring-google-tech-cz_ec_0614letter.html



Tuesday, June 05, 2007

Column: Darwin In Your Palm

Forbes.com


Letter From Silicon Valley


Burlingame, Calif.--Talk about watching evolution in progress.

A bevy of new devices is emerging: machines smaller than a laptop computer but bigger than a cellphone. Like variations of Darwin's finches, each is evolving its own specialty:

--Steve Jobs' iPhone will let you talk.

--"Mobile PCs," based on Intel's chips, will let you run the software written for PC on lightweight, portable machines.

--The "Foleo," Palm's new machine created by Palm Pilot and Treo inventor Jeff Hawkins, aims to be a "mobile companion" that sits somewhere between a PDA and a full-fledged laptop.

Each of these design efforts--and I'm sure there are scores more--is scratching away at the environment, trying to figure out what it will take to survive. What will consumers (and businesses) buy? At what price? With what usage caveats?

No one better channels consumers' longing to be cool than Steve Jobs and Apple. In the business world, Palm's Jeff Hawkins is Jobs' separated-at-birth twin: Twice before, Hawkins has proven that he can translate our hazy desires to break free of our desks into silicon and plastic.

(Full disclosure: This week, Elevation Partners, which owns a portion of Forbes, said it was investing in Palm. Fuller disclosure: Elevation didn't whisper a word about the deal to us before it was announced. Darn.)

But what gives me absolute confidence that something like these devices will exist is not just these electronic artists--it's the armies of unrecognized design and manufacturing engineers who are steadily building the silicon chips that will power these emerging devices.

Take Intel: Executives there say that they believe the company's future lies with "system on chips," effectively special-purpose microprocessors tuned to carry out specific tasks.

Even more experienced in this area is Texas Instruments. I didn't include TI in my list at the top because TI's chips are used in such a diversity of cellphones and handheld machines. For a decade or so, Texas Instruments has been steadily building an entire ecosystem of design around its platforms. Constellations of companies in India, China and elsewhere are building special-purpose chips on top of TI's design architecture.

Bottom line: If you can dream it up, somebody can make a chip that will make it work.

Fundamental to this equation are the "foundries," the massive chip manufacturing facilities run by companies as diverse as Taiwan's TSMC, China's SMIC, Singapore's Chartered, even IBM. Chip fabs have been around for decades, of course. But what's different now is the ease with which they can make literally hundreds of different products at once.

Enormously complex manufacturing software--go ahead, call it artificial intelligence software--means that these factories can be programmed to stamp out very diverse designs. Relatively small batches of a design suddenly inherit many of the cost advantages that once blessed a single mass-produced design, as the sketch below illustrates.
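
To see why flexible fabs change the math, here's a toy cost model in Python. All the numbers are hypothetical, purely for illustration; the point is that the fixed cost of preparing a design for manufacture gets amortized over the batch, so shrinking that fixed cost is what makes small batches economical:

def cost_per_chip(setup_cost, marginal_cost, batch_size):
    # Per-unit cost = amortized setup cost + marginal manufacturing cost.
    return setup_cost / batch_size + marginal_cost

# Hypothetical figures: a rigid fab with a $1 million setup per design
# versus a flexible, software-driven fab at $100,000 per design.
for setup in (1_000_000, 100_000):
    for batch in (10_000, 1_000_000):
        unit = cost_per_chip(setup, 2.00, batch)
        print(f"setup ${setup:>9,}, batch of {batch:>9,}: ${unit:.2f} per chip")

Run it and the asymmetry jumps out: at the high setup cost, a 10,000-chip batch costs $102 a chip against $3 for a million-chip run; cut the setup cost tenfold and the small batch falls to $12 a chip, close enough to viable.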

Are you old enough to remember the heyday of Xerox PARC, when it was an incubator for astonishing ideas? The guiding design philosophy of those days, as I recall, was simple: Do away with a constraint. Pretend that a key--but expensive--component has become free. Pretend bandwidth is free. Pretend silicon is free.

Silicon chips are almost free. The limitation now is software. Jobs, Hawkins and, for that matter, companies like Intel must all be scrambling to figure out how to inspire software designers to write applications that will make their devices sing.

Prepare to see scores and scores of devices. That much is clear. The billion-dollar question hanging in the balance is one of evolution: Which one--or ones--will dominate?

I'm starting to morph this column into more of a blog-like conversation rather than a classic piece of reporting. Your comments are most welcome; you can send me a note at ecorcoran@forbes.com. If you do, please let me know if I can share your comments with readers.

http://www.forbes.com/2007/06/05/palm-foleo-iphone-tech-cz_ec_0605mobile.html?partner=yahootix




Column: Running Intel's Numbers

Forbes.com


Letter From Silicon Valley

Numbers are the lifeblood of Intel. Here are some of the numbers that can make--or break--the company, including how many bits of data its chips can crunch, the power those chips demand and exactly how much it costs to make each chip.

My recent story on Intel described the struggles within the company to get performance and power numbers right and the pain of trimming the costs of running the entire business. So far, Intel has cut staff, most significantly in marketing and management. But Intel's executives have more controls at their disposal, and how they tune those dials can be a strong indication of the health of the business today and over the year to come.

Those controls are buried deep in the intricacies of chip manufacturing. A neighbor of mine joined Intel in 1974. She remembers working in its first factory in Santa Clara, Calif., using scissors to cut circuit patterns into "rubies," sheets of red plastic. After she finished, others would take those ruby "masks" and shine light through them, exposing the surface of a silicon wafer covered with light-sensitive goop called a photoresist.

The chemicals hardened, forming a protective cover for the silicon. Then the wafer was immersed in an acid bath to "etch," or dissolve away, unwanted portions. Coat, expose, etch, rinse and repeat. The process would go on until the silicon was fully patterned with the electronic design.

I thought of my neighbor a few months ago when I visited one of Intel's finest "fabs" in Chandler, Ariz. The process is still called "lithography," but it bears as much resemblance to the work of 30 years ago as a robin does to a dinosaur.

The Chandler fab is an enormous squat building. Although it is staffed night and day by people, sturdy robotic boxes with an equally sturdy name ("front-opening unified pods," or FOUPs) run the show.

No person--even one suited up in one of Intel's stylishly androgynous "bunny" suits--ever touches a wafer. Instead, stacks of 25 wafers are encased in plastic cassettes. FOUPs, which travel along narrow gauge tracks in the ceiling, shuttle the cassettes from one stop to the next: to a machine that smears photoresist chemicals onto the wafers, or maybe to a machine that exposes them to ultraviolet light.

Coat, expose, etch, rinse and repeat. Some of the machines are so massive they require special bolts so that the floor beneath them will not buckle. All cost millions--even tens of millions--of dollars apiece. After about 60 days, the cassettes will have finished their Disneyland-like odyssey through the fab. They will have covered about 32 miles in their FOUPs. And then the wafers will be shipped to another factory, where they will be sliced into individual chips and assembled into boards or modules for customers.

By the end of this year, Intel will be able to make chips with components measuring 45 nanometers wide. That size means designers can squeeze several hundred million transistors onto a silicon chip smaller than a postage stamp. By contrast, Intel's 8080 microprocessor, introduced in 1974, had 4,500 transistors connected by circuit lines measuring six microns wide.

As the transistors get smaller, the silicon wafers are getting bigger. Today's top-of-the-line silicon wafers measure 12 inches across; their immediate predecessors were a mere 8 inches in diameter. Thanks to the magic geometry of circles, the larger wafers have more than twice the surface area of the smaller ones--2.25 times, to be exact. Better manufacturing techniques mean Intel uses less energy and water to pattern the big guys.
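
That circle arithmetic is easy to check. A minimal Python sketch, using the nominal 200-millimeter and 300-millimeter wafer diameters:

import math

def wafer_area_mm2(diameter_mm):
    # Surface area of a circular wafer, ignoring edge exclusion.
    return math.pi * (diameter_mm / 2) ** 2

old = wafer_area_mm2(200)  # 8-inch wafer, nominally 200 mm
new = wafer_area_mm2(300)  # 12-inch wafer, nominally 300 mm
print(f"200 mm wafer: {old:,.0f} sq mm")
print(f"300 mm wafer: {new:,.0f} sq mm")
print(f"Area ratio: {new / old:.2f}x")  # (300/200)^2 = 2.25

The ratio is simply (300/200)^2 = 2.25--which is where "more than twice" comes from.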

Bottom line: It costs Intel less (in variable costs) to crank out chips in its latest and greatest fabs than it does in older ones. By next year, four of those 12-inch wafer fabs will be equipped to make chips with components measuring 45 nanometers.

The more chips Intel can build in its new fabs, the better its profits.

Or turn it around: Getting rid of some of its older fabs will perk up the bottom line.

Right now, Intel has a stable of 16 fabs operating or under construction, half of which can handle the big 12-inch wafers; the other half process smaller 8-inch disks. Five of those older fabs are in the U.S.

Intel has already put a Colorado fab up for sale and said it would cut the workforce at a New Mexico site by about 1,000 employees. Intel also recently said it would fold its assets for building a type of Flash memory into an independent company, formed jointly with STMicroelectronics.

That leaves five older fabs, including an operation in hometown Santa Clara.

In past years, Intel has converted older fabs so they can make smaller chips or work with larger wafers. But it's a numbers game: Since larger wafers can produce so many more chips, how many factories does a company really need?

Here's my bet: Between now and the end of the year, we'll see Intel sell off some of those older 200-millimeter fabs. Even the Santa Clara location could be on the block.

When that happens, you can expect to see the bottom line benefit--profitability will improve for at least a couple of quarters.

Once the endorphins of selling assets wear off, management will be left with the toughest task of all: growing the business.

With this contribution, I'm starting to morph this column into more of a blog-like conversation rather than a classic piece of reporting. Your comments are most welcome; you can send me a note at ecorcoran@forbes.com. If you do, please let me know if I can share your comments with readers.

http://www.forbes.com/home/technology/2007/06/04/intel-chips-fabs-tech-cz_ec_0605intel.html


Monday, June 04, 2007

FORBES: Intel Plots A Comeback

Forbes.com

Intel's boss was raised on the inside. Now he must turn it inside out.

Intel Corp. was mired in misery early last year. After two decades dominating the microchip market--one of the fastest-moving and most unforgiving businesses in the world--Intel seemed lethargic, lagging and stumble-prone. Big layoffs were imminent. Its stock price, having reached $75 in the fall of 2000, had stalled in the $20 range. Net income topped $10 billion in 2000 but had fallen to only half that in the years since. Worse, Intel was getting bested, badly, by a pesky producer one-sixth its size. After years as an also-ran, Advanced Micro Devices in 2003 had upstaged Intel's muscle-bound chips, namely the Itanium and the Pentium 4, Intel's centerpiece. Big customers--IBM, Hewlett-Packard and, eventually, Dell--began turning to AMD. In 2005 Intel lost 2.6 points of market share, far more than it had expected.

Moreover, forays for new growth were fizzling: A big move into new chips for cell phones flopped; a plan to create a business running server farms for corporate clients faded; a billion-dollar gamble on Itanium, a new-generation chip for big servers, failed to pay off. The efforts had been plotted by Craig Barrett, the materials-science engineer who in 1998 succeeded Andrew S. Grove, the salty leader who had helped start the company. Barrett was followed by a successor of his own: Paul S. Otellini, an Intel lifer who in mid-2005 became the first non-Ph.D. to run Intel.

Unaccustomed to losing, Intel's ranks pelted their new chief with bitter e-mails: Intel had lost it; management was incompetent. Some likened Intel to a lumbering Detroit carmaker. Today Otellini, 56, is reluctant to talk about the backlash, though his chagrin is apparent. His de facto number two, Sean Maloney, is more blunt: "It was a swift kick in the gut. We were angry and disappointed in ourselves." He adds: "We just had a visceral emotion: We're gonna fix it."

That frustration pushed Otellini to wage the most sweeping overhaul at Intel in 20 years. Gone are plans for diversifying away from Intel's chips. Gone is 10% of its workforce. Otellini also bailed Intel out of cell phones, selling off the XScale mobile-chip line. In a first for the company, he has put one chip factory--so far--up for sale. And Otellini is reorienting Intel's focus to look beyond its slow-growth mainstay, processor chips for desktops, to what he hopes will be the Next Wave. This is a world of lightweight notebook PCs and a gaggle of ultramobile machines smaller than a laptop but bigger than a BlackBerry. "The PC market has been very good to us. It's near 300 million units [a year]. It's going to grow to 500 million units," he says. "But how do we sell a billion of something? Can we create a multihundred-million-unit market, per year, in handhelds?"

Intel already had in place a growth engine that could fuel its comeback: Centrino--simpler, faster and cheaper than the Pentium 4. The design began with Intel engineers in Haifa, Israel, far from Silicon Valley. Centrino would power notebook computers--but the "core" processor at Centrino's heart would become the inspiration for all of Intel and lead to a starkly different design from the Pentium, which had reigned since 1993.

In Silicon Valley Intel engineers thrived on using every available transistor to get more speed and power from a chip. The adverse side effect: lots of waste heat. The 178 million transistors in Intel's top Pentium 4 give off enough heat to fry an egg.

In the new-gadget era envisioned by Otellini the chips must be the antithesis of a hefty Pentium--sleeker, simpler, far cheaper and, above all, cooler. Lash a couple of processor cores together and bundle in specialized parts for, say, wireless linkups or video graphics, and this system can power everything from a palmtop to a server. Intel says a new family of cores, aimed at mobile devices, will be ready next year.

"How do we fit inside of something that sells for $100 and make some money?" Otellini says. "Costs become essential. Architecture becomes essential. Integration becomes essential. And the culture of the company has to wrap itself around that." This threatens "the ego of the Intel engineering community," says Maloney, executive vice president. "Their whole notion of self-worth was based around bigger and faster. That aspiration needed to change to cooler, sleeker, smaller. That's a big deal."

Thus Intel has abandoned what may be the most prodigious platform--the cell phone, with 1 billion units sold last year, four times the number of PCs--in favor of a new gadget that barely exists. Succeeding requires Intel to do two things it never has done particularly well: make chips at the lowest cost possible and let customers' demands shape development.

The Intel of old held 85% of the PC microprocessor market and routinely dictated upgrades and designs with little input from the clientele. AMD Chief Hector Ruiz tacitly goads Intel for this: "We did something that, unfortunately, is all too rare in the semiconductor industry--we went out and talked with [customers] about what [they] needed," he said in an industry speech in October 2006.

The new Intel must undergo a personality transplant. The Intel that Andy Grove built had enshrined sharp confrontation as constructive engagement, in the imperious and emphatic style of its chairman, for whom decisions were crisp, choices were binary and markets were won or lost.

Otellini, who scoffs privately at the "cult" that can surround a company's founders, can deliberate something to death. He deploys a reserved manner and prefers persuasion over fiats, consensus over combat. Frustration or embarrassment shows in a red flush to his face. His equanimity is a mixed blessing. It can be seen as indecisiveness.

He was born and bred in San Francisco, and during his college years he spent a summer working with his father, a butcher, in a slaughterhouse. ("I think he did that on purpose, because he didn't want me to ever think of that as a career," Otellini has said.) He attended the University of San Francisco, and in 1974 he landed his M.B.A. at the University of California, Berkeley, joining Intel as an analyst. He hasn't missed an Intel paycheck since. He rose in marketing and management--"I'm a product guy"--and spent a year in 1990 as an aide to Grove.

Barrett succeeded Grove in 1998 and began looking beyond microprocessors, a business he derided as a "creosote bush." (In the desert a creosote bush poisons the ground around it to ward off other vegetation.) He had Intel spend $10 billion buying communications and networking firms, even as it invested hundreds of millions more in the Itanium chip project with HP.

By 2002 the dot-com crash and the collapse of telecom had devastated Intel's profits and chilled Barrett's plans. Intel's move into chips for mobile phones had become a quagmire. Although the company had grabbed a promising chip line in a legal settlement with the old Digital Equipment in 1998 and renamed it XScale, the chip wasn't enough. Unlike the PC world, software for such chips was patchy. Even making the chips proved more costly than expected as Intel had to rejigger manufacturing processes.

"In hindsight, phones--even the smart phones we targeted--was not an area in which we had 20 or 30 years of expertise," says Otellini, who became president in early 2002. "It didn't play to any of our strengths. We didn't have the software or the architecture." Nor did Intel have many customers. Research In Motion put XScale into its BlackBerry, but cell phone makers were leery of Intel's reputation in PCs for reaping most of the profits and leaving boxmakers with less. "There were entrenched players, many of whom had seen the PC movie," Otellini says.

Meanwhile Intel was getting into trouble in microprocessors. The Itanium, in gestation since 1994 and a few years behind schedule, faltered when customers balked at the hassle and the cost of rewriting old Intel-based software for the new chip. Worse, the Pentium 4 was a hothead and a power guzzler, at a time when corporate customers eyed even electricity bills in a bid to reduce their tech spending.

AMD, Intel's plucky rival, was poised to benefit. Its engineers had been working on a homegrown chip that rivaled Intel's high-end Itanium for power but easily ran existing software. And it was cool--generating less heat than Intel's big chips. AMD debuted its Opteron for high-end servers in April 2003 and rocked Intel's world. The competition would knock the average selling price for high-end chips from more than $600 apiece in 2003 to half of that today, says IDC analyst Shane Rau.

Intel, meanwhile, had glitches. It canceled one new version of Pentium 4, ran a year late on another, delayed several other products and ran short of chips because of bad forecasting. Only the transition of the chief executive job from Barrett to Otellini, in May 2005, went smoothly. Intel stock rose 8% in calendar 2005; AMD's rose 43%.

Then Intel stumbled in a spectacular way: It missed sales forecasts on Wall Street two quarters in a row, through the first quarter of 2006. And Intel was bloated. In 2000 it had 86,000 people producing $34 billion of revenue; by 2006 it had added 17,000, though the top line had grown only 5%.

Otellini spent much of last year handling the fallout--disillusioned employees, the board demanding to know why Intel had slipped so badly, a huge round of layoffs. Yet Intel already had a key element in place for a dramatic comeback--the processor core inside Centrino.

In 2000 Otellini, then head of Intel's microprocessor business, had realized slim notebooks would need a cooler, less power-hungry processor than the Pentium 4. So he set engineers in Israel to the task. They approached it in a non-Intel way, sacrificing some raw power to get a chip that ran cooler. The idea was scorned inside Intel. "The company had been so successful in the 1990s it was hard to talk about doing things differently," says David Perlmutter, who led the project. "It was easier to be remote and question the basic religion of the company."

In 2002, with their work all but finished, Otellini had an epiphany: Notebooks and laptops had to be able to connect wirelessly to the Internet. So Otellini decreed that the new chip should wait until the engineers could fuse their core to a homegrown Wi-Fi component. "Making that decision was tumultuous inside of Intel, to say the least," he says. "It was a cultural issue. We're a microprocessor company." The Intel faithful disliked delaying a new chip to wait for adjunct technology. One computer maker jeered at the project, calling it "Latrino."

Intel rolled out its Wi-Fi-ready Centrino in March 2003. Six months later the new chip was a much-needed hit. Intel's Perlmutter was convinced he could see the next horizon. "The first Centrino wasn't bad," he says. "But could we evolve the architecture to be better than the Pentium 4?"


It could. A Centrino-like core was anointed as Intel's flagship for notebook and desktop PCs. In October 2004 Otellini canceled future Pentium 4 efforts. He signed on for a big test of whether Intel's engineers could shed their dictatorial ways to work closely with a most demanding customer: Steve Jobs of Apple.

After much wooing by Otellini, Jobs had agreed to consider using the next core, so long as Apple engineers could work hand in hand with Intel's designers. And in mid-2005 Jobs took the stage at his annual powwow with developers to announce that the Apple Macintosh would start using Intel chips. Since the Mac's debut in 1984, it had always run on chips from Motorola and its partners. Otellini gets coy when asked whether Intel might eventually surface in the Apple iPhone, due in June.

In spite of Centrino's success, Intel overall was sagging. Early last year Otellini hired consultants from Bain & Co., who huddled with some of Intel's smartest managers to take a snapshot of the company. Their report, two months later, stung Otellini and staff: Intel was fat and inefficient, hampered by high costs and a swollen marketing department. One example: Intel had too many "two-in-a-box" managers (a pair who share title and job). One set even jointly oversaw a staff of one.

"We weren't used to the sniping. No one had questioned us for years," says marketing chief Sean Maloney. But instead of demurring, senior execs focused on a fix. "How could you let Intel fail?" he says. "We're the company that's famous for technological change. The sense of personal shame would be overwhelming."

But Intel stumbled again in launching its rescue mission. Led by the deliberative Otellini, the company imposed job cuts so slowly that the ranks grew ever more angry. Intel first told Wall Street it would analyze the company's structure but didn't cite layoffs. Later it said it would fire a thousand managers. In June 2006 it sold off the XScale mobile-chip line to Marvell Technology Group for $600 million, shedding more staff. Only in September did Otellini & Co. put a number on the layoffs: 10,500 jobs, or 10% of the workforce, the biggest cut at Intel since it abandoned the memory-chip business in 1985. More layoffs loom as Intel weighs selling off, by year-end, its weakest business: making flash memory (used in cell phones and cameras).

Now Otellini plies new growth. Intel's sales of various "core" chips (including the Centrino and the "Core 2 Duo" lines, both inspired by that original core approach in Centrino) will exceed sales of its classic design this year. Notebook chips are gaining fast: By 2009 Intel figures it will ship more chips for notebooks than for desktop PCs--happy news because at least for now Intel makes more money on notebooks.

Next: chips for the ultramobile handhelds. These will incorporate, on one piece of silicon, a Centrino-like core plus circuits handling such tasks as Voice over Internet, graphics for games and search. Otellini argues that by 2011 such chips could compete in what he expects will be three newly formed $10-billion-a-year markets--one each in mobile, consumer electronics and supercheap PCs for the Third World. That enhanced core goes for now by the name Silverthorne.

"Silverthorne could really be a thorn in Intel's side," frets Auguste Richard, a senior analyst at First Albany Capital in San Francisco, who nonetheless admires the Intel overhaul and has a "buy" on the stock. Any system-on-a-chip revenues for Intel are a few years away. Wall Street also worries about the inevitably thinner profit margins in chips for cheap palmtops.

Intel never had obsessed over cutting product costs. "You would never have had a discussion with Andy [Grove] or Craig [Barrett] about us being the lowest-cost producer," Otellini says. When a company has products that could command as much as 80% margins, he notes, "costs are important but not critical." But cost will be everything in the handheld market, and Intel is counting on its mind-boggling prowess in manufacturing for an edge.

It makes some of the tiniest chips ever created for a PC, cramming them onto the largest silicon wafers in the world. It now has 5 factories (of a total 16) that use platters 300 millimeters across (12 inches or so). By year-end Intel will produce chips with transistors measuring 45 nanometers (45 billionths of a meter), smaller than most human viruses. That will let it etch 2,500 chips on each 300mm wafer.
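
That 2,500-chip figure can be sanity-checked with the standard first-order dies-per-wafer approximation. A quick Python sketch--the die sizes tried below are my own guesses backed out from the story's numbers, not Intel figures:

import math

def gross_dies(wafer_diameter_mm, die_area_mm2):
    # First-order estimate: wafer area divided by die area, minus a
    # correction for the partial dies lost around the wafer's edge.
    radius = wafer_diameter_mm / 2
    return (math.pi * radius ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

for die_area in (25, 28, 30):  # hypothetical die sizes, in square millimeters
    print(f"{die_area} sq mm die: ~{gross_dies(300, die_area):,.0f} dies per 300 mm wafer")

A die in the 25-to-30-square-millimeter range lands right around 2,500 per wafer, so the numbers hang together.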

Otellini bets Intel can stay so far ahead of rivals that it can make chips as cheaply as any competitor. That includes China, which is emerging as the foundry for the rest of the chip world. "We've been benchmarking them. We don't think we're at a cost disadvantage," he says. In March Intel set plans to build its next chip factory, typically a $4 billion project in the U.S., in Dalian, China.

But Intel will have to prevail over fearsome foes: Samsung and Texas Instruments, established vendors of chips for the picky cell phone business. And Intel's new push will require its engineers and managers to cater to customers in ways they never have before.

"Intel talks about being customer-centric, but it's not in their DNA. They've been brought up to rule the world," says Henri Richard, AMD sales chief. When he meets with phonemakers, they tell him what they want, what a chip should do and how much it should cost. In PCs, "Intel tells the customer: 'This is the way it's going to be.'"

At TI, Senior Vice President R. Gregory Delagi says his company has spent a decade learning to coddle clients. It reorganized its supply chain to have TI products ready just across the street from a customer assembly plant. During the tech slump in 2001 TI built a site for a customer within its own factory in France to let the client's engineers work alongside TI staff.

But Intel execs say they learned how to mollify customers as fussy as Steve Jobs. And they vow that their foray into the ultramobile market will fare better than their effort to make chips for cell phones, in part because Intel's cores are heirs to the mountains of software written for the 850 million or so Intel-based computers in the world today.

"That's the heart of why it was important for us to make the changes we made last year," Otellini says. A billion computers now link up to the Internet, most of them Intel-based, and it took 12 years to reach that milestone. The next billion machines will come online in only half as much time--and Intel will have to fight hard for every single one of them.



http://members.forbes.com/forbes/2007/0604/092.html


Q&A: A Talk With Intel Chief Paul Otellini

Forbes.com

(originally published online on 5/17/07 but part of magazine 6/04/07 package)

Excerpts from a Q&A with Paul Otellini, chief executive of Intel.

Forbes: When did you think Intel had to change?

Otellini: My epiphany for driving our strategy came around 2002. We were leading up to the launch of what ultimately became Centrino. I felt very strongly that wireless communications as an integral part of a notebook experience was a killer market opportunity. Six months after launch, it became very obvious that this was a home run.

Around the same time, I changed our planning processes. We had had a microprocessor group, a chip set team, a server group and so on. I said, "We're going to turn this around and identify the end markets and do our planning from the markets backwards." So we changed product planning inside of Intel to be around platforms. It played to our unique advantages--silicon technology, platform architecture knowledge and the ability to scale massively.

Intel had accumulated something like $10 billion in acquisitions in communications and networking. Did you have to do that?

There was certainly some communications architecture expertise that we didn't have inside the company. Our work in Wi-Fi and WiMAX came out of those acquisitions. Clearly, one of the larger areas of investment was in the handset division, which we exited. There were a lot of lessons learned there. We saw that there wasn't a way for us to make good money there. But I believe very much that the future of computing is in handheld devices.

But if the future is in handheld devices, why get out of handsets?

It's a lot easier to add voice to a computer than to add computing to a phone. We're not just shrinking the notebook. We're asking, How do we provide a full Internet experience that also includes voice, in a handheld form-factor? I think the Internet is the killer app of mobile computing. Of all computing. The Internet runs on Intel architecture today.

So our view is, if you could deliver the full computer experience, in a handheld form factor, with the right kind of power and performance characteristics, then you have a very interesting product. And in that business, we compete based on our strengths, not our weaknesses.

Is the business model for the ultra-mobile industry going to look like the PC world?

I don't think so. We're not taking the PC business model into this area. We're taking the Intel architecture, which runs most of the Internet, into new devices. These are not going to be $200 chips, inside of a $200 phone. You have to deliver the right kind of performance to run the applications, have the right power envelope to give you the all-day battery power, and the right price to hit the sweet spot for the consumer. That means high integration of technology and moving to very small chips and system-on-chip architectures.

Saying, "I'm going to take a pre-existing architecture on which a billion Internet devices run today and move that into handhelds" is wholly different than saying, "I'm going to take the PC business model and move that into handhelds." It's a whole different model. There's different software to some extent. Windows is a player there but not the only player. And you can see that with Apple emerging.

More and more chip manufacturers have stepped back from manufacturing chips and instead are relying on foundries. What about Intel?

Chip manufacturing gets harder and more expensive all the time. It's basically the laws of physics and the laws of economics at play. To build a modern semiconductor plant which uses 300-millimeter wafers, and state-of-the-art lithography with designs measuring 45 nanometers, costs close to $4 billion. Then there are the tools inside it, which amount to close to a billion dollars, per generation of technology. That means you need to generate $4 billion to $5 billion a year in revenue out of it. There are not many semiconductor companies that are $5 billion or more in revenue. So economics leads many companies to collaborate.

No. 2: The laws of physics mean chip making gets harder and harder. We think we have some breakthrough technology at 45 nanometer. We think it will be harder for people to do that than it was in the past at, say, 65 nanometer or 90 nanometer. Our lead over the competition may extend with this generation and probably extend a lot after that. This is not a macho thing. It's all based on sheer economics.

As the Internet and hopefully our architecture come into the world of consumer electronics and handhelds, the price points are not going to be $1,000 but a few hundred dollars. So we have to be able to get this to low-cost, high performance single chips. And make reasonable profits.

Will Intel work differently with these device makers than it has with PC companies?

Over time, the answer is yes. And what we do in the PC space will change too. As you move towards system on a chip, particularly in ultra-small devices, different customers will have different requirements for what's on the chip. We may have to do derivative versions or semi-custom versions--and create ways customers can exploit their own intellectual property. We do some of this today in packaging. And in some areas of consumer electronics we're building system on chips for certain classes of devices such as set-top box makers.

What about Itanium?

It's used for really big machines: the Tokyo Stock Exchange, mainframe replacements. There are two models for high-performance computing: scale up and scale out. Itanium is a mainframe scale-up kind of machine. Google, with its zillions of racks of servers, is a scale out. At some point in time, the scale-out model may prevail. But for right now, for some classes of applications, particularly ones for organizations where systems have to be ultra-reliable and high performance, scale up still matters. And Itanium is our best architecture for that today.

What's been your hardest decision?

The one that led to us having to downsize. It was pretty obvious at the analyst meeting last year that we said some of our basic economic models, which had existed for many years, were changing. We had to get leaner, get more focused.

What have you learned by being on the Google board?

Much of the success of Google, apart from obviously their technology, is their aggressive willingness to partner and to make their partners successful. Google says, "You, Mister Partner, and I can create a huge opportunity to monetize assets that you may have. You can do it on your own, or you can do it with me and my scale, and I'll cut you in on it." And that "cutting you in on it" has been the heart of much of Google's growth.

That business model of setting up opportunities where you share mutual success financially has not been a model that has been inside this company or even in the classic PC space. I think this is a very interesting model for us.

http://www.forbes.com/2007/05/17/intel-otellini-chips-tech-cz_bc_0517otellini.html?boxes=custom


Q&A: A Talk With AMD Exec Henri Richard


Forbes.com

(originally published online on 5/17/07 but part of magazine 6/04/07 package)

Q&A with Henri Richard, Advanced Micro Devices' chief sales and marketing officer.

Forbes: How did AMD become such a meaningful competitor to Intel?

Richard: We decided to go out with a strategic direction that was different from Intel's. Before 2002, Advanced Micro Devices had partial success in the consumer market. We understood that the way to greatness was not to ape Intel but to differentiate our solution. If it weren't for AMD, everyone would be speaking "Itanium." Customers need to trust you. I'd like to give a lot of credit to the new AMD management team for demonstrating that. Customers want to know: Are you real? Are you going to support me? Will you deliver on the promise?

I tell my kids, "My job is to sell freedom." I don't sell products. Our chips are fully compatible with Intel's and vice versa. Sometimes we're faster, sometimes they're faster. But essentially the market aspires to have choice. The last thing you want is for every PC or TV to be the same.

What's different about your approach from Intel's?

When I meet with customers in the cellphone or TV business, they present their vision of where they want to take their product, and they tell me what they want the components to be, what they should do and how much they should cost. Then they say, "Can you do it for me?" In the PC space, Intel tells the customer, here's the roadmap. This is the way it's going to be. Since when does every customer in the world decide that every other year is the right schedule to upgrade their chip? They talk about being customer-centric, but it's not in their DNA. They've been brought up to rule the world.

Intel looks at everyone as an extension of their business. We look at ourselves as an extension of our customers' business. It's not just about the products or the technology. It's the fact that we're fundamentally changing the business model.

Do we really need "ultramobile" PCs?

I don't know if it's a fat cellphone or a thin notebook. But there's clearly a missing link in mobile devices. If you have a fat cellphone, you're compromising on computing power and on visual quality. If you have a notebook, you've got computing power but you're trading away autonomy and connectivity. At the convergence of these devices, something will emerge: where the screen is good enough, the keyboard is good enough and the connectivity is good enough so that you don't need a Ph.D. to make everything work. That's the device we would all love to have. Then we wouldn't need to carry around two or three devices.

Will it come? Absolutely. Why is it so difficult? There are technology challenges, engineering compromises. And you have a clash of industries: The PC guys are always trying to get in a PC-centric view of the world, but they don't always have the greatest consumer insights. PCs are still way too frustrating and too prone to bugs and errors. If my refrigerator was like my PC, I'd have to wait 60 seconds to get a beer.

AMD has had an awful quarter. How do customers know you'll be viable?

I can't deny the recent quarter was a negative. But there are two ways to look at business: performance and health. There's what you see in the earnings. But what is the AMD customer saying to us? What's the end user demand? We had a series of challenges because we grew so fast in 2006. It put us in a position to disappoint some customers.

Although I can't look at Q1 with anything but disgust, this has never been anything other than a marathon. I look at Q1 as an anomaly and am confident that the motivation and determination of customers, users and employees are intact.

You have already teamed up with IBM on research. You've suggested that you might give up manufacturing, too. Is this a viable strategy?

Increasingly it's going to be a world of partnering, because there are more good ideas in several brains than in just one. And the cost barriers to entry are getting enormous. As the PC industry matures, it's not as homogeneous as it used to be. There's a need for "good enough" devices, where the focus is on keeping costs low. At the other end of the spectrum, there will be this incredible need for performance.

The race in chip design needs to change from "I'm doing it because I can" to "I'm doing it because it's meaningful to the end user." If there's one thing I can be proud of, AMD has helped Intel improve its game. If it hadn't been for AMD, Intel's way of addressing the market would have remained static. I give them a lot of credit for looking at the competition and forcing themselves to change.


http://www.forbes.com/2007/05/17/amd-richard-intel-tech-cz_ec_0517amd.html?boxes=custom
