Intel has a serious problem. Notice the backgrounds of Noyce, Moore,
and Grove. Compare that to Barrett's background. The same lesson is
provided by Hewlett-Packard: Dave Packard literally had to return to
HP to rescue the company from John Young's management. Again, notice
Young's background. Apple Computer was originally created by computer
guys. When Apple was on the verge of demise, what were the
backgrounds of Sculley and Spindler, who took it there, and how did
the entire Apple board of directors (BoD) change? For that matter,
what words did stockholders shout at the BoD before it would change
its mind and fire Spindler? These are damning facts. To find
failure, start with the background of top management.

The trend is not limited to the computer industry. First Energy of
Ohio singlehandedly created the 14 Aug 2003 Northeast blackout, a
fact that virtually no one seems to know. Again, look at the
backgrounds of Anthony Alexander, his entire corporate staff, and
everyone on the First Energy BoD. Again, it is management with no
background in, and no knowledge of, the core business or any related
industry.
This is only part of the answer to where Intel is going. There is a
serious technical problem inside Intel, and management is aggravating
it. The Economist article did provide a warning for the layman, but
not enough information to understand why. Serious problems exist for
Intel. But to understand why, one must first have 'dirt under the
fingernails' and not a stockbroker's bean-counter perspective. Why
did IBM literally flirt with bankruptcy only to come back decades
later so strong? If one can answer that question, then one is ready
to see where Intel may or may not be going. Why are Xerox and AT&T
all but bankrupt? To understand why, one must literally start the
story in the 1970s. Again, the lessons of these stories might be
applicable to Intel. Intel has a problem, and it is not AMD.
Marcus Didius Falco wrote:
> http://economist.com/printedition/displayStory.cfm?Story_ID=3321802
> http://economist.com/printedition/PrinterFriendly.cfm?Story_ID=3321802
> Semiconductors
> What Intel's latest stumble means for the chip industry's rule of thumb.
> I'm so sorry, says Barrett
> IT IS not often that the chief executive of one of the world's
> biggest companies gets down on one knee and begs for forgiveness.
> Yet that is what
> Craig Barrett of Intel, the world's largest chipmaker, did this week
> at an industry conference in Florida. He was only joking, of
> course. But his apology for Intel's decision to cancel the next
> version of its flagship Pentium 4 chip highlights the latest in a
> series of stumbles by the company, which has once again been forced
> to follow the lead of its much smaller but increasingly feisty
> competitor, Advanced Micro Devices (AMD).
> At issue is the best approach to making faster chips. For years, Intel
> has steadily increased the clock speed of its processors, the fastest
> of which now run at 3.4GHz, or 3.4 billion ticks per second. But it
> has now fallen victim to the law of diminishing returns. Although
> boosting the clock speed increases performance, it also increases the
> power consumption of the chip and the need for cooling. (Some of the
> very high-speed PCs used by serious gamers are even water-cooled.)
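
(As an aside, the power penalty the article mentions is easy to see
with the usual CMOS rule of thumb that dynamic power goes roughly as
capacitance times voltage squared times frequency, and higher clocks
usually also need higher supply voltage. The sketch below uses
made-up illustrative numbers, not Intel figures.)

# Rough sketch: CMOS dynamic power ~ C * V^2 * f.
# All values are hypothetical, chosen only to show the shape of the curve.
def dynamic_power(cap_farads, volts, freq_hz):
    return cap_farads * volts ** 2 * freq_hz

base = dynamic_power(1e-9, 1.2, 2.0e9)    # a notional 2GHz part
faster = dynamic_power(1e-9, 1.4, 3.4e9)  # higher clock, bumped voltage
print("power ratio for a 1.7x clock bump: %.1fx" % (faster / base))
# -> roughly 2.3x the power for 1.7x the clock, before leakage is counted
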
> So, rather than concentrating on clock speed, Intel has decided to
> boost the performance of future chips in other ways, such as
> increasing the amount of on-board cache memory and, in the coming
> years, switching its chips to a multi-core design. This means putting
> multiple cores (in effect, complete processors) into a single
> chip. These cores can run more slowly, consume less power and
> generate less heat, while collectively providing more processing
> power than a single core. "Multi-core is a way to achieve additional
> performance without turning up clock rates," says Dean McCarron, an
> industry analyst at Mercury Research. This idea is not new: IBM,
> Sun and Hewlett-Packard already sell high-end computers powered by
> their own multi-core chips. But it is only recently that PC software
> has been able to exploit multiple processors.
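
(For what it is worth, "PC software able to exploit multiple
processors" just means the work is split into independent pieces that
separate cores can chew on at once. A minimal sketch, with a
hypothetical workload; the point is only that the software, not the
chip, decides whether a second core helps.)

# Minimal sketch: spreading independent CPU-bound work across cores
# using Python's standard-library multiprocessing module.
from multiprocessing import Pool

def crunch(n):
    # stand-in for any CPU-bound piece of work
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2000000] * 8      # eight independent chunks of work
    with Pool() as pool:      # one worker per available core by default
        results = pool.map(crunch, jobs)
    print(len(results), "chunks done")
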
> Intel's decision to de-emphasise clock speeds is just the latest
> example of how the company has reluctantly ended up following where
> AMD has previously led. (Earlier this year, AMD forced Intel to make
> a U-turn in its 64-bit-chip strategy.) AMD has long argued that
> there is more to performance than clock speed, and gives its chips
> model numbers giving some idea of their power. Its new Athlon 64
> 4000+ chip, for example, announced this week, runs at 2.4GHz, but
> its name implies rough equivalence with a 4GHz Intel chip. Intel is
> now adopting similar model numbers.
> Having abandoned its obsession with raw speed, Intel is embracing the
> multi-core approach with great enthusiasm. Paul Otellini, Intel's
> number two, who is expected to take over from Mr Barrett next May,
> said last month that he expects 40% of desktop chips sold, and 80% of
> server chips, to be multi-core by the end of 2006. The switch to
> multi-core is, he says, a sea change in computing and a key inflection
> point for the industry.
> What does all this mean for Moore's law, the rule of thumb coined by
> Gordon Moore, Intel's co-founder, which states that the amount of
> computing power available at a given price doubles every 18 months?
> For most people, Moore's law manifests itself as a steady increase in
> clock speed from one year to the next. The cancellation of the 4GHz
> version of the Pentium is Intel's clearest admission yet that clock
> speed is no longer the best gauge of processor performance:
> henceforth, it will increasingly take a back seat to other
> metrics. But the law itself, the death of which has been announced
> many times, will live on. Mr Barrett insisted this week that it would
> continue to apply for at least another 10-15 years. That is because
> multi-core designs mean chips' performance can continue to increase
> even if the formerly much-trumpeted clock speed does not.
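
Taking the article's 18-month rule of thumb at face value, Barrett's
"another 10-15 years" is easy to put into numbers. The little
calculation below is just that rule applied mechanically, not a
prediction of any kind.

# The 18-month doubling rule applied mechanically over 10 and 15 years.
for years in (10, 15):
    doublings = years / 1.5
    print("%d years -> %.1f doublings -> about %.0fx"
          % (years, doublings, 2 ** doublings))
# 10 years -> 6.7 doublings -> about 102x
# 15 years -> 10.0 doublings -> about 1024x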