Intel's NetBurst architecture spent virtually its entire career as a subject of controversy. The first incarnation, the Pentium 4 'Willamette', was introduced on November 20, 2000, at clock speeds of up to 1.5GHz. That looked like a quantum leap compared to the 1.0GHz Pentium III 'Coppermine' and the 1.2GHz Athlon 'Thunderbird', but it did not take the world long to discover that clock speed does not tell the whole story. Not only was the chip's performance disappointing, it was also a financial burden for Intel and its customers alike. The Willamette core was twice as big as its predecessor's and could only be used in combination with Rambus RDRAM memory, which was expensive and in short supply. An early socket change, replacing the 423-pin socket with a 478-pin one, did little to raise the CPU's popularity.
A little more than a year later, on January 7, 2002 to be precise, the second generation of NetBurst was born, bearing the codename 'Northwood'. This generation ironed out Willamette's rough spots, thanks in part to chipsets with DDR memory support, Hyper-Threading, a larger L2 cache and broader adoption of the SSE2 instruction set. Intel's strategy of focusing on high clock speeds even began to pay off: when the frequency gap with AMD grew to more than 800MHz, it pushed the smaller chipmaker into financial losses. But all was not rosy on Intel's side of the field, as energy consumption began to rise rapidly. Northwood started out at 2.0GHz with a TDP of 54.3W, but by the time the CPU reached 3.06GHz this had climbed to 81.8W, a staggering figure for those days.

Fortunately a solution was within reach, or so it was believed. The transition from a 130nm to a 90nm production process was meant to reduce power consumption enough to enable a sprint towards clock speeds of 5.0GHz. The Pentium 4 'Prescott' core was supposed to pull this off, in part by using a long pipeline of 31 stages. As it turned out, however, this was the point at which the NetBurst architecture collapsed: because of unexpectedly high leakage currents in the smaller transistors, the intended power savings never materialized. Although leakage affected all processors (including those of Intel's competitors), the Pentium 4 proved particularly susceptible to it: higher clock speeds demand higher voltages, which put more stress on the transistors and increase leakage. Worse, the effect is self-reinforcing: as the chip heats up, leakage increases even further.
All in all, the consequences for Intel were substantial. When Prescott was launched on February 2, 2004, the new core offered Intel's customers neither a gain in performance nor a reduction in power consumption. Instead, market resistance to ever hotter chips proved to be the death blow: despite reasonable success in making the design more energy efficient, the initial targets simply could not be met. Intel could not have chosen a worse moment for this misfortune, since AMD had only recently introduced the Athlon 64, a powerful answer to Northwood in terms of both performance and power consumption. No longer able to take refuge in high clock speeds, the Pentium 4 had become easy prey, and Intel was forced to watch its lead fade away.

In hindsight it is easy to say that Intel had overplayed its hand by that point, but it took the company a while to accept that NetBurst was a dead end. On May 8, 2004 it was finally announced that work on two Pentium 4 successors (the projects codenamed Tejas and Nehalem) had been canceled, and that Intel would tackle things 'differently' in the future with 'dual-core' technology. Rumors flew across the net in abundance, one of the more popular being that Intel would further develop the Pentium M's design. This mobile processor had been under development for a few years by an Intel team in Haifa, Israel, independently of the NetBurst project. Despite the fact that the chip had not been designed for desktops, let alone servers, it performed impressively well. Moreover, the Israeli engineers had succeeded in avoiding the pitfall of ever-increasing power consumption.
So an efficient 64-bit, dual-core Pentium M, codenamed 'Merom', had been in development even before the NetBurst architecture derailed. According to unconfirmed reports, work on Merom started as early as 2002. Still, it was not until April 9, 2005 that Intel officially announced Merom's double offspring: Conroe for desktops and Woodcrest for servers. Together they were known as NGMA: Next Generation Micro-Architecture. Another year went by before the company announced the final name of the new architecture: Core. On the next pages, we will look at the improvements it has to offer.