Of all the players in the world of computing, Intel is one of the oldest as well as one of the most titanic. It can be hard to get excited about Intel, whether the company is dominating as it did in the 2010s or floundering as it is in the 2020s; it’s difficult to fall in love with the status quo, or with a giant that loses to smaller rivals. The opposite is true for Intel’s rival AMD, which has always been the underdog, and everyone (usually) loves the underdog.
But Intel couldn’t become the monolithic giant it is today without being a hot and innovative upstart once upon a time. Every now and then, Intel has managed to shake things up on the CPU scene for the better. Here are six of Intel’s best CPUs of all time.
Intel 8086: Intel becomes a leader
The Intel 8086 basically ticks all the boxes for what makes a CPU great: It was a massive commercial success, it represented significant technological progress, and its legacy has endured so well that it’s the progenitor of all x86 processors. The x86 architecture is named after this very chip, in fact.
Although Intel claims the 8086 was the first 16-bit processor ever launched, that’s only true with very specific caveats. The 16-bit computing trend emerged in the 1960s, when machines used multiple chips to form one complete processor capable of 16-bit operation. The 8086 wasn’t even the first single-chip 16-bit processor, having been pipped at the post by the General Instrument CP1600 and the Texas Instruments TMS9900. In actuality, the 8086 was rushed out to put Intel on even ground with its rivals, and finally came out in 1978 after a development period of just 18 months.
Initially, sales for the 8086 were poor due to pressure from competing 16-bit processors, and to address this, Intel decided to take a gamble and embark on a massive advertising campaign for its CPU. Codenamed Operation Crush, Intel set aside $2 million just for advertising through seminars, articles, and sales programs. The campaign was a great success, and the 8086 saw use in about 2,500 designs, the most important of which was arguably IBM’s Personal Computer.
Equipped with the Intel 8088, a cheaper variant of the 8086, the IBM Personal Computer (the original PC) launched in 1981 and it quickly conquered the entire home computer market. By 1984, IBM’s revenue from its PC was double that of Apple’s, and the device’s market share ranged from 50% to over 60%. When the IBM PS/2 came out, the 8086 itself was finally used, along with other Intel CPUs.
The massive success of the IBM PC and by extension the 8086 family of Intel CPUs was extremely consequential for the course of computing history. Because the 8086 was featured in such a popular device, Intel of course wanted to iterate on its architecture rather than make a new one, and although Intel has made many different microarchitectures since, the overarching x86 instruction set architecture (or ISA) has stuck around ever since.
The other consequence was an accident. IBM required Intel to find a partner that could manufacture additional x86 processors, just in case Intel couldn’t make enough. The company Intel teamed up with was none other than AMD, which at the time was just a small chip producer. Although Intel and AMD started out as partners, AMD’s aspirations and Intel’s reluctance to give up ground put the two companies on a collision course that they’ve stayed on to this day.
Celeron 300A: The best budget CPU in town
In the two decades following the 8086, the modern PC ecosystem began to emerge, with enthusiasts building their own machines from off-the-shelf parts just like we do today. By the late 90s, it became pretty clear that if you wanted to build a PC, you wanted Windows, which only ran on x86 hardware. Naturally, Intel became an extremely dominant figure in PCs, since there were only two other companies with an x86 license (AMD and VIA).
In 1993, Intel launched the very first Pentium CPU, and it would launch CPUs under this brand for years to come. Each new Pentium was faster than the last, but none of these CPUs were particularly remarkable, and definitely not as impactful as the 8086. That’s not to say these early Pentiums were bad; they simply met standard expectations. This was all fine until AMD launched its K6 CPU, which offered similar levels of performance to Pentium CPUs at lower prices. Intel had to respond to AMD, and it did so with a brand-new line of CPUs: Celeron.
At first glance, Celeron CPUs didn’t appear to be anything more than cut-down Pentiums with a lower price tag. But overclocking these chips transformed them into full-fledged Pentiums. CPUs based on the Mendocino design (not to be confused with AMD’s much later Mendocino APUs) were particularly well regarded because they had L2 cache just like higher-end Pentium CPUs, albeit far less of it, and it ran on-die at full clock speed.
Of the Mendocino chips, the 300A was the slowest but could be overclocked to an extreme degree. In its review, AnandTech was able to get it to 450MHz, a 50% overclock. Intel’s 450MHz Pentium II sold for about $700, while the Celeron 300A sold for $180, which made the Celeron extremely appealing to those who could accept the slightly lower performance that resulted from having less L2 cache. AnandTech concluded that between AMD’s K6 and Intel’s Celeron, the latter was the CPU to buy.
In fact, the 300A was so compelling to AnandTech that for a while, it simply recommended buying a 300A instead of slightly faster Celerons. And when the 300A got too old, the publication started recommending newer low-end Celerons in its place. Among AnandTech’s CPU reviews from the late 90s and early 2000s, these low-end Celerons were the only Intel CPUs that consistently got a thumbs up; even AMD’s own low-end CPUs weren’t received as warmly until the company launched its Duron series.
Core 2 Duo E6300: The empire strikes back
Although Intel had an extremely strong empire in the late 90s, cracks were beginning to appear starting in the year 2000. This was the year Intel launched Pentium 4, based on the infamous NetBurst architecture. With NetBurst, Intel had decided that rapidly increasing clock speed was the way forward; Intel even had plans to reach 10GHz by 2005. As for the company’s server business, Intel launched Itanium, its first 64-bit processor, built on the all-new IA-64 architecture rather than x86, and hopefully (for Intel) the server CPU everyone would be using.
Unfortunately for Intel, this strategy quickly fell apart, as it became apparent NetBurst wasn’t capable of the clock speeds Intel thought it was. Itanium wasn’t doing well either, seeing slow adoption even before AMD brought 64-bit computing to x86. AMD seized the opportunity to start carving out its own place in the sun, and Intel began rapidly losing market share in both desktops and servers. Part of Intel’s response was to simply bribe OEMs not to sell systems that used AMD, but Intel also knew it needed a competitive CPU, as the company couldn’t keep paying Dell, HP, and others billions of dollars forever.
Intel finally launched its Core 2 series of CPUs in 2006, fully replacing all desktop and mobile CPUs based on NetBurst, as well as the original Core CPUs that had launched solely for laptops earlier in the year. Not only did these new CPUs bring a fully revamped architecture (the Core architecture bore almost no resemblance to NetBurst), but they also included the first quad-core x86 CPUs. Core 2 didn’t just put Intel on an equal footing with AMD; it put Intel back in the lead outright.
Although high-end Core 2 CPUs like the Core 2 Extreme X6800 and the Core 2 Quad Q6600 amazed people with high performance (the X6800 didn’t lose a single benchmark in AnandTech’s review), there was one CPU that really impressed everyone: the Core 2 Duo E6300. The E6300 was a dual-core with decent overall performance, but just like the 300A, it was a great overclocker. AnandTech was able to overclock its E6300 to 2.59GHz (from 1.86GHz at stock), which allowed it to beat AMD’s top-end Athlon 64 FX-62 (another dual-core) in almost every single benchmark the publication ran.
The Core 2 series and the Core architecture revived Intel’s technological leadership, the likes of which hadn’t been seen since the 90s. AMD meanwhile had a very difficult time catching up, let alone staying competitive; it didn’t even launch its own quad-core CPU until 2007. Core 2 was just the beginning though, and Intel had no desire to slow down. At least not yet.
Core i5-2500K: Leaving AMD in the dust
Unlike NetBurst, Core wasn’t a dead end, which allowed Intel to iterate and improve the architecture with each generation. At the same time, the