The 6 Worst AMD GPUs of all time

AMD has been one of the top graphics card manufacturers for well over a decade, after picking up the even-longer-standing ATI. Overall, it’s done pretty well for itself. However, AMD (and ATI before it) has also made several disappointing graphics cards over the years, including GPUs that can barely justify their own existence.

If you’d like to take a trip down memory lane and wince at all AMD’s missteps, here’s a look back at the AMD and ATI GPUs that let us all down.

Radeon 8500: Crippled by bad drivers

ATI, the company AMD bought in order to get Radeon graphics into its portfolio, was the only company in the late ’90s and early 2000s that could stand up to Nvidia, which had quickly established itself as the leader in the graphics card market. In 2001, there was a lot of excitement about what ATI’s Radeon 8000 series GPUs could do. The hardware was good, with the Radeon card expected to beat Nvidia’s flagship GeForce3 Ti 500 in several metrics, and at $299, the 8500 was $50 cheaper than the Ti 500. What could go wrong?

Well, in actual benchmarks, the 8500 was significantly behind the Ti 500, sometimes running only half as fast. On paper, the 8500 should have beaten the Ti 500 by at least a small margin, if not a noticeably large one. It was not quite the flagship reviewers were hoping for, and Anandtech noted that even at $250, the 8500 could only match Nvidia’s GeForce3 Ti 200, which cost under $200.

Ultimately, bad drivers doomed the 8500 and ATI’s hopes of beating Nvidia. The 8500 did quite well in synthetic benchmarks like 3DMark 2001, where it beat the Ti 500, but in actual games it fell noticeably behind. Theoretically, if the 8500 had better software-level optimization for games, it could have stood toe-to-toe with the Ti 500. The situation was so severe that ATI promised it would release new drivers as often as every two weeks. Unfortunately, this was not enough to turn the 8500 into a true competitor to Nvidia’s flagship GPUs.

The fact that the 8500 had so much potential is what makes this situation even more depressing. There have been plenty of other AMD or ATI GPUs with terrible driver support (the RX 5000 series was particularly buggy for some), but the 8500 is easily the most heartbreaking. With all that untapped horsepower, it could have been much more than just another midrange GPU. ATI was able to claim victory with its next-generation Radeon 9000 series, however, so you could argue Radeon 8000 limped so Radeon 9000 could run.

Radeon R9 390X: A space heater that can also play games

We’re going to fast-forward about a decade here because, honestly, ATI and AMD (which acquired ATI in 2006) didn’t really make any particularly bad GPUs after the 8000 series. There were disappointing cards like the HD 3000 and HD 6000 series, but nothing truly bad, and AMD’s 290X scored an impressive blow against Nvidia in 2013. Unfortunately, the years that followed weren’t so kind.

With AMD stuck on TSMC’s 28nm node, the only thing it could really do was sell old GPUs as new GPUs — a tactic known as rebranding. The Radeon 300 series was not the first (nor the last) series to feature rebrands, but it has the unfortunate distinction of being a series of GPUs that was nothing but rebrands.

The R9 290X was rebranded as the R9 390X, which launched in 2015, and while the 290X was fast in 2013, things had changed by 2015. The 390X could just barely catch up to Nvidia’s GTX 980 at resolutions above 1080p, while Nvidia’s new flagship GTX 980 Ti was almost 30% faster. Power was also a big issue for the 390X. In TechPowerUp’s review, the 390X consumed 344 watts on average in games, more than double the GTX 980 and almost 100 watts more than the 290X. Even something as simple as driving multiple monitors or watching a Blu-ray pushed the 390X to around 100 watts.

The Radeon 300 series, and particularly the 390X, cemented the reputation of AMD GPUs as hot and loud. The 290X had already earned that reputation, and the 390X only made it worse.

Radeon R9 Fury X: So close yet so far

The R9 Fury X was the high-end GPU AMD developed after the 200 series, and unlike the 300 series, it was brand-new silicon. Fiji, the codename for the graphics chip inside the Fury X, used the third and newest iteration of the GCN architecture alongside 4GB of cutting-edge High Bandwidth Memory (HBM). It even came with a liquid cooler, prompting an AMD engineer to describe the card as an “overclocker’s dream.” AMD put in a ton of effort to try to beat Nvidia in 2015, but unfortunately, this kitchen-sink approach didn’t work.

AMD was faced with trying to dethrone Nvidia’s GTX Titan X, a top-end prosumer card that retailed for $999, and although the Fury X was slower than the Titan X by a small margin, it was also $350 cheaper. Had everything gone according to plan, AMD could have positioned the Fury X as a viable alternative for gamers who wanted high-end performance for less, without paying for compute features they didn’t need.

But the Titan X wasn’t Nvidia’s only high-end card at the time. The GTX 980 Ti retailed for the same $649 MSRP as the Fury X and had 6GB of memory, lower power consumption, and about the same performance as the Titan X. For all the trouble AMD went through designing a new architecture to improve power efficiency, using HBM to increase memory bandwidth, and strapping a liquid cooler to the GPU, the Fury X lost anyway, and Nvidia barely had to lift a finger. It was simply disappointing, and Anandtech put it best:

“The fact that they get so close only to be outmaneuvered by NVIDIA once again makes the current situation all the more painful; it’s one thing to lose to Nvidia by feet, but to lose by inches only reminds you of just how close they got, how they almost upset Nvidia.”

That comment about the Fury X being an “overclocker’s dream” also caused some controversy, because the Fury X was locked down like no AMD card before it. There was no way to raise the voltage for higher clocks, and the HBM’s clock speed was locked entirely. Anandtech was only able to get its card to 1125MHz, a rise of just 7%. By contrast, GTX 900 series cards were well known for easily hitting 20% overclocks, and sometimes as much as 30% on good samples.

The Fury X wasn’t bad in the way the 390X was; it was bad because it needed to be something more, and AMD just didn’t have it.

Radeon RX 590: Stop, he’s already dead!

Fast-forward three years, and things were looking better for AMD. The performance crown continued to elude it, but at least its RX Vega GPUs in 2017 got it back on equal footing with Nvidia’s then-current x80-class GPU, the GTX 1080. AMD had seemingly planned to launch more RX Vega GPUs for the midrange and low-end segments, but those never materialized, so instead AMD rebranded its hit RX 400 series as the RX 500 series. That was disappointing but not terrible, as Nvidia didn’t have new GPUs in 2017 either.

By the end of 2018, Nvidia had launched a new generation of GPUs, the RTX 20 series, but it didn’t shake things up very much. These cards didn’t provide better value than the GTX 10 series, and although the RTX 2080 Ti was significantly faster than the GTX 1080 Ti (and AMD’s RX Vega 64), it was also exorbitantly expensive. AMD really didn’t need to launch a new GPU, especially not a new midrange GPU, since cards like the RTX 2060 and the GTX 1660 Ti were months away. And yet, AMD decided to rebrand the RX 400 series a second time with the RX 590.

The official raison d’être for the RX 590 was that AMD didn’t like there being such a large performance gap between the 580 and the Vega 56, so it launched the 590 to fill that gap. The thing is, the 590 was just an overclocked 580, which was itself an overclocked 480, and simply adding clock speed didn’t do much for the RX 590 in our review. Not off to a good start.

To reach those ever-increasing clock speeds, power consumption had to climb too, and the RX 590 ended up rated for 225 watts, 75 watts above the original RX 480. The Vega 56 actually consumed less power at 210 watts, which made it look remarkably efficient by comparison. Vega had a bit of a reputation for being hot and loud, but at least it wasn’t the 590.

Radeon VII: A fittingly terrible name for a bad GPU

Although the RX 590 was basically looking for a problem to solve,
