The five best AMD GPUs of all time: Looking back at 20 years of Radeon GPUs

AMD Radeon
(Image credit: AMD)

The Radeon brand was established by ATI Technologies back in 2000, and it's persisted through countless architectural changes, paradigm shifts, and even AMD's acquisition of ATI in 2006. The history of Radeon graphics cards is pretty messy, as rival Nvidia GPUs have almost always put up a hard fight. AMD has pretty much always been the underdog throughout Radeon's 23-year history, though its offerings routinely rank among the best graphics cards.

But being the underdog makes winning that much more impressive, and we're not looking at the here and now. Instead, we're looking at AMD's five best graphics cards ever — in our opinion — starting from the bottom and ending at the best and most history-defining Radeon GPU ever made. While we've listed individual GPUs, we're also considering the whole family of each architecture as well, and then selecting the best representative from each.

5 — Radeon RX 480 8GB

The AMD Radeon RX 480 graphics card.

(Image credit: Future)

2016 was a surprisingly optimistic year for AMD, considering how rough the previous two had been. AMD was pretty far removed from its last great GPU, the R9 290X, which offered excellent performance at a competitive price. However, Nvidia struck back pretty hard in 2014 with its Maxwell-based GTX 900-series, and AMD just couldn't catch up. The Radeon 300-series were just 200-series cards with a new name, and the Fury-series were failed flagships.

There was an attitude of reform in the air (especially after AMD just barely staved off bankruptcy), and in 2015 AMD decided it would create the Radeon Technologies Group, a more autonomous graphics division led by Raja Koduri. RTG's first graphics card ended up being the RX 480 based on the Polaris architecture, which had been in development before the reorganization. This graphics card was not a flagship, but instead offered a cost-effective midrange card with a balanced blend of price, performance, and power.

A midrange graphics card is normally not very exciting, but the RX 480 was different. AMD got PC gamers excited with its "The Uprising" marketing campaign, also known as the "Radeon Rebellion." AMD hired marketing agency Brand & Deliver to make ads with phrases like "VR is not just for the 1%" and "the gaming revolution will be streamed" and even "don't silence us, silence the GPU." It was very indicative of what direction RTG and Koduri wanted to take Radeon, making it the brand for gamers who wanted high-end graphics cards but couldn't afford them.

The Radeon Polaris GPU.

(Image credit: AMD)

When the RX 480 launched, there were two models: one with 4GB of VRAM at $200 and one with 8GB for $250. The 8GB model was obviously the better choice, as it had double the memory (and more bandwidth) for just $50 extra. However, on launch day it wasn't clear that the RX 480 8GB was much of a revolution at all. It was about as fast as the existing Radeon R9 390 and GeForce GTX 970, both of which cost just $280. Then, the much more efficient and slightly faster GTX 1060 6GB launched a month later, raining on AMD's parade.

But in the long run, the RX 480 8GB (the 4GB model wasn't nearly as popular) aged quite well. Although initially slower than the 1060 6GB by a decent margin, a series of driver updates brought its performance much closer. By the time the RX 580 (a refreshed 480) came out in 2017, the gap had all but closed. As an extension of the 480 with higher performance for the same price, the 580 was also quite good.

Although the RX 480 didn't live up to its strange, communist-themed marketing campaign, it had a very long life. It only recently stopped receiving regular driver updates, a run of over seven years. But in those seven years, none of the talk about VR really came to fruition, as VR remains largely limited to high-end hardware. The AMD reference design also had issues, drawing well over the specified 75W from the PCIe x16 slot; custom cards were much better. Nevertheless, the card is still remembered fondly by much of the PC community even today, and it enjoys a positive reputation.

4 — Radeon RX 6800 XT

Radeon RX 6800 XT

(Image credit: AMD)

AMD has always had a very hard time making flagship GPUs, often skipping them entirely. ATI made almost as many flagship Radeon cards as AMD has, even though ATI was only on its own for six years while AMD has owned the brand for 17. In 2020, it had been three years since AMD had made a flagship gaming GPU (the Radeon VII doesn't really count), and it was finally time for a comeback.

After the failure of RX Vega in 2017, AMD decided to take its time to release its next flagship. First, AMD laid the groundwork by jumping to TSMC's 7nm node and making a new RDNA graphics architecture in 2019 with the RX 5000-series, which ended with the upper-midrange RX 5700 XT based on the Navi 10 GPU.

The combination of the 7nm node and the more gaming-focused RDNA 1 architecture got AMD on a good footing, and AMD focused next on developing what was essentially a larger Navi 10 with a newer architecture, RDNA 2. That GPU was codenamed Navi 21, aka Big Navi, and it was AMD's first flagship in over three years when it launched in 2020. Although the top-end flagship was the RX 6900 XT (and later RX 6950 XT), it was the much more affordable RX 6800 XT that really mattered.

AMD RDNA2 Architecture

(Image credit: AMD)

The 6800 XT was a different beast from any other AMD graphics card before it. It could hit well over 2GHz out of the box, far higher than Nvidia GPUs of the time, which typically topped out just under 2GHz. It also came with 16GB of GDDR6 memory and a ton of graphics cores. Perhaps most importantly, AMD's Infinity Cache proved incredibly useful, packing 128MB of L3 cache onto the GPU and allowing the 256-bit bus to behave more like a 384-bit interface in terms of effective bandwidth.
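The bandwidth claim can be sanity-checked with back-of-the-envelope arithmetic: if a fraction of memory traffic hits the on-die cache, DRAM only has to service the misses, so effective bandwidth scales roughly as raw bandwidth divided by the miss rate. A minimal sketch, where the one-third hit rate is an illustrative assumption rather than AMD's published figure:

```python
# Rough effective-bandwidth model for a GPU with a large last-level cache.
# Simplifying assumption: cache hits are "free" (served at far higher
# on-die bandwidth), so DRAM only services the misses.

def effective_bandwidth(dram_gbps: float, hit_rate: float) -> float:
    """Effective bandwidth in GB/s, given raw DRAM bandwidth and cache hit rate."""
    return dram_gbps / (1.0 - hit_rate)

# RX 6800 XT: 256-bit bus x 16 Gbps GDDR6 = 512 GB/s raw bandwidth.
raw = 256 / 8 * 16  # 512 GB/s

# A hypothetical ~33% hit rate is already enough to match a 384-bit bus:
print(effective_bandwidth(raw, 1 / 3))  # ~768 GB/s, a 384-bit equivalent
```

In practice AMD reported much higher hit rates at common gaming resolutions, which is why the card punched well above its nominal bus width.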

The RX 6800 XT's primary competition was Nvidia's GeForce RTX 3080 10GB. The 6800 XT was roughly as fast while using less power, and it cost $650 instead of $700. AMD did add support for ray tracing with RDNA 2, though the architecture was clearly more focused on rasterization performance. It was Radeon's Ryzen moment... Sort of.

The problem with any GPU in 2020 (and 2021, and even into early 2022) was that you just couldn't buy them, or if you could, they would often cost twice as much as they should. It was technologically impressive that AMD had finally caught up to Nvidia in nearly every way, but it didn't matter all that much when nobody could buy the 6800 XT anyway.

The GPU shortage crisis didn't last forever though. It ended by mid-2022, with the death of Ethereum mining, and prices began a steady decline. Eventually, RTX 30-series GPU prices leveled out roughly $50 to $100 above MSRP, but RX 6000-series cards just kept getting cheaper. They hit MSRP, and then they dropped well below MSRP.

Today, the cheapest RX 6800 XT cards cost around $480, while the cheapest RTX 3080 10GB GPUs only ever hit around $700 (a few briefly hit $600, but that was very short-lived). Although the 6800 XT was behind in ray tracing and upscaling tech, it definitely had a big advantage in bang for the buck. Even the new RX 7800 XT is more of a lateral move from the 6800 XT rather than a major improvement, albeit with improved power efficiency.

3 — Radeon HD 7970

The Radeon HD 7970.

(Image credit: Amazon)

2010 had seen AMD nearly overtake Nvidia in GPU marketshare, a record the company hasn't come close to matching since. AMD's momentary success was largely down to the failure of Nvidia's Fermi-based GTX 400-series cards, which launched in early 2010, and as the decade got underway, AMD had to contend with a revitalized Nvidia. AMD wasn't exactly doing well financially, so this could become a big problem fast.

Historically, AMD (and ATI previously) had always been able to rely on its ability to get to cutting-edge nodes quicker than Nvidia. This strategy didn't always work perfectly, but it did at least give Radeon GPUs a good advantage. AMD was progressing to 28nm after its 40nm HD 6000-series, but Nvidia's upcoming GTX 600-series would also be on 28nm. That was one advantage AMD no longer had.

But it wasn't all doom and gloom for Radeon. AMD was also revamping its graphics architecture, swapping out the Terascale architecture for Graphics Core Next (GCN). GCN debuted in the HD 7000-series, led by the flagship HD 7970 in early 2012. It was significantly faster than the previous generation flagships, the HD 6970 and GTX 580, which was good, though that extra performance did come at a price.

Unfortunately for AMD, Nvidia's GTX 680 was even better. It was a little faster, it was cheaper, it was more efficient, and it was smaller. Technologically and economically, AMD was beaten on the four most important points. AMD's moment in the sun was over, and Nvidia was right back in first place where it had been with the GTX 200-series.

The HD 7970 GHz Edition.

(Image credit: Future)

Except, AMD didn't take its loss lying down. Sure, there was nothing it could do about power efficiency or die size without a radical upgrade, but the HD 7970 wasn't that far behind the GTX 680. If AMD could just make a 7970 with a higher clock speed, then it could at least claim to have one of the world's fastest GPUs.

So, in the middle of 2012, AMD released the HD 7970 GHz Edition, featuring a 1GHz base clock speed, 75MHz higher than the original. While the 7970 GHz did indeed close the gap with the 680 (and maybe even exceeded it), much of the performance gains had to do with driver updates that also applied to the 7970. Perhaps if AMD hadn't rushed the HD 7000-series so much, the GTX 680 wouldn't have been such a big problem.

In the end, it was actually the original 7970 that seemed to be the most appealing high-end card: driver updates had made it much faster than it was at launch, it could easily hit 1GHz with an overclock, and it was much cheaper than both the 7970 GHz Edition and the 680. Although 2012 was messy for AMD, it did end up winning the generation overall, if only by a hair.

2 — Radeon HD 5870

The HD 5870 and HD 5970.

(Image credit: Future)

In 2006, AMD acquired ATI, the company behind Radeon. AMD sought to marry its Athlon CPUs with Radeon GPUs, but that would take a while. Of course, ATI was in the middle of developing more desktop graphics cards, and now AMD had a say in it. AMD had decided to price its 2007 HD 3000-series pretty aggressively, especially with the flagship HD 3870. This would herald a change in direction for AMD and ATI.

Starting with the HD 4000-series, AMD pursued its "small die strategy." AMD believed it could nearly match Nvidia's massive flagship GPUs with smaller GPUs (big and small referring to the size of the silicon die), which would save lots of money in both development and production. Those savings could be used to make small die GPUs much cheaper than Nvidia's flagships, which would allow AMD to take tons of market share and break Nvidia's grip on the market.

The HD 4000-series in 2008 was mostly a proof of concept, as the HD 4870 couldn't quite catch the GTX 280, but it came awfully close. The HD 5000-series in 2009 was a better realization of AMD's new approach to gaming GPUs, with the HD 5870 leading as the flagship. The 5870 easily beat Nvidia's GTX 285, but the 285 was just a refreshed 280. Surely, Nvidia's next flagship would bring AMD to its knees.

Except Nvidia gave the world GTX 400-series GPUs powered by the Fermi architecture, which is universally agreed to be Nvidia's worst disaster ever. The GTX 480 was faster than the HD 5870 in most games, but it consumed an unbelievable amount of power for the time and it cost $500, versus the 5870's $350 price tag. This was the small die strategy in action, going exactly according to AMD's plan.

The HD 5000-series was so efficient that AMD was even able to make a dual-GPU graphics card, the HD 5970, which essentially had two 5870s. Of course, this card relied on CrossFire, which was never all that reliable, but nonetheless it was powerful when it did work and didn't require a nuclear power plant to operate.

Thanks to the small die strategy, AMD also came closer than ever to overtaking Nvidia in marketshare. According to Jon Peddie Research, AMD hit an all-time high of 44.5% GPU marketshare in Q2 of 2010, months after the launch of the HD 5000-series. AMD never got any further than that, but for a brief time it was remarkably close to achieving majority marketshare.

Unfortunately, there was one part of the small die strategy that didn't really work: money. Although AMD had gained lots of marketshare and sold lots of GPUs, it wasn't really turning a profit. Nvidia on the other hand was raking in cash even with the infamous GTX 400-series, and that forced AMD to abandon its aggressive prices. The HD 5870 is probably the best bang for buck flagship GPU there ever was, and we'll probably never see anything like it ever again.

1 — Radeon 9700 Pro

The Radeon 9700 Pro.

(Image credit: VGA Museum)

While the Radeon 9700 Pro is definitely the best Radeon GPU ever made, technically it's not AMD's best as it was made back before the company acquired ATI. Still, to not mention this legendary GPU would be wrong, because it's arguably the ancestor of modern gaming GPU flagships.

Throughout the late 90s and early 2000s, the field of companies making gaming GPUs for desktops sharply declined. This was largely because Nvidia was so successful, and the company's GeForce 256 with hardware-accelerated transform and lighting highlighted this. ATI was the only company that was able to stand up to Nvidia, though its brand-new Radeon 7000 and 8000 cards weren't exactly tough competition.

ATI decided to do something radical and totally change the game. Most high-end graphics cards back then used a graphics processor roughly 100mm2 to 140mm2 in size. Instead of making a normally sized GPU, ATI planned out the largest graphics chip the world had ever seen: R300. Coming in at 215mm2, it was over twice the size of the company's previous R200-based Radeon 8000 GPUs, 50% larger than Nvidia's GeForce 4 Ti, and contained 75% more transistors.

Such a massive disparity in size would also mean an equally large disparity in raw horsepower. The 2002 contest between the flagship Radeon 9700 Pro and Nvidia's puny GeForce 4 Ti 4600 was always going to be a bloodbath. The 9700 Pro achieved what is quite possibly the most complete victory a flagship GPU has ever obtained against another bona fide flagship, beating the Ti 4600 in virtually everything by a large margin. The 9700 Pro, though only remembered by PC veterans and historians these days, is synonymous with unbeatable performance.

But beyond achieving an incredible victory, ATI had proven that big GPUs were the way forward for flagships, and there was room to grow. Nvidia started making 200mm2 GPUs in 2003, and then both companies were making 300mm2 chips in 2004. By 2007, Nvidia's flagship GeForce cards were nearly 500mm2, which is still pretty large by today's standards.

Although Nvidia claims to have invented the first GPU with the GeForce 256 (a very dubious claim), ATI arguably introduced the first recognizably modern high-end gaming graphics card. The opportunity to launch such a product comes along extremely infrequently, and it's an achievement Radeon (and AMD too, sort of) gets to claim for itself.

Matthew Connatser

Matthew Connatser is a freelancing writer for Tom's Hardware US. He writes articles about CPUs, GPUs, SSDs, and computers in general.

  • Colif
    My ATI Rage Pro feels overlooked just because it couldn't do 3d. Thats so short sighted :)

    I knew my card wouldn't be on list.

    the ones inside the consoles are the most used.
  • Alvar "Miles" Udell
    Really? No mention of the HD 4870, the card which immediately caused Nvidia to cut their prices by a third, or more, and made Radeon GPUs actually competitive again after essentially 4 generations of scrap, their first truly competitive card since the 9600XT?
  • pdegan2814
    I loved my 9700 Pro, that thing was awesome.
  • COLGeek
    Having owned all of those, I'm not so certain that the "all time" aspect of the story was achieved.

    More than a couple (below) that were arguably more significant in the annals of time that the short list of five in the article.

    https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units
  • maestro0428
    I would say the x1950xt should be on this list. Great card.
  • brandonjclark
    Alvar Miles Udell said:
    Really? No mention of the HD 4870, the card which immediately caused Nvidia to cut their prices by a third, or more, and made Radeon GPUs actually competitive again after essentially 4 generations of scrap, their first truly competitive card since the 9600XT?
    It's literally mentioned in the article. Do a search, bro!


    I used to own a 9800xt (R360 die). At the time, it was an incredible beast.

    https://www.techpowerup.com/gpu-specs/radeon-9800-xt.c4
  • Amdlova
    The best in class is the 9700pro-xt
    The 3870
    The 4870
    The 6870
    And the rx 480/580

    All these card have performance / dollar ratio
  • Noir106
    Maybe I'm just newer to the scene, but I think the 5700 xt and r9 290x deserve spots here and the 9700 pro really shouldn't even be here since AMD didn't acquire ATI until after. I feel like this recent gen of AMD Radeon cards just left a bad taste in everyone's mouth so no one's willing to praise Navi or Vega.
  • mitch074
    Alvar Miles Udell said:
    Really? No mention of the HD 4870, the card which immediately caused Nvidia to cut their prices by a third, or more, and made Radeon GPUs actually competitive again after essentially 4 generations of scrap, their first truly competitive card since the 9600XT?
    its little brother the HD4850 stayed a recommendation on this very website for 2 years and 7 months - It was a budget GPU from the get-go, but damn, was it a good investment !
    The R300 in general was the first fully programmable GPU ever - it supported DirectX 9.0c before it even came out ! Damn, if you tried hard enough, you could run Windows 7 with all bells and whistles on it.
  • Makaveli
    I've owned all of these cards also.

    Good times!
    Reply