Nvidia makes great graphics cards. It has held the performance crown more often than not, but despite its penchant for innovation and technological advancement, it has put out a number of bad cards, cursed not necessarily by bad technology, but by poor decision-making. These are the GPUs we wish we could forget.

GeForce GTX 480

The way it’s meant to be grilled


Although Nvidia has been in business for over 20 years now, there is only one graphics card the company has ever made that was terrible on a technological level, and that is the GTX 480. Hobbled by the Fermi architecture, the GTX 480 and the entire 400 series were plagued by so many issues that AMD was able to overtake Nvidia as the leading graphics chip maker.

Power consumption and heat were its biggest claims to infamy. At the time, it was shocking that a single GTX 480 could hit 94°C in a normal game while consuming as much power as a dual-GPU system. The 480's stock cooler looked like a grill, prompting detractors to turn Nvidia's slogan "the way it's meant to be played" into "the way it's meant to be grilled."

AMD's HD 5000 series launched first, and Fermi was late to the party. The 480 was the fastest single-die graphics card available, but AMD's dual-GPU HD 5970 delivered more performance, even if its CrossFire support varied from game to game in 2010. The 480 was also too expensive to be competitive.

Nvidia killed off the GTX 400 series just eight months later with the launch of the GTX 500 series, and the new GTX 580 arrived at the same price the 480 had launched at.

GeForce GTX 970

3.5 equals 4

Like the other 900 series cards powered by the legendary Maxwell architecture, the GTX 970 was very well received when it launched, widely regarded as a strong contender for value champion. So why did the 970 end up on this list?

Some new information came to light after the 970 launched. Of the card's 4GB of GDDR5 VRAM, only 3.5GB was usable at full speed; the remaining 0.5GB ran barely any faster than DDR3 and only came into play once the card exceeded 3.5GB. For all intents and purposes, the 970 was a 3.5GB graphics card, not a 4GB one, and this led to a false-advertising lawsuit that Nvidia settled out of court.

In practice, the performance implications of having half a gigabyte less VRAM were mostly academic: games that demanded more than 3.5GB of VRAM were usually too intensive for the 970 anyway.

Still, a handful of games do stumble on the 970's slow memory segment. But performance isn't really the point here: advertising 4GB and delivering 3.5GB isn't acceptable, and it stains the legacy of an otherwise great card. Nvidia's habit of playing fast and loose with graphics card specifications has been a recurring problem ever since.

GeForce GTX 1060 3GB

Ceci n’est pas une 1060


After the 970 debacle, Nvidia never again shipped a graphics card with a slow segment of VRAM and made sure each card was advertised with the correct amount of memory. The CUDA core count, however, was apparently fair game.

It used to be common to see two versions of the same graphics card, such as the GTX 960 2GB and the GTX 960 4GB, and aside from the amount of VRAM, the two were otherwise identical. That changed with the 10 series: the GTX 1060 3GB wasn't just a GTX 1060 6GB with half the memory. There was a catch: it had fewer cores, too.

The lower core count didn't bother reviewers like TechSpot and Guru3D, who still found the GTX 1060 3GB a decent value. But this trend of pairing less VRAM with fewer cores under the same name has caused a lot of confusion, because buyers tend to assume memory is the only difference between models, when VRAM is really only a secondary factor in performance.

The worst example of this was the RTX 4080 12GB, which was set to ship with 22% fewer cores than the RTX 4080 16GB, making it feel more like an RTX 4070. The backlash was so intense that Nvidia canceled the card before launch.

GeForce RTX 2080

One step forward and two back


The GTX 1080 Ti and the GTX 1080 rank among the best graphics cards of all time. The next-generation RTX 20 series that followed introduced real-time ray tracing and AI-powered features, making it far more technologically advanced than the 10 series.

Nvidia gave the RTX 20 series the kind of price tag it thought that technology deserved, with the RTX 2080 Ti coming in at $1,200. Ray tracing was supposed to be the next big thing and justify the premium, but on launch day there were no games that supported it, and there wouldn't be for quite a while. Only by the time the RTX 30 cards arrived was there a decent library of games supporting the new features.

Value-wise, the 20 series was not good. The RTX 2080 cost $100 more than the outgoing flagship for similar performance; at least the 2080 Ti could claim to be about 25% faster than the old flagship. And when ray tracing did come into play, it was so intensive that hitting 60 frames per second was a struggle in most titles. By the time ray-traced games were common, RTX 30 was on the horizon.

Nvidia knew it had overplayed its hand. Much as the GTX 500 series had replaced the 400 series after just eight months, new Super variants of the 2060, 2070, and 2080 arrived less than a year later and fixed the original 20 series' value problems.

GeForce RTX 3080 12GB

How to make a good GPU terrible


We've seen what happens when Nvidia takes a good graphics card and cuts its VRAM and core count without changing its name. Doing the opposite, taking a good card and giving it more memory and more cores, sounds like it should work out better. Instead, it resulted in the most pointless graphics card Nvidia has ever made.

The 3080 12GB was barely an upgrade over the original 3080 10GB. It had more memory, but in our review the 12GB model delivered only about 3% more performance than the 10GB model. At least the name spelled out the difference, a noticeable improvement over the 1060 3GB situation.

The problem with offering a new version of an existing graphics card is pricing, and the 3080 12GB launched in the middle of the GPU shortage, retailing for between $1,250 and $1,600. Since the memory upgrade clearly didn't matter, it was obvious which card you should buy.

The biggest embarrassment for the 3080 12GB was the 3080 Ti, which had the same memory size and bandwidth but more cores and better performance. On review day, the 3080 Ti was actually cheaper, making the 3080 12GB pointless from every angle, just another card released during the shortage that didn't make any sense.

Nvidia’s worst GPUs, so far

To be fair, the 970 was a good card in spite of its memory, the 1060 3GB was merely named poorly, and the RTX 2080 was just overpriced. Even the GTX 480, Nvidia's only true technological failure, was at least the fastest single-die graphics card of its day.

Good technology can't make up for bad business decisions like poor naming conventions and high prices, and these are mistakes Nvidia keeps making. Neither seems likely to go away any time soon, with the RTX 4080 12GB almost making it to market and the RTX 4080 and RTX 4090 too expensive to make sense.

It wasn't hard to predict that Nvidia's prices would keep climbing, and I expect that to continue. The next bad Nvidia GPU probably won't be undone by shady marketing or misleading branding, but by price alone. We would be lucky to see the RTX 4070 priced as low as the RX 7900 XTX.
