AMD claims most gamers “have no use for more than 8GB” of VRAM, after new Radeon GPU launch
The full tweet:
Majority of gamers are still playing at 1080p and have no use for more than 8GB of memory. Most played games WW are mostly esports games. We wouldn't build it if there wasn't a market for it. If 8GB isn't right for you then there's 16GB. Same GPU, no compromise, just memory options.
I don't think he's that far off; eSports games don't have the same requirements as AAA single-player games.
This is a much more nuanced take than the headlines imply.
Are you saying journalists will publish articles with inflammatory headlines to maximize engagement with their ad-based website funding? Nah way, I don believe it.
I still see it being an issue of pricing and questionable value (over older/used/already-owned) of a bottlenecked part, particularly when it ends up with users who aren't esports users (for a multitude of reasons). In other words: stagnation.
It's more obvious with AMD selling new 4GB cards still in the budget category rather than ultra-budget, as in they aren't raising the floor. The jokes still work:
EDIT: There were even Polaris GPUs with 8GB.
In this case, Intel's options of 10/12GB sound like a more reasonable middle ground.
Counter point.
https://prosettings.net/blog/1440p-competitive-gaming/
Increased resolution has been the trend for a bit now even in these competitive games.
ETA, let's also not pretend that those who play esports games only play esports games too.
Then put 8GB in a 9060 non-XT and sell it for $200. You're just wasting dies that could've been used to make more 16GB cards available (or at least a 12 GB version instead of 8).
That wouldn't work. AMD uses a lot of cheaper, slower memory chips in unison to achieve high bandwidth; that's why their cards have more VRAM than Nvidia's, not because the amount matters, but because more memory chips together add up to higher speeds.
Nvidia uses really expensive, high-speed chips so they can use fewer memory chips to get the same memory bandwidth.
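A rough back-of-the-envelope sketch of that point, in Python. The chip counts and per-pin data rates below are illustrative placeholders, not actual AMD or Nvidia SKU specs; the idea is just that total bandwidth equals bus width times per-pin data rate, so many slower chips on a wide bus can rival fewer fast chips on a narrow one.

```python
# Peak memory bandwidth (GB/s) = total bus width (bits) * data rate (Gbps per pin) / 8.
# Chip counts and data rates here are made-up examples, not real card specs.

def bandwidth_gb_s(num_chips: int, bits_per_chip: int, gbps_per_pin: float) -> float:
    """On typical GDDR, each memory chip contributes a 32-bit slice of the bus."""
    return num_chips * bits_per_chip * gbps_per_pin / 8

# Many cheaper, slower chips on a wide bus...
print(bandwidth_gb_s(8, 32, 18))   # 576.0 GB/s
# ...versus fewer, faster (more expensive) chips on a narrower bus.
print(bandwidth_gb_s(4, 32, 28))   # 448.0 GB/s
```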
Then AMD lied and manipulated gamers by advertising that you need 16GB of VRAM.
Memory speed > memory amount
Tell that to game developers. Specifically the ones that routinely don't optimize shit.
Or to gamers who insist on playing these unoptimized games at max settings. $80 for the game, and then $1000 on a GPU that can run it.
Do you just not want more money?
Nvidia have dropped the ball epically and you have a golden opportunity to regain some GPU share here.
IMHO the problem is only partly the 8GB of VRAM (for 1080p). An at least equal part of the problem is the shitty optimisation of some game engines, especially Unreal Engine 5.
There is nothing wrong with Unreal Engine and UE5 is not meaningfully different than UE4. The problem is that developers only “optimize” to pass console certifications while PC gamers are left out in the cold. It also doesn’t help that PC gamers have a lot more options and will often insist on choosing settings that are far beyond the capabilities of their particular hardware.
Guess I'll stick with my GTX 1070TI until next century when GPU manufacturers have passed the bong to someone else. Prices are insane for the performance they provide these days.
Same. I've encountered exactly one game, ever, that I couldn't play with that card, and that was last month with Doom: The Dark Ages, which won't even boot without RTX support.
Literally never had a single other problem over the past 7 years of use. I played Cyberpunk 2077 with that card. I'm currently playing Clair Obscur with that card and it looks stupendously beautiful on it.
Greetings fellow 1070Ti user.
Lmao. AMD out here fumbling a lay up.
Seriously.
All AMD had to do here was create a 12GB and a 16GB version (instead of 8 and 16), then gesture at all the reviews calling the RTX 5060 8GB DOA because of its very limiting VRAM quantity.
8GB VRAM is not enough for most people. Even 1080p gaming is pushing the limits of an 8GB card. And this is all made worse when you consider people will have these cards for years to come.
Exactly. Even if you accept their argument that 8GB is usually enough today for 1080P (and we all know that is only true for high performance e-sports focused titles), it is not true for tomorrow. That makes buying one of those cards today a really poor investment.
Even worse when you consider the cost difference between 8GB and 16GB can’t be that high. If they ate the cost difference and marketed 16GB as the new “floor” for a quality card, then they might have eaten NVIDIA’s lunch where they can (low-end)
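Purely as placeholder arithmetic (the per-GB figure below is an assumption, not a real GDDR6 price), the extra memory on the bill of materials looks modest next to a card's retail price:

```python
# Placeholder arithmetic only: the per-GB price is a made-up assumption,
# not an actual GDDR6 contract or spot price.
extra_vram_gb = 8            # going from an 8GB to a 16GB configuration
assumed_usd_per_gb = 3.0     # hypothetical memory price, USD per GB
extra_bom_cost = extra_vram_gb * assumed_usd_per_gb
print(f"Extra memory cost on the bill of materials: ~${extra_bom_cost:.0f}")  # ~$24
```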
I mean honestly, yeah. With a simple 4 GB chip they could have won the low end and not screwed over gamers.
They really seem to have forgotten their roots in the GPU market, which is a damn shame.
I just ditched my 8gb card because it wasn't doing the trick well enough at 1080p and especially not at 1440p.
So if I get this straight, AMD agrees that they need to optimize games better.
I hate upscaling and frame gen with a passion, it never feels right and often looks messy too.
The First Descendant became a 480p mess when there were a bunch of enemies, even though I have a 24GB card and a pretty decent PC to accompany it.
I'm now back to heavily modded Skyrim, and damn do I love the lack of upscaling and frame gen. The Oblivion stutters were a nightmare and made me ditch the game within 10 hours.
FSR4 appears to solve a lot of problems with both upscaling and frame gen – not just in FSR, but generally. It appears they’ve fixed disocclusion trails, which is a problem even DLSS suffers from.
"8gb ought to be enough for anybody"
I disagree.
why
that's just like, their opinion
Oh fuck you AMD. NVidia fucked up with the 4060 already, and again with the 5060.
My 4K TV disagrees. Even upscaling from 1440p, my 10GB is barely enough on new games.
Last month's Steam survey had 1080p as the most common primary display resolution at about 55%, while 4k was at 4.57%.
4K is a tiny part of the market. Even 1440p is a small segment (albeit rapidly growing).
I've got 16GB of VRAM and a 2K monitor, and this tracks pretty accurately. I almost never use over 8GB. The only games where I can break 10GB are ones with a setting (designed for old PCs) that loads all the textures into VRAM.
I wish.
Send one of these guys by my place. I'll show them what 8GB cannot do.
Oh, so it's not that many players are FORCED to play at 1080p because AMD's and Novideo's "affordable" garbage can't cope with anything more while keeping a game smooth? Or better yet, the game detected we're running on a calculator here, so it took pity on us and set the graphics bar low.
Hey, give a little credit to our public schools (poorly-optimized eye-candy) new games! (where 10-20GiB is now considered small)
Tell that to my triple 1440p screen flight simulator!
This video I just watched the other day says otherwise (with clear evidence.)
He is only testing AAA games at top settings, and that's the point AMD is "making". Most PC gamers are out there playing esports titles at the lowest possible settings in 1080p to get the maximum FPS possible. They're not wrong, but you could still say that it's ridiculous to buy a brand-new modern card only expecting to run esports titles. Most people I know who buy modern GPUs will want to play the new hot games.
So the ones who had VGAs do more and more stuff like they were small separate PCs, and pushed for the "1440p Ultra Gaeming!!!1!" are telling us that nah 8GB is enough?
I would agree, because 8GB is entry-level for desktop gaming and most people start at entry level.
I personally think anything over 1080p is a waste of resolution, and I still use a card with 8GB of VRAM.
That being said, lots of other people want a 16GB card, so let them give you money AMD!
I personally think anything over 1080p is a waste of resolution
But but Nvidia said at the RTX 3000 announcement that we can now have 8K gaming
anything over 1080p is a waste of resolution
For games, maybe.
But I also use my PC for work (programming). I can't afford two, and don't really need them.
At home I've got a WQHD 1440p monitor, which leaves plenty of space for code while having the solution explorer, watch window, and whatnot still open.
At work we're just given cheap refurbished 1080p crap, which is downright painful to work with and has often made me consider buying a proper monitor and bringing it to work, just to make those ~8h/day somewhat less unbearable.
So I can't go back to 1080p, and have to run my games at 1440p (and upscaling looks like shit, so no).
My gaming rig is also my media center hooked up to a 4k television. I sit around 7 feet away from it. Anything less than 1440p looks grainy and blocky on my display.
I can't game at 4k because of hardware limitations (a 3070 just can't push it at good framerates) but I wouldn't say it's a waste to go above 1080p, use case is an important factor.
My TV has this stupid bullshit where it's only 30hz at 1440p but is 60hz at literally every other resolution (including 4K). 😬
It looks grainy because it's a damn TV and not a monitor. You're not going to be able to tell the difference AT THE DISTANCE that you're supposed to be using them at. Larger monitors are meant to be used from a farther distance away. TVs are meant to be used from across the room.
You're that guy with his retina plastered on the glass of his smartphone going "I CAN SEE THE PIXELS!"
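One way to put rough numbers on the viewing-distance argument (the screen sizes and distances below are illustrative assumptions, not anyone's actual setup) is pixels per degree of visual angle; around 60 PPD is often cited as the 20/20-vision threshold, above which individual pixels become hard to resolve.

```python
import math

# Pixels per degree (PPD) of visual angle; higher means pixels are harder to see.
# Screen sizes and viewing distances below are illustrative assumptions.

def pixels_per_degree(h_pixels: int, diagonal_in: float, distance_in: float,
                      aspect: float = 16 / 9) -> float:
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)      # horizontal screen size
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_pixels / fov_deg

print(pixels_per_degree(3840, 55, 84))   # ~120 PPD: 4K on a 55" TV at about 7 feet
print(pixels_per_degree(2560, 55, 84))   # ~80 PPD: 1440p on the same TV
print(pixels_per_degree(2560, 27, 24))   # ~49 PPD: 1440p on a 27" monitor at 2 feet
```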
1440p on a 27" monitor is the best resolution for work and for gaming.
If he'd chosen his words more carefully and said "many" rather than "most", nobody would have a reason to disagree.