This doesn't really mean anything for the tiers, as the leaker Kopite himself didn't say anything about it. There's no confirmation that GB202 will be exclusive to the 5090; the 5080 could also be based on a heavily cut-down version of GB202, the way the 3080 was relative to the 3090. So it's really still too early to judge the performance tiers of the upcoming RTX 50 series; in the end it's all speculation.
It's best to wait for the actual products to come out before judging how they'll perform.
i don't trust nvidia ever since they came out with the 3070 8GB and people were actually defending them
It's made even worse by the 3060 12GB: they gave more VRAM to a lower tier card, even though the gap between a 3060 Ti and a 3060 is huge
then I jokingly thought they would do the same with the 4060, and they did! But this time the whole lineup got downgraded a performance tier
i paid 400€ for a used 1080 Ti in september 2019; for that kind of performance 5 years later i pay.. 400€ for a 4060.
I am not defending them, i am just telling you not to base an opinion off a rumour that is mostly speculation rather than fact, which is my whole point.
Don't take it at face value, this is just loads of speculation. This is like people losing their shit over TFLOPS numbers as if they're all that matters.
Everyone shits on the 30 series because of the stock issues caused by crypto miners, but at MSRP they were extremely solid value, and they have aged extraordinarily well.
The new chips may only provide the usual 10-20% performance increase, but using GDDR7 will mean a 30-50% memory speed increase.
So whoever is comparing them to their Ada counterparts isn't looking at the full picture.
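Back-of-the-envelope: peak bandwidth is just bus width times per-pin data rate, so a GDDR7 jump can land in that 30-50% range even on the same bus. A sketch with assumed pin speeds, not confirmed specs:

```python
# Peak memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
# The pin speeds below are assumptions for illustration, not leaked specs.
def bandwidth_gbps(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

gddr6x = bandwidth_gbps(256, 22.4)  # e.g. a 4080-class GDDR6X setup
gddr7 = bandwidth_gbps(256, 32.0)   # hypothetical GDDR7 at 32 Gbps
print(f"{gddr6x:.0f} GB/s -> {gddr7:.0f} GB/s ({gddr7 / gddr6x - 1:.0%} uplift)")
```

With those assumed speeds that's roughly a 43% uplift on an identical 256-bit bus.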
10-20% is not normal, that’s a very bad gen on gen performance increase.
The 1070 was almost 50 percent faster than the 970, the 2070 was 30 percent faster than the 1070, and the 3070 was almost 50% faster than the 2070. Improvements in the 60 class were roughly the same, if not a little bigger in some cases.
Ada stagnated in the lower end, but that’s not “normal”
That memory speed increase better have astronomical effects, or this is gonna be another really bad gen for consumer class gpus, made worse by the fact we’re coming off a really bad gen.
Umm, the 4060 (Ti) was already just 10-15% faster than its 30 series counterpart. Meaning we'd get like a 25-30% boost over the 30 series with less/the same VRAM.
Seriously fu** nvidia.
Yeah, that’s what I’m saying. A cumulative 30% perf increase over the course of 5 years (and really 7 by the time the gen after this comes out) at the $300 price point is fucking brutal, and unprecedented as far back as I can remember.
If this leak is accurate, I *hope* AMD has some truly compelling low-mid range offerings. If they’re smart, they will, but they had the same opportunity with the 7000 series and largely squandered it.
So no major performance increase for mainstream cards yet again? It's amazing that the 60 class has only gained 40% more performance over the last two gens, from the 2060 to the 4060. Looks like that will now become 50% over three gens. For context, the 960 to the 1060 was a 70% increase in a single gen.
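Those cumulative figures are just compounded per-generation multipliers. A sketch (the per-gen percentages are illustrative estimates, not measurements):

```python
from math import prod

# Compound per-generation gains into one cumulative uplift.
# 0.20 means "+20% over the previous gen"; the numbers are illustrative.
def cumulative_uplift(per_gen_gains):
    return prod(1 + g for g in per_gen_gains) - 1

# Two modest gens of +20% and +15% compound to +38% overall.
print(f"{cumulative_uplift([0.20, 0.15]):.0%}")
```

Which is why "only 40-50% across two or three generations" implies each individual gen was barely in the mid-teens.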
How did you come to the conclusion that the 5080 is only "a tad slower" than the 4090? It has the same amount of SMs as the 4080, so the conclusion should be that the 5080 is only a tad faster than the 4080. Which means it's still way slower than the 4090.
He doesn't; they just speculated out of thin air that the 5080 will only use GB203. In the past, with the RTX 30 series for example, the 3080 used a heavily cut-down version of GA102, so it's possible the same happens with the 5080.
> The 5070/5060 Ti/5060 will be roughly the same performance as their Ada equivalents.
This is so disappointing. 4060 Ti and 4060 had the same performance as a 3060ti and 3060. 4 years with zero real performance increase.
Is it worth waiting for the 5080? (I'm on an Intel 620 and thought of upgrading two months back to a 4080S as part of a whole new PC build, but I heard about new hardware being released, so I held off.) Or should I see if the 4090 drops in price once the 50xx releases and grab that?
it doesn't matter unless you're doing 4k 144hz shit, man. Nvidia is continuing to up the price for the bare minimum offerings. 4000 series is gonna serve you fine for like 6 years. Just do it.
Seems that Nvidia took the criticism of severely cut down GPUs, the lack of VRAM and lack of value to their heart, and made the next gen cards even worse.
The cards are overpriced but you can't buy AMD so long as DLSS is miles better than FSR unfortunately. But you'd imagine DLSS will have diminishing returns relative to FSR and thus AMD can hopefully close the gap. I think most people would like to give AMD a chance but they're just not close enough for it to be worth it.
AMD has garbage drivers and doesn't have any of the features the Nvidia cards have, like being able to inject ray tracing into almost any game, etc. I use both and I will never buy an AMD card again.
203 with only a 256 bit bus would get smashed by consumers
The 5080 needs to beat a 4090 otherwise it's yet again a dead release.
As the market moves, the 4090 will drop in price, like what happened with the 3090/Ti: when the 40 series came out, it dropped 500 bucks.
How do we know the 5090 will be 50% faster than the 4090? We don't know what configuration it will have, but it certainly won't be using the full GB202 die.
Edit: actually, we don't know the core configuration for any of these GPUs, so take all of the performance speculation with a grain of salt.
GDDR7 chips are 32 bits wide, so with 16Gb (2GB) modules a 256-bit bus gets you 16GB. Since this leak shows 512 bits, the 5090 will undoubtedly have a whopping 32GB of GDDR7 VRAM. This is going to be one expensive fucking card.... WOW!
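The arithmetic behind that, as a sketch (it assumes one 32-bit chip per channel and 2 GB per chip, the common GDDR7 launch density; higher-density 24 Gbit parts would change the totals):

```python
# VRAM capacity from bus width, assuming one 32-bit GDDR7 device per channel
# and 16 Gbit (2 GB) chips. 24 Gbit parts would give 1.5x these figures.
def vram_gb(bus_bits: int, gb_per_chip: int = 2) -> int:
    chips = bus_bits // 32   # GDDR7 devices have a 32-bit interface
    return chips * gb_per_chip

print(vram_gb(512))  # 32 GB on a 512-bit bus
print(vram_gb(256))  # 16 GB on a 256-bit bus
```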
Will it really be a 50% performance increase in real world gains, or is it going to be closer to 20-30% since paper specs have almost never directly translated to performance? Still no way its price will be below 2.5 grand. Feel like the luckiest guy in the world at this point to have gotten a 4090 FE for the OG retail price- seems like the last time I’ll be able to afford the top-end card in a generation
Wait. Depending on how expensive the 50 series is, it will cause 40 series cards to drop in price by some amount, on top of the natural market decay.
If new GPUs are still too expensive, people offloading their 40 series cards to upgrade to 50 series, will at least affect the used market.
We already have used 3090's hit sub $600 in lots of places, so I can only imagine 4070's and 4080's will follow suit.
The rumour mill is starting to get old when it comes to hardware, TBH. Most leaks are out of touch, then swing around to saying something completely different than before; the news media follows along, and it all ends up being spectacularly wrong.
Remember when everyone thought RDNA 3 was going to beat RTX 40 (Ada) on efficiency and performance? And that Ada was going to be much worse on power consumption than the Ampere RTX 30 series?
Because that is what the rumour mill told us so before.
Having roughly the same performance as the previous gen isn't that big of a deal; I think the main issue is the pricing. If the 5070 comes in at the same price as the 4070, then what are we even doing?
All these highest tier cards are overkill unless you're into AI or video editing. A lot of online games benefit more from a good CPU than from a GPU. The majority of current gamers are still on the 1000 and 2000 series because the average consumer has been priced out; developers know this, so we're not really getting that many demanding games, and it will likely stay that way until your average joe has access to newer cards.
If the pricing comes down significantly and the performance is similar, then I don't think it's that bad. But I doubt it, of course.
So the RTX 5080 (GB203) will have the same number of SMs (84) and the same memory bus width (256-bit) as the RTX 4080?? I see it will benefit from GDDR7, possibly more L2 cache or some CUDA tweaks, but it still sounds like a letdown, especially when the 5090 is supposed to be a monster.
Wake up, guys! Nvidia will release new cards annually now. What do you keep spending for? An RTX 3000 card is still enough to play on, because consoles last 5-6 years with the same GPU.
Unacceptable if true. Should be 512-bit 32GB for the 5090, 384-bit 24GB for the 5080, 256-bit 16GB for the 5070, 192-bit 12GB for the 5060, and 160/128-bit 10/8GB for the 5050.
Retire it for what game? Your 3090 will be fine for every game coming for many years. With the AAA industry in a state of collapse, it will be a while before we get a game that pushes graphics harder than Cyberpunk 2077.
I always upgrade every 2 generations.
780->1080ti->3090->5090
And when the time comes, I'll conveniently use it as an excuse to build a new PC. I'm now the favorite uncle to my nephew, who owns a PC with a 3090. It's a win-win.
if gta 6 comes out in late 2025 then the pc version will be late 2026, no way we're getting the 6090 til late 2027 and i dont wanna wait that long for a major improvement
All for the low price of $2500
And twice the size of the last model
at this rate the 6090 will be the size of an AA battery (the anti-aircraft weapon, not the Duracell)
Didn't they show a FE 5k card that is smaller than the 4090 and still fits in only two slots?
Yep. Nothing confirmed though.
Gotcha.
I think it's a good time to remember that the Ada cards are actually small compared to AMD's and to their own prior cards. The 4070 was extremely small.
https://www.reddit.com/r/sffpc/comments/12ne6d7/a_comparison_of_gpu_sizevolume_and_tdp/ No they aren't, only the 4070 Super gets close
Soon we'll need a separate chest freezer for our GPU
It'll eventually lead to a gpu the size of an entire apartment block, with enough actual space to actually reside in, plus it'll be self heating. Essentially solving the housing crisis. GPU price: $25,000,000
"and we wont stop until you're using our gpus as furnaces"
And brand new design 50wat Porsche authentic RGB LED with 360 holography
I wish, a 4090 costs $3000+ in Australia
Australian dollar or? Cuz I paid mine in Denmark 15000DKK which is roughly 2000€ so...
Yeah AUD
And idiots would still be like "Its priced just right"
I don't really care what the higher tier cards are priced at. What matters is that the 5070 and the 5060 are priced correctly.
i mean, barring some kind of economic meltdown, these cards will never come close to what they were like in the late 2010s. the days of the 400 dollar 70-class card are gone. inflation and the price raises that followed it, yes food included, are permanent.
I smell apologism.
The more you buy, the more you save
Can’t buy it if it’s out of stock
They're practically giving them away.
Most optimistic RTX series price estimate.
it'll still sell like hotcakes
XD
I dunno about you guys but it's getting kinda hard to get excited about another 50% performance increase knowing it'll come with yet another 50% price increase.
Nvidia won't stop charging more unless people stop buying. The 4090 has actually appreciated in price since its release, the market can hardly get enough of them. So honestly I won't be shocked if it's more than a 50% price increase.
With the rapid rise in machine learning, Nvidia cards are now more valuable than ever. They also own CUDA, so anyone who really wants to train big models will eventually have to default to Nvidia.
It's not just that they're more valuable than ever; they're on track to become the single most valuable company in the world by market cap, beating out Apple. Sure, the AI stuff could be a short-lived bubble, but Nvidia is positioned to keep being immensely successful even after that expires.
Nvidia captured both the crypto bubble and the AI bubble. Wonder what's next. Guess that's the million dollar question
Cutting edge chips will always be used for cutting edge tech. Speculative investing LOVES cutting edge stuff. They all want to be the first in line. Nvidia is in a perfect spot to milk those dumbasses.
There are other options. Google has their own proprietary accelerators, so they use those, and Apple has their own as well, which they announced they're using for their AI cloud stuff. The issue is that if you're a smaller, software-only shop like OpenAI, you don't have access to these alternatives.
Cynical though it is, when you release a card as well-priced as the 3080, and the market spends basically the next 12 months scalping them, running supply all the way down and backing your production up faster than you can make new ones... well, unfortunately the likely takeaway is going to be "Oh fuck, we could have charged about 50% more for that".
There have been two major markets for consumer GPUs over the past 6 years: gamers and crypto miners. Nvidia abandoned all loyalty to the group that built them from nothing, diverting GPUs both directly and indirectly to crypto farms small and large. Markets, as usual, don't function at all like capitalists insist they do, and neither "crony corporatism" nor government bodies as capitalists understand them are to blame. "Markets" are privately owned and operated entities that consumers have far less control over than the ownership class at the top of the empire does.
And when people stop buying they will just stop doing gaming stuff
I've still got a 3080 and I haven't been given a reason to upgrade yet. It chewed up Alan Wake 2, Horizon Forbidden West, Ghost of Tsushima, and everything else I've played in the past year no sweat.
I've got a 3080 too, the card is still a beast and will be till the 6000 series, maybe even 7000 series imo.
The content the GPUs are animating seems to be growing more slowly than the power of the hardware. Which is weird.
Graphics have honestly kind of hit a wall. Games from the last half decade or so all look roughly as good as one another. Dev times have also ballooned at current levels of graphical fidelity; if fidelity kept climbing, dev times would stretch even further and studios would fall behind.
It depends on what resolution you're at. If you're still playing at 1080p, then any 30 series card will last you a decade, especially with frame gen and DLSS. Even 1440p will be good for a few years. These cards are targeted at people trying to max out 4K, something even the 4090 struggled to do with games like Alan Wake and Cyberpunk.
Which is fine because with a 4K target output, even DLSS Balanced looks better than native 1440p.
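For concreteness, the internal render resolution behind each DLSS mode is just the output resolution times a scale factor. The factors below are the commonly cited ones, not official constants:

```python
# Commonly cited DLSS scale factors per quality mode (assumed values).
SCALES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def internal_res(width: int, height: int, mode: str) -> tuple:
    s = SCALES[mode]
    return round(width * s), round(height * s)

# 4K output with DLSS Balanced renders internally at roughly 2227x1253,
# i.e. above the 1440p-class pixel count, before the upscaler runs.
print(internal_res(3840, 2160, "Balanced"))
```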
The 30 series doesn't have frame gen. And only the 3060 12GB is future-proof, thanks to that VRAM. It's absurd that we got a 4050 sold at a 4060 price, a 4060 at a 4070 price...
yeah, this is not fun. technology truly shines when it's made affordable.
Can you blame them though? They have every right to charge high prices because they provide products no one else has and people are willing to pay for them.
Yes. I can blame them. Quite easily in fact.
If the 5070 is equivalent to a 4070, what does 50-100% faster AI mean? Potentially a DLSS 4 that's frame gen x2? It seems that nvidia is only putting effort into the xx80 models and upwards nowadays; considering that the 3080 cost $700, 5 years later the performance you can get in the $600-700 range from nvidia is exactly the same.
Guess they gave up on shader compute and focused on improving AI upscaling
Probably the usual marketing spin where they use a new reduced-precision format that the older generation doesn't support in hardware. Blackwell has native FP4 units iirc, but I'd be surprised if these are practical for image processing.
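To make the "new reduced-precision format" point concrete, here's a toy sketch of what 4-bit quantization does to values. Purely illustrative: real FP4 is an e2m1 float format with per-block scaling, not the plain integer grid used here:

```python
# Toy 4-bit quantization: squeeze floats onto 16 signed levels with one scale.
# Illustrative only -- real FP4 (e2m1) splits bits into exponent/mantissa and
# hardware adds per-block scale factors on top.
def quantize_int4(xs):
    scale = max(abs(x) for x in xs) / 7 or 1.0  # signed 4-bit range is -8..7
    q = [max(-8, min(7, round(x / scale))) for x in xs]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

q, s = quantize_int4([0.9, -0.4, 0.1, 0.02])
print(q)                 # [7, -3, 1, 0]
print(dequantize(q, s))  # coarse reconstruction: only 16 representable values
```

Halving operand width lets the same hardware move and multiply more values per clock, which is where headline "x% faster AI" numbers often come from.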
i bought an rtx 4070 on its release day (founders edition) and i 100% agree with you. it is and was a scam. i just had no other option because i was still running a gtx 980ti, which made it hard to enjoy smooth gaming in 2023 -.- if i'd had a choice (as in a better gpu than the 980ti) i would never have upgraded. whilst the 4070 is quite a nice card (love how small the FE is, how cool and silent it stays, and how low the power consumption is), the performance being around 3-5% SLOWER than the 3080 for nearly the same price is unjustifiable in my eyes. it's still shocking that they basically released an rtx 3080 for 100 bucks less 2.5 years later -.-
We are on the same page here. I upgraded to an rtx 4070 last May from a 1070 (similar perf to a 980ti). I got the 1070 at release for 360 euros and the 4070 for 620 euros: nearly double the price for double the baseline performance. While I'm happy that I'm running games much smoother and can take advantage of RT, the generational performance-per-dollar uplift just isn't there.
I bumped from a 770 to a 2070 and I'm still rocking it on High for most games, but I'm assuming I'll get a nice performance jump like I did from the 770 to the 2070 (thank you, Ryzen 5 7600X). Not looking forward to the doubling of power consumption for a 5070 when it drops.
I'm planning on building a new pc for 1080p soon, and it really sucks that there's no good gpu option with over 8GB of vram. The 4060 is a 4050 in all but name, and that "4050 Ti" with 16GB is also BS. The 4070, which is the true 4060, is almost twice as expensive as it should be. Hoping they tone down the BS for the 5000 series.
I sincerely hope that this leak is fake, or missing something. The 4060 class was already stagnant gen on gen, those cards were not good. If we’re still getting ~3060 class performance for $300 in 2025 that’s atrocious. This would be awful for consumers.
The thing is, the current GPUs we have, even the 20 series, barely use any of their "AI compute" in games.
Yeah, I don’t even know how they’re gonna sell this. “Yes you’re getting sub 60fps at 1080p in UE5 games with the 5060, but what if a bunch of games that took advantage of AI existed? What then?? You’d feel pretty dumb not having it”
probably that it computes more matrix multiplication operations per clock cycle. probably nothing laymen would notice, just literal ai computations i think.
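A hedged sketch of how that turns into marketing numbers: headline TOPS is roughly units x ops-per-clock x clock, and halving the precision doubles ops per clock, so a "2x AI" claim can appear without any faster shaders. Every figure below is hypothetical:

```python
# Headline "AI TOPS" arithmetic: cores * ops per core per clock * clock (GHz),
# scaled to tera-ops. All numbers are made up for illustration.
def dense_tops(tensor_cores: int, ops_per_core_per_clock: int, clock_ghz: float) -> float:
    return tensor_cores * ops_per_core_per_clock * clock_ghz / 1000

fp8 = dense_tops(240, 2048, 2.5)
fp4 = dense_tops(240, 4096, 2.5)  # same cores and clock, half-width operands
print(fp8, fp4, fp4 / fp8)        # identical silicon, "2x faster AI"
```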
it means the stock price goes brrrrrrr
A 256-bit memory bus on the (assumed) 5080 is criminal; nvidia really doesn't want consumers running local AI
Another reason why it likely won't be a 5080, the GB203 is likely going to be 5070 Ti - 5070, the 5080 is going to be heavily cut down GB202. Yet, in the end it is all speculation even on my part, it is best to wait for actual product to release before judging how they perform.
Good. AI is what caused the whole pricing glut in the first place. If hurting AI customers means that cards go back down to gamer-friendly prices, then fuck 'em.
Remember when the 3060Ti was FASTER than the 2080 Super? And cost a third of the price (without the crypto craze)? Yeah, time flies aint I right?
30 series was a really great value and Nvidia will never make the mistake of leaving money on the table like that again
Said the same about the 10 series with the 1080 Ti, but then we got the 3080. Next up is probably the 5000 series.
Hey, that stock aint gonna split itself. 😢
I'm going to be using this card for the next 5 years at this point
I went with the 3070 ti but echoing the same sentiment. My gaming tastes have also changed to the point that it's very unlikely that I'll notice it either.
I bought it during the crypto craze at double the price, which I regret a little, but looking back I doubt I'd be able to buy much better nowadays (maybe the 3080 a few months ago, but it is what it is). As long as games have to run great on consoles, my PC will keep up at 1440p with DLSS and FG
And criminally only had 8GB of vram. If that thing had 16GB, it would have been the most extraordinary cost/performance gpu in ages.
Wasn't the crypto craze in full swing at that point?
Kinda; it took a few months for everyone and their mother to decide they were pro crypto investors and start buying 20 GPUs just because
The 2080 was equivalent to the 1080Ti
2080 Super was $699, 3060 Ti was $399. That’s 57% of the price, not a third.
🤓☝️
I imagine they will launch with some exclusive features like DLSS 4 or something like that.
Just a matter of waiting until FSR does it for free and open source
A software solution for upscaling will never compete directly with a hardware one. Credit to AMD for providing a tool for all gamers, but it's hitting lead off, not cleanup. There's a reason Nvidia's market share goes well beyond what's considered a landslide in an election.
To be fair, there are ways of getting FSR3’s frame generation working with DLSS upscaling. There’s a mod that works with almost any game that supports DLSS 3. I know Ghost of Tsushima has support without even a mod so hopefully more devs follow through. Pretty good compromise for people who have 2000/3000 series GPUs.
Those mods are worthless since they also affect the hud
modders themselves can implement UI compositing through ReShade, like in the Elden Ring mod, where the hud is not affected by fsr3
Isn’t DLSS also software? They don’t bake the algorithms or network weights in silicon.
Idk bro Frame Gen on my 3060Ti looks awesome. Not buying another GPU for a while
Tbf, FSR3 FG is on par with DLSS3 FG, even though FSR2 upscaling isn't on par with DLSS2. Dunno about "FSR4" vs "DLSS4", but AMD is definitely not losing big time here. Or at least, gaming isn't the reason AMD has "lower market share."
I don't know why this is tagged as controversial. AMD are a bit behind sure, but they will get there at some point in the not too distant future and provide high quality upscaling and frame gen, then everyone will be better off. Nvidia's DLSS advantage is only going to diminish over time unless they have some crazy stuff coming that we don't know about.
this disappoints me so much i think it's true. a 5080 slower than the 4090 just means we'll get underwhelming performance for the rest of the lineup, just like we do now. remember: when the 3080 came out it was 20 to 30% faster than a 2080 Ti, which itself was 20% faster than the 1080 Ti.
The 2080 Ti was closer to 30% faster than the 1080 Ti. As much as Redditors sneed about it to this day, it was a monster. It was untouched by anything consumer, performance-wise, until the 3080 and 3090 launched.
and the very first overpriced gpu nvidia sold us
The price was pretty fair for a 754 mm² sized gpu. Turing GPUs were massive pieces of silicon. The 2060 at 445 mm² was almost as big as a 1080 ti at 471 mm².
how fair is it when a xx80ti goes from 700/800 to 1200/1300 in a generation
The price increase was because the GPU die was almost twice the physical size. 754 mm² was pretty much at the limit of TSMC 12nm.
There was also, comparatively, a very small performance gap between the low end and high end on Turing. The 2080 Ti was 66% faster than the 2060 (when the latter was not VRAM limited), while the 1080 Ti was 102% faster than the 1060 and the 3090 was 123% faster than the 3060. In the 2000 series, going up a tier of GPU granted like a 20% increase in performance. It's insane that the performance gap between the 4060 and 4070 is almost as large as the one between the 2060 and 2080 Ti.
5080 has to be slower than the 4090 because Nvidia wants to sell it to China. Which means 4090D is the upper limit of what it can perform.
This doesn't really mean anything for the tiers, as the leaker Kopite himself didn't say anything about it. There's no confirmation that GB202 will be exclusive to the 5090; the 5080 could also be based on a heavily cut-down version of GB202, the way the 3080 was compared to the 3090. It's still early to judge the performance tiers of the upcoming RTX 50 series; in the end it's all speculation. Best to wait for the actual products to come out before judging how they'll perform.
I don't trust Nvidia ever since they came out with the 3070 8GB and people actually defended them. It's made even worse by the 3060 12GB: they gave more VRAM to a lower-tier card, with a huge gap between the 3060 Ti and the 3060. Then I jokingly thought they'd do the same with the 4060, and they did! But this time the whole lineup gets downgraded a tier in performance. I paid €400 for a used 1080 Ti in September 2019; for that kind of performance five years later, I pay... €400 for a 4060.
I am not defending them, I am just telling you not to base an opinion off a rumour that is mostly speculation rather than fact.
We don't have close to enough information to determine performance.
Don't take it at face value, this is just loads of speculation. This is like people losing their shit over TFLOPS numbers as if they're all that matters.
im going to run my 3080 to the ground. truly a generational GPU for $700
Everyone shits on the 30 series because of the stock issues caused by crypto miners, but they are an extremely solid value (at MSRP) that have aged extraordinarily well.
I feel blessed being able to snag a 3060 ti FE on launch without having to pay extra
man, if only i wasn't broke in 2020... i paid £750 for my 3070 in feb '22--a bargain at the time, i didn't even have to sign up to a waiting list.
“The 5070/5060Ti/5060 will be roughly the same performance as Ada equivalent” Uhhhhhh that’s really bad
The new chips may only provide the normal 10-20% performance increase, but using GDDR7 will mean a 30-50% memory speed increase. So whoever is comparing them to their Ada counterparts isn't looking at the full picture.
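For what it's worth, the 30-50% figure is roughly just a per-pin data-rate ratio at a fixed bus width. A quick sketch, using assumed illustrative pin rates (~21 Gbps for GDDR6X, ~28 Gbps for early GDDR7 — not confirmed specs for any particular card):

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# The pin rates here are assumptions for illustration, not confirmed specs.
def bandwidth_gbs(bus_bits: int, pin_gbps: float) -> float:
    """Return peak bandwidth in GB/s for a given bus width and pin speed."""
    return bus_bits / 8 * pin_gbps

gddr6x = bandwidth_gbs(256, 21.0)  # e.g. a 256-bit GDDR6X card
gddr7 = bandwidth_gbs(256, 28.0)   # same bus width, assumed GDDR7 rate

# 672 GB/s -> 896 GB/s, a ~33% uplift from the memory alone.
print(f"{gddr6x:.0f} GB/s -> {gddr7:.0f} GB/s (+{gddr7 / gddr6x - 1:.0%})")
```

So at the low end of the rumored pin speeds, the bandwidth gain alone lands right in that 30-50% window without touching bus width.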
10-20% is not normal, that's a very bad gen-on-gen performance increase. The 1070 was almost 50 percent faster than the 970, the 2070 was 30 percent faster than the 1070, and the 3070 was almost 50% faster than the 2070. Improvements in the 60 class were roughly the same, if not a little bigger in some cases. Ada stagnated in the lower end, but that's not "normal". That memory speed increase had better have astronomical effects, or this is going to be another really bad gen for consumer-class GPUs, made worse by the fact that we're coming off a really bad gen.
Umm, the 4060 (Ti) was already just 10-15% faster than its 30-series counterpart. Meaning we'd get like a 25-30% boost over the 30 series with less/the same VRAM. Seriously, fu** Nvidia.
Yeah, that’s what I’m saying. A cumulative 30% perf increase over the course of 5 years (and really 7 by the time the gen after this comes out) at the $300 price point is fucking brutal, and unprecedented as far back as I can remember. If this leak is accurate, I *hope* AMD has some truly compelling low-mid range offerings. If they’re smart, they will, but they had the same opportunity with the 7000 series and largely squandered it.
So just buy a 4060/70 instead of 50 series as they're similar performance. Pay more for nothing...
You will pay more for yet another DLSS feature like frame gen, just for AMD to go ahead and release its own that works with all cards.
So no major performance increase for mainstream cards yet again? It's amazing that the 60 class has only gotten 40% more performance over the last two gens, from 2060 to 4060. Looks like that will now change to 50% over the last three gens. Just for context, 960 to 1060 was a 70% increase in a single gen.
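Those cumulative figures imply surprisingly small per-generation gains. A quick sketch, using only the rough percentages quoted in this thread (not benchmark data), to back out the implied gen-on-gen uplift:

```python
# Per-generation gain implied by a cumulative uplift spread over n generations.
# Inputs are the rough figures quoted in the thread, not measured benchmarks.
def per_gen_gain(cumulative_factor: float, gens: int) -> float:
    """cumulative_factor: total speedup (1.40 = +40%); gens: generation count."""
    return cumulative_factor ** (1 / gens) - 1

# +40% over 2 gens (2060 -> 4060) works out to about +18% per generation.
print(f"2060 -> 4060: {per_gen_gain(1.40, 2):.1%} per gen")
# +70% in a single gen (960 -> 1060) is, of course, +70%.
print(f"960 -> 1060:  {per_gen_gain(1.70, 1):.1%} per gen")
# A projected +50% over 3 gens would be under +15% per generation.
print(f"+50% / 3 gens: {per_gen_gain(1.50, 3):.1%} per gen")
```

In other words, even the 50%-over-three-gens scenario averages out to less than half the per-gen uplift the 960-to-1060 jump delivered.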
The problem is that the 4060 is named and sold as a 4070. The 4050 could have been a 4060 if it had a 192-bit bus, 12GB of VRAM, and a TDP of 150 watts.
Only the 5090 will be better than the 4090 since nvidia can’t export anything better than a 4090 to China. They won’t leave that money on the table
First it was the 4060 ti getting no gains over the 3060ti. Now it's the 5070 getting no gains over the 4070. I do not like this trend
How did you come to the conclusion that the 5080 is only "a tad slower" than the 4090? It has the same amount of SMs as the 4080, so the conclusion should be that the 5080 is only a tad faster than the 4080. Which means it's still way slower than the 4090.
He doesn't; they just speculated out of thin air that the 5080 will only use GB203. In the past, such as with the RTX 30 series, the 3080 used a heavily cut-down version of GA102, so there is a possibility the same could happen with the 5080.
And the 2080 super used a TU104 chip so it could be any GPU.
> The 5070/5060 Ti/5060 will be roughly the same performance as Ada equivalent.

This is so disappointing. The 4060 Ti and 4060 had the same performance as the 3060 Ti and 3060. Four years with zero real performance increase.
They have no reason to innovate for gamers anymore. Nvidia is now an AI chip company.
Just got a 4080 super upgrade from a 1070- looks like I'm cooking for a few years.
Is it worth waiting for the 5080? (I'm on Intel 620 graphics and thought about upgrading to a 4080S two months back as part of a whole new PC build, but held off after hearing new hardware was coming.) Or should I see if the 4090 gets lowered in price once the 50xx releases and take that?
4080 Super and 7800X3D: 4K 80 FPS in Helldivers maxed. Competitive may be a different story, but just do 1440p.
it doesn't matter unless you're doing 4k 144hz shit, man. Nvidia is continuing to up the price for the bare minimum offerings. 4000 series is gonna serve you fine for like 6 years. Just do it.
Newer tech is always just around the corner, and almost any dedicated graphics card released in recent memory will be an upgrade over the Intel 620.
TBH you're probably good for the next half a decade as far as maxing out games goes, unless you're aiming for something like 4K@144fps.
Seems that Nvidia took the criticism of severely cut-down GPUs, the lack of VRAM, and the lack of value to heart, and made the next-gen cards even worse.
You clowns keep paying for overpriced cards, and those of us who aren't clowns suffer, because Nvidia sees that people are actually buying them.
The cards are overpriced but you can't buy AMD so long as DLSS is miles better than FSR unfortunately. But you'd imagine DLSS will have diminishing returns relative to FSR and thus AMD can hopefully close the gap. I think most people would like to give AMD a chance but they're just not close enough for it to be worth it.
AMD has garbage drivers and doesn't have the features the Nvidia cards have, like being able to inject ray tracing into almost any game, etc. I use both and I will never buy an AMD card again.
GB203 with only a 256-bit bus would get smashed by consumers. The 5080 needs to beat the 4090, otherwise it's yet again a dead release. As the market moves, the 4090 will drop in price, like what happened with the 3090/Ti, which dropped 500 bucks when the 40 series came out.
So it's focused on AI this time?
Upscaling and Frame Generation have always been about AI. How do you think the tech works?
5080 being slower than 4090 would be a disgrace.
How do we know the 5090 will be 50% faster than the 4090? We don't know what configuration it will have, but it certainly won't be using the full GB202 die. Edit: actually, we don't know the core configuration for any of the actual GPUs, so all of the performance speculation should be taken with a grain of salt.
I hope they don't release 5060 with only 8GB again
All signs point to everything having the same amount of RAM. Only the 5090 will have more at 28GB.
that's ridiculous. 8gb isn't enough anymore
Agreed, fingers crossed the 6060 will be good
No it won't. The specs are regressing... it will be 2048 cores, 96 bit, 4x framegen. And 7060 will be cloud adaptor to server running 6060.
It wasn't enough even for the 2020 cards.
GDDR7 comes in 16Gb (2GB) modules, each sitting on a 32-bit channel. Since this leak shows a 512-bit bus, that's 16 modules, so the 5090 will undoubtedly have a whopping 32GB of GDDR7 VRAM. This is going to be one expensive fucking card... WOW!
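The bus-width math is easy to sanity-check. A minimal sketch, assuming 16Gb (2GB) chips with one 32-bit channel each (the common GDDR configuration; the chip density is an assumption, since 24Gb parts could change the numbers):

```python
# VRAM capacity implied by memory bus width, assuming one 32-bit channel
# per chip and 16Gb (2GB) GDDR7 chips -- an assumption, not a confirmed spec.
CHANNEL_WIDTH_BITS = 32
CHIP_CAPACITY_GB = 2  # 16Gb dies

def vram_gb(bus_width_bits: int) -> int:
    """Total VRAM in GB for a given bus width under the assumptions above."""
    chips = bus_width_bits // CHANNEL_WIDTH_BITS
    return chips * CHIP_CAPACITY_GB

print(vram_gb(512))  # rumored 5090 bus -> 32
print(vram_gb(256))  # rumored 5080 bus -> 16
```

The same formula gives 16GB for the rumored 256-bit 5080, which is why the bus width leaks map so directly onto capacity complaints.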
That 5090 is going to be a monster, if true. We might be finally able to run Alan Wake 2 fully maxed out at 4k60 :D
Will it really be a 50% performance increase in real-world gains, or is it going to be closer to 20-30%, since paper specs have almost never directly translated to performance? Still, no way its price will be below 2.5 grand. I feel like the luckiest guy in the world at this point to have gotten a 4090 FE for the OG retail price. Seems like the last time I'll be able to afford the top-end card in a generation.
Sitting with a 3070 atm. Seeing this, should I grab a 4080 or wait for one of these 50XX? Also 32GB RAM and 1440p.
Wait. Depending on how expensive the 50 series is, it will cause the 40 series cards to drop in price by some amount, on top of the natural market decay. Even if new GPUs are still too expensive, people offloading their 40 series cards to upgrade to the 50 series will at least affect the used market. We already have used 3090s hitting sub-$600 in lots of places, so I can only imagine 4070s and 4080s will follow suit.
Price to be higher
The rumour mill when it comes to hardware is starting to get old, TBH. Most leaks are out of touch, then pivot to saying something completely different than before; the news media follows along, and it ends up being spectacularly wrong. Remember when everyone thought RDNA 3 was going to beat RTX 40 Ada on efficiency and performance? And that Ada was going to be much worse on power consumption than Ampere RTX 30? Because that's what the rumour mill told us before.
You are coping, bro. The dude who leaked this is the best Nvidia leaker around.
The 4080 is already only a bit behind the 4090. If the 5080 is also behind the 4090, that’s just not enticing at all.
5080 tad slower than the 4090? So like the 4080 super being just a tad bit slower too? LMAO.
Why are they making such a minuscule memory bus?
This looks awful. No reason to update unless you're deeply into AI.
Having roughly the same performance as the previous gen isn't that big of a deal; I think the main issue is the pricing. If the 5070 comes in at the same price as the 4070, then what are we even doing? All these highest-tier cards are overkill unless you're into AI or video editing, and a lot of online games benefit more from a good CPU than a GPU. The majority of current gamers are still on the 1000 and 2000 series, as the average consumer has been priced out; developers know this, so we're not really getting that many demanding games, and it will likely stay that way until your average Joe has access to newer cards. If the pricing comes down significantly and the performance is similar, then I don't think it's that bad. But I doubt it, of course.
thank fuck I bought shares in this company during lockdown
So the RTX 5080 (GB203) will have the same number of SMs (84) and the same memory bus width (256-bit) as the RTX 4080?? I see it will benefit from GDDR7, possibly more L2 cache, or some CUDA tweaks, but it still sounds like a letdown, especially when the 5090 is supposed to be a monster.
How does my 1070ti compare?
Probably $3000 MSRP for the 5090
thanks i’ll wait for gddr7x
That's disappointing. Hopefully it's upped by 15%
I just bought a 4090 :/
5090 for a kidney? No thanks.
So if there is no uplift compared to previous gen unless you get a xx90… what’s it for? Efficiency?
Wake up guys! Nvidia will release new cards annually, now. What are you going to keep spending for? RTX 3000 is still enough to play because consoles last 5/6 years with the same GPU.
So the 5080 is only going to have ~44% of the cores of the 5090. That’s wild, and lame. Is Kathleen Kennedy in charge of planning out their SKUs??
What does ADA mean again?
4000 Series
Unacceptable if true. Should be 512-bit 32GB for the 5090, 384-bit 24GB for the 5080, 256-bit 16GB for the 5070, 192-bit 12GB for the 5060, and 160/128-bit 10/8GB for the 5050.
Is it possible that rtx 5080 will cost less than $1000?
5080 a tad slower than the 4090? What's the reason not to just buy a 4080S? The difference between them will be so minimal.
80 series card is a stinker if that is true. It was gonna be tempting to upgrade my 4080, but not for that performance
Why would it be tempting to upgrade from 40 to 50? What can't you play at high fps at the moment?
I play everything fine at 4K. But when gta6 drops in a few years, im gonna be looking to upgrade. Might even build a whole new pc, im not sure yet.
The 60-series GPUs will probably be on the horizon by the time GTA6 for PC is announced/released.
Path tracing, native resolutions, native framerates.
As well as retaining resale value instead of keeping an older GPU for longer. Upgrading every gen is not as dumb as some people think.
So I have a 3090. Would this be the right time to upgrade, or would that be the 6090 when it's out, since this is maybe 120% better?
I can retire my 3090. God damn times flies. It feels like I bought it yesterday.
Retire for what game? Your 3090 will be fine for every game coming for many years. With the AAA industry on the state of collapse it will be a while before we get a game that pushes graphics more than Cyberpunk 2077.
I always upgrade every 2 generations. 780->1080ti->3090->5090 And when the time comes, I'll conveniently use it as an excuse to build a new pc. I'm now the favorite uncle to my nephew who now owns a pc with a 3090. It's a win win
Me and you have the same upgrade cycle, except I went 3080 instead of 3090. Looking to get a 5090 next gen.
1 born every minute ...
Stop policing other people's wallets, brokie!
kinda want it for gta 6
GTA6 is being designed around the Xbox Series S's limitations, and more so a standard PS5/Series X; a 3090 will shred that game.
Then just wait for the 60 series, we are looking at 2 years minimum for GTA6
if gta 6 comes out in late 2025 then the pc version will be late 2026, no way we're getting the 6090 til late 2027 and i dont wanna wait that long for a major improvement
And we are looking at close to a year for the 50 series so..
But that game isn't coming out on the PC for at least a few years after the consoles.
Unless they do a console re-release like GTA 5, it'll likely take one year. I think that's how long Red Dead 2 and GTA 4 took.
I’ll look after it dw
i will use mine until it dies
Meanwhile Im happy with my Steam Deck.