Are there lines yet for the 4090? Less than a week away

I think AMD will win both efficiency and raw compute power (but not RT, as you suggest). They already basically won with the previous generation (i.e. AMD's 6000 series beats Nvidia's 3000 series in raw compute power for comparable models), so I'd expect them to stay ahead.

But I guess we'll see soon enough when benchmarks show up in November. I'm personally interested in the Linux gaming use case.
Hmm, well that kinda depends, doesn't it? Resolution seems to be a big factor (at 4K Nvidia wins, below that AMD does), and the game matters a lot too, but that's nothing new. In FP16 AMD wins, the rest Nvidia does. AMD wins performance/watt every time though, just like Nvidia wins RT every time. I would say it's debatable.
 
Yeah, it depends on the game, but I think on average the 6800 XT beats the 3080 it was roughly positioned against.

That's a bit outdated though; Mesa has advanced since then.
 
Yeah, it depends on the game, but I think on average the 6800 XT beats the 3080 it was roughly positioned against.

That's a bit outdated though; Mesa has advanced since then.
It's possible; like I said, it really depends on resolution and game though. I'm not at all sure about Linux, so no clue there, but TechPowerUp's relative performance has the 3090 Ti over the whole 6000 series. The 3080 Ti beats the 6800 XT, but the 3080 doesn't. It's all relative, it seems. Sadly AMD doesn't seem to have as good support when it comes to programs for AI and so on. Hopefully they will improve that. It's something I consider when I'm shopping for a GPU.
 
I think the 6800 XT was positioned against the 3080, not the 3080 Ti, so I suppose the above is expected. Nvidia releases a wider range of performance options since they generally have a ton more money, while AMD releases fewer models in that range. But you can usually compare models targeted at each other using pricing (at least MSRP). Of course cards with more compute units or drawing more power will be more performant, but that's not the point of the comparison.

So it just tells me that in raw compute power, AMD were already basically ahead in the previous generation (i.e. in the sense of their microarchitecture and hardware design). It would be interesting to see how they fare this time using chiplets.
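To show what I mean by normalizing the comparison, here's a rough sketch; the FPS, wattage and MSRP numbers are completely made up, just to illustrate the idea:

[CODE]
# Hypothetical cards with made-up numbers, purely to illustrate the normalization.
cards = {
    "vendor A flagship": {"avg_fps": 100, "watts": 300, "msrp_usd": 650},
    "vendor B flagship": {"avg_fps": 105, "watts": 330, "msrp_usd": 700},
}

for name, c in cards.items():
    fps_per_watt = c["avg_fps"] / c["watts"]
    fps_per_dollar = c["avg_fps"] / c["msrp_usd"]
    print(f"{name}: {fps_per_watt:.3f} FPS/W, {fps_per_dollar:.3f} FPS/$")
[/CODE]

Comparing on FPS/W and FPS/$ like that says more about the underlying design than just looking at which card has the bigger absolute number.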
 
I think the 6800 XT was positioned against the 3080, not the 3080 Ti, so I suppose the above is expected. Nvidia releases a wider range of performance options since they generally have a ton more money, while AMD releases fewer models in that range
Yeah, I think it also has to do with Nvidia not expecting AMD to be that good, so they refreshed the 3080 to beat them. Price/performance is honestly something I don't care about at all (or I would not have gotten a 3090). What I care about is performance and compatibility in different scenarios (3D modeling/rendering/AI). It's pretty much the same with power draw for me: I want great FPS at 4K with maxed-out settings. Don't care if it draws 300 W or 450 W. This is why it's relative; I get that there's little use getting a 3090 Ti for gaming at 1080p (or very little at least), the cost/performance is just bad. This is probably why comparing generations of cards is pretty bad :)

Card vs card is better, but cost vs cost can be weird since MSRP is also kind of misleading. Where I live I never see any MSRP cards since taxes are so high. The cheapest 3090 at release was 2000 dollars (board partners, that is). It went as high as 2700 during the shortage (stores, not scalpers).
Now they're more in line but still above MSRP. It's just a misleading price for people here, pretty much.
 
Yeah, besides, in the previous generation prices were all over the place due to shortages and cryptocurrencies, which all normalized only very recently. So maybe this time around pricing will be less wild.

Power-wise, I do care about them not turning into power creep :) I have a 750 W PSU now, and having to go to 1000 W with the next upgrade would be annoying. I hope 850 W will be enough with AMD (I do expect some increase).
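Just as a back-of-the-envelope check on that, with guessed component draws (none of these are measured numbers, and the GPU figure is an assumption about the upcoming cards):

[CODE]
# Rough PSU headroom check with guessed (not measured) wattages.
gpu_w = 400              # assumed board power of a next-gen GPU
cpu_w = 150              # assumed CPU draw under gaming load
rest_w = 75              # motherboard, RAM, drives, fans - rough guess
transient_margin = 1.25  # extra headroom for power spikes

needed = (gpu_w + cpu_w + rest_w) * transient_margin
for psu in (750, 850, 1000):
    status = "fine" if psu >= needed else "tight"
    print(f"{psu} W PSU: {status} (need roughly {needed:.0f} W)")
[/CODE]

With those guesses 750 W gets tight and 850 W still has headroom, but it obviously all hinges on what the actual board power ends up being.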
 
And Nvidia was at a disadvantage last release because they were on a larger node size, while AMD was already on 5 nm, which is where they got a big boost generation over generation (an RX 5700 XT wasn't very power efficient). This time Nvidia will be the one most benefiting because they are on a smaller node, while AMD is essentially on the same one as before. If you look at the power vs performance of the RTX 4090, it's really quite an impressive gain for a single generation.

For AMD to match Nvidia, they are going to need to use roughly the same amount of watts, because the majority of the gain in power vs performance in the last 10 years has come from smaller node sizes, we have pretty much hit the limit on how small you can go and still get enough electrons to flow, and Nvidia and AMD are on roughly the same node. However, Nvidia is always going to use slightly more because it has extra Tensor Cores to power, and because of the topology those will draw some power even when they aren't being used; granted, not much, but it still adds to the overall power usage.

I think AMD will win both efficiency and raw compute power (but not RT, as you suggest). They already basically won with the previous generation (i.e. AMD's 6000 series beats Nvidia's 3000 series in raw compute power for comparable models), so I'd expect them to stay ahead.

But I guess we'll see soon enough when benchmarks show up in November. I'm personally interested in the Linux gaming use case.

You are confusing rasterization power with compute power. Nvidia blows AMD away in compute power because of the Tensor Cores. I use my old 2080 in dedicated compute applications like modeling air flow (fluid dynamics) and other electronics design applications. AMD did have better rasterization power last generation, but a lot of that came down to using a smaller node size, allowing them to use more transistors at roughly the same power draw, plus they didn't have Tensor Cores taking up space that Nvidia had to sacrifice. RDNA 3 will likely be more power efficient than RDNA 2, but not by a groundbreaking amount, and it's not going to benefit from a smaller node size, which is where most gains in power vs performance are being made.
 
And Nvidia was at a disadvantage last release because they were on a larger node size, while AMD was already on 5 nm, which is where they got a big boost generation over generation

That's incorrect. As far as I know, the 6000 series is using a 7 nm node. Only their upcoming series is using a 5 nm one.

So I think your point doesn't apply. They moved over nodes more or less the same way (Nvidia was on a comparable Samsung one last time).

You are confusing rasterization power with compute power. Nvidia blows AMD away in compute power because of the Tensor Cores.
No, by compute power I mean general purpose (in the GPU sense) compute units, not specialized AI ASICs like tensor units ("cores" isn't really a good name for them).

I.e. the regular SIMD compute units used for graphics. That's more than just rasterization, but it includes it. In other words, in the actual general purpose functions of the GPU, AMD already beat Nvidia in the last generation.

AI ASICs are Nvidia-specific, yes, but they have pretty limited application in comparison, like any ASICs naturally do. The trend of cramming more ASICs into the GPU can surely give an edge in specific workloads, but it's a dubious trade-off, since you have less space for, well, actual graphics stuff.

If you need that kind of AI, you could go with actual ASICs that focus on it properly, without the graphics stuff. So longer term I don't see this being a major advantage if they keep putting more and more into it, with current devices already being as monstrous as they look. Essentially, a better approach would be to create a separate device targeted just at AI, in addition to the GPU, which is essentially a graphics unit.
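To make the distinction concrete, here's a rough CPU-side sketch (NumPy, made-up sizes); the first kind of work is what the general SIMD units handle, while only the second kind really maps onto the matrix/tensor ASICs:

[CODE]
import numpy as np

# "General purpose" GPU-style work: independent per-element math,
# e.g. shading pixels. This is what the regular SIMD compute units run.
pixels = np.random.rand(1920 * 1080, 4).astype(np.float32)
shaded = 0.5 * pixels + 0.1            # one multiply-add per element

# What the tensor/matrix ASICs are built for: dense matrix multiplies,
# i.e. AI-style workloads. Everything else falls back to the SIMD units.
a = np.random.rand(1024, 1024).astype(np.float16)
b = np.random.rand(1024, 1024).astype(np.float16)
c = a @ b                              # dense matrix multiply (GEMM)
[/CODE]

The elementwise kind of work covers most of what games and general GPU compute actually do, which is why the dedicated matrix units only help in a fairly narrow slice of workloads.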
 
The reviews so far for the 4090 are impressive. Almost 50% gain over a 3090-Ti on some tests. I'm interested in seeing how the heat management holds up in real world environments, given the monster heat sink that they had to attach to it.
 
Looks like I was wrong... What a shit show. Nvidia and Best Buy had the FE marked sold out before they even went on sale with no pre-orders ever being offered.
 
Some 3090s are still unstable to this date and overheat like crazy.
Maybe wait for the reviews after the 4090 launch? I mean, from random people, not from the pro bench setups.
I have some doubts about the quality of this release.
 
Tbh they look way better than the 3090s so far; even memory temps are way down in comparison, which was a huge problem with the 3090. Even efficiency is much better, so it's looking pretty OK this time around. Expensive as hell though, and way too much for a gaming card.
 