I can confirm that RT works perfectly fine at 1080p with DLSS set to Quality. I've been using that setup for gameplay. The difference between RT and rasterized lighting is extremely noticeable, especially in the lighting gradients and the reflections. There's no artifacting or anything from the lower resolution.
I'm not sure what exactly others may be seeing, but if they're encountering a problem with RT at lower resolutions, I don't think it's a limitation of the technique. Probably more like a driver / DLSS issue.
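For context on what "lower resolutions" actually means with DLSS on, here's a quick sketch of the internal render resolutions at each preset. The scale factors are the commonly cited approximate values for DLSS 2 presets, not something pulled from the SDK, so treat them as illustrative:

```python
# Approximate per-axis render scales commonly cited for DLSS 2 presets.
# These are illustrative assumptions, not values read from the SDK.
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Resolution the game actually renders at before DLSS upscales it."""
    s = DLSS_SCALES[mode]
    return round(width * s), round(height * s)

print(internal_resolution(1920, 1080, "Quality"))  # -> (1280, 720)
```

So 1080p at Quality is really rendering around 720p internally, which is why any artifacting would show up there first if the technique itself couldn't handle it.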
Saw that without the chiplets, the core in the 7900 is about half as big as the 4090's. That's around double the dies per wafer, which is a lot cheaper and probably gives better high-quality yields. Then the chiplets are on cheaper 6 nm, adding to that. We'll see how it works IRL, but I'm guessing there's not much difference compared to a monolithic die; AMD has been doing this for some time, so they should know what they're doing.
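Rough back-of-the-envelope on why the smaller die matters so much, using made-up but plausible numbers (a ~600 mm² monolithic die vs. a ~300 mm² GCD on a 300 mm wafer, plus a toy Poisson yield model with an assumed defect density):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic dies-per-wafer approximation (ignores scribe lines)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
    """Toy Poisson yield model: Y = exp(-D * A), with an assumed defect density."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for name, area in [("~600 mm2 monolithic die", 600), ("~300 mm2 GCD", 300)]:
    n = dies_per_wafer(area)
    y = poisson_yield(area)
    print(f"{name}: {n} dies/wafer, ~{y:.0%} yield, ~{n * y:.0f} good dies")
```

With those assumptions you get roughly 2x the candidate dies per wafer from halving the area, and closer to 3x the *good* dies, since yield drops off fast as die area grows. That's the whole economic case for the chiplet split.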
Yeah, just looking at the power requirements and general setup of the AMD cards' design (though, grain of salt, since the hands-on reviews aren't out yet), I find it to be much more sensible. Nvidia's design seems focused on pressuring high-end users to basically build whole new systems around this monstrosity of a card, and then deal with the insane power draw for...what exactly?
I have to get into diminishing returns here, now that I've seen the 4090 in action. Given the tradeoffs required to actually install and use one of these things...it's pointless. The present and near-future market is not going to start making games that run at 4K natively and demand 12-16 GB of VRAM just because Nvidia released a briefcase-sized GPU with a 12-pin power connector. By contrast, AMD's design looks much more sensible and market-practical, not to mention gaming-practical, for what's likely to appear in the next 3-5 years or so.
And once again:
By the time those 3+ years have passed, plenty of cheaper and more reasonable designs will have been delivered, the tech will have been improved, and games that can actually take advantage of these cards will have been made. It's always possible that Nvidia created another "GTX 1080", and this card will prove incredibly powerful long into the future...but I doubt it. To me, it looks like Nvidia is trying to further capitalize on the insane market gouging that began during the pandemic, furthering the illusion that the only way to get "high-end" gaming is to pay out the toofus for ridiculous hardware that, in practice, gets you basically the same experience as a card costing literally half as much.
While gaming, the difference between 144 FPS delivered at 1440p and 450 W vs. 144 FPS at 4K and 600 W is going to be negligible to the eye and the brain. It's only going to have a potentially big impact on the ego.
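Just to put a number on that wattage gap (assuming, purely for illustration, 20 hours of gaming a week and $0.15/kWh; both are made-up figures, not measurements):

```python
# Rough energy comparison for the 150 W gap (450 W vs. 600 W),
# using illustrative assumptions: 20 h/week of gaming, $0.15/kWh.
extra_watts = 600 - 450
hours_per_year = 20 * 52
extra_kwh = extra_watts * hours_per_year / 1000   # ~156 kWh/year
cost = extra_kwh * 0.15
print(f"~{extra_kwh:.0f} kWh/year extra, about ${cost:.0f}/year")
```

So that's ~156 kWh a year of extra heat dumped into your room for the exact same frame rate.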