This is true of all parts tbh, CPU threads and so on have been very limited until recently in gaming since the older systems could not use more than 4 anyway. Thank god there's DirectStorage on the consoles now so we can start using that at least.
That's true, but it is also nice to have a couple of unused threads on the side for things like Firefox and the like.
Yea I might get one too. I'll wait a while though to see if prices drop since they have pretty much screwed us with the 3000 series ^^ Nvidia will probably be better than AMD on RT at least. Raw compute will probably be similar or slightly ahead for Nvidia since AMD goes the efficiency route.
I agree, although it will still be quite interesting as to how close the 4090 and 7900XT (7950XT?) will be.
Really wish they started making the consoles upgradeable for real. They could make their own upgrade parts or something. But that would make it hard to set standards for what we can do with this game and so on, anyways rant off ^^
I think the whole point of consoles is not needing to upgrade them, so offering such a path would conflict with their intended simplicity.
That was my reasoning, too. They looked at what the scalpers / miners were paying, and thought "hey, we could do some scalping of our own!" and jacked up the prices on their 4000 series, while locking the new DLSS behind the new-gen paywall. The most questionable move is the 4070 being rebranded as a "4080 12GB" - I've been with Nvidia for over a decade, but I'll seriously consider AMD this time around. Anyway, it took 15 months to get my 3090 at a reasonable price, so I can wait for the prices to drop.
I just hope they'll continue to improve DLSS 2.x and not abandon it altogether.
I used to be an AMD user for quite a while until I went nuts and wanted the absolute high-end with RT. Then, you are basically stuck with Nvidia...
Currently no one needs a 4080/90 to play any game on the market. You don't even need a 3080/90. People just unfortunately fall prey to marketing and upgrade for no other reason than "bragging rights". I've had a 1080 since launch and I've yet to need to change it. I not only game with it but also do video editing. I could change it but I don't enjoy giving my money to wealthy companies for no reason.
This has been addressed before, but not everyone owning a 3090 or wanting to purchase a 4090 falls prey to marketing or bragging rights. I for one have a 3090 and I intend to buy a 4090 or 7900XT simply because I want to play at the highest settings at a decent framerate. I do think there is a difference between ultra and high settings, although I agree the difference is not that large. Also, I want to have RT on maximum, and given that I have a 5120x1440 monitor, DLSS below Quality starts having too many issues.
As a result, I buy one of those high-end GPUs and I am well aware that I pay extra for a comparatively small performance uplift.
When you see people swapping GPUs at every new release, then you know it's not about performance anymore; it's just about having the latest thing, because rarely does performance change drastically from one development cycle to the next.
I think if you go high-end, you have pretty much made the decision to swap GPUs often. Naturally, it is good to look at reviews and such, but if I decide I will spend $1.6k on a new GPU, then that's perfectly fine. A friend of mine gets new bikes rather frequently, and compared to those, PC gaming is cheap, even with those prices. Likewise, gaming is a hobby of mine and I have no qualms about putting money into it.
Yea at 1080p/1440p it's harder to justify a 3090 within reason unless you really need the extra VRAM. I was very conflicted before buying it too, but at 4K nowadays I see up to 16GB of VRAM "usage" at times, so I'm glad I didn't get a 3080 with 10GB now -.-
Same. Now even more so, given that with the settings I use for Cyberpunk 2077 I see VRAM allocation going up to 22GiB.
^ That's also coming from someone that used to be a "Power User" during the 1990s and early 2000s. I get the lure of trying to put the biggest, baddest rig together, spending hundreds of extra dollars trying to squeeze out every last frame from any game, going for those huge numbers! I get it well.
Kind of a fun fact, but when I was younger I never went high-end because I could not really afford it, and I was also rather content with my GPUs. To me, people striving for high-end seemed a bit detached from reality, and it also seemed like they were just throwing money away. After all, I enjoyed those games as well, at reasonable settings of course. Nowadays though, I do chase the high-end, but I understand both sides.
(I'm not stating that everyone not going for the high-end can't afford it!)
Looking forward to the 4000 series... I'd simply wait. I'd like to see what the real-world performance is like once the cards are on the market. I would not put too much stock into DLSS 3 being some revolutionary step forward. I really doubt the results of DLSS 3 are going to be all that noticeable compared to present DLSS. As graphics approach true 4K resolutions, we're getting seriously into the realm of diminishing returns. The simple fact is that, at those resolutions, the human eye can't actually see the pixels. It might be able to clean up the edges of a polygon with nearly 300% additional accuracy... but unless you hold a magnifying glass up to the screen... you won't even notice.
That's always a reasonable approach.
When it comes to releases, I think publishers are gonna focus on releasing games for the most users. And console users outnumber PC gamers by orders of magnitude.
I disagree with console gamers outnumbering PC gamers by orders of magnitude. Sure, they outnumber PC gamers, but I would argue that PS, Xbox and PC each have roughly a third of the cake.
Depending on the game, one platform might outnumber another - like with Cyberpunk 2077 - but it is usually distributed fairly evenly.