Okay, a reminder here that no one's opinion is superior to another's. Let's not start labeling people who choose to focus on things that we, personally, find unimportant. Feel free to express your thoughts and views on the topic. Don't assume to speak for others, and make sure that all views are being respected.
_______________
^ That's also coming from someone who used to be a "Power User" during the 1990s and early 2000s. I get the lure of trying to put the biggest, baddest rig together, spending hundreds of extra dollars trying to squeeze every last frame out of any game, chasing those huge numbers! I get it well.
It was the constant frustration that came with it: fighting the instability of aggressive overclocking, drivers that didn't work terribly well with state-of-the-art tech, games that simply didn't support brand-new features, hardware damage or failures from pushing things a bit too far...etc. It all ended when I had a looong talk with a Falcon-NW rep on the phone while configuring a nearly $4,000 USD Mach V tower; he consistently steered me away from any overclocking and discouraged me from upgrading certain components because of the motherboard I was using.
In the end, a system config needs to keep all of its components in balance with one another to ensure smooth, stable performance across the board. If one piece of hardware doesn't line up with the rest of the system, millisecond to millisecond, the result is hitching or instability as the rest of the system either fails to keep up or gets too far ahead. Plus, there's no benchmark software on the planet that can accurately predict how a system will perform running actual applications in practice.
So, as is the case with pretty much everything in life, "throwing money at the problem" won't usually solve anything. Most often, it will simply create new issues. The more I learn about Nvidia's 4000 series, the more I'm beginning to smell fast food. I don't think this is really a new iteration of hardware. It feels more like an attempt to put fancy sauce on a cheap meat patty, then jack the price up because it's "the next big thing".
To be honest, if anyone is looking for an upgrade in the present GPU market, I would recommend the RTX 3060 Ti. I was really impressed by the performance of the standard 3060 that came in my present rig, and I'm disappointed that I couldn't track down a model with a 3060 Ti. Bang for the buck right now, you can't do much better. Granted, it can't really do any meaningful ray tracing: the tech is there, but it simply doesn't have the power to deliver playable frame rates in most games with ray tracing on. It absolutely screams with rasterized graphics, though. Plus, the power requirements mean that anyone using a 1000 or 2000 series card will likely be able to just plug it in (mobo permitting).
If players want to future-proof, I'd recommend a 3090 Ti. That is a big, bad, mean card. You'll pay for it, but you'll get what you pay for: it's a beast in both performance and price. (Personally, I absolutely refuse to pay that much money for a GPU. It's unhealthy marketing, and it's setting a terrible precedent for the future.)
As for the 4000 series...I'd simply wait. I'd like to see what the real-world performance is like once the cards are on the market. I would not put too much stock into DLSS3 being some revolutionary step forward. I really doubt that the results of DLSS3 are going to be all that noticeable compared to present DLSS. As graphics approach true 4K resolutions, we're getting seriously into the realm of diminishing returns. Simple fact is that, at those resolutions, the human eye can't actually see the pixels. DLSS3 might be able to clean up the edges of a polygon with nearly 300% additional accuracy...but unless you have a magnifying glass up to the screen...you won't even notice.