I respect your opinions but I’m still interested in getting an RTX card, assuming benchmarks show that the RT implementation in Cyberpunk 2077 (and other future games with RT) will be worth it. I’m quite an "atmosphere" guy; that is, I care more about lighting, colors, particles and shadows than texture quality and poly count, to name a few. You could say I’m one of those who do stop to check out the neon lights and puddles.

So RT looks very appealing to me despite its current shortcomings. Shortcomings that, hopefully, by the time I make the purchase, will have stopped being too much of an issue. A man can dream.
Sure! No pressure from this end. I wouldn't even go so far as to call my views "advice". (Maybe limit it to "advice, if one wants to trade features for optimization.") I'm sure that ray tracing will come into its own sooner or later, so no harm in having it. Just be prepared for more glitches, weird incompatibilities, performance hassles, and driver-swapping because it's new tech.
Similar to the Ryzen chips. I've encountered a number of members with performance issues that do seem to be caused by Ryzens -- but it's hardly common enough for it to be a universal issue. Again, just the hiccups that develop with new tech.
Oh, the idea of ray tracing is great. I'm by no means an expert on the concept but, as I understand it, there are various approaches to incorporating it into games and software. There are degrees of implementation, so to speak. You don't buy an RTX card and just get universal ray tracing across the board in a standard fashion. More importantly, the hardware requirements for widespread implementation are quite high. I have serious doubts the existing RTX cards are going to be able to hack it without being brought to their proverbial knees.
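To make the "degrees of implementation" point concrete, here's a rough sketch (all names and cost numbers are made up for illustration, not from any real engine): in practice games use hybrid rendering, a rasterized base pass plus optional per-effect ray-traced features, and each effect you switch on adds its own frame cost.

```python
# Hypothetical sketch of "degrees" of ray tracing in a hybrid renderer:
# a rasterized base pass plus optional ray-traced effects.
# All names and overhead numbers are illustrative, not from a real engine.

from dataclasses import dataclass

@dataclass
class RayTracingSettings:
    reflections: bool = False          # RT reflections (neon in puddles)
    shadows: bool = False              # RT shadows
    global_illumination: bool = False  # RT global illumination

# Illustrative relative overheads (1.0 = pure rasterization).
EFFECT_COST = {
    "reflections": 0.30,
    "shadows": 0.15,
    "global_illumination": 0.45,
}

def relative_frame_cost(settings: RayTracingSettings) -> float:
    """Each enabled RT effect adds overhead on top of the raster baseline."""
    cost = 1.0
    for effect, overhead in EFFECT_COST.items():
        if getattr(settings, effect):
            cost += overhead
    return round(cost, 2)

# A game might ship with only reflections ray traced...
partial = RayTracingSettings(reflections=True)
# ...while "full" RT enables everything, at a much steeper cost.
full = RayTracingSettings(reflections=True, shadows=True,
                          global_illumination=True)

print(relative_frame_cost(partial))  # 1.3
print(relative_frame_cost(full))     # 1.9
```

The point being: "supports ray tracing" can mean anything from one cheap effect to the whole pipeline, which is why benchmarks for the specific game matter more than the feature checkbox.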
The marketing surrounding it conveniently omits these considerations. It's all "ray tracing is the next big thing. You must have it. It will make your games a magical experience." Yadda, yadda, yadda. Again, the concept is great. I just don't think the premium price tag is worth it. Not yet, anyway. As with most new features, I'd be inclined to wait until it's developed some history before jumping on the bandwagon. It's the safe play. Not to mention GPU prices can go down.
That's almost exactly my stance. Hardware has largely followed the same patterns since I got into building systems in the 1980's.
1.) Something new is released for, say, $1,000. Mega-hype follows.
2.) Hardware is expensive, non-optimized, possibly flawed (hardware failures and such), and developers haven't had a chance to develop anything for it yet.
3.) A couple of years later, there are loads of games that really take advantage of it, the hardware is more powerful and refined, and things are hundreds of dollars cheaper.
4.) Rather than focusing on the stuff that's working well, the next amazing new hardware is announced for, say, $1,200. Mega-hype...
5.) Lather, rinse, repeat.
So, since the early 2000's, I've always opted to build my systems based on the best-of-the-best stuff from the last generation of hardware. It does mean that I won't get the cool new features, but I can usually build something that's very reliable and performs excellently for at least 5 years. (Built my present rig in 2015. It's still able to run almost everything current at Ultra settings, 1080p. The few titles it can't handle require only minor tweaks to run very smoothly. But no, I can't do 1440p at 120+ FPS.)