Building a gaming PC

About electrical interconnects, weren't there supposed to be light-based circuits? Or did that go nowhere?

Well, the 170 W is really more like 220 W. The 7950X and the 13900K are pretty much the same power draw, 220 W vs 250 W. TDP means something completely different nowadays -.- another marketing gimmick. The eco mode at 105 W TDP is still pretty decent though; diminishing returns have hit both GPUs and CPUs hard this generation, it seems. Intel has these kinds of limits too, a 125 W mode and a 65 W mode, though it's a bit worse than the 7950X in workloads that way.

Yeah, their TDP isn't the expected power draw, but some formula they came up with... to match Intel's TDP, lol.
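If anyone wants to see the arithmetic: AMD's actual socket power limit (PPT) is usually quoted as roughly 1.35x the advertised TDP. A quick sketch of that relationship (the 1.35 multiplier is just the commonly cited figure, so treat the outputs as approximate, not measured numbers):

```python
# Rough sketch: AMD's package power limit (PPT) is commonly cited as ~1.35x the advertised TDP.
# The multiplier and outputs are approximate, not measured values.
def ppt_from_tdp(tdp_watts: float, multiplier: float = 1.35) -> float:
    """Estimate the package power tracking (PPT) limit from an advertised TDP."""
    return tdp_watts * multiplier

for tdp in (170, 105, 65):
    print(f"{tdp} W TDP  ->  ~{ppt_from_tdp(tdp):.0f} W PPT")
# 170 W TDP -> ~230 W PPT, which is why a "170 W" 7950X pulls well over 200 W in all-core loads.
```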

I'll wait for the 3D V-Cache 16-core model (7950X3D?). That should be interesting.

I wonder what presets Sapphire Pulse RX 7900 XTX will have. They usually have a VBIOS switch for higher and lower modes.
 
I wonder what presets Sapphire Pulse RX 7900 XTX will have. They usually have a VBIOS switch for higher and lower modes.
Think they are going to make a 3x 8-pin card with a 450 W+ edition at least. I'm fairly sure most AIBs will make a 450 W edition card with higher clocks and so on, since AMD doesn't seem to have a 4090 challenger yet.
 
Asus already announced a 3x 8-pin card. I suppose Sapphire will have one of their Nitro models like that too. Hopefully the Pulse model will use 2 connectors like the stock model.
 
About electrical interconnects, weren't there supposed to be light-based circuits? Or did that go nowhere?
The problem there is that nobody has found a practical way of implementing it: the "light pipes" (optical waveguides) are much larger than the traditional traces on a silicon wafer. It's a valid idea, but right now there is no way to implement it while keeping everything small and compact... and of course cost-efficient.
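To put rough numbers on that size gap (these are assumed order-of-magnitude figures on my part, not exact process specs): an on-chip waveguide has to be on the order of the light's wavelength inside silicon, while modern metal interconnects are only tens of nanometres wide.

```python
# Ballpark comparison (assumed order-of-magnitude figures, not exact process specs):
# a silicon waveguide must be roughly wavelength / refractive-index wide to confine light,
# while leading-edge metal interconnect pitches are a few tens of nanometres.
wavelength_nm = 1310          # typical telecom-band light
silicon_index = 3.5           # refractive index of silicon
waveguide_width_nm = wavelength_nm / silicon_index   # ~375 nm, roughly the minimum feature
metal_pitch_nm = 30           # rough minimum metal pitch on a modern node

print(f"optical waveguide ~{waveguide_width_nm:.0f} nm vs metal pitch ~{metal_pitch_nm} nm "
      f"(~{waveguide_width_nm / metal_pitch_nm:.0f}x larger)")
```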
 
Ehh, gonna have to disagree with that. 4K looks way better than 1440p imo, even at only ~100 fps. Sadly it's something you just can't show on YouTube; it has to be seen in person. Depends on a lot of factors too, of course, like a good screen and so on.
Yeah, it will all be subjective in the end. I've seen it first-hand many times. My point, though, is that if I'm into a game (especially an action game that focuses on fast movements and quick response times), the enjoyment of the game itself is not going to be dictated by how beautiful the graphics are. (That's why there are a lot of games with incredible graphics that people just don't like very much, and plenty of excellent games have little in the way of graphical realism.)

And when it comes to 4K gaming as it exists now, we're splitting hairs. Don't get me wrong, I still say that 4K is the future of gaming. I still argue that when it becomes standard, it will revolutionize what can be accomplished graphically.

But we're not there yet. And for what's available now, along with what can reasonably be expected in the foreseeable future... there's exactly zero reason for anyone to spend $1500 on just the GPU. Almost everything in the next 5 or so years is going to be built around the 1440p to 2K range. In fact, the utterly ridiculous price tag on the 4090 more or less guarantees that, since only a small minority of gamers are going to be willing or able to buy one.

AMD's specs, performance, voltage, and the sheer size of their card are leaps and bounds more sensible and reasonable. I mean, even if their performance falls way short of the 4000 series, the simple lack of accessibility Nvidia just introduced probably handed AMD a clean win. But the sales will tell.

Yeah, I kinda agree at least; the 7900 XTX isn't targeting the 4090 though, it's a 4080 competitor with around the same power draw, 320-350 W I think (reference models; the AIBs seem to be going for 450 W on the 7950 XTX for more FPS, of course). It releases next week, so no clue yet about which is faster; AMD is a bit cheaper, I'm guessing. Most people definitely don't need a 4090, I fully agree. If you want to game at 4K with maxed settings and perhaps do some rendering/AI, it's the better choice. At least until the 4090 Ti comes out (supposed to have even more memory and a fully unlocked core).
I'd love to play at 4K! But I'm not paying close to two grand USD to do it. I promise those games will be just as engaging at much, much lower resolutions. And they'll still look incredible.

Sorry in advance for my "uninformed" opinion, but I don't get the comparison between the Nvidia 4090 and the RX 7900 :)
For me, the two clearly have different goals and aim at different markets. With the 4090, Nvidia showcases what they're capable of, "no matter" the price, affordable only to those who can pay it. The RX 7900, on the other hand, is designed to target a larger audience.
Sounds like you're getting it, exactly. It is weird, though, since this is the first time the two have not tried to directly compete with performance output. It does seem that AMD simply ignored Nvidia's "power play" and went: "Riiiiiight... Anyway! Here's a card that will drastically increase performance for what's on the market right now, provide excellent high-end future-proofing, and it's about half the cost. Plus, you can use it with your existing builds. You decide!"

Unusual, since AMD didn't try to "answer" the 4090...they just kinda dismissed it and offered their own, more incremental upgrade.
 
But we're not there yet. And for what's available now, along with what can reasonably be expected in the foreseeable future... there's exactly zero reason for anyone to spend $1500 on just the GPU. Almost everything in the next 5 or so years is going to be built around the 1440p to 2K range. In fact, the utterly ridiculous price tag on the 4090 more or less guarantees that, since only a small minority of gamers are going to be willing or able to buy one.
Well, I'm not expecting it to happen soon either, tbh. The thing I find most annoying about this whole thing is the big gap that's forming. You can probably play fine at 4K now with the 4080 and the 7900 XTX, with decent settings and at least 60 fps without much issue. If you want the best, you're going to have to fork out way more cash, though, sadly. The 4090 is for people who don't mind the high price because they don't want to compromise.
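For a rough sense of why 4K is so much heavier than 1440p, the pixel counts alone tell most of the story. A quick back-of-the-envelope sketch (assuming frame cost scales roughly with pixel count, which is a simplification):

```python
# Back-of-the-envelope: how many pixels each resolution pushes per frame.
# Assumes frame cost scales roughly with pixel count (a simplification; shaders,
# CPU work and memory bandwidth all muddy this in practice).
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = resolutions["1440p"][0] * resolutions["1440p"][1]

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels  (~{pixels / base:.2f}x the work of 1440p)")
# 4K is ~2.25x the pixels of 1440p, so a card doing ~135 fps at 1440p
# lands around ~60 fps at 4K under this crude scaling.
```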
Unusual, since AMD didn't try to "answer" the 4090...they just kinda dismissed it and offered their own, more incremental upgrade.
Yeah, that's a bit weird; I'm guessing the 7950 XTX will be more 4090-like though. I'm just guessing, but I don't think they will surrender the crown that easily.
 
Yeah, that's a bit weird; I'm guessing the 7950 XTX will be more 4090-like though. I'm just guessing, but I don't think they will surrender the crown that easily.
But isn't Nvidia planning to release an RTX 4090 Ti too? (instead of the TITAN RTX)
I assume it'll be even more expensive ^^
 
But isn't Nvidia planning to release an RTX 4090 Ti too? (instead of the TITAN RTX)
I assume it'll be even more expensive ^^
Yeah, this is why no competitor to the 4090 is quite weird... if they made one now, they could keep up with the 4090 Ti if it comes later. It's rare that one company just lets another win the performance crown that easily and doesn't even make a card to compete with it. We shall see what happens though; heck, half the cards are not even released yet -.-
 
Yeah, this is why no competitor to the 4090 is quite weird... if they made one now, they could keep up with the 4090 Ti if it comes later. It's rare that one company just lets another win the performance crown that easily and doesn't even make a card to compete with it. We shall see what happens though; heck, half the cards are not even released yet -.-
Given the price, and that AMD does not yet have dedicated hardware for RT/AI, trying to compete with a £1600 card is not the best idea.

Pantsing them on price/performance by several hundred per tier is how you capture market share and justify investing in developing that dedicated hardware.
 
Exactly. If they can meet or beat a 4080 and offer it at $200-300 less, and carry that difference as a percentage all the way down the rest of the product stack, then they will have a real winner. There's really no business sense in competing at the very highest end, since the volume is low and so are the margins. Nvidia might not make a dime off the 4090 line in the end; it's more or less a marketing gimmick that may help them sell their lower-tier products. If AMD tried to compete at that level, they would have the exact same power-guzzling problems, because the physics of delivering that much GPU compute means you have to feed it a lot of current. Even if by some luck they could do it 10% more efficiently (and that's doubtful at that level), you're only talking 540 W vs 600 W, and you'd still need just as big a power supply.
 
I honestly wouldn't go for bleeding edge at the moment. The optimizations for DX12 Ultimate games coming out next gen are insane. I can't remember the names of the features, but you can demo a lot of them in the 3DMark DLC packs designed specifically for DX12 Ultimate benchmarking. Even with my Ryzen 9 3950X @ 4.3 GHz, 3200 MHz DDR4, and RX 6900 XT, I can pull some respectable fps at 1440p and 4K given the bandwidth limitations of my motherboard. There are a couple of features in particular they do with pixel and object mapping that blew my mind. I would say with DX12 Ultimate you should be able to technically do more with less, but that is up to you and your wallet.
 
Given the price, and that AMD does not yet have dedicated hardware for RT/AI, trying to compete with a £1600 card is not the best idea.

Pantsing them on price/performance by several hundred per tier is how you capture market share and justify investing in developing that dedicated hardware.
It's how they captured the CPU market, and that's my sentiment about their approach given the recent offerings. I really do hope to see them get to a place where they can touch Nvidia, but for now I'm locked in with the green stuff for production. High-fidelity vidya is an appreciated bonus. Radeon still can't touch CUDA when I need it most, and having nearly 17k of them on one board is a big deal to me.
 
I don't believe that the 4090 is a result of trying to create an actually marketable product. Everything about the card:
  • Insane power draw / lack of compatibility with the majority of PSUs
  • Massive size, not compatible with the majority of existing cases
  • Poor design with regard to cable management, the card overlapping motherboard components, etc.
  • Tendency to bottleneck heavily due to CPU capability
  • Proprietary power cables? Really?? With hand-soldered pins!???
  • Utterly ridiculous price tag
...is evidence that Nvidia was not focused on creating something with consumer value in mind. They were focused on trying to capitalize on the graphics card "famine" they themselves intentionally helped to create. I say it's a simple psychological ploy. Make buyers super-hungry... then try to get them to pay triple value for the biggest, craziest, shock-and-awe product we can come up with! Dazzle them with a big loud engine... even though the vehicle won't fit the average garage, is completely illogical due to its gas mileage, and there's absolutely no practical use for it in day-to-day life.

They're trying to convince people that you need this $450,000, Ferrari-engine, double-sized, racing pick-up truck, with ultra-wide, off-road tires that will take up two parking spots...to...what? Go back and forth to work? Take the kids to soccer practice? Go get milk?

It's stupid.

Or you can just buy a $50,000 Audi. Over twice the mileage per gallon, it can cruise just as fast on the expressway with even better handling, carries all the groceries you can shop for, and you can actually put it in your garage without spending another $35,000 to remodel your house.

I suppose, if you find yourself in an off-road race going uphill in the rain, the Ferrari Gorgo-Behemo-Titan 4000 might win... but how often do you wind up in that situation in a given week, exactly?

It's a phenomenally unrealistic and completely impractical product for the vast majority of the gaming market. For those who must have flagship hardware the instant it hits the shelves, it doesn't change the reality that, by the time that sort of power actually makes a difference for existing games, the technology will be drastically improved, titles will actually take advantage of it, and the cost will have dropped by 50% or more.

Nvidia just shot themselves in the foot using the highest caliber they have.
 
I wish more games would stop using DX12 and would start using Vulkan even on Windows. MS is still playing their lock-in game though.
Vulkan oftentimes is not the better choice. What it does is take some of the load off the CPU and transfer it to the GPU. That's great if the game is CPU-bound; however, if the game is already GPU-bound and you transfer more load over to it, then you get stuttering and 0.1%-low problems, and the game isn't as smooth, because you are taking an already overloaded GPU and adding even more load. In some games Vulkan works better, but in many games it simply does not.
 
That's an article about translating DX12 to Vulkan so it can run on Linux, and it doesn't negate the points I made: that Vulkan doesn't work as well with GPU-bound games but does work better with CPU-bound games.
 
That article points out deficiencies of DX12 in comparison.

I don't think your statement is even correct. Whether something is GPU- or CPU-bound has no bearing on whether those low-level GPU APIs work well or not. It's about how you use them, not about the APIs themselves.
 
Hmm, for me I can't remember a game that ran better with Vulkan than with DX12. Most games seem to have both, at least the ones I own. RDR2 is probably the one I tested the most, and it felt smoother on DX12 for me, though not exactly better fps if I remember correctly. That said, MS has kind of a stranglehold on the PC market (except for Linux, I guess), so I'm not surprised they push their own software more.

There was a lot of talk when Vulkan was released, I remember, about how good it was and how it was superior to DX and so on. It hasn't really had the breakthrough a lot of people expected at the time, it seems. I don't think it's worse than DX12, but it's not really better for me at least either. If more games had it, I guess it could improve with more development?
 
Yes, RDR2 is a good example. You get a slightly better average FPS with Vulkan, but you also have a stuttering problem in many places in the game, while with DX12 you get a slightly lower average FPS but better 0.1%-low FPS and thus smoother gameplay. The reason is that RDR2 is mainly GPU-bound. Now, if you are playing one of the twitch shooters and have the graphics settings lowered so you get the best FPS possible, then Vulkan starts to make more sense. They are both perfectly good APIs; they just have their separate places depending on the game.

So what if I get 63 FPS instead of 66 FPS, if it means the overall gameplay is smoother.
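For anyone wondering what the "0.1% low" number actually measures, here's a minimal sketch: sort the per-frame times, take the slowest 0.1% of frames, and convert their average back to FPS. The frame-time values below are invented purely for illustration, not real benchmark data.

```python
# Minimal sketch of how average FPS vs 0.1%-low FPS are computed from frame times.
# The frame times below are invented for illustration, not real benchmark data.
frame_times_ms = [15.0] * 990 + [60.0] * 10   # mostly ~66 fps, with a few big hitches

def average_fps(times_ms):
    return 1000.0 / (sum(times_ms) / len(times_ms))

def percentile_low_fps(times_ms, fraction=0.001):
    worst = sorted(times_ms, reverse=True)        # slowest frames first
    n = max(1, int(len(times_ms) * fraction))     # the worst 0.1% of frames
    return 1000.0 / (sum(worst[:n]) / n)

print(f"average:  {average_fps(frame_times_ms):.1f} fps")
print(f"0.1% low: {percentile_low_fps(frame_times_ms):.1f} fps")
# A slightly lower average with much better lows usually feels smoother in play.
```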

The main reason developers are using DX12 is that it's unified across both Series X and PC, meaning they can write most of the code just once and use it for both systems, whereas with Vulkan you'd still have to rewrite the code in DX12 for the Series X. Just using DX12 saves time and money and gets the product out the door faster. It also makes things less complicated, so theoretically that means fewer bugs.
 