Dx12 mGPU support. Please.

CDPR will never implement it, and Nvidia will never make a driver profile for it. SLI is dead, that's why!
I am glad SLI is dead (it was inevitable and only a matter of time)! (y)
I hope CDPR will make the right move and enable DX12 mGPU :D . Until recently they were known for doing things right. This is their first game that disappointed... but at least they are still working on it. I still hope they won't let me down. :giggle:
 
Much needed feature? It's a feature that 99% of the people playing the game won't be able to take advantage of even if it were implemented. I seriously doubt they would waste their time re-engineering the game to add something almost nobody will use when they clearly have much more important and useful things to be doing with their time and skill.

As a secondary note, your claim that 'almost everyone' has a spare GPU powerful enough to be useful just lying around is ridiculous. First of all, a sizeable chunk of people will sell an old graphics card when they buy a new one, and I'd say the majority of people upgrade their graphics card at a point where the old card would be pretty much useless as a second card. Ignoring that, not everyone even has a motherboard capable of holding multiple graphics cards. I sure don't, and my computer is half decent.

You seem to be mistaking your personal experience for that of the majority of people. I assure you, your issues are far from the issues of the majority.
 
We may be a minority, but we have major dreams and hopes. :giggle:
We are going to the major leagues.:p

-not everyone even has a motherboard capable of holding multiple graphics cards.

I would tend to disagree. I know no one in my family (wife, brother, or distant relatives) or among my work colleagues (and no, I don't work in IT) who doesn't have a multi-GPU-capable board. And I surely am not among the top 3% wealthiest people on earth (neither are my relatives or social circle).

You seem to be mistaking your personal experience for that of the majority of people.

Well, you are definitely right in one regard: every person (including me and you) can only speak from their own personal experience. It just happens that our experiences are diametrically opposed.

Much needed feature?
I would dare to say that wanting more than 30 fps should not be considered a luxury in 2021 (on a PC, that is).
So yeah, it is dearly and urgently needed.
 
Just tossing out there again that I don't think it's likely the game will receive any official multi-GPU support. I have no idea for sure, but games that require a lot of CPU-intensive processing don't make great candidates for multi-GPU. The reason is that many of these games require specific frame timing for various functions. Multi-GPU, by its very nature, tends to "share" frames between the GPUs, meaning there's no one frame a game can make a call to. This is less of an issue with games that are GPU-intensive: action games, shooters, non-simulation racers, exploration, adventure, etc. It won't matter to those games whatsoever whether they're running at 60 FPS or 260 FPS -- they can just crank out the frames!

From what I can deduce, CP2077 is both CPU- and GPU-intensive, which is the best and worst of both worlds. Managing multi-GPU support would probably require game processes to be completely rewritten to manage data coming from specific multi-GPU hardware using a specific approach. Not only does the hardware vary greatly, but many setups offer numerous different rendering and drawing approaches that users can switch between. Thus...game functions would likely have to be rewritten to account for each different combination of these elements...

...and that's why I'd argue it's extremely unlikely.
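
For anyone curious what "official support" would actually involve at the API level, here's a minimal, purely illustrative C++ sketch of DX12 explicit multi-adapter: the engine itself has to enumerate the GPUs and create a device per adapter, and everything after that (splitting the frame, copying results between cards) is engine work, not a driver profile. None of this is CDPR's code; it's just the standard DXGI/D3D12 enumeration pattern.

```cpp
// Minimal sketch: enumerate hardware adapters with DXGI and create one D3D12
// device per GPU. This is the starting point of "explicit multi-adapter" --
// how the frame is then divided between devices is entirely up to the engine.
// Link with dxgi.lib and d3d12.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDeviceForEachGpu()
{
    std::vector<ComPtr<ID3D12Device>> devices;

    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return devices;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software rasterizers

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                        D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            devices.push_back(device); // one logical device per physical GPU
        }
    }
    return devices;
}
```

Everything that makes mGPU hard lives after this point: per-device command queues, cross-adapter resource sharing, and frame pacing between cards -- which is exactly the CPU-timing problem described above.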


_______________



On the other side of this coin, there's no reason players can't get ~60 FPS gameplay. My specs are below:
i7 4790K
16 GB G.Skill Ripjaws DDR3 RAM
EVGA GTX 980 Ti, 6 GB GDDR5 VRAM
Samsung 860 EVO SSD
Windows 10

1920x1080, all settings at Ultra (no ray tracing, as it's not supported by the 980 Ti). Frame limit at 56.

I normally get a pretty steady 45-56 FPS everywhere. Occasional tiny spots dip all the way down into the 20 FPS range, but those spots are very few and far between (like the Delamain garage area, where the amount of steam seems to drag the GPU to its knees; I can't imagine this will be as much of an issue for a GPU above a 1080 or equivalent).

For people with more powerful cards, try locking frames at either 64 or 72. Not needing to constantly adjust the render/draw rate will likely smooth things out. Keeping the cap divisible by 8 seems to help sync frame timing with multithreading processes on the CPU. (This approach works for me across a lot of games, not just Cyberpunk.) Since there's no multi-GPU support, hopefully this will help some players get a smoother experience.
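
Not vouching for the divisible-by-8 theory one way or the other, but for anyone wondering what a frame cap actually does, here's a rough, generic C++ sketch of a fixed frame limiter (the 72 FPS value is just the example cap from above, and RunLoopAtCap/renderFrame are made-up names, not anything from the game):

```cpp
// Rough sketch of a fixed frame-rate limiter: do the frame's work, then sleep
// off whatever is left of the frame budget so frame pacing stays even. Real
// in-game limiters are fancier (they usually spin-wait the last millisecond),
// but the principle is the same.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <functional>
#include <thread>

void RunLoopAtCap(double targetFps, const std::function<void()>& renderFrame)
{
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / targetFps));

    auto nextDeadline = clock::now() + frameBudget;
    for (;;)
    {
        renderFrame();                               // the actual frame's work
        std::this_thread::sleep_until(nextDeadline); // burn off the leftover budget
        // If the frame overran, the deadline is already in the past; pull it
        // forward so we don't try to "catch up" with a burst of fast frames.
        nextDeadline = std::max(nextDeadline + frameBudget, clock::now());
    }
}

// Usage sketch: cap a dummy workload at the 72 FPS suggested above.
// int main() { RunLoopAtCap(72.0, [] { std::puts("frame"); }); }
```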
 
Thank you for your detailed reply. You are completely right about the CPU-intensiveness.
I am gaming on a 5K monitor, BUT that's why I bought the freaking $2,000 RTX 3090 (and would happily buy another one for mGPU [DX12 mGPU doesn't need an NVLink connection, so I could even go for a 3080 or 3080 Ti for ~90% of the performance uplift and save some money]).
Just to make it clear: my goal is NOT to buy multiple thousand-dollar GPUs and a Corsair AX1600i PSU ($500), plus the electricity bill, just for the sake of it (although PC building / PC master race is somewhat fun :) ).
My aspiration is just to reach higher fps (higher than the 30 I am getting) WITHOUT having to lower the monitor resolution to a sub-native one or turn graphics settings down to medium or low (maybe in other games, but in this graphical masterpiece it would be blasphemy).
If Nvidia (and AMD) could improve DLSS so that it gives us the graphical quality of "DLSS Off" or "DLSS Quality" but with the framerate of "DLSS Performance", then... I would delete this thread immediately and be on cloud nine. But since Nvidia's strategy is to leave people craving more (the next-gen 4000 series), this probably won't happen.
P.S. The 30 fps I keep referring to is with DLSS Quality turned on. DLSS Performance looks like crap, at least on my system, with terrible smearing artifacts whenever ANYTHING moves; DLSS Quality has it too, but it's much more acceptable. DLSS Off gives me 15 fps (unplayable), so don't anyone think I'm not using all the technologies my expensive-as-a-second-hand-car GPU has to offer.
The attached screenshots (taken the day the game released) coincidentally describe my relationship with Nvidia perfectly. I love them for pioneering this amazing ray-tracing tech, but at the same time I hate and blame them for the pathetically weak cards and DLSS they are producing (considering the prices).
 

Attachments: Photomode.jpg (2 MB), photomode2.jpg (1.9 MB)

That's a lot of different concerns, aside from multi-GPU! :LOL:

1.) There's no reason that you can't get 60 FPS gameplay with that card, but a few caveats (which will remain true across the board for all games):
  • Resolution is the single most impactful thing for FPS. If the hardware is not powerful enough to muscle through the numbers generated by higher resolutions, then you are going to experience performance issues. This is true even with multi-GPU support. Plus, resolution drives rendering and drawing cost up along a steep curve: lowering it a step or two will often have a drastic effect on performance, and raising it will have an equal and opposite effect. (There's a quick pixel-count comparison at the end of this post.)
  • A GPU doesn't have total and absolute control over graphical performance -- it simply does the rendering. It can still be throttled by either CPU functions or an engine that's doing a lot of work. We can see examples of this in many different titles, where going from 4K all the way down to 1600x900 may have no effect on graphical performance at all.
  • In such situations, multi-GPUs will often have no effect. It's not always possible to "brute-force" through such issues, as "processing power" isn't the deciding factor. In actuality, introducing multi-GPU may actually make the performance worse or introduce severe instability, making it even harder for the CPU or game engine to get at what it needs when it needs it.
2.) I'll argue that >2K at a steady 60+ FPS is completely out of reach for most games released in the last 5 years or so. Probably a lot of titles over the last 10 years won't be smooth at that insane level of resolution. We're still entering the era of 1440p and 2K becoming standard resolutions; they're not there yet. The vast majority of games on the market in 2020 will still want to default to 1080p. Pushing things beyond what they're really designed for is possible, but not guaranteed to provide consistent or stable results. Realistically, I'd argue that top-of-the-line GPUs right now are capable of handling modern games at 2K, 60 FPS without ray tracing. If we engage ray-tracing options or jack the resolution up to 4K+, I think we'll hit 30 FPS really, really fast. (Even with multi-GPU, I doubt we'd see much improvement -- maybe 40-50 FPS, and players would probably still need to tone down settings to make even that stable.) And that's talking about games that are not specifically designed to operate at 4K (...which is still the vast majority of games and rendering techniques).

3.) I'm sooo behind DLSS. I am eagerly awaiting new updates. I bought a really slick monitor this past year without considering just how disparate the scaling would be if I wanted to bust the resolution down to 1080p. :facepalm: Playing many things now in a window...or with big, ol' black borders around my fullscreen space. Being able to fill the drawspace and get 1:1 scaling with virtually no blurring or pixelation would be wonderful. I also think this is a tech that probably would have been better to pursue years and years ago, instead of trying to iterate on heavily taxing AA techniques. But -- it's on its way!

These things, though, are really not going to benefit or suffer much from multi-GPU. For example...if I'm getting smooth FPS at 2K+ native...then there's no reason to activate DLSS at all. Unless my monitor is the size of a wall, 2K/4K resolution is too fine for the human eye to discern individual pixels anyway (given the average 15" - 30" screen). Using it would simply add weird pixel blocks around some edges. It would only be useful if trying to render something like 2K resolution on a huge 60-inch, 8K screen.
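
To put some numbers behind the resolution point in caveat 1 (and why 5K native is such a different beast from 1080p), here's a quick back-of-the-envelope comparison of how many pixels each resolution asks the GPU to shade per frame. It's raw pixel count only -- per-pixel cost varies with the scene and effects -- so treat it as a rough lower bound on the extra work:

```cpp
// Pixel counts per frame relative to 1080p -- nothing game-specific here,
// just arithmetic on common resolutions.
#include <cstdio>

int main()
{
    struct Res { const char* name; int w, h; };
    const Res resolutions[] = {
        {"1080p (1920x1080)", 1920, 1080},
        {"1440p (2560x1440)", 2560, 1440},
        {"4K    (3840x2160)", 3840, 2160},
        {"5K    (5120x2880)", 5120, 2880},
    };

    const double base = 1920.0 * 1080.0;
    for (const Res& r : resolutions)
    {
        const double pixels = double(r.w) * r.h;
        std::printf("%s  %10.0f pixels  %.1fx the 1080p load\n",
                    r.name, pixels, pixels / base);
    }
    return 0;
}
```

1440p works out to roughly 1.8x the 1080p pixel load, 4K to 4x, and 5K to about 7.1x, which is why a card that holds 60 FPS at 1080p can land near 30 (or worse) at 5K even before ray tracing enters the picture.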
 
  • A GPU doesn't have total and absolute control over graphical performance -- it simply does the rendering. It can still be throttled by either CPU functions or an engine that's doing a lot of work. We can see examples of this in many different titles, where going from 4K all the way down to 1600x900 may have no effect on graphical performance at all.
  • In such situations, multi-GPUs will often have no effect. It's not always possible to "brute-force" through such issues, as "processing power" isn't the deciding factor.
THIS 👆 is what I am most afraid of. And yes, you figured me out: brute-forcing things is indeed a character trait of mine, from losing weight (100+ kg) by just doing an insane number of push-ups daily, to small things like always using the GCam app on my smartphone with the HDR+ 72 option on a tripod (which basically means the phone takes the photo 72 times and stacks them on top of each other) just to take the best vacation photos. Yes, it takes forever, but then I have the bragging rights, because the photos rival my wife's DSLR camera.
You are very wise.
I think everything on this topic has already been said in this thread.
But nonetheless, please don't delete this thread. Maybe someday, when CDPR is finished repairing all the bugs (which the majority of people [not me] find more important than raw fps) and the programmers get bored, they may scroll through this forum and consider adding this feature (if Nvidia hasn't released DLSS 3.0 by then). Hope dies last. :giggle:

P.S. The rest of my system (to show what hardware is NOT enough to make me happy),
besides the Strix RTX 3090:
- Ryzen 9 3900X (wanting to upgrade to a 5900X, but waiting until it doesn't cost as much as a 5950X)
- Samsung 980 Pro 1 TB NVMe (+ other, slower drives, but CP2077 is on this monster)
- 32 GB RAM at 3600 MHz
- Crosshair VIII Hero (doesn't make the game faster, but gives more stability)
- Corsair H1000w PSU (which I will need to upgrade when going mGPU)
- a 3440x1440 monitor and the 5K one
 

Oh -- the topic stays! There are still a lot of players that will want to try to take advantage of multi-GPU, and it's possible to do so even if there is no official support. (That's the way it has been for pretty much ever!) Thus, it's good to keep this around for people to share their experiences and make their arguments. (And I'll repeat -- I have no idea nor any say on whether the game will receive multi-GPU support. Nothing is impossible.)

I'm with you, though, about DLSS. I'd love to see support added for non-RTX cards (and I don't see why that would be impossible if they could develop a driver-only approach). But at the least, that would be a good sign to me that it's time for an upgrade. I'm not too fazed one way or the other about ray tracing, but I wants DLSS. That, to me, is one of the best solutions for steady performance.
 
On this topic: I would absolutely love to see an official patch supporting mGPU setups, but I'm far from expecting it to happen. I use such a setup for a couple of reasons, primarily to pull strain off my primary GPU while keeping higher graphics quality and lowering the heat output of my GPUs. I can live with lower frames in some of the newer, graphically heavy games, mostly because I can't afford to build anything but a budget build. For reference, my setup (a Ryzen 5 2600, two Radeon RX 570 8 GB cards, 64 GB of RAM, and a 1080p FreeSync monitor for the gameplay) has me bouncing between 30 and 60 fps, and well over 200 fps in the in-game menus; all in all, it averages out to 66 fps in Cyberpunk at maximum graphics. I do exceed 60 fps in other games, notably an average of 80 fps in Final Fantasy XIV, 70 fps in Fallout 76, and around 130 fps in Elder Scrolls Online. Now, if I could afford the newest cards, I am certain I could get a better showing than this, but it is far from economical for me to do so.

If anyone has found an mGPU setting (preferably one utilizing CrossFire) that is actually stable with this game, I would love to know it. I will be playing around with those settings myself and will share my findings here.
 
AAA games with leading-edge graphics should scale seamlessly across as many GPUs as a user has. One of the points @Gabe2077 made very early in the thread is the principal reason: no one can run this game and get its full cinematic experience. Details and resolutions will only continue to increase, even after this game runs at 100+ FPS on newer generations of GPUs. For games like this not to scale with the power available across users' systems makes no sense whatsoever, and it's ridiculous. Almost as ridiculous as NVIDIA's farcical 4090 GPU, which doesn't even fit in a standard case. CDPR needs to rip the Band-Aid off and develop the foundations it needs for all its games to scale properly across mGPU setups, so it can continue to support the most advanced realism and newest gaming experiences going into the future.
 