Low GPU usage

Hi everyone, I've just built my new PC to play Cyberpunk:

- AMD Ryzen 5 3500X

- RTX 3060 Ti Ventus 2X OC

- 16 GB 3200 MHz RAM

- MSI MAG B550M Mortar

- 650W PSU 80+ Bronze

I'm on the latest driver for the 3060 Ti. The game is on an SSD and patched to 1.05, so the AMD CPU fix is already in. The problem is that my frame rate seems capped at 35-50 FPS in Night City, even if I lower the graphics settings.

GPU usage is low (63%) at 1080p with RTX on and DLSS Quality:
[Screenshot: 1080p, RTX on, DLSS Quality]

But if I turn RTX off while keeping DLSS Quality, GPU usage drops to 53%:
[Screenshot: 1080p, RTX off, DLSS Quality]

Strangely, when I turn DLSS off (with RTX on), GPU usage goes back to normal:
[Screenshot: 1080p, RTX on, DLSS off]

At 1440p, GPU usage is back to normal with DLSS on or off, and the FPS gain with DLSS on is real:
[Screenshot: 1440p, RTX on, DLSS off]

[Screenshot: 1440p, RTX on, DLSS Quality]

But if I turn RTX off with DLSS Quality, GPU usage drops again (71%):
[Screenshot: 1440p, RTX off, DLSS Quality]

My CPU usage is always high, even though nothing is running in the background. This problem only shows up in this game; in BFV I always get >90% GPU usage with RTX on or off, at 1440p or 1080p.
The ideal way to play Cyberpunk would be 1080p with RTX on and DLSS on, but at 63% GPU usage it's impossible to reach 60 FPS. Any ideas? Thanks in advance.
 

I'm having this same issue with a Ryzen 7 3700X and an RTX 3070 playing at 1440p.
 
Well, as I stated in some other threads, there is something wrong with the game's performance optimization.

I have an i7-9700, 32 GB RAM, an RTX 3080 and a 750W Gold PSU.

On certain streets my FPS drops from 75-105 to around 50. At the same time GPU AND CPU usage drop to around 60%. That is not a case of not enough horsepower: the horsepower is there, the game is struggling. Also, please don't spam the useless memory-pool edit, because not only does it not work, it can't work, as that file was never read by the game.
 
Hi,

With the recent 1.05 update I generally get more FPS than before, but I've noticed something strange. On 1.04 my GPU was always around 95-99%, which is great, and the CPU at 45-50%. With the 1.05 update the CPU goes to 80-100% most of the time, but GPU usage is lower now, around 80%, so the card is no longer being maxed out. It's good that they fixed the six-core Ryzen problem, but now the GPU, or at least mine, doesn't work as it should.

My specs:

- Ryzen 5 2600 3.4 GHz (overclocked to 4.0 GHz)
- RTX 2060 (overclocked)
- 16 GB RAM 3200 MHz

I play with Ultra settings, RT on (Medium) and DLSS on Balanced (45 FPS minimum, 53-54 average).
 
I don't have anything AMD. On 1.04 I got roughly 45-50 FPS with everything maxed. On 1.05 I get 20-30 FPS, dipping into the teens. i7, 2080 Ti, 32 GB RAM, SSD. Low GPU usage. So the last patch caused a problem.

Update: I just switched DLSS off Auto and got a massive improvement.
 
If your NVIDIA GPU is not sitting at ~95-98% utilization, open the NVIDIA Control Panel and check the following:

  1. Make sure you're running the latest available WHQL drivers (check the drivers page on nvidia.com)
  2. Select "Manage 3D Settings"
  3. Under "Global Settings", scroll down to "Power management mode"
  4. Change the value from "Normal" to "Prefer maximum performance"
That way your GPU will always stay in its highest performance state while running 3D applications.

To further increase performance, you may also want to change "Texture filtering - Quality" to "High performance". This option has hardly any visible impact on texture quality, yet it can give you back a few FPS depending on the game.
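
If you want to double-check that the power-management change is actually keeping the card busy, one option is to log GPU utilization from the command line while the game runs. Here is a minimal sketch of my own, just for illustration: it assumes you have Python installed and that the driver's nvidia-smi utility is on your PATH; the script name and one-second interval are arbitrary.
Code:
# poll_gpu_usage.py - hypothetical helper: print GPU utilization once per second
import subprocess
import time

def gpu_utilization() -> int:
    """Ask nvidia-smi for the current GPU utilization percentage."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only

if __name__ == "__main__":
    while True:
        print(f"{time.strftime('%H:%M:%S')}  GPU: {gpu_utilization()}%")
        time.sleep(1)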

In addition, I recommend using the following mod with Cyberpunk:
Code:
https://github.com/yamashi/CyberEngineTweaks

CyberEngineTweaks is an open-source project that improves game performance and adds some extra features.
Read the wiki at the GitHub link I shared for how to install the mod and how to change its settings.
This mod improves performance further and doesn't require you to manually change hex values inside the Cyberpunk executable (if you did that, revert the changes, as they may conflict with the mod). It enables the AMD SMT fix by default and also helps comparable Intel CPUs, as it prevents frame drops under specific circumstances.

Please note that when AMD SMT is enabled and fully utilized, your CPU temperatures may run a bit higher than usual. Keep an eye on that in case your cooler can't handle it.

Hope this helps!
Cheers
TalentX
 
Have you guys tried benchmarking your PCs? I think there's a free one called Heaven Benchmark...
Maybe XMP isn't enabled on your motherboard?
I know Intel's smart technology with the right motherboard auto-clocks my PC just fine... and it's stable...
CPUID HWMonitor is a good monitor...

I think video cards follow three rules... temperature/power/... I can't remember the other :( , but as long as you have those things, it should auto-clock to maximum...
Man, I'm tired, I can't even think, sorry :(
Post automatically merged:

Also, JayzTwoCents and GamersNexus are both good YouTubers for PC knowledge...
They're like MythBusters but for PCs.
Aw, I can't forget Linus too.
 

The problem with both GN and J2C (and Linus) is that they don't test outdated CPUs with modern GPUs (like Wristed's setup above).

In every video they have access to 10900Ks or 5900Xs, so that's what they test with. They never show how badly an outdated 2015 mid-range CPU like an i5-6600K will bottleneck a good GPU from the last two years like an RTX 2080 Ti or RTX 3080.

The reason they don't is clear: you shouldn't do that. But a lot of people still try. My best friend went into this generation thinking a 7700K would be enough for a new RTX 3080 and basically got slapped in the face when his old 4-core CPU could barely feed his new GPU. He was seeing FPS 30+ lower than expected, because every RTX 3080 benchmark paired the card with brand-new CPUs that could actually keep up with its rendering power. His 7700K was lucky to even push 55-65% of those numbers, because the CPU matters much, much more than he believed. He had to drop another $800 on a new motherboard, RAM and CPU just to use his RTX 3080 at most of its real potential.
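
As a rough mental model (my own simplification, not anything from their videos): whichever of the CPU or GPU is slower caps the frame rate, and GPU utilization roughly tracks the ratio between the frame rate you actually get and the frame rate the GPU alone could deliver. A toy sketch in Python, with made-up numbers:
Code:
# toy bottleneck model - a simplification, not a benchmark
def expected_fps_and_gpu_usage(cpu_limited_fps: float, gpu_limited_fps: float):
    """The slower stage caps the frame rate; the GPU then idles for the rest
    of each frame, so its utilization is roughly the ratio of the two."""
    fps = min(cpu_limited_fps, gpu_limited_fps)
    gpu_usage = 100.0 * fps / gpu_limited_fps
    return fps, gpu_usage

# e.g. a CPU that can only feed ~60 FPS driving a GPU capable of ~100 FPS:
print(expected_fps_and_gpu_usage(60, 100))  # -> (60, 60.0), i.e. ~60% GPU usage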
 

They've got older videos that talk about bottlenecking... that cover older hardware... but they also usually throw in programs and settings for some games... They talk about which current parts pair with what to get you the most for your money...
A 7th-gen Intel shouldn't bottleneck a 3080?... It takes a lot to be bottlenecked by your CPU in games... as most of the power comes from the GPU... I haven't watched them in a while... Me and my bro watched them for like a month when we were looking for parts to build our PCs for the first time... that was like 6 months ago?

But there are certain games that are more CPU intensive...
I can't remember Jay's analogy on bottlenecking, it was something like "information is being fed like lanes on a highway"... I can't remember the rest... something about different-size lanes.
Post automatically merged:

Also, are the forums lagging for anyone else since yesterday? :(
Post automatically merged:

I swear they did a video on whether 7th/8th/9th-gen processors can handle a 2080 Ti...
Post automatically merged:

Aw, you're kinda right.
He posted, "A lot of people were disappointed that we didn’t test mid range and low end CPUs for our bottlenecking video. I personally never recommend high end GPU with lower end cpu. BUT we will do this test again with mid range CPUs when the 30 series embargo lifts to make a better video for you guys. Thanks for the feedback! <3"
That was an older video...

I also think I remember them talking about the next-gen GPUs coming out... that had a focus on DLSS... a new technology that was going to help cut down the power needed for ray tracing... and lower the performance cost...
I forgot all about DLSS... I need to mess with it in Cyberpunk...
Post automatically merged:

3080 with 7th/8th/9th/10th-gen Intel
Post automatically merged:

8:44 - Steve talking about which in-game DLSS option you should use for your resolution (1080p/1440p/2K/4K)

 
I have low GPU usage in Night City: graphics preset Ultra + RT Ultra, DLSS on Auto, Full HD.
GPU usage: 60-70%
CPU usage: 50%
FPS: 45-55

● CPU: AMD Ryzen 5 3600
● GPU: Palit RTX 3070 Gaming Pro OC
● RAM: Kingston HyperX Fury HX436C17FB3R2/16 2x8 GB 3933 MHz
● Motherboard: ASUS PRIME X570-P
● SSD: NVMe M.2 Samsung 970 EVO Plus 500 GB
● PSU: be quiet! PURE POWER 11 600W CM Gold (BN298)
● OS: Windows 10 Pro, version 20H2, 64-bit, build 19042.685 (fully updated)

● Monitoring software: MSI Afterburner
● Driver version: NVIDIA 460.79 (clean installation with DDU)

I fixed it by uninstalling the shitty RGB app.
Post automatically merged:


Try uninstalling RGB programs and fan profiles; that worked for me.
 
Is this still an issue? I'm not massively techy so I don't know what normal should look like, but in most games I see my GPU around 90-100% and my CPU at 20-60%, while in CP2077 I've noticed my GPU at 70-80% and my CPU anywhere between 40-90%.

I'm running an R5 3600 (at 4 GHz all cores), a 2060 Super with a small OC and 16 GB 3200 MHz RAM, output at 1080p. I'm running RT on (only lighting, at Medium) with most other settings high/ultra - generally the Gamers Nexus recommended settings.

Surely a game like this should run at around 95-100% GPU and, I'd guess with my CPU, about 70-80%.

It feels so poorly optimised compared to Witcher 3 GOTY and RDR2; I constantly have to monitor graphics and performance rather than enjoy the game.
 
Glad others are finally chiming in on this. I've been pointing it out in various places for a week or so now. It's just how broken the game's optimization is, and possibly the NVIDIA/AMD drivers too; they've been doing a really bad job with driver improvements for a while now. The hex edit for CPU usage won't really help overall because of the prevailing main problem of poor hardware utilization. Usually the usage will plummet in the exact same spots around the game no matter what you do. Same with GPU usage. I've seen the CPU dip as low as 12%, and since the GPU was tanking at the same time, I had like 35 FPS in that particular scene even though I normally play at 60-70 FPS. Totally lame.

My Strix 3090 is running at max power limit with +70 core / +600 memory, the Windows 10 Pro Ultimate Performance power plan, Maximum Performance mode in NVCP, etc. I removed all possible power limiters to help it out, short of shunt modding the damn thing and moving from 480W to like 700W, lol. It still only sits at 60-70% usage in most of the game, and dips much lower in many spots; 67-70% is a "good" area, lol.

CPU usage will bounce around a lot if you're using PBO to OC with Ryzen - anywhere from the teens to 60+% with my 5950X, depending on the level of optimization in the particular area you're in. It fluctuates so much that performance isn't reliable. I tried a manual OC to 4.6 GHz all-core instead, and the results are pretty much the same, with a slightly better average framerate and better lows. The manual OC doesn't help the frame dips at all when CPU and GPU usage tank, though. I even tried disabling SMT in this game to see if that helps; it felt like performance was worse, but that was with the PBO OC and not my manual OC. We just have to wait until they fix the game and the drivers get better. Mostly until they fix the game.
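
For anyone trying to pin down exactly where utilization tanks, a crude way is to log CPU and GPU usage side by side once per second and match the timestamps to where you were in the city. A rough sketch of my own (assumes Python with the psutil package installed and nvidia-smi on PATH; the file names are just placeholders):
Code:
# usage_logger.py - hypothetical helper: log CPU and GPU utilization together
import subprocess
import time

import psutil  # pip install psutil

def gpu_utilization() -> int:
    """Read current GPU utilization (%) from nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only

if __name__ == "__main__":
    with open("usage_log.csv", "w") as log:
        log.write("time,cpu_percent,gpu_percent\n")
        while True:
            cpu = psutil.cpu_percent(interval=None)  # average across all cores
            log.write(f"{time.strftime('%H:%M:%S')},{cpu},{gpu_utilization()}\n")
            log.flush()
            time.sleep(1)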
 
This all sounds like intelligent stuff I don't understand, but it sounds like we both see the same thing - GPUs not hitting 100%, CPUs fluctuating and FPS nowhere near maxing out. I think it must be game optimisation; again, I don't profess to know much, but I guess this game was developed on an evolution of the engine CDPR used for W3, so it just seems weird that it's all over the shop and I'm only trying 1080p - c'mon, according to Steam the majority of PC gamers use a GTX 1060 on a 24-inch 1080p monitor running at 75 Hz. It feels like this game was only made to satisfy 4K-ultra-everything NVIDIA/Intel pairings.
 