Performance Issues / Poor Optimization

Firstly, some tech specs:

i7-6700K
RTX 2070 Super
32 GB RAM

For reasons that will become clear, this is not an issue with my rig, even if the 6700K is starting to look a little long in the tooth.

Ok, so this is a weird one.

I pre-ordered the game and installed it on launch day last week, only to find that it appeared to be stuck at about 10 fps - it launched fine, 200+ fps on the first splash screen, but then plunged to 10 fps on the 'Red Engine' screen and stayed there, regardless of settings. The game was flatly refusing to use my graphics card at anything approaching its full capacity; it seemed locked at 30% usage.

EXCEPT when I talked to Bug during the VR training module. Literally only while I was standing on the platform and talking to her, though - if I ran onto the platform before she appeared, it ran at 10 fps. If I got off the platform while talking to her, it fell back down to 10 fps. If I was standing on the platform while Bug was present, though - 80 fps. The moment training started, back down to 10 fps we went. I could even see this in the monitor: 30% GPU usage until I stepped onto the platform, then up to 80-90%, then straight back down to 30% the moment I left that platform.
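(If anyone wants to log this rather than eyeball an overlay, something quick and dirty along these lines does the job - it just polls nvidia-smi, which ships with the GeForce drivers, once a second. A rough sketch, nothing official:)

```python
# Poll nvidia-smi once a second and print GPU utilisation, VRAM use and temperature.
# Assumes nvidia-smi is on the PATH, which the standard GeForce driver install handles.
import subprocess, time
from datetime import datetime

QUERY = ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used,temperature.gpu",
         "--format=csv,noheader,nounits"]

while True:
    reading = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    print(f"{datetime.now():%H:%M:%S}  util%/MiB/degC: {reading}")
    time.sleep(1)
```

Step on and off the platform while that's running and the jump from ~30% to ~90% (and back) shows up immediately in the log.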

I spent most of the following day messing about with it - updating drivers, tweaking settings in Nvidia etc - and eventually used MS's Windows 10 Update Assistant to force my PC onto 20H2. I didn't think this had a cat in hell's chance of making a difference, but to my surprise, it worked.

I launched the game and was then able to spend the following three or four days playing at 80+ fps on Ultra settings (without ray tracing switched on, though I'm fairly confident I'd still get around 40 with it active). That should put to rest any suggestion that a 6700K is somehow too old or inferior for the game, or that a 2070 won't handle it - it had been handling it just fine, and two of those days were on the latest hotfix, so it's not as though Cyberpunk itself has changed since then.

Then, on Sunday, a few more Windows Updates installed overnight - KB4586876 and KB4580419. These are cumulative updates for .NET.

Monday morning, Cyberpunk is back down to 10 fps. Removing those two Windows updates does not appear to restore my previously decent frame rates.
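(For anyone wanting to try the same rollback, this is roughly what it amounts to - just the stock wmic/wusa commands wrapped in Python, run from an elevated prompt. A sketch, not a guaranteed fix, and as noted above it didn't bring my frame rate back:)

```python
# List installed hotfixes, then uninstall the two .NET cumulative updates by KB number.
# Wraps the standard Windows wmic/wusa tools; needs to be run elevated.
import subprocess

# Confirm which KBs are actually installed first.
subprocess.run(["wmic", "qfe", "get", "HotFixID,InstalledOn"])

# Remove the two updates mentioned above (Windows prompts per update unless /quiet is added).
for kb in ("4586876", "4580419"):
    subprocess.run(["wusa", "/uninstall", f"/kb:{kb}", "/norestart"])
```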

No other game has been negatively affected by these updates - just Cyberpunk. I know I'm not the only person having this problem, either; I've seen a few threads about it on Reddit and Steam, and mentions of what sounds like the same issue here.
You're limited by the number of threads your machine can handle. It sounds like something is running in the background and sucking up resources. When the game is running slowly, are there other things happening in the background? One way to stop those pesky automatic updates from wreaking havoc with your gaming is to flag your internet connection as metered; that will stop Windows 10 from running updates in the background.
 
The thing is, the 2700X does slightly bottleneck a GTX 1080 Ti at 4K. I saw it myself with my 2700X @ 4.1 GHz and an MSI GTX 1080 Ti Gaming X on a Crosshair VII Hero WiFi with 32 GB of DDR4-3200 RAM. It isn't that noticeable, and in older games it wasn't relevant to 4K Ultra at 60 FPS. If you look at the benchmarks, though, you can clearly see that even at 4K an 8700K got slightly higher frame rates out of the GTX 1080 Ti than the 2700X. Like I said, it wasn't enough to be a deal killer at the time.

Compared to Intel's processors of the time, the Ryzen Zen+ parts were always struggling to keep up. The way I look at it: if the processor was already bottlenecking games while it was in production, how will it handle new games now? If it was bottlenecking a GTX 1080 Ti, how will it handle newer, faster graphics cards? How much performance gets left on the table because of the processor? That doesn't mean the computer won't see a performance gain from an upgraded graphics card, just that the gain won't be 100% of what the card could deliver with a slightly faster CPU.

I did mention that I used my Ryzen system to play The Witcher 3 at 4K Ultra 60 FPS - sorry about the confusion. What Zen+ will forever be known for is AMD's continued pressure on Intel, forcing Intel to sink or swim. It's why the 10900K and the 5950X now exist, trading blows and giving us PC gamers an incredible return for the money compared to the stagnant days of incremental 4C/8T gains, when Intel sat on its improvements and released them bit by bit to harvest the money for as long as possible.

It is impossible for me to say whether RDR2 is optimized or not without looking at the code. It could simply be that the effects the engine uses are not fully hardware accelerated. In addition, the RTX 3080 is limited to 10 GB of video memory, and with all the effects in RDR2 you can easily blow past that at 4K, which means performance takes an enormous hit. If Nvidia had not been so skimpy with the RAM, both the 3080 and the 3070 would do better at higher resolutions.
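If you want to check whether you're actually blowing past the 10 GB rather than guessing, nvidia-smi reports it directly. A rough sketch (nothing RDR2-specific, just the driver's own tool wrapped in Python):

```python
# Print how much VRAM is currently in use versus the card's total, via nvidia-smi.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True).stdout.strip()

used_mib, total_mib = (int(x) for x in out.split(","))
print(f"VRAM in use: {used_mib} MiB of {total_mib} MiB ({100 * used_mib / total_mib:.0f}%)")
```

Run it while the game is sitting in your problem scene; if the used figure is pinned at the card's total, you're spilling over into system RAM and performance will tank.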


No worries dude, glad we cleared the air.
 
My PC specs to begin with:
6700K (stock)
1080 Ti (stock, latest Nvidia drivers)
32 GB DDR4-2133 RAM
Win10 2004
1440p
Fixes from the memory-allocation tweak to the hex edit all done

My FPS hovers around 45 on medium settings and takes a nosedive into the 20s when I'm driving or moving the camera around. GPU and CPU are both at 90+% usage, with all cores in use. The weird part is that if I switch to higher or lower settings, the fps stays the same and has the same drops. There are 6700K + 1080 Ti Cyberpunk benchmark videos out there, and they all run the game far better than mine does.

So there must be some underlying issue between my hardware and Cyberpunk, because I can run other games completely fine. MHW, Apex, PUBG, Dota, The Witcher 3 and many more have all run perfectly at high/max settings at 1440p. Any help would be greatly appreciated.

Thank you.
 
My game only uses 10-30% CPU.
Windows reports overall CPU usage as 100% divided across the number of threads the processor can handle. So, for example, with two threads, 50% overall could mean one core is maxed out or both cores are loaded halfway. That figure doesn't mean much for game performance until the game actually uses as many threads as the CPU has; only then will you see the processor pegged while running the game.
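To put numbers on it: on an 8-thread part like a 6700K, one completely pegged core only shows up as about 12.5% in the overall figure. A quick sketch with psutil (pip install psutil) makes the gap between the overall number and the per-core numbers obvious:

```python
# Compare Windows' overall CPU percentage with the per-core breakdown.
# One maxed core on an 8-thread CPU reads as only ~12.5% overall.
import psutil

overall = psutil.cpu_percent(interval=1)                 # averaged across all logical cores
per_core = psutil.cpu_percent(interval=1, percpu=True)   # one value per logical core

print(f"overall:      {overall}%")
print(f"per core:     {per_core}")
print(f"busiest core: {max(per_core)}%")
```

That's why a game that "only uses 10-30% CPU" can still be completely limited by one or two saturated threads.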
What speed are your RAM and drives? Could it be an issue with loading assets on the fly as you move through the environment? If you're at 90% on both GPU and CPU, you're not going to get much more performance out of the system without making a change. I would try going into the Nvidia control panel and lowering some settings, like texture filtering, for Cyberpunk specifically to shave off some of the load.
 

DDR4-2133, and an NVMe SSD. Thanks, I'll try that.
It's just weird how, in the numerous 6700K + 1080 Ti YouTube videos with a GPU/CPU overlay on, their 6700K is usually hovering around 60% even at 1080p. I play at lower settings (albeit at 1440p) but with much higher CPU usage and unplayable fps dips.
 

It's not bottlenecking. Nothing is sucking up resources. Nothing that wasn't running on Saturday is running now and suddenly consuming seven threads, and those other processes don't suddenly decide to stop when I step onto the platform during the VR training and start up again when I step off it. My hardware is clearly more than fine, since the game ran for 30+ hours on max settings with no issues from Thursday onwards. Eight threads has not suddenly become too few overnight, and none of my other games are finding themselves unable to get enough threads to run on, either - just Cyberpunk.

As far as I can tell, the game appears to have simply decided not to use the RTX 2070 and to instead try to render everything on the CPU's terribad built-in Intel graphics, despite having been explicitly told to use the 2070 in the various Windows and Nvidia control panels.
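(If anyone wants to sanity-check that theory on their own machine, Windows will list every display adapter it's currently exposing - if the Intel iGPU doesn't even appear, the game can't be falling back to it. A rough sketch using the stock wmic tool, nothing fancy:)

```python
# List the display adapters Windows currently exposes (stock Win32_VideoController class).
# If only the discrete card appears here, the game cannot be rendering on the iGPU.
import subprocess

result = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "Name,AdapterRAM"],
    capture_output=True, text=True)
print(result.stdout)
```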
 
Is this a desktop or a laptop?
Post automatically merged:

An NVMe drive should be more than adequate, but 2133 is fairly slow by today's standards. If you can't run the RAM any faster, you might be able to tighten the timings to get a little more performance out of it.

I don't know how much more performance you're going to get, though. You might consider a frame-rate cap to get more consistent frame rates; the dips are often worse than running at a lower but steady frame rate.
 
Absolutely shocked. Imagine blaming the hardware, which manages to run every other AAA game released this year perfectly fine.

People like you are why CDPR gets away with pulling this stuff. The 2060 S is still not good enough for the RT minimum. The game is unoptimized garbage; it needed at least another year in the oven.
You don't have to enable RT. :shrug:

I'd recommend turning it off unless you want to use it for screenshots or photo mode. It's just too demanding for lower-end RTX cards like the 2060 S.
 
Is this a desktop or a laptop?

Desktop.

Have just once again confirmed that the VR training thing works - my GPU goes from about 5% usage to 100% as I step onto the platform, and my fps goes from 5 frames per second to 80. My GPU goes from 40C to 58C in a matter of seconds. I can run around up there for as long as I like with proper performance, but if I step back off the platform, it immediately drops back down to 5% GPU use and 5 frames a second.

It is thoroughly bizarre.
 
Hey guys. I hope someone can help me here.
Sorry, I'm sure this has been mentioned a couple of times already, but since I haven't found a good solution for my problem yet (if there even is one), I'm hoping to find some answers here.
In almost every area of the game, my computer manages 50-60 fps most of the time, with occasional drops into the mid 40s. CPU usage sits around 60-70%, GPU usage at 100% as it should be. :D
But in crowded areas like Jig-Jig Street or the market around there, my CPU goes totally crazy. Usage climbs to 98% while GPU usage drops to 60%, and because of that I get frame drops down to 20 fps with some annoying stuttering.

I have an Intel Xeon E3-1231 v3, which is technically the same as an i7-4790, only with lower clock speeds and no overclocking, so I don't think this is a legitimate CPU bottleneck.
Complete hardware specs: CPU: Intel Xeon E3-1231 v3; GPU: Asus Strix GeForce GTX 1060 6GB; 16 GB DDR3-1600; Gigabyte B85M-DS3H motherboard. Cyberpunk is installed on an SSD with 300 GB of space left on the drive.
I tried mods like Performance Overhaul, adjusted the memory pool for my specs, and also did the optimisations from the other thread. I even clean-installed Windows to be absolutely sure. The issue still remains. :(
 
Reduce the crowd density.
Even on my processor, the load jumps up to 70% if I'm moving fast in a car.
 
Hi there. My case:
Specs: i7-9700, 2080 Ti, 32 GB RAM, SSD

Without RTX I can get 60 fps at 1080p 95% of the time. With RTX my CPU goes nuts at 100% load while my GPU sits at 10-15% usage. I've tried tweaking RAM usage in the config file (memory_budgets), but it doesn't make any difference. Should I just wait and pray for a patch?
 
Hey guys, GOOD NEWS for those whose fps is capped by the game.

I finally managed to get more than 30 fps after being stuck at 30 no matter what I did.
Now I get 80+ fps everywhere, and I can easily push it higher by turning RTX off completely.

The fix is quite simple ==> go here: https://www.microsoft.com/en-us/software-download/windows10/ and download the Windows 10 installation media. Launch it and follow all the instructions, including the one saying "upgrade my Windows".

Somehow it fixed the issue for me. I hope it helps some of you.

All credit goes to @jeffb0918, who posted the fix in another thread: https://forums.cdprojektred.com/ind...resolution-or-settings-on-pc.11043734/page-10

Glad I can decently enjoy the game now :D
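If you're not sure which build you're actually on before or after running the media tool (2004 vs 20H2 and so on), it's easy to read straight out of the registry. A small sketch - note that the DisplayVersion value only exists from 20H2 onwards, so it falls back to the older ReleaseId:

```python
# Print the installed Windows 10 feature-update version (e.g. 2004, 20H2) and build number.
import winreg

key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                     r"SOFTWARE\Microsoft\Windows NT\CurrentVersion")
try:
    version = winreg.QueryValueEx(key, "DisplayVersion")[0]   # present on 20H2 and later
except FileNotFoundError:
    version = winreg.QueryValueEx(key, "ReleaseId")[0]        # 2004 and earlier
build = winreg.QueryValueEx(key, "CurrentBuild")[0]
print(f"Windows 10 version {version}, build {build}")
```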
 

You don't have to enable RT. :shrug:

I'd recommend turning it off unless you want to use it for screenshots or photo mode. It's just too demanding for lower-end RTX cards like the 2060 S.

Then they shouldn't lie about it when posting system requirements. Then again this is the same company that hid the fact console performance was broken prior to release, so I shouldn't be surprised.
 
Desktop.

Have just once again confirmed that the VR training thing works - my GPU goes from about 5% usage to 100% as I step onto the platform, and my fps goes from 5 frames per second to 80. My GPU goes from 40C to 58C in a matter of seconds. I can run around up there for as long as I like with proper performance, but if I step back off the platform, it immediately drops back down to 5% GPU use and 5 frames a second.

It is thoroughly bizarre.

I saw a video on YouTube yesterday which might explain the performance a bit: CPU Results

It seems there are some significant lag spikes in the game with lower-end processors. As long as the display isn't plugged into a video output directly on the motherboard, you are not using the processor's integrated graphics. In fact, unless you went into the BIOS to change the setting, most desktop motherboards disable the integrated graphics when they detect a discrete graphics card.
 
Then they shouldn't lie about it when posting system requirements. Then again this is the same company that hid the fact console performance was broken prior to release, so I shouldn't be surprised.
I thought it was obvious that the game won't run on a standard PS4. The minimum PC spec is a GPU that is very similar to the integrated GPU of the PS4 Pro.
 
So CDPR released expanded system requirements shortly before the release. For the RT minimum it states an RTX 2060 GPU.

I have an RTX 2060 Super, a Ryzen 3600XT, and 16 GB RAM. On the RT Medium preset @ 1080p I regularly get less than 30 fps.

In 2020 under 60 fps is poor. Under 30 fps is unacceptable.

Where did these system requirements come from? I understand the game is currently unoptimised/full of bugs, but did they just select hardware that sounded reasonable without any testing whatsoever?

Between the bugs, the performance issues, and the cut content...the only award I would give this game is disappointment of the year. Although I suppose that's still quite an achievement given everything else 2020 has produced.
Post automatically merged:

Is there confirmation anywhere of what the targeted frame rate is?
Are you using DLSS?
 
Hey guys. I hope someone can help me here.
Sorry, I'm sure this has been mentioned a couple of times already, but since I haven't found a good solution for my problem yet (if there even is one), I'm hoping to find some answers here.
In almost every area of the game, my computer manages 50-60 fps most of the time, with occasional drops into the mid 40s. CPU usage sits around 60-70%, GPU usage at 100% as it should be. :D
But in crowded areas like Jig-Jig Street or the market around there, my CPU goes totally crazy. Usage climbs to 98% while GPU usage drops to 60%, and because of that I get frame drops down to 20 fps with some annoying stuttering.

I have an Intel Xeon E3-1231 v3, which is technically the same as an i7-4790, only with lower clock speeds and no overclocking, so I don't think this is a legitimate CPU bottleneck.
Complete hardware specs: CPU: Intel Xeon E3-1231 v3; GPU: Asus Strix GeForce GTX 1060 6GB; 16 GB DDR3-1600; Gigabyte B85M-DS3H motherboard. Cyberpunk is installed on an SSD with 300 GB of space left on the drive.
I tried mods like Performance Overhaul, adjusted the memory pool for my specs, and also did the optimisations from the other thread. I even clean-installed Windows to be absolutely sure. The issue still remains. :(
Crowd density is also set to low. I have literally tried every single option, even updating/reinstalling Windows. In areas like the cherry blossom market and some other places I get a CPU "bottleneck". I think there is some buggy code or asset in the game that is sucking up all the CPU resources... If only I knew some debug console commands to disable certain things in those areas and see where the problem is, that would be amazing.
 