Should I put all my GPU RAM in the pool?
Yes. For the GPU pool put in all of your VRAM; for PoolCPU use half of your actual RAM (or leave 4 GB for Windows).
I tried those. It didn't improve much for me.
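For context, the PoolCPU / PoolGPU numbers people are editing live in memory_pool_budgets.csv under the game's engine\config folder, if I remember the path right. Here is a minimal Python sketch to print what your copy currently has; the GOG install path below is an assumption (adjust for Steam), and back the file up before changing anything:

```python
# Print the current PoolCPU / PoolGPU budget lines from the game's config.
# Assumption: the pool tweak refers to engine\config\memory_pool_budgets.csv;
# the install path below is the GOG default -- adjust for a Steam install.
from pathlib import Path

cfg = Path(r"C:\Program Files (x86)\GOG Galaxy\Games\Cyberpunk 2077"
           r"\engine\config\memory_pool_budgets.csv")

for line in cfg.read_text().splitlines():
    if "PoolCPU" in line or "PoolGPU" in line:
        print(line)
```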
So CDPR released expanded system requirements shortly before the release. For the RT minimum it states an RTX 2060 GPU.
I have an RTX 2060 Super, Ryzen 3600XT, and 16 GB RAM. On the RT medium preset @ 1080p I regularly get less than 30 fps.
In 2020 under 60 fps is poor. Under 30 fps is unacceptable.
Where did these system requirements come from?
It's sponsored by Nvidia.
So you're deflecting the blame to Nvidia? I guess it's also Microsoft and Sony's fault that the game doesn't run well on base consoles?
Basically yes. Sony and Microsoft could have gone for a PCIe solution in order to make the GPUs upgradeable...
The 2060 Super should be the bare minimum requirement for RT, not the 2060.
With most people showing up with old budget processors that they got because someone told them they were "enough" when they bought them, I'm not surprised that the increased load overwhelms their systems.
Because countless similar threads are popping up, this thread is for all performance and optimization related issues. That way it's easier to keep an overview.
Lol no.
LOL.
Denial is a powerful thing but it doesn't change the truth.
Has there been any progress made on performance besides the AMD fix? I don't have issues till I get into town, which is densely populated. I mean, a 3080 and a 3600X should get better than 45 FPS in town? The biggest problem I see is that GPU utilization is only at like 40%ish when in town. Outside of town or in buildings GPU usage can hit 95% though. What gives?
Bruh, at least try to know what you're talking about before saying that much BS.
The CPU is basically under NO LOAD at 4K. Only a very bad CPU could bottleneck a 1080 Ti at 4K. Now I doubt you know what a bottleneck is, because if you did, you would know that a 1080 Ti at 99% usage at 4K resolution is not being bottlenecked.
Now, see this video I'm linking here:
The 1080 Ti is always at 99% at 4K resolution, while the 2700X is barely getting used (around 30-40 percent). If it were bottlenecking the GPU, the 1080 Ti would be at around 70% and the CPU at 70-80% or something, which is clearly not the case here.
Sorry for this little off-topic response, but I felt like I needed to address this.
Edit: Please keep in mind I am talking about this game's performance estimates, not how the 2700X paired with a GTX 1080 Ti handles Tomb Raider or other older games.
While the CPU has almost no effect at 4K, that doesn't mean there is none. What it means is you are frame rate limited no matter what, so crank the graphics settings up until you max out the graphics card paired with the processor.
I'm going to try not to be insulting because I think you simply misunderstood my previous post. The 2700X/GTX 1080 Ti isn't going to buckle at 4K Ultra because of the 2700X. It's going to buckle because of the GTX 1080 Ti. It's a great card but it is getting old.
That said, as the resolution lowers the bottleneck will shift to the 2700X, which is why at lower resolutions the machine will struggle more and more. It's pretty significant at 1080p and a slight issue at 1440p. Even when the processor was new it was struggling. It's been a few years, the performance bar has risen, and Zen+ is getting long in the tooth. My suggestion, if you want more performance for a reasonable amount of money and have a decent X470 or maybe B450 board, is to update the firmware and upgrade to a Ryzen 3000 series processor.
Your example is showing a bunch of old games. I know that old games play ok. That is not comparable to new games like Cyberpunk. Here is a link going over results for Red Dead Redemption 2. It's another very demanding game. Red Dead Redemption 2 benchmarks.
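If anyone wants to check this on their own machine instead of arguing about it, here is a minimal Python sketch (it assumes an NVIDIA card with nvidia-smi on the PATH and psutil installed) that logs CPU and GPU utilization side by side while you play. If the GPU sits near 99% and the CPU doesn't, you're GPU-bound; if the GPU drops while one or more cores are pegged, the CPU is the limit.

```python
# Log CPU vs GPU utilization once per second to spot which one is the bottleneck.
# Assumes: NVIDIA GPU, nvidia-smi available on PATH, `pip install psutil`.
# Stop it with Ctrl+C.
import subprocess

import psutil


def gpu_util() -> str:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu,temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    util, temp = [x.strip() for x in out.stdout.strip().split(",")]
    return f"GPU {util:>3}% {temp}C"


while True:
    # cpu_percent blocks for ~1 second, so this loop naturally ticks once per second.
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    cpu = f"CPU avg {sum(per_core) / len(per_core):4.1f}% | max core {max(per_core):4.1f}%"
    print(cpu, "|", gpu_util())
```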
This might help people with AMD CPUs
Amd Performance Fix - Mod on Nexus
Hello, I found a very interesting mod on Nexus that is called "Amd Performance Fix". I opened Task Manager while Cyberpunk 2077 was running without the fix and with the fix, and could clearly see that it worked on my CPU (AMD Ryzen 5 3600 6-Core Processor). If you have an AMD CPU don't hesitate to visit...forums.cdprojektred.com
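For the curious: fixes like this usually come down to a small hex edit to Cyberpunk2077.exe that changes the branch checking for AMD CPUs. Below is a sketch of how such a patch could be applied; the PATTERN and REPLACEMENT bytes are placeholders, not the real ones, so take the actual byte sequences from the mod page, and always keep a backup of the original exe.

```python
# Generic byte-pattern patcher, sketched for an AMD SMT-style exe fix.
# PATTERN and REPLACEMENT are PLACEHOLDERS -- copy the actual byte sequences
# from the mod page before using this. The install path is the GOG default.
from pathlib import Path

EXE = Path(r"C:\Program Files (x86)\GOG Galaxy\Games\Cyberpunk 2077"
           r"\bin\x64\Cyberpunk2077.exe")
PATTERN = bytes.fromhex("00 11 22 33")      # placeholder, not the real bytes
REPLACEMENT = bytes.fromhex("00 11 22 44")  # placeholder, not the real bytes

data = EXE.read_bytes()
if data.count(PATTERN) != 1:
    raise SystemExit("Pattern not found exactly once -- wrong bytes or wrong game version.")

EXE.with_name(EXE.name + ".bak").write_bytes(data)   # keep a backup first
EXE.write_bytes(data.replace(PATTERN, REPLACEMENT))  # apply the patch
print("Patched. Check in Task Manager that all cores are now loaded.")
```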
I'm willing to update this, so let me know if you have settings that work out well.
Links:
Thread: Performance Issues
Game Crashes - Support
Contact Technical Support
- Install Cyberpunk 2077 on an SSD (you can find a 128 GB one second hand for $15)
- If you want to play on Ultra:
- skip Ray Tracing for now. A patch might be on its way.
- Set DLSS to OFF, quality, or balanced. Don't use AUTO and don't use one of the performance modes.
- If you set resolution to 1080p or lower, set DLSS to OFF, otherwise it looks blurry.
- Only use the DLSS performance modes for 4K and 8K on big monitors / TVs.
- Make sure Vsync is disabled
- Set the power plan of your PC to High Performance
- Search and open Control panel
- Click Power options
- Set it to High Performance or AMD Ryzen High Performance
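If you'd rather script that than click through Control Panel, here is a tiny sketch (Windows only; SCHEME_MIN is Windows' built-in alias for the High Performance plan):

```python
# Switch the active Windows power plan to High Performance via powercfg.
# SCHEME_MIN is the built-in alias for the High Performance plan.
# Run from an elevated prompt if it refuses to switch.
import subprocess

subprocess.run(["powercfg", "/setactive", "SCHEME_MIN"], check=True)
# Print the now-active scheme so you can confirm it took effect.
print(subprocess.run(["powercfg", "/getactivescheme"],
                     capture_output=True, text=True).stdout.strip())
```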
- Enable the real full-screen mode (This can lower FPS, please compare at medium settings without RayTracing)
- go to C:\Program Files (x86)\GOG Galaxy\Games\Cyberpunk 2077\bin\x64, or
- Open Steam, right-click on Cyberpunk 2077, click Properties
- go to the "Local Files" tab and click browse local files
- right click on Cyberpunk2077.exe, click on properties and Compatibility
- Disable full-screen optimizations
- go to in-game-settings and choose full-screen-mode
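The "Disable fullscreen optimizations" checkbox above can also be set from a script; as far as I can tell it just writes a per-exe compatibility flag to the registry. A sketch is below, assuming the GOG path from earlier (adjust for Steam); the value data is my assumption about what the checkbox writes, so verify it afterwards in the exe's Properties dialog or regedit.

```python
# Set the "Disable fullscreen optimizations" compatibility flag for the game exe.
# Assumption: this mirrors what the Compatibility-tab checkbox writes
# (HKCU ...\AppCompatFlags\Layers, data "~ DISABLEDXMAXIMIZEDWINDOWEDMODE").
# Verify afterwards via the exe's Properties dialog or regedit.
import winreg

exe = (r"C:\Program Files (x86)\GOG Galaxy\Games\Cyberpunk 2077"
       r"\bin\x64\Cyberpunk2077.exe")
key_path = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, key_path, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, exe, 0, winreg.REG_SZ, "~ DISABLEDXMAXIMIZEDWINDOWEDMODE")
```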
- Use MSI Afterburner to make sure your GPU fans are spinning
- enable "Apply at Windows Startup"
- Force fan speed update on each period (this can increase CPU usage) if the fan curve cannot be applied. Make sure to set the curve for both fans, and also activate auto mode for both fans. Keeping your GPU cool is necessary to avoid crashing and throttling.
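For a quick sanity check that the fans actually respond under load, nvidia-smi can report temperature and fan speed (NVIDIA cards only, nvidia-smi on the PATH). A short sketch:

```python
# Poll GPU temperature, fan speed and load every few seconds to confirm
# the fan curve is actually being applied under load (NVIDIA cards only).
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu,fan.speed,utilization.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # e.g. "72, 55 %, 98 %"
    time.sleep(5)
```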
- Optimize RAM timings
- Download DRAM calculator
- Follow Steve's guide
- Go to your UEFI / BIOS and set timings (like in that video, set the command rate to 1, or 1T)
- CPU Priority
- In Windows, after launching Cyberpunk, tab out and open Task Manager.
- Go to details tab, right click on Cyberpunk2077.exe, set priority to high
- Windows won't save this setting, so you need to do this every time you start playing. (It doesn't do much, though, and it can also cause crashing, but it might give you some extra performance.)
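Since the setting doesn't stick, a small script can do it for you after each launch. A minimal sketch, assuming psutil is installed (`pip install psutil`); run it once the game is up:

```python
# Bump Cyberpunk2077.exe to HIGH priority after the game has started.
# Requires: pip install psutil. Run after launching the game, each session,
# because Windows resets the priority once the process exits.
import psutil

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "Cyberpunk2077.exe":
        proc.nice(psutil.HIGH_PRIORITY_CLASS)  # Windows-only priority class
        print(f"Set HIGH priority on PID {proc.pid}")
        break
else:
    print("Cyberpunk2077.exe not found -- start the game first.")
```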
- Optimize CPU performance (Ryzen)
- Go to the BIOS; there you have two ways of overclocking:
- Manually OC your CPU by setting a multiplier and adjusting the voltage. Precision Boost Overdrive needs to be turned OFF.
- Alternatively, enable Precision Boost Overdrive and SMT without setting a multiplier. The CPU will automatically overclock.
- Adjust the CPU fan curve to keep temps below 85°C. Also check the chipset and system fan curves.
- Check for driver updates:
Chipset; GPU; Windows; Monitor
- Enable G-Sync or FreeSync and optimize other GPU settings
- Go to your GPU control panel
- (NVIDIA) click Manage 3D settings and go to Program Settings, select or add Cyberpunk2077.exe
- Monitor Technology: G-sync
- Preferred refresh rate: Highest available
- Power management mode: Prefer maximum performance
- Low latency Mode: On (Don't set it to Ultra)
- Texture filtering: Quality, performance or high performance
- Make sure Vsync is set to application controlled
I'm sorry if I misunderstood you, but you literally said: "A 2700X bottlenecked a GTX 1080 Ti at 4K". How was I supposed to get that you were talking about the 1080 Ti bottlenecking itself? Which is quite obvious considering it's indeed an old GPU. On top of that you added "Ryzen Zen+ was never that great at gaming", which is true at lower resolutions but not at 4K. All those factors, combined with the fact that you never mentioned you were talking specifically about Cyberpunk 2077, made me a bit confused.
However, the games in the video are pretty old I guess, though some of them are fairly recent, and I don't think that changes much regarding bottlenecking. As long as games don't use more CPU cores, 4K will always be a GPU specialty, and the CPU won't do much.
Besides, I was only talking about 4K. 1440p and 1080p are a totally different story, as bottlenecking is more noticeable at those resolutions. The video you linked me on RDR2 is the perfect example of that, as some beastly GPUs are getting bottlenecked at 1080p. But it doesn't show any 4K graphs where the CPU is bottlenecking the GPU. Why? Because even in RDR2, CPU bottlenecking doesn't exist at 4K.
Regardless of that disagreement about bottlenecking at 4K resolution, I agree with the other points you made. But my position on the matter won't change unless we get some games that use more cores effectively.
EDIT: I'm among those who actually think RDR2 is not a very well optimized game. The game is beautiful, don't get me wrong, but not reaching 60 fps at 4K with max settings on a 3080 tells me it's resource-heavy for not that much to show for it. Death Stranding is the kind of game that's godly optimized. In my opinion, RDR2 is not.