PC: AMD 5900X, RTX 3080 OC, 64GB RAM @ 3600, B550 motherboard, 2x 2TB NVMe SSDs, 2TB SSD, 6TB HDD, be quiet! case with an Arctic Freezer 360 on the CPU; temperatures are very good and cool. Running at 4K resolution. Using MSI Afterburner to monitor. Totally new and clean game install (no mods), new NVIDIA 527.56 drivers installed after a clean DDU wipe. V-Sync off. GPU usage 90-100%, CPU usage never more than 20%, with temps never going above 65C on either.
Running the game in DX12 with a mix of RT Ultra and Ultra+ settings and DLSS on Auto or Balanced. Depending on where I am and how long I have been playing (up to 10 minutes or so), I get 30-50 FPS on average. That is OK, but a little disappointing. VRAM is maxed out, and my system RAM starts out at around 11-12 GB of usage. The longer I play, the higher RAM usage climbs. Pre-Next-Gen, usage used to hover around 14 GB (with heavy texture mod usage, 50+ mods). Next-Gen RAM usage keeps rising until it is well over 20 GB and still climbing, and all the while FPS drops. The drops are probably 10-20 FPS overall, with lots of game stuttering.
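For anyone who wants to quantify the climb instead of eyeballing Afterburner, here is a minimal sketch in Python using psutil. It assumes the game's process is named witcher3.exe (adjust if your executable differs) and just logs the game's resident RAM every 30 seconds so you can graph it over a session:

```python
import time
import psutil

PROCESS_NAME = "witcher3.exe"  # assumption: adjust to your game's actual executable name
INTERVAL_SECONDS = 30

def find_game_process(name):
    """Return the first running process whose name matches, or None."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == name:
            return proc
    return None

game = find_game_process(PROCESS_NAME)
if game is None:
    raise SystemExit(f"{PROCESS_NAME} is not running")

start = time.time()
while game.is_running():
    rss_gb = game.memory_info().rss / (1024 ** 3)  # resident RAM in GB
    elapsed_min = (time.time() - start) / 60
    print(f"{elapsed_min:6.1f} min: {rss_gb:5.2f} GB RAM")
    time.sleep(INTERVAL_SECONDS)
```

If the output shows a steady upward slope that never plateaus, that is classic memory-leak behavior rather than normal streaming/caching.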
Save the game in that spot. Quit the game, restart it, and load up that last save in that spot. FPS seems normal again, and I am using maybe 11-12 GB of RAM. Play for a while and RAM usage starts jumping again, FPS drops again, and the game stutters. New save, repeat the process, etc.
Everything looks really good, but something is definitely different (and wrong?) with this version of the game. I can get Cyberpunk 2077 to run at 4K with RT at an almost consistent 55-65 FPS, and its RAM usage is nowhere near where this version of The Witcher goes.
Am I wrong, or is this a case where the game is (wrongly) using so much VRAM that it spills over into the much, much slower system RAM, which drastically slows down the GPU? Or is it perhaps something else, or a mixture of things?
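One way to test the spillover theory: if dedicated VRAM sits pinned at the 3080's 10 GB while the RAM log above keeps growing, the driver is almost certainly paging resources into shared system memory. Here is a minimal sketch using pynvml (NVIDIA's Python NVML bindings, assuming a single-GPU system where GPU 0 is the 3080) to read dedicated VRAM alongside the RAM logger:

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumption: GPU 0 is the RTX 3080

# Dedicated VRAM usage for the whole card, in bytes
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
used_gb = mem.used / (1024 ** 3)
total_gb = mem.total / (1024 ** 3)
print(f"Dedicated VRAM: {used_gb:.2f} / {total_gb:.2f} GB")

# If this stays pinned near the 10 GB limit while process RAM keeps climbing,
# resources are likely spilling into (much slower) shared system memory.
pynvml.nvmlShutdown()
```

Task Manager's per-process "Shared GPU memory" column on the Details tab shows the same thing without any code, for comparison.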
Looking forward to what comes next, hopefully positive.