Witcher 3 NG Ray Tracing - GPU PCI-E Bus / VRAM Overload as the Cause of FPS Drops/Stuttering - Fixed!

Solution to the memory leak issue:
In the GPU control panel, under 3D settings, select Witcher3.exe (from <game installation directory>/bin/x64_dx12) and set Anisotropic Filtering to x2, x4, x8 or x16 (for best quality go with x8 or x16).
Texture Filtering - Negative LOD Bias has to be set to Clamp.
(From 04:39 in the video I show where to change it.)

Now I get playable performance at almost the highest game settings on my mobile RTX 3060 GPU with only 6GB of VRAM.

For better image quality (less blur), set Sharpen to High: https://imgsli.com/MTUyOTEz/6/3
Even with AF at x16 the memory leak is still fixed for me, and with Sharpen enabled even AF x2 looks very similar to no AF overall, or even better, especially at x8/x16.
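If you want to check on your own system whether the fix keeps VRAM away from the danger zone, here is a minimal monitoring sketch, not part of the fix itself, just a way to log the same numbers GPU-Z shows. It assumes an NVIDIA card and the nvidia-ml-py (pynvml) Python package; note that NVML only reports dedicated VRAM, so keep watching shared memory in GPU-Z or Task Manager.

```python
# Minimal VRAM / PCI-E logger (assumes an NVIDIA GPU + the pynvml package).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)  # dedicated VRAM only
        used_pct = 100.0 * mem.used / mem.total
        # PCIe throughput counters are reported in KB/s over a short sampling window
        tx = pynvml.nvmlDeviceGetPcieThroughput(gpu, pynvml.NVML_PCIE_UTIL_TX_BYTES)
        rx = pynvml.nvmlDeviceGetPcieThroughput(gpu, pynvml.NVML_PCIE_UTIL_RX_BYTES)
        print(f"VRAM {mem.used / 2**20:6.0f}/{mem.total / 2**20:.0f} MiB ({used_pct:4.1f}%) | "
              f"PCIe TX {tx / 1024:6.1f} MB/s  RX {rx / 1024:6.1f} MB/s")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```

Run it in a second window while playing and compare a session with and without the AF override; if the fix works for you, used VRAM should stop creeping toward the card's limit.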
 
Just play the game in DX11. Problem solved. I've never experienced the memory leak with my RTX 3080 12GB, and I've put 50 hours into the game since hotfix 2 released, but when I want a locked 120 fps at 1440p I just play in DX11; it's heavily multithreaded, and I'm sure there will be sick new mods for it.
On a budget PC this game runs badly now, with stuttering and poor FPS. The old version, before the update, ran very well.
 
A fix... I don't know. Enabling AF reduces texture resolution a lot. Sharpen + AF is still not enough.
 
That is just dead wrong... AF sharpens textures and makes them look better. I find this interesting because I didn't seem to have any memory problems with Ray Tracing off, and, as in many games, I had Anisotropic Filtering set to 16x in the Nvidia Control Panel, because modern graphics cards can do 16x AF (really it should be called 16-tap AF) with less than 1% loss in performance.

Anisotropic filtering eliminates aliasing effects, but improves on other techniques by reducing blur and preserving detail at extreme viewing angles. Anisotropic filtering is a texture filtering technique that can noticeably improve the quality of textures that are seen at an angle. This is generally most noticeable on flat wall- or floor-textures that stretch off into the distance. With no texture filtering, a texture seen at an angle will become noticeably blurry and lose a lot of detail as the angle gets steeper.

BTW the RED Engine has Anisotropic Filtering built in, and at least with Ultra settings it is automatically set to:

MaxTextureAnizotropy=16

I wonder if there is a problem with the built-in AF and overriding it in the Nvidia Control Panel fixes it.
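For reference, here is a small sketch that reads the engine-side value, so you can confirm what the game itself is requesting before overriding it in the driver. The file location (Documents\The Witcher 3\user.settings) and the exact key layout are assumptions based on the usual PC config layout, so check your own install.

```python
# Sketch: print the engine's anisotropy setting from user.settings.
# Path and key name are assumptions -- verify against your own install.
from pathlib import Path

settings = Path.home() / "Documents" / "The Witcher 3" / "user.settings"

for line in settings.read_text(encoding="utf-8", errors="ignore").splitlines():
    line = line.strip()
    if line.startswith("MaxTextureAnizotropy"):
        print(line)  # expected form: MaxTextureAnizotropy=16
```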
 
Yes, you're right, it does seem to be a built-in AF issue, but when I tried to disable it by setting MaxTextureAnizotropy = 0, the in-game texture setting changed itself to Low. That's why the best option is setting AF to x8 or x16 in the GPU control panel on every GPU with less than 16GB of VRAM (or sometimes even 20GB), and of course the Optimized RT mod is a must-have.
 
I can confirm that setting AF manually in the NVIDIA Control Panel gives you better performance. Moreover, at the same 16x AF the texture filtering seems different: the built-in AF tends to be sharper but shimmers a lot, while NVIDIA's AF looks better overall and performs better, although it is not as sharp as the built-in AF.

Edit: It seems Texture Filtering - Negative LOD Bias: Clamp is the reason the game looks softer and less shimmery.
 
If you change from Clamp to Allow, the AF fix for the memory leak will stop working. Clamp gives a softer image with much less shimmering, similar to the AF built into the game engine, but yeah, it's not that sharp. If you set it to Allow, it looks great at first, while the image is standing still, but once you move the textures start shimmering a lot and it looks much worse.
Thanks for this find, though. Normally, when you change AF in the Nvidia Control Panel, Texture Filtering - Negative LOD Bias defaults to Clamp, but someone could have changed it to Allow somehow, and then the fix would not work for them. That's why I added this info to my guide.
 
That's VRAM overflow. Ray tracing needs much more VRAM. 4GB of VRAM is definitely not sufficient, and I find 8GB is occasionally not sufficient either. But I think 8GB should be fine if the game were optimized well. Why? Because I often find there is sufficient dedicated VRAM while, at the same time, shared VRAM usage is high (see the picture below).
[attached screenshot: 捕获.PNG, showing dedicated and shared GPU memory usage]

We can see a total of 6.8GB of VRAM in use (5.6GB dedicated + 1.3GB shared). But the RTX 4060 has 8GB of dedicated VRAM, so shared VRAM shouldn't be necessary. Shared VRAM is used when dedicated VRAM overflows. If my guess is right, even when sufficient dedicated VRAM becomes available later, the data in shared VRAM is not copied back into dedicated VRAM and released. So the effect of a VRAM overflow is irreversible once it happens, unless you restart the program. There should be a recovery mechanism for temporary VRAM overflow. I know that's not simple. But there is a simple alternative: a "fresh game reload". Unlike a normal reload, a fresh game reload would first clear all dedicated and shared VRAM and then continue as a normal reload. At least then we wouldn't need to restart the game, which would save a lot of time. I hope CDPR could add a "fresh game reload" option in the next patch.
I find that setting "Number of Background Characters" to Low greatly alleviates the effect of VRAM overflow. Maybe VRAM for background characters often resides in shared VRAM? For the RTX 4060, updating the driver to the latest version helps too.
My specs: 32GB RAM, RTX 4060 (8GB VRAM) @ PCIe 3.0 x8, B450 chipset, Ryzen 5600X
Graphics settings: RT Ultra, 1080p, Number of Background Characters on Low, DLSS on Quality
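If anyone wants to reproduce the dedicated-vs-shared numbers from that screenshot outside of Task Manager, the Windows performance counters Task Manager reads can be sampled from a script. A rough sketch, assuming Windows 10/11 and that the "GPU Adapter Memory" counter set is exposed on your system (run "typeperf -q" in a command prompt to list what is actually available):

```python
# Sketch: sample the Windows "GPU Adapter Memory" counters (values are in bytes).
# Counter set / instance names are assumptions; list them first with: typeperf -q
import subprocess

counters = [
    r"\GPU Adapter Memory(*)\Dedicated Usage",
    r"\GPU Adapter Memory(*)\Shared Usage",
]
# -sc 1 takes a single sample; output is CSV with a header row and one data row
out = subprocess.run(["typeperf", *counters, "-sc", "1"],
                     capture_output=True, text=True, check=True).stdout
print(out)
```

If shared usage keeps growing while dedicated usage is still below the card's capacity, that matches the behaviour described above.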
 
You can call it memory overflow, but you can also describe it as a memory leak and still be correct, because everyone describes it as an issue where, after some gameplay, FPS dramatically drops and won't recover until the game is restarted. You can observe it in this video:
(You can also check your PCI-E bus usage in the GPU-Z Sensors tab.) This is exactly what happens in The Witcher 3 NG with RT GI turned on. When the GPU reaches around 95% VRAM usage, it triggers the transfer of texture data between GPU VRAM and the much slower system RAM and virtual memory (the game process's shared memory in the page file). The GPU tries to reduce VRAM usage, causing PCI-E bus load to grow. The load depends on how much texture data is transferred (from the VRAM overflow) and how fast your PCI-E bandwidth is; your RAM and SSD speed also matter if your PCI-E bus is fast enough.

In places where this leak occurs, the transferred texture data stored in RAM/virtual memory waits to be sent back to the GPU when needed. In Cyberpunk, the leak may trigger after a longer game session or immediately after loading the next save game (when you've reached near maximum VRAM usage before loading). Even if VRAM usage is lower and GPU space is freed up, the game process remembers it saved those textures in RAM/virtual memory. The GPU retrieves them from there instead of loading them into VRAM again.

This is why there's a massive FPS drop when this situation occurs, especially if the transferred data is too large, leading to PCI-E bus overload above roughly 40% usage; the higher the usage, the worse the performance. If the transferred data is smaller, the PCI-E bus load is smaller and the transfer between VRAM and RAM/virtual memory is faster, avoiding a significant FPS drop or keeping it minor. Using an ini memory leak mod, or changing Anisotropic Filtering to x16 with simple optimization set to Off and Negative LOD Bias at Clamp, may help because all of those settings reduce the texture data size at the cost of worse texture quality.
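To put rough numbers on why the size of the spilled data matters so much: theoretical per-direction PCI-E bandwidth is about 0.985 GB/s per lane for gen 3 and about 1.97 GB/s per lane for gen 4, so pulling even a few hundred MB of textures back over the bus costs tens of milliseconds on a narrow link. A back-of-the-envelope sketch; the 512 MB figure is just an illustrative assumption, not a measured value:

```python
# Back-of-the-envelope: how long does it take to move spilled texture data
# back over the PCI-E bus? (Theoretical per-direction bandwidth, no overhead.)
links = {
    "PCIe 3.0 x8":  7.88e9,   # bytes/s, e.g. an RTX 4060 on a PCIe 3.0 board
    "PCIe 4.0 x8":  15.75e9,
    "PCIe 4.0 x16": 31.50e9,
}
spilled_bytes = 512e6  # assumed example: 512 MB of textures parked in system RAM

for name, bandwidth in links.items():
    ms = 1000 * spilled_bytes / bandwidth
    print(f"{name:12s}: ~{ms:5.1f} ms to move 512 MB (a 60 fps frame budget is ~16.7 ms)")
```

Even under ideal conditions the transfer takes many frames on PCIe 3.0 x8, which is why the drop feels like a hard stutter rather than a small dip.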

Your RTX 4060 8GB is limited to PCI-E 3.0 x8 transfer speed, which leads to worse performance when this VRAM overflow happens. Changing the motherboard to a B550 with PCI-E 4.0 support will improve performance in this scenario, but still not as much as changing to a GPU with more VRAM; the 16GB RTX 4060 Ti version would perform much better, without the overflow problems. This is not a recommendation, just a statement: if I were to recommend a GPU, it would be better to choose one with not only more VRAM but also the most efficiency in its price segment, and the RTX 4060 Ti 16GB is very poorly priced.
 
Nvidia already has good VRAM management and also a suite of tools for VRAM management that covers textures, meshes and Ray Tracing.

The problem is that these tools aren't being used properly by CDPR to optimize graphics memory. Cyberpunk is proof of that, because there they do use these tools properly, delivering both better graphics and better performance.

If they haven't fixed this problem 13 months later, it's likely it's never going to be fixed, and that's a big disappointment.
 
You are right.
Recently, I changed my RTX 4060 for an RTX 4060 Ti 16GB, so there's no need to worry about VRAM overflow anymore. Then I found the real cause of the FPS drop when ray tracing is on. It's not a VRAM leak, and it may not have anything to do with VRAM overflow at all; it's simply the game's weird VRAM management. Even when there is no VRAM overflow, this game insists on using a large amount of shared VRAM. Over time, more and more shared VRAM is used. When shared VRAM usage exceeds 2GB, FPS drops severely; even with an RTX 4090, FPS can be halved. By contrast, Cyberpunk only uses a few hundred MB of shared VRAM.
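If anyone wants to check that ~2GB shared-memory threshold on their own machine, the per-process counterpart of the counters from the earlier sketch can be sampled the same way. A rough sketch; the "GPU Process Memory" counter set name is an assumption about what your Windows build exposes:

```python
# Sketch: dump per-process shared GPU memory (bytes). Instance names encode the PID
# (e.g. "pid_1234_luid_..."), so look for the line belonging to witcher3.exe's PID.
import subprocess

out = subprocess.run(
    ["typeperf", r"\GPU Process Memory(*)\Shared Usage", "-sc", "1"],
    capture_output=True, text=True, check=True).stdout
print(out)  # compare the game's instance against the ~2 GB mark described above
```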
 
Well, bad news then. It is true that you can enjoy Night City with ray tracing at 1440p on 8 GB of VRAM. But sadly, the DLC area, Dogtown, is problematic. I cannot even play at 1080p DLSS Quality without VRAM overflow (and performance tanks as the game starts using more PCI-E bandwidth and shared VRAM, just like The Witcher 3 does). So either they didn't use these tools for the DLC, or there's a general trend where developers have stopped caring about 8 GB VRAM. Look at Horizon Forbidden West, another game where 8 GB of VRAM buckled. The number of such games is increasing.

It has come to the point where I cannot enjoy gaming with 8 GB of VRAM. I can't even get hyped or excited about news of upcoming games. All I think is "oh, is this going to stutter" or "is this going to have horrible textures on 8 GB VRAM". I knew there would be compromises, but I hoped games would scale better. Cyberpunk gave me that hope, with its scope and visuals and being able to add ray tracing on top of it on 8 GB of VRAM. And it was the same game and engine that took that hope away indefinitely. I won't even say Dogtown is unoptimized, because the texture quality, texture variety and geometric density in the DLC are insane. I can kind of understand why it struggles so much. In the end, I had to disable ray tracing there, despite the GPU looking capable of it. You get 60 FPS but stutters and heavy stalls all the time. You reduce textures to Medium and, oh, it gets fixed, but then textures everywhere look awful. Then you understand why Night City worked decently on 8 GB cards: minimal geometric density, buildings with low-res textures, low-res road and pavement textures everywhere. Sure, it still looks good with ray tracing if you don't look closely, but it is clear that it was held back purely for 8-10 GB VRAM cards.
 