[SOLVED] GPU rendered cutscenes missing @ custom resolutions
I'm using a Samsung DLP with overscan that cannot be turned off on this model (and no firmware update is available from Samsung), so to avoid cropping I have to set the desktop and game resolutions to 1184x658.

First thing I noticed is that the opening sequence (not a movie, but a GPU-rendered cutscene) is missing: the load screen fades to black, then comes back up with the narration/text over the load screen. ESC to the game and it's fine (gameplay is normal, no issues). If I change the game resolution to any standard resolution (1280x720, 1024x768, etc.), the cutscene displays properly, but the image is then subject to overscan or black borders. File/movie-based cutscenes play fine.

System: Windows 7 x64, Phenom 9750, 4GB RAM, Nvidia GTX 260, Nvidia driver version 190.62. The game runs in Vista compatibility mode (otherwise it CTDs often), and witcher.exe is also set to "run as administrator".

Is this issue with nonstandard resolutions limited to Win7? Is there a way to fix it without having to change resolutions at every cutscene to see them?