Back after 1-1/2 years. GeForce Experience wants to change my settings from High to Low

I just came back to Cyberpunk 1.6 after 1-1/2 years (had to reinstall the game from Steam) and my graphics settings are way low. I checked, and GeForce Experience is setting the resolution to 1360x768 and my settings to Low. I don't understand.

Back in April 2021, GeForce Experience set the game to 1920x1080 with high settings, synced to my 144Hz screen. I loved the game and it ran beautifully back then. So now I've manually adjusted the game back up to mostly high, but I'm getting around 30fps. I guess that's OK for the moment, but does anyone know what's going on with GeForce Experience?

Note: I'm running on a high-end gaming laptop (Lenovo Legion) with two internal GPUs - an integrated Intel GPU and an RTX 2080 Max-Q. I thought maybe GeForce Experience was making recommendations based on the Intel GPU, but when running the game and checking Task Manager, it's using the Nvidia card. I know the card is a few years old, but it's top of its generation, and it used to run Cyberpunk at nearly max settings. Enabling DLSS 2.0 back then was even better.
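
For anyone who wants a second opinion beyond Task Manager, here's a minimal Python sketch (assuming the nvidia-ml-py and psutil packages are installed) that asks the Nvidia driver directly which processes are rendering on the RTX card:

# Minimal sketch: list processes currently rendering on the Nvidia GPU,
# to confirm the game is on the RTX card rather than the Intel iGPU.
# Assumes: pip install nvidia-ml-py psutil
import pynvml
import psutil

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first Nvidia GPU
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()
    print(f"GPU 0: {name}")
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        try:
            pname = psutil.Process(proc.pid).name()
        except psutil.Error:  # process may have exited or be protected
            pname = "?"
        mem = proc.usedGpuMemory / 1024**2 if proc.usedGpuMemory else 0
        print(f"  pid {proc.pid:>6}  {pname:<28}  {mem:7.0f} MB")
finally:
    pynvml.nvmlShutdown()

If Cyberpunk2077.exe shows up in that list while the game is running, it's definitely on the Nvidia GPU.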
 
It's not GeForce Experience; it's the game taxing our systems a lot more. Back on 1.5 I even had to disable RT if I wanted to reach 20 FPS on a 3070 with max settings. Now on 1.6, with RT, max settings, and over 355 mods, it hovers around 23.

I believe the game's size was approaching 100GB, but the REDs managed to shrink that down a bit. But yeah, the game continues to be one of the most resource-hungry IPs on any system.
 

Nothing wrong with GeForce Experience. Nvidia probably adjusted their game profile in accordance with the changes that have been made to the game.
 
Thanks for the replies. I'm learning that the requirements have gone up, and I've seen other posts about dropped fps after patches 1.5 and 1.6, but...

Two years ago I could run at max resolution (1920x1080 for me) with ray tracing, DLSS, and high (not ultra) settings, and it ran at about 60fps.
Today, if I manually force those settings, it runs at about 30fps. OK, I can live with that or reduce some settings. Fine.

But GeForce Experience wants to go further and set the game to a tiny resolution (1360x768) and LOW settings. The game looks ridiculous. I can't even read the in-game text, it's so jaggy. Something is wrong with GeForce Experience. I can bypass it and manually set everything, but it seems Nvidia has dropped the ball... It's telling me to turn off all ray tracing and even DLSS, which doesn't make any sense.
 

Well, don't forget that you're on a laptop. In other words, your 2080 isn't performing the same way a desktop 2080 does. It's closer to a desktop 2060, which isn't exactly performance-inclined.

With that said, doesn't GeForce Experience have a setting for what you want out of it - performance or graphical fidelity? I'm at work and can't confirm. Is yours set to aim for performance? In that case, it would make perfect sense for it to lower your settings that badly.
 
Well, I've manually set everything to high: max resolution, DLSS on quality (not performance), anisotropy at 16, VSync at 144.

Then I turned off the following (most of which I don't want anyway): dynamic resolution scaling, film grain, HDR mode, and chromatic aberration. Motion blur is set to low (not off).

All ray tracing is on except reflections, and ray-traced lighting is set to medium.

With this, the game's running at 40-50fps and looks beautiful.

GeForce Experience is set to "performance" and can only be manually adjusted towards "quality" via a slider (though it insists "optimal" is all the way over at "performance"). At "performance" it turns everything to "low", while at "quality" it turns everything to "high, ultra & psycho". I'd have to pick somewhere in the middle manually. I've never had to do this with other games. Seems broken to me.

I'm better off just setting the game settings myself.
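
If you go manual, it may be worth backing up the game's settings file so a stray GeForce Experience "Optimize" click can't silently overwrite it. A minimal Python sketch, assuming the settings live in UserSettings.json under %LocalAppData%\CD Projekt Red\Cyberpunk 2077 (verify the path on your machine):

# Minimal sketch: make a timestamped backup of Cyberpunk 2077's settings file.
# The path is an assumption based on the usual install; adjust if yours differs.
import os
import shutil
from datetime import datetime
from pathlib import Path

settings = (Path(os.environ["LOCALAPPDATA"])
            / "CD Projekt Red" / "Cyberpunk 2077" / "UserSettings.json")
if settings.exists():
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    backup = settings.with_name(f"UserSettings.{stamp}.json.bak")
    shutil.copy2(settings, backup)  # copy2 keeps file timestamps
    print(f"Backed up to {backup}")
else:
    print(f"Settings file not found at {settings}")

Restoring is just copying the backup over UserSettings.json while the game is closed.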
 

Happy to hear you found a healthy place for your settings. 30 FPS is "meh" and anything below is unplayable.

Honestly, I always found it better to set things up myself than let GeForce Experience do its thing. It's always far more conservative. I assume Nvidia does that to ensure people don't come back to them saying "your preset is unplayable!!" Whereas finding out you can go higher than they indicated is a nice surprise.

Even at 2K resolution on a 3080 Ti, GFE turns down a few things if I let it do its thing. Meanwhile, if I turn everything up to max, I get a healthy 80+ FPS. I don't think there is anything wrong with GFE; I think it's meant to be conservative.
 

You may be in the same boat as me. My computer is so old that GeForce Experience cannot adequately appraise it and thus sets things to low. (At least, that's my interpretation.)

I can still play the game fine on my old i7, which beats many other CPUs on the market today. Gotta set the settings manually, though.
 
I first ran this game on the promised Win 7 specs when the game came out (thank you CD RED for keeping your promise, as I was a WAY early purchaser based on the Win 7 specs) with a GTX 970, and the game ran great, looked really good, and had ZERO bugs. Now that I've "upgraded" to Win 10 (all other hardware is the same), the game has (or rather "had", as I have not seen many after the last update) BAD visual and physics bugs.

If it were not for the fact that I want to play the upcoming DLC, I would reload the day-one version on one of my Win 7 PCs and just play the game that way. It was great! In the most updated version (even though it is without the bugs), the frame rate is not nearly as good as it was under Win 7.
 
Yeah, it's weird, because when Cyberpunk came out, my RTX 2080 was top-tier, bleeding edge for laptops, and GeForce Experience (GE) worked fine with CP2077. The RTX 3000 series was just releasing. Now we have the barely released RTX 4000 series, and for some reason GE says "Your system doesn't meet the requirements for optimal settings in the game or application. The optimal settings may deliver unplayable framerates."

GE is also wrong, because the borked GE "optimal" settings deliver HIGH framerates, but the graphics quality and resolution are so poor I can hardly read the in-game text messages. Nvidia GE is definitely broken for CP2077 right now. I'm just glad the game runs fine with manual settings.

Get it together, Nvidia!
 
I generally don't use GE for settings either, just ShadowPlay. It can be good with a monster rig, since you can set all the settings at once instead of one at a time, but that's what presets are for :D Tbh, these drivers with all their supporting software are starting to become very much bloatware... Asus's crappy Armoury Crate is even worse of a resource hog, with bugs up the wazoo too.
 
GeForce Experience recommends a lower-than-default resolution, and a few other lowered settings, for me too. Personally, I just go into the game, set things how I like, run the game from GoG Galaxy (or directly from the game exe), and do not run it through GeForce Experience. Basically, I ignore what GeForce Experience says.

The game runs perfectly well for me at my chosen settings. Most of the time my frame rate is nice and smooth (vsynced to 60 FPS) and the game looks good. Only very occasionally (like, very rarely), in complicated areas of the city, and only for a few seconds, do I notice frame rate drops; even then, my FPS still sticks around 30-35. Almost all of the time, the game runs much better than that. It's actually pretty much only in one very traffic-heavy street with lots of building detail that I sometimes notice a slight drop.

I tried running at the GeForce Experience recommended settings once, and the game looked awful. I have no idea why it suggests the settings that it does when my own settings run the game very nicely, thank you, and it looks so much better than when I use the settings that GeForce thinks are better.

tl;dr: My advice is to launch the game via Steam/GoG/anything other than GFE, find the settings that you like yourself, and use them.
 