Low Performance Cyberpunk 2077 PC

Hello, good evening.

Dear users, moderators, and superiors,

Please improve performance for the previous NVIDIA 10-series graphics cards. At the very least, a GTX 1060 6GB should manage High (with some Medium) settings at 1920x1080 and 60-65 FPS, and a GTX 1070 8GB should manage High (with some Ultra, or all High) at 60 FPS. The graphics card is not to blame; it is the game that is poorly optimized.

The performance is extremely low, and the game does not meet its own stated specifications. If that remains the case, I will have to file a claim. This is no insult, far from it; I am only speaking plainly. We always blame the graphics card, and if it is not the graphics card, then the processor. Some players have already looked into this: many users play on a GTX 770 or a 970 4GB at Low, 50 to 60 FPS, at 1920x1080. Any improvement to start with would be appreciated. This does not affect only me, but the rest of the users too. The problem is that the game is not well optimized.

I think we have already discussed improving performance many times, and I am afraid I will have to make a complaint or claim. Again, no threats, but from what I can see you are ignoring us. Let us hope it does not come to claims and complaints; the information you publish is false, and that can hurt you with real losses. You will do what you want, but I do not see it as fair, much less demanding that a user accept that the game "is optimized" and simply needs an NVIDIA 20-series card. No, sir: for the rest of us with GTX 10-series cards, performance on the older graphics cards needs to be improved. Again, the graphics card is not to blame; it is the game that is not optimized.

Greetings, David
 
Well, I can say that with my previous system:
i7-4790K
GTX 980 ti
16 GB G.Skill RAM

My performance was only 45-56 FPS running 1080p at Ultra settings. The most impactful thing on your performance is your resolution, so if you're looking for more FPS without sacrificing quality, that should be your first stop.

The trouble with resolution on modern LED displays is scaling. Older CRT monitors could resize their displays so that lower resolutions still used the entire screen space at a true, 1:1 resolution. Nowadays, we can either run the game in a smaller screen space (meaning in a window, or at fullscreen with black borders around the actual drawspace) or we can scale the image up to fill the full screen space. This, however, introduces pixelation and an overall loss of image quality. If you're like me, and the borders don't bother you, voilà! You're all set. Scaling off: a nice, crisp, 1:1 image with much improved performance...just smaller. If that bothers you, the only real solutions at the moment are things like Nvidia's DLSS. DLSS renders the game at a lower overall resolution, then much more accurately scales the image up on the screen, eliminating aliasing as it goes. The results, however, are not always perfect. It can introduce some graphical oddities at times (shimmering textures, weird translucency effects, etc.)
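To make the upscaling idea concrete, here's a rough sketch of how the internal render resolution relates to the output resolution. The function and the scale-factor table are mine, and the per-axis factors are commonly cited approximations that vary by title and DLSS version:

```python
# Approximate per-axis render-scale factors for DLSS modes.
# Ballpark values only; not guaranteed by any particular game.
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def dlss_render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution the GPU renders before DLSS upscales to the output."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# At 1080p output, Performance mode renders a quarter of the pixels:
print(dlss_render_resolution(1920, 1080, "performance"))  # (960, 540)
```

That quarter-pixel-count render is where the performance gain comes from; the upscaler then spends (much cheaper) work reconstructing the full-resolution image.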

But, can a GTX 1060 run the game at high settings, 1080p, at 60-65 FPS? I doubt that's possible. If you try dropping down to 720p, it may get closer, but I'd still expect that there will be certain scenes that the 1060 just can't power through at High settings. (I saw framerates in the 20s on the 980 ti, at Ultra, but they were very rare.) Keep in mind that the xx60 cards were meant as the "budget" solutions at that point. A GTX 1060 is more or less a GTX 970 with a slight boost and access to the features introduced in the new architecture.
 
Hello, Good morning

The issue of quality settings is valid: if the game defaults to High when I install it, that means it considers my card capable of that quality. If the developers do not want to do their part to improve things, that is grounds for a complaint, and when I say complaint, I will tell you the truth: this claim that the GTX 1060 6GB "is not up to it, does not work," etc. Check the facts before launching the game. That card is perfectly capable of High, and even Ultra; it is the game that is not capable, because it is not optimized. People can tell me the opposite, but they are wrong. And the GTX 980 Ti versus GTX 1060 6GB comparison makes no sense; the differences are minimal, 5 to 10 at most, it is practically the same card. Even if you tell me the 1060 6GB only manages Medium and High, that is still not 60 FPS; I am very clear about that, and nobody is going to get ahead of me, least of all given my experience with games. The same thing happened with GTA IV: it was badly optimized and barely ran on any card until the last patch brought performance improvements, and the difference was very noticeable.
At the very least, the GTX 1060 6GB is a card made for playing at Full HD, 1920x1080. 1280x720 is fine if we are talking about a GTX 770 or a 960, but the GTX 1060 6GB, look it up online, is meant for smooth Full HD at 60 frames per second. Do not tell me otherwise; I can give you a page that specifies the expected quality and FPS for that card, and if needed I will post it in large letters, because otherwise I will have no choice but to file a complaint. And comparing a 2015 product with a 2016 one is fair; I would understand the objection for the 2021 RTX 30 cards, but between the 10 series and today there are three lines, RTX 20, RTX 20 Super, and RTX 30, three years of difference. Some people have a lot of money to spend, but there is a saying: the cheaper option gives you almost the same as the expensive one for less money. It is like a Bugatti: you think that because you have that car it is better and runs faster, but for half the price another car will take me to the same place. Same with graphics cards: if a GTX 980 Ti or RTX 2060 6GB costs 400 or 450 euros, a card at half or a quarter of the price with almost the same specifications will get me the same results in games. What is happening is that it is simply easier not to improve the game. The GTX 1060 6GB should be enough; it should run it.

The GTX 1060 6GB is still a graphics card capable of running games at 1080p without problems.

What's more, look it up online. Without further ado, greetings.

By the way, between the NVIDIA 10 series and the 20 series there are three card lines, and that is not much; the performance difference is minimal apart from DLSS. Saying "it's the 10 series, it's not worth it" is just the easy answer.
 
Oh, goodness, there are a lot of misunderstandings and misconceptions here. Let me first talk directly about the elephant in the room:

Nvidia's marketing (and AMD's, and Intel's, and [InsertManufacturerNameHere]'s) is intentionally misleading -- trying to get consumers to buy and use products that they know are clearly inferior to what the marketing makes them look like. There's no way around this -- consumers simply have to learn the ropes...often the hard way.

Case in point for this situation in particular:
[attached benchmark chart: GTX 1060 vs. GTX 970, ~14% overall]

A 14% overall increase is not shabby, especially considering the price of a 970 at launch being around $400 USD. Thus, for anyone building/buying a system 2-3 years later, a 1060 would be a MUCH better option.

But that window was not only small, it was incredibly budget targeted:
[attached benchmark chart: ~41% overall performance difference]

A 41% overall increase in performance...for a card that was released over a year earlier!? Yup. Welcome to modern marketing. We can use "non-coined terms", like everyday words or numbers to mean...anything we want in marketing. Hence, by adding "ti" to my product's label, I can legally justify that performance difference and cost evaluation. Slimy? Yes. Misleading? Yes. Legally liable? No.

And the responsibility falls squarely back on consumers. Never assume. Never take someone's word for it. Educated purchases...or...burned.

BUT -- !!!

Look at the price.

As wildly disingenuous as the marketing might be, the cost alone should give anyone a second of pause. Those are modern prices. I got my 980 ti in June of 2015 on a good deal...for $800 USD. Normal prices at that point were anywhere from $900-950 USD. So, when the 1060 released, that was a markedly good, budget option: for 2015.

Nvidia's not being slimy so much as they're being smart. They want to sell their new, cheap product; it creates more income the more volume it moves. But they can't afford to sell their top-shelf stuff for peanuts, so that's going to limit sales at the high end. Hence: label the low end temptingly, but let the pricing do the talking. Educated consumers will notice immediately, and for everyone else...well...there's a sucker born every minute. That is shitty and dishonest...but it's legal.

End result, however, is that the GTX 1060, despite its label, is a card developed to be competitive at the mid-range market for 2015. Cyberpunk 2077 is a very intensive title that could bring top-of-the-line hardware to its knees at its release in 2020. No 1060 is going to run the game at a solid 60 FPS at 1080p in 2022. You'll need to drop to 720p or 480p and fiddle with graphical settings to hit the mark.


_______________


Now that we have hardware out of the way, let's talk about optimization. Optimization does not mean, "Make the best settings run flawlessly on my low-end hardware." Optimization means, "Balance settings to achieve a middle ground between visual fidelity and performance." In order to gain performance, I must give up detail. There's no way around this. It's hardware locked. Cyberpunk is not Call of Duty or Apex Legends -- it's not a game focused on delivering "MaXxX FPS Exxxtreeeme". A lot of its performance is bound to the robust RPG mechanics working under the hood. Hence, a solid 60 FPS is going to be tough for even baseline, recommended specs. Not all games are about performance.

A 1060 cannot offer the raw teraflops and post-processing needed to achieve 60 FPS in CP2077 running at High settings at 1080p. It can't be done. You're asking a Cessna 172 to break the sound barrier. It's not happening. That requires a jet engine developed under completely different architecture. The Cessna is a prop plane. Its max speed is something around 190-200 mph. No amount of fuel (or weather manipulation) is going to allow the Cessna to accelerate to 770 mph. Laws of physics and thermodynamics. (If it did hit that speed, it would flip out and be uncontrollable, anyway. That speed requires swept wings, more accurate rudder and aileron control, etc. It requires a totally different architecture.)

What you can do is drop your resolution to 720p or 480p, and then play with various settings to hit the magic balance for your system. You should be able to get 50-60 FPS at a mix of High / Ultra settings 480p, though the biggest things, like draw distance and texture filtering, will probably have to take a hit.


_______________


And lastly, the future. What are the rest of your system specs? Having recently had to replace two (...grrr...) systems in the last couple of months with off-the-shelf rigs (...grrrRRR...:disapprove:...), I'm almost fully back up to speed on where hardware sits right now. (My 980 ti system was perfectly fine when it died. I was able to run Elden Ring -- at launch! -- at full, Ultra settings at 1080p.)

Right now, we're actually...in a really great place for upgrading components. If you have the specs for it, I'd put my money on an RTX 3060 ti:
[attached benchmark chart: RTX 3060 Ti comparison]


Just don't buy indiscriminately. Lots of gougers and frauds out there right now, trying to capitalize on the crash. Be sure you're getting new or responsibly used hardware.

Also, be sure you have the specs to run it! What are the specs for the rest of your rig?
 
Optimization means, "Balance settings to achieve a middle ground between visual fidelity and performance."
From what I understood, optimization means: "Achieve the same results with fewer resources."

It is especially apparent in aviation where aircraft manufacturers constantly are trying to reduce weight, reduce drag, adjust the fan-blades etc., to make the plane consume less and less fuel, while maintaining the same speed.

Concerning Cyberpunk, having done some DLSS testing, it's safe to say that although the game has already received a vast amount of optimization, there is still plenty of room for improvement in resource consumption.

Cyberpunk is a real gas-guzzler; so much so that I'm pretty convinced Europe would never, ever in a million years permit it on the open roads were it a car. Even Nvidia's ultimate flagship graphics card can have severe difficulty keeping the FPS up in the heavily taxing spots on the map.

I had DLSS set to 'Auto', assuming the game would know how to best orchestrate it. It resulted in the game crashing to desktop once every 1 to 2 hours. Only when I set it to 'High performance' did the CTDs finally cease. This is on an RTX 3070 laptop.
 
And lastly, the future. What are the rest of your system specs?

This.

Case in point - before I bought CP2077, I knew I needed a new video card, as the R9 390X I bought to run Fallout 4 wasn't going to cut it. So, I cast about and (finally) found a Radeon 6600XT - but I had to buy it bundled with a B550 motherboard. I accepted this as a decent performance upgrade, even keeping the same CPU and RAM (which are, because of how the memory controllers work nowadays, effectively best bought as a matched set - buy the fastest RAM the CPU can run and, as long as the motherboard is validated for that speed, it's fine).

Anyway, why does this matter?

The 6600XT uses a PCIe 4.0 interface, which means that, on an x16 card, you get 31.508GB/s of bandwidth. However, had I plugged it into my old B450 board, it would have been forced to run at that board's PCIe 3.0 (or 3.1, I don't recall and it doesn't matter) speed of 15.754GB/s. So, by not upgrading my motherboard, I would effectively have been cutting off its potential performance by limiting the crucial CPU/memory-to-video-card data path, which has massive implications for things like getting textures to the card so that they can be rendered onto objects, and so forth.
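Those two bandwidth figures fall straight out of the per-lane link rate and the 128b/130b line encoding that PCIe 3.0 and 4.0 use. A quick sketch of the arithmetic (function name is mine):

```python
def pcie_bandwidth_gbs(rate_gts: float, lanes: int = 16) -> float:
    """Theoretical one-direction PCIe bandwidth in GB/s.

    PCIe 3.0 and 4.0 use 128b/130b encoding: every 130 transferred bits
    carry 128 payload bits. Divide by 8 to convert bits to bytes.
    """
    return rate_gts * (128 / 130) / 8 * lanes

print(round(pcie_bandwidth_gbs(16.0), 3))  # PCIe 4.0 x16 -> 31.508
print(round(pcie_bandwidth_gbs(8.0), 3))   # PCIe 3.0 x16 -> 15.754
```

The same formula also shows why an x8 slot at a given generation gives exactly half the x16 figure.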

So, yeah, I bought the bundle, it works great, and eventually I'll upgrade to a later generation Ryzen CPU and faster RAM - probably pegging that to the maximum that the board can take.
 
Maybe it's just me (I have very, very little knowledge), but it seems weird to me to take graphics settings as a "rule" for performance...
The more time passes, the more hardware evolves, and the more graphics evolve too (more effects, higher-quality textures, more detailed animations, and so on). So the "high" quality setting of a few years ago has nothing to do with the current "high" quality setting, which will have nothing to do with the ones coming in a few years...
Just an example (probably false^^)

2015 games | 2018 games | 2021 games
low        | -          | -
medium     | low        | -
high       | medium     | low
very high  | high       | medium
-          | very high  | high
-          | -          | very high
I'm maybe (probably) wrong, but in my opinion current "low" settings in 2021 are the equivalent of high ones in 2015 :)
 
I'm maybe (probably) wrong, but in my opinion current "low" settings in 2021 are the equivalent of high ones in 2015 :)
You are not wrong. As with many things, performance is a moving target.

When I was a much younger man, we bought a computer for the family. It was an 80286 running at 8MHz with 640k of RAM, a 10MB hard drive, keyboard, mouse, printer, monitor, and so forth. It was $2800 USD (equivalent to about $7200 today) and was considered reasonably high end.

Now, on the one hand, $2800 gets you quite a decent setup today, and $7200 approaches "the sky is the limit" (I could blow $2800 just on video card and monitor, with $7200... yeah, that'd be the best computer I've ever seen outside of an enterprise setting).

But, on the other hand, I just checked Amazon, and I can get a Samsung Chromebook 4 with display, keyboard, 64GB eMMC drive, 4GB RAM, WiFi, and an N4000 Celeron CPU, which runs at 2.60GHz for $150 - and that blows my 286 out of the water in every conceivable metric.

As another example, as I mentioned above, I bought an R9 390X for Fallout 4, and a Radeon 6600XT for CP 2077. The street price for both are about the same ($340 vs. $380, respectively, which, when you account for inflation, is "about the same"), but the latter is benchmarked at 84% better. Guess what? It runs CP2077 on high just fine (at 1080P), and runs FO 4 with everything cranked up as high as it goes in 1080P and just screams... because FO4 was released 5 years before CP 2077, and that's like, what, 2 generations of video cards?
 
The 1060 6GB was launched in 2016. Cyberpunk was launched at the end of 2020, two generations of graphics cards later.

I mean, that's the end of the discussion, isn't it? You can't seriously expect a developer to cripple the "high" settings so that people with four year old cards can feel good?
 
From what I understood, optimization means: "Achieve the same results with fewer resources."
Nope!

Optimization means exactly what I've stated. You cannot create something from nothing. Something that is terribly designed, like a hot air balloon using 10mm-thick fabric, might achieve better flight characteristics by switching to 3mm-thick fabric instead, but that's still taking from one area and adding to another.

Normally, for PC software, what we're dealing with is hardware limitations. So, for "Medium" graphics, can I get better performance? Well, yes! I can remove smaller details like railings, radio towers, and window lighting from distant objects, and that makes the overall performance faster. It's hardly noticeable, letting me add more near-field details, and I've increased the performance. That is optimization.
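As a toy illustration of that trade, here's a hypothetical level-of-detail picker. The distance thresholds and tier names are made up for the example, not taken from any real engine:

```python
# Hypothetical LOD picker: the farther away an object is, the more of its
# small details (railings, radio towers, window lighting) get dropped,
# trading distant detail for performance.
def lod_for_distance(meters: float) -> str:
    if meters < 50:
        return "full"        # every mesh and light source rendered
    if meters < 200:
        return "reduced"     # small fixtures culled
    return "silhouette"      # base shape and baked lighting only

print([lod_for_distance(d) for d in (30, 120, 500)])
# ['full', 'reduced', 'silhouette']
```

The cost saved on the "silhouette" tier is exactly the budget that can be spent on near-field detail, which is the trade described above.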

There may be ways of improving things at the code level that allow for fewer calculations (ones that don't really matter that much in practice), and then using that now-available computing power to increase performance instead. BUT, that is still taking from processing accuracy and adding to performance output.

The meaning of "optimization" is exactly what I've written, regardless of what others may believe it to mean. End result: if something is a "resource", it is not unlimited (or at least, not available in unlimited supply). Hence, if I use a resource for A, it cannot simultaneously be used for B.

Maybe it's just me (I have very, very little knowledge), but it seems weird to me to take graphics settings as a "rule" for performance...
Yes -- exactly!

Whether something is "Ultra", "Medium", or "Low" is entirely relative to the hardware of the time it was released. So, we don't want to put much weight on what a setting is "called".

What we want to focus on is getting good performance at a detail level we're happy with. This will still be hardware-limited, but a lot of people get lost in the numbers instead of playing the game.

For example, during 2016, once I had built my "new" system and got The Witcher 3 set up on it, my gaming buddies that saw it often gawked. They couldn't believe how smoothly it ran. It was like butter. And it was running at Maximum settings -- across the board (except for Hairworks). When I explained that it was running on a 60 Hz display at a locked 48 FPS, they often didn't believe me and asked me to show them. They couldn't believe that, "Less is More".


You are not wrong. As with many things, performance is a moving target.
Yuppers!

I've always built my own systems whenever I could. The rig I put together in 2015 was based on Falcon Northwest specs. I ditched the liquid cooling (because, frankly, I can't be bothered taking care of it), used a different case (obviously), and went with G.Skill RAM instead of Corsair because it was actually in stock at the store nearby. Aside from that, everything was the same.

The Falcon-NW Mach V was around $8,400. I put together almost the exact same system for $2,100. It just takes time and research to land the parts. The thing ran flawlessly for 7 years of constant use until the motherboard finally fused a CPU pin to its socket and caused a short. (I clearly remember the tiny "tink" when I tried to remove the CPU. It was an amazing system. [I think I'm officially mourning it. It still had gaming life in it, damn it!])
 
The 1060 6GB was launched in 2016. Cyberpunk was launched at the end of 2020, two generations of graphics cards later.

I mean, that's the end of the discussion, isn't it? You can't seriously expect a developer to cripple the "high" settings so that people with four year old cards can feel good?
Not really! The 1060 is a budget card, but it's no slouch! There's a happy medium to be found. It just requires the right settings at the right resolution. Even though it probably won't run at a solid 60 FPS...ever...that doesn't mean it can't run smoothly while looking really good.
 
The 6600XT uses a PCIe 4.0 interface, which means that, on an x16 card, you get 31.508GB/s of bandwidth.
As far as I know, very few games run into bottlenecks due to PCIe speeds -- as long as you're seeing around 100% utilization, that is. I'm running PCIe 3.0 on my 3090 and it's working fine (I might lose a few FPS at most). I've heard some games get laggy at 3.0 x8, but that's half the speed of x16. Don't get me wrong, 4.0 is better, but it's rare to see those speeds actually get used.

FPS generally depends on so many factors that it's hard to make a general statement. CPU, memory, and GPU all affect it in different ways depending on resolution and settings. This game is a proper hog when it comes to resources. Without DLSS/FSR it's very hard to get decent FPS unless you lower settings.

 
FPS generally depends on so many factors that it's hard to make a general statement. This game is a proper hog when it comes to resources; without DLSS/FSR it's very hard to get decent FPS unless you lower settings.

In such a situation, I always handle it the same way.

Tweak against the lower end of performance. (On average; there will always be a few scenes in any game here and there that chug a bit. I ignore those.) What I pay attention to are the regular dips I see doing things around the game. For example, if, walking around Night City, my FPS constantly fluctuates between 80 and 45 (as it did), I set a frame limit a LOT closer to 45 than to 80. While 80 FPS may be possible in emptier scenes, it's not sustainable. When FPS fluctuates that much, it's very distracting to the eye -- even though 45 FPS is perfectly playable.

So, I lock my FPS at 56. (Multiples of 8, a few frames above or below a refresh threshold, usually work best. So: if I'm aiming for 60-ish FPS, I'll lock at 56 or 64. If I'm shooting for 72 FPS, I'll lock at 64 or 80. If I'm going for 120 FPS, I'll lock at 112 or 128. Etc.) What this does is provide overhead for the video card that allows it to maintain good frame timing at pretty much all times. You'll likely see that low end rise a little, since the processing power is now there: fewer dropped frames. So those dips to 44-45 FPS will now likely be more like dips to 45-48 FPS. Plus, since the swing up to the high end is now only to 56 FPS (and not 80 FPS), the whole game starts to feel consistently smooth.
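That multiples-of-8 rule of thumb is easy to sketch (the helper name is mine):

```python
def frame_lock_candidates(target_fps: int, step: int = 8) -> tuple[int, int]:
    """Multiples of `step` just below and just above a target framerate."""
    below = (target_fps - 1) // step * step   # largest multiple below target
    above = (target_fps // step + 1) * step   # smallest multiple above target
    return below, above

for target in (60, 72, 120):
    print(target, frame_lock_candidates(target))
# 60 (56, 64)
# 72 (64, 80)
# 120 (112, 128)
```

Picking the lower candidate leaves the card headroom for consistent frame timing; picking the higher one trades some of that headroom for raw FPS.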

This will always be different for each game on a given rig. But the thing to keep in mind is that "buttery smooth" gameplay is the result of consistent FPS -- not high FPS. Obviously, being able to produce both is ideal, but if that's not possible due to hardware limitations or a really demanding game, then getting rid of the huge FPS swings will immediately feel noticeably better.
 
That's an approach I share as well. But in a time when it's often all about maximum FPS, that seems to have gone a bit out of fashion :).
 
This will always be different for each game on a given rig. But the thing to keep in mind is that "buttery smooth" gameplay is the result of consistent FPS -- not high FPS. Obviously, being able to produce both is ideal, but if that's not possible due to hardware limitations or a really demanding game, then getting rid of the huge FPS swings will immediately feel noticeably better.
Buttery smooth for me is both high and constant, but I get what you're saying. The jump from 60 to 100 is quite the shock at first in games that I can run like that, and latency is probably my arch nemesis now :) I refuse to lower my res, though, since I'm spoiled by 4K.
 
Buttery smooth for me is both high and constant, but I get what you're saying. The jump from 60 to 100 is quite the shock at first in games that I can run like that, and latency is probably my arch nemesis now :) I refuse to lower my res, though, since I'm spoiled by 4K.
Geeze, man -- if you're able to run the game at any sort of playable framerates at 4K, I'd gather up your blessings, put them in a box, and remain really, really quiet about "performance issues". (The Furies are everywhere! And they are always listening. You should see what they can do to a graphics card when they find it...)
 
That's an approach I share as well. But in a time when it's often all about maximum FPS, that seems to have gone a bit out of fashion :).
There was never a time when this wasn't the case. Even back in the 1990s, you could find me tweaking parts and adding fans to the case to crank out another 5-10 FPS to get "blazing" 60 FPS gameplay. Most games back then had trouble maintaining a steady 30, but I was a "power user"!

I eventually gave that up and just went with a nice balance. Took too much time away from gaming. (Ironically, now that hardware is so powerful, it's normally the games that can't take advantage of all the processing power. That started right around the 2010 point. Made my decade.)
 
Geeze, man -- if you're able to run the game at any sort of playable framerates at 4K, I'd gather up your blessings, put them in a box, and remain really, really quiet about "performance issues". (The Furies are everywhere! And they are always listening. You should see what they can do to a graphics card when they find it...)
For me it's decent; I'm pretty much trying to help others. Shame it doesn't run better, but I'm not complaining. Lucky I've got DLSS :D
 
For me it's decent; I'm pretty much trying to help others. Shame it doesn't run better, but I'm not complaining. Lucky I've got DLSS :D
True dat. I am a definite fan of DLSS, now that I can see it in action. AND the low-latency ultra mode. Very possible to get really smooth play, now, even if the FPS are still wimpy double-digits. (I mostly lock the frames around 72. I'm too far removed from this 120+ FPS generation to even be able to tell when I'm playing.)
 
True dat. I am a definite fan of DLSS, now that I can see it in action. AND the low-latency ultra mode. Very possible to get really smooth play, now, even if the FPS are still wimpy double-digits. (I mostly lock the frames around 72. I'm too far removed from this 120+ FPS generation to even be able to tell when I'm playing.)
Hehe, I'm happy if I can get a constant 60 in CP2077 (I'm close, but in some spots it can drop to 45-50). Performance DLSS at 4K is pretty good, but I've heard it gives you less at lower resolutions, since the render resolution drops under 1080p and you just become more CPU-bound. My limited testing pretty much shows that with maxed settings I get around the same FPS at 1080p without DLSS as at 4K with DLSS Performance, so it seems pretty close to the limit. Anything lower just leaves me CPU-bound, at around 70% usage on the card.

Smoothness is kind of weird in that sense: it's very hard to get in some games and easy as cake in others. Smooth FPS and low latency are very nice once you get to feel them, and if it can be 144 FPS too, it's like butter (at least with a 144Hz screen). It's best to limit, though, if you get big fluctuations in max FPS.
 