GTX 980ti 1440p FPS


Hi, I'm planning to build a desktop that can handle this game well at 1440p on Ultra, and I'm looking at the GTX 980 Ti.

Can anyone advise me on the FPS you're getting with a GTX 980 Ti at 1440p?

Thanks
 
On the lowest 4K option, AA completely off, Hairworks off, all other settings on Ultra, I get ~35-45 FPS in most areas. Highest 4K option gives me ~10-30 FPS -- very choppy, overall.

This is on a:

Core i7-4790K, 4.00 GHz
EVGA GTX 980 ti, 6GB VRAM
16 GB GSkill DDR3 Gaming RAM, 2333 MHz
Samsung Evo SSD
Windows 7 x64

If you're looking to run something in 4K, I would really recommend another game. TW3 is playable at 1440p, but it's very taxing on the system; performance can bog down suddenly in detail-heavy areas. I stick with 1080p myself: smooth 60 FPS everywhere with every detail maxed and increased shadow detail through tweaks, with only minor stuttering that has nothing to do with the graphics load.

If you want to run this game smoothly in 4K, I recommend a top-of-the-line Falcon Northwest Mach V system. (Should only set you back about $9,000 - $12,000...)
 
Sorry but I'm afraid I don't understand.

I thought 1440p is not 4K? Isn't 4K supposed to be 3840 x 2160?

I'm thinking about 2560 x 1440. Can a 980 Ti handle it at Ultra with 60 FPS? Or do I need 980 Ti SLI for that?
 
At 1440p, I believe the 980 Ti will sit around the 60 fps mark: sometimes above 60 fps and sometimes below it in the more taxing areas. I recommend getting two of them if you want the best possible graphics, but if you don't mind tinkering with the settings, or if you have a G-Sync screen, then a single one should be enough.
 
I was originally planning to get two Gigabyte G1 Gaming GTX 980 Tis. But seeing that 4K is not really possible even with those, I guess I'll aim for 1440p instead.

For the monitor, I'll definitely be getting a G-Sync one. But actually, I'm not really sure how much of a difference that'll make. Mind explaining it?
 
I have two GTX 980s in SLI and I'm running a smooth 60 fps at 2880x1620, with some dips to the 50s in the heaviest scenes, but you really don't notice them. This is everything on Ultra + tweaks, except Hairworks and AA off. (AA drops around 10 fps now after the patch that changed it, and you don't really need it at higher resolutions.) At 1440p I can run it with Hairworks on as well at 60 fps, but I don't really care for it.

So getting two Tis will set you up for good, and even one will get you good performance at 1440p, but don't expect it to run at 60 fps without dips and sacrifices.
 
Yes... technically, 1440p (QHD) isn't true 4K/UltraHD. But it's splitting hairs. The end result is that you're beginning to compress pixels to almost microscopic levels of detail.

Basically, if you try to run the game over 1920x1080, you will get FPS lag in many places on a 980 ti. It will be playable, though! You can always lower some of the detail settings to High/Medium until you achieve smooth FPS. SLI won't have much of an effect, and it's somewhat problematic with TW3. I don't recommend SLI, overall -- it's always been hit-or-miss technology. A newer Titan model would likely be able to pull off 1440p more smoothly, but that's a lot of extra money for not much additional horsepower.
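To put that "over 1920x1080" cost in concrete numbers, here's a quick back-of-the-envelope comparison. (Rough model only: GPU load scales roughly linearly with pixels rendered per frame, ignoring CPU, VRAM, and bandwidth bottlenecks.)

```python
# Relative GPU load by resolution, using raw pixel count as a rough proxy.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K UHD": (3840, 2160),
}

base = 1920 * 1080  # 1080p as the baseline
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, {pixels / base:.2f}x the load of 1080p")
```

So 1440p pushes roughly 1.78x the pixels of 1080p, and 4K a full 4x, which lines up with the FPS drops people are reporting in this thread.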

The 980 ti is a wickedly powerful card. I couldn't be happier with mine. (I really recommend the EVGA brand, too. Runs nice and cool even when it starts getting a workout.) My point is that TW3 taxes most high-end hardware and presses mid-range hardware to the limits. It'll be a generation or so before mainstream computers can run TW3 fluidly at 4K(ish) resolutions. In comparison, I can run Planetside 2 maxed at 7680x4320 at pretty playable FPS. Stutters badly when there's lots of smoke up close, but that's the power of a 980 ti.
 
With the latest patches on my 780 Ti I get 40-50 fps at 2560x1600 (16:10, how old-school :p)... everything Ultra, no Hairworks, the cutscene lighting mod, high-quality ambient occlusion, and SweetFX with the expensive sharpening algorithm. You should be good with a 980 Ti.

It's an important point, though: there are some compulsory .ini and control panel tweaks required to get a smooth FPS...
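For anyone hunting for those tweaks: on PC they usually live in user.settings (under Documents\The Witcher 3) plus the Nvidia Control Panel. The entries below are illustrative only; key names and accepted values have changed between patches, so double-check against your own file before editing anything.

```ini
; Documents\The Witcher 3\user.settings -- illustrative example, not a
; definitive list; verify key names against your own (patched) file.
[Rendering]
GrassDensity=2400        ; foliage density; lowering it helps FPS in forests
TextureMemoryBudget=800  ; VRAM budget (MB) for texture streaming
```

Back up the file first, since the game rewrites it whenever you change settings in-game.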
 
This conversation is now taking me back. Does anyone remember Outcast? Voxel-based graphics with a wildly open world, mostly non-linear progression, fully-voiced dialogue, an OST by an actual live orchestra... YEARS ahead of its time.

I had just graduated from university, got a job, and bought a new PC -- a custom-built Pentium III monster with frickin' 16 MB of RAM and an Nvidia Riva TNT2. Sucker set me back over $2,000. Had to save until almost December to afford it. (My girlfriend at the time thought I was, seriously, the biggest geek on the planet. After spending 2 grand on myself like that, it also wound up being a rather expensive Christmas to compensate...)

But when I finally got it home, I could play Outcast maxed out at 512x384.

That was nothing compared to running X-Wing: Alliance with all effects on at 800x600!
 
I was still running my Matrox Millennium/Voodoo 2 SLI setup back then, with the cute external passthrough cable. I was such a fanboy back then; my first Nvidia card was the 7800 GTX, and I haven't looked back since.

Pfft, who needs 32-bit color anyway? Heheheh, those were the days. Sure, our phones have more GPU power than those things by a long shot, but every new game and every new card was guaranteed to blow away what came before. Then consoles happened. Still, where we're at currently is quite impressive in its own right. I'm pretty sure Witcher 3's level of detail matched only dreams from back then... there's a certain sense of dreams coming true when walking around Velen. It's gotten good enough to not leave you wanting much.

Remember when Quake 3 introduced Bézier curves and good shadows?
 
I'm playing it with an Asus GTX 980 Ti Strix at 2560x1440, with Ultra settings, Hairworks, and all options enabled. The game runs great at 50+ fps, and will probably be even better if CDPR patches it to use DirectX 12.
 
Heh-heh-heh...! High-five. David Puddy me. Come on.

Here's another one: Mechwarrior 2 gets a standalone 3Dfx patch. I upgraded and booted it up for the first time like... :eek: ...sooo...smoooth...!

Thanks for this post, because it made me boot the game back up in 4K and fiddle some more. My memory was slightly off on 2560x1440 performance. I do get relatively smooth gameplay -- definitely over 50 FPS on average. The great thing is that you can kill AA completely, which does make everything look more crisp. In graphics-heavy areas, though (like the bridge to Oxenfurt), it will bog down noticeably. (Not running any OC here, by the way.)

Jacking things up towards 4x settings under the Nvidia Control Panel will destroy playable FPS, but it does look so very nice. I'm quite excited to see where this 4K stuff goes in the future. Some older games run extremely well and look incredible. (I feel like I can reach into the screen and pick up the characters in Kingdoms of Amalur.)

Last note: I'm forced to use the DSR settings to achieve this on my rig (23-inch monitor!), so it's not true UltraHD/4K output. That may be costing me some FPS.
 
4K for PC is 3840x2160. Other 4K sources are actually 4096x2160. If you can't maintain at least 30 FPS at 4K with a GTX 980 Ti, then something is not right, because I'm on a single overclocked EVGA GTX 980 Ti SC+ ACX and I constantly get 34-37 fps, so I just locked it at 30 FPS. Max settings in game except for Hairworks; AA is set at 4x instead of 8x. Nvidia high-quality AF settings, plus ReShade with SMAA injected.

Specs:
3930K @ 4.6 GHz
EVGA 980 Ti SC+ ACX, custom BIOS, 1.23 V, 1490 MHz core / 8002 MHz VRAM
 
I'm running a 980 Ti at 1440p, and with everything on Ultra minus Hairworks the game sometimes drops to 54-56 fps. That might not seem like a big difference, but the drop in smoothness is noticeable, especially with v-sync enabled.

Ultimately I dropped foliage visibility range to "High" and the game stays at 60.
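The smoothness hit at 54-56 fps is bigger than the raw numbers suggest: with double-buffered v-sync on a 60 Hz panel, any frame that misses the ~16.7 ms refresh deadline waits for the next refresh, so the displayed rate can halve. Here's a quick sketch of that math (a simplified model; triple buffering and G-Sync/adaptive sync behave differently):

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # ~16.67 ms per scanout at 60 Hz

def vsync_displayed_fps(render_fps):
    """Effective display rate under double-buffered v-sync:
    each frame is held until the next refresh boundary."""
    frame_ms = 1000 / render_fps
    # Number of whole refresh intervals each frame occupies (rounded up).
    intervals = math.ceil(frame_ms / REFRESH_MS)
    return REFRESH_HZ / intervals

for fps in (60, 56, 45, 30):
    print(f"{fps} fps rendered -> ~{vsync_displayed_fps(fps):.0f} fps displayed")
# 56 fps rendered falls to an effective 30 fps displayed
```

That's why holding a solid 60 (e.g. by dropping foliage range) feels so much smoother than averaging 56, and why a G-Sync monitor, which refreshes when the frame is ready, softens the penalty.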
 
Above 50 fps it runs smoothly and without any noticeable drops. I prefer better graphics with all options on and maxed over a locked 60 fps, but everyone has their own tastes.
The GeForce Experience profile recommends 4x MSAA, but that's for the base 980 Ti; it runs well here.
 
I find MSAA murders my FPS in certain places when running in UltraHD, and for not much return. Hopefully, with these higher resolutions, full-on AA will become a thing of the past. It's always been such a resource hog.
 