Forums - CD PROJEKT RED
The Witcher 3: Wild Hunt - PC System Requirements are here!

Page 67 of 128
mavowar

Senior user
#1,321
Mar 29, 2015
echohaxorelite said:
Hey guys again :)

I just read a few previous posts about Gamestar and the GTX 980, and also the GTX 970. I have a GTX 970, but only just recently heard about the VRAM issue with 500 MB of the VRAM not being "unlocked".
So I'm not really wanting to return it, but do you think The Witcher 3 at 1080p will push past 3.5 GB of VRAM with everything on ultra? All I'm looking for is a steady 40+ fps on ultra.
You will be fine. That whole 3.5 GB VRAM thing was blown out of proportion.
If you play at 1080p or 2K, you will have no problems 99 percent of the time. It is when you hit 4K with ultra textures that you may run into an issue. Also, I've noticed Nvidia has done some really good driver optimizations for the 970 (even though they claimed they wouldn't or couldn't); I have seen improvement in various games and benchmarks. The card still has 4 GB of VRAM, but with a caveat: that last 0.5 GB is seven times slower. So what Nvidia has done is use that pool of memory for data that does not require fast turnaround. I ran 10 benchmarks/games and only had issues at 4K, but these cards are not meant for 4K. Never were, never will be. It really takes two 980s, or a Titan X or two, if you want awesome 4K performance.

Now, all that said, Nvidia did mess up: the 970's ROP count is lower than advertised (56 vs. the 980's 64), and the L2 cache was wrong as well (1792 KB vs. the 980's 2048 KB). What is that all worth? Roughly a 10 to 15 percent performance loss. OC that baby and beat stock 980s!

Bottom line: for the price, they are great cards for the here and now. They will not age as well as a 980 or 290X/290, but you will still get 2 to 4 years of playing top games at high settings, IMHO. I got two. I love 'em, even if they are "handicapped" ;-) They do just fine on my Asus ROG Swift, even in 3D gaming... I am looking at you, Witcher 3!
 
Last edited: Mar 29, 2015
cyberpunkforever

Forum veteran
#1,322
Mar 29, 2015
mavowar said:
You will be fine. That whole 3.5 GB VRAM thing was blown out of proportion.
If you play at 1080p or 2K, you will have no problems 99 percent of the time. It is when you hit 4K with ultra textures that you may run into an issue. Also, I've noticed Nvidia has done some really good driver optimizations for the 970 (even though they claimed they wouldn't or couldn't); I have seen improvement in various games and benchmarks. The card still has 4 GB of VRAM, but with a caveat: that last 0.5 GB is seven times slower. So what Nvidia has done is use that pool of memory for data that does not require fast turnaround. I ran 10 benchmarks/games and only had issues at 4K, but these cards are not meant for 4K. Never were, never will be. It really takes two 980s, or a Titan X or two, if you want awesome 4K performance.

Now, all that said, Nvidia did mess up: the 970's ROP count is lower than advertised (56 vs. the 980's 64), and the L2 cache was wrong as well (1792 KB vs. the 980's 2048 KB). What is that all worth? Roughly a 10 to 15 percent performance loss. OC that baby and beat stock 980s!

Bottom line: for the price, they are great cards for the here and now. They will not age as well as a 980 or 290X/290, but you will still get 2 to 4 years of playing top games at high settings, IMHO. I got two. I love 'em, even if they are "handicapped" ;-) They do just fine on my Asus ROG Swift, even in 3D gaming... I am looking at you, Witcher 3!
And what about 1080p with everything on ultra: PhysX, HairWorks, ultra textures, ubersampling, etc.? Can the GTX 970 still get a stable 60 fps? That would be awesome.
 
dragonbird

Ex-moderator
#1,323
Mar 29, 2015
cyberpunkforever said:
And what about 1080p with everything on ultra: PhysX, HairWorks, ultra textures, ubersampling, etc.? Can the GTX 970 still get a stable 60 fps? That would be awesome.
Ubersampling? Definitely not. For the rest, we know that ultra settings without ubersampling are possible at a high framerate on the 980 that was used in one particular demo. We can extrapolate from that fact that it could also be possible on a 970.

But extending that to a blunt statement like "everything on ultra" (except ubersampling), with a specific named framerate (stable at 60 fps) and resolution (1080p), is moving into dangerous waters for someone making a purchase decision. There are a lot of variables in any PC build.
 
Last edited: Mar 29, 2015
Reactions: mavowar
cyberpunkforever

Forum veteran
#1,324
Mar 29, 2015
Ubersampling requires too much processing power, so the 980 would be necessary for it.
 
dragonbird

Ex-moderator
#1,325
Mar 29, 2015
cyberpunkforever said:
the 980 would be necessary for it
That would still be making a dangerous assumption. Ubersampling comes close to halving the framerate, and there's been no actual data on performance with ubersampling on in TW3.

Earlier on, we highlighted that the purpose of this thread is to help people make informed decisions on what they may need to buy, and warned people off overstating what was needed, which might make some people spend more than necessary. The same applies in the other direction: overstating performance expectations may leave people with too high an expectation and end in disappointment.

So it would probably be best to stick to known facts and reasonable extrapolations. If you want to speculate beyond that, there are other threads where it would be more appropriate. There's a good thread on ubersampling here, which explains more about why it's not really a realistic expectation (and also why it may not be that important :) ):
http://forums.cdprojektred.com/threads/28763-Übersampling-in-The-Witcher-3-petition
 
Last edited: Mar 29, 2015
sidspyker

Ex-moderator
#1,326
Mar 29, 2015
cyberpunkforever said:
ubersampling etc can the gtx 970 still get 60 stable fps?
Not in a million years; not even a Titan X can manage constant 4K60 in 'really' demanding games. (2x2 ubersampling is the same as playing at 4K resolution.)
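The 4K equivalence mentioned here is exact arithmetic, easy to sanity-check: 2x2 ubersampling doubles each axis, so the internal render is 4x the output pixel count, and 4x 1080p happens to equal the 4K pixel count precisely. A quick sketch (just arithmetic on the resolutions named in the thread; nothing here is measured from the game):

```python
# 2x2 ubersampling renders each axis at twice the output resolution,
# so the internal workload is 4x the output pixel count.
def pixels(width, height, ss_factor=1):
    """Pixels actually rendered per frame with NxN supersampling."""
    return (width * ss_factor) * (height * ss_factor)

native_1080p = pixels(1920, 1080)    # 2,073,600 pixels shown on screen
uber_1080p = pixels(1920, 1080, 2)   # 8,294,400 pixels rendered internally
native_4k = pixels(3840, 2160)       # 8,294,400 pixels -- identical workload

print(uber_1080p == native_4k)  # True
```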
 
Reactions: mecha_fish
Sardukhar

Moderator
#1,327
Mar 29, 2015
Well, here's what my Ubersampling experience was with Witcher 2:

"Yes, and ubersampling is 2x2 supersampling with a few added effects (high-quality AF), making it a 12800x3200 render that is then downsampled to 6400x1600."

So, with this in mind, I trotted off to test whether it was pretty much an exact ratio. I only hard-crashed my system once, probably because I tried to force 2560x1600 while Eyefinity was set up. Don't do that. I am -so- just getting one big monitor next time... anyway.

I was running most settings at max or ultra, with motion blur and cutscene depth of field off, because ugh. Vsync was off.

2560 x 1600, no Ubersampling: 85-135 FPS, depending on if I was looking at a fire or not. (4 million pixels)

2560, Ubersampling On: 50-70 FPS (supposed to be 5120 x 3200, 16+ million pixels)

6400 x 1600, no Ubersampling: 55-60 FPS (10+ million pixels)

6400, Ubersampling On: 25-30 FPS (supposed to be 12800 x 3200, 40.9 million pixels)

I learned a few things.

1) That downsampling must be really effective, because rendering 41 million pixels at between 1/3 and 1/4 the frame rate of rendering 4 million pixels is pretty good.

2) Fire Bad. Rendering the flames in my sample area was quite pricey. I guess I already knew this, but a 20 FPS hit at 2560 with US was ow.

3) 6400 x 1600 with or without US is really close in framerate min-max range. Almost like Vsync was on, but it wasn't. Weird.

4) Ubersampling at 6400 IS playable! I already knew that, but it's still nice. Depending on your definition of playable, of course. Not a crazy-busy scene, but not bad: NPCs, a view across the valley, a fire going. Cursed flames. In a totally open area with lots of NPCs, this is why my GPU hits 97°C even with watercooling.

So, framerate-wise, it's not as bad as if the resolution were really quadrupled (a true 12800 x 3200 render), but it's still pretty tough.
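For reference, the pixel counts behind the four test cases above can be tabulated in a few lines (the resolutions and fps ranges are the ones quoted in the post; only the megapixel math is computed here):

```python
# (label, render width, render height, supersample factor, observed fps range)
cases = [
    ("2560 x 1600",      2560, 1600, 1, (85, 135)),
    ("2560 x 1600 + US", 2560, 1600, 2, (50, 70)),
    ("6400 x 1600",      6400, 1600, 1, (55, 60)),
    ("6400 x 1600 + US", 6400, 1600, 2, (25, 30)),
]

def megapixels(width, height, ss_factor):
    """Millions of pixels rendered per frame with NxN supersampling."""
    return (width * ss_factor) * (height * ss_factor) / 1e6

for name, w, h, ss, (lo, hi) in cases:
    print(f"{name:18s} {megapixels(w, h, ss):5.1f} MP  {lo}-{hi} fps")
```

This reproduces the figures in the post: 4.1 MP, 16.4 MP, 10.2 MP and 40.96 MP, with the heaviest run still holding roughly a quarter to a third of the lightest run's framerate, which is the point made in (1).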
 
prince_of_nothing

Forum veteran
#1,328
Mar 29, 2015
mavowar said:
You will be fine. That whole 3.5 GB VRAM thing was blown out of proportion.
If you play at 1080p or 2K, you will have no problems 99 percent of the time. It is when you hit 4K with ultra textures that you may run into an issue. Also, I've noticed Nvidia has done some really good driver optimizations for the 970 (even though they claimed they wouldn't or couldn't); I have seen improvement in various games and benchmarks. The card still has 4 GB of VRAM, but with a caveat: that last 0.5 GB is seven times slower. So what Nvidia has done is use that pool of memory for data that does not require fast turnaround. I ran 10 benchmarks/games and only had issues at 4K, but these cards are not meant for 4K. Never were, never will be. It really takes two 980s, or a Titan X or two, if you want awesome 4K performance.

Now, all that said, Nvidia did mess up: the 970's ROP count is lower than advertised (56 vs. the 980's 64), and the L2 cache was wrong as well (1792 KB vs. the 980's 2048 KB). What is that all worth? Roughly a 10 to 15 percent performance loss. OC that baby and beat stock 980s!
I agree with this for the most part, but I think it's important to note that the GTX 970 also has a bandwidth reduction compared to the 980 due to that disabled ROP/L2 unit. The GTX 970 really has a 224-bit bus with 196 GB/s of bandwidth for the 3.5 GB block, while the 512 MB block gets 28 GB/s on a 32-bit bus.

This shouldn't be an issue at 1080p, but at 1440p and above, or in any bandwidth-limited circumstance, it will rear its head. That's when the GTX 980 will have a greater lead over the 970, from the average 18% up to 30% and beyond.
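These segment figures line up with simple bus arithmetic, assuming the GTX 970's published 7 Gbps effective GDDR5 data rate (that data rate is the one number below not taken from the post):

```python
# Peak bandwidth = (bus width in bytes) x (effective data rate per pin).
def bandwidth_gbs(bus_width_bits, data_rate_gbps=7.0):
    """Peak memory bandwidth in GB/s for a given bus width."""
    return bus_width_bits / 8 * data_rate_gbps

fast_segment = bandwidth_gbs(224)  # 3.5 GB block: 196.0 GB/s
slow_segment = bandwidth_gbs(32)   # 0.5 GB block: 28.0 GB/s
full_bus = bandwidth_gbs(256)      # the advertised 256-bit figure: 224.0 GB/s

# The ratio also matches the "seven times slower" claim earlier in the thread.
print(fast_segment, slow_segment, fast_segment / slow_segment)  # 196.0 28.0 7.0
```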
 
Ashwind123

Rookie
#1,329
Mar 29, 2015
Valaskjalf1414 said:
What I want to know is: since I don't have some supercomputer, and simply run a GTX 750 Ti, 8 GB RAM and an i5-3470... how on earth is this game going to run on any next-gen consoles when, comparatively speaking, their specs aren't that much better than the PC I run? Is it simply due to poor optimization that the PC specs are so much more demanding than the consoles'?
The fact is that the PS4 and XB1 are more optimized for gaming and have less overhead. They will also most likely reduce draw distance, detail and physics on the console version, and maybe use different AA, etc. An i5-3470 + GTX 750 Ti is about on par with the PS4 and XB1, but the PC has more to do: anti-virus and other background processes, etc.

Finally, minimum/recommended specs are often bloated, and they need to be, because nobody knows what sort of crazy processes people run on their PCs :p


edit:
What I want to know is how TW3 will use the CPU. Why the recommended i7? 99% of the games I have played and tested don't really make use of the CPU, or a more powerful CPU doesn't yield a significant/noticeable performance gain.
 
Last edited: Mar 29, 2015
Giovanni1983

Forum veteran
#1,330
Mar 29, 2015
As posted above, at 1080p or even 1440p you will be fine with the 970. You would only have problems at 4K or with ubersampling on. Don't worry about it; it's a freaking powerful card, currently second best after the 980. There is no way The Witcher will hit the 3.5 GB mark. My guess is that it won't even be close.

I had a 290X and swapped to the 970, and I am still amazed by it, because it gives up to 8-10 more fps in most games, which is a really huge difference.

Edit: As for the CPU, I wouldn't worry about it if you have an i5 or i7. They did post an i7-3770 as recommended, but I think an i5-4460 and up would do great.
 
Last edited: Mar 29, 2015
prince_of_nothing

Forum veteran
#1,331
Mar 29, 2015
Giovanni1983 said:
There is no way The Witcher will hit the 3.5 GB mark. My guess is that it won't even be close.
I'd be amazed if it didn't hit that mark. Most games today use extra VRAM for texture preallocation, which makes sense, as it reduces memory overhead.

Why go all the way to system memory for a texture, when you can simply store it in your VRAM if you have the space available? As for whether this could cause issues for the GTX 970, it's certainly possible.

Bad PC developers like Ubisoft use VRAM texture preallocation to hide their crappy memory-handling algorithms on PC, which uses a discrete memory architecture. When I had GTX 970 SLI, I would get stuttering in Watch Dogs when driving at high speed in parts of the downtown area. When I got GTX 980 SLI, that stuttering was gone, as the engine could preallocate more textures thanks to having more VRAM available.

That said, I don't think The Witcher 3 will fall prey to this, as CDPR is an experienced PC developer that takes pride in its work. Red Engine 3 also seems to be very impressive and efficient.
 
eskiMoe

Mentor
#1,332
Mar 29, 2015
If the system requirements of The Witcher 3 and The Witcher 2's VRAM usage are any indication, I doubt VRAM will become an issue with this game.
 
Reactions: mavowar
Giovanni1983

Forum veteran
#1,333
Mar 29, 2015
To be honest, the problem with Watch Dogs was not graphics card memory but a really bad game engine.

From what I have seen, the stuttering and other problems it had didn't have anything to do with having enough memory, because it never hit my graphics card's memory limit (it was using about half of it), but it still stuttered here and there, amongst other issues. Bad coding at its best.

3.5 GB of VRAM used is an "out-of-this-world-huge" amount for any game currently running at 1080p. I doubt that CDPR or any company out there would currently make a game that eats up more than 3 GB of VRAM, and based on the recommended specs I'd say that's already the case: the recommended 770 has 2 GB.

As I mentioned a few pages back, DA:I, which is also a very, very demanding game on ultra, used at most 1,700+ MB of VRAM over hours of running. It never even went over 1,800. So I think (and hope) it's pretty safe to say that it's quite impossible for The Witcher to eat up double that amount of VRAM. We are talking about 3.5 GB, plus 500 MB just in case.

My estimates are without ubersampling and at 1080p, maybe even possibly 1440p, although I doubt it; just making this clear. I do not think it would cut it at a resolution higher than 1440p.
 
Last edited: Mar 29, 2015
Asurat

Rookie
#1,334
Mar 29, 2015
970 SLI worth it for Witcher 3?

I'm thinking about picking up another 970 to run in SLI for Witcher 3 and other games. But I was wondering: how well were the SLI profiles optimized for previous CDPR games?
 
GuyNwah

Ex-moderator
#1,335
Mar 29, 2015
It took a while to get the drivers and profile for SLI right. I would be surprised if they were already in an ideal state on release. There will probably be an nVidia driver update. Spending a lot of money on hardware, on speculation that it will work well with a game before the game has been released, is always a risky proposition. I don't take that kind of risk with my money.

Anyway, hardware thread is in the News subforum. I'll move this one there.
 
Last edited: Mar 29, 2015
sp3tan

Rookie
#1,336
Mar 29, 2015
Asurat said:
I'm thinking about picking up another 970 to run in SLI for Witcher 3 and other games. But I was wondering: how well were the SLI profiles optimized for previous CDPR games?
In my honest opinion, Asurat: just wait. Seriously, just wait. There's no need to rush into buying another card just to maximize performance. Since you have a GTX 970, considering its power at stock clocks, you shouldn't worry. Wait for the game to get released, do a lot of tests with the settings until you get a stable 60 fps everywhere, and THEN make up your mind on whether you'd like to buy another 970 for SLI or not.

Another, less necessary, recommendation is to overclock your GPU. If you're going to do this, I suggest you do some GOOD research and learn (if you have to), or look at what the "average" overclocks of some 970s are. The reason I say this is that maybe you want to overclock but you're not sure, and you just want a nice little performance boost without having to worry about overheating and so on.
 
mavowar

Senior user
#1,337
Mar 29, 2015
https://www.youtube.com/watch?v=UtWL3D9ZL3Q

This video by Digital Foundry clears the air and agrees with what I have found in my own testing, concerning the 970 and 970 SLI at 1080p, 1440p, and 4K.

Hint: as of now, there's really only about a 1 to 2 percent chance you will have an issue. Even at 2K and 4K using SLI it is really rare. Now, I will agree and say these cards will not last as long as I would hope. Then again, I upgrade every 2 years.
 
Last edited: Mar 29, 2015
StarG

Rookie
#1,338
Mar 29, 2015
>> Sorry for the long time to answer; in the meantime I upgraded my main system and ran into some issues, which are now mostly resolved.

GuyNwah said:
Nah, the difference between Core 2 Duo (E8400-E8600) and Haswell G3258, even before overclocking, is almost a factor of 2.
You might accidentally be bringing the quad-core advantage into play here. We were discussing pure IPC back then. I found a nice comparison at THG; see here: http://www.tomshardware.com/reviews/ivy-bridge-wolfdale-yorkfield-comparison,3487-14.html. Why SC2? The game is known to be very CPU-bound, and it's also known to be pretty much limited to two cores. So it's a more or less perfect real-world benchmark.

i5-3570K, Turbo 3.8 GHz -> 44.2 fps (high settings)
E8400 (OC) 4.0 GHz -> 32.5 fps (high, too)

E8400 (OC) 3.8 GHz -> 30.9 fps (high, too) INTERPOLATED

The Ivy Bridge i5 is running at 143% of the Wolfdale's performance at the same clock, plus about 10% on top against Haswell (Ivy Bridge is about 10% slower in IPC than Haswell). Note: the E8400 value has been interpolated to match the clock of the Ivy Bridge. So not 2x, but about 1.5x in a pure speed comparison. Doubling the threads will add performance in a lot of today's typical games, so that will skew the value.

The system's now running said E8400 (not yet overclocked, though). The GPU is being exchanged for a GTX 670. I hope it will run Witcher 3. If things break, I might switch to a Q9450-like Xeon with somewhat slower clocks than the dual-core. Aiming for medium detail @ 1200p.

--------

VRAM utilization will be an interesting thing to see, too. I initially aimed for a 660 Ti, which can be had amazingly cheap and clocks well. But then I noticed the very same issue as with the 970: a soft VRAM limitation. In the case of the 660 Ti, its driver cuts off usage at 1.5 GB of its 2 GB total. I hope CDPR will take this into consideration and keep detail levels such that there are options to restrain them to between 1280 MB (Fermi-based cards often had low amounts like that) and 1500 MB on low.
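The INTERPOLATED row in the comparison above is plain linear clock scaling. A short sketch, using only the numbers from the post (linear scaling is itself an assumption that only holds while the game is fully CPU-bound):

```python
# Scale a CPU-bound fps result from one clock speed to another, linearly.
def scale_fps(fps, from_ghz, to_ghz):
    """Rough model: fps proportional to clock, valid only when fully CPU-bound."""
    return fps * to_ghz / from_ghz

# E8400 measured at 4.0 GHz, interpolated down to the i5's 3.8 GHz turbo clock.
e8400_at_3p8 = scale_fps(32.5, 4.0, 3.8)  # ~30.9 fps, the interpolated value
ipc_ratio = 44.2 / e8400_at_3p8           # ~1.43: Ivy Bridge ~43% faster per clock

print(round(e8400_at_3p8, 1), round(ipc_ratio, 2))  # 30.9 1.43
```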
 
Last edited: Mar 29, 2015
Giovanni1983

Forum veteran
#1,339
Mar 29, 2015
mavowar said:
https://www.youtube.com/watch?v=UtWL3D9ZL3Q

This video by Digital Foundry clears the air and agrees with what I have found in my own testing, concerning the 970 and 970 SLI at 1080p, 1440p, and 4K.

Hint: as of now, there's really only about a 1 to 2 percent chance you will have an issue. Even at 2K and 4K using SLI it is really rare. Now, I will agree and say these cards will not last as long as I would hope. Then again, I upgrade every 2 years.
That video is a truly great find.

Although we don't know specifics about The Witcher 3, it pretty much proves that if you want to play at 1080p, and maybe even 1440p, you won't have any problem whatsoever with a 970 on ultra. Probably at around 45-50 fps, and without ubersampling of course; can't say that enough :D.

It's a damn powerful card, and the VRAM it has is pretty much more than enough.
 
Reactions: mavowar
mavowar

Senior user
#1,340
Mar 30, 2015
Giovanni1983 said:
That video is a truly great find.

Although we don't know specifics about The Witcher 3, it pretty much proves that if you want to play at 1080p, and maybe even 1440p, you won't have any problem whatsoever with a 970 on ultra. Probably at around 45-50 fps, and without ubersampling of course; can't say that enough :D.

It's a damn powerful card, and the VRAM it has is pretty much more than enough.
I will post what numbers I get on release day with SLI 970s on a 1440p ROG Swift G-Sync monitor. I am pumped!!
 