Predicted witcher 3 system specs? Can I run it .

Status
Not open for further replies.
I didn't put it higher than Witcher 3, which looks great; I just said SOM rivals it. Witcher 3 is a very good-looking game, but like Mordor, some screens of W3 don't look as good as others.

A new thread in the community section can be used for this; this is a system requirements thread, not one for discussing graphics in other games or making comparisons. I can move the posts to a new thread if we're going to discuss this further, if no one objects.

Okay, probably easier if I just shut up about SOM. This came about due to the 6GB VRAM issue, and fears Witcher 3 could turn out the same.
 
Okay, probably easier if I just shut up about SOM. This came about due to the 6GB VRAM issue, and fears Witcher 3 could turn out the same.

As @Guy N'wah said, since the game runs on an updated version of the engine that also ran TW2, even though we're talking about an open world, I don't think the RAM requirements will be that high. TW2 seemed very well optimized to me (I only played the Enhanced Edition, though).
 
As @Guy N'wah said, since the game runs on an updated version of the engine that also ran TW2, even though we're talking about an open world, I don't think the RAM requirements will be that high. TW2 seemed very well optimized to me (I only played the Enhanced Edition, though).

I know.

As for its system requirements, time will tell.
 
I'm getting scared for real now. The Evil Within demanding 4GB, Shadow of Mordor demanding 6GB for ultra textures... what's next, man? What are these crazy developers doing with games?

I hope my 780 3GB can handle Witcher 3 ultra textures... if not, I'm definitely finished as a PC gamer, it's over... this industry is going crazy...

I'm seeing right now (I can upload screens) my Crysis 3 using less than 2GB, Far Cry 3 using 1.3GB max, Metro Redux using 1.2GB, all with filters, MSAA, SMAA, etc. I can't believe supposed "next-gen" games need 3x or 4x these VRAM amounts... it's not fair to PC consumers...

I'm sure this is only because developers are still getting used to making big open worlds for these next-gen games and are just trying to meet deadlines for their publishers. As time goes on, devs will adapt their game engines to be more efficient, but for now we just have to deal with it.
 
A lot of people don't know this, but most of the VRAM on your video card is not used for render targets, but as storage for graphics data, mostly textures.

So when Shadow of Mordor states that 6GB of VRAM is required for ultra textures, it's not actually NEEDED. Since the game has been out, many people have been running the ultra-level textures on GPUs with 3GB and 4GB without any problems. Here's one example of a guy with a GTX 970 running the game with ultra-level textures. 2GB cards are really pushing it, though, but even that's doable if you don't mind a bit of stutter or pausing every now and then when the textures are being swapped out.

Games have been doing this for a while, and it's a smart way for developers to increase performance, because VRAM is so much faster than system memory. So honestly, I don't think VRAM capacity is going to be an issue with upcoming games unless you're running 2GB or under and trying to use the most detailed textures.
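To put rough numbers on why texture data dominates VRAM, here's a back-of-envelope sketch. This is my own illustration, not anything from the developers: a texture's footprint is just width × height × bytes per pixel, plus roughly a third more for the full mipmap chain.

```python
def texture_vram_mb(width, height, bits_per_pixel, mipmaps=True):
    """Estimate the VRAM footprint of one texture, in megabytes.

    bits_per_pixel: 32 for uncompressed RGBA8, 8 for DXT5/BC7-style
    block compression. A full mip chain adds about 1/3 on top
    (geometric series 1 + 1/4 + 1/16 + ... = 4/3).
    """
    base_bytes = width * height * bits_per_pixel / 8
    if mipmaps:
        base_bytes *= 4 / 3
    return base_bytes / (1024 * 1024)

# One 4096x4096 "ultra" texture, uncompressed vs block-compressed:
print(round(texture_vram_mb(4096, 4096, 32)))  # -> 85 (MB)
print(round(texture_vram_mb(4096, 4096, 8)))   # -> 21 (MB)
```

At ~85MB per uncompressed 4K texture, it only takes a few dozen resident textures to eat gigabytes of VRAM, which is why caching (rather than the frame buffer) drives these big numbers.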
 
Will a single GTX 970 max this?
Just a sec. I'll check.

 
http://www.eurogamer.net/articles/digitalfoundry-2014-eyes-on-with-pc-shadow-of-mordors-6gb-textures

Monolith recommends a 6GB GPU for the highest possible quality level - and we found that at both 1080p and 2560x1440 resolutions, the game's art ate up between 5.4 to 5.6GB of onboard GDDR5. Meanwhile, the high setting utilises 2.8GB to 3GB, while medium is designed for the majority of gaming GPUs out there, occupying around 1.8GB of video RAM.
Yeah. So the 6GB VRAM was no joke after all.
 
Ahem....: https://www.youtube.com/watch?v=fGEe-SX0VSI

There are many more videos showing the same thing. The game runs with the ultra texture pack installed and enabled, all settings at maximum, without a hitch on a GTX 980. I wouldn't trust any of those sites.

That's what I said on the previous page. The fears of not having enough VRAM are truly unfounded. Only in certain circumstances, where the engine is extremely inefficient with resource management (e.g. Watch Dogs), does it really become a problem, because ultimately, what's on the screen at any one time uses a small amount of VRAM as a buffer. Most of the VRAM is being utilized as storage.

I started a thread over at the Anandtech forums about this very issue if anyone wants to read it.

One guy in that thread is playing Shadow of Mordor with a GTX 580 1.5GB card, using high-level textures with no issues. The developer recommends 3GB and greater for high-level textures, so I think that's strong evidence.
 
The various AA techniques also take up a huge amount of memory, as do many lighting effects. Yes, there is no doubt they inflated the VRAM requirements to gain attention. Watch Dogs was also a prime example of a terrible resource hog, with nothing in its visuals to justify its requirements, but I can max the thing out on my GTX 980 with absolutely no drops in frame rate, and that's using the Ultra mod. I think it's safe to say 4GB is more than enough for the foreseeable future.
 
Ahem....: https://www.youtube.com/watch?v=fGEe-SX0VSI

There are many more videos showing the same thing. The game runs with the ultra texture pack installed and enabled, all settings at maximum, without a hitch on a GTX 980. I wouldn't trust any of those sites.
I'm not so sure it runs "without a hitch", because I saw some benchmark results at overclockers.co.uk, and while their average frame rate with the texture pack installed was good, the minimum frame rates were 20ish or so. That could imply that at certain points the VRAM isn't enough. Also, Digital Foundry actually played the game instead of running the benchmark to measure how much VRAM was used. The fact seems to be that, at least in-game, the game does use more than 5GB of VRAM.

And lol at calling Digital Foundry untrustworthy. I've yet to see any other site doing more detailed tech analyses than DF.
 
A 780 Ti 6GB or Titan 6GB is a waste of money; you are paying for extra storage of cached extras. The GPUs are incapable of rendering a frame buffer that even remotely fills that, let alone the crappy GPUs in these consoles.
LOL! That's pretty much exactly what I said here.

This whole VRAM thing seems like fear-mongering to me. The PS4 needs some of its unified memory for OS purposes, so of that 8GB, devs can only use 4.5GB. So the idea that because of consoles we need more VRAM seems like nonsense. Besides, we don't only have VRAM, we have system RAM too, of which you should have 8GB. So with 3GB of VRAM you should be more than OK for anything but 4K or multi-monitor gaming.

I mean really, are there even any cards powerful enough to actually make use of 6GB of VRAM? That's a TON of memory to feed.
People freaking out over VRAM right now are less-informed consumers getting fooled by inflated requirements.
 
I will add this: I've played Shadow of Mordor, and even if the graphics are good, they're far from stunning (I'm running it with an overclocked 780). So my opinion: they didn't optimize their game much and let powerful hardware do the rest; something I'm confident CDPR won't do.
 
I'm not so sure it runs "without a hitch", because I saw some benchmark results at overclockers.co.uk, and while their average frame rate with the texture pack installed was good, the minimum frame rates were 20ish or so. That could imply that at certain points the VRAM isn't enough. Also, Digital Foundry actually played the game instead of running the benchmark to measure how much VRAM was used. The fact seems to be that, at least in-game, the game does use more than 5GB of VRAM.

And lol at calling Digital Foundry untrustworthy. I've yet to see any other site doing more detailed tech analyses than DF.


That min FPS happens at the start, when the textures are first being loaded, and can be seen in the video I linked. That always happens with every game and benchmark unless you run a pre-bench loop beforehand, as in the likes of the Unigine benchmarks; it means nothing. Benchmarks deliberately run the most demanding scenario in a game; there is nothing in the game that will use more resources.
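A minimal sketch of why a warm-up window changes the headline "minimum FPS" number (the frame times and the 120-frame warm-up are made up for illustration; real benchmarks like Unigine run a pre-bench loop for the same reason):

```python
def fps_stats(frame_times_ms, warmup_frames=0):
    """Min and average FPS from per-frame render times (ms),
    optionally discarding an initial warm-up window where textures
    are still streaming into VRAM."""
    steady = frame_times_ms[warmup_frames:]
    fps = [1000.0 / t for t in steady]
    return min(fps), sum(fps) / len(fps)

# First couple of seconds stutter while the ultra pack loads,
# then frame times settle down:
frames = [50.0] * 120 + [16.7] * 1000

lo, _ = fps_stats(frames)
print(round(lo))   # -> 20: the scary "minimum FPS" headline number

lo, _ = fps_stats(frames, warmup_frames=120)
print(round(lo))   # -> 60: steady-state minimum once textures are resident
```

Same run, same card; the only difference is whether the loading spike is counted, which is the poster's point about those benchmark minimums.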


Again, 4GB is more than enough, and I don't trust any of those websites. What I do trust is real-world evidence from actual users, who are far less likely to have an agenda. All the evidence shows a GTX 980 mops the floor with this game.
 
Again, 4GB is more than enough, and I don't trust any of those websites. What I do trust is real-world evidence from actual users, who are far less likely to have an agenda. All the evidence shows a GTX 980 mops the floor with this game.
What agenda could Digital Foundry possibly have?

Also, did you notice this:

First of all, it's worth pointing out that as of this writing, actually gaining access to the texture pack itself is an involved, convoluted procedure that we only discovered thanks to the legwork done by German site, PCHardware.de. You need to access this Steam URL, hit the launch or install buttons, then when it errors out, head into your Steam client, right-click on Shadow of Mordor in your Steam library, select DLC, tick the HD texture pack and then force an update (or verify the files). If there are any problems with the last part, restarting the Steam client should sort it out. The optional texture pack is a 3.7GB download.

Curiously, the option for ultra textures is still present in the game, even without the pack installed - it simply defaults to high quality art instead.

As of October 2nd you can directly download the patch without hassle, but that benchmark video was from an earlier date. So it's entirely possible that despite the ultra textures being enabled in the options menu, they weren't actually loaded in the game. That would explain why the game only uses around 3GB of VRAM.

Monolith recommends a 6GB GPU for the highest possible quality level - and we found that at both 1080p and 2560x1440 resolutions, the game's art ate up between 5.4 to 5.6GB of onboard GDDR5. Meanwhile, the high setting utilises 2.8GB to 3GB, while medium is designed for the majority of gaming GPUs out there, occupying around 1.8GB of video RAM.
 