Predicted Witcher 3 system specs? Can I run it?

I want one simple thing included in games released on the PC.
The option to LOCK THE FRAME RATE, to stop the STUTTERING caused by FPS fluctuations and get a smoother, more consistent experience.
I personally like to lock my FPS to 30, because 60 fps makes me dizzy, especially when I'm gaming. A lot of people get motion sickness, which is another reason to offer an FPS lock option.
Or just tweak the settings in your drivers?
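
For what it's worth, the core of an in-game FPS lock is just a timed wait at the end of every frame. Here is a minimal sketch of the idea in C++ (the 30 fps target and the loop body are illustrative placeholders, not anything from a real engine):

```cpp
// Minimal fixed-frame-rate limiter: sleep until the next frame deadline.
// TARGET_FPS and the loop body are illustrative placeholders.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    constexpr int TARGET_FPS = 30;                 // the 30 fps lock discussed above
    const auto frame_time = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / TARGET_FPS));

    auto next_deadline = clock::now();
    for (int frame = 0; frame < 300; ++frame) {    // stand-in for the real game loop
        // update(); render();                     // per-frame game work would go here
        next_deadline += frame_time;
        std::this_thread::sleep_until(next_deadline);  // absorb leftover frame time
    }
}
```

A driver-level limiter does much the same thing from outside the game, which is why the driver-settings suggestion usually works too.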
 
If Witcher requires 6GB of Vram to max out, I will be very.......displeased.

I don't think it's likely, with the possible exception of 1440p and above, where anybody's guess is as good as anybody else's.

CDPR showed with Red Engine 2 that they can do a good job of memory budgeting. They're also working hard to squeeze as much of the graphics onto the consoles as they can, so they have to manage memory well, and they've shown they have the developers who can do it.

I'm not worried; the game should perform well with 2GB of VRAM, maybe 3, or 4 at the most.
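
To put rough numbers on what "memory budgeting" means here: texture data dominates VRAM use, and block compression is where a budget is won or lost. A back-of-the-envelope sketch (all sizes and counts are illustrative, nothing here is from CDPR):

```cpp
// Back-of-the-envelope texture budget: why block compression decides
// whether a game fits in 2-3GB of VRAM. All figures are illustrative.
#include <cstdio>

int main() {
    const double MiB = 1024.0 * 1024.0;
    const double GiB = MiB * 1024.0;
    const double texels = 2048.0 * 2048.0;     // one typical "ultra" texture
    const double mips = 4.0 / 3.0;             // a full mip chain adds ~1/3

    const double rgba8 = texels * 4.0 * mips;  // uncompressed, 4 bytes/texel
    const double bc7   = texels * 1.0 * mips;  // BC7/DXT5-class, 1 byte/texel
    const double bc1   = texels * 0.5 * mips;  // BC1, 0.5 bytes/texel

    std::printf("per texture: RGBA8 %.1f MiB, BC7 %.1f MiB, BC1 %.1f MiB\n",
                rgba8 / MiB, bc7 / MiB, bc1 / MiB);   // ~21.3 / 5.3 / 2.7

    // 500 unique textures resident at once, compressed vs. not:
    std::printf("500 textures: BC7 %.1f GiB vs RGBA8 %.1f GiB\n",
                500.0 * bc7 / GiB, 500.0 * rgba8 / GiB);  // ~2.6 vs ~10.4
}
```

On numbers like these, a disciplined developer lands comfortably in the 2-4GB range; a sloppy one shipping poorly compressed or redundant textures blows straight past it.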
 

Yes, I can see it becoming an issue at higher resolutions alright, especially looking at the quality of the graphics in Witcher 3. The fact that CDPR were talking about the 780 Ti's performance a while ago is encouraging too; I'd say 3GB will be ample for 1080p.

I'm just worried about the way many developers seem to be heading. When I saw the 6GB requirement for 1080p, I nearly fell off my chair. Nothing I've seen of SOM justifies it needing that kind of VRAM. As I said earlier, it wouldn't be such an issue if graphics cards (Nvidia) weren't so tight with the VRAM on reference models, or didn't charge a small fortune for more of it.
 
@seasonedwitcher It's not that they're tight; rather, the amount being requested is exaggerated.
@shawn_kh Sometimes it's like you say, but only for settings that conflict with the game engine. For an FPS limit you should have no problem whatsoever :)
 

I don't think it's nVidia being stingy, exactly. Memory prices are notoriously volatile, GDDR5 is expensive and in high demand and short supply, circuits to handle GDDR5 are expensive, and all memory got more expensive last year after the accident at Hynix.

Less "stingy" and you'd be paying $100-200 more a card, and supply would be even more limited than it is already. That's why I'm glad first-class engineering operations like CDPR's are good at making the most of available RAM, and second-rate operations like Bethesda's just evade the problem by demanding more of it.
 

Compared to AMD, they're stingy. Only now are reference models coming with 4GB, following, of course, the wildly overpriced Titan. And since memory doesn't grow on trees, some devs acting like it does makes the practice even more unforgivable.
 

You're right about the developers. But the Titan is a different breed of cat. It's a professional-type graphics card that sneaked into the enthusiast market. I'm still not sure why nVidia did it. It undercuts their Quadro line badly. Still, it is not a good example of card pricing: it is not overpriced but drastically underpriced for its real use.

Different breed of cat: the Titan is an "enthusiast card" the way a lion is a small cat.
 
Devs being lazy and using sloppy methods doesn't mean you start demanding things that aren't really necessary; that's like some kind of weird bullying. 4GB of VRAM is more than enough for a well-made title, especially the likes of The Evil Within, which looks like any other game we've had in the past few years, with nothing taxing about it, and it still wants 4GB of VRAM.

And just what kind of ridiculous development is this? They said "we don't have a minimum specification". What nonsense?!
 

Ah yeah, but let's not kid ourselves: the 780 Ti is pretty much the Titan with its double-precision performance hobbled and less memory, and of course it costs a lot less. It's hard to believe all the same. Nvidia have just released their new flagship, and we're being told it's not good enough. I don't think this has ever happened so quickly before?
 
*sigh* Let's not turn this into an NV vs AMD "discussion". Bottom line: those are absolutely RIDICULOUS VRAM requirements and thoroughly unjustified. Nobody should fall for it; we don't "need" more VRAM to become the standard yet, at least for video games.
 
I'm an Nvidia fan, but right now I would go for an AMD R9 295X2 for 4K gaming. It has 8GB of GDDR5 memory, and it is much cheaper than the Titan.
Right now you could get a PC with a 295X2 and a 4790K with a 1000W 80 Plus PSU at Cyberpowerpc for $1707, or you could build one yourself and pay less. I think Nvidia cards are overpriced in general, but I keep buying Nvidia anyway. My first card was an Nvidia card with 64MB of memory, and every card since has been Nvidia.
To be fair, they are trying to improve their pricing with the GTX 970 and 980.
And I agree that if developers tried harder to optimize their games, we would need much lower specs and get smoother gameplay.
A good example is Watch Dogs. The textures are not impressive at all, but you need a 3GB card to run it on Ultra? Are you kidding me?! Just optimize your game instead of doing a cheap port to PC. And on a side note, stop milking Assassin's Creed, Ubisoft!
 
It will be a Titan 2 or a GTX 990. Most likely, though, it's just unfounded rumours. They are not going to release a 980 Ti a couple of months after the 980; it's just not going to happen, especially with no competition coming from AMD until spring/summer next year.
 

The sad fact is that even though it has 8GB total, it only uses 4GB. As with any CrossFire or SLI setup, VRAM does not stack.
 

I know the VRAM doesn't stack, but surely the fact that each card only renders every other line in SLI or CrossFire should mean less memory is used on each? And if that's not what happens, it's what should happen.
 
I'm getting scared for real now. The Evil Within demanding 4GB, Shadow of Mordor demanding 6GB for Ultra textures... what's next? What are these crazy developers doing to games?

I hope my 780 3GB can handle Witcher 3's Ultra textures... if not, I'm definitely finished as a PC gamer, it's over... this industry is going crazy...

I'm looking right now (I can upload screens) at my Crysis 3 using less than 2GB, Far Cry 3 using 1.3GB max, Metro Redux using 1.2GB, all with filters, MSAA, SMAA, etc. I can't believe supposed "next-gen" games need 3x or 4x that much VRAM... it's not fair to PC consumers...
 

Well, no, because each card has to have a complete set of textures, and the textures take up most of the VRAM. The frame buffer itself is only 33MB at 3840x2160; cutting that in half by doing checkerboard or alternate line rendering makes only an insignificant difference.
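
The arithmetic bears that out; a quick sanity check (the buffer counts in the comment are illustrative):

```cpp
// Why halving the frame buffer in SLI/CrossFire barely helps:
// the buffers are tiny next to the texture set each GPU holds in full.
#include <cstdio>

int main() {
    const double bytes = 3840.0 * 2160.0 * 4.0;   // 32-bit colour at 4K
    std::printf("4K colour buffer: %.1f MB (%.1f MiB)\n",
                bytes / 1e6, bytes / (1024.0 * 1024.0));
    // ~33.2 MB. Even with a handful of render targets (back buffer,
    // depth, G-buffer) you stay in the hundreds of MB, while the
    // duplicated texture set can run to several GB per card.
}
```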
 