Predicted Witcher 3 system specs? Can I run it?

Do you guys think the new Nvidia 900 series will be worth upgrading to from a GTX 780?

In a few weeks everything will be public knowledge. Until then, only assumptions can be made. As a general rule, I'd say no: the market moves very quickly, and I personally find that changing GPUs every new generation isn't enough bang for your buck. I'm for a new GPU every two generations.

And I don't think these 900s will be much better than the 700s, simply because there's no real demand for more power right now; same goes for CPUs. I'm talking about gaming.
 
It will for sure, and the 900s will of course be ahead. But I mean that one must decide whether the gap between a 780 and a 980 is worth the expense ;)
 
I have a 55" Samsung LED TV that I mostly use for (among other things) console games and PC games that are best played with a controller.
Can I ask which TV model you have, and does it have a gaming mode? I'm thinking of getting a 55-inch Samsung 6270 Smart TV as a monitor.
 
Nobody actually knows any of this "for sure". It's all rumors and a supposed leak. There has been no announcement of a 900-anything.

We should know more in two weeks or so. Until then, repeating rumors puts your post at as big a risk of being wrong as the rumors.
 
Can I ask which TV model you have, and does it have a gaming mode? I'm thinking of getting a 55-inch Samsung 6270 Smart TV as a monitor.
I have a Samsung 6 Series F6475, and yes, it has a gaming mode. But like I said, it's great for playing games on a couch with a controller; I wouldn't use it as a monitor. The response times of TVs in general are not that great, even in gaming modes.

If you want more screen real estate than your standard 24/27-inch displays, I'd check out those new 32" WQHD monitors BenQ, Asus, and Samsung have released (I talked about them here). I'm personally getting that BenQ once I get paid.
 
Well, I was simply saying that W3 will "for sure" push technology, and any new x80 (call it 880 or 980) will surely perform better than the "previous" 780 (by how much, we don't know). Not the hardest assumptions to make :) For the rest, I was the first to say that we must wait for the right info.
 
Nvidia is holding a 24-hour gameathon. They've left a few hints that this is when they'll reveal the Maxwell 980 and 970. Also, a few insiders have confirmed that the NDA lifts on or around that date.

http://game24.nvidia.com/#/info

Just a couple of weeks to wait.

Their initial announcement is anticipated before then, with more details to be revealed at the Game24 event. They've announced Maxwell before, and missed dates before, so I'm still in "show me the product" mode.
 
Their initial announcement is anticipated before then, with more details to be revealed at the Game24 event. They've announced Maxwell before, and missed dates before, so I'm still in "show me the product" mode.

FYI, if you put in a bad URL for http://game24.nvidia.com/ it calls the home page the "Maxwell Teaser Page". For example, http://game24.nvidia.com/error

So, I hope this solidifies that we should see something by the 19th, maybe sooner. Fingers crossed.
 
Hey guys, just a quick question:
Right now I have a stock 3570K and a stock 7950.
My question is: if I upgrade to a potential GTX 970 (here's hoping it lands at $400), would I be bottlenecked by the CPU,
and if so, would overclocking be an option? All I want is 1080p on very high (without ubersampling).
Thanks
 
Hey guys, just a quick question:
Right now I have a stock 3570K and a stock 7950.
My question is: if I upgrade to a potential GTX 970 (here's hoping it lands at $400), would I be bottlenecked by the CPU,
and if so, would overclocking be an option? All I want is 1080p on very high (without ubersampling).
Thanks


You would be just fine!
 
OS requirements

Someone close to the developers, please advise on the OS requirements, as these are the roots from which my hardware choices grow.
I've heard rumors that no XP support is expected, Win7/8 only. Is it true? Many thanks in advance.
 
Someone close to the developers, please advise on the OS requirements, as these are the roots from which my hardware choices grow.
I've heard rumors that no XP support is expected, Win7/8 only. Is it true? Many thanks in advance.

Definitely no XP support. The game uses DirectX 11, and that is only on Vista and later. 64-bit Vista/7/8/8.1 is what everybody's expecting.
 
Definitely no XP support. The game uses DirectX 11, and that is only on Vista and later. 64-bit Vista/7/8/8.1 is what everybody's expecting.
Thank you for the answer. Still, I'm not convinced.
See, it's been years since the majority of games started to use DirectX 10, and then 11, and all the while they've worked just fine on DirectX 9 too.
I'm no specialist, but this is a fact. Of course, developers can drop DirectX 9 compatibility deliberately, as some already do with x86 (32-bit) OS support. If you believe that would be the case, please let me know your thoughts. Thank you again.
 
No, the reason for not continuing to support DirectX 9 is that it's fucking ancient. And just like XPDM, it gets in the way of doing better things in graphics. You'd have a game that cost more and ran worse if you tried to preserve compatibility. They decided from the beginning to make this a pure DirectX 11 title, and it will be all the better a game for it. It would not advance the state of the art in gaming graphics if they had to drag that DirectX 9 anchor everywhere.

And the need to keep your working set under 2 GB in the 32-bit world is an even bigger reason to ditch the old systems. It makes you budget memory like a tightwad CFO, paging resources out just because they're lower priority than the resources you have to load right now (but had to page out earlier because you needed the memory for something else). Since the whole purpose of having memory is to keep as much of the game in memory as you can, to avoid going back to disk, 32-bit memory budgets are a thing of the past on gaming systems, and good riddance to them.

Costs more. Runs worse. Don't want it. End of rant.
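To make that budgeting point concrete, here's a minimal sketch (all names and numbers hypothetical, not anyone's actual engine code) of the kind of priority-based eviction a hard 2 GB working-set ceiling forces on a resource loader:

```python
class ResourceCache:
    """Toy byte-budgeted cache: evicts lowest-priority resources
    when a new load would exceed the budget (hypothetical sketch)."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.entries = {}  # name -> (size_bytes, priority)

    def load(self, name, size, priority):
        # Page out the lowest-priority residents until the new one fits.
        while self.used + size > self.budget and self.entries:
            victim = min(self.entries, key=lambda n: self.entries[n][1])
            if self.entries[victim][1] >= priority:
                return False  # everything resident matters more; give up
            self.used -= self.entries.pop(victim)[0]
        if self.used + size > self.budget:
            return False  # too big even for an empty cache
        self.entries[name] = (size, priority)
        self.used += size
        return True


cache = ResourceCache(budget_bytes=2 * 1024**3)  # the old 2 GB ceiling
cache.load("world_textures", 1536 * 1024**2, priority=5)   # fits
cache.load("cutscene_audio", 800 * 1024**2, priority=2)    # rejected: would
                                                           # evict higher-priority data
```

On a 64-bit system the budget simply grows past what the game needs, and this whole juggling act (and the disk traffic it causes) disappears.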
 
Just tried running The Witcher 1 at a downsampled resolution of 1440p (with everything maxed & 2xAA), just to see what kind of performance I'll be getting once my 1440p BenQ arrives. The frame rate dropped below 20 in the opening scene with the Frightener (though to be honest, it dropped below 30 even @ 1080p).

I guess CDPR really pushed the Aurora engine beyond what it was capable of. The game runs like crap at times, even on modern hardware.
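Those numbers are roughly what the raw pixel math predicts: 1440p pushes about 1.78x the pixels of 1080p, so for a purely GPU/fill-rate-bound scene you'd naively expect frame rates to drop by about the same factor. A back-of-the-envelope estimate, not a benchmark:

```python
# Pixel counts at each resolution.
px_1080p = 1920 * 1080   # 2,073,600 pixels per frame
px_1440p = 2560 * 1440   # 3,686,400 pixels per frame
ratio = px_1440p / px_1080p

print(f"1440p renders {ratio:.2f}x the pixels of 1080p")

# If a scene is fill-rate bound at ~30 fps @ 1080p, a naive
# estimate for the same scene rendered at 1440p:
fps_1080p = 30
print(f"naive estimate: ~{fps_1080p / ratio:.0f} fps @ 1440p")
```

That lands around 17 fps, consistent with the sub-20 figure above; CPU-bound scenes would scale less severely.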
 
Just tried running The Witcher 1 at a downsampled resolution of 1440p (with everything maxed & 2xAA), just to see what kind of performance I'll be getting once my 1440p BenQ arrives. The frame rate dropped below 20 in the opening scene with the Frightener (though to be honest, it dropped below 30 even @ 1080p).

I guess CDPR really pushed the Aurora engine beyond what it was capable of. The game runs like crap at times, even on modern hardware.

The swamps especially were occasionally a slideshow on my fairly powerful PC :mean:
 