Predicted Witcher 3 system specs? Can I run it?

I was at a masterclass with Marcin Iwinski last Friday in Paris, and when asked about specs for TW3 he said that a current high-end gaming rig would run TW3 on High but probably not on Ultra... I'm not sure if he was serious or not, but he also said that to enjoy TW3 on Ultra we'll have to upgrade to something that doesn't exist yet, haha.

Wait & See !
 
http://www.kitguru.net/components/g...o-retain-256-bit-bus-and-4gb-of-gddr5-memory/

If it retains the 256-bit bus and 4 GB, it might land somewhere between a GTX 770 and an R9 290X.
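
For what it's worth, here is a back-of-the-envelope bandwidth check (a rough Python sketch; the 256-bit bus is the rumour from the article, while the 7 Gbps GDDR5 data rate is my assumption of a typical speed, not a confirmed spec):

# Rough memory-bandwidth estimate for the rumoured card.
bus_width_bits = 256      # rumoured bus width
data_rate_gbps = 7.0      # assumed effective GDDR5 rate per pin

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(f"~{bandwidth_gb_s:.0f} GB/s")  # ~224 GB/s, roughly GTX 770 territory

That would put it right around a GTX 770 on bandwidth alone, so the "in between" guess seems plausible.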

---------- Updated at 07:21 AM ----------



Hmm, 40 fps is good enough for me; the human eye wouldn't spot a difference above 40 anyway.

The best rumors say 256-bit bus, 11 shader modules (a Maxwell shader module is 128 shaders and 8 texture units), 64 output processors, but clocked slower than the 970 and 980. Don't know if it will be SLI-capable. I wouldn't be surprised if it wasn't. If TW3 is output-bound the way TW2 is, it may not underperform the 970 and 980 by much.
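
Taking those rumours at face value, the totals work out like this (a quick Python sketch; the 11-module count and 64 output processors are the rumoured figures, while the 970/980 module counts below are the known retail configurations):

# Back-of-the-envelope totals from the rumoured configuration above.
smm_count = 11            # rumoured shader module (SMM) count
shaders_per_smm = 128
tmus_per_smm = 8

total_shaders = smm_count * shaders_per_smm   # 1408 shaders
total_tmus = smm_count * tmus_per_smm         # 88 texture units
rops = 64                                     # rumoured output processors
print(total_shaders, total_tmus, rops)        # 1408 88 64

For comparison, a GTX 970 has 13 of those modules (1664 shaders) and a GTX 980 has 16 (2048), so most of the gap would come from shader count and clocks rather than the bus.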
 
The best rumors say 256-bit bus, 11 shader modules (a Maxwell shader module is 128 shaders and 8 texture units), 64 output processors, but clocked slower than the 970 and 980. Don't know if it will be SLI-capable. I wouldn't be surprised if it wasn't. If TW3 is output-bound the way TW2 is, it may not underperform the 970 and 980 by much.
At the moment I have 3 options for my rig upgrade:
1. GTX 960, if it will be able to handle the game.
2. GTX 970, so far the best bang for the buck, but I'd have to change my PSU.
3. Whatever AMD is planning to release before May that could compete with the GTX 900 series.
By the way, I read that Samsung started a 14 nm production process. If the new AMD card were 14 nm + HBM......... but I don't think we'll see it before the Witcher release.
 
I do not see any CPU from Core i5 Sandy Bridge and up being a bottleneck. A GTX 970 would be a good fit for it. If you're running DDR3-1333 RAM, I might upgrade that. But clst is right; don't spend money before specs are announced, and better yet don't spend money until more details than just recommended specs are available.

I'm not sure about that, especially when looking at the i5-2500K as a minimum CPU for AC Unity (i7 recommended)... I know nothing is official and we don't know how heavy on the processor Wild Hunt will be... I just want to play it in its full (or at least almost full) glory, but I can only afford one upgrade at the moment. :(
 
In DAI my frame rate was dropping to the mid-30s in some areas on Ultra; even on Very High it would drop into the 40s. Running a GTX 780, 3770K, 16 GB RAM. Is it fair to say I can expect the same results for W3, or was DAI poorly optimized and a lot will depend on how well W3 is optimized?
 
I'm not sure about that, especially when looking at the i5-2500K as a minimum CPU for AC Unity (i7 recommended)... I know nothing is official and we don't know how heavy on the processor Wild Hunt will be... I just want to play it in its full (or at least almost full) glory, but I can only afford one upgrade at the moment. :(

If you believe those requirements then good luck with the upgrades lol :D

Ubisoft also recommended an i7 4770K as the "Ultra" spec for Watch Dogs, but I ran the game on my i5 4690K with an R9 290 and was always averaging around 60 fps on Ultra settings + 2x MSAA @ 1080p, with max fps sometimes hitting 75 - 80 with vsync off. I even tried it with my old i5 2500K and the result was quite similar, so that i7 4770K requirement was just an exaggeration from Ubisoft.
 
I'm not sure about that, especially when looking at the i5-2500K as a minimum CPU for AC Unity (i7 recommended)... I know nothing is official and we don't know how heavy on the processor Wild Hunt will be... I just want to play it in its full (or at least almost full) glory, but I can only afford one upgrade at the moment. :(

CPU recommendations made by manufacturers are so inflated and presumptuous that I extend them no credibility whatsoever.

The member I was responding to is running an Ivy Bridge Core i5. The hell it can't run anything now on the market.
 
Hmm, 40 fps is good enough for me; the human eye wouldn't spot a difference above 40 anyway.

Yea, sorry, that's bullshit. :p
We can easily see up to and past 60fps. We can even see the difference at 120fps if we have an appropriate monitor with a 120Hz refresh rate.
 
Yea, sorry, that's bullshit. :p
We can easily see up to and past 60fps. We can even see the difference at 120fps if we have an appropriate monitor with a 120Hz refresh rate.

I suggest you go to the cinema and watch any movie at 24 fps, then come back to your computer and check again whether fps is really what makes the difference.
The difference between 40 and 50 fps is none, except a statistically driven orgasmic feeling.

---------- Updated at 09:02 PM ----------

In what epoch was a low-end card able to max out a new, demanding game?

Shadow of Mordor recommended system requirements = GTX 670 < GTX 770 < GTX 960
 
system requirements

Hi, I want to know the requirements for The Witcher 3 so I can know what I need to upgrade. What graphics card do we need? Is an NVIDIA GTX 770 3 GB OK for Ultra?
 
I suggest you go to the cinema and watch any movie at 24 fps, then come back to your computer and check again whether fps is really what makes the difference.
The difference between 40 and 50 fps is none, except a statistically driven orgasmic feeling.

Except a 24 fps movie will be 24 fps whether you view it in the cinema or at home on a computer screen; it doesn't change when you view it on a 60 Hz display.
They're also MOVIES, not games. The way the frame rates are handled between them is very different.

Movies are captured from real life using a variety of different cameras, which capture light photons through a lens and save them as frames, either on film or to a piece of digital media in more modern cameras.
The effect of this is that movies get their frames blended together naturally with motion blur, because of how light is captured due to lens exposure times.
Video games have to render their frames instead.
They don't get the benefit of this motion blur. Instead they have to rely on post-process effects to create motion blur, which is still terrible because it has to fit within a render time of 16 ms/33 ms per frame in addition to everything else that has to be rendered. So all we get is terrible-looking motion blur. We're only now starting to get decent stuff like per-object motion blur that more closely mimics what cameras do, thanks to the new consoles finally being above toaster tier.

Please don't tell me it's "a placebo" or a "statistically driven orgasmic feeling", because it's bullshit.
Or that there's no difference between 40 or 50 or 60fps. Because there is.
I don't need any framerate counter at the corner of my screen, nor do most other people that know what they're talking about, to see when a game isn't running at 60fps but rather at 50 or 40.
Stuff like G-Sync and FreeSync is being made for a reason, and not to achieve some statistical orgasm.
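
To put numbers on the frame-budget point above, here is the same thing expressed as frame times (plain arithmetic in Python, nothing game-specific):

# Time available to render one frame at common frame rates.
# 24 fps (cinema) -> 41.7 ms, 40 fps -> 25 ms, 60 fps -> 16.7 ms
# (the ~16 ms budget mentioned above), 120 fps -> 8.3 ms.
for fps in (24, 30, 40, 50, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")

Going from 40 to 60 fps shaves roughly 8 ms off every frame, which is exactly the kind of change in motion smoothness and input latency people notice.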
 
Hi, I want to know the requirements for The Witcher 3 so I can know what I need to upgrade. What graphics card do we need? Is an NVIDIA GTX 770 3 GB OK for Ultra?

Join the queue. System requirements have not been given out. The best anybody can do is guess. Our guesses are not likely to be very good, and if you were to spend money based on our guesses, we would feel bad. So please don't.

Current thought is that a single GTX 770 is not going to be enough to carry the game with all the Ultra-level eye candy.
 
In what epoch was a low-end card able to max out a new, demanding game?

A GTX 750 Ti, which was the debut Maxwell GPU (the most low-end card in the Maxwell line), can max out Tomb Raider 2013 (still considered one of the most demanding games) on Ultra settings @ 1080p while giving playable frame rates, and several other games besides. So it's not always about low end vs. high end, as architecture also makes a difference. Today's high end could be defeated by tomorrow's low end.

http://www.techspot.com/review/783-geforce-gtx-750-ti-vs-radeon-r7-265/page4.html
 
Except a 24 fps movie will be 24 fps whether you view it in the cinema or at home on a computer screen; it doesn't change when you view it on a 60 Hz display.
They're also MOVIES, not games. The way the frame rates are handled between them is very different.

Movies are captured from real life using a variety of different cameras, which capture light photons through a lens and save them as frames, either on film or to a piece of digital media in more modern cameras.
The effect of this is that movies get their frames blended together naturally with motion blur, because of how light is captured due to lens exposure times.
Video games have to render their frames instead.
They don't get the benefit of this motion blur. Instead they have to rely on post-process effects to create motion blur, which is still terrible because it has to fit within a render time of 16 ms/33 ms per frame in addition to everything else that has to be rendered. So all we get is terrible-looking motion blur. We're only now starting to get decent stuff like per-object motion blur that more closely mimics what cameras do, thanks to the new consoles finally being above toaster tier.

Please don't tell me it's "a placebo" or a "statistically driven orgasmic feeling", because it's bullshit.
Or that there's no difference between 40 or 50 or 60fps. Because there is.
I don't need any framerate counter at the corner of my screen, nor do most other people that know what they're talking about, to see when a game isn't running at 60fps but rather at 50 or 40.
Stuff like G-Sync and FreeSync is being made for a reason, and not to achieve some statistical orgasm.

Read your post: you've just proven that a lot of other stuff matters more than fps. Anyway, wow, you put a lot of effort into copying & pasting Wikipedia. Sorry, we're on different pages; I can't see the things you've read about when I'm playing games.
IMO silicon semiconductor advancement has reached its limits, and now you'll be fed a 10 fps / 200 dpi improvement every year, and that's the bullshit AMD/Nvidia is selling you.
 
Moderator: The tone of the argument about how much frame rates matter has descended to a level where members have already questioned each other's competence and insults are about to be traded. So the time to cease arguing about who is right about how much frame rates matter is now.
 
Moderator: The tone of the argument about how much frame rates matter has descended to a level where members have already questioned each other's competence and insults are about to be traded. So the time to cease arguing about who is right about how much frame rates matter is now.




Owkay, Peace Brothers, Hawk!

Moderator: "Cease arguing now" is not an invitation to repeat arguments. Thank you for not doing so in the future.

---------- Updated at 11:16 PM ----------

[Attached image: cute_spongebob_by_leilush12-d4j9j8u.jpg]
CPU recommendations made by manufacturers are so inflated and presumptuous that I extend them no credibility whatsoever.

The member I was responding to is running an Ivy Bridge Core i5. The hell it can't run anything now on the market.

I'm still not quite convinced. Dying Light, another upcoming next-gen-only game (made in Poland), also points at the i5-2500 as a minimum CPU to run it. And according to CPUBOSS, the i5-2500 is on par with my 3450... And frankly, I don't want to barely meet the Witcher's minimum, especially if I buy a high-end graphics card.
 