Predicted Witcher 3 system specs? Can I run it?


wildspirit

Senior user
#2,521
Dec 15, 2014
I was at a masterclass with Marcin Iwinski last Friday in Paris, and when asked about specs for TW3, he said that a current high-end gaming rig would run TW3 on High but probably not on Ultra... I'm not sure if he was serious, but he also said that to enjoy TW3 on Ultra we'll have to upgrade to something that doesn't exist yet, haha.

Wait & See !
 
  • RED Point
Reactions: Fallout_Wanderer, LoneWolf and WildFinn96

GuyNwah

Ex-moderator
#2,522
Dec 15, 2014
dasega said:
http://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-geforce-gtx-960-to-retain-256-bit-bus-and-4gb-of-gddr5-memory/

If it retains a 256-bit bus and 4GB, it might land somewhere between a GTX 770 and an R9 290X.

---------- Updated at 07:21 AM ----------

hmm - 40 fps is good for me, the human eye wouldn't spot a difference above 40 anyway
The best rumors say 256-bit bus, 11 shader modules (a Maxwell shader module is 128 shaders and 8 texture units), 64 output processors, but clocked slower than the 970 and 980. Don't know if it will be SLI-capable. I wouldn't be surprised if it wasn't. If TW3 is output-bound the way TW2 is, it may not underperform the 970 and 980 by much.
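Just to put those rumoured numbers in perspective, here's the back-of-the-envelope math. Only the 960 figures are the rumour above; the 13- and 16-SMM counts for the 970/980 are the known retail specs, and real throughput would still hinge on whatever clocks it ships at, so treat this as a rough sketch:

Code:
# Quick Maxwell shader math: each SMM has 128 CUDA cores and 8 texture units.
CORES_PER_SMM = 128
TMUS_PER_SMM = 8

cards = {
    "GTX 960 (rumoured)": 11,  # 11 SMMs per the rumour above
    "GTX 970": 13,             # known retail spec
    "GTX 980": 16,             # known retail spec
}

for name, smms in cards.items():
    print(f"{name}: {smms} SMMs -> {smms * CORES_PER_SMM} shaders, "
          f"{smms * TMUS_PER_SMM} texture units")

That works out to 1408 shaders and 88 texture units for the rumoured 960, against 1664/104 for the 970 and 2048/128 for the 980.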
 

dasega

Senior user
#2,523
Dec 15, 2014
Guy N'wah said:
The best rumors say 256-bit bus, 11 shader modules (a Maxwell shader module is 128 shaders and 8 texture units), 64 output processors, but clocked slower than the 970 and 980. Don't know if it will be SLI-capable. I wouldn't be surprised if it wasn't. If TW3 is output-bound the way TW2 is, it may not underperform the 970 and 980 by much.
At the moment I have 3 options for my rig upgrade:
1. GTX 960, if it will be able to handle the game.
2. GTX 970, so far the best bang for the buck, but I would have to change my PSU.
3. Whatever AMD is planning to release before May that could compete with the GTX 900 series.
By the way, I read that Samsung has started its 14nm production process. If the new AMD card were 14nm + HBM... but I don't think we'll see it before the Witcher release.
 

nick_scryer

Rookie
#2,524
Dec 15, 2014
Guy N'wah said:
I do not see any CPU from Core i5 Sandy Bridge and up being a bottleneck. A GTX 970 would be a good fit for it. If you're running DDR3-1333 RAM, I might upgrade that. But clst is right; don't spend money before specs are announced, and better yet don't spend money until more details than just recommended specs are available.
I'm not sure about that, especially when looking at the i5-2500K as the minimum CPU for AC Unity (i7 recommended)... I know nothing is official and we don't know how heavy on the processor Wild Hunt will be... I just want to play it in its full (or at least almost full) glory, but I can only afford one upgrade at the moment. :(
 

shawn_kh

Rookie
#2,525
Dec 15, 2014
There are rumors that Nvidia is going to release a card that is 50% faster than the Titan Black, with 6GB or 8GB of VRAM. It might possibly be the 980 Ti. So I would wait for the new card to be released before upgrading the GPU, just to make sure I'm not making a rash decision.
Here's a link to an article about the card:
http://www.kdramastars.com/articles/52474/20141113/gtx-980-ti-release-date.htm
 
sidspyker

Ex-moderator
#2,526
Dec 15, 2014
nick_scryer said:
I'm not sure about that, especially when looking at the i5-2500K as the minimum CPU for AC Unity (i7 recommended)
Well, there's your problem :p

EDIT: @shawn_kh: "kdramastars" is that site that makes up BS rumours all the time. So be wary...
 

ImPeakingIt

Rookie
#2,527
Dec 15, 2014
In DAI my frame rate was dropping to the mid 30s in some areas on Ultra; even on Very High it would drop into the 40s. Running on a GTX 780, 3770K, 16GB RAM. Is it fair to say I can expect the same results for W3? Or was DAI poorly optimized, and a lot will depend on how well W3 is optimized?
 

tahirahmed

Rookie
#2,528
Dec 15, 2014
nick_scryer said:
I'm not sure about that, especially when looking at the i5-2500K as the minimum CPU for AC Unity (i7 recommended)... I know nothing is official and we don't know how heavy on the processor Wild Hunt will be... I just want to play it in its full (or at least almost full) glory, but I can only afford one upgrade at the moment. :(
If you believe those requirements, then good luck with the upgrades lol :D

Ubisoft also recommended an i7-4770K as the "Ultra" spec for Watch Dogs, but I ran the game on my i5-4690K with an R9 290 and was always averaging around 60 fps on Ultra settings + 2x MSAA @ 1080p, with max fps sometimes hitting 75 - 80 with vsync off. I even tried it with my old i5-2500K and the result was quite similar, so that i7-4770K requirement was just an exaggeration from Ubisoft.
 

GuyNwah

Ex-moderator
#2,529
Dec 15, 2014
nick_scryer said:
I'm not sure about that, especially when looking at the i5-2500K as the minimum CPU for AC Unity (i7 recommended)... I know nothing is official and we don't know how heavy on the processor Wild Hunt will be... I just want to play it in its full (or at least almost full) glory, but I can only afford one upgrade at the moment. :(
CPU recommendations made by manufacturers are so inflated and presumptuous that I extend them no credibility whatsoever.

The member I was responding to is running an Ivy Bridge Core i5. The hell it can't run anything now on the market.
 

WFMS2

Senior user
#2,530
Dec 15, 2014
dasega said:
hmm - 40 fps is good for me, the human eye wouldn't spot a difference above 40 anyway
Yea, sorry, that's bullshit. :p
We can easily see up to and past 60fps. We can even see the difference at 120fps if we have an appropriate monitor with a 120Hz refresh rate.
 
  • RED Point
Reactions: WildFinn96

sblantipodi

Senior user
#2,531
Dec 15, 2014
dasega said:
http://hexus.net/tech/news/graphics/78185-nvidia-geforce-gtx-960-rumoured-january-release/

Do you think the GTX 960 will be able to max out W3?
In what epoch was a low-end card able to max out a new demanding game?
 
  • RED Point
Reactions: WildFinn96 and LoneWolf

dasega

Senior user
#2,532
Dec 15, 2014
WFMS said:
Yea, sorry, that's bullshit. :p
We can easily see up to and past 60fps. We can even see the difference at 120fps if we have an appropriate monitor with a 120Hz refresh rate.
I suggest you go to the cinema and watch any movie at 24 fps, then come back to your computer and check again whether fps is what makes the difference.
A 40 vs 50 fps difference is nothing except a statistically driven orgasmic feeling.

---------- Updated at 09:02 PM ----------

sblantipodi said:
In what epoch was a low-end card able to max out a new demanding game?
Shadow of Mordor's recommended system requirements list a GTX 670, and GTX 670 < GTX 770 < GTX 960.
 
  • RED Point
Reactions: tahirahmed

DarkFurry

Rookie
#2,533
Dec 15, 2014
system requirements

Hi, I want to know the requirements for The Witcher 3 so I can know what I need to upgrade. What graphics card do we need? Is an Nvidia GTX 770 3GB OK? For Ultra?
 

WFMS2

Senior user
#2,534
Dec 15, 2014
dasega said:
I suggest you go to the cinema and watch any movie at 24 fps, then come back to your computer and check again whether fps is what makes the difference.
A 40 vs 50 fps difference is nothing except a statistically driven orgasmic feeling.
Except a 24fps movie will be 24fps whether you view it in the cinema or at home on a computer screen; it doesn't change when you view it on a 60Hz display.
They're also MOVIES, not games. The way framerates are handled is very different between the two.

Movies are captured from real life using a variety of different cameras that capture light through a lens and save it as frames, either on film or to digital media in more modern cameras.
The effect of this is that movies get their frames blended together naturally with motion blur, because of how the light is captured over the lens exposure time.
Video games have to render their frames instead.
They don't get the benefit of this motion blur. Instead they have to rely on post-process effects to fake it, which still looks terrible because it has to fit within a render budget of 16ms/33ms per frame on top of everything else that has to be rendered. So all we get is terrible-looking motion blur. We're only now starting to get decent stuff like per-object motion blur, which more closely mimics what cameras do, thanks to the new consoles finally being above toaster tier.

Please don't tell me it's "a placebo" or a "statistically driven orgasmic feeling", because that's bullshit.
Or that there's no difference between 40, 50 and 60fps. Because there is.
I don't need a framerate counter in the corner of my screen (nor do most other people who know what they're talking about) to see when a game isn't running at 60fps but rather at 50 or 40.
Stuff like G-Sync and FreeSync is being made for a reason, and not to achieve some statistical orgasm.
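If anyone's wondering where those 16ms/33ms figures come from: the frame budget is just 1000 ms divided by the target frame rate, which is also why a drop from 60 to 40 fps is something you can feel. A rough sketch:

Code:
# Frame-time budget: the renderer gets 1000 ms / fps to finish each frame.
for fps in (24, 30, 40, 50, 60, 120):
    print(f"{fps:>3} fps -> {1000.0 / fps:5.1f} ms per frame")

So 60 fps means roughly 16.7 ms per frame, 30 fps means 33.3 ms, and 40 fps sits at 25 ms.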
 
  • RED Point
Reactions: WildFinn96, LoneWolf and kittehoverlord.863

GuyNwah

Ex-moderator
#2,535
Dec 15, 2014
BrightFury said:
Hi, I want to know the requirements for The Witcher 3 so I can know what I need to upgrade. What graphics card do we need? Is an Nvidia GTX 770 3GB OK? For Ultra?
Join the queue. System requirements have not been given out. The best anybody can do is guess. Our guesses are not likely to be very good, and if you were to spend money based on our guesses, we would feel bad. So please don't.

Current thought is that a single GTX 770 is not going to be enough to carry the game with all the Ultra-level eye candy.
 
Last edited: Dec 15, 2014
  • RED Point
Reactions: dasega

tahirahmed

Rookie
#2,536
Dec 15, 2014
sblantipodi said:
In what epoch was a low-end card able to max out a new demanding game?
A GTX 750 Ti, which was the debut Maxwell GPU (the most low-end in the Maxwell line), can max out Tomb Raider 2013 (still considered one of the most demanding games) on ultra settings @ 1080p while giving playable frame rates, and the same goes for several other games. So it's not always about low end vs high end, as architecture also makes a difference. Today's high end could be beaten by tomorrow's low end.

http://www.techspot.com/review/783-geforce-gtx-750-ti-vs-radeon-r7-265/page4.html
 
Last edited: Dec 15, 2014

dasega

Senior user
#2,537
Dec 15, 2014
WFMS said:
Except a 24fps movie will be 24fps whether you view it in the cinema or at home on a computer screen; it doesn't change when you view it on a 60Hz display. [...]
Read your post: you just proved that a lot of other stuff matters more than fps. Anyway, wow, you made a lot of effort copying & pasting Wikipedia; sorry, we are on different pages, I can not see the things you have read about when playing games.
IMO silicon semiconductor advancement has reached its limits, and now you will be fed a 10 fps / 200 dpi improvement every year, and that's the bullshit AMD/Nvidia is selling you.
 
Last edited: Dec 15, 2014

GuyNwah

Ex-moderator
#2,538
Dec 15, 2014
Moderator: The tone of the argument about how much frame rates matter has descended to a level where members have already questioned each other's competence and insults are about to be traded. So the time to cease arguing about who is right about how much frame rates matter is now.
 
  • RED Point
Reactions: WildFinn96 and kittehoverlord.863

dasega

Senior user
#2,539
Dec 15, 2014
Guy N'wah said:
Moderator: The tone of the argument about how much frame rates matter has descended to a level where members have already questioned each other's competence and insults are about to be traded. So the time to cease arguing about who is right about how much frame rates matter is now.

Owkay, Peace Brothers, Hawk!

Moderator: "Cease arguing now" is not an invitation to repeat arguments. Thank you for not doing so in the future.

---------- Updated at 11:16 PM ----------

View attachment 8582
 

Attachments

  • cute_spongebob_by_leilush12-d4j9j8u.jpg (218.4 KB)
Last edited by a moderator: Dec 15, 2014
  • RED Point
Reactions: GuyNwah and tahirahmed

nick_scryer

Rookie
#2,540
Dec 16, 2014
Guy N'wah said:
CPU recommendations made by manufacturers are so inflated and presumptuous that I extend them no credibility whatsoever.

The member I was responding to is running an Ivy Bridge Core i5. The hell it can't run anything now on the market.
I'm still not quite convinced. Dying Light, another upcoming next-gen-only game (made in Poland), also lists the i5-2500 as the minimum CPU to run it. And according to CPUBoss, the i5-2500 is on par with my 3450... And frankly, I don't want to barely meet the Witcher's minimum, especially if I buy a high-end graphics card.
 