Forums - CD PROJEKT RED
Predicted Witcher 3 system specs? Can I run it?

Page 60 of 134
ChrisStayler

Senior user
#1,181
Jun 20, 2014
theLaughingStorm said:
Turning down the resolution to "max" the game is counterproductive if your goal is the best possible image quality. There is a reason people pay through the nose for higher resolution monitors.
Is there a big difference in performance from 1080p to 1440p? How much fps will you lose by jumping to 1440p? I know it's guesswork, but if you had to guess.
 
theLaughingStorm.108

Rookie
#1,182
Jun 20, 2014
ChrisStayler said:
Is there a big difference in performance from 1080p to 1440p? How much fps will you lose by jumping to 1440p? I know it's guesswork, but if you had to guess.
Well, if you look back at the benchmark for Crysis 3 that you linked earlier and compare the 1080p performance to the 1440p performance, the minimum fps goes from 61 at 1080p to 38 at 1440p. That's a 23-frame difference with all the same settings, or about a 38% difference in fps. Now, I think that is an extreme example; most of the time the performance hit would be closer to 30% than 40%. But this is as good an example as any. You can expect at least a 30% reduction in frame rate with the same settings going from 1080p to 1440p.

For further reference, 1440p has roughly 78% more pixels than 1080p. So you are NOT taking a 78% performance hit; that would be ridiculous.
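A quick check of that arithmetic, as a minimal Python sketch (the 61/38 fps figures are the benchmark minimums above; the rest is just standard resolution math for 1920x1080 and 2560x1440):

Code:
# Minimum fps from the Crysis 3 benchmark quoted above
fps_1080 = 61
fps_1440 = 38

drop_pct = (fps_1080 - fps_1440) / fps_1080 * 100    # ~37.7% lower fps
pixels_1080 = 1920 * 1080                            # 2,073,600 pixels
pixels_1440 = 2560 * 1440                            # 3,686,400 pixels
extra_pct = (pixels_1440 / pixels_1080 - 1) * 100    # ~77.8% more pixels

print(f"fps drop: {drop_pct:.1f}%, extra pixels: {extra_pct:.1f}%")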
 
Last edited: Jun 20, 2014
ChrisStayler

Senior user
#1,183
Jun 20, 2014
theLaughingStorm said:
Well, if you look back at the benchmark for Crysis 3 that you linked earlier and compare the 1080p performance to the 1440p performance, the minimum fps goes from 61 at 1080p to 38 at 1440p. That's a 23-frame difference with all the same settings, or about a 38% difference in fps. Now, I think that is an extreme example; most of the time the performance hit would be closer to 30% than 40%. But this is as good an example as any. You can expect at least a 30% reduction in frame rate with the same settings going from 1080p to 1440p.

For further reference, 1440p has roughly 78% more pixels than 1080p. So you are NOT taking a 78% performance hit; that would be ridiculous.
And what about going from 1080p down to 1366x768? How much more fps would you get?
 
GuyNwah

Ex-moderator
#1,184
Jun 20, 2014
ChrisStayler said:
And what about going from 1080p down to 1366x768? How much more fps would you get?
It depends on how output-bound the game is. Some games, TW2 is maybe the best example, are thoroughly output-bound. There, the frame rate is almost an exact inverse of the pixel count. Games like Crysis 3 are not as strongly output-bound, and the slope will be different.

But here's the problem: @theLaughingStorm is trying to draw a proportion of fps to pixels, but the relationship is reciprocal, not linear, and that makes the arithmetic very different. Make the comparison on frame time instead, which is the inverse of frame rate, and you will have a correct view of the effect of pixel count.

61 fps = 16.4 msec
38 fps = 26.3 msec
The frame time increased by 60%: that is your performance hit when you increase the pixels by 78%. It's a much stronger dependency on pixel count than the incorrect comparison on frame rate would indicate.
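If you want to redo that frame-time arithmetic yourself, here's a minimal Python sketch (frame_time_ms is just an illustrative helper; the 61/38 fps inputs are the Crysis 3 minimums above). The last part is a hypothetical extrapolation that assumes a fully output-bound game, which also gives a rough answer to the earlier 1366x768 question:

Code:
def frame_time_ms(fps):
    # Frame time is the reciprocal of frame rate
    return 1000.0 / fps

t_1080 = frame_time_ms(61)    # ~16.4 ms
t_1440 = frame_time_ms(38)    # ~26.3 ms

# The real performance hit: how much longer each frame takes
hit_pct = (t_1440 - t_1080) / t_1080 * 100
print(f"frame time increase: {hit_pct:.0f}%")    # ~60%

# Hypothetical: scale the 61 fps 1080p figure by pixel count, as if the
# game were perfectly output-bound. Crysis 3 is not (measured 38 fps at
# 1440p vs ~34 predicted), so treat these as rough estimates only.
base_pixels = 1920 * 1080
for name, px in [("1366x768", 1366 * 768), ("2560x1440", 2560 * 1440)]:
    est_fps = 61 * base_pixels / px
    print(f"{name}: ~{est_fps:.0f} fps if fully output-bound")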
 
ChrisStayler

Senior user
#1,185
Jun 21, 2014
http://www.kdramastars.com/articles/19164/20140408/witcher-3-pc-max-settings.htm
Well, according to this (it could be false, but let's just say it's not), the GTX 780 Ti will run the Witcher 3 PC version at 35-45 fps at 1080p max settings with 8x MSAA enabled. How can TW3 run maxed at 60 fps without losing too much eye candy?
1. Released state: gain 5 fps
2. Turn off physics: gain 10 fps
3. Lower to 2x MSAA: gain 10 fps

That would then come to 60-70 fps.
 
Last edited: Jun 21, 2014
GuyNwah

Ex-moderator
#1,186
Jun 21, 2014
This kind of analysis is still based on faulty arithmetic. Frame rates are reciprocal, not linear. If you try to add or subtract them, you get a false picture of differences or improvements.

Getting from 35 fps to 60 fps is not a 25 fps (about 71%) improvement. It's going from 1/35 to 1/60 seconds per frame, and that's about a 41% improvement. So the work needed to render a frame needs to be decreased by about 40 percent to make this happen.
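The same point in code form, a minimal sketch with only the 35 and 60 fps figures from above as inputs:

Code:
t_35 = 1000.0 / 35    # ~28.6 ms per frame
t_60 = 1000.0 / 60    # ~16.7 ms per frame

naive_pct = (60 - 35) / 35 * 100         # ~71%: the misleading fps view
work_pct = (t_35 - t_60) / t_35 * 100    # ~41.7%: frame-time cut needed

print(f"fps view: +{naive_pct:.0f}%, frame-time view: -{work_pct:.1f}%")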

By my calculation, TW2 needs about 200-250 instructions to render a pixel. A 40% decrease means you have to make it so that 80-100 of those instructions go away (or get executed across more parallel processors). That's a huge challenge for the programmers. Part of that is writing better code, and the other part is taking advantage of newer technology such as multiple rendering threads.

Mantle, the latest editions of DirectX 11 (I know Windows 8.1 is unpopular for other good reasons, but the graphics stack is much improved), and the latest OpenGLs all do this. I'm sure the consoles do too, though they're not coded to publicly known APIs: those 8 cores have to be good for something, and multithread rendering is about the best thing they're good for in a console. That's why they're of such great and immediate interest.
 
Last edited: Jun 21, 2014
curlyhairedboy

Senior user
#1,187
Jun 21, 2014
I heard the 880 is gonna be cheaper than the 780 Ti...
 
ChrisStayler

Senior user
#1,188
Jun 21, 2014
curly haired boy said:
I heard the 880 is gonna be cheaper than the 780 Ti...
But then would that make it a less powerful GPU? I mean, if it's cheaper and better than the 780 Ti, then that would be too good to be true. But then again, I heard the TITAN is more expensive than the 780 Ti and less powerful as well.
 
GuyNwah

Ex-moderator
#1,189
Jun 21, 2014
ChrisStayler said:
But then would that make it a less powerful GPU? I mean, if it's cheaper and better than the 780 Ti, then that would be too good to be true. But then again, I heard the TITAN is more expensive than the 780 Ti and less powerful as well.
Not at all. Different generation, different architecture. When there is an 880, it won't be Kepler technology, it will be Maxwell (even if it is a reworked 28nm Maxwell). We've already seen what Maxwell can do on a small scale with the 750/750 Ti.

Whether it's cheaper or not will be a different question. If they can't get 20nm into production, it probably won't be cheaper, because big chips are expensive to make.

Anyway, Titan's not for gamers, it's for number crunchers, and even if it's not the equal of the 780/780Ti at crunching textures, it blows the magic smoke off them in double precision. In its intended market, it's a flaming bargain.
 
Last edited: Jun 21, 2014
ChrisStayler

Senior user
#1,190
Jun 21, 2014
http://www.tomshardware.com/reviews/geforce-gtx-780-ti-review-benchmarks,3663-7.html
This link really gets me thinking. It shows how the 7990 runs Crysis 3 much better than the 780 Ti, and that's because Crysis 3 is AMD-friendly. TW3 will probably run better on Nvidia GPUs, and that's one reason why I'm interested in getting an Nvidia GPU. Of course, there was not that big of a difference when Crysis 3 launched, but now there is because of the updates as well as Mantle.
 
ChrisStayler

Senior user
#1,191
Jun 21, 2014
GTX 580 Release - November 9th, 2010
GTX 680 Release - March 22, 2012
GTX 780 Release - May 23, 2013
GTX 880 - Fall 2014? - Has to be this year right?

They might release the 880 before TW3 comes out. For some reason, in my gut I think it will be out next year. I hope not.
 
Krojek

Rookie
#1,192
Jun 21, 2014
Will it work on my PC?

Will the game work with these specs: 8 GB RAM, a GTX 650 Ti Boost, and an i5?
 
Osiris666

Senior user
#1,193
Jun 21, 2014
I'm not even asking about my GTX 260 :D BUT it handled TW2 on Ultra, just with Ubersampling off and vsync off ;)
 
mavowar

Senior user
#1,194
Jun 21, 2014
ChrisStayler said:
http://www.tomshardware.com/reviews/geforce-gtx-780-ti-review-benchmarks,3663-7.html
This link really gets me thinking. It shows how the 7990 runs Crysis 3 much better than the 780 Ti, and that's because Crysis 3 is AMD-friendly. TW3 will probably run better on Nvidia GPUs, and that's one reason why I'm interested in getting an Nvidia GPU. Of course, there was not that big of a difference when Crysis 3 launched, but now there is because of the updates as well as Mantle.
A 7990 is a dual-GPU card... that is why it runs it better. Compare that 7990 to 780 SLI or 780 Ti SLI and then you get the real picture.
 
DriesNL

Rookie
#1,195
Jun 21, 2014
So how long do you guys generally last with your CPUs? I'm asking because in the summer of 2015 I'll be upgrading my 2500K @ 3.3GHz to either a Haswell(-E?) or a Broadwell. The reason I'll be upgrading is that I've set my sights on a GTX 800 series card, probably the GTX 880, and it just doesn't feel logical to run a sparkly new GPU on a five-year-old CPU. So how much gain should I get from upgrading my 2500K? It feels kinda stupid saying this, but it may affect the advice you'll give: money isn't really an issue.

Would you guys consider it fairly necessary, a complete waste of money, or somewhere in between?

Props to all you guys informing the lesser informed btw, good stuff going on here.
 
alextyc1

Rookie
#1,196
Jun 21, 2014
Well, the i5 2500 is a pretty strong CPU even now. In all the game benchmarks I check regularly (they test different CPUs' performance), it is on par with the i7 2600/i5 4670 (with a maximum difference of 5 frames).
If I were you, I would just spend my money on a new GPU. Your CPU is more than enough for current games, unless they're not optimized of course (ahem *Ubisoft* ahem).
 
Last edited: Jun 21, 2014
GuyNwah

Ex-moderator
#1,197
Jun 21, 2014
Upgrading CPUs is mostly a waste of money. Before an upgrade is worth it, you actually have to have a need to run programs that require greater performance than you get now.

Dual-core Core 2s and nasty old things like the Athlon 64 X2 and original Phenoms are the only things that really have to be upgraded. In particular, any Core i7 and any Sandy Bridge Core i5 don't need an upgrade unless you're a professional number cruncher.

Heavy SLI or Crossfire setups that push the bandwidth of earlier PCI-Express systems are the only things where the CPU is likely to become a bottleneck.

Everything else is just having the latest CPU for bragging rights.
 
Last edited: Jun 21, 2014
501105

Forum veteran
#1,198
Jun 21, 2014
Guy N'wah said:
Upgrading CPUs is mostly a waste of money. Before an upgrade is worth it, you actually have to have a need to run programs that require greater performance than you get now.

Dual-core Core 2s and nasty old things like the Athlon 64 X2 and original Phenoms are the only things that really have to be upgraded. In particular, any Core i7 and any Sandy Bridge Core i5 don't need an upgrade unless you're a professional number cruncher.

Heavy SLI or Crossfire setups that push the bandwidth of earlier PCI-Express systems are the only things where the CPU is likely to become a bottleneck.

Everything else is just having the latest CPU for bragging rights.
Which is kind of sad when you think about it; the CPU market has become very predictable and slow when it comes to new tech that actually matters.
 
GuyNwah

Ex-moderator
#1,199
Jun 21, 2014
501105 said:
Which is kind of sad when you think about it; the CPU market has become very predictable and slow when it comes to new tech that actually matters.
Well, the high performance computing market (servers, pro workstations, number crunchers) needs those new CPUs. Every bit of straight performance and performance per watt means money in the bank to them. For ordinary gamers, the point of diminishing returns is probably around the Sandy Bridge Core i5. But even in the high performance market, the advances over Sandy Bridge-E are merely fractional.
 
Last edited: Jun 21, 2014
WFMS2

Senior user
#1,200
Jun 21, 2014
501105 said:
Which is kind of sad when you think about it; the CPU market has become very predictable and slow when it comes to new tech that actually matters.
In the consumer CPU market, yeah. Mostly because there's no demand for more powerful CPUs, and also because AMD isn't much of a competitor in the consumer space.
As far as I know it's a lot different in the professional market: servers, low-power chips for mobile, and various other things.
That is where Intel is putting most of its effort right now, trying to compete with ARM and Qualcomm. (Losing pretty badly, mind you.)
They even spent a ridiculous amount of money on a new fab a while back for this stuff, and it was mostly a waste of money.
Edit: Guy N'wah pretty much beat me to it. :p
 