Forums - CD PROJEKT RED
960 runs it on High @ 1080p and Titan X on Ultra @ 4K

Page 6 of 9
blizzzilla

Rookie
#101
May 15, 2015
Static-Jak, your CPU is still faster than the minimum required AMD CPU, the Phenom II X4 940. My guess is that any i5 since the 2000 series will run the game fine. But only benchmarks can really give an answer.
 
uthvag

Rookie
#102
May 15, 2015
Static-Jak said:
My 970 has never been my concern, it's my i5-2500 (non K) I worry won't be able to keep up.

I was considering upgrading but at this point I want to hold off till the Skylake Intel CPUs come out.

All I really want to know is what my set up can do at 60FPS.

After that, I can worry about maxing it out a year or two from now when I do my second playthrough with all the DLC out.
It mostly won't be a problem, although I'd suggest waiting for benchmarks.
 
Static-Jak

Rookie
#103
May 15, 2015
uthvag said:
It mostly won't be a problem, although I'd suggest waiting for benchmarks.
Hopefully we see benchmarks soon. I'd like to dive into this game without too much fiddling with the settings.
 
Puksy

Rookie
#104
May 15, 2015
Hi guys, I just saw this on the GeForce website, and I'm very happy with my 970. I'll try HairWorks and lower some settings if it's really beautiful, and use my dedicated PhysX card.
1920x1080, Low settings: GTX 960
1920x1080, Medium settings: GTX 960
1920x1080, High settings: GTX 960
1920x1080, Uber settings: GTX 970
1920x1080, Uber settings w/ GameWorks: GTX 980
2560x1440, Uber settings: GTX 980
2560x1440, Uber settings w/ GameWorks: GTX TITAN X, or 2-Way SLI GTX 970
3840x2160, Uber settings: GTX TITAN X, or 2-Way SLI GTX 980
3840x2160, Uber settings w/ GameWorks: 2-Way SLI GTX 980 or GTX TITAN X
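As a rough sanity check on why the recommended card jumps a tier at each resolution step, here's the pixel-count arithmetic (my own numbers, not from the NVIDIA article):

```python
# GPU load scales roughly with pixels shaded per frame, so each
# resolution step explains much of the jump in recommended card.
resolutions = {
    "1920x1080": 1920 * 1080,
    "2560x1440": 2560 * 1440,
    "3840x2160": 3840 * 2160,
}
base = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} px ({pixels / base:.2f}x the work of 1080p)")
# 1440p is ~1.78x the pixels of 1080p, and 4K is exactly 4x --
# hence the jump straight to Titan X / SLI at the top end.
```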
 
D4rKy2014

Rookie
#105
May 15, 2015
Puksy said:
Hi guys just saw this on Geforce website, and I am very happy with my 970. I'll try hairworks and lower some settings if it is really beautiful and use my dedicated Physx card
Don't forget that the system it's tested on has this CPU and RAM: an Intel i7-5960X and 16GB of DDR4.
 
sonicdart

Rookie
#106
May 15, 2015
tjapstijl said:
Have you guys seen this?

http://www.geforce.com/whats-new/articles/the-witcher-3-wild-hunt-is-your-system-ready

1920x1080, Low settings GTX 960
1920x1080, Medium settings GTX 960
1920x1080, High settings GTX 960
1920x1080, Uber settings GTX 970
1920x1080, Uber settings w/ GameWorks GTX 980
2560x1440, Uber settings GTX 980
2560x1440, Uber settings w/ GameWorks GTX TITAN X, or 2-Way SLI GTX 970
3840x2160, Uber settings GTX TITAN X, or 2-Way SLI GTX 980
3840x2160, Uber settings w/ GameWorks 2-Way SLI GTX 980 or GTX TITAN X

Also, what Nvidia is saying is a little weird:
The Witcher 3: Wild Hunt’s May 18th release is tantalizingly close. On PC, the experience will be enhanced with higher-resolution textures, higher-quality effects, higher levels of detail, uncapped framerates, and many other enhancements. And like the latest and greatest PC games, The Witcher 3: Wild Hunt is further enhanced with NVIDIA technologies and GameWorks effects that increase immersion and image quality throughout the 100-hour action RPG.

Higher-resolution textures?? Weren't they the same as on console?

Also, the difference between SSAO and HBAO+ is big.
Far Cry 4 SSAO: http://international.download.nvidia.com/geforce-com/international/images/far-cry-4/far-cry-4-ambient-occlusion-002-ssao.png
Far Cry 4 SSAO: http://international.download.nvidia.com/geforce-com/international/images/far-cry-4/far-cry-4-ambient-occlusion-008-ssao.png
Far Cry 4 HBAO+: http://international.download.nvidia.com/geforce-com/international/images/far-cry-4/far-cry-4-ambient-occlusion-002-nvidia-hbao-plus.png
Far Cry 4 HBAO+: http://international.download.nvidia.com/geforce-com/international/images/far-cry-4/far-cry-4-ambient-occlusion-008-nvidia-hbao-plus.png

So far we've only seen ultra gameplay with SSAO.
Is that at 60 fps? Also, I wonder how much it would dip if I used GameWorks on a 970.
 
uthvag

Rookie
#107
May 15, 2015
sonicdart said:
Is that at 60 fps? Also, I wonder how much it would dip if I used GameWorks on a 970.
I wouldn't trust Nvidia with the system specs; they're always trying to sell their top-end hardware. The requirements for Nvidia-partnered games like AC Unity and Shadow of Mordor were insanely high (GTX 680, 770, etc.), while my 860M can do high/ultra in Dragon Age: Inquisition (no MSAA, ultra textures and mesh FX) and AC Unity (with FXAA and HBAO+) at 30+ FPS.

Plus, it will mostly improve with the latest drivers and patches.

You should be more worried about AMD hardware, though.

---------- Updated at 12:44 PM ----------

sonicdart said:
Have you guys seen this?

http://www.geforce.com/whats-new/art...r-system-ready
It also says that it releases on Monday.

PS: they say they tested on an old build, so performance should mostly be better.
 
Last edited: May 15, 2015
sonicdart

Rookie
#108
May 15, 2015
uthvag said:
I wouldn't trust Nvidia with the system specs; they're always trying to sell their top-end hardware.
My friend is worried he won't be able to play on high with a 770 4GB. I basically told him not to worry, lol.
 
uthvag

Rookie
#109
May 15, 2015
Static-Jak said:
Hopefully we see benchmarks soon. I'd like to dive into this game without too much fiddling with the settings.
Cheers, mate. Nvidia said this is not a CPU-intensive game.
 
Sainur

Rookie
#110
May 15, 2015
uthvag said:
nvidia said that this was not a cpu intensive game
Good. Hopefully CDPR has taken the time to optimize this properly. Skyrim, for example, can still run like shite due to memory issues and not being optimized for modern 4-6 core CPUs.
 
sonicdart

Rookie
#111
May 15, 2015
uthvag said:
cheers for you mate , nvidia said that this was not a cpu intensive game
Which is weird, considering how much I hear about how busy the towns will be. Various aspects such as trees blowing in the wind and the draw distance are things I'd assume would be intensive on the CPU, since that's what usually taxes it.
 
Edo34

Rookie
#112
May 15, 2015
Can't wait to see what this game actually looks like on ultra settings, especially compared to PS4 and XB1.

---------- Updated at 01:03 PM ----------

Also... I can't believe that a developer who until recently was PC-centered would include AA only as an on/off option.
 
uthvag

Rookie
#113
May 15, 2015
sonicdart said:
My friend is worried he won't be able to play on high with a 770 4GB. I basically told him not to worry, lol.
Show him this (the 770 is almost the same as the 960).

View attachment 14135

---------- Updated at 01:12 PM ----------

sonicdart said:
Which is weird, considering how much I hear about how busy the towns will be. Various aspects such as trees blowing in the wind and the draw distance are things I'd assume would be intensive on the CPU, since that's what usually taxes it.
True, but look at AC Unity: though unoptimized, it has thousands of NPCs revolting, and it takes around 40% of my CPU (an i7-4710 @ 2.4 GHz, a mobile CPU).

I also have the 2500K; trust me, it holds up well.

The PhysX runs on the CPU because the consoles use APUs.

---------- Updated at 01:13 PM ----------

Aiden Pearce said:
Can't wait to see what this game actually looks like on ultra settings, especially compared to PS4 and XB1.

Also... I can't believe that a developer who until recently was PC-centered would include AA only as an on/off option.
You can force it through the NVCP. Also, 8x MSAA on all that grass... a nightmare.
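To put a rough number on why 8x MSAA is a nightmare, here's a back-of-the-envelope framebuffer estimate (my own arithmetic; real drivers compress MSAA surfaces, so treat this as an upper bound):

```python
# Each MSAA sample stores its own color and depth, so raw framebuffer
# size scales linearly with the sample count (ignoring driver compression).
width, height, samples = 1920, 1080, 8
bytes_color = 4   # assuming an RGBA8 color buffer
bytes_depth = 4   # assuming a D24S8 depth/stencil buffer
pixels = width * height
no_msaa_mb = pixels * (bytes_color + bytes_depth) / 1024**2
msaa_mb = pixels * samples * (bytes_color + bytes_depth) / 1024**2
print(f"no MSAA: ~{no_msaa_mb:.0f} MB, 8x MSAA: ~{msaa_mb:.0f} MB")
```

And that's before alpha-tested foliage forces extra per-sample shading work at every blade edge.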
 
Siven80

Forum regular
#114
May 15, 2015
Hmm, so according to http://international.download.nvidia.com/webassets/en_US/shared/images/products/shared/lineup-full.png

My 670 is just short of a 960, but I have to take into account the better tessellation performance the 900 series has, and also the better processor and DDR4 RAM Nvidia used in their results (compared to my 8GB of DDR3 RAM and 2500K CPU).

I find myself in a difficult position. Do I stick it out with the 670 for a while longer, or sell it while it still sells well and buy a 970, which should last me around 2.5-3 years like my 670 has?
I'm really tempted by the 970: it's DX12-ready, will last a good few years, and is very quiet, which I like.
 
sonicdart

Rookie
#115
May 15, 2015
Siven80 said:
I find myself in a difficult position. Do i stick it out with the 670 for a while longer, or sell it while it still sells well and buy a 970 which will last me around 2.5 - 3 years like my 670 has.

Well, if you buy a 970/980 right now, they have a deal where you get The Witcher 3 / Batman: Arkham Knight as a code. Sooo yeah, I felt like shooting myself in the foot, seeing as I bought my 970 like 3 months before this.


http://www.geforce.com/witcher-3-batman-ak-bundle

here you go
 
Siven80

Forum regular
#116
May 15, 2015
sonicdart said:
Well, if you buy a 970/980 right now, they have a deal where you get The Witcher 3 / Batman: Arkham Knight as a code.
Yeah, I saw the bundle a bit ago, but I already have W3 preordered on Steam.

But I could sell the two free games + my 670 for a good amount, which would lower the price of the 970... so tempting.
 
uthvag

Rookie
#117
May 15, 2015
Siven80 said:
My 670 is just short of a 960, but have to take into account the better tessellation performance the 900 series has, and then take into account the better processor and DDR4 RAM Nvidia have used in the results too. (compared to my 8GB DDR3 RAM and 2500k CPU).
Trust me, I build PCs as a hobby: you won't find a difference between 1333 MHz CL9 DDR3, 2400 MHz CL7 DDR3, or 3200 MHz DDR4 IN GAMING (3D rendering is entirely different).

Also, the latency of DDR4 is high, so high-end DDR3 is usually cheaper and gives similar or better performance.

CPUs: you don't need an i7. Although 6 cores give the best performance currently, scaling is bad above 6 cores nowadays. Also, the performance difference between 4 and 6 cores isn't much, as most 4-core i5s aren't being pushed to the max.

Note that this is purely speculation based on games currently on the market. For concrete evidence, I suggest waiting for benchmarks.
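The DDR3-vs-DDR4 latency point is easy to sanity-check: first-word latency in nanoseconds is CL x 2000 / data rate in MT/s. A quick sketch (the kits below are illustrative examples, not specific products):

```python
# First-word latency (ns) = CAS latency * 2000 / data rate (MT/s).
# The factor 2000 converts the MT/s rate into a clock period in ns,
# since DDR transfers twice per clock.
def latency_ns(cas: int, mts: int) -> float:
    return cas * 2000 / mts

kits = [
    ("DDR3-1333 CL9", 9, 1333),    # budget DDR3
    ("DDR3-2400 CL10", 10, 2400),  # high-end DDR3
    ("DDR4-2133 CL15", 15, 2133),  # typical early DDR4
]
for name, cas, mts in kits:
    print(f"{name}: {latency_ns(cas, mts):.1f} ns")
```

High-end DDR3 comes out around 8 ns while early DDR4 is around 14 ns, which is the "DDR4 latency is high despite the bigger MHz number" point in the post above.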

---------- Updated at 01:57 PM ----------

About the tessellation: haven't they reduced/removed it in W3?

Also, about the 670: stick with it at least until you get a glimpse of the AMD 3xx series, which offers a lot of features (especially for 4K, thanks to HBM). If you can wait till 2016 or are an Nvidia fan, the Pascal architecture, offering the same advantages as the AMD 3xx, is going to be released soon.
 
Last edited: May 15, 2015
il_corleone_manu

Rookie
#118
May 15, 2015
Static-Jak said:
My 970 has never been my concern, it's my i5-2500 (non K) I worry won't be able to keep up.
I had the same concern as you, but I managed to OC mine from 3.3 to 3.7 in the BIOS (only modifying the multiplier, and not on the K version, same as you have), with not much temperature change. And if you look at the new high-end i5s, the i5-4670 (3.4 to 3.8) and the 4690 (3.5 to 3.9), there's almost no change. I'd say do the same simple overclock: our CPUs are still tier 1 for gaming, and we should wait till Skylake or next year.

Also, the CPU specs don't seem very realistic.
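The multiplier-only overclock described above is just arithmetic: core clock = base clock x multiplier. A quick illustration (assuming Sandy Bridge's stock 100 MHz base clock; non-K chips only expose a few extra multiplier bins, which is why the BCLK stays put):

```python
# Core clock = BCLK * multiplier; on non-K Sandy Bridge parts only the
# multiplier moves, and only within a limited range.
bclk_mhz = 100
for mult in (33, 37):  # 33 = i5-2500 stock, 37 = the overclock described above
    print(f"x{mult} -> {bclk_mhz * mult / 1000:.1f} GHz")
```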
 
Last edited: May 15, 2015
SelfIgnition

Rookie
#119
May 15, 2015
uthvag said:
trust me i build pcs as a hobby, and you wont find a difference between 1333mhz CL9 ddr3 with a 2400mhz cl7 ddr3 or a 3200mhz cl ddr4 IN GAMING( 3D rendering is entirely different)
I will not agree on this one.
According to this article (google translate it): http://www.purepc.pl/pamieci_ram/jakie_pamieci_ram_wybrac_test_ddr3_13332400_mhz_w_grach
you can get a 7-20% boost in fps (depending on game, game settings, CPU architecture).
Generally, if you play without any anti-aliasing, quality RAM helps to achieve higher framerates.


It might not be much, but good RAM isn't expensive, especially compared to the price of a new CPU or GPU.
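For scale, here's what that 7-20% range works out to at a 60 fps baseline (illustrative arithmetic only; the actual gain depends on the game, settings, and CPU, as the article says):

```python
# A percentage fps uplift applied to a 60 fps baseline.
base_fps = 60
for pct in (7, 20):  # the range reported in the linked article
    print(f"+{pct}% -> {base_fps * (1 + pct / 100):.1f} fps")
```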
 
Last edited: May 15, 2015
Maydawn

Rookie
#120
May 15, 2015
Everywhere I look I see reports on Nvidia cards, but sadly I made a mistake some time ago and got myself an AMD R9 280. Any idea how it would do? Other than that, I have an i5-4690K and 16GB of RAM.

EDIT: Just for clarification, I know pretty much nothing about PC hardware, and I have no idea which Nvidia card mine compares to. All I know is that buying that AMD card wasn't a good idea, or so I was told by people who know much better than me. <.<
 
© 2018 CD PROJEKT S.A. ALL RIGHTS RESERVED

The Witcher® is a trademark of CD PROJEKT S. A. The Witcher game © CD PROJEKT S. A. All rights reserved. The Witcher game is based on the prose of Andrzej Sapkowski. All other copyrights and trademarks are the property of their respective owners.

Forum software by XenForo® © 2010-2020 XenForo Ltd.