Predicted Witcher 3 system specs? Can I run it?

Hello everybody. I hope somebody here can help me with a bit of a problematic situation. My laptop recently died on me, so I have to get a new one. I'm going to buy a good multimedia laptop with a price tag of around €1100. My previous laptop was the Asus N53SV, which I was very pleased with. Now, I'm a huge fan of The Witcher and I have to play The Witcher 3. I know that I won't be able to play the game on high settings with a multimedia laptop, but that's okay; I played almost all my games on medium on my previous laptop. I was hoping that someone here would know what a good multimedia laptop is at the moment that would be able to run The Witcher 3. I am aware that the system requirements are not out yet, but I'm hoping someone could give me some indication of how current multimedia laptops will handle The Witcher 3's settings. Thanks in advance and greets from Holland.

- - - Update - - -

So I found a good multimedia laptop myself: the ASUS N750JK-T4193H. These are the most important specs:

17.3-inch FHD • Intel Core i7-4710HQ • 8 GB RAM • 500 GB HDD • NVIDIA GeForce GTX 850M (2 GB)

Do these specs look any good? Also, it doesn't have an SSD; will that be important?

Thanks in advance and greets from Holland
 
Would you go with a 970 or a 980 for 1080p max settings in The Witcher 3?

My rig: INTEL i7-4770K + SCYTHE MUGEN 4 - GIGABYTE Z97X-UD5H - 8GB DDR3 1600 G.SKILL - GIGABYTE GTX 770 OC 4GB - ZALMAN Z11 PLUS
SAMSUNG SATA III EVO SSD 250GB - SAMSUNG SATA II SPINPOINT 500GB - BE QUIET PURE POWER L8 630W MODULAR - LG 22EA63V IPS
 
Mobo: Asus Maximus VII Ranger
CPU: Intel Core i5-4690K
RAM: Corsair Vengeance Pro 2133-8GB
Fan: Cooler Master Hyper 212 EVO
GPU: Asus Strix GeForce GTX970
PSU: Corsair RM 850W
(Case: Corsair Graphite 780T)
Total: ~€1200

Opinions would be appreciated. :D

Edit: I want to go SLI in 1-2 years, hence the high-wattage PSU. And I already have 2.5 TB of HDD space.
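For what it's worth, a back-of-the-envelope power budget suggests 850 W is comfortable for that plan. The sketch below uses the published TDP figures for the i5-4690K and GTX 970, not measured wall draw, and the 75 W allowance for the rest of the system is only an assumption:

```python
# Rough PSU budget for a planned GTX 970 SLI pair (TDP figures, not measured draw).
cpu_tdp_w = 88         # Intel Core i5-4690K TDP
gpu_tdp_w = 145        # single GTX 970 TDP (factory-OC Strix cards may pull a bit more)
gpu_count = 2          # planned SLI pair
rest_of_system_w = 75  # assumed allowance for board, RAM, drives, fans

estimated_load_w = cpu_tdp_w + gpu_count * gpu_tdp_w + rest_of_system_w
psu_w = 850

print(f"Estimated peak load: ~{estimated_load_w} W of {psu_w} W "
      f"({estimated_load_w / psu_w:.0%} of the PSU)")
```

That lands around 450-500 W even with both cards loaded, so the 850 W unit leaves plenty of headroom for SLI plus some overclocking.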

Should run TW3 on High-Ultra High in my opinion. Do you have an SSD?
 
Quoted for truth.

Once you overclock, you are working at your own risk. It is entirely possible to destroy hardware by running it at full continuous load while overclocked, especially if it is also overvolted.

But blaming the benchmark programs for causing the failure is really not accurate. The failure was caused by the decision to overclock and then run at full load without being ready to back down really fast at the first sign of trouble.

But ONLY the 3DMark Fire Strike Extreme Combined test does that.
In the past two months I have had 9 different GTX 780 Ti cards and 4 GTX 980s, and ONLY the 780 Ti Lightning and one of the 980 reference cards died, during this test.
The 780 Ti Lightning was at stock voltage...

3DMark Fire Strike (the newest version) has some kind of voltage peak bug during the Combined test.

I agree with Guy N'wah here. You cannot compare one test program with another because they work differently; some features present in one test may not be in others, so basically it all comes down to your decision to run a strong OC. If a test program is killing GPUs at stock clocks then you can blame it, but not after an OC.

Fire Strike is currently known to be one of the most intensive GPU tests, so maybe it pushed your card more than it could handle while overclocked. One thing I learned from my overclocking experience is to never OC your hardware to satisfy benchmarks, because it's like an addiction and you keep pushing for higher scores; that kind of OC is best left to reviewers and pros, who usually kill hardware to test it. As a consumer you should only OC until you get decent game performance, and even then carefully, in small increments.
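If you do go the small-increments route, a minimal monitoring loop like the sketch below can help you spot trouble early. It assumes an NVIDIA card with the nvidia-smi utility available on the PATH, and the 85 °C threshold is only an illustrative back-off point, not a vendor specification:

```python
# Minimal GPU monitor to run alongside a benchmark while stepping an overclock up.
import subprocess
import time

QUERY = "temperature.gpu,power.draw,clocks.current.graphics"
TEMP_LIMIT_C = 85  # illustrative back-off threshold, not a vendor figure

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    temp_c, power_w, clock_mhz = (v.strip() for v in out.split(","))
    print(f"temp={temp_c} C  power={power_w} W  core={clock_mhz} MHz")
    if float(temp_c) >= TEMP_LIMIT_C:
        print("Temperature limit reached -- back the clocks down.")
        break
    time.sleep(5)
```

Run it in a second terminal while the stress test loops, and back off at the first sign of temperatures or power creeping toward your limit.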
 
Hi guys! :)
As I need to update some hardware (still using DDR2) I would like to know what you think:

CPU: Intel i5 4690 3.5GHz 1150
RAM: Kingston HyperX Fury 8GB 1866
MB: ASUS H97-Plus 1150 DDR3 ATX
GPU: ASUS 4GB STRIX-GTX970

Maybe I need to buy a slightly less powerful GPU, as I still don't know my final budget.
 
Hello everyone...
I see that you are discussing PC requirements, and I'd love to hear your opinions about this.
The Witcher 3 uses the same engine as 2, am I right? The difference is that the new version of this engine supports an open world and more detailed graphics, as we expected.
I don't know why, but something tells me that The Witcher 3 won't be uber demanding if the optimization is good. In one interview, a member of CDPR said that if we have a GTX 660 we can go with High quality. Here is the link; watch at the 3:00 mark where he says that: http://www.youtube.com/watch?v=tEFBVIKrKco
So what do you think about this PC? Understand that I do not have it yet, but I plan to buy something similar if I get the money.
CPU: Intel i5-3570 (3.4 GHz) or AMD FX-8350 (4 GHz), I don't know.
MBO: I don't know.
RAM: 16 GB 1600 MHz
GPU: GTX 770 4 GB - I think this GPU is a beast and it will max out The Witcher 3 at 1920x1080 (just turn off ubersampling). 30 FPS is fine for me, but not below that.
Feel free to talk about this; I just hope these CPUs are good enough for this card.
 
Witcher 3 will use a new engine, Red Engine 3. The degree to which it is a new engine with new characteristics, as opposed to a derivative of Witcher 2's Red Engine with characteristics similar to the old engine, is not public knowledge.

I think you are fine with a GTX 760 or 770, if you are expecting playable performance at high quality and not demanding maximum performance and quality at the same time.

AMD CPUs eat Intel's dust these days. There's not really any reason to choose an AMD FX over a Core i5. Any Intel CPU from the Sandy Bridge Core i5 and up (3570 is Ivy Bridge) is fine, but if you're buying new, go with a Haswell Core i5 and a motherboard with a Z97 chipset.
 
motherboard with a Z97 chipset.

The difference between H97 and Z97 is just that with the latter you can overclock the CPU, right?

Now I'm using a GTX 650. Does it make sense to keep it and use it as a dedicated PhysX card?
 
Well, a GTX 680 can max out Far Cry 4, which has a big open world and high-quality textures, so I wouldn't worry much about system requirements.
 
The difference between H97 and Z97 is just that with the latter you can overclock the CPU, right?

Now I'm using a GTX 650. Does it make sense to keep it and use it as a dedicated PhysX card?

Z97 also has SLI support. H97 does not. Both are compatible with upcoming Broadwell CPUs (and no other Intel chipsets are).
 
Would you go with a 970 or a 980 for 1080p max settings in The Witcher 3?

My rig: INTEL i7-4770K + SCYTHE MUGEN 4 - GIGABYTE Z97X-UD5H - 8GB DDR3 1600 G.SKILL - GIGABYTE GTX 770 OC 4GB - ZALMAN Z11 PLUS
SAMSUNG SATA III EVO SSD 250GB - SAMSUNG SATA II SPINPOINT 500GB - BE QUIET PURE POWER L8 630W MODULAR - LG 22EA63V IPS

Personally I would wait for Jan/Feb 2015, as AMD is rumored to release the R9 380X, powered by a Fiji GPU built on a 20nm process and featuring newer HBM memory. But if you would like to stick with the GTX 900 series, then go with the GTX 970; it's the best performer in its price range at the moment, and with a little overclock it could match or even outperform a stock GTX 980.
 
AMD has no 20nm GPU that can be released as early as first quarter 2015. Their upcoming GPUs will still be 28nm. Claims that they are preparing a 20nm design or a stacked RAM design are speculation unfounded in any fact that is public knowledge or in any official statement.

Their only known 20nm design is a CPU that was "taped out" earlier this year. It takes at least a year from taping out to production, longer if you're also trying to master a difficult process at the same time.

They still may beat nVidia to market at 20nm, but it won't be for retail GPUs soon enough to justify waiting.
 
Personally I would wait for Jan/Feb 2015, as AMD is rumored to release the R9 380X, powered by a Fiji GPU built on a 20nm process and featuring newer HBM memory. But if you would like to stick with the GTX 900 series, then go with the GTX 970; it's the best performer in its price range at the moment, and with a little overclock it could match or even outperform a stock GTX 980.

I bought a 970 today. I think it is the best option right now, as you say and as I have read everywhere. I think I will be able to run The Witcher 3 perfectly with my rig.
 
I have a GTX 650 and I was able to play TW2 at 1080p 30fps on medium-high settings. Now I have played the latest games like The Evil Within and Assassin's Creed Unity, and both of them are unplayable at 1080p; I had to lower the resolution to 720p. I don't know if these games lack optimization or were released early like some rumors say, but to run the next-gen games on high settings I think you should have at least a GTX 770.
 
I have a GTX 650 and I was able to play TW2 at 1080p 30fps on medium-high settings. Now I have played the latest games like The Evil Within and Assassin's Creed Unity, and both of them are unplayable at 1080p; I had to lower the resolution to 720p. I don't know if these games lack optimization or were released early like some rumors say, but to run the next-gen games on high settings I think you should have at least a GTX 770.
Game requirements jump up whenever a new generation of consoles is released. This is due to two reasons. New consoles have more raw power, so developers can push the hardware harder. Also, the graphics manufacturers have to make adjustments to their drivers and hardware. For example, the PS4 and Xbox One have access to large amounts of graphics memory, so we have seen a big jump in the memory requirements of PC games since the new consoles' release.
So it's a combination of poor optimization and stronger hardware.
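As a side note on the resolution drop mentioned above: going from 1080p to 720p cuts the pixel count per frame by roughly 2.25x, which is why it rescues framerates on a card like the GTX 650. A quick check of the arithmetic:

```python
# Rough pixel-count comparison between 1080p and 720p.
pixels_1080p = 1920 * 1080  # 2,073,600 pixels per frame
pixels_720p = 1280 * 720    #   921,600 pixels per frame
print(f"1080p renders {pixels_1080p / pixels_720p:.2f}x as many pixels as 720p")
```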
 
AMD has no 20nm GPU that can be released as early as first quarter 2015. Their upcoming GPUs will still be 28nm. Claims that they are preparing a 20nm design or a stacked RAM design are speculation unfounded in any fact that is public knowledge or in any official statement.

Their only known 20nm design is a CPU that was "taped out" earlier this year. It takes at least a year from taping out to production, longer if you're also trying to master a difficult process at the same time.

They still may beat nVidia to market at 20nm, but it won't be for retail GPUs soon enough to justify waiting.

And that's why I called it a rumor; it has been going around for quite some time now, so if I were to upgrade, I would take my chances and wait. The game is not going anywhere.

http://wccftech.com/gtx-980-lose-performance-crown-amds-r9-380x-febuary-2015/

http://wccftech.com/amd-20nm-r9-390x-feautres-20nm-hbm-9x-faster-than-gddr5/

If there is some truth to it, then in the coming months we'll probably see more leaks about it.

I often suggest this because if you own a GTX 780 or 780 Ti, upgrading to a GTX 970 or 980 makes no sense, as the performance improvement doesn't justify the switch, especially in real game performance. But if you own something older, then yes, getting a 970 is the wise option and the best price/performance choice at the moment.

I bought a 970 today. I think it is the best option right now, as you say and as I have read everywhere. I think I will be able to run The Witcher 3 perfectly with my rig.

Congrats mate :)

EDIT:

Game requirements jump up whenever a new generation of consoles is released. This is due to two reasons. New consoles have more raw power, so developers can push the hardware harder. Also, the graphics manufacturers have to make adjustments to their drivers and hardware. For example, the PS4 and Xbox One have access to large amounts of graphics memory, so we have seen a big jump in the memory requirements of PC games since the new consoles' release.
So it's a combination of poor optimization and stronger hardware.

I think it's more poor optimization and exaggeration than stronger hardware. To be honest, I haven't seen a Crysis 1-like breakthrough in the industry to justify the sudden change in requirements, especially in VRAM and system RAM, not to mention that some games ask for way bigger CPUs than they actually need.

I think TW3 will bring something great to the table, something truly next-gen, and if it is optimized properly as well, it will be a great lesson to some lazy devs out there.
 
The more I learn about 20nm tech, the more I think it's been overpromised, and I don't think we're going to see products useful for gaming for some time, maybe a year or more. And there's nothing at all wrong with 28nm; it works damn well at high performance, and the GCN and Maxwell GPUs you can buy now prove it.

First, TSMC. TSMC's the only foundry for 20nm chips that's production-ready. (GlobalFoundries is several billion dollars away, and Intel's production is captive.) But their process is low-power, not high-performance. It's suitable for things like the ultra-low-power mobile CPUs that Apple and Qualcomm want, not the hundreds-of-gigatexel GPUs that nVidia and AMD want. And TSMC's booked so solid with orders for those chips, for the foreseeable future, that even Apple is suffering from undersupply.

Second, transistors. Conventional designs still work fine at 28nm. But they scale badly to smaller than that. You need what are loosely called "FinFET" transistors below 28nm. Intel, where Moore's Law is marching orders, has already mastered FinFET technology. They've been producing them at 22nm for years, they're ramping up production at 14nm, and they're already working on 10nm. Neither AMD nor nVidia has FinFET designs. Nor do they have Intel's immense resources to rework a whole processor in FinFET.

If either nVidia or AMD simply takes their successful 28nm designs and shrinks them to 20nm, they will bomb. They would be difficult and expensive to produce, even if anybody could produce them, and they wouldn't run any better than 28nm.

The entire GPU business is a long way from making a viable product at 20nm. I'm not holding my breath, and I'm not holding up purchase decisions, waiting for it.
 
I never said that 20nm will be immensely successful, nor do I care much. My main interest is AMD's next GPU, which is rumored to be on 20nm. Now, that could be wrong, but AMD can't go long without countering the GTX 900 series, so a high-end product will come to market sooner than expected, and if it features HBM then that's good.

Remember how AMD surprised us with the HD 4000 series featuring GDDR5?
 