Predicted Witcher 3 system specs? Can I run it?

Status
Not open for further replies.
Right, since the announcement of the delay and the 3 extra months for prices to go down, I feel like I might upgrade my graphics card just before release. I've got a GTX 760 at the moment, which I highly doubt will run the game at ultra (at least not at a passable framerate, even with some settings turned off). What do people think will achieve ultra with most things turned up to full? I quite like the look of the 970. I think all my other components are great and should be up to the task; I just can't remember all of them right now. Thanks for any help!
 
In the light of recent events (the delay) I have a question, guys. I wanted to buy an i5 4690K this Christmas for my new Witcher 3 PC, but now, with the delay, I wonder if I should wait for Broadwell CPUs to hit the market. Can anyone help me with this?
 
How will this laptop fare at 1080p? Ultra @30fps?

Sharp 4K IGZO IPS display
Intel Core i7-4710HQ (2.5 - 3.5GHz)
16GB DDR3L 1600MHz RAM
NVIDIA GTX 980M 4GB
Plextor M6e 512GB & Samsung 850 Pro 1TB & Samsung 840 Pro 512GB
Killer Wireless-N 1202
Windows 8.1 64bit
 
Desktop (LGA 1150, Broadwell-K) is going to be well behind embedded Broadwells. Nobody is predicting earlier than late second quarter 2015. It may not actually be worth waiting for. Skylake and AMD Zen will be much closer to market by then.
 
Thank you Guy, I secretly hoped you would answer my question, since you seem to be pretty experienced with these things. So the i5 4690K it is. Thanks again :)
 
I'm not an expert on these questions, but I'll throw out some opinions. Since it's an open-world game with lots of NPCs, I think it will use a lot of CPU for the AI and other things. And it's supposed to have insane graphics detail, so a powerful GPU is needed for all the good stuff. Those are the most obvious things that came to mind. I think Guy N'Wah can be more specific and give more detailed thoughts.

Yeah, I do agree with you about the CPU handling the AI and the open world, but I think it will be more dependent on the GPU this time because of the graphics; after some thought, that's what I've come up with, lol. But I think TW2 was CPU-bound, if I'm correct? Not too sure. Personally I hope it's nicely balanced, and if anything more GPU-bound than anything else, but that's only because I've got a 970, lol.

I'm guessing this means a motherboard upgrade in the near future? I was reading a little about the 64-bit K12. Does that mean these "APU"s (can you define APU for me, please?)... I'm guessing it's the new CPU, and something to do with it being embedded.

So 32-bit (x86) code will not work on a 64-bit OS?
 
TW2 is heavily GPU-bound, though I have always gotten the impression that responsiveness was less and jitter was greater on 2-core CPUs. I think they would have to throw a real boatload of scripting, AI, or CPU-based special effects into the game to change that in TW3.

APU is an AMD term, "Accelerated Processing Unit". It's their name for a CPU and GPU combined on one chip, where compute tasks can be assigned to as many as 4 CPU cores and 6 GPU modules at the same time. The A-series APUs (especially the 4-core A8 and A10) are good bargains for mid-range tasks.

K12 is a new ARM design. SkyBridge is a planned ARM/AMD64 pin-compatible architecture: manufacturers would be able to deliver ARM or AMD64 computers on the same board. Details have been vague, but it looks more and more like chief designer Jim Keller has made a sharp break with the Bulldozer/Piledriver architecture. It's expected for 2016, later if you listen to cynics and pessimists.

32-bit code will run on a 64-bit OS. The reverse (64-bit code on a 32-bit OS) won't work. TW3 will be 64-bit code and require a 64-bit OS, most likely Vista x64 or later.
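As an aside, if you want to check whether your own OS and runtime are 64-bit, here's a minimal Python sketch (the exact `machine` string varies by platform):

```python
import platform
import struct

# Pointer size: 8 bytes on a 64-bit build, 4 bytes on a 32-bit one.
bits = struct.calcsize("P") * 8
print(f"{bits}-bit interpreter on {platform.machine()}")
```

On a typical modern Windows install this prints something like "64-bit interpreter on AMD64".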

[Jim Keller took over AMD's CPU division in 2012. He's a legend, the one responsible for the original 64-bit Athlons that were eating Intel's sales before the "Israeli Coup".]
 
Thanks for the insight, Guy.

I love trying to keep up with the latest; I was beginning to think AMD was dead after Bulldozer and Piledriver (which was at first only a prototype).

Well, I have Win 7 Ultimate x64, so my best bet would be the x64 version, as I like to invest towards the future. My current hardware setup is in my sig; what upgrades do you think I'll need to be doing in the future?

What are your specs, Guy? I hope AMD gets the heat issues on their CPUs sorted out.

Btw, are APUs the "new" thing, or do we class all AMD CPUs as APUs now?

I have the Piledriver FX-9590, which I bought just before Intel's Haswell and Broadwell CPUs hit the market. If I had waited, I could've got my AMD CPU for AUD $200 rather than double that, because they cut the price when Intel released their chips; or I could've just gone with Intel.

And yes, most games now require x64 rather than x86.
From what I've heard he's quite the smart engineer too; I didn't know he did the Athlons though!
I had an Athlon Duo back in the day in my Compaq laptop and used to play BF2 on it, until the hard drive died and it was worth buying a new laptop rather than replacing the part.
I was about 15-16.
 
APUs have long been what AMD has thought to be the future of computers. It's why they bought ATI, to get the GPU technology. (They first tried to buy nVidia. The deal fell through because nVidia wanted to run the merged company.) Unfortunately, it may have split their scarce R&D dollars and engineers between the APUs and the FX CPUs, and cost them market share in the all-important server business (which has little interest in APUs).

I almost never have state-of-the-art computers. I get on the bleeding edge with software; I don't fancy being up there with hardware too.

My big iron is an Intel Sandy Bridge Core i7 2700K, Powercolor 7950XT (the 7950XT is a strangely named AMD Tahiti), ASRock Z77 Extreme 6 motherboard, 4x4GB Corsair Vengeance DDR3-1600, WD Black 2TB, Samsung 840 EVO 120GB, Seasonic M12II 520W, Corsair 200R case. Total cost about $800, runs 75 Linpack GFlops, and TW2 at 75 fps in Ultra.
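For context on that 75 GFlops number, here's a rough back-of-the-envelope peak for the i7-2700K, assuming 4 cores at the 3.5 GHz base clock and 8 double-precision FLOPs per cycle with AVX (these figures are my assumptions, not benchmarked values):

```python
cores = 4
clock_hz = 3.5e9          # i7-2700K base clock; turbo would raise this
flops_per_cycle = 8       # AVX: 4-wide double add + 4-wide double multiply

peak_gflops = cores * clock_hz * flops_per_cycle / 1e9
print(peak_gflops)        # 112.0 GFlops theoretical peak

measured = 75.0           # the Linpack figure quoted above
print(round(measured / peak_gflops * 100))  # 67 (% of peak)
```

Hitting roughly two-thirds of theoretical peak is in the normal range for a well-tuned Linpack run.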
 
Not a bad setup for the price. Yeah, I didn't know AMD tried to buy out Nvidia.
So APUs are mainly for servers? Not really for general or hardcore desktop use?

75fps in TW2 is not bad at all. Is that with ubersampling? I find there's no real need for ubersampling.
 
No, the problem with APUs is that they're close to useless for servers. What they're good for is low-cost desktops.

AMD used to have about 25% of the server market with the Opteron line. They've lost most of it, down to maybe 3%. Those are high-end, high-volume, high-profit CPUs. Not enough performance, too much power (high power is just an annoyance when you're running a single desktop; when you have a whole rack of servers, it becomes really expensive), and Intel playing dirty by making exclusive deals.
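To put rough numbers on why rack power gets expensive, here's a quick sketch; the servers-per-rack count, extra wattage, and electricity price are all made-up illustrative figures:

```python
# Hypothetical numbers: a rack of 40 servers, each drawing 100 W more
# than a cooler alternative, at an assumed $0.12 per kWh.
servers = 40
extra_watts = 100
price_per_kwh = 0.12
hours_per_year = 24 * 365

extra_kwh = servers * extra_watts * hours_per_year / 1000
annual_cost = extra_kwh * price_per_kwh
print(round(annual_cost))  # 4205 dollars a year, before cooling overhead
```

And that's just the direct draw; every extra watt also has to be pumped back out as heat by the cooling system, roughly doubling the bill.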

I sometimes turn Ubersampling off. I don't like the effect. (I like motion blur even less. That always gets turned off.) AMD Tahitis are great cards for TW2. I hope they're not too hampered in TW3.
 
Ahh okay, so do you know if AMD is making a new high-end CPU? Or did I misread the article about Zen and SkyBridge?
Gee, down to roughly 3%, what happened? Intel making "exclusive deals", yeah? lol. I heard a rumor Intel did a deal with the military for their Broadwell chips.

I personally like the motion blur effect in games; I like the added realism, and I just generally like the effect. It just depends how much they introduce.

How much VRAM does the Tahiti have, and what's the clock speed? Do you overclock at all, or is there no need?
The card I have is great for overclocking, but I just keep it at stock because there just isn't any need for it atm.
 
AMD's plans for new CPUs are not well publicized at this point. They have largely missed out on the mobile devices market, which is why they want K12 to be ARM. Keller won't be satisfied with playing second fiddle to Intel and Friedman for long, so I would expect them to come out with a new high-performance CPU, probably based on Jaguar. But there is nothing tangible to go on yet, and I wouldn't expect a new high-performance CPU before 2016.

What happened to AMD's market share was the "Israeli Coup". When AMD had Keller's K8, and Intel had only Netburst CPUs, AMD won a lot of sockets. Then Intel Israel, which had the mobile CPU division, came up with a simpler, lower power, faster design that became Core and then Core 2. I still have a laptop that's so old it says Core and not Core 2 on the sticker. Rony Friedman and his crew marched into Intel headquarters and started persuading everybody who would listen that they had a better design than the Pentium 4. They won. Friedman took over the CPU division, and everything they've made since has been the fastest and coolest-running CPUs on the market.

The Tahiti LE (7950 XT) is normally 2GB, which is what I have. I don't usually overclock high-performance GPUs; there's little percentage there.
 
I've put a new comp together just for The Witcher 3. Dragon Age: I and the rest are only icing on the cake. :D

Hoping to run the game on high-ultra settings at 1080p.

-GPU: MSI Gaming GTX 970 4GB (I also have my old GTX 660 2GB; maybe I can use it for dedicated PhysX effects? Is that possible?)
-CPU: Intel i7-4790 (4 × 3.6GHz)
-MB: ASUS Maximus VII Hero
-RAM: Corsair Vengeance Pro Silver 8GB kit, CL9 10-9-27
-PSU: Corsair CX600
-Case: Corsair Carbide Series 300R
-Drive: LG 12NS30 Blu-ray combo
 
Literally the first thing I turn off in EVERY game, unless it's something by Crytek, because they used object-based motion blur instead of a cheap post-process effect (99% of games), and that is amazing.
Yeah.
I always turn off motion blur and any kind of DOF in games.
DOF pisses me off for some reason. Why reduce your FPS to see less of the game?! But surprisingly many like it.
 
Can some tech expert give me some insight about this? A couple of novice questions of mine:

I was under the impression that the specs can shift, higher or lower, depending on the optimization. Is that not the case? If it is, isn't 5 months before release a very early time to announce specs?

During this phase, the graphics can still potentially improve, right? I remember reading here on the boards that TW2 actually looked better than much of the promotional material. So if this is a possible scenario, and in the upcoming months the game will look better, doesn't that mean that the spec requirements will be higher (for the Ultra settings)?

Thanks in advance.
 
I'm guessing they're at a stage where they can make a solid prediction, and will then spend the following months making that prediction reality.
 