> lol I just purchased my first GTX 980 a month ago and a second one a week ago only for this game, sigh... guess you can't get everything in life....

Yes I did, also bought it in November 2014.
- I was thinking that W3 was ready (before the delays)
> But isn't PhysX FLEX being used in TW3? I think I saw it mentioned somewhere, but I can't remember where. Also, with PhysX 3.x now being open source through UE4, it's very likely they'll get devs to use PhysX FLEX more.

Last time I checked, PhysX FLEX (aka PhysX 3.4) was still in closed alpha or beta. So no, The Witcher 3 won't have it, unfortunately.
> Additionally, they haven't shown the game on PC using the highest settings, so maybe those advanced PhysX effects are only available on the highest preset. In any case I'll be happier if the game looks beautiful and runs well.

The reason why I'm against GPU PhysX in The Witcher 3 is that I think it will take resources away from the GPU that could be used for rendering. The Witcher 3 is going to be, I think, the first TRUE next-gen title in terms of visual fidelity. AC Unity would have been first, but Ubisoft sacrificed too much to get it to run on the consoles, which were tapped out CPU-wise due to having to decompress the lighting data and process all of the A.I. from the massive crowds.
> lol I just purchased my first GTX 980 a month ago and a second one a week ago only for this game, sigh... guess you can't get everything in life....

I have GTX 980 SLI as well. I'm hoping that I'll be able to max out the game minus ubersampling when it ships.
> Yes I did, also bought it in November 2014.

Which brand did you pick for your 980s? I went with Gigabyte. Just curious, because this is the first time I've ever built a dual card setup.
> Last time I checked, PhysX FLEX (aka PhysX 3.4) was still in closed alpha or beta. So no, The Witcher 3 won't have it, unfortunately.

I see, I think I must be mixing it up with something else, because I saw FLEX mentioned somewhere. I hope the version of PhysX they're using in TW3 (regardless of CPU or GPU) is optimized enough.
> I see, I think I must be mixing it up with something else, because I saw FLEX mentioned somewhere. I hope the version of PhysX they're using in TW3 (regardless of CPU or GPU) is optimized enough.

The Witcher 3 uses PhysX 3.3.3 if I'm not mistaken, and it's very fast and capable of delivering advanced physics effects in software, so don't worry about performance.
> Btw, don't we have the option to switch PhysX from GPU to CPU in the Nvidia Control Panel? So is it necessary to completely drop it? Previously, when I was using a Sapphire R9 290 and Lords of the Fallen got released, I had no trouble with it, but some of my friends with Nvidia cards had trouble with PhysX on the GPU. To solve it they switched to CPU and the game started working fine; then the dev patched the game and now it works fine with GPU PhysX as well.

Yes, you have that option. It's under "Configure SLI, Surround, PhysX."
> lol, don't mention AC Unity in the next-gen category. That game is the worst optimized game I have ever seen, and even with my two GTX 980s it occasionally drops to 50 fps when I am in crowded streets. That is mostly because the game takes CPU utilization to 100%, and that is the point where the CPU cannot serve the GPUs. Otherwise, I think two GTX 980s are overkill for any game available at the moment at 1080p.

I wouldn't personally characterize AC Unity as unoptimized. It's just very dense in its detail, which puts a tremendous burden on the CPU. I have an overclocked 4930K driving my GTX 980s, so I have no problem maintaining 60 FPS at all times.
> I too hope that I can max out TW3 on day one (excluding ubersampling) with 2x MSAA at 1080p. I hope that's not too much to ask, lol.

Like you said, GTX 980 SLI is overkill for 1080p. I game at 1440p, so SLI is more of a necessity for me if I want to max out the latest games.
> The Witcher 3 uses PhysX 3.3.3 if I'm not mistaken, and it's very fast and capable of delivering advanced physics effects in software, so don't worry about performance.

Thanks for clarifying. I tried PhysX 3.x in Metro Redux when I had the R9 290, and my i5 4690K had no problem driving the PhysX in software, so yes, it is indeed very well optimized for the CPU.
> I wouldn't personally characterize AC Unity as unoptimized. It's just very dense in its detail, which puts a tremendous burden on the CPU.

Well, in PC gaming a well-optimized game usually puts less load on the CPU and more load on the GPU, which is the wise choice: PCs have tremendous power in their GPUs, and while the CPUs are powerful too, they are often under the overhead of APIs and the OS. This is exactly the opposite of what consoles offer, so a game that is well optimized for PC should rely more on the GPU than the CPU.
> I have an overclocked 4930K driving my GTX 980s, so I have no problem maintaining 60 FPS at all times.

The i7 4930K is indeed a very powerful CPU. So far my i5 4690K hasn't had problems with any game except ACU, which shows 100% utilization. In Crysis 3 I can reach a max of 165 fps in less dense parts of the game with CPU utilization at 80-85% on all cores, and if I target just 60 fps with Vsync the utilization goes further down. Same goes for DAI: at 60 fps I only get 45-60% utilization on all cores.
If you have a standard quad core though, I can see how it would be problematic. In fact, when The Witcher 3 comes out, it may be better for you to run the PhysX on the GPU rather than the CPU, as you will have more spare GPU power on hand than CPU power.
> Like you said, GTX 980 SLI is overkill for 1080p. I game at 1440p, so SLI is more of a necessity for me if I want to max out the latest games.

Yes, it's kind of stupid to stay on 1080p with two powerful cards like the GTX 980, though at the moment I can't really afford a good 1440p monitor, so I just supersample games to 1440p with DSR and they look awesome. I don't know if I'll get that DSR luxury with The Witcher 3, but who knows.
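For a rough sense of the extra load DSR adds here, the pixel count is plain arithmetic (nothing game-specific, just the two resolutions mentioned above):

```python
# Pixel-count comparison: native 1080p vs. a 1440p DSR render target.
native = 1920 * 1080       # 2,073,600 pixels
dsr_1440p = 2560 * 1440    # 3,686,400 pixels

ratio = dsr_1440p / native
print(f"1440p DSR renders {ratio:.2f}x the pixels of native 1080p")
```

So supersampling from 1080p to 1440p asks the GPUs to shade roughly 1.78x as many pixels per frame, which is why SLI headroom matters for it.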
> Well, in PC gaming a well-optimized game usually puts less load on the CPU and more load on the GPU, which is the wise choice: PCs have tremendous power in their GPUs, and while the CPUs are powerful too, they are often under the overhead of APIs and the OS. This is exactly the opposite of what consoles offer, so a game that is well optimized for PC should rely more on the GPU than the CPU.

Usually this would be the case, but for some workloads the CPU is faster than the GPU, e.g. A.I. But I do agree that Ubisoft should have used the GPU for decompressing the lighting data in AC Unity at the very least. If I had to guess, the reason they didn't use the GPU for that task is that they didn't yet have the proper software in place to exploit the compute capabilities of the PS4 and Xbox One.
> But you're also right in saying that ACU is very dense in detail, though what bothers me is that the detail they added (the crowds) is just for visual enhancement; the game doesn't offer anything beyond that. For example, it's not a full open world like some previous AC games; it's just one big city with streets crowded by people, and those people don't offer anything significant. To be honest, AC4 Black Flag, with fewer people in its cities and villages, offered way more interactivity and realism than this game. For me ACU is the prime example of the fact that graphics alone don't make a game good.

I agree that the crowds in AC Unity were used primarily for atmosphere more than anything else. But that can be excused because AC Unity is an adventure game. The Witcher 3, on the other hand, is an action RPG, so even minor NPCs should be a lot more interactive and possess more depth.
> The i7 4930K is indeed a very powerful CPU. So far my i5 4690K hasn't had problems with any game except ACU, which shows 100% utilization. In Crysis 3 I can reach a max of 165 fps in less dense parts of the game with CPU utilization at 80-85% on all cores, and if I target just 60 fps with Vsync the utilization goes further down. Same goes for DAI: at 60 fps I only get 45-60% utilization on all cores.

Have you tried overclocking your CPU? Overclocking can make a big difference in performance when you're CPU bound.
> Usually this would be the case, but for some workloads the CPU is faster than the GPU, e.g. A.I. But I do agree that Ubisoft should have used the GPU for decompressing the lighting data in AC Unity at the very least. If I had to guess, the reason they didn't use the GPU for that task is that they didn't yet have the proper software in place to exploit the compute capabilities of the PS4 and Xbox One.

I think ACU is a game where an API like Mantle or DX12 could have helped a lot in relaxing the CPU?
Decompression algorithms have been run on GPUs in PC games as well, like Rage, which used CUDA to decompress textures, and Civilization V, which used DirectCompute for texture decompression.
> I forget exactly, but I think the city of Novigrad has over a thousand distinct A.I. entities.

You mean TW3 will be very demanding on the CPU as well? I hope my 4690K can hold up.
> Have you tried overclocking your CPU? Overclocking can make a big difference in performance when you're CPU bound.

Previously I used to run a 4.5 GHz OC on my 4690K with my R9 290 and it helped in some cases, but I can't do it any longer because I will face power-related issues. Actually, I am facing power-related issues right now: with 2x Gigabyte GTX 980 G1 graphics cards, my old OCZ 850W PSU cannot handle them. When the top card reaches 80°C, the system shuts down and restarts after a second. I guess this is happening because of the overload protection in the PSU?
> I think ACU is a game where an API like Mantle or DX12 could have helped a lot in relaxing the CPU?

Definitely!
> You mean TW3 will be very demanding on the CPU as well? I hope my 4690K can hold up.

Of that there can be no doubt, due to the sheer amount of physics, A.I. and the level of detail in the game. There is a reason why a hyperthreaded CPU is in the recommended specs. And while your CPU is very fast and has the latest core microarchitecture, it has no hyperthreading. One way to offset that would be to overclock it.
> Previously I used to run a 4.5 GHz OC on my 4690K with my R9 290 and it helped in some cases, but I can't do it any longer because I will face power-related issues. Actually, I am facing power-related issues right now: with 2x Gigabyte GTX 980 G1 graphics cards, my old OCZ 850W PSU cannot handle them. When the top card reaches 80°C, the system shuts down and restarts after a second. I guess this is happening because of the overload protection in the PSU?

That has only happened to me once, years ago, when I had GTX 480 SLI. When I overclocked the cards, the PSU's overload protection would kick in and shut down the system. It happened when total system power draw approached 1000 watts. Eventually I ended up killing that PSU, so be careful.
> If I use a single card and let it reach 80°C by slowing the fans, nothing happens (no restart); it only happens with 2 cards, when the top one reaches 80°C. I guess the card asks for more wattage at higher temps to keep that rather high factory overclock of 1366 MHz stable? As you can see on the page below, the card even crosses the R9 290X when pushed to the maximum, so I guess that's what's happening with me.

They used Furmark to reach that level of power draw though, which doesn't really mirror real gaming. Is your PSU SLI certified? I have an Antec HCP 1200W myself. I never recommend skimping on the power supply. Like a chassis, the PSU is one of those things you will have for a long time, so it's best to get a high-quality unit if you can afford it.
> I have a plan to buy a Cooler Master V1000 in a month, so I can revert back to the original settings without worrying about restarts or other power-related issues. Another reason for switching the PSU is that the Gigabyte cards ask for a pair of 8+8 pin power connectors, while my PSU only has one pair of 8+8 pin connectors; the other pair is just 6+6 pin, so it cannot be used for the second card, and therefore I am forced to use 4x Molex connectors to power the second one.

Looks like Gigabyte has a completely different AIB design for their GTX 980s. I'm pretty sure mine are 8+6, and I have the EVGA GTX 980 FTWs.
> Of that there can be no doubt, due to the sheer amount of physics, A.I. and the level of detail in the game. There is a reason why a hyperthreaded CPU is in the recommended specs. And while your CPU is very fast and has the latest core microarchitecture, it has no hyperthreading. One way to offset that would be to overclock it.

I guess then I'll just get a better PSU and then overclock that i5 to see how it performs in TW3; otherwise it's finally time for an i7. My reason for staying away from the i7 is that a lot of devs these days just exaggerate CPU requirements and mention an i7 in their recommended specs, but it turns out their games perform just as well on good i5s. One such game is Watch Dogs, which asked for a high-end i7 (4770K I think?), but I had no problem running it on my stock 4690K. I even ran it on an overclocked i5 2500K and had no issues, so I take i7 requirements kind of lightly. But you're right that hyperthreading will slowly become more useful.
At 4.5 GHz you shouldn't have any issues, I'd wager, although your CPU usage will still be high.
> That has only happened to me once, years ago, when I had GTX 480 SLI. When I overclocked the cards, the PSU's overload protection would kick in and shut down the system. It happened when total system power draw approached 1000 watts. Eventually I ended up killing that PSU, so be careful.

Yes, that's why I just downclocked the cards to 1200 MHz and set the power limit to 85% with an aggressive fan profile. Now I no longer face any shutdowns during gaming, but I'm afraid I am still pushing the PSU too far.
> They used Furmark to reach that level of power draw though, which doesn't really mirror real gaming. Is your PSU SLI certified? I have an Antec HCP 1200W myself. I never recommend skimping on the power supply. Like a chassis, the PSU is one of those things you will have for a long time, so it's best to get a high-quality unit if you can afford it.

Yes, the PSU is SLI certified, though it's very old now. I don't remember exactly, but I think I purchased it in 2007-2008. Back then it was regarded as one of the best PSUs, but now I think this kind of load is too much for it.
> Looks like Gigabyte has a completely different AIB design for their GTX 980s. I'm pretty sure mine are 8+6, and I have the EVGA GTX 980 FTWs.

Yes, I saw most of the 980s are 6+8 pin. It's just my luck that I went with the Gigabyte models. I saw good reviews about them though.
> I guess then I'll just get a better PSU and then overclock that i5 to see how it performs in TW3; otherwise it's finally time for an i7. My reason for staying away from the i7 is that a lot of devs these days just exaggerate CPU requirements and mention an i7 in their recommended specs, but it turns out their games perform just as well on good i5s. One such game is Watch Dogs, which asked for a high-end i7 (4770K I think?), but I had no problem running it on my stock 4690K. I even ran it on an overclocked i5 2500K and had no issues, so I take i7 requirements kind of lightly. But you're right that hyperthreading will slowly become more useful.

Games still don't really "require" hyperthreading, but they benefit from it. Watch Dogs is one such example. Look at the performance difference between an i5 2500K and an i7 2600K; most of that difference is due to hyperthreading. Even in games that can't take advantage of hyperthreading, it can still provide benefits, because it mitigates the performance impact of thread stalls.
> Yes, the PSU is SLI certified, though it's very old now. I don't remember exactly, but I think I purchased it in 2007-2008. Back then it was regarded as one of the best PSUs, but now I think this kind of load is too much for it.

Yeah, I can see a problem with that PSU. It doesn't deliver enough amperage on the +12V rail for your cards. I would look into getting a PSU with at least 60A on the +12V rail.
Here is an old review of my PSU from Overclock3D
http://www.overclock3d.net/reviews/power_supply/ocz_gamexstream_850w_atx_psu/1
> Do you think a 1000W PSU will do fine for my system or do I need to go above that? I have 4 drives (2x SSDs and 2x HDDs), a 4690K that I plan to OC, an MSI Z97 G45 Gaming motherboard, 5 case fans including one for the CPU, and two Gigabyte GTX 980 G1s.

1000W is plenty. Even 850W would likely be enough, as long as it delivered enough amperage on the +12V rail.
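A rough power budget for the system described above bears that out. The per-component wattages below are assumptions based on typical TDP/board-power figures, not measurements; actual draw varies with load and overclock:

```python
# Back-of-the-envelope power budget; all figures are assumed typical
# load values, not measured ones.
components = {
    "2x GTX 980 G1 (factory OC, ~200 W each under load)": 2 * 200,
    "i5-4690K (overclocked)": 120,
    "motherboard + RAM": 50,
    "2x SSD + 2x HDD": 30,
    "5 case fans": 15,
}

total = sum(components.values())
print(f"Estimated load: {total} W")
print(f"Headroom on a 1000 W unit: {1000 - total} W")
```

Even with generous estimates the system sits comfortably under 1000 W, so the sticking point really is how that wattage is distributed across the +12V rail(s), not the headline number.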
> This is the PSU I am planning to get; I read a lot of good reviews about it:
> http://www.coolermaster.com/powersupply/v-series-psu/v1000/

Looks dandy! That PSU delivers 83A on the +12V rail, so that's more than enough to completely guarantee stability.
> Sorry for asking so many questions; I have no idea what PSU would be enough for a dual card system or what the combined power requirement will be. Gigabyte recommends a PSU of at least 600 watts for a single card, but that's for the entire system, so I was assuming that my 850W PSU would handle two cards. That's not the case, or maybe it's the Gigabyte GTX 980s that ask for too much power.

Like I said earlier, I think your PSU's problem is that it's not delivering enough amps via the +12V rail. Whenever you buy a PSU, that's one of the most important things to look for, especially if you are using SLI.
> Yes, I saw most of the 980s are 6+8 pin. It's just my luck that I went with the Gigabyte models. I saw good reviews about them though.

The G1 cards are among the very best of the 980s, so don't think like that. I'm just surprised that Gigabyte deviated so far from the normal specifications.
> Games still don't really "require" hyperthreading, but they benefit from it. Watch Dogs is one such example. Look at the performance difference between an i5 2500K and an i7 2600K; most of that difference is due to hyperthreading. Even in games that can't take advantage of hyperthreading, it can still provide benefits, because it mitigates the performance impact of thread stalls.

Hmm, that's certainly enlightening... some time ago I was actually quite serious about getting a Core i7 4790K, but then I dropped the idea; now I'll keep it on my list after the PSU. Thanks for the chart; numbers are much better than explanations and proofs.
> Yeah, I can see a problem with that PSU. It doesn't deliver enough amperage on the +12V rail for your cards. I would look into getting a PSU with at least 60A on the +12V rail.

Well, I really have to thank you for pointing out the core of the problem. A lot of people just told me that an 850W PSU should handle 980s in SLI, and I admit that I don't really know much about these ampere ratings at all. I bought this 850W PSU a long time ago, and since then it has worked really well with my single-card configs, so I never bothered looking into PSU specs and details.
> Looks dandy! That PSU delivers 83A on the +12V rail, so that's more than enough to completely guarantee stability.

Now I can see the difference: even though it's only 150W over my PSU, it delivers way more amperage on the +12V rail compared to the 18 or 20 amperes on my current PSU's rails. I wonder how it kept powering a hungry card like the R9 290 without a hitch? Maybe because it wasn't in Crossfire?
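The difference between one strong rail and several weak ones falls straight out of P = V × I. A quick sketch, using the 83A single-rail figure quoted above and the ~18A per-rail figure of the older unit:

```python
# P = V * I: the maximum power a single +12V rail can supply.
def rail_watts(volts: float, amps: float) -> float:
    return volts * amps

# One strong rail (83 A, as on the V1000): ~996 W available to the GPUs.
print(rail_watts(12, 83))

# An older multi-rail design at ~18 A per rail: each connector group
# is capped at ~216 W, no matter what the total wattage label says.
print(rail_watts(12, 18))
```

So a card pulling hard through connectors fed by a single 18A rail can trip overload protection long before the PSU's 850W total is anywhere near exhausted, which matches the shutdown behavior described earlier in the thread.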
> The G1 cards are among the very best of the 980s, so don't think like that. I'm just surprised that Gigabyte deviated so far from the normal specifications.

I am not saying they are bad cards. I really like their performance, and the all-black look is one of my favorites. It's just that they are more demanding than most of the 980s out there and ask for two 8+8 pin power connectors. They have a good factory overclock, and reviews say they overclock like a charm, so there is not much I can complain about.
Likely they did it to improve overclocking and clock speed stability.
> Now I can see the difference: even though it's only 150W over my PSU, it delivers way more amperage on the +12V rail compared to the 18 or 20 amperes on my current PSU's rails. I wonder how it kept powering a hungry card like the R9 290 without a hitch? Maybe because it wasn't in Crossfire?

Yeah, most likely because it wasn't in Crossfire. Although the R9 290 doesn't really suck as much juice as most people think.
> Anyway, to get it back on topic, have you noticed any PhysX in the PAX gameplay video? I don't know if it was from a console or a PC using high settings (non-ultra). The HairWorks was missing from Geralt's hair, and the wolves were too far away to notice anything, but I think one of his magic signs (Igni) showed many fire particles. I think it was PhysX, but I can't say if it was CPU or GPU PhysX.

Download the uncompressed video. PhysX is all over the place, but still no GPU PhysX yet that I've seen. HairWorks was also enabled in the last GDC video. You can see it on the wolves and on the Wyvern. BTW, you may or may not know this, but HairWorks doesn't use PhysX. It uses DirectCompute, so it should be available to AMD users as well.
> Yeah, most likely because it wasn't in Crossfire. Although the R9 290 doesn't really suck as much juice as most people think.

With Maxwell out, people started to think very badly of Hawaii, and probably Kepler as well, in terms of power requirements.
> Download the uncompressed video. PhysX is all over the place, but still no GPU PhysX yet that I've seen. HairWorks was also enabled in the last GDC video. You can see it on the wolves and on the Wyvern.

I see, I guess I need to download that uncompressed video from Gamersyde. Is there any way to differentiate CPU PhysX from GPU PhysX? I mean, how do you recognize that it's just CPU PhysX in action? Also, if it's just CPU PhysX, then we can't take advantage of the GPU even if we have Nvidia?
> BTW, you may or may not know this, but HairWorks doesn't use PhysX. It uses DirectCompute, so it should be available to AMD users as well.

Yes, I know. I used HairWorks in Far Cry 4 with my R9 290 and it worked fine. It's just Nvidia's version of TressFX. Even most of the other GameWorks features work just fine on AMD; I have been advocating that fact for a long time.
All in all though, the cloth simulation and environmental PhysX are very impressive so far in The Witcher 3. CDPR made the RIGHT decision to drop Havok for PhysX, as PhysX is more impressive.
> I see, I guess I need to download that uncompressed video from Gamersyde. Is there any way to differentiate CPU PhysX from GPU PhysX? I mean, how do you recognize that it's just CPU PhysX in action? Also, if it's just CPU PhysX, then we can't take advantage of the GPU even if we have Nvidia?

The only reason I'm certain what we've seen is CPU PhysX so far is because the settings have all been on high. High settings should only use the CPU for PhysX. But for ultra settings, perhaps the GPU is used as well.
> The only reason I'm certain what we've seen is CPU PhysX so far is because the settings have all been on high. High settings should only use the CPU for PhysX. But for ultra settings, perhaps the GPU is used as well.

Yes, I saw the uncompressed video, and that might be the case, but even if it's CPU PhysX, it's kind of impressive to see that much. I mean, we'll need a beastly CPU to do that kind of PhysX. I hope it doesn't put too much load on the CPU: even though the GPU has a lot to do in a game like this, it's still much faster than the CPU, and since the CPU already has enough A.I. load and other stuff, I hope this PhysX doesn't prove too much of a burden.
> By the way, according to this article from pcgameshardware.de, GPU PhysX might still be in play. It's kind of ambiguous though, as the dev doesn't explicitly state that they will be using GPU PhysX:
> Source

I see, maybe they're hiding GPU PhysX just like they're hiding the ultra preset? I wonder what options we'll have for ultra, and since they used HairWorks in the high settings, it means ultra has some other stuff in store for us.
> Yes, I saw the uncompressed video, and that might be the case, but even if it's CPU PhysX, it's kind of impressive to see that much. I mean, we'll need a beastly CPU to do that kind of PhysX. I hope it doesn't put too much load on the CPU: even though the GPU has a lot to do in a game like this, it's still much faster than the CPU, and since the CPU already has enough A.I. load and other stuff, I hope this PhysX doesn't prove too much of a burden.

Yep, it's like we discussed earlier about hyperthreading. The CPU is having to do a lot in this game, so that is likely why a 3770K is the recommended CPU, as it has hyperthreading.
> Btw, just want to tell you that I bought the Cooler Master V1200 (1200W) PSU. It has 100 amps on the +12V rail and it's certified as an 80+ Platinum supply. I think it's overkill for my system, but since a PSU is a long-term investment, this will be a worthwhile purchase for the future. It will also give me ample room to OC the CPU as well as those 980s. Once again, thank you for all the help!

That's great! You won't regret it, as you'll have that PSU for years to come!
> Yep, it's like we discussed earlier about hyperthreading. The CPU is having to do a lot in this game, so that is likely why a 3770K is the recommended CPU, as it has hyperthreading.

Let's see what my overclocked i5 can do in this game; it will be a real test.
And if you don't have hyperthreading, then overclocking is the best way to mitigate that difference.
> That's great! You won't regret it, as you'll have that PSU for years to come!

Thanks, yes, that was my thinking: better to buy an expensive one once than to buy cheap ones many times.
> Though not a selectable or configurable setting, The Witcher 3: Wild Hunt also features NVIDIA PhysX, which adds realistic cloth and destruction effects throughout the game. Powered by the CPU and available on all platforms, these effects make the world feel more alive and dynamic, with cloth blowing in the wind and objects exploding realistically. On PC, these effects run at a higher detail level and with additional particles that further increase their fidelity and realism.

Source
> Another thing the man mentioned, which is not really related to the topic, but... the game isn't very CPU intensive.

Yeah, I saw that as well. But keep in mind that Andrew has an elite-tier rig equipped with an Intel 5960X, which he's using to test the game. That might skew his opinion somewhat.