Forums - CD PROJEKT RED
Should the Witcher 3 support GPU PhysX? Probably not...


  • Hell yes!

    Votes: 60 71.4%
  • No way!

    Votes: 7 8.3%
  • Don't give a damn either way!

    Votes: 17 20.2%

  • Total voters
    84
Page 2 of 3

MUPPETA

Rookie
#21
Mar 6, 2015
tahirahmed said:
lol I just purchased my first GTX 980 a month ago and a second one a week ago only for this game, sigh... guess you can't get everything in life....
Click to expand...
Yes, I did it too; I bought mine in November 2014.
- I was thinking that W3 would be ready (before the delays).
 

prince_of_nothing

Forum veteran
#22
Mar 6, 2015
tahirahmed said:
But isn't PhysX FLEX being used in TW3 ? I think I saw it mentioned somewhere but I can't remember where. Also with PhysX 3.x being open source now with UE4, it's very likely they make devs use PhysX FLEX more.
Click to expand...
Last time I checked, PhysX Flex aka PhysX 3.4 was still in closed alpha or beta. So no, Witcher 3 won't have it unfortunately..

Additionally they haven't shown the game on PC using the highest settings so maybe those advanced PhysX effects are only available on highest preset. In any case I will be happier if the game looks beautiful and runs well.
Click to expand...
The reason I'm against GPU PhysX in The Witcher 3 is that I think it will take resources away from the GPU that could be used for rendering. The Witcher 3 is going to be, I think, the first TRUE next-gen title in terms of visual fidelity. AC Unity would have been first, but Ubisoft sacrificed too much to get it running on the consoles, which were tapped out CPU-wise from having to decompress the lighting data and process all of the A.I. from the massive crowds.

---------- Updated at 11:57 AM ----------

tahirahmed said:
lol I just purchased my first GTX 980 a month ago and a second one a week ago only for this game, sigh... guess you can't get everything in life....
Click to expand...
I have GTX 980 SLI as well. I'm hoping that I'll be able to max out the game minus ubersampling when it ships.
 

tahirahmed

Rookie
#23
Mar 6, 2015
korov said:
Yes, I did it too; I bought mine in November 2014.
- I was thinking that W3 would be ready (before the delays).
Click to expand...
Which brand did you pick for your 980s? I went with Gigabyte. Just curious, because this is the first time I've ever built a dual-card setup.

prince_of_nothing said:
Last time I checked, PhysX Flex aka PhysX 3.4 was still in closed alpha or beta. So no, Witcher 3 won't have it unfortunately..

The reason I'm against GPU PhysX in The Witcher 3 is that I think it will take resources away from the GPU that could be used for rendering. The Witcher 3 is going to be, I think, the first TRUE next-gen title in terms of visual fidelity. AC Unity would have been first, but Ubisoft sacrificed too much to get it running on the consoles, which were tapped out CPU-wise from having to decompress the lighting data and process all of the A.I. from the massive crowds.

I have GTX 980 SLI as well. I'm hoping that I'll be able to max out the game minus ubersampling when it ships.
Click to expand...
I see; I think I must be mixing it up with something else, because I saw FLEX mentioned somewhere. I hope the version of PhysX they are using in TW3 (whether CPU or GPU) is optimized enough.

Btw, don't we have the option to switch PhysX from GPU to CPU in the Nvidia Control Panel? So is it necessary to drop it completely? Previously, when I was using a Sapphire R9 290 and Lords of the Fallen got released, I had no trouble with it, but some of my friends with Nvidia cards had trouble with PhysX on the GPU. To solve it they switched to the CPU and the game started working fine; then the dev patched the game, and now it works fine with GPU PhysX as well.

lol, don't mention AC Unity in the next-gen category; that game is the worst-optimized game I have ever seen. Even with my two GTX 980s it drops to 50 fps occasionally when I am in crowded streets, mostly because the game takes CPU utilization to 100%, which is the point where the CPU can no longer feed the GPUs. Otherwise, I think two GTX 980s are overkill for any game available at the moment at 1080p.

Right now I am really happy with Dragon Age: Inquisition's performance. Despite being an AMD-optimized title, it runs extremely well on my cards; I can easily go over 100 fps on max settings + 2x MSAA at 1440p achieved with DSR. The game looks phenomenal at those settings, and there are no SLI scaling problems.

I too hope that I can max out TW3 on day one (excluding uber sampling) with 2x MSAA at 1080p. I hope it's not too much to ask lol.
 
Last edited: Mar 6, 2015

prince_of_nothing

Forum veteran
#24
Mar 6, 2015
tahirahmed said:
I see; I think I must be mixing it up with something else, because I saw FLEX mentioned somewhere. I hope the version of PhysX they are using in TW3 (whether CPU or GPU) is optimized enough.
Click to expand...
Witcher 3 uses PhysX 3.3.3 if I'm not mistaken, and it's very fast and capable of delivering advanced physics effects via software so don't worry about performance.

Btw, don't we have the option to switch PhysX from GPU to CPU in the Nvidia Control Panel? So is it necessary to drop it completely? Previously, when I was using a Sapphire R9 290 and Lords of the Fallen got released, I had no trouble with it, but some of my friends with Nvidia cards had trouble with PhysX on the GPU. To solve it they switched to the CPU and the game started working fine; then the dev patched the game, and now it works fine with GPU PhysX as well.
Click to expand...
Yes, you have that option. It's under "Configure SLI, Surround, PhysX."

lol, don't mention AC Unity in the next-gen category; that game is the worst-optimized game I have ever seen. Even with my two GTX 980s it drops to 50 fps occasionally when I am in crowded streets, mostly because the game takes CPU utilization to 100%, which is the point where the CPU can no longer feed the GPUs. Otherwise, I think two GTX 980s are overkill for any game available at the moment at 1080p.
Click to expand...
I wouldn't personally characterize AC Unity as unoptimized. It's just very dense in its detail, which puts a tremendous burden on the CPU. I have an overclocked 4930K driving my GTX 980s, so I have no problem maintaining 60 FPS at all times...

If you have a standard quad core though, I can see how it would be problematic. In fact, when the Witcher 3 comes out, it may be better for you to run the PhysX on the GPU rather than the CPU as you will have more spare GPU power on hand than CPU.

I too hope that I can max out TW3 on day one (excluding uber sampling) with 2x MSAA at 1080p. I hope it's not too much to ask lol.
Click to expand...
Like you said, GTX 980 SLI is overkill for 1080p. I game at 1440p, so it's more of a necessity for me to use SLI if I want to max out the latest games..
 

tahirahmed

Rookie
#25
Mar 7, 2015
prince_of_nothing said:
Witcher 3 uses PhysX 3.3.3 if I'm not mistaken, and it's very fast and capable of delivering advanced physics effects via software so don't worry about performance.
Click to expand...
Thanks for clarifying. I tried PhysX 3.x too, in Metro Redux, when I had the R9 290, and had no problem with my i5 4690k driving the PhysX via software, so yes, it is indeed very well optimized for the CPU.

prince_of_nothing said:
I wouldn't personally characterize AC Unity as unoptimized. It's just very dense in its detail, which puts a tremendous burden on the CPU.
Click to expand...
Well, in PC gaming a well-optimized game usually puts less load on the CPU and more on the GPU, which is a wise choice, because PCs have tremendous power in their GPUs; the CPUs are powerful too, but they are often burdened by API and OS overhead. This is exactly the opposite of what consoles offer, so a game that is well optimized for PC should rely more on the GPU than the CPU.

But you are also right in saying that ACU is very dense in detail. What bothers me, though, is that the detail they added to the game (the crowd) is just for visual enhancement; the game doesn't offer anything beyond that. For example, it's not a full open world like some previous AC games; it's just one big city with streets crowded by people, and those people don't offer anything significant. To be honest, AC4: Black Flag, despite having fewer people in its cities and villages, offered way more interactivity and realism than this game. For me, ACU is the prime example of the fact that graphics alone don't make a game good.

prince_of_nothing said:
I have an overclocked 4930K driving my GTX 980s, so I have no problem maintaining 60 FPS at all times...

If you have a standard quad core though, I can see how it would be problematic. In fact, when the Witcher 3 comes out, it may be better for you to run the PhysX on the GPU rather than the CPU as you will have more spare GPU power on hand than CPU.
Click to expand...
The i7 4930k is indeed a very powerful CPU. So far my i5 4690k hasn't had problems with any games except ACU, which shows 100% utilization. In Crysis 3 I can reach a max of 165 fps in the less dense parts of the game, with CPU utilization at 80% - 85% on all cores, and if I target just 60 fps with Vsync the utilization goes further down. The same goes for DAI: at 60 fps I only get 45% - 60% utilization on all cores.
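The drop in utilization when capping at 60 fps follows from the per-frame time budget, which is easy to put numbers on (a quick sketch using the frame rates mentioned above):

```python
# Per-frame time budget for a target frame rate, in milliseconds.
def frame_budget_ms(fps):
    return 1000.0 / fps

uncapped = frame_budget_ms(165)  # at 165 fps the CPU must finish each frame in ~6.1 ms
vsync_60 = frame_budget_ms(60)   # a 60 fps cap allows ~16.7 ms per frame
print(f"{uncapped:.1f} ms vs {vsync_60:.1f} ms of CPU time per frame")
```

Capping to 60 fps nearly triples the time the CPU has for each frame, so average utilization falls accordingly.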

prince_of_nothing said:
Like you said, GTX 980 SLI is overkill for 1080p. I game at 1440p, so it's more of a necessity for me to use SLI if I want to max out the latest games..
Click to expand...
Yes, it's kind of stupid to remain at 1080p with two powerful cards like the GTX 980, though at the moment I can't really afford a good 1440p monitor, so I just supersample games to 1440p with DSR and they look awesome. I don't know if I will get that DSR luxury with The Witcher 3, but who knows :)
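The GPU cost of DSR supersampling scales with the rendered pixel count, which is simple to quantify (a quick sketch):

```python
# DSR renders internally at a higher resolution, then downsamples to the
# native display resolution, so GPU load scales with the pixel count.
def pixel_count(width, height):
    return width * height

native_1080p = pixel_count(1920, 1080)  # 2,073,600 pixels
dsr_1440p = pixel_count(2560, 1440)     # 3,686,400 pixels
factor = dsr_1440p / native_1080p
print(f"1440p DSR pushes {factor:.2f}x the pixels of native 1080p")
```

Roughly 1.78x the pixels per frame, which is why a setup that is overkill at native 1080p gets put to work under DSR.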
 
Last edited: Mar 7, 2015

prince_of_nothing

Forum veteran
#26
Mar 7, 2015
tahirahmed said:
Well, in PC gaming a well-optimized game usually puts less load on the CPU and more on the GPU, which is a wise choice, because PCs have tremendous power in their GPUs; the CPUs are powerful too, but they are often burdened by API and OS overhead. This is exactly the opposite of what consoles offer, so a game that is well optimized for PC should rely more on the GPU than the CPU.
Click to expand...
Usually this would be the case, but for some workloads, such as A.I., the CPU is faster than the GPU. But I do agree that Ubisoft should have used the GPU for decompressing the lighting data in AC Unity at the very least. If I had to guess, the reason they didn't use the GPU for that task is that they didn't yet have the proper software in place to exploit the compute capabilities of the PS4 and Xbox One..

Decompression algorithms have been run on GPUs in PC games as well: Rage used CUDA to decompress textures, and Civilization 5 likewise used DirectCompute for texture decompression..

But you are also right in saying that ACU is very dense in detail. What bothers me, though, is that the detail they added to the game (the crowd) is just for visual enhancement; the game doesn't offer anything beyond that. For example, it's not a full open world like some previous AC games; it's just one big city with streets crowded by people, and those people don't offer anything significant. To be honest, AC4: Black Flag, despite having fewer people in its cities and villages, offered way more interactivity and realism than this game. For me, ACU is the prime example of the fact that graphics alone don't make a game good.
Click to expand...
I agree that the crowds in AC Unity were used primarily for atmosphere more than anything else. But that can be excused, because AC Unity is an adventure game. The Witcher 3, on the other hand, is an action RPG, so even minor NPCs should be a lot more interactive and possess more depth..

I forget exactly, but I think the city of Novigrad has over a thousand distinct A.I. entities..

The i7 4930k is indeed a very powerful CPU. So far my i5 4690k hasn't had problems with any games except ACU, which shows 100% utilization. In Crysis 3 I can reach a max of 165 fps in the less dense parts of the game, with CPU utilization at 80% - 85% on all cores, and if I target just 60 fps with Vsync the utilization goes further down. The same goes for DAI: at 60 fps I only get 45% - 60% utilization on all cores.
Click to expand...
Have you tried overclocking your CPU? Overclocking can make a big difference in performance when you're CPU-bound..
 

tahirahmed

Rookie
#27
Mar 7, 2015
prince_of_nothing said:
Usually this would be the case, but for some workloads, such as A.I., the CPU is faster than the GPU. But I do agree that Ubisoft should have used the GPU for decompressing the lighting data in AC Unity at the very least. If I had to guess, the reason they didn't use the GPU for that task is that they didn't yet have the proper software in place to exploit the compute capabilities of the PS4 and Xbox One..

Decompression algorithms have been run on GPUs in PC games as well: Rage used CUDA to decompress textures, and Civilization 5 likewise used DirectCompute for texture decompression..
Click to expand...
I think ACU is a game where an API like Mantle or DX12 could have helped a lot in relaxing the CPU load?

prince_of_nothing said:
I forget exactly, but I think the city of Novigrad has over a thousand distinct A.I. entities..
Click to expand...
You mean TW3 will be very demanding on the CPU as well? I hope my 4690k can hold up :D

prince_of_nothing said:
Have you tried overclocking your CPU? Overclocking can make a big difference in performance when you're CPU-bound..
Click to expand...
Previously I used to run a 4.5 GHz OC on my 4690k with my R9 290, and it helped in some cases, but I can't do that any longer, as I'll face power-related issues. Actually, I am facing power-related issues right now: with 2x Gigabyte GTX 980 G1 graphics cards, my old OCZ 850W PSU can't handle them. When the top card reaches 80°C, the system shuts down and restarts after a second. I guess this is happening because of the overload protection in the PSU?

If I use a single card and let it reach 80°C by slowing the fans, nothing happens (no restart); it only happens with two cards, and only when the top one reaches 80°C. I guess the card asks for more wattage at higher temps to keep that rather high factory overclock of 1366 MHz stable? As you can see on the page below, the card even crosses the R9 290X when pushed to the maximum, so I guess that's what's happening to me.

https://www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_980_G1_Gaming/25.html

For now I have downclocked the cards from 1366 to 1200 MHz and reduced the power limit to 85%, which solved the issue. I also made an aggressive fan profile to keep the top card cooler, and now it stays around 70°C - 72°C max, 66°C under light load.

I plan to buy a Cooler Master V1000 in a month so I can revert to the original settings without worrying about restarts or any other power-related issue. Another reason for switching the PSU is that the Gigabyte cards ask for a pair of 8+8-pin power connectors while my PSU has only one 8+8-pin pair; the other pair is just 6+6-pin, so it can't be used for the second card, and therefore I am forced to use 4x Molex connectors to power the second one :(

After the PSU purchase I will certainly OC the CPU, as with two 980s it can really boost performance.
 

prince_of_nothing

Forum veteran
#28
Mar 7, 2015
tahirahmed said:
I think ACU is a game where an API like Mantle or DX12 could have helped a lot in relaxing the CPU load?
Click to expand...
Definitely!

You mean TW3 will be very demanding on the CPU as well? I hope my 4690k can hold up :D
Click to expand...
Of that there can be no doubt, due to the sheer amount of physics, A.I and the level of detail in the game. There is a reason why a hyperthreaded CPU is in the recommended specs. And while your CPU is very fast and has the latest core microarchitecture, it has no hyperthreading. One way to offset that would be to overclock it.

At 4.5 GHz you shouldn't have any issues, I'd wager, although your CPU usage will still be high.

Previously I used to run a 4.5 GHz OC on my 4690k with my R9 290, and it helped in some cases, but I can't do that any longer, as I'll face power-related issues. Actually, I am facing power-related issues right now: with 2x Gigabyte GTX 980 G1 graphics cards, my old OCZ 850W PSU can't handle them. When the top card reaches 80°C, the system shuts down and restarts after a second. I guess this is happening because of the overload protection in the PSU?
Click to expand...
That has only happened to me once, years ago when I had GTX 480 SLI. When I overclocked the cards, the PSU's overload protection would kick in and shut down the system. It happened when total system power draw approached 1000 watts. Eventually, I ended up killing that PSU, so be careful..

If I use a single card and let it reach 80°C by slowing the fans, nothing happens (no restart); it only happens with two cards, and only when the top one reaches 80°C. I guess the card asks for more wattage at higher temps to keep that rather high factory overclock of 1366 MHz stable? As you can see on the page below, the card even crosses the R9 290X when pushed to the maximum, so I guess that's what's happening to me.
Click to expand...
They used Furmark to reach that level of power draw though, which doesn't really mirror real gaming.. Is your PSU SLI certified? I have an Antec HCP 1200w myself. I never recommend skimping on the power supply. Like a chassis, the PSU is one of those things that you will have for a long time, so it's best to get a high quality unit if you can afford it.

I plan to buy a Cooler Master V1000 in a month so I can revert to the original settings without worrying about restarts or any other power-related issue. Another reason for switching the PSU is that the Gigabyte cards ask for a pair of 8+8-pin power connectors while my PSU has only one 8+8-pin pair; the other pair is just 6+6-pin, so it can't be used for the second card, and therefore I am forced to use 4x Molex connectors to power the second one :(
Click to expand...
Looks like Gigabyte has a completely different AiB design for their GTX 980s. I'm pretty sure mine are 8+6, and I have the EVGA GTX 980 FTWs..
 

tahirahmed

Rookie
#29
Mar 7, 2015
prince_of_nothing said:
Of that there can be no doubt, due to the sheer amount of physics, A.I and the level of detail in the game. There is a reason why a hyperthreaded CPU is in the recommended specs. And while your CPU is very fast and has the latest core microarchitecture, it has no hyperthreading. One way to offset that would be to overclock it.

At 4.5 GHz you shouldn't have any issues, I'd wager, although your CPU usage will still be high.
Click to expand...
I guess then I'll just get a better PSU and overclock that i5 to see how it performs in TW3; otherwise it's finally time for an i7. My reason for staying away from the i7 is that a lot of devs these days exaggerate CPU requirements and mention an i7 in their recommended specs, but it turns out their games perform just as well on good i5s. One such game is Watch Dogs, which asked for a high-end i7 (a 4770k, I think?), but I had no problem running it on my stock 4690k; I even ran it on an overclocked i5 2500k and had no issues. So I take i7 requirements lightly, but you're right that hyperthreading will slowly become more useful.

prince_of_nothing said:
That has only happened to me once, years ago when I had GTX 480 SLI. When I overclocked the cards, the PSU's overload protection would kick in and shut down the system. It happened when total system power draw approached 1000 watts. Eventually, I ended up killing that PSU, so be careful..
Click to expand...
Yes, that's why I downclocked the cards to 1200 MHz and set the power limit to 85% with an aggressive fan profile. Now I no longer face any shutdowns during gaming, but I'm afraid I'm still pushing the PSU too far.

prince_of_nothing said:
They used Furmark to reach that level of power draw though, which doesn't really mirror real gaming.. Is your PSU SLI certified? I have an Antec HCP 1200w myself. I never recommend skimping on the power supply. Like a chassis, the PSU is one of those things that you will have for a long time, so it's best to get a high quality unit if you can afford it.
Click to expand...
Yes, the PSU is SLI certified, though it's very old now. I don't remember exactly, but I think I purchased it in 2007 - 2008; back then it was regarded as one of the best PSUs, but now I think this kind of load is too much for it.

Here is an old review of my PSU from Overclock3D

http://www.overclock3d.net/reviews/power_supply/ocz_gamexstream_850w_atx_psu/1

Do you think a 1000W PSU will do fine for my system, or do I need to go above that? I have 4 drives (2x SSDs and 2x HDDs), a 4690k that I plan to OC, an MSI Z97 G45 Gaming motherboard, 5 case fans including one for the CPU, and two Gigabyte GTX 980 G1s.

This is the PSU I am planning to get; I've read a lot of good reviews about it:

http://www.coolermaster.com/powersupply/v-series-psu/v1000/

and these are the PSUs easily available in my country (in case I should change my choice):

http://www.softland.com.sa/index.php?route=product/category&path=100_156

Sorry for asking so many questions; I have no idea what PSU would be enough for a dual-card system, or what the combined power requirement will be. Gigabyte recommends a PSU of at least 600 watts for a single card, but that's for the entire system, so I was assuming that my 850W PSU would handle two cards. Apparently that's not the case, or maybe it's the Gigabyte GTX 980s that ask for too much power.
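For a rough answer, the combined requirement can be estimated by summing per-component draw (a back-of-the-envelope sketch; the wattage figures below are typical ballpark assumptions, not measurements of this exact build):

```python
# Rough power budget for the build listed above. All figures are ballpark
# assumptions (reference TDPs / typical draw), not measurements; factory-
# overclocked cards like the G1 draw more than the 165 W reference TDP.
components_watts = {
    "2x GTX 980 (factory OC)": 2 * 200,
    "i5-4690K (overclocked)": 120,
    "motherboard + RAM": 60,
    "2x SSD + 2x HDD": 30,
    "5x case fans": 15,
}
total = sum(components_watts.values())
suggested = total * 1.3  # ~30% headroom for transient spikes and PSU aging
print(f"Estimated load ~{total} W; a quality ~{suggested:.0f} W PSU leaves headroom")
```

By this estimate the system sits around 600-650 W under load, which is why an 850 W unit is marginal once rail limits and aging are factored in, and 1000 W is comfortable.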

prince_of_nothing said:
Looks like Gigabyte has a completely different AiB design for their GTX 980s. I'm pretty sure mine are 8+6, and I have the EVGA GTX 980 FTWs..
Click to expand...
Yes, I saw that most 980s are 6+8 pin. It's just my luck that I went with the Gigabyte models :( I saw good reviews about them, though.
 

prince_of_nothing

Forum veteran
#30
Mar 8, 2015
tahirahmed said:
I guess then I'll just get a better PSU and overclock that i5 to see how it performs in TW3; otherwise it's finally time for an i7. My reason for staying away from the i7 is that a lot of devs these days exaggerate CPU requirements and mention an i7 in their recommended specs, but it turns out their games perform just as well on good i5s. One such game is Watch Dogs, which asked for a high-end i7 (a 4770k, I think?), but I had no problem running it on my stock 4690k; I even ran it on an overclocked i5 2500k and had no issues. So I take i7 requirements lightly, but you're right that hyperthreading will slowly become more useful.
Click to expand...
Games still don't really "require" hyperthreading, but they benefit from it. Watch Dogs is one such example: look at the performance difference between an i5 2500K and an i7 2600K. Most of that difference is due to hyperthreading. Even in games that can't take direct advantage of hyperthreading, it can still provide benefits, because it mitigates the performance impact of thread stalls.



Yes, the PSU is SLI certified, though it's very old now. I don't remember exactly, but I think I purchased it in 2007 - 2008; back then it was regarded as one of the best PSUs, but now I think this kind of load is too much for it.

Here is an old review of my PSU from Overclock3D

http://www.overclock3d.net/reviews/power_supply/ocz_gamexstream_850w_atx_psu/1
Click to expand...
Yeah, I can see a problem with that PSU: it doesn't deliver enough amperage on the +12V rail for your cards. I would look into getting a PSU with at least 60A on the +12V rail...

Do you think a 1000W PSU will do fine for my system, or do I need to go above that? I have 4 drives (2x SSDs and 2x HDDs), a 4690k that I plan to OC, an MSI Z97 G45 Gaming motherboard, 5 case fans including one for the CPU, and two Gigabyte GTX 980 G1s.
Click to expand...
1000W is plenty. Even 850W would likely be enough, as long as it delivered enough amperage on the +12V rail.

This is the PSU I am planning to get, I read a lot of good reviews about it

http://www.coolermaster.com/powersupply/v-series-psu/v1000/
Click to expand...
Looks dandy! ;) That PSU delivers 83A on the +12V rail, so that's more than enough to completely guarantee stability.

Sorry for asking so many questions; I have no idea what PSU would be enough for a dual-card system, or what the combined power requirement will be. Gigabyte recommends a PSU of at least 600 watts for a single card, but that's for the entire system, so I was assuming that my 850W PSU would handle two cards. Apparently that's not the case, or maybe it's the Gigabyte GTX 980s that ask for too much power.
Click to expand...
Like I said earlier, I think your PSU's problem is that it's not delivering enough amps via the +12V rail. Whenever you buy a PSU, that's one of the most important things to look for, especially if you are using SLI.

Yes, I saw that most 980s are 6+8 pin. It's just my luck that I went with the Gigabyte models :( I saw good reviews about them, though.
Click to expand...
The G1 cards are among the very best of the 980s so don't think like that. I'm just surprised that Gigabyte deviated so far from the normal specifications.

Likely they did it to improve overclocking and clock speed stability..

---------- Updated at 09:17 AM ----------

BTW mods, this thread is now way off topic. Can you just merge it with the big hardware specs thread please?
 

tahirahmed

Rookie
#31
Mar 8, 2015
prince_of_nothing said:
Games still don't really "require" hyperthreading, but they benefit from it. Watch Dogs is one such example: look at the performance difference between an i5 2500K and an i7 2600K. Most of that difference is due to hyperthreading. Even in games that can't take direct advantage of hyperthreading, it can still provide benefits, because it mitigates the performance impact of thread stalls.
Click to expand...
Hmm, that's certainly enlightening... some time ago I was actually quite serious about getting a Core i7 4790k, but then I dropped the idea; now I'll keep it on my list, after the PSU. Thanks for the chart; numbers are much better than explanations and proofs :D

prince_of_nothing said:
Yeah, I can see a problem with that PSU: it doesn't deliver enough amperage on the +12V rail for your cards. I would look into getting a PSU with at least 60A on the +12V rail...
Click to expand...
Well, I really have to thank you for pointing out the core of the problem. A lot of people just told me that an 850W PSU should handle 980s in SLI, and I admit I don't really know much about these amperage ratings. I bought this 850W PSU a long time ago, and since then it has worked really well with my single-card configs, so I never bothered looking into the PSU's specs and details.

prince_of_nothing said:
Looks dandy! That PSU delivers 83A on the +12V rail, so that's more than enough to completely guarantee stability.
Click to expand...
Now I can see the difference: even though it's just 150W over my PSU, it delivers way more amperage on its +12V rail compared to the 18 or 20 amperes on my current PSU's rails. I wonder how it kept powering a hungry card like the R9 290 without a hitch? Maybe because it wasn't in Crossfire?
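The comparison comes down to P = V x I on the +12V rail (a quick sketch using the amperage figures mentioned in this thread):

```python
# Wattage available on a +12V rail is simply volts times amps.
def rail_watts(amps, volts=12.0):
    return volts * amps

old_per_rail = rail_watts(18)     # ~216 W per rail on an older multi-rail PSU
new_single_rail = rail_watts(83)  # 996 W on the V1000's single +12V rail
print(f"{old_per_rail:.0f} W per old rail vs {new_single_rail:.0f} W on one big rail")
```

A single card drawing well over 200 W can saturate one ~216 W rail even when the PSU's total wattage looks sufficient, which would be consistent with shutdowns under dual-card load.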

prince_of_nothing said:
The G1 cards are among the very best of the 980s so don't think like that. I'm just surprised that Gigabyte deviated so far from the normal specifications.

Likely they did it to improve overclocking and clock speed stability..
Click to expand...
I'm not saying they are bad cards; I really like their performance, and the all-black look is one of my favorites. It's just that they are more demanding than most of the 980s out there and ask for 8+8-pin power connectors. They have a good factory overclock, and reviews say they overclock like a charm, so there is not much I can complain about :)

Btw, it's me who turned this into a hardware thread, so I'm sorry about that; I was just too anxious about my PSU issue.

Anyway, to get back on topic: did you notice any PhysX in the PAX gameplay video? I don't know if it was from a console or a PC using high settings (non-ultra). HairWorks was missing from Geralt's hair, and the wolves were too far away to notice anything, but I think one of his signs (Igni) showed many fire particles. I think it was PhysX, but I can't say whether it was CPU or GPU PhysX.
 
Last edited: Mar 8, 2015

prince_of_nothing

Forum veteran
#32
Mar 9, 2015
tahirahmed said:
Now I can see the difference: even though it's just 150W over my PSU, it delivers way more amperage on its +12V rail compared to the 18 or 20 amperes on my current PSU's rails. I wonder how it kept powering a hungry card like the R9 290 without a hitch? Maybe because it wasn't in Crossfire?
Click to expand...
Yeah, most likely because it wasn't in Crossfire. Although the R9 290 doesn't really suck as much juice as most people think.

Anyway, to get back on topic: did you notice any PhysX in the PAX gameplay video? I don't know if it was from a console or a PC using high settings (non-ultra). HairWorks was missing from Geralt's hair, and the wolves were too far away to notice anything, but I think one of his signs (Igni) showed many fire particles. I think it was PhysX, but I can't say whether it was CPU or GPU PhysX.
Click to expand...
Download the uncompressed video; PhysX is all over the place, but still no GPU PhysX that I've seen. HairWorks was also enabled in the last GDC video; you can see it on the wolves and on the wyvern. BTW, you may or may not know this, but HairWorks doesn't use PhysX. It uses DirectCompute, so it should be available to AMD users as well.

All in all, though, the cloth simulation and environmental PhysX are very impressive so far in The Witcher 3. CDPR made the RIGHT decision to drop Havok for PhysX, as PhysX is more impressive..
 

tahirahmed

Rookie
#33
Mar 9, 2015
prince_of_nothing said:
Yeah, most likely because it wasn't in Crossfire. Although the R9 290 doesn't really suck as much juice as most people think.
Click to expand...
With Maxwell out, people started to think very badly of Hawaii, and probably Kepler too, in terms of power requirements.

prince_of_nothing said:
Download the uncompressed video; PhysX is all over the place, but still no GPU PhysX that I've seen. HairWorks was also enabled in the last GDC video; you can see it on the wolves and on the wyvern.
Click to expand...
I see; I guess I need to download that uncompressed video from Gamersyde. Is there any way to differentiate CPU PhysX from GPU PhysX? I mean, how do you recognize that it's just CPU PhysX in action? Also, if it's just CPU PhysX, can't we take advantage of the GPU even if we have Nvidia?

prince_of_nothing said:
BTW, you may or may not know this, but HairWorks doesn't use PhysX. It uses DirectCompute, so it should be available to AMD users as well.

All in all, though, the cloth simulation and environmental PhysX are very impressive so far in The Witcher 3. CDPR made the RIGHT decision to drop Havok for PhysX, as PhysX is more impressive..
Click to expand...
Yes, I know; I used HairWorks in Far Cry 4 with my R9 290 and it worked fine. It's just Nvidia's version of TressFX. Even most of the other GameWorks features work just fine on AMD; I've been advocating that fact for a long time :D A lot of AMD owners don't know it, and on top of that, Nvidia keeps calling those features "exclusive," which makes things even more difficult to understand. For example, Nvidia calls God Rays an exclusive Nvidia feature while they work fine on AMD too, which is misleading for people who haven't tried them personally.
 

prince_of_nothing

Forum veteran
#34
Mar 11, 2015
tahirahmed said:
I see, I guess I need to download that uncompressed video from Gamersyde. Is there any way to differentiate CPU PhysX from GPU PhysX ? I mean how you recognize it's just CPU PhysX in action ? Also if it's just CPU PhysX then we can't take advantage of GPU even if we have Nvidia ?
Click to expand...
The only reason I'm certain that what we've seen so far is CPU PhysX is that the settings have all been on high. High settings should only use the CPU for PhysX. But for ultra settings, perhaps the GPU is used as well.

By the way, according to this article from pcgameshardware.de, GPU PhysX might still be in play. It's kind of ambiguous though, as the dev doesn't explicitly state that they will be using GPU PhysX:

Source
 

tahirahmed

Rookie
#35
Mar 12, 2015
prince_of_nothing said:
The only reason I'm certain that what we've seen so far is CPU PhysX is that the settings have all been on high. High settings should only use the CPU for PhysX. But for ultra settings, perhaps the GPU is used as well.
Click to expand...
Yes, I saw the uncompressed video, and that might be the case. But even if it's CPU PhysX, it's kind of impressive to see that much. I mean, we'll need a beastly CPU to do that kind of PhysX. I hope it doesn't put too much load on the CPU, because the GPU also has a lot to do in a game like this (even though it's still much faster than the CPU at physics), and since the CPU already has enough A.I. load and other stuff, I hope this PhysX doesn't prove too much of a burden.

prince_of_nothing said:
By the way, according to this article from pcgameshardware.de, GPU PhysX might still be in play. It's kind of ambiguous though, as the dev doesn't explicitly state that they will be using GPU PhysX:

Source
Click to expand...
I see, maybe they're hiding GPU PhysX just like they're hiding the ultra preset? I wonder what options we'll have for ultra, and since they used HairWorks at high settings, it means ultra has some other stuff in store for us.

Btw, just want to tell you that I bought a Cooler Master V1200 (1200W) PSU. It has 100 amps on the +12V rail and it's certified as an 80+ Platinum supply. I think it's overkill for my system, but since a PSU is a long-term investment, this will be a worthwhile purchase for the future. This will also give me ample room to OC the CPU as well as those 980's. Once again, thank you for all the help :)
 

prince_of_nothing

Forum veteran
#36
Mar 13, 2015
tahirahmed said:
Yes, I saw the uncompressed video, and that might be the case. But even if it's CPU PhysX, it's kind of impressive to see that much. I mean, we'll need a beastly CPU to do that kind of PhysX. I hope it doesn't put too much load on the CPU, because the GPU also has a lot to do in a game like this (even though it's still much faster than the CPU at physics), and since the CPU already has enough A.I. load and other stuff, I hope this PhysX doesn't prove too much of a burden.
Click to expand...
Yep, it's like we discussed earlier about hyperthreading. The CPU has a lot to do in this game, which is likely why a 3770K is the recommended CPU for this game, as it has hyperthreading.

And if you don't have hyperthreading, then overclocking is the best way to mitigate that difference.

Btw, just want to tell you that I bought a Cooler Master V1200 (1200W) PSU. It has 100 amps on the +12V rail and it's certified as an 80+ Platinum supply. I think it's overkill for my system, but since a PSU is a long-term investment, this will be a worthwhile purchase for the future. This will also give me ample room to OC the CPU as well as those 980's. Once again, thank you for all the help :)
Click to expand...
That's great! You won't regret it, as you'll have that PSU for years to come!
 

tahirahmed

Rookie
#37
Mar 17, 2015
prince_of_nothing said:
Yep, it's like we discussed earlier about hyperthreading. The CPU has a lot to do in this game, which is likely why a 3770K is the recommended CPU for this game, as it has hyperthreading.

And if you don't have hyperthreading, then overclocking is the best way to mitigate that difference.
Click to expand...
Let's see what my overclocked i5 can do in this; it will be a real test :D

prince_of_nothing said:
That's great! You won't regret it, as you'll have that PSU for years to come!
Click to expand...
Thanks, yes, that was my thinking: better to buy an expensive one once than to buy cheap ones many times.

Speaking about the topic itself, I saw this article today and was wondering if GPU PhysX has indeed been removed from the game. Additionally, I haven't seen Marcin Momot mention GPU PhysX, or even just PhysX, in the promotion of TW3 with the GTX 900 series cards; he only said that the game supports Nvidia HairWorks, APEX Clothing and Destruction.
 

prince_of_nothing

Forum veteran
#38
May 15, 2015
Nvidia's Andrew Burnes (he writes GeForce optimization guides for Nvidia) confirmed that The Witcher 3: Wild Hunt will use CPU PhysX only. Whether this might change in the future is anyone's guess, but personally I'm happy that they ditched GPU PhysX. CPU PhysX is much better now than it was, say, five years ago, due to a more optimized SDK that properly leverages SIMD and multithreading on modern CPUs.

Though not a selectable or configurable setting, The Witcher 3: Wild Hunt also features NVIDIA PhysX, which adds realistic cloth and destruction effects throughout the game. Powered by the CPU and available on all platforms, these effects make the world feel more alive and dynamic, with cloth blowing in the wind and objects exploding realistically. On PC, these effects run at a higher detail level and with additional particles that further increase their fidelity and realism.
Click to expand...
Source
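For anyone wondering what "leveraging SIMD" actually buys you, here's a rough analogy (just an illustrative sketch in Python/NumPy, not actual PhysX SDK code): updating a whole batch of particles in one vectorized operation lets the runtime run tight SIMD-friendly native loops instead of per-object interpreted ones, which is a big part of how a well-optimized CPU physics path keeps up.

```python
import numpy as np

def step_naive(pos, vel, dt):
    # One particle at a time -- how an unoptimized engine might integrate.
    out = pos.copy()
    for i in range(len(pos)):
        out[i] = pos[i] + vel[i] * dt
    return out

def step_vectorized(pos, vel, dt):
    # Whole-array update; NumPy dispatches to vectorized loops in C,
    # which compilers can map onto SIMD instructions.
    return pos + vel * dt

pos = np.zeros((10_000, 3))
vel = np.ones((10_000, 3))
# Both produce identical results; the vectorized version is just far faster.
assert np.allclose(step_naive(pos, vel, 0.016), step_vectorized(pos, vel, 0.016))
```

Same idea applies to multithreading: split the particle array into chunks and integrate them on worker threads, which is roughly what a modern physics SDK's CPU dispatcher does with simulation tasks.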
 
sidspyker

Ex-moderator
#39
May 15, 2015
Another thing the man mentioned, which is not really related to the topic but... the game isn't very CPU intensive.
 

prince_of_nothing

Forum veteran
#40
May 15, 2015
sidspyker said:
Another thing the man mentioned, which is not really related to the topic but... the game isn't very CPU intensive.
Click to expand...
Yeah, I saw that as well. But keep in mind that Andrew has an elite-tier rig equipped with an Intel 5960X, which he's using to test the game. That might skew his opinion somewhat.
 