Nvidia’s GameWorks: A double-edged sword for Witcher 3

I would have no problem with that. Currently the game I'm most looking forward to doesn't use any third-party software for trees, physics, etc., so it won't impact me either way. The game uses an engine that was designed completely from scratch and whose physics is more advanced than anything else currently available (PhysX, Havok, etc.).

Which title is that? I mean, crap, even the indie games I've supported lately, which would never have been ALLOWED to be produced by a big mega-corp, seem to be made with glitchy code anyway (not all, but enough), since the teams are so small. Something good on a better engine would be a relief.
 
I have an inkling it might be Sui Generis ;D
(see his signature ;) )
 
Which is exactly why I hope they listen to fans and lock down anything like TressFX, and anything that comes after it, just the same, and treat the Nvidia lovers to what they support, all the time, on every title, from now on. ;)

I even think and hope for 'good coding' that locks every single in-game effect they get designed from now on to a single thread; best way to market it, you know I'm right!
OpenCL performance on Nvidia is pretty crap, and the drivers still only expose OpenCL 1.1 last I checked, so they wouldn't need to do anything at all. That, and... no game other than TR is using it.
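For reference, the OpenCL version a driver reports is easy to check yourself. Below is a minimal sketch using the standard OpenCL C API, assuming the headers and an ICD loader are installed; error handling is mostly omitted.

```cpp
// Minimal sketch: list each OpenCL platform and the version its driver reports.
// On older Nvidia drivers the version string may read "OpenCL 1.1 CUDA ...".
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platforms[8];
    cl_uint count = 0;
    if (clGetPlatformIDs(8, platforms, &count) != CL_SUCCESS || count == 0) {
        std::printf("No OpenCL platforms found.\n");
        return 1;
    }
    for (cl_uint i = 0; i < count; ++i) {
        char name[256] = {0};
        char version[256] = {0};
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof(name), name, nullptr);
        clGetPlatformInfo(platforms[i], CL_PLATFORM_VERSION, sizeof(version), version, nullptr);
        std::printf("%s -> %s\n", name, version);
    }
    return 0;
}
```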
 
Star Citizen, Lichdom ;)
Lichdom is using TressFX? Huh, didn't know that, or rather didn't notice, because one of the few characters actually visible wears a hood, at least in the beginning part.
Star Citizen isn't exactly out yet :p
(modules don't count)

That's just a digression though; OpenCL performs pretty terribly on NV in general, that's what I'm saying.
 
Lichdom is using TressFX? Huh, didn't know that, or rather didn't notice, because one of the few characters actually visible wears a hood, at least in the beginning part.
Star Citizen isn't exactly out yet :p
(modules don't count)
Yeah, therefore i did that winky-smiley-thingy.

That's just a digression though; OpenCL performs pretty terribly on NV in general, that's what I'm saying.
Time for Nvidia to change that; it's only a matter of whether they're willing to do something about it. OpenCL is getting more important anyway. But no, they have to stick to their proprietary API, CUDA, because Nvidia.
 
Completely ignoring that many modules which were previously CUDA are now DirectCompute. But whatever is important to sustaining the narrative, I guess.
 
Time for Nvidia to change that; it's only a matter of whether they're willing to do something about it. OpenCL is getting more important anyway. But no, they have to stick to their proprietary API, CUDA, because Nvidia.
Yeah, true. Both have their place, though; OpenCL is good at some things, CUDA at others.
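At the kernel level the two actually look pretty similar; the bigger differences are host code, tooling and libraries. A purely illustrative sketch, the same vector-add written both ways (names made up for the example):

```cpp
// Illustration only: the same vector-add expressed for both APIs.
//
// CUDA flavour (would live in a .cu file and be launched as
//   add<<<gridSize, blockSize>>>(a, b, c, n);):
//
//   __global__ void add(const float* a, const float* b, float* c, int n) {
//       int i = blockIdx.x * blockDim.x + threadIdx.x;
//       if (i < n) c[i] = a[i] + b[i];
//   }
//
// OpenCL flavour, passed to clCreateProgramWithSource() as a plain string:
static const char* kVecAddSource = R"CLC(
__kernel void add(__global const float* a,
                  __global const float* b,
                  __global float* c,
                  const int n) {
    int i = get_global_id(0);
    if (i < n) c[i] = a[i] + b[i];
}
)CLC";
```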

Completely ignoring that many modules which were previously CUDA are now DirectCompute. But whatever is important to sustaining the narrative, I guess.
Yeah, and PhysX is now entirely CPU (and pretty damn fast) barring the additional GPU-accelerated effects. But yes, there's a narrative thing going on: everything AMD has ever done is great and completely open and free, but everything NV does is bad. I don't know why, and I don't care for that nonsense, but what's untrue is simply untrue.

Mantle, for example: for all the claims of 'openness', it hasn't been open in any sense of the word so far. Only 3-4 days ago they announced that an SDK will be coming, that it will FINALLY become an open standard, and that it was shared with MS and Khronos. Anyone will be able to build Mantle drivers now, with no license fee or restrictions. Just a few days ago, and yet you hear so many uninformed people peddling it as something that's so open and free.

Likewise with FreeSync. "It's free! G-Sync needs new hardware!" It ultimately turned out that not even all of their latest GPUs can do it; only the ones made after a certain date, when DisplayPort 1.2a was finally incorporated, can do FreeSync, and you need a new monitor for it after all.
 

Not exactly. All 'Volcanic Islands' GPUs (Hawaii, Bonaire, Tonga) support FreeSync with variable refresh rates, while older GPUs still support FreeSync for different fixed refresh rates (e.g. for tear- and stutter-free video playback). And yes, you need a new monitor, but said new monitor doesn't need an additional $100 FPGA board plus a Kepler or Maxwell GPU the way G-Sync does, nor do monitor vendors have to pay a license fee (same goes for Intel and Nvidia, should they decide to adopt the DisplayPort 1.2a specification for their GPUs).
 
Not very honest now, were they? And I'm not too informed on AMD's architecture names :p

That was never mentioned until quite late.
Turns out only the newest GPU silicon from AMD will support FreeSync displays. Specifically, the Hawaii GPU that drives the Radeon R9 290 and 290X will be compatible with FreeSync monitors, as will the Tonga GPU in the Radeon R9 285. It's also possible the Bonaire chip that powers the Radeon R7 260X and HD 7790 cards could support FreeSync.

Since the current Radeon lineup is populated by a mix of newer and older GPU silicon, there are brand-new graphics cards selling today that will not support FreeSync monitors when they arrive. The list of products that won't work with FreeSync includes anything based on the older revision of the GCN architecture used in chips like Tahiti and Pitcairn.

That means brand-new cards like the Radeon R9 280, 280X, 270, and 270X won't be FreeSync-capable. Nor will any older Radeons in the HD 7000 and 8000 series. AMD tells us these prior-gen GPUs don't have the necessary support for the latest DisplayPort standard.

By contrast, Nvidia's G-Sync works with GeForce graphics cards based on the Kepler architecture, which include a broad swath of current and past products dating back to the GeForce GTX 600 series.
http://techreport.com/news/27000/amd-only-certain-new-radeons-will-work-with-freesync-displays
Point being, they're just as 'bad' as everyone else with the whole smoke-and-mirrors crap. They were selling those GPUs without mentioning this, and likewise with Mantle they gave confusing messages when it was announced about whether it only worked on GCN or not. Repi said one thing, AMD said another.

Also, we've gone completely off-topic with unrelated API and middleware talk; we should take it to a different thread.
 
Which title is that? I mean, crap, even the indie games I've supported lately, which would never have been ALLOWED to be produced by a big mega-corp, seem to be made with glitchy code anyway (not all, but enough), since the teams are so small. Something good on a better engine would be a relief.

I see you've already found it. The alpha versions of the game, which I've been playing, are already a ton of fun, so I'd say they are succeeding with their ambitions :). It's fairly impressive what Madoc (the one programmer on the Bare Mettle team) has accomplished. In order not to further derail this thread, you can read more of the information I've been posting about the game in THIS thread.
 
Yeah, true. Both have their place, though; OpenCL is good at some things, CUDA at others.


Yeah, and PhysX is now entirely CPU (and pretty damn fast) barring the additional GPU-accelerated effects. But yes, there's a narrative thing going on: everything AMD has ever done is great and completely open and free, but everything NV does is bad. I don't know why, and I don't care for that nonsense, but what's untrue is simply untrue.

Mantle, for example: for all the claims of 'openness', it hasn't been open in any sense of the word so far. Only 3-4 days ago they announced that an SDK will be coming, that it will FINALLY become an open standard, and that it was shared with MS and Khronos. Anyone will be able to build Mantle drivers now, with no license fee or restrictions. Just a few days ago, and yet you hear so many uninformed people peddling it as something that's so open and free.

Likewise with FreeSync. "It's free! G-Sync needs new hardware!" It ultimately turned out that not even all of their latest GPUs can do it; only the ones made after a certain date, when DisplayPort 1.2a was finally incorporated, can do FreeSync, and you need a new monitor for it after all.

Yeah, nope. PhysX kills my CPU (an OC'd i7 4930K) in Borderlands 2 and TONS more titles. It is not fair in the slightest, not even close, not until performance is that bad in EVERY AMD title coming out, and I can't believe they opened Mantle while competing on this level...

The Nvidia fans are strong here, clearly.
 
Borderlands 2 and TONS more titles, it is not fair in the slightest, not even close, not until performance is that bad in EVERY AMD title coming out, and I can't believe they opened Mantle while competing on this level...
You're referring to GPU-accelerated PhysX effects; I'm referring to the PhysX SDK. They're different things.

Borderlands 2? No, that's from 2012.
I mean the newest PhysX SDK runs on the CPU with CPU-specific SIMD optimizations. Try Metro Redux, for example, I guess; some people said it was using the latest PhysX SDK and giving great performance.
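For anyone curious what "PhysX on the CPU" means at the SDK level, here's a minimal sketch against the public PhysX 3.x API (names as in the SDK headers; error handling and dispatcher cleanup omitted). The point is that the rigid-body simulation is stepped by a multi-threaded CPU dispatcher, while GPU acceleration is a separate, opt-in path used for the extra effects.

```cpp
// Minimal sketch of the PhysX 3.x SDK running purely on the CPU.
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);  // 4 worker threads, not a single core
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Step the simulation at 60 Hz; the solver work here runs on the CPU workers.
    scene->simulate(1.0f / 60.0f);
    scene->fetchResults(true);

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
```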

Mantle is not comparable here at all. Things like GPU-accelerated PhysX are just pointless additions nobody misses, but an entire rendering API?
 
You're referring to GPU-accelerated PhysX effects; I'm referring to the PhysX SDK. They're different things.

Borderlands 2? No, that's from 2012.
I mean the newest PhysX SDK runs on the CPU with CPU-specific SIMD optimizations. Try Metro Redux, for example.

Alice: Madness Returns is another, a classic now and always a fave, but to run it I have to use my workstation, which has an Nvidia GPU for 3D design programs.

I'm saying that's useless to gamers with the existing games that have never been fixed. For newer games, it's nice to hear. It doesn't change which company screwed up more games than the other for profit and gain to begin with, which was all my point was, and why I'm more leery of the one than the other.
 
Except nothing in those games has been screwed up. The ADDITIONAL NV features aren't ruining the core experience in any way; if you want to max out the game with their technology then yes, you have to pay for that, such is their business.

I didn't expect to use NV features when I was still on my AMD card, but certain things worked, which is good. NV's effects don't work on your AMD card without killing performance? Well, stop enabling them. I don't get what's so hard about that; I'm not entitled to the premium proprietary technology of a GPU brand I do not own.
 
You said it.
If you really want to have PhysX, then get an Nvidia card.
 

Cool, titles to stay away from, permanently. All I needed to know. Fanbois, can't live with them, can't kill them all. Haha.


Except nothing in those games has been screwed up. The ADDITIONAL NV features aren't ruining the core experience in any way; if you want to max out the game with their technology then yes, you have to pay for that, such is their business.

I didn't expect to use NV features when I was still on my AMD card, but certain things worked, which is good. NV's effects don't work on your AMD card without killing performance? Well, stop enabling them. I don't get what's so hard about that; I'm not entitled to the premium proprietary technology of a GPU brand I do not own.

Spoken like a true fan. Yet the whining was endless over the single game where it was even attempted in reverse (TR), which was the point.
 
Spoken like a true fan.
?
Not sure what you're suggesting. Nvidia features are extras added on top; if you want ALL of them then you'll just have to get an Nvidia card. They're optional add-ons, you pay the price for wanting to use them, it's their added incentive. Not using them somehow doesn't ruin your game; if you have a fixation with maxing everything out just for the sake of maxing everything out, the onus is on you. 'You' here meaning anyone, not you in particular.

Not expecting things you didn't pay for is being a 'fan' now?

Yet the whining was endless over the single game where it was even attempted in reverse (TR), which was the point.
?
 