Forums - CD PROJEKT RED
Nvidia’s GameWorks: A double-edged sword for Witcher 3

Page 10 of 19
facemeltingsolo

Rookie
#181
Nov 16, 2014
Welp, looks like CDPR is now moderated by Nvidia marketing. The fact that a MOD told someone to get an Nvidia GPU instead is the most blatant, laughable BS I have ever seen. Not even people who always buy Nvidia products are this naive. They know that without AMD, the top-tier GPU would still cost $1,000. Also the benchmarks people linked show Nvidia being ahead by a lot WITHOUT the "Nvidia features" on, so the point is completely moot.

Add to this that HairWorks is supposed to be DirectCompute. Performance should be based around DirectCompute only and be open to optimization. It isn't. I ran tests on a water-cooled R9 290X and a GTX 760, which is not even in the same league, in the Nvidia HairWorks tool. Performance is the same, even though DirectCompute throughput on the R9 290X is much higher. I expect it to be weighted even further toward Maxwell.

Add to all this? The new GameWorks games are heavily slanted toward Maxwell. Nvidia has no one keeping them in check, and you want to put all the blame on Ubisoft, even though all the other GameWorks games run like crap as well. This is simply about pushing a new graphics card series, screwing over past customers on Nvidia hardware, and screwing over competitors. A 780 should be nowhere near this far behind a 970.

http://www.extremetech.com/gaming/194123-assassins-creed-unity-for-the-pc-benchmarks-and-analysis-of-its-poor-performance/2

The funny thing here is GTX 780/Ti owners are getting screwed the most in GameWorks games, all to push a new graphics series. Anyway, enjoy the rampant piracy on this game, CDPR, from basically everyone that isn't a GTX 9xx owner. This is what happens when you sign a contract and give a company CLOSED libraries that they control. I won't pirate this game; I will instead wait a year or two to buy it, when it is 5 bucks on Steam. You went from getting my CE edition money to 5 bucks. Enjoy.

Also, this interview with CDPR is hilarious. It doesn't matter if HairWorks is DirectCompute. The libraries are closed, and Nvidia can hamper performance on AMD hardware and non-Maxwell cards (an OLD Titan is WAY ahead of a 980 in DirectCompute) as much as they want, and CDPR can't stop it. You signed a deal with the devil; now reap those rewards. I now understand the "downgrade shots" and the vetted post on CDPR on NeoGAF (see them in all these impossibly unoptimized GameWorks games).

Also I screenshotted this reply, some of the lunacy by the mods in this forum and will be posting it to reddit and Neogaf.

PCGH: Will the hair and fur simulation run on Radeon GPUs?


Balázs Török: Yes, it should. At the moment it works, but whether it will still run in the end is Nvidia's decision. What matters is the direction in which they develop the technology, and whether, and what kind of, barriers they put in place. But I think it will also run on Radeon graphics cards.

http://www.pcgameshardware.de/The-Witcher-3-PC-237266/Specials/The-Witcher-3-welche-Grafikkarte-welche-CPU-1107469/

Have a nice day CDPR *cough*, I mean Nvidia marketing team.
 
Last edited by a moderator: Nov 16, 2014
Merc616

Senior user
#182
Nov 16, 2014
sidspyker said:
Is anyone else offering these additional effects and support for them? There's TressFX yes but we've seen that in 1 game and it tanked the framerate hard for hair on just one single character. Is there a 3rd party offering anything comparable to either of them?

What about everything else?
Nvidia's offering TXAA, the only quality Temporal anti-aliasing solution we have so far.
They're offering the PhysX SDK, and we have nothing comparable here. Again, Havok? Bullet? They need to step it up. PhysX also runs entirely on the CPU, not the GPU.
Clothing & Destruction modules, which again have nothing to do with the GPU.
What do we have comparable to HBAO+? Normal HBAO? But that uses far fewer samples and still costs performance.
Click to expand...
High-end AMD cards have performed on par with mid-range Nvidia cards in some GameWorks titles so far. The reason is that AMD can't optimize for thousands of lines of code they are not permitted to see. They are unable to offer their customers optimized drivers, and they can't offer suggestions to game developers to improve performance either, because developers are forbidden to share the GameWorks code with them. The effects look impressive, and I'm not disputing that.

sidspyker said:
Unfair and wrong projection, because I've used AMD/ATI for at least 5-6 years now and switched to Nvidia just a few months ago.
Click to expand...
You don't have to turn off the effects for GameWorks titles so it isn't going to affect you if you have a good Nvidia card.

sidspyker said:
Bottom line is we can either have pretty on one side of the GPU divide or none at all. Even then, a few of Nvidia's effects work on AMD just fine (HBAO+, for example).

I'm not trying to dismiss that the effects might perform worse on AMD's side, but rather saying that this does not affect the performance of the base game, for anyone claiming so. It's Nvidia's proprietary tech, it's their rules; they have every right to make sure their tech does not work on the competition at all (this is the essence of how you have COMPETITION). You can feel that is very scummy, but that doesn't change anything.
Click to expand...
Open and fair competition is best for the industry and consumers. A strong and competitive AMD ensures choice, innovation and better prices. Remember, without AMD you'd be paying a lot more for your Intel processors and Nvidia graphics cards. And without real competition complacency sets in and that's just human nature.

Docttj said:
Does Far cry 4 use gameworks? I think it does. I was watching a stream and the guy had a 770 and i5 and on ultra with no AA he got like 27FPS. Lowering every setting to high or lower, turning HBAO+ to SSAO, and removing any AA, think some other options he then got a constant 60. The game didn't look amazing at ultra either. If you have AMD and are getting Far Cry 4 don't immediately blame Gameworks because this game seemingly runs like shit maxed out on Nvidia too
Click to expand...
Yes, Far Cry 4 uses GameWorks. Have updated drivers been released yet? Anyway, they recommend a Radeon R9 290X and an Nvidia GTX 680. Those specs already suggest you can expect a lopsided performance difference between brands.
 
Last edited: Nov 16, 2014
Doctalen

Rookie
#183
Nov 16, 2014
@Merc616

Not sure what driver he was on. Obviously the game isn't out yet and the guy was playing a leak. Nvidia may release drivers at launch like they did for COD and AC:U. It was really low FPS though. I find it hard to blame GameWorks when it comes to a Ubisoft product, personally.

I'm still sure CDPR will do their best to optimize for each card, though. They are a much more PC-centered company than a lot of the current companies using GameWorks. If AMD cards in general happen to perform worse, I'd think CDPR could, and would be willing to, release some sort of fix if they can.

@facemeltingsolo

The community managers are the only people who are actually employed by CDPR here aren't they? Mods are just volunteers on most gaming forums I've gone to. Not sure why what they say is so important to post on NeoGaf or Reddit. They don't actually work on the game.
 
Last edited: Nov 16, 2014
sidspyker

Ex-moderator
#184
Nov 16, 2014
facemeltingsolo said:
Also the benchmarks people linked show Nvidia being ahead by a lot WITHOUT the "Nvidia features" on, so the point is completely moot.
Click to expand...
Where is the part where this is somehow relevant to libraries added on top of a game? Some games perform better on Nvidia than AMD even without the Nvidia fluff enabled, okay, but how is that relevant to GameWorks? Since these are mostly additional libraries added on top, they have nothing to do with the performance of the base game when they're disabled. It's not some super black-magic library that can cripple the performance of devices even when it's not enabled.

facemeltingsolo said:
Add to this that HairWorks is supposed to be DirectCompute.
Click to expand...
It is; otherwise it wouldn't work on non-Nvidia devices, since they don't support CUDA.

facemeltingsolo said:
Performance should be based around DirectCompute only and be open to optimization. It isn't. I ran tests on a water-cooled R9 290X and a GTX 760, which is not even in the same league, in the Nvidia HairWorks tool. Performance is the same, even though DirectCompute throughput on the R9 290X is much higher. I expect it to be weighted even further toward Maxwell.
Click to expand...
DirectCompute is an API which Nvidia's proprietary technology is using in one of their closed libraries in this case. Similarly AMD's TressFX uses OpenCL, and Nvidia cards are not so good at OpenCL calculations.

Going out on a limb here, but since it's an API, I'd wager it works on a software-by-software basis depending on how it's programmed. There is no "X is better than Y at DirectCompute"; it's "X is better than Y at DirectCompute in this specific program."

facemeltingsolo said:
This is what happens when you sign a contract and give a company CLOSED libraries that they control.
Click to expand...
So CDPR should cancel contracts with any company that supplies closed libraries? Just about any and all proprietary middleware? So if, say, SpeedTree or something else tells you that you can use their code but they don't let you modify it... then no SpeedTree, no Wwise, no Havok or PhysX? No PathFinder? No nothing? Basically a game without trees, without audio, without physics and without AI.

facemeltingsolo said:
Also I screenshotted this reply, some of the lunacy by the mods in this forum and will be posting it to reddit and Neogaf
Click to expand...
Great, can't wait.
Lunacy? Not expecting exclusive features of 1 GPU brand on the other is lunacy now? So going by that logic sanity is expecting features you didn't pay for?

facemeltingsolo said:
PCGH: Will the hair and fur simulation run on Radeon GPUs?

Balázs Török: Yes, it should. At the moment it works, but whether it will still run in the end is Nvidia's decision. What matters is the direction in which they develop the technology, and whether, and what kind of, barriers they put in place. But I think it will also run on Radeon graphics cards.

http://www.pcgameshardware.de/The-Witcher-3-PC-237266/Specials/The-Witcher-3-welche-Grafikkarte-welche-CPU-1107469/
Click to expand...
People who own the rights to a technology decide what to do with it, more news at 11.
 
Gilrond-i-Virdan

Forum veteran
#185
Nov 16, 2014
DirectCompute is MS's attempt to push Windows lock-in in the GPGPU field. It's not an option for any serious development because it's not cross-platform. The real competitors in the field are OpenCL (the open API) and CUDA (Nvidia's proprietary one). These are both cross-platform.
 
facemeltingsolo

Rookie
#186
Nov 16, 2014
"Wait for Far Cry 4 benchmarks" they said.

Here you go.

https://www.youtube.com/watch?v=TOaNGwEl0vU&list=UUJEhJYDr4NAUEtcMRNrZmcw

No thanks, CDPR. Your company doesn't even have control of the libraries to ensure fairness, which your own person working on the game stated. You CAN'T optimize anything. Nvidia controls what card series can excel at what, and it's pretty obvious now that they are targeting 60 FPS at 1080p (LOL) on a GTX 970, with GameWorks used to sell/promote the new card series and SLI, which is a joke.

Also, TXAA is crap. That is why no one benchmarks with it. Hilarious to hear you "mods" selling it, when I never ran it on my GTX 700 series card because it was a blurry mess. TotalBiscuit and anyone else not related to Nvidia marketing all say TXAA is a blurry mess.

Have fun with your Nvidia-controlled game. Like I said, I will buy it when it is 5 dollars on Steam and not before. Under-50 FPS lows at 1080p on an R9 290X, in a game that barely looks better than Far Cry 3. Hilarious.

I hope Nvidia paid you a boat load of money, because this is going to cost you a lot of sales.
 
Gilrond-i-Virdan

Forum veteran
#187
Nov 16, 2014
sidspyker said:
So CDPR should cancel contracts with any company that supplies closed libraries? Just about any and all proprietary middleware? So if, say, SpeedTree or something else tells you that you can use their code but they don't let you modify it... then no SpeedTree, no Wwise, no Havok or PhysX? No PathFinder? No nothing? Basically a game without trees, without audio, without physics and without AI.
Click to expand...
There are open alternatives, but not always on par in features. Audio at least rarely needs any proprietary libraries.
 
Doctalen

Rookie
#188
Nov 16, 2014
facemeltingsolo said:
"Wait for Far Cry 4 benchmarks" they said.

Here you go.

https://www.youtube.com/watch?v=TOaNGwEl0vU&list=UUJEhJYDr4NAUEtcMRNrZmcw

No thanks, CDPR. Your company doesn't even have control of the libraries to ensure fairness, which your own person working on the game stated. You CAN'T optimize anything. Nvidia controls what card series can excel at what, and it's pretty obvious now that they are targeting 60 FPS at 1080p (LOL) on a GTX 970, with GameWorks used to sell/promote the new card series and SLI, which is a joke.

Also, TXAA is crap. That is why no one benchmarks with it. Hilarious to hear you "mods" selling it, when I never ran it on my GTX 700 series card because it was a blurry mess. TotalBiscuit and anyone else not related to Nvidia marketing all say TXAA is a blurry mess.

Have fun with your Nvidia-controlled game. Like I said, I will buy it when it is 5 dollars on Steam and not before. Under-50 FPS lows at 1080p on an R9 290X, in a game that barely looks better than Far Cry 3. Hilarious.

I hope Nvidia paid you a boat load of money, because this is going to cost you a lot of sales.
Click to expand...
The 770 was getting way, way worse performance than those AMD cards on the stream I was watching: 27 or less FPS at the opening scene where you are at the dinner table, without any AA even being on. Why would it perform that poorly? The game doesn't look better than FC3, and that didn't run this badly. Strikes me as Ubisoft being shitty at optimizing their games once again. AMD cards shouldn't be getting the FPS drops they are in that video, but if it's Nvidia making it happen, then why do Nvidia cards get shit performance too? In previous posts people said that a high-end AMD card was running the same as a 770, but that isn't the case here. So why isn't it running the same as a 770 here? I think it's because it's all up to the devs to optimize their damn games. If GameWorks caused that AMD card to run the same as a 770 before, then why is it running way better than one here?

Also, are you aware that the mods are just people like you and me? They aren't getting paid by CDPR, and they're not getting paid to shill like you seem to think.
 
Last edited: Nov 16, 2014
Sardukhar

Moderator
#189
Nov 16, 2014
Wheee! I think you guys should have this discussion over on the Cyberpunk forums, where, you know, we're ALL ABOUT THE FUTURE.

Also, I am not a cruel and malicious mod. I have NO IDEA who told you that.

I have been paid by Nvidia and AMD, directly. In cash. Quite a lot of cash. The first party paid me one half of three times what the second party paid me. The second party gave me 80 dollars less than the first party paid me. Now, who is the first party and how much did they pay me?
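(For anyone reaching for pen and paper, the riddle above is just two linear equations; here's a quick sketch of the arithmetic — the equations come straight from the wording, nothing else is assumed:)

```python
# "One half of three times what the second party paid":  first = 1.5 * second
# "80 dollars less than the first party":                second = first - 80
# Substituting: first = 1.5 * (first - 80)  =>  0.5 * first = 120
first = 120 / 0.5
second = first - 80
print(first, second)  # 240.0 160.0
```

Which party is which remains, of course, the joke.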

I'll be posting my R9 290X CF setup benchmarks when I get Witcher 3. I'd be interested in whether anyone running Nvidia SLI would like to do the same; then we can mess around with all these APIs and whatnot and see what differences we see.

Edit: Heh, I said "Nvidia SLA". Oh, if only.

 
Last edited: Nov 16, 2014
Doctalen

Rookie
#190
Nov 16, 2014
Here @facemeltingsolo

http://www.anandtech.com/show/8714/benchmarked-lords-of-the-fallen

The 770 and R9 290X are about on the same level here. People blamed GameWorks for it. In Far Cry 4, though, the R9 290X has a way higher FPS average than the 770 does. The 770, in what I watched, was getting 27 FPS on average without even having set foot into the open world, just inside a building. So if it's GameWorks apparently doing it in Lords of the Fallen, then why such a difference here? Could it be shitty optimization in general? Could it be that GameWorks games can be made to run well on AMD too, and that it's up to the developer to get them on the same level?
 
Last edited: Nov 16, 2014
sidspyker

Ex-moderator
#191
Nov 16, 2014
Docttj said:
Could it be shitty optimization in general?
Click to expand...
I would say yes. Ubisoft Kiev is the studio that does the AC ports (and we know the condition of those), and this time did the Far Cry 4 port; Far Cry 3 was not done by Kiev. That, and Far Cry 3 had plenty of issues at launch too, but luckily got patched. Two patches have already been detailed for FC4 as well.

facemeltingsolo said:
Hilarious to hear you "mods" selling it
Click to expand...
"is" "this" "supposed" "to" "mean" "something" "?"
 
Last edited: Nov 16, 2014
dragonbird

Ex-moderator
#192
Nov 16, 2014
facemeltingsolo said:
Welp, looks like CDPR is now moderated by Nvidia marketing. The fact that a MOD told someone to get an Nvidia GPU instead is the most blatant, laughable BS I have ever seen.
Click to expand...
Moderators in this forum, like any other forum member, are entitled to express opinions. The opinion expressed was the moderator's own, as you well know. And you are welcome to disagree with it and express alternative views.

What you, and any other forum member, are NOT allowed to do is make ad hominem attacks on others, moderator or otherwise. I'd therefore suggest that this discussion return to a civil basis.

And, as a forum member expressing an opinion:
I find it difficult to criticise Nvidia for innovating, or for wanting to gain market share. What I can, and would, criticise is game developers who fail to ensure that their games work adequately (not "perfectly", but "adequately") on systems that were high end a year ago, and I'd criticise AMD for blaming Nvidia for their own inability to compete.
 
Tracido

Forum veteran
#193
Nov 16, 2014
Well, I've been a Witcher fan a long time, and it will be a shame to have to skip this one if it isn't going to work as well on my system, simply because I decided to spend $450 instead of $750 for an equivalent card.

My R9 290X...

I do need to know, because until I do I will be advising all friends with AMD cards to stay away from this one until much later, if this proprietary shit is gonna get nasty.

I want no part of it. Just want my game. I may or may not just buy it years later on PS4, if this is the case on PC.
 
Last edited: Nov 16, 2014
facemeltingsolo

Rookie
#194
Nov 16, 2014
"770 was getting way way worse performance than those AMD cards on the stream I was watching. 27 or less FPS at the opening scene where you are at the dinner table without any AA even being on"


1) A 770 is not close to an R9 290X. I have had both cards. The 770 can play Witcher 2 with ubersampling at 30 FPS and under; a decent-brand 290X runs it in the high 40s to past 60 at times. That is how far apart they are, and that is what performance from a game developer not working with bribes should look like. A 290X can play Shadow of Mordor (which uses the bloody Assassin's Creed engine) at 4K past console FPS. Good freakin' luck doing that on a GTX 770. At higher AA settings/resolutions the R9 290X beats a GTX 980 if the game isn't rigged, and it is as simple as looking at the "bandwidth" number in GPU-Z between the two cards. Does it use more power and require exotic cooling and a GOOD brand of R9 290X like Sapphire/PowerColor PCS+? Sure. THAT is where the 980 is ahead. That has nothing to do with FPS. Two good-brand or water-cooled 290Xs are literally as good as it gets at 4K.
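(For what it's worth, the "bandwidth" number GPU-Z shows is just bus width times effective memory data rate. A minimal sketch using the reference specs of the two cards being compared — 512-bit at 5.0 Gbps for the 290X, 256-bit at 7.0 Gbps for the 980; reference clocks assumed, factory-overclocked boards differ:)

```python
def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth = bus width (in bytes) x effective data rate per pin."""
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gb_s(512, 5.0))  # R9 290X -> 320.0 GB/s
print(mem_bandwidth_gb_s(256, 7.0))  # GTX 980 -> 224.0 GB/s
```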

2) Nvidia does not give a damn about their past customers. They want those customers to upgrade and get SLI setups. A GTX 770 should be SLAUGHTERING this game (Far Cry 4). You are talking about a GPU well past 3 TFLOPS, compared to the Xbox One's 1.31 and the PS4's 1.84, running a barely updated Far Cry 3 engine.
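(Those TFLOPS figures do check out from shader count and clock alone: peak FP32 throughput is 2 ops per shader per cycle, one fused multiply-add. A quick sanity check using published reference clocks — base clocks assumed, boost runs higher:)

```python
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    """Peak FP32 = 2 ops (fused multiply-add) per shader per cycle x clock."""
    return 2 * shaders * clock_mhz * 1e6 / 1e12

print(round(fp32_tflops(1536, 1046), 2))  # GTX 770 at base clock -> 3.21
print(round(fp32_tflops(768, 853), 2))    # Xbox One -> 1.31
print(round(fp32_tflops(1152, 800), 2))   # PS4 -> 1.84
```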

To put it simply? Nvidia can tier card performance any way they want through the closed library. They can hold a 770 and 780 back to sell more 970s/980s. They can ship a later patch and blame Ubisoft. This is why a closed library should never exist. Nvidia is already pulling these shenanigans to sell Maxwell as amazing tech, when a GTX 970 is not close to a 780 Ti other than having more VRAM, and belongs nowhere near an R9 290X at higher AA settings.

You have no one calling out Nvidia because most of the "tech sites" or "gaming sites" have marketing deals with Nvidia. IGN is obvious with their Nvidia marketing. PCPer.com likened AC Unity to Crysis 1 (LOL) and acted as an apologist for the GameWorks game; meanwhile, at the bottom of that blog video? "Sponsored by Nvidia". The most absurd thing I have ever heard. Crysis 1 did not run at the same resolution on a pitiful x86 ultra-budget AMD PC. This is how laughable the tech media has become. They are as bought out as CDPR.

The only justice here is that GTX 770/780 owners are getting screwed over as much as AMD owners, by the company they are so loyal to.

TLDR? If you buy this game at full price as an AMD GPU owner, you are contributing to the problem. If you upgrade from a 780 to a 970, which should not be CLOSE to the upgrade it is due to GameWorks, you are an idiot. You should be flooding Nvidia's forums with complaints. If you bought two GTX 980s and can't even run AC Unity well at high MSAA at 1080p (which is the most pathetic, rigged thing I have ever seen)? You should consider a class-action lawsuit and get your money back for that free game.
 
Last edited: Nov 16, 2014
Sardukhar

Moderator
#195
Nov 16, 2014
Waaaait, wait wait wait. You might not buy Witcher 3 and deprive yourself of what will be a great gaming experience because your (our) 290x or whatever GPU might run it at 65 fps instead of 70? Presuming you aren't CPU limited or resolution maxed, making it pretty irrelevant anyway?

Really? For 5 fps or 10 fps or whatever, you'll not enjoy a totally playable game, ( well, after the first three patches, maybe, depending on how release goes..tee hee)?

And you'll do this to yourselves because...you want to teach CDPR or NVidia some kind of lesson?
Bwahahahahahahahaahahah! A-maazing!

I totally support these plans. Please provide pics of not buying and not playing Witcher 3. For...posterity. Yes. Grin.
 
Vigilance.492

Ex-moderator
#196
Nov 16, 2014
facemeltingsolo said:
That is how far apart they are, and that is what performance from a game developer not working with bribes should look like.
Click to expand...
Well isn't it lucky that in your specific example you're talking about The Witcher 2, which is a game made by CDProjektRED, the developer that you later accuse ...

facemeltingsolo said:
They are as bought out as CDPR.
Click to expand...
Again, I think some of you are blowing this completely out of proportion. You don't even have evidence yet that The Witcher 3 runs poorly on AMD GPUs, and yet you're already trying to accuse CDPR of being "sellouts"... Come on. I said this before and I'll say it again: hold off on your early purchase if you're worried (as you are), and then, when the benchmarks come in, if we end up having proof that the game runs poorly on AMD vs Nvidia, you can start throwing around terms like "sellouts" and start a shitstorm at CDPR. But as of now it's absurd and honestly just comes off as some obsessive hatred of Nvidia.
 
facemeltingsolo

Rookie
#197
Nov 16, 2014
Sardukhar said:
Waaaait, wait wait wait. You might not buy Witcher 3 and deprive yourself of what will be a great gaming experience because your (our) 290x or whatever GPU might run it at 65 fps instead of 70? Presuming you aren't CPU limited or resolution maxed, making it pretty irrelevant anyway?

Really? For 5 fps or 10 fps or whatever, you'll not enjoy a totally playable game, ( well, after the first three patches, maybe, depending on how release goes..tee hee)?

And you'll do this to yourselves because...you want to teach CDPR or NVidia some kind of lesson?
Bwahahahahahahahaahahah! A-maazing!

I totally support these plans. Please provide pics of not buying and not playing Witcher 3. For...posterity. Yes. Grin.
Click to expand...
The order was cancelled a few days ago. I did not take a picture. It was the CE edition, because I had no idea GameWorks would continue to be this slanted. I also cancelled my Watch Dogs order after the repeated lies from Ubisoft; I saw where that was heading. I viewed it as a sad event. In fact, I did not cancel until I read this thread and these biased and laughable mods. A mod telling someone to buy an Nvidia GPU, when they already have a GPU that should run this game well, is so FUBAR I don't even know where to begin. I also lost complete faith in CDPR when I read that they have ZERO control over the GameWorks libraries.

How about all these pro-Nvidia mods post a screenshot of their GTX 970's GPU-Z, and I will post my GPU's and show you just how full of absolute crap they are, and how much you have to cheat in a closed library that CDPR has zero control over to be ahead at higher resolutions and higher MSAA. These mods also claimed AMD drivers were bad, when there is no way to optimize for a closed library. CDPR can't even control it, and you expect AMD to guess at optimization? Get real.

A GTX 970 is slower at ubersampling in Witcher 2 than my Tri-X. I know; I had both in identical builds. I build systems for friends. I do not work with hardware; I make a living in networking. It's just something I have always done for family and friends. The GTX 970 was slower in everything except GameWorks games, and DSR is worse than GeDoSaTo and custom resolutions, which we already had. Why would I ever support a closed library that gives artificial increases and also screws over Nvidia's own customers? I also built systems with a GTX 780 in them (which is getting hosed by GameWorks).

As far as CPU-bound? Not bloody likely. My CPU is about as good as it gets (for gaming) on this crappy API MS has given us, and that's on a 30-dollar air cooler; I can push higher any time I want with water. Only a low-level API like Mantle can solve that. You know, something CDPR should be using instead of GameWorks: the API that AMD just made public and offered to Intel (who asked for it) and Nvidia with NO licensing fee and NO oversight. When Nvidia refuses to use it, and Windows 10 goes to a subscription fee (worst-kept secret in the world) to charge us for a start bar (because most people don't know about things like "Start Is Back") and DirectX 12, you can thank Nvidia for that too. I am sure no money is being thrown around from MS to Nvidia... BTW, see the EVGA Precision icon? That is because I came from a GTX 770, which is slow as hell compared to this Tri-X, and GameWorks turns my new card into a GTX 770... You and Nvidia are literally stealing money from me through artificial performance limitations.

So thanks, Nvidia. I hope they have a blast when AMD gets so sick of this crap that they just do the same thing back and ruin all EA games and AMD titles on Nvidia hardware, 'cause that is where we are heading. I don't think people are going to be happy if AMD does that. An R9 290 already beats a 980 in Dragon Age: Inquisition at 4xMSAA due to bandwidth. Would you be happy if AMD crippled it on purpose and made it perform like a GTX 770?

Here is my "CPU-bound" CPU, which has at least 0.08 volts of headroom left (on the safe side). Also, the swings we are seeing in GameWorks games are not 5 FPS; they are as big as 20-30, moving a card down a couple of tiers. The R9 290X is being put BELOW 40 FPS at times. This is not a slight difference. This is CRIPPLING performance on cards that aren't Maxwell, all to sell a new series and promote SLI setups, when we need nowhere near that for 1080p, or even 1440p.

http://i.imgur.com/4Qw3ioJ.jpg?1

But whatever. This will be my last post on this forum. This forum's moderation is an absolute joke and the most unprofessional thing I have ever seen: favoring a GPU brand, telling people to switch brands, and belittling people. Enjoy my 5 dollars for Witcher 3 when it is on sale. My CE edition money is going toward Dragon Age: Inquisition, a title I had much less interest in. At least that company is not treating people like crap and posting pictures of monkeys when people have legitimate concerns. Pretty freakin' sad when EA of all companies treats their customers better than CDPR.

Have fun with Nvidia ruining your game for the 40 percent of the market share AMD holds, and that doesn't include all the non-Maxwell GPUs from Nvidia that will also run suboptimally so Nvidia can push an upper-mid-tier GPU as the greatest GPU ever made...
 
Last edited: Nov 16, 2014
Tracido

Forum veteran
#198
Nov 16, 2014
Sardukhar said:
Waaaait, wait wait wait. You might not buy Witcher 3 and deprive yourself of what will be a great gaming experience because your (our) 290x or whatever GPU might run it at 65 fps instead of 70? Presuming you aren't CPU limited or resolution maxed, making it pretty irrelevant anyway?

Really? For 5 fps or 10 fps or whatever, you'll not enjoy a totally playable game, ( well, after the first three patches, maybe, depending on how release goes..tee hee)?

And you'll do this to yourselves because...you want to teach CDPR or NVidia some kind of lesson?
Bwahahahahahahahaahahah! A-maazing!

I totally support these plans. Please provide pics of not buying and not playing Witcher 3. For...posterity. Yes. Grin.
Click to expand...
Hey, if that's what it takes to get the mighty mods to confirm it is that slight a difference, by all means I sure will. I've been here since this place was still an unknown little title censored by Atari out of fear it wouldn't be sold anywhere.
 
tommy5761

Mentor
#199
Nov 16, 2014
I don't see the point in slinging mud or FUD one way or the other. Do Nvidia cards run "Gaming Evolved" games? Sure they do and will. Will AMD cards run GameWorks titles? Sure they do and will. Now, we may not get the performance that is expected from either card on either software, so it is left up to the consumer to decide which is better for them. AMD and Nvidia have always been and will always be competitors in the GPU market, each always trying to outdo the other.
 
sidspyker

Ex-moderator
#200
Nov 16, 2014
tracido said:
Hey, if that's what it takes to get the mighty mods to confirm it is that slight a difference, by all means I sure will. I've been here since this place was still an unknown little title censored by Atari out of fear it wouldn't be sold anywhere.
Click to expand...
Well then, if you have any concerns, shouldn't you be waiting to see the benchmarks instead of passing judgement with just about... no information?
 
© 2018 CD PROJEKT S.A. ALL RIGHTS RESERVED

The Witcher® is a trademark of CD PROJEKT S. A. The Witcher game © CD PROJEKT S. A. All rights reserved. The Witcher game is based on the prose of Andrzej Sapkowski. All other copyrights and trademarks are the property of their respective owners.

Forum software by XenForo® © 2010-2020 XenForo Ltd.