Will TW3 have AMD-specific tech?

Good day, all,

I have to say I am an AMD user and have no real complaints so far, except maybe for AMD's driver updates; however, they *seem* to be getting better as of late.

I am so comfortable with AMD now that I would find it really hard to switch over to Nvidia. Honestly, I don't plan on using 3D tech anytime soon, so I suppose that's why I'm reluctant to switch.

Which brings me to my question for this thread,

If Nvidia is going to be the mainstay for TW3, would it be at all possible for the "Collector's Edition" copies to be optimized for whichever GPU the buyer has?



I know that's way out on a limb, but it's just a thought...

Cheers!
 
I'd had an AMD HD 6950 2GB for a year and a half; now I've got myself an Nvidia GTX 780 (Gainward GLH). I'm not so fussed about the minor differences, but I think PhysX is really interesting if you're an enthusiast.
 
Well, fur? Hair? Tessellation? I don't care, because if those effects give me frame-rate drops here and there, I won't have fun with this game. Just give me the story, well optimized. I hate unoptimized effects, frame-rate drops, and having to get a powerful video card to run this at 60+ FPS; there is no point in that for me. I will play this on PS4, and all I need is a good time playing, that's all, and thanks.
 
I just read quite a few interesting things about AMD's Mantle graphics API on NeoGAF.

http://www.neogaf.com/forum/showthread.php?t=715349

DICE's Johan Andersson re Mantle: Asked Nvidia, Intel, others, only AMD listened: http://heise.de/-2045398

German c't has an interview from APU13 with DICE's rendering architect Johan Andersson regarding Mantle:

Noteworthy statements:
- Effort to have console-like access and programmability on PC started about 5 years ago. Spoke to different companies including Nvidia and Intel.
- Respect for AMD being the sole company to realize his suggestions.
- Yearly meetings, long discussions. Mantle code was started a year and a half ago.
- Project was internally treated as top secret, similar to Eyefinity.
- Mantle is *not* a console-like AMD interface for GCN graphics chips. It allows finer-grained access to the GPU. Overly specific cases can be realized through extensions. Possible for other manufacturers to support Mantle.
- DirectX 11 compatible doesn't automatically mean potentially Mantle compatible. GPU architecture needs to fulfil specific requirements. Nvidia's Kepler should be able to do that.
- Wants to see Mantle everywhere (re smartphones, tablets, mobile, Linux and Mac OS).
- Still too early on how much faster Mantle will be over Direct3D.
- Porting console games using Direct3D may still be faster; Mantle may take longer but helps do it well re performance, level of detail, and robustness.

So let's start with #1.

- Effort to have console-like access and programmability on PC started about 5 years ago. Spoke to different companies including Nvidia and Intel.

Do we PC gamers really want console-like programmability on PC? I mean, we already suffer from poorly optimized games developed with consoles in mind and then ported to the PC.

#5 is weird.

- Mantle is *not* a console-like AMD interface for GCN graphics chips. It allows finer-grained access to the GPU. Overly specific cases can be realized through extensions. Possible for other manufacturers to support Mantle.

I get what it says, but go back to #1: it says console-like programmability.

Then there is #6.

- DirectX 11 compatible doesn't automatically mean potentially Mantle compatible. GPU architecture needs to fulfil specific requirements. Nvidia's Kepler should be able to do that.

Then there is #8.

- Still too early on how much faster Mantle will be over Direct3D.

For #8, I have a feeling Mantle will end up the way Glide did.

Last one, #9.

- Porting console games using Direct3D may still be faster; Mantle may take longer but helps do it well re performance, level of detail, and robustness.

So porting with D3D (DirectX) is still faster; Mantle may take longer, but it helps do it well in terms of performance, level of detail, and robustness.

Also, I knew it: Mantle is just coding to the metal.

From a few people who posted comments on NeoGAF:

Question: Can someone explain in layman's terms what Mantle is?

Answer: "Code to the metal!!"

Answer #2: These engines process an absolutely huge amount of data, more than anyone could have anticipated when designing the very core aspects of Direct3D. Since the developers obviously know what this data is, what it looks like, where it is, and where it's going, it's more efficient for them to have direct access to it rather than having to access it through Direct3D, which acts as a middleman. This low-level access has always been the case on consoles, but is very difficult if not impossible to do on PCs because of the different hardware configurations that need to be supported.

What Mantle does is allow developers a similar level of low-level control, which allows them to save tons of CPU resources by removing the need for every GPU action to be seen and approved by the CPU. It also may allow for engine developers to bring some of the low-level algorithms that they use frequently on console GPUs to PC GPUs. This could lead to large performance advantages for Mantle-based renderers over Direct3D and OpenGL implementations.

TL;DR: Coding to da metal
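
To make that "middleman" point concrete, here is a minimal, purely illustrative C++ sketch. It is not real Direct3D or Mantle code; the type names and cost numbers are made-up assumptions. It only models the idea from the answer above: a high-level API re-validates state on every draw call, while a lower-level API lets the application record commands once and replay them cheaply.

// Purely hypothetical sketch -- not real Direct3D or Mantle code.
#include <cstdio>
#include <vector>

// A draw call, reduced to a toy: which mesh, which material.
struct DrawCall { int mesh; int material; };

// "High-level" style: the runtime acts as a middleman and re-does
// validation, hazard tracking, and driver translation on every call.
int highLevelSubmit(const std::vector<DrawCall>& draws) {
    int cpuWork = 0;
    for (const DrawCall& d : draws) {
        (void)d;          // unused in this toy model
        cpuWork += 10;    // assumed per-call overhead, paid every frame
    }
    return cpuWork;
}

// "Low-level" style: validate once while recording a command buffer,
// then replay it each frame with almost no CPU involvement.
struct CommandBuffer {
    std::vector<DrawCall> recorded;
    int record(const std::vector<DrawCall>& draws) {
        recorded = draws;
        return 10 * static_cast<int>(draws.size()); // one-time cost
    }
    int replay() const {
        return static_cast<int>(recorded.size());   // ~1 unit per draw
    }
};

int main() {
    std::vector<DrawCall> scene(100000, DrawCall{0, 0}); // 100k draws
    std::printf("high-level per frame: %d units\n", highLevelSubmit(scene));

    CommandBuffer cb;
    cb.record(scene);                                    // paid once
    std::printf("low-level per frame : %d units\n", cb.replay());
}

Under those invented costs, the recorded command buffer does roughly a tenth of the per-frame CPU work, which is the kind of saving the answer above is describing.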

AMD is targeting 100,000 draw calls. From a rumor I heard, Nvidia is pushing Microsoft to add some future DirectX update that can do 100,000 draw calls, if not more. I don't know how true it is, but it was on NeoGAF yesterday when I read about AMD's APU13; I don't remember the name of the topic though :/.
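
For a rough sense of why 100,000 draw calls is a CPU problem in the first place, here's a back-of-the-envelope C++ sketch; the per-call cost and the mesh/instance counts are invented assumptions for illustration, not measurements of any real driver.

#include <cstdio>

int main() {
    // Assumed, illustrative figure: 1 microsecond of driver/runtime CPU
    // overhead per draw call. Not a measurement of any real driver.
    const double usPerCall = 1.0;
    const int naiveCalls = 100000;

    std::printf("naive: %d calls -> %.1f ms of CPU per frame\n",
                naiveCalls, naiveCalls * usPerCall / 1000.0);
    // 100 ms of CPU per frame blows a 16.7 ms (60 FPS) budget several
    // times over, so calls must be reduced or made cheaper.

    // Instancing, a common D3D11-era workaround: draw every copy of a
    // mesh in a single call. Assume 200 unique meshes, 500 copies each.
    const int instancedCalls = 200;
    std::printf("instanced: %d calls -> %.1f ms of CPU per frame\n",
                instancedCalls, instancedCalls * usPerCall / 1000.0);
}

That batching trick is why today's engines get by with far fewer genuine calls per frame; the 100,000 figure is about removing the per-call overhead itself.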

Personally, I don't think CD Projekt RED should add Mantle API support to the PC version of The Witcher 3: Wild Hunt, because DirectX 11 still seems faster and it works on both AMD and Nvidia graphics cards.

Anyone have any thoughts on this?
 
Ballowers100 said:
AMD is targeting 100,000 draw calls. From a rumor I heard, Nvidia is pushing Microsoft to add some future DirectX update that can do 100,000 draw calls, if not more. I don't know how true it is, but it was on NeoGAF yesterday when I read about AMD's APU13; I don't remember the name of the topic though :/.
You probably mean this
http://www.dsogaming.com/news/microsoft-details-direct3d-11-improvements-for-windows/

Not related to you, but the next time someone says "coding to the metal" I swear I'll start murdering people.
 
sidspyker said:
You probably mean this
http://www.dsogaming.com/news/microsoft-details-direct3d-11-improvements-for-windows/

Not related to you, but the next time someone says "coding to the metal" I swear I'll start murdering people.
Yep that's what I meant.

Sorry for my mistake.

Anyway, I heard AMD will release Mantle as an open-source Software Development Kit (SDK) to the public sometime in 2014. I can't wait for it to be released so I can download it, try it out myself, and see how much it really does. For now it's only going to be available to AAA video game publishers and developers.
 
I should probably leave these here...and in the Mantle thread.
 
I believe Nvidia will lower the prices of their cards next gen to be able to compete and survive.
Both consoles are running on AMD hardware next gen, which means many games are going to be better optimized for AMD cards, and most importantly it means that Nvidia won't be making any money off of console sales. The only way for Nvidia to make money next gen is the PC. All of their focus should be on the PC, and if they mess up they may not recover. Nvidia has no choice but to stop overpricing their cards, and I think they will.

Maxwell, Nvidia's new lineup, is being introduced in 2014, so I suggest everyone wait until Maxwell hits the market before upgrading or buying a new PC. Let's see what happens. But if Nvidia keeps overpricing their cards, I'm definitely going to go with an AMD card, since the consoles are AMD as well and there will certainly be advantages for PCs running on AMD cards and CPUs.
 
I don't get why people think there is no way they will add AMD effects like TressFX. It doesn't matter whether Nvidia is helping them or not; the fact is both the PS4 and Xbone use AMD hardware. Why would they not take advantage of something both consoles' hardware supports? The same goes for the 8-core AMD processors: they will start to pull ahead now that games will actually use 8 cores. Right now they tend to fall behind i7s, since no current game supports more than 4 cores as far as I know. But now that the PS4 and Xbone use 8 cores, you can bet games will start using those 8 cores to their full extent.

The fact is it's going to be hard to find a developer that will only support Nvidia tech for a multiplatform game, since two of the platforms don't support Nvidia tech. Does this mean PC games will stop having Nvidia tech support? No. But it means that games will probably support both, and if not, the safe bet is that AMD would get primary support on anything multiplatform.

Face it, AMD securing the hardware contracts for the PS4 and Xbone was the best move they have ever made. They are back in the game, and regardless of anything else, the competition that brings is great for us gamers.


P.S. I'd also like to point out, for those worried about missing out on PhysX: you won't miss much. I went from using Nvidia-based computers for 6 years to an AMD-based one, and the only difference I noticed was on one gun in BL2. But my AMD system runs great; I absolutely love it.
 
On the contrary, I wouldn't worry about nVidia. The consumer desktop and notebook market is a small tail of a very big dog to them. The GPU computing market is growing at a rate of 12%, and nothing AMD is shipping is going to take away enough share of that market to make them cut prices on Titan and the Quadro and Tesla lines. Moreover, they're selling Tegra 4's into the smartphone and tablet market as fast as they can make them. They're making boatloads of money, while AMD is borrowing money to keep the lights on.
 
GuyN said:
On the contrary, I wouldn't worry about nVidia. The consumer desktop and notebook market is a small tail of a very big dog. The GPU computing market is growing at a rate of 12%, and nothing AMD is shipping is going to take away enough share of that market to make them cut prices on Titan and the Quadro and Tesla lines. Moreover, they're selling Tegra 4's into the smartphone and tablet market as fast as they can make them. They're making boatloads of money, while AMD is borrowing money to keep the lights on.
Don't forget Nvidia is making money from astronomers and scientists all over the world using Nvidia graphics cards in their supercomputers to render the entire Milky Way galaxy and the universe. No astronomer in the world uses AMD graphics cards, at least none that I know of yet.
 
GuyN said:
On the contrary, I wouldn't worry about nVidia. The consumer desktop and notebook market is a small tail of a very big dog to them. The GPU computing market is growing at a rate of 12%, and nothing AMD is shipping is going to take away enough share of that market to make them cut prices on Titan and the Quadro and Tesla lines. Moreover, they're selling Tegra 4's into the smartphone and tablet market as fast as they can make them. They're making boatloads of money, while AMD is borrowing money to keep the lights on.

Partially true. I don't think anyone is really worried about Nvidia; they won't go belly up yet. Nothing really compares to a Tegra 4 for mobile at the moment except Apple's processors, and with Android rising and Apple falling, Nvidia will make boatloads in that market. However, Nvidia will have to drop prices on PC GPUs to survive; AMD is on the rise. Yes, they are borrowing money to keep the lights on, but that will change with their contracts for the PS4 and Xbone. This is a good thing; a little competition will spur graphics to the next level faster.

As for the previous comment about astronomers using Nvidia: most of that equipment is bought through government contracts, by governments that don't care how many taxpayer dollars they waste. Besides that, not even a year ago AMD was down in the dumps and had nothing to compete with Nvidia, even on a dollar-for-value scale. Things are changing, though, and AMD is making a comeback; 10 years from now the roles may be reversed. In all honesty, though, it has little effect on the games. It would be like me saying almost all professional video editors use Macs, so PCs and consoles don't stand a chance in gaming.
 
Ballowers100 said:
Don't forget Nvidia is making money from astronomers and scientists all over the world using Nvidia graphics cards in their supercomputers to render the entire Milky Way galaxy and the universe. No astronomer in the world uses AMD graphics cards, at least none that I know of yet.

GPU-based supercomputers are still in the minority, but you're right; the significant ones, as well as the professional desktops, are almost always nVidia hardware.

nVidia has a near monopoly in that market, which is driven by the need for performance at any price, so they can demand -- and get -- obscenely high gross margins.
 
Adhal said:
Partially true. I don't think anyone is really worried about Nvidia; they won't go belly up yet. Nothing really compares to a Tegra 4 for mobile at the moment except Apple's processors, and with Android rising and Apple falling, Nvidia will make boatloads in that market. However, Nvidia will have to drop prices on PC GPUs to survive; AMD is on the rise. Yes, they are borrowing money to keep the lights on, but that will change with their contracts for the PS4 and Xbone. This is a good thing; a little competition will spur graphics to the next level faster.

As for the previous comment about astronomers using Nvidia: most of that equipment is bought through government contracts, by governments that don't care how many taxpayer dollars they waste. Besides that, not even a year ago AMD was down in the dumps and had nothing to compete with Nvidia, even on a dollar-for-value scale. Things are changing, though, and AMD is making a comeback; 10 years from now the roles may be reversed. In all honesty, though, it has little effect on the games. It would be like me saying almost all professional video editors use Macs, so PCs and consoles don't stand a chance in gaming.

AMD is still down in a very deep dump. More likely, they will have to get lucky to survive, and they will probably be bought out in another year or two. Their financial position is scary, and those deals for the console APUs had better be enormously profitable for them. If they're consistent with AMD's recent gross margins, which are only about 29%, it won't be enough to get them out of the hole they dug when they overpaid for ATI and made a Charlie Foxtrot of the GlobalFoundries launch.

The point Ballowers and I were making is that nVidia does not need to cut prices to survive. Their success or failure in launching new products and making profits as a company will not be determined in the consumer discrete GPU market.
 
Possibly. I'm not saying you're wrong; you could be right. But I believe they will come out ahead (profit-wise, not ahead of Nvidia). Like I said earlier, the contracts with the consoles provide more benefit than just the contracts themselves. Since two of the three main gaming platforms use AMD hardware, games will start taking advantage of that hardware, which means the disparity between the two companies will shrink and people will start looking at AMD as a viable option, especially those of us who live on a tight budget. :D

No, Nvidia won't go belly up, but the reason they have had massive profits is the lack of competition. They will still be more expensive than AMD cards, but I do see them lowering their prices. Why? Because they can do it without taking a loss, and they will want to slow AMD's growth. But that's just how I see it; I could be wrong.

Either way, as a gamer I pray for AMD's success, because competition benefits all of us gamers.
 
Nobody wants to see AMD fail. Consumers who benefit from the competition certainly don't. Even their competitors don't -- especially Intel, which has made bundles off of AMD patents they licensed. I'm sure not even nVidia wants them out of business.

But "when your outgo exceeds your income, the upshot is your downfall." AMD's position is precarious, and their survival may depend on the success of the consoles and their ability to produce the console APUs at a profit.
 

GuyN said:
Moreover, they're selling Tegra 4's into the smartphone and tablet market as fast as they can make them. They're making boatloads of money, while AMD is borrowing money to keep the lights on.

You should read some more news, Guy, because you have a lot of outdated or false info. You are saying things like "Moreover, they're selling Tegra 4's into the smartphone and tablet market as fast as they can make them." while there is plenty of news about the not-so-good sales of Tegra 4 (a 54% year-over-year drop).

Unless their production capability also went down 54% year over year, they don't have a problem producing Tegra; they have a problem selling it. Basically, Qualcomm dominates the chip market for phones and tablets, and that probably won't change anytime soon.

http://www.theverge.com/2013/11/8/5080038/nvidia-q3-2013-earnings-lte-tegra

http://www.androidheadlines.com/2013/11/nvidia-still-android-game-tegra-sales-rise.html
 