The AMD Mantle Thread


daveybrat

Elite Member
Super Moderator
Jan 31, 2000
Regardless of how Mantle actually performs and plays out in the long run, I'm just happy that AMD is finally making the forums more lively and interesting again.

I mean, just look at the views on this thread alone... 6,758 views in one day!

I LOVE competition and reading all of the thoughts and opinions expressed in this thread. :)
 

Grooveriding

Diamond Member
Dec 25, 2008
daveybrat said:
Regardless of how Mantle actually performs and plays out in the long run, I'm just happy that AMD is finally making the forums more lively and interesting again.

I mean, just look at the views on this thread alone... 6,758 views in one day!

I LOVE competition and reading all of the thoughts and opinions expressed in this thread. :)

Agreed.

AMD is obviously doing something right with Mantle, going by how worked up it has nvidia's most loyal. Everyone is looking at GCN in both consoles and in PC gaming desktops, with the Mantle API available for all three platforms: PS4, XB One and PC.

It could be a game changer for all cross-platform titles. I think we will see just how powerful it is or is not when we get a look at DX11 vs Mantle Battlefield 4 benchmarks. Looking forward to it.
 

GaiaHunter

Diamond Member
Jul 13, 2008
I find it funny how incredible people seem to find it that AMD can rip off whatever low-level APIs the Xbone and PS4 are using.

And if the DX and OpenGL ES APIs in the Xbone and PS4 were the same as the desktop DX and OpenGL ES, then everyone could port the console optimizations to the desktop, and we could run games like BF3 on a GeForce 7800 GTX with an Athlon X2.

So logic dictates that there is something different in the consoles, and that something is the low-level API.

And if that low-level API can interact with the GCN GPU in the consoles, it can also interact with GCN in PCs via the Mantle API.

[attached image: DIpDvC5.png - screenshot of a tweet]
 

SiliconWars

Platinum Member
Dec 29, 2012
Everybody must assume that AMD has done everything they possibly could to make it easily and quickly portable, at a minimum. It's possible that they took lower margins on the console chips themselves in order to get the exact API they wanted in each. With this in place they could give them away for free and it would still be well worth it. Nvidia never stood a chance.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
GaiaHunter said:
I find it funny how incredible people seem to find it that AMD can rip off whatever low-level APIs the Xbone and PS4 are using.

And if the DX and OpenGL ES APIs in the Xbone and PS4 were the same as the desktop DX and OpenGL ES, then everyone could port the console optimizations to the desktop, and we could run games like BF3 on a GeForce 7800 GTX with an Athlon X2.

So logic dictates that there is something different in the consoles, and that something is the low-level API.

And if that low-level API can interact with the GCN GPU in the consoles, it can also interact with GCN in PCs via the Mantle API.

[attached image: DIpDvC5.png - screenshot of a tweet]

So that tweet doesn't deny that this is the API of the next-gen consoles.

The one thing about the Anandtech article that was strange was that they talked about next-gen consoles and then just referred to them as the Xbox One for the rest of the article. Did they mean both consoles and just get lazy in the writing?
 

GaiaHunter

Diamond Member
Jul 13, 2008
SiliconWars said:
Everybody must assume that AMD has done everything they possibly could to make it easily and quickly portable, at a minimum. It's possible that they took lower margins on the console chips themselves in order to get the exact API they wanted in each. With this in place they could give them away for free and it would still be well worth it. Nvidia never stood a chance.

I'm not assuming anything until there are results.

But in theory it doesn't seem that complicated, although some people seem to be having difficulty grasping simple concepts that were clearly explained in the AnandTech article.
 

GaiaHunter

Diamond Member
Jul 13, 2008
VulgarDisplay said:
So that tweet doesn't deny that this is the API of the next-gen consoles.

The one thing about the Anandtech article that was strange was that they talked about next-gen consoles and then just referred to them as the Xbox One for the rest of the article. Did they mean both consoles and just get lazy in the writing?

I believe that porting HLSL to GLSL ES isn't a tall order anyway.

For example:
http://msdn.microsoft.com/en-us/library/windows/apps/dn166865.aspx
https://code.google.com/p/angleproject/
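To illustrate how mechanical the translation usually is, here is a trivial textured pixel shader in both languages (a made-up example purely for illustration, not taken from any game):

Code:
// HLSL (Direct3D) pixel shader
sampler2D baseMap;
float4 tint;

float4 main(float2 uv : TEXCOORD0) : COLOR
{
    return tex2D(baseMap, uv) * tint;
}

// GLSL ES 1.00 equivalent -- same structure, different spellings
precision mediump float;
uniform sampler2D baseMap;
uniform vec4 tint;
varying vec2 vUv;

void main()
{
    gl_FragColor = texture2D(baseMap, vUv) * tint;
}

Tools like ANGLE (linked above) automate exactly this kind of mapping, in ANGLE's case from GLSL ES to HLSL.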

Also http://www.eurogamer.net/articles/digitalfoundry-inside-playstation-4.

Low-level access and the "wrapper" graphics API
In terms of rendering, there was some interesting news. Norden pointed out one of the principal weaknesses of DirectX 11 and OpenGL - they need to service a vast array of different hardware. The advantage of PlayStation 4 is that it's a fixed hardware platform, meaning that the specifics of the tech can be addressed directly. (It's worth pointing out at this point that the next-gen Xbox has hardware-specific extensions on top of the standard DX11 API.)

"We can significantly enhance performance by bypassing a lot of the artificial DirectX limitations and bottlenecks that are imposed so DirectX can work across a wide range of hardware," he revealed.

The development environment is designed to be flexible enough to get code up and running quickly, but offering the option for the more adventurous developers to get more out of the platform. To that end, PlayStation 4 has two rendering APIs.

"One of them is the absolute low-level API, you're talking directly to the hardware. It's used to draw the static RAM buffers and feed them directly to the GPU," Norden shared. "It's much, much lower level than you're used to with DirectX or OpenGL but it's not quite at the driver level. It's very similar if you've programmed PS3 or PS Vita, very similar to those graphics libraries."

But on top of that Sony is also providing what it terms a "wrapper API" that more closely resembles the standard PC rendering APIs.


"The key is that it doesn't sacrifice the efficiency of the low-level API. It's actually a wrapper on top of the low-level API that does a lot of the mundane tasks that you don't want to have to do over and over."

The cool thing about the wrapper API is that while its task is to simplify development, Sony actually provides the source code for it so if there's anything that developers don't get on with, they can adapt it themselves to better suit their project.
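To make the two-layer idea concrete, here is a rough C++ sketch of the relationship Norden describes. Every name in it is invented for illustration (Sony's actual APIs are not public); the point is only that the wrapper records the same command packets the low-level path does, while taking care of the mundane bookkeeping:

Code:
#include <cstdint>
#include <vector>

struct Shader  { std::uint64_t gpuAddress; };
struct Texture { std::uint64_t gpuDescriptor; };

enum : std::uint32_t { OP_SET_SHADER = 1, OP_SET_TEXTURE = 2, OP_DRAW = 3 };

// "Absolute low-level API": the game writes command packets into a buffer
// it owns and feeds them straight to the GPU front-end.
struct CmdBuffer {
    std::vector<std::uint32_t> words;
    void packet(std::uint32_t op, std::uint64_t payload) {
        words.push_back(op);
        words.push_back(static_cast<std::uint32_t>(payload));
        words.push_back(static_cast<std::uint32_t>(payload >> 32));
    }
};

void drawLowLevel(CmdBuffer& cb, const Shader& sh, const Texture& tex,
                  std::uint32_t vertexCount) {
    cb.packet(OP_SET_SHADER,  sh.gpuAddress);      // no validation layer,
    cb.packet(OP_SET_TEXTURE, tex.gpuDescriptor);  // no reference counting --
    cb.packet(OP_DRAW,        vertexCount);        // just words the GPU parses
}

// "Wrapper API": a thin layer over the same packets that does the repetitive
// bookkeeping (here, redundant-state filtering) -- and, per the article,
// ships as source so studios can adapt it.
struct Wrapper {
    CmdBuffer cb;
    std::uint64_t boundShader = 0;
    void setShader(const Shader& sh) {
        if (sh.gpuAddress != boundShader) {        // skip redundant state sets
            cb.packet(OP_SET_SHADER, sh.gpuAddress);
            boundShader = sh.gpuAddress;
        }
    }
    void draw(const Texture& tex, std::uint32_t vertexCount) {
        cb.packet(OP_SET_TEXTURE, tex.gpuDescriptor);
        cb.packet(OP_DRAW, vertexCount);
    }
};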
 

GaiaHunter

Diamond Member
Jul 13, 2008
Also, transcript from MS (from this thread http://www.neogaf.com/forum/showthread.php?t=642427).

around the 22:55 mark

Whitten:
"Since E3, an example is that we've dropped in what we internally call our mono driver. It's our graphics driver that really is 100 percent optimised for the Xbox One hardware. You start with the base [DirectX] driver, and then you take out all parts that don't look like Xbox One and you add in everything that really optimises that experience. Almost all of our content partners have really picked it up now, and I think it's made a really nice improvement."
Major Nelson:
"And this is how game developers can really, like we say, write to the metal?"
Whitten:
"That's right, exactly"
(23:23)

Up to this point, people always claimed this wasn't possible on X1. The new driver now seems to allow coding to the metal and thus should improve performance.

So yeah, saying "XBone has an API and its name is DX11" seems less and less like a fact.
 

blackened23

Diamond Member
Jul 26, 2011
Anyone stating that the XB1 is using DX11 as its preferred API is intentionally skewing facts. The XB1, just like the PS4, has a "wrapper" that allows quick porting from DX11, which lets a developer bring an existing game (such as PlanetSide 2) to the PS4 and get it up and running while they are still in the process of porting from DX11 to the specific hardware. But they are not using DX11 as the primary API. Both have low-level APIs for direct hardware programming, just as all prior consoles did.
 

RussianSensation

Elite Member
Sep 5, 2003
Yeah, price/perf. A metric used always by everyone except the market leader.

If NV released Titan II with 20% more performance over Titan I/R9 290X, you'd pay 3000 EUR for it? Please tell us more.

I bet 97% of the GPU market considers this metric. It doesn't mean they ignore other aspects such as performance/watt, game bundles, features, etc. If they didn't consider price/perf, the VGA forum would be a pretty empty place. We would all go out and buy 4 Titans and come back to the forums in 2 years when 4 new $1000 flagships arrive.

So, give me a reason why people should now buy a new card?

They were too busy doing other things in the summer, like house renovations and travelling, and didn't have time for games. Now that winter is approaching (gaming is probably a more popular activity for many people during the cold months in North America) and BF4 is here, people will consider upgrading their systems.

Another possibility is that they may have lacked the funds due to other personal commitments. Others may have been waiting to build a new Haswell system with a new GPU and held off until now due to Intel's USB 3.0 bug fix.

Further, not all gamers follow the GPU market closely. Some may be going to school and are putting together a new system now that they have received their student loans. Then there is the holiday season buying/gifting; I presume people may buy a new GPU for their family members too.

Maybe if someone is a huge COH2 gamer?

There are many other reasons; for example, if R9 290X beats Titan in BF4.

They can wait another 8 months and will get this kind of performance for half the price.

Source? Where are the guarantees that we'll have a 20nm chip with GTX 780/R9 290X performance for $299-325 in 8 months? I'd love to hear more.

I am skipping R9 290X and 780 since the price/performance is not sufficient for me to upgrade, but given the progress of GPUs of late and the 20nm delays, it'll be a while before we can get this level of performance for $299. And not everyone already has a 7970/680 in their rig.

Of course none of this has anything to do with Mantle. Care to explain why it is that when NV worked for 10 years with developers to optimize performance for its games via TWIMTBP, it was considered "excellent developer relationships", but when AMD takes it one step further and gives developers access to a lower-level API, it's considered "evil" and "unfair"? Really now?

Working closer with developers, bundling more games, raising prices. That sounds so familiar. What other GPU company perfected this strategy before? :hmm:

If Mantle takes off, maybe NV will price the 20nm 550mm2 Maxwell at $499-549 instead of $1,000. I can see a lot of positives coming out of this if AMD's cards gain 20-30% from Mantle. Mantle will mean more competition, not less, since it will force NV to try that much harder.

Grooveriding said:
Everyone is looking at GCN in both consoles and in PC gaming desktops, with the Mantle API available for all three platforms: PS4, XB One and PC.

MS already stated they are committed to a 10-year life cycle for XB1. If developers could squeeze even 20-30% more out of XB1/PS4 via Mantle, you can bet some will try it. Since AAA games strive for higher production values, graphics and performance, I bet the top studios/1st-party developers on XB1/PS4 will definitely want to learn more about Mantle. Once they learn how to use Mantle to tap into the GCN hardware of XB1/PS4, assuming it brings better performance, it may be better for them, since they will end up with a game hitting that magical 30 fps console gamers are OK with instead of a sluggish 23 fps, while delivering the level of graphics and number of NPCs on screen they desire.
 

blackened23

Diamond Member
Jul 26, 2011
Yeah, apparently only nvidia is allowed to create brand-specific features to differentiate their product. If AMD does the same, well, that's BAD. I think the hypocrisy is pretty amusing.

Anyway, I think it's a pretty ballsy move by AMD - and a good one. They need product-specific features to differentiate their product and separate it from the status quo - and AMD is in a unique position to do so since they have all of the next-gen consoles. So they're basically using that fact to create value and differentiation for their product - it's a pretty smart business move, and we'll see how it plays out. This will especially pay off for AMD if they can get Mantle baked into more multi-platform engines; Frostbite 3 is a good start, as it is multi-platform and will have 10+ games using it. If they can get CryEngine 3 or UE4, that could be a big win for AMD. Activision is also discussing Mantle, although COD games have never been known to be graphically demanding (so I doubt that would create a benefit). Anyway, perhaps this will mark the first smart business move by AMD in the past decade.
 

blackened23

Diamond Member
Jul 26, 2011
http://www.dsogaming.com/news/battl...-a-single-r9-290x-at-amds-event-at-5760x1080/

Battlefield 4's demo (at its recent AMD event) was running on a single R9 290X card at 5760x1080 (though it has not been confirmed whether it was running with constant 60fps. We should note that we did not notice any slowdowns, so we are most probably looking at a 60fps gameplay heaven).

http://www.youtube.com/watch?feature=player_embedded&v=MDiOSbQRUgo

5760x1080 eyefinity on a single card, BF4 gameplay. Pretty impressive I think. I do not believe this demo was using Mantle, however.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
So Mantle is basically designed to let developers directly access the GPU hardware features that DX and OpenGL expose, without going through the CPU cost of all that abstraction. This is actually pretty huge for consoles that people were worried wouldn't have enough CPU power.
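A back-of-the-envelope model shows why that CPU overhead matters. The per-draw costs below are invented purely for illustration (real numbers depend on the driver, OS and hardware):

Code:
#include <cstdio>

int main() {
    // Assumed costs, illustrative only:
    const double thickApiUsPerDraw = 25.0; // state validation + kernel transition
    const double thinApiUsPerDraw  = 2.0;  // append packets to a user-space buffer
    const int    drawsPerFrame     = 4000;

    const double thickMs = thickApiUsPerDraw * drawsPerFrame / 1000.0;
    const double thinMs  = thinApiUsPerDraw  * drawsPerFrame / 1000.0;

    // A 60 fps frame budget is ~16.7 ms. With these assumed numbers the thick
    // API burns 100 ms/frame on submission alone; the thin API burns 8 ms.
    std::printf("thick: %.1f ms/frame, thin: %.1f ms/frame (budget: 16.7 ms)\n",
                thickMs, thinMs);
    return 0;
}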
 

RussianSensation

Elite Member
Sep 5, 2003
And that's the reason why we see so many console ports nowadays. Xbox uses DX and you can easily port it to the PC. No sane company will start a multi-platform game with a low-level API in mind. Optimizing only starts after the port to the target platform.

We'll see about that. PS360+Wii reached >250 million consoles in 7 years of sales. China is about to lift a ban on console sales. If that happens, the installed userbase for PS4/XB1 and Wii U may exceed 250 million. You think major developers making games for these consoles won't want to use an API that gives them a performance advantage and extends the console's useful life? I am pretty sure they would love to sell the next COD/GTA game to 250 million gamers, and if they can get a "free" 20-30% performance increase from Mantle, why wouldn't they use it? Ironically, GTA IV is one of the most poorly ported games on the PC, which undermines your entire position that developers strive to optimize PC ports. Some of these developers would prioritize consoles over PC development without a second thought. They probably wouldn't even blink if the DX port of their billion-dollar game was a poorly optimized PC port. It happened so many times during the PS360 generation already.
 

GaiaHunter

Diamond Member
Jul 13, 2008
blackened23 said:
Yeah, apparently only nvidia is allowed to create brand-specific features to differentiate their product. If AMD does the same, well, that's BAD. I think the hypocrisy is pretty amusing.

Actually, I think it is good for both NVIDIA and AMD to let developers have access to the metal.

Obviously small developers will have problems supporting both an NVIDIA and an AMD render path, but big developers and licensed engines should manage it.

The performance gains in GPUs via pure hardware have been slowing down for a few years, and in fact the average graphics quality hasn't increased dramatically.
 

Haider

Member
May 15, 2008
I would imagine that eventually every cross-platform title will use Mantle. Mantle is as much for the consoles as it is for the PC. Any optimizations devs make for their console games can (allegedly) be used directly on any PC with a GCN GPU. From what I understand, there is no additional work involved after you optimize your console version.

It's working in Frostbite 3, and I think they said there are close to 10 titles coming for that engine. If they get Crytek to put this in CryEngine, and Epic to get it running in UE4, then they have basically got all their bases covered. I would wager that will cover close to 80% of next-gen games being released. Gone are the days where each studio used its own game engine. They license tech from the major players like Epic, Crytek, DICE (first time they are letting others use their engine?), Valve, id, et al. All AMD has to do is get those major engine developers to bake Mantle support into their engines, and anyone releasing a multi-platform game has every reason to use Mantle and literally no reason not to: Mantle to optimize your console version, which automatically translates to GCN-packing PCs, plus a DX or OpenGL version.

They literally have the exact same amount of work to do in the long run. Actually less, because they would no longer have to worry about optimizing their DX or OpenGL version for two GPU manufacturers on the PC. Strangely, that may benefit nvidia a little bit, but not as much as direct-to-metal programming benefits AMD.

From what I have read, to get people up to speed from day one, the high-level API in each console will be used. The consoles have sufficient power for that. The big developers who have the resources will get involved with the low-level calls for the WOW factor. This is conjecture; without a development kit it's hard to say. LibGCM on the PlayStation 3 has the low-level calls as part of it, and LibGCM for the PS4 will probably have the work AMD did for Mantle embedded into it. Obviously the API calls will be similar, as the underlying hardware is GCN. XB1 could well have it as a separate API that works with HLSL. However it's done, it exposes the functions of the GCN hardware regardless of how many compute units etc. are there. That's where the work in porting the code between PS4, XB1 and Radeon will have to be done.

I don't see Mantle being used in every console or PC title. It's more for where you want to optimise and have the know-how and money/resources to do it. Not every developer does. It's there for WOW/AAA titles. Remember when you first saw Crysis?

I would also like DirectX to address the issue of the PC having up to 10x less performance for draw calls. GTA IV, anyone? I know a lot of people are slagging off MS and DirectX. I don't like the fact that they are treating the PC as an afterthought with regard to gaming any more than the next guy, but it's not the worst system in the world. Maybe SteamOS and the Steam box will give them the kick up the backside they need and focus their minds. Anyone who has messed with high-level code knows that it has advantages and disadvantages, just like low-level. I suppose a bit like fan-boys ;)
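For reference, the usual PC-side workaround for that draw-call gap today is batching/instancing in D3D11. A rough sketch (the constant-buffer helper is hypothetical):

Code:
#include <d3d11.h>
#include <vector>

struct Object { UINT indexCount; };

// Hypothetical helper: uploads one object's transform to a constant buffer.
void updatePerObjectConstants(ID3D11DeviceContext* ctx, const Object& o);

// Naive path: one driver round-trip per object. At thousands of objects the
// CPU, not the GPU, becomes the bottleneck -- the "10x" draw-call problem.
void drawNaive(ID3D11DeviceContext* ctx, const std::vector<Object>& objects) {
    for (const Object& o : objects) {
        updatePerObjectConstants(ctx, o);
        ctx->DrawIndexed(o.indexCount, 0, 0);
    }
}

// Batched path: one call draws N instances of the same mesh; per-instance
// transforms come from a second vertex buffer bound ahead of time.
void drawInstanced(ID3D11DeviceContext* ctx, UINT indexCount, UINT instances) {
    ctx->DrawIndexedInstanced(indexCount, instances, 0, 0, 0);
}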
 

Haider

Member
May 15, 2008
I would say it depends on how much of a performance increase and programming-time reduction using Mantle will yield. If a dev can write one block of low-level Mantle code and have it work on all three GCN platforms (Xbox One, PS4 & AMD PC), it might actually be quicker than writing three separate blocks of high-level code. Plus they would end up with higher performance on each by utilizing Mantle.

You also have to look at money. 1,000 lines of C will do what 10,000 lines of assembler will achieve. Optimisation takes money: firstly for the expensive tools that get you the info/profiling on your app so you know what to optimise. Debugging 1,000 lines of C vs 10,000 of assembler? It's also an iterative process. Depending on your budget, it may just be a case of 'we don't have the budget'. Deus Ex was a great game; there were better-looking and better-sounding games, but Deus Ex is one hell of a game. Maybe they spent the budget on the actual game design rather than the graphics etc...
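As a toy example of that leverage, one line of C versus the hand-written scalar SSE it stands in for (the assembly is a from-memory sketch, not real compiler output):

Code:
/* One line of C... */
float dot3(const float* a, const float* b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

/* ...versus roughly what you'd hand-write for x86-64 (System V):
     movss  xmm0, [rdi]        ; a[0]
     mulss  xmm0, [rsi]        ; a[0]*b[0]
     movss  xmm1, [rdi+4]      ; a[1]
     mulss  xmm1, [rsi+4]      ; a[1]*b[1]
     addss  xmm0, xmm1
     movss  xmm1, [rdi+8]      ; a[2]
     mulss  xmm1, [rsi+8]      ; a[2]*b[2]
     addss  xmm0, xmm1
     ret                       ; result in xmm0
   The profiling, debugging and maintenance all scale with that ratio. */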
 

ams23

Senior member
Feb 18, 2013
RussianSensation said:
but when AMD takes it one step further and gives developers access to a lower-level API, it's considered "evil" and "unfair"? Really now?

Think logically about what you are saying. This is an IHV-specific API that is only realistically usable on GCN-equipped hardware. Any proliferation of IHV-specific graphics APIs would be disastrous and chaotic for game development. What do you think would happen if NVIDIA, Intel, Qualcomm, ImgTech and ARM all started pushing IHV-specific graphics APIs? Each of these companies could provide an excuse to do so (NVIDIA is strong at GPU computing, Intel is strong at CPU computing, Qualcomm is strong at ultra-mobile computing, etc.), but that doesn't mean it makes sense for the industry to move in such a direction.

AMD will need to pay developers to make optimal use of "Mantle", but considering their dwindling cash balance and heavy debt load, I'm not sure how far they can go with that strategy.

The repercussions of AMD pushing an IHV-specific API go way beyond just having Sony and Microsoft very upset with them. In the future, don't be too surprised if new game consoles come out without AMD hardware inside (well, the PSP/PS Vita and Nintendo DS/2DS/3DS already do not have AMD hardware inside, but who's counting).
 

RussianSensation

Elite Member
Sep 5, 2003
ams23 said:
The repercussions of AMD pushing an IHV-specific API go way beyond just having Sony and Microsoft very upset with them. In the future, don't be too surprised if new game consoles come out without AMD hardware inside (well, the PSP/PS Vita and Nintendo DS/2DS/3DS already do not have AMD hardware inside, but who's counting).

This is an IHV-specific API that is only realistically usable on GCN-equipped hardware. Any proliferation of IHV-specific graphics APIs would be disastrous and chaotic for game development.

The problem with your theory is that the developers asked AMD to do this. Why would they want AMD to release a new API with lower-level access (something they begged for for years) and then not use it? It's not logical. Secondly, the game development market now consists of 2 leading consoles, both powered by GCN, and I bet AMD will use GCN for at least another 3-4 years. You know how large that userbase will become soon? Don't forget the chicken-and-egg scenario: if you don't release a lower-level API targeting a specific GPU architecture, developers will never use it. Therefore, you have to take a risk. AMD now has PS4/XB1 locked in, plus the HD 7000 series and Rx 200 series. That's a whole lot of GCN parts, and then 20nm, 16nm, etc.

You think MS and Sony wouldn't want a "free" 20-30% performance increase for their consoles from the very company that is providing them with the APUs?

And as for new consoles coming out, well, I don't think we have to worry about that for another 7 years. We don't even know if the next generation of consoles will be physical or a subscription-based model from the cloud. What about from now until PS5/XB2? That's a long time. Mantle is here and now, not in 7 years.
 

stateofmind

Senior member
Aug 24, 2012
RussianSensation said:
The problem with your theory is that the developers asked AMD to do this. Why would they want AMD to release a new API with lower-level access (something they begged for for years) and then not use it? It's not logical.

You think MS and Sony wouldn't want a "free" 20-30% performance increase for their consoles from the very company that is providing them with the APUs?

And as for new consoles coming out, well, I don't think we have to worry about that for another 7 years. What about from now until PS5/XB2?

According to what is written in the article:

1. Because you can run your games on lower-spec systems, reaching a wider crowd. Not everybody can or wants to shell out more and more money on new hardware every now and then, especially when there is no reason for it.

2. Console -> PC porting, to some degree.

3. You are less limited in what you can do visually.
 

ams23

Senior member
Feb 18, 2013
RussianSensation said:
You think MS and Sony wouldn't want a "free" 20-30% performance increase for their consoles from the very company that is providing them with the APUs?

Sony and Microsoft will provide their own very low-level APIs for their closed-platform console hardware. "Mantle" is still different, and will likely have one extra abstraction layer on top of what you see in the consoles, because it needs to support all current and future GCN-equipped hardware and a variety of CPUs too.
 

blackened23

Diamond Member
Jul 26, 2011
ams23 said:
Sony and Microsoft will provide their own very low-level APIs for their closed-platform console hardware. "Mantle" is still different, and will likely have one extra abstraction layer on top of what you see in the consoles, because it needs to support all current and future GCN-equipped hardware and a variety of CPUs too.

I think the real strategy rests on getting Mantle baked into engines that are designed for cross-platform use. Frostbite 3 will have it baked in starting in December for all games to use, and Frostbite 3 will be in nearly every high-profile EA release in 2014, including Dragon Age: Inquisition. I think it's safe to say that Dragon Age 3 using Mantle is pretty much a certainty, since Frostbite 3 will have the ability baked in. If you remember, this year EA consolidated their dev studios heavily and commented that most of their developers would be using Frostbite 3 to power their games. So that would indicate multiple EA games using Frostbite 3 in 2014.

Aside from that, AMD is probably trying to get other engines to use it. The thing is, very few developers build their own from-scratch engines, so we have a situation where a few engines power the majority of games. Despite its aging nature, UE3 still powers quite a few titles. Anyway, I'd imagine AMD would focus on implementing Mantle in the 3-4 game engines that power 80%+ of games, and Crytek has mentioned on Twitter that they're looking into Mantle (this may make sense since AMD has developer relations with Crytek now). CryEngine 3 using Mantle would be a significant milestone; UE4 would be a big win too, but so far Epic has not commented on Mantle. I should add that Activision is also receptive to Mantle, although let's not kid ourselves - Activision games have never pushed the boundary, so to speak, in terms of graphics.

If AMD can manage to get Mantle into the 2-3 game engines that power 80% of games, that would be the winning business strategy. They could presumably power a TON of games with little effort on behalf of any developer - and like I said, those engines power the vast majority of games. Developers are less willing to spend time creating an engine and generally license one these days. I'm 99% sure that AMD will go for this strategy.
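Conceptually, that strategy works because an engine can hide the API choice behind its renderer interface, so licensees get Mantle without touching their game code. A minimal C++ sketch with invented names (no engine's real internals are public):

Code:
#include <memory>

struct DrawBatch { /* vertex/index buffers, shaders, constants, ... */ };

// Engine-internal interface: game code only ever talks to this.
class RenderBackend {
public:
    virtual ~RenderBackend() = default;
    virtual void submit(const DrawBatch& batch) = 0;
};

class D3D11Backend : public RenderBackend {
public:
    void submit(const DrawBatch& batch) override { /* issue D3D11 calls */ }
};

class MantleBackend : public RenderBackend {
public:
    void submit(const DrawBatch& batch) override { /* build Mantle command buffers */ }
};

// The engine flips one switch at startup; every licensed game inherits it.
std::unique_ptr<RenderBackend> makeBackend(bool gcnGpuAndDriverPresent) {
    if (gcnGpuAndDriverPresent)
        return std::unique_ptr<RenderBackend>(new MantleBackend());
    return std::unique_ptr<RenderBackend>(new D3D11Backend());
}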
 