Any word on when Mantle + TrueAudio will launch for Thief?


motsm

Golden Member
Jan 20, 2010
1,822
2
76
Who is using that now?
Occlusion is easily possible on any current hardware capable of gaming; even software solutions like FMOD support it. Developers just rarely bother implementing "advanced" sound features.
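For what it's worth, the basic idea really doesn't need special hardware. Here's a toy sketch in Python (my own made-up figures and function names, not FMOD's actual API): each occluding wall between source and listener drops the gain and muffles the highs with a one-pole low-pass.

```python
import numpy as np

def occlude(signal, sample_rate, walls_between):
    """Toy occlusion: each wall cuts volume and dulls high frequencies."""
    gain = 0.5 ** walls_between             # -6 dB per wall (arbitrary figure)
    cutoff = 8000.0 / (2 ** walls_between)  # Hz; also an arbitrary figure
    # One-pole low-pass filter coefficient for the chosen cutoff
    alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff / sample_rate)
    out = np.empty_like(signal)
    acc = 0.0
    for i, x in enumerate(signal):
        acc += alpha * (x - acc)   # smooth the signal (low-pass)
        out[i] = acc * gain        # then attenuate it
    return out
```

The point is only that this is cheap, per-sample arithmetic; any CPU that can run a game can run it.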
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Thanks, Ryan

2nd :thumbsup:

So it turns out that the convolution reverb that will be enabled in the TrueAudio patch will also have the option to run on the CPU. Glad to see they aren't locking the feature out, as it's a Wwise effect and should run on modern CPUs just fine. I thought they might make it exclusive simply to make TrueAudio look more essential.

http://www.legitreviews.com/amd-mantle-trueaudio-patch-thief-coming-march-18th_137560

With all that said, convolution reverb won't offer anything earth-shattering in a game, but if you have the CPU headroom, why not?

:Edit: Not that anyone probably cares, but the impulse responses were likely recorded by AudioEase, who I'd say are among the best in that field.

:Edit: Ok, it sounds like their AudioEase license is only for the outdoor-spaces library, which is a bit disappointing. The rest may be done in-house, but who knows; an improperly recorded IR can be a lot worse than a standard algorithmic reverb, and they are extremely common. Hopefully they sourced their IRs properly.
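Under the hood, convolution reverb is nothing exotic: it's just the dry signal convolved with a recorded impulse response. A minimal offline sketch in numpy (my own toy code, not Wwise's implementation; `wet_mix` is an arbitrary knob I added for illustration):

```python
import numpy as np

def convolution_reverb(dry, impulse_response, wet_mix=0.4):
    """Convolve a dry signal with a room IR via FFT, then blend dry/wet.
    Offline one-shot render; real-time engines use partitioned convolution."""
    n = len(dry) + len(impulse_response) - 1
    nfft = 1 << (n - 1).bit_length()  # next power of two >= full conv length
    wet = np.fft.irfft(np.fft.rfft(dry, nfft) * np.fft.rfft(impulse_response, nfft))[:n]
    wet /= np.max(np.abs(wet)) or 1.0  # normalize the wet path to avoid clipping
    out = np.zeros(n)
    out[:len(dry)] = (1.0 - wet_mix) * dry
    return out + wet_mix * wet
```

The FFT multiply is exactly the kind of embarrassingly parallel math that runs fine on a modern CPU, which is why offering a CPU fallback costs the developer little.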

AMD doesn't lock features from other people's hardware if it is capable of running them. They actually prefer the wider adoption.
 

motsm

Golden Member
Jan 20, 2010
1,822
2
76
AMD doesn't lock features from other people's hardware if it is capable of running them. They actually prefer the wider adoption.
Wwise has had the convolution reverb for at least a couple of years, and more than likely it has been used in many games before this point as well. So I guess I shouldn't have even thought it would be locked out to anyone. You never know, though; it can sometimes just come down to a lazy developer on the game's end.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
2nd :thumbsup:



AMD doesn't lock features from other people's hardware if it is capable of running them. They actually prefer the wider adoption.


That seems kind of silly to say, seeing how Mantle and TrueAudio run on only a small portion of AMD's own market share.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That seems kind of silly to say, seeing how Mantle and TrueAudio run on only a small portion of AMD's own market share.

Your statement is irrelevant to what I said. Hardware not supporting a feature is not the same as locking a feature so it won't run on a competitor's hardware.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
That seems kind of silly to say, seeing how Mantle and TrueAudio run on only a small portion of AMD's own market share.

Back at the start of the '90s I bought the first Creative 16-bit card with an add-on wavetable card with 4 MB of RAM. It was about 900 USD! - lol
I would have loved for Creative to keep moving our sound forward using 3D techniques. I would have loved it if MS or Nvidia had done it. If it had been Creative add-on tech, it would have been available to more people, adoption would have been secured, and we would have games using it now. Damn sad it didn't happen.

Now here comes a solution. As explained by Ryan, it's perfectly capable of doing what was discussed. The software, the hardware, and the patents are in place.

On the box there are three letters printed: A M D. For some reason that seems to make it a portal for badmouthing the technology. So damn sad that technology, ironically invented and produced by others, is trashed on a technology board because the wrong letters are on the box.

If there is a problem, it's that AMD's lead engineer on the product doesn't know anything about sound, but that just shows how little attention sound has been given during the last 10 years and what dire need we are in.

So please. We all know the hidden agenda of this thread; no need to repeat it. That the agenda is also used in a failed attempt to personally attack Erenhardt because of a picture is pathetic at best.

As for the question about how many reflections the DSP is able to handle, it's completely out of proportion to reality. In BF4 I can hardly get precise horizontal positioning, and nothing useful beyond that. In my own house, which my cognitive system knows very well, positioning after 3-4 reflections is weak at best. I therefore don't give a damn whether we get 0 or 4 reflections in games; just take me 99% of the way. And I can say that as a user of Linkwitz Orion dipole speakers, with self-designed dipole speakers for my surround system. So I would say I actually take reflections seriously in the real world.
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I'd love it if we could get SDK access as regular ol' Joes to the TrueAudio DSPs. I want to see if someone could come up with a way to hack the Axe-FX II firmware to run on those Tensilica DSPs and save myself a couple grand in the process. That would be awesome. I'd poke around at it if I had SDK access. Do any of you have additional information on that? It's probably not possible, as the DSPs are most likely too dissimilar to run it without source-level modifications; the Axe-FX II uses 2 of Analog Devices' TigerSHARC DSPs @ 600 MHz.

Or at the least, be able to adapt some open-source plugins to run some really nice rack-quality studio effects through it. I'm champing at the bit to see if I can't port a particular open-source reverb VST plugin to run on it.

Although, digging a little, I think my programming chops aren't up to snuff for this kind of modeling: http://www.normankoren.com/Audio/Tubemodspice_article.html
 
Last edited:

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
This seems to be the way at AMD at the moment for some reason: they have "released" these technologies, but only to a select group of companies. Everyone else is shut out from using them. It doesn't really make a lot of sense. I, for instance, am quite keen to see just how much performance these DSPs have, what sort of things can be run on them, whether they have access to GPU data, and such. But it's all private right now; I can't access any of it despite the cards being out for quite a while.

I don't understand the secrecy, but it is not the way to build an ecosystem around an API or piece of hardware.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
AMD seems to want to build up hype before they have something ready. I can't think of any other reason why they would announce something and then only allow selected developers to use it for the foreseeable future.

Not a cool way to do things, that's for sure.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
I'd love it if we could get SDK access as regular ol' Joes to the TrueAudio DSPs. I want to see if someone could come up with a way to hack the Axe-FX II firmware to run on those Tensilica DSPs and save myself a couple grand in the process. That would be awesome. I'd poke around at it if I had SDK access. Do any of you have additional information on that? It's probably not possible, as the DSPs are most likely too dissimilar to run it without source-level modifications; the Axe-FX II uses 2 of Analog Devices' TigerSHARC DSPs @ 600 MHz.

Or at the least, be able to adapt some open-source plugins to run some really nice rack-quality studio effects through it. I'm champing at the bit to see if I can't port a particular open-source reverb VST plugin to run on it.

Although, digging a little, I think my programming chops aren't up to snuff for this kind of modeling: http://www.normankoren.com/Audio/Tubemodspice_article.html

Yeah. The good old TigerSHARC. Man, if the software were open, the sound could even be perfectly adapted to your headphones and ears. Imagine calibrating the sound in a game, e.g. BF4, to your own headphones, ears, and cognitive system. And recalibrating after you have played a lot in-game. Then select your 3D sound profile for the day, e.g. one you named "Today I am angry on Metro and don't want to camp" lol.
 

motsm

Golden Member
Jan 20, 2010
1,822
2
76
On the box there are three letters printed: A M D. For some reason that seems to make it a portal for badmouthing the technology. So damn sad that technology, ironically invented and produced by others, is trashed on a technology board because the wrong letters are on the box.

If there is a problem, it's that AMD's lead engineer on the product doesn't know anything about sound, but that just shows how little attention sound has been given during the last 10 years and what dire need we are in.

So please. We all know the hidden agenda of this thread; no need to repeat it. That the agenda is also used in a failed attempt to personally attack Erenhardt because of a picture is pathetic at best.
There is no hidden agenda, and no one really badmouthed AMD to any tangible degree. I and a couple of others have just been trying to straighten out what is fact and what is speculation, nothing more.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Why in the world do the review sites go along with AMD's instructions for benchmarking these things?

A 290X on low settings @ 1080p? Lol.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Maybe for the same reason they go along with NV's instructions?

I'm not saying it's the right way, but double standards aren't right either.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
And those are...?

Off the top of my head, and I remember this from the Kepler launch:

use a fully enclosed case while benchmarking, and disable any driver optimization that might compromise IQ.

The exact wording was "we encourage reviewers to..."
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Off the top of my head, and I remember this from the Kepler launch:

use a fully enclosed case while benchmarking, and disable any driver optimization that might compromise IQ.

The exact wording was "we encourage reviewers to..."

If a reviewer followed those instructions, the frame rate would be affected. So if I got those instructions, I would think that Nvidia wanted to keep it realistic and true. As in, the results wouldn't be artificial.

The average consumer will use a case, thus the suggestion to benchmark in an enclosed one. And I have no idea what would be wrong with disabling optimizations that compromise IQ; having those would only pump up the fps. Both suggestions are commendable in every way.
 
Last edited:

Jodell88

Diamond Member
Jan 29, 2007
8,762
30
91
Why in the world do the review sites go along with AMD's instructions for benchmarking these things?

A 290X on low settings @ 1080p? Lol.
Mantle helps in CPU-bound situations, not GPU-bound situations. Benchmarking a 290X on low @ 1080p creates a CPU-bound situation, and as you can see, Mantle works.
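A back-of-the-envelope way to see why: frame time is roughly the longer of the CPU's and GPU's per-frame work, so cutting CPU overhead only shows up when the CPU side is the longer one. A toy Python model with made-up numbers (the 40% figure is hypothetical, not a measured Mantle gain):

```python
def fps(cpu_ms, gpu_ms):
    """Frame rate is set by whichever side takes longer per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

def with_mantle(cpu_ms):
    # Hypothetical: assume the API cuts CPU submission cost by 40%
    return cpu_ms * 0.6

# GPU-bound (ultra settings): the GPU dominates, so the CPU saving is invisible
print(fps(8.0, 16.0), fps(with_mantle(8.0), 16.0))  # 62.5 vs 62.5
# CPU-bound (low settings on a fast GPU): the saving shows up directly
print(fps(8.0, 4.0), fps(with_mantle(8.0), 4.0))    # 125.0 vs ~208.3
```

Dropping to low settings shrinks the GPU's share of the frame until the CPU becomes the limit, which is exactly the regime a draw-call-overhead reduction is designed to help.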
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Mantle helps in CPU-bound situations, not GPU-bound situations. Benchmarking a 290X on low @ 1080p creates a CPU-bound situation, and as you can see, Mantle works.

It almost feels like creating artificial situations that are intended to show Mantle in the best light possible. I don't know about anyone else here, but I'm not so interested in low-end stuff, and I'm not interested in going out of my way to gimp my image quality settings just for the express purpose of showing Mantle in a good light.

Does someone buying a 290X or 780 Ti plunk down that much cash and say to themselves, "hey, let me run at LOW SETTINGS, just for the benchmarks"? Or even more interestingly, do they buy such GPUs with the express purpose of pairing them with some slow CPU just to boost Mantle's benchmarks? I would say no to both, especially considering that Mantle has two titles slated for 2014, plus 2-3 other "maybe" titles without firm release dates.

This is just my opinion. Mantle is benefiting the high end, and it's an okay benefit, not an amazing one. It is what it is. It's what, a 5-10% benefit for high-end hardware? Just leave it at that. I don't see why AMD is going out of their way to ask reviewers, "hey, test it at low settings on an IVB-E with a 290X." What the heck is that? Nobody plays their games on an Ivy Bridge with a $600+ GPU like that.

Just my $.02. Feel free to disagree if you think low settings on a 290X + X79 platform is something that should be tested. Why can't AMD just let it be what it is: a 5-10% framerate benefit for the high end. How is that bad? 5-10% performance for free is an incredible value add for AMD users. But instead, we have AMD telling reviewers "hey guys, test low settings on your 290X + IVB-E," which sounds completely stupid to me. Leave it alone. Don't try to manipulate tests with low settings on hardware that costs over $1,500 for just the CPU and GPU.

Oh, and just to be clear: if you want to test the low end, I'm all for that. That's fine with me. But test the LOW END. Test something like an FX-6300 CPU with a 260X GPU. That's realistic. Testing low settings at 1080p with an IVB-E and a 290X? Give me a break. The latter test is eyeroll-worthy.
 
Last edited:

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
^ Give me a break?

Nice rant, but did you ignore the fact that they test both max settings and low settings? You can see them both. They didn't even exclude max settings, so what are you complaining about again?
http://www.anandtech.com/show/7868/evaluating-amds-trueaudio-and-mantle-thief

Crossfire, quadfire, and CPU-limited situations (maybe an i7 920 user?) can benefit from Mantle by more than the few percent that a single card with a highly overclocked $1,000 CPU can. Also, CPUs at stock are more bottlenecked than with major OCs.

Let me ask you this: what are 290 users running as CPUs? I'd guess most are not using the $1k Extreme Edition; probably anything from an i7 920 to a 4770K.

What's wrong with seeing both data sets? (rhetorical question)
 

Jodell88

Diamond Member
Jan 29, 2007
8,762
30
91
It almost feels like creating artificial situations that are intended to show Mantle in the best light possible. I don't know about anyone else here, but I'm not so interested in low-end stuff, and I'm not interested in going out of my way to gimp my image quality settings just for the express purpose of showing Mantle in a good light.

Does someone buying a 290X or 780 Ti plunk down that much cash and say to themselves, "hey, let me run at LOW SETTINGS, just for the benchmarks"? Or even more interestingly, do they buy such GPUs with the express purpose of pairing them with some slow CPU just to boost Mantle's benchmarks? I would say no to both, especially considering that Mantle has two titles slated for 2014, plus 2-3 other "maybe" titles without firm release dates.

This is just my opinion. Mantle is benefiting the high end, and it's an okay benefit, not an amazing one. It is what it is. It's what, a 5-10% benefit for high-end hardware? Just leave it at that. I don't see why AMD is going out of their way to ask reviewers, "hey, test it at low settings on an IVB-E with a 290X." What the heck is that? Nobody plays their games on an Ivy Bridge with a $600+ GPU like that.

Just my $.02. Feel free to disagree if you think low settings on a 290X + X79 platform is something that should be tested. Why can't AMD just let it be what it is: a 5-10% framerate benefit for the high end. How is that bad? 5-10% performance for free is an incredible value add for AMD users. But instead, we have AMD telling reviewers "hey guys, test low settings on your 290X + IVB-E," which sounds completely stupid to me. Leave it alone. Don't try to manipulate tests with low settings on hardware that costs over $1,500 for just the CPU and GPU.

Oh, and just to be clear: if you want to test the low end, I'm all for that. That's fine with me. But test the LOW END. Test something like an FX-6300 CPU with a 260X GPU. That's realistic. Testing low settings at 1080p with an IVB-E and a 290X? Give me a break. The latter test is eyeroll-worthy.
I highly doubt AMD told them to use Ivy-E. It seems it was chosen because it could simulate the low end without swapping the processor (2C/4T). If you look at the 2C/4T test @ 3.3 GHz paired with the 260X, you see a similar gain in favour of Mantle. The test would be akin to a $120 CPU paired with a $120 GPU.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
I would consider 120Hz gaming and multi-GPU gaming, where Mantle has the potential for the highest benefit (not sure if it's fully realized in BF4 with the current drivers; last I heard there was still microstutter, and I can't test myself), to be distinctly high end.

Actually, I would consider 120Hz, 144Hz, and multi-GPU to be the highest-end gaming. There are no more CPU-demanding situations currently than those.

So I think you're wrong, but feel free to disagree.

EDIT: And Mantle's benefits to the low end are academic at best, as the low end has a "hard cap" at 60 FPS because of their monitors, and a monkey banging rocks together would hit that no problem. Real CPU limits start once you try to push past 80-90 FPS, BF4 notwithstanding, as it is currently an outlier with respect to how demanding it is on CPUs.
 
Last edited:
Aug 11, 2008
10,451
642
126
To me, it depends on the game. For BF4, it is reasonable to test low settings/resolutions because some players (what percent, I don't know) use those settings to get a max framerate for multiplayer. In Thief, OTOH, since it is a single-player game, it would seem to me that the vast majority of players will be cranking up the image quality so that they are playing at the max their GPU will allow, so that will be a more realistic scenario.

The problem comes in when either mantle detractors or fans emphasize one situation or the other to promote their agenda.

My take is that Mantle is a nice benefit in certain situations in the two games that support it. It is what it is. Right now it is a value add for AMD GPUs, but not widely used or beneficial in enough different scenarios to change the overall gaming landscape unless one almost exclusively plays BF4.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Here are some really nice graphs from Tom's:
http://www.tomshardware.com/reviews/thief-mantle-benchmarks,3773.html
[Image: Thief-Mantle-Low.png]


Let's start with the slow R7-250X.
What would be a perfect CPU for that build? Definitely not a 4770K. An 8350? No. I would say a dual-core Intel or a 4-core FX would match. Let's look at the graph... Bam! Mantle shows improvements with the 8-core FX. Not much, but the difference in min fps is noticeable. Why would you go for an FX-8350 when you get the same performance from 4 slower cores at half the price? Meanwhile, the i7 is faster even in DX, but what is the point?

[Image: Thief-Mantle-Mid.png]

Let's do the same with the R9-270. I would pair it with an FX-6300 for a relatively cheap gaming rig. With Mantle you can have almost the same (<10% diff) performance using an FX-4170 as if you were running an i7-4770.

Looks like my 7870 and Phenom II 965 combo will stay with me for longer. Had an itch to upgrade to an FX-6350 or FX-8320... oh well... ;)
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
@blackened23

What happens when 20nm GPUs hit and even high-end CPUs are possibly always the bottleneck? An API like Mantle is going to be more and more important moving forward, because CPUs currently do not scale performance with each generation like GPUs do.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
@blackened23

What happens when 20nm GPUs hit and even high-end CPUs are possibly always the bottleneck? An API like Mantle is going to be more and more important moving forward, because CPUs currently do not scale performance with each generation like GPUs do.

Not to mention Mantle is much smoother and more consistent.