AMD V Nvidia by Richard Huddy


Genx87

Lifer
Apr 8, 2002
41,091
513
126
Then I don't know why you feel so compelled to paint ATI/AMD as the worst company ever in every thread. It's a trend and people have noticed it, you know? I told you several times: try to be as neutral as you can.

Yes he does, because you said so, right?

The guy is the director of dev relations for AMD and you don't believe he has an agenda?
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Way to dispute the statement there, chief. Don't attack someone's character just because you aren't capable of refuting their point.

The thread is about Huddy. It says so right in the title. Should he not be questioned or is everything he says taken as gospel?

Well, I'm just fed up with his constant attacks on a company that gives so much back to all of us. I respect his technical knowledge, as I have said many times, but it gets tiring to have to read all the thread crap he creates.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Well, I'm just fed up with his constant attacks on a company that gives so much back to all of us. I respect his technical knowledge, as I have said many times, but it gets tiring to have to read all the thread crap he creates.

I don't see any crap in this thread. He is simply refuting what Huddy said. And what does AMD "give back" to us? They sell a product, we spend our money on that product.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Then I don't know why you feel so compelled to paint ATI/AMD as the worst company ever in every thread.

I'm just saying that Huddy wasn't aware of Futuremark's shader algorithms (or deliberately misrepresented them), and tried to make Futuremark look bad by claiming they could have used 3Dc to make 3DMark faster and improve IQ, neither of which was true.
So not only did he lie about 3Dc's merits for 3DMark (which he should have known about, or at least could have known about, since ATi devrel was working with Futuremark), but he basically promoted ATi's technology at the expense of Futuremark, a company that is part of ATi's devrel program. How's that for relations?

Yes he does, because you said so, right?

Did you read his job description?
"Worldwide Developer Relations and European ISV Relations Manager at AMD"
It's his job to promote AMD's products. That is his agenda.
If you're looking for neutral, Huddy most certainly is not it (nor are the PR people of nVidia, Intel, or whatever other companies. Their job is to make their products look good). And if Huddy were neutral, he wouldn't be doing his job very well.

I'm neutral, but if your idea of neutral is Richard Huddy, then yes, apparently to you I'd look biased. You'd just be looking at it from the wrong direction.
 
Last edited:

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
Then I don't know why you feel so compelled to paint ATI/AMD as the worst company ever in every thread. It's a trend and people have noticed it, you know? I told you several times: try to be as neutral as you can.

Yes he does, because you said so, right?

If Huddy posted on this forum would you consider him an unbiased source that had no agenda? Or would everything he says be suspicious because of his affiliation with AMD?
 
Last edited:

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
In a thread previously made up of many viewpoints, on page 3 Scali comes in and takes over 44% (11 of 25) of the posts, and the thread goes directly downhill and becomes more about Scali than anything else.

Personally, that's not what I want, and whenever a user, in any forum, is spamming their viewpoint that hard, it is always to the detriment of the topic at hand.


Call-outs are not acceptable and are considered to be a form of personal attack.

Moderator Idontcare
 
Last edited by a moderator:

AnandThenMan

Diamond Member
Nov 11, 2004
3,979
589
126
He lied and you are defending the liar. :)
Of course he is. Is anyone even the least bit surprised? Scali has a clear agenda to defend Nvidia at all costs. It's painfully obvious.
It's a fact. I don't have an agenda...
This is a lie. And for you to expect anyone to believe otherwise insults everyone on this forum. At least be a man and own up to the fact that you have an agenda. Of course you don't have the integrity to do that.

BTW, as for saying that JHH never claimed the card he held up was a working card: that's true. But everyone has the expectation that when you hold up your latest and greatest piece of hardware, it's the real thing. Not a hack, not a sawed-off PCB, not a card that has no hope of operating. But the real deal. It's reasonable to expect that, and everyone DID expect that. And when people noticed the card was a non-working fake, Nvidia PR backpedaled and went into spin mode.

It's one of the most dishonest stunts I've seen TBH.

BTW, I find Scali's attacks on Huddy's character pathetic.


Personal Attacks are not acceptable.

Moderator Idontcare
 
Last edited by a moderator:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
The one thing Scali does is back up his opinions with good reasons. I've seen him post a few good ones about this guy Huddy so far. He doesn't just knock someone for no reason.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
Of course he is. Is anyone even the least bit surprised? Scali has a clear agenda to defend Nvidia at all costs. It's painfully obvious.

This is a lie. And for you to expect anyone to believe otherwise insults everyone on this forum. At least be a man and own up to the fact that you have an agenda. Of course you don't have the integrity to do that.

BTW, as for saying that JHH never claimed the card he held up was a working card: that's true. But everyone has the expectation that when you hold up your latest and greatest piece of hardware, it's the real thing. Not a hack, not a sawed-off PCB, not a card that has no hope of operating. But the real deal. It's reasonable to expect that, and everyone DID expect that. And when people noticed the card was a non-working fake, Nvidia PR backpedaled and went into spin mode.

It's one of the most dishonest stunts I've seen TBH.

BTW, I find Scali's attacks on Huddy's character pathetic.

Yes, Nvidia PR telling people it was a mock-up when asked about it is some crazy spin all right (see the bit-tech article for proof of this). Do you have any points to back up your statement that Scali's points on Huddy are untrue? Or do you just not like them because you are a DAAMIT fan?
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,979
589
126
Yes, Nvidia PR telling people it was a mock-up when asked...
When asked being the key. Only when Nvidia had no choice did they own up to the truth. When a new product is unveiled, people expect it to represent what will actually be for sale. Even a pre-production sample works; it's just not 100% representative of what will be in the box. And keep in mind the source: Nvidia reps told us that Fermi was on track for a 2009 release, which of course was far from reality. (JHH still thinks Fermi was released in 2009.) No matter how you slice it, Nvidia tried to pull the wool over everyone's eyes. If AMD or Intel had done the same thing, I and most everyone else would have condemned the stunt just the same.

As for Huddy, I'm not here to defend or dispute things he has said in the past. He works for AMD and previously worked for Nvidia, so what he said in the context of being employed by them is just that: in context. Has he lied in the past? Probably. Do I like it? No. Same as I don't like JHH standing up in front of everyone and taking them all for fools. It reflects badly on him and his company.

But Scali does not work for Nvidia, although you would never know it. Everyone has their preferences, and everyone has bias. But Scali takes it to another level. And that level is someone with a clear agenda.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,979
589
126
I don't see what this has to do with Huddy's character.

Is this a case of "if Nvidia does it, so can Huddy"?
It's not a case of whether they can, but whether they should. I personally think Huddy needs to pipe down and not do these types of Q&As anymore. Nothing good can come of it; it just ends up being a pissing contest in the end, regardless of who has the moral high ground. But at the same time, I do respect Huddy and listen to what he has to say. He's been in the business for a long time.

Same goes for JHH, when he talks about technology I listen, the man knows what he's talking about. He didn't found a company from scratch and turn it into a market leader by being a dummy. But at the same time, JHH has started to drink way too much of his own kool-aid and IMO is very arrogant. He thought nothing of holding up a fake card, he knew exactly what he was doing.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Like in all other threads, when it comes to an argument, Richard Huddy's and Charlie Demerjian's words will be used as proof. Can Richard say anything negative about AMD in public? Yes, if he has no problem getting fired. Several days ago there was a thread about "viral advertisers" and how their words can't be trusted. Well, Richard is an employee of AMD.

Richard is very professional when it comes to speeches. Instead of lying, he prefers to mislead readers. As I have pointed out, there is no "well known" method to get MSAA plus deferred shading in games written in DX9, but there is a "well known" method for games written in DX10. Putting Batman: AA together with the "well known" method, minus the DX10 part, makes people believe that it is actually easy to make MSAA work alongside deferred shading under DX9. However, simply google "deferred shading MSAA DX9" and you will see that it is a "well known" problem rather than one with a "well known" solution.
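
To illustrate why the combination is awkward, here is a minimal sketch in plain Python (no real graphics API; the G-buffer layout and light model are invented purely for illustration). The point it demonstrates is that a deferred lighting pass with MSAA has to light each sub-pixel sample of the G-buffer individually, which DX10-class APIs let you do and DX9 does not:

```python
# Minimal sketch (plain Python, no graphics API) of deferred shading with MSAA.
# The G-buffer layout and the light model are invented for illustration only.

def shade_sample(normal, albedo, light_dir=(0.0, 1.0, 0.0)):
    """Trivial Lambert-style lighting for a single G-buffer sample."""
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(c * n_dot_l for c in albedo)

def resolve_pixel(samples):
    """Light EACH stored sub-sample, then average the lit results.
    This is the step that needs per-sample access to a multisampled
    G-buffer -- available in DX10-class APIs, but not exposed by DX9."""
    lit = [shade_sample(s["normal"], s["albedo"]) for s in samples]
    return tuple(sum(channel) / len(lit) for channel in zip(*lit))

# One edge pixel whose 4 MSAA samples straddle two different surfaces:
pixel = [
    {"normal": (0.0, 1.0, 0.0), "albedo": (0.8, 0.2, 0.2)},
    {"normal": (0.0, 1.0, 0.0), "albedo": (0.8, 0.2, 0.2)},
    {"normal": (1.0, 0.0, 0.0), "albedo": (0.2, 0.2, 0.8)},  # second surface
    {"normal": (1.0, 0.0, 0.0), "albedo": (0.2, 0.2, 0.8)},
]
print(resolve_pixel(pixel))  # (0.4, 0.1, 0.1): a properly anti-aliased edge pixel
```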

As for tessellation, it is rather amusing. What is DX11? To most people it is tessellation. DX11 is open to both AMD and Nvidia, but Richard leads people to believe that Nvidia has done tricks with tessellation.

“Tessellation is about enriching detail, and that’s a good thing, but nVidia is pushing to get as much tessellation as possible into everything”. Huddy reminds us of the problem you get when adding too much salt to your pasta. “Tessellation breaks your image components down in order to add more detail at lower levels and, when it’s done right, it can be stunning”.

Huddy says that you can have too much of a good thing, although at KitGuru we're not entirely sure if that applies to sun tan lotion models or gold bullion bars or genie wishes.
Huddy then got scientific, “These days, the most typical resolution for serious gaming is 1080p. A resolution where you have 1920 dots left to right and 1080 from top to bottom. That gives you around 2 million pixels’ worth of work onto the final screen. However, you actually end up working with a much larger picture, to allow for things like light sources and shadow generators that are off screen in the final image, but which still need to be accounted for. The same goes for situations where something is in front of something else, but you don’t know that at the start, so you end up doing work on pixels that may or may not make it to the final cut”.
“Overall, the polygon [Triangle - Ed] size should be around 8-10 pixels in a good gaming environment”, said Huddy. “You also have to allow for the fact that everyone’s hardware works in quads. Both nVidia and AMD use a 2×2 grid of pixels, which are always processed as a group. To be intelligent, a triangle needs to be more than 4 pixels big for tessellation to make sense”.
Interesting enough, but why are we being told this? “With artificial tests like Stone Giant, which was paid for by nVidia, tessellation can be done down to the single pixel level. Even though that pixel can’t be broken away from the 3 other pixels in its quad. Doing additional processing for each pixel in a group of 4 and then throwing 75% of that work away is just sad”.

As I understand it, the workload on Nvidia's card is no less than on AMD's card, so what is wrong? Stone Giant is used to test the strength of video cards; it is not a game. Its purpose is to drive video cards to their maximum, not to generate maximum FPS.

Richard understands what it is supposed to do; he understands it very well. However, instead of presenting its purpose, he misdirects readers into thinking otherwise.
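
For what it's worth, the arithmetic behind the figures quoted above is easy to check. A back-of-the-envelope sketch in plain Python, using only the numbers from the quote (nothing here comes from Stone Giant itself):

```python
# Back-of-the-envelope check of the figures quoted above. Nothing here comes
# from Stone Giant itself; it is just the arithmetic behind the quoted claims.

width, height = 1920, 1080
pixels = width * height
print(pixels)                    # 2073600 -- the "around 2 million pixels"

# Hardware shades pixels in 2x2 quads. A triangle covering a single pixel
# still occupies a whole quad, so 3 of the 4 shaded pixels are thrown away:
covered, quad_size = 1, 4
print(1 - covered / quad_size)   # 0.75 -- the "throwing 75% of that work away"

# At Huddy's suggested 8-10 pixels per triangle, a 1080p frame works out to
# roughly this many on-screen triangles (ignoring off-screen and overdrawn work):
for target in (8, 10):
    print(target, pixels // target)   # 8 -> 259200, 10 -> 207360
```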
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Like in all other threads, when it comes to an argument, Richard Huddy's and Charlie Demerjian's words will be used as proof. Can Richard say anything negative about AMD in public? Yes, if he has no problem getting fired. Several days ago there was a thread about "viral advertisers" and how their words can't be trusted. Well, Richard is an employee of AMD.

Richard is very professional when it comes to speeches. Instead of lying, he prefers to mislead readers. As I have pointed out, there is no "well known" method to get MSAA plus deferred shading in games written in DX9, but there is a "well known" method for games written in DX10. Putting Batman: AA together with the "well known" method, minus the DX10 part, makes people believe that it is actually easy to make MSAA work alongside deferred shading under DX9. However, simply google "deferred shading MSAA DX9" and you will see that it is a "well known" problem rather than one with a "well known" solution.

As for tessellation, it is rather amusing. What is DX11? To most people it is tessellation. DX11 is open to both AMD and Nvidia, but Richard leads people to believe that Nvidia has done tricks with tessellation.



As I understand it, the workload on Nvidia's card is no less than on AMD's card, so what is wrong? Stone Giant is used to test the strength of video cards; it is not a game. Its purpose is to drive video cards to their maximum, not to generate maximum FPS.

Richard understands what it is supposed to do; he understands it very well. However, instead of presenting its purpose, he misdirects readers into thinking otherwise.
I will be interested, then, in how well the NVDA cards do on this benchmark...
http://www.legitreviews.com/article/1444/3/
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Nobody has asked me any questions yet... Weird.
You don't even want me to prove my point apparently.

I mean, how many of you here have written stuff like this, for example:
http://scali.scali.eu.org/caustic/demo.html

Or this:
http://bohemiq.scali.eu.org/croissant9/demo.html

Both are full software implementations: one a photon-mapping raytracer, the other a triangle rasterizer.

And then there would be plenty of hardware-accelerated stuff, such as some of the stuff here:
http://bohemiq.scali.eu.org/forum/viewforum.php?f=4
Or here:
http://sourceforge.net/projects/bhmfileformat/

And that's just some of the hobby projects I've done in my spare time.

Hate me all you want, but I know graphics. It's my job. I am betting that none of you can get anywhere close with your knowledge or experience. Nor Huddy, for that matter.
I've been doing graphics for about 20 years now, I think... I've pretty much seen and done it all... Amiga, (S)VGA, OpenGL, Direct3D, raytracing, reyes rendering, shaders, texture filters... you name it.
 
Last edited:

Scali

Banned
Dec 3, 2004
2,495
0
0
As I understand it, the workload on Nvidia's card is no less than on AMD's card, so what is wrong? Stone Giant is used to test the strength of video cards; it is not a game. Its purpose is to drive video cards to their maximum, not to generate maximum FPS.

Yup, the problem is simply that the 5000-series has a weak tessellator. Huddy tries to discredit any benchmarks that point out this weak spot, but basically he's saying: Our hardware cannot handle high polycounts/resolutions very well, but we don't think such image quality enhancements are important.

I said it right away on my blog, when I had my 5770 and played around with it a bit: the tessellator is too weak to be of any practical use. That was long before Fermi was even out, so I didn't know whether or not nVidia would do any better. I wasn't comparing it against anything else, or trying to call out which was 'better'... just establishing the fact that AMD's implementation couldn't do what it should be doing (much like the SM2.0 implementation on the GeForce FX, for example... it's there, but it's too slow to use).
AMD knows this. And I hope they fixed it in the 6000-series. For now, Huddy tries to 'fix' it with FUD.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Scali has a clear agenda to defend Nvidia at all costs. It's painfully obvious.

It is not, actually.
Firstly, I don't just randomly crap on any AMD thread. I haven't said a word in most threads regarding the upcoming 6000-series. I could have spread a lot of FUD in favour of nVidia, if I wanted. But I didn't.
Secondly, I defend AMD where necessary as well, and I also criticize nVidia, Intel or any other company, when they are spreading misinformation or whatnot. Google around a bit and you'll see.

Problem is that these days, things are very pro-AMD, and AMD's PR department is rather amateurish. You won't often catch nVidia or Intel engaging in the kind of transparent FUD and misinformation that, for example, Richard Huddy is spreading. Their PR strategy is just different.
I mean, AMD is the company with Randy Allen claiming a 40% performance boost over Intel's top products with Barcelona. When have you ever heard any other company make such bogus claims? I would criticize other companies just as much if they made such claims... But really, none of them has such poor PR; AMD really takes the cake.

Really, the pro-nVidia song is getting really old, really fast.
I suggest you do some research and figure out where I'm coming from. I was an early adopter and promoter of the Radeon 8500, because it just was a better card than the GeForce 3 at the time. I also pushed the Radeon 9000-series hard, and fought all the nVidia fanboys who thought that the poor SM2.0 performance of the GeForce FX in all games and benchmarks was just a conspiracy against nVidia.
At that time though, I also defended Futuremark's use of nVidia's DST/PCF shadowmapping features, despite not owning any nVidia hardware during that period.
And I also wrote an application that demonstrated the off-by-one error in the alpha-blending circuit of the Radeon 9000-series, the card I used at the time and which I continued to back despite this flaw.
I don't care about the brand. I care about promoting good products, and getting the truth out to people, fighting the FUD.
This year I've bought both a Radeon 5000-series card and a GeForce 400-series card. That's about as neutral as you can get.
 
Last edited:

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
I don't see what this has to do with Huddy's character.

Is this a case of "if Nvidia does it, so can Huddy"?

See the first part of his quote that I bolded... it was obviously sarcastic, since Nvidia told people it was a mock-up = no spin... that is what this had to do with it.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
...basically he's saying: Our hardware cannot handle high polycounts/resolutions very well, but we don't think such image quality enhancements are important.

What if they aren't? How many current games use tessellation to the point where the current top 5xxx series can't really make use of it?

Also, if you have a benchmark that wastes 75% of the tessellation it uses, and I'm sure that since it's a benchmark it uses A LOT more tessellation than ANY playable current game does, how is that a representation of the real world of gaming?

Is it so far-fetched to think that Nvidia, because they have better tessellation (on their current-gen cards), want benchmarks to not reflect the real world of gaming so they look better in the eyes of buyers?

Short answer: do Nvidia's cards do tessellation better, 5xxx series vs 4xx series? Yes.

Longer answer: does it currently matter in games? Probably not.



Have you seen how the 4xx cards do in DirectX 7-8 games? There are about 500,000 FFXI players, and some might have upgraded to 4xx cards.
Currently 4xx card users are getting 5 fps on 450/460/470/480 cards, where older cards can run that game fluently. And this isn't just FFXI; there are A LOT of older games where the 4xx series just runs like crap.

Does buying a current-gen card and finding out it's running at 1/10th the speed of your old card on old games affect your gameplay? Yes, probably more than this tessellation issue.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
What if they aren't? How many current games use tessellation to the point where the current top 5xxx series can't really make use of it?

That's not the point, is it?
nVidia did the same when AMD had DX10.1 hardware, and they didn't. AMD tried to promote the extra features, while nVidia downplayed them.
Basically the argument is always "Old/slow hardware is good enough, don't get the newer/better stuff", which is obviously weak.

Also, if you have a benchmark that wastes 75% of the tessellation it uses

Wait just a second, we're not going to use Huddy's 75% figure as any kind of factual basis.

Is it so far-fetched to think that Nvidia, because they have better tessellation (on their current-gen cards), want benchmarks to not reflect the real world of gaming so they look better in the eyes of buyers?

The reality is different. AMD promoted tessellation when they were the only ones with DX11 hardware on the market. Now nVidia is promoting tessellation as well. Result: a lot of new titles will support tessellation, and thus will be close to the benchmarks.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I don't see any crap in this thread. He is simply refuting what Huddy said. And what does AMD "give back" to us? They sell a product, we spend our money on that product.

As VC&G moderator I agree. Scali is providing a point of view that many of us here can benefit from being exposed to, regardless of whether we agree with it or not.

(the message below is not directed towards Genx87 but rather is directed to the thread's participants en masse)

If you take it personally or make it personal, then you are making the discussion needlessly painful for yourself, and your responses in this thread can become a problem both for your fellow forum members and for your moderation staff.

Right now there are some very borderline posts in this thread... bordering on being actionable personal attacks against Scali, as well as actionable misrepresentations of fact ("JHH is a liar...", etc.).

Take this as your only official warning to knock it off. Discussing credibility is acceptable, but making negative characterizations and ad hominem attacks on your fellow forum members is not.

Moderator Idontcare
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
...
Also, if you have a benchmark that wastes 75% of the tessellation it uses, and I'm sure that since it's a benchmark it uses A LOT more tessellation than ANY playable current game does, how is that a representation of the real world of gaming?
You have to read every single word Richard used to understand what he is trying to say. He said most LCDs are 1920 pixels from left to right and 1080 from top to bottom, which is about 2 million pixels. At this resolution, the degree of tessellation is higher than the resolution can display. It is a fact that it takes at least 3 pixels to form a triangle (3 dots), but that has nothing to do with the size of the smallest triangle produced by tessellation. In fact, the point isn't to display small triangles, but the object that is assembled from those triangles. For example, what is the smallest triangle size needed to construct a round object? Remember, when a picture is in motion, humans can actually perceive more detail than the resolution can support. The 75% is a number pulled out of thin air, based on a still 2D picture at 1080p resolution.

Is it so far-fetched to think that Nvidia, because they have better tessellation (on their current-gen cards), want benchmarks to not reflect the real world of gaming so they look better in the eyes of buyers?

Short answer: do Nvidia's cards do tessellation better, 5xxx series vs 4xx series? Yes.

Longer answer: does it currently matter in games? Probably not.
IMO, the 4xx series does not do tessellation better than the 5xxx series. However, the 4xx series seems to have more processing power than the 5xxx, and the 5xxx has more raw power than the 4xx. DX11 requires the GPU to process tessellation, so if the complexity of the tessellation is low, the 5xxx will be able to generate better FPS. It isn't surprising, as AMD uses the same tessellation unit for their 5xxx series, while the Fermi series uses CUDA cores for that, meaning tessellation power increases with the number/speed of cores.


Have you seen how the 4xx cards do in DirectX 7-8 games? There are about 500,000 FFXI players, and some might have upgraded to 4xx cards.
Currently 4xx card users are getting 5 fps on 450/460/470/480 cards, where older cards can run that game fluently. And this isn't just FFXI; there are A LOT of older games where the 4xx series just runs like crap.

Does buying a current-gen card and finding out it's running at 1/10th the speed of your old card on old games affect your gameplay? Yes, probably more than this tessellation issue.
Update your driver. By the way, Fermi doesn't have an issue with tessellation.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
You have to read every single word Richard used to understand what he is trying to say. He said most LCDs are 1920 pixels from left to right and 1080 from top to bottom, which is about 2 million pixels. At this resolution, the degree of tessellation is higher than the resolution can display. It is a fact that it takes at least 3 pixels to form a triangle (3 dots), but that has nothing to do with the size of the smallest triangle produced by tessellation. In fact, the point isn't to display small triangles, but the object that is assembled from those triangles. For example, what is the smallest triangle size needed to construct a round object? Remember, when a picture is in motion, humans can actually perceive more detail than the resolution can support. The 75% is a number pulled out of thin air, based on a still 2D picture at 1080p resolution.


IMO, the 4xx series does not do tessellation better than the 5xxx series. However, the 4xx series seems to have more processing power than the 5xxx, and the 5xxx has more raw power than the 4xx. DX11 requires the GPU to process tessellation, so if the complexity of the tessellation is low, the 5xxx will be able to generate better FPS. It isn't surprising, as AMD uses the same tessellation unit for their 5xxx series, while the Fermi series uses CUDA cores for that, meaning tessellation power increases with the number/speed of cores.



Update your driver. By the way, Fermi doesn't have an issue with tessellation.
That's not correct. CUDA cores don't do tessellation. The tessellator is part of the PolyMorph engine.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
You have to read every single word Richard used to understand what he is trying to say. He said most LCDs are 1920 pixels from left to right and 1080 from top to bottom, which is about 2 million pixels. At this resolution, the degree of tessellation is higher than the resolution can display. It is a fact that it takes at least 3 pixels to form a triangle (3 dots), but that has nothing to do with the size of the smallest triangle produced by tessellation. In fact, the point isn't to display small triangles, but the object that is assembled from those triangles. For example, what is the smallest triangle size needed to construct a round object? Remember, when a picture is in motion, humans can actually perceive more detail than the resolution can support. The 75% is a number pulled out of thin air, based on a still 2D picture at 1080p resolution.

It also ignores the fact that pretty much everyone uses at least 4xAA these days. With 4xAA your triangles are effectively rendered at 3840x2160, and the whole story about 2x2 quads being rendered for a single covered pixel falls to pieces. Basically the triangles are four times as large as Huddy claims them to be.

Another thing he ignores is the fact that tessellation is adaptive to the screen resolution. If an object is further away, or viewed at an angle, its triangles can use less subdivision. Basically the idea is to subdivide the objects in such a way that each triangle on screen has more or less the same size.
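
For anyone wondering what "adaptive" means in practice, here is a minimal sketch of one common heuristic: pick the subdivision level per edge from its projected screen-space length, so triangles come out roughly the same size in pixels. The function name, the 16-pixel target and the clamp to 64 below are illustrative assumptions, not AMD's or nVidia's actual algorithm:

```python
# Minimal sketch of screen-space adaptive tessellation: choose a subdivision
# factor per edge from its projected length, so triangles come out roughly the
# same size in pixels. The 16-pixel target and the clamp to 64 are assumptions
# for illustration, not any vendor's actual heuristic.
import math

def edge_tess_factor(p0, p1, target_px=16.0, max_factor=64.0):
    """p0, p1: edge endpoints already projected into pixel coordinates.
    Returns how many segments the edge should be split into."""
    length_px = math.dist(p0, p1)              # on-screen edge length in pixels
    return max(1.0, min(max_factor, length_px / target_px))

# The same edge, near and far: the distant (smaller on screen) one is barely split.
print(edge_tess_factor((100.0, 100.0), (420.0, 100.0)))   # 320 px edge -> 20.0
print(edge_tess_factor((100.0, 100.0), (110.0, 100.0)))   # 10 px edge  -> 1.0

# And the 4xAA point above: four coverage samples per pixel means as many
# sample positions at 1080p as a 3840x2160 grid.
print(1920 * 2, 1080 * 2, (1920 * 2) * (1080 * 2))         # 3840 2160 8294400
```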
 
Last edited: