AMD's Richard Huddy on DirectX 11, Eyefinity, and the competition


T2k

Golden Member
Feb 24, 2004
1,665
5
81
Does it matter who is a fanboy?

Let's review some simple facts:
ATI has hardware that supports DX11, namely the 5xxx series. It is cheap and relatively fast. It introduced built-in multi-display support, which they named Eyefinity.

Nvidia's new Fermi series will also support DX11 and built-in multi-display, and it should be faster than all existing cards, plus PhysX and 3D. Unfortunately, it will be expensive and it is not out yet.

ATI: 1
NV: 0

Go on...

Some of the extra cost of Nvidia's video cards went to TWIMTBP, a program that offers aid to game developers. ATI also has a program like that, but on a much smaller scale.

Smaller, that's a fact - much smaller, that's your speculation.

ATI has tried to engineer their video cards to support DirectX, OpenCL, and OpenGL. The only innovative thing about the design is its naming.

Ehh? NV doesn't even have ANY of those.

Nvidia also supports DirectX,

Does not support DX11.

openCL/GL, plus CUDA, physX, and 3D vision.

This listing clearly shows you have no clue about the entire subject - OpenCL/GL? "Plus" CUDA, "physX", 3D vision?
It's sooo obviously hilarious on so many counts that I won't even bother to correct it... seriously: just give up already.
We know you don't understand half of it, don't sweat it. :D

The idea of 3D is not new, but the 3D Vision glasses are.

Compared to the age of the planet, they are - sadly, it has nothing to do with Nvidia; it's just another stolen, rehashed idea.

So Nvidia's video card is more expensive simply because it has some more features. If features that are backed only by Nvidia are not your cup of tea, then get an ATI video card.

If you aren't rich, or aren't willing to spend money on gaming, then ATI is the way to go. 60Hz LCDs are cheap, and it is a scalable option. Otherwise, Nvidia is the way to go. SLI + one 120Hz monitor + 3D Vision glasses is great; Tri-SLI + 3x 56" 120Hz projectors + 3D Vision glasses is epic. When you are there, a 7.1 sound system is more or less a requirement. Don't go cheap on the sound card, please - that is what Blu-rays are for. And do not forget to have several SSDs in RAID 0, perfect for gaming or converting vids. Yes, the cost of changing a bulb = a high-end ATI card. So what? Only ATI claims that new tech is cheap, while everyone else knows that nothing new is cheap.

I will repeat myself: money is never a problem to me. My problem is, I don't have money.

Not the slightest clue about any of these topics... :D
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
NVIDIA holds a patent on tessellation that goes back nearly 7 years.
http://patft.uspto.gov/netacgi/nph-...50&s1=6906716.PN.&OS=PN/6906716&RS=PN/6906716

It's nothing new.
So what's your point? I never said it was new.

Nvidia holds a patent on tessellation? Fantastic.


ATi has their own patents on the subject:

http://www.google.com/patents/about?id=DX4OAAAAEBAJ&dq=tessellation+ati

http://www.google.com/patents/about?id=nBuaAAAAEBAJ&dq=tessellation+ati

http://www.google.com/patents/about?id=mYZ6AAAAEBAJ&dq=tessellation+ati


The difference between Nvidia and ATi is that ATi is the one that went out and did something with theirs. As I stated earlier, ATi has included hardware tessellation capability in their GPUs since the 2000 series because they foresaw that tessellation would eventually become an important feature.

Nvidia, on the other hand, has a piece of paper in a filing cabinet.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
I tend to look at tessellation and 3D stereo as wonderful choices to raise the bar of immersion for games. Can't pick and choose - both are welcome.

As we already explained, ATI has had Truform since the early 2000s - I know, I used it with UT2k4 and it looked awesome.

It has nothing to do with stereoscopic 3D - there's no mutual exclusivity, and with the ATI 5xxx series you are getting both. (Watch out for the next CATALYSTs, that's all I can say... ;))

However, 3D stereo is more than just games, as the industry is moving to 3D stereo -- especially if you paid a bit of attention to CES. Movies, videos, photos, games, the Internet, medical and workplace uses -- hardware from different displays, methods and cameras. 3D stereo has its own ecosystem in a way -- and you call this simply dusting off old technology, which is quite insulting to the hard work even AMD/ATI is doing in working with Bit Cauldron and IZ3d.
Ahh, pleahhhse, just cut the empty cliches already.
I work in the industry and there's almost nothing on the market just yet - for a recent 3D project I had to buy that ugly-crappy yet overpriced LaserVue Mitsu, and my 3D guys *hate* NV's lousy-shitty 3D system because only one person can use it at a time! - and there's NOTHING NEW in the upcoming 3D offerings. Bit Cauldron will cure NV's broken 3D glasses implementation, but it still relies on active glasses - that's a FAIL in my book.

Passive glasses are a step in the right direction, but 3D will be here to stay only when no glasses are needed, and that's a couple of years away.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
As we already explained, ATI has had Truform since the early 2000s - I know, I used it with UT2k4 and it looked awesome.

It has nothing to do with stereoscopic 3D - there's no mutual exclusivity, and with the ATI 5xxx series you are getting both. (Watch out for the next CATALYSTs, that's all I can say... ;))



Ahh, pleahhhse, just cut the empty cliches already.
I work in the industry and there's almost nothing on the market just yet, and there's NOTHING NEW in the upcoming 3D offerings. Bit Cauldron will cure NV's broken 3D implementation, but it still relies on active glasses - that's a FAIL in my book.

Passive glasses are a step in the right direction, but 3D will be here to stay only when no glasses are needed, and that's a couple of years away.

I like 3d stereo choice and all welcomed.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
You're forgetting about tessellation. ATi hardware has included tessellation support since the 2000 series.

Wrong. ATI introduced Truform in Radeon 8500 (DX8) in 2001.

Supported games include Enemy Territory, Quake/Q2/Q3, UT/UT2k3/UT2k4, CS, R6, etc., including their engine derivatives... the only difference is that since DX9 it hasn't been fixed-function hardware but rather a shader-based feature - just like the current, now DirectX-sanctioned implementation.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
and you call this simply dusting off old technology, which is quite insulting to the hard work even AMD/ATI is doing in working with Bit Cauldron and IZ3d.
The reason I used the term "dusting off" is that, from what I can tell, the Nvidia 3D vision technology is exactly the same as my old Elsa Revelator glasses. A pair of glasses with LCDs set in them to alternately block the view to either eye 60 times a second. I've just described both my Revelator glasses and the Nvidia 3D vision glasses.
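If it helps anyone picture it, here's a toy sketch of that frame-sequential scheme both pairs of glasses use - a rough model only, the numbers and names are illustrative and not anyone's actual driver code:

Code:
# Toy model of frame-sequential shutter glasses: the display alternates
# left/right images while the glasses black out the opposite eye in sync.
# A 120 Hz panel gives each eye an effective 60 Hz view.
import itertools

def shutter_sequence(num_frames, refresh_hz=120):
    """Yield (time_ms, eye_shown, lcd_state) for each display refresh."""
    frame_ms = 1000.0 / refresh_hz
    eyes = itertools.cycle(("left", "right"))
    for frame, eye in zip(range(num_frames), eyes):
        lcd = {"left": "open" if eye == "left" else "closed",
               "right": "open" if eye == "right" else "closed"}
        yield frame * frame_ms, eye, lcd

for t, eye, lcd in shutter_sequence(4):
    print(f"t={t:5.2f} ms: display shows {eye} image, glasses = {lcd}")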

I also disagree with Nvidia over their pricing. IMO, $200 is WAY overpriced for what it costs to make these glasses and for the software support. If they could be produced and sold profitably for $80 ten years ago, there is no conceivable way that Nvidia needs to charge $200 for them today. Inflation hasn't run THAT amok.
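For what it's worth, a quick back-of-the-envelope check on the inflation point, assuming a flat ~2.5% average annual rate (the exact CPI figure differs a bit, but not enough to change the conclusion):

Code:
# What an $80 pair of glasses from ~2000 "should" cost a decade later
# if the price only tracked inflation. 2.5%/year is an assumed average.
price_2000 = 80.00
years = 10
avg_inflation = 0.025

adjusted = price_2000 * (1 + avg_inflation) ** years
print(f"${price_2000:.0f} in 2000 is roughly ${adjusted:.0f} ten years later")
print(f"Nvidia's $200 asking price is about {200 / adjusted:.1f}x that")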
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
The reason I used the term "dusting off" is that, from what I can tell, the Nvidia 3D vision technology is exactly the same as my old Elsa Revelator glasses. A pair of glasses with LCDs set in them to alternately block the view to either eye 60 times a second. I've just described both my Revelator glasses and the Nvidia 3D vision glasses.
I also disagree with Nvidia over their pricing. IMO, $200 is WAY overpriced for what it costs to make these glasses and for the software support. If they could be produced and sold profitably for $80 ten years ago, there is no conceivable way that Nvidia needs to charge $200 for them today. Inflation hasn't run THAT amok.
It's the same reason the GTX280 was launched at $650 - too much ego, not enough common sense.

3D technology isn't going to take over the market until it gets better. I don't care who makes it, it still sucks. To me, and I know I'm not the only one, it looks like a bunch of cardboard cutouts in 3D space, and it is actually more annoying than cool (that assessment is heavily game-dependent, though).
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
You're forgetting about tessellation. ATi hardware has included tessellation support since the 2000 series. Obviously ATi saw the advantages to be realized from tessellation and were proactive in gaining industry support for it. Now it is one of the biggest features of DX11. That's innovation right there.

Actually, 3D shutter glasses are an OLD technology. I have a pair of them sitting up in my closet that I purchased 10 years ago.

http://www.techspot.com/reviews/hardware/elsa_revelator/

The Nvidia 3D Vision glasses are identical in almost every respect. Well, except for price that is. Back then, you could pick up a set of the wired glasses for $50 or the IR version for $80. Nvidia dusted off the technology and is now charging $200 for the exact same thing you could buy for $80 a decade ago.
Tessellation is a mathematical term, not a thing or a technology. A hardware tessellator is a thing and a technology that can be patented, and, as Wreckage pointed out, it was patented by Nvidia. One way or the other it isn't new, but the technology back then couldn't make it practical. Of course, tessellation is now seen as something from DX11, which it really isn't. It should be put the other way around: Microsoft supports tessellation through DX11. ATI did implement a hardware tessellator to support DX11 in that regard, but it isn't innovative, as that is simply the way Microsoft wants tessellation to be done.

Now do not be confused: tessellation doesn't enhance graphics, but it allows complex models to be stored at a much smaller size. Models that use tessellation can't be retrofitted backward, nor vice versa, dynamically. Only models built natively for DX11 can use its advantages, allowing the computer to produce DX10-level graphics with 10% of the memory usage, but keeping and maintaining two sets of art isn't something designers favor. So all you really get is the name for now. Wait 3 years.
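To put rough numbers on that storage argument - these figures are made up purely for illustration, and the vertex layout and map size are assumptions:

Code:
# Illustrative only: compare storing a dense mesh outright vs. a coarse
# control mesh that the tessellator amplifies at draw time, with a small
# displacement map supplying the fine detail.
BYTES_PER_VERTEX = 32                    # position + normal + UV

dense_vertices = 1_000_000               # the target detail level
coarse_vertices = 100_000                # control cage fed to the tessellator
displacement_map_bytes = 512 * 512 * 2   # 16-bit height map

dense_bytes = dense_vertices * BYTES_PER_VERTEX
tess_bytes = coarse_vertices * BYTES_PER_VERTEX + displacement_map_bytes

print(f"dense mesh:              {dense_bytes / 2**20:5.1f} MiB")
print(f"coarse mesh + heightmap: {tess_bytes / 2**20:5.1f} MiB "
      f"({100 * tess_bytes / dense_bytes:.0f}% of the dense version)")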

As for why I said the 3D Vision glasses are innovative, please read the following link:
http://www.xbitlabs.com/articles/multimedia/display/nvidia-gf-3d-vision.html

Edit: The 3D Vision glasses support turning 3D on and off independently, as well as adjusting the depth of the 3D effect through a wheel, which is something new that did not exist before. None of the competitors have this function. This is very practical, as 3D does hurt FPS and sometimes you do want to turn it off; sometimes scenes are not rendered properly and you want to turn it off. WoW supports a 3D cursor, which to people without 3D is pointless. However, for those who are in 3D, it is as big as an expansion: they can finally see one arrow and click on things without guessing. Before that, a 3D Vision user could click off 3D to use the mouse effectively, while others were playing in hell. Who would have thought that a button could make such a difference.
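Roughly what the wheel boils down to, as far as I can tell, is scaling the per-eye camera offset. This is only a toy sketch; the parameter names and values are mine, not Nvidia's actual driver settings:

Code:
# Toy sketch: the "depth" wheel scales how far apart the two virtual eye
# cameras sit; 0.0 is effectively 3D off. Values are illustrative.
def eye_offsets(depth_setting, max_separation=0.065):
    """Return (left_x, right_x) camera offsets for a wheel position in [0, 1]."""
    half = depth_setting * max_separation / 2.0
    return -half, +half

for setting in (0.0, 0.25, 1.0):
    left, right = eye_offsets(setting)
    print(f"wheel at {setting:.2f}: left {left:+.4f}, right {right:+.4f}")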
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
NV may have patented tess, but ATI patented it a long time ago and theirs has always been hardware. Now you're saying that after ATI has worked on it for years, NV has a better solution. You guys go way too far.

Microsoft is losing its grip on gaming - the sooner the better.
 

Schmide

Diamond Member
Mar 7, 2002
5,745
1,036
126
Tessellation is a mathematical term, not a thing or a technology. A hardware tessellator is a thing and a technology that can be patented, and, as Wreckage pointed out, it was patented by Nvidia.

WTF, you can patent a whole lot more than just a thing, and it doesn't have to involve a physical device. PS: the nVidia patent is just an addition to other patents, as are the ATI patents. All this crap about patents is irrelevant, as both companies could not exist without the cross-licensing of such technology from all parties.

Edit:

Now do not be confused: tessellation doesn't enhance graphics, but it allows complex models to be stored at a much smaller size. Models that use tessellation can't be retrofitted backward, nor vice versa, dynamically.

You don't see any contradiction in this sentence? If it allows for greater detail without having to send extra information, it enhances graphics!
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
NV may have patented tess, but ATI patented it a long time ago and theirs has always been hardware. Now you're saying that after ATI has worked on it for years, NV has a better solution. You guys go way too far.

Microsoft is losing its grip on gaming - the sooner the better.

I don't know, I tend to think it was great to see n-patches, and I enjoyed them in Half-Life and Myth -- it was great to see tessellation offered from the R600 through the RV770 and become an important part of DirectX 11 and the 5xxx series. I also think it will be great to see nVidia take advantage of tessellation as well.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The reason I used the term "dusting off" is that, from what I can tell, the Nvidia 3D vision technology is exactly the same as my old Elsa Revelator glasses. A pair of glasses with LCDs set in them to alternately block the view to either eye 60 times a second. I've just described both my Revelator glasses and the Nvidia 3D vision glasses.

I also disagree with Nvidia over their pricing. IMO, $200 is WAY overpriced for what it costs to make these glasses and for the software support. If they could be produced and sold profitably for $80 ten years ago, there is no conceivable way that Nvidia needs to charge $200 for them today. Inflation hasn't run THAT amok.

The pricing is a bit high, and I would like to see it around $99-129, but you're not just buying glasses - you're buying the software and commitment from the company. And it's great to see another 3D stereo choice with new monitor tech -- one doesn't have to buy it, but at least it may be a consideration for some, and it should improve with more competition and choice -- considering the industry is moving in this direction.

It was great to see ATI/AMD working with Bit Cauldron.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
WTF, you can patent a whole lot more than just a thing, and it doesn't have to involve a physical device. PS: the nVidia patent is just an addition to other patents, as are the ATI patents. All this crap about patents is irrelevant, as both companies could not exist without the cross-licensing of such technology from all parties.
I am not trying to argue who had it first, as I don't think it is relevant. However, the credit for what's new goes to Microsoft's DX11, which I don't think either of us will want to admit.
Edit:
You don't see any contradiction in this sentence? If it allows for greater detail without having to send extra information, it enhances graphics!
In theory, tessellation shouldn't hurt FPS but increase it, as the computation occurs within the GPU. In practice, it's a 50% tax. Don't ask me where these taxes come from, I honestly don't know, and I think there will be tax relief coming. Will that come in the form of a hardware upgrade, i.e. DX11.1? I don't know.

As for enhancing graphics? LoL. Those DX11 demos running in DX10 mode look worse than WoW. Gimmick detected.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
I am not trying to argue who had it first, as I don't think it is relevant. However, the credit for what's new goes to Microsoft's DX11, which I don't think either of us will want to admit.

In theory, tessellation shouldn't hurt FPS but increase it, as the computation occurs within the GPU. In practice, it's a 50% tax. Don't ask me where these taxes come from, I honestly don't know, and I think there will be tax relief coming. Will that come in the form of a hardware upgrade, i.e. DX11.1? I don't know.

As for enhancing graphics? LoL. Those DX11 demos running in DX10 mode look worse than WoW. Gimmick detected.

Tessellation improves performance compared to a scene at the same poly count with the tessellation done by conventional means. There will obviously be a "tax" when comparing low-poly with no tessellation to high-poly with tessellation... the shaders are being asked to do more work when you enable it.

No one (who has a clue) has ever claimed that running tessellation has no performance hit; I have no idea where you got that idea from.
 

Schmide

Diamond Member
Mar 7, 2002
5,745
1,036
126
In theory, tessellation shouldn't hurt FPS but increase it, as the computation occurs within the GPU.

Actually it kills FPS. You're basically trading performance for detail.

In practice, it's a 50% tax. Don't ask me where these taxes come from, I honestly don't know, and I think there will be tax relief coming. Will that come in the form of a hardware upgrade, i.e. DX11.1? I don't know.

As for enhancing graphics? LoL. Those DX11 demos running in DX10 mode look worse than WoW. Gimmick detected.

I find it so entertaining that you play authority then play dumb.

Taxes? Tax Relief? Post in P&N

It's not too complicated what's going on. You take a mesh and subdivide it on the fly so that there are more vertices to make lighting calculations with. Whatever interpolations are made are up to the algorithm.
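A minimal sketch of that subdivision step, with the displacement and interpolation left out - this only shows how the triangle count gets amplified:

Code:
# Split every triangle into four by inserting edge midpoints. A real
# tessellator also displaces/interpolates the new vertices.
def midpoint(a, b):
    return tuple((a[i] + b[i]) / 2.0 for i in range(3))

def subdivide(triangles):
    """triangles: list of (v0, v1, v2) tuples of 3D points."""
    out = []
    for v0, v1, v2 in triangles:
        m01, m12, m20 = midpoint(v0, v1), midpoint(v1, v2), midpoint(v2, v0)
        out += [(v0, m01, m20), (v1, m12, m01), (v2, m20, m12), (m01, m12, m20)]
    return out

mesh = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
for _ in range(3):
    mesh = subdivide(mesh)
print(f"{len(mesh)} triangles from 1 after three levels")   # 64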

Comparing it to WoW is just lame. They are totally different scenes. If you took WoW and added tessellation to it, that would be a fair comparison. Otherwise you're just attempting to muddy the waters.

Edit:

I am not trying to argue who had it first, as I don't think it is relevant. However, the credit for what's new goes to Microsoft's DX11, which I don't think either of us will want to admit.

Credit for implementation goes to MS. Credit for the algorithm goes to Okamoto; Kiyokazu (Tokyo, JA), Hayashi; Hitoshi (Tokyo, JA),Lesche; Wolfgang (Berlin, DE), Bresenham; Jack E. (Chapel Hill, NC), Grice; Donald G. (Kingston, NY), Pi; Shing-Chou (Poughkeepsie, NY), Rosener; Harvey J. (Sherwood, OR), Knierim; David L. (Wilsonville, OR), Dalrymple; John C. (Newberg, OR), Yam; David S. (Eagan, MN), Corthout; Marc E. A. (Eindhoven, NL), Mielekamp; Pieter M. (Eindhoven, NL), Dines; Steven (San Jose, CA), Sfarti; Adrian (Sunnyvale, CA), Daniel; Andrew D. (Sunnyvale, CA) etc. I got tired of following links but needless to say it goes on forever.

You want to continue this argument? Look under the nVidia patent above and then look at 4648049 March 1987. Oh no the nVidia patent built on an AMD patent.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,064
570
136
I am not trying to argue who had it first, as I don't think it is relevant. However, the credit for what's new goes to Microsoft's DX11, which I don't think either of us will want to admit.

In theory, tessellation shouldn't hurt FPS but increase it, as the computation occurs within the GPU. In practice, it's a 50% tax. Don't ask me where these taxes come from, I honestly don't know, and I think there will be tax relief coming. Will that come in the form of a hardware upgrade, i.e. DX11.1? I don't know.

As for enhancing graphics? LoL. Those DX11 demos running in DX10 mode look worse than WoW. Gimmick detected.

Which came first, AMD's on-silicon tessellator or DirectX 11? Do you think the adoption of hardware tessellation by Microsoft had anything to do with the existence of said tessellator, or was AMD just getting prepared years ahead of time?
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Which came first, AMD's on-silicon tessellator or DirectX 11? Do you think the adoption of hardware tessellation by Microsoft had anything to do with the existence of said tessellator, or was AMD just getting prepared years ahead of time?
Hey, I don't like it, don't look at me. How many times do I need to say it isn't something new? Are you trying to say Nvidia is suddenly able to retrofit a tessellator just because ATI has it in the 5xxx series?
 

Schmide

Diamond Member
Mar 7, 2002
5,745
1,036
126
Hey, I don't like it, don't look at me. How many times do I need to say it isn't something new? Are you trying to say Nvidia is suddenly able to retrofit a tessellator just because ATI has it in the 5xxx series?

Are you seriously saying it isn't new? Just because you say it doesn't make it true. Can you seriously stop with the rhetoric and just accept some of the facts. It's a new function, built into a new library, executing on a new hardware standard. FFS.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Seero,

FFS: even I feel sorry for you after reading this brutal public re-schooling you got here - please, stop embarrassing yourself and start reading up on subjects you want to comment on before you post anything.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Are you seriously saying it isn't new? Just because you say it doesn't make it true. Can you seriously stop with the rhetoric and just accept some of the facts. It's a new function, built into a new library, executing on a new hardware standard. FFS.
Yeah, I am saying that the hardware tessellator isn't new. If anything is new, it is the DX11 API.

Actually it kills FPS. You're basically trading performance for detail.



I find it so entertaining that you play authority then play dumb.

Taxes? Tax Relief? Post in P&N
By tax I mean the decrease in performance, not a tax from the government. I thought I didn't need to explain that, but apparently I overestimated something.

Dig a bit into tessellation from ATI and you will know that it was not supposed to tax performance that hard. Now, in case you say "just because Seero said it doesn't make it true", there is a document from AMD:
http://developer.amd.com/gpu_assets/Real-Time_Tessellation_on_GPU.pdf
Table 2. Performance data for comparing rendering terrain patch with different rendering modes. Note that rendering terrain with adaptive tessellation results in much higher quality visual results while maintaining rendering times close to the original, low-resolution mesh rendering.

We measured performance for our application on a quad-core AMD Phenom™ X4 processor-based system with 2GB of RAM and an ATI Radeon graphics cards. Table 2 provides a detailed overview of the results. We notice that adaptive tessellation renders at frame rates close to the original low-resolution rates (less than 1% difference). In the case of rendering the original low-resolution input mesh as well as with rendering using continuous tessellation, we render the mesh as indexed primitives whereas, with adaptive tessellation, we use non-indexed primitives. As mentioned earlier in this chapter, given the results of using adaptive tessellation, we notice the lack of any serious performance penalties due to rendering with non-indexed primitives in the adaptive tessellation case. Instead, the performance is easily comparable to the cost of rendering the original low-resolution mesh, even though the resulting rendered terrain surface is significantly higher quality when rendered with adaptive tessellation. Even by simply turning on adaptive tessellation in far-away views, we double the polygon count for the rendered mesh.

By tax relief I mean that some code change will ease the performance hit, if we are lucky. If not, some hardware change may be required. If we are very unlucky, which is most likely, a structural change is required.
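To make the "adaptive" part of that AMD excerpt a bit more concrete, here is a toy distance-based version - the constants are invented, and the paper's actual metric is screen-space based:

Code:
# Pick a subdivision level per terrain patch from its distance to the
# camera, so far-away patches cost about the same as the low-res mesh.
def tess_level(distance, near=10.0, far=500.0, max_level=6):
    """More subdivision up close, none in the distance."""
    if distance >= far:
        return 0
    t = max(0.0, min(1.0, (far - distance) / (far - near)))
    return round(t * max_level)

for d in (5, 50, 200, 600):
    level = tess_level(d)
    print(f"patch at {d:>3} units -> level {level} -> {4 ** level} tris per input tri")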

It's not too complicated what's going on. You take a mesh and subdivide it on the fly so that there are more vertices to make lighting calculations with. Whatever interpolations are made are up to the algorithm.

Comparing it to WoW is just lame. They are totally different scenes. If you took WoW and added tessellation to it, that would be a fair comparison. Otherwise you're just attempting to muddy the waters.
Please read my post carefully. The so-called "enhanced graphics" comes from making DX10 graphics look worse than DX9? Sorry, but the tessellation demos are that lame.

Edit:



Credit for implementation goes to MS. Credit for the algorithm goes to Okamoto; Kiyokazu (Tokyo, JA), Hayashi; Hitoshi (Tokyo, JA),Lesche; Wolfgang (Berlin, DE), Bresenham; Jack E. (Chapel Hill, NC), Grice; Donald G. (Kingston, NY), Pi; Shing-Chou (Poughkeepsie, NY), Rosener; Harvey J. (Sherwood, OR), Knierim; David L. (Wilsonville, OR), Dalrymple; John C. (Newberg, OR), Yam; David S. (Eagan, MN), Corthout; Marc E. A. (Eindhoven, NL), Mielekamp; Pieter M. (Eindhoven, NL), Dines; Steven (San Jose, CA), Sfarti; Adrian (Sunnyvale, CA), Daniel; Andrew D. (Sunnyvale, CA) etc. I got tired of following links but needless to say it goes on forever.

You want to continue this argument? Look under the nVidia patent above and then look at 4648049 March 1987. Oh no the nVidia patent built on an AMD patent.
Is it hard to read what you quote?
Nvidia patented "Integrated tessellator in a graphics processing unit"
AMD patented "Rapid graphics bit mapping circuit and method"
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Seero,

FFS: even I feel sorry for you after reading this brutal public re-schooling you got here - please, stop embarrassing yourself and start reading up about subjects you want to make a comment about before you post anything.
I don't know if this is enough to get you banned, my friend.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
Dig a bit into tessellation from ATI and you will know that it was not supposed to tax performance. Now, in case you say "just because Seero said it doesn't make it true", there is a document from AMD.
http://developer.amd.com/gpu_assets/Real-Time_Tessellation_on_GPU.pdf

I already explained where you misunderstand things...

Tessellation is not performance neutral; it takes a lot of shader power to do. However, what that white paper says is that the performance hit from rendering an object outright at, say, a million polygons is greater than from using tessellation to get the same effect.

Tessellation has ALWAYS hurt performance... the whole point is that it does not hurt as much as rendering equivalently huge meshes. Nothing is free; you need to look up some of this stuff before you spout off on it.
 

zebrax2

Senior member
Nov 18, 2007
977
70
91
I already explained where you misunderstand things...

Tessellation is not performance neutral; it takes a lot of shader power to do. However, what that white paper says is that the performance hit from rendering an object outright at, say, a million polygons is greater than from using tessellation to get the same effect.

Tessellation has ALWAYS hurt performance... the whole point is that it does not hurt as much as rendering equivalently huge meshes. Nothing is free; you need to look up some of this stuff before you spout off on it.
^ This