Beyond3D interview with Eric Demers on R600 Architecture

tuteja1986

Diamond Member
Jun 1, 2005
http://www.beyond3d.com/content/interviews/39/1


---------------------------------- quote ----------------------------------
Here's a "from the geeks wet dreams file" question: Is there any chance that there might be an interface to allow third parties (and knowledgeable enthusiasts) to write their own CFAA algorithms and run/implement them on their R6xx? Even distribute them for others to run?

It's certainly possible, and in DX10.1, applications will have access to the fragment data themselves and they can certainly do it quickly. But the issue here is access to the compression data. I'm not sure we'd expose that; so we would be limited in offering people DX10.1 style functionality, which could be done. Again, it would be a question of making tools available and supporting users. Perhaps through CTM or OGL this will be possible too.



Lastly, as you look around the web at reaction to the R6 family. . .are there any specific aspects to the design that you feel have been particularly misunderstood or not emphasised to the degree they deserve?

I can't help but be a little disappointed that we did not have enough time to get more optimizations into our drivers in time for the launch. I still cringe when I see poor performance (especially if it's compared to, say, our previous generation products), and send daily emails to the performance and driver team, begging for this or that. In fact, I do believe that they all hate me now. They should join the club.

Also, on the feature side, we weren't able to expose the adaptive edge detect CFAA modes to the public, or even to the press, until very late in the review process. This means that most reviewers did not have a chance to look at the amazing quality this mode offers. There is nothing comparable to it out there.

We also had some last minute performance possibilities, which is always incompatible with stability, and we did not have enough time to get those tested and integrated in time for launch. We see in the latest driver, some 2x to 3x improvement in adaptive AA performance, for example, which is great but came later than I would have liked. But, I'll admit, there's so much to do still, that I haven't really spent that much time on reviews and such. The reality is that I expect things to continue to improve and be much better in a few months.

---------------------------------- quote ----------------------------------

If you have no idea what Eric Demers is talking about, or what the interviewer is asking, then I suggest you read up on "R600 Architecture and GPU Analysis" by Beyond3D. Also, Wikipedia all the terms if you don't know what they mean.
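For a concrete idea of what "write your own AA resolve" means in that first answer: with per-fragment access, a program reads each pixel's individual coverage samples and decides itself how to combine them. Here's a minimal NumPy sketch of that idea (the sample layout, the edge heuristic, and the function name are illustrative assumptions on my part, not ATI's actual CFAA algorithm):

```python
import numpy as np

def custom_resolve(samples, edge_threshold=0.1):
    """Collapse per-pixel MSAA samples of shape (H, W, S) into one value per pixel.

    Pixels whose S coverage samples all agree (triangle interiors) just
    take the first sample; pixels whose samples disagree (geometry edges)
    get the full average, so the filtering cost is spent only on edges.
    """
    spread = samples.max(axis=2) - samples.min(axis=2)  # per-pixel disagreement
    edge = spread > edge_threshold                      # True only on edge pixels
    return np.where(edge, samples.mean(axis=2), samples[:, :, 0])
```

An interior pixel whose samples all agree passes straight through; only pixels whose samples disagree pay for the averaging, which is the basic trade a custom resolve lets you make.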

 

mruffin75

Senior member
May 19, 2007
This is an interesting answer:

"We also developed the UVD solutions for HD 2400 and HD 2600, so that they could run in very low power mode and display a full resolution HD movie on a notebook, in one battery. The higher end has a shader based solution, which leads to higher quality (but has more power consumption)."

So I take it that the HD2900XT will have a higher quality HDDVD/Blu-ray playback as compared to the 2400 and 2600 (when the driver implements it?)...
 

jim1976

Platinum Member
Aug 7, 2003
It's not like Rys asked him the most difficult questions on earth.. Wavey must have helped on that.. ;)
A good read overall though..:thumbsup:

I really respect Sireric; he is one of the most gifted individuals in the industry. :)
 

gunblade

Golden Member
Nov 18, 2002
Originally posted by: mruffin75
This is an interesting answer:

"We also developed the UVD solutions for HD 2400 and HD 2600, so that they could run in very low power mode and display a full resolution HD movie on a notebook, in one battery. The higher end has a shader based solution, which leads to higher quality (but has more power consumption)."

So I take it that the HD2900XT will have a higher quality HDDVD/Blu-ray playback as compared to the 2400 and 2600 (when the driver implements it?)...


The higher quality comes from running the video through the programmable shader processors: different adaptive filters (sharpening, noise reduction, etc.) can be implemented and tuned to improve video quality, whereas the UVD is a fixed-function pipeline.
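To make the fixed-function vs. programmable distinction concrete, here's a minimal sketch of one such adaptive filter, an unsharp-mask sharpen, written in plain NumPy standing in for shader code (the function and its parameters are illustrative, not AMD's actual post-processing filter):

```python
import numpy as np

def unsharp_sharpen(luma, amount=0.5):
    """Sharpen a 2D luma plane in [0, 1]: add back a scaled copy of the
    detail that a 3x3 box blur removes. amount=0 leaves the frame unchanged."""
    padded = np.pad(luma, 1, mode="edge")  # replicate borders so edges blur cleanly
    h, w = luma.shape
    # 3x3 box blur: average the nine shifted copies of the padded frame.
    blurred = sum(padded[dy:dy + h, dx:dx + w]
                  for dy in range(3) for dx in range(3)) / 9.0
    # Detail = original minus blurred; add a scaled copy back and clamp.
    return np.clip(luma + amount * (luma - blurred), 0.0, 1.0)
```

The blur kernel and `amount` are exactly the kind of knobs a driver team can keep tuning on a shader path, which a fixed-function block like UVD cannot do after tape-out.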
 

BFG10K

Lifer
Aug 14, 2000
Good article but again the card is just too loud for me to consider. I'll be revisiting the issue after a die shrink.
 

CrystalBay

Platinum Member
Apr 2, 2002
Yeah, from what I gather they're just going to add GPUs to PCBs to gain performance at the die shrink in order to compete...

That 512-bit memory bus should be helpful there.
 

lopri

Elite Member
Jul 27, 2002
This is a question: once the GPU design was set, the driver team had a few working silicon revisions. Yields at the fab weren't that good, so the hardware design team was busy working bugs out of the original design, but the design itself wasn't going through drastic changes; a few game developers (errm... Crytek) had early-revision chips for almost a year and knew they worked. The biggest problem on the way to the final (shipping) silicon looked to be manufacturing-related. (Nothing fundamentally changed in the basics.)

Why couldn't AMD's driver team have more-or-less finished drivers for the launch? Oh, a game didn't have a problem with A12 silicon but all of a sudden it's slower with A13 silicon? I know I said it's a question at the beginning, but to be honest it shows the incompetence of AMD's driver team: blaming the hardware ("hey, we have no hardware to code drivers against!") and taking their sweet time. Sorry, but this is the impression I've always had of ATI's driver teams: instead of working hard to build polished drivers, they sit in the office doing nothing, and when issues pop up they're busy patching things up.
 

BFG10K

Lifer
Aug 14, 2000
Why couldn't AMD's driver team have more-or-less finished drivers for the launch?
Because getting drivers truly optimal takes a lot longer than building the hardware they run on. That process takes years for existing titles and then continues each time a new title is released.

but to be honest it shows the incompetence of AMD's driver team.
Oh really? How long did nVidia have access to Vista? How long did they have access to a finished G80?

Now look at the debacle of nVidia's Vista drivers. Seven months later they still aren't working right for many users.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
Originally posted by: BFG10K
Why couldn't AMD's driver team have more-or-less finished drivers for the launch?
Because getting drivers truly optimal takes a lot longer than building the hardware they run on. That process takes years for existing titles and then continues each time a new title is released.

but to be honest it shows the incompetence of AMD's driver team.
Oh really? How long did nVidia have access to Vista? How long did they have access to a finished G80?

Now look at the debacle of nVidia's Vista drivers. Seven months later they still aren't working right for many users.

Please. ATI/AMD dropped the ball; let's all admit it. Their only trump card (and excuse) was that they said their drivers were ready. That failed.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
Originally posted by: CrystalBay
Yeah, from what I gather they're just going to add GPUs to PCBs to gain performance at the die shrink in order to compete...

That 512-bit memory bus should be helpful there.

I don't think anyone wants a 400W monster for a GPU.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: lopri
I know I said it's a question at the beginning, but to be honest it shows the incompetence of AMD's driver team: blaming the hardware ("hey, we have no hardware to code drivers against!") and taking their sweet time. Sorry, but this is the impression I've always had of ATI's driver teams: instead of working hard to build polished drivers, they sit in the office doing nothing, and when issues pop up they're busy patching things up.

Lol, what about NVIDIA? You think their drivers are perfect? And how long has G80 been out? Back then the excuse was "the architecture is brand new". What's the excuse now that all this time has passed?

IIRC Crossfire was working in Vista when R600 came out. Wasn't SLI support for Vista added recently (5 months after the launch of Vista)?

I'd imagine it takes time to get the drivers polished and performance up to par. If it's the same situation 5-6 months down the road, then there's a problem.
 

lopri

Elite Member
Jul 27, 2002
Reading my post above again, I was way too excited. Heh... I guess I was upset that the overall tone of the interview blamed the drivers for R600's performance and stability. I really wished the R600 would be a success, and more than anything that they'd come out with polished out-of-the-box drivers. Alas, to be very honest, I've always had doubts about ATI's driver teams, and they didn't disappoint again. :( I don't know... with the way Barcelona seems to be shaping up, I'm getting more and more pessimistic.
 

lopri

Elite Member
Jul 27, 2002
I mean, I'm not even talking about TWIMTBP games with brand-new engines. When the HD 2900 XT shows similar (or even lower) performance than the X1900 XT in yesteryear's games, and they say that is a driver problem, the real problem is the driver team, isn't it?

Edit: I started reading the comments on the interview @B3D, and this is what the interviewee had to say.

Originally posted by: sireric

The driver development has been tremendous and I applaud all the efforts that the driver team has been doing. But, having a BRAND new architecture, with needs for new DX driver (our DX10 is from scratch), updated DX9, new OGL, and all this with a new OS and required support for that and the old one is just too much right now. It's going to take time to get the best performance out of this chip -- Both in terms of coding all the elements, and also because it's a new arch and the teams need to learn its ins&outs
This sounds amazingly similar to what someone else in green garb said a few months back, doesn't it? I almost thought it was a copy/paste.
 

Cookie Monster

Diamond Member
May 7, 2005
Too much marketing and PR in that interview (some points were vague and some were good, like the 512-bit interface). He seems to blame the drivers quite a bit, when I remember AMD/ATI bashing NVIDIA for driver incompetence just a few months ago.

 

ronnn

Diamond Member
May 22, 2003
Will be interesting to see if drivers actually improve performance in a big way. I'm also tempted to wait for a die shrink, but a good sale by either company may change that.
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: Cookie Monster
Too much marketing and PR in that interview (some points were vague and some were good, like the 512-bit interface). He seems to blame the drivers quite a bit, when I remember AMD/ATI bashing NVIDIA for driver incompetence just a few months ago.

The GTX beat 2 X1950s on launch day; if that's bad drivers, then ATI's must be absolutely awful right now.
 

BFG10K

Lifer
Aug 14, 2000
Please. ATI/AMD dropped the ball; let's all admit it. Their only trump card (and excuse) was that they said their drivers were ready. That failed.
ATi had Crossfire DX10 working in Vista at launch. Meanwhile it took nVidia about six months to get SLI into Vista and even today I'm not sure whether it's DX10 yet. So no, ATi are vastly ahead of nVidia in terms of driver support.

The GTX beat 2 X1950s on launch day; if that's bad drivers, then ATI's must be absolutely awful right now.
Not bad drivers, good hardware. Why do you think nVidia insisted reviewers use XP and not Vista? Why do you think we haven't heard a peep from them about Quad SLI, 7950 GX2 and G80 SLI on Vista?
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: BFG10K

Not bad drivers, good hardware. Why do you think nVidia insisted reviewers use XP and not Vista?

Because Vista was not out yet.
 

LOUISSSSS

Diamond Member
Dec 5, 2005
Lol, this guy Wreckage can't be taken seriously anymore.
Let's have a poll:
Who takes Wreckage seriously about anything based on a comparison between ATI and NVIDIA?
I vote: I don't!
 

Matt2

Diamond Member
Jul 28, 2001
Originally posted by: LOUISSSSS
Lol, this guy Wreckage can't be taken seriously anymore.
Let's have a poll:
Who takes Wreckage seriously about anything based on a comparison between ATI and NVIDIA?
I vote: I don't!

I take him seriously when he posts valid information and has info to back it up.

That's more than I can say about the posters who attack him and his valid information just because he is an NVIDIA fan and because they have a predetermined bias against him.

Just use your own judgement to filter out the BS and take in the valid information. Then use your own intelligence to form your own conclusion. Just because Wreckage posted something doesn't make it any more true or false. It is what it is.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: LOUISSSSS
Lol, this guy Wreckage can't be taken seriously anymore.
Let's have a poll:
Who takes Wreckage seriously about anything based on a comparison between ATI and NVIDIA?
I vote: I don't!

Wreckage said 2 things in this thread:

"The GTX beat 2 X1950s on launch day, if that's bad drivers than ATIs must be absolutely awful right now."

"Because Vista was not out yet."

Now tell us: which of these comments are you not taking seriously, and why?
Unless I'm mistaken, both of these comments are true.

So that must mean you have a problem with the poster, and not the information given by the poster? Or is it that you have a problem with the information because it came from said poster?

Either way, I think you need to tell us why we should take you seriously now.

Proceed.
 

ronnn

Diamond Member
May 22, 2003
You guys never give up on your little flame wars. All four of you should proceed and tell us how you added to this thread.

Will be interesting to see if drivers improve greatly and/or they find any way to use those redundant shaders. I have a hard time thinking drivers will add any more than 10% overall, but would like to be proved wrong. I guess new games will tell the story.
 

BFG10K

Lifer
Aug 14, 2000
Because Vista was not out yet.
I'm reasonably certain this was happening after Vista had launched.

And again you don't hear nVidia pimping G80 SLI for that very same reason.
 

LOUISSSSS

Diamond Member
Dec 5, 2005
"The GTX beat 2 X1950s on launch day, if that's bad drivers than ATIs must be absolutely awful right now."
"Because Vista was not out yet."

1. ATI's drivers are known to be better than NVIDIA's. If you don't believe this, you're obviously blind.
2. Vista has been out for a while, and NVIDIA's drivers JUST started to clear up and show signs of games working properly.