ATI months ahead of NVIDIA with DirectX 11 GPU schedule?


Genx87

Lifer
Apr 8, 2002
41,095
513
126
ATI's problem since 2002 hasn't been performance so much as delivery of their product. Wasn't it the X1800 that was six months behind because of a bug in their design software? Then it showed up slow and hot, and they quickly followed up with the X1900. And the X800 XTX, or whatever the hell the highest-end card they had to compete with the 6800 was, stayed backordered for months. I think some people still have the card on backorder.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Wreckage
Originally posted by: Just learning
If this is true, I think it will really help ATI.

I mean seriously, when was the last time they were considered "the best" and "first to market"? Wasn't it in 2003 with the 9800 series cards?

Yeah, the R300 launched in 2002 and they have pretty much struggled ever since. After that, NVIDIA launched the 6xxx series with SLI, SM3 and PureVideo. ATI pretty much just played catch-up.

Unbiased history would disagree with you.
X800XT was faster than any card from the 6-series
X1900XT was faster and more advanced than anything from the 7-series.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
:thumbsup:
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: Just learning
If this is true, I think it will really help ATI.

I mean seriously, when was the last time they were considered "the best" and "first to market"? Wasn't it in 2003 with the 9800 series cards?

Yeah, the R300 launched in 2002 and they have pretty much struggled ever since. After that, NVIDIA launched the 6xxx series with SLI, SM3 and PureVideo. ATI pretty much just played catch-up.

Unbiased history would disagree with you.
X800XT was faster than any card from the 6-series
X1900XT was faster and more advanced than anything from the 7-series.

 

SunnyD

Belgian Waffler
Jan 2, 2001
32,674
145
106
www.neftastic.com
Originally posted by: Keysplayr
Good, Bad & Ugly.

The Good: ATI is first to market with a next-gen product in a very long time.
Early adopters will buy the product.

The Bad: As Chizow mentioned, most will wait for Nvidia's product launch to see how it compares. If all goes well for Nvidia, it's just a two-month window between ATI's launch and NV's.

The Ugly: Nvidia launches a DX11.1 part.

It's really strange that ATI wasn't first to market with several DX revisions, given that ATI has had a direct hand in each one and generally worked with Microsoft directly (likely because of the Xbox GPU).

Indeed - this is an example of what ATI did to Nvidia with the 4xxx series... they made the bet that Nvidia would build something huge and fast, yes, but expensive. The 4xxx series was an ace, I think far better than expected. I don't think ATI will depart much from the formula they implemented with the 4xxx series: small, fast, affordable.

The one open question, regarding a DX11.1 part: that would be interesting given Nvidia's reaction to DX10.1. It would be fun to watch the marketing machine spin that one up after what it did to ATI's promotion of DX10.1.
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
I wonder if all this DX11 jazz is worth the wait. Are there any demos of DX11 vs. DX10 that show significant benefits? I do think both companies are timing their releases to coincide with Windows 7.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: Just learning
If this is true, I think it will really help ATI.

I mean seriously, when was the last time they were considered "the best" and "first to market"? Wasn't it in 2003 with the 9800 series cards?

Yeah, the R300 launched in 2002 and they have pretty much struggled ever since. After that, NVIDIA launched the 6xxx series with SLI, SM3 and PureVideo. ATI pretty much just played catch-up.

Unbiased history would disagree with you.
X800XT was faster than any card from the 6-series
X1900XT was faster and more advanced than anything from the 7-series.

I would think the X1900 also holds up a lot better in today's games than a 7-series card.

 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: OCguy


On topic, what is the point of DX11-capable cards before W7 goes retail?

First, of course, it's all rumor; and second, will there even be a DirectX 11 in July? I have not seen a release date from Microsoft.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SickBeast


I guess I could also add the 9800Pro to the mix for nostalgia's sake.

Like I said, since the R300. Sure, they clocked the heck out of their cards to gain FPS, but they were behind on major features and their cards were hot and noisy.

A trend that still continues today.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
X800XT was faster than any card from the 6-series

The "Phantom Edition" X800 was the only board that was competitive with the 6800 Ultra; overall, probably a draw between those two parts. ATi needed the refresh (the X850) to have a competitive card that people could actually buy.

X1900XT was faster and more advanced than anything from the 7-series.

The 7900GX2 smoked the X1900XT; it wasn't close.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: Wreckage
Originally posted by: SickBeast


I guess I could also add the 9800Pro to the mix for nostalgia's sake.

Like I said, since the R300. Sure, they clocked the heck out of their cards to gain FPS, but they were behind on major features and their cards were hot and noisy.

A trend that still continues today.

None of the "key" features of the 6000 and 7000 series were relevant at all during their lifetimes.

Originally posted by: BenSkywalker
X800XT was faster than any card from the 6-series

The "Phantom Edition" X800 was the only board that was competitive with the 6800 Ultra; overall, probably a draw between those two parts. ATi needed the refresh (the X850) to have a competitive card that people could actually buy.

X1900XT was faster and more advanced than anything from the 7-series.

The 7900GX2 smoked the X1900XT; it wasn't close.

The Radeon X800XT, Radeon X800XTPE, Radeon X850XT, and Radeon X850XTPE all very soundly beat the 6800 Ultra.

The 7900GX2 was around twice as expensive as the X1900XT.
 

akugami

Diamond Member
Feb 14, 2005
5,654
1,845
136
Originally posted by: james1701
One problem for ATI this time is the fact that they caught Nvidia with their pants down last time. They were able to offer 80% of the performance for 50% of the price. I think a lot of people will wait to buy because they want to see whether Nvidia releases a much faster product this time, so they don't get caught like they did last time with a grossly overpriced card.

The thing is, it takes a lot of work to do a major redesign or a completely new architecture. Pure speculation, but the G300 might just be an updated G200. Much like CPUs, GPUs are hatched years in advance. It probably takes one and a half to two-plus years for a truly new design to be implemented, and we likely won't see the answer to ATI's smaller, but still powerful, GPUs until 2010, when the next iteration of nVidia GPUs comes out.

I'm not saying there won't be a large performance upgrade, but we've seen how the G200's monolithic size has hindered it from a business perspective. Maybe not from a performance standpoint, but every business still has to turn a profit. In that respect, the Radeon 4xxx series has been very good for ATI while the nVidia G200 series has fared worse (not badly, since nVidia still makes a buck, just worse). You can sell similar-performing ATI cards for less and still make similar profits.

I do believe that the G300-series part will continue the "big and powerful" GPUs in a similar vein to the G200, while the new RV870 will continue their current design of "small but adequate" and just slap two GPUs together in a single-board CrossFire configuration to compete on the high end. My preference has always been for single-GPU solutions, even though I've had SLI-capable motherboards for the last five-plus years, but you have to admit that ATI's strategy is sound. They may concede the high end of the market, but the meat and potatoes are in the low-end and midrange cards. Heck, look at how much Intel makes on their crappy integrated GPUs.

I think nVidia will price their GPUs much more competitively this time around (they have to), but it'll likely continue what has been happening for roughly the last half year, since the Radeon 4870 was released. The real action will heat up again next year when nVidia releases their answer to ATI's smaller, sleeker GPU design.

Either way, consumers FTW.

@Wreckage: If nVidia had been the one doing some of the things ATI did first (DX10.1, first to use GDDR5, first 40nm GPU, and, if these rumors are true, the first consumer DX11 part), you'd be lauding them instead of trashing them. ATI has beaten nVidia to a number of technologies, so your comment that nVidia was always first with new technology is a blatant lie; I've seen other threads where posters refuted the very same false information you are posting here.

Any time someone comes up with valid arguments against the FUD you spew, you retreat and ignore what they said. The only time you push an argument is when it's opinion-based and there is no clear right or wrong answer.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The Radeon X800XT, Radeon X800XTPE, Radeon X850XT, and Radeon X850XTPE all very soundly beat the 6800 Ultra.

Really?

"If that's too difficult a decision to make, our data shows the 6800 GT to lead the X800 Pro in performance. Maybe the $10 is a factor to you and maybe it isn't, but in this close race, price should definitely be a determining factor."

We can also see the 6800UE leading the X800XTPE most of the time as well.

Link.

So when you say "very soundly beat", that actually means flat-out loses? Dialect difference, perhaps?

The 7900GX2 was around twice as expensive as the X1900XT.

And the 4890 cost a lot more than the 9400GT; that doesn't change the fact that the 7900GX2 was significantly faster.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: dguy6789

None of the "key" features of the 6000 and 7000 series were relevant at all during their lifetimes.

Actually, they all were relevant, and many people are still using these cards.

SLI, PureVideo, SM3, HDR: all groundbreaking features that the competition struggled to adopt even years later.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
Why do people think it is groundbreaking to have GDDR5, 40nm, and 1GHz GPUs? That is a process/manufacturing decision.

I'd rather have something that is tangible as a consumer, like SLI, HDR, PureVideo, or non-adaptive AF.

/shrug

DX11 should be in both next-gen GPUs. But that doesn't matter until Microsoft actually releases DX11.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Genx87
Why do people think it is groundbreaking to have GDDR5, 40nm, and 1GHz GPUs? That is a process/manufacturing decision.

I'd rather have something that is tangible as a consumer, like SLI, HDR, PureVideo, or non-adaptive AF.

/shrug

DX11 should be in both next-gen GPUs. But that doesn't matter until Microsoft actually releases DX11.

Exactly. Heck, there are barely any DX10 games on the market yet.
 

HOOfan 1

Platinum Member
Sep 2, 2007
2,337
15
81
I think it is time to create an ATI subforum and an nVidia subforum (sorry, S3 guys... all 0 of you).

Too many threads get derailed like this. One-week ban for posting about the competition in the other forum.

Anyway, we are still getting fourth-hand information here. Xbit is quoting two other sites, one of which is certainly not known for being accurate.
 

imported_Scoop

Senior member
Dec 10, 2007
773
0
0
Originally posted by: chizow
Looks good; I'm sure many will be interested in what the next generation brings in terms of performance. Unless RV870 is ~1.75-2x the performance of RV790, however, I think most people will at least wait and see what GT300 offers before buying, especially after the latest round of product refreshes.

True, I'm in no rush. My 3850 handles everything I throw at it with ease! :)
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Originally posted by: Genx87
Why do people think it is groundbreaking to have GDDR5, 40nm, and 1GHz GPUs? That is a process/manufacturing decision.

I'd rather have something that is tangible as a consumer, like SLI, HDR, PureVideo, or non-adaptive AF.

How aren't any of those items groundbreaking? They may be process/manufacturing decisions, but they don't magically appear on video cards. There is still a lot of engineering and design work necessary to incorporate those technological advances into new cards. And the company that releases them first generally has an upper hand on its competition, be it in overall speed or in cost to manufacture/street cost.

I'd say that makes them very tangible benefits for end consumers.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Creig
Originally posted by: Genx87
Why do people think it is groundbreaking to have GDDR5, 40nm, and 1GHz GPUs? That is a process/manufacturing decision.

I'd rather have something that is tangible as a consumer, like SLI, HDR, PureVideo, or non-adaptive AF.

How aren't any of those items groundbreaking? They may be process/manufacturing decisions, but they don't magically appear on video cards. There is still a lot of engineering and design work necessary to incorporate those technological advances into new cards. And the company that releases them first generally has an upper hand on its competition, be it in overall speed or in cost to manufacture/street cost.

I'd say that makes them very tangible benefits for end consumers.

Nobody cares how the card is made. They care about how it plays games and what features it has (such as video encoding or physics).

Manufacturing process is the worry of the card maker, not the card buyer.

Besides, it's not like any of that helped ATI make cooler or more power-efficient cards... those things would also benefit the end user.
 

OCGuy

Lifer
Jul 12, 2000
27,227
36
91
Originally posted by: Creig
Originally posted by: Genx87
Why do people think it is groundbreaking to have GDDR5, 40nm, and 1GHz GPUs? That is a process/manufacturing decision.

I'd rather have something that is tangible as a consumer, like SLI, HDR, PureVideo, or non-adaptive AF.

How aren't any of those items groundbreaking? They may be process/manufacturing decisions, but they don't magically appear on video cards. There is still a lot of engineering and design work necessary to incorporate those technological advances into new cards. And the company that releases them first generally has an upper hand on its competition, be it in overall speed or in cost to manufacture/street cost.

I'd say that makes them very tangible benefits for end consumers.

What % of consumers really care about GDDR3 or 5? All they care about is that 3DMark score when they come home and fire their new toy up for the first time.

We have all seen component pissing contests, and I don't think I have ever seen one based on type of RAM or die size. Frame rates and synthetic benchies are king, even though I believe all we should care about is real-world performance.

As far as the 1GHz GPU goes, that sure is sexy. I wouldn't put that in the same breath as 40nm or RAM type. :beer:

 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Well, I thought this was a tech forum, not 'Consumer Reports'. And GDDR3 vs. GDDR5 directly relates to performance as well as power/heat. I am interested in 40nm because of its potential for better products, not because of the better margins company X will make. (Though some seem to care about such things.)
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: lopri
Well, I thought this was a tech forum, not 'Consumer Reports'. And GDDR3 vs. GDDR5 directly relates to performance as well as power/heat. I am interested in 40nm because of its potential for better products, not because of the better margins company X will make. (Though some seem to care about such things.)

+1. Since AMD has always had a head start in the process race and a better understanding of implementing GDDR5, I wonder how nVIDIA will fare with the new 40nm process/GDDR5 modules. I guess they will get a trial run of the process when they release the GT21x-based cards rumoured to hit stores around late Q2.
 

Forumpanda

Member
Apr 8, 2009
181
0
0
I would say that a video card using less power to achieve the same performance is a tangible benefit.

And manufacturing tech is not totally irrelevant; having more experience with newer (and better) manufacturing technology should give ATI the ability to put out a much stronger card, everything else being equal.

I was actually a bit surprised that ATI did not use their 'tech lead' to put out an equally large chip (die size) and really hit it home performance-wise.
I suspect they might be planning something big when/if they switch to using AMD/GlobalFoundries for manufacturing.

But I am just an outside observer, with little technical knowledge.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
Originally posted by: Creig
Originally posted by: Genx87
Why do people think it is groundbreaking to have GDDR5, 40nm, and 1GHz GPUs? That is a process/manufacturing decision.

I'd rather have something that is tangible as a consumer, like SLI, HDR, PureVideo, or non-adaptive AF.

How aren't any of those items groundbreaking? They may be process/manufacturing decisions, but they don't magically appear on video cards. There is still a lot of engineering and design work necessary to incorporate those technological advances into new cards. And the company that releases them first generally has an upper hand on its competition, be it in overall speed or in cost to manufacture/street cost.

I'd say that makes them very tangible benefits for end consumers.

Because it is an evolutionary step every manufacturer goes through. We can pretty much map out the mask sizes for the next 20 years. Achieving each step isn't groundbreaking. It is a process change and something the end user won't see.

But I admit perhaps I should have reworded the intent of my response from "thinking it is groundbreaking" to "who gives a flying eff". None of these process changes matter to the end user. The chip could be on a 0.30 process, have a die size of 2000mm², and run at 1MHz; if it plays my games at top speed, what do I care how it is manufactured?

Whereas HDR, SLI, AF, etc. are technologies I can actually see and care about as a gamer.