ATI RV770 in May??!!!


v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Erm. Chips are squares, so area scales with the square of the linear feature size. Take the square root of the transistor count before comparing percentages and things look a bit different.

As a rough guesstimate of the shrinkage from an optical shrink of a 530 mm^2 90nm part to a 55nm process, we'd go something like (sqrt(530)*55/90)^2 and get about 198. A 600 mm^2 90nm part would not be 300 mm^2 on a 45nm process; it would be closer to 150 mm^2. A factor-of-2 decrease in process size means a factor-of-4 decrease in chip size.
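A quick sketch of that arithmetic in Python (the die sizes are just the example numbers above; a real optical shrink never scales quite this perfectly):

```python
# Ideal optical-shrink estimate: die area scales with the square of
# the linear feature size, so 90nm -> 55nm cuts area by (55/90)^2.

def shrunk_area(area_mm2, old_node_nm, new_node_nm):
    """Ideal shrink; real layouts (pads, analog, I/O) shrink less."""
    return area_mm2 * (new_node_nm / old_node_nm) ** 2

print(shrunk_area(530, 90, 55))  # ~198 mm^2 for a 530 mm^2 90nm die
print(shrunk_area(600, 90, 45))  # 150 mm^2: half the node, a quarter the area
```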
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
so a little smaller than G80? even with a densely packed chip .. and i guess it'll be hot
- G80 is really pretty big .. i guess we are looking at "expensive" ... and i hope they learned i do NOT want to FRY the inside of my case or i WILL go for the budget solution .. again :p

... sigh :(
 

Rusin

Senior member
Jun 25, 2007
573
0
0
With those specs it should consume about the same wattage as the 9800 GX2. It should be smaller than G80, even with NVIO integrated. With those rumoured specs it would basically be a 9800 GX2 on a single GPU (slightly fewer processing units, but higher clocks). Once you remove all the dual-GPU bottlenecks of the 9800 GX2 (plus the fact that single G92s are bottlenecked by their memory bandwidth, not to mention the 9800 GX2), you get a huge performance leap.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Looks possibly midrange. Still, a nice cheap midrange card works for me.
 

daveybrat

Elite Member
Super Moderator
Jan 31, 2000
5,829
1,042
126
I love my Nvidia 8800GTS, but come on Nvidia, no new official drivers released since December 2007??? Hellooo!!!

At least ATI users get new drivers almost monthly!
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
Originally posted by: daveybrat
I love my Nvidia 8800GTS, but come on Nvidia, no new official drivers released since December 2007??? Hellooo!!!

At least ATI users get new drivers almost monthly!

Just use the betas; 174.14 is practically WHQL.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The 3870 X2 may be dual-GPU, but that is just the way the market is heading. It is no longer possible (in general) to sustain GPU advancement with a single-GPU setup. Look at the size of G80.... ~480-530mm^2, and that's not even including the display chip, which was separate. No company wants to produce a chip that big; it's not good for business. Now look at the 3870 X2.... it is made up of two small chips, 192mm^2 each, allowing for excellent yields and low cost. R700 will be the same idea: two relatively small chips that are not too expensive to make. Not only does this make manufacturing easier, it makes design easier; the same RV770 core used in the high-end parts can be used in the midrange cards.

Now nVidia might be going for a last hurrah with GT200, which will probably be a similar size to G80. But it's just not sustainable; process technology cannot keep up with GPU advancement. A few years ago, a top-of-the-line GPU was ~200mm^2. Now we're looking at 500mm^2 on much more advanced processes.

R700 will be a success against G92b chips; the question is of course how well it will compete against GT200. But with 50% more shader processors, 2x the texture units (a major bottleneck in the R600 design), higher clockspeeds, and GDDR5 memory, I think it should be a success.
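To put rough numbers on the yield argument, here is a minimal sketch using the classic Poisson yield model; the defect density is a made-up illustrative figure, not actual fab data:

```python
import math

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Classic Poisson yield model: Y = exp(-A * D).
    Real fabs use more elaborate models, but the trend is the same."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

D = 0.002  # assumed defect density (defects/mm^2), purely illustrative

print(poisson_yield(192, D))  # ~0.68 -> most small RV670-sized dies are good
print(poisson_yield(530, D))  # ~0.35 -> most of a wafer of G80-sized dies is scrap
```

With the same (assumed) defect density, the small die yields roughly twice as well, which is the whole economic case for the two-chip approach.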

nVidia has two build-process advancements they can make over their current offerings to catch up with fab tech, and their current clock rates are a fraction of current CPUs'. There is a LOT of room left for significant improvements in single-GPU performance before we approach the point where dual GPUs are required to advance; we are not remotely close to that point. Raw fillrate requirements have leveled off and are not likely to increase in any significant way in the near future (30" displays are about as big as we are going to get for a while), so the additional die space made available by newer build processes will overwhelmingly be devoted to more shading power, which is where we are currently coming up seriously short. Given the number of build processes still available and what that additional die space will be used for, we are still easily looking at exponential growth in the areas that need it most. At some point multi-GPU may become the only way to sustain rapid advancement, but that point is still well beyond what we can see on the horizon.
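To put rough numbers on that fillrate claim (a minimal sketch; the refresh rate and overdraw factor are assumptions, while the 8800 GTX figures are its stock 24 ROPs at 575 MHz):

```python
# Pixel fillrate needed to drive a 30" panel vs. what a G80-class
# part already offers. Refresh and overdraw are assumed values.

width, height = 2560, 1600   # 30" display resolution
fps = 60                     # target frame rate (assumption)
overdraw = 4                 # times each pixel is touched per frame (assumption)

needed = width * height * fps * overdraw / 1e9
print(f"needed:  {needed:.2f} Gpixel/s")                       # ~0.98 Gpixel/s

# 8800 GTX: 24 ROPs at 575 MHz core clock
print(f"8800GTX: {24 * 575e6 / 1e9:.2f} Gpixel/s theoretical")  # 13.80
```

Even with generous assumptions, raw fillrate is an order of magnitude ahead of what a 30" panel demands, which is why the extra transistors keep going to shading instead.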

As far as R700 goes: ATi's current offerings should already smack nVidia's down with ease, particularly in shader-heavy games, yet this hasn't materialized. Pretty much since the R9700, ATi has taken baby steps in advancement and slowly went from dominating to being dominated. At this point in the game, given the 'advancements' they have offered over the last several generations of chips, it is up to them to prove they are not going to release another mediocre part. As it stands now, their last couple of parts couldn't displace nVidia's prior-cycle offerings on the high end or even the mid-high end; given that trend, it would seem the R700 won't be capable of taking down the 8800 Ultra, which is getting up there in age. Will that be the case? I honestly don't know; any comment one way or the other would be entirely speculative. They may well have the next R9700 tucked up their sleeve, but as time goes on it seems increasingly obvious that that part was a fluke and isn't something they have been able to come remotely close to replicating.
 

thilanliyan

Lifer
Jun 21, 2005
12,082
2,281
126
Originally posted by: BenSkywalker
As far as R700 goes: ATi's current offerings should already smack nVidia's down with ease, particularly in shader-heavy games, yet this hasn't materialized. Pretty much since the R9700, ATi has taken baby steps in advancement and slowly went from dominating to being dominated. At this point in the game, given the 'advancements' they have offered over the last several generations of chips, it is up to them to prove they are not going to release another mediocre part. As it stands now, their last couple of parts couldn't displace nVidia's prior-cycle offerings on the high end or even the mid-high end; given that trend, it would seem the R700 won't be capable of taking down the 8800 Ultra, which is getting up there in age. Will that be the case? I honestly don't know; any comment one way or the other would be entirely speculative. They may well have the next R9700 tucked up their sleeve, but as time goes on it seems increasingly obvious that that part was a fluke and isn't something they have been able to come remotely close to replicating.

Up until the 2900, ATI was very competitive. The X800/X850, X1800, and X1900/X1950 basically led their classes and were fairly successful commercially (didn't ATI have the larger market share until G80 was released?). There was a huge reversal of fortunes, however.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
ATI's problem past the 9700 days was that they were months behind NV's offerings for slightly better performance. By the time the X800 came out, enthusiasts had already bought 6800s. Repeat with the 7800 vs X1800. And again with the 7900 vs X1950. And once more with the 8800 vs 2900 (although here we don't even see the performance improvement). In each case the ATI part was the better performer, but not by enough to warrant an upgrade from the higher-end NV parts at release prices.

While ATI still sold wagonloads of those cards, they couldn't command the kind of early-adopter e-peen premiums that NV has been getting away with -- with the enthusiast market mostly tapped, they were left with value-conscious upper-mainstream/lower-enthusiast buyers like myself. Sure, I got the X850XT PE and X1800XT -- but for $150 and $249 respectively, 4-6 months after launch. Not the $500 and $650 MSRP that ATI would have received had they beaten NV to market.
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: v8envy
ATI's problem past the 9700 days was that they were months behind NV's offerings for slightly better performance. By the time the X800 came out, enthusiasts had already bought 6800s. Repeat with the 7800 vs X1800. And again with the 7900 vs X1950. And once more with the 8800 vs 2900 (although here we don't even see the performance improvement). In each case the ATI part was the better performer, but not by enough to warrant an upgrade from the higher-end NV parts at release prices.

While ATI still sold wagonloads of those cards, they couldn't command the kind of early-adopter e-peen premiums that NV has been getting away with -- with the enthusiast market mostly tapped, they were left with value-conscious upper-mainstream/lower-enthusiast buyers like myself. Sure, I got the X850XT PE and X1800XT -- but for $150 and $249 respectively, 4-6 months after launch. Not the $500 and $650 MSRP that ATI would have received had they beaten NV to market.

My feelings exactly. They also seem to engineer the cards for future trends and not as much for current games. Over time, ATI cards seem to age better and play newer games better than NV cards, but at launch Nvidia usually does better on the current games.
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
Originally posted by: golem
My feelings exactly. They also seem to engineer the cards for future trends and not as much for current games. Over time, ATI cards seem to age better and play newer games better than NV cards, but at launch Nvidia usually does better on the current games.

That has been my experience as well.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: golem

My feelings exactly. They also seem to engineer the cards for future trends and not as much for current games. Over time, ATI cards seem to age better and play newer games better than NV cards, but at launch Nvidia usually does better on the current games.

You don't get it. NV cards run better because most publishers are affiliated with NV; NV gets involved in driver development and creative work. AMD has limited choice here. This isn't a question of who has better technology, as both companies seem to make equally powerful, equally crappy cards -- the problem lies on the software side. If they really wanted to please fans or consumers, or cared about innovation, they'd wait until something extraordinary is invented, but that is not how they work: say AMD releases a card next month; NV will release a card from their incomplete backup lineup that scores 2 points higher than AMD's and naturally steals consumer attention.
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
Originally posted by: Aberforth
Originally posted by: golem

My feelings exactly. They also seem to engineer the cards for future trends and not as much for current games. Over time, ATI cards seem to age better and play newer games better than NV cards, but at launch Nvidia usually does better on the current games.

You don't get it. NV cards run better because most publishers are affiliated with NV; NV gets involved in driver development and creative work. AMD has limited choice here. This isn't a question of who has better technology, as both companies seem to make equally powerful, equally crappy cards -- the problem lies on the software side. If they really wanted to please fans or consumers, or cared about innovation, they'd wait until something extraordinary is invented, but that is not how they work: say AMD releases a card next month; NV will release a card from their incomplete backup lineup that scores 2 points higher than AMD's and naturally steals consumer attention.

The HD3XXX series does have more features than the G80 and G92/94 cards, such as onboard sound, DX10.1, and AVIVO. If I were buying a new HTPC, I would probably get an AMD card for the HDMI sound, but for gaming I would prefer one of the faster nVidia cards.
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: Aberforth
Originally posted by: golem

My feelings exactly. They also seem to engineer the cards for future trends and not as much for current games. Over time, ATI cards seem to age better and play newer games better than NV cards, but at launch Nvidia usually does better on the current games.

You don't get it. NV cards run better because most publishers are affiliated with NV; NV gets involved in driver development and creative work. AMD has limited choice here. This isn't a question of who has better technology, as both companies seem to make equally powerful, equally crappy cards -- the problem lies on the software side. If they really wanted to please fans or consumers, or cared about innovation, they'd wait until something extraordinary is invented, but that is not how they work: say AMD releases a card next month; NV will release a card from their incomplete backup lineup that scores 2 points higher than AMD's and naturally steals consumer attention.

Does it really matter why NV cards run better, or is the important thing that they run better?
Isn't it good that Nvidia is working with publishers to make sure their customers get the best experience they can?

Innovation doesn't grow on trees; it takes time. Until something new comes along, you can still give your customers value by using process shrinks and respins to deliver slightly better performance and lower prices. Can you name one company, just one, that introduces something extraordinary with every new release?

Nvidia is a business out to make money. If they have a slightly better product to counter a new product by AMD, even if the difference is very slight, shouldn't they introduce it? Especially if it's priced lower?

What do you mean by incomplete backup card line?


 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: Aberforth

You don't get it. NV cards run better because most publishers are affiliated with NV; NV gets involved in driver development and creative work. AMD has limited choice here. This isn't a question of who has better technology, as both companies seem to make equally powerful, equally crappy cards -- the problem lies on the software side. If they really wanted to please fans or consumers, or cared about innovation, they'd wait until something extraordinary is invented, but that is not how they work: say AMD releases a card next month; NV will release a card from their incomplete backup lineup that scores 2 points higher than AMD's and naturally steals consumer attention.

I wonder what caused software makers to eschew the market leader at the time (ATI) and the current market leader (Intel) by optimizing their software to run only on NV cards? </sarcasm>

Software vendors want the broadest market possible. They don't care to increase NV's sales, ATI's sales, or BitBoyz's sales. They'll happily take marketing dollars to put a vendor's logo on a splash screen, whether or not the title runs better on that vendor's current hardware. But at the end of the day they only care that their title runs well on the largest amount of hardware out there.

If you recall, Half-Life 2 ran so much better on the ATI hardware of the time (9600 and up) that Valve had to do backflips and handstands to give the poor guys with Ti and FX series cards any kind of HL experience at all.

In summary: until November 2006, ATI and NV used to leapfrog each other, trading the performance/$ crown at various price levels. Lately it's been completely one-sided, but that isn't a conspiracy between NV and software makers. It's not NV's fault that ATI can't come up with hardware to beat a 1+ year-old design.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: BenSkywalker
The 3870 X2 may be dual-GPU, but that is just the way the market is heading. It is no longer possible (in general) to sustain GPU advancement with a single-GPU setup. Look at the size of G80.... ~480-530mm^2, and that's not even including the display chip, which was separate. No company wants to produce a chip that big; it's not good for business. Now look at the 3870 X2.... it is made up of two small chips, 192mm^2 each, allowing for excellent yields and low cost. R700 will be the same idea: two relatively small chips that are not too expensive to make. Not only does this make manufacturing easier, it makes design easier; the same RV770 core used in the high-end parts can be used in the midrange cards.

Now nVidia might be going for a last hurrah with GT200, which will probably be a similar size to G80. But it's just not sustainable; process technology cannot keep up with GPU advancement. A few years ago, a top-of-the-line GPU was ~200mm^2. Now we're looking at 500mm^2 on much more advanced processes.

R700 will be a success against G92b chips; the question is of course how well it will compete against GT200. But with 50% more shader processors, 2x the texture units (a major bottleneck in the R600 design), higher clockspeeds, and GDDR5 memory, I think it should be a success.

nVidia has two build-process advancements they can make over their current offerings to catch up with fab tech, and their current clock rates are a fraction of current CPUs'. There is a LOT of room left for significant improvements in single-GPU performance before we approach the point where dual GPUs are required to advance; we are not remotely close to that point. Raw fillrate requirements have leveled off and are not likely to increase in any significant way in the near future (30" displays are about as big as we are going to get for a while), so the additional die space made available by newer build processes will overwhelmingly be devoted to more shading power, which is where we are currently coming up seriously short. Given the number of build processes still available and what that additional die space will be used for, we are still easily looking at exponential growth in the areas that need it most. At some point multi-GPU may become the only way to sustain rapid advancement, but that point is still well beyond what we can see on the horizon.

As far as R700 goes: ATi's current offerings should already smack nVidia's down with ease, particularly in shader-heavy games, yet this hasn't materialized. Pretty much since the R9700, ATi has taken baby steps in advancement and slowly went from dominating to being dominated. At this point in the game, given the 'advancements' they have offered over the last several generations of chips, it is up to them to prove they are not going to release another mediocre part. As it stands now, their last couple of parts couldn't displace nVidia's prior-cycle offerings on the high end or even the mid-high end; given that trend, it would seem the R700 won't be capable of taking down the 8800 Ultra, which is getting up there in age. Will that be the case? I honestly don't know; any comment one way or the other would be entirely speculative. They may well have the next R9700 tucked up their sleeve, but as time goes on it seems increasingly obvious that that part was a fluke and isn't something they have been able to come remotely close to replicating.

The HD 3870 X2 has already surpassed or equalled the 8800 Ultra in virtually every game, so R700 will definitely be faster.

R9700 was hardly a "fluke"... from the launch of the 9700 Pro to the launch of G80 in November 2006 ATI held a solid lead in the graphics industry. That's a longer lead than nVidia has had since G80, which is less than 2 years.

The GeForce FX wasn't even playing in the same league as ATI's 9xxx series.
GeForce 6 series definitely made it a closer game, but X800XT PE was certainly faster than the Ultra in everything but OpenGL.
The X1800XT beat the 7800GTX, and the X1900XTX beat the 7900GTX. The 7950GX2 outperformed the X1950XTX most of the time, but it was more expensive, had lower IQ, and ran at much lower clocks (500/1200 vs 650/1600 for the GTX).

And the latest rumor seems to suggest that GT200 is dual chip as well:
http://www.fudzilla.com/index....=view&id=6858&Itemid=1

 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: v8envy
ATI's problem past the 9700 days was that they were months behind NV's offerings for slightly better performance. By the time the X800 came out, enthusiasts had already bought 6800s. Repeat with the 7800 vs X1800. And again with the 7900 vs X1950. And once more with the 8800 vs 2900 (although here we don't even see the performance improvement). In each case the ATI part was the better performer, but not by enough to warrant an upgrade from the higher-end NV parts at release prices.

While ATI still sold wagonloads of those cards, they couldn't command the kind of early-adopter e-peen premiums that NV has been getting away with -- with the enthusiast market mostly tapped, they were left with value-conscious upper-mainstream/lower-enthusiast buyers like myself. Sure, I got the X850XT PE and X1800XT -- but for $150 and $249 respectively, 4-6 months after launch. Not the $500 and $650 MSRP that ATI would have received had they beaten NV to market.

That's not accurate for the 6800 vs X800... the X800 came out about a month later, or even less.

The X1800XT was late, so that part is accurate. But there was no real competition when it launched, and the X1800XT sold for $500 anyway; the 7800GTX 256MB was slower and <$500, while the 7800GTX 512MB was faster but >$700.

The X1900XTX launched before the 7900 series and was still faster. The X1950 series was just a refresh with GDDR4 memory and lower prices.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: Extelleron

The HD 3870 X2 has already surpassed or equalled the 8800 Ultra in virtually every game, so R700 will definitely be faster.
Only in timedemos. It's funny how all the sites that use actual gameplay get a totally different story...

 

thilanliyan

Lifer
Jun 21, 2005
12,082
2,281
126
Originally posted by: Rusin
Originally posted by: Extelleron

The HD 3870 X2 has already surpassed or equalled the 8800 Ultra in virtually every game, so R700 will definitely be faster.
Only in timedemos. It's funny how all the sites that use actual gameplay get a totally different story...

Most games other than the ones with built-in demos are benched using actual gameplay, aren't they (e.g. CoD4)? Since most games don't have built-in timedemos, doesn't that mean most reviews use something like a FRAPS run-through, which can be indicative of actual gameplay?

Got any links to "real" reviews? (note: Please no HardOCP :D )
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The HD 3870 X2 has already surpassed or equalled the 8800 Ultra in virtually every game, so R700 will definitely be faster.

Really? If I applied the same logic, a single 9800 would be faster than 8800GTXs in SLI. If you are saying that some part, using any number of chips, will end up being faster than the 3870 X2 in particular, then I would have to assume the same. That is very, very different than saying that the R700 will be more than twice as fast as their current high-end offering; we'll come back to that though :)

R9700 was hardly a "fluke"... from the launch of the 9700 Pro to the launch of G80 in November 2006 ATI held a solid lead in the graphics industry.

The R9700Pro was in some cases three to four times faster than the high-end offering it replaced in actual game settings people would utilize; the norm was closer to twice as fast (a bit more than that, but in that general range). Since then, ATi has taken very small performance steps forward. nV didn't have a huge leap moving from the Ti4600 to the 5800FX, but it was a considerable amount faster than its predecessor, and if not for the enormous leap ATi made with the R9700Pro it likely wouldn't be remembered as a bad part at all. In the context it was released, however, it was a lemon. Since the R9700Pro we have seen ATi release parts over and over again that are FAR closer to a Ti4600-to-5800FX style improvement than anything at all like the R8500-to-R9700Pro was. nVidia, otoh, has continued at a slightly faster rate of advancement. If this were one or two parts it would be easy to dismiss as coincidental, but given that this trend has been going on for many years now, it must be given some consideration in any honest analysis. Quite frankly, if the GT200 is, say, 40% faster than the G92 and the R700 is 200% faster than the 3870, ATi still fails to take back the throne.

GeForce 6 series definitely made it a closer game, but X800XT PE was certainly faster than the Ultra in everything but OpenGL.

Are you seriously talking about the phantom edition? I mean, come on now.

And the latest rumor seems to suggest that GT200 is dual chip as well:

The GT200 is a single-chip design, I'll tell you that for certain right now. It will be available in multi-chip offerings as long as they sell; nV has proven they are not stupid when it comes to the market-demand end of the spectrum. If they can push ATi down into the 4th or 5th fastest position right off, it significantly benefits them in the long run from a business perspective.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: thilan29
Most games other than the ones with built-in demos are benched using actual gameplay, aren't they (e.g. CoD4)? Since most games don't have built-in timedemos, doesn't that mean most reviews use something like a FRAPS run-through, which can be indicative of actual gameplay?

Got any links to "real" reviews? (note: Please no HardOCP :D )
Nope.. for example, in this one Anandtech review they actually said that they were testing the opening intro :D.
-----

http://plaza.fi/muropaketti/ar...dia-geforce-9800-gx2,3
They use actual gameplay and the same settings.

http://www.bit-tech.net/hardwa...600m_1gb_graphics_card

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: Martimus
Originally posted by: Aberforth
Originally posted by: golem

My feelings exactly. They also seem to engineer the cards for future trends and not as much for current games. Over time, ATI cards seem to age better and play newer games better than NV cards, but at launch Nvidia usually does better on the current games.

You don't get it. NV cards run better because most publishers are affiliated with NV; NV gets involved in driver development and creative work. AMD has limited choice here. This isn't a question of who has better technology, as both companies seem to make equally powerful, equally crappy cards -- the problem lies on the software side. If they really wanted to please fans or consumers, or cared about innovation, they'd wait until something extraordinary is invented, but that is not how they work: say AMD releases a card next month; NV will release a card from their incomplete backup lineup that scores 2 points higher than AMD's and naturally steals consumer attention.

The HD3XXX series does have more features than the G80 and G92/94 cards, such as onboard sound, DX10.1, and AVIVO. If I were buying a new HTPC, I would probably get an AMD card for the HDMI sound, but for gaming I would prefer one of the faster nVidia cards.

The HD3XXX series do not have onboard sound. They have HDMI audio "throughput" via an S/PDIF audio cable. The 9800GTX, 9600GT and GX2 have this ability.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
The high end was always a tossup between ATi and NVIDIA. But here are the important facts that cleared the path to nVIDIA's success:

-The 6800GT was the key to nVIDIA striking back after the FX debacle. It was cheaper than the flagship cards, OCed up to Ultra speeds with ease, and most importantly beat the X800pro across the board. ATi was too late with the X800XL.
-The 6600GT beat everything on the market in its price range. The X700? That card totally failed because of the 6600GT. This card definitely gave nVIDIA the edge. (ATi's response was the usual low-margin card, in this case the X800GTO.)
-The introduction of SLi when NV45 hit the scene. There was no competition from ATi in the multi-GPU scene until much later, when RV670 came out with native CrossFire, which in fact is pretty damn competitive with SLi if not better (but this was after 2~3 or so years). Notice how ATi completely abandoned the hyped "super tiling".
-The R520 was way too late, because everyone had already gone out and bought a 7800GT or GTX card, or two.
-The X1800XT was barely faster than the 7800GTX 256MB at the time of its release. (Only after a couple of months with optimized drivers did it perform better by a lot, but by then it was too late anyway because of the refreshes.)
-The 7800GTX 512MB sold out, and the people who could buy one bought one, or two. For the price premium, the amount of profit earned per card must've been pretty damn high.
-The 7600GT again dominated the midrange since the X1600 series was too slow. The X1800GTO was a poor attempt on ATI's part, since it had to use a 256-bit memory interface AND a crippled R520 core (no margins at all) just to fight off a midrange card based on the G73.
-X1900 vs the 7900 series: I'd say it favored ATi, but this was a tossup. G71 cores were tiny compared to the R580, and G71 even packed fewer transistors than the G70, so more G71s could be produced per wafer. In terms of production cost, and overall cost, I'd think G71 cards had a bigger margin than their ATi counterparts. Many people bought 7900GTs and 7900GTXs and OCed them like mad. Many people also bought X1900 series cards. When these cards were introduced, the 7900GTX was roughly on par with the X1900XT, with a slight edge to the ATi card.
-The 7950GX2 was definitely the fastest single-PCIe-slot solution compared to the X1950XTX. But there was a price difference, so it was a tossup here as well. Once again ATi was late with the X1650XT and X1950pro, but these products did give ATi quite the needed boost.
-G80/G9x, well we all know how this turned out.
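As a rough sketch of the wafer-economics point in the G71/R580 bullet above (the die sizes are commonly cited ballpark figures, and the standard approximation below ignores yield and scribe lines):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard dies-per-wafer approximation (gross dies, yield ignored)."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(196))  # G71-sized die (~196 mm^2): ~313 gross dies
print(dies_per_wafer(352))  # R580-sized die (~352 mm^2): ~165 gross dies
```

Nearly twice the candidate dies per wafer is exactly the kind of margin edge described above.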

You can see that ATi was always late, and seemed to do badly in the midrange (something that contrasts with their high-end GPUs). Not to mention the tendency to release cards that sometimes had low to no margins at all. They recovered by releasing more competitive products (X700, X800pro, X1600, X1800GTO and so on), but by then it was already too late, with the next generation of cards close to release.

There are other things that can be mentioned. Innovation is great, but nVIDIA and ATi are businesses too. The keyword here is business. For instance, the R5x0 architecture excelled at dynamic branching because that part of the architecture was greatly improved. Was this ever used? The answer is no. And by the time it is, there will be faster cards out that excel in those situations. These design decisions can come back to haunt you; in this case the R580 could have been a cheaper GPU to produce if all those unneeded features had been taken away. This is what nVIDIA excels at. Look how they've successfully turned the NV47 into a 7-series product (potentially saving loads of R&D and other related costs) and stayed competitive. They introduce features in a timely fashion, neither too late nor too early (just look at ATI's tessellation feature; it's pretty much wasted transistor budget/die space). An example is SM 3.0 with NV40. Sure, most of its features were too slow to run, but it gave the devs an opportunity to do something new. The kind of support nVIDIA gives to devs is light years ahead of ATi. Just look at the "GPU Gems" series and the amount of effort that has gone into supporting the devs in utilizing these features.

Although this can sometimes be harsh on consumers, it gives these companies the edge needed to beat the other. That is why ATI was acquired by AMD while nVIDIA is the last one standing out of the many (yea, and i mean many :D) that started in the graphics race years ago.

Ok, enough OT. RV770 will potentially bring some competition to the high end. Judging by the rumours, I'm still not sure what its specs are going to be; 32 TMUs is a given. I highly suspect RV770 is based on the RV670 with its drawbacks seriously addressed (poor texturing performance, AA performance etc). I'm also certain that R700 is dual RV770, like the R680 is dual RV670. For reference, an 8800 Ultra is 30~50% faster than the HD3870.

edit - think i went a little overboard :eek:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Cookie Monster
The high end was always a tossup between ATi and NVIDIA. But here are the important facts that cleared the path to nVIDIA's success:

-The 6800GT was the key to nVIDIA striking back after the FX debacle. It was cheaper than the flagship cards, OCed up to Ultra speeds with ease, and most importantly beat the X800pro across the board. ATi was too late with the X800XL.
-The 6600GT beat everything on the market in its price range. The X700? That card totally failed because of the 6600GT. This card definitely gave nVIDIA the edge. (ATi's response was the usual low-margin card, in this case the X800GTO.)
-The introduction of SLi when NV45 hit the scene. There was no competition from ATi in the multi-GPU scene until much later, when RV670 came out with native CrossFire, which in fact is pretty damn competitive with SLi if not better (but this was after 2~3 or so years). Notice how ATi completely abandoned the hyped "super tiling".
-The R520 was way too late, because everyone had already gone out and bought a 7800GT or GTX card, or two.
-The X1800XT was barely faster than the 7800GTX 256MB at the time of its release. (Only after a couple of months with optimized drivers did it perform better by a lot, but by then it was too late anyway because of the refreshes.)
-The 7800GTX 512MB sold out, and the people who could buy one bought one, or two. For the price premium, the amount of profit earned per card must've been pretty damn high.
-The 7600GT again dominated the midrange since the X1600 series was too slow. The X1800GTO was a poor attempt on ATI's part, since it had to use a 256-bit memory interface AND a crippled R520 core (no margins at all) just to fight off a midrange card based on the G73.
-X1900 vs the 7900 series: I'd say it favored ATi, but this was a tossup. G71 cores were tiny compared to the R580, and G71 even packed fewer transistors than the G70, so more G71s could be produced per wafer. In terms of production cost, and overall cost, I'd think G71 cards had a bigger margin than their ATi counterparts. Many people bought 7900GTs and 7900GTXs and OCed them like mad. Many people also bought X1900 series cards. When these cards were introduced, the 7900GTX was roughly on par with the X1900XT, with a slight edge to the ATi card.
-The 7950GX2 was definitely the fastest single-PCIe-slot solution compared to the X1950XTX. But there was a price difference, so it was a tossup here as well. Once again ATi was late with the X1650XT and X1950pro, but these products did give ATi quite the needed boost.
-G80/G9x, well we all know how this turned out.

You can see that ATi was always late, and seemed to do badly in the midrange (something that contrasts with their high-end GPUs). Not to mention the tendency to release cards that sometimes had low to no margins at all. They recovered by releasing more competitive products (X700, X800pro, X1600, X1800GTO and so on), but by then it was already too late, with the next generation of cards close to release.

There are other things that can be mentioned. Innovation is great, but nVIDIA and ATi are businesses too. The keyword here is business. For instance, the R5x0 architecture excelled at dynamic branching because that part of the architecture was greatly improved. Was this ever used? The answer is no. And by the time it is, there will be faster cards out that excel in those situations. These design decisions can come back to haunt you; in this case the R580 could have been a cheaper GPU to produce if all those unneeded features had been taken away. This is what nVIDIA excels at. Look how they've successfully turned the NV47 into a 7-series product (potentially saving loads of R&D and other related costs) and stayed competitive. They introduce features in a timely fashion, neither too late nor too early (just look at ATI's tessellation feature; it's pretty much wasted transistor budget/die space). An example is SM 3.0 with NV40. Sure, most of its features were too slow to run, but it gave the devs an opportunity to do something new. The kind of support nVIDIA gives to devs is light years ahead of ATi. Just look at the "GPU Gems" series and the amount of effort that has gone into supporting the devs in utilizing these features.

Although this can sometimes be harsh on consumers, it gives these companies the edge needed to beat the other. That is why ATI was acquired by AMD while nVIDIA is the last one standing out of the many (yea, and i mean many :D) that started in the graphics race years ago.

Ok, enough OT. RV770 will potentially bring some competition to the high end. Judging by the rumours, I'm still not sure what its specs are going to be; 32 TMUs is a given. I highly suspect RV770 is based on the RV670 with its drawbacks seriously addressed (poor texturing performance, AA performance etc). I'm also certain that R700 is dual RV770, like the R680 is dual RV670. For reference, an 8800 Ultra is 30~50% faster than the HD3870.

edit - think i went a little overboard :eek:

very long and essentially correct

you have to realize that ATi concluded they could no longer *compete* with NVIDIA's high end after x1900 ... they saw G80 coming - i saw it coming - so they did the unexpected: they joined AMD to help develop the CPU into a CPU-GPU. AMD's current vision is to take the midrange with r700 and aim higher with CrossFireX until Fusion becomes a reality. They can *hope* for another r300/9700p, but i think their best engineers are helping AMD with a revolutionary architecture.

actually, x1900 was in the beginning a *disaster* for ATi - it was the leader but way too expensive to produce and somewhat "over-engineered" for the games of the day - unlike the 7800/7900 series, which was "good enough" as a performance alternative and appeared cheap to produce. NVIDIA's business sense was always better, and ATi's marketing, and especially its PR, was "bottom feeder" juvenile, although for a while it appeared that NVIDIA also had some monkeys in charge of PR to combat ATi's.

Since NVIDIA can't partner with Intel [any longer], they will attempt to wrest the entire discrete graphics market away from AMD with a veritable *flood* of value [GT100] and performance [GT200] GPUs.

since they are cheap to produce, it looks like a great strategy - for them.
AMD is allowed to "escape" as Intel engages NVIDIA - and AMD will stand or fall on its CPUs - their graphics can be "value" and "competitive" until Fusion, and they will be fine. And of course, NVIDIA needs VIA or SiS to make their own CPU-GPU.

interesting times as everything realigns - and great value for us!
