AnandTech's second review of the 8600GT/GTS is very good


fierydemise

Platinum Member
Apr 16, 2005
Originally posted by: schneiderguy
Originally posted by: fierydemise
Originally posted by: schneiderguy
Originally posted by: munky
Simply put, Nvidia got greedy and thought they could pass off a low end card for the price of a midrange card.

I wouldn't blame them, considering that is what ATI has done for the past two generations.
Last gen, ATI's midrange offerings were very good, although most of them were never intended as midrange cards.

The fact that nVidia has had such good midrange cards the past few generations makes the 8600 even more surprising.

The X1600XT couldn't beat a 6800GT (Nvidia's second-highest card).
The 8600GT can't beat a 7900GT (Nvidia's second-highest card).

They are both crap. :)
I'll agree completely with that assessment; I was referring more to the X1800 when I said ATI's midrange offerings last gen were pretty good.
 

evolucion8

Platinum Member
Jun 17, 2005
Originally posted by: schneiderguy
Originally posted by: munky
Originally posted by: schneiderguy
Originally posted by: munky
Simply put, Nvidia got greedy and thought they could pass off a low end card for the price of a midrange card.

I wouldn't blame them, considering that is what ATI has done for the past two generations.

ATI did that with the X1600s, and hardly anyone bought those. Do they expect ATI to repeat that mistake this time around?

They did it with the X600s too, then they did the same with the X1600s. They obviously didn't learn the first time ;)

I'm HOPING ATI doesn't make that mistake this round, or the entire midrange of this generation will suck :) (until someone does a refresh).

The X700PRO wasn't that bad; it performs as fast as a 9800PRO and sometimes faster. But after failing with the X600PRO they did the X700PRO, and after failing with the X1600XT they did the X1650XT. I just hope there are no more of these failures.
 

Gstanfor

Banned
Oct 19, 1999
Originally posted by: fierydemise
Originally posted by: coldpower27
The 6600 GT had one full optical node over the 9800 Pro, as did the 7600 GT over the 6800 Ultra; the 8600 GTS only has a half node over the 7600 GT/7900 GS. So the transistor budget increase wasn't as large, and hence Nvidia was left with less leeway in where they could spend resources.
I'm not talking about transistor count; I'm just looking at actual performance, and the 8600 doesn't follow the trend we've seen for the past two generations. I think munky said it best:
Simply put, Nvidia got greedy and thought they could pass off a low end card for the price of a midrange card.

You can't have extra performance without extra transistors. Extra transistors mean a larger die size (unless process technology has advanced a full node, where you are generally able to have extra transistors AND extra performance at the same time). A larger die size is bad in the midrange market, as it drives prices up (unless you are in the situation ATi finds itself in at present: forced to sell a high-performance die as midrange in order to survive).
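
For anyone who wants the rough numbers behind the full-node vs. half-node argument, here is a back-of-the-envelope sketch (it assumes ideal optical scaling, i.e. transistor density goes with the inverse square of the feature size, which real processes only approximate; the process sizes are the commonly quoted ones for these chips, not taken from the posts above):

```python
# Back-of-the-envelope sketch of the full-node vs. half-node argument.
# Assumes ideal optical scaling; real processes fall short of this.

def density_gain(old_nm: float, new_nm: float) -> float:
    """Transistor budget multiplier at a constant die size."""
    return (old_nm / new_nm) ** 2

# Full-node jumps: 9800 Pro (150nm) -> 6600 GT (110nm),
#                  6800 Ultra (130nm) -> 7600 GT (90nm)
print(f"150nm -> 110nm: {density_gain(150, 110):.2f}x")  # ~1.86x
print(f"130nm -> 90nm:  {density_gain(130, 90):.2f}x")   # ~2.09x

# Half-node jump:  7600 GT (90nm) -> 8600 GTS (80nm)
print(f"90nm -> 80nm:   {density_gain(90, 80):.2f}x")    # ~1.27x
```

Real processes never scale that cleanly, but it shows why a half node leaves nowhere near the headroom of a full node once a new feature set has to fit in the same budget.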
 

schneiderguy

Lifer
Jun 26, 2006
Originally posted by: evolucion8
Originally posted by: schneiderguy
Originally posted by: munky
Originally posted by: schneiderguy
Originally posted by: munky
Simply put, Nvidia got greedy and thought they could pass off a low end card for the price of a midrange card.

I wouldn't blame them, considering that is what ATI has done for the past two generations.

ATI did that with the X1600s, and hardly anyone bought those. Do they expect ATI to repeat that mistake this time around?

They did it with the X600s too, then they did the same with the X1600s. They obviously didn't learn the first time ;)

I'm HOPING ATI doesn't make that mistake this round, or the entire midrange of this generation will suck :) (until someone does a refresh).

The X700PRO wasn't that bad; it performs as fast as a 9800PRO and sometimes faster. But after failing with the X600PRO they did the X700PRO, and after failing with the X1600XT they did the X1650XT. I just hope there are no more of these failures.

Yup, it's looking like Nvidia will do the same this generation with the 8800/8900GS in their quest for a non-crappy midrange card :p
 

coldpower27

Golden Member
Jul 18, 2004
Originally posted by: schneiderguy
Originally posted by: fierydemise
Originally posted by: schneiderguy
Originally posted by: munky
Simply put, Nvidia got greedy and thought they could pass off a low end card for the price of a midrange card.

I wouldn't blame them, considering that is what ATI has done for the past two generations.
Last gen, ATI's midrange offerings were very good, although most of them were never intended as midrange cards.

The fact that nVidia has had such good midrange cards the past few generations makes the 8600 even more surprising.

The X1600XT couldn't beat a 6800GT (Nvidia's second-highest card).
The 8600GT can't beat a 7900GT (Nvidia's second-highest card).

They are both crap. :)

The X1600 XT, funnily enough, with only 12 pixel shaders/4 TMUs, was still at least 150mm². The problem for ATI that generation was that the bulk of the transistor budget was poured into the increased feature set: Shader Model 3.0, HDR+AA, the internal ring bus.

Bear in mind the X1600 XT had only a half-node advantage over the X700 Pro, unfortunately for ATI. What ATi was able to do given the constraints of that budget was already impressive.

The 7600 GT had the luxury of spending most of its budget on cost savings and performance, as the feature set was not increased over the 6 series, or only to a negligible extent.

The 8600 GTS is showing the same issues because it is going through the same problems. What a surprise. :D

If you had an X700 Pro, the X1600 XT would be a minimal upgrade; same thing going from the 7600 GT to the 8600 GTS. Faster, but not jaw-droppingly so.

I wouldn't say they are crap, just the best ATI & Nvidia could do without breaking the bank. They won't break it unless forced to.
 

coldpower27

Golden Member
Jul 18, 2004
Originally posted by: schneiderguy
Yup, it's looking like Nvidia will do the same this generation with the 8800/8900GS in their quest for a non-crappy midrange card :p

Assuming ATi can actually beat Nvidia on the midrange for a change. ;)
 

evolucion8

Platinum Member
Jun 17, 2005
Originally posted by: Gstanfor
Originally posted by: fierydemise
Originally posted by: coldpower27
The 6600 GT had one full optical node over the 9800 Pro, as did the 7600 GT over the 6800 Ultra; the 8600 GTS only has a half node over the 7600 GT/7900 GS. So the transistor budget increase wasn't as large, and hence Nvidia was left with less leeway in where they could spend resources.
I'm not talking about transistor count; I'm just looking at actual performance, and the 8600 doesn't follow the trend we've seen for the past two generations. I think munky said it best:
Simply put, Nvidia got greedy and thought they could pass off a low end card for the price of a midrange card.

You can't have extra performance without extra transistors. Extra transistors mean a larger die size (unless process technology has advanced a full node, where you are generally able to have extra transistors AND extra performance at the same time). A larger die size is bad in the midrange market, as it drives prices up (unless you are in the situation ATi finds itself in at present: forced to sell a high-performance die as midrange in order to survive).

Not really. The Radeon X1650XT is based on the RV560, which is 80nm and not that expensive to manufacture; the X1950PRO is based on the RV570, which is also 80nm and cheap to manufacture. The only really crippled R580-based card was the X1900GT, and that's no longer the case: the new revisions, including the X1950GT, are based on the RV570. So I don't think ATi is selling expensive R580+ chips as crippled cards to survive at all.
 

coldpower27

Golden Member
Jul 18, 2004
Originally posted by: evolucion8
Originally posted by: Gstanfor
Originally posted by: fierydemise
Originally posted by: coldpower27
The 6600 GT had one full optical node over the 9800 Pro, as did the 7600 GT over the 6800 Ultra; the 8600 GTS only has a half node over the 7600 GT/7900 GS. So the transistor budget increase wasn't as large, and hence Nvidia was left with less leeway in where they could spend resources.
I'm not talking about transistor count; I'm just looking at actual performance, and the 8600 doesn't follow the trend we've seen for the past two generations. I think munky said it best:
Simply put, Nvidia got greedy and thought they could pass off a low end card for the price of a midrange card.

You can't have extra performance without extra transistors. Extra transistors mean a larger die size (unless process technology has advanced a full node, where you are generally able to have extra transistors AND extra performance at the same time). A larger die size is bad in the midrange market, as it drives prices up (unless you are in the situation ATi finds itself in at present: forced to sell a high-performance die as midrange in order to survive).

Not really. The Radeon X1650XT is based on the RV560, which is 80nm and not that expensive to manufacture; the X1950PRO is based on the RV570, which is also 80nm and cheap to manufacture. The only really crippled R580-based card was the X1900GT, and that's no longer the case: the new revisions, including the X1950GT, are based on the RV570. So I don't think ATi is selling expensive R580+ chips as crippled cards to survive at all.

The RV560 and RV570 have a die size of 230mm², still larger than Nvidia's 7900 line. Being 80nm alone isn't enough; it's the transistor count plus the optical node that you need to consider. The RV560 and RV570 both have 330 million transistors, with quite a few disabled on the RV560.

I think Gstanfor is referring to selling the Radeon X1900 XT and X1950 XT at mainstream prices, as those are based on full-fledged R580 cores with their considerable die size.
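
To put that 230mm² in perspective, here is a rough sketch using the standard first-order dies-per-wafer approximation (the ~196mm² figure for G71 is the commonly quoted one, not from this thread, and this ignores yield entirely):

```python
import math

# First-order dies-per-wafer estimate: gross wafer area over die area,
# minus an edge-loss correction term. Ignores defect density/yield.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    radius = wafer_diameter_mm / 2
    return (math.pi * radius ** 2) / die_area_mm2 - (
        math.pi * wafer_diameter_mm
    ) / math.sqrt(2 * die_area_mm2)

print(f"RV570 (~230mm2): {dies_per_wafer(230):.0f} candidate dies per 300mm wafer")
print(f"G71   (~196mm2): {dies_per_wafer(196):.0f} candidate dies per 300mm wafer")
# ~263 vs ~313: roughly 19% more chips per wafer for the smaller die,
# before yield differences are even considered.
```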
 

Gstanfor

Banned
Oct 19, 1999
I predict that we will see 65nm variants of G84 & G86. These will perform much closer to how people expected the current midrange to perform, due to extra transistors being able to be spent on extra quads, etc.

If the rumors turn out to be true and AMD's midrange and budget offerings are based on 65nm, they will likely have an advantage over Nvidia there until Nvidia catches up process-wise.
 

evolucion8

Platinum Member
Jun 17, 2005
Originally posted by: coldpower27
Originally posted by: schneiderguy
Originally posted by: fierydemise
Originally posted by: schneiderguy
Originally posted by: munky
Simply put, Nvidia got greedy and thought they could pass off a low end card for the price of a midrange card.

I wouldn't blame them, considering that is what ATI has done for the past two generations.
Last gen, ATI's midrange offerings were very good, although most of them were never intended as midrange cards.

The fact that nVidia has had such good midrange cards the past few generations makes the 8600 even more surprising.

The X1600XT couldn't beat a 6800GT (Nvidia's second-highest card).
The 8600GT can't beat a 7900GT (Nvidia's second-highest card).

They are both crap. :)

The X1600 XT, funnily enough, with only 12 pixel shaders/4 TMUs, was still at least 150mm². The problem for ATI that generation was that the bulk of the transistor budget was poured into the increased feature set: Shader Model 3.0, HDR+AA, the internal ring bus.

Bear in mind the X1600 XT had only a half-node advantage over the X700 Pro, unfortunately for ATI. What ATi was able to do given the constraints of that budget was already impressive.

The 7600 GT had the luxury of spending most of its budget on cost savings and performance, as the feature set was not increased over the 6 series, or only to a negligible extent.

The 8600 GTS is showing the same issues because it is going through the same problems. What a surprise. :D

If you had an X700 Pro, the X1600 XT would be a minimal upgrade; same thing going from the 7600 GT to the 8600 GTS. Faster, but not jaw-droppingly so.

I wouldn't say they are crap, just the best ATI & Nvidia could do without breaking the bank. They won't break it unless forced to.

I guess going from an X700PRO to an X1600XT is more of a downgrade than anything else: the X700PRO had 8 pixel pipelines, 6 vertex shaders and 8 pixel shaders, while the X1600XT had only 4 pixel pipelines, 5 vertex shaders and 12 pixel shaders.
 

coldpower27

Golden Member
Jul 18, 2004
Originally posted by: evolucion8
I guess going from an X700PRO to an X1600XT is more of a downgrade than anything else: the X700PRO had 8 pixel pipelines, 6 vertex shaders and 8 pixel shaders, while the X1600XT had only 4 pixel pipelines, 5 vertex shaders and 12 pixel shaders.

Yeah, ATI offset most of that by increasing the clockspeed. So it was a marginal upgrade, all things considered.
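
As a rough illustration of that trade (the clocks are the commonly quoted reference clocks, not from these posts, and this ignores architectural efficiency differences between the two chips):

```python
# Relative throughput of X700 Pro vs. X1600 XT: unit count * clock.
# Assumed reference clocks: X700 Pro ~425MHz, X1600 XT ~590MHz.
cards = {
    "X700 Pro": {"pixel_pipes": 8, "pixel_shaders": 8, "clock_mhz": 425},
    "X1600 XT": {"pixel_pipes": 4, "pixel_shaders": 12, "clock_mhz": 590},
}

for name, c in cards.items():
    fill = c["pixel_pipes"] * c["clock_mhz"] / 1000     # Gpixels/s
    shade = c["pixel_shaders"] * c["clock_mhz"] / 1000  # relative shader rate
    print(f"{name}: {fill:.2f} Gpix/s fill, {shade:.2f} relative shader throughput")

# X700 Pro: 3.40 Gpix/s fill, 3.40 shader rate.
# X1600 XT: 2.36 Gpix/s fill, 7.08 shader rate.
# Fillrate actually drops; only shader-heavy workloads see a clear win.
```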

 

Cookie Monster

Diamond Member
May 7, 2005
Note that the RV560 is actually an RV570 with a quad disabled. Another one of those mysteries; the standalone RV560 project could've been canned due to unexpected failures with the 80nm process.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: Aikouka
Originally posted by: yacoub
Well, if the 8600GT blew away the 7600GT the way the 8800GTS blows away the 7900GT, and the 8600GTS blew away the 7900GS the way the 8800GTS blows away the 7900GT, then yeah, the 7900GT is the previous-generation competitor for the 8800GTS. But given that the 8600GTS can barely keep pace with a 7900GS, there's a huge disparity there (because NVidia hacked down the 8600GTS too much on stream processors and bus bandwidth).

That's not how I compare them:

8800GTX : 7900GTX
8800GTS : 7900GT
8800GTS 320 : 7900GS
8600GTS : 7600GT
8600GT : 7600GS

Seems more logical to me. I don't see why a #600 part should ever be compared to a #800 part, unless the #800 part had some serious downscaling that made it on par with the #600 part.

You mean like the 6600GT vs. 6800nu ??
 

gobucks

Golden Member
Oct 22, 2004
What I don't understand is why the graphics companies keep busting their asses to come out with expensive-to-produce, super-fast $500 video cards that double in performance every year or so, when very few gamers buy them. Meanwhile, there hasn't been a good midrange card since the 6600GT. I thought that when that card came out, Nvidia finally got it: come out with a $200 card (which is the average gamer's price range) with half the horsepower of the $400/$500 card, and make it cheap to produce by basically designing a chip the exact same way as the high-end one, but with only half the transistor count (8 pipes vs. 16, 128-bit bus vs. 256, half the memory, etc.). Everybody is happy: the cards aren't cut down, you get what you pay for, the cards run nice and cool, and Nvidia doesn't cannibalize its own sales by selling $200 cut-down cards that people softmod into $500 ones.

Instead, what we get is essentially minor speed bumps on the midrange cards: a 20% bump with the 7600, another 20% bump with the 8600. Meanwhile, over the same time period, the 8800GTX is probably 3-4 times faster than a 6800 Ultra. So now there are $200 crappy midrange cards that no one wants, and the graphics companies come up with crippled versions of their flagship cards, or keep producing their old high-end cards (remember when ATI kept producing 9800 Pros to compete with the 6600GT?). This is bad for them because a) they are expensive to produce and b) people softmod them, spending hundreds less than they otherwise might. If they had, for example, made the 8600 a card with a 256-bit memory bus, half the stream processors, and clock speeds comparable to the 8800, people would eat it up for $200-250, and they would have a nice, cheap-to-make, non-blast-furnace of a video card on their hands.
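
For the sake of argument, here's what that "half the flagship" recipe looks like applied to the 8800 GTX (the flagship and 8600 GTS figures are the commonly quoted reference specs; the halved part is purely hypothetical):

```python
# "Half the flagship" recipe vs. what actually shipped.
# 8800 GTX reference specs (commonly quoted figures).
flagship = {"stream_processors": 128, "bus_width_bits": 384, "memory_mb": 768}

# Hypothetical midrange part built by halving everything, per the post above.
half_card = {spec: value // 2 for spec, value in flagship.items()}
print(half_card)  # {'stream_processors': 64, 'bus_width_bits': 192, 'memory_mb': 384}

# What the 8600 GTS actually shipped with:
actual_8600gts = {"stream_processors": 32, "bus_width_bits": 128, "memory_mb": 256}
# A quarter of the flagship's shaders and a third of its bus width --
# much deeper cuts than the 6600 GT-style "half" strategy.
```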
 

Munky

Diamond Member
Feb 5, 2005
This is what I've been considering: will Nvidia see the benefits of all the money they saved on a 32-shader card if it has poor sales due to lower-than-expected performance?

Also, just for fun, here's what the (st)Inq reported on the G84 and G86 specs in March.
Ironic how the 8600GTS turned out to have lower specs than the rumored 8500GT specs. And there's no 8600U in sight.
 

Acanthus

Lifer
Aug 28, 2001
Originally posted by: coldpower27
Originally posted by: Acanthus
Originally posted by: coldpower27
Originally posted by: ShadowOfMyself
I don't see how someone can still come up with stuff to try to justify the poor performance of these things *points at Genx87*

Seriously, who cares where the cards are SUPPOSED to fit? The fact of the matter is that at the moment an X1950 Pro is much cheaper AND equal or faster, making the 8600GTS look like crap. And the 8600GT... well, I won't even comment on it.

The X1950 Pro is more expensive for ATI to fabricate, so no, greater performance at greater cost is not a win; it's tit for tat.

We aren't talking about the price to produce; learn to read.

Stop defending them like a rabid guard dog and look at the damn numbers.

They aren't up to snuff for a "next generation product".

Hence why you seem very close-minded. Given their transistor budget and what they needed to accomplish, it just wasn't possible unless Nvidia was willing to take on the increased costs, which it wasn't.

It's not disappointing at all once you see what they were able to do with the budget they did have.

A mild increase with a half-node shrink is disappointing.