-=[OFFICIAL HD2600 and 2400 Review Thread]=-

dug777

Lifer
Oct 13, 2004
24,778
4
0
Those cards make my brain hurt :|

I'd like to tear those responsible for the cards (from both companies) a new cloaca, I would :|
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Yeah, the initial numbers... don't look so good. But the MSRPs of these cards are pretty cheap, I think. And I thought the 8600GTS was underwhelming.. :(
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
The 2600s will undoubtedly see drivers increase performance, but still, their performance is pretty underwhelming. ATI's mid-range cards have been underwhelming since the 9500 Pro days though. Typically, they figure this out several months down the line with an XL or GT version of their high-end product.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Company of Heroes under DX10:

http://www.beyond3d.com/content/reviews/28/8

HD 2600XT is 2x or more powerful than the 8600GTS under CoH in DX10, and the HD 2900XT is able to beat the 8800GTX as well.

Personally I believe ATI has cut down the HD 2600 far too much in terms of texture power. The HD 2600 has a clear advantage in shader performance, as the 8600 is cut down much more in this area. However, the 8600 has TWICE the number of TMUs and ROPs as the 2600XT and doesn't run at much lower a clockspeed (675MHz vs 800MHz). The 8600GTS has 69% more texture processing power than the HD 2600XT. For comparison, the 8800GTS has less than 1% more texture power than the HD 2900XT, and even the 8800GTX has only 54% more texture power than the 2900XT. Of course, the 4 ROPs are going to kill performance too.
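Those percentages can be sanity-checked with simple fillrate arithmetic. A rough sketch; the TMU counts and core clocks below are the commonly cited retail specs for these cards, not official figures, so treat them as assumptions (the 8800 GTX figure comes out at 55% by this math, within rounding of the post's 54%):

```python
# Rough texture throughput comparison: MTexels/s = TMUs * core clock (MHz).
# TMU counts and clocks are commonly cited specs, assumed here, not official.
cards = {
    "8600 GTS":   (16, 675),  # (TMUs, core clock MHz)
    "HD 2600 XT": (8,  800),
    "8800 GTS":   (24, 500),
    "HD 2900 XT": (16, 742),
    "8800 GTX":   (32, 575),
}

def fillrate(name):
    """Theoretical texture fillrate in MTexels/s."""
    tmus, clock = cards[name]
    return tmus * clock

def advantage(a, b):
    """Percent more texture throughput card a has over card b."""
    return (fillrate(a) / fillrate(b) - 1) * 100

for a, b in [("8600 GTS", "HD 2600 XT"),
             ("8800 GTS", "HD 2900 XT"),
             ("8800 GTX", "HD 2900 XT")]:
    print(f"{a} vs {b}: +{advantage(a, b):.0f}% texture power")
```

The shape of the problem is visible immediately: the 8600 GTS has double the TMUs at only a modestly lower clock, so the 2600 XT's clockspeed edge can't close the gap.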

This is why the HD 2600XT suffers so much in a lot of games. Games are becoming more shader-heavy, but they still need texture power as well. ATI made this mistake last-gen with the X1600XT only having 4 TMUs and getting crushed by the 7600 series. Sadly, they've made the same mistake again. I am confident, however, that the HD 2600XT is going to be a much better performer than the 8600GT/GTS in DX10 games such as Crysis.

Regardless, the good news is, the HD 2400 and 2600 are going to be ridiculously cheap (HD 2600XT at around $100) and will be great inside a setup that doesn't demand the best gaming performance. If you do game, I think the HD 2600XT is going to be a good choice... it can't beat the 8600GTS in current games, but in future games I think you'll be happy with it. Also, ATI has finally released a card with LOWER POWER CONSUMPTION than nVidia. If you want a nice quiet, low power system, the HD 2400 and 2600 are the best choices.
 

TejTrescent

Member
Apr 20, 2006
41
0
0
I'm sorry, if the HD2600XT is roughly $120-$150 like Anandtech is saying, specifically closer to the lower end of that range, I'm sold. Its performance is close to the 8600GTS in most games without AA (which is fine), and usually not much different from the 8600GT when things do go badly for it. This, of course, is at 1280x1024, which is what my desktop's monitor runs. Only without that crazy price on the 86xx lineup. EDIT: Which isn't as bad as it was any more (guess the supplies improved since I last looked, go figure, ahah), but still.

Works for me, I'm sold.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
http://www.firingsquad.com/har...0_performance_preview/

"
In more modern games like Lost Planet and Company of Heroes, the Radeon HD 2600 XT holds a decisive lead over the GeForce 8600 GT. In fact, it even outperforms the GeForce 8600 GTS in these titles. We know that NVIDIA is hard at work on a driver that's supposed to bring performance improvements in Lost Planet, but by how much, we just don't know yet. Meanwhile, our testing with another newer game, Oblivion, indicates that both the GeForce 8600 GT and Radeon HD 2600 XT are evenly matched. Both cards compete very closely with one another in both outdoors and foliage testing. The Radeon HD 2600 XT enjoys a performance advantage in Far Cry HDR as well.

Based on all this, what's our recommendation? Clearly at the high end of the mainstream segment NVIDIA's GeForce 8600 GTS delivers the best all-round performance, but at the $150 price point where the Radeon HD 2600 XT competes, it's really going to come down to what games you play, and just what kinds of performance optimizations both AMD and NVIDIA have up their sleeves. The Radeon HD 2600 XT doesn't quite deliver a knockout blow to the GeForce 8600 GT due to its lack of performance in older games and in FEAR. STALKER is another new title that seems to favor the GeForce 8600 GT at the moment.

We haven't tested DX10 games just yet, but in all honesty we're not convinced that any of these mainstream cards have the horsepower to play any of the current DX10 titles on the market with adequate performance. We will certainly be looking at the DX10 aspect next though, and will report back our findings shortly.

If you were expecting a GeForce 8600 GTS killer, you're likely quite disappointed right now - AMD's not even competing with the GTS at this point. But it looks like they've put together a competent competitor to the GeForce 8600 GT and perhaps the 8500 GT and 8400 GS (more on those later). We still need to examine DX10 and Vista performance in general though before we can come to any definite conclusions, and we wouldn't be surprised if upcoming drivers from both AMD and NVIDIA could swing a few battles in the direction of either camp."
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
Originally posted by: sliderule
These cards are just like nvidia's, crap.

I have to agree with this gentleman here. I think the problem may be that first-generation DX10 implementations are a bit tough to get right for both companies. So I would just wait until second-generation DX10 parts before going DX10 at all.
 

ShreddedWheat

Senior member
Apr 3, 2006
386
0
0
How much of a difference would we see if these cards were using a 256-bit bus instead of 128-bit?

Very disappointing on both sides, ATi and Nvidia. Guess I will be hanging on to my X800XT a while longer. Crazy that there is such a large gap between low end and high end with nothing really in the middle.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
Extelleron - while that's technically true, my friend, the objective performance is still not impressive: 20fps vs. 10fps for the GTS at 10x7.

In general, I couldn't be more disappointed with this entire round of midrange parts. The only thing I can think is that both nVidia and ATI decided that they were losing too much money in the ~ $200 price range, and just have given up on it this time around.

It's their own fault, truthfully, if they feel they've lost too much money selling cut-down derivatives of $400 parts at that price point. How about, gosh, giving the midrange parts some better legs?

In some sense, I think they never should have pushed the price of the highest-end to $500+, if they couldn't profitably fill the gaps between those cards and the 'real' midrange of $150.


I would say that I was happy to sit this round out, but my x850xt just doesn't do what I wanted it to.


If nVidia left the door ajar with the 8600s, ATI just tripped on their own shoelaces and fell face-first into the wall (missing the door completely).

IF (and it is a big, big IF), the HD2000 series parts have better legs in DX10 than their counterparts, maybe they'd turn out to be a decent buy. Given that the 2400/2600 series are really unlikely to give decent DX10 performance anyway, I can't see that being relevant here.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Suddenly the 8600's don't look so bad. Reading the Anandtech review, they beat the snot outta these "new" cards.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Wreckage
Suddenly the 8600's don't look so bad. Reading the Anandtech review, they beat the snot outta these "new" cards.

The 8600's still look bad compared to the x1950pro and 7900gs :confused:
 

rmed64

Senior member
Feb 4, 2005
237
0
0
Originally posted by: Extelleron
Company of Heroes under DX10:

http://www.beyond3d.com/content/reviews/28/8

HD 2600XT is 2x or more powerful than the 8600GTS under CoH in DX10, and the HD 2900XT is able to beat the 8800GTX as well.

Personally I believe ATI has cut down the HD 2600 far too much in terms of texture power. The HD 2600 has a clear advantage in shader performance, as the 8600 is cut down much more in this area. However, the 8600 has TWICE the number of TMUs and ROPs as the 2600XT and doesn't run at much lower a clockspeed (675MHz vs 800MHz). ->This is not exactly true. The architecture of the 2600 can do 120 shader ops in theory, but it is really a 24-shader card that can do 5 ops per unit if the "line" is filled. Most games don't fill that shader "line". The 8600GTS has 69% more texture processing power than the HD 2600XT. For comparison, the 8800GTS has less than 1% more texture power than the HD 2900XT, and even the 8800GTX has only 54% more texture power than the 2900XT. Of course, the 4 ROPs are going to kill performance too. ->Yes, this is probably the worst part of it all. Why they decide to cripple their mid-range in almost EVERY generation is still beyond me.

This is why the HD 2600XT suffers so much in a lot of games. Games are becoming more shader-heavy, but they still need texture power as well. ATI made this mistake last-gen with the X1600XT only having 4 TMUs and getting crushed by the 7600 series. Sadly, they've made the same mistake again. I am confident, however, that the HD 2600XT is going to be a much better performer than the 8600GT/GTS in DX10 games such as Crysis.

Regardless, the good news is, the HD 2400 and 2600 are going to be ridiculously cheap (HD 2600XT at around $100) and will be great inside a setup that doesn't demand the best gaming performance. If you do game, I think the HD 2600XT is going to be a good choice... it can't beat the 8600GTS in current games, but in future games I think you'll be happy with it. ->If it sucks in performance now, it will suck in performance in the future. Even if it outperforms the 8600GTS a year from now in some games... will it really matter if you are getting 20 fps over the other guy's 17? Also, ATI has finally released a card with LOWER POWER CONSUMPTION than nVidia. If you want a nice quiet, low power system, the HD 2400 and 2600 are the best choices.
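The shader "line" being argued about here is the R600 family's 5-wide VLIW layout: the "120 stream processors" are really 24 five-slot units, so per-clock throughput depends on how many of the five slots the shader compiler can fill. A minimal sketch of that arithmetic (the 3.5-slot occupancy figure is purely an illustrative assumption, not a measured number):

```python
# The HD 2600's "120 stream processors" are 24 VLIW units, 5 scalar slots each.
# Usable throughput per clock scales with how many slots the compiler fills.
VLIW_UNITS = 24
SLOTS_PER_UNIT = 5

def effective_alus(avg_slots_filled):
    """Average number of scalar ALUs doing useful work per clock."""
    assert 0 <= avg_slots_filled <= SLOTS_PER_UNIT
    return VLIW_UNITS * avg_slots_filled

print(effective_alus(SLOTS_PER_UNIT))  # best case: every slot filled
print(effective_alus(3.5))             # illustrative partial occupancy
```

In the best case all 120 ALUs are busy; with partial occupancy the card behaves more like rmed64's "24-shader card", which is why the headline spec overstates real-world shader throughput.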

 

Jassi

Diamond Member
Sep 8, 2004
3,296
0
0
I'm glad I jumped on a 7900GS. I'm upgrading from a Xpress 200M so I don't think I'll mind the loss of DX10 capability ;)
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: dreddfunk
Extelleron - while that's technically true, my friend, the objective performance is still not impressive: 20fps vs. 10fps for the GTS at 10x7.

In general, I couldn't be more disappointed with this entire round of midrange parts. The only thing I can think is that both nVidia and ATI decided that they were losing too much money in the ~ $200 price range, and just have given up on it this time around.

It's their own fault, truthfully, if they feel they've lost too much money selling cut-down derivatives of $400 parts at that price point. How about, gosh, giving the midrange parts some better legs?

In some sense, I think they never should have pushed the price of the highest-end to $500+, if they couldn't profitably fill the gaps between those cards and the 'real' midrange of $150.


I would say that I was happy to sit this round out, but my x850xt just doesn't do what I wanted it to.


If nVidia left the door ajar with the 8600s, ATI just tripped on their own shoelaces and fell face-first into the wall (missing the door completely).

IF (and it is a big, big IF), the HD2000 series parts have better legs in DX10 than their counterparts, maybe they'd turn out to be a decent buy. Given that the 2400/2600 series are really unlikely to give decent DX10 performance anyway, I can't see that being relevant here.

Yeah I think the spread is great enough now that they should start designing 4 cores for 1 generation of video card instead of 3.

We need a fourth core at the "performance-mainstream" level. Something that will fit in the $200-$300 range give or take.

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I'm disappointed. ATi had the opportunity to dominate the mid-range but they squandered it.

For ATi's sake I hope OEMs will buy the cards.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
Cold - I hadn't really thought of it in those terms, but I think you're spot on. It's not that I think the 8800GTS 320MB isn't a great part; it's just clear that its price can't come down much further on a sustainable basis. It might drop a bit more, but probably only just before being discontinued as stock is hustled out the door. I sincerely doubt they could sell it for much less on an ongoing basis. Who knows the real answer to that question, however, except the financial analysts, marketers and accountants crunching the numbers...

BFG - yeah, I'm terribly disappointed in all of this. I kind of figured that great DX10 performance wasn't going to be in the cards for the first generation of mid-range DX10 cards, but with such utterly lackluster performance in DX9 titles coupled with near-certain 'oblivion' (forgive the pun) in DX10 titles, the only two valid reasons for purchasing them appear to be Vista compatibility and video decoding features.
 

bigsnyder

Golden Member
Nov 4, 2004
1,568
2
81
Which review is better suited for those looking for a new HTPC card? For gaming all these cards suck, but for HTPC there is a lot of potential.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: dreddfunk
Cold - I hadn't really thought of it in those terms, but I think you're spot on. It's not that I think the 8800GTS 320MB isn't a great part; it's just clear that its price can't come down much further on a sustainable basis. It might drop a bit more, but probably only just before being discontinued as stock is hustled out the door. I sincerely doubt they could sell it for much less on an ongoing basis. Who knows the real answer to that question, however, except the financial analysts, marketers and accountants crunching the numbers...

Oh, I also think the 8800 GTS 320 is a great part, but because of its gargantuan 480mm² die and its 320-bit PCB, it's unlikely we will see it much cheaper than it is now unless Nvidia wants to eat the cost again.

It's sort of similar to the situation with the 7800 GT; I don't think that card fell much below $250 in the best case. That story had a happy ending, though, as the 7900GS/GT were cheaper to produce and that was sustainable for a while.

A native 80nm (or preferably 65nm) part with 96 shader processors clocked high, a 256-bit PCB, and 2GHz GDDR3 would certainly do the trick at this level and could displace the 8800 GTS 320 entirely, leaving a larger profit margin for Nvidia. This disabled-G80-die approach just isn't going to work in the long run.

 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
What I find weird is that ATI finally "fixed" the X16xx series with the X1650XT (made it what it should have been all along) and then went and pulled the same crap with the HD 26xx. Here's hoping that three months from now they add another set of ROPs and TMUs... it seems like that's all it really needs. The 128-bit memory interface with RAM that fast likely isn't the limiting factor now. Heck, they have more bandwidth than the X800XT generation now.
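For what it's worth, that bandwidth claim can be checked with the standard formula (bus width in bytes times effective memory clock). The clocks below are the commonly cited retail specs for the GDDR4 HD 2600 XT and the X800 XT and are an assumption on my part; by this math the two come out roughly in the same ballpark:

```python
# Memory bandwidth in GB/s = (bus width in bits / 8) * effective clock (MHz) / 1000.
# Clocks are commonly cited retail specs, assumed here, not official figures.
def bandwidth_gbs(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz / 1000

hd2600xt_gddr4 = bandwidth_gbs(128, 2200)  # 128-bit @ 2.2GHz effective GDDR4
x800xt = bandwidth_gbs(256, 1120)          # 256-bit @ 1.12GHz effective GDDR3

print(f"HD 2600 XT (GDDR4): {hd2600xt_gddr4:.1f} GB/s")
print(f"X800 XT:            {x800xt:.1f} GB/s")
```

Either way, the narrow 128-bit bus is compensated by very fast memory, which supports the point that ROPs/TMUs, not bandwidth, are the bottleneck.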

Nat
 

daveybrat

Elite Member
Super Moderator
Jan 31, 2000
5,807
1,021
126
All I can say is I, too, am happy I picked up a 7900GS and didn't waste time waiting for these new cards.