[Image: RV670 / Radeon HD 3870 card, high-res photo]


evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Demoth

I'd probably recommend that people who need to upgrade now hold out for the 8800GT at its $200 price point, probably in mid-December. ATI at this point has too many problems: high heat, slow driver implementation, a big hit using AA, and still poorer IQ compared to the 8XXX series.

That's old news, actually. ATi has improved their FSAA quality; in some situations it is better than nVidia's and in others worse: usually, anti-aliased objects far from the camera look slightly better on nVidia, while certain angles near the camera look slightly smoother on ATi hardware. nVidia still has the upper hand in AF, though the difference cannot be seen in-game with the naked eye.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: bryanW1995
I found this from last week: http://www.pureoverclock.com/story1513.html

Are they going to call it 2950 x2x?

for this you need the real FUD ... the original sewer ... um, source:

http://www.fudzilla.com/index....=view&id=3476&Itemid=1
Radeon HD 2950XT will be the top of ATI's offer in Q4, but the next quarter ATI plans to introduce its dual-chip card, simply called Radeon HD 2950X2X.


The new card will have two RV670 chips on a single PCB, and the new solution is codenamed R670. This card will have two times 320 shader units and two times 512MB of GDDR4 memory, again with a 256-bit bus.
they claim q1 ... i think we will see a single-gpu 2950XTX sooner ... shortly after the 2950XT and nvidia's response.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: apoppin
2900xt did what it HAD to do ... what AMD aimed it at ... its "purpose" was to take the upper-midrange - to compete in the GTS slot. It fulfilled that purpose admirably, as i and many other happy 2900xt owners can attest. The fact you pointed out that it is close to GTX performance without AA is just a "plus" to most owners.

You actually believe what AMD/ATi said during one of the press conferences, that a full-fledged R600 was taking on a crippled G80 right from the beginning, instead of battling the GTX for the performance crown?

A GPU based on an 80nm process, with 700 million transistors, 320 shaders (64 Vec5), a 512-bit memory bus, and a core clocked at 750MHz capable of half a teraflop of computing power was intentionally targeted at the mid-high-end 8800GTS???
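As a quick sanity check on that half-teraflop figure, here is a minimal sketch in Python (assuming the usual peak-rate convention of one multiply-add, i.e. two FLOPs, per shader per clock; the inputs are the spec numbers above):

# Peak-throughput check for the quoted R600 spec.
shaders = 320          # 64 Vec5 units x 5 ALUs each
flops_per_clock = 2    # one multiply-add counted as two operations
core_clock_ghz = 0.75  # 750MHz core clock

peak_gflops = shaders * flops_per_clock * core_clock_ghz
print(f"{peak_gflops:.0f} GFLOPS")  # 480 GFLOPS -- roughly half a teraflop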

Now, let's take a look at history. Since when has ATi NOT battled nVIDIA (and other IHVs) for the performance crown? Never, until now. Why? They always had a very competitive product to contend for that title, and at one stage ATi's flagship was so much better than the competition that it made the competition resort to humiliating acts such as cheating in certain benchmarks.

After years of hype, the R600 hit, and it only competed against a crippled G80. Now, are you saying that ATi intentionally aimed at that market segment? Think about the margins. Even on an 80nm process the R600 is no small chip. Add in a complex PCB to support the 512-bit memory interface, 16 memory chips, a full 2-3 heatpipe copper HSF, etc., and with a price tag of $399, is there any kind of return on such a product?

The thing is that the R600 could have battled the GTX. What munky is trying to say is that shader-based AA resolve was one of the factors that kept the R600 from being a contender against the full-fledged G80. Even an AA performance hit merely on par with the R580's would have made the 2900XT a genuinely competitive product, and they could have priced it accordingly, i.e. $499. Were they forced to price it at $399? Absolutely. However, does this mean the end of the world? No, and we are going to see ATi come back with a vengeance with the RV670 and R680!

However, if you still can't see the point, how about this one:

The FX5800 Ultra was targeted at the 9700 Pro. The 2900XT was targeted at the 8800GTS.
Now let me go back in time and change history:
The FX5800 Ultra was targeted at the 9700 non-Pro with an MSRP of $299.

You see the difference? ATi was smart. nVIDIA, on the other hand... well, we all know how that turned out.



 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
what i believe - or you believe - makes no difference. What ATi intended is not what AMD did.

AMD set the 2900xt as a competitor to the 8800GTS - not ATi against the GTX ... a clue is its pricing. Who knows what kind of annoying DustBuster ATi might have set against the GTX :p

and it's no longer between nvidia and ATi ... it's all AMD ... and what ATi did in the past shouldn't matter to AMD's future vision or plans.
 

shabby

Diamond Member
Oct 9, 1999
5,779
40
91
Cookie Monster: check apoppin's sig; you'll find the answer there as to why he's not "understanding" you.

apoppin: ATI didn't set out to build an awesome mid-high-end card from the beginning; they were caught off guard and had to settle for second place and a price drop on their top-end card. If you can't see that, then lay off the crack.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: shabby
Cookie Monster: check apoppin's sig; you'll find the answer there as to why he's not "understanding" you.

apoppin: ATI didn't set out to build an awesome mid-high-end card from the beginning; they were caught off guard and had to settle for second place and a price drop on their top-end card. If you can't see that, then lay off the crack.

the 2900xt in my rig has nothing to do with it ... i had a GTS640-OC in my rig for a while also - they are almost equally performing cards, with individual strengths and weaknesses that balance each other. Several of us actually did the work and benchmarked them ... i did the 2900xt vs. GTS640-OC, each in Vista vs. XP. i kept the 2900xt; they are equal performers, BUT i got my 2900xt for $320 with the Orange Box bundled in. The GTS was $70 more in June, with no bundle. i would have been just as happy with the GTS if i had kept it instead [if it was $70 cheaper and came with a game bundle :p].

Check out my post #5 for the benches:

In House HD2900XT vs. 8800GTS 640

it's not ATi vs Nvidia ... yes, ATi would have gone after the GTX ... with some OC'd 12" firebreather coupled to a leafblower to keep it cool. But ATi did not launch the 2900xt ... AMD did

i believe AMD completely reworked r600 ... that is why there were so damn many respins. AMD - not ATi - made the wise decision to go after the midrange GTS with a higher-yielding money-making GPU.

It is just like what AMD did with Barcelona ... using their strengths in the server market first, then going for high-yield, faster desktop CPUs later ... just like they will go after the [new] GTX with their 2950xt. The 2950GT supplants the 2900xt in going after the 8800GTS ... logical

Can't you sense their new pattern?
 

thilanliyan

Lifer
Jun 21, 2005
11,822
2,018
126
Originally posted by: shabby
Cookie Monster: check apoppin's sig; you'll find the answer there as to why he's not "understanding" you.

apoppin: ATI didn't set out to build an awesome mid-high-end card from the beginning; they were caught off guard and had to settle for second place and a price drop on their top-end card. If you can't see that, then lay off the crack.

While I agree with your main point, what video card a person happens to have at the moment doesn't mean he's a fanboy of any company.

I myself bought the GTS when it came out because there was really no option other than the GTX... but I'm by no means an nVidia fanboy.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: thilan29
Originally posted by: shabby
Cookie Monster: check apoppin's sig; you'll find the answer there as to why he's not "understanding" you.

apoppin: ATI didn't set out to build an awesome mid-high-end card from the beginning; they were caught off guard and had to settle for second place and a price drop on their top-end card. If you can't see that, then lay off the crack.

While I agree with your main point, what video card a person happens to have at the moment doesn't mean he's a fanboy of any company.

I myself bought the GTS when it came out because there was really no option other than the GTX... but I'm by no means an nVidia fanboy.

if i - along with several others - hadn't done a long and extensive review comparing the cards to each other, i might not feel in as good a position to comment on them both. i think all of us reviewers here - along with the bulk of unbiased pro HW reviewers - concluded that the 2900xt is competitive with the 8800GTS. If the situation had been *reversed* back in June - and the 8800GTS-OC was $320 with a game bundled - i'd have one in my rig right now. :p

i believe AMD *intended* to aim the 2900xt exactly at the 8800GTS ... they priced it identically ... and market forces seem to have concluded the same, as neither card has dropped in price since the end of May.

i will agree that ATi - originally, in their design of r600 - planned to take on the Ultra ... but clearer-visioned AMD redirected the design into a midrange challenge to nvidia. R600 cannot be judged "AA-flawed" - yet - as the 2950 series is built on it and we haven't seen anything yet.
 

thilanliyan

Lifer
Jun 21, 2005
11,822
2,018
126
Personally, I think ATI would have liked to compete with the GTX/Ultra, but they just couldn't. Nothing wrong with how things turned out, though, since they priced it fairly well. If it had been at GTX price levels and couldn't compete, that would have been a problem, but other than the lateness, I think the 2900XT turned out to be a decent card.

If the 2950XT turns in even better performance for $250... they definitely have a winner on their hands, and I'll definitely buy one... or maybe two :).

I had an X1800XL, an X800GTO^2, and an X800XL before this GTS (discounting the 7900GTO, which I only had for a month), and considering I've had so many driver troubles from the very beginning (I bought the card on launch day), I'm itching to get back to an ATI card; I never had anywhere near the number of issues with ATI that I've had with nVidia.
 

Demoth

Senior member
Apr 1, 2005
228
0
0
Originally posted by: bryanW1995
Originally posted by: Demoth
I'd probably recommend that people who need to upgrade now hold out for the 8800GT at its $200 price point, probably in mid-December. ATI at this point has too many problems: high heat, slow driver implementation, a big hit using AA, and still poorer IQ compared to the 8XXX series.

You are way behind, bro. The 8800GT will be out on Oct 29, the 2950 series in mid-November. The 2950 will be on 55nm vs. 65nm for the 8800GT, so guess who's going to have higher heat? As to driver implementation, why do you think nVidia's drivers are better than ATI's? The big hit using AA is highly debatable (see the munky vs. apoppin argument), but it appears ATI has addressed this in the RV670. I'm not sure what you mean about IQ; must be because I'm a stooooopid ATI guy :)

-------------------------------------------------------------

The 8800GT will be released at the end of October, but it will sell about as far over MSRP as the 2900 Pro is selling at the time of this post.

http://www.newegg.com/Product/...x?Item=N82E16814102707
(The only GDDR4 model at Newegg)

It would be silly to go for that card now, given this one:

http://www.newegg.com/Product/...x?Item=N82E16814102095

It will take several weeks for the price to stabilize as vendors start getting stock in from each manufacturer. In the case of the 8800GT, Xmas and Crysis should greatly increase supply, as large demand is anticipated. This should make prices drop a bit faster.

As for the 2950 series, I hope it solves a few of my current issues with ATI products. My recommendation is based on what is known as it stands now, and right now I would not go ATI, based on heat alone. Rumors and hype are always a crapshoot in this business. If the 2950 Pro is as good as or better than the 8800GT once prices are at least close to MSRP, then I will revise my opinion.

IQ is image quality, something more people should be talking about in their reviews rather than simply max FPS. I have seen plenty of rigs running both the higher-end 8800s and 2900s at LAN parties. Subjectively, on the same monitors, the nVidia cards at maxed settings looked slightly smoother all around.

The driver implementation problem is not really ATI's fault; it lies with game devs. This issue won't affect most people. However, if you happen to be one of the unlucky ones who isn't playing a top-10 title and there is a driver problem with your new card, it can be pretty frustrating to wait months to have the issue resolved.

I am by no means anti-ATI. My all-time favorite card is the 9700 Pro; I definitely got my money's worth out of 3 years of solid gameplay. I know ATI will have something better than nVidia in the near future; I just feel that right now they are definitely a step behind.

Rumors and hype look promising right now for ATI, but until we have hard numbers from anandtech, Tom's, etc., it's just a guessing game as to when to pull the trigger and make a purchase.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
My recommendation is based on what is known as it stands now, and right now I would not go ATI, based on heat alone.

i would only cite power consumption if i really wanted to nitpick - and the difference is evidently way less than a dollar a month ...
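To put that claim in numbers, here is a minimal sketch; the wattage delta, daily hours, and electricity rate are illustrative assumptions, not measurements:

# Rough monthly cost of one card's extra power draw vs. another's.
extra_watts = 30     # assumed average draw difference between the cards
hours_per_day = 4    # assumed daily gaming time
usd_per_kwh = 0.10   # assumed electricity rate

kwh_per_month = extra_watts / 1000 * hours_per_day * 30
print(f"${kwh_per_month * usd_per_kwh:.2f}/month")  # ~$0.36 -- well under a dollar

Doubling any one of those assumptions still keeps the figure well under a dollar a month.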

Your possible 'excess heat' makes ZERO difference, as my case is just as cool with a 2900xt as with a 640GTS ... and i would say that at idle, the 2900xt is possibly cooler than the GTS ... since more than 90% of the time it is "idle" ... there may actually be less total heat output into your room :p
[which IS an advantage in Winter]
:Q

no difference in IQ ... sorry ... i guarantee you couldn't pick nvidia gfx from AMD in a side-by-side test

:D

they are competitive cards, with maybe a slight overall advantage going to the GTS because you need a beefier PSU for the XT ... that is the only practical difference i could find
 

Demoth

Senior member
Apr 1, 2005
228
0
0
Originally posted by: apoppin
My recommendation is based on what is known as it stands now, and right now I would not go ATI, based on heat alone.

i would only cite power consumption if i really wanted to nitpick - and the difference is evidently way less than a dollar a month ...

Your possible 'excess heat' makes ZERO difference, as my case is just as cool with a 2900xt as with a 640GTS ... and i would say that at idle, the 2900xt is possibly cooler than the GTS ... since more than 90% of the time it is "idle" ... there may actually be less total heat output into your room :p
[which IS an advantage in Winter]
:Q

no difference in IQ ... sorry ... i guarantee you couldn't pick nvidia gfx from AMD in a side-by-side test

:D

they are competitive cards, with maybe a slight overall advantage going to the GTS because you need a beefier PSU for the XT ... that is the only practical difference i could find

Heat output is a concern for me. Past experience has proven that less heat makes for a more stable overclocked card, with less artifacting during a long gaming session. The last thing you want during a long MMO raid is your video going screwy.

Heat and power for ATI go hand in hand, and they are also a factor when considering a future CrossFire option. I wouldn't have a problem going with two 8800GTs in SLI, but unless the 2950 Pro fixes the issue, I would not consider the 2900 XT/Pro in CrossFire.

I don't care how well they perform at idle; it's how they hold up after gaming for 4 hours at high settings.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Demoth

Heat output is a concern for me. Past experience has proven that less heat makes for a more stable overclocked card, with less artifacting during a long gaming session. The last thing you want during a long MMO raid is your video going screwy.

Heat and power for ATI go hand in hand, and they are also a factor when considering a future CrossFire option. I wouldn't have a problem going with two 8800GTs in SLI, but unless the 2950 Pro fixes the issue, I would not consider the 2900 XT/Pro in CrossFire.

I don't care how well they perform at idle; it's how they hold up after gaming for 4 hours at high settings.
well, there is no artifacting with my card - EVAR ... it goes back to VT if that happens while i am the owner ... lifetime warranty
--i demand absolute stability in my rig also, and i don't feel any less secure or stable with the 2900xt - i never OC my GPUs. My PS is quite up to the task of even two of 'em.


the point i am trying to make is that power issues should be your only concern ... i can't see a problem with 'heat' in any GPU with a properly designed cooler that vents the exhaust outside of the case.
What, you are worried about warming your toes an extra degree? - it doesn't change your case temps a bit - except perhaps to lower them when the fan spins faster.
:confused:


Every single new DX9 game i play is maxed out at in-game settings with 4xAA/16xAF at 16x10 ... and i still play 8-hour SP sessions - at times ... recently with Two Worlds. :p
--Why would i settle for any less than a GTS owner?

 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: apoppin


no difference in IQ ... sorry ... i guarantee you couldn't pick nvidia gfx from AMD in a side-by-side test

:D

That's true. Even though the 8800 series has a near-perfect AF implementation, in real gaming scenarios it is just impossible to pinpoint a difference between it and the Radeon HD, and FSAA quality is comparable on both: in some scenarios, edges far from the camera look slightly smoother on the 8800, while certain angles near the camera look slightly better on the Radeon HD. The power consumption difference is barely an issue; it is nothing huge like it was between the NV40 and the R420, or the R580 and the G71.
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
Updated again with new news!

Now, IF they change the name and go with the HD 3000 series... could that mean performance is higher than expected?
 

T2k

Golden Member
Feb 24, 2004
1,664
5
0
Not getting something... I read somewhere (the Inq?) that this RV670 is supposed to be a 256-bit membus design. Since we know the current $250 2900 Pro can run at 860MHz/2GHz with a 512-bit membus and the same number of SPs as the XT, why would anyone go with an RV670 on a 256-bit membus clocked at the same level? To catch up with the current crop, this 2950/3800 would have to run its memory around 4GHz, which is out of the question.
In other words, there must be more to it, i.e. changes in the core...
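The arithmetic behind that concern, as a minimal sketch (bus widths and clocks are the figures quoted above; data rates are effective, i.e. GDDR transfers per second):

# Memory bandwidth = bus width in bytes x effective data rate.
def bandwidth_gb_s(bus_bits: int, effective_gt_s: float) -> float:
    return bus_bits / 8 * effective_gt_s

print(bandwidth_gb_s(512, 2.0))  # 2900 Pro/XT: 128 GB/s
print(bandwidth_gb_s(256, 2.0))  # RV670 at the same effective 2GHz: 64 GB/s
print(bandwidth_gb_s(256, 4.0))  # effective rate a 256-bit bus needs to match: 128 GB/s

# A 512-bit bus is also why the R600 board carries 16 memory chips:
print(512 // 32)  # 16 chips at 32 bits each

Halving the bus at unchanged clocks does halve raw bandwidth; whether the R600 ever actually used that headroom is the question munky takes up below.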
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
But they have said over and over recently, in quite a few articles, that they really don't need anything more than 256-bit as of yet. Whether that's true or not, I don't know.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: T2k
Not getting something... I read somewhere (the Inq?) that this RV670 is supposed to be a 256-bit membus design. Since we know the current $250 2900 Pro can run at 860MHz/2GHz with a 512-bit membus and the same number of SPs as the XT, why would anyone go with an RV670 on a 256-bit membus clocked at the same level? To catch up with the current crop, this 2950/3800 would have to run its memory around 4GHz, which is out of the question.
In other words, there must be more to it, i.e. changes in the core...

The rv670 is supposed to be a mainstream performance card, not high end, hence the V. A 256-bit bus will keep the costs down, and besides, there aren't many situations right now where the r600 actually benefits from a 512-bit bus. With higher clocks, improved AA/AF performance and more polished drivers, the rv670 would perform significantly better than the r600 even with a 256-bit bus.
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
Originally posted by: munky
Originally posted by: T2k
Not getting something... I read somewhere (the Inq?) that this RV670 is supposed to be a 256-bit membus design. Since we know the current $250 2900 Pro can run at 860MHz/2GHz with a 512-bit membus and the same number of SPs as the XT, why would anyone go with an RV670 on a 256-bit membus clocked at the same level? To catch up with the current crop, this 2950/3800 would have to run its memory around 4GHz, which is out of the question.
In other words, there must be more to it, i.e. changes in the core...

The rv670 is supposed to be a mainstream performance card, not high end, hence the V. A 256-bit bus will keep the costs down, and besides, there aren't many situations right now where the r600 actually benefits from a 512-bit bus. With higher clocks, improved AA/AF performance and more polished drivers, the rv670 would perform significantly better than the r600 even with a 256-bit bus.

Thanks munky :)

I was looking for that, but apparently you found it before me. That is the exact statement I was looking for. And actually, a few other sites I remember reading have said the same thing; they just reworded it a little differently.
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
And I just noticed - I wonder why...

The RV670 thread has a 2-star rating?

The 8800GT thread has a 4-star rating?