9800GX2-->"Up to a 90% performance increase over the...


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nRollo
Originally posted by: apoppin

really .. my original guess was right .. sucks to be right, sometimes
:Q

nvidia better have some *solid* performance .. $600 hints to me to probably wait for r700 ... i am not "suffering" .. but i always want [bang-for-buck] better ... let's see, it would now cost me $600-$350=$250 ... i need a +50% potential performance increase to justify my curiosity

I think the only valid question here is whether or not a stock clocked 9800GX2 significantly outperforms an OCd 3870X2. If it does, then things like "scaling" and "OCing" become pretty much irrelevant- the card justifies its price by being the highest performing single-card solution.

That is what *i* just said :p ... i believe you are repeating my analysis' conclusion almost identically - copy cat
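His bang-for-buck test above reduces to simple arithmetic; a minimal sketch in Python, where the $600 price, the $350 he figures his current card is worth, and the +50% threshold all come from the post above and everything else is illustrative:

def upgrade_cost(new_price, current_card_value):
    # Marginal cost of stepping up: what the new card costs
    # beyond what the current card is worth to you.
    return new_price - current_card_value

def worth_it(expected_gain_pct, required_gain_pct=50.0):
    # His stated rule of thumb: at least +50% performance
    # before the extra spend is justified.
    return expected_gain_pct >= required_gain_pct

delta = upgrade_cost(600, 350)
print(delta)          # 250 dollars out of pocket
print(worth_it(40))   # False: +40% doesn't clear the bar
print(worth_it(55))   # True: +55% does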
Originally posted by: apoppin

same thing happens all the time with nvidia, imo ... they make a top card and then *Stick* you in the ass .. er, pocketbook
--that is the *privilege* you pay for saying i have the "fastest"
- is it worth it? ...
I guess you feel the same way about AMD then if the R700 launches when it's supposed to, a few months after the 3870X2? And you must have felt the same when the X1900 launched 3 months after the X1800 and blew it away.
Your argument is specious- a person pays for what is in the market the day they buy, not based on what upcoming parts will be or will cost. Always been that way, always will be that way.

Why, yes i did and i think i said SO .. *especially* about the x1800 series and the 7900GTX ... i think we could see it 'coming' - i warned people. IF nvidia overprices it in OUR eyes ... never mind what *you* or even your company thinks ... sales may not meet expectations ;)

Originally posted by: apoppin

i'd say ... 'yes', 'no', AND 'sometimes' - it didn't turn out so well for 7900GTX owners ... and i hate to have to agonize over decisions that AMD makes SO easy and affordable for me ... they are the "cheap alternative" - imo
I'd say NVIDIA competes very well with AMD on a price/performance scale. 9600GTs offer 95% of a 3870's performance and cost a little less, 8800GTs offer 115% of the performance and cost a little more. 3870X2s offer sometimes more, sometimes less performance than an 8800GTX, and cost more. 9800GX2s won't have any competition; as such, they'll cost more. Getting the absolute best always commands a premium- if AMD had it, you can bet they'd be charging for it.

i said *top cards* ... and AMD did have it ... several times in the past ... but they don't "gouge" like nvidia [apparently] does
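Taking nRollo's figures above at face value, the comparison is easy to put on one scale; a minimal sketch in Python, where the performance indices (3870 = 100) are his claims and the street prices are assumptions: the ~$165 9600 GT and ~$200 8800 GT come up later in this thread, and the 3870 price is a guess:

# Performance index (HD 3870 = 100) per the post above; prices are
# assumed ballpark street prices, not official MSRPs.
cards = {
    "HD 3870": (100, 190),   # price is a guess
    "9600 GT": (95, 165),
    "8800 GT": (115, 200),
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price:.2f} performance points per dollar")

On those assumed numbers both GeForces come out slightly ahead of the 3870 per dollar, which is the shape of his claim.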
Originally posted by: apoppin

nvidia has only recently "caught" onto bargain pricing in the midrange .. maybe they will really think this through .. i AM turned-off by what many consider to be 'gouging' for their "status" products :p
The bargain pricing is trying to put their competitor out of business, like every company in the world does. The status pricing is charging most for best, like every company in the world does. No free lunch, Apoppin- companies are in business to crush their competition flat and get as much as they can out of their clients. AMD would be doing the same if they could (e.g. the FX series CPU in days gone by, and the 9700Pro/X800XT PE/X1900XTX launch prices)

Yes, we are *agreeing* ... you will usually find me very realistic even though i speak of "idealism" .. in case you missed it, we are debating philosophies; and i think it is kinda cool that you are here and get to give NVIDIA a little of what "we" and even some of what "i" think ... in a constructive way, of course.
Originally posted by: apoppin

they may make a few extra bucks in the short term but they are creating future customers for their rivals by these practices ... and i think nvidia really needs to think long-term about customer loyalty ... i think they are losing it

You may think they are losing it, but their record setting profits and huge market share every quarter would seem to disprove your theory. ;)



no, actually it reinforces it. "Pride goeth before a fall" ... an old proverb. i think your company is even out of touch with its own fans now. You have very little vocal forum support here - it used to be quite different 4 years ago. intel was there ... and they are back from there ... ask them how they feel about ignoring their customers' needs. AMD is there right now ... they lost fans - a few diehards clung on remembering the [supposed] "good days". AMD is struggling to rebuild everything ... if they survive ... and it is a big IF ... still ... you are going to need the most loyal of fans for your possible difficult times ahead ... you DO realize that we are in a recession.

--it is all a matter of perception. And the way your company is perceived by its very vocal fans really does count toward overall sales.
 

NinjaJedi

Senior member
Jan 31, 2008
286
0
0
If the price is $600 I don't see how that is in any way a "bang for your buck." I always thought that term meant getting more for less.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: NinjaJedi
If the price is $600 I don't see how that is in any way a "bang for your buck." I always thought that term meant getting more for less.

IF it is $600 - in my eyes - it *better* eat the HD3870-X2 and crap the vaporized particles out of its exhaust

edited
 

NinjaJedi

Senior member
Jan 31, 2008
286
0
0
Originally posted by: apoppin
Originally posted by: NinjaJedi
If the price is $600 I don't see how that is in any way a "bang for your buck." I always thought that term meant getting more for less.

IF it is $600 - in my eyes - it *better* eat the HD3870 and crap the vaporized particles out of its exhaust

lol That made me laugh. Thanks
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
I'd say the high price makes sense. It's going to protect NV and partners from selling too many of them, so they won't get that many RMAs due to overheating. 7800GTX 512MB all over again.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: ArchAngel777

Will it outperform a 3870x2? Probably by a little bit, but enough to justify a 50% price premium? That is iffy.
Well, Geforce 9600 GT SLI beats the HD3870 X2 by 10%, 8800 GT SLI beats the HD3870 X2 by 20-60%, and I don't think those extra 32 SPs and 16 TMUs compared to 8800 GT SLI will hurt.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Rusin
Originally posted by: ArchAngel777

Will it outperform a 3870x2? Probably by a little bit, but enough to justify a 50% price premium? That is iffy.
Well, Geforce 9600 GT SLI beats the HD3870 X2 by 10%, 8800 GT SLI beats the HD3870 X2 by 20-60%, and I don't think those extra 32 SPs and 16 TMUs compared to 8800 GT SLI will hurt.


IMHO if it matches the 8800GT SLi performance it earns the rumored $600 because:
1. It has the performance to justify it.
2. It's the only SLi you can run on any motherboard.
3. More flexible drivers.
4. Higher single-core performance for non-scaling games.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
8800gt sli performance ±10% in all games is my guess. Will it be faster than 3870x2? Sure, probably by 15-20%. That won't mean much to most of us, but for somebody with a 30" monitor it could be the difference between "playable" and "slideshow". Of course, most of us could get a 3870x2, keep our current mobo, and throw in a 3850, 4870, or whatever in 6 months, and still be cheaper than a 9800gx2. I'm just glad that we're having a discussion right now instead of paying unbelievable prices for cards that are 18 mos old.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nRollo
Originally posted by: Rusin
Originally posted by: ArchAngel777

Will it outperform a 3870x2? Probably by a little bit, but enough to justify a 50% price premium? That is iffy.
Well Geforce 9600 GT SLI beats HD3870 X2 by 10%, 8800 GT SLI beats HD3870 X2 by 20-60% and I don't think that those extra 32SP and 16 TMU compared to 8800 GT SLI will do harm.


IMHO if it matches the 8800GT SLi performance it earns the rumored $600 because:
1. It has the performance to justify it.
2. It's the only SLi you can run on any motherboard.
3. More flexible drivers.
4. Higher single-core performance for non-scaling games.

*ridiculous* ... is this YOUR own idea or is it nvidia marketing? .. if it is marketing, they are in bigger trouble than i ever thought
You guys need help?
:Q

utterly ridiculous .. so i want to spend more for less because they slapped two hot GPUs together on a single board?
:roll:

not me - at least, if i really wanted SLi that badly, i'd just get an SLi MB and 2 GTs

sounds like Ford's slogan from the 90s ...
... and their attitude and inability to change direction

*you* buy one :p
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
apoppin, you're just mad that they're going to be $599 instead of $449 ;)

seriously, the good news for all of us is that if they're really $150 more than a 3870x2 then they're going to be quite a bit faster. It's nice to see a little bit more improvement over an ultra, even if it has to be a 2 pcb, dual gpu solution.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: apoppin

*ridiculous* ... is this YOUR own idea or is it nvidia marketing? .. if it is marketing, they are in bigger trouble than i ever thought
You guys need help?
:Q

You apparently missed the "IMHO" part. Unless I say I'm posting something from NVIDIA, it's my own thoughts.

At the end of the day it won't matter what you or I think, people will buy them up.

You seem to like saying "NVIDIA is in trouble" lately, but you haven't really linked to any credible proof of why anyone should believe you when they're dominating the market and their competitors keep sinking.

I wish every company in the country was in the kind of "trouble" NVIDIA is in, the only worry we'd have would be inflation. LOL



 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: nRollo
You seem to like saying "NVIDIA is in trouble" lately, but you haven't really linked to any credible proof of why anyone should believe you when they're dominating the market and their competitors keep sinking.

I wish every company in the country was in the kind of "trouble" NVIDIA is in, the only worry we'd have would be inflation. LOL
The reason people say that nVidia is in trouble right now is their lack of an x86 licence. Many are suggesting that the GPU's days in its current form are numbered, and that it will soon enough be implemented in all CPUs.

I suppose you could make a comparison of sorts with the oil industry. Petroleum is a finite resource. They can sell it at $105/barrel today, but in 10 years there may be no oil left on earth to sell. :light:

In essence, nVidia's troubles are not a figment of apoppin's imagination. I happen to agree with him to some extent, and there have been several editorials written recently which also reflect this sentiment.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: bryanW1995
apoppin, you're just mad that they're going to be $599 instead of $449 ;)

seriously, the good news for all of us is that if they're really $150 more than a 3870x2 then they're going to be quite a bit faster. It's nice to see a little bit more improvement over an ultra, even if it has to be a 2 pcb, dual gpu solution.

mad .. hell no .. i just saved wasting $600 :p

and how is $600 good news for 'all' of us .. something else that is really impractical that we also can't afford?
-what is wrong with just getting 2 GTs ?
:confused:

2 GTs would be "good news" for most of us :p
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: apoppin
mad .. hell no .. i just saved wasting $600 :p

and how is $600 good news for 'all' of us .. something else that is really impractical that we also can't afford?
-what is wrong with just getting 2 GTs ?
:confused:

2 GTs would be "good news" for most of us :p
One would hope that it's twice as good as two 9600GTs when it basically costs twice as much as them.

My guess is that it will be marginally faster than a pair of 8800GTs, however I was reading the AT CeBIT coverage, and the 790 chipset coverage intrigued me. It looks as though it may have user-expandable memory slots for nVidia graphics cards (it was shown with a 9800GX2).

I agree that $600 for a graphics card is absurd. I will never pay that much for one. When I spend over $200 on a GPU, that's pushing it.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SickBeast
Originally posted by: apoppin
mad .. hell no .. i just saved wasting $600 :p

and how is $600 good news for 'all' of us .. something else that is really impractical that we also can't afford?
-what is wrong with just getting 2 GTs ?
:confused:

2 GTs would be "good news" for most of us :p
One would hope that it's twice as good as two 9600GTs when it basically costs twice as much as them.

My guess is that it will be marginally faster than a pair of 8800GTs, however I was reading the AT CeBIT coverage, and the 790 chipset coverage intrigued me. It looks as though it may have user-expandable memory slots for nVidia graphics cards (it was shown with a 9800GX2).

I agree that $600 for a graphics card is absurd. I will never pay that much for one. When I spend over $200 on a GPU, that's pushing it.

am i missing something? ... 2 9600GTs will save you roughly $200 over a GX2 ... isn't GX2 now $600? .. and a 9600 pair well under $400 ... even 2 GTs are around $500, right

what is the GX2 for again?
--bragging rights over the 3870x2? ... that is all it looks to be .. so far

UNLESS it is a screamer performance wise ... $600 is too much .. especially when you can get 2 GTs for less. IF it IS, i still want one
 

thilanliyan

Lifer
Jun 21, 2005
12,039
2,251
126
Originally posted by: Blacklash
My first question would be does the GX2 work properly in titles where the X2 does not?

The problem with Crossfire, including on-board Crossfire on the X2, is in some games it causes a horrid performance hit that yields performance far less than a single card. Titles I am aware of this occurring in: SupCom, Gothic 3, NWN2, Need for Speed: Pro Street, WiC on the DX10 path with AA, Lost Planet: Extreme Condition, Tomb Raider: Legend, Hitman: Blood Money, Hellgate: London, and Jericho with edge smoothing active.

Weren't some of your claims about the games not scaling disproved by Apoppin? I think I remember him saying that Hellgate:London and Lost Planet DO work.

I've seen you bring those up in several threads now and several times Apoppin has responded...why do you continually bring up the same games if some of your claims are NOT true? Do you even have anything XFire right now to test out what you are saying or are you talking from very limited experience?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
nvidia has only recently "caught" onto bargain pricing in the midrange ..

Yeah, the only board I can think of over all the years of the 3D market that was comparable to the midrange value we have now was that board that one company put out called the Ti4200- nVidia should have paid attention to that company ;)

People's memory seems to only go back about a year or two lately :p

Many are suggesting that the GPU's days are numbered in its current form, and will soon enough be implemented in all CPUs.

Only the truly ignorant are seeing that as happening anywhere inside the next 20 years. Cell is by far the closest thing we have seen to date, and it isn't remotely close to being in the league of a dedicated rasterizer, nor does it have the pixel shading capabilities. The CPU ten years from now will be capable of doing a lot of what today's GPUs are doing, without a doubt; unfortunately, at that point GPUs will be considerably more powerful, and simply looking at what a CPU does, it will not move in the direction of being an effective rasterizer at any point in the foreseeable future.

A cluster of 50 quad core CPUs couldn't fully render Crysis equal to a 3850.
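That "50 quad cores" line invites a quick sanity check; a minimal sketch in Python counting only peak single-precision FLOPS, where every spec is an assumed ballpark for 2008-era parts, and raw FLOPS deliberately ignores the fixed-function raster and texture hardware that is his actual point:

# Hypothetical 2008-era quad core: 4 cores x 2.4 GHz x 8 SP FLOPs/cycle
# (4-wide SSE add plus 4-wide SSE mul per cycle).
cpu_gflops = 4 * 2.4 * 8             # ~77 GFLOPS peak

# HD 3850-class GPU: 320 stream processors x ~0.67 GHz x 2 FLOPs (MAD).
gpu_gflops = 320 * 0.67 * 2          # ~429 GFLOPS peak

print(f"CPU ~{cpu_gflops:.0f} GFLOPS, GPU ~{gpu_gflops:.0f} GFLOPS")
print(f"quad cores to match raw shader FLOPS: ~{gpu_gflops / cpu_gflops:.1f}")
# Raw math says roughly 6 CPUs, but with no dedicated rasterizer or
# texture units the software-rendering gap in a real game is far larger,
# which is why even 50 of them wouldn't keep up.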
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: apoppin
Originally posted by: SickBeast
Originally posted by: apoppin
mad .. hell no .. i just saved wasting $600 :p

and how is $600 good news for 'all' of us .. something else that is really impractical that we also can't afford?
-what is wrong with just getting 2 GTs ?
:confused:

2 GTs would be "good news" for most of us :p
One would hope that it's twice as good as two 9600GTs when it basically costs twice as much as them.

My guess is that it will be marginally faster than a pair of 8800GTs, however I was reading the AT CeBIT coverage, and the 790 chipset coverage intrigued me. It looks as though it may have user-expandable memory slots for nVidia graphics cards (it was shown with a 9800GX2).

I agree that $600 for a graphics card is absurd. I will never pay that much for one. When I spend over $200 on a GPU, that's pushing it.

am i missing something? ... 2 9600GTs will save you roughly $200 over a GX2 ... isn't GX2 now $600? .. and a 9600 pair well under $400 ... even 2 GTs are around $500, right

what is the GX2 for again?

bragging rights over the 3870x2? ... that is all it looks to be .. so far
Well...AFAIK:

2 8800GTs cost ~$400
2 9600GTs cost ~$330

So... the 8800GTs are $200 cheaper than a $600 GX2. The 9600GTs are $270 cheaper (I hear you can get them at $165 each).
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: BenSkywalker
Many are suggesting that the GPU's days are numbered in its current form, and will soon enough be implemented in all CPUs.

Only the truly ignorant are seeing that as happening anywhere inside the next 20 years. Cell is by far the closest thing we have seen to date, and it isn't remotely close to being in the league of a dedicated rasterizer, nor does it have the pixel shading capabilities. The CPU ten years from now will be capable of doing a lot of what today's GPUs are doing, without a doubt; unfortunately, at that point GPUs will be considerably more powerful, and simply looking at what a CPU does, it will not move in the direction of being an effective rasterizer at any point in the foreseeable future.

A cluster of 50 quad core CPUs couldn't fully render Crysis equal to a 3850.
The Cell by no means attempted to act as a GPU in any way, shape, or form. It added dedicated processors to help with physics, sound, and media processing, but it does nothing that current GPUs do.

The first example of such a processor will no doubt be Fusion. My guess is that it will incorporate 1-3 low-end or midrange GPUs into a CPU core. If they manage to put the equivalent of a few 3870 GPUs (for its time) into the thing, it surely will perform quite well, thus persuading many people to avoid purchasing a graphics card altogether.

It is currently in both AMD and intel's best interests to eliminate the need for a separate GPU. It eliminates nVidia from the competition to a large degree.

I'm not saying that graphics cards are going anywhere anytime soon, but I do see the low-end of nVidia's GPU market being ravaged by processors like Fusion. From what I understand, that's where they make most of their money, which is really why they need to be worried somewhat. :light:

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
No .. i just checked - 8800GT SLi pair is $200 each - $400 :p
2 9600GTs cost ~$350 for a PAIR

So, if i understand this, i need to pay an *extra $200* for the privilege of running a single slot??
--is this marketing or voodoo economics; is Reagan back from the dead?
- explain it to me again, Mr. [nv]Rollo

maybe i just DON'T get it :p
:confused:

i will ATTEMPT to keep an open mind


[open]
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The Cell by no means attempted to act as a GPU in any way, shape, or form. It added dedicated processors to help with physics, sound, and media processing, but it does nothing that current GPUs do.

I can only assume that you did not follow Cell's development closely, nor have you looked at much of what it is capable of. The initial designs for the PS3 didn't call for a GPU of any sort; they were going to utilize a rasterizer paired with Cell (in its 'full' configuration, without execution units disabled for yields- this design choice was selected after they decided to go with a GPU paired with Cell). Cell is an absolute monster when it comes to handling transformation, lighting and vertex shader operations- it is in fact more powerful than the GPU it ended up being paired with. The entire layout of Cell is closer to that of a GPU than a traditional CPU (actually, it is closest to a very high powered DSP).

The first example of such a processor will no doubt be Fusion. My guess is that it will incorporate 1-3 low-end or midrange GPUs into a CPU core. If they manage to put the equivalent of a few 3870 GPUs (for its time) into the thing, it surely will perform quite well, thus persuading many people to avoid purchasing a graphics card altogether.

Where is this die space going to appear from? With that approach you are going to be dealing with both a crippled CPU and a crippled GPU.

It is currently in both AMD and intel's best interests to eliminate the need for a separate GPU. It eliminates nVidia from the competition to a large degree.

What you are saying is it is in AMD's best interest to take billions of dollars and burn it. nVidia is only competitive with AMD because they purchased ATi- prior to that, nV was AMD's saving grace as a platform provider and had no real cross-competitive points. The acquisition of ATi is what made nV a competitor- so they should now eliminate a market they just spent billions to enter because they have competition in that field?

I'm not saying that graphics cards are going anywhere anytime soon, but I do see the low-end of nVidia's GPU market being ravaged by processors like Fusion.

I've heard this talk time and time again, and I always get a chuckle out of it. No one is going to hand anyone free die space for a new idea. The transistor budget must come from somewhere, so where are you going to compromise? As a super low end solution that requires next to no 3D, sure, but that is a long way removed from replacing what is going to be billions of transistors utilized for a dedicated GPU.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BenSkywalker
So, if i understand this, i need to pay an *extra $200* for the privilege of running a single slot??

Does this help explain it at all?

not really .. i think IT is overpriced ... compared to 3870 Crossfire :p
-and no more performance than O/C'd 2900pros in tandem

so i have to pay $200 still more for another WAAY overpriced card .. if this is 'competition', they can both shove their 'X2 solutions'
[imo]

this is not progress

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: BenSkywalker
The Cell by no means attempted to act as a GPU in any way, shape, or form. It added dedicated processors to help with physics, sound, and media processing, but it does nothing that current GPUs do.

I can only assume that you did not follow Cell's development closely, nor have you looked at much of what it is capable of. The initial designs for the PS3 didn't call for a GPU of any sort; they were going to utilize a rasterizer paired with Cell (in its 'full' configuration, without execution units disabled for yields- this design choice was selected after they decided to go with a GPU paired with Cell). Cell is an absolute monster when it comes to handling transformation, lighting and vertex shader operations- it is in fact more powerful than the GPU it ended up being paired with. The entire layout of Cell is closer to that of a GPU than a traditional CPU (actually, it is closest to a very high powered DSP).

The first example of such a processor will no doubt be Fusion. My guess is that it will incorporate 1-3 low-end or midrange GPUs into a CPU core. If they manage to put the equivalent of a few 3870 GPUs (for its time) into the thing, it surely will perform quite well, thus persuading many people to avoid purchasing a graphics card altogether.

Where is this die space going to appear from? With that approach you are going to be dealing with both a crippled CPU and a crippled GPU.

It is currently in both AMD and intel's best interests to eliminate the need for a separate GPU. It eliminates nVidia from the competition to a large degree.

What you are saying is it is in AMD's best interest to take billions of dollars and burn it. nVidia is only competitive with AMD because they purchased ATi- prior to that, nV was AMD's saving grace as a platform provider and had no real cross-competitive points. The acquisition of ATi is what made nV a competitor- so they should now eliminate a market they just spent billions to enter because they have competition in that field?

I'm not saying that graphics cards are going anywhere anytime soon, but I do see the low-end of nVidia's GPU market being ravaged by processors like Fusion.

I've heard this talk time and time again, and I always get a chuckle out of it. No one is going to hand anyone free die space for a new idea. The transistor budget must come from somewhere, so where are you going to compromise? As a super low end solution that requires next to no 3D, sure, but that is a long way removed from replacing what is going to be billions of transistors utilized for a dedicated GPU.
You're correct in stating that I did not follow the development of the Cell. I've read several reviews of the PS3, all of which stated my above 'opinion' (on which I now stand corrected). :)

As for the rest of your post:

The die space can come from at least one of the CPU cores! For all intents and purposes, most desktop PC users can't properly take advantage of more than 2 cores. If you turn 2 cores of a quad-core CPU into GPUs, for most people you're saving them money and making more effective use of the die space. There are diminishing returns as the number of CPU cores increases. In the near future there will be 8 core CPUs, then 16. If CPU cores prove to be redundant beyond a certain point, why not make them GPUs (or something else like a PPU)?

As for your "crippled CPU/crippled GPU" comment, you're correct. That said, if done correctly it will still annihilate gaming on a dual-core CPU with integrated graphics (and will probably come close to a midrange graphics card performance-wise). If I could save $200 on a graphics card by sacrificing 2 of 4 CPU cores, I would most definitely do it. Keep in mind that AMD plans to offer a variety of Fusion cores, some with more CPUs/GPUs than others.

Regarding your "burning money" statement, it makes no sense to me whatsoever. If nVidia is eliminated from the GPU business, chipsets are all that they have left! I never said that AMD should stop them from producing chipsets for them! (although they seem to be doing a great job making their own lately)

AFAIK AMD's main reason for acquiring ATI was so that they could create Fusion. Intel is embarking on a similar project.

Your last comment confuses me. :confused: AMD is already dedicating die space to this type of project. As I've stated repeatedly it's called Fusion, and I'm pretty sure that the project has been confirmed as being real.

I could see video encoding enthusiasts and 3D-rendering people being upset at the loss of a CPU core when they already have 4, but as for the average user, I don't think they'll mind much at all (let alone notice). :beer:
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: apoppin
Originally posted by: BenSkywalker
So, if i understand this, i need to pay an *extra $200* for the privilege of running a single slot??

Does this help explain it at all?

not really .. i think IT is overpriced ... compared to 3870 Crossfire :p
-and no more performance than O/C'd 2900pros in tandem

so i have to pay $200 still more for another WAAY overpriced card .. if this is 'competition', they can both shove their 'X2 solutions'
[imo]

this is not progress


Well, in all fairness, we haven't seen benchmarks yet. Let's see those first, and then talk about where the price "should" fall. Weigh the pros and cons.
My *guess* is that it will fall in between 8800GT SLI and 8800GTS SLI. Unless the 420 core has anything special to offer, which is unlikely.