9800GX2-->"Up to a 90% performance increase over the...


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
Originally posted by: apoppin
Originally posted by: BenSkywalker
So, if i understand this, i need to pay an *extra $200* for the privilege of running a single slot? ??

Does this help explain it at all?

not really .. i think IT is overpriced ... compared to 3870 Crossfire :p
-and no more performance than O/C'd 2900pros in tandem

so i have to pay $200 still more for another WAAY overpriced card .. if this is 'competition', they can both shove their 'X2 solutions'
[imo]

this is not progress

rose.gif

Well, in all fairness, we haven't seen benchmarks yet. Lets see those first, and then talk about where the price "should" fall. Weigh the pros and cons.
My *guess* is that it will fall in between 8800GT SLI and 8800GTS SLI. Unless the 420 core has anything special to offer, which is unlikely.

You ARE right ... we are not even sure about the pricing, although [nv]Rollo doesn't deny it. And i actually HAVE been saying this .. basically, that it needs a SOLID performance increase - not "features" like its ill-begotten brother [sorry guys, imo] HD3870X2 - only the 3870X2 IS bargain-priced. i think DOUBLE "ultra" performance would definitely be in 'bargain territory' for $600 ... so ... agreed

... let's wait and see ... i really *hope* for nvidia's sake ...

[who am i bullshitting?]

.. for OUR sakes, that it is NOT $600 UNLESS it is AWESOME

... and i hope it is awesome
rose.gif


the way i look at it, nvidia SHOULD be targeting GX2 at *you* - to get you to upgrade. But - if what you are guessing is right - wouldn't you *rather* just ADD another 8800 GT for $220 or so ... or sell your GPU and spend $600?

... and you *keep* saying "in all fairness" .. of course .. you are a moderator
... i.e. moderate, fair .. otoh, i tend to be a little bit outspoken :p
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
We are in agreement with most points, but I did want to take exception to one in particular :)

As for your "crippled CPU/crippled GPU" comment, you're correct. That said, if done correctly it will still annihilate gaming on a dual-core CPU with integrated graphics (and will probably come close to a midrange graphics card performance-wise).

Let's say, for argument's sake, you could easily add a high end DX11 style GPU on to any current processor without any transistor penalty at all - it's still going to get a major beat down from even current mid range boards. The reason? Absolute best case scenario you are limited to an effective 4 ROP output due to bandwidth. In any realistic sense at all you are actually limited to half that (as you need reads also), and that is if the CPU was utilizing no bandwidth at all. To hit even that rate we have to assume a 100% hit rate on the texture cache and assume 100% effectiveness on shader instruction scheduling. The entire way a CPU interfaces with the system as a whole is going to continue to be a limiting factor for any CPU/integrated type graphics solution. If we moved to a UMA then it would help out some, but we still would have the problem of how to queue up the data for both the CPU and GPU sitting on the same die utilizing the same RAM pool without overlapping each other's priorities.

In order to make a CPU/GPU combo chip that could compete with current mid range solutions you would need to have a 256bit system wide bus, minimum, running GDDR3 style RAM (must be able to read and write at the same time), and we would also need the GPU to have a considerably sized texture/instruction cache on die to prevent stalls (stalls on something with such enormously constrained memory limitations would be catastrophic to performance, and even with those improvements it would still be horribly memory constrained).
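[As a rough illustration of the arithmetic Ben is describing - the figures below are illustrative assumptions, not numbers from this thread - here is a back-of-envelope sketch of how shared memory bandwidth caps the pixel output of an on-die GPU:]

```python
# Back-of-envelope sketch of the bandwidth ceiling described above.
# Assumptions (mine, for illustration): an 8-byte framebuffer cost per pixel
# (32-bit colour write + 32-bit Z write) and half the bus left over after
# reads and CPU traffic.

def peak_fill_rate_mpix(bandwidth_gb_s, bytes_per_pixel=8, write_share=0.5):
    """Megapixels per second the memory bus can sustain for framebuffer writes."""
    usable_bytes = bandwidth_gb_s * 1e9 * write_share
    return usable_bytes / bytes_per_pixel / 1e6

# Dual-channel DDR2-800 shared with the CPU vs. a 256-bit GDDR3 mid-range card
print(peak_fill_rate_mpix(12.8))  # integrated: ~800 Mpix/s ceiling
print(peak_fill_rate_mpix(64.0))  # discrete mid-range: ~4,000 Mpix/s
```

[The exact ROP-equivalent number depends on the assumptions, but the shape of the argument is the same: the shared bus, not the logic, is the ceiling.]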

AFAIK AMD's main reason for acquiring ATI was so that they could create Fusion.

STMicro would have been a MUCH wiser investment if that was their real goal, and I can't help but think AMD knows that very well. ATi and nV are both pushing in a direction that is the polar opposite of where something like Fusion is going. STMicro has for years been working in exactly that direction (not that I think that is the direction we should be going, just you don't buy an Escalade to carve corners ;) ).

You ARE right ... we are not even sure about the pricing

The reality is, the 3870X2 is selling, despite the fact that 8800GTs in SLI smack it around silly for a decent amount less, and 9600GTs run with it for a whole lot less. The fact remains, it is selling. Why wouldn't nV do something comparable and try to make as much money as they can off of it? Sure, I'd love to see the 9800X2 launch at $100, wouldn't we all, but they are a business and if ATi can sell clearly inferior performance for a price premium, nV would be pretty foolish not to follow ;)
 

Blacklash

Member
Feb 22, 2007
181
0
0
I had an X2. I still have HD 3850 Crossfire in another computer.

http://www.madonion.com/compare?3dm06=4997365

http://www.madonion.com/compare?3dm06=4487629

My plan was to get a 2560x monitor and use the X2 to push it. Since I couldn't count on it, I ditched the card. Being stuck with single HD 3870 performance or less on a 2560x monitor some of the time was not something I was interested in.

I don't really call the below proper scaling-

LP DX10 w/AA
http://img67.imageshack.us/img67/3264/lp104xqd5.gif

Hellgate DX10 w/AA
http://img67.imageshack.us/img67/3505/hell104xer9.gif

In Hellgate it's slower than a single HD 3870 @ 1280x and 2FPS faster @ 1600x + 1920x.

My experience with the card backs up everything I've said. Reliable review sites echo my experiences. On the other hand, it seems people that have never been within 20 feet of the X2 have a whole lot to say on the issue. Most of the people making blanket recommendations for folks to upgrade to the X2 have not used one. Sad but true. The X2's performance is very reliant on the application in question. Examples: great in Rainbow 6 Vegas, yet horrid in SupCom.

If you want to spend around $440 on that card, be my guest.

On a side note, I also remember others saying VisionTek officially supports end user overclocking, which is not true according to their tech support. "Any overclocking voids your limited lifetime warranty".

I asked the same question of XFX and Evga. They told me "Yes, and do it gradually."

Originally posted by: thilan29
Originally posted by: Blacklash
My first question would be does the GX2 work properly in titles where the X2 does not?

The problem with Crossfire, including on board Crossfire on the X2, is in some games it causes a horrid performance hit that yields performance far less than a single card. Titles I am aware of this occurs in; SupCom, Gothic 3, NWN2, Need for Speed: Pro Street, WiC on the DX10 path with AA, Lost Planet: Extreme Condition, Tomb Raider: Legend, Hitman: Blood Money, Hellgate: London, and Jericho with edge smoothing active.

Weren't some of your claims about the games not scaling disproved by Apoppin? I think I remember him saying that Hellgate:London and Lost Planet DO work.

I've seen you bring those up in several threads now and several times Apoppin has responded...why do you continually bring up the same games if some of your claims are NOT true? Do you even have anything XFire right now to test out what you are saying or are you talking from very limited experience?

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: SickBeast
Originally posted by: nRollo
You seem to like saying "NVIDIA is in trouble" lately, but you haven't really linked to any credible proof of why anyone should believe you when they're dominating the market and their competitors keep sinking.

I wish every company in the country was in the kind of "trouble" NVIDIA is in, the only worry we'd have would be inflation. LOL
The reason people say that nVidia is in trouble right now is due to their lack of an X86 licence. Many are suggesting that the GPU's days are numbered in its current form, and will soon enough be implemented in all CPUs.

I suppose you could make a comparison of sorts with the oil industry. Petroleum is a finite resource. They can sell it at $105/barrel today, but in 10 years there may be no oil left on earth to sell. :light:

In essence, nVidia's troubles are not a figment of apoppin's imagination. I happen to agree with him to some extent, and there have been several editorials written recently which also reflect this sentiment.

I don't share the opinion of discrete graphics going away anytime soon.

Until I actually see some integrated CPU/GPU solutions that I'd consider "high end" I won't.

It takes multiple high end GPUs to run current games at high res/high detail now- what will change in this future era? Will devs be going back to Quake2 level graphics so we don't need discrete anymore? Or will GPU design be so advanced it somehow now does the same work in a package that will fit on the same die with the cpu?

Too many "ifs" to speculate for me, and my crystal ball is in the shop.

People said NVIDIA was doomed when ATi was purchased by AMD also- and look what happened there.


EDIT: If I had read down to Ben's posts, I could have just done a "QFT"! :)
 

ajaidevsingh

Senior member
Mar 7, 2008
563
0
0
These new cards (9800GTX and GT) are a load of hot air; too expensive and not enough processing...

I want real time ray tracing in games :cry:
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Besides the mechanics of "Fusion" processors not lending themselves to the high end, there's the whole issue of "Why would you WANT your CPU and GPU integrated?"

So you can be forced to replace both when you want to replace one or the other?

Doesn't make sense on a lot of levels, except for Grandpa buying a box his grandkids can play some games on.
 

ajaidevsingh

Senior member
Mar 7, 2008
563
0
0
Originally posted by: nRollo
Besides the mechanics of "Fusion" processors not lending themselves to the high end, there's the whole issue of "Why would you WANT your CPU and GPU integrated?"

So you can be forced to replace both when you want to replace one or the other?

Doesn't make sense on a lot of levels, except for Grandpa buying a box his grandkids can play some games on.

Fusion would be loved in a PS4 or Xbox 720 :laugh: ...Don't you think so!! :D
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Many are suggesting that the GPU's days are numbered in its current form, and will soon enough be implemented in all CPUs.
Discrete GPUs aren't going anywhere for the simple fact that system RAM bandwidth will never come close to dedicated VRAM bandwidth no matter how many cores you throw at the problem.

You can make the argument to make CPU caches bigger but that won't help textures, geometry, shaders or AA buffers, not when Crysis already eats 512 MB video cards alive when enabling AA.

Then there's the issue of cooling: take a look at an 8800 Ultra and tell me exactly how you're planning to mount that on a motherboard and keep it cool.
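[For a rough sense of the bandwidth gap BFG10K is pointing at, here is a quick comparison; the bus widths and clocks are approximate period specs recalled from memory, not figures quoted in this thread:]

```python
# Approximate peak-bandwidth comparison: shared system RAM vs. dedicated VRAM.
# Specs are assumptions for illustration (dual-channel DDR2-800 system memory,
# 8800 Ultra with a 384-bit GDDR3 bus at ~2160 MT/s effective).

def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Peak bandwidth from bus width (bits) and effective transfer rate (MT/s)."""
    return bus_width_bits / 8 * effective_mt_s * 1e6 / 1e9

system_ram = bandwidth_gb_s(128, 800)    # ~12.8 GB/s, shared with the CPU
ultra_vram = bandwidth_gb_s(384, 2160)   # ~103.7 GB/s, dedicated to the GPU
print(f"{ultra_vram / system_ram:.1f}x gap")  # roughly 8x
```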
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: BFG10K
Many are suggesting that the GPU's days are numbered in its current form, and will soon enough be implemented in all CPUs.
Discrete GPUs aren't going anywhere for the simple fact that system RAM bandwidth will never come close to dedicated VRAM bandwidth no matter how many cores you throw at the problem.

You can make the argument to make CPU caches bigger but that won't help textures, geometry, shaders or AA buffers, not when Crysis already eats 512 MB video cards alive when enabling AA.

Then there's the issue of cooling: take a look at an 8800 Ultra and tell me exactly how you're planning to mount that on a motherboard and keep it cool.

You are right. Discrete GPUs won't go away, but if Fusion (or Intel's solution, or ???) is done right then it can be inside all new TVs, and there is the mobile market as well.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: thilan29
Originally posted by: Blacklash
My first question would be does the GX2 work properly in titles where the X2 does not?

The problem with Crossfire, including on board Crossfire on the X2, is in some games it causes a horrid performance hit that yields performance far less than a single card. Titles I am aware of this occurs in; SupCom, Gothic 3, NWN2, Need for Speed: Pro Street, WiC on the DX10 path with AA, Lost Planet: Extreme Condition, Tomb Raider: Legend, Hitman: Blood Money, Hellgate: London, and Jericho with edge smoothing active.

Weren't some of your claims about the games not scaling disproved by Apoppin? I think I remember him saying that Hellgate:London and Lost Planet DO work.

I've seen you bring those up in several threads now and several times Apoppin has responded...why do you continually bring up the same games if some of your claims are NOT true? Do you even have anything XFire right now to test out what you are saying or are you talking from very limited experience?

backlash, please quit trolling. If you have irrefutable PROOF that crossfire doesn't scale in EVERY ONE of those games then please provide it. If not, go back to HOCP.

 

trajan2050

Member
Nov 14, 2007
92
0
0
I've seen plenty of reviews complain about lack of Crossfire support with the 3870x2. Blacklash used one, and he's perfectly free to share his experience.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: trajan2050
I've seen plenty of reviews complain about lack of Crossfire support with the 3870x2. Blacklash used one, and he's perfectly free to share his experience.

and his info is 100% out of date .. he used it for only 24 hours .. just for the sole purpose of dissing it, i think

ask me about crossfire

anything:p

:D

what he did would be like me buying a GX2 ... and then looking for all the shit to say about it
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: apoppin
Originally posted by: trajan2050
I've seen plenty of reviews complain about lack of Crossfire support with the 3870x2. Blacklash used one, and he's perfectly free to share his experience.

and his info is 100% out of date .. he used it for only 24 hours .. just for the sole purpose of dissing it, i think

ask me about crossfire

anything:p

:D

what he did would be like me buying a GX2 ... and then looking for all the shit to say about it

That may be so, but I wasn't impressed with my 3870X2 either and I was using the most recent Cat 8.3 drivers.

Your experience with 2900 Frankenfire is about as relevant to the HD 3870 X2 as my SLI experience with the 7-series NVIDIA cards is to 9600GT SLI - it's not relevant at all. Your 'information' is 100% out of date because you don't have any experience with the hardware being discussed.

My experience with the X2 was this: the avg fps appeared to be decent, but game play was often not smooth. My guess is that the X2 would occasionally 'spike' to a very low min fps that didn't affect the avg. fps in terms of numbers, but was very noticeable when playing games. Honestly, for about equal money, the 8800GTX (still!) offers an overall better and more consistent experience IMO. I know you hate them, but HardOCP's subjective testing basically says the same thing.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: bryanW1995
cat 8.3 just came out 2 days ago. do you still have the 3870x2?

I do. It's in a box waiting to be shipped to the highest bidder.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nitromullet
Originally posted by: apoppin
Originally posted by: trajan2050
I've seen plenty of reviews complain about lack of Crossfire support with the 3870x2. Blacklash used one, and he's perfectly free to share his experience.

and his info is 100% out of date .. he used it for only 24 hours .. just for the sole purpose of dissing it, i think

ask me about crossfire

anything:p

:D

what he did would be like me buying a GX2 ... and then looking for all the shit to say about it

That may be so, but I wasn't impressed with my 3870X2 either and I was using the most recent Cat 8.3 drivers.

Your experience with 2900 Frankenfire is about as relevant to the HD 3870 X2 as my SLI experience with the 7-series NVIDIA cards is to 9600GT SLI - it's not relevant at all. Your 'information' is 100% out of date because you don't have any experience with the hardware being discussed.

My experience with the X2 was this: the avg fps appeared to be decent, but game play was often not smooth. My guess is that the X2 would occasionally 'spike' to a very low min fps that didn't affect the avg. fps in terms of numbers, but was very noticeable when playing games. Honestly, for about equal money, the 8800GTX (still!) offers an overall better and more consistent experience IMO. I know you hate them, but HardOCP's subjective testing basically says the same thing.

what a load of nonsense .. you are guessing, as you admit

my CrossFire *scales* just like any other CrossFire
--Frankenfire or not :p

HardOCP is the holy grail of pro-NVIDIA benchmarks - their "real world" testing is bullshit ... with "imagined spikes" that supposedly affect performance
:roll:
 

thilanliyan

Lifer
Jun 21, 2005
12,039
2,251
126
Originally posted by: bryanW1995
cat 8.3 just came out 2 days ago. do you still have the 3870x2?

As an AMD fanboi I get free "AMD rulz" tshirts and other apparel. I don't actually work for them, however, so I will pretend to make pro-nvidia statements sometimes to throw n00bs off my scent.

Haha Bryan your sig made me lol :D
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
well, I'll be happier if I ever get that "my 2900xt isn't a HOT video card, it's a HEATER that doubles as a video card" shirt. that is the coolest one that I've seen on the amd viral marketer website. they do have some cool biker shorts, though...
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: apoppin
what a load of nonsense .. you are guessing, as you admit

my CrossFire *scales* just like any other CrossFire
--Frankenfire or not :p

HardOCP is the holy grail of pro-NVIDIA benchmarks - their "real world" testing is bullshit ... with "imagined spikes" that supposedly affect performance
:roll:

You wanna buy the X2, it's for sale...?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: nRollo
Besides the mechanics of "Fusion" processors not lending themselves to the high end, there's the whole issue of "Why would you WANT your CPU and GPU integrated?"
To save the $$$ I would otherwise spend on a graphics card! (like the $360 I paid for my 8800GTS 320mb)

I suppose when nVidia gives you free hardware, it becomes easy to lose track of this stuff, huh?

$360 can go a long way toward upgrading other parts of my rig. If Fusion could do that, I would be all over it.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
join the amd side, they have better perks! nvidia gives out free 8800gtx's, amd gives out red sweats with hot chicks tattooed on your a$$! ;)
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: nRollo
It takes multiple high end GPUs to run current games at high res/high detail now- what will change in this future era? Will devs be going back to Quake2 level graphics so we don't need discrete anymore? Or will GPU design be so advanced it somehow now does the same work in a package that will fit on the same die with the cpu?
That's not even true. My 8800GTS 320mb, a midrange card, runs every game out there at high-res and high settings, aside from Crysis (which nothing can run at those settings).

The one point that both you and Ben have failed to address remains: What will happen to nVidia when their low-end GPU sales are siphoned off by these Fusion-type processors? Most people are casual gamers and are happy if their PC can run Solitaire. I'm thinking this is bound to affect them financially.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nitromullet
Originally posted by: apoppin
what a load of nonsense .. you are guessing, as you admit

my CrossFire *scales* just like any other CrossFire
--Frankenfire or not :p

HardOCP is the holy grail of pro-NVIDIA benchmarks - their "real world" testing is bullshit ... with "imagined spikes" that supposedly affect performance
:roll:

You wanna buy the X2, it's for sale...?

i think i already gave my opinion previously, but i am not at all impressed with X2 ... sorry. i am hoping [and a' prayin' :music:] that GX2 is not "really" $600 as i would LOVE to test it against my own FrankenFire
[btw, i love that, thanks! .. is that totally original? really funny!]
:thumbsup:

:D

first of all, does your X2 match the benchmarks for other X2s in similar rigs? [it wouldn't be defective, is what i might be asking]

edit
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
What will happen to nVidia when their low-end GPU sales are syphoned off by these Fusion-type processors? Most people are casual gamers and are happy if their PC can run Solitare. I'm thinking this is bound to affect them financially.

Uhm, they take a very slight hit in revenue and almost no hit in profits; I didn't think that was all that big of an issue. On die graphics aren't going to compete with anything but the lowest tier integrated solutions - and even then the overwhelming majority of people that end up putting an add in board later do so a year or more after purchasing their machine. CPU/GPU singular solutions will be good for the 'welfare' PC market in terms of performance, that is about it. This isn't a market nVidia stands to lose in any way; they never had it to start with. We are entering the phase of FP64 fb ops, and at that point, with current system architectures, something like Fusion will be limited to a peak rate of around 1.4Mpixel/sec. That is the lucrative market nV is faced with losing :)

That's not even true. My 8800GTS 320mb, a midrange card, runs every game out there at high-res and high settings, aside from Crysis (which nothing can run at those settings).

You must have the most seriously OCd 8800GTS around. 8800Ultra pushing 29FPS in COD4, 8800Ultra getting 33FPS in STALKER, even in an old title like FEAR it only manages 45FPS. Or do you consider some very low rez high rez? I know a lot of people for a long time hung on to the notion that the absurdly low 1600x1200 was high res, heh.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: BenSkywalker
You must have the most seriously OCd 8800GTS around. 8800Ultra pushing 29FPS in COD4, 8800Ultra getting 33FPS in STALKER, even in an old title like FEAR it only manages 45FPS. Or do you consider some very low rez high rez? I know a lot of people for a long time hung on to the notion that the absurdly low 1600x1200 was high res, heh.
Yes, the card is moderately OCed, and yes, I think 1920x1200 is high res. :)

When they make televisions with that crazy 2560x1600 resolution, then I'll consider it. :beer:

As far as the rest goes, I was under the impression that Intel was actually the world's leading GPU manufacturer based on their sales volume. I simply figured that the low-end was a gravy train for nVidia as well.