New ATI dual-core GPU

gobucks

Golden Member
Oct 22, 2004
1,166
0
0
These things are all crap.

1) The X800 Pro sucks, especially considering the dual NVIDIA cards are 16-pipe 6800 Ultras, which will no doubt be preferred among people willing to buy dual-GPU cards. Plus, dual GPUs are way less than 100% efficient, so it's not like you're really getting the equivalent of a 24-pipe solution (rough math below).

2) The power these dual cards provide will end up being wasted on current games, and by the time a game comes out that can use it, the GPUs available will probably be faster than your dual setup while using less power and costing no more than your second card did.

3) The design is inefficient. The memory is redundant, so the 512MB is really just two identical 256MB buffers, adding to manufacturing cost without yielding any more performance. The PCB is enormous, which costs a lot on top of requiring a large case and sacrificing PCI slots to accommodate the ginormous cooler.

4) They're gonna cost like 7.23 Kajillion dollars at MSRP, and nobody's gonna have them in stock because only about 10 people on the planet are stupid enough to waste their paychecks on them, so they'll end up costing 10 Kajillion after the price gouging.
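Rough back-of-the-envelope on points 1 and 3, just to make the numbers concrete. The 80% scaling factor is an assumption for illustration, not a benchmarked figure:

```python
# Toy math: what a dual 12-pipe X800 Pro "adds up to" next to a single
# 16-pipe 6800 Ultra, assuming an illustrative (made-up) scaling efficiency.
pipes_x800pro = 12      # pipes per X800 Pro core
pipes_6800u   = 16      # pipes in one 6800 Ultra
scaling       = 0.80    # assumed dual-GPU scaling efficiency (<100%)

effective_pipes = pipes_x800pro * (1 + scaling)   # second core only partly helps
print(f"Effective dual X800 Pro pipes: {effective_pipes:.1f} vs {pipes_6800u} on one Ultra")

# Memory is mirrored in this kind of setup: each GPU holds its own full copy
# of textures and buffers, so 2 x 256MB of fitted RAM is still ~256MB usable.
fitted_mb = 2 * 256
usable_mb = 256
print(f"Fitted: {fitted_mb}MB, usable for unique data: ~{usable_mb}MB")
```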
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
I read it fine... Damn!!! It nearly doubled the Pro's performance and shattered NVIDIA SLI systems. If dual core is the way both companies go, then SLI seems like an utter waste: 2x 6800GTs only gain like 33% in SLI mode, whereas the dual-core card more than doubled the X800 Pro in some of the tests...

Imagine the heat and the power draw of this thing...

Imagine... dual-core CPUs, dual-core GPUs, and soon standalone PPUs. We're getting more and more into that parallel processing I have been talking about for the last three years...
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Originally posted by: gobucks
These things are all crap.

1) The X800 Pro sucks, especially considering the dual NVIDIA cards are 16-pipe 6800 Ultras, which will no doubt be preferred among people willing to buy dual-GPU cards. Plus, dual GPUs are way less than 100% efficient, so it's not like you're really getting the equivalent of a 24-pipe solution.

2) The power these dual cards provide will end up being wasted on current games, and by the time a game comes out that can use it, the GPUs available will probably be faster than your dual setup while using less power and costing no more than your second card did.

3) The design is inefficient. The memory is redundant, so the 512MB is really just two identical 256MB buffers, adding to manufacturing cost without yielding any more performance. The PCB is enormous, which costs a lot on top of requiring a large case and sacrificing PCI slots to accommodate the ginormous cooler.

4) They're gonna cost like 7.23 Kajillion dollars at MSRP, and nobody's gonna have them in stock because only about 10 people on the planet are stupid enough to waste their paychecks on them, so they'll end up costing 10 Kajillion after the price gouging.

Did you actually read it???

I have argued the worthlessness of this stuff for years, but the fps fanboys in here love that shite... I mean, 270fps is great, right???? Really noticeable, huh?? (sarcasm)

I wonder what it could do in a Quadro or FireGL setup!!!

 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
(*sigh*) Don't these websites ever get tired of making April Fool's Day articles?
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Originally posted by: bR
ok i hope ^ you guys are playing along with it


Yeah I have been waiting for some of these!!! Got to love the internet!!!

The pics gave it away for me... that, and the scaling seemed too good to be true. Why would any company double its performance and shoot itself in the foot for the rest of its product line?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
I'm not sure how much I trust that website, as it looks like it's written by kids, but the card looks amazing if their testing is accurate.

I've said it before, I'll say it again: Loved my MAXX, have been waiting for ATI to bring that back for many years now.

Kudos to ATI if this is all true; the only downside I see is the lack of SM3.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Creig
(*sigh*) Don't these websites ever get tired of making April Fool's Day articles?

Arggg. I've said it before, I'll say it again: Miss the MAXX.

These guys shouldn't raise people's hopes like this. :(
 

boyRacer

Lifer
Oct 1, 2001
18,569
0
0
Originally posted by: Rollo
I'm not sure how much I trust that website, as it looks like it's written by kids, but the card looks amazing if their testing is accurate.

I've said it before, I'll say it again: Loved my MAXX, have been waiting for ATI to bring that back for many years now.

Kudos to ATI if this is all true; the only downside I see is the lack of SM3.

ok i hope you're playing along with it too
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
That's a very good site BTW. Funny if it's an April Fools, because it was posted on the 31st, and unless it's in OZ it was still the 31st there and here (GMT) at the time I posted it. But as I say, I can't load the page fully, so I've got no clue what the pics look like. :(
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
Graphics cores are infinitely parallel by design, and a single core will most likely always be more efficient than two working concurrently on the same task.

This is of course my opinion...

But a highly optimised 32-pipe core will always slaughter two 16s, and a 16 will always slaughter two 8s. This is of course assuming that transistor counts are similar between the dual-core and the next-gen single-core GPU.

SLI is a way to get next gen performance now. Using it as an upgrade path is feasible but probably not the best option out there as many have said.

This is coming from someone planning to go SLI when it hits the Pentium 4 :p
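To put toy numbers on the 32-vs-2x16 point above (the scaling and overhead figures are pure assumptions, just to show the shape of the argument, not real chip data):

```python
# Toy model: one wide core vs. two narrower cores at a similar transistor
# budget. Every figure here is an illustrative assumption.

def effective_pipes_single(pipes):
    # Single core: every pipe contributes fully.
    return float(pipes)

def effective_pipes_dual(pipes_per_core, scaling=0.85, duplicated_overhead=0.10):
    # Two cores: some of each die is duplicated non-pipe logic (display, video,
    # memory controller), and the pair scales at less than 100% efficiency.
    usable_per_core = pipes_per_core * (1 - duplicated_overhead)
    return usable_per_core * (1 + scaling)

print(effective_pipes_single(32))            # 32.0
print(f"{effective_pipes_dual(16):.1f}")     # ~26.6 -> the single 32 wins
print(f"{effective_pipes_dual(8):.1f}")      # ~13.3 -> the single 16 wins
```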
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Acanthus
Graphics cores are infinitely parallel by design, and a single core will most likely always be more efficient than two working concurrently on the same task.

This is of course my opinion...

But a highly optimised 32-pipe core will always slaughter two 16s, and a 16 will always slaughter two 8s. This is of course assuming that transistor counts are similar between the dual-core and the next-gen single-core GPU.

(note: folks, it's April 1st.)

But on a more technical note -- yes, one big core is probably always going to be at least marginally more efficient than multiple cores working in parallel (although any such losses should be pretty minimal, since graphics tasks are so easily parallelizable -- I don't think it will "slaughter" a multicore solution if both are well-designed). But I'm not sure how much longer it's going to stay feasible to make single-core GPUs (and/or graphics cards with a single GPU on them)!

NVIDIA is pushing 200+M transistors -- and almost all of that is logic, unlike on a CPU! -- on their 16-pipe GPUs; 400+M transistors in a single core may just not be feasible with today's chip design and manufacturing technology. The cores get huge, and the yields and speeds start to drop precipitously (and the leakage current/power usage would be nuts). It might end up being a *lot* easier to either design multicore GPUs, or to build graphics cards with multiple simpler GPUs onboard that are tied together by software and/or hardware means (similar to the "SLI on a board" designs we've seen, but with the GPUs actually sharing memory and designed to work that way from the ground up).
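For what it's worth, the yield side of that is easy to see with a simple Poisson defect model; the defect density and die sizes below are assumed round numbers, not foundry data:

```python
# Rough illustration of why huge monolithic dies hurt yield:
# Y = exp(-defect_density * die_area), with made-up but plausible numbers.
import math

def poisson_yield(die_area_mm2, defects_per_mm2=0.005):
    return math.exp(-defects_per_mm2 * die_area_mm2)

for area in (150, 300, 600):   # small GPU, big GPU, "doubled-up" monolithic die
    print(f"{area:>4} mm^2 die -> ~{poisson_yield(area):.0%} yield")

# Two 300 mm^2 dies on one board yield independently, so a defect costs you one
# small die instead of one enormous one -- part of the appeal of going multi-GPU.
```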
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BouZouki
Dual gpu > SLI

It wasn't when it was actually available; the MAXX had some pretty significant issues synching the output of AFR. You'd have wild fluctuations in framerate.

Nonetheless, loved my MAXX- very cool hardware.
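A tiny sketch of what those fluctuations look like in practice; the frame times and the offset between the two chips are invented, purely to illustrate the effect:

```python
# AFR pacing problem in miniature: two GPUs alternate frames, but if their
# completions aren't evenly interleaved, the frame-to-frame gaps swing wildly
# even though the average fps looks fine. All timings are made up.

render_time_ms = 25.0    # each GPU takes 25 ms per frame (40 fps on its own)
offset_ms = 5.0          # GPU B starts only 5 ms after GPU A (poor pacing)

presents = sorted(
    [render_time_ms * i for i in range(1, 6)] +              # GPU A frames
    [offset_ms + render_time_ms * i for i in range(1, 6)]    # GPU B frames
)
gaps = [b - a for a, b in zip(presents, presents[1:])]
print("frame-to-frame gaps (ms):", gaps)   # alternates ~5 ms / ~20 ms
print("average fps:", round(1000 * len(gaps) / (presents[-1] - presents[0]), 1))
```

The average fps comes out looking healthy, but the alternating short/long gaps are the "wild fluctuations" you actually feel.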