R680 naked

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
If this card doesn't beat an 8800U, I'll be very disappointed in ATI. Also, I wish ATI had used 2x 6-pin instead of 1x 8-pin!
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,256
126
Originally posted by: tuteja1986
If this card doesn't beat an 8800U, I'll be very disappointed in ATI. Also, I wish ATI had used 2x 6-pin instead of 1x 8-pin!

Yeah, exactly. If they can sell it for around $400 and beat the Ultra, that would be great. Unless of course GF9 is much faster.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: tuteja1986
If this card doesn't beat an 8800U, I'll be very disappointed in ATI. Also, I wish ATI had used 2x 6-pin instead of 1x 8-pin!

Just use the 6-pin to 8-pin adapter that'll be included with the video card.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,256
126
Originally posted by: angry hampster
Any info on when this card will be released? I'm assuming it'll have 1GB of VRAM?

Late Q1 2008 or early Q2, I think. It will probably have 1GB of VRAM, but it'll probably be like Crossfire/SLI, where each GPU only uses 512MB.
 

ConstipatedVigilante

Diamond Member
Feb 22, 2006
7,670
1
0
Originally posted by: tuteja1986
If this card don't beat up a 8800U , then i would be very disappointed in ATI. Also i wished ATI would have used 2x 6pin instead of 1x 8 pin :!

It doesn't matter too much if it's faster than the 8800 Ultra. It matters more how much they can sell them for relative to cost. The super-high-end market is a niche that doesn't generate much revenue.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: thilan29
Originally posted by: angry hampster
Any info on when this card will be released? I'm assuming it'll have 1GB of VRAM?

Late Q1 2008 or early Q2, I think. It will probably have 1GB of VRAM, but it'll probably be like Crossfire/SLI, where each GPU only uses 512MB.

According to the article and the picture presented, they are using a PLX chip for communication between the two GPUs.

This means the Crossfire will not be software-based. It will be a hardware solution, which will hopefully provide even better scaling.
 

Griswold

Senior member
Dec 24, 2004
630
0
0
Originally posted by: tuteja1986
If this card doesn't beat an 8800U, I'll be very disappointed in ATI. Also, I wish ATI had used 2x 6-pin instead of 1x 8-pin!

Hmm, in that picture I spy 1x 6-pin and 1x 8-pin.

 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: ConstipatedVigilante

It doesn't matter too much if it's faster than the 8800 Ultra. It matters more how much they can sell them for relative to cost. The super-high-end market is a niche that doesn't generate much revenue.

Probably neither of those points matters. What matters is its price and performance relative to G100, which is rumoured to be out in March 2008. Either way, ATI performs poorly with AA enabled, and unless they've made drastic changes to this architecture, it's going to be a tough battle (not to mention scaling issues).
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Also we can notice two digital PWN power modules.
Haha proof this card is going to pwn!

Interesting part though; I definitely hope it makes it to market. The PLX bridge chip piqued my interest, but it doesn't seem to be much more than a PCI-E bridge letting the 2nd GPU (or both GPUs) access the PCI-E slot. Essentially it'll just turn a single x16 link into two x8 links for the two GPUs, unless I'm misinterpreting some of the info on PLX's website.

PLX website

 

Quiksilver

Diamond Member
Jul 3, 2005
4,725
0
71
Anyone know the length of this card? If it's any longer than the 8800 Ultra, that sucks, as I won't be able to fit it in my case :( (NZXT Zero)
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,256
126
Originally posted by: Zstream
Originally posted by: thilan29
Originally posted by: angry hampster
Any info on when this card will be released? I'm assuming it'll have 1GB of VRAM?

Late Q1 2008 or early Q2, I think. It will probably have 1GB of VRAM, but it'll probably be like Crossfire/SLI, where each GPU only uses 512MB.

According to the article and the picture presented, they are using a PLX chip for communication between the two GPUs.

This means the Crossfire will not be software-based. It will be a hardware solution, which will hopefully provide even better scaling.

Isn't that just a PCI-E bridge chip? (i.e. just for communication, but Crossfire software will still be needed?)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Zstream
Originally posted by: thilan29
Originally posted by: angry hampster
Any info on when this card will be released? I'm assuming it'll have 1GB of VRAM?

Late Q1 2008 or early Q2, I think. It will probably have 1GB of VRAM, but it'll probably be like Crossfire/SLI, where each GPU only uses 512MB.

According to the article and the picture presented, they are using a PLX chip for communication between the two GPUs.

This means the Crossfire will not be software-based. It will be a hardware solution, which will hopefully provide even better scaling.

Very good assumption... which might be the big break for AMD. Finally they did something right.
Hardware multi-chip implementations are the ONLY way to go.
 

toadeater

Senior member
Jul 16, 2007
488
0
0
Even though it's not too profitable, AMD needs to go after the mainstream ($150-$350) market to keep their market share up and ensure game and OEM support for their hardware. They can relinquish the high end to Nvidia for now, because very few people buy $500 graphics cards no matter how good they are. They've basically done this throughout 2007 already, except they didn't have a competitive mainstream card. Now they might.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: chizow
Also we can notice two digital PWN power modules.
Haha proof this card is going to pwn!

Interesting part though; I definitely hope it makes it to market. The PLX bridge chip piqued my interest, but it doesn't seem to be much more than a PCI-E bridge letting the 2nd GPU (or both GPUs) access the PCI-E slot. Essentially it'll just turn a single x16 link into two x8 links for the two GPUs, unless I'm misinterpreting some of the info on PLX's website.

PLX website
No, you're interpreting it correctly. It's just like the 7950GX2: they need a bridge chip to split the PCIe lanes between the GPUs, which is what the PLX chip does. This looks exactly like what the rumors have been stating: it's a pair of RV670s on a single card, running in Crossfire. It's ATI's 7950GX2/Rage Fury Maxx.

Performance-wise, I would be shocked if it performed significantly differently from a pair of 3870s in Crossfire today. That's going to be a problem for ATI: Crossfire doesn't scale perfectly across all games, as we're well past the days when rendering was simple enough to split so easily.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Oh, it's a PCIe splitter? Then that sucks... hardcore suckage. Forget it then, I ain't buying.

Anyways, I forgot to make my joke... "If UFOs taught me anything, it's that if the pictures are blurry then it doesn't exist"...
 

zeroburrito

Member
Dec 5, 2007
128
0
0
Mainstream is $150 and below. You're crazy if you think mainstream is above that; over $150 is niche enthusiast territory.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: ViRGE
Originally posted by: chizow
Also we can notice two digital PWN power modules.
Haha proof this card is going to pwn!

Interesting part though; I definitely hope it makes it to market. The PLX bridge chip piqued my interest, but it doesn't seem to be much more than a PCI-E bridge letting the 2nd GPU (or both GPUs) access the PCI-E slot. Essentially it'll just turn a single x16 link into two x8 links for the two GPUs, unless I'm misinterpreting some of the info on PLX's website.

PLX website
No, you're interpreting it correctly. It's just like the 7950GX2: they need a bridge chip to split the PCIe lanes between the GPUs, which is what the PLX chip does. This looks exactly like what the rumors have been stating: it's a pair of RV670s on a single card, running in Crossfire. It's ATI's 7950GX2/Rage Fury Maxx.

Performance-wise, I would be shocked if it performed significantly differently from a pair of 3870s in Crossfire today. That's going to be a problem for ATI: Crossfire doesn't scale perfectly across all games, as we're well past the days when rendering was simple enough to split so easily.

The difference is that the 7950GX2 was split across two separate PCBs. Also, the Nvidia chip did not use PLX technology. I will wait and see before passing judgement, but for now it looks like a decent card.

Also, it is shorter than the Ultra, so it is not that long.
 

gmofftarki

Member
Nov 30, 2007
27
0
0
Originally posted by: zeroburrito
Mainstream is $150 and below. You're crazy if you think mainstream is above that; over $150 is niche enthusiast territory.

For "gamer mainstream"? If you're talking about "I need my computer to play the Vista window-opening animations", you may have a point.

For those looking to play "Doom 3 or more recent" 3D computer games? $200-250 is probably a much fairer assessment of the "mainstream cutoff", given that the ATI 9800 Pro, a mainstream "gamer" card if there ever was one, sold for $180-250 a few years ago, and I doubt the mainstream price has gone down since.

Is it possible to get a card from the previous generation with playable performance within those price restrictions? Sure. A 7800GT, 2600 Pro, or 8600GTS (I guess you can't really argue against those cards for stuff like WoW or TF2) will work at lower resolutions. But more and more consumers are getting 1680x1050 widescreen monitors without a second thought of "Hmm, maybe my GeForce 2 can't drive this thing", and as such the average gamer's "mainstream video card price" is inflating.

Honestly, with the market the way it is currently, if I were talking to someone who ever wanted to use their computer for gaming more advanced than, say, FreeCell, I'd point them straight to the 3850 and say "Trust me on this one", rather than let them try to use an 8400GS or Intel integrated chip and leave PC gaming entirely.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I know a LOT of gamers... they all routinely buy $300+ video cards. The non-gamers I know buy a Dell...

So yeah... $300 is pretty mainstream... Oh yeah, you should ask how much people spend on their consoles nowadays.