
Crossfire previews


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BFG10K
that is way beyond speculation to the point of ridiculous. . . clearly it will be sli'd
You sure about that? What if it has dual slot cooling?

I don't know as I don't have the official specs.

as certain as i have ever been without seeing them . . . [why?

because] SLI would be a "failure" if nVidia offered it only for the 6800s . . . . ;)
[ati wins]

g70 does look to be single slot on the smaller die . . . .
[however] EVEN IF their top card is dual slot, you can be sure nVidia will come up with a way to sli it . . . . [expensive and liquid-metal cooled, no doubt :p]

absolutely positively certainly their "GT" will NOT be dual-slot and would be an excellent "budget" candidate for sli.

i really DO expect the g70 to be a smaller, cooler card than the 6800 series . . . nVidia did not design their SLI for the 6800s . . . rather the "future" and we DO know SLI-2 ["AMR Killer"] is in "the works"



edit: think logically what SLI - 2 will do . . . give gamers what they want:
1) ability to use two cards with different BIOSes and maybe different generations
2) Supertile and other refinements that ati is using
3) Allow for SLI changes on-the-fly without rebooting
4) OF course, 16 lanes of PCIe for EACH card
5) Multimonitor support with sli, addressing all the other complaints

of course ATI will follow with AMR-2

You don't have to be psychic to see where things are going - outrageously expensive :(
[and i am going to console gaming]
:brokenheart:

edited again and again :p
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Isn't G70 .11? And clocked (according to the rumors) about the same as, possibly a bit higher than, the NV40 cards (which were .13)?

If so, that could account for the cooler running and single slot design. Additionally, that may be why the card is so long...spread the die out, spread the heat out - of course, it hurts yields, but hey, who knows...
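That intuition checks out to first order: dynamic power goes roughly as C·V²·f, and switched capacitance shrinks with the process. A toy sketch, where the linear-capacitance assumption and the clock/voltage ratios are illustrative, not anything reported for NV40 or G70:

```python
# First-order CMOS dynamic power: P ~ C * V^2 * f, with switched
# capacitance assumed to scale linearly with feature size. Illustrative
# ratios only -- not reported NV40/G70 figures.
def relative_power(node_ratio: float, v_ratio: float = 1.0, f_ratio: float = 1.0) -> float:
    return node_ratio * v_ratio**2 * f_ratio

print(f"same voltage, same clock: {relative_power(110 / 130):.0%} of the .13 part's power")
print(f"same voltage, +10% clock: {relative_power(110 / 130, f_ratio=1.1):.0%}")
```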

Frankly, BFG, I have to agree with Apoppin re: SLi. Introducing it for a single gen, and then killing it when ATi plays follow the leader doesn't make any sense to me.
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: apoppin
edit: think logically what SLI - 2 will do . . . give gamers what they want:
1) ability to use two cards with different BIOSes and maybe different generations
2) Supertile and other refinements that ati is using
3) Allow for SLI changes on-the-fly without rebooting
4) OF course, 16 lanes of PCIe for EACH card

Supertiling on Nvidia hardware is highly unlikely. Firstly, ATi has been doing it for a long time so it was a no brainer for them. Secondly, the caching architecture of NV40 is such that SFR will probably perform better than any tiling implementation (on NV40).
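For readers keeping score, the difference between the two schemes is just how pixels are dealt out between the GPUs. A minimal sketch, assuming a hypothetical 1600x1200 frame and 64-pixel tiles (neither figure comes from ATi or nVidia):

```python
# SFR vs. supertiling as pixel-ownership functions. The 1600x1200 frame
# and 64-pixel tiles are assumed for illustration, not real parameters.
WIDTH, HEIGHT, TILE = 1600, 1200, 64

def sfr_owner(x: int, y: int) -> int:
    """Split-frame rendering: GPU 0 takes the top band, GPU 1 the bottom."""
    return 0 if y < HEIGHT // 2 else 1

def supertile_owner(x: int, y: int) -> int:
    """Supertiling: tiles alternate between the GPUs, checkerboard-style."""
    return ((x // TILE) + (y // TILE)) % 2

# Both split the pixels 50/50, but supertiling interleaves the work, so a
# localized hotspot (a shader-heavy skybox, say) can't land entirely on
# one GPU the way it can with a fixed band split.
print(sfr_owner(800, 900), supertile_owner(800, 900))  # -> 1 0: same pixel, different owner
```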
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
I don't like the idea of tile rendering to begin with. I'm sure ATi will tout it as a feature just because it's there, but to me it's a big waste of clocks and memory. It requires more duplicated processing of geometry (some polygons are split across tiles) than AFR or Split Screen. As if the local double-up on textures wasn't enough, this too?

I think tile is going to bog compared to the other two modes. You heard it here first.
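The overhead being described here can be roughed out with a toy model: a triangle whose s×s bounding box lands uniformly at random on a grid of T×T tiles straddles a tile edge with probability 1 − ((T−s)/T)². The sizes below are made up for illustration, not measurements of any real supertiling driver:

```python
# Toy estimate of the duplicated-geometry cost: an s x s bounding box
# dropped uniformly onto a grid of T x T tiles straddles a tile edge with
# probability 1 - ((T - s) / T)^2. Made-up sizes, not driver measurements.
def straddle_probability(tile: int, bbox: int) -> float:
    if bbox >= tile:
        return 1.0  # bigger than a tile: always crosses an edge
    return 1.0 - ((tile - bbox) / tile) ** 2

for bbox in (4, 8, 16):
    p = straddle_probability(32, bbox)
    print(f"tile=32, bbox={bbox:2d} -> {p:.0%} of triangles get processed twice")
```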
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: trinibwoy
Originally posted by: apoppin
edit: think logically what SLI - 2 will do . . . give gamers what they want:
1) ability to use two cards with different BIOSes and maybe different generations
2) Supertile and other refinements that ati is using
3) Allow for SLI changes on-the-fly without rebooting
4) OF course, 16 lanes of PCIe for EACH card

Supertiling on Nvidia hardware is highly unlikely. Firstly, ATi has been doing it for a long time so it was a no brainer for them. Secondly, the caching architecture of NV40 is such that SFR will probably perform better than any tiling implementation (on NV40).

edit: think logically what SLI - 2 will do . . . give gamers what they want:
1) ability to use two cards with different BIOSes and maybe different generations
2) "Supertile" or other refinements that ati is using
3) Allow for SLI changes on-the-fly without rebooting
4) OF course, 16 lanes of PCIe for EACH card
5) Multimonitor support with sli, addressing all the other complaints


i think you get my point . . . "other refinements" should cover it ;)
edit: Clearly SLI-2 is for G70 and beyond . . . who knows about "supertile" and nvidia . . . no doubt they will have "Super supertile"
:roll:
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: apoppin
of course ATI will follow with AMR-2

You don't have to be psychic to see where things are going - outrageously expensive :(
This is the truest thing said in this entire thread. In theory, competition drives down prices and speeds up development, but with ATI and nVidia trying for the fastest benchmarks to outdo each other, the people on a budget are going to get lost. As much as I love innovation and improvement, I can see this getting real ugly for the midrange gamers.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Insomniak
Isn't G70 .11? And clocked (according to the rumors) about the same, possibly a bit higher, than the NV40 cards (which were .13)?

If so, that could account for the cooler running and single slot design. Additionally, that may be why the card is so long...spread the die out, spread the heat out - of course, it hurts yields, but hey, who knows...

Frankly, BFG, I have to agree with Apoppin re: SLi. Introducing it for a single gen, and then killing it when ATi plays follow the leader doesn't make any sense to me.

probably .09
. . . . maybe on release and certainly on refresh. Since the "g70" for the Sony PS3 is .09 it is logical that nVidia has also made the transition

if it IS .09 THEN they have their yields AND cooler running ;)

And why would nVidia be working on SLI-2 IF they weren't planning to use it with the G70?
:shocked:
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: fierydemise
Originally posted by: apoppin
of course ATI will follow with AMR-2

You don't have to be psychic to see where things are going - outrageously expensive :(
This is the truest thing said in this entire thread. In theory, competition drives down prices and speeds up development, but with ATI and nVidia trying for the fastest benchmarks to outdo each other, the people on a budget are going to get lost. As much as I love innovation and improvement, I can see this getting real ugly for the midrange gamers.

There's no doubt at all that Nvidia and ATI each LOVE the thought of selling two video cards to each gamer at every core refresh instead of one. The limiting factor will soon become the games themselves. Even running at ungodly high resolutions, these new cards may churn out framerates high enough to dramatically extend their useable lifespan.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: fierydemise
Originally posted by: apoppin
of course ATI will follow with AMR-2

You don't have to be psychic to see where things are going - outrageously expensive :(
This is the truest thing said in this entire thread. In theory, competition drives down prices and speeds up development, but with ATI and nVidia trying for the fastest benchmarks to outdo each other, the people on a budget are going to get lost. As much as I love innovation and improvement, I can see this getting real ugly for the midrange gamers.


i am a mid-range gamer - although i LOVE pc gaming. i am currently running a P4 @ 3.31GHz/1GB PC3500/480w TT PS/Radeon 9800xt on a Samsung 19" Flat CRT that maxes out at 11x8 @ 85Hz . . . . there isn't a SINGLE new game that won't run at my resolution or 10x7 with hi details and everything maxed on . . . . ALL i can upgrade THIS system to is a single agp 7800gt [or, likely my last agp card,] a 6800Ultra which will probably run Unreal 3 Engine games "so-so". :(

Anything FURTHER requires a complete system upgrade and BIG bucks . . . multi core CPU/Physics proc/Sli'd Video cards/2GB DDR2-3 ram . . . what this is doing is getting me to set aside PC games and pick up a $350 next gen console and HD tv or monitor [that will get double duty] . . . . for at least the next 2-3 years.

i'll miss you guys

:roll:

:D

:laugh:
-------------------

Originally posted by: Creig

There's no doubt at all that Nvidia and ATI each LOVE the thought of selling two video cards to each gamer at every core refresh instead of one. The limiting factor will soon become the games themselves. Even running at ungodly high resolutions, these new cards may churn out framerates high enough to dramatically extend their useable lifespan.
HW is ALWAYS a generation or two ahead of SW . . . that's WHY my 9800xt is still satisfactory for my resolutions.

Unreal3 and Doom3 engines (don't forget FC2/Starbreeze2/and all the other next gen games) will really start to use 512MB vRAM and then SLI2/AMR will be "necessary" for mucho AA/AF at ultra-hi resolutions. ;)
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0

Originally posted by: BFG10K
that is way beyond speculation to the point of ridiculous. . . clearly it will be sli'd
You sure about that? What if it has dual slot cooling?

I don't know as I don't have the official specs.

I cannot believe you would say something so silly, BFG10K. I know you have your own views on a lot of things, such as resolution, and I respect them even if I don't personally agree with them; however, this is beyond silly.

First of all, why would nVidia suddenly discontinue support for SLI on flagship products? How does that make any sense whatsoever, taking the overall timespan of nVidia SLI into account?

Secondly, if you examine the photographs provided by anand and the other website, you can clearly see the SLI header at the top of the card.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
that kind of speculation is just to be a PITA.. i'd be willing to bet large sums of money it will be SLI capable. looking at the photos, i feel like this is a better investment than my XM radio stock purchase years ago turned out for me.

any amount anyone would like to place on that bet, I can match it.
$50,000 is on the table that the high end G70 is SLI capable.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: housecat
that kind of speculation is just to be a PITA.. i'd be willing to bet large sums of money it will be SLI capable. looking at the photos, i feel like this is a better investment than my XM radio stock purchase years ago turned out for me.

any amount anyone would like to place on that bet, I can match it.
$50,000 is on the table that the high end G70 is SLI capable.

i don't think you can legally make that bet over the internet. ;)

i think we've already - especially me - overkilled the subject in answer to a "thought" that was just tossed out into the air. :p

CLEARly nVidia is not abandoning SLI anytime soon . . . not for g70 and it doesn't look like the sli trend will go away - IF pc gaming expects to keep up with the consoles. ;)
(personally i'm looking forward to multicore GPUs on a single videocard that can also be sli'd)
[and a sugar momma who will support all my habits] :heart:

edit: how DID xm radio turn out?
{xm? :confused:}
 

housecat

Banned
Oct 20, 2004
1,426
0
0
XM (xmsr) i bought at $5 and it's now about $30. when you hold a stock like that for years, and make a substantial initial investment.. it gives you a nice nestegg.. if you sell it of course, which i'm going to do.

i know years ago, some of the crew here made some cash off NV stock hehe.
i don't think wall street understands tech.. which is why google is trading at like 32X earnings.. i won't be buying that one at this time

but ya, no one is going to take the bet anyway. cuz they'd lose and they know it. i was trying to show how ridiculous and absurd it is to assume that. it's not absurd to believe that NV wants to sell you 2 high end G70s though.

i mean, that wouldn't make me blow this jack daniels out on my screen even if Jen-Hsun Huang said it himself.. ;)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
never cared much for stock . . . i don't like that kind of investment, preferring to do similar with RE.

ANYway, no one will take this bet as nVidia is committed to SLI . . . committed to selling that 2nd videocard for each gaming machine.

if you are really gonna offer a bet - do it BEFORE the discussion, not after ;)
:roll:

edit: Crap . . . my average ppd is picking up again . . . up to 9.33 from a recent "low" of 9.31 . . . NEVERmind what it USED to be :eek:

cutting down ;)

and logging out . . . g'nite
peace and aloha
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
From what I've read, RSX/G70 is a completely new core.. needs new drivers basically from scratch etc.
What are they changing exactly? It doesn't appear to be more than a tweaked NV4x core at this stage and the most significant change would probably be manufacturing process.

g70 does look to be single slot on the smaller die . .
Whatever the case it looks like it'll be pulling a lot more juice and generating a lot more heat than a 6800U.

Introducing it for a single gen, and then killing it when ATi plays follow the leader doesn't make any sense to me.
First of all, why would nVidia suddenly discontinue support for SLI on flagship products?
Don't get me wrong, I fully expect the product to be SLI capable.

But think about the logistics of putting two cards together at once when each likely drains more juice and produces more heat than a 6800U.
 

biostud

Lifer
Feb 27, 2003
19,937
7,041
136
Originally posted by: BFG10K
From what I've read, RSX/G70 is a completely new core.. needs new drivers basically from scratch etc.
What are they changing exactly? It doesn't appear to be more than a tweaked NV4x core at this stage and the most significant change would probably be manufacturing process.

g70 does look to be single slot on the smaller die . .
Whatever the case it looks like it'll be pulling a lot more juice and generating a lot more heat than a 6800U.

I doubt it, if the single slot cooling is true. I *think* the shrink to .11 will let it run within the power range of nv40. The pics of G70 boards only show one power connector; combined with the maximum power that can be delivered through the slot, doesn't that max out at ~150W?

If the G70 is ever made on .09 for PCs, nvidia will probably do a relaunch similar to the x850 line.
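biostud's ~150W figure lines up with the commonly quoted spec limits: 75W from the x16 slot plus 75W from a single 6-pin connector. A trivial check, treating both numbers as nominal spec ceilings rather than measured draw:

```python
# Nominal spec ceilings, not measured draw: a PCIe x16 slot is rated for
# up to 75 W, and one 6-pin auxiliary connector adds another 75 W.
SLOT_W = 75
SIX_PIN_W = 75
print(f"single-connector board budget: ~{SLOT_W + SIX_PIN_W} W")
```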
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Creig
Originally posted by: housecat
I am on the side of truth, Creig. I dont think R-own-o cares if I'm on his side or not, he is whipping you with one hand behind his back anyway.
I just keep score.


Hey housecat, since you like keeping score so much, how 'bout this one?

AnandTech Moderators - 2
housecat - 0

Now let's try keeping the thread on track. If you want to keep kissing up to Rollo, create a thread in OT about it.

HAHAHAHA Owned :D
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
What are they changing exactly? It doesn't appear to be more than a tweaked NV4x core at this stage and the most significant change would probably be manufacturing process.

That is hard to say considering Nvidia has been pretty tight lipped. However, the chip in the PS3 is speculated to be similar to the G70, as Nvidia didn't have time to build a GPU from the ground up like ATI did with the Xbox 360.

If the two GPUs are within a stone's throw of each other, shader performance will more than double from the NV40 while using the same clock speed.

I don't think slapping on two more quads will more than double your performance unless you are doing some things under the hood.

 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
I'm pretty excited about this fall. It should be a REALLY interesting scene with these new products from ATi and nVidia. :)
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Originally posted by: apoppin
<snip>
I feel another edit coming. ;)
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: BFG10K
From what I've read, RSX/G70 is a completely new core.. needs new drivers basically from scratch etc.
What are they changing exactly? It doesn't appear to be more than a tweaked NV4x core at this stage and the most significant change would probably be manufacturing process.

I was under the impression NV50 was a design based on NV40, but G70/RSX is a complete reworking of everything and how it works.. I could be wrong, but I thought they decided to scrap NV50 because they developed G70 from the ground up with the Cell team, carried over some stuff from them and used their money/fabs etc.
From a feature standpoint, we don't have information -yet- on it being different from NV40. But even NV40 is not behind the times. I'm hoping for the WGF2.0 spec if they go beyond DX9C anyway, if they even know what that is yet at all.
We'll have to wait and see really.. speculation sucks.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: BFG10K
Don't get me wrong, I fully expect the product to be SLI capable.

But think about the logistics of putting two cards together at once when each likely drains more juice and produces more heat than a 6800U.


I think that comment is far more speculative than anything else we've heard on here. AFAIK the 6800U consumes less power than the 9800XT - next gen doesn't always mean power consumption and heat go up.

Especially considering the fact that G70 is being produced on a smaller manufacturing process than the 6800U, I think it's quite possible it will use less power/produce less heat than the 6800U.

Either that, or they decide to crank the clockspeed and transistor count up for more performance and keep the heat/power levels as high or higher. Regardless, it's a win.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Genx87
What are they changing exactly? It doesn't appear to be more than a tweaked NV4x core at this stage and the most significant change would probably be manufacturing process.

That is hard to say considering Nvidia has been pretty tight lipped. However, the chip in the PS3 is speculated to be similar to the G70, as Nvidia didn't have time to build a GPU from the ground up like ATI did with the Xbox 360.

If the two GPUs are within a stone's throw of each other, shader performance will more than double from the NV40 while using the same clock speed.

I don't think slapping on two more quads will more than double your performance unless you are doing some things under the hood.


I agree, but two more quads with an upshot in clock frequency could pull it off - and that is one of the rumors floating around.
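The rumor-math here is straightforward. A sketch, using the 6800 Ultra's 400 MHz as the NV40 baseline and treating peak throughput as pipes times clock (the six-quad figure is the thread's rumor, not a confirmed spec):

```python
# NV40 = four quads (16 pipes) at 400 MHz (6800 Ultra); the rumor adds two
# quads. Peak throughput taken as pipes * clock to first order -- rumor
# arithmetic, not a confirmed spec.
nv40_quads, nv40_mhz = 4, 400
g70_quads = nv40_quads + 2

needed_mhz = 2 * (nv40_quads / g70_quads) * nv40_mhz
print(f"clock needed to double NV40 throughput: ~{needed_mhz:.0f} MHz")
# -> ~533 MHz: the extra quads give 1.5x, so a ~33% clock bump covers the
#    rest, assuming nothing else changes "under the hood".
```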

Guess we won't know for a bit yet though - time will tell.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: JackBurton
<snip>
I feel another edit coming. ;)

how does it feel?

i posted between that final edit and going to bed last night . . . nothing further has changed.
:roll:
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Well Rollo, there's no convincing each other. I guess we will have to wait and see how it all plays out.