are the real GF9 cards coming out in March or later?


MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Last I read was about G200, coming in the second half of 2008, and so will R700 from AMD. I actually think R700 is going to hit the market before nvidia's new high-end videocards, because samples are already out, or so say the rumours, and the R700 seems to be at least 50% faster than RV670. G200, still being produced at 65nm and containing more transistors than ever before, is going to run HOT as hell, or so say the rumours, hehe. It sure is going to be interesting.
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Sounds like a return to competition, always a good thing..
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: taltamir
"As of September 2007, the G80 was the largest commercial GPU ever constructed. It consists of 681 million transistors covering a 480 mm² die surface area built on a 90 nm process" - wikipedia.
Going from that to 1800 mil is quite a leap.
Well, G92 has 754 million transistors, but this would still be a huge leap. I mean, about 2.39 times as many transistors... it would really be monolithic. If they made it at 45nm and the chip were 535mm², then 1800M transistors would be realistic. However, a 45nm process for G100 and a 535mm² chip size aren't realistic.
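A quick back-of-the-envelope check on those numbers (a rough sketch in Python; the G92 die size and the assumed density doubling per node are approximations, not official figures):

```python
# Rough sanity check on the rumoured 1800M-transistor chip.
# G92 die size (~324 mm^2 at 65 nm) and the ~2x density gain per full
# node shrink are approximations, not official specs.
g92_transistors = 754e6
g92_die_mm2 = 324
rumoured_transistors = 1800e6

print(rumoured_transistors / g92_transistors)    # ~2.39x as many as G92

density_65nm = g92_transistors / g92_die_mm2     # ~2.33M transistors/mm^2
print(rumoured_transistors / density_65nm)       # ~774 mm^2 at 65 nm -- far too big
print(rumoured_transistors / (density_65nm * 2)) # ~387 mm^2 if 45 nm doubles density
```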
-----

By the way, Nvidia's next-gen high-end cards could still sport a 256-bit memory bus. I mean, Samsung & co. have been saying that the first GDDR5 products will come out in 2008 H2 (surprisingly, 2008 H2 is also the expected time for the next-gen launch from Nvidia and AMD). Nvidia is skipping GDDR4, so it would be weird if their next gen still used GDDR3.

Samsung already sent GDDR5 2.5GHz (5000MHz effective) samples to their customers in late October/early November. If they got that kind of chip onto a card with a 256-bit bus, it would translate into 160GB/s of memory bandwidth, which should be enough or more than enough for next-gen high end (if you combine both of the HD3870 X2's memory buses, you get a total of 115.2GB/s).
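For anyone who wants to sanity-check those figures: bandwidth is just effective transfer rate × bus width / 8. A minimal sketch (the 1800MT/s GDDR3 rate for the HD3870 X2 is the commonly quoted spec, so treat it as approximate):

```python
# Memory bandwidth = effective transfer rate (MT/s) * bus width (bits) / 8,
# which gives MB/s; divide by 1000 for the GB/s figure vendors quote.
def bandwidth_gb_s(effective_mt_s: float, bus_bits: int) -> float:
    return effective_mt_s * bus_bits / 8 / 1000

print(bandwidth_gb_s(5000, 256))      # GDDR5 samples, 256-bit bus: 160.0 GB/s
print(2 * bandwidth_gb_s(1800, 256))  # HD3870 X2, two 256-bit buses: 115.2 GB/s
```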


 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: MarcVenice
Last I read was about G200, coming in the second half of 2008, and so will R700 from AMD. I actually think R700 is going to hit the market before nvidia's new high-end videocards, because samples are already out, or so say the rumours, and the R700 seems to be at least 50% faster than RV670. G200, still being produced at 65nm and containing more transistors than ever before, is going to run HOT as hell, or so say the rumours, hehe. It sure is going to be interesting.
Well, both GPUs were "taped out" in January:
http://www.tcmagazine.info/com...shownews=17735&catid=2
-----
Eerhm, those numbers:
R680 - two RV670 GPUs
R700 - two RV770 GPUs

Yes, it will be one hot GPU, but will it consume more power than two AMD GPUs? Well, they still have room for extra power consumption. If they added an extra 40-50W to the 8800 Ultra's power consumption, they would be at HD3870 X2 level:
http://techreport.com/r.x/rade...3870-x2/power-load.gif

Well, it all comes down to what kind of clock frequencies they think G100 needs to beat R700. G100 has one advantage (performance-wise) because it's a single-GPU solution; two GPUs don't mean twice the performance, but they do mean at least twice the power consumption.
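To put that trade-off in numbers, here's a toy perf-per-watt comparison (all figures hypothetical, purely to illustrate the single- vs dual-GPU argument):

```python
# Hypothetical numbers: a dual-GPU card draws ~2x the power but, with
# imperfect CrossFire/SLI scaling, delivers less than 2x the performance.
single_perf, single_watts = 1.00, 110   # made-up single-GPU baseline
scaling = 0.75                          # assumed multi-GPU scaling efficiency

dual_perf = single_perf * 2 * scaling   # 1.5x the speed...
dual_watts = single_watts * 2           # ...for 2x the power

print(single_perf / single_watts)       # ~0.0091 perf/W
print(dual_perf / dual_watts)           # ~0.0068 perf/W -- worse per watt
```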
-----
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
Sadly, the 9600GT isn't too hot. I mean, it has good bandwidth but only 64 shader processors. Early benchmarks show it's faster than an 8800GS, which in turn is faster than the HD 3850, so I guess it's not too bad for the price, but I expected a more powerful card.

Anyone looking to buy a card in the $200 range should just buy the 8800GT because it's here to stay. We'll get the 9600GT (below it) in March, a 9800GX2 in late March (way above the $200 price range) and the GT200 in late Q2 or early Q3 (hopefully), but that will be ultra high end as well, so I guess the 8800GT will remain the top $200 card the whole year.

Apparently nvidia's plan is to rebrand the 8800GT as 9800GT and the 8800GTS (G92) as 9800GTX, possibly with some minor tweaks, but G92 nevertheless.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
sounds about right... no competition? then we don't need new products, we will just rename our old ones and sell them for more money!
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Well, it's a good performer if it costs like $170; thing is, the cheapest 8800 GTs cost like $190, so there's not much room there.
---

Rumours say that G94 won't have the AA problem that G92 had (with G92, using AA consumes a lot of VRAM, which is the reason the 8800 GT 256MB has such a hard time against the HD3850 256MB). If this is true, it would be very good news for the 9600 GT 256MB.
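To see why AA chews through a 256MB card so fast, look at the framebuffer alone (a simplified sketch: real hardware uses compression and extra resolve buffers, so the numbers are only illustrative):

```python
# Simplified MSAA framebuffer cost: each pixel stores `samples` copies
# of color and depth/stencil (ignoring compression and resolve targets).
width, height, samples = 1600, 1200, 4
bytes_color, bytes_depth = 4, 4          # 32-bit color + 32-bit Z/stencil

fb_bytes = width * height * samples * (bytes_color + bytes_depth)
print(fb_bytes / 2**20, "MB")            # ~58.6 MB before a single texture loads
```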
 

CP5670

Diamond Member
Jun 24, 2004
5,697
798
126
Originally posted by: MarcVenice
Last I read was about G200, coming in the second half of 2008, and so will R700 from AMD. I actually think R700 is going to hit the market before nvidia's new high-end videocards, because samples are already out, or so say the rumours, and the R700 seems to be at least 50% faster than RV670. G200, still being produced at 65nm and containing more transistors than ever before, is going to run HOT as hell, or so say the rumours, hehe. It sure is going to be interesting.

Are the G100 and GT200 the same thing? That German site says 55nm for G100, so assuming that is correct, it seems strange that they would go back to 65nm unless these cards are coming out around the same time, with G100 being a midrange version.
 

justlnluck

Senior member
Jul 13, 2004
261
0
0
Originally posted by: taltamir
I just heard that the only GF9 card that comes out in March is a GX2, which is gonna be a G92-based card... (even if it isn't, it isn't worth it; I am not doing a dual GPU this year).
So have I been waiting in vain? If it is true that the GF9 single-GPU cards aren't coming in March, then I might as well go ahead and get a GT right now and upgrade later. Or maybe a GTS...

Taltamir, didn't you already buy a GTS a couple weeks ago? At least that's what I remember from an earlier thread.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Syntax Error
Who knows.

[n]Rollo knows :p
:Q

but he ain't telling ... either

you have to realize that nvidia is playing their cards closer to their chest than ever before - now that they are dealing with a tight-lipped AMD instead of blab-all ATi

perhaps they will have a surprise for AMD
:evil:
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
58
91
gf9 will be paper launched in Q1 2008 like the 8800gt was, cuz they can't make enough cards right now to compete with the competition
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: justlnluck
Originally posted by: taltamir
I just heard that the only GF9 card that comes out in March is a GX2, which is gonna be a G92-based card... (even if it isn't, it isn't worth it; I am not doing a dual GPU this year).
So have I been waiting in vain? If it is true that the GF9 single-GPU cards aren't coming in March, then I might as well go ahead and get a GT right now and upgrade later. Or maybe a GTS...

Taltamir, didn't you already buy a GTS a couple weeks ago? At least that's what I remember from an earlier thread.

I returned that and decided to wait. Good memory.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
Originally posted by: justlnluck
Originally posted by: taltamir
I just heard that the only GF9 card that comes out in March is a GX2, which is gonna be a G92-based card... (even if it isn't, it isn't worth it; I am not doing a dual GPU this year).
So have I been waiting in vain? If it is true that the GF9 single-GPU cards aren't coming in March, then I might as well go ahead and get a GT right now and upgrade later. Or maybe a GTS...

Taltamir, didn't you already buy a GTS a couple weeks ago? At least that's what I remember from an earlier thread.

I returned that and decided to wait. Good memory.

We all have excellent memories here - and good look-up capabilities ...
... for what *other* posters say :p

i have even been 'called' on stuff i said years ago
:Q

[the latest was a PM re: my prediction (last year) that Blu-ray/HD burners for PC would drop below $100 this year ... i still think so]

:D
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
forget Blu-ray... wanna buy Tapestry media? 300GB per disk, $18,000 per drive, $180 per disk. They say they will soon come out with an 800GB model and a 1600GB one after that. Also, HVD is in the works, which will be available later with up to 5TB per disk (if it ever comes to fruition)... but it's still years until any of those are available for purchase.

http://www.inphase-technologie...cts/media.asp?subn=3_2
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Wtf...

So now we're waiting till this summer for a new high end solution?

Horrible, horrible crappiness. :frown:
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Unless they pull a surprise. But I'm tired of waiting. There's always some ebay sap six months from now. ;)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
yea, nvidia forgot about competing with THEMSELVES... so what if ATI has nothing? Leaving more than 2 years between high-end products is retarded. They could have had people buying more nvidia cards to replace existing nvidia cards... instead they leave those people hanging and try to snatch up ATI customers or people who haven't upgraded in a long time.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
yea, nvidia forgot about competing with THEMSELVES... so what if ATI has nothing? Leaving more than 2 years between high-end products is retarded. They could have had people buying more nvidia cards to replace existing nvidia cards... instead they leave those people hanging and try to snatch up ATI customers or people who haven't upgraded in a long time.

HD3870x2 is nothing? :p
:confused:

last i checked, it had the performance crown
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
It should. Can't tell for sure until it actually arrives on the market. But the 9600 will be PCIe v2 and your mobo should be good for it.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: apoppin
Originally posted by: taltamir
yea, nvidia forgot about competing with THEMSELVES... so what if ATI has nothing? Leaving more than 2 years between high-end products is retarded. They could have had people buying more nvidia cards to replace existing nvidia cards... instead they leave those people hanging and try to snatch up ATI customers or people who haven't upgraded in a long time.

HD3870x2 is nothing? :p
:confused:

last i checked, it had the performance crown

for a two-GPU... in one slot... yea, it's king there... But if you wanted to do multiple GPUs, then the 8800 Ultra x3 or just two 3870s beat it... I want to see how two X2 cards stack up against 3 Ultras when the quadfire drivers become available... that will be exciting.

Anyways, since the X2 is ONLY faster on PCIe v2 boards and requires a huge PSU, you are not exactly talking flexibility here.

But I wasn't actually looking at SLI vs xfire here... In single-GPU performance the Ultra has the crown, which is pathetic considering how old it is. nvidia didn't even bother to dethrone its own champion with its last refresh.
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
I think we are going to have to accept that video cards are heading the multi-GPU way. There isn't much more that can be done to really up the ante in performance like NV and ATI used to do; the most they can do is tweak the architecture and get more efficient cards like G92.

The HD 3870 X2 is not that bad for a dual-GPU card: it's cheap, it doesn't have too many of the glaring drawbacks of CF, and in fact according to many reviews it behaves like a single card. So yeah, I think that with a little driver maturity and some better implementation, dual-GPU cards will become the norm, and in this matter it looks like ATI will be back on top, considering the huge advances ATI is making in Crossfire.

Maybe by 2009 Crossfire/SLI implementations will be so good that a dual-GPU card will be indiscernible from a single-GPU card (except of course for performance). Add to that power and performance-per-watt optimizations, and suddenly dual-GPU on one card sounds very attractive.