Nvidia's G80 delayed until November-December?

otispunkmeyer
Originally posted by: hemmy
Originally posted by: otispunkmeyer
Originally posted by: Gstanfor
Originally posted by: Cookie Monster
Originally posted by: Bobthelost
Originally posted by: Cooler
It's the INQ, who really listens to them!?

;)

Agreed. The INQ often contradicts itself, or makes things up from time to time.
The majority of the B3D guys don't seem to think the INQ is wrong, because some of the stuff they said doesn't make sense.

The G80 PCB has been ready for a long time now. I don't think NV wants to lose their "6 week" advantage over ATi.

Another interesting bit about G80 is that the PCB has 12 RAM chips.

384-bit bus? Sounds tasty! (EDIT: for those who don't know, video card DRAMs are 32 bits wide, so 8 of them yields a 256-bit bus, 128-bit cards use 4 DRAMs, a 384-bit card would use 12, and a 512-bit card would naturally use 16.)


Well, my 9500pro has 8 DRAMs but it's only 128-bit... they are, however, arranged 4 on top and 4 on the bottom, like they're soldered to the same contacts.

a lot of 9500pros were 9700s in disguise


Well, not really.

Yes, the 9500pro had the same R300 core with all 8 pipes enabled, but none of the Pros came with 256-bit memory. They could never become a 9700.

Some 9500 non-Pros, however, were built on the 256-bit PCB, but suitably they only had 4 pipelines.

Lucky people could easily flash the non-Pros to unlock the pipes and, hey presto, a 9700/9700 Pro.
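
A minimal sketch of the chips-times-width rule quoted above (total bus width = number of DRAM chips x per-chip I/O width). The chip counts and widths below are illustrative assumptions, not confirmed G80 or Radeon specifications:

def bus_width_bits(num_chips: int, chip_io_bits: int = 32) -> int:
    """Total memory bus width when identical DRAM chips are wired in parallel."""
    return num_chips * chip_io_bits

# Illustrative values only; 32-bit-wide chips are the usual case described above.
print(bus_width_bits(4))       # 4 chips x 32 bits  = 128-bit bus
print(bus_width_bits(8))       # 8 chips x 32 bits  = 256-bit bus
print(bus_width_bits(12))      # 12 chips x 32 bits = 384-bit bus (the G80 rumour)
print(bus_width_bits(8, 16))   # 8 chips x 16 bits  = 128-bit bus (the 9500pro case discussed below)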
 

Avalon

Diamond Member
Originally posted by: otispunkmeyer
Well, my 9500pro has 8 DRAMs but it's only 128-bit... they are, however, arranged 4 on top and 4 on the bottom, like they're soldered to the same contacts.

Perhaps the DRAM modules on the 9500pro were 16bits wide, thus 8x16 = 128?
 

jiffylube1024

Diamond Member
Originally posted by: R3MF
But you would expect cards to have either 4, 8, or 16 chips if they had a regular power-of-two data bus like 128-bit or 256-bit.

12 chips does hint at a 192/384-bit data bus.

It does hint at a 384-bit bus (not sure why they'd drop bandwidth to 192-bit; even with 2+ GHz GDDR4 I don't think they'd hinder the bandwidth like that).

However, 12 chips would also mean it has the bizarre configuration of 768MB of RAM (since to my knowledge the RAM chips all have to be the same density). So 384MB (again, unlikely they'd decrease the memory) or 768MB on-card.

Definitely would be weird...
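
A quick sketch of the capacity arithmetic above: with 12 identical chips, total memory is 12 times the per-chip density. The densities used here are common GDDR sizes of the era, taken as assumptions rather than confirmed parts:

def card_memory_mb(num_chips: int, chip_density_mbit: int) -> int:
    """Total on-card memory in MB when all DRAM chips have the same density."""
    return num_chips * chip_density_mbit // 8  # 8 megabits per megabyte

print(card_memory_mb(12, 256))   # 12 x 256Mbit -> 384 MB
print(card_memory_mb(12, 512))   # 12 x 512Mbit -> 768 MB
print(card_memory_mb(8, 512))    # 8 x 512Mbit  -> 512 MB (a conventional 256-bit card, for comparison)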
 

Wreckage

Banned
Originally posted by: otispunkmeyer


same as any other Brit tabloid... all the papers make ****** up, the Inq just does it online

The Inq needs "page 3" girls, then I could forgive em.... a bit.
 

ronnn

Diamond Member
Originally posted by: Wreckage
The Inq makes up a date. That date is wrong, so instead of the Inq being wrong, company X is late.

Not company X, but Nvidia :Q needed a respin and is late. :thumbsdown: Does this mean even lower IQ to push the fps? :laugh:


Heck, all companies need the occasional respin, no biggie, but it will likely lead to wild rumours of an overly hot underperformer. Time for those PR guys to make their bucks.
 

Gstanfor

Banned
Originally posted by: Avalon
Perhaps the DRAM modules on the 9500pro were 16bits wide, thus 8x16 = 128?

That's the likely reason. You often see 4 DRAMs on 64-bit budget cards for the same reason, where in theory only two are required.
 

Ulfhednar

Golden Member
Originally posted by: josh6079
We should just lock threads that deal with quotes from the Inquirer. Honestly, why give that trashy BS site another hit so it can pump more revenue into more BS? Let the site die and don't visit it.
I'm Barry Scott and this post is BANG on!
 

Cookie Monster

Diamond Member
Originally posted by: ronnn
Not company X, but Nvidia :Q needed a respin and is late. :thumbsdown: Does this mean even lower IQ to push the fps? :laugh:

Heck, all companies need the occasional respin, no biggie, but it will likely lead to wild rumours of an overly hot underperformer. Time for those PR guys to make their bucks.

:confused:

You don't have to be a troll just because Wreckage is being one.

Right now, G80 might have dedicated pixel shaders, a pool of VS/GS units (unified shaders), and unified TMU/ROPs. Sounds strange, doesn't it?

G80 is going to be really weird. Literally.
 

Ulfhednar

Golden Member
Why anyone would bother with a mess like that instead of just having pixel/vertex or unified shaders is beyond me. In fact, I am seriously hoping that Nvidia get into the unified shader game right off the bat like ATI. I will be upgrading in the middle or at the end of next year, and I am pretty bored of Nvidia right now as they don't seem competitive in price or features.
 

ronnn

Diamond Member
Originally posted by: Cookie Monster
:confused:

You don't have to be a troll just because Wreckage is being one.

Right now, G80 might have dedicated pixel shaders, a pool of VS/GS units (unified shaders), and unified TMU/ROPs. Sounds strange, doesn't it?

G80 is going to be really weird. Literally.

The G80 might have anything at this point. The Inquirer is exciting that way; being right some of the time keeps us all reading. I do believe the G80 is late, but maybe intentionally so, or maybe it needed a respin. One thing is certain: their more flamboyant articles on some company's problems lead to colourful threads.

 

Polish3d

Diamond Member
Originally posted by: jiffylube1024
However, 12 chips would also mean it has the bizarre configuration of 768MB of RAM (since to my knowledge the RAM chips all have to be the same density). So 384MB (again, unlikely they'd decrease the memory) or 768MB on-card.

Definitely would be weird...

Maybe not that weird, though: more than 512MB, not quite a gig.

 

Gstanfor

Banned
Originally posted by: ronnn
The G80 might have anything at this point. The Inquirer is exciting that way; being right some of the time keeps us all reading. I do believe the G80 is late, but maybe intentionally so, or maybe it needed a respin. One thing is certain: their more flamboyant articles on some company's problems lead to colourful threads.

Well, one good troll deserves another, so my *purely speculative* theory on why G80 is "delayed" is so Nvidia can blow R600 completely out of the water and hasten ATi's departure from the high-end discrete market.

One more thing: if G80 launches before R600 (which is extremely likely, "delay" or no "delay"), the general public will see R600, not G80, as the delayed chip...
 

ronnn

Diamond Member
Originally posted by: Gstanfor


Well, one good troll deserves another, so my *purely speculative* theory on why G80 is "delayed" is so Nvidia can blow R600 completely out of the water and hasten ATi's departure from the high-end discrete market.

You should write for the Inquirer. The logic is good: they are waiting because the G80 is still growing and will be much stronger and faster in 2 months.
 

Wreckage

Banned
Originally posted by: ronnn
Not company X, but Nvidia :Q needed a respin and is late. :thumbsdown: Does this mean even lower IQ to push the fps? :laugh:

Heck, all companies need the occasional respin, no biggie, but it will likely lead to wild rumours of an overly hot underperformer. Time for those PR guys to make their bucks.

Well I guess ATI is late. Maybe they need to cut down more trees for a bigger paper launch. No R600 until next year folks.
http://www.theinquirer.net/default.aspx?article=34359
 

josh6079

Diamond Member
Originally posted by: Wreckage
Well I guess ATI is late. Maybe they need to cut down more trees for a bigger paper launch. No R600 until next year folks.
They'll have to make sure they don't accidentally take some from Nvidia's pile, since G80 keeps getting pushed back.

I'll agree with you that the X1950 was a paper launch, but I can't see the R600 being a paper launch yet because there aren't even any reviews on it presently, nor G80 for that matter.
 

Wreckage

Banned
Originally posted by: josh6079
but I can't see the R600 being a paper launch yet because there aren't even any reviews on it presently, nor G80 for that matter.

It was sarcasm in response to ronnn's trolling.
 

lopri

Elite Member
The final (read: production-level) chip was delivered to NV. With some luck, they will barely make it in time for Christmas (mid~late November). On the other hand, R600 will have to wait till springtime, but will have superior characteristics. But then again, the 2~3 months will give NV enough time to squeeze something out to compete with R600. A very typical story, if you ask me.


 

ronnn

Diamond Member
Originally posted by: Wreckage
Originally posted by: josh6079
but I can't see the R600 being a paper launch yet because there aren't even any reviews on it presently, nor G80 for that matter.

It was sarcasm in response to ronnn's trolling.

I didn't make the Inquirer write that story, nor did I hold up the G80. I do agree that the R600 is also looking likely to be late. About paper launches: not sure and don't care, as long as the cards are on the shelves for a good price when I am ready to buy.
 

Cookie Monster

Diamond Member
An interesting rumour from beyond3d about the R600.

The R600 is going to use a 33cm PCB.
It's going to take up 3 PCIe slots (one slot for the cooler/PCB, and maybe 2 for the external PSU?).
It will use up 200W by itself.

:Q

Yes, it's very horrific.
 

Ackmed

Diamond Member
Because as we've found out in the past, everything posted on the internet is true.

Why people take rumors as facts is beyond me.
 

GundamSonicZeroX

Platinum Member
Originally posted by: tanishalfelven
Originally posted by: GundamSonicZeroX
I don't care. I'm not getting the DX10 cards until later next year when I upgrade to Vista. For now, I'm going to get a second GeForce 7900GT.

It sure as hell matters. G80's arrival will drive down the cost of the current high end. Remember the X850XT selling for <$150? I want that situation again; then I'll buy an X1950XTX. :)

I never said it didn't matter, I just said I don't care. If the G80 comes out before I'm able to get a second 7900GT, I can get it cheaper. :p
 

Cookie Monster

Diamond Member
Originally posted by: Ackmed
Because as we've found out in the past, everything posted on the internet is true.

Why people take rumors as facts is beyond me.

Here's the link. This thread is ALL speculation. I never said it was fact; I said rumour.
Don't have to get all worked up for nothing. ;)

Link

Next month, ATi will launch two cards, the X1650XT and the X1950Pro. Right after that, we should be expecting a new card based on the R600 core.

From our friends who saw it, the size is like the 7900GX2's. The length is 33cm and it takes up the space of 3 PCIe slots. It uses a single GPU and requires 200W with an 8-pin power connector.
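
For what it's worth, the rumoured 8-pin connector lines up with the 200W figure if you take the standard PCIe power limits (75W from the slot, 75W from a 6-pin plug, 150W from an 8-pin plug); the 200W draw itself is only the rumour quoted above. A rough sanity check:

# Standard PCIe power limits; the 200W draw is only the rumoured figure above.
PCIE_SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

rumoured_draw_w = 200

print(PCIE_SLOT_W + SIX_PIN_W >= rumoured_draw_w)    # False: slot + 6-pin (150W) falls short
print(PCIE_SLOT_W + EIGHT_PIN_W >= rumoured_draw_w)  # True: slot + 8-pin (225W) covers it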
 

Ackmed

Diamond Member
I'm not worked up, and I wasn't really replying to anyone. This thread is all about ATi and NV being late when no launch dates were even set, only rumors of them. Now there are more rumors that they're going to be late. It just seems silly to dive in head first and believe the Inq and others, with zero facts to back it up.
 

Cookie Monster

Diamond Member
Originally posted by: Ackmed
I'm not worked up, and I wasn't really replying to anyone. This thread is all about ATi and NV being late when no launch dates were even set, only rumors of them. Now there are more rumors that they're going to be late. It just seems silly to dive in head first and believe the Inq and others, with zero facts to back it up.

I don't think there's ANY fact involved in a speculation thread. :)
 

Polish3d

Diamond Member
Originally posted by: lopri
The final (read: production-level) chip was delivered to NV. With some luck, they will barely make it in time for Christmas (mid~late November). On the other hand, R600 will have to wait till springtime, but will have superior characteristics. But then again, the 2~3 months will give NV enough time to squeeze something out to compete with R600. A very typical story, if you ask me.


SPRING time? What the hell happened to the Christmas season? Jeez, we've been stuck with this same ass crap for ages, it seems. I should have gone CrossFire.