G80 To Use GDDR4 Memories

Soccerman06

Diamond Member
Jul 29, 2004
5,830
5
81
I bet the R600 will use GDDR4 and SM 4.0 as well, but until a reliable source posts an answer, we can't tell for sure.
 

Banzai042

Senior member
Jul 25, 2005
489
0
0
Wow, that's not wild speculation about a product that is probably close to a year away</sarcasm>
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: Banzai042
Wow, that's not wild speculation about a product that is probably close to a year away</sarcasm>

What is so wild about it? We have been using GDDR3 this generation and last... For instance, the 6800 Ultra has more memory bandwidth than the 7800 GT. This is going to be a problem... Solution? Faster memory... But since the mfg process can do only so much to ramp up clock speed, we have to offer another solution. GDDR4.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: ArchAngel777
Originally posted by: Banzai042
Wow, that's not wild speculation about a product that is probably close to a year away</sarcasm>

What is so wild about it? We have been using GDDR3 this generation and last... For instance, the 6800 Ultra has more memory bandwidth than the 7800 GT. This is going to be a problem... Solution? Faster memory... But since the mfg process can do only so much to ramp up clock speed, we have to offer another solution. That solution is GDDR4.

Keep in mind the 7800 GTX only has a tad faster memory... I believe 100MHz faster. That is a clock increase of roughly 8%. The 7800 GT actually has slower memory than the 6800 Ultra... :-/ This was supposed to be an edit in the above post...
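To put rough numbers on the bandwidth comparison, here's a quick sketch. The effective memory clocks are my recollection of the reference specs from that era (treat them as approximate); all three cards used a 256-bit bus.

```python
# Peak GDDR memory bandwidth: effective clock (MHz) * bus width (bits) / 8,
# converted from MB/s to GB/s.

def bandwidth_gbps(effective_clock_mhz: int, bus_width_bits: int = 256) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return effective_clock_mhz * bus_width_bits / 8 / 1000

# Approximate reference-spec effective memory clocks (DDR-effective MHz).
cards = {
    "GeForce 6800 Ultra": 1100,  # 550 MHz GDDR3
    "GeForce 7800 GT":    1000,  # 500 MHz GDDR3
    "GeForce 7800 GTX":   1200,  # 600 MHz GDDR3
}

for name, clock in cards.items():
    print(f"{name}: {bandwidth_gbps(clock):.1f} GB/s")
```

With those clocks the 7800 GT does come out below the 6800 Ultra (32.0 vs 35.2 GB/s), and the GTX's lead is only about 9%, which is the point being made above.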

 

Banzai042

Senior member
Jul 25, 2005
489
0
0
I know that it is a logical assumption, however it still is just that; the G70 cards haven't exactly been out for an incredibly long time, and only 2 of the cards in the line have even been released so far. I personally think it's far too soon to put such things out there as solid fact, even if we can logically *assume* that it will be the case.
 

klah

Diamond Member
Aug 13, 2002
7,070
1
0
I'm sure they work closely with Samsung since their products are so heavily codependent.


 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: Banzai042
I know that it is a logical assumption, however it still is just that; the G70 cards haven't exactly been out for an incredibly long time, and only 2 of the cards in the line have even been released so far. I personally think it's far too soon to put such things out there as solid fact, even if we can logically *assume* that it will be the case.

Basically, until someone has the card in their hand, everything will be speculation. The OP didn't state it as a fact, just that early reports indicate that the G80 will be using GDDR4. You have to remember that nVidia has been working on the G80 for quite a while, just like ATI is already working on the R600. I am pretty sure that once a card is released, the expert R&D engineers begin working on the next design, while another, less experienced team handles the refresh part that follows a few months down the road. Again, this is my speculation, as it seems to make some sense. It would be a waste of a core group of engineers' time to have them working on the 'refresh' cycle, when nothing too significant happens with it.

This is speculation on my part... But given that these GPU's are getting more complex every generation, they need to get a head start right away.

Take it for what you will, but these forums are full of thinkers and being that we think, we love to speculate.
 

Spike

Diamond Member
Aug 27, 2001
6,770
1
81
Man, just when I was looking forward to getting a 7800GTX and it's already outdated... ;)

-spike
 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
rumors ... speculation ...

Seen a few wild rumors floating around that the G80 will be the first gigahertz video processor... we'll see.
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
Originally posted by: jiffylube1024
That's a lot of Fours. GDDR4, SM 4... 80 is a multiple of 4 as well ;) . The stars align.

Maybe 40 pipes? How about 40nm also? Hmm... Maybe even 4 cores on a single card running in a pseudo-SLI-like setup?
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Hacp
Originally posted by: jiffylube1024
That's a lot of Fours. GDDR4, SM 4... 80 is a multiple of 4 as well ;) . The stars align.

Maybe 40 pipes? How about 40nm also? Hmm... Maybe even 4 cores on a single card running in a pseudo-SLI-like setup?

Awesome! I think 40-pipes and 4-core SLI could be possibilities. I don't hold out much hope for 40nm though ;) .

I think they should call it GeForce 2^3 (TM) and make people call it: "GeForce: 2 to the power of 3." THAT's marketing!

But the question is, will ATI's XXL1800 XT Strontium Edition be faster? ;)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: jiffylube1024
Originally posted by: Hacp
Originally posted by: jiffylube1024
That's a lot of Fours. GDDR4, SM 4... 80 is a multiple of 4 as well ;) . The stars align.

Maybe 40 pipes? How about 40nm also? Hmm... Maybe even 4 cores on a single card running in a pseudo-SLI-like setup?

Awesome! I think 40-pipes and 4-core SLI could be possibilities. I don't hold out much hope for 40nm though ;) .

I think they should call it GeForce 2^3 (TM) and make people call it: "GeForce: 2 to the power of 3." THAT's marketing!

But the question is, will ATI's XXL1800 XT Strontium Edition be faster? ;)
The only four I'd like to see [again] is $400 for the top card. :p

"Power of Four"
:thumbsdown:
 
Oct 19, 2000
17,860
4
81
Originally posted by: jiffylube1024
That's a lot of Fours. GDDR4, SM 4... 80 is a multiple of 4 as well ;) . The stars align.
Me thinks you stretched a bit too far for that last one :p :D.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Do you think they'd have to stick something like a Reserator or a fridge on top of it then?

If so, I'd like it to be a beer fridge...

And I'd expect to get a crate of free beer of my choosing when that beast comes out with its beastly price tag.
 

jazzboy

Senior member
May 2, 2005
232
0
0
To be fair drifter106 does make a point.

The time really has come to design hardware which is more power efficient. It's good to see AMD, and eventually Intel, doing it; Nvidia has done it a bit with the G70 - ATI is the only one we have yet to see do this.

And also, speaking of which, it would be nice if upcoming video cards had a much more aggressive Cool'n'Quiet/SpeedStep-style feature - where, say, the clock speeds dropped by 70 or 80% at idle, letting the fans switch off as well and saving even more power.
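For a back-of-the-envelope feel for why aggressive downclocking helps: dynamic power scales roughly as P ~ C * V^2 * f, so a large frequency drop (which usually permits a voltage drop too) cuts power more than proportionally. The specific scale factors below are made-up illustrations, not measurements of any real card.

```python
# Rough dynamic-power scaling model: P is proportional to V^2 * f.
# Only relative values matter here; capacitance C cancels out of the ratio.

def relative_power(freq_scale: float, volt_scale: float) -> float:
    """Dynamic power as a fraction of full-clock, full-voltage power."""
    return volt_scale ** 2 * freq_scale

# Hypothetical idle mode: clocks dropped by 75%, voltage trimmed by 20%.
print(f"{relative_power(0.25, 0.80):.0%} of full dynamic power")
```

Under those assumed numbers the GPU would draw about 16% of its full-load dynamic power at idle, which is the kind of headroom that could let fans spin down or stop.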