
G80 To Use GDDR4 Memories

I bet the R600 will use GDDR4 and SM 4.0 also, but until a reliable source posts an answer, we can't tell for sure.
 
Originally posted by: Banzai042
Wow, that's not wild speculation about a product that is probably close to a year away</sarcasm>

What is so wild about it? We have been using GDDR3 this generation and last... For instance, the 6800 Ultra has more memory bandwidth than the 7800 GT. This is going to be a problem... Solution? Faster memory... But since the manufacturing process can only do so much to ramp up clock speed, we have to offer another solution. GDDR4.
 
Originally posted by: ArchAngel777
Originally posted by: Banzai042
Wow, that's not wild speculation about a product that is probably close to a year away</sarcasm>

What is so wild about it? We have been using GDDR3 this generation and last... For instance, the 6800 Ultra has more memory bandwidth than the 7800 GT. This is going to be a problem... Solution? Faster memory... But since the manufacturing process can only do so much to ramp up clock speed, we have to offer another solution. That solution is GDDR4.

Keep in mind the 7800 GTX only has slightly faster memory... I believe 100MHz faster, a clock increase of roughly 8%. The 7800 GT actually has slower memory than the 6800 Ultra... :-/ This was supposed to be an edit in the above post...
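
For anyone curious where those comparisons come from, here's a quick back-of-envelope sketch. Bandwidth is just bus width times effective clock; the clock and bus figures below are the commonly quoted reference specs for these cards, so treat them as assumptions rather than gospel:

```python
# Rough memory-bandwidth arithmetic: bus width (bytes) x effective clock.
def bandwidth_gbs(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s for a given bus width and
    DDR-doubled (effective) memory clock in MHz."""
    return bus_bits / 8 * effective_mhz / 1000

# Commonly cited reference clocks (effective, i.e. DDR-doubled):
cards = {
    "6800 Ultra": (256, 1100),  # GDDR3
    "7800 GT":    (256, 1000),  # GDDR3
    "7800 GTX":   (256, 1200),  # GDDR3
}
for name, (bus, mhz) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, mhz):.1f} GB/s")
```

Run it and the 7800 GT does come out below the 6800 Ultra (32.0 vs 35.2 GB/s), with the GTX only modestly ahead at 38.4 GB/s, which is exactly why faster memory like GDDR4 starts to look necessary.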

 
I know that it is a logical assumption, but it still is just that; the G70 cards haven't exactly been out for an incredibly long time, and only 2 of the cards in the line have even been released so far. I personally think it's far too soon to put such things out there as solid fact, even if we can logically *assume* that it will be the case.
 
Originally posted by: Banzai042
I know that it is a logical assumption, but it still is just that; the G70 cards haven't exactly been out for an incredibly long time, and only 2 of the cards in the line have even been released so far. I personally think it's far too soon to put such things out there as solid fact, even if we can logically *assume* that it will be the case.

Basically, until someone has the card in their hand, everything will be speculation. The OP didn't state it as a fact, just that early reports indicate the G80 will be using GDDR4. You have to remember that nVidia has been working on the G80 for quite a while, just like ATI is already working on the R600. I am pretty sure that once a card is released, the expert R&D engineers begin working on the next design, while another, less experienced team handles the refreshed part that follows a few months down the road. Again, this is my speculation, but it seems to make sense. It would be a waste of a core group of engineers' time to have them working on the 'refresh' cycle, when nothing too significant happens there.

This is speculation on my part... But given that these GPU's are getting more complex every generation, they need to get a head start right away.

Take it for what you will, but these forums are full of thinkers and being that we think, we love to speculate.
 
Man, just when I was looking forward to getting a 7800GTX and it's already outdated... 😉

-spike
 
rumors ... speculation ...

I've seen a few wild rumors floating around that the G80 will be the first gigahertz video processor... we'll see.
 
Originally posted by: jiffylube1024
That's a lot of Fours. GDDR4, SM 4... 80 is a multiple of 4 as well 😉 . The stars align.

Maybe 40 pipes? How about 40nm also? Hmm... Maybe even 4 cores on a single card running in a pseudo-SLI-like setup?
 
Originally posted by: Hacp
Originally posted by: jiffylube1024
That's a lot of Fours. GDDR4, SM 4... 80 is a multiple of 4 as well 😉 . The stars align.

Maybe 40 pipes? How about 40nm also? Hmm... Maybe even 4 cores on a single card running in a pseudo-SLI-like setup?

Awesome! I think 40-pipes and 4-core SLI could be possibilities. I don't hold out much hope for 40nm though 😉 .

I think they should call it GeForce 2^3 (TM) and make people say: "GeForce: 2 to the power of 3." THAT's marketing!

But the question is, will ATI's XXL1800 XT Strontium Edition be faster? 😉
 
Originally posted by: jiffylube1024
Originally posted by: Hacp
Originally posted by: jiffylube1024
That's a lot of Fours. GDDR4, SM 4... 80 is a multiple of 4 as well 😉 . The stars align.

Maybe 40 pipes? How about 40nm also? Hmm... Maybe even 4 cores on a single card running in a pseudo-SLI-like setup?

Awesome! I think 40-pipes and 4-core SLI could be possibilities. I don't hold out much hope for 40nm though 😉 .

I think they should call it GeForce 2^3 (TM) and make people say: "GeForce: 2 to the power of 3." THAT's marketing!

But the question is, will ATI's XXL1800 XT Strontium Edition be faster? 😉
The only four I'd like to see [again] is $400 for the top card. 😛

"Power of Four"
:thumbsdown:
 
Do you think they'd have to stick something like a Reserator or maybe a fridge on top of that, then?

If so, I'd like it to be a beer fridge...

And I'd expect to get a crate of free beer of my choosing when that beast comes out with a beastly price tag.
 
To be fair, drifter106 does make a point.

The time really has come to design hardware that is more power efficient. It's good to see AMD, and eventually Intel, doing it; Nvidia has done it a bit with the G70, so ATI is the only one we have yet to see do this.

Also, speaking of which, it would be nice if upcoming video cards had a much more aggressive Cool'n'Quiet/SpeedStep-style feature, where clock speeds drop by 70 or 80% at idle, so the fans could switch off as well and save even more power.
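
To put a rough number on why idle downclocking would matter: dynamic power scales roughly with frequency times voltage squared, so dropping both at idle compounds the savings. The 100 W figure and scaling factors below are made-up illustrative numbers (real GPUs also have static leakage this sketch ignores):

```python
# Back-of-envelope: dynamic power ~ frequency x voltage^2.
def dynamic_power(p_full_watts, freq_scale, volt_scale):
    """Estimated dynamic power after scaling clock and voltage
    relative to full load."""
    return p_full_watts * freq_scale * volt_scale ** 2

full_load_w = 100  # hypothetical full-load GPU power draw
# 75% clock drop plus a modest voltage drop at idle:
idle_w = dynamic_power(full_load_w, freq_scale=0.25, volt_scale=0.8)
print(f"{idle_w:.0f} W")  # prints "16 W"
```

Even with these rough assumptions, a 75% clock drop with a small voltage drop cuts dynamic power by over 80%, easily enough headroom to stop the fans at idle.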
 