
nVidia G80 rumors

Originally posted by: Aflac
Originally posted by: Roguestar
Octo SLi. That's funny. What's even funnier is the thought that anyone would have the room on a motherboard to fit 8 graphics cards, even if two are 2 GPU per card, let alone anyone have the hard cash to even buy such a fantasy system.

Wasn't there an octo-SLI workstation or something recently? I believe the graphics cards were held externally or something.

Octo-SLI is completely useless if they can't even get Quad-SLI to work properly.
 
What about software voltage regulation? Has there been any rumor about being able to control the voltages via software, like ATI cards can?
 
Octo-SLI could be good for high levels of AA but nVidia needs to improve their current AA engine first.

Compared to ATi, their AA patterns are far too limited, and it just gets worse when you throw more cards at the problem.
 
Originally posted by: Elfear
Originally posted by: Cookie Monster

-New AA algorithms, which also indicate a major makeover in the IQ department. We can expect new AF from NV as well. Not to mention HDR + AA will be there.

Have a source for this rumor? Most of the other ones I've heard about to one degree or another, but new AF and HDR+AA is a new one. I know a lot of people are HOPING that G80 will have these features, but I wasn't aware of any solid rumors saying it would.

It was reported via a French site, claiming G80 will feature a completely new AA engine with, yes, full AA+HDR capabilities. The source is linked in the big G80 rumour thread at B3D, which I'll try to get to you when I have time.

Cookie Monster's specs seem to pretty much be on par with what's being tossed around right now (except for clock speeds, which really aren't determined until closer to launch).

nVidia has stated they are aiming for release later this year, implying late 2006. I'd personally venture October or November. I think G80 is ready; they're just finishing milking G70.

I think what people need to realize is that this is a new architecture, and to compare the "speculated" unit counts to those of NV40 is quite an inaccurate judgement.

 
Originally posted by: GTaudiophile

Another fact: R600 will b!tch slap G80 up one side and down the other.


Kind of hard for it to be a fact when neither one of the cards has been released.
 
Originally posted by: RedStar
hey look at this:

octo CrossFire is approaching... if ATi ever doubles up its GPUs 🙂

P5W64 WS Pro has FOUR 16x slots 🙂

http://www.hexus.net/content/item.php?item=6635

1 hefty board to look at while we wait 🙂

The slots are all right next to each other though, so you would only be able to use two dual slot cards. It's probably meant to be used with SCSI cards and stuff like that.
 
Originally posted by: GTaudiophile
One thing is absolutely certain: G80 will not have unified shaders unless the Vista delay has given NVDA time to add them.

Another fact: R600 will b!tch slap G80 up one side and down the other.

Someone doesn't know what he is talking about...
 
Originally posted by: Cookie Monster
Originally posted by: GTaudiophile
One thing is absolutely certain: G80 will not have unified shaders unless the Vista delay has given NVDA time to add them.

Another fact: R600 will b!tch slap G80 up one side and down the other.

Someone doesn't know what he is talking about...

And those who deny it do?

Fact is, no one knows how the two will perform against each other. No reason to jump the gun with the flames if there isn't even firewood yet.
 
"The slots are all right next to each other though, so you would only be able to use two dual slot cards. It's probably meant to be used with SCSI cards and stuff like that."

My take on that is that Asus is optimistic that the cards will only need single-slot cooling!! hehe 🙂 =)
 
Pipes and MHz are becoming clichéd, more-or-less marketing terms, because every time they come out with a new card they twist the definition of what's considered a pipe.
 
Originally posted by: jiffylube1024
Originally posted by: Frackal
Geez, 400mhz sounds pretty weak


Originally posted by: josh6079
Not if it is powering 32 pipelines.

It's still awfully weak since we've already got a 650 MHz core 24 pipe card. But there's no way it will be 400 MHz. It will be way higher.

This is assuming, of course, that G80 will employ pipelines identical to G70/71's, which is something we cannot assume.
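The pipes-times-clock argument in this exchange boils down to a back-of-the-envelope fillrate comparison. A minimal sketch, using the figures quoted above (this ignores shader throughput, memory bandwidth, and per-pipe efficiency, which is exactly why the comparison can mislead across architectures):

```python
# Rough theoretical pixel fillrate: pipelines * core clock, in Mpixels/s.
# Purely illustrative; real performance depends on the architecture,
# which is the point being argued above.
def fillrate_mpix(pipes, clock_mhz):
    return pipes * clock_mhz

current = fillrate_mpix(24, 650)  # the existing 650 MHz, 24-pipe card
rumored = fillrate_mpix(32, 400)  # the rumored 32-pipe part at 400 MHz

print(current)  # 15600
print(rumored)  # 12800
```

By this naive measure the rumored 400 MHz, 32-pipe part would actually have *less* raw fillrate than the existing 650 MHz, 24-pipe card, which is why the clock figure draws skepticism unless the new pipes do more work per clock.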

 
G80 specs.

Dual chips,
42 pipes
29 pixel shaders
3.2 GHz Core
2.4 GHz FSB
HDMI Compatible
4 DDR2 1066 slots
ICH7R southbridge
2 SATA ports
Integrated 7.1 CH hi def audio
7 DVI ports.
PCI Express x64
 
Originally posted by: Rock Hydra
G80 specs.

Dual chips,
42 pipes
29 pixel shaders
3.2 GHz Core
2.4 GHz FSB
HDMI Compatible
4 DDR2 1066 slots
ICH7R southbridge
2 SATA ports
Integrated 7.1 CH hi def audio
7 DVI ports.
PCI Express x64

And comes with a 42" plasma screen built in!

 
Originally posted by: Regs
Pipes and MHz are becoming clichéd, more-or-less marketing terms, because every time they come out with a new card they twist the definition of what's considered a pipe.

Agreed. Not to mention these pipes are "DX10" pipes, not DX9 ones.

The 32-pipeline config could be 32x2 or even 32x3 (based on rumours of multi-function ALUs, or NV adding another ALU to the two per pipeline it already has in G71).

400 MHz could be the right estimate, because you don't need high clocks for performance; most of it comes down to the architectural side of things. Look at R300 vs. NV30: NV30 was clocked at 500 MHz while R300 was clocked around ~300 MHz. Same goes for NV40 and R420: one's clocked at ~400 MHz and the other at ~520 MHz. The two cards performed similarly.

If ATi did indeed have a 64-unit unified shader architecture, it doesn't sound too great unless there's other stuff added to them. The R580 has 48 PS and 8 VS, a total of 56 shaders. Now take the R600: since it has 64, in DX9 it would have to be configured to run at a fixed number of PS/VS, so maybe a 56/8 config or 54/10. That doesn't sound like as big of a leap, especially since R580 tripled the shader count of R520 but wasn't three times the performance.
The performance won't increase much, but I'm thinking the minimum fps will rise considerably due to the unified shaders.
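The shader-count arithmetic above can be laid out quickly (the fixed DX9 splits here are the poster's speculation, not confirmed specs):

```python
# R580's fixed DX9 shader layout vs. a speculated 64-unit unified R600.
r580_ps, r580_vs = 48, 8
r580_total = r580_ps + r580_vs   # 56 shader units in total

r600_unified = 64
# Under DX9 a unified part would still present some fixed split,
# e.g. the 56/8 or 54/10 configurations floated above.
for ps, vs in [(56, 8), (54, 10)]:
    assert ps + vs == r600_unified

print(r600_unified - r580_total)  # 8
```

Only eight more units than R580 in total, which is why the raw count alone doesn't suggest a huge DX9 performance leap.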

I'm guessing there's something else to these 64 unified shaders, or ATi deliberately made R600 "small" and high-clocked. That could mean higher yields as well as a smaller die size, which leads to lower cost, while NV took the low-clocked, "big" approach. (Similar to NV40 (big)/R420 (small), or G71 (small)/R580 (big).)
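The "small and high-clocked" vs. "big and low-clocked" trade-off mentioned above comes down to defect density: under the common Poisson yield model, die yield falls off exponentially with die area. A sketch with purely illustrative numbers (not actual R600/G80 figures):

```python
import math

# Poisson yield model: yield = exp(-defect_density * die_area).
# The defect density and die areas below are made-up examples.
def die_yield(defects_per_cm2, area_cm2):
    return math.exp(-defects_per_cm2 * area_cm2)

small = die_yield(0.5, 2.0)  # the "small, high-clocked" approach
big   = die_yield(0.5, 3.5)  # the "big, low-clocked" approach

print(round(small, 2))  # 0.37
print(round(big, 2))    # 0.17
```

Even a modest difference in die area cuts yield substantially, which is why a smaller die can mean noticeably lower cost per working chip.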
 
Originally posted by: Rock Hydra
G80 specs.

Dual chips,
42 pipes
29 pixel shaders
3.2 GHz Core
2.4 GHz FSB
HDMI Compatible
4 DDR2 1066 slots
ICH7R southbridge
2 SATA ports
Integrated 7.1 CH hi def audio
7 DVI ports.
PCI Express x64

All cooled by a Tuniq Tower 120 😛
 