The New and Improved "G80 Stuff"


virtualrain

Member
Aug 7, 2005
158
0
0
Originally posted by: nitromullet
Actually, as indicated in the first post of this thread...
The 8800GTX measures 11", or about 279mm, and the 8800GTS 9", or about 229mm
The new pics don't seem to deviate from the earlier spy photos (at least not for the GTX), so I'm going to assume that the GTX is actually 11" long! The bottom picture appears to be an 8800GTS, which is the 9" card.

As I said, I compared the two pics in Photoshop and scaled them (pixel for pixel) so that the height and length of the PCI-E connector matched. After doing this, the ASUS GTX card was 13.5% longer than a 7800 GTX (which is exactly 9" long), making the GTX about 10.2", or 260mm, long.

Even at 10.2" it will have trouble fitting into mid-tower cases with drives in bays adjacent to the motherboard. I had to leave a drive bay empty to fit my 9" 7800 GTX in my old TT Tsunami case.

It's still a monster either way and if you have 11" of room to work with, you will be fine!
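For anyone who wants to redo that comparison, the scaling arithmetic is simple. A minimal Python sketch follows; the pixel counts are made-up placeholders, not measurements from the actual photos:

```python
# Estimate a card's real length from two photos scaled to a common reference.
# Matching the PCI-E connector's pixel size puts both images at the same scale.

REFERENCE_LENGTH_IN = 9.0  # the 7800 GTX is exactly 9" long (per the post)

def estimate_length_in(ref_card_px: float, new_card_px: float) -> float:
    """New card's length in inches, given pixel lengths measured after
    both images have been scaled to the same reference."""
    return REFERENCE_LENGTH_IN * (new_card_px / ref_card_px)

# Hypothetical pixel measurements, for illustration only:
length_in = estimate_length_in(ref_card_px=900, new_card_px=1021.5)
print(f'Estimated 8800GTX length: {length_in:.1f}" ({length_in * 25.4:.0f}mm)')
# A card 13.5% longer than 9" works out to about 10.2" (~260mm).
```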

 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Good thing the douche only took one pic of the bottom, with a huge amount of glare, so you can't see whether it's two dies on one package or not.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: ArchAngel777
Originally posted by: Acanthus
Originally posted by: lopri
Originally posted by: nitromullet
This is an interesting twist on the power consumption thing IMO...

http://www.pcpower.com/products/power_supplies/max-performance/

PCP&C no longer has a multi-rail 1kW PSU... They used to have three 1kW multi-rail models. They all had the same rails, but different branding for Xfire and SLI, and one with two 6-pin molexes instead of four. They now have one with a single 72A 12V rail. IIRC, PCP&C PSUs are used by both NV and ATI at hardware conferences etc. to show off the new gear, so I would assume that PCP&C has a relationship with both chipmakers. I am curious whether this PSU is in response to some info obtained from NV, or to testing done with 8800GTX SLI...
1kW with a single 72A rail.. one MUST be careful if s/he happens to open that PSU, I guess.

A 250W can kill you...

So can the Easter Bunny.

As for the renders of the human face... I am not impressed. Sure, it is more real than anything done before, but it looks freakish and doesn't strike me as "cool". Besides, it is only a head... What about the rest of the person? And what about the environment? This release seems lacklustre...

Please touch the caps of a 250W PSU with metal.

It'll not only prove my point, but I won't have to hear your retarded posts anymore.
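For reference, the rail spec quoted in this exchange is easy to sanity-check. A minimal Python sketch; the SLI load figure is an assumed number for illustration, not anything measured:

```python
# Sanity-check the single 72A 12V rail on the PCP&C 1kW unit quoted above.
RAIL_VOLTS = 12.0
RAIL_AMPS = 72.0  # per the PCP&C spec

rail_watts = RAIL_VOLTS * RAIL_AMPS
print(f"12V rail capacity: {rail_watts:.0f}W")  # 864W on one rail

# Hypothetical SLI load: two ~150W cards plus ~100W of CPU/board draw on 12V.
assumed_load_watts = 2 * 150 + 100
print(f"Headroom at assumed load: {rail_watts - assumed_load_watts:.0f}W")
```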
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Looking at the pics, I wonder what the point of the vents on the heatsink shroud is. It seems like they would just dump hot air into your case instead of venting it out the back. I assume they know what they are doing, but it still seems odd to me.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: Acanthus
Originally posted by: ArchAngel777
Originally posted by: Acanthus
Originally posted by: lopri
Originally posted by: nitromullet
This is an interesting twist on the power consumption thing IMO...

http://www.pcpower.com/products/power_supplies/max-performance/

PCP&C no longer has a multi-rail 1kW PSU... They used to have three 1kW multi-rail models. They all had the same rails, but different branding for Xfire and SLI, and one with two 6-pin molexes instead of four. They now have one with a single 72A 12V rail. IIRC, PCP&C PSUs are used by both NV and ATI at hardware conferences etc. to show off the new gear, so I would assume that PCP&C has a relationship with both chipmakers. I am curious whether this PSU is in response to some info obtained from NV, or to testing done with 8800GTX SLI...
1kW with a single 72A rail.. one MUST be careful if s/he happens to open that PSU, I guess.

A 250W can kill you...

So can the Easter Bunny.

As for the renders of the human face... I am not impressed. Sure, it is more real than anything done before, but it looks freakish and doesn't strike me as "cool". Besides, it is only a head... What about the rest of the person? And what about the environment? This release seems lacklustre...

Please touch the caps of a 250W PSU with metal.

It'll not only prove my point, but I won't have to hear your retarded posts anymore.

QFP

 

CP5670

Diamond Member
Jun 24, 2004
5,681
782
126
Why are there two SLI connectors on that thing? Maybe it's for quad SLI with four distinct cards?
 
Oct 4, 2004
10,515
6
81
8800GTX 3DMark06 scores with
-FX-62: 10,000
-Core 2 X6800: 10,500
-Core 2 QX6700: 12,000

Wow, how convenient is that. Nice round numbers. And this PoS calls itself a review???
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
I know, rofl, but those numbers on the Chinese site seem pretty accurate.
Also, I think you need both of those SLI bridges to make the GTXs work in SLI. My opinion...
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: ArchAngel777
Originally posted by: Acanthus
Originally posted by: ArchAngel777
Originally posted by: Acanthus
Originally posted by: lopri
Originally posted by: nitromullet
This is an interesting twist on the power consumption thing IMO...

http://www.pcpower.com/products/power_supplies/max-performance/

PCP&C no longer has a multi-rail 1kW PSU... They used to have three 1kW multi-rail models. They all had the same rails, but different branding for Xfire and SLI, and one with two 6-pin molexes instead of four. They now have one with a single 72A 12V rail. IIRC, PCP&C PSUs are used by both NV and ATI at hardware conferences etc. to show off the new gear, so I would assume that PCP&C has a relationship with both chipmakers. I am curious whether this PSU is in response to some info obtained from NV, or to testing done with 8800GTX SLI...
1kW with a single 72A rail.. one MUST be careful if s/he happens to open that PSU, I guess.

A 250W can kill you...

So can the Easter Bunny.

As for the renders of the human face... I am not impressed. Sure, it is more real than anything done before, but it looks freakish and doesn't strike me as "cool". Besides, it is only a head... What about the rest of the person? And what about the environment? This release seems lacklustre...

Please touch the caps of a 250W PSU with metal.

It'll not only prove my point, but I won't have to hear your retarded posts anymore.

QFP

LOL, I'm not going to change it.

You get on my nerves and I'm not afraid to let you know.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
OK, I have some info that might've already been posted in this thread.. this is from a Chinese friend of mine who has been researching these cards with info from sources in China who don't care about NDAs.

I'm not sure what has and hasn't been posted here, but I gathered up all my info here, to be concise, for the OP to either add or not-

G80
-GTS same length as the 7900GTX
-GTX is longer
-F.E.A.R. 1600x1200: 16xAA @ 34 FPS, 4x/8x @ 80 FPS, no IQ options @ 130 FPS (for reference, at 16x12 with no AA/AF, X1950 Xfire gets 124 and 7900GTX SLI gets 139 FPS; see the quick calc after this list)
-Vastly improved HDR performance
-A64 4000+: 3DMark06 ~9K
-Core 2 Quad: 3DMark06 ~12K
-Black PCB for retail G80s
-All G80s have a 512-bit MC, but it's disabled because GDDR4 is largely unavailable
-Nvidia might respin G80 at R600's launch, if needed, to enable the full 512-bit MC
-The NV10 chip to the side of the main GPU is a 10-bit/27MHz video DAC (possibly H.264 decoding)
-Possible $699 price tag
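Taking the F.E.A.R. figures in the list above at face value, here is the quick calc referenced there; a small Python sketch using only the quoted numbers, nothing independently measured:

```python
# Compare the rumored single-G80 F.E.A.R. result (1600x1200, no AA/AF)
# against the dual-GPU figures quoted in the same list.
g80_fps = 130
rivals = {"X1950 Crossfire": 124, "7900GTX SLI": 139}

for name, fps in rivals.items():
    print(f"G80 vs {name}: {g80_fps / fps:.0%}")
# ~105% of X1950 Crossfire and ~94% of 7900GTX SLI -- roughly last-gen
# dual-GPU performance from a single card, as claimed later in the post.
```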

G81
-June '07
-65nm
-GDDR4
-512-bit MC

ATI R600
-This is ATI's last attempt, so it's going to be good. And that's why they don't care about missing Christmas sales.. it's the last high-end part from them like this. After this, everything moves to a pro-AMD stance.
-16-layer PCB
-512-bit MC
-8 memory channels
-GDDR4
-Extremely hot (but AMD is helping with that big time currently)
-80nm
-Produced by TSMC


All in all-
R600 will be a beast, but might not destroy G80 in performance regardless.. it should be a very good match for G80, or beat it, but as stated above, NV is thinking of respinning G80 at R600's launch if needed (to enable the full 512-bit MC + GDDR4).. and G81 will be ready in June. It's not looking good for "ATI" high end in the future; Nvidia wants to punish ATI one last time to end this charade once and for all, now that AMD has taken over ATI.

My thoughts-
I'm extremely happy if this info is all true, and I'm inclined to believe it since it's from China. People in the West fake leaks because they can't get any info out of any NDA signees.
X1950 Crossfire / 7900GTX SLI performance on a single card, plus DX10/SM4, faster HDR, HDR+AA on Nvidia with Nvidia driver support, unified shaders.... it's all phenomenal, and that's what I was hoping for and expecting.

I'm not a big fan of the "waiting game" because it's a game you can't really win. I'm buying G80, and buying G81 as well in June. I'll skip the respin if Nvidia does that at R600's launch. Then after that I'll be buying G90 as well.
All I've found with the waiting game is that you end up holding your current card for far too long, until the value is completely gone from it. Then you have a half-worthless card and don't get to play with the newest stuff. So I sold my GTX and am now eagerly awaiting, check card in hand, the first 8800GTX I can get my hands on. :)
The value of all current X1900s and GF7s is about to plummet. I lost a total of $40 off the original purchase price of my 7900GTX, which isn't a bad loss IMO.. one of my better "rounds".

I hope R600 is a good chip, I really do, because I want to see the final product put up a damn good battle against the mighty Nvidia.
 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
Originally posted by: Crusader
ATI R600
-This is ATI's last attempt, so it's going to be good. And that's why they don't care about missing Christmas sales.. it's the last high-end part from them like this. After this, everything moves to a pro-AMD stance.
Hahahahahahaha, I can't wait to see you eat those words when the refreshes are out, and when whatever next generation comes after R600 and G80 is over and done with. AMD have said over and over that the ATI brand and Radeon product family isn't going anywhere, and will continue on as AMD's flagship graphics product. :laugh:
 

virtualrain

Member
Aug 7, 2005
158
0
0
Here's some info on the HSF mounting holes for those of you who use water cooling or other aftermarket HSF products...

As illustrated in this Photoshop analysis of a G80 vs. G70 card, the mounting hole spacing is different...

Note that the images were sized identically using the PCI-e connector as reference, and then a red box connecting all four holes on the G70 was overlaid on the G80 for comparison.

Ignore the "What's This?"... that's the RAMDAC.
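For anyone who wants millimetre figures rather than an overlay, the same reference trick converts pixel distances to real units. A minimal Python sketch; the connector length is approximate and all pixel coordinates are hypothetical placeholders, so measure your own from the pics:

```python
# Convert pixel distances from identically scaled screenshots into mm,
# using the PCI-E x16 edge connector (~89mm) as the ruler.
import math

PCIE_X16_MM = 89.0  # approximate edge-connector length; verify before trusting

def mm_per_px(connector_px: float) -> float:
    """Scale factor once both images are sized to the same connector length."""
    return PCIE_X16_MM / connector_px

def hole_distance_mm(p1: tuple, p2: tuple, scale: float) -> float:
    """Euclidean distance between two mounting-hole centres, in mm."""
    return math.dist(p1, p2) * scale

# Hypothetical pixel coordinates of two holes, for illustration only:
scale = mm_per_px(connector_px=425.0)
print(f"Hole spacing: {hole_distance_mm((100, 100), (400, 100), scale):.1f}mm")
```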

 

WelshBloke

Lifer
Jan 12, 2005
33,320
11,472
136
R600 will be a beast, but might not destroy G80 in performance regardless.. it should be a very good match for G80, or beat it, but as stated above, NV is thinking of respinning G80 at R600's launch if needed

Wow, Crusader bigging up R600, who'd have thought it. :Q


:p

 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Ulfhednar
Originally posted by: Crusader
ATI R600
-This is ATI's last attempt, so it's going to be good. And that's why they don't care about missing Christmas sales.. it's the last high-end part from them like this. After this, everything moves to a pro-AMD stance.
Hahahahahahaha, I can't wait to see you eat those words when the refreshes are out, and when whatever next generation comes after R600 and G80 is over and done with. AMD have said over and over that the ATI brand and Radeon product family isn't going anywhere, and will continue on as AMD's flagship graphics product. :laugh:

Uhhh, SURE!
I'll eat someone else's words for you, toolbag. Do you not have reading comprehension? That was pulled from Chinese forums.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: WelshBloke
R600 will be a beast, but might not destroy G80 in performance regardless.. it should be a very good match for G80, or beat it, but as stated above, NV is thinking of respinning G80 at R600's launch if needed

Wow Crusader bigging up r600, who'd have thought it. :Q


:p

I stick to the truth of the matter. I don't live in some fantasy world where ATI rules all. Nor do I disrespect much of ATI's engineering.

I happen to know that software > hardware when it comes down to it, within the overall product.. which is why I tend to lean Green. They have better driver support.

I was going to wait for G90, but since G80 has fully unified shaders.. I'm onboard and ready to shell out $$ for this.

I wanted unified shaders + NV driver support, FTW! Before the unified G80 confirmation from Kris, it was looking like the typical split: "ATI superior engineering" vs. Nvidia superior drivers.

With G80, I feel I'm getting the best of both worlds from Nvidia this time. Great chip, from what it appears.. and NV still has more software engineers working for them than hardware engineers, so it'll probably have great driver support, as per typical Nvidia!! :thumbsup:
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
Originally posted by: Crusader
All I've found with the waiting game is that you end up holding your current card for far too long, until the value is completely gone from it. Then you have a half-worthless card and don't get to play with the newest stuff. So I sold my GTX and am now eagerly awaiting, check card in hand, the first 8800GTX I can get my hands on. :)
The value of all current X1900s and GF7s is about to plummet. I lost a total of $40 off the original purchase price of my 7900GTX, which isn't a bad loss IMO.. one of my better "rounds".

Your method reduces the cost of staying current, but IMHO the cost of staying current is grossly inflated to begin with. Personally, I see better value in buying on the trailing edge of each cycle, but to each their own.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
You are probably right that it's more economical that way, dredd. I have debated it either way. For me, the money isn't a -huge- deal, but it still bothers me to spend when I don't need to (waste).
But I figure this is one of my few hobbies, and I get a lot of enjoyment out of it, and a lot of that enjoyment comes from the latest and greatest.. for me, staying a gen behind troubles me greatly!

If I could handle it, I'd do what you do.
I've seen supply of the last gen simply dry up over the last few generations, rather than drop in price. And it seems to have gotten worse with Crossfire and SLI, as everyone is looking for a cheap deal on a 2nd card when the new ones are released, raising demand and prices further.

I agree with you overall, though; both are still valid methods of attempting to stay current while doing it a bit more economically!
 

Elfear

Diamond Member
May 30, 2004
7,168
826
126
Originally posted by: Crusader

I was going to wait for G90, but since G80 has fully unified shaders.. I'm onboard and ready to shell out $$ for this.


Where did you get this from? All the rumors I've seen have said that G80 will either have the standard pipes again (i.e. not unified) or the pipes will be partially unified.


G81
-June '07
-65nm
-GDDR4
-512-bit MC

I highly doubt this is accurate. The leap from 90nm to 65nm would be huge, and I doubt Nvidia will be able to do that in ~6-7 months. From what I know, ATI is ahead in jumping to smaller process nodes, and I doubt even they will have a high-end 65nm card out by June. Maybe someone who knows more about this stuff can chime in.


I happen to know that software > hardware when it comes down to it, within the overall product.. which is why I tend to lean Green. They have better driver support.

I won't even get into the "Nvidia has godlike drivers" debate with you. We obviously have very different opinions on that topic.
 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
Originally posted by: Crusader
Uhhh, SURE!
I'll eat someone else's words for you, toolbag. Do you not have reading comprehension? That was pulled from Chinese forums.
Toolbag? Way to keep up credibility there, not like you ever had any.

Hmm, your source is someone on a Chinese forum, eh? I'll take the word of AMD over that any day, and AMD say that the ATI brand and Radeon product family isn't going anywhere.

:)