8800GTX to be 30% faster than ATI's X1950XTX. GTS to be about equal to it.


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
best of all, G80 is only a stop-gap measure until their March part arrives
No, actually, it's pure profit for nvidia until R600 arrives. Time to market.

unless it really is +30% . . .

don't forget nv30 had a lead over r300 with the "first DX9 card"

time to market . . . unless it is a dustBuster :p

then pure agony for nvidiots

:D
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: GTaudiophile
R600 is ATI's second-gen unified shader product...Microsoft chose their design over nVidia's for a reason...and it will show!

And that reason had nothing to do with technology. Microsoft got pissed at nVidia, pure and simple. ;)
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: apoppin
Originally posted by: Gstanfor
best of all, G80 is only a stop-gap measure until their March part arrives
No, actually, it's pure profit for nvidia until R600 arrives. Time to market.

unless it really is +30% . . .

don't forget nv30 had a lead over r300 with the "first DX9 card"

time to market . . . unless it is a dustBuster :p

then pure agony for nvidiots

:D

Uhh, batshit, R300 launched first, and nv3x never really caught up DX9 performance-wise, so I don't know what the hell this lead is that you're prattling on about.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I hope that there is some kind of Y adapter from one to two six-pin power connectors in the box
Of course - it even lists the two of them in the cable bundle specifications.

There's no way they would ship the card without the necessary cables given many PSUs don't have PCIe power connectors (mine doesn't but my 7900 GTX requires one so I use the splitter cable).
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nightmare225
Originally posted by: GTaudiophile
R600 is ATI's second-gen unified shader product...Microsoft chose their design over nVidia's for a reason...and it will show!

And that reason had nothing to do with technology. Microsoft got pissed at nVidia, pure and simple. ;)

nvidia pisses off a lot of big corporations . . . they are extraordinarily aggressive . . . might really hurt them as they try to develop their CPU . . . wouldn't surprise me to see them get totally shut down in that regard . . .

this article talks about the End of nvidia

[or it surviving beyond all odds . . . it's a two-parter]

Nvidia Stexar move turns gun turrets on AMD, Intel

edit: uh Gstanfor, if the g80 is another nv30 then your 'time to market' is wasted.

only true nvidiots will buy overpriced hype
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
What we know at the moment is that g80 is a minimum of 30% faster than the current fastest video card. That sounds like a damn good reason to buy to me Batshit.
 

Killer4Hire

Junior Member
Oct 2, 2006
22
0
0
I was wondering if you would need an SLI power supply for the 8800GTX because of the two 6-pin connectors needed?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Most decent PSUs capable of running G80 will already have two PCI-e connectors. Personally, I think it's about time for an external power brick.
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Originally posted by: Gstanfor
What we know at the moment is that g80 is a minimum of 30% faster than the current fastest video card. That sounds like a damn good reason to buy to me Batshit.

MINIMUM??? Everything I've read says MAXIMUM!
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
What we know at the moment is that g80 is a minimum of 30% faster than the current fastest video card. That sounds like a damn good reason to buy to me Batshit.

i don't care what your batshit says to you . . . but NOwhere but in your fevered illusions is any suggestion of +30% as a "minimum"

:roll:

and i thought you didn't have a good reason to buy G80 :p

your replies are a contradictory steaming pile of FUD
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: GTaudiophile
Originally posted by: Gstanfor
What we know at the moment is that g80 is a minimum of 30% faster than the current fastest video card. That sounds like a damn good reason to buy to me Batshit.

MINIMUM??? Everything I've read says MAXIMUM!

Well then, you should stop reading the same book that told batshit "nv30 had a lead over R300"...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Originally posted by: GTaudiophile
Originally posted by: Gstanfor
What we know at the moment is that g80 is a minimum of 30% faster than the current fastest video card. That sounds like a damn good reason to buy to me Batshit.

MINIMUM??? Everything I've read says MAXIMUM!

Well then, you should stop reading the same book that told batshit "nv30 had a lead over R300"...

so you are admitting you are full of crap but trying to distract others from your goof up by pointing out another's

nice try

yes 9700p was first out the gate and crushed the much hyped nv30 . . . it never recovered . . . nor evidently has your ego . . .

my condolences
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: apoppin
Originally posted by: Gstanfor
What we know at the moment is that g80 is a minimum of 30% faster than the current fastest video card. That sounds like a damn good reason to buy to me Batshit.

i don't care what your batshit says to you . . . but NOwhere but in your fevered illusions is any suggestion of +30% as a "minimum"

:roll:

and i thought you didn't have a good reason to buy G80 :p

your replies are a contradictory steaming pile of FUD

Since when does having a good reason to buy imply that a person *will* buy? I had good reasons to buy a 7900 GTX or 7950 GX2 as well...

All we have heard so far are rumors. The pic I linked to above is about the only solid bit of info that exists right now. If you want to believe the rumors then fine.

I remember the fanatics were also convinced nv40 was going to be another nvidia flop, right up until the day it released...

If at first you don't succeed . . . quit!
You should take your own advice again - maybe your other personalities haven't experienced Oblivion yet?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Gstanfor
Most decent PSUs capable of running G80 will already have two PCI-e connectors. Personally, I think it's about time for an external power brick.

Agreed on the power brick idea. These things are, what, like $20 at most to give a power-hungry card some extra juice. I'd rather have two power bricks behind my PC than have to fork over $550 for a 1KW PCP&C PSU, or even $350 for an Enermax Galaxy (neither of which fit in my case)...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Originally posted by: apoppin
Originally posted by: Gstanfor
What we know at the moment is that g80 is a minimum of 30% faster than the current fastest video card. That sounds like a damn good reason to buy to me Batshit.

i don't care what your batshit says to you . . . but NOwhere but in your fevered illusions is any suggestion of +30% as a "minimum"

:roll:

and i thought you didn't have a good reason to buy G80 :p

your replies are a contradictory steaming pile of FUD

Since when does having a good reason to buy imply that a person *will* buy? I had good reasons to buy a 7900 GTX or 7950 GX2 as well...

All we have heard so far are rumors. The pic I linked to above is about the only solid bit of info that exists right now. If you want to believe the rumors then fine.

I remember the fanatics were also convinced nv40 was going to be another nvidia flop, right up until the day it released...

If at first you don't succeed . . . quit!
You should take your own advice again - maybe your other personalities haven't experienced Oblivion yet?

you are dense

you stated:
That sounds like a damn good reason to buy to me

i also stated - EVERYWHERE that i think those "+30%" figures are suspiciously LOW . . . so i am not in your fanatic class . . . i was expecting g80 to be pretty sophisticated and much faster than the present gpus.

and as usual your last statement makes ZERO sense to anyone but yourself. :p

Edit: i also have ZERO reason to "hope" g80 flops . . .

other than to see you shut the FUD up for awhile

:D
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: hardwareking
http://www.xbitlabs.com/news/video/display/20061023142704.html
Check this out. All the rumoured specs are more or less true. And it doesn't have a liquid cooler.
I wonder if a power brick will come with the bundle.(hope it does)

See, this is what happens when two goons like apoppin and Gstanfor argue back and forth... No one actually reads the thread anymore and the same pics get posted as new twice in the same thread.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: hardwareking
http://www.xbitlabs.com/news/video/display/20061023142704.html
Check this out. All the rumoured specs are more or less true. And it doesn't have a liquid cooler.
I wonder if a power brick will come with the bundle.(hope it does)

Someone post an un-shortened version of the first pic so we can see how long the card really is. Funny how they try to hide it, like no one would notice.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: munky
Originally posted by: hardwareking
http://www.xbitlabs.com/news/video/display/20061023142704.html
Check this out. All the rumoured specs are more or less true. And it doesn't have a liquid cooler.
I wonder if a power brick will come with the bundle.(hope it does)

Someone post an un-shortened version of the first pic so we can see how long the card really is. Funny how they try to hide it, like no one would notice.

I stretched it a bit (eyeball, nothing technical) with photoshop...

http://img163.imageshack.us/my.php?image=asusgf8800pic1resizebc1.jpg
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nitromullet
Originally posted by: hardwareking
http://www.xbitlabs.com/news/video/display/20061023142704.html
Check this out. All the rumoured specs are more or less true. And it doesn't have a liquid cooler.
I wonder if a power brick will come with the bundle.(hope it does)

See, this is what happens when two goons like apoppin and Gstanfor argue back and forth... No one actually reads the thread anymore and the same pics get posted as new twice in the same thread.

when you get called names you tend to reply 'in kind' . . . pompous ass

:D

thanks for calling me a goon

and i AM reading the thread . . . further speculation and nothing much beyond the German site's "preview" :p
Specifications of the G80 chip are not clear. Some sources indicate that Nvidia's first DirectX 10 chip will incorporate 48 pixel shader processors and an unknown number of vertex shader/geometry shader processors. Others, however, claim that the G80 has 32 pixel and 16 vertex and geometry shader processors. Yet another source has indicated that the G80 will have a unified shader architecture and will consist of 700 million transistors.
i am in denial about This Topic:
8800GTX to be 30% faster than ATI's X1950XTX. GTS to be about equal to it.

no way . . . it has to be faster . . . who would buy a 'similar performing' GTS for much more money than the XTX?
:Q

Gstanfor

yeah . . . i know :p

:D


 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Anyone think that the additional die could be a physics processor? The g80 is supposed to have something called "quantum effects processing". Maybe the extra die is a fully capable physics processor?
If that were the case, the amount of physics processing is worthless unless the GPU can render a "sophysicated" [:p] scene. Look at what happened with Ageia. People paid to take a hit to their frames without getting much in terms of particle detail. If G80's physics capabilities are better than a dedicated physics processor's or the X1900's, it had better be able to power those advancements graphically and fluidly if it is to be considered good.
Since when was 11-12k with BETA drivers poor?
Since they were achieved with non-existent Kentsfields.
In case you had not noticed (possibly due to your rose-tinted sunnies), the post I linked to was made by Beyond3D's editor - one of the few people in the world who actually have access to G80 at the moment...
So he would be likely to be under an NDA and unable to comment on any "problems" that have been rumored, correct? I mean, if Nvidia gave him one of their tight-lipped samples, they must trust him enough not to talk about it, right?
No, actually, it's pure profit for nvidia until R600 arrives. Time to market.
In order for something to be a "Profit" they must first replenish the MILLIONS of dollars they've spent on developing G80. The short time between G80 and R600 won't be enough to accomplish that. It will however begin their climb towards profit.
And that reason had nothing to do with technology. Microsoft got pissed at nVidia, pure and simple.
PM me why they did. I'm curious but don't want to derail this thread more than others already have.
Personally, I think it's about time for an external power brick.
External? No thanks. I'd like better PSUs to emerge as long as they can be put in a case, but I'm not going to have my main computer components sprawled out across my desktop any more than a case, monitor, keyboard, mouse, and speakers. Anything else can go inside my case or nowhere at all.

These companies can't keep driving hardware components to become bigger and bigger and bigger forever. Eventually they're going to need to do the opposite to attract buyers.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: apoppin
when you get called names you tend to reply 'in kind' . . . pompous ass

:D

thanks for calling me a goon

I can call you batshit too if that makes you happy... :D