G80 De-Mystified; GeForce 8800GTX/GT


Pabster

Lifer
Apr 15, 2001
Originally posted by: Capt Caveman
Expect G80 to be out somewhere in mid November along with Kentsfield.

Dear God, looks like the wallet is REALLY gonna take a hit :D
 

Ulfhednar

Golden Member
Jun 24, 2006
Originally posted by: Pabster
Originally posted by: Capt Caveman
Expect G80 to be out somewhere in mid November along with Kentsfield.
Dear God, looks like the wallet is REALLY gonna take a hit :D
Yeah, especially if we need external PSUs to run these things.
 

Genx87

Lifer
Apr 8, 2002
Originally posted by: Ulfhednar
Originally posted by: Modular
My 6600GT still plays BF2 on all Medium, 2xQ AA and 16x AF at an average of 60 FPS... it was $120.00 over a year ago when I bought it. I'm not sure that many people will agree with you at all on this one. Actually, the opposite is closer to the truth. Look at the X1600 POS.

ATi and nVidia always produce monster cards that cost way too much $$ for a year. It's the "value" price point that really makes them their money. And nVidia trounces ATi at that price point yearly ($150 - $250).

You can get a 7600GT for $114.99 right now on Newegg...
Maybe I should have specified that I wasn't talking about what are now low-end budget cards. I was talking about cards like the 7900GT, X1800XT, X1900XT, 7900GTX, and 7950GX2.

I don't know about the US, but here in the UK the 7900GT costs almost as much as an X1900XT and the 7900GTX is ridiculously expensive; Nvidia simply cannot compete. I won't even mention the 7950GX2; it's just god-awful value for money.

7600GT 256MB = £100
X1900GT 256MB = £130
X1800XT 256MB = £140
7900GT 256MB = £170
7950GT 256MB = £175
X1900XT 256MB = £175
7950GT 512MB = £205
X1900XT 512MB = £220
X1900XTX 512MB = £250
7900GTX 512MB = £320
X1950XTX 512MB = £320 (rip-off)

This is the sort of pricing we've seen for the last year or two; Nvidia cannot compete, and from what people on this forum have been saying, the X1900s are the best bang-for-buck over there too, beyond all reasonable doubt.

The prices remain high because people are buying the cards at that price. Thus it appears that, unless Nvidia sees a downtick in demand, they are competing just fine at those prices.
 

Genx87

Lifer
Apr 8, 2002
Originally posted by: ShadowOfMyself
Originally posted by: Wreckage
Originally posted by: Capt Caveman
I know it's the Inquirer, but they posted yesterday about the R600 consuming 250W - http://theinquirer.net/default.aspx?article=34446

250W :Q

My whole HTPC with a 6600GT, P4, 1GB RAM, etc. does not take that much.

If that is true, that card will be a major failure.

So will G80


You think nobody will buy the new cards? What, you think people will continue to buy the 7900 series of cards for the rest of time? People say these types of silly things every time a new generation comes out that they don't want to pay for.


 

Ulfhednar

Golden Member
Jun 24, 2006
Originally posted by: Genx87
The prices remain high because people are buying the cards at that price. Thus it appears that, unless Nvidia sees a downtick in demand, they are competing just fine at those prices.
So you're saying those prices are reasonable? You're on drugs, my friend. Serious drugs.
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: ShadowOfMyself
Originally posted by: Wreckage
Originally posted by: Capt Caveman
I know it's the Inquirer, but they posted yesterday about the R600 consuming 250W - http://theinquirer.net/default.aspx?article=34446

250W :Q

My whole HTPC with a 6600GT, P4, 1GB RAM, etc. does not take that much.

If that is true, that card will be a major failure.

So will G80

It did not say the G80 will need 250W. (It's probably all Inq BS anyway).
 

SpeedZealot369

Platinum Member
Feb 5, 2006
Plus it seems like this DX10 launch is (correct me if I'm wrong) one of the most hyped launches in a loooong time; I doubt either side (especially the G80) will have trouble selling any cards.
 

Creig

Diamond Member
Oct 9, 1999
Originally posted by: Ulfhednar
Originally posted by: Genx87
The prices remain high because people are buying the cards at that price. Thus it appears that, unless Nvidia sees a downtick in demand, they are competing just fine at those prices.
So you're saying those prices are reasonable? You're on drugs, my friend. Serious drugs.

They may not be reasonable, but there are still people who will pay that much and more.
 

Genx87

Lifer
Apr 8, 2002
Originally posted by: Ulfhednar
Originally posted by: Genx87
The prices remain high because people are buying the cards at that price. Thus it appears that, unless Nvidia sees a downtick in demand, they are competing just fine at those prices.
So you're saying those prices are reasonable? You're on drugs, my friend. Serious drugs.

If they weren't, would people buy the cards and thus keep the price where it is?
You may not like the price, but the market has determined they are fine as is.

 

Ulfhednar

Golden Member
Jun 24, 2006
Originally posted by: Creig
They may not be reasonable, but there are still people who will pay that much and more.
Yes, they're commonly described as people with more money than sense. A great many of the people buying Nvidia hardware at such shocking prices are probably also fanboys.

Originally posted by: Genx87
If they weren't, would people buy the cards and thus keep the price where it is?
You may not like the price, but the market has determined they are fine as is.
I should start an e-tailer if people like you find it so acceptable to set rip-off prices. :) Personally, I have far more sense than that and have reflected that sense by buying mostly ATI hardware over the last year or two.

Originally posted by: inspire
:thumbsup: Wii FTW.
:thumbsup:
 

TheRyuu

Diamond Member
Dec 3, 2005
First, WTF is VCAA???
Second, according to that it will have:
48 pixel shaders and 96 vertex shaders. WTF is wrong with that picture?
(Did I do the math right? 24x2=48, 8x12=96)

I think you might want to have more pixel shaders than vertex shaders, and where are the geometry shaders?
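For anyone sanity-checking the arithmetic in that rumour, a minimal sketch; the mapping of 24x2 to pixel units and 8x12 to vertex units is the post's reading of the rumour, not a confirmed spec:

```python
# Sanity check of the rumoured shader-unit arithmetic, as read in
# the post above. All figures come from the rumour, not confirmed specs.

PIXEL_CLUSTERS = 24      # rumoured pixel-shader clusters
PIXEL_PER_CLUSTER = 2    # rumoured units per cluster
VERTEX_CLUSTERS = 8      # rumoured vertex-shader clusters
VERTEX_PER_CLUSTER = 12  # rumoured units per cluster

pixel_shaders = PIXEL_CLUSTERS * PIXEL_PER_CLUSTER     # 24 x 2 = 48
vertex_shaders = VERTEX_CLUSTERS * VERTEX_PER_CLUSTER  # 8 x 12 = 96

print(f"pixel shaders:  {pixel_shaders}")   # 48
print(f"vertex shaders: {vertex_shaders}")  # 96

# The oddity being flagged: game workloads are pixel-heavy, so a part
# with twice as many vertex units as pixel units would be backwards.
assert vertex_shaders > pixel_shaders  # the "WTF" in the post above
```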
 

Genx87

Lifer
Apr 8, 2002
Originally posted by: Ulfhednar
I should start an e-tailer if people like you find it so acceptable to set rip-off prices. Personally, I have far more sense than that and have reflected that sense by buying mostly ATI hardware over the last year or two.

Like me? I am not buying any of these cards at these prices, so nice try. Nvidia doesn't set rip-off prices; the market does. This is basic economics.

 

aka1nas

Diamond Member
Aug 30, 2001
Originally posted by: Genx87
Originally posted by: ShadowOfMyself
Originally posted by: Wreckage
Originally posted by: Capt Caveman
I know it's the Inquirer, but they posted yesterday about the R600 consuming 250W - http://theinquirer.net/default.aspx?article=34446

250W :Q

My whole HTPC with a 6600GT, P4, 1GB RAM, etc. does not take that much.

If that is true, that card will be a major failure.

So will G80


You think nobody will buy the new cards? What, you think people will continue to buy the 7900 series of cards for the rest of time? People say these types of silly things every time a new generation comes out that they don't want to pay for.

If the card ships with an external power brick and can be cooled with non-exotic methods, I don't see what the fuss is about. The average user who would spend the money on such a card probably doesn't care about the electricity bill.
 

lopri

Elite Member
Jul 27, 2002
250W for a single card is suicide. It WON'T happen. That is against the PCI-E specs and Intel's PSU guidelines. An external power brick could be a solution, sure, but again, that won't happen at this point in the game. I will donate $100 to AT if either an R600 or a G80 consumes that much power. Now, 150W is a more realistic number, since we know the motherboard PCI-E slot supplies ~75W and a 6-pin power connector can supply the rest.
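lopri's 150W ceiling is just the PCI Express power-delivery limits added up. A minimal sketch of that budget, where the per-source caps (75W through the slot, 75W per 6-pin connector) come from the spec and the connector count is the only variable:

```python
# In-spec power budget for a PCI Express graphics card, per the
# limits lopri cites: the x16 slot and each 6-pin auxiliary
# connector are capped at 75W apiece.

SLOT_WATTS = 75     # max draw through the x16 slot
SIX_PIN_WATTS = 75  # max draw per 6-pin auxiliary connector

def card_power_budget(six_pin_connectors: int) -> int:
    """Maximum in-spec board power for a given connector count."""
    return SLOT_WATTS + six_pin_connectors * SIX_PIN_WATTS

print(card_power_budget(0))  # 75W  - slot power alone
print(card_power_budget(1))  # 150W - lopri's "realistic" figure
print(card_power_budget(2))  # 225W - still short of the rumoured 250W
```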
 

lopri

Elite Member
Jul 27, 2002
Originally posted by: Bateluer
If those prices are to be believed, it's way too pricey. $650 for a single card? Bah, nuts to that.
It all depends, really. If the performance and quality can justify the price (i.e. 2x, which is very possible), there will definitely be a market for such products. Think about it.
 

Ulfhednar

Golden Member
Jun 24, 2006
Originally posted by: Genx87
I should start an e-tailer if people like you find it so acceptable to set rip-off prices. Personally, I have far more sense than that and have reflected that sense by buying mostly ATI hardware over the last year or two.
Like me? I am not buying any of these cards at these prices, so nice try. Nvidia doesn't set rip-off prices; the market does. This is basic economics.
Before we talk about basic economics, can we talk about basic reading comprehension? I didn't say that you were buying them; I said that you find it acceptable that prices are so ridiculous.

My problem with this is that I like to switch brands every so often, but thanks to Nvidia's prices being so utterly absurd I am on my third ATI card (which is the longest winning streak ATI has had out of my wallet so far) and I would like to switch at some point.
 

Schadenfroh

Elite Member
Mar 8, 2003
Originally posted by: inspire
Originally posted by: Ulfhednar
Originally posted by: SpeedZealot369
If this is true, time for external power bricks :(
That rumour is already in full circulation and the flames are burning ever hotter due to this article from June 5th.

http://www.anandtech.com/tradeshows/showdoc.aspx?i=2770

ATI and Nvidia can bite me if this comes to pass; I will buy a Wii.

:thumbsup: Wii FTW.

That is right! Defy ATI by buying a Wii! That will show them!
 

Ulfhednar

Golden Member
Jun 24, 2006
Originally posted by: Schadenfroh
That is right! Defy ATI by buying a Wii! That will show them!
LOL, you make a very fair point, but I was more emphasising defiance of the out-of-control train ride PC hardware might end up taking if we have to buy 700W PSUs with four 12V rails just to run a PC with one of these graphics cards in it.
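To put rough numbers on that worry, a sketch of the 12V load a 250W card would impose; the 80% 12V share is an illustrative assumption, not a measured figure:

```python
# Rough 12V-rail load for a hypothetical 250W graphics card.
# The 80% figure (share of board power drawn from 12V sources)
# is an assumption for illustration, not a measurement.

CARD_WATTS = 250          # the rumoured R600 figure
TWELVE_VOLT_SHARE = 0.80  # assumed fraction supplied at 12V

amps_at_12v = CARD_WATTS * TWELVE_VOLT_SHARE / 12.0
print(f"{amps_at_12v:.1f}A at 12V")  # ~16.7A for the card alone

# On a four-rail PSU with typical ~18A per-rail limits, that is
# most of one rail's budget before the CPU and the rest of the
# system have drawn anything.
```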
 

Genx87

Lifer
Apr 8, 2002
Originally posted by: Ulfhednar
Originally posted by: Genx87
I should start an e-tailer if people like you find it so acceptable to set rip-off prices. Personally, I have far more sense than that and have reflected that sense by buying mostly ATI hardware over the last year or two.
Like me? I am not buying any of these cards at these prices, so nice try. Nvidia doesn't set rip-off prices; the market does. This is basic economics.
Before we talk about basic economics, can we talk about basic reading comprehension? I didn't say that you were buying them; I said that you find it acceptable that prices are so ridiculous.

My problem with this is that I like to switch brands every so often, but thanks to Nvidia's prices being so utterly absurd I am on my third ATI card (which is the longest winning streak ATI has had out of my wallet so far) and I would like to switch at some point.


Boo effing hoo. Pay up or quit complaining.


 

Ulfhednar

Golden Member
Jun 24, 2006
Originally posted by: Genx87
Boo effing hoo. Pay up or quit complaining.
I think we already established that I have more sense than that. :) It's no skin off my nose; Nvidia lose a customer and I save money until prices become more sensible and the features I want become available on their graphics hardware. It's that simple.
 

SonicIce

Diamond Member
Apr 12, 2004
256 + 128 bit doesn't sound like a good idea. I think it would be better if they shared one large bus. If it's split, you only get the full 384 bits in the best-case scenario. This is like why Intel moved from two segregated caches to one large shared cache: in case one core needs more at a given time than the other, each takes only what it needs.
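SonicIce's point is easier to see with the bandwidth arithmetic written out. A minimal sketch, assuming a placeholder 1.8GHz effective GDDR3 data rate (not a known G80 spec):

```python
# Theoretical peak memory bandwidth for a given bus width.
# The 1.8GHz effective data rate is a placeholder assumption,
# not a known G80 specification.

EFFECTIVE_GT_PER_S = 1.8  # assumed effective transfers per pin, GHz

def bandwidth_gb_s(bus_bits: int) -> float:
    """Peak bandwidth in GB/s: (bus width in bytes) x (transfer rate)."""
    return bus_bits / 8 * EFFECTIVE_GT_PER_S

unified = bandwidth_gb_s(384)                       # one shared bus
split = (bandwidth_gb_s(256), bandwidth_gb_s(128))  # hard-partitioned

print(f"unified 384-bit: {unified:.1f} GB/s")               # 86.4
print(f"split 256+128:   {split[0]:.1f} + {split[1]:.1f}")  # 57.6 + 28.8

# The totals match, but with a hard partition neither pool can borrow
# the other's headroom - the same logic behind Intel's move to a
# single shared L2 instead of two fixed per-core halves.
```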
 

Ulfhednar

Golden Member
Jun 24, 2006
Originally posted by: SonicIce
256 + 128 bit doesn't sound like a good idea. I think it would be better if they shared one large bus. If it's split, you only get the full 384 bits in the best-case scenario. This is like why Intel moved from two segregated caches to one large shared cache: in case one core needs more at a given time than the other, each takes only what it needs.
With all those transistors and this split memory and bandwidth, I had the idle thought that they could be preparing for on-chip GPU physics on a single card, with 128MB of 128-bit memory set aside purely for that purpose.

Of course, it was just an idle thought, and possibly has absolutely no bearing on reality.
 

Genx87

Lifer
Apr 8, 2002
Originally posted by: Ulfhednar
Originally posted by: Genx87
Boo effing hoo. Pay up or quit complaining.
I think we already established that I have more sense than that. :) It's no skin off my nose; Nvidia lose a customer and I save money until prices become more sensible and the features I want become available on their graphics hardware. It's that simple.

Like I said, the market will dictate prices. Apparently, however, you are in the minority, and thus Nvidia's prices stay higher than you'd like.