GPU Pricing: wtf?


nitromullet

Diamond Member
Jan 7, 2004
Originally posted by: Equinox
Well, as I stated earlier, I'm personally boycotting the *next* $400 and $500 cards they put out, because after dropping $400 on my current card, I'm going to have to keep it for a while. Hopefully most of the people who paid that much will feel the same way, and the companies can start lowering their expectations of just what a fair price is. I certainly know that $799 for essentially an overclocked GTX wasn't a fair price. Apparently a large number of other people felt the same way, because they stayed away in droves until the prices came down. GTX and GTS prices haven't come down tremendously, and the HD2900XT is still pretty much sitting above MSRP at most places.

I seriously doubt it will happen this next generation... It's been a long time since we've seen a leap forward in top-end performance (last November), and you can bet that when it happens again, whoever does it first will sell some $500-600 cards in a heartbeat. There are a lot of us who bought 8800s very early on and haven't had a reason to even consider an upgrade or spend any money for a while now, so we'll easily have the money for a next-gen card when one comes out.
 

toadeater

Senior member
Jul 16, 2007
Originally posted by: munky
The good old days of $300 high-end cards are not coming back. AMD and NV will charge as much as people will pay, and unless everyone boycotts video cards costing over $300, there's no reason for prices to drop.

There's no need to boycott. Most people aren't buying them and don't have any plans to until prices drop.
 

lopri

Elite Member
Jul 27, 2002
I wouldn't argue just for the sake of argument - but just look at the CPU market today. Did anyone expect, say a year or two ago, to see a quad-core for $266? Or an X2 3800+ for $80? That's a brutal war in a free market. And AMD is still cutting prices at its own expense. If NV were in AMD's shoes in the current CPU market (and assuming the 8800 GTX were a CPU), sure, they would sell it for $300, $200, or even $100, even at a loss. It's not a hard guess. Does anyone think the 8800 Ultra's initial price ($830?) was set after a precise calculation of production cost? Prices are nearly completely arbitrary in a free market, in the sense that they have more to do with supply and demand than with the actual cost of producing goods and services. I do think price fixing between ATI and NV in the past was entirely possible.

Edit: Clarification and grammar
 

sliderule

Member
May 13, 2007
Originally posted by: bryanW1995
Originally posted by: schneiderguy
Originally posted by: sliderule

Hmm, I don't think either company, ATI or Nvidia, is a US company, but it's true that pretty much all of the world's businesses are selling out to cheap Chinese labour... whether they're based in Europe, Japan, or North America. :/

nVidia is American, ATI is Canadian - at least they were before they got acquired by AMD. They might be considered American now also :confused:
What he said. However, it is an easy mistake to make since California is more like a different planet...

lol, well I knew ATI was Canadian, but for some reason I always thought Nvidia was a Taiwanese co... good to know
 

coldpower27

Golden Member
Jul 18, 2004
Originally posted by: lopri
I wouldn't argue just for the sake of argument - but just look at the CPU market today. Did anyone expect, say a year or two ago, to see a quad-core for $266? Or an X2 3800+ for $80? That's a brutal war in a free market. And AMD is still cutting prices at its own expense. If NV were in AMD's shoes in the current CPU market (and assuming the 8800 GTX were a CPU), sure, they would sell it for $300, $200, or even $100, even at a loss. It's not a hard guess. Does anyone think the 8800 Ultra's initial price ($830?) was set after a precise calculation of production cost? Prices are nearly completely arbitrary in a free market, in the sense that they have more to do with supply and demand than with the actual cost of producing goods and services. I do think price fixing between ATI and NV in the past was entirely possible.

Edit: Clarification and grammar

Ah, so you don't have actual proof of what you say. And look where AMD is right now - they aren't financially healthy, due to this extreme price war. A corporation is supposed to price products where they make a profit.

You can't assume a GPU is a CPU. You can't compare CPUs to GPUs; they are different products and hence have different production costs. AMD is cutting prices to survive, and bleeding itself dry to do it, which isn't a good thing.

Nvidia wouldn't do that unless it needed to in order to survive.

Prices are indeed affected by production cost, but that's obviously not the only issue at hand. Profitability is another, which drives companies to make as much as possible.
 

Keysplayr

Elite Member
Jan 16, 2003
As far as production costs go, a transistor is a transistor. The only difference would be the process it's made on. GPUs currently have a much higher transistor count than CPUs do, so it costs a lot more per wafer to yield a GPU today, and yields may be much lower.

Initially, an 8800 Ultra yield from a wafer may have been rare - cherry-picked cores with an expensive price tag to go with them. As time goes on, yields improve as the process is refined and 8800 Ultra cores become more common, so the price comes back down to earth (if there is a reason for prices to come down, like competition, proximity to the next product cycle, etc.).

AMD isn't doing well financially right now, as we all know, but there is still a chance they can turn things around. At least that's what the stockholders are counting on. Germany just had to lend AMD 262 million for the Dresden fab upgrade, but that is only about 10% of what AMD needs to update that fab. Mega dollars.
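To put rough numbers on the yield point, here is a back-of-the-envelope sketch - the wafer cost, die sizes, and yields below are hypothetical illustrations, not actual NVIDIA/AMD figures:

import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Gross dies per wafer: wafer area over die area, minus an edge-loss term.
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, yield_rate):
    # Spread the whole wafer cost over only the dies that actually work.
    good_dies = dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_rate
    return wafer_cost / good_dies

# Hypothetical: a huge ~480 mm^2 GPU at 40% yield vs. a ~140 mm^2 CPU at
# 80% yield, both on a $5,000, 300 mm wafer.
print(cost_per_good_die(5000, 300, 480, 0.40))  # roughly $100+ per good GPU die
print(cost_per_good_die(5000, 300, 140, 0.80))  # roughly $14 per good CPU die

Same wafer, same cost per transistor in principle, yet the big low-yield die costs several times more per working chip - and that gap is exactly what shrinks as the process matures.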
 

lopri

Elite Member
Jul 27, 2002
I think there is a misunderstanding.

Of course a company wants to make a profit, so it initially sets a price no lower than production cost. That's a given. But whether that price can be sustained - which is what matters to the company - is decided by supply and demand. You can spend 24 hours of labor hand-weaving a glove at home, but whether it sells for what your time was worth isn't up to you. A CPU and a GPU are different 'technically', but they are both 'goods' in a market. So is everything. How about RAM? Did RAM prices fall because the silicon used for Micron D9 chips got four times cheaper in a matter of months?

And of course, a company that can't sustain enough profit to cover its costs will go belly up. ATI did, didn't they? AMD might follow suit sooner or later. Killer NIC and PhysX are the same story.

I am not exactly sure what kind of 'proof' you're asking for. If you think I should provide NV's QuickBooks sheets :D , then sorry, I don't have such things. But you should be able to find Mr. Jen-Hsun's comments on his pricing strategy in transcripts of NV's quarterly shareholder meetings from the past 1-2 years. He's quite a genius, and I speak highly of him as a company executive.
 
Jan 9, 2007
ATI didn't go belly up; they got bought for a very, very attractive (maybe too attractive) price. AMD isn't going to die just yet, but they are in need of a cash infusion, to be sure. They are also in need of some competitive products. The ATI side of the house had to have missed out on some of the OEM deals just due to the lateness of their arrival - big integrators can't wait until the end of May to determine what they will be building for the back-to-school season; they have to test the products. The only one with plenty of new inventory at that time was NVidia, and we are now seeing laptops with NVidia 8-series chips start to trickle out.

AMD is in a bind right now, because they have the pressure of a so-so line of products in the R600 family, and their CPUs aren't flying off the shelves, either (though they do get plenty of OEM business). Laptop sales are where everyone makes boatloads of money; the margins are good because laptops have to be fully designed around the chips. If AMD doesn't have Turions that compete, or Radeon Mobile chips that compete, they miss out on a lucrative chunk of the market.
 

SickBeast

Lifer
Jul 21, 2000
IMO GPU memory is highly overrated. 320MB isn't *that* far off 512MB, and IMO 320MB is the 'sweet spot' for a modern GPU.

I plan to buy a DDR3 rig once they drop in price, and that way my card will be able to share system memory that's pretty much as fast as the memory on the video card. :light:

As for the prices, I agree, they are messed up.

I never spent more than $200 on a GPU until I got the 320MB 8800GTS. That included both a Radeon 9700 non-Pro and an X800 Pro, both within 6 months of their respective releases (the 9700 I bought used).
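For what it's worth, a quick peak-bandwidth tally (nominal numbers; real-world throughput is lower) suggests the PCIe link, not the system DRAM itself, would be the limiter when a card reaches into shared memory:

# Nominal peak bandwidth in GB/s for a memory bus.
def bus_bandwidth_gbs(effective_mhz, bus_width_bits):
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(bus_bandwidth_gbs(1600, 320))  # 8800GTS GDDR3 on a 320-bit bus: ~64 GB/s
print(bus_bandwidth_gbs(1333, 128))  # dual-channel DDR3-1333:        ~21 GB/s
# A PCIe 1.x x16 slot moves only ~4 GB/s per direction, so shared system
# memory looks much slower to the GPU than its onboard frame buffer.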
 

bryanW1995

Lifer
May 22, 2007
I almost got sick when I spent $149 AR for my X1950XT. If all the rest of you were cheap bastards like me, I could have bought an 8800GTX for that price. Thanks a lot, guys!
 

Matt2

Diamond Member
Jul 28, 2001
Originally posted by: SickBeast
IMO GPU memory is highly overrated. 320MB isn't *that* far off 512MB, and IMO 320MB is the 'sweet spot' for a modern GPU.

I plan to buy a DDR3 rig once they drop in price, and that way my card will be able to share system memory that's pretty much as fast as the memory on the video card. :light:

As for the prices, I agree, they are messed up.

I never spent more than $200 on a GPU until I got the 320MB 8800GTS. That included both a Radeon 9700 non-Pro and an X800 Pro, both within 6 months of their respective releases (the 9700 I bought used).

Depends on resolution. I guarantee I can fill that frame buffer quick at 1920x1200.
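Rough arithmetic supports this. A sketch under simplified assumptions (double-buffered 32-bit color, 32-bit depth/stencil, naive multisample storage; ignores textures, compression, and driver overhead):

# Rough VRAM needed just for render targets at a given resolution.
def framebuffer_mb(width, height, aa_samples=1):
    pixels = width * height
    color = pixels * 4 * 2                 # front + back buffer, 4 B/pixel
    msaa_color = pixels * 4 * aa_samples   # multisampled color surface
    depth = pixels * 4 * aa_samples        # 24-bit depth + 8-bit stencil
    return (color + msaa_color + depth) / 2**20

print(framebuffer_mb(1920, 1200, aa_samples=4))  # ~88 MB before any textures
print(framebuffer_mb(1280, 1024, aa_samples=4))  # ~50 MB

Textures, geometry, and shadow maps all sit on top of those render targets, which is how a 320MB card runs out of room at high resolution with AA on.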
 

nitromullet

Diamond Member
Jan 7, 2004
Originally posted by: bryanW1995
I almost got sick when I spent $149 AR for my X1950XT. If all the rest of you were cheap bastards like me, I could have bought an 8800GTX for that price. Thanks a lot, guys!

If it weren't for us buying the high-margin parts, there never would have been an X1950XT for you to buy for cheap.
 
Jan 9, 2007
I spent more than I wanted to on my GTS 640, but I must admit I have enjoyed having it. It will need to last for a while, though. It replaced a 6800GT (which only got replaced because it stopped displaying the blue part of the spectrum - that was a real treat!). If Creative ever makes a real 1x PCIe X-Fi so I can take my X-Fi out of the only PCI slot I have (which would be covered by a second GTS), and the 640 GTS sells cheaply while everyone is feverishly buying the next gen, there is always SLI, I suppose.
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Matt2
Originally posted by: SickBeast
IMO GPU memory is highly overrated. 320MB isn't *that* far off 512MB, and IMO 320MB is the 'sweet spot' for a modern GPU.

I plan to buy a DDR3 rig once they drop in price, and that way my card will be able to share system memory that's pretty much as fast as the memory on the video card. :light:

As for the prices, I agree, they are messed up.

I never spent more than $200 on a GPU until I got the 320MB 8800GTS. That included both a Radeon 9700 non-Pro and an X800 Pro, both within 6 months of their respective releases (the 9700 I bought used).

Depends on resolution. I guarantee I can fill that frame buffer quick at 1920x1200.
Yeah, but with 320MB of the frame buffer filled, even the 640MB GTS starts to slow down. I just leave AA off and don't run into any issues.
 

toadeater

Senior member
Jul 16, 2007
Originally posted by: Equinox
Agreed. Prices are totally out of hand, and I think they are working the market.

There aren't really that many people buying these $300+ DX10 cards (which include Vista as an added expense).

Nvidia and ATI might be hoping the holdouts will eventually succumb to their price-fixing scam, but they will back down before gamers do - in part because PC game developers want the market to evolve and don't want 2006 DX9 cards to be the average GPU requirement.

Some publishers are already starting to complain that the expense of DX10 cards and Vista is hurting the PC gaming market.
 

RussianSensation

Elite Member
Sep 5, 2003
Originally posted by: yacoub

Last gen's high end cards had 512MB. There's no reason anything in the GTS/GTX tier of this generation should have less than that. 320MB as a number is just silly.

Of course it would have been nice to buy an 8800GTS with 512MB for $260, but even today the 320MB card still presents good value at 1600x1200 with 4AA/16AF. It handily outperforms the 7950GT 512MB by 80%, which isn't bad (Benches).

If anyone wants to talk about value, let's first point out the performance of the HD 2900XT in the latest shader-intensive games at 1920x1200 with 0AA/AF, such as Oblivion, STALKER, and Supreme Commander, where it can't even outperform the 8800GTS 320MB, a card which costs $130 less (arguing that the HD 2900XT has enough memory to turn on AA is also questionable, considering both cards are already slow enough at those resolutions without AA). I personally only need a card to play on my 37-inch Westy at 1920x1080, in which case the 8800GTS 320 presents good value.

Xbitlabs' latest video card review (Link) mentions that with older drivers the card suffered from memory allocation problems, and a lot of those have been fixed, as the 640MB card shows hardly any advantage in their tests.

In any event, if Nvidia has a refresh in the next 3 months, EVGA customers can easily step up from their card today without having to worry about its depreciation, so I don't see any point in waiting for G90, which could also be delayed. Not to mention Nvidia tends to release their top-end cards ($400-600) first and follow months later with the cheaper $300-and-below cards. This means the wait for next-gen at the same price point is FAR, FAR longer than "some months away".

 

coldpower27

Golden Member
Jul 18, 2004
Originally posted by: SickBeast
IMO GPU memory is highly overrated. 320MB isn't *that* far off 512MB, and IMO 320MB is the 'sweet spot' for a modern GPU.

I plan to buy a DDR3 rig once they drop in price, and that way my card will be able to share system memory that's pretty much as fast as the memory on the video card. :light:

As for the prices, I agree, they are messed up.

I never spent more than $200 on a GPU until I got the 320MB 8800GTS. That included both a Radeon 9700 non-Pro and an X800 Pro, both within 6 months of their respective releases (the 9700 I bought used).

But you have to remember the MSRPs of the respective GPUs: the Radeon 9700 NP was a $299 card and the X800 Pro a $399 card, so whatever they fell to at retail, or whatever hot deal you got in on, is a different story.

These days Nvidia and ATI are trying a new thing where the upper range is actually better value than the lower range, which is not what we're used to. As well, the difference between a high-end SKU and the low end has widened considerably each generation, even if the price ratios stay the same.

320MB is effectively like 256MB to me, which is a bit thrifty for the performance-mainstream segment. I would want at least a 512MB frame buffer on a high-end card today; prior to the G80 generation, memory size on Nvidia cards tended not to make a difference between 256MB and 512MB.

But overall I would rather have a more powerful shader core than more memory. That is not to say I wouldn't want both, but priority-wise the shader core is more important, as it helps everywhere, whereas more memory only helps in select circumstances.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
Originally posted by: Matt2
Originally posted by: SickBeast
IMO GPU memory is highly overrated. 320MB isn't *that* far off 512MB, and IMO 320MB is the 'sweet spot' for a modern GPU.

I plan to buy a DDR3 rig once they drop in price, and that way my card will be able to share system memory that's pretty much as fast as the memory on the video card. :light:

As for the prices, I agree, they are messed up.

I never spent more than $200 on a GPU until I got the 320MB 8800GTS. That included both a Radeon 9700 non-Pro and an X800 Pro, both within 6 months of their respective releases (the 9700 I bought used).

Depends on resolution. I guarantee I can fill that frame buffer quick at 1920x1200.

Actually, you can fill the frame buffer at 1280x1024 easily. Load up Oblivion and get QTP3. In fact, don't even get QTP3 - just up the AA to 4x and see.
 

Puffnstuff

Lifer
Mar 9, 2005
You know, just before the 2900 was released, Nvidia prices were falling pretty steadily; then, when it appeared and didn't offer the performance that was promised, prices shot right back up. I would imagine that as we near the release of the next Nvidia chip, prices will fall as vendors attempt to clear stock for it. My 7800GT CO is still working fine for me, so I'm not in any kind of hurry to buy a new card. When prices moderate and circumstances warrant a new card, I'll take a serious look at what's available; however, I'm pretty sure I'll stick with Nvidia.
 

Modelworks

Lifer
Feb 22, 2007
The last time I paid $300+ for a video card was with 3dfx and the Voodoo 1 cards.
At the time it was well worth it: hardware 3D and Tomb Raider.

Now I can't justify it. I start to buy an 8800 or something and I think: it's what you've already got, just faster.

Dunno.
I could get the 8800, but that might require new living arrangements and a lawyer.


 

Bradtechonline

Senior member
Jul 20, 2006
You can spend 150 bucks on a card if you don't need to play at high resolutions. I play all my games with the eye candy maxed out on a 17" monitor; 19" is the highest I would go. I don't want to have to spend 300-400 bucks on new video cards every year - I'll just buy a console if it gets to that point. This X1950XT, once I fixed the driver issues, reminds me of my old Ti 4200. Great bang for the buck.