They won the war of perception... When did $260+ for a minimal card become cheap?


Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
This perception doesn't make any sense. It doesn't matter how many cards perform better or worse than a given card. The 6600GT for $200 never ran the newest games at high settings at 1680 or 1920 when it was released; neither did the 8600GTS for $200 when it was released.

BUT the medium settings of games today on the 8600GTS look much better than the medium settings of the games the 6600GT had to cope with, so you are still getting the expected increase in graphics quality year after year, for a similar price. I don't care if there is an $800 card that makes the game look 3 times better; if the game still looks much better than games of the past era on the same-priced card then nothing has changed.

I don't understand why people are complaining about super-expensive high-end parts. So what if your $200 card doesn't run max settings? It shouldn't be able to, as a mid-range part; it should run mid-range settings. If anything, super-performing high-end parts allow makers to create games with faster-advancing graphics that you WILL be able to play at max when you get your next $200 card in a year or two.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: taltamir
Originally posted by: Wreckage
Originally posted by: taltamir
then you need a DX10 card, a DX10 game, and to run it at very high settings... the handful of such games only show DX10 effects at those settings.....

It's a stretch to call DX10 and High Settings "minimal".

Minimal is DX9 and Medium settings at best.

If you are running it at "Medium" settings then none of the DX10 features are on. They are all disabled and you are effectively running DX9, where you would get a better framerate on a last-gen card than on an 8600 or lower. Running high settings isn't minimal, but if you are not on high then you are running DX9. ("Very High," they call it in most games)...

That's not true. You are using the DX10 path even on medium settings.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
You are running the DX10 EXECUTABLE, but you are not running any DX10 FEATURES that don't exist in DX9. You are better off running DX9 and getting a better framerate, since without the DX10-only features it will look the same as DX9. Only DX10-only features make it look different than DX9...

Now you could in theory use command-line arguments to force-enable a specific DX10 feature while on otherwise low settings, giving you actual DX10 something at playable settings... But this is all theoretical and impractical. Essentially, if you want to play a DX10-enabled game in DX10 mode with DX10 features making it look in any way different than the DX9 version of the same game, then you need at least a GTS or your framerate is too low to be playable.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: taltamir
You are running the DX10 EXECUTABLE, but you are not running any DX10 FEATURES that don't exist in DX9. You are better off running DX9 and getting a better framerate, since without the DX10-only features it will look the same as DX9. Only DX10-only features make it look different than DX9...

Now you could in theory use command-line arguments to force-enable a specific DX10 feature while on otherwise low settings, giving you actual DX10 something at playable settings... But this is all theoretical and impractical. Essentially, if you want to play a DX10-enabled game in DX10 mode with DX10 features making it look in any way different than the DX9 version of the same game, then you need at least a GTS or your framerate is too low to be playable.

You sound so sure. But the DX10 path gives you a better overall image than DX9. Not much better, but the difference is there. Bioshock as well; it gives you better water surface effects.

As tested here. http://enthusiast.hardocp.com/...wzLCxoZW50aHVzaWFzdA==
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
That article corroborates every single one of my claims:
"Depending on which mode you are running in, DX9 or DX10, some options may or may not be available. The highest setting allowed in the game is a setting called "Very High" for every graphics option. However, "Very High" is only present when you are running in the DX10 code path. In DX9 the highest in-game settings available are "High." There is also a "Medium" and "Low" for both modes as well. There are no gameplay differences between DX9 and DX10 rendering modes, only visual image quality differences. In DX10 you will receive better post processing effects including motion blur and HDR lighting quality as well as crisper surface details."

- So low and medium are the same for DX9 and 10... High is when unique DX9 features come into play. Very High is when unique DX10 features that replace them come into play and make the game look better than on a DX9 card...
Keep in mind DX is not a kernel; it is mainly a collection of libraries from which the game draws code. So DX10 means there are more things to keep track of (lower performance) but more commands and options the engine can use... Some of those engine commands are new to DX10; most existed for many versions. So even in DX10 mode the game is running a lot of DX9 code, and some DX8 code, and so on... it just adds more commands for the engine to use; it doesn't replace every single command ever made with a spanking-new DX10 alternative that magically looks better...
- forgive my gross oversimplification
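That split between shared presets and a DX10-only tier can be sketched as a toy model. All names and structure here are hypothetical, purely to illustrate the argument, not Crytek's actual engine logic:

```python
# Toy model of the settings/code-path argument (hypothetical names):
# both code paths share Low/Medium/High, and only the DX10 path adds
# a "Very High" tier carrying the DX10-only effects.

SHARED_PRESETS = ["low", "medium", "high"]
DX10_ONLY_EFFECTS = {"very high": {"object_motion_blur", "improved_hdr"}}

def available_presets(api):
    """Presets the settings menu offers under a given API ("dx9" or "dx10")."""
    if api == "dx10":
        return SHARED_PRESETS + list(DX10_ONLY_EFFECTS)
    return SHARED_PRESETS

def dx10_effects_enabled(api, preset):
    """DX10-only effects actually active at this API/preset combination."""
    if api != "dx10":
        return set()
    return DX10_ONLY_EFFECTS.get(preset, set())

# At medium you run the DX10 executable but no DX10-only effects:
print(dx10_effects_enabled("dx10", "medium"))     # set()
print(dx10_effects_enabled("dx10", "very high"))  # the DX10-only effects
print(available_presets("dx9"))                   # no "very high" option
```

Under this model, "DX10 mode at medium" renders the same effect set as DX9 medium, which is the whole framerate argument above.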

DX10 settings for the GT to be playable (33.8 fps):
Shadows, shaders, post-processing at medium; everything else high

DX9 playable settings for the GT (33.3 fps):
Shadows: high
Shaders: medium
Post-processing: high
Everything else: high

So if you are running a GT and you want this game playable, you have to run shadows, shaders, and AA, the things that improve the DX10 look, in medium mode... while in DX9 some of them can be in high mode... and keep in mind that the new DX10 stuff only happens in VERY High mode... That means a GT is a great DX9 card as far as Crysis is concerned, or a DX10 card for other games, but a single GT cannot run Crysis in DX10 mode at playable framerates if you enable any actual DX10 features... you are better off enabling some MORE DX9 features and playing at ALMOST max in DX9... Even the GTX was unable to reach any DX10 features at playable FPS...

The thing is, I am not buying games for just one card! I played and enjoyed Bioshock on my 7900 OC, bought for $250 a year ago... I will just skip Crysis until they polish their engine to the point it actually works... The question is: what card can I replace my 7900 with and actually play games like Company of Heroes in DX10 mode with actual DX10 EFFECTS... CoH can be MAXED OUT graphically on my current card. I play at 1920x1200 with max everything (except the DX10-only features). But the 8600GTS performs a LOT worse than my 7900... I would not be able to get all the DX9-level effects on it, much less any DX10 effects... the CHEAPEST DX10 card I can get that will actually give me DX10 effects in CoH is the GT (the GTS will work, but it costs more than the GT...)

THIS is why I call the GT a minimal card: it is the minimum a person can have if they want to actually have DX10 effects... if NOT, then they are better off buying a 7900 REAL cheap and enjoying DX9 in all its glory.

I am fully aware that the GT is the most bang for the buck and ALMOST the most powerful card on the market...
 
Oct 30, 2004
11,442
32
91
Originally posted by: JBT
I think a lot of people are basing all these cards' performance on Crysis. There are SOOO many more games out there than this. Sure, I will get it and play the hell out of it, but there are other great games out there that I plan on playing quite a bit that are much more suited to spitting good frames out with the GT. ...

I predict that people will still be playing Unreal Tournament 3 on public servers and in pick-up games and clan matches long after most people have forgotten about Crysis.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: taltamir
That article corroborates every single one of my claims:
"Depending on which mode you are running in, DX9 or DX10, some options may or may not be available. The highest setting allowed in the game is a setting called "Very High" for every graphics option. However, "Very High" is only present when you are running in the DX10 code path. In DX9 the highest in-game settings available are "High." There is also a "Medium" and "Low" for both modes as well. There are no gameplay differences between DX9 and DX10 rendering modes, only visual image quality differences. In DX10 you will receive better post processing effects including motion blur and HDR lighting quality as well as crisper surface details."

- So low and medium are the same for DX9 and 10... High is when unique DX9 features come into play. Very High is when unique DX10 features that replace them come into play and make the game look better than on a DX9 card...
Keep in mind DX is not a kernel; it is mainly a collection of libraries from which the game draws code. So DX10 means there are more things to keep track of (lower performance) but more commands and options the engine can use... Some of those engine commands are new to DX10; most existed for many versions. So even in DX10 mode the game is running a lot of DX9 code, and some DX8 code, and so on... it just adds more commands for the engine to use; it doesn't replace every single command ever made with a spanking-new DX10 alternative that magically looks better...
- forgive my gross oversimplification

DX10 settings for the GT to be playable (33.8 fps):
Shadows, shaders, post-processing at medium; everything else high

DX9 playable settings for the GT (33.3 fps):
Shadows: high
Shaders: medium
Post-processing: high
Everything else: high

So if you are running a GT and you want this game playable, you have to run shadows, shaders, and AA, the things that improve the DX10 look, in medium mode... while in DX9 some of them can be in high mode... and keep in mind that the new DX10 stuff only happens in VERY High mode... That means a GT is a great DX9 card as far as Crysis is concerned, or a DX10 card for other games, but a single GT cannot run Crysis in DX10 mode at playable framerates if you enable any actual DX10 features... you are better off enabling some MORE DX9 features and playing at ALMOST max in DX9... Even the GTX was unable to reach any DX10 features at playable FPS...

The thing is, I am not buying games for just one card! I played and enjoyed Bioshock on my 7900 OC, bought for $250 a year ago... I will just skip Crysis until they polish their engine to the point it actually works... The question is: what card can I replace my 7900 with and actually play games like Company of Heroes in DX10 mode with actual DX10 EFFECTS... CoH can be MAXED OUT graphically on my current card. I play at 1920x1200 with max everything (except the DX10-only features). But the 8600GTS performs a LOT worse than my 7900... I would not be able to get all the DX9-level effects on it, much less any DX10 effects... the CHEAPEST DX10 card I can get that will actually give me DX10 effects in CoH is the GT (the GTS will work, but it costs more than the GT...)

THIS is why I call the GT a minimal card: it is the minimum a person can have if they want to actually have DX10 effects... if NOT, then they are better off buying a 7900 REAL cheap and enjoying DX9 in all its glory.

I am fully aware that the GT is the most bang for the buck and ALMOST the most powerful card on the market...

Are you not reading what I'm reading? Did you not see the high DX9 vs. high DX10 image quality tests? One is running DX9 and one is running DX10.

Image Quality

DirectX 9 vs. DirectX 10

First we wanted to examine DX9 image quality versus DX10 image quality. To do so we are running the game with all options at "High" in DX9 mode and in DX10 mode.

[Five comparison screenshots from the article omitted]

In the first screenshot we see that there is a slight difference in the shadow on the edges; they are a tad smoother in DX10. The shadow also appears to be darker in DX10 due to a slight lighting difference. In the second screenshot we are comparing the depth of field effect, and in the third screenshot motion blur, and we see no differences between DX9 and DX10 at the "High" settings. In the fourth and fifth screenshots we are looking at lighting quality and lighting quality with water, and again see no noticeable differences between DX9 and DX10. The benefits that DX10 brings to this game visually are not realized until you enable the "Very High" quality settings, as you can see below.


The difference is there but again it's minimal.
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
Originally posted by: Sureshot324
I don't see why people have a problem with high-end video cards being $500-plus. If you want to spend ~$150 for a video card, then do it. It's not like $150 video cards would be any faster if $500+ cards didn't exist. Competition between AMD and Nvidia is too fierce for that. They're going to make the best card they possibly can at each price range.

The real problem here is that AMD has not provided any real competition for nVidia at the top end in the last year. We are seeing the same thing that happened with the Socket 939 X2 CPUs when Intel was still stuck with NetBurst processors: AMD could (and did) charge $300 for the lowest-end dual-core processor because they had no real competition at that level of performance.

If there had been serious competition at this mid-level of performance, we would have seen the GTS 320 pushed down into the <$200 range by now, instead of sitting in the $250-300 range it has occupied since launch.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
I think what's lost a bit in all this discussion is the change in people's expectations from a video card and the shift HD has brought upon the industry. You have to realize that cheaper, larger LCDs have brought high resolution gaming to the masses and the video card industry is playing catch-up.

Even a year ago, 1600x1200 was considered a "higher resolution." Now it's mid-range. 1920x1200 is pushing 2x the number of pixels of 1280x1024, and 2560x1600 is pushing 4x the number of pixels of 1280x1024. And nowadays people want all that with 4xAA!
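For reference, the raw pixel counts behind those ratios (plain arithmetic, nothing assumed):

```python
# Pixel counts for the resolutions under discussion.
base = 1280 * 1024       # 1,310,720 pixels
wide = 1920 * 1200       # 2,304,000 pixels
big = 2560 * 1600        # 4,096,000 pixels

print(wide / base)  # ~1.76x the pixels of 1280x1024
print(big / base)   # exactly 3.125x the pixels of 1280x1024
```

So the round "2x" and "4x" figures slightly overstate it; the actual ratios are about 1.8x and 3.1x, which doesn't change the point that the pixel workload roughly doubled and tripled-plus.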

If you take a step back and look at the difference between gaming on a 17-20" monitor vs. a 22"+ monitor/HDTV, I think most would say the difference in video card price is worth it relative to the difference in gaming experience. Personally I'm still amazed that the G80 made 1920+ resolutions a viable gaming option in a single generation leap from its predecessors, and it's only going to get better as video card makers continue to meet these new expectations.
 

syn0s

Member
Jul 9, 2006
179
0
76
I agree with the OP. To me, "mid range" has always meant $100-$175, maybe $200 MAX. Anything above that was high end.

The 8800GT is NOT a mid-range card. The naming convention alone says it isn't mid-range. Any x800, x900, or x950 from nVidia is going to be high end, plain and simple. That's how their naming works. It might be on the lower end of the high-end spectrum, but it is still high end. Nvidia needs to release updated 8600GTs and GTSs for the mid-range, and they need to put out some decent performance, just like previous generations (7600GT/GS, 6600GT, etc.).

This card only really pays off at 1280x1024 or above. Heck, even 1280x1024 is a breeze for this card, except for Crysis and WiC. The real benefits come at 1600x1200 or 1680x1050 widescreen. Maybe a 1440x900 widescreen resolution would also benefit, but once again, only in Crysis or other demanding games. And yes, I remember when 1280x1024 was considered a "high" resolution. Now it should be considered low-to-mid range, the bare minimum for a decent setup.

The future of games will become more rigorous for gfx cards, that's for sure. Either that or Crysis is just terribly unoptimized. At 1280x1024, if you can tell the difference between 120 and 140 FPS, you are superhuman.

Heck, I remember when the original Diamond Monster 3D 4MB 3D accelerator cost me $100 alone, and that was after it had come down in price quite a bit. Remember how you had to hook up the link cable to your 2D video card in the back? Pwnage.
 

Deleted member 4644

I think I get what the OP is trying to say...




First, if you look at the Geforce 256 days, $300 (or so) was the TOP END card.

Now, if you look at the 8800GT, you are paying $300 for the same performance as you got last year from the 8800GTX.

Compared like that, $300 seems a bit disappointing.

Then again, card makers have to spend a lot more on R&D now than they did back in 1999, so I think that is where the top end price of $600 comes from (plus some greed maybe).
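One factor the Geforce 256-era comparison skips is inflation; a rough sanity check (the 2.5% annual rate is an assumption for illustration, not actual CPI data):

```python
# Compound a 1999 price forward to 2007 at an assumed flat inflation rate.
# The 2.5%/year figure is an illustrative assumption, not measured CPI.

def inflate(price, rate, years):
    """Price after compounding `rate` per year for `years` years."""
    return price * (1 + rate) ** years

# $300 in 1999 dollars, carried forward 8 years at an assumed 2.5%/year:
print(round(inflate(300, 0.025, 8)))  # ~366, nowhere near $600
```

Even under generous assumptions, a 1999 $300 flagship adjusts to well under $400 in 2007 dollars, so the jump to $600 is mostly the R&D (and greed) factors mentioned above rather than inflation.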
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: nonameo
Gee, I remember when the Voodoo2 was top of the line. It was $200 new. Yes, I did buy one.

edit: inflation, or just less importance placed on the video card?

Voodoo 2s were not $200 new.
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
Originally posted by: syn0s
Heck, I remember when the original Diamond Monster 3D 4MB 3D accelerator cost me $100 alone, and that was after it had come down in price quite a bit. Remember how you had to hook up the link cable to your 2D video card in the back? Pwnage.

I have one of those in the closet. ;)
 

GreggyD

Member
Aug 2, 2000
97
0
0
I paid $200 for the original Intergraph Rendition 1000 card back in Nov 1996.

Eleven years later, $239 for a Superclocked 8800GT is a hell of a deal....which is why I bought one.
 

ManWithNoName

Senior member
Oct 19, 2007
397
0
0
Originally posted by: GreggyD
I paid $200 for the original Intergraph Rendition 1000 card back in Nov 1996.

Eleven years later, $239 for a Superclocked 8800GT is a hell of a deal....which is why I bought one.

Congrats. I have been around a long time and I usually don't get stumped, but for the life of me I couldn't remember the Intergraph. Looked it up to jog my memory, and as soon as Number Nine was mentioned the gray matter kicked in. That was quite a card for its time.....

http://en.wikipedia.org/wiki/Rendition_(company)

I'm not even going to start on what I paid for some of my cards and it was usually near the $350-$400 mark for high-end cards such as the X800/X850 PE series, Hercules Prophets, etc.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Azn
Originally posted by: taltamir
That article corroborates every single one of my claims:
"Depending on which mode you are running in, DX9 or DX10, some options may or may not be available. The highest setting allowed in the game is a setting called "Very High" for every graphics option. However, "Very High" is only present when you are running in the DX10 code path. In DX9 the highest in-game settings available are "High." There is also a "Medium" and "Low" for both modes as well. There are no gameplay differences between DX9 and DX10 rendering modes, only visual image quality differences. In DX10 you will receive better post processing effects including motion blur and HDR lighting quality as well as crisper surface details."

- So low and medium are the same for DX9 and 10... High is when unique DX9 features come into play. Very High is when unique DX10 features that replace them come into play and make the game look better than on a DX9 card...
Keep in mind DX is not a kernel; it is mainly a collection of libraries from which the game draws code. So DX10 means there are more things to keep track of (lower performance) but more commands and options the engine can use... Some of those engine commands are new to DX10; most existed for many versions. So even in DX10 mode the game is running a lot of DX9 code, and some DX8 code, and so on... it just adds more commands for the engine to use; it doesn't replace every single command ever made with a spanking-new DX10 alternative that magically looks better...
- forgive my gross oversimplification

DX10 settings for the GT to be playable (33.8 fps):
Shadows, shaders, post-processing at medium; everything else high

DX9 playable settings for the GT (33.3 fps):
Shadows: high
Shaders: medium
Post-processing: high
Everything else: high

So if you are running a GT and you want this game playable, you have to run shadows, shaders, and AA, the things that improve the DX10 look, in medium mode... while in DX9 some of them can be in high mode... and keep in mind that the new DX10 stuff only happens in VERY High mode... That means a GT is a great DX9 card as far as Crysis is concerned, or a DX10 card for other games, but a single GT cannot run Crysis in DX10 mode at playable framerates if you enable any actual DX10 features... you are better off enabling some MORE DX9 features and playing at ALMOST max in DX9... Even the GTX was unable to reach any DX10 features at playable FPS...

The thing is, I am not buying games for just one card! I played and enjoyed Bioshock on my 7900 OC, bought for $250 a year ago... I will just skip Crysis until they polish their engine to the point it actually works... The question is: what card can I replace my 7900 with and actually play games like Company of Heroes in DX10 mode with actual DX10 EFFECTS... CoH can be MAXED OUT graphically on my current card. I play at 1920x1200 with max everything (except the DX10-only features). But the 8600GTS performs a LOT worse than my 7900... I would not be able to get all the DX9-level effects on it, much less any DX10 effects... the CHEAPEST DX10 card I can get that will actually give me DX10 effects in CoH is the GT (the GTS will work, but it costs more than the GT...)

THIS is why I call the GT a minimal card: it is the minimum a person can have if they want to actually have DX10 effects... if NOT, then they are better off buying a 7900 REAL cheap and enjoying DX9 in all its glory.

I am fully aware that the GT is the most bang for the buck and ALMOST the most powerful card on the market...

Are you not reading what I'm reading? Did you not see the high DX9 vs. high DX10 image quality tests? One is running DX9 and one is running DX10.

Image Quality

DirectX 9 vs. DirectX 10

First we wanted to examine DX9 image quality versus DX10 image quality. To do so we are running the game with all options at "High" in DX9 mode and in DX10 mode.

[Five comparison screenshots from the article omitted]

In the first screenshot we see that there is a slight difference in the shadow on the edges; they are a tad smoother in DX10. The shadow also appears to be darker in DX10 due to a slight lighting difference. In the second screenshot we are comparing the depth of field effect, and in the third screenshot motion blur, and we see no differences between DX9 and DX10 at the "High" settings. In the fourth and fifth screenshots we are looking at lighting quality and lighting quality with water, and again see no noticeable differences between DX9 and DX10. The benefits that DX10 brings to this game visually are not realized until you enable the "Very High" quality settings, as you can see below.


The difference is there but again it's minimal.


Yes, the quality on HIGH in DX10 vs. the quality on HIGH in DX9 was better... but they specifically said they could NOT get a playable framerate at high quality in DX10... they could get a playable framerate on high in DX9, and on medium in DX10... That's my whole point!
I believe you that if you crank the quality up enough DX10 looks better... but you can't reach that point!

Show me pictures comparing the high-setting DX9 to the medium-setting DX10, both of which average 33 fps in that game... and let's see which one looks better!
 

Aiden

Member
Jan 2, 2003
88
0
0
Prices are always overinflated on new products. I expect the GT to be around $200 or so for Christmas for the basic model, which would put it in the mid-range price bracket for a video card. Right now MSRP on a 640MB GT is $450 for EVGA's overclocked card; you can get it for $350 after rebates.

Prices are inflated because it's a new product, and a quality new product at that. This is potentially one of those cards that comes along once every so often. It might end up being compared to the Ti4200, Radeon 9800, and Voodoo cards, which puts it in extremely rare company.

Once prices drop, it becomes one of the mainstream video cards for people to get. Upgrading from almost any 7-series or 2nd-generation ATI card is a huge benefit for most people.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
640MB = GTS, not GT... and the GTS costs way more than the GT, despite performing worse... it is not a new product though; it is a year old.
 

hans007

Lifer
Feb 1, 2000
20,212
17
81
The 8800GT is just slightly above the midrange price.


When the GeForce 2 and Voodoo5 5500 were out they were about $300 each (that was 6-7 years ago, btw).


$250 for an 8800GT is not really midrange. I'd say midrange is actually about $180 now (we have to adjust for, well, inflation, the fact that the dollar is not worth as much now compared to other currencies, etc.), which is where the 3850 is.

I think card prices in Europe are actually falling in relative terms, since before they would have to pay a huge premium.

Also, 8800GT prices are artificially high because there is no supply of them and they are new. The 3850 is relatively available and is $180... I suppose if you compare it to the Voodoo2 though, one 8800GT is not the fastest card around, but it's close enough to an 8800GTX. So I suppose you could say that the high end is actually just two 8800GTs, just like back in the Voodoo2 12MB days (and though they were originally $299, even when they first came out, if you bought one of the smaller brands you could get one for $250... I actually got a "bestdata" one for $200 just a few months after it came out).
 

Aiden

Member
Jan 2, 2003
88
0
0
One of the main problems has been that there have not really been any good midrange cards in the entire last generation from either ATI or Nvidia. There has been no wow card from either for people to latch onto in the price range most people consider midrange.

So it is perception in some respects, but a big part is the lack of quality products for many people in the mainstream market. Introduce the 8800GT and people are really happy with the performance regardless of the price. Add in that the enthusiast market can overclock the card and compete at the GTX level for half the price, and you have a highly valued card similar to the 9700/9800 Pro series from ATI, or the 6600/6800 series from Nvidia.

The only reason the 8800GT was even launched in November was to compete with ATI and the 3850/3870 release. It's basically a soft launch of a product in order to sway public opinion.

Prices will drop at some point, but probably not till after the holidays.
 

cubeless

Diamond Member
Sep 17, 2001
4,295
1
81
The broader theme here is that the minimum you 'need' keeps going skyward... everyone needs a CAT scan or MRI for every bump, a plain old X-ray won't do; everyone needs leather and 600W sound in their cars; everyone needs a big-screen TV and 87 HBO channels; 2,500 sq ft is a starter home... just keep pushing up the perceived base level...

This is the American way... the marketers push your buttons; you spend all your money on junk; poor you, can't afford healthcare while you sit in your ...

Maybe we can get the Democrats to fix this problem, too!!! Video card allowances for underprivileged gamers who still have CRTs!!!