This is a crosspost from the thread about the person who could
not get sharp text on his 1600x1200 monitor. There has
been a fair amount of discussion about image quality
and frame rate as important measures in a videocard.
The best budget videocard in this thread got a rating of "8" on somebody's absolute scale of videocard perfection.
This thread brings out points that need making when one is
talking about "best": best at what price point, on which part of
the price scale, under which operating system, running which
games, which card would I recommend to my neighbour if I did not
wish to be his overnight best friend, would you pay a $60
premium on a CPU if it could get you 86% more performance?, etc,
etc, etc.
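On that last question, a quick bit of arithmetic makes the point. A minimal sketch in Python, assuming a hypothetical $200 baseline CPU (the baseline price is my invention; only the $60 premium and the 86% figure come from the discussion above):

base_price, base_perf = 200.0, 100.0   # assumed $200 baseline at performance index 100
prem_price = base_price + 60.0         # the $60 premium part
prem_perf = base_perf * 1.86           # 86% more performance
print(base_perf / base_price)          # 0.500 performance points per dollar
print(prem_perf / prem_price)          # ~0.715 performance points per dollar

At those assumed numbers the premium part actually wins on raw performance per dollar; start from a much cheaper baseline chip and the answer flips, which is exactly why the price point matters.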
The crossposted text below talks about the days in 1998, right
after the next-generation hardware conquered the last-generation
software and the other video card metrics were brought back
into the fold.
Here is the link referring to the <a target=new href="http://www.tomshardware.com/graphic/98q2/980427/chips-12.html#conclusion_">landmark
3D quality</a> of Matrox (2D as well) back in the 1998 era.
The interesting thing is that both 2D and 3D round-off effects
such as banding were discussed then but are forgotten in today's
"perfect" high-end graphics reviews, which mention ever more
framerate as the primary/primary/primary arbiter in an area
that breaks down for me as:
1. Get 2D right
2. Stable drivers to run 90% of games out of the box
3. Meet 3D performance minimums
4. Extra 3D accuracy
5. Last, extra DVD/VI/VO features
The above is what I would call a more balanced prioritization
of the rating metrics used to gauge a modern videocard, and
many in the industry concur.
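To show what I mean by "balanced", here is a minimal sketch in Python of a weighted scorecard over those five metrics. The weights and the sample scores are invented for illustration only; they are not from any published review:

# Hypothetical weights for the five metrics above (summing to 1.0).
weights = {
    "2d_quality": 0.30,
    "driver_stability": 0.25,
    "3d_performance_minimums": 0.20,
    "3d_accuracy": 0.15,
    "dvd_vi_vo_extras": 0.10,
}

def overall_score(scores):
    # Weighted average of per-metric scores, each on the same
    # 0-10 scale as the "8" rating mentioned at the top.
    return sum(weights[m] * scores[m] for m in weights)

# An imaginary fps monster: stellar framerate, weak everywhere else.
fps_monster = {
    "2d_quality": 4,
    "driver_stability": 5,
    "3d_performance_minimums": 10,
    "3d_accuracy": 6,
    "dvd_vi_vo_extras": 5,
}
print(overall_score(fps_monster))  # 5.85, a long way from an "8"

A framerate-only review would hand that card a 10; weighted this way it barely passes, which is the whole point of the list.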
The DVD/VI/VO features become more important as users get more
sophisticated and master using the computer as a toaster to
create and archive VHS-to-DVD transfers of treasured camcorder
footage.
If the sales success of certain graphic chipsets is any
indication, 3D has been the easy-sell loss leader: one can get
away with blurrier text than the competitor, unstable driver
releases that work on only a few platforms, current draw that
exceeds the reasonable limits of last year's more popular
motherboards, overheating, burning-hot heatsinks, $500 retail
prices, rare memory types, separately powered graphics cards, AND
THE FPS COUNTER WILL STILL SELL PRODUCT
in spite of these system oversights. Who is to
blame? The benchmark mentality of the PC industry
and/or the consumer who believes too readily in
the myth of a high non-obsolescing fps benchmarking
number. My guess is that the original poster of this thread,
LSD, is barking/hallucinating up the right tree, and it is those
"el cheapo" 1600x1200 monitors
( http://forums.anandtech.com/messageview.cfm?start=21&catid=31&threadid=289430 )
that hide the glorious inner details while sadly remaining
capable of showing the higher frame-rate counter number in the
top right corner of the screen.
Cheers.