Most Useless video card features of our time article

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Ah, I remember the Savage 2000. Can't remember what article it was I read it in, but it went something like this. "With their ViRGE chipset, S3 managed to create the world's first 3D Decelerator. As incredible as it may sound, S3 actually manages to top that achievement with the world's first T&L decelerator." :p
 

TStep

Platinum Member
Feb 16, 2003
2,460
10
81
subtitled: a trip down memory lane:p

Reading that reminds me of how damn easy it is anymore to just buy a video card, buy a game, pop it in the box, and play with beautiful IQ. No longer have to deal with the driver nightmares, no wrappers, no card-specific game titles that worked in your buddy's box but not yours, or any of that crap. Now all there is to argue about are a few fps difference and a couple of optimizations ;) Kind of miss the challenge of old though.
 

Slaimus

Senior member
Sep 24, 2000
985
0
76
Some things that came to my mind:

1. Integrated TMDS transmitters - it was a feature ever since the Geforce3, yet every cardmaker still uses an external Silicon Image chip because the internal one was not powerful enough to drive high res LCDs. EDIT: Even the Voodoo3 3500 had "LCDfx"

2. "nVidia Shading Rasterizer" on the GF2 GTS - it promised limited pixel shading ability, except no one ever used it.

3. "Vertex Skinning" on the Radeon - it was supposed to make realistic joints on models possible, probably the predecessor to TRUFORM.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
so what I don't get is whether ANY proprietary technologies like these EVER see mainstream adoption. is nvidia like the house of representatives and ati like the senate and microsoft/opengl developers like the presidency? a scenario where the house and the senate approach the president with legislation (new features) and the president has the power to veto it.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Originally posted by: Slaimus
Some things that came to my mind:

1. Integrated TMDS transmitters - it was a feature ever since the Geforce3, yet every cardmaker still uses an external Silicon Image chip because the internal one was not powerful enough to drive high res LCDs. EDIT: Even the Voodoo3 3500 had "LCDfx"

Not quite - most card designs use an external TMDS because the signal quality of NVidia's integrated one just plain sucks, making DVI frequencies above 110 MHz quite unusable. This isn't about power, it's just about signal integrity.
ATi's in turn works very well all the way up to 165 MHz, and this is why you won't see a discrete TMDS chip on an ATi card unless it's implementing dual DVI.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Missing 3D textures on that list(they were far too memory intensive to actually be useful and still are for a while yet).

2. "nVidia Shading Rasterizer" on the GF2 GTS - it promised limited pixel shading ability, except no one ever used it.

DooM3 is built around it(although NSR is PR speak for features that were already there in the original GeForce). D3 will run on NV1x hardware but won't on R1x0 because of what could be done with register combiners("NSR").

3. "Vertex Skinning" on the Radeon - it was supposed to make realistic joints on models possible, probably the predecessor to TRUFORM.

Vertex skinning is commonly in use today, all of the remotely current parts have supported it since the GeForce(minus the nigh 'featureless' V5 of course). It's much like hardware T&L, almost everything uses it now, and everyone has it.
 

OMG1Penguin

Senior member
Jul 25, 2004
659
0
0
I ran doom 3 on my geforce 256 (32mb)!!

With my a64 @ 2400~

From the responses I got when I first posted that, I am surprised the machine even POSTed.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
3D rendered interpretation of your own head. In practice, it was never used, and the aforementioned 3D head was... well... bloody scary. The only real use I can imagine it being suitable for is terrorising small children and giving them nightmares.

LMAO
 

kmmatney

Diamond Member
Jun 19, 2000
4,363
1
81
I was one of the unfortunate souls who briefly had a Voodoo Rush based card. What an utter disaster! It never did work right - I finally gave up and threw it away.

The biggest video card debacle has to be BitBoys, however.
 

Elcs

Diamond Member
Apr 27, 2002
6,278
6
81
Originally posted by: kmmatney
I was one of the unfortunate souls who briefly had a Voodoo Rush based card. What an utter disaster! It never did work right - I finally gave up and threw it away.

The biggest video card debacle has to be BitBoys, however.

The only problem I ever had with my Voodoo Rush card was finding drivers for it. It was near flawless.... until people decided to make games look nicer. Then it kind of slowed down to uselessness.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Elcs
Originally posted by: kmmatney
I was one of the unfortunate souls who briefly had a Voodoo Rush based card. What an utter disaster! It never did work right - I finally gave up and threw it away.

The biggest video card debacle has to be BitBoys, however.

The only problem I ever had with my Voodoo Rush card was finding drivers for it. It was near flawless.... until people decided to make games look nicer. Then it kind of slowed down to uselessness.

I had one, too. I don't remember any problems with it, other than that it wasn't all that fast...
 

ScrewFace

Banned
Sep 21, 2002
3,812
0
0
The most useless feature is nVidia's support for Shader Mark 3.0. Games are just finally using DirectX 8.1 pixel and vertex shaders. Shader Mark 3.0 won't be in full use for 2 years and by that time the GeForce 6800 series won't be able to run the new games at playable framerates anyhow.:)
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Do you mean Shader Model 3.0?

By the end of the year I believe ~12 titles will be using some form of the model.

The most notable are

Farcry and Stalker.
 

Despy

Member
Dec 26, 2000
34
0
0
Used to have one of those 3dfx V5 cards. It actually worked pretty well and I played all the way through Half-Life with that bad boy. Thief with 2xAA back in the day was pretty durn cool. I always lamented the fact that the T-buffer was never used for anything other than AA. There was actually a demo that showed off the effects and they were pretty neat.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Disagree with the MAXX flame in there, I loved my MAXX, a very cool card to own.
I suppose to be honest I should note I got rid of it due to driver issues, like the walls flashing or just being blank in UT.

The Radeon VIVO was a great card as well. 3 tmus per pipe or no.