
Most influential graphics cards in history


biostud

Lifer
Feb 27, 2003
voodoo
GF256
9700pro
GF4 4200Ti (low cost, compared to performance)
8800GTX

Rest are just improvements of the big leaps IMHO.
 

BFG10K

Lifer
Aug 14, 2000
My list:

  • Voodoo 1 (mainstream 3D acceleration).
  • Voodoo 2 (SLI, the precursor to Crossfire/nVidia SLI).
  • GeForce DDR (T&L, huge leap in memory bandwidth).
  • Voodoo 5 (hardware integrated SLI, 4x rotated grid super-sampling).
  • GeForce 3 (programmable shaders).
  • Radeon 9700 Pro (16xAF fast enough to be standard in all games).
  • 8800 GTX (unified shaders, the performance king for about 18 months).
Originally posted by: Chizow

Great 2D desktop IQ also, even though it was only 16-bit.
Eh? VSA-100 chips fully supported 32 bit color in both 2D and 3D. Even the Voodoo 3 supported 32 bit color in 2D mode, though it was limited to 16 bits in 3D.

Originally posted by: BenSkywalker

even their version of FSAA which was easily replicated by everyone else
Easily replicated? Tell me, what other consumer video card has ever offered 4x rotated grid super-sampling? The closest is probably a pair of 7950 GX2s (Quad SLI) running 32xSLI AA, but the texture samples are so ridiculously close to each other that impact on IQ would be minimal.
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: Wreckage
Originally posted by: cmdrdredd


The 4870 was slower than?
The GTX280 which had been out for awhile. Very much so.

When the 8800 series came out it absolutely demolished any other card on the market regardless of price point.

The 4870 was between second and third place. Hardly "influential" or even mildly impressive.

Had it passed the GTX280 by 30% or more maybe, but it fell behind. In a 2 horse race 2nd place is last place.
I disagree. The 8800GT was also slower than the 8800GTX, but because it offered almost as much performance for half the price, it made an impact on far more people than the 8800GTX, which only mattered to those who shelled out $600 when it was launched. Likewise, the 4870 was second only to NV's much more expensive GTX280, and forced NV to drop prices across their entire lineup.
 

Modelworks

Lifer
Feb 22, 2007
Matrox Millennium should be on that list. It was one of the best-selling cards of its day because it allowed people doing things like photo editing and desktop publishing to have 2D acceleration with millions of colors on the desktop.

 

nRollo

Banned
Jan 11, 2002
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: cmdrdredd


The 4870 was slower than?
The GTX280 which had been out for awhile. Very much so.

When the 8800 series came out it absolutely demolished any other card on the market regardless of price point.

The 4870 was between second and third place. Hardly "influential" or even mildly impressive.

Had it passed the GTX280 by 30% or more maybe, but it fell behind. In a 2 horse race 2nd place is last place.
I disagree. The 8800GT was also slower than the 8800GTX, but because it offered almost as much performance for half the price, it made an impact on far more people than the 8800GTX, which only mattered to those who shelled out $600 when it was launched. Likewise, the 4870 was second only to NV's much more expensive GTX280, and forced NV to drop prices across their entire lineup.
That's why the whole discussion is pointless.

If you're Joe Workstation, the current FX5800 4GB might be the most influential.

If you're Joe Pinchpenny, the 9600GT might be.

If you're Joe Ihaveacareer, the 4870X2 might be.

Etc.

This discussion is about as hard to define as "Who has the best drivers".
 

Creig

Diamond Member
Oct 9, 1999
Originally posted by: BenSkywalker
What about setups like the 8800GT or particularly the 9600GT in SLI? The 4850 cost ~30% more than the 8800GT when it launched and more often than not failed to match that performance advantage (it did exceed it under some circumstances). The 9600GT in SLI simply smacked the 4850 around at the same price point. What did it bring to consumers exactly? It was a very competitive single-GPU part at its price point, nothing at all more.
??? At launch, the 8800GT had an MSRP of $249, but you would be hard pressed to find one for less than $300. When the 4850 launched, its MSRP was $199 and could be had for $250. Yet the price on the 8800GT didn't really fall until the 4850 arrived. I purchased an 8800GT back in December and it still cost me $250. And that was the absolute cheapest I could find one for. Seven months later I picked up an even faster 4850 for $140 AR. How does that make the 8800GT cheaper?

And as far as SLI/Crossfire, you'll notice I mentioned that the 4850 belongs on that list for the gaming power it brought to the masses. Just how many people of the general public have an SLI or Crossfire capable motherboard? Or an SLI/Crossfire power supply? And how many moms and pops would be willing to buy both of those simply in order to run two video cards simultaneously? I said "power to the masses" because the 4850 is a fast, cheap card that can be put into virtually any system with a high-speed PCI-E port and a 6-pin PCI-E connector. That's worlds away from SLI/Crossfire price/complexity.

Originally posted by: BenSkywalker
Base graphics are designed around sub-$100 parts; go ahead and look at your games library and see for yourself. The 4850 still hasn't gotten into that bracket yet, nor will it any time soon (a possible exception being some BF-sale-type event). To say the 4850 has had pretty much no long-term impact on the market would range from fairly accurate to overstating what the card brought to the table.
Not long after the 4850 was released, it could be purchased for around $140 AR. Right now you can pick one up at Newegg for $130 AR. That's starting to get VERY close to the sub-$100 price you mentioned. And the processing power difference between some sub-$100 part and a $130 4850 is immense, much more so than the price difference.

Rollo is right that this is a very subjective topic. It is simply my opinion that the ease of installation, extremely low price and relatively high performance level of the 4850 (not to mention the corresponding price cutting Nvidia was forced to do) is going to help both the public and the gaming industry in not only the short-term, but the long-term as well by giving developers a more powerful base of cards to program for.
 

Compddd

Golden Member
Jul 5, 2000
Originally posted by: WT
Still have a Canopus Voodoo1 6mb card (n00bs had the 4mb one, I roll with 6mb, baby !!) and a Creative Voodoo2 12mb card. Creative had a firesale back in '98 IIRC, and was selling them for $50.

First true 3D card for me was the STB Riva 128, which didn't have good driver support compared to the 3dfx cards at the time.
My first graphics card was a Canopus Voodoo1 6MB :) I still have it, ahh the memories. Playing Myth the Fallen Lords with full Glide acceleration! I loved that game lol.
 

AmdInside

Golden Member
Jan 22, 2002
I think one card not mentioned that should be is the KyroII. It made companies think more about efficiency (i.e., not a brute-force approach). Although neither NVIDIA nor ATI uses TBR, their drivers became much more efficient after the KyroII.
 

drum

Diamond Member
Feb 1, 2003
I upgraded from a 9500 Pro to an 8800GTX in March of '07, so I'm feeling pretty good about myself no matter what this list says :laugh: Still using the GTX :)
 

rudder

Lifer
Nov 9, 2000
This thread brings back memories...

Of this turd.

I just remember it never really worked that well and I eventually got it swapped with a different card.
 

SlowSpyder

Lifer
Jan 12, 2005
Originally posted by: Wreckage
Originally posted by: cmdrdredd


The 4870 was slower than?
The GTX280 which had been out for awhile. Very much so.

When the 8800 series came out it absolutely demolished any other card on the market regardless of price point.

The 4870 was between second and third place. Hardly "influential" or even mildly impressive.

Had it passed the GTX280 by 30% or more maybe, but it fell behind. In a 2 horse race 2nd place is last place.
The 48x0 series sure was influential on Nvidia's pricing.


I don't know how anyone could say the 48x0 series cards are revolutionary (or the GeForce 6x00, really). Those cards are very evolutionary; they really didn't change the industry the way some of the other GPUs mentioned did.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Easily replicated? Tell me, what other consumer video card has ever offered 4x rotated grid super-sampling?
Any of them could, using the accumulation buffer extensions; it wasn't that hard. RG vs. OG each had their pros and cons; it isn't like either came remotely close to stochastic, which was where the real IQ was going to come from.

GeForce 3 (programmable shaders).
The original GeForce was also programmable; the GF3 just expanded on it by quite a bit.

How does that make the 8800GT cheaper?
WTF are you talking about? Why not tell us what you paid for your first Voodoo2 and then go off about how great CCC is......

The first day the 4850 was available, the 8800GT was a decent amount cheaper, and it still is today. The performance-per-dollar ratio has stayed close, leaning towards the 8800GT.

And as far as SLI/Crossfire, you'll notice I mentioned that the 4850 belongs on that list for the gaming power it brought to the masses.
You can say that over and over again if you'd like. You decided what the masses wanted to pay, what the masses saw as a good value, and what you deemed they desired. What you didn't offer was any benches to back your analysis, any sales numbers for the 4850, or any example whatsoever of the masses wanting the 4850 on some incredible scale.

I said "power to the masses" because the 4850 is a fast, cheap card that can be put into virtually any system with a high-speed PCI-E port and a 6-pin PCI-E connector.
Name me one game the 4850 made playable that wasn't playable on a 9600GT at HALF the price. Not one setting- you want to talk about the masses, name one game.

It is simply my opinion that the ease of installation
If you are consuming very large quantities of narcotics, you should seek professional help- it eventually will give you serious health issues and could lead to an early demise.

not to mention the corresponding price cutting Nvidia was forced to do
9600GTs in SLI were faster and cost the same as the 4850 the day BEFORE the 4850 launched, a decent amount faster too. ATi didn't introduce anything new with the 4850 in terms of performance per dollar, new features or anything outside of a solid single GPU offering.

Although neither NVIDIA or ATI use TBR, their drivers became much more efficient after the KyroII.
No, they didn't at all. Developers started ordering their rendering by z depth to make use of the early-reject hardware both ATi and nVidia had had in place for a while. The KyroII didn't impact anything long term, but it certainly offered a unique perspective on alternative rendering techniques (despite the HUGE shortcomings it had using that technique).
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: SlowSpyder


I don't know how anyone could say the 48x0 series cards are revolutionary (or the GeForce 6x00 really).
The 6 series brought about SM3.0, HDR, Purevideo and modern SLI

That's huge.
 

nRollo

Banned
Jan 11, 2002
Originally posted by: Wreckage
Originally posted by: SlowSpyder


I don't know how anyone could say the 48x0 series cards are revolutionary (or the GeForce 6x00 really).
The 6 series brought about SM3.0, HDR, Purevideo and modern SLI

That's huge.
Agreed. Didn't it introduce soft shadows as well?
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: nRollo
Originally posted by: Wreckage
Originally posted by: SlowSpyder


I don't know how anyone could say the 48x0 series cards are revolutionary (or the GeForce 6x00 really).
The 6 series brought about SM3.0, HDR, Purevideo and modern SLI

That's huge.
Agreed. Didn't it introduce soft shadows as well?
It only introduced a method of rendering soft stencil shadows. Most games use shadow mapping as opposed to stencil shadows, and the devs write PS code to give them soft edges.
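The PS-based soft edges munky mentions are typically done with percentage-closer filtering (PCF): instead of one hard pass/fail depth comparison against the shadow map, several comparisons around the lookup point are averaged, giving fractional shadowing at the edges. This is a rough CPU-side sketch of the idea, not real shader code; the depth map and kernel size are invented for illustration.

```python
# Percentage-closer filtering (PCF), sketched on the CPU.
# A hard shadow test compares one depth sample; PCF averages several
# comparisons around the point, so pixels near a shadow edge get a
# fractional light factor instead of a hard 0/1 boundary.

def pcf_shadow(depth_map, x, y, receiver_depth, radius=1):
    """Return a light factor in [0, 1]: 1 = fully lit, 0 = fully shadowed."""
    h, w = len(depth_map), len(depth_map[0])
    lit, total = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            sx = min(max(x + dx, 0), w - 1)   # clamp to the map edges
            sy = min(max(y + dy, 0), h - 1)
            # Lit if the occluder stored in the map is not in front of us.
            if depth_map[sy][sx] >= receiver_depth:
                lit += 1
            total += 1
    return lit / total

# Toy one-row depth map: an occluder (depth 0.3) covers the left half.
depth_map = [[0.3, 0.3, 0.3, 1.0, 1.0, 1.0]]
factors = [pcf_shadow(depth_map, x, 0, 0.5) for x in range(6)]
```

Near the occluder's edge the factor falls between 0 and 1, which is exactly the soft falloff a single comparison cannot produce.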
 

nRollo

Banned
Jan 11, 2002
Originally posted by: munky
Originally posted by: nRollo
Originally posted by: Wreckage
Originally posted by: SlowSpyder


I don't know how anyone could say the 48x0 series cards are revolutionary (or the GeForce 6x00 really).
The 6 series brought about SM3.0, HDR, Purevideo and modern SLI

That's huge.
Agreed. Didn't it introduce soft shadows as well?
It only introduced a method of rendering soft stencil shadows. Most games use shadow mapping as opposed to stencil shadows, and the devs write PS code to give them soft edges.
Still, when you factor in the stencil shadows, that's 5 checkbox features introduced with that line.

Definitely one of the most "influential" for gamers. (Although I still lean towards the Bitchin' Fast and its record-breaking bungholio marks.)
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
Stencil shadows have existed for years. I can recall a patch for Quake 2 that gave it stencil shadows on certain hardware (TNT, I think).
 

darXoul

Senior member
Jan 15, 2004
Matrox Millennium/Mystique (x)
S3 Virge
3dfx Voodoo (x)
3dfx Voodoo 2
Riva TNT2 Ultra (x)
GeForce 256 DDR
GeForce 4 Ti 4200 (x - but 4400)
Radeon 9700 Pro
GeForce 6800 GT (x)
GeForce 8800 GTX (x)

That's my subjective list - point of view of an enthusiast but price-aware gamer. I owned all the cards marked with (x).
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: BenSkywalker

Any of them could, using the accumulation buffer extensions; it wasn't that hard.
Which ones? Name them. Name any consumer single-slot part that has offered 4xRGSS to a gaming end-user.

RG v OG each had their pros and cons,
Yep. In particular OG is quite easy to bludgeon into a driver and requires minimal hardware support while RG is not quite as easy and generally requires more hardware support.

That's the significance of 3dfx's implementation; the whole thing was designed to operate at the hardware level, unlike the driver-level OGSS hacks the competitors used.

it isn't like either came remotely close to stochastic which was where the real IQ was going to come from.
You mean ATi's Temporal AA? I personally think it's a gimmick.

The original GeForce was also programmable; the GF3 just expanded on it by quite a bit.
Right, which is why I listed both cards. The GF3 was truly programmable, unlike the earlier GeForce "DX7+" cards. That makes the GF3 a landmark card.
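As an aside, the ordered-vs-rotated distinction is easy to show with the sample offsets themselves. With four samples per pixel, an ordered grid only presents two distinct positions along each axis, while a rotated grid presents four, which is why it resolves near-horizontal and near-vertical edges better. These offsets are illustrative only, not the exact pattern of the V5 or any other card:

```python
# 4x sample offsets inside a unit pixel (illustrative values only,
# not the exact pattern any specific GPU used).

# Ordered grid: a regular 2x2 lattice.
ordered = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# Rotated grid: the same lattice rotated about the pixel center,
# so no two samples share a row or a column.
rotated = [(0.375, 0.125), (0.875, 0.375), (0.625, 0.875), (0.125, 0.625)]

def distinct_columns(samples):
    """Distinct x positions: the effective number of steps a
    near-vertical edge sees as it sweeps across the pixel."""
    return len({x for x, _ in samples})

def distinct_rows(samples):
    """Distinct y positions: same idea for near-horizontal edges."""
    return len({y for _, y in samples})
```

With the same four samples, the ordered grid collapses to 2 effective steps per axis while the rotated grid keeps all 4, so a near-vertical edge gets twice as many intermediate coverage levels.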
 

BFG10K

Lifer
Aug 14, 2000
21,711
559
126
Originally posted by: nRollo

Agreed. Didn't it introduce soft shadows as well?
Nope, cards as old as the original Radeon could do soft shadows.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Which ones? Name them. Name any consumer single-slot part that has offered 4xRGSS to a gaming end-user.
Any board that supported the AB extensions could do it if the software developer was interested- they weren't.

Yep. In particular OG is quite easy to bludgeon into a driver and requires minimal hardware support while RG is not quite as easy and generally requires more hardware support.
RG requires no hardware support beyond the basic OpenGL extension support- any board that could render to texture could do it.

That's the significance of 3dfx's implementation; the whole thing was designed to operate at the hardware level, unlike the driver-level OGSS hacks the competitors used.
They had a hardware hack to force accumulation buffer techniques where devs didn't ask for them. Their technique really wasn't even very good: they didn't fix the LOD bias issues with it until a huge uproar from the community (which amused me; I had been warning them months in advance it was going to be a problem). There was NO hardware compensation for the number of samples taken and how that needed to factor in as the footprint for texture sampling became anisotropic but erratic- issues a properly designed hardware implementation would have had absolutely no problem with. The only portion they had in hardware was the initial raw sampling, which by every other element of the design seemed to be more a byproduct of swiping SGI's idea than anything else. After using hardware that was actually built from the ground up to use accumulation buffer techniques, 3dfx's looked like what it was: a very scaled-back hack.

I had it out with 3dfx's crew back when the V5 was first coming around (not their fans, their employees ;) ) and pointed out numerous issues with their technique, including the point that they renamed it from what it had previously been called. Their technique was called MSAA, not FSAA, prior to their version of it; they changed the name to FSAA as the new version of MSAA (which they knew was superior) was on the horizon, being pushed by MS and nV. The way nVidia and ATi were doing FSAA was the way it was done on graphics cards prior to the V5, all the way up to $50K workstations; it wasn't a hack by any means, and it is actually a far more accurate way of doing it. The AB-style version of AA 3dfx used was previously better suited to a type of softening filter, not unlike a Gaussian Blur in Photoshop.

You mean ATi's Temporal AA? I personally think it's a gimmick.
Hehe, no. Like Pixar's PhotoRealistic RenderMan- something like how Yoda was rendered in SW:EP III (the movie, not any game; they use a proprietary render engine, but same idea). Stochastic is VASTLY superior because aliasing needs a pattern, and the best way to eliminate the pattern is through random dispersion of samples when talking about edge aliasing (base texture filtering is running at 128x per pixel; edge AA can't get close to that with current bandwidth limitations).
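The point about pattern versus noise can be sketched numerically. A fixed grid samples identical sub-pixel positions in every pixel, so its quantized coverage error repeats as a visible pattern along an edge, whereas jittered (stochastic-style) sampling scatters that error into noise and converges on the true coverage. A toy sketch, with an invented edge and an assumed seed:

```python
import random

# Estimate how much of a unit pixel lies under the edge y < 0.6,
# first with a fixed 2x2 grid, then with jittered (stochastic) samples.

def grid_coverage(inside):
    # Same four sub-pixel positions for every pixel: the coverage
    # estimate is quantized to multiples of 1/4, and the resulting
    # error repeats identically along the edge (a visible pattern).
    samples = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
    return sum(inside(x, y) for x, y in samples) / 4

def stochastic_coverage(inside, n=256, seed=0):
    # Randomly scattered samples: the error becomes unstructured
    # noise and the estimate converges on the true coverage.
    rng = random.Random(seed)  # seeded only for reproducibility
    hits = sum(inside(rng.random(), rng.random()) for _ in range(n))
    return hits / n

inside = lambda x, y: y < 0.6  # true coverage is exactly 0.6

grid = grid_coverage(inside)          # fixed grid: quantized to 0.5
stoch = stochastic_coverage(inside)   # jittered: close to 0.6
```

A real stochastic renderer uses far fewer, better-distributed samples per pixel than this brute-force sketch, but the structured-error versus noise trade-off is the same.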

Right, which is why I listed both cards. The GF3 was truly programmable, unlike the earlier GeForce "DX7+" cards. That makes the GF3 a landmark card.
Depends where you draw the line in the sand. Can you encode media on the GF3? Nope. Can you fold? No again. You can easily make the argument that the first truly programmable GPU was the 8800, as it was the first one that would run C-based code. Where you draw the line of "fully programmable" can put you in many different spots (is it the GTX 2x0 because of DP + C support?), but the first offering that was programmable, period, was the GeForce 256.
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: darXoul
Matrox Millennium/Mystique (x)
S3 Virge
3dfx Voodoo (x)
3dfx Voodoo 2
Riva TNT2 Ultra (x)
GeForce 256 DDR
GeForce 4 Ti 4200 (x - but 4400)
Radeon 9700 Pro
GeForce 6800 GT (x)
GeForce 8800 GTX (x)

That's my subjective list - point of view of an enthusiast but price-aware gamer. I owned all the cards marked with (x).
That's actually a pretty good list right there.
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: BenSkywalker

Any board that supported the AB extenstions could do it if the software developer was interested- they weren't.
What does developer interest have to do with it? 3dfx implemented it at the driver level and it was available in the bulk of games without developer effort or knowledge.

We have OGSS from nVidia at the driver level now. If RGSS is so easy, why doesn't nVidia offer it instead of only offering OGSS?

We have 4xRGSS with Quad SLI; it's not very good, but by technical definition it's there. Are you saying it was worth it for nVidia to implement it for the 0.0000001% of people using that setup (most of whom won't use it anyway), but there's not enough interest from the rest of their market?

Quad SLI is the only non-3dfx solution to offer 4xRGSS, and it also has four GPUs. That's not a coincidence.

RG requires no hardware support beyond the basic OpenGL extension support- any board that could render to texture could do it.
So again I'll ask, where are the implementations? Why not add xS modes that feature RGSS?

At identical sample sizes a good RGSS implementation should have the same workload as OGSS but offer much better image quality. So if it's easy to implement RGSS, it should be a no-brainer to replace the OGSS modes with it, but that hasn't happened.

Additionally, at the time the T-Buffer method had several advantages over the regular accumulation buffer approach.

The way nVidia and ATi were doing FSAA was the way it was done on graphics cards prior to the V5 up to the $50K workstations- it wasn't a hack by any means, it is actually a far more accurate way of doing it.
What's more accurate? MSAA? MSAA doesn't take texture or shader samples, so it'll be inferior to SSAA, assuming identical sample patterns.

You can't possibly be referring to OGSS. Ordered grid is the worst possible sample pattern for AA; it's wasteful, brute-force, and inefficient, and it's mathematically provable that a rotated grid is superior to it, especially for near-horizontal and near-vertical edges.

It also just happens to be extremely easy to bludgeon in anywhere without hardware support, and that's exactly what ATi and nVidia did back in the day.

I implemented a triangle rasterizer about ten years ago and I found it was trivial to add OGSS. It was so easy that I implemented it despite having a superficial understanding of anti-aliasing at the time.
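That triviality is easy to believe once you see it: OGSS amounts to evaluating the scene at an NxN grid of subsample positions per pixel and box-filtering the result. A minimal sketch, with an invented coverage function standing in for the rasterizer:

```python
# Ordered-grid supersampling as a box filter: evaluate the scene at
# factor x factor subsample positions per pixel, then average.

def ogss_pixel(shade, px, py, factor=4):
    """shade(x, y) -> intensity in 0.0..1.0; returns the filtered pixel."""
    total = 0.0
    for sy in range(factor):
        for sx in range(factor):
            # Subsample centers on a regular (ordered) grid in the pixel.
            x = px + (sx + 0.5) / factor
            y = py + (sy + 0.5) / factor
            total += shade(x, y)
    return total / (factor * factor)

# A hard diagonal edge: everything above the line y == x is lit.
edge = lambda x, y: 1.0 if y > x else 0.0

on_edge = ogss_pixel(edge, 0, 0)   # the pixel the edge passes through
far_lit = ogss_pixel(edge, 0, 5)   # a pixel fully above the edge
```

The edge pixel comes out as a fractional grey instead of a hard 0 or 1, which is the entire effect; swapping the loop for a rotated or jittered set of offsets is the only change the fancier patterns need.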

Stochastic is VASTLY superior as aliasing needs a pattern, the best way to eliminate the pattern is through random dispersion of samples when talking about edge aliasing(base texture filtering is running at 128x per pixel, edge can't get close to that with current bandwidth limitations).
The main premise of stochastic is that the sample pattern varies each frame. While it's true that a fully-fledged implementation randomly varies the pattern, the fact that ATi alternated patterns between frames means TAA was closer to stochastic than any other AA scheme ever available in consumer space.

I never thought much of it, though; in practice the requirement of vsync and a high framerate to avoid flickering made it impractical for most real-time situations, and a real stochastic implementation would have the same limitation. A far better option is to implement RGSS or, even better, SGSS (I believe the V5 6000 had this, but I'd need to check).

One thing is clear; you can be sure Master Yoda wasn't rendered with OGSS. :p

Depends where you draw the line in the sand. Can you encode media on the GF3? Nope. Can you fold? No again. You can easily make the argument that the first truly programmable GPU was the 8800, as it was the first one that would run C-based code. Where you draw the line of "fully programmable" can put you in many different spots (is it the GTX 2x0 because of DP + C support?), but the first offering that was programmable, period, was the GeForce 256.
The GF3 was the first to support DX8, the first version of DirectX to support pixel and vertex shaders. I consider that influential because it was the precursor to the modern-day proliferation of shader usage, unlike the original GF, whose functionality the DX7 spec didn't expose.
 

BenSkywalker

Diamond Member
Oct 9, 1999
What does developer interest have to do with it?
Everything. What you are doing is akin to blaming 3dfx because the original DooM didn't utilize PS 4.0 when played on a Voodoo1. AB effects can be handled by any developer that wants to use them- they haven't shown any interest. AB is entirely software based- 3dfx used a horrific, partially brute-force method to hack it onto games that weren't trying to use it.

We have OGSS from nVidia at the driver level now. If RGSS is so easy, why doesn't nVidia offer it instead of only offering OGSS?
It is a SOFTWARE issue. Every bit of nV's and ATi's boards is capable of doing it if requested; to break developers' code and do it another way, you would need to design the hardware around rendering everything wrong.

At identical sample sizes a good RGSS implementation should have the same workload as OGSS but offer much better image quality.
Not approaching true, not even in the league of it. RG is very, very clearly inferior for everything except edge aliasing on near vertical or horizontal edges. In every other way, OGSS is superior.

You can't possibly be referring to OGSS. Ordered grid is the worst possible sample pattern for AA; it's wasteful, brute force, inefficient, and it's mathematically provable a rotated grid is superior to it
RG is vastly superior at introducing noise. Yes, it reduces ALL patterns far more effectively than OGSS- including the ones you WANT people to see. All textures are hosed- there is nothing you can do at the 0 mip level- RG IQ is always vastly and clearly inferior to OGSS.

It also just happens to be extremely easy to bludgeon in anywhere without hardware support, and that's exactly what ATi and nVidia did back in the day.
AA is like drawing a triangle- it isn't hard, and it is done through brute force. 3dfx was applying a filter that devs could have asked for, without the devs asking for it- they were rendering the game wrong to get the desired effect.

The main premise of stochastic is that the sample pattern varies each frame.
Uhm, no. The premise of stochastic is that the sample pattern varies every pixel. What ATi does is a weak attempt at simulating stochastic.

One thing is clear; you can be sure Master Yoda wasn't rendered with OGSS.
OG would be next after stochastic- RG would never have been a remote option- it's just a blur filter.

The GF3 was the first to support DX8, the first version of DirectX to support pixel and vertex shaders. I consider that influential because it was the precursor to the modern-day proliferation of shader usage, unlike the original GF, whose functionality the DX7 spec didn't expose.
It was exposed, just not by D3D. It wasn't until DX9 that MS started to get serious about making D3D a real overall 3D API; although they obviously still had staggering limitations that they fixed with D3D10, D3D8 was just another step. Yeah, we can get into a lengthy discussion about the merits of API structures, and I can see both ends- but taking points off a part because it was only exposed under the far more robust API isn't quite fair (and in all honesty, we saw plenty of OGL games in that era).
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: BenSkywalker

Everything. What you are doing is akin to blaming 3Dfx because the original DooM didn't utilize PS 4.0 when played on a Voodoo1. AB effects can be handled by any developer that wants to use them- they haven't shown any interest. AB is entirely software based- 3dfx used a horrific partial brute force method to hack it on to games that weren't trying to use it.
I'm seeing a response, but it doesn't appear to be related to what you quoted.

Again I'll ask: what does the developer have to do with AA? If I enable 4xAA in the driver control panel and run GLQuake, what does the developer have to do with it?

It is a SOFTWARE issue. Every bit of nV's and ATi's boards are capable of doing it if requested- to break developers code and do it another way you would need to design the hardware around rendering everything wrong.
Breaking what developer code? What are you talking about? When I force 4xAA from the driver in GLQuake, what developer code am I breaking?

What developer code is Quad SLI breaking when I force 32xAA which enables 4xRGSS?

What the hell are you talking about?

You're the one telling us how easy it is to implement, yet when questioned as to why only Quad SLI supports 4xRGSS, you start screaming about breaking developer code, or some such nonsense.

Not approaching true, not even in the league of it. RG is very, very clearly inferior for everything except edge aliasing on near vertical or horizontal edges. In every other way, OGSS is superior.
Such edges are exactly where the eye most easily spots aliasing; that is a proven scientific fact. So what you casually brush away as a non-issue is actually the whole point of RG. That, and fewer RG samples are required to attain the same IQ as OG.

As for the rest of your claim ("in every other way, OGSS is superior"), please post evidence from a credible source that backs it. I can post a ton of information that disproves your claims.

Also those clouts at nVidia and ATi must be really dumb, given RGSS is the very mode they implemented on their multi-GPU setups.

RG is vastly superior at introducing noise. Yes, it reduces ALL patterns far more effectively than OGSS- including the ones you WANT people to see. All textures are hosed- there is nothing you can do at the 0 mip level- RG IQ is always vastly and clearly inferior to OGSS.
Again, you must be joking. Please post any credible information that backs your claims.

I mean wow, by that same reasoning, 16xOGSS must be worse than 4xOGSS, since 16xOGSS contains a 4xRGSS pattern in it, so therefore it's removing more noise you want to keep. :roll:

I mean seriously, what a load of utter nonsense.

AA is like drawing a triangle- it isn't hard and it is done through brute force. 3dfx was applying a filter that devs could ask for without the devs asking for- they were rendering the game wrong to get the desired effect.
Again, what does the dev have to do with the user forcing AA? Once the negative LOD bias was implemented, they had the best AA in consumer space. You can clearly see the positive comments from those who used it, moved to other solutions, and found what they got was inferior. Additionally, numerous articles, whitepapers and screenshots demonstrate without a doubt that 4xRGSS is superior to 4xOGSS.

Uhm, no. The premise of stochastic is the sample pattern varies every pixel. What ATi does is a weak attempt at simulating stochastic.
So you'll admit that ATI have made a better effort toward stochastic than anyone else in consumer space?

but taking points off of a part because it was only exposed under the far more robust API isn't quite fair
What points am I taking off? The GeForce was in my list, for heaven's sake.

You're the one taking issue with the GF3 being on my list, despite the fact that it had a far bigger impact on shader proliferation than the GF ever did. The GF3 established shaders as a baseline standard and began the death of fixed-function programming.

(and in all honesty, we saw plenty of OGL games in that time era).
So list them. List all of the OpenGL games that supported the GF's programming functions and were rendered differently from other DX7 hardware as a result.
 
