Originally posted by: Chizow
Great 2D desktop IQ also, even though it was only 16-bit.
Eh? VSA-100 chips fully supported 32-bit color in both 2D and 3D. Even the Voodoo 3 supported 32-bit color in 2D mode, though it was limited to 16 bits in 3D.
Originally posted by: BenSkywalker
even their version of FSAA which was easily replicated by everyone else
Easily replicated? Tell me, what other consumer video card has ever offered 4x rotated grid super-sampling? The closest is probably a pair of 7950 GX2s (Quad SLI) running 32xSLI AA, but the texture samples are so ridiculously close to each other that the impact on IQ would be minimal.
Originally posted by: Wreckage
Originally posted by: cmdrdredd
The 4870 was slower than?
The GTX280, which had been out for a while. Very much so.
When the 8800 series came out it absolutely demolished any other card on the market regardless of price point.
The 4870 was between second and third place. Hardly "influential" or even mildly impressive.
Had it passed the GTX280 by 30% or more, maybe, but it fell behind. In a 2-horse race, 2nd place is last place.
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: cmdrdredd
The 4870 was slower than?
The GTX280, which had been out for a while. Very much so.
When the 8800 series came out it absolutely demolished any other card on the market regardless of price point.
The 4870 was between second and third place. Hardly "influential" or even mildly impressive.
Had it passed the GTX280 by 30% or more, maybe, but it fell behind. In a 2-horse race, 2nd place is last place.
I disagree. The 8800GT was also slower than the 8800GTX, but because it offered almost as much performance for half the price, it made an impact on far more people than the 8800GTX, which only mattered to those who shelled out $600 when it was launched. Likewise, the 4870 was second only to NV's much more expensive GTX280, and forced NV to drop prices across their entire lineup.
Originally posted by: BenSkywalker
What about setups like the 8800GT or particularly the 9600GT in SLI? The 4850 cost ~30% more than the 8800GT when it launched and more often than not failed to match that performance advantage (it did exceed it under some circumstances). The 9600GT in SLI simply smacked the 4850 around at the same price point. What did it bring to consumers exactly? It was a very competitive single-GPU part at its price point, nothing at all more.
Originally posted by: BenSkywalker
Base graphics are designed around sub-$100 parts; go ahead and look at your games library and see for yourself. The 4850 still hasn't gotten into that bracket yet, nor will it any time soon (a possible exception being some BF sale-type event). To say the 4850 had pretty much no long-term impact on the market would range from fairly accurate to overstating what the card brought to the table.
Originally posted by: WT
Still have a Canopus Voodoo1 6MB card (n00bs had the 4MB one, I roll with 6MB, baby!!) and a Creative Voodoo2 12MB card. Creative had a firesale back in '98 IIRC, and was selling them for $50.
First true 3D card for me was the STB Riva 128, which didn't have good driver support compared to the 3dfx cards at the time.
Originally posted by: Wreckage
Originally posted by: cmdrdredd
The 4870 was slower than?
The GTX280, which had been out for a while. Very much so.
When the 8800 series came out it absolutely demolished any other card on the market regardless of price point.
The 4870 was between second and third place. Hardly "influential" or even mildly impressive.
Had it passed the GTX280 by 30% or more, maybe, but it fell behind. In a 2-horse race, 2nd place is last place.
Easily replicated? Tell me, what other consumer video card has ever offered 4x rotated grid super-sampling?
GeForce 3 (programmable shaders).
How does that make the 8800GT cheaper?
And as far as SLI/Crossfire, you'll notice I mentioned that the 4850 belongs on that list for the gaming power it brought to the masses.
I said "power to the masses" because the 4850 is a fast, cheap card that can be put into virtually any system with a high-speed PCI-E port and a 6-pin PCI-E connector.
It is simply my opinion that the ease of installation
not to mention the corresponding price cutting Nvidia was forced to do
Although neither NVIDIA nor ATI use tile-based rendering (TBR), their drivers became much more efficient after the Kyro II.
Originally posted by: SlowSpyder
I don't know how anyone could say the 48x0 series cards are revolutionary (or the GeForce 6x00 really).
Originally posted by: Wreckage
Originally posted by: SlowSpyder
I don't know how anyone could say the 48x0 series cards are revolutionary (or the GeForce 6x00 really).
The 6 series brought about SM3.0, HDR, PureVideo, and modern SLI.
That's huge.
Originally posted by: nRollo
Originally posted by: Wreckage
Originally posted by: SlowSpyder
I don't know how anyone could say the 48x0 series cards are revolutionary (or the GeForce 6x00 really).
The 6 series brought about SM3.0, HDR, PureVideo, and modern SLI.
That's huge.
Agreed. Didn't it introduce soft shadows as well?
Originally posted by: munky
Originally posted by: nRollo
Originally posted by: Wreckage
Originally posted by: SlowSpyder
I don't know how anyone could say the 48x0 series cards are revolutionary (or the GeForce 6x00 really).
The 6 series brought about SM3.0, HDR, PureVideo, and modern SLI.
That's huge.
Agreed. Didn't it introduce soft shadows as well?
It only introduced a method of rendering soft stencil shadows. Most games use shadow mapping as opposed to stencil shadows, and the devs write pixel-shader code to give the shadow edges a soft falloff.
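To make munky's point concrete: the usual pixel-shader technique for softening shadow-map edges is percentage-closer filtering (PCF). Below is a minimal CPU-side sketch of the idea in Python/numpy; the function name, the toy depth values, and the bias are my own illustrative choices, not taken from any particular game or driver.

```python
import numpy as np

def pcf_shadow(shadow_map, u, v, receiver_depth, radius=1, bias=0.002):
    """Percentage-closer filtering: instead of one binary depth test,
    average the pass/fail results of several nearby shadow-map texels,
    which turns hard shadow edges into soft gradients."""
    h, w = shadow_map.shape
    hits = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x = min(max(u + dx, 0), w - 1)
            y = min(max(v + dy, 0), h - 1)
            # 1.0 = lit (receiver is no farther from the light than the
            # nearest occluder stored in the shadow map)
            hits.append(1.0 if receiver_depth - bias <= shadow_map[y, x] else 0.0)
    return sum(hits) / len(hits)   # fractional shadow factor in [0, 1]

# Toy example: left half of the map is blocked by a nearer occluder.
sm = np.ones((8, 8), dtype=np.float32)
sm[:, :4] = 0.3                      # occluder depth
print(pcf_shadow(sm, 3, 4, 0.8))     # near the boundary -> partial shadow (~0.33)
print(pcf_shadow(sm, 6, 4, 0.8))     # fully lit -> 1.0
```

The point being that the softness comes from developer-written filtering code, not from any fixed-function "soft shadow" hardware.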
Originally posted by: BenSkywalker
Any of them could; using the accumulation buffer extensions it wasn't that hard.
Which ones? Name them. Name any consumer single-slot part that has offered 4xRGSS to a gaming end-user.
Originally posted by: BenSkywalker
RG v OG each had their pros and cons,
Yep. In particular, OG is quite easy to bludgeon into a driver and requires minimal hardware support, while RG is not quite as easy and generally requires more hardware support. That's the significance of 3dfx's implementation; the whole thing was designed to operate at the hardware level, unlike the driver-level OGSS hacks the competitors used.
Originally posted by: BenSkywalker
it isn't like either came remotely close to stochastic, which was where the real IQ was going to come from.
You mean ATi's Temporal AA? I personally think it's a gimmick.
Originally posted by: BenSkywalker
The original GeForce was also programmable; GF3 just expanded on it by quite a bit.
Right, which is why I listed both cards. The GF3 was truly programmable, unlike the earlier GeForce "DX7+" cards. That makes the GF3 a landmark card.
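A quick aside on the RG v OG point above: here's a minimal sketch of why a 4x rotated grid beats a 4x ordered grid on near-vertical and near-horizontal edges. The sample offsets are illustrative (the roughly arctan(1/2), about 26.6 degree, rotation is the commonly cited figure for the V5-style pattern, not something I've verified against the hardware).

```python
# 4x ordered-grid vs. 4x rotated-grid sample positions within one pixel
# (offsets in pixel units around the pixel center; illustrative only).
ordered = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]
rotated = [(-0.375, -0.125), (0.125, -0.375), (-0.125, 0.375), (0.375, 0.125)]

for name, pattern in (("ordered", ordered), ("rotated", rotated)):
    xs = sorted({x for x, _ in pattern})
    ys = sorted({y for _, y in pattern})
    print(f"{name}: {len(xs)} distinct x offsets, {len(ys)} distinct y offsets")

# ordered: 2 distinct x offsets, 2 distinct y offsets
# rotated: 4 distinct x offsets, 4 distinct y offsets
#
# A nearly horizontal edge sweeping through a pixel crosses 4 distinct
# y thresholds with the rotated grid but only 2 with the ordered grid,
# so RG resolves more intensity steps exactly where the eye is most
# sensitive to aliasing, at the same sample count.
```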
Originally posted by: nRollo
Agreed. Didn't it introduce soft shadows as well?
Nope, cards as old as the original Radeon could do soft shadows.
Originally posted by: darXoul
Matrox Millennium/Mystique (x)
S3 ViRGE
3dfx Voodoo (x)
3dfx Voodoo 2
Riva TNT2 Ultra (x)
GeForce 256 DDR
GeForce 4 Ti 4200 (x - but 4400)
Radeon 9700 Pro
GeForce 6800 GT (x)
GeForce 8800 GTX (x)
That's my subjective list - point of view of an enthusiast but price-aware gamer. I owned all the cards marked with (x).
Originally posted by: BenSkywalker
Any board that supported the AB extensions could do it if the software developer was interested- they weren't.
What does developer interest have to do with it? 3dfx implemented it at the driver level, and it was available in the bulk of games without developer effort or knowledge.
Originally posted by: BenSkywalker
RG requires no hardware support beyond the basic OpenGL extension support- any board that could render to texture could do it.
So again I'll ask, where are the implementations? Why not add xS modes that feature RGSS? We have OGSS from nVidia at the driver level now. If RGSS is so easy, why doesn't nVidia offer it instead of only offering OGSS? At identical sample sizes a good RGSS implementation should have the same workload as OGSS but offer much better image quality.
Originally posted by: BenSkywalker
The way nVidia and ATi were doing FSAA was the way it was done on graphics cards prior to the V5 up to the $50K workstations- it wasn't a hack by any means, it is actually a far more accurate way of doing it.
What's more accurate? MSAA? MSAA doesn't take texture or shader samples so it'll be inferior to SSAA, assuming identical sample patterns. You can't possibly be referring to OGSS. Ordered grid is the worst possible sample pattern for AA; it's wasteful, brute force, inefficient, and it's mathematically provable that a rotated grid is superior to it. It also just happens to be extremely easy to bludgeon anywhere without hardware support, and that's exactly what ATi and nVidia did back in the day.
Originally posted by: BenSkywalker
Stochastic is VASTLY superior as aliasing needs a pattern; the best way to eliminate the pattern is through random dispersion of samples when talking about edge aliasing (base texture filtering is running at 128x per pixel, edge can't get close to that with current bandwidth limitations).
The main premise of stochastic is that the sample pattern varies each frame. While it's true that a fully-fledged implementation randomly varies the pattern, the fact that ATi alternated patterns between frames means TAA was closer to stochastic than any other AA scheme ever available in consumer space. One thing is clear; you can be sure Master Yoda wasn't rendered with OGSS.
Originally posted by: BenSkywalker
Depends where you draw the line in the sand. Can you encode media on the GF3? Nope. Can you fold? No again. You can easily make the argument the first truly programmable GPU was the 8800 as it is the first one that will run C-based code. Where you draw the line of fully programmable can put you in many different spots (is it the GTX2x0 because of DP + C support?), however the first offering that was programmable, period, was the GeForce256.
The GF3 was the first to support DX8, the first version of DirectX to support pixel and vertex shaders. I consider that influential because it was the precursor to the modern-day proliferation of shader usage, unlike the original GF which had limited developer support given the DX7 spec didn't expose its functionality.
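For anyone wondering what the accumulation buffer approach mentioned above actually amounts to: the legacy OpenGL idiom is to render the scene n times with sub-pixel jitter, calling glAccum(GL_ACCUM, 1.0/n) after each pass and glAccum(GL_RETURN, 1.0) at the end, which is just an average of jittered renders. A minimal numpy sketch of that averaging over a synthetic edge (the render function and jitter offsets are stand-ins I made up, not GL calls):

```python
import numpy as np

WIDTH, HEIGHT = 32, 8

def render(jitter_x, jitter_y):
    """Stand-in for one jittered render pass: rasterize a slightly
    tilted edge (white above, black below) at one sample per pixel."""
    img = np.zeros((HEIGHT, WIDTH), dtype=np.float64)
    for y in range(HEIGHT):
        for x in range(WIDTH):
            # Edge y = 4 + 0.1*x; sample at pixel center + jitter.
            sx, sy = x + 0.5 + jitter_x, y + 0.5 + jitter_y
            img[y, x] = 1.0 if sy < 4.0 + 0.1 * sx else 0.0
    return img

# The accumulation-buffer idiom: accumulate n jittered passes scaled
# by 1/n, then return the sum. Here that is simply an average.
jitters = [(-0.375, -0.125), (0.125, -0.375), (-0.125, 0.375), (0.375, 0.125)]
accum = sum(render(jx, jy) for jx, jy in jitters) / len(jitters)

print("single pass edge column:", render(0, 0)[:, 16])   # hard 1/0 step
print("accumulated edge column:", accum[:, 16])          # fractional coverage
```

Note this is pure software as far as the pattern goes: the jitter list above happens to be a rotated grid, which is exactly why "any board that could render to texture could do it" and "nobody shipped it as a driver option" are both live points in this argument.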
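And on the stochastic/Temporal AA exchange above: TAA's premise, as stated, is alternating the sample pattern between frames so the eye integrates toward a higher effective sample count. A toy sketch of that premise; the split into two complementary 2x halves of the earlier 4x pattern is my assumption about how such a scheme could work, not ATi's documented implementation.

```python
import numpy as np

def coverage(edge_y, samples):
    """Fraction of sub-pixel samples that fall below a horizontal edge."""
    return sum(1.0 for _, sy in samples if sy < edge_y) / len(samples)

# Two complementary 2x patterns, swapped every frame (the Temporal AA
# premise); together they form the 4x rotated grid used earlier.
frame_even = [(-0.375, -0.125), (0.125, -0.375)]
frame_odd  = [(-0.125,  0.375), (0.375,  0.125)]

edge = 0.2  # edge position within the pixel, relative to its center
even = coverage(edge, frame_even)   # what frame N shows
odd  = coverage(edge, frame_odd)    # what frame N+1 shows
print(even, odd, (even + odd) / 2)  # eye-integrated result matches 4x

# True stochastic sampling goes further: the pattern is (pseudo)random
# per pixel and per frame, trading structured aliasing for noise.
rng = np.random.default_rng(0)
random_samples = [tuple(p) for p in rng.uniform(-0.5, 0.5, size=(4, 2))]
print(coverage(edge, random_samples))
```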
Originally posted by: BenSkywalker
Everything. What you are doing is akin to blaming 3Dfx because the original DooM didn't utilize PS 4.0 when played on a Voodoo1. AB effects can be handled by any developer that wants to use them- they haven't shown any interest. AB is entirely software based- 3dfx used a horrific partial brute force method to hack it on to games that weren't trying to use it.
I'm seeing a response, but it doesn't appear to be related to what you quoted.
Originally posted by: BenSkywalker
It is a SOFTWARE issue. Every bit of nV's and ATi's boards is capable of doing it if requested- to break developers' code and do it another way you would need to design the hardware around rendering everything wrong.
Breaking what developer code? What are you talking about? When I force 4xAA from the driver in GLQuake, what developer code am I breaking?
Originally posted by: BenSkywalker
Not approaching true, not even in the league of it. RG is very, very clearly inferior for everything except edge aliasing on near vertical or horizontal edges. In every other way, OGSS is superior.
Such edges are exactly where the eye most easily spots aliasing; that is a proven scientific fact. So what you casually brush away as a non-issue is actually the whole point of RG. That, and fewer RG samples are required to attain the same IQ as OG.
Originally posted by: BenSkywalker
RG is vastly superior at introducing noise. Yes, it reduces ALL patterns far more effectively than OGSS- including the ones you WANT people to see. All textures are hosed- there is nothing you can do at the 0 mip level- RG IQ is always vastly and clearly inferior to OGSS.
Again, you must be joking. Please post any credible information that backs your claims.
Originally posted by: BenSkywalker
AA is like drawing a triangle- it isn't hard and it is done through brute force. 3dfx was applying a filter that devs could ask for without the devs asking for it- they were rendering the game wrong to get the desired effect.
Again, what does the dev have to do with the user forcing AA? When the negative LOD was implemented they had the best AA in consumer space. You can clearly see the positive comments from those using it when they moved to other solutions and found what they got was inferior. Additionally, numerous articles, whitepapers and screenshots demonstrate without a doubt that 4xRGSS is superior to 4xOGSS.
Originally posted by: BenSkywalker
Uhm, no. The premise of stochastic is the sample pattern varies every pixel. What ATi does is a weak attempt at simulating stochastic.
So you'll admit that ATI have made a better effort toward stochastic than anyone else in consumer space?
Originally posted by: BenSkywalker
but taking points off of a part because it was only exposed under the far more robust API isn't quite fair
What points am I taking off? The GeForce was in my list, for heaven's sake.
Originally posted by: BenSkywalker
(and in all honesty, we saw plenty of OGL games in that time era).
So list them. List all of the OpenGL games that supported the GF's programming functions and were rendered differently to other DX7 hardware as a result.
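Since the near-horizontal-edge dispute keeps recurring, here's one more small sketch that quantifies it: count the distinct coverage levels each 4x pattern can produce as a near-horizontal edge sweeps vertically through a pixel. This is a rough proxy for how smooth the antialiased gradient can be; the sample offsets are the same illustrative patterns used earlier, not measured hardware values.

```python
# Distinct coverage levels achievable as a horizontal edge sweeps
# through one pixel, for the illustrative 4x patterns used above.
ordered = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]
rotated = [(-0.375, -0.125), (0.125, -0.375), (-0.125, 0.375), (0.375, 0.125)]

def levels(pattern, steps=1000):
    seen = set()
    for i in range(steps + 1):
        edge_y = -0.5 + i / steps          # edge position within the pixel
        covered = sum(1 for _, sy in pattern if sy < edge_y)
        seen.add(covered / len(pattern))
    return sorted(seen)

print("ordered grid:", levels(ordered))   # [0.0, 0.5, 1.0] (3 levels)
print("rotated grid:", levels(rotated))   # [0.0, 0.25, 0.5, 0.75, 1.0] (5 levels)
```

This captures exactly the trade-off being argued: on these edges RG extracts more gradient steps from the same four samples, while the texture-quality claims either way would need actual screenshots to settle.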