"I WANT TO BELIEVE": Nvidia's texture hardware X-file and Anantech's silence


bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
I used to stand firmly against nVidia on 2D.... then I got this ELSA Erazor X2. I had the ASUS v7100MX before it, and everything from 1024x768 and up had the infamous fuzzy text. Blecch.

This ELSA is running 1280 on a regular basis and is sharp as a tack! Better than the Voodoo3-3500TV I had before it!!
So if 2D is **not** an issue between Geforce and Radeon, how big are the differences in 16-bit colour? 32-bit? FSAA? Compatibility? EMBM? T&L?
I don't know whether I should keep this VERY NICE card or still try to trade up to a Radeon...
This Geforce DDR sure complements my Cel450A nicely though... :)

I noticed a Geforce2 GTS deal floating around Hot Deals... ELSA Gladiac... wooh! I'd have gone for it if I could! (Canadians excluded as usual.)
ELSA has hot, hot, hot 2D (well, it's as good as it SHOULD be!) and anyone considering a Geforce card should not buy anything less!
 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
Oh, and regarding the HP 610 or 612... I'm 100% certain of the cost of ink. While I was in sales at Office Depot Canada our prices were approximately:
HP 600 series standard cartridge (won't work in the 610/612): $40 for 40 ml ($1.00/ml)
HP 610-specific cartridge: $50 for 20 ml ($2.50/ml... that's 2.5x more expensive, and still prone to breakdowns too.)
 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0
WetWilly
When a company puts out 1000's of cards, there is bound to be a bad one every now and then. Even if the defect rate was 0.1%, someone has to be that 1 out of 1000. While your comment on the Elsa's 2d seemed far-fetched (but possible), once you stated the 3d also was bad, well, that proves you got a lemon. Don't base your GTS results on one card that turned out to be bad. Exchange it for another. The 1000's that have an Elsa with GREAT 2d and 3d can't all be wrong.
 

audreymi

Member
Nov 5, 2000
66
0
0
Day 3: Gosh Soffer, where have you gone when we need to talk
about image quality?

Regarding 2D:
This ground has been gone over many times in this forum. Use
the search engine and do a boolean search on "Nvidia" and "2d".
Here are a couple of posts on this issue. I think RobsTV made
the grade of Senior Member on this issue alone. Rob, is it really
true that you sell TVs as well? :)

  1. post 1
  2. post 2
  3. post 3

What is the difference between an ELSA Gladiac and an ELSA Erazor X2?
 

lsd

Golden Member
Sep 26, 2000
1,184
70
91


<< lsd, I have the TA demo, and the TA full game on the way. S3TC is disabled by default on the TA demo game, and I assume on the full version. What do you mean by "say goodbye to 32-bit color in new FPS games"? I thought the demo played very smoothly. It only has that one map, so it's kind of hard to say >>


Well, I actually have the full game. Don't be fooled by how smoothly the demo plays. That map is tiny compared to the full version maps. The terrain maps are killers. If you want 32-bit color you are going to have to drop your resolution to 800x600. I know s3tc is disabled by default, and every time I start the game it reverts back to s3tc off. However, the sky problem is fixed. The weird thing is whenever I take a screen shot the sky looks weird.
 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0
audreymi,
Looks like you found me out!
Yep, my 7 posts total in those 3 links you list is what pushed me to senior.
Wait a minute... You have 7 posts in this useless thread alone...Plus another 2 in those.

Actually, I have nothing to hide, as my profile is there for all to see. I visit Anandtech at least once a day, and spend most of the time in the Hot Deals Forum. Still, I only average less than one post a day. When you are here long enough, senior will take care of itself. No need to force it, as you get no prize when you get there.

If you had read my 2d rants, they were based on a simple fact: ATI makes its own cards, and every card will be the same; 3dfx made its own cards, and every card was the same; so you can make a blanket statement about ATI and 3dfx. The same plant should yield the same results. But since nVidia does NOT make cards, and each card manufacturer is left to its own discretion as to quality, and each is made at a different plant, you will get different image quality results with different nVidia board manufacturers. So you cannot label all nVidia cards as being equal. A more correct way to complain would be to say, "my Creative, Asus, whatever card had poor 2d". That was my point that many still do not get.

This has proven to be true when it comes to 2d. Most review sites say that the 2d quality of the Visiontek/Elsa cards does NOT exhibit the blurriness or flaws at high resolutions that other nVidia cards have.

No, I do not sell TVs. I am a tech, not a salesman.
 

lsd

Golden Member
Sep 26, 2000
1,184
70
91


<< ... You have 7 posts in this useless thread alone...Plus another 2 in those. >>


LOL
 

audreymi

Member
Nov 5, 2000
66
0
0
Cheers Rob,
I think both the cards are great for the consumer wishing to
play games today. You cannot go wrong with either. Myself, I found
that the first two sites I went to for information were Tom's and
Anand's but I ended up spending another two weeks at other sites
after my salesman showed me the difference in 2D sharpness between the two cards on a Sony 200 series monitor. I chose Radeon. To be fair, the store
I went to did not have ELSA cards and again I went to the web
for information just to see if I should go to another store.
Believe it or not, I found a review that was just released the week
I was ready to plunk down my money and it compared ELSA and Radeon
in 2D and gave the nod to the Radeon cards by a bit more than a nose.
With this discussion on 2D and this thread on textures, I can only see things improving, despite the curious silence on this issue and the large amount of noise on Win2K as a gaming platform. My conclusion is that reviewers at popular internet web sites are just as human as you and I; they just have a tough time having to save face in front of their "fans". I do hope that a "gifted daughter" makes it to president of the good ole U.S.A. one day :)
 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0
You have to use your own judgement at review sites, and try to combine reviews from many sites to get a more realistic picture. For example, the review above that mentions 2d is a review to find the best budget card. They included the Radeon, but NOT the GeForce2 GTS, when at the time of the review the Radeon and GTS cards were priced about the same, at less than $150. So if you used that review only as a basis for your decision, of course the ATI card would come out ahead of the GF2 MX card. Had they included the GTS card, then they too would have most likely crowned the GTS the winner, just as most of the sites have done in comparison reviews.

If you looked at Sharky's high-end comparison, you would see that even though he mistakenly prices the Elsa GTS card at $100 more than the Radeon, the GTS card still places first, followed by 2 more GTS cards, and the Radeon places 4th, while the 3dfx V5 places last. You'll also note that this compared 6 GTS cards, 1 ATI card, and 1 3dfx card, which also adds credibility to the fact that all GTS cards are not created equal. This comparison was not just a performance test, but also compared things that the ATI card excels at, like DVD and image quality. Still, it's a 4th place finish when ALL is compared. Sharky's High End 3D Video Card Shootout
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
WetWilly:

agreed on all counts. I'm pretty impressed with my buddy's Radeon 64.

lsd:

The S3TC issue is gone with Team Arena.... I know s3tc is disabled by default, and every time I start the game it reverts back to s3tc off. However, the sky problem is fixed

uh...no. Read what you wrote. It is "fixed" when it's off. Turn it on, and the GeForces look crappy once again, tho Ben was talking about a driver registry hack to use DXT3 instead of DXT1.

bluemax:

I got this ELSA Erazor X2. I had the ASUS v7100MX before .....This ELSA is running 1280 on a regular basis and is sharp as a tack! ....better than the 3500...So if 2D is **not** an issue between Geforce and Radeon,

uh, it's still an issue. The Radeon and 5500 both have far superior 2d to the 3500. The MX's 2d is a joke, of course. You should probably hold off on those statements until you see your actual competition.

RobsTV:

You have to use your own judgement at review sites, and try to combine reviews from many sites to get a more realistic picture.

I used to agree with you, but I absolutely do not believe ANY video card reviews anymore. I don't even read 'em anymore. I test them myself. That's the only true way.

and I sure as HELL don't trust Sharky <rolls eyes>
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Robo, a question for you: I don't have a 5500, but have a lot of experience with Voodoo 3s (I have two currently). How does UT in Glide look on a V5 compared to a V3? Not speed, just visuals?
 

lsd

Golden Member
Sep 26, 2000
1,184
70
91


<< uh...no. Read what you wrote. It is "fixed" when it's off. Turn it on, and the GeForces look crappy once again, tho Ben was talking about a driver registry >>



Do you own TA?
Like I said, it is FIXED, but s3tc is off by default. You must enable it.
-------
I'm talking about the full version, not the demo.
Oh yeah, I never wrote "it was fixed when it was off". I wrote that it's fixed, but s3tc is off by default. :confused:
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
oldfart, the V3 has a very washed out appearance compared to the 5500 in just about everything. In Glide it's not as noticeable, but it's still quite obviously there. There is a nice increase in visual quality.

lsd, I own TA.

I've heard tons of complaints from GTS owners that they still haven't fixed the TC issue, so I'm not sure what you mean.
 

lsd

Golden Member
Sep 26, 2000
1,184
70
91
edit--
Damn it Robo.. can I be right for once? :(
Apparently the sky isn't fixed. I checked again... it turns out that disabling or enabling compression while on a map has no effect. You have to get out of the map and go to the main menu and disable or enable it. Shucks.
BTW: how does 1280x1024 with 32bit color shq play on your V5?
 

2dfx

Member
Sep 3, 2000
36
0
0
With Q3 1.27, the lightmaps and the sky have an option not to be compressed when DXTC is on, which fixes most of the problems and is noticeably slower.

I think the variable for this is:
r_ext_compress_textures 1

For the compression of all textures, which looks shocking, I think the variable is:
r_ext_compressed_textures 1
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
I think the variable r_ext_compress_textures is no longer used. It has been replaced by r_ext_compressed_textures. Delete or rename your q3config.cfg, then start up Q3 to generate a fresh new one. You won't have r_ext_compress_textures.
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
RobsTV,

While your comment on the Elsa's 2d seemed far-fetched (but possible)

Brother, where do I start on this one?

Far-fetched? Let's see. Had the Elsa installed, had already optimized the text to the best I could get it. Switched to VGA driver, powered off, popped out Elsa, popped in Radeon, booted, installed drivers, rebooted. Noticed IMMEDIATELY that Radeon's 2D was definitely sharper. Heck, forget about Windows - the BIOS boot screens were noticeably sharper. What's so far-fetched or impossible about that? The process took less than 10 minutes, and I'm pretty sure I could remember what the Elsa's quality looked like when the Radeon booted. Not only that, the higher the resolution and/or refresh rate, the greater the disparity in quality. Sound familiar? If it doesn't, go check that guy who posted the filter mod for the GeForce - he says the same thing. I still stand by my statement - on identical hardware with a warmed up monitor, the Radeon was sharper - and I'm far from alone on that one. I don't know how you can knock my experience when I haven't seen a Radeon on your hardware lists in your many posts. You can say nVidia has great 2D in a vacuum, but I've had the Radeon, a V5500, an Elsa, and an old Matrox in this PC. My ranking for 2D, which is pretty consistent with most people WHO HAVE ACTUALLY TRIED ALL FOUR MANUFACTURERS' CARDS IN THEIR SYSTEMS with decent monitors, is 1) Matrox, 2) Radeon, 3) V5500 (although Radeon and V5500 are probably close enough for a tie), and 4) nVidia. And let's not forget that even among nVidia enthusiasts there's a consensus that 2D quality varies greatly, and it's common knowledge that Elsa/Visiontek are better than the rest of the nVidia cards.

once you stated the 3d also was bad, well that proves you got a lemon.

I never said the 3d was bad. If you read my previous two posts, I only talked about 2D in the first, and color saturation in the second. Since you somehow think my results are a fluke, how about an nVidia fansite that came to the same conclusion? Here's a quote from Riva3D's Review of the Radeon:

Before I get into benchmarks and raw numbers I have another area that needs to be addressed: visual quality. Throughout my tests I noticed that the Radeon constantly produced superior visuals over the other cards tested, and frankly over any card I have had the pleasure to work with. This quality manifested itself in more vibrant colors, and noticeably sharper display in both Q3 and 3D Mark. I thought I might be imagining this, but once I noticed the difference on my own I looked around and found similar comments in other Radeon reviews. For example, a review here at Maximum PC mentions this same thing, and even has a few pics for good measure. This is definitely something to take into consideration when looking over my test results.

At least somebody else who's tried something besides nVidia and 3dfx is willing to take note of the Radeon's superior visual quality.

Don't base your GTS results on one card that turned out to be bad. Exchange it for another.

I don't see how you can make the assumption that the card is bad. Because it doesn't have better 2D than ATI and Matrox? I'd expect (and many others in this forum and elsewhere would as well) that if I kept exchanging cards until I found one that's got better 2D than ATI and Matrox, I'd 1) open and exchange every Gladiac GTS Elsa made, and 2) lose exchange privileges everywhere. BTW, as I'd posted in another one of these "quality" (pardon the pun) discussions, I'd had a Creative Geforce2 MX board in my system as well. That one's 2D was worse than the Elsa. And guess what? That result (Creative=poor vs. Elsa=good 2D quality) matches the consensus as well. So I got a bad Creative and a bad Elsa? I don't think so.

The 1000's that have an Elsa with GREAT 2d and 3d can't all be wrong

I've already responded to your calling Elsa 2D "GREAT" in a vacuum. It might be "great" if you haven't used an ATI or Matrox card and/or you're not terribly picky about 2D. I AM very picky about 2D and had used Matrox cards a lot. But a lot of people aren't terribly critical of 2D. If the only 2D you see is the desktop before you launch Quake, then more power to you. I've lost count of how many systems I've worked on that I'd refuse to use if they were mine.

All that said, I still haven't ruled out the Geforce. The main reason is that ATI's drivers aren't quite there. It's my Xpert128 experience deja vu. Example? I installed the Radeon's 7041 drivers, and just cleaned them out to try the Elsa again (and yes, the Elsa's 2D is STILL not as good as the Radeon's). I was manually cleaning the ATI drivers out of the Windows directory. Know how many files were there? Something like 350 - most of them multilanguage support. Even though those drivers are unsupported, it doesn't instill confidence when the installer sucks.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Thanks Robo. I have been meaning to ask you that for a while, because I can say, owning both, the Radeon in D3D looks MUCH better than the V3 in Glide.

lsd, I got my TA retail today. I tried a bunch of the maps. Smooth as butter 1024 x 768 x 32.

Are there any TA timedemos that anyone has seen? In case you guys missed it, I did post a timedemo for Q3 ver 1.27. From the console:

timedemo 1
demo demo127
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
RobsTV:

When a company puts out 1000's of cards, there is bound to be a bad one every now and then. Even if the defect rate was 0.1%, someone has to be that 1 out of 1000.

Reeeeeaaaally? Then could there be a minuscule chance that your Voodoo3 was such a card? And that your experiences with it are nothing like what anyone else has experienced with their V3s?
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
Robo and RobsTV,

You have to use your own judgement at review sites, and try to combine reviews from many sites to get a more realistic picture.

I used to agree with you, but I absolutely do not believe ANY video card reviews anymore. I don't even read 'em anymore. I test them myself. That's the only true way.

and I sure as HELL don't trust Sharky <rolls eyes>


Man, I can always count on Robo to get there first ;). RobsTV, did you actually read all of that review? Without getting into the whole Sharky issue, the first tipoff was the scoring under "Image Quality" - ATI=10, 3dfx=8, and - wait, this takes the cake - ALL of the GTS cards get the same score of 9. That should have triggered something in somebody's head to at least find out how they rated image quality in the review. Was 2D included? Different bit-depths? So how did they rate image quality? By testing the accuracy of a 3DMark2000 reference image. Although there were brief references to the ATI's color vibrancy, the whole discussion is about how accurately each card rendered the reference frame. Uhhh, excuse me, but the last time I checked, "quality" wasn't a synonym for "accuracy." But wait, does it end there? Oh noooooooo, of course it doesn't. Go to the link and take a good look at the reference image they used. Notice how about 70% of the reference image is composed of shades of brown and green? You're going to rate image quality based on an image like that? What did they do - sit around saying "look how vibrantly the ATI renders that dirt?"

The basic problem is that if you're trying to get some info on visual quality issues, you can combine all the reviews you want and you'll still have pretty much nothing.
 

lsd

Golden Member
Sep 26, 2000
1,184
70
91


<< lsd, I got my TA retail today. I tried a bunch of the maps. Smooth as butter 1024 x 768 x 32. >>


Depends on what you call smooth ;)
On the terrain map, I think terra1(?), my framerate drops into the 30's and it's very hard to rail long distance. The general consensus on quake3world.com is that TA is a monster. How many fps are you getting in mpdemo1?
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
lsd:

edit--
Damn it Robo.. can I be right for once?
Apparently the sky isn't fixed. I checked again...


<giggles> sorry man! :)

... it turns out that disabling or enabling compression while on a map has no effect

That is one of the variables that requires you to restart the video. Ever notice in the graphics menu how you have to "accept" changes when you change texture or color depth, then everything blinks out and whatnot? That's what is happening; the video is restarting.

from the console, after you disable (or enable) TC, do this:

/vid_Restart

your screen will flash, and the change will take place

BTW: how does 1280x1024 with 32bit color shq play on your V5?

$hitty. mid-50s. :(

2dfx:

I think the variable for this is:
r_ext_compress_textures 1

For the compression of all textures, which looks shocking, i think the variable is:
r_ext_compressed_textures 1


heh..."shocking"....who was it that said it's barely noticeable? <G> yeah, right

okay, here's the scoop.

Enable Texture compression (v1.17)

/r_ext_compress_textures 1
/vid_restart

Enable Texture compression (v1.27 and Team Arena)

/r_ext_compressED_textures 1
/vid_restart

note the "-ed" in there for 1.27. Caps aren't important, but the /vid_restart is

to turn it off, set the variable to "0" instead of "1"

oldfart:

I can say, owning both, the Radeon in D3D looks MUCH better than the V3 in Glide.

no doubt. The Radeon looks fantastic in 32-bit D3D.

BFG:

RobsTV:

When a company puts out 1000's of cards, there is bound to be a bad one every now and then. Even if the defect rate was 0.1%, someone has to be that 1 out of 1000.

Reeeeeaaaally? Then could there be a minuscule chance that your Voodoo3 was such a card? And that your experiences with it are nothing like what anyone else has experienced with their V3s?


BWAAAHAHAHAHAAAAAAAA!!!!!!!!!!

<cha-ching>

:D

WetWilly:

Man, I can always count on Robo to get there first

heh...I try man, I try. <g>

So how did they rate image quality? By testing the accuracy of a 3DMark2000 reference image

no kidding. Isn't that the most pitiful thing you've ever seen? Pathetic....

lsd (Again):

On the terrain map, I think terra1(?), my framerate drops into the 30's and it's very hard to rail long distance. The general consensus on quake3world.com is that TA is a monster.

heh....looks awesome and runs smooth as silk on the 5500 in 16-bit. There is a WHOLE lot to be said for outstanding 16-bit image quality. <G>

Actually, the 5500 maintains its framerate a bit better in TA than the GTS, er...loses a bit less, I've noticed. I've done some comparisons with a friend of mine. He has a Creative Labs GTS, and with equal configs, we are about 10 fps apart in Q3, but only about 4 or 5 in TA.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
<< Uhhh, excuse me, but the last time I checked, "quality" wasn't a synonym for "accuracy." >>

Image quality is ALL about accuracy. You may have a preference for an improperly rendered scene, but an improperly rendered scene is by definition of lower image quality (when speaking of 3D graphics, accuracy is how image quality is defined).

The Radeon looks real sharp and the FSAA doesn't work for sh!t; the two go together. They make a compromise to have a sharper image: increased aliasing. Take a flight sim or a racing sim and compare the aliasing to a GeForce series of boards, neither of them having FSAA enabled. The Radeon suffers quite a bit more noticeable aliasing than the GeForce boards.

The V5 gives the best compromise in this case, being able to select your own LOD settings (nVidia and ATi should take note) so you can decide.

Colors brighter, more bleeding. This is adjustable on all of the boards: use your contrast/brightness/gamma settings. It is a compromise. Again, I have to give props to 3dfx here for offering different predefined setting groups for D3D, OpenGL and the desktop without you having to switch them yourself (using the various tweak panels) each time you switch.

Myself, I drop the gamma way down (~0.65) and raise the contrast up a decent amount and the brightness slightly for GF boards (all of them I have ever used). This gives you the color saturation that you see with the Radeon out of the box, at the cost of slight bleeding, which the ATi also has (take any screen shot and bump up the contrast; colors start to bleed along with getting brighter).
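
For the curious, here is a rough sketch of what those gamma/contrast/brightness sliders do to the 8-bit ramp. This is illustrative only, assuming the common convention where the driver builds a lookup table from output = (input/255)^(1/gamma); it is not any particular driver's actual code, and the slider values are just examples:

# Illustrative only: build an 8-bit lookup table the way driver color controls
# are commonly assumed to work. Gamma below 1.0 darkens midtones, contrast
# stretches values around mid-gray, brightness shifts everything up slightly.
def build_lut(gamma=0.65, contrast=1.2, brightness=0.05):
    lut = []
    for i in range(256):
        v = (i / 255.0) ** (1.0 / gamma)   # gamma 0.65 pulls midtones down
        v = (v - 0.5) * contrast + 0.5     # stretch contrast around the middle
        v = v + brightness                 # small brightness lift
        lut.append(max(0, min(255, round(v * 255))))
    return lut

lut = build_lut()
print(lut[64], lut[128], lut[192])  # compare against a straight 64, 128, 192 ramp

With the example values above, mid-gray (128) drops to roughly the low 90s, which is the darker, more saturated look described here.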

Image quality comes down to accuracy. Everything else can be adjusted (as long as the tools are available).

Feature support and implementation can aid significantly in improving image quality (Dot3, trilinear, anisotropic, EMBM, FSAA), but again you can very easily compare the quality of the boards by looking at a reference rasterized (software rasterizer) image (outside of FSAA, which is extremely subjective).
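
To make the reference-image idea concrete, here is a rough sketch that scores a card's screenshot against a software-rasterized reference frame by average per-pixel error. Illustrative only: this is not what 3DMark or any review site actually runs, it just uses the Python Imaging Library, and the filenames are made up:

# Illustrative only: mean per-channel error between a card's screenshot and a
# software-rasterized reference frame. Lower means a more accurate render.
# Both images must be the same resolution; the filenames are hypothetical.
from PIL import Image, ImageChops

reference = Image.open("reference_frame.png").convert("RGB")
screenshot = Image.open("card_screenshot.png").convert("RGB")

diff = ImageChops.difference(reference, screenshot)  # per-pixel absolute error
pixels = list(diff.getdata())                        # list of (r, g, b) tuples
mean_error = sum(sum(p) for p in pixels) / (3.0 * len(pixels))
print("mean per-channel error: %.2f out of 255" % mean_error)

A score like this captures accuracy only; it says nothing about FSAA quality or subjective preference, which is exactly the limitation being argued over in this thread.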
 

audreymi

Member
Nov 5, 2000
66
0
0
Day 4: Brandon Hill, please pass this thread on to your boss. In the meantime, take a look at Andrew Worobyew's article on driver releases and texture compression.

Regarding Sharky's high-end and budget reviews:
That was close, Rob: ELSA's 86 score to Radeon's 82 score.

I never saw that mammoth 33-page high-end roundup report until your post. The 4-point difference came down to the "bundle" rating of 9 (ELSA) vs 4 (ATi), and the "ease of use" rating of 9 (ELSA) vs 7 (ATI). These are not so important to me, yet together they make up a 7-point swing between the two products.
As you see in Ben's previous post, "ease of use" is not too important for some users :)

The most notable attribute of a well-designed report is that it is "transparent" and allows you to reassess the reviewer's metrics, so you can stand on the reviewer's shoulders and come up with your own conclusions and rankings. I like the idea of the summary table at the end of a report to present the reviewer's overall take on "all" the issues.

In the end, I found the "budget review" more suitable for my purposes. The high-end report was published on October 9, 2000. The budget review was published three weeks later, on October 30, 2000. If you compare the two summary tables (high end and budget), you will notice that a new "measure" of "2D quality" was added. The site obviously felt something was amiss in its ratings system and added this new metric. Why can't all sites be this transparent in their reviews? Why?

Regarding Team Arena:
id Software has a number of close alliances with both hardware vendors and with game developers looking to license their engine. They have always believed that choice is a good thing, if you look at their initiatives in OpenGL to run one code base on SGI, Win95, Linux, and Apple operating systems. The decision to leave the default for "texture compression" off is due to (I think) the significant development on Nvidia chips and their commitment to quality image rendering. A good chunk of the vocal gaming community is now running either a GeForce or GeForce2, and shipping Team Arena with texture compression "on" would just tie up id's support line over the holidays and into the new year. I think it was a good move on their part to leave it "off" but to have it optionally "turned on" for hardware which does not have this bug.

Last, my son says that it will be very interesting to see if the next generation re-write of DOOM will ship with an additional texturing pass (three altogether) as the default. John Carmack has hinted at using it. If so, all cards will need to make an additional pass, but the Radeon will be able to take advantage of its third texture unit and enjoy up to a 100% advantage in framerate for three-pass games. Are there any games today that use three passes during texturing?
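
As a quick sanity check on that "up to 100%" figure, here is a rough back-of-the-envelope sketch. It is illustrative only, assuming a card applies one texture layer per texture unit per pass and that fill rate is the only bottleneck:

# Illustrative only: passes needed to apply three texture layers per surface.
import math

def passes_needed(texture_layers, texture_units_per_pipe):
    return math.ceil(texture_layers / texture_units_per_pipe)

print("GeForce2 (2 texture units per pipe):", passes_needed(3, 2))  # 2 passes
print("Radeon (3 texture units per pipe):", passes_needed(3, 3))    # 1 pass

Going from two passes to one is where the doubling comes from; real games are rarely purely fill-rate limited, so the measured gain would be smaller.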