
BFG10K

Lifer
Aug 14, 2000
Better than ATi's or 3dfx's, sure, but that certainly didn't take much. Matrox and nVidia both hammered them in IQ on numerous fronts.
That's quite interesting considering ATi already had 16xAF for free on their boards while the NV1x line was limited to just 2x and Matrox didn't even have AF. The Radeon's IQ in that respect was in a totally different league compared to its competitors.
 

jazzboy

Senior member
May 2, 2005
Originally posted by: Elcs
I remember a couple of years ago, there was speculation of a Kyro 3 but it faded away.

That's a shame. I think they could easily come back with a good card. If I remember right, don't they still make workstation video cards?
 

nRollo

Banned
Jan 11, 2002
Originally posted by: Fox5
Better than ATi's or 3dfx's, sure, but that certainly didn't take much. Matrox and nVidia both hammered them in IQ on numerous fronts.

Weird; in that time period I found nVidia to have the worst image quality, ATi just ahead of them, 3dfx ahead of ATi, and Matrox ahead of them all. nVidia didn't have decent image quality until the FX series, imo.
You're just forgetting that nVidia image quality was a lot more manufacturer-dependent in those days, because nVidia gave OEMs more flexibility on important components like RF filters (e.g. Leadtek might have the best IQ on the market while PNY might be fuzzy).

PowerVRs always ran in 32-bit color internally, so they should have had at least good 16-bit quality compared to nVidia and ATi. (Matrox and 3dfx, I believe, also always rendered in 32-bit internally and just downsampled to 16-bit.)
I believe "cracking" between tiles was the problem with PowerVR, not color depth. It was an interesting rendering method; it would have been nice to see what they would have done next.
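
To illustrate the downsampling being described, here is a minimal sketch (not from the thread; the function names are hypothetical) of packing an 8-bits-per-channel color into 16-bit RGB565 and expanding it back:

# Hypothetical sketch: pack a 32-bit (8 bits/channel) color down to
# 16-bit RGB565, discarding the low 3/2/3 bits of each channel - the
# kind of truncation a 16-bit framebuffer imposes on output.
def rgb888_to_rgb565(r, g, b):
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def rgb565_to_rgb888(p):
    # expand back by shifting; the discarded low bits are gone for good
    return (((p >> 11) & 0x1F) << 3, ((p >> 5) & 0x3F) << 2, (p & 0x1F) << 3)

# (200, 100, 50) round-trips to (200, 100, 48): the blue channel lost its
# low bits, which is where banding in smooth gradients comes from.
print(rgb565_to_rgb888(rgb888_to_rgb565(200, 100, 50)))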

 

PrayForDeath

Diamond Member
Apr 12, 2004
I used to have a Kyro II card and played many games on it, including Serious Sam and GTA III. As far as I can remember, it performed similarly to the GF2 Pro I had, but it had some bugs with newer games like Generals. It was a good card, though.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: BFG10K
Better than ATi's or 3dfx's, sure, but that certainly didn't take much. Matrox and nVidia both hammered them in IQ on numerous fronts.
That's quite interesting considering ATi already had 16xAF for free on their boards while the NV1x line was limited to just 2x and Matrox didn't even have AF. The Radeon's IQ in that respect was in a totally different league compared to its competitors.


*Shudders* The ATI cards were so pathetically slow back then, you have to wonder how in the hell anyone could enable *any* anisotropic filtering on them. That is what is funny about the post... Yeah, they had the feature, but good luck ever getting any use out of it when your hardware could barely run the latest games. ATI majorly sucked back then; until the Radeon series, they were the joke of the market.
 

BenSkywalker

Diamond Member
Oct 9, 1999
That's quite interesting considering ATi already had 16xAF for free on their boards while the NV1x line was limited to just 2x and Matrox didn't even have AF.

You could set the LOD bias on nV parts to -3 and get the same effect (actually better in some regards). Did you ever try AF out on an R100 part? Massive aliasing on a scale far worse than anything we have seen since, and extremely disjointed: some directly adjacent textures would go from extremely sharp to extremely blurry.

How can it be the bare minimum if that is currently the top-of-the-line color scheme?

INT10, FP16, FP24, FP32 = 40-bit, 64-bit, 96-bit, and 128-bit color.

And when most games at the time only used 8-bit and 16-bit textures...

Blending operations require considerably higher accuracy than the source art. Currently everyone is either FP24 or FP32, which is 96-bit or 128-bit color, despite the fact that source art is still 32-bit. You can tell the difference over FP16 (64-bit), as the lack of accuracy hurts blending operations considerably (too much data is rounded off instead of being factored in, to simplify).
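
To make the rounding argument concrete, here is a minimal sketch (illustrative only; the per-pass contribution is a made-up number) of the same blend repeated at two framebuffer precisions: the tiny contribution accumulates at 16 bits per channel but is rounded away entirely at 8 bits.

# Hypothetical illustration: accumulate 64 dim blending passes (think
# layered smoke or light maps), quantizing the framebuffer to a given
# fixed-point precision after every pass, as real hardware would.
def blend_passes(scale, passes=64, src=0.001):
    acc = 0.0
    for _ in range(passes):
        acc = round((acc + src) * scale) / scale  # store into the buffer
    return acc

print(blend_passes(255))    # 8 bits/channel: 0.0 - every pass rounds to nothing
print(blend_passes(65535))  # 16 bits/channel: ~0.064 - the passes accumulate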

LCDs generally don't have color banding in games

For the legally blind perhaps not, but they actually do have horrific color banding across the color spectrum. Even if 18-bit color were enough to represent every visible color in the spectrum (which it isn't close to), you still have such low levels of contrast that banding is extremely pervasive.

Perhaps I'm deluded by the fact that I can remember when I (well, OK, my parents) upgraded to a video card with 16 colors, and everything seems good compared to ye olden days of CGA graphics.

I recall being blown away seeing the first color computer/display setup that came into town many years ago (likely prior to a good portion of our members being born) on an Apple ][, prior to the launch of the Mac. Still, in the timeframe we are talking about, source art was already moving to 32-bit, not to mention that lightmap blends over 16-bit art were already going to show artifacts.
 

ArchAngel777

Diamond Member
Dec 24, 2000
I recall being blown away seeing the first color computer/display setup that came into town many years ago (likely prior to a good portion of our members being born) on an Apple ][, prior to the launch of the Mac. Still, in the timeframe we are talking about, source art was already moving to 32-bit, not to mention that lightmap blends over 16-bit art were already going to show artifacts.

I do not recall a big difference from 16-bit to 32-bit in games... I do remember that smoke looked better, and a few other things, but generally I didn't notice the difference. I probably would notice the difference more today than I would have 6 years ago.

I believe people are willing to put up with less beauty for performance until it becomes the standard. Say, for instance, the standard moves to 4xAA/16xAF at all times; then people would adopt it and couldn't live without it. But until you actually make it a pretty set-in-stone setting, people will probably live without it.

I would like to see 4xAA/16xAF in the future be a pretty much mandatory option, just like 32-bit color is now over 16-bit... You know? Hardware will have to improve a bit until we get to that point, but we are close.

It is amazing to see how good something looks on my 30" 1280x768 LCD monitor with 4xAA/16xAF. It looks sharp and clear and produces a beautiful picture.
 


Zap

Elite Member
Oct 13, 1999
A buddy and I had about three Kyro II cards that we kinda traded back and forth. One was a Hercules 64MB, the second was an Evil 64MB, and the third was some generic 32MB. They worked okay. We got them after they had come down in price, so they saw use in secondary machines. Earlier this year I "rediscovered" the Evil card in a machine built for someone's daughter. The fan was clogged with dirt and had seized. Even after resurrecting the fan with compressed air and some oil, the card still gave visual glitches at the Windows desktop. A preventable death always saddens me. :(
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: xtknight
I'm not sure if you guys were speaking of 16-bit color, 16-bit textures, or both? I was also unable to get Doom 3 to run in 16-bit mode. Here are pics of rocket smoke:

16-bit: http://pics.bbzzdd.com/users/xtknight/q3-r16bit.jpg
32-bit: http://pics.bbzzdd.com/users/xtknight/q3-r32bit.jpg

You're right, they do look a lot worse, especially the grayscale of the rocket smoke.

Edit: fixed links.


Cool, thanks for the added screenshots. We were talking about 16-bit color versus 32-bit color. The 16-bit versus 32-bit texture difference generally has to do with clarity, not color.
 

Fox5

Diamond Member
Jan 31, 2005
Originally posted by: xtknight
Quake 3 16-bit color/16-bit textures: http://pics.bbzzdd.com/users/xtknight/q3-16bit.jpg
Quake 3 32-bit color/32-bit textures: http://pics.bbzzdd.com/users/xtknight/q3-32bit.jpg

Look the same to me.

Anyhow, back in the day I'd take 16-bit post-filtered over 32-bit any day.
Going from 2x supersampled AA to 2x multisampled AA is a bigger image quality difference than going from 16-bit post-filtered to 32-bit was, yet I'd still take MSAA due to the much faster performance.
Current video cards perform about the same in 16-bit or 32-bit though, so there is no longer any reason to use 16-bit. (I believe the GTA games had an option for it, and there's no image quality difference.) I'd assume most games that truly benefit from 32-bit require it to always be on. In screenshot comparisons of the time, I remember there being a huge difference from ATI and nVidia 16-bit to 32-bit, but PowerVR, Matrox, and 3dfx 32-bit barely improved over their 16-bit. (I believe 3dfx supposedly had the worst 32-bit quality, though.)
Oh, and most console games are 16-bit. Transparencies and post-filter effects generally suffer, but that's about all I notice.
 

SonicIce

Diamond Member
Apr 12, 2004
128-bit color? That's 3.4 * 10^38 colors? That must include like HDR values and stuff :Q 128-bit color is 2^128 = 340,282,366,920,938,463,463,374,607,431,768,211,456 values :p Nuts, do we really need 3 undecillion colors? For what real-world application?
 

Matthias99

Diamond Member
Oct 7, 2003
Originally posted by: SonicIce
128-bit color? That's 3.4 * 10^38 colors? That must include like HDR values and stuff :Q 128-bit color is 2^128 = 340,282,366,920,938,463,463,374,607,431,768,211,456 values :p Nuts, do we really need 3 undecillion colors? For what real-world application?

As noted, this is needed for blending operations (you can't blend 32-bit colors in 32-bit buffers and get accurate results) and high-precision shader math. But the actual output from the video card is (usually) still 24-bit color (8 bits each for red, green, and blue).
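
As a minimal sketch of that split (illustrative values only, assuming NumPy is available): the shader works in FP32 per channel, but the value that reaches the monitor is quantized back to 8 bits per channel.

# Hypothetical example: shader math happens in FP32 per channel
# (RGBA = 4 x 32 bits = 128-bit color; only RGB shown here), but
# scan-out is still 8 bits per channel.
import numpy as np

hdr = np.array([0.01337, 0.5001, 1.73205], dtype=np.float32)  # lighting result, may exceed 1.0
ldr = np.clip(hdr, 0.0, 1.0)                # clamp/tone-map to the displayable range
out = np.round(ldr * 255).astype(np.uint8)  # quantize to 24-bit output
print(out)                                  # [  3 128 255]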

Originally posted by: BenSkywalker
Still, in the timeframe we are talking about source art was already moving to 32bit not to mention that lightmap blends over 16bit art was already going to show artifacts.

I didn't say it didn't make a difference, just that it wasn't *nearly* as bad as you implied in your earlier post. Some color banding (often barely noticeable if the game's textures were done well) is a minor IQ issue; "complete ass" (to me) is something like not drawing textures, rendering in 8-bit color, that sort of thing. :p
 

BenSkywalker

Diamond Member
Oct 9, 1999
I didn't say it didn't make a difference, just that it wasn't *nearly* as bad as you implied in your earlier post.

I still think of it as utterly hideous; I have a hard time seeing how you wouldn't.

Some color banding (often barely noticeable if the game's textures were done well) is a minor IQ issue

I know you know better than to look at JPEGs to judge IQ (I don't feel like explaining it out to the younger set), but even looking at those, take a quick glance at the HUD: that looks p!ss poor.

"complete ass" (to me) is something like not drawing textures, rendering in 8-bit color, that sort of thing.

I consider that 'non-functional'. I'm sure you have seen me go off on the Sacrifice bug more than once; rendering glitches like those you mentioned above are what is wrong with the game on R3x0+ hardware.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: BenSkywalker
I didn't say it didn't make a difference, just that it wasn't *nearly* as bad as you implied in your earlier post.

I still think of it as utterly hideous; I have a hard time seeing how you wouldn't.

Some color banding (often barely noticeable if the game's textures were done well) is a minor IQ issue

I know you know better than to look at JPEGs to judge IQ (I don't feel like explaining it out to the younger set), but even looking at those, take a quick glance at the HUD: that looks p!ss poor.

"complete ass" (to me) is something like not drawing textures, rendering in 8-bit color, that sort of thing.

I consider that 'non-functional'. I'm sure you have seen me go off on the Sacrifice bug more than once; rendering glitches like those you mentioned above are what is wrong with the game on R3x0+ hardware.


Damn, you must be very difficult to live with... Seriously, are you married? Not to go off topic, but you are perhaps the most anal person I have seen post on www.anandtech.com about IQ. Nothing wrong with being picky because it really does not affect me in even the slightest way, but I can't help but think that it must be a miserable life for you when hardly anything can live up to your expectations. :-/
 

Matthias99

Diamond Member
Oct 7, 2003
Originally posted by: BenSkywalker
I didn't say it didn't make a difference, just that it wasn't *nearly* as bad as you implied in your earlier post.

I still think of it as utterly hideous; I have a hard time seeing how you wouldn't.

Um... it doesn't look that bad? The differences between those screenshots are FAR from anything I would consider "utterly hideous".

Some color banding (often barely noticeable if the game's textures were done well) is a minor IQ issue

I know you know better than to look at JPEGs to judge IQ (I don't feel like explaining it out to the younger set), but even looking at those, take a quick glance at the HUD: that looks p!ss poor.

Well, yes, it would be nice to have uncompressed images, but full-res JPEGs at minimum compression would have little effect on the quality of a screen capture.

Yes, there is some color banding in the HUD (and in the shots where a rocket is being fired, the smoke is clearly off). Again, I didn't say it had no effect or couldn't be noticed, just that it's pretty tame as far as this sort of thing goes.

"complete ass" (to me) is something like not drawing textures, rendering in 8-bit color, that sort of thing.

I consider that 'non-functional'.

I would also consider it broken (as well as looking like crap). However, 16-bit color is neither broken, nor does it introduce horrible rendering errors.

You're certainly entitled to whatever opinion you want about IQ and how things look. But when the conversation goes (paraphrasing):

OP: "Well, I thought the Kyro2 looked better than the GeForce2".
You: "What are you, f***ing blind? It used 16-bit color!!!"

Don't be surprised if people don't take your opinions on this sort of thing too seriously. :beer:
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Damn, you must be very difficult to live with... Seriously, are you married?

Yes, extremely happily, as a matter of fact. I have five children (9 1/2 years for the oldest down to 3 months for my youngest). Just ran it by my wife, and she says I am extremely anal about some things, but overall I'm not. One thing is, I don't get stressed about much.

Now if you want to talk to people who would have a bigger issue with the way I like things done, you should talk to the people that work for me :D They all know I don't tolerate mistakes; I expect them to be perfect, and I make sure that they are well compensated for performing as such. A mistake for my team means lost money, sometimes a lot of lost money, so it isn't without merit. I usually will go through a half dozen or so qualified, experienced people until I find one that has the potential to be good enough (then it takes years of training to get them where they need to be). Of course, that said, because of the way I do things I am extremely well compensated for my job and am given significant flexibility not afforded to most people that work in my company. I have found that being extremely anal in my job has served me extremely well.

but I can't help but think that it must be a miserable life for you when hardly anything can live up to your expectations.

If I were to settle then perhaps that would be the case, but I don't. If I spend enough time looking for something or working with something, then I pretty much always find that I will be quite pleased with the end results. Sure, when my family was looking for a house, all of the existing ones were poorly designed and inadequate, but that just meant that we came up with our own plans and hired someone to build it exactly as we wanted it. Can't find an automobile I like? Then I get the closest thing I can and modify it until it is how I like it. Can't find any decent furniture? Have some custom built. And of course, can't find a pre-built PC worth a d@mn, so I build my own. You may think of things such as this as being anal, which I can understand. But in the end, who is going to be the one living a miserable life? The person who is always willing to settle for 'good enough', or the person who is willing to spend the extra time and effort to make sure they have exactly what they or their family wants? Just think about that.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Well, yes, it would be nice to have uncompressed images, but full-res JPEGs at minimum compression would have little effect on the quality of a screen capture.

There is enormous loss; JPEG compression truncates color data, which is exactly what we are discussing here.
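
For anyone who wants to put a number on that loss, here is a minimal sketch (the filenames are hypothetical; assumes Pillow and NumPy) that measures how much a JPEG round-trip perturbs a lossless screenshot:

# Hypothetical sketch: compare a lossless screenshot against a JPEG
# round-trip of the same image and measure the per-channel color error.
from PIL import Image
import numpy as np

png = np.asarray(Image.open("screenshot.png").convert("RGB"), dtype=np.int16)
Image.fromarray(png.astype(np.uint8)).save("screenshot.jpg", quality=95)
jpg = np.asarray(Image.open("screenshot.jpg").convert("RGB"), dtype=np.int16)

err = np.abs(png - jpg)
print("max per-channel error:", err.max())    # typically largest on sharp edges like a HUD
print("mean per-channel error:", err.mean())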

You're certainly entitled to whatever opinion you want about IQ and how things look. But when the conversation goes (paraphrasing):

OP: "Well, I thought the Kyro2 looked better than the GeForce2".
You: "What are you, f***ing blind? It used 16-bit color!!!"

The Kyro2 didn't use 16-bit color, but beyond that there was a lengthy list of rendering errors with the Kyro2. I wrote a full review of the Kyro2 back when I was running GB, and the people from PVR didn't seem to take issue with any of it (in fact, they contacted me about one glitch I pointed out, and we tried a few different things to get it to tolerable levels). I will gladly articulate exactly what is wrong with any of the parts; in fact, I listed off numerous factors in this thread, yet the singular one people are taking large issue with is 16-bit color. You may think that the posters on this forum will ignore what I say, and that is fine, but when I used to author articles on IQ concerns for a site, I would regularly get conversations going with IHVs and engine developers, and they, apparently unlike people here, would listen.
 

Matthias99

Diamond Member
Oct 7, 2003
Originally posted by: BenSkywalker
Well, yes, it would be nice to have uncompressed images, but full-res JPEGs at minimum compression would have little effect on the quality of a screen capture.

There is enormous loss; JPEG compression truncates color data, which is exactly what we are discussing here.

You generally don't get what I would call "enormous" loss at the better quality settings, but yes, it's hardly ideal for this sort of thing. I will see if I can take some uncompressed screenshots from a game to compare.

You're certainly entitled to whatever opinion you want about IQ and how things look. But when the conversation goes (paraphrasing):

OP: "Well, I thought the Kyro2 looked better than the GeForce2".
You: "What are you, f***ing blind? It used 16-bit color!!!"

The Kyro2 didn't use 16-bit color, but beyond that there was a lengthy list of rendering errors with the Kyro2. I wrote a full review of the Kyro2 back when I was running GB, and the people from PVR didn't seem to take issue with any of it (in fact, they contacted me about one glitch I pointed out, and we tried a few different things to get it to tolerable levels). I will gladly articulate exactly what is wrong with any of the parts; in fact, I listed off numerous factors in this thread, yet the singular one people are taking large issue with is 16-bit color.

Sorry, I thought you were referring to the Kyro2 in particular when you said:

16-bit color was complete @ss. As I stated before, I am not legally blind, so I can't comment on how things look for you, but as a person with perfect vision, the 'best' 16-bit output looked like vomit. 32-bit color is a bare minimum for anything in the time era we are discussing.

You had branched off into talking about 16-bit color in general and 3DFX cards at that point.

Look. I didn't own a Kyro2. Maybe it had zillions of problems that did, in fact, make it look like crap all the time. Maybe the OP is, in fact, legally blind. I did, however, have lots of experience running games in both 16- and 32-bit color (and 16- and 32-bit texture detail), and the differences, while noticeable, are hardly at the level you seem to be describing.

You may think that the posters on this forum will ignore what I say, and that is fine, but when I used to author articles on IQ concerns for a site, I would regularly get conversations going with IHVs and engine developers, and they, apparently unlike people here, would listen.

You just seem to blow any sort of IQ issue WAY out of proportion compared with the general population. And the tone that comes across in your posts sometimes is that you are personally offended by other people not agreeing on the severity of such problems (i.e., calling the OP in this thread 'legally blind'). At points it makes it difficult to take your criticisms seriously, since it seems like anything short of absolute perfection is derided as unacceptable, and anyone who disagrees with you is tossed aside as an uninformed idiot (which, of course, they may well be most of the time around here :p). Not saying this is the actual attitude you have, just that it comes across this way at times.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: Matthias99
Originally posted by: BenSkywalker
Well, yes, it would be nice to have uncompressed images, but full-res JPEGs at minimum compression would have little effect on the quality of a screen capture.

There is enormous loss; JPEG compression truncates color data, which is exactly what we are discussing here.

You generally don't get what I would call "enormous" loss at the better quality settings, but yes, it's hardly ideal for this sort of thing. I will see if I can take some uncompressed screenshots from a game to compare.

You're certainly entitled to whatever opinion you want about IQ and how things look. But when the conversation goes (paraphrasing):

OP: "Well, I thought the Kyro2 looked better than the GeForce2".
You: "What are you, f***ing blind? It used 16-bit color!!!"

The Kyro2 didn't use 16-bit color, but beyond that there was a lengthy list of rendering errors with the Kyro2. I wrote a full review of the Kyro2 back when I was running GB, and the people from PVR didn't seem to take issue with any of it (in fact, they contacted me about one glitch I pointed out, and we tried a few different things to get it to tolerable levels). I will gladly articulate exactly what is wrong with any of the parts; in fact, I listed off numerous factors in this thread, yet the singular one people are taking large issue with is 16-bit color.

Sorry, I thought you were referring to the Kyro2 in particular when you said:

16-bit color was complete @ss. As I stated before, I am not legally blind, so I can't comment on how things look for you, but as a person with perfect vision, the 'best' 16-bit output looked like vomit. 32-bit color is a bare minimum for anything in the time era we are discussing.

You had branched off into talking about 16-bit color in general and 3DFX cards at that point.

Look. I didn't own a Kyro2. Maybe it had zillions of problems that did, in fact, make it look like crap all the time. Maybe the OP is, in fact, legally blind. I did, however, have lots of experience running games in both 16- and 32-bit color (and 16- and 32-bit texture detail), and the differences, while noticeable, are hardly at the level you seem to be describing.

You may think that the posters on this forum will ignore what I say, and that is fine, but when I used to author articles on IQ concerns for a site, I would regularly get conversations going with IHVs and engine developers, and they, apparently unlike people here, would listen.

You just seem to blow any sort of IQ issue WAY out of proportion compared with the general population. And the tone that comes across in your posts sometimes is that you are personally offended by other people not agreeing on the severity of such problems (i.e., calling the OP in this thread 'legally blind'). At points it makes it difficult to take your criticisms seriously, since it seems like anything short of absolute perfection is derided as unacceptable, and anyone who disagrees with you is tossed aside as an uninformed idiot (which, of course, they may well be most of the time around here :p). Not saying this is the actual attitude you have, just that it comes across this way at times.

When you are referring to the OP, are you referring to me or that other dude way up there? Though, to call anyone who posts on a forum blind is a bit extreme, because the definition of "legally blind" is 20/200 in your best eye, corrected. No one could post on the forum with that :D
 

Fox5

Diamond Member
Jan 31, 2005
Originally posted by: ArchAngel777
Originally posted by: BenSkywalker
I didn't say it didn't make a difference, just that it wasn't *nearly* as bad as you implied in your earlier post.

I still think of it as utterly hideous; I have a hard time seeing how you wouldn't.

Some color banding (often barely noticeable if the game's textures were done well) is a minor IQ issue

I know you know better than to look at JPEGs to judge IQ (I don't feel like explaining it out to the younger set), but even looking at those, take a quick glance at the HUD: that looks p!ss poor.

"complete ass" (to me) is something like not drawing textures, rendering in 8-bit color, that sort of thing.

I consider that 'non-functional'. I'm sure you have seen me go off on the Sacrifice bug more than once; rendering glitches like those you mentioned above are what is wrong with the game on R3x0+ hardware.


Damn, you must be very difficult to live with... Seriously, are you married? Not to go off topic, but you are perhaps the most anal person I have seen post on www.anandtech.com about IQ. Nothing wrong with being picky because it really does not affect me in even the slightest way, but I can't help but think that it must be a miserable life for you when hardly anything can live up to your expectations. :-/

I bet he owns a 7800 GTX SLI system and a monitor capable of at least 1920x1080, just so he can have perfect IQ. Heck, if I had the money, I would too.

And if I were an IQ whore back in 2000, I would have owned a Voodoo 5 5500 and used 4x AA, and if I wanted acceptable framerates with that, I would have run in 16-bit color. And if I were just a framerate whore who wanted decent graphics, I'd still own the Voodoo 5 and run in 16-bit color, as I'd consider nVidia's 16-bit color completely unacceptable. (Too bad the Voodoo 5 didn't have T&L support; that made it a pretty poor buy, as within 6 months there were games that wouldn't run on it.)

There are many different parts of IQ though: not just color accuracy, framerate, jagginess, filtering, or polygon counts, but everything that goes into the image. The Dreamcast has better filtering, sharper graphics, better color accuracy, better textures, and so forth than the PS2, but I would never say the best Dreamcast games even approach the graphics of the best PS2 games. Then again, when you're using blurry, low-resolution TVs to output your graphics, polygon counts and lighting are probably much more important than filtering; the PS2's output would be completely unacceptable on any computer monitor.