
ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Does anyone remember the Kyro I and Kyro II cards? They had some innovative designs, but sadly, they lacked the polish of an ATI or nVidia card. I owned an original Kyro and it had some "issues" that never really cleared up. But for the most part, the card ran pretty well.

That is what has been lacking for the past few years... Nothing majorly innovative, nothing revolutionary... We're pretty much stuck in an Intel-versus-AMD, x86-CPU kind of situation: marginal performance gains with each product, but nothing earth-shattering.

Did any of you own a Kyro II? Post opinions on it...

Also, whatever happened to the company that designed the chip?
 

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
/recalls the tile-based rendering Kyro-fanboys-versus-GeForce flamewars
/shudders at the thought
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Just sold one last week on eBay. It was a Kyro Evil. I ran it under Linux for a while, but did no serious gaming on it (I think I ran Enemy Territory).

PowerVR's best work was the Dreamcast.
 

lein

Senior member
Mar 8, 2005
620
0
0
I have a 4000XT. It's similar to a GeForce 2 in terms of performance, and I got it for about $50, I think, about 4 years ago (when the GeForce 3 was king). I'm quite happy with it, as it's still in use, and I too would like to see a third company that can actually compete with ATI and nVidia (other than integrated players like Intel). It would be pretty cool and interesting if Intel decided to release a performance gfx card =).
 

jr9k

Member
Jun 30, 2005
53
0
66
A friend bought one and swapped it for a GF2 MX400 after playing Max Payne with my (then) old Voodoo 3 2000.

The Kyro didn't support hardware T&L (IIRC) right when games started to use it.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,997
126
I heard the Kyro 3 offers full SM 3.0 support, but more than likely the card itself is just a rumour.
 

jazzboy

Senior member
May 2, 2005
232
0
0
I've still got a Kyro 2 in one of my computers (1.3 Duron, 256MB RAM) and it was reasonably competitive with my GeForce 2 GTS in older DirectX titles, but OpenGL performance was unfortunately sub-par. And obviously its main problem was the lack of T&L support.

However, I must say that the image quality was fantastic on it - my GF2 GTS didn't come close!
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Originally posted by: jazzboy
However, I must say that the image quality was fantastic on it - my GF2 GTS didn't come close!

Your Kyro2 must have been a LOT better than mine. Mine had serious issues with basic filtering (their box filtering introduced major artifacts I couldn't help but notice), and their AF was both inferior to and significantly slower than my GF2's. Overall it wasn't a bad card, and it was much cheaper than a GF2, but I never found its IQ to be great or even very good. Better than ATi's or 3dfx's, sure, but that certainly didn't take much. Matrox and nVidia both hammered them on IQ on numerous fronts.

They also had numerous issues with compatibility in some of the (at the time) newer games. I recall Giants couldn't be run with all of the highest quality settings unless you wanted Z artifacts (using the Z buffer) or flickering shadows everywhere (forcing the W buffer). Lots of glitches and workarounds like that for an enormous number of games (go digging through your registry and you'll see). The number of game-specific tweaks PVR pulled off would make anyone at ATi or nV blush, for certain.
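
(An aside on the Z-vs-W buffer workaround above: a perspective Z buffer stores depth as a function of 1/z, so precision is dense near the camera and collapses toward the far plane, which is where the Z artifacts show up; a W buffer stores depth roughly linearly instead. A minimal C sketch of the trade-off, assuming hypothetical 16-bit depth buffers and made-up near/far planes:)

#include <stdio.h>
#include <math.h>

/* Sketch: quantization error of a 16-bit Z buffer (stores a value
 * derived from 1/z) versus a 16-bit W buffer (stores z linearly).
 * The near/far planes are hypothetical values for illustration. */
int main(void) {
    const double znear = 1.0, zfar = 1000.0;
    const double levels = 65535.0;                       /* 16-bit buffer */

    for (double z = 10.0; z <= zfar; z *= 10.0) {
        /* Perspective Z: d = (1/znear - 1/z) / (1/znear - 1/zfar) */
        double d  = (1.0 / znear - 1.0 / z) / (1.0 / znear - 1.0 / zfar);
        double dq = round(d * levels) / levels;          /* quantize to 16 bits */
        double zq = 1.0 / (1.0 / znear - dq * (1.0 / znear - 1.0 / zfar));

        /* Linear W: w = (z - znear) / (zfar - znear) */
        double w  = (z - znear) / (zfar - znear);
        double wq = round(w * levels) / levels;
        double zw = znear + wq * (zfar - znear);

        printf("z = %7.1f   z-buffer error = %9.5f   w-buffer error = %9.5f\n",
               z, fabs(zq - z), fabs(zw - z));
    }
    return 0;
}

(Far from the camera the Z-buffer error dwarfs the W-buffer error, which is why forcing the W buffer fixed distant Z-fighting at the cost of other glitches.)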
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: BenSkywalker
Originally posted by: jazzboy
However, I must say that the image quality was fantastic on it - my GF2 GTS didn't come close!

Your Kyro2 must have been a LOT better than mine. Mine had serious issues with basic filtering (their box filtering introduced major artifacts I couldn't help but notice), and their AF was both inferior to and significantly slower than my GF2's. Overall it wasn't a bad card, and it was much cheaper than a GF2, but I never found its IQ to be great or even very good. Better than ATi's or 3dfx's, sure, but that certainly didn't take much. Matrox and nVidia both hammered them on IQ on numerous fronts.

They also had numerous issues with compatibility in some of the (at the time) newer games. I recall Giants couldn't be run with all of the highest quality settings unless you wanted Z artifacts (using the Z buffer) or flickering shadows everywhere (forcing the W buffer). Lots of glitches and workarounds like that for an enormous number of games (go digging through your registry and you'll see). The number of game-specific tweaks PVR pulled off would make anyone at ATi or nV blush, for certain.


Yah, same here. My Kyro had artifacts and the filtering was not too great. Much of the game was either artifacts or blurry lines... It wasn't as bad as I am making it sound, but it was obvious.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: ArchAngel777
Does anyone remember the Kyro I and Kyro II cards? They had some innovative designs, but sadly, they lacked the polish of an ATI or nVidia card. I owned an original Kyro and it had some "issues" that never really cleared up. But for the most part, the card ran pretty well.

That is what has been lacking for the past few years... Nothing majorly innovative, nothing revolutionary... We're pretty much stuck in an Intel-versus-AMD, x86-CPU kind of situation: marginal performance gains with each product, but nothing earth-shattering.

Did any of you own a Kyro II? Post opinions on it...

Also, whatever happened to the company that designed the chip?

The 9700 Pro had pretty earth-shattering performance...

Anyhow, PowerVR has made the mobile MBX chip, which is integrated into Intel's StrongARM (or whatever they're called now) CPUs in PDAs. It's also rumored that Intel will use them in their future integrated graphics.
They also made the arcade hardware Sega is using in many of its games until Xbox 360 and PS3 arcade hardware is ready, like House of the Dead 4, which looks fairly crappy.

Originally posted by: BenSkywalker
Better than ATi's or 3dfx's, sure, but that certainly didn't take much. Matrox and nVidia both hammered them on IQ on numerous fronts.

Weird, in that time period I found nVidia to have the worst image quality, ATi just ahead of them, 3dfx ahead of them, and then Matrox ahead of them. nVidia didn't have decent image quality until the FX series, imo.

PowerVR chips always rendered internally in 32-bit color, so they should have had at least good 16-bit output quality compared to nVidia and ATi. (Matrox and 3dfx, I believe, also always rendered in 32-bit color and just downsampled to 16-bit.)
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Originally posted by: Wreckage
PowerVR's best work was the Dreamcast.

I didn't own a PC in the Kyro's time, but the Dreamcast had some really nice graphics.

Originally posted by: Fox5
3dfx, I believe, also always rendered in 32-bit color and just downsampled to 16-bit.

22-bit color.
 
MercenaryForHire

Jan 31, 2002
40,819
2
0
Originally posted by: VIAN
PowerVR's best work was the Dreamcast.
I didn't own a PC in the Kyro's time, but the Dreamcast had some really nice graphics.

Find a copy of Headhunter at a pawn shop or on eBay if you want to see what the DC was capable of. You'll only have to play for about 20 minutes to hit the first boss fight, on the rooftop, in the pouring rain. :)

- M4H
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: VIAN
Originally posted by: Wreckage
PowerVR's best work was the Dreamcast.

I didn't own a PC in the Kyro's time, but the Dreamcast had some really nice graphics.

Originally posted by: Fox5
3dfx, I believe, also always rendered in 32-bit color and just downsampled to 16-bit.

22-bit color.

I don't think it was actually 22-bit color, though. I think the "22-bit" figure was just some number Creative made up, basically to say it was 16-bit done as well as 16-bit could be done, whereas most other cards were 16-bit natively, and by the time they finally output the image the quality had significantly degraded (nVidia cards were the worst offenders, but ATi was pretty bad too at the time). I guess it was like supersampling, but for color accuracy instead of resolution.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Originally posted by: Fox5
Weird, in that time period I found nVidia to have the worst image quality, ATi just ahead of them, 3dfx ahead of them, and then Matrox ahead of them.

I don't know what it is like to be legally blind, so I can't comment on your perspective exactly, but a few staggeringly huge issues: 3dfx was incapable of even trilinear filtering in any multitextured game; it looked disgustingly poor if we are being very kind. ATi had what has to be the poorest AF quality of any part ever released on the original Radeon: their filtering implementation didn't even take the Z axis into account. It had horrific aliasing and was known for massive rendering errors in a very large percentage of games.

Originally posted by: Fox5
nVidia didn't have decent image quality until the FX series, imo.

Funny, as I don't think a sane person would really dispute that the GeForce4 was leaps and bounds beyond the FX in terms of image quality. Actually, from a correct-rendering perspective the GF4 was the best consumer part ever released by any company.

Originally posted by: Fox5
PowerVR chips always rendered internally in 32-bit color, so they should have had at least good 16-bit output quality compared to nVidia and ATi.

16-bit color was complete @ss. As I stated before, I am not legally blind, so I can't comment on how things look for you, but as a person with perfect vision the 'best' 16-bit output looked like vomit. 32-bit color is a bare minimum for anything in the era we are discussing.

Originally posted by: Fox5
Matrox and 3dfx, I believe, also always rendered in 32-bit color and just downsampled to 16-bit.

3dfx wasn't even capable of outputting 32-bit color until the Voodoo4/5 hit.
 

SonicIce

Diamond Member
Apr 12, 2004
4,771
0
76
I believe there were only 65,536 possible colors even at 32-bit. The reason it was called 22-bit was because of the dithering?
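
(For reference, the raw numbers: 65,536 is the 16-bit figure; a 32-bit framebuffer spends 8 of its bits on alpha or padding, so it encodes about 16.7 million colors. A quick C sketch of the standard formats of the era, nothing Kyro-specific:)

#include <stdio.h>

/* The color counts behind the 16-bit / 22-bit / 32-bit argument.
 * These are the standard framebuffer formats of the era. */
int main(void) {
    /* 16-bit "high color": RGB565, 5+6+5 bits per pixel */
    printf("16-bit (RGB565):   %d colors\n", 1 << 16);   /* 65,536 */

    /* 32-bit "true color": 8 bits each for R/G/B plus 8 bits of
     * alpha/padding, so 24 bits actually encode color */
    printf("32-bit (XRGB8888): %d colors\n", 1 << 24);   /* 16,777,216 */

    /* 18-bit LCD panel: 6 bits per subpixel x 3 subpixels */
    printf("18-bit LCD (6bpc): %d colors\n", 1 << 18);   /* 262,144 */
    return 0;
}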
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: BenSkywalker
16-bit color was complete @ss. As I stated before, I am not legally blind, so I can't comment on how things look for you, but as a person with perfect vision the 'best' 16-bit output looked like vomit. 32-bit color is a bare minimum for anything in the era we are discussing.

How can it be the bare minimum if it's currently the top-of-the-line color depth? :confused:
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: ArchAngel777
Originally posted by: BenSkywalker
16-bit color was complete @ss. As I stated before, I am not legally blind, so I can't comment on how things look for you, but as a person with perfect vision the 'best' 16-bit output looked like vomit. 32-bit color is a bare minimum for anything in the era we are discussing.

How can it be the bare minimum if it's currently the top-of-the-line color depth? :confused:

And this is when most games at the time only used 8-bit and 16-bit textures... not to mention that most LCD monitors are only 6-bit with dithering.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
It's 6 bits per subpixel. There are three subpixels (red, green, blue) per pixel in an LCD. 6x3 = 18-bit color.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: xtknight
It's 6 bit per subpixel. There are three subpixels (red, green, blue) per pixel in an LCD. 6x3=18-bit color.

That's still nowhere near 32-bit, yet LCDs generally don't show color banding in games. I think in most cases the only reason for higher precision is to absorb rounding errors and loss of precision during rendering, which 3dfx, Matrox, and PowerVR cards were able to avoid very well.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: ArchAngel777
Originally posted by: BenSkywalker
16-bit color was complete @ss. As I stated before, I am not legally blind, so I can't comment on how things look for you, but as a person with perfect vision the 'best' 16-bit output looked like vomit. 32-bit color is a bare minimum for anything in the era we are discussing.

How can it be the bare minimum if it's currently the top-of-the-line color depth? :confused:

IIRC, Ben also considers 1600x1200 "low res" and the 6800GT a "low end" card, so you might want to keep that sort of perspective in mind when reading his posts. :p

While 16-bit color definitely shows some noticeable artifacting/banding compared with 24- or 32-bit color, it is hardly "complete ass" or "vomit". Perhaps I'm deluded by the fact that I can remember when I (well, OK, my parents) upgraded to a video card with 16 colors; everything since has seemed good compared to ye olden days of CGA graphics.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Originally posted by: SonicIce
I believe there were only 65,536 possible colors even at 32-bit. The reason it was called 22-bit was because of the dithering?

Wait a minute, you're getting confused.

The Voodoo3/4/5 accomplished "22-bit" color by rendering in 16-bit color and then applying a post filter.

32-bit color only had a real advantage over 16-bit when smoke or other translucent things were displayed. Other than that, it really didn't matter much. Although 22-bit color did produce some artifacts, it did a pretty damn good job. I used it for a while back when I used to play DoD.
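
(A rough sketch in C of that post-filter idea; this is not 3dfx's actual kernel, just the principle: dither while rendering in 16-bit, then average neighboring pixels on output so intermediate shades reappear.)

#include <stdio.h>

/* Sketch of the "22-bit" trick: render dithered 16-bit, then average
 * neighboring pixels on output so intermediate shades reappear.
 * This is NOT 3dfx's real filter kernel, only the principle. */

/* Quantize an 8-bit channel value to 5 bits (as in RGB565's red/blue)
 * using a simple 2x2 ordered dither. */
static int dither5(int v8, int x, int y) {
    static const int bayer[2][2] = { { 0, 2 }, { 3, 1 } };
    int v = v8 + bayer[y & 1][x & 1] * 2;   /* bias before truncation */
    if (v > 255) v = 255;
    return (v >> 3) << 3;                   /* keep the top 5 bits */
}

int main(void) {
    int target = 100;   /* a shade 5 bits cannot represent exactly */

    /* Plain 16-bit truncation loses the shade entirely. */
    printf("truncated:         %d\n", (target >> 3) << 3);    /* 96 */

    /* Average a 2x2 block of dithered pixels, as a scanout filter might. */
    int sum = 0;
    for (int y = 0; y < 2; y++)
        for (int x = 0; x < 2; x++)
            sum += dither5(target, x, y);
    printf("dithered+filtered: %d\n", sum / 4);               /* 100 */
    return 0;
}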
 

Elcs

Diamond Member
Apr 27, 2002
6,278
6
81
I think that the Kyro 2 in its time proved that the graphics card market was still assailable.

I remember reading, reading, and re-reading AnandTech's review of the Kyro 2, and I was so impressed I bought a Hercules 4500. I think it was clocked at 175/175; can't remember that far back. I had a few minor issues with it, but overall I was highly impressed.

I never regretted not buying a GF2. Now I have a 64MB GF2 GTS in my possession as well as my old Kyro II. I might benchmark a few games for old times' sake.

Their selective drawing method was flawed but worked well. With more industry backing, I'd have put money on that company taking off. I never encountered a bug I couldn't work around or live with.

I remember a couple of years ago there was speculation about a Kyro 3, but it faded away. Think of how much of a difference selective drawing would make running at any resolution or quality; if implemented correctly, it could reduce load considerably (see the sketch below).

In short, I remember the great Kyro II. It doesn't deserve to be forgotten.
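
(For anyone who missed the tile-based rendering debates: a minimal C sketch of the "selective drawing" idea referenced above. The tile size and scene are invented; the point is that depth is resolved per tile in on-chip memory before any texturing happens, so hidden fragments never cost fill rate or texture bandwidth.)

#include <stdio.h>

/* Sketch of why tile-based deferred rendering ("selective drawing")
 * saves work: resolve depth for a whole tile in on-chip memory first,
 * then texture/shade only the surviving fragment per pixel.
 * Tile size and scene are invented for illustration. */

#define TILE 16

int main(void) {
    /* Three opaque full-tile "triangles" at different depths. */
    float tri_depth[3] = { 0.9f, 0.5f, 0.2f };
    float zbuf[TILE][TILE];
    int   winner[TILE][TILE];

    /* Pass 1: depth-only resolve, no texturing or shading. */
    for (int y = 0; y < TILE; y++)
        for (int x = 0; x < TILE; x++) {
            zbuf[y][x]   = 1.0f;    /* far plane */
            winner[y][x] = -1;
            for (int t = 0; t < 3; t++)
                if (tri_depth[t] < zbuf[y][x]) {
                    zbuf[y][x]   = tri_depth[t];
                    winner[y][x] = t;
                }
        }

    /* Pass 2: shade only the visible fragment at each pixel. */
    int shaded = 0;
    for (int y = 0; y < TILE; y++)
        for (int x = 0; x < TILE; x++)
            if (winner[y][x] >= 0)
                shaded++;

    /* An immediate-mode renderer shades every fragment it rasterizes. */
    printf("deferred shaded %d fragments; immediate-mode shades %d\n",
           shaded, 3 * TILE * TILE);
    return 0;
}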