KYRO III probably out in Q1 2002. What are its known specs?


OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
Why are we still talking about the Kyro 2? It is old news as far as Americans are concerned. Nowadays there are much better cards at cheaper or equal prices than the Kyro 2 sells for, so it makes more sense to go for either Radeon or GeForce, because they have excellent lineups in the budget gaming market.
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net


<< Why are we still talking about the Kyro 2? It is old news as far as Americans are concerned. Nowadays there are much better cards at cheaper or equal prices than the Kyro 2 sells for, so it makes more sense to go for either Radeon or GeForce, because they have excellent lineups in the budget gaming market. >>



Well, we are still talking about the Kyro 2 on this forum because it's a global forum. Yes, it might be true that the Kyro 2 has more competition in the budget market in America, but in other countries, where Nvidia doesn't have such a stranglehold, things are different.

Anyway, with the release of the Kyro 3 in a few months' time, it's good to keep a few Kyro-related topics going! :)
 

holdencommodore

Golden Member
Nov 3, 2000
1,061
0
0
For the price the Kyro 2 is great. I am very happy with it considering the options I had. In Australia the only thing in the Kyro 2 (about $145) price range is the GeForce 2 MX200 or SiS 315. If the price is right for the Kyro 3 - under $500, which most GF3 Ti 200s are - then it will be at the top of my shopping list when I build another computer.
 

oozz77

Junior Member
Jan 7, 2002
2
0
0
PowerVR
I got all my games working perfectly now, and Unreal, UT, etc. work equally well in both D3D and OGL. I am so impressed with this card: not only can I play my games at much higher resolutions in 32-bit, but they play very fast and fluid as well. RTCW is absolutely beautiful on this card. Looks just as good as the GF3 screenshots to me.
Vedin, of course the card plays UT at all the lower resolutions as well. What PowerVR meant was that UT only gives you the choice of 1280 or 1600 in the higher resolution range. It has nothing to do with the Kyro 2.
HoldenCommodore, hi, I'm from Aus too. I just picked up my Kyro 2 for $155 with TV-out and am very happy with the upgrade from my V3.
There's no way I would pay $500 for a GF3 Ti on principle alone, as even now there are some games where even the Kyro 2 can outperform it. With RAM prices at an all-time low I can't see how Nvidia can justify such prices for an incremental increase in performance and features. It's a joke. This is why it's a shame that 3dfx went under; the less competition, the higher the asking price! Well, from the many benchmarks I saw, I'm glad I didn't get a GF2 Ti200 at $320. I really don't care which company is ahead of the race as long as we consumers and gamers benefit in price as well as performance.

 

Powervr

Member
Jun 8, 2000
101
0
0
well said oozz77....

but I have only those two resolutions in d3d...
weird... I am the only one having this problem..
time to test the new kyro drivers
;)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I created that poll to make a point. 60FPS is not a slide show.

Your poll does not show that at all. Also the responses in your poll are irrelevant, but even if they weren't, they still wouldn't confirm what you're saying.

No really. Me too.

No, because my card is running my games at least twice as fast as yours. That means smoother gameplay, higher resolutions and more eye candy for me.
 

Powervr

Member
Jun 8, 2000
101
0
0
I saw this here www.paraknowya.com

Kyro III reaching developers


I have just been playing Empire Earth for the past few hours. While taking a break and checking my mail, I thought I would take a quick look at the readme file. What's this I see?

"Supported Video Cards*
------------------------
Empire Earth supports video cards based on the following 3D chipsets:
…
…
ATI Rage Fury MAXX
ATI Radeon
PowerVR Kyro II
PowerVR Kyro Series3"

Interesting, so this means Sierra have got a Kyro III and have tested it with this game. If the Kyro III has been sent out to developers it can't be far away. It's also good to see developers are supporting the Kyro III.


:)
I guess the Kyro 3 is not far away...
 

EMAN

Banned
Jan 28, 2000
1,359
0
0
Your poll does not show that at all. Also the responses in your poll are irrelevant, but even if they weren't, they still wouldn't confirm what you're saying.

BFG, if you really want I can create a poll for you.

How about "60FPS = slide show?"

Almost everybody in the forum will say it's not a slide show. You want to bet some money on it? I'm willing to bet my whole system on it. If you don't want to bet then stop blabbering.

My poll shows that most people on this board wanted 50-70fps. That obviously shows that 60fps is not a slide show.


No, because my card is running my games at least twice as fast as yours. That means smoother gameplay, higher resolutions and more eye candy for me.

Not all games are twice as fast as on my card. Maybe some. Even then, your card is just faster, nothing more. No extra eye candy is involved. You play your games at 1152x864 with no aniso. I play my games at the same resolution or higher with aniso. I already have smooth gameplay. So who's got more eye candy? Definitely not you with your puny 17-inch monitor.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81


<< Interesting, so this means Sierra have got a Kyro III and have tested it with this game. If the Kyro III has been sent out to developers it can't be far away. It's also good to see developers are supporting the Kyro III >>



This probably means the Kyro 3 is no more than a higher-clocked Kyro 2 with the same feature set, so no problems should be encountered with it.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Almost everybody in the forum will say it's not a slide show.

Again with you and your polls. Do you think that people's opinions will somehow make you technically correct? Because they won't. The fact that you're so heavily into making polls speaks volumes about your complete inability to back up your claims using facts and logic. I, unlike you, do not need other people telling me that my statements are correct.

Also how about a poll that asks "is 60 FPS average a slideshow compared to an average of 120 FPS"? After all, you're deliberately twisting the issue to make yourself look more reasonable than you really are.

You want to bet some money on it?

No, why would I?

I'm willing to bet my whole system for it.

And does that make you proud?

If you don't want to bet then stop blabbering.

Last time I checked, this was an open technical forum, not a "say only what you'll gamble against" forum, so stop your childish drivel. If you can't handle logic and facts I have nothing more to say to you.

My poll shows that most people on this board wanted 50-70fps. That obviously shows that 60fps is not a slide show.

That is a completely invalid deduction to make. How does it show that those people wanted those scores?

For all you know they've never even seen higher framerates and/or they don't have systems capable of going any higher than those scores. In that case your poll doesn't mean they want those scores, only that they've never seen any better.

Not all games are twice as fast as my card. Maybe some.

Probably most.

Even then, your card is just faster, nothing more.

You have a Radeon, yes? In that case my 3D image quality is better than yours. In fact nVidia's 3D image quality is probably better than everyone else's in the consumer market.

You play your games at 1152x864 with no aniso.

In some games I use anisotropic filtering.

I play my games at the same resolution or higher with Aniso.

But without trilinear (again, do you have a Radeon?), unlike myself.

I already have smooth game play.

Not to me you don't, because if it were smooth I would set my settings so that my card ran as slow as yours.

So who's got more eye candy?

Me of course, since my framerates hardly ever drop below 60 FPS. If I want slow eye candy I'll take screenshots and view them in a slideshow viewer. If I play games I want consistently fast and smooth framerates.

Definitely not you with your puny 17-inch monitor

What does my monitor have to do with this?
 

EMAN

Banned
Jan 28, 2000
1,359
0
0
Again with you and your polls. Do you think that peoples' opinions will somehow make you technically correct? Because they won't. The fact that you're so heavily into making polls speaks volumes about your complete inability to back up your claims using facts and logic. I, unlike you, do not need other people telling me that my statements are correct.

What is wrong with a poll and people's opinions? Are you that insecure that people will disagree with you? Fact is that 60fps is not a slide show whether you agree with me or not.

You want to use a fact? Fact is, 25fps is not a slide show; if you disagree with me, tell that to movie makers. Just because you can't play games at 60fps you don't have to call it a slide show. You just suck at playing games.


Also how about a poll that asks "is 60 FPS average a slideshow compared to an average of 120 FPS"? After all, you're deliberately twisting the issue to make yourself look more reasonable than you really are.

But you said 60fps is a slide show. Did you not say that? Now you want to twist your story. Whatever, little boy.


No, why would I?

That's what I thought.


And does that make you proud?

Actually it does. I'm a man and I can back myself up.


Last time I checked, this was a technical open forum, not a "say only what you'll gamble against" forum, so stop your childish dribble. If you can't handle logic and facts I have nothing more to say to you.

I'm willing to back my claim. You on the other hand blabber garbage and can't even back yourself up. I don't think this forum needs stupid posts like "60fps is a slide show". Good, don't say anything to me, because I think you're one of the biggest problems in this forum.


That is a completely invalid deduction to make. How does it show that those people wanted those scores?

How is it invalid? Instead of saying it's invalid, why don't you show me? It shows that most people want 50-70fps.


For all you know they've never even seen higher framerates and/or they don't have systems capable of going any higher than those scores. In that case your poll doesn't mean they want those scores, only that they've never seen any better.

Actually almost everybody in the forum can get higher FPS by lowering their resolutions or playing in 16-bit color. So your statement doesn't make any sense.


Probably most.

Instead of saying that, show me some related web pages showing me double the fps in most games. Of course you can't, because it's only some games that take advantage of raw fill rate and bandwidth, like Quake 3. Not all games are based on the Quake 3 engine, and even then your CPU is more important as soon as you have enough bandwidth.


You have a Radeon, yes? In that case my 3D image quality is better than yours. In fact nVidia's 3D image quality is probably better than everyone elses in the consumer market.

That is your opinion and nothing more. I have a Radeon and it makes the GeForce 3 look like garbage. And last time people tried to find out which card had better IQ, the Radeon won. So you were saying?


In some games I use anisotropic filtering.

Wow you do. That's amazing.


But without trilinear (again, do you have a Radeon?), unlike myself.

Does it make a difference? If you look at reviews from hardware sites they will tell you that the Radeon's bi-aniso looks better than or equal to the GeForce's tri-aniso. You want proof? Go to Digit-Life.


Not to me you don't because if it was smooth I would set my settings so that my card ran as slow as yours.

Of course not you, because you're just a kid who can't play a game without having 150+fps.


Me of course, since my framerates hardly ever drop below 60 FPS. If I want slow eye candy I'll take screenshots and view them in a slideshow viewer. If I play games I want consistently fast and smooth framerates.

How does faster FPS = better IQ? You don't make any sense.


What does my monitor have to do with this?

Did you know that the monitor makes a huge difference in image quality? I guess you don't, since you have a cheesy 17-inch shadow mask monitor.
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0


<< You want to use a fact? Fact is, 25fps is not a slide show; if you disagree with me, tell that to movie makers. Just because you can't play games at 60fps you don't have to call it a slide show. You just suck at playing games. >>



<groan>, surely you must have realised that movies are not displayed at 25fps by now.
 

Brodde

Junior Member
Jan 15, 2002
13
0
0
For Kyro III information www.paraknowya.com is a good start. Nothing new right now though...

There was some kind of confusion about PowerVR series 3 and Kyro III.
From what I remember:

PowerVR series 2 == Kyro (1)
PowerVR series 3 == Kyro II

If I remember wrong I will probably get corrected right away, and then flamed to death ;)

When it comes to the Kyro II, I am getting rid of mine. It performs all right in lots of games; it's actually really good in RTCW and UT (and Diablo II). =)

But it's got some serious compatibility issues. Lots of games require a lot of tweaking, and even after hours of work, it's not certain you'll get the game to work. (I'm still pissed about not being able to run Wizardry 8 and Red Faction.) And some games run smooth but crash when certain things happen. In Ghost Recon, the game crashes whenever I use a rocket launcher (probably the smoke effects that kill it).

Despite lots of problems, I support the Kyro cards. I like to see more competition, and I will probably buy a Kyro III card and at least try it out.

But right now I'm going to wait a week till the 128Mb Ti200 comes out. After that I have a Kyro II for sale :)

 

vss1980

Platinum Member
Feb 29, 2000
2,944
0
76
PowerVR Series 1: 2 chips, PCX1 and PCX2 - PCX2 fixed a few errors in the PCX1 chip, like bad lighting.
Examples: Matrox m3D, Videologic Apocalypse 3D/5D

PowerVR Series 2: example: Videologic Neon 250
Single-chip 2D/3D - released at the same time as the TNT2, but a year late, and performed in about the same area.

PowerVR Series 3: 2 chips (so far) - Kyro & Kyro II

PowerVR Series 4: Kyro III ???
 

Brodde

Junior Member
Jan 15, 2002
13
0
0
Umm, ok, I was almost correct then. The point was the earlier post by PowerVR stating:


ATI Rage Fury MAXX
ATI Radeon
PowerVR Kyro II
PowerVR Kyro Series3

Interesting, so this means Sierra have got a Kyro III and have tested it with this game. If the Kyro III has been sent out to developers it can't be far away. It's also good to see developers are supporting the Kyro III.


I guess the Kyro 3 is not far away...


It could be a misunderstanding.
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0
Well, firstly, cinema runs at 24fps with each frame shown twice by the projector shutter, and film shows motion blur, which tricks the brain into believing more than one frame is shown. Also, as mentioned elsewhere, even film producers avoid quick panning shots, as these really show up the low frame rate.
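As a rough sketch of the projection math (using the standard cinema values of 24fps and a two-blade shutter; these numbers are an assumption for illustration, not taken from the thread):

```python
# Standard cinema projection (assumed values): 24 frames per second,
# with a two-blade shutter flashing each frame twice.
FILM_FPS = 24
SHUTTER_BLADES = 2

flashes_per_second = FILM_FPS * SHUTTER_BLADES  # perceived flicker rate
frame_duration_ms = 1000 / FILM_FPS             # how long each frame persists

print(flashes_per_second)           # 48 flashes/s, above the flicker-fusion threshold
print(round(frame_duration_ms, 1))  # 41.7 ms of motion per frame, masked by motion blur
```

The point: the 48Hz flicker rate hides the flashing, while in-camera motion blur hides the fact that each frame covers almost 42ms of motion - neither trick applies to a game rendering discrete sharp frames.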
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
What is wrong with a pole and people's opinions

Nothing, except opinion is not the same thing as fact. Just because slightly more people voted on your side, that doesn't mean anything. Also, why are you ignoring the significant number of people that agreed with me? This isn't a presidential election whereby you simply win outright because you have more votes and ignore everyone else's.

Fact is 25fps is not a slide show

It most certainly is and if you can't see that then you're probably blind or in denial. Whatever the case, it doesn't make you right.

if you disagree with me tell that to movie makers.

How many times do I have to tell you that movies are completely different to FPSs and that...

Oh, forget it, I'm tired of repeating the same arguments only to be ignored by zealots. If you don't understand why 25 FPS in a movie is not the same thing as 25 FPS in a 3D game, I suggest you do some research.

Just because you can't play games at 60fps you don't have to call it a slide show. You just suck at playing games.

When did I ever say that? I said that a 60 FPS average is my minimum target for 3D games but I prefer it to be higher. I also said that in the heavier areas a 60 FPS average will often crawl (i.e. it'll be a slideshow).
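The average-versus-minimum point can be sketched with a toy frame-rate trace (the sample numbers here are invented purely for illustration):

```python
# Hypothetical per-second framerate samples from one game run: the
# average looks fine, but the heavy scenes dip well below it.
samples = [90, 85, 80, 30, 25, 50, 70, 90, 40, 40]

average_fps = sum(samples) / len(samples)
minimum_fps = min(samples)

print(average_fps)  # 60.0 -- "a 60 FPS average"
print(minimum_fps)  # 25 -- the crawling moments the average hides
```

A benchmark reporting "60 FPS average" says nothing about how low the worst seconds go, which is the crux of this exchange.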

But you said 60fps is a slide show. Did you not say that?

I did, after which I qualified the statement: compared to 120 FPS.

That's what I thought.

Then why even bother asking such an inappropriate question? This is an open forum, not a casino. I'm not sure where you got the idea otherwise. Just because you can't use facts and logic to back up your claims, don't try to turn the issue into a lottery.

Actually it does. I'm a man and I can back myself up.

Pfffff.

I'm willing to back my claim. You in the other hand blabber garbage and can't even back yourself up.

You don't back yourself in any way. All you do is wave around that useless poll in my face as if it were fact or something. If that's the best ammunition you've got then I'd quit right now if I were you.

I don't think this forum needs stupid post like "60fps is a slide show".

If you even had half a clue you would understand exactly what I was saying and why it works the way it does.

Good, don't say anything to me, because I think you're one of the biggest problems in this forum.

At least I don't try to discredit people on the basis of trying to gamble with them.
At least I use technical facts, logic and experience to back up my claims instead of some half-assed opinion poll.

It shows that most people want 50-70fps.

It most certainly does not. It shows that most people are playing at that setting, not that they want it. Again, that means nothing because of the reasons I gave you.

Actually almost everybody in the forum can get higher FPS by lowering their resolutions or playing in 16-bit color. So your statement doesn't make any sense.

But according to you they want to play at the scores they're getting now so they have absolutely no reason to lower the resolution and bit depth, right?

You're tripping over your own arguments now.

Instead of saying that, show me some related web pages showing me double the fps in most games.

Sure, which card do you have? Let me know and I'll get the results.

That is your opinion and nothing more.

Much like your poll, yet you feel the poll somehow proves that you're correct.

And no, it's much more than opinion. There are technical reasons why nVidia's rendering accuracy is better than ATi's. Again, if you don't know why I suggest you do some research on the issue.

I have a Radeon and it makes the GeForce 3 look like garbage.

That's not my problem if you don't know how to set up your GF3 and/or you don't know where to look for the image quality differences.

And last time people tried to find out which card had better IQ, the Radeon won. So you were saying?

And where might that be? In every single bake-off we've done, nVidia has come out on top. And even then the screenshots weren't that useful for comparing image quality.

Wow you do. That's amazing.

I was just responding to your off-tangent argument which has precisely nothing to do with the issue at hand.

Does it make a difference?

ROTFL.

If you look at reviews from hardware sites they will tell you that the Radeon's bi-aniso looks better than or equal to the GeForce's tri-aniso.

If they're saying that then they're talking absolute rubbish. Trilinear filtering blends the mipmaps into a smooth gradient while anisotropic filtering sharpens the textures at certain angles. Both of the filtering methods complement each other, not replace each other. If you think that ani + bi is better than ani + tri filtering you're either totally clueless or a total zealot, or both.

The Radeon has horrible mipmap banding patterns when you enable anisotropic filtering but you probably won't see them in the screenshots. You certainly will see them in the game though. Of course you'd know that, having fully tested both cards like I have. You'd also know that because you understand how the filtering methods work on a technical level, like I do.
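A toy sketch of the filtering difference described above (simplified LOD math for illustration, not either card's actual hardware path): bilinear mipmapping snaps to one mipmap level, while trilinear blends adjacent levels.

```python
def bilinear_mip(lod):
    # Bilinear filtering samples a single mipmap level, so the effective
    # level snaps to the nearest integer -> visible bands at the steps.
    return int(lod + 0.5)

def trilinear_mip(lod):
    # Trilinear filtering blends the two adjacent mipmap levels, so the
    # effective level varies continuously -> a smooth gradient.
    return lod

lods = [0.0, 0.4, 0.5, 0.9, 1.2]
print([bilinear_mip(l) for l in lods])   # [0, 0, 1, 1, 1] -- discrete steps
print([trilinear_mip(l) for l in lods])  # [0.0, 0.4, 0.5, 0.9, 1.2] -- smooth
```

Anisotropic filtering sharpens textures viewed at oblique angles but does nothing about those level-to-level steps, which is why aniso on top of bilinear can still band where aniso on top of trilinear does not.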

Of course not you, because you're just a kid who can't play a game without having 150+fps.

For the last time, 60 FPS average is my minimum target for playability but I prefer 120 FPS for a much smoother experience.

How does faster FPS = better IQ? You don't make any sense.

Who said anything about IQ? Do you even know what IQ stands for?

I was talking about total image quality and how it relates to the gaming experience. Pick two guys who both have the same image quality settings in a game. Do you think that the guy getting 60 FPS average is going to have the same gaming experience as the 120 FPS guy? Of course not, and the only difference between the two systems is the speed in which they're rendering the game.

Did you know that the monitor makes a huge difference in image quality?

And why do you feel that my monitor delivers bad image quality?

I guess you don't, since you have a cheesy 17-inch shadow mask monitor.

At least I don't have two wires running along the length of my screen.

Also, you seem to have taken a childish interest in arguing off topic and bashing my monitor for some reason. If it makes you feel any better to bash plastic and glass, go right ahead.

Any link, Mingon?

Why should he give you a link? You're the one making the false statements about framerates, movies and 3D games. It's not his job to dig up articles to disprove your false claims. Like I said before if the only backing you have is that worthless poll, you're fighting a losing battle.
 

EMAN

Banned
Jan 28, 2000
1,359
0
0
Nothing, except opinion is not the same thing as fact. Just because slightly more people voted on your side, that doesn't mean anything. Also, why are you ignoring the significant number of people that agreed with me? This isn't a presidential election whereby you simply win outright because you have more votes and ignore everyone else's.

Oh really! Then the majority of people on this forum like to play games that run like a slide show, in your opinion. Do you even know what you're saying? How am I ignoring everybody else's opinions? You just say crap but never ever back yourself up. You're the one with a blunt statement saying "60fps is a slide show". I on the other hand never say stupid $hit like that.


It most certainly is and if you can't see that then you're probably blind or in denial. Whatever the case, it doesn't make you right.

Oh, and I'm blind. I'm in denial because BFG the Nvidia fanboy said so. Give me some concrete evidence that I'm blind or in denial, or shut your pink hole.


Oh, forget it, I'm tired of repeating the same arguments only to be ignored by zealots. If you don't understand why 25 FPS in a movie is not the same thing as 25 FPS in a 3D game, I suggest you do some research.

Oh, now I'm a zealot. Oh no, you figured me out! Did you know that you're the biggest FANBOY here? Tell me, where did you find this research?


If you even had half a clue you would understand exactly what I was saying and why it works the way it does.

And now I have no clue.:confused: What do you know, BFG knows everything and everybody else in here doesn't know a thing about computers or their components. You're the man BFG. NOT:disgust:


When did I ever say that? I said that a 60 FPS average is my minimum target for 3D games but I prefer it to be higher. I also said that in the heavier areas a 60 FPS average will often crawl (i.e. it'll be a slideshow).

Why don't you read your post again, because your memory is failing you. You never said that. You said 60fps is a slide show.


I did, after which I qualified the statement: compared to 120 FPS.

Now we're getting somewhere. I don't make $hit comments like that. I don't think 60fps is a slide show and neither does the rest of this forum, except you. Either way, you said it. Don't try to deny something you said, because I have concrete evidence now.


Then why even bother asking such an inappropriate question? This is an open forum, not a casino. I'm not sure where you got the idea otherwise. Just because you can't use facts and logic to back up your claims, don't try to turn the issue into a lottery.

Why not? Car racers do it all the time. They bet their cars to see who's the fastest and give up their car if they lose. I see the same logic here. If you don't believe in yourself then obviously you expect defeat. If you think you're right, put your money where your pink gilly is and bet your system!


Pfffff.

Something you know nothing about, because you're still a little boy who has to save his lunch money just to get a GeForce.


You don't back yourself in any way. All you do is wave around that useless poll in my face as if it were fact or something. If that's the best ammunition you've got then I'd quit right now if I were you.

Really, now the poll is useless? Why don't you show me some concrete evidence then? You wave your pink gilly around making stupid false statements with your stupid fanboy attitude. Nobody in their right mind cares what BFG has to say because he's nothing more than a fanboy. I bet you haven't figured it out, have you? Fanboy.


If you even had half a clue you would understand exactly what I was saying and why it works the way it does.

Why don't you show me some web pages saying that you're right, instead of telling everybody in the forum that they're wrong? You can't, can you? Figures. You're nothing more than a troll.


At least I don't try to discredit people on the basis of trying to gamble with them.
At least I use technical facts, logic and experience to back up my claims instead of some half-assed opinion poll.


I'm just expressing how I feel about you. How did I discredit you? What technical facts did you use? You keep saying all those things but you never pointed me to a web page or to where you have done your research. You don't use facts; you use "BFG right, everybody wrong". I'm not the one who said "60fps is a slide show". It's statements like that you need to prove to everybody else. You can't prove it and that's that.


It most certainly does not. It shows that most people are playing at that setting, not that they want it. Again, that means nothing because of the reasons I gave you.

Oh really? Then why is it that most people play at that setting? Obviously they want it, or else they wouldn't be playing with an average of 50-70fps. What reason did you give me? Nothing, once again.:Q


But according to you they want to play at the scores they're getting now so they have absolutely no reason to lower the resolution and bit depth, right?

Of course not, because they want better graphic details. You're an idiot, BFG. This is what you said in a previous post: "For all you know they've never even seen higher framerates and/or they don't have systems capable of going any higher than those scores. In that case your poll doesn't mean they want those scores, only that they've never seen any better." I just suggested how to get higher FPS, since according to you people can't get higher FPS because of system incapabilities. Now that they have seen higher FPS they can choose for themselves.


You're tripping over your own arguments now.

You're the one who's tripping over your own statements. Read the post right above.


Sure, which card do you have? Let me know and I'll get the results.

Radeon 64MB DDR VIVO SE at 200MHz. Don't show me some games. You'd better show me most games.


Much like your poll, yet you feel the poll somehow proves that you're correct.

And no, it's much more than opinion. There are technical reasons why nVidia's rendering accuracy is better than ATi's. Again, if you don't know why I suggest you do some research on the issue.


The Radeon's image quality is better. I have done enough research to know that the Radeon's graphics look better than the GeForce's in image quality. Even the Kyro 2 has better image quality than the GeForce. And I actually have a GeForce 2 sitting next to my Radeon, and the Radeon spanks it hard. They had an image comparison between the GeForce and the Radeon a few months back and they all said the Radeon has superior image quality. If you don't agree with me, you can tell that to the many review sites yourself. I think you just wanted to hear what you want to hear and blocked out everything else, because you're a fanboy. It was the time when all the Quack issues arose. People started to test for image quality. The Radeon won, and everyone knows that except you.


That's not my problem if you don't know how to set up your GF3 and/or you don't know where to look for the image quality differences.

It's not my problem either; it's the GeForce's problem. Why do I have to look for image quality differences when I know the Radeon has better IQ? I have a GeForce 2 here right next to my Radeon and the Radeon looks crisper and better detailed.


And where might that be? In every single bake-off we've done, nVidia has come out on top. And even then the screenshots weren't that useful for comparing image quality.

It was the Radeon that came out on top. If that's what you remember, I think you need to reread it once again. Like I said, you just blocked off everything people had to say about how good the Radeon's IQ was and just accepted what you wanted to hear about your GeForce, because you're a fanboy.


ROTFL.

Stop laughing at yourself. I know you're funny looking and all, but you really need to stop.


If they're saying that then they're talking absolute rubbish. Trilinear filtering blends the mipmaps into a smooth gradient while anisotropic filtering sharpens the textures at certain angles. Both of the filtering methods complement each other, not replace each other. If you think that ani + bi is better than ani + tri filtering you're either totally clueless or a total zealot, or both.

I don't care what it does, and I don't buy into that techno garbage Nvidia has been feeding you. The Radeon looks as good as the GeForce, if not better. Every review site said this. If you don't agree with me you can tell that $hit to Anandtech, HardOCP, Digit-Life and so on. You're just a fanboy talking about how awesome your card is, but the fact is your GeForce 3 doesn't look any better than the Radeon's way of anisotropic filtering.


The Radeon has horrible mipmap banding patterns when you enable anisotropic filtering but you probably won't see them in the screenshots. You certainly will see them in the game though. Of course you'd know that, having fully tested both cards like I have. You'd also know that because you understand how the filtering methods work on a technical level, like I do.

I don't know what you've been smoking, but it doesn't have banding problems at all. It only has banding problems when I set my aniso to 128-tap; only then do I see some shimmering effect.


For the last time, 60 FPS average is my minimum target for playability but I prefer 120 FPS for a much smoother experience.

Oh, now you're changing your statement. BFG the liar, who originally said 60fps is a slide show.


Who said anything about IQ? Do you even know what IQ stands for?

You did, cracker jack. You said you had better image quality. IQ stands for image quality. Do you know what MHz stands for?


I was talking about total image quality and how it relates to the gaming experience. Pick two guys who both have the same image quality settings in a game. Do you think that the guy getting 60 FPS average is going to have the same gaming experience as the 120 FPS guy? Of course not, and the only difference between the two systems is the speed in which they're rendering the game.

No you didn't say that. You said you had better image quality. You want me to point it to you.

EMAN So who's got more eye candy?
BFG, Me of course

I don't know how, since you play your games at 1152x864 32-bit + trilinear at 150+ fps and I play at 1152x864 or higher resolution at 32-bit + aniso at 70+ fps.


And why do you feel that my monitor delivers bad image quality?

Because it's tiny, not flat, low resolution, and not an aperture grille.


At least I don't have two wires running along the length of my screen.

Actually, those wires are way cool, because I can run my monitor at 1600x1200 at 85 Hz with colors you can't imagine on your puny 17-inch Philips.


Also, you seem to have taken a childish interest in arguing off topic and bashing my monitor for some reason. If it makes you feel any better to bash plastic and glass, go right ahead.

Why is it off topic? We were talking about image quality. I think the monitor has a lot to do with what we see. If you don't think so, stop looking at your monitor every day.


Why should he give you a link? You're the one making the false statements about framerates, movies and 3D games. It's not his job to dig up articles to disprove your false claims. Like I said before if the only backing you have is that worthless poll, you're fighting a losing battle.

Why shouldn't he? If he's trying to educate me he should post a link. BFG = mindless fanboy. You already lost the battle with your stupid statement, so I don't know what you're talking about. You're so scared of that poll, maybe I should make another one.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Then the majority of people on this forum like to play games that run like a slide show, in your opinion.

Correct.

Do you even know what you're saying? How am I ignoring everybody else's opinions?

By casually forgetting that around 40% of the people in your own poll disagreed with you and agreed with me. Your childishly simplistic view of the poll is so funny it's sad. Basically, you think the poll proves you're right, and at the same time you conveniently forget everyone who agrees with me.

You just say crap but never ever back yourself up.

Excuse me? What didn't I back up? I gave you specific examples where the difference between even 120 FPS and 100 FPS is apparent, and all you did was make some stupid comment about Carmack. Clearly you're not even capable of comprehending when logic and facts dispute what you're saying, and instead you carry on with your childish antics. Either grow up and argue like an adult or take a hike.

You're the one with the blunt statement saying "60fps is a slide show".

Because a 60 FPS average quite often is a slideshow: it'll dip much lower than 60 FPS on many occasions. Of course you're too simple to understand anything remotely complicated like that. All you do is grasp onto your magic number, and as soon as someone speaks out about it, you bark and keep barking, trying to drive the person away.

Oh and I'm blind. I'm in denial because BFG the Nvidia Fanboy said so. Give me some concrete evidence that I'm blind or in denial if you can't shut your Pink hole.

OK, cap all your games to 25 FPS and play them for three weeks. Then uncap them. If you can't see any difference between the "before and after" cases you're either blind, a zealot or lying. Or since you like polls so much, why don't you start one and ask if people think that 25 FPS is a slideshow?

Tell me where did you find this research?

If you don't know I assume you haven't done any. In that case you have absolutely no business spewing the garbage you've been doing and challenging people to bets because you're "manly".

Why don't you read your post again, because your memory is failing you. You never said that. You said 60fps is a slide show.

I said that 60 FPS average is a slideshow and in certain cases it certainly is. But since you had trouble understanding that I proceeded to quantify the statement by comparing it to 120 FPS. And now you still seem to have problems understanding that as well. How much simpler would you like me to go?

Now we're getting somewhere.

You've suddenly gained reading and comprehension skills? That's good to hear.

Don't try to deny something you said because I have concrete evidence now.

ROTFLMAO! What "concrete evidence"? That stupid poll you created? LOL! If you were any more childish you'd be...no, I don't even think that's possible.

Why not? Car racers do it all the time. They bet their cars to see who's the fastest and give up their car if they lose. I see the same logic here.

Something you know nothing about, because you're still a little boy who has to save his lunch money just to get a GeForce.

Your statements are now getting truly pathetic and the funny thing is that you have no idea how bad they are. Each time you make comments like this it only proves how wrong you really are and you have to resort to personal insults to make yourself look like you're still in the argument.

Really, now the poll is useless.

What do you mean "now"? It's been useless right from the start, yet you can't seem to understand such a simple fact.

Why don't you show me some web pages saying that you're right instead of telling everybody in the forum that they're wrong?

Do you know what an average score is?
Do you understand the implications of playing games in a wide variety of situations?

If you're getting a 60 FPS average it will drop to unplayable levels on many more occasions than 120 FPS will.

I'm just expressing how I feel about you.

(1) Which has exactly what to do with the argument?
(2) WTF would I care what you have to say about me?
(3) Stick to facts, not "feelings".

Oh really? Then why is it that most people play at that setting? Obviously they want it, or else they wouldn't be playing with an average of 50-70 fps.

I see there is just no hope for you at all. You just don't get it.

I just suggest how to get higher FPS

Which they won't do because according to you they only want 50 FPS to 70 FPS.

Radeon 64 MB DDR VIVO SE at 200 MHz. Don't show me some games. You better show me most games.

No, you show me most games where the Ti500 is not twice the speed.

Here are my results. Your card is 10% faster than a Radeon 64 MB DDR. In all of the video card limited settings (1600 x 1200 x 32) the Ti500 is at least twice as fast as your card and it looks better too.

Radeon's image quality is better. I have done enough research to know that radeon's graphics look better than geforce in image quality.

Are you blind? Can you not see ATi's mipmap banding? Can you not see their box-shaped trilinear approximation?

I have geforce 2 here right next to my radeon and the radeon looks crisper and better detailed.

Crisper? What does "crisper" mean? Or is it just another unsubstantiated, non-technical term that ATi fanboys continue to use on a regular basis? I assume you tried using nVidia's digital vibrance control, right?

Better detailed? I assume you checked for excessive LOD artifacts (aliasing, texture shimmering, and sparkling, etc) on your Radeon which is caused by their LOD being set too high? It looks good in screenshots but not so good in games.
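The LOD point above can be made concrete with a small sketch. This is a simplified illustration of standard mip level selection, not any vendor's actual driver logic; the function name and level count are made up. A negative LOD bias selects a more detailed mip than the screen footprint calls for, which looks sharper in a still screenshot but tends to alias and shimmer in motion.

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, num_levels=10):
    """Simplified mip selection: log2 of the texture footprint per screen
    pixel, plus a driver-controlled bias, clamped to the available levels.
    A negative bias picks a more detailed (lower-numbered) mip level,
    trading screenshot sharpness for in-game shimmering."""
    lod = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(lod, 0.0), num_levels - 1)
```

For example, a surface covering 4 texels per pixel normally lands on mip level 2; a -1.0 bias forces level 1, doubling the texel density and with it the risk of sparkling.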

I assume you have, you being the expert and all.

It was Radeon that came out on top.

It most certainly did not. Show me the links that prove otherwise. On the other hand, I've got this:

But I believe that the GeForce3's implementation, where the mip map boundaries form an arc at a set distance from the "camera," is a little more proper.

The Radeon 8500's mip map boundaries are a little less precise. They tend to move and jump around, too, and not in a uniform way. You can see how the boundaries in the screenshots there aren't symmetrical; they don't really meet in the center of the screen. Sometimes, they intersect way off center. It's hard to describe unless you see it, but ATI is kind of guesstimating where mip map boundaries ought to be.


And here:

The GeForce3 implements trilinear filtering properly with anisotropic filtering, and it produces beautiful, smooth gradients

With anisotropic filtering enabled on the Radeon 8500, two things happen. First, trilinear filtering goes away. Transitions between mip maps are stark. Second, see how the mip map transition is positioned oddly on the walkway in the screenshot. The lack of precision in determining mip map boundaries makes for some weird results with anisotropic filtering enabled. I believe both of these problems are carried over from the original Radeon, and it's a shame to see them in the 8500.


Also I've got links that show that nVidia's rendering accuracy in 3DMark is much higher than ATi's and I'll put them up later if you like. Now, show me your technical sites which describe what "crispness" means and explain to me how it relates to rendering on a technical level.

The Radeon looks as good as the GeForce, if not better. Every review site said this.

Wonderful, then you should have no problems showing me examples of them. Show me a review site that tells me that ATi's bi + ani looks better than nVidia's tri + ani. Show me a review site that says that ATi's box approximation looks better than nVidia's per-pixel rendering.

I don't know what you been smoking but it doesn't have banding problems at all.

Do you even know what bilinear filtering is and what it does?

I don't care what it does and I don't buy into that techno garbage Nvidia has been feeding to you.

Good response. You don't have a clue as to what you're talking about so you instead try to discredit me with insults. That style of arguing seems to carry through your statements quite often.

IQ stands for image quality.

To a layman perhaps. In reality it actually stands for internal quality, as in the precision inside the rendering pipelines. Of course you doing all that "research" would know this.

I don't know how, since you play your games at 1152x864 32-bit + trilinear at 150+ fps and I play at 1152x864 or higher resolution at 32-bit + aniso at 70+ fps

Well I have trilinear filtering and higher framerates but you have anisotropic filtering but horrible mipmap boundaries. I think the overwhelming majority would agree with me that I have the edge. Also I've got the benefit of nVidia's high IQ accuracy.

low resolution,

Err, it's the same resolution as yours.

not an aperture grille

You seem to have the mistaken idea that aperture grilles are better than shadowmasks for everything. That is not the case.

Actually those wires are way cool because I can run my monitor 1600x1200 85htz with colors you can't imagine on your puny 17inch phillips.

Yet I can do the same on my new monitor without the wires. Go figure.

Why is it off topic? We were talking about image quality.

And my monitor has no problem with that.

If he's trying to educate me he should post a link

Yet because you don't know anything, you're free to post whatever drivel you like without having to back yourself up? I find your logic flawed.
 
Jul 1, 2000
10,274
2
0
Jesus, it is a video card, people... Get a grip on reality.

I support anyone who brings a quality product into the market place. It lowers prices for the sane gamers who actually buy whichever card offers the best bang for the buck.

You fanboys are absolutely ridiculous. How can you justify wasting all of this time yammering away about something as trivial as FPS on a card by card basis or whether 60 FPS is a slide show?

This is insane. Some of you should seek professional counseling... or better yet, step away from the computer for a little while and get actual lives.

Not only that, how do you even have time to actually game between writing "War and Peace"-length pieces about your inane opinions on these cards, line-by-line refutations of the previous loser's stupid, pointless arguments grounded solely in opinion?

Do you just run benchmarks, or do you actually play the games? Just curious.
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0

I am not describing my preference. I am describing a typical gamer's preferences.

I don't run around message boards telling people I need 150+fps.


So you run around and tell everyone that they don't.

"Hello Pot", says the kettle... "Your black!"


You arrogantly describe it like everybody needs 150+ fps, which people don't. Most people are satisfied with 60 fps, and that's the point I was trying to make.


Still telling everyone what they should think, eh?


Almost everybody in the forum will say it's not a slide show. You want to bet some money on it?

Then the majority of people on this forum like to play games that run like a slide show, in your opinion.


Can you tell me what the majority of the people on the forum think about hypocrisy?


I was describing typical gamer preferences. That is why I made that poll.


And about 40% disagreed with you.


There you go again putting words in my mouth!


Kinda like describing typical gamer preferences


BFG even though I can tell the difference between avg. fps of 60 and 120fps

and

I can tell the difference from constant 30fps to 70fps but I cannot tell the difference from constant 90 fps to 120fps.


Then why wouldn't you want 90fps?


Your definition of slide show is WRONG!


Since I (and the rest of the 40%) understood BFG's definition of slide show in this context, I will clarify. A 60 FPS average can become a slideshow in games when the minimum instantaneous FPS drops to half that number or below. The effect on gaming is an annoying, choppy look and feel, nicely characterized by the phrase "slide show". Kinda like calling ATI's IQ "crisp".
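That average-versus-minimum distinction can be sketched in a few lines of Python. This is just an illustration of the criterion described above, with made-up frame time data; the function name and the half-the-average threshold are assumptions, not any benchmark tool's actual metric.

```python
def feels_like_slideshow(frame_times_ms, floor_ratio=0.5):
    """A run whose *average* framerate looks fine can still feel choppy
    if the worst instantaneous framerate falls to half the average or
    below. Frame times are in milliseconds; returns (choppy?, avg, worst)."""
    fps = [1000.0 / t for t in frame_times_ms]   # per-frame instantaneous FPS
    avg = sum(fps) / len(fps)
    worst = min(fps)
    return worst <= avg * floor_ratio, avg, worst
```

A perfectly steady run at ~60 fps passes, while a run that also averages around 60 fps but spikes to 40-50 ms frames gets flagged, even though both look identical in an "average FPS" bar chart.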


I'm a man and I can back myself up.


This is probably the funniest thing I've read on this BBS. You should also say that you are the most amazingly humble person in the world.

You're arguing about video cards, d00d; that doesn't make you a man. At best it makes you a gold medalist in the Special Olympics.


I don't make $hit comments like that.


No, you say stuff like "1000 opinions make a fact".


 

EMAN

Banned
Jan 28, 2000
1,359
0
0
Correct.

I guess you're always right and everyone else is wrong. Typical trollish behavior.


By casually forgetting that around 40% of the people in your own poll disagreed with you and agreed with me. Your childishly simplistic view of the poll is so funny it's sad. Basically, you think the poll proves you're right, and at the same time you conveniently forget everyone who agrees with me.

Did you know that the majority wins and the minority loses? It's been like that since voting existed. Of course, you live in the BFG totalitarian government, so you wouldn't know anything about that. Check your numbers again, because your 40% just shrunk to less than 20%. Who agrees with you? A bunch of mindless Nvidiots who patrol these forums. I wouldn't care about them because they are all the same.


Excuse me? What didn't I back up? I gave you specific examples where the difference between even 120 FPS and 100 FPS is apparent, and all you did was make some stupid comment about Carmack. Clearly you're not even capable of comprehending when logic and facts dispute what you're saying, and instead you carry on with your childish antics. Either grow up and argue like an adult or take a hike.

I really don't care what you think. Give me some web pages saying that. If not shut your pink gilly.


Because a 60 FPS average quite often is a slideshow: it'll dip much lower than 60 FPS on many occasions. Of course you're too simple to understand anything remotely complicated like that. All you do is grasp onto your magic number, and as soon as someone speaks out about it, you bark and keep barking, trying to drive the person away.

An average of 60fps is not a slide show even if it dips below 60fps on many occasions. Actually, I am simple minded. I see smooth gameplay and it says 60fps on the timedemo. What do you know, 60fps is not a slide show. I guess you can't understand that because you've got something caught between your a$$ cheeks and can't get it out. I didn't create this magic number. People on the forum want this magic number. So you better go tell that $hit to everybody who voted for an average of 50-70fps.


OK, cap all your games to 25 FPS and play them for three weeks. Then uncap them. If you can't see any difference between the "before and after" cases you're either blind, a zealot or lying. Or since you like polls so much, why don't you start one and ask if people think that 25 FPS is a slideshow?

But why would I cap my games at 25fps when I get an average of more than 60fps? Did I say I play games at an average of 25fps? You know that I don't, so why play at those settings? Why don't you set your fps to 25 and tell me how it goes.:p


If you don't know I assume you haven't done any. In that case you have absolutely no business spewing the garbage you've been doing and challenging people to bets because you're "manly".

I guess you haven't done any research either, because you can't even name one source.:eek:


I said that 60 FPS average is a slideshow and in certain cases it certainly is. But since you had trouble understanding that I proceeded to quantify the statement by comparing it to 120 FPS. And now you still seem to have problems understanding that as well. How much simpler would you like me to go?

You said, "Average of 60fps is slideshow". How stupid can you be. Your statement is right there and you keep denying you said this. You keep changing your statement little by little but you won't fool nobody with that pink gilly tactics.


You've suddenly gained reading and comprehension skills? That's good to hear.

You haven't gained anything but your trollish and Fanboy ways. What's new?


ROTFLMAO! What "concrete evidence"? That stupid poll you created? LOL! If you were any more childish you'd be...no, I don't even think that's possible.

No not the poll. You admitted saying 60fps is a slide show.


Your statements are now getting truly pathetic and the funny thing is that you have no idea how bad they are. Each time you make comments like this it only proves how wrong you really are and you have to resort to personal insults to make yourself look like you're still in the argument.

If you can't back yourself up, then shut your gilly instead of saying I'm stupid and you're smart. It's stupid statements like "your statements are now getting truly pathetic" that don't prove a thing either. You're the biggest, dumbest fanboy I have ever seen, and you can't seem to shut it because you think you're the greatest thing to hit the gaming world. You happy now, pink gilly boy?:disgust:


What do you mean "now"? It's been useless right from the start, yet you can't seem to understand such a simple fact.

Since I live in a democratic government, majority wins. Didn't you know this? Oh, I forgot, we're in the BFG totalitarian government.


Do you know what an average score is?
Do you understand the implications of playing games in a wide variety of situations?


Do you understand that an average of 60fps is not a slide show?
Do you understand that a lowest fps of 25-30 is not a slide show?
Of course you don't, because you're BFG (Big Fat Gayboy)


If you're getting a 60 FPS average it will drop to unplayable levels on many more occasions than 120 FPS will.

Unplayable? According to whom? I play fine with an average of 60fps, so I don't know why you keep comparing it to 120fps.


(1) Which has exactly what to do with the argument?
(2) WTF would I care what you have to say about me?
(3) Stick to facts, not "feelings".


Of course it has everything to do with the argument. You're a fanboy. For you it's Nvidia or die. There will never be a time where you would actually agree with me, because I'm not a fanboy of Nvidia, nor a fanboy of anyone else.


I see there is just no hope for you at all. You just don't get it.

There will never be hope for you, because you're a fanboy. People actually play with an average of 50-70fps and you can't accept that.


Which they won't do because according to you they only want 50 FPS to 70 FPS

Of course they wouldn't. They would rather have better image quality and play at around 50-70fps.


No, you show me most games where the Ti500 is not twice the speed.

You're the one who wanted to show me. If you can't, just say so and remove your dumb fanboy comment.


Here are my results. Your card is 10% faster than a Radeon 64 MB DDR. In all of the video card limited settings (1600 x 1200 x 32) the Ti500 is at least twice as fast as your card and it looks better too.

You still haven't shown me most games. My card is not 10% faster, it's more like 20% faster, because I get more than 100fps in Quake 3, and with the right tweaks I get more than 110fps at 1024x768. As soon as you add aniso filtering, your GeForce crawls while my Radeon chugs along fine, losing only 10% of my frames. It looks better in your opinion. Everybody knows the Radeon 64 looks better than the GeForce 3. Only when you turn on all your features can it exceed the original Radeon's image quality.


Are you blind? Can you not see ATi's mipmap banding? Can you not see their box-shaped trilinear approximation?

Are you blind? Can you not see that ATI creates superior image quality hands down while your GeForce looks washed out?


Crisper? What does "crisper" mean? Or is it just another unsubstantiated, non-technical term that ATi fanboys continue to use on a regular basis? I assume you tried using nVidia's digital vibrance control, right?

Crisper, meaning sharper, cleaner, superior. Now I'm an ATI fanboy for owning an Nvidia board? Yeah, whateva, dude. Of course I tried digital vibrance. The Radeon still has better image quality.


Better detailed? I assume you checked for excessive LOD artifacts (aliasing, texture shimmering, and sparkling, etc) on your Radeon which is caused by their LOD being set too high? It looks good in screenshots but not so good in games.

It looks better than that dullish, dark-looking picture.


I assume you have, you being the expert and all.

Of course I have. And I can lower my LOD and get higher FPS. Did you know that?


It most certainly did not. Show me the links that prove otherwise. On the other hand, I've got this, and here:

Straight from your link...I want to touch on the subject of image quality for a second, because there are a few things worth mentioning. Like many folks, I've estimated in the past that the Radeon's image quality is superior to the GeForce2's.

What do you know, even your link says my Radeon is superior to the GeForce 2.

The GeForce 3 might render mip maps closer to correct than the Radeon 64, but that doesn't mean they are perfect. That is only one man's opinion, while I've got this! which shows ATI has superior image quality. Notice how Nvidia's picture looks dull. Look at the Radeon's image quality: colorful and better detailed. You look at the entire image, not only one thing. Sure, you can examine mipmaps closely under different levels of colors, but you would never notice those mipmaps while playing a game. You would actually notice much more detail on a Radeon, because it simply has more detail, and you can't escape that. You, on the other hand, have to look under a microscope to see that the Radeon's mipmap levels were a little off. How about your texture compression in UT? Don't tell me your texture compression is not broken, because you're the one who ran around these forums saying that texture compression was broken. Your point is taken, but the Radeon still has better image quality.


Also I've got links that show that nVidia's rendering accuracy in 3DMark is much higher than ATi's and I'll put them up later if you like. Now, show me your technical sites which describe what "crispness" means and explain to me how it relates to rendering on a technical level.

Well show me dude.


Wonderful, then you should have no problems showing me examples of them. Show me a review site that tells me that ATi's bi + ani looks better than nVidia's tri + ani. Show me a review site that says that ATi's box approximation looks better than nVidia's per-pixel rendering.

You can read it here.

"Now about anisotropic filtering. You can see that it's better on RADEON 8500 than on GeForce3. It is under condition of RADEON's highest level and GeForce3's middle, but as we found out, even Level4 causes great perfomance falloff (almost always greater than that of RADEON 8500). So there's no sense in demonstrating Level8 with, perhaps, the same quality as RADEON's, but also with losses that make such anisotropy useless."


Do you even know what bilinear filtering is and what it does?

Bilinear filtering is used to smooth out textures that are mapped onto surfaces. Do you know what trilinear filtering is and what it does?
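For the record, bilinear filtering is just a weighted average of the four nearest texels. Here's a minimal sketch in Python, not how any card implements it in silicon; the tiny texture grid is made-up data.

```python
def bilinear(tex, u, v):
    """Bilinear filtering, sketched: weight the four texels surrounding
    the sample point by how close the point is to each. `tex` is a small
    2D grid of brightness values, indexed tex[y][x]; (u, v) are texel
    coordinates with the fractional part giving position between texels."""
    x0, y0 = int(u), int(v)          # top-left texel of the 2x2 neighborhood
    fx, fy = u - x0, v - y0          # fractional position inside that cell
    top = tex[y0][x0] * (1 - fx) + tex[y0][x0 + 1] * fx
    bot = tex[y0 + 1][x0] * (1 - fx) + tex[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bot * fy
```

Trilinear filtering then takes one more step: it does this on two adjacent mip levels and blends the two results, which is what smooths out the banding at mipmap boundaries.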


Good response. You don't have a clue as to what you're talking about so you instead try to discredit me with insults. That style of arguing seems to carry through your statements quite often.

And you're still eating the techno-garbage they've been feeding you since day one. That is why you need to replace your card every 6 months. I don't need techno garbage to know that it's all a marketing gimmick. All I know is that Kyro and ATI have better image quality, and that's what matters, because I have to look at it every time I play a game.


To a layman perhaps. In reality it actually stands for internal quality, as in the precision inside the rendering pipelines. Of course you doing all that "research" would know this.

To me it's image quality, and I don't need anybody telling me otherwise, unlike a brain-washed Nvidia fanboy like yourself. To me, if I can't see the superior end result, then it's bull-$hit.


Well I have trilinear filtering and higher framerates but you have anisotropic filtering but horrible mipmap boundaries. I think the overwhelming majority would agree with me that I have the edge. Also I've got the benefit of nVidia's high IQ accuracy.

Dang, I would have to look at it under different sets of RGB to actually pinpoint the "little less precise" (words used by Tech Report) mipmap levels. So in your words, "a little less precise" is horrible? Whatever, dude. You blow $hit out of proportion. Dang, I still got higher details and aniso filtering. Oh no, I have better image quality.


Err, it's the same resolution as yours.

How is it the same as mine? Your max resolution is 1280x1024 while my max resolution is 1920x1440.


You seem to have the mistaken idea that aperture grilles are better than shadowmasks for everything. That is not the case.

Not for everything, but for gaming? HELL YEAH.


Yet I can do the same on my new monitor without the wires. Go figure.

Fake new monitor. And I have Geforce 4 already.


And my monitor has no problem with that.

Sure it does, you've got a 17-inch Philips.


Yet because you don't know anything, you're free to post whatever drivel you like without having to back yourself up? I find your logic flawed.

There you go again with your "I'm superior and you're inferior" statements. Why don't you post a link, since you're so superior?
 

EMAN

Banned
Jan 28, 2000
1,359
0
0
Merlocka, you're just another fanboy trying to aggravate me. I'm not even going to bother.