BFG10K
Lifer
- Aug 14, 2000
- 22,709
- 3,002
- 126
I guess you're always right and everyone else is wrong.
Not at all. If they like playing at those settings, that's fine with me. Just don't tell me that 60 FPS average isn't a slideshow because in many situations it is.
Did you know that the majority wins and the minority loses?
I can't believe I'm still actually trying to have a rational conversation with you after you make statements like this.
Who agrees with you? A bunch of mindless Nvidiots who patrol these forums. I wouldn't care about them because they are all the same.
<rolleyes>
I see. Those voters are "invalid" because they don't agree with you. But the rest of them are open-minded ATi users, right?
I didn't create this magic number. People on the forum want this magic number.
I'm sorry, I'm having trouble staying on my chair after rolling around on the floor having fits of laughter from reading that statement.
But why would I cap my games at 25 fps when I get an average of more than 60 fps?
Because you claimed that 25 FPS is not a slideshow.
You said, "An average of 60 fps is a slideshow." How stupid can you be? Your statement is right there and you keep denying you said this.
That's right, my statement is right there, yet you still don't understand what it means. Other people who weren't even involved in this thread and don't have the benefit of my repeated explanations don't seem to have any problem grasping the simple concept behind the statement, yet you still don't get it. What does that tell you?
Do you understand that having a lowest fps of 25-30 is not a slideshow?
There's nothing to "understand" because that statement is false.
I play fine with an average of 60 fps, so I don't know why you keep comparing it to 120 fps.
But I play at 120 FPS.
You're the one who wanted to show me.
And I did. You do know how to click on links, right?
Of course it has everything to do with the argument. You're a fanboy. For you it's Nvidia or die. There will never be a time when you would actually agree with me, because I'm not a fanboy of Nvidia, nor a fanboy of anyone else.
It's nothing to do with nVidia being my preferred vendor. If I had any other card I would be making the exact same arguments. Guess what, I'll let you in on a little secret: 3dfx used to be my preferred vendor before I had nVidia cards.
You still haven't shown me most games.
Yet you haven't even shown me one game where the Ti500 isn't twice as fast as your card at 1600 x 1200 x 32.
My card is not 10% faster, it's more like 20% faster
No it isn't. 183 to 200 is about 10% (200 / 183 ≈ 1.09), not 20%.
and with the right tweaks I get more than 110 fps at 1024x768.
What tweaks? I thought you just told me that you prefer image quality and you run at full details? Backpedaling again, are we?
As soon as you add aniso filtering your GeForce crawls, while my Radeon chugs along fine, losing only 10% of my frames.
That's because my GF3 does true trilinear filtering and true anisotropic filtering. OTOH your Radeon drops back to bilinear and does "adaptive" anisotropic which basically means that whenever ATi feels like it they adjust the tap rating.
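Since the filtering point keeps coming up, here's a rough sketch of the difference in Python - a simplified model of my own for illustration, not anyone's actual hardware or driver path. Filtering within a single mip level is what produces the hard transition lines; trilinear blends the two nearest levels so the transition is smooth:

```python
# Toy mip-map filtering sketch (illustrative only, not real hardware behaviour).
# A "mip" here is just a 2D list of brightness values; u and v are in [0, 1].

def bilinear_sample(mip, u, v):
    """Bilinearly filter one mip level at texture coordinate (u, v)."""
    h, w = len(mip), len(mip[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = mip[y0][x0] * (1 - fx) + mip[y0][x1] * fx
    bottom = mip[y1][x0] * (1 - fx) + mip[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

def bilinear_mip_sample(mips, lod, u, v):
    # Bilinear-only: snap to the nearest mip level. Where the chosen level
    # changes along a surface you get the sharp mip-transition line.
    level = min(int(round(lod)), len(mips) - 1)
    return bilinear_sample(mips[level], u, v)

def trilinear_sample(mips, lod, u, v):
    # Trilinear: filter the two adjacent mip levels and blend by the
    # fractional LOD, which smooths out the transition entirely.
    lo = min(int(lod), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    frac = lod - int(lod)
    return ((1 - frac) * bilinear_sample(mips[lo], u, v)
            + frac * bilinear_sample(mips[hi], u, v))
```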
Everybody knows the Radeon 64 looks better than the GeForce 3.
"Everybody" who doesn't have a clue.
Are you blind? Can you not see that ATI creates superior image quality hands down, while your GeForce looks washed out?
I noticed how you completely skipped over the technical points I made and instead have gone back to the standard "dull colours" argument. OK, we'll play it your way - have you ever used digital vibrance on nVidia's cards? Do you even know what it does?
Crisper. Meaning sharper, cleaner, superior.
Those are very big words, words which don't have an ounce of credibility from you because you have absolutely no ability to back them up.
Yeah, whatever dude. Of course I tried digital vibrance. The Radeon still has better image quality.
How was it better? At the fourth setting there is no way you can claim that the Radeon had more "vibrant colours" because that setting almost burns the monitor out. So tell me, what was wrong with the nVidia card now?
It looks better than a dullish, dark-looking picture.
Say hello to nVidia's gamma correction slider. You did try that as well, right?
And I can lower my LOD and get higher FPS. Did you know that?
And when you do that you lose the fake "sharpness" you had. In other words, you can't say your card is sharper.
"Like many folks, I've estimated in the past that the Radeon's image quality is superior to the GeForce2's".
What do you know, even your link says my Radeon is superior to the GeForce 2.
No it doesn't. In fact, that lead-in sounds like he's saying he was wrong in the past and is now going to prove it.
The GeForce 3 might render mip maps closer to correct than the Radeon 64, but that doesn't mean they are perfect.
What kind of a strawman is that? Is the GF3's mipmapping superior to the Radeon's or not? Just answer the question. Don't backpedal or skirt around the issue.
That is only one man's opinion
How is that an opinion if the screenshots are taken from the actual game and show on a technical level what is happening?
Is 1 + 1 = 2 an opinion? According to you, it is.
while I've got this!
You've got what? A link to Rage3D that says that ATi is better? ROTFLMAO!
I've got links to an independent and unbiased hardware website, while you have a link that practically points to ATi's headquarters.
Notice how Nvidia's picture looks dull.
Notice how an ATi fanboy has the ability to do absolutely anything he pleases to the image before he puts it up?
Sure, you can spot the mipmaps if you examine them closely with the mip levels coloured, but you would never notice them while playing a game.
Rubbish. Absolute rubbish. If you can't even tell the difference between bilinear and trilinear in a game, it speaks volumes about your inability to notice more complicated and subtle image quality features.
You, on the other hand, have to look under a microscope to see that the Radeon's mipmap levels were a little off.
Huh? A little off? With bilinear filtering they're absolute lines - sharp on one side and blurry on the other.
How about your texture compression in UT? Don't tell me your texture compression is not broken, because you're the one who ran around these forums saying that texture compression was broken.
Point taken and conceded. The Radeon's DXT1 is definitely better than nVidia's. But nVidia wins in pretty much everything else and with DXT3 becoming more common the issue is largely fading away.
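For anyone curious what the formats actually look like, here's a simplified reference decode I put together for illustration - not any particular chip's decompressor. A DXT1 block packs a 4x4 tile into 8 bytes using two endpoint colours plus interpolated ones (which is exactly where decode precision matters), while DXT3 spends another 8 bytes on explicit alpha and always uses the four-colour mode:

```python
import struct

def rgb565_to_rgb888(c):
    """Expand a packed 5:6:5 colour to 8-bit-per-channel RGB."""
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def decode_dxt1_block(block8):
    """Decode one 8-byte DXT1 block into 16 RGB texels (row-major 4x4)."""
    c0_raw, c1_raw, indices = struct.unpack("<HHI", block8)
    c0, c1 = rgb565_to_rgb888(c0_raw), rgb565_to_rgb888(c1_raw)
    if c0_raw > c1_raw:
        # Four-colour mode: two colours interpolated between the endpoints.
        palette = [c0, c1,
                   tuple((2 * a + b) // 3 for a, b in zip(c0, c1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(c0, c1))]
    else:
        # Three-colour mode plus transparent black (punch-through alpha).
        palette = [c0, c1,
                   tuple((a + b) // 2 for a, b in zip(c0, c1)),
                   (0, 0, 0)]
    # One 2-bit palette index per texel, 16 texels per block.
    return [palette[(indices >> (2 * i)) & 0x3] for i in range(16)]

# A DXT3 block is 16 bytes: 8 bytes of explicit 4-bit alpha per texel, followed
# by a DXT1-style colour block that is always decoded in four-colour mode.
```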
Well show me dude.
3DMark Link.
The Radeon 8500, in general, has poorer image quality than the VisionTek GeForce3 in 3DMark2001. This conclusion may or may not be transferable to other applications, especially OpenGL applications. However, we can all say with 100% certainty that the VisionTek has the better image quality in all four of 3DMark2001's image quality tests.
Moreover, I do not see any evidence to support the common myth that NVIDIA enhanced the performance scores in 3DMark2001 by degrading the image quality, by making the images "blurrier". On the contrary, not only did NVIDIA keep the level of detail the same but they also fixed graphical anomalies. In two of the four image quality tests, the deviation from the reference image actually went down. This means that the image quality actually improved in two of the four tests.
ATi, on the other hand, went a different route. The deviation from the reference images went up with the latest drivers while performance improved at the same time. There is a degradation in image quality in all four of the image quality tests. There may be a correlation between the performance gains of ATi's latest drivers and the degradation in image quality. This correlation is just speculation at this point. I'll just let the alleged "experts" discuss that issue. I already know what they are going to say anyway: the image quality problems are due to bugs and it is not interpreting 3DMark2001 images correctly.
And pay extra attention to the discussion at the end of the post - he skillfully destroys the flawed arguments that ATi fanboys constantly make about the whole issue.
You can read it here.
Uh, from your own article...
Excessive LOD:
I want to draw your attention to the fact that the LOD BIAS value is biased toward higher texture sharpness on the RADEON 8500 in Direct3D, so with anisotropy disabled the clarity seems better than the GeForce3's. But that's only true for screenshots, as such a push of the LOD BIAS results in "texture noise", showing up as flashing dots - the so-called "sand".
Variable tap ratings:
By the way, this applies to the RADEON's anisotropy in general. The thing is that the GeForce3 applies filtering to all objects regardless of the angle of the surface's slope, whereas on the RADEON anisotropy appears to be disabled at some surface angles, i.e. you can see some dithering among otherwise cleanly rendered surfaces. It is, certainly, a rare thing, so you'll have to specifically look for it.
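On that first point about LOD bias, here's a toy illustration of why a negative bias makes screenshots look crisper but produces the "sand" noise in motion. It's my own simplification - real hardware derives the LOD from screen-space texture derivatives - but the shape of the trade-off is the same:

```python
import math

def mip_lod(texels_per_pixel, lod_bias=0.0, max_level=10):
    """Pick a mip LOD from the texture footprint per pixel, plus a bias.

    A negative bias selects a sharper (lower-numbered) mip level than the
    footprint calls for: crisper stills, but undersampling in motion, which
    shows up as the flashing "sand" the article describes.
    """
    lam = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(lam, 0.0), max_level)

# A footprint of 4 texels per pixel normally lands on mip level 2;
# a -1.0 bias pulls it down to level 1 - sharper, but noisier.
print(mip_lod(4.0))        # 2.0
print(mip_lod(4.0, -1.0))  # 1.0
```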
Do you know what Trilinear Filtering is and what it does?
I've already explained exactly what it does. In terms of the argument at hand it improves image quality in places where anisotropic filtering doesn't.
So in your words, a little less precise is horrible? Whatever dude. You blow $hit out of proportion.
No, don't try that smokescreen with me. You know very well that I'm talking about bilinear filtering.