What's wrong with "cheating"?

clicknext

When everyone accuses ATI or NV of cheating, aren't they really just trading image quality for speed? What's wrong with that? If it becomes known that their image quality is bad on certain things and that costs them customers, that's their fault. I'm getting tired of people talking about these "cheats" and talking about the companies like they committed some terrible sin.
 
Well, if I buy a card thinking it'll give good IQ and it doesn't... that's cheating, even if it gives better speed. False advertising.
 
IF they give you an option to turn the optimizations off, no one but the most ignorant fanboys complains.

If they don't say that they've optimized, there is a noticeable difference in IQ, and there's no way of turning it off... that's different.
 
Originally posted by: jagec
IF they give you an option to turn the optimizations off, no one but the most ignorant fanboys complains.

If they don't say that they've optimized, there is a noticeable difference in IQ, and there's no way of turning it off... that's different.

Yes, I couldn't agree more. This is what I define as a cheat (shady optimizations despite the tradeoff).
 
Cheating between ATI and Nvidia always comes up; it's all in how you look at it. In the end we really only have these two companies to choose from for the best gaming cards, so I just look for the best value for performance with good image quality and go from there. If I go Nvidia they cheat, if I go ATI they cheat, I can't win 😉.
 
All gaming cards optimize not just textures, but vertices, shading, shadows, and virtually every other aspect of output. None of these are as accurate as they could be, which is why different hardware is used for CAD work.

It's just the immense ignorance of the masses who label it "cheating."
 
When fanboys cannot find anything else to slam a card for, they put the screenshots under a microscope and say, OMG, ATI/NVIDIA did not render 2 pixels correctly!!!! I can call cheat on them! Hurray! Even though the person probably could not tell the difference in actual gameplay, they got to colorize the games to see differences.
 
Making optimizations to drivers that reduce image quality and increase performance without an option to disable that feature. Also, features not operating the way they are supposed to, to increase performance. For example, someone says their card supports feature X and then in games it actually disables X and enables Y instead. It is as if you were lied to. It's similar to bait and switch tactics and false advertising. There should be two checkboxes in the driver. One for X and one for Y. The user can choose which one they want.

A store advertises that it is selling a solid oak table, and when you get to the store it looks like solid oak. However, when you examine the table very closely, you discover it is actually made of particle board with an oak veneer. The store's owner claims that it looks extremely close: you can't tell the difference, so it's good enough.
 
If it is decreasing IQ so much, then how come every single article on cheating needs to color the mip-map levels to see any difference at all? Sorry, but all this cheating crap is old. Maybe if the cheating decreased the IQ on the scale of this, I would care, but that isn't the case.
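For anyone who hasn't seen those articles, the coloring trick itself is simple: give every mip level a solid, distinct color so that any level switch or blend shows up as a visible seam. A minimal sketch of the idea (the sizes and colors here are made up, this isn't any review site's actual tool):

```python
# Each mip level gets its own solid color. With a normal texture,
# adjacent levels look nearly identical, so a filtering shortcut is
# invisible; with these, every transition shows up as a color seam.

COLORS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]

def colored_mip_chain(base_size=256):
    """Return a list of (size, rgb) pairs, one per mip level."""
    chain, size, level = [], base_size, 0
    while size >= 1:
        chain.append((size, COLORS[level % len(COLORS)]))
        size //= 2
        level += 1
    return chain

for size, rgb in colored_mip_chain(16):
    print(f"{size:>3}x{size:<3} -> RGB{rgb}")
```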

Oh, and last I checked P4s didn't have the option to turn off SSE2 optimizations. Damn cheaters...
 
The biggest point to this, IMHO, is that both cards were benchmarked against each other at their launch.
When the cards were set at high quality settings, one was more impressive than the other, so many people are buying that card not knowing that the two were not run with the same filtering settings.
That gave an unfair advantage, which resulted in better scores and people deciding that card is better. 🙁
 
IMO there is nothing wrong with it as long as they let you actually get what you are asking for. In other words, provide a workaround. Also, when they do this but don't tell you, it is kind of underhanded, because all the reviews out there forced Nvidia to do full trilinear filtering while ATI was only doing brilinear. If the reviewers had known about this, they could have just let the Nvidia card run in its brilinear form.

And I have seen some of the benchmarks with Nvidia running brilinear, and the R420 doesn't look as good as it does in some benchmarks.
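To make the trilinear/brilinear difference concrete, here's a rough sketch of how the mip blend weight differs between the two. The band width is an invented knob for illustration, not a real driver parameter:

```python
# lod is the level-of-detail the hardware computed; the return value
# is how much of the next mip level gets blended in.

def trilinear_weight(lod: float) -> float:
    """Full trilinear: always blend between the two nearest mip levels."""
    return lod - int(lod)  # the fractional LOD is the blend weight

def brilinear_weight(lod: float, band: float = 0.25) -> float:
    """'Brilinear': mostly plain bilinear from one mip level, with a
    narrow blend band around the transition point (f = 0.5)."""
    f = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if f <= lo:
        return 0.0            # stay entirely on the lower mip level
    if f >= hi:
        return 1.0            # snap entirely to the upper mip level
    return (f - lo) / band    # short linear ramp between the two

for lod in (2.1, 2.4, 2.5, 2.6, 2.9):
    print(f"lod={lod}: trilinear={trilinear_weight(lod):.2f}  "
          f"brilinear={brilinear_weight(lod):.2f}")
```

The speed win comes from the flat regions: there the hardware samples one mip level instead of two.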
 
I believe people don't like cheats because they're usually done in one or two popular benchmarks. They cheat to beat their opponent and sell more video cards, when in reality that card is not faster than the other one.

Believe it or not, people buy the video card that performs best in 3DMark. Now, if you can cheat in 3DMark to beat the other guy, people are going to buy that card.
 
/me sighs.
Anyone who's actually bothered to read up on this latest round of BS knows that ATI's adaptive trilinear is not anything close to NVidia's on/off "brilinear" setup.

Cheating IMO is what NV was doing in 3DM03. Not what ATI is doing now, or what NVidia was doing in UT2K3 (although I imagine people were angry their NV cards wouldn't do trilinear).
 
Originally posted by: obsidian
If it is decreasing IQ so much, then how come every single article on cheating needs to color the mip-map levels to see any difference at all? Sorry, but all this cheating crap is old. Maybe if the cheating decreased the IQ on the scale of this, I would care, but that isn't the case.

Oh, and last I checked P4s didn't have the option to turn off SSE2 optimizations. Damn cheaters...

I don't have an issue with optimizing... I like optimizing. I have an issue with features that don't work the way they are supposed to. If someone says they are doing trilinear but gives you brilinear instead, that is cheating. They need to give you exactly what they say they are giving you.

What if the specification for SSE2 is 64-bit calculations, and company X does 48-bit, doesn't tell anyone about it, and says it's close enough? Meanwhile, people are being screwed on the accuracy of their calculations and company X looks great in all the benchmarks. That's cheating.
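As a toy illustration of why that would matter (a made-up simulation of the hypothetical company X, not how any real CPU behaves; actual SSE2 doubles genuinely are 64-bit): truncate the mantissa after every operation and the error piles up over a long calculation, even though each individual result looks "close enough".

```python
import struct

def truncate_mantissa(x: float, keep_bits: int) -> float:
    """Zero out the low mantissa bits of a 64-bit float, simulating a
    unit that silently computes with fewer bits than the spec demands."""
    bits = struct.unpack('<Q', struct.pack('<d', x))[0]
    bits &= ~((1 << (52 - keep_bits)) - 1)
    return struct.unpack('<d', struct.pack('<Q', bits))[0]

honest = cheap = 0.0
for _ in range(100_000):
    honest += 0.001
    cheap = truncate_mantissa(cheap + 0.001, 20)  # "close enough" math

print(f"full-precision sum: {honest:.6f}")  # ~100.000000
print(f"truncated sum:      {cheap:.6f}")   # visibly drifts low
```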
 
Originally posted by: Schadenfroh
When fanboys cannot find anything else to slam a card for, they put the screenshots under a microscope and say, OMG, ATI/NVIDIA did not render 2 pixels correctly!!!! I can call cheat on them! Hurray! Even though the person probably could not tell the difference in actual gameplay, they got to colorize the games to see differences.

The optimisations don't kick in when you use coloured mip-maps.
They only apply when the difference between two mip levels is small, which is why you can't tell the difference in many cases: wherever it would be visible, the optimisation isn't active.
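In other words (my reading of how such an adaptive scheme would decide, not ATI's actual algorithm), the driver can compare adjacent mip levels and only take the shortcut when they are near-identical:

```python
def mip_levels_similar(level_a, level_b, threshold=8):
    """level_a/level_b: lists of (r, g, b) texels sampled at matching
    spots. The threshold is an invented number for illustration."""
    worst = max(abs(ca - cb)
                for texel_a, texel_b in zip(level_a, level_b)
                for ca, cb in zip(texel_a, texel_b))
    return worst <= threshold

def choose_filter(level_a, level_b):
    if mip_levels_similar(level_a, level_b):
        return "take the shortcut (reduced blending, nobody can tell)"
    return "do full trilinear (colored mip-maps force this path)"

# Normal texture: adjacent mip levels are nearly identical.
print(choose_filter([(120, 90, 60)] * 4, [(122, 91, 58)] * 4))
# Reviewer's colored mip-maps: levels differ wildly, shortcut is off.
print(choose_filter([(255, 0, 0)] * 4, [(0, 255, 0)] * 4))
```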
 
It's all BS. You will never be able to tell the difference.

The comparisons some people have pointed out are ridiculous. When ATI tells you you're getting a video card with these clock rates, this much memory, this many pipes, PS 2.0 support, etc., YOU'RE GETTING THAT CARD. You're not getting a 3DFX card with a red PCB and a Radeon HSF passed off as "close enough". Based on what the card puts out, we as consumers and reviewers will determine whether it's worth our money, and whether you can actually tell the performance and image quality differences. Bill Gates did not build an OS monopoly by letting users do whatever the f*ck they wanted to get whatever the f*ck they wanted out of it; you play within the rules of the OS itself. The same thing applies to ATI/Nvidia cards.
 
Originally posted by: chsh1ca
/me sighs.
Anyone who's actually bothered to read up on this latest round of BS knows that ATI's adaptive trilinear is not anything close to NVidia's on/off "brilinear" setup.

Cheating IMO is what NV was doing in 3DM03. Not what ATI is doing now, or what NVidia was doing in UT2K3 (although I imagine people were angry their NV cards wouldn't do trilinear).

Agreed.
 