Is it me or is [H]ardOCP an ATI Fanboy site !!!!

videoclone

Golden Member
Jun 5, 2003
1,465
0
0
I've read so many HardOCP video card reviews and have come to the conclusion that they are ATI lovers!

I don't think there's a single person on this forum who would take an X800 Pro over a 6800 Ultra if both cards were handed to them for free, unless you're the person who writes the HardOCP video card reviews... It's simple: we all know the 6800 Ultra is better, but HardOCP's reviews are always in favor of ATI, putting the X800 Pro in the same performance league and even (quote) "just on top of" the 6800. They say they're doing gaming benchmarks from now on to avoid cheating drivers in 3DMark, but Anandtech uses this same approach, and has for quite some time now, and their benchmarks are in line with all the other leading hardware review sites. They all have the 6800 Ultra clearly dominating the X800 Pro... BUT NOT HARDOCP.

But HardOCP contradicts the benchmarks from every other site. The majority is right, and HardOCP has been judged.

Do not go there to read Nvidia reviews. They are so wrong!

Oh, and they also use old Nvidia drivers, never touching Nvidia betas, but for some reason they have no problem running ATI beta drivers.

ALL the Nvidia drivers that have been released so far for the 6800 are betas, but HardOCP chooses to use the oldest ones they can!

Heck, HardOCP even had a 9800 XT beating a 6800 Ultra, and even stated that the 9800 XT was better in some tests. LOL, what idiots.
 

WicKeD

Golden Member
Nov 20, 2000
1,893
0
0
Ya, I'm finding HardOCP favors ATI lately. Not that it's wrong; they're just not giving Nvidia any fair play. The [H] never said anything about ATI "optimizations" this time around, even when IQ was f00ked...

I go there to see what news they're reporting, but to me their reviews are worth taking with a grain of salt.
 

Rob94hawk

Member
May 9, 2004
132
0
0
I was thinking that too, but in their latest review the VisionTek X800 Pro won some and lost some vs. the 6800U: http://www.hardocp.com/article.html?art=NjI4

And they used Nvidia's non-WHQL drivers.

Not to sound too much like an ATI fanboy, but I've got to give kudos to ATI for actually getting their card out to enthusiasts.
 

videoclone

Golden Member
Jun 5, 2003
1,465
0
0
Originally posted by: Rob94hawk
I was thinking that too, but in their latest review the VisionTek X800 Pro won some and lost some vs. the 6800U: http://www.hardocp.com/article.html?art=NjI4

And they used Nvidia's non-WHQL drivers.

Not to sound too much like an ATI fanboy, but I've got to give kudos to ATI for actually getting their card out to enthusiasts.

They used non-WHQL drivers!!! Umm, all the drivers are non-WHQL; there aren't any WHQL drivers that support the GeForce 6800 Ultra, they're all betas... HardOCP used the oldest Nvidia drivers they could to review the VisionTek X800 Pro.

Nvidia recommends using the latest release!!!

They know the latest Nvidia drivers make the 6800 Ultra blow the X800 Pro to hell in every game benchmark; that's why they don't use them... they are ATI fanboys, plain and simple!

:roll:
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
They used the ones with the fewest bugs; what's wrong with that? NV pushed the 61.11s out the door at the last second and asked reviewers to use them. They have MANY problems with Far Cry and other games.

If NV wants people to use the best and most current drivers, they should actually get some that are WHQL-passed and stop this beta crap.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
You guys are upset because HardOCP found gaming better on ATI products. Big deal; that will change when Nvidia gets good drivers. The only one to blame is Nvidia, for doing a paper launch with major bugs in the drivers. Would you rather read a review that keeps making excuses and saying things like "we are sure the drivers will be fixed when the card makes it to retail"? The worst part is that Nvidia has been using the beta-driver excuse for a couple of years now. I'll say it again: Nvidia makes great cards, maybe even the best cards, but they desperately need decent PR. (Not that ATI is in my good books, with the 9800 rip-off-the-rubes Pro.) :wine:
 

SgtZulu

Banned
Sep 15, 2001
818
0
0
Yes, if you haven't noticed, Kyle is an ATI whore.

HardOCP.com has zero credibility in my book
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
Don't whine to HardOCP because they don't shine your blessed Nvidia in the most positive light. The gameplay experience they provide in their evals is the best out there and shows what actually using the cards is like. I happen to very much like their approach. You guys are sounding like the ATI fanboys from the last couple of years, when HardOCP along with Nvidia declared 3DMark03 a useless benchmark. The ATI fanboys called Kyle and Brent Nvidia fanboys then, and now you guys call them ATI fanboys.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
3DMark03 is a useless benchmark. Its graphics are not based on real game engines.

However, 3DMark2001 was based on real engines, so it made a lot more sense (although its scores don't matter now either, because those games are outdated).
--------------------------------------------------------------------

Anyone who doesn't see the bias just by going through their latest video card articles and reading the comments is just as biased... or learning disabled.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Originally posted by: Dean
Don't whine to HardOCP because they don't shine your blessed Nvidia in the most positive light. The gameplay experience they provide in their evals is the best out there and shows what actually using the cards is like. I happen to very much like their approach. You guys are sounding like the ATI fanboys from the last couple of years, when HardOCP along with Nvidia declared 3DMark03 a useless benchmark. The ATI fanboys called Kyle and Brent Nvidia fanboys then, and now you guys call them ATI fanboys.

how soon we forget. LOL
 

Compddd

Golden Member
Jul 5, 2000
1,864
0
71
LoL, I was gonna say. Everyone used to scream that Kyle and HardOCP were Nvidia hoes, and now everyone is screaming they're ATI whores. Wtf?
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: Compddd
LoL, I was gonna say. Everyone used to scream that Kyle and HardOCP were Nvidia hoes, and now everyone is screaming they're ATI whores. Wtf?

Yes indeed, I remember when they used to run a GF3, and hell, even a GF2... once the 9800 came out, boom, they started using that. They use the best, and right now they deem the R420 the best.
Once the 6800 matures some and drivers are out in force, you may see them switch back. Who knows.
 

jagec

Lifer
Apr 30, 2004
24,442
6
81
Originally posted by: videoclone
...HardOCP used the oldest Nvidia drivers they could to review the VisionTek X800 Pro

hmm, maybe using nVidia drivers on an ATI card is part of the problem...
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
HardOCP has the most bastardized system of benchmarking I have ever seen. I don't know why anybody would actually think of those benchmarks as credible. Look at this page. Here are the 3 cards and settings being compared:
VisionTek X800 Pro @ 1280x1024
ATI X800 Pro @ 1280x1024
GeForce 6800 Ultra @ 1024x768

The first thing you notice is that the settings are not the same. It's an apples-to-oranges comparison.
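
To put rough numbers on it, here's a quick back-of-the-envelope sketch (my own math, nothing from the review itself):

# Back-of-the-envelope: relative pixel workload at the two test resolutions.
# Pure arithmetic for illustration; nothing here comes from the HardOCP review.

def pixels_per_frame(width, height):
    # Pixels the card has to render for every single frame at this resolution.
    return width * height

x800_setting = (1280, 1024)    # both X800 Pro cards were run here
gf6800_setting = (1024, 768)   # the 6800 Ultra was run here

ratio = pixels_per_frame(*x800_setting) / pixels_per_frame(*gf6800_setting)
print(f"1280x1024 is {ratio:.2f}x the pixels of 1024x768")  # -> 1.67x

# Equal FPS at these settings would actually mean the X800 Pro is pushing
# about 67% more pixels per second, so the raw FPS bars can't be compared.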
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Compddd
LoL was gonna say. Everyone used to scream that Kyle and HardOCP were Nvidia hoes and now everyone is screaming they are ATI whores. Wtf?

Errr, Compddd, Kyle posted a huge rant a while back about nVidia forcing brilinear in the drivers and not telling reviewers. Then he adopted his new "I'll post only the benchmarks I consider playable" format, where he subjectively decides what's playable and gives you only the benchmarks at that setting (e.g., he might decide 50 fps average is the spot, so one card might be tested at 1024x768 with 4xAA/8xAF and the other at 1280x1024 with 2xAA/4xAF).
Some people thought that was a little LESS than useful because, like BFG says, you want to see the cards at the highest settings as well, to get an indication of the cards' performance in non-CPU-limited situations. Others (e.g., Rollo) thought it was less than useful because we wanted to see the performance at several settings to get a better overall picture of the cards' performance.
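
For what it's worth, the format boils down to something like this (my own rough sketch; the settings and FPS numbers below are invented, not Kyle's actual data):

# Sketch of the "highest playable setting" format: the reviewer picks a
# subjective FPS cutoff, then reports only the most demanding setting that
# still clears it. All numbers below are made up for illustration.

# (resolution, AA/AF, average FPS), ordered from most to least demanding
results = [
    ("1600x1200", "4xAA/8xAF", 38),
    ("1280x1024", "4xAA/8xAF", 46),
    ("1280x1024", "2xAA/4xAF", 57),
    ("1024x768",  "4xAA/8xAF", 68),
]

PLAYABLE_FPS = 50  # the subjective "spot" the reviewer decides on

# The first setting that clears the cutoff is the one that gets published;
# every other data point is thrown away.
reported = next(r for r in results if r[2] >= PLAYABLE_FPS)
print(reported)  # ('1280x1024', '2xAA/4xAF', 57)

# Run two different cards through this and they can end up benched at
# totally different settings, which is exactly the complaint above.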
Then he went about making sure all nVidia optimizations were disabled for his benchmarks and compared them to ATI's performance settings.

Of course, he never said anything about the driver cheats on the 9600XT.

He never said anything about the ATI cheats that I've seen other than this weak indictment:
We touched on this previously, but I want to revisit this topic in a bit more detail, especially since there has been a huge outcry in the past couple of weeks addressing ATI "cheating" at their texture filtering and not representing what they are doing correctly to reviewers. Admittedly, going back and seeing exactly what ATI has said about their texture filtering and what has been done by ATI when it comes to texture filtering, does not seem to fully jive. In fact you could likely pigeon hole their actions as being misleading to reviewers. Or maybe the reviewers are putting themselves in a position to be mislead? NVIDIA is not a stranger to filtering fiascos either
Yeah right, are they cheating, or is it all the reviewers' fault??? The world will never know.

AFAIK HardOCP does nothing to disable ATI driver cheats.

Just some more retarded trolling to consider. :roll:
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Here's a scenario for ya.

I watched TechTV's "The Screen Savers" about a year ago; they had Kyle on as a guest. They were benchmarking the Athlon 64 (when it was just about to go public), and Leo Laporte asked Kyle Bennett a question: "At what speed is the front-side bus of the Athlon 64?" Kyle's reply: "Uhh, I think it's 466 MHz." He was guessing, as that was the next step up from the Barton (which has a 400 MHz FSB). Leo had to correct Kyle and say, "I thought the A64's FSB was the same speed as its core" (the Athlon 64 doesn't have a traditional FSB at all; its memory controller is on-die and runs at core speed). Again Kyle: "Uh... oh yeah, you're right."

DUUHHH!!! If he can't figure out the FSB of a CPU, should he even have a video card?