
Biased TomsHardware!

VulcanX

Member
http://www.tomshardware.com/re...e-gtx-295,2107-10.html
I don't understand how they can EVEN PUBLISH something so strictly biased, it's not even funny!

We've presented the results from six games. Five of them were mandated by Nvidia as a sample of the most-anticipated titles for the 2008 holiday season. Four of those five are part of Nvidia's The Way It's Meant to Be Played program. Two are already staples of our own benchmark suite. And we picked one game, Crysis, to add to the mix. This is still engineering-sample hardware and, according to Nvidia, the final fan speeds haven't yet been set. What makes something like this okay? All of the titles chosen are, in fact, popular games and we can understand the frustration of seeing the same three-year old apps tested over and over again simply because they're recognized performance metrics. Even still, we want to stay transparent to our readers. In fact, it was a breath of fresh air to see some new software instead of the same Supreme Commander savegame or the World in Conflict fly-through.

HONESTLY HOW IS THIS OK? They're making ATI look like a pile of scrap while using strictly Nvidia "The Way It's Meant to Be Played" games. Someone please tell me how much Nvidia paid to get this done? And why not throw some 3DMark etc. at it? That is a true test, not running games the GFX card is designed for!
 
Originally posted by: kb2114
I don't really see any problem with them testing all of the most popular games on store shelves today...

Thats bcoz ur prob a Nvidia fanboy. It's a preview, I understand they want it to look good, but that's just ridiculous! Why couldn't they be a little more fair and just give a proper review? But hey, when money is involved, certain stuff has to happen I guess. Especially seeing how Nvidia is SPLASHED all over TomsHardware (Tom's has to give back to their generosity somehow)



Let's lay off the jabs, mmkay? Launching personal attacks against someone for posting their opinion would be a no-no in the Video forum.

- AmberClad (Video Mod)
 
Originally posted by: VulcanX

And why not throw some 3DMark etc. at it? That is a true test, not running games the GFX card is designed for!
How exactly is 3DMark a "true test"? Do you play 3DMark? Because I sure don't; I play real games. 3DMark is worthless.

And it's not really Toms' fault since the five title mandate came from nVidia, presumably as a condition for reviewers to be allowed to release the results. But yeah, it would have been nice if they added some titles to the list like ComputerBase did.
 
But 3DMark is a true test of a GFX card's colours; it breaks the card down with the multiple tests it has for DX10, shader processing, etc., which I think is extremely important. If a game is designed for a certain manufacturer and is proven to run well with Nvidia, let's say, then of course Nvidia will outperform the competition, since those games are designed solely for them. 3DMark is not biased. And with that kept in mind, do they run games as benchmarks in most OC events? As far as I know, 3DMark has become a standard. Yes, some games like Crysis do get used (the benchmark tools), but the main benchmark in any OC event is 3DMark, so how accurate is the review from that point of view?
 
While I agree that there should have been a wider variety of games tested for the 295, 3DMark is not really indicative of real-world performance of GPUs these days.
Actually hasn't been for a long while.
 
Honestly, why even bother with 3DMark? It's not at all indicative of game performance. Remember how well the 2xxx did there? To me, the only reason anyone would use it would be when overclocking.
 
Just read the THG article; I can't see why they would include no-AA/no-AF results for such powerful video cards, because I doubt anyone with such video cards would use settings like that. I prefer to use settings like 8xAA and 16xAF myself! 😉
 
3DMark is a benchmarking tool, period.

The best way (IMHO) to choose the right card for you is to compare how they perform in the games you play.

If you play mostly Source engine games, would you still buy an nVidia card because of better 3DMark scores?


 
Originally posted by: Darklife
Honestly, why even bother with 3DMark? It's not at all indicative of game performance. Remember how well the 2xxx did there? To me, the only reason anyone would use it would be when overclocking.

There are four "mini-games" it benches.
- 3DMark06/Vantage is extraordinarily useful for tracking changes in a SINGLE system

- but pretty useless for comparing different systems
 
3DMark is a useful tool for overclocking and measuring performance gains on a single system. It is not very accurate for comparing the relative performance of different video cards for actual gaming performance.

A good example is the HD 4670. It scores terribly in 3DMark06 compared to the HD 3850/3870 and the 9600GT, but the fact is that in real games the HD 4670 produces very similar fps.

Why does the HD 4670 have such a low score in 3DMark? I haven't a clue....
 
Yeah I don't care for this review at all.

1) I mean, why the heck would you use an outdated 8.12 driver when ATi asked reviewers to use the 8.561.3 drivers?

2) Why force AF in Crysis when you're going to use POM? It's known that ATi doesn't work well when forcing AF in Crysis, and there is a command for it.

3) Dead Space and COD WaW don't have profiles yet. I would expect reviewers to use current titles, but this makes the review look one-sided when there aren't enough optimized titles for both.

4) Why is there no 4870 1GB?

5) Why didn't he use higher AA and AF?

I'm sure there's more, but once I see a review like this I just go somewhere else.
 
I haven't seen one single review I would base a purchase on. I try to use a consensus from a number of reviews, because many (by their choice of games, hardware, and settings) are going to be biased towards one company or the other. It's nothing to get worked up about, just read some other reviews. At least Tom's Hardware (which is one of my less favorite sites for reviews) was forthcoming about the benchmarks chosen. They are some of the more popular titles these days, and just because they have "The Way it's Meant to be Played" on them doesn't mean they are always going to perform better with Nvidia hardware (Fallout 3, for example, has been doing better with ATI hardware since 8.10).
 
Originally posted by: rogue1979
3DMark is a useful tool for overclocking and measuring performance gains on a single system. It is not very accurate for comparing the relative performance of different video cards for actual gaming performance.

A good example is the HD 4670. It scores terribly in 3DMark06 compared to the HD 3850/3870 and the 9600GT, but the fact is that in real games the HD 4670 produces very similar fps.

Why does the HD 4670 have such a low score in 3DMark? I haven't a clue....

3DMark is usually (way) ahead of the curve. Go two or three years into the future and you'll start seeing games with performance loads similar to that 3DMark. Sometimes it doesn't even take that long, if we look at how soon Max Payne came out after 3DMark2001.
 
The article is a preview... and they state that very clearly in the article... you're giving them a bad rap.

I look for as many reviews as I can find and look for the games I play. I would never base a purchase on one review... let alone a preview.
 
Originally posted by: Fox5
Originally posted by: rogue1979
3DMark is a useful tool for overclocking and measuring performance gains on a single system. It is not very accurate for comparing the relative performance of different video cards for actual gaming performance.

A good example is the HD 4670. It scores terribly in 3DMark06 compared to the HD 3850/3870 and the 9600GT, but the fact is that in real games the HD 4670 produces very similar fps.

Why does the HD 4670 have such a low score in 3DMark? I haven't a clue....

3DMark is usually (way) ahead of the curve. Go two or three years into the future and you'll start seeing games with performance loads similar to that 3DMark. Sometimes it doesn't even take that long, if we look at how soon Max Payne came out after 3DMark2001.

true .. many systems cannot run it at 19x12 with settings fully maxed out
- the default test's resolution is really low

HOWEVER, it is useless for comparing new PCs against each other
- it is *awesome* for tracking changes in a single system .. a very useful tool that has become useless for "competition"
- like that stupid, "how many 3DMarks can my OC'd system get before it melts" mentality


It does bench four mini-games which are heavily optimized for in the vendors' drivers .. so it does not relate to ANY real-world performance.

 
Originally posted by: Ocguy31
What is all this faux-outrage about?

Pretty sure it's this, "Thats bcoz ur prob a Nvidia fanboy"

Or in plain English, this faux outrage isn't about a damn thing except adolescent foot-stomping.
 
Originally posted by: kb2114
I don't really see any problem with them testing all of the most popular games on store shelves today...

Maybe not, but what about Left 4 Dead? That game runs flawlessly on my 8800GTS 320 at 1680 and 4x AA. And all the current cards run the game flawlessly at much higher settings. There is absolutely no point in testing a game where all the cards you test run the game flawlessly, but as soon as Nvidia asks review sites to test it, it becomes one of the staple benchmarks across the net.
 
While we're on the subject of benchmarking... why does AT use 2560 x 1600, highest quality, 4xAA almost exclusively for testing mid to high-end cards now?
Sure it should be included, but how many gamers actually play at those settings? Most I know play at either 1680 x 1050 or 1920 x 1200.
Seems like those settings should be used as well.

EDIT......NM, I'm an idiot, i just found those resolutions in the scaling graphs. 😱
 