It seems that no one is happy with the resolution+settings

AkumaX

Lifer
Apr 20, 2000
12,643
3
81
Real World. WTF is "Real World"?

1024x768 2xAA 4xAF on a 7600GT?
1280x1024 4xAA 8xAF on a X1900XT?

WTF is wrong with you people?! WTF settings do you people want reviewers to test?!

"I think 2 x X1900XT's in XFire should alleviate the bottleneck, especially at 1900x1200 8xAA 16xAF"

WHO TF actually USES something like this!? WTF will make you guys happy?!
 

iamaelephant

Diamond Member
Jul 25, 2004
3,816
1
81
People aren't interested in real-world benchmarks, it seems... God knows why. I get frustrated when I'm looking at reviews of individual parts I want to buy, and even if they're mid-range parts they're reviewed on NASA supercomputers.
 

Powermoloch

Lifer
Jul 5, 2005
10,084
4
76
I'm happy with 1280x1024 on my X850 XT with 2xAA/8xAF in some games (no AA/AF in the games I regularly play... RTCW:ET)

Even so with 1600x1200 on the 7600GT with 2xAA/8xAF (looks pretty darn sweet in Star Wars Battlefront II :D )
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
durrr... why not test both: CPU-limited low res AND high-detail high res? So for an X1900 XTX + E6600, you'd test at 1024x768 and at 1600x1200 with AA/AF
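The two-resolution test above can be sketched in code. This is a minimal illustration, not anyone's actual review methodology: the idea is that if frame rate barely drops when you raise the resolution, the GPU still has headroom and the CPU is the limiter. The function name and the 10% tolerance are my own assumptions.

```python
# Hypothetical sketch of the "test low res AND high res" idea:
# compare average FPS at a CPU-bound low resolution against a
# GPU-bound high resolution. The 10% tolerance is illustrative only.

def classify_bottleneck(fps_low_res: float, fps_high_res: float,
                        tolerance: float = 0.10) -> str:
    """If FPS barely drops as resolution rises, the CPU is the
    limiter; a large drop means the GPU is."""
    drop = (fps_low_res - fps_high_res) / fps_low_res
    return "CPU-limited" if drop <= tolerance else "GPU-limited"

# A card that holds 95 of 100 FPS going from 1024x768 to 1600x1200
# with AA/AF is being held back by the CPU, not the GPU.
print(classify_bottleneck(100.0, 95.0))   # CPU-limited
print(classify_bottleneck(100.0, 55.0))   # GPU-limited
```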
 

akshayt

Banned
Feb 13, 2004
2,227
0
0
X1900XT + any X2 or Core 2 Duo should be tested from 12x10 no AA/AF to 1920x1440 or up with AA and AF.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: iamaelephant
People aren't interested in real-world benchmarks, it seems... God knows why. I get frustrated when I'm looking at reviews of individual parts I want to buy, and even if they're mid-range parts they're reviewed on NASA supercomputers.

Because maybe people want to know what they'll be getting 2 years from now, when they switch to a faster card while keeping the same CPU? Because not everyone changes CPUs as often as their underwear :roll:

Judging by your point, you could go out and buy a Pentium D 805 and you'd be one happy gamer with a mid-range card... that is, until you upgraded and realized you bought the wrong CPU since it's now obsolete... that's the point
 
Oct 4, 2004
10,515
6
81
This is how I would do it -
SINGLE High-End GPU - 1280x1024 with 0xAA 16xAF, All graphics settings maxed.
DUAL High-End GPUs - 1920x1200 with 4xAA 16xAF, All graphics settings maxed.

A lotta people seem to think 1280x1024 is the new bare minimum, so this is to keep them happy.
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
A significant number of AT readers who add to the comments on articles seem to think AT should bench a new piece of hardware on a rig identical to their own system. How many times do we see "but how will it perform with my x800XL?", or "how does it compare to my D820?", or "can you test it on my ASUS xxx to see if it can OC?"

At least with gaming resolutions most people accept that the CPU- and GPU-limited scenarios are worthwhile, rather than simply demanding to see the exact res that they game at, at home.
 

michal1980

Diamond Member
Mar 7, 2003
8,019
43
91
The big problem now comes from the splintering outputs people have.
And the fact you have sli/x-fire.
And dual core processors

It used to be you had a CRT monitor, maybe 17 or 19 in, and 2 resolutions that 90-95% of the public used (i.e. 10x7 and 12x9).

So you could easily run a card through a lot of tests.

Now you have people that still use CRTs, people that have LCDs, and people that have widescreen LCDs, etc., so you go from 2 resolutions to 6-8 that people want to know about (10x7 [which is dying], 12x10, 16x12, 14x10 [or whatever]) and on and on.

Then people want to test these on amd systems, and intel systems, and single core, and dual core.

And now people want sli & xfire, and compare to old cards, and compare vendors, etc.
And different settings

I think the reviewers are just running out of juice.

Just 2 processors and 2 video cards, with all these options across different games, will net what? With 4 games, let's say 2 different settings per card, you get like 32 different tests. And you should re-run each test, say, 3 times, so now you have 96 tests.

Talk about a lot of work, in what is usually a short time period.
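The back-of-envelope math above checks out, since the number of benchmark runs is just the product of every axis a reviewer covers. A quick sketch (the game titles and setting labels are placeholders, not from the post):

```python
# Sketch of the test-matrix arithmetic: runs = product of all axes.
from itertools import product

cpus     = ["AMD X2", "Core 2 Duo"]                  # 2 processors
gpus     = ["7600GT", "X1900XT"]                     # 2 video cards
games    = ["Game A", "Game B", "Game C", "Game D"]  # 4 games
settings = ["low res / no AA", "high res / 4xAA"]    # 2 settings
runs     = 3                                         # repeat each test 3x

configs = list(product(cpus, gpus, games, settings))
print(len(configs))          # 32 distinct tests
print(len(configs) * runs)   # 96 total benchmark runs
```

Add one more axis (say, single vs. dual core) and the total doubles again, which is exactly why the matrix gets out of hand.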

 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: michal1980
The big problem now comes from the splintering outputs people have.
And the fact you have sli/x-fire.
And dual core processors

It used to be you had a CRT monitor, maybe 17 or 19 in, and 2 resolutions that 90-95% of the public used (i.e. 10x7 and 12x9).

So you could easily run a card through a lot of tests.

Now you have people that still use CRTs, people that have LCDs, and people that have widescreen LCDs, etc., so you go from 2 resolutions to 6-8 that people want to know about (10x7 [which is dying], 12x10, 16x12, 14x10 [or whatever]) and on and on.

Then people want to test these on amd systems, and intel systems, and single core, and dual core.

And now people want sli & xfire, and compare to old cards, and compare vendors, etc.
And different settings

I think the reviewers are just running out of juice.

Just 2 processors and 2 video cards, with all these options across different games, will net what? With 4 games, let's say 2 different settings per card, you get like 32 different tests. And you should re-run each test, say, 3 times, so now you have 96 tests.

Talk about a lot of work, in what is usually a short time period.


Exactly, so why bother wasting time with "real world settings", aka GPU-bottlenecked, when testing CPUs (I know this thread was made with HardOCP's Conroe evaluation in mind), when EVERYONE already knows that unless you have some high-end dual-GPU system you WILL be GPU-limited?

It's, like, useless? That's about the same as someone walking into the rain and telling you "hey, see, I told you, rain gets you wet!" NOT !"#% SHERLOCK