FutureMark 3DMark06 Benchmark Overview [Now with Download Link]


professor1942

Senior member
Dec 22, 2005
Originally posted by: Dritnul
Originally posted by: professor1942
I wonder if my crappy 128 MB card is supported.

Min req is DirectX 9.0c (Dec. update)
256 MB VRAM
1 GB RAM
2.4–2.8 GHz P4 or equiv.


Actually, it does work fine on a 128 MB card. If you can consider 6 fps 'fine'. :p
 

jim1976

Platinum Member
Aug 7, 2003
Originally posted by: Ackmed

Huh? A dual core adds *about* a 20% increase in the final score. Of course that depends on the kind of CPU it is. That does not happen in real games at 1280x1024. If it gives the same boost in score at 1600x1200, or even higher, then it is giving really false information, because that will not correlate into higher frames in games by the same percentage. As I said earlier, 3DMark06 will give a higher score to a *slower*-performing PC. And that is just wrong.

There are many games and benchmarks that take advantage of dual core through optimized drivers and gain from 5-15 or even 20% at CPU-limited resolutions. Some for reference are Doom 3, Quake 4, HL2, Chronicles of Riddick, Aquamark3, etc. If you are talking about 1280x1024 or greater, GPU-limited resolutions only, then you're correct for TODAY's single-threaded games, and this makes perfect sense. Dual-core optimized drivers don't give noticeable performance gains at 1280x1024, and since 1280x1024 is the default bench in 3DMark06, for now the CPU result is inaccurate. (I wouldn't say that if we were talking about 1024x768, though.) Anyway, what I mean is that this whopping 20% gain for dual cores, which seems and IS by far overrated now at this resolution (1280x1024), especially with AA/AF in use, may not be the case in the near future. Dual cores WILL make a major difference in SMP games, and 3DMark06 is designed to rate future games, not today's. For higher GPU-limited resolutions this isn't and will not be the case, of course, and if 3dmock presents the same gains at 1600x1200, for example, then it's utter nonsense.

You say 3DMark06 is not made for today, but for 2006? It is 2006.

Are we being sarcastic here? :p First of all, 3DMark is always released with the macro perspective in mind, since it tests new advanced technologies that will be extensively used in the future. Wait for the games of '06; this year just began... Any SMP games yet? You'll see that these gains will eventually be justified in a great proportion, even at 1280x1024 ;)

That's just one issue with this version of 3DMark. NV cards can't do HDR+AA in the tests, so it skips all SM3 tests when AA is selected. So when people try to compare scores between an NV card and an ATi card with AA selected, the ATi card is doing SM3 with HDR+AA while the NV card is only doing SM2. Just as FS did, this gives out serious misinformation.

Misinformation is for the misinformed. Tough luck. If you don't know that an Nvidia card can't do AA + float HDR, and that they can't be combined in future games (well, they can through deferred shading, but that's another story), then whose fault is that? How can Futuremark give you a score for something that cannot be combined in a real game? :confused: Their tactic is correct.
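For reference, the HDR+AA limitation being argued about here is something an application can detect at startup. Below is a minimal Direct3D 9 sketch of that check (assuming Windows and the D3D9 SDK; the 4x sample count is just an example, not necessarily what 3DMark06 requests):

#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main() {
    // Create the D3D9 interface so we can query device capabilities.
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Ask whether the FP16 surface format typically used for float-HDR
    // render targets can be multisampled. GeForce 6/7 parts fail this
    // check, which is presumably why the SM3.0/HDR tests are skipped
    // when AA is forced on those cards.
    DWORD quality = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,        // 64-bit floating-point surface
        FALSE,                       // full-screen, not windowed
        D3DMULTISAMPLE_4_SAMPLES, &quality);

    std::printf("FP16 + 4x MSAA: %s\n",
                SUCCEEDED(hr) ? "supported" : "not supported");
    d3d->Release();
    return 0;
}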

The benchmark is flawed in several key areas. I don't understand how people can think it's accurate, or feel like they have to upgrade because of a low score.

I told you, on some points you're correct. But with the CPU results it's all about what point of view you use. From a micro point of view the CPU results are overrated. From a macro point of view they are not, IMHO, for resolutions up to 1280x1024. And if you ask me, it's far more accurate than 03 or 05.
As far as people are concerned, use common sense, m8. Those who upgrade just to get a high 3dmock score are ... :p You and I know that for today's games these things are not the case. But the benchmark has to present the new technologies, HDR + SM3.0, which will be extensively used in future games. If you want comparable and useful results, then buy the professional version and use only comparable tests, or even better, do game benchmarks ;)

 

Ackmed

Diamond Member
Oct 1, 2003
Originally posted by: jim1976
Originally posted by: Ackmed

Huh? A dual core adds *about* a 20% increase in the final score. Of course that depends on the kind of CPU it is. That does not happen in real games at 1280x1024. If it gives the same boost in score at 1600x1200, or even higher, then it is giving really false information, because that will not correlate into higher frames in games by the same percentage. As I said earlier, 3DMark06 will give a higher score to a *slower*-performing PC. And that is just wrong.

There are many games and benchmarks that take advantage of dual core through optimized drivers and gain from 5-15 or even 20% at CPU-limited resolutions. Some for reference are Doom 3, Quake 4, HL2, Chronicles of Riddick, Aquamark3, etc. If you are talking about 1280x1024 or greater, GPU-limited resolutions only, then you're correct for TODAY's single-threaded games, and this makes perfect sense. Dual-core optimized drivers don't give noticeable performance gains at 1280x1024, and since 1280x1024 is the default bench in 3DMark06, for now the CPU result is inaccurate. (I wouldn't say that if we were talking about 1024x768, though.) Anyway, what I mean is that this whopping 20% gain for dual cores, which seems and IS by far overrated now at this resolution (1280x1024), especially with AA/AF in use, may not be the case in the near future. Dual cores WILL make a major difference in SMP games, and 3DMark06 is designed to rate future games, not today's. For higher GPU-limited resolutions this isn't and will not be the case, of course, and if 3dmock presents the same gains at 1600x1200, for example, then it's utter nonsense.

Yes, I know about CPU-limited resolutions. 1280x1024 is not one of them. The same improved score from a dual-core CPU is still there no matter what res, from what I have seen. At 1600x1200, there is not a 20% boost in any game with a dual-core CPU over a single-core CPU. The games that take advantage of dual-core CPUs are far, far outnumbered by the ones that don't. Giving a much higher score because of dual core is silly. A 3800+ X2 with a 6600GT scores higher than a 3500+ with a 6800GS. And that is not the case in real games; the 6800GS smokes the 6600GT with those CPUs.
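To make the scoring dispute concrete, here is an illustrative sketch of how folding a CPU term into one composite number can rank a dual-core system with a slower card above a single-core system with a faster one. The weights and sub-scores are made up for illustration only; they are not Futuremark's actual constants:

#include <cstdio>

// Weighted harmonic mean of the two graphics sub-scores and the CPU
// sub-score (hypothetical weights, chosen only to show the mechanism).
static double composite(double sm2, double sm3, double cpu) {
    const double wSM2 = 1.7, wSM3 = 1.5, wCPU = 0.3;
    const double wSum = wSM2 + wSM3 + wCPU;
    return 2.5 * wSum / (wSM2 / sm2 + wSM3 / sm3 + wCPU / cpu);
}

int main() {
    // Hypothetical sub-scores: the 6800GS box leads on both graphics
    // tests, but the X2 roughly triples the CPU sub-score.
    double gs = composite(1500.0, 1400.0,  700.0); // 3500+    + 6800GS
    double x2 = composite(1400.0, 1350.0, 2000.0); // X2 3800+ + 6600GT
    std::printf("3500+ / 6800GS : %.0f\n", gs);    // ~3323
    std::printf("X2    / 6600GT : %.0f\n", x2);    // ~3535, higher total
    return 0;
}

Even with a small CPU weight, the CPU term can tip a close graphics matchup, which is exactly the distortion being argued over; whether the real weighting behaves this way at every resolution is the open question.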

Originally posted by: Ackmed
You say 3DMark06 is not made for today, but for 2006? It is 2006.

Originally posted by: jim1976
Are we being sarcastic here? :p First of all, 3DMark is always released with the macro perspective in mind, since it tests new advanced technologies that will be extensively used in the future. Wait for the games of '06; this year just began... Any SMP games yet? You'll see that these gains will eventually be justified in a great proportion, even at 1280x1024 ;)

No, it's 2006. Eventually? Yeah, how many years down the road will that be? Well after '06 is outdated, you can bet.

That's just one issue with this version of 3DMark. NV cards can't do HDR+AA in the tests, so it skips all SM3 tests when AA is selected. So when people try to compare scores between an NV card and an ATi card with AA selected, the ATi card is doing SM3 with HDR+AA while the NV card is only doing SM2. Just as FS did, this gives out serious misinformation.

Originally posted by: jim1976
Misinformation is for the misinformed. Tough luck. If you don't know that an Nvidia card can't do AA + float HDR, and that they can't be combined in future games (well, they can through deferred shading, but that's another story), then whose fault is that? How can Futuremark give you a score for something that cannot be combined in a real game? :confused: Their tactic is correct.

Misinformed people looking to see how cards compare use "tools" like 3DMark. Futuremark spreading misinformation to people who don't know better is wrong. Their tactic is not correct. Not even close. Giving a score of zero is just silly.

The benchmark is flawed in several key areas. I don't understand how people can think it's accurate, or feel like they have to upgrade because of a low score.


Originally posted by: jim1976
I told you, on some points you're correct. But with the CPU results it's all about what point of view you use. From a micro point of view the CPU results are overrated. From a macro point of view they are not, IMHO, for resolutions up to 1280x1024. And if you ask me, it's far more accurate than 03 or 05.
As far as people are concerned, use common sense, m8. Those who upgrade just to get a high 3dmock score are ... :p You and I know that for today's games these things are not the case. But the benchmark has to present the new technologies, HDR + SM3.0, which will be extensively used in future games. If you want comparable and useful results, then buy the professional version and use only comparable tests, or even better, do game benchmarks ;)

Wow, thanks for saying I'm correct, because I was so worried.

06 is worse to me, and to many others. The results are vastly distorted, and they don't use a lot of new features. Quoted from another user who I thought worded it well: "The X1800 does not support D24X8; it has to fall back to R32F, which has an impact on bandwidth. So tell me, how is 24-bit to 32-bit a fair comparison? It's apples to oranges. So while the 7800s are running well on 24-bit with PCF, the X1800 is being compared to it on 32-bit without any Fetch4 or DFC support. Thus it is not a relevant test to compare the two's capabilities, imo."

There are far too many "apples to oranges" comparisons to even think about it being a valid comparison for video cards, or even CPUs.

1) DST16 could've been used in the benchmark but wasn't (the developers' (or IHVs') preference, I suppose)
2) DST24 is used, but cards that don't support the feature are forced to use F32 (which might cost more, but that's an unknown factor atm)
3) Fetch4 is used in SM2.0, which doesn't support Fetch4; an algorithm of it is used instead
4) SM3.0 supports Fetch4, yet it isn't used in the SM3.0 benchmark
5) AA doesn't get scored with a certain IHV's cards
6) CPUs are vastly overrated

I don't feel like cleaning up the quote tree. So there it is.
 

jim1976

Platinum Member
Aug 7, 2003
Originally posted by: Ackmed

Yes, I know about CPU-limited resolutions. 1280x1024 is not one of them. The same improved score from a dual-core CPU is still there no matter what res, from what I have seen. At 1600x1200, there is not a 20% boost in any game with a dual-core CPU over a single-core CPU. The games that take advantage of dual-core CPUs are far, far outnumbered by the ones that don't. Giving a much higher score because of dual core is silly. A 3800+ X2 with a 6600GT scores higher than a 3500+ with a 6800GS. And that is not the case in real games; the 6800GS smokes the 6600GT with those CPUs.

Well, it depends on how you see it. For single-threaded games, yes, this is utter nonsense, but can you bet that this will not be the case with an SMP game in your example?
Anyway, I haven't checked the professional version, so I'll have to take your word as far as the GPU-limited resolutions are concerned. I'm not surprised either, since Futuremark and the vendors are not stupid. They have their ways of promoting what they want ;)



No, it's 2006. Eventually? Yeah, how many years down the road will that be? Well after '06 is outdated, you can bet.

In one year many things happen in tech life. Quad cores are on their way at the end of this year, and many games (SMP too) will make their appearance.

Misinformed people looking to see how cards compare use "tools" like 3DMark. Futuremark spreading misinformation to people who don't know better is wrong. Their tactic is not correct. Not even close. Giving a score of zero is just silly.

I told you my opinion on this. You cannot present a score for something that cannot be done realistically. Anyway, what did you expect? It still remains synthetic crap for enthusiasts, as it always was, nothing more or less. Those who care find meaningful benches to work with.

Wow, thanks for saying I'm correct, because I was so worried.

06 is worse to me, and to many others. The results are vastly distorted, and they don't use a lot of new features. Quoted from another user who I thought worded it well: "The X1800 does not support D24X8; it has to fall back to R32F, which has an impact on bandwidth. So tell me, how is 24-bit to 32-bit a fair comparison? It's apples to oranges. So while the 7800s are running well on 24-bit with PCF, the X1800 is being compared to it on 32-bit without any Fetch4 or DFC support. Thus it is not a relevant test to compare the two's capabilities, imo."

There are far too many "apples to oranges" comparisons to even think about it being a valid comparison for video cards, or even CPUs.

1) DST16 could've been used in the benchmark but wasn't (the developers' (or IHVs') preference, I suppose)
2) DST24 is used, but cards that don't support the feature are forced to use F32 (which might cost more, but that's an unknown factor atm)
3) Fetch4 is used in SM2.0, which doesn't support Fetch4; an algorithm of it is used instead
4) SM3.0 supports Fetch4, yet it isn't used in the SM3.0 benchmark
5) AA doesn't get scored with a certain IHV's cards
6) CPUs are vastly overrated

There's no need for the attitude... We're having a conversation; am I skipping something here?
Anyway, I suspect patches will eventually be released to fix issues like the ones you mention. Give it some time; it's just been released, and with all the things I've seen over the years, I'm not surprised.
And in the worst-case scenario, I don't find it any worse than 03/05, which were nowhere near true gaming conditions either.
But honestly, even if this doesn't happen, I wouldn't lose sleep over it :p
 

railer

Golden Member
Apr 15, 2000
1604... yikes!

Inspiron 9300, 1.83 GHz Pentium M, 6800 Go at stock clocks. Lower than I expected...
 

Ftn

Junior Member
Jan 20, 2006
06 is worse to me, and to many others. The results are vastly distorted, and they don't use a lot of new features. Quoted from another user who I thought worded it well: "The X1800 does not support D24X8; it has to fall back to R32F, which has an impact on bandwidth. So tell me, how is 24-bit to 32-bit a fair comparison? It's apples to oranges. So while the 7800s are running well on 24-bit with PCF, the X1800 is being compared to it on 32-bit without any Fetch4 or DFC support. Thus it is not a relevant test to compare the two's capabilities, imo."

There are far too many "apples to oranges" comparisons to even think about it being a valid comparison for video cards, or even CPUs.

Nvidia uses D24X8 with PCF. ATi uses DF24 with Fetch4. Neither of them has anything to do with DFC (dynamic flow control).
The X1800 is the only card in the X1x00 line that does not support Fetch4.

1) DST16 could've been used in the benchmark but wasn't (the developers' (or IHVs') preference, I suppose)
2) DST24 is used, but cards that don't support the feature are forced to use F32 (which might cost more, but that's an unknown factor atm)
3) Fetch4 is used in SM2.0, which doesn't support Fetch4; an algorithm of it is used instead
4) SM3.0 supports Fetch4, yet it isn't used in the SM3.0 benchmark
5) AA doesn't get scored with a certain IHV's cards
6) CPUs are vastly overrated

1) Developers choose the precision they need.
2) DST24 or DF24 is used; cards that don't support them are forced to use R32F.
3) PCF or Fetch4 is used in SM2.0 if supported, otherwise a very efficient shader fallback.
4) SM3.0 does not use PCF or Fetch4, because they won't work with its shadow-edge smoothing.
5) AA doesn't get scored on cards that don't support HDR with AA.
6) If you want to compare graphics scores, compare graphics scores.
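
That breakdown maps onto the capability probe a D3D9 engine might run at startup. Below is a hedged sketch of such a check (Windows and the D3D9 SDK assumed; DF24 is the FOURCC ATi uses to expose its Fetch4-capable depth-texture format):

#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

// ATi exposes its Fetch4-capable 24-bit depth texture as a FOURCC format.
static const D3DFORMAT D3DFMT_DF24 =
    (D3DFORMAT)MAKEFOURCC('D', 'F', '2', '4');

// Can 'fmt' be used as a depth-stencil *texture*, i.e. a shadow map that
// can later be bound and sampled?
static bool DepthTextureSupported(IDirect3D9 *d3d, D3DFORMAT fmt) {
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_TEXTURE, fmt));
}

int main() {
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    if (DepthTextureSupported(d3d, D3DFMT_D24X8))
        std::puts("D24X8 depth textures: NV-style hardware PCF path");
    else if (DepthTextureSupported(d3d, D3DFMT_DF24))
        std::puts("DF24 depth textures: ATi Fetch4 path");
    else
        std::puts("no depth textures: fall back to R32F + shader filtering");

    d3d->Release();
    return 0;
}

The if/else mirrors point 2 above: a card that reports neither D24X8 nor DF24 ends up rendering depth to an R32F target and filtering in the shader.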
 

Savarak

Platinum Member
Oct 27, 2001
2303 with an AMD Athlon 64 3000+ @ 2.30 GHz, 1 GB RAM, stock 6800GT... is that about average?
 

Doctorweir

Golden Member
Sep 20, 2000
3065... ouch... :( Damn, it's CPU-limited... a totally new experience with 3DMark...
That finally brings me to hit the button on my Opteron 165 order :D
 

DidlySquat

Banned
Jun 30, 2005
I pwned with a total score of 4152 (SM2 = 1924, SM3 = 1878, CPU = 968)

7800 GTX OC, Venice 3000+ @ 2.475 GHz, see my sig for details
 

Sable

Golden Member
Jan 7, 2006
I managed 4079; can't remember the breakdown though.

I'm gonna try and run it on my XP 2400+ and ATi 9500 tonight. Then possibly my K6-2 550 and TNT2 M64. ;)

edit:

All run at stock speeds; I haven't started overclocking this yet. I've only had it a couple of weeks.
 

cronic

Golden Member
Jan 15, 2005
Originally posted by: aka1nas
Originally posted by: professor1942
Alright, I overclocked a couple things and did much better the second time:

http://img462.imageshack.us/img462/9434/3dmark060co.jpg

Am I missing something? How are you getting over 11000 with that setup?



It must be at 1024x768 or lower; that would be the only answer for that score. Why people post scores at lower than the default resolution is beyond me. If you are going to post them, please include the resolution.
 

professor1942

Senior member
Dec 22, 2005
Originally posted by: cronic
Originally posted by: aka1nas
Originally posted by: professor1942
Alright, I overclocked a couple things and did much better the second time:

http://img462.imageshack.us/img462/9434/3dmark060co.jpg

Am I missing something? How are you getting over 11000 with that setup?



It must be at 1024x768 or lower; that would be the only answer for that score. Why people post scores at lower than the default resolution is beyond me. If you are going to post them, please include the resolution.


Sorry, I didn't realize I'd covered up the resolution - here's another run:

http://img507.imageshack.us/img507/6237/3dmark0629nx.jpg
 

cronic

Golden Member
Jan 15, 2005
Originally posted by: professor1942
Originally posted by: cronic
Originally posted by: aka1nas
Originally posted by: professor1942
Alright, I overclocked a couple things and did much better the second time:

http://img462.imageshack.us/img462/9434/3dmark060co.jpg

Am I missing something? How are you getting over 11000 with that setup?

It must be at 1024x768 or lower; that would be the only answer for that score. Why people post scores at lower than the default resolution is beyond me. If you are going to post them, please include the resolution.


Sorry, I didn't realize I'd covered up the resolution - here's another run:

http://img507.imageshack.us/img507/6237/3dmark0629nx.jpg



SHINS... That is absolute bull sh+t! There is no way you are scoring 11436 with 6800-series cards and a 2.6 GHz 3700+ (San Diego). No way, no how. That score is whack!!!!!!
Nice Photoshop job there, professor1942.
More like 1436, and you added a 1 to the front, but that score is bullsh+t!!!!