FutureMark 3DMark06 Benchmark Overview [Now with Download Link]


Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: christopherzombie
Originally posted by: JustStarting
Very odd... my first pass at this was with my 146 Opty @ 2.9GHz and my 7800GT @ 500/1200 - scored a 4121!! I was happy!!

Slap in the 170 Opty running at only 2.75GHz and back my 7800GT down to 490/1190 and I get a 4,508!! I thought I'd see the 146 with the higher-clocked CPU/GPU speeds outrun the 170 at lower CPU/GPU speeds.

I didn't think the dual core would make that much difference at a lower clock speed?

The CPU score is added into the final score. This is where dual core makes its money. Compare the FPS in each test to see if your 146 is any slower than the 170.

You mean, that's where the dual core exaggerates and inflates the score? :)
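
For context on the back-and-forth above: 3DMark06 folds a weighted CPU score into the overall result, so a big jump in the CPU test lifts the total even when the graphics tests barely move. A minimal sketch of that weighting, with the constants quoted from memory of Futuremark's 3DMark06 whitepaper and purely hypothetical component scores, looks roughly like this:

```python
# Rough sketch of how 3DMark06 weights the CPU score into the overall result.
# Constants are quoted from memory of the Futuremark whitepaper -- treat them
# as approximate, and the component scores below as hypothetical.

def overall_score(sm2_score, hdr_sm3_score, cpu_score):
    """Weighted harmonic-mean style combination of graphics and CPU scores."""
    gs = 0.5 * (sm2_score + hdr_sm3_score)           # combined graphics score
    return 2.5 / ((1.7 / gs + 0.3 / cpu_score) / 2)  # CPU term carries a 0.3 weight

print(round(overall_score(1800, 1800, 1100)))  # ~4108: single-core class CPU score
print(round(overall_score(1800, 1800, 2000)))  # ~4569: dual-core class CPU, same graphics
```

Because the CPU term sits in the denominator with its own weight, roughly doubling the CPU score adds several hundred points to the total without the game tests changing at all, which is consistent with the 146-vs-170 result above.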
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Yeah, I'm going to toss in a 4800+ just to keep things competitive... stupid that single-core users should take a hit. The CPU score shouldn't factor in... this is supposed to be a GPU benchmark, not a system one (that's what PCMark is for).
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
4543 X1800XT stock 3800 X2 @ 2.5GHz

4861 X1800XT @ 690/780 3800 X2 @ 2.5GHz

I noticed that both project files report the 2D clock speed (594 MHz / 693 MHz for my card) rather than the actual 3D GPU clock speed. Seems ripe for abuse. I know I was overclocked for the second test, and I know the first test ran at ~625/750, not the 594/693 reported.

I could easily promote my results as if my card were clocked at 594/693, and checking some compare links, I'm not the only one. Is it the same for all the other cards with separate 2D/3D clocks? Hmmm
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
3036 X1800XL Stock 499/495 AMD64 3000+ @ 1.8GHz

Apparently the GPU clock speed reporting is normal behavior for 3DMark; it only appears to report the clock speed from just before the benchmark, when the card is still in 2D mode. I guess you're on the honor system when you have dynamic overclocking... one more reason it's only good for checking your own system changes.
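
A minimal toy model of the behavior described here, assuming the benchmark samples the clock once before the game tests start; the Card class and all clock values are made up for illustration:

```python
# Toy model of a GPU with separate 2D/3D clock profiles; illustrates why a
# one-time clock query taken before the 3D load starts reports the 2D clock.
class Card:
    def __init__(self, clock_2d, clock_3d):
        self.clock_2d, self.clock_3d = clock_2d, clock_3d
        self.under_load = False

    def current_clock(self):
        return self.clock_3d if self.under_load else self.clock_2d

def run_benchmark(card):
    reported = card.current_clock()  # sampled while the card is still idle (2D)
    card.under_load = True           # game tests start, driver ramps to 3D clocks
    actual = card.current_clock()    # what the tests actually run at
    card.under_load = False
    return reported, actual

print(run_benchmark(Card(clock_2d=594, clock_3d=625)))  # -> (594, 625)
```

If the clock really is only queried once up front, any dynamic or manual overclock applied after that initial sample never shows up in the result page, which is the honor-system problem.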
 

vtohthree

Senior member
Apr 18, 2005
701
0
0
3DMark score: 2726
*edit: 2762 now (after I clocked my card to 505/116 and rebooted)

I was totally disappointed; it ran everything so slow.
My 6800GS is overclocked to 497/115.


I just built my rig a couple of months ago, and it feels so weak now.
It's time to buy an X1900XT w/ a dual-core CPU.
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
Finally tried it with my new X2 4400+.

Total score: 3917

Link is in sig.
 

Robor

Elite Member
Oct 9, 1999
16,979
0
76
Just ran 05/06. A64 3200+ @ 2.5GHz & ATI X1900XTX @ default (rest of rig in sig).

3Dmark05 - 10258
3DMark06 - 4897

Not too shabby, I guess, but considering the cost of the XTX it's probably not a huge jump over my old 7800GT. Think my CPU is holding me back at all here?
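
A back-of-envelope way to gauge that, using the same weighting sketched earlier in the thread: plug in guessed component scores for an X1900 XTX class card and vary only the CPU term. All numbers below are hypothetical, not this system's actual breakdown.

```python
# Guessed component scores for an X1900 XTX with a single-core A64; only the
# CPU term differs between the two calls (weights from memory of the whitepaper).
def score(gs, cpu):
    return 2.5 / ((1.7 / gs + 0.3 / cpu) / 2)

gs = 0.5 * (2300 + 2400)       # guessed SM2.0 and HDR/SM3.0 scores
print(round(score(gs, 1000)))  # ~4886 with a single-core class CPU score
print(round(score(gs, 2000)))  # ~5725 with a dual-core class CPU score, same card
```

If the real breakdown is anywhere near those guesses, a fair chunk of the XTX's headroom is being left on the table by the CPU term rather than by the card itself.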
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: otispunkmeyer

Yeah, that's Valve's HDR. It looks good, but it's not the proper way of doing it. Kudos to them for making something everyone can use, though, but it's just not proper.

Big thumbs up to ATI for implementing the hardware in their design to allow AA and proper HDR.

Wrong. The ATi demo is called Debevec RNL. Valve's HDR is the Microsoft implementation of HDR through shaders, while the HDR used in Far Cry and other games is not a Microsoft DX standard; it's the OpenEXR standard. So really there's no improper HDR implementation. Though the OpenEXR standard offers better-quality HDR effects (not a huge difference), it comes at a cost: a bigger impact on performance.