MunkyMark06 Official Download Page


xtknight

Elite Member
Oct 15, 2004
Nope, it's not in WMI and not in any D3D dumps either. Somehow the GPU registers need to be dumped, and personally I'm not sure how to do that. ;) I can't believe the overclock values for Coolbits aren't stored somewhere readable, though. I find that quite amazing!
 

beggerking

Golden Member
Jan 15, 2006
Originally posted by: xtknight
Nope, it's not in WMI and not in any D3D dumps either. Somehow the GPU registers need to be dumped, and personally I'm not sure how to do that. ;) I can't believe the overclock values for Coolbits aren't stored somewhere readable, though. I find that quite amazing!

Really? I haven't used it in the longest time... perhaps Coolbits actually manipulates the GPU clock on the fly, in the driver? Umm, don't know. Did you go through the RivaTuner SDK? It has a GPU clock API in there... but still no shader info.
 

SickBeast

Lifer
Jul 21, 2000
I don't think autodetection is a big deal, no offence.

I don't think it's hard to pick your card from a pull-down list, and I like how you can choose other cards to look at.

A "compare" feature would be time better spent IMO.

I also think that future cards should be added in if we know preliminary specs, just for fun. :)

*Edit* Also, how hard would it be to add in the original "SickBeast Marks©"? It was (core clock) x (memory clock, single rate) x (number of pixel shaders).

I just think that my system more accurately predicted the upcoming 7900GTX's performance in comparison to the X1900XTX. :)
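
For reference, a quick sketch of the SickBeast Marks© formula described above, i.e. (core clock) x (memory clock, single rate) x (number of pixel shaders). The divisor is only there to keep the result readable, and the 7900GTX clocks in the example are rough guesses, not official numbers:

```python
# SickBeast Marks(c): core clock x single-rate memory clock x pixel shaders.
# Divided by 1,000,000 only so the result isn't a huge number.
def sickbeast_marks(core_mhz: int, mem_mhz_sdr: int, pixel_shaders: int) -> float:
    return core_mhz * mem_mhz_sdr * pixel_shaders / 1_000_000

# Example: a 7900GTX at roughly 650 MHz core, 800 MHz (single-rate) memory,
# 24 pixel shaders -- spec guesses for illustration only.
print(sickbeast_marks(650, 800, 24))   # ~12.5
```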
 

Paratus

Lifer
Jun 4, 2004
Hey, why does Zebo's 5700U get over 500? My 9600XT should at least be comparable. I want my money back!!!


;)
 

xtknight

Elite Member
Oct 15, 2004
Originally posted by: beggerking
Really? I haven't used it in the longest time... perhaps Coolbits actually manipulates the GPU clock on the fly, in the driver? Umm, don't know. Did you go through the RivaTuner SDK? It has a GPU clock API in there... but still no shader info.

Yeah, been there. Not much of use (but the device manager display enumeration and temperature monitoring stuff was cool). There was a command-line program in the RivaTuner dir somewhere, but apparently it doesn't support the 7800GT yet! Maybe the next version of RivaTuner will support it (the latest is only a beta).

Judging from looking at the DLL files, somewhere there is a function called QueryClockInfo() in the NVIDIA drivers, but I don't know where. It looks like it's not an exported function. I'll investigate that more tomorrow, I guess. Or on second thought, maybe it's just a waste of time.
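
If it helps, here's a rough way to check whether a given DLL actually exports a symbol by name (Windows-only sketch; the DLL path below is just a placeholder, since I don't know which file, if any, QueryClockInfo lives in):

```python
# Probe a DLL for an exported symbol by name (Windows-only sketch).
import ctypes

def has_export(dll_path: str, symbol: str) -> bool:
    try:
        lib = ctypes.WinDLL(dll_path)   # load the DLL into this process
        getattr(lib, symbol)            # resolved via GetProcAddress under the hood
        return True
    except (OSError, AttributeError):   # DLL missing, or symbol not exported
        return False

# Placeholder path -- swap in whichever driver DLL you want to inspect.
print(has_export(r"C:\Windows\System32\nvcpl.dll", "QueryClockInfo"))
```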

Originally posted by: SickBeast
A "compare" feature would be time better spent IMO.

lol, I was thinking about making an encoding scheme to encode people's username and claimed card/clocks/etc. into the score so they can't cheat. :p
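
Something like a keyed hash over the submitted fields would do it. A minimal sketch of the idea, with a made-up secret, field list, and score format (nothing like this exists in MunkyMark06 yet):

```python
# Sign a score submission with an HMAC so the claimed specs/score can't be
# edited without invalidating the tag. Secret and fields are illustrative only.
import hmac, hashlib

SECRET = b"server-side-secret"  # would live only in the verifying tool

def sign_score(username: str, card: str, core: int, mem: int, score: int) -> str:
    msg = f"{username}|{card}|{core}|{mem}|{score}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()[:16]

def verify(username: str, card: str, core: int, mem: int, score: int, tag: str) -> bool:
    return hmac.compare_digest(sign_score(username, card, core, mem, score), tag)

tag = sign_score("xtknight", "7800GT", 400, 500, 5200)
print(tag, verify("xtknight", "7800GT", 400, 500, 5200, tag))
```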
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Paratus
Hey, why does Zebo's 5700U get over 500? My 9600XT should at least be comparable. I want my money back!!!


;)

I agree, MunkyMarks are NV biased. :)
 

Paratus

Lifer
Jun 4, 2004
Originally posted by: SickBeast
Originally posted by: Paratus
Hey, why does Zebo's 5700U get over 500? My 9600XT should at least be comparable. I want my money back!!!


;)

I agree, MunkyMarks are NV biased. :)



What!!!

Munky works for AEG!!!!!!!



:laugh::shocked:
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Paratus
Originally posted by: SickBeast
Originally posted by: Paratus
Hey, why does Zebo's 5700U get over 500? My 9600XT should at least be comparable. I want my money back!!!


;)

I agree, MunkyMarks are NV biased. :)



What!!!

Munky works for AEG!!!!!!!



:laugh::shocked:

Well, look at your result, then compare the X1900XTX with the proposed 7900GTX. It's NV biased! See, there are no optimizations, but I'll bet AEG paid Munky to bork his formula to give NV the edge! :beer:

P.S. Munky you know I'm kidding!
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
For the record, SickBeast Marks© are 100% non-biased and I still hold them in higher regard than Munky Marks.

I'm just trying my best to be good to my fellow primate. :p
 

Paratus

Lifer
Jun 4, 2004
Originally posted by: SickBeast
Originally posted by: Paratus
Originally posted by: SickBeast
Originally posted by: Paratus
Hey, why does Zebo's 5700U get over 500? My 9600XT should at least be comparable. I want my money back!!!


;)

I agree, MunkyMarks are NV biased. :)



What!!!

Munky works for AEG!!!!!!!



:laugh::shocked:

Well, look at your result, then compare the X1900XTX with the proposed 7900GTX. It's NV biased! See, there are no optimizations, but I'll bet AEG paid Munky to bork his formula to give NV the edge! :beer:

P.S. Munky you know I'm kidding!

I bet they gave him that X1900 he's got!


:laugh: jking!
 

Rock Hydra

Diamond Member
Dec 13, 2004
My score: 320 marks.

Intel GMA 900 on my Dell Latitude 110L

:(

Desktop Cards:
FX 5900 Ultra didn't fare so well either: 906
GeForce 6800 (unlocked @ stock speed): 1675


 

Paratus

Lifer
Jun 4, 2004
Originally posted by: Rock Hydra
My score: 320 marks.

Intel GMA 900 on my Dell Latitude 110L

:(

Desktop Cards:
FX 5900 Ultra didn't fare so well either: 906
GeForce 6800 (unlocked @ stock speed): 1675

Dude, you almost tied my OC'd 9600XT with an Intel GMA :eek:. I think the low end of Munky Marks needs a little tweaking.
 

Munky

Diamond Member
Feb 5, 2005
Uhhh... I hate to break it to you guys, but upon further thinking, it seems like our formulas and real-life benches get opposite results. OK, my original formula was close, but it just happened to work out in one scenario. For example, look at the benches of the GTX 512 and the X1900XTX in FEAR. Without AA, the GPU fillrate should be the biggest factor, but the GTX scores closer to the XTX, despite the XTX having twice as many shaders. Then when you add AA, memory should play a significant role too, but the GTX takes a bigger nosedive despite faster memory clocks.
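
To put rough numbers on that argument (specs from memory and approximate; the per-clock math below is purely illustrative, not the MunkyMark formula):

```python
# Back-of-the-envelope shader throughput and memory bandwidth for the two
# cards in question. Spec numbers are approximate, from memory.
cards = {
    # name: (core MHz, effective memory MHz, pixel shaders, bus width in bits)
    "7800 GTX 512": (550, 1700, 24, 256),
    "X1900 XTX":    (650, 1550, 48, 256),
}

for name, (core, mem, shaders, bus) in cards.items():
    shader_rate = core * shaders        # MHz x shader units, a crude proxy
    bandwidth = mem * bus / 8 / 1000    # GB/s
    print(f"{name}: shader ~{shader_rate}, bandwidth ~{bandwidth:.1f} GB/s")
```

On paper the XTX has well over twice the shader throughput but slightly less bandwidth, which is the opposite of the pattern those FEAR benches show with and without AA.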
 

Elfear

Diamond Member
May 30, 2004
Originally posted by: JAG87
Originally posted by: Extelleron
For example, the X1800XT gets around 4800 while the 7800GTX (256) gets near 6500.... meanwhile the X1800XT beats the 7800GTX in performance.

That's going in my sig as the biggest BS ever written on AT.

:confused:
 

mwmorph

Diamond Member
Dec 27, 2004
Originally posted by: Rock Hydra
My score: 320 marks.

Intel GMA 900 on my Dell Latitude 110L

:(

Desktop Cards:
FX 5900 Ultra didn't fare so well either: 906
GeForce 6800 (unlocked @ stock speed): 1675

What? According to MM, your integrated graphics is 90% as fast as my OC'ed 9800 Pro/XT?
 

JBT

Lifer
Nov 28, 2001
You give way too much credit to pixel shaders in this app.
At any rate, my X800XT scores 3250.
 

firewall

Platinum Member
Oct 11, 2001
826667 with a 6600GT!!! WTF? :shocked:

core clock: 10000
memory clock: 10000
pixel pipelines: 32
vertex pipelines: 25
pixel shaders: 32
mem bus width: 2048

You need to make it more realistic by putting restrictions on how much the values can be increased. It's not very good software, since it just uses stored values and formulas. It would be better to get some real benchmarks of the cards instead of the user entering whatever he/she wants.
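
A minimal sketch of the kind of input clamping being asked for here; the limits are placeholder ranges picked for illustration, not values from MunkyMark06:

```python
# Clamp user-entered specs into plausible ranges before scoring.
# These limits are placeholders, not anything from MunkyMark06 itself.
LIMITS = {
    "core_clock":       (100, 800),   # MHz
    "memory_clock":     (100, 1000),  # MHz, single rate
    "pixel_pipelines":  (1, 16),
    "vertex_pipelines": (1, 8),
    "pixel_shaders":    (1, 48),
    "mem_bus_width":    (64, 512),    # bits
}

def clamp_specs(specs: dict) -> dict:
    """Pull every submitted value back into its allowed range."""
    return {k: max(lo, min(hi, specs.get(k, lo))) for k, (lo, hi) in LIMITS.items()}

# The 826667-point entry from above, clamped back to something plausible:
print(clamp_specs({"core_clock": 10000, "memory_clock": 10000,
                   "pixel_pipelines": 32, "vertex_pipelines": 25,
                   "pixel_shaders": 32, "mem_bus_width": 2048}))
```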