MunkyMark 2006 benchmark


themusgrat

Golden Member
Nov 2, 2005
1,408
0
0
Originally posted by: SickBeast

Oh yes I can draw an awesome monkey. I'll have it ready tomorrow night. :Q

Good work, men. Brilliant.

EDIT: Sorry, I won't crap up the thread anymore.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
God damnit Munky! This is exactly why I wanted you to contact me. We would have it patented, VC funded and IPO'd before Friday! Now you have to watch MunkyMark get ravaged by these snakes :(
 

themusgrat

Golden Member
Nov 2, 2005
1,408
0
0
Nah Nah. Catch me if you can. Err, sorry. I didn't mean to (fingers crossed). Now I get to make the millions. Muhahaha.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: RobertR1
God damnit Munky! This is exactly why I wanted you to contact me. We would have it patented, VC funded and IPO'd before Friday! Now you have to watch MunkyMark get ravaged by these snakes :(

Dude, Munky Marks are "Originally conceived by SickBeast".

PM me if you want to make your "mark". :p
 

mwmorph

Diamond Member
Dec 27, 2004
8,877
1
81
Originally posted by: munky
Originally posted by: KeithTalent
Ok, I think I messed something up. I have a mobility X300 with a core of 350 and memory of 250 (woohoo) and by my Munkycalc I get 26.4, but when I run the FEAR test at those settings I get 15 as my average. Does anyone know what I am doing wrong? Oh wise Munky, please enlighten me :D

How did you get 26.4 in my formula?
Last I checked the x300 is a 4 pipe card with a 128-bit mem bus. If that is correct, then the formula comes out to:

((350 * 4) + (250 * 128/16)) / 500 = 6.8

Which is still off, but now I'm wondering how an x300 got a higher avg fps than mwmorph's 9800xt. Are you guys using the right settings? 1280x960, 4xAA, 8xAF, max details, no soft shadows...?

Oh, no soft shadows? I thought something was wrong. All I read was "max everything".


Edit: odd, I still get 11 with soft shadows on or off.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Let's make MunkyMark more accurate and then force AnandTech to do a little review :). Then Munky will be rich and SickBeast will be annoyed! Patent it at all costs.
 

KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
118
116
Originally posted by: munky

How did you get 26.4 in my formula?
Last I checked the x300 is a 4 pipe card with a 128-bit mem bus. If that is correct, then the formula comes out to:

((350 * 4) + (250 * 128/16)) / 500 = 6.8

Which is still off, but now I'm wondering how an x300 got a higher avg fps than mwmorph's 9800xt. Are you guys using the right settings? 1280x960, 4xAA, 8xAF, max details, no soft shadows...?

Heh, yeah, screwed up the math somewhere obviously. I am pretty sure I got all the settings correct when I tested. I will check it again later.

Thanks for the help.

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Ok, does anyone know how to actually code the app so it detects the hardware features of the card? Or, I suppose I could code an app to measure the performance of the card, like texture fillrate, shader throughput, and mem bandwidth, but my mad OpenGL skillz are getting kinda rusty. Plus, would that automatically give ATI cards a lower score? Anyway, looks like MunkyMark will need an updated algorithm to compensate for 128MB cards, so don't be pirating my idea just yet ;)
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
For ****** sakes, can you just wait...

You know how long it takes to input core clocks, mem clocks, pixel pipes, vertex pipes, and pixel shaders for EVERY SINGLE VIDEO CARD MADE BY ATI AND NVIDIA SINCE 2000?

You don't need code to detect your card's specs, because I'm putting in presets for every single card out there. I am using rojakpot's list to fill in my program database.

And beggerking, get your app off the net ASAP, because when mine goes up it will put yours to shame :p


Tonight I should have a good working beta of the program. I have lots of spare time because I'll get out of school early.

Cheers


EDIT: BTW, I did get the XP theme working. Thanks to the guy who offered code, but I got it myself.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
And yes, Munky and Sick, you're both in the f'n About window.

So stfu already.

Anyone else want in the About? It's 5 dollars per character :D
 

Powermoloch

Lifer
Jul 5, 2005
10,084
4
76
Originally posted by: JAG87
tonight I should have a good working beta of the program.



oh yeah !!!!
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: JAG87
and beggerking, get your app off the net asap, because when mine goes up it will put yours to shame :p

lol. I don't mind, I only spent 20 min on it :)
I'd say there is a way to detect those settings rather than having to manually enter all of them... perhaps an API from NV or ATI? It'd be much better if it's done that way, so the software would be more future-proof.

 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: mwmorph
Edit: odd, I still get 11 with soft shadows on or off.

I think you have to restart the game for changes to take effect (or change resolutions).
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: munky
Ok, does anyone know how to actually code the app so it detects the hardware features of the card? Or, I suppose I could code an app to measure the performance of the card like texture fillrate, shader throughput, and mem bandwidth, but my mad OpenGL skillz are getting kinda rusty. Plus, would that automatically give Ati card a lower score? Anyway, looks like MunkyMark will need an updated algorithm to compensate for 128mb cards, so dont be pirating my idea just yet ;)

Above I posted code to get the amount of RAM and the # of GPUs on NVIDIA cards. But I need the function prototype for NVIDIA's QueryClockInfo() to get clocks; can't seem to find it anywhere.

Edit: hmm, RivaTuner SDK looks interesting.
 

themusgrat

Golden Member
Nov 2, 2005
1,408
0
0
This is turning out to be good. I should be in the credits of all release candidates, because of my sig. I will vote squared for anyone who puts me in. :) And seriously, we should get Anand to fully test it. Maybe he would have ideas on adjusting for CPU, audio (I think you should do +/- 2 fps for add-in audio), etc.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Alright. I think I can claim the first working (automated) MunkyMark here! :) Took a good four hours.

(52K; calculates the score in less than a quarter of a second.)

MunkyMark alpha 0.01 - by xtknight (VC6.0 ANSI)
Supports SLI

List of supported GPUs:
NVIDIA GeForce 6800 Ultra
NVIDIA GeForce 6800
NVIDIA GeForce 6800 LE
NVIDIA GeForce 6800 GT
NVIDIA GeForce 6600 GT
NVIDIA GeForce 6600
NVIDIA GeForce 7800 GTX 512
NVIDIA GeForce 7800 GTX
NVIDIA GeForce 7800 GT
SLI is factored in as a 1.9x improvement per extra GPU.

These are supported for the most part, but the device ID for 'NV43' for example has not been added, only the one for 'GeForce 6600GT', and to be honest, I'm not sure which one the 6600GTs use. Please report your results.

It will still start for ATI cards and display the device and vendor info, but it will not calculate your score yet. Actually I thought video RAM was needed for the algorithm, but apparently it's not, so you can comment out the ATI-excluding part and start adding in ATI device IDs, and it will work (not with Crossfire though).

If you have two GPUs and didn't enable SLI, my program is aware of that, and will only use your first GPU if SLI is disabled.

If anyone has any of the above cards, PLEASE test this. This is a super beta-alpha-gamma whatever you want to call it version. But it's about 5 weeks (at least) from perfect condition probably. Limitations: Most likely bugs, limited card support, does not detect overclocks (including manufacturer overclocks). It's in SHAKY FORM, so DON'T be surprised if you run into problems, but do please post the reported error here and your video configuration. But nevertheless here it is. Just double click it, you don't have to start a command console window, it will pause for you at the end.

http://xtknight.atothosting.com/munkymark/mm06ALPHA-001.exe
Source code (Visual C 6): http://xtknight.atothosting.com/munkymark/src/mm06ALPHA-001-src.zip

(Press RETURN at end of program to quit it.)

Here's an example of it at work:

MunkyMark alpha 0.01 - by xtknight (VC6.0 ANSI)
(Currently only working on NVIDIA adapters)

Enumerating display adapters...

Device ID: PCI\VEN_10DE&DEV_0092&SUBSYS_C5183842&REV_A1
vendorID: 10de
deviceID: 0092
subsysID: c5183842
revision: 00a1

NVIDIA adapter detected.
Querying nvcpl.dll for information...

Video Mode: Single GPU
GPU Count: 1
SLI Factor: 1.0
VRAM: 256MB

Finding device name in look-up table.
Device: GeForce 7800 GT
Calculating MunkyMark!

MunkyMark: 30.4000014439

---

So if you know C, please go ahead and improve upon it! The device table in the header file also needs populating, as current card support is limited. You don't actually need to know C to update the look-up table; it's self-explanatory. Not sure when ATI support is coming, but I might need somebody as a guinea pig. I don't know ATI's API yet, so it's impossible to detect Crossfire right now.

Floating point precision for the score is 10 decimal places right now, but it can easily be modified (simply change one number in the printf function).

Also, please manually calculate your score and check it against what my program returned. Thanks.

(No credits yet :p)
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Hey, nice work on that app. How did you know the code to query the NV device features? Is it something you can tell by looking at the driver DLL? I'm trying to figure out how to determine that info for ATI cards.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: munky
Hey, nice work on that app. How did you know the code to query the NV device features? Is it something you can tell by looking at the driver DLL? I'm trying to figure out how to determine that info for ATI cards.

Thanks. The functions needed are documented in the NVIDIA SDK. ATI probably has an SDK too but I only have an NVIDIA adapter to do testing on.

Part of it was just getting the device identifier string (from Windows) and matching vendor ID with either NVIDIA or ATI, and then device ID with one of the ones listed in the look-up table for default chip clock speeds, texture/shader units, etc (it is looped through).

I'm not sure how to disassemble the DLL to gather function prototypes (like for QueryClockInfo). The guy who made RivaTuner knows how to (get clock speeds, active pipes, etc) though and I suppose we could contact him.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Sigh, it looks like the formula will have to be reformulated...

The scores of the 7800 GT and GTX are terribly low, due to the fact that they have very low core clocks.

The X1800 XT ends up beating the 7800 GTX, the X1800 XL ends up on par with the 7800 GT, and the 7800 GS gets a better score than the 7800 GT.

Stupid NVIDIA had to make the core clock so low on the GT and the GTX.

Everything else looks incredible - you guys have to see how close these numbers are to reality. But the formula still needs a touch-up.

Any ideas?
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
The formula currently is:

score = core * [(pixel pipelines + vertex pipelines + pixel shaders)/3] + [memory * buswidth/16]


(forget the 500, that's just for FEAR-like numbers)
 

Leper Messiah

Banned
Dec 13, 2004
7,973
8
0
Originally posted by: xtknight
Alright. I think I can claim the first working (automated) MunkyMark here! :) Took a good four hours.


Works fine with my 7800GT.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Not sure if I like this benchmark. It makes my card look mid-range - just as bad as 3DMark and all the other benchmarks. :D