MunkyMark06 Official Download Page


mwmorph

Diamond Member
Dec 27, 2004
8,877
1
81
Originally posted by: themusgrat
That will get messy, because it uses GDDR3, so the real RAM speed is actually 333MHz. I did change it to 521/1150. Is 2017 MM right?

No, the real RAM speed is 500MHz.
GDDR3 stands for Graphics Double Data Rate RAM, generation 3. It's still DDR, just with evolutionary changes over the hotter-running DDR1 and DDR2.
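mwmorph's correction can be put in numbers: DDR-family memory transfers data on both clock edges, so a 500MHz real clock gives a 1000MT/s effective rate, which is why GDDR3 gets marketed at "double" speed. A quick Python sketch of the arithmetic (the 256-bit bus width is an illustrative assumption, matching a 7800 GT):

```python
def effective_rate_mts(real_clock_mhz: float) -> float:
    """DDR-family RAM transfers on both clock edges, doubling the effective rate."""
    return real_clock_mhz * 2

def bandwidth_gbps(real_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s from real clock and bus width."""
    return effective_rate_mts(real_clock_mhz) * 1e6 * bus_width_bits / 8 / 1e9

print(effective_rate_mts(500))   # 1000.0 -> why 500MHz GDDR3 is sold as "1000MHz"
print(bandwidth_gbps(500, 256))  # 32.0 GB/s on an assumed 256-bit bus
```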
 

MADMAX23

Senior member
Apr 22, 2005
527
0
0
I scored 4604 in MunkyMark06 with my 7800GT @ 475/1140MHz, 100% stable in all games over many hours!!

Keep improving the formula, because as Zebo has shown, there are some bugs in SLI mode I'm afraid... but a pretty good formula overall, guys: Munky, SickBeast, beggerking, JAG87!! My congratulations!!

Keep improving it!!
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
The 6800GS is brutally butchered. It has fewer pipes than the GT: 12 pixel pipelines, 12 pixel shaders, and only 5 vertex pipes. The GT has 16 pixel pipelines, 16 pixel shaders, and 6 vertex pipes. It's only natural that the GT will score higher.

To reply to the Futuremark thing... what are they going to do? Sue me for NOT making a profit off their logo? LOL, I don't think it says anywhere that I can't use their logo for personal things.

And yes, I shall optimize the formula. Also, I see some people are noticing weird things in the screenshots, such as the labels around the text... I optimized this program for XP themes, so if you are using the old Windows skin or a custom skin, you get that effect. Unfortunately I don't think that's fixable unless I use VB.NET, which I don't have right now, but I might get it and port the program over.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I liked my old equation better. In actual games the 6800gs is almost on par with the 6800gt, and my formula showed them almost even also.
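Munky's observation lines up with simple fill-rate math: the GS has fewer pipes but a higher core clock (425MHz vs. the GT's 350MHz, assuming stock clocks, which are not stated in this thread), so their theoretical pixel fill rates land close together. A rough sketch:

```python
def pixel_fillrate_mpix(core_mhz: int, pixel_pipes: int) -> int:
    """Theoretical pixel fill rate in Mpixels/s: core clock times pipe count."""
    return core_mhz * pixel_pipes

# Assumed stock configs: 6800GS at 425MHz with 12 pipes, 6800GT at 350MHz with 16 pipes
gs = pixel_fillrate_mpix(425, 12)  # 5100 Mpix/s
gt = pixel_fillrate_mpix(350, 16)  # 5600 Mpix/s
print(gs, gt)  # only ~10% apart, consistent with near-equal game performance
```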
 

RichUK

Lifer
Feb 14, 2005
10,341
678
126
I got 3050 with my OCed BFG 6800Ultra @ 450/1250.

I must say, as it is a parody of FutureMark, I think it best to alter the banner and put in Monkey instead of Future ;) as this could infringe on copyright etc.

Still, fair play for creating the app.
 

aznrice54

Member
Oct 26, 2005
71
0
0
Very nice job!

My X800 XT at 540/590 = 3340.

Don't know how accurate it is though. It may need some tweaking considering all the minute details, some of which I'm not even aware of.

Awesome work nonetheless.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
They really can't do anything to me. I'm not making a profit from their logo. I don't see how it's illegal to use someone's logo for personal use. I'm not commercializing it, nor badmouthing their good name. If anything, I'm giving them free publicity.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
OK, but VERY, very inaccurate in many cases. I liked the original MunkyMark much, much better. For example, the X1800XT gets around 4800 while the 7800GTX (256MB) gets near 6500... meanwhile the X1800XT beats the 7800GTX in actual performance. The 6800GS scores way lower than the 6800GT, yet it performs around the same. It could be good, but it needs improvement.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
You should use field-level validation to validate inputs. If anything other than a number is entered into a field, the program crashes.
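beggerking's point is to reject non-numeric input before it ever reaches the scoring formula. The app itself is VB6, but the same idea can be sketched in Python (the field name and range limits here are hypothetical, just for illustration):

```python
def parse_field(raw: str, name: str, lo: int = 1, hi: int = 10000) -> int:
    """Validate one numeric input field instead of letting bad input crash the scorer."""
    raw = raw.strip()
    if not raw.isdigit():
        raise ValueError(f"{name} must be a whole number, got {raw!r}")
    value = int(raw)
    if not lo <= value <= hi:
        raise ValueError(f"{name} must be between {lo} and {hi}")
    return value

core = parse_field("475", "core clock")  # accepted -> 475
try:
    parse_field("fast", "core clock")    # rejected cleanly instead of crashing
except ValueError as err:
    print(err)
```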
 

RichUK

Lifer
Feb 14, 2005
10,341
678
126
Originally posted by: JAG87
They really can't do anything to me. I'm not making a profit from their logo. I don't see how it's illegal to use someone's logo for personal use. I'm not commercializing it, nor badmouthing their good name. If anything, I'm giving them free publicity.

But still, you copied their logo onto a non-registered FutureMark product, which could be seen in their eyes as bad advertising, as well as against copyright law.

Anyway, for a little mess-around on AT I'm sure it won't cause any harm :)
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
2,210 MM's with my X800Pro.

Looks awesome. Thanks for the note in the credits. ;)

Just a suggestion: Can I edit the "FutureMark" logo in the bottom right corner to read "BeastMark Corporation" and edit the little orange circle into the Gorilla avatar from AT?

Also, a "monkey" splash screen would be cool.

LMK. :)
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Oh, one more thing: can we get a theoretical score on the 7900GTX?

700 core
850 memory
32 pipelines
8 vertex shaders
32 pixel shaders
256 memory bus width

I think it would be very interesting. :)
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Just one more then I'm done for now.

Thank-you to everyone who contributed to this project!

That includes those that promoted it.

It's been fun. I hope it proves to be a useful app in some regard. I think it's a great indicator of performance from future unreleased cards! :beer:
 

mwmorph

Diamond Member
Dec 27, 2004
8,877
1
81
Originally posted by: SickBeast
Oh, one more thing: can we get a theoretical score on the 7900GTX?

700 core
850 memory
32 pipelines
8 vertex shaders
32 pixel shaders
256 memory bus width

I think it would be very interesting. :)

Just put in the numbers yourself, using the 512MB GTX template with the changed numbers, and you get 16633 MM06TM.
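The MunkyMark formula itself isn't shown in this thread, but the raw throughput those rumored 7900GTX specs imply is easy to work out. The sketch below is plain fill-rate/bandwidth arithmetic from the numbers quoted above, not the app's actual scoring function:

```python
# Rumored 7900GTX specs as listed in the quoted post
core_mhz, mem_mhz = 700, 850
pixel_pipes, bus_bits = 32, 256

fillrate_mpix = core_mhz * pixel_pipes             # 22400 Mpix/s theoretical fill rate
bandwidth_gbs = mem_mhz * 2 * bus_bits / 8 / 1000  # GDDR3 double data rate -> 54.4 GB/s

print(fillrate_mpix, bandwidth_gbs)
```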
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: mwmorph
Originally posted by: SickBeast
Oh, one more thing: can we get a theoretical score on the 7900GTX?

700 core
850 memory
32 piplelines
8 vertex shaders
32 pixel shaders
256 memory bus width

I think it would be very interesting. :)

Just put in the numbers yourself, using the 512MB GTX template with the changed numbers, and you get 16633 MM06TM.

The X1900XTX only gets 11,950 MMs. Something doesn't quite add up. The Inquirer was saying that the 7900GTX would need 700MHz just to match the X1900XTX. There must be something we don't know about. :)

Not only that, but IMO a 700MHz 7900 card is very unlikely.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Originally posted by: Extelleron
For example, the X1800XT gets around 4800 while the 7800GTX (256MB) gets near 6500... meanwhile the X1800XT beats the 7800GTX in actual performance.

That's going in my sig as the biggest BS ever written on AT.

Originally posted by: beggerking
You should use field-level validation to validate inputs. If anything other than a number is entered into a field, the program crashes.

thank you so much. I never thought of that.

Originally posted by: xtknight
:Q Good work so far.

I can help you with automated detection if you want. Maybe I'll find out a way to detect overclocks, active pipes also.

I need my program to be tested in the automated sense (instructions there), so that we can combine these programs into something useful.

http://forums.anandtech.com/messageview...&STARTPAGE=6&FTVAR_FORUMVIEWTMP=Linear

If you can do that, you should feel bad. You made me spend three nights of work entering all those cards, and now you tell me this...

Originally posted by: SickBeast


The X1900XTX only gets 11,950 MMs. Something doesn't quite add up. The Inquirer was saying that the 7900GTX would need 700MHz just to match the X1900XTX. There must be something we don't know about. :)

Not only that, but IMO a 700MHz 7900 card is very unlikely.

Exactly. The highest clock seen from Nvidia so far is 550 on the GTX 512, and now they're telling me they can cram in more pipes and raise the clock to 700MHz? Nah, it's one or the other. If Nvidia stays with the 24-pipe architecture, they will make a really good heatsink and clock the card at 700/900 (estimates). If they cram more pipes in, they will keep the current GTX 512 clocks.

The second option seems to yield far better performance. I hope Nvidia doesn't just overclock the current architecture, but actually makes a new chip and smokes ATI. Believe it or not, 16633 is not that far-fetched. If the card really does have 32 pixel pipes, 32 pixel shaders, and 8 vertex pipes, it might beat the snot out of the X1900.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: JAG87
If you can do that, you should feel bad. You made me spend three nights of work entering all those cards, and now you tell me this...

You didn't waste any work. Right now I am only talking about detecting the model of card in the PC; it still has to look through the table to find shaders, etc. It'll be a long, long time before I get code to detect active pipelines, if I ever have the ambition to. But if there are just some boxes to type the unlocked number into, then active pipeline detection isn't needed anyway.
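The table lookup xtknight describes, detecting only the model name and then pulling shader counts from a static table, might look like the sketch below. The entries come from the specs quoted earlier in this thread; treat them as examples, not a complete database:

```python
# model -> (pixel pipelines, pixel shaders, vertex pipes), per the counts quoted above
SPEC_TABLE = {
    "GeForce 6800 GS": (12, 12, 5),
    "GeForce 6800 GT": (16, 16, 6),
}

def lookup_specs(detected_name: str):
    """Map a detected device name onto the spec table; None means 'ask the user'."""
    for model, specs in SPEC_TABLE.items():
        if model.lower() in detected_name.lower():
            return specs
    return None

print(lookup_specs("NVIDIA GeForce 6800 GT"))  # (16, 16, 6)
print(lookup_specs("Radeon X1800 XT"))         # None -> fall back to manual entry
```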
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: xtknight
Originally posted by: JAG87
If you can do that, you should feel bad. You made me spend three nights of work entering all those cards, and now you tell me this...

You didn't waste any work. Right now I am only talking about detecting the model of card in the PC; it still has to look through the table to find shaders, etc. It'll be a long, long time before I get code to detect active pipelines, if I ever have the ambition to. But if there are just some boxes to type the unlocked number into, then active pipeline detection isn't needed anyway.

:) Yes... I read your code; you were lazy in listing out the shader information for most of the cards.

I looked into WMI and it's not in there. WMI only has device ID info and the amount of memory on the video card... so the only place to go now is probably the DX SDK or the ATI/NV SDKs...

If none of the above works, then we'll have to use assembly coding... which I've forgotten how to do, was never good at in the first place, and am too lazy to do anyway...