themusgrat (Golden Member) - Nov 2, 2005
Originally posted by: SickBeast
Oh yes I can draw an awesome monkey. I'll have it ready tomorrow night. :Q
Good work, men. Brilliant.
EDIT: Sorry, I won't crap up the thread anymore.
Originally posted by: RobertR1
God dammit, Munky! This is exactly why I wanted you to contact me. We would have had it patented, VC-funded, and IPO'd before Friday! Now you have to watch MunkyMark get ravaged by these snakes!
Originally posted by: munky
Originally posted by: KeithTalent
Ok, I think I messed something up. I have a Mobility X300 with a core of 350 and memory of 250 (woohoo), and by my Munkycalc I get 26.4, but when I run the FEAR test at those settings I get 15 as my average. Does anyone know what I am doing wrong? Oh wise Munky, please enlighten me!
How did you get 26.4 in my formula?
Last I checked the x300 is a 4 pipe card with a 128-bit mem bus. If that is correct, then the formula comes out to:
((350 * 4) + (250 * 128/16)) / 500 = 6.8
Which is still off, but now I'm wondering how an X300 got a higher avg fps than mwmorph's 9800 XT. Are you guys using the right settings? 1280x960, 4xAA, 8xAF, max details, no soft shadows...?
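For anyone who wants to check their own numbers, munky's formula drops straight into a few lines of C (the pipe count and bus width below are the X300 values he assumed above):

    #include <stdio.h>

    /* MunkyMark as given above:
       ((core MHz * pixel pipes) + (mem MHz * bus width / 16)) / 500 */
    static double munkymark(double core, double pipes, double mem, double bus)
    {
        return ((core * pipes) + (mem * bus / 16.0)) / 500.0;
    }

    int main(void)
    {
        /* munky's X300 example: 350 core, 4 pipes, 250 mem, 128-bit bus */
        printf("X300: %.1f\n", munkymark(350.0, 4.0, 250.0, 128.0)); /* 6.8 */
        return 0;
    }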
Originally posted by: JAG87
For ******'s sake, can you just wait...
Do you know how long it takes to input core clocks, mem clocks, pixel pipes, vertex pipes, and pixel shaders for EVERY SINGLE VIDEO CARD MADE BY ATI AND NVIDIA SINCE 2000?
You don't need the code to detect your card's specs, because I'm putting in presets for every single card out there. I am using rojakpot's list to fill in my program's database.
And beggerking, get your app off the net ASAP, because when mine goes up it will put yours to shame.
Tonight I should have a good working beta of the program. I have lots of spare time because I'll get out of school early.
Cheers.
EDIT: BTW, I did get the XP theme working, so thanks to the guy who offered code, but I got it.
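A preset database like the one JAG87 describes could be as simple as a static table in C; a hypothetical sketch (the struct layout and the spec values are illustrative, not his actual data):

    /* One preset per card: clocks and pipe counts keyed by name. */
    struct CardPreset {
        const char *name;
        int coreMhz, memMhz;
        int pixelPipes, vertexPipes, pixelShaders;
    };

    static const struct CardPreset presets[] = {
        { "GeForce 6600 GT", 500, 900, 8, 3, 8 },   /* values illustrative */
        { "Radeon X800 XL",  400, 980, 16, 6, 16 }, /* values illustrative */
        /* ...one row per card, filled in from rojakpot's list... */
    };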
Originally posted by: mwmorph
EDIT: Odd, I still get 11 with soft shadows on or off.
Originally posted by: munky
Ok, does anyone know how to actually code the app so it detects the hardware features of the card? Or I suppose I could code an app to measure the performance of the card, like texture fillrate, shader throughput, and mem bandwidth, but my mad OpenGL skillz are getting kinda rusty. Plus, would that automatically give ATI cards a lower score? Anyway, it looks like MunkyMark will need an updated algorithm to compensate for 128MB cards, so don't be pirating my idea just yet!
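One way to do the detection half of that on Windows is to enumerate display adapters and read the PCI hardware ID string, which is presumably what the automated app further down does; a minimal sketch:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DISPLAY_DEVICE dd;
        DWORD i;

        /* DeviceID carries the PnP hardware ID, e.g.
           "PCI\VEN_10DE&DEV_0092&SUBSYS_C5183842&REV_A1". */
        dd.cb = sizeof(dd);
        for (i = 0; EnumDisplayDevices(NULL, i, &dd, 0); i++) {
            printf("%s\n  %s\n", dd.DeviceString, dd.DeviceID);
            dd.cb = sizeof(dd); /* reset before the next call */
        }
        return 0;
    }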
Originally posted by: munky
Hey, nice work on that app. How did you know the code to query the NV device features? Is it something you can tell by looking at the driver DLL? I'm trying to figure out how to determine that info for ATI cards.
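For NVIDIA cards the answer (per xtknight's post below) is nvcpl.dll. A rough sketch of querying it, assuming the NvCplGetDataInt export and flag values from NVIDIA's NVCpl control-panel API; verify the constants against the real header before trusting them:

    #include <windows.h>
    #include <stdio.h>

    /* Flag values as documented for NVIDIA's NVCpl API -- verify. */
    #define NVCPL_API_NUMBER_OF_GPUS     7
    #define NVCPL_API_NUMBER_OF_SLI_GPUS 8

    typedef BOOL (APIENTRY *NvCplGetDataIntFn)(long lFlag, long *plInfo);

    int main(void)
    {
        long nGpus = 0, nSliGpus = 0;
        NvCplGetDataIntFn pGetDataInt;
        HMODULE hLib = LoadLibrary("nvcpl.dll");

        if (!hLib) { printf("nvcpl.dll not found\n"); return 1; }
        pGetDataInt = (NvCplGetDataIntFn)GetProcAddress(hLib, "NvCplGetDataInt");
        if (pGetDataInt) {
            pGetDataInt(NVCPL_API_NUMBER_OF_GPUS, &nGpus);
            pGetDataInt(NVCPL_API_NUMBER_OF_SLI_GPUS, &nSliGpus);
            printf("GPU Count: %ld (in SLI: %ld)\n", nGpus, nSliGpus);
        }
        FreeLibrary(hLib);
        return 0;
    }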
Originally posted by: xtknight
Alright, I think I can claim the first working (automated) MunkyMark here! Took a good four hours.
(52K; calculates the score in less than a quarter of a second.)
MunkyMark alpha 0.01 - by xtknight (VC6.0 ANSI)
Supports SLI
List of supported GPUs:
NVIDIA GeForce 6800 Ultra
NVIDIA GeForce 6800
NVIDIA GeForce 6800 LE
NVIDIA GeForce 6800 GT
NVIDIA GeForce 6600 GT
NVIDIA GeForce 6600
NVIDIA GeForce 7800 GTX 512
NVIDIA GeForce 7800 GTX
NVIDIA GeForce 7800 GT
SLI is factored in as a 1.9x improvement per extra GPU.
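In code, that scaling might look like the sketch below (one plausible reading of "1.9x per extra GPU"; the actual implementation may scale differently):

    /* Multiply the score by 1.9 for each GPU beyond the first. */
    static double sliFactor(int gpuCount)
    {
        double factor = 1.0;
        int i;
        for (i = 1; i < gpuCount; i++)
            factor *= 1.9;
        return factor; /* 1 GPU -> 1.0, 2 GPUs -> 1.9 */
    }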
These are supported for the most part, but the device ID for 'NV43', for example, has not been added, only the one for 'GeForce 6600GT', and to be honest I'm not sure which one the 6600 GTs use. Please report your results.
It will initially start for ATI cards and display device and vendor info, but it will not calculate your score yet. Actually, I thought video RAM was needed for the algorithm, but apparently it's not, so you can comment out the ATI-excluding part, start adding in ATI device IDs, and it will work (not with Crossfire, though).
If you have two GPUs but didn't enable SLI, my program is aware of that and will only count your first GPU.
If anyone has any of the above cards, PLEASE test this. This is a super beta-alpha-gamma, whatever you want to call it, version; it's probably at least five weeks from perfect condition. Limitations: most likely bugs, limited card support, and no detection of overclocks (including manufacturer overclocks). It's in SHAKY FORM, so DON'T be surprised if you run into problems, but please do post the reported error here along with your video configuration. Nevertheless, here it is: just double-click it, you don't have to open a command console window, and it will pause for you at the end.
http://xtknight.atothosting.com/munkymark/mm06ALPHA-001.exe
Source code (Visual C 6): http://xtknight.atothosting.com/munkymark/src/mm06ALPHA-001-src.zip
(Press RETURN at end of program to quit it.)
Here's an example of it at work:
MunkyMark alpha 0.01 - by xtknight (VC6.0 ANSI)
(Currently only working on NVIDIA adapters)
Enumerating display adapters...
Device ID: PCI\VEN_10DE&DEV_0092&SUBSYS_C5183842&REV_A1
vendorID: 10de
deviceID: 0092
subsysID: c5183842
revision: 00a1
NVIDIA adapter detected.
Querying nvcpl.dll for information...
Video Mode: Single GPU
GPU Count: 1
SLI Factor: 1.0
VRAM: 256MB
Finding device name in look-up table.
Device: GeForce 7800 GT
Calculating MunkyMark!
MunkyMark: 30.4000014439
---
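The vendorID/deviceID/subsysID lines above come from splitting up that PCI device ID string; a minimal sketch of how such parsing could work (not necessarily the source's exact approach; note the real program apparently lowercases the fields):

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* Assumes all four fields are present in the string. */
        const char *id = "PCI\\VEN_10DE&DEV_0092&SUBSYS_C5183842&REV_A1";
        char ven[8] = "", dev[8] = "", subsys[16] = "", rev[8] = "";

        sscanf(strstr(id, "VEN_") + 4, "%4[^&]", ven);
        sscanf(strstr(id, "DEV_") + 4, "%4[^&]", dev);
        sscanf(strstr(id, "SUBSYS_") + 7, "%8[^&]", subsys);
        sscanf(strstr(id, "REV_") + 4, "%7s", rev);

        printf("vendorID: %s\ndeviceID: %s\nsubsysID: %s\nrevision: %s\n",
               ven, dev, subsys, rev);
        return 0;
    }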
So if you know C, please go ahead and improve upon it! The device table in the header file also needs populating, as current card support is limited. Actually, you don't need to know C to update the look-up table; it's self-explanatory. Not sure when ATI support is coming, but I might need somebody as a guinea pig; I don't know ATI's API yet, so it's impossible to detect Crossfire right now.
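A look-up table like that presumably just maps PCI device IDs to card names; a hypothetical entry format (0x0092 is the 7800 GT ID from the sample output above; the 0x0140 NV43 ID is illustrative and should be double-checked):

    struct GpuEntry {
        unsigned short deviceId; /* PCI device ID */
        const char *name;
    };

    static const struct GpuEntry gpuTable[] = {
        { 0x0092, "GeForce 7800 GT" }, /* from the sample output above */
        { 0x0140, "GeForce 6600 GT" }, /* NV43? -- verify before use */
        /* ...one entry per supported device ID... */
    };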
Floating-point precision for the score is 10 decimal places right now, but it can easily be modified (simply change one number in the printf call).
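For example, with standard printf format specifiers (the variable name here is illustrative, not necessarily what the source uses):

    printf("MunkyMark: %.10f\n", score); /* current: 10 decimal places */
    printf("MunkyMark: %.1f\n", score);  /* change the 10 to 1 for one place */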
Also, please manually calculate your score and check it against what my program returned. Thanks.
(No credits yet)