
New Hercules 3D Prophet 4500 with Kyro 2!

Page 3 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
No T&L is considered a step backwards by most game developers. I want to see how well it fares in 3DMark 2001; of course it won't take long, since 3DMark 2001 is meant for cards that support DX8. I am more interested in the next Kyro card with T&L.
 
Not really, brute force has worked so far. What I do consider 2 steps backwards is the price of the GeForce 3.
 


<< I want to see how well it fares in 3DMark 2001; of course it won't take long, since 3DMark 2001 is meant for cards that support DX8. >>



lol, since when has 3Dmark been considered a viable and reliable benchmark? Don't make me laugh, hahahahahahahaha
Why does a GFmx get twice the score of a Voodoo5? I have NEVER seen an MX running ANY game at twice the speed; in fact, I can't remember many games running better on an MX. Before you start thinking I'm a Voodoo zealot, I have owned both cards and a GF2 Pro. I'm not saying that the Voodoo is necessarily better, but it is definitely not lagging by 50%.
I think most reasonable reviewers would also agree that as a benchmark 3DMark is BS, just eye candy for the masses. Just out of curiosity, does nVidia own a stake in MadOnion?
 

3DMark has its uses... but one of them is NOT comparing cards from different manufacturers. Just use it for what it is best at: comparing different drivers for your nVidia card.

MadOnion? Just a bunch of talented amateurs from the old demo scene days.
 
UKTaxMan-

"Why does a GFmx get twice the score of a Voodoo5? I have NEVER seen an MX running ANY game at twice the speed"

Now you have🙂

It is actually 3dfx's fault that it ended up that way. 3DMark2K had to run 16bit default because 3dfx didn't have any board that could do 32bit 3D when the bench came out (the V3 was their latest and greatest at the time). The 32bit scores have always been reflective of the actual relative performance in recent games (you can search through this board for evidence of that). ATi looked bad because they had very poor 16bit performance. nVidia just happened to have seriously @ss kicking 16bit performance, so they looked incredibly good. Check out the scores popping up for the new test that does run 32bit default; all of a sudden the scores have gotten real tight between the various boards (nVidia and ATi in particular).
 
Ben, a few things....

1) Those MDK2 scores are 640x480. 3DMark2000's default is 1024x768. Kind of a weird comparison, no?

2) The MX drops from 140 fps @ 640 to just over 60 fps @ 1024. The 5500 "drops" from 60 to ~56 at the same resolutions. Doesn't that strike you as odd?

3) the 5500's OGL drivers SUCK SUCK SUCK in MDK2. With WickedGL, I get ~90 fps in MDK2 @ 1024x768x32 w/texture slider maxed.


<< 3DMark2K had to run 16bit default because 3dfx didn't have any board that could do 32bit 3D when the bench came out >>

Yeah, and the 5500 was already past due by that time; it was a known fact that the 5500 had 32-bit color, blah blah blah... keep on preaching, Benji! 😉

If they were really that concerned about "unsupported features", I sincerely doubt they would've included EMBM, considering how many cards supported it at the time (Matrox) and how many were on track to support it (Matrox)
 
I talked to someone with a 32 meg Radeon. He says UT will run @ 1600 x 1200 x 32. So what's the deal? Why does AT not run that test?
 
Unreal Tournament, I believe, only reads the first 7 resolutions that a video card's driver reports. Since NVIDIA's drivers expose so many resolutions, this limits Unreal Tournament to 1024x768 as a maximum. I guess he does this to be fair, since it is a problem with Epic's Unreal Tournament and not with NVIDIA's drivers.
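The 7-resolution cap described above can be sketched roughly like this (hypothetical mode lists and cap constant for illustration, not actual Unreal Tournament or driver code):

```python
# Rough sketch of the enumeration cap described above -- the mode lists
# and the constant are assumptions, not actual UT or driver behavior.

MAX_MODES = 7  # assumed: the game keeps only the first 7 reported modes

def visible_resolutions(driver_modes):
    """Return the modes the game would actually offer in its menu."""
    return driver_modes[:MAX_MODES]

# A driver that reports a long, ascending mode list (many low modes first):
long_list = [(320, 240), (400, 300), (512, 384), (640, 480), (800, 600),
             (960, 720), (1024, 768), (1152, 864), (1280, 1024), (1600, 1200)]

# A driver that reports only a handful of modes:
short_list = [(640, 480), (800, 600), (1024, 768), (1280, 1024), (1600, 1200)]

print(max(visible_resolutions(long_list)))   # tops out at (1024, 768)
print(max(visible_resolutions(short_list)))  # (1600, 1200) survives the cap
```

So a card whose driver front-loads lots of low resolutions never gets to offer 1600x1200 in-game, even though the hardware supports it.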
 


<< It is actually 3dfx's fault that it ended up that way. 3DMark2K had to run 16bit default because 3dfx didn't have any board that could do 32bit 3D when the bench came out (V3 was their latest and greatest at the time) >>



3dfx's fault? Look, everyone with any sense knows that 3DMark has always been hopelessly biased towards nVidia. Punishing 3dfx by halving their scores over the lack of one feature shows that bias clear as day.
 
Robo-

"Ben, a few things....

1) those MDK2 scores are 640x480. 3dMark2000's default is 1024x768. Kinda a weird comparison, no?"


Errr, the quote I was referring to said-

"I have NEVER seen an MX running ANY game at twice the speed"

You also forgot to mention that that was 640x480 32bit versus 1024x768 16bit, and dropping to 16bit makes a BIG difference for the GF2MX at that res (~25FPS in Quake3, for instance).

"yeah, and the 5500 was already past due by that time, it was a known fact that the 5500 had 32-bit color, blah blah blah...keep on preaching"

My kids still believe in the tooth fairy too, doesn't mean the guys who work at MO do🙂

What was the number one selling single add-in board for the twelve months preceding and roughly six months post 3DMark2K launch? Why, it was the V3🙂

"if they were really that concerned about 'unsupported features', I sincerely doubt they would've included EMBM, considering how many cards supported it at the time (Matrox) and how many were on track to support it (Matrox)"

Must have been their heavy nVidia bias😛 Know how much of a weighting EMBM has on the 3DMark2K score? 0%, the same as hardware T&L and AGP texturing. Of course, more examples of heavy nVidia bias, right?🙂

Orbius

"3dfx's fault? Look everyone with any sense knows that 3DMark has always been hopelessly biased towards Nvidia. Punishing 3dfx by halving their scores because of the lack of 1 feature shows that bias clear as day."

Of course it does. That's why nVidia is using their patented hardware T&L technology that no other board manufacturer can use, right? That's why MS couldn't add hardware T&L to DX7, or why the OpenGL board couldn't add it to OpenGL, because it is an nVidia exclusive feature, right? That's why MO switched over to 32bit default so ATi would continue to trail horribly, right? Since they are so nVidia biased and all. We had this discussion in another thread.

It's all the money nVidia is charging. See, they need to overcharge for their cards so they can bribe all the benchmark makers, the game developers, and even Intel (can't use a V5 with a PIV) to make their hardware look and work better with their stuff. The cards are dirt cheap to make; nVidia is just paying everyone off to make 3dfx look bad, and they are STILL DOING IT!! That's right, even after 3dfx is dead, game developers and benchmark makers are STILL using hardware T&L, Dot3, and are even planning on using all of those GeForce3 features, all just to punish 3dfx!!! We had this all figured out some time ago; nVidia is bribing everyone they can....😕
 
🙁
Despite the greatness of this card, there are many out there that don't like it....
I wonder why...
Please try to use your brains for once...

3D cards are what they are, 3D cards, no more than that...
Don't make a religion out of a 3D card !!!
 
I agree with powervr2, this is nuts -

I have been looking at all the available data on this card on the forums and the net, and I have come to a few conclusions:

1) The Kyro2 is well suited for today's games. With the lack of development of PC games going on right now, the Kyro2 will be good to go until we are actually in "need" (to be read loosely) of truly DX8 compliant hardware in the next 6 to 8 months.

2) Developers have not exactly been "chomping at the bit" to develop T&L games. T&L has been largely overhyped by the video card companies in an effort to differentiate their products in the marketplace (read: to kill 3dfx). It is well implemented in a couple of titles, but... not very well across the board. A faster CPU will do the work better... and cheaper.

3) FSAA works with every title. I think we can all agree that jaggies suck... Any attempt to bring FSAA to the masses should be praised.

4) People are spending way too much time benchmarking their systems, and not enough time playing the games that they presumptively bought the video cards to play. Benchmarking can only ultimately lead to unhappiness with your hardware. If not now - later...

Go out - play UT or Q3A - be happy. For tomorrow, all of our hardware will be obsolete.

 
OK Ben, if it's such a ridiculous notion, how does MadOnion make its money?
If you're saying the potential doesn't exist for people to 'contribute' to MadOnion, then you're blind to reality.
 