
A Bitter Thread - Our Favourite Proprietary Graphics API (Glide) Destroys Yet Another Promising Title (Deus Ex)

jpprod

Platinum Member
Yes, I'm talking about Deus Ex and Glide. Yes, this is an on-topic thread, cuz it's more about video hardware/software issues than the game itself. Now then..

Have you checked out the Deus Ex demo? If not, I wholeheartedly recommend you do so. This title proves two things:

1. No matter how much people have tried to put 'em down since Daikatana showed up, ION Storm is quite capable of creating a "hit" title. From what can be told from the demo, as a game Deus Ex is amazing - the whole feel of the world is extremely realistic. But since this is a technical thread, I'll say no more.

2. No developer can ever make the Unreal engine run properly on the majority of graphics hardware, and UT only ran acceptably in D3D 'cos its graphics were so simple and its levels small. It's very, very sad to see the technical-WoT fiasco all over again... I can't believe developers still license the Unreal engine; as far as performance goes, they might as well license the Ultima XI engine.

In D3D at default settings, Deus Ex runs like a one-legged dog. This is not only on my measly Celeron 375/128mb/GeForce system; several high-end system owners (at least one of the systems was a P3/800, 256mb + GF DDR) have had the same symptoms: unbelievably sluggish performance in any larger areas and during combat. No matter the horsepower you throw at it, the Unreal engine's D3D renderer just won't be satisfied and gobbles up the juice.

Why does the Unreal engine run so badly? It sure as hell has nothing to do with nVidia card architecture, since excluding the 3dfx Voodoo5, all other D3D cards crumble under its weight as well. Outside UT, CPU horsepower seems to do nothing for the Unreal engine either. Also, the slowdown doesn't occur when there are lots of textures on the screen, or when the polygon count is high, but rather only in the largest and most open areas. I have some theories (of which most are old):

- The portal-style clipping of the Unreal engine somehow fits very poorly with anything but Glide. I don't know the Unreal engine thoroughly, so I really can't elaborate. But since Unreal is the only major game to utilize portal clipping, and also the only one to have major, unsolvable D3D issues... Maybe a bit too far-fetched.

- Using palettized textures. It's no secret that the Unreal engine relies heavily on palettized (8-bit) textures, which some D3D hardware (namely Nvidia TNT, TNT2) or D3D drivers (GeForce, GeForce2) fail to support. I believe Epic hasn't done a proper job with texture management to alleviate missing HW/SW support for this feature, and is relying on the D3D driver to do the "conversion + new texture upload" process on-the-fly. Needless to say, this would totally kill the framerate and cause excessive HD swapping, as the conversion copies easily eat up megabytes of memory.
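The on-the-fly conversion theorized above can be sketched in C. This is a hedged illustration only - the `Texture8` type and `expand_to_rgba` name are made up for the example and come from no real driver or Unreal source - but it shows why expanding an 8-bit palettized texture at upload time is so expensive:

```c
/* Hedged sketch of the theorized on-the-fly conversion: if the driver lacks
 * 8-bit texture support, every palettized texture has to be expanded to
 * 32-bit RGBA before upload. Types and names here are illustrative only. */
#include <stdint.h>
#include <stdlib.h>

typedef struct {
    int      width, height;
    uint8_t  *indices;       /* one palette index per texel  */
    uint32_t palette[256];   /* 256 RGBA entries (1 KB)      */
} Texture8;

/* Expand to RGBA; caller owns the returned buffer (4x the texel data). */
uint32_t *expand_to_rgba(const Texture8 *t)
{
    size_t n = (size_t)t->width * t->height;
    uint32_t *out = malloc(n * sizeof *out);
    if (!out) return NULL;
    for (size_t i = 0; i < n; i++)
        out[i] = t->palette[t->indices[i]];  /* per-texel palette lookup */
    return out;
}

/* A 256x256 palettized texture is 64 KB of indices + 1 KB of palette; the
 * expanded copy is 256 KB - and has to be rebuilt (or cached) every time the
 * game swaps palettes, which is where the swapping and stutter would come in. */
```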


[RANT MODE]
Nevertheless, Deus Ex is such a great game that I'm even considering overlooking the issues and buying it. But I don't get it: why do developers choose the Unreal engine, overlooking these issues, when there are several technically superior alternatives (Quake II/III, LithTech 1.0/1.5/2.0) out there? Are they sitting ignorantly in their offices with only Voodoo cards in their computers, believing that people will buy a new video card just to run their game? Or are there huge sums of "sponsor" money involved?

Worst of all, the head of the Duke Nukem Forever development team openly announced that DN4ever will run best with Glide, and they won't even try to do anything about it.

Hell, screw the whole damn thing, I think I've had it up to here <pointing at his own neck> with PC games. My next computer will be a Sun or Alpha UNIX box + PS2.
[/RANT MODE]

[edit] Addressed typos, upgraded to a cheesier thread title [/edit]
 
Well, I agree that D3D support for the UT engine has been bad from the start. But I feel the need to point out that the development team around Tim Sweeney (Unreal's main programmer) has been working their bu$$s off, bringing out patch after patch. The current UT status as far as D3D goes is very playable. I have been experimenting with TNT, G400 and TNT2 Ultra, so I know what I'm talking about. The D3D is sharper than Glide. Even though Glide remains the top choice for UT engine games, consider this: the V3 cards are extremely cheap. I for one somehow prefer a game that rocks on a 100-buck card.

As far as engines go, the Q3A engine is the best in terms of compatibility. THIS IS, HOWEVER, NOT because of id, but because of the guys from GLSetup. They built the startup program that detects the OpenGL driver for your card.


The Q2 engine is definitely worse than the Unreal engine. The LithTech engine is great (how I adored Shogo) but needs more support...

I will get Deus Ex and hope to get it running smoothly. I will upgrade to a 700E for that baby, so it better rock. Since I own a V3, I should be fine....
 
I agree with Leo, 'cept maybe about the Q2 engine; it's just too dated to do anything with now. I'm impressed Raven did as well as they did with it.
And saying the Q3 engine is better because of GLSetup is just bull.
I've never used GLSetup, yet Q3 runs like 2x faster than UT most of the time, and at the same time looks better.
All GLSetup does is find drivers (as is my understanding, correct me if I'm wrong).
And driver issues are hardly something that can be blamed on the guys over at id.
 
Perhaps my biggest concern with Deus Ex is the number of sales Ion Storm will lose because of their choice of game engine - there were dozens of people on Ion Storm's message boards who tried out the demo and cancelled their preorders solely because of D3D performance. I sincerely hope that publishers don't draw hasty conclusions from this - mediocre sales of Deus Ex and other cyberpunk RPGs (System Shock II comes to mind) might eventually lead to the partial destruction of the whole genre.

In my opinion Epic and Sweeney aren't the only ones to blame for this whole farce, since Unreal Tournament runs quite well indeed in D3D (just tried the demo out again - damn, it's smoother than Quake3) with all details at max.

About the Quake II engine: it might be older than Unreal and boast less spectacular special effects, but it's extremely compatible with a wide selection of hardware, expandable (as SoF demonstrates), and it can take massive amounts of geometry without severe hiccups.

Don't get me wrong, aside from the technical issues the Unreal engine is groundbreaking in many ways. It introduced several graphical innovations to the genre (volumetric fog effects, detail textures, halos), plus its music/sound capabilities are still among the best on the market (even without 3D sound hardware). But the D3D issue is simply such a major one that Epic shouldn't even be licensing the engine in the first place. Oh, and what kind of picture does this whole ordeal convey of Epic's QA process? Not very positive, I can tell you that.

<Bitter man's rant about to commence. Children, run, or you'll end up like him>

Quote from the lead designer of DNF, George Broussard:
"That said, we will try to do what we can with the D3D drivers, but in reality, Epic has spent a year on them, so I'm not sure what more we can gain over that."

A year!? More like two years by now. In my honest opinion two years is enough time to re-write the whole Unreal engine codebase to run better in D3D, if Epic really wanted to get to the bottom of this issue. And you know what's really ironic? The one complaining shouldn't even be me, but the licensees who paid enormous sums of money for the engine...

<Bitter man's rant ends, thanks for bearing with me>
 
Deus Ex just made me decide to go V5.

I need a new card, and knowing how much better the Unreal engine looks and runs in its native Glide environment, the Deus Ex demo just pushed me off the fence into Voodoo land.

It's a bitch, because I like nVidia's technology, the way they do business, and the raw FPS they produce.

The point is that software drives hardware sales. Not vice-versa.
-Nobody bought Quake because their GeForce was optimized for it.

And that's about the ONLY thing 3Dfx knows about video cards.
 
You can turn on Palettized textures if you have a GF1/2. Go to your Direct3D key and add a dword PALETTEENABLE with a value of one.

The funny thing is, I've noticed that most GF owners need to have 60FPS as their acceptable minimum, but most Voodoo owners are happy with 30FPS. To each their own I guess.
 
jpprod - I don't see how it is "destroyed". Maybe part of the rant can be aimed at NVIDIA to get the palettized textures feature working (since their new hardware seems to have it). I've also read tons of posts from NVIDIA-based cards' owners that say either 1) Glide wrappers work well, or 2) UT D3D runs very well on their computers. So what's the problem?

If you use NVIDIA cards, you get better Q3 performance as a bonus. If you use 3dfx cards, you get better Unreal performance as a bonus.

I disagree that the engines you listed are "better" solutions for developers.

Michael

ps - remember the cycle time it takes to design a game. When some were choosing Unreal, the VSA-100 delay was not known.
 
Unfortunately palettized textures aren't supported by the GeForce Detonator D3D drivers, though they're there in OpenGL. Enabling palettized would undoubtedly yield a massive performance gain in all Unreal-engined games; I wonder why Nvidia doesn't give us the option?

[edit] Michael, you were posting as I was

If you use NVIDIA cards, you get better Q3 performance as a bonus. If you use 3dfx cards, you get better Unreal performance as a bonus.

No, Nvidia cards have only a very marginal lead in Quake3, and 3dfx cards (Voodoo3 and up) are actually very playable at even higher detail settings. Unreal engine based games (aside from UT) run like a one-legged dog on anything besides hardware offering native Glide or MeTaL support. There's a big difference between being able to declare yourself the performance champ (Nvidia vs. 3dfx in Quake3) and having a huge, orders-of-magnitude difference in performance (3dfx vs. all other D3D/OpenGL hardware in running the Unreal engine).

[/edit]
 
So why do people shy away from writing for OpenGL instead of a combo of Glide and D3D? UT in OGL would be nice if it didn't crash so freakin' often, or have problems switching from the menu to levels, etc.
 
ArthurB1: Indeed, I too believe that the Unreal engine could run much better in OpenGL than in D3D if similar efforts were directed that way. As a conspiracist I must point out the possibility of favouring D3D over OpenGL being a political decision (j/k) 😛
 
Damn, Nvidia still hasn't got their drivers fixed regarding palettized textures??

UT and Deus Ex use tons of textures. Running these types of games without compression will hinder even a GTS. I wonder if even the 64 megger is enough for totally smooth gaming without compression.

 
That was my point. That was a kickass game that will only run in Glide. That is why I didn't get it.

TopQ
 
Sorry, Tribes runs fine in OpenGL mode. It's not perfect, but it's playable. And that's with a G400, with its not-so-perfect OpenGL.
 
I tried the new UT D3D dll file, but it didn't help, just made the graphics look disgusting.
jp:
1. Yep, agreed, the game looks amazing (even if I suck)... although there are irritating things. Like: when I meet the contact, a stray NSF soldier just wanders up behind me, waits for the cutscene to end, and just opens fire. Kinda unfair.
2. On my Celeron 550, 128mb, TNT2 Ultra (1024@16bpp) it runs OK, but a little choppy. I hope on my GF2 (Friday!!) it'll improve. Sucks that you have to disable detail textures to eliminate choppiness.

Too bad the game is so damn fun I have to buy it regardless 🙂
 
jpprod - I'm still unsure of why Glide is the issue for you. Seems to me that the 3dfx cards handle the engine better because they have palettized textures. If the TNT series had them, or the D3D drivers for the GF/GF2 worked for palettized textures, then the games would work very well.

Glide is an API that's optimized for 3dfx cards. When it was created, it was much better than D3D and much easier to use than OpenGL. Unreal was originally a software-only game.

I would say this is an NVIDIA problem in that they did not support a feature that was being used by many games (a little different from saying that an NVIDIA card is missing a feature), and now it is an Epic problem to some extent, because NVIDIA cards are very popular with OEMs and that'll hurt their marketing.

Michael
 
How do you bench Deus Ex? On my V5, Glide seems a little smoother; D3D still plays fine as far as I can tell. I only played for a few minutes, just to see how it ran on the V5 - does it get slower after the beginning?
I was playing 8x6 with 2x and 4xFSAA in D3D. I had it at 1024x768, but then the letters on the HUD were too small for me to read easily.
 
jpprod, a highly interesting rant. I've already posted about the UT engine's technical problems; its political problems are also obvious, as Tim Sweeney initially spent all his time writing proprietary-API renderers (particularly 3dfx and S3) in exchange for "funding" from those companies.

Notably, Sweeney's deal with S3 (on using MeTaL with its S3TC support) may have prevented him from using S3TC/DXTC in other APIs (namely D3D). Despite Sweeney's comments, D3D has highly viable DXTC support; there is no honest excuse for using palettized instead.
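To put rough numbers on the DXTC point, here's a back-of-the-envelope sketch of per-texture memory for the formats under discussion. The byte counts follow directly from the format definitions (1 byte per texel plus a 1 KB palette for 8-bit palettized, 4 bytes per texel for uncompressed truecolor, 8 bytes per 4x4 block for DXT1); the function names are just for illustration:

```c
/* Per-texture memory for the three formats argued about in this thread.
 * Sizes follow from the format definitions themselves. */
#include <stddef.h>

/* 8-bit palettized: one byte per texel plus a 256-entry RGBA palette (1 KB). */
size_t bytes_palettized(size_t w, size_t h) { return w * h + 256 * 4; }

/* Uncompressed 32-bit truecolor: four bytes per texel. */
size_t bytes_rgba8888(size_t w, size_t h) { return w * h * 4; }

/* DXT1 (S3TC): eight bytes per 4x4 texel block.
 * Assumes dimensions divisible by 4, as is typical for game textures. */
size_t bytes_dxt1(size_t w, size_t h) { return (w / 4) * (h / 4) * 8; }
```

For a 256x256 texture this works out to 66,560 vs. 262,144 vs. 32,768 bytes: DXT1 is half the size of palettized and an eighth of truecolor, so memory footprint is no argument for sticking with palettes.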

Ultimately, the only good API is an open, non-proprietary API -- OpenGL. D3D, Glide, and MeTaL are all closed and proprietary. D3D is Microsoft-only; the others are OEM-only. From our perspective, D3D is more "open" since we mostly use Windows for gaming. However, I really hope Microsoft will support OpenGL on the X-Box; they may, in hopes of enticing the much-needed developers.

Finally, you made an interesting point about ditching PCs (or Windows) altogether, and going with a console for gaming. Just recently, I thought about this myself: the console game market is *huge* compared to the PC market, and the variety of good games (and development talent) is unimaginable. The hardware is stunning. While the PC graphics companies are bickering over "do you REALLY need T&L?", the consoles have already implemented excellent T&L functionality. The developers immediately took advantage of it, and now console games completely blow away the PC (especially on PSX2). The poly count in those console games makes me feel ashamed of PC technology.

Alas, I'm still hoping that X-box will breathe life back into the PC.
 
Unless the OpenGL guys improve texture management and a few other things developers are screaming about, OpenGL (for gaming) will be dead by the end of next year. 90% of games use D3D, and that number will increase with the release of DX8.

IMO
 
Leon, are you talking about Company X's OpenGL driver guys? NVIDIA, for one, has excellent OpenGL drivers, and their texture management in the 5.XX drivers (Detonator 2) is excellent - partly because they occasionally rely on AGP texturing, and partly because they track texture usage. I would prefer it if OpenGL could manually "force" textures into local/system memory (like D3D), but that could reduce its generality.

As to OpenGL being dead: NVIDIA has announced that all new features of their new hardware will be 100% supported under OpenGL, probably before the next version of D3D exposes them. In fact, even the GeForce1 has features usable under OpenGL that only Direct3D 8.0 will support.

Direct3D may improve, but (unless the DOJ interjects) it will never be an open, non-proprietary, cross-compatible standard like OpenGL. Recognizing that, companies like id Software (and the many companies who license their engines) will keep using OpenGL, and major players like NVIDIA will maintain excellent support for it. (Their recent fully functional Linux OpenGL drivers further attest to that: they were alpha just recently, and now they're being stabilized, with optimized performance and 100% functionality.)
 
Leo V - You want an open standard? Use Glide - 3dfx open-sourced it. That's not true for OpenGL, which SGI still has a great deal of claim over.

If texture management is not part of the OpenGL standard, then NVIDIA's workarounds or "extensions" to provide the texture strength that modern games demand will not be enough.

Palettized textures are an acceptable solution (I'm not claiming they're the best in every situation), and Unreal was started way before S3TC ever saw the light of day. Unreal is just a prominent example of that form of texture management. There are other methods, but NVIDIA's lack of support for a much-used feature is a main reason they are needed as much as they are.

Michael
 
Damn U Leo V. You just convinced me to buy a PS2. Screw waiting for T&L support or faster 4X FSAA!!! I'll just buy a PS2 or an X-box, which is going to have support from millions of games. Now that Sony made the PSX open hardware... I'm sure I can find a cheap PS2 knock-off (if that will ever happen).
 
Ugg, don't know about you guys, but there are a LOT of really bad games for the PS, just as there will be for the PS2. I'd even go as far as to say that the ratio of good to bad games is higher for the PC vs. any console system. Although true PnP sure is a nice selling point..
 