Doom3+OpenGL=bad news for ATI?


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: GeneralGrievous
Just maybe ATI could do for Doom3 what Nvidia did for Far Cry and increase the performance with their drivers....oh dreams .....
You mean increase performance by lowering IQ? No thanks, at least for the X800 cards.

Actually, with Patch 1.2 the image quality issues have been completely fixed on all NVIDIA video cards, while still retaining the performance gained from the previous set of drivers.

Link - "And for now we can mention a comforting fact that the problem of low NVIDIA quality in FarCry (in several scenes you could see pixelization at changeable lighting) is resolved in v1.2 even without using Shaders 3.0." - digit-life
 

Socio

Golden Member
May 19, 2002
1,732
2
81
Originally posted by: MrPabulum
Isn't ATI completely rewriting their OpenGL drivers? Maybe the Cat. 4.8s will coincide with a Doom 3 release? ;)

Yes they are!

With NVIDIA always getting the upper hand as far as OpenGL goes, believe me, ATI's engineers are going to make their drivers just as good as NVIDIA's, if not better. They pretty much don't have a choice, seeing that DOOM III will be here in a few weeks, and some of the biggest games now in production, like Quake 4 and RTCW 2, are based on the Doom III engine, which of course is OpenGL. Not to mention the rumor that COD 2 is in the works as well and will also use the Doom III engine.


As for the 60 FPS cap, I heard that was for online play only, which is similar to what UT2004 already does.
 

stnicralisk

Golden Member
Jan 18, 2004
1,705
1
0
Originally posted by: Extrarius
Originally posted by: Alkaline5
[...]So if things in D3 are only moving 60 times a second, drawing more frames than that would be pointless.[...]
Not quite true, because the positions of objects could be interpolated for drawing to make them look smoother. In other words, if the object is at X=1 in physics frame 1, and X=5 in physics frame 2, the display (if it were running at, say, 120 FPS) could show the object at X=1, X=3, then X=5.

Also, what is up with the whole "ATI sucks at OpenGL" thing? I've never noticed any such thing, nor have I noticed NVIDIA doing worse at DirectX. In fact, IME ATI does better at OpenGL than D3D (or all the games I own that support both don't use DirectX right or something), so I'd say it's not a definite rule at all.


Look at some benchmarks. RTCW, COD, and NWN all run at a lot more FPS on NVIDIA hardware (even on the FX series, man!), while anything with 2.0 pixel shaders ran horribly on the FX series and much faster on ATI. This isn't speculation, this is FACT.
 
Apr 14, 2004
1,599
0
0
Actually, with Patch 1.2 the image quality issues have been completely fixed on all NVIDIA video cards, while still retaining the performance gained from the previous set of drivers.
Ah, my mistake. I thought you were talking about the 5xxx cards.

RTCW, COD, and NWN all run at a lot more FPS on NVIDIA hardware (even on the FX series, man!), while anything with 2.0 pixel shaders ran horribly on the FX series and much faster on ATI. This isn't speculation, this is FACT.
Take a look at this. Not every OpenGL game does better on NVIDIA hardware. The 9800 Pro beats the overclocked 5950.
 

Alkaline5

Senior member
Jun 21, 2001
801
0
0
Originally posted by: Cerb
Why not 100 ticks/s? That would even satisfy members of AT :)
After thinking about the incredible amount of control the player had over the Q3 engine, I'd be really surprised if there wasn't a server-side setting that allowed some leeway in manipulation of the frame-lock in singleplayer.
Originally posted by: Extrarius
Not quite true, because the positions of objects could be interpolated for drawing to make them look smoother. In other words, if the object is at X=1 in physics frame 1, and X=5 in physics frame 2, the display (if it were running at, say, 120 FPS) could show the object at X=1, X=3, then X=5.
I'm no graphics programmer, so this is an honest question: in-game characters and objects could have some degree of prediction applied, but player characters could be too erratic to predict successfully and would require waiting for finalized locations, right? I assume the engine would have to calculate the locations of all on-screen objects twice (once for the intermediate frame and once for the "final" frame) before being able to produce the intermediate frame. Since the intermediate frame couldn't be drawn until the final frame was nearly ready for rendering, wouldn't that produce a bottleneck?
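
The interpolation Extrarius describes doesn't actually require prediction or double simulation: a common approach (my own minimal Python sketch, not id's actual code) is to render between the two most recently *completed* physics states, so the display simply lags the simulation by at most one tick.

```python
# Minimal sketch of rendering between fixed physics ticks by linear
# interpolation. The renderer blends between the two most recently
# completed physics states, so no prediction of player input is needed;
# the picture just trails the simulation by at most one tick (1/60 s here).

TICK_RATE = 60          # physics updates per second (the Doom 3 cap)
DT = 1.0 / TICK_RATE

def interpolated_position(prev_x, curr_x, alpha):
    """alpha in [0, 1): how far render time sits between the two ticks."""
    return prev_x + (curr_x - prev_x) * alpha

# Extrarius' example: X=1 at physics frame 1, X=5 at physics frame 2.
prev_x, curr_x = 1.0, 5.0

# A 120 FPS renderer draws one frame exactly halfway between the ticks:
print(interpolated_position(prev_x, curr_x, 0.5))  # -> 3.0
```

Because both endpoint states are already known by the time the in-between frame is drawn, there is no bottleneck, only a fixed latency of one physics tick.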
 

badnewcastle

Golden Member
Jun 30, 2004
1,016
0
0
I'm sure you can find more information on this by doing a Google search, but from my understanding humans can only see between 50 and 60 fps on a computer screen... so what is the big deal about them capping the fps?
 

Pulsar

Diamond Member
Mar 3, 2003
5,224
306
126
First, badnewcastle, there are some instances where you can see the difference. I generally try to run at 85 fps / 85 Hz for completely smooth gameplay.

You will see the difference if you set two computers, one at 60 and one at 85, side by side, at least in my experience.

As for the control you had in Q2 and Q3, that is exactly what they don't want. There were a TON of exploits centered around changing your framerate: you could set your framerate to 1 and fit through spaces you weren't meant to fit through, then change it back to 85 (where you'd get max jump distance). Because of strafe-jumping in all the Quake games, jump speed is VERY important, as is height.

Because physics are directly tied to the framerate in Q2/Q3, they had to get rid of that to change how it behaved.

You could also purposely lag yourself and then lag-jump by changing your frames correctly. It was a large problem in tournament play, especially in games like capture-the-flag.
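
Why does changing your framerate change your jump at all? Here's a toy illustration (my own model with made-up Quake-like constants, not the actual movement code): when the physics timestep equals the frame time, a simple Euler-integrated jump reaches a slightly different peak height depending on your FPS setting, which is exactly the kind of inconsistency a fixed 60 Hz tick removes.

```python
# Toy model of framerate-dependent physics: a jump integrated once per
# rendered frame. The per-step discretization error depends on dt, so
# changing your FPS changes how high you jump.

GRAVITY = 800.0        # units/s^2, Quake-like (assumed value)
JUMP_VELOCITY = 270.0  # units/s, Quake-like (assumed value)

def peak_height(fps):
    """Peak height of a jump when physics runs once per rendered frame."""
    dt = 1.0 / fps
    x, v = 0.0, JUMP_VELOCITY
    peak = 0.0
    while True:
        v -= GRAVITY * dt   # velocity updated before position, so a
        x += v * dt         # larger dt loses more height per step
        peak = max(peak, x)
        if x <= 0.0:        # back on the ground
            return peak

for fps in (30, 60, 125):
    print(fps, round(peak_height(fps), 2))
# Higher FPS -> measurably higher jumps in this model, so players could
# tune their framerate cvar for an in-game advantage.
```

Decoupling the simulation into a fixed tick (as Doom 3 does) makes `dt` a constant, so every player gets identical physics no matter how fast their card renders.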
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I think what we failed to take into consideration is that even where NVIDIA dominates at OpenGL, ATI's X800 series still manages about 100+ fps at max detail settings (1600x1200, 4xAA/8xAF). So yes, NVIDIA cards might get 150-200 frames in COD, RTCW, etc. (in NWN the difference is more like 5 frames tops...). NVIDIA does hold a considerable advantage, but so far the most stressful games have been based on Direct3D (UT2k4, Far Cry, T:AOD, etc.).

Doom 3 seems to be the first OpenGL game in a while to put significant stress on the cards. Yes, it might run slightly faster on NVIDIA's hardware, but when neither top-end card will play it at max detail, what difference does it make if one plays at 55 fps and the other at 50? Of course I don't know the exact numbers; we'll just have to see upon its release how much difference there is between the top cards. What might give NVIDIA an additional advantage is its stronger performance under heavy use of stencil shadows in any particular game (i.e. Doom 3), where it dominates ATI's hardware.
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
I bet you guys would be better off running this at 30 fps with the details cranked up higher.
Doom 3 is a relatively slow game and plays beautifully at 30 fps.
It's not worth the IQ sacrifice to run it at 60+ fps anyway.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: clicknext
Originally posted by: thegimp03
"Only 60 fps...." This is pre-supposing that your computer will even be able to reach that level? Hehehe. :)

lol, yeah, I bet the high end systems will only be around that level, and everything else lower.

Exactly