Architectural differences in Oblivion benchmarks

orangat

Golden Member
Jun 7, 2004
1,579
0
0
Is Oblivion faster with ATI cards because of architectural advantages or is it better optimized in software?
 

A5

Diamond Member
Jun 9, 2000
4,902
5
81
Considering it's a TWIMTBP title (meaning most of the software optimizations are Nvidia's), I'm guessing it's the extra shaders on the 1900 series that make the most difference. It also helps explain why most of the 1800 series gets beaten.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I wouldn't attribute the ATI performance lead just to the X1900's shaders. Seeing how even other cards such as the X850 XT have a big lead over their NV competition, it may be due to architectural differences, such as Z-culling, compression algorithms, or even the way the cards handle certain shader code. Or it could be that NV once again needs to release an "optimized" driver, and then they will catch up to ATI.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
I tend to agree with munky. It seems ATI has been on the 'more and more optimized' shaders bandwagon for a while. In games like Oblivion, that approach really has paid off.

Given that the X850 XT, for example, with 16 pipes, gives slightly better (non-HDR) performance in the outdoor settings than the 20-pipe 7800 GT and competes well with the 24-pipe 7800 GTX (higher minimum but lower average fps), I'd say that ATI's shader routines are more efficient for Oblivion--at this stage...
 

orangat

Golden Member
Jun 7, 2004
1,579
0
0
From the AT benches, Nvidia cards are significantly slower across every generation from the 68xx to the 79xx, which seems to belie the TWIMTBP claim.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
That goes for many TWIMTBP games: FEAR, TR:Legend, Farcry, BF2, ... did I forget any?
 

orangat

Golden Member
Jun 7, 2004
1,579
0
0
Originally posted by: munky
That goes for many TWIMTBP games: FEAR, TR:Legend, Farcry, BF2, ... did I forget any?

The differences are slight and not to the extent seen in Oblivion.
For example, in Oblivion the X850 XT is faster than the 7800 GT in the outdoor bloom benchmark, which is remarkable since the X850 is one generation behind and has 4 fewer pipes.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: orangat
Originally posted by: munky
That goes for many TWIMTBP games: FEAR, TR:Legend, Farcry, BF2, ... did I forget any?

The differences are slight and not to the extent seen in Oblivion.
For example, in Oblivion the X850 XT is faster than the 7800 GT in the outdoor bloom benchmark, which is remarkable since the X850 is one generation behind and has 4 fewer pipes.

Yeah, I know the differences are usually not so big, but it still makes me wonder: what exactly does the TWIMTBP money do?
 

Golgatha

Lifer
Jul 18, 2003
12,386
1,032
126
Originally posted by: munky
Originally posted by: orangat
Originally posted by: munky
That goes for many TWIMTBP games: FEAR, TR:Legend, Farcry, BF2, ... did I forget any?

The differences are slight and not to the extent seen in Oblivion.
For example, in Oblivion the X850 XT is faster than the 7800 GT in the outdoor bloom benchmark, which is remarkable since the X850 is one generation behind and has 4 fewer pipes.

Yeah, I know the differences are usually not so big, but it still makes me wonder: what exactly does the TWIMTBP money do?


It gets nVidia advertising in your face every time you load up a game?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: munky
That goes for many TWIMTBP games: FEAR, TR:Legend, Farcry, BF2, ... did I forget any?

Um, both ATI and NV performed pretty badly in F.E.A.R. overall until driver releases fixed it (such as the F.E.A.R. exe performance boost). But now they perform similarly, with NV having higher minimum FPS while ATI has a higher average FPS (2-3 fps faster).

Same goes for TR: Legend. NV can play that game at 19x12 with next-generation content and AA after the driver release that fixed a bug in the game.

FC is old. Both cards hold over 60 fps constantly! NV has the lead in HDR benches, while ATI can do HDR plus AA.

BF2 is the same as FC. Both cards hold over 60 fps constantly.

Early benchmarks showed the X850 XT PE sometimes beating the 7800 GT, but I don't think it can do that now. The 7800s are generally faster overall now due to driver maturity.

Oblivion favors ATI cards outdoors, but favors NV cards indoors. But overall ATI cards perform well in Oblivion thanks to its 48 pixel shaders.

Each architecture has its strengths and weaknesses; for example, when Prey is released on July 10th, NV should perform better since it's OpenGL, based on the Doom 3 engine (although heavily modified), and uses stencil shadows.

Note - NV doesn't use shader replacements anymore. Do ATI cards still? (Can't remember.)
 

Nextman916

Golden Member
Aug 2, 2005
1,428
0
0
Cookie Monster, you might want to edit your post, because you're referring to ATI cards in general when your post reflects the capabilities of the X1xxx series cards.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
I think Oblivion is a game like F.E.A.R. that was simply programmed predominantly on ATI hardware from the start. The X850 XT shouldn't be faster than the 7800 GT, as NV4x shader technology is better than the R4xx-based stuff.

No matter; expect Nvidia to do some driver optimization to get Oblivion running better as time goes on, the same as they did with F.E.A.R. But the message is quite clear: if you want the best experience in Oblivion, go for X1K cards of the X1800 or X1900 line.

ATI had better performance in F.E.A.R. until pretty much the 84.17 drivers and onward? Though I think ATI still has better performance with the soft shadows in that particular game.


 

Bull Dog

Golden Member
Aug 29, 2005
1,985
1
81
Originally posted by: Cookie Monster
Oblivion favors ATI cards outdoors, but favors NV cards indoors. But overall ATI cards perform well in Oblivion thanks to its 48 pixel shaders.

Indoor benchmarks are pointless. It can hardly be called a win for nVidia or ATI if the FPS are all above 30-40 and there's only 1-2 fps of variation between the ultra high-end cards (X1900 XTX and 7900 GTX) and the higher-end midrange cards (7900 GT and X1800 XT).
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
According to the Oblivion benchmarks here at Anandtech, the 7800 GT is putting up even numbers to the X800 XL... which is pretty absurd IMO. To me it's very obvious which company this game favors.
 

F1shF4t

Golden Member
Oct 18, 2005
1,583
1
71
Originally posted by: coldpower27
According to the Oblivion benchmarks here at Anandtech, the 7800 GT is putting up even numbers to the X800 XL... which is pretty absurd IMO. To me it's very obvious which company this game favors.

With the claims about the performance of the 7800 GT and X800, is there any effect from different shader versions being used? If the 7800 GT is forced to do more effects, that could explain the relative performance of the cards. I'm primarily talking about SM3 and such.

I don't own the game, but it would be interesting to see the effect of forcing different shader versions.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
By default SM3 shaders are disabled in the game, although you can supposedly enable them in the ini file. I enabled it on mine, but I didn't notice any changes in IQ or performance.
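For anyone who wants to try it, the tweak people usually point to is (going from memory, so double-check against a tweak guide) a line in Oblivion.ini under the [Display] section:

    bAllow30Shaders=1

I think some guides also have you bump the shader package number in RendererInfo.txt, but take that part with a grain of salt.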
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: munky
By default SM3 shaders are disabled in the game, although you can supposedly enable them in the ini file. I enabled it on mine, but I didn't notice any changes in IQ or performance.

Yeah, same here; some users have actually done it and had graphical problems.

But other than being mentioned in the config files, there are no SM3 shader paths. Supposedly it's something to do with them never getting it working...

Which is pretty weird, as other lesser-known companies have been able to get SM3 working on much older engines *cough cough* Far Cry, Splinter Cell (Unreal 2 engine!)
 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: Drayvn
Originally posted by: munky
By default SM3 shaders are disabled in the game, although you can supposedly enable them in the ini file. I enabled it on mine, but I didn't notice any changes in IQ or performance.

Yeah, same here; some users have actually done it and had graphical problems.

But other than being mentioned in the config files, there are no SM3 shader paths. Supposedly it's something to do with them never getting it working...

Which is pretty weird, as other lesser-known companies have been able to get SM3 working on much older engines *cough cough* Far Cry, Splinter Cell (Unreal 2 engine!)

The SM3 thing is weird. What are they using for HDR? If it's not SM3, why don't they let X800-series ATI cards do HDR? Does anyone know?
 

PingSpike

Lifer
Feb 25, 2004
21,756
600
126
Yeah, this particular title does seem to favor ATI parts...and not just current parts either. It does well with the x8xx generation as well.

There are a couple of factors that could be at play, though. From everything I've read, the developers used ATI hardware on their development machines from the start. Which isn't surprising, since they started back in the 9xxx series and FX series era, and the FX series was abysmal then and is kind of irrelevant now. Also, the Xbox 360 uses ATI hardware...making it more worthwhile for them to focus on the ATI side of things. And of course, ATI's hardware may just be better suited to this title.

One thing is for sure: TWIMTBP money doesn't mean jack sh|t and can safely be ignored. If anything, all it did was buy off the developers to keep them from enabling HDR+AA for ATI parts.

A good question was raised: if it doesn't use SM3 paths, then why doesn't HDR work on x800 series parts? The only thing I can think of there is TWIMTBP money at work again. But I believe you have to roll your own setup to do HDR on SM2 cards...and maybe it was an afterthought that makes partial use of the SM3 features on Nvidia's parts. I'd be less inclined to believe this conspiracy theory, except that the HDR+AA debacle can pretty much ONLY be explained by a conspiracy. (Bethesda's developers said it would have delayed the game to add HDR+AA for ATI parts...Chuck said it took him 12 hours total, including research, to get the job done through a driver, without intimate knowledge of the game.) So that certainly doesn't leave this one out of the realm of possibility.
 
Jun 14, 2003
10,442
0
0
Originally posted by: Barkotron
Originally posted by: Drayvn
Originally posted by: munky
By default SM3 shaders are disabled in the game, although you can supposedly enable them in the ini file. I enabled it on mine, but I didn't notice any changes in IQ or performance.

Yeah, same here; some users have actually done it and had graphical problems.

But other than being mentioned in the config files, there are no SM3 shader paths. Supposedly it's something to do with them never getting it working...

Which is pretty weird, as other lesser-known companies have been able to get SM3 working on much older engines *cough cough* Far Cry, Splinter Cell (Unreal 2 engine!)

The SM3 thing is weird. What are they using for HDR? If it's not SM3, why don't they let X800-series ATI cards do HDR? Does anyone know?


Because the method of HDR used doesn't rely on the pixel shaders... it's that OpenEXR bollocks that needs FP blending or something.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
HDR is a separate feature from SM3. It just so happens that only SM3 cards support FP blending, but if the X800 cards supported it too, then HDR would work on them with SM2 as well.
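Roughly, the capability gate looks like this in a D3D9 renderer (just a sketch of the standard check, not Oblivion's actual code; assumes d3d9.h / d3d9.lib):

    // Ask D3D9 whether the FP16 render target format supports post-pixel-shader
    // (frame buffer) blending, which is what this style of HDR needs.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);
    bool fp16BlendingSupported = SUCCEEDED(hr);
    // NV4x/G7x and the X1K series pass this check; R4xx (X800) doesn't,
    // which is why it gets no FP16 HDR regardless of shader model.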
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
It's an XB360-native game, so it should be faster on the PC with ATI parts (although it is a bit surprising to see how much slower it is on the PC than its console counterpart, even without AA).

The reverse will be true with PS3 ports.
 

gobucks

Golden Member
Oct 22, 2004
1,166
0
0
as far as oblivion goes, it seems to me that it's ATi's memory architecture that is really helping it out. That 512-bit ring bus thingy is apparently helping to maximize bandwidth and that in turn helps minimize the performance impact of AA/AF.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: gobucks
as far as oblivion goes, it seems to me that it's ATi's memory architecture that is really helping it out. That 512-bit ring bus thingy is apparently helping to maximize bandwidth and that in turn helps minimize the performance impact of AA/AF.

As was said, it's not only the X1K series that has great performance in this game but the X800 series as well.

To the poster above regarding HDR + AA: X800s cannot do OpenEXR FP16 HDR; they can do another form of HDR that pretty much no developer implemented.