Leo DirectX forward plus rendering lighting


sontin

Diamond Member
Sep 12, 2011
3,273
149
106
"Fair" in this case would mean not doing obscene work loads for no visual gains. Something im sure users would select if there were an option. Luckily, AMD has this option in their drivers, discard completely rubbish tessellation factors.

So, like the advanced lighting technique in Dirt: Showdown? Or the "compute" effects in Sniper Elite V2? The visual gain in Showdown is nothing compared with the 60% performance hit on nVidia cards.

Fair means same workload and not settings you or I like.

I like having this option, and that's my point above: if NV push tessellation like crazy, it's irrelevant -- as an AMD user you can ignore it. If AMD push Forward+ in future games... it's pretty damn relevant when Kepler sucks at it.

Lowering settings to get more frames while losing IQ is not a choice. Example: nobody will see the impact of Forward+ in Dirt: Showdown. Even MSAA works without it. So yeah, I have the choice to disable a feature, get 120% more frames, play at a smooth 60 FPS and let my GTX 670 run at half the clock rate.
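For anyone unsure what Forward+ actually changes: it is basically forward rendering plus a compute pass that splits the screen into tiles and culls the light list per tile, so each pixel only shades the lights that can actually reach it. Below is a rough Python sketch of that culling idea -- the tile size, resolution and light count are made-up example numbers, not anything measured from Showdown.

```python
import random

SCREEN_W, SCREEN_H = 1920, 1080
TILE = 16                                  # 16x16-pixel tiles, a common choice
TILES_X = (SCREEN_W + TILE - 1) // TILE
TILES_Y = (SCREEN_H + TILE - 1) // TILE

# Each light: (screen_x, screen_y, radius_in_pixels) -- a 2D stand-in for the
# per-tile frustum vs. light-volume test a real compute shader would do.
random.seed(1)
lights = [(random.uniform(0, SCREEN_W), random.uniform(0, SCREEN_H),
           random.uniform(30, 200)) for _ in range(512)]

def lights_for_tile(tx, ty):
    """Indices of lights whose circle of influence overlaps this tile."""
    x0, y0 = tx * TILE, ty * TILE
    x1, y1 = x0 + TILE, y0 + TILE
    hits = []
    for i, (lx, ly, r) in enumerate(lights):
        cx = min(max(lx, x0), x1)          # closest point on the tile to the light
        cy = min(max(ly, y0), y1)
        if (lx - cx) ** 2 + (ly - cy) ** 2 <= r * r:
            hits.append(i)
    return hits

counts = [len(lights_for_tile(tx, ty))
          for ty in range(TILES_Y) for tx in range(TILES_X)]
print(f"{len(lights)} lights total, avg {sum(counts) / len(counts):.1f} per tile, "
      f"max {max(counts)} per tile")
```

The point of the culling pass is that per-tile light lists stay short even with hundreds of lights, which is also why this style of renderer gets along with MSAA better than a classic deferred one.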
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
What kind of BS is that? Throwing a game out because Nvidia cards run it so poorly? BF3 NEVER ran that badly on AMD cards, and it now runs just as fast. Maybe Nvidia should get off their asses and optimize their drivers.

Same workload, my ass. You pay for the best card to give you the best experience. It would be different if I had bought an Nvidia card for something other than performance, like PhysX or 3D or whatever -- we all know AMD sucks at 3D, and PhysX is locked to 15 fps on non-Nvidia systems. But if there are visual features and texture packs that make a game look better, even a little better, and don't run like crap on AMD cards, then keep them in reviews so Nvidia can fix their performance.

You also have no right to complain about Dirt: Showdown performing poorly on your 670. You made your choice. Live with it.

Just like I made mine: I don't get PhysX, but I don't tell people it doesn't matter or tell them not to use it. That is just being sour.

/rant
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
This is pretty bullshit when games like HAWX, HAWX 2, LP and LP2, along with a bunch of other TWIMTBP games, totally skewed the results towards NV. What then? They still used them. Double-standard losers; I won't visit them in the future after seeing that statement.

Actually, HAWX was an AMD Evolved title that brought DirectX 10.1 aspects to the gamer -- very welcome as well.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Imho,

I don't agree! -- a title may have a fidelity setting that shines on a particular architecture but doesn't really affect game-play. No one is really harmed, and yet a gamer can be rewarded for the strength of an architecture.

To wait 'till both architectures are even -- an ideal playing field -- and if one can't handle the fidelity setting then it's not worth doing? How the hell does one move forward with that idealistic view?

All gamers have to enjoy it, and if the other architecture or software isn't up to speed -- none at all. You can't push forward and innovate with that backwards thinking, to me. It's noble to desire equal gaming experiences for everyone, but it's an idealistic view, and idealism becomes the enemy of the good.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Imho,

I don't agree! -- a title may have a fidelity setting that shines on a particular architecture but doesn't really affect game-play. No one is really harmed, and yet a gamer can be rewarded for the strength of an architecture.

To wait 'till both architectures are even -- an ideal playing field -- and if one can't handle the fidelity setting then it's not worth doing? How the hell does one move forward with that idealistic view?

All gamers have to enjoy it, and if the other architecture or software isn't up to speed -- none at all. You can't push forward and innovate with that backwards thinking, to me. It's noble to desire equal gaming experiences for everyone, but it's an idealistic view, and idealism becomes the enemy of the good.

I agree.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
If I'm not wrong, all that full HW scheduling does is make it so that there needs to be no per-game driver optimization. I'm guessing that's why Nvidia didn't pick up more performance in Crysis Warhead and in AvP.

However, Nvidia is at a disadvantage with their low DP performance, because DP will be important in the future. 32-bit precision won't be enough.
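To put a number on the 32-bit vs 64-bit point, here's a quick Python/numpy illustration of the precision gap (just generic float behaviour, nothing GPU-specific):

```python
import numpy as np

# float32 carries a 24-bit significand (~7 decimal digits of precision);
# float64 carries 53 bits (~16 digits).
print(np.finfo(np.float32).eps)            # ~1.19e-07
print(np.finfo(np.float64).eps)            # ~2.22e-16

# Past 2**24, float32 can no longer tell consecutive integers apart:
big32 = np.float32(2**24)
print(big32 + np.float32(1.0) == big32)    # True  -> the +1 is silently lost
print(np.float64(2**24) + 1.0 == 2**24)    # False -> float64 still resolves it
```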
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Not according to Anandtech's own benchmarks. The 7970 and especially AMD's 7950 score far worse than NV's counterparts.

That's because a stock 7950/7970 isn't that fast in BF3, since AMD cards suffer with deferred MSAA in the Frostbite 2.0 engine. BF3 is also one of the best games for NV for that reason. It doesn't matter much though, since the 7950 and 7970 overclock 25-50%, so they easily catch up to NV's cards while beating them almost everywhere else.
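As a rough illustration of why MSAA is heavy in a deferred engine like Frostbite 2 in general: the G-buffer has to be stored (and largely shaded) per sample, so memory and bandwidth scale with the sample count. The Python sketch below uses a generic, made-up G-buffer layout, not Frostbite's actual one.

```python
WIDTH, HEIGHT = 1920, 1080
GBUFFER_BYTES_PER_PIXEL = 4 * 4        # e.g. four RGBA8 render targets (illustrative)
DEPTH_BYTES_PER_PIXEL = 4              # 32-bit depth/stencil

def gbuffer_mb(msaa_samples):
    # With MSAA on a deferred renderer, every G-buffer attachment is stored
    # per sample, not per pixel, so the footprint scales linearly with samples.
    per_pixel = (GBUFFER_BYTES_PER_PIXEL + DEPTH_BYTES_PER_PIXEL) * msaa_samples
    return WIDTH * HEIGHT * per_pixel / (1024 ** 2)

for samples in (1, 2, 4, 8):
    print(f"{samples}x MSAA: ~{gbuffer_mb(samples):.0f} MB of G-buffer + depth at 1080p")
```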

You can overclock the 7900 series to still get close to the 670/680 in the games where the 670/680 are faster, but you cannot really overclock a 670/680 to catch up to the 7900 series in games where AMD is leading, because that lead is usually massive (Crysis 1/Warhead, Metro 2033, Anno 2070, Alan Wake, Sniper Elite V2, Sleeping Dogs, Dirt Showdown, Serious Sam 3, Bulletstorm, Skyrim with mods / 8xAA, Batman AC 8xAA, etc.). Basically, NV cards are good at the popular games such as WoW, Guild Wars 2 and BF3/Crysis 2, but on the whole they have some serious gaps in many other less mainstream games.

Unless you play only BF3/GW2/WoW/Batman AC/Lost Planet 2/Medal of Honor Warfighter and Crysis 2/3, those gaps will show up eventually.



If all you play is BF3, then sure, the $300 GTX 660 Ti is good.

Imho,

I think it is wonderful!

You keep saying this over and over. So the fact that your card performs like a dog in Dirt Showdown, Sniper Elite V2 and Sleeping Dogs is great? Jeez.

Maybe you can send me a GTX680 when Medal of Honor Warfighter launches so I can actually play the game....

Either way, this game of throwing $ at developers and one-upping each other is just hurting us. I don't feel like having both an AMD and an NV card in my rig to play certain games. There is no reason that someone like Grooveriding, with $1k of GPUs, should be running a game at 40-45 fps. Stuff like this hurts PC gaming. In the end a person may buy a $400 NV GPU, fire up one of these AMD-sponsored titles, realize it runs poorly and abandon PC gaming for the PS4. That's what will happen if $400 GPUs can't even run a basic game at 60 fps. This is no different than someone who buys a 7970 and fires up The Secret World, which has the most artificial-looking tessellated red bricks seen in the last 2 years, and it runs like a dog.

91 fps in DX9 Ultra on 7970 and 96 fps on a 680


vs.

24 fps in DX11 Ultra on 7970 and 53 fps on a 680


Almost a 4x reduction in performance on AMD cards and the game doesn't look any better... that's great for PC gaming? Yeah, I guess if you bought a 680.

Not many PC gamers have $1k to buy a $500 AMD card and a $500 NV card to get the best of both worlds.
 
Last edited:

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Some people on our forum have argued that it's great when AMD and NV work closely with developers and I said if that continues, I fear we'll need both brands of videocards to play games. That's only getting worse now that AMD is throwing $ at Gaming Evolved. First Dirt Showdown, then Sniper Elite V2, now this.

Been saying the same thing for years, that we will need both GPU brands in the PC if things like this go to the extreme.
The only ones who win are the GPU makers.
 
Last edited:

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
@RussianSensation: Would you rather AMD/Nvidia stop sponsoring game developers altogether, and just get a load of weak looking console ports?
 

ashenburger

Junior Member
Aug 13, 2012
16
0
0
Final8ty: Only if the market can sustain it; I couldn't imagine spending that amount of money on two cards I can't CFX/SLI. And if prices go down, this really doesn't win anything for the GPU makers.

RussianSensation: Do you think The Secret World situation cannot be mitigated with drivers? And if not, is it only the amount of tessellation that is too much for the card?
 
Last edited:

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Final8ty: Only if the market can sustain it; I couldn't imagine spending that amount of money on two cards I can't CFX/SLI. And if prices go down, this really doesn't win anything for the GPU makers.

Some people would just drop the PC for any serious gaming.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
If the 7970 is doing worse than the 560 Ti, I would bet that more is going on than simply overloading the AMD cards' tessellators. The 7000 series is competitive with Nvidia in games like Crysis 2 and Batman: Arkham City, both TWIMTBP games with what some might call excessive amounts of tessellation. No, it probably just comes down to poor optimization in the game, Nvidia being able to test drivers on the game early, and little consideration for AMD in general. Give the AMD driver team a good look at it and I'm sure AMD will make its cards competitive.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
That's because a stock 7950/7970 isn't that fast in BF3, since AMD cards suffer with deferred MSAA in the Frostbite 2.0 engine. BF3 is also one of the best games for NV for that reason. It doesn't matter much though, since the 7950 and 7970 overclock 25-50%, so they easily catch up to NV's cards while beating them almost everywhere else.

You can overclock the 7900 series to still get close to the 670/680 in the games where the 670/680 are faster, but you cannot really overclock a 670/680 to catch up to the 7900 series in games where AMD is leading, because that lead is usually massive (Crysis 1/Warhead, Metro 2033, Anno 2070, Alan Wake, Sniper Elite V2, Sleeping Dogs, Dirt Showdown, Serious Sam 3, Bulletstorm, Skyrim with mods / 8xAA, Batman AC 8xAA, etc.). Basically, NV cards are good at the popular games such as WoW, Guild Wars 2 and BF3/Crysis 2, but on the whole they have some serious gaps in many other less mainstream games.

Unless you play only BF3/GW2/WoW/Batman AC/Lost Planet 2/Medal of Honor Warfighter and Crysis 2/3, those gaps will show up eventually.



If all you play is BF3, then sure, the $300 GTX 660 Ti is good.



You keep saying this over and over. So the fact that your card performs like a dog in Dirt Showdown, Sniper Elite V2 and Sleeping Dogs is great? Jeez.

Maybe you can send me a GTX680 when Medal of Honor Warfighter launches so I can actually play the game....

Either way, this game of throwing $ at developers and one-upping each other is just hurting us. I don't feel like having both an AMD and an NV card in my rig to play certain games. There is no reason that someone like Grooveriding, with $1k of GPUs, should be running a game at 40-45 fps. Stuff like this hurts PC gaming. In the end a person may buy a $400 NV GPU, fire up one of these AMD-sponsored titles, realize it runs poorly and abandon PC gaming for the PS4. That's what will happen if $400 GPUs can't even run a basic game at 60 fps. This is no different than someone who buys a 7970 and fires up The Secret World, which has the most artificial-looking tessellated red bricks seen in the last 2 years, and it runs like a dog.

91 fps in DX9 Ultra on 7970 and 96 fps on a 680


vs.

24 fps in DX11 Ultra on 7970 and 53 fps on a 680


Almost a 4x reduction in performance on AMD cards and the game doesn't look any better... that's great for PC gaming? Yeah, I guess if you bought a 680.

Not many PC gamers have $1k to buy a $500 AMD card and a $500 NV card to get the best of both worlds.

Imho,

Fear mongering and idealism.

Your thinking is backwards -- Gee, even though we have these neat features and abilities for our GCN architecture, let's not include them because GeForce owners may suffer a performance hit.

"Hey guys, we can really enhance Borderlands 2 physics for our customers?" " Oh no, we can't do that -- AMD doesn't offer physX!"

How the hell does one innovate and showcase their architectural strengths? Tech demos or actual content in games?

Let's all hold hands together, dance around the campfire singing show tunes, and get all excited that all gamers have the same gaming experiences with similar performance! Hey, that sounds like a frigg'n console! It's the chaotic competition that moves and drives innovation forward -- no, let's wait for everyone to be on the same page first before we move forward.
 
Last edited:

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
Imho,

Fear mongering and idealism.
Interesting comment, especially when you go on to say...
"Hey guys, we can really enhance Borderlands 2 physics for our customers?" " Oh no, we can't do that -- AMD doesn't offer physX!"

How the hell does one innovate and showcase their architectural strengths? Tech demos or actual content in games?
Nvidia removes effects and features, then puts them back in using PhysX, with questionable results. Sure, the particles and such are interactive and "real time" but in the end it doesn't make any difference, and in many cases the effects are overblown in an attempt to showcase PhysX, and they end up looking stupid and out of place. It's one thing to put in some eye candy using PhysX, it's an entirely different kettle of fish when you REMOVE easily achievable effects done on the CPU just to prop up PhysX.
Let's all hold hands together, dance around the campfire singing show tunes, and get all excited that all gamers have the same gaming experiences with similar performance! Hey, that sounds like a frigg'n console! It's the chaotic competition that moves and drives innovation forward -- no, let's wait for everyone to be on the same page first before we move forward.
Standardization is what has driven this industry forward from the beginning. Every attempt to do otherwise has failed. It is not chaos that brings about innovation, it's cooperation and industry consensus, and most importantly standardization. The web did not become the sensation it is by limiting itself to certain hardware. If Nvidia had first come up with a web markup language, they would have made sure it only ran on their computers, and it would have failed.

A simple example is Macromedia Flash. It gained widespread acceptance through innovation. And you might say, AH HA, that was a company innovating, not waiting for a standard to become popular. This is true, but Macromedia did NOT limit what Flash would run on, and did not cripple features depending on the hardware. They made sure it was 100% free to end users and ran on everything. This is the polar opposite of PhysX, and why PhysX will ultimately fail. If Nvidia really wanted PhysX to succeed, they would make it run on anything and make money on the licensing and dev tools. But they are too myopic.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
It is not chaos that brings about innovation, it's cooperation and industry consensus, and most importantly standardization.

Just like the industry consensus and most importantly standardization on x86.
 

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
Imho,

Fear mongering and idealism.

Your thinking is backwards -- Gee, even though we have these neat features and abilities for our GCN architecture, let's not include them because GeForce owners may suffer a performance hit.

"Hey guys, we can really enhance Borderlands 2 physics for our customers?" " Oh no, we can't do that -- AMD doesn't offer physX!"

How the hell does one innovate and showcase their architectural strengths? Tech demos or actual content in games?

Let's all hold hands together, dance around the campfire singing show tunes, and get all excited that all gamers have the same gaming experiences with similar performance! Hey, that sounds like a frigg'n console! It's the chaotic competition that moves and drives innovation forward -- no, let's wait for everyone to be on the same page first before we move forward.

The difference between the two is that one is based on open standards. The industry should not be so eager to create another monopoly. What's wrong with having many companies competing and driving the market forward instead of none? OpenCL is open to anyone; CUDA and PhysX, not so much -- Nvidia has full control of the spec. AMD is trying to help promote an industry with open standards, which benefits everyone but the monopolies. I'm not sure why anyone would object to that.

As for the direction of the industry, for sure it's heading in the same direction AMD is heading. There are a lot of big-name titles out, with more to come, that are not only optimized for the GCN architecture but are also adding extra features that utilize that compute power. If you have an NV card you just don't get the extra benefits of having an AMD card, but since it's an open standard it's no one's fault but their own. I often wonder why people cheer ARM, Google and Linux for being open, but forgive NV's proprietary scheme.
 
Last edited:

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
Just like the industry consensus and most importantly standardization on x86.
I am not sure you thought this out very carefully. Do you want a virtual monopoly in graphics for 40+ years? The lack of innovation in x86 through its history is downright sad. We got slow, incremental updates to the ISA, although thankfully Moore's law pushed computing power up exponentially. What's worse, Intel wanted to dump x86 and leave an entire industry stranded; it is almost dumb luck that years earlier AMD was put into the x86 game because of IBM, and we eventually ended up with x64. BTW, it is my understanding that x86 is not the fastest-growing ISA, and I believe it is no longer the most used ISA, no doubt because x86 is not available for anyone to license if they want to.
Great job of expanding!
There is no need to rehash discussions over and over. I'm pretty sure you know exactly what I am talking about, and if you don't, c'est la vie; I don't want to go down that road.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Imho,

Fear mongering and idealism.

Your thinking is backwards thinking -- Gee, even though we have these neat features and abilties for our GCN architecture -- let's not include them because GeForce Owners may suffer a performance hit.

Remember back in the day when you could select OpenGL or DX to play a game? Some games ran much faster on NV cards in OpenGL. Well, that option was there. This way, you could still play the game with an ATI card in DX and get faster performance with an NV card in OpenGL. Both parties are happy. If the Borderlands developer had put in CPU physics and PhysX for NV cards, and PhysX was superior, I'd have no problem with that. Instead, they spent time putting in physics effects via PhysX and allocated no effort at all towards a CPU physics model. So it doesn't give the gamer any option at all: you either get the physics with PhysX or nothing.

Dirt Showdown uses DirectCompute for the global lighting model. If you want global lighting, it's either that or nothing at all in that game. The difference is the industry is moving towards DirectCompute while PhysX is a closed standard.

Further, the excessive tessellation in The Secret World is the only major difference between running the game in DX9 and DX11, as investigated by GameGPU.ru. The game looks so ugly that the developer would have been way better off working on a high-resolution texture pack. The railroad tracks on cobbled streets look all crooked, as if the game were made in 2003. Unfortunately, they were too busy serving NV's needs, putting in a gigantic tessellated wall of red bricks that looks all smudged and comes with a 4x performance hit. That's the future of PC gaming? I am all for tessellation when it makes the game look a lot better. The Secret World is not one of those games. How did the developer go through beta testing of that game and conclude that a 4x performance hit was worth keeping the tessellation option while making the game look worse?!
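Back-of-the-envelope on why cranking the tessellation factor on flat geometry is such a waste: the triangle count grows roughly with the square of the edge factor, so the cost explodes long after anyone can see a difference. A quick Python sketch (simplified; the exact counts depend on the partitioning scheme):

```python
def tris_per_quad_patch(edge_factor):
    # A uniformly tessellated quad domain ends up as roughly an
    # edge_factor x edge_factor grid of cells, two triangles per cell
    # (simplified; real integer/fractional partitioning differs at the edges).
    return 2 * edge_factor * edge_factor

for f in (1, 8, 16, 32, 64):
    print(f"edge factor {f:2d}: ~{tris_per_quad_patch(f):>6,} triangles per patch")
```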

This is no different than the concrete barriers and the invisible ocean in Crysis 2. Both of those things literally came out of nowhere, mysteriously, after NV got involved... Considering how great Crytek's physics, lighting and destruction model was in Crysis 1 four years ago, I am not gullible enough to believe that Crytek's programmers would be so incompetent as to do this:



What kind of a programmer would miss a mistake like this by "accident" on a multi-million dollar blockbuster videogame????



A tessellated ocean in the middle of NYC?

 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
And then after dumbing down the graphics of Crysis 2 for consoles, removing complex lighting and lens flares of Crysis 1, removing complex CryEngine 2 physics effects, I am expected to believe that out of the blue Crytek instead decided to waste coding resources by applying the most excessive geometry tessellation to parts of the game no one can even notice are being tessellated?



The world's most tessellated slab of flat concrete ever seen in a game?



My head is spinning at how bad this is....

 
Last edited:

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
If I am not mistaken, this was the case in the original Crysis as well (they probably copied the old codebase for Crysis 2). Also, without any input from the developers, it's silly to speculate.