Leo DirectX forward plus rendering lighting


RussianSensation · Elite Member · Joined Sep 5, 2003
I just picked up Sleeping Dogs and it is kicking my setup's ass. Trying to max it out gives me 20FPS...

LOL, this is like Witcher 2 on steroids. From the blog post:

"Gamers like you know that a game run without anti-aliasing introduces aliasing, or “jaggies,” onto the edges of objects in the game world. These jaggies are ugly and we all know it, so we went ballistic on them with an advanced form of anti-aliasing that combines supersampling and compute-accelerated post-processed AA. You, as the user, have configured your game to run at 1920×1080, and you’ve selected 4xSSAA as your anti-aliasing method. These settings tell the graphics card to render the game’s content at a 4x larger resolution of 3840×2160 (ultra-high definition), then resize that frame back down to 1920×1080 before display on the monitor. The “Extreme” anti-aliasing setting uses the compute horsepower of Graphics Core Next to do another anti-aliasing pass on the final frame, which will smooth out those last four pixels of aliasing we described in the example above." Ok now take that statement and apply it to 2560x1600 resolution. Your videocard is doing 4-5x the workload of a single 2560x1600 frame. :eek:

Some people on our forum have argued that it's great when AMD and NV work closely with developers, and I said that if it continues, I fear we'll need both brands of videocards to play games. That's only getting worse now that AMD is throwing $ at Gaming Evolved: first Dirt Showdown, then Sniper Elite V2, now this.

[Image: performance.png (Sleeping Dogs benchmark chart)]


It's going to be a game of who throws more $ at developers: NV going ballistic on PhysX + tessellation, and AMD exposing its architecture's DirectCompute, contact hardening shadows and HDAO advantages. I am expecting the wildest use of tessellation in Crysis 3. :)
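
To put numbers on that 4-5x claim, here's a quick back-of-the-envelope sketch in Python (my own arithmetic, not from AMD's blog post):

[code]
# Pixel workload for the blog post's 4xSSAA example, applied to 2560x1600.

def ssaa_pixels(width, height, factor=4):
    # 4xSSAA renders at 2x the width and 2x the height = 4x the pixels
    return width * height * factor

display = 2560 * 1600                # ~4.1 Mpix shown on screen
rendered = ssaa_pixels(2560, 1600)   # ~16.4 Mpix shaded (5120x3200)

print(rendered / display)  # 4.0 -- and the "Extreme" compute AA pass on
                           # the final frame pushes it toward 5x
[/code]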
 
Joined Feb 19, 2009
It's okay because on AMD you can just set the tessellation override so it's not stupid.

Everything else AMD is doing is DX11 standard. PhysX is obviously not an open standard. When the new consoles come, all on AMD hardware and DX11... guess what?

Have to add, none of this matters when Big K arrives with full compute and gaming power. For enthusiasts, power use is not relevant; it never has been.
 

RussianSensation · Elite Member · Joined Sep 5, 2003
Big K might be pretty expensive; I can't see such a beast under $600. I might skip 2013 entirely and wait until 20nm and the huge architectural changes with Maxwell in 2014. I just wish there were even a rumour regarding when the HD 8000 series launches. I've read Q1 or Q2 2013; that's a big difference.
 

Grooveriding · Diamond Member · Joined Dec 25, 2008
RussianSensation said:
LOL, this is like Witcher 2 on steroids. ... Some people on our forum have argued that it's great when AMD and NV work closely with developers, and I said that if it continues, I fear we'll need both brands of videocards to play games. ... I am expecting the wildest use of tessellation in Crysis 3. :)

Yeah, I went back and actually paid more attention to the settings instead of just shoving them all the way over... :rolleyes: Noticed it is super-sampling, and I think AMD also put the fix in on the shadowing method used. Either way, it is crippling my setup.

Part of it is the lack of a working SLI profile, but I found a post from boxleitnerb in PC Gaming with an SLI flag that works. That doubled my framerate, so now I'm getting about 40-45 FPS. The game looks decentish... but not 40-45-FPS-on-680-SLI decentish; that is less than I get in BF3 multiplayer by a sizeable margin. The game is a lot of fun; it's like Batman AA/AC combat with a better storyline, set in the crime world of Hong Kong. Going to wait for nvidia's next driver update and see if they do something for this game.

You are right about AMD/nvidia getting too close to game devs. It's not a conspiracy when we keep seeing this happen time and again with heavily GPU-vendor-branded titles that play like crap on the competition's cards.
 

SirPauly · Diamond Member · Joined Apr 28, 2009
RussianSensation said:
LOL, this is like Witcher 2 on steroids. ... Some people on our forum have argued that it's great when AMD and NV work closely with developers, and I said that if it continues, I fear we'll need both brands of videocards to play games. ... I am expecting the wildest use of tessellation in Crysis 3. :)

Imho,

I think it is wonderful! It's not throwing dollars and bribes at developers; it is working with developers to try to improve the experience for their customers. It creates PC awareness and improves the experience of the PC platform as a whole compared to the console. Looking for some idealism where AMD and nVidia hold hands, dance around the campfire, and sing show tunes is unrealistic when this is a competitive sector, one with a lot of innovative good and real potential to improve the gaming experience as a whole.
 

SomeoneSimple · Member · Joined Aug 15, 2012
There are more examples in Sleeping Dogs of how Kepler just doesn't have the grunt to produce these leading-edge effects. Now that developers are starting to exploit all the amazing capabilities inside GCN, Kepler will fall further and further behind.

Come on, this is nonsense. A few AMD-sponsored games are hardly evidence that Kepler's architecture will age any quicker than GCN. The same could be said if you looked at Nvidia-favored games, like Max Payne 3 or Battlefield 3, where AMD's lineup gets similarly crushed.

Sleeping Dogs' performance on Extreme looks gimmicky though; a 7970 GHz nearly 1.7 times as fast as a GTX 670? It wouldn't surprise me if they used FP64-precision computations just to throw Kepler off its feet. o_O
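
For what it's worth, the raw FP64 gap between these two parts is real whether or not the game actually uses double precision. A rough Python comparison from the published FP32 peaks and each architecture's FP64 rate (figures from memory, ballpark only):

[code]
# Theoretical double-precision throughput = FP32 peak * FP64 rate.
cards = {
    "HD 7970 GHz (Tahiti)": (4300.0, 1 / 4),   # GCN Tahiti runs FP64 at 1/4 rate
    "GTX 680 (GK104)":      (3090.0, 1 / 24),  # Kepler GK104 runs FP64 at 1/24 rate
}

for name, (fp32_gflops, fp64_rate) in cards.items():
    print("%s: %.0f GFLOPS FP64" % (name, fp32_gflops * fp64_rate))

# HD 7970 GHz (Tahiti): 1075 GFLOPS FP64
# GTX 680 (GK104): 129 GFLOPS FP64 -- roughly an 8x theoretical gap
[/code]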
 

SirPauly · Diamond Member · Joined Apr 28, 2009
Here is a point:

It seems that AMD is compute-heavy and ships more RAM by default; why not work with developers to try to improve the gaming experience for their customers? Why should AMD owners have to wait for nVidia to offer more default RAM or improve its compute?

Innovation doesn't wait for anyone.

I made the exact same point when nVidia had a hardware tessellation advantage.
 

piesquared · Golden Member · Joined Oct 16, 2006
SomeoneSimple said:
Come on, this is nonsense. A few AMD-sponsored games are hardly evidence that Kepler's architecture will age any quicker than GCN. ...

Nice try, but no. Future effects are going to be increasingly compute-driven, and Kepler is extremely weak there.
 

-Slacker- · Golden Member · Joined Feb 24, 2010
It's not working.

I'm getting vague silhouettes on a mostly black screen. Sound works fine, though.
 

piesquared · Golden Member · Joined Oct 16, 2006
SirPauly said:
It seems that AMD is compute-heavy and ships more RAM by default; why not work with developers to try to improve the gaming experience for their customers? ... I made the exact same point when nVidia had a hardware tessellation advantage.

Here's a quote from Dave Baumann:
Recent / future titles already partnering with AMD Gaming Evolved include: Sleeping Dogs, Hitman Absolution, Medal of Honor Warfighter, Tomb Raider, Bioshock Infinite, with more to come...

To add to that, there are Showdown, Sniper, and Nexuiz, and more already released.

http://forum.beyond3d.com/showthread.php?t=59176&page=157
 

f1sherman · Platinum Member · Joined Apr 5, 2011
piesquared said: Nice try, but no. Future effects are going to be increasingly compute-driven, and Kepler is extremely weak there.

BF3 and Dirt 3 both use DirectCompute :\

Techreport is throwing Dirt Showdown out of their suite:

I omitted the Showdown results from our overall index for several related reasons. First, because AMD told us themselves that they worked directly with CodeMasters to implement a new lighting path in this game engine. That lighting path happens to work very poorly on GPUs produced by AMD's competitor--so poorly, in fact, that the GeForce results for that game are *half* the speed of the Radeons in the 99th percentile frame times. That's true despite the fact that these Radeons and GeForces perform comparably in every other scenario tested. Also, the size of the performance gap in Showdown skews the overall results sufficiently that it offers a very different picture than we see in the other five games.
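
For anyone unfamiliar with TR's 99th-percentile frame-time metric: it is the frame time that 99% of all frames complete within, so stutter shows up even when the average framerate looks healthy. A quick illustrative sketch in Python (my own, not TR's methodology code):

[code]
# 99th-percentile frame time: averages hide stutter, this metric exposes it.
def percentile_99(frame_times_ms):
    ordered = sorted(frame_times_ms)
    idx = min(int(0.99 * len(ordered)), len(ordered) - 1)
    return ordered[idx]

# 1000 smooth 16.7 ms frames plus 20 stutter spikes at 50 ms:
times = [16.7] * 1000 + [50.0] * 20

print(sum(times) / len(times))   # ~17.4 ms average (~58 FPS) -- looks fine
print(percentile_99(times))      # 50.0 ms -- the stutter is right there
[/code]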
 

Arkadrel · Diamond Member · Joined Oct 19, 2010
The thing is, people have double standards...

How often do you hear:
"I choose nvidia because they work closer with developers."

And AMD is a lot more "fair" with their implementations than nvidia is, lol.
 

SomeoneSimple · Member · Joined Aug 15, 2012
piesquared said: Nice try, but no. Future effects are going to be increasingly compute-driven, and Kepler is extremely weak there.

Since you just say no without raising any further points, I'll presume you don't have any idea what the future will bring and simply want to disagree in a rude way.

Yes, compute shaders will be extensively used in the future, but does that automatically mean that Kepler will be at a disadvantage?
No, it does not.

The prime example of a game that uses a ton of compute shaders is Battlefield 3, where Kepler cards still manage to crush their GCN counterparts.
 

3DVagabond · Lifer · Joined Aug 10, 2009
RussianSensation said:
LOL, this is like Witcher 2 on steroids. ... Some people on our forum have argued that it's great when AMD and NV work closely with developers, and I said that if it continues, I fear we'll need both brands of videocards to play games. ... I am expecting the wildest use of tessellation in Crysis 3. :)

Yeah, it was fine when only one side was doing it. Just buy that company's cards and you were golden.

Best-case scenario, both sides start releasing better all-around products with no glaring weaknesses to exploit. Worst-case scenario, we end up with a whole bunch of games that only perform well on one brand or the other. Hopefully option one plays out; if so, it will improve cards for everyone.
 

sontin · Diamond Member · Joined Sep 12, 2011
RussianSensation said: It's going to be a game of who throws more $ at developers: NV going ballistic on PhysX + tessellation, and AMD exposing its architecture's DirectCompute, contact hardening shadows and HDAO advantages. I am expecting the wildest use of tessellation in Crysis 3. :)

And the fun part about all this is:
Kepler has no problem with "contact hardening shadows and HDAO". It's AMD's forward+ renderer that limits the performance of the Kepler cards, and it has nothing to do with compute. Even GF110 loses more than 50% of its performance, which makes that card slower than a 6970: http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-660-ti/31/

A GTX 680 is 60% faster than a GTX 580....
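
Since forward+ is what this thread is named after, here's a minimal CPU-side sketch of the per-tile light culling at the heart of it; the real pass runs on the GPU per 16x16-pixel tile, and every name below is made up for illustration:

[code]
# Forward+ in toy form: bin lights into screen tiles, then the forward
# shading pass evaluates only the lights listed for its tile.

TILE = 16  # pixels per tile side, a common choice

def cull_lights(screen_w, screen_h, lights):
    """lights: list of (x, y, radius) in screen pixels."""
    tiles_x = (screen_w + TILE - 1) // TILE
    tiles_y = (screen_h + TILE - 1) // TILE
    tile_lights = [[] for _ in range(tiles_x * tiles_y)]
    for i, (lx, ly, r) in enumerate(lights):
        # conservative screen-space bounding box of the light's reach
        x0 = max(0, int(lx - r)) // TILE
        x1 = min(screen_w - 1, int(lx + r)) // TILE
        y0 = max(0, int(ly - r)) // TILE
        y1 = min(screen_h - 1, int(ly + r)) // TILE
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                tile_lights[ty * tiles_x + tx].append(i)
    return tile_lights

# One light of 40 px radius at (100, 100) on a 1920x1080 screen only
# lands in the 36 tiles around it, so distant pixels never pay for it.
lists = cull_lights(1920, 1080, [(100.0, 100.0, 40.0)])
print(sum(1 for tl in lists if tl))  # 36
[/code]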

f1sherman said:
BF3 and Dirt 3 both use DirectCompute :\ Techreport is throwing Dirt Showdown out of their suite. ...

If they didn't do it, they would look like fools. They did the same with Crysis 2 and tessellation.
 

Red Hawk · Diamond Member · Joined Jan 1, 2011
RussianSensation said:
Some people on our forum have argued that it's great when AMD and NV work closely with developers... It's going to be a game of who throws more $ at developers: NV going ballistic on PhysX + tessellation, and AMD exposing its architecture's DirectCompute, contact hardening shadows and HDAO advantages.

I don't necessarily think that either of the graphics chip designers sponsoring a game locks the other out of it. Take, for example, the venerable Crysis Warhead. It was a TWIMTBP game, but nowadays AMD holds a steady lead in it, with its 7970 outdoing Nvidia's 680:

[Image: 48450.png (Crysis: Warhead benchmark)]


(On another note: omg! The 7970 GE is flirting with the 1080p, 4x MSAA, 60 fps single GPU Holy Grail in Crysis! It's only been what, 5 years? :awe:)

And Dragon Age II, a Gaming Evolved title which was firmly biased in AMD's favor when it was released, is now heavily tilted in Nvidia's favor:

[Image: DA2_02.png (Dragon Age II benchmark)]


Any huge disadvantages can be overcome through driver optimizations, IMO. Yes, there are hardware limitations like compute ability or tessellation, but after driver optimizations (and AMD ditching its lackluster first gen DX11 tessellator...) things even out. I expect this to happen in Sleeping Dogs' case. Just give the Nvidia driver team a couple months to work at it.

What can't be overcome through driver optimizations are features that are exclusive at the hardware level. AMD can never properly use PhysX because Nvidia has not made the source code available for AMD to write drivers against. I'm sure AMD's GPGPU-adept GCN architecture could excel at PhysX if it were simply allowed to, but it isn't. Nvidia probably thinks it will sell more of its GPUs if PhysX remains exclusive, but that very attitude is hurting PhysX, as few developers want to implement technology that only half of the PC gaming market can even use. Exclusive technology like PhysX harms its designer, harms the games that use it, and harms the PC game market in general. It's no good and needs to die, or open up so that AMD can use it as well.

What are the numbers for Nvidia and AMD for market share in the gaming graphics market, anyways?

piesquared said:
Here's a quote from Dave Baumann: ... To add to that, there are Showdown, Sniper, and Nexuiz, and more already released.

AMD has been putting more money into Gaming Evolved for a couple years or so now. Dragon Age II, Deus Ex: Human Revolution, and Total War: Shogun 2 were all AAA Gaming Evolved titles.

PhysX aside, overall I think TWIMTBP and Gaming Evolved are good for the PC gaming industry. They encourage developers to use more advanced and complicated effects in their games, which gives us gamers better-looking, more demanding games, which encourages us to buy more powerful GPUs, which gives the GPU makers more money to invest into helping game developers. It's a cycle where everyone wins; if you don't have enough money to buy top-end GPUs, just turn the settings down.
 
Joined Feb 19, 2009
f1sherman said:
BF3 and Dirt 3 both use DirectCompute :\ Techreport is throwing Dirt Showdown out of their suite. ...

This is pretty bullshit when games like Hawx, Hawx 2, LP and LP2, along with a bunch of other TWIMTBP games, totally skewed the results towards NV. What then? They still used them. Double-standard losers; I won't visit them in the future after seeing that statement.
 

sontin · Diamond Member · Joined Sep 12, 2011
This is pretty bullshit when games like Hawx, Hawx 2, LP and LP2, along with a bunch of other TWIMTBP games, totally skewed the results towards NV. ...

They disabled tessellation in Batman: AC and Crysis 2.

So I guess you had no problem with those decisions. :rolleyes:
 
Joined Feb 19, 2009
Didn't read their reviews of late, so I'm not aware they disable tess. That's pretty bullshit too; it's a DX11 standard, you can't disable it for fairness.

IF they wanted to be "fair" they could simply set the tessellation factor in CCC so it doesn't tessellate to the extreme while achieving the same visual IQ. But even that is iffy; it should be left to the end user to set these.
 

sontin · Diamond Member · Joined Sep 12, 2011
IF they wanted to be "fair" they could simply set the tessellation factor in CCC so it doesn't tessellate to the extreme while achieving the same visual IQ. But even that is iffy; it should be left to the end user to set these.

Makes no sense. Fair means doing the same workload.
 
Joined Feb 19, 2009
"Fair" in this case would mean not doing obscene work loads for no visual gains. Something im sure users would select if there were an option. Luckily, AMD has this option in their drivers, discard completely rubbish tessellation factors.

I like having this option, and thats my point above, if NV push tessellation like crazy, its irrelevant as an AMD user you can ignore it. If AMD push Forward + in future games... its pretty damn relevant when kepler sucks at it.
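
On the tessellation-factor cap: the reason it saves so much work "while achieving the same visual IQ" is that triangle count grows roughly with the square of the factor. A rough sketch (my own, assuming simple uniform tessellation of a quad patch; real hardware factors are per-edge and fractional, but the trend is the same):

[code]
# Triangle count vs. tessellation factor for one uniformly tessellated
# quad patch: factor^2 quads, two triangles each.
def tris_per_patch(factor):
    return 2 * factor * factor

for f in (1, 8, 16, 64):
    print("factor %2d: %6d triangles per patch" % (f, tris_per_patch(f)))

# factor 64 emits 16x the triangles of factor 16, and most of those
# extra vertices land on sub-pixel detail nobody can see.
[/code]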