
GameGPU: Mirror's Edge Catalyst Beta GPU Benchmarks

csbin

Senior member
http://gamegpu.com/action-/-fps-/-tps/mirror-s-edge-catalyst-beta-test-gpu


[GameGPU benchmark charts: ysyJi.jpg, jVjCn.jpg]
 
hmm, the 380X with 4GB is slower than the 280X with 3GB. The 7970 with 3GB is faster than the GTX 960 4GB version, so why is the 7970 below the 960?

[benchmark chart: MirrorsEdge_1920.jpg]


silly GameGPU.
 
So far so good, nearly 60 fps maxed out 1080p for most mid-range GPUs.

But NV is sponsoring this game (it's on their channel and they are giving away beta keys), so performance could get a lot worse by the time it ships with GameWorks tacked on... hopefully not; I'm looking forward to this title. 🙂
 
That much-touted "GCN effect" never quite seems to come to fruition... again. Maybe the next game....

Not sure if serious. Let me help you.

$399 290 > $699 780Ti. $549 290X easily beats 780Ti.
$299 280X > $500-650 780.

That's a terrible showing for all Kepler cards.

We shouldn't even have a situation where the 290/290X would be easily beating the 970/980 in games, since GameGPU uses reference versions, which are automatically 7-8% slower than an after-market 390/390X would be. Besides, Hawaii was never meant to compete with a next-gen NV architecture designed from the ground up. The fact that the 290/290X/390/390X regularly smack the 780/780Ti around is already impressive given the huge price disparity between those cards during that generation.

Let's not forget the 980Ti cost $620-650 for most of its life, when the R9 290X was $280-300. The 290X scores on those charts are easily beaten by an after-market $330 390 too. What do you expect exactly, a $330 card regularly beating a $600-650 one?

Next time try to put some objective thought into your post.

You know it's changed days when the worst result AMD sees is a 970 tying with a 290X. 😉 And the 280X beats the 780 again.

Ya, apparently the 290 should have been regularly beating the 780Ti, and the 280X was always a 780/OG Titan competitor. Funny enough, in 2012 the same people kept claiming the 680 is faster than the 7970GHz (and a 7970GHz beats a 280X).

What's next, a $299 Polaris is a fail since it loses to a $549 1080?

If anything, GameGPU paints Hawaii cards in the worst light possible. A stock 390 beats a 290X and 390X is 8% faster.
http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/23.html

Something else the green camp always ignores: crypto-currency mining has more or less made AMD cards free since the 4870 days. That means even if a direct NV competitor ties or slightly beats an AMD card, it's an automatic fail for NV, since the AMD card effectively cost $0.

That means in the real world for anyone who did some research it goes like this:

$0 280X ~ $1000 OG Titan
$0 290 ~ $700 780Ti

Yaaaa, the NV card is doing alright, almost....

Crypto-currency actually proves and highlights just how biased many NV owners are: even when the videocard makes money and pays for itself, they still don't want it.

Anyway, 2GB cards are completely dead, which is what I have been saying on this forum since the 285 came out, and my view was reinforced when the 380/960 2GB came out. Of course, once again, the butthurt of 2GB owners and the green camp's defense of the 670-680/950-960 2GB turds got in the way of facts, such as professional reviews also proving that 2GB cards are crap.

Welcome to 2016.
 
Last edited:
A 270W Hawaii is tied with a 150W GTX 970, that is indeed some poor showing from AMD.

Ya, those cards are from different generations. It's like sliding a 780Ti in place of a 290X and claiming it's inefficient junk against the 970/980.

Also, after-market 970 used 170-185W, not 150W. Not all Hawaii chips use 270W either.

HardOCP has XFX 390 system drawing just 43W more than MSI Gaming 970 system:
http://m.hardocp.com/article/2015/09/21/xfx_r9_390_double_dissipation_8gb_video_card_review/10#.Vxugmsj3bCQ

AnandTech shows an 80W difference between a 970 and a 390 rig.
http://www.anandtech.com/show/9784/the-amd-radeon-r9-380x-review/13

November 2014, Sapphire Tri-X 290 was on sale for $200 US with 3 free games:
http://slickdeals.net/e/7412008-sapphire-radeon-r9-290-tri-x-video-card-4gb-gddr5-3-games-199-99-ar-vco-td?src=SiteSearchV2Algo1

That means someone who jumped on such deals enjoyed ~970 performance and had $130 extra to spend on a 650-660W Platinum or a 750-850W Gold power supply. All this time, outside of gaming, the Hawaii card was also making money.

It's even worse for the 980. The PowerColor PCS+ 290X went as low as $254 in November 2014, a $300 savings over the 980's $550 price at the time:

http://slickdeals.net/e/7448366-r9-290x-powercolor-pcs-axr9-290x-4gbd5-ppdhe-4gb-254-99-ar-visa-checkout-fs?src=SiteSearchV2_SearchBar

That means the 290X user has $300 extra set aside for Vega/Big Pascal or even the 1080.

So now you completely evaded how the entire Kepler line is getting annihilated by AMD, and also ignored the huge price disparity between after-market 290/290X cards and the 970/980 when they came out, all to try to show that NV is doing well?

Btw, the average US price for electricity is $0.12/kWh.

100W power difference × 4 hours a day × 365 days a year × $0.12/kWh = $17.52 a year.

In Ontario, Canada, the average rate is $0.085/kWh, which works out to an annual cost difference of $12.41. Big deal. Modern AAA games have DLC that alone costs $30-60.
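The arithmetic above can be sanity-checked with a few lines of Python (the helper name and structure are mine, not from the thread):

```python
def annual_cost(extra_watts, hours_per_day, rate_per_kwh, days=365):
    """Annual electricity cost of an extra power draw at a given rate."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * days  # W -> kWh
    return kwh_per_year * rate_per_kwh

# 100W difference, 4 hours of gaming a day:
us = annual_cost(100, 4, 0.12)        # US average rate, $0.12/kWh
ontario = annual_cost(100, 4, 0.085)  # Ontario average rate, $0.085/kWh

print(f"US: ${us:.2f}/year")          # US: $17.52/year
print(f"Ontario: ${ontario:.2f}/year")  # Ontario: $12.41/year
```

Both figures in the post check out: 146 kWh a year at either rate lands in the $12-18 range.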

How much is the full price of Fallout 4 and the Witcher 3? Those 2 games alone = $240-250 US!

If I wanted to care about $12-18 in annual electricity costs, all I need to do is not buy just 1 AAA game at launch and wait until GreenManGaming or Steam has it on sale for 50% off.
 
Last edited:
The 2GB 960, as expected, tanks pretty heavily; however, the 4GB variant is actually doing better than I expected. I figured the 7970 would edge it out. While the 280X is undoubtedly the GHz edition, anyone know the clock speed the 960 was run at (probably the SSC)?
 
Wow, Kepler just sucks hard today. A 680 barely beating a 7870? Glad I bought a 290. That card has really stretched its legs.
 
Ultra is usually the only setting tested across the mid-to-high-end lineup.

Which is a shame, actually. Ultra settings are basically a fraud, and the industry knows this very well. Triple-A PC games are coded primarily for High settings (@60fps), while their console equivalents use ~Medium settings (@30fps, or in this case also 60fps). The delta between High and Ultra is in most cases visually negligible, while the performance hit on your hardware is massive.

Ticking everything to Ultra on the newest titles for benchmarking is a very imprecise metric for measuring performance. Wish they'd also take Medium/High settings into account.
 
Mirror's Edge is not a Gameworks title. You can stop lying now.

Isn't Horizon Based AO NVIDIA's HBAO? It's in the video settings.
I'm not sure if it's HBAO, but it makes zero image-quality difference and eats like 10-20% of fps, so that sounds like NVIDIA tech.
 
Which is a shame, actually. Ultra settings are basically a fraud, and the industry knows this very well. Triple-A PC games are coded primarily for High settings (@60fps), while their console equivalents use ~Medium settings (@30fps, or in this case also 60fps). The delta between High and Ultra is in most cases visually negligible, while the performance hit on your hardware is massive.

Ticking everything to Ultra on the newest titles for benchmarking is a very imprecise metric for measuring performance. Wish they'd also take Medium/High settings into account.

It is what it is. While visuals are one factor of what makes PC gaming great, I believe flexibility is another factor.

However, many on these forums are both hardware enthusiasts and gaming enthusiasts. Undoubtedly, having a card that runs everything on Ultra is something of a symbol of pride, showing the builder made the "correct" choice of components. While something like a 7870 can run this game acceptably well at reduced settings, the hardware-enthusiast side wants something that utterly stomps the game like one swats a fly. Anything that shows signs of hitting its limits becomes that nagging feeling in the back of your mind.

In my opinion, there is nothing at all wrong with wanting to run everything on Ultra; it's nice to have that semi-security of being able to run anything for years to come (albeit at steadily lower settings). Just be prepared to research and pay for that privilege.
 
Last edited:
Mirror's Edge is NVIDIA sponsored.

[screenshot: yUubceQ.png]


https://www.youtube.com/watch?v=lEQEsTTmZbc

^ Official NV advertisement. They have a few more, with dev commentary too.

Might be a case of PhysX usage.

This is what NV needs to do to cancel the console effect. Remember Rise of the Tomb Raider? Async Compute on the Xbone for PureHair & Global Illumination... DX11 port on PC, with a later DX12 patch that is broken. haha
 
Do you retract your statement, now that Nixxes didn't hold back DX12 for Rise of the Tomb Raider on PC, since it takes some real time to port to DX12, like I said?

No, because they were already using DX12 for the Xbone. If they had actually tried, or been told, to do DX12 for the PC, it would have been fine. Crystal Dynamics & Nixxes aren't amateurs; they have had Mantle experience before, getting Mantle into the PC port alongside DX11, and Mantle ran faster.
 