AMD's Gaming Evolved snags FarCry3


SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Why are people comparing a 7970GHz Edition to the GTX680? Why aren't we using the GTX690 and seeing which card is faster? I mean, the GTX690 only uses 40 watts more than the 7970GHz Edition and it's much faster...

LOL. Cheapest 7970GHz Edition on Newegg: $420. Cheapest GTX690: $1000. I don't know why they're not being compared. :whistle:
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Hopefully there may be some PC-centric goodies in Far Cry 3: improved fidelity or gaming-experience potential for PC gamers over the console version.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
They actually could license CUDA and PhysX. And they actually had quiet talks with Nvidia about PhysX.
Did you read what you quoted from Huddy? :confused:

[Nvidia] put PhysX in there, and that's the one I've got a reasonable amount of respect for. Even though I don't think PhysX - a proprietary standard - is the right way to go, Nvidia touts it as an "open standard" and says it would be "more than happy to license it to AMD", but [Nvidia] won't. It's just not true! You know the way it is; it's simply something [Nvidia] would not do, and they can publicly say it as often as they like, knowing it won't happen, because we've actually had quiet conversations with them and they've made it abundantly clear that we can go whistle.
I am a realist -- not an idealist.
Then why do you keep insisting that proprietary tech is the way forward? Just go back and look at history: most often this strategy ultimately fails. That is reality, which is the cornerstone of realist beliefs. And proprietary tech hurts consumers, which is certainly not letting "the market decide", something you vehemently tout as critically important.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
BUT, this "Game GPU" website, I'm not sure, but from what I see their testing methodology is far from ideal, they have different people with different combination of CPUs and VGAs making the tests? (anyone who understands Russian can clarify that?)

Used to be that way. Their logic was that someone who is buying a $500-1000 GPU setup is not going to pair it with a Phenom II CPU. So they would pair low-end CPUs with low-end GPUs, mid-range CPUs with mid-range GPUs, and high-end CPUs with high-end GPUs to simulate what they felt was more of a real-world gamer system.

This was a while ago, and after receiving feedback from their community they went back to what almost all websites use -- a high-end overclocked test platform for all the GPUs. The 10 separate test beds listed in their chart are there for CPU-limited 1024x768 testing only. All of the Half-Life 1 Black Mesa GPU tests are run on:

Core i7 3930K @ 4.8GHz

Thus, all GPU testing now uses one high-end overclocked 6-core i7 for every GPU, to remove CPU bottlenecking.
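A rough way to picture why that matters (just my own back-of-the-envelope sketch, not GameGPU's actual methodology): the number a benchmark reports is roughly capped by whichever of the CPU or GPU is slower, so a slow CPU hides the differences between GPUs.

```python
# Back-of-the-envelope model (illustrative only): the measured frame rate is
# roughly capped by whichever side is slower, the CPU or the GPU.
def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Approximate a benchmark result as the lower of the CPU and GPU limits."""
    return min(cpu_fps, gpu_fps)

# Slow CPU: two very different GPUs produce the same number (CPU-bottlenecked).
print(observed_fps(cpu_fps=60, gpu_fps=90), observed_fps(cpu_fps=60, gpu_fps=140))    # 60 60

# Fast overclocked CPU: the GPU difference shows up again.
print(observed_fps(cpu_fps=160, gpu_fps=90), observed_fps(cpu_fps=160, gpu_fps=140))  # 90 140
```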
 
Feb 19, 2009
10,457
10
76
Then why do you keep insisting that proprietary tech is the way forward? Just go back and look at history: most often this strategy ultimately fails. That is reality, which is the cornerstone of realist beliefs. And proprietary tech hurts consumers, which is certainly not letting "the market decide", something you vehemently tout as critically important.

I have to agree that moving forward needs to happen with DX11 and onward, as it's an open standard AGREED TO BY ALL involved in setting the parameters for each iteration.

But I disagree; the market does decide. NV's advantage with PhysX is real and should not be underestimated by some of you on this forum. A unique feature that adds extra effects that are noticeable? Many NV users see a few good titles with PhysX and it's worth it to them to pay a premium, etc. The market does, therefore, decide.

This is why AMD is pushing GE to really include a lot of DX11 compute features, so they can hammer away at the mass mindset: want extra effects? Make sure you have an AMD card, else your NV GPU will run like crap. I suspect that after a few great games with this, those who don't normally care to try the other side will think twice.

It's not going to work long term, because Big K is certainly going to be a full compute beast, so it will handle any DX11 load just as well. But it's a good dev-relations approach regardless, and it gets the brand recognition out there. Seeing AMD's logo before playing your latest and greatest games, repeatedly, is worth something at least.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Why are people comparing a 7970GHz Edition to the GTX680?

Because the HD7970 GE is a direct competitor to the GTX680, and actually costs less:

Gigabyte Windforce 3x 1100MHz 7970 GE = $450 (Newegg)
Sapphire Vapor-X 1050MHz 7970 GE = $450 (Newegg)

You are the guy repeating over and over again that AMD has an advantage with DirectCompute. Even after I corrected you several times, you have no problem with this statement. Brink - an OpenGL game based on an outdated engine - as an example of the DirectCompute performance of GCN is hilarious and ridiculous at the same time.

Do you just make stuff up out of the blue? When did I say "Here is a list of 30 games, including Brink, that shows that Tahiti XT is superior for DirectCompute"? I just expanded the list of fairly modern PC games to paint a more accurate picture of where the GPUs stand, because your list included outdated games like Dirt 2, Lost Planet 2 and HAWX 2 that hardly anyone plays anymore. But you have a problem with me expanding the list to include Brink, even though it's newer than some of the games you added? Hilarious.

You want to talk about DX11? Maybe you should start to use DX11 games. Until then I don't care about your marketing crusade anymore.

I did, but you ignored them. You also clumped in DX11 games where the 680 is slower (Dirt 3, Batman AC) or not any faster at all (Crysis 2).

You may want to check out modern reviews that tested 15+ games in the last 1-2 months. Start with TPU, ComputeBase, Xbitlabs and HT4U.net: the HD7970 GE > GTX680 in most DX11 games. But of course you'll just come up with excuses for why these tests are biased, or claim AMD is cheating by "downsampling" or by using DirectCompute to accelerate lighting calculations in Dirt Showdown and Sniper Elite V2.

What's next, continuing to make up data that the HD7970 GE uses 275W of power at load like the GTX480 does? :whistle:

And ignoring that in GPU-demanding games like Skyrim + ENB mods, Crysis 1 / Warhead, Metro 2033, Witcher 2 Enhanced Edition, Trine 2 + SSAA, Arma II DayZ mod, Alan Wake, Anno 2070 and Sleeping Dogs, the HD7970 GE beats the 680 handily? You can keep your 10% faster in BF3; I'll take 20-30% faster in the other GPU-demanding titles. 10% slower in a couple of games I can live with; 20-30% in other games starts to matter more, especially when the competitor charges $480+ for a slower GPU and then another $40-50 for 4GB of VRAM on top. Pure brand-name money grab.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I am a realist -- not an idealist.

What's the point of PhysX? To create more realistic-looking games. The idealist viewpoint is that if PhysX is working as intended, it should make games look much more realistic, since it's simulating real-world physics.

You are going to love PhysX in BL2 then, since you seem not to care at all whether PhysX actually abides by real-world physics as long as "some PhysX feature" is added. Hundreds of redundant particles?! Yes please, PhysX!!! BL2 has the world's most "realistic" physics-particle snow simulation. Did you know that when snow is disturbed in the real world - wind dispersing it, a snow blower, etc. - it ends up looking like pebbled rocks? I didn't know that snowflakes turn into ice when you shoot them, unless the entire planet is covered in a shield of ice. But when you shoot the snow in BL2, that's exactly what happens.

This is a repeat of Mafia II, where PhysX creates hundreds of exaggerated particles that look fake. Thus far in BL2, PhysX has done a very good job with the actual cloth/wind simulation. The rest looks like a typical PhysX implementation let-down. Even if we assume the planet is covered in ice, if you shoot ice with a gun, do you get 300 pebble-sized, almost identical-looking ice rocks scattered on the ground like this? There is no chance at all that at least 2-3 of them will be much larger in size? If these really are ice particles and the entire planet is covered in ice, where are the cracks in the ice and the crater from the impact of the shot? I am sorry, I actually want games to have realistic, real-world-looking physics effects. :rolleyes:

[screenshot: Borderlands2_2012_09_18_15_51_15_099.jpg]
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Umm... the whole point of PhysX in games is to make them more realistic. Borderlands can be a fully imaginary world, but PhysX is meant to simulate real-world physics on planet Earth. Last time I checked, that's the only kind of physics model we actually use and understand. I guess you don't care at all whether PhysX actually looks realistic, as long as there are 1000 random particles on screen?

Interestingly enough, if the game senses any trace of an AMD .dll file or driver on your system, even if you have an NV card, it blocks out PhysX. The game also blocks out PhysX on CPUs. NV also blocks out PhysX when you pair an AMD GPU with a low-end NV GPU, despite this game needing nothing more than an 8800GT to accelerate PhysX in BL2. Want to go out and buy a $40 GT430 to use as a slave card for PhysX with an AMD flagship GPU? No sir, you cannot do it without a hack. That's proprietary. AMD using DirectCompute in AMD Gaming Evolved titles to take advantage of its architecture is a much fairer game - that's actual game-engine optimization for their architecture. No one stops NV from optimizing its drivers for Dirt Showdown, Sniper Elite V2 or Sleeping Dogs.
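To illustrate what "senses any trace of AMD" could mean in practice, here's a purely hypothetical sketch. The DLL names are just examples of well-known AMD driver files (atiadlxx.dll is AMD's ADL library); this is an illustration of the idea only, not NV's or Gearbox's actual detection code:

```python
# Purely hypothetical sketch of a vendor lock-out check. NOT NV's or Gearbox's
# real code; it only assumes detection via known AMD driver files on the system.
import os

# Example AMD driver DLLs such a check might look for (assumption, for illustration).
AMD_DRIVER_DLLS = ["atiadlxx.dll", "atiadlxy.dll", "aticfx64.dll"]

def amd_driver_present(system_dir=r"C:\Windows\System32"):
    """Return True if any of the listed AMD driver DLLs exists on the system."""
    return any(os.path.exists(os.path.join(system_dir, dll)) for dll in AMD_DRIVER_DLLS)

def gpu_physx_allowed():
    """Hypothetical gate: refuse GPU PhysX whenever AMD driver files are detected."""
    return not amd_driver_present()

print("GPU PhysX enabled:", gpu_physx_allowed())
```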

The PhysX workload on the GPU is very small in this game, as can be seen from the result of adding a second GTX690 for PhysX alone (in red). That means a Core i7 could have easily handled the PhysX effects, or a slave GT430 card could have done it without a problem alongside an AMD GPU, if NV didn't block this for marketing reasons:

[benchmark chart: 690.png]


Pretty amazing how you guys think it's awesome that one company purposely blocks technology from 40-50% of gamers, even if they have a fast enough CPU or an old slave NV card lying around.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
But I disagree; the market does decide. NV's advantage with PhysX is real and should not be underestimated by some of you on this forum. A unique feature that adds extra effects that are noticeable? Many NV users see a few good titles with PhysX and it's worth it to them to pay a premium, etc. The market does, therefore, decide.
The market is not free to decide naturally; that is the point. The market for a long time "decided" that Internet Explorer was the best browser. Websites coded for it because it was artificially forced into people's homes.

Just because a feature may be desirable to some, and that feature causes people to buy a product, does not mean the market is truly deciding. The market at one time "decided" that consumers could only have one local phone carrier.
 

NIGELG

Senior member
Nov 4, 2009
852
31
91
Borderlands is the epitome of realism.
PhysX is supposed to add realism... no matter the title. Mafia 2... shoot pillars and see hundreds of weird, unrealistic, identical particles come out of said pillar... lol.
 
Feb 19, 2009
10,457
10
76
PhysX isn't about realism; it's about cool effects that are absent when you don't use NV GPUs. That's it. Think of the masses, the average Joe, and their reaction. It works because so many people asking for GPU purchase advice disregard AMD because it lacks PhysX.

The market decides, with good marketing to influence buyers. Is it wrong to lock out AMD users from PhysX regardless of whether they own an NV GPU? Nope, just business.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
PhysX is supposed to add realism... no matter the title. Mafia 2... shoot pillars and see hundreds of weird, unrealistic, identical particles come out of said pillar... lol.

Ya, that's the point. Since NV acquired Ageia, the actual physics implementation via PhysX in games has not really improved. It's been a long time now, so what have NV's engineers been doing all this time? Where are the generational PhysX improvements? Every new PhysX game is just hundreds or thousands of random particles and some wind/cloth simulation.

BL2 runs at a blistering level of performance on modern GPUs. This is not one of those games you go out and spend $400+ on a GTX560Ti upgrade for. If you are playing BL2, you do not even need a modern NV GPU, even if you want to run PhysX. It's still based on the UE3 engine with DX9 effects - so no DX11 effects of any kind. An overclocked GTX470 from 2010 will crush this game without much effort, as will a $229 GTX660.

[benchmark chart: b2 1920.png]
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
Interestingly enough, if the game senses any trace of an AMD .dll file or driver on your system, even if you have an NV card, it blocks out PhysX. The game also blocks out PhysX on CPUs. NV also blocks out PhysX when you pair an AMD GPU with a low-end NV GPU, despite this game needing nothing more than an 8800GT to accelerate PhysX in BL2. Want to go out and buy a $40 GT430 to use as a slave card for PhysX with an AMD flagship GPU? No sir, you cannot do it without a hack. That's proprietary.

It seems you know a lot about how to make the two work together seamlessly; perhaps you should work as a driver writer! I'm sure it's real easy to have AMD and Nvidia drivers installed at the same time without issues.

The PhysX workload on the GPU is very small in this game, as can be seen from the result of adding a second GTX690 for PhysX alone (in red). That means a Core i7 could have easily handled the PhysX effects, or a slave GT430 card could have done it without a problem alongside an AMD GPU, if NV didn't block this for marketing reasons:

The only shocking part of that chart is that Borderlands 2 somehow maxes out a GTX 690 @ 1080p, which is very odd to me.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
Is it wrong to lock out AMD users from PhysX regardless of whether they own an NV GPU? Nope, just business.
This is a dangerous way of thinking, and it ultimately leads to harming consumers and taking money out of their pockets. It's basically anti-competitive behavior, similar to Intel using non-optimized code paths when it detects an AMD processor. If you endorse these types of practices, then you do not want the market to decide naturally.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It seems you know a lot about how to make the two work together seamlessly; perhaps you should work as a driver writer! I'm sure it's real easy to have AMD and Nvidia drivers installed at the same time without issues.

"PhysX" as you now know it worked perfectly before in 2 situations:

1) When you bought a slave Ageia card and paired it with a system with an AMD GPU;
2) Even after NV renamed Ageia's physics technology to PhysX, it still worked when you used an AMD + NV card combination, before the Forceware 186 drivers.

Thoughts?

You are saying NV didn't purposely lock out PhysX when it senses an AMD video card or driver? That's pretty well known, actually, which is why it works with a hack.

Also, why can't you run PhysX on a Core i7 6-core CPU again?

It's pretty obvious NV doesn't want people to go out and buy a $40 GT430 that will run all PhysX effects for the next 5 years. There goes their entire marketing plan for selling PhysX as a key feature on their future GPUs.

The only shocking part of that chart is that Borderlands 2 somehow maxes out a GTX 690 @ 1080p, which is very odd to me.

Do you need a GTX690 to have playable framerates in BL2 @ 1080P? The game continues to scale if you add more GPU power, but that doesn't mean you need 100 fps minimums. What's so odd about that? CF and SLI scaling works in the game as well. If you want a 150 fps average, get a GTX690. The rest of the world can play this game smoothly on an HD6950/GTX560Ti at 1080P. You are not buying modern $400-500 GPUs to play BL2 at 1080P; that's just a total waste of money. This is not a Far Cry, Doom 3, Crysis, Metro or Witcher 2 style game that actually requires upgrading, unless you are still gaming on a GTS250.
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
"PhysX" as you now know it worked perfectly before in 2 situations:

1) When you bought a slave Ageia card and paired it with a system with an AMD GPU;
2) Even after NV renamed Ageia's physics technology to PhysX, it still worked when you used an AMD + NV card combination, before the Forceware 186 drivers.

Thoughts?

My thoughts are that the two companies (or at the very least Nvidia) don't want to work on getting stable drivers for something they cannot fully control. So what happens if AMD wants to change something in their drivers and it messes up the PhysX operation of the secondary card? Who is to blame, AMD or Nvidia? Who then has to put in the work to correct it?

You are saying NV didn't purposely lock out PhysX when it senses an AMD video card or driver? That's pretty well known, actually, which is why it works with a hack.

That is not what I am saying at all. Not sure where I said that... Of course they did this on purpose, for the reason I gave above.

Also, why can't you run PhysX on a Core i7 6-core CPU again?

Who's to say a 6-core i7 can run the PhysX in BL2? Have you tested it? No. Maybe Gearbox did test it and felt it stressed the CPU too much, so they stuck to GPU only. Who knows? You, for sure, don't.

It's pretty obvious NV doesn't want people to go out and buy a $40 GT430 that will run all PhysX effects for the next 5 years. There goes their entire marketing plan for selling PhysX as a key feature on their future GPUs.

I'm not even sure what you are trying to say here... If Nvidia sold a GT430 for PhysX use, I'd say they are following their marketing plan to a T...

I'd like to see more PhysX usage data from other sources. I took a look at the (translated) page you got that chart from, and it doesn't say how they tested PhysX. Unless I missed it somewhere, they just said they ran a test to measure PhysX usage. Was this just shooting the ground and watching snow fly, or was there more to it?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
What's the point of PhysX?

Imho,

To try to add more immersion or improve the gaming experience. For me this may encompass three aspects: fidelity, realism and changed gameplay. So far it has mostly been subjective fidelity and more realistic behavior, but the key for the future of physics is to dramatically redefine how games are played, and it may take the entire industry to bring this to the gamer.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
And proprietary tech hurts consumers, which is certainly not letting "the market decide", something you vehemently tout as critically important.

I never said such a thing. I have said:

Proprietary may offer division, fragmentation and chaos, but it also may bring innovation, choice and improved gaming experiences, which hopefully creates awareness so that the important players may forge standards that mature, so that many more gamers may enjoy them.

The key is that the view is actually realistic and balanced.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
I never said such a thing...
Yes, in effect that is exactly what you are advocating.
Proprietary may offer division, fragmentation and chaos, but it also may bring innovation, choice and improved gaming experiences, which hopefully creates awareness so that the important players may forge standards that mature, so that many more gamers may enjoy them.
This is just a load of nonsense, with no commitment to anything. May offer, may bring, may enjoy, may forge, hopefully.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
I'm not even sure what you are trying to say here... If Nvidia sold a GT430 for PhysX use, I'd say they are following their marketing plan to a T...

I'd like to see more PhysX usage data from other sources. I took a look at the (translated) page you got that chart from, and it doesn't say how they tested PhysX. Unless I missed it somewhere, they just said they ran a test to measure PhysX usage. Was this just shooting the ground and watching snow fly, or was there more to it?

Nvidia doesn't just want people buying a cheap Nvidia card and using it for PhysX for a few years; they want people buying new cards every year.

An AMD user can buy a cheap Nvidia card, use it for PhysX, and keep buying new AMD cards as their main GPU.

If Nvidia didn't play these stupid games, I'd have a 460 in my system for the Batman games.