SomeoneSimple
Member
- Aug 15, 2012
> Dunno if you noticed, with recent drivers, AMD is as fast or faster even in BF3.
Not according to Anandtech's own benchmarks. The 7970 and especially AMD's 7950 score far worse than NV's counterparts.
> Dunno if you noticed, with recent drivers, AMD is as fast or faster even in BF3.
"Fair" in this case would mean not doing obscene work loads for no visual gains. Something im sure users would select if there were an option. Luckily, AMD has this option in their drivers, discard completely rubbish tessellation factors.
I like having this option, and thats my point above, if NV push tessellation like crazy, its irrelevant as an AMD user you can ignore it. If AMD push Forward + in future games... its pretty damn relevant when kepler sucks at it.
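To put rough numbers on why capping helps: the triangle count coming out of a tessellated patch grows roughly with the square of the tessellation factor, so clamping extreme factors cuts most of the work while changing very little on screen. A minimal sketch of that scaling, where the quadratic estimate and the 16x cap are illustrative assumptions, not AMD's actual driver logic:

```python
# Rough illustration, not AMD's actual driver logic: for a tessellated patch,
# the number of generated triangles grows roughly with the square of the
# tessellation factor, so clamping extreme factors removes most of the work.
def approx_triangles(tess_factor: float) -> int:
    """Very rough triangle-count estimate for one patch (quadratic growth)."""
    return max(1, int(tess_factor ** 2))

def clamp_factor(requested: float, cap: float = 16.0) -> float:
    """Mimics a driver-side cap on the factor the game asks for (cap value is illustrative)."""
    return min(requested, cap)

for requested in (8, 16, 32, 64):
    full = approx_triangles(requested)
    capped = approx_triangles(clamp_factor(requested))
    print(f"factor {requested:>2}: ~{full:>4} tris uncapped, ~{capped:>4} tris with a 16x cap")
```

Going from a factor of 64 down to a 16x cap cuts the estimated triangle count by roughly 16x in this toy model, which is the kind of "work for no visual gain" the driver option is aimed at.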
> Not according to Anandtech's own benchmarks. The 7970 and especially AMD's 7950 score far worse than NV's counterparts.
This is pretty bullshit when games like HAWX, HAWX 2, Lost Planet and Lost Planet 2, along with a bunch of other TWIMTBP games, totally skewed the results towards NV. What then? They still used them. Double-standard losers; I won't visit them in the future after seeing that statement.
Imho,
I don't agree! -- a title may have a fidelity setting that shines on a particular architecture but doesn't really affect gameplay. No one is really harmed, and yet a gamer can be rewarded for the strength of an architecture.
To wait 'till both architectures are even -- an ideal playing field -- and to say that if one can't handle the fidelity setting then it's not worth doing at all -- how the hell does one move forward with that idealistic view?
The idea that all gamers have to be able to enjoy it, and that if the other architecture or software isn't up to speed then no one gets it at all? You can't push forward and innovate with that backwards thinking, to me. It's noble to want everyone to have equal gaming experiences, but it's an idealistic view, and idealism becomes the enemy of good.
> Not according to Anandtech's own benchmarks. The 7970 and especially AMD's 7950 score far worse than NV's counterparts.
> Imho,
> I think it is wonderful!
Some people on our forum have argued that it's great when AMD and NV work closely with developers, and I said that if that continues, I fear we'll need both brands of videocards to play games. That's only getting worse now that AMD is throwing $ at Gaming Evolved: first Dirt Showdown, then Sniper Elite V2, now this.
Final8ty: Only if the market can sustain it, I couldn't imagine spending that amount of money on 2 cards I can't CFX/SLI. And if the prices go down, this really doesn't win anything for the GPU makers.
That's because a stock 7950/7970 isn't that fast in BF3, since AMD cards suffer with deferred MSAA in the Frostbite 2.0 engine. BF3 is also one of the best games for NV for that reason. It doesn't matter much, though, since the 7950 and 7970 overclock 25-50%, so they easily catch up to NV's cards while beating them almost everywhere else.
You can overclock the 7900 series to still get close to a 670/680 in the games where the 670/680 is faster, but you cannot really overclock a 670/680 to catch up to the 7900 series in the games where AMD is leading, because that lead is usually massive (Crysis 1/Warhead, Metro 2033, Anno 2070, Alan Wake, Sniper Elite V2, Sleeping Dogs, Dirt Showdown, Serious Sam 3, Bulletstorm, Skyrim with mods / 8xAA, Batman: AC with 8xAA, etc.). Basically, NV cards are good at the popular games such as WoW, Guild Wars 2 and BF3/Crysis 2, but on the whole they have some serious gaps in many other, less mainstream games.
Unless you play BF3/GW2/WoW/Batman: AC/Lost Planet 2/Medal of Honor Warfighter and Crysis 2/3 exclusively, those gaps will show up eventually.
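For what it's worth, the "overclock to catch up" argument above is just arithmetic. A rough sketch, where the 15% stock deficit is a hypothetical placeholder rather than a benchmark result, and frame rate is assumed to scale linearly with core clock (which is optimistic):

```python
# Back-of-envelope for the overclocking argument above. The 15% stock deficit is
# a hypothetical placeholder, not a benchmark result; the 25% headroom is the low
# end of the figure claimed in the post. Frame rate is assumed to scale linearly
# with core clock, which is optimistic (memory bandwidth and CPU limits eat into it).
stock_deficit = 0.15        # hypothetical: 7970 trails a GTX 680 by 15% at stock
overclock_headroom = 0.25   # low end of the claimed 25-50% overclock

perf_stock = 1.0 - stock_deficit                   # 0.85x of the 680
perf_overclocked = perf_stock * (1.0 + overclock_headroom)

print(f"stock: {perf_stock:.2f}x of the 680, overclocked: {perf_overclocked:.2f}x")
# -> stock: 0.85x, overclocked: ~1.06x, i.e. the gap is roughly closed under these assumptions
```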
If all you play is BF3, then sure, a $300 GTX 660 Ti is good.
You keep saying this over and over. So the fact that your card performs like a dog in Dirt Showdown, Sniper Elite V2 and Sleeping Dogs is great? Jeez.
Maybe you can send me a GTX680 when Medal of Honor Warfighter launches so I can actually play the game....
Either way, this game of throwing $ at developers and one-upping each other is just hurting us. I don't feel like having an AMD and an NV card in my rig to play certain games. There is no reason that someone like Grooveriding, with $1k of GPUs, should be running a game at 40-45 fps. Stuff like this hurts PC gaming. In the end, a person may buy a $400 NV GPU, fire up one of these AMD-sponsored titles, realize it runs poorly, and abandon PC gaming for the PS4. That's what will happen if $400 GPUs can't even run a basic game at 60 fps. It's no different from someone who buys a 7970, fires up The Secret World, and gets the most artificial-looking red bricks via tessellation seen in the last 2 years, running like a dog.
91 fps in DX9 Ultra on 7970 and 96 fps on a 680
vs.
24 fps in DX11 Ultra on 7970 and 53 fps on a 680
Almost a 4x reduction in performance on AMD cards, and the game doesn't look any better... that's great for PC gaming? Yeah, I guess it is if you bought a 680.
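Just to show where the "almost 4x" figure comes from, using only the fps numbers quoted above:

```python
# Quick check of the "almost 4x" claim, using only the fps figures quoted above.
fps = {
    "7970": {"dx9_ultra": 91, "dx11_ultra": 24},
    "680":  {"dx9_ultra": 96, "dx11_ultra": 53},
}

for card, r in fps.items():
    drop = r["dx9_ultra"] / r["dx11_ultra"]
    print(f"{card}: {r['dx9_ultra']} -> {r['dx11_ultra']} fps, about {drop:.1f}x slower in DX11 Ultra")
# 7970: ~3.8x drop (hence "almost 4x"); 680: ~1.8x drop
```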
Not many PC gamers have $1k to buy a $500 AMD card and a $500 NV card to get the best of both worlds.
Interesting comment, especially when you go on to say:
> Imho,
> Fear mongering and idealism.
Nvidia removes effects and features, then puts them back in using PhysX, with questionable results. Sure, the particles and such are interactive and "real time", but in the end it doesn't make any difference, and in many cases the effects are overblown in an attempt to showcase PhysX, and they end up looking stupid and out of place. It's one thing to put in some eye candy using PhysX; it's an entirely different kettle of fish when you REMOVE easily achievable effects done on the CPU just to prop up PhysX.
> "Hey guys, we can really enhance Borderlands 2 physics for our customers?" "Oh no, we can't do that -- AMD doesn't offer PhysX!"
> How the hell does one innovate and showcase their architectural strengths? Tech demos or actual content in games?
Standardization is what has driven this industry forward from the beginning. Every attempt to do otherwise has failed. It is not through chaos that innovation comes about; it's through cooperation, industry consensus and, most importantly, standardization. The web did not become the sensation it is by limiting itself to certain hardware. If Nvidia had first come up with a web markup language, they would have made sure it only ran on their computers, and it would have failed.
> Let's all hold hands together, dance around the camp fire, singing show tunes and get all excited that all gamers have the same gaming experiences with similar performance! Hey, that sounds like a frigg'n console! It's the chaotic competition that moves and drives innovation forward -- no, let's wait for everyone to be on the same page first before we move forward.
> Nvidia removes effects and features, then puts them back
> It is not through chaos that innovation comes about; it's through cooperation, industry consensus and, most importantly, standardization.
If you have not observed this by now, well.... Please, expand on this!
> Imho,
> Fear mongering and idealism.
> Your thinking is backwards thinking -- Gee, even though we have these neat features and abilities for our GCN architecture -- let's not include them because GeForce owners may suffer a performance hit.
> "Hey guys, we can really enhance Borderlands 2 physics for our customers?" "Oh no, we can't do that -- AMD doesn't offer PhysX!"
> How the hell does one innovate and showcase their architectural strengths? Tech demos or actual content in games?
> Let's all hold hands together, dance around the camp fire, singing show tunes and get all excited that all gamers have the same gaming experiences with similar performance! Hey, that sounds like a frigg'n console! It's the chaotic competition that moves and drives innovation forward -- no, let's wait for everyone to be on the same page first before we move forward.
> Just like the industry consensus and most importantly standardization on x86.
I am not sure you thought this out very carefully. Do you want a virtual monopoly in graphics for 40+ years? The lack of innovation in x86 through its history is downright sad. We got slow, incremental updates to the ISA, although thankfully Moore's law pushed computing power up exponentially. What's worse is that Intel wanted to dump x86 and leave an entire industry stranded; it is almost dumb luck that, years earlier, AMD was put into the x86 game because of IBM, and we eventually ended up with x64. BTW, it is my understanding that x86 is not the fastest-growing ISA, and I believe it is no longer the most used ISA, no doubt because x86 is not available for anyone to license if they want to.
There is no need to rehash discussions over and over. I'm pretty sure you know exactly what I am talking about, and if you don't, c'est la vie; I don't want to go down that road.

Great job of expanding!
