More ATI/NVidia PR wars

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
In a perfect forum Wreckage would be perma-banned. How can one person have so many issues?

I find the PR wars amusing; I hope they keep it up for a spell.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
So nV owns AA?
Wreckage, I'd like to see you answer that question please.

Also from the article:

AMD is already working with games developers on over 20 forthcoming games which feature DX11 tech. NVIDIA has been nowhere to be seen!
I guess you'd argue that those 20+ DX11 games should be locked to ATi so that nVidia doesn't benefit for free from them?
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Why are people arguing that the new ATI cards suck because there are very few DX11 games? The 8800GTX didn't suck when it came out, even though no DX10 games appeared for about a year; it completely mowed down its competition in DX9 games. ATI's new cards do the same right now. It can also be argued that DX11 adoption is moving at a much faster rate than DX10 adoption did. M$ learned from its mistake with Vista :p

And the AA/DX11 discussion is funny. It basically comes down to this: AMD could pay Codemasters to lock out DX11 when an Nvidia card is present, meaning only ATI cards could use the DX11 render path and Nvidia would be stuck with DX9/DX10. Yet, as he argues, they don't.
 

Dkcode

Senior member
May 1, 2005
995
0
0
Just release some new damn cards so I can buy one :mad:
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
And the AA/DX11 discussion is funny. It basically comes down to this: AMD could pay Codemasters to lock out DX11 when an Nvidia card is present, meaning only ATI cards could use the DX11 render path and Nvidia would be stuck with DX9/DX10. Yet, as he argues, they don't.

DX11 basically = tessellation. AMD didn't need to wait for DX11; they could have paid developers to add tessellation to any game since the 2900 came out. In fact, they really should have, because at that point it was a feature unique to ATI. Instead they sat there with a cool feature and did nothing.

They waited until it became part of the Microsoft DX standard, which means they can't lock out Nvidia, because it's part of the standard: any card from any vendor with DX11-certified drivers can use it. Microsoft will have that down in legal writing somewhere, I expect.
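For what it's worth, that vendor neutrality is built into how DX11 exposes the feature: a game never asks which brand of card is present, it just asks the runtime for feature level 11_0, and any card whose driver passes DX11 certification answers the same way. A minimal sketch of that check (plain D3D11, nothing vendor-specific; assumes the Windows SDK headers):

```cpp
// Ask the D3D11 runtime for a feature-level-11_0 device. If this succeeds,
// tessellation (hull/domain shaders) is available, whatever the GPU vendor.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL wanted = D3D_FEATURE_LEVEL_11_0;
    D3D_FEATURE_LEVEL got = static_cast<D3D_FEATURE_LEVEL>(0);
    ID3D11Device* device = nullptr;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                    // default adapter, whoever made it
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        &wanted, 1,                 // only accept feature level 11_0
        D3D11_SDK_VERSION,
        &device, &got, nullptr);

    if (SUCCEEDED(hr) && got == D3D_FEATURE_LEVEL_11_0) {
        printf("DX11 feature level 11_0: tessellation available\n");
        device->Release();
    } else {
        printf("No DX11-class hardware/driver found\n");
    }
    return 0;
}
```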

Actions speak louder than words. I don't want to hear about why the competition is so nasty, or excuses for why they can't get games to run with feature X or Y. I want to see them show me why they're so good by making all my games work perfectly and giving me stuff the competition can't match.

It seems to me ATI spends far too much time talking about Nvidia, and far too little time making the most of their hardware (which is the best out there at the moment).
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
How dare you reply with a valid argument to Wreckage? Now you won't get an answer! :p

As for the PR slaps - I find them funny. nVidia should've ignored the blatant trap set by that AMD guy. Over-exaggeration from both camps is to be expected from the PR departments.

Are you trying to say Wreckage will dodge a question? Everyone knows that it's only you AMD/ATI/Intel fanboys who dodge questions when Wreckage asks the hard-hitting questions on behalf of nVidia. Wreckage never dodges questions!
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
So nV owns AA?

a.) AA works on ATI cards in Batman
b.) Who locked it out? The game developer?
c.) If NVIDIA wrote the code, then yes they probably do own it.
d.) Red glasses make certain facts hard to see.

Wreckage, I'd like to see you answer that question please.

Why so much concern over my on-topic post and none whatsoever over the many off-topic posts?
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
a.) AA works on ATI cards in Batman

How about the AA modes that work perfectly fine on ATI cards but were locked out on purpose when Batman detects that the user is running an ATI card? You're not stupid; you knew what he meant when he asked you whether nVidia owns AA.

Way to do something (and not for the first time) that you routinely accuse others of: dodging the question. In fact, there are multiple threads where I've personally asked you questions and you ignored my post, and that's not counting the questions others have asked you over the years that you couldn't answer because it would shed a bad light on nVidia, so you dodged them.
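Purely to illustrate the kind of check being described (a hypothetical sketch, not anything pulled from Batman's actual code): a game can read the primary adapter's PCI vendor ID through DXGI and simply hide its in-game AA option for anyone not on the "right" vendor. The DXGI calls are real; the InGameAAAllowed helper and the gating policy are assumptions for illustration only.

```cpp
// Hypothetical vendor gate: expose the in-game AA menu only on NVIDIA hardware.
// 0x10DE is NVIDIA's PCI vendor ID; 0x1002 is ATI/AMD's.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

bool InGameAAAllowed() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory),
                                 reinterpret_cast<void**>(&factory))))
        return false;

    bool allowed = false;
    IDXGIAdapter* adapter = nullptr;
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter))) {   // primary adapter
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        allowed = (desc.VendorId == 0x10DE);  // NVIDIA only; ATI users get no AA menu
        adapter->Release();
    }
    factory->Release();
    return allowed;
}

int main() {
    printf("In-game AA option: %s\n", InGameAAAllowed() ? "shown" : "hidden");
    return 0;
}
```

Nothing in a check like that depends on whether the AA would actually work on the other vendor's card, which is exactly the complaint.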

b.) Who locked it out? The game developer?

There is no question in my mind, or in the mind of anyone who is unbiased and can read between the lines, that the developer did it at the behest of nVidia, especially judging by how nVidia has locked ATI users out of running PhysX on an nVidia card used as an add-on companion to a main ATI GPU. The PhysX move was likely retaliation for ATI not licensing PhysX, but that's fine IMHO because it's part of business.

The Batman AA mode lockout is not fine, because it hurts consumers. Hell, you and I both know perfectly well that if this were ATI pulling these stunts, you'd have made 300 posts screaming about how evil ATI is for locking out a fairly common, standard AA mode that works perfectly fine on all modern hardware (performance hits aside).

This AA lockout on ATI cards, while not illegal, is almost like Intel and their hush-hush deals with OEMs not to use AMD CPUs. Anyone who is not an idiot knows it happened, even if there is very little solid evidence to prove it; there is a ton of circumstantial evidence.

c.) If NVIDIA wrote the code, then yes they probably do own it.

Not necessarily. Contract work, even when you're not being paid directly by the company you're writing the code for, usually comes with work agreements stating that the contracted work belongs to that company.

d.) Red glasses make certain facts hard to see.

Funnily enough, green glasses tint the world just as much as, if not more than, red ones.

Why so much concern over my on-topic post and none whatsoever over the many off-topic posts?

Yeah, you never ever post off-topic, such as talking about corporate finances in a thread about the pros and cons of new GPUs, right? I could pull other examples of you posting off-topic stuff, but I'm too lazy.
 

PingviN

Golden Member
Nov 3, 2009
1,848
13
81
Nvidia is moving its focus away from gaming; I don't see why they even try to deny it. Every time Nvidia makes an official statement it's always CUDA here and CUDA there, when they're not talking about PhysX. But PhysX is Nvidia's own standard and as such will never reach a wide enough audience.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Nvidia is moving its focus away from gaming; I don't see why they even try to deny it. Every time Nvidia makes an official statement it's always CUDA here and CUDA there, when they're not talking about PhysX. But PhysX is Nvidia's own standard and as such will never reach a wide enough audience.

As long as you ignore the fact that they have the fastest gaming video card available. Theirs are the only cards that support GPU physics in games. Their developer relations with game creators allow for a better gaming experience on their cards. Also, most people game on their cards over any other brand.

But to ignore all that would require a total disconnect from reality.
 

sciwizam

Golden Member
Oct 22, 2004
1,953
0
0
As long as you ignore the fact that they have the fastest gaming video card available. Theirs are the only cards that support GPU physics in games. Their developer relations with game creators allow for a better gaming experience on their cards. Also, most people game on their cards over any other brand.

But to ignore all that would require a total disconnect from reality.

Indeed, it would.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
The 295 does much better at lower resolutions with the eye candy turned down a bit - exactly the situations you don't buy a high-end video card for. The 5870 comes out on top by a hair in a few titles at the extreme high end; the 295 does better in a few more. We're talking less than a 5% difference either way, though.

But the price for longer bars in benchmarks is microstutter, high performance only in titles with a correct SLI profile, mouse lag, multi-monitor and windowed gaming issues, power usage and heat, and so on.

It's fair to say that, on average, the 295 is the fastest multi-GPU benchmarking card for the freshest game titles that lack DX11 support.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Only in your alternate reality, I suppose.

http://www.techreport.com/articles.x/17618/9

Far Cry 2, 2560x1600, 4xAA/16xAF: GTX 295 53 fps, HD 5870 47 fps
Wolfenstein, 2560x1600, 4xAA/8xAF: GTX 295 93 fps, HD 5870 69 fps
Left 4 Dead, 2560x1600, 4xAA/16xAF: GTX 295 141 fps, HD 5870 122 fps
HAWX, 2560x1600, 4xAA: GTX 295 57 fps, HD 5870 56 fps
Sacred 2, 2560x1600, 4xAA: GTX 295 38 fps, HD 5870 43 fps
Crysis Warhead, 1920x1200, 4xAA: GTX 295 30 fps, HD 5870 27 fps


FACT: not only was it faster in every game but one, in some cases it was a lot faster.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Yeah guys, trying to deny that the GTX 295 is the faster card is just stupid.
 

PingviN

Golden Member
Nov 3, 2009
1,848
13
81
As long as you ignore the fact that they have the fastest gaming video card available.

Yes? Get two HD 5870s and you have the fastest setup. Get two HD 5850s and you have the second-fastest setup. Sure, the GTX 295 is fast, but it's in no way competitive with the HD 5870 on price/performance, performance per watt, or DirectX support (let's not forget DirectX 11 is a rather important sales argument).

Theirs are the only cards that support GPU physics in games.

Which is used by two (?) big titles: Mirror's Edge and Batman: AA. I'm not that impressed. With OpenCL physics (Bullet, among others) in the making, PhysX's days are numbered unless it's ported to OpenCL.

Their developer relations with game creators allow for a better gaming experience on their cards. Also, most people game on their cards over any other brand.

G80 kicked butt. Nvidia has a reputation for quality and performance going way back, so of course they have a devoted userbase, not to mention a great many OEMs on their side. However, with AMD acquiring ATi, OEMs now have an even more attractive platform option. And given that AMD was first out with DX11, you can be sure Dell, HP, Acer and Compaq will be looking at these chips first and maybe (at best) give the good ol' G80/G92 a glance.

But to ignore all that would require a total disconnect from reality.

Old merits count for nothing. G92 was, at best, a decent bunch of GPUs, GT200 was a fiasco, PhysX is going nowhere, and CUDA... well, CUDA will have to compete with Microsoft's DirectCompute.

Add to this the fact that Nvidia is having problems getting Fermi into production, that AMD currently owns the market in all segments, and the (for AMD) convenient situation where they can offer a competitive platform for both laptops and desktops. Nvidia's lack of an x86 license forces them to focus more on GPGPU.