4850 VS 4870 VS 9800GTX+

Page 4

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Wreckage
Originally posted by: apoppin


And intel and AMD *both* feel the CPU is the place to implement Physics
- for NOW
Good for them as all of us either have an Intel or AMD CPU. However NOTHING has shown up for it yet since Intel bought Havok.
PhysX is no "done deal" with gaming devs
With many of them it is, as many games support it and there are something like 60 or more on the way. As pointed out, AMD only has 18% of the graphics market, so this is yet another reason to support the dominant player in gaming.
it is not a very impressive list - except perhaps to you
Please post a list of physics supported by your card....

:laugh:

the same list as yours :p





. . . my g80 8800GTX, right?
:confused:

you can repeat yourself over and over, and we know your pro-Nvidia position won't change even if hell freezes over. Unless you have something *new* as an argument, i am dismissing yours .. certainly for now, until we actually see a compelling new title.

EDIT: To the OP

to answer your question

on your list, 4870

as an additional alternative you may not have considered, GTX260
- oc'd preferably
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: thilan29
Originally posted by: nRollo
Getting the free PhysX tips the choice the other way for me though, because it adds more realism to the games than anything else I've seen since the switch to "3D".

HL2 did physics fairly well and didn't need a PPU/GPU to do it. I'm not sure how much it's changed, but I remember when GRAW(2) came out and they were touting the PhysX capabilities, all I saw was some stuff being blown to bits and then disappearing like 2 seconds later. Where's the realism in that? The blown-up bits should stay there and hinder/help the player in the level.

I'm not sure how much it's changed from what I just described (Is it a lot different in regular games now not techdemos?) but if it's still the same I'd rather not have it as it just degrades performance anyway (except for dedicated techdemos) due to all the extra bits needing to be rendered.

PhysX supposedly does some auto-scaling based on how much processing power you have, and many effects can be manually disabled.
Check this link for a difference between GRAW physics with and without physX processing:
http://www.driverheaven.net/ar...articleid=122&pageid=5

The flag one is downright pathetic: it moves in the wind and gets holes torn in it, but it also clips through itself. Bah, unimpressive.
But the destruction capabilities, and the fact that the AI would also use it to blow a tiny hole in a fence or something and then peek and shoot THROUGH the hole while staying hidden? That is very impressive.
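That auto-scaling idea can be sketched in a few lines. This is a toy illustration only, not PhysX's actual API; every function name, preset, and threshold here is hypothetical:

```python
# A toy sketch of physics auto-scaling: pick a detail preset from the
# per-frame time budget, and let individual effects be turned off by
# hand. All names and thresholds are hypothetical, not PhysX's API.

def pick_detail_level(physics_budget_ms):
    """Choose a detail preset from the per-frame physics budget (ms)."""
    if physics_budget_ms >= 8.0:
        return "high"    # debris, cloth, and fluid effects
    if physics_budget_ms >= 3.0:
        return "medium"  # debris only
    return "low"         # gameplay-critical collisions only

def active_effects(level, disabled):
    """Effects enabled at a preset, minus any the user disabled."""
    presets = {
        "low":    ["collision"],
        "medium": ["collision", "debris"],
        "high":   ["collision", "debris", "cloth", "fluid"],
    }
    return [e for e in presets[level] if e not in disabled]

level = pick_detail_level(2.0)  # e.g. a small CPU-only budget
print(level, active_effects(level, set()))  # low ['collision']
```

The point of the sketch: the same title can run everywhere because the engine degrades effect count before it degrades framerate, and the user can still veto individual effects.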

Originally posted by: ajaidevsingh
ATi will also offer Havok sooner or later; it's not logical to get PhysX given the impact on gameplay FPS!!!

BTW most of the games that support PhysX do so in as light a quantity as possible so that the GPU can handle it..

How many PhysX PPU games were released??

Havok does not compete with PhysX. Havok is a lightweight CPU-based physics engine for games to PURCHASE and integrate.
PhysX is a FREE CPU + GPU auto-scaling physics engine that allows gameplay capabilities years ahead of Havok. Check the link above.
It is like saying the Wii is a direct competitor to the PS3/Xbox360. Those are different market segments.

Also, nVidia opened PhysX and it IS being ported to AMD cards... NGO is working on the port with help from both nVidia AND AMD engineers!

Originally posted by: apoppin
"Free" PhysX is not a deal tipper for me. Hell, we don't even know how "free" it is or if it kills FPS. We also still don't see much support for it among the devs, and we know intel and AMD will work together against Nvidia. i guess i could toss in an old Ageia PCI card if i want to check it out that badly.

PhysX could run 100% on the CPU. Depending on how MANY PhysX effects are used, that could have no impact at all, or a massive impact.
One example: the UE3 PhysX maps. Physics is used so intensively there that the FPS drops to 1/6th its normal amount with PhysX enabled.
When you enable GPU PhysX, the framerate increases by 4x.
So PhysX the API is NOT free; it is EXTREMELY intensive. But PhysX on the GPU can give you a "free" boost of 4x the FPS just by enabling it.
It is like saying that a new driver version gives you a free FPS boost. It doesn't mean that it doesn't take calculation power to do things; it just means your calculation power goes further than it did before.

Just like how AMD's new AA is not free to perform (i.e., disabling AA still increases your FPS)... but there is a free INCREASE in performance compared to before, making AA a valid option for many people (and one of the key benefits for AMD).
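The UE3 arithmetic above is easy to mix up, so here it is worked through. The 60 FPS baseline is a hypothetical figure for illustration; the 1/6th drop and 4x recovery are the ratios from the post:

```python
# Worked numbers for the UE3 PhysX-map example. The 60 FPS baseline is
# assumed for illustration; the 1/6th and 4x figures come from the post.

baseline_fps = 60.0                 # hypothetical: PhysX effects disabled
cpu_physx_fps = baseline_fps / 6    # heavy PhysX on the CPU: 1/6th the FPS
gpu_physx_fps = cpu_physx_fps * 4   # same load moved to the GPU: 4x back

print(cpu_physx_fps)  # 10.0
print(gpu_physx_fps)  # 40.0
```

Note the "free" 4x boost still leaves you below the no-physics baseline: the physics work is never free, it just costs less on the GPU.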
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Originally posted by: keysplayr2003
Mem, please don't get yourself sucked into this. Creig stated something he shouldn't have, as it has no bearing on anything we are discussing. It's something Creig has done time and time again: when a conversation goes south, attack the poster. Well, his choice.

First of all, don't try to turn this around and claim that I started this. YOU are the one who broke your own "attack the data, not the poster" rule.

Originally posted by: keysplayr2003
Two FUD statements in a single post. Why?

If you had wanted to refute my statement, you should have put up some numbers to prove it. Instead, you thought it would be amusing to insult me and intimate that I was deliberately presenting patently false information. You offered NO proof of your own, yet proclaimed me to be spreading nothing but FUD. THAT'S ATTACKING THE POSTER!

Secondly, what I posted was NOT FUD. MOST websites have shown that the 4870 is faster than the GTX260 at 1920x1200. As I showed in my proof, sometimes the 4870 and GTX280 are virtually neck and neck in benchmarks at 1920x1200 which is the resolution the OP stated he was gaming at. Heck, even MarcVenice had to jump in and tell you that YOU were spreading FUD when you claimed:

Originally posted by: keysplayr2003
The GTX260 and HD4870 are overall dead even with each other.

If you feel that something I post is inaccurate, please refrain from making insulting remarks and post some numbers/facts to back yourself up instead of simply saying that I'm making "FUD" statements.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
keys went and put up a list of exact data and showed which card wins in each test. Considering I also went and put up such a chart, I can tell you that it takes a LOT OF TIME to do.

So he probably spent an HOUR crafting and collecting data that directly refutes your claims. How is he attacking the poster instead of the post?

The first link you posted only tests 4 games, and ARS is not even a testing site; they write articles. Your second link is to Xbit Labs; they tested tons of different games in various settings. That one is a very, VERY good review since it tests so many different games and different settings.
Your claim that the GTX280 and the 4870 are neck and neck is refuted by your own link, the one to Xbit Labs. I went through them and compared the GTX260 and the 4870, and the 4870 barely wins (because it wins by a larger margin in more games; the 4870's biggest win is being 2.75x faster than the GTX260, while the GTX260's biggest win is being 2.2x faster than the 4870). That is the 260, not the 280.
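"Wins in more games" and "wins by a bigger margin" can be folded into one number. A sketch of that method (the per-game FPS figures below are made up; only the approach is the point) using a geometric mean of per-game ratios, so a single lopsided 2.75x win can't dominate the verdict:

```python
# Compare two cards across several games by the geometric mean of their
# per-game FPS ratios, rather than by counting wins. All FPS numbers
# here are invented for illustration.

from math import prod

# hypothetical per-game FPS: (card_a, card_b)
results = {
    "game1": (60, 55),
    "game2": (45, 50),
    "game3": (110, 40),   # a lopsided outlier win for card A
    "game4": (30, 33),
}

ratios = [a / b for a, b in results.values()]
geo_mean = prod(ratios) ** (1 / len(ratios))
wins_a = sum(r > 1 for r in ratios)

print(f"card A wins {wins_a}/{len(ratios)} games")
print(f"geometric mean ratio: {geo_mean:.2f}")
```

With these invented numbers the cards split the wins 2-2, yet the geometric mean still shows card A ahead overall, because its outlier win is larger than any of its losses.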
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
keys went and put up a list of exact data and showed which card wins in each test. Considering I also went and put up such a chart, I can tell you that it takes a LOT OF TIME to do.

So he probably spent an HOUR crafting and collecting data that directly refutes your claims. How is he attacking the poster instead of the post?

The first link you posted only tests 4 games, and ARS is not even a testing site; they write articles. Your second link is to Xbit Labs; they tested tons of different games in various settings. That one is a very, VERY good review since it tests so many different games and different settings.
Your claim that the GTX280 and the 4870 are neck and neck is refuted by your own link, the one to Xbit Labs. I went through them and compared the GTX260 and the 4870, and the 4870 barely wins (because it wins by a larger margin in more games; the 4870's biggest win is being 2.75x faster than the GTX260, while the GTX260's biggest win is being 2.2x faster than the 4870). That is the 260, not the 280.
Barely wins?

:confused:

did you not say you REFUTED his claim?
- he just said "wins" ... if what you say is true, you are agreeing with him


(1) Havok does not compete with physX. Havok is a (2) lightweight (3) CPU based physics engine for games
(2) Havok does not consider itself "lightweight"; (3) it WILL be ported to AMD's GPUs; AND, most importantly, (1) Havok DOES compete with PhysX, as proven in the 5 links i previously posted

:roll:
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Originally posted by: taltamir
keys went and put up a list of exact data and showed which card wins in each test. Considering I also went and put up such a chart, I can tell you that it takes a LOT OF TIME to do.

Oh, I know it takes time. I've done it myself on more than one occasion. If you noticed, I put up numbers of my own that showed the 4870 and GTX280 trading victories at 1920x1200, which is the resolution the OP said he was gaming at.

Originally posted by: taltamir
So he probably spent an HOUR crafting and collecting data that directly refutes your claims. How is he attacking the poster instead of the post?

It was his opening statement of "Two FUD statements in a single post. Why?" He didn't present numbers to refute my claim. Instead, he simply called me out as posting nothing but false information, i.e. lying. The numbers I posted at 1920x1200 backed up what I said. How is saying "The 4870's overall performance is much closer to a GTX280 than a GTX260" complete FUD when the numbers I posted showed the majority of benchmarks between the 4870 and GTX280 at 1920x1200 to be within a few frames of each other?

Originally posted by: taltamir
The first link you posted only tests 4 games, and ARS is not even a testing site; they write articles. Your second link is to Xbit Labs; they tested tons of different games in various settings. That one is a very, VERY good review since it tests so many different games and different settings.
Your claim that the GTX280 and the 4870 are neck and neck is refuted by your own link, the one to Xbit Labs. I went through them and compared the GTX260 and the 4870, and the 4870 barely wins (because it wins by a larger margin in more games; the 4870's biggest win is being 2.75x faster than the GTX260, while the GTX260's biggest win is being 2.2x faster than the 4870). That is the 260, not the 280.

Huh? I didn't link to ARS or Xbit. My link was to Overclockers Club. I think you're referring to thilan29's post.