GTX 260 vs HD 4870

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
I'm looking to upgrade the card in my PC so that I can pass it on to my Mac Pro, but I'm torn between the GTX 260 and the HD 4870. On one hand the 4870 seems to have EXCELLENT AA performance and overall better performance across the board; on the other, the GTX 260 seems dominant in many UE3 games, and UE3 is a VERY popular engine.

However, performance isn't all there is to it. Price-wise (ignoring rebates) the GTX 260 is about $20 less; it's an insignificant difference, but whatever. There's also power consumption: the GTX 260 draws less power than the HD 4870 and, as I understand it, also runs cooler.

Another important point for me is silence and cooling efficiency. There seem to be no cards from either camp with anything other than the reference cooler, so which of the two is the quieter one?

The final point of contention is physics. NV ALREADY has PhysX working on their cards, while ATI has a partnership with Intel for Havok but nothing has materialized yet. I'm not sure how the whole thing is going to pan out, but PhysX is growing rapidly, with engines like UE3 actively supporting it; not to mention PhysX can already be hardware accelerated, while Havok still runs on CPUs.

So with that in mind, can anyone recommend a card for me? I could wait a few weeks for GT200b if it would be worth it (a die shrink would definitely help with thermals), but I'd still like to make a purchase soon.

I'm going for a future-proof card. ATI would have already sold me if it weren't for the slightly higher power consumption, the dubious UE3 performance, and the whole physics situation.
 

unr3al

Senior member
Jun 10, 2008
214
1
81
www.link-up.co.za
You name one or two advantages for the HD4870 (the most important advantages imo), then you list every single advantage the GTX260 has (which aren't many), and you want to know which one to get? Sounds like you've already decided you really want a GTX260 but just want to make sure you won't be making a mistake.

In short, the GTX260 is a very overclockable card and could easily match a GTX280. So if you can get it for $20 less than an HD4870, get it. Where I live, though, there is a 20% price difference between the two, in AMD's favor.
 

idiotekniQues

Platinum Member
Jan 4, 2007
2,572
0
76
What's up with PhysX? I've even heard some Nvidia guys say it's basically vaporware for now.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: idiotekniQues
What's up with PhysX? I've even heard some Nvidia guys say it's basically vaporware for now.

Plus AMD has DX10.1, Nvidia doesn't.

Basically it comes down to: choose the vaporware feature you like the most.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
The price drop of the GTX260 has made it very attractive. AT's and TH's reviews of the HD4870 show it using about 40 W more (on average) at idle and only 15 W more at load than the GTX260. I don't know how you are going to use your machine, so I'm just going to make a few assumptions: the computer is on half the day, it sits idle or is only used for desktop work for about 8 hours, and you game for 4 hours. So, (40 W)*8 hrs + (15 W)*4 hrs = 380 W-hrs, or 0.38 kWh per day. Over the course of a year, using the 2006 national average of approximately $0.10/kWh (it's probably higher in your area now), you would see a savings of $13.87 per year [0.38 kWh/day * 365 days/year * $0.10/kWh] from using the GTX260 over the HD4870. Obviously, if you leave your PC idling all day or electricity costs $0.20/kWh in your area, you can pretty much double the savings. Or if you don't use your PC as often as I assumed, the extra cost of the HD4870's power usage is mitigated.
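
If you want to plug in your own numbers, here's a rough Python sketch of that same math. The 40 W and 15 W deltas, the 8-hour idle / 4-hour gaming split, and the $0.10/kWh rate are just the assumptions from the paragraph above; swap in whatever matches your own usage and your utility's rates.

def yearly_extra_cost(idle_delta_w=40, load_delta_w=15,
                      idle_hours=8, load_hours=4, price_per_kwh=0.10):
    # Extra energy per day in kWh, scaled to a year and priced.
    daily_kwh = (idle_delta_w * idle_hours + load_delta_w * load_hours) / 1000.0
    return daily_kwh * 365 * price_per_kwh

print(round(yearly_extra_cost(), 2))  # ~13.87 with the defaults above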

But don't forget, there are many instances and games where the HD4870 is faster than the GTX260 (I think it's pretty much faster than the 260 across most games), not to mention it can outpace the GTX280 in some games (like BioShock, which even uses Unreal Engine 3).

What are your current computer habits? If you let me know how you currently use your gaming computer (which will tell me how you'll use this new rig) and where you live (to get a more accurate impression of the cost of electricity), I could give you a more exact estimate of the additional power savings of the GTX260.
 

ajaidevsingh

Senior member
Mar 7, 2008
563
0
0
It's not tough at all, get the 4870. I had a dilemma as to whether to keep a GTX 280 or a 4850 CF setup, and I opted for the 4850s; much better in several games!!!

Also, I can't tell you how, but somehow I could swear that the gaming graphics on the AMD card seem better than on the Nvidia!!! "Must be my partiality or something, though."
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
I just tried a 4870 and it was amazing. I didn't think it was possible to play current games with 8x AA and no performance hit whatsoever (at least to the eye). This card just loves AA.

Power consumption is not that much higher for the ATI card, but the heat levels are pretty different, since quite a few people are hitting 100 °C on the stock cooler, and that doesn't seem healthy at all.
In my country the GTX 260 is much more expensive than the 4870, so I would take the ATI card, install an S1 on it, and it would still be cheaper than the GTX. That's what I like most about it: the possibility of adding an aftermarket cooler. The GTX is pretty much stuck with its stock cooler for the moment.

Like others have said, both PhysX and DirectX 10.1 support are only on paper for the time being. Who knows which will turn out to be more important in the future, but this can't be a criterion for choosing a card now.
I don't know where you are seeing poor performance in UE3; I only see a bit of an advantage for the GTX 260, nothing extraordinary. In Mass Effect, for example, both cards get the same fps.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Why do people compare the price of electricity with video cards? It's just stupid. Oh wow, you save a couple of bucks a month on a video card. Color me not impressed when you drop $200-300 on a card. I run Folding@home on mine 24/7, and my highest electric bill this month was $90. Oh, terrible, I'm going to lose my house! /sarcasm
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: imaheadcase
Why do people compare the price of electricity with video cards? It's just stupid. Oh wow, you save a couple of bucks a month on a video card. Color me not impressed when you drop $200-300 on a card. I run Folding@home on mine 24/7, and my highest electric bill this month was $90. Oh, terrible, I'm going to lose my house! /sarcasm

Uh, a couple bucks per month over the course of several months adds up to money saved. Some people don't care about money, and that's fine, but I don't see why you are GETTING SO DAMNED UPSET when real, factual evidence and numbers are used to show that in the long run, something is more expensive.

Here is a fact: The HD4870 uses, on average, 40W more power than the GTX 260 at idle.
Here is a situation: I spend at least 4 hours a day just using my computer, browsing the internet (instead of watching TV). That means the HD4870 is using 0.160 kWh more energy per day. Living in a high-cost energy area, electricity (as of 2006 data) was $0.18/kWh. That's only an extra $0.86 on my electric bill per month. Over the course of a year, I would be paying an extra $10 by using an HD4870. Over two years' time - the average time I keep a video card - that's an extra $20 I am spending (assuming my usage and electric costs per kWh are constant) by using the HD4870 over the GTX260.

You may think this is minor, but every little bit adds up (or subtracts). It is not a bad thing, or a thing to get upset over, to lower your energy footprint if the option is economical. In this case, comparing the GTX260 vs the HD4870, the GTX260 has a minor win. It's not earth shattering, but it is a win nonetheless, and it is a viable argument because there is a tangible benefit under completely normal operating conditions. edit: It is certainly as viable an argument as DX10.1, CUDA support, etc. Of course, a user's wants and needs vary, so more than one aspect should be compared. These video cards are similar in price, and thus comparisons will be made. If the GTX 260 still cost over $400, it wouldn't even be part of the equation.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Tough call tbh. If the GTX260 was more expensive it would be easy; if it was equal in price it would probably still be in the HD4870's favor; but if the GTX260 is cheaper before rebates, it's a damn hard call.

Right now I see an MSI GTX260 for $245 at Newegg, and an HD4870, also from MSI, for $260, overclocked by a whole 30 MHz. So $15 less for the GTX260. Now the benchmarks:

Crysis, 4870 > gtx260

CoD4, 4870 > gtx260 equal at 2560*1600 though !!!

ET:QW, 4870 > gtx260 equal at 2560*1600 though !!!

AC, 4870 > gtx260

Witcher 4870 = gtx260

BS 4870 >> gtx260 LOTS faster

Oblivion 4870 <= GTX260 Only at 2560*1600 does the gtx260 become noticeably faster

AoC 4870 > GTX260

GRID 4870 > GTX260

So the HD4870 is faster in the games benched by AT. If I were you I wouldn't care about all the extra stuff; for now, video cards are made to play games, not do physics or other stuff. By the time GPUs really run physics you could get an Nvidia card instead, but right now it ain't worth it. It's up to you to dig deeper and check out the games you play that I didn't mention.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: idiotekniQues
whats up with physx, ive even heard some nvidia guys say its basically vaporware for now

PhysX is not vaporware.

I've been playing UT3 and GRAW2 with PhysX, and Warmonger has PhysX.

There are several other games coming this year, and I believe an estimated 50 titles by the end of next year.

In the GTX260 vs 4870, I'd say this is the deciding factor at this point as the performance of the cards is so similar.* The difference in PhysX immersion and non PhysX immersion is so great no one would ever choose to play non PhysX.

There will be reviews for this showing up today, so people can see what the press's opinion is as well.


*While the 4870 can do 8X AA in some games that the GTX260 can't, I consider the difference between 4X and 8X AA and the difference between PhysX and non-PhysX to be of totally different magnitudes of desirability. 8X AA is nice, but PhysX changes the whole game.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Like I said, if you are worried about that small amount of electricity savings when you're putting $200-300 into a video card, you can't afford the card to begin with.

You might as well compare how much more you lose every week from price drops on that same card you just bought. Which is, well, pointless.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: error8
I just tried a 4870 and it was amazing. I didn't think it was possible to play current games with 8x AA and no performance hit whatsoever (at least to the eye). This card just loves AA.

Power consumption is not that much higher for the ATI card, but the heat levels are pretty different, since quite a few people are hitting 100 °C on the stock cooler, and that doesn't seem healthy at all.
In my country the GTX 260 is much more expensive than the 4870, so I would take the ATI card, install an S1 on it, and it would still be cheaper than the GTX. That's what I like most about it: the possibility of adding an aftermarket cooler. The GTX is pretty much stuck with its stock cooler for the moment.

Like others have said, both PhysX and DirectX 10.1 support are only on paper for the time being. Who knows which will turn out to be more important in the future, but this can't be a criterion for choosing a card now.
I don't know where you are seeing poor performance in UE3; I only see a bit of an advantage for the GTX 260, nothing extraordinary. In Mass Effect, for example, both cards get the same fps.

The drivers, game demos, and patches for PhysX will be on NVIDIA's site August 12th.

People will be able to play PhysX games in six days; I have been for weeks.

 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I have 4 PhysX titles here that I've been playing with for about 2 weeks now. As Rollo stated above, August 12th is the date for PhysX pack and driver availability.
There will be a new PhysX pack available with a concurrent driver launch that supports PhysX across the entire 8, 9, and GT200 series.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: keysplayr2003
I have 4 PhysX titles here that I've been playing with for about 2 weeks now. As Rollo stated above, August 12th is the date.
There will be a new PhysX pack available with a concurrent driver launch that supports PhysX across the entire 8, 9, and GT200 series.

Pack contents:

Full version of Warmonger

Full version of Unreal Tournament 3 PhysX Mod Pack (you will need Unreal Tournament 3 to play)

PhysX patch for Ghost Recon Advanced Warfighter

Sneak peek at Nurien's upcoming social networking service, based on Unreal Engine 3 (with built-in benchmark)

Sneak peek at the upcoming game Metal Knight Zero (with built-in benchmark)

All new NVIDIA "The Great Kulu" tech demo

All new NVIDIA "Fluid" tech demo


 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: imaheadcase
Like I said, if you are worried about that small amount of electricity savings when you're putting $200-300 into a video card, you can't afford the card to begin with.

You might as well compare how much more you lose every week from price drops on that same card you just bought. Which is, well, pointless.

Money isn't always the issue, especially when the cards are already similar in price. Currently, the GTX260 is not only cheaper up front (by about $20), but also costs less to run over the next 1-2 years (another $10-$20 for my scenario; his will be different).

Did you miss my other points? Lowering your (the average American's) energy footprint, CO2 output, etc. can be especially important to some people. But hey, maybe some people love other features; CUDA looks great. Maybe it's all about performance? In that case the 4870 is a winner.

I don't see why we should compare depreciation costs either. Both of the cards are going to depreciate roughly the same. Not to mention, if you're the average gamer and are dropping $250 on a video card then you are more likely to keep the card a bit longer than hardcore enthusiasts, and maybe keep it indefinitely. I try to make the most out of my old parts. I don't sell them. I keep them around and make spare machines. If someone I knew ever needed something basic, I would just give it away.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I hate to have to say this, but it's necessary. Before anyone shouts "marketing", I'd just like to emphasize that this is just the sharing of information that we as focus group members have access to. Somebody in this thread said PhysX is vaporware, but this information clearly shows that it is not.
So please, make an effort to keep it on topic and keep it civil. It's fairly easy.

Thanks.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: keysplayr2003
I hate to have to say this, but it's necessary. Before anyone shouts "marketing", I'd just like to emphasize that this is just the sharing of information that we as focus group members have access to. Somebody in this thread said PhysX is vaporware, but this information clearly shows that it is not.
So please, make an effort to keep it on topic and keep it civil. It's fairly easy.

Thanks.

The entire point of focus groups is marketing ;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
A couple of benchmarks for you guys.

Using a mainstream system (by no means the top of the line)
AMD Athlon X2 4600+ stock at 2.4 GHz
2GB DDR2
MSI K9N4 Ultra mobo
GeForce 9800GTX+ (single) at stock clocks.
Forceware 179.79 beta
Windows XP32 SP2
PhysX driver pack (available Aug 12th)

Warmonger at 1280x1024 (higher resolutions will follow shortly)
Using Fraps.

Using software PhysX calculations (on CPU)
Min: 4 Avg: 8.13 Max: 21

Using hardware PhysX calculations (on GPU)
Min: 28 Avg: 39.70 Max: 63

More to follow. I'll start a new thread on this as it deserves one of its own.

Keys
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: dug777
Originally posted by: keysplayr2003
I hate to have to say this, but it's necessary. Before anyone shouts "marketing", I'd just like to emphasize that this is just the sharing of information that we as focus group members have access to. Somebody in this thread said PhysX is vaporware, but this information clearly shows that it is not.
So please, make an effort to keep it on topic and keep it civil. It's fairly easy.

Thanks.

The entire point of focus groups is marketing ;)

Dug, you're a great guy, but don't even joke about this.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: keysplayr2003
Originally posted by: dug777
Originally posted by: keysplayr2003
I hate to have to say this, but it's necessary. Before anyone shouts "marketing", I'd just like to emphasize that this is just the sharing of information that we as focus group members have access to. Somebody in this thread said PhysX is vaporware, but this information clearly shows that it is not.
So please, make an effort to keep it on topic and keep it civil. It's fairly easy.

Thanks.

The entire point of focus groups is marketing ;)

Dug, you're a great guy, but don't even joke about this.

I joke not.

I'm not trying to irritate/aggravate/insult you, but the point of focus groups is, and always has been, marketing, whether it be consumer products, government policies, or politicians and political parties.

Companies do it to more effectively sell products, nothing more, nothing less.

They don't do it out of the goodness of their hearts, and they never will ;)

In this case I don't see what you or nRollo are doing as in any way detrimental to the forum, but Nvidia has signed you up to indirectly assist their sales by testing new gear, reporting on customer feedback about products, reporting driver problems, and perhaps correcting FUD about their products (and the useful advice you and nRollo have provided in this thread, for example, will assist in doing just that).

Google "focus groups", "focus groups and marketing", or just "marketing tools" if you don't trust/believe me :(

I'll note again that I'm not trying to cause trouble, I honestly don't think anything I've said is in any way insulting or incorrect, and I for one am grateful for more correct info :)
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Nvidia focus group members sure seem to ride high horses, very high horses. If you ask me, it's the way you guys post; it's full of arrogance, stating things as fact. It might be partially due to nRollo, like him saying "In the GTX260 vs 4870, I'd say this is the deciding factor at this point as the performance of the cards is so similar. The difference in PhysX immersion and non PhysX immersion is so great no one would ever choose to play non PhysX." The post above his shows the 4870 clearly being the winner in many benchmarks; performance isn't similar, the HD4870 is faster than the GTX260 in most cases.

And yes, you are in a focus group purely for marketing reasons. Average Joe gets access to some information and equipment only in-house testers used to get, so people can see not everything is being made up. People trust average Joe over PR statements. It has marketing written all over it, so don't blame some of us for being sceptical. The benchmarks you posted show how the 9800GTX+ gets completely destroyed at 1280x1024 with edge smoothing and 8xAF on the UT3 engine, which can produce some nice images but is already getting old. I'm talking minimum FPS of course, which I bet happens when you blow something up and the GPU can no longer handle both the 3D acceleration and the physics that need all of its shader power. This is at 1280x1024; I'm curious whether the GPU will still do any physics calculations when you turn the resolution up to 1680x1050, let alone 1920x1200. And when people have to choose between using PhysX and having no AA, things aren't as CLEAR or defining as nRollo makes them out to be ...

And since Keys only posted half the benchmark (the half that makes PhysX on the GPU look good), I'll post the other half:

Rig1: AthlonX2 4600+ 2GB DDR2 9800GTX+

GRAW2
1280x1024 (higher res will follow shortly) edge smoothing AA and 8xAF

Using software PhysX calculations (on CPU)
Min: 2 Avg: 8.37 Max: 25

Using hardware PhysX calculations (on GPU)
Min: 6 Avg: 30.05 Max: 81
-------------------------------------------------------
 

WT

Diamond Member
Sep 21, 2000
4,816
60
91
Bah, I don't mind the comments from Keys or nRollo, as both have enough technical knowledge to get some points across and engage in a good discussion, but some guys really hate the fact that they even post after they were accepted into the focus group. I'm stuck choosing between the two cards as well, and I fully expected to be buying the 4870 right up until the price drop and the news on PhysX, but I am now reading each and every one of these threads to further edumacate (sic) myself on which is the better choice for me.
PS - I just started playing UT3, and after having been thoroughly disgusted with the demo, the game is very good. It reminds me of the QW:ET demo fiasco, and in the end the full game is much better than the demo. The UT3 numbers are interesting to say the least. Thanks for any worthwhile input anyone can add to this discussion.
 

cm123

Senior member
Jul 3, 2003
489
2
76
Originally posted by: nRollo
Originally posted by: idiotekniQues
What's up with PhysX? I've even heard some Nvidia guys say it's basically vaporware for now.

PhysX is not vaporware.

I've been playing UT3 and GRAW2 with PhysX, and Warmonger has PhysX.

There are several other games coming this year, and I believe an estimated 50 titles by the end of next year.

In the GTX260 vs 4870, I'd say this is the deciding factor at this point as the performance of the cards is so similar.* The difference in PhysX immersion and non PhysX immersion is so great no one would ever choose to play non PhysX.

There will be reviews for this showing up today, so people can see what the press's opinion is as well.


*While the 4870 can do 8X AA in some games that the GTX260 can't, I consider the difference between 4X and 8X AA and the difference between PhysX and non-PhysX to be of totally different magnitudes of desirability. 8X AA is nice, but PhysX changes the whole game.


Wow, being the AMD fan that I am, I normally don't agree with you, Rollo, but you have some points here, more so about the entire GTX series vs. the 4800 series. Personally, I find a slightly better picture from the 4870 than from the GTX 280 (it might really be more driver-related in the end, as time goes on); however, the GTX has more features you can use today, like PhysX, I guess. One thing that still hits me hard is Nvidia's drivers on Vista, way too many crashes...

I too have actually been able to play PhysX games already with my 280 (1680x1050 with no problems). So would I take PhysX over DX10.1 or 8xAA? Sure, of course; after seeing it, who wouldn't...

Personally I would like to see PhysX added to CoD4; that would get really interesting... Also, improve the RMA (failure) rate of the GTX series. DX10.1 is a shame: even though it's vaporware for now, it's not supported, yet I know much of it is already covered by GTX hardware, and I would like to hear more about that, what is and what isn't. As for the graphics quality mentioned above, it would be interesting to discuss that away from the public eye. A few things here to take back to Nvidia (I'm sure you have already).

Rollo, it would be crazy for Nvidia to leave the chipset market too.


 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
Maybe using one card for both graphics and physics could overload it (though I don't think it would add much load, considering the AGEIA PPU was very weak), but as I understand it NV will let you use your main card for graphics and a weaker card (something like a 9500GT) for physics; of course, I think that would only work if both cards are Nvidia.

Now, about power consumption and thermals: I'm not as worried about bills as I am about stability, overclocking headroom, and silence/temperatures.

I know both cards are in the same performance ballpark (with the HD 4870 being better and in some cases rivaling the GTX 280), but I'm worried that might change in the future. The GTX 260 has a lot more memory, and many games use MORE than 512MB; right now it's a minor difference so it's not noticeable in the framerates, but look what happened to the 8800GT 256MB.

I still think physics is the main point of contention. I love physics; I think it's the next revolution in gaming, and NVIDIA already has something running. Even though AMD + Intel is a powerhouse, I just don't see developers adopting Havok when PhysX gives you hardware acceleration TODAY on hardware most of the gaming market already has (then again, they wouldn't want to alienate ATI users either), and no matter how powerful or how many cores Intel x86 CPUs get, a GPU is much better at doing physics.

Then again, the HD 4870 has much better performance now, but will 512MB still cut it for the next few years?

The fact that it's a fairly difficult choice shows that NV is not doing as badly as some people think; the GTX 260 at $400 (or whatever it was) was a horrible buy, but at this price it's very attractive.