GTX 480 arrived [EVGA forums]


MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
And you're just going to blindly believe what NVIDIA says without question, even though it's a game with their name on it, one they helped develop, and it has an IQ bug at the highest IQ settings, which will undoubtedly be used to test new cards. Yeah, fanboy indeed :rolleyes:.

The current situation is that NVIDIA's DX11 IQ in the game is crap. When it's fixed and reevaluated, conclusions can then be drawn.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
And you're just going to blindly believe what NVIDIA says without question, even though it's a game with their name on it, one they helped develop, and it has an IQ bug at the highest IQ settings, which will undoubtedly be used to test new cards. Yeah, fanboy indeed :rolleyes:.

The current situation is that NVIDIA's DX11 IQ in the game is crap. When it's fixed and reevaluated, conclusions can then be drawn.

Let me quote from the link:
Xbit’s piece included a response from developer 4A Games’ Oles Shishkovtsov regarding the issue:
No, the observed difference in quality is not due to the performance of the graphics cards. Indeed, graphics cards from Nvidia and ATI render the scene differently in some antialiasing modes. We are trying to find a way to correct this issue. Again, this has nothing to do with speed.
And from the [H] article:

What started out looking like it could have been a tasty issue turns out to be a non-starter. No Quake / Quack issue here. It is not a hardware problem, it is not a driver problem, and it is definitely not NVIDIA trying to cheat and boost framerates by diminishing image quality. It is an application bug and the developer is aware of it and working on it. A resolution should land whenever the game’s next patch lands, which NVIDIA estimates at "about 2 weeks or so."


Now go ahead and FUD along all you want :)
I have owned an equal number of NV video cards as ATI ones. You're too funny, calling me a fanboy. In fact, my last 2 video cards have been from NV.

Doesn't matter, your stance is fail...only ATI fannies seem to ignore the developer's word, so I figured if it looks like a duck and quacks (pun intended) like a duck...it's a duck.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Don't feed the Lonbjerg... I meant troll.

Thing is, Lonbjerg has good info about this; presenting it with the "fanboy fail" tag just redirects all the good info into a flame fest.

Request to Lonbjerg: If you have good info to contradict FUD, that's great, just don't fling the fanboy (or similar) tags around. Everyone will focus on that single word instead of the info. So, I'll post it here again for everyone to absorb.

From the H article:
"The Bottom Line

What started out looking like it could have been a tasty issue turns out to be a non-starter. No Quake / Quack issue here. It is not a hardware problem, it is not a driver problem, and it is definitely not NVIDIA trying to cheat and boost framerates by diminishing image quality. It is an application bug and the developer is aware of it and working on it. A resolution should land whenever the game’s next patch lands, which NVIDIA estimates at "about 2 weeks or so."

RussianSensation, does this address your earlier claim?

Thanks.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
You have to show both sides of the coin if you want to make a good point. What about the times when the GTX 480 is slower than the 5870? That's why the oft-quoted 10-15% faster is more accurate than saying sometimes it's 30% faster.

I *DID* show both sides of the coin. Notice how his post said it "is only 8-10% faster". Notice how I edited the quote to say "8 to 10% faster or up to 30% faster." I did not take away what he said; I added to it to make the statement more truthful.

How is that not showing both sides of the coin? In fact, HE did not show both sides of the coin, because he only stated the low end of typical performance improvements.

This is the perfect example of an nvidiot: one who chooses what he wants to see...

Reeeallly. That isn't what you did at all. Okie dokie.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Doesn't matter, your stance is fail...only ATI fannies seem to ignore the developer's word

My stance is fail? That NV has inferior image quality on Very High in Metro 2033? We have also seen claims from developers that SM3.0 was going to revolutionize gaming on the 6800 series....and that the X800 XT series was subpar since it was limited to SM2.0...or that their games are multi-core optimized when they are not. Just because a developer said something doesn't mean it is 100% true. I don't see a correlation between "ATI fannies" and "ignoring the developers' words".
 
Last edited:

BababooeyHTJ

Senior member
Nov 25, 2009
283
0
0
He has an itch for new hardware as an early adopter. Of course, in September 2009, a $379.99 5870 was a horrible deal. But 7 months later, a $499.99, 15% faster NV card is a great deal. Also, the 5970 was available for $590 at NCIXUS.com for at least 3 weeks, and we know how badly the GTX 480 loses to the 5970. There is absolutely nothing wrong with being an NV fanboy as long as you are upfront about it. It is also justifiable for an American to support an American company. However, for some reason, a lot of people just don't want to admit their brand loyalty. I don't know why, because there is nothing wrong with being a "Ford" or a "Starbucks" or an "Amazon.com" fan, etc. For example, we know aigo is an Intel fan, but he is not a d*ck about it like Wreckage is about defending NV in every post.

The guy in that thread with the GTX 480 EVGA card stated (while running 100% fan speed): "Noise level isn't too bad... no worse than one of My GTX 280's at 100% fan speed..." <--- I guess he lost his hearing going to Iron Maiden concerts in his early 20s. lol

Yeah, the 5870 was really easy to find at that price. A ton of stock at the time. The 5970 is even easier to find, especially below "MSRP". :rolleyes:

I wouldn't touch CrossFire with a ten-foot pole. It doesn't always scale, based on my experience with a 4870X2, and it sees negative scaling often enough while showing larger framerate swings than a single card. I'm not a fan of multi-GPU setups and can't stand that argument.

I don't see the GTX 470 or 480 at a bad price point at this point in time. We'll see how these cards perform in the long run once the guinea pigs get their hands on them.

I normally stay out of these Fermi threads and I'm not sure why I took the bait.

I am interested to see what the end users think of these cards without people thread-crapping.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Wow, did you just fail big time in getting Elfears' point?
Try reading it again :D

No, I did not. Elfears failed in getting the point I was making. The original person was lowballing the performance of the GTX 480, saying it only gets 8-10% more in games. Sure, there are some games where it only gets an 8-10% improvement in frame rate, but the fact is that is still lowballing. It gets 30% as often as it gets 8%. So, in my post to the original person, I augmented his comment to say it gets anywhere from 8-30%.

Elfears then jumped in and said I need to show both sides of the coin. But I DID show both sides of the coin. The original poster DID NOT show both sides of the coin by only citing the lowball GTX 480 performance numbers.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
My stance is fail? That NV has inferior image quality on Very High in Metro 2033? We have also seen claims from developers that SM3.0 was going to revolutionize gaming on the 6800 series....and that the X800 XT series was subpar since it was limited to SM2.0...or that their games are multi-core optimized when they are not. Just because a developer said something doesn't mean it is 100% true. I don't see a correlation between "ATI fannies" and "ignoring the developers' words".
Yep, evidently some people lack simple logic. Like I said, when a "fix" is released and the gameplay and performance are reevaluated, we can then draw conclusions. Until then they're just blowing smoke up your ass, although some people are liking it :rolleyes:.
No, I did not. Elfears failed in getting the point I was making. The original person was lowballing the performance of the GTX 480, saying it only gets 8-10% more in games. Sure, there are some games where it only gets an 8-10% improvement in frame rate, but the fact is that is still lowballing. It gets 30% as often as it gets 8%. So, in my post to the original person, I augmented his comment to say it gets anywhere from 8-30%.

Elfears then jumped in and said I need to show both sides of the coin. But I DID show both sides of the coin. The original poster DID NOT show both sides of the coin by only citing the lowball GTX 480 performance numbers.
But you're wrong, and that's what you're missing. It does not get "30% as often as it gets 8%." Countless reviews have shown it to be 10-15% faster; sometimes it can be 30% faster, sometimes it can be 10% slower, and in the end it averages out to 10-15% faster across many games, resolutions, and IQ settings. You evidently have a fundamental misunderstanding of how averages work.
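To make the averaging point concrete, here is a minimal sketch using entirely made-up per-game deltas (the real figures come from the reviews being discussed, not from this example):

```python
# Hypothetical per-game performance deltas of a GTX 480 vs. an HD 5870, in percent
# (positive = 480 faster). These numbers are illustrative only, not review data.
deltas = [30, 25, 15, 12, 10, 8, 5, -5, -10]

average = sum(deltas) / len(deltas)
print(f"Average delta across {len(deltas)} games: {average:.1f}%")
# A few 30% wins plus a couple of losses still average out to roughly 10%.
```

A handful of 30% outliers barely moves the average once the games at or below parity are counted.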
 
Last edited:

302efi

Golden Member
Jul 16, 2004
1,539
1
81
That guy gets a lot of internet coolness for taking pics of opening a box with a video card in it.

There's so much internet coolness factor in that, it makes me wanna go spend $500+ and make a thread so I get a lot of thumbs up too :D
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Yep, evidently some people lack simple logic. Like I said, when a "fix" is released and the gameplay and performance are reevaluated, we can then draw conclusions. Until then they're just blowing smoke up your ass, although some people are liking it :rolleyes:.
But you're wrong, and that's what you're missing. It does not get "30% as often as it gets 8%." Countless reviews have shown it to be 10-15% faster; sometimes it can be 30% faster, sometimes it can be 10% slower, and in the end it averages out to 10-15% faster across many games, resolutions, and IQ settings. You evidently have a fundamental misunderstanding of how averages work.


Please link three reviews comparing the same game where the GTX 480 is 10% slower than the 5870 at playable frame rates. I ask for three because one random benchmark of a specific game can be easily refuted, but three that show the same/similar results mean it's being reproduced by several different reviews. I know playable frame rates are subjective, but my opinion is that if the average benchmark fps is in the 20s and/or teens, that isn't playable.
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I'd ask about your intended GTX 470 purchase (all the performance of the 5850 for more money, and the heat/power use of a 5870), but I don't want to get your panties in a bunch.

It's definitely faster than a 5850, especially in minimums, which are very important IMO. Whether that's worth the price is a different story though...
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Please link three reviews comparing the same game where the GTX 480 is 10% slower than the 5870 at playable frame rates. I ask for three because one random benchmark of a specific game can be easily refuted, but three that show the same/similar results mean it's being reproduced by several different reviews. I know playable frame rates are subjective, but my opinion is that if the average benchmark fps is in the 20s and/or teens, that isn't playable.
From mAJORD on XS:
[Image: 480vs5870final.png]

Like I said, 10-15% across most games. The results change a little bit at other resolutions, but this is a good middle ground.
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
To the folks who keep saying the 480 is only $5-$10/mo more than the 5870 on the electric bill:

5870 = ~$400
480 = ~$500

$5/mo = $60/year. Many folks buy a video card for 2 years of use, so the 480 will cost $120 more in electricity than the 5870 over its useful lifetime.

THEREFORE, the actual cost comparison should be:

5870 = ~$400
480 = ~$620

No, $5/mo is not gonna break someone's bank. But are you folks seriously trying to argue that a 55% price premium is worth the 10-15% (on average) performance boost, along with the extra heat and noise? While I understand that the top dog always carries a premium, I would expect at least a 30%+ improvement across the board for that kind of markup.

Even if the cost seems insignificant to you, you still need to factor it in for a cost analysis. Just like factoring the iPhone's monthly bill over its 2-year contract into the cost of owning the phone.
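For anyone who wants to plug in their own prices, here is a minimal sketch of the comparison above; the $5/month figure and the 24-month lifetime are the assumptions from this post, not measured values:

```python
# Rough cost-of-ownership comparison using the assumptions stated above.
PRICE_5870 = 400            # assumed street price of the HD 5870, in dollars
PRICE_480 = 500             # assumed street price of the GTX 480, in dollars
EXTRA_POWER_COST_MONTH = 5  # assumed extra electricity cost of the 480, dollars/month
LIFETIME_MONTHS = 24        # assumed useful lifetime of the card

total_5870 = PRICE_5870
total_480 = PRICE_480 + EXTRA_POWER_COST_MONTH * LIFETIME_MONTHS
premium = (total_480 - total_5870) / total_5870 * 100

print(f"5870: ${total_5870}, 480: ${total_480}, premium: {premium:.0f}%")
# With these assumptions: 5870: $400, 480: $620, premium: 55%
```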
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
To the folks who keep saying the 480 is only $5-$10/mo more than the 5870 on the electric bill:

5870 = ~$400
480 = ~$500

$5/mo = $60/year. Many folks buy a video card for 2 years of use, so the 480 will cost $120 more in electricity than the 5870 over its useful lifetime.

THEREFORE, the actual cost comparison should be:

5870 = ~$400
480 = ~$620

No, $5/mo is not gonna break someone's bank. But are you folks seriously trying to argue that a 55% price premium is worth the 10-15% (on average) performance boost, along with the extra heat and noise? While I understand that the top dog always carries a premium, I would expect at least a 30%+ improvement across the board for that kind of markup.

Even if the cost seems insignificant to you, you still need to factor it in for a cost analysis. Just like factoring the iPhone's monthly bill over its 2-year contract into the cost of owning the phone.

So you are saying that if you run a 100-watt light bulb in your house for a year it costs you $60? That's about how much more electricity a GTX 480 draws than the 5870.
Where do you buy your electricity?
If you game 3 hours a day at 100 watts more, that's about 1,000 hours a year of running a 100-watt light bulb. No way is that $60.

A 100-watt light bulb at 9 cents per kWh for 90 hours a month costs about 80 cents a month, NOT $5.

9 cents per kWh at 100 watts for 1,000 hours a year = about $10.
So a GTX 480 playing Crysis 90 hours a month would cost you about $10 more a year than a 5870, with much better minimum framerates. :)

In Philadelphia it's only 8 cents per kWh.

http://michaelbluejay.com/electricity/howmuch.html
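The disagreement here is just watts × hours × rate; a minimal sketch with the numbers assumed in this post (100 W of extra draw, roughly 1,000 gaming hours a year, 9 cents/kWh) is below:

```python
def extra_yearly_cost(extra_watts: float, hours_per_year: float, cents_per_kwh: float) -> float:
    """Extra electricity cost per year, in dollars, for a given additional power draw."""
    extra_kwh = extra_watts / 1000 * hours_per_year
    return extra_kwh * cents_per_kwh / 100

# Assumptions from this post: ~100 W more than a 5870, ~3 h of gaming a day, 9 c/kWh.
print(extra_yearly_cost(100, 1000, 9))      # ~$9 a year at 9 c/kWh
print(extra_yearly_cost(100, 1000, 10.93))  # ~$11 a year at the U.S. average rate quoted later
```

Either way it lands near $10 a year, not $60.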
 
Last edited:

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
To the folks who keep saying the 480 is only $5-$10/mo more than the 5870 on the electric bill:

5870 = ~$400
480 = ~$500

$5/mo = $60/year. Many folks buy a video card for 2 years of use, so the 480 will cost $120 more in electricity than the 5870 over its useful lifetime.

THEREFORE, the actual cost comparison should be:

5870 = ~$400
480 = ~$620

No, $5/mo is not gonna break someone's bank. But are you folks seriously trying to argue that a 55% price premium is worth the 10-15% (on average) performance boost, along with the extra heat and noise? While I understand that the top dog always carries a premium, I would expect at least a 30%+ improvement across the board for that kind of markup.

Even if the cost seems insignificant to you, you still need to factor it in for a cost analysis. Just like factoring the iPhone's monthly bill over its 2-year contract into the cost of owning the phone.


I HIGHLY doubt anyone who buys a GPU for their gaming rig thinks about the overall/future costs of their product choice, UNLESS it's a GeForce 8000 series card, where in the end people could pretty much count on the card tanking/breaking down around the two-year mark.

I don't mean to insult you, TemjinGold, but unless one buys one of these cards for an HTPC under their living room TV alongside their other hi-fi equipment, no one will think about the power draw or heat/noise issues.
Your classic gamer is interested in performance at an affordable price. It speaks volumes about the average consumer's buying power that we keep these things for a maximum of 3 years. Most people upgrade within 2 years (I have no sources for this, so call it "made up" and you'll be in the right).

In other words, because of our buying power, we will buy one step above what we need or should be getting. Just think about an average Joe going into a hardware store to buy his first laptop/PC. You can bet your ass that the seller will manage to score a few dollars more in sales by suggesting things the user doesn't need/want or hasn't asked for.




I would think about the overall costs if I bought a car; I would then think about km per liter and of course the resale value and value loss over time, as well as repair/upkeep costs.



Here are a few things I'm curious about:

With the current heat/noise issues of the Fermi cards, what will their standing be in the OEM space? AMD/ATI currently lead the laptop GPU market; will NVIDIA have to wait for the respin to offer their solutions?
 

thilanliyan

Lifer
Jun 21, 2005
12,062
2,275
126
A 100-watt light bulb at 9 cents per kWh for 90 hours a month costs about 80 cents a month, NOT $5.

9 cents per kWh at 100 watts for 1,000 hours a year = about $10.
So a GTX 480 playing Crysis 90 hours a month would cost you about $10 more a year than a 5870, with much better minimum framerates. :)

In Philadelphia it's only 8 cents per kWh.

http://michaelbluejay.com/electricity/howmuch.html

I'm not saying you're utterly wrong, but I seriously doubt the actual cost is 8c/kWh. There are other charges on the electricity bill that are added on top of the basic rate, if you take a look at the actual breakdown of your bill. Also, 8c/kWh seems to be on the cheap end of the scale...I doubt most people pay that little. I took a look at it in Toronto a while back, and although the basic rate was around 9-10c/kWh, what we were actually paying was around 15-16c/kWh.
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I'm not saying you're utterly wrong, but I seriously doubt the actual cost is 8c/kWh. There are other charges on the electricity bill that are added on top of the basic rate, if you take a look at the actual breakdown of your bill. Also, 8c/kWh seems to be on the cheap end of the scale...I doubt most people pay that little. I took a look at it in Toronto a while back, and although the basic rate was around 9-10c/kWh, what we were actually paying was around 15-16c/kWh.

You're right, the U.S. average is 10.93 cents per kWh. So it costs about $11 a year more. That's if you game 1,000 hours a year. :D Even I don't do that. :eek:

Scroll down to the bottom of the linked page.

http://www.eia.doe.gov/cneaf/electricity/epm/table5_6_a.html