
[Ars] Nvidia or AMD: Who makes the best budget graphics card?

Page 3 - AnandTech Forums
AMD drivers have been amazing for the past 6 months, while nVidia has had more blunders than normal. It honestly feels like the ill-informed meme people parrot has been reversed, outside of Linux.

However, if we're bringing secondary factors beyond the video card into this... the option of a FreeSync monitor should clinch the win in any "budget" argument. None of these cards can push a consistent 60 fps at decent settings in modern games. FreeSync would let people push higher fidelity, remove tearing, and enjoy the smooth variable frame rates these cards will inevitably deliver between 30 and 60 fps. It effectively lets a card punch a class above its weight.
Well AMD has been improving its Linux drivers a large amount since they released Polaris, and making them open source. So Nvidia may no longer be the go to GPU for Linux in the future.
 
Here we go again, pushing the 470 as a low-power card. It would for sure run on a good-quality 350 or 400 watt PSU with a six-pin connector. I most certainly *would not* try to run it on a low-quality OEM 300-watt PSU with some jerry-rigged Molex adapter.

Jerry-rigged Molex adapter aside, HP sells systems with 300-watt power supplies containing RX 480s, so I don't see why a 470 would be a problem.
 
Here we go again, pushing the 470 as a low-power card. It would for sure run on a good-quality 350 or 400 watt PSU with a six-pin connector. I most certainly *would not* try to run it on a low-quality OEM 300-watt PSU with some jerry-rigged Molex adapter.
happy medium's indefensibly bone-headed comment, which itself was a tangent from my reference to a SilverStone 300-watt unit, is the jumping-off point here, not some concerted effort to get the 470 into boxes with low-quality OEM units.


I really don't think the 480 is that special. It's the 470 that's the best value. As always, the cut-down chips from AMD provide way more value for their cost compared to the full chip. It's a massive pricing discrepancy that you should take advantage of. I'm interested in seeing how this plays out with Vega and whether the cut-down chip unlocks or not.
Currently that "massive" price discrepancy is about $25. A better deal? Probably. But it's not like the difference between, say, the 290 and the 290X.
 
AMD drivers have been amazing for the past 6 months, while nVidia has had more blunders than normal. It honestly feels like the ill-informed meme people parrot has been reversed, outside of Linux.

However, if we're bringing secondary factors beyond the video card into this... the option of a FreeSync monitor should clinch the win in any "budget" argument. None of these cards can push a consistent 60 fps at decent settings in modern games. FreeSync would let people push higher fidelity, remove tearing, and enjoy the smooth variable frame rates these cards will inevitably deliver between 30 and 60 fps. It effectively lets a card punch a class above its weight.
Yeah, I am sure that people looking for a budget GPU are just going to buy a new monitor.
Jerry-rigged Molex adapter aside, HP sells systems with 300-watt power supplies containing RX 480s, so I don't see why a 470 would be a problem.

Don't think so. Yes, the standard PSU is 300 watts, and an RX 480 is an option. *BUT* if you go to the configuration page and select an RX 480, it forces you to select an upgraded PSU as well: a 500-watt unit.
 
Jerry-rigged Molex adapter aside, HP sells systems with 300-watt power supplies containing RX 480s, so I don't see why a 470 would be a problem.
Some aftermarket 470s do consume more than a 480.

[Image: power-consumption chart (ALAEY8H.png)]


That Nitro hits 270W in Battlefront, probably without a full CPU load. With a quality 300W unit, there probably wouldn't be a problem, but I'd be a bit unsure if that would hold up with a full CPU and GPU load. It is notable that Computerbase runs a 4.5GHz 6700K, which obviously wouldn't be the case for a prebuilt system.

Still, I'd guess that that same 300W PSU could handle an OC'ed 1050 Ti and an OC'ed 6700K at full GPU and CPU load. That system only consumes half the power.

[Image: power-consumption chart (NL4Ftjw.png)]


Notably, the RX 480 Nitro actually hits 307W.
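The headroom argument above can be sanity-checked with some quick arithmetic. This is just an illustrative sketch: the system-draw figures are the ones quoted in the thread (with the 1050 Ti system taken as "half the power" of the 270 W figure), and the 80% continuous-load guideline is my own assumption, not a spec.

```python
# Rough PSU headroom check: does a measured whole-system draw fit a PSU
# while leaving some margin? The 80% continuous-load guideline below is
# an assumption for illustration, not a hard specification.

def psu_headroom(system_draw_w: float, psu_rating_w: float,
                 safe_fraction: float = 0.80) -> dict:
    """Compare a measured system draw against a PSU's comfortable limit."""
    safe_limit = psu_rating_w * safe_fraction
    return {
        "safe_limit_w": safe_limit,
        "fits": system_draw_w <= safe_limit,
        "margin_w": safe_limit - system_draw_w,
    }

# Figures quoted above: RX 470 Nitro system at 270 W, RX 480 Nitro at
# 307 W, and an OC'd 1050 Ti + 6700K system at roughly half the 470's draw.
for label, draw_w in [("RX 470 Nitro", 270), ("RX 480 Nitro", 307),
                      ("1050 Ti OC", 135)]:
    r = psu_headroom(draw_w, 300)
    print(f"{label}: fits a 300 W PSU at 80% load: {r['fits']} "
          f"(margin {r['margin_w']:.0f} W)")
```

By this (admittedly conservative) rule of thumb, both Nitro systems blow past a 300 W unit's comfortable limit, while the 1050 Ti build fits with room to spare.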
 
This makes very little sense. You're saying they're over-volting their GPUs to meet a certain spec, thus preventing good samples from being spec'd slightly higher at lower power consumption (i.e., more competitive in every metric), while simultaneously making their entire product stack look worse. Create a third SKU if necessary and raise the price of the top-tier, higher-performing SKU. If you're right about this, AMD's cutbacks in engineering reflect their incompetent marketing, and Vega stands no chance of being anything close to competitive.

Maybe it has something to do with the voltage droop that occurs during sudden load changes, when the GPU jumps from one calculation task to the next. Maybe the GPU itself is not the reason for the higher stock voltage.
Maybe the higher voltage allows more freedom in VRM component choice, letting third-party suppliers build cheaper cards. I think this is a BOM (bill of materials) decision.
Cheaper components with looser tolerances could mean more voltage droop.
After all, the RX 480 and RX 470 are so-called "budget" cards that allow max settings for 1080p gaming.
The chosen VRM controller, the IR3567B, is itself highly unlikely to be the reason, since everyone who owns an RX 480 or RX 470 (with an IR3567B controller) can undervolt the GPU core without an issue.

https://en.wikipedia.org/wiki/Voltage_droop
 
Are all American OEM-built PCs equipped with 300 W PSUs? Because all of the OEM builds I've seen in Europe in the last 15 years have had at least 400-watt PSUs.
 
Currently that "massive" price discrepancy is about $25. A better deal? Probably. But it's not like the difference between, say, the 290 and the 290X.

By which deals, now?
And it's not like you can compare the 290/290X directly; you need to factor in the price-bracket difference.
 
By which deals, now?
And it's not like you can compare the 290/290X directly; you need to factor in the price-bracket difference.
The two cheapest ones on Newegg as of my post last night, both MSI Armor models.

When Hawaii was relaunched as the 390, the 290 was $250 and the 290X was $329, per AnandTech's 390 launch article. That's a 31.6% price difference, much larger than the percentage difference between the 470 4GB and the 480 4GB.
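The percentage comparison above works out as a quick calculation. The 290/290X prices are the ones quoted in the thread; the ~$180 base price for the 470 4GB is an assumed ballpark to illustrate the roughly $25 gap mentioned earlier, not a figure from the thread.

```python
def price_gap_pct(lower: float, higher: float) -> float:
    """Percentage premium of the higher-priced card over the lower-priced one."""
    return (higher - lower) / lower * 100

# Hawaii relaunch-era prices quoted above:
print(f"290 -> 290X: {price_gap_pct(250, 329):.1f}%")  # 31.6%

# Assumed ~$180 base for the 470 4GB plus the ~$25 gap mentioned earlier:
print(f"470 4GB -> 480 4GB: {price_gap_pct(180, 205):.1f}%")  # ~13.9%
```

Even with the base price only estimated, the Hawaii-era gap is clearly more than double the current one in percentage terms.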
 
I know the RX 470 is a bit better in the value department, but I just picked up an MSI RX 480 Armor 8GB OC for $200 AR. I wanted that little bit of extra oomph over the RX 470 since I game at 1920x1200 as opposed to 1080p, which requires pushing a tad more pixels. Lots of good deals going on right now.

This is my first GPU update in a little over 3 years. Whew, finally! 🙂
 
0 for 2.

Are there OEM versions of the 470/480 or are those the exact same versions they sell off the shelf?
 
I've flashed over 35 480s (both aftermarket and, mostly, reference) with the same mining BIOS. This BIOS sets the voltages to the same value on all cards. On top of the lowered voltages from the modified BIOS, I use a tool called Wattman 0.92 to undervolt to 880 mV on the GPU core and 900 mV on the memory. I've set all but one of the 35 cards to the exact same voltage settings and have had no stability issues mining for over 6 months. The only card that needed a slight bump in voltage (900 mV GPU core) has an ASIC rating of 90%. All cards run at 72C or lower on the core and don't throttle. Under mining load, the reference cards consume between 58 and 75 W each on the core (depending on algorithm and ASIC rating); the rest of the card consumes an additional 35 to 45 W. All reference cards run at 1095 MHz core and 2180 MHz memory with modified (lower-latency) straps. All cards are 8GB models with Samsung memory.

From my tests I conclude that AMD overvolts their reference 480 series cards quite a bit more than they have to. I suspect they do this because they lack the resources and expertise to fine-tune the cards better, but by skipping this process the reference cards throttle unnecessarily, causing performance degradation and increased fan noise. Perhaps they know about the issue but don't think it's worth addressing due to time/cost factors. I don't really know.

This is a shame, as these cards at stock speeds can be undervolted quite a bit, to where they get much closer to GeForce 1060 power consumption. Everyone likes to blame GlobalFoundries for their crappy 14nm process, but I think a large part of the power-consumption issue is directly caused by AMD's lack of fine-tuning.

This overvoltage observation goes all the way back to Radeon 58xx series cards.
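The savings from an undervolt like the one described can be roughly estimated from the usual dynamic-power relation, P ∝ C·V²·f (power scales with the square of voltage at a fixed clock). This ignores static leakage, and the 1.15 V stock core voltage is my own assumption for illustration; the post only gives the 880 mV undervolted target.

```python
def undervolt_power_ratio(v_stock: float, v_new: float) -> float:
    """Estimate dynamic power at a lowered voltage, same clock.

    Dynamic power scales roughly with V^2, so the ratio of new to
    stock power is (v_new / v_stock) ** 2. Leakage is ignored.
    """
    return (v_new / v_stock) ** 2

# Assumed ~1.15 V stock core voltage vs the 880 mV target used above.
ratio = undervolt_power_ratio(1.15, 0.88)
print(f"Estimated dynamic power at 880 mV: {ratio:.0%} of stock")
```

Under these assumptions the core's dynamic power drops to a bit under 60% of stock, which is at least consistent with the sizeable measured savings the post reports.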

Back on topic. The best-value card depends on geography, but in the Canadian and US markets the 4GB 480 is the best, followed closely by the 470. The 1050 Ti is terrible value in comparison *unless you have a special need for size/power* (and I bought the EVGA 1050 Ti). The 1060 3GB card is rendered pointless at current pricing; it should slot in, price-wise, between the 1050 Ti and the 470.
 
From my tests I conclude that AMD overvolts their reference 480 series cards quite a bit more than they have to. I suspect they do this because they lack the resources and expertise to fine-tune the cards better, but by skipping this process the reference cards throttle unnecessarily, causing performance degradation and increased fan noise. Perhaps they know about the issue but don't think it's worth addressing due to time/cost factors. I don't really know.

Perhaps their initial batch of chips needed the higher voltage, but more recent batches can handle lower voltages? Maybe we'll see a refreshed RX 475/485 which runs at a lower voltage?
 
Perhaps their initial batch of chips needed the higher voltage, but more recent batches can handle lower voltages? Maybe we'll see a refreshed RX 475/485 which runs at a lower voltage?

Maybe. I should have mentioned that all of my RX 480 reference cards are from launch batches or were purchased very shortly afterwards. The reference cards use less energy than the aftermarket Sapphire and MSI cards, which were purchased a little later. I understand the higher-priced XFX aftermarket cards are supposed to be more energy efficient, but they could be binning and/or have a better VRM design. Perhaps XFX simply recognized these chips were being overvolted and did something about it. Who knows.
 
History repeats itself: AMD rules the budget cards and Nvidia the top-end ones. Let's see if Vega can challenge that.
I honestly couldn't care less if AMD ever puts out a $700+ video card because it's unlikely that I'd ever spend that much. As long as they keep putting out performance/midrange cards at a good price, they'll keep getting my money.
 
I honestly couldn't care less if AMD ever puts out a $700+ video card because it's unlikely that I'd ever spend that much. As long as they keep putting out performance/midrange cards at a good price, they'll keep getting my money.

Dude, I never implied that the card must be $700. My point was that since I like AMD and only buy high end, it would be great to have an option in that performance class.
I bought an HD 7970 and a 980 Ti because they were the best back then. I'd love to get a fast Vega card, that's all... but if the 1080 Ti comes out and it's faster, count me in 🙂
 
I honestly couldn't care less if AMD ever puts out a $700+ video card because it's unlikely that I'd ever spend that much. As long as they keep putting out performance/midrange cards at a good price, they'll keep getting my money.
I couldn't agree more. I've never spent over $300 on a video card; usually $250 is where I max out. My last card was a Sapphire 290 OC for $250 when they went on sale almost 2 years ago. I would have bought a 480, but my 290 is so good I don't need one at this point. If they come out with a 480 refresh I might consider the purchase, but I will never spend $500 to $1,000 on a card.
 
AMD has some really nice cheap graphics cards out right now, especially on Newegg. I can't comprehend why anyone would buy a 1060 there when 480s are so much cheaper.
 