[Rumor (Various)] AMD R7/9 3xx / Fiji / Fury


LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
The 260/X (Bonaire) has FreeSync and TrueAudio already.

360 and 370 could be Bonaire?
 

DiogoDX

Senior member
Oct 11, 2012
757
336
136
AMD-Virtual-Super-Resolution-May-2015-Radeon-3001.jpg


That rules out a re-badge for the low-end. So we just lack info on 390/X.

Edit: Only Fiji interests me personally, but AMD definitely needs their entire stack to be competitive. The 750ti, 960 and 970 are doing the damage.
That doesn't rule it out, because VSR already works on 7000-series cards with a modded Win10 driver.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Interesting, I didn't know that, still waiting for VSR on my old 7950. Some old games would benefit from super sampling.
I used the wagnards tools method just a couple of days ago for my HD7950.
Definitely worth it. This is a great freaking feature for old games where you have extra horsepower and no in-game setting to take advantage of it.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
Guys, here is an indication that the rebrand theory is going awfully wrong. :cool:

http://videocardz.com/56182/amd-off...t-for-radeon-r9-300-and-r7-300-graphics-cards

"What is also not surprising, but definitely interesting, is that AMD managed to include Bonaire and Pitcairn-based Radeon R7 360 and R7 370 series to this list. VSR is without a doubt much easier to implement than all in-house features like FreeSync or TrueAudio, so I wouldn’t see this as an indicator for other technology to be ported into new series, but hopefully I’m wrong."

Funnily enough, wccftech now tries to defend its rebrand theory. But the truth is this rules out Pitcairn and Bonaire, as they do not have the necessary hardware to support VSR.

If AMD is taking the effort to include VSR, then FreeSync and TrueAudio + the Tonga improvements (and even further architectural improvements, as I expect) will make it to the R9/R7 series. Anyway, it's just a week before these rumours are settled once and for all. :whistle:

The problem with this theory is that AMD already has a partially working software implementation of VSR in a beta driver that works with all GCN cards, even GCN 1.0. This post contains details of how one person got VSR working on their 7970 (a Tahiti-based GCN 1.0 card).

Another problem is that the leaked R7 370 images include CrossFire fingers on the card. Below is a zoomed-in portion of the image, where these are clearly visible:

El73jEz.png


Only GCN 1.0 needs a physical CrossFire bridge. GCN 1.1 (Hawaii) and GCN 1.2 (Tonga) use XDMA for this. Therefore, if AMD had updated the silicon, we'd expect XDMA to have been added, along with features like FreeSync and TrueAudio. But XDMA is clearly not present, which means that this is an unmodified Pitcairn chip. While it's theoretically possible that AMD updated the hardware scaler block, UVD block, and added TrueAudio and FreeSync while still leaving the architecture at GCN 1.0 and omitting XDMA, it seems very unlikely. It's 95%+ certain that this is Pitcairn, with perhaps a metal-layer respin (new stepping) at most.

Even that is questionable, since it doesn't appear power efficiency will be any better than on existing 7850/7870/R9 270 cards. (R9 270 got 1280 shaders on a single 6-pin connector, while this one only does 1024.) The leaked XFX R9 370 data sheet from a few months back was wrong about the release date, but that may well have been changed by AMD, and the other stuff all looks correct, such as the short PCB (167mm) - you can see in the slides leaked today that the R7 370's board is shorter than its cooler.

The data sheet indicates that the "Ref Board Power" is 110W-130W. In comparison, the original HD 7850 from way back in 2012 only used a maximum of 96W in gaming loads and 101W in FurMark. The increased power usage is presumably due to higher core clocks (the original 7850 was only 860 MHz) and even more to the use of faster GDDR5, which makes the 200 series less efficient than the 7000 series on average. So it doesn't look like we're getting any efficiency gains with R7 370, either.
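For what it's worth, here's a quick back-of-the-envelope version of that comparison (just a rough Python sketch; the figures are the ones quoted above from the data sheet and old HD 7850 reviews):

```python
# Rough comparison of measured HD 7850 power vs. the leaked R7 370 board-power range.
# Numbers are taken from the post above (7850 FurMark load was 101W); treat them as approximate.
hd7850_gaming_w = 96               # max gaming load of the original HD 7850
r7_370_board_power_w = (110, 130)  # "Ref Board Power" range from the leaked XFX data sheet

for board_w in r7_370_board_power_w:
    increase_pct = (board_w - hd7850_gaming_w) / hd7850_gaming_w * 100
    print(f"{board_w}W board power is ~{increase_pct:.0f}% above the 7850's 96W gaming load")

# Prints ~15% and ~35% - i.e. whatever clock/memory bumps the R7 370 gets,
# they come at the cost of more power rather than better efficiency.
```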

I don't want to believe it, either - but the preponderance of the evidence indicates that everything except Fiji is just a straight rebrand.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
The 290X does not use 300W under normal usage (i.e. games, where it averages ~250W). "FurMark" is not a game, and it is not used to tell people how many watts a GPU uses while gaming (i.e. regular usage). To use it as such is disingenuous.

If you want to go that route, then the GTX980 uses 342W according to TPU, as opposed to 182W while gaming.

Plus, stating that the 300 series are direct renames of the 200 series as fact is WRONG as we DO NOT KNOW. You can talk about how rumors state one thing or another, but do not state them as fact.

The reference GTX 980 only goes up to 190W even in FurMark (in gaming, it peaks at 184W, so the FurMark figure isn't even that far out of normal usage). This indicates that it's running up against the power limit - which is how cards are supposed to be designed.

You're probably thinking of this screwed-up Gigabyte GTX 980 which I linked to in the past. That card does indeed go up to an insane 342W on FurMark, which is enough to violate the PCIe specification (the card has one 8-pin and one 6-pin connector, which is a total of 300W allowed power usage). But none of the other 980 cards do this, which means that Gigabyte screwed up by disabling the power limiter entirely on this card.
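To make that spec math explicit, here's a minimal sketch using the standard PCIe allowances (75W from the slot, 75W per 6-pin, 150W per 8-pin):

```python
# Allowed board power under the PCIe spec: 75W from the slot plus the aux connectors.
SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def allowed_board_power(connectors):
    """Total power the spec allows for a card with the given aux connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

budget = allowed_board_power(["8-pin", "6-pin"])  # the Gigabyte GTX 980's configuration
measured_furmark_w = 342                          # what that card pulls in FurMark
print(f"Spec allows {budget}W, card pulls {measured_furmark_w}W -> {measured_furmark_w - budget}W over spec")
# Spec allows 300W, card pulls 342W -> 42W over spec
```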

Again, I reject the argument that an actual program you can run on your system somehow doesn't count just because the vendors don't like the results. If AMD doesn't like it, then they should be using the built-in power limit function to cap power usage, just as Nvidia does with their reference designs.

I wish that someone would release an actual game based on the FurMark engine just so that this argument ("it doesn't count") goes away.
 
Feb 19, 2009
10,457
10
76
I wish that someone would release an actual game based on the FurMark engine just so that this argument ("it doesn't count") goes away.

Until they do, stop using FurMark as a representation of a GPU's power use. Gamers certainly don't buy GPUs to play FurMark, at least not the sane ones. ;)

You could argue that the bitcoin-mining power load applies to Radeons, and that was true once, when mining was very popular.

But these days, gaming power load across various popular titles is an accurate representation.

On this, AMD's TDP ratings tend to match max load, such as bit mining, while NV's TDP ratings reflect average gaming load (NV cards have been shown to use way more in compute tasks). I think it's time AMD went with NV's approach so both are quoting the same measurement.

Either way, nobody makes a stir over the Titan X pulling more than 250W in compute or FurMark.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The problem with this theory is that AMD already has a partially working software implementation of VSR in a beta driver that works with all GCN cards, even GCN 1.0. This post contains details of how one person got VSR working on their 7970 (a Tahiti-based GCN 1.0 card).

Another problem is that the leaked R7 370 images include CrossFire fingers on the card. Below is a zoomed-in portion of the image, where these are clearly visible:

El73jEz.png


Only GCN 1.0 needs a physical CrossFire bridge. GCN 1.1 (Hawaii) and GCN 1.2 (Tonga) use XDMA for this. Therefore, if AMD had updated the silicon, we'd expect XDMA to have been added, along with features like FreeSync and TrueAudio. But XDMA is clearly not present, which means that this is an unmodified Pitcairn chip. While it's theoretically possible that AMD updated the hardware scaler block, UVD block, and added TrueAudio and FreeSync while still leaving the architecture at GCN 1.0 and omitting XDMA, it seems very unlikely. It's 95%+ certain that this is Pitcairn, with perhaps a metal-layer respin (new stepping) at most.

Even that is questionable, since it doesn't appear power efficiency will be any better than on existing 7850/7870/R9 270 cards. (R9 270 got 1280 shaders on a single 6-pin connector, while this one only does 1024.) The leaked XFX R9 370 data sheet from a few months back was wrong about the release date, but that may well have been changed by AMD, and the other stuff all looks correct, such as the short PCB (167mm) - you can see in the slides leaked today that the R7 370's board is shorter than its cooler.

The data sheet indicates that the "Ref Board Power" is 110W-130W. In comparison, the original HD 7850 from way back in 2012 only used a maximum of 96W in gaming loads and 101W in FurMark. The increased power usage is presumably due to higher core clocks (the original 7850 was only 860 MHz) and even more to the use of faster GDDR5, which makes the 200 series less efficient than the 7000 series on average. So it doesn't look like we're getting any efficiency gains with R7 370, either.

I don't want to believe it, either - but the preponderance of the evidence indicates that everything except Fiji is just a straight rebrand.

Possible. Or they could simply use the same PCB.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
I don't trust that pic. Wccf posted another article with the OEM R9 370 and it looks just like this one, just a much sharper image.

I like the look of the cooler.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The reference GTX 980 only goes up to 190W even in FurMark (in gaming, it peaks at 184W, so the FurMark figure isn't even that far out of normal usage). This indicates that it's running up against the power limit - which is how cards are supposed to be designed.

You're probably thinking of this screwed-up Gigabyte GTX 980 which I linked to in the past. That card does indeed go up to an insane 342W on FurMark, which is enough to violate the PCIe specification (the card has one 8-pin and one 6-pin connector, which is a total of 300W allowed power usage). But none of the other 980 cards do this, which means that Gigabyte screwed up by disabling the power limiter entirely on this card.

Again, I reject the argument that an actual program you can run on your system somehow doesn't count just because the vendors don't like the results. If AMD doesn't like it, then they should be using the built-in power limit function to cap power usage, just as Nvidia does with their reference designs.

I wish that someone would release an actual game based on the FurMark engine just so that this argument ("it doesn't count") goes away.

One problem with using FurMark as a "benchmark" is that at different times AMD and Nvidia have dialed back performance in their drivers to prevent overuse. It is known as a heat virus to both companies, and their drivers have treated it as such, so it's not a very accurate measurement.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
One problem with using FurMark as a "benchmark" is that at different times AMD and Nvidia have dialed back performance in their drivers to prevent overuse. It is known as a heat virus to both companies, and their drivers have treated it as such, so it's not a very accurate measurement.

That was true at one time, but not any more. From Kepler and GCN 1.0 onward, both Nvidia and AMD simply set a power limit on their cards and throttle back clocks if the limit is exceeded. There's no need to single out FurMark in the drivers; any application could theoretically be throttled if it uses too much juice. Games generally don't hit the power limit, but GPGPU applications sometimes do. If the drivers were throttling FurMark specifically, you'd expect to see other applications using more power than FurMark, but I've never heard of that ever happening on a Kepler, Maxwell, or GCN card.
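As a mental model of that mechanism, here's an illustrative sketch (not AMD's or Nvidia's actual firmware logic; the limit matches the reference 980's ~190W figure, while the clock step and readings are made-up numbers):

```python
# Toy model of board-level power limiting: when measured board power exceeds the
# limit, the boost clock steps down one bin; otherwise clocks are left alone.
def next_clock(measured_power_w, power_limit_w, clock_mhz, step_mhz=13):
    if measured_power_w > power_limit_w:
        return clock_mhz - step_mhz   # throttle back one boost bin
    return clock_mhz                  # under budget: no throttling

clock = 1126                             # illustrative starting boost clock
for sample_w in (180, 195, 210, 185):    # hypothetical board-power readings
    clock = next_clock(sample_w, power_limit_w=190, clock_mhz=clock)
    print(f"{sample_w}W -> {clock}MHz")

# Games rarely hit the limit, so they keep full boost; a power virus like FurMark sits
# at the limit and simply gets clocked down - no application-specific detection needed.
```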

The fact that throttling happens at the card level and not the driver level can be seen with this Gigabyte GTX 980. The reference GTX 980 only uses 190W; the Gigabyte card jumps up to an absurd 342W. It looks like what happened here is that Gigabyte didn't implement the power limit at all on this card. If FurMark was being throttled at the driver level, then we would expect to see that no matter which AIB manufactured the card, but that's clearly not the case.

The advantage of FurMark is that it exposes the card's power limit (whether or not you choose to call that "TDP" is not really important). You need this to determine whether your PSU and case cooling are adequate for the electrical and thermal load.
 
Feb 19, 2009
10,457
10
76
The advantage of FurMark is that it exposes the card's power limit (whether or not you choose to call that "TDP" is not really important). You need this to determine whether your PSU and case cooling are adequate for the electrical and thermal load.

Which is important if you buy GPUs to play FurMark.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
That was true at one time, but not any more. From Kepler and GCN 1.0 onward, both Nvidia and AMD simply set a power limit on their cards and throttle back clocks if the limit is exceeded. There's no need to single out FurMark in the drivers; any application could theoretically be throttled if it uses too much juice. Games generally don't hit the power limit, but GPGPU applications sometimes do. If the drivers were throttling FurMark specifically, you'd expect to see other applications using more power than FurMark, but I've never heard of that ever happening on a Kepler, Maxwell, or GCN card.

The fact that throttling happens at the card level and not the driver level can be seen with this Gigabyte GTX 980. The reference GTX 980 only uses 190W; the Gigabyte card jumps up to an absurd 342W. It looks like what happened here is that Gigabyte didn't implement the power limit at all on this card. If FurMark was being throttled at the driver level, then we would expect to see that no matter which AIB manufactured the card, but that's clearly not the case.

The advantage of FurMark is that it exposes the card's power limit (whether or not you choose to call that "TDP" is not really important). You need this to determine whether your PSU and case cooling are adequate for the electrical and thermal load.

Your example just shows why FurMark isn't a great choice of benchmark for gauging TDP. In one case, the card is left to go hog wild, while in the other it is held back.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It's interesting you would link up some obscure site with overclocked 970s

1. Computerbase.de is not an obscure site. It's been widely regarded as one of the most professional GPU hardware review sites. Just because they are not based out of the U.S. doesn't mean they are obscure. You sound like a typical American who hasn't left his country, or maybe his continent, and lives in his own little world.

2. They aren't using an overclocked 970. All of the cards are running at reference clocks in their 5 NV generations review.

while back-handedly questioning the integrity of virtually all western review sites.

Wrong. I am not questioning the integrity of all western review sites. It should be a journalist's job to investigate things, and if sites take marketing TDP numbers at face value, they aren't doing a good enough job reviewing GPU hardware. TPU, for instance, is one site that does investigate the power usage of many cards, and they do a great job!

You seem to have used one of the weakest arguments I've ever seen by disregarding Computerbase.de's testing instead of focusing on the main issue at hand - TDP does NOT mean power usage. There are plenty of other sites I can link, and you'll disregard all of them too? :rolleyes:


The constant conspiracy theory thing gets old, and frankly it makes you and your references much more questionable than anything else.

It's not a conspiracy theory. Most PC gamers in the world do not have BestBuy, so it's worthless to use a "reference 970" TDP to mean the power usage of a typical 970. Again, instead of focusing on the main issue at hand, you start insinuating that my references and motives are questionable - essentially you presented a weak rebuttal.

AT has a long reputation of exposing poor business practices.

Marketing is not a poor business practice in this instance. It's about looking deeper beyond marketing.

The most recent includes determining how phone makers were borking benchmarks by having the OS detect a bench is being run and pushing the phone beyond normal thermal limits. They did a lot of research to determine that. What did your site do? Nothing. That's right. Not ... one ... thing.

What? It's not my site. Why are you bringing smartphones into this when we are discussing GPU power usage? Another weak argument - attacking the site because it doesn't meet your "requirements" instead of focusing on the content. I barely remember you posting in this sub-forum in the last 9 years, and suddenly Computerbase and other non-US sites are not worth reading anymore? Ya right.


This is what AnandTech and Tom's sites say about R9 290 vs GTX 970 power, noise, and heat. They are both within a few watts of each other.

You are using reference 290 data against an after-market 970. Makes no sense. Again, I never even mentioned 290 and focused on NV's marketing TDPs but you have to bring 290 into the discussion for no reason other than to not focus on the real power usage of 99% of 970s - which all happen to be after-market versions. That's the point here - the types of GPUs bought by PC gamers worldwide are after-market 970s, not BestBuy 970s.

Way to miss all other points too about how a gaming rig with an i5/i7, monitor, speakers, etc. will already use well over 300W of power. Since we were discussing how 50-60W of power might heat up a room, the whole point was that a 970 system and a 290 system will both heat up a room. Unless that person is running a Wii U, that 50-60W of extra power usage won't matter in a small room when the entire system / rig is using well over 300W of power with the monitor, etc.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Which is important if you buy GPUs to play FurMark.

He doesn't get it. If you put dual 8-pin connectors on a 980 and let it run FurMark, it could easily draw 350W of power, but in games it'll probably max out at 200W. (*EDIT* Actually, that's basically what the Gigabyte G1 980 Gaming card is - I found the results and added them to the bottom of this post to prove exactly this scenario!!!) But since he doesn't seem to understand how a power virus actually works on the entire graphics card (not just the ASIC), this is pointless to explain. Let him use FurMark and the entire forum will just ignore those posts.

This reminds me of some people in the CPU forum who insist that unless you've run some synthetic benchmark for XX hours, your CPU overclock isn't stable. What matters are the real-world programs we use in our rigs. I can be stable in Prime95 and still crash in a game. Who plays FurMark? Let's imagine a gamer got a flagship card that draws 350W of power in FurMark but only 200W in games. Now for 2 cards, since SLI/CF doesn't scale at 100%, let's say the gaming power usage would be 350-380W, but FurMark would load both cards at 350W x 2 = 700W. All of a sudden the PSU requirements for that rig are pushing 1000W, with the overclocked CPU and all. Makes no sense!

Taking it one step further: if I run FurMark on both of my overclocked GPUs and LinX on my CPU, my 1000W Platinum PSU can handle it, but power usage will be insane - essentially a theoretical simulation of 99.99% CPU load + 99.99% load on 2 GPUs. You cannot have a situation like that in gaming.
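To put rough numbers on that two-GPU scenario (a hypothetical sketch; the GPU figures are the ones from the example above, the CPU figures are assumptions):

```python
# Hypothetical dual-GPU rig: PSU sizing from gaming load vs. a synthetic worst case
# (FurMark on both GPUs + LinX on the CPU). Not measurements from any review.
gpu_gaming_pair_w = 380    # both cards while gaming (SLI/CF scaling < 100%)
gpu_furmark_each_w = 350   # each card pinned by FurMark
cpu_gaming_w = 120         # assumed overclocked CPU during a game
cpu_linx_w = 220           # assumed same CPU under LinX

gaming_total = gpu_gaming_pair_w + cpu_gaming_w
synthetic_total = 2 * gpu_furmark_each_w + cpu_linx_w
print(f"Gaming: ~{gaming_total}W, synthetic worst case: ~{synthetic_total}W")
# ~500W while gaming vs ~920W in the synthetic case - sizing the PSU for the
# latter is exactly how the 'requirements' balloon toward 1000W.
```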

Real world CPU platform power usage:

power-3.png


Synthetic CPU platform power usage:

power-2.png


Real world GPU gaming power usage:

power_peak.gif


vs. synthetic/power-virus/irrelevant GPU power usage (this could go up by another 150W for every extra 8-pin connector we keep adding, along with 24 power phases and the GPU under LN2; heck, it could use 700W of power and still operate at stock speeds).

power_maximum.gif


^ This is a PERFECT example of why FurMark is a worthless test. Here we have a reference 980 and an after-market 980 that use a very similar amount of power at peak in gaming. However, once we switch to FurMark, since FurMark is designed to power-load every aspect of the graphics card (including the PCB, MOSFETs, VRMs and VRAM), it'll essentially max out every component on the graphics card, not just the ASIC itself. If you have 6-pin + 8-pin connectors, it'll max those out. If you have dual 8-pins and enough power circuitry, it'll max those out. The actual power usage in games, and the performance of the graphics card, doesn't change. That means a card with dual 8-pin connectors will have a higher FurMark power usage than the exact same GPU with a 6-pin and an 8-pin. If we add 3x 8-pin, over 500W of power usage under FurMark is possible with the exact same card that would normally peak at 230W in games.
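Here's the same point as a quick table (a sketch reusing the 75W/75W/150W connector allowances; the ~230W gaming peak is the figure from the paragraph above):

```python
# The theoretical FurMark ceiling scales with the connector configuration,
# while the gaming peak of the same GPU stays put (~230W in this example).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150
gaming_peak_w = 230

configs = {
    "6-pin + 8-pin": SLOT_W + SIX_PIN_W + EIGHT_PIN_W,
    "2x 8-pin":      SLOT_W + 2 * EIGHT_PIN_W,
    "3x 8-pin":      SLOT_W + 3 * EIGHT_PIN_W,
}
for name, ceiling_w in configs.items():
    print(f"{name}: FurMark ceiling ~{ceiling_w}W, gaming peak still ~{gaming_peak_w}W")
# 3x 8-pin gives a ~525W ceiling, which is how 500W+ FurMark numbers are possible
# on a card that never gets anywhere near that in games.
```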

Like this card:

galax_980ti_3.jpg


It's hard to have an informed discussion on the power usage of the R9 300 series of cards (for starters, none of them have been tested in the real world by a 3rd party for us to see the results) when people use FurMark and rely on NV's TDP numbers as the basis of comparison. Of course, at that point a card that has a 6-pin connector will draw a ton more wattage in FurMark than a 750Ti that doesn't!

Next thing you know, a 750Ti with a 6-pin power connector draws 150W of power. Just because a card has a 75W PCIe slot and a 75W 6-pin connector, it doesn't at all mean it draws 150W of power in games. In fact, it might never even use the 6-pin connector at all, until the gamer starts overclocking. Facepalm -- some of the stuff that posters try to pass off as facts/good info on this sub-forum is cringe worthy.

Asus 750Ti OC with a 6-pin connector still uses < 75W of power in games:

power-consumption1.png
 

beginner99

Diamond Member
Jun 2, 2009
5,320
1,768
136
Is there a reason people aren't comparing to the current 290x's with 8gb?

Because the current 8 GB 290Xs are a niche product, and because they're niche, the price is inflated and doesn't actually reflect the true cost of the additional 4 GB. It's also very likely the 390X will use the new 8 Gb 20 nm GDDR5 from Samsung, which might also be cheaper than current GDDR5.
 

zagitta

Member
Sep 11, 2012
27
0
0
While it's theoretically possible that AMD updated the hardware scaler block, UVD block, and added TrueAudio and FreeSync while still leaving the architecture at GCN 1.0 and omitting XDMA, it seems very unlikely.

How is that unlikely at all? The UVD block is mostly decoupled from the main GPU logic, since it's just a bunch of fixed-function hardware that does its own thing, while XDMA is a much more complex thing to integrate, because you're unifying the synchronous chip-to-chip interconnect (the CrossFire bridge) with the asynchronous PCI-E bus.

My point is that XDMA is so vastly different from the old CrossFire bridge design that it wouldn't surprise me in the least if it requires the chip to be designed for it from the start.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
this is definitely legit at least

It's hard to tell from that small picture but HD6970/7970/R9 290X reference fan blades are curved.

290X_Three_Amigos_Side.jpg


This fan looks more similar to a reference GTX680

CHBQaGiU8AA6XFv.jpg:large


Comparison.jpg


Maybe AMD decided to switch the fan supplier.

 

stuff_me_good

Senior member
Nov 2, 2013
206
35
91
I totally understand your point, especially if we were discussing a gaming rig with a 750Ti and a 35-65W Intel CPU against a high-end gaming rig. But how many gamers buying $300+ GPUs will be in that position?

If you are going to argue that a high-end gaming rig will heat up a small room, that applies to most high-end gaming systems, including a GTX970/980 rig.

Power_01.png

Power_02.png

Power_03.png


If you are going to use that argument, then you need to start adding up speaker and monitor power usage too in relative terms. My Westinghouse 37" LVM-37W3 monitor alone peaks at 210W. Once all of that is added up, do you think a difference of 50-60W matters a great deal in heating up a room when the entire gaming rig with speakers and the monitor uses 500W+ of power already?

That's why I can't understand how all the perf/watt proponents ignore the overall power usage of a gaming system, including the monitor + speakers, etc. Why don't they want to measure perf/watt in the context of the overall system power usage? I can't game without my monitor or an i5/i7 rig, mobo, etc. If I am legitimately going to compare perf/watt used in generating IQ/FPS, I need to look at the overall power usage since that's how we use our computers. That's why I feel sites only using perf/watt on a videocard basis alone are being disingenuous to the readers.



Doesn't solve 3.5GB memory issue. Also, AMD's AIBs tend to be more aggressive with rebates. Even if R9 390 is $329, it shouldn't be too long before $20 rebates are offered.



That's another key weakness of a 970, the after-market versions are often not that much faster than a reference version, and can be slower than a stock reference 290X.

8449


Also, if we are going to bring up after-market 970s, we can't use the marketing 145W power usage TDP for NV. Look at the power usage of after-market 970 cards, it's 180-190W, easily.

power-load.gif


NV marketing => Use after-market 970 level of performance and quote 145W TDP in reviews. Average Joe buys into that. AMD's marketing team needs to discredit this marketing tactic.

RS, you have some good valid points, but I can also see the truth behind what silverforce is trying to imply.

Like you have said so many times, people shouldn't get stuck focusing on only one metric, which in this case is GPU power usage, when in fact the only thing that matters is whole-system power usage. It's the same thing as all those marketing graphs that are zoomed in so that a 2% difference looks huge when in fact it isn't. In gaming, a 40-50W difference is not a big deal when comparing system power usage.
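As a rough illustration (hypothetical round numbers, not measurements from any review):

```python
# How a 50W GPU difference looks once the whole system is counted.
system_base_w = 450   # assumed rig + monitor + speakers, before the GPU difference
gpu_delta_w = 50      # difference between the two cards being compared
share = gpu_delta_w / (system_base_w + gpu_delta_w) * 100
print(f"A {gpu_delta_w}W GPU difference is ~{share:.0f}% of what the room actually has to absorb")
# ~10% - a much smaller share than a card-only comparison chart suggests.
```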


But I think silverforce is trying to say that when you have a hot room, because you live in a hot climate or don't have proper airflow, the OC headroom goes out the window from the get-go. And as you, RS, have pointed out so many times with those Intel graphs, silicon that runs hot also uses more power, which hurts OC ability even more, makes the whole system draw more, and makes the room hotter.

Second point: I have my computer under the office table on my left side where I sit, and the table is placed against the wall on the right and in front. So there is only one way the air can come out, and it's mostly right where I sit, between me and the table. All the hot air comes straight into my face, and the difference between gaming and not gaming is pretty big. I have a 7970 OC'd to 1.1GHz, and sometimes while gaming the hot air is really annoying; any more than that in my face would be annoying all the time. While 40-50W is not much in total system power and doesn't really warm the room much, straight in your face it starts to be a big deal. So I think that is also one of the things silverforce was trying to imply.

The "new PSU for every new GPU" line is becoming a really old excuse, and it's hard to believe that all those $550 980 owners don't invest in the other components of their systems, like the PSU. Besides, almost everyone, whenever someone asks what PSU to buy, suggests something good and efficient that doesn't take the whole system with it when it goes poof.
 

Goatsecks

Senior member
May 7, 2012
210
7
76
Like you have said so many times, people shouldn't get stuck focusing on only one metric, which in this case is GPU power usage, when in fact the only thing that matters is whole-system power usage. It's the same thing as all those marketing graphs that are zoomed in so that a 2% difference looks huge when in fact it isn't. In gaming, a 40-50W difference is not a big deal when comparing system power usage.

It's about keeping things cool and quiet, not being eco-friendly: 40-50W for a single component is quite a big deal.