Don't rule it out, because VSR already works on 7000 cards with a modded Win10 driver.
That rules out a re-badge for the low-end. So we just lack info on 390/X.
Edit: Only Fiji interests me personally, but AMD definitely needs their entire stack to be competitive. The 750 Ti, 960 and 970 are doing the damage.
Don't rule it out, because VSR already works on 7000 cards with a modded Win10 driver.
Used the Wagnard's tools method just a couple of days ago for my HD7950.

Interesting, I didn't know that; still waiting for VSR on my old 7950. Some old games would benefit from super sampling.
Guys, here is an indication that the rebrand theory is going awfully wrong.
http://videocardz.com/56182/amd-off...t-for-radeon-r9-300-and-r7-300-graphics-cards
"What is also not surprising, but definitely interesting, is that AMD managed to include Bonaire and Pitcairn-based Radeon R7 360 and R7 370 series to this list. VSR is without a doubt much easier to implement than all in-house features like FreeSync or TrueAudio, so I wouldnt see this as an indicator for other technology to be ported into new series, but hopefully Im wrong."
Funnily, wccftech now tries to defend its rebrand theory. But the truth is this rules out Pitcairn and Bonaire as the chips behind those cards, since they do not have the necessary hardware to support VSR.
If AMD is making the effort to include VSR, then FreeSync and TrueAudio + the Tonga improvements (and even further architectural improvements, as I expect) will make it to the R9/R7 series. Anyway, it's just a week before these rumours are settled once and for all. :whistle:
The 290X does not use 300W under normal usage (ie: games, where it averages ~250W). "FurMark" is not a game, and it is not meant to tell people how many watts a GPU uses while gaming (ie: regular usage). To use it as such is disingenuous.
If you want to go that route, then the GTX980 uses 342W according to TPU, as opposed to 182W while gaming.
Plus, stating that the 300 series are direct renames of the 200 series as fact is WRONG as we DO NOT KNOW. You can talk about how rumors state one thing or another, but do not state them as fact.
I don't want to believe it, either - but the preponderance of the evidence indicates that everything except Fiji is just a straight rebrand.
I wish that someone would release an actual game based on the FurMark engine just so that this argument ("it doesn't count") goes away.
The problem with this theory is that AMD already has a partially working software implementation of VSR in a beta driver that works with all GCN cards, even GCN 1.0. This post contains details of how one person got VSR working on their 7970 (a Tahiti-based GCN 1.0 card).
Another problem is that the leaked R7 370 images include CrossFire fingers on the card. Below is a zoomed-in portion of the image, where these are clearly visible:
[image: zoomed-in crop of the leaked R7 370 card showing the CrossFire fingers]
Only GCN 1.0 needs a physical CrossFire bridge. GCN 1.1 (Hawaii) and GCN 1.2 (Tonga) use XDMA for this. Therefore, if AMD had updated the silicon, we'd expect XDMA to have been added, along with features like FreeSync and TrueAudio. But XDMA is clearly not present, which means that this is an unmodified Pitcairn chip. While it's theoretically possible that AMD updated the hardware scaler block, UVD block, and added TrueAudio and FreeSync while still leaving the architecture at GCN 1.0 and omitting XDMA, it seems very unlikely. It's 95%+ certain that this is Pitcairn, with perhaps a metal-layer respin (new stepping) at most.

Even that is questionable, since it doesn't appear power efficiency will be any better than on existing 7850/7870/R9 270 cards. (R9 270 got 1280 shaders on a single 6-pin connector, while this one only does 1024.) The leaked XFX R9 370 data sheet from a few months back was wrong about the release date, but that may well have been changed by AMD, and the other stuff all looks correct, such as the short PCB (167mm) - you can see in the slides leaked today that the R7 370's board is shorter than its cooler.

The data sheet indicates that the "Ref Board Power" is 110W-130W. In comparison, the original HD 7850 from way back in 2012 only used a maximum of 96W in gaming loads and 101W in FurMark. The increased power usage is presumably due to higher core clocks (the original 7850 was only 860 MHz) and even more to the use of faster GDDR5, which makes the 200 series less efficient than the 7000 series on average. So it doesn't look like we're getting any efficiency gains with R7 370, either.
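To put rough numbers on the single-6-pin point (assuming the usual PCIe limits of 75 W from the slot and 75 W from a 6-pin connector):

$$75\ \text{W (slot)} + 75\ \text{W (6-pin)} = 150\ \text{W available}$$

so a 110W-130W board fits inside the same connector budget the R9 270 already used.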
The reference GTX 980 only goes up to 190W even in FurMark (in gaming, it peaks at 184W, so the FurMark figure isn't even that far out of normal usage). This indicates that it's running up against the power limit - which is how cards are supposed to be designed.
You're probably thinking of this screwed-up Gigabyte GTX 980 which I linked to in the past. That card does indeed go up to an insane 342W on FurMark, which is enough to violate the PCIe specification (the card has one 8-pin and one 6-pin connector, which is a total of 300W allowed power usage). But none of the other 980 cards do this, which means that Gigabyte screwed up by disabling the power limiter entirely on this card.
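For reference, assuming the usual per-connector limits (75 W from the slot, 75 W from a 6-pin, 150 W from an 8-pin), the allowed total works out to:

$$75\ \text{W} + 75\ \text{W} + 150\ \text{W} = 300\ \text{W}, \qquad 342\ \text{W} > 300\ \text{W}$$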
Again, I reject the argument that an actual program you can run on your system somehow doesn't count just because the vendors don't like the results. If AMD doesn't like it, then they should be using the built-in power limit function to cap power usage, just as Nvidia does with their reference designs.
One problem with using FurMark as a "benchmark" is that at different times AMD and Nvidia have dialed back performance in their drivers to prevent overuse. It is known as a heat virus to both companies, and their drivers have treated it as such, so it's not a very accurate measurement.
That was true at one time, but not any more. From Kepler and GCN 1.0 onward, both Nvidia and AMD simply set a power limit on their cards and throttle back clocks if the limit is exceeded. There's no need to single out FurMark in the drivers; any application could theoretically be throttled if it uses too much juice. Games generally don't hit the power limit, but GPGPU applications sometimes do. If the drivers were throttling FurMark specifically, you'd expect to see other applications using more power than FurMark, but I've never heard of that ever happening on a Kepler, Maxwell, or GCN card.
The fact that throttling happens at the card level and not the driver level can be seen with this Gigabyte GTX 980. The reference GTX 980 only uses 190W; the Gigabyte card jumps up to an absurd 342W. It looks like what happened here is that Gigabyte didn't implement the power limit at all on this card. If FurMark was being throttled at the driver level, then we would expect to see that no matter which AIB manufactured the card, but that's clearly not the case.
The advantage of FurMark is that it exposes the card's power limit (whether or not you choose to call that "TDP" is not really important). You need this to determine whether your PSU and case cooling are adequate for the electrical and thermal load.
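If anyone wants to sanity-check their own build against that worst-case figure, here's a minimal sketch; the wattages below are hypothetical placeholders, not measurements from this thread:

```python
# Rough PSU headroom check (illustrative only; numbers are placeholders).

def psu_headroom(psu_watts, component_watts, margin=0.8):
    """Return the spare capacity after loading the PSU to `margin`
    of its rating (running at ~80% of the label is a common rule of thumb)."""
    usable = psu_watts * margin
    load = sum(component_watts.values())
    return usable - load

components = {
    "gpu_power_limit": 300,   # worst-case draw exposed by a FurMark-style load
    "cpu": 95,
    "board_drives_fans": 60,
}

spare = psu_headroom(psu_watts=650, component_watts=components)
print(f"Spare capacity at 80% PSU load: {spare:.0f} W")
# A negative result would suggest the PSU is undersized for a sustained
# worst-case (power-limit) load.
```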
No crossfire fingers visible in that shot, not that they necessarily would be from that perspective.
It's interesting you would link up some obscure site with overclocked 970s while back-handedly questioning the integrity of virtually all western review sites.
The constant conspiracy theory thing gets old, and frankly it makes you and your references much more questionable than anything else.
AT has a long reputation of exposing poor business practices.
The most recent example was determining how phone makers were borking benchmarks by having the OS detect that a benchmark is running and push the phone beyond its normal thermal limits. They did a lot of research to determine that. What did your site do? Nothing. That's right. Not ... one ... thing.
This is what AnandTech and Tom's sites say about R9 290 vs GTX 970 power, noise, and heat. They are both within a few watts of each other.
Which is important if you buy GPUs to play FurMark.
Is there a reason people aren't comparing to the current 290Xs with 8GB?
While it's theoretically possible that AMD updated the hardware scaler block, UVD block, and added TrueAudio and FreeSync while still leaving the architecture at GCN 1.0 and omitting XDMA, it seems very unlikely.
This is definitely legit, at least.
I totally understand your point, especially if we were discussing a gaming rig with a 750Ti and a 35-65W Intel CPU against a high-end gaming rig. But how many gamers buying $300+ GPUs will be in that position?
If you are going to argue that a high-end gaming rig will heat up a small room, that applies to most high-end gaming systems, including a GTX970/980 rig.
If you are going to use that argument, then you need to start adding up speaker and monitor power usage too in relative terms. My Westinghouse 37" LVM-37W3 monitor alone peaks at 210W. Once all of that is added up, do you think a difference of 50-60W matters a great deal in heating up a room when the entire gaming rig with speakers and the monitor uses 500W+ of power already?
That's why I can't understand how all the perf/watt proponents ignore the overall power usage of a gaming system, including the monitor + speakers, etc. Why don't they want to measure perf/watt in the context of the overall system power usage? I can't game without my monitor or an i5/i7 rig, mobo, etc. If I am legitimately going to compare the performance per watt of generating IQ/FPS, I need to look at the overall power usage, since that's how we actually use our computers. That's why I feel sites that measure perf/watt on a video card basis alone are being disingenuous to their readers.
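To put a rough number on it, using the ~500 W whole-system figure above and a 50-60 W difference between cards:

$$\frac{50\ \text{W}}{500\ \text{W}} = 10\%, \qquad \frac{60\ \text{W}}{500\ \text{W}} = 12\%$$

i.e. roughly a tenth of the heat the setup is already dumping into the room.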
Doesn't solve the 3.5GB memory issue. Also, AMD's AIBs tend to be more aggressive with rebates. Even if the R9 390 is $329, it shouldn't be too long before $20 rebates are offered.
That's another key weakness of the 970: the after-market versions are often not that much faster than a reference version, and can be slower than a stock reference 290X.
Also, if we are going to bring up after-market 970s, we can't use NV's 145W marketing TDP. Look at the power usage of after-market 970 cards - it's easily 180-190W.
NV marketing => show after-market 970 levels of performance while quoting the 145W TDP in reviews. The average Joe buys into that. AMD's marketing team needs to discredit this tactic.
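Taking those 180-190 W after-market figures at face value, the gap versus the quoted 145 W TDP is roughly:

$$\frac{180 - 145}{145} \approx 24\%, \qquad \frac{190 - 145}{145} \approx 31\%$$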
Like you have said so many times, people shouldn't get stuck focusing on one metric (in this case GPU power usage) when the thing that actually matters is whole-system power usage. It's the same as those marketing graphs that are zoomed in so a 2% difference looks huge when in fact it isn't. In gaming, a 40-50W difference is not a big deal when you're comparing system power usage.
