
[Rumor (Various)] AMD R7/9 3xx / Fiji / Fury

The 3.5 GB segment runs at LOWER speeds:

3.5 GB at 196 GB/s
0.5 GB at 28 GB/s

That is not 224 GB/s; you don't get to claim 4 GB at 224 GB/s.
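
For anyone who wants to check the arithmetic, here is a minimal sketch in Python, assuming the published 7 Gbps effective GDDR5 data rate and the 224-bit/32-bit split of the 970's 256-bit bus:

```python
# Peak bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.
DATA_RATE_GBPS = 7  # effective GDDR5 data rate per pin

def peak_bandwidth_gb_s(bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s for a bus of the given width."""
    return DATA_RATE_GBPS * bus_width_bits / 8

print(peak_bandwidth_gb_s(256))  # 224.0 -> the advertised "4 GB at 224 GB/s"
print(peak_bandwidth_gb_s(224))  # 196.0 -> the 3.5 GB segment alone
print(peak_bandwidth_gb_s(32))   #  28.0 -> the 0.5 GB segment alone
```

The advertised 224 GB/s would require both segments to stream at once, and the 970's crossbar can't access both partitions simultaneously, so the full figure is never reachable in practice.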

CoD: Advanced Warfare had to gimp all GPUs as a workaround, limiting GPU VRAM use to 75% (you can fix it in the config), because the developers didn't know why the 970 got messed up when it used its full VRAM (the game has options to use full VRAM to increase framerates and cache more files).
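
In other words, the game just budgets a fraction of whatever VRAM the driver reports. A toy sketch of that kind of cap (purely illustrative; the names here are made up, not the game's actual config or code):

```python
# Illustrative only: cap the renderer's VRAM budget at 75% of what the
# card reports, the kind of blanket workaround described above.

def vram_budget_bytes(reported_vram_bytes: int, cap_fraction: float = 0.75) -> int:
    """Return the VRAM the streamer/renderer is allowed to use."""
    return int(reported_vram_bytes * cap_fraction)

GIB = 1024 ** 3
print(vram_budget_bytes(4 * GIB) / GIB)  # a 4 GB card is budgeted as 3.0 GB
```

Raising that fraction back toward 1.0 in the config is the "fix" mentioned above, at the risk of running into the 970's slow segment.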

I see, I didn't think about it also impacting the overall performance of the 3.5 GB portion, though it makes perfect sense given the crossbar configuration.

No wonder so many people were upset, I'd have returned it as well.
 
 
I kept hearing 3 dates: 16th reveal, 18th review, 24th purchase.

I guess it might be wrong.

No, it's accurate; the dates just have different meanings:

16th: 300 series and Fury reveal
18th: 300 series release and reviews
24th: Fury X release and (presumably) reviews
 
Why is it so hard for some forum posters to believe that the Fury X actually beats a GTX 980 Ti, even though it's close?

Look at the Fury GPU, its HBM memory, etc. Josh at PC Perspective published a great article today about the architecture.
 
Some people don't want to be beat, whatever. Who cares? Buy what you can afford and move on.
 
If both are from AMD (we know the E3 one is, but we only have PCW's word for the slide), then why would they show the card performing worse at a lower resolution?

Maybe these benches were taken with an FX CPU? (I kid, I kid.)

It sounds odd and contradicts the slide performance from E3. I guess we wait for more official, detailed benchmarks? I wish we had reviews this week!

🙂
 

Maybe the E3 slide was with an overclocked Fury X?

That would be a ~20% overclock 😱

And 35% over the 980 Ti...
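
(Reading those two figures together: if the slide's result is 35% over the 980 Ti and the overclock is worth ~20%, the implied stock Fury X would sit roughly 12.5% ahead. A quick check, treating both percentages as given:)

```python
# Back out the implied stock result from the two figures in this post.
slide_vs_980ti = 1.35     # the E3 slide's claimed lead over the 980 Ti
assumed_overclock = 1.20  # the ~20% overclock hypothesized above

stock_vs_980ti = slide_vs_980ti / assumed_overclock
print(f"{stock_vs_980ti:.3f}")  # 1.125 -> stock ~12.5% ahead of the 980 Ti
```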
 

I do not think AMD would use an overclocked Fury X for the first-ever benchmarks. Not without saying so, anyway. That amounts to fraud. It seems much more likely that the slide is fake.
 
Are the Fury and the Fury X the same chip, just cooled differently? I note that the Nano is not named Fury; it's the R9 Nano despite being a Fiji chip. That makes me hope the Fury and Fury X are in fact the same chip (that Fiji is not cut down for the non-X). I haven't found anything on this yet - anyone know?
 
Why is it so hard for some forum posters to believe that the Fury X actually beats a GTX 980 Ti, even though it's close?

Look at the Fury GPU, its HBM memory, etc. Josh at PC Perspective published a great article today about the architecture.
The funny thing is: who cares? The cards are neck and neck, it seems, and the 980 Ti came out earlier anyway. If you got one, who cares what the Fury is? You couldn't have known, and you either a) wanted your GPU to play new games or b) wanted it because you like NVIDIA, or whatever. Either way, not a big deal.

The Fury X has simply upped the game and forced NVIDIA to improve the cooling solutions on their high-end cards and offer more performance.

Even if you're an NVIDIA fanboy, this launch means good things for you; I'm not sure why they're so upset.
 
I guess it all comes down to that asterisk.

[Image: benchmark slide with an asterisk next to the results]


Whatever that asterisk is, it is apparently good for a 20% increase in FPS. I bet they disabled all AA, which is the right thing to do anyway.
 
AA usually isn't part of Ultra presets, so that's very likely, and it's not bad at all for an actual framerate people would want to play at.
 
So the price is the same as the 980 Ti and the performance is basically the same? What does it come down to after that: which is more overclockable, runs at a lower temp, and has better driver support?
 
I don't think they are exactly the same. The Fury seems to do better, but to confirm, I'm waiting for reports from reviewers next week.
 

Well, you know the Fury will win on temp and noise at the same price point... if those things matter to you.
 
HDMI 2.0.

If I had known this feature was missing, I would never have followed this launch.
 
Where are those Samsung FreeSync monitors? Now might be a good time to get them to market... to remind people there's a benefit to DP you can't get on HDMI or DVI...
 
HDMI 2.0.

If I had known this feature was missing, I would never have followed this launch.

Oh, cry me a river.
1) TVs are not for gaming. They are terrible for gaming.
2) Since you're not going to be using the TV for gaming, especially not 4K gaming, you're not going to need 60 Hz at 4K, which is practically the only difference between HDMI 2.0 and HDMI 1.4.

Bonus: HDMI 2.0 does not support FreeSync.

HDMI 2.0 is outdated and obsolete.
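
For what it's worth, the 4K/60 Hz point is plain bandwidth arithmetic. A rough check in Python, using the standard 594 MHz 4K pixel clock (CEA timing, including blanking) and HDMI's 3-channel, 8b/10b TMDS encoding:

```python
# Required TMDS bandwidth vs. each spec's ceiling.
def tmds_gbps(pixel_clock_mhz: float) -> float:
    # 3 TMDS channels, 10 bits transmitted per 8-bit color component
    return pixel_clock_mhz * 3 * 10 / 1000

HDMI_1_4_LIMIT = 10.2  # Gbps (340 MHz max TMDS clock)
HDMI_2_0_LIMIT = 18.0  # Gbps (600 MHz max TMDS clock)

print(tmds_gbps(297))  # 8.91  -> 4K at 30 Hz fits within HDMI 1.4
print(tmds_gbps(594))  # 17.82 -> 4K at 60 Hz needs HDMI 2.0
```

So 4K at 30 Hz squeaks under HDMI 1.4's ceiling, while 4K at 60 Hz only fits within HDMI 2.0's.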
 