
When will we see Fury reviews?

2is

Diamond Member
So after waiting a couple weeks to see what AMD has to offer, all we really have is confirmation that a card called Fury with HBM will exist which was pretty much already known. Any ideas when the actual reviews will start going up?

EDIT: If a mod can correct the typo in the title, it would be appreciated. 🙂
 
Last edited by a moderator:
As soon as 100 separate Fury related threads are created, the NDA will expire. It's a clause in the contract. I've seen it myself.
 
24th for Fury X launch, presumably with launch day reviews, and I think July 16 (not sure on that one's exact date) for Fury non-X.
 
Third link is the source; don't know if they're accurate.

4GB of HBM seems to hold up all the way to 4K, if true.

Can Fire Strike use more than 4GB, though? Will games use higher-quality textures at 4K? It wasn't THAT long ago that my 670s with 2GB of VRAM were enough for 1440p. No longer true...

Given the difference between 6GB and 12GB in Fire Strike, I'm led to believe it doesn't really use that much VRAM. What's a "next gen" game designed for 4K going to use?
 

If those are accurate, is the R9 Fury X matching the 295X2 and other 290X Crossfire setups?

I have 2x 290X Lightnings, and if I can get the same performance without having to pay much more after selling off the current cards, with none of the multi-GPU scaling issues and waiting for game/driver support, then I'll be making every effort to make that switch.
 
If those are accurate, is the R9 Fury X matching the 295X2 and other 290X Crossfire setups?

I have 2x 290X Lightnings, and if I can get the same performance without having to pay much more after selling off the current cards, with none of the multi-GPU scaling issues and waiting for game/driver support, then I'll be making every effort to make that switch.
That's two of us. I think I'm done with multi-GPU, unless DX12 changes something.
 
That's two of us. I think I'm done with multi-GPU, unless DX12 changes something.

From what we have heard, DX12 will let each GPU use its memory as its own. So if each GPU has 4GB and you have two GPUs, you actually have 8GB of usable VRAM, whereas currently you still only have 4GB because everything is duplicated across both GPUs.
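To make the arithmetic in that claim concrete, here's a toy sketch (plain Python, not any real D3D12 API) of the two pooling modes: today's mirrored AFR setup, where every resource is duplicated on each card, versus the rumored DX12 split mode, where each GPU holds distinct resources. The function name and mode labels are made up for illustration.

```python
# Toy model: effective VRAM pool for a multi-GPU setup.
# "mirrored" = current AFR behavior (resources duplicated per GPU).
# "split"    = hypothetical DX12 explicit pooling (capacities add up).

def usable_vram(per_gpu_gb, gpu_count, mode):
    if mode == "mirrored":
        # Every GPU holds a full copy, so the pool is one card's worth.
        return per_gpu_gb
    elif mode == "split":
        # Each GPU holds unique resources, so capacities sum.
        return per_gpu_gb * gpu_count
    raise ValueError(f"unknown mode: {mode}")

print(usable_vram(4, 2, "mirrored"))  # 4 GB -- two 4GB cards today
print(usable_vram(4, 2, "split"))     # 8 GB -- if DX12 pooling lands
```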
 
I always get confused by benchmarks since boost was created.

Are titan x and 980ti boosting? Or are they capped at a certain frequency?

If they are boosting to 1150MHz+ and the Fury X is stuck at 1050MHz, that doesn't look so good for Nvidia.
 
From what we have heard, DX12 will let each GPU use its memory as its own. So if each GPU has 4GB and you have two GPUs, you actually have 8GB of usable VRAM, whereas currently you still only have 4GB because everything is duplicated across both GPUs.

I've read that while technically possible, it comes with its own can of worms and isn't likely to get implemented at the driver level, which would be required for that to become reality. Awesome if it happens, though.
 
To give you an idea of what next-gen games are going to use VRAM-wise, I recorded 6.2GB of usage in GTA V today, maxed out @ 4K.
 
To give you an idea of what next-gen games are going to use VRAM-wise, I recorded 6.2GB of usage in GTA V today, maxed out @ 4K.

VRAM usage != VRAM required.

1430748911U8nIsW8LSm_5_1.gif
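A toy sketch of why those two numbers differ (illustrative only, not any real engine's behavior): many engines keep evictable assets cached up to whatever budget the card exposes, so reported usage grows with the card while the actual working set stays the same.

```python
# Toy model: reported VRAM "usage" vs. the VRAM actually required.
# working_set_gb = data the frame truly needs (the requirement).
# cacheable_gb   = optional textures the engine will cache if room exists.

def vram_used(working_set_gb, cacheable_gb, card_budget_gb):
    # Engine fills any spare budget with evictable cached data.
    spare = max(card_budget_gb - working_set_gb, 0)
    return working_set_gb + min(cacheable_gb, spare)

# Same game, same settings, different cards:
print(vram_used(3.5, 5.0, 12))  # 8.5 -- a 12GB card "uses" 8.5GB
print(vram_used(3.5, 5.0, 4))   # 4.0 -- a 4GB card fills up, still fine
```

Both cards run the same workload; only the monitoring number changes.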
 
This. Why is it so hard for some to understand that game engines scaling to the available memory doesn't mean they NEED that amount on the card?

Do you think it's a coincidence several sites are coming out with articles about VRAM usage in 4K just before Fury launches?
 
This. Why is it so hard for some to understand that game engines scaling to the available memory doesn't mean they NEED that amount on the card?

Some understand it perfectly, because it wasn't an issue for the last 8 months with 970 SLI or 980 SLI, but now a new metric needs to be created, one where AMD can't compete, to justify sticking to the preferred brand. Also, a game might use > 4GB of VRAM, but at that point the settings hammer any single GPU out today (Example #1). Alternatively, a game can use > 6GB of VRAM without running poorly on a 4GB card (Example #2). You can also have a situation where the VRAM usage reported in MSI AB is very high, making you believe a 4GB card would choke, when in reality it wipes the floor with a 6-12GB card (Example #3).

memory.gif


Dead Rising 3 - Example #1
deadrising3_3840_2160.gif


COD AW - Example #2
cod_aw_3840_2160.gif


SoM - Example #3 (R9 295X2 is 63% faster than TX)
som_3840_2160.gif


Do you think it's a coincidence several sites are coming out with articles about VRAM usage in 4K just before Fury launches?

It would matter if their own analysis supported this hypothesis but it seems they just make stuff up.

"On the GeForce GTX 980 Ti we clearly exceed 4GB of VRAM in Dying Light at 1440p and 4K. We also exceed 4GB of VRAM at 4K in GTA V and Far Cry 4. What this shows is that these games, the GTX 980 cards "want to" and can, exceed the 4GB framebuffer if more VRAM is exposed. This means the 4GB of VRAM on the GTX 980 is limiting these cards, but it is not on the 6GB GeForce GTX 980 Ti."

980 SLI > Titan X in GTA V @ 4K. If 4GB of VRAM was limiting the 980, TX would be faster.
1430748911U8nIsW8LSm_5_3_l.gif

1430748911U8nIsW8LSm_5_1_l.gif


980 SLI > Titan X in Dying Light @ 4K

http--www.gamegpu.ru-images-stories-Test_GPU-Videocards-GEFORCE_GTX_TITAN_X-test-dl.jpg


I couldn't find 980 SLI in Far Cry 4 @ 4K but R9 295X2 crushes a Titan X in that game at 4K.

FC4.png


When a professional review site doesn't understand the difference between required VRAM usage vs. dynamic VRAM usage in a modern PC game, that is an eye-opener.
 
Last edited:
This. Why is it so hard for some to understand that game engines scaling to the available memory doesn't mean they NEED that amount on the card?

How are you going to determine how much a game actually "needs"? Serious question. Short of swapping in cards with a similar GPU but less VRAM until you find the point where performance drops, how would you differentiate what a game requires from what it's using? That doesn't sound easy to do without intimate knowledge of the game engine, and even then you're probably only getting a rough estimate. Unless you have a practical way of making such a determination, you're not really in a good position to berate reviews documenting VRAM usage.
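The swap-cards approach described above can be sketched as a simple sweep: benchmark at decreasing memory budgets and report the smallest one that still holds baseline performance. Everything here is illustrative, the function names and frame rates are made up, and a real test would use actual cards and a real benchmark run.

```python
# Toy version of the methodology: sweep VRAM budgets and find the
# smallest one that keeps performance within tolerance of the best.

def find_required_vram(bench, budgets_gb, tolerance=0.05):
    """Return the smallest budget whose fps stays within `tolerance`
    of the fps achieved at the largest budget."""
    baseline = bench(max(budgets_gb))
    for budget in sorted(budgets_gb):
        if bench(budget) >= baseline * (1 - tolerance):
            return budget

def fake_benchmark(budget_gb, required=3.5):
    # Stand-in for a real run: below the working set,
    # swapping over PCIe tanks the frame rate.
    return 60.0 if budget_gb >= required else 15.0

print(find_required_vram(fake_benchmark, [2, 3, 4, 6, 8, 12]))  # 4
```

In practice the drop-off is gradual rather than a cliff, which is exactly why the poster calls any such number a rough estimate.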
 
Do you think it's a coincidence several sites are coming out with articles about VRAM usage in 4K just before Fury launches?

Who cares?

It will either matter when people play the games on their PC, or it won't.

Reviews won't change what happens with the end user.
 