When will we see Fury reviews?


happy medium

Lifer
Jun 8, 2003
4GB does not appear to be a limiting factor in today's games.

Unless I'm losing my mind, I could swear there were times when a GTX 970's 3.5GB of memory was tanking games. People were saying don't buy it, it won't hold up to the modern games of tomorrow. That was months ago... suddenly 4GB is enough for an expensive high-end 4K rig months later? Am I missing something here? Are we going backwards? Is everyone hoping for a DirectX 12 miracle? Is the 6GB of memory on the GTX 980 Ti even useful, or is that a gimmick? A month ago I heard a lot of people on the forums with high-end rigs saying they liked the idea of the GTX 980 Ti having 6GB of memory. Why would they care if 4GB was enough?

I think if I had a high-end rig with over a thousand dollars in GPUs to run at 4K, I would want over 4GB of memory in my rig. Am I alone in this? Is it just me?
 

2is

Diamond Member
Apr 8, 2012
Unless I'm losing my mind, I could swear there were times when a GTX 970's 3.5GB of memory was tanking games. People were saying don't buy it, it won't hold up to the modern games of tomorrow. That was months ago... suddenly 4GB is enough for an expensive high-end 4K rig months later? Am I missing something here? Are we going backwards? Is everyone hoping for a DirectX 12 miracle? Is the 6GB of memory on the GTX 980 Ti even useful, or is that a gimmick? A month ago I heard a lot of people on the forums with high-end rigs saying they liked the idea of the GTX 980 Ti having 6GB of memory. Why would they care if 4GB was enough?

The problem with 3.5GB on the 970 is that if you needed to use the extra half gig, it tanked the card. The same settings that tanked the 970 played well on a 980 with full-speed 4GB. But you bring up a good point, as this illustrates just how close we are to saturating 4GB.

You're certainly not alone on this. It has been my point of contention all along. The AMD guys want to hear none of it, though, because Fury is a 4GB part.
 

sze5003

Lifer
Aug 18, 2012
I feel like the Fury should have been a 6GB card, seeing how my current one is 3GB now and it kind of peaks on Witcher 3. But also, seeing how the Fury X is watercooled and could be better than the 980 Ti, I'd rather go that route. I'm still waiting to see reviews tomorrow.
 

chimaxi83

Diamond Member
May 18, 2003
That isn't the case at all. I'm not talking about 5 years from now. With DX12 in everyone's hands in July, who knows how much more complex games are going to get. VRAM requirements could easily rise in a very short period of time. We already have >2GB NEEDED even at 1080p. If you feel comfortable that 4GB is going to be enough before you decide to buy your next GPU, then you have nothing to concern yourself with. I'm not that confident.

I'm sure you've heard all these arguments before, so I'm not sure why you're asking the same questions again. Nobody's mind is going to change here.

I understand, but with DX12 in everyone's hands in July, who knows what will happen... period. Where was all this VRAM talk with the 290X, 980, 780? The forum is having a VRAM meltdown before anybody actually knows anything. Leave the "AMD guys" crap out of it.
 

2is

Diamond Member
Apr 8, 2012
I understand, but with DX12 in everyone's hands in July, who knows what will happen... period. Where was all this VRAM talk with the 290X, 980, 780? The forum is having a VRAM meltdown before anybody actually knows anything. Leave the "AMD guys" crap out of it.

There was plenty of VRAM talk when the 980 launched without an 8GB option. This isn't a new argument, and we are now 9 months past the release of the 980. So if it was talked about then, it's going to be a bigger deal now. Requirements go up, not down, with time.
 

jackstar7

Lifer
Jun 26, 2009
Unless I'm losing my mind, I could swear there were times when a GTX 970's 3.5GB of memory was tanking games. People were saying don't buy it, it won't hold up to the modern games of tomorrow. That was months ago... suddenly 4GB is enough for an expensive high-end 4K rig months later? Am I missing something here? Are we going backwards? Is everyone hoping for a DirectX 12 miracle? Is the 6GB of memory on the GTX 980 Ti even useful, or is that a gimmick? A month ago I heard a lot of people on the forums with high-end rigs saying they liked the idea of the GTX 980 Ti having 6GB of memory. Why would they care if 4GB was enough?

I think if I had a high-end rig with over a thousand dollars in GPUs to run at 4K, I would want over 4GB of memory in my rig. Am I alone in this? Is it just me?

This is a misunderstanding of the issue. With the 970s, the system/engine was operating under the belief that the card had a full 4GB to access. It was when it accessed that last half-gig that you'd hit the wall. If the card were recognized as a 3.5GB card, it wouldn't hit that wall as hard, because the engine would handle things differently with its compression algorithms from the start. It's the same reason someone can play on 780 Tis with the same settings as someone on 970s and not run into problems.
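jackstar7's point can be sketched as a toy model. Nothing below is real engine or driver code; the 3.5GB/0.5GB split is the GTX 970's well-documented memory configuration, and the "headroom" rule is a made-up stand-in for how a streaming engine might size its working set:

```python
# Toy model: a streaming engine sizes its texture working set against the
# VRAM the card reports, minus some headroom. The GTX 970's fast segment
# is 3.5 GB; anything past that lands in the slow 0.5 GB segment.

FAST_SEGMENT_GB = 3.5  # full-speed portion of the 970's 4 GB

def spills_into_slow_segment(working_set_gb):
    return working_set_gb > FAST_SEGMENT_GB

# Hypothetical engine rule: "use reported VRAM minus 0.25 GB of headroom".
working_set_if_reported_4gb = 4.0 - 0.25    # 3.75 GB
working_set_if_reported_3_5gb = 3.5 - 0.25  # 3.25 GB

print(spills_into_slow_segment(working_set_if_reported_4gb))    # True  -> hits the wall
print(spills_into_slow_segment(working_set_if_reported_3_5gb))  # False -> stays fast
```

The same settings, under the same engine rule, either spill or don't depending purely on what the card reports, which is the behavior described above.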
 

xthetenth

Golden Member
Oct 14, 2014
That still doesn't answer the question. If I have a 12GB card and a game is using 5GB of VRAM and my performance is perfectly fine, how do you know a 4GB card will be enough? That's the question. The only way you would know is to have a card with 4GB of VRAM whose GPU is similar in performance, like a 980 Ti and a Titan X, or, even more similar than that, an 8GB 290X and a 4GB 290X. But that isn't practical.

So again, unless the people complaining about VRAM reporting in reviews have a way around this, they really shouldn't be complaining.

You can't prove it with a 12GB card; you prove it by pairing 4GB cards and seeing whether they show the symptoms of lacking VRAM.

It's not hard to show: CF 290X 4GB vs. CF 290X 8GB. Where does the 4GB fall off a cliff compared to the 8GB? There ya go.

Unless I'm losing my mind, I could swear there were times when a GTX 970's 3.5GB of memory was tanking games. People were saying don't buy it, it won't hold up to the modern games of tomorrow. That was months ago... suddenly 4GB is enough for an expensive high-end 4K rig months later? Am I missing something here? Are we going backwards? Is everyone hoping for a DirectX 12 miracle? Is the 6GB of memory on the GTX 980 Ti even useful, or is that a gimmick? A month ago I heard a lot of people on the forums with high-end rigs saying they liked the idea of the GTX 980 Ti having 6GB of memory. Why would they care if 4GB was enough?

With the 970, the worry was as much that the game would see 4GB without driver tweaks and merrily fill up the last .5GB, since using that .5GB slows down the other 3.5GB, which is a lower bar than a game outright needing 4GB.

Honestly for me though it was the dishonesty of the whole thing.
 

2is

Diamond Member
Apr 8, 2012
It's not hard to show: CF 290X 4GB vs. CF 290X 8GB. Where does the 4GB fall off a cliff compared to the 8GB? There ya go.

It's also not practical when you're reviewing a Fury, Titan X or 980 Ti. That's my point. Unless you can come up with a practical way to test, you're barking up the wrong tree when complaining about VRAM reporting.

What you're suggesting is that for every 980 Ti test that shows >4GB of VRAM usage, everyone also review two more cards, a 4GB 290X and an 8GB 290X, to see if the game indeed NEEDS that much or is simply utilizing that much. That triples the amount of work. It isn't practical.

With the 970, the worry was as much that the game would see 4GB without driver tweaks and merrily fill up the last .5GB, since using that .5GB slows down the other 3.5GB, which is a lower bar than a game outright needing 4GB.

The other 3.5GB still operates at full speed. Only the information stored in the .5GB carries the performance penalty, and since a performance penalty has been documented, it's not unreasonable to assume it was "needed" and not merely utilized. Add to that that the drivers won't actually allocate that half gig unless necessary, and you have even further reason to assume it wasn't merely utilized, but needed as well.
 

jackstar7

Lifer
Jun 26, 2009
It's also not practical when you're reviewing a Fury, Titan X or 980 Ti. That's my point. Unless you can come up with a practical way to test, you're barking up the wrong tree when complaining about VRAM reporting.

What you're suggesting is that for every 980 Ti test that shows >4GB of VRAM usage, everyone also review two more cards, a 4GB 290X and an 8GB 290X, to see if the game indeed NEEDS that much or is simply utilizing that much. That triples the amount of work. It isn't practical.

Then I guess if someone's not willing to be thorough, they should also refrain from making conclusions that aren't founded in a thorough collection of relevant data...
 

2is

Diamond Member
Apr 8, 2012
Then I guess if someone's not willing to be thorough, they should also refrain from making conclusions that aren't founded in a thorough collection of relevant data...

They are reporting VRAM usage as shown by software that is available to them and that we all use. Why does it become the reviewer's responsibility to test cards that aren't even part of the review, just to appease AMD fans who maintain that 4GB is plenty?

Do you know anyone who was that thorough? I don't. So by your logic, there should be no reporting of VRAM utilization at all. That's not going to happen. As I said, you're barking up the wrong tree.

AMD decided to equip a card that will be used well into 2016 with only the same amount of VRAM we saw in cards that came out in late 2014.
 

at80eighty

Senior member
Jun 28, 2004
That isn't the case at all. I'm not talking about 5 years from now. With DX12 in everyone's hands in July, who knows how much more complex games are going to get.

And what is your realistic estimate of the adoption rate for devs to take on DX12?
 

iiiankiii

Senior member
Apr 4, 2008
It's also not practical when you're reviewing a Fury, Titan X or 980 Ti. That's my point. Unless you can come up with a practical way to test, you're barking up the wrong tree when complaining about VRAM reporting.

What you're suggesting is that for every 980 Ti test that shows >4GB of VRAM usage, everyone also review two more cards, a 4GB 290X and an 8GB 290X, to see if the game indeed NEEDS that much or is simply utilizing that much. That triples the amount of work. It isn't practical.

Yo, that's what reviewers do. They investigate and do the grinding for us. Any review site that goes the extra mile will get my attention, and they will be rewarded for their efforts. Not only do they get more clicks on the article, they become known as the reviewers who investigate what readers want. We want reviewers to do a thorough investigation. Considering that a brand new GPU arch rarely gets released, doing a bit of extra work to help their readers make the right purchasing decision isn't too much to ask.
 

happy medium

Lifer
Jun 8, 2003
From the HardOCP review. Yeah, the reviewers do the testing for us.
He said next up is 4K in SLI.
http://www.hardocp.com/article/2015..._980_ti_video_card_gpu_review/12#.VYIVTOlRFjo



At 4K, neither the 980 Ti nor the TITAN X is actually fast enough to truly enjoy the latest graphically demanding games at high IQ settings, so the VRAM point becomes moot. The main problem at 4K is just the sheer amount of GPU performance needed to push all those pixels. You will need two GTX 980 Ti cards or two TITAN X cards in SLI to genuinely enjoy newer shooter games at high IQ settings at 4K. This is where the benefit of the 6GB of VRAM over the 4GB on the GTX 980 will come in handy. It may, though, actually not be enough at 4K, but we will test that when we test SLI.





VRAM

As you can see from our gaming lineup (which uses many new games released this year), we aren't seeing much demand over 4GB just yet. There are some hints that some games might need more; Dying Light, for example, and possibly Far Cry 4 and GTA V. We can at least say this: 4GB of VRAM should be the MINIMUM for running games at 1440p today. If you are able to have 6GB of VRAM or more, you will be assured that games coming this year and next should run fine, as far as VRAM goes, at 1440p.



At 4K, though, 4GB of VRAM is clearly not enough. At 4K you want, at a MINIMUM, 6GB. It is possible, though, that more may actually help as you start increasing the number of video cards in SLI. 6GB might actually not be enough for some games at 4K when SLI is involved; we will see.
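As an aside on the quoted numbers: resolution alone isn't what fills VRAM, since the 4K render targets themselves are a small slice of it; texture pools do the filling. The arithmetic is easy to check (RGBA8 buffers assumed; the six-target setup is only an illustrative figure, not any particular engine):

```python
# Back-of-envelope: memory for 4K render targets, assuming 4 bytes/pixel (RGBA8).
width, height = 3840, 2160
bytes_per_pixel = 4

framebuffer_mb = width * height * bytes_per_pixel / 2**20
print(round(framebuffer_mb, 1))  # 31.6 MB for one full-screen buffer

# Even a fat deferred setup (say six full-screen targets) stays under 200 MB:
gbuffer_mb = framebuffer_mb * 6
print(round(gbuffer_mb, 1))  # 189.8 MB -- the rest of a 4 GB card is textures and meshes
```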
 

xthetenth

Golden Member
Oct 14, 2014
It's also not practical when you're reviewing a Fury, Titan X or 980 Ti. That's my point. Unless you can come up with a practical way to test, you're barking up the wrong tree when complaining about VRAM reporting.

What you're suggesting is that for every 980 Ti test that shows >4GB of VRAM usage, everyone also review two more cards, a 4GB 290X and an 8GB 290X, to see if the game indeed NEEDS that much or is simply utilizing that much. That triples the amount of work. It isn't practical.

Do that once. There ya go, done. Do it as a special investigative piece, like people are already doing on the subject, but don't willfully misrepresent things. It's two benchmark runs vs. one.

Looking into important parts of performance like that is literally a reviewer's job description.

The other 3.5GB still operates at full speed. Only the information stored in the .5GB carries the performance penalty, and since a performance penalty has been documented, it's not unreasonable to assume it was "needed" and not merely utilized. Add to that that the drivers won't actually allocate that half gig unless necessary, and you have even further reason to assume it wasn't merely utilized, but needed as well.

Every tick goes to either the 3.5GB or the .5GB. Accesses to the .5GB cost a cycle the 3.5GB could have used.
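That cycle-sharing claim translates into a simple bandwidth model. The 196 GB/s and 28 GB/s peaks are the commonly cited figures for the 970's two segments; the harmonic mix below assumes the bus serves one segment at a time, as the post describes, so it's a sketch of the argument rather than a measurement:

```python
# If every memory cycle goes to one segment or the other, effective bandwidth
# is the harmonic mix of the two segment speeds, weighted by traffic share.

FAST_BW_GBS = 196.0  # 3.5 GB segment, commonly cited peak
SLOW_BW_GBS = 28.0   # 0.5 GB segment, commonly cited peak

def effective_bandwidth(slow_fraction):
    """GB/s when `slow_fraction` of all traffic hits the slow segment."""
    time_per_gb = (1 - slow_fraction) / FAST_BW_GBS + slow_fraction / SLOW_BW_GBS
    return 1 / time_per_gb

print(round(effective_bandwidth(0.0), 1))    # 196.0 -- stay inside 3.5 GB, nothing changes
print(round(effective_bandwidth(0.125), 1))  # 112.0 -- 1/8 of traffic in the slow segment
```

Under this model, routing just one eighth of the traffic through the slow segment nearly halves effective bandwidth, which is why a small spill can feel like a cliff.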
 

2is

Diamond Member
Apr 8, 2012
Yo, that's what reviewers do. They investigate and do the grinding for us. Any review site that goes the extra mile will get my attention, and they will be rewarded for their efforts. Not only do they get more clicks on the article, they become known as the reviewers who investigate what readers want. We want reviewers to do a thorough investigation. Considering that a brand new GPU arch rarely gets released, doing a bit of extra work to help their readers make the right purchasing decision isn't too much to ask.

I'm sure they'll get a lot of people's attention. I'm just saying it's not practical, and I wouldn't expect reviewers to do it. The early bird gets the worm, right? The sooner they get that review up, the more hits they'll get and the more money they'll make. It's not uncommon these days to see a quick "preview" to harvest some hits while a more in-depth review is put together.

It's fun to talk about what is ideal, but reality is far more likely. That's a LOT of extra work taking a LOT of extra time, and AMD is on the short end of the stick here with the decision to top out at 4GB.
 

2is

Diamond Member
Apr 8, 2012
Do that once. There ya go, done. Do it as a special investigative piece, like people are already doing on the subject, but don't willfully misrepresent things. It's two benchmark runs vs. one.

Looking into important parts of performance like that is literally a reviewer's job description.

Once? You'd have to do that with literally every game that uses more than 4GB to determine if it actually NEEDS it.
 

at80eighty

Senior member
Jun 28, 2004
At 4K, though, 4GB of VRAM is clearly not enough. At 4K you want, at a MINIMUM, 6GB. It is possible, though, that more may actually help as you start increasing the number of video cards in SLI. 6GB might actually not be enough for some games at 4K when SLI is involved; we will see.

Not to question Kyle (?), but aren't volume tiled textures supposed to improve VRAM requirements quite a bit over traditional texture loads? If I understand this right, with the right optimizations we don't need more just for its own sake.
 

gamervivek

Senior member
Jan 17, 2011
Tiled resources are supposed to decrease VRAM requirements; the devs would just fill it up again, though.
 

iiiankiii

Senior member
Apr 4, 2008
I'm sure they'll get a lot of people's attention. I'm just saying it's not practical, and I wouldn't expect reviewers to do it. The early bird gets the worm, right? The sooner they get that review up, the more hits they'll get and the more money they'll make. It's not uncommon these days to see a quick "preview" to harvest some hits while a more in-depth review is put together.

It's fun to talk about what is ideal, but reality is far more likely. That's a LOT of extra work taking a LOT of extra time, and AMD is on the short end of the stick here with the decision to top out at 4GB.

Agreed. But as a potential GPU buyer, don't you want that information? Quit talking from the reviewers' point of view. Speak from YOUR own point of view. Do you want it done or not?

Besides, make it a special investigation. "Reality is far more likely"? Yeah, no. It will happen, because it sounds like people want it done right now. That's their job. Neglecting such an investigation is a disservice to their readers. More work for the reviewers? Yes. Too much work for the reviewers? Get the hell out of here.

If we were to base 4GB on current games, chances are that, outliers aside, it would be enough for 4K gaming. That's where a special report comes into play: exploring those outliers more deeply.

With the VRAM pushed, is there enough GPU grunt to push those pixels? Nobody cares if the game runs at sub-30FPS with dual-GPU setups once VRAM use climbs above the limit.

How many games are actually pushing the VRAM limit?

If some games are hitting those limits, what can best be done to mitigate those problems? Lower textures? Lower anti-aliasing?
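Those questions amount to a simple triage over logged peak-usage numbers. A sketch with illustrative values (the game names and figures below are placeholders, not measured results):

```python
# Flag titles whose measured peak VRAM use exceeds a 4 GB card, as candidates
# for a deeper look (texture quality and AA settings are the usual levers).

peak_vram_mb = {       # placeholder numbers, not benchmarks
    "Game A": 3200,
    "Game B": 4600,
    "Game C": 3900,
}
BUDGET_MB = 4096  # a 4 GB card

over_budget = sorted(g for g, mb in peak_vram_mb.items() if mb > BUDGET_MB)
print(over_budget)  # ['Game B']
```

A special report would then rerun only the flagged titles at stepped-down texture and AA settings to see which knob buys the VRAM back.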
 

at80eighty

Senior member
Jun 28, 2004
Tiled and volume tiled are different, IIRC.

But I do see the issue with lazy development. It's not an indictment of the HW, though.
 
Feb 19, 2009
From the HardOCP review. Yeah, the reviewers do the testing for us.
He said next up is 4K in SLI.
http://www.hardocp.com/article/2015..._980_ti_video_card_gpu_review/12#.VYIVTOlRFjo

The main problem at 4K is just the sheer amount of GPU performance needed to push all those pixels. You will need two GTX 980 Ti cards or two TITAN X cards in SLI to genuinely enjoy newer shooter games at high IQ settings at 4K. This is where the benefit of the 6GB of VRAM over the 4GB on the GTX 980 will come in handy. It may, though, actually not be enough at 4K, but we will test that when we test SLI.

So let's see the test results.

Because that review clearly shows that [H] is shilling: 6GB isn't a minimum, as they stated, when in fact the R9 295X2 and 980 SLI are faster than the 980 Ti at 4K.

They should have reserved their judgement until their DATA showed otherwise. What they did was make a statement with zero evidence, and in fact with their own DATA as evidence against them.
 

xthetenth

Golden Member
Oct 14, 2014
Once? You'd have to do that with literally every game that uses more than 4GB to determine if it actually NEEDS it.

How many AAA games get tested like that? Two or three a year?

And honestly, the main time it matters is around launch time. Get that done and it's an article that should get a lot of hits.