3DVagabond
Lifer
I wonder where's that guy who said 4gb will be enough for this console generation.
I don't know if you're talking about me, but I'll admit to saying it. Doesn't mean they can't pull some marketing settings to sabotage it though.
I wonder where's that guy who said 4gb will be enough for this console generation.
Refusing to run outright or refusing to run with any stability is basically the same in this case. The point is that the game should run perfectly stable, but as a slideshow.
Nope, instead all we got was "Battlefield 1 is going to run crap on AMD." Okay.
Go ahead, run a game where you run out of VRAM and let me know if you can get any meaningful data from it.
I'm using a non-reference 970 (Asus Strix) and I'm getting 70+ on average on Hyper settings.
Hyperbole Mode.
Not sure why everyone is worried about this. Several games released in recent memory "require" certain settings until someone figures out a way to run it with a .ini edit or startup command. Doom's Nightmare Graphics Mode "requires" 5GB, but runs just fine on cards with 4GB (or 3.5GB). GTAV wouldn't run on dual core machines, until someone made a fix. I give it a week.
I missed this part, can you please highlight for us?
Of course the real loser here is Fury/Fury X, other 4GB VRAM VGAs probably don't have the horsepower to push Hyper settings anyway.
+1 :thumbsup:
Agreed, though many of us have been getting it since the almost punch-drunk obsession with the "can it run Crysis 3" thing, and the Crysis 2 thing before that. (Take the pretty charts brand fanboys just love to argue over, screw them up and throw the whole lot in the bin.) What are the actual visual differences? "Show me." Everything else is just repeating someone else's "rat race" sales pitch...
All I've seen in the links so far is this:-
http://media.gamersnexus.net/images/media/2016/game-bench/me-catalyst/me-catalyst-bench-course.jpg
http://www.gamegpu.com/images/stori...rrorsEdgeCatalyst_2016_04_22_21_05_07_098.jpg
http://www.gamegpu.com/images/stori...rrorsEdgeCatalyst_2016_04_22_21_09_11_179.jpg
http://www.gamegpu.com/images/stories/Test_GPU/Slider/MEC/1/3.jpg
http://www.gamegpu.com/images/stories/Test_GPU/Slider/MEC/1/6.jpg
http://www.gamegpu.com/images/stories/Test_GPU/Slider/MEC/2/6_1.jpg
http://www.gamegpu.com/images/stories/Test_GPU/Slider/MEC/3/7_1.jpg
http://www.gamegpu.com/images/stories/Test_GPU/Slider/MEC/4/8_3.jpg
http://www.gamegpu.com/images/stories/Test_GPU/Slider/MEC/4/8_4.jpg
Unless I've got sucked into a parallel universe, or am looking at completely the wrong game, what in God's name could justify straining even a 2GB VRAM card at 1080p with those visuals? 😵 If you 'need' 8GB VRAM and up to 10GB RAM for a "blur-saturated outdoor equivalent of Portal 1", then your game is catastrophically unoptimised... :thumbsdown:
Edit 2: Seriously, I can't believe people are cheering on the "need" for $600+ GFX cards for this (min vs med vs max / min vs med vs max). What am I supposed to be looking at? Am I really seeing Skyrim + 2K textures get higher fps on an i3-4170 + 750Ti looking like this, than an i7 + GTX 970 does in Mirrors Edge Catalyst looking like this on "Максимальное качество" ("Maximum Quality")? 😕
I missed this part, can you please highlight for us?
Of course the real loser here is Fury/Fury X, other 4GB VRAM VGAs probably don't have the horsepower to push Hyper settings anyway.
Will future DICE games (Battlefield 1) also benefit/need 8GB+ VRAM VGAs to run at max settings?
Well, let's see. This is what we call a "leading question", where the answer is built into the question; the subtext is obvious. The subtext of course being that the Fury X and RX 480 are going to choke in Battlefield 1. It's textbook FUD.
I don't know if you're talking about me, but I'll admit to saying it. Doesn't mean they can't pull some marketing settings to sabotage it though.
No i don't think it was you. In any case its kinda silly to assume one could coast by on a 4gb card throughout this generation because Watch Dogs was just the beginning. More unoptimized games are coming our way lol.
Hyperbole Mode.
Not sure why everyone is worried about this. Several games released in recent memory "require" certain settings until someone figures out a way to run it with a .ini edit or startup command. Doom's Nightmare Graphics Mode "requires" 5GB, but runs just fine on cards with 4GB (or 3.5GB). GTAV wouldn't run on dual core machines, until someone made a fix. I give it a week.
I don't recall GTA V requiring a fix to work on dual core. I didn't do anything and it ran fine on my Pentium. Far Cry 4 required a fix, after which it ran great. Also, Rise of the Tomb Raider technically shouldn't have worked on my 1GB card, as medium settings required 2GB and low settings 1.5GB, yet I ran it on medium with just 1GB of VRAM, that too on a Pentium. I hope my 1GB card and Pentium last me a few more years...
If it's truly a legit requirement, then it runs the risk of impacting sales due to dumb consumers.
You guys think 4gb is enough on the 480 for 1080p gaming?
As long as every checkbox doesn't need to be checked, yes.