
[PCGH.de] Fallout 4 Benchmark

What's a uGrid?
It's an .ini setting in Bethesda games. It basically determines how much of the world around you is loaded in full detail. Raising it used to cause memory and stability problems, but now that FO4 is 64-bit it's stable. If you have the spare performance, it's the single greatest leap in image quality (for outdoor environments). And it's buried in an .ini file.

On my setup, I've experimented a bit and found a setting of 9 to be the best trade-off. It very rarely drops below 60 fps at 1440p (I have god rays at low since it has zero visual impact compared to ultra). Setting it to 11 or 13 is more problematic in big vista scenes. The default of 5 barely has my GPU fan spinning up.
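For anyone wanting to try it: the setting lives in Fallout4.ini under the [General] section. The lines below are a rough sketch from memory of the usual Bethesda tweak, so verify the key names against your own ini and back the file up first:

[General]
uGridsToLoad=9
uExterior Cell Buffer=100

The buffer line is the traditional companion tweak (commonly set to (uGridsToLoad+1)^2); whether FO4 still strictly needs it is an open question.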
 
This thread has nothing to do with underpowered consoles. Did you watch the link below?



Clearly it's not AMD's fault, right? Right? Clearly they are always on top of new game releases, especially ones that aren't in their GE program. Cry babies will be cry babies and play the victim no matter what.

/extreme sarcasm

What?

Can you really not put together that, if the same game engine runs on three different platforms with three different application environments, blaming 100% of the performance issues on the application environment of ONE of the three makes no sense when ALL three exhibit performance issues?

It's called data and evidence. Try using it to make conclusions some time.
 
http://www.pcgameshardware.de/Fallout-4-Spiel-18293/Specials/Test-Benchmark-vor-Release-1177284/

1080p:

[Image: PCGH Fallout 4 benchmark chart, 1080p]

Looks very much like this:
[Image: PCGH Far Cry 4 benchmark chart, 1080p]


And nothing like this:
[Image: GameGPU Far Cry 4 benchmark chart, 1920x1080]


Let's wait for more tests and optimized drivers from the other side.
 
Same old "blame GameWorks" talk when you know perfectly well it's just because AMD hasn't released an optimised driver for it yet, which is AMD's fault. In six months' time it'll almost certainly work great.


You do realise that Bethesda are on record as stating they "worked" with Nvidia? AMD were likely locked out. I'm sure AMD will release a driver, but the delay is more likely down to lack of access to the game.

So yes, I do blame gameworks. Any gamer that thinks gameworks is good for PC gaming is deluded.
 
Catalyst 15.11 actually already has a profile for Fallout 4, even though it isn't mentioned in the release notes. How optimised it is, who knows?
 
Did any of the reviews dig into where the big performance hits come from, and what settings give great frame rates on each card with the least quality drop?

I usually end up doing that myself, as I prefer high frame rates over slight IQ improvements, and many of the settings that kill performance you can't even tell apart unless the comparisons are right on top of each other.

Usually you see min and max settings but not a whole lot else; I did see them compare god rays.
 
I was thinking about god rays and why the consoles have them. I had assumed Bethesda never bothered making their own implementation, but maybe they did and simply took it out of the PC version.

Or the consoles are running the Nvidia version at a sensible setting / with modifications.
 
Is it just me, or do god rays seem completely pointless at anything above low?

All the screenshots I'm looking at show almost no difference in image quality, yet there is a massive performance decrease.
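If you want to eyeball it yourself, god rays can reportedly be adjusted on the fly from the in-game console; the syntax below is from memory, so treat it as a sketch and double-check in-game:

gr off          (disable god rays entirely)
gr on           (re-enable them)
gr quality 0    (set quality; 0 is roughly the "low" preset)

That makes A/B comparisons at the same spot a lot easier than restarting with different launcher settings.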
 
Is it just me, or do god rays seem completely pointless at anything above low?

All the screenshots I'm looking at show almost no difference in image quality, yet there is a massive performance decrease.

Same as NV GameWorks HBAO in Dying Light. Tanks AMD performance by 50%, almost no visual difference.

Even some folks here are blaming AMD when clearly GimpRays tank performance on a 980Ti from 100 fps to 60 fps for negligible visual improvement.

Demand better optimizations from GimpWorks features!

Just like when gamers cried foul over Witcher 3 HairWorks, the developers went back and optimized it, releasing a patch that improved performance without reducing visual quality.

Seriously, who is defending this rubbish? It's acceptable to lose massive FPS for this?

http://images.nvidia.com/geforce-co...-interactive-comparison-003-ultra-vs-low.html

Guess which one just lowered your performance massively even on Maxwell?

[Screenshot 1]


OR is it this?

[Screenshot 2]


This was a parody from before the game's launch. I thought it was humorous, but I was at least expecting some major visual improvement in exchange for the performance loss (HairWorks on creatures in Witcher 3, for example).

[Image: pre-launch parody]
 
You heard it here first: no difference between "GimpRays" and no "GimpRays".

Decide for yourself if you see a difference:
[Image: Fallout 4, god rays off]


[Image: Fallout 4, god rays low]
 
Hilarious, when all their prior games ran great on every vendor's hardware.

I'm gonna stick to my boycott of unethical devs and skip buying this game. They won't miss my purchase at all, but I would personally feel dirty throwing money to reward their behavior.

You'd be "throwing money" to get a game that's fun to play.
 
He said there was no discernible difference between Godrays on Ultra vs Low, not On vs Off.

http://images.nvidia.com/geforce-co...-interactive-comparison-003-ultra-vs-low.html

Go level up your reading comprehension before you spout nonsense.

He said there wouldn't be a difference between "no GimpRays" and "GimpRays":
Even some folks here are blaming AMD when clearly GimpRays tank performance on a 980Ti from 100 fps to 60 fps for negligible visual improvement.
Fun fact: even "low" is using "GimpRays". And I posted a picture with "GimpRays" and one without.

BTW: There are differences between Ultra and Low:
Ultra: http://images.nvidia.com/geforce-co...allout-4/fallout-4-god-rays-quality-ultra.mp4
Low: http://images.nvidia.com/geforce-co.../fallout-4/fallout-4-god-rays-quality-low.mp4
 
He said there wouldn't be a difference between "no GimpRays" and "GimpRays":
Fun fact: even "low" is using "GimpRays". And I posted a picture with "GimpRays" and one without.

BTW: There are differences between Ultra and Low:
Ultra: http://images.nvidia.com/geforce-co...allout-4/fallout-4-god-rays-quality-ultra.mp4
Low: http://images.nvidia.com/geforce-co.../fallout-4/fallout-4-god-rays-quality-low.mp4

No, he didn't say that. You had best quote him directly to prove otherwise.

And the difference is...? All I can see is that the rays and tree branches are positioned a little differently.
 
Fallout 4 doesn't even look like a "real world simulator"...

BTW: Don't use Ultra if you don't like it... nah, that would be just too easy. Why not just blame "GimpRays" for an additional in-game setting which you could easily turn down...
 
No, he didn't say that. You had best quote him directly to prove otherwise.

And the difference is...? All I can see is that the rays and tree branches are positioned a little differently.

On low, the rays coming through the branches are scattered and diffuse more; they look slightly blurry. Staring up at the sun does that.

On ultra, the rays are sharper and don't diffuse the way real light does.
 