
Why are game developers adding noise in games??

yusux

Banned
I just saw some FEAR 2 screens on gamespot.com. I'm very disappointed at the lack of soft shadows; shadows are still way too dark, and everything still looks plasticky.

It started with Crysis, and now FEAR 2. We have HDTVs with built-in noise reduction so we don't have to see it.

Instead of improving the FEAR engine they actually fvcked it up even more.
 
Maybe because F.E.A.R. 2 is a console port and the added noise helps on a television screen? I don't know, I'm just thinking maybe that's why. If not, then maybe there's an option to reduce or remove it, either in-game or by manually editing a configuration file. If it's hard-coded and always on, then it sucks indeed; I don't like film-grain effects either, and I completely removed it for Left 4 Dead.
 
I assume the OP is referring to film grain when he speaks of 'noise'?

I don't own L4D, nor do I have any intention of purchasing it or FEAR 2. However, film grain was in Mass Effect, and it was the first feature I turned off. It added nothing to the game, decreased graphical quality, obscured the world, gave me eyestrain, and followed it up with a headache.
 
Originally posted by: Bateluer
I assume the OP is referring to film grain when he speaks of 'noise'?

I don't own L4D, nor do I have any intention of purchasing it or FEAR 2. However, film grain was in Mass Effect, and it was the first feature I turned off. It added nothing to the game, decreased graphical quality, obscured the world, gave me eyestrain, and followed it up with a headache.

I remember wondering wtf was wrong with my 360/monitor until I figured it out.
 
WTF, film grain? I just started playing Mass Effect a week ago and didn't know it was on. I'll have to try turning it off next time I play

Seriously, wtf? Is this why I have a nice graphics card? Film grain?
 
Ya, it's an annoying trend of devs adding film grain/noise to give games a more cinematic/organic look. I think part of it is an attempt to combat larger monitors/TVs and higher resolutions, which lead to very clean, synthetic-looking graphics. Blame Quentin Tarantino, imo; all his movies lately have been using film grain liberally.
 
I actually like what I've seen so far in a handful of games, although I imagine I'll get sick of it once it becomes standard for every new FPS.
 
Originally posted by: Eeezee
WTF, film grain? I just started playing Mass Effect a week ago and didn't know it was on. I'll have to try turning it off next time I play

Seriously, wtf? Is this why I have a nice graphics card? Film grain?

It helps with the POS engine that is Unreal 3. The film grain hides the jaggies very well when AA cannot be turned on, for a very, very minimal performance hit. For most games it's a bad idea. For Mass Effect and any other games using the UE3 engine where you can't actually enable AA, it's nice to keep on.
 
Originally posted by: mwmorph
Originally posted by: Eeezee
WTF, film grain? I just started playing Mass Effect a week ago and didn't know it was on. I'll have to try turning it off next time I play

Seriously, wtf? Is this why I have a nice graphics card? Film grain?

It helps with the POS engine that is Unreal 3. The film grain hides the jaggies very well when AA cannot be turned on, for a very, very minimal performance hit. For most games it's a bad idea. For Mass Effect and any other games using the UE3 engine where you can't actually enable AA, it's nice to keep on.

that's not true, I can enable AA just fine in UT3
 
Originally posted by: yusux
Originally posted by: mwmorph
Originally posted by: Eeezee
WTF, film grain? I just started playing Mass Effect a week ago and didn't know it was on. I'll have to try turning it off next time I play

Seriously, wtf? Is this why I have a nice graphics card? Film grain?

It helps with the POS engine that is Unreal 3. The film grain hides the jaggies very well when AA cannot be turned on, for a very, very minimal performance hit. For most games it's a bad idea. For Mass Effect and any other games using the UE3 engine where you can't actually enable AA, it's nice to keep on.

that's not true, I can enable AA just fine in UT3
Try to do it on an Xbox 360.
 
Originally posted by: s44
Originally posted by: yusux
Originally posted by: mwmorph
Originally posted by: Eeezee
WTF, film grain? I just started playing Mass Effect a week ago and didn't know it was on. I'll have to try turning it off next time I play

Seriously, wtf? Is this why I have a nice graphics card? Film grain?

It helps with the POS engine that is Unreal 3. The film grain hides the jaggies very well when AA cannot be turned on, for a very, very minimal performance hit. For most games it's a bad idea. For Mass Effect and any other games using the UE3 engine where you can't actually enable AA, it's nice to keep on.

that's not true, I can enable AA just fine in UT3
Try to do it on an Xbox 360.

I thought this was about PC
 
Originally posted by: yusux
Originally posted by: s44
Originally posted by: yusux
that's not true, I can enable AA just fine in UT3
Try to do it on an Xbox 360.

I thought this was about PC
Cross-platform games (which, these days, is pretty much all of them) will go to the lowest common denominator. If the game needs features not possible on an Xbox to look good, it won't ship.

Consoles can't handle real AA -> games will implement film grain instead
 
Originally posted by: chizow
Ya, it's an annoying trend of devs adding film grain/noise to give games a more cinematic/organic look. I think part of it is an attempt to combat larger monitors/TVs and higher resolutions, which lead to very clean, synthetic-looking graphics. Blame Quentin Tarantino, imo; all his movies lately have been using film grain liberally.

I was cheesed off when they started adding anti-aliasing to games. I want raw, unshopped vector lines, not some fuzzy, blurred bull crap line. Developers are idiots.
 
Yes, the film grain effect looks nasty. I don't know how it's actually supposed to be a good thing. It certainly doesn't make anything cinematic, as no modern movie looks like this. I turned it off immediately when I saw it in Mass Effect.

For Mass Effect and any other games using the UE3 engine where you can't actually enable AA, it's nice to keep on.

AA works perfectly fine in UE3 games, at least on Nvidia cards.
 
Originally posted by: s44
Consoles can't handle real AA -> games will implement film grain instead

<-- This.

Mostly what we get anymore are console ports, and it's cheaper to just leave the film grain and blur in rather than reprogram the game to look better on a PC. Some of it is art direction, yes, but I think it's mostly due to the hardware limitations of consoles. Plus, there is a certain graphical quality that gamers expect now, and "make it look good close up, then blur the stuff in the distance" seems to be the trend for delivering that nowadays.
 
Film grain makes low-res textures and jaggies less visible. It helps a lot on big-screen TVs, where the DPI is pathetically low and the hardware (consoles) is not as powerful.
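For reference, the "film grain" everyone is arguing about is usually nothing more than per-pixel random noise blended over the finished frame, regenerated every refresh. A rough sketch of the idea in Python/NumPy (illustrative only, not any actual engine's shader code):

```python
import numpy as np

def apply_film_grain(frame, intensity=0.06, seed=None):
    """Overlay zero-mean random noise on an image.

    frame: float array in [0, 1], shape (H, W) or (H, W, 3)
    intensity: standard deviation of the grain, as a fraction of full brightness
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, intensity, size=frame.shape)
    # Blend and clamp back into displayable range.
    return np.clip(frame + noise, 0.0, 1.0)

# A flat mid-grey frame picks up visible speckle; regenerating the
# noise each frame is what produces the animated "grain" look.
grey = np.full((4, 4), 0.5)
grainy = apply_film_grain(grey, intensity=0.05, seed=0)
```

The speckle breaks up flat areas and stair-stepped edges alike, which is exactly why it can mask low-res textures and jaggies on a low-DPI screen.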
 
Originally posted by: yusux
Originally posted by: mwmorph
Originally posted by: Eeezee
WTF, film grain? I just started playing Mass Effect a week ago and didn't know it was on. I'll have to try turning it off next time I play

Seriously, wtf? Is this why I have a nice graphics card? Film grain?

It helps with the POS engine that is Unreal 3. The film grain hides the jaggies very well when AA cannot be turned on, for a very, very minimal performance hit. For most games it's a bad idea. For Mass Effect and any other games using the UE3 engine where you can't actually enable AA, it's nice to keep on.

that's not true, I can enable AA just fine in UT3

Originally posted by: CP5670
Yes, the film grain effect looks nasty. I don't know how it's actually supposed to be a good thing. It certainly doesn't make anything cinematic, as no modern movie looks like this. I turned it off immediately when I saw it in Mass Effect.

For Mass Effect and any other games using the UE3 engine where you can't actually enable AA, it's nice to keep on.

AA works perfectly fine in UE3 games, at least on Nvidia cards.

Try to do it on XP with any video card. UE3 is a DX9 engine and yet it requires Vista and a DX10 card to enable AA.

Unreal Engine 3 is easy to program for and fairly decent-looking, but it's also one of the worst engines for PCs I've ever seen. It's absolutely dedicated to console users, with its pop-in streaming textures and lack of AA support in DX9 when it's a pretty much pure DX9 engine (at least UE3; not sure about the newer UE3.25/UE3.5).

Have you tried going into engine.ini and disabling streaming textures so the game doesn't have annoying texture pop-in after loading? The engine crashes without loading; that's what happens. Streaming textures are great on a console with something like 256MB of VRAM and 256MB of processing RAM (PS3), or 512MB of unified RAM (X360), but on a PC there is no reason to implement it, outside of lazy programmers not wanting to put in the extra work to make a halfway decent engine.

Add that to the fact that I haven't seen a UE3 game yet that allows AA without forcing it in ATI Tray Tools/RivaTuner or otherwise at the driver level...
 
Originally posted by: 43st
Originally posted by: chizow
Ya, it's an annoying trend of devs adding film grain/noise to give games a more cinematic/organic look. I think part of it is an attempt to combat larger monitors/TVs and higher resolutions, which lead to very clean, synthetic-looking graphics. Blame Quentin Tarantino, imo; all his movies lately have been using film grain liberally.

I was cheesed off when they started adding anti-aliasing to games. I want raw, unshopped vector lines, not some fuzzy, blurred bull crap line. Developers are idiots.
Uh, anti-aliasing and film grain have nothing in common; I'm not sure why people are making this comparison. One technique increases sampling and accuracy, the other adds arbitrary noise. I found film grain didn't do anything to hide jaggies and only served to disfigure otherwise very clean and detailed textures.

Also, as has been beaten to death already, AA works just fine in UE3. You can force it via the driver on Nvidia cards for sure, and other games support it natively, DX10-only, on the PC (GoW, UT3, Bioshock). As for claims about AA not being supported on consoles, that's also clearly inaccurate, as one of the Xbox 360's hyped-up features was "free 4x AA". The only problem is it's not quite free, and oftentimes the performance difference is enough that AA can't be used on the console.

Finally, what's up with all the UE3 hate? LOL. It's probably the best combination of visuals and performance on the PC. I just loaded up GoW the other day and it's still one of the best-looking games on the PC, which is even more impressive given how well it runs.
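The "increases sampling and accuracy" point can be made concrete: supersampling-style AA takes several coverage samples inside each pixel and averages them, recovering real information about where an edge actually falls, whereas film grain only perturbs values that are already there. A toy 1-D sketch in Python/NumPy (illustrative, not real rasterizer code; the edge position and pixel counts are made up for the example):

```python
import numpy as np

def supersample_edge(edge_pos, width=8, samples=4):
    """Rasterize a vertical edge at fractional position edge_pos across
    `width` pixels, taking `samples` coverage samples per pixel and
    averaging them -- the essence of supersampling AA."""
    pixels = np.zeros(width)
    for x in range(width):
        # Sample points spread evenly inside the pixel.
        pts = x + (np.arange(samples) + 0.5) / samples
        pixels[x] = np.mean(pts < edge_pos)  # coverage in [0, 1]
    return pixels

# 1 sample per pixel gives a hard staircase step (aliasing)...
aliased = supersample_edge(3.5, samples=1)
# ...4 samples give an intermediate grey on the boundary pixel (smoothing).
# Film grain, by contrast, would just add random noise to `aliased`
# without telling you anything new about the edge.
smoothed = supersample_edge(3.5, samples=4)
```

With one sample the boundary pixel snaps to 0 or 1; with four samples it lands on a fractional coverage value, which is where the smoothed edge comes from.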
 
Originally posted by: mwmorph
Add that to the fact that I haven't seen a UE3 game yet that allows AA without forcing it in ATI Tray Tools/RivaTuner or otherwise at the driver level...
Sounds like a vendor-specific issue limited to ATI parts, because again, forcing AA in the driver works just fine on Nvidia parts, even in non-DX10 UE3 titles like Mass Effect.
 