Why are game developers adding noise to games?


CP5670

Diamond Member
Jun 24, 2004
Try to do it on XP with any video card. UE3 is a DX9 engine and yet it requires Vista and a DX10 card to enable AA.

Unreal Engine 3 is easy to program for and fairly decent looking, but it's also one of the worst engines for PCs I've ever seen. It's absolutely dedicated to console users, with its pop-in streaming textures and its lack of AA support in DX9 when it's a pretty much pure DX9 engine (at least UE3 itself; not sure about the newer UE3.25/UE3.5).

I use AA in several of those games on XP with DX9. The Nvidia drivers let you force it on and it works fine.

As for the engine itself, the games that run on it vary a lot in graphical quality and performance. Epic seems to be able to get the best out of it and UT3 both looks excellent and runs very well, but other games like Bioshock and Mass Effect are not quite at the same level. Mass Effect looks at least a couple of years out of date and still runs worse than UT3.
 

mwmorph

Diamond Member
Dec 27, 2004
Originally posted by: chizow
Originally posted by: mwmorph
Add that to the fact that I haven't seen a UE3 game yet that allows AA without forcing it in ATI Tray Tools/RivaTuner or otherwise at the driver level...
Sounds like a vendor specific issue limited to ATI parts, because again, forcing AA in the driver works just fine on Nvidia parts, even in non-DX10 UE3 titles like Mass Effect.

Sigh, that's what I'm railing against. I should not HAVE to go down into a control panel to set AA for a single game and then disable it when trying to run another game.

I believe my info may have been out of date then (apparently both NV and ATI work if you force it at the driver level), but it's still inexcusable that Epic's crap devs couldn't add a basic graphical feature from 1997 into their new engine, and that the video card manufacturers had to implement their own workarounds just so something as simple as AA works. It's like buying a car only to find out a side view mirror is unpainted, the airbag compartment is filled with Jell-O, it only comes with 3 wheels, and instead of AC you get a block of ice in the back on hot summer days.

UE3 is a shitty engine with crap programming. That's it, that's what I'm trying to get at, and that is why film grain exists in Mass Effect. When it came out, AA was impossible to do on UE3 (on both consoles and PC), and film grain was the only way to make the jaggies tolerable.

I have to admit I have no idea why Crysis has film grain, but it does make sense for L4D, since it's supposed to be a group of players playing through a zombie movie shoot. As for FEAR 2, eh, film grain + horror sort of makes sense, I guess. If it really bothers you, FEAR 2 already has a noise removal mod: http://www.sendspace.com/file/1csmli
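
For what it's worth, the effect itself is dead simple: a film grain pass just adds per-pixel random noise on top of the finished frame, and the flicker breaks up the regular stair-step pattern of aliased edges. Here is a minimal CPU-side sketch, assuming an 8-bit RGB frame buffer; real games do this in a pixel shader, and the strength value is invented for illustration:

#include <cstdint>
#include <cstdio>
#include <random>
#include <vector>

// Add monochrome grain to an 8-bit RGB frame. 'strength' is the max
// per-pixel offset; reseeding per frame makes the grain "crawl".
void applyFilmGrain(std::vector<uint8_t>& rgb, int strength, uint32_t frameSeed)
{
    std::mt19937 rng(frameSeed);
    std::uniform_int_distribution<int> noise(-strength, strength);
    for (size_t i = 0; i + 2 < rgb.size(); i += 3) {
        int n = noise(rng); // one offset for R, G, and B keeps the grain monochrome
        for (int c = 0; c < 3; ++c) {
            int v = rgb[i + c] + n;
            rgb[i + c] = static_cast<uint8_t>(v < 0 ? 0 : (v > 255 ? 255 : v));
        }
    }
}

int main()
{
    std::vector<uint8_t> frame(64 * 64 * 3, 128); // flat gray stand-in for a rendered frame
    applyFilmGrain(frame, 12, 1);
    printf("first pixel after grain: %d %d %d\n", frame[0], frame[1], frame[2]);
}

Run that over a frame with hard edges and the crawling noise masks exactly the kind of regular pattern the eye latches onto, which is why grain reads as "softening" the jaggies.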
 

911paramedic

Diamond Member
Jan 7, 2002
I just played the FEAR 2 demo and it was a blast. The noise is because you are looking through a shield, but the textures and graphics are great.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: mwmorph
Sigh, that's what I'm railing against. I should not HAVE to go down into a control panel to set AA for a single game and then disable it when trying to run another game.

I believe my info may have been out of date then (apparently both NV and ATI work if you force it at the driver level), but it's still inexcusable that Epic's crap devs couldn't add a basic graphical feature from 1997 into their new engine, and that the video card manufacturers had to implement their own workarounds just so something as simple as AA works. It's like buying a car only to find out a side view mirror is unpainted, the airbag compartment is filled with Jell-O, it only comes with 3 wheels, and instead of AC you get a block of ice in the back on hot summer days.
Well, I think it's a bit more complicated as to why many devs don't include AA in their option settings by default even when the hardware supports it. It probably comes down to support and understanding human nature. If you have a slider that goes to, say, 11, people are always going to try to set that slider to 11, even if that goal is unrealistic given their hardware. Devs probably want to avoid having to support people who don't have hardware capable of running such settings but still want to set everything to 11, or worse, bitching on forums about why their dated video card and PC run their game like crap.

As for setting AA manually in the drivers, that's something that's been necessary for years, ever since AA has been supported on modern graphics cards. Having to enable/disable it for each game is certainly a drag, but again, that's a vendor-specific issue limited to ATI parts, as Nvidia parts allow for driver profiles that work quite well in saving and enabling game-specific settings for everything from AA/AF to SLI.
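
Conceptually a driver profile is nothing fancier than a lookup from the game's executable name to a saved bundle of overrides applied when the process launches. A toy sketch of the idea; the struct fields and exe names here are invented, not Nvidia's actual profile format:

#include <iostream>
#include <map>
#include <string>

// Invented stand-in for a per-game profile; real drivers store far more
// overrides than this, and the field names here are made up.
struct GameProfile {
    int  aaSamples;  // 0 = let the application decide
    int  afLevel;    // anisotropic filtering
    bool sliEnabled;
};

int main()
{
    std::map<std::string, GameProfile> profiles = {
        {"MassEffect.exe", {4, 16, true}},
        {"UT3.exe",        {8, 16, true}},
    };

    // When a game launches, look up its exe; fall back to global defaults.
    std::string launching = "MassEffect.exe";
    auto it = profiles.find(launching);
    GameProfile active = (it != profiles.end()) ? it->second
                                                : GameProfile{0, 0, false};
    std::cout << launching << " -> " << active.aaSamples << "x AA, "
              << active.afLevel << "x AF\n";
}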

UE3 is a shitty engine with crap programming. That's it, that's what I'm trying to get at, and that is why film grain exists in Mass Effect. When it came out, AA was impossible to do on UE3 (on both consoles and PC), and film grain was the only way to make the jaggies tolerable.

I have to admit I have no idea why Crysis has film grain, but it does make sense for L4D, since it's supposed to be a group of players playing through a zombie movie shoot. As for FEAR 2, eh, film grain + horror sort of makes sense, I guess. If it really bothers you, FEAR 2 already has a noise removal mod: http://www.sendspace.com/file/1csmli
LOL, I guess everyone is entitled to their opinion, but again, I don't think you'll find a better balance of performance and visuals on the PC than UE3. Crysis looks better but runs much worse; COD4/5 run slightly better but look slightly worse. Maybe some of Ubisoft's proprietary engines like AC or PoP come close, but I'd still give the visual nod to UE3.
 

CP5670

Diamond Member
Jun 24, 2004
Sigh, that's what I'm railing against. I should not HAVE to go down into a control panel to set AA for a single game and then disable it when trying to run another game.

Just make a profile for the game. Both nHancer and ATI Tray Tools let you do this easily. I don't see why this is a problem.
 

mwmorph

Diamond Member
Dec 27, 2004
Originally posted by: chizow
UE3 is a shitty engine with crap programming. That's it, that's what I'm trying to get at, and that is why film grain exists in Mass Effect. When it came out, AA was impossible to do on UE3 (on both consoles and PC), and film grain was the only way to make the jaggies tolerable.

I have to admit I have no idea why Crysis has film grain, but it does make sense for L4D, since it's supposed to be a group of players playing through a zombie movie shoot. As for FEAR 2, eh, film grain + horror sort of makes sense, I guess. If it really bothers you, FEAR 2 already has a noise removal mod: http://www.sendspace.com/file/1csmli
LOL, I guess everyone is entitled to their opinion, but again, I don't think you'll find a better balance of performance and visuals on the PC than UE3. Crysis looks better but runs much worse; COD4/5 run slightly better but look slightly worse. Maybe some of Ubisoft's proprietary engines like AC or PoP come close, but I'd still give the visual nod to UE3.

I have to disagree; the texture popping takes all the immersion out of the game.

The only job of a game engine is to provide a sense of immersion. BioWare even tries to do this by taking out loading screens and replacing them with elevator rides filled with dialogue. That is why I think UE3 is particularly egregious: when the whole game is designed to be as immersive as possible, the texture popping when stepping out of an elevator is incredibly annoying.

I also think COD4 looks great and runs well. I always found the UE3 engine to be a resource hog; it takes forever to start up and initialize, and it doesn't handle alt-tabbing well.

I personally believe that, depending on what you're doing, there are tons of better engines out there:
Gamebryo (Oblivion, Fallout 3)
Source (the new Protocol 37 Source build, not the original HL2 Source, but the one that added minor rendering improvements and post-processing in Dec 2008)
Ego/Neon (GRID, DiRT, Operation Flashpoint 2)
Crystal Tools (FFXIII; not a big fan of FF, but the engine's rendering capabilities seem very impressive so far)
id Tech 4 (MegaTexturing is, in a word, revolutionary and amazing)
Dunia (Far Cry 2)
Lithtech Jupiter Extended (FEAR 2)
Relent Engine (it allows you to destroy every part of the environment realistically)
Essence (Company of Heroes, Dawn of War II)
Scimitar (Assassin's Creed)
Iron (Sins of a Solar Empire; no, not a great looking game, but it's amazing how it can seamlessly zoom from galaxy-wide 2D to single-ship combat 3D rendering without any loading or hiccups at all)

So anyway, what I'm trying to say is, UE3 is like the lost middle child. It seems rushed to market, and it concentrates on how easily devs can do things, not how well they can do them. The UE3 engine is as easy to build on and program for as anything out there, but beyond that, it's really a case of jack of all trades, master of none.

If devs put in the time and effort, there are so many engines that actually suit a given type of game better that there is no real reason to use UE3 beyond speeding up development and not wanting to learn anything new.

Originally posted by: CP5670
Sigh, that's what I'm railing against. I should not HAVE to go down into a control panel to set AA for a single game and then disable it when trying to run another game.

Just make a profile for the game. Both nHancer and ATI Tray Tools let you do this easily. I don't see why this is a problem.

I would if I could, but RivaTuner, ATI Tray Tools, and everything else don't work with Windows 7. I like gaming, but I also like how Windows 7 does everything better than Vista in regular usage, i.e. actually working instead of crashing/rebooting Windows every time I, say, opened up Word or Firefox.

I had a hell of a time getting ATI drivers to even work. 8.12 Vista x64 + CCC would not run UE2 games and wouldn't uninstall (I had to manually edit my registry and delete files), the beta 8.12 Win 7 driver wouldn't work with CCC, and finally 9.1 (Vista x64) works well so far.
 

CP5670

Diamond Member
Jun 24, 2004
The only job of a game engine is to provide a sense of immersion. BioWare even tries to do this by taking out loading screens and replacing them with elevator rides filled with dialogue. That is why I think UE3 is particularly egregious: when the whole game is designed to be as immersive as possible, the texture popping when stepping out of an elevator is incredibly annoying.

It sounds like your beef is with Mass Effect, not UE3. Like I said, the UE3 games all vary a lot in graphical quality and stability. I have never seen this texture popping in UT3 except during the first few seconds, while the map assets are still loading. Alt-tab works fine in UT3, although Bioshock and ME were both giving me some trouble with that.

UE3 has gotten a mixed reputation because there are a number of games using it that don't look good and/or run badly, but there are also other games that look fantastic and run very well. That suggests that some developers who license it aren't using it properly, not that the engine itself has problems.

I would if I could, but RivaTuner, ATI Tray Tools, and everything else don't work with Windows 7. I like gaming, but I also like how Windows 7 does everything better than Vista in regular usage, i.e. actually working instead of crashing/rebooting Windows every time I, say, opened up Word or Firefox.

First time I've heard of that. It's probably due to the silly signed driver limitation though, and those programs should have gotten signed by the time Windows 7 actually ships.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: mwmorph
I have to disagree; the texture popping takes all the immersion out of the game.

The only job of a game engine is to provide a sense of immersion. BioWare even tries to do this by taking out loading screens and replacing them with elevator rides filled with dialogue. That is why I think UE3 is particularly egregious: when the whole game is designed to be as immersive as possible, the texture popping when stepping out of an elevator is incredibly annoying.

I also think COD4 looks great and runs well. I always found the UE3 engine to be a resource hog; it takes forever to start up and initialize, and it doesn't handle alt-tabbing well.
There isn't any noticeable texture popping in GoW or UT3, and I don't recall any significant issues in Mass Effect either. If you're referring to mip map detail, you can always try changing the mip map settings so that textures retain detail at greater distances. In any case, texture popping and streaming could be a lot worse, as seen in Crysis.

COD4 certainly runs well on weaker hardware, but it also doesn't look as good. I've found UE3 scales well with hardware and will take advantage of faster CPUs with more cores along with faster GPUs. Alt-tabbing again sounds like a platform/vendor/driver-specific issue, because I haven't had any problems alt-tabbing in any of those titles. I do remember Mass Effect took longer, sometimes as long as 30s, but it became near instant after I upgraded to my GTX 280 and switched to RAID 0 back in July (not sure which change made the difference).

I personally believe that, depending on what you're doing, there are tons of better engines out there:
Gamebryo (Oblivion, Fallout 3)
Source (the new Protocol 37 Source build, not the original HL2 Source, but the one that added minor rendering improvements and post-processing in Dec 2008)
Ego/Neon (GRID, DiRT, Operation Flashpoint 2)
Crystal Tools (FFXIII; not a big fan of FF, but the engine's rendering capabilities seem very impressive so far)
id Tech 4 (MegaTexturing is, in a word, revolutionary and amazing)
Dunia (Far Cry 2)
Lithtech Jupiter Extended (FEAR 2)
Relent Engine (it allows you to destroy every part of the environment realistically)
Essence (Company of Heroes, Dawn of War II)
Scimitar (Assassin's Creed)
Iron (Sins of a Solar Empire; no, not a great looking game, but it's amazing how it can seamlessly zoom from galaxy-wide 2D to single-ship combat 3D rendering without any loading or hiccups at all)

So anyway, what I'm trying to say is, UE3 is like the lost middle child. It seems rushed to market, and it concentrates on how easily devs can do things, not how well they can do them. The UE3 engine is as easy to build on and program for as anything out there, but beyond that, it's really a case of jack of all trades, master of none.
LOL, again, I don't think you'd be able to produce a better-looking image in any of those games than anything I could produce in GoW or UT3, and certainly not while maintaining frame rates as high. I mean, honestly, Gamebryo/Fallout 3? I guess if overly round polys and textures with very little contrast are what you consider good graphics. FO3 isn't even in the discussion without 3rd-party texture mods. I have the 4-5 top contenders in that bunch (AC, FC2, FEAR 2 demo, FO3), and I can say for certain none come close to UE3 in balancing both graphics and performance.

If devs put in the time and effort, there are so many engines that actually suit a given type of game better that there is no real reason to use UE3 beyond speeding up development and not wanting to learn anything new.
Seems to work great for 1st and 3rd person shooters/action games. If every 1st person shooter used UE3, it'd be an improvement for every title other than Crysis.
 

mwmorph

Diamond Member
Dec 27, 2004
Originally posted by: Raduque
Originally posted by: mwmorph

Gamebryo engine(Oblivion, Fallout 3)

Isn't this the same engine behind the abortion that is GTA4 PC?

No, GTA 4 is Rockstar's RAGE (Rockstar Advanced Game Engine).

GTA 4 was just a piss-poor port, but that's par for the course for Rockstar. Nice game design, but a room of retarded chimps with Ebola would have been able to do better Windows programming.

GTA San Andreas ran like crap for the hardware it required. GTA SA had problems running on Nvidia hardware IIRC, and when it did run, on both ATI and Nvidia hardware, it ran like hell considering how bad the game looked.

GTA III performance was abysmal too, with absolutely no occlusion culling, forcing the PC to render every object, even if, say, a tree was obstructed by a building, a gas station, 35 cars, 2 high-rises, and 15 houses. As long as an object was within rendering range, it was rendered, no matter how occluded it was.
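
To make that concrete, here is the difference in miniature: range-only culling draws everything inside a radius, while even a crude line-of-sight test skips objects hidden behind an occluder. This is a toy sketch with invented scene data; real engines use portals, precomputed visibility sets, or hierarchical depth tests rather than per-object sphere checks:

#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec3 center; float radius; };

// True if the segment from 'eye' to 'target' passes through 'blocker'.
static bool segmentHitsSphere(Vec3 eye, Vec3 target, Sphere blocker)
{
    Vec3 d = sub(target, eye);
    Vec3 m = sub(blocker.center, eye);
    float len2 = dot(d, d);
    float t = len2 > 0 ? dot(m, d) / len2 : 0; // closest point on the segment
    if (t < 0) t = 0; else if (t > 1) t = 1;
    Vec3 p = {eye.x + d.x * t, eye.y + d.y * t, eye.z + d.z * t};
    Vec3 gap = sub(blocker.center, p);
    return dot(gap, gap) < blocker.radius * blocker.radius;
}

int main()
{
    Vec3 eye{0, 0, 0};
    std::vector<Sphere> occluders = {{{0, 0, 5}, 2.0f}};  // a "building"
    std::vector<Vec3> objects = {{0, 0, 10}, {8, 0, 10}}; // tree behind it, car off to the side
    float drawRange = 50.0f;

    for (Vec3 obj : objects) {
        Vec3 d = sub(obj, eye);
        if (dot(d, d) > drawRange * drawRange) continue;  // range cull: GTA III stopped here
        bool hidden = false;
        for (Sphere s : occluders)
            if (segmentHitsSphere(eye, obj, s)) { hidden = true; break; }
        printf("object at (%.0f, %.0f, %.0f): %s\n", obj.x, obj.y, obj.z,
               hidden ? "culled (occluded)" : "rendered");
    }
}

In the sketch, the tree directly behind the "building" gets culled, while a range-only renderer would have drawn both.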
 

mwmorph

Diamond Member
Dec 27, 2004
Originally posted by: chizow
Seems to work great for 1st and 3rd person shooters/action games. If every 1st person shooter used UE3, it'd be an improvement for every title other than Crysis.
I snipped a lot of it because it was getting long, but after loading a level in either Bioshock or Mass Effect, there were tons of textures popping in most if not all of the time for me. This also isn't a PC-only issue; many Xbox 360 owners complain about it too.

It might be because I'm running a single-drive setup while you have RAID 0. I have noticed it happens a lot less in BIA: Hell's Highway, so it might be a game-by-game issue depending on the texture load of the game.

I think that's where you and I disagree: how good UE3 looks. I think GoW is pretty bad looking. There's nothing special about GoW; it's entirely monochromatic, and the character models follow the "super masculine boxy head/body" look that tends to be lower poly anyway.

I think, as far as games go, Far Cry looks vastly better than any UE3 game while maintaining performance. One beef I have with UE3 graphically is faces; there is just something inherently wrong and fake looking about UE3 faces. It's almost like the uncanny valley effect, except that I'm disgusted by something I can't put my finger on. I just get the feeling every time I see a face in a UE3 game that it looks completely off and ridiculously ugly. Case in point is Joe "Red" Hartsock in BIA:HH, who looks so completely wrong and ugly that it actually disturbed me a little.

I noticed this first in Bioshock. I gave it the benefit of the doubt because of the setting of the game, where people were supposed to look like freaks, but even Andrew Ryan looked wrong, as if his body was made from plastic or clay or something. Nevertheless, I chalked it up to Bioshock's art direction.

Then I played BIA and noticed Red was one ugly mofo. Then I played Mass Effect and noticed people just didn't look right. There was something slightly off about Joker, Ashley, Captain Anderson, Ambassador what's-his-face...

I don't have these instinctive reactions to other games. Crysis looks perfectly natural. Far Cry 2 does too.

Oh, and before you bash some of the engines: Neon/Ego is an incredible engine. It powers DiRT, GRID, and F1, and in modified form it will power Operation Flashpoint 2. It's a very efficient engine that actually looks quite good. GRID looks magnificent, and Operation Flashpoint 2 is shaping up to be a very, very good looking game as well.
http://www.eurogamer.net/galle...hp?game_id=7549#anchor
 

Modelworks

Lifer
Feb 22, 2007
Originally posted by: mwmorph

I think that's where you and I disagree: how good UE3 looks. I think GoW is pretty bad looking. There's nothing special about GoW; it's entirely monochromatic, and the character models follow the "super masculine boxy head/body" look that tends to be lower poly anyway.

I think, as far as games go, Far Cry looks vastly better than any UE3 game while maintaining performance. One beef I have with UE3 graphically is faces; there is just something inherently wrong and fake looking about UE3 faces. It's almost like the uncanny valley effect, except that I'm disgusted by something I can't put my finger on. I just get the feeling every time I see a face in a UE3 game that it looks completely off and ridiculously ugly. Case in point is Joe "Red" Hartsock in BIA:HH, who looks so completely wrong and ugly that it actually disturbed me a little.


The thing you have to realize is that nothing you listed is the engine's fault. UE3 is one of the most capable engines in existence. Currently you cannot do realtime tessellation in DirectX, but when DirectX 11 is released, that will be supported. That means all the places where stuff is boxy on the PC could be made smooth as silk, if the video card supports it. Right now they can't provide one polygon count for PC and another for console without doing a good bit of work. Faces in UE3 don't have any default look; it's up to the artist to decide what that will be.

I think the reason UE3 games tend to look the way they do isn't that the engine isn't capable. It's that it is a very complex engine, one of the most complex there is, and capable of a lot, but developers are mostly doing console + PC, and that limits what you can do.
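
As a rough picture of what tessellation buys: the hardware amplifies geometry at runtime, turning each coarse triangle into many smaller ones that a displacement function can then round off. Here is a CPU sketch of a single midpoint-subdivision pass, just to show the amplification; actual DX11 tessellation runs in hull/domain shaders on the GPU, and the mesh here is a made-up single triangle:

#include <array>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
using Tri = std::array<Vec3, 3>;

static Vec3 midpoint(Vec3 a, Vec3 b)
{
    return {(a.x + b.x) * 0.5f, (a.y + b.y) * 0.5f, (a.z + b.z) * 0.5f};
}

// One subdivision pass: every triangle becomes four. Repeating this is a
// CPU stand-in for raising the hardware tessellation factor; a displacement
// map would then push the new vertices outward to round the surface off.
static std::vector<Tri> subdivide(const std::vector<Tri>& mesh)
{
    std::vector<Tri> out;
    out.reserve(mesh.size() * 4);
    for (const Tri& t : mesh) {
        Vec3 ab = midpoint(t[0], t[1]);
        Vec3 bc = midpoint(t[1], t[2]);
        Vec3 ca = midpoint(t[2], t[0]);
        out.push_back({t[0], ab, ca});
        out.push_back({ab, t[1], bc});
        out.push_back({ca, bc, t[2]});
        out.push_back({ab, bc, ca});
    }
    return out;
}

int main()
{
    std::vector<Tri> mesh = {{{{0, 0, 0}, {1, 0, 0}, {0, 1, 0}}}};
    for (int level = 0; level < 4; ++level) {
        printf("level %d: %zu triangles\n", level, mesh.size());
        mesh = subdivide(mesh);
    }
}

Four passes turn 1 triangle into 256, which is why the same coarse console asset could end up looking smooth on a PC card that supports it.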
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: mwmorph
I think that's where you and I disagree: how good UE3 looks. I think GoW is pretty bad looking. There's nothing special about GoW; it's entirely monochromatic, and the character models follow the "super masculine boxy head/body" look that tends to be lower poly anyway.

I think, as far as games go, Far Cry looks vastly better than any UE3 game while maintaining performance. One beef I have with UE3 graphically is faces; there is just something inherently wrong and fake looking about UE3 faces. It's almost like the uncanny valley effect, except that I'm disgusted by something I can't put my finger on. I just get the feeling every time I see a face in a UE3 game that it looks completely off and ridiculously ugly. Case in point is Joe "Red" Hartsock in BIA:HH, who looks so completely wrong and ugly that it actually disturbed me a little.

I don't have these instinctive reactions to other games. Crysis looks perfectly natural. Far Cry 2 does too.
I'd say many of the graphical issues you have with UE3 are due to art direction/design and cinematography, and have nothing to do with the capabilities of the engine itself. Even if you don't like the squatty, roid-raged character models in GoW or UE3, it's still obvious the level of detail, post-processing, lighting, shading, particle effects, and texturing exceed the quality of everything else short of Crysis. There's no doubt Crysis looks better, but it also runs much, much worse. I can get 60+ FPS at all times in any UE3 game I own (GoW, UT3, ME, MoH:A, Bioshock, etc.), and that's with 4xTrMSAA enabled.

Again, if character modeling is what you're basing your opinion on... that really has nothing to do with the capabilities and strengths of the engine itself. If you prefer realistic models, check out Medal of Honor: Airborne, America's Army (using UE 3.0), etc.

Also, I'm not sure if you're comparing FC2 or the original Far Cry; in either case I think both look vastly inferior. FC2 actually has some of the worst texture popping I've seen in any game, relies heavily on transparent textures that make everything look like thin stage mock-ups, and actually runs worse than UE3 at similar settings.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: Raduque
Originally posted by: mwmorph

Gamebryo engine(Oblivion, Fallout 3)

Isn't this the same engine behind the abortion that is GTA4 PC?
Definitely not; again, Gamebryo could never produce anything close to as good looking as GTA4. It relies heavily on textures, as seen in the major improvements from texture mods.

Also, contrary to some of the feedback you've received, GTA4 isn't poorly optimized; it just exposes a weakness of PCs relative to consoles. Consoles have specialized RISC CPUs designed for parallel computing. PCs have general-purpose CPUs that have only recently shifted focus toward parallel computing and multi-threading.

The end result is that you really need a quad-core CPU to run GTA4 well, as the game was designed around that level of parallelism for best results on the console. It also doesn't help that the PC allows for much greater detail, vehicle/pedestrian/object density, and viewing distance, all of which demand more CPU time.
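
The core-count point boils down to how the frame's work is divided: an engine built for many hardware threads farms simulation jobs out to workers each frame, so with fewer cores the same jobs queue up and the frame takes longer. A minimal sketch of that pattern; the job names and timings are invented:

#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for one frame's worth of a simulation job (names are invented).
static void runJob(const char* name, int milliseconds)
{
    std::this_thread::sleep_for(std::chrono::milliseconds(milliseconds)); // fake work
    printf("finished %s\n", name);
}

int main()
{
    struct Job { const char* name; int ms; };
    std::vector<Job> frameJobs = {
        {"physics", 4}, {"pedestrian AI", 3}, {"traffic", 3}, {"streaming", 5},
    };

    // Kick every job off in parallel. With four cores they overlap; on a
    // dual core the same jobs contend for two cores and the frame takes longer.
    std::vector<std::thread> workers;
    for (const Job& j : frameJobs)
        workers.emplace_back(runJob, j.name, j.ms);
    for (std::thread& t : workers)
        t.join();
    printf("frame complete\n");
}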

If this game supported AA, I'd actually move it up behind Crysis as the most visually impressive game on the PC. :)



 

videogames101

Diamond Member
Aug 24, 2005
Originally posted by: BladeVenom
I also hate when they add motion blur into games.

!!!


(I love motion blur, not for multiplayer modes, but because it adds so much realism in SP)
 

erwos

Diamond Member
Apr 7, 2005
I liked the film grain effect. I can see why some people don't, but I felt that it gave the game an epic, movie-like feel. I very much doubt it was there to cover up jaggies, given that you can turn it off. I think consoles get something of a bad name from the texture pop-in problems UE3 has; I'm not sure why it's like that, except that there must be some insane speed-up it gives.
 

mwmorph

Diamond Member
Dec 27, 2004
Originally posted by: erwos
I liked the film grain effect. I can see why some people don't, but I felt that it gave the game an epic, movie-like feel. I very much doubt it was there to cover up jaggies, given that you can turn it off. I think consoles get something of a bad name from the texture pop-in problems UE3 has; I'm not sure why it's like that, except that there must be some insane speed-up it gives.

Consoles only have 512MB of RAM (the Xbox 360 has 512MB of dynamically shared RAM plus 10MB of special extreme-speed VRAM; the PS3 has 256MB of fixed system RAM and 256MB of VRAM).

The texture streaming is required so that high-res textures and all the maps fit in memory on consoles.
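
In other words, the streamer's whole job is to keep only the mip levels you actually need resident inside a fixed memory budget, pulling sharper mips in as the camera gets close and dropping them as it moves away; the pop-in everyone complains about is just the gap before the sharp mip arrives. A simplified sketch of that budget-driven mip selection, with all sizes, distances, and thresholds invented:

#include <algorithm>
#include <cstdio>
#include <vector>

struct Texture {
    const char* name;
    float distance;    // camera distance to the nearest surface using it
    int residentMip;   // 0 = full resolution; each step up is a quarter the size
    int baseKb;        // size of mip 0
    int sizeAtMip(int mip) const { return baseKb >> (2 * mip); }
};

int main()
{
    std::vector<Texture> textures = {
        {"wall",   3.0f,  3, 4096},
        {"floor",  8.0f,  3, 4096},
        {"skybox", 50.0f, 3, 2048},
    };
    const int budgetKb = 4096; // stand-in for the console's texture pool

    // Give the sharpest mips to the nearest surfaces first.
    std::sort(textures.begin(), textures.end(),
              [](const Texture& a, const Texture& b) { return a.distance < b.distance; });

    int usedKb = 0;
    for (Texture& t : textures) {
        int wantMip = t.distance < 5 ? 0 : (t.distance < 20 ? 1 : 2);
        int mip = wantMip;
        // Fall back to blurrier mips until this texture fits in the budget;
        // mip 3, the low-detail base set, always stays resident.
        while (mip < 3 && usedKb + t.sizeAtMip(mip) > budgetKb)
            ++mip;
        t.residentMip = mip; // anything sharper streams in later -> visible "pop"
        usedKb += t.sizeAtMip(mip);
        printf("%s: resident mip %d (wanted %d), pool %d/%d KB\n",
               t.name, mip, wantMip, usedKb, budgetKb);
    }
}

The "wanted vs. resident" gap in the output is exactly the pop-in window: the surface is on screen at a blurry mip while the streamer finds room for the sharp one.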