Why don't PS4/XB1 games use anisotropic filtering?

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Is there a reason why they leave it off? It's been a nearly free effect since the mid-2000s, if I remember correctly.

Part of me thinks they do it on purpose so that up-close details look sharp compared to distant details. To some casual viewers, this might make the image appear more "HD", since you get the clear contrast of sharp up-close detail against blurred-out graphics in the distance, whereas if everything were sharp, like you get with 16x AF, the overall HD effect might be diminished.

Anyways, that's just one theory I have for why so many PS4 games appear to use no AF.
 

Drako

Lifer
Jun 9, 2007
10,697
161
106
They do. There are some games on the PS4 that don't, but the vast majority do.
 

Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
The same reason they tend to avoid AA. Neither system has the GPU capability to do proper full-time AA/AF in anything but very controlled graphical situations. We are talking about machines that still treat 60 FPS at 1080p as a luxury. Throw in AA/AF and performance drops dramatically.

In spite of this fact, we do tend to see AA/AF more often in games with small closed level design since there is less for the GPU to do in general. For open world games, you can forget it since it's gonna take everything a developer can do just to hit 1080P with decent performance.

This is a general observation and there are always exceptions.
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
Odd, even with my laptop's Radeon 5470 (roughly 1/20th the GFLOPS of the PS4), anisotropic filtering costs next to nothing. However, compared to most effects, AF is relatively subtle.

Now, AA on the other hand kills performance for rather obvious reasons.
 

local

Golden Member
Jun 28, 2011
1,851
515
136
Want AA/AF? Buy a PC or convince Sony/MS that you will pay $600 for a decent console.
 

alcoholbob

Diamond Member
May 24, 2005
6,368
435
126
Want AA/AF? Buy a PC or convince Sony/MS that you will pay $600 for a decent console.

Impossible. Consoles are TDP, form-factor, and temperature limited (due to size). They were able to release flagship PC GPUs in the PS3/Xbox 360 generation because the TDP of gaming cards back then was in the 50-80W range.

The fastest card around when the PS4 was on the drawing board was the 7970. How much faster do you suppose an 80W 7970 in a PS4 would be than what it has currently? Do you think that tradeoff would be worth a $600 PS4?
 

local

Golden Member
Jun 28, 2011
1,851
515
136
Impossible. Consoles are TDP, form-factor, and temperature limited (due to size). They were able to release flagship PC GPUs in the PS3/Xbox 360 generation because the TDP of gaming cards back then was in the 50-80W range.

The fastest card around when the PS4 was on the drawing board was the 7970. How much faster do you suppose an 80W 7970 in a PS4 would be than what it has currently? Do you think that tradeoff would be worth a $600 PS4?

Maybe don't limit it so much? Design a better box if you plan on it lasting more than a few years. I have had almost everything since the Atari 2600, but I will probably skip this gen. It is by far the least exciting, partly because it was almost obsolete the day it came out.
 
Mar 11, 2004
23,444
5,847
146
I have no idea. The rare times they do, I think they only go with like 2x or 4x, which really isn't very good. I'm especially baffled considering all the work they've put into different AA implementations. Is there a buffer problem or something? Or is it because they have a more aggressive tapering of texture quality and AF would actually highlight it?

Also, uh, WTF are people lumping AF in with AA? They're entirely different and have very different performance impacts. AF has had a very minor performance hit on the PC for a decade or more (I think it was around the time of the 9800 Pro?).
 

Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
I have no idea. The rare times they do, I think they only go with like 2x or 4x, which really isn't very good. I'm especially baffled considering all the work they've put into different AA implementations. Is there a buffer problem or something? Or is it because they have a more aggressive tapering of texture quality and AF would actually highlight it?

Also, uh, WTF are people lumping AF in with AA? They're entirely different and have very different performance impacts. AF has had a very minor performance hit on the PC for a decade or more (I think it was around the time of the 9800 Pro?).

I think it's less about AF and AA being lumped together and more about the whole of post-processing effects being lumped together. You are correct that they are completely different, yet both take a toll on the GPU. Personally I hate HDR and bloom effects and would gladly give them up for more aggressive AF, but developers seem addicted to them.

Only a PC gamer can truly understand how underpowered these console GPUs are. Don't get me wrong, I love the types of gaming that the PS4/XB1 provide, but this gen of consoles is crap from a raw performance standpoint. I've been gaming at 1920x1200 since circa 2005-2006 and it is laughable that 1920x1080 is still viewed as some sort of luxury. Now that we are in the 4K world, it is even more ludicrous.
 

Fulle

Senior member
Aug 18, 2008
550
1
71
AF (anisotropic filtering) is a texture filtering technique that mostly improves the clarity of textures viewed at oblique angles. It is generally cheap on GPU/CPU resources, but it costs a bit of memory bandwidth, which is why stronger AF wasn't found in some PS3/Xbox 360 games. The PS4 and Xbox One have enough resources that AF isn't a problem, and I can't think of a game that doesn't use it... There's a possibility that some particular game somewhere might not be using AF, but the PS4 in particular has so much memory bandwidth that it would be completely on the developer, since AF would be practically free.
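For a sense of how small that toggle is on the PC side, here's a minimal sketch of enabling AF through a Direct3D 11 sampler state (consoles use their own APIs, so this is purely illustrative, and the helper name is made up):

```cpp
// Minimal sketch (PC / Direct3D 11): anisotropic filtering is just a sampler state.
// Consoles expose the same per-sampler knob through their own APIs.
#include <d3d11.h>

ID3D11SamplerState* CreateAnisotropicSampler(ID3D11Device* device, UINT maxAniso = 16)
{
    D3D11_SAMPLER_DESC desc = {};
    desc.Filter         = D3D11_FILTER_ANISOTROPIC;   // vs. D3D11_FILTER_MIN_MAG_MIP_LINEAR (trilinear)
    desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.MaxAnisotropy  = maxAniso;                    // 2x, 4x, 8x, or 16x taps at oblique angles
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MinLOD         = 0.0f;
    desc.MaxLOD         = D3D11_FLOAT32_MAX;

    ID3D11SamplerState* sampler = nullptr;
    device->CreateSamplerState(&desc, &sampler);       // cost shows up as extra texture fetches, i.e. bandwidth
    return sampler;
}
```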

AA (anti-aliasing) is a completely different thing; it reduces the aliasing that mostly appears on the edges of 3D objects built from polygons. I've seen people describe aliasing as "shimmering" or "crawlies". Most of the more effective AA techniques are pretty heavy on system resources, both GPU and memory, so AA is more likely to reduce frame rate. A lot of PS4 and Xbox One games use post-process anti-aliasing, which is less effective but uses fewer resources. FXAA (a post-process AA effect) works by blurring edges where aliasing is most noticeable. Since aliasing is less noticeable at higher resolutions, this is an area where the resolutiongate discussion actually matters a little: 1080p with post-process AA might have less noticeable aliasing than 900p with post-process AA.

In PC games, I prefer to run at as high a resolution as possible and use light MSAA (multi-sample AA), since it smooths out the edges a bit without causing blurring. There are actually a few PS4 games that use MSAA, but it's less common than post-process AA. The Order (a poor game, but a good tech demo) apparently used 4x MSAA.
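For contrast with the sampler tweak above, here's a rough sketch of what 4x MSAA asks of the hardware in the same API (again just illustrative, helper name made up): every multisampled target stores several samples per pixel, which is where the extra memory and bandwidth go.

```cpp
// Sketch: creating a 4x MSAA color target in Direct3D 11 (illustrative only).
// Each pixel stores SampleDesc.Count samples, so memory and bandwidth scale with
// the sample count, which is why MSAA is heavier than post-process AA like FXAA.
#include <d3d11.h>

ID3D11Texture2D* CreateMsaaColorTarget(ID3D11Device* device, UINT width, UINT height)
{
    UINT quality = 0;
    device->CheckMultisampleQualityLevels(DXGI_FORMAT_R8G8B8A8_UNORM, 4, &quality);
    if (quality == 0)
        return nullptr;                         // 4x MSAA not supported for this format

    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width              = width;
    desc.Height             = height;
    desc.MipLevels          = 1;
    desc.ArraySize          = 1;
    desc.Format             = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count   = 4;                // 4 samples per pixel -> roughly 4x the target memory
    desc.SampleDesc.Quality = quality - 1;
    desc.Usage              = D3D11_USAGE_DEFAULT;
    desc.BindFlags          = D3D11_BIND_RENDER_TARGET;

    ID3D11Texture2D* target = nullptr;
    device->CreateTexture2D(&desc, nullptr, &target);
    // After rendering, the samples are collapsed to a single-sample texture
    // with ID3D11DeviceContext::ResolveSubresource().
    return target;
}
```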
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
Only a PC gamer can truly understand how underpowered these console GPUs are. Don't get me wrong, I love the types of gaming that the PS4/XB1 provide, but this gen of consoles is crap from a raw performance standpoint. I've been gaming at 1920x1200 since circa 2005-2006 and it is laughable that 1920x1080 is still viewed as some sort of luxury. Now that we are in the 4K world, it is even more ludicrous.

The Xbox One's GPU is certainly underpowered, on account of them spending all the budget on the Kinect 2. IIRC, Microsoft requires developers to sign a "platform parity" clause as part of their contract, which would explain why first-party titles on the PS4 can look better than third-party ones.

You have to look at last generation to see why the current consoles are designed the way they are. Taking deep losses on the hardware was a risky move for both companies. Microsoft got lucky with the 360. Sony, not so much. It took them a long time to recover from the PS3's weak sales. There were also the RRoD and YLOD scandals that plagued earlier models. So rather than using top-end hardware, they went for cheaper off-the-shelf components that could run at a lower TDP. Thus the consoles sell at a profit from the get-go, and don't suffer from overheating.

I agree though that 1080p should be the de facto standard now for all video games. 4K is still out of reach for most people because of the horsepower required to do those displays justice. Not everybody has $1000 to drop on a GTX Titan.
 

Fulle

Senior member
Aug 18, 2008
550
1
71
What bothers me about this gen, is they're already pushing the consoles past what they can do, and releasing games that perform terribly.

For some specific examples:

Bloodborne has too much physics (breakable bottles and coffins and such) and tessellation (fuzzy werewolf fur, and swaying fabric on armor), to the point that they couldn't even use a decent AA method, and the frame rate can't hold 30 FPS in multiplayer. LOD in the background can sometimes be pretty poor too.
- Still a great game, but there were bad performance decisions, IMHO.

Assassin's Creed Unity nerfed the PS4 resolution to have parity with the Xbox One, taking ZERO advantage of the PS4's 50% larger GPU. So that's annoying. More annoying though, is that they pushed the amount of people in crowds to the point where they hit a severe CPU bottleneck, where FPS drops into the TEENS in a lot of situations.

Several driving games have released at a 30 fps target. Driveclub, Forza Horizon 2, Need for Speed Rivals, and The Crew. What's next? 30 FPS fighting games? Screw you devs, this is BS!

This isn't the consoles' fault though. It's just devs pushing visuals to the point that it negatively affects performance.
 
Last edited:

Bman123

Diamond Member
Nov 16, 2008
3,221
1
81
The problem is 4k tv sets aren't close to affordable for the majority of the market. You can get a decent 1080p 60 inch tv for under $800 everywhere.

Most of the people that game on console don't have the cash to drop $1000+ on a 50 inch 4k tv. Not to mention we were lucky to get 720p games last gen, now most of the games are 1080p and to console players it's a big difference.

I gamed on pc for a while and got fed up with not being able to play with my friends because they all bought consoles. So I jumped on the console bandwagon and yeah the graphics were shit compared to pc but I had a way better time playing the games with friends that I know in person.

I'm not upset at all with the games being 1080p on console, if I was playing on pc still I'd have a 1080p monitor not a 4k one but I was never hardcore on my pc specs because I couldn't justify the price.
 
Dec 30, 2004
12,553
2
76
The problem is 4k tv sets aren't close to affordable for the majority of the market. You can get a decent 1080p 60 inch tv for under $800 everywhere.

Most of the people that game on console don't have the cash to drop $1000+ on a 50 inch 4k tv. Not to mention we were lucky to get 720p games last gen, now most of the games are 1080p and to console players it's a big difference.

I gamed on pc for a while and got fed up with not being able to play with my friends because they all bought consoles. So I jumped on the console bandwagon and yeah the graphics were shit compared to pc but I had a way better time playing the games with friends that I know in person.

I'm not upset at all with the games being 1080p on console, if I was playing on pc still I'd have a 1080p monitor not a 4k one but I was never hardcore on my pc specs because I couldn't justify the price.

we're not really talking about any of that
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Only a PC gamer can truly understand how underpowered these console GPUs are.

I'm a PC gamer and I know that the PS4 has a near-7870-level GPU, which should be able to do AF with almost no performance hit.

Even the XB1, with its lowly DDR3-fed 7770-class GPU, should be doing a lot more AF than it is.
 

Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
I'm a PC gamer and I know that the PS4 has a near-7870-level GPU, which should be able to do AF with almost no performance hit.

Even the XB1, with its lowly DDR3-fed 7770-class GPU, should be doing a lot more AF than it is.

Yet they don't do it. Based on the chipset used, they should also both hit 1920x1080 with no difficulty. It makes no sense for either company to voluntarily not use a feature that is mostly beneficial unless there are bottlenecks elsewhere in the system.

My Radeon 9800 PRO was able to do 1920x1200 ten years ago, although not easily. Its replacement was a circa-2009 GeForce GTX 260, which easily handled 1920x1200 with minimal post-processing, and the 7770 is a much more recent chipset.

I agree that they should be doing a lot more AF, but that is only part of the issue. I'm starting to get the feeling that the GPU that AMD sold them is but a shadow of the version that sat in PCs. From my point of view, Sony and Microsoft both went the cheap route and AMD offered the lowest bid. I know that the consoles are profitable, but seriously... $50 more per system could have meant a huge performance difference across the board.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I know that the consoles are profitable, but seriously... $50 more per system could have meant a huge performance difference across the board.

On the GPU side, absolutely not. HD 7870/HD 7970M-level performance was flagship mobile spec at the time within a 100W TDP. NV released the 780M on May 30, 2013, which was 15% faster than the HD 8970M (itself basically a 5% faster 7970M). So no, throwing $50 at the GPU would hardly have made an impact. To fit a much more powerful GPU, they would have needed to go with a desktop SKU, which means a larger case, larger PSU, more expensive cooling, etc. They could have unlocked the full 20 CUs in the PS4, but that's only an 11% increase in GPU power.

In the grand scheme of things, PS4 is a well designed console. It's a perfect balance of cost vs. power without incurring major losses for the firm.

"Sony’s PlayStation 4 Costs $381 to Build — Only $18 Under Retail Price — In Teardown"

Hopefully that means PS5/XB2 are out by Fall 2019 and this generation won't drag on for 7-8 years like the Xbox 360/PS3 did.
 
Last edited:

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Yet they don't do it. Based on the chipset used, they should also both hit 1920x1080 with no difficulty.

No, the XB1 is bottlenecked by the fact that they went with DDR3 as graphics memory, and the small amount of ESRAM means 1080p will always be a challenge for it. That doesn't change the fact that they should be able to enable AF for a 2-3% performance cost.
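To put rough numbers on the memory argument, here is a back-of-the-envelope sketch using the commonly cited public specs (treat the figures as approximate):

```cpp
// Back-of-the-envelope main-memory bandwidth comparison (publicly cited specs, approximate).
// bandwidth (GB/s) ~ transfer rate (MT/s) * bus width (bytes) / 1000
#include <cstdio>

int main()
{
    const double xb1_ddr3  = 2133.0 * (256.0 / 8.0) / 1000.0; // XB1: DDR3-2133, 256-bit bus  -> ~68 GB/s
    const double ps4_gddr5 = 5500.0 * (256.0 / 8.0) / 1000.0; // PS4: 5.5 GT/s GDDR5, 256-bit -> ~176 GB/s

    std::printf("XB1 DDR3 : %.0f GB/s\n", xb1_ddr3);
    std::printf("PS4 GDDR5: %.0f GB/s\n", ps4_gddr5);
    // The XB1's 32 MB of ESRAM adds a separate fast pool (roughly 100-200 GB/s peak,
    // per Microsoft's figures), but it is small, which is why fitting 1080p render
    // targets is the tight squeeze -- not the handful of extra texture fetches AF adds.
    return 0;
}
```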

I honestly feel like they think people don't notice it or something, I don't know. I feel like game developers are somehow out of touch with the end user experience. Like they literally just don't sit down and play games, or nitpick the image quality the same way the consumer does. We are expecting perfection, they are trying to ship a product out the door. It's the only excuse I can see for how games are being released nowadays.
 

Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
No, the XB1 is bottlenecked by the fact that they went with DDR3 as graphics memory, and the small amount of ESRAM means 1080p will always be a challenge for it. That doesn't change the fact that they should be able to enable AF for a 2-3% performance cost.

I honestly feel like they think people don't notice it or something, I don't know. I feel like game developers are somehow out of touch with the end user experience. Like they literally just don't sit down and play games, or nitpick the image quality the same way the consumer does. We are expecting perfection, they are trying to ship a product out the door. It's the only excuse I can see for how games are being released nowadays.

I accept what you're saying, but I don't think it's about gamers expecting perfection. The PS4/XB1 were specifically designed for 1080p televisions and they failed to meet that specification. The relative comparison of the PS4 and XB1 is silly because the PS4, with its more powerful GPU, still fails to provide 1080p for 100% of its games and, as many are saying here, fails to take advantage of more basic techniques such as AF.

Regardless of how people talk up the GPU tech that was used, real-world performance is underwhelming. Reminds me of when Bulldozer was released. On paper it was genius, but in practice it failed, to the chagrin of many people. It appears AMD is still in the chagrin business. :)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Regardless of how people talk up the GPU tech that was used, real-world performance is underwhelming. Reminds me of when Bulldozer was released. On paper it was genius, but in practice it failed, to the chagrin of many people. It appears AMD is still in the chagrin business. :)

Think about it: in some of today's PC games, you cannot max them out at 60 fps at 1080p even on a 980. So how in the world could a $400 console do that when it released in November of 2013? The 980 is >2x faster than the HD 7870. The problem with the PS4 and XB1 isn't the hardware; it's that gamers always had unrealistic expectations about what those consoles would pack under the hood.

It's surprising that even knowledgeable PC gamers can't put the numbers together. Intel/AMD/NV have margins too. When an i3 is $130, an i5 is $180, and a GTX 970 is $300, how did PC gamers expect the PS4 to have hardware powerful enough to play games at 1080p at 60 fps? Even the 980 will be outdated in 2.5-3 years and won't play a single AAA game maxed out at 60 fps by 2018.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I'm happy with how most of the games look on PS4 and XB1. There's only a few that seem underwhelming in any way. There is really only so much you can do with the hardware on hand in the consoles to begin with but as mentioned, some developers try to do effects that are possible but not playable. There needs to be a balance. I think they are enough of a step up from the 360 and PS3 to be impressive on their own. The problem on the consumer side is that many people try to compare it to a PC which isn't fair to begin with. My PC is many times more capable than the consoles and I'll upgrade from my 970s to something else in a couple years or so and have a faster CPU by then probably too. The console will always have the same hardware.

My biggest problem with the XB1 and PS4 is the lack of compelling exclusives, and they are already going on two years old. Between them, the two systems only have a few games that are system sellers. On the console I don't care much about AA and AF, etc. I care about the game I'm playing that was not available on PC, or that is a better console experience due to the number of players online and the like.
 

Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
Think about it: in some of today's PC games, you cannot max them out at 60 fps at 1080p even on a 980. So how in the world could a $400 console do that when it released in November of 2013? The 980 is >2x faster than the HD 7870. The problem with the PS4 and XB1 isn't the hardware; it's that gamers always had unrealistic expectations about what those consoles would pack under the hood.

It's surprising that even knowledgeable PC gamers can't put the numbers together. Intel/AMD/NV have margins too. When an i3 is $130, an i5 is $180, and a GTX 970 is $300, how did PC gamers expect the PS4 to have hardware powerful enough to play games at 1080p at 60 fps? Even the 980 will be outdated in 2.5-3 years and won't play a single AAA game maxed out at 60 fps by 2018.

That's crap, because with the exception of certain cases, I max everything out with my i7-2600K and GTX 770 with 8GB of RAM. It might not always hold 60 FPS consistently, but that's why I have the option to scale things back a tad. You are also talking about end-user prices... when you buy in large lots, the price per unit is much lower.

Console developers painted themselves into a corner by inadvertently competing with PCs. They could have just taken last-gen graphics, poured on a liberal amount of AA/AF, V-sync, and some post-processing at a rock-solid 1080p/60, and console gamers would have been fine. Instead, they are forcing current PC-level post-processing onto GPUs that lack the necessary raw shader power. As a result, the games look good but the experience isn't optimal.

The 360/PS3 were originally designed for 1280x720 in spite of supporting 1080P after release so people were a bit more forgiving. For current gen, they could scale back the graphics quality a bit and hit 1080P easily, but then they would have to admit that the hardware wasn't up to it in the first place.

The bottom line is that the graphics performance issues console gamers are dealing with right now are completely self-inflicted, born of a need to look as good as what PC gamers have. They were hoping this generation would be the one to close the gap between PC and console, and admittedly they got close for a minute, but once again the next generation of PC GPUs is coming out and consoles are going to get left behind.

That isn't to say that all games suffer this fate...many games run just fine at 1080P on both the XB1 and PS4. For those that do struggle, they should just dial things back a bit instead of trying to hold up the ocean.
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,145
502
126
The same reason they tend to avoid AA. Neither system has the GPU capability to do proper full-time AA/AF in anything but very controlled graphical situations. We are talking about machines that still treat 60 FPS at 1080p as a luxury. Throw in AA/AF and performance drops dramatically.

^^^^ This. PCs have AA and AF as options in games. They are optional because hardware can range from on-CPU video to multi-thousand-dollar 2x/3x/4x graphics card setups. You need to remember that the GPUs in the PS4 and Xbox One are essentially a modified AMD 7870, which came out a few years ago in PC land and is roughly the speed of last gen's low/mid-tier cards (like the Nvidia 760) or AMD's lower-end 270.

With the poly count they are doing in current games, I am surprised they are able to even hit 30 frames per second at 1080p without AA and AF, let alone with it (but let's face it, most games are still not 1080p, and that is because the graphics are too demanding).
 
Last edited: