
Help me resolve this internet argument: does texture size affect framerates?

Habeed

In a case where there is surplus VRAM, and all other factors held equal, do bigger textures lower framerate?

My friend insists loudly, over and over again, that they do not. Since the final rendering resolution is the same, and the video card has enough memory, he doesn't think it matters at all.

I keep telling him that it is simple mathematics: the frame-rate difference might be small, but if your graphics processor has to step through a larger array, it's going to take more steps and more time.

The specific case was Minecraft: admittedly, not a game that does much to a GPU. I think the larger texture packs, which hack the game client to use larger texture sizes (thus changing the dimensions of the array used to store the textures in RAM), are going to reduce framerates, because now the algorithm (I don't know how much of Minecraft's rendering is done on the GPU versus the CPU, but that doesn't matter) has to do several times as many steps to sample the textures.

Whether this matters in the real world (there are plenty of other places that could be bottlenecking actual performance) is irrelevant: it does take more GPU or CPU time if you boost texture sizes in the game.
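To put numbers on the "larger array" part, here's a rough back-of-the-envelope sketch. It assumes uncompressed RGBA8 storage (real games usually use compressed formats, so treat these as upper bounds), and the function name is just for illustration:

```python
def texture_bytes(width, height, bytes_per_texel=4):
    """Uncompressed size of one texture in bytes (RGBA8 assumed)."""
    return width * height * bytes_per_texel

# Minecraft's stock textures are 16x16; HD packs go to 256x256 or beyond.
stock = texture_bytes(16, 16)     # 1,024 bytes
hd    = texture_bytes(256, 256)   # 262,144 bytes

# The HD texture holds (256/16)^2 = 256x the data, even though the number
# of texels actually *sampled* per frame is set by screen resolution, not
# texture resolution.
print(stock, hd, hd // stock)  # → 1024 262144 256
```

So the array really is hundreds of times bigger; whether that costs frame time depends on how the hardware reads it, not on the array size alone.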
 
I dunno, but I guess that depends on how large the textures are. Even if you double the size of the textures, it's not necessarily a bottleneck in the system. If the GPU is "waiting" for other calculations to be finished, then maybe it can handle larger textures with no penalty. But logically I would think that larger textures would affect performance a bit.
 
Just anecdotal, but the texture mods for Fallout and Oblivion certainly made a very noticeable impact on my FPS; I had to go from a 256 MB VRAM card to 512 MB, and even then it was crap. I haven't tried on these new >1 GB VRAM cards since I've lost interest. As to whether the lower FPS was SOLELY the effect of the larger textures, it's hard to say, because so many other things come into play. I don't know that you can make a blanket statement like "all other factors held equal", because in the real world, implementing larger textures is bound to require other changes.
 
Of course it makes a difference. You have to read in more data with higher-res textures even if screen resolution stays the same. GPUs have texture caches, and the lower-res the textures, the greater the percentage of reads that will be served from the caches rather than from the much slower main memory.

This is easy to test. There are many games that let you choose texture detail. Just measure the difference using one of these games.
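You can even fake a CPU-side version of the cache effect in a few lines of Python. This is purely an analogy (a real GPU texture cache behaves differently, and the function here is made up for illustration): a fixed number of random reads from a small buffer stays cache-resident, while the same number of reads scattered over a big buffer keeps missing.

```python
import random
import time

def sample_time(buf_len, n_samples=200_000, seed=0):
    """Time n random reads from a buffer of buf_len ints (CPU-cache analogy
    for texture sampling: the read count is fixed, only the footprint grows)."""
    rng = random.Random(seed)
    buf = list(range(buf_len))
    idx = [rng.randrange(buf_len) for _ in range(n_samples)]
    t0 = time.perf_counter()
    total = 0
    for i in idx:          # same number of "samples" either way
        total += buf[i]
    return time.perf_counter() - t0, total

small_t, _ = sample_time(16 * 16)        # tiny buffer, cache-friendly
large_t, _ = sample_time(1024 * 1024)    # big buffer, cache-hostile
print(f"small: {small_t:.4f}s  large: {large_t:.4f}s")
```

The read count is identical in both runs; only locality changes, which is exactly the argument about texture caches above.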
 
1. it takes up more of your limited bandwidth (RAM to VRAM, via the PCIe x16 slot)
2. the GPU needs to process those textures to create the final image, so it costs more there as well.

Even if the final image has the exact same number of pixels, the process of creating those pixels can be more or less intensive. It is not just a matter of VRAM amount.

When you actually run out of VRAM, you need to start caching from regular RAM, and performance drops to single digits or even less than 1 FPS.
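Rough numbers on when that cliff hits, assuming uncompressed RGBA plus a full mip chain (about 4/3 of the base size; real games compress textures, so these are upper bounds, and the function name and the 512 MB card are hypothetical):

```python
def mipped_size(width, height, bytes_per_texel=4):
    """Approximate size of a texture plus its full mip chain (~4/3 of base)."""
    return int(width * height * bytes_per_texel * 4 / 3)

VRAM = 512 * 1024**2  # a hypothetical 512 MB card

for res in (512, 1024, 2048, 4096):
    per_tex = mipped_size(res, res)
    fits = VRAM // per_tex
    # e.g. 4096x4096: ~85.3 MiB each, only ~6 fit before spilling to RAM
    print(f"{res}x{res}: {per_tex / 1024**2:.1f} MiB each, ~{fits} fit in VRAM")
```

Each doubling of resolution quarters how many textures fit, which is why the failure mode is a sudden cliff rather than a gradual slowdown.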

All that being said, if you have enough of the right resources to spare, you can increase texture size up to a point with a very small performance impact.
 
Of course it affects framerate. Those textures have to be morphed to fit the polygons. What your friend is arguing is basically the same as "The size of the image you're editing in Photoshop doesn't matter because your monitor stays the same resolution". No, processing a bigger image takes longer.
 
All that being said, if you have enough of the right resources to spare, you can increase texture size up to a point with a very small performance impact.

That works if you have the right number of mips and the camera sits just right, so that everything in the scene picks the smallest mips.
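A minimal sketch of the standard mip (LOD) pick: level = log2 of the texel-to-pixel footprint, clamped to the mip chain. The function name is mine; real hardware derives the footprint from screen-space derivatives, but the idea is the same:

```python
import math

def mip_level(texels_per_pixel, num_mips):
    """Pick a mip: level = log2(texel footprint per pixel), clamped to the
    available chain (mip 0 is the full-res base)."""
    lod = max(0.0, math.log2(max(texels_per_pixel, 1e-9)))
    return min(int(lod), num_mips - 1)

# A 1024-wide texture has 11 mip levels (1024 down to 1). A distant surface
# where 64 texels map onto each pixel picks mip 6 -- a 16x16 level -- so the
# extra base resolution is never even fetched.
print(mip_level(64, 11))  # → 6
```

That's why a scene framed so everything is far away can shrug off huge base textures: the big mips just sit in memory unread.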
 
Of course it affects framerate. Those textures have to be morphed to fit the polygons. What your friend is arguing is basically the same as "The size of the image you're editing in Photoshop doesn't matter because your monitor stays the same resolution". No, processing a bigger image takes longer.

best analogy all around.
 
I think your friend is an idiot. Tell him to install Oblivion, then download the texture packs, run it, and see what happens to the framerate.
 
Texture size usually has one of the lowest performance impacts. Other settings like resolution, AA, and the various in-game shader options usually have vastly more effect on performance.

I co-wrote an article investigating Crysis, and the performance difference between the lowest and highest texture settings was only 1.15% on an 8800 Ultra.
 
Texture size usually has one of the lowest performance impacts. Other settings like resolution, AA, and the various in-game shader options usually have vastly more effect on performance.

I co-wrote an article investigating Crysis, and the performance difference between the lowest and highest texture settings was only 1.15% on an 8800 Ultra.

That's good to hear. I've had this discussion with other modelers who are convinced that the size of the texture files makes the bigger impact and that polygon count isn't as critical. This argument usually occurs when setting a poly cap for models that they feel needs to be increased to "properly express themselves". 😉
 
More data + processing = bandwidth + processing burden.
What's so hard to understand?

How much of an actual performance hit occurs, though? In my experience, until you run out of VRAM, or you use a stupid number of separate small textures instead of fewer larger ones to cover the same pixels and polys, the drop in performance is minimal.
 
In my experience, until you run out of VRAM, or you use a stupid number of separate small textures instead of fewer larger ones to cover the same pixels and polys, the drop in performance is minimal.
Those are my findings as well.

Most mid-range and high-end cards have enough bandwidth for texturing, because their primary bottleneck usually sits in the shader units and ROPs, not the TMUs or memory bandwidth.

If a performance wall is hit, it's far more likely because you've run out of VRAM (especially with third-party texture packs) than because you lack the bandwidth to deal with the bigger sizes. In that case you'll see a sudden performance cliff rather than a gradual slowdown.

Stock Crysis ships with about 1 GB of textures in total, and both a 4870 (512 MB) and an 8800 Ultra (768 MB) showed less than a 1.5% performance difference between the lowest and highest texture settings.

Unless you have a lopsided card like an 8800 GTS 320 MB or a GF2 MX, the texture slider should be one of the last things you change to get more performance.
 