are compression transistors worth it...

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
DISCLAIMER: I have little doubt that I suffer from super low IQ, so I won't be surprised if none of what I say makes sense to anyone or if I've misused jargon. I also don't really understand how the DX11 rendering pipeline is set up, although I think I understand DX9.0c and certainly DX6.x. /DISCLAIMER

...when the bus bandwidth could just be doubled instead, and without multisampling, since SGSSAA looks the best and doesn't give the extra compression that multisampling does?

I know there are many factors involved, but I was thinking doubling the bus would definitely be worthwhile in compute/shader-intensive apps. But if high resolutions, MSAA, and high blending/depth precision were desired, then I'm pretty sure lossless compression transistors are worth more (rough numbers at the end of this post).

I think transistors for MSAA are worthless since SGSSAA looks better and should be the future IMO. I know some here disagree with me, but I also think DP shaders will be better for future apps and, if I'm not mistaken, that requires more bandwidth.

I'd say it would be better to increase the bit width of the IMC.

What do you think is more worthwhile--lossless compression transistors or IMCs with significantly more bandwidth?
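
For a sense of scale, here's the very rough estimate I was going off of (the resolution, overdraw, formats, and frame rate below are all just assumptions for illustration, not measurements):

Code:
# Very rough framebuffer-traffic estimate; every number here is an assumption.
width, height = 2560, 1600
samples  = 4        # 4x MSAA
bpp      = 4 + 4    # RGBA8 color + 24/8 depth-stencil, bytes per sample
overdraw = 3        # assumed average overdraw
fps      = 60

bytes_per_frame  = width * height * samples * bpp * overdraw
write_traffic_gb = bytes_per_frame * fps / 1e9
blend_traffic_gb = write_traffic_gb * 2   # blending reads the destination back too

print(round(write_traffic_gb, 1))  # ~23.6 GB/s of raw color/depth writes
print(round(blend_traffic_gb, 1))  # ~47.2 GB/s once blending reads are included
# ...which is why lossless color/Z compression pays off at high res + MSAA + blending.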
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Doubling bandwidth would cost a lot more. The number of transistors needed for texture compression is actually quite low; I have no doubt that texture compression is much cheaper than doubling bandwidth. Most compute tasks don't benefit from increased bandwidth at all, since they tend to be compute-heavy compared to rendering. Rendering is all about doing a ton of approximations rather than doing something precisely, so rendering has high throughput. Scientific calculations, on the other hand, tend to be more complex, which reduces the relative amount of bandwidth needed. Look at bitcoin mining: all those guys downclock their video memory to save power, since more memory bandwidth doesn't benefit them at all.
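
To put it in rough roofline terms (the numbers below are made up purely for illustration, not any real card's specs):

Code:
# A kernel only benefits from more bandwidth if its arithmetic intensity
# (FLOPs per byte moved) sits below the machine's FLOPs-per-byte ratio.
def attainable_gflops(peak_gflops, bandwidth_gbs, flops_per_byte):
    # Classic roofline: min(compute roof, bandwidth * arithmetic intensity).
    return min(peak_gflops, bandwidth_gbs * flops_per_byte)

peak_gflops = 4000.0   # hypothetical shader throughput
bandwidth   = 200.0    # hypothetical memory bandwidth in GB/s

# A blending-heavy rendering pass moves lots of bytes per FLOP...
print(attainable_gflops(peak_gflops, bandwidth, flops_per_byte=4))    # 800.0  -> bandwidth-bound
# ...while a hashing/mining-style kernel stays in registers and barely touches memory.
print(attainable_gflops(peak_gflops, bandwidth, flops_per_byte=200))  # 4000.0 -> compute-bound
print(attainable_gflops(peak_gflops, bandwidth * 2, flops_per_byte=200))  # still 4000.0

Double the bandwidth and the compute-bound case doesn't move at all, which is exactly the bitcoin situation.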

SGSSAA is better than MSAA, but it doesn't matter. SGSSAA takes longer to render, so you lose FPS. Monitor resolutions are going up, so the amount of AA needed going forward will only decline, not increase. It will never again be as high as it was on a 640x480 screen, because a sucky screen needs the most AA to compensate for its suckiness.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
SGSSAA vs MSAA is a subjective choice; SGSSAA has weaknesses that MSAA doesn't, and vice versa. SGSSAA is good because it fixes jagged edges everywhere, but the smoothing that fixes those edges gets applied everywhere too, so it blurs the whole frame. MSAA only fixes the outer edges of objects and leaves textures alone, and it has a much lower performance hit as a result.
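
A crude way to see why the performance hit differs so much (this ignores bandwidth, and the fact that MSAA's extra samples mostly only cost anything along triangle edges):

Code:
# Back-of-envelope shader-invocation counts for a 1920x1080 frame (illustrative only).
pixels  = 1920 * 1080
samples = 4  # 4x AA

no_aa_shades  = pixels            # one shader invocation per pixel
msaa_shades   = pixels            # still one shade per pixel; only coverage/depth are 4x
sgssaa_shades = pixels * samples  # per-sample shading: the whole frame is shaded 4x

print(no_aa_shades, msaa_shades, sgssaa_shades)
# 2073600 2073600 8294400 -> 4x SGSSAA does ~4x the shading work of 4x MSAA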

I had been using SGSSAA or normal SSAA (very similar) in Dragon Age: Origins, where some armor types have awful aliasing, and SGSSAA fixed that. However, when I recently started playing 3D Vision games sitting closer to the monitor, I found the blur SGSSAA causes was really noticeable and distracting from that distance. With MSAA, that blurring doesn't occur.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Doubling bandwidth would cost a lot more. The number of transistors needed for texture compression is actually quite low; I have no doubt that texture compression is much cheaper than doubling bandwidth. Most compute tasks don't benefit from increased bandwidth at all, since they tend to be compute-heavy compared to rendering. Rendering is all about doing a ton of approximations rather than doing something precisely, so rendering has high throughput. Scientific calculations, on the other hand, tend to be more complex, which reduces the relative amount of bandwidth needed. Look at bitcoin mining: all those guys downclock their video memory to save power, since more memory bandwidth doesn't benefit them at all.
Thanks:)

SGSSAA is better than MSAA, but it doesn't matter. SGSSAA takes longer to render, so you lose FPS. Monitor resolutions are going up, so the amount of AA needed going forward will only decline, not increase. It will never again be as high as it was on a 640x480 screen, because a sucky screen needs the most AA to compensate for its suckiness.

I thought increased PPI didn't decrease aliasing at all angles the way a properly rotated-grid AA pattern does. I don't notice that much less aliasing when increasing my monitor's resolution... not like I do with 4x SGSSAA, anyway.
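
Here's roughly how I picture it (the sample offsets below are only illustrative; the actual SGSSAA pattern is driver/hardware specific):

Code:
# Why a rotated/sparse grid handles near-vertical and near-horizontal edges better
# than simply packing in more pixels (which behaves like an ordered grid).
ordered_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]  # ~= doubling PPI
rotated_4x = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def gradient_steps(pattern):
    # Distinct offsets along one axis = grey levels available on an edge that is
    # nearly parallel to the other axis.
    xs = len({x for x, _ in pattern})
    ys = len({y for _, y in pattern})
    return xs, ys

print(gradient_steps(ordered_4x))  # (2, 2) -> near-vertical edges only get 2 steps
print(gradient_steps(rotated_4x))  # (4, 4) -> same 4 samples, but 4 steps per axis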
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
are compression transistors worth it...
What transistors? Texture compression is emulated by pixel shader operations.

...when the bus bandwidth could just be doubled instead, and without multisampling, since SGSSAA looks the best and doesn't give the extra compression that multisampling does?
That makes no sense whatsoever.

I think transistors for MSAA are worthless since SGSSAA looks better and should be the future IMO.
Uh, RGSS/SGSS isn't possible without MSAA. It would also run a lot slower without the MSAA optimizations that are leveraged while running it.

So what you're basically asking for is to go back to pure OGSS (which has extremely poor performance and IQ in comparison) to save some mythical transistors?
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Uh, RGSS/SGSS isn't possible without MSAA. It would also run a lot slower without the MSAA optimizations that are leveraged while running it. So what you're basically asking for is to go back to pure OGSS (which has extremely poor performance and IQ in comparison) to save some mythical transistors?
OGSS sucks. I didn't know that MSAA was required for RGSS.

What transistors? Texture compression is emulated by pixel shader operations.
Didn't know that. I thought the texture units did it but I guess I thought wrong.

I also thought the ROPs did the color and depth buffer compression but I guess I thought wrong about that also.