Originally posted by: BFG10K
Some cards don't perform 2x faster in 16bit anymore, only 30% faster etc., however people playing in 16bit normally have slightly older cards.
I don't care about older cards, nor am I interested in the above comment which seems largely irrelevant to the statement you quoted.
Seems completely relevant to me, this entire thread has been discussing the viable use of 16bit by users since my first comment in it.
What possible benefit can a "postfilter" provide to INCREASE the image quality of an already heavily dithered image?
Do you know how the postfilter works? Because if you did you wouldn't be asking such a question. Like I said before, it's very similar to FSAA except it doesn't change the size of the image and it uses a slightly different algorithm to achieve its colour sampling.
You haven't answered the question - how exactly is a filter going to get something from nothing? - sounds like a waste of gpu time to me.
See my original analogy.
You can re-encode a 128kbit mp3 into 384 all you like, but your source was still a 128kbit one, there's little that can be done to increase the quality.
Rendering in 32 bit colour and then sampling it down at the end is utter lunacy and no-one in their right mind would ever do this except maybe in a compatibility situation where the target display can't handle that much colour precision. If you've already used the resources to render in 32 bit colour you haven't really achieved any savings at all and you may as well just output the unaltered image.
Hence rendering at 16bit internally for speed purposes.
Plus as someone mentioned only a few posts ago, there can be speed benefits to doing this.
3dfx's scheme was so popular because it provided 22 bit colour from 16 bit colour for free. There was no performance hit from using the postfilter.
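For anyone wondering what a postfilter actually does, here's a rough sketch in C. This is a toy illustration only, not 3dfx's actual scanout filter, and the function names are made up: it averages each dithered RGB565 pixel with its neighbour at 8 bits per channel, which recovers some of the intermediate shades the dither pattern encoded, without ever touching the 16 bit framebuffer itself.

[code]
/* Toy post-filter sketch (NOT 3dfx's actual algorithm): reconstruct
 * intermediate shades from a dithered RGB565 framebuffer by averaging
 * each pixel with its right-hand neighbour at 8 bits per channel.
 * Dithering spreads the quantisation error across neighbouring pixels,
 * so a small blur at scanout recovers roughly an extra bit or two per
 * channel while the framebuffer stays 16 bit. */
#include <stdint.h>

static void unpack565(uint16_t p, int *r, int *g, int *b)
{
    *r = (p >> 11) & 0x1F;   /* 5 bits */
    *g = (p >> 5)  & 0x3F;   /* 6 bits */
    *b =  p        & 0x1F;   /* 5 bits */
}

/* Filter one scanline of RGB565 into 8-bit-per-channel output. */
void postfilter_scanline(const uint16_t *src, uint8_t *dst_rgb, int width)
{
    for (int x = 0; x < width; x++) {
        int r0, g0, b0, r1, g1, b1;
        unpack565(src[x], &r0, &g0, &b0);
        /* At the right edge, reuse the last pixel as its own neighbour. */
        unpack565(src[x + 1 < width ? x + 1 : x], &r1, &g1, &b1);

        /* Expand both samples to 8 bits, then average them. */
        dst_rgb[x * 3 + 0] = (uint8_t)(((r0 << 3) + (r1 << 3)) / 2);
        dst_rgb[x * 3 + 1] = (uint8_t)(((g0 << 2) + (g1 << 2)) / 2);
        dst_rgb[x * 3 + 2] = (uint8_t)(((b0 << 3) + (b1 << 3)) / 2);
    }
}
[/code]

Because this runs on the way out to the display rather than in the rendering loop, it costs nothing in fill rate, which is the point being made above about it being "for free".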
No 3D card has EVER offered 8bit only colour, it's ALWAYS been 16bit, the Voodoo 1 was 16bit, the Verite 1 was 16 bit.
Can you read? Where did I say 8-bit only? I said 8-bit as an option.
Can you read? Name ANY 3d card that offered 8 bit output.....
I can assure you for the next 3+ generations of cards 16bit WILL be an option,
It might be an option for purely compatibility reasons but most of the internal rendering will probably be done at a higher precision anyway.
optional, not 100% 128bit only.... so as I originally said, for the next 3+ gens 16bit will be an option.
Perhaps by the 3rd gen in 18-24 months games will have THAT many passes that 16bit really is truly horrible, but right now it's NOT "truly horrible" to the point of EVERYONE having to switch to 32bit.
I think that 32bit and 16bit will still be options, whether that means feeding it 16 / 32bit textures and processing in 128bit internally, or feeding it 16bit / 32bit textures and processing fully in 32bit / 16bit.
It's much better and cheaper to optimise the hardware to do higher precision and then just scale it down rather than develop full pathways for all colour depths and try to optimise them all.
sounds logical to me.
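To make the "do the maths at higher precision, then just scale it down" idea concrete, here's a minimal sketch in C. The names and the RGBA byte layout are illustrative, not any real driver's API: the pipeline works at 8 bits per channel (or more) internally and only re-quantises to RGB565 when the result hits the 16 bit framebuffer.

[code]
/* Minimal sketch of "render at high precision, scale down on output".
 * Illustrative only: the internal pipeline stays at 8 bits per channel,
 * and the framebuffer write re-quantises to RGB565. */
#include <stdint.h>

static uint16_t pack565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Write an internally computed RGBA8888 pixel to a 16 bit framebuffer.
 * Assumes R is in the top byte of the 32-bit value. */
void write_pixel_16(uint16_t *fb, int index, uint32_t rgba8888)
{
    uint8_t r = (rgba8888 >> 24) & 0xFF;
    uint8_t g = (rgba8888 >> 16) & 0xFF;
    uint8_t b = (rgba8888 >> 8)  & 0xFF;
    fb[index] = pack565(r, g, b);   /* alpha is dropped on writeout */
}
[/code]

Only the final write changes between 16bit and 32bit output in this scheme, which is why a single optimised high-precision path is cheaper than maintaining separate pipelines per colour depth.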
Totally true, but it will still be an option - and if it gives even a 15% increase in speed, some people will use it (like me) if they can't afford a top end card.
At some stage there won't be any performance difference at all and like I said before, it'll just be there for compatibility reasons. Even now some cards are faster in 32 bit mode than they are in 16 bit mode because their HSR schemes stop working in 16 bit mode, plus the crossbar controllers aren't being utilised to their full capability.
I've never seen a card which is faster in 32bit colour than in 16bit colour.
I'd much rather have full detail 16bit than most detail 32bit - ie: some users may well turn all the stuff "off" and stay in 32bit, whereas I would prefer to keep it on and drop to 16.
*I* don't notice it frequently, and this particular topic is OPINION so you can't really argue it - there's no fact to be found here.
Carmack and Sweeney are demanding 128 bit for good reasons, reasons which I agree with 100%. Whether you can see a difference or not is largely irrelevant.
It's completely bloody relevant, if I get more frames from 16bit I'll switch to it - whether they petition manufacturers to include it or not.
Re-read what you just quoted... note the word in capitals..... it starts with o and ends in pinion.
Therefore 16bit is still a viable rendering method - whether you or Carmack like it or not, some people will use it.
(Example) I purchase an R9700 now.
In 12 months' time SoF 3 comes out using the D3 engine with 10x more detail than id put into Doom 3; it runs ass slow in 128bit on the card, ass slow in 32bit on the card, and just runs OK in 16bit.
I'll damn well run it in 16bit until I can afford a new card.
Sure don't, ain't gonna stop people using it though.
It is when they see it in 128 bit colour.
No it's not - as I said before this is O.P.I.N.I.O.N, hence if *I* feel I'm getting a smoother experience in 16bit I'll damn well do it.
P.S. I'm not the only one that doesn't care about 32 / 16 bit differences; half the people at the LANs I go to with MX cards whine about low frames - I drop them to 16bit and they are like "wow, smooth" and don't notice any image quality differences.
Sure, future games might show more and more of a difference with the dithering in smoke and fog, to a point where these people do notice it - then they might start using 32bit instead of 16!
Like I said it's opinion, you can't dictate what the users can / can not notice - there's always going to be someone out there who doesn't notice or care.
(a good analogy of this would be users who don't notice 60hz flicker on monitors)
The Kyro I/II cards render at 32-bit internal regardless of the external framebuffer precision.
True, though there still is a small performance difference between 16 bit and 32 bit mode because it will load 16 bit textures when 16 bit colour is requested.
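As a rough sketch of what that means in practice (assumed behaviour, not actual Kyro driver code): a 16 bit texel gets expanded to 8 bits per channel before the chip's 32-bit internal blending, so the maths runs at full precision but the low bits are just replicated from the 5/6-bit source - which is where the remaining 16 bit quality difference comes from, not from the internal pipeline.

[code]
/* Sketch of expanding an RGB565 texel to 8 bits per channel for a
 * 32-bit internal pipeline.  Illustrative only. */
#include <stdint.h>

void expand565_to_888(uint16_t texel, uint8_t *r, uint8_t *g, uint8_t *b)
{
    uint8_t r5 = (texel >> 11) & 0x1F;
    uint8_t g6 = (texel >> 5)  & 0x3F;
    uint8_t b5 =  texel        & 0x1F;

    /* Bit replication fills the low bits so 0x1F maps to 0xFF. */
    *r = (uint8_t)((r5 << 3) | (r5 >> 2));
    *g = (uint8_t)((g6 << 2) | (g6 >> 4));
    *b = (uint8_t)((b5 << 3) | (b5 >> 2));
}
[/code]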