Hey, what the hell is up with this? Not one freaking article on what Nvidia is up to within the next month or two. I wonder why. Nvidia HAS to release something with more power than the Xbox during its next product release cycle, either Jan. or Feb. So what the heck are they going to release? There are rumours about the GeForce 4.
Of course, some of the suggestions are outta whack.
GeForce 4: 48-bit color/textures, 128MB memory, 315MHz GPU clock, 644MHz effective 3.1ns memory (322MHz DDR), 6 rendering pipelines.
At least I think part of this is true. First, the true stuff. The 128MB of memory seems about right, although I think the 64MB versions would come out first. 128MB is certainly nice for a card working with 32-bit textures, but it's not really necessary (and I will explain below why I don't think 48-bit color depth is happening soon). The GPU clock of 315MHz seems very much in line with how fast Nvidia's next card should be. Does that mean we should expect GeForce 4s in Jan/Feb? I think it does. Nvidia traditionally releases one major product update and one minor product update each year. We got the minor update in September with two new GeForce 3 cards and a new GeForce 2 card. Certainly we will be seeing the GeForce 4s soon.
I personally would have put in more rendering pipelines. I would have stuck in 8, but the GeForce 4 supposedly only comes with 6. More pipelines enable faster performance when displaying more effects per scene (a rough fill-rate comparison is sketched below). If there's a need for 6, undoubtedly there could be room for 8. In fact, I wouldn't be surprised if more complicated special effects started emerging as a result of this. Programmers would limit their effects to a certain number of passes, and each pipeline can be used (essentially in parallel) to perform a single pass. So more complicated effects will be allowed, but will we bump into the same problem when an effect becomes too complicated and drastically hinders the card's performance?
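Here's a minimal back-of-the-envelope sketch of what 6 vs. 8 pipelines would buy in theoretical fill rate, assuming the rumoured 315MHz clock and one pixel written per pipeline per clock (multitexturing, overdraw and memory stalls all ignored, so this is an upper bound, not a benchmark):

```python
# Rough fill-rate comparison, assuming one pixel written per pipeline
# per clock -- an upper bound, not a benchmark.
GPU_CLOCK_MHZ = 315  # rumoured GeForce 4 core clock

def fill_rate_mpixels(pipelines, clock_mhz=GPU_CLOCK_MHZ):
    """Theoretical peak fill rate in megapixels per second."""
    return pipelines * clock_mhz

print(fill_rate_mpixels(6))  # 1890 Mpixels/s with the rumoured 6 pipelines
print(fill_rate_mpixels(8))  # 2520 Mpixels/s if they had gone with 8
```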
The memory is a little faster than I would expect it to be. 600MHz effective (300MHz DDR) is what I would expect, or rather, for the initial release, I would expect the same speed as the GF3 Ti500.
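For what it's worth, here's a quick sketch of what that memory clock would mean for bandwidth, assuming the GeForce 4 keeps the same 128-bit bus as the GeForce 3 (the bus width is my assumption, not part of the rumour):

```python
# Peak memory bandwidth estimate, assuming a 128-bit (16-byte) bus like
# the GeForce 3 -- the bus width is an assumption, not part of the rumour.
BUS_WIDTH_BYTES = 128 // 8  # 16 bytes moved per transfer

def bandwidth_gb_s(effective_mhz, bus_bytes=BUS_WIDTH_BYTES):
    """Peak bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return effective_mhz * 1e6 * bus_bytes / 1e9

print(bandwidth_gb_s(644))  # ~10.3 GB/s at the rumoured 644MHz effective
print(bandwidth_gb_s(600))  # ~9.6 GB/s at the 600MHz I'd have guessed
```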
What I do have a serious problem with is the mentioned 48-bit color depth:
1) Not very many monitors support anything higher than 32-bit color, let alone 40-bit or 48-bit (40-bit would be a little more logical than 48-bit if greater mathematical color precision is required).
2) Can you even put 40-bit or 48-bit color out over the standard connector most monitors use? (That might need checking; I think it is possible, but what would be the point?)
OK, so monitors won't be using it, so it must be an internal thing...
3) The textures will occupy a much larger share of memory, making a 64MB card behave essentially like a 32MB GeForce 2 running 32-bit textures (very performance hindering).
4) The textures will require greater memory bandwidth. Current 32-bit textures and transfers within a card's own memory are already too slow for the GF3, so what good is it going to do the GF4 to use textures 150% the size? (Which of course brings me to the next point.)
5) New texture compression formats will be required for anything greater than 32 bits.
6) It would mean an actual performance hit of roughly 50%, since almost every calculation would have to work with 50% more bit depth.
Unless the GeForce 4 is 70% faster than a GeForce 3's peak performance, I highly doubt anything higher than 32-bit will be used.
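To put some rough numbers on the footprint and bandwidth complaints (points 3 and 4), here's a sketch using a single hypothetical uncompressed 1024x1024 texture; the size and the absence of compression are illustrative assumptions, not anything claimed in the rumour:

```python
# Footprint sketch for one hypothetical uncompressed 1024x1024 texture.
# The resolution and the lack of compression are illustrative assumptions.
WIDTH, HEIGHT = 1024, 1024

def texture_mb(bits_per_texel):
    """Uncompressed texture size in MB (1 MB = 2^20 bytes)."""
    return WIDTH * HEIGHT * bits_per_texel / 8 / 2**20

print(texture_mb(32))  # 4.0 MB per texture at 32-bit
print(texture_mb(48))  # 6.0 MB per texture at 48-bit: 150% of the size,
                       # so the same 50% premium hits memory bandwidth too
```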
We are currently in the FSAA development stage of things, and FSAA still has a little further to go. With the GeForce 4 I have no doubt that modes higher than 4x FSAA will be available. I think 6x FSAA (to start) and maybe 8x FSAA will be offered. 10x FSAA would require monstrous power; I highly doubt Nvidia has done anything more than double or quadruple their ability to render an FSAA scene. Going to 10x would mean something like 6 times the power would be required (or, put another way, accepting roughly 6 times lower performance when rendering a scene).
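For a sense of scale, here's a crude sketch of FSAA cost assuming brute-force supersampling, where Nx FSAA means rendering N samples for every displayed pixel (real hardware cuts corners with tricks like multisampling, so treat these as upper bounds):

```python
# Crude FSAA cost sketch assuming brute-force supersampling: Nx FSAA
# renders N samples for every displayed pixel. Treat as an upper bound.
def samples_per_frame(width, height, fsaa):
    return width * height * fsaa

base = samples_per_frame(1024, 768, 1)
for mode in (2, 4, 6, 8, 10):
    factor = samples_per_frame(1024, 768, mode) / base
    print(f"{mode}x FSAA: {factor:.0f}x the pixel work of no FSAA")
```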
Certainly the max resolution is good and solid. Not too many people will bother using a resolution higher than 1800x1600; in fact, what would be the point when 1024x768 looks so good with 8x FSAA?
So why is Nvidia so quiet about their apparent backstab of Microsoft (never mind the fact that, for their business to keep doing well, it would be wise of them not to miss this product cycle)?
I think this whole Xbox thing was bad timing. Good for one Christmas, until Nvidia's GeForce 4 comes out two months later and makes the Xbox eat real PC dust.