Originally posted by: Rollo
Originally posted by: gururu
Blaming other companies for this apparent nVidia deficiency is silly. If their cards weren't fast enough for 32-bit DX9, why the heck did they implement it?
Perhaps to give developers the tools they needed to code for what everyone knew the spec would end up with? (32-bit)
Unlike ATI, who don't really seem to care about developer relations, nVidia has a huge partnership with the software community. They were the first to bring hardware T&L to market, the first to bring FP32, and the first to bring SM3, among others.
Games are in development for years. You can't just expect developers to work in 24-bit SM2 for all of 2003 and 2004 because ATI can't get 32-bit SM3 together, can you, Gururu? It would stagnate the industry at the level the tech has been since fall 2002.
Even so, ATI worked towards the DX9 standard, which is Microsoft's standard, and if you think about it, that's the same standard the software developers work towards.
Why implement something just so you're the first to have it? Wouldn't it be better to bring it out when it works well instead of when it barely works?
All software developers are working with 24-bit SM2 because they HAVE to, because of the DX9 standard that is out there. 24-bit precision and SM2 are the minimum requirements, meaning there is room for improvement, but developers are still limited to the confines an API like DX9 puts on them; they cannot expand past it, so they can only go up to 32-bit and SM3. 32-bit on the FX range was disastrous and gave it very low performance, while SM3 is still only emerging and is mostly unused in games, though it is being implemented to partial effect. They can't bring out a game built to a higher standard like 64-bit SM4 if there is no API out to support it at the time.
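To make the "confines of the API" point concrete: under D3D9 an engine queries the device caps at startup and picks a shader path from whatever the card reports, and it simply can't target anything the API doesn't expose. Here's a minimal C++ sketch of that check (assuming the standard d3d9.h/d3d9.lib on Windows; a real engine would inspect many more caps bits than just the version number):

```cpp
// Minimal D3D9 capability check: choose a shader path based on what
// the installed card reports. Sketch only; error handling trimmed.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return 1;

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        std::printf("SM3 path (ps_3_0)\n");   // e.g. GeForce 6800
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        std::printf("SM2 path (ps_2_0)\n");   // e.g. Radeon 9700/X800
    else
        std::printf("ps_1_x / fixed-function fallback\n");

    d3d->Release();
    return 0;
}
```

Whatever extras a card has beyond what those caps expose, the developer has no way to reach them through the API.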
Originally posted by: Rollo
Games are in development for years. You can't just expect developers to work in 24-bit SM2 for all of 2003 and 2004 because ATI can't get 32-bit SM3 together, can you, Gururu? It would stagnate the industry at the level the tech has been since fall 2002.
You said games are in development for years. Then what about UE3, which is being developed right now? Right now its developers are building towards the standard of an API, another piece of software. It isn't the hardware that gives developers what they need; they build and develop their games towards APIs like Microsoft's. Hardware companies do the same, otherwise it would be a mixed bag of goodies that no developer could ever hope to build an engine towards.

So to answer the quote above: it's not up to the hardware developers to decide what to build their cards towards, it's up to the company that makes the programming tools. Of course, after that, it's up to the hardware developer to add in extra bonuses if they want to, and then it's up to the software developer to take advantage of them. But again, as you've said, games are in development for years, so going back to my example: UE3 is working towards the DX10 standard, and that's what hardware will support when it is released. When that hardware arrives with its extra bonuses, the UE3 developers won't be able to take advantage of them, as they'll be close to the deadline to publish the game. Only games coming out well after the standard API is released will be able to add those extras in.
Obviously ATI was slightly wrong-footed by the 6800 and SM3, and increasingly more games are making PARTIAL use of it at the moment. And again, according to many reviewers and techies, the difference between 24-bit and 32-bit is negligible. Compare either of those to 16-bit, though, and there is a significant amount of difference.
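To put rough numbers on "negligible": FP16 carries 10 mantissa bits, FP24 (the DX9 full-precision minimum) carries 16, and FP32 carries 23. A small C++ sketch that truncates a float's mantissa down to each width shows the size of the gap (a simplification: truncation only, ignoring rounding modes and the formats' different exponent ranges):

```cpp
// Approximate FP16/FP24/FP32 precision by truncating a 23-bit float
// mantissa down to 10, 16, and 23 bits respectively.
#include <cstdio>
#include <cstring>
#include <cstdint>
#include <cmath>

float truncate_mantissa(float v, int bits) {
    uint32_t u;
    std::memcpy(&u, &v, sizeof u);
    u &= ~((1u << (23 - bits)) - 1u);   // zero the low mantissa bits
    std::memcpy(&v, &u, sizeof v);
    return v;
}

int main() {
    const float x = 3.14159265f;
    const struct { const char* name; int bits; } fmt[] = {
        {"FP16", 10}, {"FP24", 16}, {"FP32", 23},
    };
    for (const auto& f : fmt) {
        float q = truncate_mantissa(x, f.bits);
        std::printf("%s: %.8f (error %.2e)\n", f.name, q, std::fabs(x - q));
    }
    return 0;
}
```

On a value like pi, the FP16 result is off by around 10^-3 while FP24 is off by around 10^-5, which is why the 24-vs-32 gap rarely shows up on screen while 16-bit can produce visible banding.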
And with the advent of SLI for both companies, where we could possibly see no more refresh cards and no more 6-to-9-month refresh cycles, there is no way we will be seeing new standards and innovations in cards for at least 18 to 24 months.