Dave
I think you'll find the implementation is different.
I'm just going on what the dev team stated. They have a list of the differences between the shader versions on Bungie's site (IIRC, maybe GB's) - AC isn't listed there either.
So, you don't think R400 was completely unviable then?
Obviously ATi did, for whatever reason.
The main benefit for high precision is with longer shaders.
Of course, but we have a long way to go before we see parts that can run shaders of that length at reasonable speeds in real time. With the Quadro parts, one frame every ten seconds is a lot better than one frame every ten minutes. As it stands now, with the speed of the parts that are available, FP16 is plenty the vast majority of the time.
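To put rough numbers on why precision mainly matters as shaders get longer, here's a toy C sketch - entirely my own construction, nothing from any IHV's hardware or drivers. The fp16ish helper is a hypothetical stand-in that models FP16's ~11 significant bits by masking mantissa bits; it ignores FP16's reduced exponent range and denormals. The point is just that per-instruction rounding error compounds as the dependent instruction chain gets longer:

```c
/* Minimal sketch: quantize every intermediate result to roughly FP16's
   11 significant bits and watch the error grow with the number of
   dependent operations.  The multiply-add chain is an arbitrary stand-in
   for a shader's dependent instruction stream. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>
#include <math.h>

/* Truncate a float to ~11 significant bits (FP16 keeps 10 explicit
   mantissa bits plus the implicit leading 1).  Hypothetical helper:
   it only models mantissa precision loss, not FP16's exponent range. */
static float fp16ish(float x)
{
    uint32_t u;
    memcpy(&u, &x, sizeof u);
    u &= 0xFFFFE000u;   /* keep sign, exponent, top 10 mantissa bits */
    memcpy(&x, &u, sizeof u);
    return x;
}

int main(void)
{
    const float a = 1.0173f, b = 0.9127f;  /* arbitrary constants */
    float full = 0.5f, low = 0.5f;

    for (int i = 1; i <= 64; ++i) {
        full = full * a + b;           /* FP32 all the way through */
        low  = fp16ish(low * a + b);   /* round every intermediate */
        if (i == 4 || i == 16 || i == 64)
            printf("%2d ops: relative error %.2e\n",
                   i, fabsf(full - low) / fabsf(full));
    }
    return 0;
}
```

At a handful of instructions the error stays down in the noise; by dozens of dependent instructions it has grown by an order of magnitude or more, which is exactly why short shaders on today's parts don't show it.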
I don't remember the exact timings, but it was released in the 9700's lifetime, before the 9800 IIRC. There were other considerable advantages to R300 over ATI's previous offerings that made it compelling to release at that time.
However, DX9 was late. There were numerous and considerable changes to DX9 prior to release that pushed back its estimated ship date - PS/VS2.0 Extended being one and PS/VS3.0 being another.
And by releasing very early they gained a significant advantage, in that all of the early DX9 development was done on their hardware. If they ship close to nVidia, who is still the dominant player in terms of marketshare, they will be at a decided disadvantage in getting even treatment, let alone preferential treatment, in PC native development (outside of, say, Futuremark and Valve).
I'll wager that won't happen. Roadmaps are fluid and they usually change dependent on numerous market conditions. For instance, R360 and NV38 are very late additions to the roadmaps - neither vendor had planned on third-generation PS/VS2.0 parts when they started down the DX9 path, but as things have evolved that's what became the sensible option. NV's roadmap a while back also stated that NV40 would be there instead of NV38, but that's not possible, or sensible.
I was always under the impression that NV38 was a given. They did it with the NV1x core, and then they did it with the NV2x core. For NV55 by Longhorn: NV40 Q1 '04, NV45 Q4 '04, NV48 Q2 '05, NV50 Q4 '05, NV55 Q2-Q3 '06. I figure they will slip at least one quarter with one of the parts, possibly two by the time NV55 comes around. Figuring that in, and figuring that they will have another x8 part, that would still have them right around NV55 by the time Longhorn hits.
And if they don't have any other significant console contracts, annoying MS is the last thing that NV will be doing from here on in.
The XBox1 went real well for them on a financial basis, as long as we ignore profit completely. It's quite obvious that nVidia is very willing to annoy Microsoft (not that I think that's the brightest move) given how this generation went.
Regardless of what you may think, having the XBox at the moment is giving them considerable leverage as to what they can do - without the XBox that leverage goes, and missing DX specifications will be the last thing they can afford, as this is still by far their biggest revenue stream.
MS has already hurt them with its DX specs for this generation, and nVidia's marketshare surpassed Intel's based on the last numbers I saw (Mercury). nVidia has been doing a lot of things to irritate MS for some time: they offer proper Linux support, which ATi has refused to do; they started a legal battle over the NV2A's pricing structure; and they have very actively supported and promoted OpenGL native developers. I'm not saying any of this is wise long term, but the projects that relied solely on MS have not given them the sort of financial return that was hoped for, not even close.
R400 was initially scheduled for release in July '03 - development had probably been occurring for 12-14 months prior to that, and it had probably first been sent to the fab at the end of '02. There was considerable uncertainty about exactly what DX9 would end up being and how long it would last. It wasn't known until very late that it would include the PS/VS3.0 specifications, and that was the first real indication that DX9 would be lasting longer than usual.
So now you are saying that the R400 was no good, and that ATi decided to throw it away? By the sounds of it, things are very confused at ATi.
There are considerable changes over PS/VS2.0 required to hit PS/VS3.0, and the instruction lengths are the least of the issues.
Of course, if instruction length were the only issue and ATi managed to get the FBuffer to go from the PR department into their parts, then the R350 would be PS/VS3.0 compliant.
These will not be simple refresh products if they hit PS/VS3.0.
I'm sure you see it that way; you have been rather obsessed with shaders for a long time now. It's been around a year since DX9 hit and we have two shader-heavy games, one of them worth owning. DX9 is shaping up to be the slowest-adopted DirectX revision in a long time - since the Glide days, anyway. Clearly, if what you were talking about before with IHVs spending time on what matters is real, then VS/PS3.0 is going to be a checkbox feature and that's about it. If all they do is add VS/PS3.0, it will be the biggest letdown from a hardware 'generation' in a long time.
ATI may only be responsible for the RTL source, which means that whoever MS asks to fab may end up actually doing the layout, etc. With the model MS have chosen, it's not actually a given that ATI will necessarily be tied to any fabbing plans that MS are rooting for.
But it would be utterly foolish for them not to get a part in devs' hands ASAP on the PC side if they want to exploit the advantage that the XBox2 contract can give them.
BFG-
That isn't true at all and you know it. I've said many times that I consider their shader reorderer to be a valid optimization. What I do consider cheating is application detection, and Carmack and Futuremark do as well.
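For anyone unclear on the distinction: a reorderer reschedules the instructions it is given and produces identical output, while application detection swaps in different work when it recognizes a specific executable or shader. Here's a toy C sketch of the reordering side - my own construction, not nVidia's driver logic. The two functions do the same math with the same groupings, only scheduled differently; the second keeps fewer temporaries alive at once, which is reportedly where the win comes from on NV3x, where heavier register usage costs throughput:

```c
/* Minimal sketch: two schedules of the same shader-style math.  Only
   independent instructions are moved and no grouping of the arithmetic
   changes, so the results are bit-identical - which is what makes this
   kind of rescheduling a valid optimization rather than a cheat. */
#include <assert.h>
#include <stdio.h>

static float naive(float a, float b, float c, float d)
{
    /* Compute everything up front: t0..t3 are all live at once. */
    float t0 = a * b;
    float t1 = c * d;
    float t2 = a + c;
    float t3 = b + d;
    return (t0 + t1) * (t2 + t3);
}

static float reordered(float a, float b, float c, float d)
{
    /* Same expression, consumed as soon as produced: at most two
       temporaries are live at any point. */
    float s = a * b;
    s += c * d;                   /* t1 folded in, freed immediately */
    float t = (a + c) + (b + d);  /* sums consumed right away        */
    return s * t;
}

int main(void)
{
    float x = naive(1.5f, 2.25f, 3.0f, 0.5f);
    float y = reordered(1.5f, 2.25f, 3.0f, 0.5f);
    assert(x == y);               /* bit-identical results */
    printf("%f == %f\n", x, y);
    return 0;
}
```

A detection path, by contrast, would check *which* application is running and hand back something cheaper - the output is no longer guaranteed to match, and that's the line Carmack and Futuremark drew.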
Think about it - Futuremark releases a patch and approves the drivers for use, saying that they have eliminated all invalid optimizations. Then, some time later, they realize that they don't approve of the PS optimizations that were present in the drivers the entire time. They worked around all the other optimizations but not the one for the pixel shader test, and they didn't notice anything was going on until some time after they said things were OK? nVidia's driver team must be far beyond genius - godlike, even - to use application detection that relies on differing techniques for each subsegment of a test that not even the developers can catch after claiming they had. Also pretty interesting that no one can spot the IQ differences.
Which have been disproven?
Every time someone buys an ATi card, you get so excited you go out and kick a puppy. No one can disprove this to me, therefore it obviously contains some truth, right?
Show me where nVidia, Microsoft or otherwise have stepped forward and publicly denounced his findings.
There were no 'findings'; he flat-out lied. Gabe Newell stated that HL2 would ship September 30th, 2003 - that was a lie (and he knew it was a lie when he said it). Gabe Newell then stated that they would release a public version of the benchmark used for HL2 on September 30th, 2003 - that was also a lie (another he knew was such when he stated it). Some other lies:
Working closely with NVIDIA (according to Gabe), Valve ended up developing a special codepath for NVIDIA's NV3x architecture that made some tradeoffs in order to improve performance on NVIDIA's FX cards. The tradeoffs, as explained by Gabe, were mainly in using 16-bit precision instead of 32-bit precision for certain floats and defaulting to Pixel Shader 1.4 (DX8.1) shaders instead of newer Pixel Shader 2.0 (DX9) shaders in certain cases.
Cross-reference that with the more up-to-date comments from Valve that Dave posted on B3D a while ago (pay close attention to the details). Then there was the whole driver debacle that Newell insisted upon (not to mention going on about spending 5x as long on nVidia hardware while not even using MS's compilers for the DX9 codepath). What would you have said if Carmack had demanded that only the latest official drivers be used to run the DooM3 benches when they hit? Not a lie, but without a doubt dishonest. Newell is a PR boy for ATi, just as BB is for nVidia. Take anything they say and ignore it; it's that simple. The reason I've stated we should avoid this discussion is that it is best to wait until the game ships. If the results are comparable to Newell's PR event, I'll gladly agree that you can harp on me until you get sick of it, in any way you want. I'm not in the least worried about it, either.
