Originally posted by: BFG10K
Halo? Given it runs on the X-Box's DirectX 8.x hardware I can't imagine there'd be a staggering amount of PS 2.0 code in there, if any at all.

Look at Halo performance with PS 2.0, real staggering difference there isn't it?
I have not seen any other source that has suggested this kind of complicity and conspiracy regarding the development of the Direct3D standards; there has been no general outcry of "foul!" or "unfair!"
Originally posted by: Rogodin2
Genx
Pulling numbers out of your arse again, eh?
Post your source (and it better be a valid one) for the percentages.
rogo
ATi doesn't have a value DX9 card. ATi only sells DX9 cards that appear to perform at least acceptably in DX9 benchmarks/games. The 5200 ("value DX9 GPU") doesn't perform acceptably with DX9 shaders, thus, as I said, I'm not including it in the "DX9 market" just to prop up some marketer's numbers. And even the 5600 performs piss-poorly in current DX9 tests, so including it as a DX9 target developers should aim for is unrealistic.

"NVIDIA's share among all DirectX 9.0-supporting graphics processors shipped skyrocketed to 60%, obviously, thanks to 70% share in the Value DirectX 9.0-compliant GPUs market."
PP (partial precision) for the NV3X under HL2 also showed the R9800 with a substantial lead, while Halo has the 5900 ahead of the 9800.
And the one actual game on that list that is out and that people want to own is as fast or faster on the 5900. That is what I can see with crystal clarity.
Refrast doesn't handle some of those properly. If refrast can't do it, then no way should any board be expected to.
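For reference, this is roughly what a refrast check involves - a minimal sketch assuming the DirectX 9 SDK, with a made-up helper name, not anyone's actual test code. Creating the device as D3DDEVTYPE_REF instead of D3DDEVTYPE_HAL runs everything through Microsoft's reference rasterizer, whose output is the spec's definition of correct:

    #include <d3d9.h>

    // Hypothetical helper: create a device on the reference rasterizer.
    IDirect3DDevice9* CreateRefrastDevice(HWND hWnd)
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return NULL;

        D3DPRESENT_PARAMETERS pp = {0};
        pp.Windowed         = TRUE;
        pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
        pp.BackBufferFormat = D3DFMT_UNKNOWN;   // use the current display format
        pp.hDeviceWindow    = hWnd;

        IDirect3DDevice9* device = NULL;
        // Refrast is pure software, so software vertex processing is required.
        d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_REF, hWnd,
                          D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &device);
        return device;   // NULL on failure
    }

Anything a shader test renders on this device, however slowly, is what boards can fairly be measured against.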
PS2.0A is part of the standard.
Not according to this: http://www.extremetech.com/article2/0,3973,1354541,00.asp

Sorry, but I no longer trust any of Anandtech's numbers anymore.
No, see the PS2.0A results here:
http://www.tech-report.com/reviews/2003q4/geforcefx-5950ultra/index.x?pg=11
Notice there's no real difference. Also, the issue of some of the FX's scores being 0 is labeled as "not exposed in drivers yet." How many more months do they need?
With an obvious devotion to creating such an impartial benchmark, it is a wonder that sites don't simply run ShaderMark and call it a day.
As for the comments on an impartial benchmark...
I did. In fact I used the R100's AF as well. Sure, there were some things I didn't like about it, but in 2000 when it made its debut the cards around it couldn't even run AF at playable levels, while ATi was offering 64-tap anisotropic for essentially free. Again, you can disable it and gain performance, unlike nVidia's DXT1, which loses performance when enabled.

Hehe, try actually using it. It has a hideous level of aliasing.
So in other words there's no registry switch and you have no evidence at all to support your claim that nVidia's driver-hiding is not there to stop cheat detection from working.

So far you have managed to say that 3DMark2K3 has cheats, and that is the only thing.
No they aren't. If the application isn't requesting AF then it's up to the driver to determine how it does it. ATi's drivers aren't overriding an application's request, they aren't using application detection, and they're exhibiting clear and consistent behaviour across the board. nVidia's drivers are not and simply change things whenever it suits them.

So they are by your definition cheating, or nVidia is not. Either ATi and nV are both cheating or both of them aren't; take your pick.
No, I'm talking about nVidia's further attempts to mask their inconsistencies by changing the screenshots so they no longer look like what is being rendered, as Gabe reported they were doing. ATi's quasi-trilinear certainly isn't doing that, and you can see its effects as plain as day in a screenshot.

What are you talking about? Are you talking about ATi not rendering certain effects properly, as was recently discussed at the counterpart to ShaderDay, or something else?
And the FX can't run certain code either. Honestly, what is the point of this useless tangent?

Your board wouldn't be able to run it.
Did you even read what I said? I said that ATi had both the IPC and the brute force advantage, meaning that regardless of the clock speeds you can expect R3xx to come out on top.

IPC when comparing things that don't have comparable clock speed is pretty much the defining characteristic of a Mac user.
Huh? Valve used both nVidia's Cg and Microsoft's compiler.

You compile the code with ATi's targeted compiler and it runs on an FX; do the same with the FX compiler and see what you get.
It is true. Multiple benchmarks confirm it, multiple developers like Carmack and Gabe confirm it, and Valve's presentation had Microsoft's backing as evidence that the numbers weren't being cooked.

If that was true then it would be a good point.
How does that prove that the FX isn't inferior to the 9xxx line?

The reality is that smaller developers can't afford to support the utterly minuscule market share of high-end DX9 parts as a focal point.
Answer the question.

How is it nonsensical to take issue with your repeated false statements that nV is destroyed in DX9 performance?
Yes it's also been posted that Halo is one of the worst console ports in the history of PC gaming and many people are returning the game because it runs so abysmally on even top-end PC hardware. That's hardly a title that reinforces your position.

This information has been posted hundreds of times around the web.
GD: John, we've found that NVIDIA hardware seems to come to a crawl whenever Pixel Shaders are involved, namely PS 2.0.
Have you witnessed any of this while testing under the Doom3 environment?
"Yes. NV30 class hardware can run the ARB2 path that uses ARB_fragment_program, but it is very slow, which is why I have a separate NV30 back end that uses NV_fragment_program to specify most of the operations as 12 or 16 bit instead of 32 bit."
John Carmack
"Precision has a pretty big impact on the current nVidia chips, so it's important to use half precision data types and the nVidia specific 2.0a HLSL compiler when running on NV3X hardware. Keeping those limitations in mind, it hasn't been too difficult for us to maintain an acceptable level of performance."
James Loe
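To make the Loe quote concrete, here is a minimal sketch (assuming the DirectX 9 SDK with D3DX; the shader and the helper name are invented for illustration) of what "half precision data types and the nVidia specific 2.0a HLSL compiler" amounts to in practice - the same HLSL source compiled against the ps_2_a profile instead of the generic ps_2_0 one, with 'half' temporaries letting the compiler emit partial-precision instructions that NV3x runs far faster than full fp32:

    #include <d3dx9.h>

    // Illustrative shader: 'half' requests partial (fp16) precision.
    static const char g_shader[] =
        "sampler s0 : register(s0);\n"
        "half4 main(float2 uv : TEXCOORD0) : COLOR\n"
        "{\n"
        "    half4 c = tex2D(s0, uv);\n"
        "    return c * half4(0.5, 0.5, 0.5, 1.0);\n"
        "}\n";

    // Hypothetical helper: compile the same source for the generic ps_2_0
    // target or for ps_2_a, the profile scheduled for NV3x-style hardware.
    HRESULT CompileForTarget(BOOL nv3xPath, LPD3DXBUFFER* byteCode)
    {
        const char* profile = nv3xPath ? "ps_2_a" : "ps_2_0";
        LPD3DXBUFFER errors = NULL;
        HRESULT hr = D3DXCompileShader(g_shader, sizeof(g_shader) - 1,
                                       NULL, NULL,        // no defines/includes
                                       "main", profile,
                                       0, byteCode, &errors, NULL);
        if (errors) errors->Release();
        return hr;
    }

Much of the argument in this thread is exactly about which of those two profiles a given benchmark or game used for each card.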
So in other words there's no registry switch and you have no evidence at all to support your claim that nVidia's driver-hiding is not there to stop cheat detection from working.
No they aren't. If the application isn't requesting AF then it's up to the driver to determine how it does it. ATi's drivers aren't overriding an application's request, they aren't using application detection, and they're exhibiting clear and consistent behaviour across the board.
No, I'm talking about nVidia's further attempts to mask their inconsistencies by changing the screenshots so they no longer look like what is being rendered, as Gabe reported they were doing.
Huh? Valve used both nVidia's Cg and Microsoft's compiler.
It is true. Multiple benchmarks confirm it, multiple developers like Carmack and Gabe confirm it, and Valve's presentation had Microsoft's backing as evidence that the numbers weren't being cooked.
How does that prove that the FX isn't inferior to the 9xxx line?
If a Microsoft-compiled, hand-optimised, mixed mode rendering path can't beat ATi's generic full precision path then why do you expect the full precision path to magically soar above the mixed mode path solely on the basis that Microsoft's compiler is used?
Answer the question and stop this charade.
Yes it's also been posted that Halo is one of the worst console ports in the history of PC gaming and many people are returning the game because it runs so abysmally on even top-end PC hardware. That's hardly a title that reinforces your position.
Regardless, I seem to remember reading that Halo is using a similar mixed mode, lowered precision path to the one Half Life 2 is using, so that's hardly the basis for a sound comparison.
And yet again you're ignoring the testimony from two key developers (Carmack and Gabe) and the half a dozen or so other DirectX 9 benchmarks/games that show the opposite of Halo.
GD: John, we've found that NVIDIA hardware seems to come to a crawl whenever Pixel Shaders are involved, namely PS 2.0.
Have you witnessed any of this while testing under the Doom3 environment?
"Yes. NV30 class hardware can run the ARB2 path that uses ARB_fragment_program, but it is very slow, which is why I have a separate NV30 back end that uses NV_fragment_program to specify most of the operations as 12 or 16 bit instead of 32 bit."
John Carmack
So in other words, in order for the NV3x line to have any hope of competing with the R3xx it has to run at a dumbed-down precision, highlighting the fundamental flaws of the architecture that everyone except Ben seems to understand.
Just ignore that the application is requesting trilinear and your theory works fine. Unfortunately for your theory, ATi is in fact ignoring an application request explicitly.
There is one game on the market that has heavy PS usage, including DX9 shaders, that I have any interest in, and it runs faster on nV hardware.
ATI has said on numerous occasions that it will do its filtering method if you select it through the control panel.
What game is that? Gun Metal? Aquamark?
As soon as you start showing SplinterCell benches of 0 for the R3X0 core boards at the highest quality settings I'll consider your point valid.
There is one game on the market that has heavy PS usage, including DX9 shaders, that I have any interest in, and it runs faster on nV hardware.
Halo
Originally posted by: jiffylube1024
Well, I just finished reading Anand's article (all 60 pages of it) and I'm extremely impressed with Anand (yet again). What a solid review.
I'm also impressed with Nvidia. It seems the 52.xx series is finally panning out as the touted "holy grail" it was supposed to be. Aside from a few image quality problems (on both cards), they run neck and neck in the majority of benchmarks.
Yet again Nvidia has stepped to the plate and delivered a huge percentage increase in most of the games it was struggling to keep up in. ATI's card must be commended, as it is either in first place or right behind Nvidia in all the benchmarks. And they still have their trump card, Catalyst 3.8, to play.
Well, Ben, for the reviews I don't do any comparisons - but if you really wanted then we could try the full settings in TR:AoD, which include PS2.0 shadows requiring float buffer support (see the float-buffer sketch after this post)...
However, my comment was rather a wry one since, reading this thread, you appear to have this knee-jerk reaction to any shader app that doesn't show what you like - it's either biased, flawed, poorly coded, irrelevant, or motivated by money. I just found it a little amusing is all.
As for the use of float support in ShaderMark, that doesn't represent bias - I know of numerous developers that are quite vexed by this difference in support; the other elements hardly represent bias either.
And as for the comment about "leaving it at that", well, that's just a bit off really.
But hey, I know your response: it runs better on YOUR computer.
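On the float buffer requirement mentioned in this post: whether a float render target is available to an application is a simple format query against the driver, which is how scores can come back as 0 ("not exposed in drivers yet", as noted earlier in the thread) even when the silicon could manage it. A minimal D3D9 sketch, with a hypothetical helper name and an assumed X8R8G8B8 display format:

    #include <d3d9.h>

    // Hypothetical helper: does the driver expose a 64-bit float
    // texture format usable as a render target?
    BOOL SupportsFloatRenderTarget(IDirect3D9* d3d)
    {
        HRESULT hr = d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8,           // assumed display-mode format
            D3DUSAGE_RENDERTARGET,
            D3DRTYPE_TEXTURE,
            D3DFMT_A16B16G16R16F);     // fp16-per-channel surface
        return SUCCEEDED(hr);
    }

If this returns FALSE, effects built on float buffers simply cannot run on that driver, whatever the hardware could theoretically do.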
Got some links? Looking at the links rever posted, the nV and ATi parts were pretty much equal in Halo without AA/AF, with nV taking a bit of a backseat when AA and AF were used. I hope you don't mean the 1 FPS advantage that nV has over ATi at the unplayable 16x12 resolution. That would be really reaching.

No, how about theirs. HardOCP didn't run a bench (why, I'm not sure), but the other sites show the FX parts beating out ATi's parts at the higher resolutions.
The benches in these reviews do not support that statement. The 2 are pretty much neck and neck.

There is one game on the market that has heavy PS usage, including DX9 shaders, that I have any interest in, and it runs faster on nV hardware.
You don't have a shred of evidence to support your theory that nVidia isn't cheating and all you've done is pulled theories out of thin air. I have evidence in the form of Unwinder's publicly available comments and application that knows nVidia's cheating drivers inside out.

You make the paranoid accusations, you provide the proof. That is how it works.
Yes, the application is requesting trilinear, not trilinear anisotropic. If you enable trilinear in an application do you expect anisotropic filtering to be enabled along with it? I sure as hell don't.

Just ignore that the application is requesting trilinear and your theory works fine.
The application isn't requesting AF and therefore they aren't ignoring anything.

Unfortunately for your theory, ATi is in fact ignoring an application request explicitly.
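For anyone following the filtering argument, this is the distinction in D3D9 terms - a sketch of the API calls involved, not anyone's actual engine code. Trilinear and anisotropic are separate, explicit sampler-state requests, so an application asking for trilinear has said nothing at all about AF:

    #include <d3d9.h>

    // Plain trilinear: linear min/mag filtering plus linear blending
    // between mip levels. No anisotropy is requested anywhere.
    void RequestTrilinear(IDirect3DDevice9* dev)
    {
        dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    }

    // Anisotropic filtering is a different, explicit request.
    void RequestTrilinearAnisotropic(IDirect3DDevice9* dev)
    {
        dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        dev->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8);  // e.g. 8x AF
    }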
Right, so you're back to your comment about Gabe being a liar. Yet again you make outrageous claims without any proof and you expect to use them to form the basis of a solid argument for your case?

You want me to start quoting the counterpart to ATi's PR monkey?
Fire away. In fact I encourage you to use the other thread where they're being brought up and most of them have already been dismissed.

Now that nV has had their PR day I can find quotes from developers saying the same type of things - that ATi is cheating.
On one of the websites that had the initially leaked HL2 benchmarks.

WTF? Where did you read anything like that?
But they used it for the mixed mode one.

The problem is that they did not use the same compiler for the different paths.
What tangent might that be? Making comments that are provable by widespread evidence? Of course I do.

Are you sure you want to go on this tangent?
Of course, shader substitution and other such cheats will do that. Both Anand and 3DCenter have verified such shader substitution.

Taking a look at the games I play, the FX comes out on top in damn near every one of them.
But how can you claim that if they did everything in their power to make the fastest path on the nVidia cards on the basis that they didn't do the same for the slower and irrelevant path?

The point of it is that they were trying to show a larger performance rift than there would be if it was done by any reasonably intelligent, impartial developer.
I really have no issues with that, Ben. If Halo runs better on FX cards then that's more power to them.

It is a top-tier game that uses shaders heavily; the only problem is that it doesn't favor your vid card.
Do you deny that Halo is running a mixed-mode, reduced precision path on FX cards, similar to HL2's path?

It is a title that uses DX9 shaders; the game has some levels that are covered in shaders.
Great. Does that mean Carmack, Gabe, [insert all other developers] along with 3DCenter, [insert other reviewers here] are all wrong?

There is one game on the market that has heavy PS usage, including DX9 shaders, that I have any interest in, and it runs faster on nV hardware.
Except precision isn't the only issue here. The other issue is the architecture and how it relies on ridiculously unrealistic methods of instruction scheduling in order to have any reasonable chance of competing.

If there is no benefit to running FP24 then there isn't a big issue.