Originally posted by: FishTankX
Also, remember: Cg has nothing to do with Nvidia's proprietary instructions. Thus things written in Cg could theoretically be made compatible with the ATi platform.
... When GPU hardware grows to allow programs of hundreds, thousands, or even
more instructions, assembly coding will no longer be practical. Rather than
programming each rendering state, each bit, byte, and word of data and control
through a low-level assembly language, we want to express our ideas in a more
straightforward form, using a high-level language.
Thus Cg, "C for Graphics," becomes necessary and inevitable. Just as C was
derived to expose the specific capabilities of processors while allowing higher-level
abstraction, Cg allows the same abstraction for GPUs. Cg changes the way
programmers can program: focusing on the ideas, the concepts, and the effects
they wish to create, not on the details of the hardware implementation. Cg also
decouples programs from specific hardware because the language is functional,
not hardware implementation-specific. Also, since Cg can be compiled at run
time on any platform, operating system, and for any graphics hardware, Cg
programs are truly portable. Finally, and perhaps best of all, Cg programs are
future-proof and can adapt to run well on future products. The compiler can
optimize directly for a new target GPU that perhaps did not even exist when the
original Cg program was written. ...
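To make the excerpt's point concrete, here is a minimal sketch of the high-level form it describes: a complete Cg vertex program in a handful of readable lines, where hand-written vertex assembly would need a separate instruction for every step. The names (VertOut, modelViewProj) are illustrative placeholders, not anything from the thread.

// A minimal Cg vertex program: transform a vertex into clip space and
// pass its color through. The single mul() below stands in for the
// four DP4 instructions an equivalent hand-written vertex assembly
// program would use.
struct VertOut {
    float4 position : POSITION;   // clip-space position
    float4 color    : COLOR;      // color passed to the rasterizer
};

VertOut main(float4 position : POSITION,
             float4 color    : COLOR,
             uniform float4x4 modelViewProj)
{
    VertOut OUT;
    OUT.position = mul(modelViewProj, position);  // 4x4 matrix * vector
    OUT.color    = color;
    return OUT;
}

Because the source names no registers and no instruction set, the same program can be handed to whatever profile a given compiler targets, which is the portability claim the excerpt is making.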
Originally posted by: NFS4
Originally posted by: FishTankX
Also, remember: Cg has nothing to do with Nvidia's proprietary instructions. Thus things written in Cg could theoretically be made compatible with the ATi platform.
But wouldn't Cg work BEST on NVIDIA hardware?
"can't believe people are still attacking nVidia over the extra clipping planes."

Why? Any normal person gets upset when a company tries to deceive them through cheating.

"The extra clipping planes, if included in the actual program, would optimize performance for all 3D card makers."

Except the "extra performance" is a cheat, since it can't possibly apply when playing games normally.

"Nothing is being clipped that can actually be seen."

That's not the issue here at all; the issue is that static clip planes can only exist in pre-rendered sequences. They cannot exist in realtime rendering.

"The 3DMark2003 demo is a fixed-viewpoint demo with no opportunity for the end user to vary his point of view into the scene."

Which is exactly why it's a cheat. When you go off the rails and try to play it like a real game, what happens? Whoops, it's kaleidoscope time.

"Thus clipping what is outside of that viewpoint is a legitimate optimization."

Only if it's dynamic clipping, which it isn't.

"You have nothing to complain about."
Originally posted by: Gstanfor
On the subject of Cg not supporting Pixel Shader V1.4 or other competitors' innovations - that is certainly true when running nVidia's backend compiler on nVidia hardware.
It does not, however, hold true for others supporting Cg. It is the responsibility of ATi and anyone else using Cg to build their own backend compiler for the language into their drivers, and that is where support for things like PS1.4 can be added.
Cg is not a closed standard and not everybody who adopts the standard is forced to do things nVidia's way - they are free to extend Cg.
Edit: this is why I have said ATi is obstructionist in regard to Cg for no good reason. Lack of ATi feature-specific support does not have to be a Cg problem; it's just that ATi wants it to be perceived as a problem.
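As a hedged sketch of what that division of labor means in practice (the shader below is hypothetical, not something from the thread): Cg source itself names no GPU, no profile, and no vendor. Which profile it compiles to is decided entirely by the backend, so a vendor-written backend is where PS1.4-class support would live.

// A tiny, hardware-neutral Cg fragment program. Nothing in the source
// selects a target: nVidia's backend might compile it to ps_2_0 or
// arbfp1, while a vendor-written backend could target a PS1.4-class
// profile instead.
float4 main(float2 uv : TEXCOORD0,
            uniform sampler2D decal,
            uniform float4 tint) : COLOR
{
    float4 texel = tex2D(decal, uv);  // sample the decal texture
    return texel * tint;              // modulate by a uniform tint
}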
Originally posted by: SilentRunning
Is Cg proprietary?
The Cg Language Specification is published and open in the sense that other vendors may implement products based on it. To encourage this, NVIDIA open sourced the Cg Compiler technology under a nonrestrictive, free license.
Vendor implementations of Cg Compilers are typically proprietary and owned by their creators. NVIDIA has developed and owns the NVIDIA Cg Compiler, and other vendors are expected and encouraged to develop their own Cg Compiler products.
link
Nvidia only publishes the Cg language specification for others to use, yet they still own and maintain it. So if Nvidia chooses to make changes to the underlying Cg language, they will be able to create a new compiler concurrently with the changes. This would mean that competing vendors would have to play catch-up every time Nvidia chose to make changes. Microsoft successfully practiced a similar tactic with their OSes and their applications.
Originally posted by: Gstanfor
Originally posted by: SilentRunning
Is Cg proprietary?
The Cg Language Specification is published and open in the sense that other vendors may implement products based on it. To encourage this, NVIDIA open sourced the Cg Compiler technology under a nonrestrictive, free license.
Vendor implementations of Cg Compilers are typically proprietary and owned by their creators. NVIDIA has developed and owns the NVIDIA Cg Compiler, and other vendors are expected and encouraged to develop their own Cg Compiler products.
link
Nvidia only publishes the Cg language specification for others to use, yet they still own and maintain it. So if Nvidia chooses to make changes to the underlying Cg language, they will be able to create a new compiler concurrently with the changes. This would mean that competing vendors would have to play catch-up every time Nvidia chose to make changes. Microsoft successfully practiced a similar tactic with their OSes and their applications.
I'm failing to see your point here. There are two parts to Cg: the language itself and the backend compilers.
The only time nVidia could disadvantage others is through a significant change in the actual language, not its backend.
The backends are vendor specific and totally independent from each other.
It is no different from nVidia releasing a new chip with new capabilities tomorrow. It will not affect Cg the language at all; it will affect nVidia's backend Cg compiler, and ATi will have to answer nVidia's challenge, Cg or no Cg.
ATi may or may not try to modify their own backend compiler in response - it depends on whether they think their chips can handle the modifications or not.
Originally posted by: Gstanfor
Well, don't believe nVidia either then if you don't want to. It doesn't take anything away from the fact that Cg is real and in use in the industry already, and that usage will only increase over time, not decrease.
link page
Originally posted by: Gstanfor
And just what, precisely, do you think nVidia can really change about the core language?
Cg is C for graphics, and C is already as flexible as anyone could ever need. There is hardly a piece of serious software out there that isn't written in C, or that wasn't built by a program written in C. Heck, even C compilers are written in C.
EDIT: All ATi or anyone else has to do is revise their backend compiler to support the new revision of the language, just like they revise drivers for new versions of DirectX or OpenGL. No difference whatsoever.
Originally posted by: rachaelsdad
Originally posted by: Gstanfor
Well, don't believe nVidia either then if you don't want to. It doesn't take anything away from the fact that Cg is real and in use in the industry already, and that usage will only increase over time, not decrease.
link page
Considering the actions of NVidia to do anything to try and get ahead by cheating, like sending out memos to review sites on a card's capability (not their own; think Kyro), I would not believe any altruistic motive could be applied to Cg. Perhaps Cg is out in the industry, but do you believe that developers will choose to use Cg when they realize what the aim of Cg is: to do nothing more than allow NVidia's cards to run better than their competitors'?
RenderMonkey is a tool for high-level shaders and it is being used also. If NVidia wants Cg to be used by everyone, why not just open source it and help get back some of the goodwill they have destroyed?
Originally posted by: Gstanfor
I make no apology whatsoever for the stance I took against the Intel employees on this board at the time, and I see the fanATIcs no differently. They are the single most obnoxious thing about ATi today and will end up damaging the company they think their actions are helping.
"Isn't it pathetic to see all the Rage3D fanboys come over to AT and bash anyone who is positive about NVIDIA or negative about ATI? When in doubt about an ATI fanboy, search their username at Rage3D, for example here where Compddd reveals himself in all his unbiased glory."

This has NOTHING to do with simply saying something positive about Nvidia, as any serious-minded person on this forum should be trying to explain to you. You are simply making one ridiculous, outlandish statement after the next: trying to justify inserting clip planes; making outlandish claims about M$ changing the DX9 spec at the last minute, which is NOT TRUE; justifying wholesale replacement of shader code, forcing not just below-DX9 spec but DX7 T&L!!!; using custom-compiled shaders with Nvidia's PROPRIETARY backend compiler for Cg; and application detection which artificially inflates the frame rate. And this is just the stuff we know about. All done to an independent benchmark program with worldwide influence, in which NO, I repeat, NO custom code is allowed, because it is designed to test pure DX code, which puts all IHVs on a level playing field.

"nVidia originally withdrew their membership and support for BRIBEmark, whoops, that's not the name..., err, QUAKmark, no that's not it either - close though..., 3DMARK 2003, when Futuremark refused to consider benchmark optimizations nVidia put forward in the development stages."

FALSE. Nvidia withdrew in December, 13 months after development started and AFTER everything but bug hunting was complete, a mere 3 months before 3DMark03 was released. They were fully 100% behind 3DMark03 until it became clear that the NV3x was going to have trouble with it, which is due almost entirely to design flaws or poor design choices: limited memory bandwidth, poor single-texture performance, and poor DX9 shader support, among other things.

"nVidia then publicly stated that 3DMARK 2003 was a flawed benchmark that could easily be optimized, and they proved it."

ANY benchmark program or game can easily be optimized for in ways that are not acceptable. It has nothing to do with flaws and everything to do with INTEGRITY. Further, none of the other beta partners, including DELL, think it is flawed or poorly coded. The only IHV who has a problem with it is Nvidia, whose NV3x core has several known problems.

"It would seem Futuremark finally agrees with them."

That has absolutely nothing to do with it, nor does the statement they released support what you just said here.

"I wonder if ATi's 'membership subscriptions' have made up in any way for the damage Futuremark have inflicted upon themselves?"

It is only $5,000 a year for membership. Nor is Futuremark the one doing the damage here. It is also very irritating that some people would go to such lengths to justify, spin, and defend what one IHV has been pulling.
