
FutureMark & Nvidia joint statement on 3DMark03; FutureMark tucks its tail between its legs.


FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
Also, remember: Cg has nothing to do with Nvidia's proprietary instructions. Thus things written in Cg could theoretically be made compatible with the ATi platform.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: FishTankX
Also, remember: Cg has nothing to do with Nvidia's proprietary instructions. Thus things written in Cg could theoretically be made compatible with the ATi platform.

But wouldn't Cg work BEST on NVIDIA hardware?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Cg is the multi-platform language, not RenderMonkey.

ATi (and any other company out there) can leverage Cg every bit as much as nVidia can.

Cg is supported by 20 US universities and is an official part of the curriculum at several. Can RenderMonkey say the same?

Anyhow, never mind my posts. Apparently I know nothing and am only good for a laugh...
 

mikable

Senior member
Sep 23, 2000
303
0
0
Jeez, ANOTHER overhyped thread about some silly non-issue with a stupid benchmark that has no real effect on anything that matters.


They should start another forum just for this topic and put it below the Intel/AMD/Mac forum...
 

rachaelsdad

Member
Aug 26, 2001
130
0
0
Originally posted by: FishTankX
Also, remember: Cg has nothing to do with Nvidia's proprietary instructions. Thus things written in Cg could theoretically be made compatible with the ATi platform.

If this were the case, then the NV30 and NV35 would not run so much slower on Doom 3 when using the standard ARB extensions. They need Cg for hardware-specific optimisations. And just so you do not think I am all pro-ATI: RenderMonkey does not seem to support 1.1 and 1.3 shaders nearly as well as 1.4, and Cg does not support 1.4.

John Carmack stated (in response to Cg and RenderMonkey) that at this stage of the game the last thing we need is a bunch of different tools to support the same thing. Cg is an HLSL; RenderMonkey is, at this stage, just a tool for higher-level shaders.

A meta-tool that would make the use of shading easy and workable on all hardware using OpenGL and DX9 is a great idea. Hardware-specific HLSLs are not the best path for all.

I agree that Cg is not an API, but it does have some very specific paths that cannot be coded on all hardware via GL or DX9. It just feels like an API in embryo form. :)
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Please, don't take my word alone for what Cg can and cannot do.

Take a few minutes to find out for yourself.

From the foreword of the Cg Users Manual:
... When GPU hardware grows to allow programs of hundreds, thousands, or even more instructions, assembly coding will no longer be practical. Rather than programming each rendering state, each bit, byte, and word of data and control through a low-level assembly language, we want to express our ideas in a more straightforward form, using a high-level language.

Thus Cg, "C for Graphics," becomes necessary and inevitable. Just as C was derived to expose the specific capabilities of processors while allowing higher-level abstraction, Cg allows the same abstraction for GPUs. Cg changes the way programmers can program: focusing on the ideas, the concepts, and the effects they wish to create, not on the details of the hardware implementation. Cg also decouples programs from specific hardware because the language is functional, not hardware implementation-specific. Also, since Cg can be compiled at run time on any platform, operating system, and for any graphics hardware, Cg programs are truly portable. Finally, and perhaps best of all, Cg programs are future-proof and can adapt to run well on future products. The compiler can optimize directly for a new target GPU that perhaps did not even exist when the original Cg program was written. ...
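
To make that concrete, here is a minimal Cg-style fragment program of my own (an illustrative sketch, not code from the manual). The whole effect is a few readable lines; the equivalent pixel shader assembly would be hand-managed instructions and registers.

    // Minimal Cg fragment program (illustrative sketch).
    // Samples a decal texture and modulates it by a tint colour.
    float4 main(float2 uv         : TEXCOORD0,
                uniform sampler2D decal,
                uniform float4    tint) : COLOR
    {
        return tex2D(decal, uv) * tint;
    }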
 

rachaelsdad

Member
Aug 26, 2001
130
0
0
Not to say that NVidia's own paper might not be self-serving, but... here is THG on Cg and RenderMonkey.
In the conclusion:
Cg will probably go down as "the first" high level language, and it's definitely the first one to have gained widespread publicity, but I rather doubt that we'll be using it a year from now. Instead, I think it's likely that DirectX programmers are going to be using Microsoft's solution, and OpenGL programmers will be using whichever solution they need for the graphics card they're currently programming.

With regards to RenderMonkey and the D3DXEffect structures, I think these are going to be increasingly popular models for the development of materials in real time graphics software. I hope we will see some more integration with modelling packages, and I'd be surprised if this doesn't come along sooner or later.

To sum up, we've seen a big gamble from nVidia, and a good piece of tools development from ATI. I think they'll both be useful in the long run; certainly Cg has been quite a buzzword for a few months (although its longevity is questionable); and RenderMonkey may be the first iteration of a tool suite that everyone will end up getting to know. I guess we'll just have to wait and see.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Well, don't believe nVidia either, then, if you don't want to. It doesn't take anything away from the fact that Cg is real and already in use in the industry, and that usage will only increase over time, not decrease.

Here is a good starting point for seeing what people think of Cg and are doing with it. Lots of other good info there too.
link page
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
On the subject of Cg not supporting Pixel Shader 1.4 or other competitors' innovations - that is certainly true when running nVidia's backend compiler on nVidia hardware.

It does not, however, hold true for others supporting Cg. It is the responsibility of ATi and anyone else using Cg to build their own backend compiler for the language into their drivers, and that is where support for things like PS1.4 can be added.

Cg is not a closed standard and not everybody who adopts the standard is forced to do things nVidia's way - they are free to extend Cg.

Edit: this is why I have said ATi is obstructionist in regard to Cg for no good reason. Lack of ATi feature-specific support does not have to be a Cg problem; it's just that ATi wants it to be perceived as a problem.
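
As a concrete sketch of the language/backend split (command lines from memory, so treat the exact flag and profile names as approximate): nVidia's own cgc command-line compiler takes a -profile switch, and every hardware target is just another profile behind the same language.

    # Compile the same Cg source for two different backend targets.
    cgc -profile arbfp1 shader.cg   # OpenGL ARB_fragment_program backend
    cgc -profile ps_1_1 shader.cg   # Direct3D pixel shader 1.1 backend

A hypothetical ATi backend exposing PS1.4 would simply be one more profile on that list; nothing about the Cg source has to change.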
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
Originally posted by: NFS4
Originally posted by: FishTankX
Also, remember: Cg has nothing to do with Nvidia's proprietary instructions. Thus things written in Cg could theoretically be made compatible with the ATi platform.

But wouldn't Cg work BEST on NVIDIA hardware?

Wouldn't an Intel C++ compiler work best on a P4? Yes, but it'll do damn fine on an Athlon, too.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I can't believe people are still attacking nVidia over the extra clipping planes.
Why? Any normal person gets upset when a company tries to deceive them through cheating.

The extra clipping planes, if included in the actual program, would optimize performance for all 3D card makers.
Except the "extra performance" is a cheat since it can't possibly apply when playing games normally.

Nothing is being clipped that can actually be seen.
That's not the issue here at all; the issue is that static clip planes can only exist in pre-rendered sequences. They cannot exist in realtime rendering.

The 3DMark2003 demo is a fixed-viewpoint demo with no opportunity for the end user to vary his point of view into the scene.
Which is exactly why it's a cheat. When you go off the rails and try to play it like a real game what happens? Whoops, it's kaleidoscope time.

Thus clipping what is outside of that viewpoint is a legitimate optimization.
Only if it's dynamic clipping, which it isn't.

you have nothing to complain about.

You just don't get it, do you? Static clip planes cannot exist in a real gaming environment, so it's a cheat.

Also, the same cheats done in 3DMark can be done in real games, as nVidia knows exactly which benchmarks are popular. Indeed, real-world game performance has shown the 5900 Ultra to be suspiciously slower than the 9800 Pro even though it wins the same kinds of canned benchmarks. The presence of static clip planes would certainly go a long way toward explaining this.


Static clip planes are just one of the cheats. The reduced precision, shader substitution, and lowered anisotropy are blatant cheats because they negatively impact image quality to gain extra performance. Nobody in their right mind can call them optimisations.
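
To spell out what a static clip plane actually is, here is a rough host-side sketch in plain fixed-function OpenGL (the plane values are made up for illustration, not taken from any driver):

    /* glClipPlane() takes a plane equation a*x + b*y + c*z + d = 0;
       geometry on the negative side of the plane is discarded. */
    #include <GL/gl.h>

    void enable_static_clip(void)
    {
        /* Arbitrary illustrative plane: clips everything more than
           50 units ahead of the eye. This is only correct for a camera
           that never turns, i.e. the on-rails benchmark case. */
        static const GLdouble eq[4] = { 0.0, 0.0, 1.0, 50.0 };
        glClipPlane(GL_CLIP_PLANE0, eq);
        glEnable(GL_CLIP_PLANE0);
    }

A legitimate engine recomputes its clip planes as the camera moves; a plane hard-coded like this only holds up while the viewpoint stays on its fixed path, which is the whole objection.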
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
First off, some games do use clipping planes. Ultima 9: Ascension is a particular case in point. It actually uses three clipping planes, statically set in the options.ini file.

Second, I reiterate: the user cannot control the viewpoint in 3DMark03. Like it or not, until Futuremark publicly releases a version that does allow the viewpoint to be interactively changed, there is only one possible viewpoint for every frame in the sequence. That means the view can be optimised. Pre-rendered or rendered on the fly is irrelevant here - the viewpoint will remain the same in both cases.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Gstanfor
On the subject of Cg not supporting Pixel Shader 1.4 or other competitors' innovations - that is certainly true when running nVidia's backend compiler on nVidia hardware.

It does not, however, hold true for others supporting Cg. It is the responsibility of ATi and anyone else using Cg to build their own backend compiler for the language into their drivers, and that is where support for things like PS1.4 can be added.

Cg is not a closed standard and not everybody who adopts the standard is forced to do things nVidia's way - they are free to extend Cg.

Edit: this is why I have said ATi is obstructionist in regard to Cg for no good reason. Lack of ATi feature-specific support does not have to be a Cg problem; it's just that ATi wants it to be perceived as a problem.

While I won't necessarily agree with all of that, the sentiment is correct. Any other company can use Cg; they just have to build a compiler for it.
 

SilentRunning

Golden Member
Aug 8, 2001
1,493
0
76
Is Cg proprietary?
The Cg Language Specification is published and open in the sense that other vendors may implement products based on it. To encourage this, NVIDIA open sourced the Cg Compiler technology under a nonrestrictive, free license.

Vendor implementations of Cg Compilers are typically proprietary and owned by their creators. NVIDIA has developed and owns the NVIDIA Cg Compiler, and other vendors are expected and encouraged to develop their own Cg Compiler products.


link

Nvidia only publishes the Cg language specification for others to use, yet they still own and maintain it. So if Nvidia chooses to make changes to the underlying Cg language, they will be able to create a new compiler concurrently with the changes. This would mean that competing vendors would have to play catch-up every time Nvidia chose to make changes. Microsoft successfully practiced a similar tactic with their OSes and their applications.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: SilentRunning
Is Cg proprietary?
The Cg Language Specification is published and open in the sense that other vendors may implement products based on it. To encourage this, NVIDIA open sourced the Cg Compiler technology under a nonrestrictive, free license.

Vendor implementations of Cg Compilers are typically proprietary and owned by their creators. NVIDIA has developed and owns the NVIDIA Cg Compiler, and other vendors are expected and encouraged to develop their own Cg Compiler products.


link

Nvidia only publishes the Cg language specification for others to use, yet they still own and maintain it. So if Nvidia chooses to make changes to the underlying Cg language, they will be able to create a new compiler concurrently with the changes. This would mean that competing vendors would have to play catch-up every time Nvidia chose to make changes. Microsoft successfully practiced a similar tactic with their OSes and their applications.

I'm failing to see your point here. There are two parts to Cg: the language itself and the backend compilers.

The only time nVidia could disadvantage others is through a significant change in the actual language, not its backend.

The backends are vendor specific and totally independent from each other.

It is no different from nVidia releasing a new chip with new capabilities tomorrow. It will not affect Cg the language at all; it will affect nVidia's backend Cg compiler, and ATi will have to answer nVidia's challenge, Cg or no Cg.

ATi may or may not try to modify their own backend compiler in response - it depends on whether they think their chips can handle the modifications or not.
 

SilentRunning

Golden Member
Aug 8, 2001
1,493
0
76
Originally posted by: Gstanfor
Originally posted by: SilentRunning
Is Cg proprietary?
The Cg Language Specification is published and open in the sense that other vendors may implement products based on it. To encourage this, NVIDIA open sourced the Cg Compiler technology under a nonrestrictive, free license.

Vendor implementations of Cg Compilers are typically proprietary and owned by their creators. NVIDIA has developed and owns the NVIDIA Cg Compiler, and other vendors are expected and encouraged to develop their own Cg Compiler products.


link

Nvidia only publishes the Cg language specification for others to use, yet they still own and maintain it. So if Nvidia chooses to make changes to the underlying Cg language, they will be able to create a new compiler concurrently with the changes. This would mean that competing vendors would have to play catch-up every time Nvidia chose to make changes. Microsoft successfully practiced a similar tactic with their OSes and their applications.

I'm failing to see your point here. There are two parts to Cg: the language itself and the backend compilers.

The only time nVidia could disadvantage others is through a significant change in the actual language, not its backend.

The backends are vendor specific and totally independent from each other.

It is no different from nVidia releasing a new chip with new capabilities tomorrow. It will not affect Cg the language at all; it will affect nVidia's backend Cg compiler, and ATi will have to answer nVidia's challenge, Cg or no Cg.

ATi may or may not try to modify their own backend compiler in response - it depends on whether they think their chips can handle the modifications or not.

Of course you don't understand.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
And just what, precisely, do you think nVidia can really change about the core language?

Cg is C for graphics, and C is already as flexible as anyone could ever need. There is hardly a piece of serious software out there that is not written in C, or that was not built by a program itself written in C. Heck, even C compilers are written in C.

EDIT: All ATi or anyone else has to do is revise their backend compiler to support the new revision of the language, just like they revise drivers for new versions of DirectX or OpenGL. No difference whatsoever.
 

rachaelsdad

Member
Aug 26, 2001
130
0
0
Originally posted by: Gstanfor
Well, don't believe nVidia either, then, if you don't want to. It doesn't take anything away from the fact that Cg is real and already in use in the industry, and that usage will only increase over time, not decrease.

link page


Considering the actions of NVidia, who will do anything to try and get ahead by cheating (sending out memos to review sites on a card's capability - not their own; think Kyro), I would not believe any altruistic motive could be applied to Cg. Perhaps Cg is out in the industry, but do you believe that developers will choose to use Cg when they realize what its aim is: to do nothing more than allow NVidia's cards to run better than their competitors'?

RenderMonkey is a tool for higher-level shaders and it is being used also. If NVidia wants Cg to be used by everyone, why not just open-source it and win back some of the goodwill they have destroyed?
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
Originally posted by: Gstanfor
And just what, precisely, do you think nVidia can really change about the core language?

Cg is C for graphics, and C is already as flexible as anyone could ever need. There is hardly a piece of serious software out there that is not written in C, or that was not built by a program itself written in C. Heck, even C compilers are written in C.

EDIT: All ATi or anyone else has to do is revise their backend compiler to support the new revision of the language, just like they revise drivers for new versions of DirectX or OpenGL. No difference whatsoever.

The way Cg is currently looking, it's fair to all and looks to be a great addition to the graphics community.

As long as Nvidia doesn't screw it up by trying to muscle out competitors or modifying the language in a way undesirable to other graphics companies, we should welcome Cg. It will drastically reduce game development cycles (exponentially less time to adopt new technology, just like programming advanced several light-years when C/C++ was introduced), and it's not as though it favors Nvidia right now.

NFS4, if you think Cg is small fry, it's definitely not! You're looking at the future of graphics. This will allow even the smallest of game companies to create marvelous graphical effects with exponentially less effort than before, bring development cycles down, and introduce new graphics technologies faster than ever before. I hate to sound like a Gstanfor (sort of the opposite of what I am; he's an Intel basher and very strongly pro-Nvidia), but Cg really is the future of graphics.

On the 3DMark issue, my stance is this.

Nvidia showed that 3DMark 2003 could be optimized (to an unfair degree) to discredit FM, because FM didn't cooperate with Nvidia during the early stages of 3DMark 2003. It's sort of like getting mad that your friend went on a date with your crush without you, so you try to 'spoil the party'.

I wouldn't have had a great beef with this if Nvidia had stated which 'optimizations' they put into 3DMark 2003. Since FM was the one who sniffed it out, rather than Nvidia coming out into the open, I see it as shady.

If they had been more public about such things, like Intel, then I would have viewed it as just revenge on FM. But doing it behind everyone's back is just shady.
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
Originally posted by: rachaelsdad
Originally posted by: Gstanfor
Well, don't believe nVidia either, then, if you don't want to. It doesn't take anything away from the fact that Cg is real and already in use in the industry, and that usage will only increase over time, not decrease.

link page


Considering the actions of NVidia, who will do anything to try and get ahead by cheating (sending out memos to review sites on a card's capability - not their own; think Kyro), I would not believe any altruistic motive could be applied to Cg. Perhaps Cg is out in the industry, but do you believe that developers will choose to use Cg when they realize what its aim is: to do nothing more than allow NVidia's cards to run better than their competitors'?

RenderMonkey is a tool for higher-level shaders and it is being used also. If NVidia wants Cg to be used by everyone, why not just open-source it and win back some of the goodwill they have destroyed?


Fragmentation.

Right now, as long as Cg remains even-handed, what Cg needs is unity.

Look at the Linux situation!!
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I make no apology whatsoever for the stance I took against the Intel employees on this board at the time, and I see the fanATIcs no differently. They are the single most obnoxious thing about ATi today and will end up damaging the company they think their actions are helping.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Gstanfor
Originally posted by: SilentRunning
Is Cg proprietary?
The Cg Language Specification is published and open in the sense that other vendors may implement products based on it. To encourage this, NVIDIA open sourced the Cg Compiler technology under a nonrestrictive, free license.

Vendor implementations of Cg Compilers are typically proprietary and owned by their creators. NVIDIA has developed and owns the NVIDIA Cg Compiler, and other vendors are expected and encouraged to develop their own Cg Compiler products.


link

Nvidia only publishes the Cg language specification for others to use, yet they still own and maintain it. So if Nvidia chooses to make changes to the underlying Cg language, they will be able to create a new compiler concurrently with the changes. This would mean that competing vendors would have to play catch-up every time Nvidia chose to make changes. Microsoft successfully practiced a similar tactic with their OSes and their applications.

I'm failing to see your point here. There are two parts to Cg: the language itself and the backend compilers.

The only time nVidia could disadvantage others is through a significant change in the actual language, not its backend.

The backends are vendor specific and totally independent from each other.

It is no different from nVidia releasing a new chip with new capabilities tomorrow. It will not affect Cg the language at all; it will affect nVidia's backend Cg compiler, and ATi will have to answer nVidia's challenge, Cg or no Cg.

ATi may or may not try to modify their own backend compiler in response - it depends on whether they think their chips can handle the modifications or not.

You're still a bit out in right field, but you have the general idea. Nvidia does own the language, and is allowed to make changes, but there is nothing in the rule book saying that you have to use anything other than what you want to. If ATI writes a compiler for Cg 1.0, and Nvidia bumps it up to 1.1, programmers do not have to program in 1.1; they can still write for 1.0 if it pleases them to do so, although I have a feeling that programmers will want to move to 1.1, since it would unlock more features.

The best comparison I can think of would be pixel shaders and their different versions: developers can use higher versions if they want, but they can also use base versions (2.0 and 1.3 should be supported by everyone) if they want to write it just once, all the while with MS controlling the specs. Equally, I don't believe there's any legal reason that ATI can't write additions to Cg (much like OpenGL extensions) and add them to their compiler. Nvidia would need to catch up to ATI's extensions if they were really interested in keeping par, just like the earlier situation with Nvidia making the change.

ATI can compete with Cg just as well as Nvidia; they just need to make their compiler as good as (or better than) Nvidia's. There will be a slight delay in ATI getting its compiler up to speed if Nvidia doesn't announce changes beforehand, but like I've said before, ATI can force the same situation on Nvidia if they wanted to. The point is probably moot anyhow; as someone else has stated before, we will probably move on to another HLSL that's a bit more open in the OpenGL realm, and MS will force an HLSL for DirectX. First-gen products like this tend not to be ideal anyhow.
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
Originally posted by: Gstanfor
I make no apology whatsoever for the stance I took against the Intel employees on this board at the time, and I see the fanATIcs no differently. They are the single most obnoxious thing about ATi today and will end up damaging the company they think their actions are helping.

You have your stances, I have mine. No apology requested or needed. You still have to understand, though, that while I may like ATi, I like to see things in a level-headed manner. I like ATi because of their 2D quality and 3D brightness/color. You may like Nvidia for different reasons.

And while you may not like Intel employees, personal attacks are never cool, and neither is 'calling them out'. So if you wanna take that stance, it'd go over a whole lot better if you were a bit less specific in your targets.
 

Hellbinder

Junior Member
Jul 30, 2001
9
0
0
Gstanfor, you are simply, completely, 100% technically incorrect.

The audacity you show talking about fanatics or whatever, when you are sitting here trying to say that MANUALLY inserting a clip plane into a set benchmark is LEGIT. That's just for starters. It is simply absurd and insulting.
Isn't it pathetic to see all the Rage3D fanboys come over to AT and bash anyone who is positive about NVIDIA or negative about ATI? When in doubt about an ATI fanboy, search their username at Rage3D; for example, here Compddd reveals himself in all his unbiased glory.
This has NOTHING to do with simply saying something positive about Nvidia, as any serious-minded person at this forum should be trying to explain to you. You are simply making one ridiculous, outlandish statement after the next: trying to justify inserting clip planes; making outlandish claims about M$ changing the DX9 spec at the last minute, which is NOT TRUE; justifying wholesale replacement of shader code, forcing not just below-DX9 spec but DX7 T&L!!!; using custom-compiled shaders with Nvidia's PROPRIETARY backend compiler for Cg; application detection which artificially inflates the frame rate. This is all just the stuff we know about. All done to an independent benchmark program with worldwide influence, in which NO, I repeat, NO custom code is allowed, because it is designed to test pure DX code, which puts all IHVs on a level playing field.

And you, and a few others here have the AUDACITY to accuse others of being fanboys???

FP16 is fine for Doom because SPECIFICALLY Carmack drops everything to 8-bit precision at one point, which evens everything out. He did this because he is working on getting the best performance and quality for everyone, not only ATI and Nvidia.

At any rate, it simply shows you will grasp at any straw you can to keep shifting the flow of the discussion, or to try to justify even ONE of your ideas.

Again, it is totally absurd that a handful of people are having to defend themselves from being called fanatics, or accused of fanaticism, when they are doing nothing but defending REASON, COMMON SENSE, and TRUTH.



 

Hellbinder

Junior Member
Jul 30, 2001
9
0
0
More examples..
nVidia originally withdrew their membership and support for BRIBEmark, whoops, that's not the name..., err, QUAKmark, no that's not it either - close though..., 3DMARK 2003, when Futuremark refused to consider benchmark optimizations nVidia put forward in the development stages.
FALSE. Nvidia withdrew in December, 13 months after development started and AFTER everything but bug hunting was complete, a mere 3 months before 3DMark03 was released. They were fully 100% behind 3DMark03 until it became clear that the NV3x was going to have trouble with it, which is due almost entirely to design flaws, or poor design choices: limited memory bandwidth, poor single-texture performance, and poor DX9 shader support, among other things.

Further, this has nothing to do with bribes. Nvidia has 3-4 times the cash and resources ATi has. This has to do with standing for what is right and not caving to manipulation and favoritism.
nVidia then publicly stated that 3DMARK 2003 was a flawed benchmark that could easily be optimized, and they proved it.
ANY benchmark program or game can easily be optimized for in ways that are not acceptable. It has nothing to do with flaws and everything to do with INTEGRITY. Further, none of the other beta partners, including DELL, think it is flawed or poorly coded. The only IHV who has a problem with it is Nvidia, whose NV3x core has several known problems.
It would seem Futuremark finally agrees with them.
That has absolutely nothing to do with it. Nor does the statement they released support what you just said here.
I wonder if ATi's "membership subscriptions" have made up in any way for the damage Futuremark have inflicted upon themselves?
It is only $5,000 a year for membership. Nor is Futuremark the one doing the damage here. It is also very irritating that some people would go to such lengths to justify, spin, and defend what one IHV has been pulling.

It is blatantly obvious who is the blind, uninformed fanboy here... no, actually, worse than a fanboy. You, my friend, are bordering on religious zealotry.