NOOB OPENGL QUESTION

QuestionsandAnsweres

Golden Member
Jun 27, 2001
1,628
0
0
I just noticed something. I had a Radeon 9800 Pro, but in some games it showed some annoying graphical glitches. Well, I purchased a GeForce FX 5900 and the problems are gone. But I noticed that the specs on the Radeon 9800 say it supports OpenGL 2.0, while the 5900 only supports 1.4. Is this a hardware limitation of the GeForce 5900, or just a driver thing at the moment?

Or does this not matter? It seems weird that ATI supports a higher version of OpenGL, yet it doesn't run OpenGL games as well as GeForce cards do. =\
 

JonnyBlaze

Diamond Member
May 24, 2001
3,114
1
0
OpenGL is used more in engineering and 3D programs like Pro/E or 3ds Max than it is in games.

The version numbers affect those apps, not really games.

JBlaze
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
"3Dlabs initiated the development of the OpenGL Shading Language over two years ago with a sweeping vision for the first fundamental upgrade to this widely available 3D graphics API in its ten year history.

Working within the open standards process at the OpenGL Architecture Review Board (ARB), 3Dlabs has spearheaded the definition of a powerful high-level, hardware independent shading language - the OpenGL Shading Language. This language enables direct compilation of C-like programs to graphics hardware machine code, creating enormous opportunities for compiler and graphics architectural innovation and bringing real-time cinematic-quality rendering a step closer to reality.

The OpenGL Shading Language specification was approved on June 11th 2003 as official ARB extensions - clearing the way for graphics vendors to ship the industry's first open standard, multiple vendor high-level shading language. The ARB intends to incorporate the OpenGL Shading Language in the Core OpenGL specification as soon as possible - creating OpenGL version 2.0. 3Dlabs are the first to ship OpenGL Shading Language drivers and will continue to aggressively push to ensure OpenGL 2.0 is ratified as quickly as possible for the good of the graphics industry.

Continuing its tradition of backwards compatibility, OpenGL 2.0 will be a superset of OpenGL 1.4 - so older applications will run on graphics accelerators with OpenGL 2.0 drivers without modification."

Nvidia supports OpenGL 1.5 in the FX series.

"The OpenGL® 1.5 specification includes the revolutionary OpenGL® Shading Language, official ARB extensions that are expected to form the foundation of the upcoming OpenGL® 2.0 version of this cross-platform, open-standard API for advanced 3D graphics."

ATI is tailored to DX9 minimum specs and not OpenGL. Nvidia is tailored around OpenGL 1.5 minimum specs and not DX9.

No current hardware can actually do OpenGL 2.0. The newest official revision is OpenGL 1.5, released in 2003. There will be more revisions, and we will eventually get to 2.0. When ATI released their architecture, though, it was marketed as OpenGL 2.0, even though that wasn't approved by the ARB.
 
Nov 22, 2003
36
0
0
Originally posted by: QuestionsandAnsweres
I just noticed something. I had a Radeon 9800 Pro, but in some games it showed some annoying graphical glitches. Well, I purchased a GeForce FX 5900 and the problems are gone. But I noticed that the specs on the Radeon 9800 say it supports OpenGL 2.0, while the 5900 only supports 1.4. Is this a hardware limitation of the GeForce 5900, or just a driver thing at the moment?

Or does this not matter? It seems weird that ATI supports a higher version of OpenGL, yet it doesn't run OpenGL games as well as GeForce cards do. =\

OpenGL versions really don't matter; what matters is OpenGL driver quality. ATI is still weak when it comes to OpenGL drivers, but they're getting better all the time. Both companies only have OpenGL 1.4 drivers at this point. OpenGL 2.0 does not exist yet. OpenGL 1.5 is the newest version, and I don't know of any company whose drivers report OpenGL 1.5 yet. ATI was at OpenGL 1.3 until just a few months ago, but this didn't really matter.

Jonny: I think games are far more cutting-edge than applications. Why would Pro/E or Max need the higher OpenGL versions, which define more advanced blending modes, cube maps, etc.? The last two games to use OpenGL I can think of are Call of Duty and Homeworld 2. Homeworld 2 requires OpenGL 1.3 and uses advanced features like fragment programs (Pixel Shader 2.0 in D3D terms), which are so new they are still extensions that haven't made it into the OpenGL core. What advanced features of OpenGL does 3ds Max use? What changed when ATI moved to OpenGL 1.4 (Catalyst 3.8 or 3.9 drivers)? Those apps certainly require fast hardware, but I don't think they require high OpenGL versions.

vian: Saying Nvidia designed the FX around the minimum requirements for OpenGL 1.5, not DX9, is complete nonsense. Why do you say that?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
What advanced features of OpenGL does 3dsmax use?

Shaders, shaders, shaders. The preview rendering and such that are currently done on processors can be done an order of magnitude or so faster on a graphics card. It is certainly not unthinkable that Toy Story 4 will actually be final-rendered on a cluster of graphics chips instead of processors, but we need better shader support in hardware for that as of now (although right now the hardware is perfectly capable of handling preview renders, which saves users lots of time).
 

QuestionsandAnsweres

Golden Member
Jun 27, 2001
1,628
0
0
The 5900 is a better value, and I don't have any graphical glitches in my games. My 5900 came with Call of Duty as well, so it's all good :)

Originally posted by: MercenaryForHire
Originally posted by: titananandtech
OpenGL versions really don't matter; what matters is OpenGL driver quality.

Quoted for truth.

By the way, return that 5900 and put your 9800P back in. :p

- M4H

 
Jan 31, 2002
40,819
2
0
Originally posted by: QuestionsandAnsweres
The 5900 is a better value, and I don't have any graphical glitches in my games. My 5900 came with Call of Duty as well, so it's all good :)

I won't even bother to rehash all the reasoning behind it, but it bears noting that for "future gaming" the 5900 will be spanked quite badly by the 9800P.

But mind you, if you've got enough money to throw around to buy/sell/buy cards like that, I doubt it matters. :p

- M4H
 

QuestionsandAnsweres

Golden Member
Jun 27, 2001
1,628
0
0
Depends what you mean by future gaming. We all know that when Quake 3 was released, a lot of companies started to use the Quake 3 engine for their games. Well, I have a feeling the GeForce FX will do just fine in Doom 3 :) (compared to other cards currently out)

But yeah, for DirectX 9 games the Radeon definitely would be faster.

But I paid $300 for my Radeon 9800 Pro, took it back, and got a GeForce FX 5900 for $200, plus I got Call of Duty.

I'll probably upgrade in a year anyway. :)
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
vian: Saying Nvidia designed the FX around the minimum requirements for OpenGL 1.5, not DX9, is complete nonsense. Why do you say that?

Well, because I remember reading somewhere that the FX's minimum specs are up to OpenGL requirements but beyond DX9. Since the Nvidia website says OpenGL 1.5, I thought that's what it was released with.
 
Nov 22, 2003
36
0
0
Originally posted by: BenSkywalker
What advanced features of OpenGL does 3dsmax use?
Shaders, shaders, shaders. The preview rendering and such that are currently done on processors can be done an order of magnitude or so faster on a graphics card. It is certainly not unthinkable that Toy Story 4 will actually be final-rendered on a cluster of graphics chips instead of processors, but we need better shader support in hardware for that as of now (although right now the hardware is perfectly capable of handling preview renders, which saves users lots of time).

ATI already has their Ashli compiler, which can let their Radeon 9500+ cards run RenderMan shaders and render Toy Story-class graphics. Not quite real time, but certainly better than hours per frame like before. Isn't there a Maya renderer for the Radeons that does all the rendering on the VPU instead of the CPU, too?

But does 3ds Max actually use any advanced OpenGL shader features currently?

BTW, OpenGL shaders are an extension, defined alongside the release of OpenGL 1.5, not part of the core that you'd need a higher version number for.
 
Nov 22, 2003
36
0
0
Originally posted by: VIAN
vian: Saying Nvidia designed the FX around the minimum requirements for OpenGL 1.5, not DX9, is complete nonsense. Why do you say that?
Well, because I remember reading somewhere that the FX's minimum specs are up to OpenGL requirements but beyond DX9. Since the Nvidia website says OpenGL 1.5, I thought that's what it was released with.

You read wrong. Check the OpenGL 1.5 spec. The only new things in it are VBO support (already supported throughout the entire GeForce line as an extension), occlusion queries (supported for years as an Nvidia extension, then as a standard extension), and more generalized shadow buffer support. There is also an optional high-level shading language now. Other than the optional shaders, I bet the GF3 can already do the minimum for OpenGL 1.5. And I bet it can do the vertex shaders.

The FX is well beyond the minimum requirements for OpenGL 1.5 or DirectX 9.0.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Good, at least that's something positive about Nvidia that everyone should be able to agree on.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
ATI already has their Ashli compiler, which can let their Radeon 9500+ cards run RenderMan shaders and render Toy Story-class graphics.

ATi cards can't handle proper RenderMan shaders, not even Toy Story-class PRMan. First off, they lack the precision required by RenderMan shaders (IEEE FP32); they also lack the ability to handle instruction lengths close to what would be used in an offline setting (in theory their 9800XT hardware can handle it, but there's no F-buffer support in the drivers yet). They can emulate certain shaders with reduced quality (how far reduced depends on the complexity of the shader). Throw a 1K-instruction shader at the ATi parts and they will look considerably inferior to the same thing rendered on an nV chip. What you are talking about is preview rendering, which is the same thing I was talking about. Right now nV's instruction limit is too small to allow full shader emulation/hardware support, and they also lack support for branching (which is required for proper final rendering). They are still coming up well short on the hardware end despite having, in the pro market, clearly superior shader hardware vs. ATi.