OpenGL performance on these new boards

jema

Senior member
Oct 14, 1999
296
0
0
Hi all.

I currently have a TNT2, but I'm strongly considering an upgrade. The thing is that I don't have much cash, and I need good OpenGL performance (for 3D modelling programs) and good gaming performance (in both OpenGL and D3D).

Does anyone know how the GeForce2 MX, Radeon 32MB SDR, and the new Voodoos perform under professional OpenGL? I'm pretty sure they can all handle D3D just fine.

I'm leaning towards the MX since my experience with the TNT2 has been good, but it seems like the Radeons perform better in 32-bit colour at higher resolutions. I'll be running the board at 1024x768 or 1280x1024 most of the time.

Cheers.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
well, for professional OGL, you should definitely avoid the Voodoo boards. They don't have a "real" OGL ICD yet; it's still tailored for games.
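A quick way to see which OpenGL driver you're actually getting is to look at the vendor/renderer strings the driver reports (what glGetString(GL_VENDOR) and glGetString(GL_RENDERER) return). The sketch below is just an illustration of the check, assuming you've already grabbed those strings from a GL context; "GDI Generic" from "Microsoft Corporation" is Windows' software fallback, which is what you land on when no vendor ICD is installed:

```python
# Sketch: given the strings a GL context reports via glGetString(GL_VENDOR)
# and glGetString(GL_RENDERER), decide whether a vendor ICD is active.
# "Microsoft Corporation" / "GDI Generic" is the software fallback renderer
# you get on Windows when no hardware ICD is installed.
def using_vendor_icd(vendor: str, renderer: str) -> bool:
    software_fallback = (vendor == "Microsoft Corporation"
                         and renderer == "GDI Generic")
    return not software_fallback

# Example strings (the TNT2 one is an assumed/illustrative renderer string):
print(using_vendor_icd("Microsoft Corporation", "GDI Generic"))  # False
print(using_vendor_icd("NVIDIA Corporation", "RIVA TNT2/AGP"))   # True
```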

If you're after professional OGL performance, grab the cheapest GTS you can find. They pretty much OWN @ OGL.

Sell your cat, your dog, maybe pawn off the watch grandma gave you for Christmas......
 

(Chanse)

Senior member
Oct 11, 1999
421
0
0
If cash is an issue, then an MX would be your best bet. 3dfx and ATI are nowhere near as good in OGL apps as nVidia.
 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0
You don't need fillrate for professional OpenGL; the GF2 MX and GF2 GTS, like the SDR and DDR GeForces, should be fairly close in this field. The speed of the onboard T&L unit, driver quality, amount of onboard memory (for textures), and clarity of the 2D output at high resolutions should be the deciding factors.
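To put some rough numbers on the fillrate point (the fillrate figures below are my own assumed ballpark values, not benchmarks): even at 1280x1024, 60 fps, and 2x overdraw, a modelling viewport demands well under what either card can push, so you end up geometry- or driver-limited long before fillrate matters.

```python
# Back-of-the-envelope fillrate check (assumed figures, not measurements).
def pixels_per_second(width, height, fps, overdraw=2.0):
    """Approximate pixel fill demand for a viewport."""
    return width * height * fps * overdraw

demand = pixels_per_second(1280, 1024, 60)  # ~157 Mpixels/s

gf2_mx_fill = 350e6   # assumed theoretical fillrate for a GF2 MX
gf2_gts_fill = 800e6  # assumed theoretical fillrate for a GF2 GTS

print(f"demand: {demand / 1e6:.0f} Mpixels/s")
print(f"GF2 MX headroom:  {gf2_mx_fill / demand:.1f}x")
print(f"GF2 GTS headroom: {gf2_gts_fill / demand:.1f}x")
```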

A brand-name GeForce2 GTS (Asus and ELSA come to mind) with 64MB RAM would be the ideal choice; you should avoid the cheapest brands because they'll offer blurrier high-resolution 2D image quality than more expensive boards, "thanks" to lower-quality components used on the PCB.

In theory, the ATi Radeon should be the fastest of the current cards in professional OpenGL because, according to the specs, it has the most powerful T&L unit. But I'd guess they have a lot of work to do in the driver department if they want to capture the budget professional OpenGL market segment from Nvidia.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
jpprod, I would've agreed with you, but the GTS cards seem TOTALLY hit or miss with 2D image quality. I've heard from several peeps with 64MB Asus cards and Elsa cards that the 2D still sucks.

In fact, out of the people I "trust", I've only heard one say his Asus looked "pretty good, but not as good as the Radeon" (he replaced it with a radeon)
 

slacker2

Member
May 8, 2000
93
0
0
I am looking for the AOpen GeForce2 (building a system for a relative of mine). What attracts me to that card is the configurable video card BIOS (voltage adjustments and other stuff). In my experience they make by far the most stable motherboards; hopefully the same trend applies to their GeForce cards. If I ever manage to get one, I'll post my impressions.
 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0
RoboTECH; from what I've heard, the 2D picture quality problems on GeForce2 GTS boards are mostly occurring for users with Sony-brand monitors with a Trinitron tube. Several Radeon owners have also reported strange shimmering artifacts on Sony Trinitron monitors, so perhaps it's a monitor/video card combination-dependent problem? Reviews of brand-name GeForce2 boards haven't said anything negative about the 2D quality.

But alas, I can only speculate. We really need a big 2D quality comparison roundup of all current video cards combined with different CRTs, done by a respected hardware site (hint, Anand :) )
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
jpprod, that is a good point

I've heard it's the Trinitron tube that might do it.

I have a Sony Trinitron 17", and yeah, both my GTS cards looked like hell on it.

I also have a KDS 19" VS-190 (Trinitron? Dunno), and both GTS's looked like $hite with it. The 5500 looks fantastic on both.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
jema-

Which 3D modelling applications are you using?

Your choice is limited, IMHO, to the GeForce family of products. The Radeon's pro OpenGL support still blows, and its performance in pro OpenGL is rather sad (the 64MB Radeon DDR gets easily bested by a GeForce1 SDR).

3DLabs/Intergraph and the rest stink for gaming, leaving nVidia as the only currently viable option (I run nVidia because I need the same things you do).

The particular applications you use are important, as are the types of models you build. More onboard RAM is good, but most of the time you will be geometry-limited unless you deal with very specific types of models.

If you look at most of the "high end" boards such as the Wildcat series, their onboard texture RAM tends to be slower than AGP texturing, particularly AGP 4X.

If you do work with very texture-heavy scenes, then a 64MB GF1 board would make sense given that you're trying to save some cash, with a 64MB GF2 being a better overall option. If you don't work with very large texture sets, then a GF2 MX will almost certainly be better than the GF1 (though not the GF2).
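As a rough illustration of how quickly large texture sets eat onboard memory (texture sizes and the framebuffer layout below are my own assumed examples, not figures from any particular app): a single 1024x1024 32-bit texture with a full mipmap chain is about 5.3MB, so even a 64MB board holds fewer than a dozen of them once the framebuffer takes its cut.

```python
# Rough texture memory footprint (assumed sizes; ignores padding/compression).
def texture_bytes(width, height, bytes_per_texel=4, mipmaps=True):
    """Memory for one texture; a full mipmap chain adds ~1/3 on top."""
    total, w, h = 0, width, height
    while True:
        total += w * h * bytes_per_texel
        if not mipmaps or (w == 1 and h == 1):
            break
        w, h = max(w // 2, 1), max(h // 2, 1)
    return total

one_tex = texture_bytes(1024, 1024)   # ~5.3 MB with mipmaps
board = 64 * 1024 * 1024              # 64MB card
framebuffer = 1280 * 1024 * 4 * 3     # assumed front + back + depth, 32-bit

print(f"one texture: {one_tex / 2**20:.1f} MB")
print(f"textures that fit: {(board - framebuffer) // one_tex}")
```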

Clockrate and T&L on the GF2 MX are a decent amount higher than on the GF1 (~30%), and that is the most important thing most of the time. That is why I asked which apps you run and also the type of models/scenes that you build.

Oh yeah, definitely stick with one of the name-brand parts, and the 2D issues are particularly problematic with Trinitron-tubed monitors.

Robo-

"KDS 19" VS-190 (Trinitron?"

Yep.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Ben, so what's the deal?

Trinitron monitors are great!?!?!

why do the nVidia boards have such an issue with Trinitron monitors?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
"why do the nVidia boards have such an issue with Trinitron monitors?"

I've been looking for an answer to that for months. A timing issue? I honestly don't know; I just know that if you hook up two identical rigs, both with GF-based boards, one with an FD Sony and the other with a budget-line Philips, and run them at 1600x1200, the Philips will make the Sony look like sh!t.

I have done side-by-side comparisons, and it is an issue with the particular combo of nV (and to a lesser extent Radeon) boards and Sony tubes. I noticed this (before any testing) when everyone who complained about the GF's image quality had Trini tubes, though it took a while for that to stand out to me (people would complain in one thread, mention that they had a Trini in another).

Hook the nV boards up to a Mitsubishi DiamondTron monitor alongside a G400 and you will only see a very slight difference in image quality. Again, why this is happening is beyond me, but now that the Radeon is also having problems of its own, I would say that Sony is doing something a bit differently than everyone else (well, besides making across-the-board sweet monitors :)).
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
yeah dude, that's my point

I love Sony monitors and Trinitrons in general

I'm not about to give them up.

hope the NV20 doesn't have this problem, or that's one less graphics card choice I'll have

:(