OpenGL T-junctions

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
OpenGL is dead.... Linux users? lulwut? who games on linux? :p

also troll post?

I thought Nvidia were the ones with bad OpenGL performance... but like I said... hardly any games use it nowadays.
 

gorobei

Diamond Member
Jan 7, 2007
3,957
1,443
136
known troll. this is his 2nd attempt in vcg, normally he plays in ot.

most likely reposting other troll attempts from other forums.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
He’s probably talking about the original GLQuake renderer, LOL.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Source? or is this just FUD?
I heard from someone at Beyond 3D (in 2008) that ATi had trouble with T-junctions, whatever they are.

OpenGL is dead.... Linux users? lulwut? who games on linux? :p

also troll post?

I thought Nvidia were the ones with bad OpenGL performance... but like I said... hardly any games use it nowadays.
That doesn't mean that countless classics don't use it. That doesn't mean that Rage won't be using it.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
(Factual statement, search P&N for proof.)

A20 likes to post obscure trivia threads that seem to have no real point to them, like the time I caught the ferry to Shelbyville. I needed a new heel for m'shoe. So I decided to go to Morganville, which is what they called Shelbyville in those days. So I tied an onion to my belt. Which was the style at the time. Now, to take the ferry cost a nickel, and in those days, nickels had pictures of bumblebees on 'em. Gimme five bees for a quarter, you'd say. Now where was I... oh yeah. The important thing was that I had an onion tied to my belt, which was the style at the time. You couldn't get white onions, because of the war. The only thing you could get was those big yellow ones . . .
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I heard from someone at Beyond 3D (in 2008) that ATi had trouble with T-junctions, whatever they are.
ATi’s had superior OpenGL performance to nVidia for years in gaming, especially in Doom 3 engine games. I'm talking about a 6850 outrunning my GTX580 in Doom 3 at 2560x1600 with 8xMSAA (128.8 vs 114 FPS).

Fermi parts are particularly slow in OpenGL, with the GTX470 being a substantial performance downgrade from a GTX285.

That doesn't mean that countless classics don't use it. That doesn't mean that Rage won't be using it.
The classic source-ports often have performance problems on nVidia's parts, especially on Fermi derivatives. Even elementary rendering such as smoke or spark effects can cause performance to plummet.
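
(By elementary I mean plain old alpha-blended quads, roughly like this sketch; it's from memory, not lifted from any particular port:)

/* Rough sketch of the blended particle pass a classic source port does
   for smoke/sparks; names and numbers are made up for illustration. */
#include <GL/gl.h>

void draw_smoke_quad(GLuint smoke_tex, float x, float y, float size)
{
    glEnable(GL_TEXTURE_2D);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glDepthMask(GL_FALSE);                 /* particles read depth but don't write it */
    glBindTexture(GL_TEXTURE_2D, smoke_tex);

    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(x,        y);
    glTexCoord2f(1, 0); glVertex2f(x + size, y);
    glTexCoord2f(1, 1); glVertex2f(x + size, y + size);
    glTexCoord2f(0, 1); glVertex2f(x,        y + size);
    glEnd();

    glDepthMask(GL_TRUE);
    glDisable(GL_BLEND);
}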
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
That's a fairly recent turn of events. NV has historically had much better OpenGL performance relative to ATI hardware of the day on both Windows and Linux, Doom 3 notwithstanding. During the 7800 / X1800 generation the performance gap using OpenGL rendered titles (e.g., City of Heroes with OpenGL renderer) was huge in favor of NV.

There are several reasons NV has a massive market share lead over ATI when it comes to workstation OpenGL, and previously performance was one of them.

Today FireGL has more than caught up in both raw performance and price/performance. It's certainly going to get interesting in the workstation segment over the next few years -- it's rapidly becoming NV's bread and butter, and ATI is looking to eat as much of that as they can.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
ATi’s had superior OpenGL performance to nVidia for years in gaming, especially in Doom 3 engine games. I'm talking about a 6850 outrunning my GTX580 in Doom 3 at 2560x1600 with 8xMSAA (128.8 vs 114 FPS).

Fermi parts are particularly slow in OpenGL, with the GTX470 being a substantial performance downgrade from a GTX285.


The classic source-ports often have performance problems on nVidia's parts, especially on Fermi derivatives. Even elementary rendering such as smoke or spark effects can cause performance to plummet.
Well, as long as IQ and compatibility are still perfect, I don't really care if I'm getting 40 fps when I could be getting 60, unless the hardware is slowing down because of a bad psu or a bad GPU fan or something like that. nvidia has more GL extensions than ATi does anyway, don't they?

This isn't meant to be a smart ass question, but why does it matter if one runs at 129fps while another runs at 114 fps? Unless it were an Fzero port with adjusted physics, I couldn't care less if I was getting 60 or 120 fps. Of course, that's just me and a few other people.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
OpenGL is dead.... Linux users? lulwut? who games on linux? :p

also troll post?

I thought Nvidia were the ones with bad OpenGL performance... but like I said... hardly any games use it nowadays.



What single test ever gave you that idea?

Some of your arguments seem to have no basis in reality but are founded in fuzzy warm feelings and hope.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,124
3,061
146
Doesn't Quake Wars use OpenGL? Also, I know back in the day when Doom 3 first came out, nvidia cards ran it MUCH better. Of course these were the 6800 and 7800 series.

Anyways, QW does run fine on my xfire system, at 2560x1600 with 8x or 16x AA I believe. Still get hundreds of FPS in most cases.
 
Dec 30, 2004
12,553
2
76
Doesn't Quake Wars use OpenGL? Also, I know back in the day when Doom 3 first came out, nvidia cards ran it MUCH better. Of course these were the 6800 and 7800 series.

Anyways, QW does run fine on my xfire system, at 2560x1600 with 8x or 16x AA I believe. Still get hundreds of FPS in most cases.

Yeah, it's a heavily modified Doom 3 engine, I believe.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Doesn't Quake Wars use OpenGL? Also, I know back in the day when Doom 3 first came out, nvidia cards ran it MUCH better. Of course these were the 6800 and 7800 series.

Anyways, QW does run fine on my xfire system, at 2560x1600 with 8x or 16x AA I believe. Still get hundreds of FPS in most cases.
IIRC, ATi got faster frame rates if some command was entered into the console. Even ATi was stumped by it and spent forever trying to reduce IQ in Doom 3 in exchange for performance, but to no avail. Their IQ in Doom 3 was subpar partially because of the app-specific optimizations and partially due to ATi's higher use of general optimizations (z-buffer and texture optimizations that I've noticed in the past) that significantly reduce IQ.
 

zebrax2

Senior member
Nov 18, 2007
974
66
91
What single test ever gave you that idea?

Some of your arguments seem to have no basis in reality but are founded in fuzzy warm feelings and hope.

as BFG10K says

ATi’s had superior OpenGL performance to nVidia for years in gaming, especially in Doom 3 engine games. I'm talking about a 6850 outrunning my GTX580 in Doom 3 at 2560x1600 with 8xMSAA (128.8 vs 114 FPS).

Fermi parts are particularly slow in OpenGL, with the GTX470 being a substantial performance downgrade from a GTX285.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
This isn't meant to be a smart ass question, but why does it matter if one runs at 129fps while another runs at 114 fps? Unless it were an Fzero port with adjusted physics, I couldn't care less if I was getting 60 or 120 fps. Of course, that's just me and a few other people.
You take no issue with nVidia’s $500 MSRP flagship GTX580 being slower than ATi’s $179 MSRP mid-range card, but you feel concerned enough about T-junctions to start a thread about it? LOL.

Why don’t you tell us where T-junctions have affected you? I can list several examples where nVidia’s slow OpenGL performance affects me (e.g. much slower than ATi in Prey, Quake 4 and Doom 3). There’s also poor performance in modern source ports (e.g. Descent, Quake and Quake 2), where elementary rendering such as smoke, fog and other transparent effects can cause performance to absolutely plummet.
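
And since nobody in this thread has actually said what a T-junction is: it’s where a vertex of one triangle lands in the middle of another triangle’s edge with no matching vertex on that edge, so floating-point rounding during rasterization can open a hairline crack along the seam. Made-up coordinates just to show the shape, not taken from any game:

/* Triangle A owns the whole edge from (0,0) to (2,0); vertices are (x,y) pairs. */
static const float triA[6] = { 0.0f, 0.0f,   2.0f, 0.0f,   1.0f, 1.0f };

/* Triangles B and C each cover half of that edge and meet at (1,0).
   (1,0) lies on A's edge, but A has no vertex there -> a T-junction.
   Splitting A's edge at (1,0), or welding the meshes, removes it. */
static const float triB[6] = { 0.0f, 0.0f,   1.0f, 0.0f,   1.0f, -1.0f };
static const float triC[6] = { 1.0f, 0.0f,   2.0f, 0.0f,   1.0f, -1.0f };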

Not to mention the GTX470 being a performance downgrade from a GTX285 in such situations while sounding like a hairdryer in comparison.

nvidia has more GL extensions than ATi does anyway, don't they?
Let’s say this is true. Why don’t you explain to us what it means, citing specific examples in games where it has affected you?

IIRC, ATi got faster frame rates if some command was entered into the console. Even ATi was stumped by it and spent forever trying to reduce IQ in Doom 3 in exchange for performance, but to no avail.
Uh, no. You’re confusing two different issues, so let me explain them to you.

The console command (e.g. bilinear) caused both vendors to lose performance because the game went from per-texture AF to global AF. ATi had an IQ driver bug with the per-texture AF which was later fixed with a 1 FPS performance hit.
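
Roughly, the difference looks like this in GL (a sketch from memory, not id’s actual code):

#include <GL/gl.h>
#include <GL/glext.h>   /* GL_TEXTURE_MAX_ANISOTROPY_EXT */

/* Per-texture: the engine decides per material which textures get AF. */
void set_material_filtering(GLuint tex, float aniso)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, aniso);
}

/* Global override: the same setting gets slammed onto every texture,
   including ones that never benefited from it, hence the FPS hit on both vendors. */
void force_global_filtering(const GLuint *textures, int count, float aniso)
{
    int i;
    for (i = 0; i < count; i++)
        set_material_filtering(textures[i], aniso);
}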

The other optimization was ATi replacing texture lookups with shader ALU calculations because it was faster on their hardware. For all intents and purposes the IQ was practically identical. Yeah, it’s an application specific optimization but both vendors do things like that these days.
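
If anyone’s curious what that kind of substitution looks like, here’s the general idea (made-up GLSL as an analogy; Doom 3 actually used ARB fragment programs, and this isn’t ATi’s or id’s real shader code): a specular falloff fetched from a lookup texture versus the same curve computed with pow() in the ALU.

/* Hypothetical GLSL fragments, stored as C string constants. */
static const char *spec_via_lookup =
    "uniform sampler2D specularTable; /* falloff curve baked into a texture */\n"
    "varying vec2 specCoord;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(specularTable, specCoord);\n"
    "}\n";

static const char *spec_via_alu =
    "varying vec2 specCoord;\n"
    "void main() {\n"
    "    /* same falloff computed arithmetically instead of fetched */\n"
    "    gl_FragColor = vec4(pow(specCoord.x, 16.0));\n"
    "}\n";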

You going to start a thread complaining about nVidia’s 3DMark shader substitution (which actually caused IQ to change), and the insertion of static clip planes?

Their IQ in Doom 3 was subpar partially because of the app-specific optimizations and partially due to ATi's higher use of general optimizations (z-buffer and texture optimizations that I've noticed in the past) that significantly reduce IQ.
You haven’t seen sub-par quality in Doom 3 until you’ve seen it on an nVidia 7000 card with default driver texture quality settings. Wall surfaces would wave like flags during movement, while floors shimmered like a swarm of ants.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Let’s say this is true. Why don’t you explain to us what it means, citing specific examples in games where it has affected you?
Fewer extensions can mean reduced compatibility. I haven't tried many GL games on ATi hardware so I can't say it has affected me.
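By compatibility I just mean the usual check-and-fall-back pattern, something like this sketch (the extension name here is only an example):

#include <string.h>
#include <GL/gl.h>

/* Classic (pre-GL3) way of asking whether an extension is advertised.
   A vendor exposing fewer extensions just makes checks like this fail
   more often, pushing the game onto a fallback path. */
static int has_extension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;  /* crude substring match */
}

/* e.g. if (!has_extension("GL_EXT_texture_filter_anisotropic")) use plain trilinear */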
Uh, no. You’re confusing two different issues, so let me explain them to you. The console command (e.g. bilinear) caused both vendors to lose performance because the game went from per-texture AF to global AF. ATi had an IQ driver bug with the per-texture AF which was later fixed with a 1 FPS performance hit. The other optimization was ATi replacing texture lookups with shader ALU calculations because it was faster on their hardware. For all intents and purposes the IQ was practically identical. Yeah, it’s an application specific optimization but both vendors do things like that these days. You going to start a thread complaining about nVidia’s 3DMark shader substitution (which actually caused IQ to change), and the insertion of static clip planes?
nvidia's not perfect and yes, the 3DMark app-specific optimizations nvidia has used pissed me off, but I've always thought nvidia's IQ has been better, at least to me. I also haven't played Doom 3 on ATi HW, so I can't comment on that. Maybe the 6850's filtering quality is as good, but I haven't seen a 6850 in action, so I'm commenting on the 5770's awful filtering quality, which I believe was hardware-based, not due to driver optimizations.
You'll have to elaborate on nvidia's use of static clip planes; that would help me out.
I had thought that id enabled something that reduced performance on ATi hardware at the time because nvidia paid them to. When that was disabled, ATi's performance was better at no IQ cost. That was what I was talking about. I remember that when I had a 5770, linear_mipmap_nearest or something like that was the best it could do, while the nvidia hardware could do linear_mipmap_linear. The 5770 also shimmered like crazy in MDK2, a GL game (the only one I tried), while it doesn't shimmer on nvidia hardware.
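For reference, those names map onto the GL minification filters like this (just a sketch; whether the driver actually honors the request is a separate issue):

#include <GL/gl.h>

void set_min_filter(GLuint tex, int trilinear)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    /* GL_LINEAR_MIPMAP_NEAREST = bilinear within one mip level (visible
       banding between levels); GL_LINEAR_MIPMAP_LINEAR = trilinear, which
       blends adjacent levels and hides the transitions. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    trilinear ? GL_LINEAR_MIPMAP_LINEAR
                              : GL_LINEAR_MIPMAP_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}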
You haven’t seen sub-par quality in Doom 3 until you’ve seen it on an nVidia 7000 card with default driver texture quality settings. Wall surfaces would wave like flags during movement, while floors shimmered like a swarm of ants.
I put it on HQ and only got shimmering with AF enabled when I had a 6800GT. I never used default quality settings.

As for the T-junctions, I don't know, because nvidia was reported not to have problems with them. That was why I was asking about them.

Don't get me wrong, nvidia isn't all that great either. We need a 3rd (other than intel) or even 4th player in the GPU industry. There are many things about nvidia's drivers that piss me off (no choice for forcing all existing RGBA and D formats, for example). I don't play games as much as I used to, because the technology used could be a lot better than it is. I'm sick of all the lossy compression, for example. I'd rather just have smaller textures and use up 2x as much HDD space for lossless audio in games.

Another example is that there aren't any good monitors, and all the HDTVs suck because they always have high input lag, are limited to 60Hz, and few have RGB LED backlights. My Apple LED cinema display is decent, but it still kind of sucks because it has at least ~12ms input lag, pretty poor contrast ratio, it's limited to 60 Hz, and it doesn't have a very high color gamut, but at least it's an H-IPS (no banding, excellent viewing angle), glossy, and doesn't have as much input lag as an HDTV would, even in game mode.