Virtually Jenna: ATI VS NV

TanisHalfElven

Diamond Member
Jun 29, 2001
cool. at least ati has the long bar where it matters (in the fps chart) (pun intended)

take that, nvidia fanboys.

Since the NVIDIA slogan is "The Way It's Meant to Be Played," we were let down, as Jenna wasn't meant to be played like this.

LOLllllllllllllllllll

this is even better
It should also be mentioned that when we ran Virtually Jenna in Direct3D mode with AA and AF enabled in the ATI drivers, no differences were noted in the game. AA and AF only worked when OpenGL was selected in the options menu of Virtually Jenna before the game was launched. Since NVIDIA was only getting 5 FPS in OpenGL mode at 1024x768, enabling the eye candy is reserved for those with ATI graphics cards when it comes to the ThriXXX rendering engine.

When it comes to Direct3D image quality, the differences were very obvious. Both the ATI and NVIDIA graphics cards had the highest AA setting enabled in the driver, and while it helped the NVIDIA image quality, no difference was seen with the ATI graphics card. The image with the NVIDIA card was near perfect if one can overlook the jaggies on the background images. The right side of the door frame wasn't smoothed out like it should be, but Direct3D is hands down better on NVIDIA.

AHH NVIDIA does have something going for it.

When it came to OpenGL testing, it was hard to notice any difference between the ATI and NVIDIA graphics cards, other than the fact that the NVIDIA card ran at 5 frames per second while the ATI card ran the same scene at 85 frames per second with the eye candy enabled.

too bad, nvidiots


 

lopri

Elite Member
Jul 27, 2002
What is up with these "NVidiot" comments? I didn't mean to contribute to the never-ending flame war. Let's behave like adults. :lips:
 

TanisHalfElven

Diamond Member
Jun 29, 2001
Originally posted by: lopri
What is up with these "NVidiot" comments? I didn't mean to contribute to the never-ending flame war. Let's behave like adults. :lips:

sorry, i just felt like it. i've been seeing too many nvidia-biased posts these days. i apologize.
 

BFG10K

Lifer
Aug 14, 2000
So baby, how do you like it? SLI or Crossfire? Or perhaps a little of both :p
 

Nightmare225

Golden Member
May 20, 2006
Originally posted by: TanisHalfElven
Originally posted by: lopri
What is up with these "NVidiot" comments? I didn't mean to contribute to the never-ending flame war. Let's behave like adults. :lips:

sorry, i just felt like it. i've been seeing too many nvidia-biased posts these days. i apologize.

We're the ones who are going to have a field day dealing with the kind of stuff you're talking about when R600 approaches... :Q
 

Dainas

Senior member
Aug 5, 2005
Ever think of the possibility that there might be a line of code missing for the OpenGL path on NVIDIA cards? Hypocrite is a strong word, but it seems so weak when it's applied to the ATI fans here. They're the same people who would undoubtedly stomp their feet and scream "LAZY DEVS!!!" if it were the other way around and ATI had gotten the short end of the optimizing stick from the ThriXXX developers.
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: Dainas
Ever think of the possibility that there might be a line of code missing for the OpenGL path on NVIDIA cards? Hypocrite is a strong word, but it seems so weak when it's applied to the ATI fans here. They're the same people who would undoubtedly stomp their feet and scream "LAZY DEVS!!!" if it were the other way around and ATI had gotten the short end of the optimizing stick from the ThriXXX developers.

Maybe they used dynamic branching? NVIDIA cards aren't necessarily guaranteed to run shaders as well as ATI cards, even when it's the exact same code. Usually, mistakes in the shader would result in a failure to compile or incorrect rendering output. Unless they used different rendering code for NVIDIA and overlooked a gross mistake, I wouldn't be so quick to blame the devs.
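
To make the branching point concrete, here's a toy C sketch of why the exact same shader can cost very different amounts on different hardware. GPUs of that era shaded pixels in fixed-size batches: if any pixel in a batch takes the expensive side of a dynamic branch, the whole batch pays for it. The batch sizes below (~48 pixels for an X1900-class card, ~1024 for a 7900-class card) are commonly cited figures for that generation and are assumed here purely for illustration; nothing in this sketch is measured from Virtually Jenna.

#include <stdio.h>

#define PIXELS 4096

/* Count the pixels forced through the expensive shader path when the
 * GPU resolves a dynamic branch at a given batch granularity: if any
 * pixel in a batch takes the branch, every pixel in it pays. */
static int pixels_paying(const int *takes_branch, int n, int batch)
{
    int cost = 0;
    for (int b = 0; b < n; b += batch) {
        int any = 0;
        for (int i = b; i < b + batch && i < n; i++)
            any |= takes_branch[i];
        if (any)
            cost += (b + batch <= n) ? batch : (n - b);
    }
    return cost;
}

int main(void)
{
    /* Hypothetical frame: only a small 100-pixel cluster (say, the
     * character model) needs the expensive shading path. */
    int takes_branch[PIXELS] = {0};
    for (int i = 0; i < 100; i++)
        takes_branch[i] = 1;

    /* Assumed batch sizes: ~48 px (X1900-class) vs ~1024 px (7900-class). */
    printf("fine batches (48 px):     %d pixels shaded expensively\n",
           pixels_paying(takes_branch, PIXELS, 48));
    printf("coarse batches (1024 px): %d pixels shaded expensively\n",
           pixels_paying(takes_branch, PIXELS, 1024));
    return 0;
}

For the same 100 expensive pixels, the coarse-batch card ends up shading roughly seven times as many pixels through the slow path (1024 vs. 144), which is how identical shader code can produce wildly different frame rates.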
 

Cooler

Diamond Member
Mar 31, 2005
I think this is the first time ATI has clearly murdered NVIDIA in OpenGL. I really think it must be a driver problem, or they are just using features that can only be implemented in hardware by ATI. Also, a free demo of the game would most likely suck, as they want you to buy the real thing.
 

Dainas

Senior member
Aug 5, 2005
Yeah, that would hold water if there were a 10%, 15%, or even 30% difference in performance, but not for what the benchmarks show.
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: Cooler
I think this is the first time ATI has clearly murdered NVIDIA in OpenGL. I really think it must be a driver problem, or they are just using features that can only be implemented in hardware by ATI. Also, a free demo of the game would most likely suck, as they want you to buy the real thing.

No, this is the second time. The first time was in the MunkyMark SM3 test.
 

Kromis

Diamond Member
Mar 2, 2006
Originally posted by: Munky
Originally posted by: Cooler
I think this is the first time ATI has clearly murdered NVIDIA in OpenGL. I really think it must be a driver problem, or they are just using features that can only be implemented in hardware by ATI. Also, a free demo of the game would most likely suck, as they want you to buy the real thing.

No, this is the second time. The first time was in the MunkyMark SM3 test.

Very nice...very very nice! :p