[HardOCP] Nvidia cards much faster than AMD cards in "Rage" game.

happy medium

Lifer
Jun 8, 2003
14,387
480
126
"The GeForce card had an edge thanks to CUDA-based GPU texture transcoding, but it wasn't that big of a deal. After a few minutes in any given environment, the texture pop-in issue fades to being a minor annoyance with the AMD video cards installed."

What is CUDA GPU texture transcoding?
The GTX 560 Ti was beating a 6970 because of this?

http://www.hardocp.com/article/2011/10/19/rage_gameplay_performance_image_quality/7
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Nvidia paid for a feature? That makes them come out on top?
(I mean, why couldn't it have been OpenCL instead, so both cards could have it?)

Normally the 570 and the 6950 are about equal in strength.
WITH transcoding the 570 is a tiny bit faster (enough to run 8xAA instead of 4xAA).


Anyway, the game coding in Rage is a joke.
Obviously the game is a piece of cr*p, with all the issues it has and the loading of textures.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
CUDA GPU transcoding is basically GPU-assisted transcoding of the heavily compressed megatextures used in Rage.

Anandtech had a bit of a write-up about it:

Something else worth discussing while we’re on the subject is Rage’s texture compression format. S3TC (also called DXTC) is the standard compressed texture format, first introduced in the late 90s. S3TC/DXTC achieves a constant 4:1 or 6:1 compression ratio of textures. John Carmack has stated that all of the uncompressed textures in Rage occupy around 1TB of space, so obviously that’s not something they could ship/stream to customers, as even with a 6:1 compression ratio they’d still be looking at 170GB of textures. In order to get the final texture content down to a manageable 16GB or so, Rage uses the HD Photo/JPEG XR format to store their textures. The JPEG XR content then gets transcoded on-the-fly into DXTC, which is used for texturing the game world.

Source..

ID also tried to use OpenCL to do the same thing on the Radeons, but the performance apparently wasn't good enough to justify including it in the final release.
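
Very roughly, the transcode step boils down to something like the sketch below. This is not id's actual code; the kernel, its names, and the naive min/max endpoint fit are made up for illustration, and it assumes the JPEG XR decode has already produced plain RGBA texels.

// Simplified sketch of what the CUDA transcode step does: take texels that have
// already been decoded from JPEG XR and recompress each 4x4 block into DXT1 on
// the fly, so the GPU can sample them like any other compressed texture.
#include <cuda_runtime.h>
#include <stdint.h>

__device__ uint16_t toRGB565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

// One thread per 4x4 texel block: pick the component-wise min/max as endpoints,
// then store a 2-bit index per texel selecting the closest of the 4 palette colors.
// Launch with, e.g., dim3 threads(8, 8) and a grid covering (width/4) x (height/4) blocks.
__global__ void transcodeToDXT1(const uchar4* texels, uint2* dxtBlocks, int width, int height)
{
    int bx = blockIdx.x * blockDim.x + threadIdx.x;   // 4x4 block column
    int by = blockIdx.y * blockDim.y + threadIdx.y;   // 4x4 block row
    int blocksX = width / 4, blocksY = height / 4;
    if (bx >= blocksX || by >= blocksY) return;

    // Gather the 16 texels of this block.
    uchar4 px[16];
    for (int y = 0; y < 4; ++y)
        for (int x = 0; x < 4; ++x)
            px[y * 4 + x] = texels[(by * 4 + y) * width + (bx * 4 + x)];

    // Naive endpoint selection: component-wise min and max of the block.
    uchar4 lo = px[0], hi = px[0];
    for (int i = 1; i < 16; ++i) {
        lo.x = min(lo.x, px[i].x); lo.y = min(lo.y, px[i].y); lo.z = min(lo.z, px[i].z);
        hi.x = max(hi.x, px[i].x); hi.y = max(hi.y, px[i].y); hi.z = max(hi.z, px[i].z);
    }

    // 4-color DXT1 palette: the two endpoints plus two interpolated colors.
    float3 pal[4];
    pal[0] = make_float3(hi.x, hi.y, hi.z);
    pal[1] = make_float3(lo.x, lo.y, lo.z);
    pal[2] = make_float3((2 * pal[0].x + pal[1].x) / 3, (2 * pal[0].y + pal[1].y) / 3, (2 * pal[0].z + pal[1].z) / 3);
    pal[3] = make_float3((pal[0].x + 2 * pal[1].x) / 3, (pal[0].y + 2 * pal[1].y) / 3, (pal[0].z + 2 * pal[1].z) / 3);

    // 2-bit index per texel: nearest palette entry by squared RGB distance.
    uint32_t indices = 0;
    for (int i = 0; i < 16; ++i) {
        int best = 0; float bestDist = 3.4e38f;
        for (int c = 0; c < 4; ++c) {
            float dr = px[i].x - pal[c].x, dg = px[i].y - pal[c].y, db = px[i].z - pal[c].z;
            float d = dr * dr + dg * dg + db * db;
            if (d < bestDist) { bestDist = d; best = c; }
        }
        indices |= (uint32_t)best << (2 * i);
    }

    // A DXT1 block is 8 bytes: two RGB565 endpoints (color0 > color1 selects
    // 4-color mode) followed by sixteen 2-bit indices.
    uint32_t endpoints = toRGB565(hi.x, hi.y, hi.z) | ((uint32_t)toRGB565(lo.x, lo.y, lo.z) << 16);
    dxtBlocks[by * blocksX + bx] = make_uint2(endpoints, indices);
}

Whether you do that in CUDA, OpenCL, or on the CPU, the job is the same; the question was just which path was fast enough to keep up with the streaming.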
 

brybir

Senior member
Jun 18, 2009
241
0
0
Anyway, the game coding in Rage is a joke.
Obviously the game is a piece of cr*p, with all the issues it has and the loading of textures.

I agree with the sentiment. A lot has been made of the texture pop-in, but the elephant in the room is that the game itself was just not all that exciting. I dealt with the pop-in on the PS3 version, but that did not change the fact that the game was just not that enjoyable for a lot of other reasons (length, story, etc.). Of course, that's only my opinion.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
Anyway, the game coding in Rage is a joke.
Obviously the game is a piece of cr*p, with all the issues it has and the loading of textures.

I agree with the sentiment. A lot has been made of the texture pop-in, but the elephant in the room is that the game itself was just not all that exciting. I dealt with the pop-in on the PS3 version, but that did not change the fact that the game was just not that enjoyable for a lot of other reasons (length, story, etc.). Of course, only my opinion.

The game runs great on my system. No texture pop-in at all. Distance views are amazing, but up close the textures often look worse than Doom 3.
 

brybir

Senior member
Jun 18, 2009
241
0
0
Why don't you guys look at the apples-to-apples comparison?

Normally you follow up that sort of statement with a sentence on why the above-referenced test was not an apples-to-apples comparison.

In most of the tech world, the question is "what tool can get the job done in the way that we want it done?" The vendor whose tool delivers the desired outcome is the one that wins that client's business. Why is this different with graphics cards?

The "job done" is playing the game. The tools that allow it to be played are video cards. Sometimes one tool is better than another at producing that outcome (playing the game), irrespective of how it achieves the goal. So we do have an apples-to-apples comparison here, because we are comparing the exact same job (playing Rage), changing up the tools used, and then testing which one does better at completing the job.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,046
549
136
Those 16xAA screencaps in the review are just plain awful IMO. Might as well call it blurovision.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Nvidia paid for a feature? That makes them come out on top?
(I mean, why couldn't it have been OpenCL instead, so both cards could have it?)

Normally the 570 and the 6950 are about equal in strength.
WITH transcoding the 570 is a tiny bit faster (enough to run 8xAA instead of 4xAA).

Anyway, the game coding in Rage is a joke.
Obviously the game is a piece of cr*p, with all the issues it has and the loading of textures.

Poor AMD, always the victim of a conspiracy.

ID also tried to use OpenCL to do the same thing on the Radeons, but the performance apparently wasn't good enough to justify including it in the final release

Unpossible! An AMD backed standard not seeing great performance on AMD parts?

Personally, I would have liked to see how Nvidia cards did running id's OpenCL path vs. AMD cards. I suspect the Nvidia cards would have done better.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Nah, it is just [H]. They don't know how to review stuff. They always get different results from everyone else. Who cares about 2560x1600 with 8xAA? [H] uses sucky CPUs.

 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Wasn't it happy medium who said [H] was a joke, that they are biased, and that nobody should bother reading their reviews?
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
GPU transcoding is unnecessary if you have 1GB+ of VRAM. id made the texture page size too small and then added transcoding to make it stream faster. It was found that increasing the texture page size eliminated much of the streaming, so transcoding could be turned off. Would you all rather have more efficient streaming, or hardly any streaming at all with the textures held in VRAM? The console-sized maps are easily held in a PC's much larger pool of video memory.
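
To put rough numbers on that, here is a quick back-of-the-envelope sketch; the page image sizes, layer count, and bytes-per-texel below are assumptions for illustration, not id's actual values.

// Back-of-the-envelope VRAM math for the virtual-texture physical page image.
#include <stdio.h>

int main(void)
{
    const double bytesPerTexel = 1.0;   // assume DXT5-class storage, 8 bits per texel
    const int layers = 3;               // assume diffuse + normal + specular layers

    for (int size = 4096; size <= 8192; size *= 2) {
        double mb = (double)size * size * bytesPerTexel * layers / (1024.0 * 1024.0);
        printf("%d x %d page image: ~%.0f MB of video memory\n", size, size, mb);
    }
    return 0;
}

Under those assumptions a 4096x4096 page image is around 48 MB and an 8192x8192 one around 192 MB, which is why the larger setting is comfortable on a 1GB+ card but out of reach for the consoles.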
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Wasn't it happy medium who said [H] was a joke, that they are biased, and that nobody should bother reading their reviews?

Yes, and they are only a joke when their results are not favorable to your views.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Nah, it is just [H]. They don't know how to review stuff. They always get different results from everyone else. Who cares about 2560x1600 with 8xAA? [H] uses sucky CPUs.

If you read how the test was done by [H], you will then understand why there is a difference in results. [H]'s tests are done in the Dead City Center area, the most demanding area of the game.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
And who praised Kyle for his benchmarking ways?

Haven't seen anyone in this thread yet, but I do not think [H] does bad reviews; I actually like them. I wish they tested more games. I don't always agree with their conclusions, but I never read them anyway; I look at the data and draw my own.

AT is still number 1 for me when it comes to reviews.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
"The GeForce card had an edge thanks to CUDA-based GPU texture transcoding, but it wasn't that big of a deal. After a few minutes in any given environment, the texture pop-in issue fades to being a minor annoyance with the AMD video cards installed."

What is CUDA GPU texture transcoding?
The GTX 560 Ti was beating a 6970 because of this?

http://www.hardocp.com/article/2011/10/19/rage_gameplay_performance_image_quality/7

So you came here and posted that just to taunt AMD users? What a waste of time. It's well known that a GTX 580 performs better than a 6970 at 2560x1440, and it had better, because it costs $235-250 more on average. Oddly enough, CrossFire 6970s rather easily outperform 580 SLI. Whatever, though.

Yeah, you love Nvidia. We get it. We get threads like this from you every other week; do you have Nvidia logos on your car? Congrats on being a [wonderful person].

Play nice. That's not a request.
-ViRGE
 
Last edited by a moderator:

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
If you read how the test was done by [H], you will then understand why there is a difference in results. [H]'s tests are done in the Dead City Center area, the most demanding area of the game.

I was being sarcastic.

People (some) need to start to realize that a benchmark isn't definitive proof of absolute performance for the entire game, and that is why it is possible to get different results depending on how the reviewer chooses to test.

And who praised Kyle for his benchmarking ways?

I think this result is valid, as were the previous ones.
 
Last edited:

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
"The GeForce card had an edge thanks to CUDA-based GPU texture transcoding, but it wasn't that big of a deal."

Looks at thread title --> "nVidia cards much faster"

Looks at article text --> "The GeForce card had an edge thanks to CUDA-based GPU texture transcoding, but it wasn't that big of a deal"

Hmmm...if I didn't know any better I'd think you were an nV fanboy and/or an anti-AMD fanboy, but we all know that's not the case riiight?? :D

Yes, the nV cards are faster; they should be, especially since they cost more. Really though, the game itself has been slammed so much already that it's hard to care which card is faster.
 
Feb 19, 2009
10,457
10
76
[H] is not biased; their reviews are top notch. It is true that NV cards are much faster in Rage than AMD's. You won't see it if you test at low res or low AA, because Rage has a 60 fps cap.

But so what? NV cards are a lot faster than AMD in LP2, HAWX 2, Crysis 2, etc.

There's a theme here. You may know what it is.
 

ultimatebob

Lifer
Jul 1, 2001
25,134
2,450
126
Geez... I think we need a third major video card manufacturer, just so the ATI fanboys and Nvidia fanboys both have something to rally against. :)

Oh, and I mean a competitive video card manufacturer, not Intel or Matrox!