No graphics options in RAGE besides resolution, brightness, AA, and transcoding


address_unknown

Junior Member
Oct 5, 2011
23
0
0
Yeah, I didn't understand this either... Now I'm stuck with RAGE and my system can't handle the textures; I get a 3-4 second lag just looking around the world. And Steam doesn't give refunds. :(
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
The game is supposed to dynamically adjust settings on the fly based upon the resources of the system. Since there is a problem with the installation where folders can't be created, some files that should be there are missing. So far, people have found that the folders that are supposed to be in %appdata% are missing, and there is no way for users to figure out what else is missing. Combined with drivers missing OGL patches, a disaster is formed.

The missing customization is because the game is supposed to figure out the max settings needed to achieve 60 FPS, but given the installation problem, I am not surprised that the auto-configuration is not working at all, forcing the game to run at minimum spec. The result is improper use of VRAM and a CPU bottleneck.

Just wait a bit for a patch to fix all that. If you don't want to wait, you can manually change the graphics settings via cfg files as a workaround. Based upon what they say about transcoding, the game should be able to max out a GTX 460, so graphics should far exceed consoles. If you have a video card stronger than a GTX 460 with a weak CPU, transcoding will offload work from the CPU onto the GPU, avoiding a CPU bottleneck on high-end video card setups.
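For reference, a minimal sketch of that cfg workaround, using cvar names other posters in this thread report; the values are illustrative examples, not confirmed defaults:

```
// rageConfig.cfg -- example only; set com_videoRam to your card's actual VRAM
seta com_videoRam 1024        // VRAM in MB (a 1 GB card here)
seta r_useRenderThread 1      // multithreaded renderer
seta vt_maxaniso 4            // anisotropic filtering for virtual textures
seta vt_useCudaTranscode 2    // nVidia only: offload transcoding to the GPU
```

Whether these take effect before the patch lands will depend on the broken auto-configuration, so treat it as trial and error.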

All this should be amazing, except that it isn't working, yet. Nothing is working yet...:'(

Here are a few official remedies for tearing and blurry textures.
http://forums.bethsoft.com/index.php?/topic/1236250-rage-support/page__p__18743528#entry18743528
 
Last edited:

sticks435

Senior member
Jun 30, 2008
757
0
0
Hmmm, found this tidbit in an article on Tom's about the issues.

On Wednesday, Bethesda said that the RAGE team is currently working on an update that will allow players to more easily make configuration changes (rather than letting the game do it for you). More details regarding the update will be released shortly.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Makes you wonder just what in the hell goes on when they make a PC version of a game.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I'm a long-time fan, having been blown away by the original Quake, etc. I remember reading articles over the years, some of them (Carmack's) expressing bitterness, possibly towards PC piracy. IMO, it's why the game engine evolved the way it did for PC, with no regard for the amount of disk space it would take. The more the better :)
Here is an interview going back to 2008.
Carmack: PCs Not Important As Consoles

While digital distribution is an attractive option for publishers, developers and some gamers, Rage isn’t going to be an ideal candidate for that model. “It will be harder, because this is going to be a larger distribution; we’re at least at two DVDs and on the PC we might choose to be three DVDs to match what the game will look like for the PS3,” said Carmack. “So that makes for a pretty damn big download. I wouldn’t say it’s an optimal game for digital distribution, and I don’t think it’s a high-level strategic question.”
As a developer primarily on the PC, few studios know the shifts in the industry as well as id Software. There are constantly doom-and-gloom reports about how the PC is falling by the wayside next to consoles, some even blaming piracy. Carmack isn’t so convinced: “Well, it’s hard to second guess exactly what the reasons are. You can say piracy. You can say user migration. But the ground truth is just that the sales numbers on the PC are not what they used to be and are not what they are on the consoles.”
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
Here's a quick faq:

* Why don't they just de-transcode once and keep a cache around?
The format stored on disk is about 1/10th the size of the GPU-required formats. Keeping all of the detranscoded data in memory (or in a cache) would basically overwhelm available system memory. Keep in mind that it's not just how much RAM you have in your system. Rage is only a 32-bit executable, so it has a maximum addressable space of ~3G. Add to that the fact that Windows gets a little bitchy when memory usage for a single process gets too high, and you have a recipe for poor performance.
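To put rough numbers on that (a back-of-envelope sketch: the 1:10 expansion ratio and ~3 GB limit come from the FAQ above; the on-disk texture size is a made-up example figure, not RAGE's actual number):

```python
# Why a cache of detranscoded textures can't fit in a 32-bit process.
disk_textures_gb = 15.0     # hypothetical on-disk megatexture size, GB
expansion_ratio = 10        # GPU-ready formats ~10x the on-disk size (per FAQ)
addressable_gb = 3.0        # practical address-space ceiling, 32-bit process

detranscoded_gb = disk_textures_gb * expansion_ratio
print(detranscoded_gb)                    # 150.0 GB of detranscoded data
print(detranscoded_gb / addressable_gb)   # 50.0 -- 50x over budget
```

Even a partial cache eats address space the engine needs for everything else, which is why it transcodes pages on demand instead.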

Well what the [heck]? This is stupid. 32-bit should die off. We have 64-bit OSes, and RAM has never been cheaper; there's no reason for a gaming rig not to have 16GB of RAM. This is a PC we're talking about, not a damn console. After all, Internet Explorer and Firefox both have 64-bit versions (OK, there isn't a "blessed" official Mozilla public release of Firefox in 64-bit, but Waterfox is effectively the same thing).


Let's keep it clean
-ViRGE
 
Last edited by a moderator:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I'm a long-time fan, having been blown away by the original Quake, etc. I remember reading articles over the years, some of them (Carmack's) expressing bitterness, possibly towards PC piracy. IMO, it's why the game engine evolved the way it did for PC, with no regard for the amount of disk space it would take. The more the better :)
Here is an interview going back to 2008.
Carmack: PCs Not Important As Consoles

He has changed his stance now:

John Carmack: Developing RAGE for consoles was a big mistake, future titles will put priority on PC hardware
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I think that's exactly what it is. It's probably sticking all of the textures into the system memory, then constantly streaming them across PCIe to the GPU.

I also notice it's detecting the CPU as 3 MHz. That probably won't be an issue, but it's incorrect detection nevertheless.

LOL - watch the game details crank way-down to match your CPU @ 3mhz!

:awe:
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Well after I configured my rageconfig.cfg to force high graphical settings, the game's visuals are a mixed bag.

Animations are spectacular, and the level of detail in distance views is absolutely jaw-dropping to see in real time, but there are various small things in the game that have horrible textures - rocks, railings, some walls... Overall, when it looks its best it is quite amazing to see, and when you center on crappy textures, it looks like they left ultra-low-resolution assets in and forgot to put higher-resolution assets in place.

For what it's worth, I am getting perfect game performance (locked at 60 fps at all times on my GTX 560 Ti with 1 gig of VRAM), and the only time I can see texture pop-in is when I spin 180 degrees and specifically look at the edges of the screen. It's there for only a split second (I'd say less than a quarter second) and only on the edge. And for anyone interested, here is my rageConfig.cfg file:

seta com_videoRam 1024
seta r_useRenderThread 1
seta r_renderer best //highest rendering options
seta r_useHBAO 1 //Ambient occlusion
seta r_visDistMult 1
seta r_useMotionBlur 1 //use blur
seta r_skipBump 0
seta r_skipSpecular 0
seta r_skipNewAmbient 0
seta r_shadows 1
seta r_cgFragmentProfile best
seta r_cgVertexProfile best
vt_pageimagesizeuniquediffuseonly2 8192
vt_pageimagesizeuniquediffuseonly 8192
vt_pageimagesizeunique 8192
vt_pageimagesizevmtr 8192
//vt_restart
vt_maxaniso 16
image_anisotropy 16
vt_useCudaTranscode 2
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
After playing the game for an hour or so, I have come to a very solid conclusion that Rage will win awards for both the best looking game and worst looking game when various websites and magazines give out game awards for 2011. The level of detail for distance viewing is amazing - a new standard other games should strive for. Up close though, it's often WORSE than Doom 3 - no exaggeration.
 

dualsmp

Golden Member
Aug 16, 2003
1,627
45
91
How much difference is there when you turn CUDA transcode on and off, looking at CPU resources within Task Manager? Is there a big difference? Can someone take a screenshot of Task Manager with transcode on and then off?
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
The transcoding being Nvidia only is bull. Couldn't OpenCL or DirectCompute be used to do the same thing, and be hardware agnostic? The latter is DirectX 11 only, so I understand why id wouldn't use it, but OpenCL should play very nicely with OpenGL. I bet Nvidia paid id a fair bit of money to make transcoding CUDA only, but that's just a guess.

As for the bugs and driver issues specifically...le sigh. Yet another game PC gamers were looking forward to is a dud, and not only that, it's in the realm of cluster****. Hopefully id, AMD, and Nvidia can pull their act together and give PC gamers the experience they were promised.
 

(sic)Klown12

Senior member
Nov 27, 2010
572
0
76
The transcoding being Nvidia only is bull. Couldn't OpenCL or DirectCompute be used to do the same thing, and be hardware agnostic? The latter is DirectX 11 only, so I understand why id wouldn't use it, but OpenCL should play very nicely with OpenGL. I bet Nvidia paid id a fair bit of money to make transcoding CUDA only, but that's just a guess.


I read a link (will try to find it) saying that id tried an OpenCL implementation, but at the time of testing (don't know exactly when this was), performance wasn't up to the task. This also ties into various comments from Carmack that CUDA was easier for him to work with. Hopefully this is something that can be rectified in the near future, along with the possible high-resolution texture packs.

Edit: Information came from another Anandtech post, but no direct link.
http://forums.anandtech.com/showpost.php?p=32365118&postcount=1
 
Last edited:

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,129
3,067
146
Are there any guides on config tweaking? How different is it from Doom 3? Will com_showfps 1 and com_allowconsole 1 work?

Anyone try r_mode -1 and then setting from there? Does that still work?
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
The transcoding being Nvidia only is bull. Couldn't OpenCL or DirectCompute be used to do the same thing, and be hardware agnostic?
Maybe, but it might not be faster than the CPU doing it. There are several highly useful features that, thanks to design by committee, didn't make it into early specs, and don't have sufficiently wide support yet, most notably accumulators.

To have done it well, they'd have to have made it basically AMD- or nVidia-specific (they likely could do a Stream/APP version, but mature support for CUDA was there when they started work on the engine). For more general-purpose uses, this often isn't true, and it's often worth the performance hit (Adobe is now going to OpenCL, FI), but for a game you're dealing with coarse soft-real-time constraints, especially one that tries to keep a constant FPS. OpenCL could be good enough now, maybe, but it also seems that CPUs are good enough for the job as well, and that Id, AMD, and nVidia have some work to do, given that no prior engine has taxed their drivers to such a degree as Id Tech 5. For the latter case, the GPU option can be seen as an extra for nVidia GPU owners, since Id had the CUDA implementation done ages ago, not knowing whether or not it would be necessary to do it on the GPU. As better drivers come out in the coming weeks, we'll see how it all turns out.

DirectCompute...that's a good question. It may be impossible or plain difficult (DC more or less requires you to be dealing with texture map and frame buffer type data structures, unless you like pulling your hair out), or there could have been DX<->OGL performance issues that could get in the way.

While Id is getting a lot of flak, and I have no interest in this particular game, they are technically pushing the envelope with Id Tech 5, and the varied Windows PC environment is being a royal pain because of it.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Maybe, but it might not be faster than the CPU doing it. There are several highly useful features that, thanks to design by committee, didn't make it into early specs, and don't have sufficiently wide support yet, most notably accumulators.

To have done it well, they'd have to have made it basically AMD- or nVidia-specific (they likely could do a Stream/APP version, but mature support for CUDA was there when they started work on the engine). For more general-purpose uses, this often isn't true, and it's often worth the performance hit (Adobe is now going to OpenCL, FI), but for a game you're dealing with coarse soft-real-time constraints, especially one that tries to keep a constant FPS. OpenCL could be good enough now, maybe, but it also seems that CPUs are good enough for the job as well, and that Id, AMD, and nVidia have some work to do, given that no prior engine has taxed their drivers to such a degree as Id Tech 5. For the latter case, the GPU option can be seen as an extra for nVidia GPU owners, since Id had the CUDA implementation done ages ago, not knowing whether or not it would be necessary to do it on the GPU. As better drivers come out in the coming weeks, we'll see how it all turns out.

DirectCompute...that's a good question. It may be impossible or plain difficult (DC more or less requires you to be dealing with texture map and frame buffer type data structures, unless you like pulling your hair out), or there could have been DX<->OGL performance issues that could get in the way.

While Id is getting a lot of flak, and I have no interest in this particular game, they are technically pushing the envelope with Id Tech 5, and the varied Windows PC environment is being a royal pain because of it.

Ok, that makes sense. I understand that Rage was in development for six years, back when GPGPU was just getting off the ground. If CUDA was the only thing really viable for game development when they were looking, that's understandable. As an AMD user I still feel like I got the short end of the stick, though. :(