
Unreal 3 performance review

JSt0rm

Lifer
First off, there's a typo in the review: the clock-for-clock graphs show the wrong resolution. It should read 1920x1200, but it shows 1024x768.

Secondly, I had one of those eyebrow-raising moments when the 2900 XT did so well in the demo. I know it's early, but something nags me about how I always feel like Nvidia comes out of the gate with a bang while the ATI part is the long-distance runner.

I own the 8800 GTS 640MB, and it was hands down the best video card purchase I've made, considering I bought it the day it was released. I'm pleased with its performance and think it looks nice. I came from an X1800 XT that I thought had great IQ for its day and held up well in many games as it aged. I do like seeing that the ATI part could actually be competitive, and I really hope that helps out on the CPU side too. No time to feed flamers, but I just thought it was cool to see the underdog take even a small bit.

It seems like the Unreal 3 engine is going to play very nicely on our multicore CPUs as well. Good times ahead for the PC, I think.
 
Too bad that UT3 in its current form looks like utter crap. Seriously, it's not special anymore. Maybe two years ago when it was announced and first shown off, but now... it's blah.

Unless the full-resolution textures are so eye-poppingly amazing or something... The jagged edges make me want to puke, too.
 
The 2900 is doing so well because there is no AA in DX9 mode in UT3. AA is what usually tanks the R600's performance.
 
Originally posted by: cmdrdredd
Too bad that UT3 in it's current form looks like utter crap. Seriously, it's not special anymore. Maybe 2 years ago when announced and first shown off, but now...it's blah.

Unless the full resolution textures are so eye poppingly amazing or something... The jagged edges too makes me want to puke.

If you're getting jagged edges, you're either at too low a resolution or you need to increase the top slider under the advanced video preferences to 100%. It's the one above the 'texture detail' and 'world detail' sliders, and it basically controls the ACTUAL rendered frame, which then gets scaled up to fill the active resolution if you set it to less than 100%. At 100%, the game renders at your actual specified resolution.

The game uses lookup tables when it detects your graphics card and general system speed to determine what settings to use. Many Nvidia and lower-end ATI systems aren't being set up well in the beta and require some hand tweaking. Some of this is documented in the readme file; the rest can be gleaned from a few threads on the Epic forums. I'd also gather that the beta and demo releases will give them the opportunity to fine-tune the defaults so that not as many people wind up with 'blocky'-looking gaming experiences...
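As a rough sketch of what that top slider does (the function name here is illustrative, not the engine's actual API): the game renders the frame at some percentage of your chosen resolution, then stretches it to fit, which is exactly where the blockiness comes from at anything under 100%.

```python
def internal_resolution(width, height, screen_percentage):
    """Resolution the frame is actually rendered at before being
    upscaled to fill the display resolution."""
    scale = screen_percentage / 100.0
    return int(width * scale), int(height * scale)

# At 100%, the game renders at the full output resolution:
print(internal_resolution(1920, 1200, 100))  # (1920, 1200)

# Below 100%, the frame is rendered smaller and stretched to fit,
# which is what produces the blocky, jagged look:
print(internal_resolution(1920, 1200, 75))   # (1440, 900)
```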

 
I haven't had the time to read all the text in the article, but I noticed there's no 2900 Pro? Don't they have one?

Anyway, I'm glad my X1950 Pro will still be good in UT3-engine games in the future.
 
Originally posted by: cmdrdredd
Too bad that UT3 in it's current form looks like utter crap. Seriously, it's not special anymore. Maybe 2 years ago when announced and first shown off, but now...it's blah.

Unless the full resolution textures are so eye poppingly amazing or something... The jagged edges too makes me want to puke.
Too bad that AA only works on GF8 cards at the moment(?). It would sure be nice to see what kind of performance difference there is when AA is enabled on both the HD 2900 and GF8 cards.
 
Am I wrong in assuming that AA works with UT3 if you turn it on with a third-party app like RivaTuner? That's what I had to do with BioShock, so give that a try if you guys really want your AA in UT3.
 
Just rename UT3demo.exe -> Bioshock.exe, then go to the Nvidia control panel and change the BioShock settings. The problem is that this now changes both games..
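The trick above amounts to copying the demo binary under BioShock's name so the driver matches it against BioShock's AA-forcing profile. A sketch of the file steps, using a throwaway directory with a dummy file standing in for the real binary (the actual path would be the UT3 demo's install folder):

```shell
demo_dir=$(mktemp -d)                                      # stand-in for the UT3 demo folder
echo "fake exe" > "$demo_dir/UT3Demo.exe"                  # stand-in for the real binary
cp "$demo_dir/UT3Demo.exe" "$demo_dir/UT3Demo.exe.bak"     # keep a backup of the original
cp "$demo_dir/UT3Demo.exe" "$demo_dir/Bioshock.exe"        # driver now matches Bioshock's profile
ls "$demo_dir"
```

Copying rather than renaming keeps the original launcher intact, but as noted, any Nvidia control panel changes to the "Bioshock" profile will then affect both games.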
 
Originally posted by: cmdrdredd
Too bad that UT3 in it's current form looks like utter crap. Seriously, it's not special anymore. Maybe 2 years ago when announced and first shown off, but now...it's blah.

Unless the full resolution textures are so eye poppingly amazing or something... The jagged edges too makes me want to puke.

Supposedly, not all of the eye candy was included and turned on in the demo, in order to keep the file size down. People have asked the same thing in the Epic forums.

I thought it looked and felt great, but then again, I'm playing for the feel of the game and the competition and not the eye candy.
 
Originally posted by: Scalarscience
Originally posted by: cmdrdredd
Too bad that UT3 in it's current form looks like utter crap. Seriously, it's not special anymore. Maybe 2 years ago when announced and first shown off, but now...it's blah.

Unless the full resolution textures are so eye poppingly amazing or something... The jagged edges too makes me want to puke.

If you're having jagged edges you're either at too low a res or you need to increase the top slider under the advanced video preferences to 100%. It's the one above the 'texture detail' and 'world detail' sliders, and it basically controls the ACTUAL rendered frame, which can then be scaled up to fill the active resolution, assuming you set it to less than 100%. 100% will make the game render to your actual specified resolution.

The game uses lookup tables when it detects your graphics card and general system speed to determine what settings to use. Many nvidia and lower end ati systems are not being set well in the beta, and require some hand tweaking. Some of this is documented in the Readme file, the rest can be gleaned from a few threads on the Epic forums. I would also gather that the beta and demo releases will give them opportunity to fine tune the defaults so that not as many people wind up with 'blocky' looking gaming experiences...

NO, the game has jaggies OMG! NO AA!!! Seriously :roll:

 
Originally posted by: WhipperSnapper
Originally posted by: cmdrdredd
Too bad that UT3 in it's current form looks like utter crap. Seriously, it's not special anymore. Maybe 2 years ago when announced and first shown off, but now...it's blah.

Unless the full resolution textures are so eye poppingly amazing or something... The jagged edges too makes me want to puke.

Supposedly, all of the eye candy wasn't included with and turned on in the Demo in order to keep the file size low. People have asked the same thing in the Epic forums.

I thought it looked and felt great, but then again, I'm playing for the feel of the game and the competition and not the eye candy.

Well, personally the game is unplayable for me because the screen flashes to black and back every so often. During gameplay it's so distracting that I die (obviously). I notified Epic, and we'll see what happens.

I know the texture resolution is low... that's my point. Does nobody realize I was being sarcastic?! I KNOW the game looks worse than the release will, because they did that purposely. I also mentioned the lack of AA, which absolutely makes the game look terrible. I notice so many jagged edges everywhere.

The Source engine still looks strong after all this time, and so far, from what I've seen/played of both UT3 and Crysis, I don't see the hype. I am waiting to see what DX10 does for these titles, though.
 
Just change UT3demo.exe -> Bioshock.exe and go to Nvidia control panel and change bioshock settings. Problem is that now it changes both games..
A far better way is to use nHancer to make your own profile for each game.

NO, the game has jaggies OMG! NO AA!!! Seriously
Yup, games without AA generally look like total ass. We don't buy $300 video cards to run games at settings we used in 1999.
 