Unreal creator believes G70 beats R520


Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
There are more Unreal 2 games than I thought. A lot are just sequels to other games, though.

My point still stands: there isn't one Doom 3 engine game out yet. By the time there are lots of Unreal 3 engine games out, these newer cards will not be the fastest available.
 

Emultra

Golden Member
Jul 6, 2002
1,166
0
0
Not everyone will get R520, or get it right away. Same for G70.

So I think how Unreal 3 will perform on current-generation cards will make quite a difference.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Ackmed
There are more Unreal 2 games than I thought. A lot are just sequels to other games, though.

My point still stands: there isn't one Doom 3 engine game out yet. By the time there are lots of Unreal 3 engine games out, these newer cards will not be the fastest available.

sure there is . . . Doom 3 and its expansion pack. :p

:D

AND although the 6800 series may not be the fastest to run Unreal 3's SM 3.0 path and 1000+ shader instructions, the current ATI offerings won't be able to run them at all. ;)
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
It's a dubious distinction. All I'm saying is this:

John Carmack and Tim Sweeney are lead programmers on arguably the two biggest, most-licensed engine technologies in the gaming industry, and they work very closely with both graphics card companies. Their software will be used to power dozens of AAA titles over the coming years, and they DEFINITELY know what the hell they're talking about.

If these two gentlemen suggest that a certain IHV's hardware works best with their particular software, as Carmack did constantly even with the crippled NV3x series, it's probably a good idea to listen to what they have to say.

This does not mean these chips will be the best for the licensed offspring of these engines, nor that they are the best all around. All they are saying is that if you want the most out of their engines, when choosing between G70 and R520, G70 is at the moment better for them. I'm sure if you were talking to Valve's programmers about Source, they'd be recommending ATI technology. While I think Source is an outstanding engine, it has hardly been licensed at all, and Doom 3 is decidedly short on licenses as well. Of the next-gen engines that are around the corner, Unreal Engine 3 has been licensed by MANY MANY more developers than the other big two combined.

My conclusion: give some fair consideration to what Sweeney says if you're interested in playing Unreal Engine 3-based games on next-gen hardware. I don't know about you folks, but I'm not missing UT2007, Gears of War, or BioWare's next game.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
I think a distinction must be made between a developer's preference and a gamer's preference. Carmack and Sweeney are both developers, and I take their preferences as such (both would prefer the higher limits and supposedly more stable OGL drivers). I'm not sure that translates directly to a gamer's preference, especially since ATI cards have tended to run Unreal Engine games slightly better (exclusive effects are a separate issue). Obviously, Doom 3 runs better on nV cards. I'm curious to see if D3-engine games will, though obviously a GF6 is a safer bet ATM.

I'm still not convinced UE3 will run better on nV's SM3 card than on ATI's, though. I guess a GF6 will have an advantage over an X800, but that may be due to unique effects, not necessarily to shader limits (per the above quote). And remember that Gears of War is being developed for ATI's R500, which, while it seems to be unique, should still show some similarity in shader hardware to ATI's current and future parts, so you'd think GoW won't exactly be at a disadvantage on future (SM3) ATI h/w. But we'll see, I guess.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Pete
I didn't realize UE3 was SM3 from the ground up, but I haven't been following it that closely outside of the sweet screenshots and videos. AFAIK, it started as SM2 and moved to SM3 later. At least, that's what I gleaned from this interview:

my take was that in early development, sm2 HAD to be used in the interim, but the plan all along was sm3, so the concept/designs followed that. i can't recall where i read that, but that was my interpretation.

another interesting thing is that, even with dx9.0b, the full feature set is simply not yet usable, even though nv40 is capable of it; it will be by the time ue3 hits the shelves, which, contrary to some statements on these forums recently, is going to be some time next year:

NVIDIA's latest GeForce 6 series of video cards has support for Shader Model 3.0, which is exposed, but not usable, in the publicly available version of DirectX 9.0b. With the anticipated release of a revision of DX9 that allows the use of SM 3.0, could you tell us what the more interesting and usable advantages are that SM 3.0 offers over the current SM 2.0 model? Referencing the two public demos of UE3, what kinds of pixel and vertex shader 3.0 effects were used?

PS 3.0 utilizes a wide range of optimizations, from 64-bit frame-buffer blending to looping and dynamic conditionals for rendering multiple light interactions in a single pass without requiring a combinatorial explosion of precompiled shaders.

Our pixel shaders in the Unreal Engine 3 demos are typically 50-200 instructions in length, and are composed from a wide range of artist-controlled components and procedural algorithms.

Our vertex shaders are quite simple nowadays, and just perform skeletal blending and linear interpolant setup on behalf of the pixel shaders. All of the heavy lifting is now on the pixel shader side -- all lighting is per-pixel, all shadowing is per-pixel, and all material effects are per-pixel.

Once you have the hardware power to do everything per-pixel, it becomes undesirable to implement rendering or lighting effects at the vertex level; such effects are tessellation-dependent and difficult to integrate seamlessly with pixel effects.

Full Article


at any rate, while it's certainly fun to speculate, we really won't know for sure until games based on these actually ship.
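
to make sweeney's point about dynamic conditionals concrete, here's a minimal sketch in plain C (not real shader code, which would be HLSL; the struct names and toy light model are made up for illustration) of what sm3-style dynamic flow control buys you:

#include <stddef.h>

/* Hypothetical per-pixel inputs; a real engine feeds these via interpolants. */
typedef struct { float r, g, b; } Color;
typedef struct { Color color; float intensity; int enabled; } Light;

/* SM 3.0-style shading: one program loops over however many lights touch
 * the pixel and branches on a runtime condition, so a single shader covers
 * every light combination in one pass. */
Color shade_pixel_sm3(Color albedo, const Light *lights, size_t num_lights)
{
    Color out = { 0.0f, 0.0f, 0.0f };
    for (size_t i = 0; i < num_lights; i++) {   /* dynamic loop */
        if (!lights[i].enabled)                 /* dynamic conditional */
            continue;
        out.r += albedo.r * lights[i].color.r * lights[i].intensity;
        out.g += albedo.g * lights[i].color.g * lights[i].intensity;
        out.b += albedo.b * lights[i].color.b * lights[i].intensity;
    }
    return out;
}

/* SM 2.0 has no dynamic flow control, so the loop must be unrolled at
 * compile time: one precompiled shader per light count and per feature
 * on/off combination, i.e. the "combinatorial explosion" Sweeney mentions. */

on sm2 hardware the alternative is one pass per light with frame-buffer blending in between, which burns fill rate instead; sm3 collapses all of that into a single pass.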
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: ddogg

Well, show me one damn link that proves that statement you made... I don't care whether you suggest Nvidia to friends or crap like that... fact is, you make stupid comments here showing your absolute ignorance... not one person in this thread so far has agreed with your statement, so next time you make statements like that, have links to back them up... just goes to show what an ignorant fvcuk you are!! Plain and simple.

Guys, cheer up. Nvidia shipped one watercooled 6800 Ultra Extreme, to Ronin. ATI shipped 1.5 X850 XT PE cards (50% more than Nvidia). That's because the 2nd X850 XT PE that's "available" on the market is still pre-ordered by ATI's CEO and hasn't seen availability since December. Of course, the 1st one was sold to the Prime Minister of Canada.

With respect to this whole Unreal topic: looking at Chronicles of Riddick's 2.0++ mode (not even SM3.0) and Splinter Cell's SM3.0 settings, Nvidia cards actually suffer extensively due to the increased image quality they can't muscle just yet (until G70). Nvidia cards will most likely look better (i.e. play better, a play on words), but performance-wise they might be repeating the history of the 5900 series at DX9 in HL2. I mean, I'd rather play on an X800 XT as opposed to the 6800 series if the performance hit is large (given that ATI cards are almost always faster in all the latest shader-intensive games), even with reduced image quality. But if the 6800 series of cards is actually faster, then this is good news for 6800 owners. Then again, we might see a situation where overall GPU speed is so severely lacking that both cards will only be able to play at 1024x768 at max settings, just like the 5900 and 9800 series in Doom 3. In that game, all architectural benefits disappeared due to overall GPU inadequacies.
 

bersl2

Golden Member
Aug 2, 2004
1,617
0
0
Originally posted by: Pete
Carmack and Sweeney are both developers, and I take their preferences as such (both would prefer the higher limits and supposedly more stable OGL drivers).

I remember reading this somewhere.

I'm not sure that translates directly to a gamer's preference

What gamers' preference? What technical aspects would gamers know that developers wouldn't? The gamers' preference is the developer's preference, and it makes sense that both would recommend nVidia if both want to use OpenGL, because nVidia's implementation doesn't suck as much as ATI's.
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
What do NV30 and NV35 have to do with a 6800U? Exactly what issue are peeps getting in Riddick? Because I have none on my Ultra, bar that I don't really like the game.
Yes, the above chips were a joke; they were the 1st run of a new manufacturing process instead of hanging onto the old method, which ATI is still on, and ATI may have the same bad start as Nvidia unless they can learn from Nvidia's ballsup. The 6800 range is perfect but for the PVP (video processor) on AGP cards, and mine is working for MPEG-2 at least.

None of the games you mention rape my PC. FarCry is a PC raper with everything maxed and HDR, and in OpenGL, Doom 3: Resurrection of Evil is too with everything maxed; if you use the flashlight and run, it halved my fps to 31, but still playable. But turn off AA in Doom and it's far faster with everything maxed bar AA, and it looks no different, not jaggy anyhow.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: humey
What do NV30 and NV35 have to do with a 6800U

I was just saying the performance drop for 6800 cards using the higher visuals that SM3.0 brings is horrendous, just like the 5900 series in DX9, where even a Ti4200 plays faster with its DX8.1 features. I am just saying there might be a possibility that the 6800 series will be very slow if forced to run SM3.0, even if it can handle it, based on Riddick's performance.

Exactly what issue are peeps getting in Riddick? Because I have none on my Ultra, bar that I don't really like the game.

The game's "2.0++" mode enables soft stencil shadows for GeForce 6 users (although this comes at a remarkable performance hit, which you'll see in our benchmarks). Basically this renders the game Unplayable

FarCry is a PC raper with everything maxed and HDR

I doubt it... Research, Regulator, Training, Volcano: all only playable at 1024x768 with no AA/AF.

Finally, Splinter Cell: Chaos Theory:
Only playable at 1024x768 with no AA/AF.
Performance benefit of 6800 cards with SM 3.0 vs. 1.1 enabled: some.
But ATI cards are still faster.

So to conclude, I am not saying that SM3.0 will necessarily run slower on Nvidia's hardware, but its strongest (and most demanding) features certainly suggest that that is the more likely possibility. Furthermore, even if SM3.0 brings a boost in performance, ATI cards are faster in shader-intensive games (e.g. TAOD, Far Cry, HL2, Perimeter, Thief: Deadly Shadows, Hitman: Contracts, Deus Ex: Invisible War, the 4th test of 3DMark03), and you can bet that future games will be more shader-intensive. The reason Nvidia is faster in games like Doom 3 and Riddick is stencil shadows and the benefits its cards have in intensive "shadow" environments (besides those being OpenGL games).
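
For anyone wondering what makes stencil shadows such a different workload: the Doom 3-style technique rasterizes every shadow volume twice per light into the stencil buffer before any lighting is drawn, so stencil/z fill rate dominates the frame rather than shader math. Here's a rough C/OpenGL sketch of the classic depth-fail ("Carmack's reverse") passes; the draw_* functions are hypothetical placeholders, and GL_INCR_WRAP/GL_DECR_WRAP assume GL 1.4+ or the EXT_stencil_wrap extension:

#include <GL/gl.h>

/* Hypothetical scene hooks, not a real engine API. */
void draw_scene_ambient(void);   /* depth + ambient-only shading */
void draw_shadow_volumes(void);  /* extruded silhouette geometry */
void draw_scene_lit(void);       /* full per-pixel lighting      */

/* Depth-fail stencil shadowing for one light; Doom 3 and Riddick repeat
 * the stencil and lighting passes for every light in the scene. */
void render_with_stencil_shadows(void)
{
    /* Pass 1: lay down depth with ambient-only shading. */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
    draw_scene_ambient();

    /* Pass 2: rasterize shadow volumes into the stencil buffer only. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 0, ~0u);

    glEnable(GL_CULL_FACE);
    glCullFace(GL_FRONT);                         /* back faces: increment */
    glStencilOp(GL_KEEP, GL_INCR_WRAP, GL_KEEP);  /* on depth-test failure */
    draw_shadow_volumes();

    glCullFace(GL_BACK);                          /* front faces: decrement */
    glStencilOp(GL_KEEP, GL_DECR_WRAP, GL_KEEP);
    draw_shadow_volumes();

    /* Pass 3: add light only where stencil == 0 (outside shadow). */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 0, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE);                  /* additive lighting */
    draw_scene_lit();

    glDisable(GL_BLEND);
    glDisable(GL_STENCIL_TEST);
    glDepthMask(GL_TRUE);
}

Nearly none of that is shader work, which is why the usual "ATI wins the shader-heavy games" pattern flips in those two titles.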



 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Emultra
Originally posted by: RussianSensation
Basically this renders the game Unplayable

That would be, "unplayably". ;)

I am sorry, I don't think there is such a word in the English language, and I am Russian. :) But yeah, my English isn't the best, hehe, so maybe "renders" was a poor word to use here.

Edit: if there is, my bad, but the online dictionary doesn't have it. Maybe something like this makes sense: "Doom 3 is unplayably dark on my PC monitor", or a phrase such as "unplayably poor 3D rendering"?
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
And who told you two this misinfo? I have a 6800U and I don't have any games unplayable with all the eye candy and PS 3.0 and HDR. You both need to go look at that guy's Splinter Cell benches with PS 1.1 versus PS 3.0, then PS 3.0 with HDR; there was not much of a drop in any of the tests.

Unless you own one of these cards and try it, you won't know; don't believe everything you read here or on websites.

Still has ZERO to do with your comparison to the fooked-up 5800-5950 series cards in DX9.

Splinter Cell 3 CT is a doddle to play maxed; it's nowhere near the games I mentioned in the other thread for raping my PC.

Sure, use an ATI and get a few fps more in some games, but you're seeing the details with 2.5-year-old visuals.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: apoppin
Originally posted by: Ackmed
There are more Unreal 2 games than I thought. A lot are just sequels to other games, though.

My point still stands: there isn't one Doom 3 engine game out yet. By the time there are lots of Unreal 3 engine games out, these newer cards will not be the fastest available.

sure there is . . . Doom 3 and its expansion pack. :p

:D

AND although the 6800 series may not be the fastest to run Unreal 3's SM 3.0 path and 1000+ shader instructions, the current ATI offerings won't be able to run them at all. ;)

That's still one game.

Good job going off-topic trying to troll. The fact is, you don't know what the ATI cards will be running.
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
BTW, Doom 3 is easy to max, but the add-on Resurrection of Evil is more of a strain on the PC. It's not the same; it must have newer stuff in it. I know the baddies, esp. the bosses, had better detail.

I enjoyed both, but the add-on was shorter, nicer, and harder, esp. the end boss; unlike in Doom 3, where he wasn't the hardest part of the game.
 

ddogg

Golden Member
May 4, 2005
1,864
361
136
Originally posted by: humey
BTW, Doom 3 is easy to max, but the add-on Resurrection of Evil is more of a strain on the PC. It's not the same; it must have newer stuff in it. I know the baddies, esp. the bosses, had better detail.

I enjoyed both, but the add-on was shorter, nicer, and harder, esp. the end boss; unlike in Doom 3, where he wasn't the hardest part of the game.

lol, the last boss was nearly impossible to kill in Doom 3... used cheats to finish him off!! ;)
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
I was lucky: I had a lot of heavy weapons left, and the Soul Cube added to the BFG killed him. Ideally I only used the Soul Cube on the big baddies, unless I had low health; then I killed the others with it and was back to 100%.

In the add-on it's a nightmare to do, believe me. I won't say what, in case none of you have seen it.
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
To save you having to use Google, as it would have been my 1st stop, here are 3 off the top of my head, as I'm too busy.

FarCry if patched to 1.3 or higher
Splinter Cell 3 CT
Riddick
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Emultra
Unplayable: adjective; unplayably: adverb. Simple. :D

Oh well, OT...
Man, it's been a while, but I don't think "unplayable" is modifying "renders." I'd think unplayable is the correct term, in the same vein as "renders his point moot" and "rendered useless." "Unplayable" is modifying "game," not "renders," AFAIK.

Anyway, new GoW pic to drool over. I'd be surprised if a 6800U could give 30+ fps average with that kind of quality--ignoring the AA, as it's a PR shot. (Then again, I don't see anything that screams SM3 or FP blending/HDR, either, so I'd think an X800 would be able to keep up with a 6800.) Alls I sees is wicked graphics. The next gen can't get here soon enough. :)
 

Emultra

Golden Member
Jul 6, 2002
1,166
0
0
Yes, I know, but I was playing on the term "rendering" as in rendering graphics. Thereby, "rendering [the game] unplayably". ;) :)
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Pete
Originally posted by: Emultra
Unplayable: adjective; unplayably: adverb. Simple. :D

Oh well, OT...
Man, it's been a while, but I don't think "unplayable" is modifying "renders." I'd think unplayable is the correct term, in the same vein as "renders his point moot" and "rendered useless." "Unplayable" is modifying "game," not "renders," AFAIK.

Anyway, new GoW pic to drool over. I'd be surprised if a 6800U could give 30+ fps average with that kind of quality--ignoring the AA, as it's a PR shot. (Then again, I don't see anything that screams SM3 or FP blending/HDR, either, so I'd think an X800 would be able to keep up with a 6800.) Alls I sees is wicked graphics. The next gen can't get here soon enough. :)



Link's broken.