Call of Juarez DX10 benchmark/demo

doggyfromplanetwoof

Senior member
Feb 7, 2005
532
0
0
I told my friend he wasted money on an 8800GTX.

This demo here brings it to its KNEES!!!! It doesn't get above a 20 FPS average. Lol


I feel sorry for the poor bastards who got an 8xxx video card for DX10.
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
That demo uses a lot of DX10 features; I doubt the first "DX10 games" will feature that many effects exclusive to it. It should start just like DX9 did, with a majority of DX8 effects alongside the slow integration of DX9. I remember Star Wars Galaxies being like that when it was first released. The first DX10 generation of GPUs (G80/R600) will be fine playing early DX10 games, but will probably get beaten by later DX10 games.

And buying a GTX is certainly not a waste as far as DX9 games go. Not to mention that we don't know whether any of the announced early DX10 games will actually be *good* games worth going SLI or next-gen for. Eye candy is one thing; good gameplay is another story. Sometimes I have more fun on my SNES than I can have with any of my recent 3D games on my PC...
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: doggyfromplanetwoof
I told my friend he wasted money on an 8800GTX.

This demo here brings it to its KNEES!!!! It doesn't get above a 20 FPS average. Lol


I feel sorry for the poor bastards who got an 8xxx video card for DX10.

You mean the poor bastards who will enjoy Crysis at smooth frame rates while you lucky "late-adopters" need to stick with substandard DirectX 9 graphics and worse performance?

I can deal with that. :)
 

solofly

Banned
May 25, 2003
1,421
0
0
Originally posted by: Nightmare225
You mean the poor bastards who will enjoy Crysis at smooth frame rates while you lucky "late-adopters" need to stick with substandard DirectX 9 graphics and worse performance?

I can deal with that. :)


:)
 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
Originally posted by: solofly
Originally posted by: Nightmare225
You mean the poor bastards who will enjoy Crysis at smooth frame rates while you lucky "late-adopters" need to stick with substandard DirectX 9 graphics and worse performance?

I can deal with that. :)


:)

:)
 

Skiutah

Member
Jan 30, 2007
188
0
0
Originally posted by: Nightmare225
Originally posted by: doggyfromplanetwoof
I told my friend he wasted money on an 8800GTX.

This demo here brings it to its KNEES!!!! It doesn't get above a 20 FPS average. Lol


I feel sorry for the poor bastards who got an 8xxx video card for DX10.

You mean the poor bastards who will enjoy Crysis at smooth frame rates while you lucky "late-adopters" need to stick with substandard DirectX 9 graphics and worse performance?

I can deal with that. :)
Hold the phone, Crysis is out?
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Skiutah
Originally posted by: Nightmare225
Originally posted by: doggyfromplanetwoof
I told my friend he wasted money on an 8800GTX.

This demo here brings it to its KNEES!!!! It doesn't get above a 20 FPS average. Lol


I feel sorry for the poor bastards who got an 8xxx video card for DX10.

You mean the poor bastards who will enjoy Crysis at smooth frame rates while you lucky "late-adopters" need to stick with substandard DirectX 9 graphics and worse performance?

I can deal with that. :)
Hold the phone, Crysis is out?

Key word: will.

Is Call of Juarez out? At least the DX10 version?
 

Woofmeister

Golden Member
Jul 18, 2004
1,385
1
76
LOL, it's a DX10 overlay on top of an old DX9 game that ran like crap when it first came out. Let's see how a native DX10 game runs, particularly after NVIDIA has had some more time to work on their drivers.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
The thing that sucks is that performance doesn't really improve when you lower the settings, except for AA. The difference between 2048x2048/High shadows and 1024x1024/Normal shadows was 1.8 FPS for me.

With maximum settings + 4xMSAA/16xAF, I get 10.3 FPS min / 18.0 FPS avg (8800GTS 640MB @ 612/2000).

With maximum settings, 0xAA/16xAF, I get 12.6 FPS min / 22.4 FPS avg.

At 1920x1200 w/ max settings, 0xAA/0xAF: 10.6 FPS min / 20.2 FPS avg (GPU @ 612/2000)
At 1920x1200 w/ max settings, 4xMSAA/16xAF: 8.9 FPS min / 15.6 FPS avg (GPU @ 612/2000)

While we obviously don't have a TRUE "DX10 title" yet, I believe a significant part of the problem is still drivers. Performance increases from recent drivers (v158.45, for example) have been significant, and only now is nVidia starting to focus on DX10. I think that when the real DX10 titles hit, we'll see much better performance than we do now.
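As a quick sanity check on that (a hypothetical back-of-the-envelope calculation, assuming the two unlabeled runs above are at the panel's native 1680x1050): 1920x1200 is about 1.31x the pixels of 1680x1050, yet the 0xAA average only falls from 22.4 to 20.2 FPS, roughly a 10% drop. A purely pixel-bound card would land nearer 17 FPS, which is one more hint that the bottleneck is somewhere else.

#include <cstdio>

int main() {
    // Figures quoted in the post above (max settings, 0xAA runs).
    const double px_native = 1680.0 * 1050.0;  // assumed res of the unlabeled runs
    const double px_high   = 1920.0 * 1200.0;
    const double fps_native = 22.4, fps_high = 20.2;

    // If rendering were purely pixel-bound, FPS would scale with 1/pixels.
    printf("pixel ratio: %.2fx\n", px_high / px_native);              // ~1.31x
    printf("expected FPS if pixel-bound: %.1f\n",
           fps_native * px_native / px_high);                         // ~17.1
    printf("actual: %.1f FPS (%.0f%% drop)\n",
           fps_high, 100.0 * (1.0 - fps_high / fps_native));          // ~10%
    return 0;
}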

P.S.: What I'm also wondering is how the heck I'm running 1920x1200 on a 1680x1050, 20.1" display.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Extelleron
The thing that sucks is that performance doesn't really improve when you lower the settings, except for AA. The difference between 2048x2048/High shadows and 1024x1024/Normal shadows was 1.8 FPS for me.

With maximum settings + 4xMSAA/16xAF, I get 10.3 FPS min / 18.0 FPS avg (8800GTS 640MB @ 612/2000).

With maximum settings, 0xAA/16xAF, I get 12.6 FPS min / 22.4 FPS avg.

At 1920x1200 w/ max settings, 0xAA/0xAF: 10.6 FPS min / 20.2 FPS avg (GPU @ 612/2000)
At 1920x1200 w/ max settings, 4xMSAA/16xAF: 8.9 FPS min / 15.6 FPS avg (GPU @ 612/2000)

While we obviously don't have a TRUE "DX10 title" yet, I believe a significant part of the problem is still drivers. Performance increases from recent drivers (v158.45, for example) have been significant, and only now is nVidia starting to focus on DX10. I think that when the real DX10 titles hit, we'll see much better performance than we do now.

P.S.: What I'm also wondering is how the heck I'm running 1920x1200 on a 1680x1050, 20.1" display.

Ha!

Sounds like yet another poorly executed DX9 game with a DX10 patch slapped over it.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Matt2
Originally posted by: Extelleron
The thing that sucks is that performance doesn't really improve when you lower the settings, except for AA. The difference between 2048x2048/High shadows and 1024x1024/Normal shadows was 1.8 FPS for me.

With maximum settings + 4xMSAA/16xAF, I get 10.3 FPS min / 18.0 FPS avg (8800GTS 640MB @ 612/2000).

With maximum settings, 0xAA/16xAF, I get 12.6 FPS min / 22.4 FPS avg.

At 1920x1200 w/ max settings, 0xAA/0xAF: 10.6 FPS min / 20.2 FPS avg (GPU @ 612/2000)
At 1920x1200 w/ max settings, 4xMSAA/16xAF: 8.9 FPS min / 15.6 FPS avg (GPU @ 612/2000)

While we obviously don't have a TRUE "DX10 title" yet, I believe a significant part of the problem is still drivers. Performance increases from recent drivers (v158.45, for example) have been significant, and only now is nVidia starting to focus on DX10. I think that when the real DX10 titles hit, we'll see much better performance than we do now.

P.S.: What I'm also wondering is how the heck I'm running 1920x1200 on a 1680x1050, 20.1" display.

Ha!

Sounds like yet another poorly executed DX9 game with a DX10 patch slapped over it.

The only way to get acceptable performance at 1920x1200 is to turn off the shadows... the average framerate goes up to 33.6 FPS, but that completely ruins the beauty of the game. It looks amazing maxed out; if only it weren't a slideshow, I'd love it.

As I said, I'm not sure we can blame our problems entirely on the game developers. This game needs more options to disable other things... the only options are resolution, shadow res/shadow quality, and AA. Apparently changing shadow quality doesn't make much of a difference until you turn shadows off completely, and the difference made by shadow res is negligible. Still, I believe the drivers need to improve. I don't know how good or bad nVidia's DX10 drivers really are, but I'm pretty sure they're not where they could be, and AMD's drivers certainly aren't either.

 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Matt2
Originally posted by: Extelleron
The thing that sucks is that performance doesn't really improve when you lower the settings, except for AA. The difference between 2048x2048/High shadows and 1024x1024/Normal shadows was 1.8 FPS for me.

With maximum settings + 4xMSAA/16xAF, I get 10.3 FPS min / 18.0 FPS avg (8800GTS 640MB @ 612/2000).

With maximum settings, 0xAA/16xAF, I get 12.6 FPS min / 22.4 FPS avg.

At 1920x1200 w/ max settings, 0xAA/0xAF: 10.6 FPS min / 20.2 FPS avg (GPU @ 612/2000)
At 1920x1200 w/ max settings, 4xMSAA/16xAF: 8.9 FPS min / 15.6 FPS avg (GPU @ 612/2000)

While we obviously don't have a TRUE "DX10 title" yet, I believe a significant part of the problem is still drivers. Performance increases from recent drivers (v158.45, for example) have been significant, and only now is nVidia starting to focus on DX10. I think that when the real DX10 titles hit, we'll see much better performance than we do now.

P.S.: What I'm also wondering is how the heck I'm running 1920x1200 on a 1680x1050, 20.1" display.

Ha!

Sounds like yet another poorly executed DX9 game with a DX10 patch slapped over it.

Yup, even the 2900s perform poorly with it...
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Nightmare225
Originally posted by: Matt2
Originally posted by: Extelleron
The thing that sucks is that performance doesn't really improve when you lower the settings, except for AA. The difference between 2048x2048/High shadows and 1024x1024/Normal shadows was 1.8 FPS for me.

With maximum settings + 4xMSAA/16xAF, I get 10.3 FPS min / 18.0 FPS avg (8800GTS 640MB @ 612/2000).

With maximum settings, 0xAA/16xAF, I get 12.6 FPS min / 22.4 FPS avg.

At 1920x1200 w/ max settings, 0xAA/0xAF: 10.6 FPS min / 20.2 FPS avg (GPU @ 612/2000)
At 1920x1200 w/ max settings, 4xMSAA/16xAF: 8.9 FPS min / 15.6 FPS avg (GPU @ 612/2000)

While we obviously don't have a TRUE "DX10 title" yet, I believe a significant part of the problem is still drivers. Performance increases from recent drivers (v158.45, for example) have been significant, and only now is nVidia starting to focus on DX10. I think that when the real DX10 titles hit, we'll see much better performance than we do now.

P.S.: What I'm also wondering is how the heck I'm running 1920x1200 on a 1680x1050, 20.1" display.

Ha!

Sounds like yet another poorly executed DX9 game with a DX10 patch slapped over it.

Yup, even the 2900s perform poorly with it...

I'd like to see some numbers with the release CoJ DX10 build plus the newer HD 2900XT drivers... the numbers we've seen from the 2900 are from an older build of the benchmark and the card's release drivers. If someone on here with a 2900XT can run some tests, that would be great.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I told my friend he wasted money on an 8800GTX.
Your friend's card will run the game well under the DX8 path while most other cards still crawl.
 

Skiutah

Member
Jan 30, 2007
188
0
0
Originally posted by: Nightmare225
Originally posted by: Skiutah
Originally posted by: Nightmare225
Originally posted by: doggyfromplanetwoof
I told my friend he wasted money on an 8800GTX.

This demo here brings it to its KNEES!!!! It doesn't get above a 20 FPS average. Lol


I feel sorry for the poor bastards who got an 8xxx video card for DX10.

You mean the poor bastards who will enjoy Crysis at smooth frame rates while you lucky "late-adopters" need to stick with substandard DirectX 9 graphics and worse performance?

I can deal with that. :)
Hold the phone, Crysis is out?

Key word: will.
Then how do you know the 8800GTX will have smooth frame rates with that game if it's not out?

 

Eomer of Aldburg

Senior member
Jan 15, 2006
352
0
0
Crysis had a playable demo at a computer show, and apparently they were running it on a Dell 30-inch display off an 8800GTX.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Too bad I bought my card to run CURRENT games @ 1920x1200 with maximum eye candy... I guess the OP doesn't know what kind of strain a 24-incher puts on a video card.
 

Raider1284

Senior member
Aug 17, 2006
809
0
0
Originally posted by: Eomer of Aldburg
Crysis had a playable demo at a computer show, and apparently they were running it on a Dell 30-inch display off an 8800GTX.

I played this demo at GDC this year. It was definitely on a 30-inch screen and on an 8800GTX; I don't remember if it was SLIed or not. The graphics/physics were turned down as well, but it had playable framerates. It dropped to semi-choppy/slow levels when the helicopter came in, but it was still playable.
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
Originally posted by: doggyfromplanetwoof
I told my friend he wasted money on an 8800GTX.

This demo here brings it to its KNEES!!!! It doesn't get above a 20 FPS average. Lol


I feel sorry for the poor bastards who got an 8xxx video card for DX10.

I bought my 8800GTX in November when I was considering waiting for the R600. As we now know from how the story unfolded, that was an excellent decision.

And who buys the 8800 for DX10 performance? I bought mine because it was the best at DX9.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Erm, guys, it's a frigging benchmark tool? 3DMark06 brings a lot of mid-range video cards to their knees, but the real-world performance of those cards is just fine. I don't really see the problem. And as mentioned before, the drivers most likely aren't completely optimized for it, and neither is the game itself.

 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: MarcVenice
Erm, guys, it's a frigging benchmark tool? 3DMark06 brings a lot of mid-range video cards to their knees, but the real-world performance of those cards is just fine. I don't really see the problem. And as mentioned before, the drivers most likely aren't completely optimized for it, and neither is the game itself.

This is different. This is an in-game benchmark that should reflect performance in the actual game; 3DMark is a test made solely for benchmarking and doesn't use any actual game engine.

I'm still wondering about this... how am I able to run 1920x1200 on my 1680x1050 monitor? It's definitely 1920x1200, judging by the performance decrease, and it's fullscreen just like any other resolution. This doesn't make sense to me.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Meh, wait for the real deal to come out, and then start crying if your expensive DX10 card can't hack it :p

And as for the res, I dunno. I think 3DMark06 used to do the same thing to my monitor, which has a native res of 1024x768; the official benchmark runs at 1280x1024, so it forces that resolution on your monitor. I bet it didn't look pretty?
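That lines up with how display-mode enumeration works for D3D10-class apps through DXGI (background only; nothing here is something the CoJ benchmark is documented to do): the driver can report modes the panel can't display natively, flagged as requiring scaling, and the GPU or the monitor's scaler resamples the output to fit. A minimal standalone C++ sketch (Windows/MSVC assumed) that lists every mode one output reports, including the scaled ones:

#include <cstdio>
#include <dxgi.h>
#pragma comment(lib, "dxgi.lib")

int main() {
    // Illustrative sketch: enumerate display modes the way a D3D10 app could.
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    IDXGIOutput* output = nullptr;
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter)) &&
        SUCCEEDED(adapter->EnumOutputs(0, &output))) {
        // First call returns the mode count; the second fills the array.
        // DXGI_ENUM_MODES_SCALING also includes modes the hardware must scale,
        // which is how a 1680x1050 panel can end up offering 1920x1200.
        UINT count = 0;
        output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM,
                                   DXGI_ENUM_MODES_SCALING, &count, nullptr);
        DXGI_MODE_DESC* modes = new DXGI_MODE_DESC[count];
        output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM,
                                   DXGI_ENUM_MODES_SCALING, &count, modes);
        for (UINT i = 0; i < count; ++i)
            printf("%ux%u @ %u/%u Hz, scaling=%d\n",
                   modes[i].Width, modes[i].Height,
                   modes[i].RefreshRate.Numerator,
                   modes[i].RefreshRate.Denominator,
                   (int)modes[i].Scaling);
        delete[] modes;
    }
    if (output) output->Release();
    if (adapter) adapter->Release();
    factory->Release();
    return 0;
}

On a 1680x1050 panel, that list can legitimately contain 1920x1200, which would let a fullscreen benchmark offer it like any other resolution and leave the scaler to squeeze the image down to native.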
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: MarcVenice
Meh, wait for the real deal to come out, and then start crying if your expensive DX10 card can't hack it :p

And as for the res, I dunno. I think 3DMark06 used to do the same thing to my monitor, which has a native res of 1024x768; the official benchmark runs at 1280x1024, so it forces that resolution on your monitor. I bet it didn't look pretty?

It looked fine to me... it's hard to tell when it's chugging along at less than 20 FPS, but it didn't look weird at all.
 

Skiutah

Member
Jan 30, 2007
188
0
0
Originally posted by: Raider1284
Originally posted by: Eomer of Aldburg
Crysis had a demo playable at like a computer show and apparently they were running a dell 30inch and running it off a 8800GTX

I played this demo at GDC this year. It was def on a 30 inch screen, and on a 8800gtx, dont remember if it was SLIed or not. The graphics/physics were turned down as well but it had playable framerates. Dropped to semi-choppy/slow levels when the helicopter came in but still playable.
Oh, okay. I thought for a second that Nightmare was saying he had Crysis already; when it turned out he didn't, I was like, "WTF? How do you know how it will handle that game?"

How long ago was that demo? If Crysis was finished enough to run on a 30" Dell and an 8800GTX, why the wait? How do we know the game won't end up requiring more GPU power than it has so far? Are they adding optimizations, more DX10 features, or both? (Sorry for thread-crapping, I just don't see why Crysis isn't out if it's finished enough for people to play demos of it.)