Anyone with UT3, Vista, 88/9800GT, and 1920x1200?

Dec 30, 2004
12,553
2
76
UT3 under Vista 32 or 64 is _significantly_ slower than XP 32-bit. I'm talking regular dips to 30fps in certain areas, whereas XP keeps it pegged at 60 most of the time, with occasional drops to 45-50.
CoD4 is the only other game I play regularly, but its performance on Vista is fine.

I'm thinking it's CPU related. My guess is that the new driver model (Vista having to switch between kernel mode and user mode), combined with my CPU only having 1MB of cache, is the main problem. Can anyone with the same resolution and graphics card, but a more powerful processor, post how your Vista performance is?

My friend with his 2.4GHz (3MB cache) Core 2 mobile processor and Nvidia 8700M graphics (he games UT3 at 14x9 IIRC) says his FPS is pegged at 60 all the time, if that helps. That could be because of the cache, or because he's gaming at a lower resolution (and if it's the lower resolution, it could still be a CPU-bound situation because of the way WDDM works).

Now that I've gotten used to the PreCaching in Vista, I don't like having to go back to XP for gaming...but it's just too slow in Vista.
I wanna go fast. Do I need an E8400, or would a Q6600 suffice to get past this?
 
Nov 26, 2005
15,188
401
126
Yeah, I noticed it when I made the switch to Vista 64. Lately it's been fine, meaning it runs super fast, as I have finally got my quad up and running for it :D - it's been a long time. And not to mention, sometimes it's server side. I was playing on the Unreal Grrrls CTF server yesterday and it was horridly slow; players were also complaining about the server lagging.

try 1024x768
 
Dec 30, 2004
12,553
2
76
Originally posted by: BTRY B 529th FA BN
Yeah, I noticed it when I made the switch to Vista 64. Lately it's been fine, meaning it runs super fast, as I have finally got my quad up and running for it :D - it's been a long time. And not to mention, sometimes it's server side. I was playing on the Unreal Grrrls CTF server yesterday and it was horridly slow; players were also complaining about the server lagging.

try 1024x768

Do you think it's the quad, or the higher GHz+cache, that made the difference?
 
Nov 26, 2005
15,188
401
126
It's a combination of both. I read that the extra 3MB of cache going from a 7200 to an 8400 gave about 30 more fps at peak. Also, the game utilizes all four cores.

As for clock speed: I went from 3.6GHz to 4GHz on my 8400 and there was a noticeable difference.
 
Dec 30, 2004
12,553
2
76
Originally posted by: BTRY B 529th FA BN
It's a combination of both. I read that the extra 3MB of cache going from a 7200 to an 8400 gave about 30 more fps at peak. Also, the game utilizes all four cores.

As for clock speed: I went from 3.6GHz to 4GHz on my 8400 and there was a noticeable difference.

Interesting.

If I disable D3D10 it turns World Detail back down to 1, and if I turn it back up to 5 it re-enables D3D10. Hurray for fake DirectX 10 "enhancements"; there's nothing different between the 5 setting in Vista and the 5 setting on XP. Heh.
Lowering the resolution doesn't help any; it hits 25-30fps in areas where XP is at 45-55.
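
For what it's worth, UT3 keeps its settings in UTEngine.ini (under My Documents\My Games\Unreal Tournament 3\UTGame\Config), so the D3D10 toggle can be flipped there without touching the World Detail slider. The section and key names below are a best guess from memory and may differ by patch level, so treat this as a sketch and check against your own file:

```ini
; UTEngine.ini -- section/key names are a best guess, verify against your file
[Engine.Engine]
; force the DX9 path even with World Detail at 5
AllowD3D10=False
```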
 
Oct 16, 1999
10,490
4
0
You're getting a false reading from the detail slider; it controls several variables in certain combinations. There is a difference in the 5 setting between Vista and XP: Vista is using DX10, XP isn't. That's where your speed difference is coming from.
 
Dec 30, 2004
12,553
2
76
Originally posted by: Gonad the Barbarian
You're getting a false reading from the detail slider; it controls several variables in certain combinations. There is a difference in the 5 setting between Vista and XP: Vista is using DX10, XP isn't. That's where your speed difference is coming from.

Haha, good one, but no. There is no visual difference. Countless sites on the internet have confirmed this; even in Crysis there is nothing special DX10 does that DX9 won't do. Well, except for "slow your game down".
 
Oct 16, 1999
10,490
4
0
Originally posted by: soccerballtux
Originally posted by: Gonad the Barbarian
You're getting a false reading from the detail slider; it controls several variables in certain combinations. There is a difference in the 5 setting between Vista and XP: Vista is using DX10, XP isn't. That's where your speed difference is coming from.

Haha, good one, but no. There is no visual difference. Countless sites on the internet have confirmed this; even in Crysis there is nothing special DX10 does that DX9 won't do. Well, except for "slow your game down".

So turn off DX10 and get the speed you'd get under XP.
 
Dec 30, 2004
12,553
2
76
Originally posted by: Gonad the Barbarian
Originally posted by: soccerballtux
Originally posted by: Gonad the Barbarian
You're getting a false reading from the detail slider; it controls several variables in certain combinations. There is a difference in the 5 setting between Vista and XP: Vista is using DX10, XP isn't. That's where your speed difference is coming from.

Haha, good one, but no. There is no visual difference. Countless sites on the internet have confirmed this; even in Crysis there is nothing special DX10 does that DX9 won't do. Well, except for "slow your game down".

So turn off DX10 and get the speed you'd get under XP.

I have.

And I don't get that speed. With WDDM, Vista switches between user mode and kernel mode: it takes the commands from the game, moves into kernel mode, checks to make sure no VRAM addresses are being written to that shouldn't be, lets the graphics driver carry out those commands, then switches back to user mode; and then it repeats the process.

Stability at the cost of performance. I just wish it were more efficient. Maybe it would be if I had more cache? That's why I made this thread.
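
That per-call round trip is easy to model. Here's a toy back-of-the-envelope sketch (all numbers are invented for illustration, not measured) showing how a few extra microseconds of driver overhead per draw call can drag a frame rate down when the CPU side is the bottleneck:

```python
# Toy model: frame time = GPU work + CPU-side driver overhead per draw call.
# All constants are hypothetical, chosen only to illustrate the mechanism.

BASE_GPU_MS = 10.0       # GPU render time per frame (hypothetical)
DRAW_CALLS = 2000        # draw calls per frame (hypothetical)
XP_OVERHEAD_US = 2.0     # per-call driver overhead under XP (hypothetical)
VISTA_OVERHEAD_US = 5.0  # per-call overhead with WDDM's extra mode switches (hypothetical)

def fps(base_ms: float, calls: int, overhead_us: float) -> float:
    """Frames per second when a frame is GPU work plus per-call CPU overhead."""
    frame_ms = base_ms + calls * overhead_us / 1000.0
    return 1000.0 / frame_ms

if __name__ == "__main__":
    print(f"XP:    {fps(BASE_GPU_MS, DRAW_CALLS, XP_OVERHEAD_US):.1f} fps")     # 71.4 fps
    print(f"Vista: {fps(BASE_GPU_MS, DRAW_CALLS, VISTA_OVERHEAD_US):.1f} fps")  # 50.0 fps
```

A faster CPU (or bigger cache) effectively shrinks the per-call overhead term, which is why it can help even when the GPU hasn't changed.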
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Vista is garbage. I think it's DX10 also. Maybe if you edited that d3d setting, tried running in DX9 mode, and did an fps comparison - you might see a difference, or not??