
How bad would an Opteron 180 @ 2.4GHz bottleneck a HD4890?

Comparing your 260 at 35% lower clocks to a 9800/8800GT with our underclocked duos still left the 260 with almost a 50% advantage in average fps (closer to 40% in minimum fps).
 
Originally posted by: yh125d
Comparing your 260 at 35% lower clocks to a 9800/8800GT with our underclocked duos still left the 260 with almost a 50% advantage in average fps (closer to 40% in minimum fps).

Your CPU is slower clock-for-clock than mine, which could easily make the framerate lower. That alone could account for 1-3 fps right there.
 
The point is that even with his slow CPU, upgrading from an 8800GT to a GTX260 should yield about a 40-50% increase in Far Cry 2 at 16x10. Most other games should see similar enough results. The OP might even see a bit more because the card he's upgrading to is faster than a 260.
 
Originally posted by: yh125d
The point is that even with his slow CPU, upgrading from an 8800GT to a GTX260 should yield about a 40-50% increase in Far Cry 2 at 16x10. Most other games should see similar enough results. The OP might even see a bit more because the card he's upgrading to is faster than a 260.

It certainly won't be that great. Minimums and overall playability won't change much, if at all, with that CPU. I ran my GTX260 with a 5000+ X2 for a couple of days and it was a joke compared to running it with an E8500. The max framerate was noticeably higher than with the 4670 I was using in there, but the minimums were basically identical, and it dipped well below 20 fps many times no matter what settings I used.

You can clearly see from my Far Cry 2 numbers that his max framerate won't even be as good as his average framerate would be if he had a decent CPU.
 
Originally posted by: toyota
Originally posted by: yh125d
Comparing your 260 at 35% lower clocks to a 9800/8800GT with our underclocked duos still left the 260 with almost a 50% advantage in average fps (closer to 40% in minimum fps).

Your CPU is slower clock-for-clock than mine, which could easily make the framerate lower. That alone could account for 1-3 fps right there.

According to this thread, the difference between the E7200 and the E8500 really isn't that big (both at 3.17GHz).

% gain from E7200 to E8500:

Aquamark - 12%
3DMark06 score - 1%
3DMark06 CPU 1/2 fps - ~1%
Cinebench 1 CPU - -3%
Cinebench X CPU - 1%
Cinebench multiprocessor speedup - 4.5%
CPUMark - no change
Pifast - 3%

So yes, yours will be very marginally faster than mine at 1.6, but nowhere near enough of a difference to explain the large FPS differences (25-50%).
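For reference, those percentages are just each benchmark's relative gain over the E7200's score. A trivial sketch of the formula (the scores below are hypothetical, purely for illustration, not from the thread):

```python
# Percent gain of one benchmark score over another, as used in the list above.
def pct_gain(baseline: float, improved: float) -> float:
    """Return the percentage change going from baseline to improved."""
    return (improved - baseline) / baseline * 100.0

# Hypothetical scores purely to show the math:
e7200_score, e8500_score = 100.0, 112.0
print(f"{pct_gain(e7200_score, e8500_score):.0f}%")  # → 12%
```

A negative result (like the Cinebench 1 CPU entry) just means the E8500 scored lower on that run.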

 
Originally posted by: toyota
Originally posted by: yh125d
The point is that even with his slow CPU, upgrading from an 8800GT to a GTX260 should yield about a 40-50% increase in Far Cry 2 at 16x10. Most other games should see similar enough results. The OP might even see a bit more because the card he's upgrading to is faster than a 260.

It certainly won't be that great. Minimums and overall playability won't change much, if at all, with that CPU. I ran my GTX260 with a 5000+ X2 for a couple of days and it was a joke compared to running it with an E8500. The max framerate was noticeably higher than with the 4670 I was using in there, but the minimums were basically identical, and it dipped well below 20 fps many times no matter what settings I used.

You can clearly see from my Far Cry 2 numbers that his max framerate won't even be as good as his average framerate would be if he had a decent CPU.

Are you even capable of interpreting your own test results? Your slow CPU/slow 260 test still had MINIMUMS that were higher than the average of my slow CPU/9800/8800GT test.
 
Originally posted by: yh125d
Originally posted by: toyota
Originally posted by: yh125d
The point is that even with his slow CPU, upgrading from an 8800GT to a GTX260 should yield about a 40-50% increase in Far Cry 2 at 16x10. Most other games should see similar enough results. The OP might even see a bit more because the card he's upgrading to is faster than a 260.

It certainly won't be that great. Minimums and overall playability won't change much, if at all, with that CPU. I ran my GTX260 with a 5000+ X2 for a couple of days and it was a joke compared to running it with an E8500. The max framerate was noticeably higher than with the 4670 I was using in there, but the minimums were basically identical, and it dipped well below 20 fps many times no matter what settings I used.

You can clearly see from my Far Cry 2 numbers that his max framerate won't even be as good as his average framerate would be if he had a decent CPU.

Are you even capable of interpreting your own test results? Your slow CPU/slow 260 test still had MINIMUMS that were higher than the average of my slow CPU/9800/8800GT test.

Yes, I see that. I know his max, and therefore his average framerate, will go up. My minimums were 20, and I am saying that his will probably be around that or even a tad lower, which is quite bad and will still hinder playability.
 
Originally posted by: toyota
Originally posted by: yh125d
Originally posted by: toyota
Originally posted by: yh125d
The point is that even with his slow CPU, upgrading from an 8800GT to a GTX260 should yield about a 40-50% increase in Far Cry 2 at 16x10. Most other games should see similar enough results. The OP might even see a bit more because the card he's upgrading to is faster than a 260.

It certainly won't be that great. Minimums and overall playability won't change much, if at all, with that CPU. I ran my GTX260 with a 5000+ X2 for a couple of days and it was a joke compared to running it with an E8500. The max framerate was noticeably higher than with the 4670 I was using in there, but the minimums were basically identical, and it dipped well below 20 fps many times no matter what settings I used.

You can clearly see from my Far Cry 2 numbers that his max framerate won't even be as good as his average framerate would be if he had a decent CPU.

Are you even capable of interpreting your own test results? Your slow CPU/slow 260 test still had MINIMUMS that were higher than the average of my slow CPU/9800/8800GT test.

Yes, I see that. I know his max, and therefore his average framerate, will go up. My minimums were 20, and I am saying that his will probably be a tad lower, which is quite bad and will still hinder playability.

Your mins were 20. His (theoretical) mins were 15. That's a 33% increase. How is that not significant?
 
Originally posted by: yh125d
Originally posted by: toyota
Originally posted by: yh125d
Originally posted by: toyota
Originally posted by: yh125d
The point is that even with his slow CPU, upgrading from an 8800GT to a GTX260 should yield about a 40-50% increase in Far Cry 2 at 16x10. Most other games should see similar enough results. The OP might even see a bit more because the card he's upgrading to is faster than a 260.

It certainly won't be that great. Minimums and overall playability won't change much, if at all, with that CPU. I ran my GTX260 with a 5000+ X2 for a couple of days and it was a joke compared to running it with an E8500. The max framerate was noticeably higher than with the 4670 I was using in there, but the minimums were basically identical, and it dipped well below 20 fps many times no matter what settings I used.

You can clearly see from my Far Cry 2 numbers that his max framerate won't even be as good as his average framerate would be if he had a decent CPU.

Are you even capable of interpreting your own test results? Your slow CPU/slow 260 test still had MINIMUMS that were higher than the average of my slow CPU/9800/8800GT test.

Yes, I see that. I know his max, and therefore his average framerate, will go up. My minimums were 20, and I am saying that his will probably be a tad lower, which is quite bad and will still hinder playability.

Your mins were 20. His (theoretical) mins were 15. That's a 33% increase. How is that not significant?

Sorry for the confusion, but you must not have made the connection a couple of posts back about what I was trying to say. Based on REAL experience with a 2.6GHz 5000+ X2, which is faster than the OP's 2.4, I am saying his minimums could actually be a tad lower than the runs I just did. In other words, I think he would only get around 15-16 for a minimum with an 8800GT and maybe 17-18 with the 4890. I am trying to find my benchmarks from when I ran the GTX260 on the 5000+ X2. I know they are on a thumbdrive somewhere. lol
 
I think you guys should test different games. Or use a custom timedemo if Far Cry 2 is what matters. The built-in CPU benchmark in Far Cry 2 is almost like 3DMark.
 
Originally posted by: lopri
I think you guys should test different games. Or use a custom timedemo if Far Cry 2 is what matters. The built-in CPU benchmark in Far Cry 2 is almost like 3DMark.

CPU benchmark? I didn't know there was a specific CPU benchmark, because the one I am running is the standard Ranch one. The OP mentioned Far Cry 2 at 1680, so that's what I looked at. I have some Warhead numbers too, but does it really matter? All it does is start more arguments, and I am tired for the night. Most everybody has made good points, but the fact remains: the OP's CPU at 2.4 is a huge bottleneck that keeps him from getting more than 60-70% (even lower for minimums) of what a 4890 is capable of in demanding games at 1680. His max framerates, and therefore averages, will certainly be faster in almost all cases over the 8800GT, but again it's the minimums, which will barely budge, that are the main issue.
 
Well, just for the heck of it, I will post the Warhead runs. This is the Train benchmark, which is not even a CPU-intensive run. All runs are at the OP's res of 1680 and DX9 High settings, which would be realistic.

Benchmark information
Demo: Train
Quality: High
Renderer mode: DirectX 9
Antialiasing mode: Off
Filtering mode: None
MultiGPU support: Disabled
Boost renderer: Disabled
Use Custom Config File: No
Random .exe: Disabled


E8500 @3.16 GTX260 @460/1600
Resolution: 1680 × 1050
Result(1): Minimum= 21 FPS Average= 30 FPS Max= 38 FPS

E8500 @3.16 GTX260 @700/2250
Resolution: 1680 × 1050
Result(1): Minimum= 28 FPS Average= 39 FPS Max= 52 FPS



E8500 @1.60 GTX260 @460/1600
Resolution: 1680 × 1050
Result(1): Minimum= 15 FPS Average= 24 FPS Max= 31 FPS

E8500 @1.60 GTX260 @700/2250
Resolution: 1680 × 1050
Result(1): Minimum= 14 FPS Average= 25 FPS Max= 35 FPS


No big surprise here, as even in Warhead a CPU at his level is a HUGE bottleneck at 1680. Changing the core and memory speed on the GTX260 by 35% only makes the average go up by 1 fps, and I guess it was just margin of error that the minimum was lower by 1 fps. Here is proof that a faster CPU with a slower GPU can be faster: the E8500 at 3.16, with the GTX260 even downclocked by 35%, easily beats a CPU like his with a GTX260 at any speed. An even better card would make zero difference, as the CPU limitation already keeps an underclocked 192sp GTX260 from even doing its job.
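Reading the two pairs of runs above as a quick sanity check: if raising the GPU clocks barely moves the average framerate, the CPU is the limiter. A small Python sketch of that comparison, using the Warhead averages posted above (460/1600 vs 700/2250 on each CPU speed):

```python
# How much average fps the GPU overclock buys at each CPU speed.
# A result near zero means the GPU change doesn't matter: CPU-bound.
def gpu_scaling(avg_slow_gpu: float, avg_fast_gpu: float) -> float:
    """Fractional fps gain from running the GPU at the higher clocks."""
    return (avg_fast_gpu - avg_slow_gpu) / avg_slow_gpu

# Averages from the runs posted above:
print(f"CPU @ 3.16: {gpu_scaling(30, 39):.0%} fps gain from the GPU OC")  # 30%
print(f"CPU @ 1.60: {gpu_scaling(24, 25):.0%} fps gain from the GPU OC")  # 4%
```

The same ~52% core-clock bump (460 to 700) that scales well on the fast CPU is almost entirely wasted on the slow one, which is the bottleneck argument in one number.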
 
Originally posted by: BrokenVisage
Originally posted by: SickBeast
You should be able to hit 2.6ghz or more with that CPU. I'm sure it will hold you back a bit, but you will see a nice boost from the 4890 anyway.

It doesn't OC well, much to my dismay. I'm not very smart with OC'ing but have tried everything to get it stable and nothing seems to work. 🙁

drop your HT to 4x
drop your memory from 400 --> 333
up the vcore to 1.4 (could probably go 1.45v if your temps are good)

Slowly start raising the clock while checking your temps

a 10% OC should be a breeze
- 15-20% should not be out of the question but you may want to bump the mem volts, drop timings a bit and bump the HT voltage

- with the clock at 220MHz - 2640MHz overall (RAMs @ 367MHz)
- with the clock at 240MHz - 2880MHz overall (RAMs @ 400MHz)


fyi - a new cooler for s939 should move forward to AM2+ / AM3


If you list your motherboard I'm sure the folks here will help you with specific settings and tweaks.
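For what it's worth, the clock math behind those two targets can be sketched out. This assumes the Opteron 180's stock 12x multiplier (200MHz x 12 = 2400MHz) and treats the 333 memory setting as a simple 333/400 ratio; real s939 memory dividers are integer divisors of the CPU clock, so the actual RAM speed can land slightly off these figures:

```python
# Socket 939 overclock math behind the 220MHz / 240MHz targets above.
CPU_MULT = 12          # Opteron 180: assumed from 2400MHz / 200MHz stock
MEM_RATIO = 333 / 400  # effect of dropping the memory setting from 400 to 333

def clocks(ref_mhz: float) -> tuple[float, float]:
    """Return (CPU clock in MHz, effective DDR rate) for a reference clock."""
    cpu = ref_mhz * CPU_MULT
    ddr = ref_mhz * 2 * MEM_RATIO  # DDR400 = 2x the 200MHz reference, scaled down
    return cpu, ddr

print(clocks(220))  # ≈ (2640, 366) — matches "2640MHz overall (RAMs @ 367MHz)"
print(clocks(240))  # ≈ (2880, 400)
```

The point of the 333 divider is visible in the second run: even at a 240MHz reference clock the RAM is only back at its rated DDR400 speed, so the memory never has to overclock.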
 
Originally posted by: toyota
Originally posted by: apoppin
Originally posted by: toyota
Originally posted by: apoppin
Exactly!

it will be worth it for him and his situation

not ideal .. but a helluva improvement

Worth 180 bucks for only a couple of frames per second? His minimum framerate in Far Cry 2 and many other games will be the same as it was with the 8800GT. If you don't believe me, I can drop my GPU speeds down to prove it to you.

Did you miss the increased details and the fact that he can run with filtering now?

Let's see what the OP says about his $170
- if he will return to comment

I don't think you are qualified to speak for him.

Here you go, with my GPU core and shaders dropped 35% and GPU memory dropped nearly 30%. Just like I said, NO difference. That means a GPU about 30% weaker than my 192sp GTX260 would give him the same performance. In other words, he was already bottlenecked in Far Cry 2 with his 2.4 Opty at 1680, even with the 8800GT. Also, that's with Ultra settings and 4x AA, so there is no excuse about turning up the settings.


Settings: Demo(Ranch Small), 1680x1050 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(4x), VSync(No), Overall Quality(Ultra High), Vegetation(Very High), Shading(Ultra High), Terrain(Ultra High), Geometry(Ultra High), Post FX(High), Texture(Ultra High), Shadow(Ultra High), Ambient(High), Hdr(Yes), Bloom(Yes), Fire(Very High), Physics(Very High), RealTrees(Very High)


E8500 @ 1.60 GTX260 @ 700/2250
Total Frames: 1612, Total Time: 51.02s
Average Framerate: 31.60
Max. Framerate: 50.15 (Frame:257, 6.63s)
Min. Framerate: 20.69 (Frame:1084, 33.76s)

E8500 @ 1.60 GTX260 @ 460/1600
Total Frames: 1616, Total Time: 51.00s
Average Framerate: 31.69
Max. Framerate: 49.77 (Frame:239, 6.25s)
Min. Framerate: 21.50 (Frame:1089, 34.21s)

Who gives a CRAP what YOUR GTX does?
You go on and on about *theory* as though you really know 😛

He has a different system and a different GPU - you do not have an 8800GT
:roll:

He ALREADY bought it -- let's SEE what HE thinks

It is not "all about you" and your PC and what you *think* will happen


 
Wow guys I didn't expect this to turn into a whole big thing but thanks for the all the info.

Originally posted by: heyheybooboo
Originally posted by: BrokenVisage
Originally posted by: SickBeast
You should be able to hit 2.6ghz or more with that CPU. I'm sure it will hold you back a bit, but you will see a nice boost from the 4890 anyway.

It doesn't OC well, much to my dismay. I'm not very smart with OC'ing but have tried everything to get it stable and nothing seems to work. 🙁

drop your HT to 4x
drop your memory from 400 --> 333
up the vcore to 1.4 (could probably go 1.45v if your temps are good)

Slowly start raising the clock while checking your temps

a 10% OC should be a breeze
- 15-20% should not be out of the question but you may want to bump the mem volts, drop timings a bit and bump the HT voltage

- with the clock at 220MHz - 2640MHz overall (RAMs @ 367MHz)
- with the clock at 240MHz - 2880MHz overall (RAMs @ 400MHz)

fyi - a new cooler for s939 should move forward to AM2+ / AM3

If you list your motherboard I'm sure the folks here will help you with specific settings and tweaks.

Thanks, I'll try that once I get home. The system I'm putting the 4890 into is in my sig; the mobo is an MSI K8N Neo4, which is pretty ancient but hasn't failed me yet. What prompted my desire for a new vid card is that my HTPC card just fried, and I figured I'd just migrate my 8800GT into it instead of getting a lower-grade card to replace the 2600XT that was in there. So I didn't have a "burning need" to upgrade but saw this as an opportunity to make the move; hopefully it pays off. I will post 3DMark and bench scores once it's in there, but if I don't see a huge performance increase over the 8800GT I won't be heartbroken. I am glad to see prices on these newer high-end cards not being astronomical like in the past. I spent at least $50 more on each of my last 2 vid cards than I did on this one.

While I'm on the subject of upgrading... is AMD still the way to go for gaming? I see more and more gaming setups with Intel Inside(R) that get some impressive numbers, which leads me to believe that's the direction people are going now.
 
Originally posted by: BrokenVisage
Wow guys I didn't expect this to turn into a whole big thing but thanks for the all the info.

Originally posted by: heyheybooboo
Originally posted by: BrokenVisage
Originally posted by: SickBeast
You should be able to hit 2.6ghz or more with that CPU. I'm sure it will hold you back a bit, but you will see a nice boost from the 4890 anyway.

It doesn't OC well, much to my dismay. I'm not very smart with OC'ing but have tried everything to get it stable and nothing seems to work. 🙁

drop your HT to 4x
drop your memory from 400 --> 333
up the vcore to 1.4 (could probably go 1.45v if your temps are good)

Slowly start raising the clock while checking your temps

a 10% OC should be a breeze
- 15-20% should not be out of the question but you may want to bump the mem volts, drop timings a bit and bump the HT voltage

- with the clock at 220MHz - 2640MHz overall (RAMs @ 367MHz)
- with the clock at 240MHz - 2880MHz overall (RAMs @ 400MHz)

fyi - a new cooler for s939 should move forward to AM2+ / AM3

If you list your motherboard I'm sure the folks here will help you with specific settings and tweaks.

Thanks, I'll try that once I get home. The system I'm putting the 4890 into is in my sig; the mobo is an MSI K8N Neo4, which is pretty ancient but hasn't failed me yet. What prompted my desire for a new vid card is that my HTPC card just fried, and I figured I'd just migrate my 8800GT into it instead of getting a lower-grade card to replace the 2600XT that was in there. So I didn't have a "burning need" to upgrade but saw this as an opportunity to make the move; hopefully it pays off. I will post 3DMark and bench scores once it's in there, but if I don't see a huge performance increase over the 8800GT I won't be heartbroken. I am glad to see prices on these newer high-end cards not being astronomical like in the past. I spent at least $50 more on each of my last 2 vid cards than I did on this one.

While I'm on the subject of upgrading... is AMD still the way to go for gaming? I see more and more gaming setups with Intel Inside(R) that get some impressive numbers, which leads me to believe that's the direction people are going now.

"Intel inside"?
- that is the Warning Label on a PC for a gamer!
:Q

😀
just kidding 😛

Nvidia OR AMD is the way to go for gaming. Right now the AMD parts are less expensive from what I can see.


Please post your results. Let us know what YOU think of the upgrade

 
I knew the bottlenecking factor of a slow CPU wasn't nearly as drastic as toyota thinks, but I thought it was more than that.
 
Originally posted by: toyota
Your CPU is slower clock-for-clock than mine, which could easily make the framerate lower. That alone could account for 1-3 fps right there.

The difference between the Conroe and Penryn architectures on a per-clock basis is between 2% and up to 10% in the best-case scenario, which is not in games but in media encoding. In games the performance difference is an average of 4-6%, which is not enough to change things: what is unplayable with a Conroe will be unplayable with a Penryn. I will be moving to a Penryn quad soon because I will be selling this CPU; I just hate its horrible heat dissipation and power consumption. It is unbearable, especially in a country where the typical temperature is around the mid-80s F during daytime.

http://www.anandtech.com/cpuch...howdoc.aspx?i=3069&p=6


While it is true that his CPU will hamper the HD 4890's performance considerably, he will still be able to enjoy his games for a while until he does an upgrade. Far Cry 2 needs a dual core to be playable no matter what GPU you have. I had an HD 3850 512MB OCed to an 800MHz GPU and 2GHz RAM (faster speeds than a stock HD 3870) coupled with a Pentium M 760 at 2.70GHz, which in single-threaded games has gaming performance comparable to a Core 2 Duo at 2.33GHz, and yet Far Cry 2 was unplayable no matter which image quality settings I chose. I was able to play most games with ultra-high quality, like Gears of War, Prey, and Lost Planet, and a good combination of med/high in Crysis was playable. But if you want to maximize your gaming performance, I recommend moving to a fast dual/quad core processor as soon as possible; the difference is noticeable.
 
Originally posted by: toyota
Originally posted by: MagickMan
I think it'll be a decent upgrade*, and will provide a good reason to upgrade everything else in the near future. 😀




*unless he's running WoW, which can bring even an i7/GTX 285 combo to its knees.

It won't be much of an upgrade at all in many cases because he was already CPU limited in Far Cry 2 at 1680 with an 8800GT.

Sure it will; he'll be able to turn up all the eye candy without seeing any penalty at all.
 
Far Cry 2 is unique in that it responds well to extra CPU power AND extra GPU power AND extra memory speed. For this game I do recommend a faster proc; normally I wouldn't.
Just my 2 cents _Nate
 
Originally posted by: evolucion8
But if you want to maximize your gaming performance, I recommend you to move to a Dual/Quad core processor as fast as possible, the difference is noticeable.

Not sure why you suggest moving to a dual core, because AFAIK the OP's Opteron 180 already has two cores (each with 1MB L2, at 2.4GHz).
 
@OP: It all depends on your gaming resolution. The higher the resolution, the smaller the bottleneck. What resolution are you gaming at?
 