
How bad would an Opteron 180 @ 2.4ghz bottleneck a HD4890?

Page 4
Originally posted by: Candymancan21
I suppose if you get the AA/AF it would be faster.

How can you tell me I don't have a clue? An Opteron 165 is exactly the same thing as a 180... So is an 8800GTS and an 8800GT; the GT is only slightly faster. I know what I'm talking about: I saw no difference on my 4850 and was quite disappointed.

I guess if you're saying yeah, I can't use 2xAA, the 4890 would be able to do 8xAA, but I really doubt his FPS will increase in general; it just won't go down. That could be a "benefit"

just give up on them. these guys have the mentality that a faster card will always improve things no matter what. all the links to cpu benchmarks, or even our own, will do nothing to sway their theories. at least we know better, and that's really all that matters in the end.
 
Originally posted by: jaredpace
Originally posted by: toyota


Settings: Demo(Ranch Small), 1680x1050 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(4x), VSync(No), Overall Quality(Ultra High), Vegetation(Very High), Shading(Ultra High), Terrain(Ultra High), Geometry(Ultra High), Post FX(High), Texture(Ultra High), Shadow(Ultra High), Ambient(High), Hdr(Yes), Bloom(Yes), Fire(Very High), Physics(Very High), RealTrees(Very High)



E8500 @ 1.60 GTX260 @ 460/1600
Total Frames: 1616, Total Time: 51.00s
Average Framerate: 31.69
Max. Framerate: 49.77 (Frame:239, 6.25s)
Min. Framerate: 21.50 (Frame:1089, 34.21s)

E8500 @ 1.60 GTX260 @ 700/2250
Total Frames: 1612, Total Time: 51.02s
Average Framerate: 31.60
Max. Framerate: 50.15 (Frame:257, 6.63s)
Min. Framerate: 20.69 (Frame:1084, 33.76s)

E8500 @ 3.16 GTX260 @ 460/1600
Total Frames: 2049, Total Time: 51.01s
Average Framerate: 40.17
Max. Framerate: 57.62 (Frame:298, 5.95s)
Min. Framerate: 31.45 (Frame:1536, 38.29s)

E8500 @ 3.16 GTX260 @ 700/2250
Total Frames: 2736, Total Time: 51.01s
Average Framerate: 53.64
Max. Framerate: 86.20 (Frame:414, 6.34s)
Min. Framerate: 37.74 (Frame:1037, 17.55s)


would be nice to see two more runs of this test with the cpu in the 4.0-4.4ghz range. would it make a difference?

not at 1680 and with a gtx260. with a faster card like a gtx285, probably about 3.4-3.6 would be tops to get 100% out of it at 1680. 4.0 certainly wouldn't do anything useful unless you were at 1280 or running something much faster than a gtx285 at 1680.
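for anyone who wants to double-check those four runs, here is a quick Python sketch that turns the average framerates quoted above into percentage gains (the FPS numbers are copied straight from the posted results; nothing else is assumed):

```python
# Quick sanity check on the Far Cry 2 runs quoted above: if raising the
# GPU clock barely moves the average FPS, the CPU is the limiter.
# Keys are (CPU clock, GPU core/mem clock); values are average FPS.
runs = {
    ("1.60 GHz", "460/1600"): 31.69,
    ("1.60 GHz", "700/2250"): 31.60,
    ("3.16 GHz", "460/1600"): 40.17,
    ("3.16 GHz", "700/2250"): 53.64,
}

def gain(base, oc):
    """Percent FPS gained going from `base` to `oc`."""
    return (oc - base) / base * 100

# Same +52% GPU overclock, tried at each CPU speed:
slow_cpu = gain(runs[("1.60 GHz", "460/1600")], runs[("1.60 GHz", "700/2250")])
fast_cpu = gain(runs[("3.16 GHz", "460/1600")], runs[("3.16 GHz", "700/2250")])
print(f"GPU OC gain at 1.60 GHz: {slow_cpu:+.1f}%")  # ~ -0.3%: fully CPU-bound
print(f"GPU OC gain at 3.16 GHz: {fast_cpu:+.1f}%")  # ~ +33.5%: GPU now matters
```

the point being: a +52% GPU overclock buys nothing at 1.6GHz but about a third more FPS at 3.16GHz, which is exactly what a CPU bottleneck looks like.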
 
Originally posted by: toyota
Originally posted by: apoppin
Originally posted by: toyota
Originally posted by: Candymancan21
I have an Opteron 165 @ 2.9ghz. I bought a 4850 thinking the improvement was going to be big, considering the 4850 was a faster card based off reviews. I literally saw no improvement. I mean nothing; COD4 ran the same and so did other games. Once I upgraded the cpu to my E8400 though, I saw a 200% fps increase. Now I'm on a 4890 and I see a 70% increase over my 4850.

TBH I really don't think you will see anything going to the 4890. I didn't on the 4850.

THANK YOU VERY MUCH. you know the same thing that I know, because we both made the move to faster Core 2 cpus after our X2 (or Opty) was killing the potential of a decent card. all these other people don't know what the heck they are talking about. it's a night and day difference and these other people arguing that fact just don't get it. also, real numbers like I am showing don't lie. the X2 and Opty cpus, especially lower clocked ones, are a massive bottleneck for a fast video card. again, thank you.

CoD4 is pretty CPU limited - Duh =P

NONE of you guys have a CLUE how the OP's system will run with a 4890
- none of you are comparing anything remotely to HIS system

i have tested this also. Not everyone gives a CRAP about frame rates at the minimum going up if they are OK now

FACT: the OP will have MUCH more AA/AF available and will be able to max details - at about the same frame rates WHEN his CPU bottlenecks

perhaps it is worth it to HIM

who CARES what *you* think about how HE spends HIS money?
- he will see improvement = period!!

rose.gif

how about you pull your head out of your butt? the OP asked a freaking question and I clearly gave him an example showing him that yes indeed his cpu will bottleneck a 4890. I used several cards in my old 5000 X2 system so yes I do have a clue how his even slower cpu will perform badly even with the slower cards I used. not to mention the runs I did just now will give him a clear idea. he can also look at the many cpu benchmarks out there. all you have done is act like a jerk and not added one shred of proof to the contrary.

Why don't you yank yours out? You are insisting you are right and ignoring everything that anyone else is saying :|

The REST of us think there will be an improvement - The PROOF will come when the OP reports back

you can blabber on and on all about YOU and YOUR system; and we don't really know what the OP will REALLY experience

all you have is your opinion and benches run on your system

i know all about bottlenecking; and you are extreme
 
Originally posted by: apoppin
Originally posted by: toyota
Originally posted by: apoppin
Originally posted by: toyota
Originally posted by: Candymancan21
I have a Opteron 165 @ 2.9ghz, i bought a 4850 thinking the improvement was going to be big considering the 4850 was a faster card based off reviews. I litterally saw no improvement. I mean nothing, COD4 ran the same and so did other games. Once i upgraded the cpu tho to my E8400 i saw a 200% fps increase. Now im on a 4890 and i see a 70% increase on this over my 4850.

TBH i really dont think you will anything going to the 4890. I didnt on the 4850.

THANK YOU VERY MUCH. you know the same thing that I know because we both made the move to faster Core 2 cpus because or X2(or opty) was killing the potential of a decent card. all these other people dont know what the heck they are talking about. its night and day difference and these other people arguing that fact just dont get it. also real numbers like I am showing dont lie. the X2 and opty cpus especially lower clocked ones are a massive bottleneck for a fast video card. again thank you.

CoD4 is pretty CPU limited - Duh =P

NONE of you guys have a CLUE how the OP's system will run with a 4890
- none of you are comparing anything remotely to HIS system

i have tested this also. Not everyone gives a CRAP about frame rates at the minimum going up if they are OK now

FACT: the OP will have MUCH more AA/AF available and will be able to max details - at about the same frame rates WHEN his CPU bottlenecks

perhaps it is worth it to HIM

who CARES what *you* think how HE spends HIS money ?
- he will see improvement = period!!

rose.gif

how about you pull your head out of your butt? the OP asked a freaking question and I clearly gave him an example showing him that yes indeed his cpu will bottleneck a 4890. I used several cards in my old 5000 X2 system so yes I do have a clue how his even slower cpu will perform badly even with the slower cards I used. not to mention the runs I did just now will give him a clear idea. he can also look at the many cpu benchmarks out there. all you have done is act like a jerk and not added one shred of proof to the contrary.

Why don't you yank yours out? You are insisting you are right and ignoring ever thing that anyone else is saying :|

The REST of us think there will be an improvement - The PROOF will come when the OP reports back

you can blabber on and on all about YOU and YOUR system; and we don't really know what the OP will REALLY experience

all you have is your opinion and benches run on your system

i know all about bottlenecking; and you are extreme

how is that extreme? those are REAL numbers from REAL benchmarks. if you dont like my proof then look at other cpu benchmarks.


9800gtx, 4850 and 8800ultra with an E8500 get better fps in FEAR 2 than a gtx280 with a 5000 X2. that's at 1680, and they even have AA and AF on. it would be pointless to get a card faster than a 4850 with that cpu because you are already hitting the wall. the OP's cpu is slightly slower too.
http://www.pcgameshardware.com...mpared/Reviews/?page=3


okay, let's look at L4D: his cpu would limit him to 75fps with a gtx280 at 1680 max settings, while an E8500 could push it to 120fps. gee, you think there is a little cpu bottleneck there? plus they don't even show how bad the minimums would be. http://www.anandtech.com/bench/default.aspx?b=48


those games aren't even that demanding, and they are still examples that would be perfectly playable if he stayed with a slower gpu than the 4890, because he is not even close to getting the most out of that card. Far Cry 2 and Warhead will have borderline playable minimum framerates at times because of his cpu, and he won't even get more than 60-70% of what the card can do. in other words, he would be getting the same performance with a 30-40% slower card because his cpu is holding back a high-end card.
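to put the L4D numbers in percentage terms (a rough sketch; the 75fps and 120fps figures are taken from the bench linked above):

```python
# The cited L4D bench: an X2 caps a GTX 280 at ~75 fps at 1680 max
# settings, while an E8500 pushes the same card to ~120 fps.
x2_fps, e8500_fps = 75, 120

# Share of the card's demonstrated throughput the slower CPU leaves unused:
unused = (e8500_fps - x2_fps) / e8500_fps * 100
print(f"the slower cpu leaves ~{unused:.0f}% of the card's output on the table")
```

so by that bench, the slower cpu is throwing away roughly 38% of what the same card can demonstrably do.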

what do you think the OP is really going to say in a forum after buying a new card? of course he is going to say "wow, all my games run better" whether it's true or not. I doubt he will ever come back and post real benchmarks, because he likely doesn't care and certainly doesn't want to feel like he wasted any money.

I do agree with you that there will be SOME improvement over the 8800gt in most games, but his minimums will still be about the same as they were, so many games will feel the same in many parts. using a 4890 at 1680 is just so much performance down the drain with that cpu, so hopefully he will upgrade soon to really enjoy what it's fully capable of.
 
You still don't get it toyota

you are being rude

the OP asked for an opinion and then made his decision

now you are saying basically that he - and we - are stupid for not agreeing with you 😛

i am saying to WAIT till the OP reports back before posting a bunch of unrelated stuff we already know

i KNOW the OP - he is really cool and he will likely be back - as he said - unless you keep flaming him for making a bad decision - PART of which is based on his future planned upgrades
rose.gif
 
Ok, I have now had the chance to read through the entire thread completely, and after doing so I have a few concerns.

1) Your numbers do not match what this web site indicates.

Your numbers show drastic differences; the web review does not show anything close to what yours do, percentage-wise of course. Now, I am aware that they are using a quad core, so I decided to look into the impact of a quad core versus a dual core for Far Cry 2 specifically. Here is what I came up with as a defense for using the quad core numbers.

A) The review used a Core i7. Some might say that can do 8 threads. True, but with Far Cry 2, when HT is enabled it takes a 5% performance hit, found here.

In Far Cry, at 640x480, it was demonstrated here that moving from one to two cores resulted in an 80% performance increase. Adding two more cores only gave another 22% increase. So a quad core at 1392MHz would be roughly equivalent to a 1700MHz Core 2, which is only 100MHz more than your test setup. So why do your tests show such drastic differences when theirs do not? Food for thought.
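the scaling arithmetic above can be checked in a couple of lines (a sketch that assumes performance scales linearly with clock speed, which is a simplification):

```python
# Rough check of the core-scaling arithmetic: the review's numbers say
# two cores do 1.80x the work of one, and four cores do 1.22x the work
# of two. Assume linear scaling with clock speed.
one_to_two = 1.80   # +80% going from one core to two
two_to_four = 1.22  # +22% more going from two cores to four

quad_clock = 1392  # MHz, the downclocked quad core in the review
# A quad at 1392 MHz does ~22% more work than a dual at the same clock,
# so it is roughly equivalent to a dual core clocked 22% higher:
dual_equiv = quad_clock * two_to_four
print(f"Quad @ {quad_clock} MHz ~ dual core @ {dual_equiv:.0f} MHz")
```

1392 × 1.22 ≈ 1698MHz, which is where the "roughly equivalent to a 1700MHz Core 2" figure comes from.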

2) There were many people who posted information contrary to your results. That alone does not make you wrong, but the fact that you did not really address their posts makes me wonder...

So I thought about it a while and pondered whether it was worth my time to do any of this testing. I decided, you know, perhaps I am wrong. I have been wrong before and have no problem coming to terms with it. So I figured I would run my own tests.

The first game I wanted to fire up was F.E.A.R. It is on my HTPC box, and unfortunately my plasma is only 1360x768. But, if anything, that favors your position, not mine. I was basically too lazy to move my higher-resolution display over like I did when I tested 1.5 years ago. So, again, 1360x768 is the resolution that I used, which is less stressful on the video card than even 1280x1024. Keep that in mind.

F.E.A.R.

All Settings set to Maximum Values in-game.
Drivers 182.50
Resolution 1360x768
nVidia CP is forcing 4X AA, 16X AF, TSAA
GTS250 Clocked at 750/1811/2200

E5200 @ 1.8Ghz

59 Min
113 Avg
248 Max

E5200 @ 3.16Ghz

59 Min
129 Avg
337 Max

Nothing gained on the minimum frames per second, despite increasing the CPU clock speed by 75%. We did gain slightly on the average, but not that much, and as for the maximum, yes, the faster CPU clearly dominates. Another thing to understand is that this is an E5200 I tested with. That means the performance has been heavily butchered, and an X2 at around 2.0GHz would perform about like this does at 1.8GHz.

So, I decided to try another game. I happened to have my Far Cry 2 CD waiting for me to install it. So I installed it and ran my tests. I was expecting the results you posted, but came up with something quite different.

Far Cry 2
DX10 - Ultra Settings
Drivers 182.50
Resolution 1360x768
nVidia CP is forcing 4X AA, 16X AF, TSAA
GTS250 Clocked at 750/1811/2200

E5200 @ 1.8Ghz

16 Min
24 Avg
38 Max

E5200 @ 3.16Ghz

14 Min
27.5 Avg
46 Max

These results had me stumped. I reran the test at both clock speeds, but came up with the same averages. This is rather strange, and I'm not entirely sure why. But after running it several times back and forth, I have to conclude that a crippled Pentium E5200 at 1.8GHz is 'ok' for the most part.
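for what it's worth, here is the same percentage-gain arithmetic applied to both sets of runs above (the numbers are copied from my results; nothing else is assumed):

```python
# Percent FPS gained from the 1.8 -> 3.16 GHz CPU bump, per game and metric.
# (slow, fast) pairs are copied from the two sets of runs posted above.
results = {
    "F.E.A.R.":  {"min": (59, 59), "avg": (113, 129), "max": (248, 337)},
    "Far Cry 2": {"min": (16, 14), "avg": (24, 27.5), "max": (38, 46)},
}
cpu_gain = (3.16 - 1.8) / 1.8 * 100  # ~76% more CPU clock

for game, metrics in results.items():
    for name, (slow, fast) in metrics.items():
        fps_gain = (fast - slow) / slow * 100
        print(f"{game} {name}: {fps_gain:+.1f}% FPS for {cpu_gain:.0f}% more clock")
```

a ~76% CPU clock increase bought roughly 14% average FPS in F.E.A.R. and 15% in Far Cry 2, with minimums flat or slightly worse, which is why I call the 1.8GHz chip 'ok' at these settings.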

A few more comments I want to make.

1) There are in fact games out there where CPU speed matters. I won't deny this. Crysis, FSX and Lost Planet are three off the top of my head.

2) I do not want to invalidate the results that toyota has posted, because I don't know him. It would be pretty crazy for me to call him a liar, or assassinate his character. The only thing I can say is that his results did not line up with mine, or VR-Zone's. Why is that? No clue.

3) This is with a GTS250. I will also say that if I put a beefier card in here (GTX 280) I would likely see a difference between these processors. So I will also concede to that.

4) If I take off AA, AF, and all the other eye candy, I will also agree that the CPU will hinder the graphics card. But why do we buy these cards if we're willing to put up with jagged edges and foliage that shimmers? I buy these cards so I can run extreme AA, AF, TSAA and anything else I can throw at them.

But after all of this, I feel that Spike said it best in post #2 (go back and read it if you want to know), because even if the CPU is not allowing the GPU to fully stretch its legs, that is not a bad thing if overall performance increases with the insertion of a new video card.

I don't really argue on internet forums much anymore, if at all, because I found it to be fruitless. So if you are interested in exchanging blows, I am going to leave the ring right now.

Edit ** For some reason the third link does not work, though. I may decide to TinyURL it.

Edit 2 ** Ok, I fixed the link using TinyURL
 
Originally posted by: apoppin
You still don't get it toyota

you are being rude

the OP asked for an opinion and then made his decision

now you are saying basically he - and us are - stupid for not agreeing with you 😛

i am saying to WAIT till the OP reports back before posting a bunch of unrelated stuff we already know

i KNOW the OP - he is really cool and he will likely be back - as he said - unless you keep flaming him for making a bad decision - PART of which is based on his future planned upgrades
rose.gif

I haven't flamed the OP one bit, and him being cool has nothing to do with anything. he asked about the bottleneck issue and that's that. even if it was my best friend, I would show him why it's a huge bottleneck and move on. it's nothing personal, and you are actually being the rudest person in this thread. your attitude towards me has been poor, so that's why you are getting it right back. sure I am defensive hearing things that I know are not true, when I went through all this on my own last year. you act nice and I will act nice, because it's not like we get a prize for being correct. I simply think more people should realize how much of an effect their cpu can have, so any time someone asks I will show them.
 
Originally posted by: toyota
Originally posted by: apoppin
You still don't get it toyota

you are being rude

the OP asked for an opinion and then made his decision

now you are saying basically he - and us are - stupid for not agreeing with you 😛

i am saying to WAIT till the OP reports back before posting a bunch of unrelated stuff we already know

i KNOW the OP - he is really cool and he will likely be back - as he said - unless you keep flaming him for making a bad decision - PART of which is based on his future planned upgrades
rose.gif

I havent flamed the op one bit and him being cool has nothing to do with anything. he asked about the bottleneck issue and thats that. even if it was my best friend I would show him why its a huge bottleneck and move on. its nothing personal and you are actually being the rudest person in this thread. your attitude towards me has been poor so thats why you are getting right back. sure I am defensive hearing things that I know are not true when I have went through all this on my own last year. you act nice and I will act nice because its not like we get a prize for being correct. I simply think more people should realize how much of an effect their cpu can have so any time someone asks I will show them.

There is a *current* thread in PFI that mentions you by name
:Q

My attitude is never good toward someone who tells me my head is in my butt because i disagree with them :|

i would not want to be your friend and i would tell you what to do with your opinion

i want to hear from the OP and i think your benches are pretty unrelated to what he has
- my opinion
rose.gif
 
Originally posted by: apoppin
Originally posted by: toyota
Originally posted by: apoppin
You still don't get it toyota

you are being rude

the OP asked for an opinion and then made his decision

now you are saying basically he - and us are - stupid for not agreeing with you 😛

i am saying to WAIT till the OP reports back before posting a bunch of unrelated stuff we already know

i KNOW the OP - he is really cool and he will likely be back - as he said - unless you keep flaming him for making a bad decision - PART of which is based on his future planned upgrades
rose.gif

I havent flamed the op one bit and him being cool has nothing to do with anything. he asked about the bottleneck issue and thats that. even if it was my best friend I would show him why its a huge bottleneck and move on. its nothing personal and you are actually being the rudest person in this thread. your attitude towards me has been poor so thats why you are getting right back. sure I am defensive hearing things that I know are not true when I have went through all this on my own last year. you act nice and I will act nice because its not like we get a prize for being correct. I simply think more people should realize how much of an effect their cpu can have so any time someone asks I will show them.

There is a *current* thread in PFI that mentions you by name
:Q

My attitude is never good toward someone who tells me my head is in my butt because i disagree with them :|

i would not want to be your friend and i would tell you what to do with your opinion

i want to hear from the OP and i think your benchs are pretty unrelated to what he has
- my opinion
rose.gif

and when did I tell you that? hmm, after your crappy and somewhat ridiculous comments towards me. I see you don't want to be nice, so the heck with you.
 
toyota, insulting people and pointing fingers with a rude attitude will not help. If you can't cool down, just don't post; you are not helping. CPU bottlenecking is overrated: with my single-core Pentium M, aside from Far Cry 2, most of my games were playable with high settings at 1024x768, a very CPU-limited resolution.
 
Originally posted by: evolucion8
toyota, insulting and pointing people with a rude attitude will not help. If you don't cool down, just don't post, you are not helping. CPU bottlenecking is overrated, with my single core Pentium M, besides Far Cry 2, most of my games were playable with high settings at 1024x768, a very CPU limited resolution.

why don't you tell apoppin not to be rude then? his attitude is what provoked most of my behavior in the first place. and no, bottlenecking is not really overrated; it just depends on the particular circumstances as to whether it is an issue or not. it doesn't take a genius to read numbers from a benchmark. if adding more gpu power isn't changing the minimum framerates, or is barely moving the average, then it's pretty clear. and plenty of people would not consider the kind of performance you got with a single core acceptable. adding more gpu power in your case would have been pointless, so yes, cpu bottlenecking can be a real issue that can greatly affect performance.
 
Another thing I wanted to mention on this topic is that it used to be expensive to do a system overhaul. However, that is not really the case anymore; $250 can get you a mighty fast platform. There are exceptions to this, though. Maybe you only have an OEM license of Windows? In that case, you can't really legitimately switch out the platform, so consider shelling out another $100 for the OS (another OEM copy) or a retail copy for $200. So it all depends.

I think what I find more amazing is just how cheap it is to game these days. You can put a system together for $500-$600 that can play any game out there today with great eye candy and frame rates. Computers have really never been so cheap, and who knows what the future holds!
 
Originally posted by: ArchAngel777
Another thing I wanted to mention on this topic is that it used to be expensive (or more) to do a system overhaul. However, that is not really the case anymore. $250 bucks can you get a mighty fast platform. However, there are exceptions to this. Maybe you only have an OEM license of Windows? If that is the case, you really can't legitmately switch out the platform, so consider shelling out another $100 for the OS (another OEM copy) or a retail copy for $200. So it all depends.

I think what I find more amazing is just how cheap it is to game these days. You can put a system together for $500 - $600 bucks that can play any game out there today with some great eye candy and frame rates. Computers have really never been so cheap and who knows what the future holds!

good points. some people just try to squeeze as much as they can out of an old system and don't realize that they are really past its upgrade point. I have had people argue with me that they want to run an 8800gtx or so with a 2.8 P4, when they already have a card that's beyond what the P4 can handle. sometimes it's best just to sit down and evaluate the best upgrade path. the OP in this thread was already holding back his 8800gt from performing at 100%, so the majority of his gpu upgrade really went to waste, whether people want to accept that or not. he would now get a much, much bigger boost from a decent dual, triple or quad core and a new motherboard.

I had always bought OEM computers until I built this one. I would just throw a lower midrange card in there and be happy for a year or two. the problem was I was already behind the curve when I bought the OEM computer, plus I couldn't raise the cpu frequency for future upgrades. I now see that it makes sense just to build it myself so I can keep it longer. my cpu at stock speed is still very fast and plenty for any single gpu at 1920. if I upgrade the gpu to something twice as fast as my gtx260, then I will need at least a mild cpu overclock to fully enjoy it, which will not be a problem. a video card upgrade at the end of next year, though, will surely make my cpu start to look inferior, but I'll deal with that when the time comes.
 
Originally posted by: toyota
Originally posted by: ArchAngel777
Another thing I wanted to mention on this topic is that it used to be expensive (or more) to do a system overhaul. However, that is not really the case anymore. $250 bucks can you get a mighty fast platform. However, there are exceptions to this. Maybe you only have an OEM license of Windows? If that is the case, you really can't legitmately switch out the platform, so consider shelling out another $100 for the OS (another OEM copy) or a retail copy for $200. So it all depends.

I think what I find more amazing is just how cheap it is to game these days. You can put a system together for $500 - $600 bucks that can play any game out there today with some great eye candy and frame rates. Computers have really never been so cheap and who knows what the future holds!

good points. some people just try and squeeze as much out of an old system and they dont realize that they are really past its upgrade point. I have had people argue with me that they want to run an 8800gtx or so with a 2.8 P4 when they already have a card thats beyond what the P4 can handle now. sometimes its best just to sit down and evaluate the best upgrade path. the op in this thread was already holding back his 8800gt from performing at 100% so the majority of his gpu upgrade really went to waste whether people want to accept that our not. he now would get a much much bigger boost from a decent dual or triple or quad core and new mb.

I had always bought oem comps until I built this one. I would just throw a lower midrange card in there and be happy for a year or two. the problem was I was already behind the curve when I bought the oem comp plus I couldnt improve the cpu frequency for future endeavors. I now see that it makes sense just to build it myself so I can keep it longer. my cpu at stock speed is still very fast and plenty for any single gpu at 1920. if I upgrade the gpu to something twice as fast as my gtx260 then I will need at least a mild cpu oc to fully enjoy it which will not be a problem. a video card upgrade at the end of next year though will surely make my cpu start too luck inferior but Ill deal with that when the times comes.


Yep, I agree. You just have to weigh the pros and cons on whether a graphics card makes more sense, or a system overhaul, or both. Each situation is different, and we are probably all guilty of trying to give a simplistic response to a variable equation.
 
The short answer is: pretty badly. I recently upgraded from an Opteron 165 @ 2.54 + 8800gt to a Phenom II X3 720 @ 3.5 + HD4890, and ran some benches to test things out. I haven't tallied up the scores for all the games I tested, but I do have the results for Bioshock and COD4 below. And these were run at 1920x1200, where you'd expect to be gpu-limited, but the Opty 165 was holding me back even then. Take a look:

Bioshock
Call of Duty 4
 
Originally posted by: yh125d
Originally posted by: Candymancan21
I suppose is you get the AA/AF it would be faster.

How can you tell me i dont have a clue? An Opter 165 is exactly the same thing as a 180... So is a 8800GTS and 8800GT. The GT only slightly faster, i know what im talking about i saw no diff on my 4850 and was quite disapointed.

I guess if you say yea i cant use 2xAA, the 4890 would be able to do 8xAA, but i really doubt his FPS will increase in general it just wont go down. That could be a "benefit"

When you upgraded to the 4850, what were you upgrading from?

A 8800gts 640.


Would people please stop basing your results and conclusions on this website: http://vr-zone.com/articles/effects-of-cpu-frequency-on-fps-single-gtx-280-/7160-1.html?doc=7160 That website is using a quad core cpu on a quad-core-optimised game... I don't care if they downclock it to 1500mhz, it's still faster than his Opteron at 2.4ghz, for crying out loud. Cut those scores in half... then you will see.

You want to know the benefit he will get? Instead of getting 20fps at 2xAA, he'll get 20fps at 8xAA. That's the big jump he'll see, end of story... Man, this thread is getting out of hand. I'm just going to wait and see when his card gets here.
 
Originally posted by: wrangler
yo Toyota dude. You ARE acting like a little know-it-all and your demeanor is absolutely repulsive.

Just sayin'.

thanks, and I am sure all the rude remarks from mr apoppin must have gone under your radar.

just sayin. lol
 
Originally posted by: Candymancan21
Originally posted by: yh125d
Originally posted by: Candymancan21
I suppose is you get the AA/AF it would be faster.

How can you tell me i dont have a clue? An Opter 165 is exactly the same thing as a 180... So is a 8800GTS and 8800GT. The GT only slightly faster, i know what im talking about i saw no diff on my 4850 and was quite disapointed.

I guess if you say yea i cant use 2xAA, the 4890 would be able to do 8xAA, but i really doubt his FPS will increase in general it just wont go down. That could be a "benefit"

When you upgraded to the 4850, what were you upgrading from?

A 8800gts 640.


Would people please stop basing your results and conclusions on this website http://vr-zone.com/articles/effects-of-cpu-frequency-on-fps-single-gtx-280-/7160-1.html?doc=7160 That website is using a quad core cpu on a quad core optimised game.... I dont care if they downclock to 1500mhz, its still faster then his Opteron at 2.4ghz for crying outloud. Cut those scores in half... Then you will see

You want to know the benefit he will get? Instead of getting 20fps at 2xAA, he'll get 20fps at 8xAA. That's the big jump he'll see, end of story... Man, this thread is getting out of hand. I'm just going to wait and see, when he gets his card, what he is going to say.
they can't come up with valid arguments for some reason. just nonsensical replies, some even with insults to me, yet they complain about my tone.

I just wish the OP would run the benchmarks I ran, before and after the 4890. I am almost to the point of borrowing my 5000 X2 pc back from the person I sold it to and buying a 4890 and 8800gt, just to see what the non-believers say then.

 
Well, I respect everyone's opinion. Apoppin is pretty smart and knows what he is talking about, but I can't help the fact that I experienced going from an 8800gts 640, an even slower card than the 8800GT, to a 4850 and saw no real improvement. That forced me to upgrade to the E8400. I didn't realise the Opteron was that slow.

I think the only thing he will see improvements on is AA/AF. He might be able to go from, say, 2xAA to 8xAA and not see an FPS drop, where the 8800GT would crawl. However, I am almost certain that his FPS in general will not go up. Although when I went to the 4850, Crysis didn't exist yet, I don't think, so I could be wrong; I'm just going off what I experienced.
 
Originally posted by: toyota
they cant come up with valid arguments for some reason. just nonsensical replies and some even with insults to me yet they complain about my tone.

I just wish the op would run the benchmarks I ran before and after the 4890. I am almost to the point of going to borrow my 5000 X2 pc from the person I sold it to and buying a 4890 and 8800gt just to see what the non believers say then.

No, you just discount the arguments and return to the same Crysis benchmarks to make your point. You should never have to post those more than once in a thread.

Yes, Crysis and Far Cry take a lot of CPU, but other games can really shine with an ok CPU and a great GPU (COD4, L4D, TF2), and btw those are the games being played by everyone.

Your argument, while valid, is not ubiquitous, so be a little kind to those who dissent.
 
Originally posted by: Candymancan21
Well i respect everyone opinion. Apoppin is pretty smart and knows what he is talking about, but i cant help that fact that i experianced going from a 8800gts 640 an even slower card then the 8800GT to a 4850 and saw no real improvement. That forced me to upgrade to the E8400. I didnt relise the Opteron was that slow.

I think the only thing he will see the improvements on is AA/AF. He might be able to go from say 2xAA to 8xAA and not see a FPS drop, where the 8800GT would crawl. However i am almost certain that his FPS in general will not go up. Altho when i went to the 4850 crysis never existed i dont think so i could be wrong but based off what i experianced is all im saying.

well, it was easy to demonstrate how a cpu like his could hold back even the 192sp gtx260 while severely underclocked. anyone that has spent any time doing benchmarks knows a Core 2 E8xxx at 1.6 would be just about as fast as a 2.4 X2 (or Opty). the results are right in front of them, so there's nothing to really argue there.

now their other point is whether it will be a noticeable change over the 8800gt. I know from experience with the 2.6 X2 that the OP's 2.4 is already about at its limit with the 8800gt when it comes to minimums, so those will not change by more than 1 to 2 fps. averages will go up because max frames will still go up a bit over the 8800gt. a 4850 would have likely given him the same exact boost at 1680, though, since my 192sp gtx260 severely underclocked was already flatlining with a cpu of his level.
 
Originally posted by: Schmide
Originally posted by: toyota
they cant come up with valid arguments for some reason. just nonsensical replies and some even with insults to me yet they complain about my tone.

I just wish the op would run the benchmarks I ran before and after the 4890. I am almost to the point of going to borrow my 5000 X2 pc from the person I sold it to and buying a 4890 and 8800gt just to see what the non believers say then.

No you just discount the arguments and return to the same crysis benchmarks to make your point. You should never have to post those more than once in a thread.

Yes crysis, and far cry take a lot of CPU, but other games can really shine with a ok CPU and a Great GPU. (COD4, L4d, TF2) and btw those are the game being played by everyone.

You're argument, while valid, is not ubiquitous so be a little kind to those who dissent.

the OP said Far Cry 2, and that has an excellent benchmark. also, again, the 4890 is NOT going to improve on anything above what a 4850 could do with his cpu. and again, his minimums will still be almost exactly the same as with the 8800gt. I ran enough benchmarks on my 2.6 X2 pc to know this. using the gtx260 in the 2.6 X2 pc delivered no better minimums than the 4670 or 9600gt I used. I didn't build this pc for the heck of it; I did so because I wanted to fully utilize a higher end video card, and that was not an option with the 2.6 X2. the OP's 2.4 is even slower, so everything I said about my experience on my old 2.6 X2 pc applies to his performance too.
 
Originally posted by: toyota
the op said Far Cry 2 and that has an excellent benchmark. also again the 4890 is NOT going to improve on anything above what a 4850 could do with his cpu. and also again his minimums will still be almost exactly the same as with the 8800gt. I ran enough benchmarks on my 2.6 X2 pc to know this. using the gtx260 in the 2.6 X2 pc delivered no better minimums than with the 4670 or 9600gt I used.

Again you ignore anything others say and return to the same argument.
 