Upgrade now or wait?

yusux

Banned
Aug 17, 2008
331
0
0
With my current setup, should I do a complete upgrade to 790i/PSU/SLI, or wait for newer video cards? I don't know whether the live.com cashback deal will end or not. Such turmoil..
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
If anything, upgrade the processor. Grab an E8400, E8500, or E8600 and overclock it, or go quad core if you really want. I don't think you need a complete upgrade right now, but I do think your processor is holding you back a bit. You should also consider adding some RAM. But honestly, upgrade only if you aren't satisfied with your performance; otherwise just wait.
 

aclim

Senior member
Oct 6, 2006
475
0
0
Heh, honestly, how much more performance are you really going to see at, let's say, 3.6 GHz compared to your 2.8 GHz? I dunno, but with that 26" monitor I doubt it will make that much of a difference. You're more GPU-limited at that resolution. If you want higher clocks, just swap in a different CPU; I wouldn't spend anything more than that. You should be fine for a while, until i7 comes down in price or whatever comes after that.

One thing I would do is get 4 GB of RAM. That will be the most beneficial upgrade, as long as you've got Vista 64-bit.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
He has an E6300 based on the old 65nm Core 2 architecture; it has only 2 MB of L2 cache and is clocked at just 2.8 GHz. I'm afraid that's a bottleneck for his GTX 280, even at 1080p+ resolutions. Here's the proof:
http://www.pcgameshardware.com...ticle_id=663794&page=4

Compare the E4400 to the E8600 and there's a huge difference (20% in COD4) from the cache size alone at 19x12. Now add the higher clock speed of an overclocked E8x00 on top of that, and he'll have all the CPU power his GTX 280 needs.
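
To put that in concrete terms, here's a quick back-of-the-envelope sketch (the baseline fps is a hypothetical number; only the 20% figure comes from the article):

# Rough sketch of what a 20% CPU-side gain looks like in fps.
# base_fps is hypothetical, purely for illustration.
def with_gain(base_fps, gain_pct):
    return base_fps * (1 + gain_pct / 100.0)

base = 50.0                   # hypothetical fps with the small-cache CPU
print(with_gain(base, 20.0))  # 60.0 fps -> a 10 fps jump from the CPU alone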
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: cusideabelincoln
He has an E6300 based on the old 65nm Core 2 architecture; it has only 2 MB of L2 cache and is clocked at just 2.8 GHz. I'm afraid that's a bottleneck for his GTX 280, even at 1080p+ resolutions. Here's the proof:
http://www.pcgameshardware.com...ticle_id=663794&page=4

Compare the E4400 to the E8600 and there's a huge difference (20% in COD4) from the cache size alone at 19x12. Now add the higher clock speed of an overclocked E8x00 on top of that, and he'll have all the CPU power his GTX 280 needs.

I understand they're testing cache differences, but that doesn't tell you a whole lot except that more cache = more performance.

That doesn't necessarily mean his CPU is bottlenecked at that resolution. A CPU bottleneck is when you get equal performance across all resolutions no matter how much you upgrade your GPU. What you are talking about is CPU limiting, and every CPU has that.

You aren't going to see big gains at a resolution that high. Probably 10 fps or so. What's the difference between 90 fps and 100 fps? Not much. When you lower the resolution the gains get bigger, but then again you're beyond 100 fps already.
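
Run the numbers on that and it's only ~11%, or about 1 ms per frame (a quick sketch using the fps values above):

# Sketch: relative gain and frame-time difference for 90 -> 100 fps
old_fps, new_fps = 90.0, 100.0
gain = (new_fps - old_fps) / old_fps * 100.0   # ~11.1% relative gain
saved = 1000.0 / old_fps - 1000.0 / new_fps    # ~1.1 ms saved per frame
print(f"{gain:.1f}% faster, {saved:.2f} ms less per frame")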

Upgrade to a quad if you are going to upgrade your processor. Better yet, i7. Don't waste more money on the same old Core 2 Duo technology.
 

aclim

Senior member
Oct 6, 2006
475
0
0
Yea lol, trust me, that CPU is not a big bottleneck. And that cache, in my opinion, is not noticeable in games. At least I don't think so; I went from an E6300 to my current CPU and saw no difference at all. His money is much, much better spent on more RAM, plus Vista 64-bit to support it and play DX10.

LOL, and after looking at that article, it's a whole 6 fps max difference between a 2 MB and a 6 MB cache at the same clock speed. That is a joke. I dunno where you get that it's a bottleneck. LOL.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
....

First, in CoD4 the difference is 10 frames per second. You can't go by your experience, because you are using an 8800 GTS and he's using a much more powerful GTX 280. And for the statisticians: in CoD4 the performance increase is about 20% and in GRID it is about 15%, which is definitely noticeable in game. That is like the difference between an HD4850 and an HD4870.

Second, the much higher clock speed of an overclocked E8x00 processor will definitely increase his performance over a measly 2.8 GHz E6300.

And I hate the "stop buying the same old architecture" argument that is so rampant around here. Why use it? i7 is going to be much more expensive than even a complete Core 2 upgrade for him, and through overclocking he could more cheaply build a Core 2 rig that will be just as fast as a Core i7 rig, if he really wanted to do an overhaul.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: cusideabelincoln
....

First, in CoD4 the difference is 10 frames per second. You can't go by your experience, because you are using an 8800 GTS and he's using a much more powerful GTX 280. And for the statisticians: in CoD4 the performance increase is about 20% and in GRID it is about 15%, which is definitely noticeable in game. That is like the difference between an HD4850 and an HD4870.

Second, the much higher clock speed of an overclocked E8x00 processor will definitely increase his performance over a measly 2.8 GHz E6300.

And I hate the "stop buying the same old architecture" argument that is so rampant around here. Why use it? i7 is going to be much more expensive than even a complete Core 2 upgrade for him, and through overclocking he could more cheaply build a Core 2 rig that will be just as fast as a Core i7 rig, if he really wanted to do an overhaul.

Ummmm, no... Those tests were all done at 1680x1050 with 4xAA. At 1920x1200 the gains will shrink even further. Games for the most part are GPU-limited, not CPU-limited. If you are going to spend money on a processor and motherboard, you might as well pick up better graphics cards if you are trying to get better FPS. The returns are much greater.


http://www.pcgameshardware.com...647744&image_id=839047

http://www.pcgameshardware.com...ticle_id=647744&page=1
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
Dude, just no. The benchmarks I showed and used for my reference were done at 19x12 using a GTX 280. Your benchmarks are pretty irrelevant, since they use the same CPU just at different clock speeds. You're basing your entire argument on ignoring that the jump from an E6300 to an E8x00 also brings changes in L2 cache size, power consumption, heat, overclocking headroom, FSB speed, some minor architectural tweaks, and finally clock speed. These all add up to a big difference. Although I will agree, and should have mentioned earlier, that an upgrade to a quad-core Core 2 processor would provide even better performance per clock and would be the most comparable to Nehalem.

Here are some more benchmarks:
http://www.legionhardware.com/document.php?id=770&p=11

Summary:
I am using the E6700 as a stand-in for his 2.8 GHz E6300, although I believe the E6700 would still be a bit faster. I am comparing it to an E8400 @ 3.3 GHz (a mild overclock) and a Q9650 @ 3.3 GHz (also a mild overclock).
Company of Heroes: The E8400 is 5% faster and the Q9650 is 16% faster
Enemy Territory: The E8400 is 18% faster and the Q9650 is 37% faster
Supreme Commander: The E8400 is 8% faster and the Q9650 is 13% faster
Unreal Tournament: The E8400 is 13% faster and the Q9650 is 21% faster

Those would be noticeable improvements, especially if he made the move to a 45nm quad core and overclocked it.
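
To visualize what those percentages mean in practice, here's a quick sketch applying them to a hypothetical 60 fps baseline (the percentages are from the Legion Hardware numbers above; the baseline is made up):

# Project the measured percent gains onto a hypothetical baseline fps
gains = {  # game: (E8400 @ 3.3 GHz, Q9650 @ 3.3 GHz), percent vs. the E6700
    "Company of Heroes": (5, 16),
    "Enemy Territory": (18, 37),
    "Supreme Commander": (8, 13),
    "Unreal Tournament": (13, 21),
}
base = 60.0  # hypothetical E6700-class fps
for game, (duo, quad) in gains.items():
    print(f"{game}: {base * (1 + duo / 100):.0f} fps (E8400), "
          f"{base * (1 + quad / 100):.0f} fps (Q9650)")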
 

AzN

Banned
Nov 26, 2001
4,112
2
0
How is it irrelevant just because it tests clock speeds? :disgust:

The guy has 2.85 GHz, which is about on par with an E6700.

You are getting anywhere from 5-20% gains, but was it even unplayable in the first place? Going from 111 fps to 125 fps doesn't mean much, nor does 77 fps to 81 fps.

So how much does a Q9650 cost? How about an E8400? Wouldn't that money be better spent on another GPU for 60-80% gains? Why would you waste $200 or even $500 on something so insignificant if all you're trying to do is improve game fps?
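
Put it in dollars per percentage point of fps gain (prices and gains are the ballpark figures being thrown around in this thread, with midpoints picked by me):

# Sketch: cost per 1% of fps gain, using rough thread figures
options = {
    "E8400 upgrade": (200, 10),   # ~$200 for ~5-20% (midpoint ~10%)
    "Q9650 upgrade": (500, 15),   # ~$500 for somewhat more
    "second GPU":    (400, 70),   # ~$400 for ~60-80% (midpoint ~70%)
}
for name, (cost, gain) in options.items():
    print(f"{name}: ${cost / gain:.0f} per 1% of fps gain")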

BTW, those games are old or just not GPU-intensive enough for the 4870X2; the majority of them are running over 100 fps. Try some GPU-intensive games like Crysis @ Very High and the gains would be insignificant. Why did you link to a 4870X2 CPU scaling article anyway? Because a GTX 280 would show even less of that effect. :p

http://www.legionhardware.com/document.php?id=775&p=13

Your 13% and 21% gains in UT3 just dropped to 1.2% and 1.2%. :brokenheart:

Now say the OP sold his GTX 280 and upgraded to a 4870X2, breaking even, versus putting $200-$500 into a CPU upgrade on the GTX 280. Guess which has the higher gains? :light:
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Azn, the OP already has the 2nd fastest graphics card, the GTX 280. Why would a CPU upgrade not benefit him? LOL

If there is a 15-20% increase just going from 65nm/4MB --> 45nm/6MB, then there will be about another 20% going from 2.8 GHz to 4.2 GHz. Also, those tests are done on 22" and 26" monitors with 4xAA and maximum settings.
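
And if both of those gains really materialize, they compound rather than add (a quick sketch using the thread's own estimates):

# Sketch: compounding a ~20% cache/process gain with a ~20% clock gain
cache_gain = 0.20   # 65nm/4MB -> 45nm/6MB (thread estimate)
clock_gain = 0.20   # 2.8 GHz -> 4.2 GHz (thread estimate)
total = (1 + cache_gain) * (1 + clock_gain) - 1
print(f"combined gain: {total:.0%}")   # 44%, not just 40%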

It's either $400 to SLI, $150 for an E8400, or $550 for both.

edit: or $300 for a nice 45nm quad or $1000 for an i7

 

krnmastersgt

Platinum Member
Jan 10, 2008
2,873
0
0
Originally posted by: aclim
Yea lol, trust me, that CPU is not a big bottleneck. And that cache, in my opinion, is not noticeable in games. At least I don't think so; I went from an E6300 to my current CPU and saw no difference at all. His money is much, much better spent on more RAM, plus Vista 64-bit to support it and play DX10.

LOL, and after looking at that article, it's a whole 6 fps max difference between a 2 MB and a 6 MB cache at the same clock speed. That is a joke. I dunno where you get that it's a bottleneck. LOL.

Err, you upgraded within the same family of processors, which means your upgrade was purely clock speed; that's different from a cache increase combined with much higher clocks. At the same clock an E8400 would pull away from your E6600 because of the cache difference, and then you factor in the overclock you can get with an E8400 and it's pretty much no contest.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: jaredpace
Azn, the OP already has the 2nd fastest graphics card, the GTX 280. Why would a CPU upgrade not benefit him? LOL

If there is a 15-20% increase just going from 65nm/4MB --> 45nm/6MB, then there will be about another 20% going from 2.8 GHz to 4.2 GHz. Also, those tests are done on 22" and 26" monitors with 4xAA and maximum settings.

It's either $400 to SLI, $150 for an E8400, or $550 for both.

edit: or $300 for a nice 45nm quad or $1000 for an i7

Jared, did you even read what I said? The upgrade would be insignificant; I never said it wouldn't help.

Going from 2.85 GHz to 3.6 GHz on the same architecture isn't much; it's an incremental upgrade if you ask me. At lower resolutions, yes, it will make a big difference, but not @ 1920x1200, even with a GTX 280. The money would be better spent elsewhere. Not to mention i7 is just around the corner.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: jaredpace
Yah, I'm not saying to keep the same architecture...
going from 65nm/4MB --> 45nm/6MB

going from 2.8 GHz to 4.2 GHz.

Those benches ARE 1920x1200 WITH a GTX 280 and 4xAA?

So did the architecture change? Do you even know what that is?

Is 4.2 GHz a typical overclock for those E8400s? Show me a CPU scaling article with 4xAA.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: jaredpace
Well, I guess it goes from Conroe to Penryn. Do those terms sound familiar? Please explain architecture to me.

4xAA:
http://www.pcgameshardware.com...ticle_id=663794&page=4

So giving the processor another name is an architectural change? It's mainly an FSB and cache difference. I guess Intel did a great job marketing their processors to guys like you.

That link was already mentioned; look above. That's measuring the cache difference.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
It's not just "another name," Azn.

Read these articles to find out more:
http://www.anandtech.com/cpuch...el/showdoc.aspx?i=2972
http://www.anandtech.com/cpuch...el/showdoc.aspx?i=3069
http://www.anandtech.com/cpuch...el/showdoc.aspx?i=3137

Here's a graph measuring CPU speed in MHz with 4xAA on the OP's graphics card, and it doesn't even take into account all the improvements covered in the three previous links:
http://www.pcgameshardware.com...f_the_Geforce_GTX_280/

You linked to graphs out of this article earlier, but I'm not sure you are seeing the differences.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: jaredpace
It's not just "another name," Azn.

Read these articles to find out more:
http://www.anandtech.com/cpuch...el/showdoc.aspx?i=2972
http://www.anandtech.com/cpuch...el/showdoc.aspx?i=3069
http://www.anandtech.com/cpuch...el/showdoc.aspx?i=3137

Here's a graph measuring CPU speed in MHz with 4xAA on the OP's graphics card, and it doesn't even take into account all the improvements covered in the three previous links:
http://www.pcgameshardware.com...f_the_Geforce_GTX_280/

You linked to graphs out of this article earlier, but I'm not sure you are seeing the differences.

The marketing boys must love you. Penryn and Conroe belong to the same Core 2 Duo lineup. That's not an architectural change; it's called a new process. It's just souping up the same architecture. An architectural change would be going from a Pentium 4 to a Core 2 Duo. My 2 MB cache E6300 is called Conroe, and so is the 4 MB cache E6850. Does that mean it's an architectural change to you?

That article you linked (which I linked first, BTW, if you look above) is at 1680x1050 with 4xAA, and the only thing that got a huge gain was 3DMark. At 1920x1200 the gap would shrink even further. Do you play 3DMark?

So you want to spend $165 for a 5-10% change? Why not just spend that money on a GPU if you are trying to gain FPS in games?

The OP could totally sell his GTX 280, get a 4870X2, and break even if he went with the eBay deal. The gains would be much bigger than spending $165 on a processor for 5-10%. Or why not another GTX 280 and nearly double his performance?
 

MyLeftNut

Senior member
Jul 22, 2007
393
0
0
It seems that in the review there is a significant difference in performance, but the CPU clocks are so low to begin with. I wonder what would happen if they tested at 3.0 GHz or higher; I would think the performance difference due to cache would be much smaller then.

BTW OP, is your CPU capable of overclocking to around 3.4-3.5 GHz? If so, you can stick with it for a while longer, until there are real worthwhile upgrade options at more affordable prices.
 

Kraeoss

Senior member
Jul 31, 2008
450
0
76
It seems that Azn is the most knowledgeable person on this forum... seems he knows more than the actual reviews lol...