CPU Shootout: i5-760 vs i5-4670K - (updated w/760@3.5GHz) Where's the progress?


sm625

Diamond Member
May 6, 2011
8,172
137
106
I like the last chart of that article. And I like the idea of comparing an i5-760 with the latest chips. But seriously at this point no one should be running an i5-760 at stock speeds. It is just leaving too much performance on the table. At the very least take it to 3.2 GHz, which can be done on the stock cooler.
 
Feb 19, 2009
10,457
10
76
I don't get the point of this test. Isn't simple logic respected here? GPU-limited games will fare fine with an older CPU, whereas CPU-limited games will not. :confused:

I believe the logic here is that the vast majority of games are GPU-bound and NOT CPU-bound... especially at 1080p and maxed settings.
 

Makaveli

Diamond Member
Feb 8, 2002
5,025
1,624
136
I went from an i7-860 to a 3570, and it was a night-and-day difference in games.

The 760 is simply too slow unless you play games with a very light CPU load. And using scripted runtime benchmarks is not exactly the best way to show off the CPU.

What games were these, and what clock speeds were the 860 and the 3570 running at?
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
The data set is a bit small to support sweeping conclusions, I agree, but I found it extremely interesting that at least a subset of modern games are not in fact CPU-dependent for the average user. The whole idea is to determine when you should actually upgrade, and this is one source of information well worth the read. You should consult multiple sources for anything, but I found the information relevant and somewhat different from the cut-and-dried articles so prevalent across the web.

The setups used are very relevant. Sure, they are not the extreme 4x-Titan-SLI-with-the-best-i7 builds that "everyone" on ATF uses, but they are certainly common setups. If you are not an extremely picky user, it might not be as beneficial to upgrade as other sites make it appear. Who games at 640x480, where the newer processor would surely show the biggest gains? Sure, there are a handful of games where the 760 will start to be overwhelmed, and BF3 multiplayer etc. is well known. This was essentially real-world testing, imo.

Sure, those of us with money to burn will continue to upgrade, but if you don't have the cash, have different priorities, or run hand-me-down / second-hand systems, it is very interesting that you may not be missing out all that much. Have some of you missed this point?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Even Sims 3 is heavily CPU-limited on a 760. The article seems to want to draw conclusions that only exist in prescripted benchmarks.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Even Sims 3 is heavily CPU-limited on a 760. The article seems to want to draw conclusions that only exist in prescripted benchmarks.

Indeed, by the looks of it they might as well have run Heaven and declared a Pentium 4 adequate for modern gaming.
 

yottabit

Golden Member
Jun 5, 2008
1,672
874
146
I've been playing games lately on a spare rig with a Pentium G630 and a 7770. I thought for sure gaming would be terrible, but most FPS games seem to run fine (BioShock Infinite @ High settings is my current playthrough).

Although I'm consciously avoiding games like BF:BC2, BF3, and Metro 2033 because I know they'll make me sad, lol.

I was originally going to upgrade it to an i5-3570K like my other rig, but I may only upgrade it to an i3-3220 now. I can't believe that on AnandTech Bench an i3-3220 is roughly equivalent to or better than an i5-750.
 
Last edited:

crashtech

Lifer
Jan 4, 2013
10,695
2,294
146
Conventional wisdom still prevails, yet some slight modifications to the collective mindset may be in order. When budgets are limited, the evidence shows that it makes sense for a gamer to spend more on the GPU than on anything else, perhaps up to 2/3 of the budget on low-cost builds.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Conventional wisdom still prevails, yet some slight modifications to the collective mindset may be in order. When budgets are limited, the evidence shows that it makes sense for a gamer to spend more on the GPU than on anything else, perhaps up to 2/3 of the budget on low-cost builds.

Exactly! Considering the stagnation of games, and perhaps somewhat of CPUs, this is the conclusion to be drawn. Where budget is a concern, it may be necessary to weigh the purchase thoroughly.

The little warning bell is that games might be about to change dramatically. This may happen with the console launch, or possibly over the following year as developers home in on the performance.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Exactly! Considering the stagnation of games, and perhaps somewhat of CPUs, this is the conclusion to be drawn. Where budget is a concern, it may be necessary to weigh the purchase thoroughly.

The little warning bell is that games might be about to change dramatically. This may happen with the console launch, or possibly over the following year as developers home in on the performance.

Another aspect of the consoles is not just the fact that they have 8-threaded CPUs, but that it looks like GPU compute might also be used more now. You could see more offloading of effects to the GPU over the next few years. Remember, the last generation of consoles had GPUs which were not optimised for compute.
 

B-Riz

Golden Member
Feb 15, 2011
1,595
765
136
I've been playing games lately on a spare rig with a Pentium G630 and a 7770. I thought for sure gaming would be terrible, but most FPS games seem to run fine (BioShock Infinite @ High settings is my current playthrough).

Although I'm consciously avoiding games like BF:BC2, BF3, and Metro 2033 because I know they'll make me sad, lol.

I was originally going to upgrade it to an i5-3570K like my other rig, but I may only upgrade it to an i3-3220 now. I can't believe that on AnandTech Bench an i3-3220 is roughly equivalent to or better than an i5-750.

I have played BF3 MP with a 32-player count on a G620 and a 4770 @ medium detail, 1440x900, all other graphics options low / off, and it was fine...

Still like the 64-player madness on the main, though. :biggrin:
 

Makaveli

Diamond Member
Feb 8, 2002
5,025
1,624
136
Stock for both. All kinds of games: GW2, SimCity, Skyrim, Fallout 3, Civ V, X3, XCOM, Anno, the Witcher series, etc.

Ahh, I think that is one of the main reasons both were at stock.

Lynnfield to Ivy/Haswell is quite a huge jump, but if you had that gen-1 chip overclocked a bit more, the gap would have been smaller. The IPC gains don't make as much of a difference in games, but the difference in clock speed will.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
And yet people still complain about the consoles using Jaguar...

Honestly, most posts on here complaining about the lack of CPU progress just irritate me.

Multiple reviews have shown that you don't need a top-of-the-line CPU to play games. In fact, multiple reviews show that the majority of games are GPU-bound and not CPU-bound.

In the end, when I ask the question "What would a massive increase in CPU power do for you?", I'm met with answers of "increased benchmarks" and a few people citing niche uses where they would see benefits.

The end result would be a massive loss in sales for Intel (people would just buy the cheaper processors that do exactly what they need them to do), and a handful of happy enthusiasts.
 

cytg111

Lifer
Mar 17, 2008
26,834
16,105
136
Yeah, innovation and progress irritate me too. I mean, why travel to Mars, or even the Moon again? Let's just stay here and enjoy death by entropy. Oh jolly.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
This just goes to show that there is increasingly little reason to upgrade a CPU for the average consumer.

The data is never all in one place, but we can see:

1 - In the 4th quarter of 2012, 90 million PCs shipped
2 - In the 4th quarter of 2012, 28 million discrete GPUs shipped

Ref:
http://www.gartner.com/newsroom/id/2301715
http://jonpeddie.com/press-releases/details/amd-intel-nvidia-q4-graphics-gpu-shipments/

This means a bit less than 1 in 3 PCs is getting a discrete GPU in the first place.
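A quick back-of-the-envelope check of that ratio (my own arithmetic, using the Q4 2012 shipment figures quoted above):

```python
# Q4 2012 shipment figures cited above (Gartner / Jon Peddie Research).
pcs_shipped = 90_000_000
dgpus_shipped = 28_000_000

# Share of shipped PCs that could have received a discrete GPU.
share = dgpus_shipped / pcs_shipped
print(f"{share:.1%}")  # 31.1% -- a bit less than 1 in 3
```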

We also know the overall dGPU market is shrinking faster than the desktop CPU market:

[chart: AIB (discrete graphics board) shipments]

What I'm really getting to here is that the industry has matured and is re-focusing on mobile where advancements can be made. The lifecycle of a PC, even for an enthusiast, is going well over 3 years now.

But I see a lot of indications that mobile is maturing too, meaning the lifecycle of ARM parts may not be all that short anymore. Witness the "new" Galaxy Tab 3 7" and 8", whose processors are not much different from those in the Galaxy Tab 2 7" and Galaxy Tab 7.7" (there are some other things that make them desirable vs. their earlier counterparts, but not the processors).
 

crashtech

Lifer
Jan 4, 2013
10,695
2,294
146
This thread goes against everything the CPU forum is all about. Let's allow this thread to die so we can continue arguing about single digit percentage gains in performance.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
This thread goes against everything the CPU forum is all about. Let's allow this thread to die so we can continue arguing about single digit percentage gains in performance.

Heh, I have a different perspective... (edit: in BF3)

I go from 63 min / 91 avg @ 4GHz, 0.985v to 80 min / 119 avg @ 4.8GHz, 1.255v.

Different strokes for different folks; even at 4.8GHz my Haswell is still the limiting factor in my performance. Personally, when I go 20nm and double my GPU power, I'll be waiting in line once again for whatever Intel/AMD has for me.
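For what it's worth, the scaling in those numbers (my own arithmetic, not something from the thread): the core clock went up 20%, while the frame rates rose even more, which is consistent with a CPU-bound scenario.

```python
# Reported BF3 figures from the post above: min/avg fps at 4.0 GHz vs 4.8 GHz.
clock_gain = 4.8 / 4.0 - 1   # core clock increase
min_gain = 80 / 63 - 1       # minimum fps increase
avg_gain = 119 / 91 - 1      # average fps increase

print(f"clock +{clock_gain:.0%}, min fps +{min_gain:.0%}, avg fps +{avg_gain:.0%}")
# clock +20%, min fps +27%, avg fps +31%
```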
 
Last edited:

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
Heh, I have a different perspective... (edit: in BF3)

I go from 63 min / 91 avg @ 4GHz, 0.985v to 80 min / 119 avg @ 4.8GHz, 1.255v.

Different strokes for different folks; even at 4.8GHz my Haswell is still the limiting factor in my performance. Personally, when I go 20nm and double my GPU power, I'll be waiting in line once again for whatever Intel/AMD has for me.

Those are some nice gains. I wonder if a single GPU would see similar gains? We all know Xfire loves some CPU powa!

 
Last edited:

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
"or heck, how about an i7-3960K and a Titan as long as we're dreaming!"

Which I went for, and I'm rather nonplussed. I could have built an AMD gaming box for the price of that Titan and only reduced one or two settings from super-maximum to high. Sure, it's great to have that much hardware grunt, but I've now realized it's rather irrelevant. How many games a year can really push it? One? Two? Still, now that I've had a taste of the high end, I'd love to build a basic mid-range gaming box. Mmmmm . . . . .
 

crashtech

Lifer
Jan 4, 2013
10,695
2,294
146
"or heck, how about an i7-3960K and a Titan as long as we're dreaming!"

Which I went for, and I'm rather nonplussed. I could have built an AMD gaming box for the price of that Titan and only reduced one or two settings from super-maximum to high. Sure, it's great to have that much hardware grunt, but I've now realized it's rather irrelevant. How many games a year can really push it? One? Two? Still, now that I've had a taste of the high end, I'd love to build a basic mid-range gaming box. Mmmmm . . . . .
Are you gaming at 2560x1440?
 

ruhtraeel

Senior member
Jul 16, 2013
228
1
0
IMO Nehalem to Sandy Bridge was a good jump. It improved on pretty much everything.

All the generations after that have seemed like very small improvements, if any. Ivy Bridge runs hotter because Intel cheaped out on thermal solder and used paste instead, which means less overclocking headroom.

Haswell keeps the paste AS WELL as moving the voltage regulators onto the chip. Even higher temperatures, even less overclocking.

Every generation after Sandy Bridge has brought an improvement of 5 to 15 percent. But since SB is so much more OC'able, I'd say Intel really isn't bringing anything extraordinary out right now, and the performance difference from Sandy Bridge to Haswell isn't noticeable if you OC mildly.

This is sad because every generation is supposed to be a strict improvement over the last, but if I still have more incentive to buy a $160 i5-2500K than a $230 i5-4670K, you know development has stagnated.

Although, if the i7-4960X really does use solder again and can OC as well as SB could, it might be the first processor worth upgrading to in a long time...
 
Last edited:

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
These are on ultra settings with a 7870. Of course there's not a big difference in maximum frame rate, but there is a decent difference in the minimums.
 
Feb 19, 2009
10,457
10
76
Heh, I have a different perspective... (edit: in BF3)

I go from 63 min / 91 avg @ 4GHz, 0.985v to 80 min / 119 avg @ 4.8GHz, 1.255v.

Different strokes for different folks; even at 4.8GHz my Haswell is still the limiting factor in my performance. Personally, when I go 20nm and double my GPU power, I'll be waiting in line once again for whatever Intel/AMD has for me.

1. You have a CF OC setup. You alleviate GPU bottlenecking in a lot of games, so the bottleneck falls squarely on the CPU.

2. BF3 is highly threaded, scaling up to 6 cores, and it scales with clock speed too.

Combine both of the above and, for sure, for someone like you, CPU gains are necessary to keep up. For people on a single GPU, especially mid-range or below, any pos CPU is fine as long as it's a quad core.

Phenom II or Q9000s? Fine. I mean, let's say a user has a Q9400 rig and an 8800 GT, and he has a $200 budget to improve his gaming performance. Should he get a cheap motherboard + CPU combo, or should he get a new GPU? No contest when the budget is tight: GPU power >>>> CPU power for gaming. It always has been, and by the looks of things (the lack of progress from Intel/AMD on the CPU side), it's gonna stay this way for a while.