Expected gaming life of a 6600/6700K and a 5820K


ShintaiDK

Lifer
Apr 22, 2012
One can always argue in terms of lifespan. The stock clock bump of the 4790K and 6700K compared to previous generations makes them much more compelling, assuming you don't overclock.

However, in raw terms, a person buying a 2600K vs. a 2500K pretty much wasted his money. And it's unlikely to change for the foreseeable future. DX12 may, if anything, simply reduce CPU load.
 

Spungo

Diamond Member
Jul 22, 2012
And mind you, these tests were at 1080p... if you step to 1440p or god forbid 4K, the differences become fractions of a percent. Gaming is 99% GPU bound, when will people learn this?

The difference between an OK CPU and a great CPU is measured in years instead of frames. You can use a dual-core Pentium G3258 for games, and it will work great today, but it might need to be upgraded in 2 years. An i5 or i7 will last a lot longer, and it ends up saving money in the long run.

Cost-wise, the same is true for video cards. The last expensive card I bought lasted a very long time. I probably got about 4-5 years out of my 8800GTX. If I was buying crap cards that can barely run modern games, I would need to buy several cards, and it would end up costing more than just buying a good card in the first place.
 

JAG87

Diamond Member
Jan 3, 2006
Yup, paying an exorbitant $100 for a long-lived part with higher stock clocks and HT is such a complete waste of money, especially with a GPU with such a godawful price/performance ratio like a GTX 980. :rolleyes:

BTW, average FPS proves nothing without knowing minimums and frame time distributions.
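To make that concrete, here is a minimal sketch of how an average can hide stutter: it computes average FPS and a "1% low" figure from per-frame times. The frame-time list is purely hypothetical illustration data, not from any benchmark in this thread.

```python
# Minimal sketch: average FPS vs. 1% low FPS from per-frame times.
# The frame times below are hypothetical illustration data.
frame_times_ms = [12.5, 13.1, 12.8, 45.0, 12.9, 13.0, 12.7, 38.2, 12.6, 13.3]

# Average FPS: total frames divided by total elapsed time.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# "1% low": average FPS over the slowest 1% of frames (here that is just the
# single worst frame, since the sample is tiny). Two runs can share the same
# average while differing wildly on this figure.
worst = sorted(frame_times_ms, reverse=True)
slowest_count = max(1, len(worst) // 100)
one_percent_low_fps = 1000.0 * slowest_count / sum(worst[:slowest_count])

print(f"average: {avg_fps:.1f} fps, 1% low: {one_percent_low_fps:.1f} fps")
```

With this made-up data the average comes out around 54 fps while the 1% low is around 22 fps, which is exactly the kind of gap an average-only chart hides.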

There is nothing you can do about that. If you want the best gaming performance, you have to shell out for the best GPU available. And while there is a law of diminishing returns there, when you put more money into a CPU for more cores/threads for gaming, there isn't a diminishing return; there simply is no return, just like buying 16 or 32 GB of RAM instead of 8. So yes, the 980 is a bad perf/$ purchase (like any other top-tier part), but it's the only way to have top-shelf graphics performance today.


The difference between an OK CPU and a great CPU is measured in years instead of frames. You can use a dual-core Pentium G3258 for games, and it will work great today, but it might need to be upgraded in 2 years. An i5 or i7 will last a lot longer, and it ends up saving money in the long run.

Cost-wise, the same is true for video cards. The last expensive card I bought lasted a very long time. I probably got about 4-5 years out of my 8800GTX. If I was buying crap cards that can barely run modern games, I would need to buy several cards, and it would end up costing more than just buying a good card in the first place.

No, it's not measured in years, because, like I said before, you will end up moving to a newer platform for other reasons.

The 8800 GTX is a poor example (and almost a one-of-a-kind example): it was a huge leap in performance from what we had, and it brought DX10 support while Windows Vista adoption was abysmal and DX10 games could be counted on one hand. Those two things combined helped the 8800 have a ridiculously long service life.

Anyone with common sense recognizes that top-of-the-line components offer the worst perf/$ and have huge early depreciation. Not only that, but you will end up being stuck with old feature sets while trying to amortize your cost for as long as possible. So the whole premise of recommending top-tier parts and riding them out is completely wrong.

Common sense dictates that you should only splurge money on the component that serves the main objective of your system. If it's a gaming or parallel-computing system, there is only one component to splurge on: the GPU. If you're building a system for software encoding/rendering, splurge mostly on the CPU. If you are building a system to run applications that require huge I/O throughput, splurge on SSDs. And so on.
 

toyota

Lifer
Apr 15, 2001
This is a load of crap, pure FUD that only people who build for e-peen need for reassurance. Example:

http://www.techspot.com/review/972-intel-core-i3-vs-i5-vs-i7/page5.html

A Core i3 with 2 SMT cores is plenty for almost any modern game. Is it ideal? No, there are a few exceptions, but getting anything more than an i5 for gaming is just FUD. The i7, and most certainly the 6-core SMT i7s, only have value for CPU-intensive workloads like rendering or encoding.

And mind you, these tests were at 1080p... if you step to 1440p or god forbid 4K, the differences become fractions of a percent. Gaming is 99% GPU bound, when will people learn this?




If you're building a system to game on, other features will push you to another platform before your CPU even becomes the bottleneck. It's been like this for the past 10 years, and it will keep being like this for the foreseeable future, at least as long as we keep rendering through DirectX. People with gaming systems with i5s and a 980 are smart; they put the money where it's needed for gaming instead of wasting it.

Yeah, I expected an ignorant reply from at least one person. As I said, if you want to get the full use out of a high-end card and never have to worry about hitching from pegging the CPU in any game, then an i7 is the better choice now and going forward. That is a FACT whether you like it or not. I have at least 5 or 6 games that go over 50% usage on my 4770K at times, and with HT off it can usually even be felt in those spots. Heck, Crysis 3 will eat 80% of my CPU at times and was a noticeably stuttery experience with my OCed 2500K in those spots. But hey, you know better than me, as I clearly just upgraded to an i7 to impress all my friends. :rolleyes:


EDIT: Here is just a simple example in Watch Dogs.

I tested the game at 2560x1440 on max settings and temporal SMAA.

HT ON
Frames, Time (ms), Min, Max, Avg
5809, 70938, 74, 91, 81.888

HT OFF
Frames, Time (ms), Min, Max, Avg
5116, 67937, 55, 89, 75.305
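
As a quick sanity check, those Avg figures are just total frames divided by elapsed time; a small Python sketch using the numbers from the output above:

```python
# Sanity check: average FPS = total frames / elapsed seconds.
# Numbers copied from the benchmark output above (Time is in milliseconds).
runs = {
    "HT ON":  {"frames": 5809, "time_ms": 70938},
    "HT OFF": {"frames": 5116, "time_ms": 67937},
}

for name, r in runs.items():
    avg_fps = r["frames"] / (r["time_ms"] / 1000.0)
    print(f"{name}: {avg_fps:.3f} fps")  # ~81.888 and ~75.305, matching the Avg column
```

Both work out to the 81.888 and 75.305 shown above, so the averages are consistent with the raw frame and time counts.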

Not only is there an actual FPS difference in this particular case, but the difference was night and day in how smooth the game felt. With HT off it felt choppy several times during this short run, and you could clearly see the character animation hitch several times while running. The framerate had huge fluctuations with HT off and went below 60 fps several times. And this simple test involved no real heavy action; it was just me running down the street and then up some stairs, with not very many people around.

And most people probably have stuff running in the background, as I doubt everyone shuts down their browsers and stops every service. HT gives some real-world breathing room for other things that may need to use a few percent of the CPU while you are gaming. That eliminates any potential issues you could experience at those times when all 4 cores are already being fully utilized by a game.

And as I have already mentioned, I am talking about high-end systems here. I am not talking about someone on a tight budget who would need to choose between an i7 and getting something better than a mid-range GPU. Someone who goes out and gets a new CPU now, and will be using a 980 Ti, 16GB of RAM, and an SSD, would be silly not to spend the extra money on the i7 over the i5 for long-term use, as it already matters now in some cases.
 

DrMrLordX

Lifer
Apr 27, 2000
Personally I only got the i7 due to stock clocks and cache.

If you're going to be stuck with a CPU for years and years, may as well go for the larger cache. It does make a difference in games!

HT is just icing on the cake, for those circumstances where it's useful on a 4c CPU.
 

Spungo

Diamond Member
Jul 22, 2012
No, it's not measured in years, because, like I said before, you will end up moving to a newer platform for other reasons.
Such as?


The 8800 GTX is a poor example (and almost a one-of-a-kind example): it was a huge leap in performance from what we had, and it brought DX10 support while Windows Vista adoption was abysmal and DX10 games could be counted on one hand. Those two things combined helped the 8800 have a ridiculously long service life.
The same was also true of the GeForce4 4400. My friend got one around 2002-2003, and it lasted until 2007, when he bought an 8800 Ultra. During that time, I had 3 different video cards: a GeForce2 Ti, a GeForce 5200, and a Radeon 9600. Getting second-rate products didn't save money in the long run.