Please help my friend decide whether to ditch his AMD 760K quad core for a Celeron


john5220

Senior member
Mar 27, 2014
Windows 10 is the last OS Microsoft will ever release, so Sandy Bridge is good for 20 years.

And Sandy Bridge is approaching 5 years, and nothing has been able to dent the 2500K as we speak.

Not even close. There are no signs in nearly 5 years that games have come anywhere remotely close to tapping out the chip.

When I say 20 years, I mean 20 years from the chip's date of manufacture, so right now we are looking at about 15 years to go.

Even then, we could add 5 and make it 25 years in total before the Sandy Bridge i7 bottoms out against a properly designed game that actually uses more than 8 threads.

We are not even sure that in 15 to 20 years games will truly use the full potential of all 8 threads, considering that even today games are still largely single-threaded.
 

escrow4

Diamond Member
Feb 4, 2013
john5220 said:
And Sandy Bridge is approaching 5 years, and nothing has been able to dent the 2500K as we speak. [...]

Must have missed GTA V:

[chart: GTA V CPU benchmark, very high settings]


Stock to stock there is a difference. Same (to a lesser degree) with The Witcher III. And no, not everyone overclocks the stuffing out of their CPUs.
 

john5220

Senior member
Mar 27, 2014
ROFL, oh wow, a 5 FPS difference in minimums. LOL.

Like I said, there has been no dent in 5 years. What, 1 FPS a year? Gimme a break. And the reason there is practically no difference when overclocked is because these games are largely single-threaded and run better at a higher clock.

With DX12 reducing the load on the CPU there is even less need to upgrade, so we are now looking at an even longer lifespan. Face it, the 2500K and 2600K, especially overclocked, are going to last 20 years from the date of manufacture.
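As a back-of-the-envelope illustration of that single-thread argument, here is a minimal Amdahl's-law sketch; the 30% parallel fraction is an assumed number for the sake of the example, not a measurement from any game:

```python
# Toy Amdahl's-law illustration: if a game's frame loop is mostly serial,
# extra cores barely help, while a higher clock speeds up everything.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Overall speedup when only parallel_fraction of the work scales across cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

P = 0.30  # hypothetical: 30% of frame time parallelizes
for cores in (2, 4, 6, 8):
    print(f"{cores} cores: {amdahl_speedup(P, cores):.2f}x")
# Prints 1.18x, 1.29x, 1.33x, 1.36x - while a mere 20% clock bump
# is a flat 1.20x across the whole frame.
```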
 

LTC8K6

Lifer
Mar 10, 2004
john5220 said:
ROFL, oh wow, a 5 FPS difference in minimums. LOL. [...]

Yeah, for 2/3 of that list, no one would notice any difference in gameplay between the various CPUs.
 

john5220

Senior member
Mar 27, 2014
^ Yeah, I am still amazed at how people look at 3 FPS and say LOOOOOK, BAZINGAAA!!!

In 5 years, and however many generations later, we have achieved 3 FPS!!!!

Yeah, I really should sell a perfectly good Sandy Bridge 2500K, get the latest Haswell i5, and lose money for no good reason, when that 2500K can last 20 years.

Face it, guys, this isn't the '90s anymore. We have reached such a mature state that there really is only so much the human mind can achieve. Show me the day a game cannot run on a 2600K, an 8-threaded Sandy Bridge i7.

Show me the day when the minimum requirement for a game is 10 CPU cores. If we live to see that day, OK. DX12 at most still does not even utilize 6 cores; the tests show significant gains on dual and quad cores.

Almost nothing for 6 cores, and mind you, that test was done on Star Swarm, which really does not even represent real-world gaming. For all we know, we are years, maybe even a decade, away from games even properly using 6 cores.
 

Denithor

Diamond Member
Apr 11, 2004
Uh, you missed the biggest point on that chart - the completely craptastic performance of the Pentium dual-core chips. Even when massively overclocked, the G3258 still has truly horrible minimum frame rates - which basically means it's going to be a stuttery mess when you get a lot of stuff going on in game. Your friend should avoid a Celeron chip like the plague!
 

LTC8K6

Lifer
Mar 10, 2004
Denithor said:
[...] Your friend should avoid a Celeron chip like the plague!

On that chart, at the level of the dual-core Pentiums, you are pretty much looking at unplayable frame rates anyway, even from the quad-core chips. At that point you are at 40 FPS or less with all of the chips, a rate you probably won't like overall.

In other words, while the FX-4300 performs better at minimum frame rates, it still falls into the same overall category as the dual-core Pentiums: not fast enough.
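As a minimal sketch of why those minimums matter more than the averages, here is how average FPS and 1% lows fall out of a frame-time log; the frame times below are invented for illustration, not data from the chart:

```python
# Convert frame times to FPS figures, the way FRAPS-style frame-time logs
# are usually summarized. The values are made up for illustration.

frame_times_ms = [25.0] * 95 + [50.0] * 5  # mostly 40 fps, with 5% spikes to 20 fps

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]  # slowest 1%
one_pct_low_fps = 1000.0 / (sum(worst) / len(worst))

print(f"average: {avg_fps:.1f} fps")         # ~38.1 fps looks fine on paper...
print(f"1% low:  {one_pct_low_fps:.1f} fps")  # ...but the 20 fps spikes are what you feel
```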
 

sm625

Diamond Member
May 6, 2011
Yeah, but that's just games. Not that that's something bad, but all that hype about GPU computing has no relation to the common user; you have to be a professional of some kind to ever run such a program.
But I guess playing VR in 4K (probably higher) on the go would be pretty nice.

It's not just games, it's graphics compute. I think it's going to be a really big deal in the future. Imagine you have 2-3 people walking around an area capturing video streams on their smartphones. In the coming years, any person with a PC is going to be able to take those multiple video streams and feed them into power-hungry software that constructs a complete 3D scene from them, to be used in VR. This will require enormous GPU compute power. The entire world is going to be captured and rendered over the next decade, similar to how those Google vans drive around the country mapping the streets. Every real estate agent is going to be running this software. Every porn star (the early adopters, no doubt), travel agents, etc. All sorts of careers we don't even know about will spring up.
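For a sense of what that software has to do, here is a minimal two-view reconstruction sketch with OpenCV; the frame file names and the camera matrix K are placeholder assumptions you would replace with real calibrated data:

```python
# Toy structure-from-motion step: match features between two video frames and
# triangulate 3D points. A real pipeline repeats this across thousands of
# frames, which is where the enormous GPU compute demand comes from.
import cv2
import numpy as np

# Placeholder inputs: two frames and an assumed camera intrinsic matrix.
img1 = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)
K = np.array([[1000.0, 0, 640], [0, 1000.0, 360], [0, 0, 1]])  # made-up focal/center

orb = cv2.ORB_create(2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Estimate relative camera motion, then triangulate matched points into 3D.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
pts3d = (pts4d[:3] / pts4d[3]).T  # homogeneous -> 3D coordinates

print(f"recovered {len(pts3d)} 3D points from {len(matches)} matches")
```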
 

TheELF

Diamond Member
Dec 22, 2012
Denithor said:
[...] still has truly horrible minimum frame rates - which basically means it's going to be a stuttery mess when you get a lot of stuff going on in game. Your friend should avoid a Celeron chip like the plague!
GTA V on the slowest Celeron around, the G1820: actual gameplay at 25 FPS while recording, with no drops at all.
https://www.youtube.com/watch?v=UK0_wOKv-rU

Not that this means someone should sell their working 760K system to get another with similar or lower performance, but... like the plague? C'mon, the 760K hardly does better, and that's without recording.
 

TheELF

Diamond Member
Dec 22, 2012
sm625 said:
It's not just games, it's graphics compute. [...] The entire world is going to be captured and rendered over the next decade, similar to how those Google vans drive around the country mapping the streets. [...]

Sure, no way some centralized "agency", like the Google you brought up, will do that for you once, instead of everybody having to do it over and over again.