
Thoughts on the Intel Celeron G1610?

desura

Diamond Member
A friend wants to build a gaming computer and I'm helping him put together the parts list. Basically, he's into MOBAs, which means DOTA 2.

I'm going to give him my old video card, a GTX 560 Ti 448, and I'm thinking that the Celeron G1610 should be good enough. A cursory glance at this site I found on Google

http://cpuboss.com/cpu/Intel-Celeron-G1610

indicates that it is within 10% of the i3? It's an Ivy Bridge part, too, so it should be good...

Anyways, it's part of a build I managed to piece together which, excluding the OS, costs $225. Thoughts? Yes, I know the i3 exists, but it's $70 more, and I figure he can always upgrade later if it really bothers him.
 
Surprised to see it performing so poorly relative to the older AMD quads. I guess we really are at the stage now where dual cores just don't cut it for gaming.
 
or Athlon X4 750K



PS. The G1610 is a great chip: it runs cool and is fast enough for many simple tasks - I have one in a non-gaming rig. But if you want to game, then IMO it's better to spend the $20-30 more on something a bit better.
 
It's a damn tragedy that such a video card is going to be paired with such a CPU, but it should be the best CPU in its price range. The people suggesting the X4 750K are assuming you could up your budget; if you can, certainly go with that or the G2020.
 
I'm not sure where you live, but don't be afraid to search Craigslist, eBay, and for-sale forums. Intel CPUs rarely go bad or die, and they don't have pins to bend, so the odds of getting something bad used are very low. You might be able to find an i3 cheaper than you think.
 
On that note, we need to get to the stage where hexa-core CPUs (e.g. the 4930K) have a clear benefit for gaming. Right now those chips (SB-E and IVB-E) are amazing for video editing, encoding, and real work - but since about 10-15% of my time is mucking around with games, I would love for that to happen. Anyway, with regard to the topic, here's another vote for the G2020 over the G1610.
 
http://www.tomshardware.com/reviews/piledriver-k10-cpu-overclocking,3584-19.html

Here's a handy chart. Hint: the G1610 is pretty much at the bottom.

I wonder how much the i3 loses if you disable Hyper-Threading?

I guess we really are at the stage now where dual cores just don't cut it for gaming.

Well, if the OP's friend is only going to be playing DOTA2...

If you can get an LGA 1150 mobo for the same price, get at least this:
http://www.newegg.com/Product/Produc...82E16819116950

Oh wow! Didn't know dual core desktop Haswell was out. I wonder how many EUs the IGP has?
 
Dual-core CPUs are getting outdated, but they can still more than hold their own. Grab a cheap Pentium rather than a Celeron for a higher clock speed and more L3 cache out of the box.
 

Willing to bet games won't seriously take advantage of 6-core chips until a 6-core from Intel is mainstream priced.

It would suck right about now if a game like BF4, for example, pulled 40 fps with a Core i5 3570K and a 7970 GHz at 1080p maxed while a 4930K gave you 60+ fps, since the majority of gamers won't have a purpose for a 4930K outside of BF4.
 

The next-generation consoles are guaranteed to be heavily MT/MC optimized, specific to the chips being used. Regardless of what any anti-AMD troll might say (really, I don't like their CPUs either), it will happen - the level of optimization for even the Xbox 360 and PS3 is unbelievable. Some of the stuff happening in those games, which have a GPU on the level of a 2004 ATI X1950 XT, should not be possible. Games like The Last of Us, or even something simpler like Black Ops 2, are so incredibly optimized for those machines that it really is amazing. If you tried to run any modern game at 1024x768 or 720p on an ATI X1950-level GPU? Yeah, good luck with that and the resulting 2 frames per second.

Anyway, anti-AMD trolls would state it won't happen. I really dislike their desktop CPUs myself, but being realistic in this situation: MT/MC optimization will happen because it has to - console games are leaps and bounds the biggest revenue in the video game industry, period. Developers are still finding incredible tricks to optimize for the 360 and PS3, even after all these years. Make no mistake, some of these games should not even be running at 640p or 720p on the prior-generation systems.

Anyway, here's my point, and the real question: will PC ports of those same games also be heavily multi-core and multi-thread optimized? That isn't a certain outcome. Console games definitely will be. But the thing to keep in mind is that those systems use their own tools and SDKs for programming and will not be using Direct3D - they're using customized SDKs and APIs for direct hardware access via libGCN. So the million-dollar question is, will PC ports also be heavily MT/MC optimized? Who knows. I certainly don't, but I hope so. If they are, Intel's hexa-core CPUs will really shine in gaming for the first time ever (in terms of higher performance over mainstream CPUs), which would be pretty awesome.

Maybe then we can see a future where the Intel quad core is the baseline. Starting with Haswell-E, 8 cores will be the baseline for the enthusiast CPUs. Maybe they can do the same for the rest of their product lines? I doubt it, but it would be pretty awesome if Celerons/Pentiums were quad core as a baseline. Maybe some future G4020 chip will be a quad?! Who knows what will happen.
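To put rough numbers on how much extra cores could help, here's a quick back-of-the-envelope sketch using Amdahl's law. The 90% parallel fraction is purely an assumption for illustration, not a measured figure from any game:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only a fraction of the
    work can be spread across the given number of cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# If 90% of a game's frame time parallelized perfectly:
for n in (2, 4, 6):
    print(f"{n} cores -> {amdahl_speedup(0.9, n):.2f}x")
# 2 cores -> 1.82x
# 4 cores -> 3.08x
# 6 cores -> 4.00x
```

The takeaway: going from 4 to 6 cores only pays off if engines actually push the parallel fraction up, which is exactly what the console-driven optimization argument above is about.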
 

This kind of played into my decision when I built my PC.

GTA IV is optimized for the 360's CPU - the triple-core one.

And actually, it performs quite badly on dual cores.

That's kind of what made me get an i5. Of course, I didn't really get into the game that much and don't find it fun, but that's another matter.
 
Get at least a G2010; it's the same price or so.

I paired an HD 7790 with a G2010 and it was perfect at 1080p. Of course, at 720p the CPU can become a bottleneck.

The power consumption of the G2010 is ridiculously low: 16W idling and 36W at full load.
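For fun, here's what those wattage figures work out to in running cost. The daily usage split and the $0.12/kWh rate are my own assumptions for illustration:

```python
# Rough annual electricity cost for a G2010 box, using the 16W idle /
# 36W load figures quoted above. Usage hours and rate are assumed.
IDLE_W, LOAD_W = 16, 36
HOURS_IDLE, HOURS_LOAD = 6, 2   # hours per day (assumption)
RATE = 0.12                     # $ per kWh (assumption)

wh_per_day = IDLE_W * HOURS_IDLE + LOAD_W * HOURS_LOAD
kwh_per_year = wh_per_day / 1000 * 365
cost = kwh_per_year * RATE
print(f"{kwh_per_year:.1f} kWh/year -> ${cost:.2f}/year")
```

Under those assumptions it comes out to only a few dollars a year, which is the point: at these wattages the CPU's running cost is basically noise next to the GPU's.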
 
As far as I know, no Haswell part has 6 EUs; it's 10, 20, or 40...

The G3220 should have 10, i3 and higher 20...

10 EUs should be clearly faster than the HD 3000 and HD 2500 at least.
Intel's practice of not clearly listing the Intel HD graphics specifications for Celerons and Pentiums is annoying.

Even the i3/i5/i7 ARK pages could use an EU count.

Additionally, an "Intel HD" page with clear tables of exact specifications - things like pixel and texel rates, EU counts, and which features are or aren't available in each model since Sandy Bridge - would be nice.
 