Is the Pentium G4620 The First Decent Stopgap in Years?


VirtualLarry

No Lifer
Aug 25, 2001
56,230
9,991
126
How do the HD610 and HD630 stack up to an NV GT610/620 / GT630/730 (DDR3) card? For desktop usage and 1080P / HDMI/DVI video watching?

Was wondering, if I upgrade my friend's rig, whether I should re-use his GT610 video card and just get the G4560 / HD610, or get the G4600 / HD630 and leave off the discrete GPU?
 
  • Like
Reactions: Drazick

Bouowmx

Golden Member
Nov 13, 2016
1,138
550
146
For 2D desktop and videos, any recent GPU will do. No point in filling an expansion slot for that when the Kaby Lake GPU is available.

For 3D, Intel HD Graphics 630 (192 shaders) is better than Nvidia Fermi at 48 and 96 shaders (not specifying models because of the clutter of rebrands) and Kepler at 192, but a toss-up against Kepler at 384.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
That's been my rationale for several years. A video card is a lot easier to replace than a chip, and it's the one major part you will want to replace in these days of 5-year CPUs. It's not that hard to swap drives around either (or just buy an SSD first and add an HDD later).

Remember those arguing for saving $100 with a 955BE instead of a 2500K back in 2011, because that $100 could be spent on a better GPU?

Guess who looks foolish now?
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
How do the HD610 and HD630 stack up to an NV GT610/620 / GT630/730 (DDR3) card? For desktop usage and 1080P / HDMI/DVI video watching?

Was wondering, if I upgrade my friend's rig, whether I should re-use his GT610 video card and just get the G4560 / HD610, or get the G4600 / HD630 and leave off the discrete GPU?

Those GPUs suck as they cannot hardware decode HEVC, never mind 10-bit. Or VP9. Seeing as everything is rich media now, not DivX from 2001, that is a big omission. Even the rubbish Snapdragon 210 in my cheapo Android can at least hardware decode 8-bit HEVC, and that is a $100 phone. Kaby Lake can finally do all that natively: no hacks, no partial decoding. (Although nothing can decode AV1 yet.) Still, those $50 add-in boards are rubbish now, and I doubt Nvidia will dump out a modern cheap $50 card anymore.
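For anyone who wants to check what their own box can actually accelerate before trusting a spec sheet, here's a minimal probe. It assumes ffmpeg is installed and on the PATH, and note it only lists the acceleration APIs the build supports (dxva2/d3d11va on Windows, vaapi/qsv on Linux); whether 10-bit HEVC or VP9 actually offloads still depends on the GPU and driver.

```python
import subprocess

def ffmpeg_hwaccels() -> list[str]:
    """List the hardware acceleration methods this ffmpeg build supports."""
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-hwaccels"],
        capture_output=True, text=True, check=True,
    ).stdout
    # First line is the "Hardware acceleration methods:" header; the rest are names.
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

if __name__ == "__main__":
    print("hwaccel methods:", ", ".join(ffmpeg_hwaccels()) or "none")
```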
 
  • Like
Reactions: VirtualLarry

VirtualLarry

No Lifer
Aug 25, 2001
56,230
9,991
126
Those GPUs suck as they cannot hardware decode HEVC, never mind 10-bit

That's a very good point. I wasn't really considering that my friend would be watching HEVC, but if that or VP9 takes off, then that's a real possibility 2-5 years down the line.

Likewise, would it be better to spring for the HD630, just in case there's some newer codec that gets implemented in the drivers as a "hybrid" (GPU) solution for decode?
 
  • Like
Reactions: Drazick

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
That's a very good point. I wasn't really considering that my friend would be watching HEVC, but if that or VP9 takes off, then that's a real possibility 2-5 years down the line.

Likewise, would it be better to spring for the HD630, just in case there's some newer codec that gets implemented in the drivers as a "hybrid" (GPU) solution for decode?

2-5 years down the track there will be CPUs with better iGPUs. That said:

- DivX/Xvid - 2001 (early use) - now - can be decoded with a potato; quality is meh at best

- H.264/x264 - 2008 (early use) - now - can be decoded by most anything; CrApple will only decode it natively in an .mp4 container

- HEVC/x265 - 2014 - now - decoding is progressing; CrApple is lagging behind

Hybrid solutions aside, creaky old MPEG-4 is still in use, so buying now for the future doesn't really matter; it makes more sense to buy the best all-round decoder now, which is Kaby Lake. Interestingly, there is still at least a five-year gap before a codec's successor starts to make inroads. I wonder if that will start to shrink with AV1 and 4K and beyond...
 

VirtualLarry

No Lifer
Aug 25, 2001
56,230
9,991
126
My friend in question JUST got a 1080P monitor. So he's not likely moving to 4K anytime soon. (Unless I get tired of my 4K UHD displays, or decide to upgrade... but
I'm really happy with them so far.)
 
  • Like
Reactions: Drazick

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Remember those arguing for saving $100 with a 955BE instead of a 2500K back in 2011, because that $100 could be spent on a better GPU?

Guess who looks foolish now?
Yeah, but who in 2011 planned on keeping the same CPU for five to seven years?
 
  • Like
Reactions: Drazick

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Okay, fine, but what this digression into HT is really about, as regards this thread, is the fact that it works MUCH better on duals than on quads and hexacores, something that many, including you, seem to have only become aware of recently. Perhaps your HT tutorial could come to some conclusions about why this is so, in order to remain topical.
I suppose that is because modern games are shooting for quad-core CPUs, performance-wise. I went to check out the system requirements of Star Citizen, and they are as follows:
  • Windows 7 (64 bit) – Service Pack 1, Windows 8 (64 bit)
  • DirectX 11 graphics card with 1GB Video RAM
  • Quad core CPU
  • 8GB Memory
https://robertsspaceindustries.com/download

I guess under such a requirement, having virtual cores is better than nothing. But IMO it is better to buy more cores than to pay for virtual ones, if you know you can use those extra cores. If HT were "free", then sure, it would be good to have in case it helps. But as it is, Intel fuses off and cripples its own silicon to segment the market, and consumers should be informed of what they are really getting for their money.
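To see exactly what you're getting for your money, a quick check of physical versus virtual cores; a minimal sketch using the third-party psutil package (pip install psutil):

```python
import psutil  # third-party: pip install psutil

physical = psutil.cpu_count(logical=False)  # real cores
logical = psutil.cpu_count(logical=True)    # real cores + HT siblings

print(f"{physical} physical cores, {logical} hardware threads")
if physical and logical and logical > physical:
    print(f"SMT/HT enabled: {logical - physical} of those threads are virtual.")
else:
    print("No SMT: every thread maps to a real core.")
```

On a G4620 this would report 2 physical cores and 4 threads; on an i5-7400, 4 and 4.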
lopri" said:
What distinguishes Pentium from i3?

AVX
Thank you. I have long lost track of all the different SKUs Intel churns out.
 

Ranulf

Platinum Member
Jul 18, 2001
2,331
1,139
136
Remember those arguing for saving $100 with a 955BE instead of a 2500K back in 2011, because that $100 could be spent on a better GPU?

Guess who looks foolish now?

Sure, but that argument has been made for years since that generation, and given past history it was good advice in 2011. Post-2013, not so much.

Yeah, but who in 2011 planned on keeping the same CPU for five to seven years?

Indeed. I didn't with my 955. The AMD chip was the best option for those on a tight budget (as it was for several years). If I had known then what I know now, I would have bought a 2600K instead of my 2500K. I did OK upgrading my 955 system to an 8350, but meh, I would have been better off building a whole new system around an 8350 and a new 970/990 board.
 
  • Like
Reactions: f2bnp
Feb 25, 2011
16,777
1,466
126
Sure, but that argument has been made for years since that generation, and given past history it was good advice in 2011. Post-2013, not so much.



Indeed. I didn't with my 955. The AMD chip was the best option for those on a tight budget (as it was for several years). If I had known then what I know now, I would have bought a 2600K instead of my 2500K. I did OK upgrading my 955 system to an 8350, but meh, I would have been better off building a whole new system around an 8350 and a new 970/990 board.
I think, in hindsight, that the clues were there.

Intel almost doubled per-core performance in 2006 with the Core 2 architecture. Then bolted four of them together to release the Q6600 by 2008. And in 2010 or 2011, the top-end i7s had double the performance of THAT.

I think we were mostly too freaked out to think, "Gee... they can't keep this up much longer, can they?"
 

VirtualLarry

No Lifer
Aug 25, 2001
56,230
9,991
126
Ok, got my G4600 Kaby Lake Pentium dual-core with Hyper-Threading (3.6GHz) today.

Put it into my DeskMini that had a G3900 Skylake Celeron in it. Still rocking only a single stick of DDR4-2400, so I'm in single-channel. (I thought ahead to Kaby Lake when I got RAM for my DeskMini units; that's why I got 2400.)

For browsing the forums with Waterfox 50.1.0 in Win10 1607 64-bit, it seems marginally faster, but not by much. I'm guessing that's just the single-threaded speed difference between the 2.8GHz SKL and the 3.6GHz KBL. I doubt the HT is doing much yet: when I tested Waterfox (or was it Firefox?) in Peacekeeper a while back, my BCLK-OCed i5-6400 and my G4400 were running at roughly the same clock, and despite having double the threads the i5 still scored pretty much the same.

Edit: CPU-Z 1.78 benchmark results:
ST: 1793
MT: 4064
(single-channel DDR4-2400)
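Rough sanity check on what those two scores imply, just arithmetic on the numbers above:

```python
# CPU-Z 1.78 scores from the post above (G4600, 2C/4T, single-channel DDR4-2400).
st, mt = 1793, 4064
cores = 2

scaling = mt / st              # observed multi-threaded scaling: ~2.27x
ht_gain = scaling / cores - 1  # whatever exceeds a perfect 2x is the HT uplift

print(f"MT/ST scaling: {scaling:.2f}x")
print(f"Implied HT uplift over {cores} real cores: {ht_gain:.0%}")
```

So in this benchmark HT is worth roughly 13% on top of the two real cores.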
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Ok, got my G4600 Kaby Lake Pentium dual-core with Hyper-Threading (3.6GHz) today.

Put it into my DeskMini that had a G3900 Skylake Celeron in it. Still rocking only a single stick of DDR4-2400, so I'm in single-channel. (I thought ahead to Kaby Lake when I got RAM for my DeskMini units; that's why I got 2400.)

For browsing the forums with Waterfox 50.1.0 in Win10 1607 64-bit, it seems marginally faster, but not by much. I'm guessing that's just the single-threaded speed difference between the 2.8GHz SKL and the 3.6GHz KBL. I doubt the HT is doing much yet: when I tested Waterfox (or was it Firefox?) in Peacekeeper a while back, my BCLK-OCed i5-6400 and my G4400 were running at roughly the same clock, and despite having double the threads the i5 still scored pretty much the same.

Edit: CPU-Z 1.78 benchmark results:
ST: 1793
MT: 4064
(single-channel DDR4-2400)
Try adding another stick of memory. You should see enough improvement to be worth the extra stick.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,230
9,991
126
Try adding another stick of memory. You should see enough improvement to be worth the extra stick.

That's kind of what I figured. That's next on my list to purchase, another 2x8GB DDR4 SO-DIMM kit.


Wow, HEVC and VP9, up to 10-bit and 8K. The media-decode block on KBL is impressive!

I'm thinking of getting a few of these for F&F. Time to upgrade, everybody!
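One way to confirm the fixed-function block is actually doing the work, rather than a CPU fallback: time a null decode with and without hwaccel. A rough sketch, assuming ffmpeg on the PATH; sample.mkv is a hypothetical local HEVC or VP9 clip, and the hwaccel name depends on the OS (dxva2/d3d11va on Windows, vaapi or qsv on Linux):

```python
import subprocess
import time

def null_decode_seconds(path: str, hwaccel: str | None = None) -> float:
    """Decode a clip to null output as fast as possible; return wall-clock seconds."""
    cmd = ["ffmpeg", "-hide_banner", "-loglevel", "error"]
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]
    cmd += ["-i", path, "-f", "null", "-"]
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

clip = "sample.mkv"  # hypothetical test clip
sw = null_decode_seconds(clip)
hw = null_decode_seconds(clip, hwaccel="d3d11va")  # pick the method for your OS
print(f"software: {sw:.1f}s, hardware: {hw:.1f}s")
```

If the hardware path is much faster (and Task Manager shows the CPU mostly idle), the decode block is doing its job.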
 
Last edited:
  • Like
Reactions: Ironhide1
Mar 10, 2006
11,715
2,012
126
I don't get the idea of "stopgap" CPUs/components. That's just double dipping, handing your money over twice to the CPU company when you could just hand it over once for a good CPU and not need to come back for 3-5 years+.

I can virtually guarantee that anybody who buys a 7700K for a gaming PC today will not "need" to upgrade for a long, long time. Even the i5 7600 isn't going to be a problem for a long time.

But a Celeron, Pentium, or i3? That'll need to be upgraded much sooner.

Now of course this argument works IF you have the money to spend upfront for the i7. If you don't, then you buy the best you can and ride it as long as possible. But people who CAN afford the 7700K but end up just throwing money away on stopgaps are just enriching Intel and its shareholders at the expense of their own pocketbooks.
 
  • Like
Reactions: richierich1212

crashtech

Lifer
Jan 4, 2013
10,521
2,111
146
I don't get the idea of "stopgap" CPUs/components. That's just double dipping, handing your money over twice to the CPU company when you could just hand it over once for a good CPU and not need to come back for 3-5 years+.

I can virtually guarantee that anybody who buys a 7700K for a gaming PC today will not "need" to upgrade for a long, long time. Even the i5 7600 isn't going to be a problem for a long time.

But a Celeron, Pentium, or i3? That'll need to be upgraded much sooner.

Now of course this argument works IF you have the money to spend upfront for the i7. If you don't, then you buy the best you can and ride it as long as possible. But people who CAN afford the 7700K but end up just throwing money away on stopgaps are just enriching Intel and its shareholders at the expense of their own pocketbooks.
That's not as true for those who buy and then SELL their hardware. For instance, if I buy a $250 CPU, keep it for a year, then sell it for $200, I've more or less "rented" that CPU for less than $5/mo. I don't mind the extra expenditure because it's a hobby for me, not just a business decision. It's not the best use of money, but neither is it as bad as some make it out to be.
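The arithmetic checks out; generalized (hypothetical helper, numbers from the post):

```python
def monthly_cost(buy: float, sell: float, months: int) -> float:
    """Net depreciation per month when you buy a part, use it, and resell it."""
    return (buy - sell) / months

# $250 CPU, sold a year later for $200:
print(f"${monthly_cost(250, 200, 12):.2f}/mo")  # -> $4.17/mo, under the $5 stated
```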
 
  • Like
Reactions: poofyhairguy
Mar 10, 2006
11,715
2,012
126
That's not as true for those who buy and then SELL their hardware. For instance, if I buy a $250 CPU, keep it for a year, then sell it for $200, I've more or less "rented" that CPU for less than $5/mo. I don't mind the extra expenditure because it's a hobby for me, not just a business decision. It's not the best use of money, but neither is it as bad as some make it out to be.

If it is a hobby, then all bets are off. I personally do that too; I buy and sell hardware like crazy. The hobby costs me money, but since I enjoy it, it's all worth it -- to me! :)

I'm more talking about the people who just want to build a system to game and want to get the best value for their money over the long term.
 

crashtech

Lifer
Jan 4, 2013
10,521
2,111
146
If it is a hobby, then all bets are off. I personally do that too; I buy and sell hardware like crazy. The hobby costs me money, but since I enjoy it, it's all worth it -- to me! :)

I'm more talking about the people who just want to build a system to game and want to get the best value for their money over the long term.
I mentally lumped poofyhairguy into our hobbyist category, but maybe we should ask him. :)
 

VirtualLarry

No Lifer
Aug 25, 2001
56,230
9,991
126
Well, after using it for a few days, I have to say, I think the G4600 is a winner for basic web-browsing tasks and whatnot. Still, I'm only using single-channel memory, and when the window is maximized full-screen in 4K, scrolling this forum in Waterfox 50.1.0 is a little bit choppy.

I've got some more RAM incoming, so I'll be able to see if that improves with dual-channel. I have great hope. :)
 
  • Like
Reactions: poofyhairguy

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
That's not as true for those who buy and then SELL their hardware. For instance, if I buy a $250 CPU, keep it for a year, then sell it for $200, I've more or less "rented" that CPU for less than $5/mo. I don't mind the extra expenditure because it's a hobby for me, not just a business decision. It's not the best use of money, but neither is it as bad as some make it out to be.

The same reason why I don't recommend i5s since Devil's Canyon: it doesn't make sense not to spend the extra $100 for the better i7 when you can recoup the difference later by selling it anyway.

Heck, if I sold my 4790K now on eBay I would have only lost $30, which is a pittance for 2.5 years of use.
 
  • Like
Reactions: richierich1212

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
That's kind of what I figured. That's next on my list to purchase, another 2x8GB DDR4 SO-DIMM kit.


Wow, HEVC and VP9, up to 10-bit and 8K. The media-decode block on KBL is impressive!

I'm thinking of getting a few of these for F&F. Time to upgrade, everybody!

And I think the G4560 keeps the same capabilities, right?
That would make a really nice and cheap HTPC.

One question: can you use the IGP's video decode for the web browser and other software (like MPC) while you run an older discrete VGA as your video out?