Kabini 5350 a good option for basic usage?

Page 9

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I do apologize to anyone, especially the moderators, if an honest-to-god consideration of a Kabini or APU platform started any debates. I was merely an interested customer who was swayed away by better performance once my budget changed.

I will say that if the GT 740 GDDR5 card USER8000 found for $60 had been $90 like the rest of them, there is no doubt in my mind I would have jumped on the 7600. A quad core with a considerable iGPU, as good as if not better than the HD 530 in any debate, is surely more affordable than Skylake, if not priced nearly the same, plus the bonus of 4 threads for $85.

With $130 as my old budget I would have jumped on the 5400K either way. I did some mods to the case for the build, and I would be amazed if even my non-K i5 2500 and 660 overheated in it, so in the end wattage didn't matter to me, only what graphics + CPU power I could get for my bottom dollar.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
I didn't see you complaining when GameGPU used out of spec DDR3-2666 for the AMD platform in your thread (A10 can use it at default @ 2400). Your double standards are laughable.

Not to mention that even cheap mobos feature RAM OC, so you don't need a Z mobo.
By the way, does anyone know if the Skylake iGPU is overclockable like on Haswell's non-K CPUs?


http://www.asrock.com/mb/Intel/H110M-HDS/?cat=Beta

http://www.newegg.com/Product/Produ...3157685&cm_re=PCIe_SSD-_-13-157-685-_-Product
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I mentioned in one of the other APU threads that I would benchmark GTA V and Hard Reset using in-game benchmarks; I'm not a fan of canned benchmarks. I will run 1024x768 and 1280x800, and at both resolutions GTA V will be run at the lowest possible settings while Hard Reset will be maxed out with FXAA disabled.

More or less benching to see where a $140 7870K lands in comparison to a $45 G1820 and a $60 GT 740. Add in the price difference of a cheap 4GB stick of DDR3-1333 versus a 2133+ set of dual-channel DDR3.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Officially supported RAM on the G4500 and Core i3 6100 is 2133MHz DDR3 and DDR4. Hence any review overclocking RAM on the Intel platform should be doing the same for the AMD one too, and anyway, last time I checked, most users gaming on an iGPU will probably not be overclocking RAM. In fact I don't know a single person who overclocks RAM.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Officially supported RAM on the G4500 and Core i3 6100 is 2133MHz DDR3 and DDR4.

The boards focusing on just DDR4 are mostly for overclocking, I guess? There are a few DDR3-based motherboards and even a few ASRock combo boards supporting both DDR3 and DDR4.

A DDR3-based Skylake motherboard could do just as well as a DDR4 one if you're not using the iGPU, not overclocking, and sticking to something like a Celeron, i3, or locked-down i5?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Officially supported RAM on the G4500 and Core i3 6100 is 2133MHz DDR3 and DDR4.

1600MHz DDR3L and 2133MHz DDR4 as validated (any memory). However, the IMC is validated for 4166MHz with DDR4; I can't remember the figure for DDR3L.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
The boards focusing on just DDR4 are mostly for overclocking, I guess? There are a few DDR3-based motherboards and even a few ASRock combo boards supporting both DDR3 and DDR4.

A DDR3-based Skylake motherboard could do just as well as a DDR4 one if you're not using the iGPU, not overclocking, and sticking to something like a Celeron, i3, or locked-down i5?

Remember you need DDR3L, not DDR3. That means 1.35V DDR3 and not 1.5V.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Remember you need DDR3L, not DDR3. That means 1.35V DDR3 and not 1.5V.

I have heard mixed answers on this, but for the most part I wasn't comfortable going with a G4500 and trying to stick one of my 1.5V sticks in there anyway.

Many people have 1.5V DDR3, of course. Is 1.35V DDR3 with a DDR3-based LGA1151 motherboard simply a cost-effective alternative to DDR4 and DDR4-based motherboards?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
It's just an alternative during the transition to DDR4 only. You can use 1.5V DDR3, but it's just not validated with any memory.

DDR3(L) based LGA1151 boards account for around 1/5th of the boards.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
http://www.techspot.com/review/806-amd-kabini-vs-intel-bay-trail-d/page9.html

"So starting with the $50 Celeron G1820... is it a better buy than the Athlon 5350? Keep in mind you can purchase LGA1150 H81 boards for as little as $50, so the build cost is the same as the Athlon 5350 with an AM1 board.

In our application tests, the Celeron G1820 was 20% faster than the Athlon 5350 on average. Moreover, it was 63% faster at encoding and even 10% faster in our gaming tests. With that said, the G1820 consumes 45% more power, so it comes down to whether you care more about performance or power consumption.

Given that the extra power scales pretty well with the extra performance that you will see, the Celeron G1820 seems like a no brainer. For the same $50, AMD offers the A4-4020 which is a 200MHz higher clocked version of the A4-4000 we tested with, and given how that performed the G1820 shouldn't be threatened."

[Image: Application_04.png — application benchmark chart from the review]


By default, I can't see why you'd go Kabini over a Haswell Celeron (or the upcoming Skylake Celerons).
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
By default, I can't see why you'd go Kabini over a Haswell Celeron (or the upcoming Skylake Celerons).

Not sure if that was a statement towards me, but I already ordered a G1820, an H81 motherboard, and even a 1GB GT 740 GDDR5. :D

As for Kabini, it's a bottom-dollar solution that appears to work for people who watch Netflix/YouTube and browse the web. Maybe not an HTPC option for some, but a few bucks more gets you a 5400K + FM2, which has much more going for it. :thumbsup:
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Not sure if that was a statement towards me, but I already ordered a G1820, an H81 motherboard, and even a 1GB GT 740 GDDR5. :D

As for Kabini, it's a bottom-dollar solution that appears to work for people who watch Netflix/YouTube and browse the web. Maybe not an HTPC option for some, but a few bucks more gets you a 5400K + FM2, which has much more going for it. :thumbsup:

Netflix is a short one: they want to re-encode all 1080p into HEVC, making Kabini unable to play it. It's obviously already off the table at 4K. Kabini and YouTube aren't a match made in heaven either.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Not to mention that even cheap mobos feature RAM OC, so you don't need a Z mobo.
By the way, does anyone know if the Skylake iGPU is overclockable like on Haswell's non-K CPUs?


http://www.asrock.com/mb/Intel/H110M-HDS/?cat=Beta

http://www.newegg.com/Product/Produ...3157685&cm_re=PCIe_SSD-_-13-157-685-_-Product


The highest I could OC the DDR3 on the ASUS H110M-K D3 was 1866MHz. The board's BIOS allowed me to set the RAM higher, but the system wouldn't boot.

Also, there were no settings in the BIOS to OC the iGPU.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Not sure if that was a statement towards me, but I already ordered a G1820, an H81 motherboard, and even a 1GB GT 740 GDDR5. :D

As for Kabini, it's a bottom-dollar solution that appears to work for people who watch Netflix/YouTube and browse the web. Maybe not an HTPC option for some, but a few bucks more gets you a 5400K + FM2, which has much more going for it. :thumbsup:

Just saying. I'm using a G1850 and an H81M-S2H now and it's sufficient. Which mobo?
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
It's just an alternative during the transition to DDR4 only. You can use 1.5V DDR3, but it's just not validated with any memory.

DDR3(L) based LGA1151 boards account for around 1/5th of the boards.

Most 1.5V memory can be undervolted to 1.35V.
 

The Stilt

Golden Member
Dec 5, 2015
1,709
3,057
106
AMD has fallen behind nVidia and Intel even on fronts where they have usually been extremely strong. It also happened extremely fast, much faster than the downfall on the x86 front. Even the most recent AMD dGPUs are now severely lacking in capabilities, which makes them undesirable for HTPC use. I fear it might hurt AMD in embedded markets too.

- <= GCN 1.1 (UVD 4.x) H.264 HW decode supported only up to 1080P, no HEVC or VP9 support
- GCN 1.2 (UVD 5.x / 6.x) H.264 HW decode supported up to 4K, HEVC HW decode supported up to 4K (8-bit only), no VP9 support

The lack of VP9 support is extremely bad for AMD, since YouTube now uses it as the primary format. You can still get H.264 on YouTube, but you will lose all the features of the HTML5 player at the same time.

Also, while the HEVC HW decode implementation in Carrizo and Fiji works extremely well, it only supports 8-bit color depth. Usually the color depth is irrelevant for this kind of product, but as it happens the Ultra HD Blu-ray specification dictates that all of the material must be encoded with 10-bit color depth. Something that AMD's UVD cannot currently decode :(

AMD needs to implement VP9 HW decode, 10-bit HEVC HW decode and preferably HEVC HW encode (VCE) support to all of their GPUs, regardless of the segment (dGPU, SoC iGPU, etc) ASAP.
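The support matrix from the bullet points above can be summarized as a small lookup table. This is just a sketch restating the post's claims, not an exhaustive driver query; the dict keys and the `can_decode` helper are made up for illustration:

```python
# HW decode capabilities per UVD generation, as summarized in the post above.
# Values map codec -> maximum supported resolution (None = no HW decode).
UVD_DECODE = {
    "UVD 4.x (<= GCN 1.1)": {"H.264": "1080p", "HEVC 8-bit": None,
                             "HEVC 10-bit": None, "VP9": None},
    "UVD 5.x/6.x (GCN 1.2)": {"H.264": "4K", "HEVC 8-bit": "4K",
                              "HEVC 10-bit": None, "VP9": None},
}

def can_decode(uvd: str, codec: str) -> bool:
    """True if the given UVD generation has HW decode for the codec."""
    return UVD_DECODE[uvd].get(codec) is not None

# Neither generation handles YouTube's VP9 or UHD Blu-ray's 10-bit HEVC:
print(can_decode("UVD 5.x/6.x (GCN 1.2)", "VP9"))          # False
print(can_decode("UVD 5.x/6.x (GCN 1.2)", "HEVC 10-bit"))  # False
```

Which is exactly the gap the post is complaining about: even the newest UVD block covers neither of the two formats that matter most for streaming and UHD playback.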
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
They used the same results from the A8-7600 review they made in February 2014 in the Core i7 6700K review they posted in September 2015.

Again, you're the one crying about drivers. Is it my fault or Intel's fault that AMD doesn't release good drivers upon the release of their products?

Since February 2014, AMD has released more than 10-12 drivers. If you believe this is OK and that it makes them more reliable than my own results, which use the latest drivers and clearly list all the hardware/software used, then I don't have anything else to say about that review site or you.

Great, now if you'd just stop whining every time it's shown that AMD has utterly horrid release drivers, and acting as if that is anyone but AMD's fault, there wouldn't be a problem.

You missed the OP saying his wife started playing WOT. So the recommendations changed since then towards WOT playability.

Yes, I did miss it. Sorry about that. I still think that almost any recent CPU, two-threaded or quad-threaded, matched with a decent dGPU will outperform (in gaming) any APU available today.
 

Maxima1

Diamond Member
Jan 15, 2013
3,549
761
146
Yes, I did miss it. Sorry about that. I still think that almost any recent CPU, two-threaded or quad-threaded, matched with a decent dGPU will outperform (in gaming) any APU available today.
AMD also has this problem of castrating their own APUs, which I'm assuming is to stop cannibalization of their discrete business. They use previous gen graphics tech (lulz to the recent APU newcomers to legacy support) and have been stuck on garbage bandwidth/holding back on shaders. Hell, Kaveri couldn't even reach 1 TFLOP. They've been basically recycling the same APU for years.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
AMD needs to implement VP9 HW decode, 10-bit HEVC HW decode and preferably HEVC HW encode (VCE) support to all of their GPUs, regardless of the segment (dGPU, SoC iGPU, etc) ASAP.

Indeed. I used to recommend their APUs for HTPC, but it's becoming harder and harder to do, as the streaming websites continually upgrade their software, and AMD has failed to keep up with them, while Intel and nVidia have been keeping up just fine.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Again, you're the one crying about drivers. Is it my fault or Intel's fault that AMD doesn't release good drivers upon the release of their products?
Great, now if you'd just stop whining every time it's shown that AMD has utterly horrid release drivers, and acting as if that is anyone but AMD's fault, there wouldn't be a problem.

You do know that Intel and NVIDIA have also released more than a dozen drivers since February 2014. So according to you, both Intel's and NVIDIA's release drivers are utterly horrid as well.

Yes, I did miss it. Sorry about that. I still think that almost any recent CPU, two-threaded or quad-threaded, matched with a decent dGPU will outperform (in gaming) any APU available today.

That is true, but in the majority of cases you also have to spend more in order to get higher dGPU performance. And you will have to compromise in both cases (at the same price point). You can get higher dGPU performance but you will lose two cores/threads, or you can get higher CPU performance (extra cores/threads) but settle for lower iGPU performance.
 

Maxima1

Diamond Member
Jan 15, 2013
3,549
761
146
That is true, but in the majority of cases you also have to spend more in order to get higher dGPU performance. And you will have to compromise in both cases (at the same price point). You can get higher dGPU performance but you will lose two cores/threads, or you can get higher CPU performance (extra cores/threads) but settle for lower iGPU performance.

AtenRa, it's just a few Happy Meals.
 

The Stilt

Golden Member
Dec 5, 2015
1,709
3,057
106
Hell, Kaveri couldn't even reach 1 TFLOP. They've been basically recycling the same APU for years.

AMD could easily have pushed the Kaveri / Godavari iGPU to 1 TFlops; however, they (finally) realized there was no point. On Steamroller APUs the bandwidth available at DDR3-2400 (unofficially supported) is sufficient for ~614 GFlops (512 SPs @ ~600MHz), while the iGPU of the fastest Steamroller APUs (A10-78*0K, * = 7/8/9) already operates at 866MHz. The improved frame buffer compression introduced in GCN 1.2 helps a bit; however, it is still like treating terminal cancer with a tablet of Aspirin.

AMD's inability to implement a faster frame buffer solution (either on-chip cache or faster DRAM) is the biggest nail in the coffin of the APU concept.

You know the situation is pretty bad when the gaming performance is better at 553 GFlops @ 38.4GB/s bandwidth than at 737 GFlops @ 34.1GB/s bandwidth.
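The arithmetic behind those figures can be sketched in a few lines, assuming the usual GCN rule of 2 FLOPs per shader per clock (one FMA) and a 64-bit (8-byte) DDR3 channel; both helper function names are made up for this sketch:

```python
# Peak single-precision throughput of a GCN iGPU:
# shaders x 2 FLOPs per cycle (fused multiply-add) x clock in GHz.
def peak_gflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz / 1000.0

# Dual-channel DDR3 bandwidth: transfer rate (MT/s) x 8 bytes per
# transfer (64-bit channel) x number of channels, in GB/s.
def ddr3_bandwidth_gbs(mts: int, channels: int = 2) -> float:
    return mts * 8 * channels / 1000.0

print(peak_gflops(512, 600))     # 614.4 GFlops, the figure quoted above
print(ddr3_bandwidth_gbs(2400))  # 38.4 GB/s for dual-channel DDR3-2400
print(ddr3_bandwidth_gbs(2133))  # ~34.1 GB/s for dual-channel DDR3-2133
```

The mismatch is the point of the post: the 866MHz parts would need roughly 886 GFlops worth of bandwidth by the same rule, far beyond what dual-channel DDR3 can feed.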
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
You do know that Intel and NVIDIA have also released more than a dozen drivers since February 2014. So according to you, both Intel's and NVIDIA's release drivers are utterly horrid as well.

And yet, I haven't seen a single person in this thread whining and crying about drivers used with either Intel or nVidia products. That tells me that only AMD has horrid release drivers, since only AMD has people whining when their release drivers are used. See how simple this happens to be?

That is true, but in the majority of cases you also have to spend more in order to get higher dGPU performance. And you will have to compromise in both cases (at the same price point).

Which I think is the best argument against attempting to game with any company's iGPU. For just a few dollars more, you can get a faster CPU, plus a much faster GPU. I'd take an overclocked 860k with a GT 740 over any APU available at any price without a dGPU, and would recommend exactly that to anyone who asked about APUs/iGPUs, if they were wanting to game.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Which I think is the best argument against attempting to game with any company's iGPU. For just a few dollars more, you can get a faster CPU, plus a much faster GPU. I'd take an overclocked 860k with a GT 740 over any APU available at any price without a dGPU, and would recommend exactly that to anyone who asked about APUs/iGPUs, if they were wanting to game.

Yea, and by spending a little more you can have a better GPU than the GT 740. Spending another $50 on top of that will get you a faster CPU, and another $100 more will get you even better performance, and so on.
We can always have more performance if we spend more; the point is that at the same low budget that APUs are created for in the first place, you always have to compromise between a better CPU and a better GPU.
 
Aug 11, 2008
10,451
642
126
OK, let's take that argument in the other direction. APUs are too expensive, so let's game on a $200 Bay Trail laptop. Oh wait, let's save $100 more and use a Bay Trail tablet.

Nearly everything we do is a compromise of some sort. Using an APU vs a cheap cpu and discrete card compromises a *lot* of performance for a very minimal cost savings, at a performance level where better performance is sorely needed. Even the most diehard APU fans have to construct an almost absurdly cost constrained scenario to justify an APU for gaming.
 