Kabini 5350 a good option for basic usage?


USER8000

Golden Member
Jun 23, 2012
1,542
780
136
The link is fixed now. BTW, Silent PC Review has at least 1,000x more credibility than you happen to have. Care to link to some of these "others", or are we talking about posters on other forums/these forums?

You mean literally most reviews, including the one at AnandTech itself:

http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/20

That's also with them using the HD 530 IGPs in the Core i5 6600K and Core i7 6700K, which are clocked around 8.6% higher than the one in the Core i3 6100 and G4500.

Plus, if you look at that review, the Haswell Core i3 4330 has the same IGP as the Core i7 4770K, and the latter is also clocked 8.6% higher.

There is a huge difference between the Haswell Core i3 and Core i7 performance which goes beyond the clockspeed difference of the IGP, i.e., at least 14% to 15%, and in some games it's worse.

Extrapolate that to the Skylake Core i3 6100 and G4500 and it means they are going to be much slower. People make the big mistake of looking at Intel Core i5 and Core i7 IGP performance in the best-case scenario and equating it to the cheaper CPUs.

It didn't hold for Haswell and it's unlikely to be the case for the cheaper Skylake CPUs.
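A quick sanity check of the scaling argument above, using only the figures quoted in the post (the 8.6% iGPU clock delta and the 14-15% observed gap between i3 4330 and i7 4770K):

```python
# Figures taken from the post above, not from a fresh benchmark run.
clock_ratio = 1.086                # i7 4770K iGPU clock vs i3 4330 (same HD 4600)
observed_low, observed_high = 1.14, 1.15   # claimed overall performance gap

# If iGPU performance scaled purely with clock, the i7 would only be
# ~8.6% faster. Any gap beyond that must come from something else
# (CPU side, memory, turbo behaviour, etc.).
extra_low = observed_low / clock_ratio - 1
extra_high = observed_high / clock_ratio - 1
print(f"gap not explained by clock: {extra_low:.1%} to {extra_high:.1%}")
```

That leftover ~5-6% is the residue the post attributes to the stronger CPU behind the i7's IGP, which is why equal-clocked IGPs on cheaper chips won't necessarily match the reviewed numbers.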

Have you noticed there is a distinct lack of reviews comparing the Core i3 6100 and G4500 IGP performance with similarly priced AMD chips?

Here is a review actually pitting both chips:

http://www.clubedohardware.com.br/artigos/teste-dos-processadores-a10-7870k-vs-core-i3-6100/3192/8

Yet if we look at the scaling in performance between the Core i3 4330 and Core i7 4770, we are seeing something similar with the Core i3 6100 too.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
The Power Color R9 270 2GB is also on sale for $89.99 AR, free shipping:

http://forums.anandtech.com/showthread.php?t=2455804

P.S. R9 270 also works with CX430 (as it only needs one 6 pin PCIe).

Oh, I would get that then. The R7 360 might support FreeSync, but the R9 270 is another tier in performance above it.

It's 1280 shaders against 768 shaders, and it has far more memory bandwidth.
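For a rough sense of that gap, here is a back-of-the-envelope sketch. The shader counts are from the post; the clocks, memory bus widths, and effective memory speeds are approximate reference specs assumed here for illustration:

```python
# Rough throughput comparison of the two cards discussed above.
# Clocks and memory speeds are assumed approximate reference specs.
def gflops(shaders, clock_ghz):
    # GCN shaders do 2 FLOPs (one FMA) per clock
    return shaders * 2 * clock_ghz

def bandwidth_gbs(bus_bits, effective_gtps):
    # bus width in bits / 8 = bytes per transfer, times transfer rate
    return bus_bits / 8 * effective_gtps

r9_270 = (gflops(1280, 0.925), bandwidth_gbs(256, 5.6))  # ~2368 GFLOPS, ~179 GB/s
r7_360 = (gflops(768, 1.05), bandwidth_gbs(128, 6.5))    # ~1613 GFLOPS, ~104 GB/s
print("R9 270:", r9_270)
print("R7 360:", r7_360)
```

Even on these rough numbers the R9 270 has roughly 1.5x the shader throughput and 1.7x the memory bandwidth, which is consistent with calling it a tier above.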

Edit to post.

At this rate,the OP's other half is going to end up with a rig faster than his!! :p
 
Last edited:

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
Equate that to the Skylake Core i3 6100 and G4500 and that means they are going to be much slower. People make this big mistake of looking at Intel Core i5 and Core i7 IGP performance in the best case scenario and equating it to the cheaper CPUs.

It didn't hold for Haswell and its unlikely to be the case for the cheaper Skylake CPUs.

Have you noticed there is a distinct lack of reviews comparing the Core i3 6100 and G4500 IGP performance with similarly priced AMD chips?

Here's one. Green team advantage is greatly inflated by some users. Even at 1080p (which favours AMD by forcing an iGPU bottleneck), A8-7600 is ~25-40% faster than a similarly priced Pentium G4500. Yes, AMD (usually) leads iGPU performance at the same price point, but Skylake GT2 is still able to run the same games at (slightly) lower resolution/image quality (it's in the same league as the A6-7400 that some people recommended here). In return, you're getting much better ST CPU performance and the best platform for dGPU gaming and future upgrades. Anyway, I wouldn't like to be stuck with a bandwidth starved iGPU in the X1/PS4-era.
 
Last edited:

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Let's see:

First of all, the results are from the Core i7 6700K; the Pentium's HD 530 is 10-20% slower.

Secondly, they used DDR4-2666 memory on the Core i7 and only DDR3-1866 on the A8-7600.

Thirdly, there is no driver version to be found anywhere. That is because they used the original February 2014 A8-7600 review results in the Core i7 6700K review posted in September 2015. That makes them 1000 times more credible than me :rolleyes:

I love how, when asked for links, you can never, ever provide any, except for ones to your own blog. And it isn't nVidia's or Intel's fault that AMD is so horrible with their release drivers, now is it? Do you honestly think the OP's wife wants to wait an entire year to get decent framerates in the game she plays? :rolleyes:

Also, those games are more than 5 years old (Crysis released in 2007, the Resident Evil 5 demo in 2008, etc.), with many of them being DX-9/10, which are more CPU-bound than iGPU-bound at 1600x900.

His wife is playing her game at ~half the resolution as they used in the review I linked. That means, according to what you just wrote here, she's going to need more CPU horsepower than GPU horsepower, with which I completely agree. That means she wants an Intel CPU, btw.

In newer games (DX-11/ Mantle and DX-12) that really need more iGPU performance, the Kaveri A8-7600 iGPU is 30-50% faster than HD530 found on the Core i3 6300 when both are using 2133MHz Memory.

Mantle "supports" what, 8 or 10 games? Maybe 15? I'd bet you actual money that the only game that the OP's wife plays doesn't happen to be one of them. Okay, I found the entire list of games with Mantle support, and it is 12: https://en.wikipedia.org/wiki/Category:Video_games_that_support_Mantle

Had AMD instead researched how to have a GPU driver that uses more than one core, like nVidia has, it would have benefited them, instead of being a waste of their money. BTW, I noticed that you provided exactly zero links in this entire post. Will your reply to this post be another instance of you moving the goalpost again, or will you have actual proof of your claims this time?
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Mantle "supports" what, 8 or 10 games? Maybe 15? I'd bet you actual money that the only game that the OP's wife plays doesn't happen to be one of them. Okay, I found the entire list of games with Mantle support, and it is 12: https://en.wikipedia.org/wiki/Category:Video_games_that_support_Mantle

LOL, considering Mantle code is being used as the foundation directly for Vulkan -- and in principle for DirectX 12 -- Mantle will unofficially support pretty much every game moving forward. Kudos for pushing a distorted agenda, though.

Nvidia isn't exactly having a great month right now. AMD seems to be bouncing back with a lot of positive buzz lately. The Samsung fab deal could really put them in a good place.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
I love how, when asked for links, you can never, ever provide any, except for ones to your own blog. And it isn't nVidia's or Intel's fault that AMD is so horrible with their release drivers, now is it? Do you honestly think the OP's wife wants to wait an entire year to get decent framerates in the game she plays? :rolleyes:

Kaveri release drivers were more than fine; I really don't know what you are talking about here.

Links have been posted before, but here once again with the actual game (WoT) the OP's wife is playing.

Oh, and all of my latest reviews (see links in my sig) for the last 3-4 years are posted on the AT forums, not my blog ;)

http://gamegpu.ru/videoobzory/world-of-tanks-9-13-video-obzor-apu.html

[Benchmark chart: gamegpu.ru World of Tanks 9.13 APU results (wt2.jpg)]



His wife is playing her game at ~half the resolution as they used in the review I linked. That means, according to what you just wrote here, she's going to need more CPU horsepower than GPU horsepower, with which I completely agree. That means she wants an Intel CPU, btw.

Unlike DX-9, new DX-11 games need more iGPU performance, as shown above. At lower resolution his wife will be able to increase the image quality far higher with the Kaveri APU than with the Pentium's HD 530 graphics. That is because the HD 530, like all Intel graphics before it, takes a big hit in performance when we increase the image quality. It's the reason why Intel likes 720p Low settings ;)

Mantle "supports" what, 8 or 10 games? Maybe 15? I'd bet you actual money that the only game that the OP's wife plays doesn't happen to be one of them. Okay, I found the entire list of games with Mantle support, and it is 12: https://en.wikipedia.org/wiki/Category:Video_games_that_support_Mantle

There are only 6 released games that support Mantle; the rest will use DX-12, which is almost the same.

Had AMD instead researched how to have a GPU driver that uses more than one core, like nVidia has, it would have benefited them, instead of being a waste of their money. BTW, I noticed that you provided exactly zero links in this entire post. Will your reply to this post be another instance of you moving the goalpost again, or will you have actual proof of your claims this time?

DX-12 is far better for the future of PC gaming than DX-11 multithreaded drivers. AMD wisely devoted their time and money to DX-12, both in hardware (GCN) and software.

Proof above ;)
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Do you really want to go the gamegpu route after you got exposed on your double standard issues last time?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I love how, when asked for links, you can never, ever provide any, except for ones to your own blog. And it isn't nVidia's or Intel's fault that AMD is so horrible with their release drivers, now is it? Do you honestly think the OP's wife wants to wait an entire year to get decent framerates in the game she plays? :rolleyes:

Had AMD instead researched how to have a GPU driver that uses more than one core, like nVidia has, it would have benefited them, instead of being a waste of their money. BTW, I noticed that you provided exactly zero links in this entire post. Will your reply to this post be another instance of you moving the goalpost again, or will you have actual proof of your claims this time?

Bingo :thumbsup:
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Do you really want to go the gamegpu route after you got exposed on your double standard issues last time?

You have been provided with answers to your gamegpu questions. But since you don't like the results, you keep singing the same tune.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
The Power Color R9 270 2GB is also on sale for $89.99 AR, free shipping:

http://forums.anandtech.com/showthread.php?t=2455804

P.S. R9 270 also works with CX430 (as it only needs one 6 pin PCIe).

Oh, I would get that then. The R7 360 might support FreeSync, but the R9 270 is another tier in performance above it.

It's 1280 shaders against 768 shaders, and it has far more memory bandwidth.

Edit to post.

At this rate,the OP's other half is going to end up with a rig faster than his!! :p

Powercolor HD6950 1GB for $49.99 AR shipped exists as well:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131682

1440 VLIW4 shaders @ 800 MHz with 256-bit GDDR5 (1GB)

Keep in mind though that AMD stopped driver development as of Nov. 24, 2015 with Crimson Beta --> http://support.amd.com/en-us/download/desktop/legacy?product=legacy3&os=Windows+10+-+64

It does have a Windows 10 driver with control panel though.

P.S. It would be interesting to eventually find out how this HD6950 compares in future games to the GT 740 GDDR5 (aka GTX 650) linked earlier for $59.99 shipped. (In Linux, even with AMD's notoriously bad Linux driver, the HD6950 is still faster than the GT 740 GDDR5 except in one game, where it ties the GTX 650.)
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
You have been provided with answers to your gamegpu questions. But since you dont like the results you keep singing the same tune.

You couldn't provide it, yet you keep using it as gospel.

And if AMD didn't use the latest driver, hell broke loose.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
That is because HD530 as all the Intel Graphics before it, takes a big hit in performance when we Increase the Image Quality. Its the reason why Intel likes 720p Low settings ;)

Bullshit. ;)

[Benchmark charts: sm.wot.800.png (World of Tanks), sm.csgo.800.png (CS:GO), sm.f1-3.800.png (F1), sm.metro-3.800.png (Metro)]

And those misleading GameGPU results, with drivers that don't even support Skylake and gimped memory for the Intel systems, won't cut it; we already exposed you in the other thread.

Unlike DX-9, new DX-11 games need more iGPU performance as shown above. At lower Resolution his wife will be able to increase the Image Quality with the Kaveri APU far higher than with the Pentium HD530 graphics.

Both will be more than enough to max those two games at 1024x768, and by going the Skylake route he gets the better CPU performance/platform.
 
Last edited:

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Here's one. Green team advantage is greatly inflated by some users. Even at 1080p (which favours AMD by forcing an iGPU bottleneck), A8-7600 is ~25-40% faster than a similarly priced Pentium G4500. Yes, AMD (usually) leads iGPU performance at the same price point, but Skylake GT2 is still able to run the same games at (slightly) lower resolution/image quality (it's in the same league as the A6-7400 that some people recommended here). In return, you're getting much better ST CPU performance and the best platform for dGPU gaming and future upgrades. Anyway, I wouldn't like to be stuck with a bandwidth starved iGPU in the X1/PS4-era.

The only problem indicated by those results: the AMD chip is only hitting 60FPS in CS:GO, which is a fast-paced shooter, and as you drop the resolution it's probably going to go past that, and there is the issue of minimums too. Trust me, in games like that people aim for as high a framerate as possible.

WoT might be somewhat better since it's a third-person game, but again, despite the game being very lightly threaded and the G4500 having better ST performance than the A8 7600, it's not exactly beating it; in fact it's significantly slower.

It's also the case that, since the IGP is that much slower, you will be upgrading to a dGPU sooner anyway.

The problem is that in many cases we are still massively graphics limited; review sites tend to test with higher-end cards, and not the sort of cards which people will be running with these very cheap CPUs.

Now, the Pentium technically is faster CPU-wise in lightly threaded games and has a far better upgrade path than FM2+, but again it's only a dual core with HT disabled, so at this point I would rather get a Core i3 6100, which I see as lasting much longer, or alternatively wait for the Skylake Celerons to be released, as I doubt they will be much slower.

It was the same with Haswell: I would rather push people to a Celeron instead of a Pentium if they were quite budget constrained and could not afford a Core i3. At least they could maximise the spend on the GPU then, and have a much bigger upgrade to a Core i5 later.

As seen in this thread, even $20 more can buy you a much faster graphics card.

Edit to post.

However, we digress now, as the OP has pretty much decided which path they want to go down.

I think it is better that we put our collective efforts towards finding some graphics card deals for him.
 
Last edited:

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Powercolor HD6950 1GB for $49.99 AR shipped exists as well:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131682

1440 VLIW4 shaders @ 800 MHz with 256-bit GDDR5 (1GB)

Keep in mind though that AMD stopped driver development as of Nov. 24, 2015 with Crimson Beta --> http://support.amd.com/en-us/download/desktop/legacy?product=legacy3&os=Windows+10+-+64

It does have a Windows 10 driver with control panel though.

P.S. Would be interesting to eventually find out how this HD6950 compares to the GT 740 GDDR5 (aka GTX 650) linked earlier for $59.99 shipped in future games. (re: In Linux (even with AMD's notoriously bad Linux driver) HD6950 is still faster than the GT 740 GDDR5 (aka GTX 650) except for one game, where it ties GTX 650).

Had a look at this review:

https://www.techpowerup.com/reviews/Sapphire/HD_7790_Dual-X/26.html

The 2GB version was in between an HD7790 and an HD7850 in performance.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
LOL, considering Mantle code is being used as the foundation directly for Vulkan

And Vulkan supports exactly how many games today? Oh, that's right: exactly zero. Might it be the next big thing in the graphics industry? Of course it might, but then it also may very well be the next big flop in the graphics industry, just like Mantle happened to be.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Kaveri release drivers were more than fine; I really don't know what you are talking about here.

I wasn't the one crying about the AMD drivers being used, you were:

Thirdly, no driver version to be found anywhere. That is because they used the original February 2014 A8-7600 review results in the Core i7 6700K review posted in September 2015.

Links have been posted before, but here once again with the actual game (WOT) the OPs wife is playing.

She plays one game, and it's about as far from WoT as you can get.

My wife is wanting me to build her a tower that does Facebook/Netflix and YouTube. She does play Wizard 101 and that should certainly be fine for the 8400.

Oh, and all of my latest reviews (see links in my sig) for the last 3-4 years are posted on the AT forums, not my blog ;)

Kudos. The last I saw, which was nowhere near 4 years ago, you were still inviting people to click on the links in your signature, as "proof" that what you were saying was somehow true.

There are only 6 games released that support Mantle, the rest will use DX-12 which is almost the same.

Yeah, it's nearly identical. That's why Microsoft is calling it Mantle instead of DirectX, and is also paying AMD for the use of AMD's copyrighted software, huh? Oh, wait...

DX-12 is far better for the future of PC Gaming than DX-11 multithread drivers.

What does the above have to do with the fact that there are thousands of DX-10 & 11 games available to be bought, and a single DX-12 game?

AMD wisely devoted their time and money in to DX-12 both on Hardware (GCN) and Software.

That must be why the Fury X decimates the GTX 980 Ti in DX-12 then, huh?

Oops, looks like they are more or less identical to the performance differences in DX-11, which is almost nil:

[Benchmark charts: DX12-High.png, DX12-Batches-4K.png, DX12-4xMSAA.png, DX12-Batches-1080p-4xMSAA.png]

http://www.extremetech.com/gaming/212314-directx-12-arrives-at-last-with-ashes-of-the-singularity-amd-and-nvidia-go-head-to-head/2

OP, you will get the best performance for your money by buying a CPU and a dedicated video card, not an APU. If I were in your predicament, I'd buy an AMD 860K, a cheap motherboard that can overclock it, and an nVidia 2GB GT 740 video card. It's the absolute best mixture of performance per dollar. AMD's cards require faster CPUs, because AMD's drivers are single-threaded, and no matter how many cores your CPU happens to have, the rest of them wait on the AMD driver thread, slowing down everything else.

You'll end up with a happier wife with that ~$200 combination than any other, I promise. She will immediately get to raise her gaming resolution as high as her 1440x900 or 1600x1200 monitor can go, and that's always what you want to do. An Intel quad-core would be higher performance, of course, but higher performance costs a lot more. Of all of the different combinations you could choose, using the iGPU of either company will give her by far the lowest possible performance.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I ordered pretty much the config USER8000 pulled from Newegg: the Intel G1820 + H81 + GT 740 1GB GDDR5 card. I only swapped out the CX430 for the CX430M, which cost just a few bucks more. :D

I would kill for a modular power supply, and my wife is getting one. :(

Will certainly post back with my results on how it goes. :)
 
Last edited:

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
I ordered pretty much the config USER8000 pulled from Newegg: the Intel G1820 + H81 + GT 740 1GB GDDR5 card.

The G1820 + GT 740 would be my second choice, and is a fine one, especially since most of the free games are single-threaded, and need much more single-threaded performance than they do "moar coars". She'll be happy with that combo.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
The G1820 + GT 740 would be my second choice, and is a fine one, especially since most of the free games are single-threaded, and need much more single-threaded performance than they do "moar coars". She'll be happy with that combo.

Gonna have pretty much all my games that run well on dual cores on my wife's computer. BF4, GTA V, COD BO2, and a few other titles that require 4 threads will end up staying on my i5 rig.

If my wife wants to play GTA V or BF4 on my rig later on and I am in a WOT mood, I will use her tower with no issues. My tower is in the room hooked up to a TV with speakers, and her tower will have headphones and sit in the living room where her uncle watches TV. If she happens to get into a lie-down-and-watch-Netflix mood, she can, while I sit and play some CSGO or WOT on her tower. :thumbsup:

Eventually, when I drop my 660 into her tower, there may very well end up being an i3 or i5 in there already. :)
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
I wasn't the one crying about the AMD drivers being used, you were:

They used the same A8-7600 results from the review they made in February 2014 in the Core i7 6700K review they posted in September 2015.

Since February 2014, AMD has released more than 10-12 drivers. If you believe this is OK, and that it makes them more reliable than my own results, which use the latest drivers and clearly list all the hardware/software used, then I don't have anything else to say about that review site or you.

She plays one game, and it's about as far from WoT as you can get.

You missed the OP saying his wife had started playing WOT.

Wife is getting into WOT suddenly on my rig,

So the recommendations changed since then towards WOT playability.

Anyway, the OP has made his choice.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Bullshit. ;)

[Benchmark chart: World of Tanks (sm.wot.800.png)]

A8-7670K = 34.4 FPS = playable
Pentium G4500 = 26.4 FPS = unplayable

The A8-7670K is 30% faster than the Pentium's HD 530, and that is with an out-of-spec Pentium G4500, because they used DDR4-2666 memory on the Z170 board.

Bullshit all you want; your own results make the HD 530 slower than the A8.
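For what it's worth, the 30% figure checks out against the chart numbers quoted above:

```python
# Relative-performance check from the quoted average FPS figures
# (World of Tanks at the review's settings).
a8_7670k_fps = 34.4
g4500_fps = 26.4

speedup = a8_7670k_fps / g4500_fps - 1
print(f"A8-7670K is {speedup:.1%} faster")  # ~30.3%
```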

 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
A8-7670K = 34.4 FPS = playable
Pentium G4500 = 26.4 FPS = unplayable

The A8-7670K is 30% faster than the Pentium's HD 530, and that is with an out-of-spec Pentium G4500, because they used DDR4-2666 memory on the Z170 board.

Bullshit all you want; your own results make the HD 530 slower than the A8.


See? A lot less than your misleading GameGPU results (showing more than twice the framerate) suggest. A few adjustments and any game that runs on the overhyped APUs runs fine on HD 530 too (close to 30 FPS @ 1080p Medium, so it should run fine at Low). Thank god the OP made the right choice given his budget and got the best low-budget CPU for free games and a dGPU, and didn't fall for your APU hype. ;)


The A8-7670K is 30% faster than the Pentium's HD 530, and that is with an out-of-spec Pentium G4500, because they used DDR4-2666 memory on the Z170 board.

I didn't see you complaining when GameGPU used out-of-spec DDR3-2666 for the AMD platform in your thread (the A10 can use 2400 at default). Your double standards are laughable.
 
Last edited: