[gamegpu.ru] APU gaming including Skylake GT2 and Broadwell Iris Pro 6200


Phynaz

Lifer
Mar 13, 2006
10,140
819
126
ADF is so desperate they need to overclock Kaveri to have it catch up to a Pentium. Yes, a Pentium.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
ADF is so desperate they need to overclock Kaveri to have it catch up to a Pentium. Yes, a Pentium.

How is that different from overclocking a G3258 to have it catch up to faster Pentiums? Because let's face it -- it's not like an overclocked G3258 can get anywhere close to a Haswell i5 in multi-threaded performance when the software is well written.
You can overclock a $75 chip, but there are obvious tradeoffs when you are buying a cheap CPU... And that applies to every CPU manufacturer.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
ADF is so desperate they need to overclock Kaveri to have it catch up to a Pentium. Yes, a Pentium.

Well, you can OC the Pentium to hell and back and still it will not be faster than a dual-module Kaveri in multi-threading/multi-tasking.

Not to mention iGPU performance: even the double-size Iris Pro 6200 can barely keep up with the almost two-year-old Kaveri.
Yep, a two-year-old Kaveri on 28nm planar versus an Iris Pro 6200 with double the silicon on 14nm, with eDRAM on top of that, and it only just keeps up.
In 2017, when AMD has 14nm FinFET APUs, it will be a slaughter.
 

TheELF

Diamond Member
Dec 22, 2012
4,029
753
126
Well, you can OC the Pentium to hell and back and still it will not be faster than a dual-module Kaveri in multi-threading/multi-tasking.
Yup, it won't, but it will be just as fast, meaning that 2 cores are as fast as 4. How is that possible if Kaveri cores are faster than Haswell's?
 

TheELF

Diamond Member
Dec 22, 2012
4,029
753
126
Yet Kaveri can't play HEVC movies or VP9...
Well, it is just an ATi card slapped onto a CPU; they were never designed for the needs of modern times. They are decently fast, but the power consumption and the lack of features show you what decade they belong in.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Goalpost change from iGPU performance to GPU features.
Did Intel/NVIDIA 2014 hardware support HEVC and VP9? No, they didn't.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Yes they did; Haswell supports those codecs in hardware, not full hardware acceleration, but still acceleration.
https://communities.intel.com/thread/59216?tstart=0

Kaveri also supports x265/HEVC through OpenCL

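For anyone curious whether the OpenCL path is even exposed on their box, here is a rough Python sketch (assumes the pyopencl package is installed; the exact platform/device strings depend on the driver):

Code:
# Rough sketch: list the OpenCL platforms and devices the driver exposes,
# which is the prerequisite for any OpenCL-assisted HEVC decode path.
import pyopencl as cl

for platform in cl.get_platforms():
    print("Platform:", platform.name, platform.version)
    for device in platform.get_devices():
        kind = cl.device_type.to_string(device.type)
        print("  Device:", device.name, "|", kind,
              "| compute units:", device.max_compute_units)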
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
It only works with 1080p@30FPS, doesn't it? And it doesn't work with just any player, nor does it support all formats. It was an awful brute-force attempt and is already forgotten.

Remember the details. ;)
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
It may be; I don't actually remember the details for Kaveri now, but Carrizo (2015) has native 4K H.264 and H.265/HEVC decode.

[attached image: amd-2015-carrizo-5.jpg]
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Yes, HEVC 8-bit is good for Carrizo, Tonga and Fiji. As long as you don't play 4K Blu-rays using Main10, or any other HEVC Main10 content, or VP9 in HTML5 (YouTube, for example).

So even those are horribly outdated. This also disqualifies AMD for HTPC tasks.

AMD needs to step up, because they are so far behind it's embarrassing.
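And before anyone blames the hardware or the player, check what the clip actually is; a rough Python sketch using ffprobe (needs ffprobe on the PATH, and "clip.mkv" is just an example file name):

Code:
# Rough sketch: print codec, profile and pixel format of the first video stream,
# so you can tell HEVC Main (yuv420p, 8-bit) from Main10 (yuv420p10le) or VP9.
import json
import subprocess

out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name,profile,pix_fmt,width,height",
     "-of", "json", "clip.mkv"],          # "clip.mkv" is just an example
    capture_output=True, text=True, check=True)

stream = json.loads(out.stdout)["streams"][0]
print(stream["codec_name"], stream.get("profile"),
      stream["pix_fmt"], f'{stream["width"]}x{stream["height"]}')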
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Sorry, but you are exaggerating. Although AMD needs to address those hardware codecs in 2016-17, 4K users today are not even 1% of the global HTPC base.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Sorry, but you are exaggerating. Although AMD needs to address those hardware codecs in 2016-17, 4K users today are not even 1% of the global HTPC base.

Netflix and others are moving to HEVC for 1080p. AMD can't play these as streams; they can only play them with a 3rd-party player, with 30FPS clips and limited support.

VP9 and HTML5 aren't supported at all. This includes YouTube.

AMD needed support this year, not in 2 years or more.

As always, the train departs, and AMD, like its users, is left behind at the station.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Yes yes, all that NVIDIA and Intel hardware that doesn't support all those codecs will also become obsolete at some point. For 2015 and 2016, the AMD APU codec support is fine for the vast majority of the HTPC user base. And the 2016 14nm GPUs will have support for all those codecs, so don't worry about the train ;)
 

coercitiv

Diamond Member
Jan 24, 2014
7,483
17,879
136
Yes, HEVC 8-bit is good for Carrizo, Tonga and Fiji. As long as you don't play 4K Blu-rays using Main10, or any other HEVC Main10 content, or VP9 in HTML5 (YouTube, for example).
By the way, have you had the chance to test an HEVC 10-bit @ 60 FPS file on Skylake? It's GPU-accelerated decoding, and I have yet to find out how well it performs.

I'm considering an upgrade for my HTPC machine to accommodate a Plex server, and soon enough I'll have to decide between a heavily discounted Haswell or a top-shelf Skylake. The Haswell would need a dGPU for decoding HEVC @ 60FPS; I need to know if Skylake changes that 100%.
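If you do get the chance, something like the ffmpeg null-decode run below is what I'd use to check it; a rough sketch, assuming an ffmpeg build with dxva2 (Windows) or vaapi (Linux) support, and the sample file name is just a placeholder:

Code:
# Rough sketch: decode the clip as fast as possible into a null sink and
# let ffmpeg report the achieved speed; >= 1.0x means it keeps up with real time.
# Assumes ffmpeg with hardware decode support; swap "dxva2" for "vaapi" on Linux.
import subprocess

subprocess.run(
    ["ffmpeg", "-hide_banner", "-benchmark",
     "-hwaccel", "dxva2",                       # hardware decode, if build/driver allow it
     "-i", "hevc_main10_4k60_sample.mkv",       # placeholder sample file
     "-f", "null", "-"],
    check=True)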
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Yes yes, all that NVIDIA and Intel hardware that doesn't support all those codecs will also become obsolete at some point. For 2015 and 2016, the AMD APU codec support is fine for the vast majority of the HTPC user base. And the 2016 14nm GPUs will have support for all those codecs, so don't worry about the train ;)

So you say APU users need to buy a dGPU for the things other IGPs support? :)

Super!
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
By the way, have you had the chance to test an HEVC 10-bit @ 60 FPS file on Skylake? It's GPU-accelerated decoding, and I have yet to find out how well it performs.

I'm considering an upgrade for my HTPC machine to accommodate a Plex server, and soon enough I'll have to decide between a heavily discounted Haswell or a top-shelf Skylake. The Haswell would need a dGPU for decoding HEVC @ 60FPS; I need to know if Skylake changes that 100%.

Haswell should be able to decode at 60FPS.

My i3-6100U NUC has no problem with 4K HEVC Main10 or 4K VP9 via YouTube. It uses around 6W playing those, versus 2W with 8-bit.

Remember, 4K@60Hz decode requires dual-channel memory for Braswell, Haswell, Broadwell and Skylake. (Maybe eDRAM models excluded.)
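Rough back-of-envelope on why the memory channels matter (just frame-size arithmetic, not a measurement; the real hybrid/shader-assisted decode adds traffic on top of this):

Code:
# Back-of-envelope DRAM traffic for decoded 4K60 10-bit (P010-style) frames.
width, height, fps = 3840, 2160, 60
bytes_per_pixel = 3                                   # 4:2:0, 2 bytes/sample * 1.5 samples/pixel
frame_bytes = width * height * bytes_per_pixel        # ~25 MB per frame

write_gbps   = frame_bytes * fps / 1e9                # decoder writes every frame
ref_gbps     = 2 * frame_bytes * fps / 1e9            # roughly two reference reads per frame
scanout_gbps = frame_bytes * fps / 1e9                # display controller re-reads at 60 Hz

print(f"~{write_gbps + ref_gbps + scanout_gbps:.1f} GB/s before any shader work")  # ~6 GB/s
# A single DDR3-1600 channel peaks at ~12.8 GB/s (much less sustained), so the extra
# traffic from hybrid/shader-assisted decode can easily starve a single channel.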
 

coercitiv

Diamond Member
Jan 24, 2014
7,483
17,879
136
Haswell should be able to decode at 60FPS.

My i3-6100U NUC has no problem with 4K HEVC Main10 or 4K VP9 via YouTube.

Remember, 4K@60Hz decode requires dual-channel memory for Braswell, Haswell, Broadwell and Skylake. (Maybe eDRAM models excluded.)
Haswell cannot do 60 FPS via the iGPU (tested on a dual-channel machine at home); that's why I'm asking if you had the chance to test 4K HEVC Main10 @ 60 FPS on your Skylake HTPC...
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
So you say APU users need to buy a dGPU for the things other IGPs support? :)

Super!

You know very well what I said: AMD and Intel (Haswell and before) iGPUs are fine for HTPC for the vast majority of users. VP9/HTML5 and 1080p HEVC are not mandatory for the vast majority of the HTPC user base.

The user base that needs those today is not even 1% of the global HTPC base, so please don't say things only to make AMD look bad. We know, and I have acknowledged here, that AMD today lacks some new codec support, but that doesn't make them irrelevant for HTPC in 2015.

Anyway, I will stop the off-topic here.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Haswell cannot do 60 FPS via the iGPU (tested on a dual-channel machine at home); that's why I'm asking if you had the chance to test 4K HEVC Main10 @ 60 FPS on your Skylake HTPC...

Yep, it uses around 6W, vs 2W with full decode support. (Package power.)
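For anyone who wants to reproduce that package-power comparison themselves, here is a rough Linux sketch using the RAPL sysfs counter (the path can differ per kernel/driver and may need elevated permissions; on Windows a monitoring tool shows the same counter):

Code:
# Rough sketch: sample the package energy counter before and after ~10 s of
# playback and print the average package power. The counter is in microjoules
# and can wrap on long runs; the path may differ per kernel/driver.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"   # package-0 domain on many Intel systems

def read_uj():
    with open(RAPL) as f:
        return int(f.read())

e0, t0 = read_uj(), time.time()
time.sleep(10)                                         # play the clip during this window
e1, t1 = read_uj(), time.time()

print(f"Average package power: {(e1 - e0) / 1e6 / (t1 - t0):.1f} W")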
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
In relation to the dual-channel part: DRAM power usage changes from 0.9W to 2.2W when playing 4K.