Best CPU for 4K Blu-ray playback & Gaming?

mathew70

Junior Member
Aug 30, 2014
14
0
0
Yes, I realise 4K Blu-ray does not yet exist, but I want the PC I'm building to be future-proof ;)

I don't know much about computers, so any suggestions about the best CPU to get are much appreciated. Thanks! :)
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
MiddleOfTheRoad said:
Well, Socket 2011-V3, 16 GB DDR4 and an i7 5930K should get you started
Lol!

Anyway, since Blu-ray playback will be geared towards the masses, I imagine an i3 CPU or above should be fine. :)

For gaming, get the most powerful GPU (preferably SLI/CrossFire) you can afford, because at 4K it's all GPU grunt. Any i5 CPU is fine for gaming, and going up doesn't make a difference; it's all about the GPU.
 
Dec 30, 2004
12,553
2
76
For gaming I have nothing to add, but for Blu-ray playback the video card is what's going to matter.

Even my old 1.3GHz single-core Atom with GMA500 graphics was able to play 1080p smoothly. All it had to do was unpack the stream and feed it to the GPU.

And I don't think you should even consider 4K gaming; that's going nowhere, slowly, at 10fps.
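
If you want to see the split for yourself, here's a rough sketch (ffmpeg and the "sample_1080p.mkv" filename are just my assumptions -- substitute your own clip): decode to a null sink once in pure software and once with DXVA2 handling it, then compare the two runs.

Code:
import subprocess
import time

CLIP = "sample_1080p.mkv"  # hypothetical test clip -- use your own file

def time_decode(extra_args):
    # Decode CLIP to a null sink and return elapsed wall-clock seconds.
    cmd = ["ffmpeg", "-v", "error", *extra_args, "-i", CLIP, "-f", "null", "-"]
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

sw = time_decode([])                     # CPU does all the work
hw = time_decode(["-hwaccel", "dxva2"])  # CPU just demuxes and feeds the GPU
print("software: %.1fs  hardware: %.1fs" % (sw, hw))

On a weak CPU the second run should barely register in Task Manager -- that's the GPU doing the decode.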
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
It all depends on the codec used. A Celeron with its IGP can be as good at playing it as anything else.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
Yes, I realise 4K Blu-ray does not yet exist, but I want the PC I'm building to be future-proof ;)

I don't know much about computers, so any suggestions about the best CPU to get are much appreciated. Thanks! :)
TBH, it's pointless trying to build a "future proof" PC. Just buy what you need when you need it. You might find hardware/software combinations far better clarified two years down the line. E.g., in two years even a Pentium or a low-end budget passive GFX card (which can be added to any older CPU) may have fully hardware-accelerated 4K/H265 (if they don't already). Increasingly such tasks are GPU / fixed-function accelerated rather than CPU "software" based, so you may not even need a new CPU at all.
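
If you're curious what your current box already accelerates, you can just ask ffmpeg (a quick sketch, assuming it's installed and on your PATH):

Code:
import subprocess

def ffmpeg_out(*args):
    # Run ffmpeg quietly and capture whatever it prints to stdout.
    return subprocess.run(["ffmpeg", "-v", "quiet", *args],
                          capture_output=True, text=True).stdout

# Hardware acceleration methods this ffmpeg build supports (dxva2, vdpau...)
print(ffmpeg_out("-hwaccels"))

# Any H.265/HEVC decoders compiled in
print("\n".join(l for l in ffmpeg_out("-decoders").splitlines() if "hevc" in l))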

And that's all assuming 4K Blu-ray actually takes off and is worth the effort / money / time / risk for studios to put out more than just the usual 1980-to-today blockbuster token titles for the enthusiast market. It may well fizzle out due to lack of demand if most people (i.e., non-enthusiasts) deem 1080p Blu-ray "good enough" and not worth rebuying their collection yet again (for the 3rd time in 17 years...). Look at what happened to DVD-Audio & SACD, which audiophiles at the time swore blind were "the future". Then there's 3D Blu-ray, etc. :D...
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,696
136
It all depends on the codec used. A Celeron with its IGP can be as good at playing it as anything else.

There's a small problem: Celerons/Pentiums do not support 4K output... :D

You need either an i3 (minimum) or a discrete graphics card to do that.
 

jkauff

Senior member
Oct 4, 2012
583
13
81
You'll only need a CPU powerful enough to decode 4K frames at a decent frame rate -- an i3 or i5 will do the job. Because of new compression technology, the bitrate for the H.265 standard will be comparable to regular Blu-ray. QuickSync and CUDA do not currently support decoding H.265 video, so you'll have to rely on the CPU for now.
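
To put rough numbers behind the "comparable bitrate" claim (the figures below are my own assumptions, not spec values):

Code:
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160

bluray_1080p_mbps = 30  # typical H.264 Blu-ray video bitrate (my assumption)
hevc_vs_avc = 0.5       # H.265 bits needed relative to H.264 (my assumption)

est = bluray_1080p_mbps * (pixels_4k / pixels_1080p) * hevc_vs_avc
print("estimated 4K H.265 bitrate: ~%.0f Mbit/s" % est)  # ~60 Mbit/s

That's the same order of magnitude as today's discs instead of the 4x you'd naively expect from 4x the pixels.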

Rendering enhancements affecting picture quality pretty much all happen on the GPU, so, as others have said, spend your dollars where they'll do the most good: on the most powerful graphics card you can afford. And, of course, on a nice 4K monitor/TV.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
You'll only need a CPU powerful enough to decode 4K frames at a decent frame rate -- an i3 or i5 will do the job. Because of new compression technology, the bitrate for the H.265 standard will be comparable to regular Blu-ray. QuickSync and CUDA do not currently support decoding H.265 video, so you'll have to rely on the CPU for now.

Rendering enhancements affecting picture quality pretty much all happen on the GPU, so, as others have said, spend your dollars where they'll do the most good: on the most powerful graphics card you can afford. And, of course, on a nice 4K monitor/TV.

There is an OpenCL-based H.265 decoder for AMD, but it doesn't support 4K.
http://forums.anandtech.com/showthread.php?t=2387592
It works on my Kaveri APU.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
My recommendation is not to worry about the CPU. By the time Blu-Ray 4K becomes a big enough deal to care about, Nvidia and AMD will have updated their hardware decoding to support H.265. It's rumored that some of Nvidia's upcoming Maxwell cards may already have this ability. When this happens, you can keep the CPU and just update the graphics card.

What kind of things are you planning to do on this PC other than watching movies? It's hard to recommend a CPU without knowing the intended use case. Will this just be for general web browsing and/or office-type stuff? Photo editing? Gaming?
 

mathew70

Junior Member
Aug 30, 2014
14
0
0
I don't plan on EVER using hardware decoding of HD video because I believe software decoding to be superior. I'm guessing an Intel Core i5 or i7 @ 3.5GHz should be powerful enough, though I am wondering what speed memory I should get. At the moment I am leaning towards 2400MHz. I am trying to find memory with the lowest latency.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,696
136
My recommendation is not to worry about the CPU. By the time Blu-Ray 4K becomes a big enough deal to care about, Nvidia and AMD will have updated their hardware decoding to support H.265. It's rumored that some of Nvidia's upcoming Maxwell cards may already have this ability. When this happens, you can keep the CPU and just update the graphics card.

The 750/750Ti already has the ability to decode H.265, though it's not full hardware decode.
 

jkauff

Senior member
Oct 4, 2012
583
13
81
The 750/750Ti already has the ability to decode H.265, though it's not full hardware decode.
That's true of the latest Intel iGPUs and QuickSync. As you say, it's not full hardware decode.

Irrelevant to the OP, of course. Software decoding is considerably faster anyway on a modern CPU. I only use QuickSync decode when I'm watching a movie while doing a Handbrake encode.
 

Eric1987

Senior member
Mar 22, 2012
748
22
76
For gaming I have nothing to add, but for Blu-ray playback the video card is what's going to matter.

Even my old 1.3GHz single-core Atom with GMA500 graphics was able to play 1080p smoothly. All it had to do was unpack the stream and feed it to the GPU.

And I don't think you should even consider 4K gaming; that's going nowhere, slowly, at 10fps.

Not true but nice try.
 

BonzaiDuck

Lifer
Jun 30, 2004
16,742
2,094
126
I don't plan on EVER using hardware decoding of HD video because I believe software decoding to be superior. I'm guessing an Intel Core i5 or i7 @ 3.5GHz should be powerful enough, though I am wondering what speed memory I should get. At the moment I am leaning towards 2400MHz. I am trying to find memory with the lowest latency.

The choice between "stock" RAM speed (1333 to 1600) and the high end represented by 2400 has been discussed at length in many threads -- some, but not all, found in the Memory & Storage forum.

Based on those discussions, I had long since accepted that "performance" is barely impacted except for benchmark scores, which still don't represent much of anything substantive.
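
If you run the numbers, that conclusion isn't surprising. First-word latency in nanoseconds works out to 2000 x CAS / data-rate, and the kits barely differ (the timings below are typical examples I'm assuming, not particular products):

Code:
def latency_ns(cas_cycles, data_rate_mts):
    # DDR transfers twice per clock, hence the 2000 when converting
    # CAS cycles to nanoseconds: ns = cas / (MT/s / 2) * 1000.
    return 2000.0 * cas_cycles / data_rate_mts

print(latency_ns(9, 1600))   # DDR3-1600 CL9  -> 11.25 ns
print(latency_ns(11, 2400))  # DDR3-2400 CL11 -> ~9.17 ns

Roughly two nanoseconds apart -- and that's before the CPU's caches hide most of it.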

Then, there's this issue about whether or not 4K video will catch on. This is another one of those aspects for which an old Medicare fart like myself feels slightly bewildered or exasperated.

At what point do the limitations of the human eye make it irrelevant for using higher resolutions? Here, I'm raising questions that someone else might try and answer.

I've just come up to speed on "Smart phone." I can surf the web and read books on my smart phone. I can access this web-page and read the posts. I come away from the experience thinking I should get a special pair of reading glasses -- the new progressive lenses I bought last month still make it seem that I'm reading the Compact Edition of the OED without the magnifier. But I wouldn't need an electron microscope or something equivalent. So WHY 4K?! What GOOD would it do for me?
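
Here's my own stab at the arithmetic before someone corrects me, assuming the usual figure of about 1 arcminute of resolving power for 20/20 vision:

Code:
import math

ARCMIN = math.radians(1.0 / 60.0)  # ~1 arcminute of visual acuity (assumed)

def max_useful_distance_m(diagonal_in, horizontal_px, aspect=16.0 / 9.0):
    # Farthest distance at which one pixel still subtends a full arcminute;
    # any farther back and the extra resolution is invisible.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)
    pixel_pitch_m = (width_in / horizontal_px) * 0.0254
    return pixel_pitch_m / math.tan(ARCMIN)

print(max_useful_distance_m(55, 3840))  # 55" 4K set: ~1.1 m
print(max_useful_distance_m(55, 1920))  # 55" 1080p set: ~2.2 m

So on a 55" set I'd have to sit closer than about four feet before 4K shows me anything 1080p doesn't. That's my answer to my own question, anyway.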

Maybe I'm missing something. I haven't even replaced a DVD-burner with a BD-burner* on my PCs yet. If I want movies, I subscribe to them online or simply use my Media Center to DVR/record premium channels to a very large hard disk.
And if I'm wrong about the RAM, I'd expect someone else to say so, but I'm pretty sure my answer on that count is in the ballpark.

*If I want to "play" BD movies, and for using other than your COSTCO $100-bargain HT Blu-Ray player, I'd only buy a "burner" and consider the value of using BD discs for back-up or ripping movies to optical. In this very last aspect, I see no reason to do that either -- I have a 1TB drive on my system used exclusively for DVR or "capture." Then -- I archive files to my 8TB server . . .
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,696
136
That's true of the latest Intel iGPUs and QuickSync. As you say, it's not full hardware decode.

Intel uses the same fixed function + shader strategy as NV. So they're similar in that regard.

There is an HSA decoder for AMD APUs floating around, but since I don't have the hardware to try it out, I can't say how well it works.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Intel uses the same fixed function + shader strategy as NV. So they're similar in that regard.

There is an HSA decoder for AMD APUs floating around, but since I don't have the hardware to try it out, I can't say how well it works.

It works for the most part but doesn't do 4K.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
DDR3 1600/1866 + a Haswell i3, or an i5 if you're feeling generous with your budget.