What CPU speed is needed for 1080p playback?

seanp789

Senior member
Oct 17, 2001
374
0
0
I'm using Core 2 Duo as a reference point.

What is the minimum speed I will need to handle 1080p Blu-ray, H.264, VC-1, and x264 playback?

I've tested as low as a 2.4GHz Core 2 Duo. Anandtech says: "From our previous experience though, only CPUs faster than an E6600 can guarantee smooth decoding in the absence of GPU acceleration."

I read rumors that the Mac Mini, clocked at 1.8GHz, can handle playback smoothly.
Tom's Hardware seems to confirm this with an E4300 @ 1.8GHz. The catch is that they are outputting at 1680x1050, which is not full 1080p, even though the video source is 1080p.

Has anyone had smooth playback at full 1080p resolution using C2D speeds lower than 2.4GHz?
How do laptop chip clock speeds compare?
Is H.264 or x264 more difficult to decode?
 

Falloutboy

Diamond Member
Jan 2, 2003
5,916
0
76
Really depends on your codec as well. CoreAVC is better than ffdshow if your CPU is maxing out.
 

Falloutboy

Diamond Member
Jan 2, 2003
5,916
0
76
I know my 2.4GHz single-core A64 can do 720p but can't do 1080p without dropping frames.
 

Jyve

Guest
Jan 28, 2006
73
0
0
Originally posted by: Rhoxed
I have an Athlon 3800+ x2 that plays 1080p beautifully @ 2.2 Ghz


Your 3800+ can play 1080p because of the 8600 that's paired with it. I have a 5000+ @ 2.6GHz that is just a hair shy of being able to play a full high-bitrate 1080p MKV rip. I need to overclock it a couple hundred MHz to keep the sound from going out of sync. This is using CoreAVC and NOT ffdshow.
 

faxon

Platinum Member
May 23, 2008
2,109
1
81
Don't cripple yourself by building a rig without even a basic video card if you plan on playing 1080p HD video. You can get a card that will handle it just fine for $60 US.

http://www.newegg.com/Product/...x?Item=N82E16814102726 My stepdad has a 2.4GHz Conroe C2D and that video card, and it seamlessly plays 1080p Blu-rays, no questions asked.
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Well, as you already mentioned, according to Anandtech you need around a C2D E6600 (2.4GHz) to play back 1080p smoothly without dropping frames, but that's without GPU acceleration. Basically any single-core CPU can play back 1080p with hardware acceleration, though.
 

faxon

Platinum Member
May 23, 2008
2,109
1
81
Originally posted by: harpoon84
Well, as you already mentioned, according to Anandtech you need around a C2D E6600 (2.4GHz) to play back 1080p smoothly without dropping frames, but that's without GPU acceleration. Basically any single-core CPU can play back 1080p with hardware acceleration, though.

Yeah, pretty much. I'm really annoyed because some of the HD TV shows my family downloaded from iTunes don't run without a weird graphical artifact on my computer, and when I tone down the graphical acceleration to the point where it disappears on my HD 2900 PRO, my Athlon 4000+ can't keep up anymore :(. Good thing we have a 40-inch HDTV in the living room connected to our Apple TV lol
 

noriseghir

Member
Jul 4, 2008
27
0
0
A Pentium D @ 3.4GHz decodes 1080p with ~70% CPU utilization (x264 encode / ffdshow decode). And it's far less efficient than a C2D @ 2.4GHz.
Question: is x264-encoded content GPU hardware-accelerated? An MKV played with ffdshow, for example?
 

nerp

Diamond Member
Dec 31, 2005
9,865
105
106
One option might be a 780G chipset. With that, you can decode 1080p even with a low-end Sempron. :)

My 780G board and 4850e 45W CPU @ 2.5GHz handle it easily.
 

Tlkki

Member
May 20, 2005
165
0
0
There are a trillion different 1080p AVC streams out there, and thus a trillion different configurations needed to run them. EVERY stream has its OWN requirements based on the encoder and the encoder options used. Level 5.1 encodes at high bitrate have insane requirements, which is why no Blu-ray player, nor the PS3, supports them. So basically, if you want to be able to play EVERY 1080p stream encoded to date, you don't want to skimp on the CPU.
L3.1 low-bitrate, easily compressible material (say, anime) is far less taxing on the processor.

I doubt any video card supports HW-accelerated x264 playback from the Matroska container. And of course CoreAVC is faster than ffdshow, but I personally don't like the artifacts it sometimes produces.

e:
Just to clarify the bitrate issue.

Some parts of a movie might play smoothly on a lowish single-core machine. But in parts where the action heats up and the frames become more complex to encode, the encoder cranks up the bitrate to preserve quality. Then you might get every other frame, basically 10fps. I don't know the exact maximum bitrate specified in L5.1, or even if there is one, but I've seen bitrates exceeding 40Mbit/s. With 6+ reference frames and exhaustive motion prediction calculations, many a CPU is brought to its knees.
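(Not part of the original posts, but a way to see the per-stream differences described above for yourself: `ffprobe` from the FFmpeg project, assumed installed, reports a file's profile and level, and `ffmpeg -benchmark` times a pure decode with no display. The `sample.mkv` below is a hypothetical name, generated here from ffmpeg's built-in `testsrc` so the sketch is self-contained.)

```shell
# Generate a short 1080p H.264 test clip (requires ffmpeg built with libx264).
ffmpeg -v error -y -f lavfi -i testsrc=duration=1:size=1920x1080:rate=24 \
    -c:v libx264 -pix_fmt yuv420p sample.mkv

# Inspect the stream: codec, profile, and level are what drive decode cost.
ffprobe -v error -select_streams v:0 \
    -show_entries stream=codec_name,profile,level,width,height \
    -of default=noprint_wrappers=1 sample.mkv

# Pure decode benchmark: discard the output, report CPU time spent decoding.
ffmpeg -benchmark -i sample.mkv -f null - 2>&1 | grep bench
```

Running the benchmark line on a real L5.1 high-bitrate rip versus an L3.1 anime encode shows exactly the gap the post describes.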
 

vic5014

Junior Member
Aug 4, 2008
8
0
0
Just get a cheap video card made in the last couple of years. Nearly all GPUs now support offloading hi-def video decode and playback from the CPU. Read the quote more carefully: the speed mentioned is in the absence of any kind of GPU-assisted playback. Heck, plenty of integrated-graphics mobos will do it too.