Do video cards make a difference in general HTPC use?

Charlie98

Diamond Member
Nov 6, 2011
6,292
62
91
HTPC in sig below...

I have a HD6450 in there right now... a passively cooled one... and it does the job, that is, streaming stored video and Netflix via HDMI to a 60" Vizio. As I understand it, the GPU corrects the Sandy Bridge fault in 24fps playback, which is why I have it.
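
If I have the numbers right, the fault is that the Sandy Bridge iGPU outputs roughly 24.000Hz instead of the 23.976Hz that film content actually runs at, so the display slowly creeps ahead of the video until a frame has to repeat. A quick back-of-the-envelope in Python (the figures are approximate):

Code:
content_fps = 24000 / 1001    # ~23.976 fps, the real rate of "24p" film content
output_hz = 24.000            # roughly what the Sandy Bridge iGPU delivered instead

# Extra display time accumulates every second until a whole frame has to repeat
drift_per_second = output_hz - content_fps
seconds_per_hiccup = 1 / drift_per_second
print(f"one repeated frame roughly every {seconds_per_hiccup:.0f} seconds")
# ~42 seconds -- small, but it shows up as a stutter on slow pans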

Would a more powerful GPU improve anything anywhere on this system? I have my old GTX560Ti sitting around, and I've always toyed with the idea of stabbing it into my HTPC, but I never did because of the power/noise penalty... and I assumed it would make no difference. But the more I research HTPC playback, the more I realize I've made some dumb assumptions... :\

Before I sell this old GTX off, convince me I'm not making a mistake...
 

bradly1101

Diamond Member
May 5, 2013
4,689
294
126
www.bradlygsmith.org
I think they make a difference. Like anything, it all depends on your needs. I stuck a passive GT 630 in mine. Most any card will handle 1080p video, but in my case I wanted the card to be able to handle some 3D duty, like an aquarium screensaver. But if we have people over we have to put a Post-It note on the TV's bezel that says, "Don't tap on the glass." :)
 

NutBucket

Lifer
Aug 30, 2000
27,034
546
126
Generally speaking, for video viewing it doesn't matter. I use integrated i5 graphics and it works just fine.
 

Charlie98

Diamond Member
Nov 6, 2011
6,292
62
91
It could be used with MADVR to run filters and improve image quality.

I keep my HTPC simple-stupid... I don't even know what MADVR is... although I've seen references to it here. If it can take my average rips and pump them up, I'd be willing to learn more... :D
 

Charlie98

Diamond Member
Nov 6, 2011
6,292
62
91
...like an aquarium screensaver. But if we have people over we have to put a Post-It note on the TV's bezel that says, "Don't tap on the glass." :)

HA! ...that's funny! We have two turtle tanks in the living room (real ones, not screensavers... ;) ) so everyone jacks with Bill and Ted, instead.
 

Binky

Diamond Member
Oct 9, 1999
4,046
4
81
A GPU can make a difference if you use graphics-intensive interfaces, like the Media Browser interface inside WMC. The interface is a bit snappier with even a modest card. I never did like the low-end AMD cards like the 5450 and 6450. Something just a bit more powerful than this seems to be the sweet spot, unless you are doing post processing like Madvr.
 

Charlie98

Diamond Member
Nov 6, 2011
6,292
62
91
A GPU can make a difference if you use graphics-intensive interfaces, like the Media Browser interface inside WMC. The interface is a bit snappier with even a modest card. I never did like the low-end AMD cards like the 5450 and 6450. Something just a bit more powerful than this seems to be the sweet spot, unless you are doing post processing like Madvr.

That's actually our setup... WMC with Media Browser, and I like it a lot. The 6450 has been OK; I picked it because it was reasonably priced and passively cooled. Jeepers... I might just have to stab that 560Ti in... even though it means uninstalling and reinstalling video drivers... :\

...and there's that MADVR reference again. :eek:
 

MongGrel

Lifer
Dec 3, 2013
38,751
3,068
121
Have an old EVGA GTX 260 216 I stuck in mine, because it wasn't doing anything else at the time I cobbled this together, I guess.

With the media player I use, I think it probably helps a bit.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
A GPU can make a difference if you use graphics-intensive interfaces, like the Media Browser interface inside WMC. The interface is a bit snappier with even a modest card. I never did like the low-end AMD cards like the 5450 and 6450. Something just a bit more powerful than this seems to be the sweet spot, unless you are doing post processing like Madvr.

Great point. I have upgraded a GPU before just for more FPS in Kodi's GUI.

I also agree the sweet spot is a bit beyond the 6450. I like to think of the GT 430 as the floor.
 

Binky

Diamond Member
Oct 9, 1999
4,046
4
81
The Nvidia GT naming convention depends on the 2nd digit, so the x20 is probably slow. I'd look at the x30 or x40 cards. I could be wrong though, as I haven't read a single review of the 7xx series cards.
 

dn7309

Senior member
Dec 5, 2012
469
0
76
In my experience, an i5's integrated graphics (Haswell) causes tearing on video.
 

Charlie98

Diamond Member
Nov 6, 2011
6,292
62
91
The Nvidia GT naming convention depends on the 2nd digit, so the x20 is probably slow. I'd look at the x30 or x40 cards. I could be wrong though, as I haven't read a single review of the 7xx series cards.

The 720's core clock is just as fast as my 560Ti's, with half as many CUDA cores as the vanilla 560... but only a 64-bit memory bus. I see what you are saying. Unfortunately, it's the only one with a passive cooler.
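
Rough math on the memory side, since that 64-bit bus is the kicker... bandwidth is basically bus width (in bytes) times the effective memory data rate. The clocks below are ballpark numbers, not quoted specs:

Code:
# Ballpark memory bandwidth: bus width (bytes) x effective data rate (GT/s).
# The memory clocks here are rough assumptions, not exact card specs.
def bandwidth_gbs(bus_bits, data_rate_gtps):
    return (bus_bits / 8) * data_rate_gtps

print(f"64-bit DDR3 @ ~1.8 GT/s:   {bandwidth_gbs(64, 1.8):.0f} GB/s")   # GT 720-class
print(f"256-bit GDDR5 @ ~4.0 GT/s: {bandwidth_gbs(256, 4.0):.0f} GB/s")  # 560Ti-class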

Definitely going to throw the GTX560 into the HTPC this week and see how well it works. Hope it's not too noisy... :eek:
 

pauldun170

Diamond Member
Sep 26, 2011
9,133
5,072
136
For simple HTPC usage (XBMC/1080p/Netflix) I didn't notice any difference between the HD 2000 on the i3-2100, the HD4830, the HD7770, or the GTX570.

All worked fine.

The only reason I run a discrete card (the HD7770) is that we use the HTPC as the family gaming box in addition to the Wii/Xbox, etc. If we didn't game with it, I'd chuck the HD7770.
 

jkauff

Senior member
Oct 4, 2012
583
13
81
madVR can do amazing things to picture quality using your GPU, but it's a hobby unto itself. If terms like luma and chroma, judder, and error diffusion are foreign to you, stick with what you have and just enjoy the movie.
 

piasabird

Lifer
Feb 6, 2002
17,168
60
91
I use only integrated graphics at home. I have 2 computers set up to watch video over the Internet. One is an i5-2500K with HD 3000 graphics, I think Sandy Bridge. The other is a newer i3 with 4MB of cache and HD 4600 graphics.

http://www.newegg.com/Product/Produc...82E16819116945

I could see some jumpiness in video being caused by the TV. A lot of newer HDTVs have higher refresh rates than needed for typical movie watching, and they also have technology that tries to correct or speed up the frame rate by adding extra frames. Adjusting the settings on the HDTV is critical to watching video; basically you turn down a lot of the added features and just watch it at the original rate. I purposely purchased a slower HDTV because of this.

Maybe what a video card really does is speed up the number of frames per second so there is no reason to correct the frame rate.

Still, I only have a 40" Samsung at about 60Hz, so maybe the video is too poor to notice it on the smaller screen.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Poofy, can you run madVR with WMC/MB like I have? Is it passive... as in set it and forget it?

Most people use madVR with Media Player Classic HC or with JRiver's stuff. It is not really a set-and-forget type of thing; it's more of a "configure a hundred little options to barely make a difference until you go crazy" thing. My personal opinion is that the only people who need it are those with 4K TVs, and then they need a good gaming GPU.

For casual media use (aka Kodi/WMC/etc.) there is a difference in GPU picture quality in Windows but it is SLIGHT (in this order: AMD 6xxx+, Nvidia, AMD 5xxx or less, Intel). Most of it is color correction, which you can't see in every scene.

Really what separates the solutions is decoding robustness: Nvidia's new stuff can decode H.264 at 4K, while I have some 1080p files (interlaced VC1) that make every Intel GPU I have barf. But with Intel you normally have enough CPU power to just cut through whatever.
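
If you want to sanity-check whether a particular file actually hits the GPU's decoder or quietly falls back to software, ffmpeg makes a quick test rig. A rough sketch (assumes ffmpeg is on your PATH on a Windows box, where DXVA2 is the hardware decode path; the filename is just a placeholder, and the exact fallback warning text varies by ffmpeg version):

Code:
import subprocess
import time

def decode_test(path, hwaccel="dxva2"):
    """Decode a file with hardware acceleration requested and report how it went."""
    cmd = ["ffmpeg", "-hwaccel", hwaccel, "-i", path, "-f", "null", "-"]
    start = time.time()
    result = subprocess.run(cmd, capture_output=True, text=True)
    elapsed = time.time() - start
    # ffmpeg logs to stderr; a "Failed setup" line means it silently fell back
    # to software decoding (the exact wording differs between versions)
    fell_back = "Failed setup" in result.stderr
    status = "software fallback" if fell_back else "hardware path looked fine"
    print(f"{path}: decoded in {elapsed:.1f}s ({status})")

decode_test("interlaced_vc1_sample.mkv")  # placeholder filename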

Do I use dedicated GPUs? Yes, whenever I can.

But the reason I use them is SO niche I bet it doesn't apply to you: Nvidia+Linux has some features nothing else in Linuxland can touch. For 99% of HTPC use they are probably not needed.

Maybe what a video card really does is speed up the number of frames per second so there is no reason to correct the frame rate.

There is a madVR filter that gets rid of 3:2 pulldown. I am a fan of that on 60Hz TVs, or TVs that take forever to switch into 24Hz mode.
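
To put numbers on why that cadence bugs people: 24fps on a 60Hz panel means frames alternate between 3 and 2 refreshes on screen, i.e. 50ms and 33ms instead of an even ~42ms. Quick illustration in Python (just arithmetic, nothing madVR-specific):

Code:
# 3:2 pulldown arithmetic: mapping 24 fps film onto a 60 Hz display.
# Each pair of film frames covers 5 refreshes (3 + 2), so 12 pairs fill 60 refreshes.
REFRESH_HZ = 60
FILM_FPS = 24
refresh_ms = 1000 / REFRESH_HZ          # ~16.7 ms per refresh

for i, repeats in enumerate([3, 2]):    # the alternating cadence
    print(f"frame {i}: held {repeats} refreshes = {repeats * refresh_ms:.1f} ms")

# Even pacing would be 1000/24 = ~41.7 ms per frame; the 50/33 ms alternation
# is the judder you notice on slow pans. A true 24Hz mode (or frame-rate
# conversion in the renderer) avoids it.
print(f"even pacing: {1000 / FILM_FPS:.1f} ms per frame")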

The video card does nothing by default without software.
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
For video only (moderate or no 3D gaming) a 6450 will easily handle the job. The decoding hardware is actually fairly robust; it's the compute that is lacking. Aside from mild effects, madVR will probably choke, as will any new game beyond Source engine titles.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
Depends. If you go the MPC-HC/madVR route, it can be made to use every last bit of GPU power you can possibly throw at it.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,709
1,450
126
Posts in this thread deserve my attentive study. But a lot on my plate today!

Lemme suggest this thought.

At least Live TV and some features of Media Center are really low-level processes. And in the wider category of recent topics, switching between displays has quirks regardless of the hardware or how you do it. I have multiple PCs providing multiple instances of Media Center with multiple monitor outputs.

I should think that an iGPU of some kind would serve just fine for Media Center output. It is also quite possible to run multiple GPUs without SLI, or SLI/XFire with an additional graphics adapter not part of the SLI/XFire mix.

What would be the prospects, and what would be the quirks of setting it up?
 

XMan

Lifer
Oct 9, 1999
12,513
49
91
I have HD4600 in mine and it's been perfectly sufficient for everything from HD recorded TV to 3D Blu-rays.