Non-Gamer: High-Quality Video GPU for HD

trclac

Member
Jul 21, 2012
85
0
0
I don't play games, but I'm looking to replace my three-year-old graphics card with a more recent model. My primary graphics need is watching HD and Blu-ray movies on the computer, so I need a card that can handle those demands. But since I'm not a gamer, I doubt I need to spend anywhere close to $400 on a high-quality card that produces great video. I've gone through several posts, but I didn't see specific recommendations for users who mainly do normal computing work and watch high-quality HD/Blu-ray movies. I have a high-end monitor so I can get the best picture quality possible in HD, but the card is also essential.

Any recommendations would be much appreciated (and, yes, I've recently rebuilt the system and am looking to upgrade this component as well -- I am using the Asus P8Z77-V Deluxe MB with the i5-3570 CPU, 16GB RAM, and a 750W PSU).

Thanks!!!
 
Last edited:

Reversed

Member
Aug 9, 2012
29
0
0
You don't really need a discrete video card just for watching movies. Your integrated HD 4000 will do this just fine, assuming you have an i5-3570K, depending on your resolution of course. Normally though, integrated graphics are sufficient.
 
Last edited:

trclac

Member
Jul 21, 2012
85
0
0
You don't really need a discrete video card just for watching movies; your integrated HD 2500 will do this just fine, depending on your resolution of course. Normally though, integrated graphics are sufficient.

Thanks. I was thinking the same thing, especially with these new processors, but a guy in IT at work swears by discrete cards in all cases (whether it's for simple computing or HD video). He gave me some reason, but I don't recall what it was (been a few weeks since I spoke to him).
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
The HD4000 in the new CPUs from Intel is pretty much a discrete card on the CPU. It does everything a basic AMD or Nvidia card would: HDMI output, HD video encoding and decoding, gaming.
 

FalseChristian

Diamond Member
Jan 7, 2002
3,322
0
71
Your IGP is more than enough for non-gaming use. Also, the HD 4000 can even play games like Crysis at low resolutions such as 1024x768 with everything set to Enthusiast.
 

trclac

Member
Jul 21, 2012
85
0
0
Thanks, all!!! And thanks for helping me save the money on the discrete card that the guy in IT said I need! :)
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I'll echo what the rest have already said: the IGP is more than enough. In fact, I have an HTPC with a Core 2 Duo and a discrete GPU (5450), and I've considered going to IB so I can get rid of the card lol. That IT guy has no good reason for his recommendation.
 

TakeNoPrisoners

Platinum Member
Jun 3, 2011
2,599
1
81
The new HD4000 GPUs have destroyed the under-$100 video card market; really, there is no reason to buy a discrete GPU if your budget is under $100, as it sweeps the floor with everything at that price. The only reason to spend more is if you are a serious gamer or need more GPU muscle for a specific application.
 

slugg

Diamond Member
Feb 17, 2002
4,723
80
91
Intel integrated is actually ideal for the OP's usage: dedicated transistors just for video encoding, decoding, and transcoding, and lower power consumption than a discrete GPU. In my eyes, that's actually better than a discrete solution for the OP's application!
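
To make that concrete, here's a minimal sketch (assuming an ffmpeg build compiled with Quick Sync/QSV support; the file names and bitrate are placeholders, and this is just Python shelling out to ffmpeg, not something the OP strictly needs for playback):

import subprocess

# Hypothetical transcode that decodes AND encodes on Intel's fixed-function
# video hardware (Quick Sync) via ffmpeg's QSV paths. Requires an ffmpeg
# build with QSV support; adjust paths and bitrate to taste.
subprocess.run([
    "ffmpeg",
    "-hwaccel", "qsv",        # hardware-accelerated decode
    "-i", "input_1080p.mkv",
    "-c:v", "h264_qsv",       # hardware H.264 encode on Quick Sync
    "-b:v", "8M",
    "-c:a", "copy",           # pass the audio through untouched
    "output.mp4",
], check=True)

The point of the fixed-function block is that a job like this barely touches the x86 cores, which is exactly why it suits an HTPC-style build.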
 

palladium

Senior member
Dec 24, 2007
539
2
81
A guy in IT at work swears by discrete cards in all cases (whether it's for simple computing or HD video). He gave me some reason, but I don't recall what it was (been a few weeks since I spoke to him).

Just shows how bad Intel's IGP reputation was prior to SB.

Back on topic: the OP is better off spending the money on a sound card (assuming he hasn't got one already).
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Thanks. I was thinking the same thing, especially with these new processors, but a guy in IT at work swears by discrete cards in all cases (whether it's for simple computing or HD video). He gave me some reason, but I don't recall what it was (been a few weeks since I spoke to him).

Sounds like he works in the wrong field.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Just shows how bad Intel's IGP reputation was prior to SB.

Back on topic: the OP is better off spending the money on a sound card (assuming he hasn't got one already).

I'd say it goes to show that the term "IT guy" should be taken with a bucket of salt. SB is already well over a year old; any half-decent IT guy would be aware of its capabilities and recognize it's plenty for the OP's needs.

It's good that the OP has enough sense not to take that advice at face value and took the initiative to seek more opinions.

I'd say the money is better left... unspent. Unless he's an audiophile with a nice speaker system, a sound card really isn't needed IMO. If he's outputting to a receiver it's certainly not needed; he can use HDMI out for that.

If you're recording, on the other hand, a good sound card is a must.
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
The new HD4000 GPUs have destroyed the under-$100 video card market; really, there is no reason to buy a discrete GPU if your budget is under $100, as it sweeps the floor with everything at that price. The only reason to spend more is if you are a serious gamer or need more GPU muscle for a specific application.


That's a slight exaggeration.
You can get a 6670 for $49 (brand new) that's a lot faster than the HD 4000 (the IGP in Intel's Ivy Bridge chips).

The fact that the HD 4000 can give something like a GeForce 520 a run for its money makes low-end cards like the 520 useless, though. That doesn't mean there aren't sub-$100 video cards that can kick the HD 4000's arse in terms of performance.

I mean, there are 7750s going for less than $100, and they're many times faster than the HD 4000.
 

Sonikku

Lifer
Jun 23, 2005
15,914
4,954
136
The new HD4000 GPUs have destroyed the under-$100 video card market; really, there is no reason to buy a discrete GPU if your budget is under $100, as it sweeps the floor with everything at that price. The only reason to spend more is if you are a serious gamer or need more GPU muscle for a specific application.

Would it be better to get an Ivy Bridge CPU and use its integrated graphics, or a Sandy Bridge Pentium CPU with a 6670 video card, for WoW?
 

AnonymouseUser

Diamond Member
May 14, 2003
9,943
107
106
Would it be better to get an Ivy Bridge CPU and use its integrated graphics, or a Sandy Bridge Pentium CPU with a 6670 video card, for WoW?

No. If you want an integrated GPU, get one of the new AMD Trinity APUs. Though the HD 4000 isn't compared in that review, it is compared to the HD 3000 here, and it's ~50% faster than the HD 3000, which gets stomped by the AMD APUs. Even the lowly ~$50 GT 430 is roughly 60 to 100% faster than the HD 4000.
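
Chaining those rough ratios together (back-of-envelope only; the multipliers below are just the review figures quoted above, normalized to the HD 3000, not benchmarks I ran):

# Back-of-envelope relative performance from the ratios quoted above
# (HD 3000 normalized to 1.0; ballpark review figures only).
hd3000 = 1.0
hd4000 = 1.5 * hd3000        # "~50% faster than the HD 3000"
gt430_low = 1.6 * hd4000     # "60 ... faster than the HD 4000"
gt430_high = 2.0 * hd4000    # "... to 100% faster"
print(f"GT 430 is roughly {gt430_low:.1f}x to {gt430_high:.1f}x the HD 3000")
# -> roughly 2.4x to 3.0x the HD 3000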
 
Last edited:

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I wouldn't give up massive CPU performance for a small bump in the IGP. IB > Trinity.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
234
106
Intel VGA drivers... are still inferior, though. Simpler? Cheaper? Yes!

Better? No way.

1) CPU utilization is higher with the Intel IGP:

[image: CPU utilization comparison, integrated vs. discrete graphics]


2) Kepler includes an actual hardware chip dedicated to H.264 encoding that rivals Intel Quick Sync:

[image: H.264 encode performance, NVENC vs. Quick Sync]


NVENC

All Kepler GPUs also incorporate a new hardware-based H.264 video encoder, NVENC. Prior to the introduction of Kepler, video encoding on previous GeForce products was handled by encode software running on the GPU's array of CUDA Cores. While the CUDA Cores were able to deliver tremendous performance speedups compared to CPU-based encoding, one downside of using these high-speed processor cores to process video encoding was increased power consumption. By using specialized circuitry for H.264 encoding, the NVENC hardware encoder in Kepler is almost four times faster than our previous CUDA-based encoder while consuming much less power.

It is important to note that an application can choose to encode using both NVENC hardware and NVIDIA's legacy CUDA encoder in parallel, without negatively affecting each other. However, some video pre-processing algorithms may require CUDA, and this will result in reduced performance from the CUDA encoder since the available CUDA Cores will be shared by the encoder and pre-processor.

NVENC provides the following:

  • Can encode full HD resolution (1080p) videos up to 8x faster than real-time. For example, in high performance mode, encoding a 16-minute 1080p, 30 fps video will take approximately 2 minutes.

  • Support for H.264 Base, Main, and High Profile Level 4.1 (same as the Blu-ray standard)

  • Support for MVC (Multiview Video Coding) for stereoscopic video, an extension of H.264 used for Blu-ray 3D

  • Up to 4096x4096 encode

We currently expose NVENC through proprietary APIs, and provide an SDK for development using NVENC. Later this year, CUDA developers will also be able to use the high-performance NVENC video encoder. For example, you could use the compute engines for video pre-processing and then do the actual H.264 encoding in NVENC. Alternatively, you can choose to improve overall video encoding performance by running simultaneous parallel encoders in CUDA and NVENC, without affecting each other's performance.

NVENC enables a wide range of new use cases for consumers:

  • HD videoconferencing on mainstream notebooks

  • Sending the contents of the desktop to the big screen TV (gaming, video) through a wireless connection

  • Authoring high-quality Blu-ray discs from your HD camcorder

A beta version of CyberLink MediaEspresso with NVENC support is now available on the GeForce GTX 680 press FTP. Support will be coming soon for CyberLink PowerDirector and ArcSoft MediaConverter.
Choose wisely.
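
And the whitepaper's headline figure is easy to sanity-check (a trivial sketch; the numbers are the whitepaper's own example, nothing measured here):

def encode_minutes(video_minutes: float, speed_vs_realtime: float) -> float:
    # Wall-clock encode time for an encoder running at a multiple of real-time.
    return video_minutes / speed_vs_realtime

# The whitepaper's example: a 16-minute 1080p/30fps clip at 8x real-time.
print(encode_minutes(16, 8))  # -> 2.0 minutes, matching the quoted figure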
 
Last edited:

lehtv

Elite Member
Dec 8, 2010
11,897
74
91

1) CPU utilization is higher with the Intel IGP:

How is this surprising...? The IGP is on the CPU die, so naturally CPU utilization goes up with integrated graphics.

2) Kepler includes an actual hardware chip dedicated to H.264 encoding that rivals Intel Quick Sync:

[image: H.264 encode performance, NVENC vs. Quick Sync]

I don't get this either. Looks to me like Intel Quick Sync is a winner here, same performance for a fraction of the cost.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
234
106
How is this surprising...? The IGP is on the CPU die, so naturally CPU utilization goes up with integrated graphics.
Well, many people were bashing non-Intel GPUs. This is one test to show that not everything's rosy in Intel's backyard.

I don't get this either. Looks to me like Intel Quick Sync is a winner here, same performance for a fraction of the cost.
There is a big improvement over the last generation. The point is, you no longer need to access the onboard GPU (through Virtu) if you have a modern discrete card and plan to encode your home video collection for the iPad. It can encode just as fast.
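
For example (a hypothetical sketch, assuming a recent ffmpeg build with h264_nvenc compiled in; file names, bitrate, and scaling are placeholders), an iPad-friendly encode on the discrete card follows the same pattern as the Quick Sync command earlier in the thread, just with a different encoder name:

import subprocess

# Hypothetical: shrink a camcorder clip for the iPad using the discrete
# card's NVENC hardware encoder. Requires ffmpeg built with h264_nvenc.
subprocess.run([
    "ffmpeg", "-i", "home_video.mts",
    "-vf", "scale=-2:720",       # downscale to 720p, keeping aspect ratio
    "-c:v", "h264_nvenc",        # NVENC hardware H.264 encoder
    "-b:v", "4M",
    "-c:a", "aac", "-b:a", "160k",
    "ipad_video.mp4",
], check=True)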

If you intend to game with the IGP, then Trinity > IB. Intel graphics are still the bottom feeders.
True. The HD 4000 is on par with the GeForce 520, which wasn't designed for games; if you want to game, compare it with something more powerful. But with any GeForce you get superior drivers and reliable multi-monitor support. Something to consider for stock traders.
 
Last edited:

Tango11

Junior Member
Oct 5, 2012
3
0
0
Is the HD4000 adequate for HD video editing, or do I need a discrete video card? I won't be playing any games.

I understand that some editing software, such as Sony Movie Studio, can benefit from a discrete video card for rendering, but does that apply only to the final render (rather than previewing during editing)? If I've spent a few hours filming and editing, I don't mind waiting 15 minutes rather than 10 for the final output to render.

I'm thinking of getting an i7-3770K overclocked to 4.2GHz (along with 16GB RAM and a 120GB SSD). Initially I thought I would need a video card such as a GTX 660 Ti, but these forums seem to suggest that a powerful video card is needed only for gaming, and that the HD4000 on an i7 should be adequate for HD video editing.
 

trclac

Member
Jul 21, 2012
85
0
0
Is the HD4000 adequate for HD video editing, or do I need a discrete video card? I won't be playing any games.

I understand that some editing software, such as Sony Movie Studio, can benefit from a discrete video card for rendering, but does that apply only to the final render (rather than previewing during editing)? If I've spent a few hours filming and editing, I don't mind waiting 15 minutes rather than 10 for the final output to render.

I'm thinking of getting an i7-3770K overclocked to 4.2GHz (along with 16GB RAM and a 120GB SSD). Initially I thought I would need a video card such as a GTX 660 Ti, but these forums seem to suggest that a powerful video card is needed only for gaming, and that the HD4000 on an i7 should be adequate for HD video editing.

In addition to my original post about the best solution for watching HD video and Blu-ray (and I'm not a gamer), the question above is also an issue for me. Would the advice in this thread be the same for a non-gamer who wants to edit HD video captured with a camcorder, etc.?

Thanks!