Will NVIDIA's 8800 refresh bring PureVideo HD to the high-end cards?

yacoub

Golden Member
May 24, 2005
1,991
14
81
I'd rather wait for the 8800GTS and GTX refresh due out by September, if they'll finally add PureVideo HD to the high-end cards.

I'd hope they add in the features that were left out (probably because they weren't ready last November, though by the time the 8600/8500 series launched they were). I assume that's the only reason PVHD didn't make it into the 8800GTS/GTX.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I read somewhere that the 8800 series was powerful enough to not need an extra decoder on the card which is why they don't have it. Same for the HD2900XT.
 

broly8877

Senior member
Aug 17, 2004
461
0
0
I'd say the case is that whoever buys an 8800/2900 card is overwhelmingly likely to own a CPU that can handle HD content. G84's acceleration is leagues better than G80's
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: mercanucaribe
I don't have any Purevideo and I can watch HD movies. What's the big deal??

Because you might want to rip and encode 2 dvds while watching HD content, apparently.

;)
 

ra990

Senior member
Aug 18, 2005
359
0
76
There is huge confusion about PureVideo HD. Fact is, every video card since the GeForce 7 series has PureVideo HD, including the 8800 chips. What the 8500/8600 have that the 8800 doesn't is the VP2 processor, the newer version of nvidia's video processor, which offers enhanced acceleration over the original.
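
For anyone who wants to check what their own card/driver combo actually advertises, here's a minimal sketch (mine, not something posted in this thread) that asks DXVA2 to enumerate the decoder profiles the driver exposes and looks for the H.264 VLD profile, i.e. full-bitstream decode on the GPU. It assumes Windows with the DirectX SDK headers and links against d3d9.lib and dxva2.lib:

```cpp
// Minimal DXVA2 capability probe: list what the driver advertises and
// check for H.264 VLD (full bitstream) decode. Illustrative sketch only.
#include <windows.h>
#include <objbase.h>
#include <initguid.h>   // before dxva2api.h so the DXVA GUIDs get defined here
#include <d3d9.h>
#include <dxva2api.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // A throwaway windowed device is enough to create the video service.
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;
    pp.hDeviceWindow = GetDesktopWindow();

    IDirect3DDevice9* dev = NULL;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                 GetDesktopWindow(),
                                 D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                 &pp, &dev))) {
        d3d->Release();
        return 1;
    }

    IDirectXVideoDecoderService* svc = NULL;
    if (SUCCEEDED(DXVA2CreateVideoService(dev, IID_IDirectXVideoDecoderService,
                                          (void**)&svc))) {
        UINT count = 0;
        GUID* guids = NULL;
        if (SUCCEEDED(svc->GetDecoderDeviceGuids(&count, &guids))) {
            bool h264vld = false;
            for (UINT i = 0; i < count; ++i)
                if (IsEqualGUID(guids[i], DXVA2_ModeH264_VLD_NoFGT))
                    h264vld = true;  // driver offers full H.264 decode
            printf("decoder profiles: %u, H.264 VLD: %s\n",
                   count, h264vld ? "yes" : "no");
            CoTaskMemFree(guids);
        }
        svc->Release();
    }
    dev->Release();
    d3d->Release();
    return 0;
}
```

If the driver doesn't report the VLD GUID, the player has to do the bitstream decode on the CPU and the card only helps with the later stages.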
 

secretanchitman

Diamond Member
Apr 11, 2001
9,352
23
91
Originally posted by: broly8877
I'd say the case is that whoever buys an 8800/2900 card is overwhelmingly likely to own a CPU that can handle HD content. G84's acceleration is leagues better than G80's

i second that. if you get an 8800, you're likely to have at least a high-end athlon64 or some sort of dual core (athlon64 X2/opteron, core 2 duo).

besides, my 8800GTS 320MB and my pentium D 930 play 1080p just fine, as does my opteron 170 @ 2.6GHz with a 7900GT. both systems do have 2GB of ram, though.
 

customcoms

Senior member
Dec 31, 2004
325
0
0
I would agree with the comments posted above in that if you have an 8800 series card you probably don't NEED full HD decode ability, because the rest of the system is plenty fast enough (or it should be, otherwise you can mail me your card). However, I think what many people are concerned with is the ability to multitask. If 50%+ of the ram and cpu power is being used for HD decoding/acceleration, vs. the 10-20% used with full offload onto the gpu, you can't decode that dvd, compress gigs of data, or run other background tasks. Now, it's probably not that big of a deal since you will be using your computer to watch the movie, but for some it might matter.
 

Chadder007

Diamond Member
Oct 10, 1999
7,560
0
0
Originally posted by: cmdrdredd
I read somewhere that the 8800 series was powerful enough to not need an extra decoder on the card which is why they don't have it. Same for the HD2900XT.

Then again, I wonder if it takes more power for the CPU to decode HD or if it takes more power for the GPU to do so??? Has anyone done a total system power usage test to see which one draws more power while decoding yet?
 

nanaki333

Diamond Member
Sep 14, 2002
3,772
13
81
i suppose if you want to pair a powerful GPU with a celeron 2.66, then it would make sense to put the decoder on the video card.
 

gramboh

Platinum Member
May 3, 2003
2,207
0
0
Originally posted by: Chadder007
Originally posted by: cmdrdredd
I read somewhere that the 8800 series was powerful enough to not need an extra decoder on the card which is why they don't have it. Same for the HD2900XT.

Then again, I wonder if it takes more power for the CPU to decode HD or if it takes more power for the GPU to do so??? Has anyone done a total system power usage test to see which one draws more power while decoding yet?

I don't have a Blu-ray/HD-DVD player or any x264/h264 video content to test with, but when I watch a 1080p Quicktime movie trailer (not sure how they're encoded or what the bitrate is; probably low, given the file sizes) I get about 20-30% CPU usage on my C2D E6600 @ 3.2GHz with an 8800GTS. Not sure if the video card is doing any work.
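
If anyone wants to repeat that measurement a bit more rigorously than eyeballing Task Manager, here's a quick sketch (my own example, assuming Windows; not anything gramboh actually ran) that samples whole-system CPU load once a second via the Win32 GetSystemTimes call. Run it while the trailer plays with acceleration on and off. It still won't answer Chadder007's power question, though; that needs a meter at the wall:

```cpp
// Crude whole-system CPU load sampler using GetSystemTimes (Win32).
// Prints one reading per second; compare readings with and without
// hardware-accelerated playback. Illustrative sketch only.
#include <windows.h>
#include <cstdio>

// FILETIME -> 64-bit tick count (100 ns units).
static unsigned long long ToU64(const FILETIME& ft) {
    ULARGE_INTEGER u;
    u.LowPart  = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    return u.QuadPart;
}

int main() {
    FILETIME idle1, kern1, user1, idle2, kern2, user2;
    for (;;) {
        GetSystemTimes(&idle1, &kern1, &user1);
        Sleep(1000);  // 1 second sampling window
        GetSystemTimes(&idle2, &kern2, &user2);

        // Kernel time includes idle time, so subtract idle to get busy time.
        unsigned long long idle  = ToU64(idle2) - ToU64(idle1);
        unsigned long long total = (ToU64(kern2) - ToU64(kern1))
                                 + (ToU64(user2) - ToU64(user1));
        if (total)
            printf("CPU load: %.1f%%\n", 100.0 * (total - idle) / total);
    }
}
```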
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Originally posted by: broly8877
I'd say the case is that whoever buys an 8800/2900 card is overwhelmingly likely to own a CPU that can handle HD content. G84's acceleration is leagues better than G80's

I wouldn't doubt that you're right, but I still don't like that nVidia hasn't released drivers that enable this for 8800GT(S, X) owners (I am one). I may have an E6600 in my PC, but I paid for a GPU that I expected to perform video decoding, and I got one that doesn't. And we pay a premium price for our cards, yet we don't get premium features?

Personally, I feel a bit short-changed on the deal.
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,185
520
126
Originally posted by: cmdrdredd
I read somewhere that the 8800 series was powerful enough to not need an extra decoder on the card which is why they don't have it. Same for the HD2900XT.

Which is pure bull. The whole point of PureVideo is to let you offload the operations to the graphics card instead of running them on the CPU. The new HD-DVD and Blu-Ray codecs will completely eat up any CPU, including quad cores, if you have to run them in software rather than hardware.

Even though the 8800 is powerful enough, without software written to tell the card it can do the decode in hardware, the decoding won't run on the 8800 at all. That puts the stress on the rest of your system, even though there is perfectly good hardware sitting there idle that could do it.
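
To make that concrete, here's a purely illustrative sketch of the decision every player effectively makes. The function names are hypothetical, and the stub capability check stands in for a real driver query (like the DXVA2 probe sketched earlier in the thread):

```cpp
// Hypothetical decode-path selection, for illustration only.
#include <cstdio>

// Stand-in for a real capability query. Without player/driver software
// actually written to ask for (and use) the hardware path, this is
// effectively always false -- which is the point above.
static bool HardwareH264DecoderAvailable() {
    return false;  // stub: a real player queries the driver here
}

static void DecodeOnGpu() { std::printf("GPU decode: CPU mostly idle\n"); }
static void DecodeOnCpu() { std::printf("software decode: CPU does it all\n"); }

int main() {
    if (HardwareH264DecoderAvailable())
        DecodeOnGpu();
    else
        DecodeOnCpu();  // the 8800's capable hardware sits unused
}
```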
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Aikouka
Originally posted by: broly8877
I'd say the case is that whoever buys an 8800/2900 card is overwhelmingly likely to own a CPU that can handle HD content. G84's acceleration is leagues better than G80's

I wouldn't doubt that you're right, but I still don't like that nVidia hasn't released drivers that enable this for 8800GT(S, X) owners (I am one). I may have an E6600 in my PC, but I paid for a GPU that I expected to perform video decoding, and I got one that doesn't. And we pay a premium price for our cards, yet we don't get premium features?

Personally, I feel a bit short-changed on the deal.
Huh? Hardware-accelerated H.264 decoding is working fine on my 8800GTX.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Leinad
Based upon Mike's Hardware Roadmap it looks like the 8900 is due next month. Anyone have any insight on this one?

I sure would like to know where "Mike" got this info from. I haven't heard a peep about any 8900 making it to retail anytime soon.
 

gobucks

Golden Member
Oct 22, 2004
1,166
0
0
i can almost guarantee nvidia's refresh will have it. it seems to be how it always works for them - they release their high-end cards first; after that, they decide on some great new bit of hardware acceleration, and it is included on their midrange cards. Then all cards of the next generation have it, including the high end, but then their new midrange cards have something even better, etc. This has happened several times before - the 6600, 7600, and 8600 all had features that their earlier high-end siblings didn't have.

I think this is especially true now that nvidia knows ATI didn't include this feature on their just-released 2900-series cards. The HD2900XT was supposed to have this capability but didn't, so I'm sure nvidia is salivating at the opportunity to kick ATI while it's down.