Nvidia finally *gasps* brings what was promised.


EightySix Four

Diamond Member
Jul 17, 2004
5,122
52
91
Even after purchasing the PureVideo decoders, I could never get WMV-HD decoding working. DVDs work fine, but WMV-HD doesn't, and yes, my card is a PCIe card.
 
Mar 19, 2003
18,289
2
71
Originally posted by: crazySOB297
Originally posted by: SynthDude2001
Originally posted by: crazySOB297
Even after purchasing the PureVideo decoders, I could never get WMV-HD decoding working. DVDs work fine, but WMV-HD doesn't, and yes, my card is a PCIe card.

http://support.microsoft.com/kb/888656/en-us

Been there done that

Sorry, I forget exactly what I did to get it working before. I just installed that now and it had no impact on my CPU usage; of course, I'd forgotten that I didn't have it installed in the first place, since my new CPU is a lot faster than my old one. :p

Maybe if I dig up some of my old posts I can remember how I got it working on my old Athlon XP...
 

EightySix Four

Diamond Member
Jul 17, 2004
5,122
52
91
That would be nice, thanks... I've installed the patches and such, to no avail... I do get PureVideo decoding on DVDs, though, and those look great.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: nts
Originally posted by: Gstanfor
Dug, why would GPU speed be irrelevant if there were a separate video processor? It is still on die, and has to be clocked at some rate. The known rates for the 6800 GT are 300 MHz (2D operation), 350 MHz (3D operation) and 400 MHz (ROP output operations).

It doesn't have to run at the same speed as the core, though, if it's a separate processor.

My assumption was that they would have their video processor run at some other common speed (common to all chips).


I agree with nts. The video processor may well be clocked differently per card, in a logical order: 6800 series > 6600 series, for example. It's also good to see the 7 series doesn't have a borked video processor, unlike the NV40.

BTW, shouldn't we be rejoicing instead of flaming each other?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
No doubt the video processor would be clocked differently for the 6600 and above; these came later than NV40, and nVidia added features/revisions to them that NV40 missed out on. Full-blown discrete clock domains didn't appear until G70, though.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Sigh.

I'd think this topic had been covered enough that we wouldn't have any more confusion about it. Nonetheless:

Originally posted by: nts
So which ones have the broken video engine? (6800/GT/Ultra?)

Here you go nts:
http://www.nvidia.com/page/purevideo_support.html

The short answer is that all 6800 Ultras, 6800 GTs, and AGP 6800 NUs do not support WMV9 decode acceleration, but do support all other PureVideo features.

Originally posted by: dug777
so is this true, do we have to PAY for PureVideo to use this stuff? If that's the case, it's TOTALLY unacceptable IMO :|
While your position does not surprise me, it should be noted that nVidia/ATI are charged for the MPEG decoders.
That cost could be built into the price of the cards, so that even people who never use the feature pay for it, or it can be passed on to consumers as they require it.
Since the decoders cost ATI and nVidia money, you will be charged for them either way: openly, as now, or as a hidden cost.
Believe it or not, gpu manufacturers don't often buy millions of people gifts.

Originally posted by: SynthDude2001
I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....

I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.
You need to read page three of the tech brief linked in the Additional Info box here.
It certainly does say there is a discrete video processor.
Beyond that, if they were using the pixel shaders like ATI, don't you think they would on the 6800 GTs and 6800 Ultras? Last I looked, those have pixel shaders, and people certainly wanted to decode WMV9 with them.



This post is interesting to me for two reasons:
1. I wish I still had the PCIe 6800 NUs; I'd see how much difference clock speed makes.
2. I was unaware different models of ATI cards supported different resolutions.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: dug777
Originally posted by: Gstanfor
Originally posted by: SynthDude2001
Originally posted by: dug777
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

interesting point...

I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....

I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.

:disgust:
From the quoted link in the first post:
Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.

then why this bit, before you get all unpleasant with people, mate,

'The quality of the acceleration depends directly on the speed of the GPU. At 300MHz, the acceleration is going to be minimal. As you scale on up towards 500MHz GPU clock speed, acceleration gets to the point where less than 50% CPU is being used to deliver full speed playback. '

?

He's just posing a question is all, not criticising your precious nvidia ;)

To me, a separate PVP would mean GPU speed should be effectively irrelevant...

It's a processor, dug. Like any other processor in the world: AMD, Intel, Sun, Motorola, PowerPC, ATI, Nvidia. The faster the core clock, the more work it can do in the same amount of time; otherwise nobody would ever overclock anything.
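Keysplayr's point about clock scaling matches the article's two rough data points (minimal acceleration near 300 MHz, under 50% CPU use around 500 MHz). As a toy sketch, not NVIDIA's actual behavior, one can linearly interpolate between them; the 90% figure anchoring the low end is an assumption for illustration only:

```python
def estimated_cpu_usage(gpu_mhz: float) -> float:
    """Toy linear model of playback CPU usage (%) vs. GPU core clock (MHz).

    Fitted to the article's two rough data points:
      ~300 MHz -> "minimal" acceleration (assumed ~90% CPU here)
      ~500 MHz -> "less than 50% CPU"
    """
    x1, y1 = 300.0, 90.0   # assumed low-end anchor (not from the article)
    x2, y2 = 500.0, 50.0   # quoted "less than 50% CPU" point
    slope = (y2 - y1) / (x2 - x1)       # -0.2 CPU percentage points per MHz
    usage = y1 + slope * (gpu_mhz - x1)
    return max(0.0, min(100.0, usage))  # clamp to a sane 0-100% range

# The known 6800 GT clocks mentioned earlier in the thread:
for clock in (300, 350, 400, 500):
    print(f"{clock} MHz -> ~{estimated_cpu_usage(clock):.0f}% CPU during playback")
```

Under this made-up model the 350 MHz 3D clock lands around 80% CPU, which is at least consistent in spirit with nVidia's "don't expect much acceleration at 350 MHz" line quoted later in the thread.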

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: SynthDude2001
Originally posted by: Gstanfor
Originally posted by: SynthDude2001
Originally posted by: dug777
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

interesting point...

I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....

I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.

:disgust:
From the quoted link in the first post:
Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.

I'm quite capable of reading. ;)

I'm just wondering, if there really was a video processor there in the first place and it was claimed to be broken in 2004 (for certain cards), what has changed? My 6800GT with its apparently broken PVP (so they say) sure seems to play 1080p WMV-HD and 1080i transport streams with very low CPU usage and no dropped frames... (Edit: keeping in mind that WMV-HD was the very thing claimed to be broken in the first place)

Wasn't it supposed to be WMV9 that NV40 had problems with? Or is that the same thing as saying 1080p WMV-HD? Dunno.
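The Avivo resolution tiers quoted in this sub-thread (X1300 at 480p, X1600 at 720p, X1800 at 1080p) reduce to a small lookup table. A sketch follows; only the three card/resolution pairs come from the quoted article, while the function around them is assumed:

```python
# Maximum H.264 vertical resolution per Radeon X1000 card,
# as quoted from the article: X1300 -> 480p, X1600 -> 720p, X1800 -> 1080p.
AVIVO_H264_LIMITS = {"X1300": 480, "X1600": 720, "X1800": 1080}

def can_decode_h264(card: str, resolution_p: int) -> bool:
    """True if the card's quoted ceiling covers the requested resolution."""
    limit = AVIVO_H264_LIMITS.get(card)
    return limit is not None and resolution_p <= limit

print(can_decode_h264("X1600", 720))   # 720p fits the X1600's quoted ceiling
print(can_decode_h264("X1300", 1080))  # 1080p exceeds the X1300's ceiling
```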

 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: dug777
Originally posted by: Gstanfor
Originally posted by: SynthDude2001
Originally posted by: dug777
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

interesting point...

I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....

I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.

:disgust:
From the quoted link in the first post:
Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.

then why this bit, before you get all unpleasant with people, mate,

'The quality of the acceleration depends directly on the speed of the GPU. At 300MHz, the acceleration is going to be minimal. As you scale on up towards 500MHz GPU clock speed, acceleration gets to the point where less than 50% CPU is being used to deliver full speed playback. '

?

He's just posing a question is all, not criticising your precious nvidia ;)

To me, a separate PVP would mean GPU speed should be effectively irrelevant...

Dug, actually you have it correct. The video processor was "supposed" to be a discrete processor within a processor, so to speak. It's "allegedly" capable of a separate clock speed from the host GPU and uses ~20M transistors of the die.

However, after finding they had issues with the NV40 chip:
NVIDIA also confirmed to us that NV40-based cards do "load balancing" between the video processor and pixel shaders for some video processing tasks, although we didn't get into the nuts and bolts of which computations were handled by the CPU, the video processor, and the pixel shaders. That's just for decoding, as far as I know. Anand did a nice write-up about this problem, and he said that NVIDIA wouldn't answer his questions about whether the NV40 will do any hardware encoding.
Link

Read that carefully... What Nvidia is doing "NOW" is using the shader pipeline in conjunction with the PVP to process video. Nvidia isn't forthcoming about what is doing what and how it's doing it, so you have to "guess" what is really going on. Nobody here knows, that's for certain... anyone who truly does is under NDA.

Obviously, if the different ranges of new GPUs (NV4x, G7x) have different capabilities, then it would "seem" to come from this "load balancing", as the PVP alone would be the same across the entire product line.

I've suspected that the reason the NV40 can't support WMV acceleration is related to the programmable nature of its shader pipeline, in addition to the fact that its PVP is broken and not programmable. If H.264 acceleration "IS" enabled for NV40, we know that assumption is at least partially wrong, but it does beg the question of why WMV and other MPEG-4 formats aren't available yet. And encoding... hello?

It is a very interesting situation.
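rbV5's "load balancing" description can be caricatured as a dispatch decision. This is pure speculation structured as code (as the post says, nobody outside NDA knows the real split); the chip check and the return labels are invented for illustration:

```python
def decode_path(chip: str, codec: str) -> str:
    """Speculative sketch of which unit handles decode, per the thread's claims.

    Per the thread: the PVP handles MPEG2 on all 6-series parts, NV40's WMV9
    path is broken, and NVIDIA "load balances" between the PVP and pixel
    shaders for some tasks. None of this is NVIDIA's real logic.
    """
    wmv9_broken = chip == "NV40"        # the 6800 GT/Ultra/AGP NU silicon
    if codec == "MPEG2":
        return "PVP"                    # fully accelerated everywhere
    if codec == "WMV9" and wmv9_broken:
        return "shaders+CPU"            # the fallback the thread is debating
    return "PVP+shaders"                # the "load balancing" the B3D quote allows

print(decode_path("NV40", "WMV9"))      # the disputed NV40 case
print(decode_path("NV43", "WMV9"))      # a 6600-series chip, assumed working
```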
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: rbV5
Link

Read that carefully... What Nvidia is doing "NOW" is using the shader pipeline in conjunction with the PVP to process video. Nvidia isn't forthcoming about what is doing what and how it's doing it, so you have to "guess" what is really going on. Nobody here knows, that's for certain... anyone who truly does is under NDA.

Obviously, if the different ranges of new GPUs (NV4x, G7x) have different capabilities, then it would "seem" to come from this "load balancing", as the PVP alone would be the same across the entire product line.

I've suspected that the reason the NV40 can't support WMV acceleration is related to the programmable nature of its shader pipeline, in addition to the fact that its PVP is broken and not programmable. If H.264 acceleration "IS" enabled for NV40, we know that assumption is at least partially wrong, but it does beg the question of why WMV and other MPEG-4 formats aren't available yet. And encoding... hello?

It is a very interesting situation.

Okay, last point first: WMV, MPEG4, encoding etc. will be available - just wait for FW85.xx.

As for the bit about pixel shaders, that was covered by the B3D article I quoted earlier.

While ATI have been mapping some of their video processing onto the shader core for some time, NVIDIA have decided not to do this, as they feel the instructions required for video processing do not lend themselves well to the instruction set in the pixel shader pipeline; thus a dedicated unit may be more optimal for this type of work. When running video processing through the shaders, the 3D core is active and consuming power as well, which may not be desirable in all situations, especially where mobile devices are concerned - the NV4x VP is a smaller unit dedicated to video processing, so it should require less power for video processing than utilising the shader core. This is not to say, however, that NVIDIA won't utilise the shader core in conjunction with the VP in some instances, should they choose to do so.
I'm guessing that "some instances" would include when the VP hardware is partially broken... In any case, MPEG2 (the format with the most consumer content) is fully hardware decoded by the VP, and I expect nV40 owners will be able to at least encode MPEG2 - given the popularity of DVD "backup" software among the general population, I suspect this won't be unappreciated.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Gstanfor
I'm guessing that "some instances" would include when the VP hardware is partially broken... In any case, MPEG2 (the format with the most consumer content) is fully hardware decoded by the VP, and I expect nV40 owners will be able to at least encode MPEG2 - given the popularity of DVD "backup" software among the general population, I suspect this won't be unappreciated.

If that is true, it would be great news for all the 6800GT and Ultra owners who want WMV9 decode acceleration.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
I'm guessing that "some instances" would include when the VP hardware is partially broken... In any case, MPEG2 (the format with the most consumer content) is fully hardware decoded by the VP, and I expect nV40 owners will be able to at least encode MPEG2 - given the popularity of DVD "backup" software among the general population, I suspect this won't be unappreciated.

I'm thinking MPEG2 is not fully decoded by the VP, actually. My AIW9700pro and 6800 had virtually the same CPU utilization when decoding MPEG2 DVDs using the PureVideo decoders and MCE 2005 (WMP10 rendering engine); the PureVideo decoder's property sheet clearly showed DXVA acceleration was being used for both cards. So DXVA does work on NV40; it just doesn't work for anything other than accelerating MPEG2, and that's the way it has been since the PureVideo decoders were released.

It was apparent that some VP features were working (deinterlacing, for instance), and I felt that the 6800 did have slightly better PQ; however, I questioned whether it was a placebo effect or whether I was actually seeing better PQ.

Anand's initial PureVideo vs. AVIVO test verified that it was indeed better at that point.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
If that is true, it would be great news for all the 6800GT and Ultra owners who want WMV9 decode acceleration.

ANY NV40 owners, including 6800GS AGP owners. It would also give Nvidia back some of the respect they lost by not delivering for so long. H264 on top of that would be awesome.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
From rbV5's link (I could find more if I could be bothered)
However, NVIDIA wasn't able to get everything working as planned in NV40 silicon, so the NV40 video processor cannot fully accelerate WMV, just MPEG2. Instead, it has to farm out encoding/decoding work to the CPU, as GeForce FX cards did.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: Gstanfor
From rbV5's link (I could find more if I could be bothered)
However, NVIDIA wasn't able to get everything working as planned in NV40 silicon, so the NV40 video processor cannot fully accelerate WMV, just MPEG2. Instead, it has to farm out encoding/decoding work to the CPU, as GeForce FX cards did.

It's nowhere near the 95% decode offload originally claimed by Nvidia for MPEG2; it's almost exactly the same CPU utilization as a DX9 Radeon card (which lacks the PVP) using the same decoders and software, and the PureVideo decoder property sheet itself says it's using DXVA... it looks like the shaders are doing the acceleration and the PVP is doing the post-processing.

I'd say it's using shaders to accelerate MPEG2, all right. Understand that using the GPU's shader pipeline is not the same as "offloading to the CPU"; however, it is most certainly offloading cycles to the GPU rather than fully accelerating with the video processor.
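rbV5's inference works like a simple differential test: if a card with a PVP shows roughly the same decode CPU load as one without, the PVP probably isn't doing the heavy lifting. A sketch with made-up utilization numbers and an arbitrary threshold:

```python
def pvp_likely_helping(cpu_with_pvp: float, cpu_without_pvp: float,
                       min_saving: float = 20.0) -> bool:
    """Crude heuristic: credit the PVP only if it saves at least min_saving
    percentage points of CPU versus a shader-only card.
    The 20-point threshold is arbitrary, chosen for illustration."""
    return (cpu_without_pvp - cpu_with_pvp) >= min_saving

# Hypothetical utilizations mirroring rbV5's "virtually the same" observation
# between an AIW 9700 Pro (no PVP) and a 6800 (PVP on board):
print(pvp_likely_helping(cpu_with_pvp=28.0, cpu_without_pvp=30.0))  # False
```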
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I guess that is why nVidia says "don't expect much acceleration at 350 MHz".

And about the shaders doing the acceleration, I'm sorry, but I'll believe nVidia and published sources before I believe an ATi beta software tester...

If shaders are being used, it should be possible to detect their usage via a tool like 3danalyze or similar.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: Gstanfor
I guess that is why nVidia says "don't expect much acceleration at 350mhz"

And about the shaders doing the acceleration, I'm sorry, but I'll believe nVidia and published sources before I believe an ATi beta software tester...

If shaders are being used, it should be possible to detect their usage via a tool like 3danalyze or similar.

I tested my own NV40 for months, including comparisons with my ATI cards using various software (I no longer have the card, however). DXVA is Microsoft's interface for using the GPU to accelerate video with the shader pipeline. I'm not an expert, but I've followed this for some time. I'm going off information gleaned from personal experience and published sources since March '04.
 

Steelski

Senior member
Feb 16, 2005
700
0
0
Originally posted by: Wreckage
Originally posted by: dug777
so is this true, do we have to PAY for PureVideo to use this stuff? If that's the case, it's TOTALLY unacceptable IMO :|
If you want to use H.264 on ATI AVIVO you have to pay.

HMMMM. No you don't!
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: rbV5
Originally posted by: Gstanfor
I guess that is why nVidia says "don't expect much acceleration at 350mhz"

And about the shaders doing the acceleration, I'm sorry, but I'll believe nVidia and published sources before I believe an ATi beta software tester...

If shaders are being used, it should be possible to detect their usage via a tool like 3danalyze or similar.

I tested my own NV40 for months, including comparisons with my ATI cards using various software (I no longer have the card, however). DXVA is Microsoft's interface for using the GPU to accelerate video with the shader pipeline. I'm not an expert, but I've followed this for some time. I'm going off information gleaned from personal experience and published sources since March '04.

So? There are lots of other people who have owned nV40 since release (me included). What personal information leads you to believe nV40 uses pixel shaders for MPEG2 decode? Also show your published sources.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: Gstanfor
Originally posted by: rbV5
Originally posted by: Gstanfor
I guess that is why nVidia says "don't expect much acceleration at 350mhz"

And about the shaders doing the acceleration, I'm sorry, but I'll believe nVidia and published sources before I believe an ATi beta software tester...

If shaders are being used, it should be possible to detect their usage via a tool like 3danalyze or similar.

I tested my own NV40 for months, including comparisons with my ATI cards using various software (I no longer have the card, however). DXVA is Microsoft's interface for using the GPU to accelerate video with the shader pipeline. I'm not an expert, but I've followed this for some time. I'm going off information gleaned from personal experience and published sources since March '04.

So? There are lots of other people who have owned nV40 since release (me included). What personal information leads you to believe nV40 uses pixel shaders for MPEG2 decode? Also show your published sources.

I've noticed that when using the Nvidia decoders, the property sheet was essentially the same between my two DX9 cards at the time: 1) AIW9700pro, 2) EVGA 6800. I thought I had a screenshot of it somewhere; I'll look and post it. To me it's really unclear what exactly is doing what, but I've read a number of articles like this:
.....Initially, DVD-playback software was forced, out of necessity, to interrogate the graphics subsystem and find out what GPU it contained and subsequently to include numerous GPU-specific routines that reflected each chip's hardware-acceleration capabilities. For example, early Nvidia chips had fewer MPEG-2 features than their ATI Technologies counterparts. (However, this gap has closed in recent years. Nvidia GeForce 6xxx and 7xxx chips, for example, contain three dedicated video engines for MPEG-2 decoding, motion estimation, and video processing. They also take advantage of the chips' shader processors for video functions.)....
Link

I'm not sure it even matters, other than that the decoding isn't transparent; it requires the same rendering pipeline as ATI: media > hardware > special decoder > enabled software > output to display.

I'm interested from a user standpoint; I'm looking forward to the H264 acceleration on NV6/7. It should provide some interesting insight into PureVideo, IMO.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Here's a screenshot of my PureVideo property sheet decoding MPEG2 in WMP10 with my Radeon card. Do you have one of an NV40, or another NV card using the NV DVD decoder in WMP10? Take a screenshot of the property sheet while decoding MPEG2 and post it for comparison. Link
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Uhhh, call me dumb if you want, but I can't see any proof of nVidia using pixel shaders in your image, rbV5...