EightySix Four
Diamond Member
- Jul 17, 2004
Even with the purchase of the PureVideo decoders I could never get WMV-HD decoding working. DVDs work fine, but WMV-HD doesn't, and yes, my card is a PCIe card.
Originally posted by: crazySOB297
Even with the purchase of the PureVideo decoders I could never get WMV-HD decoding working. DVDs work fine, but WMV-HD doesn't, and yes, my card is a PCIe card.
Originally posted by: SynthDude2001
Originally posted by: crazySOB297
Even with the purchase of the PureVideo decoders I could never get WMV-HD decoding working. DVDs work fine, but WMV-HD doesn't, and yes, my card is a PCIe card.
http://support.microsoft.com/kb/888656/en-us
Originally posted by: crazySOB297
Originally posted by: SynthDude2001
Originally posted by: crazySOB297
Even with the purchase of the PureVideo decoders I could never get WMV-HD decoding working. DVDs work fine, but WMV-HD doesn't, and yes, my card is a PCIe card.
http://support.microsoft.com/kb/888656/en-us
Been there done that
Originally posted by: nts
Originally posted by: Gstanfor
Dug, why would GPU speed be irrelevant if there were a separate video processor? It is still on die, and has to be clocked at some rate. The known rates for the 6800 GT are 300MHz (2D operation), 350MHz (3D operation) and 400MHz (ROP output operations).
It doesn't have to run at the same speed as the core though, if it's a separate processor.
My assumption was that they would have their video processor run at some other common speed (common to all chips).
Originally posted by: nts
So which ones have the broken video engine? (6800/GT/Ultra?)
Here you go nts:
http://www.nvidia.com/page/purevideo_support.html
The short answer is all 6800Ultras, 6800GTs, and AGP 6800NUs do not support WMV9 decode acceleration, but do support all other Pure Video features.
While your position does not surprise me, it could be noted that nVidia/ATI are charged for the MPEG decoders.
Originally posted by: dug777
so is this true, do we have to PAY for purevideo to use this stuff? If that's the case it's TOTALLY unacceptable imo :|
That cost either gets built into the price of the cards, so even those who never use the decoders still pay for them, or it gets passed on to consumers as they require it.
As the decoders cost ATI and nVidia money, you will be charged for them either way, whether up front as you are now, or as a hidden cost.
Believe it or not, GPU manufacturers don't often buy millions of people gifts.
You need to read page three of the tech brief linked in the Additional Info box here.
Originally posted by: SynthDude2001
I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....
I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.
It certainly does say there is a discrete video processor.
Beyond that, if they were using the pixel shaders like ATI, don't you think they would do so on the 6800GTs and 6800Ultras? Last I looked, those have pixel shaders too, and people certainly wanted to decode WMV9 with them.
This is an interesting post for two reasons to me:
1. Wish I still had the PCIe 6800NUs; I'd see how much difference clock speed makes.
2. I was unaware different models of ATI cards supported different resolutions.
Originally posted by: dug777
Originally posted by: Gstanfor
Originally posted by: SynthDude2001
Originally posted by: dug777
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...
So which ones have the broken video engine? (6800/GT/Ultra?)
interesting point...
I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....
I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.
:disgust:
From the quoted link in the first post:
Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.
then why this bit, before you get all unpleasant with people mate,
'The quality of the acceleration depends directly on the speed of the GPU. At 300MHz, the acceleration is going to be minimal. As you scale on up towards 500MHz GPU clock speed, acceleration gets to the point where less than 50% CPU is being used to deliver full speed playback. '
?
He's just posing a question is all, not criticising your precious nvidia!
To me a separate pvp would mean the gpu speed should be effectively irrelevant...
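The Avivo resolution tiers quoted above can be summarised as a simple lookup. This is purely illustrative: the card names and tiers come straight from the quoted article, but the table and helper function are hypothetical, not anything ATI ships.

```python
# Resolution tiers for ATI's shader-based Avivo H.264 decode, as quoted in
# the thread. The lookup helper itself is an illustrative sketch only.
AVIVO_H264_TIERS = {
    "X1300": "480p",   # entry-level: standard-definition H.264 only
    "X1600": "720p",   # mid-range: up to 720p
    "X1800": "1080p",  # high-end: full 1080p
}

def max_h264_resolution(card: str) -> str:
    """Return the highest H.264 resolution the article claims the card decodes."""
    return AVIVO_H264_TIERS.get(card, "unknown")
```

The point the thread is circling: because Avivo leans on pixel shaders, decode capability scales with shader horsepower, which is exactly why the tiers differ per card.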
Originally posted by: SynthDude2001
Originally posted by: Gstanfor
Originally posted by: SynthDude2001
Originally posted by: dug777
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...
So which ones have the broken video engine? (6800/GT/Ultra?)
interesting point...
I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....
I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.
:disgust:
From the quoted link in the first post:
Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.
I'm quite capable of reading.
I'm just wondering, if there really was a video processor there in the first place and it was claimed to be broken in 2004 (for certain cards), what has changed? My 6800GT with its apparently broken PVP (so they say) sure seems to play 1080p WMV-HD and 1080i transport streams with very low CPU usage and no dropped frames... (Edit: keeping in mind that WMV-HD was the very thing claimed to be broken in the first place)
Originally posted by: dug777
Originally posted by: Gstanfor
Originally posted by: SynthDude2001
Originally posted by: dug777
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...
So which ones have the broken video engine? (6800/GT/Ultra?)
interesting point...
I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....
I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.
:disgust:
From the quoted link in the first post:
Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.
then why this bit, before you get all unpleasant with people mate,
'The quality of the acceleration depends directly on the speed of the GPU. At 300MHz, the acceleration is going to be minimal. As you scale on up towards 500MHz GPU clock speed, acceleration gets to the point where less than 50% CPU is being used to deliver full speed playback. '
?
He's just posing a question is all, not criticising your precious nvidia!
To me a separate pvp would mean the gpu speed should be effectively irrelevant...
Link
NVIDIA also confirmed to us that NV40-based cards do "load balancing" between the video processor and pixel shaders for some video processing tasks, although we didn't get into the nuts and bolts of which computations were handled by the CPU, the video processor, and the pixel shaders.
That's just for decoding, as far as I know. Anand did a nice write-up about this problem, and he said that NVIDIA wouldn't answer his questions about whether the NV40 will do any hardware encoding.
Originally posted by: rbV5
Link
Read that carefully... What Nvidia is doing "NOW" is using the shader pipeline in conjunction with the PVP to process video. Nvidia isn't forthcoming with what is doing what and how it's doing it, so you have to "guess" what is really going on. Nobody here knows, that's for certain... anyone that truly does is under NDA.
Obviously, if the different ranges of new GPUs (NV4x, G7x) have different capabilities, then it would "seem" to be from this "load balancing", as the PVP alone would be the same across the entire product line.
I've suspected the NV40's inability to support WMV acceleration is related to the programmable nature of its shader pipeline, in addition to the fact that its PVP is broken and not programmable. If H.264 acceleration "IS" enabled for NV40, we know that assumption is at least partially wrong, but it does beg the question of why WMV and other MPEG4 formats aren't available yet, and encoding... hello?
It is a very interesting situation.
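The "load balancing" being argued about can be sketched as a toy dispatch model: route a decode task to the dedicated video processor (VP) when it handles the format, otherwise fall back to shaders or the CPU. The format lists reflect the thread's claims about NV40 (MPEG-2 works on the VP, WMV9 does not); the function and structure are entirely hypothetical, not NVIDIA's actual driver logic.

```python
# Toy model of VP/shader/CPU "load balancing" as described in the thread.
# Nobody outside NDA knows the real routing; this just encodes the claims.
NV40_VP_FORMATS = {"mpeg2"}        # thread claim: the VP fully decodes MPEG-2
SHADER_ASSIST_FORMATS = {"mpeg2"}  # shaders reportedly assist some VP tasks

def route_decode(fmt: str) -> str:
    """Pick a decode path for a format under the thread's stated assumptions."""
    if fmt in NV40_VP_FORMATS:
        return "video processor"   # dedicated unit: lowest power draw
    if fmt in SHADER_ASSIST_FORMATS:
        return "pixel shaders"     # 3D core active, higher power cost
    return "cpu"                   # broken/unsupported paths fall back to software
```

Under this model, `route_decode("wmv9")` lands on the CPU, which matches the reported behaviour of the broken NV40 PVP for WMV9.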
I'm guessing that "some instances" would include when the VP hardware is partially broken... In any case, MPEG2 (the format with the most consumer content) is fully hardware decoded by the VP (and I expect nV40 owners will be able to at least encode MPEG2; given the popularity of DVD "backup" software amongst the general population, I suspect this won't be unappreciated.)
While ATI have been mapping some of their video processing over the Shader Core for some time, NVIDIA have decided not to do this as they feel instructions required for video processing do not lend themselves well to the instruction set in the pixel shader pipeline; thus a dedicated unit may be more optimal for this type of work. When running video processing through the shaders, the 3D core is active and consuming power as well, which may not be desirable in all situations, especially where mobile devices are concerned - the NV4x VP is a smaller unit dedicated to video processing, so it should require less power for video processing than utilising the shader core. This is not to say, however, that NVIDIA won't utilise the shader core in conjunction with the VP in some instances, should they choose to do so.
Originally posted by: Gstanfor
I'm guessing that "some instances" would include when the VP hardware is partially broken... In any case, MPEG2 (the format with the most consumer content) is fully hardware decoded by the VP (and I expect nV40 owners will be able to at least encode MPEG2; given the popularity of DVD "backup" software amongst the general population, I suspect this won't be unappreciated.)
If that is true, it would be great news for all the 6800GT and Ultra owners who want WMV9 decode acceleration.
However, NVIDIA wasn't able to get everything working as planned in NV40 silicon, so the NV40 video processor cannot fully accelerate WMV, just MPEG2. Instead, it has to farm out encoding/decoding work to the CPU, as GeForce FX cards did.
Originally posted by: Gstanfor
From rbV5's link (I could find more if I could be bothered)
However, NVIDIA wasn't able to get everything working as planned in NV40 silicon, so the NV40 video processor cannot fully accelerate WMV, just MPEG2. Instead, it has to farm out encoding/decoding work to the CPU, as GeForce FX cards did.
Originally posted by: Gstanfor
I guess that is why nVidia says "don't expect much acceleration at 350MHz".
And about the shaders doing the acceleration, I'm sorry, but I'll believe nVidia and published sources before I believe an ATi beta software tester...
If shaders are being used, it should be possible to detect their usage via a tool like 3danalyze or similar.
Originally posted by: Wreckage
If you want to use H.264 on ATI AVIVO you have to pay.
Originally posted by: dug777
so is this true, do we have to PAY for purevideo to use this stuff? If that's the case it's TOTALLY unacceptable imo :|
Originally posted by: rbV5
Originally posted by: Gstanfor
I guess that is why nVidia says "don't expect much acceleration at 350MHz".
And about the shaders doing the acceleration, I'm sorry, but I'll believe nVidia and published sources before I believe an ATi beta software tester...
If shaders are being used, it should be possible to detect their usage via a tool like 3danalyze or similar.
I tested my own NV40 for months, including comparisons with my ATI cards using various software (I no longer have the card, however). DXVA is Microsoft's interface for using the GPU to accelerate video with the shader pipeline. I'm not an expert, but I've followed this for some time. I'm going off information gleaned from personal experience and published sources since March '04.
Originally posted by: Gstanfor
Originally posted by: rbV5
Originally posted by: Gstanfor
I guess that is why nVidia says "don't expect much acceleration at 350MHz".
And about the shaders doing the acceleration, I'm sorry, but I'll believe nVidia and published sources before I believe an ATi beta software tester...
If shaders are being used, it should be possible to detect their usage via a tool like 3danalyze or similar.
I tested my own NV40 for months, including comparisons with my ATI cards using various software (I no longer have the card, however). DXVA is Microsoft's interface for using the GPU to accelerate video with the shader pipeline. I'm not an expert, but I've followed this for some time. I'm going off information gleaned from personal experience and published sources since March '04.
So? There are lots of other people who have owned the nV40 since release (me included). What personal experience leads you to believe the nV40 uses pixel shaders for MPEG2 decode? Also, show your published sources.
Link
.....Initially, DVD-playback software was forced, out of necessity, to interrogate the graphics subsystem and find out what GPU it contained, and subsequently to include numerous GPU-specific routines that reflected each chip's hardware-acceleration capabilities. For example, early Nvidia chips had fewer MPEG-2 features than their ATI Technologies counterparts. (However, this gap has closed in recent years. Nvidia GeForce 6xxx and 7xxx chips, for example, contain three dedicated video engines for MPEG-2 decoding, motion estimation, and video processing. They also take advantage of the chips' shader processors for video functions.).....
