Nvidia finally *gasps* brings what was promised.

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Link

I just spotted the following on BitTech: NVIDIA has shown off H.264 video decoding using its GeForce cards for the first time, today.

We were able to watch a GeForce 7800 use its GPU processing power to decode 20 Mbit, 1080p high-definition content encoded with the H.264 standard. With the GeForce running at full throttle, CPU usage was kept below 50%.

NVIDIA took some time out to explain their strategy for high-definition decoding - a strategy which will click into place in just a couple of months, with the release of ForceWare version 85.

H.264 is one of the codecs used by both Blu-ray and HD-DVD, and at a high bitrate such as 20 Mbit, most CPUs struggle to decode it at full speed.

The GeForce video engine
GeForce 6 and 7 GPUs all have a video engine built into them. The engine, powered by the PureVideo software, will get an upgrade in ForceWare 85 that will enable H.264 decoding. Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.

The quality of the acceleration depends directly on the speed of the GPU. At 300MHz, the acceleration is going to be minimal. As you scale on up towards 500MHz GPU clock speed, acceleration gets to the point where less than 50% CPU is being used to deliver full speed playback.

Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.

In contrast, the integrated graphics chipset GeForce 6150 can handle 720p, whilst 6600GT, 6800, 7600 and 7800 cards can all do 1080p.

Laptops

Because the PureVideo update in ForceWare 85 is backwards compatible, existing users will be able to upgrade to this functionality. Since the codec is also being used for HD-DVD, NVIDIA were able to demonstrate a notebook using GPU acceleration to deliver HD-DVD playback, which we've shown you below.

The new version of PureVideo will also hardware-accelerate VC-1, the codec powering Microsoft's WMV HD.


We can't wait to get our hands on ForceWare 85 and start doing some direct comparisons between the new PureVideo and Avivo.
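
For anyone wondering how a player app actually finds out whether the driver offers hardware H.264 decode, here's a rough sketch against Microsoft's DXVA2 decoder-service interface. Fair warning: DXVA2 is the later, Vista-era API, and whether ForceWare 85's PureVideo path shows up through exactly this profile is just my assumption - treat it as a sketch with error handling trimmed.

```cpp
// Sketch: ask the Direct3D 9 / DXVA2 stack whether the installed driver
// advertises a hardware H.264 (VLD) decoder profile.
// Assumes Windows with DXVA2 available; link d3d9.lib, dxva2.lib, ole32.lib.
#include <windows.h>
#include <initguid.h>    // so the DXVA2 mode GUIDs get defined in this file
#include <objbase.h>     // CoTaskMemFree
#include <d3d9.h>
#include <dxva2api.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Minimal windowed device just to hand to the video service.
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;
    pp.hDeviceWindow    = GetDesktopWindow();

    IDirect3DDevice9* dev = nullptr;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                 pp.hDeviceWindow,
                                 D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                 &pp, &dev))) {
        d3d->Release();
        return 1;
    }

    IDirectXVideoDecoderService* svc = nullptr;
    if (SUCCEEDED(DXVA2CreateVideoService(dev,
                                          __uuidof(IDirectXVideoDecoderService),
                                          reinterpret_cast<void**>(&svc)))) {
        UINT  count = 0;
        GUID* guids = nullptr;
        if (SUCCEEDED(svc->GetDecoderDeviceGuids(&count, &guids))) {
            for (UINT i = 0; i < count; ++i) {
                // H.264 variable-length decoding, no film-grain technology.
                if (IsEqualGUID(guids[i], DXVA2_ModeH264_VLD_NoFGT))
                    std::printf("Driver reports a hardware H.264 VLD profile\n");
            }
            CoTaskMemFree(guids);
        }
        svc->Release();
    }

    dev->Release();
    d3d->Release();
    return 0;
}
```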
 

James3shin

Diamond Member
Apr 5, 2004
4,426
0
76
I still think we're required to DL the PureVideo software in addition to the driver. It would be great if I were mistaken though :)
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
so is this true, do we have to PAY for PureVideo to use this stuff? If that's the case it's TOTALLY unacceptable imo :|
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
(I know it isn't released yet, but) All I have to say is "Shove that up your arse fanATics (especially those who attacked me over a feature I said nothing about in a previous thread) and have a good spin on it!"
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: Gstanfor
(I know it isn't released yet, but) All I have to say is "Shove that up your arse fanATics (especially those who attacked me over a feature I said nothing about in a previous thread) and have a good spin on it!"
nvm.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: dug777
so is this true, do we have to PAY for PureVideo to use this stuff? If that's the case it's TOTALLY unacceptable imo :|

Dug, nVidia & ATi have to license the codecs that they use. That costs money. Like all companies, they will pass those costs on (unless you know how to legally reverse-engineer the codecs for them).
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
The quality of the acceleration depends directly on the speed of the GPU. At 300MHz, the acceleration is going to be minimal. As you scale on up towards 500MHz GPU clock speed, acceleration gets to the point where less than 50% CPU is being used to deliver full speed playback.

Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.

In contrast, the integrated graphics chipset GeForce 6150 can handle 720p, whilst 6600GT, 6800, 7600 and 7800 cards can all do 1080p.

Nice BS propaganda there, and way to contradict themselves. If Avivo depends on the shader pipeline for performance, and PV does not, then why the hell does it depend on clock speed? Last I checked, a 6800nu is clocked at 325 MHz, so first they say that at 300 MHz the acceleration is minimal, and then they say the 6800 will do 1080p no problem? Which one is it?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: munky
The quality of the acceleration depends directly on the speed of the GPU. At 300MHz, the acceleration is going to be minimal. As you scale on up towards 500MHz GPU clock speed, acceleration gets to the point where less than 50% CPU is being used to deliver full speed playback.

Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.

In contrast, the integrated graphics chipset GeForce 6150 can handle 720p, whilst 6600GT, 6800, 7600 and 7800 cards can all do 1080p.

Nice BS propaganda there, and way to contradict themselves. If Avivo depends on the shader pipeline for performance, and PV does not, then why the hell does it depend on clock speed? Last I checked, a 6800nu is clocked at 325 MHz, so first they say that at 300 MHz the acceleration is minimal, and then they say the 6800 will do 1080p no problem? Which one is it?

I don't know the answer for sure, but it could be that the video decoder works at 3D clock rates, not 2D. So the acceleration wouldn't be great on things like the plain 6600 and below, but better on the 6600GT and above.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: dug777
so is this true, do we have to PAY for PureVideo to use this stuff? If that's the case it's TOTALLY unacceptable imo :|
If you want to use H.264 on ATI AVIVO you have to pay.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: Wreckage
Originally posted by: dug777
so is this true, do we have to PAY for PureVideo to use this stuff? If that's the case it's TOTALLY unacceptable imo :|
If you want to use H.264 on ATI AVIVO you have to pay.

I'm confused as to why this is the case; one would have thought it would be in the gfx card manufacturers' interests to bundle it free and absorb the cost, since a dedicated discrete gfx card will instantly become the basic minimum requirement to watch this stuff... a sales feature for discrete gfx cards that isn't 3D-driven seems like a godsend to ATI/NVIDIA to me...

zomg mom and pop now need a 6600GT minimum to watch h.264 content! ;)
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: dug777
Originally posted by: Wreckage
Originally posted by: dug777
so is this true, do we have to PAY for PureVideo to use this stuff? If that's the case it's TOTALLY unacceptable imo :|
If you want to use H.264 on ATI AVIVO you have to pay.

I'm confused as to why this is the case; one would have thought it would be in the gfx card manufacturers' interests to bundle it free and absorb the cost, since a dedicated discrete gfx card will instantly become the basic minimum requirement to watch this stuff... a sales feature for discrete gfx cards that isn't 3D-driven seems like a godsend to ATI/NVIDIA to me...

They have to pay for the codec, so both card companies pass the cost on to the consumer. I don't blame them, because a lot of people may buy the cards and not even use the decoder.

When I was a sys admin for a large company I used to buy dozens of video cards and never used any of the software that came with them.

Although eVGA used to (or still does?) bundle NVDVD with their cards. I had about 20 copies at one time :)
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: dug777
Originally posted by: Wreckage
Originally posted by: dug777
so is this true, do we have to PAY for PureVideo to use this stuff? If that's the case it's TOTALLY unacceptable imo :|
If you want to use H.264 on ATI AVIVO you have to pay.

I'm confused as to why this is the case; one would have thought it would be in the gfx card manufacturers' interests to bundle it free and absorb the cost, since a dedicated discrete gfx card will instantly become the basic minimum requirement to watch this stuff... a sales feature for discrete gfx cards that isn't 3D-driven seems like a godsend to ATI/NVIDIA to me...

zomg mom and pop now need a 6600GT minimum to watch h.264 content! ;)

It just doesn't work like that in real life, Dug. Joe Average couldn't care less whether his bit of 'net porn is hardware accelerated or not so long as he can view it, and the sheer number of PCs out there without video playback acceleration versus those with it, combined with powerful, cheap CPUs, guarantees that software playback will be around for a long time to come.
 

nts

Senior member
Nov 10, 2005
279
0
0
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

Actually, H.264 may work on ALL 6xxx series video cards. Only the 6800 AGPs had a problem with WMV, while the MPEG side still worked just fine. So it's not like, say, the X800 series, which has nothing to offer at all.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

interesting point...
 

Pocatello

Diamond Member
Oct 11, 1999
9,754
2
76
I don't mind paying, as long as it's reasonable. Programmers have got to eat too. If I remember right, Anand said something about the 6800 GT/Ultra not having H.264 video decoding, or it being broken. The cheaper 6600 GT series had it. We'll see.
 
SynthDude2001

Mar 19, 2003
18,289
2
71
Originally posted by: dug777
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

interesting point...

I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....

I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

Perhaps the engine isn't broken anymore... (IOW: they have found a workaround).
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: SynthDude2001
Originally posted by: dug777
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

interesting point...

I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....

I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.

even more interesting point ;)
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: SynthDude2001
Originally posted by: dug777
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

interesting point...

I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....

I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.

:disgust:
From the quoted link in the first post:
Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: Gstanfor
Originally posted by: SynthDude2001
Originally posted by: dug777
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

interesting point...

I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....

I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.

:disgust:
From the quoted link in the first post:
Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.

Then why this bit, before you get all unpleasant with people, mate,

'The quality of the acceleration depends directly on the speed of the GPU. At 300MHz, the acceleration is going to be minimal. As you scale on up towards 500MHz GPU clock speed, acceleration gets to the point where less than 50% CPU is being used to deliver full speed playback. '

?

He's just posing a question, is all, not criticising your precious NVIDIA ;)

To me, a separate PVP would mean the GPU speed should be effectively irrelevant...
 
SynthDude2001

Mar 19, 2003
18,289
2
71
Originally posted by: Gstanfor
Originally posted by: SynthDude2001
Originally posted by: dug777
Originally posted by: nts
...
Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.
...

So which ones have the broken video engine? (6800/GT/Ultra?)

interesting point...

I've gotta wonder if there ever was a "video engine/processor" in the 6-series at all. "GeForce 6 and 7 GPUs all have a video engine built into them." could mean something as simple as "pixel shaders able to be used as a general video processing unit", maybe....

I find it interesting that their wording (even including the stuff about Purevideo on Nvidia's website) no longer seems to refer to a discrete video processor at all.

:disgust:
From the quoted link in the first post:
Avivo?
This is different from ATI's Avivo technology, which decodes H.264 video on Radeon X1000 series cards. Avivo uses pixel shaders to decode the video, and this places a limitation on what cards can handle what resolutions. The X1300 can handle 480p H.264 video, the X1600 can handle 720p and the X1800 can handle 1080p.

I'm quite capable of reading. ;)

I'm just wondering: if there really was a video processor there in the first place, and it was claimed to be broken in 2004 (for certain cards), what has changed? My 6800GT with its apparently broken PVP (so they say) sure seems to play 1080p WMV-HD and 1080i transport streams with very low CPU usage and no dropped frames... (Edit: keeping in mind that WMV-HD was the very thing claimed to be broken in the first place)
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Dug, why would GPU speed be irrelevant if there were a separate video processor? It is still on-die, and has to be clocked at some rate. The known rates for the 6800 GT are 300 MHz (2D operation), 350 MHz (3D operation) and 400 MHz (ROP output operations).

As for what has changed, I don't know. I guess they have found a way of reordering things that works.

nv40 VPU overview
While ATI have been mapping some of their video processing over the shader core for some time, NVIDIA have decided not to do this, as they feel the instructions required for video processing do not lend themselves well to the instruction set in the pixel shader pipeline; thus a dedicated unit may be more optimal for this type of work. When running video processing through the shaders, the 3D core is active and consuming power as well, which may not be desirable in all situations, especially where mobile devices are concerned - the NV4x VP is a smaller unit dedicated to video processing, so it should require less power for video processing than utilising the shader core. This is not to say, however, that NVIDIA won't utilise the shader core in conjunction with the VP in some instances, should they choose to do so.
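
Just to illustrate the trade-off that overview is describing - and to be clear, every type and function name below is made up by me, not anything from NVIDIA's or ATI's actual drivers - the path selection a player might make looks roughly like this:

```cpp
// Illustrative only: hypothetical types/names sketching the decode-path choice
// described above - prefer a dedicated video processor (3D core stays idle),
// fall back to a shader-assisted path, and finally to pure CPU decode.
#include <cstdio>

struct GpuCaps {                 // hypothetical capability report from a driver
    bool hasDedicatedVideoProc;  // e.g. an NV4x-style VP unit
    bool shadersCanAssist;       // e.g. an Avivo-style pixel-shader path
    int  videoClockMHz;          // clock the video path runs at
};

enum class DecodePath { DedicatedVP, ShaderAssisted, CpuOnly };

DecodePath chooseDecodePath(const GpuCaps& caps, int streamHeight) {
    // The 400 MHz / 1080p threshold here is a made-up placeholder,
    // not a real spec for any card.
    if (caps.hasDedicatedVideoProc &&
        (streamHeight <= 720 || caps.videoClockMHz >= 400))
        return DecodePath::DedicatedVP;
    if (caps.shadersCanAssist)
        return DecodePath::ShaderAssisted;
    return DecodePath::CpuOnly;
}

const char* pathName(DecodePath p) {
    switch (p) {
        case DecodePath::DedicatedVP:    return "dedicated video processor";
        case DecodePath::ShaderAssisted: return "shader-assisted";
        default:                         return "CPU only";
    }
}

int main() {
    GpuCaps card{true, false, 400};  // made-up example card with a VP at 400 MHz
    std::printf("Chosen path: %s\n", pathName(chooseDecodePath(card, 1080)));
    return 0;
}
```

The only point is that a dedicated unit gets first preference because it leaves the 3D core idle, which is exactly the power argument in the quote above.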
 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: Gstanfor
Dug, why would GPU speed be irrelevant if there were a separate video processor? It is still on-die, and has to be clocked at some rate. The known rates for the 6800 GT are 300 MHz (2D operation), 350 MHz (3D operation) and 400 MHz (ROP output operations).

It doesn't have to run at the same speed as the core, though, if it's a separate processor.

My assumption was that they would have their video processor run at some other common speed (common to all chips).