Nvidia finally *gasps* brings what was promised.

Jun 14, 2003
10,442
0
0
Originally posted by: dug777
so is this true, do we have to PAY for PureVideo to use this stuff? If that's the case it's TOTALLY unacceptable imo :|


No.

You only have to pay for the DVD decoders because of royalties... you do not have to pay for the other features.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: Gstanfor
Uhhh, call me dumb if you want, but I can't see any proof of nVidia using pixel shaders in your image, rbV5...

It's my Radeon card decoding with the NV DVD decoders. I clearly, simply, asked for the same screenshot from an Nvidia card doing the same... xtknight provided it, thank you :) I just want to get a feel for what's really going on, not play some sort of marketing feature face-off, thank you very much.

From the screenshots, I can see the 7800 is using DXVA Mode A vs. the Radeon using DXVA Mode C. IIRC, Mode C is the less stringent, generic IDCT acceleration, while Mode A supports blending or other more advanced features that are more hardware-specific. Clearly a difference; I don't recall that with my NV40.

Again, whether or not they are using shaders specifically isn't so much the issue. In the end, it's whether it works, offers excellent PQ, and is better than simply using existing software decoding, which is already excellent on the high-performance rigs we're testing with anyway.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
My old GeForce 6800 (NV41) also used DXVA_A.

The funny thing is, I use the ATI MPEG Video Decoder (atidvcr.dll, which came with the AVIVO transcoder) and it says this in the property page (Video Out):

Sub Type: DXVA_ModeMPEG2_C

For the NVIDIA Video Decoder it says this:

Sub Type: DXVA_ModeMPEG2_A

It's probably the opposite for X1xxx owners. Their ATI decoder says A and the NVIDIA one says C? Weird...
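For anyone who wants to check what those sub-type strings actually correspond to, here's a minimal C++ sketch built around the DXVA 1.0 MPEG-2 mode GUIDs as they're defined in dxva.h; the one-line mode descriptions (and the little DescribeMode helper) are only my rough reading of the DXVA spec, so treat them as approximate:

#include <windows.h>
#include <initguid.h>
#include <cstdio>

// DXVA 1.0 MPEG-2 mode GUIDs as defined in dxva.h -- the four modes differ
// only in the last nibble of the first field (0x1b81be0A..0x1b81be0D).
DEFINE_GUID(DXVA_ModeMPEG2_A, 0x1b81be0a, 0xa0c7, 0x11d3, 0xb9, 0x84, 0x00, 0xc0, 0x4f, 0x2e, 0x73, 0xc5);
DEFINE_GUID(DXVA_ModeMPEG2_B, 0x1b81be0b, 0xa0c7, 0x11d3, 0xb9, 0x84, 0x00, 0xc0, 0x4f, 0x2e, 0x73, 0xc5);
DEFINE_GUID(DXVA_ModeMPEG2_C, 0x1b81be0c, 0xa0c7, 0x11d3, 0xb9, 0x84, 0x00, 0xc0, 0x4f, 0x2e, 0x73, 0xc5);
DEFINE_GUID(DXVA_ModeMPEG2_D, 0x1b81be0d, 0xa0c7, 0x11d3, 0xb9, 0x84, 0x00, 0xc0, 0x4f, 0x2e, 0x73, 0xc5);

// Rough descriptions -- my reading of the spec, not gospel.
const char* DescribeMode(const GUID& g)
{
    if (IsEqualGUID(g, DXVA_ModeMPEG2_A)) return "A: fuller off-host acceleration (IDCT + motion comp)";
    if (IsEqualGUID(g, DXVA_ModeMPEG2_B)) return "B: like A, with encryption";
    if (IsEqualGUID(g, DXVA_ModeMPEG2_C)) return "C: lighter profile, host does more of the work";
    if (IsEqualGUID(g, DXVA_ModeMPEG2_D)) return "D: like C, with encryption";
    return "not a DXVA MPEG-2 mode";
}

int main()
{
    // The two sub-types reported above:
    printf("NVIDIA decoder: %s\n", DescribeMode(DXVA_ModeMPEG2_A));
    printf("ATI decoder:    %s\n", DescribeMode(DXVA_ModeMPEG2_C));
    return 0;
}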

As I understand it (well, as I 'guess'), NVIDIA uses only the dedicated chip (or the dedicated part of the NVxx/Gxx die) for MPEG-2; then for H.264 they might have to grab some extra horsepower from the shaders. That's why the GeForce 61xx won't quite be up to par with the GeForce 66xx, but the GeForce 66xx on up has enough horsepower in its shaders to supplement the VPP. ATI has been using 'SmartShaders' for the acceleration all along, even for DVD, I think.

It should be easy to prove: just run RTHDRIBL on super-high settings (it shouldn't use much CPU), then play a video alongside it. If RTHDRIBL slows down, the video decoder is using the shaders. That assumes RTHDRIBL itself uses a lot of shaders; I think it does?
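To put an actual number on the slowdown rather than eyeballing it, a frame timer like this rough sketch could be bolted onto any render loop - the Sleep() call is only a stand-in for the real shader-heavy draw work, so this is an illustration, not a finished tool:

#include <windows.h>
#include <cstdio>

int main()
{
    LARGE_INTEGER freq, prev, now;
    QueryPerformanceFrequency(&freq);   // ticks per second of the HPET/QPC timer
    QueryPerformanceCounter(&prev);

    double seconds = 0.0;  // time accumulated in the current window
    int frames = 0;

    for (;;) {             // run until Ctrl+C
        Sleep(5);          // stand-in: render the shader-heavy scene here
        QueryPerformanceCounter(&now);
        seconds += double(now.QuadPart - prev.QuadPart) / double(freq.QuadPart);
        prev = now;
        if (++frames == 100) {  // report a rolling average every 100 frames
            printf("avg fps: %.1f\n", frames / seconds);
            seconds = 0.0;
            frames = 0;
        }
    }
}

If the reported average dips when playback starts while CPU load stays low, the decoder is very likely borrowing shader time.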

Blah...nm...RTHDRIBL uses 100% of my CPU.
 

drpootums

Golden Member
Oct 22, 2004
1,315
0
0
so, I might be kinda slow, but does this mean that our AGP 6800GTs may finally have the HD decoding feature????

If so, this is great for us!!!!

I didn't read all of the posts, and I'm not up on all of the video format lingo, I just need an answer in simple terms... :eek:
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: drpootums
so, I might be kinda slow, but does this mean that our AGP 6800GTs may finally have the HD decoding feature????

If so, this is great for us!!!!

I didn't read all of the posts, and I'm not up on all of the video format lingo, I just need an answer in simple terms... :eek:

Yes. Just ignore the last few posts about possible pixel shader usage with MPEG-2 - it's all nonsense. WMV-HD may require some pixel shader intervention, however.

About the screenshots: I'm using v1.02.185 retail and can't find an options screen like that anywhere; instead you get a utility called nStant Media (pic1, pic2) that has a built-in media player.

I also have Nero ShowTime, WinDVD DXVA, Media Player Classic & WMP10 on the system. No CyberLink though (not about to buy it either; DVD/movie watching isn't something I do with my system).
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Steelski
Originally posted by: Wreckage
Originally posted by: dug777
so is this true, do we have to PAY for PureVideo to use this stuff? If that's the case it's TOTALLY unacceptable imo :|
If you want to use H.264 on ATI AVIVO you have to pay.

HMMMM. No you don't!!!

Yes you do!!! You have to buy the decoder from CyberLink.

Go to the ATI site and find out for yourself, instead of posting out your arse.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Link

SOURCES CLOSE to Nvidia are claiming that Nvidia has a secret video driver that can make a real difference and give ATI's Avivo a run for its money. ATI is claiming dominance with its 5.13 drivers, which can do H.264 just fine on the new line of products. Nvidia is keeping its cards close to its corporate chest.

Nvidia is getting ready to release the miraculous driver that is supposed to beat ATI in video performance. The driver will ensure that the IGP 6150 integrated graphics core and the 6200 low-end core can play H.264 content at 720p resolution. Faster cards should be able to play 1080p resolutions without problems.

The new driver will allow all GeForce 6 and 7 series cards to play H.264 content. Yes, this means each and every card from the GeForce 6200 to the 7800 GTX will play H.264. I still wonder about NV40-based cards, as they had some troubles with WMV files before. The ATI R5xx generation can play H.264, but not the older R420/430/480 generations. None of these cards is actually supported for H.264 playback, and we don't think ATI will provide any kind of H.264 support for those older cards. Kind of bad for you if you have one.

If this comes true, then ATI might have a problem when it comes to H.264 just weeks after it claimed video dominance. The drivers have appeared over the horizon and are steaming towards you, demonstrating the curvature of the planet. µ

I think the video processors on those cards are clocked differently.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: Cookie Monster
I think the video processors on those cards are clocked differently

The video processor has an independent clock from the VPU core even though it's embedded in the core, so it could be that they run different VP speeds on the various cards rather than also relying on shader power, which would also account for the differences in performance.

Just found that properties dialog - didn't realise it appears in the system tray.

[Image: properties dialog screenshot]

I see your card is using DXVA Mode "B" rather than the DXVA Mode "C" that xtknight's is using.

Are you using the 6800GT in your sig? <change your link to "http://www.anandtech.com/mysystemrig.html?rigid=5978" in your sig and it will fix your rig link>