
So I guess Avivo was just marketing?

Spicedaddy

Platinum Member
OK, my current card is a 9700Pro and I was hesitating between a 7800GT and an X1800XL for my next upgrade. 7800GT is cheaper and I was leaning that way until last week when I saw all the Avivo previews popping up. So a week later, here's a summary of what Avivo is:

1. Encoding: 100% CPU based, it's fast, but gives crappy quality. I mean, you're not going to convince me ATI has miraculously discovered a way to encode faster using your CPU, while conserving equivalent quality. And why officially release it only for X1K hardware, if it has nothing to do with GPU hardware?

2. Decoding: The 5.13 drivers have been released, and they support hardware H.264 decoding for X1K products. They mention in the release notes that you'll need the Cyberlink decoder for it to work, and the link they give points to a page that says "H.264 decoder available for download soon". So that doesn't work either. And it also means decoders will have to be modified to support hardware decoding. Cyberlink is nice, but will it work with WMV-HD or Quicktime HD movies?


So basically, ATI's been pimping this whole thing to promote X1K products for the holidays, when in reality, X1K owners get nothing more than what they had a week ago.
 
well ATi has admitted that offloading some of the encoding work to the GPU can't happen yet, but that they are working on that. maybe future drivers..
 
you actually believed what ATI said? nvm

i was gonna try it, since theres a version that works on any card (and why shouldnt it when it makes zero use of the GPU) then i read around that while it was fast....the quality wasnt up to par. ill stick with DivX HD thanks.

they really need to get some substance behind their words before they release anything.
 
Originally posted by: otispunkmeyer
you actually believed what ATI said? nvm

i was gonna try it, since theres a version that works on any card (and why shouldnt it when it makes zero use of the GPU) then i read around that while it was fast....the quality wasnt up to par. ill stick with DivX HD thanks.

they really need to get some substance behind their words before they release anything.

I never said I believe anything ATi says, I merely stated that they lied. You don't have to believe something to know if people are lying about it or not.
 
Originally posted by: Gstanfor

I never said I believe anything ATi says, I merely stated that they lied. You don't have to believe something to know if people are lying about it or not.

Well, there's marketing, and there's lying. Just like the '64 bit content' in Far Cry that runs just fine on 32 bit CPUs, ATI only supports the Avivo software on X1xxx GPUs for marketing reasons. You can get it to work on other GPUs (including nVidia's) but ATI won't field tech support questions in case it doesn't work. =)

Rather disappointing, however. Video encoding is the hard part; decoding is a pretty light load on even feeble CPUs. I'm wondering if it's just not practical to upload uncompressed video streams to the GPU, have it compress them, and download the result, even over 16x PCIe. Even though the CPU load would be much lighter, the end result, including the latency of the transfer, may be slower than CPU-based encoding on modern CPUs.

 
Originally posted by: Gstanfor
Originally posted by: otispunkmeyer
you actually believed what ATI said? nvm

i was gonna try it, since theres a version that works on any card (and why shouldnt it when it makes zero use of the GPU) then i read around that while it was fast....the quality wasnt up to par. ill stick with DivX HD thanks.

they really need to get some substance behind their words before they release anything.

I never said I believe anything ATi says, I merely stated that they lied. You don't have to believe something to know if people are lying about it or not.

I get the feeling spunkmeyer was replying to the OP... Quick to get defensive, Gstanfor.
 
Originally posted by: Spicedaddy
OK, my current card is a 9700Pro and I was hesitating between a 7800GT and an X1800XL for my next upgrade. 7800GT is cheaper and I was leaning that way until last week when I saw all the Avivo previews popping up. So a week later, here's a summary of what Avivo is:

1. Encoding: 100% CPU based, it's fast, but gives crappy quality. I mean, you're not going to convince me ATI has miraculously discovered a way to encode faster using your CPU, while conserving equivalent quality. And why officially release it only for X1K hardware, if it has nothing to do with GPU hardware?

2. Decoding: The 5.13 drivers have been released, and they support hardware H.264 decoding for X1K products. They mention in the release notes that you'll need the Cyberlink decoder for it to work, and the link they give points to a page that says "H.264 decoder available for download soon". So that doesn't work either. And it also means decoders will have to be modified to support hardware decoding. Cyberlink is nice, but will it work with WMV-HD or Quicktime HD movies?


So basically, ATI's been pimping this whole thing to promote X1K products for the holidays, when in reality, X1K owners get nothing more than what they had a week ago.

So lame. ATI :thumbsdown: for sure.
Hope you enjoy your Radeon X1K, Ackmed.

Just more ATI lies.
 
Originally posted by: v8envy
decoding is a pretty light load on even feeble CPUs.

Not decoding H.264 1080p, which the Radeon X18xx will supposedly do.

I'm wondering if it's just not practical to upload uncompressed video streams to the GPU, have it compress and download the result even over 16x PCIe.

I don't think it's necessarily uploading the uncompressed video stream, at least not in its entirety. Some functions are inline-optimized to take advantage of the GPU, so it requires very little bandwidth. The overall/general execution of the encoder still lies within the CPU if I'm not mistaken, with it just sending requests for DCT/whatever else to the GPU when needed. It's just like the Direct3D API. The whole game isn't really uploaded. The CPU still keeps track of what's going on, including but not limited to thread synchronization.
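To make that division of labor concrete: the kind of per-macroblock kernel that would get handed off to the GPU is something like an 8x8 DCT, while the CPU keeps driving the overall encode loop, exactly like the Direct3D analogy above. A naive pure-Python sketch of that transform (illustrative only, not ATI's actual implementation):

```python
import math

def dct_8x8(block):
    """Naive 2D DCT-II over an 8x8 block -- the per-macroblock transform
    an encoder could offload to the GPU while the CPU orchestrates."""
    N = 8
    def alpha(k):
        return math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for i in range(N):
                for j in range(N):
                    s += (block[i][j]
                          * math.cos((2 * i + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * j + 1) * v * math.pi / (2 * N)))
            out[u][v] = alpha(u) * alpha(v) * s
    return out

# A flat (constant) block has all its energy in the DC coefficient:
coeffs = dct_8x8([[1.0] * 8 for _ in range(8)])
print(round(coeffs[0][0], 6))   # 8.0 -- the DC term; every AC term is ~0
```

The point is that each 8x8 block is independent, so only the blocks (not the whole encoder state) need to cross the bus, which is why the bandwidth cost of this scheme can stay small.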

Even though the CPU load would be much lighter, the end result including latency of transfer may be slower than CPU-based encoding on modern CPUs.

Huh? Naw...

It sends big enough chunks to take care of overhead. Otherwise it would be rendered useless, pun intended.
 
All I see in this topic is a bunch of people high-fiving each other and yelling "ATI LIES!"

I'm still not exactly sure what they supposedly lied about...
 
They were trying to steal Purevideo's thunder. Problem is that Purevideo has been out for about 2 years, so they are just trying to show the world that they can do the same thing as NVIDIA, only 2 years later (like SM3.0, SLI, HDR).

Maybe next year we will get to see hard launches as well, maybe.
 
Originally posted by: Wreckage
They were trying to steal Purevideo's thunder. Problem is that Purevideo has been out for about 2 years, so they are just trying to show the world that they can do the same thing as NVIDIA, only 2 years later (like SM3.0, SLI, HDR).

Maybe next year we will get to see hard launches as well, maybe.

Do you not stop?

How many months was PureVideo not enabled for? How many thousands of 6800s was its main component broken on? AVIVO is just a fancy marketing logo for ATI's emphasis on encoding and decoding, which happen to be two areas where ATI has just jumped ahead of nVidia.
 
All Avivo enabled graphics cards will support this GPU assisted transcode; the only requirement will be that the appropriate Catalyst driver is installed. While ATI is currently planning to release the first of their R5xx GPUs by the end of this month, the transcode acceleration will not be ready by that time. ATI has committed to delivering the transcode acceleration by the end of this year, and more specifically, about a month after the release of the R5xx GPUs.
Source
All Avivo graphics cards (e.g. R520, RV530, RV515) will feature decode assist for H.264, MPEG-2, MPEG-4, VC-1 and WMV9. This feature will be enabled on the day that the products ship, through Catalyst.
Source
At this point, we're mostly excited about the GPU assisted transcode and decode features of the R5xx series of GPUs, and even more excited that both features are supposed to be available by the end of this year.
Source
ATI is committed to bringing both H.264 decode acceleration and transcode assist by the end of the year, but for now, we have no way of testing those features
Source


What there is:
H.264 Decode Acceleration - As Promised
Source
Unfortunately, the GPU accelerated transcode isn't yet ready for debut, but what ATI is making available is the software front end for it.
Source
We should note that the Avivo Video Converter, despite not being GPU accelerated, will only work on ATI Radeon X1000 series of GPUs. ATI is still working on bringing a GPU accelerated version of the Avivo Video Converter to market, but that's still a while away.
Source


So, we were promised (back in late Sept/early October) hardware accelerated transcoding by the end of the year, and it's not here... YET.
But there are clear signs of progress (decode and the software end of the transcoder).
ATi promised, and have not fully delivered, but at least they are getting there, which is nice(r than nothing).
 
Originally posted by: xtknight

Not decoding H.264 1080p, which the Radeon X18xx will supposedly do.

Even that. 720x480 DVD resolution (345,600 pixels) is 1/6 the size of 1920x1080 (2,073,600 pixels). A P233 with no decoding acceleration can handle an mpeg4 DVD resolution stream. H.264 is a particular standardized MPEG-4 codec, aka MPEG-4 Part 10.

6x that CPU load would still allow some post effect processing on modern CPUs. We're talking about decompressing and processing 240 megabytes/sec here, barely double the old PCI bus throughput.

Now, encoding that is a completely different story.

I don't think it's necessarily uploading the uncompressed video stream, at least not in its entirety. Some functions are inline-optimized to take advantage of the GPU, so it requires very little bandwidth. The overall/general execution of the encoder still lies within the CPU if I'm not mistaken, with it just sending requests for DCT/whatever else to the GPU when needed. It's just like the Direct3D API. The whole game isn't really uploaded. The CPU still keeps track of what's going on, including but not limited to thread synchronization.

The whole compression task requires having the frame data to do compression on. You can't do a discrete cosine transform without having the data to do it over. Your approach sounds like an asymmetric multi-CPU approach to encoding, where the CPU does the partitioning of the work and some of the work, with the GPU doing work on other tiles. That would be... a hard problem to solve well. Which might explain why there's no solution yet.

Once again, bandwidth isn't the problem. A single uncompressed HDTV frame is only 8 megabytes, so 30 of them for real time encoding is well within AGP bandwidth, even for a round trip. I'm just wondering if offloading a portion of the data to the GPU and processing the frame in lockstep is SLOWER than just using the CPU in the first place when you factor in the latency of the round trip.

It sends big enough chunks to take care of overhead. Otherwise it would be rendered useless, pun intended.

I suppose latency could become a non-issue if *all* the work was done on the GPU. Then frames are streamed to the GPU, and compressed data is streamed back out. No lockstep of waiting for a completed frame, and thus no latency. That wouldn't work in the asymmetric multi-CPU approach above, though.
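A quick sanity check of the pixel-count and frame-size figures in this post (back-of-the-envelope only, assuming 1920x1080 frames at 32 bits per pixel):

```python
# Back-of-the-envelope check of the resolution and bandwidth figures above.
dvd_pixels = 720 * 480            # 345,600 pixels per DVD frame
hd_pixels = 1920 * 1080           # 2,073,600 pixels per 1080-line HD frame
ratio = hd_pixels / dvd_pixels    # exactly 6x the decode workload

# Uncompressed HD frame size, assuming 32 bits (4 bytes) per pixel
frame_bytes = hd_pixels * 4                   # ~8.3 MB -- the "8 megabytes" figure
stream_mb_per_sec = frame_bytes * 30 / 1e6    # feeding 30 fps for real-time encode

print(ratio)                     # 6.0
print(round(stream_mb_per_sec))  # 249 -- roughly the "240 megabytes/sec" estimate
```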
 
Originally posted by: v8envy
Even that. 720x480 DVD resolution (345,600 pixels) is 1/6 the size of 1920x1080 (2,073,600 pixels). A P233 with no decoding acceleration can handle an mpeg4 DVD resolution stream. H.264 is a particular standardized MPEG-4 codec, aka MPEG-4 Part 10.

Remember, a video DVD is MPEG-2. That is worlds less intensive than MPEG-4 Part 10 Main profile. http://www.anandtech.com/video/showdoc.aspx?i=2536&p=4

Where are you getting this data? Do you mean a P233 can handle decoding acceleration of MPEG-4 Part 10 Main 720x480?

Would you call an Athlon 64 3500+ a feeble processor? It can barely handle the Quicktime 1080p videos, and in some cases it gets overworked. That is why they are adding in GPU decode assist, because today's CPUs just can't cope with the processing requirements.

The whole compression task requires having the frame data to do compression on. You can't do a discrete cosine transform without having the data to do it over. Your approach sounds like an asymmetric multi-CPU approach to encoding, where the CPU does the partitioning of the work and some of the work, with the GPU doing work on other tiles. That would be... a hard problem to solve well. Which might explain why there's no solution yet.

Looks like ATI's got it covered. 🙂 GPU encoding is also solely 'assisted', so I must infer that some is done by the CPU as well, but it's anyone's guess as to what.

http://www.anandtech.com/video/showdoc.aspx?i=2536&p=4

Keep in mind that we're still talking about GPU assisted decode, so there are still a lot of functions that are done by the CPU. Avivo GPUs will perform in-loop deblocking, motion compensation and inverse transform.

Once again, bandwidth isn't the problem. A single uncompressed HDTV frame is only 8 megabytes, so 30 of them for real time encoding is well within AGP bandwidth, even for a round trip. I'm just wondering if offloading a portion of the data to the GPU and processing the frame in lockstep is SLOWER than just using the CPU in the first place when you factor in the latency of the round trip.

OK. Suppose the HDTV frame is 1080p60 (1920x1080 at 60 progressive FPS) with 24-bit color (3 bytes per pixel). That means 1920 x 1080 x 24 bits x 60Hz every second: 2,985,984,000 bits per second, which equates to 373,248,000 bytes per second, or about 356 MB/sec. Yeah, that's definitely within the capability of PCIe x16, though it would saturate the old PCI bus.
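Redoing that throughput arithmetic, assuming full 24-bit RGB frames (3 bytes per pixel) rather than a single 8-bit channel:

```python
# Uncompressed 1080p60 throughput at 24 bits per pixel
width, height, fps = 1920, 1080, 60
bytes_per_pixel = 3                      # 24-bit RGB

bytes_per_sec = width * height * bytes_per_pixel * fps
mb_per_sec = bytes_per_sec / (1024 * 1024)

print(f"{mb_per_sec:.1f} MB/sec")   # ~356 MB/sec: comfortably inside PCIe x16,
                                    # but well beyond plain PCI's ~133 MB/sec
```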

I suppose latency could become a non-issue if *all* the work was done on the GPU. Then frames are streamed to the GPU, and compressed data is streamed back out. No lockstep of waiting for a completed frame, and thus no latency. That wouldn't work in the asymetric multi-CPU approach above, though.

Well, I don't know...we'll just have to wait until it comes out to see how it works. If 'assist' is correct, the CPU must be doing something.

I don't understand the latency you're speaking of here. The CPU can send the uncompressed frame at 60 FPS without a problem (I think?). Then the GPU just sends it back to memory which the CPU addresses and writes to the hard disk. It'll do this sequentially, maybe in bigger blocks to reduce overhead. Rinse and repeat. This happens with WMV HD/VMR9 renderless already through DXVA (well the GPU->memory part), only now it's writing to the hard disk.

I'm not saying you're wrong per se, but it seems odd for ATI to advertise and tout GPU encoding assist when it's not feasibly faster on any of today's PCs. They have been claiming a 5x speed-up.

Edit: quote alignment problem/added more explanation
 
Originally posted by: Spicedaddy
And it also means decoders will have to be modified to support hardware decoding. Cyberlink is nice, but will it work with WMV-HD or Quicktime HD movies?
Any decoder that implements DirectX Video Acceleration (DXVA) support will be able to use the H.264 decoding features. This is exactly what's done with MPEG2 decoding these days, and it shouldn't change for H.264.

 
Originally posted by: Chocolate Pi
All I see in this topic is a bunch of people high-fiving each other and yelling "ATI LIES!"

I'm still not exactly sure what they supposedly lied about...



Choc Pi, what is a the card is your sign?, flashed to PE, I didnt even know ATI had PE edition out yet?
 
The Sapphire X1800XT 512MB Performance Edition BIOS was leaked. It works on any X1800XT 512 card and is 700 core, 800 mem. A rather nice upgrade, I have yet to hear of a single soul whose flash has failed. (Except some funny guy who tried to make it work on his XL, lol.)
 
Originally posted by: SolMiester
Originally posted by: Chocolate Pi
All I see in this topic is a bunch of people high-fiving each other and yelling "ATI LIES!"

I'm still not exactly sure what they supposedly lied about...



Choc Pi, what is a the card is your sign?, flashed to PE, I didnt even know ATI had PE edition out yet?

Sol Mi, what is a the grammer in yuor post?, i dont get it, I didnt even know.

My post was almost as pointless as yours.
 
Originally posted by: Wreckage
They were trying to steal Purevideo's thunder. Problem is that Purevideo has been out for about 2 years, so they are just trying to show the world that they can do the same thing as NVIDIA, only 2 years later (like SM3.0, SLI, HDR).

Maybe next year we will get to see hard launches as well, maybe.

Purevideo has been out for 2 years, yes, but it didn't work until 1 year ago, and you still have to put down extra money to make it work - for a feature that is advertised on the box. Not to mention its rather ridiculous implementation in the 6800 series, where it doesn't support WMV or is just plain broken.

 
Originally posted by: Avalon
Originally posted by: SolMiester
Originally posted by: Chocolate Pi
All I see in this topic is a bunch of people high-fiving each other and yelling "ATI LIES!"

I'm still not exactly sure what they supposedly lied about...



Choc Pi, what is a the card is your sign?, flashed to PE, I didnt even know ATI had PE edition out yet?

Sol Mi, what is a the grammer in yuor post?, i dont get it, I didnt even know.

My post was almost as pointless as yours.


Oh, sorry, must have had a few by then... LOL

Anyway, that guy's sig has X1800XT, flashed to PE. Thought that was funny as I didn't think there was a PE edition out officially as yet!
 