New ATI Linux drivers

Reel

Diamond Member
Jul 14, 2001
4,484
0
76
ATI download

Everything I have read has said Nvidia's proprietary drivers with XvMC have been far superior to ATI's (particularly in relation to MythTV). Does this driver update help ATI out, or are we still waiting?
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
In the release notes there was a mention of a bugfix for certain colors with Xv.

That's about it.

If you want to use Linux, don't buy ATI. That's really all there is to it.

If you're stuck with a laptop or an ATI card and you can't get rid of it, then you have my sympathies.
 

TheVrolok

Lifer
Dec 11, 2000
24,254
4,090
136
Originally posted by: drag
In the release notes there was a mention of a bugfix for certain colors with Xv.

That's about it.

If you want to use Linux, don't buy ATI. That's really all there is to it.

If you're stuck with a laptop or an ATI card and you can't get rid of it, then you have my sympathies.

Thankfully, my laptop's ATI card didn't cause any problems for my Ubuntu install. I was dismayed at having to purchase a laptop with an ATI card for that very reason - but the price was right. :)
 

xSauronx

Lifer
Jul 14, 2000
19,582
4
81
Originally posted by: TheVrolok
Originally posted by: drag
In the release notes there was a mention of a bugfix for certain colors with Xv.

That's about it.

If you want to use Linux, don't buy ATI. That's really all there is to it.

If you're stuck with a laptop or an ATI card and you can't get rid of it, then you have my sympathies.

Thankfully, my laptop's ATI card didn't cause any problems for my Ubuntu install. I was dismayed at having to purchase a laptop with an ATI card for that very reason - but the price was right. :)

I haven't much to complain about on my T40. It uses an older Radeon 7500. I've had a couple of times where the windows don't... refresh themselves without being minimized, but I think it's more likely due to heat, as it only happens when I'm working in the field, *never* at home.

I did finally get my older tower back recently. I traded the 9700 for a GF4 because I'd heard such bad things about ATI's Linux support.
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
On older cards, driver support for ATI stuff is pretty good. This is because people have reverse-engineered the hardware and produced working drivers...

For example, ATI has totally dropped support for that 7500 (as well as the 8500 through 9200 cards) in their drivers. If it weren't for the OSS drivers, there wouldn't be anything that really worked well on newer Linux distros.

But for newer stuff, go Intel. Intel has the most cost-effective and best-supported cards out there. They lack performance, unfortunately. They are fast enough for things like simple games or Beryl, but if you need the horsepower, the only real choice is Nvidia.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: xSauronx
Originally posted by: TheVrolok
Originally posted by: drag
In the release notes there was a mention of a bugfix for certain colors with Xv.

That's about it.

If you want to use Linux, don't buy ATI. That's really all there is to it.

If you're stuck with a laptop or an ATI card and you can't get rid of it, then you have my sympathies.

Thankfully, my laptop's ATI card didn't cause any problems for my Ubuntu install. I was dismayed at having to purchase a laptop with an ATI card for that very reason - but the price was right. :)

I haven't much to complain about on my T40. It uses an older Radeon 7500. I've had a couple of times where the windows don't... refresh themselves without being minimized, but I think it's more likely due to heat, as it only happens when I'm working in the field, *never* at home.

I did finally get my older tower back recently. I traded the 9700 for a GF4 because I'd heard such bad things about ATI's Linux support.

I think that's a bug with Compiz/Beryl, as the computers I've used with Intel integrated graphics have the same problem.

Older ATI cards are supported quite well in Linux. Prior to their DX9 stuff, they have official open-source drivers... and the 9700 and X800 series have very low-performance open-source drivers that are at least fully compatible. The X1800 series and beyond are screwed, though.
Nvidia is the way to go for a high-performance Linux card, not to mention a fully compatible one. ATI cards, if they will even run the app you're using, will run it slowly.
 

SleepWalkerX

Platinum Member
Jun 29, 2004
2,649
0
0
Originally posted by: drag
On older cards, driver support for ATI stuff is pretty good. This is because people have reverse-engineered the hardware and produced working drivers...

For example, ATI has totally dropped support for that 7500 (as well as the 8500 through 9200 cards) in their drivers. If it weren't for the OSS drivers, there wouldn't be anything that really worked well on newer Linux distros.

But for newer stuff, go Intel. Intel has the most cost-effective and best-supported cards out there. They lack performance, unfortunately. They are fast enough for things like simple games or Beryl, but if you need the horsepower, the only real choice is Nvidia.

ATI's not completely evil; they did release specifications for the r200 chipset(s) to the X.org developers. It's just the r300 and higher that they haven't released specs for, and those only have shoddy Linux driver support. Nvidia has a great proprietary Linux display module, but never really helped like ATI has.
 

SleepWalkerX

Platinum Member
Jun 29, 2004
2,649
0
0
Originally posted by: Nothinman
but never really helped like ATI has.

We're supposed to applaud ATI for releasing some specs on a 5+ yr old chipset but nothing since then?

I never said that; I just feel it's kinda dumb to be praising Nvidia for giving us a proprietary Linux driver when ATI does the same and has actually contributed something to the open-source community. Both give you proprietary junk anyway. Luckily, it seems like ATI will get some pressure put on them by Dell, but until I can see some code I'll be staying away from both.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
I never said that; I just feel it's kinda dumb to be praising Nvidia for giving us a proprietary Linux driver when ATI does the same and has actually contributed something to the open-source community.

The difference is that the binary Nvidia driver works a lot better, IME.

Luckily, it seems like ati will get some pressure put on them from dell, but until I can see some code I'll be staying away from both.

If anything comes of it, it'll likely only be a slightly better-supported binary driver.
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
Stuff like Dell will help a little bit.

But the only people who have the ability to put pressure on ATI and Nvidia are going to be users.

One thing to keep in mind also is that the 'users' ATI and Nvidia care most about are Unix graphical workstation customers. Linux effectively replaced SGI a while ago and is currently the most popular OS for high-end video and 3D graphics work. I am talking about Hollywood studios and scientific workstations. These are people who spend a LOT of money on hardware.

If it weren't for this market, there would be no drivers coming out of Nvidia or ATI, proprietary or not. Consumer cards are just an afterthought, with Nvidia being a bit better because I think they have a more standardized hardware interface for producing 'unified drivers'.


In the future I see things changing. The 'hardware acceleration' provided by video hardware... well, isn't. Notice that the biggest changes to come to graphics are things like programmable shader languages. You write a program in a special language, compile it with a GLSL compiler, and then run it on the GPU.

Same thing with accelerated 3D graphics. You're not dealing with 'OpenGL expressed in hardware' or anything like that. The OpenGL stacks are now just very optimized software, compiled to run on both the GPU and the CPU, with the GPU geared towards specific types of workloads while the CPU is geared towards generic workloads.
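To make the "program in a special language, compile it at runtime" idea concrete, here is a minimal sketch from the host program's side. The shader source and the helper function are purely illustrative (made up for this post); in a real app the string would be handed to the driver's GLSL compiler via calls like glShaderSource/glCompileShader.

```python
# A tiny GLSL fragment shader, held as a string by the host program.
# The driver compiles it at runtime and the compiled code runs on the GPU,
# once per pixel. Names and values here are illustrative only.
FRAGMENT_SHADER = """
#version 120
uniform float brightness;
void main() {
    /* executed on the GPU after runtime compilation */
    gl_FragColor = vec4(brightness, 0.0, 0.0, 1.0);
}
"""

def looks_like_glsl(src: str) -> bool:
    """Trivial sanity check a host app might run before compiling:
    every GLSL shader needs a main() entry point."""
    return "void main" in src

print(looks_like_glsl(FRAGMENT_SHADER))  # True
```

The point is just that the 'hardware acceleration' is really a compiler toolchain: the GPU executes programs, it doesn't hard-wire OpenGL.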


There is a Qt hacker, Zack Rusin, who has a little side project where he is taking LLVM (a compiler infrastructure and virtual machine suite) and Mesa and making it so that you can compile software to run directly on the GPU. You could write shaders in Python, Ruby, C, or C++ if you wanted. Apple uses LLVM in OS X for some of its 3D graphics work, but there it's not hardware-accelerated.


The way things are going, the GPU is going to be used for more and more generic stuff. It's going to be another core you can take advantage of to run your software. CPU-GPU integration: GPGPU, Fusion, or whatever.

In a couple of years you're going to see 16-way and 32-way processors. Right now Intel has shown 80-core processors for demo/testing purposes. And CPU manufacturers are always working on CPU designs two generations out.

Having 2-way CPUs is very good for the desktop. Having 4-way would be nice also. But I figure once you get up to 16-way or 32-way CPUs, there is virtually no benefit for desktop performance over 4-way CPUs.
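That intuition about diminishing returns can be put into numbers with Amdahl's law. The 60% parallel fraction below is a made-up assumption for a desktop-style workload, not a measurement:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: upper bound on speedup when only
    `parallel_fraction` of the work can use extra cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical desktop workload: 60% of the work parallelizes.
for n in (2, 4, 16, 32):
    print(n, round(amdahl_speedup(0.6, n), 2))

# The speedup can never exceed 1 / (1 - 0.6) = 2.5x here,
# so going from 4-way to 32-way buys very little.
```

With those assumptions, 4 cores already get you most of the way to the 2.5x ceiling, which is why extra generic cores alone won't sell desktop CPUs.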


So in order to get customers to buy newer processors, you're probably going to see different types of CPU cores tuned for specific workloads. The most obvious optimized core would be very GPU-like, for processing graphics and media encoding/decoding, which are currently the most CPU-intensive tasks being done today.


Hence you have Larrabee.
http://arstechnica.com/news.ar...ans-with-larrabee.html

It's Intel's stab at the discrete video card market, their attempt to go head to head with ATI/AMD and Nvidia.

Their stuff is supposed to be very 'x86-like'. You're effectively going to have 16-way processors, specifically geared for graphical workloads, in those video cards.


So this _could_ mean that Linux users working on an Intel-based workstation with Intel graphics would have a substantial performance advantage over running ATI or Nvidia graphics. Not so much for pure 3D gaming performance, but for doing all sorts of stuff like media encoding/decoding, rendering, and so on. You'd compile software optimized to run on BOTH the CPU and the GPU. Aside from the normal threading issues with multiple-core processors, the compiler and the software should be able to use either the GPUs or the CPUs based on what can accelerate the software the best. All of this automatically, and without a huge amount of effort on the part of the programmer beyond the usual multi-threading issues.

This is very similar to IBM's Cell, which currently Linux is the only OS able to support properly. It could be that Intel wants to support Linux well because when these new architectures start coming out, Linux will be the only system able to support them effectively at first, to the fullest advantage. Similarly, it took Microsoft years to produce a workstation OS that ran well on AMD64, versus Linux, which supported that CPU from day one.



If all of my speculation is true, then being open has very significant advantages over being closed.

Could you imagine having a processor where the maker absolutely refuses to tell you how to program for it? That is the current situation with Nvidia and ATI...