I'd like to see considerable effort put into writing quality open-source alternatives to closed-source blob drivers.
I really am getting tired of having to load low quality binary drivers (I'm looking at you, ATI/AMD) just to get my hardware to work the way I need it to.
Well that's something for the end user to decide.
If people didn't put up with 'blob' drivers then nobody would be making them.
Right now you can take care of the situation by buying better hardware. Currently the only category of hardware that requires 'blob' drivers on Linux is high-performance 3D acceleration. (For regular video and average 3D performance, Intel is what you'd want.) Everything else has nice open source drivers available one way or another.
I understand this is easier said than done, but if you're building or buying a machine that you want to run Linux on, it's not that difficult.
The thing is that by the time a project manages to produce working drivers by reverse engineering the hardware, that hardware is pretty much obsolete already.
For example... It took years for the open source Linux drivers for R200 hardware to surpass the quality and speed of ATI's own proprietary drivers. By the time that happened, ATI had dropped all support for it anyway. For the R300/R400 drivers, by the time they were stable and useful for the desktop it was almost impossible to find those cards new in a retail outlet.
For Broadcom 802.11b/g wireless cards we now have stable drivers, but only as of recently. Now you're going to start seeing 802.11b/g/n cards out there from Broadcom, which means you'll have a whole other class of hardware that people are going to use ndiswrapper for. For the 'n' hardware it won't take nearly as long, but it's still going to take a while after those cards reach the market.
For modern hardware, Linux devs have figured out that the only way to keep up with the consumer market is to have direct cooperation from the hardware manufacturers themselves. For support to appear on a timely basis they need to be able to talk to the engineers, get fixes for hardware bugs, and that sort of thing.
The engineering and hardware design folks are probably more than willing to help out, but the business side of things needs an economic justification for this sort of thing. It's up to Linux end users to provide that justification. If bad companies like ATI and Nvidia continue to sell lots of hardware, then what economic justification is there for 'good' companies to take on the risk and expense of being more 'open'?
If they don't, then it's probably just going to get worse.
Modern video cards that provide 3D acceleration do most of it through software. The GPUs we see today are just specialized processors tuned for the sort of calculations that high-speed graphics require. They are not 'DirectX expressed in hardware' or anything like that. DirectX/OpenGL is practically all software that runs on both your CPU and your GPU.
Look at the latest fad in accelerated graphics: programmable shaders for doing 2D video and 3D special effects. You have GLSL compilers and that sort of thing that take code written in special languages and make it run on your GPU. It can be used for practically anything.
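Just to give a rough idea of what that kind of GPU program looks like, here's a tiny sketch written CUDA-style rather than in GLSL (the kernel name and the numbers are made up for illustration). The point is that it's just a small C-like function that gets compiled by the driver stack and run by the GPU once per pixel:

// Hypothetical per-pixel effect: brighten an image stored as floats in [0,1].
#include <cstdio>
#include <cuda_runtime.h>

__global__ void brighten(float *pixels, int n, float gain)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per pixel
    if (i < n)
        pixels[i] = fminf(pixels[i] * gain, 1.0f);  // clamp to white
}

int main()
{
    const int n = 1024;            // pretend this is a small image
    float host[n];
    for (int i = 0; i < n; ++i)
        host[i] = 0.5f;

    float *dev = nullptr;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    brighten<<<(n + 255) / 256, 256>>>(dev, n, 1.5f);   // run the effect on the GPU

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    printf("first pixel after effect: %f\n", host[0]);  // expect 0.75
    return 0;
}

Swap the body of that kernel out and you've got a blur, a color-space conversion, or a physics step; nothing about the model cares that it's "graphics".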
Both ATI and Nvidia are producing specialized hardware for the HPC (high performance computing) market. Their GPUs are very fast at certain types of calculations that are useful for some kinds of scientific computing and such things. So they are selling boxes that are pretty much just stripped-down video cards to this market. But even then they don't allow direct access to the hardware. They don't let the scientists know how to program the hardware themselves.
What they do is provide special libraries and software abstractions to make it easier to program on their GPUs. It's 'CUDA' for Nvidia and 'CTM' for ATI/AMD. This provides a standard interface that abstracts away changes between different generations of cards (so that older software stays compatible with newer hardware), simplifies programming, and also hides their 'IP' from prying eyes.
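To make that 'abstraction' point concrete, here's roughly what it looks like on the CUDA side (just a sketch using the public runtime calls, nothing vendor-internal): instead of exposing registers or command buffers, the library reports an abstract 'compute capability' number, and your program targets that so it keeps working across hardware generations:

#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    cudaGetDeviceCount(&count);

    for (int d = 0; d < count; ++d) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);

        // The runtime hands back an abstract description of the card
        // (name, compute capability, multiprocessor count) rather than
        // any register-level detail of how the chip actually works.
        printf("device %d: %s, compute capability %d.%d, %d multiprocessors\n",
               d, prop.name, prop.major, prop.minor, prop.multiProcessorCount);
    }
    return 0;
}

That's the whole trade: you get a stable, convenient programming model, and they get to keep the real hardware interface to themselves.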
The long-term trend of all this is that you're going to start seeing GPUs integrated into your central CPU as just another core.
For desktop purposes you're probably not going to see any benefit from massive numbers of identical cores on your CPU. Sure, going from 1 core to 2 cores is very good. Going from 2 to 4 is probably useful too, but I really doubt that going from 4 to 8 or 16 cores is going to provide any benefit on the desktop, even for the most hardcore PC user.
Intel currently has 80-core prototype CPUs...
So what is going to happen is that you're going to see specialized cores in your computer. That is, you'll have 2-4 generic cores designed much like the ones today, and then a few more cores that are specially designed to accelerate specific workloads. The most obvious way to do this would be to include a number of GPUs as just another bunch of cores.
Unless ATI opens up, what may happen is that you're going to require special drivers just to be able to access the full capabilities of even very basic hardware.