<<
Why can't we insist on a new architecture or a manner of adding another 10 IRQ's or so?
>>
<<
We will be rid of IRQ's soon enough, when we move away from the x86 architecture; 'til then we will have to juggle. Adding more IRQ's, if it is even possible, would add latency.
>>
We will never "be rid" of IRQ's. Hardware interrupts are a feature of every chip architecture around, including any architecture that could replace the x86. MIPS? It's got hardware interrupts. Alpha? Yup. PowerPC? You bet. IA64? Of course.
We are not moving away from the x86 architecture, at least not in the next 5 years, probably not in the next 10. Compatibility is too important to most PC owners. While binary emulation could help ease the transition for business users whose applications can afford to give up 75-80% of their CPU cycles, game players (who, ironically, would stand to benefit the most from a new architecture) would never tolerate the loss of CPU cycles.
I know the x86 opcode map like the back of my hand, and believe me, I'm not any happier with the x86 than you are. Still, alternative architectures would have to outperform x86 machines by a really wide margin for a switch to happen, and that won't happen in the next 5 years.
Adding more IRQ's would mean adding more lines to the peripheral bus. Every IRQ needs its own signal wire, although the PCI bus sidesteps this by giving each slot just four level-triggered interrupt lines (INTA#-INTD#) that devices are expected to share. It would be difficult to support an arbitrary number of IRQs.
What we need are PCI and USB peripherals that weren't designed by inconsiderate idiots. Just why does the SB Live behave badly in some people's machines when forced to share IRQ's? Because no one has held Creative to account for it.
This is actually where review sites like Anand's could make a real difference. Punish vendors like ATI and Creative in reviews for producing unruly peripherals. Or maybe compile a list of peripherals that are known to behave badly.