mikeymikec
Lifer
I remember years ago reading of people who refused to ever buy a processor with integrated graphics, like it was an affront to their honor or something. It was as if Intel including it on the processor was an insult. Disabling it wasn't good enough, it had to not be there at all. I wonder how well that position has held up.
I suspect that at the time I'd have worried it would drive up CPU power requirements, cost, or system temps. My reaction wouldn't have been as strong as you suggest, but I think I would have been a bit sceptical.
I didn't trust onboard graphics for a long time. Bear in mind I saw it in its infancy on the PC, e.g. ATI onboard graphics with 1MB of shared system memory, which performed horribly at even the basics compared to almost any graphics card. That only changed once AMD started pairing onboard graphics with dedicated graphics RAM rather than shared system memory alone; since then I've been OK with it for the average user.
These days I don't see the point in getting rid of it, now that it's implemented well enough for the basics. I like being able to fall back to onboard graphics for testing, or in an emergency if my graphics card ever failed.
