Before the internet became as ubiquitous as it is now, games had to ship "ready." Now developers have the option to rely on patches, and I would argue that leads to games coming out before they're finished. I think they (the game companies) have become too reliant on this.
I think there are a number of things at play, and I'll list them in order of importance (IMHO):
1) Software is several orders of magnitude more complex than it was in the '90s, which naturally leads to more potential bugs/defects.
2) Hardware is significantly more varied, even setting consoles aside (and even those now have some variance across different revisions, updates, etc.). No company, not even one shipping a AAA release, can test every conceivable combination of hardware, driver version, OS, etc., to guarantee there are no compatibility issues with a given user's setup. Back in the day you had a PC-compatible machine running DOS with a VGA monitor; now you have XP, Vista, Win7, AMD, Intel, nVidia, etc.
3) Number of users -- the market for games is much larger these days, so you have far more players. Somewhat tied to #2, there's also a much better chance of someone stumbling upon an obscure bug when a million people are playing your game rather than a few thousand.
4) Ease of patching -- before the internet was ubiquitous and large downloads were feasible, there was a real premium on fully testing your product, because patching it after release was difficult or impossible. It's human nature to cut some corners when you know you can easily come back and fix something later if it breaks.
So yes, I think the ease of patching definitely plays a role, but if you think creating a bug-free version of Tetris in the '80s is as easy as creating a bug-free version of World of Warcraft in 2010, then I have a bridge to sell you.