Suspicious-Teach8788
Lifer
- Feb 19, 2001
Originally posted by: Staples
Um yeah, you still need a 1080p TV to see the effective resolution of 1080i. The support for 1080p is just a bonus.
This article just states the obvious, but few people know enough about the subject to have drawn that conclusion on their own.
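Just to put numbers on the "effective resolution" point, here's a rough back-of-the-envelope sketch in Python. The 60 fields/s and 60 frames/s rates are my assumptions (the usual US numbers), not something from the article:

```python
# Rough comparison of 1080i vs 1080p signal throughput.
# Assumes 60 fields/s for 1080i and 60 full frames/s for 1080p.

WIDTH, HEIGHT = 1920, 1080

# 1080i sends two interlaced fields (odd lines, then even lines) per full frame,
# so each 1/60 s field carries only 540 of the 1080 lines.
fields_per_second = 60
pixels_1080i = WIDTH * (HEIGHT // 2) * fields_per_second

# 1080p sends every line of every frame, 60 times a second.
frames_per_second = 60
pixels_1080p = WIDTH * HEIGHT * frames_per_second

print(f"1080i: {pixels_1080i / 1e6:.1f} Mpixels/s")   # ~62.2 Mpixels/s
print(f"1080p: {pixels_1080p / 1e6:.1f} Mpixels/s")   # ~124.4 Mpixels/s
print(f"Either way you need a {WIDTH}x{HEIGHT} panel to show every line.")
```

That's roughly 62 Mpixels/s for 1080i versus 124 Mpixels/s for 1080p: the same panel requirement, but half the data per second.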
This is like people saying you didn't need SM 3.0 because games didn't support it yet (ATI fanbois). Remember that argument from the X800 vs 6800 era?
Same deal, and it's just as shortsighted.
Three years from now, people will laugh at you for buying 1080i once 1080p is mainstream. What do you say then?
It's the same story as dual core: it didn't seem to matter much when it first came out, and a lot of people still went for fast single-core processors because their reasoning was "gaming." Now, whether you game or not, is there even a reason to go single core? It doesn't make sense. Once dual core is the standard, you don't save much by going single core unless you're trying to build a $500 computer. When your budget is $1000, I don't get why anyone would go single core.
Similarly, if your budget is reasonable, you should be getting 1080p, not 1080i. If you're going cheap on an HDTV today, you're thinking about it wrong: you're still an early adopter with this technology, and if you expect to get away with spending very little, don't expect much.
Sure, 1080i and 1080p might look the same from 10 feet away (there's a quick sketch of that below)... but how about this:
WHY do people play games at high resolutions? Shouldn't you just play at a low resolution and max out AA and AF instead? Because if you're sitting at a certain distance, you shouldn't notice the difference, right? (Note that my argument applies to CRT monitors, since we all know LCDs need to run at their native resolution or the quality suffers.)
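On the "10 feet away" point, whether the extra lines are visible at all comes down to screen size, viewing distance, and visual acuity. Here's a quick Python sketch; the 50-inch screen, 10-foot distance, and ~1 arcminute acuity figure are all my own assumptions, just to illustrate the math:

```python
import math

# Can you resolve individual scan lines at a given distance?
# Screen size, distance, and acuity below are assumed example numbers.

DIAGONAL_IN = 50.0          # assumed screen diagonal, inches
DISTANCE_IN = 10 * 12.0     # 10 feet, in inches
LINES = 1080
ACUITY_ARCMIN = 1.0         # rough limit of 20/20 vision

# Height of a 16:9 screen from its diagonal, then the height of one scan line.
height_in = DIAGONAL_IN * 9 / math.hypot(16, 9)
line_height_in = height_in / LINES

# Angle one scan line subtends at the viewer's eye, in arcminutes.
angle_arcmin = math.degrees(math.atan(line_height_in / DISTANCE_IN)) * 60

print(f"One scan line subtends {angle_arcmin:.2f} arcmin at {DISTANCE_IN/12:.0f} ft")
if angle_arcmin < ACUITY_ARCMIN:
    print("Lines are below the acuity limit; 1080i and 1080p look about the same here.")
else:
    print("Lines are resolvable; the extra detail of 1080p should be visible.")
```

With those assumed numbers, one scan line comes out to roughly 0.65 arcminutes, which is under what 20/20 vision resolves, so at that distance the 1080i vs 1080p difference is marginal. Sit closer or buy a bigger screen and it starts to matter.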