Originally posted by: kki000
To me, a mere 1280x720 is not "true" HD
so agp 8x is better than 4x right? rambus is better than ddr, cuz the numbers are higher right?
It's not as simple as just reading off the numbers or buying into the marketing hype; you need to know how to apply the technology and figure out which features are actually useful and which aren't.
Let's quote the whole thing:
To me, a mere 1280x720 is not "true" HD, the "true" HD is 1920x1080. However, both are actually HD content.
I was contrasting that with your statement that "1280x720 is true HD". Notice how I put quote marks around the word "true"? Those are quotes as in "so-called true." I'm saying that you may consider 720p to be true HD, but I consider 1080 to be much better HD. HOWEVER, both are HD. "True" HD is a rather pointless term (which I thought I was emphasising with the quote marks), since both are actual HD by the very definition of the word.
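For what it's worth, the raw pixel counts back this up. A quick back-of-the-envelope check (just arithmetic, nothing fancy) shows 1080 carries well over twice the pixels of 720, yet both are HD:

```python
# Pixel counts for the two common HD resolutions.
hd_720 = 1280 * 720    # 720p: 921,600 pixels
hd_1080 = 1920 * 1080  # 1080: 2,073,600 pixels

ratio = hd_1080 / hd_720  # 1080 has 2.25x the pixels of 720p
print(hd_720, hd_1080, ratio)
```

So "better HD" vs "true HD" is a matter of degree, not definition.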
And actually, AGP 8x was more a marketing ploy than anything else. Nothing ended up using that bandwidth, mostly because no sane game designer would rely on slow main memory for most operations. In fact, 3DMark showed this when you exceeded the video card's texture memory and had to swap from main memory.
There were benchmarks that ran the same card at 4x and 8x, and most showed no discernible difference.
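For reference, the theoretical numbers (a rough sketch: AGP is a 32-bit bus on a ~66 MHz base clock, transferring 1/2/4/8 times per clock depending on the mode, ignoring any protocol overhead):

```python
# Approximate peak AGP bandwidth: 4-byte (32-bit) bus, ~66.66 MHz base clock,
# with the mode multiplier giving transfers per clock.
BASE_CLOCK_MHZ = 66.66
BUS_BYTES = 4

def agp_bandwidth_mb_s(multiplier):
    return BASE_CLOCK_MHZ * BUS_BYTES * multiplier  # MB/s, theoretical peak

for mode in (1, 2, 4, 8):
    print(f"AGP {mode}x: ~{agp_bandwidth_mb_s(mode):.0f} MB/s")
```

So 8x doubled the paper bandwidth to roughly 2.1 GB/s, but if games never saturated 4x's ~1 GB/s, the extra headroom bought nothing in practice.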
A similar situation exists currently with SATA 3Gb/s hard drives. So far no hard drive even comes close to saturating it, so it's not something I'd look for in a hard drive. But I'd still want it as a feature on a motherboard, because drives such as the Raptor ARE approaching the 150 MB/s barrier of the original SATA spec with sustained transfers. I'm sure some hard drives will pass it sooner or later, and burst transfers will need it far sooner.
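To put numbers on that barrier: SATA uses 8b/10b encoding, so each data byte takes 10 bits on the wire, and the usable payload is roughly the line rate divided by 10 (a rough sketch, ignoring further protocol overhead):

```python
# Approximate usable SATA throughput: 8b/10b encoding puts 10 bits on the
# wire per data byte, so divide the line rate (in Gb/s) by 10 to get MB/s.
def sata_usable_mb_s(line_rate_gbps):
    return line_rate_gbps * 1000 / 10  # MB/s, before protocol overhead

print(sata_usable_mb_s(1.5))  # original SATA -> ~150 MB/s
print(sata_usable_mb_s(3.0))  # SATA 3Gb/s    -> ~300 MB/s
```

Which is why a drive with ~70-80 MB/s sustained transfers is nowhere near needing 3Gb/s, but bursts from the drive's cache can get there much sooner.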
Originally posted by: kki000
This car is good, but it doesnt have a hemi in it, as long as you understand its limitations yer ok?
When I say I'm fine with people purchasing this if they know its limitations, I'm specifically referring to its inability to display 1080 video at native resolution, having to downscale it instead. I don't want people buying it expecting to see the full glory of 1080 video when it cannot deliver that. Does that mean this set sucks? Of course not. Does that mean a native 1080 HD display will automatically be better? Of course not; there are other things you have to look at. If a native 1080 TV has crap picture quality, you should pass on it regardless of its native resolution.
As for cars, I'm not a car expert, but from what I recall, you can get good cars with plenty of horsepower with or without a Hemi engine.
Originally posted by: kki000
You are mixing up hdmi and hdcp again. A set can support hdcp without hdmi.
HDMI is the connector format, and HDCP is the encryption.
I know that Windows Vista will require HDCP-capable displays if you want to display HD media at HD resolution (which rules out CRTs as far as I'm aware, and almost no PC LCD monitors I know of support it either).
As for TVs, perhaps I was mixing up the terms. I know that standalone HD players will require one of them; I guess it's probably HDCP. In that case, I don't know whether this TV supports HDCP.
edit: OK, I found one of the stories I had read a while back:
http://www.engadget.com/2005/07/12/tosh...layers-will-do-high-def-only-via-hdmi/
It says HD output works only through HDMI, but it also says that's because of HDCP. So I imagine you would need BOTH in order to play HD movies at HD resolution.
Found this page:
http://www.sonystyle.com/is-bin/INTERSH...o&CategoryName=tv_hdtv_30%22to42%22TVs
It says the set supports both HDCP and HDMI, so it's fine for future standalone HD players. Which is what I said earlier anyway.
