That is VERY incorrect.
The difference between 1080i and 1080p is fundamental. 1080i is an interlaced display mode, which means the screen is drawn in alternating "fields" - where each field contains either the odd or the even lines of the picture. This happens ~60 times per second.
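To make the field idea concrete, here's a tiny Python sketch (the frame-as-a-list-of-rows representation is just an assumption for illustration, not how any real video pipeline stores frames):

```python
# Minimal sketch: split one 1080-line frame into its two interlaced fields.
# "frame" is assumed to be a list of 1080 rows (any row representation works).

def split_into_fields(frame):
    top_field = frame[0::2]      # even-numbered lines (0, 2, 4, ...)
    bottom_field = frame[1::2]   # odd-numbered lines (1, 3, 5, ...)
    return top_field, bottom_field

frame = [f"line {i}" for i in range(1080)]
top, bottom = split_into_fields(frame)
print(len(top), len(bottom))  # 540 540 -- each field carries half the lines
```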
Interlaced scanning is very common - and is standard on NTSC and PAL TV.
The technical reasons behind it are complex, but basically it lets you get 60 pictures per second (59.94 actually) while only using the bandwidth of 30 full frames per second. The sacrifice is that in high motion, you don't get as much spatial resolution.
Progressive scanning (1080p) is what we're all accustomed to on PC monitors. Every line of the picture is drawn on every refresh, which gives full spatial resolution at all times. 1080p60 is very intense to render and deliver, but most (video) content is actually 1080p24, since films are shot at 24fps.
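A rough back-of-the-envelope comparison makes the bandwidth point clearer (active pixels only, ignoring blanking intervals and the 59.94 vs 60 distinction):

```python
# Rough raw pixel-rate comparison for a 1920-pixel-wide picture.
WIDTH = 1920

def pixels_per_second(lines_per_picture, pictures_per_second):
    return WIDTH * lines_per_picture * pictures_per_second

modes = {
    "1080i60 (60 fields of 540 lines)": pixels_per_second(540, 60),
    "1080p24 (24 frames of 1080 lines)": pixels_per_second(1080, 24),
    "1080p60 (60 frames of 1080 lines)": pixels_per_second(1080, 60),
}

for name, rate in modes.items():
    print(f"{name}: {rate / 1e6:.1f} Mpixels/s")
# 1080i60 moves the same raw pixel rate as 1080p30 -- half of 1080p60.
```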
Interlaced video loses perceived spatial resolution in high motion because each field is only 540 lines, so in a worst-case scenario (LOTS of movement) you only get 540 lines of resolution for each picture you see. In low motion, the fields blend together nicely and you can "see" the full 1080 lines, since your eyes are tricked by persistence of vision.
It's very important to realize that interlaced scanning only properly works on interlaced displays! PC monitors, LCDs, DLPs, etc. ARE NOT interlaced - they are inherently progressive. The only "true" 1080i displays are CRTs. All other displays must deinterlace 1080i to 1080p. This is a very difficult process to do well, and in the worst case it literally throws away half the fields and scales 540 lines up to 1080. The quality of deinterlacers varies tremendously, and in the best case (with a dedicated hardware processor) you can get great results. Most of the time, though, it's best to just stick with progressive scan output to a progressive scan display. That's why 720p > 1080i on most displays.
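For the curious, here's a toy sketch of the two extremes - "weave" (combine the two fields) vs. "bob" (throw one field away and line-double the other). This is purely an illustration, nothing like the motion-adaptive processing a real TV's deinterlacer does:

```python
import numpy as np

# Toy deinterlacing sketch. A field is assumed to be a (540, 1920) array of luma values.

def weave(top_field, bottom_field):
    """Interleave two fields back into a 1080-line frame.
    Looks great on static images, but "combs" badly in motion, because the
    two fields were captured 1/60 s apart."""
    frame = np.empty((1080, top_field.shape[1]), dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

def bob(field):
    """Discard the other field and line-double this one: the worst case
    mentioned above -- 540 lines stretched up to 1080."""
    return np.repeat(field, 2, axis=0)

top = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
bottom = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
print(weave(top, bottom).shape, bob(top).shape)  # (1080, 1920) (1080, 1920)
```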
A lot of TVs are actually 1080p these days. If you see an LCD or other flat-panel TV advertising 1080i, it's likely a 720p panel that takes 1080i, does a questionable job of deinterlacing it (maybe even down to 540p, as I said), and then scales it to 720p.
So, long-winded video-nerd technical discussion aside - your best bet when connecting your TV to your PC is to determine the true native resolution of your TV, and then output that resolution or the next best thing.
If your panel is native 720p (not exactly 1280x720, but something very close to it, I forget the funky standard) - then output 720p! If your panel is 1080p, then output 1080p! Don't output interlaced video from your PC unless you're going to an old CRT HDTV or a standard NTSC or PAL CRT TV. In those cases, interlaced is the way to go!
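If you wanted to write that rule of thumb down, it really is this simple (the panel labels here are made up for illustration, not any real driver setting):

```python
# Sketch of the advice above: match the output mode to the display you actually have.
PREFERRED_OUTPUT = {
    "1080p_panel":  "1920x1080 progressive (1080p)",
    "720p_panel":   "1280x720 progressive (720p)",
    "crt_hdtv":     "1080i (CRTs are the true interlaced displays)",
    "ntsc_pal_crt": "480i / 576i interlaced",
}

def choose_output(display_type):
    return PREFERRED_OUTPUT.get(display_type, "720p progressive as a safe fallback")

print(choose_output("720p_panel"))  # 1280x720 progressive (720p)
```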
[edit]
After closely reading the thread, I see that this TV probably has a 1080p panel, but doesn't properly take 1080p60 in through HDMI. In this case, it's all up to the deinterlacer in this TV. If it's got good deinterlacing, then 1080i will end up looking better than 720p. If it's a crappy deinterlacer, then 720p will be a better bet.
[/edit]
~MiSfit