An affordable HD video camera review ;)

Todd33

Diamond Member
Oct 16, 2003
7,842
2
81
It will be cool in a year or so, but editing and authoring 1080i sounds like a pain.
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
It is cool. In the States, the A1 is about $400 more with XLR and more controls. There is an HC3 on the way, but I have not seen details yet.

And no, you would not want Paris Hilton on one of these. 4x the resolution of DV. That usually means the ugly really shines through. I have seen folks say, "Oh man, would this be great for porn!" No. It means every imperfection shows up better too.

That said, I want one, but since I shoot professionally, I want a Sony Z1 or a Canon H1.

As for editing, the workstation in My Rigs is built for it. Avid Liquid does native HD MPEG editing and works extremely well. Fast/Pinnacle/Avid, working with ATI, developed an editor that uses the GPU through DirectX. I get accelerated real-time playback on a lot of stuff, and I am supposed to have HD preview out of the AIW card, according to one of the guys who did tech support for Pinnacle. Now, if I just had an HDTV to plug it into in the editing room.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: gsellis
It is cool. In the States, the A1 is about $400 more with XLR and more controls. There is an HC3 on the way, but I have not seen details yet.

And no, you would not want Paris Hilton on one of these. 4x the resolution of DV. That usually means the ugly really shines through. I have seen folks say, "Oh man, would this be great for porn!" No. It means every imperfection shows up better too.

That said, I want one, but since I shoot professionally, I want a Sony Z1 or a Canon H1.

As for editing, the workstation in My Rigs is built for it. Avid Liquid does native HD MPEG editing and works extremely well. Fast/Pinnacle/Avid, working with ATI, developed an editor that uses the GPU through DirectX. I get accelerated real-time playback on a lot of stuff, and I am supposed to have HD preview out of the AIW card, according to one of the guys who did tech support for Pinnacle. Now, if I just had an HDTV to plug it into in the editing room.

cool to hear from a recording enthusiast :)
 
SynthDude2001

Mar 19, 2003
18,289
2
71
And another thing...what's with 1440x1080? 4:3 HD? :confused: Unless they're using nonsquare pixels (which wouldn't make much sense either), why would they not go for an actual 16:9 resolution (such as 1920x1080i)?
 
darkswordsman17

Mar 11, 2004
23,444
5,851
146
Cheaper, maybe? It doesn't need as powerful a chip inside to process the image. Since it's a relatively budget camera for what it is, they're going to cut cost somehow. This way they get to market it as 1080 capable, but also leave room to let the salespeople move people up to the more expensive ones. "This one is good, but it's not widescreen, and you want your movies to be widescreen like that new 50" plasma, don't you?"
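Rough back-of-the-envelope numbers on the processing savings (just a sketch; the figure is simple pixel math, not anything from Sony's spec sheets):

Code:
# Rough pixel-count comparison: why 1440x1080 is cheaper to push around than full 1920x1080.
full_raster = 1920 * 1080   # 2,073,600 pixels per frame
hdv_raster  = 1440 * 1080   # 1,555,200 pixels per frame

savings = 1 - hdv_raster / full_raster
print(f"{hdv_raster:,} vs {full_raster:,} pixels per frame ({savings:.0%} fewer)")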
 
SynthDude2001

Mar 19, 2003
18,289
2
71
Originally posted by: darkswordsman17
Cheaper, maybe? It doesn't need as powerful a chip inside to process the image. Since it's a relatively budget camera for what it is, they're going to cut cost somehow. This way they get to market it as 1080 capable, but also leave room to let the salespeople move people up to the more expensive ones. "This one is good, but it's not widescreen, and you want your movies to be widescreen like that new 50" plasma, don't you?"

I suppose so; that does make sense. I've just never heard 1440x1080 used as a resolution for anything, really...
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: SynthDude2001
Originally posted by: darkswordsman17
Cheaper, maybe? It doesn't need as powerful a chip inside to process the image. Since it's a relatively budget camera for what it is, they're going to cut cost somehow. This way they get to market it as 1080 capable, but also leave room to let the salespeople move people up to the more expensive ones. "This one is good, but it's not widescreen, and you want your movies to be widescreen like that new 50" plasma, don't you?"

I suppose so; that does make sense. I've just never heard 1440x1080 used as a resolution for anything, really...

http://www.sonyhdvinfo.com/article.php?...e-Chips-and-1080i-Works-on-the-HDR-FX1
 
SynthDude2001

Mar 19, 2003
18,289
2
71
Originally posted by: Snakexor
1080i is basically 540p, right? If so, would the resolution be 1440x540?

I don't think it quite works like that... as I understand it, there are actually 1080 distinct rows of addressable pixels; they're just operating (either "recording" for a camcorder or "displaying" for a native 1080i HDTV) at alternate times, like switching between even/odd/even/odd fields every 1/60 of a second.
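A toy sketch of what I mean (Python with made-up frame data, nothing camera-specific): each field carries every other row, but weaving the two fields back together gives you all 1080 distinct rows.

Code:
import numpy as np

# Stand-in for one full 1080-row frame (the values don't matter, only the rows).
frame = np.arange(1080 * 4).reshape(1080, 4)

even_field = frame[0::2]   # rows 0, 2, 4, ... -> 540 rows, one 1/60 s field
odd_field  = frame[1::2]   # rows 1, 3, 5, ... -> 540 rows, the next field

# "Weave" the fields back into their original positions: all 1080 rows are still distinct.
rebuilt = np.empty_like(frame)
rebuilt[0::2] = even_field
rebuilt[1::2] = odd_field
assert (rebuilt == frame).all()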
 
darkswordsman17

Mar 11, 2004
23,444
5,851
146
Yeah, 1080i can still show more detail than 540p, since it's got twice the number of pixels. 540p would theoretically be better for moving images, but each set of 540 lines is displayed so quickly with 1080i that it's really not that noticeable. All the local major network affiliates broadcast in 1080i for everything, and it's never been a problem for sports (NFL, college basketball, the Olympics, etc.).

It's weird though, because even though the camera is technically getting a 1920x1080 image (or rather should be), it requires much less bandwidth than 1080p since it's sending half the vertical resolution every 1/60 of a second versus the full resolution every 1/60 of a second. So in one second it's sending half as much data as 1080p. With 540p you'd have a resolution of, what, 960x540, so you're getting far fewer actual pixels.
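Quick back-of-the-envelope math on the raw pixel rates (ignoring compression entirely, so treat it as a sketch rather than real broadcast bitrates):

Code:
# Raw (uncompressed) pixel rates, just to compare how much each format moves per second.
def pixels_per_second(width, lines_per_update, updates_per_second):
    return width * lines_per_update * updates_per_second

p1080_60 = pixels_per_second(1920, 1080, 60)  # 1080p60: a full frame, 60 times a second
i1080_60 = pixels_per_second(1920,  540, 60)  # 1080i60: one 540-line field, 60 times a second
p540_60  = pixels_per_second( 960,  540, 60)  # 540p60
print(p1080_60, i1080_60, p540_60)            # 124,416,000 vs 62,208,000 vs 31,104,000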

Everything is so messed up though, as it's possible that stations could be recording at 540p or 720p and then converting it all to 1080i, or something like that. It makes it nearly impossible to tell what is actually being run.

I can't quite figure it all out. For instance, for news and some other things, my TV tells me I'm getting a 1920x1080 signal but it puts black bars up on the sides (so it's showing a 4:3 image), so I guess they're really recording maybe even just a 480p image and then adding the black bars and broadcasting it as a 1080i image.

So it looks like I was incorrect, and the 1440 pixels are wide so that it's somehow a widescreen image that gets recorded?
 
SynthDude2001

Mar 19, 2003
18,289
2
71
Originally posted by: darkswordsman17
Everything is so messed up though, as it's possible that stations could be recording at 540p or 720p and then converting it all to 1080i, or something like that. It makes it nearly impossible to tell what is actually being run.

I can't quite figure it all out. For instance, for news and some other things, my TV tells me I'm getting a 1920x1080 signal but it puts black bars up on the sides (so it's showing a 4:3 image), so I guess they're really recording maybe even just a 480p image and then adding the black bars and broadcasting it as a 1080i image.

So it looks like I was incorrect, and the 1440 pixels are wide so that it's somehow a widescreen image that gets recorded?

On the second paragraph I quoted here - usually what happens is a station will be running all day, broadcasting in 1080i (or 720p, whatever their resolution of choice is). But they will not always have HD source material available; in fact, at least with the national networks that you can pick up OTA, most of the day there's no HD content available (and almost no local stations have their own HD cameras and equipment yet for news and such), so they upconvert the SD and place the 4:3 image within the 16:9 frame with pillarbox bars on the sides. Very rarely do stations actually change their broadcast resolution depending on the content (I know of at least one station in San Antonio that switches between 480i and 1080i at set times of day, though).

But yeah, it looks like the pixels on the camcorder are wide... I still never understood the non-square pixel thing. Sure, you'll get a 16:9 1080i image, but you won't really have 1920 addressable pixels... just seems strange to me.
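If I have the geometry right (assuming a 4/3 pixel aspect ratio, which is my reading of it rather than anything from a spec sheet), the stretch and the side bars work out like this:

Code:
from fractions import Fraction

# Pixel aspect ratio that makes a 1440x1080 raster display as 16:9.
par = Fraction(16, 9) / Fraction(1440, 1080)
print(par)   # 4/3 -> each stored pixel is displayed a third wider than it is tall

# Upconverted 4:3 material inside a 1920-wide 16:9 frame: active picture plus side bars.
active_width = 1080 * 4 // 3           # 1440 pixels of actual picture
bar_width = (1920 - active_width) // 2
print(active_width, bar_width)         # 1440 active pixels, 240-pixel bars on each side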
 

Dravic

Senior member
May 18, 2000
892
0
76
1440x1080 is a broadcast dirty little secret. When DirecTV was feeling the HD bandwidth crunch before their new satellites went up, they were broadcasting some of their HD channels at 1440x1080 instead of full 1920x1080.

It's lower bandwidth, but good enough quality for the mainstream not to notice. Too bad for them, the HTPC enthusiasts over at avsforums were able to see the shift in resolution thanks to the software many of them were using to capture HDTV.
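Something along these lines is all it takes to spot it, assuming your capture software can report the coded width of the stream (the check itself is just a hypothetical sketch, not any particular tool):

Code:
# Hypothetical "HD Lite" check: flag anything carried as 1080-line video that is
# narrower than the full 1920-pixel raster.
def looks_like_hd_lite(coded_width, coded_height):
    return coded_height == 1080 and coded_width < 1920

for width in (1920, 1440, 1280):
    print(width, "HD Lite" if looks_like_hd_lite(width, 1080) else "full raster")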

I won't mind it for a personal digicam...

Hoping to get one for the family trip to Italy this year... that or a new lens for the digital SLR.

 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
1440x1080 is because the pixels are not square. It is true 16:9. You are also missing that 1080i is 60 fields per second, which works out to the same line rate as 1080p at 30 fps... ;) 720p (in HDV, anyway) is 30 fps.
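Quick arithmetic behind that comparison (just a sketch):

Code:
# Lines delivered per second: 1080i at 60 fields vs 1080p at 30 frames.
lines_1080i60 = 540 * 60    # 32,400 lines per second
lines_1080p30 = 1080 * 30   # 32,400 lines per second
print(lines_1080i60 == lines_1080p30)   # True - same raw line rate, different delivery order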