How much bandwidth does one need for 1920x1080 HDTV?

MadRat

Lifer
Oct 14, 1999
11,961
278
126
Correct my math:

1920 x 1080 = 2,073,600 pixels, so 2,073,600 bits per frame at 1 bit per pixel

Progressive 24p (24 Hz refresh): 49,766,400 bits/sec at 1 bit per pixel
@ 8-bit (256 colors) we get 398,131,200 bits/sec (398.1 Mb/s = 49.8 MB/sec)
@ 16-bit (65K colors) we get 796,262,400 bits/sec (796.3 Mb/s = 99.5 MB/sec)
@ 24-bit (16.7M colors) we get 1,194,393,600 bits/sec (1.19 Gb/s = 149.3 MB/sec)

Progressive 30p (30 Hz refresh): 62,208,000 bits/sec at 1 bit per pixel
@ 8-bit (256 colors) we get 497,664,000 bits/sec (497.7 Mb/s = 62.2 MB/sec)
@ 16-bit (65K colors) we get 995,328,000 bits/sec (995.3 Mb/s = 124.4 MB/sec)
@ 24-bit (16.7M colors) we get 1,492,992,000 bits/sec (1.49 Gb/s = 186.6 MB/sec)

Progressive 60p (60 Hz refresh): 124,416,000 bits/sec at 1 bit per pixel
@ 8-bit (256 colors) we get 995,328,000 bits/sec (995.3 Mb/s = 124.4 MB/sec)
@ 16-bit (65K colors) we get 1,990,656,000 bits/sec (1.99 Gb/s = 248.8 MB/sec)
@ 24-bit (16.7M colors) we get 2,985,984,000 bits/sec (2.99 Gb/s = 373.2 MB/sec)

Interlaced 60i (30 Hz x 2 interlaced fields per frame) would be the same rate as 60p, right?

These rates seem awfully steep for today's DVD and cable TV technology. If I miscalculated somewhere, please correct me where I've gone wrong. How does one feed one of these giant screens at its full resolution?
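
Here's a rough Python sketch of the same arithmetic, in case anyone wants to poke at it (the function name and the decimal Mb/MB units are just my choices, and the 60i row assumes each field carries only half the lines):

# Quick sketch of the raw-bandwidth math above (uncompressed 1920x1080).
WIDTH, HEIGHT = 1920, 1080

def raw_rate(fps, bits_per_pixel, interlaced=False):
    """Uncompressed video rate as (Mb/s, MB/s), decimal units."""
    pixels_per_sec = WIDTH * HEIGHT * fps
    if interlaced:
        pixels_per_sec //= 2            # each field carries half the lines
    bps = pixels_per_sec * bits_per_pixel
    return bps / 1e6, bps / 8e6

for label, fps, interlaced in [("24p", 24, False), ("30p", 30, False),
                               ("60p", 60, False), ("60i", 60, True)]:
    for depth in (8, 16, 24):
        mb, mbyte = raw_rate(fps, depth, interlaced)
        print(f"{label} @ {depth}-bit: {mb:8.1f} Mb/s = {mbyte:6.1f} MB/s")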
 

Dug

Diamond Member
Jun 6, 2000
3,469
6
81
1920x1080 for high definition will be interlaced.

1280x720 will be progressive.

Plus most content will be compressed.
For instance, an MS high-definition feed (which btw isn't a true 1920) will run about 8.2 Mb/s.
 

MadRat

Lifer
Oct 14, 1999
11,961
278
126
I was using this link as my source. 1920 x 1080 comes only in 60i, 24p, and 30p. Although 60p is not supported in the HDTV format, it's a natural progression of the numbers.

After reading some more on this site, it sounds like 60i sends two 1920 x 540 fields per frame, making its data rate comparable to 30p.

Interlaced 60i (60 fields/sec, 1920 x 540 per field): 62,208,000 bits/sec at 1 bit per pixel
@ 8-bit (256 colors) we get 497,664,000 bits/sec (497.7 Mb/s = 62.2 MB/sec)
@ 16-bit (65K colors) we get 995,328,000 bits/sec (995.3 Mb/s = 124.4 MB/sec)
@ 24-bit (16.7M colors) we get 1,492,992,000 bits/sec (1.49 Gb/s = 186.6 MB/sec)
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
HDTV isn't raw, it's compressed. The level of compression varies, but the actual data rates are nowhere near those numbers.

"HDTV is based on timing parameters of SDTV. For purposes of transfers and conversions, it makes sense to design the timing, or sampling, from a common point of reference. The frequency 2.25 MHz becomes important because it's the lowest common multiple between NTSC and PAL systems for digital sampling. Six times 2.25 equals 13.5 MHz, the sampling rate used for both in standard component digital. HDTV maintains an integer relationship for the same reasons. Our 1920 x 1080 active high definition system utilizes a 74.25 MHz sample rate. Multiply 2.25 by 33 and you'll get 74.25. There are a total of 2200 samples per line and 1125 lines. The compressed HDTV signal requires the full 19.4 Mbps data rate, a compression ratio of 64:1."
 
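A quick sanity check on those quoted figures, in rough Python (the 4:2:2, 10-bit framing for the raw rate is my own assumption; the quote itself only gives the sample rate and the 19.4 Mbps channel):

# Rough check of the quoted sampling and compression figures.
base_hz = 2.25e6                       # common SDTV/HDTV sampling reference
sample_rate = 33 * base_hz             # 74.25 MHz
samples_per_line, total_lines, fps = 2200, 1125, 30
print(int(sample_rate), samples_per_line * total_lines * fps)   # both 74,250,000

active_pixels_per_sec = 1920 * 1080 * fps                       # active picture only
raw_bps = active_pixels_per_sec * 10 * 2     # assumed 4:2:2, 10-bit: Y plus Cb+Cr at half rate each
print(raw_bps / 1e6, raw_bps / 19.4e6)       # ~1244 Mb/s raw, ~64:1 against the 19.4 Mbps channel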

Dug

Diamond Member
Jun 6, 2000
3,469
6
81
The real problem is getting a display that will truly show the high-definition signal.

Due to technical reasons having to do with the video equipment, recording technologies, and the 19.2 Mbit/s-limited ATSC channel, some HDTV signals will not reach their nominal resolution. Most notably, 1080i60 is impossible to broadcast without artifacts at this bandwidth. Most 1080i broadcast signals actually are filtered to 1440 horizontal samples to allow adequate compression, and most current consumer HDTVs based on CRTs cannot resolve even 1440 horizontal samples (most rear-projection CRTs will resolve 1200-1300 at best, unless based on 9" guns)

There are two problems:
1. Only a handful of projectors and TVs can really display the true resolution, and they cost around $35k.
2. Getting that resolution to your TV without artifacts, which is a problem with 1080i.

Feeding the display a better compressed signal will alleviate the need for more bandwidth.

 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: Dug
The real problem is getting a display that will truly show the high-definition signal.

Due to technical reasons having to do with the video equipment, recording technologies, and the 19.2 Mbit/s-limited ATSC channel, some HDTV signals will not reach their nominal resolution. Most notably, 1080i60 is impossible to broadcast without artifacts at this bandwidth. Most 1080i broadcast signals actually are filtered to 1440 horizontal samples to allow adequate compression, and most current consumer HDTVs based on CRTs cannot resolve even 1440 horizontal samples (most rear-projection CRTs will resolve 1200-1300 at best, unless based on 9" guns)

There are two problems:
1. Only a handful of projectors and TVs can really display the true resolution, and they cost around $35k.
2. Getting that resolution to your TV without artifacts, which is a problem with 1080i.

Feeding the display a better compressed signal will alleviate the need for more bandwidth.

Yep, that's what I'm finding. The WMV-HD clips Microsoft uses to demo HD are 1440x1080, 24 fps, 24-bit color. That works out to approx. 854 Mbps for raw video. The MS video stream runs at approx. 8000 Kbps, so the effective compression ratio on this piece is approx. 109:1. The only place you actually have an 854 Mbps stream is between your CPU and your video card; the video card then streams it out either over a dual-link DVI connection or as analog video.

Assuming you maintain roughly the same compression ratio for full 1920x1080 24p, you get a stream of ~11 Mbps. Apply that figure to a 2-hour movie and you need roughly 9.7 GB to store the video.

Similarly, WMA9 can compress high-quality 5.1-channel, 24-bit, 48 kHz audio to about 800 Kbps. If you assume you'll use this to compress the audio for 2 hours, you get another 700 MB of audio data.

So all in all you need roughly 10.5 GB of storage space for a WMV9-compressed 1080p HD movie. Obviously, to decode all this into something useful, you need a powerful decoding system. In fact, MS recommends a 3.0 GHz Pentium 4 for its pseudo-1080p stream. Assuming that computing time is approximately linear in the number of pixels, you'd need a 4 GHz P4 (or equivalent) to handle the monstrous decoding requirements put forth by HD streams. While dedicated hardware will definitely do a better job of decoding this than a general-purpose CPU, this gives you an idea of the processing power you'll need for 1080p video!
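
Here's the same back-of-the-envelope math as a little Python scratchpad (the bitrates and the linear scaling with pixel count are just the assumptions I made above, nothing official from MS):

# Ballpark storage and CPU figures for a 2-hour WMV9 1080p movie (my own assumptions).
demo_kbps  = 8000                       # MS 1440x1080 demo stream
video_kbps = demo_kbps * (1920 / 1440)  # scale width up to full 1920 -> ~10,700 Kbps
audio_kbps = 800                        # WMA9 5.1, 24-bit, 48 kHz
seconds    = 2 * 60 * 60

video_gb = video_kbps * 1024 * seconds / 8 / 1024**3   # ~9.2 GB (closer to 9.7 GB if you round up to 11 Mbps)
audio_mb = audio_kbps * 1024 * seconds / 8 / 1024**2   # ~700 MB
cpu_ghz  = 3.0 * (1920 * 1080) / (1440 * 1080)         # ~4.0 GHz, assuming decode cost scales with pixels

print(f"video ~{video_gb:.1f} GB, audio ~{audio_mb:.0f} MB, CPU ~{cpu_ghz:.1f} GHz P4-equivalent")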

The other problem is actually displaying a 1080p HD stream on a monitor, as Dug said!
 

manko

Golden Member
May 27, 2001
1,846
1
0
Of course, you can always go with a dedicated hardware decoder board if your CPU isn't up to the task.

EDIT: Uh, what he said. ^ ;)
 

JYDog

Senior member
Feb 17, 2003
290
0
0
Originally posted by: Dug
The real problem is getting a display that will truly show the high-definition signal.

Due to technical reasons having to do with the video equipment, recording technologies, and the 19.2 Mbit/s-limited ATSC channel, some HDTV signals will not reach their nominal resolution. Most notably, 1080i60 is impossible to broadcast without artifacts at this bandwidth. Most 1080i broadcast signals actually are filtered to 1440 horizontal samples to allow adequate compression, and most current consumer HDTVs based on CRTs cannot resolve even 1440 horizontal samples (most rear-projection CRTs will resolve 1200-1300 at best, unless based on 9" guns)

There are two problems:
1. Only a handful of projectors and TVs can really display the true resolution, and they cost around $35k.
2. Getting that resolution to your TV without artifacts, which is a problem with 1080i.

Feeding the display a better compressed signal will alleviate the need for more bandwidth.



I've seen several 1920 x 1080p sets using Intel's LCOS for less than 6Gs.

From froogle.com [search: LCOS]


Toshiba LCOS 57" Wide Screen Televisions 57HLX82

Code: toshiba-57hlx82
Price: $5,999.95 (MSRP: $8,999.99)

Definitely digital, definitely different. The 57HLX82 is simply the HIGHEST RESOLUTION television available today. Powered by a unique digital light engine, and comprised of the WORLD'S FIRST three 1080p chip technology...

The Toshiba Advantage: Three 1080p LCOS chips use the power of 6,220,800 pixels to create the highest resolution television picture available.

Toshiba's IDSC Digital technology displays all input signals at 1080p for the sharpest, clearest picture possible from every source.

A new Ultra Fine Pitch Screen (less than .1mm) allows the 57HLX82 to display the incredible sharpness of HDTV signals displayed at 1080p.

 

Dug

Diamond Member
Jun 6, 2000
3,469
6
81
Originally posted by: manko
Of course, you can always go with a dedicated hardware decoder board if your CPU isn't up to the task.

EDIT: Uh, what he said. ^ ;)

Unfortunately there isn't a dedicated hardware decoder board for WM9 HD material yet.
It takes a hefty CPU and video card to do it.

JYDog - Wow! Thanks for the info. That's news to me. I didn't realize anyone had a (somewhat) affordable LCOS TV out.
That's definitely the wave of the future. I hope they can do something with it for front projectors.

 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: Dug
Originally posted by: manko
Of course, you can always go with a dedicated hardware decoder board if your CPU isn't up to the task.

EDIT: Uh, what he said. ^ ;)

Unfortunately there isn't a dedicated hardware decoder board for WM9 HD material yet.
It takes a hefty CPU and video card to do it.

JYDog - Wow! Thanks for the info. That's news to me. I didn't realize anyone had a (somewhat) affordable LCOS TV out.
That's definitely the wave of the future. I hope they can do something with it for front projectors.
TI has DSPs that can handle WMV9. As of now, however, there is no demand for it in the consumer marketplace. Once the HD-DVD specs get ironed out, solutions using these DSPs will start appearing in electronics manufacturers' roadmaps.
LINK
 

hahher

Senior member
Jan 23, 2004
295
0
0
Originally posted by: RaynorWolfcastle

So all in all you need roughly 10.5 GB of storage space for a WMV9-compressed 1080p HD movie. Obviously, to decode all this into something useful, you need a powerful decoding system. In fact, MS recommends a 3.0 GHz Pentium 4 for its pseudo-1080p stream. Assuming that computing time is approximately linear in the number of pixels, you'd need a 4 GHz P4 (or equivalent) to handle the monstrous decoding requirements put forth by HD streams. While dedicated hardware will definitely do a better job of decoding this than a general-purpose CPU, this gives you an idea of the processing power you'll need for 1080p video!

Remember that similar processing problems came up in the DVD/MPEG-2 days when the format first came out; the Hollywood decoder card comes to mind. But we do just dandy without them now.

I think this time around, though, the temporary need for hardware decoding will be filled by the GPU instead of a standalone card.