HDTV Bitrate

I recently ranted about this on Facebook but I have decided to post it here as well.

Broadcast HDTV

  • HDTV broadcast standard = 18 Mbps (megabits per second)
  • 18 Mbps * 60 seconds * 60 minutes = 64,800 Mb per hour
  • 64,800 Mb / 8 = 8,100 MB = 8.1 GB for 1 hour of recorded MPEG-2
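The arithmetic above can be sketched in a few lines (unit conversions only, no assumptions beyond the 18 Mbps figure):

```python
# Recorded size for one hour of broadcast HDTV at the full 18 Mbps rate.
BROADCAST_MBPS = 18          # megabits per second
SECONDS_PER_HOUR = 60 * 60

megabits_per_hour = BROADCAST_MBPS * SECONDS_PER_HOUR   # 64,800 Mb
megabytes_per_hour = megabits_per_hour / 8              # 8,100 MB
gigabytes_per_hour = megabytes_per_hour / 1000          # 8.1 GB

print(gigabytes_per_hour)  # 8.1
```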

Tests from My Cable Provider (Cox)

  • 1 hour 5 minute recordings (that is what I have my DVR set to)
  • Fox, CBS, NBC (the over-the-air channels) average 6 GB (75% of broadcast quality)
  • HBO averages 4.5 GB (55% of broadcast quality)
  • Just about everything else also averages 4.5 GB (55% of broadcast quality)

What does that mean?

  • ~25% loss of bitrate for the OTA channels
  • ~45% loss of bitrate for everything else
  • OTA channels are barely above DVD bitrate quality
  • Other channels are below DVD bitrate quality
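A quick sketch of the loss math, dividing the measured sizes by the ~8.1 GB/hour broadcast maximum computed earlier. (The post's 75%/55% figures are rounded; the 65-minute recording length also makes the ratios slightly approximate.)

```python
# Fraction of broadcast quality retained per channel group.
FULL_GB_PER_HOUR = 8.1  # one hour at the full 18 Mbps broadcast rate

measured = {
    "OTA (Fox/CBS/NBC)": 6.0,   # GB per recording
    "HBO and most others": 4.5,
}

for channel, gb in measured.items():
    retained = gb / FULL_GB_PER_HOUR
    print(f"{channel}: {retained:.0%} kept, {1 - retained:.0%} lost")
```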

Why do they do this?

  • 256-QAM transmits at 38.5 Mbps
  • 2 full-rate HDTV signals can be sent per QAM channel
  • The limited number of QAM frequencies means cable providers try to cram more than 2 streams into a single QAM channel
  • In Cox's case, they jam 4 streams into a channel that is only meant to hold 2
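The squeeze can be quantified from the numbers above: two 18 Mbps streams fit in a 38.5 Mbps QAM channel, but four streams leave under 10 Mbps each, which lines up with the ~55% figure measured earlier.

```python
# How many full-rate HD streams fit in one 256-QAM channel,
# and what each stream gets when 4 are squeezed in.
QAM_MBPS = 38.5
FULL_HD_MBPS = 18

full_rate_streams = int(QAM_MBPS // FULL_HD_MBPS)   # 2 streams at full rate
per_stream_at_4 = QAM_MBPS / 4                      # 9.625 Mbps each
fraction_of_full = per_stream_at_4 / FULL_HD_MBPS   # ~0.53, near the observed 55%

print(full_rate_streams, per_stream_at_4, round(fraction_of_full, 2))
```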

Solution

Assumptions

  • Let’s assume that a family of 4 has 4 TVs (3 for the bedrooms and 1 for the family room)
  • Each TV has a DVR that can record up to 4 stations
  • Each DVR is using all 4 tuners, for a total of 16 streams being recorded at once
  • Total bandwidth required: 16 streams * 18 Mbps = 288 Mbps
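The worst-case household demand above works out as:

```python
# Peak bandwidth for the hypothetical 4-TV, 4-tuner-per-DVR household,
# with every stream at the full 18 Mbps broadcast rate.
TVS = 4
TUNERS_PER_DVR = 4
STREAM_MBPS = 18

total_mbps = TVS * TUNERS_PER_DVR * STREAM_MBPS
print(total_mbps)  # 288
```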

Gigabit Internet

  • Give everyone a 1 Gbps connection
  • 802.11ac is now available, which means we could stream 288 Mbps over Wi-Fi without much of an issue (some buffering would be involved)
  • Create a device with an RJ45 jack (optional 802.11ac) and a coax output that lets TVs and DVRs work the way they do today (similar to an MTR), but with everything delivered over the internet instead
  • Still offer standard TV for those who don’t use/want a DVR