
This is not exactly a programming question, but it's for a website I'm making.

I've been asked to create a site for a TV channel that will stream video for a local group (it's not porn). Assuming the site is broadcasting 8 hours of HD video repeated 3 times (I'm not sure whether it makes a difference to bandwidth if it's repeated or new), and watched by 1000 people, how much bandwidth would that be?

So how much bandwidth would 24 hours of HD video watched by 1000 people require?

Also, where should something like this be hosted? The channel will have a fiber-optic internet connection; would it be better for them to run their own server or to use web hosting?

Sorry for the long question, but I searched online and didn't find any good answers.

Mankind1023
  • It really depends on the bitrate and how far you are willing to compress the video. True HD would be at least 720p, and I think the lowest bitrate you can have at 720p without making it look like mud is 1 Mbit/s (using H.264). A quick calculation gives you 1 Mbit/s × 86,400 s × 1,000 viewers ≈ 10.3 TB (yup, TeraBYTES) in 24 hours. – NullUserException Feb 22 '11 at 02:46
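(For reference, here's a minimal sketch of that comment's arithmetic in Python; the 1 Mbit/s floor is the commenter's assumption, and the ~10.3 TB figure falls out when the megabit-to-terabyte step uses binary prefixes.)

```python
# Reproducing the comment's back-of-the-envelope estimate.
# Assumptions (the commenter's, not measured values):
bitrate_mbps = 1          # Mbit/s per 720p stream, the assumed lower bound
seconds_per_day = 86_400
viewers = 1_000

total_megabits = bitrate_mbps * seconds_per_day * viewers   # 86,400,000 Mbit
total_tb = total_megabits / 8 / 1024 / 1024                 # Mbit -> MB -> TB (binary)
print(f"~{total_tb:.1f} TB transferred per day")            # ~10.3 TB
```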

2 Answers


The way you've worded the question leaves a lot open, because there are a lot of factors that determine the data rate (and therefore the bandwidth requirements) of a stream.

For instance, "HD" video is by definition any feed with a greater resolution than SD video, which is 720x480 widescreen (640x480 with a 4:3 ratio). This means you could be talking about 1080i, 720p, 1080p, or some other variant.

Then there's the issue of encoding. Compression has more bearing on data rate than resolution does. For instance, DVD-quality data rates typically range from 3 Mbit/s to 10 Mbit/s, and that's only for 480p video. A fully encoded 1080p video, on the other hand, can technically fit into a lower data rate.

For a real-world example, Hulu's 720p video streams at about 2.5 Mbit/s (ref: HowStuffWorks). According to this blog, high-quality videos on YouTube are approximately 3.0 Mbit/s to 3.5 Mbit/s. Let's assume 3.0 Mbit/s for simplicity. Multiplied by 1,000 concurrent viewers, that gives you about 3 Gbit/s of necessary bandwidth, plus overhead. Over the span of 24 hours, that's around 32 terabytes of data.
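Here's a minimal sketch of that arithmetic in Python, under the same assumptions (3 Mbit/s per stream, 1,000 simultaneous viewers, a 24-hour window):

```python
# Rough streaming requirements under the assumptions above.
stream_bitrate_mbps = 3.0   # assumed per-viewer bitrate (Mbit/s)
viewers = 1_000             # assumed simultaneous viewers
hours = 24                  # streaming window

# Aggregate bandwidth the server must sustain at any instant
aggregate_gbps = stream_bitrate_mbps * viewers / 1000
print(f"Peak bandwidth: ~{aggregate_gbps:.1f} Gbit/s (plus overhead)")   # ~3.0 Gbit/s

# Total data pushed over the whole window (Mbit -> TB, decimal prefixes)
total_megabits = stream_bitrate_mbps * viewers * hours * 3600
total_tb = total_megabits / 8 / 1_000_000
print(f"Data transferred in {hours} h: ~{total_tb:.1f} TB")              # ~32.4 TB
```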

Having a fiber-optic connection also doesn't tell us enough about the connection's available bandwidth, which will be limited by the on-site hardware and by the provider. You would need to find out the capabilities of both to determine how many bits per second the system can actually push. My guess is the channel will want to look into a hosting solution, since hosting providers have the infrastructure and internet backbone tie-ins to handle this kind of load regularly.

Here's an article discussing streaming to give you some ideas, and here's another article discussing the associated costs. For hosting solutions, a Google search for "streaming video hosting" will turn up a number of vendors.

D.N.
  • Between 8 Mbit/s and 15 Mbit/s? I highly doubt anyone streams at that rate. Hulu streams their 720p videos at less than 1.5 Mbps; iTunes' HD content (mostly 720p) is 4 Mbps – NullUserException Feb 22 '11 at 02:40
  • Good point, that's probably closer to broadcast quality versus true streaming. I'll clean that up a bit. – D.N. Feb 22 '11 at 02:42

Use a video streaming calculator

You need a streaming server with dedicated network interfaces and unmetered bandwidth, which is not something general-purpose data centers offer. The price difference would be significant if you asked a non-streaming data center for that kind of service.
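A minimal sketch of what such a calculator computes (in Python; the parameter names and the default 30-day billing period are assumptions for illustration):

```python
def streaming_estimate(bitrate_mbps: float, viewers: int,
                       hours_per_day: float, days: int = 30) -> dict:
    """Rough peak bandwidth and data-transfer estimate for a live stream."""
    peak_gbps = bitrate_mbps * viewers / 1000
    transfer_tb = (bitrate_mbps * viewers * hours_per_day * 3600 * days
                   / 8 / 1_000_000)
    return {"peak_gbps": peak_gbps, "transfer_tb_per_period": transfer_tb}

# Example: 3 Mbit/s streams, 1,000 viewers, 24 h/day for a 30-day month
print(streaming_estimate(3.0, 1_000, 24))
# {'peak_gbps': 3.0, 'transfer_tb_per_period': 972.0}
```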

Xaqron