
I'm looking for hardware to stream a large amount of media (5,000 Mbit/s and more).

Is there any hardware, or are there specialised servers, to accomplish that task? FMS or WMS doesn't matter, as long as the end user can view the videos on a web page.

How does Google do that on YouTube?

eugeneK

1 Answer


Yep, there are lots; I use Cisco CDE kit.

I'm a VoD guy; there are so many questions you need to ask yourself before you can choose a streaming technology - the first being whether you know what your client software/codec is, the second being whether you need QoS or not. Are you planning on charging for content, or will it all be free?

As for YouTube, well, it's mostly a combination of open-source code on commodity kit, with a pinch of self-developed code.

Chopper3
  • Basically I want to have a Tube site, like YouTube. Videos would be in two quality formats: low (for 1.5 Mbit connections) and high (for 5 Mbit and above) for web users. We plan to charge for high-quality content in the future. Thanks – eugeneK Jan 25 '10 at 14:05
  • Oh mate, it takes more than just asking a few questions to set up something like that; just the charging and entitlement-proxy stuff is complex. What are you using as a CMS? – Chopper3 Jan 25 '10 at 15:07
  • This is our own VOD/Tube site .... – eugeneK Jan 26 '10 at 07:20
  • I've been doing huge video-on-demand systems for about half a decade now and I reckon I've got a handle on about half of it all - it's an enormous subject and not something anyone can add sufficient detail to in the form of an SF answer. You need to understand the load on your routing, front-end load-balancers and your firewalls before you can even think about what is actually performing the streaming - and the streamers themselves are a giant engineering effort to get right too. Basically I'm saying be very careful about what you commit to, because you won't have thought of it all yet. – Chopper3 Jan 26 '10 at 12:52
  • We're talking about 10,000 connections at a time MAX... Currently we host videos on storage units connected to servers. Each storage unit holds a certain portion of all the videos, and each hard disk holds a certain portion of the videos on that unit. This outperformed RAID 10 with the same number of disks, probably because there is a large number of files, with new ones uploaded a few times an hour, so write speed affects read speed, making RAID 10 unusable... We use FMS as the streaming application; we had no luck with Red5, and Windows Media Services, which is better imho than FMS, requires WMP, which has a lot of its own issues. What do you suggest? – eugeneK Jan 26 '10 at 16:10
  • All the logic where video going to be played from taken from DB... – eugeneK Jan 26 '10 at 16:11
  • You appear to need between 15 and 50 Gbps of network bandwidth to handle that many users - do you have that much clearable bandwidth? If you're doing this off servers I'd bank on no more than 1 Gbps from each server (you can do better than this, but it often needs a fair amount of tuning), so assume 15 to 50 servers. Whether you use servers or the Cisco kit I mentioned, you're going to need between 1.9 and 6.25 GB/s of throughput from storage - which you're going to struggle to get from a single storage box, which means duplication or sharding of your video - requiring a centralised content management app (the arithmetic is sketched below this thread). – Chopper3 Jan 26 '10 at 16:19
  • "All the logic where video going to be played from taken from DB" - this seriously makes no sense to me at all sorry. – Chopper3 Jan 26 '10 at 16:42
  • I meant that each hard drive in the storage is numbered to match the video file's path in the database, so we avoid content duplication; the only duplication is backup (there's a sketch of that mapping below this thread). We can order as much bandwidth as we want. My thinking is that the 1 Gbps network cards on the servers aren't the bottleneck, but rather the servers' CPUs or, even more likely, the hard drives themselves. – eugeneK Jan 26 '10 at 18:44
  • Yes I agree, we have video servers that are consistently pushing out 4Gbps but only due to literally years of tweaking of the hardware, storage, network config and most importantly the software and kernel - hence my suggestion that you don't bank on more than 1Gbps per server, so you're not disappointed. We use a combination of very large caches (192GB), SSDs, 15krpm 2.5" SAS disks and 450GB 3.5" SAS disks based on content usage - mind you we have a client playback SLA and end-to-end QoS - you have the luxury of none of that. – Chopper3 Jan 26 '10 at 19:11
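For anyone following the capacity arithmetic in that exchange, here is a minimal sketch in Python. The inputs are the figures quoted in the thread (10,000 concurrent streams, bitrates of 1.5 and 5 Mbit/s, roughly 1 Gbit/s of usable throughput per server); everything else is just unit conversion, not anything from the original answer.

    # Back-of-the-envelope capacity maths based on the figures in the thread.
    MBIT = 1_000_000            # bits per second in one megabit
    concurrent_streams = 10_000
    per_server_gbps = 1.0       # conservative usable throughput per server

    for label, bitrate_mbit in (("low quality", 1.5), ("high quality", 5.0)):
        total_gbps = concurrent_streams * bitrate_mbit * MBIT / 1e9
        servers_needed = total_gbps / per_server_gbps
        storage_gbytes_per_s = total_gbps / 8   # bits -> bytes
        print(f"{label}: {total_gbps:.0f} Gbit/s aggregate, "
              f"~{servers_needed:.0f} servers, "
              f"~{storage_gbytes_per_s:.2f} GB/s from storage")

Running it gives 15 Gbit/s, 15 servers and about 1.9 GB/s for the low-quality case, and 50 Gbit/s, 50 servers and 6.25 GB/s for the high-quality case - the same ranges Chopper3 quotes.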
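And one way to read the "drive number matches the file path in the database" scheme eugeneK describes, again as a rough sketch: the database record for a video tells the streamer which numbered disk holds the file and where on that disk it lives. SQLite, the table and column names, and the mount-point layout here are all hypothetical, used purely for illustration.

    # Hypothetical illustration of resolving a video ID to the numbered disk
    # that holds it; table/column names and paths are made up.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE videos (
            video_id  INTEGER PRIMARY KEY,
            drive_no  INTEGER NOT NULL,  -- which numbered disk holds the file
            file_path TEXT NOT NULL      -- path relative to that disk's mount
        )
    """)

    def mount_point(drive_no):
        # Assumed mount layout: /mnt/store01, /mnt/store02, ...
        return f"/mnt/store{drive_no:02d}"

    def resolve(video_id):
        """Return the full path the streaming server should read from."""
        row = conn.execute(
            "SELECT drive_no, file_path FROM videos WHERE video_id = ?",
            (video_id,),
        ).fetchone()
        if row is None:
            raise KeyError(f"unknown video {video_id}")
        drive_no, file_path = row
        return f"{mount_point(drive_no)}/{file_path}"

    # Example: register a clip stored on disk 3, then look it up.
    conn.execute("INSERT INTO videos VALUES (42, 3, 'clips/42_high.mp4')")
    print(resolve(42))   # -> /mnt/store03/clips/42_high.mp4

With a single copy per file there is no duplication beyond backups, but a popular title is then limited to one disk's throughput, which is part of why duplicating or sharding hot content comes up in the comments.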