Our adventures in live mobile streaming continue. If anyone should happen to read this post on Thursday, May 27, you can see the results of this effort at http://harvard.edu/commencement2010/
So what are the lessons learned so far? Here’s a preliminary list in no particular order:
- Setting up the server side of things is the easiest part. Configuring for FMS delivery from Limelight and for Wowza on Amazon EC2 was a breeze. Multiple bitrates, the RSS playlist for JW Player, the SMIL playlist for Wowza…once you figure out the moving parts, it works almost exactly like it’s supposed to.
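For anyone curious what the Wowza side looks like: the SMIL playlist is just a small XML file listing each bitrate rendition so the player can switch between them. A rough sketch along the lines of our three-bitrate setup (the stream names here are placeholders, not our actual configuration):

```xml
<smil>
  <head></head>
  <body>
    <switch>
      <!-- one entry per rendition; system-bitrate is in bits per second -->
      <video src="mp4:mystream_100k" system-bitrate="100000"/>
      <video src="mp4:mystream_500k" system-bitrate="500000"/>
      <video src="mp4:mystream_1000k" system-bitrate="1000000"/>
    </switch>
  </body>
</smil>
```

The player requests the SMIL file instead of a single stream and picks the rendition that fits the available bandwidth.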
- Adaptive streaming from Limelight and other CDNs that use the ‘fcsubscribe’ method for load-balancing can cause a problem when switching to a stream that comes from a new edge node. More on this later…
- Mobile devices: Make sure you’re encoding H.264 with the Baseline profile at as low a level as you can go. iPhones and iPads turned out to be the easiest to support fully. Blackberries and Droids either work…or they don’t. It seems to depend on the phone model and on the network you’re on. My personal Blackberry gets the RTSP stream just fine; others around the office with different Blackberries can’t play the stream. Same with Droids – some people can play it, some can’t. I haven’t discovered why just yet. Codec issues are a likely culprit, but it’ll take some digging to find out. I have not found any useful documentation on the differences between Blackberry models in terms of live video streaming support.
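We did our encoding in FMLE and Wirecast rather than on the command line, but for reference, the kind of Baseline-profile settings I mean look roughly like this expressed as an ffmpeg command (the input, bitrates, and RTMP URL are all placeholders, not our setup):

```shell
# Sketch: H.264 Baseline at a low level for broad mobile compatibility.
# Baseline profile avoids B-frames and CABAC, which trip up older handsets.
ffmpeg -i input.mov \
  -c:v libx264 -profile:v baseline -level 3.0 \
  -b:v 100k -maxrate 100k -bufsize 200k \
  -c:a aac -b:a 32k -ar 22050 \
  -f flv rtmp://example-origin/live/mystream_100k
```

The profile and level matter more than the exact bitrate numbers – a Main or High profile stream at the same bitrate simply won’t play on many of these phones.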
- Encoders – this has been the headache of all headaches and took many, many man-hours to get right.
- Encoding three bitrates (100k, 500k, 1000k) to two different CDNs (Limelight, Wowza/EC2) takes a lot of horsepower.
- One brand new 8-core Cisco machine with a brand new Osprey 240 proved unable to capture video at all.
- Adobe Flash Media Live Encoder (FMLE) and Telestream Wirecast on Windows both depend on your display hardware and drivers. If you’re planning a headless encoding system, plan extra time to get it all working.
- A 2-core IBM/Windows/Osprey system running FMLE gave us better encoding performance than an 8-core Mac Pro/AJA system running Wirecast.
- All of the above systems had issues with audio/video sync, either being off from the start, or drifting as the webcast went on. Only on the Mac/AJA system were we able to resolve these in time for a successful webcast.
- Ordinary desktop PCs running consumer USB video capture devices are the easiest to set up and the machines most likely to work right off the bat. No audio/video sync issues occurred with these, even though we were capturing video on one of a couple of $50 USB devices and audio through the PC’s built-in audio support. The more expensive and industrial-grade the hardware, the more trouble it gave us.
- Our final encoding configuration included an 8-core Mac Pro running Wirecast for the 1Mbps and 500kbps streams, a single-core desktop PC running FMLE for the 100kbps streams, and a dual-core desktop PC with FMLE capturing a 1.2Mbps H.264 archive file.
- Some of our partner schools are using our infrastructure for mobile streaming. They’ve got Digital Rapids TouchStream appliances, and have had no encoding issues doing multiple bitrates from HD down to 3G/mobile. I’m quickly becoming a big fan of purpose-built appliances for encoding.
That’s about it for now…I’ll follow up on some of these as we do some analysis and learn more.