Abstract: Adobe’s announcement regarding forthcoming Apple HTTP Live Streaming capabilities for Adobe Flash Media Server needs context and clarification. Other vendors have introduced similar technology over the last sixteen months. Video content creators and integrators (designers, developers, managers) should strongly consider using adaptive streaming for mobile/device video delivery.

by Robert Reinhardt

At NAB this past week, Adobe (twitter feed) previewed some forthcoming technology for Flash Media Server: streaming video to the iPad. Now, to be clear, I’m not privy to any details of this new technology. While I am an Adobe Community Professional, I haven’t had access to this beta or preview edition of Flash Media Server. However, most of my professional work deals with online video, throughout the production and post-production workflow. Due to client demand and my own online encoding and hosting venture with videoRx.com (twitter feed), I’ve had to seek out video solutions that work across desktops, smartphones, and most recently, tablets.

There seem to be some misunderstandings about the news of this forthcoming technology from Adobe. Here are some tweets I’ve found over the past couple of days:

  • Coming Soon From Adobe: Flash Video To Your iPad http://zite.to/fjD028 via @Ziteapp
  • Live-streaming Flash video coming to iOS devices http://t.co/WhNwAag via @mactrast
  • Brilliant move by Adobe. Tech pundits won’t understand it. Business pundits will. Flash video coming to iPad http://zite.to/evAQZ9
  • Adobe throws in towel, adopts HTTP Live Streaming for iOS arst.ch/p17 via @arstechnica

Most of these articles are making it sound like Adobe’s new technology can:

  • Convert any Flash Video (or Flash SWF content for that matter) to iOS-compatible content
  • Deliver HTML5 compatible video to any browser without the Flash plugin
  • Be seen as an admission of defeat on Adobe’s part in making content more cross-device friendly for non-Flash devices

If you believe any of these assertions, you’re wrong. The fact is that Adobe previewed video technology, not general Flash (or SWF) capabilities. I’m not sure if they disclosed what kind of source video was being streamed to iOS, but there’s nothing new being revealed from a capabilities point of view. Pre-existing solutions with Wowza Media Server or Microsoft IIS7 with Smooth Streaming can already repurpose properly encoded H.264 for iOS devices, using the specification drafted by Apple (and made open to other vendors, without licensing costs) called HTTP Live Streaming. Wowza is the most flexible of the currently available options, as it can reuse H.264 content to stream HTTP “chunks” of video to Adobe Flash Player, Microsoft Silverlight, and Apple iOS. (It can actually reach more devices and platforms via RTSP, but who wants that protocol any more? :P ) Microsoft IIS7 with Media Services 4 has a Smooth Streaming module that’s designed to work optimally with Silverlight, but it can also chunk H.264 content for iOS (view example).

So, I’ve been using the word “chunk” already. What am I talking about? Basically, these existing solutions take an H.264 video file and carve it up into chunks of equal duration, typically ranging from 2 to 10 seconds each. If you have a 1-minute video file and a chunk duration of 10 seconds, the server will dice the H.264 video into six chunks, and only deliver the chunks that your player buffers and plays. On top of that, these chunks are streamed to the video player over standard HTTP or SSL/HTTPS, enabling video to get past corporate firewalls that might block video delivered over other protocols such as RTSP (an older streaming protocol) and Adobe’s RTMP technology used by Flash Player. And, just like “real-time streaming” protocols, HTTP streaming enables you to seek within the video file without needing to download any earlier portions of the video, reducing wait times for the viewer.
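To make that concrete, here’s roughly what the index for that 1-minute example would look like in Apple’s HTTP Live Streaming format: an M3U8 media playlist listing the six 10-second chunks. (The segment filenames here are hypothetical; the tags come from Apple’s HLS specification.)

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
segment-00000.ts
#EXTINF:10,
segment-00001.ts
#EXTINF:10,
segment-00002.ts
#EXTINF:10,
segment-00003.ts
#EXTINF:10,
segment-00004.ts
#EXTINF:10,
segment-00005.ts
#EXT-X-ENDLIST
```

The player fetches this index over plain HTTP, then requests each .ts segment only as it needs it; the #EXT-X-ENDLIST tag tells the player this is a complete video rather than a live stream.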

But the real advantage of using such a chunking mechanism is adaptive streaming. If you encode the same piece of video content into multiple bitrates (or data rates) varying in quality and file size, then you can use this same chunking approach to switch the quality of a video stream on-the-fly! It’s quite possible you’ve seen this technology demonstrated already, or maybe you already use it with your own video deployment on the Internet.
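In HTTP Live Streaming terms, for example, those multiple bitrates are advertised through a variant playlist that simply points the player at a separate chunk index per quality level; the player picks whichever rendition its measured bandwidth can sustain. A sketch (the bandwidth figures and paths are illustrative):

```
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=400000
low/index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000
medium/index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1500000
high/index.m3u8
```

Because every rendition is chunked on the same time boundaries, the player can jump from one quality level to another at any chunk boundary without interrupting playback.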

Years ago, one of the larger video projects for which I provided technical direction at Schematic (now Possible Worldwide) was ABC.com’s first-generation Full Episode Player. ABC.com later used Move Networks video technology to deliver adaptive streaming over HTTP. (I wrote an article titled “Flash: Move Over?” discussing this migration to Move Networks.) While you watched an episode of a TV show, the quality of the video could adapt to the speed of your connection.

I’ll never forget the first time I saw this technology in action: I was on a business trip, watching an episode of LOST in my hotel room over the hotel’s wireless Internet. For the first ten minutes, I had decent image quality—probably one of the middle bitrates available. Then my connection speed dropped, but the video didn’t stop playing. The video player simply downgraded playback to a lower-quality stream. No rebuffering, no stuttering playback. In a few moments, my connection came back and the picture quality improved. That technology was available in 2006. Since then, we’ve seen adaptive streaming become more widely available.

In late 2007/early 2008, Adobe added adaptive streaming (or “dynamic streaming,” as they called it) over RTMP in Flash Media Server 3, and in early 2009, Flash Media Server 3.5 added HTTP support for dynamic streaming. Both of these versions of adaptive streaming from Adobe worked only with the Flash platform. In December 2009, Wowza Media Systems (twitter feed) released Wowza Media Server 2 with HTTP adaptive streaming to the iPhone, as well as RTMP adaptive streaming to Flash. For the last 16 months, production-level technology has existed to take H.264 files and stream them to both Flash (aka “Flash Video”) and iOS.

Now, just because these streaming solutions can deliver content to Safari Mobile, an iOS app, or Safari Mac (desktop) doesn’t mean they’re compatible with any other HTML5 browser implementation of the <video> tag. In fact, no other HTML5 browser has adaptive streaming capabilities without the intervention of a plug-in such as Adobe Flash Player or Microsoft Silverlight.

[UPDATE] Apparently, I missed that Android 3.0 aka “Honeycomb” (and presumably 2.2) has native support for HTTP Live Streaming via M3U8 playlists, same as Apple iOS/Safari Mac. I found more information about Android support for HLS posted by Jason Shah. If you have an Android 2.3 or 3.0 device, you can view this trailer using HTTP Live Streaming.

And here’s the clincher: anyone who’s making video experiences on the Internet really needs to be using adaptive streaming to effectively deploy video across the vast landscape of networks in use today. We’re all too familiar with how “non-3G” most 3G or higher mobile networks are in day-to-day use. You just can’t reliably deliver one video file across all devices and desktops and think the experience will be the same on all of them—you need multiple “qualities” of the video.

[UPDATE] To read more on other reasons why plug-in based video is here to stay, read Bart Czernicki’s “Five Things that HTML5 Video Currently will not do, but Silverlight or Flash will”.

And that brings me to considerations for encoding video. You not only need multiple bitrates for your video (each bitrate is a separate video file), but you also need multiple sets for each CPU target. Mobile devices have notably slower processors than our multi-core desktops, and as such usually need a simpler “version” of H.264 content. And not just simpler in bitrate or other standard video geek terms. Simpler as in the complexity of the video compression itself.

H.264 video is typically encoded in one of three profiles, in increasing order of complexity: Baseline, Main, or High. (There are other profiles, but they’re not nearly as common.) All mobile video should be encoded with the Baseline profile. Older iPhones and iPods can play Baseline H.264 video, but they can’t play the Main or High profiles. The Main profile is typically used for desktop playback. The High profile can also be used for desktop playback, but it won’t play well on older dual-core machines or older Pentium/PowerMac CPU architectures. Why not just encode all video with the Baseline profile, then? Well, Baseline can’t use all the bells and whistles that the Main and High profiles can: B-frames, CABAC entropy coding, and much more from the H.264 spec. Baseline is basically stripped of all these goodies, and intentionally so—those features carry hefty CPU decoding requirements that mobile devices can’t provide yet.

Because of these playback issues, I strongly recommend encoding a set of adaptive encodes for mobile using H.264 Baseline and another set of adaptive encodes for tablet and desktop using H.264 Main. (If you use my OmniRx encoding presets on videoRx.com, that’s exactly the two sets of H.264 output you’ll get.) One day, I can hope that my buddy Andy Beach (twitter feed) and the company he works for, SeaWell Networks, will see their push for SVC/H.264 succeed—this format, Scalable Video Coding H.264, can wrap several tiers of H.264 content into one file, reducing the headache of having several versions of a single piece of content on a server.
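As a rough sketch of what producing one mobile rendition and one desktop rendition might look like with a command-line encoder such as ffmpeg (the bitrates, frame sizes, and filenames below are illustrative assumptions, not my OmniRx presets, and older ffmpeg builds may need different flags, particularly for AAC audio):

```shell
# Mobile rendition: H.264 Baseline profile (no B-frames, no CABAC)
ffmpeg -i source.mov -c:v libx264 -profile:v baseline -level 3.0 \
       -b:v 400k -s 480x270 -c:a aac -b:a 64k mobile_400k.mp4

# Tablet/desktop rendition: H.264 Main profile
ffmpeg -i source.mov -c:v libx264 -profile:v main -level 3.1 \
       -b:v 1500k -s 1280x720 -c:a aac -b:a 128k desktop_1500k.mp4
```

Repeat each command at a few more bitrates and you have the two adaptive sets: a Baseline ladder for phones, and a Main ladder for tablets and desktops.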

Regardless of the hype around Adobe’s announcement, one thing has become clear to me in the last few days: the perception of Adobe and its commitment to quality online experiences is being bolstered by this news. Some people are seeing it as an admission of defeat because Flash technology is still not available on iOS, but I’ve never seen Adobe as a “Flash” company. While the Flash platform is an important part of Adobe solutions, Adobe has a greater interest in providing tools that online-based businesses need to grow, whether it’s client tools for HTML5 workflows or server technology for dynamic web pages, media, and published content. Perhaps more importantly, from my business’s point of view, I believe this announcement from Adobe will help raise awareness of optimized video delivery. Most Internet media buyers, marketers, web designers, and developers haven’t a clue about the details of adaptive streaming, and may only know the basics of video playback with a progressive download file from a standard web server.

If you want to see how video on the same web page can play back with Flash on the desktop or natively on Apple iOS, take a look at this recent 90-minute video uploaded, encoded, and hosted on videoRx.com: Welcome to Sportland.

More reading:

Andrew Fecheyr Lippens: A Review of HTTP Live Streaming (don’t miss the PDF paper linked in that article)