If you have followed my past posts, you have often heard terms such as “addressing fragmentation” and “solving the Android video problem”. But what does this really mean, and what is the problem?
Doesn’t newer Android support HLS, solving all your video playback and fragmentation issues? Not really.
The main challenge is that HLS was designed by Apple and optimized for iOS devices. Even though the live latency is pretty significant, everything else, from the switching logic to the playback experience, is excellent on iOS. But if you are implementing HLS on other platforms, you are working from the HLS IETF draft, which has remained in informational state for many years. As with any spec, it leaves a lot of room for interpretation, which unavoidably leads to HLS implementations of varying quality.
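To see where that room for interpretation comes in, consider the switching logic: a client parses the variants listed in the master playlist and then decides, by its own policy, which rendition to play. The draft specifies the playlist format but not the selection policy. The sketch below is illustrative only (not from any particular player): the parsing is deliberately simplified, ignoring quoted attribute values that contain commas, and the “highest variant at or below measured throughput” policy is just one of many reasonable choices.

```python
def parse_master_playlist(text):
    """Return (bandwidth, uri) tuples from an HLS master playlist.

    Simplified: assumes the variant URI is on the line directly after
    #EXT-X-STREAM-INF, and that attribute values contain no commas
    (quoted CODECS values would need a real attribute-list parser).
    """
    variants = []
    lines = [l.strip() for l in text.splitlines() if l.strip()]
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF:"):
            bandwidth = None
            for attr in line.split(":", 1)[1].split(","):
                key, _, value = attr.partition("=")
                if key.strip() == "BANDWIDTH":
                    bandwidth = int(value)
            if bandwidth is not None and i + 1 < len(lines):
                variants.append((bandwidth, lines[i + 1]))
    return variants

def pick_variant(variants, measured_bps):
    """One possible policy: highest bandwidth that fits the measured
    throughput, falling back to the lowest variant if nothing fits."""
    eligible = [v for v in variants if v[0] <= measured_bps]
    return max(eligible) if eligible else min(variants)

MASTER = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=1280x720
high/index.m3u8
"""

variants = parse_master_playlist(MASTER)
print(pick_variant(variants, 800000))   # -> (400000, 'low/index.m3u8')
```

Two clients that both parse this playlist correctly can still behave very differently, because everything in `pick_variant` — how throughput is measured, how aggressively to switch up or down — is left to the implementer.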
LongTail published an analysis of HLS issues on Android, which lists some of its current shortcomings:
Android 2.3 (Gingerbread)
- No support, despite Gingerbread being the most popular version of Android
Android 3.0 (Honeycomb)
- Streams cause tablet devices to crash
Android 4.0 (Ice Cream Sandwich)
- VOD streams do not seek
- Aspect ratios are not detected and cause image deformation
- Fullscreen causes videos to restart from the beginning
Android 4.1+ (Jelly Bean)
- Aspect ratio issue is fixed, but seek is still unavailable
- Chrome does not understand HLS, leading to broken MIME type detection
- Taking a video fullscreen causes devices to throw an error and stop playback
The original solution for video on Android was Flash Player, which gave video applications an abstraction layer to reach all Android devices, no matter what the underlying video capabilities were.
Even though it is no longer officially supported, there is a recent tutorial on how to install the archived version of Flash Player on Android – unfortunately it is only compatible with the old Android browser, not with Chrome.
This leads to a question: without Flash support and with not-very-robust native HLS playback, how is it possible that there are so many video applications on Android?
This is where the openness of the Android platform comes in. Android might not offer a high-quality out-of-the-box HLS video stack, but it provides APIs that grant low-level access to develop solutions such as Adobe Primetime Player, which includes, besides a lot of other features, its own HLS video stack. This is very different from iOS, where using the native HLS video stack is required to get approval for applications that stream over 3G/4G.
Is HLS on Android really bad? Yes, if you rely only on native capabilities – but that does not mean you cannot deploy high-quality HLS playback (or any other protocol), since Android provides flexible interfaces for extending the platform and delivering a high-quality experience. Whether this is a better or worse approach than iOS’s is a philosophical question: Android’s openness leaves more room for issues, but in return offers more freedom for extensibility.