How to Solve the Video Fragmentation Challenge of 4,000 Distinct Android Devices
As I pointed out in my previous Guide to Mobile Video Delivery, Android fragmentation is a challenge for mobile delivery. OpenSignalMaps ran an interesting study that provides some mind-blowing numbers.
Over the past 6 months we’ve been logging the new devices that download OpenSignalMaps, we’ve based this study on 681,900 of these devices. We’ve looked at model, brand, API level (i.e. the version of Android) and screen size and we’ve tried to present this in the clearest form we can. […] We’ve spotted 3997 distinct devices. [via OpenSignalMaps]
However you visualize it, that is a lot of devices.
Adobe AIR provides an abstraction layer on top of a complex and fragmented foundation, with a lot of continuous effort invested in supporting different chipsets and drivers. As pointed out in a previous blog post, even the supported streaming protocols differ based on the Android version. Once built on top of AIR, the application will adapt to new devices with minimal effort thanks to continuous AIR updates. [via Guide to Mobile Video Delivery]
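To make the protocol fragmentation concrete, here is a minimal sketch of the kind of per-version branching an app has to carry when there is no abstraction layer underneath it. This is not AIR's actual logic, and the API-level thresholds and protocol names are assumptions chosen for illustration only:

```java
// Illustrative only: the sort of Android-version switch an app must
// maintain itself without a layer like AIR. Thresholds are assumptions.
public class ProtocolPicker {

    /** Picks a streaming protocol name for a given Android API level. */
    static String pickProtocol(int apiLevel) {
        if (apiLevel >= 11) {
            // Honeycomb (3.0) and later: assume HLS is available
            return "HLS";
        } else if (apiLevel >= 8) {
            // Froyo/Gingerbread: fall back to RTSP streaming
            return "RTSP";
        }
        // Anything older: plain HTTP progressive download
        return "HTTP progressive";
    }

    public static void main(String[] args) {
        System.out.println(pickProtocol(15)); // Ice Cream Sandwich device
        System.out.println(pickProtocol(9));  // Gingerbread device
    }
}
```

Multiply this by codec support, screen sizes, and driver quirks across thousands of device models and the appeal of letting a runtime handle the branching becomes obvious.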
Does this mean AIR will run on all of those devices? Probably not all of them, since AIR has minimum hardware requirements:
- ARMv7 processor with vector FPU, minimum 550MHz, OpenGL ES 2.0, H.264 and AAC HW decoders
- Android™ 2.2, 2.3, 3.0, 3.1, 3.2, and 4.0
- 256MB of RAM
But it does run on the devices that are powerful enough for video, which are exactly the ones you would target with a video application. There is clear value in building on top of a layer like AIR rather than wrestling with video fragmentation across hundreds or thousands of test devices.