Crashing competing media players on Android (abecassis.me)
49 points by etix on April 18, 2014 | 26 comments



Hardware acceleration of video decoding on Android is really a mess...

In addition to the point of the article (asserting on a file that is not processed correctly, instead of cleanly bailing out), every CPU/GPU combination behaves differently, and the Android test suite is extremely limited. On top of that, there are many layers (OMX IL, Stagefright, MediaCodec, OMX AL, Java Media) that interact in complex ways, and every vendor messes with those layers differently...

For example, for a long time, even after the MediaCodec API was out (Android 4.1), you could get the data back from the codec, but the frame format was not standardized and there were no tests for it in the Android validation suite! So every GPU vendor did something different.

On top of that, the HW decoders don't even report the profile and level they support, so you have to guess.
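(To be fair, MediaCodecList exposes a profileLevels array since API 16, but on many devices it's empty or wrong, so querying it only tells you what the vendor bothered to declare. Something like this sketch:)

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;
    import android.util.Log;

    // Dump what each H.264 decoder *claims* to support (API 16+).
    // In practice the profileLevels array is often empty or inaccurate.
    static void dumpAvcDecoderCaps() {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (!"video/avc".equals(type)) continue;
                MediaCodecInfo.CodecCapabilities caps =
                        info.getCapabilitiesForType(type);
                for (MediaCodecInfo.CodecProfileLevel pl : caps.profileLevels) {
                    Log.d("caps", info.getName()
                            + " profile=" + pl.profile + " level=" + pl.level);
                }
            }
        }
    }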

And even these days, there is no way to get a YUV 420 surface that works on most devices, even when using the KitKat APIs.

And so many joyful things :)


The MediaCodec support is SO BAD in Android. I spent months and months fighting it every step of the way to try to get real-time streaming video working in an app. YUV 420 vs. RGB pixel formats still give me nightmares. I remember testing 3 different phones and each one returning a different pixel format: two were YUV-based (422 or 410, I think, but it's been a year and I try to block that out) and one RGB. Bleh! That project made me hate Android for about 6 months after I was done with it.
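For anyone hitting the same wall, the per-device dance looked roughly like this (a sketch; `codec` and `TIMEOUT_US` come from the surrounding decode loop):

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int index = codec.dequeueOutputBuffer(info, TIMEOUT_US);
    if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        MediaFormat fmt = codec.getOutputFormat();
        int color = fmt.getInteger(MediaFormat.KEY_COLOR_FORMAT);
        if (color == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar
                || color == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
            // "standard" I420 / NV12 layouts: pick the matching converter
        } else if (color >= 0x7F000000) {
            // OMX vendor-extension range: some proprietary (often tiled)
            // layout needing its own converter, one per SoC family
        } else {
            // give up and fall back to software decoding
        }
    }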


Sounds like a solid opportunity for some middleware.


This is the reason we're investing so much time in VLC for Android: taking care of all the dirty work so it just plays your files no matter what device you're using.


VLC is a great model for playing files but getting a raw bit stream from a server and then converting that stream into a frame to play on a surface is a giant pain in the ass IMO.


You should really look at libVLC :)


Adobe AIR supposedly handles media well. These guys offer a library, but licensing seems to be 'call us': https://github.com/yixia/VitamioBundle

source: http://stackoverflow.com/questions/20366848


Looks like the license states it's unrestricted for individual developer use, which seems reasonable to me.


I worked on Adobe Flash's Android video support, so I feel your pain. :) Flash sits in the uncomfortable position between an internet's worth of poorly-encoded MP4 files and all the crappy Android devices' hardware decoders.

Google's Stagefright developers told me they did not want to provide an API to query the hardware decoder's profile and level support because some hardware supports H.264 capabilities beyond its stated profile or level. Their recommendation was to "just try decoding the video" to see if it works (as if that is something that can be easily determined programmatically), ignoring the fact that playing an unsupported or corrupt file will trigger Stagefright's fatal CHECK() aborts. In Flash, we had to duplicate a lot of Stagefright's MP4 parsing code so Flash could fall back to its H.264 software decoder instead of hitting Stagefright's CHECK() "landmines".
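Our code isn't shareable, but the flavor of that duplicated parsing is easy to sketch (an illustration, not our actual code): walk the top-level MP4 boxes yourself and route anything suspicious to the software path before Stagefright's CHECK() can abort you.

    import java.io.File;
    import java.io.IOException;
    import java.io.RandomAccessFile;

    // Loose pre-flight sanity check on the top-level MP4 box structure.
    static boolean looksLikeSaneMp4(File f) throws IOException {
        RandomAccessFile in = new RandomAccessFile(f, "r");
        try {
            long offset = 0, len = in.length();
            while (offset + 8 <= len) {
                in.seek(offset);
                long size = in.readInt() & 0xFFFFFFFFL; // 32-bit box size
                in.skipBytes(4);                        // four-char box type
                if (size == 0) break;                   // box runs to EOF: legal
                if (size == 1) size = in.readLong();    // 64-bit "largesize"
                if (size < 8 || offset + size > len) {
                    return false; // truncated or lying box header
                }
                offset += size;
            }
            return true;
        } finally {
            in.close();
        }
    }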

We also had to write almost 10 different color conversion functions for all the random color formats used by long-tail Android devices.
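(Each of those converters is the same BT.601 fixed-point math with a different memory layout; an NV21 one, i.e. a full-size Y plane followed by interleaved V/U, looks roughly like this sketch:)

    // NV21 -> ARGB, BT.601 fixed-point. One of ~10 near-identical
    // functions, each differing only in how it walks the chroma plane.
    static void nv21ToArgb(byte[] yuv, int[] argb, int width, int height) {
        final int frameSize = width * height;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int Y = Math.max(0, (yuv[y * width + x] & 0xFF) - 16);
                int uv = frameSize + (y >> 1) * width + (x & ~1);
                int V = (yuv[uv] & 0xFF) - 128;
                int U = (yuv[uv + 1] & 0xFF) - 128;
                int r = (298 * Y + 409 * V + 128) >> 8;
                int g = (298 * Y - 100 * U - 208 * V + 128) >> 8;
                int b = (298 * Y + 516 * U + 128) >> 8;
                r = r < 0 ? 0 : (r > 255 ? 255 : r);
                g = g < 0 ? 0 : (g > 255 ? 255 : g);
                b = b < 0 ? 0 : (b > 255 ? 255 : b);
                argb[y * width + x] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
    }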


The funny thing is that Android parses all your media files behind your back, so even if you don't use its demuxer, Stagefright's CHECK()s can bite you...

> We also had to write almost 10 different color conversion functions for all the random color formats used by long-tail Android devices.

We have those too. Are some of yours shareable?


Unfortunately, the Android color conversion functions in Adobe's Flash Player code are all closed source.

But Mozilla has some Android color conversion functions, including some quirk workarounds for specific devices, for Firefox on Android and Firefox OS:

https://hg.mozilla.org/mozilla-central/file/582b2d81ebe1/med...

Google also provides libyuv, which is used in Firefox as well:

https://code.google.com/p/libyuv/


> On top of that, the hw decoders don't even give the profile and level they support, so you have to guess them.

We actually put together a codec tester that runs right before the first time a user plays a video precisely because of this. It throws a few frames of every profile/level we plan to support at each decoder component on the device to figure out which profiles we can use and which component to use to decode them. :(
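Roughly, the probe looks like this (a simplified sketch, names made up; `format` must carry the stream's csd-0/csd-1 setup data). Because a CHECK() abort kills the whole process, a real implementation also wants to persist a "probe in progress" flag first, so a native crash during probing gets blamed on the right codec at the next launch:

    import java.nio.ByteBuffer;
    import android.media.MediaCodec;
    import android.media.MediaFormat;

    static boolean probeDecoder(String codecName, MediaFormat format,
                                byte[] testFrame) {
        MediaCodec codec = null;
        try {
            codec = MediaCodec.createByCodecName(codecName);
            codec.configure(format, null, null, 0);
            codec.start();
            ByteBuffer[] inputs = codec.getInputBuffers();
            int index = codec.dequeueInputBuffer(100000);
            if (index < 0) return false;
            inputs[index].clear();
            inputs[index].put(testFrame);
            codec.queueInputBuffer(index, 0, testFrame.length, 0, 0);
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            codec.dequeueOutputBuffer(info, 500000); // poke it once
            return true; // configured, started and ate input without blowing up
        } catch (Exception e) {
            return false; // configure/start/decode rejected: skip this codec
        } finally {
            if (codec != null) codec.release();
        }
    }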


The most questionable thing is why they kept the asserts in production code. Looks like an unfinished job to me.


That's how I feel about most things when I read through the Android documentation: it all just seems unfinished.


When given the option to either abort the process or put the hardware into an unrecoverable state that requires a reboot, you leave the assert in for production.


Oh come on, this is just ridiculous. This is a process that is parsing an MKV or MP4 file. Nothing that needs hardware decoding.

Failing to parse a file should not abort a process. Multimedia is about parsing broken files!


There's demuxing and there's decoding. Demuxing can easily be done in software, where it can fail gracefully. Decoding is computationally expensive and often can't be done in software: on many devices, software decoders are too slow to decode a frame of video before it's too late to display it, so you have to drop frames. Some software codecs refuse to even attempt to decode H.264 of a profile/level above baseline/2.1.

So instead, decoding is done in hardware using finicky SoCs that don't always make it possible to fully meet the OpenMAX IL spec. In fact, most device makers hack up OMXCodec and other classes in libstagefright to get the higher-level Android media APIs to work.
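In API terms, the split looks roughly like this (a sketch; error handling omitted):

    import java.io.IOException;
    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.view.Surface;

    static MediaCodec startVideoDecoder(String path, Surface surface)
            throws IOException {
        // Software-side demux picks the video track...
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);
        MediaFormat format = null;
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat f = extractor.getTrackFormat(i);
            if (f.getString(MediaFormat.KEY_MIME).startsWith("video/")) {
                extractor.selectTrack(i);
                format = f;
                break;
            }
        }
        // ...and compressed samples go to whatever HW decoder the SoC
        // provides, rendering straight to a Surface.
        MediaCodec decoder = MediaCodec.createDecoderByType(
                format.getString(MediaFormat.KEY_MIME));
        decoder.configure(format, surface, null, 0);
        decoder.start();
        return decoder;
    }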

All this code is littered with CHECK()s, probably because Google can't do any better with hardware being so fragmented (what you would do to recover from a hosed decoder might work on one phone, but not another). And device makers don't bother because either 1) once they hack up their ROM to get the YouTube app working, they just ship the device, or 2) implementing a way to recover would require a hardware change, and ain't nobody got time fo dat.


Android's Stagefright does include CHECK()s that are unrelated to hardware decoding, such as MP4 demux parser code that could easily just return an error code but instead aborts your process.


Exactly. That's the problem here: the Android system scans your media collection behind your back and, by doing so, will crash your media player and may even reboot your phone.


Honestly, media on Android is so bad it doesn't need any help crashing. I worked on a large production app that played sound files in the background, and about one time in a thousand the Android media call to play the file would just permanently hang. The only way to reliably handle media on Android is to have a watchdog process that detects when your own process is hung inside the Android code, then kills it and restarts it from saved state files. This was a call to play simply never returning, so there was no error to catch and nothing else that could be done.
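The watchdog doesn't have to be fancy. Since the failure mode is a framework call that never returns (rather than one that throws), something like this sketch works, with all names made up: the playback process touches a heartbeat file around each risky call, and a separate watchdog process kills it when the file goes stale.

    import java.io.File;

    // In the playback process, before/after each potentially hanging call:
    static void beat(File heartbeat) {
        heartbeat.setLastModified(System.currentTimeMillis());
    }

    // In the watchdog process (same app, so same UID, so killProcess works):
    static void watch(File heartbeat, int playbackPid) {
        while (true) {
            android.os.SystemClock.sleep(5000);
            if (System.currentTimeMillis() - heartbeat.lastModified() > 15000) {
                // Hung inside the framework: kill it, then restart the
                // playback service from its saved state files.
                android.os.Process.killProcess(playbackPid);
                return;
            }
        }
    }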


All video on Android seems to be a mess (except YouTube). For example, I can't figure out what streaming video formats are supported, and/or which apps work best. Does anyone have enough experience to say whether HLS or RTSP is better supported on Android?


The same is kind of true on iOS. Ultimately, the best solution on both platforms seems to be... VLC. Everything else seems to be based on ffmpeg anyway.


For Android > 4.0, go for HLS, but you'll need a proper encoder.
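On 4.x, an HLS stream is just a URL handed to the framework player; a minimal sketch (inside an Activity, with a placeholder URL):

    VideoView video = (VideoView) findViewById(R.id.video);
    video.setVideoURI(Uri.parse("http://example.com/live/index.m3u8"));
    video.start();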


"Since your average Android application does not have sufficient permission to access /dev/* entries on your system, you must use a proxy called the mediaserver in order to communicate with HW codecs."

This is incorrect. Application access to hardware de/encoders isn't a matter of permissions; you can't get at them directly because the API isn't in the SDK/NDK. However, if you're willing to pull in headers from AOSP and fool around with dlopen/dlsym, you can develop and build against libstagefright.


It depends on your SoC, but most of the time your application won't be able to access the HW codecs directly, and then you have no choice but to use the mediaserver. I think that if you pull OMX functions from libstagefright, you are actually using IOMX: it's the regular OMX IL API, but going over IPC to the mediaserver.


I think both the author and you are correct. There's a misunderstanding over the term 'permission': the author uses it generally, while you use it specifically, as in an Android permission.



