

Crashing competing media players on Android - etix
http://felix.abecassis.me/2014/04/crashing-competing-media-players-on-android/

======
jbk
Hardware acceleration of video decoding on Android is really a mess...

In addition to the point of the article (asserting on a file that is not
processed correctly, instead of cleanly bailing out), every CPU/GPU behaves
differently and the Android test suite is extremely limited. On top of that,
there are many layers (OMX IL, Stagefright, MediaCodec, OMX AL, Java Media)
that interact in complex ways, and every vendor messes with those layers
differently...

For example, for a long time, even after the MediaCodec API was out (4.1), you
could get the data back from the codec, but the frame format was not
standardized and there were no tests for it in the Android validation suite!
So every GPU was doing something different.

On top of that, the hw decoders don't even report the profile and level they
support, so you have to guess them.

And even these days, there is no way to get a YUV 420 surface that works on
most devices, even when using the KitKat APIs.

And so many joyful things :)

~~~
etix
The most questionable thing is why they kept the assert in production code.
Looks like an unfinished job to me.

~~~
jpk
When the options are either to abort the process or to put the hardware into
an unrecoverable state that requires a reboot, you leave the assert in for
production.

~~~
jbk
Oh come on, this is just ridiculous. This is a process that is parsing an MKV
or MP4 file. Nothing that needs hardware decoding.

Failing to parse a file should not abort a process. Multimedia is about
parsing broken files!

~~~
jpk
There's demuxing and there's decoding. Demuxing can easily be done in
software, where it can fail gracefully. Decoding is computationally expensive
and often can't practically be done in software. On many devices, software
decoders are too slow to decode a frame of video before it's too late to
display it, so you have to drop frames. Some software codecs refuse to even
attempt to decode h264 at a profile/level above baseline/2.1.
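The "too late to display" problem usually ends in a late-frame drop policy.
Here's a minimal, generic sketch in plain Java (the names `FrameDropPolicy`,
`shouldDrop`, and the 30ms threshold are my own illustration, not any Android
API): a frame whose presentation timestamp is already well behind the playback
clock is dropped rather than rendered.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a late-frame drop policy: if a decoded frame's
// presentation timestamp is behind the playback clock by more than a
// threshold, rendering it is pointless, so drop it.
public class FrameDropPolicy {
    private final long lateThresholdUs;

    public FrameDropPolicy(long lateThresholdUs) {
        this.lateThresholdUs = lateThresholdUs;
    }

    // ptsUs: frame presentation time; clockUs: current playback position.
    public boolean shouldDrop(long ptsUs, long clockUs) {
        return clockUs - ptsUs > lateThresholdUs;
    }

    public static void main(String[] args) {
        FrameDropPolicy policy = new FrameDropPolicy(30_000); // ~one 30fps frame
        long clockUs = 1_000_000;
        long[] framePts = {960_000, 980_000, 1_005_000};
        List<Long> rendered = new ArrayList<>();
        for (long pts : framePts) {
            if (!policy.shouldDrop(pts, clockUs)) {
                rendered.add(pts);
            }
        }
        // The 960ms frame is 40ms late and gets dropped; the others render.
        System.out.println(rendered);
    }
}
```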

So instead, decoding is done in hardware using finicky SoCs that don't always
make it possible to fully meet the OpenMAX IL spec. In fact most device makers
hack up OMXCodec and other classes in libstagefright to get the higher level
Android media APIs to work.

All this code is littered with CHECK()s probably because Google can't do any
better due to hardware being too fragmented (what you would do to recover from
a hosed decoder might work on one phone, but not another). And device makers
don't because either 1) once they hack up their ROM to get the YouTube app
working, they just ship the device, or 2) implementing a way to recover would
require a hardware change and ain't nobody got time fo dat.

~~~
cpeterso
Android's Stagefright does include CHECK()s that are unrelated to hardware
decoding, such as MP4 demux parser code that could easily just return an error
code but instead aborts your process.
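Stagefright's CHECK() is a C++ macro that kills the process on failure. As a
language-neutral sketch of the two failure modes (in Java, with hypothetical
names; this is not Stagefright's actual code), the difference between
asserting and returning an error code looks like this:

```java
public class DemuxSketch {
    // Hypothetical result type: an error code the caller can handle.
    enum Status { OK, ERROR_MALFORMED }

    // CHECK()-style parsing: a malformed box size takes down the process.
    static int parseBoxSizeOrDie(long size) {
        if (size < 8) {
            // This is what an assert in production effectively does:
            throw new AssertionError("box size < 8");
        }
        return (int) size;
    }

    // Graceful parsing: the same condition becomes a recoverable error,
    // so the caller can skip the broken file and keep running.
    static Status parseBoxSize(long size, int[] out) {
        if (size < 8) {
            return Status.ERROR_MALFORMED;
        }
        out[0] = (int) size;
        return Status.OK;
    }
}
```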

~~~
jbk
Exactly. That's exactly the problem here: the Android system is going to scan
your media collection behind your back, and by doing so will crash your media
player and may even reboot your phone.

------
lnanek2
Honestly, media on Android is so bad it doesn't need any help crashing. I
worked on a large production app that played sound files in the background,
and about one time in a thousand, the Android media call to play the file
would just hang permanently. The only way to reliably handle media on Android
is to have a watchdog process that detects when your own process is hung in
Android code and kills and restarts it from saved state files. This was a call
to play that simply never returned, so there was no error to catch or anything
else that could be done.
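A same-process variant of that watchdog idea can be sketched in plain Java
(the hanging `playMedia` below is a stand-in for the Android media call, not
real Android code, and `playWithTimeout` is my own name): wrap the call in a
Future with a deadline, and treat a timeout as "hung, restart from saved
state". The commenter's separate-process watchdog is stronger, since a hang
inside native code may not be interruptible at all.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class MediaWatchdog {
    // Stand-in for an Android media call that occasionally never returns.
    static void playMedia(boolean hang) throws InterruptedException {
        if (hang) Thread.sleep(Long.MAX_VALUE);
    }

    // Run the call with a deadline; on timeout, give up and let the
    // caller restart from saved state.
    static boolean playWithTimeout(boolean hang, long timeoutMs) {
        ExecutorService exec = Executors.newSingleThreadExecutor();
        Future<?> f = exec.submit(() -> {
            try {
                playMedia(hang);
            } catch (InterruptedException ignored) {
                // Interrupted by the watchdog: just exit the task.
            }
        });
        try {
            f.get(timeoutMs, TimeUnit.MILLISECONDS);
            return true;                 // call completed normally
        } catch (TimeoutException e) {
            f.cancel(true);              // try to interrupt the hung call
            return false;                // caller restarts from saved state
        } catch (Exception e) {
            return false;
        } finally {
            exec.shutdownNow();
        }
    }
}
```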

------
j_s
All video on Android seems to be a mess (except YouTube). For example, I can't
figure out what streaming video formats are supported, and/or which apps work
best. Does anyone have enough experience to say whether HLS or RTSP is better
supported on Android?

~~~
taiki
The same is kind of true on iOS. Ultimately, the best solution on both
platforms seems to be... VLC. Everything else seems to be based on ffmpeg
anyway.

------
jpk
"Since your average Android application does not have sufficient permission to
access /dev/* entries on your system, you must use a proxy called the
mediaserver in order to communicate with HW codecs."

This is incorrect. Application access to hardware de/encoders isn't a matter
of permissions. You can't get at them directly because the API isn't in the
SDK/NDK. However, if you're willing to pull in headers from AOSP and fool
around with dlopen/dlsym, you can develop and build against libstagefright.

~~~
flx42_
It depends on your SoC, but most of the time your application won't be able to
access HW codecs directly, and then you have no choice but to use the
mediaserver. I think that if you pull OMX functions from libstagefright, you
are actually using IOMX: it's the regular OMX IL API, but over IPC with the
mediaserver.

