Show HN: Kickflip – Open Source Live Mobile Video Streaming SDKs (kickflip.io)
39 points by Mizza 1251 days ago | 49 comments

Hey guys!

Rich from Kickflip here. We've been working really hard on this for a long time and are super happy to get to announce it today.

Kickflip is some really cool tech - we've lowered the cost of mobile video broadcasting by orders of magnitude, and we've opened it up to everybody!

If you've got any questions or feedback, let me know and I'll be happy to answer them.

Thanks very much!

Glad that someone is doing this. The last time I looked, it was hard to do hardware-encoded video (out)streaming well on the iPhone. The public API only allowed writing the compressed stream to a file, and pipelining or FIFO tricks did not work well because the file was not in a consistent state until it had been closed; it appeared the OS was not flushing key metadata until then. So the options seemed to be either a quick cycle of starting and stopping recording on a ring buffer of files (easier to implement, but it drops frames and quality wasn't great, though software interpolation of the dropped frames might help some), or parsing and fixing (with some second-guessing) the compressed stream in the application, which seemed like too much work.

Has the situation with the public API changed? I would guess not. Congrats for taking care of this.

Unfortunately Apple still doesn't expose the APIs that we need, so we're forced to use tricks to get the encoded frames. We based our hardware encoder wrapper on the technique described by Geraint Davies [1], but have modified and improved upon his method.

The main trick is to encode only video frames in your temporary .mp4 file, and to make an additional one-frame .mp4 file for extracting the avcC record.

1. http://www.gdcl.co.uk/2013/02/20/iOS-Video-Encoding.html
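For the curious, the avcC-extraction step can be sketched in a few lines. This is an illustrative Python sketch, not Kickflip's actual code: MP4 boxes are a 4-byte big-endian size plus a 4-byte type, and the avcC box payload is the AVCDecoderConfigurationRecord carrying the SPS/PPS that a live stream needs before any frame can be decoded.

```python
import struct

def extract_avcc(mp4_bytes):
    """Scan an MP4 byte string for the avcC box and return its payload
    (the AVCDecoderConfigurationRecord, which carries the SPS/PPS).
    Illustrative only: a production parser should walk the box tree
    (moov > trak > mdia > minf > stbl > stsd > avc1 > avcC) rather
    than scanning for the fourcc."""
    idx = mp4_bytes.find(b'avcC')
    if idx < 4:
        return None
    # The 4 bytes immediately preceding the type are the box's total size.
    size = struct.unpack('>I', mp4_bytes[idx - 4:idx])[0]
    return mp4_bytes[idx + 4:idx - 4 + size]
```

The one-frame file exists purely so this record can be read without waiting for the main recording to be finalized.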

Can you share your experience of the typical delay for CDNed live stream? 5 sec? 10 sec?

We found that Bambuser had a delay of around 60s; ours is 10-20s, depending on the buffer size. It can be brought down to 5-10s by shortening the chunk size, but that increases the overhead somewhat. However, we're experimenting with this all the time and optimizing based on the needs of our users, so hit us up via email if you have any specific requirements.

It is worth noting that this is definitely a broadcast-first technology (at least for now), meaning it's designed for delivering a high-quality stream that will be seen by a large audience, rather than a low-quality stream to an individual viewer with very low latency.
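The chunk-size/latency trade-off is easy to sanity-check with back-of-the-envelope arithmetic: HLS players conventionally buffer about three segments before starting playback, so latency scales with segment duration. The numbers below are illustrative, not Kickflip's measurements:

```python
def hls_latency_estimate(segment_seconds, segments_buffered=3, upload_overhead=1.0):
    """Rough end-to-end HLS latency: the player waits for a few complete
    segments to buffer, plus the time to finish writing and uploading the
    segment currently being recorded (folded into upload_overhead here)."""
    return segments_buffered * segment_seconds + upload_overhead

# 5 s chunks land in the same ballpark as the 10-20 s figure above;
# 2 s chunks move toward the 5-10 s range, at the cost of more playlist
# refreshes and per-segment HTTP overhead.
print(hls_latency_estimate(5))  # 16.0
print(hls_latency_estimate(2))  # 7.0
```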

That's actually a double 720-flip, not a kickflip: https://kickflip.io/

Well, that's because we're the only live video broadcasting SDK which can do 720p!

(Ok, it's actually just because it's an awesome trick. The 720p thing is just a coincidence.)

Very cool! I've recently open sourced a good portion of my game broadcasting SDK, which now has a few people using it purely for making camera apps that stream to RTMP services. Mine is not nearly as refined as this one, nor do I have any sort of CDN, heh.

Congrats guys, looks awesome.

Thanks man! I really appreciate it.

All of our streams are HLS rather than RTMP. We do have RTMP support working in the SDKs, but we don't plan on making it a priority product focus right now - HLS and DASH are the future. Kickflip will likely officially support RTMP in the 1.1 versions of the SDKs.

Thanks again for your support!

Hey https://github.com/jamesghurley/VideoCore looks great, wish I knew about it earlier.

Thanks! It's a side-project but I try to fix any bugs as quickly as I can and I've got some new features planned/in development for the future.

Wow, if only I had known about VideoCore earlier! I used a modified version of Geraint Davies' temporary .mp4 file hack, but it looks like you developed a great alternative.

I've never actually seen this code before. I find it interesting that he found that the frames were arriving 500ms after capture, but I've used a similar technique (to how I did it in VideoCore) for video conferencing that achieved much, much lower latency than that. I wonder if that's something that was true in 2010 but is no longer true.

I also haven't really encountered much of an issue with frame reordering, but as it is I mostly don't care about that. I have noticed, however, that as of iOS 7.0.4 or so the encoder can output multiple slices per frame, which it previously didn't do (or at least I never caught it doing so).

Yeah I made modifications to disable frame reordering and handle the extra SEI information properly, but I'll study your solution too because it seems really awesome. The lower the latency the better :)

"python client source code" link is dead: https://github.com/kickflip/kickflip-docs#api

Nice catch! :)

The Python library and client is still a work in progress.

Still, I've open sourced the repo so those who are curious can take a peek, although there isn't much worth showing there yet. We'll make a larger announcement when it's further along - there's a lot of stuff we've got planned for the desktop yet.

Here's the fixed link: https://github.com/Kickflip/python-kickflip

Transcoding on the client instead of in the cloud increases the amount of CPU and upstream bandwidth needed, and both are in short supply on phones.

To put this in perspective, three live encodes/streams (360p, 540p, 720p) using Wirecast on a high-end MacBook Pro pegs the CPU around 60% and requires at least 5 Mbps upstream.

I understand the financial reasons for this architecture - cloud-based transcoders are crazy expensive - but from a customer satisfaction POV it's not an easy project.

The CPU usage actually isn't a whole bunch, since we do the video stuff with the hardware encoders on the devices.

It does use more bandwidth, since the streams are higher quality than other methods, but that's expected. It's also configurable: it doesn't _have_ to be higher quality. Implementers can choose to lower the resolution.

All in all, this is definitely a better solution for developers and clients.

This is definitely a cheaper solution than outsourcing to a cloud transcoding service, which is $$$.

The streams are not higher quality than other methods. Your 720p is the same as everybody else's.

But everybody else doesn't have 720p, precisely because cloud transcoding is so expensive. By default, Bambuser broadcasts at a measly 240p with a low bitrate, and other streaming services were similar. In our tests, our video quality was always much higher.

This is a great opportunity for a future blog post in which we explore the differences you're bringing up!

Encoding is done in hardware on the client side and doesn't increase CPU load by very much. I'm using the same trick in a library I made to do game broadcasting from iOS, and honestly network traffic increases CPU load more than encoding a 1136x640 video does.

Can you be more specific about "in hardware"? I'll buy that there's some trick analogous to shaders on the desktop, but I'd want to know the specifics to be able to evaluate that. How many simultaneous encodes have you been able to do, anyway?

Encoding and upstreaming a single 640p video does seem plausible. But each additional encode needs HLS and RTMP, and both CPU and upstream bandwidth grow linearly with each encode. Even a single bitrate means HLS and RTMP - two encodes, two upstreams. (HLS only means Apple software only - 20% of the mobile market, 5% of the desktop market.)

Hello! Kickflip developer here!

The SoCs included with virtually every mobile device today have circuits designed specifically for encoding/decoding H.264 video and AAC audio. This is how your mobile device's native camera app reliably records HD video. Whether it's 640p or 1080p, there isn't a significant drain on system resources.

These chips were traditionally accessed via a standard OMX[1] library (think OpenGL for video hardware), but the OMX implementations were device specific, making it nearly impossible to write custom video software with mass market appeal.

Only recently (July 2013 on Android) has the video hardware become somewhat controllable via the standard Android/iOS platform APIs, allowing us to write a truly compatible video product.

We currently only offer single stream output. When we do offer multiple bitrate outputs (transcoding), we'll do that work serverside.

[1] http://en.wikipedia.org/wiki/OpenMAX

If you only offer a single stream, your claims of being better than cloud-based transcoders have no meaning. You don't transcode, they do.

You push a single stream, which any client can already do, and they can already live without transcodes. Your app doesn't add value by freeing them from the cost of cloud-based transcoders.

Our current offer is a significant cost reduction for a single high-quality broadcast to a large audience. On top of that we handle all the plumbing related to your iOS/Android cloud video app. Our SDK can manage all your application's broadcasts and users (if you choose).

I'm excited for our open-source Android and iOS clients to stimulate development of some novel video apps - maybe security monitoring systems, or even phone + $20 weather balloon = weather satellite!

Why should you have to encode twice for HLS and RTMP? The only reason to encode twice or more would be if you want different movie properties (i.e. bitrate, fps, resolution, etc.).

For example, for h.264, once you get the NAL units from the encoder (which in the case of iOS is a chip provided by PowerVR), you can encapsulate them in whatever format you need - MPEG-TS, MP4, RTMP, RTP, it doesn't matter.
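To illustrate the point: repackaging NAL units is byte shuffling, not re-encoding. A minimal Python sketch (assuming 4-byte Annex-B start codes; real streams may also use 3-byte codes) converting an encoder's Annex-B output to the length-prefixed framing used by MP4 and RTMP:

```python
import struct

def annexb_to_length_prefixed(stream):
    """Convert an Annex-B H.264 byte stream (NAL units separated by
    00 00 00 01 start codes) into 4-byte length-prefixed NAL units,
    the framing MP4 and RTMP expect. The compressed payloads pass
    through untouched -- no re-encode, so the CPU cost is trivial.
    Sketch only: assumes 4-byte start codes throughout."""
    nals = [n for n in stream.split(b'\x00\x00\x00\x01') if n]
    return b''.join(struct.pack('>I', len(n)) + n for n in nals)
```

The same NAL bytes can be wrapped once per container, which is why a single hardware encode can feed several output formats.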

I only support one simultaneous encode in my SDK, and I'm not actually sure how many the hardware would support as an upper limit.

I've done simultaneous hardware encoding before (two 640x480 videos) but I haven't pushed the boundaries yet. I have a hunch that newer phones might be able to do a few bitrates at the same time, but at that point we will do re-encoding server side because end users will have more limited upload bandwidth than our servers.

Interesting... I should do an experiment to see how many encodes I can do at once and what ends up happening.

jgh, it's a completely different stack for HLS publishing vs. Flash. Different media, different upstreams, different endpoints. Look at any encoding/pushing tools.

It's possible to perform a single encode per bitrate, then split it between Flash and HLS on the server side, but then again that already exists in the form of cloud-based transcoders.

It's also possible to take that single encode - h.264, for example - and wrap it twice, once for MP4 and once for HLS, but that would be software, which is CPU intensive. If it's in software, it's not in hardware.

You don't need to do that though. The container or packet format is completely independent from the encoder format and they're all fairly simple, so packetizing compressed audio and video data and sending multiple streams, while not ideal due to bandwidth concerns, is most definitely not limited by CPU resources.

Hey HN, can we stop this petty downvoting, please? It's starting to get really annoying. I don't know if anyone else has noticed, but stories these days seem riddled with greyed-out posts, even perfectly legitimate comments. I try to compensate with upvotes, but as I said, this really leaves a bad taste every time I see it happening.

This change seems highly correlated with the change of hands. I thought it would be a temporary phase, but it seems to have become worse with time. I think a line of over-moderation is being crossed here.

I would try to flag my own comment so that it gets some attention. EDIT: Unfortunately it seems one cannot do that, but if you agree and can flag this, please do.

You must have missed the recent posts where we've written about this at length. Here are a few to get you caught up:

You'll notice that I've written a lot about the community practice of corrective upvotes, which is the best antidote to unfairly faded-out comments. This practice works most of the time. For example, the comment score you're complaining about was corrected hours ago. Corrective upvotes work because few such comments go far into negative territory. They're the best solution because they fix most of these problems silently.

The reason the guidelines ask you not to post complaints about downvoting is that those, by contrast, stick around forever, worsening the threads.

Replying here because I may have your attention. Feel free to remove, because after that my purpose would have been served.

I am well aware of your position, and I do try to correct comments that I feel were unfairly downvoted. However, I don't like this solution at all.

> For example, the comment score you're complaining about was corrected hours ago.

I had upvoted it to compensate for the downvote myself. If this were a single instance I would have moved on, but I am seeing this too frequently and I don't see how else I can bring it to your attention. (If there is an alternative, please do mention it; I would be happy with an "Ask dang" page.)

A downvote is unfriendly (although sometimes it is indeed deserved). An unfair downvote may or may not get corrected, and now I can only correct it, not give it the positive karma that I otherwise would have. That does not seem like the HN way.

It would still have been fine if the downvote pattern were the same as before.

The compensatory upvote has been a part of HN forever; it is not a new invention.

It's the recent uptick in downvotes on perfectly benign comments (not mine) that is bothering me.

What bothers me is that downvoting has acquired a distinct hair-trigger edge and a tendency toward petty disagreements. There is a heavy feeling of over-moderation in several threads. I am not an old-timer by any means, but I have been around long enough to know that this feels very alien on HN - unsavory, too.

On the contrary, I have seen very little of this mythical "highly toxic comment" that is being used to justify the change - far rarer than a new story with half the comments greyed out for the entire night.

I love spending time here, and I am sure you want HN to stay that way, so I wanted to bring this to your attention even though I was sure it would get downvoted. If it causes some reflection, that's a tiny cost to pay.

Finally, as with system administration, good moderation is moderation that is barely noticed. I sense things are going a little overboard. Just an FYI.

> I dont see how else I can bring this to your attention

That's easy: email hn@ycombinator.com, as the guidelines say.

"Heavy feeling of over-moderation" seems to suggest that you think we're the ones doing most of the downvoting. We're not, of course; users are.

> That's easy: email hn@ycombinator.com

Excellent! I had overlooked this; thanks for responding.

> We're not,

Never doubted that.

Is the SDK going to allow a way to bypass all of the Kickflip infrastructure to allow the core streaming library to be used on your own maintained infrastructure?

On the product I work on, we already have our own in-house media server and RTMP restreamer. As for not supporting RTMP: in our experience RTMP had much lower latency, which led us to switch away from HLS wherever possible.

RTMP support is on our roadmap (and actually exists inside the SDK in a disabled state), and our live transcoding backend will have the ability to restream to any arbitrary RTMP endpoint in addition to the other outputs.

That's good to hear. Is the plumbing the "cost" of using the open source SDK - which in turn effectively requires paying for the service - or is there a way to remove all of the Kickflip infrastructure plumbing and rely on our own?

I don't actually work on the mobile streaming side of things, I work on the web side, particularly the video player.

Very cool. This could be the holy grail for live streaming. Unfortunately Qik was retired. Good job.

Besides using FFmpeg, are there any novel optimizations or compression techniques being offered?

We actually don't use FFmpeg to do the compression; we use the hardware encoders, so it uses much less CPU. That's how we're able to broadcast at a much better resolution.

It's also bandwidth-adaptive streaming, backed by a global CDN, and it generates both live and VOD HLS streams.
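For readers unfamiliar with HLS: "live" and "VOD" use the same segment files; the difference lives in the playlist. An illustrative media playlist (the segment names are hypothetical):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:10.0,
chunk42.ts
#EXTINF:10.0,
chunk43.ts
#EXT-X-ENDLIST
```

A live playlist omits the #EXT-X-ENDLIST tag and is re-fetched by the player as new segments are appended; adding the tag once the broadcast finishes turns the same stream into a VOD asset.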

But, more importantly, we've designed it in such a way that as a developer, you don't even have to worry about that! The library just exposes a really simple API that does all of the video magic for you.

Nice work! What are the min hardware requirements for iOS & Android devices?

Kickflip currently requires iOS 7 and works on any iOS device which supports iOS 7.

The Android requirements are a bit steeper, unfortunately: Android needs at least 4.3, as that's when the hardware encoder access API which we use was introduced.

I created an account, but when I try to create an app it redirects to a blank page.

Can you share the costs of streaming to hundreds of thousands of clients?

What do you mean by "share the costs"? Our pricing plans are based on usage tiers: https://kickflip.io/pricing

Is this some sort of hosted service, and if so, where is the pricing?

Yep, we manage all of the infrastructure so you don't have to. We should make our pricing more up-front but here is a direct link: https://kickflip.io/pricing

It requires me to sign in before viewing the pricing?
