
Show HN: Pxy – A Go server that proxies websocket livestreams to RTMP servers - chipneverdies
https://github.com/chuabingquan/pxy
======
mmcclure
Awesome to see this on the front page! I’ve been chatting with the OP for a
few days and this is based on a Node.js PoC I wrote for a blog post on the
state of broadcasting live from a browser[1].

This is, IMO, the simplest way to be able to go from a modern browser to an
RTMP endpoint. You could try and do server-side WebRTC using a project like
Pions[2], or use headless Chrome and be a peer, but both of those come with
their own headaches. This just uses the MediaRecorder API to send chunks of
video as it’s available via a WebSocket, then the server pipes those messages
into FFmpeg via stdin. It’s lightweight enough that it even runs incredibly
well on Glitch[3].

Also, if you’re curious, the videos that come out of the MediaRecorder are all
over the place between browsers, but they’re all varying degrees of “barely
qualifying as playable video”. Incredibly fun to play with, but you will
absolutely need to encode any output from it.

[1] [https://mux.com/blog/the-state-of-going-live-from-a-browser](https://mux.com/blog/the-state-of-going-live-from-a-browser)

[2] [https://github.com/pion/webrtc](https://github.com/pion/webrtc)

[3] [https://glitch.com/~mmcc-next-streamr](https://glitch.com/~mmcc-next-streamr)

~~~
chipneverdies
Hey Matthew, thanks for the insights you've been providing to me thus far,
I've added [1] to the references in the project's README!

Meanwhile, there's another well-explained tutorial that complements [1] along
with a code walkthrough to implement something similar in Node.js.

[https://github.com/fbsamples/Canvas-Streaming-Example/blob/m...](https://github.com/fbsamples/Canvas-Streaming-Example/blob/master/README.md)

~~~
mmcclure
Thank you! Really appreciate the link.

I'm really interested to see where this project goes, please keep me in the
loop!

~~~
chipneverdies
Will do!

------
scottlamb
In the same genre of Go video-streaming adapter servers, here are a couple
of neat projects that stream from RTSP (as used by e.g. security cameras) to
either fragmented .mp4 files over WebSocket or to WebRTC:

[https://github.com/deepch/RTSPtoWSMP4f](https://github.com/deepch/RTSPtoWSMP4f)

[https://github.com/deepch/RTSPtoWebRTC](https://github.com/deepch/RTSPtoWebRTC)

I'm jealous of Go's WebRTC library that makes the latter possible. I'd love to
have a similar Rust crate.

~~~
kingosticks
Doesn't GStreamer provide a Rust-based plugin to achieve this yet? It's
surely just a matter of time if not.

~~~
bluejekyll
Yes: [https://github.com/centricular/gstwebrtc-demos/tree/master/s...](https://github.com/centricular/gstwebrtc-demos/tree/master/sendrecv/gst-rust)

------
difosfor
What kind of video sources provide output via WebSockets to be proxied to
RTMP? OBS and other streaming video tools usually support RTMP themselves, and
browsers only support WebRTC out of the box as far as I know. I'm curious what
kind of custom browser video-streaming solution you're using on the client side.

~~~
iameli
This appears to use the MediaStream Recording API
([https://developer.mozilla.org/en-US/docs/Web/API/MediaStream...](https://developer.mozilla.org/en-US/docs/Web/API/MediaStream_Recording_API))
to produce a WebM H.264 stream, which can be sent over the WebSocket. The
server then transmuxes this to an FLV H.264 stream for RTMP output.

~~~
mgamache
You get H.264 NAL packets from MediaRecorder. Note that MediaRecorder only
supports H.264 baseline on Chrome (you can request other profiles, but
baseline is all you get). Not sure about Firefox.

[https://en.wikipedia.org/wiki/Network_Abstraction_Layer](https://en.wikipedia.org/wiki/Network_Abstraction_Layer)

------
alufers
I see you are invoking FFmpeg via the shell. I also tried streaming to YouTube
from Go and had to resort to the same solution. It's a shame there isn't a
usable library for implementing an RTMP client in Go. Using FFmpeg from the
shell is very limiting because there is no way to monitor the status of the
stream or to provide audio and video separately via stdin.

~~~
nightfly
Couldn't you use named pipes for passing separate streams of input?

~~~
alufers
I tried doing that, but I ran into performance problems due to streaming
raw video. Instead I streamed the audio via stdin, created an HTTP MJPEG
server listening on localhost, and pointed FFmpeg at it. I was limited to
maybe 10 FPS tops, and the JPEG encoding overhead was also getting
noticeable, but I limited the framerate to 1 frame per second anyway because
the video was mostly static (song name and current/remaining time of the
track).

------
iameli
Cool project! I've done similar things with Kurento acting as a WebRTC peer to
proxy WebRTC --> RTMP before but this is a lot more lightweight.

One thing I'd love to see at some point is a project that compiles FFmpeg's
transmuxing code to WASM so that the WebM --> FLV conversion could be done
client-side. That way the server-side portion would be as simple as proxying
the WebSocket to a TCP socket for the outgoing RTMP traffic.

~~~
chipneverdies
Hello, I'm the author here, thanks for the kind words about the project! I
wasn't aware of Kurento prior to your post; it seems pretty nifty and good
to explore for future projects!

Getting the client to do the heavy lifting while the server acts solely as a
relay is a really good idea. In my use case, however, I have a Flutter client
that's not capable of doing that at the moment; perhaps that's something that
could be explored further in the future.

------
SahAssar
A similar thing is used by Axis security cameras, but with RTSP. The frontend
bits are open source here: [https://github.com/AxisCommunications/media-stream-player-js](https://github.com/AxisCommunications/media-stream-player-js)

------
amerine
When we first started replacing our Erlang router for Heroku Private Spaces
with a Go-based one, the name of the PoC was ‘pxy’. Very big fan of the name.
Yours seems fun too. <3

------
cosmotic
Which side of this proxy is producing/consuming only FLV/Flash streams?

~~~
chipneverdies
Hello, I'm the author here. The frontend would stream the video/audio to Pxy
via WebSockets. In turn, Pxy utilises FFmpeg to convert the incoming stream to
FLV and send the FLV stream to your desired external RTMP service (in this
case, I'm using mux.com).

------
ParadisoShlee
OP, could you support SRT? Listener mode would be really interesting.

------
taf2
Can this be used as a web front end for, say, a Raspberry Pi camera?

~~~
mgamache
This will allow you to send video to any service that supports RTMP input
(Facebook, YouTube, etc.). If you just want to view your RPi camera remotely,
there are better solutions.

------
detaro
link is 404? forgot to set the repo to public?

~~~
chipneverdies
My bad, thanks for pointing that out! The repo is now public.

