Ask HN: Advice on a live video processing project
3 points by pedrolins on Feb 24, 2022 | 4 comments
Hello HN, I'd love some advice on live video processing for a hobby project I'm working on.

I'm currently working on placing a live camera on a beach. My current setup leaves a lot to be desired, but my main problem has been with the video-processing side of things, which I'd had no contact with as a developer until now.

It consists of a camera-equipped microcontroller connected to WiFi that is constantly sending JPEGs to a server through HTTP POST requests. On the server side, these JPEGs get saved as files in a folder, which in turn are converted to an RTMP live video feed - using ffmpeg - that gets sent to a streaming platform (Twitch or YouTube).
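For reference, the ffmpeg step is roughly this shape (not my exact command; the folder path, frame rate, and ingest URL below are placeholders):

```shell
# Read globbed JPEGs from a folder at ~5 fps and push them to an RTMP
# ingest as H.264 in an FLV container (paths and URL are placeholders).
ffmpeg -re -f image2 -pattern_type glob -framerate 5 \
  -i '/srv/frames/*.jpg' \
  -c:v libx264 -preset veryfast -pix_fmt yuv420p -g 10 \
  -f flv 'rtmp://live.twitch.tv/app/STREAM_KEY'
```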

The main problem with my current approach is that ffmpeg "finishes" converting my JPEGs mid-"transmission" (I use quotes here because all that really means is that my microcontroller is still sending JPEGs to the server).

I've searched extensively, but I've had trouble formulating queries given how specific my use case seems to be, and have had little success in this regard. That leads me to believe that I'm making mistakes in my approach to this project - which is no surprise, given my lack of experience with video, especially streaming.

So I'd like some pointers or suggestions from you - they could be aimed at any part of what I'm currently doing, though what made me write this in the first place was the video streaming problems I'm facing. General resources on encoders, decoders, transcoders, or video in general would also be welcome... Or even just a comment on what you'd change to make streaming work.

Perhaps to be more direct: how can "generic" - as in normal practice - livestreaming be achieved without the help of something like OBS?

Thank you.




I've had a small amount of experience with this sort of thing, though not with full-blown streaming apps like OBS. For example, I've written an Android app that sends video frames from a phone's camera to a remote client. I believe I set it up so that the camera side acts as the server and the client side just issues GET requests, but it's been a while since I looked at the code, so I'm not entirely sure. It worked OK for what I wanted, but I always hoped to improve it, which is something I have yet to get around to.

As for your setup, I'm wondering why you chose to use a microcontroller on the camera side instead of something more powerful like a Raspberry Pi. I would think the latter would give you more options, and I doubt the cost difference would be very significant to the overall project.


Yeah, the Pi would definitely be a better choice for this, but even the Pi Zero would be 2x the price of the ESP32-CAM I'm using (which, in the country I live in, isn't as cheap as in the US).

> I've had a small amount of experience with this sort of thing, though not with the full blown streaming apps like OBS

My main issue has been understanding the workflow around livestreaming video. I have OK knowledge of networking but just hadn't really come across this domain before. Thank you for sharing, though.


May I ask why you are collecting images this way? There are super cheap cameras that output RTSP streams. Are you dealing with a low-bandwidth connection?

GStreamer is pretty neat for this sort of stuff. It's more flexible than FFmpeg for this sort of use case.
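Roughly, a gst-launch-1.0 version of your JPEG-to-RTMP step could look like this (a sketch only; the file pattern, frame rate, bitrate, and URL are placeholders you'd adjust):

```shell
# Loop over numbered JPEG frames at 5 fps, decode, H.264-encode, and push
# to RTMP (flvmux needs streamable=true for live output; values are placeholders).
gst-launch-1.0 multifilesrc location=/srv/frames/frame-%05d.jpg loop=true \
    caps="image/jpeg,framerate=5/1" \
  ! jpegdec ! videoconvert \
  ! x264enc tune=zerolatency bitrate=1000 \
  ! flvmux streamable=true \
  ! rtmpsink location='rtmp://live.twitch.tv/app/STREAM_KEY'
```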

I also don't understand the "ffmpeg finishes .. mid-transmission" part .. are you getting chopped images in your video stream?


There's no particular reason for collecting images this way other than that it's what the hardware I have allows (an ESP32-CAM).

> There are super cheap cameras that output RTSP streams.

Yeah, I forgot about that. I think that if I can't get my current setup to work I'll buy one, thanks for the advice. I didn't think of buying an RTSP camera because I thought those cameras were made for local networks only, given that RTSP is the standard protocol for security cameras (I didn't know what RTSP was until I actually started working on this).

> GStreamer is pretty neat for this sort of stuff. More flexible than FFmpeg for this sort of use case.

Thanks for the recommendation, I'll read their docs.

> I also don't understand the "ffmpeg finishes .. mid-transmission" part .. are you getting chopped images in your video stream?

It's just that when I run ffmpeg in my terminal in parallel with receiving and writing the image files to a folder, the ffmpeg process finishes (instead of continuing indefinitely for as long as I'm receiving files), because it consumes images faster than the rate at which new files arrive. I found that the -re flag tells ffmpeg to process input at its native frame rate - which is exactly what I'd need, since that would keep it going as long as images were coming in - but since I'm processing individual JPEGs I don't have a constant frame rate. That's what I'm trying to figure out right now in order to make it work.
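One workaround I'm considering is feeding the JPEGs to ffmpeg over a pipe instead of pointing it at a folder: ffmpeg then blocks reading stdin and only exits when the pipe closes. Something like this (a sketch; the feeder loop, latest.jpg path, and URL are placeholders for however the upload handler stores frames):

```shell
# Feed the most recent JPEG to ffmpeg over stdin at ~5 fps; ffmpeg keeps
# running until the pipe is closed, and wallclock timestamps absorb the
# variable frame arrival rate (paths and URL are placeholders).
while sleep 0.2; do cat /srv/frames/latest.jpg; done \
| ffmpeg -f image2pipe -use_wallclock_as_timestamps 1 -c:v mjpeg -i - \
    -c:v libx264 -preset veryfast -pix_fmt yuv420p \
    -f flv 'rtmp://live.twitch.tv/app/STREAM_KEY'
```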



