
Ask HN: Building an automated headless YouTube upload server with v3 API - ilmiont
I have very poor rural broadband, ~300Kbps up max. Uploading even a 100MB video takes an hour.

I want to use a Pi to build an automated upload server that will silently upload videos overnight. I have a working Python script to upload, pulled from the API docs, but want to take it to the next level.

I am aiming to create a web interface to upload videos. Select a video, it pulls across the network to the Pi, and the Pi then uploads to YouTube overnight. Adding another video should queue it to upload after the first one. Going to the web interface again should display the upload status.

My question is how to get this working in the background. Either I call a Python script from PHP and use nohup and & for backgrounding, or I upload from PHP (preferred). But surely this will hit max runtime limits and fail?

I'd like some guidance on this architectural decision. I have the basis of the app - videos are recorded in SQLite so status can be retrieved - and video upload to the Pi. I now need to find the best way to upload to YT, with a queue system so uploads are handled in order, and in the background. Uploads must continue after the browser is closed, and the server needs to run headless.
======
kgtm
First of all, is a web interface absolutely required? It adds a lot of
complexity to something that is no more than a couple of lines of Bash glue.
For monitoring, you can just keep an SSH session to the Pi.

How I would do it, requiring no user input:

* Designate a hot folder on the NAS, where I put all the videos to be uploaded.

* Establish a list of what has been transferred (nothing initially).

* From the Pi, poll the NAS folder for files that haven't been transferred yet.

* If a file is found, POST it to YT: cat file | curl -X POST --data-binary @- <upload endpoint>.

* On success, record the transferred filename.

* Continue polling.
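The steps above can be sketched in a few lines of shell. This is illustrative only: the function name, paths, and the commented-out curl invocation are assumptions, not part of the OP's existing setup, and the NAS hot folder is assumed to be mounted locally on the Pi.

```shell
#!/bin/sh
# One polling pass: upload anything in the hot folder that hasn't been
# recorded yet, and record each success. Run this in a loop, or from cron.

upload_pass() {
    hot_dir="$1"    # folder to poll for new videos
    done_log="$2"   # one filename per line, recording what has been uploaded
    upload_cmd="$3" # command that reads the video on stdin, exits 0 on success

    touch "$done_log"
    for f in "$hot_dir"/*; do
        [ -f "$f" ] || continue
        name=$(basename "$f")
        # Skip files already recorded as uploaded.
        if grep -qxF "$name" "$done_log"; then continue; fi
        # Stream the file to the upload command; record it only on success,
        # so a failed upload is retried on the next pass.
        if sh -c "$upload_cmd" < "$f"; then
            echo "$name" >> "$done_log"
        fi
    done
}

# A real invocation might stream to the v3 upload endpoint (needs a valid
# OAuth bearer token in $YT_TOKEN; paths are placeholders):
# upload_pass /mnt/nas/videos /home/pi/uploaded.txt \
#   'curl --fail -sS -X POST -H "Authorization: Bearer $YT_TOKEN" \
#      -H "Content-Type: video/*" --data-binary @- \
#      "https://www.googleapis.com/upload/youtube/v3/videos?uploadType=media&part=snippet"'
```

Recording a file only after the upload command succeeds is what makes the loop safe to rerun: anything interrupted overnight simply gets picked up on the next pass.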

Of course, you can quite easily bolt a web interface onto this later, by
exposing some of the steps as API endpoints.

~~~
ilmiont
Something like this was actually my original idea - just monitor a directory
for changes. But then I've still got the issue of running it constantly in
the background, so the uploads occur automatically and keep running even
after the user logs out. (Monitoring is not as important as headless
uploads.)

~~~
kgtm
That shouldn't be an issue. Why aren't you using tmux/screen in the first
place? It's what most people use to persist sessions across logins (and much
more, like terminal multiplexing). I personally use Byobu with the tmux
backend on all my servers.
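As a sketch, the tmux flow is just this (the session name is arbitrary, and the placeholder command stands in for the actual Python upload script):

```shell
# Start the uploader in a detached session so it survives logout.
tmux new-session -d -s uploader 'sleep 600'   # e.g. 'python3 upload_queue.py'
tmux has-session -t uploader && echo "uploader session is running"
# Log out freely; later, reattach to check progress (Ctrl-b d to detach):
# tmux attach -t uploader
tmux kill-session -t uploader   # cleanup for this demo only
```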

If you don't want to use extra software, a simple cron job running every
minute would suffice.
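A hypothetical crontab entry for that approach (script path, lock file, and log path are all placeholders); wrapping the job in flock -n means a pass that's still mid-upload after a minute simply blocks the next run instead of overlapping with it:

```cron
# Run one upload pass per minute; flock -n skips this run if the previous
# pass (a long upload) still holds the lock.
* * * * * flock -n /tmp/yt-upload.lock /home/pi/upload_pass.sh >> /home/pi/upload.log 2>&1
```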

