Suggestion to everyone else: learn the CLI. Then you can do things like I did last night. My wife collected a list of classical music videos she wants to play to our kid, and all I did was:
youtube-dl -x --audio-format "mp3" --audio-quality 0 -o "%(title)s.%(ext)s" -a music.txt
mp3splt -r *.mp3
I do batch downloads like these rarely, roughly once a year (though I expect it to happen more frequently in the near future). Because of that, I had both of the above commands stored in my snippets.org file. It's something I strongly recommend: store oft-used CLI calls in your notes/personal wiki. It beats reinventing the commands a year down the line, or even trying to remember what tools you used for that odd job.
To preempt comments saying CLI is too hard for regular people: no, it isn't; they can learn, especially if you make them interested by e.g. showing them how to do batch jobs. But for the sake of the resistant, maybe the author of this GUI will upgrade it with an option for batch downloads and format selection.
 - https://digitalsuperpowers.com/ is a nice book I read and can recommend. No affiliation, just curiosity, because the author is a regular HNer.
> It's something I strongly recommend: store oft-used CLI calls in your notes/personal wiki.
All of my bash commands from any terminal get piped into a separate, unified history file, copying some setup I read about in a blog a long time ago. It's an easy, dumb setup that just works. I've been using it for years without any performance problems, and if I ever get any, I can just archive the current text file and start over with a fresh one.
Having access to past commands is really handy and has saved my butt multiple times.
log_bash_persistent_history() {
  [[ $(history 1) =~ ^\ *[0-9]+\ +([^\ ]+\ [^\ ]+)\ +(.*)$ ]]
  local date_part="${BASH_REMATCH[1]}"
  local command_part="${BASH_REMATCH[2]}"
  if [ "$command_part" != "$PERSISTENT_HISTORY_LAST" ]; then
    echo "$date_part" "|" "$command_part" >> ~/.persistent_history
    export PERSISTENT_HISTORY_LAST="$command_part"
  fi
}
# Stuff to do on PROMPT_COMMAND
PROMPT_COMMAND="log_bash_persistent_history"
HISTTIMEFORMAT="%d/%m/%y %T "
alias phgrep='cat ~/.persistent_history|grep --color'
alias hgrep='history|grep --color'
My instinct says that if you'd run into any problems, it would be with this line:
'omxplayer' '--blank' '--aspect-mode' 'stretch' <(youtube-dl -o - --audio-format m4a --recode-video mp4 'https://www.youtube.com/watch?v=MKNYsKwM6HI')
It all comes together when the regular person takes a little time to learn the following three truths:
* the output of any CLI command may be redirected to the input of any other CLI command
* the output template of 99% of commands gives you structured, predictable JSON data/metadata on which to operate
* 99% of CLI commands have the same sane defaults and are required by the spec to have the same option syntax for flags
* CLI is CLI. Generating a random number in the CLI on OSX is just as safe as doing the same on OpenBSD, Raspbian, QNX, etc.
 Three because I'm starting from zero
(If you ever happen to grow the project to the point of offering different languages, drop me a line, and I'll give you Polish language translations.)
EDIT: Also consider auto-updates for your application; it would be best if it could update its youtube-dl dependency independently from the app itself. I don't like auto-updates too much (especially when they're forced), but as others point out, youtube-dl and YouTube are in a state of cold war; YouTube downloads tend to magically break every couple of weeks now, which requires users to update the youtube-dl script.
 - https://www.lesswrong.com/posts/reitXJgJXFzKpdKyd/beware-tri...
Auto-update will definitely be a feature; it's quite easy with the youtube-dl package I'm using.
Bridging that gap is something we could do better at.
Tbh, things like this (a GUI front end for a terminal app) seem like the best way forward. I don't know why they aren't used a lot more in open source; it just seems to match up so well with the open source model of a random person developing a project and no one taking over when that person loses interest. The front ends are the sexy bits where you get more churn but also more interest. The boring back end doesn't attract, but also doesn't need so much attention.
I suppose you could argue we do it at the library level, I still think there's room to do it at this level also.
For a lot of people this isn't true.
I think for web developers, making GUIs is fun (which is why this is done in Electron). For a lot of non-web developers, making the actual tool is the fun part. Having to then fiddle around with some GUI feels less like programming and more like a chore.
Each has its pros and cons. I'd say what makes the most sense is a library with frontends (CLI, TUI, GUI, (web)socket, etc.).
In this case I wouldn't need a GUI with Electron. All I would need is a Firefox extension for youtube-dl (or the hypothetical library). Why keep reinventing the wheel? Hence, a library.
One annoying thing with CLI tools is defaults. You might want to set your own defaults (in the CLI world, often done with an alias), so I end up managing a large alias list. But what if I use more than one shell depending on the machine? What if the machine doesn't use Git to sync such files?
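For youtube-dl in particular there's a middle ground worth noting: it reads a per-user configuration file, so your defaults live in one dotfile you can sync like anything else, with no alias zoo. A sketch of such a file (the specific options here are just examples, not recommendations):

```
# ~/.config/youtube-dl/config (or ~/.youtube-dl.conf)
# Lines starting with # are comments; each other line is a CLI option
# applied to every invocation.
-f "best[height<=?1080]"
-o "~/downloads/%(title)s.%(ext)s"
--restrict-filenames
```

With this in place, a bare `youtube-dl URL` picks up all of the above; `--ignore-config` skips it for one-off runs.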
No, but you can script the underlying youtube-dl, so with the GUI front end you get the best of both worlds.
Re defaults: that isn't an inherent shortcoming of terminal apps. There's no reason why 'youtube-dl url' shouldn't just do the 'right thing'. I agree it's a common problem though.
The error in your reasoning is that you assume the existence of a singular right thing. Software that attempts to just do the right thing tends to suffer from what I call Overly Clever Syndrome. I hate waiters or store clerks that second-guess my motivations. Why should I like software that does the same?
That said, `youtube-dl $url` will download the pointed-to video in the best available quality, so that's a pretty decent default.
GUI apps have defaults too; this isn't a problem unique to terminal apps. I'll go further and say GUI apps suffer more, because they tend to design around a default and have many fewer non-default options, in the name of being easy to use. Terminal apps, on the other hand, 'suffer' because they don't make enough assumptions about how they are to be used.
'Suffer' is very much in inverted commas though; sometimes you want to do that niche thing, and having something that makes no assumptions about how it will be used is great. On the other hand, having to read a manpage for the simplest of things is not so great.
Elitism. A lot of people in tech have gigantic egos and believe they are better than everyone else because of their arcane knowledge, so they have no desire to see anything become more discoverable or easier to use.
It's not an ego issue - I doubt any tech worker is happy when dealing with non-discoverable software as a user. But connecting this experience to building your own software, and learning to design things to be discoverable by default (because in a typical job, if you don't do something right the first time, it probably won't ever be done right), is hard.
A) Power users have different priorities from light users. That tool you use every day, you probably use keyboard shortcuts for and know all the flags. The tool you use once every 6 months, you probably scan the menus and check the man page. Open source tends to be focused more on the power users, mainly because it's probably power users who made the thing to start with.
B) Open source tends not to be a 'product'. There aren't focus groups and usability testers; the first releases are alphas and betas, so they tend to attract technically minded people who find their way around less discoverable interfaces.
Using it is like pulling up the manpage while typing out a command, but interactive. Once you're done formulating your command, it runs in your worksheet and you can modify it and re-run it if necessary, just like any other command.
A lot of my friends don't use youtube-dl for this exact reason. They end up messaging me, asking me to download files for them.
I have no desire to automate my interactions with my friends, especially for something that's easy enough for me to do and only takes a minute of my time.
One video per day is free - if you want extra "credits" pay for it on the website.
I thought higher number = higher quality but then I had a look at the manpage again...
But having them install ffmpeg and Python, manually copy some exe to a safe place, and add that to an environment variable really cuts the interest off.
(Dunno which package managers are alright in Windows.)
This is how you get Chocolatey: https://chocolatey.org/install.
scoop install youtube-dl
Point them to Ubuntu in the Microsoft Store. Python is preinstalled. You may need to install Python's package manager (I'm not sure whether it comes preinstalled) with "sudo apt install python3-pip". Then install youtube-dl from Python's package manager with "pip3 install youtube-dl".
You could technically skip Python's package manager by just doing "sudo apt install youtube-dl", but updates are slow there, and you really don't want slow updates with this tool ("pip3 install --upgrade youtube-dl"). I'd argue that three commands are way easier to explain than installing two binaries and clicking around far too many times, especially because youtube-dl needs to be updated on a regular basis in order to work properly.
"Execute these commands only once and then just type youtube-dl https://example.com. If something breaks, run this command to upgrade the script and retry." If the target audience is someone who you think will feel comfortable doing youtube-dl in a terminal, you may as well point them to the total of three other commands to set things up.
Yeah, that makes sense. Things have changed since last time I went through that pain.
Why do people who use Linux automatically think you cannot use scripts or there are no scripting languages on Windows?
However, I would say use Git and just sync your stuff to a public/private Gitlab/Github repo depending on your needs. That's how I organize most of my public configs.
When composing my comment, I wrote a couple words about Org mode, then realized it's off-topic and deleted them before submitting the comment. I fully intended to leave just "snippets file", but forgot to remove the ".org" part.
It feels like we've backslid by miles and miles since I first got on the Internet in the late 90's. I no longer like this Internet.
I would not be entirely surprised if YouTube one day just requires actual browser DRM to work at all. That would likely break every single smart TV out there, which is probably the only reason why they haven't done it yet. However, as they progress further on their path away from amateur content towards corporate-only content, I'd say there is a decent chance they'll pull that switch eventually. And die shortly after, of course.
Personally I think Youtube is a likely candidate for this. They may very well just remove all "legacy" content one day. They have absolutely no idea what to do with the platform, except offer it to music companies to feast on. They're at war with the majority of their content creators, I'd be very surprised if they didn't go for the nuclear option at some point - maybe after a large legal scandal...
It wouldn't; support for Widevine has been mandatory for about four years now in order for Google to let TV manufacturers have a YouTube app, and if you play back a Google Play film in the YouTube application on your smart TV, it's already using Widevine DRM.
It is used in the YouTube apps, for playing back Google Play film purchases and rentals inside YouTube. Google does not allow YouTube to be installed if the YouTube apps cannot call the requisite APIs.
maybe not at all, but I definitely can see them offering the option to (some?) authors.
- Download from the project page and put it anywhere in your path (it's a single file)
- Subsequently do 'youtube-dl -U' whenever you want to update. It updates itself in place with the latest release (there are usually 2 or 3 releases per week)
I quickly learned the difference: downloading from YouTube breaks, and there's no distro update yet, but updating just the script is quick and easy.
Initial installation is also very easy. Two one-line recipes to install it are provided: one using curl and one using wget.
I have been using it with ffmpeg on a Google Pixelbook under Crouton for over a year and love it.
Better few than none.
This is where I am most often when I see stuff I want to use it on, and while I always have a terminal window open already, it would be nice not to have to switch windows.
I already have a few different configs and aliases for getting stuff in the forms I really want. All I need is an even easier way to pass the URL in and trigger the download.
Being able to right click and get there would please me to no end.
Firefox _does_ have support for running this daemon-like-thing for you using Native messaging though.
youtube-dl-firefox-addon seems to employ this, so perhaps give that one a try?
I've been using a combination of Firefox/Chrome with youtube-dl and mpv+youtube-dl for a while.
This comes in very handy for playing high-quality YouTube videos on low-end machines, or downloading audio and/or video directly from YouTube.
In the pre-WebExtension era of Firefox/Chrome, the "Open With" addon was the perfect fit.
Now, a special native messaging app needs to be installed to launch mpv/youtube-dl.
The "Open With" addon solved this with a Python script (which obviously required installing Python, and a few other steps).
I decided to create a new tool to make this step easier and faster on Windows: owclauncher.
It's basically a lightweight native messaging host for the "Open With" addon: it does not run in the background, and is only started (and immediately terminated) by the browser when passing a command to another program.
You can check and compile the source code yourself, or use the .bat Windows installer/uninstaller.
In essence, the core of the whole thing is a single SQLite table, with rows of URL+status (and some metadata).
The web interface lists these, and provides a form to insert more rows. There is no ACL, my instance is only secured by htaccess. (I have specifically chosen to use a web server, as I'm sharing the URLs from multiple devices, but I want to download them over my home link and save to my NAS.)
The cron job reads the table, fetches rows that are status=queued, calls yt-dl to download them, and updates their status accordingly (with some fallbacks and error handling).
There's some extra stuff - like using ffmpeg to make thumbnails - but the basic operation is extremely simple.
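A minimal sketch of that worker in Python, assuming a schema like the one described (the table and column names here are my guesses, not the actual setup, and the downloader is injectable so the queue logic can run without network access):

```python
import sqlite3
import subprocess

def process_queue(db_path, download=None):
    """Pull queued URLs, download each, and record the outcome.

    `download` defaults to shelling out to youtube-dl, but any callable
    taking a URL and returning True/False works.
    """
    if download is None:
        download = lambda url: subprocess.run(["youtube-dl", url]).returncode == 0
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS queue "
        "(id INTEGER PRIMARY KEY, url TEXT, status TEXT DEFAULT 'queued')")
    rows = conn.execute(
        "SELECT id, url FROM queue WHERE status = 'queued'").fetchall()
    for row_id, url in rows:
        ok = download(url)
        conn.execute("UPDATE queue SET status = ? WHERE id = ?",
                     ("done" if ok else "error", row_id))
    conn.commit()
    conn.close()
```

Run from cron, each invocation drains whatever the web form has inserted since the last run.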
(Browser extension and Android sharing option do the exact same thing as the web interface: post the current URL through the web insert form. Those are optional, but allow me to add URLs without extra steps.
The browser extension is installed unpacked, through developer options, and has the URL hard-coded. It works both in FF and Chrome.
The Android share option is through https://play.google.com/store/apps/details?id=ch.rmy.android... , making a HTTP request directly - I'll add a README.)
The alternative I'm considering is a script for my WM that essentially does "CTRL+L, delay 500ms, CTRL+C, append to a rolling 'to yt-download' list file".
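That script could look something like this sketch, assuming an X11 setup with xdotool and xclip available (file name and delay are arbitrary choices):

```python
import subprocess
import time
from pathlib import Path

QUEUE_FILE = Path.home() / "to-yt-download.txt"  # rolling list; name is arbitrary

def append_unique(url, path=QUEUE_FILE):
    """Append url to the rolling list, skipping an immediate duplicate."""
    path.touch(exist_ok=True)
    lines = path.read_text().splitlines()
    if lines and lines[-1] == url:
        return False
    with path.open("a") as f:
        f.write(url + "\n")
    return True

def grab_current_url():
    """Focus the browser URL bar (Ctrl+L), copy it (Ctrl+C), queue it."""
    subprocess.run(["xdotool", "key", "ctrl+l"], check=True)
    time.sleep(0.5)  # give the browser time to select the URL
    subprocess.run(["xdotool", "key", "ctrl+c"], check=True)
    url = subprocess.run(["xclip", "-o", "-selection", "clipboard"],
                         capture_output=True, text=True, check=True).stdout.strip()
    if url:
        append_unique(url)
```

Bind `grab_current_url` to a WM keybinding; a cron job or manual `youtube-dl -a` run can then consume the list file.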
It sounds complicated, but I had never written any Firefox addon at all before trying it, and it took me less than an hour to write the entire thing just by following the docs. The biggest change was switching from regular Firefox to Developer Edition so I can disable signature checking for addons (regular Firefox doesn't allow that), but so far all that has meant is that the red globe icon is now blue and I get more frequent updates.
None of that is available publicly though, since it is very configuration-specific (and really all it does is call 'youtube-dl'), but it is easy to replicate. Also, learning to make some "personal" addons might be helpful in the future, similar to how you customize Emacs or Vim.
- bookmark url in a specific "YouTube-dl" folder
- watch that folder from a daemon
- when the folder changes, do the stuff automatically
I'm sure there's a way to read bookmarks from outside the browser?
Bookmarks don't seem that easy to exfil either. Firefox, for instance (just checked), stores bookmarks, along with browsing and download history, in an SQLite database that it keeps locked, so sqlite3 won't read from it. Maybe there's a way to force-read that database, but I haven't figured it out yet.
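One workaround that generally works: copy the locked places.sqlite and query the copy. A sketch; the join below follows Firefox's documented places.sqlite schema, where moz_bookmarks.fk points at the bookmarked page's row in moz_places:

```python
import os
import shutil
import sqlite3
import tempfile

def read_firefox_bookmarks(places_path):
    """Return (url, title) pairs; reads a copy to dodge Firefox's lock."""
    tmp = os.path.join(tempfile.mkdtemp(), "places-copy.sqlite")
    shutil.copy(places_path, tmp)
    conn = sqlite3.connect(tmp)
    rows = conn.execute(
        "SELECT p.url, b.title FROM moz_bookmarks b "
        "JOIN moz_places p ON b.fk = p.id "
        "WHERE b.fk IS NOT NULL").fetchall()
    conn.close()
    return rows
```

A daemon could diff the result against a "last seen" set and feed new entries in a designated folder to youtube-dl.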
Here's an alternative idea I might actually go for: a systemd service on my NAS in my home LAN that accepts links over the net and feeds them to youtube-dl. It could maintain a database of links already downloaded to avoid unnecessary redownloads. Then I could use an extension to simply send requests to an URL in my LAN.
My perfect UI would actually have this integrated with the Like button on YouTube: click Like, and it schedules a download. This is because there are plenty of videos that I wanted to get back to after a few years, only to discover them gone (account banned, video removed, copyright bullshit, etc.).
 - Or something. There must be something in systemd that can be used to listen on a port, feed requests from it to a script, and reply with the output?
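Even without systemd, such a listener is a few lines of stdlib Python (port, output path, and dedup scheme below are arbitrary sketch choices; the download command is injectable so the logic is testable):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse
import subprocess

seen = set()  # crude in-memory dedup; a real version would persist this, e.g. in SQLite

def queue_url(url, runner=None):
    """Hand url to youtube-dl unless already seen. Returns True if queued."""
    if url in seen:
        return False
    seen.add(url)
    if runner is None:
        runner = lambda u: subprocess.Popen(
            ["youtube-dl", "-o", "/nas/videos/%(title)s.%(ext)s", u])
    runner(url)
    return True

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expects requests like http://nas.lan:8090/?url=https://youtube.com/...
        query = parse_qs(urlparse(self.path).query)
        url = query.get("url", [""])[0]
        queued = bool(url) and queue_url(url)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"queued\n" if queued else b"skipped\n")

def serve(port=8090):
    HTTPServer(("0.0.0.0", port), Handler).serve_forever()
```

A browser extension then only needs to fire one GET request at the LAN address. (Alternatively, a systemd socket unit plus `Accept=yes` can spawn a script per connection, which covers the "listen on a port, feed requests to a script" idea directly.)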
xterm -e youtube-dl --restrict-filenames -o "~/downloads/%(title)s.%(ext)s" -f "best[height<=?1080]" $1
The height filter means no bigger than 1080, but if 1080 does not exist, it falls back to the next best. I have a virtual desktop (tiling manager) where my downloads (wget, youtube-dl) run visibly, and that's why I use xterm -e: it keeps them out of the way instead of trying to draw on top of elinks. The terminal automatically closes when done.
The user-facing business model of YT is quite simple:
> You get to see great variety of videos, but have to watch ads every now and then.
So from their perspective, youtube-dl allows you to steal the content w/o giving anything back to either YT or the creator.
To make this fair/sustainable, a youtube-dl user would either (A) have to tolerate advertisements in the downloaded video, or (B) pay a fee for each downloaded video.
I'd prefer for youtube-dl to stay small enough that we can continue to freeload a little longer ; )
Actually, now that I think about it, I can't remember if I've tried a text browser. It might be bearable, though images of videos are sometimes helpful.
It exists! It's called SMTube but don't let the name confuse you, as it works with a lot of media players, including mpv:
(note that by default it only shows music videos, but it's easy to change: https://github.com/mps-youtube/mps-youtube/wiki/Troubleshoot...)
It's no replacement for the CLI but it's better than copying the url, opening my ssh client, connecting back, navigating to the right spot, running youtube-dl, waiting for it to finish (this is the worst part), then closing my ssh app, and moving on with my life.
Does everything it needs to and doesn't require you to use electron.
Or does youtube-dl need to already be pre-installed on the system?
I was just thinking I could for instance make a similar JavaFX crossplatform GUI on the JVM, but I'd have no idea how to call the youtube-dl python code. So I'm just curious how Electron solves that problem :)
As a sidenote I didn't know what you were referring to initially because the executable is named youtube-dl-gui...
I hope YouTube is not coming up with a solution.
How does it compare to jDownloader (jdownloader.org)?
It can analyze the clipboard - when I paste a YT link it asks me if I only want the video or the whole playlist.
I can choose from formats and I can choose to only download audio if I'm not interested in the video or want to save some space.
It has a reconnect feature that resets the router to get a new IP, and many more really cool things.
I just tried installing on Windows following the README, and gave up after fixing npm proxy issues and then encountering multiple other errors. I could fix them, but I don't have the time to investigate.
Anybody who can fix it, likely won't need a front end like this.
But anyway, YouTube breaks its API too often, and maintaining any YT app is very painful.
Use chocolatey to install youtube-dl CLI as a self-contained executable.
But nice work. I just hope it doesn't bring the ire of Google.
If you want to invite others to contribute, you may consider adopting a more widely employed js code-style.
This seems to be written in a very personal, opinionated and unconventional way.
More useful advice would be: use Prettier and ship a .prettierrc so people can trivially transform their contributions into code consistent with the rest of the project.