Show HN: Download videos, but only the parts you want (videodownloadtool.io)
108 points by foxbarrington 22 hours ago | 34 comments

I made this because I would download a lot of videos from YouTube to watch on my phone using VLC. Most of the videos I downloaded were from video game streams, and I wanted to constrain the video to just the gameplay (cropping the frame to just the game and trimming the timeline to ignore anything before/after). It got tedious to do this manually, so I made a tool to speed things up.

The video editing and streaming are CPU- and bandwidth-intensive. If you want to use this a lot, you're better off using your own server (there are 1-click deploys to both Heroku and Digital Ocean along with a link to the Dockerfile and source), but I've added a link to a demo server if you just want to try it out. Depending on how much use this gets, performance might go down the drain.

The cropping was a lot faster in that demo than I expected. Very cool!

Pretty cool!

I'm curious as to why this is offered as a cloud machine and not something that runs locally. Wouldn't the encoding performance on a free or cheap cloud instance be poor?

I create a lot of video compilations using Twitch and YouTube videos as sources. My internet is 5Mbps down, 3Mbps up, so it can take hours to download long video streams and extract just the parts I want.

Because of my network limitation, I found a way to combine ffmpeg and youtube-dl to grab sections of video from a Twitch or YouTube video. The following is an example which replicates the videodownloadtool.io example.

    ffmpeg -ss 00:15:34 -to 00:16:09 -i "$(youtube-dl -f best -g 'https://www.youtube.com/watch?v=Qf2tplcb6eE')" -vf "crop=(iw*0.75):(ih*0.80):(iw*0.20):(ih*0.08)" ./output.mp4

Wow, that's magic!

(It depends on youtube-dl -g producing a video URL that ffmpeg understands, which unfortunately is not always the case.)

Related: https://github.com/ytdl-org/youtube-dl/blob/master/README.md...

That's why he uses `-f best`, which in 99% of cases will give the best single-file output from YouTube that ffmpeg can definitely understand (the exception being a live stream, or the VOD of a just-finished live stream, which may not have a non-DASH variant).

The caveat is that this will only give you up to 720p video, since higher-quality formats have separate audio and video streams (likely in DASH format), which ffmpeg can also handle, but you need to modify the command accordingly. (The default for youtube-dl is `-f bestvideo+bestaudio`; with `-g` it prints exactly that: two URLs, one for video and one for audio.)
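For example, a sketch of the two-URL variant. The `rr1.example` URLs are placeholders standing in for the real googlevideo.com URLs that `youtube-dl -f 'bestvideo+bestaudio' -g <url>` would print (video stream first, audio second); the leading `echo` makes it a dry run, so drop it to actually invoke ffmpeg:

```shell
# placeholder for the two lines youtube-dl -g would print
urls='https://rr1.example/videoplayback?itag=137
https://rr1.example/videoplayback?itag=140'

# first line is the video stream, second line is the audio stream
video=$(printf '%s\n' "$urls" | sed -n 1p)
audio=$(printf '%s\n' "$urls" | sed -n 2p)

# seek both inputs (input options apply to the -i that follows them),
# then mux the clip; remove the echo to run for real
echo ffmpeg -ss 00:15:34 -to 00:16:09 -i "$video" \
     -ss 00:15:34 -to 00:16:09 -i "$audio" clip.mp4
```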

Actually, the site seems to do basically the same thing: it forks `yt-dlp -J` for one API request and then ffmpeg for another. The ffmpeg options: https://github.com/davidguttman/video-download-tool/blob/mai...
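In other words, roughly this (a sketch, not the site's actual code: the JSON here is a toy stand-in for real `yt-dlp -J` output, and `jq` is my choice for pulling a URL out of it):

```shell
# toy stand-in for `yt-dlp -J <url>`: the real output is a big JSON
# object whose .formats array carries one direct URL per format
meta='{"formats":[{"itag":18,"url":"https://rr1.example/a"},{"itag":22,"url":"https://rr1.example/b"}]}'

# request 1 fetched the metadata; request 2 hands one URL to ffmpeg
url=$(printf '%s' "$meta" | jq -r '.formats[-1].url')
echo ffmpeg -ss 00:15:34 -to 00:16:09 -i "$url" clip.mp4   # dry run via echo
```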

> Because of my network limitation

If I'm understanding your snippet correctly, you're downloading the whole video anyway, no?

It looks kinda like that, but no. ‘youtube-dl -f best -g …’ returns a url, and ffmpeg seeks in the video stream and doesn’t download the whole thing.

Wow nice! Thanks for the explanation, else I would have dismissed this beautiful hack!

I also see that that syntax should not just download a slice: -f is for the format and -g should be a simulation (-g for --get-url: 'Simulate, quiet but print title'). Probably an "undocumented" behaviour of '-g'?

-g is "Simulate, quiet but print URL", not "print title" which is "-e".

Simulate here means it does not download the video. You pass the URL to ffmpeg, which actually downloads the video.

My former comment (uneditable) didn't really get the mechanism: -g does get the URL, as the expanded option name says (though the help text says something different), and `ffmpeg -i URL` accepts the googlevideo.com URLs as input. Ok.

Also, the construct seems to accept input, at least sometimes, from e.g. '-f 137' (mp4 1080p, so "very Hi-Res") and '-f 140' (m4a 128kbit), which can then be recombined. So downloads higher than "720 rasterbars" are in general possible - and they are probably frequent real cases, when one needs to obtain a few frames of very high quality video.

The thing is, it seems to download the initial part anyway. The command takes a lot of time, and using e.g. 'iftop' one can see that it starts downloading immediately: maybe ~5MB with '-ss 00:00:10', and ~10MB with '-ss 00:00:20'. The use of '-to range_end' or '-t duration' will stop downloading at the correct point, but the construct does not really "seek" to the starting point of the video.

Maybe adding some parameter to the URL?

I assume ffmpeg is seeking ahead with the HTTP Range header.
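If so, the placement of `-ss` is what matters: before `-i` it is an input option, so ffmpeg can seek the input (over HTTP, via Range requests); after `-i` it is applied after decoding. A dry-run sketch (placeholder URL, leading `echo` so nothing is fetched):

```shell
url='https://rr1.example/videoplayback'   # placeholder stream URL

# input seeking: ffmpeg asks the server to jump ahead instead of
# reading from byte 0
echo ffmpeg -ss 00:15:34 -to 00:16:09 -i "$url" out.mp4

# output seeking: ffmpeg downloads and decodes from the start, then
# discards everything before the mark; much slower over a network
echo ffmpeg -i "$url" -ss 00:15:34 -to 00:16:09 out.mp4
```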

As for something I'd rather have on a website than run locally: extracting auto-generated/provided subtitles from YouTube videos, so you don't have to scrub through the entire video just to get to the information locked within it.

Since I'm usually using a web browser, it's more convenient for me to open a new tab and paste in the video URL than open a terminal and remember what the youtube-dl command has to be. Also, smartphones cannot use youtube-dl in convenient ways, at least the last time I tried, so I'd have to open a new SSH session on my phone and painstakingly type everything, and so on.

There used to be a site for exactly this purpose called hierogly.ph but it was shut down, and the author didn't open-source it.
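For what it's worth, the caption-extraction half can already be done locally with stock youtube-dl flags. A dry-run sketch (the command is built into a variable and echoed; run it directly to actually fetch the captions):

```shell
yt='https://www.youtube.com/watch?v=Qf2tplcb6eE'
# --write-auto-sub : save YouTube's auto-generated captions
# --sub-lang en    : pick the caption language
# --skip-download  : fetch only the captions, not the video
cmd="youtube-dl --write-auto-sub --sub-lang en --skip-download $yt"
echo "$cmd"   # running this produces a .en.vtt file and nothing else
```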

Thanks! It’s cloud because I assume that a lot of people who want to use it will run into issues installing all the dependencies (yt-dlp and ffmpeg), and there’s also the bonus of being able to use this on any device (phones or work computers).

I made it open source so that anyone who wants to run it locally can do so, but maybe there’s a better way to do an easy-to-install local version?

Maybe it has something to do with the referral codes in the links?

below is (edited for hn) the script i use instead of youtube-dl. the ffmpeg option is added; need to test it. this script works with all non-commercial videos, like the one in the parent comment. for commercial videos that restrict downloads, a browser with limited DNS and request blocking or youtube-dl or a headless browser will work. if use m.youtube.com with minimum headers then should always get mp4 urls with 1-2 quality options: itag=18 or itag=22. run the script with no arguments to see what quality options are available. then run it again with 1 argument to download or run it with 3 arguments to download an excerpt.

if script is called "yt" usage is

  echo https://www.youtube.com/watch?v=Qf2tplcb6eE |yt

  echo https://www.youtube.com/watch?v=Qf2tplcb6eE |yt 22

  echo https://www.youtube.com/watch?v=Qf2tplcb6eE |yt 22 00:15:34 00:16:09

     x3(){ # youtube url variations, adjust as needed
     sed 's|youtu\.be/|www.youtube.com/watch?v=|' # example rewrite only; edit to taste
     }
     read x; x=$(echo "$x"|x3)
     x1=$(echo "$x"|sed 's/.*watch?v=//;s/[&%].*//') # video id, used as filename
     test "${x1}"||exit;
     curl -LHUser-Agent: -sv4 "$x"|grep -o "https://r[^\"]*"|sed 's/\\u0026/\&/g;s/%3D/=/g;s/%2C/,/g;s/%2F/\//g' >.yt;
     grep -o "itag=[^&%]*" .yt||exec echo video removed;
     x2=$(sed -n "/itag=$1/p" .yt) # stream url for the chosen itag
     case $# in 0) # no arguments: the itag list above is all you get
     ;;1) exec curl -LHUser-Agent: -sv4Ro "${x1}".mp4 "${x2}"
     ;;3) exec ffmpeg -ss "$2" -to "$3" -i "${x2}" -vf "crop=(iw*0.75):(ih*0.80):(iw*0.20):(ih*0.08)" "${x1}".mp4
     ;;esac

dont forget about cookies. disable them, block them or scrub them with a forward proxy. note the googlevideo.com domain does not serve cookies, only video. theres no video on youtube.com. plenty of javascript and telemetry, though.

> Because of my network limitation

I think you just answered your own question. ;)

I got HTTP 500 when trying to download anything using the demo site. Not sure if the server is overloaded, or a bug (would non-Latin characters in title affect anything?)

The video I tried is https://www.youtube.com/watch?v=5-X_2VXpEds and I just set a start and stop time.

This is exactly what I had in mind, yet I didn't have enough time to make it. I'm really happy that it's been made into a reality. Getting only the specific content out of a long-duration video is quite efficient.

It's an amazing tool.

Why didn't you give it a more distinctive name than "video download..."? It could be something like "vidown", which would be easier to search for.

This reminds me of https://kirp.io/

My favorite part is that you can select the part of the video. That's a super sweet tool!

Sorry, but are there any online tools to do this? We all know there are a lot of online sites to download YouTube videos, but none seem to have this feature?

From personal experience.

There are a few out there, but some have been shut down for one reason or another, some don't seem to work, and the ones that promise this kind of feature are paid. I needed a 30-second clip from a 40-minute video, and it's not something I need frequently enough to pay a monthly subscription for. I ended up using youtube-dl, and with a lot of trial and error of various custom commands I finally got what I wanted.

Haven't most of the online tools been bullied out of existence? I imagine that is why you must deploy it yourself to DO.

I think OP is suggesting that if you wanted that you'd need to set it up yourself.

clipconverter.cc is the only one that worked with long videos for me

Most sites, like mine, use youtube-dl, which just gets the video source file.

To crop and process it at scale, you're talking about thousands of dollars of servers to first download the file and then process it.

Would be cool to see something client side using ffmpeg in browser, like https://github.com/ffmpegwasm/ffmpeg.wasm

Found a previous post from someone using that lib https://news.ycombinator.com/item?id=26580333

This would have been so useful when I was on a restricted internet connection a few years ago and trying to download some small archery clips for a compilation. No need to download a full 30 minute match!

I think you could make some money from this if you're interested in doing so.

great work!
