ImageMagick: CLI for Image Editing (imagemagick.org)
160 points by ijidak on Dec 1, 2021 | 98 comments



I always found it fascinating what grade-A executable names ImageMagick was able to claim in the global namespace:

    imagemagick /usr/bin/animate
    imagemagick /usr/bin/compare
    imagemagick /usr/bin/composite
    imagemagick /usr/bin/conjure
    imagemagick /usr/bin/convert
    imagemagick /usr/bin/display
    imagemagick /usr/bin/identify
    imagemagick /usr/bin/import
    imagemagick /usr/bin/mogrify
    imagemagick /usr/bin/montage
    imagemagick /usr/bin/stream


Thankfully, as of v7 these are all bundled under a single "magick" command (with symlinks for compatibility). Hopefully the symlinks can be removed in the future.


tbf this sounds like a problem that doesn't need to be fixed... Ironically, removing them would introduce problems, since many scripts assume the binaries have the names they have.


Scripts written for version 6 or older are probably going to have problems with version 8 or later anyway; that's what the major version bump is there to signal.


import is the most fun: when you forget the `#!/usr/bin/env python` at the start of the script, the first instruction is of course `import`, and you get an ImageMagick error.


They are pretty common verbs, but they give 0 context to the user on what they do. I don't think these names are in any demand -- there's always a better name than "import" for your command line utility.


Sounds like you want PowerShell style naming :)


With good autocomplete, I wouldn't mind it!

I was honestly thinking of stuff like `du` (disk usage), `cp`, `mv`, and the like. Like I sort of mnemonically remember them, compared to mentally mapping `convert` -> imagemagick.


This sentiment directly contradicts your post above...


I'm making a distinction between mnemonic names (du = disk usage) and just vague names (convert = ???).


Something I always preferred about the GraphicsMagick fork was the `gm` base command.


I came here to say the same thing


No self-respecting Unix tool would use such long names, so that's why they were free. "compare" vs "cmp", etc.


ImageMagick isn't something you use frequently and quickly like changing directories or moving a file. A longer, more descriptive name is better here. Also easier to speak: telling someone to type "compare" is easier than "C-M-P".


I was trying to make a joke about the infamous naming practices of Unix, an operating system with a call named “creat”.


My favorite is "ioctl", which is especially delightful when pronounced phonetically.


The common commands were short because typing on old terminal keyboards was a pain in general, so the names are short for historical reasons. It is, however, an accident of history that their being shorter is still convenient now... if you have the names memorized.


A regrettable state that will hopefully be fixed someday.


Spell creat with an 'e'. - Ken Thompson (referring to design regrets on the UNIX creat(2) system call and the fallacy of premature optimization)

.. via https://github.com/globalcitizen/taoup


If executable names were traded like internet domains, imagine what they'd be worth now.


I tried to convert those units but all I got was an imagemagick error


Image magick has been around for a pretty long time, and was doing image processing on linux before most had considered it.


ImageMagick was developed by John Cristy at DuPont in 1987 and released in 1990. Your statement is not only false; even if it weren't, mentioning Linux in relation to ImageMagick is a non sequitur. Maybe you should try other things.


Your point would have been much better if you had stopped after the first sentence.


I'm really not sure what you're getting at?

I didn't say it was developed on/for Linux.

But it was a mature product that was added to Linux very early on, and that's why it was able to claim the names it did.

so maybe you can share what it is you are responding to?


> I'm really not sure what you're getting at?

To be quite clear, in the context of ImageMagick, claiming early Linux image processing with ImageMagick is irrelevant, and also not exactly true. What does the alleged Linux inclusion of any application (/usr/local/*) whatsoever have to do with that application? You're apparently promoting Linux by claiming it's a feat few had considered, or otherwise saying "I am cool," which is fine I guess and I don't doubt it, but also irrelevant.

As I pointed out, ImageMagick was developed in 1987, and I should have gone on to specify that most digital image manipulation techniques were developed in the 1960s. So assuming your Yggdrasil installation was processing images with it in early 1993, that it was "before most had considered it" is hardly knowable. I'm reading it similarly to saying your Chevy wipers were wiping your windshield before most had considered it, when modern wipers were invented 50 years ago and all the techniques for doing so were worked out around the turn of the century. Chevys are great and all, but there are other things.


> claiming early Linux image processing with ImageMagick is irrelevant

It is relevant, because this is about why ImageMagick has such prime namespace on Linux.

I don't have any idea why you're going in these seemingly random directions. I'm not promoting Linux, or whatever else you're going on about.

This is my original statement

> Image magick has been around for a pretty long time, and was doing image processing on linux before most had considered it.

Surely we can both agree the part below is a fair and accurate statement:

> Image magick has been around for a pretty long time

This next part is the part that seemingly needs clarification:

> and was doing image processing on linux before most had considered it.

When ImageMagick was added to Linux distributions in the mid-90s (around ImageMagick 4), not a lot of people were doing image processing in the OS, and because it was the first mature image processor added, that is why it has the namespace it does.

I'm not sure how you're not getting that "on linux" qualifier and instead seem to be trying to debate something about the history of image processing itself.

To be very clear, and to hopefully prevent the continued necro'ing of this post: I was not talking about the history of image processing, but about the specific Linux namespace that ImageMagick has, and why it has it.

Any other points of debate about this are irrelevant.


Thank you for explaining. I think I see now your observation is that ImageMagick was doing image processing on Linux before most had considered doing image processing on Linux. If so, then I am forced to concede this point. Although I think it is still fair to say, with a market share even now only barely breaching 2%, that most, in fact nearly all, have not considered doing anything at all on Linux since inception and likely not even five or ten years from now. This means those intrepid Linux+ImageMagick users processing images in the mid-1990's are that much more notable as those lucky few doing stuff with Linux before most had considered it, but also that those getting directory listings in Linux right this moment are also somewhat intrepid because most, maybe as much as 97.1%, had never considered even doing anything like that on Linux. I wonder what else most have not even considered doing on Linux. I expect nearly everything.


The latest versions recommend that you use `magick convert` instead of just `convert`. I'm assuming that's because the globals are going away.
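For illustration, a minimal sketch (file names and options here are placeholders):

    # ImageMagick 6 and earlier
    convert input.png -resize 50% output.jpg

    # the spelling recommended for ImageMagick 7
    magick convert input.png -resize 50% output.jpg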


"convert" always gave me trouble on Windows with some existing tools that used Windows' built in "convert" tool. It was an edge case, but always entertaining when the tool needed to convert a filesystem or some tool I wrote wanted to convert an image, and got the wrong executable.


I totally agree and also had found it fascinating, except for...mogrify? I don't think I've ever heard that word before, and I don't even know what it could mean based on the only related word I can think of, "transmogrify".


This looks like useful output, what distro and command did you use to list this?


Arch Linux.

    $ pacman -Ql imagemagick | grep bin
    imagemagick /usr/bin/
    imagemagick /usr/bin/Magick++-config
    imagemagick /usr/bin/MagickCore-config
    imagemagick /usr/bin/MagickWand-config
    imagemagick /usr/bin/animate
    imagemagick /usr/bin/compare
    ....

You may also enjoy:

    $ pacman -Qo /usr/bin/{display,convert}
    /usr/bin/display is owned by imagemagick 7.1.0.16-1
    /usr/bin/convert is owned by imagemagick 7.1.0.16-1


For reference (mostly for me as I need this from time to time), you can do the same on dpkg-based distros with

     $ dpkg -L imagemagick-6.q16 | grep bin
     /usr/bin
     /usr/bin/animate-im6.q16
     /usr/bin/compare-im6.q16
     /usr/bin/composite-im6.q16
     /usr/bin/conjure-im6.q16
     /usr/bin/convert-im6.q16
     /usr/bin/display-im6.q16
     /usr/bin/identify-im6.q16
     /usr/bin/import-im6.q16
     /usr/bin/mogrify-im6.q16
     /usr/bin/montage-im6.q16
     /usr/bin/stream-im6.q16
Similarly, you get

     $ dpkg-query -S /usr/bin/convert-im6.q16 
     imagemagick-6.q16: /usr/bin/convert-im6.q16
This does not work for the canonical executables though as they are just symlinks to /etc/alternatives/foobar


> This does not work for the canonical executables though as they are just symlinks to /etc/alternatives/foobar

you can use realpath for this

  dpkg -S $(realpath /usr/bin/convert)
and you can ignore the path to it entirely with

  dpkg -S $(realpath $(which convert))


That's exactly my same thought every time I need to use one of those!


Is there an easy way to get that list for a package?


At my $OLDJOB, I used ImageMagick to compare snapshots during some automated front-end testing on our public Drupal site. It would compare the running test to previously accepted images, highlighting pixel diffs in red. It would also generate a 4-frame animated GIF of the original, the highlighted changes, the running test version, and back to the highlighted changes (so it could loop for better comparison).

Imagemagick saved enormous amounts of time for us as we made CSS and other module upgrades.
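Roughly the shape of that pipeline, as a sketch (file names and delay values here are placeholders, not the exact options we used):

    # highlight pixel differences in red
    compare -highlight-color red baseline.png current.png diff.png

    # loop baseline -> diff -> current -> diff as an animated GIF
    convert -delay 75 -loop 0 baseline.png diff.png current.png diff.png compare.gif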


Thanks for sharing this tip! Are there any guides you recommend?

Edit: Here’s one guide for the legacy Imagemagick (may not work for the latest release), https://legacy.imagemagick.org/Usage/compare/

I’m trying to encourage my front end developers to use visual diffs to validate rendering rather than use Protractor to test HTML/CSS.

I’ve known about BackstopJS, https://garris.github.io/BackstopJS/ and am on the lookout for alternatives.


A note of caution if you're planning to use Puppeteer for screenshots: one oddity I've experienced with Puppeteer is that there are sometimes very small height differences between headless and non-headless modes when generating screenshots. I suspect the root cause is that they are rounding sub-pixels differently in certain scenarios. So, I typically run Puppeteer locally with { headless: false } to try to get a pixel-perfect match to the regular desktop Chrome experience.


Perceptual diff is another tool doing this:

http://pdiff.sourceforge.net


I would recommend generating an animated .webp for the diff output, which ImageMagick supports. Animated GIFs have color resolution limits.
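For example, something like this should work if your ImageMagick build has WebP support (file names are placeholders):

    magick -delay 75 -loop 0 baseline.png diff.png current.png diff.png compare.webp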


For the lazy geeks out there, the service diffy does this easily and for cheap: https://diffy.website/


There are 634 CVE Records that match your search.

https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=imagemagick

There have been a number of zero days.

My entire interaction with Imagemagick has been removing it. Often with great difficulty because there is some odd dependency.


ImageMagick is one of the few bits of software where the functionality is worth the risk. Simply find a way to remove any network access and use it. I used to run it in a Docker container with (almost) all capabilities dropped, but with a directory mapped into it to work in.
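Something along these lines (the image name and paths are placeholders, not my exact setup):

    # "imagemagick-sandbox" is a hypothetical image with ImageMagick installed
    docker run --rm \
      --network none \
      --cap-drop ALL \
      -v "$PWD/images:/work" \
      imagemagick-sandbox \
      magick /work/input.png -resize 50% /work/output.png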


Not sure if this is common knowledge (??) but I feel I should note here: in my job we absolutely do not consider containers to be a security boundary[1]. On the other hand I still tend to use them for isolation on my personal boxen, because they at least reduce the blast radius of bugs or shitty packaging.

[1] Random search result that appears to corroborate my claim: https://blog.aquasec.com/container-isolation


The article's sources disagree with the article. Its link to Microsoft's definition of a security boundary explicitly includes containers as a security boundary twice in the tables and offers bounties if you can break out of that security boundary. Its link to and quote from Google say it's not a _strong_ security boundary, yet the article claims Google said it wasn't a security boundary at all. The Red Hat link doesn't say anything about security boundaries whatsoever, but it does say containers aren't perfect protection yet do provide some protection. The Netflix link also explicitly says containers are a security boundary multiple times and that they use additional protections to strengthen that boundary. At this point I'm done following citations, but you get the point.

If the security folks at your job truly don't consider containers security boundaries, then they are wrong. What seems more likely is they don't consider containers alone a _good enough_ security boundary. And that's fine; some places consider separate processes with different rights good enough security boundaries. Others don't consider two boxes that are able to interact with each other a good enough security boundary. It doesn't change the fact that things that weren't secure enough for the use case are still security boundaries.


One way to make it safer is to run it inside WebAssembly. I needed an easy way to modify Photoshop files and give those commands to other users, so you may want to check out https://knicknic.github.io/imagemagick/ - it's ImageMagick in a progressive web app that allows you to share commands.


I always find it remarkable how people bash on IM without proposing alternatives. Should we all write our own libpng, libtiff, skia, cairo? Even libvips uses some ImageMagick facilities for some of its functionality (the file format support is just not there). While yes, processing images is complex and some formats are nearly Turing-complete (or outright Turing-complete, like the container/MP4 derivatives), saying "this software contains vulnerabilities, therefore we are going to remove it" is an attitude we could have less of. If you replace your local ImageMagick with some cloud service, don't you worry: in addition to your cloud bill growing, the cloud service _also_ has to deal with IM vulnerabilities, containerization, sandboxing, and all the other good stuff. And it is likely saving money by not going all the way on the above (if I had a dollar for every time a vulnerability could be injected into a service where images can be uploaded and the image renderer starts going out to the internet to embed something into a PNG...).


I guess this means you should not use imagemagick in any process where the files (or other input) aren't trusted.

So you could use it in some typical dev workflows (or other business workflows) that are purely internal and maybe in certain non-internal processes where the inputs are strictly limited to trusted ones. But not, e.g., in services/apps that could process untrusted inputs.

(Seems like there are a number of leaks too, but since it's process-oriented, those probably won't be that hard to live with. They might be hard to notice normally.)

?


> My entire interaction with Imagemagick has been removing it.

Same. I've successfully moved all my image manipulation requirements to libVIPS. Far more performant and with a ton less memory usage.


I suppose it's probably a good idea to wrap it in a microservice in production.


Airgapped computer, it's the only way to be sure.


Printer output?


Don't forget GraphicsMagick!

http://www.graphicsmagick.org/


GraphicsMagick is purportedly much better code and faster... but people still reach for ImageMagick for some reason. Either one is a wonderful, powerful tool!


I don't know about using a CLI to edit images, but one of the best things about having ImageMagick installed on my workstation (it's an absolute essential) is the 'mogrify' CLI tool to batch-resize, manipulate, or change formats of a whole directory full of images.

https://imagemagick.org/script/mogrify.php
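A couple of typical invocations, as a sketch (sizes and formats here are placeholders; note that plain mogrify overwrites files in place):

    # resize every PNG in the current directory to fit within 800x800, in place
    mogrify -resize 800x800 *.png

    # write JPEG copies into another directory instead of overwriting
    mogrify -path ../resized -format jpg -resize 800x800 *.png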


It's very handy for stuff like manipulating image uploads.


ImageMagick is an incredible bit of kit. It really is a piece of magic that surrounds us daily, that most people don't ever think about, but is easy to use, and insanely powerful.


Ditto for ffmpeg when you move into the audio or video realms.


ffmpeg, the command line tool, is wonderful, until you try its libav* libraries (not to be confused with the Libav fork of ffmpeg). The library is… a bit short of wonderful, at least in my experience. Namely: there is basically zero official documentation for it.


ffmpeg for video/audio, sox for audio fx.


Indeed, there are a lot of very useful tools in there, with plenty of options to create custom workflows. For instance, I used `compare` with the fuzz option to create simple camera motion detection: https://github.com/laurent22/pmcctv/blob/e0930a0f7f51c319f66...
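The core of it is a one-liner along these lines (the fuzz threshold is whatever suits your camera; file names are placeholders):

    # count pixels that differ between consecutive frames, ignoring small variations
    compare -metric AE -fuzz 20% frame1.jpg frame2.jpg null: 2>&1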


ImageMagick, ffmpeg, exiftool, and youtube-dl are four of the most useful command-line tools.


youtube-dl seems to be abandoned now. yt-dlp (an actively developed fork of youtube-dl) seems to be its accepted replacement, I think.

fyi


You are correct. I switched to yt-dlp just a couple of days ago (didn't remember the name when I was typing the comment). youtube-dl had started having issues where the download speed was super slow and restarting the download also didn't work. yt-dlp fixed it.


Thank you for the heads up! Been sad to see the list of sites that youtube-dl works on slowly degrade and shrink.


I did a project where I scanned a bunch of old media. The scans were so high quality that it was impractical to make a PDF for people to use. The ImageMagick community helped out with the right incantation to make everything just kinda work. One of the rare cases where a project's maintainers will just help you use it. I was wowed.


Was this on a public forum you can link?



I do book scanning a lot. Somebody might find these simple scripts useful: https://github.com/timonoko/BookScanner


If you wanna convert images for web, I use tools like:

  svgo (SVG Optimization)
  colorist / avifenc (AVIF)
  cwebp (webp)
  convert (ImageMagick, everything else)
because ImageMagick is able to convert most of these, but not every format (example invocations below).

https://github.com/svg/svgo

https://github.com/joedrago/colorist

https://developers.google.com/speed/webp/docs/cwebp

https://imagemagick.org/
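Typical invocations look something like this (flags and file names are illustrative, not prescriptive):

    svgo icon.svg -o icon.min.svg
    avifenc photo.png photo.avif
    cwebp -q 80 photo.png -o photo.webp
    convert photo.png -quality 85 photo.jpg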


Are there any [e]books focusing on ImageMagick? I use it a good deal, but it's one of those things where you _know_ you're under-utilizing it, and I'd like to take a deep dive with examples and sage wisdom attached.


Totally agree with this. The documentation is very mid-1990s, and every now and then I'll see a fork of it doing something bonkers that I didn't know was in there.


Always my go-to for any batch image manipulation. The documented examples are varied and helpful, but it takes a little while to get used to.

Here's one use of it in the wild, which takes a path of GAN output files, each with a grid of thumbnails, and batch-splits them into individual images. Gloriously easy. https://github.com/binarymax/matchbox-twelvy/blob/master/dcg...


See also vips and libvips, presenting a much cleaner library interface and faster processing:

https://www.libvips.org/
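As a quick taste of the vips CLI side of it (file names are placeholders):

    # generate a 200-pixel-wide thumbnail; libvips processes the image in a streaming fashion
    vips thumbnail input.jpg thumb.jpg 200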


I also echo this; I would go for VIPS first unless there are certain things that ImageMagick can do that VIPS can't. You'll cut down on your memory usage significantly (like 500 MB down to 20 MB) and also use much less CPU.


echoing this. I've been playing with this library and it's great!


If you ever need something actually performant or that uses far less memory, have a look at libVIPS


This is really interesting. In the past I've used netpbm for this kind of work, but it has been on life support for at least a decade now. I never used ImageMagick very much because each time I tried I found a new and exciting crash bug somewhere in the path, plus it tended to be much slower.


Most people have probably moved on to just using someone else to host and manipulate their images, but for business reasons, one of my apps stores its images (and all the cached sizes) in the DB (a single source of truth is also nice), so switching to libVIPS enabled me to reduce the amount of compute resources I pay for.


Seeing ImageMagick, of all hoary tools, trending on HN reminds me that I'm so old that the kids are relishing as classics the tools I regard as outdated.

This is to be expected in music, but in tooling?

And with Image-frigging-Magick?

Damn.

Looking forward to the front-page HN-trended article on `sendmail`.


The next ones will be mplayer, XFig, XPDF, nmh, TKGate for electronic boards, Links and Bochs. Back to 2002 again.


my default browser is.... [sunglasses emoji] Arena


Well, a lot of people use links -g and Netsurf daily :p.

HN works on texts, lots of blogs are plain text, and ofc you have http://68k.news, https://lite.cnn.io and https://text.npr.org.


This appears to just go to the script page of imagemagick.org. Is there something new or something else we should be looking at here?


Likely this came about from a reader of a recent post also on the front page [1] where image scaling is mentioned for website loading performance.

> There's a million ways to skin the cat of image resizing, whether you're using photoshop, gimp or a command line utility. We like to use imagemagick when ever possible.

I notice this pattern. Someone posts, readers look at the article / content and the comments, then find something else interesting, and that thing then becomes another submission to HN (either because it is indeed interesting on its own, or to gain points due to relevancy of surrounding material on the front page, or both).

[1] https://news.ycombinator.com/item?id=29405159


For each thing "everyone knows," there are tons of people learning about it for the first time right now.

https://xkcd.com/1053/


It's definitely a point worth remembering. I sometimes think about AWS's one year free tier offerings and think.. WTF doesn't have an AWS account and would still be signing up in 2021? In reality it's probably a lot of people.


> I sometimes think about AWS's one year free tier offerings and think.. WTF doesn't have an AWS account and would still be signing up in 2021?

AWS free tier offerings are per account; a person (or organization) can have lots of AWS accounts. The people signing up for a new one today may well be people that already have one.


I'm a dev with 11 years in the field and I've never touched AWS. It is on the todo list though, so I'll get to it eventually.

It can be easy to forget how vast this field truly is.


People just keep being born.


Such a good suite of utilities. When I first got on campus UNIX in '95, I pretty quickly found this and tried in vain to get it running on various versions of IRIX, Solaris, AIX, and whatnot. Wasn't until I got Linux going that I could actually use it.



Reminds me of how the tz database is maintained by one person:

https://onezero.medium.com/the-largely-untold-story-of-how-o...


wow, I thought the comic was already pretty applicable, and then I read the alt-text!


    convert *.png output.pdf

Magic, indeed.


[1999]



