1. suitability of a tool to multiple tasks
2. number of features a tool has
1. simple but flexible tools
2. complex and featureful tools
3. specialized but simple tools
4. highly specialized and highly featureful tools
So the question that remains when one is asking for a tool that 'does one thing well' is, do you mean for it to be specialized, or simple, or both?
1. API or API-like things (grep)
2. Highly configurable app (Photoshop, Word)
3. Default-heavy app (Instagram)
But the middle part, the product that is complex and configurable, but not really "programmed" with code except through a limited script layer, is the fat middle, because it demands so much more UI, and an experience that is well-integrated, amenable to default workflows, and yet also very easy to customize. Over time, big software projects always drift towards the middle.
The article specifically fits my model's type 1. A type 1 product is nearly ideal for the hacker who wants to glue together a bunch of different technologies. It only falls down when the abstractions are too crude or mismatched to the problem. But the operating system itself is more like a type 2 product - there are numerous assumptions about what the environment, services, access methods, etc. look like, conventions that were established early on in computing and haven't yet been revised. Every programming language makes accommodations for working in that environment, and not your weird custom operating system.
"specialized" = Do one thing
"flexible" = and do it well.
That said, the key ingredient of the Unix philosophy that is missing isn't specialised, flexible tools, it's composability of those tools. Imagine if you couldn't pipe grep output to another program, wouldn't it be a lot less useful? I'd say so.
There are ways we can get around this, but they would require a fair amount of development time. To give you one example, consider how the JACK audio server allows programs to share audio streams. Similar data streaming arrangements could be made for other types of data.
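To make that concrete, here is a minimal sketch using plain named pipes (FIFOs), which let otherwise unrelated programs share a data stream much as JACK clients share audio; the path and the toy producer/consumer are of course placeholders:

```shell
# Create a named pipe that any two programs can share.
rm -f /tmp/datastream
mkfifo /tmp/datastream

# Producer: stream records into the pipe from the background.
printf 'sample1\nsample2\nsample3\n' > /tmp/datastream &

# Consumer: a completely separate program reads the same stream.
grep -c 'sample' /tmp/datastream   # → 3

# Clean up.
rm -f /tmp/datastream
```

JACK does much more than this (real-time scheduling, many-to-many routing), but the FIFO shows the basic shape of stream sharing between independent processes.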
"Do it well" is so vague of a saying that it really could mean any of those things, whether it be flexibility, or simplicity, or etc.
Think about how programs become bloated. What helps the core Unix command line tools avoid this fate?
It's important to remember that while quite involved and extremely domain-specific, Beeminder has an API. He seems to care a lot about that, and I have no idea if his real point is "make things that don't suck and avoid digital vertical integration".
But the point about simple & flexible tools is not to miss on features for the user, but to separate those into different tools. The people who prefer a simple text editor to an IDE aren't doing everything else by hand, but using a set of different tools to accomplish the same as the IDE built-in features would.
Taking the Beeminder example, and with the caveat that I've never used it: the graphing system seems cool, but it can't be used by anyone who doesn't have access to a credit card; if the graphs and the commitment payment systems were two separate tools that you could connect instead of a bundle, one could use the former with an alternative commitment system.
My model of gmail is gmail. This model is 100% accurate.
1 - The need for the web to be profitable in some way makes 'single tools' hard to build. You have to grow users, add features, etc, etc. So even if you have an API or something useful it likely won't scale economically. Unless it's government supported or behind a foundation of some sort. The Twitter API comes to mind here.
2 - This is tricky because for the web to meet the UNIX philosophy, everyone has to agree. You can't have the team that manages the equivalent of the `ls` website decide to change their output, or strike up a deal with the `diff` team to force `diff3` out.
Once again, capitalism ruins everything fun. That's hyperbole, kinda :-)
And creates everything fun.
1: I use the qualifier "mostly" because the government did farm out most research and engineering tasks to a handful of for-profit companies (Bolt Beranek and Newman, the MITRE Corporation, SRI International) that specialized in government contracts.
Look at what capitalism gave us as an Internet like experience: compuserve, aol, etc. Those were all horrible closed wall systems.
- The SaaS/cloud model wasn't so popular, which means trying to lock you in by stealing your data, or doing absolutely ridiculous things like IoT does, wasn't something you saw.
How can someone claim with a straight face "but the web really was a lot more fun pre-dotcom-bubble" ? Good luck trying to find anything outside of nerd subculture and physics/math/CS content (exaggeration of course, but the core is true). (of course it could be 'true' if all one cares about is nerd subculture and physics/math/CS content...)
90% of for-profit content could disappear just like that, and humanity would be much better off. Almost any information that is valuable, you can find for free, posted by people who don't try to use you.
Wikipedia didn't exist in 1999 (and wouldn't for years), and even Slashdot was barely more than a gossip site. Good quality discussion back then was on Usenet or IRC. The concept of a 'MOOC' was just a wet dream of some 'cypherpunks' and 'technoanarchists'. Download a manual for your microwave or car, do online banking, email anyone but your hardcore nerd friends? Forget it. Price-comparison shopping, ordering something from another continent? Lol. I remember riding my bike to a local bank branch to pick up foreign currency, which I stuffed into an envelope and snailmailed. I had to ask at the counter of the post office how much the postage was, because there was no way to look that up online.
Mp3? Sort of existed, you could download song by song from geocities pages; that was before the crackdown that led to Napster even. But the "selection" was minuscule, compared to today.
Oh, and back then, when you wrote a site, you chose between supporting IE or Netscape, or browser sniffing and serving two versions, or sticking to the lowest common denominator, which wasn't much, to put it mildly. Ffs, people won 'best of the web' awards for HTML that worked on two browsers and didn't look like crap! Like obscure competitions today where you build a file that is both a PDF and a JPG! My first job, back around that time, was to 'port' a website from IE to Netscape.
The more I think about it, the more convinced I get: any claim that it was better in those days is just rose-colored glasses.
I don't find that anymore. Homepages came and went, blogs came and went, and if there's still a thriving web out there beyond the big commercial sites, I don't know where to look to find it.
I still have my personal site, and I still post on my blog occasionally, but it increasingly feels like calling into the wind. I'm not going to just switch over and post all the same stuff on some big commercial site; fuck that. The web was awesome when it looked like it was going to be a new way for humanity to talk to itself; now it seems like little more than another way for rich people to make money. It does that very impressively, but who cares? Rich people always had ways to make money and will always find more; it's not very exciting when they turn yet another collaborative community project into a profit center.
I was just using an API the other day to validate addresses. It's an amazing thing, being able to hit an endpoint and not worry about the myriad of complex processes that must happen for that validation. If that's capitalism, I'd say it's pretty good.
And the capitalism dig has to do with when your address verification endpoint decides that an API isn't profitable. Or that it's better for them to change the format of the response. Or require you to have an account of some sort. Or rate limit. Or...
In the overall picture, if you're not helping someone earn money, you're living on borrowed time with the services they provide.
The USPS (which is quasi-private, I'll admit) offers address verification, but their service hasn't been reliable, which is why we have competitors now. Your fallacy is concluding that because something isn't flawless, the whole system is a failure. And throwing around the word 'profit' as some pejorative, dirty word is emotional manipulation. How else would people hire or expand if not for profit? The service I'm using (Smartystreets) now offers international address verification; I'm not sure how they would have paid for that if it weren't for profit. That extra money you bring home from your job after taxes and other expenses... that's profit. Your employer is living on borrowed time from the services you provide. What happens when you quit, change your rate, or become bored with your job? I guess it's worth the risk to them.
Take Wikipedia, for example, who already offer their compiled data. Now to take that a step further they could expose the transformations they use to compile that data as separate endpoints.
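Wikipedia does already expose some machine-readable endpoints over plain HTTP. A hedged sketch, assuming jq is installed; the exact REST path below is from memory and may differ from the current API docs:

```shell
# Fetch the structured summary of a page and pull out one field.
# The endpoint path is an assumption; check Wikipedia's API docs.
curl -s 'https://en.wikipedia.org/api/rest_v1/page/summary/Unix' \
  | jq -r '.extract'
```

Exposing the intermediate transformations the same way would just mean more endpoints of this shape, each returning one stage of the pipeline.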
The first problem I see is that "One Thing" is really subjective. Some people might see Postgres as doing one thing really well (It is, after all, an incredible Relational Database). But others might look at Postgres entirely differently: It is a client and a server. It is a data protection and credential management system. It is an in memory caching layer. It is a NoSQL key value engine. It is a SQL database. It is a data retrieval operation optimization system. It is a storage format. Hell, it does a million SQL things that other SQL databases don't...databases that probably also qualify as doing "One Thing" in other people's eyes.
The second problem is that the world is really fucking complex, and sometimes doing one thing well is impossible unless you also do one other thing well. Rich's big example in this article is Evernote, and his claim was that Evernote did one thing well, which was note synchronization. But notes are almost always more than just text... which was why they added photos and tables and web clippings. Who would ever want just the text aspect of their notes synchronized across devices, but not the photos they took of PowerPoint slides, their data tables, their diagrams, their emails, etc.? If Evernote wanted to do "Note Taking" well, they couldn't just stop at text synchronization across devices. So they should have stopped trying to do "Note Taking" well, because someone only used them for the text synchronization that they already did? Evernote is dying, that's for sure... but it's not because it did more than one thing; it's because it didn't do them well.
I get it. People like simplicity. But the world is complex, everybody's view of the world is different, and that means that sometimes you just end up not being the target market. And I also get that some things actually do do one thing and do it extremely well. But trying to extrapolate that out infinitely across all things (or even just across all software things) just doesn't pan out in reality. And what does that mean for the philosophy? It should probably just be extended to "Do things well". But that is no longer a distinctive philosophy, is it?
In general I agree with your line of thinking, but I will nitpick on this particular sentence (or the way it's phrased) and say: the whole idea of doing only one thing is that if you try to do more than one thing, you will definitely not do them all well.
I do agree with you though. The "many things" that postgresql does are all actually just one thing: database system.
Yea some people could argue that under the cover it's many different things, but the whole idea is about focus, not about counting features. There are always multiple aspects to the "one thing" (whatever your one thing happens to be). Doing the one thing well means tackling all of these aspects.
The reason you can't do multiple things well is that each one of the multiple things you want to do will in itself have many aspects, and your focus will be fragmented trying to tackle too many things with very limited resources.
That's one thing I don't get. Dropbox could have made a decent markdown or (please, pretty please) org-mode editor and would have blown everyone out of the water. Editor over Dropbox is a perfect thing - unlike every other butt solution out there, including Google Drive, it doesn't limit you to your butt provider. Files on Dropbox are still your files, in your own filesystem, and you can do whatever the fuck you want with them. A simple editor for people who don't like (or don't want to, on a particular device) playing with file juggling, and it would be a perfect note taking solution.
It would be amazing if they would just let me use this on their website:
I also felt like there was a lack of imagination in some of the comments. Pooh-poohing images in notes as just selfies is nice hyperbole, but I used it all the time for system diagrams and other bits that are so much easier to draw on a large whiteboard than to transcribe or even draw on a digital canvas.
I DO wish there was an easier way to do more than just one thing with IFTTT, though. Perhaps that is in the cards someday.
Cohesion is achieved when the degree of complexity is determined by both domain and audience, so Photoshop's relative complexity and Pixelmator's relative simplicity are both fine - in each case the domain is satisfactorily handled for the needs of the respective audience. They're both cohesive tools. Now if Photoshop decided to throw in a chat client (hello gmail), we'd be having a different conversation.
However, if the author is actually advocating breaking up Photoshop into a thousand pluggable little tools that everyone would have to piecemeal assemble into some sort of shared "workspace," that's where we (and most non-geeks) part company.
2. There was a different metaphor that had its merits: object composition through standard interfaces, e.g. OLE/COM and the likes. One might argue about its implementation (embedding a Visio object in a Word document still produces crashes, 25 years later), but as a UI metaphor it was very powerful.
4. As for applications, "Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp" (Greenspun's tenth rule). Slightly updated, this means that as programs get more complex, they tend to become more integrated and customizable, to the point where you use a high-level language to glue the different parts. If you're lucky they may have replaced a custom Common Lisp with a mature V8 engine, but the lesson is still the same.
5. Outside of academia, people (product managers in particular) tend to be focused on solutions, not abstractions. And for good reason: solutions are often shortcuts for common applications of abstractions, and therefore they provide lots of value. File.ReadAllLines() vs. doing the same with 3-4 Java classes in the old days is the best example.
6. In the end, we need people to think about abstractions: unix pipes, map-reduce, URLs. And we need other people who think about solutions: the iPhone, the Google world, etc.
7. As for the OP: curl might be a good start of a pipe. Add a program that parses tables, and a program that posts to REST APIs...
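A minimal sketch of that pipe, with jq standing in for "a program that parses tables" and placeholder endpoints (neither example.com URL is a real service):

```shell
# Fetch JSON from one hypothetical service, reshape it with jq,
# then POST the result to another hypothetical service's REST API.
curl -s 'https://api.example.com/v1/items' \
  | jq '{titles: [.items[].title]}' \
  | curl -s -X POST -H 'Content-Type: application/json' \
         --data @- 'https://other.example.com/v1/collect'
```

The `--data @-` makes the second curl read its POST body from stdin, which is what lets it sit at the end of a pipe.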
The modern "web operating system" is Amazon's AWS, and I'd argue that the unix philosophy survived surprisingly well there. AWS has many services; each one does a specific job really well, and they interoperate very well together. That's the very embodiment of the unix philosophy.
My addendum is: "Every cloud app expands to the point where it can host group chat. Those apps which cannot so expand are replaced by ones which can."
I think it's also possible that I may have to augment the addendum to "group chat and screen sharing."
Get up and walk away from the keyboard for a bit. Maybe take a refreshing shower. Clear your head and let yourself think this through a bit. You'll come to the right conclusion. Now go back to your keyboard and work on something that delivers value for your customers :)
So one always ends up massaging the data to make it understood.
The piping concept was actually more powerful in the Xerox PARC systems, which used the respective language's REPL to do LINQ-style data transformations.
However, the problem the article runs into is that once you scale out of the CLI, you need a standard communications API for this type of stuff.
One that is able to work in distributed systems, dealing with all types of failure issues.
If anything, native programming on the mobile offers some kind of piping thanks to intents, contracts and extensions.
Here's an example of the first occasion I seriously used this, when producing a 3D anaglyphic animation for video: graphics frames - two images for left and right - were rendered in VistaPro (the original 3D landscape generator); as soon as rendering was complete, ImageFX (image processing software) picked up the output and combined the channels to make an anaglyph; this was then sent to the PAR animation recorder (a video recorder). So, THREE separate large packages from different software companies, working in synchrony with each other via ARexx. The whole operation worked so smoothly and efficiently, first time, that I still recall it with awe.
Sadly, yet another killer Amiga feature that never made it to modern computing...
I would say the closest we have today to ARexx experience are Powershell and AppleScript.
As others have pointed out, there are great practical difficulties in integrating different web services. Unix has pipes built into the core of the OS - anything analogous would have to be "bolted on" to the Internet, and thus probably turn out to be not as powerful or simple to use. And how about the UI and user friendliness? On Unix, every program has more or less the same UI - a couple of lines of text on a black background. The Internet? Everybody has a different fancy graphic layout. If you can do everything within the same "walled garden", that reduces confusion on this count.
For these (and other reasons, such as the aforementioned capitalism), I do not think that the mainstream Internet is ever going to behave the way the author envisions. What I could imagine, however, is a sort of parallel Internet that displays this property to some extent - a range of services explicitly targeting technical users (who are more likely to value the "do one thing well" approach and less likely to care if the GUI isn't quite as snazzy). These services would never grow big, but they could build up a loyal following. Kind of like HN, really...
To be fair, I don't think we have a good way to do that with graphical OS shells even on Unix, just CLIs, so it's probably not surprising that we don't have it in web browsers.
Without the ability to string multiple, focused tools together, even tools that did one thing well (e.g., Evernote) will continue to add features until they do a bunch of things meh.
On the web, "|" (pipe) means integration with other services, and it comes with all the associated trials and tribulations. To make web pipes work, just to even get started, you first have to solve authentication and data format/transfer standards. Those things are a big pain; it's not surprising that most services attempt to provide the service directly.
Unix pipes were designed for builders, technical people who understand the structure of the data they're piping around. Posix commands come in a minimum standardized set that work well together, and while there are some commands here and there to pipe structured and/or binary data, the core set and what most people use pipes on and love pipes for is plain text.
The internet just doesn't have the same foundation or purpose or user base; its usage is not centered around technical people building pipelines, so it's not surprising that web services by and large haven't flocked to meet a unix analogy -- it's technically much harder to do than writing unix pipelines, and it's not even clear it's close to a useful thing to do, for most people.
The web has a variety of other content types - images, videos, applications, structured data - that don't map well to this model. Before you have interoperable webapps, you need to define common data formats for them to interop.
Also: ditch plaintext for structured data and suddenly handling of other content becomes much, much easier, and they map perfectly to this model.
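A small illustration of why, assuming jq is installed: a plaintext field that happens to contain the delimiter confuses cut, while the same field in JSON passes through a pipeline intact:

```shell
# Plain text: splitting on spaces mangles a name containing a space.
echo 'Jane Doe,42' | cut -d' ' -f2      # → Doe,42

# Structured data: the field survives no matter what it contains.
echo '{"name":"Jane Doe","age":42}' | jq -r '.name'   # → Jane Doe
```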
Unix pipes can lean on tools that have already solved the authentication problem: put ssh in the pipeline and it handles it. For instance, to stream a compressed hard-disk copy safely between two hosts you can use the following statement:
dd if=/dev/harddisk | gzip -9 | ssh user@xyz dd of=hdcopy.gz
tar cvf - FILE-LIST | gzip -c > FILE.tar.gz
$ tar --extract --verbose --gzip < foo.tar.gz
$ tar --create --gzip --file sql.tar.gz sql/
After switching to this style I never have problems coming up with the correct command. It's a lot easier to read in shell scripts too.
In Plan 9, literally everything has a file abstraction. This also includes sockets. So, even shell programs can be network programs without external programs like curl or wget or anything like that. For anyone interested, look at webfs(4). You may say that this can be implemented with fuse. But having something first class, designed to operate well within the file abstraction, is very different from something that has been added as an afterthought. In some sense, the BSD folks who added sockets into Unix really screwed it up and missed the Unix philosophy altogether.
cdrecord, ffmpeg, emacs, dbus, imagemagick, all modern browsers ... the list goes on and on. plumber(4) does what dbus is trying to do, with very little amount of code and text file based rules.
I see it as a missed opportunity for the Linux kernel, and hence for the wider audience, to experience a computing environment that is a joy to use.
on the other hand there are plenty of services/applications that keep stable interfaces for many years at a time. they do not extend themselves too far beyond solving the problem they originally tried to solve. we can all imagine what craigslist would look like in the hands of short-term profiteers, endlessly tweaking the interface for more ad clicks and "user engagement".
the success of sites like the original google search, craigslist, and HN proves that the "do one thing, keep it simple" model is successful and can often be very profitable in the long term. sadly, it is very easy to forget about such ideals when people are constantly dangling fresh money in your face and/or you have salaries to pay. while page rank might be considered the key element to google's genesis and explosion, we also owe much respect to the people that decided and continually insisted that the UI stay clean and minimal.
After the communication channel is in place it needs a few standard interfaces, maybe there could be some video player interface with play/pause/seek functions. These dont have to be included in any w3 standard, its more of a de facto standard or agreement that if you make a video player and don't implement the IAwesomeVideoplayerInterface, websites will not allow embedding of your product.
What's sad is that it seems like vendors are trying to move away from this model, Facebook is the prime example of this with hosting of a copy of embedded videos and displaying linked news inline in their own format.
Sooner or later, you're bumping into the markets of other companies and it's impossible to compete against them b/c you're using your weaknesses against their strengths.
There is only so much money and attention (time) in the world for people to spend on things. The low hanging fruit in developed countries has already been plucked clean so it's quite hard to 'grow' new markets. Your new tv series better be REALLY good if you're competing with game-of-thrones and breaking-bad.
Interestingly enough, the Chinese companies actually have many more features and are often better (for their users) than the american counterparts. The population is more homogenous and has more of a crowd mentality. Network effects are huge. Monopoly-like companies also prevent too much fragmentation meaning technology as a tool becomes standardized. Rather than 20 different things competing for the attention of everyone, there is one big company with apps that do everything that is trusted and reliable and thus the 'best'.
While american companies compete for slices of a certain size of pie, one chinese company owns the entire huge pie. As long as the users have pie, they are happy.
In allocation of limited resources, this usually fails as equal distribution of resources leaves everyone poor which creates huge incentive for corruption and hoarding. For abundant resources that can be duplicated infinitely and have network effects to boot this is perhaps a better strategy.
After all, it's all about everyone having an abundance of pie, not who has more or less pie.
Also, software engineering sounds unglamorous today, and research in the field seems to have stalled. Hence the craving for the simplicity of the Unix philosophy. Not that the Unix way of working actually solves the problems posed by software complexity; it just asks us to avoid complexity.
Wait a minute. This is called the clipboard. Try it -- you can just copy and paste the table into a Google Doc and it Just Works. What's missing?
1983 was a while ago: http://harmful.cat-v.org/cat-v/. That's a long time to lament something.
As one who still writes shell scripts for my work that do such a thing, and programs, too, I disagree as every Unix and BSD coder I know still does these things. It still serves us very well; far better than the glut of so-called package managers that pretend they can do better than 'make'.
I chalk a large portion of this up to those creating web pages without any real programming knowledge, training or background. Those who only know how to cut/paste/npm everything they do. These are the same people who think Unix is old and not modern.
I took an interview with a small company yesterday. The creative director there asked me what tools I knew, then spewed out everything but the kitchen sink that they use. I was aware of all of them but questioned why he needed any of them.
You see, I've been running a web dev company for 11 years and have never found an advantage to any of it. He asked how we survived without npm or bower or etc. but, when I asked him if he knew how to write a Makefile, he didn't even know what it was or what it did.
A lot of the tools we use are things we built up over time ... or last week. Today's "modern" tools may be "instant on" for those who can't write a Makefile either, but that's a fault and not a feature. If you need npm or bower to manage these things, then what happens when something breaks, goes away, or becomes unsupported?
I stuck with npm and bower because, when I tried to write about Angular and other things, it got too long-winded.
One of my points is, all the tools you need are already built into any Unix/BSD system so why look elsewhere? Those who do are only looking for quick fixes, as I pointed out earlier, and not interested in the science behind it. Creatives who want to build a web site but have no interest in the technology. They can get it to work, eventually, but "it works" is good enough.
No it's not. That's why smart companies hire mine.
Can you not make the connection?
That's just one, assuming you never want formatting, tables, pictures, etc.
 - http://orgmode.org/
That's why I personally want emacs on my phone and tablet. I don't know yet the best way to expose its functionality with a touch interface, but it's still, hands-down the best way to edit information.
Maybe something where a tap in the minibuffer offers some sort of helm- or ido-like command-picking mode, with taps on the side to enable quick execution of text-editing commands? I dunno, really.
> That's just one, assuming you never want formatting, tables, pictures, etc.
Emacs can handle formatting, tables and pictures if you want.
For a proper "Web UNIX" we need:
- websites talking in structured data (not just plain text)
- less proprietary bullshit (hint: keep sales & marketing people away from APIs)
- ability to conveniently pipe them together anywhere (not on a third-party, complexity-hiding, feature-limiting site like IFTTT)
When I can start typing things like these in my own, local shell:
@twitter.com/me/tweets/latest | sort > tweets.log
cat tweets.log | @facebook.com/me/post/new --activity "Feeling: Happy" --photo /tmp/HN.jpg
then we'll have a web UNIX.
This is essentially the thesis statement:
> Unix has pipes, which make it easy to build complex applications from chains of simpler commands. On the Web, nobody may know you’re a dog, but we don’t have pipes, either. [emphasis mine]
It is explicitly lamenting the complexity of web-based applications as opposed to other kinds of applications. If you disagree that that's the premise, then you and I are living in rather distinct universes.
Editors that do fit with the Unix philosophy (they were written by its original developers) are ed, sam, and acme.
(Actually, most of the individual applications you're thinking about are these days provided through an Emacs package system.)
(Although one thing that can be said for Emacs as regards the Unix philosophy is that it is programmable.)