
Do One Thing - steelbird
http://radar.oreilly.com/2015/10/do-one-thing.html
======
astine
On the 'do one thing and do it well' issue, there seem to be two axes:

1. suitability of a tool to multiple tasks
2. number of features a tool has

This makes for sort of a field with four quadrants:

1. simple but flexible tools
2. complex and featureful tools
3. specialized but simple tools
4. highly specialized and highly featureful tools

Take a tool like grep: it clearly belongs in the first quadrant. You can
do a lot with it, but it's pretty simple to learn and use. Emacs (or Eclipse)
belongs in the second quadrant, as it can be used for basically anything and is
complicated to learn. Something like MyPaint belongs in the third quadrant, and
there are numerous domain-specific tools that belong in the fourth.

So the question that remains when one is asking for a tool that 'does one
thing well' is, do you mean for it to be specialized, or simple, or both?

~~~
ZenoArrow
In this context...

"specialized" = Do one thing

"flexible" = and do it well.

That said, the key ingredient of the Unix philosophy that is missing isn't
specialised, flexible tools; it's the composability of those tools. Imagine if
you couldn't pipe grep output to another program - wouldn't it be a lot less
useful? I'd say so.

There are ways we can get around this, but they would require a fair amount of
development time. To give you one example, consider how the JACK audio server
allows programs to share audio streams. Similar data streaming arrangements
could be made for other types of data.
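The composability point can be sketched with a classic pipeline: four
single-purpose tools, none of which knows about the others, composed into a
small word-frequency counter.

```shell
# Count word frequencies by composing four single-purpose tools with
# pipes - none of them knows about the others, yet together they form
# a small program.
printf 'the cat sat on the mat\nthe dog sat too\n' |
  tr ' ' '\n' |   # split into one word per line
  sort |          # group identical words together
  uniq -c |       # count each group
  sort -rn        # most frequent first
```

Remove any one stage and the rest keep working; that is the property the
thread is arguing the web lacks.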

~~~
ybx
How does flexible mean do it well?

"Do it well" is such a vague saying that it could really mean any of those
things, whether that be flexibility, simplicity, or something else.

~~~
astine
My understanding of the phrase 'do it well' isn't as an injunction but as a
way of saying that if you restrict your program to one thing, you'll have the
ability to do it very well. It's meant to contrast with a program that does
multiple things, none of them especially well.

------
bigethan
I had two thoughts while reading this.

1 - The need for the web to be profitable in some way makes 'single tools'
hard to build. You have to grow users, add features, etc, etc. So even if you
have an API or something useful it likely won't scale economically. Unless
it's government supported or behind a foundation of some sort. The Twitter API
comes to mind here.

2 - This is tricky because for the web to meet the UNIX philosophy, everyone
has to agree. You can't have the team that manages the equivalent of the `ls`
website decide to change their output, or strike up a deal with the `diff`
team to force `diff3` out.

Once again, capitalism ruins everything fun. That's hyperbole, kinda :-)

~~~
jshen
> Once again, capitalism ruins everything fun

And creates everything fun.

~~~
TeMPOraL
Depends. I accept capitalism's ability to raise resources (capital) for
larger-scale projects, but generally the profit motive fucks everything up. The
Internet was fine until every pathetic "entrepreneur" out there smelled easy
money and came to ruin it all. It happens in every industry. Capitalism
creates abundance - but an abundance of the worst possible crap sellers can
still get away with.

~~~
hueving
The Internet wasn't even accessible to the general public without capitalism
motivating the creation of ISPs.

~~~
josho
That may be true. But the Internet existed because of massive, multi-decade
investment from folks without a profit motive. All of the fundamental Internet
protocols were created through those organizations.

Look at what capitalism gave us as an Internet-like experience: CompuServe,
AOL, etc. Those were all horrible closed, walled-garden systems.

~~~
hueving
But there are many massive investments at the academic and government research
level that led to nothing directly useful for consumers, due to a lack of
investment interest in bringing it to market. Just look at the next-generation
Internet architectures that go nowhere. You can't ignore economics.

------
saosebastiao
I'm definitely guilty of holding onto a philosophy and thinking it applies
universally, so I can't blame people that hold onto the Unix philosophy
extremely strongly. But I can't help but think that the world requires a hell
of a lot more pragmatism than the Unix philosophy can provide.

The first problem I see is that "One Thing" is really subjective. Some people
might see Postgres as doing one thing really well (It is, after all, an
incredible Relational Database). But others might look at Postgres entirely
differently: It is a client and a server. It is a data protection and
credential management system. It is an in-memory caching layer. It is a NoSQL
key value engine. It is a SQL database. It is a data retrieval operation
optimization system. It is a storage format. Hell, it does a million SQL
things that other SQL databases don't...databases that probably also qualify
as doing "One Thing" in other people's eyes.

The second problem is that the world is really fucking complex, and sometimes
doing one thing well is impossible unless you also do one other thing well.
Rich's big example in this article is Evernote, and his claim was that
Evernote did one thing well, which was note synchronization. But notes are
almost always more than just text...which was why they added photos and tables
and web clippings. Who would ever want just the text aspect of their notes
synchronized across devices, but not their photos that they took of powerpoint
slides, their data tables, their diagrams, their emails, etc.? If Evernote
wanted to do "Note Taking" well, they couldn't just stop at text
synchronization across devices. So they should have stopped trying to do "Note
Taking" well, because someone only used them for the text synchronization that
they already did? Evernote is dying, that's for sure...but it's not because it
did more than one thing, it's because it didn't do them well.

I get it. People like simplicity. But the world is complex, everybody's view
of the world is different, and that means that sometimes you just end up not
being the target market. And I also get that some things actually do do one
thing and do it extremely well. But trying to extrapolate that out infinitely
across all things (or even just across all software things) just doesn't pan
out in reality. And what does that mean for the philosophy? It should probably
just be extended to "Do things well". But that is no longer a distinctive
philosophy, is it?

~~~
hasenj
> but it's not because it did more than one thing, it's because it didn't do
> them well.

In general I agree with your line of thinking, but I will nitpick on this
particular sentence (or the way it's phrased) and say: the whole idea of doing
only one thing is that if you try to do more than one thing, you will
definitely not do them all well.

I do agree with you though. The "many things" that PostgreSQL does are all
actually just one thing: a database system.

Yeah, some people could argue that under the covers it's many different things,
but the whole idea is about focus, not about counting features. There are
always multiple aspects to the "one thing" (whatever your one thing happens to
be). Doing the one thing well means tackling all of those aspects.

The reason you can't do multiple things well is that each one of the multiple
things you want to do will in itself have many aspects, and your focus will be
fragmented trying to tackle too many things with very limited resources.

~~~
paulddraper
What if something is really good at being a website?

~~~
paublyrne
Or really good at being a platform.

------
m_fayer
It sounds like what he's actually arguing for is cohesion, not just
simplicity. And if that's the case, I'm with him.

Cohesion is achieved when the degree of complexity is determined by both
domain and audience, so Photoshop's relative complexity and Pixelmator's
relative simplicity are both fine - in each case the domain is satisfactorily
handled for the needs of the respective audience. They're both cohesive tools.
Now if Photoshop decided to throw in a chat client (hello gmail), we'd be
having a different conversation.

However, if the author is actually advocating breaking up Photoshop into a
thousand pluggable little tools that everyone would have to piecemeal assemble
into some sort of shared "workspace," that's where we (and most non-geeks)
part company.

------
cmarschner
1. The Unix shell solved one thing very well: (unary) function composition
based on a text protocol. It gets complicated as soon as you want to go beyond
a pipe - e.g. a graph like a makefile. Then you need to leave the pipe metaphor
in most cases and deal with artifacts like files. Even more so, map-reduce was
a big shift that went beyond the unix metaphor.
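A minimal illustration of the "beyond a pipe" limit: a pipe is strictly
linear, but joining two streams already needs a small graph. Bash's process
substitution can fake one (this is a sketch relying on a bash extension, not
something POSIX sh guarantees):

```shell
# A plain pipe is a straight line; joining two inputs needs a small
# graph. Bash process substitution <(...) gives each branch its own
# stream, so `join` consumes two pipelines at once.
join <(printf 'a 1\nb 2\n' | sort) \
     <(printf 'a x\nb y\n' | sort)
# prints:
# a 1 x
# b 2 y
```

Anything more tangled than this - fan-out, caching, retries - and you are back
to files and a makefile, as the comment says.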

2. There was a different metaphor that had its merits: object composition
through standard interfaces, e.g. OLE/COM and the like. One might argue about
its implementation (embedding a Visio object in a Word document still produces
crashes, 25 years later), but as a UI metaphor it was very powerful.

3. The web's metaphor is coupling disparate content through URLs and HTTP
(and HTML). One of the most mind-boggling metaphors ever introduced to man
(talk about doing one thing well). Today we use REST APIs as the atomic pieces
(do one thing well) and JavaScript/Objective-C/Java as glue. Same thing.

4. As far as applications go, "Any sufficiently complicated C or Fortran
program contains an ad hoc, informally-specified, bug-ridden, slow
implementation of half of Common Lisp" (Greenspun's tenth rule). Slightly
updated, this means that as programs get more complex, they tend to become more
integrated and customizable, to the point where you use a high-level language
to glue the different parts. If you're lucky they may have replaced a custom
Common Lisp with a mature V8 engine, but the lore is still the same.

5. Outside of academia, people (product managers in particular) tend to be
focused on solutions, not abstractions. And for good reason: solutions are
often shortcuts for common applications of abstractions, and therefore they
provide lots of value. File.ReadAllLines() vs. doing the same with 3-4 Java
classes in the old days is the best example.

6. In the end, we need people to think about abstractions: unix pipes, map-
reduce, URLs. And we need other people who think about solutions: the iPhone,
the Google world, etc.

7. As for the OP: curl might be a good start of a pipe. Add a program that
parses tables, and a program that posts to REST APIs...
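The pipeline proposed here might look like the sketch below. The fetch stage
would be `curl -s <url>` and the final stage a POST to some hypothetical REST
endpoint; a here-doc stands in for the fetch so the sketch runs offline, and
the table parser is a deliberately naive awk filter, not a real HTML parser:

```shell
# Sketch of the proposed pipeline: fetch | parse-table | post.
parse_table() {
  # Flatten the document, put each <tr> on its own line, turn cell
  # boundaries into tabs, then strip the remaining tags.
  tr -d '\n' |
    awk '{
      gsub(/<tr>/, "\n")         # one table row per output line
      gsub(/<\/td><td>/, "\t")   # tab-separate the cells
      gsub(/<[^>]*>/, "")        # strip the remaining tags
      print
    }' |
    grep .                       # drop empty lines
}

# Here a here-doc stands in for `curl -s <url>`:
cat <<'EOF' | parse_table
<table><tr><td>alice</td><td>42</td></tr>
<tr><td>bob</td><td>7</td></tr></table>
EOF
# prints:
# alice	42
# bob	7

# The posting stage would be something like (endpoint hypothetical):
# ... | curl -s -X POST --data-binary @- https://api.example.com/rows
```

The interesting part is that the middle stage emits the tab-and-newline format
the rest of the Unix toolbox already speaks.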

------
coffeemug
_> That philosophy was great, but hasn’t survived into the Web age._

The modern "web operating system" is Amazon's AWS, and I'd argue that the unix
philosophy survived surprisingly well there. AWS has many services; each one
does a specific job really well, and they interoperate very well together.
That's the very embodiment of the unix philosophy.

~~~
paulddraper
Yep. AWS is a great example. Many products that can be used very well together
(S3 and Cloudfront, for example), but the chosen abstractions are such that
they can be used independently as well.

------
jonstokes
I would like to propose a "cloud" addendum to Zawinski's law, which states
that "Every program attempts to expand until it can read mail. Those programs
which cannot so expand are replaced by ones which can".

My addendum is: "Every cloud app expands to the point where it can host group
chat. Those apps which cannot so expand are replaced by ones which can."

I think it's also possible that I may have to augment the addendum to "group
chat and screen sharing."

~~~
ExcilSploft
I suggest it is the package manager. Everything has its own little plugin
management tools for retrieving, installing, etc.

~~~
ryandrake
If you're NOT writing a web browser, and the little voice in your head says
something like "I know! I'll solve that by writing a plug-in system and let
other developers write to my API, and all their code will run in a sandbox,
and I'll write a virtual filesystem for them to use, and I'll provide an
installation/uninstallation system, and maybe a debugger..."

STOP

Get up and walk away from the keyboard for a bit. Maybe take a refreshing
shower. Clear your head and let yourself think this through a bit. You'll come
to the right conclusion. Now go back to your keyboard and work on something
that delivers value for your customers :)

------
pjmlp
Unix pipes only work in the CLI with a text-based interface, and even then
they break down when the applications aren't able to parse the data.

So one always ends up massaging the data to make it understood.

The piping concept was actually more powerful in the Xerox PARC systems, which
used the respective language's REPL to do LINQ-style data transformations.

However, the problem with the article is that once you scale out of the CLI,
you need a standard communications API for this type of stuff.

One that is able to work in distributed systems, dealing with all types of
failure issues.

If anything, native programming on mobile offers some kind of piping thanks to
intents, contracts and extensions.

~~~
fractallyte
The Amiga ecosystem perfected this: most (good) software featured an ARexx
'port' - an API, based on the REXX language, that enabled any one program to
communicate with another. Or one could simply write an ARexx script to
automate a process.

Here's an example of the first occasion I seriously used this, when producing
a 3D anaglyphic animation for video: graphics frames - two images for left and
right - were rendered in VistaPro (the original 3D landscape generator); as
soon as rendering was complete, ImageFX (image processing software) picked up
the output and combined the channels to make an anaglyph; this was then sent
to the PAR animation recorder (a video recorder). So, THREE separate large
packages from different software companies, working in synchrony with each
other via ARexx. The whole operation worked so smoothly and efficiently, first
time, that I still recall it with awe.

Sadly, yet another killer Amiga feature that never made it to modern
computing...

~~~
pjmlp
The libraries concept used to extend the OS was another cool Amiga feature.

I would say the closest we have today to the ARexx experience are PowerShell
and AppleScript.

------
veddox
I am a great fan of the Unix philosophy and can sympathize with the author and
his points, but I really don't know how realistic the whole "make the Internet
a new Unix" scheme is.

As others have pointed out, there are great practical difficulties in
integrating different web services. Unix has pipes built into the core of the
OS - anything analogous would have to be "bolted on" to the Internet, and thus
probably turn out to be not as powerful or simple to use. And how about the UI
and user friendliness? On Unix, every program has more or less the same UI - a
couple of lines of text on a black background. The Internet? Everybody has a
different fancy graphic layout. If you can do everything within the same
"walled garden", that reduces confusion on this count.

For these (and other reasons, such as the aforementioned capitalism), I do not
think that the mainstream Internet is ever going to behave the way the author
envisions. What I could imagine, however, is a sort of parallel Internet that
displays this property to some extent - a range of services explicitly
targeting technical users (who are more likely to value the "do one thing
well" approach and less likely to care if the GUI isn't quite as snazzy).
These services would never grow big, but they could build up a loyal
following. Kind of like HN, really...

~~~
dkarapetyan
The internet already has a very simple interoperable protocol, HTTP. There is
nothing to bolt on, really, since a pipe is just a channel for text, and
sockets on port 80 can just as easily act as that channel.

~~~
copperx
But HTTP is one-directional (disregarding WebSockets, which are a client thing).

~~~
dragonwriter
HTTP is asymmetric (the two sides are not interchangeable), but the
information flow is bidirectional. I don't see how it prevents something
like Unix piping from being implemented between web services using it. It
seems to be mostly a UX design issue.

------
rco8786
I agree on the face of things, but what the web is missing from the unix
philosophy is an equivalent to |.

Without the ability to string multiple, focused tools together, even tools
that do one thing well (e.g., Evernote) will continue to add features until
they do a bunch of things meh.

~~~
dahart
Absolutely! And if you follow that to its logical conclusion, it seems to
answer the question of why we don't have it, right?

On the web, "|" (pipe) means integration with other services, and it comes
with all the associated trials and tribulations. To make web pipes work, just
to even get started, you first have to solve authentication and data
format/transfer standards. Those things are a big pain; it's not surprising
that most services attempt to provide the whole service directly.

Unix pipes were designed for builders, technical people who understand the
structure of the data they're piping around. POSIX commands come in a minimal
standardized set that works well together, and while there are some commands
here and there that pipe structured and/or binary data, the core set - what
most people use pipes on and love pipes for - is plain text.

The internet just doesn't have the same foundation or purpose or user base;
its usage is not centered around technical people building pipelines, so it's
not surprising that web services by and large haven't flocked to a unix
analogy -- it's technically much harder to do than writing unix tools, and
it's not even clear it's a useful thing to do, for most people.

~~~
nostrademons
The other big factor is that pipes only work because of the power of plain
text. The output of every UNIX command is ASCII text, and most of the time,
it's ASCII text with columns delimited by \t and rows delimited by \n. And
there are command-line utilities like awk, cut, sed, & xargs for parsing &
rearranging that text, and funneling it into formats that other commands
understand.
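The tab-and-newline convention in action - a minimal sketch showing cut and
awk slicing the same schema-less stream:

```shell
# The \t-columns, \n-rows convention: `cut` and `awk` both slice the
# same tab-separated stream without any schema.
printf 'alice\t42\nbob\t7\n' | cut -f1
# prints:
# alice
# bob
printf 'alice\t42\nbob\t7\n' | awk -F'\t' '$2 > 10 {print $1}'
# prints:
# alice
```

Every tool agrees on the delimiters by convention alone, which is both the
power and (as the reply below argues) the fragility of the scheme.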

The web has a variety of other content types - images, videos, applications,
structured data - that don't map well to this model. Before you have
interoperable webapps, you need to define common data formats for them to
interop.

~~~
TeMPOraL
I disagree with that. Pipes seem to work _in spite of_ plain text, which only
makes them incredibly inefficient. A much better way is to send data
structures through them (for which, by the way, the modern web is perfectly
well-suited, given the present JSON domination) - you can always render the
data to text if you need to, but you don't have to write arcane and bug-laden
shotgun parsers with sed and awk, because every step in the UNIX polka means
throwing away metadata.

Also: ditch plaintext for structured data and suddenly the handling of other
content becomes much, much easier, and it maps perfectly to this model.
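A sketch of that idea: ship one JSON object per line through the pipe and
filter on a named field instead of a column position, rendering to text only
at the end. (jq is the usual tool for this; python3 is used here only to keep
the example dependency-free.)

```shell
# Structured records through a pipe: one JSON object per line instead
# of whitespace columns. The filter selects by field name, so
# reordering or adding fields can't break it.
printf '{"name":"core.log","size":9000}\n{"name":"tiny.log","size":12}\n' |
  python3 -c '
import json, sys
for line in sys.stdin:            # one record per line
    rec = json.loads(line)
    if rec["size"] > 100:         # filter on a named field
        print(rec["name"])        # render to text at the very end
'
# prints: core.log
```

Contrast with the column-position approach: `awk '{if ($2 > 100) print $1}'`
silently breaks the moment a field is inserted or reordered.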

------
jobu
There is some merit to "doing one thing well", but taken to the extreme it can
be horrible in a different way:

 _tar cvf - FILE-LIST | gzip -c > FILE.tar.gz_

([https://xkcd.com/1168/](https://xkcd.com/1168/))

~~~
ainar-g
Rob in that comic clearly hasn't used Unix long enough to know about the
Arnold mnemonic[1].

[1] [http://i.imgur.com/Vf0An8J.png](http://i.imgur.com/Vf0An8J.png)

~~~
J_Darnley
What kind of tar are you both using that doesn't detect the compression type?
tar -xf FILE is fine.
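For what it's worth, modern GNU and BSD tar auto-detect the compression on
extraction, so the comic's incantation is only needed (if at all) on creation.
A small runnable sketch:

```shell
# Modern (GNU/BSD) tar detects the compression on extraction, so the
# explicit gzip pipe is only ever needed when creating the archive,
# and even there -z covers the common case.
mkdir -p demo && echo hello > demo/file.txt
tar czf demo.tar.gz demo                    # -z: gzip on create
mkdir -p out && tar xf demo.tar.gz -C out   # no -z: format auto-detected
cat out/demo/file.txt
# prints: hello
rm -r demo out demo.tar.gz                  # clean up
```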

------
gnu
Since no one has mentioned Plan 9, I thought I would mention one of the many
features that puts Plan 9 one notch above Unix and makes it ready for the web
era.

In Plan 9, literally everything has a file abstraction. This also includes
sockets. So, even shell programs can be network programs without external
programs like curl or wget or anything like that. For anyone interested, look
at webfs(4). You may say that this _can be_ implemented with fuse. But having
something first class, designed to operate well within the file abstraction,
is very different from something that has been added as an afterthought. In
some sense, the BSD folks who added sockets into Unix really screwed it up and
missed the Unix philosophy altogether.

~~~
gnu
Modern 'Linux' systems are full of programs that violate the Unix philosophy.

cdrecord, ffmpeg, emacs, dbus, imagemagick, all modern browsers ... the list
goes on and on. plumber(4) does what dbus is trying to do, with a very small
amount of code and text-file-based rules.

I see it as a missed opportunity for the Linux kernel, and hence for a wider
audience, to experience a computing environment that is a joy to use.

------
Asbostos
Ironically, his own blog includes a screen reader that appears whenever you
select some text. Don't blind people already have good screen readers? I
select text to remind me of the place I'm up to, and I get annoyed when little
buttons pop up.

------
cbetz
Economic incentives typically lead firms to look for new sources of revenue.
They hope to achieve this extra revenue with minimal cost, ideally by not
creating new products or developing new customer relationships. The natural
side effect of this situation for software companies is that they add feature
upon feature to existing products in an endless quest for growth. At the very
end of the road this situation doesn't usually work out well for the product,
the user, or the company.

On the other hand, there are plenty of services/applications that keep stable
interfaces for many years at a time. They do not extend themselves too far
beyond solving the problem they originally tried to solve. We can all imagine
what craigslist would look like in the hands of short-term profiteers,
endlessly tweaking the interface for more ad clicks and "user engagement".

The success of sites like the original Google search, craigslist, and HN
proves that the "do one thing, keep it simple" model is successful and can
often be very profitable in the long term. Sadly, it is very easy to forget
about such ideals when people are constantly dangling fresh money in your face
and/or you have salaries to pay. While PageRank might be considered the key
element of Google's genesis and explosion, we also owe much respect to the
people who decided, and continually insisted, that the UI stay clean and
minimal.

------
Too
There is always <iframe>. You can easily embed a YouTube or Vimeo video even
if your core product is not displaying videos. I always felt that iframe is a
bit of lost potential; what it needs is a JavaScript API so that you can
communicate back and forth with the frame/plugin. Last time I checked this was
not allowed due to the same-origin policy.

After the communication channel is in place, it needs a few standard
interfaces; maybe there could be some video player interface with
play/pause/seek functions. These don't have to be included in any W3C
standard; it's more of a de facto standard or agreement that if you make a
video player and don't implement the IAwesomeVideoplayerInterface, websites
will not allow embedding of your product.

What's sad is that it seems like vendors are trying to move away from this
model; Facebook is the prime example, hosting its own copy of embedded videos
and displaying linked news inline in its own format.

------
dlwj
IMO there is a limited market for mature ideas that is often much smaller than
the earlier prediction. When startups start hitting this boundary, instead of
reining in expectations and becoming an efficient tool serving a small market,
it's "go big or go bust", with desperate attempts to validate the earlier
market-size predictions.

Sooner or later, you're bumping into the markets of other companies, and it's
impossible to compete against them because you're pitting your weaknesses
against their strengths.

There is only so much money and attention (time) in the world for people to
spend on things. The low-hanging fruit in developed countries has already been
plucked clean, so it's quite hard to 'grow' new markets. Your new TV series
better be REALLY good if you're competing with Game of Thrones and Breaking
Bad.

Interestingly enough, Chinese companies actually have many more features and
are often better (for their users) than their American counterparts. The
population is more homogeneous and has more of a crowd mentality. Network
effects are huge. Monopoly-like companies also prevent too much fragmentation,
meaning technology as a tool becomes standardized. Rather than 20 different
things competing for everyone's attention, there is one big company with apps
that do everything, which is trusted and reliable and thus the 'best'.

While American companies compete for slices of a certain size of pie, one
Chinese company owns the entire huge pie. As long as the users have pie, they
are happy.

In the allocation of limited resources this usually fails, as equal
distribution of resources leaves everyone poor, which creates a huge incentive
for corruption and hoarding. For abundant resources that can be duplicated
infinitely, and that have network effects to boot, this is perhaps a better
strategy.

After all, it's all about everyone having an abundance of pie, not who has
more or less pie.

------
tatx
There is nothing inherently wrong with complexity. Complex things can be made
to appear simple, usable and open too. It is just that, when it comes to
software, we haven’t yet figured out a way of building complex systems that
are beautiful (in a very broad sense of the term). Whenever we try to create
anything complex in software it inevitably turns into a mess.

Also, software engineering sounds unglamorous today and research in the field
seems to have stalled. Hence the craving for the simplicity of the Unix
philosophy. Not that the Unix way of working actually solves the problems
posed by software complexity; it just asks us to avoid complexity.

------
bertr4nd
> The first person to create a tool that can pipe a table from a browser into
> a spreadsheet...

Wait a minute. This is called the clipboard. Try it -- you can just copy and
paste the table into a Google Doc and it Just Works. What's missing?

~~~
dgudkov
It's missing the ability to build an automated process. Clipboard operations
are designed to be manual.

~~~
pjmlp
Or called by a script. Problem solved.

------
erikpukinskis
Evernote never even did the one thing well. I would add an item to a list on
my computer, and then later add another one on my phone, and the next day both
sources would be missing one of the edits.

------
nathan_long
Hardware suffers from the same "walled garden" mentality. Why the heck would I
want a "smart TV"? A TV exists to display video. It should have lots of inputs
and maybe some outputs for audio. If I want to get video from a VCR, a flash
drive, or a web service, I should be able to plug in a device to do that. And
unplug it when it becomes obsolete.

------
Twirrim
Then along came systemd....

------
kenko
> I’ve been lamenting the demise of the Unix philosophy: tools should do one
> thing, and do it well.

1983 was a while ago:
[http://harmful.cat-v.org/cat-v/](http://harmful.cat-v.org/cat-v/). That's a
long time to lament something.

------
forgotmypw666
> The first person to create a tool that can pipe a table from a browser into
> a spreadsheet, a Google doc, or even a text file without massive pain will
> be my hero.

Pigshell.com

------
ljk
Everyone wants to be the walled garden
[https://xkcd.com/927/](https://xkcd.com/927/)

------
gopowerranger
"That philosophy was great, but hasn’t survived into the Web age."

As someone who still writes shell scripts (and programs) for my work that do
exactly this, I disagree, as every Unix and BSD coder I know still does these
things. It still serves us very well; far better than the glut of so-called
package managers that pretend they can do better than 'make'.

I chalk a large portion of this up to those creating web pages without any
real programming knowledge, training, or background. Those who only know how
to cut/paste/npm everything they do. These are the same people who think Unix
is old and not modern.

I interviewed with a small company yesterday. The creative director asked me
what tools I knew and spewed out everything but the kitchen sink that they
use. I was aware of all of them but questioned why he needed any of them.

You see, I've been running a web dev company for 11 years and have never found
an advantage in any of it. He asked how we survived without npm or bower,
etc., but when I asked him if he knew how to write a Makefile, he didn't even
know what it was or what it did.

A lot of the tools we use are things we built up over time ... or last week.
Today's "modern" tools may be "instant on" for those who can't write a
Makefile, but that's a fault and not a feature. If you need npm or bower to
manage these things, then what happens when something breaks, goes away, or
becomes unsupported?

I stuck with npm and bower because, when I tried to also write about Angular
and other things, it got too long-winded.

One of my points is, all the tools you need are already built into any
Unix/BSD system, so why look elsewhere? Those who do are only looking for
quick fixes, as I pointed out earlier, and are not interested in the science
behind it. They are creatives who want to build a web site but have no
interest in the technology. They can get it to work, eventually, but "it
works" is good enough for them.

No it's not. That's why smart companies hire mine.

~~~
jacobolus
What does your comment have to do with interoperability between web services?

~~~
gopowerranger
Part of my comment was about his statement that the Unix philosophy is
something of the past, which is false. Parts of the rest of my comment dealt
with what he said about the plethora of "one size fits all" tools people are
using now instead of the simple tools.

Can you not make the connection?

~~~
taco_emoji
The OP was talking about web apps. What do Makefiles and npm have to do with
Evernote or Dropbox?

~~~
zeveb
Why use a web app to take notes when one has vi or emacs?

~~~
JoBrad
Because it's really convenient to be able to make a note on your phone or
tablet, then access it and make additional post-meeting notes shortly
thereafter, on a laptop, all in the same interface (so you have the same
features, or at least a set of common features).

That's just one reason, and it assumes you never want formatting, tables,
pictures, etc.

~~~
TeMPOraL
Oh, Emacs does tables[0] better than anything out there, except maybe MS Excel.
It definitely eats stuff like Evernote or Google Drive for breakfast.

[0] - [http://orgmode.org/](http://orgmode.org/)

~~~
justinhj
I love org-mode, but since Emacs doesn't run on my phone and Evernote does, I
take notes in Evernote when I'm not at my computer.

------
draw_down
People mean a lot of things when they say "one thing". "Do one thing" can mean
"replace all occurrences of one string with another". Or it could mean "browse
the web", which of course isn't really one thing but a thousand things.

~~~
criddell
I've always wondered how Emacs fits with the Unix philosophy. With it you can
replace strings AND browse the web, but also play games, do file management,
email, chat, etc...

~~~
jonjacky
Emacs doesn't really fit with the Unix philosophy. Emacs came from the MIT AI
Lab and ITS, not Bell Labs and Unix - it got included with most Unix
distributions later.

Editors that do fit with the Unix philosophy (they were written by its
original developers) are ed, sam, and acme.

