The divergence between users and programmers became more pronounced over time. When command line interfaces were dominant, they naturally made programmers out of users, even if those users didn't realize it. CLIs made “using the computer” and “programming the computer” effectively the same activity in many cases. A command someone entered to run a program was itself a program. Entering the previous command again and modifying it, for instance to pipe the output of the first program into another program, was also a program. Once the desired result was achieved, that final command could be saved and used again later, or shared with someone to be used as-is or tweaked for their own use case.
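A minimal sketch of that loop in a POSIX shell (the log file and script names here are made up):

    # first attempt: just run a program
    grep ERROR app.log

    # recall it with the up arrow, refine it by piping into other programs
    grep ERROR app.log | sort | uniq -c | sort -rn

    # happy with the result: save it as a program of its own, reuse or share it
    echo 'grep ERROR app.log | sort | uniq -c | sort -rn' > count-errors.sh
    sh count-errors.sh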
Each interaction with a CLI results in a valid program that can be saved, studied, shared, and remixed. That's a powerful model for the same reasons the spreadsheet model is powerful: it's immediate, not modal, and successful interactions can be saved as an artifact and resumed later. Can we do the same things for GUIs? What is the GUI equivalent of pressing the up arrow key in a shell, where I can recall my previous interaction with the system and then modify it? Can I generate artifacts as byproducts from my interactions with a GUI system that I can save for later and share with others?
I've been trying to solve that for a few years now. The closest existing thing I could find in my research was acme from Plan 9, which actually does a pretty good job. The trick it uses is to have you click on a typed command instead of "sending" it, and it stays on your screen after it runs. So you can save several commands in a file, and that's a menu. Or clear the current document and print out several commands from a script, and that's a dynamic menu.
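For instance (a made-up menu, but this is the shape of it): keep a plain text file of commands, and middle-clicking any of them executes it:

    ls
    git status
    mk
    |sort
    <date

If I have acme's conventions right, the last two use its operators: `|cmd` pipes the selection through a command, and `<cmd` replaces it with the command's output.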
I highly recommend reading the paper and trying it out. It's really interesting, and pretty easy to program.
The main problems with it are that it's too text-centered, and the interaction model is weird by modern standards. I feel these are solvable (Plan B's Omero tried, with partial success), but they are hard to solve without integrating the UI and the script into a single process, which feels like cheating. But well. If I ever get around to making a prototype, it will be here on Show HN.
Consider watching the Mother of All Demos. The GUI paradigm we use today was intended for children; the intention was to have much more complex and composable UIs for tech workers.
A lot of early users became programmers or sysadmins because they had to. Cliff Stoll was not the only scientist who suddenly found himself in charge of mainframes and learning everything about them. There wasn't a big pool of CS grads to do those things.
The shift to microcomputers enabled mass ownership by driving prices down and dumbing software down enough for everyone to use it. Except for price outliers like Macintosh and usability nightmares like blue screens.
Phones finally got us to today - ubiquitous, easy-to-use computers in everyone's pockets. Most users are clearly not programmers today. Nor do they want to be. Lumping all humans together as "operators" seems like a category error.
This is exactly what I have been thinking about lately, starting with looking at Oberon. It seems to me that writing a simple GUI should be the same as writing a simple text-oriented script. GUIs have their own challenges, of course. However, doing the GUI equivalent of print() statements to show calculation output is a thing a modern operating system should support, rather than keeping a distance between the user and the graphics system. At the moment it's a pretty vague ideal, but there are cases where I wish there was less friction.
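Existing tools get partway there. For example, zenity lets a script "print" into the GUI (the pipelines themselves are just illustrations):

    # show a computed value in a dialog instead of on stdout
    zenity --info --text="$(du -sh ~/Downloads)"

    # or dump longer output into a scrollable window
    ps aux | zenity --text-info --title="Processes"

It's still a far cry from the script and the GUI being one system, but it shows the shape of the idea.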
edit: I never tried it, but isn't this where Smalltalk comes in?
Stamp collecting is a good outlet for research as leisure. If you're the type of person who falls into wiki holes and likes talking about what you learn with other people, it might be for you.
Many stamp collectors follow the research-as-leisure framework naturally as part of their hobby. The article outlines cultivating curiosity, developing questions, gathering evidence, developing answers, and building communities around that process, which is basically what collectors are doing when they're talking about stamps and other philatelic material. They're sharing discoveries they find interesting, often only after identifying the material, looking up its historical context, drawing parallels to current events, and then formulating some kind of answer or conclusion that makes it worthwhile to share what they've found.
I think the experts in a lot of hobbies engage in this sort of recreational research for the joy of it, but it's closer to the norm for casual collectors. There is a whole wide spectrum of collectors, though, ranging from aesthetics-driven folks who spend more time on thematic album pages than on researching anything, to experts in narrow areas like Transylvanian hotel stamps, who publish whole books about how those stamps weren't valid for postal use but were used by hospitality workers nestled up in the Carpathians to get guests' mail back into the official mail stream, because the state couldn't be bothered to provide service up there. If you get into it you'll find there are a lot of curious and motivated people in the middle who are happy to share what they've been reading about (or listening to, or watching) lately.
Nice work with the guide, the bevy of examples makes it easy to digest.
The colon being used in multiple contexts is tricky. As I was scanning the examples I found postfix `:` doing type conversion, like in `(%)\. {%/Apple/}{`3:}`, and then I was wondering what it does when it has nothing on its left-hand side, like in `[(+)|0 [:1"x]`. Then I noticed that the `[` were unbalanced in the latter example, and eventually figured out that `[:` is its own operator, separate from `:`, and that the middle `[` had nothing to do with function syntax.
GitHub itself used pjax heavily, and I liked those interactions far more than the newer React ones: the HTML was much more semantic, for one, and middle click was always respected.
A big use case for them was as scriptable editors in pipelines, and a lot of vim's commands are inherited from them. The diff(1) utility actually has an option to emit an ed script, allowing files to be patched in ed pipelines: http://www.gnu.org/software/diffutils/manual/html_node/ed-Sc...
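A quick sketch of that workflow (file names are placeholders). `diff -e` emits ed commands, but the script doesn't include a final write, so the classic idiom appends `w` before feeding it to ed:

    diff -e old.txt new.txt > changes.ed
    (cat changes.ed; echo w) | ed -s old.txt   # old.txt now matches new.txt

`ed -s` just suppresses the byte counts ed normally prints.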
It's common for people today to think of ancient Greeks as being very unsophisticated and primitive, but then when you start to read what ancient philosophers wrote you think "Wow, these guys really had things figured out, maybe better than we do now." I kind of feel the same way comparing modern GUI applications to ancient editors like ed.
Learning this lesson often enough made me realize how difficult it is to remember that people in the past were every bit as intelligent as people are now. They simply had fewer, older tools.
There's another related lesson there: progress, like evolution, is progress in a certain direction. No one said that direction is whatever you call "good" at the moment.
Good point. This argument that domestication lowers intelligence (among other powerful abilities), and that we are unquestionably domesticating ourselves, is pretty damn solid. The only argument against it seems to be "... but my ego!".
Obviously there is some interplay with the fact that we develop new mental models and thinking tools to augment intelligence.
Also, immersion as children in highly abstract ways of thinking further augments/multiplies raw intelligence (the most convincing explanation of the Flynn effect, imo).
I've lost the link, but there was an excellent article I read related to the amazing Ötzi discovery (https://en.wikipedia.org/wiki/%C3%96tzi) describing how adults of that era (modern humans, primitive societies) would likely have been terrifying to us now in just how much they outclassed us in raw strength, intelligence, and stamina. We would be relying a lot on the benefits of childhood nutrition and education to feel superior. This isn't completely convincing; there are a lot of factors in play. But those levels of brutal competition and danger would have had a profound effect, especially epigenetically.
Which is why I have been learning and using emacs daily for a few years now, and it's amazing how much more powerful it is than I ever seem to fully understand. I'll think "now I have it all set up like I want it", and then a few months later I learn about a new way to do X.
Especially as a sysadmin, I spend so much time in an SSH CLI that part of my reasoning was "I want to be able to do all my normal tasks without the GUI." Gmail, news, and RSS in Gnus, IRC in ERC, org-mode (loving the export to LaTeX for reports), eww for browsing, and who knows what's next. The Elixir and Go modes are improving too, and my general productivity from staying mostly inside a single ecosystem has really improved.
Another reason I have done this though, is I feel like it's less about gui vs text, and much more about FOSS vs proprietary. In a few years when everyone has an iBrain with Apple (NSA) inside, I intend to have a MEmacs brain that I have control over.
I think RMS will be vindicated in history as a man far ahead of his time.
Very true. I read Plato's Timaeus for a music theory class once and noticed that along the way of explaining his version of music theory, he also--from first principles only--deduced the existence of fundamental particles. He didn't call them protons, neutrons, and electrons, obviously. But he got the basic idea correct: that all the things that exist are fundamentally built from the same things.
But then you remember that ed was designed to be used on what was essentially a typewriter, because dialup was so slow that you couldn't send an entire screen's worth of text at once. So you would basically be keeping the text in your head while inserting and regexing lines.
And the great error handling when it doesn't know what you just told it to do.
?
It's not that they had dialup, or that they couldn't send an entire screen at once (though I'm sure these factors later helped keep ed around). The devices they were using were much more like a typewriter than you seem to think, they literally printed the text out on paper! The programmer didn't need to remember the text in their head, they had the paper right there to look at!
Sort of. While ed was scriptable, it was by sending the commands to stdin, which doesn't lend itself to being composed in a pipeline, where you'd more intuitively expect the text being modified to be on stdin. We can do it without problems now, because shells have process substitution, but ed predates that shell feature by a couple of decades. Instead, to get ed-style editing in pipelines, we ended up with a modified ed that works on streams: `sed`, the Stream EDitor.
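To illustrate the difference (the file name is just an example): ed wants its commands on stdin and a named file to edit, while sed takes the text itself on stdin:

    # ed: commands on stdin, file as argument
    printf '%s\n' 'g/old/s//new/g' w q | ed -s notes.txt

    # sed: text on stdin, command as argument -- composes naturally
    cat notes.txt | sed 's/old/new/g' | grep new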
I like the ideas here, but for long-running processes like file watching, dev servers, hot reloading, etc. a better format is Procfile (https://devcenter.heroku.com/articles/procfile). The ideas from this article could be nicely applied to it.
Procfile is a format that declares a named list of processes to be run, which can be controlled by tools like Foreman (http://ddollar.github.io/foreman/) and Honcho (https://pypi.python.org/pypi/honcho). The advantage is being able to start and stop them concurrently as a group, which is useful for things that otherwise take a tmux session or multiple windows/tabs, like dev server + file watching + live reload: they become a simple `foreman start`. Processes can also be started individually. Procfiles can also be exported to other formats, like systemd, upstart, inittab, etc.
Here's an example Procfile from a web project I've been working on. Since it uses node I went with node tools like http-server and watch, but it could just as easily use any other web server or file watcher. The way it works is it starts a web server serving public/; starts a live reload server for public/; and watches the src/ directory for changes and re-runs make. The makefile has a few rules for compiling JS and CSS from src/ to public/.
web: ./node_modules/.bin/http-server
livereload: ./node_modules/.bin/livereload public
watch: ./node_modules/.bin/watch make src
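In case it helps, typical usage looks something like this (process names come from the Procfile above; the export target path is just an example):

    foreman start            # run all three processes as a group
    foreman start web        # run just the web process
    foreman export systemd /etc/systemd/system   # hand them off to systemd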
An important point of the article was to use standard tools that are installed everywhere, not obscure niche tools. Please note that you yourself felt the need to explain what a Procfile is in the first place.
I wouldn't consider the industry backing of Heroku and a collective 4,500+ GitHub stars between the tools particularly niche or obscure. Crank it up to ~20k stars if you want to count deploy tools that can use them, like Dokku or Flynn. Anyway, the other great thing about Procfile is that it's just a declarative format. Lots of tools can leverage them, and they are an important part of my and many others' workflows.
I also don't think the article is too keen on "standards", judging by it referring to make as a "task launcher" and the suggested usage completely diverging from the expected behavior of the program.
Is it even packaged for Debian/Ubuntu yet? Make has been there for decades and is available on practically every developer machine out there. Compared to Make, nearly every tool is niche and obscure.
> but for long-running processes like file watching, dev servers, hot reloading, etc.
I don't think anybody should use make to do that in the first place. That's not what make was built for. Likewise, Foreman should not be used as a build tool, because it isn't one.
EDIT:
Now I've seen the makefile in the example, and I understand your comment: this is absolutely not where one wants to use make. That's just ridiculous.
Wouldn't "it's not appropriate for the task" be a better reason not to use something than "it's not made for the task"? Don't you have any better reasons at all? Does make bring out people's conservative side or something?
Let me ask you this. Would you sit on a tree stump? How about kill a fly with a newspaper? Sometimes things are great for purposes for which they weren't originally intended.
> Does make bring out people's conservative side or something?
Misuse of tools in software development is why we end up with broken software, useless solutions that solve stupid problems because the problem wasn't well understood in the first place, and, first and foremost, unnecessary dependencies. That's why we end up with this makefile "hack".
Now explain what that's got to do with "conservatism". Bad practices != innovation.
Do you really believe any use of a tool in a way that wasn't intended is a "bad practice"? Is there no more subtlety or thought to it than that? This adherence to an ultra-simplistic black-and-white rule is absolutely a form of conservatism.
If you think this particular use of make is a "bad practice", then argue why it is! If there's no better reason than "This use isn't as intended!" then your opinion won't have much weight with people.
ffmpeg can use streams for both input and output. The trouble comes from the codecs and containers involved, especially on output. Some formats can't be written append-only (the prime example being MP4 + h.264, where the muxer has to seek back to finalize the index), so ffmpeg needs a seekable output device, ruling out streaming output in those cases.
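A sketch of the workarounds (the input name and downstream consumer are placeholders): plain MP4 to a pipe fails, but fragmented MP4 or a streamable container like MPEG-TS works:

    # fails: the mp4 muxer refuses non-seekable output
    ffmpeg -i input.mkv -c:v libx264 -f mp4 - | some-consumer

    # works: fragmented MP4 never needs to seek back
    ffmpeg -i input.mkv -c:v libx264 -movflags frag_keyframe+empty_moov -f mp4 - | some-consumer

    # works: MPEG-TS is designed for streaming
    ffmpeg -i input.mkv -c:v libx264 -f mpegts - | some-consumer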