I hate the idea many programmers have about backwards compatibility: that it's more important than development speed and modern concepts. There is nothing holy about Unix-era software; chances are it's shit, and a lot of it should be thrown out.
Look at Sublime Text: it's got 1% of the features of Vim, yet it's converting Vim users left and right through sheer usability.
We as developers in the Open Source community should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators. (Yes, it gives me shivers just thinking about how much each of those technologies sucks when you think about how good it all could be.)
> ...should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators.
Markdown is better. Microsoft Word is better. They just both lack certain things we need, and LaTeX has those things.
Could you not envision things could be better?
A LaTeX that parses to an in-memory tree so it could be transformed before compilation?
A shell that instead of working on character streams worked on structured and annotated data streams, so it could intelligently interpret what is going on?
An editor that doesn't have at its core the expectation that its output is ASCII character commands controlling a 1970s terminal interface?
Of course you'll stick to using Vim to write LaTeX in a Bash shell; I do too. Doesn't mean I'm not mad about it.
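The structured-stream shell imagined here can at least be sketched. A purely hypothetical toy in Python (all the stage names and record fields are invented): pipeline stages pass annotated records instead of raw bytes, so each stage can interpret its input without re-parsing text.

```python
# Purely hypothetical sketch: pipeline stages pass structured records
# (dicts) instead of raw byte streams, so each stage can interpret its
# input without re-parsing text. All names here are invented.
def ls(entries):
    # a structured "ls": emit annotated records, not formatted lines
    for name, size in entries:
        yield {"name": name, "size": size}

def where(records, predicate):
    # filter on fields instead of grepping substrings
    return (r for r in records if predicate(r))

def sort_by(records, key):
    return sorted(records, key=lambda r: r[key])

files = [("notes.tex", 120), ("draft.tex", 90), ("fig.pdf", 4096)]
result = sort_by(where(ls(files), lambda r: r["name"].endswith(".tex")), "size")
print([r["name"] for r in result])
```

This is essentially what PowerShell does with objects; the open question the comment raises is doing it in a way that stays compatible with Unix conventions.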
2) Markdown and LaTeX are designed for different tasks. Next you'll be telling me that if you have a leafblower, you have no need for a chainsaw.
3) Find me the structured and annotated data streams first - and show me that the structure and annotations are correct - then lament the lack of a shell that can route them to the programs that don't interpret stdin and stdout and stderr that way anyway.
4) I use vim to write latex when I write latex, but "in a bash shell" doesn't make any sense. You fire up vim from bash, yes, but unless you're trying to do an ex-style editing session, you're not editing in the shell any more than you're editing in the kernel or the tty driver...
These improvements aren't all that difficult to imagine, but they don't happen, because, well, why? GP seems to think an obsession over backwards compatibility.
What you're basically advocating is nobody should innovate because things are "good enough". With that sort of attitude we would never have any progress on anything.
Which is not what you're saying I'm advocating.
It's also not "fixing what's not broken", but improving something that could... well... be improved.
Bad example :P
If you want extensibility and still a rather lightweight syntax, try reStructuredText. It has exactly one flaw compared to Markdown, which is the hideous inline link syntax (which practically forces you to use named links). That makes it less suited for forum comments (where you might want to quickly inline 1-2 links), but perfect for books (where you can neatly specify your link targets below the current section).
And yes, there are efforts underway to improve it (LaTeX3) but 2e works so well, there's not been much drive there.
And if you find LaTeX is so hard, there's always LyX...
Well, mostly math and science books. Very few (if any) fancy, well-typeset books by major publishers have been made with LaTeX. I know, because I know that industry. So, while TeX was originally created to be a general typesetting solution, it has been relegated to something math and CS geeks use for their papers.
>And if you find LaTeX is so hard, there's always LyX...
A barely maintained relic of a program that tries too hard to work around the issues of a backend like LaTeX, which wasn't really created with that kind of GUI control in mind.
What do they use, then? (I'm actually pretty curious, because I am writing a dissertation that I currently build with LaTeX, but I find it much nicer to write in Org mode. I'm doing whatever I can to avoid a hard dependency on LaTeX for the backend, and to allow exporting to multiple formats.)
Why do they use whatever typesetting solution(s) they use? Does it actually produce better/nicer output than LaTeX (for non-math text)? Does it just have nicer syntax or friendlier error messages? What's the advantage?
Yes, but a leafblower combined with a chainsaw...
I don't know about you, but that's a tool I could find some uses for.
+1 coffee sputter.
> "LateX is slow, inconsistent and needs to be ran multiple times to give a correct result"
It is not slow, it runs in under a second on most documents I have authored.
I'm not sure what tinco meant by this, but it does in fact return the same results for the same file across multiple runs.
> "Needs to be run multiple times to give the correct result"
This is true for things like references, etc. However, this proves to be a non-issue in practice as you are recompiling so often to view changes that references are always up to date.
> "It's syntax is ugly."
I don't see how learning LaTeX is different from learning any other programming language. I don't think many C++ gurus would call C++ "ugly". Calling a programming language "ugly" is often the last argument you see when someone couldn't come up with a decent one against it.
Slow? I don't know about that, but compared to what? Even in Word it takes some time to reach the "Save as PDF" menu, right?
> I'm not sure what tinco meant by this, but it does in fact return the same results for the same file across multiple runs.
Well, there is some truth to this. There is an odd naming scheme if you look at things like "enumerate", "itemize" and "description" - beginners get confused because they assume the last would be called "describe". The same goes for most of the packages. This is mostly historical, but I think it also makes LaTeX unnecessarily hard for beginners to figure out, right?
It's also the reason why LaTeX needs several runs to finally produce the full document. Nowadays you would do this differently, I guess, since there is simply no need to make the user start the program twice if the application were smart enough to call sub-programs (like biblatex) by itself on the fly. It would still run the same routine several times, but users wouldn't see it, and it would be one command for them to build the document. That's another point beginners tend to go crazy about :)
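That rerun-until-stable loop is exactly what wrapper tools such as latexmk automate: compile, check whether the auxiliary files changed, repeat until nothing moves. A toy model of the loop in Python (compile_once is a stub standing in for real pdflatex/biber invocations, not anything these tools actually expose):

```python
# Toy model of the rerun-until-stable loop a LaTeX build wrapper performs.
# compile_once is a stub standing in for a real pdflatex/biber invocation;
# it returns the new contents of the .aux "file" as a dict.
def compile_once(aux):
    # first pass: record the labels; later passes: nothing new to add
    if "labels" not in aux:
        return {"labels": {"fig:1": "Figure 1"}}
    return dict(aux)

def build(max_passes=5):
    aux, passes = {}, 0
    while passes < max_passes:
        passes += 1
        new_aux = compile_once(aux)
        if new_aux == aux:  # aux unchanged: references are stable, stop
            break
        aux = new_aux
    return passes

print(build())  # 2 passes: one to record labels, one to confirm stability
```

The user types one command; the fixed-point iteration happens behind the scenes, which is exactly the beginner-friendliness the comment asks for.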
Having said that, I should add that I'm using LaTeX (and XeLaTeX, etc.) daily, and there isn't a single program matching its power and beauty - I even design covers for my publications with it. But I'm able to use vim and emacs, right? 90% of people are bloody beginners, and it would be a shame to hide the beauty of a LaTeX document from them.
>> "It's syntax is ugly."
Yeah, well... "ugly" is a bad term to argue about. I like it, and I think the syntax is way cleaner than reStructuredText :D
What is unclean about rST, in your opinion?
It is slow. On my PhD thesis, it took more than one minute at some point, on an SSD. Not very convenient if you are tweaking figures and equations, and want to preview a change quickly.
However, I will agree that using it to address slowness of a build process is just a hack.
Syntax is important, remember he is comparing it to Markdown, not C++. C++ is certainly ugly and should not be used as the basis for any comparison, certainly not for a word processor.
I think he is just pointing out that general theme amongst these tools. They work for the people using them well enough, so they stay that way forever. It isn't that features aren't added or that they stagnate, it is just that thinking outside of the box is impossible.
Definitely, initially and for simpler documents.
But what about using it collaboratively, merging changes and comparing between versions? Especially large documents with a multitude of authors combined later (think journals).
What about documents with programmatically generated content?
What about consistent formatting for your documents?
What about when Word gets confused by some element in the xml not visible to you?
Sure, Word may be easier, but there are limitations to its use.
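The programmatically-generated-content point is where plain text shines: a LaTeX table body can be produced from data with ordinary string manipulation, no document automation API required. A hypothetical Python sketch (the data and column layout are invented):

```python
# Hypothetical sketch: because LaTeX is plain text, document content can
# be generated from data with ordinary string manipulation. The data and
# column layout here are invented.
rows = [("alpha", 1.0), ("beta", 2.5)]
body = "\n".join(f"{name} & {value} \\\\" for name, value in rows)
table = "\\begin{tabular}{lr}\n" + body + "\n\\end{tabular}"
print(table)
```

The generated fragment can be `\input` into a larger document, diffed in version control, and regenerated whenever the data changes.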
I want to draw a conformal mapping diagram in Word, and also write out a 3-vector form of the Navier-Stokes equations.
Also, I want to do a lot of tensor arithmetic with tons of superscripts and subscripts and arrows.
Tell me how any of this is accomplished in Word, please. Saying Word is easier to use is like saying BASIC is easier than C++.
Hell, I just had a postdoc send me figures as part of a manuscript draft where she was using Powerpoint to make the figures instead of something more appropriate like Illustrator. Most people who publish don't care about how their documents are created, they just want them to work and get published. Word does a pretty good job for this.
Use the best tool for the job. Most people would tell you that BASIC is easier to use than C++, if you can write your program in BASIC. And if you don't want to deal with the learning curve of C++ and you can get by with BASIC, then what's wrong with using a tool that works?
I'm sick of the "HN strawman." People on these threads need to read and apply basic critical reading and logical skills.
You don't need to tell ME to use the best tools for the job. I've been telling people that for fucking years. I don't know why your comment annoyed me so much, but it did.
In some fields LaTeX is the easier method to produce documents (good luck with anything more than a simple equation in Word), but those are relatively small niches.
But you can't honestly say that Word and LaTeX really have that big of a difference in intended jobs. It seems to me that they are both in the document production business.
One can be abused to fill the role of the other, and if you look at them both from 1000 feet and squint then both "for" the same thing, but the reality is that when you actually examine what each is designed to do and in which situations each is used, they really have little to do with each other.
LaTeX and Word are for documents - making documents. How they operate is very different; for example, Word includes its own editor, while LaTeX does not. But they are both used for producing documents.
I am not fond of LaTeX. However, for some fields there are simply no dedicated programs, while there are good LaTeX packages. For instance, for my thesis I drew a lot of attribute-value matrices (in the unification grammar/HPSG sense). There are great packages for drawing these structures, both flat and in tree form.
I had some hope that DocBook et al. could replace LaTeX for most uses. It's great in many respects: it's just XML, so easy to machine-interpret. Customising output is easy (via XSL stylesheets). There is good support for producing PDFs (via XSL-FO). And it supports SVG and MathML. There are also great WYSIWYG editors, such as Oxygen. However, that toolchain never became popular, because most people hate XML.
The book that we never got to finish is written in DocBook; the equations are in MathML. The Slackware book that I once wrote is also completely done in DocBook.
I'm a biomedical researcher, and I use LaTeX. It's my preferred method of document preparation. I know many others who use LaTeX as well.
I would say LaTeX can perfectly well be seen as a combination of many aspects of Word and Markdown: it offers the feature set and backwards compatibility that Word offers, while accumulating a legacy, complexity and perceived indeterminism that also plague Word (though in a different way); and on the other side it is a plain-text format that can be version-controlled, taken apart and shared as snippets.
Apples to Oranges? I can compare these Braeburns to those Fujis just fine thank you.
Java and C are both programming languages but it is still apples and oranges to compare them.
I have another question, though: how much have you used Word? I have used it only a couple of times since high school, but it's gotten pretty impressive. I bet that if you really dive into it you could get very close to the power of LaTeX - at least with regard to the typesetting. The drawing of images/formulas/schemas is a whole different ballgame, I'd agree.
Anyway, my point was not that Microsoft Word is in any way a good replacement for LaTeX. I'm just saying that if LaTeX were as well written as Word is, it could have these fancy GUI and fast rendering features too.
There are people out there who think that the power in typesetting that LaTeX has is mutually exclusive with the speed and usability that Word has. That just isn't so.
The only reason LaTeX isn't fast and more easily extensible is that it was written for the machines of 40 years ago, which didn't have enough RAM to hold the parsed tree of a document.
LaTeX could be ten times as fast were it written in Ruby, the slowest language out there, just because a modern programmer would make use of modern hardware and modern design concepts, and simply make it responsive. A GUI could be as simple as an external process that accessed a message-passing API, rendered the document to HTML, and ran it through a WebKit view.
Overhead everywhere, and still ten times faster than LaTeX...
(edit: definitely worth a downvote, this comment :P)
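The message-passing split described above can be sketched in a few lines. This is a purely illustrative Python toy (the event names are invented): an "engine" publishes structured render events, and any frontend - here a trivial HTML one - consumes them. Both sides live in one process only for brevity.

```python
# Illustrative toy of the message-passing split: a core process publishes
# structured render events, and any frontend consumes them. Event names
# are invented; both sides live in one process here for brevity.
import json
import queue

bus = queue.Queue()

def core_render(doc):
    # the "engine" emits structured events instead of terminal bytes
    for i, para in enumerate(doc.split("\n\n")):
        bus.put(json.dumps({"event": "paragraph", "index": i, "text": para}))

def html_frontend():
    # one possible frontend: turn the event stream into HTML
    out = []
    while not bus.empty():
        msg = json.loads(bus.get())
        out.append("<p>" + msg["text"] + "</p>")
    return "".join(out)

core_render("Hello world.\n\nSecond paragraph.")
html = html_frontend()
print(html)
```

In a real system the queue would be a socket or pipe between separate processes, which is roughly the architecture Neovim later adopted for its UI protocol.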
You've written a handful of LaTeX documents, so you're now an expert on what LaTeX is used for and what people want out of it? Would it help to point out that, given LaTeX is more difficult to use than Word, I'd bet most LaTeX users have tried and discarded Word in favor of LaTeX? Would that affect your judgements about the relative use cases?
> There's people out there that think that the power in typesetting that LaTex has is mutual exclusive with the speed and usability that Word has. That just isn't so.
If you built a better system, you'd find customers for it.
I had a long talk about something like this with a close friend last night. He has a favorite technology stack that is regularly trash-talked on sites like this. From what he's seen, it's usually by people who have only a passing knowledge of it: either they worked a job where they only had to mess with it for a few months, or they worked with it for a long time but never went past the surface. Almost all the complaints he's seen about it are either extremely outdated, or fixable with a little custom code or a few configuration changes. I think you should spend a little more time with it before you say that developers should feel "ashamed" for using it. Your post falls squarely into the category he was talking about.
I have a lot of respect for having your own toolchain that's optimized and comfortable for you.
I am sorry I came across wrong; I am not saying that developers should feel ashamed for using it. I am saying developers should feel ashamed that other people are using it.
I'm not saying using LaTeX is dumb - I've stated multiple times in this thread that I use it myself. I'm just saying we could do better.
The things I use LaTeX for are usually my own writing, where I don't have to collaborate with other people. If I have to work with other people I use Word. What I like about LaTeX is that I can have it automatically generate my figures, because I can script it, which I can't do with Word.
I agree that LaTeX can be difficult to write, but you have to realize that Word has its shortcomings too (put an image in and send the document to somebody else, and chances are the layout will get messed up). Tools like pandoc are trying to alleviate both problems so people can work in whatever environment they prefer, which seems like a good solution to me. I could write part of my document in Markdown, convert to LaTeX to write my formulae, and then convert to Word so my coworkers can edit it more easily.
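A sketch of that pandoc round trip. This Python snippet only builds the command lines (the file names are hypothetical); pandoc itself must be installed to actually execute them.

```python
# Sketch of the markdown -> LaTeX -> docx round trip via pandoc.
# This only builds the command lines (file names are hypothetical);
# pandoc must be installed to actually execute them.
import shlex

def pandoc_cmd(src, dest):
    # pandoc infers the formats from the file extensions
    return ["pandoc", src, "-o", dest]

steps = [
    pandoc_cmd("draft.md", "draft.tex"),    # markdown -> LaTeX
    pandoc_cmd("draft.tex", "draft.docx"),  # LaTeX -> Word
]
for cmd in steps:
    print(shlex.join(cmd))
```

Each command could be run with subprocess.run, or simply typed into a shell as printed.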
And you never stopped to investigate why, in spite of the far greater ease of entry that alternatives have?
I'll take LaTeX. It's more reliably reproducible. Once I got the basic format created (for her second book)—which took about as much time as it took to format the first book in Pages—the third book took about a tenth the time. My tooling was in place.
It's not for everyone, and you really have to either dig deep or find a macro set that does what you want (or both—I'm using a macro set on top of scrbook). There is room for something better, but that something better is not Word (or Pages).
(I'd almost say that what LeanPub does is good enough, but their LaTeX typography sucked when I tried it almost two years ago.)
I find that very unlikely.
Are you talking about InDesign? Exact placement of anything is a real pain in LaTeX, choosing "normal" fonts requires you to switch to a different compiler, and the debugging information on all those automatic choices TeX and LaTeX make for you is abysmal. What I'm saying is that even though LaTeX allows you to do precision work, that does not mean it is good at it.
And that is almost the whole point. Most people are idiots about how to present written material, so TeX and friends make those decisions for us. It's especially frustrating when you have to dink with floats, because now you're second-guessing TeX, and it exacts a high price.
I've used TeX for over 25 years and I'm comfortable saying that I could produce documents of a similar quality, with better support for collaboration, with Word.
I still use TeX for many projects but the Word style capability is great.
Markdown still sucks, though. :)
How? This is an honest question. I wouldn't use Word anyway, but I am just not aware of anything that could come close to Git + LaTeX in terms of collaboration support.
I've been doing projects with people who have no technical background to speak of, but immense domain knowledge. They wrote their theses in word (god help them), published in journals that required submissions in word (there are more and more of the things), and include a lot of figures in their papers or grant proposals.
SharePoint helps a lot with collaboration, but even Word's basic change tracking and commenting facilities are good for people working on a document together via email, Dropbox, or even git.
Word still creates those weird situations for which it is infamous, but they are generally rare. Also, Word has become much more stable, even with larger documents.
Word: explicit WYSIWYG control over typesetting.
LaTeX: semantic level control over typesetting. Additionally provides explicit low level typesetting commands.
Markdown: given how basic (trivial even) Markdown is (http://daringfireball.net/projects/markdown/syntax) I'm not sure why it's even in this list.
I guess I don't really have a point. Just that if you are writing essays in the humanities or sociology then markdown can have a place in your workflow
Word nowadays has semantic typesetting controls too; I think since they started that ribbon thing it has become usable. If you stick to their semantic styles, I think you can do a pretty good job of properly typesetting in Word.
When I was finished writing, I exported the markdown to RTF, then 1) selected all the h1's via "select all" and set them to be a real (Word) h1, and likewise with all the other styles I used; 2) put in all the pictures; 3) exported the reference list from Mendeley.
That process - from markdown to perfection - took me less than 4 hours.
I have some experience with LaTeX, and especially LaTeX + KOMA-Script, but I hated that experience: the LaTeX source code doesn't embrace readability, and you're sure to get obscure errors at the last second - besides stuff like running LaTeX twice for a table of contents.
\item really long line which gets wrangled in my vim
terminal, so that i don't see where the second item starts.
\item another long line. maybe 200 characters long?
* really long line which gets wrangled as well, but is
better to read because it starts with a clear sign which
is recognized to be a bullet.
* next line, which is also very long.
The older proofreaders could proofread it in the final stage in ms word with the track changes functionality.
I put a lot of thought into design and typography, and I was sure that I could decide better than LaTeX how my document should look.
I agree, that for some people this is not the case.
After a draft or two, transitioning to LaTeX is a must. [Pandoc](http://johnmacfarlane.net/pandoc/) for the win!
Actually the plan was to typeset it in InDesign, but when I woke up on the day of the final layout I just decided to do it in Word.
I'll just pretend it's a parody.
This is what most people do in physics and astronomy. You make the plot in some kind of plotting package and export it to a vector format. In astronomy, the journals only recently started accepting PDF figures in addition to EPS.
In my experience doing all of my figures in LaTeX is significantly more painful than using a real plotting package like matplotlib.
What about tables? Granted, they aren't fun to make, but I am not aware of any methods of creating them outside of a LaTeX document.
The point isn't whether Markdown has enough features, but rather that LaTeX makes you pay a price for those features that you shouldn't be paying, just because it's 40-year-old software.
The best thing about markdown is pandoc.
Also, even though Markdown tends to be fairly easy to write, there are many edge cases where you can't really be sure what the result is unless you try it.
LuaLaTeX is a reimplementation of pdfLaTeX on that base (the LuaTeX engine).
ConTeXt is a new syntax/concept built on the same engine (one that doesn't try to be compatible with LaTeX and its quirks).
That shell is, unfortunately, MS PowerShell. There are several approaches to doing it in a way more familiar to and compatible with Unix shells (some of which have been started recently, such as Final Term, and some of which are vaporware, such as TermKit).
And most editors are like that, some of which even have an optional vim mode for fans of that control scheme (my favourite is Kate)
OneNote has "insert Ink Equation" (I don't know if it uses the same engine, or has its own).
I'm not convinced it's worse, but I'm not convinced it's better for the task of being a shell either. It's totally, unequivocally, way the hell better when you're trying to build anything large out of it, but keeping the interfaces down at byte streams gives you a huge ability to reformulate things so they fit together.
What do you mean by "know what a directory is"?
That comparison right there, between systems in entirely different categories, tells me you have no idea about what you're saying.
You only need to run it multiple times if you use features that require it, and that is because one pass generates code to be read in a second pass.
The syntax is not ugly. It's a macro programming language.
If you don't like using vim (or emacs with AUCTeX) to write it, look at LaTeXila - it's pretty good.
If on Windows, notepad++ is also free and has excellent support for LaTeX.
Then you try command-line Unix for the first time and nothing works anymore. You have to unlearn the habits of a lifetime before you can do something as simple as edit text. You can't even remap them effectively, because most of them do something else in vim or bash that you'd have to move elsewhere.
This is one of those problems that is just completely invisible to old Unix hands. It took me years to stop writing code in Notepad, because using vim felt like trying to drive a car with the gas and brake reversed, and someone punching me in the face every time I reach for the turn signal.
I'm not saying Unix conventions are objectively worse than Mac/Windows ones. Once you've got them trained into muscle memory, they're just as effective as anything else. But that need to retrain is a pointless roadblock in the path of every would-be hacker.
But seriously, this is a complaint I hear from undergrads in CS all the time. My answer is the same every single time: stop whining and get over it. You're a professional now, you use professional tools and sometimes they work differently, that's just the price of admission. Nobody whines that driving an 18-wheeler is different from driving a compact car, they just suck it up and train for a CDL.
To play devil's advocate to myself, yes, obviously just switching everything over to GUI standards overnight would cause all sorts of problems. Tradition is not inherently valuable, but standardization is. But for fuck's sake have some basic human empathy. In the context of an individual career, having to unlearn and relearn how to edit text is a pointless speedbump that gets in the way of learning actual skills. Teach students what they need to learn to survive in the world, but recognize that their complaints are valid.
So yes, there are differences, but the *nix people have gone to great lengths to bridge the gap, to the point that I have zero sympathy for people who can't make the switch. (Note that I'm not saying everyone needs to use *nix - that's a totally different debate - just that anyone who wants to should be able to.)
Now, the OS X Terminal (and programs run through it, including vim) does support Cmd-C (after highlighting with the mouse), Cmd-V, and Cmd-A, because the OS can treat them as raw text without the terminal knowing anything about them. It doesn't support Cmd-X, Cmd-Z, or even basic navigation keys like Home and End.
" Make it behave more like a (Windows) GUI editor
People have made fancy new shells built around things other than bytes through pipes, people have created replacements for terminal emulators, and people have created fancy GUI document creation systems with instant visual feedback (a.k.a. word processors). Real shells, real terminal emulators, and LaTeX are all still used despite these would-be replacements.
Habits die very slowly.
I don't think this is something that follows age lines. Whether or not somebody primarily has a background in the FOSS world or in the Windows universe is probably a much stronger indication of what sort of tools they use.
Before my first UNIX contact, via Xenix in 1994, I was already fully comfortable with the GUI world of the Amiga and later of PCs, including alternative GUI environments like Smalltalk and Oberon.
So, although I do master the CLI, I'd rather spend my time in the cosy GUI world.
Org-mode has exporters for LaTeX, PDF (via LaTeX), HTML, DocBook, PDF (via DocBook), various flavors of plain text, OpenDocument text (which can get you to Word if you want to go there), all of which work quite well; if you need to supply LaTeX source as input to some external pipeline, Org makes it possible for you to do so. On the other hand, if you prefer to spend as little time as possible dealing with raw LaTeX, which in my experience seems like a sensible attitude to take, Org makes that possible as well. And on the third hand, if you need to write LaTeX source directly for complex diagrams and such, or if you want to override the way Org generates LaTeX in specific cases but let it handle the rest of a document on its own, Org doesn't get in your way there, either.
You can also include code snippets in an Org-mode file, of arbitrary length and in arbitrary languages. If they're in a language Emacs understands, you can have it render them with syntax highlighting, but that's just the tip of the iceberg; you can also have Org-mode execute those snippets, whether ad hoc or as part of the compilation process, and it can incorporate their results into the document verbatim or after a variety of transformations, and also use those results as input to other code snippets in the same document. This is why the literate programming folks love Org-mode as much as they do; there's nothing quite like it for interweaving text and code and then being easily able to do interesting things with both.
And, although Emacs is by far the most powerful editor for Org-mode files, there's nothing saying you have to use Emacs to edit Org-mode files. You can, for example, use Vim to write your Org-mode source, probably with some degree of highlighting via some vimscript file or other. You can then invoke Emacs in batch mode to compile your Org source into whatever format you want to come out the other end, without ever having to see or deal with the Emacs editor interface, and without having to involve yourself save on the most superficial level with Emacs Lisp. (I'll be happy to provide the general form of such an invocation, if you're interested; let me know.)
For the one percent (perhaps) of the documents which really end up on paper, I guess LaTeX is fine. Or Word. Who cares?
Both golang and D have a reasonable claim to being C's successor. People often think of D being a nextgen C++, but to me it has much more the feel of a C with garbage collection and batteries included.
And golang definitely feels like a modern C, at least to me. And that's what Rob Pike had in mind, as well.
PS: Had to use Word every day for years for my job. I despise it, and love vim. Vim keyboard navigation would have made my life so much easier when working on Word, and saved me several severe episodes of carpal tunnel. Luckily, I no longer have to use Word.
TIL that these are required for text editors and related tools.
- Go's unsafe package offered the same capabilities as Oberon's SYSTEM
- It offered a bit more control over when GC takes place
Any AOT native compiled language can be used in place of C for user space code. There is hardly any C feature essential for such type of applications.
C can be relegated for kernel space until something better gains more market share.
I don't use Vim much now because I work a lot in Windows, and Vim got me into the habit of hitting escape when I was through entering text, but escape cancels the input in a lot of programs. However, I still use it in the shell or when I need to make certain global edits.
I wish Neovim great success, and I might use it someday if it meets my needs, but not just because it's new technology.
Look at the recent HTTP redirect article, for example. Something as simple as redirects has been implemented incorrectly for a long time. Browser vendors are well aware of it, but they cannot change the behavior because it would break every existing site that expects the broken behavior.
You should be more impressed with the things that last 20 years, not embarrassed. It means they were actually engineered well enough to be a good general solution.
Another possible way to interpret the 'ashamed' statement is that tinco is simply saying we should be trying harder to move forward when we are using such old technologies with such warts. Why can't we get rid of the warts? We should try harder. Perhaps this is what tinco is saying? I think that's at least as plausible as the naive position you're projecting onto tinco: that it's 'simple' to replace these things.
Kudos to Neovim for making the effort! It is certainly appreciated.
Does ^[ cancel input in any programs? AFAIK vim understands ^[, not the "escape key"; the key labeled "Esc" producing a ^[ is a convention used in terminal emulators.
Not that it's going to help you now, but if you had trained your fingers to hit ^[ rather than the Escape key (or even ^C, though that has other semantics in Windows, whereas I don't think ^[ does), I wonder whether you would be having the same problem.
Vim understands "escape" to be 0x1B, pressed by itself (not as part of a longer escape sequence).
And the ESC key has been generating ASCII 27 on keyboards for a long time. It's definitely not just terminal emulators.
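Easy to check: terminals compute Ctrl-&lt;key&gt; by masking the character down to its low five bits, so Ctrl-[ and the Esc key arrive as the very same byte, ASCII 27 (0x1B). A quick Python illustration:

```python
# Terminals compute Ctrl-<key> by masking the character to its low five
# bits, so Ctrl-[ and the Esc key both arrive as byte 0x1B (ASCII 27);
# a program on the receiving end cannot tell them apart.
ctrl_bracket = ord("[") & 0x1F  # what Ctrl-[ sends
esc_key = 0x1B                  # what the Esc key sends
print(ctrl_bracket, esc_key)
```

This is also why vim cannot distinguish a pressed Esc key from a typed Ctrl-[.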
Some people discourage the use of the Esc key and prefer Ctrl-C instead. I'd recommend disabling it and forcing yourself to use the Ctrl-C combo. I did, and completely forgot about the Esc key (actually I find it more comfortable).
imap jk <Esc>
What?! You can take my vim+bash+tmux+urxvt when you pry it from my cold, dead hands!
>not on /g/
You're conflating usability with "does not need a manual". They're not the same thing at all. Moreover, I reject entirely your assertion that vim is not usable.
vim is amazingly usable; its text manipulation commands, for example, all follow a common structure: [operator][count][motion]. That means the rules you learn for "yank" apply just as well to "delete" or "change" or whatever. Any time you encounter a new operator you don't even need to wonder how to use it; it's a sure thing it will follow the same structure you already know.
If by usable we mean "one can be productive in it" then yes that's correct, vim is immensely usable and I've been using it in that sense for six years.
That however is a facile definition of what most people typically mean when we speak of 'usability'.
A tool can be useful while only being barely usable at all. You can do wonderful things with modern airplanes, but most models require painstaking attention and study.
I'm personally of the opinion that we can have a terse input command language while also exposing the rest of the functionality in a way that doesn't punish all users.
I spent all of university using vim purely as a way to indent and colour my code; but intelligently using splits, buffers, tabs, code folding, ctags etc should not require luck to discover and exacting patience to use.
Let's not even get started on the totally inscrutable vimscript. How many more useful things could we have if I didn't have to depend on @tpope for 90% of my plugins?
For many years vim trudged along because it was the only game in town besides emacs. Now, I only continue to use it because of sunk costs.
So yeah, nerds have to get over their sacred knowledge. Just because you somehow enjoyed reading the :help pages (which… require you to know how to use vim to read!) doesn't mean we have to force everyone to do that.
Or we could all move to Sublime.
Usable means an interface which is (i) consistent and (ii) well suited to the task at hand. It does not mean you should not require any training nor that you should be able to intuit the functionality without looking at the manual.
vim is an example of software that is both highly usable and intended for use by advanced users.
> I spent all of university using vim purely as a way to indent and colour my code; but intelligently using splits, buffers, tabs, code folding, ctags etc should not require luck to discover and exacting patience to use.
You're doing it wrong. Read the bloody manual.
No it doesn't. "Usable" as in usability is defined in ISO standard 9241-110 by seven principles; a summary can be found here: http://www.userfocus.co.uk/resources/iso9241/part110.html
"suitability for the task (the dialogue should be suitable for the user’s task and skill level); self-descriptiveness (the dialogue should make it clear what the user should do next); controllability (the user should be able to control the pace and sequence of the interaction); conformity with user expectations (it should be consistent); error tolerance (the dialogue should be forgiving); suitability for individualisation (the dialogue should be able to be customised to suit the user); and suitability for learning (the dialogue should support learning)."
vim falls short in discoverability and, arguably, in conformity with user expectations and suitability for learning.
Though I don't want to further spawn a vim discussion. Just to show you that you don't seem to know the necessary vocabulary to tell your parent commenter that he is wrong.
> You're doing it wrong. Read the bloody manual.
Yeah. Way to prove his point.
vim fits very well all the criteria you just cited. Moreover, "discoverability", "user expectation" and "suitability for learning" appear to be buzzwords that you just threw in on your own at the end.
Even a terminal window allows you to type "help" to get something to work with. Vim instead interprets "help" as the commands h (left), e (end of word), l (right), p (paste after), which in most cases is nearly a no-op.
I believe what he is trying to say is that it would make sense to figure out how to make the initial experience less daunting without giving up the power it provides.
type :help<Enter> or <F1> for on-line help
And surely that is why all the text-entering applications in the world have followed vi's lead.
Saying it's converting vim users left and right is a little far-fetched. Most of those users never had any real vim muscle memory to begin with. A lot of "vim" users I saw work just fine without motion commands. The only reason they used it was that they could use it anywhere. Whether it's modal or not is totally irrelevant to them.
Wait, what? Vim has been in use for a lot longer than that (it was released in 1991), and vi even longer (1976).
"Easy to learn and use" does not mean "I don't have to learn anything". In particular, vim is easy once you learn something about how it works with text. For example: "d" is used to delete text, as part of a command sequence with a consistent structure.
One of the first things you learn for example is that "dw" deletes to the end of the current word. d5w deletes to the end of the next 5 words. "yw" copies to the end of the current word. "y5w" copies 5 words.
Later you might learn about cursor movement. You find out that "j" for example moves the cursor down one line. Guess what? These cursor re-positioning commands fit into the command structure you've already learned. So you can easily intuit that "d5j" will kill everything to the end of the next 5 lines.
Later in your vim journey you might learn that "D" is shorthand for delete to the end of the current line. Hmm, but there is also this "y" command that yanks text. Guess what? "Y" yanks to the end of the line!
That's what usability looks like. You take what you've learned and apply it in a new context in order to produce expected and consistent results.
In short: vim is amazing. It's consistent, intuitive and immensely usable -- but only after you invest some time learning how to use it. The only time vim is not intuitive is when your expectations are that it will behave somewhat like TextEdit or Notepad or whatever other lesser text manipulation tools are out there.
"yeah, what's up?"
"Nagios is paging, something with that code you pushed live a couple hours ago is throwing a fit... looks like half the web cluster is on the verge of going to swap, can you look into it? asap!"
"Well, I'm at the bus stop and won't be near a computer for at least an hour... yeah, I'll sort it out, will text you when we can push the fix live."
pulls out Note 3, shitty 3G coverage
connects to VPN
ssh into dev env with connectbot
switches to hackerkeyboard for input mode
screen -dr dev
Ctrl+C the tail, join IRC channel that dumps syslogs from production cluster
git stash, git pull, git stash apply, git commit, git push
sends text message "good to go!"
deal with it.
git commit -m "Always commit your changes first!"; git pull --rebase; git push
I like the syslog into irc thing, that's neat.
By the way, I recently made a commit in the Github web interface while sitting on the toilet. No posix stack required ;)
This whole "it's apples to oranges" claim is bullshit. Both Word and LaTeX can be used for anything from writing a resume, to making a technical paper with a few simple equations, to writing an entire vector calculus or linear algebra textbook. When you have two options for doing the same task, it's very much comparing apples to apples.
Yes, they each were originally built with different objectives and different target markets, but they have since aspired to be usable for all the same things. Of course, they each have their strengths and weaknesses.
But I don't get how a bunch of folks here feel justified in arguing against someone saying "there's room for improvement" with clear, viable suggestions on the details too. I know that hating on MS Word (or MS-anything) is fashionable on HN. But you can't seriously be defending
* the steep learning curve: throw your 14-year-old who's doing a math assignment Word vs your favourite LaTeX editor, see what happens;
* lack of WYSIWYG-like feedback-loops: you have to wait a full minute on decently sized documents to see the result of adding that equation that took you 10 seconds to add;
* syntax holdups: you miss an underscore and your document is broken. All the way through. And can take an hour to fix even if you're skilled. People writing documents that need LaTeX's power aren't always coders, and debugging is not a fun or planned-for activity for anyone.
I also don't for the life of me see why someone has to "be an expert, not just written a couple documents casually in undergrad" to be able to comment on the flaws of a product. They're as much a user as a power user, and obviously the power users went through that stage too. Losing sight of your past difficulties (or being gifted in LaTeX) doesn't make them irrelevant for people not in your current position.
Loosely relevant: I smell hardcore "hacker entitlement" around people using LaTeX and defending it including all its stupid barriers to easy document-creation tooth and nail. Hardcore programmers do this with way too many things (like Linux, and development environments for almost any language). And it's just really frustrating for a lot of people to deal with =/
PS: NeoVim sounds like someone finally got off their high horse (or from whining) and decided to actually make something better. Thanks!
I'm sorry, but I just have no idea what you're talking about. I haven't encountered anything like that - not when I was starting out with LaTeX, not now when I use it for writing technical docs. "An hour to fix" smells like pure FUD to me - any mistake can take an hour, a minute, or even a week to fix. However, I can't imagine a syntax error like missing an underscore - especially one like that - could take more than a few minutes at the very most.
>PS: NeoVim sounds like someone finally got off their high horse (or from whining) and decided to actually make something better. Thanks!
I think you're a bit confused as to what exactly the "hardcore entitled hackers" are defending. Lots of systems in the UNIX ecosystem are bogged down by backward compatibility, are arcane, or are just plainly in need of a rewrite. Nobody disputes that. The commenters here, however, attack them from a completely different position, and with arguments which mostly don't hold water, so expect them to be defended.
Hackers aren't the kind of people who would cling to their tools despite them having obvious and easily mitigable flaws. After all, this post did gain a lot of traction, at this point mostly by simply presenting what would be great to do.
So can HTML. Would you compare Word to HTML? You can also write your resume in Notepad. Does that make Word comparable to Notepad? If you want to compare anything, compare LyX to Word. At least that makes some sense, even if the output of the latter just sucks in comparison to the former.
TeX is at its core a typesetting engine. Word is largely a GUI markup editor. It doesn't even do anything coming close to typesetting.
> * the steep learning curve: throw your 14-year-old who's doing a math assignment Word vs your favourite LaTeX editor, see what happens;
Are you sure? Customizing LaTeX can be hard, but using it in a straightforward way is no more difficult than HTML. Sure, it may be a bit more work than using a GUI, but once you get the hang of it it's a lot quicker than Word's formula editor.
> * lack of WYSIWYG-like feedback-loops: you have to wait a full minute on decently sized documents to see the result of adding that equation that took you 10 seconds to add;
In my experience it's a lot faster than that. The reason for this is simply that TeX does actual typesetting. This has a certain computational complexity, and cannot just be done on a line-by-line basis. Word is fast because it doesn't do anything like that. Again, Lyx has WYSIWYM (What you see is what you mean), which gives you an approximation of the output instantly, minus the proper typesetting.
> * syntax holdups: you miss an underscore and your document is broken. All the way through. And can take an hour to fix even if you're skilled. People writing documents that need LaTeX's power aren't always coders, and debugging is not a fun or planned-for activity for anyone.
It more commonly takes a minute, not an hour. Ever had Word corrupt a file and mess up the styles in the middle of your 150-page document? That's not exactly a quick fix either.
To sum up: If you want a WYSIWYM editor for TeX use Lyx. Otherwise, comparing LaTeX and Word makes about as much sense as comparing HTML and Word.
I don't mean to be difficult, but lots of laypeople actually compare HTML and Word for a common purpose, usually for making simple websites. Yes, people actually do generate their front-end page formatting using Word's HTML export. Again in that context, it's technically comparable (or worth comparing, rather). And they each have their flaws. I have used HTML/CSS for lots of documentation, and experimented with it for styling my resume even though those can be classically considered Word's domain. So I actually think comparing HTML and Word is really not that strange for a non-zero number of contexts. Typesetting via "formatting" is a pretty big part of the Word experience, and is a totally worthwhile context in which to compare it to LaTeX.
> Customizing LaTeX can be hard, but using it in a straightforward way is no more difficult than HTML.
I used LaTeX for every document I created in undergrad, but mostly because I wound up spending hours the first week of every course tuning a document to match the lab report/homework format, and there was a legacy of standards passed down from upperclassmen (ie plenty of nice handholding). After I left college, I've pretty much never wanted to open a LaTeX editor (or Word really, for that matter, because I prefer Google Docs).
I would bet money most middle schoolers, if given a blank LaTeX editor page and a blank page in Word/GDocs to write just a bunch of math formula (some fractions, arithmetic symbols and some exponents maybe?) would give you something better and sooner with the latter.
> In my experience it's a lot faster than that.
This unfortunately wasn't true for me, or for a lot of other people. Maybe large amounts of images, or the wrong image formats, or numerous other causes can make exports slow. I don't know the details of why, but it was there, and it sucked. Maybe a side-by-side editor+constant re-render in a custom output format (that's optimized for TeX rendering way faster than making a pdf), would be pretty cool, for instance. I don't need to export to pdf after every tiny edit after all.
> It more commonly takes a minute, not an hour.
I'll completely admit I got a little overdramatic by that point =]. Sure, it's more commonly a minute, but it's still full of much more "Badness 10K" than I ever had Word throw an Error dialog at me. And in my (weak) defense, I have wound up spending an hour fixing a LaTeX bug from a missed closing brace, and I couldn't create the document. It would have helped if I could have noticed it the moment it happened (as opposed to after my next recompile, which took 30 seconds). I feel constant debugging shouldn't really have a place in document-writing. Perhaps better error tolerance (a la HTML) and/or debugging would be nice.
To sum up: I'm not trying to put down LaTeX. But it has shortcomings, which can be addressed. I also wouldn't suggest making LaTeX function like other WYSIWYG/WYSIWYM editors, but rather attempting to address the root problem, i.e., the feedback-loop times. LyX tried to do that, but WYSIWYM wasn't the right answer (for all the people who still use regular LaTeX editors instead).
Well, I actually don't even write documents very much anymore so maybe I'm pretty disconnected from the whole ecosystem for a while, but from what I gather the tools haven't really evolved much at all in the last 4 years.
All the additional commands can be learned on the fly, if you have a reference at hand. I'm sure there are commands in Word for creating a title page, a table of contents, a bibliography, an index, correctly numbering equations and updating references if you add additional ones, but once I discovered Latex I have never looked back and have not used Word in a long time.
Let's deconstruct each of the pieces we are supposed to be ashamed about, shall we.
LaTeX the language: a wrapper language built on top of TeX for typesetting documents. The idea being to specify higher level document logic (for instance sections, subsections, paragraphs, tables) and if all else fails, then fallback to explicit placement commands.
I fail to see anything to be ashamed of in the idea. And, as for your complaints about extensibility in some of the comments below: maybe you should look at the texlive distribution or PGF/TikZ here: http://www.texample.net/tikz/
LaTeX/TeX compiler: that's a fair point. But that doesn't have much to do with the language.
vim: a text editor wrapped around using the keyboard efficiently, similar in idea to LaTeX the language in that it asks you to express yourself on a higher plane. Neovim is essentially a more maintainable repackaging of the same model keeping extendability in mind.
bash: a wrapper over the early Unix model: pipes, ttys, processes (and not threads), etc. The bash scripting language is not all it could be. But, as I said before, the basic model is fairly tied up with Unix itself and is similar in zsh/fish etc.
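As a concrete illustration of the LaTeX model described above (a minimal, generic sketch, not taken from any particular document): sections, numbering, and equation layout are specified logically, and the engine handles placement.

```latex
\documentclass{article}
\begin{document}

\section{Introduction}  % numbered and styled by the document class
Some text with inline math, $e^{i\pi} + 1 = 0$.

\subsection{Details}
Displayed equations are typeset, numbered, and referenceable:
\begin{equation}
  \int_0^1 x^2 \, dx = \frac{1}{3}
  \label{eq:cube}
\end{equation}
Explicit placement commands exist, but only as a fallback.

\end{document}
```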
[TikZ] is a perfect example of TeX's missed potential: it is an outstandingly well designed DSL for diagram creation, embedded in an insane macro language. The moment you try to do anything nontrivial that exploits the programmatic (rather than purely declarative) nature of TikZ you immediately run into wall after wall trying to express basic programming concepts in the host language, TeX.
For example, two common patterns I have seen frequently arise as perfect uses for a programmatic diagram description:
- using computed coordinates and transforms to construct complex paths and diagram layouts from the composition of basic geometric reasoning;
- using abstraction to tersely encapsulate common visual components, both for simple iteration over many similar components, or for higher-order encapsulation of parameterized diagram logic.
In both cases, you quickly run face first into fundamental limitations of TeX as a programming language. In particular, it is extremely painful to use for either arithmetic or control flow. This is a big enough deal in practice that core among TikZ's features are custom inline arithmetic syntax for coordinate computation, color blending, etc., and a custom `\foreach` macro, both defined not in the language, but provided as part of a specialized diagraming library, because they are fundamentally at odds with the design of the core TeX language. Even this impressive bit of TeX engineering still breaks down as soon as you try to do much more than iterate over a hard-coded constant range. (How much do you need to bracket, `\relax`, etc. the operands of your `\foreach` if it is used inside of a macro definition? Or if they are the result of computation, even as simple as basic index evaluation like `\x+1`? Now what if you want to iterate over a range whose bounds are computed in part using floating point arithmetic from one or another library?)
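To make that concrete, here is a hedged sketch (assuming a standard TeX distribution with the tikz package; the specific nodes and styles are illustrative, not from the original comment) of how quickly `\foreach` pushes you toward pgfmath helpers:

```latex
\documentclass{standalone}
\usepackage{tikz}
\begin{document}
\begin{tikzpicture}
  % Iterating over a hard-coded range is pleasant:
  \foreach \x in {0,...,4} {
    \node[circle, draw] at (\x, 0) {\x};
  }
  % But the moment a bound is computed rather than literal, you reach
  % for pgfmath and start worrying about expansion:
  \pgfmathtruncatemacro{\n}{2 + 2}
  \foreach \x in {0,...,\n} {
    \node at (\x, 1) {\x};
  }
\end{tikzpicture}
\end{document}
```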
TeX suffers because it is at once both a relatively awkward markup language (relative to something like Markdown) and an extremely awkward programming language. Many tasks which would be trivial in any mainstream programming language are outrageously challenging and arcane in TeX. The LuaTeX effort to embed a sane, modern programming language much more centrally into the core of the TeX runtime is a fruitful direction, but it still does little to better formalize or structure the various levels of document representation for programmatic transformation: we still have a giant stack of TeX macro complexity, ultimately expanding down to very low-level page rendering descriptions, just now with the ability to register Lua callbacks at various points along the way, or use Lua scripts to generate new tokens during the expansion process.
As a clever, minimalist hack, the single-pass macro expansion semantics at the core of the language were a great way for one man to bootstrap a complex typesetting system. As an expedient hack to build an incrementally more humane document generation system atop the powerful low-level typesetting engine already available in TeX, LaTeX got a lot of mileage for relatively little cost. As an intermediate representation for modern typesetting systems, TeX could be a reasonable higher-level alternative to something like PostScript (which few bother to write by hand, but is a great low-level page description language, particularly as a target for machine generation). But as either a primary programming language or a human-facing markup language, TeX is a terrible fit. Worse, the extreme difficulty of doing anything parametric in TeX makes it a bad fit for a future of many display formats and adaptive (responsive) layout, where tools focused on baking a single paginated output format are less and less relevant.
The evidence for how different the world could be is not given by plain Markdown for comment boxes—that is, indeed, an apples to oranges comparison—but by the power that comes from creating an extensible and programmable document transformation system based on a well-defined document grammar (which is, in fact, a recursive data type, not tied to any one front-end syntax). That's the heart of [Pandoc]. It's not just "extended Markdown" or "a markup format converter," it's a powerful framework for document transformation, which naturally supports humane markup, while also allowing extreme extensibility via general-purpose programming languages. See the [scripting] documentation to get a sense for the model of extensibility, then recognize that this same document representation is the core of every translation and transformation Pandoc does, and how this model enables many of the same things as TeX's macros, but in a far more structured yet also programming-friendly way.
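The tree-walking model behind this can be sketched in a few lines (stdlib only; this is not Pandoc's actual filter API, just the shape of the idea, with the node format loosely modeled on Pandoc's JSON output): the document is a tree of typed nodes, and a "filter" is any function mapped over that tree before rendering.

```python
def walk(node, action):
    """Recursively apply `action` to every dict node in the tree."""
    if isinstance(node, dict):
        node = action(node)
        return {key: walk(value, action) for key, value in node.items()}
    if isinstance(node, list):
        return [walk(item, action) for item in node]
    return node  # leaf: string, number, etc.

def upcase_str(node):
    """Example transformation: uppercase every plain-text node."""
    if node.get("t") == "Str":
        return {"t": "Str", "c": node["c"].upper()}
    return node

# A toy document tree: one paragraph containing one text node.
doc = {"blocks": [{"t": "Para", "c": [{"t": "Str", "c": "hello"}]}]}
result = walk(doc, upcase_str)
```

Because every front-end syntax parses to the same kind of tree, a transformation written once applies to Markdown, LaTeX, HTML, or anything else the system can read.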
The thing about compatibility is that it doesn't matter how great your new and shiny is if it can't talk to old things at all. Imagine the best new text editor in the world, but it's completely incompatible with every file format that's ever been written previously, even .txt! No one would use it, because the cost of switching is too high.
In some cases, it's definitely worthwhile to have some development pain to keep that around. Stuff like keeping vimscript (and therefore the entire ecosystem around vim) functioning means that neovim could be a drop-in for some people, and HAS to be preserved at least partially. Stuff like keeping ancient Amiga support or integration with Sun workshop? Definitely not worth it anymore.
I agree with almost everything--except for backwards compatibility when it comes to throwing stuff out. There is definite value in it, especially when we're talking about something as old and widely used as vim!
Just think about it. What happens when we decide a certain core feature of Vim is archaic, and could be implemented in a more modern, more flexible, more loosely coupled way?
First, every vim plugin would break. But if you look at how many Vim plugins are actually used, I would be surprised if the top 100 Vim plugins covered less than 99.99% of all Vim users. That means we only have to upgrade 100 Vim plugins, and probably just a couple of lines if they're written well.
How much technical debt have we accrued now, that in reality could be paid off just by biting the bullet and going through a hundred pieces of software and changing a couple of lines?
This is open source, we could just do it. (tm)
There is a reason people pay for software that just works, and will work for the next few years. It's because they're fed up with being the guinea pigs for architecture astronauts and code purity fetishists.
Also, "that means we only have to upgrade..."
Who is "we" here? You? Of course not, you're too busy chasing the next fad, while the rest of us have to repair shit just to make it do again what it already did before.
And let me extend my apologies in the name of all the authors of software whose extra features and nice bug-freeness you enjoyed at the terrible cost of going through the gruelling process of having to upgrade your software.
Perhaps all of us developers of modern software could get together and raise funds so you could get a refurbished pink iMac G3, from before all those pesky Apple people started their code-purity-fetishist, backwards-compatibility-breaking migration to a BSD+Mach based OS X.
Do keep in mind that modern software, as you state, provides great tools for completed tasks but very few options for those who create the tools themselves (IDEs, for example, but not limited to that concept). This leads to my point:
To paraphrase the OP, maybe some things should not be fucked with for "code purity" in the name of functional purity. If you vehemently disagree, make a better product. Critical mass will replace the failed ideology that ultimately leads to a better technology. All the flame wars in the world won't change this fact and ultimate outcome.
I do understand my comments are inflammatory, and did not expect at all they'd be upvoted more than downvoted, but that seems to be the case.
Your standpoint is the one many in this thread have and I understand the least. This idea that "code purity" and functional purity are different.
The only reason that LaTeX cannot be improved is that its code is inaccessible. This means its functionality lies rigidly restricted in the 80s. In my opinion, there's nothing functionally pure about 70s/80s software. Back then, functionality followed from hardware restrictions.
Please don't invent fake statistics in order to support your argument. Vim's plugin ecosystem has a long tail distribution. After just a cursory glance at vim.org's script repository you'll notice that even the 141st ranked plugin has over 10,000 downloads. To find the first plugin with less than 1,000 downloads, you'll have to drop all the way down to sql.vim which happens to be ranked 1501!
Personally, I happen to use (daily) multiple plugins which are ranked outside the top 1000.
(Sarcasm aside, ^this)
I sort of think that if just the top 100 plugins were compatible, the authors of the non-top-100 plugins would be motivated to migrate, as a sort of herd mentality thing.
I am also not a big fan of extreme forms of compatibility, but plenty of projects try to break it only to have people stick with old versions for years.
I wonder what would happen, if just a few companies that actively use Python, like Google, would hire a few devs that for a year would only fork and fix Python 2 projects. Wouldn't it just solve the problem? I bet they'd be done within a couple of months too, and they wouldn't even have to be super senior types.
It's now mostly just the thousands of smaller packages that still are on Python 2 only.
But Python couldn't get away with that more than once every 10 years.
For those of you who don't know about this mess, it shouldn't be hard to find some rants by googling. Basically, Python 3 broke backwards compatibility with Python 2 without providing any compatibility layer at all; they just expected every third-party library to switch "sooner or later". It's now 10 years down the road and nothing has really happened. Everybody who gets shit done is still using 2.x because they need the old libraries.
> Everybody who gets shit done is still using 2.x because they need the old libraries.
And, btw., I hold each and every one of these responsible as part of the chicken-and-egg problem.
I just looked it up, $70 is way out of my pay range. I'll convert when I have a full-time job.
> We as developers in the Open Source community should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators.
Yes, you should be ashamed.
 Recent vim convert.
I can tacitly agree that using vim running in a terminal to write LaTeX is not the easiest way in the world to write LaTeX.
However, terminals do not suck. Bash sucks a bit, vim sucks in ways this project aims to fix, but terminals are absolutely awesome.
A terminal is a dead-simple API for interacting with the user. Want to display text? Write to fd 1. Want to read some input? Read from fd 0. Want to control the cursor or change the colour? Write a special sequence of characters to fd 1, that's supported by every regular terminal emulator. There is even support for getting mouse events. Yes, it's text-only but that is enough in a lot of cases. And it's very simple. And you get it for free the moment your program starts. There is no opening connections to X, no initialising GUI libraries, no callbacks, no need to design a GUI. It's just there. It's always there.
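As a rough illustration of how thin that API is (a sketch; the sequences are the widely supported ANSI/VT100 ones, and the helper names are made up for this example):

```python
import sys

# The terminal "API" is just bytes on stdout: plain text displays,
# and sequences starting with ESC (0x1b) control color, cursor
# position, etc. in virtually every terminal emulator.

ESC = "\x1b"

def colored(text, color_code):
    """Wrap text in an SGR color sequence and reset attributes after."""
    return f"{ESC}[{color_code}m{text}{ESC}[0m"

def move_cursor(row, col):
    """Escape sequence that moves the cursor to (row, col), 1-based."""
    return f"{ESC}[{row};{col}H"

if __name__ == "__main__":
    sys.stdout.write(colored("hello", 31) + "\n")  # 31 = red foreground
    sys.stdout.write(move_cursor(1, 1))            # jump to top-left
```

Compare that with the setup required just to open a blank window under X.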
Can you explain that a bit?
It's also ever so slightly snappier. Vim seems to be ever so slightly less responsive when I have several panes open.
And then every once in a while I need to do some cool text transformation, and it doesn't work, so I launch a terminal, edit the file in Vim and work my magic. Then I exit Vim and continue in Sublime. Ideal? No.. but it's the usability of Sublime with the features of Vim.. every once in a while..
The sentence quoted above sounds a lot like something out of a product promotion.
> "There is nothing holy about Unix era software, chances are it's shit and a lot of it should be thrown out."
1. we're still very much in the "Unix era"
2. There's nothing holy about software at all. Rules shouldn't be broken, but rules can be broken.
> Look at SublimeText, it's got 1% of the features of Vim, yet it's converting Vim users left and right, by its sheer usability.
If anything I'd say Vim is converting TextMate/SublimeText left and right. Most SublimeText users are former TextMate, and many new Vim users are former SublimeText users.
> We as developers in the Open Source community should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators
Speak for yourself. I don't personally write LaTeX but I still very much use the shell, and Vim.
So let's stick with LaTeX. There are 2 categories of people: those who think that LaTeX is fine for its use cases and those who think that something is wrong and could be done better. Fine is fine, so let's concentrate on the second category. I'll recite the main complaints: slowness, horrible syntax, arcane source code, impossibility of separating semantics from presentation. (Did I miss something?)
I guess many of us are able to program. So why doesn't somebody, right here, right now, create a github/bitbucket repo, post a link to it, and explain what exactly, in your opinion, could be done better? I personally would probably participate if somebody has a plan sane and explicit enough.
And there is nothing shameful about developers accomodating my desire to use a command-line text editor to create/edit TeX files in whatever terminal I am using.
Normally I create/edit TeX using LaTeXila, but it is fantastic that I can do so with vim, and when I don't have access to LaTeXila I use vim for TeX editing, and I am quite glad I can.
With software that works in a standard shell environment, I can ssh into a networked computer from my Android tablet (Better Terminal Emulator Pro and Hackers Keyboard) and do actual work. Far more convenient than carrying my laptop everywhere and also faster than checking out source, firing up a GUI, and then committing the source.
Terminals ARE good technology and we would be worse off without them.
For me, Vi and Emacs are only for when I don't have an option in terms of IDE support, even though I know at least Emacs quite well.
What can I say, the world of Amiga GUIs, Smalltalk and Lisp environments spoiled me as an IDE fanboy.
It baffles me why in 2014, some developers still prefer to work as if UNIX System V had just been released.
Is vim perfect? Well, for me it's not. Because of vimscript, because it doesn't have a mechanism to work with non-text information (for example, it has to open up another window to show me rendered LaTeX). Sometimes I even think of switching to emacs. Plugins that do advanced processing (like rope for python) are quite often slow. But I still prefer it to any IDE in most cases. It's not perfect, but it's the best thing I've met so far in this imperfect world.
Because (a) some of us aren't convinced that things like IDEs give a better view of what's going on in the system; (b) some of us think that fundamental tools like terminal emulators, editors and so on should be debugged and stable because we have to earn our mortgage payments using them and chasing the shiny isn't on our job description; (c) any actually useful new idea winds up in Vim anyway.
You want to use a new shiny unproven editor? Grand, have a ball. But don't ask me to switch purely on the basis that my editor is not new and shiny.
(d) When did I specifically pick out IDEs as the sole example of new and shiny? Or did you just read my comment without parsing it?
- Ability to select any symbol and find semantic uses of it
- For OO code, being able to visualize the OO graph usage of certain symbols
- Refactoring across the whole project with semantic knowledge (no, search/replace does not cut it)
- Navigation in third-party libraries deployed in binary format
- Code completion for static/dynamic languages, while showing tooltip documentation
- Graphical visualization of data structures in the debugger
- Background compilation with static analysis
- Unit test debugging infrastructure
- Integration with modelling tools
- Integration with SCM tooling, being able to interact with it directly from the editor. For example, generating a blame file, with navigation across the file revisions.
- Integration with continuous integration servers
- Integration of developer workflow with task management servers
As an example of such developer workflows, have the IDE talk to Jira, edit the code, automatically bundle it in a workflow that binds the code changes to the Jira issue being worked on, get a Jenkins notification after the code is checked in and gone through the CI system.
Sure, you can get part of this in Emacs after spending a week configuring plugins of varying maturity, and in the end it is still mostly textual.
I agree. Bash sucks. ;)
Please clarify by what is meant by "backwards compatibility," since this is a loaded phrase. On the one hand, it's the idea that software system B can do everything that earlier software system A can, using the semantics of software system A (e.g. "vim can do everything vi can, using the semantics of vi"). On the other hand, it's the idea that software system B is semantically incompatible with software system A (e.g. SublimeText vs vim), but software systems A and B can coexist without breaking each other. While I agree that achieving the former idea of backwards compatibility isn't always important (particularly in the realm of text editors), achieving the latter is an absolute must. If there is one thing I will not tolerate on my systems, it is a piece of software that breaks other working software that I rely on.
> There is nothing holy about Unix era software, chances are it's shit and a lot of it should be thrown out.
Like it or not, we're still in the Unix era, and likely will be for some time. Unix is more than a specific implementation or even a specific API--it also includes the logical abstractions behind them. You still deal with files, directories, processes, threads, pipes, sockets, dynamically-linked libraries, paging, etc., whether you're on Unix or Windows. This means SublimeText is "Unix era software" too--it's built on top of the same Unix-era abstractions as vim. Perhaps the only Unix abstraction vim makes use of that SublimeText does not is the TTY, but that doesn't stop me from using vim where TTYs don't exist.
And you're right, most Unix era software is shit. But don't limit yourself to Unix era software--most software in general is shit. Very few pieces of software become as robust and widely-used as vim, and SublimeText isn't going to make vim disappear anytime soon (especially since it doesn't break working vim installations).
Feel ashamed of me then, since I do exactly this. Why? Because there isn't anything sufficiently better for what I need to do. I have given every single WYSIWYG editor that claims to do a better job a good-faith trial, and I have always gone back because each contender always lacked the ability to do something I needed to do.
I'm not saying it's impossible to do better; it certainly is possible. It's just that I've seen nothing that actually is better. I would address this myself, but I have too many other, more important things to work on (like finishing my PhD thesis).
> (Yes, it gives me shivers just thinking about how much each of those technologies sucks when you think about how good it all could be.)
Code doesn't write itself :) Software systems don't design themselves either :) What you're asking for is a very, very tall order. Many have tried to replace that which is "good enough," and yet we're still in the Unix era despite their efforts.
And here we have a project. Virtually nothing is done: the previous commit was 20 days ago, and the last commit, an hour ago, adds the fundraiser link. I mean, the author doesn't seem to be fanatically enthusiastic, and the plans are quite ambitious. Maybe I'm too pessimistic, but I have a bad feeling about this. I keep thinking, "Hm, maybe it'd be better to write an open-source version of Sublime Text instead?"
I understand the skepticism, but the FAQ gives a pretty good explanation for this and shows that this guy has some familiarity with vim and motivation for taking on this project.
 https://www.bountysource.com/fundraisers/539-neovim-first-it... (near the bottom)
I agree. vim the editor has a good UI: it is fast to work with and makes me productive. Vimscript is bad: I cannot grok how to write complex things, and even simple things often break.
A prototype is at https://github.com/jayferd/ixl-prototype
Yeah, I have mixed feelings as well. I love my Vim, don't touch it.
And yet, vimscript does suck, and better separation of core and UI could allow for very interesting things.
Also, no one really cares about Vi compatibility anymore, and cleaning up the scary old C code base could make it much more accessible.
I guess we'll see.
You know: Vim IMproved.
I wrote a utility to convert these files to/from YAML so it would be friendlier to edit Sublime Text themes; editing naked XML plists was painful for me.
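The core of such a converter is small. Here's a hypothetical sketch of the round trip, not the actual utility described above: Python's standard `plistlib` parses the XML plist, and the third-party PyYAML package (`pip install pyyaml`) handles the YAML side. The theme fragment is made up for illustration.

```python
# Hypothetical plist <-> YAML round-trip sketch (not the utility mentioned above).
# plistlib is in the standard library; yaml is the third-party PyYAML package.
import plistlib
import yaml

def plist_to_yaml(plist_bytes):
    """Parse an XML plist and dump the resulting structure as YAML."""
    data = plistlib.loads(plist_bytes)
    return yaml.safe_dump(data, default_flow_style=False)

def yaml_to_plist(yaml_text):
    """Parse YAML and serialize the structure back to an XML plist."""
    data = yaml.safe_load(yaml_text)
    return plistlib.dumps(data)

# A tiny made-up fragment in the shape of a Sublime Text .tmTheme plist:
theme = plistlib.dumps({
    "name": "Example",
    "settings": [{"settings": {"background": "#272822",
                               "foreground": "#F8F8F2"}}],
})
# Converting to YAML and back should preserve the structure exactly.
round_tripped = plistlib.loads(yaml_to_plist(plist_to_yaml(theme)))
```

Since both formats reduce to the same dicts/lists/strings in memory, editing the YAML and converting back is lossless for typical theme files (binary plist types like dates and raw data would need more care).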
My interest in the project is in something with a design similar to Sublime Text that works remotely in an SSH session. There's also some exotica about wrapping Python in a Go process which I find interesting and terrifying.
Why do you feel that? Vimscript is not the same thing as ex; I agree that any true vim would need to keep strict ex compatibility, but vimscript itself is purely for writing plugins in.
Vim needs to be built with the support enabled, though.
Python:  (more info )
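The canonical check from inside Vim is `:echo has('python')` or `:echo has('python3')`; `vim --version` also lists `+python`/`-python` feature flags. As a sketch, here's a hypothetical helper that scrapes those flags from the version output (it assumes a `vim` binary may or may not be on `PATH`, and returns an empty list if it isn't):

```python
# Hypothetical check for whether the local vim binary was built with
# Python support, by scraping the +python/-python flags from `vim --version`.
import re
import shutil
import subprocess

def vim_python_flags():
    """Return feature flags like '+python3' or '-python' from `vim --version`,
    or an empty list if no vim binary is found on PATH."""
    if shutil.which("vim") is None:
        return []
    out = subprocess.run(["vim", "--version"],
                         capture_output=True, text=True).stdout
    # A leading '+' means the feature was compiled in, '-' means it wasn't.
    return re.findall(r"[+-]python3?", out)
```

A `-python` result means the build lacks the interface and a rebuild (or a differently packaged vim, e.g. `vim-nox` on Debian-family systems) would be needed.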