Neovim (github.com)
879 points by tarruda 1100 days ago | 362 comments



Thanks for trying to do what many of us secretly wished we could do but can't because of time/skill constraints. I will definitely move to NeoVim the second it's packaged (is it yet?), regardless of whether you've changed anything yet.

I hate the ideas many programmers have about backwards compatibility, that it's more important than development speed and modern concepts. There is nothing holy about Unix-era software; chances are it's shit and a lot of it should be thrown out.

Look at SublimeText, it's got 1% of the features of Vim, yet it's converting Vim users left and right, by its sheer usability.

We as developers in the Open Source community should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators. (Yes, it gives me shivers just thinking about how much each of those technologies sucks when you think about how good it all could be.)


  > ...should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators.
First, why? It works just fine. Second, is there anything better? I don't know of anything I would prefer to use. So until someone comes up with something, I'll stick to using Vim to write LaTeX in a Bash shell (might upgrade to zsh, we'll see).


No, it doesn't work fine. LaTeX is slow, inconsistent, and needs to be run multiple times to give a correct result; it has no API, is not extendable in a sane way, and its source code is so arcane there are books written about it, and if you've read the books the only thing you've learned is that trying to reimplement LaTeX is a fool's errand. And its syntax is ugly.

Markdown is better. Microsoft Word is better. They just both lack certain things we need, and LaTeX has those things.

Could you not envision things could be better?

A LaTeX that parses to an in-memory tree so it could be transformed before compilation?

A shell that instead of working on character streams worked on structured and annotated data streams, so it could intelligently interpret what is going on?

An editor that doesn't have at its core an expectation that its output is ASCII character commands controlling a 1970s terminal interface?
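The structured-streams wish can be sketched in a few lines of Python (a toy model; `ps`, `where` and `select` are hypothetical stand-ins for shell commands, and the process table is faked): pipeline stages pass dicts instead of byte streams, so downstream stages filter on named fields instead of re-parsing columns with awk or cut.

```python
# A toy "structured pipeline": stages pass dicts, not byte streams.
# ps(), where() and select() are hypothetical stand-ins for shell commands.

def ps():
    # Pretend process table; a real shell would query the OS.
    yield {"pid": 1, "cmd": "init", "rss_kb": 1200}
    yield {"pid": 42, "cmd": "vim", "rss_kb": 85000}
    yield {"pid": 99, "cmd": "latex", "rss_kb": 210000}

def where(rows, pred):
    # Filter on a named field -- no column counting, no regex.
    return (r for r in rows if pred(r))

def select(rows, *fields):
    # Project out just the fields the next stage cares about.
    return [{f: r[f] for f in fields} for r in rows]

# Equivalent of "ps | where rss_kb > 50000 | select cmd":
big = select(where(ps(), lambda r: r["rss_kb"] > 50000), "cmd")
print(big)  # [{'cmd': 'vim'}, {'cmd': 'latex'}]
```

This is roughly the model that object-pipeline shells use, as opposed to the character streams of a traditional Unix shell.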

Of course you'll stick to using Vim to write LaTeX in a Bash shell, I do too. Doesn't mean I'm not mad about it.


1) You run LaTeX twice only on really large and complex documents like books, and if you're writing books (not editing, see antipope.org for details) in Word, then you need professional help.

2) Markdown and LaTeX are designed for different tasks. Next you'll be telling me that if you have a leafblower, you have no need for a chainsaw.

3) Find me the structured and annotated data streams first - and show me that the structure and annotations are correct - then lament the lack of a shell that can route them to the programs that don't interpret stdin and stdout and stderr that way anyway.

4) I use vim to write latex when I write latex, but "in a bash shell" doesn't make any sense. You fire up vim from bash, yes, but unless you're trying to do an ex-style editing session, you're not editing in the shell any more than you're editing in the kernel or the tty driver...


You're completely missing the GP's point. Whether or not LaTeX is better than MS Word for task X is besides the point. It's that LaTeX has many obvious opportunities for improvement, ways to drag it into the current age, while keeping all its fundamental strengths. As do many similarly old tools such as Bash, terminal emulators and Vim.

These improvements aren't all that difficult to imagine, but they don't happen, because, well, why? GP seems to think an obsession over backwards compatibility.


Agreed. TeXmacs is a good idea. Unfortunately it doesn't attract much attention, and it is still not as mature as Vim or Emacs. I hope the community will pay attention to TeXmacs.


LyX works more reliably in my experience: http://lyx.org


Why expend manpower and energy improving a working solution just for the sake of it when you have so many other broken things that need fixing first? That's a fairly self-answering question. There's a reason the expression is "If it's not broken, don't fix it".


We have tools that have remained virtually unchanged for over 20 years. While the consistency and stability have been nice, there have been a lot of good things happening in software development and technology that those tools could really benefit from. Are you honestly going to tell me you don't think that taking a second look at these tools and seeing how we could improve them would be a good thing?

What you're basically advocating is nobody should innovate because things are "good enough". With that sort of attitude we would never have any progress on anything.


That's not what I'm advocating. What I'm advocating is not wasting time and effort rewriting working tools just because the rewrite would be newer.

Which is not what you're saying I'm advocating.


Nobody thinks it's a good idea to rewrite something just to make it "newer". The idea is to rewrite (or maybe just refactor) these tools to make them better.

It's also not "fixing what's not broken", but improving something that could... well... be improved.


While the core interaction remains unchanged, 20 years ago was before bash 2.0. The version running on my laptop, bash 4.2, was released in February 2011. The development speed hasn't been blindingly fast, but it's a fairly mature project with scripting support (so less need to extend the core than some projects have), and there have absolutely been changes even in the last 5 years.

http://tiswww.case.edu/php/chet/bash/NEWS


If you fear breaking it, why not replace / revolutionize instead. See GCC vs LLVM, X vs Wayland, C++ vs Go.


> C++ vs Go

Bad example :P


markdown is perfect for forum comments, but idiotic to extend too far. if you’re a programmer wanting to write a book in markdown, you’ll find yourself thinking of markdown syntax extensions more than thinking about the book.

if you want extensibility and still a rather lightweight syntax, try restructuredtext. it has exactly one flaw compared to markdown, which is the hideous inline link syntax (which practically forces you to use named links). that makes it less suited for forum comments (where you might want to quickly inline 1-2 links), but perfect for books (where you can neatly specify your link targets below the current section)
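For comparison, the two reST link forms side by side (example.org is a placeholder URL): the inline form embeds the target mid-sentence, while the named form defers it to a target definition you can park below the section.

```rst
Inline form, clumsy mid-sentence: see `the docs <https://example.org/docs>`_.

Named form, reads cleanly: see `the docs`_.

.. _the docs: https://example.org/docs
```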


Or you could just use LaTeX, which has been used for writing books for decades (the word you're looking for is "debugged" and the phrase is either "well understood" or "well documented").

And yes, there are efforts underway to improve it (LaTeX3) but 2e works so well, there's not been much drive there.

And if you find LaTeX is so hard, there's always LyX...


>which has been used for writing books for decades

Well, mostly math and science books. Very few (if any) fancy and well-typeset books by major publishers have been made with LaTeX. I know, because I know that industry. So, while TeX was once created to be a general typesetting solution, it has been relegated to something math and CS geeks use for their papers.

>And if you find LaTeX is so hard, there's always LyX...

A not-really-maintained relic of a program that tries too hard to work around the issues raised by a backend like LaTeX, which wasn't really created with such GUI control in mind.


>Very few (if any) fancy and well-typeset books by major publishers have been made with LaTeX. I know, because I know that industry.

What do they use, then? (I'm actually pretty curious, because I am writing a dissertation that I currently build with LaTeX, but I find it much nicer to write in Org mode. I'm doing whatever I can to avoid a hard dependency on LaTeX for the backend, and to allow exporting to multiple formats.)

Why do they use whatever typesetting solution(s) they use? Does it actually produce better/nicer output than LaTeX (for non-math text)? Does it just have nicer syntax or friendlier error messages? What's the advantage?


QuarkXPress and Adobe InDesign are two I have seen being used. In the past Corel Ventura was quite popular too.


> A not really maintained, relic of a program, [...]

http://www.lyx.org/trac/timeline



I have to wonder if your average publisher would even know what to do with a book written in LaTeX.


Any publisher that produces books with heavy math will know what to do with LaTeX, and if not then call a vendor that will know what to do with LaTeX.


...if you have a leafblower, you have no need for a chainsaw...

Yes, but a leafblower combined with a chainsaw...

I don't know about you, but that's a tool I could find some uses for.


Like scaring children?


It's not often that I come to laugh out loud while reading (the usually somewhat stuffy) comments on HN.

+1 coffee sputter.


Or destroying small rodents, perhaps?


Well, depends on the size of the leaf blower. If you can blow the trees down, certainly don't bother with the chain saw!


Newton would have some things to say about that, but you wouldn't hear him because of your doppler shift.


I just choked on a glass of water. Thank you for that comment!


One could easily remove the fan on the leafblower engine and add a saw blade.


If you are comparing Word and Markdown to LaTeX I think it's clear you haven't really used LaTeX.


Oh get over yourself. Every one of the criticisms he levelled against LaTeX is spot on, and you pick up on the fact that he mentions Word is easier to use (which, by a long, long way, it is).


I hardly think so. Let's address them:

> "LaTeX is slow, inconsistent and needs to be run multiple times to give a correct result"

It is not slow, it runs in under a second on most documents I have authored.

> "Inconsistent"

I'm not sure what tinco meant by this, but it does in fact return the same results for the same file across multiple runs.

> "Needs to be run multiple times to give the correct result"

This is true for things like references, etc. However, this proves to be a non-issue in practice as you are recompiling so often to view changes that references are always up to date.

> "Its syntax is ugly."

I don't see how learning LaTeX is different from learning any other programming language. I don't think that many C++ gurus would call C++ "ugly". Calling a programming language "ugly" is often the last argument you see when someone couldn't come up with a decent argument against it.


>> "LaTeX is slow, inconsistent and needs to be run multiple times to give a correct result"

> It is not slow, it runs in under a second on most documents I have authored.

Slow? I don't know about that, but compared to what? Even in Word it takes some time to reach the "Save as PDF" menu, right?

>> "Inconsistent"

> I'm not sure what tinco meant by this, but it does in fact return the same results for the same file across multiple runs.

Well, there is some truth to this. There is an odd naming scheme if you look at stuff like "enumerate", "itemize" and "description"; beginners get confused by this as they assume it would be called "describe". The same goes for most of the packages. This is somewhat historical, but I think it also makes it unnecessarily hard for beginners to figure out LaTeX, right?

The same goes for the reason why LaTeX needs several runs to finally create the full document. Nowadays you would do this differently, I guess, as there is simply no need to start the program twice if the application were smart enough to call sub-programs (like biblatex) by itself on the fly. It would still run the same routine several times, but users wouldn't see it, and it would be one command for them to build the document. That's another point beginners tend to go crazy about :)
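That one-command driver is easy to sketch (a toy model; `compile_once` and `fake_pass` are hypothetical stand-ins for a real pdflatex/biblatex pass, and this is roughly what wrapper tools such as latexmk automate): keep recompiling until the cross-reference data stops changing.

```python
# Rerun-until-stable driver: the idea behind one-command LaTeX wrappers.
# compile_once stands in for one full pass (pdflatex + biblatex etc.);
# it receives the cross-reference state from the previous run.

def run_until_stable(compile_once, max_passes=5):
    aux = None  # contents of the .aux file, conceptually
    for passes in range(1, max_passes + 1):
        new_aux = compile_once(aux)
        if new_aux == aux:  # fixed point: all references resolved
            return passes
        aux = new_aux
    raise RuntimeError("cross-references did not stabilize")

# Toy document: pass 1 discovers the labels, pass 2 confirms nothing changed.
def fake_pass(aux):
    labels = {"eq:euler": "(1)", "sec:intro": "1"}
    return labels if aux is None else aux

print(run_until_stable(fake_pass))  # 2
```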

Having said that, I should add that I'm using LaTeX (and XeLaTeX, etc.) daily and there isn't a single program matching its power and beauty; I even design covers for my publications with it. But I'm able to use vim and emacs, right? 90% of the people are bloody beginners and it would be a shame to hide the beauty of a LaTeX document from them.

>> "Its syntax is ugly."

Yeah, well... "ugly" is a bad term to argue about. I like it, and I think the syntax is way cleaner than reStructuredText :D


funny that you mention rST. i also talked about it here, and i said that its only syntactical flaw is inline link syntax, and how that’s only a problem if you don’t write a lengthy document with it, because in that case, you don’t want to inline your links anyway.

what is unclean about rST in your opinion?


It is not slow, it runs in under a second on most documents I have authored.

It is slow. On my PhD thesis, it took more than one minute at some point, on an SSD. Not very convenient if you are tweaking figures and equations, and want to preview a change quickly.


Then you did it wrong. You should have just factored your thesis into multiple files, for example one for each chapter.


That sounds like a workaround.


Actually, it is a pretty useful feature. I break my documents up into multiple files and can compile pieces of them together for different audiences. Want an executive summary of my research? Just build the abstract + a summary. Want to review the literature with my advisor? Build just the literature review.

However, I will agree that using it to address slowness of a build process is just a hack.


I suppose you would say splitting your C++ into separate source files is a "workaround" for slow compilation times, too? It could be, but both are probably good practice anyway.


It is, but you have to remember LaTeX is best thought of as a programming language that outputs PDFs, so splitting into multiple source files is useful, as is source control.


My thesis was split in multiple files.


I agree that that aspect of LaTeX sucks. That said, as a matter of exchanging practical hints: If you haven't yet heard of \includeonly, you should definitely check it out ;)
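A minimal sketch of the trick (file names are made up): \includeonly recompiles just the listed chapters while keeping page numbers and cross-references from the other chapters' .aux files.

```latex
% main.tex -- while drafting chapter 2, compile only that chapter;
% page numbers and \ref targets from ch1/ch3 come from their .aux files
\documentclass{book}
\includeonly{chapters/ch2}
\begin{document}
\include{chapters/ch1}
\include{chapters/ch2}
\include{chapters/ch3}
\end{document}
```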


"Inconsistent" probably refers to the fact that there are subtleties that can mess you up; once it works you are fine, but you may hit a blocking issue on the way.

Syntax is important, remember he is comparing it to Markdown, not C++. C++ is certainly ugly and should not be used as the basis for any comparison, certainly not for a word processor.

I think he is just pointing out that general theme amongst these tools. They work for the people using them well enough, so they stay that way forever. It isn't that features aren't added or that they stagnate, it is just that thinking outside of the box is impossible.


> Word is easier to use (which, by a long, long way, it is).

Definitely, initially, for simpler documents.

But what about using it collaboratively, merging changes and comparing between versions? Especially large documents with a multitude of authors combined later (think journals).

What about documents with programmatically generated content?

What about consistent formatting for your documents?

What about when Word gets confused by some element in the xml not visible to you?

Sure, Word may be easier, but there are limitations to its use.


It's an apples to oranges comparison.

I want to draw a conformal mapping diagram in Word and also write the Navier-Stokes equations in 3-vector form.

Also, I want to do a lot of tensor arithmetic with tons of superscripts and subscripts and arrows.

Tell me how any of this is accomplished in Word, please. Saying Word is easier to use is like saying BASIC is easier than C++.
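For readers who haven't met this use case, the kind of markup in question: the incompressible Navier-Stokes momentum and continuity equations in standard LaTeX notation.

```latex
\[
  \frac{\partial \mathbf{u}}{\partial t}
  + (\mathbf{u} \cdot \nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p
  + \nu \nabla^{2}\mathbf{u}
  + \mathbf{f},
  \qquad
  \nabla \cdot \mathbf{u} = 0
\]
```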


You know what most people would do? Use a dedicated program to create the plots and equations, export it as a PDF, and embed that in Word. Instead of LaTeX being the de facto standard for all publishing, it is relegated to very specialized uses in mathematics and some engineering. If you try to use LaTeX with a biomedical researcher, they will look at you very funny.

Hell, I just had a postdoc send me figures as part of a manuscript draft where she was using PowerPoint to make the figures instead of something more appropriate like Illustrator. Most people who publish don't care about how their documents are created; they just want them to work and get published. Word does a pretty good job for this.

Use the best tools for the job. Most people would tell you that BASIC is easier to use than C++, if you can write your program in BASIC. And if you don't want to deal with the learning curve of C++ and you can get by with BASIC, then what's wrong with using a tool that works?


Jesus nothing is wrong. Read my comment again. I'm saying it's an apples to oranges comparison. If BASIC is all you need great. If word is all you need great. If you can get by with importing from gnuplot or R or matlab or whatever great. I'm not saying one is superior to the other. I'm saying that when I want to draw a commutative diagram, I'd rather do it in LaTeX with tikz or the xy package.

I'm sick of the "HN strawman." People on these threads need to read and apply basic critical reading and logical skills.

You don't need to tell ME to use the best tools for the job. I've been telling people that for fucking years. I don't know why your comment annoyed me so much, but it did.


What I'm trying to say is that for the vast majority of people who have to produce documents, it isn't an apples-to-oranges comparison. For most people, Word and LaTeX do the same thing. And one of those is vastly easier to use to get their job done. So while LaTeX is the technically superior tool, Word will still get the job done. And because of the network effects that working with collaborators has, people like me are stuck using Word even if I'd rather have my papers written in something else.

In some fields LaTeX is the easier method to produce documents (good luck with anything more than a simple equation in Word), but those are relatively small niches.


In a similar vein, you might say that to the vast majority of people there is no difference between a screwdriver and a pry bar because most people are content with abusing screwdrivers.


It depends on the job you're trying to do.

But you can't honestly say that Word and LaTeX really have that big of a difference in intended jobs. It seems to me that they are both in the document production business.


And pry-bars and screwdrivers are both in the "levers that provide mechanical advantage" business.

One can be abused to fill the role of the other, and if you look at them both from 1000 feet and squint then both are "for" the same thing, but the reality is that when you actually examine what each is designed to do and in which situations each is used, they really have little to do with each other.


You can't honestly believe that pry-bars and screwdrivers are as similar as LaTeX and Word are. It's a ludicrous argument.

LaTeX and Word are for documents - making documents. How they operate is very different. For example, Word includes its own editor, LaTeX does not. But they are both used for producing documents.


You know what most people would do? Use a dedicated program to create the plots and equations, export it as a PDF, and embed that in Word.

I am not fond of LaTeX. However, for some fields there are simply no dedicated programs, while there are good LaTeX packages. For instance, for my thesis I drew a lot of attribute-value matrices (in the unification grammar/HPSG sense). There is a great packages to draw these structures, both flat and in tree form.

I had some hope that DocBook et al. could replace LaTeX for most uses. It's great in many respects: it's just XML, so easy to machine-interpret. Customising output is easy (via XSL stylesheets). There is good support for producing PDFs (via XSL-FO). And it supports SVG and MathML. There are also great WYSIWYG editors, such as Oxygen. However, that toolchain never became popular, because most people hate XML.

The book that we never got to finish[1] is written in DocBook, equations are in MathML. The Slackware book that I once wrote [2] is also completely done in DocBook.

[1] http://nlpwp.org/

[2] http://rlworkman.net/howtos/slackbasics.pdf


DocBook and DITA are used a lot in the enterprise world.


A good example of DocBook is basically all recent Red Hat/Fedora documentation.


> If you try to use LaTeX with a biomedical researcher, they will look at you very funny.

I'm a biomedical researcher, and I use LaTeX. It's my preferred method of document preparation. I know many others who use LaTeX as well.


The last time I used Word it was an absolute nightmare to typeset equations in it. In theoretical physics basically every page contains at least one equation, it would be complete madness to try typesetting that in word.


Can one not value the simplicity of an apple, the juiciness of an orange and compare it to a Pomelo, wishing the latter had both?

I would say LaTeX can be seen as a combination of many aspects of Word and Markdown: it offers the feature set and backwards compatibility that Word offers, while accumulating a legacy, complexity and perceived indeterminism that also plague Word (though in a different way); and on the other side it is a plain-text format that can be version-controlled, taken apart, and shared as snippets.


"Markdown is better. Microsoft Word is better." Stop comparing apples to oranges. And there's really no "objectively better"; they're two different tools with two different uses. Sure, LaTeX could be better, but it's being worked on.


I do what I want. Also, LaTeX is a system for creating documents, and Word is a system for creating documents. There are only four important differences: Word has a more modern architecture, Word's compiler is integrated into its GUI, Word is closed source, and Word's source format is not human-readable.

Apples to Oranges? I can compare these Braeburns to those Fujis just fine thank you.


I don't mean to be a jerk, but how much have you actually used LaTeX? They aren't even similar. They have very few similar use cases. One is a typesetting system, the other is a word processor. Sure, documents come out the other end of both, but the process and targets are very different. One allows you to quickly and easily make a quality document. The other allows you to make the document look exactly how you want.

Java and C are both programming languages but it is still apples and oranges to compare them.


I'm by no means a LaTeX expert. I've written a few papers and a bunch of reports, nothing published, just undergraduate stuff. I dropped out halfway through my graduate degree, so I'm not what you'd call an academic. But I've seen and read enough to know that you can do impressive things with LaTeX.

I have another question though: how much have you used Word? I have used it only a couple of times since high school, but it's gotten pretty impressive. I bet that if you really dive into it you could get very close to the power of LaTeX. At least with regard to the typesetting. The drawing of images/formulas/schemas is a whole different ballgame, I'd agree.

Anyway, my point was not that Microsoft Word is in any way a good replacement for LaTeX. I'm just saying that if LaTeX were as well written as Word is, it could have these fancy GUI and fast-rendering features too.

There are people out there who think that the power in typesetting that LaTeX has is mutually exclusive with the speed and usability that Word has. That just isn't so.

The only reason LaTeX isn't fast and more easily extensible is that it was written for the machines of 40 years ago, which didn't have enough RAM to hold the parsed tree of a document.

LaTeX could be ten times as fast were it written in Ruby, the slowest language out there, just because a modern programmer would make use of modern hardware and modern design concepts, and just make it responsive. A GUI could be a simple external process that accessed a message-passing API, rendered the output to HTML and ran it through a WebKit view.

Overhead everywhere, and still ten times faster than LaTeX...

(edit: definitely worth a downvote, this comment :P)


to summarize:

you've written a handful of latex documents, so you're now an expert on what latex is used for and what people want out of it. would it help to point out that given latex is more difficult to use than word, I'd bet most latex users have tried and discarded word in favor of latex? Would that affect your judgements about the relative use cases?

   There's people out there that think that the power in typesetting that LaTex 
   has is mutual exclusive with the speed and usability that Word has. That 
   just isn't so.
Since, well, nothing else offers similar typesetting plus mathematical notation / flexibility, your proof is... what exactly?

If you built a better system, you'd find customers for it.


>I'm by no means a LaTex expert. I've written a few papers and a bunch of reports, nothing published, just undergraduate stuff

I had a long talk about something like this with a close friend last night. He has a favorite technology stack that is regularly trash-talked on sites like this. From what he's seen, it's usually by people who have only a passing knowledge of it. Either they worked a job and only had to mess with it for a few months, or they worked with it for a long time but never went past the surface. Almost all the complaints he's seen about it are either extremely outdated, or fixable with a little custom code or configuration changes. I think you should spend a little more time with it before you say that developers should feel "ashamed" for using it. Your post falls squarely into the category he was talking about.


I wouldn't trash-talk this if it were simply an unpopular way of working. I'm trash-talking it because it's the most popular way of working. I don't know anyone who does not write his/her papers in LaTeX.

I have a lot of respect for having your own toolchain that's optimized and comfortable for you.

I am sorry I come across wrong, I am not saying that developers should feel ashamed for using it. I am saying developers should feel ashamed that other people are using it.

I'm not saying using LaTeX is dumb; I've stated multiple times in this thread that I use it myself. I'm just saying we could do better.


I disagree that it's the most popular way of working. Perhaps that is true in mathematics, physics, and maybe even computer science, but it's unlikely to be true in many other fields. Let's be realistic and recognize that the majority of people do not use LaTeX. The ones who do use it, use it because they are putting in a lot of mathematical formulae, which are easier to do in LaTeX than in Word (although Microsoft is making progress with the equation editor), and the formulae look way nicer in LaTeX. Word renders the formulae into an equation that generally looks bad and scales worse.

The things that I use LaTeX for are usually my own writing that I don't have to collaborate with other people on. If I have to work with other people, I use Word. The thing that I like about LaTeX is that I can have it automatically generate my figures, because I can script it, which I can't do with Word.

I agree that LaTeX can be difficult to write, but you have to realize that Word has its shortcomings too (put an image in and send the document to somebody else, and chances are the layout will get messed up). Tools like pandoc are trying to alleviate both problems so people can work in whatever environment they prefer, which seems like a good solution to me. I could write part of my document in Markdown, convert to LaTeX to write my formulae, and then convert to Word so my coworkers can edit it more easily.


> I'm trash talking it because it's the most popular way of working. I don't know anyone that does not write his/her papers in LaTeX.

And you never stopped to investigate why, in spite of the far greater ease of entry that alternatives have?


I have a hard time believing this, to be honest. I know far, far more people who use Microsoft Word, than I do people who use (or even heard of) LaTeX.


It depends on the field. In the field I am working in (computational linguistics), virtually everyone uses LaTeX. And with good reason: there are great packages for typesetting stuff that is specific to our field. So, yes, at work I know far more people who use LaTeX than Word.


Agreed. LaTeX is a very field specific tool. I don't know anyone that has ever used it (other than me, and I'm forced to use Word because that's what all my collaborators use).


I don't actually disagree with you on most points, but just out of curiosity, why do you have the impression that LaTeX is slow? I used it to write my thesis (> 200 pages, full of images) and was re-rendering all the time; it only took a couple of seconds on my slow netbook. When I'm working on a shorter paper, I barely notice the time between hitting "render" and seeing the result. (I realize I could even set it up to render automatically while I'm writing, but I prefer to concentrate purely on the text editor as much as possible and curb my urge to re-render constantly.)


I've formatted three books self-published by my wife. The first in Pages (better than Word for nearly everything except document interchange), and the latter two in LaTeX.

I'll take LaTeX. It's more reliably reproducible. Once I got the basic format created (for her second book)—which took about as much time as it took to format the first book in Pages—the third book took about a tenth the time. My tooling was in place.

It's not for everyone, and you really have to either dig deep or find a macro set that does what you want (or both—I'm using a macro set on top of scrbook). There is room for something better, but that something better is not Word (or Pages).

(I'd almost say that what LeanPub does is good enough, but their LaTeX typography sucked when I tried it almost two years ago.)


"I bet that if you really dive into it you could get very close to the power of latex."

I find that very unlikely.


> One allows you to make the document to look exactly how you want.

Are you talking about InDesign? Exact placement of anything is really a pain with LaTeX, choosing "normal" fonts requires you to switch to a different compiler, and the debugging information on all those automatic choices TeX and LaTeX make for you is abysmal. What I'm saying is that even though LaTeX allows you to do precision work, that does not mean it is good at it.


I am very confused by the implication that TeX or LaTeX make it easy to create a document that looks exactly the way you want. Unless you want it to look like a paper presented to a publication that has a mandatory style, and a package is available with that style, it's just not true.

And that is almost the whole point. Most people are idiots about how to present written material, so TeX and friends make those decisions for us. It's especially frustrating when you have to dink with floats, because now you're second-guessing TeX, and it exacts a high price.

I've used TeX for over 25 years and I'm comfortable saying that I could produce documents of a similar quality, with better support for collaboration, with Word.

I still use TeX for many projects but the Word style capability is great.

Markdown still sucks, though. :)


with better support for collaboration, with Word

How? This is an honest question. I wouldn't use Word anyway, but I am just not aware of anything that could come close to Git + LaTeX in terms of collaboration support.


Fair question.

I've been doing projects with people who have no technical background to speak of, but immense domain knowledge. They wrote their theses in Word (god help them), published in journals that require submissions in Word (there are more and more of the things), and include a lot of figures in their papers or grant proposals.

Sharepoint helps a lot with collaboration, but even word's basic change tracking and commenting facilities are good for people working around a document via email, Dropbox, or even git.

Word still creates those weird situations for which it is infamous, but they are generally rare. Also, word has become much more stable, even with larger documents.


Those are just the surface differences. The modes in which you use all of these are very different.

Word: explicit WYSIWYG control over typesetting.

LaTeX: semantic level control over typesetting. Additionally provides explicit low level typesetting commands.

Markdown: given how basic (trivial even) Markdown is (http://daringfireball.net/projects/markdown/syntax) I'm not sure why it's even in this list.


I do not use any of the mathematical elements of LaTeX and the work I do render using it is exclusively words. Just words. So really LaTeX isn't necessary for what I do. But, I write using MultiMarkdown in Scrivener, using Bibtex citekeys for referencing then export the document as LaTeX from scrivener and then compile using Texshop.

I guess I don't really have a point. Just that if you are writing essays in the humanities or sociology then markdown can have a place in your workflow


It's in the list because it's what I would use for my thesis, if I'd do one now :) It's basic, but it has the semantics for most stuff, and can easily be extended for more. I'd do all formula's/drawings in external tools.

Word nowadays has semantic typesetting controls too, I think since they started that ribbon thing that it has become useable. If you stick to using their semantic things I think you can do a pretty good job of properly typesetting in Word.


Writing a thesis using only markdown sounds extremely painful. Are you going to create all your figures elsewhere and then include them as images? How are you going to create your list of figures? And your table of contents? How are you going to create, maintain, and organize your list of references? I could go on.


Writing my thesis with markdown was one of the best decisions i made. I wrote 99% of my master thesis in markdown (50 pages w/o interviews). 1) I didn't have a lot of images so i just wrote FIXME: Image Foo.png where they belonged. 2) I wrote my Citations Plain in there (e.g. Wocken, 1985, 7 ff.), using a special folder for all sources i really used in Mendeley. 3) Writing in markdown i could just c&p perfectly formatted chapters written with simplenote. 4) i also could separate the chapters into files and keep everything in a git repo.

When i was finished writing i exported the markdown to rtf, 1) selected all the h1's via "select all", set them to be a real (word) h1. vice versa with all other styles i used. 2) Then i put in all the pictures. 3) Then i exported the list from Mendeley.

That process - from markdown to perfection - took me less than 4 hours.

I have some experience with latex and especially latex + komascript, but i hated that experience because the latex source code doesn't embrace readability and you're sure to get obscure errors at the last second. besides stuff like running latex twice for a table of contents.

Writing

  \begin{itemize}
    \item really long line which gets wrapped in my vim 
  terminal, so that i don't see where the second item starts.
    \item another long line. maybe 200 characters long?
  \end{itemize}
is complicated and you can't read it well. this otoh is very easy to read:

  * really long line which gets wrapped as well, but is 
  better to read because it starts with a clear sign which 
  is recognized to be a bullet.
  * next line, which is also very long.
Easy readability is crucial, because i needed proof readers. The young proof readers could actually get the markdown code and correct my mistakes there (or write comments, starting with another fixme). really rocked with diff.

The older proofreaders could proofread it in the final stage in ms word with the track changes functionality.

I put a lot of thought into Design and Typography and i was sure that i could decide better than latex how my document should look. I agree that for some people this is not the case.


Most content can be written in something other than LaTeX, especially in early drafts. I always start drafting papers in Markdown. I cannot imagine going from start to finish with Markdown though. Especially since most papers have fairly strict style guidelines and length restrictions.

After a draft or two, transitioning to LaTeX is a must. [Pandoc](http://johnmacfarlane.net/pandoc/) for the win!


How do you get pagination with markdown? I would like to switch to markdown instead of using latex, because my needs are simple. But how can I convert markdown into something with support for pagination?


i actually converted it to rtf and imported it then in MS Word (On OS X). I took care of pagination there.

Actually the Plan was to typeset it in Indesign, but as i woke up the day of the final layout i just decided to do it in Word.


Saying that this toolchain is better than using Latex is so hilarious :D

I'll just pretend it's a parody.


> Are you going to create all your figures elsewhere and then include them as images?

This is what most people do in physics and astronomy. You make the plot in some kind of plotting package and export it to a vector format. In astronomy, the journals only recently started accepting PDF figures in addition to EPS.

In my experience doing all of my figures in LaTeX is significantly more painful than using a real plotting package like matplotlib.


I wrote a dissertation, two theses, and a bunch of papers in latex and I never ever ever ever did any graphics that weren't imported as eps or, later, png.


I don't know your exact use cases but I am really fond of pgfplots which, being based on TikZ, integrates much better than matplotlib. But I also happen to hate matplotlib for sticking with "that" API ...


As long as you can get it in vector format, it doesn't matter.

What about tables? Granted they aren't fun to make, but I am not aware of any methods of creating them outside of a latex document.


None of those features require LaTeX's arcane syntax, obscure source code, and multiple compilation steps. reStructuredText is a Markdown-like language containing some of the features you mention, especially with extensions; its syntax is well-defined, and its main implementation is in Python.

The point isn't whether Markdown has enough features, but rather that LaTeX makes you pay a price for those features that you shouldn't have to pay, just because it's 40-year-old software.


I actually had a mini panic attack thinking about that.

The best thing about markdown is pandoc.


Plus there are many advances to doing figures in LaTeX - such as the ability to have you math typeset the same way as in non figures.


Check out pandoc [1], it will let you write documents in Markdown + LaTeX for equations and then convert it to LaTeX, a PDF, or several other document formats.

Also, even though Markdown tends to be fairly easy to write, there are many edge cases where you can't really be sure what the result is unless you try it.

[1]: http://johnmacfarlane.net/pandoc/


If that is all you need and expect from a replacement for LaTeX, why aren't you just using LibreOffice?


It's not, I need the human readable source part, so I can track my work and cooperate in git :) I use LibreOffice for some stuff though.


Versioning of OpenOffice/LibreOffice documents using git: http://blog.riemann.cc/2013/04/23/versioning-of-openoffice-l...


That is really cool, thanks for that.


Aren't modern word processor formats just zipped-up XML files? I suspect it should be a straight-forward matter to get at least LibreOffice and version control cooperating.
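For what it's worth, it's easy to check from Python that these formats are zip containers. A tiny sketch that builds a stand-in container in memory (not a real .docx, just the same structural idea — real documents keep their main content in entries like word/document.xml):

```python
import io
import zipfile

# Fake the container structure of a .docx: a zip archive with an XML entry.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("word/document.xml", "<w:document></w:document>")

# Reading it back is just unzipping.
with zipfile.ZipFile(buf) as z:
    print(z.namelist())  # → ['word/document.xml']
```

Pointing the same two lines at an actual .docx shows the real entry list, which is exactly why "unzip and diff the XML" sounds plausible at first.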


You are technically correct, but not usefully so. The XML is pretty hairy. A VCS that handles XML semantically (e.g. tag-pair oriented, rather than line-oriented) might have some shot....but, basically, merges are more or less not going to work so it's not really version control, just backups.


If we're talking about LaTeX, Word, Markdown and LibreOffice - well, then LibreOffice is a contender for the worst usability, it's superficially similar to a mature product but it's a case of "the remaining 10% of goodness will take more work than the first 90%".


Usability depends on who is using it. I refuse to use LibreOffice because it is so bad, but I would sooner recommend LibreOffice to my father than LaTeX or Markdown.


LuaTeX is exactly what you’re talking about.

LuaLaTeX is a reimplementation of PDFLaTeX on that base.

ConTeXt is a new syntax/concept based on LuaLaTeX (that doesn’t want to be compatible with LaTeX and its errors)

---

That shell is unfortunately MS PowerShell. There are several approaches to do it in a way more familiar and compatible with unix shells. (some of which have been started recently, such as Final Term, and some are vaporware, such as TermKit)

And most editors are like that, some of which even have an optional vim mode for fans of that control scheme (my favourite is Kate)


By far the easiest way to write formulae I've come across is pen and paper. I could imagine a drawing tablet interface with good computer vision algorithms could be nice.


Surface Pro has a digitizer and pen, Windows has math-writing recognition from pen input:

http://windows.microsoft.com/en-ph/windows7/use-math-input-p...

OneNote has "insert Ink Equation" (I don't know if it uses the same engine, or has its own).


"A shell that instead of working on character streams worked on structured and annotated data streams, so it could intelligently interpret what is going on?"

I'm not convinced that's worse, but I'm not convinced it's better for the task of being a shell. It's totally, unequivocally, way the hell better when you're trying to build anything large out of it, but taking everything back to byte streams at the interfaces gives you a huge ability to easily reformulate things to fit them together.


Have you seen http://xiki.org? It seems a good idea for the shell to know what a directory is, for example.


I had not; I'll look at it at some point here.

What do you mean by "know what a directory is"?


> Markdown is better. Microsoft Word is better. They just both lack certain things we need, and LaTeX has those things.

That comparison right there, between systems in entirely different categories, tells me you have no idea about what you're saying.


LaTeX is not slow or inconsistent.

You only need to run it multiple times if you use features that require it, and that is because one pass generates code to be read in a second pass.

The syntax is not ugly. It's a macro programming language.

If you don't like using vim (or emacs with AUCTeX) to write it, look at LaTeXila - it's pretty good. If on Windows, notepad++ is also free and has excellent support for LaTeX.


For one example, just off the top of my head, most any remotely technically-inclined young person in 2014 knows that you use Ctrl-Z/X/C/V/A to undo, cut, copy, paste, and select all. (Or substitute Command if you're on a Mac--same basic idea.) They've got that down cold, along with a host of other ubiquitous editing conventions--Ctrl-Left/Right to jump words, Shift-arrowkeys to highlight, Home/End to jump to the beginning/end of the screen line, etc etc etc. Basic text navigation and editing is a breeze, straight from muscle memory.

Then you try command-line Unix for the first time and nothing works anymore. You have to unlearn the habits of a lifetime before you can do something as simple as edit text. You can't even remap them effectively, because most of them do something else in vim or bash that you'd have to move elsewhere.

This is one of those problems that is just completely invisible to old Unix hands. It took me years to stop writing code in Notepad, because using vim felt like trying to drive a car with the gas and brake reversed, and someone punching me in the face every time I reach for the turn signal.

I'm not saying Unix conventions are objectively worse than Mac/Windows ones. Once you've got them trained into muscle memory, they're just as effective as anything else. But that need to retrain is a pointless roadblock in the path of every would-be hacker.


Unix was here before all those conventions you listed. So go back in time and tell Windows and Mac software vendors to stop making everyone change their muscle memory in order to use some new software...

But seriously, this is a complaint I hear from undergrads in CS all the time. My answer is the same every single time: stop whining and get over it. You're a professional now, you use professional tools and sometimes they work differently, that's just the price of admission. Nobody whines that driving an 18-wheeler is different from driving a compact car, they just suck it up and train for a CDL.


See, this is exactly the sort of thinking that got this thread started. Yes, obviously Ctrl-C meant Break before it meant Copy. I cut my teeth on a Commodore 64, but you don't hear me saying that Shift-Run/Stop is a superior way to end programs. Older does not mean better. Technology flows like quicksilver; no one says "you're a professional now, so learn COBOL 60 and quit whining." Professionals favor efficiency. Clinging to tradition regardless of drawbacks is not professionalism.

To play devil's advocate to myself, yes, obviously just switching everything over to GUI standards overnight would cause all sorts of problems. Tradition is not inherently valuable, but standardization is. But for fuck's sake have some basic human empathy. In the context of an individual career, having to unlearn and relearn how to edit text is a pointless speedbump that gets in the way of learning actual skills. Teach students what they need to learn to survive in the world, but recognize that their complaints are valid.


They might "whine" if you used the steering wheel on the 18 wheeler to control acceleration and the left and right pedals to steer though...


That's true, but I contend that the differences between Unix and Windows tools (because that's really what we're talking about here) just isn't that large. For instance, in most terminals you can paste into Vim with Ctrl-Shift-v provided you are in insert mode (although you may end up with weird indentation and such). You can copy with Ctrl-Shift-c. You can use the arrow keys to move around if you really want to do so.

So yes, there are differences, but the *nix people have gone to great lengths to bridge the gap, to the point that I have zero sympathy for people who can't make the switch (note that I'm not saying everyone needs to use *nix, that's a totally different debate, just that anyone who wants to should be able to).


That seems like a bit of a straw man: the official OS X native version of vim, MacVim, has literally all of those key mappings by default – cmd+c to copy, cmd+a to select all, etc.


MacVim is not official and it doesn't ship with any version of OS X. It's nice that third parties have developed tools to try and bridge the gap, but it doesn't really address my original concern.

Now, the OS X Terminal (and programs run through it, including vim) does support Cmd-C (after highlighting with the mouse), Cmd-V, and Cmd-A, because the OS can treat them as raw text without the terminal knowing anything about them. It doesn't support Cmd-X, Cmd-Z, or even basic navigation keys like Home and End.


I really don't understand why neutral statements of fact get voted down. Is somebody way too emotionally invested in the discussion? Somebody reflexively downvotes anyone who makes them examine their assumptions? That dude I told off last week stalking me? Misclick? It's just weird.


Add this to ~/.vimrc

    " Make it behave more like a (Windows) GUI editor
    source $VIMRUNTIME/mswin.vim
    behave mswin
and use gvim - the GUI counterpart.


Yeah, I mean it's not like 'more modern' alternatives haven't been created... They just haven't caught on among this group of people because this group of people values functionality and does not give a damn how old something is.

People have made fancy new shells built around things other than bytes through pipes, people have created new replacements for terminal emulators, and people have created fancy GUI document creation systems with fancy instant visual feedback (a.k.a: word processors). Real shells, real terminal emulators, and LaTeX are all still used despite the creation of these meant-to-be replacements.


Because many of the people who learned that old stuff were young when those tools were new and don't want to learn anything else.

Habits die very slow.


I'm not particularly old, but I prefer using tools that are considered "old". Furthermore from where I am sitting, there seems to be no shortage of older developers who are wed to the latest fancy IDE.

I don't think this is something that follows age lines. Whether or not somebody primarily has a background in the FOSS world or in the Windows universe is probably a much stronger indication of what sort of tools they use.


Well, I have a background in both.

Before getting my first UNIX contact, via Xenix in 1994, I was already fully comfortable with the GUI world of Amiga and later on PCs. Including GUI alternative environments like Smalltalk and Oberon.

So, although I do master the CLI, I rather spend my time in the cosy GUI world.


And because the shell utilities are more versatile and often better.


There is something better. It's called Org-mode, and it is part of Emacs, although that shouldn't scare you off immediately; I'll talk more about that shortly, but first want to mention some features you might find appealing. You can write LaTeX documents in it if you like, or you can write Org-mode documents in a far more humane syntax, which the editor understands on a semantic level, and which can incorporate LaTeX snippets where they're needed while otherwise insulating you from having to write LaTeX source by hand. (Although of course you can if you want to.)

Org-mode has exporters for LaTeX, PDF (via LaTeX), HTML, DocBook, PDF (via DocBook), various flavors of plain text, OpenDocument text (which can get you to Word if you want to go there), all of which work quite well; if you need to supply LaTeX source as input to some external pipeline, Org makes it possible for you to do so. On the other hand, if you prefer to spend as little time as possible dealing with raw LaTeX, which in my experience seems like a sensible attitude to take, Org makes that possible as well. And on the third hand, if you need to write LaTeX source directly for complex diagrams and such, or if you want to override the way Org generates LaTeX in specific cases but let it handle the rest of a document on its own, Org doesn't get in your way there, either.

You can also include code snippets in an Org-mode file, of arbitrary length and in arbitrary languages. If they're in a language Emacs understands, you can have it render them with syntax highlighting, but that's just the tip of the iceberg; you can also have Org-mode execute those snippets, whether ad hoc or as part of the compilation process, and it can incorporate their results into the document verbatim or after a variety of transformations, and also use those results as input to other code snippets in the same document. This is why the literate programming folks love Org-mode as much as they do; there's nothing quite like it for interweaving text and code and then being easily able to do interesting things with both.

And, although Emacs is by far the most powerful editor for Org-mode files, there's nothing saying you have to use Emacs to edit Org-mode files. You can, for example, use Vim to write your Org-mode source, probably with some degree of highlighting via some vimscript file or other. You can then invoke Emacs in batch mode to compile your Org source into whatever format you want to come out the other end, without ever having to see or deal with the Emacs editor interface, and without having to involve yourself save on the most superficial level with Emacs Lisp. (I'll be happy to provide the general form of such an invocation, if you're interested; let me know.)


I think the problem is we are typesetting at all. By typesetting I mean for the printed page. Most of the result is PDF which is then emailed around. It's a PITA to write, a PITA to generate, a PITA to distribute, and then a PITA to read.

For the one percent (perhaps) of the documents which really end up on paper, I guess LaTeX is fine. Or Word. Who cares?


This should not be the top comment, it is a rant that misses the point. Neovim maintains backward compatibility where it makes sense to do so. Neovim is NOT a rewrite, it is a thoughtful refactoring that aims to achieve one of Vim's _original_ goals: first class support for embedding. And better interoperability.


I think you miss the point of my rant :P I'm not saying everything should be rewritten (though I do believe everything in C should be rewritten in C's successor which doesn't exist yet), but I'm saying Vim has gone too far in maintaining backwards compatibility, and I wish it had improved interoperability instead, even at the expense of some backwards compatibility.


>I do believe everything in C should be rewritten in C's successor which doesn't exist yet

Both golang and D have a reasonable claim to being C's successor. People often think of D being a nextgen C++, but to me it has much more the feel of a C with garbage collection and batteries included.

And golang definitely feels like a modern C, at least to me. And that's what Rob Pike had in mind, as well.

PS: Had to use Word every day for years for my job. I despise it, and love vim. Vim keyboard navigation would have made my life so much easier when working on Word, and saved me several severe episodes of carpal tunnel. Luckily, I no longer have to use Word.


Go has absolutely no claim to being C's successor. It is useless for all the things it makes sense to use C for. That's why it has attracted virtually no C programmers. Go is python's successor if user influx is anything to judge by.


That's a fair argument. Neither Golang nor D give you the control that C gives you. If that's your yardstick, then the only viable candidate at this point is Rust. But Rust doesn't feel or look anything like C, but more like C++ or maybe Scala.


It's not just the lack of control; the bigger problem is features that have been added in. There are major use cases for C that Go, by its nature, cannot fill. It cannot do bare metal programming or real time applications without major changes to the language standard. Go aims to be a superior systems programming language to C but not a successor to C. No one will be doing audio codecs or device drivers in Go, but I would consider it over C for a greenfield system daemon.


> It can not do bare metal programming or real time applications without major changes to the language standard.

TIL that these are required for text editors and related tools.


They could, if:

- Go's unsafe package offered the same capabilities as Oberon's SYSTEM

- It offered a bit more control over when GC takes place

Any AOT native compiled language can be used in place of C for user space code. There is hardly any C feature essential for these types of applications.

C can be relegated for kernel space until something better gains more market share.


I don't see anything shameful about using old software if it works well and suits your needs. Yes, new editors like SublimeText have advantages, and if you prefer them over Vim, use them. However, I've seen many projects undergo a rewrite in order to move to new technology, but they never matched the utility of the old technology they were trying to replace. I'm not saying it can't happen, but that it often doesn't.

I don't use Vim much now because I work a lot in Windows, and Vim got me into the habit of hitting escape when I was through entering text, but escape cancels the input in a lot of programs. However, I still use it in the shell or when I need to make certain global edits.

I wish Neovim great success, and I might use it someday if it meets my needs, but not just because it's new technology.


Sorry, I didn't mean you should be ashamed about using old software. What I meant was that we should be ashamed that other people are using old software.

I am helping my girlfriend to learn Javascript, and it hurts inside that every once in a while she gets confused with something, and I have to explain to her that it's because the author of the language made rookie mistakes/dumb decisions 20 years ago.


You may know this, but the author of Javascript was under severe time pressure, and wanted to just implement Scheme but got overruled by management.


Yeah, it's unfair of me to pin the blame on Mr. Eich with that 'the author' remark. Javascript definitely was no small feat and has withstood the test of time very well. My point lies more in the '20 years' part of the argument, that in all that time we couldn't get these wrinkles ironed out.


>My point lies more in the '20 years' part of the argument, that in all that time we couldn't get these wrinkles ironed out.

Javascript has the requirement of backwards compatibility. The limitation is not technical. It's either disingenuous or extremely naive to suggest it's so simple to replace these things.

Look at the recent HTTP redirect article for example. Something as simple as redirects have been implemented incorrectly for a long time. Browser vendors are well aware of it, but they cannot change the behavior because it will break every existing site that expects the broken behavior.

You should be more impressed with the things that last 20 years, not embarrassed. It means they were actually engineered well enough to be a good general solution.


I don't see tinco saying it is simple to replace these things. Where do you see that? I suspect you're unconsciously framing tinco's position this way...

Another possible way to interpret the 'ashamed' statement is that tinco is simply saying we should be trying harder to move forward when we are using such old technologies with such warts. Why can't we get rid of the warts? We should try harder. Perhaps this is what tinco is saying? I think at least equally plausible to the naive position you're projecting onto tinco...that it's 'simple' to replace these things.

Kudos to Neovim for making the effort! It is certainly appreciated.


Or maybe Eich was a terrible programmer and it shows. Seriously, some of the JS stuff is full of major blunders. No amount of time pressure can be used as an excuse for such poor design.


> Vim got me into the habit of hitting escape when I was through entering text, but escape cancels the input in a lot of programs.

Does ^[ cancel input in any programs? AFAIK vim understands the ^[, not the "escape key". The key labeled "ESC" creating a ^[ is a convention used in terminal emulators.

Not that it's going to help you now, but if you had trained your fingers to hit ^[ rather than the escape key (or even ^C but that has other semantics in windows whereas I don't think ^[ does), I wonder if you would be having the same problem now..


^[ just generates ASCII character 0x1B, aka Escape.

Vim understands "escape" to be 0x1B, pressed by itself (not as part of a longer escape sequence).
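The arithmetic behind that is simple: in a terminal, Ctrl-<key> sends the key's ASCII code with the high bits masked off (& 0x1F), and '[' is 0x5B, so Ctrl-[ yields 0x1B — the same byte as the Esc key. A trivial Python check:

```python
# Ctrl-<key> in a terminal transmits (code & 0x1F); '[' is 0x5B.
ctrl_bracket = ord('[') & 0x1F
print(hex(ctrl_bracket))        # → 0x1b
print(ctrl_bracket == 0x1B)     # → True: identical to the Esc byte
```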


Not practical. ^[ is a two-key combination, so more RSI-inducing. And you need escape a lot, especially in plain vi.

And the ESC key has been generating ASCII 27 on keyboards for a long time. It's definitely not just terminal emulators.


>I don't use Vim much now because I work a lot in Windows, and Vim got me into the habit of hitting escape when I was through entering text, but escape cancels the input in a lot of programs. However, I still use it in the shell or when I need to make certain global edits.

some people discourage the use of the Esc key and prefer to use Ctrl-c instead. I'd recommend disabling it and forcing yourself to use the Ctrl-c combo. I did, and completely forgot about the Esc key (actually I find it more comfortable)


I have a habit of typing 'jkjk...' in command mode when I'm thinking, and eventually I got sick of stretching my fingers to escape into command mode so I just used 'jk' as my escape sequence.

    imap jk <Esc>
Of course now in non-vim editors I sometimes have to undo the occasional jk sequence... But my point is that the escape key can be whatever people want it to be.


Use ctrl-c only if you never want to be able to use a count with an insert command, e.g. 10a=<Esc> .


We as developers in the Open Source community should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators.

What?! You can take my vim+bash+tmux+urxvt when you pry it from my cold, dead hands!


> 2014 > not zsh


I used to use zsh (including a giant, customized zshlovers .zshrc) but I switched back to bash when I started administrating a few different servers. There's just something to be said for feeling totally comfortable and productive with a default environment. I'm also quite comfortable using plain old vi -- something a lot of other vim users can't stand.


Bash has also come a long way recently. I suspect a lot of the excitement over zsh comes from mac users who are stuck with what is now an ancient version of bash.


as long as bash doesn't get the terminal sizing 100% right all the time (and thus display a proper entry line/prompt all the time), I'm not going to use it. It always happens eventually if you spend a long time in the shell. There used to be a myriad of little bugs causing this. Not sure how many are left now.


I spend pretty much all day every day in the shell and haven't had problems like that in a long time. I think maybe because at some point the checkwinsize option got turned on by default, at least on ubuntu.
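For reference, that option can also be enabled explicitly in ~/.bashrc if your distribution doesn't turn it on by default (bash-specific):

```shell
# Re-check LINES/COLUMNS after each command so the prompt wraps correctly
shopt -s checkwinsize
shopt -q checkwinsize && echo enabled
```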


>implying

>not on /g/


i so wish any of the terminal emulators in osx had the scriptability that urxvt has. i miss it all the time :/


I've been frustrated by the lack of iterm2 features in Linux terminal emulators: local autocomplete, tmux integration, etc


I've been frustrated with the lack of Konsole in any other term emulator.


Sure, how is it going back in 1974?


> Look at SublimeText, it's got 1% of the features of Vim, yet it's converting Vim users left and right, by its sheer usability.

You're confounding usability with "do not need a manual". They're not the same thing at all. Moreover, I reject entirely your assertion that vim is not usable.

vim is amazingly usable; text manipulation commands for example all follow a common structure: [action][times][position]. That means the rules you learn for using "yank" apply just as well to "delete" or "paste" or whatever. Anytime you encounter a new text command you don't even need to wonder how to use it; it's a sure thing it will follow the same structure that you already know.
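A few concrete instances of that composable grammar, for illustration (standard normal-mode commands, annotated):

```
d2w    delete the next two words
y3j    yank this line plus the three below it
c$     change from the cursor to the end of the line
>ap    indent the paragraph around the cursor
```

Once you know one operator (d, y, c, >) and one motion or text object (2w, 3j, $, ap), every other combination comes for free.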


>You're confounding usability with "do not need a manual". They're not the same thing at all. Moreover, I reject entirely your assertion that vim is not usable.

If by usable we mean "one can be productive in it" then yes that's correct, vim is immensely usable and I've been using it in that sense for six years.

That however is a facile definition of what most people typically mean when we speak of 'usability'.

A tool can be useful while only being barely usable at all. You can do wonderful things with modern airplanes, but most models require painstaking attention and study.

I'm personally of the opinion that we can have a terse input command language while also exposing the rest of the functionality in a way that doesn't punish all users.

I spent all of university using vim purely as a way to indent and colour my code; but intelligently using splits, buffers, tabs, code folding, ctags etc should not require luck to discover and exacting patience to use.

Let's not even get started on the totally inscrutable vimscript. How many more useful things could we have if I didn't have to depend on @tpope for 90% of my plugins?

For many years vim trudged along because it was the only other game in town beyond emacs. Now, I only continue to use it because of sunk costs.

So yeah, nerds have to get over their sacred knowledge. Just because you somehow enjoyed reading the :help pages (which… require you to know how to use vim to read!) doesn't mean we have to force everyone to do that.

Or we could all move to Sublime.


> If by usable we mean "one can be productive in it" then yes that's correct, vim is immensely usable and I've been using it in that sense for six years.

Usable means an interface which is (i) consistent and (ii) well suited to the task at hand. It does not mean you should not require any training nor that you should be able to intuit the functionality without looking at the manual.

vim is an example of software that is both highly usable and intended for use by advanced users.

> I spent all of university using vim purely as a way to indent and colour my code; but intelligently using splits, buffers, tabs, code folding, ctags etc should not require luck to discover and exacting patience to use.

You're doing it wrong. Read the bloody manual.


> Usable means an interface which is (i) consistent and (ii) well suited to the task at hand.

No it doesn't. Usable as in Usability is defined in the ISO-norm 9241-110 by seven principles, a summary can be found here: http://www.userfocus.co.uk/resources/iso9241/part110.html

"suitability for the task (the dialogue should be suitable for the user’s task and skill level); self-descriptiveness (the dialogue should make it clear what the user should do next); controllability (the user should be able to control the pace and sequence of the interaction); conformity with user expectations (it should be consistent); error tolerance (the dialogue should be forgiving); suitability for individualisation (the dialogue should be able to be customised to suit the user); and suitability for learning (the dialogue should support learning)."

vim falls short in discoverability and, arguably, in user expectation and suitability for learning.

Though I don't want to further spawn a vim-discussion. Just to show you that you don't seem to know the necessary vocabulary to tell your parent that he is wrong.

> You're doing it wrong. Read the bloody manual.

Yeah. Way to prove his point.


> vim falls short in discoverabilty and, arguably, in user expectation, and suitability for learning.

vim fits very well all the criteria you just cited. Moreover, "discoverability", "user expectation" and "suitability for learning" appear to be buzzwords that you just threw in on your own at the end.


They are very real things, and relatively well understood. By calling them "buzzwords" that were just randomly "thrown in" you don't really make a good case for your understanding of usability.


Nope, they all are part of the ISO. I cited two of them, and the first, discoverability, is a consequence of self-descriptiveness and learnability. Don't know how you got the idea I threw them in as buzzwords when they are part of the quote...


You have to admit, compared to most any tool out there, vim has one of the worst newbie experiences, given its popularity.

Even a terminal window allows you to type help to get something to work with. VIM replies with "Left, end of word, Right, paste after" which in most cases is nearly a NOP.

I believe what he is trying to say is that it would make sense to figure out how to make the initial experience less daunting without giving up the power it provides.


The first thing that shows up in a VIM session is a small introductory text that includes

    type   :help<Enter>  or   <F1>  for on-line help
And of course the GUI also has the familiar Help menu as most common GUI apps.


Oh, yes, because showing the help entries first when entering the application clearly is the right idea.

And surely that is why all the text-entering applications in the world have followed vi's lead.


not to mention that vim became hip only in the last couple of years.

saying it's converting vim users left and right is a little far-fetched. most of those users never had any real vim muscle memory to begin with. a lot of "vim" users i saw work just fine without motion commands. the only reason they used it was because they could use it anywhere. whether it's modal or not is totally irrelevant to them.


> not too mention that vim became hip only in the last couple of years.

Wait, what? Vim has been in use for a lot longer than that (it was released in 1991), and vi even longer (1976).


"Hip" does not mean remotely the same thing as "in use".


Yes, but emacs was far more popular than vim for a while.


No, you're changing the meaning of usability. It doesn't mean it can be used, but it means that it's easy to learn and use. And vim is powerful but it's not a good example of usability.


> No, you're changing the meaning of usability. It doesn't mean it can be used, but it means that it's easy to learn and use. And vim is powerful but it's not a good example of usability.

"Easy to learn and use" does not mean "I don't have to learn anything". In particular, vim is easy once you learn something about how it works with text. For example: "d" is used to delete text. It is used as part of a command sequence that usually goes:

[action][times][position]

One of the first things you learn for example is that "dw" deletes to the end of the current word. d5w deletes to the end of the next 5 words. "yw" copies to the end of the current word. "y5w" copies 5 words.

Later you might learn about cursor movement. You find out that "j" for example moves the cursor down one line. Guess what? These cursor re-positioning commands fit into the command structure you've already learned. So you can easily intuit that "d5j" will kill everything to the end of the next 5 lines.

Later in your vim journey you might learn that "D" is shorthand for delete to the end of the current line. Hmm, but there is also this "y" command that yanks text. Guess what? "Y" yanks to the end of the line!

That's what usability looks like. You take what you've learned and apply it in a new context in order to produce expected and consistent results.

In short: vim is amazing. It's consistent, intuitive and immensely usable -- but only after you invest some time learning how to use it. The only time vim is not intuitive is when your expectations are that it will behave somewhat like TextEdit or Notepad or whatever other lesser text manipulation tools are out there.
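A few combinations that fall out of that single [action][times][position] rule (annotations mine):

```
dw     delete to the end of the current word
d5w    delete to the end of the next 5 words
y5w    yank (copy) the next 5 words
d5j    delete this line and the next 5 lines
>5j    indent this line and the next 5
```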


ring ring ring

"yeah, what's up?"

"Nagios is paging, something with that code you pushed live a couple hours ago is throwing a fit... looks like half the web cluster is on the verge of going to swap, can you look into it? asap!"

"Well, I'm at the bus stop and won't be near a computer for at least an hour... yeah, I'll sort it out, will text you when we can push the fix live."

pulls out Note 3, shitty 3G coverage

connects to VPN

ssh into dev env with connectbot

switches to hackerkeyboard for input mode

screen -dr dev

Ctrl+c, tail join IRC channel that dumps syslogs from production cluster

sees error

Ctrl+c,vim the/file/causing/the/unexpected/problem

fix, :wq

git stash, git pull, git stash apply, git commit, git push

Ctrl+d, exit

sends text message "good to go!"

deal with it.


Better: mobile off. Fix the issue in the morning when back in the office.


    git commit -m "Always commit your changes first!"; git pull --rebase; git push
And I use tmux, it's more modern ;)

I like the syslog into irc thing, that's neat.

By the way, I recently made a commit in the Github web interface while sitting on the toilet. No posix stack required ;)


Moral of the story: Vim is very good for deploying fixes from bus stops.


Moral of the story: This post has devolved into inane commentary.


%s/bus stops/everywhere/g


I am honestly amazed at how feistily every 'hacker' here is defending LaTeX in response to your comment. I agree with you for the most part.

This whole "it's apples to oranges" claim is bullshit. Both Word and LaTeX can be used for anything from writing a resume, to making a technical paper with a few simple equations, to writing an entire vector or linear algebra textbook. When you have two options for doing the same task it's very much comparing apples to apples.

Yes, they each were originally built with different objectives and different target markets, but they have since aspired to be usable for all the same things. Of course, they each have their strengths and weaknesses.

But I don't get how a bunch of folks here feel justified in arguing against someone saying "there's room for improvement" with clear, viable suggestions on the details too. I know that hating on MSWord (or MS-anything) is fashionable on HN. But you can't seriously be defending

* the steep learning curve: give your 14-year-old who's doing a math assignment Word vs your favourite LaTeX editor, and see what happens;

* lack of WYSIWYG-like feedback-loops: you have to wait a full minute on decently sized documents to see the result of adding that equation that took you 10 seconds to add;

* syntax holdups: you miss an underscore and your document is broken. All the way through. And it can take an hour to fix even if you're skilled. People writing documents that need LaTeX's power aren't always coders, and debugging is not a fun or planned-for activity for anyone.

I also don't for the life of me see why someone has to "be an expert, not just written a couple documents casually in undergrad" to be able to comment on the flaws of a product. They're as much a user as a power user, and obviously the power users went through that stage too. Losing sight of your past difficulties (or being gifted in LaTeX) doesn't make them irrelevant for people not in your current position.

Loosely relevant: I smell hardcore "hacker entitlement" around people using LaTeX and defending it including all its stupid barriers to easy document-creation tooth and nail. Hardcore programmers do this with way too many things (like Linux, and development environments for almost any language). And it's just really frustrating for a lot of people to deal with =/

PS: NeoVim sounds like someone finally got off their high horse (or from whining) and decided to actually make something better. Thanks!


>* syntax holdups: you miss an underscore and your document is broken. All the way through. And can take an hour to fix even if you're skilled.

I'm sorry, but I just have no idea what you're talking about. I haven't encountered anything like that - not when I was starting out with LaTeX, not now when I use it for writing technical docs. "An hour to fix" smells like pure FUD to me - any mistake can take an hour, a minute, or even a week to fix. However, I can't imagine a syntax error like missing an underscore - especially one like that - could take more than a few minutes at the very most.

>PS: NeoVim sounds like someone finally got off their high horse (or from whining) and decided to actually make something better. Thanks!

I think you're a bit confused as to what exactly the "hardcore entitled hackers" are defending. Lots of systems in the UNIX ecosystem are bogged down by backward compatibility, are arcane, or are just plainly in need of a rewrite. Nobody disputes that. The commenters here however attack them from a completely different position, and with arguments which mostly don't hold water - so don't expect them to go unchallenged.

Hackers aren't the kind of people who would cling to their tools despite them having obvious and easily mitigable flaws. After all, this post did gain a lot of traction, at this point mostly by simply presenting what would be great to do.


> This whole "it's apples to oranges" claim is bullshit. Both Word as well as LaTeX can be used for anything from writing a resume, making a technical paper with a few simple equations, to writing an entire Vector or Linear Algebra textbooks. When you have two options for doing the same task it's very much comparing apples to apples.

So can HTML. Would you compare Word to HTML? You can also write your resume in Notepad. Does that make Word comparable to Notepad? If you want to compare anything, compare LyX to Word. At least that makes some sense, even if the output of the latter just sucks in comparison to the former.

TeX is at its core a typesetting engine. Word is largely a GUI markup editor. It doesn't even do anything coming close to typesetting.

> * the steep learning curve: throw your 14-year-old who's doing a math assignment Word vs your favourite LaTeX editor, see what happens;

Are you sure? Customizing LaTeX can be hard, but using it in a straightforward way is no more difficult than HTML. Sure, it may be a bit more work than using a GUI, but once you get the hang of it it's a lot quicker than Word's formula editor.

> * lack of WYSIWYG-like feedback-loops: you have to wait a full minute on decently sized documents to see the result of adding that equation that took you 10 seconds to add;

In my experience it's a lot faster than that. The reason for this is simply that TeX does actual typesetting. This has a certain computational complexity, and cannot just be done on a line-by-line basis. Word is fast because it doesn't do anything like that. Again, LyX has WYSIWYM (what you see is what you mean), which gives you an approximation of the output instantly, minus the proper typesetting.

> * syntax holdups: you miss an underscore and your document is broken. All the way through. And can take an hour to fix even if you're skilled. People writing documents that need LaTeX's power aren't always coders, and debugging is not a fun or planned-for activity for anyone.

It more commonly takes a minute, not an hour. Ever had Word corrupt a file and mess up the styles in the middle of your 150-page document? That's not exactly a quick fix either.

To sum up: if you want a WYSIWYM editor for TeX, use LyX. Otherwise, comparing LaTeX and Word makes about as much sense as comparing HTML and Word.


> So can HTML. Would you compare Word to HTML?

I don't mean to be difficult, but lots of laypeople actually compare HTML and Word for a common purpose, usually for making simple websites. Yes, people actually do generate their front-end page formatting using Word's HTML export. Again in that context, it's technically comparable (or worth comparing, rather). And they each have their flaws. I have used HTML/CSS for lots of documentation, and experimented with it for styling my resume even though those can be classically considered Word's domain. So I actually think comparing HTML and Word is really not that strange for a non-zero number of contexts. Typesetting via "formatting" is a pretty big part of the Word experience, and is a totally worthwhile context in which to compare it to LaTeX.

> Customizing LaTeX can be hard, but using it in a straightforward way is no more difficult than HTML.

I used LaTeX for every document I created in undergrad, but mostly because I wound up spending hours the first week of every course tuning a document to match the lab report/homework format, and there was a legacy of standards passed down from upperclassmen (ie plenty of nice handholding). After I left college, I've pretty much never wanted to open a LaTeX editor (or Word really, for that matter, because I prefer Google Docs).

Typing into LaTeX really is very straightforward, and is far superior for equations and such, yes. But starting from a blank page is a nightmare without lots of handholding (or experience) to make your document legible. To some extent, so is HTML/CSS, for lots of things. A lot less so with Word. This is a bit of a contrived example, but I think of this as similar to running a "Hello World" alert window in python+tkinter vs javascript+Chrome on a Windows computer. It's amazing how difficult the initial setup and configuration in the python route is (you should try it on Windows if you haven't) compared to the "write code, refresh browser" in the js route. Even though the code is possibly less friendly in js, it is actually easier to get up and running with far less handholding. There are just so many other things slowing down the python+Windows experience and making it frustrating for a beginner. That's how I feel about TeX vs Word.

I would bet money most middle schoolers, if given a blank LaTeX editor page and a blank page in Word/GDocs to write just a bunch of math formula (some fractions, arithmetic symbols and some exponents maybe?) would give you something better and sooner with the latter.

> In my experience it's a lot faster than that.

This unfortunately wasn't true for me, or for a lot of other people. Maybe large amounts of images, or the wrong image formats, or numerous other causes can make exports slow. I don't know the details of why, but it was there, and it sucked. Maybe a side-by-side editor + constant re-render in a custom output format (one optimized so TeX renders it way faster than making a pdf) would be pretty cool, for instance. I don't need to export to pdf after every tiny edit, after all.

> It more commonly takes a minute, not an hour.

I'll completely admit I got a little overdramatic by that point =]. Sure, it's more commonly a minute, but it's still full of much more "Badness 10K" than I ever had Word throw an Error dialog at me. And in my (weak) defense, I have wound up spending an hour fixing a LaTeX bug from a missed closing brace, and I couldn't create the document at all. It would have helped if I could have noticed it the moment it happened (as opposed to after my next recompile, which took 30 seconds). I feel constant debugging shouldn't really have a place in document-writing. Perhaps better error tolerance (a la HTML) and/or debugging would be nice.

To sum up: I'm not trying to put down LaTeX. But it has shortcomings, which can be addressed. I also wouldn't suggest making LaTeX function like other WYSIWYG/WYSIWYM editors, but rather attempting to address the root problem, i.e. the feedback-loop times. LyX tried to do that, but WYSIWYM wasn't the right answer (for all the people who still use regular LaTeX editors instead).

Well, I actually don't even write documents very much anymore, so maybe I've been pretty disconnected from the whole ecosystem for a while, but from what I gather the tools haven't really evolved much at all in the last 4 years.


The first time I used LaTeX was when I was around 16 and did a math competition. I first tried to use Word to typeset equations and it was a nightmare to use. So I looked around on the Internet for an alternative and found LaTeX. I think it has a very low barrier to entry; all you really need is

    \documentclass{article}
    \begin{document}
    Hi.
    \end{document}

All the additional commands can be learned on the fly, if you have a reference at hand. I'm sure there are commands in Word for creating a title page, a table of contents, a bibliography, an index, correctly numbering equations and updating references if you add additional ones, but once I discovered LaTeX I never looked back and have not used Word in a long time.
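For instance, the automatic equation numbering and reference updating mentioned above come almost for free (a minimal sketch; the label name is made up):

```latex
\documentclass{article}
\begin{document}
\begin{equation}\label{eq:euler}
  e^{i\pi} + 1 = 0
\end{equation}
Equation~\ref{eq:euler} is renumbered automatically if another
numbered equation is inserted before it.
\end{document}
```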


Have you ever tried to write a math/physics paper using MS Word? It is completely broken compared to LaTeX + a decent text editor (vim, emacs, Kile, whatever). Writing equations in Word is pain, inserting a bibliography is pain, inserting figures is pain. Printed results do not match the screen version, and if you need to collaborate with someone you have problems with diffs and different versions of Word. Another use for LaTeX is making slides for presentations. Word or Markdown can't be used for this. And if you use PowerPoint for slides you are in trouble. Most probably your Greek letters in formulas will be invisible, or the software will crash in the process.


> We as developers in the Open Source community should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators. (Yes, it gives me shivers just thinking about how much each of those technologies sucks when you think about how good it all could be.)

Let's deconstruct each of the pieces we are supposed to be ashamed about, shall we.

LaTeX the language: a wrapper language built on top of TeX for typesetting documents. The idea being to specify higher-level document logic (for instance sections, subsections, paragraphs, tables) and, if all else fails, fall back to explicit placement commands. I fail to see anything to be ashamed of in the idea. And, as for your complaints about extendability in some of the comments below: maybe you should look at the texlive distribution or PGF/TikZ (http://www.texample.net/tikz/).

LaTeX/TeX compiler: that's a fair point. But, doesn't have much to do with the language.

vim: a text editor wrapped around using the keyboard efficiently, similar in idea to LaTeX the language in that it asks you to express yourself on a higher plane. Neovim is essentially a more maintainable repackaging of the same model keeping extendability in mind.

bash: a wrapper over the early Unix model: pipes, ttys, processes (not threads), etc. The bash scripting language is not all it could be. But, as I said before, the basic model is fairly tied up with Unix itself and is similar in zsh/fish etc.


(My post got long, so I've put a standalone version, with better markup rendering, here: http://gist.io/9161898.)

[TikZ] is a perfect example of TeX's missed potential: it is an outstandingly well designed DSL for diagram creation, embedded in an insane macro language. The moment you try to do anything nontrivial that exploits the programmatic (rather than purely declarative) nature of TikZ you immediately run into wall after wall trying to express basic programming concepts in the host language, TeX.

For example, two common patterns I have seen frequently arise as perfect uses for a programmatic diagram description:

- using computed coordinates and transforms to construct complex paths and diagram layouts from the composition of basic geometric reasoning;

- using abstraction to tersely encapsulate common visual components, both for simple iteration over many similar components, or for higher-order encapsulation of parameterized diagram logic.

In both cases, you quickly run face first into fundamental limitations of TeX as a programming language. In particular, it is extremely painful to use for either arithmetic or control flow. This is a big enough deal in practice that core among TikZ's features are custom inline arithmetic syntax for coordinate computation, color blending, etc., and a custom `\foreach` macro, both defined not in the language, but provided as part of a specialized diagramming library, because they are fundamentally at odds with the design of the core TeX language. Even this impressive bit of TeX engineering still breaks down as soon as you try to do much more than iterate over a hard-coded constant range. (How much do you need to bracket, `\relax`, etc. the operands of your `\foreach` if it is used inside of a macro definition? Or if they are the result of computation, even something as simple as basic index evaluation like `\x+1`? Now what if you want to iterate over a range whose bounds are computed in part using floating-point arithmetic from one library or another?)
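For a flavor of the pattern being described, here is a minimal sketch (the drawing itself is invented for illustration): `\foreach` with computed coordinates, where even index arithmetic already needs the special `evaluate` escape hatch.

```latex
% A minimal sketch: five nodes on a circle, connected in a cycle.
% \foreach and [evaluate=...] come from TikZ/pgffor, not core TeX.
\documentclass{standalone}
\usepackage{tikz}
\begin{document}
\begin{tikzpicture}
  % place five numbered nodes, computing each polar angle inline
  \foreach \i in {0,...,4}
    \node[draw, circle] (n\i) at ({72*\i}:2cm) {\i};
  % connect node i to node (i+1) mod 5; the index arithmetic
  % requires the [evaluate] option rather than plain TeX
  \foreach \i [evaluate=\i as \j using {int(mod(\i+1,5))}] in {0,...,4}
    \draw (n\i) -- (n\j);
\end{tikzpicture}
\end{document}
```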

TeX suffers because it is at once both a relatively awkward markup language (relative to something like Markdown) and an extremely awkward programming language. Many tasks which would be trivial in any mainstream programming language are outrageously challenging and arcane in TeX. The LuaTeX effort to embed a sane, modern programming language much more centrally into the core of the TeX runtime is a fruitful direction, but it still does little to better formalize or structure the various levels of document representation for programmatic transformation: we still have a giant stack of TeX macro complexity, ultimately expanding down to very low-level page rendering descriptions, just now with the ability to register Lua callbacks at various points along the way, or use Lua scripts to generate new tokens during the expansion process.

As a clever, minimalist hack, the single-pass macro expansion semantics at the core of the language were a great way for one man to bootstrap a complex typesetting system. As an expedient hack to build an incrementally more humane document generation system atop the powerful low-level typesetting engine already available in TeX, LaTeX got a lot of mileage for relatively little cost. As an intermediate representation for modern typesetting systems, TeX could be a reasonable higher-level alternative to something like PostScript (which few bother to write by hand, but is a great low-level page description language, particularly as a target for machine generation). But as either a primary programming language or a human-facing markup language, TeX is a terrible fit. Worse, the extreme difficulty of doing anything parametric in TeX makes it a bad fit for a future of many display formats and adaptive (responsive) layout, where tools focused on baking a single paginated output format are less and less relevant.

The evidence for how different the world could be is not given by plain Markdown for comment boxes—that is, indeed, an apples to oranges comparison—but by the power that comes from creating an extensible and programmable document transformation system based on a well-defined document grammar (which is, in fact, a recursive data type, not tied to any one front-end syntax). That's the heart of [Pandoc]. It's not just "extended Markdown" or "a markup format converter," it's a powerful framework for document transformation, which naturally supports humane markup, while also allowing extreme extensibility via general-purpose programming languages. See the [scripting] documentation to get a sense for the model of extensibility, then recognize that this same document representation is the core of every translation and transformation Pandoc does, and how this model enables many of the same things as TeX's macros, but in a far more structured yet also programming-friendly way.
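To make the model concrete, here is a hypothetical sketch in Python of the core idea: a document as a recursive data type, transformed by walking it. The node encoding and names are invented for illustration and are not Pandoc's actual API (real filters typically operate on Pandoc's JSON AST).

```python
# Hypothetical sketch: a document as a recursive data type.
# A node is (type_tag, contents); leaves are plain strings.
# The encoding and names here are invented for illustration.

def walk(node, action):
    """Apply `action` to every node, rebuilding the tree.

    `action(tag, contents)` may return a replacement node,
    or None to keep the node and recurse into its contents.
    """
    if isinstance(node, list):
        return [walk(child, action) for child in node]
    if isinstance(node, tuple):
        tag, contents = node
        replacement = action(tag, contents)
        if replacement is not None:
            return replacement
        return (tag, walk(contents, action))
    return node  # a leaf (plain string)

def emphasize_todos(tag, contents):
    # One transformation: wrap any TODO string in an Emph node.
    if tag == "Str" and "TODO" in contents:
        return ("Emph", [(tag, contents)])

doc = [("Para", [("Str", "TODO: finish this"), ("Str", "done")])]
transformed = walk(doc, emphasize_todos)
# transformed == [("Para", [("Emph", [("Str", "TODO: finish this")]),
#                           ("Str", "done")])]
```

The point is that every transformation (and every front-end syntax) shares this one well-defined tree, rather than being entangled with macro expansion.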

[TikZ]: http://www.texample.net/tikz/

[Pandoc]: http://johnmacfarlane.net/pandoc/

[scripting]: http://johnmacfarlane.net/pandoc/scripting.html


Thanks for that extremely insightful analysis. Haven't looked at pandoc before but will do.


I agree with almost everything--except for backwards compatibility when it comes to throwing stuff out. There is definite value in it, especially when we're talking about something as old and widely used as vim!

The thing about compatibility is that it doesn't matter how great your new and shiny is if it can't talk to old things at all. Imagine the best new text editor in the world, but it's completely incompatible with every file format that's ever been written previously, even .txt! No one would use it, because the cost of switching is too high.

In some cases, it's definitely worthwhile to have some development pain to keep that around. Stuff like keeping vimscript (and therefore the entire ecosystem around vim) functioning means that neovim could be a drop-in for some people, and HAS to be preserved at least partially. Stuff like keeping ancient Amiga support or integration with Sun workshop? Definitely not worth it anymore.


    I agree with almost everything--except for backwards compatibility when it comes to throwing stuff out. There is definite value in it, especially when we're talking about something as old and widely used as vim!
This is the part where people are surprised, but I am an extremist. I definitely think backwards compatibility should be among the lowest priorities when making decisions on improving software, especially and perhaps just, in Open Source.

Just think about it. What happens when we decide a certain core feature of Vim is archaic, and could be implemented in a more modern, more flexible, more loosely coupled way?

First, every vim plugin would break. But if you look at how many Vim plugins are actually used, I would be surprised if the top 100 Vim plugins covered less than 99.99% of all Vim users. That means we only have to upgrade 100 Vim plugins, and probably just a couple of lines if they're written well.

How much technical debt have we accrued now, that in reality could be paid off just by biting the bullet and going through a hundred pieces of software and changing a couple of lines?

This is open source, we could just do it. (tm)


As a user of software - fuck you, and may you never touch anything I use. It's juvenile oo-look-shiny nonsense like this that makes us people who just want to get shit done waste yet another afternoon on patching working stuff after a (forced) update because some wet behind the ears dude decided that everybody before him was a utter moron.

There is a reason people pay for software that just works, and will work for the next few years. It's because they're fed up with being the guinea pigs for architecture astronauts and code purity fetishists.

Also, "that means we only have to upgrade..."

Who is "we" here? You? Of course not, you're too busy chasing the next fad, while the rest of us have to repair shit just to make it do again what it already did before.


Haha, thanks roel. I'm so glad to be relieved of the duty to try to improve things you use...

And let me extend my apologies in the name of all the authors of software whose extra features and nice bug-freeness you enjoyed at the terrible cost of going through the gruelling process of having to upgrade your software.

Perhaps all us developers of modern software could get together and raise funds so you could get a refurbished pink iMac G3. Before all those pesky Apple people started their code-purity-fetishist, backwards-compatibility-breaking migration to a BSD+Mach-based OS X.


Agreed, the OP was harsh in response, but the point stands regardless. Your cavalier attitude in this thread, and overall post, makes me question your maturity as an industrial-grade programmer, but that moves towards ad hominem instead of healthy debate... this is not a slight, and I will agree I lost the debate before it started.

Do keep in mind that modern software, as you state, makes great tools for finished workflows but offers very few options for those who create the tools themselves; IDEs, for example, but not limited to that concept. This leads to my point:

To paraphrase the OP, maybe some things should not be fucked with for "code purity" in the name of functional purity. If you vehemently disagree, make a better product. Critical mass will displace the failed ideology, ultimately leading to a better technology. All the flame wars in the world won't change this fact and ultimate outcome.


Thanks for not digressing :) To address your concern, yes I am not very mature as an industrial grade programmer. I have plenty of professional experience, but there's also plenty of experience still to be had.

I do understand my comments are inflammatory, and did not expect at all they'd be upvoted more than downvoted, but that seems to be the case.

Your standpoint is the one many in this thread have, and the one I understand the least: this idea that "code purity" and functional purity are different.

The only reason LaTeX cannot be improved is that its code is inaccessible. This means its functionality remains rigidly stuck in the 80s. In my opinion, there's nothing functionally pure about 70s/80s software. Back then, functionality followed from hardware restrictions.


That's pretty funny. OSX has far more backwards-compatibility with old software than OS <= 9. Heck, it even comes with vim and an xterm-compatible terminal preinstalled.


Haha, that's funny indeed. I hadn't looked at it that way :)


First, every vim plugin would break. But if you look at how many Vim plugins are actually used, I would be surprised if the top 100 Vim plugins covered less than 99.99% of all Vim users. That means we only have to upgrade 100 Vim plugins, and probably just a couple of lines if they're written well.

Please don't invent fake statistics in order to support your argument. Vim's plugin ecosystem has a long tail distribution[0]. After just a cursory glance at vim.org's script repository you'll notice that even the 141st ranked plugin[1] has over 10,000 downloads. To find the first plugin with less than 1,000 downloads, you'll have to drop all the way down to sql.vim[2] which happens to be ranked 1501!

Personally, I happen to use (daily) multiple plugins which are ranked outside the top 1000.

[0] http://en.wikipedia.org/wiki/Long_tail

[1] http://www.vim.org/scripts/script.php?script_id=483

[2] http://www.vim.org/scripts/script.php?script_id=905


Please do not invoke factual information that contradicts the myopic view of the world as I see it. K thanx bye.

(Sarcasm aside, ^this)


I think you miss a major thing here: if you put vim with just its top 100 plugins up against Sublime Text, I think it will easily lose. Why would I even use that? It's because of the long tail of thousands of plugins (your 1% usage) that I choose vim/emacs (more specifically, each power user chooses it because of a set of many of those top 100 plugins plus a handful of weird, obscure plugins that fit their needs perfectly.)


Maybe I'm wrong. I don't think I have any non-top100 vim plugins installed. I could be underestimating the problem by a large margin.

I sort of think that if just the top 100 plugins were compatible, the authors of the non-top-100 plugins would be motivated to migrate, as a sort of herd-mentality thing.


This is the same kind of short-sighted programmer selfishness that leads to idiotic ideas like "let's abolish time zones and just everyone use UTC". Do you realize how many man-years of other people's time and effort you're proposing to spend adding zero value to that software?


The problem is you broke compatibility now. Will you do it again next year, and fix 100 plugins once more?

I am also not a big fan of extreme forms of compatibility, but plenty of projects have tried to break it, only to have people stick around with old versions for years.


Big projects like Python only break compatibility once every few years, and a lot of thought goes into it. I think it's a lack of respect for the authors of Python, and perhaps a lack of authority on Guido's part, that packages are not being upgraded to Python 3.

I wonder what would happen if just a few companies that actively use Python, like Google, hired a few devs who for a year would only fork and fix Python 2 projects. Wouldn't that just solve the problem? I bet they'd be done within a couple of months too, and they wouldn't even have to be super-senior types.


Almost all major Python packages have now been upgraded to Python 3. For example, only 3 or 4 of the Top 50 have not been upgraded, and all but 1 of those are in the process of updating.

It's now mostly just the thousands of smaller packages that still are on Python 2 only.

But Python couldn't get away with that more than once every 10 years.


Maybe I would have upgraded to python 3 by now if I saw any compelling reason to do so (like, say, getting rid of the GIL).


Large scale proof why this is a bad idea: Python 2.x vs Python 3

Those of you who don't know about this mess, it shouldn't be hard to find some rants by googling. Basically Python 3 broke backwards compatibility with python 2 without providing any compatibility layer at all, they just expected every third party library to switch "sooner or later". It's now 10 years down the road and nothing has really happened. Everybody who gets shit done is still using 2.x because they need the old libraries.


Actually, I see Python as a good example of why you want to change. Python 3 has much cleaner concepts than Python 2: strings are not byte arrays, generators are used by default, etc. At the other extreme, take a look at C++: lots of ugly-as-hell syntax and pitfalls because of decades of backwards compatibility. See https://stackoverflow.com/questions/6939986/c-nested-constru... for an example of both.
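To make those two "cleaner concepts" concrete, here is a small sketch (my own illustration, running on Python 3) of the differences mentioned, text vs bytes and lazy ranges:

```python
# 1. Text (str) and bytes are distinct types; no implicit byte-array strings.
s = "naïve"                    # str: a sequence of Unicode code points
b = s.encode("utf-8")          # bytes: produced only by explicit encoding
assert isinstance(s, str) and isinstance(b, bytes)
assert len(s) == 5             # five characters...
assert len(b) == 6             # ...but six bytes: "ï" encodes to two bytes in UTF-8

# 2. range() is lazy (like Python 2's xrange), not a materialized list.
r = range(10**12)              # allocates no trillion-element list
assert r[999] == 999           # values are computed on demand
```

In Python 2, `"naïve"` would be a byte string whose length depends on the source encoding, and `range(10**12)` would try to build the whole list in memory.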

> Everybody who gets shit done is still using 2.x because they need the old libraries.

And, btw, I hold each and every one of them responsible as part of the chicken-and-egg problem.


> Look at SublimeText, it's got 1% of the features of Vim, yet it's converting Vim users left and right, by its sheer usability.

I just looked it up, $70 is way out of my pay range. I'll convert when I have a full-time job.

> We as developers in the Open Source community should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators.

Yes, you should be ashamed.


You can use the free version. It's full featured but has intermittent alerts asking you to upgrade[0].

[0] Recent vim convert.


Sublime is only free-as-in-beer. Some use vim because it is also free-as-in-speech.


"I just looked it up, $70 is way out of my pay range. I'll convert when I have a full-time job."


> (Yes, it gives me shivers just thinking about how much each of those technologies sucks when you think about how good it all could be.)

I can tacitly agree that using vim running in a terminal to write LaTeX is not the easiest way in the world to write LaTeX.

However, terminals do not suck. Bash sucks a bit, vim sucks in ways this project aims to fix, but terminals are absolutely awesome.

A terminal is a dead-simple API for interacting with the user. Want to display text? Write to fd 1. Want to read some input? Read from fd 0. Want to control the cursor or change the colour? Write a special sequence of characters to fd 1, that's supported by every regular terminal emulator. There is even support for getting mouse events. Yes, it's text-only but that is enough in a lot of cases. And it's very simple. And you get it for free the moment your program starts. There is no opening connections to X, no initialising GUI libraries, no callbacks, no need to design a GUI. It's just there. It's always there.
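A tiny sketch of what that API looks like in practice (the escape sequences are standard ANSI/xterm ones; the helper names are just for illustration):

```python
import sys

CSI = "\x1b["  # ANSI Control Sequence Introducer, understood by ordinary terminal emulators

def move_cursor(row, col):
    """Sequence that moves the cursor to (row, col), 1-based: ESC [ row ; col H."""
    return f"{CSI}{row};{col}H"

def colored(text, fg=31):
    """Wrap text in an SGR color sequence (31 = red) and reset attributes after it."""
    return f"{CSI}{fg}m{text}{CSI}0m"

if __name__ == "__main__":
    # Want to display text? Write to fd 1. Want cursor control or color?
    # Write a special sequence of characters to fd 1. That's the whole API.
    sys.stdout.write(move_cursor(1, 1) + colored("hello, terminal") + "\n")
```

No connections to open, no libraries to initialise: the "GUI" is just bytes on stdout.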

Try coming up with a GUI system that has an API as simple as that. The only thing I can imagine is something that consumes HTML written to fd 1 and renders it nicely. But note that HTML can and will be malformed, and malformed HTML can and will be parsed differently by different "HTML terminal emulators". Plus, if you want to do that, just write javascript and stick it in a .html file.


So Sublime text provides a 99% feature loss but a great usability gain?

Can you explain that a bit?


I don't think anyone would dispute that Sublime Text has a much nicer GUI than anything Vim has. It's just lots of little things, like how scrolling is smooth and quick, not entire lines popping into/out of view. The zoomed-out source map is also nice.

It's also ever so slightly snappier. Vim seems to be ever so slightly less responsive when I have several panes open.


Well, I can explain how I use it. I have my Sublime in legacy mode, which enables some vim-like controls. And I benefit from Sublime's super fast text rendering, its nice fonts and colors, the way it integrates with my desktop environment, and how fast it interacts with plugins that spawn popups and analyze my code.

And then every once in a while I need to do some cool text transformation and it doesn't work, so I launch a terminal, edit the file in Vim, and work my magic. Then I exit Vim and continue in Sublime. Ideal? No... but it's the usability of Sublime with the features of Vim... every once in a while.


> And I benefit from Sublimes super fast text rendering, it's nice fonts and colors, the way it integrates with my desktop environment, how fast it interacts with plugins that spawn popups and analyze my code.

The sentence quoted above sounds a lot like something out of a product promotion.


Well, when users are doing that, you can be sure that the competition will be in a world of pain!


"99% feature loss" sounds terrible but vim is actually way too featureful, in the sense that all that shit gets in the way of just using it as a text editor. The amount of silly pointless knowledge you need just to move around lines and save a file is a real burden.


"I hate the ideas many programmers have about backwards compatibility, unless it affects me." – FTFY

> "There is nothing holy about Unix era software, chances are it's shit and a lot of it should be thrown out."

1. we're still very much in the "Unix era"

2. There's nothing holy about software at all. Rules shouldn't be broken, but rules can be broken.

> Look at SublimeText, it's got 1% of the features of Vim, yet it's converting Vim users left and right, by its sheer usability.

If anything I'd say Vim is converting TextMate/SublimeText left and right. Most SublimeText users are former TextMate, and many new Vim users are former SublimeText users.

> We as developers in the Open Source community should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators

Speak for yourself. I don't personally write LaTeX but I still very much use the shell, and Vim.


Oh, it's a real flamewar and people still seem to be here, so I want to add a little constructiveness to it. Vim, bash, LaTeX. Let's set vim aside, as the whole topic is [supposed to be] about it. As for bash, we have enough alternative shells (zsh, fish); many people (me too) have tried them and still chosen bash (although I use IPython quite often). And I don't know of a sane and fully functional alternative to LaTeX. (Markdown and co. don't count, as they don't offer the same control over the output.)

So let's stick with LaTeX. There are two categories of people: those who think that LaTeX is fine for its use cases, and those who think that something is wrong and could be done better. Fine is fine, so let's concentrate on the second category. I'll recite the main complaints: slowness, horrible syntax, arcane source code, and the impossibility of separating semantics from representation. (Did I miss anything?)

I guess many of us are able to program. So why doesn't somebody, right here, right now, create a github/bitbucket repo, post a link to it, and explain what exactly, in your opinion, could be done better? I personally would probably participate if somebody had a plan sane and explicit enough.


But I want to write in my text editor in a terminal emulator because I have access to all of the built-in operating system utilities.


Um, I've never heard of SublimeText. I like vim because it is cake to ssh into a server and quickly / efficiently modify a text file.

And there is nothing shameful about developers accommodating my desire to use a command-line text editor to create/edit TeX files in whatever terminal I am using.

Normally I create/edit TeX using LaTeXila, but it is fantastic that I can do so with vim; when I don't have access to LaTeXila I use vim for TeX editing, and I am quite glad I can.

With software that works in a standard shell environment, I can ssh into a networked computer from my Android tablet (Better Terminal Emulator Pro and Hackers Keyboard) and do actual work. Far more convenient than carrying my laptop everywhere and also faster than checking out source, firing up a GUI, and then committing the source.

Terminals ARE good technology and we would be worse off without them.


I can't believe I read this whole thread... I need to go take a shower now...


I fully agree.

For me, Vi and Emacs are only for when I don't have an option in terms of IDE support, even though I know at least Emacs quite well.

What can I say, the world of Amiga GUIs, Smalltalk and Lisp environments spoiled me as an IDE fanboy.

It baffles me why in 2014, some developers still prefer to work as if UNIX System V had just been released.


I started with an IDE and then moved to vim. Not because it's "cool" but because it's productive. There are several reasons.

First off, movements. Almost any IDE now has some plugin to imitate vim-style keyboard shortcuts, but they honestly suck at it.

Second, it's fast and lightweight, which surprisingly feels more important than I thought before. I don't have to wait for a Java-based IDE's GC to stop running or something.

Third, I can bind any action to any key, and I can invoke anything from vim itself to automate text processing, make any visualisations I need, anything. IDEs also provide some mechanism for that, but it's way more complicated and obscure.

But the key reason is simplicity, really. I know what happens under the hood, because I manually enabled the plugins I want and set my shortcuts. Vim serves me, not the other way around. Back when I used to write programs in Java I tried IDEA and switched to NetBeans, because IDEA is way too "smart". It tries to do everything automatically, which is OK until it fails. And it did fail sometimes. But when it fails, it's me who has to find out what it messed up. So I prefer to write code myself (or let scripts I explicitly authorized do it), not to fix something my IDE broke. This is especially uncomfortable on old, messy projects with lots of legacy code.

Is vim perfect? Well, for me it's not. Because of vimscript, and because it doesn't have a mechanism for working with non-text information (for example, it has to open another window to show me rendered LaTeX). Sometimes I even think of switching to emacs. Plugins that do advanced processing (like rope for python) are quite often slow. But I still prefer it to any IDE in most cases. It's not perfect, but it's the best thing I've met so far in this imperfect world.


Definitely try Emacs. Preview LaTeX is nice, and Rope is also there. Also, I find it easier to set up the system to do all text editing in Emacs than in Vim. I mean that one can use Emacs to write mails and chats, edit all types of code, write and post blog posts, web pages, and so on.


> It baffles me why in 2014, some developers still prefer to work as if UNIX System V had just been released.

Because (a) some of us aren't convinced that things like IDEs give a better view of what's going on in the system; (b) some of us think that fundamental tools like terminal emulators, editors and so on should be debugged and stable because we have to earn our mortgage payments using them and chasing the shiny isn't on our job description; (c) any actually useful new idea winds up in Vim anyway.

You want to use a new shiny unproven editor? Grand, have a ball. But don't ask me to purely on the basis that it's new and shiny.


IDEs aren't new and shiny; I have been using them since the early 90's.


(a) To some of us, that's not that long ago :D (b) Vim is an IDE. No, seriously, if you can run the debugger, the compiler and a plethora of other tools from within it, it is an IDE. You might not like it, but... (c) IDE, DDE, doesn't matter. Some of us work faster in IDEs, some in DDEs. Now, if you don't grok what's happening from your source down to the metal, that matters.

(d) When did I specifically pick out IDEs as the sole example of new and shiny? Or did you just read my comment without parsing it?


What features does an IDE give you?


- Visual representation of the code structure

- Ability to select any symbol and find semantic uses of it

- For OO code, being able to visualize the OO graph usage of certain symbols

- Refactoring across the whole project with semantic knowledge (no, search/replace does not cut it)

- Navigation in third-party libraries deployed in binary format

- Code completion for static/dynamic languages, while showing tooltip documentation

- Graphical visualization of data structures on the debugger

- Background compilation with static analysis

- Unit test debugging infrastructure

- Integration with modelling tools

- Integration with SCM tooling and being able to interact with it directly from the editor. For example, generating a blame file, with navigation across the file revisions.

- Integration with continuous integration servers

- Integration of the developer workflow with task management servers

As an example of such developer workflows: have the IDE talk to Jira, edit the code, automatically bundle it in a workflow that binds the code changes to the Jira issue being worked on, and get a Jenkins notification after the code is checked in and has gone through the CI system.

Sure, you can get part of it in Emacs after spending a week configuring plugins with different levels of maturity, and in the end it is still mostly textual.


> We as developers in the Open Source community should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators.

I agree. Bash sucks. ;)


> I hate the ideas many programmers have about backwards compatibility, that it's more important than development speed and modern concepts.

Please clarify by what is meant by "backwards compatibility," since this is a loaded phrase. On the one hand, it's the idea that software system B can do everything that earlier software system A can, using the semantics of software system A (e.g. "vim can do everything vi can, using the semantics of vi"). On the other hand, it's the idea that software system B is semantically incompatible with software system A (e.g. SublimeText vs vim), but software systems A and B can coexist without breaking each other. While I agree that achieving the former idea of backwards compatibility isn't always important (particularly in the realm of text editors), achieving the latter is an absolute must. If there is one thing I will not tolerate on my systems, it is a piece of software that breaks other working software that I rely on.

> There is nothing holy about Unix era software, chances are it's shit and a lot of it should be thrown out.

Like it or not, we're still in the Unix era, and likely will be for some time. Unix is more than a specific implementation or even a specific API--it also includes the logical abstractions behind them. You still deal with files, directories, processes, threads, pipes, sockets, dynamically-linked libraries, paging, etc., whether you're on Unix or Windows. This means SublimeText is "Unix era software" too--it's built on top of the same Unix-era abstractions as vim. Perhaps the only Unix abstraction vim makes use of that SublimeText does not is the TTY, but that doesn't stop me from using vim where TTYs don't exist.

And you're right, most Unix era software is shit. But don't limit yourself to Unix era software--most software in general is shit. Very few pieces of software become as robust and widely-used as vim, and SublimeText isn't going to make vim disappear anytime soon (especially since it doesn't break working vim installations).

> We as developers in the Open Source community should be ashamed people are still using Vim to write LaTeX in Bash running on terminal emulators.

Feel ashamed of me then, since I do exactly this. Why? Because there isn't anything sufficiently better for what I need to do. I have given every single WYSIWYG editor that claims to do a better job a good-faith trial, and I have always gone back because each contender always lacked the ability to do something I needed to do.

I'm not saying it's impossible to do better; it certainly is. It's just that I've seen nothing that actually is better. I would address this myself, but I have too many other, more important things to work on (like finishing my PhD thesis).

> (Yes, it gives me shivers just thinking about how much each of those technologies sucks when you think about how good it all could be.)

Code doesn't write itself :) Software systems don't design themselves either :) What you're asking for is a very, very tall order. Many have tried to replace that which is "good enough," and yet we're still in the Unix era despite their efforts.


I'm not sure how I feel about this. On the one hand, I've been thinking about it for quite a long time already; a newer, shinier Vim is something I secretly wish for. On the other hand, I find it quite problematic. The main problem of Vim is Vimscript. So a "newer, shinier Vim" is Vim with a real programming language instead of Vimscript. But it's impossible to remove Vimscript: it wouldn't be Vim anymore. It affects not only scripting itself, but also how we interact with it in command mode, not to mention the plugins we already have.

And here we have a project. Virtually nothing is done; the previous commit was 20 days ago, and the last commit, an hour ago, adds the fundraiser link. I mean, the author doesn't seem to be fanatically enthusiastic. And the plans are quite ambitious. Maybe I'm too pessimistic, but I have a bad feeling about this. I'm thinking instead: "Hm, maybe it's better to write an open-source version of Sublime Text?"


I am enthusiastic about it; that's why I started the fundraiser. To finish the first iteration fast I need to put my freelance work on hold.


> And here we have a project. Virtually nothing is done, previous commit was 20 days ago, and the last commit is adding the fundraiser link an hour ago. I mean, the author doesn't seem to be fanatically enthusiastic.

I understand the skepticism, but the FAQ[1] gives a pretty good explanation for this and shows that this guy has some familiarity with vim and motivation for taking on this project.

[1] https://www.bountysource.com/fundraisers/539-neovim-first-it... (near the bottom)


Of all the reasons to keep Vim, Vimscript is the one I'd least expect to see on a list! The obtuse nature of Vimscript and clunky Ruby/Python interop are, IMHO, a huge wart on an otherwise fantastic editor.


I think the GP acknowledges that Vimscript is awful: "The main problem of vim is Vimscript."

I agree. vim the editor has a good UI: it is fast to work with and makes me productive. Vimscript is bad. I cannot grok how to write complex things and simple things often break too.

The GP is pointing out that Vimscript is essentially the command mode. The two are one and the same. Try switching Vimscript for, say JavaScript, and now you have to either use JavaScript in the command mode (yuck!) or create some bridge between the command mode interpreter and JavaScript (yikers!). So unless we rip out the command mode out of vim, we cannot replace Vimscript.


This is essentially where I'm going with ix. Vim with an awesome haskell-like language underneath whose syntax is optimized for doing powerful stuff in command mode.

A prototype is at https://github.com/jayferd/ixl-prototype


Neo Vi Improved. We suck at naming things :)

Yeah, I have mixed feelings as well. I love my Vim, don't touch it. And yet, vimscript does suck, and better separation of core and UI could allow for very interesting things. Also, no one really cares about Vi compatibility anymore, and cleaning up the scary old C code base could make it much more accessible.

I guess we'll see.


I think it should be called Vimim, Vi Improved Improved


Only after it actually is.


Or Vim for short.

You know: Vim IMproved.


It really should.


"Hm, maybe better to write an open-source version of Sublime Text?" You mean, like Lime?

https://github.com/limetext/lime


redcar has been around for a few years, with compatibility with textmate bundles

https://github.com/redcar/redcar


Interesting. I find it a little comic that plist XML files have become the common language for exchanging syntax definitions between editors, due to TextMate.

I wrote a utility to convert these files to/from YAML so it would be friendlier to edit Sublime Text themes; editing naked XML plists was painful for me.

https://gist.github.com/swdunlop/8872387
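The core of such a converter fits in a few lines with Python's stdlib plistlib. This is my own minimal sketch, not the linked gist; to stay dependency-free it hand-rolls YAML output for a flat dict of string values instead of using PyYAML, and the sample theme keys are made up:

```python
import plistlib

# A toy XML plist in the style of a TextMate/Sublime theme fragment (hypothetical keys).
SAMPLE = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
 "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>name</key><string>Monokai</string>
    <key>scope</key><string>comment</string>
</dict>
</plist>
"""

def plist_to_yaml(data: bytes) -> str:
    """Parse an XML plist and emit its top-level key/value pairs as flat YAML lines."""
    d = plistlib.loads(data)
    return "\n".join(f"{k}: {v}" for k, v in d.items())

print(plist_to_yaml(SAMPLE))
```

A real converter would need to handle nested dicts, arrays, and the reverse direction, but the plist parsing itself really is this simple.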


Nice, but I would have to know a little more about their philosophy before deciding I'm interested in the project. Their front page shows something as closed as Sublime's: "This is what we're doing with this tech, it's gonna be great."


There's source code, a BSD license and the README describes their rationale right up front. In light of just the superficial information on github, your comment seems hyperbolic.

My interest in the project is in something with a design similar to Sublime Text that works remotely in an SSH session. There's also some exotica about wrapping Python in a Go process which I find interesting and terrifying.


Oh, I didn't know about that one. Now that seems interesting! Thanks for sharing.


Check out his fork of vim; he has already done a bunch of stuff related to multithreading. I'm guessing this fork only has build-system cleanup commits so far.


> But it's impossible to remove Vimscript: it won't be Vim anymore. It affects not only scripting itself, but also how we interact with it in command mode

Why do you feel that? Vimscript is not the same thing as ex; I agree that any true vim would need to keep strict ex compatibility, but vimscript itself is purely for writing plugins in.


Neovim fully supports vimscript and almost all existing vim plugins.


I think a modular base would mean that VimScript could be entirely replaced with a different kind of scripting language. I for one would love to see Ruby or Python take that spot!


You can already script Vim with those languages.

Vim needs to be built with the support enabled, though.

Python: [1] (more info [2])

Ruby: [3]

[1] http://vimdoc.sourceforge.net/htmldoc/if_pyth.html

[2] http://stackoverflow.com/questions/905020/resources-concerni...

[3] http://vimdoc.sourceforge.net/htmldoc/if_ruby.html


Vim can already be scripted with real programming languages; how did you miss that?


Thiago de Arruda (the author of Neovim) has also released a Vim fork with multithreading support (proof of concept):

https://github.com/tarruda/vim/

https://groups.google.com/forum/#!topic/vim_dev/65jjGqS1_VQ


A brave effort. Given your commit history, documented plan and great idea I was happy to back it [1] and hope you succeed. Best of luck :)

[1]: https://www.bountysource.com/fundraisers/539-neovim-first-it...


I greatly appreciate it, thanks :)


Thiago, if you read this, please do your fundraiser a favor and mention all the work and review that already went into your thread-safe message queue.

There is a lot of skepticism about your ability to deliver, but I think it's clear that you already have the experience needed.


To make sure that your message does reach him, try to reach out directly.

His contact information is here:

http://tarruda.github.io/


As someone who uses Vim to write articles and documentation in addition to code, I'd really love to see a richer UI with proper support for features like variable-width fonts.

It looks like the developer behind this refactoring effort has some really good ideas for decoupling the Vim engine from the user interface layer. It'd be great if somebody could build a really good cross-platform Qt-based UI on top.


That actually is planned for the first iteration of the fundraiser; see the FAQ.


That's really great! After reading the FAQ, I have contributed $100. Best of luck with the project!


Thanks a lot :)


Is there some way to donate via Bitcoin?


And via EVE Online isk?


Yeah, when it has a 10 Billion USD market cap, is accepted by Fortune 500 companies and leading Internet sites, and is celebrated by successful venture capitalists like Fred Wilson and HN's founder, get back to me.

(And you'll probably retort with the failure of an exchange that almost everyone in the Bitcoin community has been hoping would fail for the past 3 years).

But seriously, I'm pretty sure that you can exchange Eve online currency for Bitcoin, so I'd sure take it as a donation if someone offered.


"I'd really love to see a richer UI with proper support for features like variable-width fonts."

Would the variable-width fonts be simply for display with plain text being edited and buffered, or do you wish to be able to have rich text formatting within the editor (e.g. italic, bold, smart quote marks &c)?

Just interested: rich text formatting has been suggested for emacs I recollect.


For display purposes, yes. I want better text readability when I'm working on an article. A nice variable-width font, more line spacing, etc. I want my Vim buffer to look as good and be as comfortable to read as a native text editor like Byword.

For formatting, I mostly use Markdown. Ideally, I'd like to see the literal markdown for the paragraph under the cursor, but formatted text for the rest of the buffer. There's actually a script with a pretty good first pass at achieving that kind of Markdown support via Vim's "conceal" feature, but some of Vim's inherent limitations make it imperfect: https://github.com/tpope/vim-markdown/pull/9

I'm hoping that the way neovim decouples the UI layer from the underlying editing engine will make it easier to address issues of that nature.


I can see what you mean, although the flicking between markup and formatted text as you moved the cursor would annoy the hell out of me I think.

Now, would the cursor be a block over a character or a caret between characters? A lot of history riding on that one.

Jef Raskin's Canon Cat springs to mind with its in-place commands (but not Vim's modefulness).



It sounds like they have no interest in getting this stuff pushed back upstream? There's no mention of it on the home page.

At this point it smells kind of Emacs/XEmacsish. Hope they can rally immense development effort.


The author of NeoVim (Thiago de Arruda) tried to add support for multi-threaded plugins to Vim and has been stymied[1].

I'm not sure how to get a patch merged into Vim. Bram Moolenaar is the only person with commit access, and he's not a fan of most changes beyond bug fixes. My co-founder and I tried to add setTimeout & setInterval to vimscript[2]. Even six weeks of full-time effort and bending over backwards wasn't enough. Eventually we were just ignored.

I've contributed to a lot of open source projects, and the Vim community has been the most difficult to work with. I've been writing C for almost two decades, and the Vim codebase is the worst C I've ever seen[3]. The project is definitely showing its age, and I'd love for something new to replace it.

1. https://groups.google.com/d/msg/vim_dev/65jjGqS1_VQ/fFiFrrIB...

2. https://groups.google.com/d/msg/vim_dev/-4pqDJfHCsM/LkYNCpZj...

3. If you value your sanity, do not read eval.c. It is over 25,000 lines and has over 400 ifdefs. The first ifdef checks for Amiga; the second checks for VMS: https://github.com/b4winckler/macvim/blob/master/src/eval.c


Your patches were broken. Now your answer is to fork? Sheesh.

> The project is definitely showing its age, and I'd love for something new to replace it.

Then start over. Refactoring it will be nigh impossible.


Vim was created on the Amiga, so that's understandable.


Re #3. Dude - you broke my browser with that link :(


As the author, I would like to contribute changes back; search the vim_dev mailing list for 'message loop' and 'job control' and you will find two patches I've sent that weren't even commented on by Bram.

This fork changes so much that it's impossible it will ever be accepted.


I TOTALLY hear that.

Maybe a GCC/EGCS, eglibc/glibc, or (as nfm points out) YARV/MRI scenario could work. First prove your mettle (it will take donkey's years), then upstream can jump to your stable code base in one glorious commit.

Hope you can mention in the readme that upstreaming is not a goal... It might be hard to get the wording right, but it would have made my first impression quite a bit less skeptical. :)


Maybe we can hope for an MRI/YARV scenario!


The author has posted at least two different patches to the vim_dev Google group. I don't think Bram even commented on any of them. Many others did.


Bram shows no interest in these sorts of things. Frequently, devs post to the vim mailing list with a month or two of work on some new feature, and get no comment from the BDFL.


So why don't all these people get together and fork?


I have only superficially browsed the vim sources, some months ago. The vim source code is literally disgusting. The number of platforms vim runs on is close to Avogadro's number. It supports five or so languages for customisation. I always had it in mind to fork the thing, remove all those languages, and strip out the compatibility stuff so it would be a pure Unix program, but I never got around to it.

Also, the source repository contains binary files (.a, .dll), which I did not inspect. I always end up searching the web for an alternative editor whenever I browse the vim source repository.


The stripped down version is called 'vi' :)


Well, I decided I'll play with acme* editor a little bit, if I can't get on well with it, I'll probably go that way.

* http://research.swtch.com/acme


Yes, I was surprised by this, and no mention of the project on vim.org ...


> Migrate to a cmake-based build

> Legacy support and compile-time features

> Platform-specific code

Sounds like well-justified cleanup, although it's possible this project is underestimating the usefulness of feature selection.

> New plugin architecture

> New GUI architecture

Now that's cool and ambitious.

I wonder whether Bram has an opinion on this?


I'm a hardcore Vim user and I just don't see the point of this.

Not only that, but Vim is charity ware and requires the license to be included. The license is most notably absent from the Neovim fork. Wonder if that will be added any time soon...


The license removal was completely accidental, I will include it as soon as I get back to it.


As promised, here is the commit with the original Vim license -- https://github.com/neovim/neovim/commit/7cadf15eee1f377e9a7b...


So I read over the license and you're right, the license does need to be included. However I don't see anything wrong with the project as long as the license is included in its distribution. The license doesn't go against anything Neovim is doing, provided they include the license and provide the project source.


Therein lies the problem; the license is not being included. I find this a bit concerning since the default in forking a repo would have included it. So this gives off the impression then that the license was deliberately removed.

Just to make it clear (most people on Hacker News probably already know this, but for the sake of anyone who doesn't), it would look like this:

  hg clone https://vim.googlecode.com/hg/ neovim
  cd ./neovim
  git init && git add . && git commit -m "First commit"
  git remote add origin https://github.com/neovim/neovim.git
  git push -u origin master

There you have it; an exact fork from Mercurial push out to neovim on github, everything included. Getting those license files out of there took some manual steps somewhere in between lines 1 and 3.


I think it's likely that the license was accidentally removed as part of cleaning up the build system.


Let me just say that I recently had to build a 'new' UnixWare 7.1.0 system. I was stunned and delighted that I was able to compile vim 7.4 + then-current patches.

Couldn't compile git, python or a couple of other things I wanted, but at least I could use a reasonable editor.


That's awesome in a way. But this being Open Source you can just download the sources for Vim 7.4 in the future and use it on UnixWare 7.1.0.

While the rest of us (and maybe you, yourself) on Ubuntu 15.10 can use Neovim with all the cruft taken out.


Finally, someone took this step. I really appreciate his work.

Don't get me wrong. Vim is an awesome editor and it is the only editor I'm using now. I've been using it for decades and still cannot find an editor which can replace it. Sorry, emacs, I tried several times but failed. I know it's my problem but I'm too familiar with vim's short-cuts.

However, I do think we can still make some improvements to vim, especially to its plugin system. If you've written plugins for vim, you know what I'm talking about. For example, can you quickly tell me the differences between map, noremap, and nnoremap? How do you write a comment in vimscript?

To me, vimscript seems like a language patched by lots of authors with inconsistent goals. It's not as cohesive as Emacs Lisp. And there are lots of historical reasons why they do that --- I know, it's backward compatibility. But you have to move forward at some time.

With that being said, I do think it is necessary to have an editor which keeps the good parts in vim and improve it by not considering too much about backward compatibility. I'm so glad that someone did it for us.


There's viper mode.

Though, aside from a brief dalliance with emacs in the late 1990s, I've used vi / vim since 1987.


I tried Evil, which was supposedly better than Viper, and it got me 90% of the way to vim. But that last 10% kept me from staying (and also that emacs has way more typing/command lag than vim).


For those searching a combination between VI, Emacs and Haskell: https://github.com/yi-editor/yi


I actually had hoped "Neovim" would be a modern reimagining of a modal text editor, along one of two paths.

Path 1: An HTML5+JavaScript based modal editor where buffers are actually DOM trees, enabling both code and other data to be rendered/edited, and also offering much richer visual enhancements. JavaScript is already a great way to enable a rich plugin ecosystem.

Path 2: Modal text editing suited to mobile devices. For example, based on Python+Kivy, or even C with SDL, it would aim to be portable and to extend the modal interface to touch-based gestures and speech recognition. Akin to "verbal vim": entering vim-like commands with touch or voice gestures, text editing on mobile devices like tablets would immediately suck a lot less...


Can't tell if you are actually serious about the above or not, especially path 1.


I personally think path 1 is absolutely where editors should be going. Upvote from me.

NeoVim is a step in the right direction, but not far enough. The future is a more Light Table like environment (http://www.lighttable.com/). But Light Table (as visionary as I think its UI is) is written (and extended, AFAIK) in ClojureScript, a LISP-like language that nobody knows.

It it was written and could be extended in JavaScript instead, and configured to the extent that I could write VIM-like modal editor support for it, I might finally be able to give up VIM after 20 years for something better.


Getting a little ahead of ourselves, maybe?

"Nobody knows", lets just compare IRC. #javascript on freenode got 951 users. #clojure on freenode got 651 users. Even tho its not a good comparison, saying "nobody" knows clojure is very far fetched. It has a ever growing community, with several books published, and the whole point on Chris choosing Clojure shows enough of its maturity.


Search Amazon.com books:

ClojureScript: 10 results (only one of these actually has ClojureScript in the title)

Clojure: 89 results

JavaScript: 5,787 results

Sorry, there is no comparison.

I'm not saying Clojure sucks, I am just saying that it's not mainstream. ClojureScript even less so -- it's an offshoot of Clojure which was designed around the JVM, not browsers.


"I'm not saying Clojure sucks, I am just saying that it's not mainstream.", to be frank, thats not what you said; "... ClojureScript, a LISP-like language that nobody knows.".

Saying it's not mainstream is no problem, but it tells you nothing about the quality of the language. What I do find problematic is that you are changing your argument from one comment to another. Clojure is a language plenty of people know; judging by book counts is a poor comparison, since if we limited it to the past year and compared, the difference would not be as stark as you try to present it. Clojure also got 80 (?) books published in less than 2-3 years since its rather quick breakthrough. There is no need to even try to claim "nobody" knows Clojure, or that "it's not mainstream".


> ClojureScript even less so -- it's an offshoot of Clojure which was designed around the JVM, not browsers.

Clojure was designed as a Lisp friendly to its hosting environment. The JVM implementation is probably not purely accidental, but it's not the whole of the original design either.

And as I write this comment here I might as well add: non-mainstream languages are in many, many places a competitive advantage for those who use them. I know, use and frankly like many non-mainstream languages and the ROI of learning them was universally higher than with the most popular languages.


Even if it sounds like sacrilege to some, I am putting it out there for comment. Both ideas arose from the dissatisfaction with text editing, one in the browser, and the other in the mobile space where currently almost nobody uses a text editor.

Especially the "modal html5 editor" would be useful across a variety of websites, either built into the site or the browser itself. For example it would be nice to have plugins that work about the same in IPython notebook, Chrome/Firefox Devtools, Udacity, latex-as-a-service, web-based admin tools, webmail etc.

That being said I am not convinced these projects are worth investing time in. They are on my mind though.


Path 2 is actually something you could do with neovim. The editor's core and its UI would be decoupled, so you could put whatever UI you wanted to build on top of it relatively easily.


Ohh a modal vim with like wheel typing for taking notes on my iPad in meetings would be smexy


I really like what he's setting out to do here, but I was a bit put off by the fundraising setup. If the target isn't met, the donations still go to him, and there's no pledge to do anything in that case. Is that right, or did I miss something?


On the other hand the fundraiser is now over $8,000 which is enough for at least a month of work, I think it'll make the 10k in short order.


Ambitious. Best of luck to them--I've considered trying to do a full port to a language like Go or Rust, but I've never quite had the time (or self-hatred to try and handle quite so many corner cases as what I've seen in vim's code).



Sometimes people would like to have just an as-small-as-possible "classic" vi (like nvi, or the vi included in FreeBSD) to quickly run through a bunch of config files in a [remote] terminal. So vim-minimal works well for us.

I am not sure that libuv is what we need, and it seems like another "monster" is about to be born, like "modern" Emacs (which is "aware of" such crap as gconfig).

  schiptsov@MSI-U270:~$ ldd /usr/local/bin/emacs |wc -l
  85
"Do not want" meme.)


Good luck! This is a tough problem. Many have tried, and they all failed.

http://www.freehackers.org/VimIntegration


This reminds me of the now dead https://github.com/chrizel/Yzis

Even though it has been tried a few times before and never seems to catch on I really like the idea and would be down to help except I haven't coded C in 15 years and don't really have any desire to go back. I wonder if there are parts I could help with in newer languages. It's always fun to pick up a project to learn a new language.


The big difference here is that it's based on refactoring the existing codebase. While not the most glamorous job in the world, it's more realistic than attempting to recreate the product of 22 years of development starting from scratch.


I applaud your effort, but I worry that this step does not go far enough to make a better version of vim. However, I really think we can do better, and the famous, great editors (vim, emacs, acme, etc.) all have different features that make them great and unique. I prefer vim because of its modal nature, which might also be why I find it ergonomically superior (after remapping ESC to jf) - e.g., I don't like stretching my fingers.

On the other hand, I never really script vim, as I find vimscript just terrible. This is far better in emacs, as it uses a decent programming language.

The third aspect I think is not well designed is window/buffer management. In my opinion this part could be outsourced, or developed in connection with a tiling WM or a terminal multiplexer like tmux/screen (this part is best in acme).

A modern approach I thought about (Yet Another Text Editor Syndrome) would be a client-server architecture, with a server node storing buffers plus context information (filename, cursor, etc.) to which clients can connect. A client node could be, on one hand, a viewer (terminal or GUI based) doing fancy things like syntax highlighting, cursor control, and searching, and on the other a REPL (in any language) that just has to implement a bridge to a defined message protocol.
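A toy sketch of that split, in Python with a made-up line-delimited JSON protocol (every name and message shape here is invented for illustration, not any real editor's API):

```python
import json
import socket
import socketserver
import threading

# In-memory buffer store held by the server node: name -> buffer state
BUFFERS = {"scratch": {"lines": ["hello"], "cursor": [0, 0]}}

class BufferHandler(socketserver.StreamRequestHandler):
    """One JSON request per line, one JSON reply per line."""
    def handle(self):
        for raw in self.rfile:
            req = json.loads(raw)
            if req["op"] == "get":
                reply = BUFFERS[req["buf"]]
            else:
                reply = {"error": "unknown op"}
            self.wfile.write((json.dumps(reply) + "\n").encode())

# The server node runs in the background on an ephemeral port.
server = socketserver.ThreadingTCPServer(("127.0.0.1", 0), BufferHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A viewer client connects and fetches a buffer in order to render it.
with socket.create_connection(server.server_address) as conn:
    conn.sendall(b'{"op": "get", "buf": "scratch"}\n')
    state = json.loads(conn.makefile().readline())

server.shutdown()
```

Any number of viewers or REPL clients could connect to the same server and see the same buffer state, which is the whole point of the split.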


Lime text fits this description I think.


Cool project! It'd be nice if the build system was converted to GYP, so that it integrated well with libuv.


I think this is a great project. The author appears to be extremely competent. I pledged $20.


I think there are features (which the ability to embed a vim engine may enable) that would make it easier to improve one's usage of vim.

Real-time analysis of actions, with suggestions for better ways to do what you just did. Long-term analysis of actions, to see what the biggest time wasters are (i.e., you often run a certain command that requires a lot of typing, so maybe it would be better to make a function mapped to a leader-based command; or you tend to copy things with a range that you could have copied using a text object).

And that is what interests me about this. I really hope that separating GUI from the core and the exposure of streams will allow you to put an analysis engine between streams.


kickstarter this please. I (and so many others I guess) would love to support this.

I'm worried that a lot of good projects die off because everyone has mouths to feed. I suspect I have spent a large part of my adult life using vim, so I want to make sure a good successor comes through.

Take a look at LightTable, which was quite successful [1] in raising funds and delivering what it promised.

[1] https://www.kickstarter.com/projects/ibdknox/light-table



The more I think about this, the more I like it. I badly want my vim to be multi-threaded and a separate engine; one with a queue-based pub/sub interface especially could really open up some cool possibilities. At that point, if you could give it the ability to use operational transforms for changes, it would make remote editing, as well as remote pairing, work really well. I know floobits is doing something similar, but I would have to think this would make that much easier to pull off.


But what about the children in Uganda?


I would love to have Bram Moolenaar's input on this project.


I've just sent an email to the vim_dev@googlegroups.com mailing list. I asked his opinion on the Neovim project.

The subject for my email is "Neovim", so let's wait and see what his opinion is.



300 comments in 24 hours.

nothing better than people word-fighting about writing tools...

-bowerbird


Right, haha, like comparing tools that do the same thing and just justifying or proving that this one is better -_-


Wait till you see what happens when someone comes up with neoemacs.


First off, what happens if you don't meet the funding goal, do we get our money back?

I think a big reason for Vim's popularity is how ubiquitous it is. It's installed by default on most *nix operating systems. Even if I'm not privileged on a system, I can still pull down my dotfiles and have my familiar vim editing experience.

If I understand correctly, plugins written for Vim would be compatible with Neovim, but not vice versa?


As far as I'm concerned, just moving a vim clone to git is a huge improvement.

Ever tried building vim from the ports tree on a <1GHz machine? Painful.


I wish someone would try this again for Emacs :(


If anyone is looking for an alternative to vim, check out Textadept, which has both a curses UI version and a graphical UI version.


I think one of the most important things about this is the last bullet point: "Development on Github." I occasionally look at the vim-dev mailing list and see people attaching patch files to conversations. I think GitHub's collaborative development workflow could lower the barrier to entry for contributing a lot.


Any worry about latency with this new plugin architecture? I'm not quite sure how the current plugins communicate with the main process, but I am sure that one of the things I would most like to see in a re-write of vim is a more responsive interface, even if I have a handful of plugins running.


If anything, NeoVim will be much more responsive. Vim's current plugin architecture is simple: when plugin code is running, the UI is blocked. If your plugin has an infinite loop, Vim hangs forever. If your plugin does network I/O, Vim hangs until the socket read/write is finished. The only workaround is to use non-blocking sockets and abuse the CursorHold/CursorHoldI autocommands and 'updatetime'. That trick breaks the leader key, which is a show-stopper for many users. There are similar problems with syntax highlighting. Vim has its own (extremely inefficient) regex engine that blocks the UI while matching.
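For concreteness, the polling trick looks something like this (a hypothetical config fragment illustrating the pattern, not code from any particular plugin):

```vim
" Poll a non-blocking source whenever the user is idle for 'updatetime' ms.
set updatetime=250
autocmd CursorHold,CursorHoldI * call s:Poll()

function! s:Poll()
  " ... read from the non-blocking socket here ...
  " Feed a throwaway key to re-arm CursorHold. This is the part that
  " interferes with pending mappings and breaks the leader key.
  call feedkeys("f\e")
endfunction
```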


The regex engine was updated in version 7.4 and now is noticeably faster.


There's plenty of room for improvement. For example: if you run the same regex many times in a row, Vim rebuilds the DFA on each run. A cache of recently-used DFAs would avoid tons of wasted computation. Also, Vim doesn't free many regex-related data structures until you close the buffer they were invoked on. I had to work around that bug when writing a plugin, since it caused Vim to use gigabytes of memory.
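The caching idea is straightforward; as a rough illustration, memoizing the compile step (Python here purely for demonstration, with a hypothetical `compiled` helper -- none of this is Vim code):

```python
import functools
import re

@functools.lru_cache(maxsize=128)
def compiled(pattern):
    """Compile a pattern once; later calls with the same pattern hit the cache."""
    return re.compile(pattern)

# Running the same regex many times in a row now pays the
# automaton-construction cost only on the first call.
first = compiled(r"\d+")
second = compiled(r"\d+")  # cache hit: same object, no recompilation
```

An LRU bound keeps the cache from growing without limit while still covering the common "same pattern, many buffers" case.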

The best solution would probably be to switch to PCRE. Some Vim-specific stuff[1] would have to be special-cased, but Vim could get an order of magnitude speedup (and fix quite a few bugs) by using a modern, optimized, well-tested regex library.

1. Vim abuses regexes like nothing I've ever seen. You can even match line and column numbers with the regex engine: http://vimdoc.sourceforge.net/htmldoc/pattern.html#/\%l
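For anyone curious, those position atoms look roughly like this (a sketch based on the linked pattern docs):

```vim
" \%42l restricts the match to line 42:
/\%42l.*TODO
" \%>5c matches only after column 5:
/\%>5cfoo
```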


If you can make Vim seamlessly work with a REPL then this would be worth me upgrading (if all my current vim plugins still work).

It's the only reason I use Emacs + Evil + trying to get it to be as close to a Vim clone as possible. You know how cool it is to type ":" instead of M-x?


"If you can make Vim seamlessly work with a REPL"

Have you tried the ScreenSend vim plugin? I use it for Python and Clojure, and occasionally other REPLized languages.


Bram Moolenaar just gave his opinion on Neovim. See here for his answer:

https://groups.google.com/forum/m/#!topic/vim_dev/x0BF9Y0Uby...


I'm not sure about libuv. Isn't that primarily for async I/O? This is a desktop application. Async I/O isn't going to significantly affect performance. And it doesn't make shit any simpler.


libuv will be used mostly because it acts as a platform layer for many common system functions.


First-class support for embedding wins a pledge from me. I wish you the best.


if this ever gets off the ground there should be a complete spec sheet for the plugin commands so anyone can write their own vim core that interfaces with other guis and plugins designed for the original vim


Get this story and many more in top5HN Newsletter. Signup at http://top5hn.launchrock.co


Sounds nice, though I hope they keep support for more esoteric platforms. My only real choices on z/OS are vim, emacs, and the ISPF environment.


The improvements to the plugin system sound awesome - hopefully it removes the need to do virtually anything in vimscript...


Yes, anyone will be able to extend vim without knowing vimscript exists.


Good to hear that vim development takes on some speed again. Please keep compatibility with existing vim plugins.


I wish them the best of luck.

Vim is already an IDE.


$10000 seems to me a great amount for having this!


Long Live Vim!


Vim is already awesome as it is.


It seems the trolls have arrived.

https://github.com/neovim/neovim/pull/52



