Edit: for avoidance of doubt, I'm one of the founders of WriteLaTeX/Overleaf -- we've been providing LaTeX compilation-as-a-service for ~8 years now!
The bean counters don't have a problem paying Microsoft the amount of a small African nation's GDP per year, but I pay for my Overleaf pro account out of my own back pocket, and can't even claim it back on expenses...
This is ridiculous. How on earth has a serious university come to that? If you need a laptop or an external disk, do you also need to buy it yourself?
I have one piece of commercial FEM software that is only ever "licensed not sold" [annually] -- an annual license is ~£1k; a perpetual one ~£100k. It's quite clear that the company want to go in that direction. I hate it. I don't like subscriptions, and I get riled by central university finance departments every single time...
Ironically, laptops and hard drives etc are quite easy to purchase: they count as 'consumables' and I have never worked in a department that actually audits their lifecycle properly -- much like lab reagents, once they're bought, the university [or my interactions with it at any rate] doesn't seem to even care if the first thing I do is throw it in a bin. This is a Good Thing™ as far as I'm concerned. Having decent computing wherever I am is absolutely key to getting my job done.
Slightly more on topic, the other thing I would say is that quantitative departments are very good at teaching their students LaTeX, but not necessarily at teaching them to use it well. I won't exactly say that learning TeX made me a better physicist, but it definitely helped me communicate like a professional one. I interact a lot with doctors and bioscientists -- I basically work in medical imaging -- and trying to cross that divide is very hard; we forget that doing a physics degree gives you lots of transferable skills. Overleaf is excellent at providing a "user-friendly face" for projects that I can share with doctors -- they don't need to understand the code, nor have the distribution installed locally; they can just contribute to a paper in progress quite easily, and in particular its "track changes" feature is something that they like. My team and I tend to use git and % comments, but the value of a web UI is definitely there.
Debugging problems like those alluded to in the original article is definitely easier locally, however.
Notice that you can have both things here: each Overleaf project is actually a git repo that you can clone and edit locally.
I get the appeal of Overleaf for collaborating (I've never used it myself, but it seems like a great platform!), but what is it that you consider a massive improvement in Overleaf over git + local compiling?
You might like latexmk, which automates running the right tools in the right order. It's pretty neat: it detects whether to run bibtex or biber (and when), and whether another run of (pdf|lua|xe)latex is required to resolve references/citations/etc.
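A minimal config sketch for it (the values below are illustrative, not required — latexmk's defaults are sensible):

```perl
# Minimal ~/.latexmkrc sketch; all settings here are illustrative.
$pdf_mode = 1;                  # 1 = pdflatex; 4 = lualatex; 5 = xelatex
$bibtex_use = 2;                # run bibtex/biber when needed; clean .bbl on -C
@default_files = ('main.tex');  # what to build when no file is named
```

With that in place, `latexmk` alone rebuilds everything, and `latexmk -pvc` recompiles on every save.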
(1) The cognitive overhead of git push and pull. While these steps might have seemed small when I was using that workflow, I don't have them anymore, which makes them seem unnecessary in retrospect. Fewer steps are better in general, IMHO: the same reason I might use something like Google Docs, even though a roughly equivalent workflow would involve going via GitHub.
(2) Ensuring all local environments are similar or identical. Esp. a problem when installing new packages, say on device 1: when compiling on device 2, I need to either remember to replicate this by noting it somewhere (e.g. modifying the makefile), or I am reminded of it by a compilation failure.
Small gains perhaps, but they add up. Esp. if you are working on multiple documents at the same time (I typically have 2-3 active documents).
If it matters, I need to switch between three devices running Windows, Ubuntu, and elementary OS.
The other major argument is, of course, that you get an off-site backup "for free" for something as life-changing as your doctoral thesis, too.
†Of course, the other thing about having lots of local commits is that you can easily graph, say, words as quantified by texcount vs time and include the resulting diagram in the final copy of the document...
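A throwaway sketch of how that graph's data could be collected (assumes `git` and `texcount` are on your PATH and you run it inside the repo; `thesis.tex` is a stand-in filename):

```python
import re
import subprocess
import tempfile

def parse_texcount(output):
    """Pull the 'Words in text' figure out of texcount's default summary."""
    m = re.search(r"Words in text:\s*(\d+)", output)
    return int(m.group(1)) if m else None

def words_per_commit(tex_file="thesis.tex"):
    """Return [(unix_timestamp, words)] for each commit touching tex_file.

    Assumes git and texcount are installed; plot the result however you like.
    """
    log = subprocess.run(
        ["git", "log", "--reverse", "--format=%H %ct", "--", tex_file],
        capture_output=True, text=True, check=True)
    points = []
    for line in log.stdout.splitlines():
        sha, ts = line.split()
        # Grab the file's content at this commit without touching the worktree
        snapshot = subprocess.run(["git", "show", f"{sha}:{tex_file}"],
                                  capture_output=True, text=True, check=True)
        with tempfile.NamedTemporaryFile("w", suffix=".tex",
                                         delete=False) as f:
            f.write(snapshot.stdout)
            name = f.name
        counted = subprocess.run(["texcount", name],
                                 capture_output=True, text=True)
        words = parse_texcount(counted.stdout)
        if words is not None:
            points.append((int(ts), words))
    return points
```

Feed the (timestamp, words) pairs into gnuplot/matplotlib/pgfplots as you prefer.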
Great work! I have depended on ShareLaTeX/Overleaf for a long time to collaborate with other people on articles. It is especially great that your software is always available under the AGPL license.
We use mostly the git interface, not the web editor. Do you know why it is so ridiculously slow to pull/push? Like, 20 or 30 seconds when it manages to work?
If you have a lot of files or some large files, it can slow things down a lot. The git server is something that's due for a revamp, but as it stands the best thing you can do for now is separate out your LaTeX content and other project assets into different repositories. (We usually see this kind of problem when people have a bunch of non-document stuff in their Overleaf project.)
Thanks for the suggestion! Indeed, the largest repos are the slowest to update, even when we only change a single character in a small text file. This behavior is surprising because it is not what happens in git proper; some other part of the system must be introducing the delay.
I look forward to a normalization of Overleaf's git interface, most notably the ability to have regular files, symbolic links, and so on.
If you'd still enjoy following us, we're pretty active on Twitter, and to a lesser degree on fb/linkedin.
How many people actually look at the output file texput.log? What about using the family of \tracing* commands? Heck, most TeX engines drop you into a REPL as soon as they hit an error! How often do people make use of this, even minimally by using \show and \showthe to investigate state?
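For anyone who hasn't tried those, a minimal illustration (compile it interactively and watch the terminal; `\show` behaves like an error stop, pausing at a prompt):

```latex
% Run e.g. `pdflatex show-demo.tex` from a terminal, not in batch mode.
\documentclass{article}
\begin{document}
\show\emph          % print \emph's definition, then pause at the prompt
\showthe\parindent  % print the current value of the \parindent dimen
\tracingmacros=1    % from here on, log every macro expansion to the .log
Hello, tracing world.
\end{document}
```

At the `*` prompt you can type further commands, or just press Enter to continue.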
In my experience it seems like people (reasonably) come to TeX/LaTeX viewing it as a sort of advanced word processor. This is unfortunate, because if you approach it more like a full-fledged programming environment, then it ends up feeling a lot more friendly, if a bit archaic, IMHO.
When you're on a deadline, the frustration is real though. I got fed up enough that I bought a hard copy of the TeXbook and spent a week-long deep dive just trying to grok plain TeX. By the end, I ended up liking it enough that I re-wrote my master's thesis in plain TeX, with nice hand-crafted macros for section and equation numbering, etc. This actually made troubleshooting interesting and tractable for me.
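For flavour, a toy version of the kind of macro I mean (a sketch, not the actual macros from my thesis):

```latex
% Plain TeX: a hand-rolled, auto-numbering section macro.
\newcount\sectionno
\def\section#1{%
  \advance\sectionno by 1
  \bigskip
  \noindent{\bf \the\sectionno.\enspace #1}%
  \par\medskip}

\section{Introduction}
Body text goes here.
\bye
```

Once you've written a few of these, the machinery behind LaTeX's `\section` stops being magic.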
If I were still in academia, I would probably advocate strongly for a TeX/typesetting course for graduate students, if for nothing else than to give us a good excuse to actually study and become closely acquainted with this tool that we rely on so heavily.
I love plain TeX and want more people to see the joy as well.
LaTeX being understood as a drop-in Word replacement is precisely part of the problem. It is just too arcane and different. If, like me back in the day, you approached LaTeX as an "alternate word processor" and from there delved into its backend/plain TeX, you are in for a bad time. Understanding it the other way around (some TeX, general ideas and patterns, then the rest [macros etc.] just falls into place) is a much healthier, holistic and sustainable approach that just makes sense. But in our case here, that is infeasible. It would require young engineers to learn a completely alien skill and cease being productive for many months. I reckon CS students/grads have the edge here, given their background and different approach.
It doesn't really take on the problem of equations but does handle structured documents in a way which is much more flexible and arguably powerful than LaTeX, since you can use the full power of XML tooling including e.g. XSLT to define document components.
That said, I think it's quite simply too verbose to be popular, not to mention the tooling around it tends towards archaic. I had a friend who was an extremely heavy Docbook user but had configured his emacs to insert most tags by key-chord, making it much more efficient to write. It's very hard to deal with if you don't do that.
 Of course there are plenty of alternatives to LaTeX. Microsoft Word is just one. But I mean a language that compiles to TeX.
* Because the problems with LaTeX are mostly due to the limitations of TeX syntax, you really have to design a new, improved TeX-like language.
* While you can borrow most of the syntax of LaTeX for the semantic language (newLaTeX, built on top of newTeX), you will have to make a lot of improvements there too. In principle you don't have to, but in practice whoever does it will.
* Then you have to write a compiler that works on multiple platforms.
* The compiler outputs PDF, a terrible format to work with anyway.
* An HTML compiler will be demanded too, or the project will fail, so you have to write that as well.
* What about all the LaTeX packages? There are hundreds of packages that will need to be incorporated somehow, or the project will fail.
* Who is going to do all of this? Language design requires a really small team to make an excellent product, but the rest of the project is so large that you need a bigger team, and that bigger team will have to just accept what the language-design committee creates.
* Once you have created something, you have to convince scientists, some of whom have been using LaTeX for decades, to switch. But scientists will only switch if journals switch, and journals don't want to maintain templates in another language.
In short, there are a whole host of technical and political problems that nobody really wants to tackle. The project is simply not prestigious enough.
But if Y Combinator wants to help science, and they can fund a team to do this, it will probably contribute more to the advancement of science than almost anything else they could do with a few million dollars.
Only to the extent modelling clay is an alternative to CAD/CAM software and CNC tools.
There might be a GUI replacement for LaTeX, but it would have to be a graphical way to manipulate structure, not a WYSIWYG system. And anyway, WYSIWYG is hardly ever WYSIWIG ("What You See Is What I Get") or WYSIWTG ("What You See Is What They Get"). More like WYGIWYD ("What You Get Is What You Deserve") or YAFIYGI ("You Asked For It, You Got It").
The closest thing I know of is LyX. I actually started with it before diving into LaTeX.
And if you mean TeX formats, there is no shortage of those: https://ctan.org/topic/format.
I'd rather say it is easy to start a TeX/LaTeX alternative. Completing one is another matter.
At one point Patoline had a promising future (OCaml is nice), but nowadays it's neither widely used nor under active development.
Anyway, interesting post. I wouldn't have guessed that it carries over onto the next array.
Why does it only affect the first cell (or the first part of the first row), though, instead of the whole row?
About the first cell: IIUC, you can determine the color by the row, by the column, or by the individual cell, so the color is recalculated in each cell. For some reason it fails in the first cell. I'm still curious.
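For reference, a minimal sketch of the three colouring mechanisms in colortbl and their precedence (cell beats row beats column); the colours here are arbitrary:

```latex
\documentclass{article}
\usepackage[table]{xcolor}  % loads colortbl
\begin{document}
\begin{tabular}{>{\columncolor{blue!10}}l l}
  \rowcolor{gray!30} a & b \\  % row colour overrides the column colour
  c & \cellcolor{red!20} d \\  % cell colour overrides both
\end{tabular}
\end{document}
```

Since the colour really is re-decided cell by cell, a first-cell-only failure does suggest something resets state between `\rowcolor` and the first cell's typesetting.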