Fun fact: about 20 years ago, LaTeX developers were lamenting the stagnation of LaTeX and its being surpassed on a technical level by its new competitor in the TeX world, ConTeXt.
Since then LaTeX has basically kept on stagnating in the name of backwards compatibility, while ConTeXt keeps on progressing; it has already broken backwards compatibility several times (its latest implementation is called LMTX).
But the inertia is so powerful that nobody except TeX geeks even knows what ConTeXt is. How does that happen?
Sidenote: it's not just about ConTeXt, even LuaTeX, the latest (but not new anymore) implementation of TeX in general, is being ignored by scientific publishers.
Meanwhile, ConTeXt is mostly used for demanding typesetting of books.
And, BTW, using ConTeXt (or LuaTeX) doesn't even necessarily mean writing TeX anymore; it's possible to drive the engines through the Lua C API, or with XML, etc.
"[I]t already broke backwards compatibility several times". There's your answer. I've got three decades worth of style files, personal macros, and workflow invested in the way I use LaTeX, numbering in the thousands of lines. In some communities, especially academia and most especially amongst mathematicians and engineers, that is a typical investment. Nothing that might require me to start overhauling that is going to get a moment's consideration until LaTeX stops working for me, and even then I'm probably going to try and maintain a bespoke TeX distro before I'd try to replace it.
New and casual users of LaTeX should probably consider these alternatives (I've even considered moving over to LuaTeX because it would allow more automation, but I'd need a pretty longish period of reimplementation that I don't see becoming available soon). But anyone who wants to move that community of long-time users, organizations, and publications is going to need to address the problem of a significant user base that has been using effectively the same software for literally (and, according to my students, unbelievably) decades.
There's an analogous situation (or a continuation of this same situation, given the overlap in user
base) for those who've invested decades in building a personal set of Emacs configuration files. Again, I've got thousands of lines of *.el files (many of them devoted to my idiosyncratic use of LaTeX) in my Emacs configuration. Projects looking to replace the aged Elisp engine and create a modern implementation aren't going to get much consideration if that would mean replacing that (literally decades of) work in a different language. That's why I continue to be interested in Guile Emacs---less because of Guile (although I have no issues replacing Elisp with a modern Scheme), but because they've always made it clear that the ability to re-use Elisp configuration files was a primary design goal.
That's exactly it - backwards compatibility. Back at the end of the eighties I was desperately trying to find a way to update documents on-site when I visited customers (and I would sometimes stay for weeks or months), where I couldn't use the word processing system we used at home. I needed something I could run locally, on the laptop (yes we had them) as well as on the target systems (servers). Tried a lot of stuff... eventually found LaTeX, wrote software to convert our old documents to LaTeX, and a document class which created the same layout as the original (most LaTeX converters "hardcode" the layout [or at least used to, back then] - won't do, as my company changes the "template" now and then).
Everything finally ok. I and colleagues could write documents on-site. A guy from another company had the same problem so I wrote a document class for his company's layout as well.
So, I have these documents from way back, in LaTeX, and occasionally we have to produce them again. Sometimes we re-create them with a newer document class. Other times we extract chapters to be included in newer documents. If LaTeX had broken backwards compatibility it would be disastrous. I won't touch any replacement which doesn't guarantee backwards compatibility. However old the documents are.
> (I've even considered moving over to LuaTeX because it would allow more automation, but I'd need a pretty longish period of reimplementation that I don't see becoming available soon.)
Good thing you didn't bother, because it's absolutely not worth it. I chose LuaTeX initially precisely because it seemed more convenient to write commands in Lua than in plain LaTeX, but it turned into a total nightmare. Issues crop up as soon as you try to do something with text that embeds LaTeX macros, and the errors you get are even more byzantine than the ones you'd get by writing "normal" LaTeX code.
IMHO the very idea of embedding an external language into LaTeX is pointless, for the simple reason that LaTeX isn't markup, but code, and that it's pretty much impossible to parse it into a representation that makes sense and could be manipulated like, say, the HTML DOM. The only sane way to automate things is to write your documents in some language or markup that can be compiled to LaTeX, or to use a template engine.
> Good thing you didn't bother, because it's absolutely not worth it.
The biggest thing LuaTeX gives you is various hooks and access to various data structures used internally by the typesetting engine, so that you can work with them directly instead of having to do everything in a roundabout way through macro expansion. If all you need is text expansion then sure, just use TeX/LaTeX macros. (You mentioned "try to do something with text that embeds LaTeX macros"; I think working at the text level like that is a sign that your problem, or at least your solution, may not be a good fit for what LuaTeX gives.)
Here are a couple of my answers where LuaTeX was valuable; you can find hundreds more (IMO) by others:
- https://tex.stackexchange.com/a/379802 Avoid getting short words at line edges: Note that the macro-level solution runs into issues with \ref etc (just as you said about text that embeds LaTeX macros), but the LuaTeX solution works perfectly in all cases
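To make that concrete, here's a minimal sketch of the hook-based style (this assumes LuaLaTeX and the ltluatex callback interface; the node counting itself is just a toy example I made up):

    \documentclass{article}
    % Register a Lua callback that runs on every paragraph before line
    % breaking and logs how many nodes it contains; the node list is
    % inspected directly, with no macro expansion involved.
    \directlua{
      luatexbase.add_to_callback("pre_linebreak_filter",
        function(head)
          local count = 0
          for item in node.traverse(head) do
            count = count + 1
          end
          texio.write_nl("paragraph with " .. count .. " nodes")
          return true
        end,
        "count-paragraph-nodes")
    }
    \begin{document}
    Some example text for the paragraph builder to process.
    \end{document}

The point is that the callback sees the finished node list for each paragraph, so nothing depends on how the text was entered at the macro level.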
ConTeXt is not targeted toward authors of scientific papers (although I'm sure this is possible). The community behind ConTeXt has always been involved with books and more complex material. So I guess they evolved into a niche of the TeX market, like music typesetting (MusiXTeX and friends).
Regarding Lua, I think it is a strange idea from the point of view of the typical LaTeX package writer. Not only do you need to learn Lua, but you also have to write packages that are accepted only by this implementation of LaTeX. This is a clear disadvantage if you want to have a widely used package.
>I would posit that most people writing LaTeX aren't writing up papers in it either
what exactly do you think they're writing in it then? business cards?
as the other commenter said: publishing across math/physics/cs i have never seen a journal not prefer latex to word. and while ieee accepts word it most certainly prefers latex:
I work for a university press and virtually all of our manuscripts come in as Word docs, but most of our books are indeed in the humanities, and most are not especially math intensive. But now even Word can handle math markup well enough for, say, an economics text. We occasionally get LaTeX files, and they are difficult to integrate into our design workflow; they have to be handled completely separately.
Over the last 10 years or so there has definitely been a push by journals more towards Word.
I know that for some journals (osapublishing.org, which is optics so engineering/physics) this came at a time when they moved all their content to be accessible as html. I remember reading somewhere (maybe in their style guides) that they said that Word allows them to extract the relevant information more easily, so I suspect that MS made some sort of publishing toolchain for this use case and the publishers lapped it up.
I know that a significant portion of colleagues still prefer to write their documents in latex, and the journals still accept them.
almost all papers on arXiv are done in latex, you upload the source and they compile it on their servers. that’s a lot of papers! it’s most of math, CS, and physics, and some economics.
it’s probably true that most papers aren’t written in LaTeX but I’m confident most people writing in LaTeX are writing scientific papers.
Lua is orders of magnitude easier to learn than TeX. No macros, no expansion trickery and no obscure tricks to do things like complex calculations (either for typesetting or drawing).
Who is writing LaTeX? Mathematicians. They don't really care about the technical level of its implementation, and they surely don't care that you "don't necessarily write TeX anymore", actually, that's a negative for them.
Scientists use TeX for writing papers that they can submit to journals. As I already said, the journals don't accept ConTeXt. So of course mathematicians will rarely use ConTeXt.
> care about the technical level of its implementation
It's not about the implementation, it's about the interface.
Some journals accept TeX files? I’m surprised. SIAM (applied math) does not, for instance. I don’t think IEEE does. Do math journals accept TeX files directly?
Arxiv does of course, most notably. But it’s not a journal.
> Authors of accepted papers must submit TeX files to SIAM for typesetting. Authors are highly encouraged to prepare their papers using SIAM's standard LaTeX 2e macros. Note that the SIAM office will format Plain TeX and AMSTeX files to LaTeX 2e. The LaTeX 2e macro package and documentation are available here or by email request.
That’s weird. I always upload PDF. And the copy editing is done on their side by editing the PDF using some fancy Adobe pro tool. I never uploaded a *.tex to SIAM. Of course I use latex to prepare the PDF.
> Authors of accepted papers must submit TeX files to SIAM for typesetting. Authors are highly encouraged to prepare their papers using SIAM's standard LaTeX 2e macros. Note that the SIAM office will format Plain TeX and AMSTeX files to LaTeX 2e. The LaTeX 2e macro package and documentation are available here or by email request.
I loved using LaTeX for note taking back in university for technical classes, since the mathematics typesetting is quite developed and clear - that said, the field's a bit more crowded now with some nice alternatives for mathematical markup.
For a while I helped run CTAN, the TeX archive site. So my name was all over the Interwebs associated with the word "LaTeX." I got some pretty wild mail.
This is a good point. I think products like these have strong network externalities (the more people use them, the more beneficial they become). For this reason, a new product may not replace an incumbent product even when the incumbent product is slightly inferior. I spent a lot of time learning and using ConTeXt. Wrote one of my articles using that. When the time came to send it to a journal, they wanted it either in Word, or formatted using their LaTeX style files. At that time I gave up on ConTeXt, since I did not want to keep learning and using two different systems.
Yeah, I'm guessing that's probably why people aren't migrating. When there's no easy upgrade path and not a compelling enough gain, people will want to stick with the previous technology.
The LaTeX (declarative) approach is better. It's so much easier to write consistent, professional looking documents that way. The other (imperative) way is more like MS Word, and we all know how that turns out. Other declarative styles that are popular include markdown and html.
For all ye complaining about LaTeX being hard to use: use pandoc. Pandoc can convert your standard markdown/asciidoc file to many different formats. And, you can embed LaTeX inside, giving you both flexibility and ease of use.
Yep, pandoc is great! I used it to write a technical book. I wrote the text in standard markdown and used pandoc to convert to PDF and ePub, with MOBI handled separately (by Calibre). That plus a handful of ruby scripts to do some markdown transformations worked out pretty well.
Org-mode can also be exported directly to LaTeX amongst other things. I like pandoc for the times I must deliver a word doc, but org-mode is superior for LaTeX.
If you're using emacs, might as well use AUCTeX - if you're writing for LaTeX, it has all the keybindings and you skip an extra export step. With org-mode you have to keep using narrow-to-region to get all the syntax highlighting and keybindings for complex expressions.
Having said that, org-mode/org-babel is useful when writing for both HTML and LaTeX export.
I wrote my PhD thesis using AucTeX and if an org-mode document evolves to something that is definitely going to typeset then I'll convert to tex one last time and switch to AucTeX. Org-mode is great for starting random documents with no specific publication target. I might end up typesetting it, or maybe turn it into a web page/static site, or might just leave it as text for my own benefit.
I use some form of TeX (LaTeX or XeTeX) almost daily, and I often think about the future of (La)TeX and potential replacements.
I think that we desperately need some modern language for document preparation, which will replace TeX and its derivatives, but I also don't see anything like that on the horizon. The biggest drawbacks of TeX are lack of proper Unicode support; lack of support for modern font formats; and outdated programming practices. Tools like XeTeX, LuaTeX and ConTeXt tackle these problems with some success, but they are still not perfect. Also, they are too similar to TeX to bring drastic changes to programmers. We need something that is designed from zero specifically as a TeX replacement (including compiler, package manager, popular packages, bibliography manager, converters, etc...). I can only hope for a better future...
BTW, fun fact: TeX is still actively developed. Knuth fixed some bugs in the kernel last month (after a few years of pause).
When I want to change something nontrivial about the formatting or style, it's a nightmare. For example, there are zero mature fonts besides the default computer modern that also have math rendering that's aesthetically acceptable to me. And I forget the details, but typesetting is handled in such a way that the kerning around things like hyphens is broken. I've fixed it manually before in places where it was particularly egregious.
Also on this point: an ungodly number of hours are wasted every year by academics figuring out how to contort LaTeX so the resulting document complies with NSF grant specifications, which you must meet or be auto-rejected. In particular, NSF requires one-inch margins, but there's no way to force the typesetting engine to strictly respect the margins you set. So at the end you're left to turn the visual margin guides on and check there are no margin overruns, or you risk getting your proposal sent to the trash (theoretically, at least).
I've also heard that LaTeX editors are development disasters with a bus factor of one. The NSF really ought to fund them, because if we wake up one day after the maintainers of all the popular editors die and Apple/Microsoft force an operating system update that breaks compatibility, paper writing in the physical sciences will grind to a halt.
"In particular, NSF requires one-inch margins, but there's no way to force the typesetting engine to strictly respect the margins you set. So at the end you're left to turn the visual margin guides on and check there are no margin overruns, or you risk getting your proposal set to the trash (theoretically, at least)."
TeX should always give you warnings like "Overfull \hbox (0.43386pt too wide) in paragraph at lines 42--45" and the text of the affected line when it "disrespects your margins", so I don't see why you have to check for them visually?
But you can enforce it in many ways; for example you can set `\emergencystretch=\maxdimen` (or any reasonably large value) and you'll never have overfull boxes (what you called margin overruns), unless of course you have a single unbreakable "word" longer than a line by itself. (On the TeX.SE website, here are a couple of answers of mine about related stuff and alternative approaches: https://tex.stackexchange.com/a/422652 and https://tex.stackexchange.com/a/401315 , though the canonical question with good answers is probably this: https://tex.stackexchange.com/q/50830 .)
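To spell that out as a sketch (the exact values are only illustrative):

    \documentclass{article}
    % Let TeX add extra interword stretch on a final pass instead of
    % letting a line run into the margin.
    \emergencystretch=\maxdimen
    % Optional: mark any remaining overfull line with a visible black rule.
    \overfullrule=5pt
    \begin{document}
    With these settings, a hard-to-break paragraph is set a little loose
    rather than overflowing the margin.
    \end{document}

With something like this in the preamble, impossible paragraphs come out slightly loose instead of running past the margin, and the rule makes anything that still overflows hard to miss.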
So this complaint of "I can't enforce that there are no margin overruns" is (to use the same word) actually something like "If I don't take any steps to avoid overruns, and moreover ignore the warnings that are specifically about overruns, then there are overruns". This complaint is not specific to you and is surprisingly common actually, which brings me to my main point, which is that the majority of people seem to use TeX/LaTeX without ever having read the manual. Some of them even joke about "incomprehensible" messages like "overfull hbox". But the whole point of TeX, what TeX is at its core, is this breaking-paragraphs-into-lines (in my view, the rest of TeX is just a large wrapper around this basic function of setting boxes).
The preface of the TeXbook describes it as "TeX, a new typesetting system intended for the creation of beautiful books", and for its initial design goals, it does make sense that it chooses by default to declare defeat in impossible situations and overflow margins with a warning to the user so that they can take appropriate action, instead of silently producing an "ugly" paragraph with lines subtly looser than whatever tolerance settings the user has specified. What is remarkable here is that a lot of people are using TeX who do not care about meticulous typesetting of "beautiful books" at all, and will blithely ignore warnings related to such matters.
The lesson may be that such a state of affairs is inevitable, so perhaps a better default for similar software today would be to produce loose lines to fit in the margin, so that results are good even for the users who don't read the manual, or even the warnings, or even proofread the final result.
> What is remarkable here is that a lot of people are using TeX who do not care about meticulous typesetting of "beautiful books" at all, and will blithely ignore warnings related to such matters.
Is this really remarkable? It's the standard tool for writing papers in physics. Get result, put into LaTeX, put on arxiv, repeat. The fact that these people have to constantly Google for obscure LaTeX incantations (faster than TeXbook) is not ideal for them. Indeed, it would be better if it were easier to use without reading a 500 page book.
Anyway, thanks for the tip. When I googled for this I found only \sloppy, which did not produce reasonable results.
I should have been clear: I don't think people's behaviour is remarkable here (I've done the same, with LaTeX years ago, and with many other tools); what I found remarkable is the history / state of affairs, where a finicky tool primarily concerned with the finer points of typesetting (TeX) became the standard tool for dashing off papers (thanks to Lamport and LaTeX I guess) even for people not concerned so much with typesetting. It's as if, say, a samurai sword had become the standard tool for chopping vegetables in the kitchen, and everyone was complaining about the awkward shape, the unnecessary sheath, etc. It's neither the tool's fault nor the users', but… something has gone wrong somewhere.
> there are zero mature fonts besides the default computer modern that also have math rendering that's aesthetically acceptable to me
The only one I like is URW Garamond but it's a breeze to install and use and I think it stands head and shoulders above everything else. It also matches the math pretty well imho. I feel you though: Computer Modern is pretty ugly and most other fonts really clash with the math rendering by default.
I've been extremely happy with the TeX Gyre family. Its Pagella font comes with a math sibling and both together cover a wide range of symbols at high quality.
But after a while I remembered the more fundamental point that bothered me:
> one might add that it's not just microtype that doesn't offer optical kerning -- it's structural limitations in TeX's typesetting that prevent it. To TeX, each glyph is contained inside a (I guess: black) box. TeX knows the outer dimensions of those boxes, but has no clue (and doesn't care) about what the glyph looks like that's inside it.
And this led to bad spacing for hyphens, which led to me experimenting with manual kerning, which led to the issue that I recalled first.
Also, I am not really convinced of the merits of Vim for LaTeX. Most of my time writing is spent revising, which requires a lot of "random access edits," and I'm at least twice as fast at jumping to arbitrary locations on screen using a mouse as I am with vim (even with the usual tricks).
(I tend to think of vim as something that was purpose-built for programming in C and with usefulness inversely proportional to how far a given task is from that purpose. While I realize that's not entirely accurate, I've found it a good heuristic. Programming: vim works decently for me. General purpose writing: not so much.)
Further, the ability to control-click .pdfs and skip to the corresponding place in the .tex file, and vice versa, is an amazing convenience. I understand that there are technically ways to do this in vim, but then at that point I'm using the mouse a ton anyway and I don't really see the point of vim (which is usually proposed to be "hands don't go off keyboard = write faster" in my experience).
So, yes, I could survive if forced to use vim. I just wouldn't like it.
Given this, and given I feel I'm much more willing to experiment with these things than the average person writing papers in the physical sciences, I can't even imagine how the conversation would go if I tried to tell my 72-year-old advisor that TeXShop's not going to work anymore because he updated his computer and it's time to get friendly with the command line. These people want to do science, they don't want to learn vim shortcuts.
TeX finds out about character widths and kerning pairs and ligature replacements from outside of TeX itself. TeX is perfectly happy to ligature together "--" into an en-dash, and "---" into an em-dash, and to then kern either one of them with whatever characters you like, if instructed to do so by these externally-supplied per-font instructions. So, it seems like you're being troubled by the externally-provided font info rather than any limitation of TeX per se.
Boring details: TeX, as it comes from Knuth, reads TFM ("TeX Font Metric") files to supply this per-font information. Knuth's Computer Modern fonts do not contain any kerning pairs that contain dash characters, which one may be pleased or displeased about. Various enhanced-TeXs get font info from the OS or perhaps directly from, say, OpenType font files. If the ligature and kern info isn't passed into TeX from there properly, or doesn't exist in the font itself, then of course you won't get the results you're looking for.
Arguably, there is no algorithmic way to compute high-quality kerning based on character shapes (just as there's no way to determine, say, character width just from an outline). If there were, then Apple would build it into MacOS and iOS, and Microsoft would build it into Windows, and Google would build it into Android, and then all applications from TextEdit to Word, Safari to Chrome, and troff to TeX would automatically get told about these synthetic kerning pairs, and use them by default.
That said, Adobe does have a proprietary optical kerning scheme that they use in their applications. But it's turned off by default (hint, hint), and their documentation is a bit defensive about it: "Optical kerning adjusts the spacing between adjacent characters based on their shapes, and is optimized for use with Roman glyphs. Some fonts include robust kern-pair specifications. However, when a font includes only minimal built-in kerning or none at all, or if you use two different typefaces or sizes in one or more words on a line, you may want to use the optical kerning option for the Roman text in your document." Loosely translated: Fonts from real type foundries have kerning pairs carefully created by their designers; if you have a slap-dash one where they didn't bother, we'll do the best we can if you want. YMMV.
I think this information would be built in to any high-quality font. As drfuchs tried to explain, TeX has no information about letterforms; that’s not its job.
> Also, I am not really convinced of the merits of Vim for LaTeX. Most of my time writing is spent revising, which requires a lot of "random access edits," and I'm at least twice as fast at jumping to arbitrary locations on screen using a mouse as I am with vim (even with the usual tricks).
> (I tend to think of vim as something that was purpose-built for programming in C and with usefulness inversely proportional to how far a given task is from that purpose. While I realize that's not entirely accurate, I've found it a good heuristic. Programming: vim works decently for me. General purpose writing: not so much.)
I don't want to comment on the kerning stuff (I don't really know much about it). However, I found your comment about the merits of vim when revising interesting. Like you, I spend most of my writing time revising/correcting (often other people's work), however my take on vim is just the other way around.
If I write a large text myself, it doesn't really matter what editor I use; it's when I revise/correct things that I love using vim. While you might be correct that you would be quicker to move to the appropriate spot using a mouse, I pretty much guarantee that all the gain is gone when you have to switch back to the keyboard to do your change. That's why I love vim for these kinds of tasks: the moving and changing of specific text is such a seamless experience without even having to change the hand position on the keyboard. My productivity absolutely plummets every time I have to use Word or some other editor for these sorts of tasks.
> While you might be correct that you would be quicker to move to the appropriate spot using a mouse, I pretty much guarantee that all the gain is gone when you have to switch back to the keyboard to do your change.
I see! What I didn't say is that I'm working on a laptop, so the trackpad to keyboard distance is negligible – it's essentially just another big key. All I have to do is pivot my hand.
> I've also heard that LaTeX editors are development disasters with a bus factor of one.
In case anyone else is wondering about this:
> The bus factor is a measurement of the risk resulting from information and capabilities not being shared among team members, derived from the phrase "in case they get hit by a bus."
Fixing bugs isn't the same as actively developing. The bugs that are getting fixed these days are in increasingly obscure edge cases (in some cases no actual implementation in use can reach those edges).
That said, I've started working on such a project as you propose. The document preparation language will be almost identical to LaTeX, but with no macro language and no category codes, UTF-8 as the standard input encoding, first-class support for multiple back ends (e.g., use the same source to produce PDF, HTML or ePub), command definition through an embedded scripting language (I'm still debating between Python, Lua or something else entirely), formatting done declaratively, most likely through a YAML file, etc. My musings along the way are at http://finl.xyz
While the original TeX engine only gets bug fixes these days, that's not really relevant for LaTeX since LaTeX doesn't even support that engine anymore. While pdfTeX development is rather slow too, it does still see feature development. That being said, for modern documents LuaTeX is really the only engine worth looking at and it is quite actively developed. Of course, it doesn't see much adoption, but that problem will also appear for any new system.
That brings up the other big problem with contemporary TeX. LuaTeX has handfuls of subtle incompatibilities with pdfTeX and XeTeX (and a weird way of handling character setting at the low level where it maps characters to their index in the font file rather than unicode code points). It's also noticeably slower than XeTeX which is slower than pdfTeX. It would be nice if it were true that pdfTeX ⊆ XeTeX ⊆ luaTeX, but unfortunately, it's not.
Any reason for that? The rest of your description looks very attractive, but why would I prefer a language without macros? I need conditional compilation, setting variables, parametrized notations, etc. This is extremely useful when preparing a scientific document. If anything, I don't really care about the document preparation language itself, but it must have macros to be useful!
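To give a concrete idea of what I mean, in today's LaTeX that kind of thing looks roughly like this (\ip and the draft flag are names I just made up):

    \documentclass{article}
    \newcommand{\ip}[2]{\langle #1, #2 \rangle}  % parametrized notation
    \newif\ifdraftversion                        % a boolean flag ("variable")
    \draftversiontrue    % switch to \draftversionfalse for the final build
    \begin{document}
    The inner product $\ip{u}{v}$ is defined once and reused everywhere.
    \ifdraftversion
      This remark only appears when the draft flag is set.
    \fi
    \end{document}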
To clarify, there would still be programmability, just not via macro expansion. Instead of macros there would be first-class variables, rather than TeX's mix of registers and macros used to emulate them. Instead of having to deal with the timing of macro expansions and all the pitfalls therein, an embedded scripting language would be available to handle complex settings (I'm leaning towards Python at the moment, but I'm far from having to make a decision on that topic). First-class flow control will also be available (right now, things like \foreach and \ifXXX are handled via macro expansion which causes a lot of pain beyond simple use cases).
OK, that's reassuring! What you describe sounds pretty much like a "macro language" to me (i.e., a separate language to describe your document). But this is just a question of terminology, somewhat arbitrary. Being able to generate the text of your document is a very powerful tool and, according to what you say, it will be even more powerful.
> we desperately need some modern language for document preparation, which will replace TeX and its derivatives
Do we? Sadly, appreciation for excellent typesetting has declined now that most people are consuming writing through a screen. And even for printed material there is pressure to cut costs: venerable and respected publishers with long traditions of obsessively good typesetting realized they could outsource their typesetting to developing-world shops using Word or similar software, and few readers would care. (Similarly, and to the dismay of bibliophiles, even prestige titles these days are likely to get a glued binding instead of a sewn one, and fairly low-resolution digital printing.)
> lack of proper Unicode support; lack of support of modern font formats
Is that really true? I had no trouble writing UTF-8 documents using regular TrueType fonts in LaTeX almost two decades ago, with no problems with output quality.
The only thing was you couldn't go through the DVI backend, so any tools which manipulated DVI couldn't be used. But I believe all this is the default now.
Oh yes. For instance, LuaTeX doesn't print combining diacritics correctly unless you painstakingly swap the underlying codepoints around in some unspecified order (to be determined through trial and error) that doesn't match _any_ of the Unicode normalization forms.
I would be interested in how you did that?
I had a lot of problems with Cyrillic scripts until I started using XeTeX three years ago. I think that LaTeX3 made some progress on UTF-8 compatibility in the last two years, but font loading is still a mess.
I think this is not a good reason to replace LaTeX. The big advantage of TeX over other systems is the extensibility of the language. Yes, the language is strange, but you get used to it, and it does what it's supposed to do. So I think that, just like UNIX, replacing TeX will never be completely done, unless something really great shows up.
TeXmacs looks very interesting. I will maybe give it a try. But I must say that the TeXmacs workflow seems a little bit dependent on the GUI. That is not a bad thing per se but seems different from TeX.
In case it helps: for Linux I am using their static binary; if I recall correctly the Ubuntu package wanted to change my Guile to 1.8 and this wasn't OK for me. I have learnt today that Fedora has it in the repositories. The Windows installer works, and as far as I know the one for Mac OS does too.
I use the latter extensively and am quite happy with it. The listed advantages of Tectonic are somewhat debatable:
> Tectonic automatically downloads support files
I use a full TeXLive distribution in a Docker image. But this is a clear advantage on regular desktop setups, although I'd argue most people don't notice that texlive is 4GB+. MiKTeX does downloading on-the-fly, too.
> Tectonic has sophisticated logic and automatically loops TeX and BibTeX as needed
Latexmk does this too. BibTeX is outdated (use biblatex+biber); I hope Tectonic is not hardwired to it.
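For anyone who hasn't made the switch yet, a minimal biblatex + biber setup looks roughly like this (refs.bib and the citation key are placeholders):

    \documentclass{article}
    \usepackage[backend=biber]{biblatex}
    \addbibresource{refs.bib}   % placeholder bibliography file
    \begin{document}
    See \cite{knuth1984} for details.  % key assumed to exist in refs.bib
    \printbibliography
    \end{document}

As far as I know, latexmk detects biblatex and runs biber between passes automatically.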
> doesn’t write TeX’s intermediate files
Sounds terrible! For a long document I maintain, a cold run takes 8 minutes. With a "cached" run (all auxiliary files present and somewhat up to date), it's below 3 minutes. It only says this is the default setting, but it would be a devastating default in my case.
> The tectonic command-line program is quiet and never stops to ask for input.
Okay that sounds nice, but I've used the interactive "can't write to file X.pdf" prompt many times before, when the PDF to write to was locked for writing (open in a PDF viewer). It saves aborting the entire run and starting fresh.
> Thanks to the power of XeTeX, Tectonic can use modern OpenType fonts and is fully Unicode-enabled.
Nice, but is this Tectonic-specific? I use latexmk with lualatex. The latter is better than xelatex (microtype, contour, memory management, lua integration, ...). If Tectonic is hard-wired to xetex and doesn't allow lualatex, this is a big downside.
The remaining points are nice, especially the GitHub actions stuff. Though I just use the same Docker image there that I use for local development.
> BibTeX is outdated (use biblatex+biber), I hope Tectonic is not hardwired to it.
While I prefer biblatex and biber, some journals and conferences require BibTeX.
Take for example the element <h2> indicating a new section in an HTML document. LaTeX also has a command for this; here one would use the \section command.
——
I feel like that’s making a very broad assumption about people writing semantic markup in HTML. H{1-6} is for a header. It has nothing to do with a section (<section> or otherwise).
I thought that’s exactly what headers are for. Otherwise, why would someone use them? Just for something to attach a style to, or because the default styling looks sort of like what you want?
That's not entirely correct. HTML allows heading elements (<h1>-<h6>) wherever "flow content" is expected, but not in eg. "phrasing content", and also not in a number of content models allowing flow content generally but not heading elements specifically such as those for the <address>, <dt>, <th>, and <legend> elements.
What that means is that eg. a <h2> element automatically ends an open <p> element when not closed explicitly. WHATWG's HTML spec clumsily expresses these SGML tag inference rules by enumerating explicitly all elements that end <p> elements (and of course, WHATWG's HTML specification process, as would be expected, then "forgets" to update the enumeration when new elements are introduced in later specs):
> A p element's end tag may be omitted if the p element is immediately followed by an address, article, aside, blockquote, details, div, dl, fieldset, figcaption, figure, footer, form, h1, h2, h3, h4, h5, h6, header, hr, main, nav, ol, p, pre, section, table, or ul element, or if there is no more content in the parent element and the parent element is an HTML element that is not an a, audio, del, ins, map, noscript, or video element
Source: SGML DTD for W3C HTML 5.2 [1], prepared from W3C's HTML 5.2 spec, which in turn is derived from (an older version of) WHATWG's HTML spec.
People abuse LaTeX in similar ways. Not to mention that there are some awful examples in Lamport's book. I've dug up my 75% complete LaTeX book that I wrote while I was teaching LaTeX classes for TUG and I'm going to try to finish it in the next couple of months. I think a big part of what will make the book good is as much what's being left out as what's being included.
I know it's unrealistic, but I wish someone would reinvent LaTeX.
I don't really want anyone to have to learn it.
It's sad that it still seems to be kind of the best tool for advanced use out there...
It's not difficult to learn at a base level. The real difficulty in learning LaTeX is becoming comfortable with handing the keys over when it comes to presentation — WYSIWYG grabs people easily, and, once taken in, it is tremendously difficult to break them of their urge to fiddle with things. In many ways it is good that it takes some doing to do more than the basics — nice output for basic documents is almost a guarantee.
I second that. I was able to re-write my CV which was in Word to LaTeX in about 3 hours with help of online resources (no prior LaTeX experience). The output is high-quality, consistent and looks professional.
I didn't get into a lot of advanced stuff - just defined a few "newcolumntype"/"newcommand" and used them everywhere but that freed me from formatting worries and let me focus on the content.
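For illustration, the helpers I mean look roughly like this (the D column type and \cventry are names invented for this sketch):

    \documentclass{article}
    \usepackage{array}
    % A right-aligned, fixed-width date column
    \newcolumntype{D}{>{\raggedleft\arraybackslash}p{2.5cm}}
    % One CV line: dates, role, description
    \newcommand{\cventry}[3]{#1 & \textbf{#2} \newline #3 \\[4pt]}
    \begin{document}
    \begin{tabular}{@{}D p{10cm}@{}}
      \cventry{2019--2023}{Senior Engineer, Example Corp}{Shipped things.}
      \cventry{2015--2019}{Engineer, Another Co}{Maintained things.}
    \end{tabular}
    \end{document}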
However, it is not quite as helpful as latexdraw was for pstricks.
That said, I am not sure how much of that difficulty is due to tools lacking and how much is due to "describe vector graphics as code" being a hard problem.
What – you did not enjoy reading the 1200+ page manual?
But yes, I know what you mean. I use the cargo cult method for writing TikZ code myself. Look for examples of people doing something similar, then poke at the code until it does what I want. I am slowly developing some glimmer of understanding, though. I figure in another 100 years' time, I'll be a TikZ wizard at this rate.
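For a sense of what that poking usually targets, here's a small made-up example of the kind of diagram I mean; it should compile as a standalone document:

    \documentclass{standalone}
    \usepackage{tikz}
    \begin{document}
    \begin{tikzpicture}
      % Two boxes connected by a labelled arrow
      \node[draw, rounded corners] (input)  at (0,0) {input};
      \node[draw, rounded corners] (output) at (4,0) {output};
      \draw[->, thick] (input) -- (output) node[midway, above] {transform};
    \end{tikzpicture}
    \end{document}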
Tikz was so frustrating that I ended up drawing my stuff in an SVG editor and importing a high-resolution export from there. I know it's not as cool or scalable as a proper Tikz diagram, but if you place any value on your time, then I just do not think that Tikz gives you enough to be worth it, at least for relatively simple drawings.
Take a look at TeXmacs (http://www.texmacs.org) (does not share code with TeX, only the name recalls it!). It is at the same time structured and WYSIWYG (having accessible text source), completely programmable (user-defined native macros and Scheme) and has a high quality typography (includes a global page-breaking algorithm). There is a helpful mailing list (info at http://texmacs.org/tmweb/home/ml.en.html) and a forum (http://forum.texmacs.cn/).
The problem with TeXmacs is that they are stuck with old technology too - they use[1] an ancient Guile, which prevents them from using the JIT and in turn prevents[2] distributions from updating Guile itself.
I agree that some communication from the TeXmacs developers to the Fedora ones would be helpful (I documented myself, Fedora is using Guile 2 and still supporting 1.8 with a compatibility package). (More about the Scheme in TeXmacs later).
Said this, I think the angle from which you are looking at the software is too narrow. A better angle is to look at the realized potential of the TeXmacs document preparation system: it does everything one could wish for. With "everything" I am exaggerating a little :-)
More in detail (I refer to [1] for a complete discussion):
1) It makes structured editing possible (and that is its main mode of operation). So if you want to mark your text with "semantic" markup you can do that, and in fact that is what using the software leads you to do.
2) It is controllable via programs
3) It has advanced typography (including mathematical formulae at the necessary level for professional mathematicians, as far as I can see), and it is pleasant and easy to work with: you can concentrate on writing, letting the program take care of the visual appearance and you do not need to interpret the sources while you are editing, to figure out where is the text that you want to edit
4) It has user-definable keyboard shortcuts that further help entering structured text
... and I feel it is enough for a short list of the features.
Back to Scheme. The development team is debating on which Scheme to use for the future of TeXmacs, and the debate is taking time. Here are some experiments [2],[3].
The current software (with Guile 1.8) works smoothly, with full functionality. For Linux installation, I chose the static binary they provide ... because on Ubuntu I don't have Guile 1.8 ;-)
This should probably be the first option for anyone who wants to produce a LaTeX document without learning the language, for some reason. It has support for bibtex, drawings, and equations.
What kind of error? I would never imagine a bug like that would be possible in latex. In fact I’ve never encountered any bug IIRC. But sometimes latex is just weird.
But hey. Good thing is no one reads a thesis in full anyway
For people interested in learning LaTeX, I learned LaTeX from this PDF well enough to do my reports and understand search results about LaTeX: https://tobi.oetiker.ch/lshort/lshort.pdf
I've used latex on and off over the years, I can write complicated math expressions off the top of my head pretty easily. For document formatting and package use, however, I need to look up syntax and examples every time. Where can I learn the how all the messy syntax of latex (aside from the math notation itself) is supposed to work? Or should I just stop trying, because it's a lovecraftian tangle that will drive me insane?
Not many here are really talking about the linked resource, but it looks quite beautiful (especially the font). Short for what it is, but it could be useful to beginners. I find Overleaf's pages also provide good resources on LaTeX.
ConTeXt links:
https://wiki.contextgarden.net/Main_Page
https://en.wikipedia.org/wiki/ConTeXt
Mailing list: https://mailman.ntg.nl/mailman/listinfo/ntg-context