Designing a Business Card in LaTeX (olivierpieters.be)
363 points by opieters 166 days ago | 219 comments



If you're going to go through all the trouble of typesetting a business card in LaTeX, you really ought to use Computer Modern Roman so that folks know you typeset it in LaTeX. (Related: resumes in CMR always have +10 credibility.)


I think this need not be the case. When creating this, I wanted it to look good and the way I wanted it, not to make it scream "LaTeX".

For a CV, Computer Modern might be more apt since there's also more text on it. But again, why make it scream LaTeX if one can do it more subtly and make it more personal at the same time?

Do note that I have a LaTeX PDF style CV as well, but it's not linked in the article (nor available online).


Exactly. Good TeX should simply yield to good typography. Nice ligations, etc.

I don't like Computer Modern too much. I am very fond of Gentium [1]. For math, I love Euler [2], which is used in Concrete Mathematics.

[1] https://en.wikipedia.org/wiki/Gentium

[2] https://en.wikipedia.org/wiki/AMS_Euler


I like Gentium. Thanks for the tip!

> Good TeX should simply yield to good typography.

Making it look like tech can make it appear like you're one of the in-crowd. Although I personally try to avoid exactly that, I can see it being an advantage in -- for example -- CVs. Fwiw, my CV is also done in LaTeX. I hope it doesn't show too much: http://stbr.me/cv


It's very cool! Side bars and cartoons are made with Tikz?


Tikz indeed! I'd use Inkscape today though ;)

Edit: thank you! Glad you like it.

Edit 2: sources at https://github.com/kaeluka/cv


BTW, I assumed you meant "ligatures" (for joined letters)? (Not sure what "ligations" are - maybe autocorrect... :-)


I did, I should be more careful. Thanks. Too late to edit though :)


Doesn't Euler look a tad cartoony to you? I guess that goes away when you are used to it.


It does. It is intended to mimic handwriting on blackboards. I use it for slides when teaching or presenting to informal audiences.

If you don't like it, a good alternative to Computer Modern is Latin Modern. A nice derivative which is not so thin. Reading long Computer Modern texts gets a bit tiring for my eyes. I think some editors like LyX default to Latin Modern instead of Computer Modern.
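If you want to try it, switching a pdfLaTeX document over is just a couple of preamble lines (a minimal sketch using the standard lmodern package):

    \usepackage{lmodern}        % Latin Modern text and math fonts
    \usepackage[T1]{fontenc}    % use the T1-encoded outline versions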

As an aside, The Art of Prolog uses an amazing Lucida variant. It'd be my favorite for math texts if it were free.


I edit genre fiction, and we do all our layout in XeTeX. I picked it because it's the right tool for the job, and I don't know PageMaker. The colophon calls attention to it because I think it's helpful to tell others how you made something (typefaces, tools, etc.) rather than for nerd cred.

https://www.aliterate.org/aliterate-page47.pdf

The typefaces are Minion and Myriad.


When I copy text from your PDF it comes out all garbled. Does this happen to others as well? Is it intentional; some form of copy protection?


Don't know about the parent's intentions, but making the PDF from LaTeX copy-able is mostly a matter of putting this in the preamble.

    \usepackage[utf8]{inputenc}
    \usepackage[T1]{fontenc}
See also http://tex.stackexchange.com/a/64198/121234 and http://tex.stackexchange.com/a/119718/121234.


Huh, totally weird. Maybe it's an encoding issue? The characters appear to be getting shifted by -39 (e.g. c 143 -> D 104). I never noticed because I'm typesetting for print.


I think you need to use some PDF package to have the encoding set correctly. I had the same issue in the past, but I don't remember how I solved it.
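One likely candidate (a sketch, and only for pdfLaTeX) is the cmap package, which embeds ToUnicode maps so copied text comes out correctly; it has to be loaded before fontenc:

    \usepackage{cmap}           % ToUnicode CMaps for copy/paste and search
    \usepackage[T1]{fontenc}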


Since you said you were editing this: Is "horribly familiar horribly familiar" a double typo?


I enjoyed that snippet, what is it from?


PageMaker has been dead software for like 10 years now, fwiw. InDesign is the modern replacement.


> why make it scream LaTeX if one can do it more subtle and make it more personal at the same time?

Because the point of a business card is to effectively signal your impressiveness.


Is LaTeX actually impressive to the average business client (i.e. someone without technical knowledge)?


It's like a secret handshake.

This comment could sound sarcastic but it's not. Even that marketing manager could be a nerd, and it would be awesome to find out.


FWIW, my $BOSS-2 had a PhD, loved LaTeX and even set one of our lower-level guys on writing all our company-internal docs in LaTeX. It actually worked out pretty well.

I imagine that an obviously-LaTeX resume (look for the ligatures, the correct spacing between sentences vice after periods, the clean, consistent grey across the page, the bold, clean margins) would play especially well with him. As you might guess from the preceding sentence, it'd probably play pretty well with me, too!


> vice after periods

Is vice Latin? It reminds me of pace (because of the "shape" of the word, haha), but I guess it means something akin to except?


I found a definition that fits https://en.wiktionary.org/wiki/vice#Derived_terms_2

So apparently what was meant is

correct spacing between sentences instead of after periods


Good point, I fear it could backfire in a business setting. Mind you, I'm a huge LaTeX fan, but the one study I'm aware of that compared the efficiency of Word and LaTeX for a couple of average writing tasks came to a pretty damning conclusion (see quote below). So, I'd argue that LaTeX is the right tool for long math-heavy texts under version control, while better tools exist for most other tasks.

> LaTeX users were slower than Word users, wrote less text in the same amount of time, and produced more typesetting, orthographical, grammatical, and formatting errors. On most measures, expert LaTeX users performed even worse than novice Word users.

http://journals.plos.org/plosone/article?id=10.1371/journal....


Not impressive for someone without technical knowledge.

If the technical people have a specific university background they might be familiar with it, and then either hate it or love it.

It's a good tool for writing large documents but does not signal anything except personal tastes.


Then why do it in TeX as opposed to Inkscape?


> Do note that I have a LaTeX PDF style CV as well, but it's not linked in the article (nor available online).

Any particular reason why? I for one would love to see it.


At the moment the code is pretty ugly and there's a bit of personal information (e.g. phone number) I don't want to post online. I might write a blog post on creating a custom CV, but that's for some time in the future ;) (not really sure when)


My two cents using pandoc, LuaLaTeX (or XeLaTeX) and a few TeX hacks: http://jill-jenn.net/résumé.pdf


If you want to see one in LaTeX, here is mine: http://manu.vives.fr/


Computer Modern is a hideous font. It's particularly terrible when rendered in Postscript/PDF, which is how almost all documents produced with it actually get read.

But the +10 credibility is absolutely true. It's like CS grunge style: when you wear ugly clothes, you look more authentic.


Heh, I was assuming NelsonMinar was being sarcastic. But yeah -- it's all about the audience. :)


How is Computer Modern much worse than, say, Times New Roman?


Computer Modern has a very high contrast of thick and thin strokes, and the axis of the strokes is completely vertical. This makes it very difficult to render on bitmap displays: you can end up with 3-pixel lines next to 1-pixel lines, and the optical compensations of antialiasing can't fix the visual appearance because most of the lines are orthogonal.

These characteristics of the font are intentional: it's really a showcase for Knuth's Metafont system which lets you tweak stroke contrast to a high level, so you can create many weight and casing variations of Computer Modern parametrically. And it does look bearable in print. But it's really the worst possible screen font.

Times is much better in this respect because it was designed for readability long before computers, for newspaper printing where unexpected ink bleed and other artifacts were daily routine.

(Cynically, it's possible to view Computer Modern as a fitting portrait of computer science itself: the font is used because it's exciting to have 62 tweakable parameters in a system designed by someone famous in the field, not because it's any good for the end users.)


Okay, so you're not saying it's always hideous; you're saying it's hideous on screen. I think you're wrong even there: some of the screens that have come out in the last few years have resolutions near that of cheap inkjet printers, and on those screens Computer Modern looks okay. If you don't have access to such screens, there is a simple trick you can use instead: just set your zoom to something like 10x. The font looks glorious then, in my opinion. The one tiny problem is that nothing fits on the screen :)


I suspect his problem is more with Windows and its brain-dead font rendering. Vertical strokes are the ideal case for subpixel anti-aliasing on a typical LCD: you get 3x the horizontal resolution. On a typical 100dpi screen this gives you 300dpi horizontal resolution, which is around the low-end for print quality. On modern 200dpi "retina" displays, you get 600dpi horizontal resolution which is more than enough to make Computer Modern look good. But ClearType has a strong preference for aligning positions and sizes to whole pixels, giving you horrendous kerning and sudden large jumps in font size and apparent weight as you increase the requested font size.


I'm using a Retina MacBook. The readability problem with Computer Modern is intrinsic to its design, and even a 200 dpi screen just isn't enough to make it look good (IMHO). But it's definitely much worse still on Windows.

My question is: why use this particular 1980s font developed to showcase a dead-end stroke rendering technology, when there are much better alternatives around?


Why use it? It's the default, and otherwise you have to find a math font and text font that go together all on your own. At least that's why I use it: laziness.

I agree that 200dpi is not quite enough for Computer Modern. But I claim that, say, 7200dpi is far more than enough, and then it looks great.


Times New Roman often means that you created the document with an old version of Microsoft Word (before Calibri became the default) and never bothered to customize anything.


That's why I always install CMR in MS Word.

Handy link (because it's actually quite hard to find): http://concentriclivers.appspot.com/LatinModernTTFHinted.7z


Word kerning and hyphenation still suck, so -10.

Seriously, there are many places (CS publishing) where there are biases against papers written in Word.


Very true. I'm unlikely to take an arXiv article seriously if it hasn't been typeset in LaTeX.


And yet, in other disciplines, Word is the standard and you get very, very strange looks if you use anything different.

Source: wrote a psychology PhD in LaTeX.


Isn't using LaTeX instead of Word in the DSM-5 already, right next to vi and Linux?


Fortunately, there's no entry for emacs, so I guess I'm safe for now....


I once made a visual comparison between Word and LaTeX using CMR: http://tex.stackexchange.com/a/110219/14827



Computer Modern is now one of the most overused typefaces on the planet. It looks great for that “Victorian-era American-style math paper” look.

People should mostly avoid it otherwise, IMO. There are plenty of better alternatives for typesetting pretty mathematics.

* * *

More generally about the thread here, TeX is not a productive or effective tool for making business cards or other heavily graphical documents. The point of TeX is to handle standard typesetting of structured prose and mathematical notation. Any time you need flexible layout control, TeX turns into more trouble than it’s worth.

Someone using a dedicated graphical page layout or vector graphics tool is going to make a better business card (or poster, or flyer, or diagram, or ...) in less time, in a way which is much easier to modify later.


On the one hand, I tend to agree with you. I used LaTeX (with beamer) to make several slideshows over the past few years, and it seems safe to say that this is not exactly the kind of thing that plays to TeX's strengths.

On the other hand, there exist wizards who do things like this:

https://tex.stackexchange.com/questions/134638/showcase-tex-...

(check out the pink one with hearts; note that it uses random numbers so it doesn't even come out the same way each time -- try doing that with anything but TeX (just don't ask why you'd want to); there's also one other that uses random numbers)


That page’s examples are mostly a bunch of paragraphs of prose (that part, TeX is great at), some with fancy formatting that took significantly more effort than the equivalent effect in InDesign or QuarkXPress. The one with the Zapfino doesn’t bother using any alternate glyphs (arguably the main reason to ever use Zapfino). A few others are vector graphics projects that would be best done in Illustrator, Inkscape, or similar. As you go down the list, the examples (like the thing with the hearts) start to be examples of tacky overwrought things slapped together by amateurs.

Obviously TeX is an extremely flexible system, and it’s possible (fun even!) to use it for all kinds of purposes it wasn’t designed for. Analogously, it would also be possible to write a web application in assembly code running on an Arduino. I said it wasn’t a very effective or productive tool for the job, not that it wasn’t possible.


What I meant to say was that in the hands of the right person, it would be effective and productive, but since you're not particularly convinced by anything on that page, I think I'll have to concede this point. You win :)


> TeX is not a productive or effective tool for making business cards or other heavily graphical documents.

Agreed. See the study "An Efficiency Comparison of Document Preparation Systems Used in Academic Research and Development" which concludes

"LaTeX users were slower than Word users, wrote less text in the same amount of time, and produced more typesetting, orthographical, grammatical, and formatting errors. On most measures, expert LaTeX users performed even worse than novice Word users."

The study does claim though that LaTeX users had more fun while doing worse work, so there's that... :-)

http://journals.plos.org/plosone/article?id=10.1371/journal....


This conclusion makes me wonder if they were measuring LaTeX users while they were writing a new type of document for the first time.

My experience has been that once a given type of document has been written and tweaked to your satisfaction in LaTeX, the next time you write that type of document, you can just reuse the previous document as a template and not have to tweak much at all.

The first document does tend to require lots of tweaking, I'll grant that, especially if you're doing anything unusual that isn't covered by an existing LaTeX package.

Another thing that makes a big difference with LaTeX is the tools that are used to write it. If the "expert LaTeX" authors were writing LaTeX completely by hand, with no aid from any specialized tools, then that would explain their many mistakes. The use of advanced tools like the various LaTeX packages for vim and emacs makes writing LaTeX documents a lot more streamlined and less error-prone.


Compared to Times New Roman? Arial/Helvetica? Calibri? Comic Sans and Papyrus?

I'd hardly put Computer Modern in anywhere near the same category.


You managed to name a bunch of typefaces which are even less suitable for tasks where someone might use Computer Modern (Times is comparable). I’m not sure what that proves....

Instead of Calibri, the Microsoft-provided typeface to use for mathematics is Cambria, https://en.wikipedia.org/wiki/Cambria_(typeface)#Cambria_Mat...

Personally I like Minion, http://www.typoma.com/en/fonts.html

Or combining some kind of Renaissance-style typeface with AMS Euler https://en.wikipedia.org/wiki/AMS_Euler

Or Lucida Math, https://tug.org/store/lucida/opentype.html

If you prefer Times, try MathTime https://en.wikipedia.org/wiki/MathTime TM Math http://www.micropress-inc.com/fonts/tmmath/tmmain.htm or STIX https://en.wikipedia.org/wiki/STIX_Fonts_project


My comment was in response to the assertion that CM is somehow "overused". Compared to the fonts I listed, Computer Modern is pretty obscure.

Maybe I just wasn't clear enough on that part, in which case I sincerely apologize for my poor communication skills.


OK, I get it, but honestly I thought you meant the opposite.


No worries; you apparently weren't the only one ;)

I thought I was pretty clear, but I guess not. Makes me really wish HN had a more forgiving window for editing comments, but oh well.


I think editing comments is disabled once someone posts a reply...


You can edit for up to 2 hours, I believe, regardless of any replies. You can also delete during the same interval, unless someone replies.


Spot on. I laughed at every single font. Each one actually has a long history of abuse.

For replacement fonts, I also like Palatino.


I'll take the opposing view.

When I see a resume in CMR I get the signal that the applicant still has their head in academia. That can be good or bad, in my particular field it's not that great.


I typeset my resume in LaTeX, but I ended up formatting it so it looked almost exactly like it did in Word. I should probably update that at some point.

But I did throw a \LaTeX macro in the skills section, so anyone familiar with LaTeX should notice.


Me nods. I clicked on it fully expecting to see the Computer Modern Roman (CMR) font, only to be surprised. Not that I was repulsed by what I saw. Still nice, but I'd have also posted a variant with CMR. :-)


It was this very reason that I used the CMR webfont to typeset my Jekyll blog (https://sneak.berlin). It's sort of a nerd shibboleth.


In March 2014, I attended a conference in Atlanta. In the subway, some nice guy told me I had to make business cards for the conference. So I went to a Kinko's and asked:

- How much for 250 business cards?
- $69.
- If I provide a PDF and just ask you to print and cut, can you do it for today?
- Yes.
- How much?
- $39. Do you have the file?
- Yes.

It was a lie. I opened a new LaTeX document and typed this thing, thanks to Stack Overflow.

https://github.com/jilljenn/business-card


I don't quite get why LaTeX is special here. You could have done the same in Inkscape, Word or whatever other text-related program.


Even better, you could layout with Inkscape AND typeset the text with LaTeX using textext: https://bitbucket.org/pitgarbe/textext


This is so silly and yet so awesome. I might actually use it in the future.


I don't understand. What was the lie?


I believe he means when he responded "Yes" to the final question (as in "Yes, I have the PDF file for the business card"), he was lying and had nothing. So he quickly searched SO and found what he needed.

Sometimes, that's just what Stack Overflow is there for: a quickly accessible CYA!


That's a bit dramatic to call a lie!


It was a lie, though; the fact that you can then go on to do something such that the other party is never aware of the lie does not stop it from being a lie.

Most interactions are full of such lies, particularly between people with very different skill sets. Lies are social lubricants, and often have very little drama associated with them.

We do however have an interesting social stigma associated with the term "lie", much like "manipulate".


>some nice guy told me I had to make business cards for the conference

why? are they checking for business cards at the entrance or something?


Every person you meet is going to ask what you do, that's the moment to hand over your card.


I've always used it more as a natural conversation closer personally.


These are really beautiful business cards.

But man, does anybody actually get value out of their business cards? No interesting opportunity has ever come up for me as a result of sharing business cards.

At some point I stopped taking them with me. I still have a bunch of them in my desk drawer, they're very outdated, and I only use them for grocery lists.

Is that just me, or are business cards going out of fashion in general?


Don't travel to Japan without business cards. It's really awkward when everyone gives you one and asks for one back, as I think exchanging them is an important part of meeting someone there.


Indeed. Remember: accept them with both hands and briefly inspect them. It's the polite thing to do.


Absolutely. You can also ask about the name pronunciation, etc. This is Japanese small talk.


I think the answer is, it depends. In the circles where I vend stuff, if you don't have one, people look at you as a grade-A idiot. Like you showed up to a meeting without pants/a skirt on.

Heck, even when I go to startup meetups, everyone asks for a card (medium-sized Pacific Northwest city).

If they're going out of style, they're still fashionable here.


Really interesting business cards simply don't look like a business card, and the direct communication features (contact details) are secondary.

If a business card can delight people who receive it, then it has a point.

In this epoch, if sharing contact details requires a hardcopy, I'm of the opinion that it's of very limited value. Although it's useful to have business cards for many locales, since they are a customary item. (In many parts of Asia, it's essential)

So, if a card can communicate some meta information (sense of creativity, problem solving, good humor, etc.), that's going to have value. It will create an interesting memory.

I think it's worth having a few designs made, but it's a matter of ROI.

If you think you need a business card, think about why, where and who they are for.


> If a business card can delight people who receive it, then it has a point.

The first rule of the advertising club is AIDA: Attention, Interest, Desire, Action. A business card should draw attention, if only to distract from the terribly clichéd action of handing over a business card :)


> Is that just me, or are business cards going out of fashion in general?

Because you don't do business, obviously.

When you go to events, business shows, networking events, conferences, hiring fairs... you give business cards to your contacts so they can remember who you are and contact you back. It is very important.


I hand them out frequently. Sometimes to people I already know who want my contact information. It's easier to hand someone a card than to disrupt an interesting conversation to exchange details.


What sort of opportunity are you imagining?

I trade cards with vendor sales reps and support people who visit. They have my info so they can let me know about new products or obsolescence issues, and I have their info so I can call them for support and product information.


We got our first big break 4 months ago by giving out a business card. You have to make a connection first before you slip the card. I think your experience is more a reflection of your selling skills than anything.


As another poster said, business cards are important in Asia (not only Japan). Hand it over and receive it with two hands, inspect it carefully, hold it or leave it on the table for a while (don't just stuff it away immediately).

Having said that, I have for a while now used a dedicated email address for business cards to track their effectiveness, and extremely rarely receive anything on there.


I regularly hire and contract out work based on business cards hauled from conferences, travel, et cetera. At the end of the evening I make notes on the back. (Right after meeting someone interesting, I crease a corner.)


Just out of curiosity, what do you do with a card which has one of its corners creased already?


I'm guessing that you don't live in Japan. There, opportunities may or may not come after you have executed your card exchange, but don't leave home without them.


Nice!

I recently re-did my business cards. Pretty happy with how they came out: https://pbs.twimg.com/media/C25LX0bXAAAWWYL.jpg:large


That domain name is pretty bold! I am very much amused.


Really nice how you pulled off name + email + website in one line.


Awesome, I much prefer a creative idea than knowing the tool used for creation.


Hey, what font did you use? One notable feature is that the @ is very circular which is quite rare with fixed fonts.


It's called Cutive Mono! Took a while to settle on one :p


Where do I order cards like these?


Ordered through moo.com :)


I second moo.com as a provider of fine business cards.


I think it might have been easier to start with the `standalone` document class instead of `article`; then you don't have to spend so much time on setup. You just have to make sure the TikZ pictures are the right size, and standalone will put each picture on a new page.
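For instance, something like this minimal sketch (placeholder name and contact details, not the article's actual code) gives you a single page cropped to the card:

    \documentclass[tikz]{standalone}
    \begin{document}
    \begin{tikzpicture}
      % 85mm x 55mm card outline (default TikZ unit is 1cm)
      \draw[rounded corners=2mm] (0,0) rectangle (8.5,5.5);
      \node[anchor=west] at (0.6,3.8) {\Large Jane Doe};
      \node[anchor=west] at (0.6,3.0) {\small jane@example.com};
      \node[anchor=west] at (0.6,2.4) {\small example.com};
    \end{tikzpicture}
    \end{document}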


Has anyone ever used QR codes on business cards or store doors? It seems like a waste of space; you need to have a special app on your phone to read it. I think most people don't use that.


I've actually used QR codes once or twice in the last... six years. I still have software that'll scan them installed, but just because Scanbot coincidentally does it as a side-effect of what I actually use it for.

That said, the use on this card of encoding a complete vCard is actually pretty decent. It's not a tiny URL which you could enter in a few seconds.


Most phones nowadays have a shortcut button from the lock screen to open the camera; I wish the phone companies would have one shortcut for a barcode scanner. Or, in the camera app, have an extra button to turn the scanner on -- mine has "take picture" and "take video" -- where the scanner reads the live preview until it recognizes a barcode.


One of the first apps I install on my phone is ZXing's Barcode Scanner, specifically so I can scan QR codes. On Android, it's hands-down the easiest way to post WiFi details for a LAN party or similar (no idea how to do it with iPhones — maybe Apple sells some $99 AirWiFiScanner thing?).

I'll scan 'em to see if there's useful info there. I'm a little concerned that there may be some way to do something malicious with a QR code, but I don't believe I've ever seen a POC.

I'll also use them to set up TOTP. Incredibly easy.

Honestly, I wish that they were used more often. In my experience they work flawlessly.


Exactly. It doesn't save the user much time at all, if any, and it detracts from the simplicity of a clean design. It's a solution in search of a problem.



As an iOS user, I always assumed QR code reading was just built into Android's camera app. Is that not so?


(from memory, may be out of date)

The AOSP camera app does not support QR code reading, and neither does Google's camera app. Many Android phones will have one or both of these installed. Most phones also come with a manufacturer-specific camera app. I know new-ish Motorola phones will read QR codes, and I suspect that the "kitchen sink" vendors like Samsung would as well.


It is kind of indirect: Google Now Screen Search supports QR codes, so if you have a way to get the code onto your screen (via the camera app viewfinder, for example) you can read it.


It's built into Windows Phone's camera app, but on the Android phones I've tested, there's no QR code recognition.


Kind of.

Google Now has QR code reading built in; if you have a QR code on the screen (e.g. through a camera app) you can have Google Now scan the screen by holding the home button.

I don't remember having to enable it, but if it isn't enabled for android users you can go into Google Now settings and enable "Screen Search".


Nope. Android can't read QR codes out of the box.

You need to get one of the 100 applications from the app store, each competing to show the most ads and be the least effective at decoding the picture.

Are you saying that the iPhone camera can read QR code out of the box?


> You need to get one of the 100 applications from the app store, each competing to show the most ads and be the least effective at decoding the picture.

The ZXing Barcode Scanner doesn't show any ads. IIRC, they give it away for free because they want people to use their (open source, natch) barcode library. I've been very happy with it.

This of course demonstrates a problem with the App Store model: neither Google nor Apple has a strong incentive or interest in directing users to the best, most secure, most privacy-preserving apps (particularly Google, who would prefer that privacy become a thing of the past): they would much rather that you buy something, or use some ad-ridden piece of nearly malware.


Yeah, I find it so strange that the Google Authenticator setup includes a step to download a recommended 3rd-party scanner, given that GA is pretty much the most important app security-wise you can install on the phone.


I ditched GA after I tried Authy. Built-in QR scanner, plus syncing with non-mobile devices.


There actually is a first-party app from Google that can read QR codes: Google Goggles. Besides QR codes, I've also found it handy for scanning the ISBN bar codes of books that I want to remember.


> Are you saying that the iPhone camera can read QR code out of the box?

No, I meant that iOS doesn't have any built-in apps for it, so I always assumed Android must since QR codes are so prevalent.


Right; I mean, I guess I could scan it to find out - but what is it, a link to OP's website?

What could it possibly be that's easier to convey through a QR code - especially given that it's on a bit of paper you can keep?


I usually put a vCard on them rather than a link (especially on business cards); then you don't have to type things into your phone.
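For reference, the payload is just plain vCard text (here a minimal version 3.0 example with made-up details); most QR generators accept it directly, and scanners then offer to add the contact:

    BEGIN:VCARD
    VERSION:3.0
    N:Doe;Jane;;;
    FN:Jane Doe
    TEL;TYPE=CELL:+15550100
    EMAIL:jane@example.com
    URL:https://example.com
    END:VCARD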


It's a complete vCard. So if you scan it, you get an "add to your contacts?" prompt.


Not on business cards & store doors, but WeChat Pay and Alipay both use QR codes for payment, so I use them anywhere between 1-5 times a day. The washing machines downstairs too.


I use Snapchat; it picks up QR codes. It's my default camera just in case I want to caption something.


Business Tip: ALWAYS use a white background on a business card (or at least leave one side plain white).

People will write and take notes on the card. You can't take notes on a dark background.


I completely agree and would add that your name, telephone number and email address should be a large enough font that people don't have to take out their reading glasses or squint to read it.


Can't agree more! I see a lot of bad business cards that don't leave one side plain white.


You don't need an advanced text formatting system to design a business card. Any system that allows you to place short strings of text at any desired (x,y) position will do just fine. In fact, such a simple system might be better at not getting in your way.


The author addresses this in the second paragraph.

> I picked LaTeX because I want to have a platform independent implementation and because why not? I really like making LaTeX documents, so this seemed like something other than creating long documents.


I've done my resume in CSS for a decade now (with print media styles to generate a PDF) for this reason.

I don't currently use business cards but if that changed would probably do the same.


Mine is a PCB (Printed Circuit Board). I designed it with KiCAD. Open source, CC-BY-NC-SA: https://github.com/bdc0/businesscard


I took graphic design in school (so I've seen plenty of business card concepts) and I just wanted to say that's the most creative card I've seen in years! I'd cherish something like that if you gave it to me - no matter how much they cost the impact is definitely worth it. Wow!


You aren't the only one who likes it; it definitely gets noticed. They cost ~$1 (US) in quantity.

The most common question I get is "If I install the chips, what do I get?" Unfortunately, the answer is nothing. There are people who have done that.


Serious question: Is LaTeX used outside academia these days?


Check out http://tex.stackexchange.com/questions/40720/latex-in-indust....

I like the one about Deutsche Bahn (the German railway company) using TeX to typeset the timetables at train stations.


But is it still true?

The downloadable files containing several time tables say:

    Creator:        Adobe InDesign CS6 (Windows)
    Producer:       Acrobat Distiller 10.1.16 (Windows)
While the ones with just a single one say:

    Creator:        Hafas Print Bahnhofstafeln 4.6.5
    Producer:       PDFlib 9.0.5 (Win32)


I'm finishing up a 350+ page family history book that I wrote in LaTeX; I've got two or three more to go, and I wouldn't consider using anything else. It's got its quirks, but it was so easy to work with page formatting, footnotes/endnotes, and sections that it made up for all the fiddly issues I ran into with images. I have also been using the genealogytree[0] package (which relies on TikZ, coincidentally used by OP) to draw diagrams programmatically. In addition to satisfying my scripting itch, it looks beautiful despite my unlearned eye for graphic design.

Related: I wrote a resume in LaTeX a few years ago. After an in-person interview, the manager took me to meet the team: when we walked in, they were crowded around a monitor arguing about how I'd made such a cool-looking CV. It probably didn't actually look all that good, but anything that isn't Word stands out nowadays.


For longer documents in TeX, I wrote a collection of TeX macros to make cross referencing easy. The collection was a little tricky to write but is fairly easy to use and works great. The package is based on logical names that serve as, say, pointers; in one place you define such a name, and in other places you refer to it and get the page number, etc. inserted where the reference is made. Given some simple examples, it's really easy to use. I have a simple macro for my favorite text editor that will create a new such logical name (the next one according to a scheme). Having an easy way to have cross references is nice.

I also have something similar but simpler for bibliographic references.

TeX, some TeX macros, a good text editor with a good macro language (e.g., KEdit), and a good spell checker (e.g., ASpell with the last TeX distribution I got) are super good writing tools to have.


Your cross-referencing macros sound interesting. Are they publicly available anywhere?


Available? See the next three posts.

Offered as-is, no warranties. Use at own risk.

They work nicely for me. I haven't looked at the code in years, so there may be some dependencies, likely minor, maybe just in documentation, on some of my other macros.

There may be some subtle bugs, but I haven't found any. If you find a bug and know just what usage encounters the bug, then don't do that usage again! Or fix the bug!

There is enough documentation so that you can see the ideas -- actually they are all quite simple. The macros were a good TeX exercise.


Part II

     \newread\XREFileIn
      % \message{\string\XREFileIn = \the\XREFileIn}
      \newwrite\XREFileOut
      %
      \newcount\SNChapter
      \newcount\SNUnit
      \newcount\SNTable
      \newcount\SNFigure
      %
      \SNChapter=0
      \SNTable=0
      \SNFigure=0
      %
      \def\SNChapterPrefix{}
      %
      \def\TrimTag#1 {#1}          % For trimming trailing blanks.
      \def\First#1 #2 #3 {#1}      % Parsing first token.
      \def\Second#1 #2 #3 {#2}     %   "     second "
      \def\Third#1 #2 #3 {#3}      %   "     third  "
      \def\SNUNone{CH UN PG}       % Value for tags before there is an XRF file.
      \def\SNTNone{TB JUNK PG}     % Value for tags before there is an XRF file.
      \def\SNFNone{FG JUNK PG}     % Value for tags before there is an XRF file.
      \def\SNPNone{PT JUNK PG}     % Value for tags before there is an XRF file.
      %
      \def\SNUAdvanceChapter{\advance\SNChapter by1
      \SNUnit=0}
      %
      % Define a 'chapter' tag:
      %
      % Need a \global on \advance because might have the \SNC from
      % within a group, e.g., {\bf \SNC ...}.
      %
      \def\SNC#1{{\global\SNUnit=0
      \write\XREFileOut{\string#1}%
      \def\JUNKA{\SNChapterPrefix\the\SNChapter\space JUNK}%
      {\edef\JUNKB{\write\XREFileOut{\JUNKA}}\JUNKB}%
      \write\XREFileOut{\the\count0}}}%
      %
      % Define a 'unit' tag:
      %
      % To be more sure page number is correct, add text to list
      % BEFORE writing page number to XRF file.  In principle could
      % do the same for \SNT and \SNF but from how these are used in
      % practice a page number error would be nearly impossible.
      %
      \def\SNU#1{{\global\advance\SNUnit by1
      \write\XREFileOut{\string#1}%
      \SNChapterPrefix\the\SNChapter.\the\SNUnit
      \def\JUNKA{\SNChapterPrefix\the\SNChapter\space\the\SNUnit}%
      {\edef\JUNKB{\write\XREFileOut{\JUNKA}}\JUNKB}%
      \write\XREFileOut{\the\count0}}}%
      %
      % BEGIN Modified at 04:16:31 on Tuesday, December 29th, 2015.
      %
      %    In this collection, we now have a new
      %    macro
      %
      %         \SNP -- Sequentially Number Pointer
      %
      %    This macro intended for cross
      %    referencing to a place in text, that
      %    is not specifically to a 'unit'
      %    (definition, theorem), table, or
      %    figure.
      %
      %    In short, can have
      %
      %         Note\SNP{\SNTagCU}
      %         that, for $a, b \in R$ and
      %
      %    which will define tag \SNTagCU but
      %    insert nothing into the document.
      %
      %    Then elsewhere can have
      %
      %         Since as on page
      %         \SNCGetPage{\SNTagCU}
      %         an inner product is bilinear,
      %
      %    which will insert into the document
      %    the page number of the page where the
      %    macro \SNTagCU was defined.  Many
      %    more details below:
      %
      %    The macro \SNP is part of this
      %    package of cross referencing and
      %    sequential numbering but does not
      %    actually 'number' or 'sequentially
      %    number' anything.  We start the name
      %    of \SNP with 'SN' just to regard
      %    macro names of the form SNx as
      %    'reserved' in own TeX usage.
      %
      %    Then elsewhere in the document, can
      %    refer to that place by its page
      %    number, chapter number (apparently
      %    with its chapter prefix), etc.
      %
      %    E.g., if give a little discussion of,
      %    say, bilinear, can type, say,
      %
      %         Note\SNP{\SNTagCU}
      %         that, for $a, b \in R$ and
      %
      %    and that will just 'define' tag
      %    SNTagCU.  Of course the tag SNTagCU
      %    was likely from running own KEdit
      %    macro isntag to find the new tag name
      %    SNTagCU and insert it in the argument
      %    of the macro \SPP. That is, in KEdit
      %    we would have line
      %
      %         Note\SNP{}
      %
      %    current, run KEdit macro isntag, and
      %    get a new tag found an inserted to
      %    have something like
      %
      %         Note\SNP{\SNTagCU}
      %
      %    Then elsewhere in the document can
      %    write, say,
      %
      %         Since as on page
      %         \SNCGetPage{\SNTagCU}
      %         an inner product is bilinear,
      %
      %    and get the page number of the page
      %    in the document with
      %
      %         Note\SNP{\SNTagCU}
      %
      %    that is, where tag \SNTagCU was
      %    defined.
      %
      %    So, the lines
      %
      %         Note\SNP{\SNTagCU}
      %         that, for $a, b \in R$ and
      %
      %    write to the XRF file the standard
      %    three lines that would be written by,
      %    say, macro \SNU. With those three
      %    lines, the first has the tag
      %
      %         \SNTagCU
      %
      %    the next line has chapter and unit
      %    and the third line has the page
      %    number, all for what was the case for
      %    that part of that page when the macro
      %    \SNP was run.
      %
      %    Then, macros
      %
      %         \SNUGetChapter\SNTagCU
      %         \SNUGetUnit\SNTagCU
      %         \SNUGetPage\SNTagCU
      %
      %    will all work fine to extract the
      %    data on, respectively, chapter, unit,
      %    and page and insert it into the
      %    document.  That is, there is no
      %    reason to have separate macros to
      %    extract and for macro \SNP.
      %
      %    Really the macro \SNP is just like
      %    macro \SNU except does not add 1 to
      %    SNUnit and does not insert into the
      %    document the chapter prefix, chapter
      %    number, and unit number.  So, where
      %    macro \SNP is used, nothing is
      %    inserted into the document; this is
      %    in contrast with, say, macro \SNU and
      %    is why need the new macro \SNT
      %


Part I:

      % XREF001.TEX --
      %
      % Created at 02:29:35 on Thursday, April 13th, 2006.
      % ======================================================================
      %
      % Modified at 01:03:52 on Thursday, April 13th, 2006.
      %
      %    Macros for sequential numbering and cross-references
      %
      %    Macros \SNUAdvanceChapter, \SNU#1, \SNT#1, \SNF#1,
      %         \GetSN, \SNC#1, \SNCGetChapter#1, \SNCGetPage#1,
      %         \SNUGetChapter#1, \SNUGetUnit#1, \SNUGetPage#1,
      %         \SNTGetTable#1, \SNTGetPage#1, \SNFGetFigure#1,
      %         \SNFGetPage#1, \SNChapterPrefix
      %
      %    Counters \SNChapter, \SNUnit, \SNTable, \SNFigure,
      %
      %    Here SN abbreviates 'sequential numbering'.
      %
      %    Here we have a start on a fairly general collection of
      %    macros for sequential numbering and cross-referencing.
      %
      %    For example, maybe in some chapter, that at present is
      %    chapter 3, we want to sequentially number definitions,
      %    theorems, remarks, and examples -- call them all 'units'
      %    -- as in:
      %
      %         3.1 Definition:
      %
      %         3.2 Theorem:
      %
      %         3.3 Definition:
      %
      %         3.4 Remark:
      %
      %         3.5 Example:
      %
      %    Then with this package we would select some tags, say,
      %    starting with SNTag, and type
      %
      %         \SNU{\SNTagA} Definition:
      %
      %         \SNU{\SNTagB} Theorem:
      %
      %         \SNU{\SNTagC} Definition:
      %
      %         \SNU{\SNTagD} Remark:
      %
      %         \SNU{\SNTagE} Example:
      %
      %    So, macro \SNU abbreviates 'sequential numbering with
      %    units'.  So, with macro \SNU we get a case of automatic
      %    sequential numbering.
      %
      %    Suppose 3.1 Definition: appeared on page 43.  Then
      %    elsewhere we could write
      %
      %         See \SNUGetChapter\SNTagA.\SNUGetUnit\SNTagA Theorem
      %         on page \SNUGetPage\SNTagA.
      %
      %    and get
      %
      %         See 3.1 Theorem on page 43.
      %
      %    So, we get automatic cross-referencing.
      %
      %    Of course, to do cross-referencing in such a general way,
      %    need to run TeX at least twice, once to find, for each
      %    sequentially numbered 'unit', its page and write this
      %    data to a file, and once to read this file and insert the
      %    sequential numbering and page numbers where desired in
      %    cross-references.
      %
      %    The file is \jobname.XRF.
      %
      %    To use this sequential numbering with units,
      %
      %         o    At each chapter, after the \eject, if there is
      %              one, and just before the \hOne, have
      %
      %                   \SNUAdvanceChapter
      %
      %              to increment by one the counter \SNChapter and
      %              set the counter \SNUnit to 0.
      %
      %         o    Otherwise use macros
      %
      %                   \SNU\SNTagA
      %                   \SNUGetChapter\SNTagA
      %                   \SNUGetUnit\SNTagA
      %                   \SNUGetPage\SNTagA
      %
      %              as illustrated.
      %
      %         o    Then run TeX at least twice, basically until
      %              the file \jobname.XRF quits changing.
      %
      %    So, we get sequential numbering and cross-referencing with
      %    'units'.  The macros and counters particular to 'units'
      %    all begin with \SNU.
      %
      %    We let
      %
      %         \newcount\SNUnit
      %
      %    keep track of the number of the unit.  So, when we use
      %
      %         \def\SNUAdvanceChapter{\advance\SNChapter by1
      %         \SNUnit=0}
      %
      %    to increment
      %
      %         \newcount\SNChapter
      %
      %    we also set \SNUnit=0 for the new chapter.
      %
      %    With the \SNU macros as illustrated, will get file
      %    \jobnane.XRF like
      %
      %         \SNTagA
      %         1 1
      %         1
      %         \SNTagB
      %         1 2
      %         1
      %         \SNTagC
      %         1 3
      %         1
      %         \SNTagD
      %         1 4
      %         1
      %         \SNTagE
      %         1 5
      %         1
      %         \SNTagF
      %         1 6
      %         1
      %         \SNTagG
      %         1 7
      %         1
      %         \SNTagH
      %         1 8
      %         1
      %         \SNTagI
      %         1 9
      %         1
      %
      %    So, get three lines for each invocation of \SNU.  The
      %    first line has the tag.  The second line has data
      %    particular to 'units' macros.  And the third line has the
      %    page number.
      %
      %    For in Appendix I, may want units to go
      %
      %         A.I.1.2
      %
      %    that is, to have a 'prefix' of 'A.I.'.  To this end there
      %    is macro
      %
      %         SNChapterPrefix
      %
      %    which is default empty.  Setting this macro to
      %
      %         \def\SNChapterPrefix{A.I.}
      %
      %    will give the prefix 'A.I.'.  But, such a prefix should
      %    have no blanks else the parsing of the macros that get
      %    cross-referencing information will get confused!
      %
      %    But we may also want sequential numbering and
      %    cross-referencing for tables, figures, equations, etc.
      %
      %    Then the idea is to have more sequential numbering
      %    macros.  But, do not want a proliferation of files.  So,
      %    these other macros should also use the file \jobname.XRF.
      %    For compatibility, each of these other macros should also
      %    write three lines to file \jobname.XRF.  The main freedom
      %    is just in the second line.
      %
      %    Each tag, e.g., \SNTagA, becomes a macro.  So, each tag
      %    should be spelled like a TeX macro and have spelling
      %    different from all other TeX macros in use.
      %
      %    During the first pass of TeX, macros
      %
      %         \SNUGetChapter\SNTagA
      %
      %         \SNUGetUnit\SNTagA
      %
      %         \SNUGetPage\SNTagA
      %
      %    will notice that \SNTagA is not defined and will define
      %    it as the macro \SNUNone from
      %
      %         \def\SNUNone{CH UN PG}
      %
      %    the three tokens of which are intended to abbreviate,
      %    respectively, chapter, unit, and page.
      %
      %    Then each of the three macros
      %
      %         \SNUGetChapter\SNTagA
      %
      %         \SNUGetUnit\SNTagA
      %
      %         \SNUGetPage\SNTagA
      %
      %    will return 'CH', 'UN', and 'PG', respectively, as
      %    temporary place holders and a way to get page breaking
      %    approximately correct before another pass of TeX that
      %    will read and use the XRF file written on the previous
      %    pass.
      %
      %    We also have macros for tables and figures.  For tables,
      %    the names begin with SNT; figures, SNF.
      %
      %    The main challenge was in macro SNU (SNT, SNF) that has
      %    to write both the sequential numbering and the page
      %    number.  The cause of the challenge was how, really,
      %    when, TeX actually does page breaking.  So, once TeX
      %    starts a new page, it keeps adding lines until it clearly
      %    has enough, and maybe too many, lines for that page.
      %    Then TeX decides where to break the page, uses any left
      %    over lines for the next page, and continues.  So, when a
      %    macro in the text executes, TeX does not yet know the
      %    page number but the macro does know the chapter number
      %    and, say, the unit number.
      %
      %    When TeX sees a \write, TeX temporarily puts the \write
      %    in a 'whatsit' until the page breaking and then executes
      %    the \write.  'Whatsits' are discussed starting a line
      %    14,007 of the file of TEXBOOK.TEX.
      %
      %    The trick, then, is to have
      %
      %         \def\JUNKA{\the\SNChapter\space\the\SNUnit}
      %         {\edef\JUNKB{\write\XREFileOut{\JUNKA}}\JUNKB}
      %         \write\XREFileOut{\the\count0}
      %
      %    In the first line we define \JUNKA that has the data to
      %    be written.  The second line has a group with an \edef
      %    (define a macro expanded immediately) of a temporary
      %    macro \JUNKB with the \write and with the data to be
      %    written and, then, with an invocation of \JUNKB.  Due to
      %    the \edef, the whatsit for the \write has only constants
      %    and the right constants.  Those constants are from
      %    counters
      %
      %         \SNChapter
      %         \SNUnit
      %
      %    which might well change by the time the \write is
      %    executed; but, the constants will still have the values
      %    we need to have written.
      %
      %    On the third line, we use \write to write the page
      %    number, and the page number in \count0 is expanded at the
      %    time of the page breaking and, thus, has the correct page
      %    number.
      %
      %    Another challenge is how to handle the data read from
      %    file \jobname.XRF:  As each line is read, it has a
      %    trailing blank.  So, have to use and/or parse the blank.
      %


Part III

      \def\SNP#1{{\write\XREFileOut{\string#1}%
      \def\JUNKA{\SNChapterPrefix\the\SNChapter\space\the\SNUnit}%
      {\edef\JUNKB{\write\XREFileOut{\JUNKA}}\JUNKB}%
      \write\XREFileOut{\the\count0}}}%
      %
      % END Modified at 04:16:31 on Tuesday, December 29th, 2015.
      %
      % Define a table tag:
      %
      \def\SNT#1{{\global\advance\SNTable by1
      \write\XREFileOut{\string#1}%
      \def\JUNKA{\the\SNTable\space JUNK}%
      {\edef\JUNKB{\write\XREFileOut{\JUNKA}}\JUNKB}%
      \write\XREFileOut{\the\count0}%
      \the\SNTable}}
      %
      % Define a figure tag:
      %
      \def\SNF#1{{\global\advance\SNFigure by1
      \write\XREFileOut{\string#1}%
      \def\JUNKA{\the\SNFigure\space JUNK}%
      {\edef\JUNKB{\write\XREFileOut{\JUNKA}}\JUNKB}%
      \write\XREFileOut{\the\count0}%
      \the\SNFigure}}
      %
      % Get all the sequence number data from file \jobname.XRF:
      %
      \def\GetSN{{\openin\XREFileIn=\jobname.XRF
      \pushCount\InChapter
      \pushCount\InUnit
      \pushCount\InPage
      \pushCount\MoreFlag
      \ifeof\XREFileIn\MoreFlag=0 \else\MoreFlag=1 \fi\relax    % \jobname.XRF exists?
      \ifnum\MoreFlag=1 \relax
        \loop
          \read\XREFileIn to\LineIn
          \ifeof\XREFileIn\MoreFlag=0 \else\MoreFlag=1 \fi
          \ifnum\MoreFlag=1
            \let\InTag=\LineIn
            \read\XREFileIn to\LineIn
            \edef\InValues{\LineIn}
            \read\XREFileIn to\LineIn
            \InPage=\LineIn
            \edef\JUNK{\InValues\the\InPage}%
            \global\expandafter\let\InTag=\JUNK
        \repeat
      \fi
      \closein\XREFileIn
      }}
      %
      % Get cross-references on 'chapters':
      %
      \def\SNCGetChapter#1{\ifx#1\undefined
      \edef#1{\SNUNone}\else\fi
      \expandafter\First#1 }
      %
      \def\SNCGetPage#1{\ifx#1\undefined
      \edef#1{\SNUNone}\else\fi
      \expandafter\Third#1 }
      %
      % Get cross-references on 'units':
      %
      \def\SNUGetChapter#1{\ifx#1\undefined
      \edef#1{\SNUNone}\else\fi
      \expandafter\First#1 }
      %
      \def\SNUGetUnit#1{\ifx#1\undefined
      \edef#1{\SNUNone}\else\fi
      \expandafter\Second#1 }
      %
      \def\SNUGetPage#1{\ifx#1\undefined
      \edef#1{\SNUNone}\else\fi
      \expandafter\Third#1 }
      %
      % Get cross-references on tables:
      %
      \def\SNTGetTable#1{\ifx#1\undefined
      \edef#1{\SNTNone}\else\fi
      \expandafter\First#1 }
      %
      \def\SNTGetPage#1{\ifx#1\undefined
      \edef#1{\SNTNone}\else\fi
      \expandafter\Third#1 }
      %
      % Get cross-references on figures:
      %
      \def\SNFGetFigure#1{\ifx#1\undefined
      \edef#1{\SNFNone}\else\fi
      \expandafter\First#1 }
      %
      \def\SNFGetPage#1{\ifx#1\undefined
      \edef#1{\SNFNone}\else\fi
      \expandafter\Third#1 }
      %
      %    Read file \jobname.XRF here, now, before anything else.
      %    Get this reading done before any hint of a need to write
      %    to that file.
      %
      % Modified at 08:13:46 on Monday, January 4th, 2016.
      %
      %    If a TeX file has
      %
      %         {\bf \SNU{} Theorem:}\ \
      %
      %    instead of, say,
      %
      %         {\bf \SNU{\SNTagDA} Theorem:}\ \
      %
      %    the TeX can die here:
      %
      \GetSN
      %
      %    Important:  First read file \jobname.XRF.  Then open it
      %    for writing.  This open will give the file length 0 and,
      %    thus, destroy the data, if any, just read.  If
      %    execute
      %
      %         \write\XREFileOut{\string#1}
      %
      %    etc. without doing an open, then output will go to
      %    console.
      %
      %    From some fairly careful experiments, commands
      %
      %         \newread\XREFileIn
      %         \newwrite\XREFileOut
      %         \openout\XREFileOut=\jobname.XRF
      %         \closein\XREFileIn
      %
      %    seem to ignore TeX block nesting and to be fully 'global'
      %    across blocks.
      %
      \openout\XREFileOut=\jobname.XRF
      %
      % ======================================================================


I've also published multiple books for myself and my friends and family using LaTeX, use LaTeX whenever I want to print out and post or distribute an article that looks crappy in the original, and I've long written my resumes in LaTeX.


I thought my LaTeX CVs were great, but my wife always does better in... Illustrator, of all programs.


In my experience there's a lot of layout that's been done in Illustrator. I suspect it's because of how long Illustrator's been around and had pretty good layout support, especially when compared with the early years of Aldus PageMaker and QuarkXPress and other tools. Illustrator was needed to do a lot of graphics work that these other tools didn't support, so it can make sense to do as much as you can in a single tool. If you can get good results doing everything you need in terms of text handling in Illustrator, why take the time to use a DTP application when you're likely going to need to use Illustrator anyway for graphics work?


I'm very curious to see this CV!


Yes.

For example, I wrote "iOS and macOS Performance Tuning: Cocoa, Cocoa Touch, Objective-C, and Swift"[1][2] using LaTeX, and I think it came out rather well (Pearson has some pretty amazing LaTeX compositors that took my rough ramblings and turned them into something beautiful).

Quite a while ago, I also used TeX (not LaTeX, IIRC) as part of the typesetting backend of a database publishing tool for the international ISBN agency, to publish the PID (Publisher's International Directory). This was a challenging project. IIRC, each of the directories (there were several) was >1000 pages, 4 column text in about a 4 point font. Without chapter breaks. My colleagues tried FrameMaker first on a subset, they let it run overnight and by morning it had kernel-panicked the NeXTStation we were running it on. The box had run out of swap.

TeX was great, it just chugged away at around 1-4 pages per second and never missed a beat. The customer was very happy. The most difficult part was getting TeX not to try so hard to get a "good layout", which wasn't possible given the constraints and for these types of entries just made everything look worse.

[1] https://www.pearsonhighered.com/program/Weiher-i-OS-and-mac-...

[2] https://www.amazon.com/gp/product/0321842847/ref=as_li_tl?ie...
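
For the curious, "not trying so hard" mostly comes down to a few standard TeX knobs. A sketch, not the exact settings from that project:

    \raggedright          % give up on justification for the entries entirely
    \pretolerance=10000   % accept the first-pass line breaks (no hyphenation attempts)
    \tolerance=10000      % accept arbitrarily loose lines if a second pass is needed
    \hbadness=10000       % ... and don't flood the log with badness complaints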


Yes. I am a lawyer with mild dyslexia, and I often got into trouble for missing stuff in my letters or rearranging the letters of things like the other parties' names or case numbers. So I use the LaTeX letter document class and import the other party's information from an .adr file. That way I only have to type the darn thing once. I think I can automate more stuff in documents where I often make errors, but I'm just getting started with LaTeX. It's been a big stress reliever for me.
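
For anyone curious, a minimal sketch of the idea (the file name and macro names here are made up for illustration, not my actual setup): keep the error-prone details in one small file and \input it into every letter.

    % parties.adr -- type the risky details exactly once:
    %   \def\OpposingParty{Acme Widgets, Inc.}
    %   \def\CaseNumber{2016-CV-01234}

    \documentclass{letter}
    \input{parties.adr}                 % pull in the shared definitions
    \signature{J. Q. Attorney}
    \begin{document}
    \begin{letter}{\OpposingParty}
    \opening{Dear Counsel:}
    Re: Case No.~\CaseNumber.
    \closing{Sincerely,}
    \end{letter}
    \end{document}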


Interesting. I always instruct my lawyers to reference all counterparty information only once, either in the header or preferably in the signature block, and then make everything else reference an impersonal defined term. I hate transaction-specific information being littered randomly around a document.


I work on supply chain optimization at Target, and we write lots of internal documents in TeX. Admittedly the team has a relatively academic background—many PhDs in math, CS and OR—but it's still industry use :).

Some Word/PowerPoint creeps in as well, but I highly prefer the TeX documents. They look better, are easier to edit/reuse and slot neatly into Git. I really wish more people used it (or Markdown or whatever) instead of Office nonsense.


Completely agreed (happy LaTeX user in industry). However, I have recently been using org-mode for emacs and treating LaTeX as one possible output format. Being able to output to HTML/markdown is really useful for automating import into more commonly used tools.

For anything with non-trivial maths or structure, I'd still use LaTeX, but it's very nice to have an (easier, more portable) choice of output formats.


I use it when I need to write some documents (user guides etc.). If the alternative is OpenOffice or some other "WYSIWYG" editor, I'll take LaTeX any time.

I don't really love it, though; I find myself using org-mode more and more for that purpose, then letting emacs generate the LaTeX code for my doc. It's not nearly as flexible, but it works well and your source is a lot more readable.

I also use LaTeX for my resume, if you find a good theme they end up looking pretty sharp and professional with little work.


If you're converting Org documents to PDF, you can use Pandoc (http://pandoc.org/) to automatically convert & send through pdfLaTeX.


Thanks, that looks useful. But org-mode's built in "org-latex-export-to-pdf" worked well enough for me so far.


I do EECS research in the aerospace industry. Every journal we ever submit to either requires or accepts LaTeX. I also used it regularly to write reports as an undergrad. At least on the EECS side of the engineering field, LaTeX is alive and well.


"Mastering MuseScore" was written in LaTeX.

The presentation and content are excellent.

https://musescore.org/en/node/64711


Well, this was in academia and it was a long time ago, but when I was in college at the turn of the century I used it. Once I noticed that the exact same paper got a whole extra grade in LaTeX, I never went back to word processors.


Unlike what most LaTeX users may think, LaTeX is actually not even widely adopted in academia, with less than 20% of scholarly articles published every year written using LaTeX (https://www.authorea.com/107393-how-many-scholarly-articles-...). That said, it is the only powerful option for professionally typesetting mathematical notation, and for that reason it is used by a few in some non-academic research fields (military, gov, pharma, tech, HN readers).


If I read the blog post I learn that

- LaTeX is widely adopted in the hard sciences.

- LaTeX is not widely used in other disciplines such as sport science.

I think this matches what most LaTeX users think.


Yes. I just wrote a guidebook with it. It's not that great if you want to do a full-color glossy thing, but the results are way better than using something like Word.


I work in scientific instrumentation, on the product development side. I've never seen LaTeX output (unless it's been well concealed) in a report or paper.

What I'm seeing is that the use of typesetting and even its cousin, word processing, are generally going downhill. My employer has largely given up on print advertising. If we write papers, they're for the more commercial oriented journals or trade mags, that don't use TeX for their own typesetting.

More and more, I see stuff that's just written in the e-mail editor, with graphics copy-pasta'd from screen captures, or directly into web based services such as Office 365. Or, PowerPoint (with all of the pitfalls described by Edward Tufte).

Word processing has practically been relegated to documents that nobody reads, such as functional procedures and announcements from HR.

I realize this all sounds kind of cynical, but I hope is taken in good humor. However, the gist is that writing and correspondence are becoming increasingly informal.


As a sysadmin I use it, though to be fair most often in the form of emacs org-mode export to LaTeX, because I like the way it makes stuff look (so I'm not editing LaTeX directly unless I need to for some reason, which you can do in snippets inside org-mode). I also use it for resumes and anything I want to be formal, such as letters and invitations.


FWIW, I prefer to use it (LaTex with Beamer) to make technical presentations at conferences. This[1] was my most recent output.

Though I spent way more time than I care to admit making those, especially tweaking the TikZ 'diagrams' (my first time trying them) that can be seen later in the slides. Still, I find it quite convincing.

I'm pleased with the end result.

[1] http://events.linuxfoundation.org/sites/events/files/slides/...
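
If it helps anyone get started, the skeleton is roughly this (placeholder theme and diagram, not my actual slides):

    \documentclass{beamer}
    \usepackage{tikz}
    \usetikzlibrary{positioning}
    \usetheme{default}
    \begin{document}
    \begin{frame}{A tiny TikZ `diagram'}
      \begin{tikzpicture}
        \node[draw, rounded corners]                 (a) {producer};
        \node[draw, rounded corners, right=2cm of a] (b) {consumer};
        \draw[->] (a) -- (b) node[midway, above] {queue};
      \end{tikzpicture}
    \end{frame}
    \end{document}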


Hell, I think it's just barely hanging on in academia.


That's definitely not the case in computer science and electrical engineering at least. LaTeX definitely isn't great, but I don't see a viable alternative.


> That's definitely not the case in computer science and electrical engineering at least.

Same for astronomy/astrophysics. Most of the publications are typeset in LaTeX.


In the 1990s I used FrameMaker. It's still around, now owned by Adobe.

https://en.wikipedia.org/wiki/Adobe_FrameMaker


There's always InDesign, but given its time to proficiency, I think the best bet for similar purposes might still be creating LaTeX/DocBook source that ports well to InDesign.


You can't diff an InDesign file, whereas LaTeX is easy to diff.


Indeed, lots of people I know (mainly computer science, electrical engineering and physics) still heavily rely on it for long documents and papers.


In my current experience with States-side academia: draw a line from the most math/theoretical departments, through physics, to the engineering departments. The tendency to use TeX/LaTeX follows that line, from a relatively high probability down to zero by the time you arrive at engineering. In European academia it stays reasonably high throughout.


Hell, even in academia people hate it =\


I think it's not perfect (there's a lot of historical weight and plenty of quirks), but for many academic purposes LaTeX is still the best tool in my opinion, and the results look great if a little effort is spent. :)


I honestly miss LaTeX a lot, especially the quality of documents I could create with it (when I was in academia). Hopefully I'll get a chance to do a white paper later this year and bust out the skills.

But I am totally willing to accept my role as an outlier... I even wrote a small TeX package at one point, so I'm aware it's not a "fun" system. :P


I'm not sure why you were downvoted. Most of the academics I know hate it. They use it because certain journals require it, or their advisor makes them use it.


I use TeX. LaTeX also works, but the books are longer and less well written than Knuth's original TeXBook! :-)!

I love TeX -- it's one of my favorite and most important tools.

I have a Ph.D. in applied math, and IMHO TeX (or LaTeX) is just essential, call that more than ESSENTIAL for my work.

E.g., now I'm a "solo founder" of a startup, a Web site. The crucial core of the work is some original applied math I derived. So, yup, I typed it into TeX. So, as I wrote the corresponding software, I referred back to the TeX output of my core math -- worked great!

Without the math, the software would be impossible; one would just look at the screen and wonder what the heck to type. With the math, the software was just routine, essentially just trivial.

For typing material with a lot of math, I see no reasonable alternative to TeX or LaTeX.

I wrote my Ph.D. dissertation with word processing (thankfully!) but without TeX. What a pain. I could have included more math in the dissertation if I'd had TeX to do the word whacking. More generally, at one point in my career, I could easily have written and published a lot of original and tutorial applied math but didn't because of the difficulty of the math word whacking before TeX.

The last paper I published, some a bit wild mathematical statistics, was a good test for TeX -- some of my mathematical expressions in subscripts were a bit much, but TeX worked flawlessly!

If anyone is typing a lot of mathematical material and objects to TeX, then just encourage them to do the typing without TeX and see if they like that world better!

Computing is changing the world in major ways, some just astounding and/or astoundingly good; math is helping, a lot now and will more in the future; and TeX is just crucial for getting the math word whacking done. But Knuth knew that and did a great job.

So far, for the near and distant future of computing, TeX is one of the stronger pillars of civilization.


I'm pretty sure the overlap of academics who hate TeX and academics who comment on HN is small.


If it doesn't compromise your work, can you speak more of the path you took from a Ph.D. to startups/tech, and how your research allowed you to go down that path?

I'm a Ph.D. student in applied math as well, currently.


Part I

I tried a grad math department and didn't like it: (1) In a course in real analysis, early on the prof discussed some set theory. The summer before I'd had an NSF thing in axiomatic set theory, Suppes, von Neumann, an appendix in Kelley, etc. His first test had a problem, and at the last minute I saw a solution and wrote it down. He called me on the carpet -- nasty guy. I apologized for using little omega for its usual meaning without defining it, and then he saw that my solution was better than his and I was off the carpet. Bummer. He was too quick to cut me off at the knees. (2) Course was in Kelley, General Topology. As a ugrad senior, I'd lectured a prof once a week and covered all of it except the last chapter on compactness when I cut out to finish my honors paper [The typing was so hard that from rolling the carriage a half step my left arm hurt for a year!] But the course in grad school, same book, was beneath me. I turned in a stack of solved exercises and was a nice guy -- I didn't submit any I'd done in ugrad. Waste of time. (3) There was an abstract algebra course from Herstein's book -- by then nearly all beneath me. E.g., my ugrad honors paper had been on group representation theory which is heavy linear algebra and abstract algebra stuff. I solved some exercise in ring theory and got sent to a full prof. The only thing new in the course for me was Galois theory, so I studied that some weekend and took an oral exam for the course. Waste of time.

I wanted the math for math-physics but didn't see how to get that there. Certainly not Galois theory. There were some good ways but not with the courses they put me in. The specs for the q-exams were a disaster -- the faculty committee had a political train wreck. Bummer.

I got recruited by the NBS&T in DC. Getting to DC then was the land of milk and honey for applied math. I got married, and she went for her Ph.D.

We had a great time, good French cheese, some quite good French wine, lots of plays, concerts, trips to Shenandoah, etc.

I got into descriptive statistics, multi-variate statistics, statistical hypothesis testing, numerical linear algebra, curve fitting, the fast Fourier transform, second order stationary stochastic processes, extrapolation, and power spectral estimation, optimization, the Navier-Stokes equations, did a lot of catch up reading in the basics, a lot more in linear algebra, multi-variate calculus, e.g., exterior algebra, and more. Kept busy. Had a great time. Also got into computing in a fairly big way. Got some nice items, e.g., two new cars, etc.

My favorite book on my bookshelf, including for applied math, is J. Neveu, Mathematical Foundations of the Calculus of Probability.

Worked in industry and saw some problems in combinatorial optimization, deterministic optimal control, and stochastic optimal control, identified a problem in stochastic optimal control and found an intuitive solution, applied to grad school in applied math. Got into Cornell, Brown, Princeton, and more.

Independently in my first summer did the research for my dissertation in stochastic optimal control. Had lots of delays having to do with my wife and, then, our budgeting. In a rush, wrote some corresponding software in two months, much of it over Xmas at wife's family farm, and wrote and typed in the final dissertation in six weeks, stood for orals, and got my Ph.D.

During Ph.D., did work in military systems analysis, some optimization, statistics, and Monte Carlo -- wrote the corresponding software.

The day my wife got her Ph.D. she was in a clinical depression from the stress. To help her get better, I took a job I didn't want as a B-school prof in applied math (also played a leadership role in campus computing and did some consulting) but was near her home family farm that I hoped would help her. It didn't. I took a job in AI at IBM's Watson lab and did some optimization, mathematical statistics, and AI. My wife never recovered from her illness and died.

Then I became an entrepreneur.

I did some interesting work in two cases of optimization; thus I found good solutions to the customers' problems that they believed could not be solved; that I solved the problems scared them off. One solution turned out to be just linear programming on networks -- I was coding up the W. Cunningham variation when the customer ran away. The other problem was just some Lagrangian relaxation; I got a feasible solution within 0.025% of optimality in 500 primal-dual iterations in 900 seconds on a slow PC to a problem in 0-1 integer linear programming with 40,000 constraints and 600,000 variables -- scared the pants off the two top people in the customer's company. They had tried simulated annealing, failed, and concluded that no one could solve their problem; that I found a good solution, both the math and the software, scared them off.

I looked into lots of stuff that didn't work out.

Lesson: US national security, especially around DC, was, maybe still is, really eager for a lot in applied math -- optimization, stochastic processes, etc. In wildly strong contrast, I've seen no interest in business at all comparable, not even in Silicon Valley. The US DoD is often quite good at exploiting applied math; in comparison, business, in a word, sucks. The flip side of that suckage is, in some cases, an opportunity.

Lesson: Business is still organized like a Henry Ford factory where the supervisor knows more and the subordinates are there to add routine muscle to the thinking of the supervisor. Sooooo, US business just HATES anyone who knows more than any of the supervisors about anything relevant to the business, and one can about count all the good cases of applied mathematics in business without taking shoes off.

Business CAN make good use of specialized expertise and does with lawyers, licensed engineers, and medical doctors. Each of these, however, is usually outside the usual organization chart pecking order, is often from an outside service, in a research division, in a staff slot off the C-suite, etc. Each of these has a profession that is crucial; applied math doesn't. Bummer.

In business, an applied mathematician who shows the company how to save 15% of the operating costs is a lose-lose to the C-suite: If the project flops, then it was a waste, and anyone in the C-suite who signed off on the budget has a black mark. If the project is successful, everyone in the C-suite feels that their job is at risk from the guy who did the good project. So, the C-suite sees any such project as a lose-lose situation.

Nearly no one in US business got promoted for doing an applied math project successfully or got fired for not trying an applied math project.

So, sure, to make money with applied math, go into business, your own business, as your own CEO, and own the business.

Now some of the opportunities are closely related to the Internet -- take in data, manipulate the data with some applied math, maybe somewhat original and novel, spit out valuable results. Then monetize the results whatever way looks best. Use the math as a crucial, core, powerful, technological advantage, secret sauce. Don't expect the customers/users to see anything about the math -- just get them results they will like a lot. Do the other usual things when can -- viral growth, network effects, lock in, good publicity, own data, etc.


Part II

My software now is 100,000 lines of typing with about 25,000 programming language statements and the rest voluminous comments. About 80,000 of the 100,000 are for on-line, and the rest are for off-line, occasional batch runs for some of the data manipulations. There is some light usage of SQL Server.

I am basing on Windows and the .NET Framework. For the Web site, that is Microsoft's IIS (Internet Information Server -- handles the TCP/IP Web site communications and much more leaving a nice environment for my code for the actual Web pages) and ASP.NET for the Web page controls (single line text boxes, links, check boxes, radio buttons, etc.).

My Web pages and my code for the pages is just dirt simple -- Microsoft writes a little JavaScript for me, and I have yet to write a single line of it. There's no use of Ajax, no pull downs, pop ups, roll overs, overlays, icons, etc. The Web site is also dirt simple, a seven year old who knows no English and has only a cheap smart phone dirt simple.

I wrote a little C code; am using some open source C code, and otherwise wrote all the code in Visual Basic .NET -- seems fine to me. The important stuff is the math I derived; given the math, the code is routine, and Visual Basic .NET is well up to the work. Since I don't like C syntax, I don't like the syntax of C#. Maybe someday I will convert to C#, but in an important sense Visual Basic .NET to C# is just converting to a different flavor of syntactic sugar -- indeed, IIRC there is a translator.

So, I'm an entrepreneur working to sell the results of some math I derived.

So, to do such a thing, think of a problem and a solution, write the code, sell the results. Of course, problem selection is a biggie. You want a problem you can solve, with your own applied math, better than the solutions otherwise available; you want the software not to be too much to write; you want the computing resources within what is reasonable now or soon (possibly considering the cloud); and you want the results to be a must have instead of just a nice to have for enough users/customers, times money per each, to make some big bucks.

If you are a solo founder, then you get to avoid a common cause of failure -- founder disputes. As a founder, you SHOULD understand all the work, so if you are a solo founder you will!

Don't expect any equity funders to write you a check until you have revenue significant and growing rapidly. Thus, as a solo founder with revenue significant and growing rapidly you won't accept their check. No one in equity funding has yet seen even 10 cents from applied math research; you won't get back even laughs. Ph.D. academics is really good at work that is "new, correct, and significant". Business just HATES anything really new or significant and has no idea how to check "correct". E.g., the information technology VCs keep looking for simplistic, empirical patterns and have no idea how to evaluate anything new. Really, their looking for such patterns is likely also just a publicity scam; instead, they want to invest money in a business where accountants working for their limited partners, who, if that is possible, know even less about math, can say that they made a good investment. In an analogy, they want to buy a ticket on a plane that has already reached nice altitude and is climbing quickly. Maybe the startup will take their money if there are five founders, all exhausted, all with all credit cards maxed out, and each with a pregnant wife.

For a good applied mathematician -- with some original, powerful, valuable work, good at software, with a business with significant revenue growing rapidly -- to report to a BoD of business people, essentially anyone else in business, is a bummer. E.g., at an early BoD meeting you will outline an applied math project for some nice progress in the business, and about the time you get to sufficient statistics, an ergodic assumption, completeness of Hilbert space, the polar decomposition, something in NP-complete, or a martingale, the board members will soil their clothes, leave a smelly trail to the rest room, and then run screaming from the building. They will meet off-site, fire you, put the business up for sale for the cash in the bank, and be glad you are gone. Bummer.

So, go into business for yourself. Or, don't expect anyone in business, who no doubt knows next to nothing about math, doesn't even remember sin' = cos, to create a job suitable for your talents, training, and business value in applied math.

Heck, at one time in business, I saved a major company just by formulating and solving

y'(t) = k y(t) (b - y(t))

The BoD was thrilled, but I scared the socks off the C-suite.
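
For the record, that's just the logistic equation. A quick sketch of its closed form (assuming y(0) = y_0 with 0 < y_0 < b), in the TeX notation I'd type it in:

    % closed form of  y'(t) = k y(t) (b - y(t)),  y(0) = y_0
    $$ y(t) = { b \over 1 + {b - y_0 \over y_0}\, e^{-kbt} } $$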

That was the third time. The first time I wrote some software that pleased the BoD and saved the company. The second time I beat everyone in the C-suite at Nim -- I'd read the algorithm in Courant and Robbins.

Scared the socks off the C-suite.

Applied math in business is a wide open field -- nearly no one there now. So, you will be alone. You can trust the solid math you know, the solid, new proofs you write, and a lot in software, but no one will do anything but laugh until you have the big bucks in the bank; then, since you did something valuable they don't understand and know they could not have done, they will fear you and hate you; they will all agree and may gang up on you; they may attack you. The laughing is not nearly new: Just read the Mother Goose "The Little Red Hen"; that's still the case.

There's a lot of good, foundational applied math code out there for optimization, statistics, etc. you might be able to exploit. In computing, operating systems, .NET etc., SQL etc. are astounding and from free to usually quite cheap.

Nearly no one in business can identify, formulate, and solve even a problem that is basically just linear programming -- the competence in applied math in US business is, well, they forgot plane geometry. To expect them to derive some simple Lagrangian relaxation is asking for hen's teeth.

As an applied mathematician in business, you will be essentially alone out there, in the nearly empty intersection of math and business. If you are successful, then you will necessarily also be exceptional, and necessarily nearly everyone who is exceptional is alone.

My first server is an AMD FX-8350, 64 bit addressing, 8 cores, 4.0 GHz processor clock, 32 GB ECC main memory, etc. That's a lot of computing power for the money. Fill that up doing something valuable, buy 20 more, fill those up, and sell out for $1 billion or so. It's a heck of an opportunity.


Thank you so very much, graycat. Incredibly helpful, and I appreciate the time you put into the discussion. I feel as if I need to read your post 3-4 times to absorb it all.

I am working in numerical methods for PDE so some of this was far afield but it does make sense that those are the areas ripe for opportunity.

Seriously, appreciate it.


> They use it because certain journals require it, or their advisor makes them use it.

I think that's one of the jobs of an advisor: to encourage his students to strive to be better and do better.


Vertical card layouts look cool, but they aren't very practical for people with longer names or surnames.

To me this design seems right if you're English or Chinese, but try using it with your Greek or Basque name, or with the Spanish tradition of double (maternal and paternal) surnames.


Off topic. Recently I have rebuilt my CV in LaTeX and I want to share it with you here:

https://github.com/letientai299/cv


Looks nice but:

Hex colors don't map to CMYK at all. Your printer may be able to use a non-CMYK color using a special ink but that may cost more.

There are no crop marks or bleed. You'll want to add them.
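
For crop marks, something like this is the usual LaTeX route (the sizes here are assumed, not taken from the repo). For bleed, the standard trick is to let any background colour run a few millimetres past the trim box so an imperfect cut never shows white:

    \documentclass{article}
    \usepackage[paperwidth=85mm, paperheight=55mm, margin=0mm]{geometry}
    \usepackage[cam,center,a4]{crop}   % camera-style trim marks, imposed on an A4 sheet
    \pagestyle{empty}
    \begin{document}
    \noindent Card front goes here.
    \end{document}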


Those particular RGB colors map well into the CMYK gamut and will be readily printed by any competent printer using a 4-color process.

https://support.dma.ucla.edu/help/tutorials/print_color_guid... has more information on the two color spaces' overlap.

The front side design is very tolerant of inherent printing tolerances, but the backside design is poor (in that it will highlight, rather than hide, the inherent tolerances in cutting). The human eye can readily see the difference in thickness of the white border. A far more tolerant design would take the green fill all the way to the edge. It's not the end of the world to have the backside of your business card have slightly uneven margins, but if you're equally happy with a full fill (and you bleed it beyond the cut lines), you'll never end up with a visibly bad cut.


Wow! Thanks for that link. It matches with my experience over the years. I think choosing in-gamut colors to start is a good practice though. It saves frustration later on.

Otherwise agreed on all points. Nice post!


NP. I do have a fair amount of relevant experience in the space: almost 14 years working for what is almost surely the highest volume seller of business cards online. ;)


What a horrific idea to have an accidental horizontal swipe on a mobile device make the page jump to the top!


Horizontal swiping on my iPhone does what it usually does.


For me it even jumped to the top if I wasn't extremely careful and deliberate when scrolling vertically. I never even made it to the end of the description of the front side before giving up in desperation.


I've done all my business cards in TeX, just Plain TeX, not LaTeX, for years.

It's just a simple TeX exercise.

I have the spacing so that the printing goes to some standard card stock for business cards with the positioning just right. Then bending the printed card stock results in a stack of business cards. Works great. Simple.
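
Roughly like this (a sketch with assumed dimensions for common 10-up perforated stock -- two columns of five 3.5 in by 2 in cards, 0.75 in side and 0.5 in top margins -- and placeholder text, not my exact macros):

    \hoffset=-0.25in  \voffset=-0.5in   % TeX's origin sits 1 in from each edge
    \vsize=10in \nopagenumbers \offinterlineskip
    \def\card{\vbox to 2in{\hsize=3.5in \vfil
      \centerline{\bf A. N. Author}
      \centerline{Applied Mathematician}
      \centerline{author@example.com}
      \vfil}}
    \def\cardrow{\hbox{\card\card}}
    \cardrow\cardrow\cardrow\cardrow\cardrow
    \bye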


So your business cards have fuzzy edges from the perforations?

That just screams amateur.


I would love to use this to design my lab badges as well :)



I have a problem trying to generate the document with: xelatex src/front.tex

It is generated, but without icons. Looking at front.log, I think it's trying to typeset the icons using Fira Sans:

  Missing character: There is no  in font Fira Sans/OT:script=latn;language=DFLT;mapping=tex-text;!
(That line appears 9 times, and there are 9 icons.)

I've tried everything, but I've never used LaTeX before, so it's really complicated for me. Any comment will be very welcome.

Thanks.


Once I wrote a ten-line TeX program to add serial numbers to a large stack of Scantron forms. It worked well enough, but that's when I learned that laser printers are much less precise than I thought when it comes to positioning relative to the edges of the paper.


Let's put it this way: Patrick Bateman would not be impressed by this design. Otherwise, a commendable effort!


I made mine using LaTeX for VL/HCC 2016. It's on http://web.onetel.com/~hibou/blog/VLHCC2016.html

I repeated it enough times to fill a page, printed it, and cut the cards.


Nice environment! But... why use justified text on a webpage?


Thanks.

I think the pages look better when justified. Also, the width is about the same as a typical book, so you don't have to move your eyes much when reading it.


So, since this is about LaTeX, it feels appropriate to point this out. Justified text works well when you have automatic word hyphenation turned on and your layout is fixed. The web doesn't do word hyphenation (I think?), so the words end up with odd spacing between them; it sort of stands out right away. Justification works in LaTeX because of hyphenation and line breaking, but even then one has to manually fix underfull or overfull lines (i.e., by slightly changing word usage). For web content, where you have less control over these things (as opposed to printing a fixed layout via PDF), ragged right often works better.
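
On the LaTeX side, the usual knobs for those fixes are roughly these preamble lines (a sketch, not a complete recipe):

    \usepackage{microtype}      % protrusion and font expansion; fixes a lot on its own
    \emergencystretch=1em       % allow a little extra inter-word stretch before going overfull
    \hyphenation{type-set-ter}  % teach hyphenation points for words TeX gets wrong
    % and, as a last resort for a single stubborn paragraph:
    % {\sloppy ... paragraph text ... \par}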


So when there are large gaps between the words on a line, I should experiment with manual hyphenation until it looks better?


CSS3 has automatic hyphenation (hyphens: auto), which you can tweak with manual soft hyphens (&shy;), and there are client-side hyphenators (Hyphenator.js, Hypher) available as fallbacks.


This is nice to know! Still, unless the formatting is fixed (by distilling to PDF), there is no way to deal with the underfull and overfull lines that arise from an aggressive (and hence decent-looking) typesetter.


My favorite business card belongs to Kevin Mitnick.[1]

Can LaTeX do that?

[1] - http://i.imgur.com/NxCZ32J.jpg


Out of curiosity, what was the most beautiful thing you've seen designed/made in LaTeX?



I have an iPhone. How do I scan a QR code? There's no built-in reader that I know of.


Challenge - do the card in Asymptote


People still use business cards?


I bet Knuth wouldn't be happy to hear that LaTeX was used to design business cards.


First, Knuth did TeX, Plain TeX, and was not the primary force behind LaTeX.

Second, IIRC, Knuth did proudly use TeX to print a dinner menu, invitations, or some such.


>Knuth did proudly use TeX to print a dinner menu

That's hilarious. Any sources on that?

That man is incredible.


I was typing quickly from memory. With The TeXBook, on page 411 he has a concert program; on page 248 he has a genealogy chart; on page 247 he has some nicely formatted stock tables; on pages 233, 236-7 he has some recipe stuff from Julia Child.

So, instead of a menu, I found a concert program and some recipes. Maybe there is a menu in there someplace.

He also has an addressed envelope where he used his Metafont in some tricky way to get something like a stamp. He might also be able to get from TeX with Metafont those little boxes of geometric gibberish, whatever they're called, used as identifiers.

I did some TeX macros for foils -- I get a nice box around each foil. Some more macros I wrote let me do a simple drawing in, say, old Microsoft PhotoDraw, include it as a picture in a TeX document, and then position TeX math as annotation overlays on the drawing. So I can, say, draw a right triangle and put the standard Pythagorean theorem annotation on the figure. I can put darned nearly any TeX expression as an annotation. TeX is nice, and for math it's nearly essential.
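
The same effect in LaTeX terms looks roughly like this (the file name and coordinates are made up; my own macros are Plain TeX and differ in detail):

    \documentclass{article}
    \usepackage{graphicx}
    \setlength{\unitlength}{1cm}
    \begin{document}
    \noindent
    \begin{picture}(0,0)                   % zero-size box anchored at the figure's lower left
      \put(2.1,3.4){$c^2 = a^2 + b^2$}     % coordinates found by trial and error
    \end{picture}%
    \includegraphics[width=6cm]{right-triangle.png}
    \end{document}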


>little boxes of geometric gibberish used as identifiers

in which context?


I struggled to understand that as well. Couldn't tell if it was about USPS barcodes (1) or QR codes (or other 2D barcodes) or something else?

[1] - https://www.neodynamic.com/Products/Help/BarcodeCF2.0/barcod...


I meant QR codes! I don't really know what they are!



