The Art of Computer Typography (37signals.com)
169 points by wlll on May 25, 2012 | 25 comments



This blog post is pretty light. If you really care about Knuth's work on METAFONT then you should just read the source code. It's written using WEB: http://www.tex.ac.uk/ctan/systems/knuth/dist/mf/mf.web

WEB is a language created by Knuth that supports his notion of literate programming.
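
For the curious, the split between documentation and code is mechanical. A standard TeX distribution ships both extraction tools (a minimal sketch, assuming a web2c-based installation with the source file at hand):

    tangle mf.web    # extracts the compilable Pascal program (mf.p)
    weave mf.web     # extracts the typesettable documentation (mf.tex)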


Re literate programming: a scientist friend of mine likes to build his articles so that the graphs in a TeX document are generated from an R script. That makes it easy to change some variables or try different input. Sweave is another solution: http://www.statistik.lmu.de/~leisch/Sweave/
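
For anyone who wants to try this, the round trip is short (a sketch; report.Rnw is a hypothetical Sweave file mixing LaTeX text with R code chunks):

    R CMD Sweave report.Rnw    # runs the R chunks, writes report.tex plus the plot files
    pdflatex report.tex        # typesets the finished article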


Oh my god this is so awesome. The version numbers are increasing asymptotically to e (2.718281...).


TeX's version numbers increment toward pi (3.14159...), and the version will become exactly pi when Knuth achieves equilibrium. He no longer sends cheques for finding bugs...

http://www-cs-faculty.stanford.edu/~uno/news08.html

One of the originals


I'm going to start using "el_gordo" to represent int.max like Knuth does.


It will never happen, but how I wish he would rewrite it in CWEB.


I wish the whole thing were tossed out and rewritten. The algorithms TeX uses are amazing and the output is phenomenal (especially using ConTeXt), but TeX itself is an abomination and LaTeX somehow manages to be worse.


TeX's implementation is great, considering it has been essentially unchanged for the last 30 years. It's hard to name another piece of software that has survived in its original form for that long.


Which certainly doesn't mean it would be less great if it were done with less ancient tools.

The problem is layer upon layer of fossilized complexity. The LaTeX distribution on my laptop is 3.1GB, and it contains 344 files in its bin directory. I'm pretty sure the same beautiful typesetting could be achieved in a leaner way.


So you would prefer one monolithic binary over a collection of binaries, each of which does a specific task (bibtex for bibliographies, xindy for indexing, dvips for converting DVI to PS, etc.)? For comparison, the git installation on my computer contains some 160 binaries (hidden in a directory other than /usr/bin). Should git be rewritten in a leaner way as well?
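
For context, the classic build really is a chain of those single-purpose tools (a sketch of the standard multi-pass workflow; thesis is just a placeholder name):

    latex thesis.tex      # first pass: records \cite keys in thesis.aux
    bibtex thesis         # formats the citations from the .bib database
    latex thesis.tex      # second pass: pulls in the bibliography
    latex thesis.tex      # third pass: settles cross-references
    dvips thesis.dvi -o thesis.ps    # converts the DVI output to PostScript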

Most of the size of a TeX distribution is due to fonts and documentation. I did not analyze the TeX Live distribution due to its size, but the ConTeXt standalone (a distribution containing just the files needed for ConTeXt) is around 230MB, of which 100MB is fonts, 45MB is binaries, 42MB is core macro files, and 26MB is third-party modules (and their documentation). You will get similar numbers if you install only the LaTeX packages that you need. Blame your distro's packagers for forcing you to install a single monolithic 3GB package.
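
If you want to check the breakdown yourself, something like this works (a sketch; the paths assume an Arch-style TeX Live tree, so adjust for your distro):

    du -sh /usr/share/texmf-dist/fonts /usr/share/texmf-dist/doc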


Most of those git binaries are links to git for tab completion in your shell.

DVI is exactly what we're talking about: a strange custom format that only exists to be transformed into formats we want. A modern TeX should produce PDF output with no DVI intermediate stage.
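
For the record, the intermediate dance being objected to looks like this (a sketch; paper.tex is a placeholder name):

    latex paper.tex               # writes paper.dvi
    dvips paper.dvi -o paper.ps   # DVI -> PostScript
    ps2pdf paper.ps               # PostScript -> PDF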

Same story with Metafont. Outside math, people want good fonts developed by typographers. Those fonts don't show up in Metafont or TFM format, they show up in OTF and TTF. Getting TeX to handle these is quite a chore, but even if you do, you're still going to have Metafont and all the strange intermediate files it creates lying around on your filesystem.

There is a reason no other software is built like TeX: it's terrible. But for its history and the fact that it does the job really well, nobody would tolerate this mess.


Please verify your facts before bashing TeX.

Firstly, tab completion does not need those binaries. Zsh does similar tab completion for git (which uses 150 binaries) and hg (which uses one binary).

Now for TeX: pdfTeX has been around since, well, the mid-90s. It has been part of the standard TeX distribution for more than 10 years. So for the last 10-15 years, TeX has been able to produce PDF directly.

Similarly, xetex has been around since 2003-04 and luatex since 2007-08. With both of them, using OTF and TTF fonts is as simple as just saying `\setmainfont[Dejavu]` (using the simplefont module in ConTeXt or fontspec package in LaTeX). No need for converting to TFM files or anything.
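
To make that concrete, a complete document using a system font is tiny (a sketch for the LaTeX/fontspec route, assuming a TeX Live install with the DejaVu fonts available; compile with xelatex or lualatex):

    \documentclass{article}
    \usepackage{fontspec}        % font loader for XeLaTeX/LuaLaTeX
    \setmainfont{DejaVu Serif}   % any installed OTF/TTF font, no TFM conversion
    \begin{document}
    Hello, OpenType.
    \end{document}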

The reason TeX is built like TeX is that when it started, PostScript/PDF did not exist and Type1/TrueType/OpenType fonts did not exist, so TeX had to create its own formats. Over the years TeX has adapted to new technology: you can easily use UTF-8 input (for text and math), TTF/OTF fonts, OpenType math fonts, write macros in Lua, and get PDF output. You are objecting to the fact that TeX works hard to be backward compatible and carries around all the legacy programs so that documents written in the 90s still compile easily today. If you don't want them, just don't install them (again, this is more of a packaging issue than anything else).
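
The Lua integration, for instance, is directly usable from a document (a minimal sketch in plain TeX; hello.tex is a placeholder name, compiled with `luatex hello.tex`):

    % Lua runs at typesetting time; tex.print feeds its output back to TeX
    \directlua{tex.print("Hello from Lua")}
    \bye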


Run md5sum on your 160 git binaries. Notice that 100 of them have the same checksum. This is because they're hardlinked to the same binary.


That still leaves around 60 unique binaries.

    find /usr/lib/git-core -type f -exec md5sum {} \; | cut -d' ' -f 1 | sort | uniq | wc -l
    68

Comparing with TeX:

    pacman -Ql texlive-bin | grep '/usr/bin' | cut -d' ' -f 2 | xargs md5sum | cut -d' ' -f 1 | sort | uniq | wc -l
    181


I have 50 left, and running `file` on them shows that 40 of them are shell script wrappers around `git`.


Disclaimer: I'm not a programmer

Could this become an open source project with a core engine exposing an API, and additional modules written in commonly available languages to add functionality?

Once the core engine is stable, it should (in theory) fly...


Of course it could. But then, why would anyone do that? You can't compete with TeX on price or correctness, and quite probably not on quality of output. You're left with ease of use and performance, and I'm not sure either actually warrants such a huge undertaking as recreating TeX.

This leaves open source typesetting a hostage of its own success.


I believe this is one of the objectives of LuaTeX. Lua is a much saner programming language than TeX.


Of which your source file is an excellent example. Run, IIRC,

    weave mf.web
to generate TeX sources for a camera-ready, exhaustively commented version of the source code that has even been published in hardcover[1].

[1] http://www.amazon.com/Computers-Typesetting-Volume-Metafont-...


Here's a little video (and transcript) of Knuth describing how hard it is to make a general system for describing type: http://www.webofstories.com/play/17114

My favorite part is when he's fighting with generalizing the 'S' and his wife suggests, "Well, why don't you just make it S-shaped?"


I hope his classroom lectures don't have that same meter; it is very distracting. I'm probably going to hell for blaspheming against Knuth.

That said, thank you for posting the link - that site looks very interesting.


For the Tolkien fans: Johan Winge's Tengwar Annatar font was created with METAFONT, mftrace (using potrace), and FontForge. The design is meant to simulate a nib pen, and METAFONT is particularly well suited to producing calligraphic letter shapes.

http://home.student.uu.se/jowi4905/fonts/annatar.html
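
The tracing step is a one-liner these days (a sketch; the --formats flag assumes a reasonably recent mftrace, and tengwar is a placeholder METAFONT font name):

    mftrace --formats=TTF tengwar    # renders tengwar.mf and traces it into tengwar.ttf via potrace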


Interesting! I use (La)TeX but I didn't know what METAFONT was about.

Even though I'm pretty good at identifying typefaces, I have forever damaged my chances of identifying the standard Mac fonts like Palatino, Times, and Helvetica in print. Rendered as bitmaps (like on a 9" Macintosh Plus), I have no problem telling them apart.


If anyone's not sure of the advantages of a text processor, like TeX or troff, over a WYSIWYG (or WYSI-all-YG, as Kernighan said) application, then http://www.schaffter.ca/mom/mom-02.html may help.


TeX is great but in serious need of a rewrite. I'm not sure it will ever happen.



