The article's title really downplays how much terrific cultural analysis there is here. This write-up has great depth, not only on typography but also on architecture, art styles, film, and music! Many links and reference images, too.
The title of the submission makes no sense either, and doesn't match the source. "Typeset" is a verb, and is used here as a pun on "set in the future". It doesn't mean "typeface" or "typography".
I'm afraid this is probably another word that will get accepted with new uses. I didn't actually know it was only a verb and the title made perfect sense in my head. I imagine that's true for most people who read it.
For non-native English speakers it means "the font used in Wall-E".
I was quite surprised by the first comment because I did not expect a font to be so emotional (especially since I do not remember any particular fonts from the movie).
The book (<https://typesetinthefuture.com/2018/12/11/book/>) is a fun read, too! I think it was David Plotz who said "read more books with pictures." It has interviews, looks gorgeous, and does a good job of showing the whole world you can miss by just watching a movie once (or even twice) without really seeing everything. It also gives a good sense of how movies first create the standards for "future" and then proceed to subvert or reference those standards.
The most interesting thing for me was the Iconian Fonts website. It's one guy, a "commercial transaction attorney for a global software and service company", who makes fonts as a hobby.
On his commercial use page, he just asks for a $20 donation if you use a font commercially. I wonder if he realised that his fonts would be used in billion-dollar movie franchises.
Would this be more than 100 seats? $2000 is still not a lot. Would you theoretically need a license for every employee involved in the movie, or only for those who would actually work with the font?
Fun fact: long credits at the end of the movie were invented by George Lucas for American Graffiti in 1973 [1]. He didn't have the money to pay everyone so he offered to put their names in the credits instead.
And thus started a new chapter in the exploitation of film crews, where you don't get paid enough but hey, at least your name is in the credits. All the other producers were immediately like -- that's a genius idea to pay the crew less! So now all movies (and even TV shows) are full of hundreds and often even thousands of names in the credits.
I still remember my first credit in a blockbuster production, after my first few years in TV advertising, where nobody gets named, and it was exhilarating. My name is now forever embedded in the artwork we all worked towards. I was also paid, but with that money now long gone, I just want to highlight that there is value not just in money.
That's amazing. Watching old (1930s-1950s) movies that credit the lead cast at the start and just end with a "The End" card, I always wondered if the end credits had simply been cut off, but I guess they never existed!
I'm just glad they stopped putting so many opening credits in films. It's basically insufferable to watch old movies with 10+ minutes of opening credits. I'm annoyed by the 3-5 minutes of production credits at the start of movies today as it is.
Wow, it never even registered to me until I read this comment that the Star Wars movies didn't have opening credits. They're usually so forgettable anyway, and after reading that article, it seems so silly and ridiculous that all the various Guilds and Associations and Hollywood Political Units got so butthurt over that decision.
Lord of the Rings also famously does not have opening credits. It just runs right into the film, with the expectation that you might have just finished the previous film and are ready to start the next.
The FUN part is, if you just happen to have the same name as one of the crew, you can get a credit without being involved at all!
Funny story: I usually sit through the full end credits when I go to the movies, just in case something interesting happens afterward. So I'm doing the usual thing at the end of Sonic the Hedgehog, and suddenly, there's MY REAL NAME jumping RIGHT out at me! It wasn't even on its own line, just in with a bunch of other folks working in the same group (I presume), but there it was, and I SPOTTED it!
Maybe it's just me, or maybe we just tend to take notice when our own names pop up, even in a crowd.
I'm glad to see people recognized for their work, even in such a small way. Though as the credits continue to scroll and you start to see titles like "2nd assistant to the HR Team Lead", I can't help but wonder how much is bloat and how much better the games might be if the teams were leaner.
I'm also torn on the concept of "production babies", which is basically just acknowledging that some parent was forced to abandon their family and newborn child for weeks or months of crunch because of bullshit arbitrary release schedules.
I assume that a large number of people in video game and movie credits just did part-time work or a short project and weren't working exclusively on this particular project.
Like when you list the accountants, is it that those accountants were working ONLY on this project or was this one of a dozen things they were handling at the time?
When we list a musician in the credits, is it a musician who was ONLY working on this project or was this one of a dozen things they were handling at the time?
I'm always amazed how many games in the early years have no credits at all. And it would bum me out, because I wanted to know: who coded this? Whose music is it? Did the coder do the graphics too?
It was only by working in the industry that I got answers to most of these, from seeing hundreds of resumes and demo reels. (How else would I have found out who was behind Virtua Hamster?)
Someone can correct me if I'm wrong, but I believe these days the various film-industry unions essentially require individual credits for anybody who even tangentially had anything to do with working on the film or on the set of a film. E.g., I'm pretty sure I've seen the names of catering staff in movie credits.
It's all contracts. My father is named because the effects company he worked for had a contract for credit, and the guilds similarly enforce their contracts on producers. It used to be common for many, many people directly involved in production not to get credit; you may be underestimating the sheer number of people involved in making a major film.
Industry people joke that the caterers are very important and that being listed above them is a privilege, but there's a wry hint of truth to it, because an army marches on its stomach.
Those aren’t syllable divisions, they’re hyphenation points!
From the footnote on page 219 of Word by Word by Kory Stamper (formerly a lexicographer at Merriam-Webster):
> Here is the one thing that our pronunciation editor wishes everyone knew: those dots in the headwords, like at “co·per·nic·i·um,” are not marking syllable breaks, as is evident by comparing the placement of the dots with the placement of the hyphens in the pronunciation. Those dots are called “end-of-line division dots,” and they exist solely to tell beleaguered proof-readers where, if they have to split a word between lines, they can drop a hyphen.
U+00B7 MIDDLE DOT = midpoint (in typography); Georgian comma; Greek middle dot (ano teleia) • also used as a raised decimal point or to denote multiplication; for multiplication 22C5 is preferred
But note there is a separate Unicode scalar value for the dot operator:
U+22C5 DOT OPERATOR • preferred to 00B7 for denotation of multiplication
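If you want to see the distinction for yourself, here's a quick check (my own sketch, not from the article) using Python's unicodedata module, which reads character names straight from the Unicode Character Database:

    import unicodedata

    # Print the official Unicode name of each of the two dots discussed above.
    for ch in ("\u00B7", "\u22C5"):
        print(f"U+{ord(ch):04X}  {unicodedata.name(ch)}")

    # Output:
    # U+00B7  MIDDLE DOT
    # U+22C5  DOT OPERATOR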
Curiously, the interpunct in the title of this article (Wall·E) can be found with a Ctrl-F search for "walle", but all of your usages cannot (try "sometimes" and "clarify").
While he’s doing so, there’s a dramatic…
COMPUTER MALFUNCTION
…in the hibernation pods, which causes the life support machines of the hibernating science crew to start reporting…
LIFE FUNCTIONS CRITICAL
…and end up with their…
LIFE FUNCTIONS TERMINATED
…which isn’t what they wanted AT ALL.
Still: if it is going to happen, it may as well happen in what is probably Univers 67 Bold Condensed.
In my opinion, the dystopian feel is conveyed more strongly the more you hold up the present in contrast with the old. Take Blade Runner: the presence of mechanical animals, the holograms of greenery, and the once-beautiful buildings (like the Bradbury) held against industrially produced, soulless items highlight the decay and despair.
"A Wuppertal Schwebebahn monorail train arrives at the Werther Brücke station in Wuppertal, 1913."
Somewhat off topic, but stuff like this always leads me back to this reconstructed video from 1902. It feels like you're almost there. I cannot fathom that all these people walking around peacefully are now dead, and that they never heard of a silicon circuit, let alone the internet or an iPad. Worth watching once or twice a year; it always makes me appreciate my life just a few inches more...
For sure! But just because you don’t “see” it doesn’t mean it doesn’t have a huge impact on the feel of the movie, your impression of the characters, and on the storytelling, which is ultimately why these “invisible” decisions were made :)
I'm not sure about the poster though. This is not necessarily communist, as this was just the style of propaganda posters of all kinds that emerged in the first half of the 20th century.
> The implication of this design choice—that communist values are the solution to decades of rampant consumerism
Completely absurd. The association is not with communism, but with propaganda posters. It's true that Americans are commonly conditioned to associate propaganda posters with communism, but the assertion above is an unreasonable leap.
aside: WALL-E's story is strangely analogous to the mystic journey of a seer. This includes the precise steps of (a) isolation, (b) love, (c) rejection, (d) the hero's voyage, (e) androgyny (cf. WALL-E in the spaceship), (f) the spiritual fight, (g) the 'kiss', (h) the 'return', and (j) union and the establishment of a new earthly life.
The title of the submission should be changed to “Type in Wall·E”, “Typesetting in Wall·E”, or “Typography in Wall·E”; the word “typeset” is a verb or past participle, not a noun.
It's funny how big of a difference perspective makes. I think this is a neat portrayal of how differently designers and engineers reason about design topics.
I was surprised to see an article about type that didn't involve code editors so heavily upvoted on HN, but as soon as I read the first few paragraphs, I realized why: it was clearly written by an engineer who has learned a lot about design, not a designer. There's nothing wrong with that! It's a cool and very well-researched design-history deep dive that explores the network of references and roots of the type used, how it was used as a storytelling element, and that sort of thing.
If this had been written by a type designer, they'd have discussed very different things: why the letterform shapes hit like they do, what design problems they solve, the conceptual and emotional references these shapes make rather than which concrete symbols they relate to, the general rounded-square shapes, their negative space, how the lack of stroke contrast makes them hit differently than similar, less uniform characters, kerning concerns, etc.
The person who wrote this article knows a lot more about how this type is used from a modern art perspective, but there's a difference between knowing art history and being able to wield the underlying principles to work with these things as an artist or designer. I think a good example of this is film fans who spend a lot of time on TV Tropes and the like, which makes them rather like informal film critics. That knowledge is a base requirement for great critical analysis, but if I needed to hire someone to create a film, I'd favor an undergrad film student who was in diapers when the film fan started digging into TV Tropes. Why? It's just a fundamentally different way of reasoning about the same thing. The fan is more concerned with the "whats" and "whens", and the student is more concerned with the "hows", and that gives each of them a totally different perspective on the "whys".

I think creators need to watch out for this. Especially in the nerdier genres, if they're so focused on the film itself that they disregard factors like context and overall story continuity, watching that film is a meaningfully worse experience for fans, because fans are often more invested in the universe/characters/etc. than in making any given story arc or character pop in a given movie. On the other hand, if we hand too much control to the people primarily interested in the context and story continuity at the expense of any individual film's story and artistic value... well... have you seen the Star Wars prequels?
Back to the article: I could see someone without any education in type design reading this article and getting the impression that they understand the typographical elements in this film. They'd understand how the typography was used, but that's a lot different from being able to reason about these things the way designers do. A more concrete example would be an in-depth article about the cars in the Fast and Furious movies, complete with the cultural references of each modification and the purposes they serve. It would be cool and informative, but it wouldn't bring the reader any closer to being an auto designer.
A lot of developers, in particular, get annoyed when I push back against their misconceptions about design. It's not an insult; it's just not their area of expertise. Usually, they don't know enough about it to realize how little they know, like anybody else with any deep topic they don't know. I've heard designers who have cargo-culted tutorial code into some WordPress plugin spew absolute nonsense about everything from data structures to network architecture with the confidence of someone who just got accepted to a prestigious CS doctoral program. That said, it's easier for non-technical people to see that they don't understand software development, because it's easy to see they don't understand the terse error messages, stack traces, code syntax, terminology, etc. It's more difficult in the other direction.

Visual design, broadly, is visual communication; for a design to be good, at a bare minimum, it must present cohesive messages or ideas to its intended audience. Many things that look the simplest while still solving all of their goals were the most difficult to make. You can tell when non-designers copy them, because the result might look simple, but it probably doesn't effectively communicate everything it needs to, and that's every bit as true for UI design as it is for branding, identity, and poster design. The complexity in that process is only apparent if you've tried to solve difficult, specific communication problems with a bunch of real-world constraints, grappled with the semiotics, tried to make it stand out, etc. etc. etc., and then had it torn apart by people who've done it a lot longer than you. I can see why someone who doesn't understand what's happening under the hood thinks a designer's main job is making things attractive, like an amateur interior decorator. In reality, that's not even always a requirement; it's often just a natural result of properly communicating your message. What we do is more akin to interior architecture: the functionality comes first.

So the next time you see someone suggesting something like allowing custom color themes in your app to "improve UX," maybe consider consulting an experienced UI or UX designer to see what they think. If their suggestions revolve around making it prettier or hiding everything behind menus because functionality is ugly, I conceptually owe you a beer.
As an aside, Microsoft hired Matthew Carter to design a high-quality serif typeface that was easy to render on under-powered PCs. When you're designing type, having a standard set of words you can use to compare revisions is important; for this task, they made up the tabloid headline "Alien Heads Found in Georgia," and that's where the typeface "Georgia" gets its name!
(When I was in design school, I heard that it was "Pickup Truck of Severed Heads Found in Georgia" but I haven't found a single citation that supports it. I imagine the person that told me was mistaken.)
"When art critics get together they talk about style, trend and meaning. When painters get together, they talk about where you can get the best turpentine."
Fun fact, Windows 8 changed the decimal separator for the South African locale from a period to a comma.
My theory is that some academic or idiot government official told Microsoft they weren't using the official separator, and Microsoft duly "fixed" it. But in practice every "normal" person in the country used a period as a separator.
By default, Excel now uses a comma as the decimal separator. Which, unless I change it, makes it especially fun when I want to paste values into my banking website, which (like most of the country) uses a period as the separator.
Really, it would have been way more pragmatic if South Africa just changed its official decimal separator.
It also caused some annoying issues on our .NET and SQL Server software project. For example, SQL seed scripts inserting decimal values would break depending on whether they were run on Windows 7 or 8. On the upside, it did teach us all to make our code properly locale aware.
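As a hedged illustration of the same class of bug (my own sketch in Python, not code from that project; it assumes a comma-decimal en_ZA locale is installed, and the exact locale name varies by OS):

    import locale

    raw = "1234,56"  # an amount formatted with a comma as the decimal separator

    # Naive parsing assumes a period and blows up:
    try:
        float(raw)
    except ValueError as err:
        print("naive parse failed:", err)

    # Locale-aware parsing works once the matching locale is selected.
    # "en_ZA.UTF-8" is an assumption here; the available name differs per system.
    locale.setlocale(locale.LC_NUMERIC, "en_ZA.UTF-8")
    print(locale.atof(raw))  # 1234.56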
> File extensions would never go through i18n/l10n.
Yeah they did. Notably, Microsoft did it.
Source: me. I spent 3 hours at the Venezuelan Embassy in London trying to get Microsoft Works for DOS to accept a new printer driver. I had the driver disk (in UK English) and I had Works and DOS (in LatAm Spanish.)
(My Spanish boss gave me the job, porque yo hablo un poquito de español, i.e. because I speak a little Spanish.)
MS Works wouldn't register the driver, whether installed, or manually decompressed and copied.
Eventually I worked it out. English-language Works called printer drivers `.PRD`, for "Printer Description" or something like that. Spanish-language Works called them `.DIM` for "Descripción de Impresora" or words to that effect.
I renamed `OKIDAT24.PRD` to `OKIDAT24.DIM`, and Spanish Works immediately saw it; the printer could be selected in Preferences and worked perfectly.
Yes, filenames and even file extensions get translated sometimes.
Note for those too young to have used MS/PC/DR DOS: it did not have printer driver support, at all. It sent plain text to the PRN: or LPT1: port device, and nothing but plain text.
(OK, or LPT2: or LPT3: -- but I don't think I ever saw a machine with multiple selectable printers. It was cheaper and easier to buy a physical printer switch box and turn the dial than fit an extra ISA card with a Centronics port, and then configure it to have its own IRQ line.)
Apps did that for themselves. So, each DOS app had to have its own dedicated printer drivers.
Stood out to me too, as I was sure that was not true. I'm a native Brit, I'm on the wrong side of mid-40s, I often read historical literature that uses pre-decimal currency (and notation), and I have never seen an interpunct used at all, or heard it referenced in terms of British currency, until today.
Unfortunately, I'm the sort of pedant who, on seeing somebody state an incorrect fact with such certainty, doubts the veracity of the rest of what they have to say. I wonder where the author got this idea from?
Born in the late 70s, so also mid-40s: I worked retail in the 90s, and older pricing guns often used a raised · rather than one aligned with the baseline, so when I read it I didn't doubt that it might have been common, or even official, in the past. Such guns would sometimes have “old style” alignment with the baseline for the numbers¹ too.
The “still in use today” part is quite definitely wrong though.
Even in North America, old-school cash registers printed their receipts with the decimal point as a raised dot, in that purple ditto-ribbon colour. I also remember ticker-tape calculators being the same.
To be fair, traditionally 95% of digital content has zero typographical finesse and simply uses whatever is available on the local standard keyboard layout. Even the use of em and en dashes is a big deal.
The decimal currency in the UK is essentially the same age as me - and I have never heard this claim that an interpunct is somehow more official. So the claim stood out to me as being outlandish; not impossible, just super unlikely.
To show I have no ill will toward outlandish Britishisms, this one applied in Parliament until relatively recently...
"To increase their appearance during debates and to be seen more easily, a Member wishing to
raise a point of order during a division was, until 1998, required to speak with his hat on.
Collapsible top hats were kept for the purpose."
Yeah, I don't doubt you, just meant that web copy is usually so typographically impoverished compared to print that it's not much evidence in itself.
I think the idea is that in the financial world of 2100, when printed on currency, 10⁶ should be read as shorthand for 10E6, not 1E6.
Thus, 99⁶ should be interpreted as 99E6, hence 99 million, as the author says.
We know already that SI isn't universally followed. As a rough comparison: if a food item contains 160 calories, we know that's 160 kilocalories, and the calorie isn't an SI unit. Or, 1 GB of RAM is often 1024^3 bytes, with relatively few people using GiB.
If you look closely, kcal is always written Calorie, not calorie. It's interesting that even in otherwise metric-based countries, the joule is completely foreign. It must be what most Americans experience when they see Celsius.
RAM is always GiB - it’s how it works. I’d love to see an example of it being mangled though
Back about 20 years ago, I actually did see some SFF PCs listing their (chipset-fixed) RAM in powers of 10. It was strange and disconcerting to see numbers like '1.04 GB' (1 GiB minus 32 MB of video RAM).
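That '1.04 GB' figure checks out, assuming the 32 MB of video RAM is really 32 MiB (my assumption, not stated above); a quick arithmetic check in Python:

    GIB = 2**30  # one gibibyte in bytes
    MIB = 2**20  # one mebibyte in bytes

    usable = GIB - 32 * MIB  # 1 GiB minus 32 MiB of shared video RAM
    print(usable / 10**9)    # ~1.04, i.e. "1.04 GB" when listed in decimal gigabytes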
I don't know where you are talking about; I've now looked at the four closest food labels I could find and all of them use kcal/kJ. Are you in Kenya or something?
> Some authors recommend the spelling Calorie and the symbol Cal (both with a capital C) if the large calorie is meant, to avoid confusion;[8] however, this convention is often ignored.
> 2,000 calories a day is used as a general guide for nutrition advice, but your calorie needs may be higher or lower depending on your age, sex, height, weight, and physical activity level.
As for 'RAM is always GiB - it’s how it works.' - that's my point.
RAM by convention is in GiB even if the notation uses GB.
The "Crucial Pro 32GB Kit (16GBx2) DDR5-5600 UDIMM" at https://www.crucial.com/ is actually GiB, because the convention for RAM is to use "GB" to mean "GiB".
Even though that doesn't follow SI prefix conventions.
Just like the predicted currency of 75 years from now uses 99⁶ to mean 99 million, even though a superscript 6 is not currently a suffix meaning "million".
Given that 99^6 is supposed to mean 99,000,000, it seems clear that the superscript is not meant to be exponentiation but rather just to denote the number of zeroes following (i.e. shorthand for scientific notation, à la the 99e6 syntax used in programming languages).
Not to mention, 99^6 = 941,480,149,401 ≠ 99,000,000 (which TFA also quotes). But who's to say notation didn't degrade along with the rest of society? :^)
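For anyone who wants to check both readings of the arithmetic, a two-line Python comparison:

    print(99 ** 6)    # 941480149401 -- true exponentiation, nowhere near 99 million
    print(int(99e6))  # 99000000     -- the E-notation reading the prop presumably intends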
Probably designed by one of those young whippersnappers who never used a slide rule and confused 10^6 with 10e6, because that is how you enter big numbers on a calculator.
A worthwhile read that I thoroughly enjoyed.