It's a holdover from the days of Morse code. Recall that the first computers were built for military problems, and the first output device was the teletype. Teletype was originally for military messaging, which had a long history of using all caps because it relied on manual transcription of Morse code (and other codes) over wire and radio. The all-caps policies were put in place to make sure the officers could consistently read what the operator had transcribed. Some of these date back to the 1850s. The Navy didn't actually do away with all-caps until 2013.
I think this makes more sense (or at least gets more directly to the point) than any of the Stack Overflow answers. Lowercase is easier to read in print as our minds learn the shapes of the words and can interpret whole words at a time rather than letter by letter. But Morse code was originally transcribed by hand, and it is easier for sloppily written lowercase letters to be mistaken for one another than the more distinctive uppercase letters, so it became a standard to write in uppercase. This tradition carried on to teletype and early terminals until both cases were supported.
As someone who had extremely poor handwriting as a child (it is still not great...), it makes a lot of sense to me that they'd land here.
Over the years, being regularly mocked/embarrassed/reprimanded over my handwriting and often forced to re-write assignments led me to develop a weird, sort-of-hybrid print casing that substituted a fair number of uppercase forms anywhere my lowercase forms caused trouble.
(This is mostly a fallback when someone can't read my cursive, or for official forms, package labels, etc. For the same reason I also adopted a smaller number of uppercase print capitals in my cursive.)
When it isn't socially appropriate to use ALLCAPS or come across as a sTRANgE pERsoN, I have to be fairly careful/attentive when writing in print to avoid dropping into mixed case.
(I'm not a monster; I'll scale these more like smallcaps when I write them.)
It's been a few decades, but I recall having to write in all caps in my high school drafting class. We were told the all caps and the letterforms we were told to use helped legibility when the drawings were stored on microfiche.
We were still taught that in high school in the late 90s. Probably some of the last. You are correct. Here's a good example of drafting from the 1960s: https://i.imgur.com/tXozAMy.png
It's not just all caps, but a flowing, easy-to-read style of all caps.
Hard to believe people used to draw all this stuff by hand. I was never very good at it.
The header image on the website shows slashed zeroes, but the font preview doesn't, which is unexpected. In fact, when previewing in Google Fonts there's barely any visual distinction between the uppercase letter O and zero. There's only a very slight difference in width, which you actually have to look for. Punctuation isn't centered in its box and leaves so much trailing space that the test text `5240N 15:01 .84 8,9` reads like `5240N 15: 01 . 84 8, 9`
These problems don't exist as much in the variable-width font, where there is no trailing space in the punctuation and the uppercase Os are rounder.
There's also some weird pinching in the corners of N. 1, l and I are perfectly distinguishable though, so at least it's got that going for it.
The hollow zero vs slashed or dotted zero thing is often done in fonts with an extra variant; basically the font can contain both and will display what the application requests. So you should assume that the font really does have the slashed zero, but the font viewer you looked at it with didn't ask for it.
That's a good point. I added that rule (font-variant-numeric: slashed-zero;) to the font preview using the browser's devtools, and I saw no difference.
Also, for decades teletypes used 5-bit 'Baudot' codes: 2^5 = 32 codes (LTRS). A shift code (FIGS) could be used for number digits and most punctuation. (Telegrams shown in old movies are almost always upper-case only.)
7-bit ASCII (2^7 = 128) didn't start to show up until 1963. (Starting the arguments.)
Funnier: the machines used gears inside that limited how fast they could type (60 wpm on the low end ... which also made it hard to 'read the mail' of commercial carriers that used 66-75 wpm machines).
Distilling all the explanations, the answer is simple.
When you're limited to one case, all-uppercase has a long history, and we're used to reading everything from headlines and titles to telegrams and stone inscriptions in uppercase. It's natural, we're used to it, and uppercase came first historically anyway.
Whereas all-lowercase isn't really a thing historically. You see some trendy logos or ads that use all-lowercase, but that's a pretty new thing. (Well, and then maybe back to e. e. cummings for poetic effect?)
At the end of the day, it's frequent to encounter text in all-caps, it's rare to encounter it in all-lowercase. So that's why.
Historically the distinction between upper and lower case has been one between different scripts for carved/engraved letters (where curved lines are awkward), and handwriting which does not have any such limitations.
BUT THEN IF ALL-UPPERCASE WERE THE HISTORICAL DEFAULT, WHY DID MOST OF THE WESTERN WORLD SWITCH TO USING NEARLY ALL LOWERCASE? AND WHY DOES IT FEEL SO JARRING - EVEN RUDE - TO USE ALL CAPS?
> WHY DID MOST OF THE WESTERN WORLD SWITCH TO USING NEARLY ALL LOWERCASE
It's easier to handwrite than all Latin capitals. Also, the greater variation in the size of characters makes it easier to read at higher text densities, like on the printed or written page.
And from there, it spread mostly via imitation of style. If you look at most of the scripts in the world that don't have upper/lowercase distinction, they tend to be far less blocky than uppercase Latin script, and more like lowercase.
> AND WHY DOES IT FEEL SO JARRING - EVEN RUDE - TO USE ALL CAPS?
Because we are trained to interpret all-uppercase today as a form of emphasis. Emphasizing everything, especially when it's not necessary, indicates a lack of social awareness. It would be similarly socially awkward to end every sentence with an exclamation mark.
Because the distance between text and reader shortened after the arrival of the printing press.
In the Roman and pre-Gutenberg eras, text was inscribed, e.g. on monuments. The reader might be a casual gawker, or someone at the base of a tower, or generally someone who could not be assumed to be in a position to engage closely with the text.
Once print became widespread, that changed, and the superior readability of mixed case won out: it was faster and easier to read. All caps was reserved for things that had to be consumable at speed, at a distance, and in short chunks, like newspaper headlines and mass media posters. Over time the use of upper case became synonymous with IMPORTANT, and since shouting people inherently think their message is important, they used upper case too.
COBOL code is an interesting case, actually. One key reason why early computer languages were either verbose or cryptic is that they had very few standard symbols and punctuation marks to play with. It's why COBOL still has syntax like DIVIDE X INTO Y GIVING Z: you might not have a standard forward slash or divide symbol in your character set. It's also why Pascal (drawing from early ALGOL) went with BEGIN and END markers as opposed to the later use of braces.
I'm pretty sure lowercase letters originated with the creation of the Carolingian minuscule script[1].
I had heard/read that it was created as a unification of handwriting under the Carolingian empire, but that may be untrue[2]:
“The use of the new script makes an experimental impression,” says the Latin scholar. “They were trying it out. In the Middle Ages a script like this was not just invented, as is the case today. It was developed as part of the living tradition of a scriptorium. In the 8th century Corbie was something akin to a laboratory for new scripts.” This is another point that militates against the idea propounded by many popular history books that in the framework of his cultural and educational policies Charlemagne more or less commissioned the devising of the minuscule with a view to creating a uniform, readily legible script.
Prior to that, people wrote in uncial and half-uncial. The distinction between majuscule and minuscule letters wasn't there yet, but even then they were enlarging and decorating letters at the beginning of a sentence. See this image on Wikipedia for an example[3]: https://en.wikipedia.org/wiki/Uncial_script#/media/File%3AKe...
I like to also imagine that since telegrams were supposed to be short and direct, it was the written equivalent of yelling over a bullhorn. Anything more eloquent would be written by hand in proper casing (and cursive)
It's only jarring if the rest of the context is not all caps. Precisely zero people in the 80s complained that their VCR was shouting at them because its display used all-caps letters.
Also in older computers, fonts were designed for all caps text. And it was fine:
Historically, yes: upper case came first. Even when lower case showed up, the key combinations needed to produce it were harder to use, so it took more time to catch up.
It's also interesting to note how many of these systems have now been running daily for 50+ years. Stuff like your airplane boarding passes and hotel bookings still uses all-upper encoding because these systems go back to the 1960s [1]. Cheques and other bank documents still use those MICR typefaces, which are similarly encoded in all caps [2].
But is that an O or an 0? Of course, care was taken to display them differently, and the same could have been done with l and 1, as coding fonts commonly do now.
No, the real reason is that upper case was customary for telegrams and then teletypes. Upper case was well established before CRT displays came into play.
The Remington No. 1 - which introduced the QWERTY layout - omitted number keys for 0 and 1; the O and I letter keys were considered sufficient. The No. 2 added a shift key, and a full number row.
All have characteristics that set them apart from potential lookalikes (the bar through the 0 is one excellent example), but it’s up to the font designers to take readability into account, and most designers are looking at reading English prose, not code. They don’t think “my font might be used to render a URL in a context where a user will need to make an essential security decision based on whether the string they’re reading contains an l or an I.”
I was going to say that: in the early days, terminals had less resolution than many TVs, especially as graphics drivers weren't standardized yet or had CPU/memory constraints on how many pixels they could draw.
The neat thing about CRT terminals, or really any monochrome CRT, is that because they don't use shadow masks, the text is crisp. Really good looking, much more so than one would expect given the low resolution, especially compared to text on a color CRT.
I sometimes wish monochrome displays were more of a thing. There is e-paper tech, and text on it looks great for exactly the same reasons, but I would like to see what a high resolution monochrome LCD or oled display could do.
> but I would like to see what a high resolution monochrome LCD or oled display could do.
Monochrome LCDs are out there, but they mostly burn a hole in your wallet. They're used in medical imaging, but there's no other mainstream market for them, so the prices are high because the unit count is low.
PERHAPS EVERY SYSTEM SELLER ASPIRED TO BUSINESS AND CAPABILITY SIGNALS WITH FORMALITY AND AUTHORITARIANISM LIKE "WE BEGIN BOMBING IN FIVE MINUTES". HOW WEAK AND TIMID WOULD IT APPEAR IN LOWER CASE? THAT IS THE REASON. IT DEFINITELY WAS NOT THE FACT THAT MOST 50'S AND 60'S SYSTEMS LACKED LOWER CASE, SO THERE WAS ONLY ONE CASE. OR THAT EARLY OCR NEEDED TO USE AN ALPHABET AS DISTINGUISHABLE AS POSSIBLE.
ARGUE WITH ME, BUT I THINK AFTER LOWER CASE BECAME A THING, WE FORGOT AND LEFT CAPS LOCK ON SO WE COULD SHOUT LIKE GRANDPA WHO FORGOT TO CHANGE HIS HEARING AID BATTERIES RATHER THAN IT WAS TOO MUCH EFFORT TO HOLD SHIFT ON KEYBOARDS AND SCREENS.
BUT WORRY NOT YOUNGSTERS, IN 50 YEARS, WE WILL BE WRITING IN STREET GRAFFITI EMOJI SLANG WITHOUT SPACES OR PUNCTUATION.
Initially because teletypes only had uppercase, and those were the only tty devices available. And then when early CRT terminals became available, RAM was expensive, and you could save a whole chip by not storing one more bit per screen character.
I think paper tape and teletypes predate punchcards for IT applications. Baudot's work started in the 1870s, and fully automatic teleprinter systems were in operation by the 1900s (decade) [1]. IBM may not have been punching alphanumeric content until the 1920s (as your article suggests)?
Baudot (the code, not the man) was 5 bits, so only 32 codes. It couldn't even fully encode the alphabet and the digits; shift codes are used to switch between two sets of symbols.
Lowercase was added quite early though. Some hot metal casting typesetting machines ran on paper tape. They typically used six bit codes with multiple sets of shift cases. Upper, lower, digit, symbol, etc. (The physical construction of the teleprinter or typesetter had some relation to the nature of the code used.) The use of paper tape decoupled typesetting from the work of preparing and laying out the physical plate for printing. Typesetter and printer could be in different rooms or buildings, or maybe even on different continents; those tapes were being run over telegraph circuits by the late 1920s.
Teleprinters/teletypes go back further than you think. The first were in the 1800s. By the early 1900s teletypes had already become standardized around the time IBM developed the 80-column punchcard.
Your answer doesn't actually get deep enough. You say "because teletypes only had uppercase" -- but there is still a "why?" there -- they could have only had lowercase instead. The article answers this in a more comprehensive way by examining the history of typesetting and written script.
It's because they couldn't easily draw letters below the line. It was all block-aligned. They used blitted bits from a lookup table and copied them into squares on the screen that did not vary from letter to letter
What line? As you say, they were just bitmaps, so the font designer could declare the font's baseline as "3/4 of the way down the character bitmap" and draw lower case letters just fine.
Lower-case letters have both ascenders and descenders. Fitting both of those in 8 pixels is doable, but leaves you with either the clipped ascenders or mangled descenders that you see in old 8-pixel lower-case typefaces: your method works, but the results are poor. Upper-case Latin letters are far more amenable to rasterizing down to small sizes than lower-case ones.
Though the real reason behind the question is that teletypes printed in upper case, early computers copied teletypes and followed that convention which carried on through the 8-bit era.
Then it would look crappy relative to capital letters: in "Ajax", if the j bottomed out where the A does, it messes with the whole look. There's a whole world that we take for granted that typesetters/font-makers/graphic designers have figured out for us.
The question here was why they didn't use lower case letters exclusively instead of using upper case exclusively. In other words, if you have to pick one, why did they pick upper case instead of lower case? The difficulty of mixing cases doesn't apply to the question.
Exclusively lower case letters need more vertical space than exclusively upper case letters. "fgjl" has the same "top line" as "FGJL" but has a bottom line that's at least two pixels lower.
There are lots of 8x8 bitmap fonts for LCDs and 8-bit retro systems where they just didn't care about descenders being below the floor of the font. Aesthetically suboptimal but didn't stop them from being used in production.
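To make the geometry concrete, here's a toy Python sketch of the tradeoff being discussed. The 8-row cell and the baseline sitting on row 6 are illustrative assumptions, not any real font format:

```python
# Toy model of an 8-row character cell: rows 0-5 sit above the baseline,
# rows 6-7 below it. Real bitmap fonts varied; these numbers are assumptions.
CELL_ROWS = 8
BASELINE_ROW = 6

def fits(ascent: int, descent: int) -> bool:
    """True if a glyph using `ascent` rows above the baseline and
    `descent` rows below it fits the cell without clipping."""
    return ascent <= BASELINE_ROW and descent <= CELL_ROWS - BASELINE_ROW

print(fits(ascent=6, descent=0))  # True: an upper-case letter, no descender
print(fits(ascent=4, descent=3))  # False: a lower-case 'g' wanting a ~3-row tail
```

With only two rows under the baseline, you either clip the tail, squash it, or shift the whole glyph up, which is exactly the compromise visible in those old 8x8 fonts.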
It used to be, at least on some systems, that if you logged in with an upper-case user name, the system assumed you only did upper case, and all future output would be in upper case. In this case, Unix is in lower case because you told it you could handle it.
Also, in Japan, some old equipment displays only katakana. No hiragana, let alone kanji. E.g. old cash registers and such.
I have a Yamaha FX-500 (audio effects processor) from 1989 (original owner!) which is like this: in the program titles, shown on the small LCD screen, you can use Roman letters, numbers and punctuation and also Katakana.
I modded that thing about a decade ago, replacing the NVRAM chip with a small daughterboard with a batteryless chip; and replaced some RC4558 op-amps with NE5532.
You can see pictures from the batteryless mod here:
I had to put a transistor onto the little board in order to invert one signal.
That was all made possible by suddenly getting the schematics for the FX500 thing in my hands, so I knew exactly what the original part is and all its signaling and how to relate it to the batteryless AutoStore chip.
I found the schematic on a Hungarian website called Elektrotanya, which at the time you could only join if you passed a small electronics knowledge test.
In one of the images you can see a diode (nestled under the socket for the IC). This is part of the transistor circuit, which features a Baker clamp to speed it up.
In the beginning, encoding covered just a single case of letters, not a combination of lower and upper case (an evolution that later led to the creation of the ASCII code set). This had to do with the limited amount of storage in early computers, and the fact that people had to use punch cards to enter data. Upper case was preferred because it is more legible and is the standard form of letters in the western world.
I'm not convinced. I think I could live with that actually, even today. Certainly if I got used to it.
Now that I think about it, I read a lot of comics as a kid, and most of them were in all caps (or at least, they did not have two clear cases). It didn't feel odd at all that all speech in Asterix, Donald Duck or Calvin and Hobbes is in upper case. I bet most people don't even notice.
Isn't it weird that we spend so much effort to present text on computers in the way that we got used to specifically in printed books, once upon a time? The work that's been put into font rendering is crazy.
Right; it's all subjective, based on our experiences. If we were strictly exposed to upper case only from the start, then it would be jarring to see lower case later on.
But now you're substituting "jarring" for "legible". They're dependent, of course, but not synonyms.
What's jarring depends entirely on what we are used to. What's legible depends among other things on how distinctive the shapes look to our visual system.
At least on home computers connected to crappy TVs via composite or coax, uppercase characters are almost certainly more readable than lower case characters just because uppercase characters use more pixels of an 8x8 pixel matrix.
Also, uppercase was standard on Eastern European computers, and I doubt they cared whether they could spell GOD all uppercase ;)
The only notable exception I can think of is the ZX-80/81 which had notoriously rudimentary output capabilities because it was designed to be so cheap.
The KC85/2 (https://floooh.github.io/tiny8bit/kc852.html) also only supported uppercase, AFAIK mainly because there wasn't enough room left in ROM for the lowercase characters.
I think it's kinda weird that a lot of the toplevel comments here are offering (incomplete and sometimes incorrect) answers to the question, clearly without having read the highest-rated answer after following the link. That answer is much more comprehensive and goes into the history of typesetting and even writing.
The detailed history can be found in https://www.redhat.com/sysadmin/unix-linux-history. In short, Unix was initially developed as an experiment in operating system design. But the investment of buying a PDP-11 to port it to was justified on the promise of creating a typesetting system for patent applications. And so it needed both upper and lower case early in its history. Since most English text is lower case, that was a sensible default to use.
Good responses here already, but it's interesting to note that the early typewriters had the same issue. The noisy Remington No 1 was released in 1874, with all caps and no lower case. The shift key allowing both upper and lower case wasn't introduced until the Remington Model 2 in 1878.
Yes, and there were still connections to telegraphs, which used all upper-case.
Sholes was much inspired by Hughes' telegraphs that used a piano-like keyboard with keys in alphabetic order.
The first early adopters/beta testers of Sholes' Type-Writer were telegraphists who transcribed Morse code. Only after that did Remington get involved.
Heh. I once spent two days trying to understand why Fortran didn't know what cosine was. I had overshot the "oh" key and hit the "zero" key. Trying to spot the difference between "COS" and "C0S", on those fonts, when I didn't even realize that's what I was looking for, was... non-trivial.
It did, as I recall. But it was only one or two pixels, plus one pixel at the corners. It wasn't much if you weren't looking for it. And I wasn't - it hadn't occurred to me as a failure mode.
This may have been answered already. Character sets are a continuous evolution from Morse code (which didn't have a notion of "case" or even punctuation), to the first, or at least early teletypes, which used 5-bit Baudot that didn't have upper/lowercase either.
Eventually, lowercase capable terminals came along and the more interesting question is, where did the cultural shift to "lowercase by default" actually come from? Unix? Very early Unix stuff was all caps because that's all they had, but eventually lowercase prevailed.
In late 1977/early 1978 I used a Honeywell Level 66 mainframe whose timesharing interface was lower-case. My memory is hazy, but I’m pretty sure I typed in the FORTRAN in lower case, and the printouts were in upper case, including the comments. Probably, as mentioned by other posters, the line printer only had an upper case train. Certainly I typed lower case into the command prompt.
The operating system on that piece of overworked big iron was GCOS, formerly GECOS. This is the very same GECOS that’s still a field in /etc/passwd, where it was originally used to store the user’s GECOS account name. Bell Labs had a GECOS system and you could spool jobs to it from the Unix systems, so that was why the field existed. It shows the Unix creators were familiar with that operating system.
In 1977 Unix (then at Version 6) was only just beginning to appear outside Bell Labs. Its influence on the wider computing world would have been slight.
However, I doubt the Unix creators got the lower case directly from GECOS. They had worked on Multics, which was also lower case. (Coincidentally, Multics ran on the same hardware as GECOS, with the addition of a memory management unit.)
The earliest terminal and printer I used that supported lower case, supported lower case "without true descenders", i.e., the lower case letters had to fit in the same character cell as the upper case letters. Lower case "g", for example, was shifted up, which looked funny. Needless to say, we felt really modern and high class when we got hardware that supported "true descenders". (And of course all this was a built-in monospace font; font support was years in the future.)
The large SO reply missed a beat: Unix was case-sensitive/lowercase-preferred because Multics was. As far as I know, Multics was the first to be case sensitive. It had the advantage of being developed after ASCII had been defined, and of standardising on 7-bit ASCII. Other 36-bit machines of the era used multiple character sets, including SIXBIT, a six-bit character set with only one case, to pack an extra character into a word.
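To illustrate that packing tradeoff, a minimal sketch assuming DEC-style SIXBIT, where the code is simply the ASCII value (0x20-0x5F) minus 0x20; six such characters fit in one 36-bit word versus five 7-bit ASCII characters:

```python
def sixbit_word(text: str) -> int:
    """Pack up to six characters into one 36-bit word.

    Assumes DEC-style SIXBIT (ASCII 0x20-0x5F minus 0x20). SIXBIT has no
    lower case, so fold to upper case first. Illustrative sketch only.
    """
    text = text.upper().ljust(6)[:6]   # pad or trim to exactly six characters
    word = 0
    for ch in text:
        code = ord(ch) - 0x20
        if not 0 <= code < 64:
            raise ValueError(f"{ch!r} has no SIXBIT encoding")
        word = (word << 6) | code
    return word                        # 36 bits: six chars vs. five 7-bit ASCII chars

print(oct(sixbit_word("UNIX")))        # one whole word, padded with two spaces
```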
The Multics terminals during its initial development at MIT were also mostly variants of the IBM Selectric typewriter, which rendered lower case text with descenders just fine. [0] [1]
The retrocomputing question mentions the theory that upper case was selected because it would be disrespectful to put "God" in lower case. I remember reading that explanation decades ago on alt.folklore.computers, so the story has been around for a long time: groups.google.com/g/comp.misc/c/vVW0wrfLaKw/m/tr-MsouDL5YJ
The opposite question is interesting; is there anything historically that is lowercase-only?
Think of situations in which a font has a single case. Like say on calculator-like displays which have additional elements to show letters. You could see those on old VCRs for example:
Now imagine this display but with lowercase letters:
1. In most letters the top half of the letter display is not used, but in some it is.
2. In some letters like "q y g j" you need a "descender" (https://en.wikipedia.org/wiki/Descender), which means you need to extend the display further down, and add more elements.
3. Digits themselves are "capital only". There are no "small case" digits in most fonts, or in writing. Which makes capitals consistent with them.
So capital letters are simpler, more visually consistent and as a result also I'd say more legible on displays where there's not a ton of text to begin with.
Even on pixel displays where the "descender" or "unused top half" is not a technical problem, you still would need larger spacing between lines to fit the descender without some letters touching the line below.
Capital-only is also the optimal choice when you need to print some small label on a device, or on a street sign, for example. It's a "STOP" sign, not a "stop" sign. No one does "small letters only" labels. It's either capitals only, or mixed case.
> In some letters like "q y g j" you need a "descender"
I just noticed yesterday the MBTA buses in Boston use mixed case on a dot-matrix display without space for descenders. Some letters are simply raised slightly (like "y"), others are replaced with small-caps versions (like "q").
That's easy. People are now used to high-quality displays; even cheap ones are good these days.
But this was not always so. The first video monitors were of very moderate quality, with huge distortions, and they even wore out (yes, OLED wear-out is not a new problem for the industry). Only later did professional monitors appear, and after a looong time nearly all monitors became digital, high quality, and long-lived.
And UPPER case is much more readable on a moderate-quality monitor. That's all.
Bad quality printers are also a big issue. It's a lot easier to read a low-quality printout if you know it comes from a tightly limited character set. And some of these (such as dot matrix printers in "quick, low quality" mode) are still among the cheapest options if you have to print a lot.
The mainframe system I used in the 1970s had its printers with upper-case print trains. They did have one printer with an upper/lower train, so if you wanted to print a document out you submitted a batch job specifying that printer, and waited longer than usual for your output (because the print speed was lower). The computer centre management of the time considered document formatting to be an inferior use of their shiny equipment.
Teletypes. Many early computer systems were connected to teletypes as terminals.
I cut my teeth playing SUMER and programming Fortran on a VAX over an all-caps teletype with an attached paper-tape punch/reader, and on a good day I got to use one of the DECwriters (a dot matrix printer-terminal).
The vector screens (Tektronix?) were always in use by the engineering students, and I was just an 8 year old logging in on my mom's account lol.
Sumer! I translated a printout of Sumer my buddy acquired somewhere into the BASIC for RT-11 and played it quite a lot back in the late '70s. Almost nobody I know has ever heard of it.
This is why in Common Lisp the internal names of symbols are normally (including all symbols in the standard) all in upper case. The default reader converts lower and upper case characters in symbols to upper case, and the default printer prints them in lower case (if no escaping is required), so you can write a program in lower case.
The "God" justification for all-upper-case is one I've heard attributed to Ken Olsen, CEO of Digital Equipment Corporation, though I can't seem to find a reference at present.
The widespread adoption of upper-case text in computers and electronic communications would of course have long pre-dated either Olsen or DEC.
Line printers. We couldn't use mixed case until dot matrix or daisy wheel printers came along.
If the question is why upper case instead of lower case, there is a long tradition of handwritten uppercase only labels in drafting and mechanical drawing when precision is required.
I think that it's a bit of a reach there to blame 3rd century Latin for why computer/telegraphy people 16/17 centuries later decided to not retain the idea of lower case, when by that point it had been around in writing since the 10th century.
For me it's also a matter of font; mix and match case+font+size and who knows the difference between 'l', '1', 'I', and '|'. (Small world: I was just cursing at a pw-manager a few hours ago over the lack of distinction between some of these characters.)
I don't think that's true. Uppercase I was narrower also on roman inscriptions. Just look at any inscription on old monument walls written in Roman capitals.
At least he didn't say that the Romans used V instead of U because their letters had to consist of straight lines. For some reason, that "factoid" is Out There.
Yeah. I think part of that factoid stems from some modern "ancient looking" Greek scripts used on souvenirs that use angular versions of capital Greek letters.
Some of those glyph variations were used, but rarely in the precise mixture that maximizes the angular look, which seems to have been chosen in modern times to represent "antiquity".
The Latin script derives from the Etruscan script, which in turn stems from one of the varieties of the early Greek scripts (I forget which one).
An interesting fact is that the Roman cursive script retained many of the archaic properties of the original Greek script it comes from, while the capital letters have been subjected to more evolution.
In particular note the shape of the Roman cursive "m" (with the long stem on the left) and the "a" without the horizontal line.
For some infuriating reason, in 2023, all the police reports I see are still primarily written ALL IN UPPER CASE AND EVERY POLICE OFFICER OR CORRECTIONAL OFFICER HAS TO HIT THE CAPS LOCK KEY BEFORE THEY WRITE THEIR REPORT. I DO NOT UNDERSTAND WHY THIS IS AND WHEN I ASK THEM THEY SAY THAT'S THE WAY THEY HAVE ALWAYS DONE IT AND THAT IS THE WAY THEY WERE TOLD TO DO IT.
It's something to do with the military, that's all I found out. A lot of cops and prison guards are ex-military and somehow it has transferred from there.
Uppercase is numerically 65-90, lowercase is 97-122.
The numbers for uppercase can be represented in fewer bits.
"To implement a major change like lower case (keeping 6 bits per character in my syntax table instead of 5 bits) would have been a horrendous and risky job to do by hand." - Steve Wozniak
The early days of manual typewriters, paper tape, teletypes, and vacuum tube systems, which already followed this practice, predate US-ASCII, so I don't think the particular numeric values assigned by US-ASCII can have any explanatory power in answering this question.
Me, I blame the Romans ;-)
The Latin alphabet was initially only what we call the "upper case". What became the lower case came (a millennium?) later, first as an alternate style of handwriting and then as an addition to the alphabet along with rules about when which form of each letter should be used.
Given the need to economize, as in 5-bit teletype codes, it's not surprising the chosen convention was to print (or later, display) those codes as upper case, as that is, historically speaking, the default.
Still, I like to wonder if anyone ever thought to build a teletype that printed in lowercase just to screw with people. :-D
The teletypes needed all the control characters as well (CR, LF, bell, etc.), so most of the code points below ASCII 97 were already spoken for.
If I remember correctly, Apple terminal emulators set reverse video when they meant a capital letter, so you could converse with something over a modem that was sending lowercase.
When I wrote "teletype" I wasn't referring to what teletypes had become by the time Unix was developed. Think further back than that. This is why I mentioned 5-bit codes:
This coding scheme is so constrained (32 possible 5-bit values) that it uses the codes FIGURES (01000) and LETTERS (10000) to toggle back and forth between two alternate sets of meanings for the remaining 30 possible codes. Still not enough space for lower case or ASCII's plethora of control codes (just NUL and DEL).
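Decoding such a stream is stateful. Here's a rough Python sketch of the LTRS/FIGS shift mechanism; the tiny code table is made up purely for illustration and is not the real Baudot/ITA2 assignment:

```python
# Rough sketch of two-register (LTRS/FIGS) decoding. The table below is a
# made-up subset purely to show the mechanism, NOT the real Baudot/ITA2 codes.
FIGS, LTRS = 0b01000, 0b10000          # the shift codes quoted in the parent comment
TABLE = {
    0b00001: ("A", "1"),               # (letters meaning, figures meaning)
    0b00010: ("B", "2"),
    0b00011: ("C", "3"),
}

def decode(codes):
    shift = 0                          # index 0 = letters register, 1 = figures register
    out = []
    for c in codes:
        if c == FIGS:
            shift = 1
        elif c == LTRS:
            shift = 0
        else:
            out.append(TABLE[c][shift])
    return "".join(out)

print(decode([0b00001, 0b00010, FIGS, 0b00011, LTRS, 0b00001]))  # AB3A
```

Every code is reused in both registers, which is how 5 bits stretched to cover letters, digits, and punctuation, but it left no room at all for a second case of letters.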
I remember this being a space-saving technique on TI calculators as well. Some of them allowed using lowercase letters in console output, but why would you bother when it ate into the KB of total storage you had to work with.
The computers my mother worked with (1960s?) only had 6 bits per character. So with 2^6 = 64 different characters, there wasn't room for both upper case (26 chars) and lower case (26 chars) plus all the numbers (10 chars) and symbols etc. you'd need: 26 + 26 + 10 is already 62, leaving only two codes for space, punctuation, and control characters. So they only had upper case.
I think it sort of stuck from there, that computers and commands etc. used upper case.
https://www.al.com/wire/2013/06/navy_puts_all_caps_communica...
https://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=489...