80x25 (tive.org)
204 points by paulgerhardt on Oct 24, 2019 | 129 comments



80 characters was the number of characters a 10-pitch typewriter could fit on a line of standard letter-size paper. 24/25 lines came from the vertical resolution of CRTs and what you could fit into 2KB of RAM.

The point size argument is clearly wrong, as this sample from the early 19th century shows: https://upload.wikimedia.org/wikipedia/commons/4/45/A_Specim...


That appears to be typeset, not typewritten.


But they didn't use typewriters to make punchcards; they used a press.


80, yes. 25 lines, though, had nothing to do with punched cards.

And yes, physical card quality was a big deal. The early feeding mechanisms relied too much on the card thickness and card edge properties. That lasted into the 1970s.

Then, near the end of the punched card era, Documation nailed it.[1] They discovered that if you squirt air into the base of a card stack just right, you can make the cards separate themselves slightly. Then a vacuum picker can reliably pick them off one at a time. A neat little piece of machinery, far simpler than most earlier card readers.

[1] https://www.youtube.com/watch?v=qu55b0GpgE8


> The VT52 and its successor the VT100, though, doubled that capacity giving users the opulent luxury of two entire kilobytes of display memory, laid out with a font that fit nicely on that 4:3 screen.

> .. 80×25 just sneaks under that opulent 2k limit with 48 bytes to spare.


The 25 lines come from the 2 KB of display RAM in the VT52/VT100, plus the desire for 80-character lines.


The VT100 had a 132-character mode as well.

They also had shockingly bad protection on the serial port against induced surges; eventually we bought the full service manual so our in-house electronics shop could fix ours.


> They discovered that if you squirt air into the base of a card stack just right, you can make the cards separate themselves slightly. Then a vacuum picker can reliably pick them off one at a time.

This is what I do every time I load more than a couple of sheets in my cheap old laser printer.


Fun fact, it’s the same way ATMs dispense money.


that machine is really boss


I expect the standard would be something like 80x25 regardless, given the size and resolution of early terminal monitors and how small you could make text before it's unreadable. Whether a different sized greenback ends up making a 70 or 90 column punched card, it's not that big of a difference from what we got. The 4:3 aspect ratio is probably the bigger influence.

It's like how rack mounts are 19 inches wide because the Bell System made that the standard for its telephone relay racks 100 years ago. In the long run it's more important that there is a standard than what it is.


Video memory addressing could have become quite a bit simpler and faster if the width had been a nice round 2^N number (e.g. 32, 64 or 128 characters per line), especially on the limited 8-bit machines of the 70s and 80s. So 40 or 80 is indeed a somewhat strange choice (at least those numbers aren't completely odd, though, with 40 = (2^5)+(2^3) and 80 = (2^6)+(2^4)).
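
To make the arithmetic concrete, here is a minimal sketch (plain Python, with made-up coordinates, not modelled on any particular machine) of what the address calculation looks like for a power-of-two width versus 80 columns:

  # Offset of character cell (row, col) in a linear text buffer.
  def offset_width64(row, col):
      # Width 64 = 2^6: a single shift, or simply wiring the address bits.
      return (row << 6) | col

  def offset_width80(row, col):
      # Width 80 = 64 + 16: a multiply, or two shifts and an add --
      # roughly what an 8-bit CPU without a hardware multiplier has to do.
      return (row << 6) + (row << 4) + col

  assert offset_width64(3, 5) == 3 * 64 + 5
  assert offset_width80(3, 5) == 3 * 80 + 5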


TRS-80... 64 chars (or 32 double-width) by 16 lines


As others have pointed out, the claimed link between banknotes and terminal dimensions is unclear, if not flat out wrong. The article's own logic is missing a crucial connection right here:

> At one point sales of punchcards and related tooling constituted a completely bonkers 30% of IBM’s annual profit margin, so you can understand that IBM had a lot invested in getting that consistently, precisely correct.

> At around this time John Logie Baird invented the first “mechanical television”; like punchcards, the first television cameras were hand-cranked devices...

It goes on to argue that the television standards influenced terminal dimensions, but there is no link (unless I missed something) between the banknote/punchcard discussion and that of the television, other than the fact that both used hand-cranked devices. There is no mention of the punchcard's size/dimensions being carried over to television (and by extension, terminals), other than what appears to be a coincidence in the number of characters fitting on a line of a punchcard.

Looking at the origins of the television standards (at least for the number of terminal lines, since the number of characters per line seems to have been influenced by typewriters), one could trace it back to William Kennedy Dickson (created the 35mm standard by cutting 70mm roll film in half), or previously to Peter & David Houston and George Eastman for their creation/marketing of early roll film cameras.


The link is probably as follows: bank notes -> Hollerith card (80 x 12) -> Datapoint 2200 terminal (80 x 12, facilitated by a 1K Intel linear shift register for display memory) -> 2K display buffer & the 4:3 display dimensions of silent film (as opposed to the later Academy format), as generally adopted for television and thus the major source of demand for ready-made raster cathode ray tubes => two punch cards + 1 extra status line.

The Datapoint 2200, evolving from an original idea for a data editing device to replace the IBM 029 keypunch, is missing from this account, though. Announced in 1970 (and shipping in 1971), it predates any of the terminals mentioned in the article. (BTW, the Datapoint 2200, which was also the ancestor of the Intel 8008 MPU and thereby of the modern PC, was an enormously influential device and is generally underrated historically.)

The link between columns on a punch card and typewriters is probably found in the need for some correspondence between typed tables and information punched on cards. The aspect ratio of silent film is probably more for esthetic reasons, since the technical aspect of the width of a frame could have been addressed by any kind of gearing. However, there's a link between the speed of the film moving through the camera, the number of exposures per second, the width of the image, the optical quality of the lens, the shutter construction and the sensitivity of the film, putting some constraints on the aspect ratio of choice. (Obviously, if the film is moving too fast and exposure becomes too short for the chosen material, the resulting image will be dim. The choice of film sensitivity, on the other hand, is related to image contrast.)


Punchcards (and thereby FORTRAN) were 80 columns. To have a viable glass tty, it needed to display 80 columns.


> > At one point sales of punchcards and related tooling constituted a completely bonkers 30% of IBM’s annual profit margin, so you can understand that IBM had a lot invested in getting that consistently, precisely correct.

There's also a missing logical connection here. If sales of punchcards were 30% of IBM's revenue and profit margins were tiny, then it would still be very important to get them right. But 30 different things can each be 30% of the profit margin at the same time that all 30 are insignificant to the company.

More generally, how important it is to get the punch cards right depends on what your profit margin is on the punch cards, not on what the company's overall profits are.


Something I missed in this great writeup was the Datapoint 2200 terminal (the ancestor of the 8008 MPU), announced in 1970 and introduced in 1971. It originated from an idea about a replacement for the 029 keypunch (intended to replicate the prior success of CTC's Datapoint 3300 terminal as a drop-in replacement for the ASR 33 Teletype). From this initial idea the Datapoint 2200 directly inherited the 80 x 12 layout of the Hollerith punch card. (That this fitted quite perfectly into 1K certainly didn't hurt and may have been an incentive for keeping the dimensions of the initial project. Reportedly, the low build height resulting from the narrow 12-line screen was generally liked by users.)

Compare: Wood, Lamont. Datapoint: The Lost Story of the Texans Who Invented the Personal Computer. Hugo House Publishers, 2013.

See also, https://en.wikipedia.org/wiki/Datapoint_2200


What may be interesting, as well, is that the earlier Datapoint 3300 terminal (announced in 1967, shipped in 1969) already featured a 25-line display, but at 72 characters per line, just like the ASR 33 it was meant to replace. There seems to be no hard evidence on how CTC arrived at 25 lines per screen, though.

Regarding the Datapoint 2200, there's prior art in the form of the IBM 2260 Model 3 terminal (1964), featuring 80 characters in 12 rows as well, in direct correspondence to a punched card. (The Model 1 displayed 40 characters on 6 lines, while the Model 2 managed 40 characters on 12 lines; only the top model of the family featured this relation to punchcards, though. Bonus fact: the 2260s used a portrait raster tube turned on its side, resulting in vertical scan lines.)


Yup! Python folks don’t seem to think it’s funny when I call the PEP-8 standard, with its 80-column line limit, “punchcard compliant,” though.


Don't forget this part of PEP 8:

> For flowing long blocks of text with fewer structural restrictions (docstrings or comments), the line length should be limited to 72 characters.

Older IBM machines could only read 72 of the 80 columns of data. The remaining 8 columns were used to store sorting information, e.g., a line number, so that if the card deck was dropped it could be re-sorted using a mechanical sorter.
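
As a quick illustration (a sketch only; the 1-72 / 73-80 split is the standard fixed-form FORTRAN card layout, and the card text below is invented):

  # An 80-column card image: columns 1-72 hold the statement,
  # columns 73-80 hold the sequence number used to re-sort a dropped deck.
  card = "      DO 10 I = 1, 100".ljust(72) + "00000120"
  statement, sequence = card[:72], card[72:80]
  assert len(card) == 80 and sequence == "00000120"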


> Older IBM machines could only read 72 columns of the 80 columns of data.

That applied to the early scientific line, the 70x series, as a side effect of their 36-bit word size — they read each card row into two words. Fortran gave them lasting influence.

(Their earliest commercial computer, the IBM 650, could read 50 columns — any 50, determined by wiring a plugboard, as on pre-computer card processing equipment.)


Ahh, I see. Yes, my experience is with some data format defined in the early 1970s using FORTRAN format statements, almost certainly because of IBM's scientific line.


There are actually deeper reasons for these limits. Text typically gets broken into multiple columns when it would be over about 70 characters long, because reading long lines is uncomfortable.

Many people talk about the 80 column rule as reflecting some technical limitation due to the antiquated punch cards — but they could certainly have made wider punchcards if they wanted to.


Those who refer to it as a technical limitation are almost certainly wrong.

Remington-Rand released a 90-column punched card, for example, two years after the famous 80-column IBM card. But IBM equipment dominated the market, and their cards became "standard." https://historyofpunchcards.wordpress.com/2014/03/24/rivalry...

Eg, Univac used the 90-column format. https://en.wikipedia.org/wiki/Punched_card#Powers/Remington_...

FWIW, the recommendation of 72 characters seems much more a programming standard than a general readability standard. For example, https://www.viget.com/articles/the-line-length-misconception... quotes Bringhurst's authoritative The Elements of Typographic Style:

> Anything from 45 to 75 characters is widely-regarded as a satisfactory length of line for a single-column page set in a serifed text face in a text size. The 66-character line (counting both letters and spaces) is widely regarded as ideal


Code is - text-wise - not as dense as prose. There are usually a lot more special characters and whitespace. I don't think rules for prose apply here.


I'm one of those who sticks with 80x25 because I've had decades of it influencing me.

I do understand, though, that it's dumb, and wouldn't force it on others.


When I started conforming to it, my code became a lot more concise.

I'd advocate that every developer conform to PEP-8 rules (80-char lines, nesting limits, etc.), at least for a 3-12 month period. They'll be a better developer for it.


That's a great idea. Have an opinion about the standard from experience, not gut feel.


It lets me have 3-4 columns of code on a standard laptop or 8 columns on a 4k monitor, which is a huge productivity win.


Only about 2.5 for me (at my font size on a 1920×1080 screen), but it's a productivity win regardless, and it's a massive productivity sink when I have to work with code written without that constraint.

An 80+ character line is also a code smell for me; if a statement is so complex that it pushes past the 80-character mark and can't be split into smaller chunks on their own lines, then I know for sure that something needs refactoring (whether it be the statement itself or the nested control structures around it).


Yeah, I actually keep my editor columns at about 65-70 with :set nowrap. I find this gives me appropriate pressure to write shorter lines, but if I go past the end I've got another 10 columns or so until I hit 80.


I stick with it because I read the newspaper and eventually discovered that the wider a column is, the harder it is to read. I understand that it's not quite the same with code, and I don't know that there's any significant difference between 79 and 81 columns, but I never found anything that I couldn't get to work with 80 with a few minutes thought, so why not.


Yep. Newspapers are laid out in columns because that's simply easier for people to read.

Since you're into dead tree editions, here's a fun fact you may not know: The fibers in newsprint are aligned vertically so that you can easily tear out an article vertically along its column. Tearing horizontally cleanly is significantly more difficult.

Source: Worked for two major newspaper companies.


Probably vertical for strength too. The paper needs to be pulled through the rollers in the printing machine.


Paper in books is also aligned the same way, to better resist humidity.

Humidity makes the fibres thicker, but not longer. In a humid environment a book grows wider but not taller.

If the paper were oriented the other way, it would grow taller, and the spine of the book, which is stitched and glued, would force the paper to warp and become wavy.


It's not really dumb. It's good to have somewhat of a standard.

If there's anything wrong with it, it's that it's too big!

Ideally we might have settled on something like newspaper column widths for readability - which are way less than 80.


Except that pep8 doesn't call for an 80-column limit?[1] Or am I missing something? It seems pretty clearly to request 79 to me. This is actually one of my pet peeves about pep8: why can't it just be 80 like everyone else?

[1] https://www.python.org/dev/peps/pep-0008/#maximum-line-lengt...


"Because an 80 char wide Emacs window starts wrapping at 79."

https://www.reddit.com/r/learnpython/comments/1h2eug/pep_8_w...


Neat, another weapon to use against emacs in the vim vs emacs war.


Ugh.


Because with a diff annotation, it grows to 80.


Huh, that's actually the first time I've heard a somewhat reasonable answer to that question. I wonder why pep8 gives such a terrible reason (editors that show newline chars) when there's actually a somewhat defensible one. I still disagree with it, and other guides don't appear to consider this a problem even for projects like the Linux kernel, which treats terminal users as first-class citizens, but it's at least nice to know a better rationale than showing newline chars.


Make your editor worry about long lines. In vim:

  set wrap
  set sbr=\
In Visual Studio:

Tools -> Options -> Text Editor -> All Languages. Select Word Wrap and Show Visual Glyphs.


Fortunately this is a commonly overridable setting in pep8 linters; I generally go with what the Django project established: 119. The complaints about the odd length really help identify the engineers who shouldn't be making coding style decisions.
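
For example (assuming a flake8-style checker; the option name below is flake8's, and other pep8 linters spell it similarly), the override is a one-liner in setup.cfg or tox.ini:

  [flake8]
  max-line-length = 119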


Why is this an invalid complaint? Are they complaining because it's not 120, or because it's significantly larger than 80? 120 means that you can't comfortably split two files on the same monitor (on most monitors), which is a very valid complaint (and is often cited by style guides for various languages as a reason to not exceed 80 chars). If they're complaining that it's 119 instead of 120, then maybe I can see a little bit what you're getting at.


> If they're complaining that it's 119 instead of 120, then maybe I can see a little bit what you're getting at.

This is indeed the complaint. Apologies it wasn't clearer.


I remember back in the DOS days being able to type a tiny program into DEBUG.COM that would set the screen to 80 x 50 mode! It was one of the first things I would do on a machine if I didn't have my floppy disk of favorite tools with me. It was just a couple instructions, like MOV AX this and INT 10h that. Now I have to go find it!
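
It was most likely the BIOS font call: INT 10h with AX=1112h loads the 8x8 font, which gives 50 text rows on VGA. A rough reconstruction of the DEBUG session, from memory rather than the original (end the A input with a blank line, then G runs it and Q quits):

  C:\>DEBUG
  -A 100
  MOV AX,1112
  XOR BL,BL
  INT 10
  INT 20

  -G
  -Q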


I thought the MODE command could do this.


  MODE CON LINES=50 
Indelibly burnt into my long term memory. You needed an EGA or higher graphics card though.


There was also LINES=43 for EGA (640×350, rather than VGA's 640×480).

MODE CO80 was for 80-column colour. "MODE MONO" switched to MDA/monochrome VGA mode (if you hadn't used B000-B7FF for a UMB).

MODE CO40 for the 40-column mode intended for the original IBM PC when used with an analogue TV set as its display.


Didn't 'MODE MONO' need a Hercules graphics card though?


Pretty good quote at the end. I’ll have to remember it next time I think I know something.

>As a personal aside, my two great frustrations with doing any kind of historical CS research remain the incalculable damage that academic paywalls have done to the historical record, and the relentless insistence this industry has on justifying rather than interrogating the status quo. This is how you end up on Stack Overflow spouting unresearched nonsense about how “4 pixel wide fonts are untidy-looking”. I’ve said this before, and I’ll say it again: whatever we think about ourselves as programmers and these towering logic-engines we’ve erected, we’re a lot more superstitious than we realize, and by telling and retelling these unsourced, inaccurate just-so stories without ever doing the work of finding the real truth, we’re betraying ourselves, our history and our future. But it’s pretty goddamned difficult to convince people that they should actually look things up instead of making up nonsense when actually looking things up, even for a seemingly simple question like this one, can cost somebody on the outside edge of an academic paywall hundreds or thousands of dollars.


The example the author uses reminds me of some wackadoodle numerology BS proving that the ancient Egyptians predicted the Federal Reserve. "...which gives us 675, which is close enough to 640, and therefore there had to be a second shooter". Hey, wait, what? And the comments! "Yeah, that makes sense." Umm, might I posit that it most certainly does not make a fucking lick of sense? Yeesh, round off enough numbers, and I've got yer String Theory proof right here.


Most early terminals were actually 80x24. I know the DEC VT series were. I remember being surprised when the IBM PC came out with an 80x25 display.


Early video terminals had a variety of sizes. The 1964 GE Data Editing Display (earliest I can find a manual¹ for) had 46×26. The 1967 Datapoint 3300 had 72×25. The 1967 Sanders 720 had (up to) 40×50 or 64×32, depending which orientation you got the monitor.

But the IBM 2260, their first character display, had 40 or 80 columns by 6 or 12 lines, depending on how much you spent on the controller, which stored a cluster of terminals' displays on acoustic delay lines.

¹ https://archive.org/details/bitsavers_geterminalayDec64_1626...


It was likely so you could have a status line at the top/bottom of a terminal connection


Yes, Burroughs block-mode terminals of the same era (late 70s) as the VT52s were also 80x24 - we were very much editing 80-column card images at the time.


The leap between IBM punchcards and VT100 makes no sense. Teletypewriters used 72 characters per line. IBM mainframe line printers used 132 characters per line. DEC and IBM probably just happened to pick 80 character columns for unrelated reasons (1970s CRT resolution and 1920s printing technology, respectively)


The VT100 actually supported both 132 characters and 80 characters per line, in a 9 x 10 and a 10 x 10 character matrix, respectively.

(The character matrix in ROM is just 8 x 10, but pixels are stretched horizontally before being displayed. The limiting factor here isn't resolution, but the signal response of the screen phosphor and the circuits involved in its activation; hence the horizontal pixel stretch, to allow the phosphor to reach its maximum activation level. In fact, a monochrome CRT has just an even phosphor coating, and there is no such thing as hardware pixels or a native resolution. However, if activation phases are too short to allow the phosphor to gain a suitable activation, the display will be dim in those regions and overall unbalanced in brightness.)


It makes lots of sense - as we moved from batch systems to working from terminals (mid to late 70s) some people were editing cards on card punches, while others were editing card images on terminals


So DEC thought of this as a way to edit IBM cards but IBM didn't?

The implicit claim is that somehow the machines for manufacturing punch cards were reused for computer output and that is what terminals were trying to emulate. But surely there were more displays and non-card-sized printers?


Actually I was using Burroughs systems (in pre-VT52 days)

There certainly were non card size printers - but they scaled linearly with technology

But no, we didn't repurpose card punches as terminals, but we did need terminals to edit card images (FORTRAN was 72 characters plus 8 digits of sequence numbers).


Having used such a system: yes.


But wait, I thought CRTs don’t really have resolution per se. Especially for horizontal, I’d imagine things are quite flexible. The same might not be true of an NTSC signal, where 80 columns probably would not work terribly well, but why would it matter for 72 vs 80 vs etc.?


> don’t really have resolution per se

A CRT doesn't have to have resolution in the sense of pixels per inch (although many of them do). However "resolution" is a more general limit on how far away two distinct points can be.

A discrete grid of pixels imposes one kind of limit. But another kind of limit is that two slightly fuzzy dots will become indistinguishable if they blur into one another sufficiently.


Yeah, this was my understanding, but I kept it vague since I don’t claim to be any kind of expert. Otoh, I suspect for this reason that the choice of 80 columns particularly was not motivated by CRT limitations. Could be wrong but it’s not like CRT tech was new in the 70s, and we’re talking about monochrome here.


Most monitors were pretty small, so too many characters per unit of horizontal real estate would have been hard to read.


The core claim of this article is that this is not why 80 was picked though


VT220s had a 132 column mode. It was readable, albeit not as readable as 80 columns.


So did the VT100/VT102 - though it may have been an option on the VT100, along with a reverse video mode. It was only the VT52 that was stuck at 80.

It was readable, but not something for your default setting - 10 mins was more than enough!


However, the character matrix was the same for both modes, just the letter spacing was different. – Did this cause certain characters to bleed into one another at times, or was this just a general readability issue?


Hard on the eyes: https://imgur.com/IYULtxA


Ah, I see. Apparently, more of a general contrast / density / grey level issue. Also, probably some issues with bleed and character separation, but this may also be due to the photographic exposure. Thanks for linking the image!


Apparently the author is not aware that delivering a box of microfiche cards instead of a fat paper printout was a common practice back then. It had nothing to do with spooks.

You find that mentioned in Brooks's Mythical Man-month book: everyone working on OS/360 got a box of microfiche representing the current state of the OS, each morning. Without a terminal, how else would you look up what a system call actually did? Printing one copy of all the source, and then optically reproducing it to more acetate, was obviously more efficient than printing it hundreds of times.

The old listings would be burnt to recover the silver.


One need not even have worked on old computer systems. I'm not even retired yet, and I've gone to the library to look stuff up in old newspapers that were on microfiche. When I was a mechanic back in the day, parts listings were on "fish", too.

I think the author read "microfilm" and, instead of hearing a synonym for the thing car parts are listed on, immediately thought of the little micro spy cameras you put your microfilm in.


Where TFA goes from talking about punch cards to 35mm film with audio track to 4:3 film and CRT aspect ratios, there's a disconnect: there's nothing to tie film/TV aspect ratios to punch cards. TFA then tries to construct a tie from:

> Fascinatingly, the early versions of the ECMA-48 standard specify that this standard isn’t solely meant for displays, specifying that “examples of devices conforming to this concept are: an alpha-numeric display device, a printer or a microfilm output device.”

TFA should know that terminals evolved to be a keyboard + printer before they evolved to be a keyboard + CRT. I'm not sure that there's any link between paper size and CRT size. Most likely the only connection would be number of columns (of fixed-width text), but not lines.

My guess is that in the end there's no connection between punch cards and terminals being 80x25, much less between Civil War Demand Notes and terminals being 80x25. 80x25 was just convenient for all the independent reasons TFA lists. Any similarity to punch cards was almost certainly coincidental, and perhaps convenient.


> I'm not sure that there's any link between paper size and CRT size. Most likely the only connection would be number of columns (of fixed-width text), but not lines

I haven't got a solid answer here, but I do know from experience that your average typewriter does about 80 characters across a page once you account for margins.

Now clearly that'd not be much of an argument if margins were arbitrary, but I don't think they are. On a typewriter you can more or less freely choose the top, left, and right margins. The bottom margin is, however, limited by the mechanism: by the time the bottom of the page is under the striking area, the paper is flopping in the breeze since the roller isn't holding it anymore. If you set the left and the right margins (and the top margin) to the same size as the minimum workable bottom margin, you end up with roughly 80 characters across the page.

Seems as likely a link as any: typing out tabular data from a punch card would be limited to the same width as common on typewriters.


Sure, but all these links (like the 2KB link) seem a bit coincidental. Of course, 2KB would fit other combinations of columns and lines, and typewriters (I think? I don't have one at hand!) probably do manage to fit more than 25 lines of single-spaced text with normal margins... My guess is that matching typewriter (and/or punchcard) width was more important than matching length, so with 1KB and 2KB, terminal manufacturers used as many lines as they could fit while using 80-column lines. Display width is probably more important than display length for us humans.


Is this another variant of the space shuttle SRBs being the width of two horses' arses?


I designed and built an ASCII terminal as a college project. The 25 lines was set by the number of lines a standard monitor could display, given a certain number of scan lines needed to make the font legible.

As monitors that could display more scan lines became available, people instantly used them to display more rows.


A related post about software development, history, surveillance, knowledge, weaving looms, terminal sizes, and more:

https://dave.autonoma.ca/blog/2019/06/06/web-of-knowledge/



While I greatly enjoyed the article and its effort at historical synthesis, the author cites a fact that's not exactly true:

> Then in 1983 the Apple IIe was introduced, the first Apple computer to natively support an 80×24 text display, doubling the 40×24 default of their earlier hardware.

My family had an Apple IIe in 1984. It only showed 40 columns until my father went out and bought an "80 column card" to upgrade it. He also sprang for a second floppy drive. So "natively" is a bit of a stretch there.


I can't get the page to open but the title of the HN submission is nonsense.

Terminal is 80x25 because that's how big terminals were back in the day (for very long ago values of back in the day and for purely bandwidth driven reasons).

Pre-Mac (technically pre-Lisa) black-and-white CRT displays were made using then-current CRT TV technology, which had just enough analog bandwidth to show 80 columns of 5-by-7-pixel characters with one pixel of separation (480 pixels horizontally [edit added: those are US numbers; European TVs had slightly higher bandwidth, but that's a different topic]), and most could only show 24 such rows of characters vertically due to the 4:3 aspect ratio used in essentially all CRTs of the day. 24x80 was the industry standard for a screen of text for purely CRT-bandwidth reasons. So why 25 lines? Because a few super cool terminals allowed you one extra line for showing status below a conventional 24x80 layout, hence 25x80 (vertical squeeze didn't pose bandwidth or pixel separation problems on black and white displays of that era). Terminal naturally went with the cool kid size of 25x80. No civil war bank notes involved, just the bandwidth of then-current TV display technology and a fortuitous coincidence that punch cards had 80 columns.

[edit added] The Apple ][ and other similar-era devices had even lower character counts because they hooked up to color TVs, which had roughly similar bandwidth but needed to spread it across three phosphors per pixel, so far fewer pixels were available on screen.


You're being downvoted, but you have a valid point. People should look up the NTSC/PAL standards for how analog TV screens worked. There were only ~560 scan lines in the raster pattern (vertical and horizontal blanking intervals notwithstanding). With a character height of 7 pixels/scan lines plus one for spacing, that gets you 70 lines of text on screen. Even if the punch card machines gave you more memory, actually using it would have required buying a special-purpose TV. Eventually we did get CRT monitors that were specific to computers instead of television, but that wasn't economically viable at the time.


The downvoting is likely because 80x24/25 happened well before home PCs that displayed via NTSC/PAL. And many of those early terminals also supported a 132 column mode.


Analog TVs required incredibly high-bandwidth electronics, which were standardized to meet TV requirements. The early terminals used TV hardware in custom cases; they just plugged in directly rather than going through the modulator/demodulator path used by home PCs once those were invented.


What were the earlier non-home computers displaying on?


CRT, but not via RF modulators, NTSC, PAL, etc.


That's very odd. Why did they invent a new CRT format and spend the cost of the hardware to implement it instead of using an off the shelf TV?


Because TVs didn't have high quality video inputs. Old Apples, C64s, etc, came with a "tiny tv broadcaster" (RF modulator) that broadcast the signal on channel 3. In shit NTSC or PAL quality, which is lower resolution than what the CRT can display.


I understood that. I'm old enough to remember VCRs / game consoles requiring you to turn to channel 3 or 4. But what were the older, non-personal terminals using? Was there some sort of RF-modulated raster pattern optimized for displaying text? Were they drawing everything as vectors? Did they use off-the-shelf CRTs?


Direct manipulation of the beam with analog voltage to a CRT driver for the x and y axis. Same as what the TV does inside.


They just took a regular B/W CRT and wired it up directly - leaving out all the PAL/NTSC decode circuitry. That allowed for a much sharper picture.


Also, (almost) all humans have two feet because shoes come in pairs of two. /s


Yup, that's my memory: 80x24 until the VT100 came along.

The phrase "Then in 1983 the Apple IIe was introduced, the first Apple computer to natively support an 80×24 text display, doubling 40×24 of previous Macs" is rather cringeworthy and shows a misunderstanding of the time

(first of all, Macs were introduced in '84... the Mac and Apple IIs were competing architectures from Apple)


It's stupid mistakes like this that make me doubt the rest of the article. Besides, many Apple ][s had an 80-column card already.


And the ][e itself didn't come with one. My family upgraded our Apple ][e by adding an 80 column card.


More like a brain fart and a lack of proofreading. Be charitable.


80x25 comes from the VT100's 2-kilobyte display memory limit.

>Then in 1983 the Apple IIe was introduced, the first Apple computer to natively support an 80×24 text display, doubling 40×24 of previous Macs.


> 2 kilobytes display memory limit

That too


Apple has nothing to do with 80x25. That all happened before they existed. So I'm confused about all the references to them.


Correct; this is explaining that, prior to the Mac, terminals used then-current TV tech which had these limitations, and that the limitations were even more severe for color text, as was the case for the Apple ][ (intended to show that it's not even true that terminals were 24x80 or 25x80; black-and-white terminals were that size because they could be, and color terminals had even fewer characters because that's all they could have).

As they say, correlation does not imply causation. It's a coincidence that black-and-white TV screens from long ago had just enough pixels to display 80 columns of text.


No. Terminals did not use TV screens. Monochrome CRTs do not have pixels.


A monochrome CRT tube and a black and white TV tube are the same physical device just packaged and sold to different customers. The bandwidth of the electronics driving the electron beam sets a maximum modulation frequency and hence a minimum line pair spacing. That minimum line pair spacing sets the effective pixel separation as directly as the shadow mask pitch does on a color CRT. The bandwidth was set by the TV standards, which were in turn set by what was feasible to manufacture economically in those days.


To be precise, the flyback transformer produces a sawtooth waveform, which generates the raster scan (lines are actually a bit slanted, dropping from left to right). While there's a preset frequency, a TV set will synchronize with the input signal – and there's some robustness in this. E.g., most 50 Hz CRT TVs will happily display a 60 Hz signal (as may be seen in the PAL/60 pseudo-standard).

Horizontally, screen activation is directly bound to signal intensity, hence "analog TV", and is therefore only limited by the signal response of the phosphor used and by the amplification circuitry (for a character display, you'd want your signal ramps not to become too shallow). Also, there's some flexibility in the overall number of lines displayed. (E.g., while there are nominally 262 scan lines in an NTSC field, Atari VCS games historically varied between 238 and 290.)

There are no hard specs for a monochrome tube, especially when driving it directly, without going through modulation/demodulation stages. However, there will be a visible vertical separation if you're just using odd fields as in non-interlaced video (due to the sawtooth waveform generating the raster scan), resulting in visible scan lines. Notably, most of the constraints resulting from this can be addressed by the type design of the display characters.


No, terminals rarely if ever used TV-type tubes (before the home computer era had people using actual TVs) — usually not even the same phosphors — nor the same signal rates, because TV rates would make no sense for a terminal. ‘Manufacture economically’ was not the same thing for a business product as a home appliance — an IBM 3278 in today's dollars cost $13000.

This isn't lost-clay-tablet stuff. There are manuals and schematics online.


The VT220 manual I found suggests the same CRT scan rates: 15.75 kHz horizontal, for example, with the reference frequency coming from a very TV-specific Motorola IC.

Perhaps different phosphors, sure.


Different phosphors for sure. My terminals (I've had 3 or 4 DEC and others) had a very visible afterglow. Very stable and flicker-free compared to my CRT TVs.


The VT220 was a very late terminal, though, and I was trying to focus on early models, where the line-length decision was not just inherited from the previous CRT model (i.e. the VT220 had 80 columns because the VT100 did, because the VT50 did).


There's enough "pixels" for 132 columns, and many old terminals (vt220 for sure) could do that.

The RF modulator needed by Apple/C64/etc did perhaps limit it to 80...it wasn't the tube.


Terminals like the VT220 were much more expensive because they needed much higher bandwidth analog electronics than conventional TV circuitry (the limit isn't on the B/W tube, it's on the electronics that steer the beam around)


I think there's not much difference in the horizontal/vertical driver circuits.

Here's the (pretty pedestrian and likely not different from a b/w tv) diagram of the horizontal driver for a vt220: https://imgur.com/a/xvq1ukV

The shown MC1391P is, in fact, called a "horizontal TV processor" by Motorola.

I'm sure there's other stuff that made them expensive (CPU, memory, video logic, etc).

Edit: Lol at imgur for marking my screenshot as "erotic" and NSFW.


Awesome find. That said I think the bandwidth issue was less with the semiconductor electronics in that diagram and more with the big analog electronics that actually controlled the gun directly (the box labeled horizontal driver in your image)


Here's the other two boxes:

https://imgur.com/a/yJH9QYu

(Which Imgur also finds erotic and NSFW. Odd)


From the same author, on why arrays are zero-based: http://exple.tive.org/blarg/2013/10/22/citation-needed/


This article is complete nonsense.


I thought this aside about how Wikipedia has been taken over by extremists was even more interesting:

> It’s difficult to research anything about the early days of American currency on Wikipedia these days; that space has been thoroughly colonized by the goldbug/sovcit cranks. You wouldn’t notice it from a casual examination, which is of course the plan; that festering rathole is tucked away down in the references, where articles will fold a seemingly innocuous line somewhere into the middle, tagged with an exceptionally dodgy reference. You’ll learn that “the shift from demand notes to treasury notes meant they could no longer be redeemed for gold coins[1]” – which is strictly true! – but if you chase down that footnote you wind up somewhere with a name like “Lincoln’s Treason – Fiat Currency, Maritime Law And The U.S. Treasury’s Conspiracy To Enslave America”, which I promise I am only barely exaggerating about.


"Either way, it’s always good to be reminded that the goldbug/randroid/sovcit crank spectrum shares a common ideological klancestor."

Starting with spurious political generalizations isn't great set-up for readers. Especially if the author wants their historical theory given a fair shake by a wide audience.


Indeed - smearing anyone who is skeptical about central banking as a KKK member is not a good way to build credibility as a historian.


There are certain topics where the standard of debate is ridicule and name calling. Central banking is one of them. Usually that's a sign that there's some reason people don't want to talk about it. It's the kind of topic you bring up and the authority figure in the room says, "You know we don't talk about that topic anymore."


Yes, this is true. To my way of thinking, the real fight over central banking is the massive wealth granted to the government via seigniorage. That wealth belongs to the sovereign, which in a republic like the US is the people. However, the people never directly benefit from that wealth. I think people understand this underlying truth and are angry about so much usurpation and thievery done in their name.

Meanwhile, it benefits the recipients of that wealth and power to remain silent and mock their adversaries, because they can only stand to lose their privilege.


For obscure conspiracy topics, the only people who care enough to edit the pages are the true believers. Today arguments about which occult order is descended from which happen on Wikipedia talk pages.

Look at this nonsense:

https://en.wikipedia.org/wiki/Talk:A%E2%88%B4A%E2%88%B4

>I am an initiate of the A.'.A.'. and the Golden Dawn. If anyone would like to challenge the factual basis of my claim that the System of the A.'.A.'. is based on the Merkabah Seven Palaces rather than the Kabbalistic Tree of Life, then please provide your reasons for doing so before demanding book citations be produced for initiated and previously secret knowledge. Prove the A.'.A.'. is based on the Tree of Life or be Silent.

https://en.wikipedia.org/wiki/Talk:Hermetic_Order_of_the_Gol...

>The registered trademark should remain on the Hermetic Order of the Golden Dawn, the outer order of the Rosicrucian Order of Alpha et Omega entry in the contemporary orders section, as they own the HOGD trademark in Europe and Canada. Recently the HOG/A+O settled litigation victoriously preserving their perpetual and irrevocable right to use the name of their outer order, the Hermetic Order of the Golden Dawn, without interference in the USA. The registered trademark rightly and properly distinguishes the HOG/A+O from the exorbitant number of -unlicensed Golden Dawn based study groups and should NOT be removed. The registered trademark is a distinguishing character integral to the association of, and a privilege entitled by law reserved exclusively for the HOGD/A+O as the owners of the Hermetic Order of the Golden Dawn in the European Union and Canada as aforementioned. It is certainly improper and somewhat unlawful to deprive the HOGD/A+O of using that privilege of the registered trademark they reserve the right to fully represent themselves therewith. Please do not remove the trademarks from the HOGD/A+O entry again.

>Furthermore, as the A+O’s outer order is named the Hermetic Order of the Golden Dawn and they reserve all rights to that mark in the European Union and Canada. It should be correctly stated in the contemporary orders section that the HOGD/A+O is: “a modern order headquartered in the European Union using the same name being also the outer order of the Rosicrucian Order of Alpha et Omega®. This is paramount, as it distinguishes the HOGD/A+O as a completely separate entity from the independent organisation which is the HOGD, Inc. who are a modern independent order of the same name.


"The amount of energy necessary to refute bullshit is an order of magnitude bigger than to produce it." - Alberto Brandolini


Yet this is a fight worth fighting.

I must say it is great that they are confined to rather obscure parts of Wikipedia, whereas they could be on pages about much more important or current events.

And I must say, HOGD is a page where I would expect a lot of contradictory information. It would take several courageous scholars to put some order into the various versions from the various ego-maniacal members it had. I would argue that this is a subject of relatively little importance, given the small number of people involved and the general irrelevance of their actions, but for some reason this order became pretty popular in magic folklore...


I couldn't finish reading it, but something about a printing press, typewriters, paper size and the max characters that can fit with a 20 pt font.


> It’s not entirely clear if this is a deliberate exercise in coordinated crank-wank or just years of accumulated flotsam from the usual debate-club dead-enders hanging off the starboard side of the Overton window. There’s plenty of idiots out there that aren’t quite useful enough to work the k-cups at the Heritage Institute, and I guess they’re doing something with their time, but the whole thing has a certain sinister elegance to it that the Randroid crowd can’t usually muster. I’ve got my doubts either way, and I honestly don’t care to dive deep enough into that sewer to settle them. Either way, it’s always good to be reminded that the goldbug/randroid/sovcit crank spectrum shares a common ideological klancestor.

Is this the kind of tinfoil hattery that is acceptable now?


It is off topic, but worth mentioning.



