So much of what we have today is due in some part to what AT&T created. I'd say they're very much an example of what people imagine when they think of a well-run company. Antitrust concerns aside, what they created was incredibly reliable, at a time when few would have blamed them if it hadn't been. Old switching stations took up entire buildings, whereas by the '80s and '90s the same capacity fit in a single rack or less. The microwave tower system was required to have multiple backup generators available in case of power loss, and had to be built to withstand a blast (some distance away).
And of course Bell Labs created a number of incredibly important technologies. This isn't as easy as just throwing money at the wall: someone had to be in charge of talent selection and of which projects to invest in, but when you review what they were able to accomplish, it can really make you wonder what other labs were/are doing.
>Between 1960 and 1973, NASA spent almost $26 billion on the Apollo Program ($311 billion in 2024 dollars), over 2% of U.S. GDP. Over that same period, AT&T spent almost $70 billion building new telephone infrastructure
What a crazy statistic to pull, and the chart in the article also adds to it.
> The vertical integration between R&D and deployment is what’s impressive.
For a surprisingly large fraction of the 1900s, a Bell Labs scientist's introductory training would include elements such as climbing telephone poles and operating exchanges!
At least I'm not the only one now who sees it - I feel a tremendous sense of loss when I look at Divestiture - we lost untold jobs, manufacturing and engineering capacity, and the ability to respond quickly.
For me, as someone who spent some portion of my career in wireline telephony, I feel these losses acutely, particularly as the copper wireline network dies - the fact that all of this infrastructure was built by one company was not lost on me, and was very obvious early in my career - but is less so now.
Nice article. One tactical detail that arguably didn't belong in it, but which is still awesome as hell, was when they moved (and rotated) the Indiana Bell central office building without dropping a call. (https://www.archdaily.com/973183/the-building-that-moved-how...).
Idea Factory is an excellent book, one of my favorites, I just finished rereading it a few weeks ago.
I'd also recommend Skunk Works: A Personal Memoir of My Years at Lockheed[0] by Ben Rich.[1] Rich was the second director of Lockheed's Skunk Works and protégé of its founder and famed aerodynamicist Kelly Johnson.[2]
The book covers the creation of the U-2 spy plane, SR-71 Blackbird, F-117 Nighthawk, stealth technology in general, and talks a bit about the F-22 Raptor. I'd recommend the book even if you're not super into aerospace or defense tech. It's amazing what the Skunk Works was able to achieve with such a small contingent of dedicated engineers, technicians, machinists, and managers for so little when left alone to do what they knew best.
In my opinion, the term 'skunk works' is these days all too often used to describe a company's secret R&D wing, when it would be more appropriate to describe a small, walled-off portion of a company with tenure: a place where engineers and architects get to roam free and experiment without meddling and micromanagement, even if constrained by a limited budget.
My takeaway from that one is that Bell Labs' success relied on the common feeling that they were all contributing to the nation's survival and flourishing by supporting the core infrastructure role of the Bell System.
(Compare with the cynicism of Googlers/Eric Schmidt)
I happened to read it a couple of months ago and it is indeed very good. The author gives the impression of really caring about the subject and of going deep on the parts they choose to touch.
> Year after year, for decades, AT&T manufactured millions of telephones
And those phones were very well made. My parents had the same Western Electric rotary dial phones in their house for close to 30 years, and they never had an issue.
The reliability goal for a lot of Western Electric equipment (at least in the first half of the 1900s) was an average lifespan of 40 years. They were one of few shops outside of Japan that took statistical process control seriously.
Ours was this ugly, vomit-colored, wall-mounted monstrosity from South Central Bell that lived in our kitchen not 10 feet from the stove. It was heavy, and persistently covered in the grease du jour, but likewise, never a single issue with it.
The “AT&T Archives” playlist on YouTube is an absolute treasure trove of historical educational and promotional films produced by Ma Bell. Cannot recommend highly enough.
My biggest question is what happened around 1970? Did their analysts predict saturation or the rise of other networks? It's not exactly clear from the article.
They basically were spending money like it was free in the name of growth for a century, and stopped about that time, which led to all sorts of problems. Was it a prediction, or the onset of 'beancounter accounting'?
1974 was officially the first year of the end of AT&T (conclusion of the antitrust case leading to divestment), and they probably already knew a few years before that.
Nope, you're talking about the full breakup. First antitrust court case described as "leading to divestment" ended in 1974. Look at wiki, that's where I verified it before I posted.
In 1920 long distance calls were charged by the minute, and by the distance. The operator would write down when she connected your line on the switchboard, then note down the time when she unhooked it.
In 1920:
>a call from New York to Indianapolis, Indiana would cost $4.15 for the first three minutes and $1.35 for each additional minute
>a call from New York to Los Angeles, California would cost a lot more - $15.65 for the first three minutes and $5.20 for each additional minute.
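For a sense of scale, here's a minimal sketch of how that 1920 rate card works out for a 10-minute call (rates are the ones quoted above; the whole-minute rounding and the function name are my own assumptions):

    # Hedged sketch of 1920 long-distance billing: a flat charge for the
    # first three minutes, then a per-minute rate for the remainder.
    # Rounding up to whole minutes is an assumption on my part.
    import math

    def toll_1920(minutes, first_three, per_extra_minute):
        extra = max(0, math.ceil(minutes) - 3)
        return round(first_three + extra * per_extra_minute, 2)

    # New York -> Indianapolis: $4.15 first 3 min, $1.35/min after
    print(toll_1920(10, 4.15, 1.35))   # $13.60
    # New York -> Los Angeles: $15.65 first 3 min, $5.20/min after
    print(toll_1920(10, 15.65, 5.20))  # $52.05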
In Germany during WWII (according to Richard von Rosen's memoir) you had to wait a few hours for your long-distance call to connect. You could pay double to get connected faster and ten times the price to connect faster still.
From what I recall from the 1970's, we paid a monthly charge for each line (number), phone rental (each), and then long-distance calls were itemized. If you had two phones in your home for the same line, you'd be charged an additional location fee, as well as the rental for the second phone. If you had two numbers (like one for the parents and one for the teenagers) then it was 2 lines, 2 phones, plus any long-distance calls each month.
By that time their billing was computerized, so the monthly statements would list each long-distance number you called and how many minutes you spent talking. I was calling BBS's at that point (300 baud acoustic coupler hotness!) and I had a phone budget that I had to stay under or pay the parents. Long distance calls (basically anywhere outside your city) were expensive!
Bell planned for the 2 line scenario, as your wiring out to the pole had red/green (tip + ring for line 1) then black/yellow pairs (tip + ring for line 2). So they could offer two lines without needing to run additional wires to the house. Or use the second pair if the first one went bad. The Bell System did forward-thinking like this all the time. And could afford it because (as mentioned) they amortized their system investment over 40+ years.
One of the elements that came with mechanical (as opposed to human-powered) phone exchanges was the introduction of "impulse billing" (there's probably a proper term for it), where establishing a connection triggered a device that printed a symbol on a ticker tape for every "pulse" from a timer. This would then be collected, the pulses counted, and you'd be billed based on the number of pulses. Similarly, different devices could generate "logs" of actions like "connected line A to line X", which would be collated and sorted regularly.
Different billing strategies were often just differences in how often the pulse was triggered. This is also related to "billing by the second" - per-minute billing was originally a technological limitation.
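To make the "how often the pulse was triggered" point concrete, here's a minimal sketch (the interval and price numbers are purely illustrative, and the function names are mine, not any Bell System terminology):

    # Hedged sketch of pulse ("impulse") metering: while the call is up,
    # the exchange records one pulse at connect and then one per full
    # metering interval; the bill is just pulses * price per pulse.
    # Distance-based rates fall out of using a shorter interval for
    # longer-distance zones. All numbers below are illustrative.
    def pulses_for_call(duration_s, pulse_interval_s):
        return 1 + duration_s // pulse_interval_s

    def bill(duration_s, pulse_interval_s, price_per_pulse):
        return round(pulses_for_call(duration_s, pulse_interval_s) * price_per_pulse, 2)

    print(bill(600, 180, 0.05))  # local zone, 10 min -> 4 pulses  -> 0.2
    print(bill(600, 20, 0.05))   # trunk zone, 10 min -> 31 pulses -> 1.55

The billing resolution is the pulse interval, which is why (as noted above) true per-second billing had to wait for equipment that could meter that finely.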
Yes, at first it was an analog process, with toll tickets making their way to your local business office, which would prepare your bill; eventually it went to tabulation technology (punched cards) and then, in the 1960s, to computers.
As far as the bill - you'd be billed for local service (either unlimited or metered rate), plus a phone rental, and then on top of that local toll and long distance usage. For this, you got a telephone, and the phone company was responsible for your inside wiring as well.
Curious that he missed out on telling the story of the construction of a network that could survive nuclear attack during the Cold War. That's probably where the peak of AT&T investment in the Apollo years came from.
Entire buildings underground on springs, underground generator farms with weeks of fuel, a hidden set of 4 buttons on your touch-tone phone for specifying quality of service during dialing, etc. Fascinating stuff.
Almost all the graphs start to kick up around 1950. Interestingly, the REA was expanded to include telephony in 1949. I don't think it's a coincidence that AT&T construction spending rose alongside subsidized government loans for expanding infrastructure.
Can you be more specific? I’m familiar with Shannon’s work, obviously, but what societal changes happened as a result of engineering the Bell system specifically?
Shannon's great contribution to the Bell System was that he figured out how to reduce the number of relays in a fully-connected toll office from O(N^2) to O(N log N).[1] After that, they let him work on whatever he wanted.
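To put rough numbers on that O(N^2) vs. O(N log N) claim (just the growth rates; the constant factors of the actual switching designs aren't in the comment above, so they're omitted here):

    # Quick sense of scale: a single-stage crossbar connecting N lines
    # needs on the order of N^2 crosspoints, while a multi-stage design
    # grows like N*log2(N). Constants omitted; only the shape matters.
    import math

    for n in (100, 1_000, 10_000):
        print(f"N={n:>6}:  N^2 = {n*n:>12,}   N*log2(N) = {n*math.log2(n):>12,.0f}")

At 10,000 lines that's 100,000,000 crosspoints versus roughly 133,000, the kind of difference that pays for a lot of research freedom.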
UNIX was written by some guys in the same organization. I wonder if one of them thought, "Oh sure, Shannon gets to work on what he wants, why can't we work on the future of a global inter-net? Why do we have to hide it as a text processing system?"
My management here apparently is a crowd-sourced mob trying to silence me via clicktivism. Shannon and K&R had it easy, IMO.
Actually, no. The UC Berkeley TCP/IP implementation was not the first. It was more like the fifth. But it was the first for UNIX that was given away to universities for free. Here's the pricing on a pre-BSD implementation of TCP/IP called UNET.[1] $7,300 for the first CPU, and $4,300 for each additional CPU. We had this running at the aerospace company on pure Bell Labs UNIX V7 years before BSD.
Much of what happened in the early days of UNIX was driven by licensing cost. That's a long story well documented elsewhere. Licensing cost is why Linux exists.
But that doesn't refute the parent's point, does it? (If it has been edited since you wrote that, the version I see is "Unix's involvement with the development of the Internet was mainly through BSD, which was a UC Berkeley joint, not Bell Labs.")
They were responding to the statement:
> "why can't we [Kernighan, Ritchie, Thompson, other folks at Bell Labs] work on the the future of a global inter-net? Why do we have to hide it [Unix] as a text processing system?"
Whether or not the BSD TCP/IP implementation was the first or most influential, the point is that it wasn't the Bell Labs Unix folks driving Unix networking forward. UNET was from 3Com.
The Bell Labs people had their own approach - Datakit.[1] This was a circuit-switched network as seen by the user but a packet switch inside the central-office switches. Bell Labs used it internally, and it was deployed in the 1980s.
Pure IP on the backbone was highly controversial at the time. The only reason pure IP works is cheap backbone bandwidth. We still don't have a good solution to datagram congestion in the middle of the network. Cheap backbone bandwidth didn't appear until the 1990s, as fiber brought long-distance data transfer costs way, way down.
There was a real question in the 1980s and 1990s over whether IP would scale.
A circuit-switched network, with phone numbers and phone bills, was something AT&T understood. Hence Datakit.
Interesting. I want to point out, though, that the document you link to is dated 1980, which is late in the development of the internet (ARPAnet): by then the network was 11 years old and research on packet switching had been going on for 20 years. That's one reason I find it hard to believe that the Labs (or anyone at AT&T) contributed much to the development of the internet, as the great-grandparent implies when he imagines the Unix guys saying, "why can't we work on the future of a global inter-net? Why do we have to hide it as a text processing system?"
Yes, the early internet (ARPAnet) ran over lines leased from AT&T, but I heard (but have not been able to confirm by finding written sources) that AT&T was required by judicial decree (at the end of an anti-trust case) to lease dedicated lines to anyone willing to pay and that if AT&T weren't bound by this decree, they would probably have refused to cooperate with this ARPAnet thing.
I concede that after 1980, Unix was centrally instrumental to the growth of the internet/ARPAnet, but that was (again) not out of any deliberate policy by AT&T, but rather (again) the result of a judicial decree: this decree forbade AT&T from entering the computer market (and in exchange, IBM was forbidden from entering the telecommunications market) so when Bell Labs created Unix (in 1970), they gave it away to universities and research labs because it was not legally possible to sell it.
In 1980 (according to you, and I have no reason to doubt you) AT&T no longer felt bound by that particular decree, but by then Berkeley was giving away its version of Unix, or at least Berkeley had an old version of Unix from AT&T which came with the right to redistribute it and would soon start to do exactly that, and Berkeley's "fork" of Unix is the one that was responsible for the great growth of the internet during the 1980s.
Specifically, even if an organization wanted Unix workstations for some reason other than their networking abilities, the ability to communicate over the internet was included with the workstation for free because most or all of the workstation OSes (certainly SunOS) were derived from Berkeley's open-source version of Unix (although of course they didn't call it "open-source" back then).
Unix got an increased presence on the Internet because DoD paid UCB to port the "DoD Internet" protocols (aka TCP/IP) to Unix, because Digital had announced the cancellation of the PDP-10 line.
Meanwhile, everyone and their pet dog Woofy was exploiting the recent explosion in the portability of Unix, and thus the source-level portability of applications, using Unix as the OS for their products - because it made it easier to acquire applications for their platform.
With some Unix vendors (among others Sun, which arose from a project to build a "cheap Xerox Alto") providing Ethernet networking and quickly jumping on the BSD sockets stack, you had an explosion of TCP/IP on Unix, but it still took years to become dominant.
Actually, the early ARPAnet was mostly DEC PDP-10 machines.[1] MIT, CMU, and Stanford were all on PDP-10 class machines. Xerox PARC built their own PDP-10 clone so they could get on. To actually be on the ARPANET, you had to have a computer with a special interface which talked to a nearby IMP node. Those were custom hardware for each brand of computer, and there were only a few types.
The long lines of the original ARPAnet were leased by the Defense Communications Agency on behalf of DARPA. ARPAnet was entirely owned by the Department of Defense, which had no problem renting a few more lines from AT&T.
AT&T was willing to lease point to point data lines to anybody. They were not cheap. I had to arrange for one from Palo Alto CA to Dearborn MI in the early 1980s, and it was very expensive.
> The resulting units may be called binary digits, or more shortly, bits.
It's interesting to read this early use of “bit”, before the term became commonplace. The first publication to use “bit”, also by Shannon, was only a year prior[0].
> At the close of the war, he [Shannon] prepared a classified memorandum for Bell Telephone Labs entitled "A Mathematical Theory of Cryptography", dated September 1945. A declassified version of this paper was published in 1949 as "Communication Theory of Secrecy Systems" in the Bell System Technical Journal.
> what societal changes happened as a result of engineering the Bell system specifically
I don't have that much time, but in general think about how I am even capable of communicating with you at all. Start with the "https://" at the beginning of most modern URLs.
UNIX, transistors, foundational information theory, "on and on till the break of dawn." If you want to become more familiar with Shannon's work and the Bell System, separately and together, try his master's thesis, followed by his Ph.D., ...
> obviously
I thought my original comment was obvious. At least we both seem to be familiar with the principles of:
Thank you! That was a super-helpful response. I wasn't asking because I disagreed, but because I didn't fully understand what you were getting at. Now I do. Thanks!
However, if you can discern "better" (i.e. 1 plain old bit of difference) by taking a few words off my posts on social media, you have "Beat the Shannon Limit". ;)
Update, I thought of a way to express the parent comment here:
0 + 0 = 0, substituting literal values and dropping the units, for some convoluted overloading of the operator '+'. My TSH (thyroid test) came back from the lab without units this time; I guess I'm modernizing.
0 dB S/N + 0 bytes originating information = 0 bytes transmitted (arguably error free to be fair).
Contrast this with P. Graham's comment last year "Why would I want anyone to fail?" on Twitter, which (I'm not a fanboi of Graham or Shannon or Turing, just an admirer of their work), was the most information transmitted to me over any medium, with any S/N that year. Perhaps we should revisit the basis of Shannon's work, in light of what we have learned from the Internet--Einstein wasn't afraid of arguing with Newton. :)
Hmm..took my brain several orders of magnitude longer to warm up than a 6L6 power tube.
AI suggested that I was being generous with my 0 dB S/N for social media, it should be -∞ dB. Good catch.
It also didn't like my unit compatibility (reminding me of the utility of unit analysis), but remember that my '+' is overloaded--most programmers would probably write:
int plus_ungood(int bytes, double SNR);
dropping the units. Of course we programmers also add geometric points together without dividing, which is a mathematical no-no too.
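Circling back to the -∞ dB point above, here's a minimal sketch of the Shannon-Hartley capacity formula, which is what "the Shannon Limit" usually refers to (the bandwidth figure is arbitrary, just to put numbers on it):

    # Shannon-Hartley: C = B * log2(1 + S/N), with S/N as a linear ratio.
    # 0 dB means S/N = 1, so capacity is B bit/s, not zero; capacity only
    # goes to zero as S/N (in dB) heads toward -infinity.
    import math

    def capacity_bps(bandwidth_hz, snr_db):
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_hz * math.log2(1 + snr_linear)

    B = 3_000  # roughly a voice channel, in Hz; illustrative only
    for snr_db in (0, -10, -30, -100):
        print(snr_db, "dB ->", round(capacity_bps(B, snr_db)), "bit/s")

So the AI's correction is right in spirit: to get literally zero information through, the S/N has to head all the way to -∞ dB.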
I guess I'll put this on my TODO list for a fun project--"Relativistic Shannon: A Critique of Pure Reason on Social Media":
Einstein wasn't afraid of arguing across time with Newton, nor should we be afraid of arguing with Shannon. Arguing on social media, well that's a different thing altogether....[continues for 7000 pages]
I did learn what semantic density was without AI's help, postulating that it would be involved (viz Graham above) using my alarmingly down-trending cognitive abilities.
Modern AT&T isn't really Ma Bell, it's SBC who bought AT&T and kept the AT&T name. That's why AT&T is based in Dallas and not New Jersey.
Ma Bell today is really AT&T, Verizon and parts of Lumen, Frontier and Consolidated.
AT&T is also worth less than Verizon due to bad mergers (DirecTV and non-cable Time Warner) which added a lot of debt - money that should've been used for fiber and 5G, or, if you had to buy a competitor, even a bidding war against T-Mobile for Sprint.
I work for (current) AT&T, in Atlanta. SBC also bought Atlanta-based BellSouth in 2006, and some of my coworkers who are ex-BellSouth complain that working for this company hasn't been the same since. But I haven't heard that as much lately - a lot of those people have retired by now.
re: some of my coworkers who are ex-BellSouth complain that working for this company hasn't been the same since.
That would be correct and only natural. I started with Michigan Bell in 1977, went through divestiture, the assembly of Ameritech, then SWBT/SBC. The Bell System, along with Western Electric/Bell Labs, was a great place to be. You could spread out the SD's to work on problems and end up talking with engineers/programmers. My bosses went from being in Michigan, to Wisconsin, to St. Louis, to NJ. So, yes, the jobs have changed and the coworkers have changed. Plus, we all missed Ed Whitacre when he retired.
Don't forget the takeover of Cingular. That always seemed to be when the old DNA was comprehensively switched out for the new. No more AT&T with sleepy offices in San Antonio; now it was a mobile company based out of Dallas.
Yep - those were around the same time so I think people might find it hard to disentangle the two, especially when they’re just complaining instead of trying to do serious corporate history.