I worked at Bell Labs from 2012 to 2018; unfortunately, by then it was a shadow of its former self. The MBAs had taken over.
While I was there, Weldon tried to pivot the place from being ‘like Twitter’ to being an incubator to being like a startup. He got an Apple Watch, so we were going to be a wearables lab. He threw together a book on the supposed future of networks and conspired to get it to NYT #1 by forcing us all to buy it.
There were endless reorganisations. Disastrous leads were parachuted in to wreck groups. Weldon played favourites to an alarming degree, only to turn on them when they didn’t deliver on the aforementioned vapid promises.
We had endless managers holding endless meetings telling the smart people what they should be doing and not listening to what the smart people wanted to do or could do. All-hands meetings telling us how great everything was while they were letting a third of us go.
They engaged in vanity projects like putting a 4G network on the moon, got us to build endless fake demos for technologies that didn’t exist, and then acted surprised when told said technology didn’t exist.
The funny thing was that the old timers over in Murray Hill just ignored all of this and continued to work away on their research, untouchable, like some prize zoo exhibit.
It was a place full of the most wonderfully intelligent people, managed by fools who wasted fortunes on flashy demos rather than let the smart people take the time to build something.
Makes me think of Buffett’s saying that goes something like: “I invest in companies that could be successfully run by monkeys because eventually they will be.”
I don’t know how an organization avoids this fate, but it does seem to eventually come for all enterprises.
IBM, Bell Labs, GE; who else should be on that list?
From its history since the McDonnell Douglas merger, you could argue Boeing should be on this list.
It's the inevitable result of prioritizing profit rather than prioritizing the creation of quality products that the people working at a company can be proud of.
That's why you see companies fall apart when the founders leave. They're replaced by people who prioritize profit for the company and wealth for themselves rather than product quality, so the company ends up rudderless.
>That's why you see companies fall apart when the founders leave. They're replaced by people who prioritize profit for the company and wealth for themselves rather than product quality, so the company ends up rudderless.
Which ironically fucks over profit anyways, just about every time. I think it's more of a personal greed thing combined with incompetence, on top of perverse incentives that come with being a publicly traded company.
The quarterly profit model all but ensures a societal-scale myopia in the end.
Unless the founder of the company builds a culture that is designed to survive without them at the center, the company will eventually fail without them there.
One of my former employers was like this - the founder was a brilliant engineer, who installed good financial controls, and was himself decent at the business end.
But he was a micromanager, and he never sought to create replacements for himself inside the company; he also tended to penalize people who stepped out of line. So once he was out of the picture (he sold out), we didn't really have the right person to run the company, or the right culture to just install a generic manager with industry expertise. It was instead a company that had been formed around the founder's personal needs.
That company was purchased by a multinational, and is now being systematically dismantled.
Google has been on this list for the past decade, as demonstrated by the Google+ fiasco of 2011. Same for Microsoft in the post-Gates era and Apple in the post-Jobs era. As for Facebook, the vapid metaverse hype shows that it has well and truly jumped the shark.
I would argue that Apple is doing fairly well in the post-Jobs era, probably much better than anyone expected. Microsoft seems to be making some steps forward here and there, improving its reputation among developers and end users (although they still make their fair share of missteps).
Apple has done some pretty ugly stuff related to the App Store monopoly and fighting the right to repair, but having monopolistic tendencies has always been baked into a company that wants you to run both its hardware and software, and that is not the same thing as being MBAized (i.e. greed/evil and technical prowess can be orthogonal). Their recent processor success proves they are not another IBM.
> IBM, Bell Labs, GE; who else should be on that list?
Intel felt quite a bit like IBM during my time there circa 2010.
I wasn't familiar with this quote, but wow, was that my experience at large tech orgs. I wish I had known this about 15 years ago, as perhaps I would have done a better job picking orgs to work at early in my career instead of being so frustrated.
They also made cameras. Ever hear of the wildly popular Brownie? Anyway, their film expertise does not take away from their involvement in cameras for spy satellites in the 1960s.
> conspired to get it to the NYT #1 by forcing us all to buy [his book]
How many people worked at Bell Labs at the time? It looks to be a little over 600 now; in relation to the NYT #1 (best-seller list?), having fewer than 1,000 employees all buy a copy seems like a relatively ineffective strategy.
It was cleaved off AT&T as part of Lucent and then merged with Alcatel. Part of the problem there was that it was never clear if Lucent merged with Alcatel or if Alcatel acquired Lucent. The whole company was pretty dysfunctional but nothing too unusual there.
Nokia came along flush with MS money from the sale of their handset business to buy Alcatel Lucent and got Bell Labs for free. Neither company was doing particularly well flogging network gear and together they didn’t do much better.
> People always underestimate the impact of decades, and overestimate the impact of years.
Sometimes cited as Gates' Law, and also attributed to Arthur C. Clarke, Tony Robbins, or Peter Drucker. But they may have all gotten the idea from Roy Amara.
I worked for Lucent Technologies in Moscow, Russia, from 2000 to 2004. I remember my feelings when I looked up Dennis Ritchie in PeopleSoft, the corporate directory: hey, I work at the same company as a man who invented Unix and C! Lucent was a great place to work, even in Russia. :)
I was working on an annotated (unofficial) edition of K&R updated to the latest C standards, with commentary like the Lions book on UNIX, completely typeset in LaTeX. Sadly, I don't think it will ever see the light of day due to copyright.
I had some correspondence with DMR in my early college days. It would have been an ideal tribute.
If you’re interested, I’d try to get ahold of an executive at Pearson. Some big corp lawyer has 0 concept of the significance of your work and doesn’t have the authority to green light anything anyway.
I was in 6th grade when I found "The C Programming Language" on the floor in my friend's house. I picked it up, took it home, and read it cover to cover. I didn't have a computer then, but I was absolutely sure about what I wanted to do in my life.
Thank you, Mr. Ritchie and Mr. Kernighan for opening the world of computer science for me.
I was lent a copy of K&R by an English teacher¹ in my high school (this was 1984ish). I still remember the smell of coffee and nicotine that permeated its pages, and any time I deal with C code, the sense memory comes back to me.
For a while, under the influence of K&R and The TeXbook, I contemplated going to Stanford to study computer science and then working at Bell Labs. I did neither.
⸻⸻⸻
1. About ten years ago, I decided to try to reach out to him and thank him and comment about how our paths were kind of the inverse of each other (he had a degree in computer science but ended up teaching high school English; I had a degree in English and ended up programming computers), and I discovered that he had died a few months previously. Whenever possible, get in touch with those who influenced you earlier, if just to say hi and thanks.
I ordered it by inter-library loan in 1991 to rural Oregon. I had recently learned 6502 assembly language, so pointers seemed "obvious". A few years later in CS101 I had such an instinctive feel for them that I could hardly explain them to my fellow students.
I can remember writing a large Pascal program in the 80s and really wishing I had function pointers available so I could pass in a reference to a function. I look back on that as an autodidact programmer and realize that I had some vague instinctual notion of stuff that would become commonplace as OO and functional paradigms took over.
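For anyone who hasn't run into them, here's a minimal sketch of what the parent was missing in Pascal: passing a function to another function via a pointer in C. The names here (map, dbl) are made up purely for illustration.

    #include <stdio.h>

    /* Apply 'op' to each element of an array.
       'op' is a pointer to a function taking and returning an int. */
    void map(int *a, int n, int (*op)(int))
    {
        for (int i = 0; i < n; i++)
            a[i] = op(a[i]);
    }

    int dbl(int x) { return 2 * x; }

    int main(void)
    {
        int v[] = { 1, 2, 3 };
        map(v, 3, dbl);                          /* pass the function itself as an argument */
        printf("%d %d %d\n", v[0], v[1], v[2]);  /* prints 2 4 6 */
        return 0;
    }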
surprising -- I picked up the same book, also read it cover to cover, and wondered over and over what kind of thinking leads to the small assembly-ish idioms and quirky character IO definitions. "Structured Programming" was obvious to me, and using that design to build non-trivial programs was very compelling, but the constant emphasis on small, tricky ways to move a character around seemed driven by some intense factory-of-machine-parts thinking, not clean abstractions or consistent naming or human-readable coding. I immediately wanted to try this "big phone network" core OS language on my portable home computer with apparently one hundred-thousandth of the capacity. Other home computer companies were publishing C compilers rapidly with lots of feature tradeoffs, so there was no question that C was the thing for me to use. Not good design at all, though -- totally machine-requirement driven.
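To make the "quirky character IO" concrete, this is roughly the sort of idiom being described: the classic copy loop from K&R, reproduced here from memory as a sketch, with the read, the assignment, and the EOF test folded into a single expression.

    #include <stdio.h>

    int main(void)
    {
        int c;    /* int rather than char, so EOF is representable */

        while ((c = getchar()) != EOF)   /* read, assign, and test in one go */
            putchar(c);
        return 0;
    }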
I recently worked to update a Linux-based system that was originally built by a team that had previously implemented the same product on a microcontroller-based system. The Linux drivers are obviously direct ports of the old subsystems, without any apparent effort to understand or leverage existing kernel drivers or subsystems that could have simplified (or outright replaced) their custom functionality. It is unholy.
Now, this might sound absurd by the standards of today (because it is), but this was the transition that every programmer had to make back when high level languages were introduced. It takes time to adapt to a paradigm shift, so it hardly seems surprising when vestiges of the “old ways” can be seen peeking through the curtains of the new abstraction.
The ease with which I was able to read this page has me wondering how much of the challenge I sometimes have with focus and attention has to do with modern web design, where pages are littered with elements unrelated to the text (not to mention ads). It's hard to beat the readability of black text on a white background with a few <p>'s.
Theoretically, how would one build a statue of Dennis?
Yes, I’m serious. If it costs less than $10k, I’d love to make a statue of Dennis and put it somewhere in my house. Partly to show off my excellent sensibilities in artistic taste on my otherwise barren walls.
But mostly I just realized I have no idea how statues are made circa 2022, and it sounds fun{,ny}.
The art of marble or bronze sculpting is still around, although much rarer than it used to be. It's a trade like any other, requiring years of study to achieve a high level of competence (notwithstanding the trend of throwing a bunch of random metal pieces together and calling it art - true art requires creativity and skill; one or the other does not suffice).
Many university art programs have a sculpture department. Student and faculty artists will make works on commission, although it's hard to find good figurative art among the sea of abstract political B.S. If you're really serious, Italy is the place for the best artisans, as it has been since the Renaissance[1].
There are still some old-school sculptors around [2] in America who take the craft seriously.
Find a local sculptor you like and commission a work! Your budget won't get you somebody famous, but it's enough.
Maybe consider a bust rather than a life size full body sculpture. I think that a bust of Dennis Ritchie would be a pretty awesome quirk in somebody's house. Engineering heroes aren't normally celebrated like that.
This artist died in 2010, so he won't be available for commissions XD. However, it's a cool aesthetic and it's worth checking out the article just to see it.
Practically speaking, if you actually want to find a sculptor to commission a work, search for a local sculptor's guild and check out local galleries or exhibitions. Many works at exhibitions will be available for sale and will have prices listed, which will give you some idea about cost.
Plus, visiting local galleries if you haven't done it before is a fun adventure!
Certainly. I would too. But wouldn’t it be cool to go down to your laundry room or wherever and see a bigass Dennis Ritchie statue?
I suppose I could put it in the yard, facing a neighbor’s window. Then we’d be able to dress Dennis for Halloween and Christmas too.
But for real, is it completely impractical to want your own statue of someone? Rich people do it, and it’s been a few centuries, so I bet technology has worked its usual magic on the price...
EDIT: in Germany you can get a 3D-printed 10-inch figurine for $400 (https://doob3d.com/), so this is possible in principle.
Requiring something weatherproof which can be displayed outside and not degrade quickly when exposed to the elements will dramatically increase the cost. Can you make your artistic point with something which can only be displayed inside?
It's a shame that the Americans don't have a culture of putting luminaries (other than presidents obviously) on their bank notes; it's a wonderful and far-reaching way of celebrating a person's contribution to the culture. Even so, Ritchie might be considered a little niche for such an accolade, but it's a nice thought experiment nevertheless.
His recognition level is niche but his contributions aren't. His language (C) and OS are powering a huge part of civilization, both literally (UNIX->BSD->macOS/iOS) and in design (Linux/Android).
Somewhere on the Internet exists a Plan 9 press release with my name on it. When I had a chance to corner a few of Dennis's colleagues at Mobile World Congress (they were, at that time, part of Alcatel-Lucent), I asked them for their memories of working alongside him.
Please excuse the audio, I was working with a FlipCam and just focused on capturing what I could for the history books.
I sometimes think modern culture has lost a grip on the past. Much has gone before us and much will go after us.
I find reading the Unix Manual comforting, as it reminds me where we have come from. It was written a handful of weeks after I was born, and I am still using its commands fifty years later.
Even though I never met or knew Ritchie, I still feel bad when I read about his death. When I was a teen I would go down rabbit holes reading about C and Unix, and would read about all the design decisions he made and the rationales for them. A true loss for the programming world.
The concepts of minimalism and modularism are being thrown away and it shows in the performance and stability of new software. It's a shame we have to learn the same lessons over and over.
>Even though I never met or knew Ritchie, I still feel bad when I read about his death. When I was a teen I would go down rabbit holes reading about C and Unix, and would read about all the design decisions he made and the rationales for them. A true loss for the programming world.
sounds a lot like me; I even still read docs related to old school UNIX and C to this day
>The concepts of minimalism and modularism are being thrown away and it shows in the performance and stability of new software. It's a shame we have to learn the same lessons over and over.
why though? is coding huge monolithic software easier than creating a set of modular and simple tools?
Yes, creating a monolith is usually easier and faster. It is also the wrong thing to do, more often than not. As they grow and mature, properly designed modular systems can be easier to debug, maintain, test, deploy, and document.
On the other hand, hardware has continued to expand to support the bloat in software. I do agree computer software design could be better (thanks, Electron). My most used piece of software is written in Java. I also spend a lot of time in Electron. My terminal is written in D (Tilix). I guess my point is, it could be better, but there is no incentive to make it leaner, so no one will try. Nothing is really stopping anyone from running old school Linux software, though. I know guys who run FVWM and really minimalist configurations of their Linux systems.
From Wikipedia: "News of Ritchie's death was largely overshadowed by the media coverage of the death of Apple co-founder Steve Jobs, which occurred the week before."
NOW I'm sad.
RIP Ritchie, your place in computing heaven is secured.
HN has a weak spot for marketroids like Jobs or Musk. It is sad that the people being praised are not the ones doing the work, just the ones presenting it. For me, Dennis Ritchie, Ken Thompson, Brian Kernighan, and Alfred Aho are real people who built something. The others (Musk, Jobs) are only opportunists with a big mouth.
Yeah, it really frustrates me that this site's startup culture worships people who are veritably garbage individuals as opposed to recognizing the people who, you know, actually built the stuff in the first place. I struggle to imagine a world where so-called hackers respect Jobs more than Wozniak or Dennis Ritchie, but here we are...
The first book I bought for programming was C Programming Language. Freshman year, 2004, for EECS 10 at UCI. To my delight, it remains part of the curriculum of the course, 17 years later with the same instructor too: