Wisdom from computing's past (lewiscampbell.tech)
72 points by LAC-Tech 11 months ago | 40 comments



I am curious to learn what concepts we may have lost and not just abstracted away. Is the knowledge lost or niche? Is it necessary for programming in general or a subset of the practice?

I read the article but didn't see exactly what has been lost.


> I read the article but didn't see exactly what has been lost.

TFA contains this: "The graphical display is the servant of information, not the master of it." and "the computer being a tool we use to better our lives".

I think what has been lost is that instead of a slave (not my terminology, it's the one in TFA btw) at your service, the computer has become a master and you're the slave, at the mercy of a few gigantic corps and states spying on your every move and turning you into a consumer instead of a producer.

It's made painfully obvious by the ultimate consumption device: the smartphone, which is mostly used to consume pointless, debilitating content, not to produce anything of value.

The computer as a tool still exists though: architects, 3D artists, accountants even (yup, they're needed), engineers, music creators... These professionals use actual computers, typically with gigantic screens, as tools to help them.

It used to be only that way, for everybody, and that's what's been lost. The masses are mindlessly consuming content of exactly zero value on what they believe is a computer ("My smartphone is a more powerful computer than the computer of the 70s/80s/90s!").

That's how I interpret it, and I take it I'm not very far from what TFA means.


> I think what has been lost is that instead of a slave (not my terminology, it's the one in TFA btw)

Actually, it is your terminology! I used the word 'servant' not 'slave'.

It was a deliberate choice. My commentary was more about how our visual presentation now takes centre stage, often at the expense of displaying information clearly or usefully. This is a mindset shift and not really a technical issue. So master/slave didn't fit, as that has more precise technical definitions: database replication, disk arrangements, etc.


> visual presentation now takes centre stage, often at the expense of displaying information clearly or usefully

What does this mean? Displaying information clearly is all about visual presentation!


There's nothing quite as fast as keyboard-navigating a text-input screen, the tab key going from field to field, learning the order the tab key takes you through the screen. Pure muscle memory, flow from entering information. Web pages have a different navigation path, or none at all, for each page. That's one big thing that's lost.


> It used to be only that way, for everybody, and that's what's been lost. The masses are mindlessly consuming content of exactly zero value on what they believe is a computer ("My smartphone is a more powerful computer than the computer of the 70s/80s/90s!").

I don't think this has been lost. I think there are more people than ever using computers as tools. But there are also people using computers as consumption devices, and people using computers as appliances (for example, a smartphone used through one very specific app, or my Instant Pot). At home I use my computer to code (which can be seen either as a tool, if I'm doing something "useful", or as a hobby/practice), to watch movies (as a consumption device), and to organize my movie collection (somewhere between tool and consumption device).


>I think what has been lost is that instead of a slave (not my terminology, it's the one in TFA btw) at your service, the computer has become a master and you're the slave.

Gig economy apps like Uber and DoorDash have always seemed so dystopian to me, because effectively your boss is an app.


I think it's the height of arrogance to declare much of anything about "the masses". You don't know other people's lives, so stop judging them. xkcd managed to be perfectly relevant here years ago: https://xkcd.com/610/

I work from home now, enabled entirely by computers. I'm not in an office, no one watches my day to day activities, work gets done and that's the evaluation. By every possible metric, modern computing has set the rest of my life free.


The older I get, the more often I have the opposite thought of that XKCD. I look around and see so many different stories, each with their own main character.


I maybe could have done a better job of explaining what so captivated me watching 'The Computer Programme' and why that feeling is so different today. The idea of ordinary people buying a computer and programming it to do useful things was one. Personal address books were covered in an episode - that's gone. There was the implicit assumption that your machine and your data were your own.

As for specific technologies, here's my rough list of what we're either starting to forget, have had to re-remember recently, or have completely forgotten:

- low latency, simple, native UIs that don't require a designer

- operating on data locally first, and not over a network

- information highway instead of doom-scrolling

- message passing (constantly re-discovered, and then forgotten because people complain it's not RPC) - see the sketch after this list

- relational and logic programming - I firmly believe this will come back one day, but it will be called something else and look new.

- static memory allocation. turns out it's still incredibly fast to do this.

- software design
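
To make the message-passing item above a little more concrete, here is a minimal, single-threaded sketch in C of a fire-and-forget mailbox. It's only an illustration of the idea, not any particular system's design, and the Message fields and the handler are made up:

    #include <stdbool.h>
    #include <stdio.h>

    #define MAILBOX_SIZE 16

    typedef struct {
        int kind;    /* what happened */
        int payload; /* illustrative data */
    } Message;

    /* A fixed-size ring buffer acting as a mailbox. */
    static Message mailbox[MAILBOX_SIZE];
    static int head = 0, tail = 0, count = 0;

    /* Sender: post and move on. No reply, no waiting on the receiver. */
    static bool post(Message m) {
        if (count == MAILBOX_SIZE) return false; /* mailbox full */
        mailbox[tail] = m;
        tail = (tail + 1) % MAILBOX_SIZE;
        count++;
        return true;
    }

    /* Receiver: drains messages whenever it gets around to it. */
    static void drain(void) {
        while (count > 0) {
            Message m = mailbox[head];
            head = (head + 1) % MAILBOX_SIZE;
            count--;
            printf("handling message kind=%d payload=%d\n", m.kind, m.payload);
        }
    }

    int main(void) {
        post((Message){ .kind = 1, .payload = 42 });
        post((Message){ .kind = 2, .payload = 7 });
        /* the sender has already moved on; the receiver processes later */
        drain();
        return 0;
    }

The contrast with RPC is that post() never blocks waiting for a return value from the handler; the complaint "it's not RPC" is really a complaint about losing that synchronous call-and-wait shape.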


> low latency, simple, native UIs that don’t require a designer

This is very much alive for internal tools, but design has proven important from a product perspective, which is why you don’t see much of this in products anymore.

> information highway instead of doom-scrolling

Those are just buzzwords, what exactly do you mean by this?

> relational and logic programming

SQL is alive and well, granted newer programmers don’t learn it as early as they should

> static memory allocation

I’d like to hear more of your thoughts on this as well. We’ve found that this kind of memory allocation is error-prone, especially in multithreaded workloads, which is why it’s not as popular.

> software design

Another buzz term. Software is constantly designed and the design of software is constantly discussed. What exactly do you mean by this? Who forgot what?


> This is very much alive for internal tools, but design has proven important from a product perspective, which is why you don’t see much of this in products anymore.

This is revisionist history. Desktop environments had become so complex and fragmented that just writing HTML & CSS seemed incredibly appealing. The claim that "design has proven important from a product perspective" is unsubstantiated; it's justification after the fact.

> Those are just buzzwords, what exactly do you mean by this?

A focus on information vs a focus on 'engagement'. Knowledge vs addiction. Feeling better after using a computer as opposed to feeling worse.

> SQL is alive and well, granted newer programmers don’t learn it as early as they should

Unfortunate choice of words on my part; see: https://news.ycombinator.com/item?id=38243753

> static memory allocation

The TigerBeetle database is all statically allocated and (as I understand it) makes no memory allocations in response to user requests. They seem to be having great success with this approach.

https://tigerbeetle.com/blog/a-database-without-dynamic-memo...
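
For a rough idea of what that pattern looks like, here's a minimal sketch in C - not TigerBeetle's actual code (TigerBeetle is written in Zig), and the Request fields are invented for illustration. Every slot is reserved up front in static storage, and the request path never calls malloc:

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdio.h>

    #define MAX_REQUESTS 1024 /* hard capacity, known at startup */

    typedef struct {
        bool in_use;
        int  account_id; /* invented fields, for illustration only */
        long amount;
    } Request;

    /* The whole pool lives in static storage; no malloc anywhere. */
    static Request pool[MAX_REQUESTS];

    /* Claim a free slot, or report that we're at capacity. */
    static Request *request_acquire(void) {
        for (size_t i = 0; i < MAX_REQUESTS; i++) {
            if (!pool[i].in_use) {
                pool[i].in_use = true;
                return &pool[i];
            }
        }
        return NULL; /* explicit back-pressure, not another allocation */
    }

    static void request_release(Request *r) {
        r->in_use = false;
    }

    int main(void) {
        Request *r = request_acquire();
        if (!r) {
            puts("at capacity, try again later");
            return 1;
        }
        r->account_id = 42;
        r->amount = 100;
        /* ... process the request ... */
        request_release(r);
        return 0;
    }

When the pool is exhausted, the caller gets explicit back-pressure instead of an unbounded allocation, which keeps memory use bounded and predictable.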

> Another buzz term. Software is constantly designed and the design of software is constantly discussed. What exactly do you mean by this? Who forgot what?

I don't feel like this is a buzz term. Agile, or whatever one wants to call sprint-based workflows, means no serious design gets done up front anymore, and so we constantly try to code our way out of anemic or non-existent designs.


1. There’s nothing revisionist. Game developers constantly make small internal tools using native UI toolkits. I don’t understand how you could think good design being good for a product is unsubstantiated. You’re arguing in bad faith there. Apple’s entire business model is selling their product design over capability. It’s obviously important.

2. That’s fair.

3. This looks interesting, but also very academic. Did that pattern ever see wide production usage?

4. A single project both fails to refute my point and refutes yours.

5. That just isn’t true. There’s room for good design. We do technical specs where I work, along with some agile methodology. We just recognize that time spent hammering out designs up front rarely produces better or more maintainable results in the same amount of time. Though I suspect this is very domain-dependent.


>- information highway instead of doom-scrolling

It's really interesting to read about the vision that drove Douglas Engelbart (inventor of the computer mouse, and the man who gave the "Mother of All Demos", where he introduced a bunch of modern computing tech for the first time). Engelbart was a hippie who envisioned a future where researchers used the tools he was creating to collaborate remotely and work to solve complex problems. His tagline was "Boosting Our Collective IQ".


Well, that is one of the uses of the modern internet. Just a very small one.


> I firmly believe this will come back one day, but it will be called something else and look new.

It's called SQL, and it's pretty big and you can get paid a lot for being good at it.


I meant relational programming in the sense that William E. Byrd uses it, as something like a close cousin of, or a different way of looking at, logic programming:

> miniKanren is being used for research in "relational" programming. That is, in writing programs that behave as mathematical relations rather than mathematical functions. For example, in Scheme the append function can append two lists, returning a new list: the function call (append '(a b c) '(d e)) returns the list (a b c d e). We can, however, also treat append as a three-place relation rather than as a two-argument function. The call (appendo '(a b c) '(d e) Z) would then associate the logic variable Z with the list (a b c d e). Of course things get more interesting when we place logic variables in other positions. The call (appendo X '(d e) '(a b c d e)) associates X with (a b c), while the call (appendo X Y '(a b c d e)) associates X and Y with pairs of lists that, when appended, are equal to (a b c d e). For example X = (a b) and Y = (c d e) are one such pair of values. We can also write (appendo X Y Z), which will produce infinitely many triples of lists X, Y, and Z such that appending X to Y produces Z.

https://stackoverflow.com/a/28556223


Thanks for the reply! I hope it didn't sound like I was insinuating that you did a bad job; I was just curious what concepts others would bring up in a discussion about your article's thesis. It was a good read! I look forward to more.


I think the big issue is that the stuff that already exists has so many features and so many ecosystem integrations that nobody wants to use anything simple enough to DIY.

It would be easier than ever to make a simple address book, but it wouldn't do everything Google does.

The other issue is that we never really got a proper upgrade from spreadsheets. I think you could build a VB-like studio that let average people make modern apps they'd actually want to use, if it were free, as easy as a spreadsheet, a cross-platform local app (not something self-hosted), and had a ton of random features that would interest users.


As a minor point, I'd say a UI always needs good design, even more so the more minimal it is. UX matters, and I'm disappointed by how many developers treat it as just "making things pretty."


Yes perhaps I should choose my words more carefully.

What I was more getting at is the lost idea of a 'set piece' UI we can use to quickly make things that look nice - rather than re-inventing the wheel with styling.

Like imagine if the default browser styling was good enough, and only people with particular artistic flair needed to bother with CSS or component libraries or what have you.


This is absolutely true. Said another way, the industry is nearly exclusively focused on designing the visual, to the detriment of everything else.

Design goes far beyond the visual.


> The idea of ordinary people buying a computer and programming it to do useful things was one.

The first program I ever wrote (I was a kid) was a program to help my mum manage the household accounts; it even loaded and saved data to a cassette.

Bless her, she used it - it had to be multiple times slower than just doing it in her notebook (paper, not computer; this was the 80s).

Even then a couple of things became obvious: I was gonna be a programmer, and I was more inclined to solving practical problems than to games.


There’s a documentary that’s in post-production named “Message Not Understood: Profit and Loss in the Age of Computing” that chronicles Xerox PARC’s research on graphical user interfaces and personal computing and how modern personal computing and the Web have deviated from the visions of Xerox PARC researchers. I’m highly interested in watching it once it is released.

In my opinion, what we’ve lost from the early days of personal computing is a sense of empowering users by giving them tools that they can not only use, but also extend and even modify. Sure, today’s hardware and software are more capable than ever, and these extra capabilities do empower users in the sense that their tasks are made easier. But what about the ability to shape their environment? All too often, software these days is locked behind walled gardens and binary code. Users generally have to accept their software packages as is. If Slack or Zoom or Photoshop changes its interface, too bad; you gotta just cope. Even FOSS can be an impenetrable mass of hundreds of thousands of lines of C and C++ code, where even a seasoned software engineer would have to spend a week or two studying the codebase before making modifications - a far cry from the much simpler AT&T Unix of the late 1970s or the Lisp machines of the 1980s.

Even more frustrating than the complexity of modern software is the increased commercialization of personal computing. Ads, subscriptions, and tracking are everywhere. Data is increasingly locked into cloud services that often don’t respect users’ privacy. In short, personal computing back in the 1970s and 1980s was about taking power away from mainframes and giving it to the people, but it seems the past two decades have been an effort to bring that power back to large corporations who dictate the terms of our computing experience.

What can be done about this? I think the FOSS movement has been a wonderful start, but an effort has to be made to build a simpler software stack that makes it easier for users to extend and modify their tools. I’m also very interested in the idea of community-driven, non-profit cloud services as alternatives to Big Tech. Computing is too important for society to be controlled by a tiny handful of corporations.


To give an obvious example, it seems every generation re-discovers the concepts of encapsulation and abstraction every 5 years or so. Each generation thinks it's so smart for having figured out how much easier software development becomes when your code has a clear purpose and interface.


UI frameworks, OO, and databases/SQL get reinvented every 5 years or so, usually by younger programmers who seem particularly ignorant of the lessons learned from decades of research and development in these areas.


To be fair, that's the way of young people in every endeavor. When you are full of energy but inexperienced, you have a tendency to present new solutions that you don't realize aren't actually new.


The idea that the user's time is more valuable than the developer's

The idea that the user is not an idiot


Both of these are manifestations of the same underlying cause: arrogance combined with ignorance.


The latter one is more about business people driving the operation.

They want to assume the user is an idiot, because idiots are often the desired customer:

* Broader market. For every N people who'd buy AutoCAD, there are probably 10N people who would be able to run Baby's First Drafting Suite.

* Lower expectations and support costs. The professional with 10 years' experience knows when a given piece of software isn't meeting expectations. He'll ask pointed questions and consume actual support resources. The idiot can be steered into a chatbot and gaslit into blaming himself.

* More susceptible to dodgy schemes. A professional who has to look at the actual cost/return of his investments and deal with corporate policy might not be as interested in subscription schemes and monetization shenanigans, while those same schemes expand the addressable market of idiots. They might not buy in at $100 one-time, but they'll end up paying $10 per month forever, or accept "we'll lock your data into a closed ecosystem and it will cost you to liberate it later."


Reading about virtualization and old IBM systems gave me massive respect for what our predecessors achieved. I wonder what other treasures are out there lurking in the literature.


Capability-based security is one; data diodes are another.


Data diodes are alive and well, just not in a lot of places. Most people don't need them, and organizations that do either have them, have decided that an air gap is easier to deal with for some reason, or are being run (on the IT/infosec side) by people who have no business being in their position.


I have some issues with this blog post. On one hand

> We like to talk of 'engineering', but chemical engineering was alchemy in an earlier life, and I believe our field is no different.

while on the other hand

> Computing being about information first, and the computer being a tool we use to better our lives. 40 years on these ideas have been pushed to the fringes.

Maybe this is evolution? Maybe when society runs on information and literally everything is a computer (your car, your phone, your watch, etc.), we've kind of moved on from the mindset of the past, when the computer was a 'thing' to use when you need it and not something ubiquitous?

> Can we treat the past with honesty, setting aside what we no longer need, and recalling what is useful but forgotten once more?

I'm trying to figure out what this means. Can someone help?


> I'm trying to figure out what this means, it seems extracted from some electoral campaign. Can someone help?

That hurts - can you at least tell me it seems extracted from an electoral campaign from a previous century to soften the blow?

But what I was trying to get at is this - we either look at the past and think "Gee, weren't those people dumb with their 8-bit machines and indoor smoking", or we romanticize it as some kind of "back when the world was simpler, no cellphones in sight, just living in the moment" tripe.

It's better to be nuanced. Treat them as if they were just human beings like you or me - which they were. Look at some 60s, 70s, or 80s computing idea and say "that was a dead end and isn't relevant anymore", or "wow, I really like how they were thinking about things; maybe that's worth re-visiting".


Sorry, I may have been too caustic with that remark. I have edited it out; it was not my intention to offend. I think I get what you mean, thanks for responding.


Not at all! All in good fun.


> > Computing being about information first, and the computer being a tool we use to better our lives. 40 years on these ideas have been pushed to the fringes.

> Maybe this is evolution? Maybe when society runs on information and literally everything is a computer (your car, your phone, your watch, etc..) we've kind of moved on from the mindset of the past when the computer was a 'thing' to use when you need it and not something ubiquitous?

I just think that the capacity to modify our surrounding environment is essential to what makes us human.

If we lose this ability in the computer world, we feel as if we're being choked because, as I said, I have chosen to believe that modifying what I consider "my environment" (which for better or for worse includes my computer) is part of what defines me as a human.


I like this blog post. It's very poetic.


Author here. Thank you for this comment! I decided to write in a style I wanted to write in as opposed to what I thought people wanted to read. And I'm glad of it.



