The computer revolution hasn’t happened yet (1997) (catonmat.net)
215 points by tosh on Jan 24, 2021 | 179 comments



I would say the computer revolution still hasn't happened.

We are just starting to learn what computers are for. Hopefully, instead of their being tools for stealing our focus and for surveillance, some day we will collectively figure out how to use computers so that we can live better lives.


I could not agree more. I keep banging on about what I call a MOOP: Massive Open Online Psychology. These phones we all carry can listen to and video our every interaction, and watch our spending habits, our exercise, and our bodily functions.

Epidemiology ought to be able to easily tease out ideal life "best practices": actual good advice given to you by your phone, backed by the mistakes of a million others in just your situation.

If keeping a daily diary helps people make better life decisions, imagine what that would do.


Being watched and served recommendations is not the revolutionary paradigm for computers imo.


It would be revolutionary if "recommendations" weren't a modern synonym for "advertisements".


This exactly. The recommendations are, at best, a win-win for you and some third party that paid to be part of the win (but sometimes tries very hard not to make it seem so). It can be different, but only for the tech-savvy (open source) and rich people (who pay the tech-savvy).


I beg to disagree: millions of people are working from home, and a huge digital economy is here.

There is already a lot of real value in ubiquitous computing, automated warehouses, supply chain optimizations, digital entertainment, digital government services in many countries; I could go on for a couple of pages now.

Recommendations and tracking are just a side effect. FB and Google ads are not the whole "digital economy" on their own.


> There is already a lot of real value in ubiquitous computing, automated warehouses, supply chain optimizations, digital entertainment, digital government services in many countries; I could go on for a couple of pages now.

Is technology only about optimizing value in supply chains?


I don't understand the question since I pointed out other areas as well like digital entertainment which has nothing to do with optimizing value in supply chains.

Maybe you were thinking about optimizing content delivery, but I was thinking about computer games, VR, and music production, which are digital entertainment and have nothing to do with supply chain optimizations.


It's more like a Gattaca-like dystopia.


Why not?


What is the purpose of a human life? If my phone is recommending a course of action that is "best practice" for my life, surely this philosophical and political question has been answered, rather than just assumed by engineers based on the perceived current societal norms of a subset of the global population.

Imagine the power you’d have if you had people literally following orders based on a system of life that you’ve designed. And follow it they might if the perception of outcomes is positive.


The computer revolution is more subtle than that. Just look at chess history and play. The chess abilities of humans have vastly increased, and you can thank computers for 95% of that. ELO rankings are a modern invention to play highly competitive games from literally any place in the world. Chess opening theory is no longer a human endeavour; want the best practice in the opening? Ask the computer. Puzzles are being auto-generated and ranked to improve your recognition at record pace. The internet and the computers that run this world have sped up almost every conceivable industry. Computers have led to the creation of entirely new concepts. If you want best practices for your life, google it (no, really). Asking a search engine to point you to the person who can solve your problem is the feature creep this article is about.


> Just look at chess history and play. The chess abilities of humans have vastly increased, and you can thank computers for 95% of that. ELO rankings are a modern invention to play highly competitive games from literally any place in the world.

Elo ratings (not ELO) were first adopted in 1960; while that technically overlaps with computing's timeframe, the two are unrelated. Certainly Elo was not invented to play online chess, or for the sake of computer-aided chess, by any stretch.


Yeah, a bit of misphrasing. The point is that fair rating systems, and our computers' ability to keep track of everything, have given rise to highly competitive games and very efficient progression.
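For reference, the Elo system being discussed boils down to a simple update rule: each player has a rating, an expected score is computed from the rating difference, and the rating moves in proportion to how the actual result deviates from expectation. A minimal sketch in Python (function names and the K-factor of 32 are illustrative choices, not from the thread):

```python
def elo_expected(r_a: float, r_b: float) -> float:
    """Expected score of player A against player B (0..1)."""
    return 1 / (1 + 10 ** ((r_b - r_a) / 400))

def elo_update(r_a: float, r_b: float, score_a: float, k: float = 32) -> float:
    """A's new rating after scoring `score_a` (1 = win, 0.5 = draw, 0 = loss)."""
    return r_a + k * (score_a - elo_expected(r_a, r_b))

# Two equally rated players: the winner gains exactly k/2 points.
print(elo_update(1000, 1000, 1.0))  # prints 1016.0
```

Because the expected score depends only on the rating difference, the system works equally well for players who have never met, which is what makes worldwide online rating pools practical.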


You just described China, with its Social Credit System, pervasive internet monitoring, and central control of public discourse.


Social credit is different. Social credit imposes penalties for deviating from the norm, or even for associating with those who do. This is more like feeding Apple Health and Google data into IBM Watson and getting personalized recommendations out.


For the most part, China's Social Credit System is the same as the credit scores in the U.S.


That's how it started, but nowadays credit-score-type data is a tiny fraction of what goes into the SCS. The SCS is affected by things like playing loud music, eating on public transport, traffic violations, not showing up to restaurant or hotel bookings, not sorting your domestic waste, and loads more. You can gain SCS by donating blood, volunteering, etc. It is widely believed that social media posting heavily affects SCS, and there are many examples of people's scores diving after they had posts blocked, but the authorities won't come clean about it.

As of mid-2019, 26 million air tickets and over 4 million high-speed rail tickets had been refused based on SCS. Also, some personal information is openly published about people with low SCS.


> What is the purpose of a human life?

There isn't one. We're simply the result of a bunch of lucky events and evolutionary forces.


We can create our own goal though. For instance, maybe we want to maximize happiness. This means we should attack the core root of the problem: our brain chemistry. We either need to find a drug like heroin that does not cause tolerance and withdrawal, find a way to reset our chemistry after drugging ourselves, or find a way to easily alter this chemical state directly. Then we will have solved all suffering. Right now we are just treating symptoms.


What if brains aren't wired like that, and you become numb? That sounds like an awful goal to have. Maybe a better one is to minimize misery and unhappiness (ideally a global minimization, not a local one).


Yeah I think what I proposed is an example of what an AI could come up with if it's given a goal of maximising human happiness and reducing misery.


This reeks of scientific hubris, to speak as if we know this to be fact.


One of my purposes is to keep living. To that end I exercise, get my heart rate elevated, try to eat better, and brush my teeth. I also try (though less successfully and without much effort) to limit my time watching TV.

I know some of my habits are awful (brushing teeth, exercise as of late, going on any walk outside), so I set up my phone to remind me every day at a specific time.

Can some of these be predetermined by an engineer? Yes! Can all of them? No.

Aren't notifications intended by engineers to dictate a behavior, namely "use my product"? Various products (Facebook, Reddit, HN, etc.) are a means to whatever end. I'd argue any product/system we join encourages us to follow it if we find it rewarding, regardless of positive outcomes (though surely the short term must look positive, else what incentive exists to start?).


I think the point is that a greater amount of more-granular and empirical data could provide the basis for learning more about "the human condition".


Ah, an experiment in the eugenics of human behaviour. One must be careful where one treads.


I’ve seen that episode of Black Mirror (or it at least sounds like one)

Every technological advance is usually both good and bad. Some are more good, some are more bad. Sometimes we gain a lot by it, and only lose a little, other times what we gain is not worth what we lost.

The problem with this idea is that the machine would not know what makes life valuable. "Life's best practices" is subjective, and therefore the machine could promote what one person considers a best practice that another considers a worst practice.


To me, MOOP = self reinforcing echo chamber.

The reverberations of the loudest (most magnified) permeate and shape ideology.

Hopefully the conductors get the frequency right for long term and long distance communication.


Have you read Homo Deus by Yuval Harari? Mentioning him can sound a bit pseudo, as he's become so popular and is such a popularizer of others' ideas, but I believe the main tenet of Homo Deus is his own and comes very much from thinking about what you're calling "massive open online psychology." His notion is that trusting our technology more than trusting our own feelings will be the death of humanism, because in his view humanism is at heart about trusting one's own individual feelings. The entire liberal order can be seen as one aspect or outgrowth of humanism.


I would say Martin Heidegger touched on that (trusting technology over intuition) before Harari in Being and Time (1927). Harari may have brought concepts into the mainstream but these concepts were being discussed in philosophy for over 100 years.


I would say that Harari is probably a good deal more approachable than Heidegger. :)


You are basically describing augmented reality, where devices give you superhuman knowledge to give you advantage in life (know things before they happen). We will get there, but it will be a rocky road if it doesn't destroy us (e.g. China's social credit system). The implicit bias built into such systems is so hard to remove and is part of the Big Sort issue in society today.


Another question: if China is left free to implement such a device that can completely optimise its people, will it become more powerful than other countries by having wonderfully optimised people to do its bidding?


It is definitely possible. It is a better worker bee.


The most popular food is fast food. That is the end game. The most popular computer programs will be distractions; we are already there, and nothing much will change with respect to personal computing. The next step is a decent AI; before then, things will continue to look like now. Or, more realistically, if food is anything to go by, we will just get more and more informationally fat, just as we got physically fatter. So we aren't "there" yet; it is going to get worse.


An interesting analogy with food. There are two assumptions I see in your analogy: (1) the development of computers will parallel that of food (this is the raison d'être of the analogy); (2) the food industry has matured and plateaued.

My personal takes:

For (1), I don't think it is self-evident. I see that the pleasure from the consumption of junk food and that from mass media have some parallel, but tech has a far greater degree of freedom: its potential to truly transform human lives (for good and bad).

For (2), the food industry, although by its nature conservative, is still evolving. Maybe fast food doesn't have to be junk food. Maybe nutritional balance can be achieved without compromising taste and cost.


In America.


We're already there though..

I bought an action camera the size of a palm, with hardware 3D gimbal stabilization, that can track faces and other objects. I can stream its video to my phone via Bluetooth, and it came to my doorstep two days after I ordered it online.

All of that involves computers of various sizes, sitting either in miniature hardware in my hand or somewhere in a huge data center, working in tandem to provide an experience that would have been mind-blowing 20 years ago.


That rather depends on your understanding of a good life.

Was that camera somehow necessary for you to lead a good life or is it yet another distraction from it?


That is very subjective.

May I ask: what in a good life cannot be classified as another distraction? Or do only activities that someone from 1000 years ago could do pass the purity test?


Perhaps a good life can be defined in terms of a lack of distraction. That does not mean we should value activities from 1000 years ago more. For some it may mean quite the opposite since they are no longer distracted by the struggle to fulfill basic needs so pursuing the benefits of modernity is possible. For others, it may mean ignoring the distractions of modernity while seeking the purity of the past.

The problem with the current trajectory of technology is that it seeks to strip us of autonomy. We are usually presented with means of consumption, rather than courses of action. I suspect that is what people mean when they discuss distraction.


What distraction means here is still unclear.

I for one would much rather fill my life with pleasant distractions than with the Spartan utopian lifestyle that some other people try to sell me as the 'good life'.


Hundreds of millions of people are stuck at home due to lockdowns, and the only practical way they have of doing their jobs, keeping in touch with loved ones, and accessing public services and information is through the Internet.

Is the Internet contributing to the good in their lives? A recent report found that children who play social online games and connect with friends online while locked down have better mental health than those who don't.

https://www.theguardian.com/commentisfree/2020/nov/23/video-...


I tend to disagree heavily with that. Social online games foster a toxic "win it or die" mentality that is making children physically anti-social and introverted. Bullying practices just got to the next level with games such as Fortnite, Free Fire, etc.

And the industry profits from that by applying psychological gimmicks in their games, the same ones applied by Vegas casinos. The problem here is that they are applied to children who are unable to fully fathom the medium- to long-term consequences of their decisions, which keeps them playing and forgetting the external world. Their mental health is better until the day their parents take away their devices and send them to play outside. Suicide rates doubled over the years. Antidepressant use doubled over the years.

Social media by itself is not the problem. How big tech uses it to profit (The Social Dilemma) is the problem.


Online bullying does happen, for sure, and all of those are concerns. But bullying and safety are an issue in any playground or arena in which kids hang out. This study is actual evidence of a positive overall effect on most kids. Do you have any counter-evidence showing that the problems actually outweigh the benefits?


The problems only show in the medium to long term. We are seeing children become more introverted and more "scared" of the external world: children that will grow into adults who are not able to deal with unforeseen negative events.

I agree with you that this is an issue in any kids' physical outdoor activity. But the difference is that those activities are often supervised by an adult, and in the event of bullying the participants are removed and often punished, and the victim is helped and assisted. Note that I use the word "often" because I know there are parents who encourage bullying behavior as ~self-defense~, which I also disagree with heavily.

We don't have adults supervising every Fortnite lobby. Most children do not know how to use reporting tools to report improper behavior by another player. And while they suffer from bad words, psychological threats, and such, they can't just turn off their devices, because the game always offers them a reward: a new skin, virtual coins, lootboxes, etc. So they keep playing, and we have a loop.

Just yesterday I reported a player who cheated in FIFA by exploiting a glitch that let him play with a team full of superplayers and legends. The whole process took me almost 30 minutes. Sometimes I think they don't want us to report bad behavior, because the process itself is tiring.


You didn't talk about action sports cameras...


I enjoy watching action cam videos of mountain bike riding, which show me all these places around the world for free. They are a purely positive influence on my life, and I don't even own one.


We wouldn't be living in this lockdown hellhole in a pre-internet world. The idea that children playing online games are better off than those that don't is disgusting. It's a relic of this sad excuse for a life we live now. All to help grandma and grandpa, average age 82, squeeze out another year, when they would happily let go for their grandchildren. Reprehensible.


My wife is a nurse. It's also to help the millions of health care professionals in the world working gruelling hours in horrible conditions on wards full of dying patients. The number of health care professionals who have suffered severe stress and trauma, a shocking number of them to the point of suicide, is a horrible bloody stain on the hands of the people not taking these precautions seriously and thus aiding the propagation of this virus. I'm pretty sure the thousands of people dying daily in the US and the UK, where I am, and the millions who have died globally, aren't all in their 80s.


I also have an immediate family member who is a nurse, and unfortunately our anecdotal, second-hand experience seems to disagree. Yes, that's not how statistics work; they are not all 82. Let's also remember that we live in an imperfect world, and many, many more of younger age die every day. We are living in a delusion where we forget that everybody dies. A 10% increase is not a 10x increase. We can handle this without the security theater.


> understanding of a good life.

Waiting 1 month to have something delivered is hardly what I call a good life. Let's celebrate the improvements where we can.


What if humanity had solved all grand problems except for shipping? Would that not be a good life worth living? Everyone's happy and fulfilled, except for those whose happiness depends on swift Amazon deliveries, which never seem to occur. Day after day they wait endlessly for a shipment of goods, becoming moodier and moodier while the rest of us dance in the streets in perfect joy.


Except most of us won't dance in the streets in perfect joy, and probably won't ever see it in our lifetime or even in the entirety of humanity's existence.

In lieu of that, I prefer my shipping delivered in 2 days vs. a month.


Ok, let's take another example.

The leading COVID-19 vaccine involves a lot of computers. From the production of the mRNA strands to the setup of the supply chains to communications between scientists.

And there are few things that we need more than an effective COVID-19 vaccine in order to lead a good life right now.

The computerized chain that got GP his camera is similar to the one used in making the vaccine.

GP's camera is just icing on the cake.


Would covid have been made in the wuhan lab without computers?


What is life?


David Ackley, an artificial life researcher working on "robust-first computing," has this gem buried in one of his videos: "Life is nature's way of preserving meat."


Is "stealing our focus and surveillance" really a problem that computers themselves bring us? It is like blaming the road for the billboards around it. Just find those who install these boards and slap some sense into them.


I mean, yeah, most computer use is voluntary. Facebook is exploitative, but most people choose to go on Facebook and let it happen. They could choose to only occasionally go on there to look up events and happenings, but for a lot of people it's not just that but a time-waster, and as it moves away from thoughtful, intentional activity, people become less critical of what they're seeing.

I mean, I'm guilty of that; I frequently browse Reddit mindlessly. At least I'm spared the obvious ads, for now anyway; I'm sure they'll crack down on third-party clients soon.


As long as corporate greed continues unhindered, computers will remain a way of controlling and monetising the population.


Complementarily, I would say that because of corporate greed (or corporate behavior / politics more generally), it will be a long time before we can use computers built on the principles Alan Kay outlined in his talk.


Greed motivates corporations to try to provide people with what they demand. It can be channeled for good if consumers inform themselves about what products/services better their lives and actively choose them in the market.

The growth of the health food market - or at least some subcategories of it which are genuinely healthy - would be an example of this happening.


> It can be channeled for good if consumers inform themselves about what products/services better their lives and actively choose them in the market.

...assuming they're given a meaningful choice in the first place. Consumers choose from what's available on the market, not from some abstract space of possible products. Not only is technology too complicated for the mainstream audience (and products on the market do their best to make everything seem like "magic" that cannot be understood casually by non-specialists), the market doesn't really listen to user feedback anyway, preferring to divine everything from telemetry.

Greed, in general, motivates corporations to make money. Sometimes, providing people with what they demand is the easiest way to profit. Oftentimes, there are better ways. Like investing in marketing to make people demand what you want to provide them. Or scam them through convoluted business models.


> It can be channeled for good if consumers inform themselves about what products/services better their lives and actively choose them in the market.

Very true. However it’s easier for motivated corporations to steer people away from such information.


That's why we need public resources expended on producing useful and true information and making it available for public consumption.

This can be done through both state-backed initiatives and non-governmental ones like Wikipedia, or, speculatively, a public-interest DAO that crowdfunds R&D. Gitcoin may be the first iteration of the latter.


This is very true, but the "if" is a big one...


I think the products/services the market provides mostly enhance our lives. Society works better than our cynicism would imply, and when you zoom out over a span of a couple of centuries, or even 40 years, that becomes apparent.

Of course there are many cases where the consumer is underinformed, and harmed by what the market urges them to buy, and these need to be addressed.


On the other hand, we are potentially now closer to our doom than ever before. If we look at all the things that _in the short term_ improve our lives but _in the long run_ might seriously impact future generations, then I am no longer so sure about that statement. Zooming out is fine, but looking not only at the past but also at what the future seems to hold is a worthy perspective to take sometimes.


Certainly! It was these cases I had in mind. Maybe I should have made it clear.


> Hopefully, instead of their being tools for stealing our focus and for surveillance, some day we will collectively figure out how to use computers so that we can live better lives.

This is just a specific, narrow view of computers and how they are used by people in day-to-day life. The larger picture is that computers have impacted every single industry/domain, and that impact has indirectly made lives better in some respect.

In the end, computers are just tools, tools to automate tasks. What tasks humans want to automate depends on their needs/desires, and so do the consequences.


My personal belief is that single computers by themselves are worthless. Their connectivity and the possibility of exchanging knowledge and experiences are what make them so much more efficient than humans (at least in those tasks we have automated already), or, when used as an interface, so efficient as an information management tool.

In order to reach a superintelligence level, we need to solve the automation of gathering, evaluating, and improving knowledge first.

And by automation, I don't mean wikis or something you can find via Google. Wikis and the web are currently built for humans, not machines. If we are able to automate the semantic knowledge behind them and transfer it to neural networks, we will have limitless capabilities.


Some people already did - it is not that hard, really.

As a kid, I spent countless hours in the underground trains staring out of the window into the moving darkness. Today, if I am on the train, I read interesting stories I like.

As another example, the area around my birth town apparently had all sorts of unique and interesting places to hike. I had no idea those places even existed before the internet: blogs and forums.

And for the more unusual hobbies (think aluminum casting, for example): did you know how hard it was to find good information before the internet?

Please do not equate the internet with Facebook and Twitter. It is so much more than that; you just have to invest a bit of effort instead of blaming it.


> We are just starting to learn what computers are for.

This seems a bit like saying that we're just starting to learn what engines are for. What we use computers for will continue to evolve and we will never arrive at an end state.


Computers and smartphones being addictive and time-wasting on a personal level doesn't negate the value they created through automation. That's a revolution too, even if most of the value is harvested by the industrial upper class.

If people realized the value computing can give, consumer electronics would be useful to them. But they want Instagram and Snapchat instead.


>> That's a revolution too, even if most of the value is harvested by the industrial upper class.

A significant portion of the value created is not captured by economic statistics, as they mostly deal with measuring economic resources that are scarce enough to have market prices.

Abundant goods are free like air and widely distributed, and automation has made many classes of goods/services fall into this category, including many types of information (compare trying to find your way around a city 20 years ago to using Google Maps today, or trying to read up on some obscure subject 30 years ago by visiting a library versus finding the Wikipedia page on it via a 3-second Google search on your phone).


In the attention industry, content is created to package attention and formalize it enough to be a commodity that can be sold to advertisers. The whole ecosystem very much depends on this model of production, and if we don't find a good substitute this will continue...


I guess the way I see it, computers have already changed every aspect of life.


This. The last decade saw massive adoption of computers of all shapes and sizes in India. In the decade before that (2000-2010), computers were confined to labs and offices. But the smartphone changed all that. From booking train tickets to accessing medical checkup reports to paying push-cart vendors through mobile wallets, consumers now use computers in all sorts of ways.

As you rightly said, I just can't think of a domain that's untouched by computers.

Even in fundamental science it's now impossible to do anything without massive data-crunching machines. LIGO's detection of gravitational waves, the first black hole images, and the plummeting cost of full DNA sequencing over the last two decades are great examples of how computers have irreversibly changed the way science is done.


Agreed on "starting to learn what computers are for", but I'd say it's happened. It's just not the way we ideally wanted it to manifest as a species.

We want to believe that the revolution should be altruistic in nature. Bettering the human race through only "positive" methods - Star Trek style utopia if you will.

But reality shows that even with all this available computing power, it's still a finite resource. Any finite resource is capitalistic in nature and will be subject to the aggressive expansionist.

It needs its checks and balances like other resources.


Of course it has. Your smartphone is the manifestation. It replaces at least 40 functions and makes your life better. If you choose to use mostly the bad parts, that is on you. It's like complaining about your car making your life worse when you only drive in reverse and bump into things.


How does it make life better? Have you made that determination yourself? At least a significant number of people I know personally say otherwise and stick to dumb phones. They are well paid, and it doesn't seem they are doing this to save money. So I don't think what you say is the objective truth you try to portray it as.

If I leave out the bad parts, then the rest is mostly what I need to use because of external demands, or because of technology itself (i.e., problems that only exist due to technology itself).

I don't need to be reachable on a messaging app, but my team and my family expect me to be, or I am marked as unhelpful. That wasn't a problem when there was no expectation of being constantly available.

I don't need to have Uber installed, but it seems to be the only way to get a cab where I live.

I don't need navigation, but roads and intersections got so complicated, and I got so reliant on it, that I would not be able to reach my destination without it.

Etc.


I’ll consider it a revolution when user friendliness and low-config are the default.


I guess we are already there, then. My grandma has no problem operating an iPad, while a laptop is largely too complicated.


A sudden revolution is unlikely anyway. It's more some sort of a gradual process.


The real revolution will be AGI. Dunno how that'll play out.


IMHO we've had only one material revolution from computing: the ability to move away from paper. For individuals and SMBs this was the spreadsheet; for larger organizations it was a centralized database with remote access. But after that, I don't think we've seen a material improvement. Dumb terminals in the '80s gave just as much value for accessing data as their Windows GUI successors and then web/mobile. So aside from the spreadsheet, I'd agree with the premise of this post: we're still not seeing a true computer revolution. It's mostly the same stuff with different interfaces.


In some ways we've gone backwards. The '80s green-screen computers that my public library had when I was a kid were really well designed and efficient to use. The modern, web-based GUI of my public library today is not.

(Though I would say there are a gazillion ways we've gone forward. Videochat, YouTube, Twitter, and TikTok are all amazing things.)


Yeah, you and another reply have a good point: real-time and asynchronous video communication, paired with ubiquitous cameras on our devices, might also be the next game changer. If so, we're still in the first inning, the relative beginning, of seeing the impacts of that technology on society.


Remote work during a global pandemic wouldn't be possible without the computer revolution.


Bandwidth was the main limitation to remote work.


And conversely made the problems all the more troubling as we became even fatter and more sedentary.


These and other current issues, like mental health problems, are due to the current quarantines and lockdowns; office life is inherently sedentary anyway.


Sure, office life is nothing like being a tradie or a farmer, but personally I found I was doing slightly less walking when not in the office: I drove to the coffee shop instead of making the 400m round-trip walk to my coffee shop of choice at the office once or twice a day, and I lost the 15-minute walk to the train station every day.


That is almost entirely the fault of sugar and other shitty diet choices. Exercise is far less impactful than diet.


Computers and GUIs have radically reshaped video and music production. I'm sure there are other domains where it has had a similar impact, where reverting to a dumb terminal would not be possible, such as CAD applications.


The production tools may have changed drastically, but has the end product really improved? One might argue that music and movies produced before 1998 were a lot more engaging; nowadays movies and music are produced without obvious flaws, taking away character and making it all very bland.


Isn't that a slippery-slope argument or something?

Nothing in the history of life has changed fundamentally: Eat, reproduce, die. Your evaluation of the industry's products doesn't change the objective fact that there is significant change enabled by computation.


What do you expect then? We humans already have advanced computation going on between our eyes. Pretty much everything a computer can do is, to some extent, already done by humans. What do you expect, if not just getting more efficient at tasks than humans are?

Not sure we can build something fundamentally more capable than human brains. And if so, can we recognize the leap?


My thought process reading this article:

-“Wow Alan Kay is legendary.”

-“Can’t imagine life before OOP.”

-“I like how this YouTube video is annotated and I can scroll, see screenshots, and jump into the video where I want.”

-“I wish there were more video summaries like this.”

-“Generating summaries of videos like this is something tools like GPT-3 will do for us soon.”


"HTML and the Internet has gone back to the dark ages because it presupposes that there should be a browser that understands its formats. This has to be one of the worst ideas since MSDOS. HTML is what happens when physicists decide to play with computers."


Kay later points to the internet as an example of software that can grow instead of being constructed. And I know that the web is technically not the internet but I wonder if the internet would have grown the way it did without HTML and the browser. There's a lot written about all the better ways that could have been but it's hard to downplay the beauty of what has essentially become the largest unstructured program in history.


Well, I think the rise of Javascript also played a key role in it.


I'm not much of a programmer, so I don't follow this statement. Could you break this down a bit for someone with little programming knowledge?


Alan Kay is a huge proponent of capital P Personal Computing. Meaning that anyone should be able to edit programs, edit how text gets displayed on your screen, etc.

HTML is a language to structure text with, and that's perfectly fine to him. The problem is that the browser decides what that text should look like to you. Not only its aesthetics, but also how and where it gets displayed. If you want to visit webpages in a painting program to find images, why not?

This makes a lot more sense if you watch one of his smalltalk demos, he draws a wheel and a car in a paint-like program, then tells the car to rotate as much as the wheel is rotated. He can rotate the wheel with his mouse.

Then he paints a pedal, and if clicked on it accelerates the car etc...

This is not a programming environment, but an operating system; you're free to make all your programs do everything.
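For a feel of what that demo is doing, here is a rough sketch in Python rather than Smalltalk; the names (Wheel, Car, rotate_by) are invented for illustration and not taken from Kay's demo:

```python
# Rough sketch of the wheel/car demo described above. The wheel simply
# forwards the same message it receives to any object wired to it.

class Car:
    def __init__(self):
        self.heading = 0

    def rotate_by(self, degrees):
        self.heading += degrees

class Wheel:
    def __init__(self):
        self.angle = 0
        self.observers = []          # objects told to follow the wheel

    def rotate_by(self, degrees):
        self.angle += degrees
        for obj in self.observers:   # forward the same message, late-bound
            obj.rotate_by(degrees)

wheel, car = Wheel(), Car()
wheel.observers.append(car)          # "rotate the car as much as the wheel"
wheel.rotate_by(90)                  # like dragging the wheel with the mouse
print(car.heading)                   # 90
```

The point is that the wheel knows nothing about cars; it just re-sends the message, so anything that understands `rotate_by` can be wired in.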


Every webpage you view nowadays is at least written in HTML, including this one you're reading, mostly it also includes Javascript and CSS.


HTML, and the web stack by extension, have their faults, but the advent of the web should've been a humbling experience for Alan Kay. I deeply respect Alan Kay for his achievements, but he appears to have been stuck in "object land" for a few decades now. For example, he seems to hate the idea of "simple data" for philosophical reasons (see [1]), but it's undeniable that "data" is not a bad idea at all and we use it productively every day. The fact that a physicist succeeded in creating a worldwide hypertext, and the best computer scientists didn't, should be telling. There is a lesson here somewhere; just dismissing the web as the work of amateurs is missing the mark. I love Alan, but sometimes, he is his own worst enemy.

[1]: https://news.ycombinator.com/item?id=11941656


Anyone else find it extremely coincidental that of all the millions of years of human civilization we could have existed in we all just happen to find ourselves at the most exciting period of technological innovation?

Coincidence or Simulation confirmed?


Wouldn't someone at the start of the industrial revolution also think themselves in the most exciting period?


I definitely think of the first ~7 decades of the 20th century that way. If you were born in 1900, you would (most likely) have witnessed: the internal combustion engine, human flight (from nothing to faster than the speed of sound), a major world wide economic depression, two world wars, the invention of nuclear power, major social and legal changes in the '60s (in some places at least), and the exploration of space, starting from nothing and most likely including humans walking on the moon.

Or another way to look at it: during the N years I have lived, how much has the world changed compared to the period of time between 1900 and 1900 + N? Though on reflection, I guess this is probably the sort of thought experiment that becomes more interesting as one gets older..


No, no. This time is different.


Can't tell if you're being earnest or funny. Care to say why?


By some estimates, current living humans make up 7% of the total number of people who have ever existed.

Additionally, many during the last 200 years probably thought they were in the most exciting period of technological innovation. When you factor this in the probability of existing during such a period seems much higher than coincidental.


Then again, they were all correct (in a way): They did live in the most exciting period of technological innovation there had ever been... So far.


One thing to keep in mind is that there are way more people alive today than ever before. According to Google, there have been about 108 billion humans, and 7.7 billion of those are alive today. So your chances of living during the "most exciting period" are better than 7%! In any case, much more likely than in any other period in human history.


I would like to skip the clock ahead about 500 years, but I'm afraid of arriving in a desolate tomb world.

I feel like it wouldn't be that hard for a person from today, especially someone who keeps up with scientific and technological progress, to quickly get up to speed with the world of 500 years from now... but I might be wrong.


In case of time travel to the future (or the past) what you should be worried about are all the other aspects: the social landscape, language, ...


If Michael Crichton's Timeline is to be believed, I should be worried about disease, at least for the past... although there's no reason I couldn't contract COVID-2519 and die...


However, if you expect that in the future the population will keep growing by orders of magnitude, what does that imply?

A) you're just in an unlikely position

B) the population will rapidly shrink and never recover

C) this is the most "interesting" time in history, and there are so many simulations of it by the people of the far future that we are more likely to exist in this time.


B; the argument being that if humans are going to take over the galaxy and become a multi-trillion-population species, and you throw a dart anywhere in the population of all humans who ever lived, chances are the dart would land in the region of most population; therefore where you live is probably the time of highest population, so we never do become a galaxy-spanning species, we only dwindle from here.

And it's a daft argument because if you don't have a soul, you are the product of your environment. You couldn't be born as someone else, or somewhere else, or somewhen else, just like the River Amazon couldn't be on Mars or in Pangaea, because it's defined as "the thing in Brazil, currently". You couldn't be born in the Wild West because you are defined as "the child of your parents" and they weren't there, then. You didn't end up /in/ that meat body, you /are/ that meat body.

(And if you do have a soul, and they are randomly assigned to meat bodies, this argument is still like saying "roll two dice, the most likely combined outcome is a 7, I got two dots and one dot so that must be what 7 is")


D) This is the most interesting time in history so far, and the correlation with population will hold as population grows.

Although echoing the opinions of many other commenters, I believe the early 20th century probably takes that crown.


That's just mostly viral hype thinking, maintained sometimes by key people in the industry through the press, because it drives money to them and willingness to work for less in certain cases.

The decades around 1900 were more innovative than what we've been experiencing the last decades.

And yes, you can find people thinking the same thing way back in history, with the difference that now it's been psychotically amplified by mass media, and it is actually annoying.

In popular press, and when people have things to sell, they do not include historical facts, because then they cannot drive the hype to new degrees.

You may think that they have done their homework when they say "never before in history" but they have either never checked or deliberately ignored history.

I've had people I know, and the press swearing up and down that 'this time it's different' since the 90s when it comes to 'A.I' for instance. I'm sure older guys can go back even earlier and remember 'the impending A.I revolution'. Sweet money in that hyped narrative.

If you think that "this time it will be different" - congrats they got into your brain, and they do not care that they have done it just 10 years prior. With enough repetition of the same information, and the way the human memory works - it doesn't really matter.

You will help them every time by reliably assuming that someone else cared enough to check history before saying that something is "historical", "first time", "never before in millions of years of human civilization". They haven't, and it doesn't matter for their goals.

One of these times something will eventually happen. Until then PR money, clickbaits, VC money, startups.. the whole classical techno-utopian centrifuge.


By Jove, Northington!

Look yonder at yon contraption! A beast of iron and steam that can pull yet more buttloads [0] than a team of a hundred horses, and doth speed along tracks at a brisk pace of 80 miles every hour!

Yes, yes... quite right, Wigglesbarton. Truly we live in an age of wonders!

Well, I must away to beat my servants. Cheerio, old boy!

[0]: Actual unit of liquid measurement, eq. to 126 gal.


Well, someone had to be there.


I don't agree. I was born in 1979, my dad was born in 1944, and my son in 2009.

If you look at all of our youths, the biggest difference is between my dad and me, not between me and my son.

My dad grew up without TV, 1 person in the street had a radio. Later cars arrived.

When you look at my youth, we had computers, but internet arrived when I was 18.

My son watches TV, plays computer games, watches Star Wars, etc. Not that much has changed. Sure they have smartphones and social media, but the differences are details.

The biggest impact of computers has already passed us, and happened probably between 1970 and 2000.


Not sure I agree with your disagreement; I'm thinking the biggest practical differences in youth would be non-computing and a generation or two earlier - indoor light, central heating, hot water on demand, electric washing machines and vacuum cleaners and motor cars and so on. I don't know which generation it would be but my mom's upbringing in a house with only fires for heating the house and water, and my grandma spending most of her life on cooking and cleaning and my other grandma being from the "make do and mend" tradition of making their own clothes and adjusting hand-me-down clothes, to a world now where sewing is a hobby and household chores much less effort seems a much more significant change than having a TV or not.

Or the other way, a generation later; your dad - TVs did exist in the 1950s; Richard Feynman was born in 1918 and his memoirs include fixing radios as a young lad around 1930; that your dad had no radio in the 1950s isn't because they didn't exist, and he could have raised you with no TV and no radio too. My grandad, my dad, and to a small extent myself, grew up in a world where electronics and radios were things made of discrete components which you could build and repair, where chemicals (including explosive things) were things you used in everyday life, could buy from the chemist, played with if you wanted. We all grew up with schooling based on books and paper. At least my dad and I used cassette tapes, film cameras, clockwork oven timers, digital and analogue watches, push-button TVs and radios, microscopes, Meccano, and a world where going down the street left you completely uncontactable.

I don't have children, but if I did they would grow up in a world where the only device is a computer, the computer works by magic and is not repairable. By that I mean all the light and sound bleeping toys of the 1980s, audio tape players, CD players, VCRs, film cameras, timers, are all subsumed into computers. Drawing is a thing you do on a cheap tablet, constructing is something you do on Minecraft, research is something you do on Google, and you always have cellphone signal and there's always a world on the other end of it never a ring and no answer. Games are computerised, drones exist mostly to bring a video image back to a smartphone, everything is or has a camera, all storage for audio, video, pictures is digital and copious and portable, all communication is wireless, cellular and ubiquitous. And we grew up in a world where talking to someone else outside the local area was rare - a phonecall was a reasonably expensive luxury, and you would only phone people you knew or companies.

Your son watches star wars, but he doesn't exist in a world where if he misses Star Wars at the movie theatre he has literally no way to watch it until it's out on tape. In the 1980s and 1990s only my cousins had a VCR and they only had a handful of films, many recorded from TV. Now films are everywhere - in second hand shops, on Amazon to be delivered next day for $2, on download sites, on YouTube, on NetFlix and Prime and Hulu; he lives in a world where Star Wars is roughly indistinguishable from any other moving picture available on a screen - that it's a movie isn't anything special. Recipes no longer come in books they come in Google results. Games no longer come on boards and cards they come on screens. They don't have to, but in our youths they almost couldn't. Now they do by default.

I say the biggest impact of computers to date is always on communication, which I understand was earlier in the USA than in the UK, but for me dates to 2000 exactly; that's the time when talking to other people outside the local area on forums and IRC became normal and commonplace, the time when downloading information took over from other forms of obtaining it. It didn't have to be fast, it only had to be unmetred and not disrupt others using the phoneline. After that, smartphones and always-on-data from circa 2012, always on became always on you.


> practical differences in youth would be non-computing and a generation or two earlier

I agree with that one, but that was not part of the discussion.

Maybe TVs existed in the '50s, but in the rural areas of Flanders, not a lot of people had one.

For me, the move from no electronics to electronics, seems bigger than the move from electronics to everything is computer. That you play a VCR tape or mp4 isn't that different. That you play a cassette or mp3 is also not that different. It's details compared to not being able to play anything at all.


Where (when) else could we be? If all these lives exist (/will exist/did exist), what does it mean to say that you are living a particular one? Who else could be living it?


I’m not sure how to think about it philosophically. Reminds me of the anthropic principle. But as a graphics / game programmer I am elated to have lived through the evolution from monochrome to 4K TrueColor. And beyond? Still unclear on whether 10 bit color is a real thing, and if it is why it isn’t a bigger deal with gamers.


Luck confirmed for me. I feel very grateful and at the same time sad about the advances I won't witness.


Neither. Throughout most of history that would have seemed to be the case, because you can't compare to the future, only the past.

And it's just as likely to be a misleading perception today as it would have been at any earlier time.


Two things to remember: barring a dark age, there will always be technological innovation, and the human population is at its peak, so we're more likely to make advancements.


I picture the peak when the moons of Jupiter are full with a thriving civilization. One trillion strong under one government. Building the ships that take humanity to other stars.

How small we will look to them.


I picture the peak when the entire universe is teeming with life, but looks empty to very simple creatures like us.


Seems awfully convenient that the universe would look identical to us in particular, as opposed to any other observer. Or being computronium.


It’s not the most exciting time. Many centuries ago you could discover new humans living a different way. We can’t really do that anymore, unfortunately.


You still can today but "leaving your bubble" is not common.


How many countries are there where people don't wear jeans?


Most exciting so far


Despite Kay's remark about Dijkstra, I'll quote him:

> In computer programming our basic building block has an associated time grain of less than a microsecond, but our program may take hours of computation time. I do not know of any other technology covering a ratio of 10^10 or more. ... This challenge, viz. the confrontation with the programming task, is so unique that this novel experience can teach us a lot about ourselves. It should deepen our understanding of the processes of design and creation, it should give us better control over the task of organizing our thoughts. If it did not do so, to my taste we should not deserve the computer at all! ("The Humble Programmer", EWD 340)

This. I think that the computer revolution would happen if programming teaches the rest of us more about our society and our mind, rather than computers and systems themselves.


Care to elaborate?

Sounds interesting.


I had a college professor who said CS actually stands for common sense. "If you don't develop a really good web of common sense, you will fail at computer science"

Often enough, the problem itself is easily solved. CS is all about how you approach the problem..


Seriously watch every Alan Kay (not the survivalist) video on youtube, you will not regret it. It's practically criminal that they have such low view counts.


Seriously, this video made my day and I am searching for more. I basically think of it as a "how computing stacks should be" talk. Also just launched GNU Smalltalk on the side and looking for a good tutorial, by the way.


Pharo [1] is probably the best place to start for Smalltalk in 2021

See also glamorous toolkit [2] for more immersive programming.

[1] https://pharo.org/

[2] https://gtoolkit.com/


I post this video from time to time, and sometimes it gets traction / discussion.

It's good to see that the way to really get it noticed is to present the contents in order as this guy has done.


> It's unlikely that one or more large companies will be able to capture the Internet as it's too big and people are going to be sophisticated enough to realize that a solution from one company is neither called for nor possible.

I guess even Alan Kay didn't predict Google and Facebook...


I guess he was still kinda right; Google and Facebook themselves are essentially multiple companies stitched into the big giants they are.


Computers are clunky objects that get in your way when you try to use them. When they disappear inside of objects, then we'll know they have arrived. They are starting to disappear inside of phones, TVs, appliances, voice assistants, cars, etc. But they have a ways to go.


That would just be mindless consumption. The computer revolution will be when everyone learns a little programming and accidental complexity (bullshit) is beaten back enough that that little bit is useful.

In other words, I think we've long had the means, but capitalist underconsumption and the accumulation of technical debt holds us back.


I agree with this. Everything should be programmable, interoperable. Simple APIs, introspection, etc. Have systems emerge out of simple interactions built by users. Have more users depend on that.

Instead, we get silos, to better capture value. Users are consumers, and need only ingest what we feed them, to better keep them in check.

I'm not blaming anyone specifically here. Even OOP and living cells tend to work like the cell: keep your riches inside a membrane, and control every external interaction on your terms if you want to have it your way. Only... to some extent, as there is an equilibrium. Bacteria have plasmids. We at least have HTTP, for now (but it's not enough anymore due to captchas, js, etc).


Programmers today are like scribes were in the middle ages. When 90% of the world knows basic programming, I think that we will have a profoundly different conception of both computation and society.


At the same time, not everyone became a blacksmith.


Both analogs bear aspects of truth since software has the characteristics of being both a machine (crafted precision tool), as well as being information itself (freely spread & adaptable). Its future is probably some superposition of both.


Similarly, "basic programming" is a spectrum that starts at simple Excel formulas.


Well, considering the literacy rate is only 86% in the world today, this seems like a very lofty goal.


This was 20 years ago. The computer revolution isn’t even being worked on.


This keeps getting down-voted, so let me clarify.

The problem was summed up nicely in Stephen Diehl's "Near Future of Programming Languages", which I've posted separately as another comment (http://dev.stephendiehl.com/nearfuture.pdf). Here's an important slide:

"Where will the next great programming language come from?

Academia? NO. No incentive to do engineering. Those that do are committing career suicide. Funding is drying up for fundamental research.

Industry? NO. Can’t fund anything that doesn’t have a return beyond a fiscal quarter. Incrementalism doesn’t move things forward.

Hobbyists? NO. No economic means. Modern implementations require multiple FTE and decades.

Will we just be stuck in a local maxima of Java for next 50 years?"

I've personally attempted to create a sort of Manhattan Project of getting some top minds together in an off-site location to work on it in earnest. (I'm not joking. In 2018, I found a small 13-room hotel near Nice, France and made an attempt to get a bunch of us sequestered there for a year. Alan Kay liked the idea, by the way.)

None of that is happening. The sad truth is that the computer revolution isn’t even being worked on.


> None of that is happening. The sad truth is that the computer revolution isn’t even being worked on.

in later talks alan kay would say "the best way to predict the future is to prevent it" and i generally feel the same way...

it seems most commercial interests are interested in refining what already is, rather than building things that are completely new (and who can blame them, it's really risky to do so)


There's a definite "Innovator's Dilemma" situation going on where the most anyone wants to backtrack is a few feet, and the solution is to backtrack a couple of miles.

The irony is that by not investing in this, we've been eating our seed corn. There's been a dearth of innovation for decades, and markets are desperate for growth but none is coming.


> There's been a dearth of innovation for decades and markets are desperate for growth but none is coming.

it's an interesting dilemma for sure... i think alan kay also said the innovations from xerox parc only cost a few tens of millions of dollars, with an economic impact of literally trillions...

then this book comes to mind as well: https://www.goodreads.com/book/show/6130130-my-years-with-xe...


Alan Kay's vision of computing sounds a lot like Ethereum. Self-sufficient objects with their own URLs communicating and interacting with each other to form larger units. He was convinced this was the only way to scale large complex software systems.


Sounds a lot like Minsky's vision of AI systems, btw.


Kay's proposition for OOP: an object should grow to become a full microservice (in today's parlance).

> 00:43:15 Every object should be a virtual server and have a URL and an IP.


Some of these have been posted, but for those that like transcripts with a bit more order than video content dumps, here they are (different versions, as this has been done through the years):

http://archive.cra.org/Activities/grand.challenges/kay.pdf

http://worrydream.com/refs/Kay%20-%20The%20Real%20Computer%2...

There's also a paper for those with access or a penchant for piracy

https://link.springer.com/chapter/10.100/978-3-322-89884-5_3

What I think we need is:

1) A global policy where computers are taught from scientific principles, with no tie-in to closed platforms or paradigms, throughout the education system

2) Ownership of data by data creators, with no platform tie-in, mandated standards for migration and access control

3) Engineers insisting on adherence to ethical standards for computing, such as doing no harm to humans. A case would be engineers actively participating in reviewing damage from tweaking algorithms for increased user engagement on crud platforms (time spent).




I feel like quantum computers will truly be the revolution in computing as computing with them will exceed the human brain capacity just as automobiles exceeded human physical capacities to move in the 19th century.


Computers exceeded human brains in a number of measures already back when they were a bunch of mechanical gears.


That's also when true automation will take place.


Can we build an independently acting, benevolent AI and have it run for president?

We could have it interact with every citizen and act accordingly.

(I joke, but it couldn't be WORSE than the mess humans created.. right?)


I guess I am too dumb to understand these high concept talks, especially from the past when words meant different things than today.

Why should every object be a server? How does slapping a TCP/IP parser onto every struct help anything? How does it relate to bacteria?

Yeah, OK, metaprogramming. The more a program uses metaprogramming, the less readable it usually is.

I don't know. I am not Alan Kay and I didn't create OOP and Smalltalk, so I guess I will stick to more practical things rather than philosophising about computer languages and how objects will save everything. (I like Smalltalk for what it is.)


> Why should every object be a server? How does slapping a TCP/IP parser onto every struct help anything? How does it relate to bacteria?

The point is that objects should be independent and communicate by messaging. The substrate doesn't really matter but both bacteria and computers on a network are examples of that scheme. The Actor model is the same idea developed for concurrency.
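A minimal sketch of that Actor idea in Python, using a thread and a queue as the mailbox; the class name and the stop-on-None convention are mine, not from any particular actor library:

```python
# Minimal actor sketch: the actor owns its state and a mailbox, and the
# only way to interact with it is an asynchronous message.
import queue
import threading

class SumActor:
    def __init__(self):
        self.mailbox = queue.Queue()
        self.total = 0
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:          # conventional stop message
                return
            self.total += msg        # state mutated only by the actor itself

    def send(self, msg):             # asynchronous: returns immediately
        self.mailbox.put(msg)

    def stop(self):
        self.mailbox.put(None)
        self._thread.join()

a = SumActor()
for n in (1, 2, 3):
    a.send(n)
a.stop()
print(a.total)                       # 6
```

Nothing outside the actor ever touches `total` directly; senders just drop messages in the mailbox, which is the same scheme whether the substrate is threads, processes, or machines on a network.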


alan kay is really interested in scaling

performance considerations aside, having every object (stateless or not) have an ip and run asynchronously (just like on the internet) allows you to scale out and replace the implementation at any time (just like on the internet)

if you are programming in the small (say c with structs) its hard to see the advantage of that, but if your programming system was "done right" you could have the same semantics and get both high performance and high scalability

one thing that alan kay says a lot is that the internet is basically the only system to run for decades and still have all its parts changed and yet never be taken down for maintenance (outside of living things, that is, which is where the biological comparison comes in)

every object having an ip means basically the same thing: you could have a system run for decades in a distributed fashion and never need to be taken down for maintenance...

that's the idea as far as i understand it


This is the follow-up reading:

http://dev.stephendiehl.com/nearfuture.pdf


> 00:43:15 Every object should be a virtual server and have a URL and an IP.

I see. Going overboard with microservices has a long pedigree.


> Arrogance in computer science is measured in nano-Dijkstras.

What a burn!


I think it's meant as a nod to Dijkstra's greatness, not a burn.


If we're an ancestor simulation inside a universe-sized computer, the computer revolution itself is just a simulation, too.



