Steve Jobs and the actually usable computer (2011) (w3.org)
110 points by lproven on May 3, 2023 | 92 comments



Butterfly effect is really something. Steve Jobs gets kicked out of Apple, builds NeXT as a form of revenge. Tim Berners-Lee buys a NeXT computer, finds it inspiring, and, thanks to NeXT's focus on a usable development environment, finds it easy to whip together a prototype of the WorldWideWeb. And so, even though none of us use NeXT computers today, thanks to one particularly important NeXT computer, all of us use the Web and hyperlinks today. And yes, I know that much of macOS and iOS has some lineage to NeXT. Still, this chain of events: Apple => NeXT => Tim Berners-Lee => WorldWideWeb is just bonkers for its world-wide ramifications.


The million dollar question, of course, is whether Tim Berners-Lee—or someone else—would have built a web browser anyway, or at least something very similar. We don't know the counterfactual.


There were tons of alternative hypertext systems before the world wide web.

Major reasons WWW succeeded:

1) Open standard, no license required. (My understanding is that TB-L worked hard to make this happen.)

2) A forgiving text-based format that was trivial to author, serve, and display, and that could evolve with forward/backwards compatibility. (See the sketch after this list.)

3) One-way links. (Many other HyperText projects were hung up on bi-directional links.)

4) URLs didn't require any centralized authority other than already-existing DNS.


5) 404s. Other systems went for total consistency. Good luck with that!
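
To illustrate point 2, here's a minimal sketch of an early exchange (the hostname and path are made up; HTTP/0.9 had no headers or status line, and the markup needed no closing tags to render):

    $ telnet info.example.org 80
    GET /hypertext/Overview.html

    <TITLE>Overview</TITLE>
    <H1>A simple page</H1>
    <P>Unclosed tags are fine; see also
    <A HREF=another.html>another page</A>.

The server just writes the raw file and closes the connection; any text editor could author it, and a broken page still mostly displays.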


When the www came out I used to get there through gopher. I preferred gopher until I was able to get an IP address. The IP address made it possible to get images with text. Previously there were separate downloads.

"Welcome to the World Wide Web. There is no top or bottom." That is what distinguished it from gopher.


yeah, off the top of my head I remember gopher, hytelnet, I think there was some screwy thing built on finger, even.

There was also an Amiga format that was getting picked up for documentation and game manuals/guides, but the name is escaping me.


> A forgiving text-based format that was trivial to author, serve, and display. And evolve with forward/backwards compatibility.

I'm not disagreeing, but that's certainly a less often heard characterization of SGML ;) and also, I'm not sure the evolution of the HTML vocabulary past 4.x, or lack thereof, supports this point, considering there are entire universes of additional syntax such as CSS, and philosophical schools of thought that exist only to avoid having to write plain markup attributes. The metaphor is also apt, since, like the universe, CSS syntax seems to expand faster than the speed of light.


You don't have to use all of that though, it's optional. A fairly basic but serviceable HTML document that looks decent is still pretty easy.
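
As a sketch of that, a complete, serviceable document (all names and content here are placeholders) can still be just a handful of lines:

    <!doctype html>
    <html lang=en>
    <title>My page</title>
    <style>body { max-width: 40em; margin: auto; font-family: sans-serif }</style>
    <h1>My page</h1>
    <p>A readable page with <a href=elsewhere.html>a link</a>, and no build step.

That is valid HTML; the head and body tags are optional and the parser fills them in.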


It probably hit just the right time in terms of available CPU power and network speed for a text-based format to get accepted.


The web could plausibly have existed as early as FTP did, which would have been 1972. Plenty of documents from that era had URL-like manual links of the form "pub/foo/bar.txt at MIT-AI", and many FTP servers supported anonymous login and were fast enough for real-time text document retrieval.

It is kind of embarrassing that it took 20 years to invent URLs and browsers.
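
In URL terms the translation would have been nearly mechanical (the scheme and hostname spelling here are just illustrative):

    pub/foo/bar.txt at MIT-AI   ->   ftp://mit-ai/pub/foo/bar.txt

All that was missing was agreeing on a notation and a client that made it clickable.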


I wrote a networked hypertext system in 1985 but it used a binary "page" format to save bandwidth. The GUI used was GEM.

Another road not taken could have been to define a rich text format in ASN.1 and build a browser on top of the OSI network stack.
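
For flavor, here's a hypothetical sketch of what a rich text format in ASN.1 might have looked like (my invention for illustration, not any actual historical proposal):

    -- Hypothetical hypertext document type in ASN.1 notation
    Document ::= SEQUENCE {
        title   IA5String,
        body    SEQUENCE OF Node
    }

    Node ::= CHOICE {
        text    IA5String,
        link    SEQUENCE {
            label   IA5String,
            target  IA5String   -- some document address
        }
    }

Encoded with BER this would have been compact on the wire, much like the binary "page" format above, at the cost of needing real encoder/decoder tooling on both ends.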


A PhD is needed to understand ASN.1 and … the OSI network stack. Good luck.


Not really any conceptual difference between ASN.1 and something like Cap'n Proto.


Not to knock the amazing contributions of Tim, but IMHO Gopher and HyperCard had already laid a lot of the important concepts down in various suboptimal ways. I think there were enough smart people using both in the 90s that it was virtually inevitable that something like Tim's work was bound to arise somewhere, somehow, as a merger of and improvement on those two sets of ideas.


Don't forget about Minitel

https://en.wikipedia.org/wiki/Minitel

> The service was rolled out experimentally on 15 July 1980 in Saint-Malo, France (...)

> From its early days, users could make online purchases, make train reservations, check stock prices, search the telephone directory, have a mail box, and chat in a similar way to what is now made possible by the World Wide Web.


Don’t forget about the Mother of All Demos [1] with Douglas Engelbart! Windows, hypertext, graphics, navigation and command input, video conferencing, the mouse, word processing, dynamic links, revision control, and real time collaboration; all that in 1968!

[1] https://en.wikipedia.org/wiki/The_Mother_of_All_Demos


So why didn't Engelbart invent HTML and HTTP in the 60s? At its core it's a very simple protocol and a minimal document format, comparable in complexity to systems that existed at the time. Computers back then were perfectly capable of implementing it (a sketch below shows how little is needed).

Some here have compared the WWW to the iPhone, but the iPhone literally couldn't have existed even a few years previously. With the Web there were no meaningful barriers to its development 20 years previously. If a more complex and clunky system had gained critical mass in its place, that's what we might have gotten stuck with for a very long time.
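
To make that concrete, here is a minimal sketch of an HTTP/0.9-style server, written in modern Python purely for illustration (the port and page content are made up):

    # HTTP/0.9-style exchange: read one request line, write raw HTML,
    # close the connection. No headers, no status line, no content types.
    import socket

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", 8080))
    srv.listen(1)

    while True:
        conn, _ = srv.accept()
        line = conn.recv(1024).decode("ascii", "replace")  # e.g. "GET /index.html"
        # (the requested path is ignored in this sketch; a real server
        # would map it to a file)
        conn.sendall(b"<h1>Hello</h1><a href=/other.html>a link</a>\n")
        conn.close()

A 1960s machine obviously wouldn't run Python, but the protocol itself is just this: one line in, one document out.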


It took Engelbart over a decade to build the mouse, and when he was done and gave a demo, Xerox hired his lab.

The barrier to the web was a global network and an IP address for each node. When the network was of sufficient size and people had access to IP addresses, the web happened. Before that we all just used the internet.


I'm not trash-talking Engelbart in any way, I'm just saying the minimalist web approach was not at all obvious. To many, many pioneers there were a lot of sophisticated features they thought would be absolutely essential to any pervasive hypertext system, that turned out not to be.


I wasn't sure. I thought it might be an eye-opening tidbit considering how many more years of development went into making it a commercial product. I can't remember if it was 14 or 17 years of development for Engelbart.

Another big hero of mine from that era who helped invent the internet and regretted the result was Wesley A. Clark.

https://en.wikipedia.org/wiki/Wesley_A._Clark


Rand Corp had a moonbase idea on paper two years before.


It was similar from the user point of view, but it was absolutely different from the creator/admin point of view. It was supposed to be a service provided by big cable/telecom companies, not something that everyone would have been able to create with. The whole architecture gives it away: Minitel is merely a display device for a telecom mainframe, paid per minute of connection. The average Joe was not supposed to create their own services/pages.


Perhaps their plan was to, once successful, create a Minitel AppStore and allow smaller developers ...


Given how big telecoms usually behave, I really don't think so. Remember the fight for usable mobile internet... Ever tried WAP and developing on it?

They set the whole thing up to control it and charge exorbitant fees for it, both to the user and to anyone publishing on it. Minitel as well as WAP. Apple's 30% is peanuts compared to the fees I'm talking about; I mean contracts with minimums in the millions.

Good thing Apple kicked their sand castle down. People don't give them enough credit for that.


People do seem to forget about things like Verizon forcing camera phone vendors to disable Bluetooth on their devices to prevent you from being able to transfer image files off the phone without going through Verizon and paying a per image fee.


This. When the web first came about, it seemed evolutionary to me, not revolutionary, because it built on those ideas.


And Xanadu. That had built-in editing and bidirectional hyperlinks. Never really took off. In many cases the free open source product beat the commercial product.


Well, yes, there was already an existing web concept out of the University of Minnesota called Gopher: https://en.m.wikipedia.org/wiki/Gopher_(protocol)

But it had many limitations WWW/HTTP did not, and was crippled by restrictive licensing. Still, it's reasonably likely Gopher would have been riffed on by a different open source hacker eventually, had WWW not been created.


As is typical, many of the pieces were primed to lead to the Web outcome, and Berners-Lee was the right person in the right place at the right time. It certainly could have been someone else. Just as another company would have done some variation (better or worse) of what Apple did with the iPhone (most of the pieces were ready to do it well and Jobs saw that).

The more intriguing question is how things might have occurred differently, if the beginnings were derived from a different set of factors & actors. It could have been a lot more obnoxiously commercial from the beginning (AOL & CompuServe style).


My belief has been that sure, someone would have created a device like the iPhone, but remembering just how consumer-hostile the phone manufacturers and carriers were at the time, it would have:

* required a carrier-specific App Store

* been just one of 20 models sold by the manufacturer

* gathered dust in their awful retail stores because the salespeople had no idea how to use it or incentive to sell it

Not only did Apple create the device, but they threw their entire weight behind it (and of course had a built-in fan base eager to try it), and committed to its success.

Even Google with Android was experimenting with multiple form factors, and would not have demonstrated that level of commitment to a single one.


The thing is nobody was even trying to create a smartphone platform remotely like the iPhone. The standard approach was crippled, cut-down mobile OSes with shared memory architectures and proprietary tooling stacks. There's no way to get from Symbian, Windows CE or the BlackBerry stack to a workstation-class, pre-emptively multitasking, multi-threaded full-stack OS.

Even with the iPhone right in front of their eyes it took Microsoft 2 years to even correctly identify the problem. The only reason Android managed to catch up was that they happened to have started from a Linux kernel base. The thing is, they'd done that for the developer tooling, not because of the platform's capabilities. What they were aiming for was an open source BlackBerry clone, because that was the dominant platform in market and mind share.

Given that it took Microsoft until 2012, 5 years after the launch of the iPhone, that's a reasonable lower bound for how long it would have taken. I think it's quite plausible the modern smartphone might not have existed until 2015 without Apple, possibly later. Bear in mind the only reason such powerful ARM cores have been developed since 2008 was the smartphone arms race.


Not necessarily.

BlackBerry was a thing, and the Sidekick had traction among kids. Apple hit it out of the park at a time when BlackBerry had hit the stagnation phase, the tech was good (but not great) for screens, Apple was sitting on a software framework that could be adapted to phones quickly, there was a struggling carrier willing to do anything to get traction, and Apple had stores full of happy iPod customers.


Don't forget it was preceded by the iPod, which had a momentum of popularity Apple could build on.

As well, the original iPhone was a very different device - in my opinion the platform didn't realize its potential until they opened it up to third party apps and the App Store.

I still think Palm squandered their decade-long head start, and I actually liked their UI so much better. What could have been if only they hadn't made such terrible business decisions...


3G's broadband capabilities, and AT&T's unlimited 3G data offer, are what I think made the iPhone a no-brainer.

Streaming music, using Google Maps, real-time accurate location data with GPS, and being able to browse most websites.

They had a sufficiently capable product available at just the right time to take advantage of newly rolled out mobile broadband networks, and in the US at least, coupled it with unlimited data so people would not hesitate to extensively use it and cultivate the new space.


The original iPhone was EDGE. It was genuinely pretty awful when not on Wi-Fi. It was just so incredibly far ahead of everything else that 2G internet was an acceptable trade-off. AT&T literally built out a weird patchwork of Wi-Fi hotspots to make it more bearable.


I know, I was referring to iPhone 3G, which is what I thought the person who responded to me was referring to with:

> in my opinion the platform didn't realize its potential until they opened it up to third party apps and the App Store.

I do not recall the iPhone in 2007 being that huge of a hit. It was obviously very promising, and making waves, but the iPhone 3G in the summer of 2008 was a no-brainer even to non-techies. I remember there being lines at Apple stores for months.


> I do not recall iPhone in 2007 being that huge of a hit.

I was working in mobile phone retail in 2007. It was absolutely an enormous hit, and was sold out everywhere. They couldn't ramp up production fast enough. Steve Jobs' original iPhone demo was a huge, huge hit: https://www.youtube.com/watch?v=VQKMoT-6XSg

Apple already had a huge, established, fanatical base of iPod users by 2007. The presentation went viral, and people couldn't wait to get their hands on it.


> but iPhone 3G summer of 2008 was a no brainer

It’s also important to mention that the 3G was way, way cheaper than the original.


I also think it was invaluable that the iPhone app system was heavily secured (app approvals/reviews/restrictions) and/or closed in the early years. Consumers were coming from a terrible Windows stretch where viruses plagued the mass of PC systems connected to the Internet. They mostly didn't have to worry about that while learning how to best utilize the iPhone in their day-to-day lives. Apple made several excellent pro-consumer choices that many other companies would not have (including squeezing the telecom companies; at the time, Apple & Co. were nowhere near the size they are now, and pushing around big telecom was no small feat).


I think you just described Symbian.


That is a very US-specific problem though.

It would have been easy to make a store for Java apps present on other platforms as well. It could even have been a Java app itself, or a homepage.


I'm actually trying to pick up the pre-personal-computer design ideas and design an analog-type computer as if the PC never happened. I believe it is these types of questions that drive the cycle of development.


Xanadu


I wonder how that would have worked if the Autodesk situation had been more gentle, e.g. if they had had xu88/Green/Classic working well enough for actual usage while they built out xu92/Gold.


not sure, but I think with AI's help to codify and code, we could build it now :)


I've been looking deeply at the existing Xanadu docs so I have a pretty good sense of how it all was supposed to work. Could I get a look at the stuff you did with the LLM? I want to see what it gets wrong as there is a lot of misleading data out there on Xanadu and Enfilades. (contact info in bio)


https://github.com/tudorw/Xanadonot

it's all on a public repo,

it's basically an unfiltered chat between me and GPT; someone more familiar tried some of the Python code and said it ran, so... would love to hear your feedback :)


A place… where nobody dared to go…


A million lights are dancing and there you are


now we are here...


Gopher was better than the www until the graphical web browser existed. The GUI browser had one technical requirement that gopher did not: the GUI browser required a node address on the network, or IP address.

One could access the www from gopher. The gateway page for the www stated:

"Welcome to the World Wide Web. There is no top or bottom," which is what distinguished it from gopher, which had a top and a bottom of its tree of links.


Apple should have just doubled down on HyperCard


Functionally speaking, the WWW is an implementation of HyperCard. They're very similar, again, functionally.


And ended up with something like Flash?


Interesting data point...

By 1988 Theo Gray, also in Urbana-Champaign, had built the notebook interface for Mathematica's release that summer, which is nowadays cloned, originally as IPython Notebooks and then generalized as Jupyter for various languages.

This is still popular in the Mathematica version as well, and can do many things HTML browsers demonstrated years later.


Maxima existed long before IPython.


Did Maxima have a notebook interface before Mathematica's appeared in 1988? IPython cloned that interface many years later (with credit given to Theo Gray's idea).


Maxima is the continuation of Macsyma.

https://en.m.wikipedia.org/wiki/Macsyma


Your comment prompted me to look into Macsyma, which led me to Maxima. What a great tool! I just finished the 10-minute tutorial.

https://mathblog.com/a-10-minute-tutorial-for-solving-math-p...


I know that, but did it have the notebook interface, like what people call Jupyter nowadays? Jupyter was based on IPython, which was explicitly described as honoring Mathematica, for Python.

I ask because I don't know if Macsyma had that... we had a university site license for Mathematica, so in my case I never had Macsyma to play with back then.


> The million dollar question

Don't forget about the 150 million dollar investment made by Microsoft that saved Apple.


This has been discussed many times here and elsewhere, but it was not an investment, but rather a $150 million purchase of Apple stock. The symbolism of it, as well as Microsoft's commitment to continue developing Office for Mac software for at least 5 years, helped save Apple, not any cash infusion.

https://www.zdnet.com/article/stop-the-lies-the-day-that-mic...


It also settled a long-standing lawsuit and involved a lot of cross-licensing.


That's a $1 question with "yes" as the answer; it wasn't so unique or hard that it couldn't have been done elsewhere (but true that we don't know alternative histories).


Who knows, we might have had something better than a networked document format masquerading as an application delivery platform. Something similar in spirit to 3270 or X/RDP, and programmable with a single language.


Everything looks like this in hindsight. You can trace any significant thing through a series of seemingly unlikely events.

You can do it with your own life and career if you wish. The larger things you can impact now make the smaller things you did earlier look more significant.

If it were something else that took off, we'd say "Wow, x, then y, then z happened and now we all do abc."


If memory serves right, DOOM was also developed on NeXT.



Mac OS has been the new NeXT since OS X and Rhapsody.


> the optical disks proved unreliable

Fun fact: this is what the "spinning beachball" cursor represents. The wait cursor looked like a magneto-optical disc in NeXTSTEP/Rhapsody/Cheetah/Puma because the original (1988) NeXT Cube booted from a single MO drive, with no HDD. It was redrawn into a glossy sphere in Jagwire.


What exactly is unreliable about an optical drive? I find them to be quite trusty so long as you don't move them while in use. Additionally, I have yet to hear of better cold digital storage than optical. Perhaps I'm wrong?


These were magneto-optical drives, not CD drives. Also, the entire machine booted off it and wrote files to it; no hard drive in the first models.


Jagwire... <3 Thanks, I needed to hear that again.


I have just finished restoring a NeXTcube and was surprised how modern the environment feels: networking, interface, Unix tooling, ... To get to know the environment I compiled the original httpd server and managed to get it building after only tweaking the makefile a bit, very similar to what is still sometimes needed on Linux.

The usability of the NeXT computers also had a lasting impact on interactive music applications with the introduction of MAX at IRCAM:

https://0110.be/posts/Electronic_Music_and_the_NeXTcube_-_Ru...


Cool work! I wonder how hard it would be to compile the 1st browser/editor[1]. I have seen a few source code repositories with its code[2], but I'm not sure if it's the complete source or not.

[1] https://en.wikipedia.org/wiki/WorldWideWeb [2] https://browsers.evolt.org/browsers/archive/worldwideweb/NeX...


I have tried to compile the browser/editor, the worldwideweb.app, as well, and failed, indeed due to having incomplete sources/dependencies. I did not look into it too much, though, with my main goal being to get music software running.


> NeXTcube and was surprised how modern the environment feels

Yeah, I started with a NeXTcube ~30 years ago (after Apple ][ and Amiga), and the computer experience itself hasn't really changed materially. Which is stunning, in a way. And sad.


I wish the person at Apple responsible for disappearing, miniature, hidden scroll bars and for adding useless "features" would read this. I know, there's a setting...


For some things there isn't even a plist setting.

It's a sad turn of events that people were paid to make an OS less usable.


And the most stupid thing of all is that they tried to reinvent the filesystem when they developed iOS.


I don't know if you're being sarcastic, but it's arguably one of the reasons for the success of iOS (and of Android's copy). Most people struggle with multiple levels of hierarchy.


Getting rid of the file hierarchy was like saying "Well, most people read at a sixth-grade level, let's publish the New York Times as picture books." Now people are even more siloed between who actually knows what's going on with technology and who doesn't, instead of it lifting everyone up onto an equal plane. I'm starting to think of that period of time when the kid knew more than the parent about the computer as a goldilocks zone, and it's all downhill from here: we've gone from scary CLI to safe GUI to bubble-wrapped mobile to, what, probably a VR ad machine where all you need to learn is how to put it on your head and open your eyes. I'm having to teach undergrads what files even are before we can even think about writing code, and these are the ones taking computer classes.


I think we risk stunting the development of the next generation if we insist that they use the metaphors that made sense to an earlier generation. I say we stand back, let the next generation figure out their own ways of doing things, and expect to be impressed at what they accomplish. If nothing else, being optimistic like that is more appealing to me than lamenting a supposed dumbing down as innumerable generations before us have done.

Edit: This is one reason why I as a forty-something will probably not attempt to teach any of my nieces and nephew (8 years old and under) to program. The generation gap between me and them is too big.


"we" will never stand back, so if you do, someone (potentially much worse) fill happily fill the void, there is no way for any generation to stay insulated and magically figure it out on their own (also the same innumerable generations that did the lamenting practiced the optimism)


I think you’re right and it’s sad. There could’ve been a way to gradually disclose complexity that ended up inviting and educating the masses.

OTOH, I know seniors managing huge WhatsApp groups, installing apps, upgrading OSs, migrating backups without blinking. Those same people couldn’t install a single app on Windows if their life depended on it.

We could’ve done better, but it’s unquestionable that there’s been some democratization of essential tech.


You have to wonder if the democratization of this tech would have happened anyhow even with a more complicated OS. I like to think it was inevitable. We like to think we as consumers have choices that lead to things reflecting our wants, when really our wants reflect what the market is able to make available as a potential choice to us consumers. The inevitable smartphone would have been in everyone's pocket I expect, just like how the inevitable flip phone was in everyone's pocket, just like how the inevitable everything else ended up in everyone's home and life: it was there in the store when the advertisements told them it was time for them to buy.


The hardware was inevitable. The software, not so much.

Apple made a lot of compromises, constraining a Unix system™ to a huge degree in order to make it extremely simple and usable battery/RAM-wise. Those compromises had costs.


> “WorldWideWeb.app”

That sounds so funny in today's time.


"Designing the app’s menus was trivial — just drag and drop with InterfaceBuilder. The code framework of the app was generated automatically. "

User Interface development has gotten more complicated since those early days.


"Steve was a champion of usable technology – even sexy technology. Intuitive on the outside and extensible and cool engineering on the inside. "

There are still some examples of this in Apple's current products and OSes, but I wonder how many more "intuitive" and "extensible and cool engineering" choices we'd see in their products today if he were still alive.

Apple's trajectory over the last decade has been simultaneously impressive and depressing to watch. I miss Steve.


I'd forgotten all about that gizmo. Under the heading Steve Jobs in my memory, it got completely replaced by "the computer for the rest of them", with a GUI tuned to hide the computer, to prevent users from being tempted to use it as a way to learn to program computers.

The NeXT was competing against Sun, priced accordingly, and novel for being weird.


There have been programming environments on iOS since 2011 with Codea. I started programming on my iPad using Pythonista in 2012.



