“There Is No Reason for Any Individual to Have a Computer in Their Home” (quoteinvestigator.com)
139 points by sohkamyung on Sept 15, 2017 | 141 comments



> To understand the mindset of this period it is important to recognize the distinction between a computer terminal and a free-standing computer. Some experts believed that individuals would have terminals at home that communicated with powerful remote computers providing utility-like services for information and interaction. These experts believed that an isolated computer at home would be too under-powered to be worthwhile.

Considering how the majority of the general population's use of "computers" is really interacting with remote and cloud services (Facebook, Netflix, etc) this is a valid argument today. While today's equivalent of terminals (phones, tablets, and even PCs) have a significant amount of local computing ability, for the most part they really are just used as fancy terminals. Gaming is maybe the one mainstream application where local compute is important.

The other quote that stuck out to me was this:

> “The personal computer will fall flat on its face in business because users want to share files and want more than one user on the system,”


>Gaming is maybe the one mainstream application where local compute is important.

There are a ton of things our smartphones and tablets do that rely on very high levels of local computing power.

Biometric authentication such as face and fingerprint recognition is a heavily compute intensive application. It needs to be done locally in order to be secure. Face ID is just nuts.

Modern smartphone photography is heavily compute-intensive, from HDR to optical zoom to image editing and markup. Just look at what Apple is doing with portrait mode and lighting effects. I love using time-lapse and slow-motion video modes. What would instagram and snapchat be without locally applied filters?

Note taking, address book and time management. The notes apps on modern mobile devices are multi-media wonders with integrated text, image and handwriting recognition built-in. Calendar management, alarms and notifications all rely on local processing even if they do make use of cloud services. Without local smarts those services would lose a huge chunk of their utility.

Document and media management. I read eBooks and listen to podcasts. My ebook reader has dynamic font selection, text size adjustment and will even read my books to me. Managing media on the device is essential as contemporary networks are still nowhere near good enough to stream everything all the time. My podcast app has sound modulation and processing options built in to tailor the sound to my tastes and needs, including speed up, voice level adjustment and dead-air 'vocal pause' elimination in real-time, all on-device and adjustable at my fingertips. That's serious sound studio level stuff in my pocket.

Playing digital video files. Even ones downloaded or streamed. So what if we've had it since the 90s. It's still proper computation, especially with advanced modern codecs. VOIP and video calling even between continents have also become everyday and absolutely rely on powerful local number crunching.

These things have become so everyday that we hardly notice most of them, but without serious on-device processing power, some of which would have been beyond $10k workstations just 15 years ago, none of this would be possible.
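As a rough sketch of the kind of on-device audio work being described, here is a minimal dead-air trimmer in Python, assuming numpy is available; the window size and threshold are arbitrary placeholder values, not anything a particular podcast app actually uses:

    import numpy as np

    def trim_dead_air(samples, rate=44100, window_ms=50, threshold=0.01):
        """Drop near-silent windows; a crude stand-in for 'vocal pause' elimination."""
        win = int(rate * window_ms / 1000)
        kept = []
        for start in range(0, len(samples) - win + 1, win):
            chunk = samples[start:start + win]
            rms = np.sqrt(np.mean(chunk.astype(np.float64) ** 2))
            if rms >= threshold:
                kept.append(chunk)
        return np.concatenate(kept) if kept else samples[:0]

    # One second of 440 Hz tone followed by one second of silence.
    audio = np.concatenate([np.sin(np.linspace(0, 2 * np.pi * 440, 44100)),
                            np.zeros(44100)])
    print(len(audio), len(trim_dead_air(audio)))  # the silent half is dropped

A real app does this on a streaming buffer rather than a whole file, but the core idea is the same window-by-window level check.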


> My podcast app has sound modulation and processing options built in to tailor the sound to my tastes and needs, including speed up, voice level adjustment and dead-air 'vocal pause' elimination in real-time, all on-device and adjustable at my fingertips.

Everything short of the vocal pause stuff an old Sansa will do for you, running a tiny embedded C based runtime.

> These things have become so everyday that we hardly notice most of them, but without serious on-device processing power, some of which would have been beyond $10k workstations just 15 years ago, none of this would be possible.

The modern photo and video stuff is super cool and uses a ton of processing power. Likewise for high-quality video streaming: decompressing is hard work, though it's often offloaded to HW these days.

Everything else my desktop of 15 years ago handled just fine, with 256MB of RAM.

Heck my smartphone in 2006 with 64MB of RAM did most of this stuff.

The one thing that has remained constant across a decade of smart phone development is the Pandora app getting forced closed all the time.


"Everything else my desktop of 15 years ago handled just fine, with 256MB of RAM."

I hear this sort of argument a lot, but I never hear of anyone actually doing it, that is, working on a 15-year-old computer for a while. I imagine with Linux it might not be too bad.


It's because software accrues bloat on all layers of the stack, so that even as functionality stays the same, performance requirements increase pretty much in lockstep with hardware power. And since each piece of software tends to have its own format, and/or is tied to its own cloud, it's harder and harder to exchange data between old and new programs. So people keep upgrading.

If you look at places where there isn't much pressure, you'll see plenty of old hardware and software doing just fine. For example, many shops I frequent are run on DOS-era programs, which are 100% totally fine for the job, and probably even better than modern replacements (less clicking, more typing = more efficient to use).



> I hear this sort of argument a lot, but I never hear of anyone actually doing it, that is, working on a 15-year-old computer for a while. I imagine with Linux it might not be too bad.

The Internet has made this infeasible. Everything has moved online, and 90% of online resources have grown very bloated, though Wikipedia would probably still work well.

Security is another problem. Windows XP would work fine if you can find 15-year-old hardware that still functions (hardware rot is a real thing), but the number of zero-days out there is far too high.

All that said, I wonder if any Shoutcast streams still exist. I'm using YouTube Live for the same thing nowadays (orders of magnitude more CPU being used!).

Coding would work fine, except the old IDEs don't support newer C++ standards, and I'm accustomed to my niceties in life; but if you can ignore that, life would be fine for writing code. Modern web platforms wouldn't work that well: writing the code would be OK, but the resulting webapps wouldn't run that well.


> All that said, I wonder if any Shoutcast streams still exist. I'm using YouTube Live for the same thing nowadays (orders of magnitude more CPU being used!).

Shoutcast is still around, at least as a protocol via Icecast. I dunno if it's still being used for widespread public streams, but I know people still use MPD + Icecast to stream their personal music collections.
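For the curious, the ICY side of the protocol is simple enough to poke at by hand. A minimal Python sketch, assuming you have a reachable Icecast mountpoint (the URL below is a hypothetical placeholder):

    import urllib.request

    STREAM_URL = "http://example.com:8000/stream"  # hypothetical mountpoint

    # Ask the server to interleave ICY metadata into the audio stream.
    req = urllib.request.Request(STREAM_URL, headers={"Icy-MetaData": "1"})
    with urllib.request.urlopen(req) as resp:
        # icy-metaint is the number of audio bytes between metadata blocks.
        metaint = int(resp.headers.get("icy-metaint") or 0)
        print("station:", resp.headers.get("icy-name"))
        print("bitrate:", resp.headers.get("icy-br"), "kbps")
        if metaint:
            resp.read(metaint)               # skip one block of raw audio
            meta_len = resp.read(1)[0] * 16  # length byte counts 16-byte units
            metadata = resp.read(meta_len).rstrip(b"\x00")
            print("now playing:", metadata.decode("utf-8", errors="replace"))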


Most of the FM stations in my city (in Argentina) have a shoutcast stream to listen online.


Shoutcast is the preferred method for both live performances and recorded music in Open Sim type virtual worlds, since it's supported in the viewers, although I often copy the stream url into Winamp.


I was running a Windows XP system from 2007 at home until a few months ago. The push to upgrade finally came from not having a reliable PDF reader for some PDFs, and the fact that the last browser still getting security upgrades for XP, Firefox, was no longer going to be supported this September. A side benefit to upgrading is that now I can buy a modern color laser printer and get out from under the expensive HP ink nightmare.

The computer is now running Linux Mint just fine.


I'm still using the rig I built in 2006, though I have maxed out the RAM (32GB) and upgraded to SSDs since then. It works fine for what I use it for: writing code, listening to music, surfing the web, image editing in GIMP, and drawing with QCad or Inkscape. I often do these things simultaneously with up to 100 open browser tabs. I have no reason to believe I won't still be using it 5 years from now.


Not quite 15 years, but I'm happily using an 11 yr old laptop. When I need something beefier, I turn to my 9 yr old desktop. And, you're correct, I am running GNU/Linux on both.


I bought a Thinkpad from 2011, because I can't see myself wanting more than that from a laptop.

The only place it is lacking is the display: 1366x768 feels pretty low-res these days. Video playback has some minor screen-splitting, too.

Still, for browsing the web, watching <=1080p video, and hacking on some software projects, this old thing is more than capable.

> I imagine with Linux it might not be too bad.

Of course, that's a requirement. ;) NixOS is near perfection.


A quick look shows that in 2002, we were using Pentium 4s and Athlon XPs, single core, around 1500-2000 MHz.

Until I could afford to buy a PC, that's the class of hardware I used.

Just pick your favorite Linux distro, and they will run like new.

It's also worth noting that Arch Linux started in 2002. That's the distro I settled on back then (NixOS wasn't around yet ;).


Lubuntu runs in 112MB of RAM for me and I can use Dillo to browse Wikipedia and other well-made sites just dandy ^.^


I try to make my Desktops last for 10 years before replacement, but I've been known to replace components along the way.

Probably wouldn't be good for a hard core gamer, but for Quickbooks, Email, and MS Word, I seem to be surviving.


> Everything short of the vocal pause stuff an old Sansa will do for you, running a tiny embedded C based runtime.

Great. So instead of using the increased compute power on my smartphone, I can do (almost) as much by just carrying another device around.


You are kinda missing OP's point.

It is true that today's phones are several orders of magnitude more powerful than those of 50 years ago, but OP's point is that we are kinda still in a paradigm where a lot of the computing we rely on happens on remote servers.

So, while the computing power of the 80s is readily accessible to us, our needs have grown far beyond that and are only serviceable using remote servers.

Take Google Search itself for example.


I'm not missing any point.

>...this is a valid argument today. While today's equivalent of terminals (phones, tablets, and even PCs) have a significant amount of local computing ability, for the most part they really are just used as fancy terminals.

The argument that we don't need or use local computing power is not valid. The nuance you are imagining isn't there. Of course networked services are incredibly valuable. But most of them would be useless without local client software and there are plenty of things we use mobile devices for now where online services are just a convenient adjunct to powerful local applications.

'For the most part' almost every time I use my phone I unlock it using locally running fingerprint recognition software and interact with a high performance animated graphical user interface, regardless of what else I do with it. You can't wish these things away as irrelevant just because you've got used to them.


> The argument that we don't need or use local computing power is not valid. The nuance you are imagining isn't there. Of course networked services are incredibly valuable. But most of them would be useless without local client software and there are plenty of things we use mobile devices for now where online services are just a convenient adjunct to powerful local applications.

But the original argument is the reverse of that: they weren't arguing that local computing power was unnecessary, but that it would be insufficient, because what people really needed computers for was to connect and do complicated things. The fact that you use more processing power than they envisaged being possible on a microcomputer to do something as trivial as unlocking your phone is beside the point of the "computer terminals do much more interesting stuff when they're connected to each other" argument that was actually being made.

I remember a time when that wasn't true, and I could create and play with lots of interesting stuff on a computer with the hard disk space of a thirty-second YouTube clip, one which didn't talk to other computers except reluctantly by floppy disk. Quite possibly you do too, and arguably that makes certain seventies computer magnates look silly. But even now, when computers are several orders of magnitude more powerful and offer much better games, somewhat better graphics editing and frankly barely-improved word processing, most people use most computers to interact with other people or services or experiences online, and would have a lot less use for them otherwise.


>But the original argument is the reverse of that: they weren't arguing that local computing power was unnecessary..

It was literally the justification given for killing projects to develop home computers.


Today's GUI is yesterday's terminal


I am still sort of baffled by this trend. In my pocket, I have a device with capabilities of a supercomputer from not that many years ago. So do many other people.

Yet, with all this, we're slowly reverting to the days of the dumb terminal. The concept of data ownership is all but dead. Even trivial things, like word processors, are being moved to the cloud and more things are requiring connectivity.


But then a huge chunk of the smarts in e.g. Google Docs is in the JavaScript you're executing locally. Just because you download the code at runtime doesn't change the fact that your device is doing a ton of local computation to make that application work. A lot of web apps like that are actually local JavaScript apps with a relatively dumb web storage backend. Then again, I mainly use Google Docs via the native app on my iPad.


Yeah, there is still a big difference between JavaScript and an installed application. I agree that they are functionally similar, but they are philosophically different and a paradigm shift from the brief moment where computer users nominally owned their data and had control of their platform.

They still can, of course. You can get any number of word processors to install and run locally. However, a smaller percentage are doing so. For better or worse, they are putting their data, the output of their labor, onto computers that don't belong to them. No, they don't have to, but many do so. The numbers for something like Office 365 are amazing. It's great business for the cloud providers, but is it really good for the end user?

Then again, does the end user really give a damn? Evidence suggests not. I'm not sure if they are making informed choices, however. With the increased censorship going on, I'm a bit concerned with the trend.

Meh... Just some thoughts on the subject. Even if it is a problem, I have no idea how to fix it. I'm not even sure it's a problem.


I feel like this is somewhat cyclical where thick and thin clients are variously in vogue.


More like the computing world has rebooted itself 1-2 times.

First there were institutional computers, the big irons (or dinosaurs as the web kiddies love to call them these days).

Then came the microcomputer, and it spread computing to people who had barely heard of it. This largely rebooted the thinking about computing.

Then came the smartphone, and similarly it brought in a whole mass of people who had minimal interest in the earlier iterations. And thus we seem to have another reboot going.

The reboot is as much mental and social as it is technical btw. There is a technical layer to it sure, as there has to be some amount of hardware and software to draw people in. But what really reboots things is when the number of people that come in overwhelm those that are already in, and thus the newcomers have time to form their own mental and social models of how things should be.


Years ago, when I had a VAX to work with, the terminal was happy with a user name and password. Your supercomputer phone might have classy iris-scanning stuff, but it is merely unlocking the terminal. Okay, there is no remote authentication to it, but it is the same conceptually.

Everything else, apart from things like the clock, is client-server.


While waiting for a better network in every part of the globe, we do need local power like you described. But every one of your examples could be implemented remotely, and would be more powerful if implemented remotely. We have all the tech for faster and better bandwidth; it's just a financial implementation problem (i.e. a "some dudes need to have their bank accounts happy" kind of problem).


> It needs to be done locally in order to be secure

You know that. I know that. Technical personnel generally know that.

"The suits", however, will find some way to profit from collecting biometric data, at which point "appealing to security" will cease to be relevant (to them).


What podcast app do you use?


The Apple ][ reference manual, when read from today's perspective, seems almost as out of touch, but in the opposite direction. From the first lines in the Introduction:

http://www.classiccmp.org/cini/pdf/Apple/Apple%20II%20Refere...

"Like the Apple itself, this book is a tool. As with all tools, you should know a little about it before you start to use it. This book will not teach you how to program. It is a book of facts, not methods. If you have just unpacked your Apple, or you do not know how to program in any of the languages available for it, then before you continue with this book, read one of the other manuals accompanying your Apple."

It goes on to list the "Apple II Basic Programming Manual" and "The Applesoft Tutorial".

I recently helped my mom purchase a MacBook for personal use. She's retired now, but she used to do some work as a programmer in the COBOL era. She was deeply annoyed when I told her not to worry that the MacBook didn't come with a user manual and to just Google for things online. I'm about 4 years into my current MacBook, so I'd forgotten what documentation, if any, came packed with it. I was surprised how hard it was to find a user manual on Apple's website, and even then, it was fairly skimpy, and not particularly accurate -- the book was described as being for the non-Touch Bar version of the MBP, but it still had a Touch Bar section confusingly plopped into the middle of the other MBP topics, including references to ports that don't exist on the lower-end 2-USB-port model.

Even today it's pretty controversial to push coding literacy (not necessarily because coding isn't seen as valuable, but in terms of curriculum priority). Back then, to even understand the Apple ][ user manual, you were told in its introduction to first read up on how to program in BASIC. I can imagine why a computer expert back then would be skeptical of the value of personal computing.


Back when people bought software on physical media [pre-1999, basically], all decent software came with physical documentation. Games might come with a center-stapled 32 page manual; Turbo Pascal for Windows came with a multi-book reference set. For more than a decade, IBM PCs and their successors came with gray cloth-covered three-ring binders full of manuals and references.

It was a different era.


When I bought Logic 7 around 2006, it came with several manuals and reference books totalling IIRC about 2000 pages. Now there's no boxed version to buy.


I still recall getting a flight simulator I had ordered right before we went on a family holiday, and I brought the documentation with me.

Not only was the actual manual of the game thick and detailed, but the company had also produced a document on the history of the plane and how it was operated during its years in service, complete with diagrams, pictures, and perhaps even anecdotes from surviving aircrews.

All this for a game that modeled the world in flat shaded triangles on an A500.


It was indeed a different era.

Nowadays, people are editing photos, videos, recording podcasts, publishing their own books, all using tools that allow for more complexity than what any professional could do a couple decades ago, all without even having to read a single piece of documentation!


Not true at all. To do the things you mentioned, they all have to "read" plenty of "documentation", usually in the form of third-party text/video tutorials, or even training courses. In a way, these are all worse than what the official manuals used to be.

What changed, unfortunately, is that the industry no longer seems to expect people to actually learn anything. In other professions, or even with things like driving, you're expected to learn and train before you can use a tool effectively. In computing, we somehow started to believe that's wrong, and thus we're making ever dumber and more featureless tools instead of telling people to suck it up and spend 30 minutes reading the included manual...


> they all have to "read" plenty of "documentation", usually in the form of third-party text/video tutorials, or even training courses.

That's an optional step, of course. But it is far from required. I remember for instance my brother downloading a copy of Audacity and instantly starting to record his voice and putting together radio shows.

Of course, at some point, when he wanted to do complex stuff, he watched YouTube videos etc. But at this point he had already created many hours of content.

And I don't think watching a 10 minute YouTube video of someone showing how to perform X is worse than reading a user manual. People learn by watching much better than they do by reading a manual - especially since documentation is rarely the first concern of most companies, and is therefore often lacking in thoroughness, poorly written/translated, contains very few pictures/diagrams if any, etc.


> Of course, at some point, when he wanted to do complex stuff, he watched YouTube videos etc. But at this point he had already created many hours of content.

He presumably had previous experience with operating other GUI programs, and experience with standard computer concepts like programs and files. The big benefit here is that a lot of GUI paradigms are shared between very different kinds of programs, so something that would've been explained in text in a bound manual in the past gets covered very early on, either through experimentation, or someone explaining it.

> People learn by watching much better than they do by reading a manual

Depends. Is it something highly visual? Or is there an intuitive visualization of a concept that I need a high-level understanding of? Then sure, a video illustration is invaluable.

Video was wonderful for learning to disassemble the trunk of my car to install a backup camera, replace my garbage disposal, and (back in the day) get an intuitive grasp of how sorting operations work.

I hate videos for programming tutorials. They're slow, awkward, difficult to jump around easily, use a boatload of bandwidth, and are a hassle to save for later.


> He presumably had previous experience with operating other GUI programs, and experience with standard computer concepts like programs and files. The big benefit here is that a lot of GUI paradigms are shared between very different kinds of programs, so something that would've been explained in text in a bound manual in the past gets covered very early on, either through experimentation, or someone explaining it.

Exactly! Just like reading documentation takes a bunch of knowledge as granted that most (but not all!) of the population possesses (reading, understanding diagrams, graphs, step by step assembly directions, etc.) You've put your finger on the wonderful thing that computer interaction has given us over the past decade or two - a shared visual/interactive language that, while far from perfect, lets the average teenager pick up Audacity and be productive with it within seconds.


> People learn by watching much better than they do by reading a manual

Except so many video tutorials aren't "to achieve x, do y and z"; 90% of the running time is bloat. And that's not even counting the apparently mandatory intros and outros which talk about what is about to happen, or what just happened. I learn very little on YouTube because I'm constantly distracted by how people misuse language. There's a whole lot of "content", sure, but precious little contained in it relative to the bloat and the repetition.

> especially since documentation is rarely the first concern of most companies

That good documentation is less of a focus for "most" (do they have a clue or anything else to offer, or do they just come in large numbers?) companies is hardly an argument that worse documentation is actually better documentation.


I wonder how many times I have sworn at someone slapping up a 30-minute video of their mug talking when the same information could have been delivered in a few lines of text. All because they want those tiny, tiny YouTube ad impressions.

This is why well received books rarely translate well into movies btw (at least if you try for one book, one movie).


That seems to be the "for dummies" version :-). It does give a listing for the ROMs and tells you a lot about the hardware, but doesn't go into detail about the PCB layout.

The 1978 version (digitally recreated in all its glory at http://www.classiccmp.org/cini/pdf/Apple/Apple%20II%20(Redbo...) gives PCB layout, timing diagrams, etc.

If you have that, you can build an Apple II.


I don't know about this analogy. With websites, you're running the software locally. You just happened to download the software right before running it. Your phone's web browser is hardly "just a terminal," and in fact for most websites it probably requires significantly more processing power per user on the front end to render the site than it does on the backend to serve the data.

Netflix might be a slightly better analogy, since the client can naively be thought of as a dumb decoder just outputting a stream of data as it receives it.


A web browser is similar to the old-style 3270 terminals too.

All the mainframe server did was send bytes down to the terminal (well, terminal controller), the terminal itself would do all of the display formatting including field validation, then would send the user's data back up to the server.

It's not quite the same since a web browser can run arbitrary code, but the browser is still used to display data to the user and return user's input to the server, while the back-end server does data storage and processing -- for example you wouldn't implement Google's search engine on client-side alone, all of the heavy lifting is done by Google's servers.
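To make the split concrete, here is a toy sketch in Python rather than the actual 3270 data stream, with made-up field names: the server ships field definitions, the client does the formatting and validation locally, and only the field values travel back up.

    screen = {  # what the server would send down to the terminal/controller
        "title": "New order",
        "fields": [
            {"name": "customer_id", "numeric": True,  "max_len": 8},
            {"name": "notes",       "numeric": False, "max_len": 40},
        ],
    }

    def client_fill(screen, user_input):
        """Client-side formatting/validation, analogous to 3270 field attributes."""
        out = {}
        for field in screen["fields"]:
            value = user_input.get(field["name"], "")
            if field["numeric"] and not value.isdigit():
                raise ValueError(field["name"] + " must be numeric")
            out[field["name"]] = value[:field["max_len"]]
        return out  # only this goes back up to the server

    print(client_fill(screen, {"customer_id": "12345", "notes": "rush delivery"}))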


Technically yes a lot of processing is done locally. But if you always need access to a service (i.e. a "powerful remote computer") to access your files, and if your device becomes useless without a connection to these services, then it kind of becomes a dumb terminal "philosophically"


But our devices aren't less powerful when offline than they used to be. It's just that networks are better and more ubiquitous, so of course our devices spend more time on networked things like social networking, messaging, browsing memes, etc.

I don't understand this conflation between "we can now almost always be connected to a worldwide computer network" and "now our devices are useless without a network and are now just dumb terminals." To me, the one valid part of this analogy is when applications are artificially reliant on a network connection, like single player games that use the network for DRM, or a web-based image editor whose developers didn't bother supporting offline capabilities. That definitely happens, but it's probably a tiny portion of modern device usage.


In an absolute sense, our devices are more powerful offline than they used to be. But so much of what many people use their devices to do, and so much of the time they spend doing it, are internet-dependent. So in a relative sense, being offline is more crippling in 2017 than it was in 1997.

> That definitely happens, but it's probably a tiny portion of modern device usage.

2/3 of my games are through Steam, and offline play has some complicated restrictions that I've never been bothered to remember. So much software documentation is only available online (especially considering corpora of example code, like StackOverflow). Acquiring new software is often only possible through the Internet. These are things that would've been handled by a visit to the store, or by reading the included documentation. The culture around those things has passed those methods by, for the most part.


I think the only thing the quote gets wrong is the relative balance of power, as the assumption was that remote mainframes would always, due to size constraints, be more powerful than the terminals used to access them.

Of course, the true power imbalance is connectivity, as it has always been. Using the internet is the primary reason most people own computers, and in that sense, our web browser has become our remote terminal. The computing tasks run on most sites don't require much more power than you could technically run locally, but the undeniable utility of connecting with other users makes the connection so valuable that the computer is, for most users, entirely useless without the internet.


I have sometimes wondered whether what made the "web" special compared to the earlier BBS and leased-terminal attempts is that we can have multiple "connections" up at the same time.

If the C64 back then could have brought up 5+ BBSs/terminals at once for various services, we could basically be looking at the same as a browser with 5+ tabs open to various sites/services.


This is very task-specific.

For some reason a lot of people seem to find it hard to imagine that there could be a mid-way system between the extremes of client/server - and that in fact the hybrid system is what we have today, where there's a genuine mix of local processing and storage for some tasks, with remote processing and storage for others.

More, that the app economy is very much this hybrid, where users buy specific tools to do specific jobs locally, instead of running everything remotely, or independently programming applications from the ground up on their own hardware.

Browsers are almost a side issue, because browsers are like a terminal, but only for a standard selection of popular applications - to some extent the lowest-common-denominator jobs that have the most utility for the widest demographics.

Other jobs and applications still don't fit into the browser, and likely never will - even when there's a benefit to being able to collaborate. (E.g. I can't make music in my browser. If I want to collaborate I still have to send files over the Internet and wait for someone to edit them. Or I can use a mediocre collaboration system built into a DAW. But technically and socially, there are too many limitations to the in-browser experience to expect that live collaboration will become a thing soon.)


In the 99%+ case for web apps, you're running only part of the software locally, and without the remote component the local part is useless.

The distinction matters, especially when it comes to tasks like home automation - Nest alone is sufficient to exemplify the intractable problems that result from failing to design systems which perform purely local tasks in such a way that they require only local resources to do so. But Nest is far from alone in having so failed.


> you're running only part of the software locally, and without the remote component the local part is useless.

Of course that's true, but that's not sufficient for calling the client a terminal, unless you're going to define any client in a client-server architecture to be a terminal.

Again, with most web apps or even simple web sites with static content, the processing power required to render and interact with the site is probably much greater than to serve data on the back end. Web browsers are definitely fat clients.


> unless you're going to define any client in a client-server architecture to be a terminal

Why not? As you say, the general-purpose computers we customarily use have much greater capability than the character or even graphics terminals of yore, but in the decidedly client-server architecture of most of the modern web, these general-purpose computers most certainly act as terminals nonetheless.

(Some of the confusion here seems to arise from defining "terminal" at one point in terms of available compute resource, and at another point in terms of role within a distributed architecture. I'm not sure where the former makes sense - save in that it may constrain the roles in which a given device may be used, available resource seems entirely orthogonal to architecture.)


Why not? Well, simply because I'd prefer to use technical terms as they're widely used and widely understood. If someone wants to argue that the definitions themselves ought to change, then that's a semantic argument that doesn't interest me.


The point I'm making about definitions is that the one you use here seems to vary within the space of a single comment.


With websites (and web apps), you're only running the UI locally. All the important parts - storage, queries, data wrangling, core logic - are done server-side.


This is the case with traditional websites, but web apps (especially Progressive Web Apps) are moving more and more of the heavy lifting back to the client.


And the key is that with the modern web architecture, having powerful local compute gives you the flexibility to put what can usefully be done on the server or on the client where it belongs. Terminal-type architectures have no flexibility at all. Do round-trip times and bandwidth limitations mean this aspect of the application would be better done on the client? Tough. It has no computing power and isn't programmable, so you don't even have the option. Want to manage tabs and bookmarks locally so you control that stuff instead of being beholden to the service suppliers? Tough; if the online service doesn't offer the feature you cannot have it, because all the smarts are upstream.


> I don't know about this analogy. With websites, you're running the software locally.

The only thing most websites run locally is ad trackers which are of no value to the viewer.

It's funny how the web didn't really fulfill its promise as a distribution platform for software, when Flash was doing that 15 years ago.

Now I'm not saying there is no "photoshop", "illustrator", "premiere" or code IDE for the web right now. I'm saying the web didn't perform as well as native apps or replace them. In that sense, the web failed.


>Considering how the majority of the general population's use of "computers" is really interacting with remote and cloud services (Facebook, Netflix, etc) this is a valid argument today. While today's equivalent of terminals (phones, tablets, and even PCs) have a significant amount of local computing ability, for the most part they really are just used as fancy terminals.

Right--today the problem isn't that there isn't enough raw computer power to go around--there's no scarcity of CPU cycles--but the issue now is whether the benefits of connectivity are worth ceding control and autonomy.


This is a great insight. Today's "apps" are more similar to terminal based services on a mainframe than they are to the 'applications' that were running on individual computers.

When you look at the architecture of the infrastructure for a Google or a Facebook, you see what are essentially data center sized computers that you can walk through. We have a smart terminal called a "web browser" that accesses them :-). And things like ChromeOS and Chromebooks take that to its logical conclusion. Oddly enough, I got an early sense of this using a Heathkit H89, which was a terminal with an embedded CP/M computer, where a CP/M program provided the user interface while talking over a slow serial link to a data server. Others got to experience things like the Minitel. Crude by today's standards, but heading down the road we currently drive on :-).

The corporate challenges of having everything on the mainframe (competing with others for resources, paying a "tax" to IT for running the mainframe even if you weren't using it, information accessibility that you did not control, etc.) pulled PCs into business when people could do their jobs on them and didn't need to use the mainframe. Given similar sorts of pressures on "cloud", I keep wondering if the idea of a personal data center will take hold. In that model you would still use your web-accessed services, but they would run on equipment at your house rather than elsewhere.


Your last sentence is similar to edge computing.

https://www.google.es/amp/s/www.cio.com/article/3176036/it-i...


For some people with massive NAS setups etc. this may already hold true. The problem is connectivity. Unless you stick within the 100-meter radius of wifi, you have to deal with your ISP. And few home connections offer much outbound bandwidth. Never mind the issues of firewalls and NATs.

And speaking of Minitels, I can't help thinking that the switch from circuit-switched to packet-switched was the big enabler of the resurgence of the "leased terminal".

Before, you had to have a complete wire circuit from terminal to computer. But these days all that is needed is that the packets reach their destination. This in turn allows a single "terminal" to connect to multiple computers over a single physical circuit.


>> ...These experts believed that an isolated computer at home would be too under-powered to be worthwhile.

> Considering how the majority of the general population's use of "computers" is really interacting with remote and cloud services (Facebook, Netflix, etc) this is a valid argument today. While today's equivalent of terminals (phones, tablets, and even PCs) have a significant amount of local computing ability, for the most part they really are just used as fancy terminals. Gaming is maybe the one mainstream application where local compute is important.

The argument you quote was not that people would be using terminals, per se, it was that they'd have to because local computing would not be powerful enough. And in that sense their argument was not correct.


> Gaming is maybe the one mainstream application where local compute is important

Most people do not think of gaming machines (consoles, handhelds, etc.) as computers, and usually they are not general purpose computers (for the user) in the technical sense, e.g., the PS4.


Most people don't think of smartphones and tablets as computers either, yet in every practical sense they are. The same applies to gaming systems (in fact, from a hardware standpoint the PS4 isn't very different from my desktop computer).


> Gaming is maybe the one mainstream application where local compute is important

While it isn't as mainstream as gaming, music production also requires a lot of local computing power. Even with ridiculous bandwidth, having a high fidelity sample streamed elsewhere, processed, and streamed back to you would be extremely slow. Not to mention, as a keyboard player or drummer, the latency would be atrocious if you tried to play a synth that was being computed in the cloud.


To be a bit of an ass, music production, at least in the sense of recording an actual musician performing in sync with previous recordings of same, is a damn close cousin to a game.

But then more and more music production is someone dropping loops and samples into a time chart and iterating until satisfied.

Hell, there is now software that basically lets a computer take over vocals, even.


Except my 8-bit micro with 32k in the 80s was very useful. I used to run my dad's business stock control on it. I wrote software for someone to keep track of their commercial VHS lending library on another system.


I think it is also very much a chicken-and-egg problem. Without software for personal computers, there is no need for personal computers. When there are no people with personal computers, it is not interesting to write software for personal computers.

Then again, why would someone want a personal computer in that time? The things computers are used for now are broadly: 1. Multimedia 2. Communication 3. Administration 4. Computing (for example in science or finance)

The only purpose where you actually need a computer is for computing. This is not something that you typically do at home. It was simply not conventional to use a computer for other purposes at the time. There were no fancy HD screens and the like, and no ubiquitous internet access.


Supposedly what sold the Apple II was VisiCalc, because now accountants didn't have to argue with the sysadmin about compute time.

Yes it was slower than the beast in the basement. Yes it was more limited. But it allowed accounting to present management as requested without having to stall for time on the beast.


I used Google Maps to navigate offline while traveling. It made me once again realize how powerful the device in my pocket is. Offline documents and web pages also work just fine.


I would go with Here or Sygic for offline navigation.


Those not around (or aware enough) back then don't grasp how staggeringly little computing power there was, and how enormous the effort to do that paltry processing. Sure there were "desktop computers", but they barely did anything; a computer sophisticated enough to do anything meaningful in one's home would have required an entire room devoted to it - and still wouldn't do much.

Today's Lightning cable literally contains more processing power than a "home computer" from the early 1980s.

"Document processing", today a simple matter of scanning a paper at >300dpi and running OCR, was a huge undertaking just to identify & fragment documents (varying extreme compression of components based on differing legal importance of preprinted content vs handwritten content vs legally-binding signatures) precisely because there was so very little space to store anything in. One common image today could easily overwhelm an entire storage unit back then. (Tangent: I believe this is a major & widely-disbelieved source of much of the consternation surrounding Obama's birth certificate).

The notion of utility-level remote computing was necessitated by computing problems of real practicality requiring such scale, not just of straight capacity but of complexity to squeeze maximum usability out of such under-powered hardware, that doing it at home was unthinkable.

Nobody expected computing power would increase, per volume & dollar, so incredibly much. Some 40 years later, processors are about 10,000x faster, displays 10,000x more detailed, and memory & storage & networking 10,000,000x greater. I remember my brother taking all day to download a large book; today we use as much data for 1 web page.

"These experts believed that an isolated computer at home would be too under-powered to be worthwhile."

It was. Think: modern insulated automated-HVAC 2000 sq ft home, vs mud hut. They couldn't comprehend what was to come; most today can't comprehend where we came from.


One of my favorite examples:

The parent comment above would have been too big to store in main memory for one of these computers.

The comments on this thread wouldn't fit into secondary storage.

It's also useful to remember that he was talking about his then-present, not forever and ever. He didn't say, "and there never will be."


Nicely put. It's hard to get the orders of magnitude across; that's a good example.

One could argue that the "didn't say 'never'" was implicitly understood. What I don't think most people grok is the then-incomprehensible scale of change; those loading multi-ton 5MB storage units onto airplanes couldn't imagine a half-terabyte on your fingernail.

https://imgur.com/uqKOX

https://ae01.alicdn.com/kf/HTB1nA6XRpXXXXX1XXXXq6xXFXXXy/200...


George W. Mitchell's quote was pretty much spot on considering the context of the time; he was basically describing an internet-connected PC communicating with servers somewhere that do the real computing. Now obviously even the cheapest netbook has more computing power than anything he was thinking of at the time, and your average gaming computer is vastly more capable than the home terminal he was envisioning, but the fact of the matter is that we're moving more and more towards the thin-client model where the majority of your computing is done in the cloud. Hell, I'm sitting in front of a MacBook Pro that could tear your face off, but I rely on the web and a remote server to run my word processing software.


It hits me as I dig through the comment pile that what has changed is that while compute power has grown, storage has downright exploded.

The power of Google et al is not the compute, it is the storage. The amount of data that Google has on tap and sifts through is downright staggering, and clearly not something that home users could store locally.


Especially considering the petabytes of tape they use.


Funny how we're now at a time where we're moving past computers. I thought this was going to be about how you don't need a computer at home anymore because you can do so much on tablets, phones, watches, and TVs.

When the Apple Watch with LTE was announced, I wondered when people would start to have a watch but no phone. When the iPad got LTE, senior execs shifted to them and raved about the experience. It worked for them because they mostly just did email and a few other things, unlike worker bees who need a traditional computer OS. I wonder if a similar thing will happen with higher-ups forgoing phones for watches.

My guess is that in the next 3-4 years, we'll start seeing people talking about how amazing it is to go around without a phone. Perhaps a tablet for doing "real work" and a watch for everything else.


… but modern phones, watches and TVs are computers …


Terminals are also 'computers' in that they have RAM and a CPU etc., but they were not independently useful.

A tablet without Google, Netflix, Facebook, Pandora, etc. is just far less useful to people.


For some tasks they are less useful, but there are still a ton of things our smartphones and tablets do that rely on local computing power.

Biometric authentication such as face and fingerprint recognition is a heavily compute intensive application. It needs to be done locally in order to be secure. Face ID is just nuts.

Modern smartphone photography is heavily compute-intensive, from HDR to optical zoom to image editing and markup. Just look at what Apple is doing with portrait mode and lighting effects. I love using time-lapse and slow-motion video modes. What would instagram and snapchat be without locally applied filters?

Note taking, address book and time management. The notes apps on modern mobile devices are multi-media wonders with integrated text, image and handwriting recognition built-in. Calendar management, alarms and notifications all rely on local processing even if they do make use of cloud services. Without local smarts those services would lose a huge chunk of their utility.

Document and media management. I read eBooks and listen to podcasts. My ebook reader has dynamic font selection, text size adjustment and will even read my books to me. Managing media on the device is essential as contemporary networks are still nowhere near good enough to stream everything all the time. My podcast app has sound modulation and processing options built in to tailor the sound to my tastes and needs, including speed up, voice level adjustment and dead-air 'vocal pause' elimination in real-time, all on-device and adjustable at my fingertips. That's serious sound studio level stuff in my pocket.

Playing digital video files. Even ones downloaded or streamed. So what if we've had it since the 90s. It's still proper computation, especially with advanced modern codecs. VOIP and video calling even between continents have also become everyday and absolutely rely on powerful local number crunching.

These things have become so everyday that we hardly notice most of them, but without serious on-device processing power, some of which would have been beyond $10k workstations just 15 years ago, none of this would be possible.


Tens of gigaflops is now cheap and in your pocket. But unlocking your phone is not a use case; it's just a hoop you jump through before making a phone call, etc.

The phone might happen to have an on board address book and calendar, but this was not a statement about access to computation. It's a statement about cost/benefit and use cases.


> A tablet without Google, Netflix, Facebook, Pandora, etc. is just far less useful to people.

What you say makes no sense. Tablets are independent computers. You don't have to be connected to the internet to make the most of them. You rely on these internet services only if you choose to. I can read a book, listen to music or watch a video without Pandora, Netflix and Co. A tablet is not a dumb terminal.


Starting from scratch, you load a book, music, or video onto an iPad by connecting it to another computer. In that context you're just caching information from another system.

Unless you're talking about taking photos / video with it, but that's another story, as a cassette deck can also record and play back audio without being a computer.

Granted, you can play games with one or write notes etc, but I just mean they are mostly used as connected devices.


> Starting from scratch, you load a book, music, or video onto an iPad by connecting it to another computer.

That doesn't help make the distinction between a terminal and a standalone computer. Computing has always been a networked thing; we just now have a transport medium that is faster than a floppy disk.

Mainframes were loading punched cards, cloud datacenters are loading user data, and tablets are loading books and video.

And the moment you annotate that PDF or manage synced documents, you are now editing content.


As strange as it might seem, early computers were not networked. Early computers generally had a front panel that you could use to load programs by hand, making them very much standalone devices.

At least in the context that a modern PC's graphics card is not thought of as a separate computer.


> Starting from scratch, you load a book, music, or video onto an iPad by connecting it to another computer. In that context you're just caching information from another system.

And that's true for any desktop or laptop as well. It doesn't make them dumb terminals either.


You can still buy CDs / DVDs and load them on a PC.

It's possible you could jury-rig something up for a tablet, but it's also clear the intended use is as a networked device.


> You can still buy CDs / DVDs and load them on a PC.

Because phones and tablets don't have SD slots? Or data transfer from Bluetooth or NAS drives isn't a thing?


GameStop does not sell games on SD cards for those tablets. So, somewhere between 99.99% and 100% of the time, that's not what happens.

PS: iPhones and iPads don't have SD slots, so for millions of these devices that do qualify as phones and tablets it's really not an option.


Teletypes, which might have been what people were thinking of in the 60s and early 70s, didn't have a CPU AFAIK.


The IBM 2260 video display terminal dates back to 1964 and is rather different from a Teletype, though it also does not really have a CPU.

The IBM 3270 on the other hand does significantly more processing on the device and dates to 1972 which would have made it well known in the late 70's computing world.


I think what most people envision when talking about a terminal is the lowly VT100.


Which has an 8080 CPU...

Now a 2MHz 8 bit CPU chip may not seem very powerful. But, people wrote very useful software using far less.


Less useful to most people, sure, but most people also use their desktop computers for only those same things.

I make music, and for the past several years, I've been pretty much exclusively using my phone and iPad to do it. The touchscreen works very well for my use case, and the audio pipeline is quite flexible with the right apps--in many cases outshining the workflow available on a full PC. I'm working exclusively on the device and using no external services after downloading the apps themselves. That takes far more computing power than a dumb terminal.


Tell that to the people who sit in public transport playing games on their smartphone. A lot of the tasks usually done are impossible without internet connection, but that can be said about almost any computer. That doesn't make them useless, it just restricts them to a (still very useful and big) subset of functionality.


Starting with a new iPad, load a game without connecting to another computer.

Yes, they do processing without a connection to an external system. But unlike, say, a PlayStation, you can't just go to a store and buy a game without connecting your device to another system.


The disk with your PlayStation game is pressed as a copy of a master disk that was created by a computer. In contrast, an iPad is connected via radio waves to a transmitter that is connected via cables to computers.

The only difference is the transport medium used for the computer-computer connection.

At least on Android tablets there's also nothing really stopping me from buying a USB stick with games and installing them, without any computers involved after the stick's creation (I would have to get a friend to create it and sell it to me, but the lack of market demand, and thus supply, is beside the point).


If every other computer in the world were off, you could still load and play a game on a PlayStation. That's a difference in kind.

The reality is that tablets are a third option between a standalone computer that could be programmed by hand and a dumb terminal.


That might be possible on an Android tablet, though I haven't tested it. Skip the initial Google account setup, enable sideloading in the settings, plug in an SD card (or a USB drive, if that's supported), and use the built-in file manager to open the APK.


People have been saying tablet for "real work" for years now. Real work isn't one size fits all. Computers (laptops, PCs, ...) will be around for quite some time.


I immediately got excited about the idea of replacing my phone with a watch, until I learned about the constraint of needing a phone to set up the watch. That shows that the phone isn't completely replaceable right now, but I guess Apple will fix that in a year or two. I'll definitely get rid of my phone then.


Yeah, even though I don't currently have an Apple Watch (go Pebble!), I wondered whether this would be feasible now. You could just choose to leave your phone at home all the time, but you'd still have to pay for the wireless plan. Currently, carriers charge $10 extra for the Watch, but you can't yank coverage for the phone. They use the same phone number, so you still have to have both.


My iPhone is a three year old 6 Plus. I have no problems leaving that giant at home to rely on a watch. I'd hope it would bring more attention back to my day, and help break me of the habit of going to my phone when bored.

The watch would do everything I need, texts, phone calls, email alerts, and I can even use it for navigation. While it would be better not to pay the $10 a month surcharge, that's a fairly small price to pay if it's an improvement in my daily experience.


Yep! And one issue Apple will have to deal with is: if people routinely leave their phones at home, will they pay $600+ for new phones every 2 years? My guess is that Apple will not be too eager for that scenario...


Apple isn't dumb enough to think they have a choice. When Android finally figures out watches as well as Apple has, a hundred watch makers will start chipping away at the phone market. Apple has always been smart enough to try to cannibalize its own products if cannibalization was inevitable.


What do you think the ETA is on a good, LTE-enabled Android watch? I feel like nothing out now is as good as the original Apple Watch, and it took them 2 more years to get the LTE stuff worked in there (and even then, the battery life isn't as good as Series 2).

And to be clear, I'm not a fanboy who sees the Apple Watch as the gold standard — I don't have one and probably won't purchase the Series 3.


Apple isn't going to let you replace a $1000 phone with a $400 Watch any time soon when they can easily sell you both.


Same old Apple, really. Back when the iPod first happened, the idea was that the Mac was the hub of your Apple-provided device world. These days it is the iOS device, tethered to your iTMS account.


And 10 years later everyone will be using glasses to correct the "looking at an 8K postage-stamp sized display" syndrome.

Not possible without some magic holographic technology.


An interesting talk by Ken Olsen on his history with DEC: https://www.youtube.com/watch?v=GNBS0I1h42k

Sadly, the names DEC and Digital barely resonate with the average under-30 developer. In fact, I'd guess that DEC/Digital on a resume not only counts for much less than AirBnB or some other trendy, frivolous startup, but probably even hurts, given how prevalent ageism is in our industry.


We laugh at these quotes and yet we keep making them.

"640K... nobody would ever need that much memory"

"No wireless. Less space than a nomad. Lame."

"Well this is an exceptionally cute idea, but there is absolutely no way that anyone is going to have any faith in this currency."

It seems inevitable that despite knowing better we will keep putting our collective foot in our mouth. I'm ok with that. The takeaway is that limiting predictions aren't useful.


> "640K... nobody would ever need that much memory"

The Gates quote was "640k ought to be enough for anybody", and it's been wildly misinterpreted by basically everyone.

The IBM PC had already picked the 8088, which had a 20-bit addressing architecture regardless of whether 1MB was "enough" or not. The issue Gates was addressing was whether DOS's choice of 640K as the barrier for loadable program memory management was enough, given that it was only about half the space in the machine. And he wasn't really wrong. The time frame between DOS programs bumping up against 640K and "DOS" programs wanting to use the multiple-megabyte flat spaces available on the 80386 (which couldn't possibly have been architected for in software in 1981) was like three years.

If you want to ding DOS, complain not about the 640K limit but about the complete lack of architecture and discipline in the high 384K. Really, no one ever got this space under control, and it was a bucketful of undetectable device I/O hazards and too-clever-by-half hackery well into the 90s.
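For concreteness, here is a quick sketch of the arithmetic behind those numbers: real-mode segment:offset addressing on the 8088 caps physical addresses at 20 bits, and DOS carved the resulting 1MB into roughly 640K of program space plus the 384K upper area.

    def physical(segment, offset):
        """8088 real mode: 16-bit segment shifted left 4 bits, plus 16-bit offset."""
        return ((segment << 4) + offset) & 0xFFFFF  # 20-bit address bus wraps at 1MB

    print(hex(physical(0x0000, 0x0000)))  # 0x00000: bottom of conventional memory
    print(hex(physical(0x9FFF, 0x000F)))  # 0x9ffff: top of the 640K DOS arena
    print(hex(physical(0xA000, 0x0000)))  # 0xa0000: start of the upper 384K (video, ROMs, device I/O)
    print(hex(physical(0xFFFF, 0x000F)))  # 0xfffff: last addressable byte, 1MB - 1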


the quote was "When we set the upper limit of PC-DOS at 640K, we thought nobody would ever need that much memory."

"640K... nobody would ever need that much memory" was just a contraction

https://quoteinvestigator.com/2011/09/08/640k-enough/


> The takeaway is that limiting predictions aren't useful.

Er ... isn't that rather limiting?


Or maybe the takeaway is that most things fail, and cherry-picking the times they didn't isn't a real argument.


I guess some of the ideas were “half” right of the future to come, considering how many of our devices now make use of “cloud” computing.


But less and less do we have personal computers in the home: the PC is being converted into a dumb terminal, a screen and keyboard, with all the real brains on the other end of the wire.


The computers they were talking about would be analogous to the big-iron supercomputers of today. The quote "There is no reason for any individual to have a supercomputer in their home." is much closer to reasonable.

It does show a lack of foresight on their part, but it's not as completely wrong as it looks on the surface.


Every time this quote about computers in the home is brought up, it seems like people miss the point. It wasn't that the people saying no one needs a computer at home were entirely wrong, it's just that the meaning of "computer" changed. What computers did, what they looked like, their power and space requirements were reduced to fit into the home. The big iron of the era were the equivalent of trains, and it's true no one needs a train parked in front of their house. But powered transport was miniaturized too. So I agree, it boils down to a lack of vision on their part.


True, but... for that era, a "supercomputer" was a Cray 1. How many Cray 1's do you have in your home? That is, how many devices do you have that can do 160 MFLOPS? For many HN readers, it may be in double digits.


Then again, said Cray-1 was a liquid-cooled beast that acted as much as a piece of furniture as it did a computer.

What blew everything up was how fast Intel and "friends" could get ICs to shrink.

But even then, we now have vast warehouses that we access remotely, which are mostly there to store metadata that is sifted and sorted on command.


As an adjunct, here's a journalist in 1979 struggling to comprehend Ted Nelson's vision that the computer would become a creative tool for the masses, https://www.youtube.com/watch?v=RVU62CQTXFI


From the same period, a radio interview where Ted Nelson tries to convince, unsuccessfully, a skeptical radio host that home computing has a future: https://www.youtube.com/watch?v=RVU62CQTXFI


"Unless you are very rich and very eccentric, you will not enjoy the luxury of having a computer in your own home."

- Ed Yourdon, Techniques of Program Structure and Design, 1975

Yourdon had been at DEC, but left the company prior to 1970.



This is, in large part, still a popular sentiment in Japan.


Failures in imagination can be deadly for businesses.


To me the amazing thing isn't that DEC pooh-poohed the idea of the PC, but rather that IBM was agile enough not to!


The PC was really an outlier.

Not only was it an IBM product, but beyond the BIOS everything was off the shelf (even the OS).

So once the BIOS was cloned, and the clone defended as clean in court, they promptly tried to reassert control with the PS/2 and spiraled into irrelevance.


There is no reason for any individual to have a high energy particle accelerator in their home.

35 years later...


Why didn't the investigator contact Dave or Gordon both of whom are alive and well?


For these quotes I always wonder if we put them in the right time context. Is it fair to assume that the quote was meant to mean "never", or was a time frame implied, such as "during this decade", etc.?


There is one big reason and that's privacy!


And 640kb ought to be enough for anyone :)


This would have been true today if people no longer felt the need to buy desktop computers. (They still do it to play games.)


"There is no such thing as an underestimate of average intelligence" --Henry Adams



