An old OS idea is new again: non-installation (rebol.com)
41 points by bdfh42 on Oct 22, 2008 | hide | past | favorite | 26 comments



With the exception of some software, like behemoth Adobe installations, OS X already works this way.

It is common practice in OS X to unpack a zip or DMG file and then simply double click to run the application.


It goes even further than that in OS X. Applications are not the only things grouped; shared libraries are grouped as ".frameworks" too. These bundle all the headers and binaries in one directory structure, and they're even versioned! It's possible to keep the "last stable" release of your lib while also shipping the "new development" version, all in one package.

It really frustrates me that other OSs don't adopt this approach. It's good for developers and users alike.
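The versioned framework layout described above can be sketched roughly like this. This is a toy mock in Python: "Example.framework" and the version names are invented, and real frameworks carry more structure than shown here.

```python
import os, tempfile

# Toy mock of the versioned ".framework" layout described above;
# "Example.framework" and the version names are invented.
root = tempfile.mkdtemp()
fw = os.path.join(root, "Example.framework", "Versions")
for version in ("A", "B"):           # A = last stable, B = new development
    os.makedirs(os.path.join(fw, version, "Headers"))
    open(os.path.join(fw, version, "Example"), "w").close()  # the library binary

# A "Current" symlink picks the active version; B ships alongside, untouched.
os.symlink("A", os.path.join(fw, "Current"))

active = os.path.realpath(os.path.join(fw, "Current"))
print(os.path.basename(active))  # the version consumers actually link against
```

Flipping the symlink is all it takes to promote the development version, which is the appeal of keeping both in one package.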


The versioned shared library is interesting. But another concern with the "non-installation" method is that some lazy developers fail to keep the libraries they use up to date. When a security issue turns up in a library, its maintainer will usually ship a hot-fix quickly, but applications that carry their own copy of the library in a local folder keep exposing the vulnerability to third parties.
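The concern can be made concrete with a toy comparison of bundled copies against the latest upstream release. Every app name, library name, and version number here is invented for illustration.

```python
# Toy comparison of bundled library copies against the latest upstream
# release; every name and version number here is invented.
latest = {"libfoo": (1, 6, 3)}            # what upstream currently ships
bundled = {                               # copies carried inside app folders
    "PhotoThing.app": {"libfoo": (1, 2, 8)},
    "NoteThing.app":  {"libfoo": (1, 6, 3)},
}

stale = [
    (app, lib)
    for app, libs in bundled.items()
    for lib, version in libs.items()
    if version < latest.get(lib, version)
]
print(stale)  # apps still carrying the old, vulnerable copy
```

Nobody runs this check globally for per-app library copies, which is exactly the problem: with a shared library, one fix covers everyone.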


The .framework actually can handle this: the XML registry can list versions and version aliases.

Maybe you're implying that .app files tend to keep local .framework files inside their bundle. I'm not sure how common this practice is, but one thing that's true about OS X apps is that they're almost uniformly equipped with something like Sparkle (http://sparkle.andymatuschak.org/). Most of my apps self update without me even trying.

I think that part of 0-install is self-updating.


Maybe you're implying that .app files tend to keep local .framework files inside their bundle. I'm not sure how common this practice is...

If your app requires a library that is not included in OS X, you have to ship a copy of that library inside your app bundle. (Or you can use an installer .pkg, but that defeats the point of no-install.) Self-update helps, but the app developer has to actually build the update.

There's a tradeoff here between the simplicity and reliability of OS X app bundles and the other benefits of Linux package managers where an app can just express a dependency on a library and that library will be kept up to date globally.


Unfortunately, OS X can make it a bit too easy. Until I instructed her otherwise, my wife would download apps like MSN Messenger, open the DMG, and then run the app directly from there! So she had all these installation DMGs open just so she could run her apps.

My father (who tried OS X after being a Linux user for some time) did the same. It's really not obvious what you're meant to do unless the DMG has the instructions in clear sight (which some developers do). I had a similar confusion when I first switched to the Mac five years ago.


What they were doing may seem weird, but it's not really technically wrong to do. The only real downside is the resources wasted on keeping the disk mounted.


That's funny. The words "web" and "cloud" do not yet appear anywhere in these comments!

Even novice users have proven that they can remember to visit URLs for their applications and bookmark their favorites. With that in mind, we can easily overcome the "desktop loaded with .application files" problem.


I don't know why people don't just use Zip files.


> OS X already works this way.

Or almost works, I'd say. Dragging an application to the Trash does not completely remove it: the preferences and support files the application has thrown into (user-)global directories are still there, and (third-party apps that do this task aside) there is no clear interface to remove them, or even to find out where the application put files in the first place; you have to comb through your ~/Library manually.
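That manual combing amounts to a filename scan. Here is a rough sketch, assuming leftovers are findable by the app's name (not always true: preferences often use a reverse-DNS identifier instead). "FooApp" and the directory tree are a throwaway mock, not real files.

```python
import os, tempfile

def leftover_files(library_dir, app_name):
    """Paths under a Library-style tree whose names mention the app:
    a crude stand-in for combing ~/Library by hand after trashing it."""
    needle = app_name.lower()
    hits = []
    for dirpath, dirnames, filenames in os.walk(library_dir):
        for name in dirnames + filenames:
            if needle in name.lower():
                hits.append(os.path.join(dirpath, name))
    return hits

# Throwaway mock of a user Library; "FooApp" and its files are invented.
lib = tempfile.mkdtemp()
os.makedirs(os.path.join(lib, "Application Support", "FooApp"))
os.makedirs(os.path.join(lib, "Preferences"))
open(os.path.join(lib, "Preferences", "com.example.fooapp.plist"), "w").close()

for path in leftover_files(lib, "FooApp"):
    print(path)
```

The third-party cleanup apps mentioned above do essentially this, plus some heuristics for bundle identifiers.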


OTOH, disk space is cheap and if you ever re-install the app in the future your old prefs will still be there.

(My name is wmf, and I'm a data pack rat.)


And I'm completely anal retentive about my files. Pleasure to meet you, wmf.


Fortunately those files don't take up a ton of space. I've got an Application Support directory that I've migrated with me for about 4 years now, and it's just now at 1 GB.


Seeing as how many novice computer users can't bring themselves to organize their pictures/files/music, I can't see this non-install movement as a good thing.

Just think of your average Windows/OSX desktop, it's usually cluttered with shortcuts, downloaded files, music, photos etc.

Now imagine that with 10-20 application folders (which, by the way, will break the apps if you mess with them).

We need to help users logically organize their content first, before we can attempt to allow them to pick where their applications reside.

Most don't even have a concept of file structure.


Putting files in folders is a metaphor left over from the way we had to organize physical documents, because we couldn't do any better. The key is to provide easy search and flexible views on top of the data/files users have so they can easily find what they need, not to force users into organizing their files into neat little folders.

OS X gets most of this right because it doesn't care where you put your apps (although /Applications is recommended), and applications aren't folders you can mess around with; they are wrapped in a logical package you can move or double-click to open. Spotlight makes it easy to find anything on your computer, and most applications are built around searching or looking at your data from different views (iTunes, iPhoto, Mail, etc.).


I suppose you could restrict programs to running only out of certain folders to centralize things, but I think the only way this kind of thing could ever really get better is a sink-or-swim approach. As in: if your computer is set up insanely poorly (like if I can't find your programs in the huge mess you plunked them into), don't look to me for help with it, cause I'm just gonna recommend a nuke-from-orbit approach, with a further instruction to do it right next time. Tough love counts as help, right? Still, I know, I know, it's never gonna happen...


Ummm, when it comes to operating systems, I never found /usr/bin particularly difficult, and dumbing things down to /Programs, /Users, /Files etc. doesn't really seem to be anything other than a trivial and superficial aspect of an operating system. How they multitask, manage memory, keep the file system running and so on: these are the tasks of the operating system. How apps get installed is largely up to the app developer. I can statically link and provide one simple big binary, or choose to componentize so that upgrades are less network-intensive, but this is just minutia.

OS X already bundles everything into one place (for the most part), and on most Unix systems the structure is really easy to understand: /etc for the configuration files, /var/log for the log files, /usr/[local]/bin for the apps, /usr/[local]/lib for the libs, etc. Nothing particularly difficult about that.

And there are some very good reasons for breaking things up and scattering them around. /var/log would, for example, often be a managed part of the file system, with log rotation, and often on different physical media so that the disk-head movement associated with logging doesn't interfere with the main purpose of the app. /tmp would often be mounted as a very fast RAM drive or some other high-speed read/write filesystem type that an app can use for its temporary storage. /usr/bin would be backed up; /var/log and /tmp possibly not...

There are very good reasons why many operating systems have ended up the way they are, and simplicity isn't always a lofty goal. Take, for example, the implementations of malloc/free. It's very important for these to be very efficient, very fast, very reliable, and not at all simple. Simple is good in UIs (sometimes), good in many things, but please, keep simple out of operating system design...


To a large extent, OS X gives the best of both worlds.

From the command line, things look very much like you describe (though not exactly; Apple often has different conventions and names for things than, say, Linux). That application you just downloaded is just a directory with a ".app" extension. You can browse its directory structure and find all of the resources for the application, all residing in their conventional locations.
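A minimal mock of that bundle layout, built and walked in Python; the app name and its contents are invented, and real bundles carry much more:

```python
import os, tempfile

# Minimal mock of the .app bundle layout described above; "Example.app"
# and its contents are invented, and real bundles carry much more.
root = tempfile.mkdtemp()
app = os.path.join(root, "Example.app")
os.makedirs(os.path.join(app, "Contents", "MacOS"))
os.makedirs(os.path.join(app, "Contents", "Resources"))
open(os.path.join(app, "Contents", "Info.plist"), "w").close()        # metadata
open(os.path.join(app, "Contents", "MacOS", "Example"), "w").close()  # binary

# From the shell it is just a directory tree you can browse:
for dirpath, dirnames, filenames in sorted(os.walk(app)):
    for name in sorted(filenames):
        print(os.path.join(os.path.relpath(dirpath, root), name))
```

The Finder hides all of this behind a single icon, which is the dual view the comment describes.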

But using the Finder, you just see the application's icon. If you double click it, it launches.

This is just one of the ways that Apple manages to have Unix underneath, but not expose it to users who don't want to care that it's there.


Side note: The inventor of rebol was also the engineer behind the original AmigaOS.


My opinion is that the real reason the current installation model exists was to prevent people from making copies of installed software. Maybe it was even more important before the internet, to keep people from copying the most expensive software at their work, library, and school. Now it doesn't really matter, when the original install CD is easier to get from a torrent site.


Casual piracy is easily stopped; you could argue that it's already been 'solved' with online activation.

For hardcore pirates, it doesn't matter really as they'll find a way to get their booty with or without the 'install' procedure.

You can still have an application that doesn't install, but uses activation to ensure it's the only copy running.


I'm always pleasantly surprised when an application I download doesn't 'install'. When I click the downloaded file and it just ..runs, my typical reaction is "oh, well isn't that nice".

But then my next question is always, "where should I store this unicorn of an application?" My first instinct is to create a Program Files folder and then make a shortcut, but I know that's just wrong. So typically it ends up stored with other random program/ISO files, but rarely used because there is no Start Menu shortcut to remind me that I have it. The one exception is Thunderbird Portable, which I use all day.


> As an OS designer who prefers well-thought-out simplicity over ever-deeper-layers of complexity,

He could always use Colorforth:-)

Joking aside, you can do something kind of like that with "starkits". You don't even have to unzip them.

http://www.equi4.com/starkit/


I like this approach also for security reasons. It could make application sandboxing easier if an app can only read/write in its local directory and the user's directory.

Getting rid of the DLL nightmare is a very good point. I'm also very impressed by the method Apple uses to support multiple machine architectures in a single app.

On the other side, application scanning at boot time is not a very exciting strategy. Why not bind this to the file index (like Tracker) that a good OS should have? The index would be updated in the background when files are created or deleted, and so would the application locator and the document-type bindings.
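A toy sketch of that idea: an application locator kept current by create/delete events rather than a boot-time rescan. The events are assumed to come from a hypothetical filesystem watcher, which is not implemented here, and the paths are invented.

```python
import os

class AppIndex:
    """Toy application locator kept current by file events instead of a
    boot-time rescan; the events are assumed to come from a filesystem
    watcher, which is not implemented here."""
    def __init__(self):
        self.apps = {}                      # app name -> bundle path

    def on_created(self, path):
        if path.endswith(".app"):
            self.apps[os.path.basename(path)[:-4]] = path

    def on_deleted(self, path):
        if path.endswith(".app"):
            self.apps.pop(os.path.basename(path)[:-4], None)

    def locate(self, name):
        return self.apps.get(name)

idx = AppIndex()
idx.on_created("/Users/alice/Downloads/Foo.app")  # watcher: new app appeared
print(idx.locate("Foo"))
idx.on_deleted("/Users/alice/Downloads/Foo.app")  # watcher: app was trashed
print(idx.locate("Foo"))
```

The same event stream could keep the document-type bindings current, so no boot-time pass is ever needed.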

I am more and more convinced that this is a new direction to go in, in addition to making the OS as lightweight as possible and eventually generalizing virtual machines. It is hopeless to secure current Windows PCs against becoming zombies.


The flexibility and ease of use offered by not-installed software is what makes projects like http://portableapps.com/ so popular.

The possibility that installing an application may splat arbitrary files across my system has always been unnerving to me.


Also check out the way QNX merged the traditional Unix /usr/bin directories with app packages, with a Plan 9-like virtual filesystem: http://www.qnx.com/developers/articles/article_920_1.html



