
> And with orders of magnitude more RAM.

Doesn't mean I want to waste it. Some systems don't really have much RAM to spare, such as the iPhone.

> And just because there are large libraries doesn't mean they are either necessary or desirable.

This is an old argument, but short of rewriting the world there isn't presently much alternative.

> Interestingly, most apps that depend on Qt, WebKit or ICU include their own copies of these libraries.

Not sure which system you're talking about. As far as I know, Linux distros tend to link everything against system libraries. Here on OS X, Qt is usually bundled, but WebKit and ICU are part of the system and dynamically linked against. Independently distributed Linux programs are an exception and arguably a bad idea.
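
If you want to check which case you're in for a given app, the Mach-O load commands list every dylib the binary links against (the same information otool -L prints). Here's a minimal sketch in Go using the standard debug/macho package; it assumes a thin (non-universal) binary, and the path argument is whatever executable you want to inspect:

    // listlibs: print the dynamic libraries a Mach-O binary links against,
    // roughly what `otool -L` shows. Assumes a thin (non-universal) binary;
    // universal binaries would need debug/macho's OpenFat instead.
    package main

    import (
        "debug/macho"
        "fmt"
        "log"
        "os"
    )

    func main() {
        if len(os.Args) != 2 {
            log.Fatalf("usage: %s <mach-o binary>", os.Args[0])
        }
        f, err := macho.Open(os.Args[1])
        if err != nil {
            log.Fatal(err)
        }
        defer f.Close()

        libs, err := f.ImportedLibraries()
        if err != nil {
            log.Fatal(err)
        }
        for _, lib := range libs {
            // System frameworks show up as /System/Library/... or /usr/lib/...
            // paths; bundled copies point inside the .app bundle instead.
            fmt.Println(lib)
        }
    }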

> I'm a fan of many small tools working well together but I can't help but feel that when you have systems with 4,000 binaries in your PATH, something has gone terribly wrong.

Most of it I don't use, and thanks to MacPorts, I have a lot of duplicate copies of tools. (Maybe not the best system, but disks do have enough room to waste some.)

> There is no reason why Go binaries couldn't be much smaller, other than so far it has not been a problem for anyone building systems in Go.

Fair enough. It is a problem for me mainly because I value very fast compilation of small tools.
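
For a sense of scale, even a do-nothing tool comes out at over a megabyte, since the Go runtime and whatever it pulls in from the standard library are statically linked into every binary. A trivial example (exact sizes vary by Go version and platform; building with -ldflags="-s -w" strips the symbol table and DWARF data and helps somewhat, but doesn't change the order of magnitude):

    // hello.go: about as small as a tool gets. A plain `go build` still
    // produces a binary on the order of a couple of megabytes, because the
    // runtime, garbage collector and the used parts of the standard library
    // are statically linked in. `-ldflags="-s -w"` drops the symbol table
    // and DWARF data, which helps, but the result is still far larger than
    // a dynamically linked C equivalent.
    package main

    import "fmt"

    func main() {
        fmt.Println("hello")
    }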

> Which is not helped by dynamic linking for reasons both of complexity (see http://harmful.cat-v.org/software/dynamic-linking/versioned-.... )

This does sound awful, but I have not really seen it anywhere outside of libcs. OS X libc has only a few switches (UNIX2003, INODE64) and uses them to provide wide backwards compatibility.
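
Those switches show up as suffixed symbol variants in libSystem; roughly, the headers pick $UNIX2003 / $INODE64 names at compile time depending on feature macros and deployment target, so old binaries keep the old behavior. A rough sketch that counts the variants, assuming /usr/lib/libSystem.B.dylib is actually on disk as a thin Mach-O with its symbol table intact (which may not hold on newer systems):

    // Count the $UNIX2003 / $INODE64 symbol variants in OS X's libc.
    // Assumes /usr/lib/libSystem.B.dylib is present on disk as a thin
    // Mach-O with a symbol table; on newer systems the library lives in
    // the dyld shared cache and this file may be absent or mostly empty.
    package main

    import (
        "debug/macho"
        "fmt"
        "log"
        "strings"
    )

    func main() {
        f, err := macho.Open("/usr/lib/libSystem.B.dylib")
        if err != nil {
            log.Fatal(err)
        }
        defer f.Close()

        if f.Symtab == nil {
            log.Fatal("no symbol table")
        }
        counts := map[string]int{}
        for _, sym := range f.Symtab.Syms {
            for _, suffix := range []string{"$UNIX2003", "$INODE64"} {
                if strings.HasSuffix(sym.Name, suffix) {
                    counts[suffix]++
                }
            }
        }
        fmt.Println(counts)
    }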

> and because most programs anyway use their own version of such 'shared' libs (update your system's ffmpeg and chrome will keep using its own copy).

And indeed this can cause serious problems (I remember a vulnerability report or two about some program shipping an out-of-date library), but luckily you're exaggerating its prevalence. Chrome will keep doing its own thing, but when WebKit gets another of its innumerable security updates, I don't have to redownload the 32 applications I have that link against it.

(ffmpeg might be an exception because it doesn't care about stable releases, but I also remember a blog post complaining about Chrome's (former?) gratuitous forking and bundling of libraries. It's a bad idea.)

> Dll-hell has security implications too.

Only on poorly organized systems. On a Linux distro, the package manager takes care of dependencies and generally gets it right. OS X is uniform (and willing to break backwards compatibility) enough that when there are problems, the developers update their apps.

> In practice people end up doing things that either nullify the alleged benefits of dynamic linking,

in a small minority of cases, yes; in the vast majority of cases, where on a well-organized system I just want a security or framework update to reach everything, no.

> or simply using static linking (Google deploys statically linked binaries, sometimes multiple Gb in size to their servers).

Facebook also does the gigabyte binary thing. It's a ridiculous waste of space, but if they don't care, they don't care; servers don't have as many constraints as user-facing computers (they have a fixed workload and expected disk usage), and are often less vulnerable to library security issues, not having to expose the full web stack + PDF rendering + Flash + GL to any random web site the user navigates to. :)



