
PPAs are an antipattern.

One of Debian's (many) advantages vs. Ubuntu is that there isn't an ecosystem of poorly maintained but easily found packages around the core.

Lending Debian's reputation to such an ecosystem will not improve Debian's standing or user experience.




As a Debian user for 15 years, I don't understand the hostility against PPAs.

As a user, they provide me with a convenient and somewhat-traceable source of supported packages for my release. For example, at work we're stuck on Ubuntu 12.04 which has git 1.7.9, but there's a stable PPA that can give me a more up-to-date git which has better pull options and defaults. Easier, more trustworthy, and more reliable than compiling and packaging my own!
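As a sketch, the workflow looks like this (ppa:git-core/ppa is the PPA commonly used for newer git on old Ubuntu releases; verify the owner before trusting any PPA):

```shell
# Add the git maintainers' PPA, refresh the package index, and upgrade git.
sudo add-apt-repository ppa:git-core/ppa
sudo apt-get update
sudo apt-get install git
git --version   # should now report something newer than the distro's 1.7.9
```

This is a command transcript requiring root and network access, not a runnable snippet; the point is that the PPA version is managed by apt like any other package.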

Alternatively, I'm learning OCaml on Debian unstable, and the OPAM package manager was stuck at 1.1.0 for months after 1.2.0 came out, and to add insult to injury the Debian version was broken because of a silent incompatibility with a dependency.

I realize that by using a PPA I'm going out of Ubuntu's (or eventually Debian's) well-tended garden of tested packages (and honestly, "well-tended" is a very relative term) and potentially exposing myself to risky sources, but that's my choice.

I'm afraid this opposition to PPAs boils down to conservative dogmatism bred from decades of there being no good option for user packages. There have been issues, yes, but don't throw the baby out with the bathwater.


Agreed. One of the things I love about Arch and Gentoo is that if a package isn't in one of the official repos, it can be found in the AUR or an overlay, respectively.


The AUR is a little different than a PPA. Yes, both are user generated content not officially supported by the project. However at least in the AUR there is a centralized place for these packages, and an entity behind the AUR platform that can curate bad/malicious/negligent packages out.

PPAs have a bad rap for being, well... bad. Any user, anywhere. Here today, server gone tomorrow. Outdated blog posts from years ago with dead links, or bad advice/packages. It's not uncommon for a PPA to break a system.

The AUR is more similar to the RPM Fusion or EPEL repos (centralized and somewhat governed), whereas PPAs are just like tarballs on some random Joe's blog.


Let's be honest; there are a ton of broken PKGBUILDs in the AUR, too. Lots of packages where a new version came out, upstream deleted the old version, and the PKGBUILD hasn't been updated yet. I've also seen a few broken for other reasons, like ones that don't include a function that's been mandatory for a while but used to be optional. I should probably get off my lazy ass and start posting comments and diffs on the AUR website.
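The mandatory function referred to is package(): makepkg used to let files be installed from build(), but has required a separate package() for years, so old PKGBUILDs without it fail. A minimal sketch (the package name and URL are made up):

```shell
# Minimal PKGBUILD sketch; "hello-demo" and its source URL are hypothetical.
pkgname=hello-demo
pkgver=1.0
pkgrel=1
arch=('any')
source=("https://example.org/$pkgname-$pkgver.tar.gz")
sha256sums=('SKIP')

build() {
  cd "$srcdir/$pkgname-$pkgver"
  make
}

# package() used to be optional (install steps could live in build()),
# but makepkg has required it for a long time.
package() {
  cd "$srcdir/$pkgname-$pkgver"
  make DESTDIR="$pkgdir" install
}
```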

Fortunately, hacking the PKGBUILD isn't a big deal.

Oh, and Arch also has its own third-party repos aside from the AUR. I've added a few for big packages that I don't want to have to recompile all the time, like Perl 6 and OpenSUSE's fork of Firefox. Also for ZFS, because it's easier to use the demz repo than coordinate kernel updates and rebuilds of zfs-git.


Lack of PPAs is one of the reasons I bounce back to Arch and Gentoo from Debian.


You realize that if a package isn't in the repository (or is in the repo, but not the right version), the alternative is that the user compiles it from source, right? Now the user has all sorts of software scattered over his system, with no way to manage, update and uninstall them. How is that situation any better than using PPAs?


Use checkinstall? It's somewhere between installing from source and installing from a package.
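A sketch of the checkinstall workflow (the package name is a placeholder):

```shell
# Build from source as usual, but replace "make install" with checkinstall,
# which wraps the installed files in a .deb that dpkg can track.
./configure
make
sudo checkinstall --pkgname=mytool --pkgversion=1.0
# Later, uninstall cleanly through the package manager:
sudo apt-get remove mytool
```

This is a transcript requiring a source tree and root, not a standalone snippet; the gain over plain "make install" is that dpkg knows about every file.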

https://help.ubuntu.com/community/CheckInstall


Install to your own custom path? /opt ... /local ... /myfolder ... etc. Add the path to your user's $PATH. If you are familiar with Linux this isn't really much harder to do. If it is a library that you are then compiling against, it does get a bit more difficult, of course, but it is only a good thing to become familiar with these things. If you are OK with "polluting" your path, you can install each new binary + libs to its own custom path. You simply delete the folder when you no longer need it.
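A minimal sketch of that scheme, using a throwaway prefix and a stub script standing in for a real ./configure --prefix build:

```shell
# Normally: ./configure --prefix="$prefix" && make && make install.
# Here a stub script plays the role of the installed binary.
prefix=$(mktemp -d)
mkdir -p "$prefix/bin"
printf '#!/bin/sh\necho hello from mytool\n' > "$prefix/bin/mytool"
chmod +x "$prefix/bin/mytool"

export PATH="$prefix/bin:$PATH"   # make the custom prefix visible
out=$(mytool)                     # resolved via the updated PATH
echo "$out"

rm -r "$prefix"                   # "uninstalling" is just deleting the folder
```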


Not really an answer for packages that need configuration in /etc, add desktop entries to user's menus, or should be discoverable for example by systemd.

Adding to everyone's PATH is invasive and not really an option for most multi-user systems.


I was responding to somebody who made the claim that installing software leaves things "scattered all over the system". I simply pointed out that that is not true if you plan carefully. Additionally, you can ensure that all users on a system have a basic set of paths enabled in their profile (is the default PATH any more invasive than the default path plus "/opt/bin"?). These things are not insurmountable if you actually know what you are talking about and know the scope of your problem. These complaints come from people who are not actually interested in managing a system, but would rather click a button in a GUI and not have to understand anything of what is actually going on.

That being said compiling your own stuff in and of itself is not guaranteed to be a simple or "fun" task. It takes work and can lead to unexpected results: you immediately own anything that is custom on your system and become your own testing, debugging, and troubleshooting team. It isn't ideal.


The plan for Debian PPAs is for them to be quite different from Ubuntu's: they will be available only to those who already have upload privileges. They will be a sort of compartmentalized experimental, using the same builder network, with no need for a reupload to push to unstable/testing.


In which case, I don't understand what problem they're solving.

We already have Experimental.


Check out the initial proposal, it contains info on lots of the use-cases that aren't satisfied by experimental.

https://lists.debian.org/87y5btehw3.fsf@gkar.ganneff.de


Experimental is an all or nothing proposition. With a PPA system it should be possible to keep most of your system on stable, and opt into newer versions of software on a case by case basis.


Doesn't apt-pinning solve that? Set a pin that never installs from Experimental unless manually specified, or only install that one package from Experimental?

https://wiki.debian.org/DebianExperimental#APT_pinning
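A sketch of such a pin (the priority value is illustrative; the wiki page above has the details, and "somepackage" is a placeholder):

```shell
# /etc/apt/preferences.d/experimental -- keep experimental at priority 1
# so it is never chosen automatically:
#
#   Package: *
#   Pin: release a=experimental
#   Pin-Priority: 1
#
# Then opt into a single package from experimental explicitly:
sudo apt-get install -t experimental somepackage
```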


Just a note for the uninitiated: when you pin from Stable you just gave up the guarantees that may have been the reason you were on Stable in the first place. This is an area where RTFM is in order:

"WARNING: Use of apt-pinning by a novice user is sure call for major troubles. You must avoid using apt-pinning except when you absolutely need it." https://www.debian.org/doc/manuals/debian-reference/ch02.en....


Well yes, but all the warnings and cautions in that link apply equally to PPAs - the security implications are pretty similar. A novice using PPAs will be in the same boat, though admittedly PPAs are a little easier to use.


Oh I was being a little tangential and pointing out that changing packages on Stable to an extent defeats the purpose of it. I just hope they don't start releasing PPAs that'll help people achieve worst-of-both-worlds systems that combine the outdatedness of Stable and the bugginess of Experimental.


Packages equivalent to named long-term development branches of upstreams, without packaging the result under a separate name (thus confusing dependent packages)?


I already mentioned some of the relevant points in the comment you replied to:

1) compartmentalized, not a big, global dumping ground 2) same builder network as unstable/testing 3) no need for a reupload to push to unstable/testing


> One of Debian's (many) advantages vs. Ubuntu is that there isn't an ecosystem of poorly maintained but easily found packages around the core.

Context: I'm a full-time Debian user.

I firmly believe that the future is not around packaging as we've been conceiving of it for the last decade or so, but around packaging in the form of containers[0]. As a Debian user, I'd be really happy to see Debian keep an eye towards containerization as a first-class citizen of the distribution, the way apt(itude) and dpkg are now.

PPAs can either fit into this model or work against it, depending on how you look at it. They can either enable the creation of containers by virtue of being more flexible and more easily used inside container builds, or they can serve as a crutch for low-quality packaging standards. So, I'd be excited for PPAs in Debian, but I'd like to see them adopted as a tool to facilitate first-class containerization in Debian, rather than the way they are used in Ubuntu more as a place to hold unsupported or less-supported packages.

[0] Not necessarily Docker or even Docker-like containers, but containers nonetheless.


packaging in the form of containers

I don't really see that. Containers have mostly ingrained themselves into application deployment, but the trends in GNU/Linux packaging seem to be heading toward compressed bundles (combined with some form of access control like AppArmor or POSIX caps to get some form of sandboxing and resource isolation). At least that's what Ubuntu Snappy seems to be. This is similar to Klik, Autopackage, 0install, OS X bundles and even the Windows way of stuffing your DLLs into a single directory namespace (though with a common format and infrastructure).

Nix and Guix are in leagues of their own that have nothing to do with containers specifically.

Everyone else is sticking to the same conventional system package managers and I don't see that changing. The systemd developers are proposing their own odd scheme based on btrfs volumes that is still in its very early stages.


Having used Docker mostly as a packaging format for the past 6 months at my day job, I think it's a terrible format for packaging. I'm about to strip it out and go back to .debs.

The current love of containers is a fad - they have their uses, but they've become the new hammer for which every problem has been redefined as a nail. Docker solves some problems, but packaging isn't one of them - and it brings lots of its own issues along for the ride as well.

Admittedly, I haven't played around with other forms of containers more than a 'hello world', but my experience of container-as-packaging has been woeful so far.


I'd prefer regular Debian packaging plus containerisation via systemd-nspawn or just the various systemd security features that use the same Linux namespacing features as Docker.

http://0pointer.net/public/systemd-nluug-2014.pdf http://ftp.nluug.nl/video/nluug/2014-11-20_nj14/zaal-2/5_Len...
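For instance, a minimal nspawn workflow looks something like this (paths and suite are examples):

```shell
# Bootstrap a minimal Debian tree, then run it as a namespace container.
sudo debootstrap stable /var/lib/machines/demo http://deb.debian.org/debian
sudo systemd-nspawn -D /var/lib/machines/demo    # chroot-like interactive shell
sudo systemd-nspawn -bD /var/lib/machines/demo   # boot it with its own init
```

This is a root-requiring transcript, not a runnable snippet; the appeal is getting Docker-style isolation while the container's contents stay ordinary Debian packages.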



That's the tutorial - do you have a more detailed link?


For an explanation of Snappy - or how to get/run/use it?

Here are a couple of links for the former:

http://thenewstack.io/snappy-ubuntu-a-new-cloud-os-with-supp...

http://www.markshuttleworth.com/archives/1434


The lack of PPAs has held the industry back for years. Outdated Apaches, outdated MySQLs, outdated PHPs. In a word, distro maintainers became gatekeepers, and bad ones, since they don't have the unlimited resources required to keep everything up to date.

While there definitely are cases when you want somebody to pick the versions for you, nowadays the industry is moving just too fast to keep most people satisfied.

Of course there will be poor PPAs. Poor GitHub repositories also exist, bad NPM packages exist, bad Maven packages exist, but that does not stop people from searching for the right ones and taking charge of their own future.


Allowing users to easily extend the available packages is not an anti-pattern, it's practical software freedom!


The good part about PPAs is that they can be ignored completely. After being bitten once or twice, I have stuck to not installing anything that isn't in the core Ubuntu repositories.


In my 6+ years of using Ubuntu and derivatives, and frequently installing PPAs, I've never had a problem or been bitten. And I've only exercised a little bit of caution.

On the other hand, I ran into massive headaches when I used to try to pin apt repositories. No more of that for me.

PPAs are the main reason I use Ubuntu distros now instead of Debian.


I don't know how you can subsist on Ubuntu's severely outdated packages.


Could you elaborate which packages you use from PPAs?



