> Are distros doing anything more than scanning CVE databases with the library versions, or are they _actually_ auditing the versions they choose?

Debian Developer here. Part of packaging work, for Python libraries or anything else, is to verify the reliability of the upstream developers, audit the code, set hardening flags, add sandboxing and so on.

I have spotted and reported vulnerabilities myself, and it's not uncommon.


Please don't take it the wrong way, I have a lot of respect for distro packagers and maintainers. I donate to Debian, and I report bugs. I think you are basically heroes of the FOSS world, because without your invisible (and frankly thankless) work, mine wouldn't exist.

But come on, there are 300k entries on PyPI, 200k more for Perl and 160k for Ruby. I'm not even counting the whopping 1.3M on npm because I assume that is considered taboo at this point.

You cannot package more than a tiny fraction of that, not to mention keep it all updated.

And unless distros make it as easy to package and distribute deb/rpm/etc as it is to use the native package manager, distro packaging will never be attractive to most users because:

- they don't have access to most packages

- the provided packages are obsolete

- package authors have no way to easily provide an update to the users

- it's very hard to isolate projects with various versions of the same libs, or incompatible libs

And that's not even mentioning that:

- package authors may not have the right to use apt/dnf on their machine.

- libs may be just a bunch of files in a git repo, which pip/gem/npm can install from directly (see the example after this list)

- this is not compatible with Anaconda, which has a huge corporate presence

- this is not compatible with Heroku/Databricks/PythonAnywhere, etc.

- it's hard to make this work with Jupyter kernels
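
To make the git-repo point concrete (the URL is a placeholder), this is roughly what the native tools already do out of the box:

    # pip installs straight from a git repo, no package registry involved
    pip install "git+https://github.com/example/somelib.git@main"

gem (via a Gemfile git: source) and npm (npm install with a git URL) have equivalents. There is no distro-side workflow that is anywhere near this cheap for the author or the user.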

Now let's say all those issues are solved. A miracle happens. Sorry, 47 miracles happen.

That would force the users to create a special case for each distro, then for macOS, then for Windows. I have a limited amount of time and energy; I'm not going to learn and code for 10 different packaging systems.

It's not that we want to screw over Linux distros. It's that it's not practical, nor economically viable, to use the provided infra. The world is not suddenly going to slow down, vulnerabilities will not stop creeping up, managers will not stop asking you to use $last_weird_thing. This ship has sailed. We won't stick with stuff published 5 years ago, with months of delay for every update.


Thanks for your work. And I have to say that for internal development, when there is no need for the latest features, it's much easier to develop against a Debian release as much as possible. A stable distribution provides an easy-to-track baseline not only for Python libraries, but for any other tools that may be needed.
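
For instance, a minimal sketch of what that baseline looks like in practice (the package names are only illustrative):

    # take the libraries from the release instead of pip
    apt-get install python3-flask python3-sqlalchemy
    # their versions, and their security support window, now follow the Debian release

The trade-off is the one discussed above: a stable, centrally patched baseline instead of the latest upstream features.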


This! AsciiDoc is the grown-up brother of Markdown, designed to scale up to entire books and to handle images, tables, references, citations, book indexes and maths.

And the syntax is very friendly and intuitive.


And it's quite easy to insert compiled images, e.g. Graphviz, UML diagrams, Ditaa, just by having the SOURCE in your document.
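
For example, with the asciidoctor-diagram extension (assuming Asciidoctor is the processor), a Graphviz graph can live directly in the text, roughly like this:

    [graphviz]
    ....
    digraph pipeline {
      "AsciiDoc source" -> "asciidoctor-diagram" -> "rendered SVG/PNG";
    }
    ....

The diagram is regenerated whenever the document is rebuilt, so the source stays the single point of truth.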


Debian Developer here.

Reminder: the Debian Packaging Guidelines are meant for official Debian packages.

Making a package with dpkg-deb -b <dir> <packagename> is very easy and gives all the features of APT (dependency tracking, config management, atomic deployment).
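
For anyone who hasn't tried it, a minimal sketch (names and paths are made up for the example):

    # stage files under a directory that mirrors the target filesystem
    mkdir -p mytool/DEBIAN mytool/usr/local/bin
    install -m 0755 mytool.sh mytool/usr/local/bin/mytool

    # a tiny control file is all dpkg-deb needs
    printf '%s\n' \
      'Package: mytool' \
      'Version: 1.0-1' \
      'Architecture: all' \
      'Maintainer: You <you@example.com>' \
      'Depends: python3' \
      'Description: internal tool, installed and tracked via APT' \
      > mytool/DEBIAN/control

    dpkg-deb -b mytool mytool_1.0-1_all.deb
    # 'apt install ./mytool_1.0-1_all.deb' installs it and resolves Depends

That's enough for internal use; lintian, debhelper and the Policy manual only become relevant when you target the official archive.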


Yep. If you're packaging for Debian, the packaging guidelines make sense. If you're packaging something nobody outside your org is ever going to see, you can be a lot more flexible.


Yes, dpkg-deb is a recent discovery. Out of interest, what am I losing by doing it this way rather than following the official guidelines?


Debian Developer here: upstream developers almost never care to prepare fixes for existing releases.


If an upstream dev isn’t supporting software anymore maybe the users should stop using it?


Here's the problem. Say I develop program/library/whatever Foo.

I make a new release every six months, so we have Foo 1, six months later Foo 2, six months after that Foo 3, and so on.

Between the time Foo n and Foo n+1 are released, I'll release minor updates to Foo n to fix bugs, and maybe even add minor features, but I don't make breaking changes.

Foo n+1 can have breaking changes, and once Foo n+1 is out I stop doing bug fixes on Foo n. My policy is that once Foo n+1 is out, Foo n is frozen. If Foo n has a bug, move to Foo n+1.

A new version of Debian comes out, and includes Foo n which is the current Foo at the time. Debian supports new versions for typically 3 years, and the Debian LTS project typically adds another 2 years of security support on top of that.

That version of Debian is not going to update to Foo n+1, n+2, n+3, etc., as I release them, because they have breaking changes. A key point of Debian stable is that updates don't break it. That means it is going to stay on Foo n all 3 years, and then for the 2 after that when the LTS project is maintaining it.

That means that Debian and later Debian LTS ends up backporting security fixes from Foo n+1 and later to Foo n.


> Debian supports new versions for typically 3 years, and the Debian LTS project typically adds another 2 years of security support on top of that.

I don’t think Debian should try to do this. They should just ship whatever the current release of upstream is. Or better yet, just allow upstream to ship directly to users.


The whole point of a stable release is to be, well, stable.

People want to be able to develop their things they need, such as their organization's website, online store, internal email system, support system, and things like that, deploy them, and then get on with doing whatever the organization was organized to do.

They don't want to have to be constantly fiddling with all those things to keep them working. They want to build them, deploy them, and then not have to spend much effort on them until they want to add new features.


This is good advice for software lifecycle management in general:

https://wiki.debian.org/DontBreakDebian


Debian Developer here. Backporting fixes to tens of thousands of packages is already a huge amount of (thankless) work.

But it's still done, because there's usually only one version of a given library in the whole archive.

Imagine doing that for e.g. 5 versions of a given library, embedded in the sources of 30 different packages.


I'm sorry to hear that it's thankless. Thank you for doing it. It is one of the pillars of my sanity, and I am not exaggerating.


A couple of healthy reminders to avoid drama and FUD:

1) People announce leaving publicly and the FLOSS community takes notice of it. This is a sign of Debian's health. In many other projects, few people would notice.

2) The number of Debian developers, projects, and packages has been increasing for decades.

3) For each person writing on mailing lists and blogs there are 10 people quietly contributing.

4) The same applies to the occasional flamewars. Vocal minorities are not representative of the thousands of DDs and contributors.


This seems like damage control and image management.

The piece was cogent, respectful, and constructive.

How about addressing his points?


Exactly. I'm a long time Debian user, and after reading that piece I'm not worried that Debian is going to disappear, but I think he is raising good points that should be addressed.


I'm a DD. His points left me scratching my head. Yes, Debian's infrastructure is old, but so is Big Ben, and just as Big Ben still does an admirable job of broadcasting the time to the locals, the Debian Bug Tracker does an admirable job of tracking bugs. I get that he personally might prefer to interact with it via a more modern web page, but that wouldn't alter how well it tracks bugs or facilitates discussions. And besides, I find email easier to interact with automatically than a web page. Oddly, he goes on to list not being able to automate things as a complaint.

The same goes for the rest of the things he lists. Yes, he might prefer to do them some other way, but the way they are done now has obviously been working very well for a long time.

As for Debian being incapable of making big changes - that's just rubbish. He's been there for 10 years, for pete's sake: systemd was a big change requiring many packages to be updated, spanning several years and several releases, while still delivering a working system throughout. That's not big? How about altering the source package format, or moving away from SHA-1 for signing, or making everything build reproducibly, or moving all developers on its collaborative development platform from FusionForge to GitLab? Sorry, he's just plain wrong on the "can't make big changes" point. Debian regularly makes big changes every major release. In Bullseye they will have made big strides in migrating away from Python 2. Changes of this scale are things other projects regularly struggle with, but not Debian.

His post lists a whole pile of things about Debian he's discovered he no longer likes, which is fair enough. But as he says in his introduction, it's him that's changed: he's gone from a student with lots of time on his hands, happy to be part of a loosely collaborating group, to a member of a very focused and highly directed team at work, and he's discovered he prefers the latter. Great, I get it, happens to all of us. But that doesn't mean Debian no longer works. It clearly works very well. It just means he's no longer a great fit for that way of doing things.


Very helpful response.

In particular, mentioning systemd, which I was very happy to discover on returning to Linux after some time away, made this answer very relatable.


Also:

Big Ben is a Victorian hand-wound clock. Its accuracy is maintained by moving a stack of old English pennies balanced on its pendulum. It must be wound by hand three times a week, and it takes one and a half hours to wind every time.

My guess is that the author of the piece we are discussing would find this to be an excellent analogy for the Debian processes he is complaining about.


At least one of his points is already addressed by the Twitter thread he links: Debian's bug tracker behaves correctly; Gmail just ignores the information included in the email headers to group them.

So he has been told that this is a Gmail issue, but insists on using it. Meanwhile he complains that the rsync package maintainer is blocking his changes out of personal preference. Double standards much?


Gmail is an immensely popular email provider, with 1.5 billion users in 2018. It has 65% of the US market, and even at the global level it's the leading email provider.

This is not some esoteric email client that a handful of developers refuse to let go.


But it still is the source of the faulty threading behavior. The issue also seems minor enough that there is no point in adding a hack on Debian's side to make the output on Google's proprietary endpoint look better.


You are both right; GP in that this isn't the end of Debian, and you in that the article author's points should be addressed.


Neither the GP, nor I, nor the original author mentioned the end of Debian, so it’s unclear how the GP can be ‘right’ about that.


I mean, I'm also not sure his "FUD avoidant" post has the desired effect the way it is posted, but "damage control and image management" suggests that Debian is a company with a PR department doing such a thing.

That's simply not the case, and the OP is right that there were many Debian Developers in 2019 and there still are now.

That's not to say it won't be noticed, or that it should just be brushed over, when a prolific member decides to step down.

> The piece was cogent, respectful, and constructive.

Just for clarity, I agree with that.

> How about addressing his points?

Who, the OP? Even though his nickname suggests affiliation with Debian, bringing bigger change normally requires the support and action of more people.

But yes, IMO Debian surely needs to continue to adapt or be doomed to frustrate more developers in the future.

Things like this, from the blog post:

> I tried to contribute a threaded list archive, but our listmasters didn’t seem to care or want to support the project.

It just seems baffling to me: he proposed to do the actual work (and with his record one could be certain that he'd follow through) for a feature where one can only win (don't like it? Just continue to use what you like).

Such resentment against unproblematic changes, bringing value to some group but not taking away value from others, is tedious and demotivating.

But who takes up the fight to change Debian? In the end it probably needs to come from within, i.e., a sizeable part of Debian Developers need to drive and push forward, or at least reduce the barriers for those who wish to do so respectfully, without breaking what is now.


> "damage control and image management" suggest that Debian would be a Company and a PR department doing such a thing.

No, damage control or image management in no way implies a company or a PR department. Any group can engage in these activities. And as you point out later, the OP appears to be affiliated with Debian.

> Who, the OP? Even though his nickname suggests affiliation with Debian, bringing bigger change normally requires the support and action of more people.

Yes, the OP. I perhaps should have used the words ‘commenting on’, or ‘responding to’ instead of ‘addressing’.

I am not expecting the OP to solve the problems, but I am suggesting that it would be more constructive to comment on the substantive content of the original article than to write innuendo about how many people are just quietly contributing, or implying that the author may be part of a ‘vocal minority’.


> No, damage control or image management in no way implies a Company or PR department. Any group can engage in these activities. Later you point out, the OP appears to be affiliated with Debian.

1. I said it seems he is affiliated, but anybody can nickname himself a variant of "debian developer" in any forum. 2. It implies that a formal body of the organisation is doing it (which can be a single person, like the DPL), else it's not damage control by Debian like you suggest, but that of a single person - which can hardly be framed as damage control in this case: the blog clearly referred to Debian as a whole, not to a single person.

> I am not expecting the OP to solve the problems, but I am suggesting that it would be more constructive to comment [...]

That's what you say now, but not what you said originally. As said, change needs to come from within Debian, not from some HN discussion - talk is cheap.

Thanks for the constructive down vote, though ;-)


“...else it's not damage control by Debian like you suggest”

Nowhere did I suggest Debian was doing damage control.

You are simply misrepresenting me.

“That’s what you say now, but not what you said originally”

Another misrepresentation. What I said originally, and my follow up comment are perfectly consistent.

I’m curious why you feel such a strong need to defend DebianDev and deny that there is any damage control happening.


That's quite an impersonal, facile, and generic set of responses to a personal, respectful, detailed blog.


I'm in no way responding to the blog, as you can see in my 4 points. I'm addressing the comments here.

Every time similar content is shared here there's a number of people making exaggerated claims around Debian being dead or in deep trouble.

The other comments accusing me of doing damage control are a good example.

I recommend attending Debian events in person (once COVID is gone) to see that 99% of interactions between people are very friendly.


> I'm in no way responding to the blog, as you can see in my 4 points.

Your opening line in no way makes that clear:

> A couple of healthy reminders to avoid drama and FUD:

If your intent was to "in no way respond to the blog," you should have instead written something like this:

"Unlike the article, it seems like a lot comments here are intent on spreading FUD about Debian..."

> I recommend attending Debian events in person (once COVID is gone) to see that 99% of interactions between people are very friendly.

In the meantime, I'd recommend reading the blog: in it a Debian developer mentions having very friendly interactions with other Debian friends before diving into a technical, respectful, and detailed critique of the developer UX in Debian.


Again, I'm not talking about the article: "avoid drama and FUD" refers to HN.

Especially the word "avoid", knowing that the article is already written.

> In the meantime, I'd recommend reading the blog: in it a Debian developer mentions having very friendly interactions

And still, you keep assuming that I'm not talking in good faith and that I'm trying to subtly attack the article.

https://news.ycombinator.com/newsguidelines.html


Debian Developer here. Some people take a very cursory look at Debian and assume the packaging is an exercise in masochism that we inflict on ourselves.

And yet the number of DDs keeps increasing (and Debian is one of the most successful projects).

Indeed it takes time to do packaging, and this is by design. Packagers are expected to thoroughly review the code they are packaging and smooth out various sharp corners.

Many times I've found bugs in the upstream code while packaging. Sometimes around security and privacy, often around documentation, usability or non-x86 architectures.

Every time I check if other distributions opened bugs or applied patches for the same issues. It almost never happens.

This is why I might spend 2 hours on a package instead of 10 minutes.


It isn't a cursory look. I worked at Canonical. I used to do lots of Debian packaging. My packages on Arch aren't low quality. Yet the packaging process was a lot simpler and a lot more fun on Arch. Plus my package and its dependencies are actually based on the current stable releases, not on a 4-year-old package patched 10 times to maintain compatibility with a 3-year-old kernel.


Correct.

Furthermore, rebuilding and distributing a large number of large binaries every time a vulnerability is fixed is harmful!

- It encourages users to delay security updates. Hundreds of millions of people in the world have slow, expensive or capped Internet connectivity.

- It makes the distribution unsuitable for many embedded/IoT/industrial/old devices with very limited storage.

- It gives the impression that the distribution is bloated.


Debian has an escape valve from its fundamental, spirit-defining policies even for non-free software. And if Debian didn't have that policy wrt non-free, I could easily make a post as general as yours, listing the very real dangers of shipping blobs, which probably carries more weight than the dangers of vendoring you outline.

So, why can't there be a repo for the all the vendored things, and make a policy for the maintainers there?

By not having that escape valve, the pressure shoots out onto the maintainers-- in this example the maintainer's work is made impossible, as evidenced by the previous maintainer's statement that it would take two full-time devs to package this according to the current policies. So you encourage burnout in your volunteers.

This also encourages passive aggressive software design criticisms. Look at the very first comment here that shifts to talking about the "maturity" of the software under discussion. I'd be willing to bet I'd see similar flip judgments in the list discussion of this-- all of which completely ignore the monstrous build systems of the two browsers that Debian ships. So apparently there's an escape valve for exactly two packages, but no policy to generalize that solution for other packages that are the most complex and therefore most likely to burn out your volunteer packagers.

Keep in mind you are already on maintainer #2 for this package that still does not ship with Debian because shipping it per current policy is too burdensome. Also notice that you've got a previous maintainer on the list-- who already said this is a two-person job at least-- calling out the current maintainer for being lazy. It seems like a pretty lousy culture if the policy guidelines put policy adherence above respect for your volunteer maintainers.


> the very real dangers of shipping blobs, which probably carries more weight than the dangers of vendoring you outline.

This is a false dichotomy.

> By not having that escape valve

Please do your research before posting. Building packages with bundled dependencies is allowed, actually.

Having a handful of small files from 3rd parties bundled in few packages is relatively harmless (if they are not security critical) and allowed.

Having 200 dependencies with hundreds of thousands of SLOC creates a significant burden for security updates.

Put security-critical code in some of the dependencies and the burden becomes big. Make the dependencies unstable and it gets worse.

Now create a similar issue for many other packages doing the same and the burden becomes huge, for the whole community.

> This also encourages passive aggressive software design criticisms.

I would call it outspoken criticism of bad software design.


You are confusing static linking with embedding library sources.


Both static linking and embedded copies have similar issues.

With static linking you have to track where the static library ended up (recursively) and then issue rebuilds in the correct order for all packages that contain the static library (directly or indirectly).

With embedded copies you have to search the source code archive for the copies and then patch and rebuild each copy.

In some ways static linking is more complicated to deal with, because a static library can end up inside another static library.
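
A rough sketch of what that tracking involves on the Debian side, with libfoo as a placeholder and assuming the usual helper tools are available (build-rdeps from devscripts, grep-dctrl from dctrl-tools):

    # source packages that build-depend on the static library and may embed it
    build-rdeps libfoo-dev

    # binary packages recording embedded libfoo sources via the Built-Using field
    grep-dctrl -F Built-Using libfoo -s Package /var/lib/apt/lists/*_Packages

Every hit then needs a rebuild and upload, in an order that respects the build dependencies, once the fixed libfoo is in the archive.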

