I should probably switch back to Debian. I switched to Ubuntu back in the day because it was Debian + more stuff. Now it's Debian + more stuff I don't want.
Oh it's worse now. No one tests anything against Debian any more and assumes that Ubuntu is the one true Linux. That makes things like Kubernetes have some extra annoying hoops to jump through.
As far as portability goes, the whole Linux ecosystem now is like working with mixed Windows 95 and NT environments in 1998. Mostly it works, but sometimes it really screws you hard.
- When you want the latest version of some software and need it "on the host" for some reason: use the official repositories / auto-updating stuff (e.g.: docker, firefox, vscode, virtualbox).
- Everything you can run on docker ... just do it. Even cli utilities are usually fine (for instance I run latex from within docker, through an alias that mounts the current folder as a volume).
- When you cannot run something on docker, consider lxc.
- As a last resort, turn to kvm machines. Hint: vscode-remote works beautifully to work inside the VM when you must resort to this. I know it is not free, but I said pragmatic ;)
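To make the docker-for-CLI-tools point concrete, here is the kind of alias I mean (a sketch; the image name texlive/texlive and the pdflatex entry point are my assumptions — substitute whatever TeX image you actually use):

```shell
# ~/.bashrc - run pdflatex from a container instead of installing
# TeX on the host; mounts the current directory as the working volume.
# Image name is an example, not a recommendation.
alias dlatex='docker run --rm -v "$PWD":/work -w /work texlive/texlive pdflatex'

# usage: dlatex paper.tex   -> paper.pdf lands in the current directory
```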
For desktop use, I run Debian stable for the first six months or so after release, and then switch to testing. That ends up being enough time for things to settle down, and, remarkably, I have near-zero issues with testing.
One thing that is a pain is that testing isn't covered by the Debian Security Team, and sometimes gets fixes a week or so after stable does, so you do need to be more proactive about security issues.
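One tool that helps with that vigilance (my suggestion, not something the parent mentioned): debsecan, which checks your installed packages against Debian's security tracker.

```shell
sudo apt install debsecan
# report known vulnerabilities affecting packages on this system;
# pass the suite you actually track so the fix information is accurate
debsecan --suite bullseye --only-fixed
```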
> No one tests anything against Debian any more and assumes
This is absolutely not true. All packages available in Debian have a release process, and there is a test bed. There is also the dh_auto_test part of debhelper, which is increasingly relevant for accepting packages into the Debian release process. So if anything, the opposite of your claim is true.
Yes I’m talking about third party packages, software and repositories. The contents of the default Debian repositories are excellent, but usually quite old.
Most of the useful stuff? Can you point me to some data that shows which "useful stuff" is lacking in sid?
What is the actual issue? Inability to compile things from source, or not knowing that you can install to /usr/local/bin or $HOME/bin etc.?
Of course everything is on either GitHub or GitLab in 2020. The argument that the majority of useful software isn't actually available as a deb package is false. Useful to whom? Based on what arguments?
To clarify, regarding your edit: my concern is mostly that I want to pull in another deb repo or something similar, so that when I run an apt update I get a full set of updates, not the distribution's packages plus a manual compile job to argue with.
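For what it's worth, that workflow is exactly what vendor apt repositories give you. A sketch using Docker's repository as the example (the keyring path and URL follow Docker's published install instructions for Debian; adjust the codename to match your release):

```shell
# fetch the vendor's signing key into a dedicated keyring
sudo install -m 0755 -d /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/debian/gpg \
  | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg

# register the repository; afterwards a plain 'apt update && apt upgrade'
# picks up the vendor's packages along with everything else
echo "deb [signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/debian buster stable" \
  | sudo tee /etc/apt/sources.list.d/docker.list
sudo apt update
```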
At least with Ubuntu/Debian there is documentation, and the software feels 'complete'; the CentOS systems I work on feel like living in a half-finished house by comparison.
Please note: starting with Debian 7, the minor number is no longer part of the Debian release number, and numbers with a minor component like 9.4 or 9.7 now indicate a point release: basically only security updates and major bug fixes, shipped with updated installation media images. This release, 10.6, is therefore not a new major release of Debian.
Historical context: it wasn't always like this. Debian 3.1 was a major release after Debian 3.0, and Debian 6.0 had point releases up to 6.0.10. This did not change until Debian 7 (which was called 7, not 7.0, and had point releases numbered 7.1, 7.2, and so on). Therefore, many people may still assume that an increase in the number after the dot is a major release, even though that is no longer true.
Yes, that will work. As part of running `apt upgrade`, the package base-files will get upgraded from 10.3+deb10u5 to 10.3+deb10u6 which will update /etc/debian_version to say 10.6 instead of 10.5.
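If you want to watch it happen (commands assume a Debian 10 box that hasn't taken the point release yet):

```shell
cat /etc/debian_version          # 10.5 before upgrading
sudo apt update
sudo apt upgrade                 # pulls in base-files 10.3+deb10u6, among others
cat /etc/debian_version          # 10.6 afterwards
```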
The man page is many times better than me at explaining it:
upgrade
upgrade is used to install the newest versions of all packages
currently installed on the system from the sources enumerated in
/etc/apt/sources.list. Packages currently installed with new
versions available are retrieved and upgraded; under no
circumstances are currently installed packages removed, or packages
not already installed retrieved and installed. New versions of
currently installed packages that cannot be upgraded without
changing the install status of another package will be left at
their current version. An update must be performed first so that
apt-get knows that new versions of packages are available.
dist-upgrade
dist-upgrade in addition to performing the function of upgrade,
also intelligently handles changing dependencies with new versions
of packages; apt-get has a "smart" conflict resolution system, and
it will attempt to upgrade the most important packages at the
expense of less important ones if necessary. The dist-upgrade
command may therefore remove some packages. The
/etc/apt/sources.list file contains a list of locations from which
to retrieve desired package files. See also apt_preferences(5) for
a mechanism for overriding the general settings for individual
packages.
The distinction is that "dist-upgrade" is willing to remove packages and "upgrade" is not, on the grounds that obsolete packages are common during major version upgrades but not otherwise. I could imagine a situation where there's a new conflict: "upgrade" resolves it by picking a different alternative instead of installing something that would conflict, so a "dist-upgrade" run afterwards finds the conflict already resolved. But that seems rare/unlikely.
Personally I just use "dist-upgrade" always and pay attention to the command line - it tells you pretty clearly if it's going to remove something.
dist-upgrade is more likely to break things because it can uninstall packages. So it's sometimes useful to run a plain upgrade first just so there will be fewer packages in the dist-upgrade. This lets you review the dist-upgrade output closer and perhaps decide to cancel it if you need to make some changes before it's safe to run. If you are going to ignore the output or just run with -y, there's no reason to split them.
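So a cautious point-release run looks something like this (my usual sequence, not official doctrine):

```shell
sudo apt update
sudo apt upgrade        # conservative pass: never removes or adds packages
sudo apt dist-upgrade   # whatever is left; read the removal list before confirming
```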
Does anyone else find the "recent" files section in the Debian 10 (XFCE) file selection dialogs to be significantly less useful than in previous Debian releases?
Recently created files just... don't show up. This is a hard one to do an internet search for, and a working recent-files list saves a surprising amount of time in some workflows.
1000 internet points to anyone who knows how to make it work as well as it did previously.
The GitLab migration was never about changing how bugs are reported. The bug tracker is still The Place for bugs. Salsa (the name of the GitLab instance) is just voluntary infrastructure for git, replacing the previous (also voluntary to use) Alioth service.
There are some attached hooks that automate some Salsa-BTS interactions. For example, making a commit that references closing a bug will automatically send a message to the BTS letting the reporter and others know that a fix has been committed. The bug will actually close only when the new package version is in the archive (the archive is still fully git-oblivious).
That said, the Salsa migration is complete, and Alioth is no more.
I think GitLab has a few email gateways. You can send patches to Salsa if you create an account first. To be quite honest, at least this way spam goes down a bit.
Maintainers can also filter patches that arrive this way. As a productivity tool, GitLab is great.
It's a bit more complicated than that. For example, does it run on an Nvidia AGX Xavier [1], which has a 64-bit ARM CPU? Would you be able to access the GPU?
That's up to Nvidia to tell you, not Debian. Debian supports running on any of the above platforms, assuming the hardware OEM supports running generic Linux installs.
For that board in particular, only Ubuntu 18.04 with proprietary kernel modules is officially supported, though you can generally upgrade without too much trouble. It also uses Nvidia-specific apt repositories.
The problem with Debian is that by the time of a release all packages are heavily outdated. Ubuntu is even worse - they can't even patch multiple vulnerabilities; see, for example, the radare2 package[1]. Everything that is required is a simple version bump. This is often not a problem for other distributions like Fedora, Arch, Gentoo, etc.
This is a misunderstanding of the purpose of Debian, which comes in several flavours: Stable, Testing, and Unstable. Stable does indeed "lock" older versions of packages, but this comes with the enormous benefit of reducing maintenance caused by bugs, regressions and incompatibilities that occur in newer releases. Only security updates are released, but there are also methods for installing newer versions of packages if you like. This is exactly what I want from a server operating system.
If you are looking for more of a Fedora or Arch-like experience on desktop, then you could investigate Debian Testing or Unstable.
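For reference, switching is just a matter of pointing sources.list at the next suite (a minimal sketch; many people track the codename rather than the moving "testing" label, so that they decide when to roll onto the next cycle):

```shell
# /etc/apt/sources.list - minimal example for tracking Testing
deb http://deb.debian.org/debian testing main
deb-src http://deb.debian.org/debian testing main
```

Bear in mind that Testing does not get security updates as promptly as Stable.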
As for the radare2 security issues, it appears that the package is part of Ubuntu's universe (community) repository and has been abandoned by its maintainer, so it won't receive any official support from Canonical. It's probably a bad thing that Ubuntu allows this in a way that's not super clear to the end-user; however, the best solution would be for somebody to step up to maintain the package, or for it to be removed from the universe repository altogether.
Debian Testing is the worst of both worlds. It doesn't get updated as frequently as Unstable and doesn't get security fixes as promptly as Stable. For example, firefox-esr sat on 68.11.0 for two months until, in the last few days, it finally switched to 78.3.0. So users were running a vulnerable version all that time [i.e. they never even got 68.12.0].
I consider the in-between nature of Testing to be a feature. Less-frequent updates than Unstable drastically reduces breakage, but I still get new stuff. The lack of timely security updates does unfortunately mean I have to be a little more vigilant.
I usually run Stable for about six months after a major release, since Testing often has a lot of churn then and sometimes breaks.
I've been running this way on my desktop machine since Stretch was Testing and have been really happy with it.
For apps like Firefox, I run the upstream version anyway, which auto-updates on upstream's schedule.
This is the problem with relying on distros to build and distribute all your software.
They also don't want to break people running on a given version of the distro. So typically whatever is stable at the time that the distro version is released is the version that stays for the life of that distro version.
Package maintainers do usually backport important bug and security fixes, though.
If there is a particular problem, best to raise it on the bug tracker for the distro.