Ubuntu 20.04 LTS (Focal Fossa) (ubuntu.com)
589 points by dragonsh on April 23, 2020 | 385 comments



Most reviews are very positive: faster, better, Amazon removed; maybe the best Ubuntu release yet.

The only 'con' is the pushing of Snap packages. It looks like deb files can only be installed via the terminal.

Ubuntu is also researching whether they can get Adobe on board. I doubt this will ever happen, but it would be huge. Most people stick with Windows and macOS because Adobe software is not available on Linux.

Personally I switched back from Xubuntu to Ubuntu. And I must say: Ubuntu is back on track!


> The only 'con' is the pushing of Snap packages. It looks like deb files only can be installed via the terminal.

My main concern with Snap is security.

Using the default Ubuntu Apt repositories, I can `apt install` pretty much anything I want and it's almost guaranteed to not be malicious/dangerous, as only trusted/well-established developers can get their package into the Apt repos.
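For what it's worth, you can also check where Apt would pull a package from before committing; a rough sketch (package name illustrative):

```shell
apt-cache policy chromium-browser            # which repo/pocket each candidate version comes from
apt-get install --simulate chromium-browser  # dry run: resolve dependencies without installing
```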

However, with Snap, everybody in the world can publish any old rubbish in the Snap Store, including packages that typosquat the names of others.

Snap's current 'protections' against this are mainly reactive rather than preventative, which is far from satisfactory [1].

For me, the absolute key selling point of Linux over Windows/Mac is the secure-by-default and natively integrated package management. Pushing Snaps as the main package management method comes too close to the Windows way of just downloading random EXEs from the internet.

I personally would like to see Apt remain the default system package manager for all common/well-known software, with Snap existing purely as a 'community' repository for software from untrusted/unknown sources.

[1] https://forum.snapcraft.io/t/is-there-any-protection-against...


The number of malicious snaps found so far is about one: the cryptominer, which was addressed within 3 days. Snaps that are published go through static analysis and, unless manually vetted, will not have access to the rest of your system. You install it; at worst it will cryptomine, and that is it. Oh, and you can remove it completely, with all traces, unlike deb equivalents.

Verified snaps coming from the first party are marked with a tick, such as those from JetBrains, Mozilla, Microsoft, Amazon, Spotify, etc.

If you are downloading from a software center, this would all be handled for you and typosquatting would not be a problem.

In contrast, with debs, the person who owns the PPA for those third-party packages does have unfettered root access to the system. The most popular PPA to this day is a closed-down Java PPA run by a third-party dev. That could easily turn malicious.


> You install it, at worst it will cryptomine that is it.

Last time I looked, snaps still had access to the X server. They were therefore perfectly capable of logging and inserting keystrokes, capturing whatever sensitive information is on screen, etc. Has this changed?

I don't think Wayland would solve this, because even if Ubuntu switches to Wayland, variants like Xubuntu (which inherit snap from the base distribution) still use Xorg.

This sort of thing is often overlooked when we talk about Linux sandboxing technologies. People see the word "sandbox" and assume safety, but the fact is that most of these sandboxes are leaky in one way or another. Does it protect against X11 abuse? DBus abuse? Shared memory? Microphone access? Device node access? The list is long, and the leaks are different in each of the sandboxes.


You are right, there are still issues and vulnerabilities present with using X. That is the case with every distribution mechanism ever in existence.

You would have to be a complete numpty to download and install such a thing as it wouldn't come from anything with first party support. Enough of a numpty that you shouldn't be trusted with root to begin with.

I wouldn't be surprised if this specific thing was scanned for and flagged by their static analysis tool. It seems like something that would be flagged.

> DBus abuse?

When I added the dbus slot for the firefox snap, Canonical wouldn't push to the store until it was manually reviewed. So yes, asking for new permissions/unusual permissions would probably need review.
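For context, declaring a dbus slot in snapcraft.yaml looks roughly like this (the bus name is illustrative, not the exact Firefox declaration):

```yaml
slots:
  dbus-daemon:
    interface: dbus
    bus: session
    name: org.mozilla.firefox
```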


Note that it should be possible for the X server to pretend there is no other client connected to a particular program, and AFAIK it already does this for remotely connected clients. Snaps could (if they're not doing so already) use this functionality to isolate themselves from the rest of the system.


I was specifically talking about the default, trusted Ubuntu Apt repositories, not custom PPAs which are of course inherently untrusted.

> You install it, at worst it will cryptomine that is it.

If that happens then I have no choice but to assume a full system compromise and nuke my machine. It's not a risk I'm willing to take, as there's essentially no way to definitively prove that that's all the malicious Snap was doing.

> Oh and you can remove it completely with all traces unlike deb equivalents.

Snap sandboxing is rarely utilised in a meaningful way, and the permissions for a particular app are controlled by the author by default. In most cases there's nothing explicitly preventing a malicious Snap from gaining persistence even after it is removed.

In other words, Snap sandboxing is in no way comparable to a 'proper' solution like a well-configured Firejail or a VM.


>If that happens then I have no choice but to assume a full system compromise and nuke my machine. It's not a risk I'm willing to take, as there's essentially no way to definitively prove that that's all the malicious Snap was doing.

You can rest assured that the problematic snaps were tackled and addressed within 3 days; there are no cryptominers on the store anymore. That, and they were actively searched for.

Why are you downloading random crap from the Snap Store to begin with?

People always take extreme perspectives on this and I find it weird. No, you probably shouldn't be installing a hello-world snap from Davind1923232. I wouldn't expect you to do that on Android or any other platform. It probably is safe, though, in that it has had as much vetting as anything else in the store.

Downloading firefox, vscode, intellij, vlc, nodejs, spotify and all of these other first party snaps is perfectly fine.

>Snap sandboxing is rarely utilised in a meaningful way, and the permissions for a particular app are controlled by the author by default. In most cases there's nothing explicitly preventing a malicious Snap from gaining persistence even after it is removed.

Snap permissions aren't controlled just by the author. I have disconnected plenty of plugs that don't magically reappear, especially for sandboxing internet-facing programs away from my home directory.
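For reference, that plug management is done roughly like so (snap and plug names illustrative; requires snapd):

```shell
snap connections firefox           # list the snap's interfaces and what they're connected to
sudo snap disconnect firefox:home  # revoke access to non-hidden files in $HOME
sudo snap connect firefox:home     # re-grant it later if needed
```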

Snaps can't magically persist; that is a load of FUD. The files needed are stored on a squashfs, with home config and other configuration in its own isolated directory. On removal, the squashfs is removed and that is gone.

>In other words, Snap sandboxing is in no way comparable to a 'proper' solution like a well-configured Firejail or a VM.

Why do you comment on things you really don't seem to understand? This is the entire point of snap plugs/connections, which are an enforced, permissions-based model.


> Why are you downloading random crap from the Snap Store to begin with?

I'm not, but the fact that there's the potential for junk to exist on the store in the first place is the problem, especially when there isn't adequate protection against typosquatting like I mentioned originally.

As long as I only use the default repositories, I can `apt install` a package I've never even heard of and it's pretty much guaranteed to not be malicious/actively dangerous. With Snap, this guarantee doesn't exist to the same level.

Sure, there is moderation and review in place, but this puts the Snap Store in the same realm as other stores and 'community maintained' package managers, almost all of which have issues with junk/dangerous packages.

> Snaps can't magically persist that is a load of FUD.

Yes, in many cases this is right, but some of the most common Snap interfaces (multiple of which can auto-connect) would provide enough leverage on a system to gain persistence or actively interact with things outside of the sandbox.

For example, the `home` interface is enough to compromise an average personal computer and probably gain persistence, as everything of value is usually within the home directory. (I do like the fact that `home` disallows access to hidden files though.)

The `x11` interface can even be auto-connected, and this potentially allows the Snap to read the graphical output of other applications.

I agree that these scenarios are quite theoretical, but as foresto says in this thread, 'sandbox' implies 'safe', and sandboxed Snaps are quite leaky compared to other sandboxes such as Firejail or a full-blown VM.

Perhaps this is just a terminology problem? I would say that Snap sandboxing is far more comparable to permission management on an Android phone.


>I'm not, but the fact that there's the potential for junk to exist on the store in the first place is the problem, especially when there isn't adequate protection against typosquatting like I mentioned originally.

If you are that concerned about typing the wrong thing, then use a software center. I have never even seen one, and I have been using snaps since their inception. I am using 38 snaps and have never once installed something I didn't intend to. I also tend not to run sudo commands without knowing what I am doing.

It's not like Launchpad or universe aren't full of junk software too. I think you can download an open-source rootkit via apt, if that matters.

> For example, the `home` interface is enough to compromise an average personal computer and probably gain persistence, as everything of value is usually within the home directory. (I do like the fact that `home` disallows access to hidden files though.)

You can't gain 'persistence' just from the home interface. In fact, the only way of getting 'persistence', AFAIK, is through creating a systemd service snap like ufw. Again, I am fairly certain that stuff requires manual vetting before being published to the Snap Store.

The X11 vulnerability applies to everything, and will apply to everything until Wayland is usable. Connecting it automatically means that users actually have a functioning browser. That is a sane policy, because users shouldn't have to mess with configuration files to get their programs to work (unlike with Firejail profiles).

All you are describing are permissions which are generally needed to actually run useful programs. Yes, programs connect them automatically. I do suggest reviewing software permissions before executing anything, and you can do that with snap.

>sandboxes such as Firejail

You are talking about leakiness and mention Firejail? Firejail has historically had the most severe CVE vulnerabilities, partly because of how user namespaces/network namespaces work. It was basically a setuid binary and proved an easy mechanism to get root.

Snap is built on the same namespace tech, but doesn't act as a setuid binary (I think because it uses mount namespaces rather than creating a user namespace). It uses the same seccomp filtering and the same browser sandboxing. The bonus of snap is that it actually comes with working AppArmor profiles, unlike Firejail.


I'm not talking about gaining persistence via a legitimately-installed system service. Instead I'm talking from a malware point of view.

If a malicious Snap has read/write access to non-hidden files within someone's home directory, you can almost certainly gain a level of persistence, e.g.:

* Edit a desktop shortcut file so that it points to your malware

* Edit a script or program so that when the user runs it, it runs your malware

* Edit a non-hidden configuration file in a malicious way
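To illustrate the first bullet, a .desktop launcher hijack needs only a one-line edit (paths hypothetical, quoting approximate):

```ini
[Desktop Entry]
Type=Application
Name=Firefox
# Original: Exec=firefox %u
# After a malicious edit, the payload runs before the real program:
Exec=sh -c "$HOME/Downloads/payload.sh; exec firefox \"$@\"" sh %u
```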

I am talking very theoretically here, and I agree that this is taking security concerns to the extreme, but these are important considerations that aren't really present with Apt (when using default repositories).

At the end of the day, despite my security concerns, I do like Snap and the technology it uses.

However, at the moment at least, I will always prefer Apt with default repositories as it provides that extra level of safety/guarantee of authenticity.

Finally, I use a script to install the Chromium Snap to remove the risk of typosquatting, which sufficiently mitigates this risk for me.
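A sketch of such a guard script: check `snap info` output for the verified-publisher mark before installing (the ✓ output format is an assumption about snapd and may change):

```shell
# Succeed only if the given `snap info` output shows a verified publisher,
# i.e. a line like "publisher: Canonical✓".
verified_publisher() {
    printf '%s\n' "$1" | grep '^publisher:' | grep -q '✓'
}

# Usage sketch (requires snapd):
#   info="$(snap info chromium)"
#   if verified_publisher "$info"; then sudo snap install chromium; fi
```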


I don't know much about the Snap system - regarding 3rd parties and the "tick", is this using domain verification, or some other means? Do 3rd parties have to pay for this?

I read something the other day about needing to pay Canonical $15k/y for a branded Snap store, but I'm not sure if that was about private stores, this, or something else.


Another point is that when installing Snap packages from the command-line, the verification tick only appears after the install has finished, by which point it is already too late.

You can view package info prior to installing, but if we're talking about a typosquatting issue here, that doesn't really help.


I think the main mechanism for this is communicating directly with developers on the snapcraft forum; in this case Mozilla with Canonical, etc.

I didn't go through this process so I don't really know.

I think that is how it has worked so far for all of the 1st party snaps I have seen.


>However, with Snap, everybody in the world can publish any old rubbish in the Snap Store, including packages that typosquat the names of others.

This is why I don't bother installing snaps unless the developers themselves are actually providing (or at least promoting) the snap. There have been a couple of occasions where I went to install a snap and it turned out to be either an old or broken version of the software.

I don't have too many technical quibbles with snaps; it's just the way Ubuntu has the ecosystem set up now.


Don't forget performance. SquashFS has terrible IO performance and you pay that price every time you open an application.

https://forum.snapcraft.io/t/squashfs-is-a-terrible-storage-...

There's an even bigger security hole with Snaps. If a library is compromised on apt, they'll push the update and your applications will be updated. With a Snap, every single snap developer must somehow come across the error (which could be in a sub-dependency), find the fix, then deploy again. Very few (if any) dev teams are that well connected to the development of their dependencies. In contrast, the developers of those dependencies will be very well connected.


> If a library is compromised on apt, they'll push the update and your applications will be updated. With a Snap, every single snap developer must somehow come across the error (which could be in a sub-dependency), find the fix, then deploy again.

Snaps compete with third party apt repositories, not distribution-provided packages (but see below). I've seen a general trend in complex upstreams _bundling_ their dependencies even in builds destined for apt/deb -based distribution via third party apt repositories - because handling differing dependency versions across every distribution release is too complex. In this case, your distinction is moot. In both situations it is up to the upstream developers to update their dependencies. Only that with snaps, their sandboxing security model provides some mitigation.

It is true that some packages traditionally shipped by the distribution are moving to snaps, such as Chromium. This is because these packages are moving to bundling anyway, because updating them during the lifetime of a stable distribution release has become impossible any other way, due to the same dependency pain.


This could be fixed by separating the store into an "official" repo that only includes trusted apps and an opt-in "community" repo with the rest.

It's not really an issue with snap itself.


It kinda is. Canonical explicitly doesn't want to do that. The CLI doesn't support any server except Canonical's. [1]

Flatpak, on the other hand, does support that structure out of the box, and a few projects are already using their own repos. [2]

[1] https://forum.snapcraft.io/t/external-repositories/176 [2] https://docs.flatpak.org/en/latest/repositories.html


> I can `apt install` pretty much anything I want and it's almost guaranteed to not be malicious/dangerous, as only trusted/well-established developers can get their package into the Apt repos.

Maybe not malicious, but packages in the universe repos only have 'community maintenance', meaning that they are not updated with security fixes in any systematic manner. Given that you need universe to have a somewhat useful system, most people's systems are full of known holes:

https://people.canonical.com/~ubuntu-security/cve/universe.h...


> meaning that they are not updated with security fixes in any systematic manner.

Interestingly, that exact thing was always my worry about Snap (or Flatpak) as well.

Sure, big-name software such as Spotify will keep their Snap package well in order; they've got both the incentive and manpower to do so. (Incidentally, they could also use this manpower to build distro-specific packages).

But what about all the little open-source hobby projects? They'll be packaged with whatever library version happens to be latest at the time. And then, be updated whenever the hobbyist dev finds the time and inclination.

So on my system I might have a huge zoo of different versions of the same library, with various bugs or vulnerabilities.

If they all used the same system-wide library, at least they would all be fixed at the same time (when the library maintainers publish an updated .deb).

To me, Snap and the like feel like they're essentially the same as static linking, except more opaque.


Spotify aren't doing a great job with their Snap, it's been a few versions behind Mac and Windows for a while now. They could do with more dev manpower.


Heh, interesting, didn't know that.

I was just trying to pick a random example.


This is a bug and has already been fixed; debs can be installed by clicking. (Source: Alan Pope in the Ubuntu Podcast Chatter Telegram channel)


What do you mean? I'm assuming that this is a Snap Store feature?

My concern is really more about the command-line, as that's how I (and probably most technical users) install packages.


I guess my reply should be one level lower, it was an answer to "It looks like deb files only can be installed via the terminal."


I think OP is referring to snap being the default application install method in the GUI now, rather than apt.


there's debian!


I understand the hate for snap (and flatpak) and I personally use deb as much as I can, but they are the right move.

Yes, right now it's a subpar experience.

But you need to deploy it now to be able to improve on it and have it as the default in the future.

And I believe it's a better future, because:

- packaging a deb is difficult. I've done it, and it's a lot of complicated, hard work that you get to repeat for every release for anything that is not a simple program. Snaps and flatpaks are very easy to build and support across releases. This will increase the amount of software provided for Linux. In fact, it already has.

- the permissions model we have on our phones is very hard to replicate on Linux, yet something like it proves more and more necessary. SELinux and AppArmor are too low-level. Snap and flatpak make it easy to build sandboxed apps and request permissions by abstracting those away. What's more, they make it obvious to users that the apps they use are sandboxed, and which permissions they use.
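For a sense of scale, a complete snap recipe can be about this small; this sketch follows the GNU hello example from the snapcraft docs (base and versions may have moved on):

```yaml
name: hello
base: core18          # runtime the app is built against
version: '2.10'
summary: GNU hello, the "hello world" of GNU programs
description: A minimal sketch of a snapcraft recipe.
grade: stable
confinement: strict   # fully sandboxed; 'classic' would opt out

parts:
  gnu-hello:
    plugin: autotools
    source: http://ftp.gnu.org/gnu/hello/hello-2.10.tar.gz

apps:
  hello:
    command: bin/hello
```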

It will take a few years before we get a streamlined experience with snap and flatpak, and it will be at the cost of a bit of performance and disk space. For now they feel bloated, are not necessarily more secure, and are less well integrated.

But in the end, it will make the Linux ecosystem more user friendly.


I think Snap is a great way forward. I get the argument that shared libraries let you update one package to get a fix for the entire system without relying on any number of maintainers to independently stay on top of security issues, but in practice it doesn't work all that well and makes things more complicated. Maybe Nix improves this; I've been meaning to play with it for years.

From a distributor perspective, I love that it was simple and publishing at https://snapcraft.io/cyph gets us on every distro with no hassle. The alternative was likely not supporting Linux at all right away, or at best supporting a couple distros and requiring more work on the user end.


> packaging a deb is difficult. I've done it, and it's a lot of complicated hard work you get to repeat for every release for anything that is not a simple program

Right, but the point of a high-quality distribution is that this complicated hard work only needs to be done once (per package, updated for upstream changes obviously) by your distribution's package maintainer. Then it scales for free. That's the beauty.


> Right, but the point of a high-quality distribution is that this complicated hard work only needs to be done once (per package, updated for upstream changes obviously) by your distribution's package maintainer.

Sure, it works for a sysadmin, but it doesn't match at all how regular people use their computers.

It means you can only use packages that are part of the official repo and:

- it's hard to get in those official repo

- it takes a lot of time

- it assumes free software

- if your app embeds its own dependencies, it's usually rejected by default

- it's a bottleneck for publication (the repo team is limited in resources)

- you have no control over updates

- you can't make people pay for software that is in those repos

- you may want/need to package the software yourself or for yourself

On my computer I have a huge amount of software that I have not installed from the repos:

puslsms, vscode, telegram, dynalist, veracrypt, signal, discord, antidote, bitwarden, firefox developer, guitar, stremio, pop corn time, dukto, photoflare, sublime text, table plus and sublime merge

I think the simple act of spreading this argument demonstrates how removed one can get from the vast diversity of the user base.


Snaps solve the problems caused by users installing from third party apt repositories. Users do that today. Comparing against the virtues of distribution-provided packages is a strawman.

The exception is distribution packages like Chromium and Firefox. These don't work well in deb format, because users expect the latest upstream updates after the distribution releases, but upstreams add dependencies creating dependency hell for packagers. Snaps work better for this case.


Developer/packager productivity matters a lot, as people will then bother to package more nice things and keep those packages up to date. It's bad for everyone if packaging is needlessly hard.

Also, the software engineering world is clearly moving away from packaging as "integrate the application into distro X's set of library versions by continuously expending a lot of integration engineering effort" and towards things like Docker images and snap/flatpak/AppImage, which sidestep the problem of host platform dependencies as much as possible.


But your distribution's package maintainer has limited bandwidth, there's only so much software they can support. At some point you need a way for developers to do it themselves.


I agree with almost everything you say, and, yes, people used IIS on Windows as web servers until not long ago, so let's hope for progress. Totally agreed.


I don't see what that has to do with the post you replied to.


I'll enlighten you. About twenty years ago people used IIS for web servers, encumbered by costly licenses, for everything, and progress goes on. That's the point, man. Time goes on, and things get better. Hope you see it now. That was the point, anyway. I really hope it helps.

Progress goes on is just the general point. Nothing else intended. Should be clear enough.


Lots of people still use IIS for web servers


The parent is joking around, but I got curious and wanted some data on this, and found, rather surprisingly, that IIS isn't doing so well these days and seems to have dropped even more in the last year:

https://news.netcraft.com/archives/2020/04/08/april-2020-web...

https://news.netcraft.com/archives/2019/04/22/april-2019-web...


Yeah, I was just being an asshole. Didn't really mean to trash on IIS or anything. Just being dumb.


I hate snap.

It makes my system less predictable, increases load times, makes UI stuff look ugly, obfuscates process monitoring, hogs memory, cannot deal with files in /tmp (and I happen to use /tmp a lot), makes it hard to do audio (like connecting a mic to Chromium)... I can go on.

But in 2019 versions of *buntu it could still be removed:

https://github.com/cies/kubuntu-setup#remove-snap

I found a way to use Debian packages reliably on buntus with a trick that is shown here for Chromium:

https://github.com/cies/kubuntu-setup/tree/master/chrome

If this snap thing continues to invade the *buntus, I will move away from them. For now I still like the combination of stable + up-to-date + familiar (Debian-based), but --as others have said-- the snap thing goes against why I choose to use open source.


Out of curiosity, do you feel the same way about Flatpak?


Have not tried that. Maybe I should, but I've been on Debian-style distros for so long (and use 'em on servers and such) that I don't really want to learn the basics again.

Also, from my RPM days I remember essential (pkg mgmt) tools changing way too often for my taste. And I'm a KDE/Qt kind of person, and the RPM ecosystem was mostly GTK-based.


You can install flatpaks on many distros, including Debian, Ubuntu, Mint, Kubuntu, et cetera. I have some flatpaks on a Kubuntu laptop I maintain for a family member, and installation was about as smooth as it is on my own Fedora machine. I use the CLI exclusively for sysadmin tasks though, so YMMV if you prefer a GUI.
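On a Debian-family system, the CLI setup is roughly (app ID illustrative):

```shell
sudo apt install flatpak
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
flatpak install flathub org.videolan.VLC
flatpak run org.videolan.VLC
```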


On that note, what requirements pushed Ubuntu to select snap over flatpak?


Snaps evolved from click packages that were developed for the Ubuntu phone. The design dates back to 2013: https://lists.ubuntu.com/archives/ubuntu-devel/2013-May/0370...

Flatpaks, according to Wikipedia, were first released in 2015. I don't know about the current situation, but initially a bunch of features that are considered necessary for snaps were out of scope for the Flatpak project, such as IoT support.

So it wasn't so much that Ubuntu selected snaps over Flatpak. Flatpaks came along later, and Ubuntu had already invested considerable resources developing snaps.


Well, for one, they control it. But a good advantage it has over flatpak is that flatpak cannot run on servers while snap can, though it doesn't sound like it's enabled for servers in this release?


Just my guesswork, but I think they are afraid that THE ONE AND ONLY compatible Linux package format will be owned by Red Hat, and that "auto updating" packages in that format will usually roll through RH's servers/repos. They want to be in on the action when a standard emerges, and therefore push very hard for it, ship a crap product, and experience backlash.


> The only 'con' is the pushing of Snap packages.

I don't get why they're doing that. Snaps have mostly created problems for me: they don't scale when other apps do, or the styling is off, or they don't show up properly in the software manager. The snap daemon on one of my laptops always delays shutdown by 30 seconds. Generally a rather confusing experience.


I'm on Fedora where the equivalent is called Flathub/Flatpaks. There are definitely some rough edges still, but the possible upsides are amazing for me.

It makes it way easier for a developer to release for different Linux-flavors. The applications get sandboxed and there is much less risk to mess up your system. Stuff like permission, similar to Android/iOS becomes possible.

There are, for sure, problems too: it makes it way easier for a lazy developer to ship ancient libraries without security fixes, the "packages", now images, are much larger, and the quality check of the packager gets sidestepped. And, as always when you try something new, it gets worse before it gets better.

But I think the idea is good and I hope for the best.


The way i see it, software which is truly part of the distribution can and should still be packaged the traditional way. But snap/flatpak creates a standard way to install software which is not really part of the distribution. It replaces all the random tarballs and unpacking things in /opt that you occasionally had to do when installing third-party software. It makes sense for VS Code, IntelliJ, Slack, Chrome, etc, to be in snap/flatpak.

It doesn't make sense for open-source software built with traditional tools by the distribution maintainers themselves to be in snap/flatpak.

I note that on Ubuntu 18, snapd itself is distributed as a snap, which seems like a very poor idea to me.


> snapd itself is distributed as a snap, which seems like a very poor idea to me

I took this as a "vote of confidence" that snap could self-host snapd.

Hosting via snap means they can push updates and not have to wait for users to `apt update; apt upgrade`, as snaps auto-upgrade without user intervention.

I don't know of other benefits, though.
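You can at least watch that auto-refresh machinery from the command line (requires snapd):

```shell
snap refresh --list   # pending updates snapd will apply on its own
snap changes          # recent activity, including automatic refreshes
```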


> I'm on Fedora where the equivalent is called Flathub/Flatpaks. There are definitely some rough edges still, but the possible upsides are amazing for me.

Except Ubuntu and Fedora each use their own container format. The entire reason we're in this place is that Red Hat went with .rpm and Debian/Ubuntu with .deb. No one is winning in this format war, and no one wins by putting on yet another layer of abstraction either. Like with Docker, the problem that container app images are trying to solve (that of incompatible 3rd-party shared libs) cannot be solved by containers, because the reason for shared libs to exist in the first place is so that e.g. security vulnerabilities in old shared lib versions can be transparently fixed by having the OS install newer libs; but (as you say yourself) with containerized apps this is impossible. You might as well ship statically linked apps instead.


> The applications get sandboxed and there is much less risk to mess up your system.

Apparently snaps are also sandboxed between versions of the same application. My very first experience of snaps was Vuze getting updated overnight and all my configuration and torrents just vanishing.

I ended up rolling back the version (to restore everything that disappeared) and, upon discovering snaps are designed to never be able to lock a version number, shot it in the head by killing the daemon and chmod'ing its executables to not be executable.

I assume Vuze's snap was misconfigured, but it really turned me off of them and I've since decided to stick to nix instead of snap whenever I can.
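For what it's worth, the rollback itself is built in (snap name illustrative; requires snapd):

```shell
snap list --all vuze   # show the revisions still kept on disk
sudo snap revert vuze  # switch back to the previous revision
```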


Google says that snap app configuration should be copied forward to new versions. Sounds like either a Vuze bug or a snap package bug.


My understanding is snap and flatpak are not equivalent at all because snap is more permissive whereas flatpak is sandboxed.

Specifically, it is not possible to do gpg signed commits from flatpak (e.g., jetbrains idea) without tweaks upstream.

I agree with the overarching idea though: it gets worse before it gets better. See Wayland and how we still can't capture the display in OBS.

I am a little torn on this. Should the onus to “fix” be on jetbrains and obs? I personally think no where possible. Replacements should be drop in without effort or at least with clear guidance and not too much effort on behalf of individual applications.


Snaps are sandboxed. They can be granted extra privileges, such as access to $HOME, camera, bluetooth etc. Some of these interfaces such as $HOME can be gained automatically by any snap. Others need to be granted by Canonical (to automatically grant access on install), or manually granted by the user post install. There is also a 'classic' mode where the app is unsandboxed, which again needs to be granted by Canonical and in addition requires explicit acceptance by the user when being installed.
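For concreteness, a sketch of that interface model from the CLI; "my-app" is a placeholder snap name (not a real package), and everything is guarded so it only prints a note on systems without snapd:

```shell
# Sketch of snap interface management; "my-app" is illustrative, not a real snap.
if command -v snap >/dev/null 2>&1; then
  snap connections my-app 2>/dev/null || true   # list granted/available interfaces
  sudo snap connect my-app:camera || true       # manually grant camera access
  sudo snap disconnect my-app:home || true      # revoke automatic $HOME access
else
  echo "snapd not present; commands shown for illustration only"
fi
```

The `--classic` flag on `snap install` is the explicit acceptance mentioned above for unsandboxed snaps.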


"classic" snaps are not sandboxed, and you have to acknowledge that when installing them.

"modern" snaps are sandboxed, and you can change their permissions in the Settings/Control-Panel app. You might argue with the choice of permission granularity, but it IS there.


It's still work in progress but snaps genuinely make packaging easier. There are rough edges, especially on the desktop, but those are all being worked on, in collaboration with flatpak actually.

It will take a while but in the future it will not be any worse at runtime and it will be much, much easier to package and run apps across releases and OSes


Ubuntu had the calculator as a snap! What's the idea with that? It suddenly took 2 seconds to load and uses loads of memory...

I don't care that packaging got easier. I care about a smooth-working system I can understand and instrument, and that does not hog resources.


Snap and flatpak are great fallback methods. I first try installing software natively, but sometimes it's not available/compatible and you really need it to work, so flatpak is great to just get it running at all.

I'm not a fan of using it as the default though, and flatpak isn't immune to issues either, for example I use a flatpak package which regularly can't be updated for months because of some incorrect metadata if the error message can be believed.

I still think it's a good step forward for increasing compatibility especially across different Linux distributions.


They may be fallback methods, but snap is for sure not "great" (not a finished product, as others have pointed out). As you also found, snap is sadly not promoted as a fallback method.

This stuff is pushed with corporate power, not adopted because of what a great fallback method it provides.

The compatibility story is also balkanized from the start by the Flatpak/Snap split. Pfff, what a mess.


Oh! Is _that_ why the calculator is nigh unusable now? Sheesh, this just shot snap's credibility right out the window. It's faster for me to boot up the node executable and do my calculations there.


The minimal fix for the sluggish calculator on Ubuntu 18.04 to 19.10:

  sudo snap remove gnome-calculator
  sudo apt install gnome-calculator
The more complete solution is entirely removing snap. I did that long ago, but my process was a little painful. From my logs, it went something like this:

  snap list
  snap remove <copy-paste packages from above>
  sudo service snapd restart # core is tricky to get rid of
  sudo snap remove core
  snap list # check it's empty
  sudo apt install gnome-calculator gnome-characters gnome-logs gnome-system-monitor
  sudo apt purge snapd squashfs-tools gnome-software-plugin-snap
  rm -r ~/snap
See also: https://ubuntuforums.org/showthread.php?t=2409173&p=13826670...



`bc -l` is pretty awesome.


And sometimes, on some systems, the calculator got all transparent and unusable, so I left Ubuntu after that. Granted, I was on LTSes almost all of the time, so this might be highly circumstantial and subjective. But I've been there, as you apparently also have. I hope snap improves, for everyone's sake; it seems amateurish to push snap apps, in their then-current state, into LTS releases. I basically left after Ubuntu 14.04, and even that was a let-down from 12.04, and so on. In my experience.


Do you know why, technically, being packaged as a snap would increase start time and memory usage?


Also, it is not yet possible to install a snap using a chroot: https://bugs.launchpad.net/snappy/+bug/1609903 (the bug has been open for 3 years). Chroots are a key mechanism for building any customized Linux image.

I hope they smooth these rough edges further before they push more snaps onto the world.


Yeah, snaps are pretty much a disaster if you don't have a simple home directory. Put some directories on different volumes behind a symbolic link (because snaps are so large and you don't have room), or use /tmp as a temporary directory to save files into, and Chrome becomes useless (and fails quietly, without explaining why). I must admit, if they haven't fixed this stuff I'm likely jumping ship.


> Snaps have mostly created problems for me

I think this is the wrong way to look at it. You still have access to the same packages as before without snap. If you're using snap, that means you either explicitly chose it, or the package is not available in the repo.

The choice here isn't between a package or a snap most of the time. It's between: not using the app, packaging it on your own, or using a snap which is not perfect.


But certain packages are migrating to Snap-only, e.g. Chromium.

I understand why they're doing this (primarily to save on build times), but the way they've gone about it (using a transitional Apt package that just installs the Snap) is messy and inconsistent.

If I type `apt install` then I want an Apt package. Under no circumstances should this install a Snap.


> using a transitional Apt package that just installs the Snap

You need a migration path from an existing package. It's also nice not to invalidate every post on the web which describes the installation.

Chrome is special... it already doesn't really belong in the repository and was handled in a special way. It bundles tens of libraries in its binary without relying on the system/repo deps. It's actually more natural for it to live with all the other "include all deps" applications.


That's because Chromium is hell to keep updated in a stable distribution release apt archive. Upstream adds new dependencies that aren't packaged in the distribution's stable releases, for example. It's not practical to do anything but a whole load of bundling (in debs) to solve the problem. This then isn't much different to a snap any more anyway, and packaging a snap takes a fraction of the effort.

Anyone is welcome to maintain chromium as an apt package for your distribution release of choice, and you're welcome to point apt to that repository instead. The reason it's not happening is that it's increasingly impractical to do while also maintaining Ubuntu's quality standard for debs.


> If I type `apt install` then I want an Apt package. Under no circumstances should this install a Snap.

Exactly!

I found a way to remove snap[1] and get Chromium from Debian[2]. But I'm close to leaving the buntus over this drama.

[1] https://github.com/cies/kubuntu-setup#remove-snap

[2] https://github.com/cies/kubuntu-setup/tree/master/chromium


Yep, and the chromium migration broke my webex meetings.

First I had to research and find out how to give it microphone permissions. That worked, but the mic works for about one minute then stops.

So frustrating, I had to move meetings to Firefox which is where I do most of my work. I like to have two browsers to split the work to that which is most appropriate.


I found a way to remove snap[1] and get Chromium from Debian[2]. Have a look (I have the same problem with audio in the snap version):

[1] https://github.com/cies/kubuntu-setup#remove-snap

[2] https://github.com/cies/kubuntu-setup/tree/master/chromium


That's f'd up. Some people say the default sandboxing settings [in terms of basic snap defaults or setups] are basically disabled in some places. Hope that works. Bet the skin looks great. Most snaps I used long ago looked like an un-made-up pig, to be frank. I don't hate on snap; obviously I cared, or I'd not have wasted the time. Let's hope it improves. All improvements are great, obviously.


The problem, from a user perspective, is when something as simple as the included calculator is a snap package. I am biased and it's been a long time since I used Ubuntu or snap, but you shouldn't be held hostage to sloppy early snap experiments for the _calculator_ on an LTS release. This was long ago, and let's hope these things have improved, and will improve. All progress is progress; let's hope for the best. Freedom is good, long live Ubuntu and snap. Right.


> The snap daemon on one of my laptops always blocks the shutdown by 30 seconds

Oh THAT'S why that happens sometimes. I never bothered debugging it.


Same, I'd taken to just holding down the power button, mystery solved I guess.


These "statically linked" packages can solve certain problems very well. Packaging or installing Steam for non-Ubuntu platforms can be very tricky. I'm using the Steam Flatpak version on Arch and it saved me a ton of hassle.


>The snap daemon on one of my laptops always blocks the shutdown by 30 seconds.

I used Spotify, VS Code and some PS2 emulator from Snap and haven't seen this behaviour. Which snap package does that for you?


I don't think it's from a specific package. It's just the daemon. I didn't really dig into why or what; I don't have the time, and don't have the energy. I just get the message on the console when shutting down my laptop. Maybe it's updating something? Who knows.


So far, I'm very impressed with 20.04 on my personal laptop (a Thinkpad L390). HiDPI works better out-of-the-box (I didn't actually need to enable scaling manually, but did have to add a Chrome and Teams .desktop file with --force-device-scale-factor=1.2). Power management is much better out-of-the-box too (I've gone from 2-3h battery to 4-8h depending on workload, rivalling what I see under Windows).
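For anyone wanting to replicate the Chrome/Teams workaround: a hedged sketch of the kind of local .desktop override I mean (the path, app name, and 1.2 factor are just what worked on my machine; adjust to taste):

```ini
# ~/.local/share/applications/google-chrome.desktop (sketch; shadows the
# system-wide entry of the same name)
[Desktop Entry]
Type=Application
Name=Google Chrome
Exec=/usr/bin/google-chrome-stable --force-device-scale-factor=1.2 %U
Terminal=false
Categories=Network;WebBrowser;
```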

But you're right about Snaps. I love the concept (easier distribution, fewer dependency issues especially if you're a few years into the lifecycle of an LTS release) but they're not, in my opinion, production-ready yet so it's frustrating to see them pushed by Ubuntu. The main issues we have are the slow startups compared to non-snap apps (which is felt more acutely on old hardware with spinning disks) and the fact that they don't work if /home lives on an NFS server. That rules out their use for many academic and enterprise deployments.


Oh, I am surprised you praise the HiDPI support. I could not make Emacs work reasonably. It does not react to the widely documented Xft and GDK_DPI_SCALE settings as in other distros.

I am on Xubuntu though, if that makes a difference. And I have not tried for a couple of weeks; there have been many package updates since I last tried.


I actually tried installing xubuntu-desktop first (and then plain old xfce) as I fancied switching back to them after a few years using Gnome. But you're definitely right that out-of-the-box, XFCE HiDPI support was lacking. Widgets (panels, icons, etc) and fonts all scaled fairly independently, and I gave up after manually resizing a few elements.

On Ubuntu 20.04 (with Gnome), it just worked (with the exception of Chromium/CEF/Blink based applications like Chrome and Teams, which needed the aforementioned command line argument to scale them). I actually had to undo most of the scaling changes that came across with my restored home folder. The one other thing I had to change was the Terminal font as the default was a little too small to use comfortably on my laptop screen - I opted for Liberation Mono Regular, 12pt (px?) with 1.10x height.

As a former long-time XFCE proponent, I recommend you give Ubuntu + Gnome a try. It's the best it's been since the Gnome 2 days, and is nowhere near as power-hungry as it has been in recent years. The most power-hungry application I'm running is Evolution, where the WebKit process that renders HTML email frequently gets out of control and burns 40% CPU until I restart it (I'd use Thunderbird, but the Lightning plugin https://github.com/ExchangeCalendar/exchangecalendar to handle Exchange calendars stopped working with recent versions of Thunderbird some months ago).


I haven't tried 20.04, but this is the "dconf dump /" setting in 18.04 that got emacs in the ballpark:

  [org/gnome/desktop/interface]
  text-scaling-factor=1.5
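For terminal folks, the same setting can be applied with gsettings (the dconf path above maps to this schema/key); guarded, since it needs GNOME and a session bus to take effect:

```shell
# Equivalent of the dconf snippet above, via gsettings (GNOME only)
if command -v gsettings >/dev/null 2>&1; then
  # may fail outside a running desktop session, hence the guards
  gsettings set org.gnome.desktop.interface text-scaling-factor 1.5 2>/dev/null || true
  gsettings get org.gnome.desktop.interface text-scaling-factor 2>/dev/null || true
else
  echo "gsettings not available; shown for illustration"
fi
```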


I've noticed the battery life too! And I've always been running mainline kernels ahead of what the ubuntu repos provide.


I hope the HiDPI support is improved now. Right now, when I use fractional scaling on 19.10, none of the display settings get saved and so after a restart, I have to readjust all of the layout, orientation and scaling for displays. Pretty annoying for a dual-boot triple display system.

edit: Nevermind, seems that it got even worse for my case:

https://bugs.launchpad.net/ubuntu/+source/gnome-shell/+bug/1...

https://bugs.launchpad.net/ubuntu/+source/mutter/+bug/187340...


Try using mixed DPI displays. My main display is 27 inch 4K display, and on the sides I have old 19 inch 1280x1024 displays, used for IRC, music player, process monitor, email etc. Stuff I glance at from time to time.

_NOTHING_ supports this setup properly. Even Windows has bugs galore as soon as you mix high and low DPI displays. I don't believe anyone ever tests this scenario. I've learned to tolerate and work around the Windows bugs, but I've never gotten it to work at all in any Linux desktop environment I've tried.


I use a 1440p display at 100% and a 4k display at 175% on Windows and it works fine. The only bug that I see is that if a window is spanned across two monitors, it picks the scaling based on which monitor contains more of the window. I'm not even sure that's a bug, it's just how it has to be done. (My first mixed DPI setup was actually on ChromeOS. That got around the issue by not letting a window span monitors. Actually a somewhat elegant workaround, since it didn't let you do that before HiDPI support ;)

Windows and the applications I use even handle the different refresh rates of my monitors pretty well. I have a 165Hz main monitor and a normal 60Hz monitor. Apps are aware of the framerate differences when I move them around.

It wasn't always good, but as time goes forward applications and Windows itself handles HiDPI better and better.

(FWIW, I haven't had any problems on Ubuntu... except that letting the monitors go into power saving mode resets the display scaling override. That was with 18.10, though. I use an Ubuntu VM under Hyper-V now... begrudgingly.)

Having said that, I am not a GUI power user by any means. On Linux, all I ever use are a browser, terminal, and Emacs. On Windows, I use CAD software and games (and a browser, terminal, and Emacs). It all works well enough for me.


MacOS seems to use the same approach as ChromeOS, where a window can't span monitors.


I am using a 1200p, a 1800p and a 2160p monitor side-by-side on MacOS and it works fine. The 2160p monitor is rotated to portrait mode for reading/writing text documents. I can hotplug a monitor and it will automatically switch to the right layout, remembering the correct location for all open windows.

I am using a 1080p and a 2160p on ArchLinux/Wayland/Gnome3. It's not perfect, but it's not unusable either (compared to X11, which was completely broken).


You might want to give KDE Plasma a shot: it handles scaling quite well.

I find it much superior to Gnome, but that's a matter of preference of course.


Not true when you are using Wayland and GTK-based apps like Thunderbird / Firefox / Chrome or Electron based apps.

That was the main reason that after more than a decade I turned to Gnome. HiDPI and fractional scaling just works since then. Highly recommended!


>The only 'con' is the pushing of Snap packages.

The graphical "gnome software" software installer was replaced in the default install by a snap-only fork named "Snap Store" with no support for Flatpak (see https://bugs.launchpad.net/snap-store/+bug/1871944). Were there some technical reasons for this move or was it a business decision to kneecap the other package format?


IIRC GNOME removed support for snaps from gnome-software so this was the only way forward.


According to git, gnome-software snap support still seems to be actively developed: https://gitlab.gnome.org/GNOME/gnome-software/-/commits/mast...


It's not true. GNOME Software still ships the Snap plugin. It's Canonical that decided to fork gnome-software and provide it under a different name, without the competing plugins enabled.


No. Canonical absolutely has the resources to make an appstore that supports all three package formats (dpkg, flatpak, snap).


> The only 'con' is the pushing of Snap packages. It looks like deb files only can be installed via the terminal.

Another solution to this (aside from synaptic, command line, other clients) is to use Kubuntu (which has an apt based package manager called Discover, much improved in this release over prior versions). KDE still doesn't work great on tablets, but in all other respects it's a lot nicer than Gnome. Spend 30 minutes tweaking it to your preferences after install (apt install kubuntu-desktop) and you'll have a desktop that does exactly what you need it to. (You could also install KDE Neon, which packages latest stable KDE with a LTS Ubuntu core.)


Some problems I had with Snaps:

* Installed VS Code. The terminal in VS Code was in a different environment from the system terminal, so it used different Python environments. Took me a while to figure out why installing a package was not working.

* Installed Slack. When I click a link in Slack, it opens a new Firefox process that does not have access to my normal profile, saved logins, etc., and is not shown as a Firefox window in the sidebar.

In both cases I uninstalled the snap version and installed the "real" version.


The new-Firefox-on-link-click in Slack can be fixed by going to `about:profiles` (or something like that) and making sure there's only one profile, and that it's the default.


I've not used Snap, Flatpak, AppImage or any of the other app packaging tools before.

What are the benefits? Are they effectively like bundling an application with all its dependencies? Is it a way of running "native" apps instead of in a docker container?

What should I and shouldn't be using it for?


The negative of snap, at least, was that most apps became 'un-skinned' and looked like unthemed things you'd launch directly in an X11 session, so they looked their base-worst, in my experience. The first Ubuntu LTS I tried that had snap even had problems with the snap-packaged calculator (the main included calc app, which for some dumb reason was a snap app instead of a .deb), and the VLC snap package I tried was totally unskinned, due to being snapped I suppose, as the PPA and .deb versions of VLC certainly were skinned, and looked better.

Now I am on EndeavourOS, an Arch-based distro, easy to use and set up, and this is due to snap being introduced in Ubuntu directly, plus not wanting the PPA chase game, et cetera. It's just my opinion, and certainly not a pointer; just my own perspective here. Got nothing against Ubuntu or anything. It's a free world, thankfully, still.


Snaps also create a myriad of mount points, which is quite annoying. I like AppImage though.


Tell me about it: the output of df -m was like miles long. And snap had separate auto-update procedures and janitorial daemons, which I had a hell of a time removing and disabling. I guess the best thing about snap is that you can uninstall all of them, I hope, and find alternatives with the same features either in the main repositories or through the ever-lovable PPAs. I loved 12.04 LTS, even 14.04 LTS, but this was too much. I had to leave, man. You had to be there to know it.


I've disabled snapd and set it on hold so it's hopefully never installed again precisely because of this.
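In case it helps others, a sketch of that disable-and-hold approach (assumes Ubuntu with systemd; guarded so it only prints a note where apt is absent):

```shell
# Stop snapd and keep apt from pulling it back in on upgrades.
if command -v apt-mark >/dev/null 2>&1; then
  sudo systemctl disable --now snapd.service snapd.socket 2>/dev/null || true
  sudo apt-mark hold snapd 2>/dev/null || true  # apt now refuses to (re)install it
  apt-mark showhold 2>/dev/null || true         # verify: "snapd" should be listed
else
  echo "apt not available here; commands shown for illustration only"
fi
```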


There is sandboxing to protect the security of your computer.

An application that runs as a snap does not see the system-wide "/tmp". It gets its own /tmp. If the snap is chromium, it has /tmp/snap.chromium/

The chromium snap would not be able to view files outside your $HOME. Also, it would not be able to read any dot files in your $HOME. Any configuration it makes, is separated into $HOME/snap/chromium/

There are a few rough edges, but the bottom line is that you get better security if something goes wrong, and it makes it cheaper for maintainers to create updated packages and distribute them.
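A guarded sketch of how to see that isolation for yourself, using the chromium snap from the example above (prints a note if it isn't installed):

```shell
# Each confined snap gets a private /tmp and per-snap config under $HOME/snap/
if command -v snap >/dev/null 2>&1 && snap list chromium >/dev/null 2>&1; then
  ls -d /tmp/snap.chromium 2>/dev/null || true    # its private tmp dir (while running)
  ls -d "$HOME/snap/chromium" 2>/dev/null || true # its separated configuration
else
  echo "chromium snap not installed; paths shown for illustration only"
fi
```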


They claim that it does sandboxing, which to my knowledge was never actually activated in any meaningful way.

I have used flatpaks on fedora which was nice, and I also built some which was quite painless. I hate that I can't launch them from the terminal like usual though.


You're right regarding the sandboxing. Snaps use 'plugs' to add sandbox exceptions that can be configured by the package author and enabled at install time, e.g. to allow access to your home area or the network.

Even if the sandboxing may technically prevent the package escalating to root or whatever, this is a fairly moot point on a personal computer as everything valuable is probably in your home area.


> What are the benefits?

Common packaging layer between distros and independent packaging. You don't end up in a situation where the software is packaged for Ubuntu, but not RedHat for example.

> Are they effectively like bundling an application with all its dependencies?

Kind of. Some dependencies, yes. But there are many layers which apps can depend on, so for example I've got an app which depends on "GNOME Application Platform version 3.36" - that is shared between apps which need it.

> Is it a way of running "native" apps instead of in a docker container?

Pretty much. Docker is more convenient for server software. For GUI apps, flatpak/snap/appimage provide a nicer interface and more targeted sandboxing. For example, flatpak allows limiting which DBus interfaces are reachable, which is not doable in Docker. It also allows things like "filesystems=xdg-download;home:ro;", which protects your home directory files from being overwritten.
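To make that last point concrete, a guarded sketch using flatpak's own CLI; the app ID org.example.App is a placeholder:

```shell
# Inspect and tighten a Flatpak app's sandbox from the command line.
if command -v flatpak >/dev/null 2>&1; then
  # allow writes only in Downloads, mount the rest of $HOME read-only
  flatpak override --user --filesystem=xdg-download \
      --filesystem=home:ro org.example.App 2>/dev/null || true
  flatpak info --show-permissions org.example.App 2>/dev/null || true
else
  echo "flatpak not available here; commands shown for illustration only"
fi
```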


This, for me, marks the release when (somewhat unfortunately, 18.04 LTS has been good) I will not use Ubuntu for any new installations - desktop or server. I can see how snaps are beneficial for novice desktop users who want an AppStore-like experience, but how do they see people adopting this for Ubuntu Server?


Apt isn't going anywhere. What changed for you on the server side?


Just upgraded on my non-essential home server, and apparently lxd is only available via snap now, for whatever reason.

https://packages.ubuntu.com/search?keywords=lxd&searchon=nam...


You always have the PPAs; that's one reason, besides snap itself, that I moved. Let's hope it picks up and improves. Right.


A couple of packages (specifically which slip my mind) that either were only available as snaps and not in the main apt repos, or had way older versions in the apt repos.

At that point, there's no reason not to run Debian instead.


I'd like to understand which packages, I've had no packages I need disappear. I'm skeptical that the packages are actually missing to favour snaps, and not simply renamed or replaced.


My guess is more attention and focus on snaps and increasing neglect maintaining the deb repo.


There are no server snap packages in Ubuntu 20.04 LTS. Is your stand based on principle?


Snap packages are the reason I moved from Ubuntu Mate to Fedora Mate.

I use free software to own my computer; if it decides on its own when to update, like snaps do, that defeats the purpose.


Does that distro use Flatpaks? I also hate snap with a passion, but I don't think Flatpak will be better. What are your thoughts on this?


No, Fedora Mate has no Flatpaks by default. I do use them on my machine though. They are not as integrated into the system as snaps on Ubuntu, but they update only when I tell them to, and I like the additional security benefits of the Flatpak project overall.


I think security-wise it got worse. With traditional package repos, someone still does some due diligence. Also, it's not clear what's going on, and transparency is a huge prerequisite for security.


The only thing that holds back my employer from switching to Linux for 6500+ desktops is Microsoft Office and full MS Exchange compatibility. The colleagues who need Adobe products have to work with Apple machines anyway.


Just switch to LibreOffice. The dependency on .docx/.xlsx is overblown propaganda.

I personally had a business in 2007-2011 that replaced Windows with Linux. All that time I heard the same old song again and again: "But MS Office is so essential!". Turned out, not so much, on over 5000 PCs in several dozen companies.

Update: the business eventually failed, for one simple reason: Ubuntu Linux is too freaking stable. The idea was transferring customers to Ubuntu and having them pay a monthly service fee for maintenance and fixes. We soon discovered that a properly configured GNU/Linux desktop PC can run for YEARS without any maintenance at all. Customers figured it out too and opted for one-time fees when something went wrong, and it just wasn't sustainable.


In addition to sibling comments about the UI: if you work with Asian/complex-script languages, Libre/OpenOffice is a terrible choice.

I use WPS Office (not open source) on Ubuntu mainly because it supports Asian/complex scripts much, much better (as in, almost identically to MS Office).


Well, it's a matter of preference and taste. I can't stand the ribbon interface introduced in MS Office 2007.

I'm not familiar enough with Asian/complex-script languages, but I was once in China and installed Ubuntu 10.04 (brand new at the time) on a class of PCs where students were struggling greatly with the Chinese version of Windows. On Ubuntu they were quite happy with how pinyin support worked in the OS and LibreOffice.


I don't do Asian scripts, just Latin and Cyrillic, but I still use WPS Office, since a lot of the options are in exactly the same place as they would be in MS Office. It's so nice not to have to go searching for things.


Have you ever tried to open an Excel sheet from 2002, the one that is existentially important for the whole success of the company :), with macros as big as the Windows codebase itself, in LibreOffice? I can tell you, MS Office will not leave the field.


Don't even get me started on XLS from 2002. That pathetic format was poorly supported even by MS itself: files opened with great difficulty, and differently, even on the same version of Excel on different PCs, and sometimes newer versions could not open them properly at all. I once discovered that a single .XLS file could internally contain different types of encoding within the same document.

Anyway. It is clear that MS Office will not leave the field, like FORTRAN, but the key principle is the same: if you want to get out of the pit, you should stop digging. DO NOT create new spreadsheets in XLSX format. DO NOT ever send out MS Office documents. That's it, simple rules. Follow them for 5 years and you'll wonder how little you depend on MS Office.


Recently got some UX whiplash after being in engineer world for years and having long ago migrated to .md/.rst over rich-text editors. To quote the Matrix: "I don't even see the markdown anymore, I just see bold, italic, header..."

Then I started working with some folks on a Covid mask project, and they are massively struggling with MS Office docs, Dropbox, file versioning, web deployment, and WordPress, for something that needs both high uptime and fast updates. I'm like: why not just use git, GitHub, and Jekyll static sites? Add better-looking pages in time.

...

Let's just say they weren't thrilled with that idea.

Sticking to open formats (flat text, ODF) is a great start for the individual, but it's still a workflow shift for everyone else, which is painful. It was a stark reminder of the frustration I experienced when shifting my own workflow to more open standards. Add to that some serious Hyrum's law when it comes to Excel usage.


Yes! I'm with you on that. But it's not the tech or IT people that we are talking about here. This discussion has been going on for many years now and there will be another 10 years of "simple" solutions. I'll be honest here: as a programmer and application supporter (for 17 years now) I earn money with that kind of problem. I don't like MS and it would be a joy to switch to GNU/Linux. But I think I will see someone standing on Mars before MS Office is replaced.


Just have to move to an adjacent/another field. I have not seen Office in ten years, and haven't used it in twenty.


> if you want to get out of the pit, you should stop digging.

Great quote, I'll try to remember that.

Also, folks forget how times change. Ten years ago most were still trapped on Windows. Today a lot fewer are.


Sounds like something a human wouldn't have any use looking at anyway. Multi gb excel files? This is what programming is for.


> Multi gb excel files? This is what programming is for.

Excel is arguably the most successful programming language ever, or at the very very least second most successful after JavaScript.

(Note to future commenter: I said "most successful", not "best".)


Yeah, Excel, Outlook, etc., their DB app Access, Word, and Photoshop and the rest of Adobe will be tough beasts to either manhandle or convince, I guess.

Looking forward to handle being a terminal app to install all of those things. Throw us some games too.

No manual entry for handle


Reality does not work in the way you wish it to.


Excel is a very nice purely functional programming environment with a fantastic debugger.


Excel is a nice flight-simulator, it's just too bad it doesn't wear a miniskirt.

Edit: here's the procedure to activate a flight sim in Excel 97, as I understand it: https://kb.iu.edu/d/agqw

You can fly around using your mouse or the arrow keys. The monolith lists Excel 97's developers.


> The dependency on .docx/.xlsx is overblown propaganda.

It depends on the company. One will use it as a glorified letter composer. Another has SharePoint integration with version control, and documents with ActiveData integration into SQL servers and (literally) thousands of lines of scripting.


Nothing, not LibreOffice, not Google Sheets, not Excel for Mac can replace Excel for me. It's wonderful, they keep adding features and I know all the quirks like the back of my hand. If I ever swap to Linux on the desktop, I'll keep a VM around just for Excel.

Word is nice too, although that one I could probably replace rather quickly (word processing is not hard to get right).


This only shows that you are so used to MS Office that you don't want to learn a different app, not that MS Office is really better.

When I switched to OpenOffice.org in 2006 it took me some time to adjust, and you know what? I found that OOo was way more stable, predictable and logical in how things are done. It was also really free, as in freedom, and I could run it on any OS of my choice. Now I know LibreOffice like the back of my hand, and it's wonderful.

People who have used MS Office all their life usually do this: launch LibreOffice, see a different interface, and go "nah... I don't want any changes, I'd rather keep myself and the world vendorlocked into proprietary software".


> _"nah... I don't want any changes, I'd rather keep myself and the world vendorlocked into proprietary software"_

People don't have a problem changing if the change doesn't break their work(flows). The fact that Internet Explorer is pretty much dead proves it: everyone switched to Chrome although it is different.

MS Office document format support is simply not good enough on any alternative (of which Libre Office is the only serious contender).

I have tried to switch to LibreOffice on multiple occasions over the years, but every time I gave up after a couple of weeks of frustration, because I couldn't collaborate on shared documents. Either the stuff I created in LibreOffice looked different when others opened it in MS Office, or I was breaking the documents for other people.

I don't consider MS Office to be better - it's simply that the alternatives are not compatible.


> MS Office document format support is simply not good enough on any alternative

This is a key fault in your logic. You SHOULD NOT judge the alternative by how it supports the document format that is specifically created in such a way to block competition from effectively supporting it. The company that did it also corrupted the international standards organization to get it an official 'standard' status.

But the truth is, you don't really need MS Office document format support at all. The company I founded in 2007 has never sent out any document in MS Office format, never owned any copy of MS Office, and we have survived just fine since then.

If you want an electronic spreadsheet, use an open, documented and well supported standard, OpenDocument, and get on with it. Oh, and it is supported just fine on Ubuntu 20.04 that we are discussing here, as it is on a Mac and Windows. The only thing it lacks is a proper online collaborative editor, that's true.


> The company I've founded in 2007 never ever sent out any document in MS Office format, never owned any copy of MS Office, and we survived since then just fine.

It's great that it worked out for you, but that's just not the case in most business environments I have been dealing with. When a person sends out a document to a customer and it turns out the customer didn't see the content as intended, the alternatives get deleted and MS Office gets reintroduced.

Just to be clear, I'm not a proponent of MS Office. As a matter of fact I run on Mac and Linux and don't have many touch points with Windows, but MS Office is simply a necessity in most business environments.


> but MS Office is simply a necessity in most business environments.

My experience says otherwise. Most businesses can replace it with alternatives without too many difficulties. Definitely 95% of SMBs can; I have led many transition projects myself, where companies moved 90% of their computers to Ubuntu and 100% of their computers to OpenOffice.org / LibreOffice.


It's great that you got such good results from it; say yes to free software. May it continue and may it improve. Perfect, man.


You'd lose the realtime collaboration features though.


Ignoring format issues, LibreOffice just feels extremely sluggish for me. I get little freezes all the time, the UI is laggy, and startup time is pretty bad too. I came to dislike it so much that I prefer to edit things in Google Docs over LibreOffice.


I haven't tried it, but some people say 'OnlyOffice' is a better branch than LibreOffice; it seems to support the MS Office formats better. The only trouble these people had was with certain fonts, but I guess the remedy would be to embed all fonts upon saving and sharing. If that helps, then perhaps OnlyOffice, desktop version, is an option for some. I plan to try it, although I use my old Office versions in a VM for now.


I love OnlyOffice along with WPS Office, but they're not "branches" of LibreOffice at all; they're completely different products. You might be confusing this with LibreOffice branching from Sun's OpenOffice.org / StarOffice.


It's most likely that I am totally wrong; they just told me it was some "Latvian" branch, or Lithuanian, so I don't know. I might be confused overall, but it's good someone knows. Thank you.


It's always felt slow (I mean, since early in the OpenOffice days). I've assumed it has something to do with its Java dependencies. Back when I used linux I'd stick to Abiword and Gcalc and such when I didn't need Microsoft compatibility, for that reason. On Mac, Pages and Numbers and such are nice and light. Google Docs is too input-laggy and glitchy for me.


I personally haven't experienced any problems with LibreOffice performance, but these things happen. Maybe you'll feel better about it since that sluggishness is compensated by the fact that you can run LibreOffice on a proper OS, not slowed down by mandatory antivirus software and the like.


Office 365 seems to work OK.

The bank I used to work for moved to cloud Exchange already.


I think this is the answer.

Since edge has moved closer to chromium, chromium on Linux should be decently supported there.


I use Ubuntu at work. Chromium is now snap-only, and snap crapped itself with some error messages that didn't immediately lead to a solution when thrown into Google. Firefox was my main browser anyway, so not too great a loss, but oh man...


There is a way, which I documented here:

https://github.com/cies/kubuntu-setup/tree/master/chrome

I also use FF primarily, but sometimes I need Chrome (chromium actually, I should fix that).


Thanks for that. A little hacky, but the worst that can happen is it just doesn't run, I guess...

URL is now https://github.com/cies/kubuntu-setup/blob/master/chromium for the curious.


They ramped up the security in Chromium and it is not identical to the old package. It is too much security that hurts ease of use.


> It looks like deb files only can be installed via the terminal.

That's a good thing! I can't wait for deb installers to stop mucking about, dumping random files all over my system. I also can't wait for the end of the whole PPA nightmare, where the only way to get cutting-edge versions of software is to give random packages from the internet the right to muck about with the system.


> The only 'con' is the pushing of Snap packages.

I am disappointed that snap still forcibly clutters everyone's $HOME with an extra directory[1] that cannot be moved or renamed. It was reported four years ago, and nearly a thousand people have joined the bug report. The glacial pace at which they're addressing this problem makes me hesitant to embrace snap.

[1] https://bugs.launchpad.net/ubuntu/+source/snapd/+bug/1575053


Idk, since 10.04 I've appreciated that Ubuntu is as mainstream as a Linux desktop distro gets and just works (I used to run various distros including RH/CentOS, Debian, Suse, Gentoo, and also FreeBSD). But I'm on the brink of leaving (maybe towards Void Linux, Debian/Devuan, or back to Slackware) because the things I care about have atrophied (lightweight DE, real Unix environment) while crap (snap, systemd, gnome-shell, invasive command-line suggestions) has taken over. On 19.10 gnome-shell really gets in the way for me.

And the Unix command line utilities I use all the time have stopped working for me. For example, even vim on Ubuntu 19.04 comes with broken syntax highlighting OOTB, and annoyingly and incompetently inserts tabs and spaces where there must be none, something I haven't seen any vi implementation do in over 30 years; also, awk rejects complex regexes all of a sudden on 19.04.

I know there are other 'buntus shipping with alternatives to gnome-shell, but for me the point is/was that there is a large installed base eliminating glitches, which isn't the case for alternate 'buntus any more than it is for non-'buntus.

I also don't see the benefit of yet another packaging format; the point of a desktop distro should be to deliver a consolidated set of shared libs rather than going nuclear and shipping everything in fscking containers, especially when there are exactly zero new desktop apps coming out these days (with the exception of Chrome, which I'd rather install containerized, and which I only use for website testing, my main browser being FF anyway).


> On 19.10 gnome-shell really gets in the way for me.

Xfce is a nice alternative to GNOME 3 that reminds me a lot of GNOME 2.


> It looks like deb files only can be installed via the terminal.

Oh, I did not even notice that during 4+ weeks on the unreleased Xubuntu Focal because I never use anything but the command line to install software. But the target audience of Ubuntu is wider, so I think that is a very bad change. One might think they have learned from https://bugs.launchpad.net/ubuntu/+bug/1 , but obviously not. Well, formally they declared bug #1 solved, but few would agree that the Linux desktop has won.

How does the snap "repo" compare to the Ubuntu repos in terms of package count? I would have guessed the difference is orders of magnitude, even if many of the packages might not be useful for those requiring an installation GUI.


One can dpkg -i, but how about dependencies? I always had a problem with that. Whenever the app store (which usually pulled in all deps) didn't work, I just naively used dpkg -i pkg.deb and chased all the deps manually through Google. At least I have done that. Thank god for Arch. No politics/religion intended.


Apt is the best way, but if you don't want to or can't use apt, then try gdebi, which does the dependency chasing for a .deb you give it.


Yeah I was probably just ignorant using dpkg, didn't know better. Thanks for the info.


apt is your friend. Or apt-get if you prefer old school. Or aptitude if you are willing to learn something pretty powerful.


Huh, how well does that work with individually downloaded *.deb packages? You're probably right, but the included repos are usually full of outdated versions. If you get a too-new deb from some experimental site (in terms of Ubuntu freshness), then no packages in the repo will be new enough to satisfy the dependencies of that locally downloaded .deb. I guess PPAs are a thing, but yeah, it's good to be off this train, no disrespect to Ubuntu; if it picks up I will surely be glad and, chances are slim, might even use it again. We're all friends; let's say yes to progress, in general, in the right directions.


> huh how well does that work with individual downloaded *.deb packages?

You can use `apt` to install `.deb` files on the local file system using:

    apt install ./path_to_the.deb


And it will take care of all dependencies? What if the repo packages are at version < 2.4 or something? I have run into that tons of times. I'd rather use the AUR than PPAs these days, to be frank. I love Ubuntu though, as a general idea. Let's just hope it works sometime; the last few releases have been a bit disappointing, if it ever truly wasn't. I just gave up after 14.04. Tried to like it, but I failed.


Since when? That's cool; I've been dpkg-ing my downloads and then running `apt --fix-broken install` to pull in the necessary dependencies.
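To make the difference in this sub-thread concrete: plain `dpkg -i` does not resolve dependencies (hence the follow-up `apt --fix-broken install`), while `apt` resolves them in one step, provided the argument looks like a file path (contains a slash). The helper below is a dry-run sketch with a hypothetical package name; it only prints the command it would run.

```shell
#!/bin/sh
# Dry-run helper: decide how a package argument should be installed so
# that dependencies get resolved. It echoes the command instead of
# executing anything; 'foo_1.0_amd64.deb' is a made-up example.
install_pkg() {
    case "$1" in
        */*.deb) echo "sudo apt install $1" ;;    # local file, already has a path
        *.deb)   echo "sudo apt install ./$1" ;;  # prefix ./ so apt treats it as a file
        *)       echo "sudo apt install $1" ;;    # ordinary repository package name
    esac
}

install_pkg foo_1.0_amd64.deb   # → sudo apt install ./foo_1.0_amd64.deb
install_pkg htop                # → sudo apt install htop
```

The `./` prefix is the whole trick: without a slash, apt would look for a repository package literally named `foo_1.0_amd64.deb`.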


>Ubuntu is also researching if they can get Adobe on board.

Where did you hear about this?


They mention it here, but it's pretty non-committal: https://ubuntu.com/blog/ubuntu-20-04-survey-results

> Adobe Photoshop (Illustrator®, Acrobat and Creative Cloud®) were asked for over a hundred times. (...) bringing these kinds of tools and applications to Linux is an ongoing effort.


I've been using 20.04 since the early betas and Ubuntu since the first release. I haven't been this impressed with an Ubuntu release since that first install.


Snaps are significantly safer for users and easier for publishers to produce. The styling issues and so forth occur because of a security-first mindset. [Disclaimer: work for Canonical, but not on the Snapcraft team]


> because of a security-first mindset.

I would believe you here if snap were not an auto-updating system. That is about the least security-first mindset there can be; it has a single, gaping point of failure, after which an attacker may get hold of millions of users' computers without any action on their part.


As a user I do not care about publisher's convenience. They should solve their own problems without making users the victims of said convenience. As for "safer" - I'll buy this argument when you get rid of slow startup, bloat and other problems associated with snap.


Nice release! The latest Gnome included (3.36) is so much smoother and faster!


What's keeping Adobe from releasing Photoshop for Ubuntu? Convincing? Or real technical difficulties, like recoding the interface?


They're probably just not willing to absorb the support costs for such a small market.


Colour profiles, display calibration.


Sorry I’m not adding anything to the conversation here, but have they removed that strange ‘swipe-up to login’ gesture at the start screen?


I don't know if they've removed the ability to do it. I don't think you've ever had to do it. You can just hit space, or just start typing your password from the lock screen.


Yeah, you can just start typing your password, last time I tried Ubuntu at least. Or press any key and the prompt should appear. Even pressing Ctrl could work, but any character key should do the trick, in my experience.



I don't believe my 19.04 login manager does that. I do recall it being annoying because my first keystroke would just move that screen up. Now it starts at the user list by default.


I'm still on 18.04, and it doesn't do it. I remember it in an older version though.


Does the snap store mean that much software was moved out of the apt repositories or is it also still available there as well?


Most people stick with Mac and Windows because of MS Office, not Adobe.


Microsoft Office has fairly reasonable and popular browser-based alternatives. Adobe’s suite doesn’t.


Sort of, and only really starting in the last year or so, and still limited from the real version, but the market for adobe products among anyone who wants to use Linux is pretty much zero.


(I'm talking about Google Docs.)


Oh, well.

Google Docs is simply not capable of business level work. Sorry.

I thought you were talking about Office 365, which is significantly better, but still not up to par with the native applications.


"It looks like deb files only can be installed via the terminal."

There's another way?


Even as a Windows guy, I was wondering: why would you use anything other than the terminal? When you understand deb packages, you only want the terminal. Everyone else will want to use a UI and will have no idea what a deb file is.


Everyone understands the paradigm of double-clicking an icon to launch a program or an installer. Many aren't going to understand the exact command they need to type into a terminal to get it to do what they want. This is almost always the answer for why people don't want to resort to using a terminal: discoverability. You can look it up, of course, but there's no good reason IMO not to include GDebi for users unfamiliar with that workflow. Other distributions have it by default, but Canonical is favoring their own tech here over the needs of the users. One more useless roadblock.


My dad runs Ubuntu at home. He's completely computer-illiterate. When I need to help him install something over the phone, it's really nice to be able to have him download a .deb, double-click on it, and it just works. I've done this numerous times over the phone with him... it would take me an hour just to get him to type apt install properly.


For me the struggle is: is the package called libcairo-devel, libcairo2-dev, cairo-dev, or something else? I hate running a search every time.
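There is at least a convention behind the confusion: Debian/Ubuntu dev packages are usually named lib<name><soversion>-dev. A small, purely string-based sketch that generates the usual suspects, which you can then feed into `apt-cache search` or tab completion to confirm (it touches no package index itself):

```shell
#!/bin/sh
# Print likely -dev package names for a library, following the usual
# Debian naming convention lib<name>[<soversion>]-dev. Pure string
# manipulation; no apt lookup is performed here.
dev_candidates() {
    name=$1
    echo "lib${name}-dev"
    for v in 1 2 3; do
        echo "lib${name}${v}-dev"
    done
    echo "${name}-dev"
}

dev_candidates cairo
# prints libcairo-dev, libcairo1-dev, libcairo2-dev (the real one on
# Ubuntu), libcairo3-dev, cairo-dev
```

On a real system, `apt-cache search cairo | grep -- -dev` does the authoritative lookup; the helper just narrows what to grep for.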


Oh yeah, there used to be at least two more ways. Not sure which of these are no longer possible, but you used to be able to use:

1) Synaptic package manager which is a GUI for searching and installing .debs. My first exposure to a package manager when I first installed Ubuntu some 12 years ago.

2) Simply double-clicking on the file, opening the Gdebi (I think?) UI for installing that specific .deb file.


In my experience, simply double clicking on the file never actually worked for installing debs. The UI would pop up, give me a nice pleasant "Install" button, but clicking it would either no-op, crash the UI, show a progress bar that stalled out, or act like everything worked perfectly but not actually install it. Command line install works flawlessly every time.


Judging by your other comment (https://news.ycombinator.com/item?id=22954061) you were using Software Center for the installing. Gdebi was working fine back before Software Center.


I was using whatever popped up in a default Ubuntu VM when I double-clicked a .deb file. If they've chosen to make that not the best software for installing .debs... let's just say I can see why people call Linux distributions user-unfriendly.


Ok, great story, but not really relevant to the conversation around the different ways you can install a .deb, which is the subject of this particular sub-thread.


Synaptic should still work at least for the X session. I haven't used it in a while, but it is a really good package manager. They got it right a decade ago.


I like the simplicity of Eddy for installing packages https://github.com/donadigo/eddy


It works marvelously for me. Never stumbled upon an issue with it.


In 18.04 you can install via the Software Center


This almost never actually worked in my experience. It was more of an annoyance than not even pretending to work, because I'd take time trying to get things to install in the UI and they wouldn't work before I eventually needed to revert to CLI.


My experience mirrors yours. Could never get double-click on .deb to work, same symptoms, and Software Center was always buggy/frustrating. dpkg -i all the way, but that's definitely a ux hit for broader adoption of Ubuntu.


Ah... I want to like Linux. I really do. But I regularly install software across MacOS, Windows, Debian, and CentOS, and the only time I have trouble installing is for .deb/.rpm/.snap.

Installing on Windows is run the installer and click through the guided wizard. Annoying, but it works and it’s foolproof so whatever.

Installing on MacOS is unzip and click and the application is already running. Drag over to ~/Applications if I want to. Easy as fuck.

Installing on Linux is trying to remember what apt-get/snap/dpkg/etc. I need, then on top of that what --dangerous or --classic or I don’t even know what all else I need, and cross my fingers and hope it works. It’s honestly aggravating.


apt install ./package.deb installs the package and dependencies in one go.


Exactly!


Looking through the features

  Gnome 3.36
  - X11 fractional scaling.
I got all excited, because this means that now finally Ubuntu will be usable on my Lenovo X1E with High-DPI display.

But then further down below:

  Fractional scaling does not work with the NVIDIA proprietary driver (bug 1870736, bug 1873403).
OK, nevermind, back to Windows with WSL...


Just set font scaling to 1.3x or whatever it is you need and everything works great. I've been doing that for many years for 1440p in a laptop.

Ubuntu actually regressed there since switching back to GNOME from Unity. In Unity font scaling was neatly exposed in the same slider as integer scaling and all apps used that setting. Now I have to set one thing for GNOME and another for Firefox to get the same effect. "Proper" fractional scaling is a waste of resources in most situations. It's only really worth it if you want really consistent sizing of things when you have several screens with very different DPIs.
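For anyone looking for the knob described above: on GNOME the font-scaling factor lives in gsettings under `org.gnome.desktop.interface` (1.3 is just the example value from the comment; pick whatever suits your screen).

```shell
# Scale font rendering only; display scaling stays at an integer factor.
gsettings set org.gnome.desktop.interface text-scaling-factor 1.3

# Restore the default:
gsettings reset org.gnome.desktop.interface text-scaling-factor
```

The same setting is exposed as "Fonts > Scaling Factor" in GNOME Tweaks.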


As someone who uses Ubuntu on multiple machines, does this and uses a high DPI laptop display AND an external display, this only sort of works and the outcome is much worse than what Windows or MacOS do out of the box.

Honestly, I don't get how the Ubuntu ecosystem is so much worse at handling something so basic, when other OSes have had almost no issues with it for over a decade..


I use external screens just fine. I can't even think of a situation that's a big issue:

- 4K laptop screen and 1080p external, just use integer scaling

- 1440p laptop screen and 1080p external, use 1.3x font scaling and everything looks fine

I currently have a 1440p laptop, a 4K external, and a 1280x800 projector in my work-from-home setup. The external screen replaced a 1200p one before. I use sway that supports fractional scaling per screen in whatever setup I want. And yet I prefer to just set 1.3x font scaling for everything and not touch the scaling at all. Everything looks great.

Fractional scaling is a missing feature for some people and Linux desktops are definitely behind. But compared to font scaling I don't see how it's really much more than a little bump in functionality (some controls sized a little better) for a big drop in performance (calculating >2x the pixels in a lot of situations) and even some loss of sharpness.

It's probably a huge preference thing. I've seen people online waiting impatiently for fractional scaling because things were too small in their 1080p laptop screen otherwise. Even though 1080p on a laptop is sort of the definition of 1x.


> I use sway that supports fractional scaling per screen in whatever setup I want.

So not x11/xorg, but Wayland? I recently switched from xorg/i3 to wayland/gnome on 18.04 - and while I miss the tiling, external screens behave a bit better. I'll probably try sway/Wayland when I have the time to upgrade to 20.04.

> And yet I prefer to just set 1.3x font scaling for everything and not touch the scaling at all.

Don't fonts end up too big on a typical 21" 1080p display?


Unity still exists, it got renamed to Lomiri - you should still be able to install that. So maybe that feature still works.


I've switched to sway now and have that done manually like many other things to make them "just right". I just meant that the default Ubuntu desktop regressed on that from the Unity to GNOME switch. It was one of the things that I thought Unity had done particularly well within the constraints of what the software could do but very few people seem to know about it. I've explained how to get font scaling to dozens of people online to fix their issues. For most people it's just a much better solution than the much awaited fractional scaling.


> I just meant that the default Ubuntu desktop regressed on that from the Unity to GNOME switch.

I already felt like that with the switch from Gnome 2 to Unity. I switched to Mate as a result.


> It's only really worth it if you want really consistent sizing of things when you have several screens with very different DPIs.

So useful for all laptop-users with an external monitor then.


Yet another proof that proprietary drivers are hindering innovation and holding back Linux desktops, not helping them.


I can recommend Kubuntu, KDE has better support for fractional scaling and is a very usable and distraction-free window manager. In general I'd try to stay away from NVIDIA drivers on Linux, unless you're doing GPU-intensive work the embedded graphics is often the better choice as it's less power-hungry as well and sufficient for most tasks.


KDE is also a lot less garish than it used to be. That was the one thing holding me back from using it.


And for KDE+Ubuntu it's always advisable to also check the official KDE distro KDE Neon, based on Ubuntu LTS. I don't know the timeline for a 20.04-based version though.


What's the advantage of Neon over Kubuntu?


Up to date and optimally integrated KDE directly from the project.

Looks like 18.04 came to KDE Neon only in August 2018, so it's a somewhat slower process, I suppose.


Can you do scaling with xrandr? I scale gnome to 2x and then have this line in my .xprofile: xrandr --output eDP-1 --scale 1.4x1.4

I disable it whenever playing video games on steam.
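The desktop/games toggle described above can be wrapped in a tiny helper. `eDP-1` and `1.4` are the values from the comment; substitute your own output name (`xrandr --listmonitors` shows it). With `DRY_RUN` set, the helper only prints the command instead of calling xrandr, so it is safe to try anywhere.

```shell
#!/bin/sh
# Toggle xrandr scaling for one output: a fractional factor for desktop
# work, 1 for games. Set DRY_RUN=1 to print the command instead of
# executing it.
set_scale() {
    cmd="xrandr --output eDP-1 --scale ${1}x${1}"
    if [ -n "${DRY_RUN:-}" ]; then
        echo "$cmd"
    else
        $cmd
    fi
}

DRY_RUN=1 set_scale 1.4   # desktop: xrandr --output eDP-1 --scale 1.4x1.4
DRY_RUN=1 set_scale 1     # games:   xrandr --output eDP-1 --scale 1x1
```

Bind the two calls to keyboard shortcuts and switching modes becomes a single keypress.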


Setting up scaling via xrandr when you have multiple (different-sized) displays is probably doable but incredibly tricky.


If you're not letting gnome manage the settings for you, arandr and autorandr can help (the first for adjusting, the second for saving/restoring profiles):

https://christian.amsuess.com/tools/arandr/

https://github.com/wertarbyte/autorandr


I'm afraid to update and check the scaling on my different displays... It's going to be absurdly hard to set up in a non-terrible way again, I guess.


Does the Nvidia chip drive the display directly? I think you also have an Intel GPU that drives the display, the Nvidia GPU will only be used to offload rendering.

I don't know how it is implemented, but from my understanding, as long as the Intel chip does the scaling you should be fine.


If you only ever want to use the laptop's screen then you can indeed run the X1E on the Intel GPU only. Unfortunately the display ports are hardwired to the discrete GPU, so you need to run the nVidia chip in that case.


Confirming this. When I bought the X1E it was a pain to get 18.04 working, and 18.04 on this device was the worst Linux experience I've had since the 90s :-( It is probably because of Nvidia, but I blame Gnome too: the latency got worse than 20 years ago. I just want to be able to type in the terminal and have a browser.

I am now about to leave Ubuntu, which I have used for the last 15 years or so. Thanks for helping me out that long. Going to NixOS, hoping it will make my life easier even if the start is more involved.


Interestingly enough, fractional scaling worked for me in 18.04, but it doesn't work in 20.04 (the thing just crashes). Anyway, hoping for a fix.


This is surprising to learn, since I am using nvidia-driver-435 and fractional scaling (150%) on one monitor.


Throw your nVidia card in the bin and be free. After many years with nVidia I removed the card and I've been happily using Intel hardware for a year now.


Because God forbid you actually have to use the card you paid for, oh no, we can't have that. No one should work with machine learning, edit video or play games, Intel HD should be enough for everybody /s


Lenovo X1E is a laptop... So removing the GPU will be a bit of a challenge ;)

I thought of turning the GPU off under Linux, but external displays (through the thunderbolt port) can only be used with the nVidia GPU. I need external displays for my work, so disabling the nVidia GPU is not an option for me.


You may be able to either:

- Use Nouveau instead of the proprietary driver

- Use a nested X/wayland server, which should then presumably support fractional scaling

- Use non-fractional scaling and adjust font sizes


I guess you don't play videogames on your computer.


How's ML training on Intel integrated graphics? :)


Do you have to use the nvidia as your graphics adapter if you are only using it for ML training?


It definitely won't work in the bin.


Desolder the NVIDIA card on a laptop?


Or just disable it in bios?


Throw your Linux OS in the bin and be free. After many years with Linux I removed it and I've been happily using Apple/Windows hardware for a year now. /s


The commenter is lamenting the fact that he can't run Ubuntu. I, on the other hand, am perfectly happy with what I have.


In the linked post they mention "ZFS Native Encryption", but (to save others the time I spent on it): Encrypted ZFS is not available in the installer. However, it will boot encrypted ZFS fine.

If you boot the live image and modify a source file, you can install to an encrypted ZFS. I've documented what I had to do here: https://linsomniac.gitlab.io/post/2020-04-09-ubuntu-2004-enc...


Highly appreciated. Thanks!


"Fractional scaling does not work with the NVIDIA proprietary driver (bug 1870736, bug 1873403)." That's kind of a big deal for any desktop users with a high resolution display.


While I would agree it's certainly not insignificant, fractional means, say, 1.5x. A nice high-resolution display will be 2x, which will generally result in a nicer experience (largely due to how software deals with it or not). Many people do indeed have displays where a fractional multiplier works better, but as a buying recommendation: try to avoid them.


You usually want 1.5x for 13.3" laptop with 1080p which is not uncommon at all and is kinda hard to avoid.


I work around that by setting the default scale in the browser and that's it.


If you put your scale to 2x you might as well save your money and buy an old 1200p screen instead of 4K. The biggest benefit of a 4K screen is more screen real estate; smoother text is of course nice too, but IMO secondary. When comparing image quality you also have to take panel type into consideration: a very good 2K IPS screen can still be cheaper and have better image quality than an entry-level 4K with a TN panel.


The nice 2x high-resolution display is generally significantly more expensive (just compare 4k and 5k displays at 27’’), so the advice isn’t wrong, it’s just not very practical.


> just compare 4k and 5k displays at 27’’

I have one of these, a Samsung U28E590D which I'm very happy with in every way. That thing cost me 220 € (purchased in Germany), so I don't see what's particularly expensive about 4K 27" displays at this point unless you have some very particular requirements about shortest possible reaction time or precise color calibration.


The point is 5k 27’’ displays are significantly more expensive than 4k 27’’ ones (which are very affordable). And unless you prefer very large everything, running a 4k 27’’ display at 2x (meaning 1920x1080) is not desirable. But 5k 27’’ displays can be relatively comfortably run at 2x.


I have a 4k 32", and IMHO 1x is too tiny, 2x too large, something like 125% or 150% is just right for me.


I always use mine at 1x


To any Nvidia desktop users out there who don't know yet: the AMD drivers are getting quite good lately. I switched this year, and have been very happy about it. Just be careful if you're buying a new model of GPU, as the drivers are a few months behind in new hardware support.


Yeah, it's one of the main reasons using GNOME is near impossible for me. I have a 32" 4K; 2x is too big, 1x is too small. Also, the Nvidia drivers are weird for some cards on Linux. I'm not running Ubuntu, but I have the same exact problem as this guy.

https://askubuntu.com/questions/1133071/why-is-my-tty-green


For a long time I've been using Lubuntu on my laptop. However, a couple of years ago they changed focus.

From wikipedia:

Lubuntu was originally touted as being "lighter, less resource hungry and more energy-efficient", but now aims to be "a functional yet modular distribution focused on getting out of the way and letting users use their computer"

I stopped updating my OS since then (I'm on 14).

Does anyone care to suggest an ubuntu-like distribution for someone like me who wants their OS light?


> Does anyone care to suggest an ubuntu-like distribution for someone like me who wants their OS light?

What about Debian? You staying on 14.04 for so long surely means that the slightly slower moving Debian stable is not an issue for you. It supports lots of window managers. I'm using i3 with a simple xinitrc here, but also XFCE or LXDE are available.

You can install it very lightweight, and as Ubuntu is based on Debian it should be similar enough.


If you don't mind me asking a follow up:

How far behind Ubuntu is Debian, and where can I read this information myself? (I.e., is there an official list by either the Debian or the Ubuntu people that says "here's all the stuff that the latest stable Debian is missing from the latest stable Ubuntu"?)


It is not behind all the time: it just follows a different (and more flexible) schedule. [1]

For instance, the current debian stable release entered freeze on 2019-01-12 and was finally released on 2019-07-06. [2]. This means that most software in stable is currently at the latest versions as of the end of 2018. Hence, up until the release announced today, debian stable had more recent (by around 1 year) software than Ubuntu LTS (18.04).

Now and until the next debian stable release (which should happen around summer next year) Ubuntu LTS (20.04) will have newer software than debian stable.

There are a few software packages that do have newer versions in Ubuntu because they are developed by canonical themselves. I can only remember LXC/LXD as examples of these.

I personally run Debian on my personal Desktop, and have no complaints about it at all.

[1] https://wiki.debian.org/DebianReleases

[2] https://release.debian.org/buster/freeze_policy.html


How well do PPAs work on Debian?

I don't like my system constantly updating (and occasionally breaking), especially secretly behind my back like snap/flatpak.

But occasionally I do need the latest version of some package, and I don't want to manage manual installations tucked away in some folder with random bashrc tweaks.


There is no official support for PPAs on Debian. You can configure a Debian system to use them, but you'll be installing packages built on and for Ubuntu, so you'll be risking breakage unless the package maintainer has already taken the extra time to test for problems on Debian (which is rare).

This feature request shows some community interest in Debian support, but as far as I know, it hasn't gone anywhere yet. https://bugs.launchpad.net/launchpad/+bug/188564

I wish Debian offered its own equivalent of PPAs.


Damn. That's a real shame. I really like Debian otherwise. It's still minimal to the point where you can understand all the pieces. I used it for years, but then 5 years ago I moved off of it just because it's so easy to get software from PPAs and because every instruction is Ubuntu-by-default. Over those 5 years being a little flexible has ended up saving tons and tons of hours. Even just the other day I wanted to install OpenJDK 14 and ended up having to do it manually. Maintaining that in some directory and having my own update-alternatives entries... it just feels really yucky at this point.


In my experience, most software offering PPAs also offers Debian repositories that get set up in your /etc/apt/sources.list.d/ (usually the install instructions are just to run "curl https://somesoftware/installer.sh | bash", which you should double-check of course). Thereafter your regular apt-get update && apt-get upgrade updates that software too.

It is really not an issue in my experience. I currently have the following software installed (and updated) this way: docker, virtualbox, nodejs, yarn, skype, vscode, teamviewer and sublime text.
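A sketch of that pattern done by hand, with a hypothetical vendor URL and keyring name standing in for the real ones (modern apt prefers a dedicated per-repo keyring over the deprecated apt-key):

```shell
# Download the vendor's signing key into a dedicated keyring
# (URL and filename are illustrative; use the vendor's documented ones)
wget -qO- https://example.com/repo.gpg.key \
    | gpg --dearmor \
    | sudo tee /usr/share/keyrings/example-archive-keyring.gpg > /dev/null

# Register the repository, pinned to that keyring
echo "deb [signed-by=/usr/share/keyrings/example-archive-keyring.gpg] https://example.com/debian stable main" \
    | sudo tee /etc/apt/sources.list.d/example.list

# From then on, normal upgrades cover that software too
sudo apt-get update && sudo apt-get upgrade
```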


I've enjoyed Xubuntu 18.04 for the light-ness. I pull the taskbar down to the bottom of the screen, swap out the "Whisker menu" for the more classical "Applications Menu" and I'm happy.

It has been pointed out to me that this produces effectively a reskinned Windows98 experience, I'm not unhappy about that.

NB: Have not tried any 20.04 versions yet.


I use a 12-year old Dell laptop (upgraded to 4GB RAM) as my main computer and I’ve been using Lubuntu for the past 6 years for the same reasons you describe. I was also a little disappointed that the focus of the project is no longer to produce a light-weight and efficient desktop – though I don’t object to their new goal.

I’ve been running Lubuntu 16.04 for the past three years but I recently thought I’d update my system. The options I boiled down to were either Debian (the parent and grandparent of so many distributions) or one of the successors to CrunchBang Linux (a light-weight distribution based around the Openbox window manager), BunsenLabs [1] or CrunchBang++ [2].

In the end, I decided that since it was only a few days to wait, I’d try the new LTS version of Lubuntu †. Apparently, its LXQt desktop is just as light as LXDE so I figured it’s worth trying out.

[1] https://www.bunsenlabs.org/

[2] https://crunchbangplusplus.org/

† Not available just yet according to https://wiki.ubuntu.com/FocalFossa/ReleaseNotes#Official_fla...


Update: I see the Lubuntu 20.04 LTS has been released [1] since I posted earlier today. :) Kudos to the maintainers!

I forgot to mention in my post above that I had purchased an SSD a while ago and this is what prompted me to consider the available options for updating/changing the OS.

[1] https://lubuntu.me/focal-released/


Just take Ubuntu Server and throw a light window manager on top. Ubuntu Server is pretty lightweight as is and there are still some very nice lightweight window managers out there.

Something like WM2 is ultra-lightweight [1], but watch out if you're running a laptop as you won't get battery monitoring!

[1] http://www.xwinman.org/
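For anyone wanting to try this, a minimal sketch (Openbox chosen here as an example; any light WM works, and --no-install-recommends keeps the footprint down):

```shell
# Start from a stock Ubuntu Server install, then add just enough
# for a graphical session (swap openbox for your preferred WM)
sudo apt-get update
sudo apt-get install --no-install-recommends xorg xinit openbox

# Launch the WM from the console
echo "exec openbox-session" > ~/.xinitrc
startx
```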


That web site is horribly out of date, and thus, their "activity rating" of projects is hilariously wrong. I was very happy to see that it claimed that my old pal fluxbox is highly active, but the last three news items on the fluxbox site are from 2015, 2008, and 2006, respectively.

But I fully agree with your general sentiment that the best way to use Ubuntu is to go with the Server install and then adding only what you actually want on top of it. I've never had anything but a good and responsive system with that approach.


> I was very happy to see that it claimed that my old pal fluxbox is highly active, but the last three news items on the fluxbox site are from 2015, 2008, and 2006, respectively.

Many of these projects are quite old, but still compile fine and are usable. WM2 for example still runs perfectly fine on top of X11.

> I've never had anything but a good and responsive system with that approach.

Fast and gives you an appreciation for what is needed for a functioning desktop.


If you're just looking for a DE/WM that's light on resources, I can definitely recommend tiling window managers like i3 [1].

I'm not sure what other changes would be in your sightlines given that you were happy with Ubuntu, but I usually go to the Arch wiki in those times to see what fat I can trim off of my setup.

[1] https://i3wm.org/


Regolith [1] is the perfect i3+Ubuntu combo for me.

It comes with sensible default for a good i3 experience but you keep the Ubuntu "universality".

[1]: https://regolith-linux.org/


I cannot recommend this enough for those looking to dip their toes in tiling managers. I would have saved a lot of time and trouble if I'd gone to Regolith directly, which I only did after spending hours trying to get i3wm into a useful (for me) DE, and realizing that Regolith was what I'd been trying to do all along.

The good stuff from Ubuntu-gnome (easy-to-use common settings GUIs, utilities etc), + the good stuff from i3, well put together and simple to customize, i3-gaps by default, and easy to install.

It also led me to discover i3blocks/i3xrocks which I didn't know I needed.

here's an example of a i3xrocks script for bluetooth status in the tray.

    #!/bin/bash

    # how hard is it to find a bluetooth symbol in common fonts...
    # more icons at https://fontawesome.com/cheatsheet?from=io
    LABEL_ICON= #ᛒ # '' # #  # ᛒ
    LABEL_FONT="Material Design Icons"
    LABEL_COLOR=${label_color:-$(xrescat i3xrocks.label.color "#7B8394")}
    VALUE_COLOR=${color:-$(xrescat i3xrocks.value.color)}
    VALUE_FONT=${font:-$(xrescat i3xrocks.value.font)}

    PANGO_START="<span color=\"${LABEL_COLOR}\" font_desc=\"${LABEL_FONT}\">${LABEL_ICON}</span><span color=\"${VALUE_COLOR}\" font_desc=\"${VALUE_FONT}\">"
    PANGO_END="</span>"

    NCONNECTED=$(expr $(hcitool con | wc -l) - 1)
    echo ${PANGO_START}${NCONNECTED}${PANGO_END}

    if [ ! -z "$button" ]; then
        /usr/bin/i3-msg -q exec /usr/bin/gnome-control-center bluetooth
    fi


Long-time i3 user here, just getting ready to try Regolith when I upgrade to 20.04. Interested to see what I get from it, and what I need to override back to my current setup.

Any pointers for an i3 user moving to Regolith?


I enjoyed reading i3's landing page, thanks for linking.

So could you clarify for me, what's the relation between being a "tiling windows manager" and being light? Is it that for some technical reason one would expect tiling windows managers in general to be lighter than non-tiling? Or is it that orthogonally to the tiling aspect, i3 specifically aims to be light?


I don't think that there's anything inherent in tiling managers that mean they have to be light, but it's very common for them to be light. Ratpoison, xmonad, awesomewm and i3 are the most well known tiling managers (unless I've missed any out?) and all of them are very minimalist and resource light.

My non-expert take on this is that the sort of person who sees value in the efficiencies from using a tiling WM are also of a kind to use system resources efficiently too, even though they are orthogonal characteristics.


DWM, too.


It's a cultural thing. Most developers of tilling WM's seem to prefer less resource-heavy setups.


I have been using Openbox exclusively since about 2005, on both laptops and desktops. It is a highly customizable, super light, floating window manager. Using wmctrl wrappers around launcher or window-management keystrokes I can achieve the tiling functionality that I want without committing to a tiling-only workflow:

* space splitting for launchers

* some sticky displays in multi-display desktops


Support for i3. Just don't give up even if it feels a bit awkward to use it the first day(s).

There is also https://swaywm.org/ if someone thinks they should run wayland.


Definitely check out MX (their 19 image even fits onto a 2GB USB) - I have a crappy old netbook with only 2GB of RAM and it's surprisingly useable

Other options worth looking at - Peppermint is okay (although their focus on web apps is a little odd), or LTS Ubuntu MATE using the minimal install option


Bodhi, Peppermint and LXLE are all based on Ubuntu but use either LXDE, LXDE with elements of XFCE or (in Bodhi's case) the Moksha desktop, which is a fork of Enlightenment 17 - it's smaller and faster than LXDE.

Otherwise, go with a Debian or Devuan based distro like antiX, MX Linux, Q4OS Trinity, Refracta or EXE Linux.


I've read all the responses to this thread, and I still think MX-Linux is the distro you're looking for: https://mxlinux.org/current-release-features/

You're going to be delighted.


I'm using Lubuntu 18.04 as a desktop OS without any problem whatsoever, works like a charm.


Since I know nothing about your machine, that response doesn't really tell me anything. My laptop has 2 cores and 5GB ram.

But bigger picture my point was: Lubuntu's focus has changed, so it's not for me any more.


18.04 is the last release that comes with LXDE; Wikipedia even lists the requirements:

> System requirements for Lubuntu 18.04 LTS included a minimum of 1 GB of RAM, although 2 GB was recommended for better performance, plus a Pentium 4, Pentium M, or AMD K8 CPU or newer. The RAM requirements increased from Lubuntu 17.10.


I was very impressed with Linux Lite. The name is terrible, but the distro has worked great. It uses the Ubuntu 18.04 standard apt repos. Since I replaced the kernel, my presumption is most of the benefit is just selecting a smaller subset of default packages to install.

https://www.linuxliteos.com/

Installation on a machine with 2GB RAM resulted in a system that works just fine for running scripts, compiling medium-sized projects, running web browsers, media playback, etc., all on a credit-card-sized machine.

The desktop is based on XFCE. The post-installation steps I took are listed below.

    # enable swap to compressed RAM
    sudo apt-get install zram-config
    
    # log to RAM for speed, less wear on eMMC
    echo "deb http://packages.azlux.fr/debian/ buster main" | sudo tee /etc/apt/sources.list.d/azlux.list
    wget -qO - https://azlux.fr/repo.gpg.key | sudo apt-key add -
    apt update
    apt install log2ram
    
    # remove icon generator that hogs machine
    sudo apt remove --purge tumbler
    
    # remove powerline garbage
    sudo apt remove --purge powerline
    
    
    # Liquorix kernel, see https://liquorix.net/
    sudo add-apt-repository ppa:damentz/liquorix && sudo apt-get update
    sudo apt-get install linux-image-liquorix-amd64 linux-headers-liquorix-amd64
    ## NEEDS linux-firmware from focal distro
    ## see https://packages.ubuntu.com/focal/all/linux-firmware/download 
    ## for latest link
    wget http://mirrors.kernel.org/ubuntu/pool/main/l/linux-firmware/linux-firmware_1.187_all.deb
    sudo dpkg -i  linux-firmware_1.187_all.deb
    rm linux-firmware_1.187_all.deb
    
    sudo apt install powertop iotop
    
    # fix font rendering
    cat <<'EOF' > ~/.fonts.conf
    <!-- disable embedded bitmaps in fonts to fix Calibri, Cambria, etc. -->
    <match target="font">
       <edit mode="assign" name="embeddedbitmap"><bool>false</bool></edit>
    </match>
    EOF
    
    
    # remove spectre mitigations, make computer fast again
    sudo perl -pi.bk -e 's|splash"|splash mitigations=off"|' /etc/default/grub
    sudo update-grub
    
    # disable and completely remove AppArmor and snapd
    # https://www.simplified.guide/ubuntu/remove-apparmor
    sudo systemctl stop apparmor
    sudo systemctl disable apparmor
    sudo apt remove --purge apparmor snapd
    
    
    # suppress this language related updates from apt-get
    echo 'Acquire::Languages "none";' | sudo  tee -a /etc/apt/apt.conf.d/00aptitude
    
    # remove irqbalance 
    # see https://github.com/konkor/cpufreq/issues/48
    sudo apt remove --purge irqbalance


Hard to not read that code name as "fecal fossil". Sorry if you can't get that out of your head now. I know I can't.


In portuguese I'm having a hard time not to read it as "fossa fecal" (~cesspool).


You can, if you focus.


More people are getting the Limp Larynx feeling such as you do. I had the same thought, especially after snap, and all. So I agree.


You can call it Ubuntu Coprolite as an informal name.


The best news from an Ubuntu release is that a new Pop_OS! Release is available. Can't wait for the final release tonight!


... and a new elementaryOS release will be coming at some point!


It's the best Ubuntu-based distro for desktop use!


how well does workspaces work? are they at all close to workspaces on mac?


I hate to say it, but I don't use workspaces. I use a number of window management keyboard shortcuts in place of it.


Ubuntu has been regressing on the server side for a while. 18.04 introduced netplan, a Canonical-specific wrapper around systemd-networkd. Going back to /etc/network/interfaces is an uphill struggle, but it is required for some functions.

This time they've removed debian-installer, so it's no longer a matter of a new kernel/initrd/preseed file. On top of that, the automation has only appeared in the last few days.


I've been running this since mid Feb at home. It's been rock solid.

I've only had 1 issue (other than it never remembering default audio devices which I set manually to resolve)

Snaps break all the time for me. I installed VS Code, after reboot I couldn't launch it, had to uninstall/reinstall. Fixed, reboot, broken. Installed it via terminal. 0 issues.

Same with Slack, Chrome, and Skype, and anything else I tried with Snaps.

Very happy with 20.04 so far.
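For reference, the "via terminal" route for Chrome is just Google's own .deb, which also registers their apt repo on install so future updates arrive through apt (Slack and VS Code publish similar .debs):

```shell
# Google's .deb sets up their apt repository during install,
# so the browser stays updated through normal apt upgrades
wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
sudo apt install ./google-chrome-stable_current_amd64.deb
```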


That sounds like a huge "1 issue", then, if Snaps are supposed to be the way forward for installing packages in Ubuntu as other threads make it seem.


One thing that threw me off is Snaps are sandboxed. So I installed Slack, and I clicked on support ticket links and I had to sign in again. Cos when it loaded Firefox it loaded a separate instance of Firefox rather than re-using my existing open instance.

So I've resorted to removing all the snap apps and installing them from terminal and no longer have any issues.

I'm not sure who the target audience is for Snaps.


If I really need to sandbox anything I'll use the VM for that. As for the rest as a user I am staying away from bloat, slow startup and lots of other inconveniences of snap.

This whole security thing is getting really annoying.


Did they fix the dbus (?) bug that made some applications like FileZilla or VLC hang for about a minute on startup? [0]

[0] https://askubuntu.com/questions/1184774/some-applications-on...


Not sure if it's related, but I have a nice bug with VLC where if I suspend my machine with it open, /var/log/syslog fills my main disk to the brim with error messages!


Note that it's still in beta as of now. But it will (most likely) be released today.


It was released in the afternoon. Someone posted the release notes a bit too early.


This is the first Ubuntu LTS release without a 32 bit version


Oh? I'll have to see if I can boot 18.04 on an old laptop I found. I think it was one of those mismatched efi vs arch things (32bit boot, 64bit cpu ?) - and last time I couldn't get it to boot... Maybe I can just run it as 32bit...


I wonder if we'll ever get /usr/bin/python back. Or is it going to be like browser user-agent strings where we call it "python3" forever and forget why we can't just call it python?


You can install the [python-is-python3][0] package on 20.04, which symlinks /usr/bin/python to python3.

[0]: https://packages.ubuntu.com/focal/python-is-python3


I saw that, but it's not the default setting. You can't just write a python script and expect it to work on Ubuntu without putting python3 in the shebang.


I think the python team decided to do it that way, in combination with various distro?


The python symlink is allowed to be python2 or python3. E.g. Archlinux has it point to py3, Void Linux allows the symlink to be configured by the user.

See https://www.python.org/dev/peps/pep-0394/ for details.
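The scheme is just a chain of symlinks, which can be demonstrated in a scratch directory (paths below are illustrative; on a real system the links live in /usr/bin):

```shell
# Recreate the PEP 394 symlink layout in a temp dir
dir=$(mktemp -d)
touch "$dir/python3.8"           # the actual interpreter
ln -s python3.8 "$dir/python3"   # minor-version link
ln -s python3 "$dir/python"      # distro-chosen "python" link

# Resolves through the chain to the real interpreter,
# printing a path ending in python3.8
readlink -f "$dir/python"

rm -rf "$dir"
```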


I guess it will break too many things, some of which may be out of control of Ubuntu.


apt-get install python-is-python3


I love how easy it is to install snaps.

And no more including of PPAs of some developer whose apps seems nifty at first glance.


If only, not every snap would come with its own version of Chrome/Electron.. Would save lots of storage and potentially RAM.


Yesterday, I didn't know what Snaps were. I'm returning to Linux from years in Microsoft land. Used to use Solaris back in the 90s. I'm running big servers in a coloc environment (not desktop) but I want nice things like, oh Python 3, by default. Now, I know way more than I ever wanted to.

So far, there have been two real pain points: 1. Netplan is a mysterious friend who occasionally rearranges my furniture when I'm not looking. 2. Snap is a young kid who wants to help me get groceries but borrows my car to drive to the store.

I'll leave netplan aside for a moment and focus on Snap.

I got 20.04 to fail badly under KVM using either the just released version or the nightly build because Snap fails. I can see exactly where it fails thanks to Python (cool). It's in an error handler for a UI view that's reading a property that doesn't exist and it hoses the install (ouch). It appears to be triggered because I was testing an odd network configuration.

Once something goes wrong, after that, every change just results in a one-line "apt-get fails" error message.

But consider the implications...

Canonical apparently doesn't have an automated testing system which checks how Snap behaves under failure conditions but they put it into the install process without protections around it. This feels pretty half baked....


Was anyone able to run 18.04 or 20.04 successfully on a relatively modern desktop spec (AMD Ryzen 9 3950X / Nvidia 2080 TI / 32GB RAM / Custom water cooling) and two displays? I'm thinking of upgrading my development machine and running ubuntu only and I'm looking for some hints as to what could go wrong.


I've been running the last "nightly" release or whatever it is of 20.04 since Apr 20 on the same specs you list (3900X instead of 3950X). No problems so far on my ultrawide 3440x1440 display or with the proprietary nvidia driver. Just choose the 'safe' graphics option on boot of the live USB.


Personally I won't upgrade until NVIDIA releases their next CUDA version officially for 20.04. Past experience leads me to believe that otherwise I can run into a few unpleasant surprises.


A couple of weeks ago I installed Ubuntu 18.04, it took hours to get it working fine.

I had to install the properitary NVidia drivers but it kept locking the entire machine, sometimes 30 seconds after booting, sometimes a few minutes. Took hours of restarting and trying to get the drivers installed before it would hang to get it stable.

Since getting them installed it's been solid.

I'd now like to upgrade to 20.04, but not if I need to go through the whole ordeal of racing to get drivers updated before the system crashes.


I've been running Ubuntu 19.10 on a Ryzen 3900X / Nvidia RYX 2070S / 64GB RAM, so close enough I suppose.

The biggest issue I can remember was some weirdness with the graphics drivers a few times. Not quite sure exactly how it happened, but apparently I had a pretty wonky installation at first. I uninstalled and reinstalled the nvidia drivers through `apt`.

Other than that, I have an unsolved problem of random, occasional reboots, which mostly (and conveniently) happen when I'm away from the PC. Everything goes back to normal after rebooting and it usually doesn't do it again for a while. I haven't managed to solve this, but I'm assuming it's a rather localized problem.


The overall experience is fantastic. It ends up being little edge-case things that have weird behavior.

For me; my Razer Nari wireless headphones just don't work. They're not bluetooth; they have their own low-latency USB receiver, and thus their own Razer-supplied drivers, which are not available on Linux. There's been some effort toward getting Razer products working better in Linux, but (last I ran Ubuntu, a few months ago) their headphones are not there yet.

You just have to try it out and see what works, see what doesn't. Chances are, it'll work fantastically for you.


Running an i5 9600K / 2x Nvidia 2070 / 64GB RAM on 19.10 and I managed to get to a fairly stable configuration. The only pain point was and has always been Nvidia drivers and CUDA stability. I have yet to find a consistent way to install the driver and CUDA version you want and lock them. A couple of weeks ago I installed Wine which deleted my CUDA packages. For now it's stable enough for a few months until I need to reinstall or tinker with something but maybe someone has suggestions for improving stability.


Running 18.04 on a Threadripper 3970x. My GPU is an older Radeon RX550.

I have no issues except for a driver issue for the audio chip on my mobo (Gigabyte Aorus Master TRX40) that seems to only affect optical audio out.

Cooling solution, RAM, and monitors aren't things that will have OS compatibility issues.


It's at least easy to run live from USB, and see if anything pops out.

Edit: I would probably go straight for 20.04 on a fresh install, but heed the sibling comment about nvidia drivers. It might run fine on open source drivers until the proprietary ones get an update (fine, but slow).


Apparently everyone hates snap, while I hate Gnome, especially purple Gnome with oversized windows titles and borders


Just tried upgrading on WSL and I got an endless loop of

  sleep: cannot read realtime clock: Invalid argument
Ctrl+C and then I get

  $ sudo apt --fix-broken install
  ...
  Setting up libc6:amd64 (2.31-0ubuntu9) ...
  Checking for services that may need to be restarted...
  Checking init scripts...
  Nothing to restart.
  sleep: cannot read realtime clock: Invalid argument
  dpkg: error processing package libc6:amd64 (--configure):
   installed libc6:amd64 package post-installation script subprocess returned error exit status 1
and I have no clue how to fix this. (Edit: Found a fix. [1]) For the life of me I don't get how people always say upgrades are so easy on Ubuntu. I practically never have a pain-free Ubuntu upgrade experience... never. And while this particular issue appears WSL-related, I'm not just talking about WSL. There's always some random things breaking in the middle.

Update: After fixing that and upgrading everything, now I get this nonsense:

  The following packages have been kept back:
    libpython3.8 libpython3.8-dev libpython3.8-minimal python3.8 python3.8-minimal

  $ sudo apt-get install --upgrade python3.8
  ...
  The following packages have unmet dependencies:
   python3.8 : Depends: python3.8-minimal (= 3.8.2-1+bionic1) but 3.8.2-1ubuntu1 is to be installed
               Depends: libpython3.8-stdlib (= 3.8.2-1+bionic1) but 3.8.2-1ubuntu1 is to be installed
  E: Unable to correct problems, you have held broken packages.

  $ sudo apt-get install libpython3.8-stdlib
  ...
  The following packages have unmet dependencies:
   libpython3.8-stdlib : Depends: libffi6 (>= 3.0.4) but it is not installable
                         Depends: libreadline7 (>= 7.0~beta) but it is not installable
[1] https://github.com/microsoft/WSL/issues/4898#issuecomment-61...


20.04 (well, modern glibc) is not compatible with WSL 1 due to missing bridges for a realtime-clock system call!

Microsoft is working on a patch, but until then, you need WSL 2 or stay on the previous release.


That's more on WSL than on Ubuntu though. It's up to WSL to support the syscalls correctly, not on Ubuntu to aim for whatever translation layer WSL 1 supports.


> For the life of me I don't get how people always say upgrades are so easy on Ubuntu.

They do? I just assume that I have to do a backup of my data, wipe the disk and install they new version.

Ubuntu updates never worked for me.


I don't understand how for some people it "never works", while for me it's literally never once failed, all the way from 6.10 up until now, my latest desktop install lasting through 12 updates, plus motherboard and GPU change (between Intel and AMD CPU, and Nvidia and AMD proprietary driver and then AMD open source driver no less) and four different desktop environments (Unity, Gnome, Mate, and finally KDE).

The closest it's ever come to failing was literally yesterday when I realized that my desktop was still on 19.04 and they had closed the old repositories so the GUI upgrade utility failed. The CLI upgrade didn't skip a beat, though.


Do you run mysql / MariaDB? That's the one that bites me, 100% of the time. There are other packages that often require "adjustments" but the db makes dist-upgrade a total fail for me.

Not fun with servers, unfortunately.


No, I don't. I use Linux strictly on the desktop.


I think it's one of the things Ubuntu dropped from the Debian experience. In-place upgrades are much less reliable. I also tend to wipe and reinstall on the desktop at least. Makes me keep configs in dotfiles/puppet and is good training for having to replace a computer. On the server side, upgrading between LTS releases seems to work better.


Ubuntu tests upgrades between LTS releases extensively and so do Ubuntu community developers and advanced users.

As you said, server users always upgrade. Enterprise desktop users upgrade. A lot of general users don't - they reinstall.

The upgrade process you should use with Ubuntu is different to Debian. In Debian you'd use apt, in Ubuntu you should use the update manager. Update manager specifically checks for upgradability, it has specific warnings and workarounds that are added for each release following testing and reports from end-users. This makes it much easier to upgrade.

The reason why users think the Ubuntu upgrade process is less reliable compared to Debian is they serve different user segments. Most Debian users are technically advanced, engage with the distribution and they often stay in their stream (e.g. testing). Ubuntu is used by lots of normal users - many may not pay particular attention to how packaging works - this means they often do things that make upgrades complex: install community PPA's that advance packages past the next LTS version, install Node into /usr by wiping out apts package knowledge, etc, etc. The range is just bigger.

For many normal users upgrading may no longer be the best option - it's just faster to reinstall. I personally do upgrades, I still find them magical and it's part of the fun: http://www.futurile.net/resources/ubuntu-upgrades/


Canonical wisdom is to have a separate / and /home so you can wipe the system easily.

I upgraded by grepping for the packages that were breaking resolution and removing them. But I already know the failure mode so that whole thing took a few minutes.

I do have separate mounts for that reason though.


Interesting, is this feasible in everyday life?

I do not use Linux on my desktop or notebook but I would like to do so in the near future. This means I should strive to keep all my user created data within the home directory at all times? How about program preferences and configs that get saved elsewhere by default?

I imagine having ZFS snapshots of / would be useful for updates going forward.


I've gone through 3 laptops now with a migrated home. That included ubuntu -> arch -> fedora migration too. It works pretty well. (with more issues between distros than between version upgrades)

> This means I should strive to keep all my user created data within the home directory at all times?

Why would the user have privileges to save it outside of the home directory? :) Special cases like databases with storage in /var need to be handled separately.


> How about program preferences and configs that get saved elsewhere by default?

Nothing goes (should go) elsewhere by default.

If you're wiping the distro and re-installing - you probably don't want to keep /etc anyway.

FWIW, Ubuntu's in-place LTS-to-LTS release upgrades should be solid. But sometimes you want to start with new, contemporary defaults, or a different disk layout.

BTW with zfs, you can have a separate home (or home/your_user) filesystem, and not worry about allocating fixed space (thus running out of free space on /, but with more available in /home and vice-versa).
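A sketch of that layout with hypothetical pool and dataset names (ZFS datasets draw from the pool's shared free space, unlike fixed partitions):

```shell
# Create a dataset per user under a shared home dataset;
# names (rpool, your_user) are illustrative
sudo zfs create rpool/home
sudo zfs create rpool/home/your_user

# Optionally cap home's growth without pre-allocating space
sudo zfs set quota=200G rpool/home
```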


My first Ubuntu install was 7.04 Fiesty Fawn, a full alphabet circle ago, crazy how time flies.


Mine too


It isn't released yet, right? My package manager isn't bumping up.


From the link, I had to run

  sudo do-release-upgrade -d
to get mine to upgrade.


I'm always wary of upgrading from an LTS to the next one instead of preinstalling from scratch. Did you run into any issues?


For LTS Ubuntu triggers an update once the .1 release is out in July maybe. I did the last two upgrades from LTS to LTS that way on my laptop and had no issues.


I didn't run into any issues myself, however I'm not running a GUI or anything on this machine so I didn't have as many dependencies to upgrade. The whole process took maybe 20 minutes.


Thank you. Wow, they removed Python 2.7 huh? Had to clear out some packages to let that go through.


Python 2.x is EOL Jan 2020


Yeah but didn't think anyone would have the balls to do it haha.


IIRC Arch did that years ago.


No, it was a few months ago, and it's still a work in progress: https://www.archlinux.org/packages/extra/x86_64/python2/


No worries. Yep, that's an exciting change they've gone and made.


There probably are PPAs that allow you to install 2 if you need it.


I use Ubuntu via WSL2 when WFH (i.e. always, at the moment). I've ran into lots of issues trying to use snaps (e.g. for ccls) in WSL, because of the lack of systemd as PID 1.

Does anyone know a robust way to get snapd running properly in WSL2? I would prefer to use whatever package there is than compiling everything that isn't packaged as .deb myself. In the case of ccls it's particularly annoying because I had to install the whole clang toolchain just to build it (which isn't typically on my dev machines, because the work I do is GCC-based).


There are guides for this at https://discourse.ubuntu.com/c/wsl2 That's the place where you can find any working solutions for WSL2 and Ubuntu.


I fondly remember the free CDs in early Ubuntu days.

Is it actually possible to buy an "official" USB/DVD of Ubuntu these days?

When I say buy I mean most of money would go to support Canonical.

There used to be such an option about 10 years ago but I guess there is very little demand.

The world has moved onto mostly online distributions.

For some reason I still do yearn for the world of "touchable" official software - the world of 6 foot long Borland C++ manuals.

PS: One of the honorable sources, https://www.osdisc.com/, closed last August.


Finally a more recent OCaml version (4.08.1) [1], even though it is still significantly behind upstream (4.10.0, released in February 2020) [2]. The problem with Debian and all distributions based on it, including Ubuntu, is the ancient versions of the software.

[1] https://repology.org/project/ocaml/versions

[2] https://ocaml.org/releases/4.10.0.html


> In 20.04 LTS, the python included in the base system is Python 3.8. Python 2.7 has been moved to universe and is not included by default in any new installs.

Finally.


(1) After the update to 20.04, my Ubuntu Software Store app wasn't there, and neither was there an icon for the Snap Store. (2) After I installed snap-store, it showed me only 16 available apps rather than the umpteen that should have been there. I'm sure these are only initial glitches, and I'll wait patiently for an update to fix the bug.


Noticed the Xubuntu flavor has some graphics problems on some hardware combinations (as did 18.04 & 19.10).


Awesome. Now let's wait for Elementary to upgrade (all my Linux machines with a GUI run it)


How is ElementaryOS doing nowadays? I used it for about a year a few years back and while gorgeous to look at, it had quite a few UI glitches.


If you install from scratch you can use ZFS as the root filesystem, and it will automatically create snapshots before software updates.

The feature is still experimental but has been working well for me during the beta period.
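To make that concrete, here's a sketch of how you can poke at those automatic snapshots on a 20.04 root-on-ZFS install. The pool/dataset name (`rpool/ROOT`) follows the installer's default layout, the state name is made up, and `zsysctl save` is the manual-state subcommand as I understand it; the whole thing is guarded so it does nothing on a machine without a ZFS root.

```shell
# Sketch: inspecting zsys's automatic snapshots on a root-on-ZFS install.
# rpool/ROOT is the installer's default layout; adjust for your pool names.
if command -v zfs >/dev/null 2>&1 && zfs list rpool/ROOT >/dev/null 2>&1; then
  # Automatic pre-update states show up as autozsys_* snapshots:
  zfs list -t snapshot -o name,creation -r rpool/ROOT
  # Take a named state by hand before a risky change:
  zsysctl save before-risky-change
  RESULT="listed snapshots"
else
  RESULT="skipped: no ZFS root on this machine"
  echo "$RESULT"
fi
```

Reverting is normally done from the "History" entry in the GRUB menu rather than by hand, which is part of what makes the feature attractive even while experimental.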


It does not look like it is available yet: https://releases.ubuntu.com/


I see it at that address now. Also at https://releases.ubuntu.com/focal/


It releases at 2100 UTC.


Is there an apt-file equivalent for available snap packages?

I rely strongly on this tool and would be sad to lose its functionality if everything becomes snap-based.
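For anyone unfamiliar with the tool in question: apt-file answers "which package ships this file?", and as far as I know snap has no equivalent, since `snap find` only matches package names and descriptions, not file contents. A guarded sketch of the comparison (ccls is just an example query):

```shell
# apt-file maps file paths to debs; `snap find` searches metadata only.
# Guarded so it degrades to a message where apt-file isn't installed.
if command -v apt-file >/dev/null 2>&1; then
  apt-file search bin/ccls   # which deb ships a path matching bin/ccls?
  RESULT="searched with apt-file"
else
  RESULT="skipped: apt-file not installed"
  echo "$RESULT"
fi
```

Note that `apt-file search` needs a populated cache (`apt-file update`) before it returns anything useful.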


Any news on CUDA Drivers/Toolkit for 20.04?


I will definitely try this out: “Support for raspberry pi (Pi 2B, Pi 3B, Pi 3A+, Pi 3B+, CM3, CM3+, Pi 4B)”


All sorts of AppArmor issues; the DHCP client doesn't even work out of the box, so I need to run dhclient on every reboot.

https://pastebin.com/PUkH3Y1P

With an rtlsdr (RTL2832U) plugged in, every process gets blocked.

With nouveau, as soon as the onboard HDMI (Intel) is connected, the desktop becomes very sluggish and the system locks up in a few minutes.

I am impressed it works at all on release day. Previous LTS releases are hardly usable until LTS .03


Still no ZFS Boot Environments ... pity.

Maybe in 2022.04 LTS then ...



