One not-so-unrelated thing I really love about Linux is that the configuration for just about every application is stored in plain-text files, mostly in the user's home directory, which means we can use git to version-control them (like many, many people do - the "dotfiles" trend).
You can have a log of configuration changes for your system, you can have various profiles with branches, etc. Want to switch to a different profile? Just `git checkout <branch-name>`, done! Want to revert some change, or see what your PC looked like 2 months ago? `git log` and `git checkout` your way through!
I currently have all my configurations[1] in a dotfiles directory representing the home directory (same directory structure etc.) and symlinks to them in the actual home directory. Very easy to manage.
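The linking step is easy to script. Here's a rough sketch of the idea - note that the repo layout and file names are made up for illustration, and it runs against a throwaway /tmp tree so it's safe to try; in real use you'd point DOTFILES at your actual repo and TARGET at $HOME:

```shell
#!/bin/sh
# Sketch: mirror a dotfiles repo into a home directory with symlinks.
# DEMO is a throwaway /tmp tree standing in for the real locations.
DEMO=/tmp/dotfiles-demo
DOTFILES="$DEMO/dotfiles"   # stand-in for ~/dotfiles
TARGET="$DEMO/home"         # stand-in for $HOME

# Stand-in repo content (your real repo would be a git checkout)
mkdir -p "$DOTFILES/.config/git" "$TARGET"
printf '[user]\n\tname = me\n' > "$DOTFILES/.config/git/config"

# Recreate the directory structure and symlink each file into TARGET
(cd "$DOTFILES" && find . -type f) | while read -r f; do
    rel="${f#./}"
    mkdir -p "$TARGET/$(dirname "$rel")"
    ln -sf "$DOTFILES/$rel" "$TARGET/$rel"
done
```

Re-running it is idempotent thanks to `ln -sf`, and a `git checkout <branch>` in the repo swaps the whole profile at once, since the symlinks point into the working tree.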
You can also go one step further and have the configuration for the entire OS in text files with something like NixOS. https://nixos.org/nixos/about.html
Want to see what your PC was really like 2 months ago, including rolling back updates and removing new applications you installed and vice-versa? You can do that. Even if you switched desktop environments.
Can you get a Spacemacs setup working on NixOS, with layers and so on? In general, assuming one is willing to write a bit of glue code/NixOS versions of patches/ebuilds/PKGBUILDs, can one get one's configuration from $distro working on NixOS?
I use Haskell, so I've been meaning to switch to Nix for a while.
For every person who loves those plain-text files, there is one who wants to see them purged from existence. Observe how Gnome has basically recreated the registry, complete with hidden magic switches...
The Gnome stack is horrible. I long ago started questioning the sanity of the devs when they tried to build an entire office suite in C. Start any Gnome/GTK app and it will continuously spew "assertion failed" log messages at you. Gnome has also forced certain standards, e.g. Network Manager, when superior alternatives exist (the excellent Connman).
Hah, I tried using Connman for a while instead of NM, and I cannot vouch for its superiority.
It's been over a year since I switched away (to, gasp, systemd-networkd with vanilla wpa_supplicant), so I can't recall the exact reasons, but I think it was because of a lack of documentation, terrible debuggability, and bad behavior when dealing with multiple known wifi networks.
I seem to recall that Connman came to be after Network Manager was established. And frankly I am unsure whether it was Gnome or Red Hat that brought it on, but then I find the lines blur between the two (after all, it seems like the most active Gnome devs are on the RH payroll).
Connman was developed by Intel, there's absolutely no Gnomeness involved at all. I think you are correct that Network Manager was first, but it took a long time to stabilise and came with far too many dependencies, including X11 and a full Gnome install!
What's wrong with Network Manager? Show me any other network manager which handles split DNS, WWAN modems and automatically connecting VPNs out of the box with zero configuration.
It's slow to connect, uses a lot of resources and cannot be installed without bringing in hundreds of megabytes of dependencies (e.g. X11, Gnome libs).
It connects to my home wifi in <1s. How long for you?
> uses a lot of resources
10 MiB and 9 minutes of CPU time after an uptime of 4 days with many reconnects, which is totally reasonable.
> cannot be installed without bringing in hundreds of megabytes of dependencies
Not true. Even on Ubuntu, it only pulls in 17.4 MB of dependencies:
apt-get install --no-install-recommends network-manager
The following additional packages will be installed:
dconf-gsettings-backend dconf-service glib-networking glib-networking-common glib-networking-services gsettings-desktop-schemas libbluetooth3 libdconf1
libgudev-1.0-0 libmm-glib0 libndp0 libnl-3-200 libnl-genl-3-200 libnm0 libproxy1v5 libsoup2.4-1 wpasupplicant
[...]
After this operation, 17.4 MB of additional disk space will be used.
Do you want to continue? [Y/n]
On RPM based distros, it's even less. They made it the default network manager on RHEL 7, even for the minimal cloud images.
Try installing it on Ubuntu headless server, then it will bring in hundreds of megabytes. You obviously already had most of the dependencies installed.
Intel developed Connman because NetworkManager's shortcomings made it unsuitable for embedded devices.
First, replace the core GNOME apps you use most frequently with lighter alternatives. Take it easy, do it one at a time. Try rxvt-unicode instead of gnome-terminal. Try Thunar or Dolphin or an mc clone instead of Nautilus.
There's a lot of non-obvious configuration options in the X11 stack. An .Xresources file can select better fonts, tweak hinting so they look fantastic, and change settings and color themes for many X applications. Out of the box, many X applications look like obsolete garbage, but they can be lightweight, functional, and awesome looking at the same time with a little configuration and love.
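For instance, an .Xresources fragment along these lines (all values here are illustrative, adjust to taste) turns on proper antialiasing/hinting and themes a couple of common apps; load it with `xrdb -merge ~/.Xresources`:

```text
! Font rendering (picked up by Xft-aware applications)
Xft.antialias: true
Xft.hinting:   true
Xft.hintstyle: hintslight
Xft.rgba:      rgb
Xft.dpi:       96

! Per-application settings
URxvt.font:      xft:DejaVu Sans Mono:size=11
URxvt.scrollBar: false
XTerm*faceName:  DejaVu Sans Mono
XTerm*faceSize:  11
```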
After getting your core apps sorted, consider an alternate DE/WM. I personally love the gaps fork of i3. XFCE is nice if you still want DE functionality.
I personally dual boot Debian Stretch and Gentoo ~amd64 - sharing a home partition. I keep Debian more or less stock on GNOME, while I rice the hell out of Gentoo with i3 and no systemd.
All you need to do is install XFCE. It has a great terminal, a solid file manager, a compositing WM, and easy configuration. I don't bother setting things up piecemeal or fiddling with .Xresources any more.
Also, it would be great if we could stop using "rice" as an adjective. Even if you don't find it offensive, some people do.
Contrary to some other advice here, I switched to a new window manager first (xmonad, FWIW), but kept running a bunch of GNOME stuff too. Gradually, I have phased out most of GNOME and my computing life is simpler and more stable as a result.
It's the worst. I've suffered so much even though I never wanted to care about it. I'm a little blurry on the details, but if you want to automate adding some configuration to the dconf system, you have to put your part of the configuration tree in /etc/ and recompile the whole config to a new binary file. I doubt anybody can measure the parsing advantage of a binary format, but the drawbacks are obvious. You can't just deploy by copying files. The target system has to be running, and you have to run the right update script in the right sequence.
This update sequence is also meant to live-update the configuration of the running system. Of course that wouldn't work reliably. My session dropped its hotkeys (the only thing I actually cared about) on every update.
> automate adding some configuration to the dconf system
You mean like "dconf dump" and "dconf load"?
I use it for exactly that (importing my hotkeys config).
What you describe (compiled files in /etc) are system-global default values. You only need those for multi-user environments.
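For reference, `dconf dump <dir>` emits an INI-style keyfile that `dconf load <dir>` reads back. A dump of, say, /org/gnome/desktop/wm/keybindings/ looks roughly like this (the key names and bindings here are just examples):

```ini
[/]
close=['<Super>q']
maximize=['<Super>Up']
switch-to-workspace-1=['<Super>1']
```

That file can live in a dotfiles repo and be restored with a one-liner like `dconf load /org/gnome/desktop/wm/keybindings/ < hotkeys.ini`.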
> I doubt anybody can measure the parsing advantage of a binary format, but the drawbacks are
The performance advantages are massive. The dconf database is just a hashtable which can be directly mmap'ed, so no parsing happens at all. This does matter for settings which are read hundreds of times (your current desktop theme, for example).
> What you describe (compiled files in /etc) are system-global default values. You only need those for multi-user environments.
Well, I work as a systems administrator...
>> I doubt anybody can measure the parsing advantage of a binary format, but the drawbacks are
> The performance advantages are massive.
The advantages over reading a big unsorted chunk of text may be measurable in an isolated benchmark, depending on the size of the text. But it's not a good idea to put everything in one place.
With the conventional one-file-per program approach the existing file system (instead of custom binary datastructures) is used to speed up the access.
> The dconf database is just a hashtable which can be directly mmap'ed, so no parsing happens at all.
As an aside, mmap is not necessarily faster, or may even be slower. It's probably not worth bothering except in extreme cases. What matters is caching.
"No parsing happens at all", hardly. A hash table just leads to the information. You still have to interpret it. And I assume dconf can only be queried over dbus? This would mean serialization, deserialization, and lots of context switches.
> This does matter for settings which are read hundreds of times (your current desktop theme, for example).
Not convinced. If most keys were read many times, something was very wrong. No. Most things are read by exactly one program.
The pure act of reading config files is negligible compared to the other things that happen at start-up. A process spawn costs about 1ms. Loading the program code itself is a hell of a lot more expensive than loading a tiny text configuration file. There simply is no performance gain from putting everything on one crap pile. But it has serious practical disadvantages.
So do I. You only need that feature if you want to, say, set a default wallpaper for new users or prevent them from changing it, and I don't see any issues with the way it's implemented. It's a plain-text config which can be deployed using your favorite config-management system. You'd just add a change hook, like you'd restart a service after changing its config file.
> With the conventional one-file-per program approach the existing file system (instead of custom binary datastructures) is used to speed up the access.
The way it works is that dconf clients mmap the database file, so it's the same. dbus does no caching by itself; it uses the same mechanism.
> A hash table just leads to the information. You still have to interpret it. And I assume dconf can only be queried over dbus?
No, dbus is only used for writing. Reading is directly from the filesystem without anything in-between. The serialization format used by dconf matches the in-memory structure, so the application can literally take the value from (mmap) memory and use it as-is. So, no parsing.
I agree with you that it would be less efficient if dbus were used in the read path, this is probably why it's implemented that way.
> Not convinced. If most keys were read many times, something was very wrong.
> No. Most things are read by exactly one program.
> Loading the program code itself is a hell of a lot more expensive than loading a tiny text configuration file
The primary use case for dconf is desktop settings, and yes, those are read hundreds of times. Not only at startup, but also at runtime. The current GTK theme or scaling factor is read all the time during rendering.
I'm not advocating that all application config should end up in dconf - as I noted, I like my i3, GPG or SSH config the way they are.
For me, it has lots of practical advantages. For example, I have a script which sets all desktop settings the way I like them (font, scaling factor, focus follows mouse, ...) and I can just set them using "gsettings set org.gnome.desktop.interface scaling-factor 2" instead of figuring out which configs are in which files, how to parse them, and how to merge existing and new configs.
> You'd just put in a change hook like you'd restart a service after changing its config file.
It's not the same. While services restart on reboot, the config file is not automatically compiled. Also, if you don't reboot, most services are restarted through a uniform interface. Dconf makes an exception without need. You need the dconf infrastructure at deploy time (which might be an installation), and you need to know how to use it.
I've wasted hours figuring out how the configuration is to be used. In the end I still had terrible bugs. I'm glad other programs just do the parsing for me. It works, there never was a problem to begin with.
> The current GTK theme or scaling factor is read all the time during rendering.
I think you are missing the possibility of parsing a configuration once at startup and storing the result in memory in suitable datastructures. It's not like conventional programs would parse the configuration over and over. Fixed binary layouts are very bad for interoperation.
Btw, there is no memory usage problem compared to an mmap solution if every program just parses the _relevant_ information. And it also doesn't preclude a live-update mechanism -- this just has to be done with care. Not every information is trivially live-updatable, as illustrated by the breakage I mentioned.
> I can just set them using "gsettings set org.gnome.desktop.interface scaling-factor 2"
This can also be done with text configuration files. Some programs do it. Most don't (including dconf, which doesn't update your original input file) because the tradeoffs are bad. It has nothing to do with whether or not the serialization format is text.
Text vs binary is just this question: Do I need absolute performance or absolute discoverability? In the case of configuration files, the answer is clear.
Most importantly, a stable API which can be concurrently used by multiple processes.
Getting/setting single values in a plain text config file is cumbersome and error-prone, and even if there's a library which is able to read that particular format, it often clobbers comments and formatting. For systems administration, you usually solve this by generating the entire config file from a template since nothing on the server should change your config files anyway.
However, on the desktop, many different applications want to read and write keys. Tray applets, the system config panel, even your media player... They would inevitably conflict and you'd have to implement file-based locking.
Another topic: real-time notifications. If you modify the text size in the Gnome Control Panel, what happens is that the panel modifies the value in dconf and dconf sends a notification to all running Gnome applications. You could implement this using inotify, but you'd need code in each of your applications to open the config file, parse it and figure out which setting actually changed.
Single dconf lookups are really fast since everything is zero-copy and doesn't involve any syscalls. This is much faster than opening, reading and parsing config files.
dconf doesn't aim to replace all plain text config files. Storing the i3 or SSH configuration in dconf would be stupid (or even using dconf on a server to begin with).
> Getting/setting single values in a plain text config file is cumbersome and error-prone
What? The INI format is very simple and there are numerous libraries that handle it cleanly in any language.
> and even if there's a library which is able to read that particular format, it often clobbers comments and formatting.
What? These are primarily machine-readable configuration files. Why is formatting relevant? And how are comments relevant to dconf, since it uses a binary format anyway?
If a plain-text file isn't good enough, at least have the decency to use SQLite.
> However, on the desktop, many different application want to read and write keys. Tray applets, the system config panel, even your media player...
If your applications directly read and store configuration keys belonging to other applications, you have way worse problems in your architecture than configuration file format.
Configuration items like the current theme or the current default font are things that all applications are interested in. If they want to match the look and feel of the rest of the desktop, that is.
That opens another can of worms: what is the system control panel? Is it gnome-control-center? Or the command-line dconf tools? Or dconf-editor? Or one of the myriad tweak utilities?
Cut off one of these and brace for the whining that Gnome is dumbing everything down and locking up your settings.
The issue for me is more that they recreated the windows registry with all its flaws. For example, when uninstalling an app, can every entry associated with that app be easily removed?
The registry was supposed to improve upon dotfiles. In practice there is often one dotfile to delete versus hundreds of scattered registry entries. If an Android app is removed, its local settings are also completely removed, because Android didn't just copy the design of Windows.
That problem is unique to Windows. I actually took a look at dconf on my machine and no application occupied more than one subtree.
GEdit sits in /org/gnome/gedit and virt-manager in /org/virt-manager as you'd expect.
I also hate Android's behavior, since there are situations where you want to keep them. If I remove GEdit for whatever reason, I expect it to keep my settings.
There are legitimate reasons to criticize Gnome, but this isn't one of them. dconf is totally reasonable and has significant advantages over plain text.
It's unfortunate that many of the modern components frequently seen in Linux environments do not appreciate/respect/promote the use of plain-text files and scripts. Thinking of systemd and its many utilities, Docker, etc.
Can you please explain what you mean with respect to Docker? I've been using it (and the other software in its ecosystem) for three years and have never run into files and scripts that were not plaintext. Even image layers are just tarballs of files and metadata.
Some of the configuration is plain text: unit, service, and environment files are plain text. Which services are enabled and disabled cannot be controlled through file manipulation that I know of. The journal (log files) is binary.
I think what bothers me about both is that the presence of text files in a predefined location is not enough to make things happen on its own. Instead of the simplest possible everything-is-a-file and interacting with the filesystem is the interface, you have an additional requirement of calling binaries and their subcommands.
Generally systemd manages enabled/disabled services through symlinks in /etc/ (/etc/systemd/system/ on my Fedora 24, for instance), which you can add/delete in the shell if you want to. It's admittedly not plain-text, but manipulation-wise it's about equal.
IIRC enabling and disabling services in systemd is done via symlinks. You'd have to call a binary to start newly-added services, though (i.e. systemctl daemon-reload && systemctl start service).
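The symlink mechanism itself is easy to see in isolation. Here's a sketch that reproduces the enable-by-symlink idea in a throwaway /tmp tree (the unit name and paths are made up; on a real system the unit lives under /usr/lib/systemd/system, the link goes under /etc/systemd/system/multi-user.target.wants/, and you'd still need systemctl to actually start it):

```shell
#!/bin/sh
# Simulate what `systemctl enable foo.service` does: symlink the unit
# file into the target's .wants directory. ROOT is a throwaway /tmp
# tree standing in for / so this sketch is safe to run.
ROOT=/tmp/systemd-demo
mkdir -p "$ROOT/usr/lib/systemd/system" \
         "$ROOT/etc/systemd/system/multi-user.target.wants"

# A minimal made-up unit file
cat > "$ROOT/usr/lib/systemd/system/foo.service" <<'EOF'
[Unit]
Description=Demo service

[Service]
ExecStart=/bin/true

[Install]
WantedBy=multi-user.target
EOF

# "Enable" = create the symlink; "disable" = remove it again
ln -sf "$ROOT/usr/lib/systemd/system/foo.service" \
       "$ROOT/etc/systemd/system/multi-user.target.wants/foo.service"
```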
My personal favorite is the LXDE / Lubuntu 16.04 setup.
All my subjective opinion, of course:
- Sits in the sweet spot between a bare-bones distro and batteries included.
- Without decorations, I have essentially borderless applications, and I can quickly toggle the titlebars etc.
- Kept sane keyboard shortcuts and did not remove them for arbitrary reasons. Here's a random nitpick: why do some interfaces no longer allow you to press a single letter on the keyboard to launch an option from a menu, e.g. an option in the File menu?
- Very fast, allows a great battery life, all of ubuntu packages and ubuntu community support/troubleshooting, very easy to customize if you need to.
I love it because it is an OS that stays out of my way, has a sane setup out of the box and looks awesome with the borderless windows.
It is perfect in almost every way! I'm worried about their plans to switch to LXQt. It's almost like Ubuntu 10.10, which IMHO was the pinnacle of perfection, sane choices, etc., and then they decided to start all over again.
The only thing I dislike about Lubuntu is losing the Mac keyboard shortcuts and hardware; a lot of them I managed to replicate, and I even found a (slightly unstable) Alfred alternative called Albert [1]. My Magic Mouse 2 could only work as a dumb mouse.
I turned off the tray at the bottom and the window borders, and set apps to start full screen; switching between full-screen apps was much nicer than what remains after Apple closed off whatever Total Spaces was using. It ended up being a really productive way to work due to the lack of distractions like crap software from Apple, Adobe, 1Password, iTerm etc. endlessly begging for 5 minutes so you can watch their updates download. Or all the other notifications that mostly don't need to be read or exist.
That's probably my favorite thing about Macs. The unix/emacs-like keybindings like C-f, C-b, C-n, C-p to move the cursor, C-c to kill a process in the terminal, etc and the Windows-like keybindings like ⌘-c, ⌘-v, ⌘-x for copy, paste, cut, ⌘-n for new window, etc both exist by default without colliding. I can get this on Linux if I spend a week remapping things but not by default. As far as I can tell, there's no way to get this on Windows.
The Unix/emacs-like keybindings are great, aren't they? They're included in the default bindings of the Cocoa text system, and, though I wasn't able to find references to it in the limited searching I did, it's not so surprising given that OS X comes from NeXTSTEP, which was a Unix-based system. I did find that you can change these keybindings as well.[0][1] Pretty cool, eh?
Minor pedantic nit: ⌘-C, ⌘-V, ⌘-X originate with Apple, so referring to them as "Windows-like" is backwards.[2] Like I said, minor, eh? Normally I'd suggest alternative wording in the interest of providing constructive criticism, but I'm too lazy to do so right now, so feel free to disregard :)
> My Magic Mouse 2 could only work like a dumb mouse
Last I checked, I thought I saw this was a driver issue, not a DE one? I could be wrong, but I remember coming to the conclusion that it wouldn't work as anything but a dumb mouse at all on Linux (at least, for the time being)
My understanding is that LXDE does not support HiDPI very well because not everything is ported to gtk3. Do you have any experience to share with retina/3K/4K displays and LXDE?
I recently tried to use Linux (Ubuntu 16.10) as my primary desktop and failed again. I've tried yearly since 2000, but in late 2016 I switched back to Windows 10 and macOS. Driver support is getting better, but so are my expectations, and Linux (well, Ubuntu) doesn't seem to catch up fast enough.
For instance, I failed to get my dual graphics cards working in extended mode with my laptop's native screen and an external HDMI monitor. My bluetooth headset has a hard time being recognized by the OS; I have to retry many times. And I cannot get sound through the monitor connected to the HDMI output of my Nvidia graphics card. Maybe I used the wrong distro, but all these issues are non-existent when I switch to Windows 10. The Linux desktop is getting better but is still a painful experience with an uncommon setup.
It's interesting that your experience is so different from mine. I've preferred Linux environments since about 1998. Sometimes I boot to Windows, but Windows feels very limiting. With Linux and KDE, I can use custom keyboard shortcuts with ease, I can put my whole OS on a thumb drive, I can run the same instance on other computers, I can copy an instance as many times as I like, and I can try different variations of drivers and settings with no worries that I'll break anything so badly that I can't fix it. I can downgrade anything. I can upgrade programs and drivers even while they're running. I can upgrade at my own pace. I can run the same software on either a desktop or a server. For the few programs I use that don't run well in Linux, I have Wine, virtual machines, and dual booting.
I like Windows and Mac OS, but I feel like they have a long way to go before they're useful for my day to day work.
Everything I own and do is terrible on Windows and works fine on Ubuntu.
There are some weird finicky things with Bluetooth headphones (especially headsets), but it is honestly such an easy fix (plug them in) that I don't even think about it.
Windows, on the other hand, refuses to connect at all to my Bluetooth headphones (Sony), my graphics card has broken drivers (and continues to), and that is even before I mention all of my gripes with Windows as a developer (compiling on Windows is basically a no-go).
The only thing I use Windows for is gaming, because unfortunately games are basically never released for Linux.
I always struggle with Windows updates. It's not on automatic update, and it seems that after the Win 10 shenanigans Microsoft is making it harder and harder to keep a Windows machine updated. First it seeks updates without finding anything, then after a restart it says there are updates, but by that time I have work to do and can't be interrupted. Essentially, the solution is to shut it down and return to the Linux machine.
> Linux desktop is getting better but is still a painful experience when used with non common setup.
Indeed every time I think of getting a new laptop or desktop, I try to do research into how well Linux will run on the hardware. Thankfully there seem to be a decent number of laptops that have Linux preinstalled now.
As much as I enjoy tinkering, even after using Linux for a while I still don't have a strong understanding of how things work, and it's frustrating to be forced to tinker when my intention was to get things done.
Perhaps I need to dedicate more time to understanding how things work?
> Perhaps I need to dedicate more time to understanding how things work?
Yes, if you don't already, learn basic command line very well, with stderr/stdin/stdout, pipes, sed, grep, find, cat, make, etc. etc. You'll then thrive in Linux, and it will help you with OSX, too.
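Even a contrived one-liner shows how far those primitives go, e.g. tallying login shells from passwd-style input:

```shell
# Contrived example: tally the login shells in /etc/passwd-style input.
# A pipe feeds cut (field extraction), then sort | uniq -c | sort -rn
# counts occurrences and lists the most common shell first.
printf 'root:x:0:0::/root:/bin/bash
me:x:1000:1000::/home/me:/bin/zsh
you:x:1001:1001::/home/you:/bin/bash
' | cut -d: -f7 | sort | uniq -c | sort -rn
```

On a real system you'd replace the printf with `cat /etc/passwd`.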
Oh I already know how to use (most of) those things. What I meant was more that if something on my machine is not working, like a driver or something, I usually have to resort to just copy pasting whatever people tell me to do.
I guess what I meant was more that I don't understand how Linux the OS works, not how to work in Linux?
I've been dealing with the same issue. I haven't figured out any real trick to it; the best I can tell, a working Linux system is composed of so many different utilities worked on by so many different people, there's a lot less standardization than what you might expect.
For example, if I do something and it doesn't work, I want to see something that went wrong. This could show up:
1. In stdout/stderr
2. In a logfile specific to that software, somewhere on my computer
3. In an aggregated logfile somewhere on my computer
4. Nowhere, until you enable 1, 2, or 3 via a config file or CLI parameter
This all assumes that if there's something you have to look up (like where it logs files to), you can figure it out based on the man page or website or inspecting your filesystem, and then it actually logs some actionable information on the problem.
I've just come to terms with the fact that everything, no matter how simple it should be, is an ordeal the first time I do it. I just set aside time to research and make notes.
I had the same opinion of Linux as you. I didn't want to tinker with it and get everything working well.
Just install Ubuntu. Ignore everyone saying Ubuntu is a kiddy OS ("All you need is X, Y, Z-ix"). Ubuntu is Linux Windows (not really, but it is pretty hard to mess up).
Unfortunately Linux Mint does not track security issues or issue security updates. That alone is reason enough to avoid it completely, but then when you add in their server breach and the way they handled it...
Stick with Debian and Ubuntu. And more Debian, going forward.
I get security updates pretty soon after they're released on the Debian mailing list, actually. Logging on this morning I have an update for the gstreamer issues.
Today I found out that my brand new Macbook Pro (company provided) does not actually go to sleep when I close the lid. Every 30 minutes or so, it wakes up, brings up a wifi connection and starts doing something, even though the 'powernap' feature is turned off.
I find that sort of behaviour far more worrisome than a bluetooth headset not working.
My experiences are similar, it's not the software that's the biggest problem, I use mostly FOSS anyway. It was the lack of decent hardware support, e.g. Skylake laptop support, Displayport MST, that drove me to Windows 10.
I was an early adopter and it was a bumpy start for sure, but you expect that. Just wait half a year or so after a new processor generation and it'll be fine. Zero issues on my Haswell laptop for the past years.
Intel has full-time people working on the Linux graphics stack. I've contributed bug reports and stack traces and they have been super friendly and responsive.
I wouldn't say it is working fine yet, for example:
https://bugzilla.kernel.org/show_bug.cgi?id=116671
I'm not convinced mobile hardware is supported by them given this bug is still open and Skylake is approaching EOL.
What drove me back, already in Windows XP days, was that the UI/UX and gamer culture don't really get along with the FOSS ideals and using powerful gaming computers as a modern version of Sun 3/50 (where twm was created).
If you ever decide to try again, try to time it with a new computer purchase and buy a system from a vendor who ships Linux based systems. Hardware support is always the trickiest bit with Linux as it tends to lag behind a bit due to developers not getting access to the hardware until after it is publicly available combined with the Linux driver model (in kernel drivers) + the distribution model of sticking with stable kernel releases.
TLDR; If you want good hardware support without worry, buy from a Linux vendor.
I had a system76 laptop with ubuntu pre-installed and still had nothing but problems. I've tried a myriad of laptops (always with hardware that's supposed to play well with linux e.g. intel graphics/wifi etc...) with linux (as well as every distro under the sun), and while some definitely worked better than others, every single one had obnoxious game breaking issues/bugs (suspend/resume being occasionally unreliable being the most common, even on 'linux compatible' machines).
My system76 laptop also had an obnoxious firmware bug (that seemed to only affect linux) with the media keys that they basically told me they couldn't do anything about: https://www.youtube.com/watch?v=52TaFzCDDkA
Linux desktop environments and software also have serious QA issues in general IMO. When running linux on the desktop I found myself spending more time reporting bugs than actually using my computer. I really enjoyed tinkering with linux, but when it comes to day to day use (particular on laptops) I found that the quality and reliability just wasn't there.
I ended up running windows on my system76 machine and eventually moving to a macbook air.
Now this is strange, because I've had exactly the opposite experience with the same hardware. I've gotten both a (Thinkpad) W530 and a P50 working fully (with four monitors!) with their dual-GPU hardware out of the box with Linux Mint 18 (Cinnamon), which is based on Ubuntu 16.04. Incidentally, Windows 10 shit itself with a simple ATI 7990 and the onboard NIC on an Asus motherboard when I tried to install it a couple of months ago.
I bought a $360 Thinkpad W520 quad core mostly to see if I could move from Linux in VMs to it being a daily driver. That was two weeks ago. I'm amazed at how well it works: this laptop has a nightmare graphics setup (Nvidia Optimus), but it works very well when you pick a card in the BIOS; the Optimus setup doesn't work very well. On the other hand, it isn't even supported in Windows 10: I tried it anyway, and it kept crashing.

Meanwhile, Ubuntu 16.04 is a revelation (I mostly used Linux Mint in the VMs). Fast, very stable, 7-hour battery on integrated graphics, works with one DP monitor (haven't tried two yet, waiting for a docking station to arrive). I use it for development, VMs and RDP; a fantastic experience so far. Ubuntu is really good to use; it's very fast to move around apps and workspaces, with great keyboard shortcuts. There are a lot of moving parts, but it seems to hang together.

To my amazement, Wine can even run Excel & Word (2010). I have never gotten Office working in Linux before, which is quite amazing, although ultimately not very useful since the laptop seems to run Windows VMs better than my 2015 i7 Mac (which is only dual core), and LibreOffice is almost different software on Linux vs the Mac, it's so much better.
Plus my Bluetooth Magic Trackpad worked. So I'm probably going to get an entry-level quad-core P50 in the next month or two, and sell the MacBook; I think I'll come out ahead, so I'll put the change towards a good Chromebook.
I have it working with two DP monitors and the laptop screen. But this only works in XFCE not Unity, even though I use lightdm with XFCE. Optimus actually probably would have worked out of the box except that I added a brightness control xorg.conf tweak which broke Optimus (the problem with 2011 hardware is a lot of neolithic advice). With 16.04.2's 4.8 kernel and Nvidia 367 driver that tweak isn't necessary anyway.
One thing I noticed about installing multiple desktop environments on Ubuntu: they all have different default apps that are always installed. So when you have two desktop environments, you end up with two different terminals, two different file browsers, two different archive managers, etc. You could uninstall the apps that serve the exact same purpose, but it would be nice if there were a formal way for DEs to recognize and acknowledge the existence of another DE.
Rant: When I first wrapped my head around the concept of a desktop environment, the first thought that came to mind was: why don't Microsoft and Apple abandon their proprietary OSes and just build and maintain a solid desktop environment for Linux and the BSDs? It would save them a ton of money, resources, and developers, and it would improve security. The major drawback was porting the apps written for their OSes to Linux and the BSDs, which was a deal breaker. Then came the announcement of the Windows Subsystem for Linux. First thought: incredible engineering feat, but MS did the exact opposite of what I would have wanted, which was a Linux subsystem for Windows. That would have solved the issue of porting Windows apps to Linux. Now I am beginning to realize that all decisions, technical or not, are just ugly politics with nice explanations to make the decisions seem like the right thing to do.
It seems like the endless iteration of OSS stripping some politics, then someone taking it and adding different politics, which makes someone else do an OSS version without that politics, etc.
Pretty really isn't the problem and hasn't been for a very long time. You can make Linux IMHO prettier than a Mac or Windows pretty easily.
The problem is having to tinker. Tinkering is fine if you've got the time and enjoy it, but if you just need your computer to always "just work" this is a problem. Both Linux and Windows strike out for me here. I use a Mac because I almost never have to mess with it... I mean sometimes a year can go by between times when I have to actually mess with my machine. Nothing else does that.
I'm not sure why you think you "have to tinker". I run Ubuntu 14.04 and use the i3 window manager. I haven't "tinkered" with the setup in about a year and a half, dating back to when I installed i3 shortly after I got my last computer. The default Unity DE is nice, too. I don't use it, but I could have used it all this time; I can't imagine what "tinkering" would be required.
I have to tinker all the time because hardware compatibility still sucks. Wifi, sound, and video cards often require tinkering to get them working, especially after upgrading. They are a lot better than they used to be, but they still have a ways to go.
I've had zero driver issues on my Asus UX305 laptop. Even if I had, those issues are generally solved once on installation, not continually tinkered with.
I've had quite a few computers that didn't have any problems and worked right out of the box. That doesn't mean there aren't still problems with other computers.
Tiling window managers like i3 seem to allow the perfect mixture of terminal and graphics. After spending some time using i3 I've noticed (when using it for work, etc.) that desktop environments universally seem extremely inefficient and slow. For certain types of tasks a mouse is essential, but being forced to use it for nearly everything in OS X seems like a massive waste of time. I'd like to see a study on efficiency among proficient users of desktop environments and proficient users of WMs. Users who view themselves as fluent in an OS X desktop environment and users who view themselves as fluent in a WM (say i3) probably spend different amounts of time executing the same tasks. IMO not having to use the mouse as frequently seems like a tremendous efficiency booster.
What I do is run a tiling window manager inside a desktop environment. XMonad inside Gnome is what I'm using right now. For 95% of what I do every day I'm just using the keyboard but when I do something unusual there's still the friendlyish GUI to take care of it.
I remember when I discovered Unix as a college freshman (HP-UX with CDE), I spent way too much time fiddling with the desktop settings. For a while I even ran a buggy Enlightenment because wow was it pretty.
I loved trying out new visual styles. I spent so much time on wallpapers.
It took me years before I realized that what I really wanted to optimize was app layout. Any screen space that wasn't used by an app was wasted space. Space I could use to have a few extra lines in a terminal, a few extra columns of code...
I used AwesomeWM for a few years because I loved its concept of "tagging" windows which let you mix-and-match which you were displaying at a given time (this is my "all code" setting, that's my "all terms", and this is my "some code some term..."). Now I'm on macOS and I dearly miss it.
Anyhow, I haven't looked at a desktop wallpaper in years, and I like my window decorations as minimalistic as possible.
I'm in the same boat, I used AwesomeWM for several years, before switching to OSX around 2010, just because the Mac hardware was superior to just about everything else, but I always felt crippled in the window manager. iTerm2 + tmux can get close, but switching to the browser never "fit" in that flow.
A few weeks ago I switched back to Linux on a NUC, and its like I've come home again.
I do not subscribe to the idea that it is a good thing to switch from one DE to another for specific tasks. I think it is better to learn one, master it and stick to it. Otherwise time that should be spent working is spent on tinkering with the DE.
While I agree with you personally, I also get why people do it. For some the tinkering with the DE is working, or at least having fun. It might be a form of procrastination or a genuine interest. It's not terribly different than reading productivity blogs or looking at other people's vacation photos.
> get your DE/WM just the way you want it, you can be more productive
Does that ever actually occur? My vimrc changes almost on a daily basis, and I'm frequently changing plugins, writing new tools, and creating new aliases all the time. And that's just for my terminal.
Having to repeat some of those modifications 3-4 times because I'm using a slightly different terminal app for each individual task would, I think, drive me batty.
> Does that ever actually occur? My vimrc changes almost on a daily basis, and I'm frequently changing plugins, writing new tools, and creating new aliases all the time. And that's just for my terminal.
Don't take it the wrong way, but that sounds like a personal problem ;-)
If I were changing my configuration files that often, I would start looking for an environment that worked better for me.
My config editing is approximately logarithmic. There's a bunch when I start using new software (or start using old software in a new environment), but it slows down quickly as time goes by. Nowadays, I might make a big change to my .emacs once a year, and maybe tweak a variable or keyboard shortcut once every 3 months or so, but generally it doesn't change much. Same with all my other config files.
>Don't take it the wrong way, but that sounds like a personal problem
Yes, and in addition one that exists regardless of what OS/WM/DE you're running, since Vim and Emacs configuration works basically the same on whatever platform they're on.
>Having to repeat some of those modifications 3-4 times because I'm using a slightly different terminal app for each individual task would, I think, drive me batty.
I have no idea what you're talking about. What are these "slightly different terminal apps"?
There's nothing forcing you to use the native terminal under a new DE/WM. If your neurons have hardened on the gnome-terminal way of doing things, it's no biggie to just keep using that under KDE or openbox or twm or whatever, modulo (maybe) a little twiddling of desktop shortcuts and menu entries and the like.
I do use a tiling manager on my linux laptop, but this is mostly because on a 13" screen every pixel counts. Also, the trackpad is really not good.
Nothing wrong with trying several environments until you find one which is the right one for you. But, hopping from one to another, like the article suggests, seems counterproductive to me.
Although this blog entry is entirely about desktop interfaces, he does mention that
> ...Windows, OS X, iOS, and Android, have one common interface...
This is true for the first three without significant modification, but it isn't true for Android. One of my favorite things about Android when I had it was the ability to choose a launcher right from the app store. You can completely change your home screen interface with a downloadable app, and it's actually much easier to do than it is on linux last time I checked.
For desktops I agree, Linux has the most customization when it comes to the interface. But on Android, it's just as customizable as a Linux desktop (I mean... it is Linux after all).
> on Android, it's just as customizable as a Linux desktop
I wouldn't say that's true. For Android launchers to be functionally equivalent to (or as customizable as) Linux DEs, they'd have to be able to customize Android's nav bar, pull-down menu, settings app, recents menu, and split screen/tiling/windowing. At the moment, these are all handled at the OS level. Not that Android's launcher concept isn't cool, but it's definitely not as flexible as DEs.
I agree, the launcher isn't as customizable as a linux desktop. To change the entirety of a linux user interface is much easier on a linux desktop. But nonetheless, Android is still quite customizable, and the other interfaces you mentioned (nav bar, pull down menu, etc) are still customizable if you're able to flash a custom ROM. It's all possible, but desktop Linux is much easier to fiddle around with in that manner.
Nonetheless, I think Android should've been left out of that group in the OP. Not a big deal, but as someone who used to tinker with Android quite a bit, I couldn't resist commenting on the point.
> If you’re using Windows, you’re pretty much stuck with the Windows interface. The menus are always going to be in the same place and while you can do things, like change your desktop image and your colors and themes, you’re still limited in how you interact with your computer.
I'm not so sure about that. I used WindowBlinds years ago for XP. Not surprised to learn that they're still around.
There used to be a bunch of alternatives to explorer.exe (yep, the very same damned .exe as the file explorer), but it's been ages since I played with them. They all had quirks and issues, as best I recall.
They provide, with different products, a dock-like experience, virtual desktops, some of the bling of Compiz theming, etc. Which is fantastic until Windows (which on some machines is flakier than others) acts weird and you get to figure out whether the quirk is in the app, your machine, or your Stardock product, which in my experience it often is.
Stardock is buggy and costs money. Linux DE are free and superior.
I think even more importantly, Linux allows DEs and WMs to do practically anything they want. The window manager i3 simply is not possible on OSX. OSX tiling window managers use the accessibility API as a hack, and it's just not very good at all.
I can't even begin to express how much I love i3. I literally switched from OSX just for i3, it's made me that much more efficient, focused, less frustrated and allows me to use my ultrawide monitor to its fullest potential.
I feel the same way; although I've spent some time with both Gnome and KDE, I never developed an attachment to either of them, mostly because I felt that there was a good likelihood that mastering the Gnome of today might be a pointless exercise because if you sat me down in front of the Gnome of one year from now, it'd be unrecognizable to me.
Around the advent of Gnome 3/KDE 4 I also realized that I didn't really use any of the Gnome or KDE apps. If I want to burn a cd, I know wodim well. If I want to listen to music, I probably use my phone. The browser is the only desktop app that I use regularly, and that's fairly agnostic.
I've spent years tweaking rxvt Xresources, for example, such that I can drop it on a brand new machine and everything behaves as I expect.
I'm well past the point in my career where I'd be looking for, say, a better terminal emulator, or shell, or chat app or any of that stuff; the stuff I've got is working as well as I could hope, and when I need something new I can compose it from the abilities that the shell and the window manager expose.
The last big change I made was after I tried a tiling WM (Xmonad) for a couple of days and realized right away I'd never go back; that's been 7 or 8 years ago. Occasionally stuff changes underneath me in ways that seem somewhat capricious (Pulse Audio, Systemd) but I usually give that stuff a chance and try to adapt to it unless its absolutely unbearable.
I enjoy checking out the new stuff on machines I use at home for fun, but at work I treat the workstation as an instrument; I know what I want and almost everything has a reason.
I've been going through a similar experience over the last few years, mostly driven by my hardware getting older and not handling the requirements of modern DE's (when I'd prefer the resource to be used for applications).
It's nice stripping back to the original structures, though it does feel like computer archaeology sometimes. There are a lot of services in the upper layers that have to be switched off to get back to a plain .xinitrc and Xresources!
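For anyone who hasn't gone this route, the "plain .xinitrc" setup being described can be as small as a couple of lines. A minimal sketch (the window manager named here is just a placeholder, not necessarily what the commenter runs):

```shell
#!/bin/sh
# Minimal ~/.xinitrc: merge X resources, then exec the window manager
# so it becomes the session's main process. i3 is only an example;
# swap in whatever WM you use.
[ -r "$HOME/.Xresources" ] && xrdb -merge "$HOME/.Xresources"
exec i3
```

Running `startx` then uses this file directly, bypassing the display manager and the DE session stack entirely.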
On the upside, I know my configuration and it's all in a repository now!
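The repo-plus-symlinks approach mentioned in this thread can be sketched in a few lines of shell. This is an illustrative sketch, not anyone's actual setup: the `~/dotfiles` path and the repo-mirrors-$HOME layout are assumptions, and filenames with unusual whitespace aren't handled.

```shell
#!/bin/sh
# Sketch: a dotfiles repo that mirrors the home directory's structure,
# with symlinks planted at the corresponding paths in the real $HOME.
DOTFILES="${DOTFILES:-$HOME/dotfiles}"

link_dotfiles() {
    # For every regular file in the repo, create a symlink at the
    # matching path under $HOME, creating parent dirs as needed.
    # ln -sfn replaces existing links, so re-running after a pull is safe.
    find "$DOTFILES" -type f | while IFS= read -r src; do
        rel="${src#"$DOTFILES/"}"       # path relative to the repo root
        dest="$HOME/$rel"
        mkdir -p "$(dirname "$dest")"
        ln -sfn "$src" "$dest"
    done
}
```

Run `link_dotfiles` once after cloning the repo (and again whenever new files are added to it).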
Except I never actually got on board with KDE or GNOME. Command line, .xinitrc, and .Xresources were my jam lo these many years.
When I first tried Linux in the 90s it was already less of a pain than Windows. Why would I want to reinstate all that Windows-like cruft to be at most 0% more productive than I was without it?
That is one of the reasons I like Linux very much! Still, bspwm is my favorite because I can set everything up exactly the way I want :) And a tiling WM improves my workflow! I'm still considering leaving OS X (or macOS, as it's called these days) behind. But there is still the problem of OS X-only software.
In the environment I'm in, my Linux home directory is shared by multiple systems. Most are CentOS 6, but we're starting to use CentOS 7, too. I'm using the default Gnome 2 Desktop on 6, but Mate Desktop on 7, as I don't like Gnome 3 or the "Classic" option. I can't point to any specifics at the moment, but I've had what I think are issues of these desktops stepping on each other. Being able to switch between DEs is nice, but they have to play nicely with each other and I'm not sure that's always true.
I also maintain some Solaris 10 systems and that's a whole other set of issues, sharing a home directory across different "unix family" systems.
If in this case you've a /home that is an NFS mount shared across several nodes running different *nix or even Linux distros then yes, that is a problem.
Centos6 will have different Gnome versions than Centos7. However, both will write in ~/.gnome, possibly .gtk, or .gtk-3.0 or wherever the hell they write to these days.
Better to share your data through an nfs share in /data instead of /home and take the pain of not having the exact Desktop across different nodes. Or to create a homogenous environment with only Centos7 for example.
I'm beginning to think that's the right approach. Or, have a minimal home directory with a .bashrc that sets $HOME at login to an appropriate subdirectory. Might not be too robust...
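That .bashrc idea could look roughly like the sketch below. The `homes/` layout is hypothetical, and the caveat above stands: anything that reads $HOME before the login shell runs (display managers, PAM modules) won't see the redirection.

```shell
# Sketch only: a minimal shared ~/.bashrc that repoints $HOME at a
# per-OS subtree, so e.g. CentOS 6 and CentOS 7 desktops stop writing
# into each other's config files. The homes/ layout is hypothetical.
if [ -r /etc/os-release ]; then
    . /etc/os-release
    osid="${ID}-${VERSION_ID}"                  # e.g. centos-7
elif [ -r /etc/redhat-release ]; then
    osid="el$(rpm -E '%{rhel}' 2>/dev/null)"    # CentOS 6 predates os-release
else
    osid="generic"
fi

if [ -d "$HOME/homes/$osid" ]; then
    export HOME="$HOME/homes/$osid"
    cd "$HOME"
fi
```

Shared data (documents, code) would still live outside the per-OS subtrees, which is where the sibling suggestion of an NFS-mounted /data comes in.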
If there is one distro I have to recommend, it's Solus. I'm usually not a fan of new-distro-xyz where not much except the wallpaper changed. But Solus is built from scratch and questions many old Linux dogmas. It also focuses on the desktop and is probably one of the fastest distros, with a great OOTB experience. It also has very high velocity: every time you look there are new, amazing improvements.
Some design choices for GNOME simply escape me. By default, no window buttons (i.e. active windows) on the top panel. By default, no maximize/minimize buttons for windows. Who could possibly find that user-friendly?
why do you use the maximize / minimize buttons? If you're just trying to see a different app, wouldn't alt+tab do the same thing?
> no window buttons (i.e. active windows) on the top panel
why do you need to see a running list of everything that's open in your face at all times? it seems to me like the only time you should need this information is when you're switching apps. you can also see this information by pressing the Super (aka Windows) key. also, afaik i3 doesn't show you all of the running applications at once either, but no one ever complains about that.
> If you're just trying to see a different app, wouldn't alt+tab do the same thing?
Can you alt+tab with a mouse?
> why do you need to see a running list of everything that's open in your face at all times?
A taskbar at the edge of the screen is "in your face"? What if the panel auto-hides?
> it seems to me like the only time you should need this information is when you're switching apps.
And by hiding that information, you add an extra step to the process. The one-step "click on taskbar button" becomes the two-step "activate the window list, click on window button".
> i3 doesn't show you all of the running applications at once either, but no one ever complains about that.
That's because people who use i3 don't want that feature. People who do want it don't use i3.
GNOME used to have it, but threw it away--along with many other features.
Why is it so hard to understand that not everyone uses their computer the same way you do?
This year my MacbookPro died (battery exploded). Since I was frustrated with Apple's lack of support for this extreme case and the fact that the new models can't be repaired, I bought an Asus UX305LA and loaded Ubuntu onto it.
I have used Ubuntu at work for a few years but this was the first time I went with Linux at home.
For the most part, I love it! The whole OS seems to be really light on resources and the UI is pretty nice these days. It also passes a critical test, which is when I hand my laptop to a non-technical friend they don't even notice it's running anything uncommon. Chrome is Chrome I guess :-)
One thing that kills me is the fact that Linux is basically left in the dust when it comes to media on the web. I can't watch HBO Go, installing Flash player requires a PhD, and many websites just tell you to go away for using an "unsupported operating system".
Ubuntu are playing a (very long) game. If they manage it, IMHO they'll be in a better place than Fedora's UI which is built on a stack that increasingly only Redhat will continue to develop.
I would argue that RedHat is in a better position to do most of the development of the Gnome desktop and Wayland vs Canonical being able to manage the transition to Mir and keep Unity going.
Are you aware that the whole of the development resource (people) across the entire free software desktop would fit into a reasonable size conference room? Versus, say the size of the development teams on proprietary systems such as Windows?
So, what is the point of trying to divide the community of people all trying to work towards similar(ish) goals - the improvement of the alternative desktops - rather than focusing on how they are part of the same thing? It doesn't improve the likelihood of achieving the goal of a "better" desktop.
You are literally stood in a room with 10 people trying to demonstrate how 2 of them are better than the other 8. It's an exercise in pointlessness, makes the free software community unfriendly and sets up fake academic wars [0].
Note: I previously worked for Canonical so have bias, I know people at RedHat and other open source companies. I have had the opinion that this sort of stupid divisionism stuff is bad for FOSS for at least 10 years.
I apologize if I came off as an ass. I was ticked off by Redhat being painted as the BigBadCo and Canonical being painted as the little-guy-who-could.
I have the greatest of respect for anyone who has ever contributed a single line to Linux - more than anywhere else, your works really impacts countries like India.
I am not sure why there is this impression. I would be lying if I didn't think so myself.
I used to love the choice after being limited by Windows but it meant all the development and design efforts were split between different desktop environments so mixing apps from different desktop systems was an awkward user experience.
I preferred KDE the most, but I never understood the obsession with the number of customisation options. These surely slowed down development, and it felt like you had to be the UX designer yourself instead of sensible defaults being chosen.
Now I just want something with minimal customisation with decent defaults so I know it won't suddenly break when I have to get some work done.
> Now I just want something with minimal customisation with decent defaults so I know it won't suddenly break when I have to get some work done.
The difficulty is that one persons "sensible default" is anothers "crazy terrible default". Particularly in the Linux world, fuelled by it being committed hobbyists who create much of the desktop software and the tradition of UNIX configurability.
> Now I just want something with minimal customisation with decent defaults so I know it won't suddenly break when I have to get some work done.
Try Ubuntu MATE, if you want minimal customization you have some preconfigured defaults in MATE Tweak tool, if you ever want to customize more its super easy compared to every other DE I've tried.
The freedom of using a variety of desktop environments, and the ability to customise them down to the source code, is what got me interested in Linux in the first place.
Back in 2003, I was experimenting with Rainmeter and a port of Blackbox for Windows, but quickly became frustrated with their limitations. I spent an embarrassing amount of time trying out the different window managers before settling on KDE 3.
I wish one of the major Linux distributions would double down on design as a core priority (as in, bring in designers and give them serious pull). I'd kill for a beautiful, opinionated, and novel Linux desktop environment.
I think this is being attempted, though every one I've seen is horrendous.
- Graphics designers are not necessarily qualified UI (or "UX") designers
- UX designers are not necessarily qualified UX designers
- Qualified UX designers are not necessarily qualified designers of an OS UI. Most UX work you can get is essentially applying a few simple principles to yet another implementation of a CRUD app.
I've only seen Linux distros make "design" progress on these fronts:
- Eye candy: making things shake, glow, blink, blah, whatever. This is often pointless, and can exacerbate issues with hardware.
- Terrible "modern design" copycatting: making fonts and buttons really huge and interfaces empty with no real clue what someone is going to use it for. KDE Neon's package manager sticks out as a bad offender I saw recently
- Out of box setup: releasing something that has all the batteries included. Unfortunately, this also pretty much guarantees: it has software on it you don't need; it has software on it you don't want; it has software on it that will break at some point; understanding and customizing it will be treacherous and nigh impossible.
You can go two ways: make a single/narrow purpose system, or make a general purpose system. Linux is great for single/narrow purpose systems, since it's all pieces you can pick and choose at your will. Of course, that won't work as a general computer.
For a general purpose system, I think we need to stop trying to pretend that it's easy or can work out of the box and start accepting the fact that it's difficult and provide people the tools and documentation to handle it.
It's attractive and all, but that pretty canopy is turned up in the direction the rain and wind normally come from. And the seats, which have these neat-looking glass tops with embedded fish and octopus sculptures, are not only uncomfortable, they also have a bar in the center (presumably designed to stop the homeless from sleeping on them) that also stops more than two people from sitting, and the glass has a lip that causes the rain the canopy lets in to collect on the seats, making them utterly useless at least a third of the time.
ElementaryOS is just a skin of GNOME and some half-assed replacements for some GNOME apps. If you want a polished desktop just install GNOME. The best distro for GNOME is Fedora. So I guess what I'm saying is just install Fedora.
As long as it's not the Material school of design or any of these other sparse, happy atrocities that have graced the web and mobile over the last few years
The only problem will be having a dozen calculators, control centers, text editors, terminal emulators, and file browsers installed on your computer, to the point where you have to hunt for the correct program.
Speaking of window managers - is there a successor/branch of scwm (Scheme Constrained Window Manager) that is being maintained currently? I loved how that worked...
You can click on the uBlock icon and open the logger (little icon that looks like a window with lines) to see which requests are being blocked. I'm seeing xhr's to https://collector-medium.lightstep.com/api/v0/reports being blocked every few seconds. Probably some medium analytics request.
Cargo cultism is alive and well! So many _feelings_ of variety! 2016 is the year of the linux desktop, just like 2015, 2014, 2013...
So many options of ways to lay out windows but fewer than half the actual applications creative professionals need to do their job, so we're still stuck on Mac or Windows.
But the reality is that "creative professionals" have very few reasons to change platforms, as Mac/Windows satisfices [0] them. This means they don't contribute to making alternative platforms better for their use case.
Second, professional Linux development is focused on the server. No-one is really paid on the desktop side to create applications for creative professionals [1]. Most of the applications are created by "hobbyist" developers whose target audience is themselves.
This means no-one is really contributing to what "creative professionals" want.
The Linux desktop works for me perfectly, it's been the year of the Linux desktop since 1996 - but then I'm not a 'creative professional' or an 'accounting professional' etc etc.
[0] https://en.wikipedia.org/wiki/Satisficing
[1] Creative professionals are a more difficult segment than general users because they have high expectations of the platform and specific applications/use-cases.
I think the core of the argument about linux is not about having a well designed system that works well, but that it should be an american technology that dominates, and not anything that is managed by a non american, especially if he is against corporatism and tend to do things in a way that can be perceived as arrogant.
So basically, linux will not prosper as long as it is not in the interest of the US.
[1]: https://github.com/awalGarg/dotfiles