The author's points are valid. However, there's no recognition of the tremendous marketing, documentation, training, community & ecosystem surrounding the platform.
These things are real, they are as real as any code that's written. They are as real as any hardware.
So yes, the Raspberry Pi is a fantastic introduction to this IoT world. The security problems are real; they should be the first thing attended to.
And then yes, once you get the basics down it should be easy to move to a better platform/device/doohickey.
My job is trying to get ambivalent people to take security seriously, and I'd like to amend your statement. Ubiquity and ease-of-use will win over security as long as security professionals insist on cumbersome practices.
Nonsensical password complexity rules.
The absolutely asinine technology we have to encrypt emails.
Third-party antivirus software.
Patches forcing a reboot (hell, patches needing a reboot).
Encryption being an add-on or an option.
Bundling spyware and adware with brand new machines in order to reduce their sticker price.
Let's Encrypt changed the world by making SSL certs as easy as they could ever be. That's a very positive step. Likewise, no one has to wonder if their iPhone is infected with malware. They just use it, without any security training at all. Developers use PaaS because patching is hard and you never know if it's going to break a production system. Now it's someone else's problem.
What wins security is making it harder to not be secure. Wordpress is still a long way from that ideal.
One of the most annoying habits of computer professionals when talking about security is how we object to every idea by showing how a stupid/lazy end-user could render it useless.
It's not that users will never do that: it's that users can't get into secure habits if we paralyse ourselves into not providing reasonable tools.
> What wins security is making it harder to not be secure.
I find this is usually at odds with your original statement. Current security practices are almost entirely a security/utility trade-off: we make things secure by locking systems down so the user can do less, or has to jump through hoops to do it. The iPhone is the perfect example of this; it's only secure because it's limited in what the user can install or run.
You're right, I worded that wrong. I should have said "What wins security is making it easier to be secure (rather than making it harder to do insecure things)." The iPhone went the first route. iOS is undeniably easy to use right out of the box, and it's also really super secure. But if you want to circumvent all of its security, it's actually pretty difficult. You have to trade a lot of things for the ability to jailbreak your phone.
Windows went the other route. If you want to install unsigned drivers, you have to reboot into a special mode, and next time you reboot you lose that privilege again. If you want to install programs, you have to click yes on the UAC pop-up. In the past, none of those roadblocks existed. Rather than making it easier to have good security, they made it harder to do things insecurely.
Microsoft purposefully put in things to make it harder to be insecure, where Apple (with the benefit of starting from scratch and not having to deal with legacy cruft) was able to make being secure easier. Most people don't have to jump through hoops to do things securely on the iPhone because it was built from the ground up to make sure everything you need to do can be done within the sandbox/walled garden.
Things like SSH keys that require you to upload a public key to every server: that's secure, but all you're doing is making it harder to do things. People are more likely to fall back to passwords, so your only option is to not let them use passwords, which is just making it harder to be insecure. RSA tokens are a middle ground: not really harder than a password, but far more secure. But a fingerprint scanner built right into a button you were going to press anyway? That is making security easier. It's good (enough) security, and end users don't even notice it, let alone have any opinion on it. It just works. It's that easy.
Ah, I don't think it is the same; I think the equivalent would be more like jQuery or even React. Wordpress is a "you don't have to be a programmer" platform, while the Raspberry Pi is more of a "you do have to know your way around" one.
I agree with you, the Pi's community and ecosystem are very good. I also question why I would care about saving a little money on a one-off personal project, but the author's advice to use a $5 device would be good if you wanted to make many smart devices.
Off topic, but when a consulting customer gave me a Pi 3 as a going-away gift when I was working onsite, I set it up when I got home and used it as my work system for 2 or 3 days. A bit slow, but it was fun!
The new Pi Zero W's are $9, and they include WiFi. I used to use Arduinos or MSPs for things because I didn't want to "waste" a full Pi. But I mean, $4 of savings with these NodeMCUs? Versus the platform I already know how to use, a normal computer that happens to have trivially easy-to-use IO pins? For one-off projects I just don't see how it's worth it for me. I'd rather spend the four bucks and skip ahead to the interesting part of the project.
IMHO the better thing to consider with the Pi is that even the Pi Zero W consumes a decent amount of power. So for some projects, that really matters.
"To start with, it runs Linux, which has a steep learning curve associated with it and isn’t suited to beginners."
I couldn't take the article seriously after this. "Beginners" click icons and run applications. This works under Linux like any other OS. Anything more complex than that isn't going to be done by a "beginner" no matter what OS you're using.
This article referenced the ESP8266 (specifically the NodeMCU configuration) as a cheap alternative for WiFi IoT projects that don't warrant the Pi's complexity. I've played with this chip and was pleasantly surprised at how easy it was to get working. The bare chips can be had for a couple of dollars, or $15 for a development board that supports USB flashing.
Don't cheap out on the dev board: the cheapest ones use a shitty USB-serial chip, and it was a headache to get the drivers working. The Adafruit one worked out of the box.
For anyone thinking about trying ESP8266 programming, it's a great way to start. For my one-off projects, I just put in a whole D1 mini and don't worry about trying to optimize down to fewer components (why bother at $4/board?). Just order at least a handful, as shipping is slow.
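If you want a sense of how little code a basic project takes, here's a minimal sketch, assuming a NodeMCU or D1 mini flashed with the MicroPython firmware (one popular option alongside the Lua-based NodeMCU firmware and the Arduino core). The SSID and password are placeholders:

    # Join a WiFi network and blink the on-board LED on an ESP8266 board
    # running MicroPython. SSID/PASSWORD below are placeholders.
    import time
    import machine
    import network

    SSID = "my-network"        # placeholder WiFi name
    PASSWORD = "my-password"   # placeholder WiFi password

    # Bring up the station (client) interface and connect.
    sta = network.WLAN(network.STA_IF)
    sta.active(True)
    sta.connect(SSID, PASSWORD)
    while not sta.isconnected():
        time.sleep(0.1)
    print("connected, IP:", sta.ifconfig()[0])

    # On most ESP8266 dev boards the on-board LED sits on GPIO2, active-low.
    led = machine.Pin(2, machine.Pin.OUT)
    while True:
        led.value(not led.value())  # toggle the LED
        time.sleep(0.5)

Typically you'd flash the firmware with esptool and copy this onto the board as main.py so it runs at every boot.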
The hardware limitations and the "danger of server maintenance" points are valid. But as for this:
>That ever-so-slight delay between hitting a key and having it appear on screen will eventually wear you down.
You don't need to run a full-blown desktop. The TTY is snappy enough on my Model 2B.
It's a great "learn to hack stuff on a cheap small computer" platform for my nephew (if I had a nephew [1]), and that's how it is advertised.
[1] On the other hand, I wonder how many RasPi users buy the thing because they wish they would have had it when they were of the age to be someone's nephew.
I use my Raspberry Pi as an always-on Emacs computer; all on a framebuffer. It's very refreshing to be able to turn on a monitor and be able to write a journal entry or manage a to-do list and then commit and push to a remote git repository. There's nothing to distract you besides M-x tetris.
I once considered buying an AlphaSmart Dana for this purpose, but you can't beat Emacs's keybindings (or vi's, if that's your thing).
Just like any other programmable, internet-connected device.
> A Mini-PC or Tablet Would Probably Be Better
How does this solve the security problem mentioned above? IMHO it actually makes it worse, because now you're connecting an even more powerful machine to the internet.
> Never open your Pi as a public facing server.
What if the purpose of my Pi is to be a public facing server?
> This is true of every website regardless of where it’s hosted, but it’s particularly problematic for the Raspberry Pi, which tends to be set up by hobbyists who aren’t intimately familiar with best security practices.
It is easier to install a web server and misconfigure its security on a typical desktop machine than it is on a Raspberry Pi. It feels like the author has a condescending view of Raspberry Pi hobbyists that is not backed up by any facts.
"Linux enthusiasts perpetually claim that this year is the year that Linux will finally make headway into the desktop for the everyday user — but it never has and never will."
I guess the author has a (much coveted) crystal ball.
Oh stop it. Is anyone seriously expecting Linux to take over the desktop anymore? Ubuntu is embedded in Windows now for cripes sakes! Be happy with the world's servers and cell phones.
The closest thing to a "Year of the Linux Desktop" is the growing popularity of ChromeOS. Nowhere near rivaling Windows, sure, but not entirely obscure either, and certainly not with a "steep learning curve" (come on MakeUseOf; it ain't 2005 anymore, you can cool it with the tired "Linux is hard to use" crap).
The Linux systems that get brought up in these arguments have a lot of non-default helper programs that obfuscate the underlying OS and remove nearly any reason to be using Linux in the first place, other than that it's free and runs on everything.
I mean, yeah Android is nice. And it technically runs Linux. But so does my wifi router... just because I plugged it in and logged into the web interface doesn't mean I can, with a straight face, claim that I am a Linux user.
Does having a laptop that boots straight into a browser without showing anything lower level than that really count as Linux? Does it really make me a Linux user?
Yes it does. What kind of question is that? By the same logic you could claim that no Windows user who doesn't know how to use "obscure" Windows tools is a Windows user. What you are describing is a "power user" or an admin. Fuck, if every desktop ran ChromeOS, would you still claim it wasn't the year of the Linux desktop? Ubuntu tries to hide every "linuxy" aspect and make things user-friendly. So people using Ubuntu without touching any of the command-line tools aren't Linux users?
Choosing to run a system makes you a user of that system. Whether the choice is conscious or not is unimportant. Most people only click on "the internet" or Office, and that doesn't make them any less Windows users. Or do you need to be aware of what you are in order to be it?
The difference is, Windows is Windows. It's a full OS from tip to tail. Linux is a kernel, one that doesn't do a whole lot by itself. If you installed Linux alone, you'd be very disappointed. But even installing Ubuntu is different, IMO, from using Android or ChromeOS or a car infotainment system that may technically be based on Linux but where the end user would never be able to tell.
Likewise if I've used an ATM or a mall kiosk that was based on Windows, I'd hardly call myself a Windows user. I'm not talking about an admin or a power user. I'm talking about someone being aware at a basic level what kind of system they're using. My grandma knows she's using Windows.
The problem we both have is how we define someone as a Linux or Windows "user". In my opinion, what counts for the year of the Linux desktop isn't whether anybody _sees_ themselves as a Linux user, but rather whether someone actually uses Linux.
I think if we are evaluating the market share of an OS, it doesn't matter how well it hides its presence. ATMs are still vulnerable to Windows exploits and Linux PCs to Linux vulns, and it doesn't require someone to identify with something to be it. It's just the difference between calling oneself a * user and being counted/seen as one.
> Does having a laptop that boots straight into a browser without showing anything lower level than that really count as Linux? Does it really make me a Linux user?
Maybe you don't see yourself as one, but you are one. I don't think my grandma knew she was using Windows, and it doesn't matter whether she knew.
In that case Linux won years ago. Nearly everyone has seen a billboard or mall kiosk or in-flight entertainment system running on Linux, nearly everyone has connected to a web page running on a Linux server, many people run Android-powered TVs. Hell, my dog can push the ice dispenser on my Samsung smart fridge, she's a Linux user too.
Debate's over. If you don't have to know you're using Linux, if you don't have to see yourself as a Linux user, if all you have to do is be exposed to a system that incidentally runs Linux in some odd capacity for it to be considered Linux on the desktop, that's it. Linux won.
Bad news for everyone who is anti-Microsoft but visits Stack Overflow which runs on Windows Server... wonder if those people are aware that they're now considered Windows users?
> Does having a laptop that boots straight into a browser without showing anything lower level than that really count as Linux? Does it really make me a Linux user?
That's the attitude that keeps Year of the Linux [whatever] from happening. Just because you haven't recompiled your kernel to get some odd library working doesn't mean you're not a Linux user. The idea of a true Linux user is a fallacy in league with there being no true Scotsmen.
How about the other example I stated about my wifi router? It runs Linux, I plugged it in, and I've connected to its web admin page. Am I now a Linux user? My TV runs Android, am I a Linux user?
I never said you had to compile a kernel to be a Linux user... I'm just not sure if an appliance really counts.
> That's the attitude that keeps Year of the Linux [whatever] from happening.
I don't think it's the attitude stopping it. I think it's that ChromeOS has a 0.82% market share and other Linux distros have a 1.66% market share, while Windows and Mac make up 95%.
Define "default". From the perspective of a user buying a Chromebook, the "helper programs" are very much the default.
Additionally, "helper programs that obfuscate the underlying OS" is pretty much the whole point of a desktop environment. I fail to see what point you're trying to make there.
"I mean, yeah Android is nice. And it technically runs Linux."
Which is really all that matters. We're talking about the Year of the Linux Desktop, not the Year of the KDE Desktop or the Year of the GNU Desktop.
Not that Android is (usually) a desktop OS, but whatever.
"But so does my wifi router... just because I plugged it in and logged into the web interface doesn't mean I can, with a straight face, claim that I am a Linux user."
Sure you can. You're not a Linux desktop user, but you still use Linux in some capacity rather than, say, VxWorks or IOS.
"Does having a laptop that boots straight into a browser without showing anything lower level than that really count as Linux? Does it really make me a Linux user?"
I think it does. It'd be no different from being a Windows user who only uses a web browser.
You don't have to stutter for your point to be logically inconsistent to the point of incomprehensibility. Lorem ipsum is nonsensical faux-Latin regardless of any speech impediments.
If ChromeOS or Android don't count as "real" Linux desktops because (aside from Android not usually being used for desktops) they abstract away the "Linux" part, then by your exact same logic, any Linux distro that ships with a full-fledged desktop environment is not a "real" Linux desktop, either, since they by design abstract away the Linux part. In fact, literally no actual operating system could possibly be a Linux desktop unless we want to go ahead and turn all user software into kernel modules, since literally anything in userspace is by definition an abstraction on top of Linux.
Linux is one component in a fully-featured operating system. Exactly which userland happens to be running on top of it does not change one's status as a Linux user.
Meanwhile, pretty much every desktop environment for Linux (except maybe recent versions of GNOME) is not exclusive to Linux, so by the logic you've presented, there's literally no such thing as a "real" Linux desktop.
>[maybe] there's no such thing as a "real" Linux desktop.
That may be the simplest answer, but when people talk about "the year of Linux on the desktop" they're talking about something in particular, right? Or does any Linux system count? Android smart TVs? Routers? Mall kiosks? Car infotainment systems? Bluray players where the only interface is a Java program? Or are we only counting systems where we can drop to a shell? In which case, do we only count Android phones that let you install bash, or only ones that can be rooted, or do we count them all?
It's not a logical inconsistency, and it's not incomprehensible. It's a simple question: at what point do you differentiate "it runs on Linux" from "it's Linux on the desktop"? Because if my router counts, Linux won a long time ago.
"when people talk about 'the year of Linux on the desktop' they're talking about something in particular, right?"
Yes: they're talking about a computer running a desktop environment on Linux on a desktop or laptop computer. ChromeOS meets all those criteria. Android would meet all those criteria should it ever be widely deployed on laptops or desktops.
So:
"It's a simple question: at what point do you differentiate 'it runs on Linux' from 'it's Linux on the desktop'?"
And it's a simple answer:
1: It's a desktop or laptop computer.
2: It's running Linux.
3: It's running a graphical desktop environment.
If all three of those things are true, then it's Linux on the desktop. All three of those things are true for Ubuntu, so it's Linux on the desktop. All three of those things are true for ChromeOS, so it's Linux on the desktop.
Whether the operating system allows lower-level access (like a command-line interface) is irrelevant.
"Because if my router counts, Linux won a long time ago."
Your router probably doesn't meet those criteria, since it's probably not a desktop computer and probably not running a desktop environment.
"Android smart TVs?"
Fails criterion 1.
"Routers?"
Fails criteria 1 and 3.
"Mall kiosks?"
Probably fails criterion 3, and almost certainly fails criterion 1.
"Car infotainment systems?"
Fails criterion 1.
"Bluray players where the only interface is a Java program?"
> they're talking about a computer running a desktop environment on Linux on a desktop or laptop computer
That's literally what I was asking. Thank you for replying with your opinion on the subject, it's ridiculous that it's taken this long for someone to say what they actually think instead of just arguing about compiling kernels and a lack of Scottish people.
In my opinion Android is not Linux on the desktop. In my opinion ChromeOS is not Linux on the desktop. I understand there are other opinions, but when I think of Linux on the desktop, I think of Ubuntu, Debian, Slackware, Red Hat, etc. Even the Wikipedia page about Linux on the desktop says "The term Linux adoption often overlooks operating systems or other uses such as in Chrome OS that also use the Linux kernel (but have almost nothing else in common, not even the name – Linux – usually applied)".
So there is definitely room for debate. But apparently not on this forum.
"In my opinion Android is not Linux on the desktop."
Agreed, since it does not (usually) run on a desktop or laptop computer.
"In my opinion ChromeOS is not Linux on the desktop."
And why not? That's the point that I've been trying to coax out with no success. What makes it anything but Linux on the desktop?
Is it the fact that it doesn't provide low-level access to the system by default? If so, then why is that a requirement for a desktop system? It seems out of line with how the vast majority of desktop computer users actually use their computers. It would also exclude Windows and macOS from being desktop operating systems, since they're (IMO) just as hard to work with at a low level as ChromeOS, if not harder (but not as hard as Android or iOS).
Is it the fact that everything's web based? If so - again - why does that matter? Who cares what executable format is used or what programming language is used for user-facing software? (Of course, this "fact" ain't entirely true: https://developer.chrome.com/native-client/overview)
Is it the fact that the Linux name is not used? If so, then by that logic Ubuntu is not a "real" desktop Linux system, since https://www.ubuntu.com/desktop doesn't mention Linux anywhere (except for the developer subpage, where it's mentioned as a development feature).
The Wikipedia quote explains the perception's existence. It doesn't explain why that perception of "ChromeOS ain't a real Linux on the desktop" exists or by what criteria it's not a "real" Linux desktop. The fact that it uses the Linux kernel is the only thing that matters for the phrase "desktop Linux", since - again - we're talking about desktop Linux, not desktop GNU or desktop KDE or desktop GNOME or desktop whatever other software.
To be fair, ChromeOS has a proper shell[1], you can ssh from it, etc. (of course you may be able to shell into your router, too, hopefully not via telnet...).
If you want to do anything "linuxey" with a Chromebook, don't you need to use something like Crouton and install a whole different distribution anyways?
Is it even possible, or simple, to install ChromeOS on a Pi? Most projects related to ChromeOS on the Pi seem to be dead. There's FlintOS, which had its last update last month, but in the release notes:
> We have disabled the root password briefly for improved security, subsequent releases will have a better system to maintain security and allow users root access if activated.
There's also ChromiumRPI, which hasn't been updated in a year and does not and will not support WiFi.
Also as far as I can tell, none of them can play Netflix unless you're on a real Chromebook. I'd say it's fairly "hard to use".
Thanks for the info! (I haven't touched ChromeOS in a long time). Still kinda supports my point that ChromeOS does have a steep learning curve if you want to do anything outside of web browsing.
To your implied question about Linux on a Chromebook: you can skip all that Crouton stuff if you flash a special BIOS. I've got Ubuntu Server with xmonad on mine as a daily driver, and it's been a smooth experience.
I'm afraid I'm not able to discern your meaning from that little snippet, but...
I am willing to go on record now as saying "never will". It hasn't happened yet and the desktop is already dying. The mobile space is the juggernaut now and through Android, Linux is the biggest player.
Using an RPi for everything is indeed not a good idea! It's a poor choice for, e.g., an HPC server or a 10G router.
Not using RPi for anything is not a good idea either, it seems.
The RPi (and C.H.I.P., Arduino, etc.) have a rather well-defined sweet spot: small-scale hobby projects and study. In this niche, they are good value for the money (because of the tools, books, community, etc.).
#1 is an interesting point. Use a $5 NodeMCU instead of a $9 Pi. Not only do you save $4, but you get a free introduction to the world of horrible SSL implementations!
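To make that jab concrete, here's a hedged sketch, assuming the stock MicroPython firmware on an ESP8266: urequests will happily fetch an HTTPS URL, but the port's TLS layer has historically done no certificate verification by default, so the connection is encrypted yet unauthenticated.

    # Assumption: a MicroPython ESP8266 already connected to WiFi.
    # The request below succeeds, but the underlying TLS code does not
    # check the server certificate by default, so a man-in-the-middle
    # could impersonate the server.
    import urequests  # MicroPython's minimal HTTP(S) client

    r = urequests.get("https://example.com/")  # placeholder URL
    print(r.status_code)
    r.close()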