Learn just enough Linux to get things done (alexpetralia.com)
353 points by shrikant on Nov 16, 2017 | 133 comments



One example of this is the popular version control system (VCS) called git. Developers could have written this software to work on Windows, but they didn't. They wrote it to work on the command line for Linux because it was the ecosystem which already had all the tools they needed.

I think the real (and even stronger) reason is that Linus Torvalds specifically built Git to serve as the VCS for the Linux codebase.


That, and the fact that Mercurial worked on Windows, makes this line look very poorly researched. Then again, perhaps the author, like the title, learned just enough to write the article.


And git does work on Windows. There's even a windows logo on the official download page. https://www.git-scm.com/downloads


The originally released Git did not run on Windows.

The first Windows-compatible release was msysgit, which was effectively a mingw/msys Linux emulation layer for Windows. Not quite native, but it worked. Kind of. Not really very well at first.

The current Windows download - "Git for Windows" - is still msys-based, just with a lot of compatibility/package-management/ux-polish changes.

All that and the fact that recent Windows includes Windows Subsystem for Linux make the differentiation somewhat moot, but, essentially, Git is a *nix app.


Here's the full quote:

"What resulted was an expansive suite of programs and utilities (collectively, software) that were written in Linux, for Linux - much of which was never ported to Windows. One example of this is the popular version control system (VCS) called git. Developers could have written this software to work on Windows, but they didn't. "

It's just wrong.


Technically it's correct. It's just really odd in tone; it almost sounds like it's unusual or negative that these utilities were not written to be completely cross-platform, as if that would be a norm.


But git is not an example of software that wasn't ported to windows. It was ported to windows.


Yeah, the wording is ambiguous. It could be construed to say Git is an example of a utility that wasn't ported, or just that it was a utility that was written in Linux, for Linux.


Git for Windows relies pretty heavily on MinGW, though. And it's best used from the included bash shell.


I do want to point out that Git being built with MinGW doesn't do anything to affect its compatibility. MinGW is basically just the GCC compiler suite ported to Windows. It doesn't do anything about making C code written for Linux compatible with Windows.

Now typically, when you install MinGW it also installs some tools that let you operate within Windows a little bit more like you're working under Linux, such as by installing a more sane terminal emulator and Windows-compatible ports of some popular programs e.g. grep. But all those tools and such are ports of Linux programs running on Windows, not standard Linux programs running atop a compatibility layer to allow them to work on Windows.

I mention the above because this was an unclear distinction to me originally. Additionally, this was based on my experience with MinGW some five years ago, and I don't know if everything I've described is still true. Hopefully this slight aside clarifies things for someone.


> MinGW is basically just the GCC compiler suite ported to Windows.

That's where you get it wrong. MinGW is a compatibility layer that helps port unix apps to windows. Yeah, it comes with GCC, but it also comes with support for POSIX, including a unix shell and other standard components.

Claiming that MinGW was just GCC would imply that to run Git on windows would be just a matter of recompiling the source code. But it isn't.


Why is git so heavily "posix-y" in a way that e.g. svn is not?


Wasn't git created to support the development of the linux kernel? It was a linux thing first.


>I think the real (and even stronger) reason is that Linus Torvalds specifically built Git to serve as the VCS for the Linux codebase.

Mercurial was built for exactly the same reason (VCS for Linux), at the same time. Yet it has always been cross-platform.

Linus wrote it for Linux because he didn't care about whether it would work in Windows. No other reason.


And who could fault him for that. After all, why would he care or even want to invest time to make it run on Windows? It’s GPL, anyone can port it if they want to.

Git for Windows and Cygwin Git have some issues (I use them daily at work), but many of these can be entirely attributed to the limitations of Windows: CRLF, case-insensitivity in filesystems, problems with long pathnames, slow process forking, etc.


The real reason is that cmd.exe is nothing like bash/zsh/csh/etc. Every OS except Windows can use bash (or an equivalent shell) natively.

There is no advantage to writing another interface to git that is directly compatible with cmd or powershell.


Yes, he has said this in interviews. Great write-up! I too keep a list of helpful commands handy: https://github.com/vinniejames/koopa-troopa - I'm adding a link to yours.


Linus wrote it with the code library interface only in mind. The CLI was a hodgepodge just to access the interface, and I bet he had a bunch of other specialized hacky binaries that he used for his day-to-day tasks.


  This is not a surprise - the Windows ecosystem simply wasn't designed with software development in mind.

Aha? What are C# and Visual Studio, then? Or .NET? Was Linux designed for software development?

This article is so bad, I'm really getting mad over here. Stop publishing such crap; it feeds a lot of people's resentment against all those hipster developers who don't want to learn by RTFM but want everything handled for them.

Articles like this are a real hurdle on the way towards making Linux and programming in general more approachable to non-techies. Just stop, please.


> Was Linux designed for software development?

Linux was designed to emulate Unix and run the pre-existing Unix-like userspace from GNU.

And absolutely yes, Unix was "designed" for software development. That's exactly what it was in its early days: a hacker playground for the geeks at Bell Labs to try new ways of doing things.

And windows wasn't. Early windows development (for years after it became popular) was done at a DOS command line. Eventually Visual C++ became the dominant product, and certainly you can't say it wasn't innovative. But it remained a separate product. "Windows" wasn't for development. And we still suffer from this in, say, the way windows basics (like the standard C runtime!) typically need to be shipped per-app and are only vaguely compatible across OS and toolchain changes.


What?!?

Are we talking about Windows 1.0 and 2.0 here?

I was using Turbo Pascal and Turbo C++, quite nice IDEs, on Windows 3.x.

Windows has always been a popular development environment for the demoscene, and games developers.

The only alternatives being Mac and Amiga, none of them UNIX friendly.


But... the point was that "windows" per se wasn't designed for software development. The fact that you had to buy an IDE from a separate supplier supports that point, it doesn't refute it.

In fact, as pointed out, Microsoft's own development environment (and the one used to create windows itself and most of MS's windows apps) until 1993 was a DOS command line compiler and separate editor.

No one is saying you couldn't develop "on windows" (obviously you could), but that's a rather different point than saying that windows was designed with developers' needs in mind (it wasn't).


Then by that point of view no commercial OS is made for developers, including UNIX ones.

It was the way that commercial UNIX vendors started selling their SDKs, with Sun being the first one, that made people actually contribute to gcc.


Being either a commercial product or open source doesn’t preclude Unix from being designed for development.


Only if by development one means textual command-line applications and daemons, because everything else is also third party.


"Linux" per se is just a kernel. It doesn't exactly have gcc built in.


https://en.wikipedia.org/wiki/GNU/Linux_naming_controversy

Not sure if that adds much to the conversation; it is a very common shorthand to use Linux for GNU + Linux.


Neither DOS nor Windows had any compiler included. DOS did have a BASIC interpreter, later they did away even with that (with Windows 2000 I think). In Linux you had a rich range of compilers, interpreters and other development tools, right in your distro. I'm talking about late nineties here. The two were absolutely incomparable in this aspect. Although, I have to admit, I preferred Borland's documentation to Linux man pages.


Just like most commercial OSes, including proprietary UNIXes.

Other professions buy their tools.

In any case, most MS-DOS applications were written in Assembly and one could use debug for creating COM executables.

GW-Basic + debug was already a big step forward from using monitor applications with hexdumps on ZX Spectrum and C64.


> Early windows development (for years after it became popular) was done at a DOS command line ...

I could just as easily flip this and claim that Linux development is all backwards because it's still done on the command line.

Given Microsoft's commitment to tooling, backwards compatibility and extensive documentation I think it's pretty lame to try and paint windows as not targeted towards developers. The difference is that windows has actual users, as opposed to the eternal horizon of "the year of linux on the desktop".


Effective early Windows development required a surprisingly thorough knowledge of DOS and x86 memory management/pointers that one had to get through books (unless one was poring through Intel chip specs). Even though Petzold's books were/are sold through Microsoft, in my opinion it's a stretch to call them documentation. For a long time, the best debugger was a third-party tool - SoftICE. Their documentation/tools for Xbox were pretty great, though; they had to do well there, being the new kid on the block. Microsoft had the benefit of 90% of PC market share at that time, so we all had to learn it if we wanted a piece of that pie.


Those books were widely published from the start by Microsoft and developers. There were aspects of the API that were not released to the general public, but that didn't stop a developer from knowing about them.

You also had the MSDN subscription, which covered a lot of the development details.


Yes, the MSDN subscription, which seemed like kind of a racket at several hundred dollars (it was up to $1000+ by 2001, depending upon support level/media). Also, they would frequently have documentation about something one year, then drop it from the CDs/DVDs, and if you got lucky you had those or someone was really nice and posted it on the internet. I remember having to figure out the IMEs for Far East/etc. input circa 2000 from a usenet post, then handing that information down to someone else on usenet so they could implement it as well.


The fact that you think a DOS command line c. 1993 and a Unix one (from, say, 1984 even!) are comparable basically ends this argument, you realize, right?


Yup you're right, the extensive argument you provided has completely convinced me of my errors. /s

The point I was making was that they're just different systems and approach things in different ways. Trying to say one is more "developer focused" than the other is a useless metric (and we'd take forever trying to define what "developer focused" even means).

One could make similar arguments about the Alto not having a command line, yet Smalltalk was incredibly powerful and focused on giving a top-tier development environment that still hasn't been matched in some ways to this day.


I dunno.

UNIX and thus Linux are all about pure user power. The shell is great and there are lots of tools available by default like Awk, Perl, Python, Tcl, sed, GCC, Vim, Emacs... etc. The entire OS isn't perfect, but is designed for people who know what they're doing. Compare that with CMD & batch, and Linux is like an SUV that can also fly and shoot lasers while Windows is like a moped. Of course I'm exaggerating a good bit, but even PowerShell has a lot of issues and doesn't allow you to easily do what you can in Linux. In Linux everything is text and it's easy to pipe things around. In Windows it's all objects and frustration. I agree that Smalltalk on the Alto is pretty cool, but Windows is nothing like that besides having some OO stuff. Something I do all the time on Linux is "locate filename" and pipe that to "grep" to be more specific. On Windows, open up the search window and pray you can narrow it down, or wait a very long time.
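A minimal version of that locate-plus-grep workflow (the search term and extension here are just examples) looks like:

    locate nginx | grep '\.conf$'    # every indexed path containing "nginx", narrowed to .conf files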

I use Windows and Linux at work. On Linux I feel like everything is designed to increase my productivity, as are the tools at my disposal. On Windows, I'm always fighting the lack of good tools. Sure, Visual Studio + .NET is pretty good if that is your thing, but I personally don't like having to open an IDE that uses that kind of resources if I essentially need to write a simple script. I've spent quite a bit of time with PowerShell, but it simply isn't as productive as Linux due to the increased complexity. Sorry for the rant, but as I spend a lot of time with both OSes, I have strong feelings on this.


> I could just as easily flip this and claim that Linux development is all backwards because it's still done on the command line

Except that it isn't an equivalent comparison for that point. OTOH if you made the point about GNOME etc. being developed on the command line...



Yeah, software development on Unix was so natural and enjoyable, even today half the time building a Linux library or app is typically spent in byzantine shell scripts trying to determine which particular occult incantation of incompatible or broken Unix you are not running and throwing various test programs at your compiler that last made sense in 1995.

The true essence of delightful and plain right software development must be over in OpenBSD land, where they run the one and only gcc 4.2.1


I tend to fall into using whatever gets the job done. That said, most of the time I've heard from high-level Microsoft programmers that they first started with Unix/Linux and were super productive in that environment. But when it came to shipping product and getting customers to buy it, all the users were on Microsoft/Apple at the time. Wanting to survive and buy `nice` things, they all had to learn Microsoft/Visual Studio to ship product and make monies.

Not wanting to fall into a debate about Microsoft/Linux/Apple, I've tended to find that Microsoft, with all its problems, does have very helpful and useful technical support when you run into a major roadblock while shipping a product. What they do for backwards compatibility is amazing and still something I would like in the Linux world.

Here I go again....

One of my pet peeves with the Linux/Unix world is that, still, after all these years, their asynchronous IO is crap compared to FreeBSD. After all this time it's amazing that it still isn't fixed. The whole container/virtual machine thing is required in the Linux world because `some` essential tool you need no longer works with your specific version of Linux. To make matters worse, I've seen 2-3 virtual machines in enterprise environments that are simply there to run old software that is vital for the business.


To address that last paragraph, I don't believe it is anything special about Linux. I know a business that still keeps a small army of Windows 2000 servers in virtual machines due to software that simply won't run in any newer version (not even Windows 2003, despite being the last NT 5.x, it's just incompatible enough).


We've had bad experiences mixing Office 32-bit vs 64-bit applications. Also, with Microsoft Windows Server 2000, just double-check the license agreement and software licenses; some of them have caveats about running the software within a virtual machine.

I know with Oracle licenses we had to have a dedicated machine, because they charge per core/thread.


>The true essence of delightful and plain right software development must be over in OpenBSD land, where they run the one and only gcc 4.2.1

Clang actually.


That's what it says in the marketing material; then you look inside and there it is, the ancient gcc that the guys with the claim to the most-free license can't update because of its... licensing.

If it was only an exercise in self-flagellation, sure whatever, but no, their tears flow upstream when stuff 99% of the population uses on Linux inevitably breaks with their bastardized gcc.

(This is very obvious if you consider that no, they don't have an LLVM backend for the alpha)


The answer really depends on what kind of software development you're doing. So much of the software development that most recent, modern, web (or ML/data science) focused developers do only works in a Unix environment that it might as well be impossible (even though things have improved greatly) to make software in Windows. As this kind of software development encompasses all work that happens at SV tech startups and most of the work at most big tech companies, most developers will not be exposed to other workflows. Game development, real-time graphics (most PC demos are Windows only), enterprise desktop software development, and I imagine a few other niches, are generally only possible or much better on Windows. For example, good luck building and running a console game on Mac OS X or even Linux, even if the tooling and the OS running on the console is sometimes Unix based (Clang and BSD in the case of PS4, for example).


Linux will never be more approachable to non-techies. It's an operating system designed by and for developers. It's brilliant at that. The push to make it palatable for everyone was always political and centered around its free software license, not about any technical suitability for it.


> Aha? What is C# and Visual Studio? Or .NET?

How many GB of pure crap are you forced to install on any windows system to be able to tweak, let alone develop, any software project developed with C#?

Even you acknowledged that you need to install a full-blown IDE, such as the monstrosity that is MS Visual Studio, to do any software development in Windows.

In unix-like OSs, such as any linux distribution, all you have to do is pop open a basic text editor and compile/run the program.


I think MSBuild was part of all .NET Framework distributions. Even if it was not, you could download the SDK without VS. It's just that people use VS because they like it over a clueless text editor.


And then there's Mac's XCode. A many gigabyte artifact that you have to install for many non-IDE related activities -_-


It's a tiny bit true. Windows was a platform to let Microsoft and others sell software.


Those are not the operating system.


Mentioning strace in an article titled "Learn just enough Linux to get things done" seems so out of place as to be absurd, even in an "Advanced" section. Anyone who needs the rest of this advice is going to be absolutely baffled by strace.

Otherwise, decent little primer.


I've been a linux user for almost eleven years now, and a professional software developer for four, and I wasn't even aware that strace existed until just now.


I assume you haven't done much development in C, C++ or something similar then.

Because if you had, I don't know how you could not know that strace existed. Maybe you wouldn't know how to use it, but I don't see how you could be unaware of its existence if you are familiar at all with linux dev tools.


I've been tinkering on Linux as a hobby for about 2 years now and was made aware of strace a couple of months after starting to dig a bit deeper. You make me hopeful of my future employment.


I’m 100% telling the truth when I say I got a job as a Linux sys admin at a massive tech company on the basis of knowing a little bit of bash scripting. And this wasn’t even that long ago. Learn some devops stuff and you’re golden.


Even 'netstat | head -n20' is out of place for introductory Linux.


Also curious was the inclusion of pushd/popd without explaining what a 'stack' is


Not sure about netstat, but I pipe text to | head and use the -n flag to specify the number of lines all the time.


Agree, although strace is a curious tool in that — by exposing the fact that userspace is "just" syscalls — it's highly educational about UNIX in general.


I'd tend to agree, but I think it really depends on what the person is doing/what the end goal is.


This article should be named "Learn just enough Unix to get things done". Almost everything explained can be done on any Unix-like system like macOS and it's not exclusive to Linux.


Disagree.

It could be mentioned early in the article.

However for its target audience (I assume) that would be far less useful.

("This looks great but seems to be about Unix and I know this is a Linux thing.")


Most of it applies to any posix-ish shell, even on non-unices.

And you can get quite a bit done on Linux without knowing any of it.


In my project, I maintain 3 documents. The first one is a similar list of basic Linux knowledge that people should learn when arriving on the project. The second is "UnixTipsAndPitfalls", where people should add new entries when they encounter useful commands. The third one is about tar.


Over the years I've heard a lot of complaints about tar being hard to use, but in real life I've only ever had to use these two commands: tar zcvf and tar zxvf. It's actually quite easy to remember: tar Zip Compress Verbose File, and tar Zip eXtract Verbose File.
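Spelled out with made-up file names, those two look like:

    tar zcvf backup.tar.gz mydir/    # create a gzipped archive of mydir
    tar zxvf backup.tar.gz           # extract it again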


It's even easier than that! Less memorization, more understanding.

You can think of Tar arguments as either subcommands or options. The dashes are optional

The "subcommands" are x (extract), c (create), t (list), among others.

The "options" are f (read from named file rather than stdin), v (verbose mode), and z (run gzip on the archive before processing).

The fact that these options can be written without leading dashes, and all smashed together in a single argument, is legacy silliness for saving a few keystrokes. Don't confuse yourself with it, and don't confuse other people by teaching it to them, instead of teaching them how to actually use the command (which is really not hard at all).

So when I write Tar commands, I like to think along these lines. Instead of "tar zcvf foo.tar.gz", I write "tar -c -zv -f foo.tar.gz". After doing that about 3 times, it all sank in. Nowadays I'm much more intimidated by Git than Tar.

In the version of tar on the machine I'm posting from (GNU tar 1.27.1), this structure is obvious from right there in the man page:

    SYNOPSIS
       Traditional usage
           tar {A|c|d|r|t|u|x}[GnSkUWOmpsMBiajJzZhPlRvwo] [ARG...]
    
       UNIX-style usage
           tar -A [OPTIONS] ARCHIVE ARCHIVE

           tar -c [-f ARCHIVE] [OPTIONS] [FILE...]

           tar -d [-f ARCHIVE] [OPTIONS] [FILE...]

           tar -t [-f ARCHIVE] [OPTIONS] [MEMBER...]

           tar -r [-f ARCHIVE] [OPTIONS] [FILE...]

           tar -u [-f ARCHIVE] [OPTIONS] [FILE...]

           tar -x [-f ARCHIVE] [OPTIONS] [MEMBER...]
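To make that concrete, a few commands written in that explicit style (the archive and directory names here are placeholders):

    tar -c -z -v -f project.tar.gz project/    # create a gzipped archive of project/
    tar -t -f project.tar.gz                   # list the archive's contents
    tar -x -z -v -f project.tar.gz             # extract it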


Thank you! That was very useful


Also -j for un-bzip2-ing, which comes in useful quite often.


You can just do `tar caf` and `tar xaf`. It detects the archive format automatically (and I personally find it easier to remember: "compress-as-fuck" and "extract-as-fuck").


wow thanks for that :)


Yes. I also suggest using tar instead of the dangerous cp -r: tar c -C src . | tar x -C dst.

It works over ssh connections (ssh -C for compression) and is good training in tar syntax.
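A sketch of the same idea across a network, with an explicit `f -` added for portability (host and paths are placeholders):

    # copy a directory tree to a remote machine, compressing the ssh stream
    tar cf - -C /path/to/src . | ssh -C user@remotehost 'tar xf - -C /path/to/dst'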


It is also quite slow.

For such moves rsync works much better.


Over the years I've heard a lot of complaints about tar being hard to use, but in real life I've only ever had to use these two commands: zip and unzip. It's actually quite easy to remember: zip, and unzip.


add 'tar r' and 'tar u' to your arsenal.


In about 40 years of using Unix I have never used r or u.


Both seem fairly important in the 40-years-ago world, concerning the 't'(ape) part of tar, but okay.

I use them frequently as an "rsync"-ish into the archive file, which I can then use to take point-in-time snapshot/backups.


> The third one is about tar.

Why do people have such a difficult time with tar? The only tar commands I've ever needed (or seen other people need) are `tar cf ...` or `tar xf ...` (occasionally I'll need to chuck a z in there if the file is/needs to be gzipped), are other people just using far more complex tar commands than I am?


Because people assume they should RTM, and if memory serves me right, the manual is a whole lot more confusing than your post :-)


Doesn't look any more complex than most man pages: https://linux.die.net/man/1/tar


Part of my point: most of them are complex and to a big degree unnecessarily complex.


I agree and wish distro maintainers could make some changes. I would love to use manual pages, but instead I search the Internet first. If every manual had a section with most-common examples, it would help immensely. You could just type 'manexamples tar' and it would print out

    tar cvf filename.tar somedir     create an archive from somedir and store it in filename.tar

    tar xvpf filename.tar            extract the contents of filename.tar and scatter them around your current directory

Instead of getting examples that I might use, when I visit a man page, I get a wall of text 926 lines long, describing every possible option, including GNU option styles, or some truncated mess telling me to read the info pages.

Of course, it's appropriate that all this information is included somewhere in the manual. I just think the common use cases should be first. Does anyone need a man page for 'ls' that starts off describing how to list inode numbers and SELinux labels?


It's sort of embarrassing how much less efficient man pages are than a Google search to a SO question.

The justification I usually hear is that SO is for learning to solve one case, while man pages are for learning how the tool works. There's some merit to that, but if a tool has some hidden gotcha or unusual use, SO tends to make that much clearer than the actual documentation.

I suppose comprehensiveness was a higher priority when there wasn't an online discussion of every imaginable case, but it still feels like a lot could be done to convey the same amount of information more gracefully.


I'd love to get a man page like that but all too often they don't list examples near the top like that. If I open the tar man page for cygwin (probably an older one) I get a bunch of stuff about unix style and gnu style usage, then some irrelevant notes (where the full manual is), then a long winded description about how "tar -xvf" is the same as "tar -x -v -f", then the full descriptions of each option.

It sounds like this project moved in the right direction and made the man page more efficient than googling, but many others haven't.


IMHO, the minimum expected from a man page is to be complete. For example, this link does not give enough information to understand why "tar cfz x.tgz x" works, but not "tar -tfz x.tgz"


It says in the man page explicitly that it's not complete and you should look at `info tar` for the full manual. Also, the problem really has nothing to do with tar -- not every man page needs to explain how old/short/long options work.

Section 3.3.3 "Old Option Styles" covers exactly the situation you describe.

> This old way of writing 'tar' options can surprise even experienced users. For example, the two commands:

> tar cfz archive.tar.gz file

> tar -cfz archive.tar.gz file

> are quite different. The first example uses 'archive.tar.gz' as the value for option 'f' and recognizes the option 'z'. The second example, however, uses 'z' as the value for option 'f' -- probably not what was intended.

> Old options are kept for compatibility with old versions of 'tar'.

> This second example could be corrected in many ways, among which the following are equivalent:

> tar -czf archive.tar.gz file

> tar -cf archive.tar.gz -z file

> tar cf archive.tar.gz -z file


I've just never understood why tar can't have sane defaults like zip. Why do you need three or four cryptic arguments to do the most basic use case?

It's like find. I try to remember the stupid syntax and just give up and use locate instead.
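For what it's worth, the two approaches side by side (the file name is made up):

    find / -name 'settings.conf' 2>/dev/null   # walks the filesystem right now: slow, but always current
    locate settings.conf                       # queries the updatedb index: fast, but can be stale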


> I've just never understood why tar can't have sane defaults like zip.

Because tar means "tape archiver", and back when it was created, most of the time you only needed one option: c, t or x (create, list, extract). You didn't need to pass f (file) because you actually piped tar to something that would append to a tape device (or read from it). And back then, compression (z, j, J, a) wasn't commonplace for tapes.


The most complex I have is this one in a deployment script:

    ssh -n -C -x ${DELIVERY_USERNAME}@${DELIVERY_HOSTNAME} tar ch \
        -C ~${DELIVERY_USERNAME}/${VERSION_PATH}/Escape/AIR_Delivery/PWP_Delivery \
        Configuration_Files Executables Libraries Scripts \
        | tar x -C ${install_appli} --transform 's,^Executables,bin,;s,^Configuration_Files,config,;s,^Scripts,config,;s,^Libraries/PWP_Configuration_Application.jar,config/PWP_Configuration_Application.jar,;s,^Libraries,lib,'


You should make github gists of these.


In place of the "tabview" command (which is a non-standard, pip-installed program) a simple "column -t" would be less trouble for most users.
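For example, a quick way to eyeball a CSV file with it (the file name is a placeholder):

    column -s, -t < data.csv | less -S    # align comma-separated fields into columns, scroll without wrapping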


Thanks. I was looking for tabview but it wasn't easy to find, and in python (not that it's bad, but it wasn't a native tool). Rather, column -t is definitely a better choice, since it is a native posix tool.


column(1) is not in POSIX.

It originates from BSD, but it's also part of util-linux, so it should be available on most Linux distros.


I don't think I could disagree more with having `man` at the bottom of the advanced/infrequently-used section. Learning how to read and search through the documentation that comes with the system is a very beneficial thing to learn how to do.


I think I agree with the placement. It's definitely a good skill, I wouldn't dispute that, but it really is advanced/infrequent by the standards set here.

Historically that wasn't true, but Stack Overflow has made it vastly easier to answer simple questions ("Which 'ls' flag shows permissions? How about hidden files?") without ever using a man page. You can go a long time without ever touching that and not have problems, while jumping to man pages without experience tends to be an overwhelming jumble of flags and conditions.

Which isn't to say it's low value! Learning Linux well beyond "good enough" is still an enormously worthwhile project. But I do think it's correct to say it's no longer an entry-level step.


Hi everybody - author here.

Surprised to see this on the front page. I agree with most of the feedback and criticism here.

Hopefully some readers found it helpful regardless.


It's ... pretty OK :-). My points/suggestions...

1. I've never heard of tabview and it's not on MacOS or Fedora.

2. Add ctrl-D to your possible exits. It's actually more common than ctrl-C.

3. Some of those commands are not installed by default on many systems, e.g. htop or dstat. And they require advanced knowledge to properly interpret. But they can be very useful. I'd suggest standard `top` and things like iostat instead.





Learn Just Enough Linux To Make QA, Ops, Security Hate You.

Learning to do a thing and learning to do a thing well enough to create a production quality result that isn't a liability to the company are very different. That said, improving Linux knowledge among those who had little/none before is a net win in the long term IMO.


>Learn Just Enough Linux To Make QA, Ops, Security Hate You.

Hey, that's me!


And not even one mention of man pages... That was my main resource for learning new (and remembering old) commands. Maybe it's out of date now due to Google, but if you're already in the terminal, man is the way to go. Also, I feel like cat and less should have separate explanations, as they have diverging utilities.


There are two mentions of man pages.


<Apologies> I missed them in my initial skim.


I’ve always felt there are two different kinds of Unix/Linux: sysadmin vs user. While Linux distros have been getting better and easier for many years, installing and maintaining Linux on my own machine still requires sysadmin Linux knowledge to this day.

I personally believe this is why Macs have been so popular since OSX, it’s really the only machine out there that comes with Unix but doesn’t require any sysadmin knowledge to use productively.

As soon as Linux truly reaches that same point where users can install and maintain it without needing sysadmin Linux, I think it could take over Mac's position. And I wish it would; OSX seems to be stuck in old-version limbo.


> While Linux distros have been getting better and easier for many years, installing and maintaining Linux on my own machine still requires sysadmin Linux knowledge to this day.

I'm not sure that's true. You can buy a few computers (including from Dell) with Linux pre-installed, and maintenance doesn't take a lot more than clicking on the upgrade button with the daily update-manager prompt.

I _do_ have sysadmin knowledge and find the shell more efficient than the GUI in 99% of everyday-use cases (I can't remember the last time I opened a file browser). So it's entirely possible that there's some day-to-day work required that I'm not even thinking about.

But my dad is running a Linux Mint system and I switched him to it (about ~10 years ago) because it required so much _less_ maintenance than running a Windows machine did. The number of phone calls for tech support I got from him plummeted from twice a week to zero when I switched him from Windows: I did so because I got tired of fielding constant questions about antivirus software, defragmentation, startup programs littering his systray, possible viruses installed, etc etc etc etc. It's really beyond me why anybody thinks that Windows is an appropriate OS for a non-technical user (er, or for a technical user, or really anyone who doesn't need specialty software).


> I'm not sure that's true. You can buy a few computers (including from Dell) with Linux pre-installed, and maintenance doesn't take a lot more than clicking on the upgrade button with the daily update-manager prompt.

Is it really that good these days?

I can definitely appreciate the constancy of Linux, it feels like Windows/Chrome/etc stacks move under my feet constantly. But my experience with Linux is still that when problems do arise, I have to escalate to shell commands and arcane sysadmin-ry for all but the simplest problems. Windows and OS X still offer a lot more hope of an accessible fix via the GUI or even an automated troubleshooting tool.

It's totally possible I'm dealing with the wrong distros, or am resorting to Linux for weirder tasks and therefore seeing harder problems. But I don't have the sense that Linux is more painless, only that it's less chaotic.


> Is it really that good these days?

Honestly, I constantly doubt myself on this because other people keep claiming you need to be some sort of tech-savvy god in order to use Linux: but I've had various Linux systems as my primary and only OS since 2007 and I pretty much haven't had any problems to speak of since like, 2012.

To the extent that I can take myself out of my own head and try to model someone not that tech-savvy, I find Windows _much_ harder to use on a day-to-day basis. This is further validated by the experience of my dad: I went from 1-2 tech support calls a week to maybe once a YEAR.

> But my experience with Linux is still that when problems do arise, I have to escalate to shell commands and arcane sysadmin-ry for all but the simplest problems.

I actually prefer the command line for pretty much everything (I haven't opened a GUI file browser in years), so I'm leaning primarily on the experience of a few friends/family who switched to Linux when they saw how awesome my system in college was (where awesome = not plagued with the ridiculous problems that every Windows user has to rationalize out of existence). I've tried to evangelize the command line to most of them and they remain pretty firmly uninterested. And yet none of them ever really _have_ any problems with their system[1] to speak of: clicking "install now" on the Update Manager pop-up is the only interaction they have to have with their system beyond usage. They usually tend to be on something like Linux Mint, which has all the GUI-heavy config tools pre-installed already. When they feel like dipping a toe in the water of customization, they can do so to the precise extent that they're comfortable and think that the trade-off is worth it (e.g. re-arranging their panel).

AFAICT, all the work in Linux is upfront and pretty quick: make sure that you buy hardware that supports Linux well (an easy Google search) and either buy pre-installed or spend 20 minutes installing it.

[1] These include my best friends and girlfriend, people I've talked to almost literally every day for the last decade.


I'm probably not the best person to comment, as I tend to drop out to the command line for most things and might not recognise when something's hard to fix using GUI tools, but I've worked with colleagues who are not terribly good at CLI linux. They've been using some of the dell laptops and don't appear to have many problems looking after their own issues.


Good point that some people are there already, I agree with that. I’m not though, and I give it a shot about once a year to see what’s new. I’m curious if your dad installs much software? As a developer myself, I should have more sysadmin Linux under my belt, I’m happy to get very technical and nitty gritty with my own work. But I lose patience if my OS wants me to learn super arcane things for stuff that should just work and does on other OSes. Display drivers, networking, installing new drives, this is the kind of stuff that seems to trip me up in Unix.


> I’m curious if your dad installs much software?

Surprisingly, he actually does. This is in large part because this is WAY easier in Linux than anywhere else, because they've been following the "curated marketplace" model (like iOS and Android) for like, a decade (without the restrictions on other modes of install, obviously: Linux systems have an incentive to be user-friendly, but not anti-user the way Apple does). He was in the menu and found the "Software Hub" or whatever Mint calls it, and spends time randomly looking around at stuff. On Windows, he would oscillate between being too paranoid about viruses to go around installing random .exes to being _not paranoid enough and actually installing a virus_ haha.


Great list for starters. But getting real work done requires much more.

Having just walked through some basic steps with a novice, I can concur, modern linux is impossible without "recipes".

Just setting up a hello world web app can involve: ssh login to a running instance, using curl to download the gcloud sdk, tar to unpack, sudo to install, systemctl to start local processes, useradd for a dedicated www user, chmod and chroot to allow access, rsync to transfer files, netcat to manage ports, etc.

For beginners, without a trusted hand to guide them, it's beyond intimidating. These systems must become more accessible to the average human.
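To illustrate (only a rough sketch; the host, URL, user and unit names below are all made up):

    ssh user@my-instance.example.com                    # log in to the running instance
    curl -LO https://example.com/downloads/sdk.tar.gz   # download an SDK tarball (placeholder URL)
    tar xzf sdk.tar.gz                                   # unpack it
    sudo ./sdk/install.sh                                # install it (placeholder installer)
    sudo useradd --system www                            # add a dedicated unprivileged user
    sudo chmod -R o-rwx /var/www/site                    # tighten permissions on the app directory
    sudo systemctl start myapp.service                   # start the service (placeholder unit)

Each line is simple on its own; it's the accumulation of eight unfamiliar tools that overwhelms a newcomer.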


>These systems must become more accessible to the average human.

I disagree. Firstly, all the commands you mentioned are pretty simple to understand given that they have a manpage and lots of help available online.

Secondly, not everything needs to be dumbed down for the average user. A large part of the power comes from all the tools and flags available and the combination possibilities. Dumbed-down tools are limiting and frustrating.

I’d expect any software engineer worth their salt to not be an “average user” and grasp a few simple commands the same way they approach a new project or programming language, i.e. they should be able to pick up the basics even if it’s a complex system. That’s a prerequisite for programming and understanding a project’s code well anyways.


Problem is, as I've already mentioned elsewhere, that the manpages do a pretty poor job in a number of places.

Yes, I do use them, but I guess most Linux users learned most of what they use from colleagues and the Internet, not the official documentation.


I rarely use `man`. Most of the time I find what I'm looking for with --help, and if I don't, I end up doing a web search. Sometimes I end up at the HTML version of a man page, somewhere like linux.die.net, which is much more comfortable to read than the terminal.


"Well" is the keyword here. They don't, far too often "good enough if it doesn't crash the machine on start" is a norm in companies of all sizes when writing applications code and infrastructure code.


Disagree! I'm normally all for easier entry points, but

  ssh login to running instance
  using curl to download the gcloud sdk
  tar to unpack
  sudo to install
  systemctl to start local processes
  useradd dedicated www
  chmod and chroot to allow access
  rsync to transfer files
are absolute basics of system administration.

People who cannot even handle those tasks should not use a root server for their project. There are alternatives to a full system, if running a single app with a database is your use case. Be it Heroku or Lambda, there are solutions that do not need administrator knowledge to run an application.

To add to this, servers that are lousily configured because some developer did not know what platform to use damage the infrastructure as a whole. They are broken into, used for spam or data theft if the application has some user base. If you know someone who does not know how to handle a system, please advise them to either cooperate with an admin or not use a full system for their project.


Someone who doesn't know how to do those things probably shouldn't be administering an important server themselves, agreed. But there's still a lot to be said for putting up a Hello World webpage locally, or making minor tweaks on a system formally run by a sysadmin.

My concern would mostly be that the list above is unacceptable for learning, not for usability. That's ~8 different tools to learn from scratch, totally independent of the underlying project.

And sure, most devs will eventually know all of those tools (or equivalents) without much conscious effort. But there's a lot to be said for scripted introductions to complex systems.


These systems should be handled and operated by dedicated people (administrators/ops/devops/pick your religion) who care enough to know how to use the tools they work with. ;)


>> For beginners, without a trusted hand to guide them, its beyond intimidating. These systems must become more accessible to the average human.

I cringe thinking about what a newbie has to go through (one possible path is sketched after the list):

    Install the right version of php/python etc on your platform
    Install a virtual environment
    Install a package manager
    Install a framework
    Install a database
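A rough sketch of one path through that checklist, using Python on a Debian-style system (the package and framework choices are just illustrative):

    sudo apt install python3 python3-venv python3-pip   # the right interpreter version
    python3 -m venv venv && . venv/bin/activate         # a virtual environment
    pip install flask                                    # package manager pulls in a framework
    sudo apt install postgresql                          # a database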


I think about this every time I have to set up a new environment myself. Getting from "fresh machine" to "clean build" can be a full day of work for very experienced devs if they're joining a long-lived project.

Setting up novices to start their work with an arcane and difficult step is unfortunate even if it is unavoidable.


Well, before Facebook it used to be: set up GeoCities, or set up WordPress/blog software.

It's no wonder MySpace and then Facebook became popular. This was the dumbed down web presence people wanted but didn't want to have to manage.


> For beginners, without a trusted hand to guide them, its beyond intimidating. These systems must become more accessible to the average human.

They are not meant for beginners or average people. We have Windows and macOS for that; once you get to the point where you are thinking about Linux, you shouldn't be so 'green'. Just because we have an OS doesn't mean it's meant for everybody.


You do realize that the mainstream, used-by-everyone OS is actually Linux with a graphical layer now? Yes, I'm talking about Android. The next environment a user can use is a user-friendly desktop such as Ubuntu desktop, which is a bit closer to the real Unix, and then the plain command line. My kids have zero issues with desktop Linux but don't use the command line.


Absolutely, but the complaint of the poster I was replying to was that, essentially, Linux is too hard. And I'm saying it's just not for him, because thousands upon thousands of people have learned to use it effectively and powerfully.


Learn just enough GNU coreutils and bash to get things done.


Learn just enough of literally every modern OS that isn't Windows to get things done.


> Basic tasks, like file parsing, job scheduling, and text search are more involved than running a command-line utility

This seems backwards. Going to the command line for these things is a horrible workflow. It’s not necessary in Linuxes either ...


Another pretentious thought leader spewing misconceptions.


The key is the text stream. Both for computer interaction, and for communication in general. The rest are just names.


This article is about bash and not linux


Something is wrong.

Quitting is ctrl+d.

Ctrl+c is for interrupting the current job.

Some REPLs, like the mysql CLI, do not respect that, and it is a PITA.



