You Owe it to Yourself to be Old-School (codelord.net)
184 points by abyx on Feb 22, 2011 | hide | past | favorite | 67 comments



I think a more relevant portion of what Joel was talking about is the 'full stack'/'duct tape' programmer. The number of times I've seen people optimizing their for loops instead of their DB queries or their caching mechanisms in a web app is simply astounding. Few people realize that there is far more 'slow' code in the parts they didn't write than in the parts they did write.

Think about it, in the 'full stack' of browser -> client -> interweb -> web server -> app server -> db server the little bit of code that you wrote is minuscule compared to what you didn't write. Tune up your TCP stack, tune your webserver, add cache headers, add indexes, etc. Most apps can drastically increase their performance without even changing a single line of code. And when you're tuning up your code, look for single lines of code to change.
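To make the "add cache headers" bit concrete: it can be as small as a web server config fragment, with zero application code changed. An illustrative sketch for Apache (assumes mod_headers is enabled; the file extensions are arbitrary):

```apache
# Serve static assets with a far-future Cache-Control header so
# browsers and proxies stop re-requesting them on every page load
<FilesMatch "\.(css|js|png|jpg|gif)$">
    Header set Cache-Control "public, max-age=31536000"
</FilesMatch>
```

Pair it with versioned filenames (app.v42.css) so you can still bust the cache when an asset actually changes.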

Definite protip in there on debugging with Wireshark.

We had a client once where we could log in to their IMAP server but not download mail. A little bit of digging with Wireshark and we figured out that they had some old POS router that couldn't handle window scaling. Solution? Turn off window scaling on one of the servers.
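For anyone curious, that workaround is a one-line kernel tweak. A sketch for a Linux server (the actual server OS in the story is unknown; on FreeBSD the equivalent knob is net.inet.tcp.rfc1323):

```
# /etc/sysctl.conf -- disable TCP window scaling (RFC 1323) so the broken
# router stops choking on scaled windows; this caps throughput on
# high-latency links, so treat it as a workaround, not a fix
net.ipv4.tcp_window_scaling = 0
```

Apply with `sysctl -p` (or `sysctl -w net.ipv4.tcp_window_scaling=0` for a one-off).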


Again, the assumption that everybody is a web programmer. No wonder people approach the command line with obsequious awe!


For the non-web (and sometimes even for the web) programmer, strace can be a real life saver.

An application complains about not being able to find a file, but doesn't report where it looked for it? strace can tell.

Do you want to know why a system call failed, and the app is too crappy to report the error? strace to the rescue!

Of course, if you're lucky you might even have access to dtrace...
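A minimal illustration of the pattern (deliberately pointing a stock utility at a file that doesn't exist; the `%file` class restricts the trace to path-related syscalls):

```shell
# Trace only file-access syscalls and fish out the failing open;
# with a real app you'd grep for the config file's name instead
strace -e trace=%file cat /no/such/file 2>&1 | grep ENOENT
```

The matching line shows the exact path and syscall that failed, which is precisely the information the crappy error message left out.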


This mostly-web programmer finds strace and (more often) ngrep (http://linux.die.net/man/8/ngrep) to be some of the most useful tools out there for debugging errors both strange and mundane.


For me as a user, strace can be a real life saver. I'm not hep enough to the jive to read the source code and figure out what [incredibly uninformative error message regarding file access] means, but I can read strace output and see what file the executable is trying to access, and what it's trying to do to it.


My favorite strace story: trying to get at the underlying causes of some "Unexpected Error" messages in ArcGIS, and it turns out it's writing helpful and useful (albeit two decades old) logging + error messages to the Windows equivalent of /dev/null.

shakes fist @ ESRI HQ once again


Yes, most definitely: strace / truss have been absolute life savers. And if you are looking for something similar on Windows, Process Monitor is excellent. You can also do a lot of DTrace-style stuff with WMI and VBScript.


The Windows equivalents would be the Sysinternals tools, I guess (Filemon, Procmon, etc.).

http://technet.microsoft.com/en-us/sysinternals


A cute goth daughter of a friend once squealed with delight when she saw another friend of mine (a NASA hacker) type 'monkey' into Terminal.app and cause some remote login magic to happen.

Oh, the arcane magic of "alias." Then again, there have been times I've seen young people impressed by what amounts to "Row, your boat" played on an actual mandolin.
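For the uninitiated, the whole trick is one line in a shell rc file. A sketch (the user, host, and command are invented for illustration; echo stands in for the real ssh so nothing actually dials out):

```shell
# In ~/.bashrc: one magic word expands to the full remote-login incantation
shopt -s expand_aliases 2>/dev/null || true  # only needed in non-interactive shells
alias monkey='echo ssh -t dev@build-server.example.com tmux attach'
monkey   # prints the command that would run; drop the echo for the real thing
```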


To be fair, many of us web programmers use the command line extensively. 75% of what I do is web work, and it's all written in vim, often while ssh'd into a server. I've got irssi running in a tmux session 24/7...

Web programming has much more in common with systems work than most appreciate.


"""Think about it, in the 'full stack' of browser -> client -> interweb -> web server -> app server -> db server the little bit of code that you wrote is minuscule compared to what you didn't write.""" This. The old saying "don't prematurely optimize" has never been truer. Too bad it's gotten harder to profile. Unless you write a boring CGI, one-load-per-page site, you're never going to be able to get all of that information server-side. The only way is with browser-based testing frameworks... Sigh.


And even at that you'll need a network nightmare box or equivalent to simulate all the high latency connections out there.


With Dummynet, FreeBSD can simulate any kind of connection you can think of. Latency, packet loss, the whole schmear, available in bootable ISO form.

http://info.iet.unipi.it/~luigi/dummynet/


I've used Dummynet extensively (with great satisfaction!), but I've been looking for a more recently updated alternative (preferably with good Linux support). Is anyone aware of any such thing?


The most recent commit to the dummynet source code was 5 days ago.

Are you sure you need a more recently updated alternative?


Where do you see that? On the page linked in the GP, the most recent revision is dated 2010-03-19.

Do I just not see the link to the repo or is there another project page someplace?


Luigi is a FreeBSD committer and works in the FreeBSD tree. That's where you'll find the most up-to-date dummynet code.


Why is less than a year too long? Is dummynet missing functionality?


Alright, I'm an idiot. For some reason I had it in my head that development had ceased a couple years ago. Thanks for the info.


"netem provides Network Emulation functionality for testing protocols by emulating the properties of wide area networks. The current version emulates variable delay, loss, duplication and re-ordering."

http://www.linuxfoundation.org/collaborate/workgroups/networ...
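For reference, the netem setup described there boils down to a couple of tc invocations (needs root; the interface name eth0 is an assumption):

```shell
# Turn eth0 into a simulated flaky WAN: 100ms delay with 10ms jitter, 1% loss
tc qdisc add dev eth0 root netem delay 100ms 10ms loss 1%
tc qdisc show dev eth0       # verify the qdisc is in place
tc qdisc del dev eth0 root   # back to normal when you're done
```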


Dummynet now supports Linux, check its homepage out.


network nightmare

Thank you so much. I have been trying to remember the name of these devices for ages.


The House example is horrible. In the show, rarely does House diagnose an illness through intellectual debugging. In most episodes House sits around and insults his team while they run tests, and about 5 minutes from the end he has an out-of-nowhere epiphany (while talking to Wilson most of the time) which often has little to do with the tests the team ran, and he solves the case.


OK, maybe in Season 7, but in Season 1?


Yes, absolutely.

If you like House (for the medical mystery aspect rather than the snark), you may also like Berton Roueche's writing. _The Medical Detectives_ is a pretty good collection.


A good article, but intimidating, in a way. And I see this quite a bit in terms of budding programmer advice. It boils down to "you need to read this, and then read that, and then read this," and then the comments section will have even more suggestions for reading.

As I get older, and have kids, I find my ability to sit down and read all of this, and somehow retain it, decreasing. I've recently decided to "get back into" coding (I won't go into the history, as it's tangential to the discussion). I've been having some fun with very, very tiny beginner-level programs in Ruby. When I build a list of jobs I would want that involve Ruby, I find I also need to brush up on my JavaScript and learn jQuery. I need to get a full understanding of Git/GitHub. I should also learn Rails and Sinatra, vim, and get reacquainted with CSS.

On top of that, it really does seem that C is a requirement to really understand what is happening in Ruby. It seems a lot of things that _why_ wrote for Ruby, he wrote in C.

Then there is all the additional readings, such as in this article, and comments section. All of which seem incredibly legit, but leave me feeling like I will never actually find the time to write code due to all the reading I don't have time to do.

Is there an order of importance with all of this?


This article emphasizes the importance of building your baseline knowledge on the aspects that haven't changed since the dark ages (and probably aren't likely to for a while), as opposed to learning the flavor-of-the-month web framework that might be yesterday's news a week from now.

Learning about C and how to use a decent debugger/tracing tool will help you track down exactly where those unhelpful errors are being generated. Learning how to listen in on network traffic will help you find out why the foo you sent is arriving as a bar (or not arriving at all). Learning to use the shell opens up a lot of quick reusable operations and automation possibilities.

The books he listed are old, but certainly valuable. I don't think they're going to suddenly go out of date.

On the other hand, how much more reading will you do when you have to learn a new high level language every month to keep up with the latest trend?

However, I wouldn't let this get in the way of doing what you want to do, which is writing code, I guess. Learn bits and pieces as you need them and I think you will see the benefit. Don't let them weigh you down.


The other side of the coin: I've seen frequent indications (even on HN) that there are hordes of programmers out there who know almost nothing outside of their web app server, HTTP, and their database of choice. This isn't to say such specialists aren't valuable. Specialists are valuable, but if I see a medical specialist, I want them to have a good grounding in general medical knowledge. Likewise, I would prefer all of my computer colleagues to have basic general knowledge as well. Sometimes such knowledge can avert a disaster.


What do you consider general knowledge?


A better way to say it would be background knowledge.

Some idea of:

    - How computer hardware is put together (from the NAND gate up)
    - How computer memory is organized/optimized
      (and how the abstractions involved can break down)
    - How compilers/interpreters/computer languages work
    - How a handful of the most commonly used algorithms work
      (and the implications of their time/space complexity)
    - How networks are organized and implemented
    - What operating systems do, and how that relates to hardware
    - Basic security, how exploits work
The thing I see is that a lot of people learn something about the above, but only within the very narrow confines of their particular web development stack. I suspect they're just reciting a set of best practices by rote, without understanding the principles. They're completely unable to generalize the knowledge, to the point where they say patently incorrect things in comments, even sometimes here on HN. If people can't generalize in a basic way about computer systems, how do they properly think about things like security?

It's like my sister, who is a choreographer and has an exquisite understanding of physics and geometry -- as it applies to dancers. Yet, as far as I can tell, she's completely unable to generalize this to driving and furniture, though she would benefit greatly if she could overcome this limitation. For one thing, when she's driving the family around in a rented car in Italy, she wouldn't end up straddling the center line every time we go around a blind curve. For another thing, her DIY kitchen table made from salvaged parts wouldn't have 3 inches of shear wobble, and her custom desk made from IKEA parts wouldn't have a huge sag in its center. I know she understands the physical ideas behind static mechanics and diagonal bracing applied to dance. I know she has a knack for spatial geometry, because she can have a half dozen dancers move around between each other, keep them from colliding, get everyone where they need to go for the next move, and make the whole thing look beautiful.

I'm also reminded of that Richard Feynman story about teaching physics in Brazil.


Yorkshireman #1: "Bah, C programmers. They don't know how good they have it!"

Yorkshireman #2: "When I were t'lad, we'd have to use assembler by hand!"

Yorkshireman #1: "Assembler? Luxury! We never had an assembler. We had to poke the data in by hand!"

Yorkshireman #3: "Poke! Oh what I wouldn't have given for poke! We used to live int' shoebox int' middle of road, and we'd bootstrap the OS using toggle switches on't front of case."

Yorkshireman #4: "Oooh toggles! What I wouldn't have given for toggles. We used to live int' hole int' ground, and all we had to go on were scraps of hard stale bread, on which we'd punch holes and use as hollerith cards!"

Yorkshireman #2: "Holes? Holes? Bliss, oh what I wouldn't have done for holes!"

*shamelessly plagiarised from http://www.youtube.com/watch?v=-eDaSvRO9xA


I've actually programmed an 8-bit processor in binary code by repeatedly flipping 8 metal toggles and pressing a "commit" button.

A roommate of mine in school actually TA'd a programming class where he saw some frustrated students take their IBM punch card stack out of the reader, shuffle it, then put it back in.


8 bits? You were lucky to have 8-bits. We never had 8-bits. We had to make do with t' abacus, and we never had any beads!


How can you spot and truly understand memory leaks if you've never had to manage memory allocation yourself?

My peers (students) and I have discussed this several times. It's a good example, and we use it to argue for continuing to teach C to new CS students at our school.

(It often seems the CS department is trying to move in the Java + Python, and nothing else, direction)


Another worthy pursuit: buy all of Brian Kernighan's books, not just The C Programming Language. It is well worth the pain of becoming slightly familiar with Fortran and Pascal to read these little gems.

Being familiar with how things work is undoubtedly useful, but I'd argue that if you are an application developer, learning software construction is equally important, if not more so.

See also: Code Complete, Programming Pearls, and C Interfaces and Implementations.


I never knew Kernighan wrote a Pascal book (or a version of a book in Pascal). Neat. Also, AMPL seems quite useful; I've written my own versions of some of that stuff in Mathematica. It'd be great to have it already done for me.


Software Tools in Pascal. There was a previous version with examples written in something called Ratfor, which from what I've read is a hybrid of Fortran and C. http://www.amazon.com/Software-Tools-Pascal-Brian-Kernighan/...


Ratfor, if my memory serves me well, was a set of macros which sort of turned Fortran into a simplified C. Better read the Pascal book, the language (Pascal) would be easier to follow.


Why imply that there's pain in becoming familiar with Fortran or Pascal? It's one thing to argue that they may not be the best tool to build a web app with, but I'd argue there's a joy in learning about alternative languages.


You haven't actually used Fortran, have you? It's okay for demonstrating short snippets of code, but actually writing Fortran programs is pretty awful if you're used to modern languages. It's like the syntax of BASIC combined with the safety of assembly language.


> You haven't actually used Fortran, have you? It's okay for demonstrating short snippets of code, but actually writing Fortran programs is pretty awful if you're used to modern languages. It's like the syntax of BASIC combined with the safety of assembly language.

Please look at the feature set of Fortran 2008 and say that again with a straight face.


Possibly. The last versions I used were Fortran 90 and 95, which were quite frankly an insult to the 40 or so years of programming language design research and practice leading up to it.


The same could be said and has been said for C++ as an evolution of pre-ANSI C. A radical blank-slate language like Fortress is by its nature different from a conservative but still significant extension like Fortran 2008. Both efforts seek to advance the state of the art in the same field but from different directions.


The "F programming language" subset of Fortran 95 is rather nice, except for the bizarre omission of the while loop.


Not in anger, no - but that's my point: there's an intellectual pleasure in learning about it, even if there may be pain when trying to write a full application.


I agree that it is good to be a "generalist" or a "full stack programmer". I try to be one myself.

Be careful when you hear these classifications coming from recruiters. Whenever I have heard "generalist" what was meant was "someone who can magically already do everything within our narrow domain of concerns."


Not that long ago "old-school" meant assembly language. Knowing assembly language makes one a much better C programmer.

And we had to walk uphill to school both ways.


This is the #2 reason I am sorry I didn't grow up during the 60's.

A competent programmer could know the whole stack, down to the micro-controller. (Mainly because there wasn't much stack there..)

* Free love&drugs is #1


Emulation programming is a good way to learn this stuff today. Just pick a platform, emulate it, and you're guaranteed to know the whole stack.


I'm having a lot of fun with this MSP430 microprocessor that cost $4.30, delivered to Australia(!). Great way to teach/refresh some skills in simple assembly or C.

Here's the original HN article where it was mentioned: http://news.ycombinator.com/item?id=2202737


I fully agree, but all it really says is, "learn how things actually work, people!" This is still mostly fluff and identity politics, hoping people who know they're old-school will upvote it. Flagged.


No fluff intended. And no upvote fishing - I don't write for upvotes. I've been bothered by this, and hope to help someone understand that it's important.

Good day!


I think posts like this mostly just encourage people to use the command line to feel "old-school", not because it works.

Kernighan and Pike's The Practice of Programming covers similar material very well.

And yes, I was a bit overly critical. I'm annoyed by that style of article, not that specific one, sorry.


Do you take your cynicism for a walk every morning before or after coffee?


Before - my cynicism gets the water boiling.

Seriously, I wonder if these sort of posts make people less likely to use powerful-but-old tools, by further inflating their reputation in a way that makes them seem intimidating. I much prefer the opposite approach, such as in Jack Crenshaw's "Let's Write a Compiler" (http://compilers.iecc.com/crenshaw/) - You can make things complicated if you want, but the basics aren't that hard, and helping people to realize "you can do this, you don't need to be some kind of programmer demigod" is very empowering.

Also, I recommend reading the source to tools you admire - seeing other peoples' dirty hacks can be a good reminder that shipping something useful beats waiting until it's perfect.


You make a good point here. I was thinking about this last night. It seems like there has been a cultural shift over the last 15 years or so. This is entirely anecdotal but it seems like attitudes have changed.

When I first got into computers/coding, everyone I met was so excited to run into another enthusiast, and sharing and collaboration were pretty much the rule. Even the writing of the day seemed to emphasize two types of programmer: skilled and green.

These days everything I read emphasizes two types of programmer: the ubermensch and the tragically useless, a dichotomy that entirely drops the implication that people in general are trainable, in favor of implying that there are gods among us and that those doomed to merely mortal abilities should do the rest of the industry a favor and take up landscaping professionally.


Yes. I wonder if that split is one of the reasons so much energy goes into re-inventing and re-discovering old ideas. My academic background is in historical research, so I'm particularly conscious of that.

It probably doesn't help that the ACM, etc. keep a lot of important primary documents locked behind their paywall, either.


I'd say it's an entirely human failing to eschew "old" knowledge in favor of all that is shiny and new. This is in no way limited to the programming field. For example this kind of behavior frequently leads to comical nonsense in the machine shop at the local hacker garage (we have a local TechShop).

I've lost count of the number of times I've seen folks gravitate to the shiniest, most miserably complex high tech tools in the shop simply because they're 1. shiny, 2. high tech, 3. complex. There was a guy in last week working on a project he plans to patent. I watched as he spent over an hour setting up a complex system of jigs and programming one of the CNC milling machines in the shop. Seconds after he set the thing in motion, the bit managed to snag his work piece (a small bolt) and snatch it out of the jig, which it then promptly used to murder his jigging system. I could offer a laundry list of minor mistakes this dude made (no cutting fluid, incorrect jig material/setup, CNC programming error), but the one big mistake is the most interesting in my mind: he picked the wrong tool for the job. A dead simple drill press and a machinist's vice would not only have been entirely sufficient for the task at hand; going "low tech" would have had him productively drilling parts in less than five minutes.


I think he's just saying develop better debugging skills, the stuff about the good old days is irrelevant.


My read is that the debugging skill is just one example of the benefits of background knowledge.


Ironic - this newfangled Wireshark seems more usable than the old-school tcpdump that I used to use.


I might be misremembering, but I think wireshark is/was a frontend for tcpdump.


No, it's not. tcpdump and Wireshark both use libpcap, however.


Great article. This is why I use and love C++. It will be used for the next 50 years or longer. Learning it (or any other foundational, open source programming language) is a must.


Baby, I played truant from hacking school.


You also owe it to yourself to learn the newer techniques.

The article is true and useful and contains many good pointers, but my god is it annoying when you either misunderstand the concept or take it too far. This kind of rot sets in when you refuse to use a tool or technique for no good reason.

Maybe this is a function of age or, more precisely, maturity level: Young people use bad tools to be 'hardcore', old people use bad tools because they stop learning new ones. Either way, the image I immediately dredge up when someone says 'old-school' in this context is someone writing a program using a machine code debugger instead of getting a decent text editor and development tools. Unless you can actually tell the difference in the resulting output, and I guarantee you almost certainly can't when it's assembler vs debugger, use the best (most likely, newest) tools you can get.

(The above doesn't apply to people doing things a certain way for fun. Heck, I run obsolete OSes on emulated hardware for fun. I just don't confuse that with reality.)


You got me at the House reference. Good one.



