
Along these same lines, I can't even imagine what it must have been like to be a programmer (or any other kind of knowledge worker) before the web. You mean I have this problem and I need to wait 6 months for somebody to write a book about it, and then read the whole thing cover to cover in hopes that he/she wrote about something pertaining to my problem? phfft this is bullshit



It was frequently worse than that. I was a Microsoft developer back then, and the thing to do in that world was to join MSDN (http://en.wikipedia.org/wiki/Microsoft_Developer_Network). That meant agreeing to pay Microsoft several thousand dollars every year, in return for which they would periodically ship you a plastic crate containing a metric buttload of CDs. On those CDs were copies of every product Microsoft had ever made, along with all the documentation for all those products.

Buying one book every six months would have been a lot cheaper.

They'll still sell you that subscription, of course (http://msdn.microsoft.com/en-us/subscriptions/buy.aspx). Nowadays though the documentation portion is all available free on the Web (http://msdn.microsoft.com/library), at least. Progress!


But then again, by reading a book (and products/libraries often came with them, no waiting required), you might learn about the whole thing, instead of begging and praying that someone at stackoverflow ran into the same undocumented quasi-feature and cobbled together a jerry-rigged fix.

Compared to CGIs/PHP? Yeah, sure, finally it almost behaves like a normal GUI library. And the way things are going, we're probably going to get our Taligent/MFC library soon...

I know, the OP (and Louis CK before him) were mostly arguing against the Nirvana fallacy, where people complain about the status quo as opposed to some mythical ideal of perfection. Problem is, I don't see the whole browser stack as superior to prior art that actually existed (i.e. definitely not mythical). Things like NeWS or Smalltalk.


I always thought NeWS was way ahead of its time. The modern web application stack looks a lot like what Gosling did in the late '80s, but with PostScript replaced by JavaScript+HTML.

https://en.wikipedia.org/wiki/NeWS


The good thing about that was that you could basically do everything in PostScript, in what way does that compare to the modern JavaScript/HTML/CSS+client/server mess? Never mind all the apparently necessary auxiliary languages (templates for HTML, preprocessors for CSS, transpilers for JavaScript).

The closest thing would be something that does almost everything in JavaScript and would abstract the HTML/CSS underpinnings away, but that's certainly not the current approach to "proper" web design (ExtJS would come to mind).

Honestly, I'd be hard-pressed to find any prior system with that many layers (VBScript?)...


I liked the part about "To support user interface widgets, NeWS expanded the original PostScript stack-based language into a complete object oriented (OO) programming style with inheritance."


It wasn't as bad as it sounds; there were mailing lists and usenet groups. Before that, software development was much simpler and much closer to the metal. If I didn't understand how some system call worked I'd just drop into the low-level debugger and step through it to see what it did. You can reasonably expect to understand how your whole OS works when the whole thing is only a few megabytes of machine code.

You're right, though, I used to buy volumes of Inside Macintosh and read them, cover to cover, over a cup of tea. I learned C++ from a book, too. I actually miss that style of learning; I like to understand my tools comprehensively, and not to just skim whatever I need from the reference and leave the rest as a mystery. I know that there are lots of programmers who spend their whole careers glomming together bits of other people's libraries, but that just doesn't appeal to me.

It was basically impossible to get any information about CS fundamentals before I got access to the WWW. I guess people in universities must have had access to textbooks, but as a working professional I didn't know enough about what I didn't know to even know there was something to look for, much less have anywhere I could try to find it.


> I actually miss that style of learning; I like to understand my tools comprehensively, and not to just skim whatever I need from the reference and leave the rest as a mystery. I know that there are lots of programmers who spend their whole careers glomming together bits of other people's libraries, but that just doesn't appeal to me.

I think you're presenting a false dichotomy, and expressing inappropriate disdain for people with different learning styles from your own. For one thing, you can gain a deep understanding by using a library, language etc. in a nonlinear fashion. For my part, reading a book about programming from front to back 5 times and working the examples will leave me with long-term retention of somewhere in the neighborhood of 0%. Actually using the language and libraries and digging deeper as I need to gives me a far stronger grasp. I don't know everything there is to know about every feature of Core Graphics under OS X, for example, but three years since I last touched it, I could easily ramble for hours about all manner of crufty real world knowledge that I gleaned from using it.


Sorry about that; I was aiming for wistfulness, not judgement. I really enjoyed that era.


It was nothing like that.

BBSes were linked and people transferred files and knowledge everywhere, much like they do today. FidoNet existed, which had a good-sized community. I bought one of my first PCs on there. This was many years before eBay and Craigslist existed. ASCII text files existed, which contained a wealth of information. One of the few free sources of information on how compilers work is an ASCII text file. People still refer to this source, despite it being nearly two decades old now (http://compilers.iecc.com/crenshaw/).


In a lot of ways it was easier. Your programming environment didn't really change rapidly. You could really learn your tools. Today I feel like by the time I barely get my head around something it's deprecated, or worse doesn't work at all anymore. The tools today are extraordinary, but there's very little incentive to learn them deeply, and high motivation to toss them aside when the next one comes along.


What? You mean, "I have this problem that I can actually apply reasoning skills to come up with a solution to. Later, I can compare notes with others who attacked the same problem in other ways to see what the pros/cons were." As opposed to today where "I have a solution that works, but according to the blogosphere it is no longer in fashion, and so must therefore be terrible and replaced as soon as possible."

Granted, I do not really think it is that bad, but still, the "dark ages" were hardly dark in this sense. I've been reading some of Knuth's "selected papers" series lately. Just reading about how he and others approached problems is amazing.


I lived it. Operating systems and tools were simpler. There were reference manuals and tutorial books. Indeed, you'd better have known your language and libraries from cover to cover. And sometimes debugged in machine language. It's true that it could take months or years for you to learn about new things. I read about the first C++ compiler (with C as a target language) in a magazine. And I lived in a country that imposed a lot of barriers on importing software...


Well, before the Web there was Gopher and Archie and BBSes.

It wasn't the Dark Ages.

Maybe more like the Early Dawn Ages.


There was also an option of sitting down with paper and pencil and actually solving the problem, reinventing the wheel for the thousandth time. Or you could reach out to specific people you knew were likely to know something about the problem and socialize a bit by the way.

Both approaches had some good sides to them and it's a shame that they're not even considered as options now, which GP seems to demonstrate.


Yeah, it occurred to me that way back when, as I was learning C on a C=64 (though later I was able to buy a super powerful 386 machine) and trying to code up alife and neural net programs, my problems were rarely of the "What's the workaround for a bug in alife.js or alife.rb" sort.

I wrote out a lot of pseudo-code and drew diagrams and when stuff didn't work I could often figure it out by poking through the K&R book or the manual for my compiler.

As it became easier to spread code and ideas it also became easier to spread more sophisticated but sometimes opaque libraries and tools, and perhaps ironically it is those new libraries and tools that most require the increased communication because while they offer more power they also introduce more complex bugs and quirks.


Plot twist: the so-called "Dark Ages" were not the Dark Ages either.


I had no problems with pressing F1.




