Dizzying but invisible depth (2011) (plus.google.com)
71 points by ColinWright on Dec 5, 2013 | 11 comments



Yes, I know this has been posted before[0]. But it was mentioned in the recent and continuing discussion[1] about someone writing their own OS and the small but significant triumphs along the way. The discussion on that previous posting is extensive, but closed, so I thought it would be worth having a reference to it, and allowing people to make further observations.

[0] https://news.ycombinator.com/item?id=3115951

[1] https://news.ycombinator.com/item?id=6852423


You might have put [2011] in the title in that case. But it's still an enjoyable article :)


D*mn - good point, but too late. Sorry - perhaps the mods will add it.


All that complexity, and he completely omitted the entire feedback stack: Browser > OS Kernel > Video card > DVI > Monitor. And that's the simple version, without sub-pixel rendering and a dozen other technologies that are likely involved. For something that seems so simple, typing a character and seeing it on screen, 20+ incredibly complex microchips and at least 4 different high-performance communication links were utilized.

I am not trying to harp on him; the article is excellent. I am just pointing out that there is an entire additional stack of technologies that he chose to gloss over in order to keep the article readable.
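To make that hidden stack a bit more concrete, here is a minimal, purely illustrative sketch (in Python) of the layers a single keypress might traverse on its round trip back to the screen. The stage names are informal labels drawn from the comment above, not real APIs, and the real path has far more steps than this:

    # Toy model only: each stage is an informal label, not a real API.
    FEEDBACK_STACK = [
        "keyboard microcontroller (key matrix scan -> USB HID report)",
        "OS kernel (interrupt -> input event -> focused application)",
        "browser (input event -> text buffer -> layout)",
        "font rasterizer (glyph outline -> pixel/sub-pixel coverage)",
        "video card (composite window contents into a framebuffer)",
        "DVI link (serialize the framebuffer as a pixel stream)",
        "monitor (drive the panel, light the pixels)",
    ]

    def keypress(char):
        """Trace one character through the (grossly simplified) stack."""
        for stage in FEEDBACK_STACK:
            print("%r passes through: %s" % (char, stage))

    keypress("a")

Every one-line stage in that list hides its own stack of firmware, drivers, and protocols, which is exactly the article's point.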


Knocking The Steve may be fashionable, but doing so ignores the huge amount of hardware and software engineering work over the decades that he oversaw at Apple. It might not have rocked the world like inventing C or UNIX, but it's not insignificant.


I agree. After talking this much about complexity, it is rather simplistic of him to praise Ritchie over Jobs in terms of impact.

> On the one hand, I can imagine where the computing world would be without the work that Jobs did and the people he inspired: probably a bit less shiny, a bit more beige, a bit more square. Deep inside, though, our devices would still work the same way and do the same things.

Well, actually, you can't imagine what the computing world would have been like unless you assume technology develops in a purely deterministic manner, in the absence of any market or cultural influences. There is a name for this line of thinking: hindsight bias[0].

Also, given this level of complexity and all these different layers, comparing people's impact is necessarily subjective and rather reductionist, a la "computers would be a bit less shiny, a bit more beige, a bit more square" sans Jobs. That does a disservice to the legacies of both Ritchie and Jobs.

[0] http://en.wikipedia.org/wiki/Hindsight_bias


Not only does the article underestimate Jobs' influence, it also, in my opinion (and I realize this may brand me as a heretic), overstates Ritchie's influence.

It's not as if there were no time-sharing operating systems prior to UNIX, or no OS development concurrent with UNIX, and the same can certainly be said about C (in fact, many people would argue that C had a deleterious effect on computing overall).

A computing world based on e.g. Lisp and ITS, or Smalltalk, or Algol 68 and MULTICS would certainly look different than what we have now, but it's not a given to me that it would be inferior to what we have now.


I had not realized that Dennis Ritchie was dead, so I think the author made his point about which person dominated the popular media, even if both were of similar importance.

Both were important people, but the kind of work they did definitely affected who would be more familiar with them.

Steve probably did as much as anyone to promote *nix, even if it was BSD, FWIW.


Hmm, I understand, to varying degrees, pretty much everything that happens there, at least down to the logic gate level. Once it hits the physics or quantum level I'm lost. It might be because I have an EE background, but it's not impossible to be aware of a sizable chunk of the underlying technologies that you use every day.

What is truly mind-boggling is all the manufacturing processes and logistics that make such technologies a commodity. Maybe this is the point he wanted to make?

> Finally, last but not least, that is why our patent system is broken: technology has done such an amazing job at hiding its complexity that the people regulating and running the patent system are barely even aware of the complexity of what they're regulating and running.

Uhh... the people running the patent system all have technical backgrounds: patent lawyers and examiners must have a technical degree (edit: at least in the US). That does not mean they are technical experts, and in practice they are often far from it, but the theory is sound. Sure, politicians step down from on high once in a while to shake things up as they're lobbied to, but they are not involved in the actual technical nuances.

>... the patent discussions about modern computing systems end up being about screen sizes and icon ordering,

No, those are only the patent discussions that people in this particular bubble hear about. As an experiment, go to patentlyo.com and browse the archives to see the wide and varied range of patent lawsuits that happen all the time: you don't need to know anything about patent law; the introductory paragraph usually gives a good, brief overview of the technologies involved.


Let me guess: you work as some kind of technical adviser at a patent law firm. And you think your work is useful and important.


No, I work as a software engineer, but don't let that get you down from your moral high horse.

But I have also worked for a small company that was ripped off by the big guys, almost went under, and could only fight back with patents. So yes, I think the work patent lawyers do is often useful and important.

Now let me guess: You are annoyed because I pointed out flaws in an article that was confirming your biases.



