The Bipolar Lisp Programmer (lambdassociates.org)
178 points by shawndumas on March 1, 2011 | 86 comments




Completely tangential:

What is the reason for posting this reply? I think I understand the sentiment -- it gets tiring to see something repeatedly. The reason I ask, though, is that it comes off as a bit dismissive, but I consider re-discussing a post every 1.5 years not too terrible, as: a) some people may have missed the first posting, b) lots of new people can now get exposure, and c) we have all changed a bit in that time and perhaps have new perspectives.

I know that discussion of stuff like this really helps people understand the material better, so I would personally prefer a bit of repetition on the good reads, to keep people up to speed, rather than keeping repeats to a minimum.

I can also see another side -- this points people to previous discussions so that they can benefit from what has already been said. I agree with this motive. I just think it would be better if there were a way to present it that feels a bit less dismissive.

Anyway, I don't have a great solution in mind, more just exploring the idea. Cheers.


Upvoted, because it's a pertinent question.

a. Yes, it's tiring to see the same thing repeated. As a hacker, repetition, and hence wasting time, is something I try to avoid.

b. Yes, it's nice to present classic items to the newer members of the community. That's of value.

c. Without reference to the previous discussions the same points will be made over and over again. See item (a) above.

d. There is value in the previous discussions. It would be a shame to see it wasted.

In short, I don't have a problem with things being reposted occasionally, but I'd like to leverage the existing backlog of discussions. Regarding sounding dismissive:

e. I don't have time to word-craft things endlessly, and sometimes I don't get the nuances right. It takes me almost no time to create the cross-references - the time is consumed in trying not to offend people. Sometimes there's not enough time for both.


> a. Yes, it's tiring to see the same thing repeated. As a hacker, repetition, and hence wasting time, is something I try to avoid.

I concur with this in general, but also suggest that repetition is a cornerstone of learning. Many ideas must be revisited several times before full understanding takes hold. This specific type of repetition is not wasteful repetition. It may not be your job to teach people, but I highly doubt getting in the way of others gaining understanding is your goal either.

> e. I don't have time to word-craft things endlessly, and sometimes I don't get the nuances right. It takes me almost no time to create the cross-references - the time is consumed in trying not to offend people. Sometimes there's not enough time for both.

Fair enough. Seeing as how you are very frequently the person who posts the cross-references (which, btw, I really appreciate), I would like to propose this as a template for future "this is a repeat" posts (I certainly hope others jump in if there is a better way of putting it than my template):

This article is something of a classic here on HN. There have been some good discussions on it previously (see below). The comments in those articles may help provide some perspective for the discussion here.

$LINKS


> This specific type of repetition is not wasteful repetition.

If the repetition doesn't link to previous discussions, it definitely is wasteful.

It's like adding a comment without having at least skimmed through the already existing comments.


I don't know... Your statement assumes the entire concept of discourse is to say things in the minimal number of bytes or words or whatever. This may be true in code or maths, where less verbosity makes the point better, but I am somewhat convinced the point of most discourse is to disseminate understanding.

Since it is people who do the understanding, some repetition, while wasting bytes or words[1], certainly helps maximize reader comprehension. Even math and CS journal papers quote things rather than just putting in pointers. That is wasteful repetition, don't you think?

I look at it as an optimization problem. There are at least two variables -- number of words and reader comprehension[2]. The goal is information transfer between people. To minimize or maximize any one of those variables may result in non-optimal information transfer. Instead there may need to be some repetition rather than pointers for some things.

[1] There are serious questions at this point about whether, at the scale of HN discussions, there is such a thing as limited resources for data storage, bandwidth, etc. (and given the level of intellect here, even reading time is almost trivial for a lot of posts).

[2] There are lots of confounding variables, such as the intelligence and prior knowledge of any given reader, the eloquence of the writer, the complexity of the point being made etc.


> Your statement assumes the entire concept of discourse is to say things in the minimal number of bytes or words or whatever.

No, that's not my assumption at all.

I find it very useful if someone summarizes previous stuff, or reformulates hard-to-understand (e.g. badly worded) previous comments.

However, you should do that consciously. There's no sense in writing a comment that essentially just repeats what others have already written in better words than you ever would. Doing that is a waste of time not only for you (the writer) but also for all the readers. And it happens a lot, simply because people are too lazy to skim through the previous discussion.


Some people gain insight and understanding by trying to explain the material to others. Maybe they think they are saying something different and need to have the sameness pointed out. Perhaps they just think reiterating the point will help others understand by seeing it again.

The above may be wastes of your time or energy or limited word count, but they are not wasteful to others (in fact, the opposite is true -- someone may actually be benefitting!). This is my point: what is wasteful to you may be beneficial to others.

You can downvote it, or wait until stories have been around for a bit so others will have done the sorting for you (and only read the top comments, of course).


> Perhaps they just think reiterating the point will help others understand by seeing it again.

This is of course possible, and I don't object to that as long as they know they are reiterating. However, this is only possible if people at least skim over previous posts, which is usually not the case, and which is why hints to previous discussions are not only helpful but necessary to avoid wasting time.

> This is my point: what is wasteful to you may be beneficial to others.

Repeating something for didactic reasons is sometimes indeed no waste of time for the writer. However, if the writer then publishes it, despite other people having already written that stuff up in a much better way, that person wastes the readers' time by filling up the comments with, let's face it and call it by its name, garbage.


When enough time has passed for a URL to be resubmitted (I think it's a year or two), I think that the previous submission and comments should be resurrected (using a second karma score for ranking), rather than an entirely fresh story being started.

Simple to implement; accumulated wisdom is not lost.


> it comes off as a bit dismissive

The presentation of previous discussions was as neutral, factual, and to-the-point as I can imagine; I don't see any dismissiveness or sentiment other than what I injected into it from my experience dealing with reposts on forums for a decade or so.

I see the RoG duplicate detection as serving the historian function you touched on admirably, without needing further padding for politeness.


I can only speculate at the intentions of the grandparent poster; but it's not necessarily their intention to indicate that you shouldn't repost things. It's helpful to link to old discussions of the same post so that interested HNers (like myself) can go read the comments. I know I'm often more interested in the comments than the article itself.


I am the original poster and I knew that it was a repeat, but, honestly, my intention is to respond to up-votes so as to fine-tune my participation in the HN community to maximize your enjoyment of my presence here.

In short: I am attempting to be Constructive[1] and am using up-votes to guide my attempt.

----

[1]: http://xkcd.com/810/


Yes, I agree. And, since I got a couple of questions about how best to find this: in this example, the google search query "The Bipolar Lisp Programmer site:news.ycombinator.com" returns the previous discussions right at the top. I know most of you know this already, but there are many new faces here, so no harm in posting.


Alternative (and usually more successful) search:

http://searchyc.com/submissions/bipolar+lisp?sort=by_date


Isn't it useful to have all prior discussions linked? Ideally, it wouldn't be a post, but a separate part of the page, shown on the side.


Why are there so many people discussing the way people comment? If anything, this is what's killing Hacker News.

There's only so much discussion, about discussion, a person can reasonably take :P


I wonder if the HN engine has the ability to merge discussions. It would be nice to flag posts as duplicates, but not have the comments and discussions disappear into oblivion.


I'd love to see merging of discussions. RiderOfGiraffes's bot was nice. But the hive mind didn't like it.


Upvoted. If not 'merge' (hard), at least detect the duplicate and automatically add links to the previous discussions and save @RiderOfGiraffes from doing it manually. :)


I have a robot that used to do that, but it provoked an extremely strong negative reaction from a sizeable proportion of the community. I discontinued its use, and after some of the comments, I'm reluctant to bring it back.


Doing it in the comments with a bot.. yeah, I can see why that might've struck a nerve.

But it's the old usenet problem. I don't think FAQs would work here, though.

I was thinking more of a collapsible "see also" section added only to duplicate articles.

Update: Added reference to usenet.


It sounds useful, but I fear that new comments would be buried in the old comments, particularly if old comments retained their points in determining the ranking.


Even if it didn't merge the discussions, having some automatic indicator of 'Related Discussions' à la Reddit would be fantastic.


While I'm quite glad you post the previous discussions, as I wasn't around HN when these were first posted -- do you think "All have comments." is necessary? My initial thought upon seeing this was that you were trying to imply that we shouldn't comment in this thread since the topic has been worn out. Subsequent comments clarified that this is not your intent or the community's wish, but I can't help thinking that taking out the "All have comments" would better show what you're trying to do here, since if the previous discussions don't have comments you don't usually post them (from what I can tell)? Just a suggestion.


And yet other people complain if I put links to previous submissions without saying whether they have discussions or not. I think there is value in the cross-linking even if the other submissions don't have discussion. That makes it necessary to remark on whether there is any or not.

Perhaps I'm trying to make HN something it isn't. Perhaps it would be better to consider it like twitter and simply ignore the past discussion and submissions. If that were the case I, for one, wouldn't hang around any longer, because the "new" stuff feels more and more just like the old stuff re-submitted or re-hashed.

Without back references I feel that people wouldn't be as keen to find genuinely new material. As it is, there seems to be more "news" than "hacker news".


I like that past discussions and submissions are considered -- but I'm not sure that the cross-posting actually discourages the posting of material that isn't genuinely new. It seems to encourage discussion in the new thread, rather than having everyone first read the old stuff and then mention only new thoughts, or things which might be more relevant now, in the new thread.

That something hasn't been previously discussed is, most of the time, more a function of it being new. I'd love to encourage people to post things which are not necessarily new but have never been discussed on HN -- that'd be ideal. Otherwise, it's more likely that only time-sensitive items will be truly "new" in this sense to HN discussions.

What's a good example of "hacker news" vs. merely "news"?


I've become really disillusioned with clinical psychology because of things like this. There are certainly people with specific personality types. In this instance, we have brilliant people who don't like to follow other people's arbitrary rules, and are attracted to new information. There are also certain people who will stay up for a week at a time, binging on cocaine and meth and driving to Mexico without sleeping and then becoming nearly suicidal for a few weeks after that.

The first person has a lisp-like personality. The second person has bipolar disorder. The two sort of resemble one another, but in the lisp programmer's case, they aren't doing anything really harmful to themselves or anyone else. They aren't hurting anyone, they aren't hurting themselves seriously; they're just acting a little quirkily. They might be bipolar-like, but they don't have a disease like the person who goes on cocaine binges and then tries to kill himself does.

The same principle applies with ADHD. Some people literally cannot sit still for a minute, and will disrupt every class they're in because they simply cannot pay attention to the work. This behavior is a problem for themselves and others. However, most people have some trouble concentrating on things they find boring. It's practically tautological.

The problem is, when we start using medical disorders to describe personalities, we conflate the two. Suddenly, everyone has a disease. This isn't a problem in itself, but we tend to try to fix these diseases with medication. If we try to give every lisp programmer some lithium to cure his bipolar behavior, he's going to become substantially less brilliant (lithium taken early on tends to knock off a few IQ points). If we give him adderall to treat his supposed ADHD, he might end up getting addicted to amphetamine.

I think we really need to accept the fact that some brilliant people are going to have problems no matter what, and they are fundamentally part of their personality. I don't think teaching a lisp programmer Python is going to make him write code more consistently and still write code that's just as beautiful. I don't think we can slice the "tortured" part of "tortured genius" off without cutting off some of the "genius" with it.


>"I've become really disillusioned with clinical psychology because of things like this."

I agree with your points about people using diagnostic jargon in conversation in a way that distorts its meaning. I'm sure that happens in other fields also, but speaking as a psychiatrist, one thing I have noticed is that it tends to make people feel like experts on mental health when that conclusion is not justified. There are, however, relatively specific technical definitions for terms like Bipolar Disorder and ADHD. This article doesn't use them, but then it's not being published in a peer-reviewed journal.

Don't let this article disillusion you, because it's not about clinical psychology or by a clinical psychologist. It's by someone with some interesting observations who is mis-applying technical terms.


I wasn't really referring to the article in particular; I was more just ranting about something that I was thinking about before that the article reminded me of.

I think the problem runs a bit deeper than people thinking they're experts when they're not, though. I think when most people try to diagnose themselves, they look online for symptoms and such. If they already have some conviction in their diagnosis, they might think that they should also have other symptoms when they don't, or start noticing borderline things that they otherwise would easily ignore. Then, when they finally go to the doctor, their list of self-reported symptoms is no longer a list of what they independently noticed but a laundry list of whatever they read on webmd.

I'd be interested in hearing if you think this is actually the case, since you definitely have more experience and knowledge in it than I do. I'm basing most of this off of the personal experience of a few of my friends and the modicum of knowledge I have from a few clinical psychology classes I took in college.


>"Then, when they finally go to the doctor, their list of self-reported symptoms is no longer a list of what they independently noticed but a laundry list of whatever they read on webmd."

Sure, it happens all the time, in all areas of medicine. Not just psychiatry. The key here is that when someone says "I'm depressed," or "I have 5 of 9 criteria of a Major Depressive Episode," or "I have gout," that does not mean that the diagnostic evaluation is over. It's important to get clear examples of what people have noticed. Despite their conclusion, is this better explained by cancer, or anxiety, or a thyroid problem, or substance abuse? Like migraine, psychiatric illness is evaluated clinically rather than by imaging or lab tests, but there can be important overlaps that sometimes make these tests helpful to rule out other causes.

A patient's own conclusion is an important piece of the puzzle, but it's still just one piece.


The DSM-IV criteria have never impressed me as particularly specific or technical, especially for milder (more common) diagnoses like hypomania, ADHD, bipolar II, etc.


It's important to check the back of the manual for definitions of all of the terms in the criteria. Reading the criteria, you may be interpreting them through their conversational meanings. It also underscores the need for doing observed interviews during training, to make sure that you are applying the terms correctly. However, you are right in saying there's always room for improvement.


What's lacking from the psychiatric process is any sort of objectivity in the choice of thresholds distinguishing "normal" from "atypical". The criteria for most disorders apply to huge swaths of the population. They're not symptoms of disease, but symptoms of being human. In order to separate the diseased from the healthy, the psychiatrist must then twiddle knobs to find a distinguishing threshold.

"If you're only annoyed at the shape of your nose, well that's normal. You have a somewhat ugly nose. But, this other chap is really preoccupied with his nose. It's impairing his life quality--- he must be suffering from body dysmorphic disorder. Antidepressants might do the trick!"


My wife, who actually is a Clinical Psychologist, has a motto.

"A problem is not a problem, unless it is a problem"

She agrees with you. It is the pop-sci nuts who do not.


Excellent article.

My take is that what kills Lisp is a common allergy towards anything related to so-called release management. Lisp technology encourages it -- it encourages staying within an image and putting all new stuff there. The problem is, this can and often does create an attitude of "it works in my image and I don't care; you should know how to put it together yourself, or maybe you're not skilled enough". Well, it's partly true. But the proponents of such an elitist view tend to forget that today's software is so interdependent, relying on moving parts each of different quality, that we no longer have the luxury of a "works for me" attitude. Even a genius can bang his head against stupid obstacles. It's not intelligent (and it's counter-hackish) to repeat the same mistakes again, or allow others to do so.

We as a civilisation invented cool things like semantic versioning, encapsulation, TDD and BDD. Please, Lispers, do follow this movement. Some people, like the creator of Quicklisp, do tremendous work in the area of release management, but are they a majority?

Sharing is part of creating. Making something shareable counts as well. Not being able to share makes people bitter, because we're social animals after all.


You read that entire article and your takeaway is a theory about why Lisp isn't popular? I got something entirely, completely, and utterly unlike that. I read an article that was about the author's students, but I recognized the same thing in myself, in people who have devoted themselves to playing Bridge full time, and many other corners of society where extremely bright people hang out.

The correlation between the personality and Lisp programming is very interesting, but I have to say that Lisp having a terrible "UI" for sharing--if true--doesn't strike me as being connected to the article.

Maybe you could help me connect the dots between what the author is saying and what you are saying?


"You read that entire article and your takeaway is a theory about why Lisp isn't popular?"

He said 'take' (not 'takeaway'), which I would interpret to mean his viewpoint on Lisp that might be unrelated or even contrary to the article.


Exactly. Please note that I'm not a native English speaker and used the phrase 'my take' according to my understanding of how to use it. My point was more of a digression from the original article. In fact, I sympathize with the author, and I put in my 2 cents on what IMO causes the situation described.


You used it correctly.


Excellent point, thank you. My misreading of that one word biased my interpretation of the comment!


Much of the article read like a description of my life, even though it's only in the past couple of years that I've gotten into Lisp.


This seems really off-base to me. I don't think I've ever once heard a Lisper say "it works in my image". The consensus in a recent discussion on the pro-lisp group is that images are useful primarily for long-running servers, not for development. In any case, standard practice when putting out any CL code other than a snippet is to supply an .asd file with it. That seems to refute what you're saying.

Can you supply concrete instances of this "it works in my image" attitude? It should be easy if it's as common as you say.


In my experience packaging, TDD and images have nothing to do with the problems that Common Lisp had (has?). I tried to use Common Lisp for all my "home" projects between 2005 and 2006 and eventually gave up. My conclusion was that the problem of Common Lisp was a combination of:

a) the standard is outdated. On one hand it lacks many features that we now see as indispensable, because they did not exist when the standard was ratified (unicode, networking), were not commonly part of standard libraries at the time (modern packaging, regexps, C bindings), or were simply forgotten (parsing floating point numbers). On the other hand, many features were specified in byzantine ways for compatibility with systems that do not exist anymore (file system access, character encodings)

b) there were 5 major implementations, 2 of which were commercial with expensive licences.

c) the community was as small as that of any language that isn't C-like, or PHP, Perl, Python, or Ruby.

None of this was a problem in itself but, in one sentence, the community was too small and fragmented to bridge the gap between an old standard and practicality. As a result even doing something as simple as a web scraper supposed to work on both OSX and Linux required major library juggling.

This doesn't happen in other languages, not because of some psychological attraction of bipolar minds to lisp, but because other languages often have a single implementation, or a lot of modern features in the standard library, or both. Or they have enough users that something like boost eventually gains traction.

I really hope quicklisp fixes this but it isn't simple.


"I really hope quicklisp fixes this but it isn't simple."

Clojure fixed (a) and (b) by taking most of the good ideas and repackaging them in a fresh design, and fixed (c) by building on top of Java's virtual machine and libraries. Come on in - the water's nice!

http://learn-clojure.com/


I already know about clojure and I quite like it.


I've also noticed a culture of doing the beautiful and elegant thing instead of duct taping a solution to get something working.


Yes, but worse is better.

http://www.jwz.org/doc/worse-is-better.html

The duct tape solution may only do 75% of the functionality, but it's there, and its simplicity allows for hacks to be built on top of it.


The author of Worse is Better wrote papers attacking and defending WiB, even under a playful pseudonym. A decade after writing it, he still hadn't settled on a side. So citing "Worse is Better" doesn't end the discussion right there.

http://www.dreamsongs.com/WorseIsBetter.html


I can't speak about 2005/6, but as a CL apologist in 2011:

My dad was doing TDD in common lisp in the 80s. Back then he just called it programming.

a.) The major implementations support things like unicode, networking, packaging, threading, and C bindings. There are also 'standard' libraries for things like regexps, cross-implementation threading, stuff like that. The Common Lisp ANSI standard covers areas that would be part of the libraries for most languages.

(Parsing a floating point number is easy once you have regexps, by the way: confirm it is in the right format, then do a read-from-string.)
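For instance, something along these lines works (a minimal sketch, assuming the cl-ppcre regexp library; parse-float is just an illustrative name, not part of the standard):

    ;; Validate first, then read: the regexp guarantees the string is
    ;; nothing but a number, so read-from-string can't hit any
    ;; surprising reader syntax.
    (defun parse-float (string)
      "Return STRING parsed as a double-float, or NIL if it isn't a number."
      (when (cl-ppcre:scan "^[+-]?[0-9]+(\\.[0-9]*)?([eE][+-]?[0-9]+)?$" string)
        (float (read-from-string string) 0d0)))

    ;; (parse-float "2.5e3") => 2500.0d0
    ;; (parse-float "oops")  => NIL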

I don't understand your point about character encoding or file system access. (with-open-file (stream #p"/path/to/directory"... ) ...) seems pretty easy to me. There is a lot of pathname stuff in the spec that could be mastered, but as far as the basics go, you can get by simply with knowledge of a couple of key functions and the file system that you are using.

b.) I don't know of any commercial PHP, Python, Perl, or Ruby implementations. I also don't recall professional C development environments having particularly cheap licenses. (How much for a copy of Visual Studio? I think the cheap version is $800. How about the Intel C and Fortran compiler suites? They're at least as much as LispWorks or Allegro.)

In terms of free implementations, just pick one. I like SBCL myself, as it is fast, mature and has a lot of low level features (A+++++ great, would code with again).

No one is kept up late at night in a quandary over the choice between cython, jython, and unladen swallow! Why should it be so for CL? And languages like python and ruby change in major ways every few years! Do I want ruby 1.8 or ruby 1.9.2, python 2.6 or python 3.0? OH GOD!

And unfortunately the languages that near CL's flexibility are invariably slow.

c.) How does the size of the community really affect what I'm going to make in the language? (Here's a link to a web client btw http://weitz.de/drakma/ Weitzware rocks). There's a plethora of libraries, and now with quicklisp, someone is actually curating them, so you have a good idea of which libraries will work pretty well.
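And if I remember the API right, fetching a page is a one-liner (a minimal sketch; drakma:http-request is the library's main entry point, and the URL is just an example):

    ;; Load the library via quicklisp, then fetch a page.  http-request
    ;; returns the body, with the status code and headers as extra values.
    (ql:quickload :drakma)
    (drakma:http-request "http://weitz.de/drakma/")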

After the initial learning curve, making things in common lisp really is easier than making things in a lot of other languages.

The problem is that there is a fairly large initial learning curve. Even in 2011 it is very different from a lot of languages that people are already comfortable with.

The standard covers a huge range of ways of thinking about writing computer programs. (You can feasibly combine ideas from functional, imperative, OO, and DSL-based programming, all within a few lines of code.)

You can even do C 'bit-bashing' types of programming, if you really want to (and it can be pretty darn fast).

I guess my basic thesis is that Common Lisp doesn't suck.

Learning it (completely) sucks, because it involves learning a large number of the disparate ways things are done with a computer, which is a daunting task.


> The major implementations support things like unicode, networking, packaging, threading, and C bindings.

But every implementation does it differently.

> No one is kept up late at night in a quandary over the choice between cython, jython, and unladen swallow!

And that's because Unladen Swallow is mostly dead, most libraries are written for CPython first, and if you happen to choose Jython you can always pick a library from the vast pool of Java libraries. With Common Lisp it used to be that every library supported a different subset of the implementation x operating-system matrix.

> I don't understand your point about character encoding or file system access.

Read the standard on pathnames. Versioned filesystems are a blast from the past.

> Here's a link to a web client btw http://weitz.de/drakma/ Weitzware rocks

First time I used it I had problems convincing it to use the right character encoding when downloading pages. The second time I tried it wouldn't compile on SBCL on Mac OS X.

> I guess my basic thesis is that Common Lisp doesn't suck.

It doesn't, I never said it did.


> But every implementation does it differently.

So don't use every implementation. Pick one.

> Read the standard on pathnames. Versioned filesystems are a blast from the past.

I have; what is your point? Most languages use string concatenation for file-system access. How is CL a step backwards from that?
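If anything, the standard pathname functions are a step up from gluing strings together (a small sketch; the paths are just examples):

    ;; Build a file path portably with the standard merge-pathnames
    ;; instead of string concatenation.
    (merge-pathnames "config.lisp" #p"/home/user/project/")
    ;; => #P"/home/user/project/config.lisp"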

> It doesn't, I never said it did.

It was an allusion to a fairly well-known post entitled 'Why Common Lisp Sucks', which is basically what Dr. Tarver is describing.


> So don't use every implementation. Pick one.

But then a library you need doesn't work on the implementation you pick on your operating system, or you want to port to another operating system.


This 'works for me' attitude is completely unrelated to Lisp. Ever worked in a Java team where some guy constantly checks in code which works for him? I've seen that. He still wanted to check in untested code.


Heh. A friend of mine contracted at a place where two programmers had a long-running feud. Each would check in code that wouldn't compile against the other's. One would comment out the other guy's incompatible code and check in his changes over top of it; then the other would simply uncomment his own code back in, comment out the first guy's stuff, and check in his latest work that way. They alternated this way for months.


The author is a professor using Lisp, and he falls into some traps. The personality he describes is not special to Lisp (he may simply not know other programming communities), and some of the people he observes may not really have this kind of personality -- he can't really tell, since something like a programming newsgroup only exposes a tiny bit of a personality.

It is true that in the Lisp community there are and were some highly intelligent people, and some of them are not your average guy (think Stallman, Gabriel, ...) -- but those people were extremely productive and creative. Gabriel was the CEO of a development tools company which was quite successful for some time.

I think that people exist who match what he describes, but not exclusively in the Lisp community. I also think that many in the Lisp community have much more complex personalities than what he describes.

comp.lang.lisp has long been a target for weirdos and trolls. That's one part of the problem. There are reasons for that, but I fear there is not a single explanation for the behaviour of various widely different personalities.

I have seen a lot of weirdos in other communities. Mathematics attracts some of them, for example. I remember that guy who proved that PI is finite. Physics is another. I remember that guy who put a lot of effort into a presentation about water and its memory effect. A special, not-that-cheap device could take advantage of that effect. That device was for sale.


    Another feature about this guy is his low threshold of boredom. 
    He'll pick up on a task and work frantically at it, accomplishing
    wonders in a short time and then get bored and drop it before its
    properly finished.  He'll do nothing but strum his guitar and lie around
    in bed for several days after. That's also part of the pattern too;
    periods of frenetic activity followed by periods of melancholia, withdrawal
    and inactivity.   This is a bipolar personality.
Actually, that also sounds like the prompt on a psychology exam, and the answer is "Attention Deficit Disorder".


Wow. That description fits me almost exactly.

Or, rather, it fits what I wish I could feel free to do as life cycles between these phases. What usually happens is I have to keep going to work and slogging through stuff even after I finish a productive frantic stage, and it's really really hard to do that. I get very slow. I feel down/demotivated/depressed during the down phase. Then the next frantic/excited stage hits (randomly as far as I can tell) and I work on the currently interesting feature/framework/whatever with all my waking energy until the phase suddenly fades on me. Rinse, repeat.


I was, straight out of hell, diagnosed as having bipolar disorder, and subsequent pharmacotherapy not only controlled my most salient symptoms, but helped me untangle many, many psychological issues.

After all that process -- two years, maybe -- the shrink and I began to look into some things that still happened, and arrived at the conclusion that I have ADHD too. The comorbidity (epidemiological correlation) is actually not low.

I then went into palliative treatment for the ADHD (it's much less effective than the bipolar treatment, essentially because the drugs keep you "more on the edge" so you feel motivated to do things -- which is just a crutch), so I have a good idea of what both ADHD and bipolar feel like.

Many things in the essay sound like ADHD. But if you remove the first sentence, that paragraph sounds more like bipolar compressed into a too-short time span. Change "days" to "weeks" and you have rapid-cycling bipolar.

Overall, the cluster that he refers to sounds more like the Dynamic Duo (bipolar+ADHD) than either condition alone.


I'm not sure there's enough information in that paragraph for a proper diagnosis... Bipolar and ADD can be confused easily.

Here's a nice grid showing the differences: http://www.revolutionhealth.com/conditions/mental-behavioral...


ADD is very common among people with bipolar disorder. I have both myself, and they play off each other, so to speak.


I've never written any Lisp, but after how much I related to this article I feel like I should be!


I found this article extremely interesting, as I felt that I found my own personality perfectly mirrored in it. To clarify, I'm currently in my final year of high school (doing the Irish Leaving Certificate) - my entire life I have been "acing most of my assignments", indeed doing things at the last minute and doing extremely well with it.

And yes, I am not taking school seriously at all. My punctuality is terrible, I think I can honestly count on one hand the number of days I haven't been late this year (and I live close to the school). I skip classes in which I feel I am wasting my time (religion, anyone?), and have gotten several detentions, etc. because of it. In many ways, my track record in the school (I got a full scholarship for secondary schooling based on an exam in sixth grade, I was the only person in my year to have received an offer from Cambridge etc.), and resulting from that my relationship with the teachers, is the only thing that has kept me afloat in this school.

I have a huge problem with the Irish Leaving Certificate, an exam which in my opinion teaches you exactly two things: How to learn things off by heart, and how to write fast. It has nothing to do with intelligence or skill, but just the number of hours spent memorising pre-written notes, and being able to spit it out onto the page in two hours during the exam. I am utterly bored by it, and so I spend a lot of time in class on my iPhone, checking the news, reading RSS feeds, Hacker News, etc.

This frenetic burst of activity, followed by a period of "[doing] nothing but [strumming] his guitar and [lying] around in bed for several days" completely describes me, except you'd have to replace strumming the guitar with lurking on Hacker News! ;) I have never found anything wrong with that before, and to be honest, this description of being "bipolar" struck me - is there really anything wrong with this sort of behaviour?

This article certainly paints a fairly grim picture for my future university life, if my personality is truly as reflected in the article as I can see it to be at the moment!

Anyway, it was truly enlightening to have found this now, before I'm even at university - I have only now become aware of my attitude. We'll see how things pan out for me, if I'll be "scraping along the bottom" or end up on top...

(This is my first comment here, even though I have been following along for a fairly long time.)


Memorising things is a pretty important skill for a good hacker. I forget (D'oh!) which famous comp-sci researcher it was, but one of them did a study on the "uber-programmers" (the guys who out-produce normal programmers by a factor of 10 or more) and found that all of them had much better memories than average.

The real danger of being too smart in high school is that because you can pull the answer out of your arse any time you want, you don't develop a good work ethic. And in university, not having a work ethic is almost certainly going to catch up with you. In high school I was in a streamed class (i.e. all the bright kids) and almost all of them came off the rails in the first year of uni, when they got to the end of the year and discovered that they weren't allowed to sit the final exams because they hadn't done the 'stupid' assignments.

Naturally, being smarter even than the rest of the smart people, I sailed through first year uni. ... only to come unstuck during the second year. :D

A lot of the kids you look at now and despise for their inferior intellects are going to have an easier time of it at university than you, because they've built up a work ethic, and you haven't.

You don't have to be bipolar for most of the symptoms the article talks about, just smart and lazy. You get a great idea and half complete it? Not necessarily a bipolar thing at all.

Problem is, smart and lazy won't cut it working for 'the man'.

So you might think, well, this is HN, I'll just start my own company and 'the man' can go whistle dixie!

Unfortunately, while 'the man' frowns on laziness, the free market absolutely despises it. It will hunt you down, murder you, and then do unspeakable things to your corpse.


You sound like me in the final year of 6th form (last year of high school in the UK).

I don't really have time to write a longer answer right now, but I'd say - don't worry, your behavior is pretty much a rational response to your environment. Things do get a bit better at university, so it's worth making the effort to get into one with a program you like. (Caveats: one you like might not be one that's highly prestigious; university is better than high school, but not always that much better).


Should be a must-read for hiring managers at software / web shops.


Just a cynical observation. From the article:

"But also it goes with realising that a lot of human activity is really pretty pointless, and when you realise that and internalise it then you become cynical and also a bit sad - because you yourself are caught up in this machine and you have to play along if you want to get on. Teenagers are really good at spotting this kind of phony nonsense."

I think that what teenagers object to is other people's phony nonsense. Or perhaps phony nonsense that is imposed upon them by external forces.

They seem more than happy enough to generate boatloads of their own phony nonsense.


Great article.

While the author might indeed not be quite right with the psychological terms, he definitely has a point.

I read it, and (like many people here) found myself in it, so my next question was: okay, now what should I do with this?

The earlier discussion [1] here on HN has a link to comp.lang.lisp, containing some great advice [2].

Thanks a lot to RiderOfGiraffes for providing links to the earlier discussions.

[1]: http://news.ycombinator.com/item?id=20140

[2]: http://coding.derkeiler.com/Archive/Lisp/comp.lang.lisp/2006...


Makes me want to restart my 5th attempt at a Clojure project.



It was scary how much this article made me think of myself. Probably no coincidence, then, that my favorite class at school was my first CS class, taught in Scheme from SICP, and that my GPA was quite low.


Likewise. When I first read this, I honestly felt a shiver go down my spine--it was eerie. I was basically the physics major equivalent in college, and now love playing around with Lisp/Haskell/etc.


Whoa, same here. Got disillusioned with academic physics in college (though astronomers are cool). Been working through Haskell's Write Yourself a Scheme in 48 Hours tutorial. There must be a few dozen of us.


One more here. I don't think we're extremely common but we're also not alone. I'm coasting in community college while my parents pay for room and board and I try to get some revenue off of projects. I've realised that the only way for me to survive is to create something of my own or find an environment where I have enough intellectual freedom to stay engaged.


Heh. Just getting started with SICP. This reminded me of myself even before college.


Are GPAs in the US based on averaging all of the marks from all of the classes in a course?

If so, I'd have done rather badly rather than getting a First -- the UK system generally puts the emphasis on performance in the final year (or at least it did in ye olden tymes).


At my (US) college, the first semester of freshman-level classes were taken pass-fail, so they didn't contribute to GPA. But that's quite a ways away, and still generally the exception. We will also sometimes report "In-Major GPA," which tends to represent later years more. I would argue the in-major GPA is more relevant than either total average or last-year average (as much as GPAs are relevant at all). This would be especially true at my college--people would finish all their major requirements early so their last semester was a nice relaxing victory lap of mostly electives.


> Are GPAs in the US based on averaging all of the marks from all of the classes in a course?

Generally, yes. There are some ostensibly harder courses you can take that are weighted slightly higher (the highest grade in an honors course may be a 5, while the highest in a normal course is a 4, so it contributes more to your overall GPA).
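To put numbers on it (illustrative figures): an A in an honors class counts as 5.0 and an A in a normal class as 4.0, so a student with one of each averages (5.0 + 4.0) / 2 = 4.5, versus 4.0 for straight A's in normal classes.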

That said, I haven't ever really heard too many complaints about this system. In my experience the level of effort required from year to year is similar, it's the type of effort that changes. For example, in a first-year math course, you might struggle because you just don't know what you're doing. By a fourth-year math course, you may know what you're doing, you just have to do a lot more, and do it better.


> That said, I haven't ever really heard too many complaints about this system.

One obvious complaint is that an engineering student might have 50% of their GPA attributable to non-engineering grades.


What kind of non-engineering classes might there be in an engineering degree? In the UK, courses are (or at least used to be -- some have gone all "modular") rather narrow in focus. In the 4-year CS course I did in the 80s, only one elective class was included that wasn't specifically related to maths, engineering, or CS (and that was a rather enjoyable "History of Science" course).


The specific nature of non-engineering classes varies, but most science/engineering degrees in the US will require about 16 1-semester classes spread across {history, sociology, literature, etc}.

This is reduced a bit if you go to an engineering school. I only had to take 9 non-science classes (+ 4 gym classes).


That's true, and it is a specific complaint I've heard that I did overlook in my statement.

That said, I think it's common to specifically point out your discipline-specific GPA (especially) if it is substantially better than your general GPA, whenever given the opportunity (like on a resume).


GPA calculation varies from school to school, and even from department to department within a school. Classes might be weighted differently across a variety of factors such as difficulty, hours, and course work, thus, a straight average, although common in many scenarios, is not always the hard and fast rule in the US.


I've never seen a US-based system that weights classes later in the curriculum higher than the beginning ones (which is too bad, since like many, mine rose towards the end).


Swarthmore has a system (the Honors program) where the degree granted (graduation, with Honors, with High Honors, or with Highest Honors) is based on the judgement of external examiners on the quality of examinations (written and oral) taken at the end of the senior year.


At my university, all classes were weighted equally on a four-point scale. My GPAs in my majors, CS and Linguistics, were slightly higher, but the heavy systems component of my department's curriculum kept them low.





