Hacker News
Why college students who do historical research become analytical thinkers (theamericanscholar.org)
92 points by diodorus on Dec 17, 2014 | 28 comments



History major turned software engineer here. I think history gives its students two key skills:

(1) You have to wade through a lot of complexity and nuance, try to find the signal in the noise, and then marshal some of that complexity to support your argument. After you've speedwalked through thousands of pages of archival documents and other scholars' arguments, it's comparatively easy to be thrown ~50 pages of framework or API documentation and quickly find what you need. (That's also why I'm excited that Common Core puts greater emphasis on reading non-fiction to learn skills, instead of every English class being about reading fiction for pleasure. It builds the same skills.)

(2) You get used to the feeling that you know only a tiny fraction of what there is to know about the world. That intellectual humility leads you to ask good questions, try empathizing with perspectives and ideas that may seem strange at first glance, and only make careful assertions that are supported by strong evidence.

And by no means do I think history is the only academic discipline teaching students these things. I just wish it got more respect!


>> I'm excited that Common Core puts greater emphasis on reading non-fiction to learn skills

I agree.

I wonder: what other kinds of skills should be taught in K-12 that aren't taught today?


Econ and finance major turned software engineer here! I think it's an advantage, especially if you're working in analytics.


I attended a liberal arts college, and enjoyed the courses that I took in most subjects. I got good grades. I'm still interested in many of the topics gathered under the umbrella of "humanities."

But it strikes me that "humanities students become analytical thinkers" is as much of a meme as "humanities students become baristas," at least in the absence of supporting evidence.

It could be true, and to some extent is anecdotally supported by those among my acquaintances who studied in the humanities, but that could be just a matter of survivor bias. We don't hear from the ones who didn't become analytical thinkers, or we associate them with their terminal degrees, such as MBAs.


I'm writing this just to comment on your use of the word "meme"; it's not intended as a criticism* but as the expression of an observation.

In your fourth sentence, it seems that you are using "meme" specifically to denote an idea or statement that is unsupported or unverified. This is not wrong, but I have not seen this particular usage of the word before. It does differ from Richard Dawkins' original meaning: he used "meme" to denote the component in cultural evolution whose role is analogous to that of genes in biological evolution. But I find your use to be an interesting step in the evolution of "meme" and the memes it signifies†.

-----

* However, I do believe the conversation has been framed improperly: e.g., the groups "baristas" and "analytical thinkers" do not have mutually exclusive membership, but instead intersect significantly. There exists many an analyst barista who brews Brazilian whilst brooding about Bruegel at Bruegger's Bagels, plenty of espresso-synthesists with scholastic emphases on existentialists' expressions of Parisian café culture (dissertation: "Bean & Nothin'ess"), and please don't get me started on those poor doctoral students who must lecture on Melville in the morn and then manage a late shift at the mall latte-mill aptly named after Ahab's first mate.

† Indeed, I find the memetics of "meme" to be a remarkably meta matter, and that fact alone is the motivation for my remarks (I must mention, if I may).


Are you familiar with Rap Genius? If not: they use "meme" as a catch-all-word-for-units-of-culture pretty often, too. Interesting phenomenon, for sure.


Criticism accepted. Clearly, you're a lot more philosophically alliterate than me.


Terence McKenna defined a meme as “the smallest unit of an idea that still has coherency,” and he sure created a lot of them.


Beautifully said!


Before entering college, I used to think a philosophy degree was just a way to kill time. I took a philosophy course as a blow-off elective. It was a class on critical thinking. The class was quite challenging and covered topics ranging from James Randi's skepticism to logical fallacies to statistical methods and reasoning. It really made me what I am today: a skeptic, and someone with a career involving statistics.


I sat in on a couple Philosophy of Biology classes on the suggestion of a friend during undergrad. There was an entire series of these courses for chemistry, physics, computation, etc., with a mix of history and philosophy. There's a certain contextualization of a field that the technically oriented study track never seems to cover adequately (or if it does, it's the first paragraph of a chapter that gets quickly skimmed through).


I read a short introduction to the Philosophy of Science, because I figured I must be obviously missing something. The closest the book came to being useful was suggesting that biologists have serious fights over the meaning of the word "life", as if that somehow had any bearing on anything.


The way this article uses the word "humanist" had me quite confused for a few paragraphs. It doesn't match the use of the word that I'm familiar with.[0] I think it's using the word as in "academics involved in the humanities"; hopefully my note will help some other people who had a similar initial reaction.

[0] https://en.wikipedia.org/wiki/Humanism


Oh, interesting. I was also thrown by that usage. Just looked it up and found: "3a. A classical scholar. b. A student of the humanities. 4. Humanist A Renaissance scholar devoted to Humanism." [0] So I guess it's a pretty major use of the word.

[0] http://www.thefreedictionary.com/humanist


"Kolmogorov also started as a nonmathematician--he was studying history. His first paper, written when he was seventeen, was reported at a seminar given by Bakhrushin at Moscow University. Kolmogorov came to some conclusion based on an analysis of medieval tax records in Novgorod. After his talk, Kolmogorov asked Bakhrushin whether he agreed with the conclusions. 'Young man,' the professor said, 'in history, we need at least five proofs for any conclusion.' Next day, Kolmogorov switched to mathematics. The paper was rediscovered in his archive after his death and is now published and approved by historians."

--V.I. Arnold, "An Interview with V.I. Arnold," Notices of the AMS, 44(4).


"In this work, he used mathematical arguments to answer the following question: was it (i) a village that was taxed in the first place, and then the tax as divided between households, or (ii) the other way around, where it was a household that was originally taxed, and then the sum represented the total to be paid by the village?... Because the total received were always an integer number of (changing) monetary units, Kolmogorov proved that it was rule (ii) that was adopted" -Probability and Statistics by Example, Volume 1.

I guess the households that paid their tax either always paid it in full, or if they paid a portion it was always an integer value?

You're definitely more likely to get integer values if everyone is taxed $1,000 rather than $999.30, but it still makes me wonder how they always ended up with integers in the end.

I guess if I have to write a check to pay off a portion of an amount, I'm more likely to pay $400 than $399.90. But maybe it also had something to do with how their monetary system was set up.
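
To make that guess concrete, here's a tiny Python sketch of my own hypothesis (not Kolmogorov's actual argument): if every household owes a whole number of units, the village total is automatically a whole number, whereas a round village assessment split evenly across households rarely gives whole-number shares.

    from fractions import Fraction
    import random

    random.seed(0)

    # Rule (ii): each household is assessed an integer amount; the village
    # total, being a sum of integers, is always an integer.
    households = [random.randint(1, 20) for _ in range(7)]
    print(sum(households))  # a whole number of units, whatever the draws are

    # Rule (i): the village is assessed a round figure first and the tax is
    # then divided evenly; the per-household share is usually fractional.
    share = Fraction(100, 7)
    print(share, share.denominator == 1)  # 100/7 False -- not a whole number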


> 'Young man,' the professor said, 'in history, we need at least five proofs for any conclusion.' Next day, Kolmogorov switched to mathematics.

Makes sense. In math, you only need one (solid) proof. :)

The higher bar actually makes a lot of sense. When you're constructing a picture by interpolating a story around a few points of evidence, overfitting is going to be a hazard, possibly an inevitable one. And you can't even re-run the experiment. You'd have to be as careful as a scientist, more thorough than a lawyer in discovery, and pretty creative.


What else does one do with history, besides just experiencing it like one does a good story?

I didn't understand that when history seemed boring. After all, it had already happened. The excitement is in what could happen.

Once a person actually does start exploring history of any kind, they reach a point where mere experience isn't enough, and as they get there, they realize the value of history is in the data associated with that history.

Another analytical thinker is born.

This isn't limited to college students. Anybody actually interested in a better understanding can and should be looking at history as a means to that end.

And, at a minimum, it can be as simple as actually paying attention to your own life experiences.


The scarcity of data makes history very subtle and interesting. That's something we rarely get in school, since we're fed facts and dates (and only partial reasons). When you learn how a historian finally pieced something together by cross-referencing ancient texts, remains of artwork, geology, etc., it's hard not to be thrilled.


It's not the scarcity of the data that makes history subtle or interesting - after all, most periods and areas of study have a surfeit of data. Even when I was studying 17th century Russia, there were more primary source documents in the (terribly, horribly catalogued and organized) archives than I could possibly read in a lifetime.

The truly subtle bits come from the need to thoroughly understand the Sitz im Leben - as best you can, the surrounding context in which the source was produced and the purpose for which the source was produced. Without that, you just end up with anachronistic presentism.

In my opinion, the need to place oneself as deeply as possible in a completely foreign period and mindset before the interpretation of a new source can even begin is what makes historians excellent analytical thinkers - as well as extremely useful product managers and marketers.


Well, the surrounding context is also part of the data, though I understand it's easy to forget to embed yourself in it when interpreting the data already at hand.


Yes. I've always framed this as a kind of simulation. When we can actualize things in our minds, we then can also process them in ways not possible in a basic sense.


History is an amazing field to do research in, but it's incredibly complex as well. It can be especially enlightening to delve into and imagine the true conditions under which our ancestors operated. A lot of people have a tendency to 'explain' history without much research, and sometimes these views of history become very popular. The advent of the internet has made a lot of published material more easily available to either validate or debunk historical claims.

I hope a more rigorous and scientifically accurate view of history emerges in the future, where children will question the biased views of their parents and discover "the truth" for themselves. Maybe their children will then get a better opportunity to understand history.


It's my hope there is a backlash in the next generation against the pop-history of our generation, like The Oatmeal.


The conclusion: doing research by attacking a problem that matters to the student, identifying and mastering the sources, and posing a big question and answering it in a clear and cogent way, in the company of a trained professional who cares about the problem and the student, makes the student an analytical thinker.


This is possibly a good article; I can't tell, not being up to date on current academic culture around the humanities, which is what the article is about.

I don't think this belongs on Hacker News, though.


True historical research isn't something many undergraduates do, and I can see how it would be exceptionally valuable.

Unfortunately, many undergrad courses simply have you read existing analysis, or at best present you with "primary sources" that are served up to you with no real digging around. It's not that the books you read are of low quality - often, they're really high quality. I remember reading a very elaborate history of the naval arms race between Britain and Germany prior to WWI, and I got a lot out of it. But what I never did was truly dig into the data. I believe it, but did I ever actually go to the documents (without someone spoon-feeding them to me in a way that leads to only one conclusion)? Nope.

Here's the thing - this can actually lead to big errors later in life. One amusing story: I worked on supply chain issues for a big manufacturing company once, and during an interview, I was talking excitedly about various mathematical approaches. The interviewer (a math PhD who eventually did hire me) stopped me and reminded me of how detailed the work really is. He said that one young engineer who worked there had spent six months digging through documents and had discovered the source of an inefficiency. The QA tests for some components were different in Asia and Europe, so the pieces were passing in one location, getting shipped, failing in the other, getting shipped, passing… stuck in this loop. It had nothing to do with math; it had everything to do with actually getting truly ground-level with documents, tests, bits of paper, and factory floors, and figuring out what exactly was actually going on in the world.

On a grimmer note, a huge factor in the banking crisis was that people used mathematical models without actually looking at the reality behind them. Yes, a package of very high quality mortgages has a low probability of collapsing, and if you take that low probability number and stick it into your model, all is well. But if you actually dig into the reality, find the documents, realize many of the documents don't exist, realize that the ones that do are not at all the high quality loans that would typically get a strong rating, and ask why; realize that the package was rated with stronger loans that were then taken out and replaced with bad loans, the strong ones being used to create a new package with a new high rating, over and over; and realize that the company that did this is now taking out insurance policies against the loan packages it prepared, because it knows they have a vastly higher rate of default than the insurers think (based on the faulty rating)…

well, then you're thinking like a history major who actually dealt with true historical research. And the few people who did this were aware of what was coming long before everyone else.


Aha, so that's why Ed Witten won the Fields Medal :)



