> News is fake. Science is fake. Schools are barriers. Everything is subjective, objective reality is nonexistent.
How do we have productive disagreements going forward?
Funny you describe it that way. I'd argue that young people in STEM fields, including IS/CIS/CompSci undergrad programs, think everything can be objective when that clearly is not the case.
You don't need to go to college to press buttons, fill out spreadsheets, or input code until you get the output you seek. You need to go to college to make the subjective decisions, which don't have a clear right/wrong answer.
The humanities are glossed over at best in public high schools across the United States. I don't know of one that requires a PHIL intro course of students.
College is not vocational training (unless you're a law or medicine student), it's for learning how to think.
If you study the International Baccalaureate (IB), a high school curriculum taught around the world that is based on the French system, you are required to study Theory of Knowledge, effectively an introduction to philosophy.
> The humanities are glossed over at best in public high schools across the United States. I don't know of one that requires a PHIL intro course of students.
I'm not sure how much weight this argument holds. The whole "gen ed" thing is a rather US-centric concept.
I don't know of any universities in the UK that require a PHIL intro course of students. When you go to university, you overwhelmingly study the one course ("major") that you picked beforehand. There's often a small amount of room on many courses for optionals from other fields, if you want to take them, but this is by no means mandatory, and I'd say the proportion of folks at my alma mater studying a different degree who took philosophy modules was slim.
No wonder the UK's app startups are even more ridiculous than the US's app startups ;).
Yes, gen-ed is ubiquitous in the US. If you're in any humanities-related program in the state I live in (Texas, so that's probably 25-30 large universities total), you'll have to take an intro PHIL course at least, which will probably be Plato and a random survey of 19th-century European readings.
At the state-owned college I attended, more is highly recommended for students aiming for law school admission after their undergraduate degree.
In theory, it would be good to have a part of our educational system which teaches people how to think. But what does that look like? I'd say that boils down to two things: logic and evidence.
The vast majority of philosophy classes are absolutely garbage at teaching either of those things. Sure, in theory, logic is part of philosophy, but in any of the philosophy classes I've taken, we didn't talk about logic. The things we did talk about were often examples of how not to think, yet they were presented as equally valid next to much more rational ideas.
For example, one of the things we covered in my ethics class was Kant's categorical imperative. The categorical imperative falls over instantly if you apply even the most basic logic to it, but no mention of this was made in any of the course discussion or materials. I'm sure lots of people walked out of that class thinking that the categorical imperative was a perfectly reasonable way to make ethical decisions. If this is the sort of "learning how to think" philosophy classes are doing, then I'd prefer we didn't--I'd rather let people figure out how to think on their own than to teach them unequivocally incorrect ways of thinking. Philosophy could be useful if these classes were taught as, "Here's a bunch of historical ideas, and here's how we apply logic to prove them wrong." But until that happens, I'd strongly oppose introducing any more philosophy to curricula.
Other fields are better-equipped to teach people logic and evidence. Science is all about evidence collection, and logically applying the collected evidence to the evaluation of hypotheses. Math, especially around proofs and derivations, is all about logic, and probability and statistics give you tools that are very broadly applicable. History, if taught well, teaches you how to logically analyze artifactual evidence and logically contextualize the present in terms of the past.
But, there are two problems: first, many college students don't focus much on these areas. And second, the parts of these fields which I mentioned aren't particularly well taught even by these fields. Many students get A's in science classes thinking that science is memorizing a bunch of facts about chemicals or living things, without ever having learned how to obtain new facts themselves. Many students get A's in math classes having memorized a bunch of formulas without being able to derive even basic proofs. Many students get A's in history classes having memorized a bunch of historical events, without knowing the difference between primary and secondary sources, and without ever considering that an author might have bias. Even the classes which do teach people how to think, to some extent, generally do a piss-poor job of it.
That's not to say that these fields (and other fields not mentioned) have no value. Even if you think well, your thinking is only as useful as the evidence you feed into it, and colleges do a very good job at moving vast amounts of evidence on a variety of subjects into people's brains. Further, colleges often do a lot of work getting people skills: lab techniques, using computers, effective communication, etc. You can argue that the purpose of college is learning how to think, but the implementation of college is much better at teaching people information and skills. Learning how to think would certainly be valuable, but de facto it's not what colleges are doing, and the things colleges are doing do have some value.
That said, modern colleges often put teaching of any kind behind profits, and that's not something I see any value in for society or students.
I agree completely on your critique of profit motive at universities, and think it particularly applies to state-owned institutions. There is a false notion that profit is an automatic good, when that is clearly not the case.
There is more to critical thinking than formal logic, I'd argue.
The classroom format of a typical humanities college course has a lot to do with this. For example, I would argue that a person without any background in the classics and some basic theology is going to struggle to get much from Paradise Lost. It's dense, difficult, and you have to know some things going into it to pick up on the nuances. But 30 people discussing it collaboratively 2 or 3 times per week, with the expert professor's guidance when they stumble on certain parts, makes for a lot of interesting discussion. Thirty different people will pick up on thirty different bits and pieces in every classroom session.
I'd guess that a big part of the reason there is such a glut of humanities graduates who can't find professorships is that people simply enjoy the classes enough to keep going all the way through graduate degrees. You get discussion and debate in those classes that you can't find anywhere else.
I don't think the above is true of many other disciplines of study, with so many degrees being pitched purely as job training out of profit motive, as you mentioned above.
I can't do a better job of describing this than this professor who puts his public lectures on YouTube for free.
> There is more to critical thinking than formal logic, I'd argue. The classroom format of a typical humanities college course has a lot to do with this. For example, I would argue that a person without any background in the classics and some basic theology is going to struggle to get much from Paradise Lost. It's dense, difficult, and you have to know some things going into it to pick up on the nuances. But 30 people discussing it collaboratively 2 or 3 times per week, with the expert professor's guidance when they stumble on certain parts, makes for a lot of interesting discussion. Thirty different people will pick up on thirty different bits and pieces in every classroom session.
Well, if you look at literary criticism, there are a bunch of different ways to do it. The oldest ways, such as authorial intent or historical criticism, aren't that divorced from history as described in my previous post, or from just normal old formal logic. But a lot of the ways popular now, such as Marxist criticism or feminist criticism, are forms of reader-response criticism. In the worst cases, this sort of criticism can be used as a pulpit for professors to pass on their ideologies, which is deeply problematic--rather than teaching students how to think for themselves, it's teaching them to think like the instructor. In the best case, it can teach students how to evaluate literature in relation to their own goals--but I would argue that this is just an application of formal logic. The reality, in my limited experience, is neither of these extremes--classes I've taken and my friends have taken have mostly been "these are some of the ways people have thought about literature"--it's more about passing on information than about teaching how to think.
As I've said before, there's a lot of value in giving people information, I just don't think it supports the "college is about teaching people how to think" narrative.
That said, I'll give two caveats here:
1. My own formal training isn't in literary criticism, and beyond some general-ed requirements and second-hand experience from friends/partners in literature programs, I have very little experience here. My impressions here may very well be off-base, which is why I didn't mention literary programs in my previous post. A notable bias in my previous post is that I talked most about the fields I'm most familiar with.
2. Up to this point, I've basically been speaking about teaching facts versus teaching how to think as if they were two different, mutually exclusive things, but it's worth noting that that's not quite true. It's true that simply giving a student a fact doesn't teach them how to evaluate whether something is a fact, but if you give a student enough facts, eventually they come across discrepancies and begin to experience cognitive dissonance. Over vast swaths of facts resulting in a few discrepancies, a student will eventually begin to come up with their own framework for evaluating discrepancies, and hopefully that framework will look a lot like formal logic and evidence collection. I'd argue that this is a very, very inefficient way to teach students how to think, but eventually I think it does work.
> For example, one of the things we covered in my ethics class was Kant's categorical imperative. The categorical imperative falls over instantly if you apply even the most basic logic to it, but no mention of this was made in any of the course discussion or materials.
I've read this a couple of times and I'm curious about what you're saying here: do you mean that your class just reviewed some writing on the categorical imperative on its own, or that it read Groundwork of the Metaphysics of Morals?
I'm not sure what you mean by 'teach the basics'. Certainly, college can provide the texts and an environment where other people are interested in the same subjects. If that's all you mean I agree, though it's far from the only institution where that's possible.
The trouble, I think, is that making ethical judgements requires wrestling with ethical conundra oneself, and that is not something a professor with 300 undergrads can provide useful guidance on for the majority of them. The idea of accurately assessing performance is even more unrealistic. Maybe it's a function of the kind of university I attended, but the vast majority of my fellow students taking these 'subjective' courses were simply gaming a rubric in their writing. And this is true even of those who were genuinely interested in the subject matter; they saw it as the price of admission.
That seems to me like an impediment to actually learning subjects that were traditionally taught on more of an apprenticeship model than an industrial one. If your own undergraduate experience was different, I'd be curious what your university did differently.
> making ethical judgements requires wrestling with ethical conundra oneself, and that is not something a professor with 300 undergrads can provide useful guidance on for the majority of them.
I don't remember a lot from my undergraduate course on Ethics, but I do recall that in the very first lecture, the professor presented us with questions about how one should behave or treat others, and then followed up with "edge cases" that directly challenged what most of us had answered.
As a young person, it's very easy to think that our problems are novel and unique, and the ethics course very clearly showed that many of these problems are millennia old, that people have given names to better-realized versions of what most of us think of as the way we should behave, and that people have spent lifetimes writing and arguing about the ramifications and "edge cases" of such philosophy.
I feel like the biggest benefit from the course was not any particular ethical guidance, but rather the challenging of our beliefs, and the realization that these things _are_ hard, and are not something we can trivially answer with something that fits on a Hallmark card.
Do you really think new college students absorb all that they hear in the entry-level classes? You're also operating under the assumption that all professors employed by a college are capable of effectively communicating the topic they're supposed to teach.
I agree. Developing it is one thing, even building a space where it's easier to develop, but teaching it? As a guy who's taught for both fun and profit, I don't buy it.