While there are differences, fundamentally the proposition of mentors, internship-based learning, and living on site is not that different from apprenticeships. There's a wider exposure to ideas, perhaps (because maybe you jump between internships instead of staying with one), but at its most basic level the proposed idea is training people to do jobs instead of trying to expand their thinking minds. Yes, there's lip service given to seminars, but it's evident in the quote (“Traditional universities… list the Nobel laureates… ours… would list the… entrepreneurs, inventors, and executives”) that academics are held in a certain amount of contempt.
Let me be clear: I don't think there's anything wrong with this vision. This approach would probably be a success. It may be needed. The problem is this: somehow we mutated universities into a place where everyone goes in order to get a job. I think we benefit from having more people being educated at a university level, but that benefit is not purely practical. I don't think the point of getting higher education is getting a job. I think it's an extension of the same reason we go to school: because an educated population drives knowledge further. Because when a large part of the population gets a Bachelor's of some sort, they can reason better. And because when there are more people at that knowledge level, a larger part than before gets a Ph.D. And that helps society and humanity move forward faster.
There's a tension right now between the university as a place to expand your mind and learn for the sake of learning and the university as a place where you prepare for a job. I think it's a good idea to play with that tension and that balance, and see what balance leads to better results in what way. But I think there is absolutely a place for universities as they are today, and I think their existence is part of what has driven us forward at the frenetic pace that we have seen the last several decades.
I'm founder and CEO of Shopify and I was trained via apprenticeship. Germany has the brilliant dual education system, where apprenticeships have never ceased to exist, and it's the true driver behind Germany's incredible performance recently. After 10th grade I left high school and joined a company (Siemens) "full time" as a trainee. This means that 4 days of the week you work and 1 day you go to vocational school, where you learn CS fundamentals like compilers and algorithms. The program takes 3 years (just like the rest of high school).
It's true, you make almost no money, the work is hard, and you are at the bottom of the food chain, but if you get a good placement you will never learn faster in your life. Trying to finish programming projects while also keeping everyone coordinated and cleaning the kitchens also does a great job of snuffing out the entitlement that so many people join the workforce with.
If you have any doubt about the merit of this style of education, simply look at Germany. It's vastly superior, and I'm racking my brain on how to reproduce this system in North America successfully.
Culturally speaking, America does not necessarily incentivize merit as much as Germany (and Europe, for that matter). We are a nation of marketers and salespeople, which means: (1) personal networks are key and (2) the ability to sell is key. Our culture is built on the ability to control hyperbole and capture attention (bureaucracy) as opposed to hard work (meritocracy). The skills you learn in personal networks (mostly social) at American universities are much more valuable than the ones attained from apprenticeships. In the HN/SV bubble view of the world, this may not seem true, but outside of that bubble it largely is.
Therefore, what you are looking to do is make a massive cultural shift, not just fix a system. The best way you can fix our system is to continue to make yours (Germany) better. Competition breeds motivation to do better.
Note: I recognize that some of my comments are very much generalized. And while I'm from the US, I do live in the UK and have lived in Germany.
There is the question of the competency of a narrowly-educated public deciding critical matters in a democracy... The American liberal arts education tradition has this focus at its core, and the poster above alludes to it: the purpose of the education was not only to get a job, but rather to make full, well-rounded civic-minded citizens.
On the one hand, the fact that German democracy seems to function so effectively perhaps challenges this view. On the other hand, perhaps America needs to invest more in the civic area, because the challenges that come from our diversity require extra strength in order for that diversity to, on net, make us stronger.
Let me be clear: I don't think there's anything wrong with this vision. This approach would probably be a success. It may be needed. The problem is this: somehow we mutated universities into a place where everyone goes in order to get a job. I think we benefit from having more people being educated at a university level, but that benefit is not purely practical. I don't think the point of getting higher education is getting a job.
--
I think the vast majority of us (I include myself) would be better off focusing on learning and finding meaning through work. I have learned to appreciate the finer things of life (literature, philosophy, art, music) outside the context of formal education, as I gained life experience and learned to reflect on that experience. I don't believe I could ever have appreciated philosophy as a 19-year-old. I just did not have the life experiences to appreciate the questions it tries to ask. No professor can supply those experiences, and the best the system can do is to simulate them.
So the traditional University, seeking truth, would continue to exist, but people would go there when their life experience propels them towards it.
In crass business terms, we unbundle training in specific skills, the important task of earning a paycheck, from the abstract pursuit of truth. In that sense, the University becomes closer to how a religious institution operates today. Commercial job skills get to be imparted by commercially driven entities, with employers playing a prominent role.
This could dramatically alter the economics of providing those skills, as we have found out from Zoho University. We not only don't charge students, we actually pay them from day one, and yet, over a 3 or 4 year period, their dramatic gain in productivity effectively pays for the investment in skill building. Yes, this is not the "abstract pursuit of truth" but that needs to be unbundled.
Caveat: I'm a co-founder of Dev Bootcamp (http://devbootcamp.com), so I have strong opinions. :D
"The problem is this: somehow we mutated universities into a place where everyone goes in order to get a job."
No, I don't think that's fair. There's no "we." Everyone -- students and institutions alike -- has a different answer to "what is the purpose of an education?"
In the US, we've been experimenting with the full spectrum of answers for the last 150 years.
For some students "an education" is the promise of livelihood, for others it's about needs higher up on Maslow's hierarchy of needs. Some reasons people "go to school":
* Access to expertise and high-quality curriculum
* Ability to connect with similarly-motivated people over an extended period of time (think: MBAs)
* Credentialing, or in general access to a certificate valued by the market
* Cultivating their own internal or external life, becoming a better person, citizen, etc.
None of these is better or worse than the others; however, these mandates often contradict each other.
I studied mathematics and linguistics at the University of Chicago and the thing I loved most about the university was that it had a very strong opinion about how to answer “What is an education?” That answer wasn’t for everyone, but the school was comfortable not being “for everyone.”
In my mind, by taking care of a small subset of those mandates, Dev Bootcamp helps free universities to have a strong opinion about that question again.
Sal Khan wants to call it a "new kind of college," which confuses things a bit. To me it's about understanding all the ways people answer the question "What is an education?" and unbundling those needs.
> No, I don't think that's fair. There's no "we." Everyone -- students and institutions -- have a different answer to "what is the purpose of an education?"
We have different answers, to be sure, but how is that answer expressed? The only thing my parents cared about when I chose my college and major was, "What kind of job can you get with that degree?" I was on good terms with the faculty and staff of my major, and they felt compelled to acknowledge that most people worry about having a job when they leave school. Thus those faculty and staff felt obligated to explain the marketability of the skills they taught, even if they had to go digging and networking with employers to find such answers.
You claim that, in the US, we have been experimenting for 150 years, but you only provide as an example your own, personal anecdotal experience as someone who has become a co-founding entrepreneur. You don't even provide UChicago's answer (which, had you done so, would have been an invitation for other alumni to disagree with you).
I don't see any meaningful experimentation happening. I see universities struggling to be everything to everyone, and not really having any idea how to do it anymore, except to provide for the lowest common denominator: the need to make a living after parental support runs out.
"I see universities struggling to be everything to everyone"
was exactly my point. :D
For some people, education is about "jobs," for others its about other things, and as a result universities have dozens of conflicting mandates.
I was rejecting the idea that we've somehow converged, as a society or whatever, on a consensus that education = jobs. That was not my educational experience. That's not the educational experience of people who attended an Ivy League school.
But I grew up poor and rural, so I know the flip side...viscerally. I meant to draw attention to exactly your point, that we've set up a situation where universities are directionless because we've asked them to be everything to everyone.
DBC is not everything to everyone. We hope that DBC will make it easier for universities to stop trying to be everything to everyone, too.
"You claim that, in the US, we have been experimenting for 150 years, but you only provide as an example your own, personal anecdotal experience as someone who has become a co-founding entrepreneur."
Happy to provide resources. :)
I'd start with Diane Ravitch's "Left Back: A Century of Battles over School Reform"
If you want names, I'd read up on the history of post-Civil War sociology, psychology, and philosophy in the US, especially the pragmatist tradition.
Start with John Dewey and William Torrey Harris (for a counterpoint to Dewey).
Here's a William Torrey Harris quote: "The great purpose of school can be realized better in dark, airless, ugly places ... It is to master the physical self, to transcend the beauty of nature. School should develop the power to withdraw from the external world."
Here's John Dewey: "The teacher is not in the school to impose certain ideas or to form certain habits in the child, but is there as a member of the community to select the influences which shall affect the child and to assist him in properly responding to these influences"
Also read up on Booker T. Washington and "industrial education."
Literally every point everyone has been making in this thread -- myself included -- was also being made by educational thinkers 100-150 years ago.
> I'd start with Diane Ravitch's "Left Back: A Century of Battles over School Reform"
I'll do that. I haven't read any of Ravitch's work directly yet, and enough time has passed since I heard her speak that I don't remember her position reliably anymore.
> Start with John Dewey and William Torrey Harris (for a counterpoint to Dewey).
I'm a fairly huge fan of John Dewey. I only barely hold back from calling myself a philosophical pragmatist in Dewey's tradition. On the other hand, I kinda recognize Harris' notion as a tradition I'd reject; Wikipedia's characterization of his supporters' arguments suggests that there isn't anything really useful in his ideas that doesn't require stripping them bare first. :P
> Literally every point everyone has been making in this thread -- myself included -- was also being made by educational thinkers 100-150 years ago.
But this is not experimentation. There is a difference between a thought experiment and an actual experiment. Schrödinger did not actually put a cat in a box and mysteriously wave his hands to express the quantum uncertainty of its aliveness.
Yes, we have some tiny movements like Montessori schools or project-based learning or Quest to Learn or KIPP's character training or charter schools and so on and so on, but they make such an infinitesimal impact and are often so incomparable that it doesn't make sense to call it "experimentation". The fact that Will Wright, Sergey Brin, and Jeff Bezos are all Montessori graduates, for instance, gets noticed sometimes but also gets mocked a fair portion of the time. (That is, the term "Montessori graduate" is sometimes used derogatorily.)
> I meant to draw attention to exactly your point, that we've set up a situation where universities are directionless because we've asked them to be everything to everyone.
I don't think that the answer is to unbundle it, though. That's an easy, dare I say Silicon-Valley-esque response to a difficult and multifarious beast, but it's not necessarily the right one. I think it'd be worthwhile to shed some of those things, yes, but because they are invalid or overgeneralized on their own merits, rather than because they detract from a university's capacity to be itself.
I am working on an answer myself, naturally... but I've so far declined to try to untangle the nature of a university on my own. It feels like putting the cart before the horse to do so. I've mostly held to a single, simple principle: a university experience should be optional. If people feel compelled to go to a university after some particular amount of basic education, such as high school, then the problems of a university are symptoms, not causes.
UChicago has a well-known reputation for being academic and focused on cultivating a "life of the mind," which is probably why jfarmer neglected to specify an answer.
And your criticism of his expression of a personal anecdote is unwarranted. Virtually all of the opinions in this thread are going to be based on personal experience. As far as I'm aware, there are no statistics elucidating "the purpose of college." Your last paragraph, for example, is entirely a personal opinion/observation.
My criticism was that his example of 150 years' worth of experimentation was captured in a single personal experience. Unless he's at least 170 years old, what he said was basically ridiculous.
His reply is slightly better, but I'll get to that in a direct reply to him.
"The problem is this: somehow we mutated universities into a place where everyone goes in order to get a job."... That is indeed fair. The parent, I assume, is talking about the current general perception of universities in the US.
My point was more that if you ask 20 people, "What is the purpose of an education?" you'll get 20 different answers -- everything from "to get a job" to "cultivate the life of the mind."
You may believe that, but the evidence from many surveys shows that it is not true. The vast majority of students and parents really do feel that the purpose of university is to "get a [better] job".
You can't just say "many surveys" and leave me hanging! :D
And just so I'm clear: I wasn't speaking literally. I did not mean, for example, "Surveys show that among 20 answers to the question, each answer received 5% of the responses within X margin of error."
I realize that's a risky thing to do on HN and still be understood.
I'm a big fan of Dev Bootcamp and similar startups. Traditional universities are a bad deal for students who are going to university to improve their ability to earn a livelihood -- a not-insignificant portion of the three quarters of American youths who now attend college. And it is becoming a worse deal as college costs increase.
Students need a more flexible, more accessible system of higher education. Over the last 50 years, we've attempted to co-opt the traditional university system to fill the higher education needs of the greater population and the experiment is failing.
A lot of our students come from backgrounds in higher education, so I want to be careful not to imply that places like Harvard, Oxford, etc. are for "some people" and others just can't meet their standards, so hey, why not set your sights lower and go to vocational school?
It's more that because we want universities to be everything to everyone, they've lost a sense of purpose. Some universities retain it -- Chicago, MIT, Harvey Mudd, Reed College, etc. -- and have incredibly strong opinions about how to answer "What is an education?"
Most universities can't afford to have a strong opinion, sometimes for legal reasons and sometimes for economic reasons.
By removing those contradictions, I hope Dev Bootcamp will help free universities to be universities again, and return to them the luxury of holding a strong opinion about the nature of an education.
You're forgetting the university as a place to get drunk, get high, get laid, and play football, which appear to be pretty high priorities for many if not most students.
You're also forgetting the university as a rubber-stamp credential, an exclusive mark of membership into a particular subgroup of people we like to allow into our other particular subgroups, which from my anecdotal experience is the #1 reason parents want their kids to go to college.
The university as a "place of higher learning" was created back when monks copied every book by hand. I think the way we've managed to gold-plate all of the more modern and typical reasons for college with this lofty image centuries later is pretty impressive. By now I think we can probably come up with much better/cheaper forms of "higher learning", but I suspect too much of society depends on keeping it where it is.
> You're also forgetting the university as a rubber-stamp credential, an exclusive mark of membership into a particular subgroup of people we like to allow into our other particular subgroups, which from my anecdotal experience is the #1 reason parents want their kids to go to college.
It's true. Watching the difference in recruiters' reactions at a graduate career fair depending upon a candidate's undergraduate institution was sobering. For two equal graduate students, the student with an Ivy alma mater got all the attention. College is partially a filter. If you can get in and subsequently graduate, it guarantees a lower bound on competence.
> You're forgetting the university as a place to get drunk, get high, get laid, and play football which appear to be pretty high priorities for many if not most students.
So...? While these may not be my priorities, people who are paying through the nose to go to these schools are welcome to have other priorities. The thing is that, just as elsewhere in life, there are people who would like to do different things with their money. I love the American experiment because it lets you do whatever you want with your money.
> You're also forgetting the university as a rubber-stamp credential, an exclusive mark of membership into a particular subgroup of people we like to allow into our other particular subgroups, which from my anecdotal experience is the #1 reason parents want their kids to go to college.
Uhhh... this is exactly how society works. Everyone makes first-order approximations based on certain metrics. Hell, there are people out there who approximate based on the fact that you have a social media profile, have worked for Google, and play video games (yes, even nerds). The fact of the matter is that it is incredibly hard to judge people (professionally and personally), which is why we use these markers, and throwing away universities is not going to make the markers themselves go away.
> The university as a "place of higher learning" was created back when monks copied every book by hand. I think the way we've managed to gold-plate all of the more modern and typical reasons for college with this lofty image thousands of years later is pretty impressive. By now I think we can probably come up with much better/cheaper forms of "higher learning" but I suspect too much of society depends on keeping it where it is.
I feel like what people are really looking for are tech shops that teach you how to do software engineering. Those are cheaper and more effective at teaching you that specific task. The problem is that universities are inherently designed to be a place where you are encouraged to make mistakes in the hope that you eventually get lucky and move humanity forward (this is the "higher learning" bit).
>> I don't think the point of getting higher education is getting a job.
This may be true for traditional students (18 year olds going right to college from High School who need to learn about life/ the world). But for lots of people, the goal of going to college is 100% to get a better job.
40% of college students today are non-traditional (older, a single parent, working more than 35 hours per week, etc.). These students aren't going to college for any reason other than getting hard skills they can use to get a job. Khan's vision is much better suited to this segment than the community colleges that primarily serve it now.
There's a bigger discussion here about whether you should need a college degree to get a job/pay raise, and there are lots of arguments that suggest a college degree is overvalued. However, right now most jobs do require a degree, so candidates need one to get hired (e.g., we were helping a company hire for admins -- some of the candidates had been admins for 20 years, but the recruiters wouldn't look at their resumes because they didn't have a college degree). Some professions also get a pay raise when they earn a degree (I believe teachers and police are among them, but I may be wrong).
Hmm, I know many non-traditional students who went to college because it was something on their bucket-list, not because they wanted a better job (they were happy with their career).
On the other hand, for nearly every person I know who went to college straight out of high school, it was with the intention of improving job prospects.
>There's a tension right now between the university as a place to expand your mind and learn for the sake of learning and the university as a place where you prepare for a job.
I see this argument come up all the time. "You might be making more money than I am, but I have a degree, and that makes me a classy person."
It goes right along with the theory that college degrees are class markers, made ineffective by the large number of lower-middle class and poor folks mistaking cause for effect and mortgaging themselves in order to go to school.
The hard truth? For most of us, we can't afford to spend four years and a couple hundred grand "learning for the sake of learning." I have to work for a living. Sure, I enjoy reading in the evening, but even now, I couldn't afford to drop that kind of money and take four years off.
If your parents can't afford to just pay for your school? well, you had better learn something that you can use to earn money, 'cause those loans? they don't go away with bankruptcy.
So yeah: You (rather, those that say school should be about learning for the sake of learning, not learning so you can earn more money) need to acknowledge that if school is 'for the sake of learning' then it's irresponsible to ask the lower middle-class and the poor to mortgage themselves to attend. There's nothing fundamentally wrong with having a four-year-long summer camp for rich kids, but if that's what you want, say so.
(yes, I have been told that I have a 'Napoleon complex' about my lack of schooling. Sorry. But I do think what I say is true, even if the tone, perhaps, has more emotion than a reasoned argument ought. Spending the kind of time and money required to get a BA "for the sake of learning" if it doesn't increase your earning power, is something that only the children of the rich (and recipients of full-ride scholarships) can reasonably be expected to do.)
You (rather, those that say school should be about learning for the sake of learning, not learning so you can earn more money) need to acknowledge that if school is 'for the sake of learning' then it's irresponsible to ask the lower middle-class and the poor to mortgage themselves to attend.
I agree with that, but one obvious solution is that it shouldn't require mortgaging yourself to attend. Here in Denmark, universities are tuition-free, and students are actually paid a small stipend to attend, to cover living expenses. The intent is precisely to remove any link to parents' socio-economic status in who is able to afford to attend university.
Even in the U.S. it used to be somewhat closer to that. My dad came from a poor family and attended university without taking out any loans, because at the time ('60s), public universities had very low tuition, and even what low tuition existed could be covered by part-time work on campus (e.g. in the cafeteria or library).
I have been thinking about schooling this week. More and more of the poor people I know have gone on to get real jobs lately. (The local job market is really hot if you are a nerd, and nearly all of my friends and many of my acquaintances are smarter than I am[1], so it's really no surprise.)
I ended up using a craigslist ad to find someone to help me shovel out the office. The random person I got had an IT-related degree from one of those schools you see advertised on daytime TV, and was having a hard time of it, even in these good times. (I mean, uh, she had Windows experience, so not really my field, but someone has to maintain the Exchange servers and desktops for all the people hiring my friends.)
>because at the time ('60s), public universities had very low tuition, and even what low tuition existed could be covered by part-time work on campus (e.g. in the cafeteria or library).
This is still true for community colleges and state schools. The problem is that a degree from one of those schools doesn't mean as much as a degree from a good school, (or really, from what I've seen, much at all,) precisely because it's something most people can get.
This supports my theory that degrees are about filtering rather than about learning.
My belief is that a degree from a good school says that either you have rich parents, or you are smart, hardworking, and have some hustle. With partial scholarships, this isn't a binary thing, of course.
The idea is, though, that from an employer's perspective, for a lot of jobs, having rich parents (and connections who are also the children of rich parents) can be just as useful as being smart and hard working.
But yea, if a degree is about filtering and not about learning, then obviously, a degree from a school anyone can get into is of dubious value.
Perhaps, from that perspective, what we want is a cheap school with a brutal dropout rate? But it's going to be hard to change the current status quo, as it works out pretty well for the elites. The rich pay big bucks to go to school with the best scholarship students. It's a fine system for the children of the rich (the quality of your peers, I think, has far more to do with educational quality than the quality of your teachers... and hell, if you are one of the elite, destined to control the means of production, you /need/ to know competent people who can actually work). It's also a great system for the smart people who make it in on scholarships; they gain the contacts and access to capital that their birth denies them.
(I mean, as a coder, rich people contacts are generally less-useful. So I guess that means google and facebook should start focusing their recruiting efforts on people that went to good schools but didn't pay for it.)
[1]To be clear, I'm not stupid, but my own set of prejudices and my own brand of arrogance means that I don't have all that much tolerance for people noticeably dumber than I am. It's not a virtue; I recognize that I would be alone if many of my friends shared my prejudice.
This is still true for community colleges and state schools. The problem is that a degree from one of those schools doesn't mean as much as a degree from a good school, (or really, from what I've seen, much at all,) precisely because it's something most people can get.
Good state schools still confer pretty valuable degrees, I think, but the problem is that the tuition isn't low anymore at those places. Community colleges are still affordable, but not so much the "flagship" state schools. In the '60s you could go to UCLA or Berkeley for basically nothing, but nowadays it's getting up towards $10k/yr. Same with places like UT-Austin or UW-Seattle.
Huh. Usually I've heard "state schools" to mean schools a tier lower than the UC schools. Like, San Jose State would be a state school -- both the prestige and the tuition are rather lower than at a UC school like UCLA or UC Berkeley -- even though all are run by the same state.
Oh right, I see what you mean. It's possible I'm using the term wrongly. Maybe "public universities" is a better word? Basically I meant the "flagship" 4-year universities run by the states, of which each state typically has at least one or two (UT and A&M in Texas, Purdue and IU in Indiana, etc). Those used to be a common route to cheap but highly regarded education, because many are huge (e.g. Michigan State has 47,000 undergrads, Texas has 38,000), and they used to have only nominal tuition, plus enough work/study programs for students to pay their own room & board by working on campus. They're still cheaper, but no longer like that.
California does still seem to have one interesting option, at least for engineering: from what I can tell, Cal Poly SLO is formally a Cal State, and priced like a Cal State, but sort of a "premium" Cal State whose degrees are well-recognized among engineering firms.
You say that universities provide learners with a wider exposure to ideas. I don't doubt that this was true for the hundreds of years that universities have existed. However, up until 1989, universities basically had a monopoly on being a place where it is easy to be exposed to many ideas. Since TBL invented the WWW and hyperlinks became commonplace, it's never been easier to be exposed to ideas. It's far easier to come across new ideas on the internet than in any university in the world. All you need to be exposed to thousands of ideas is hyperlinks and intellectual curiosity.
The only feature that university has that the internet doesn't is a curriculum required to get a degree. However, a required curriculum without intellectual curiosity isn't worth much; a required curriculum with intellectual curiosity is worth something. But now, with phenomena like Wikipedia and MOOCs and a generation that has grown up online, the internet is most certainly overtaking universities.
I'm 30 years old, and were I to do it again, I wouldn't go to university. I know I'm not alone in this sentiment, but I also know this isn't yet a majority sentiment. I expect a significant portion of the generation currently at uni that grew up with the internet to feel this regret when they are 30.
I agree that having a population highly capable of reasoning and critical thinking is a good thing. But I do think this vision of an enlightened society is a little bit of an ivory tower fantasy.
We don't live in a society where people have a large amount of freedom and exist as free independent agents.
Most people need to be employees. And if you are an employee you will exist in a dictatorship with a military-like structure, and doing critical thinking and proper reasoning will be downright dangerous.
"Um excuse me Mr. Manager, but your proposal for the future of the division has a logical error in the reasoning, and your conclusion is actually false"
Nope, Mr Manager is always right. Because you have to play the game of corporate politics if you want any chance of success, and avoid being labeled as difficult and abrasive.
As long as the large majority of job postings and companies are looking for obedient workers of a certain skill, most people will want an education to become an obedient worker of a certain skill. And universities need to adapt to demand in order not to see a decrease in applicants.
> We don't live in a society where people have a large amount of freedom and exist as free independent agents.
But should we? And if so, why may we not work towards it?
> And if you are an employee you will exist in a dictatorship with a military-like structure, and doing critical thinking and proper reasoning will be downright dangerous.
This is complete bullshit. You understand neither the military nor corporations, both of which can and do permit and encourage back talk.
Corporations only permit or encourage "back talk" when employees know what not to say or question. You are free to question policy or management decisions; you are not free to question certain organizational goals or the hierarchical structure of the corporation unless you happen to be in a high-level position (which most people will never be in).
That's true in every human organization. If you're constantly questioning our societal decision not to eat other people, you're not going to make any friends.
I think one of the results of going to university is that people learn critical thinking. Obviously, increasing the ratio of critical thinkers to the intuitive people in a society puts that society in a much better position.
"Mr. Khan conjures an image of a new campus in Silicon Valley where students would spend their days working on internships and projects with mentors, and would continue their education with self-paced learning similar to that of Khan Academy."
Because of course the entire world centers around SV and tech. And even if you are just trying to work as a manager at a regional chain in Kansas there is something you can learn from Silicon Valley.
"Mr. Khan writes that he admires the work of Peter Thiel, founder of PayPal, who set up a fellowship program for college students to drop out and pursue entrepreneurship with the help of financial backing and mentors"
The entire world is not cut out to be entrepreneurial. In fact most of the world isn't and can't be (not everybody can be a chief; we need Indians too, as the saying goes).
Noting also that Khan takes the traditional route of publishing a traditional printed book (by Hachette Book Group) to get his views across. And will most likely go on a typical book tour and do the usual media publicity route. Nothing wrong with that. Just like there is value in traditional universities and things like Khan "Academy".
I think your criticism isn't completely fair. I'm fairly certain that the narrow design and location of his thought experiment is because he is most familiar with the finance and tech industries (His degrees are in Math, EE, and Business). He's clearly spent a lot of time in school and is familiar with his pain points and would like to fix them (for better or worse)[1].
If I were in the same position I wouldn't bother expounding my ideas about what would make a better art school, because I'm clueless about art schools.
[1]My experience has been that Community Colleges already do a lot of what he's suggesting (at least mine did) and I doubt his university style campus could compete on cost with a CC.
> My experience has been that Community Colleges already do a lot of what he's suggesting (at least mine did) and I doubt his university style campus could compete on cost with a CC.
I bet it is the other way around.
Smart potential mentors know the value of finding good people early, I don't think that will cost nearly as much as you would fear.
"My experience has been that Community Colleges already do a lot of what he's suggesting"
That was my thought exactly. The CC system is quite good at educating people, yet, I feel like many folks pontificating about the "future of education" focus only on "Research 1," or more correctly, doctoral granting universities with very high research activity[1].
In a chapter titled “What College Could Be Like,” Mr. Khan conjures an image of a new campus in Silicon Valley where students would spend their days working on internships and projects with mentors, and would continue their education with self-paced learning similar to that of Khan Academy. The students would attend ungraded seminars at night on art and literature, and the faculty would consist of professionals the students would work with as well as traditional professors.
--
This is what we do in what we call "Zoho University" within our company. We hire students out of high school, and they go through a program that has a mix of classroom instruction focused on extensive programming exercises with a heavy dose of interaction with people building products. Basically education combined with context - why should I learn this? Why is this relevant?
This comes from my own personal experience doing a PhD in Electrical Engineering from Princeton. I would rate my PhD as idle mathematical game playing. That would have been OK as a hobby, I just wish I had not gifted 4 years of life to its pursuit. And I definitely take issue with the fact that this was tax-payer-funded pursuit - in fact, tax-payer-funding has something to do with how abstract and detached from reality these things get. So I ended up repudiating the whole thing. Zoho University is my atonement.
I passionately believe that the vast majority of students (myself included) would get far more value from this type of a "contextual education" program.
My dad always told me an employee who wants a competitive salary needs the negotiating leverage that if he isn't paid market rate he'll have to look elsewhere. He said you need skills that more than one employer wants, and qualifications more than one company will recognize.
I'd be interested to hear how other companies view employees who've been through "Zoho University"? Have many of your students got jobs with decent salaries after leaving your company?
In the book, Mr. Khan also advocates for a separation of universities’ teaching and credentialing roles, arguing that if students could take internationally recognized assessments to prove themselves, the playing field would be leveled between students pursuing different forms of higher education.
Removing our heavy reliance on higher ed credentialling seems central to his idea, and yet the proposed solution is simply "internationally recognized assessments", which sounds suspiciously like "standardized tests" to me.
Sadly, many professions are not well evaluated by standardized testing. This is not for lack of trying -- we already have the GRE and all of its related subject tests, plus the entire raft of tech-related certifications that aren't worth the paper they're printed on (remember "A+ certification"?). Grad schools look at GRE results, but usually only to make sure you didn't completely bomb the test.
This is the hard part, and will require a lot of careful thought to untangle.
Standardized tests don't have to be multiple-choice tests from a question bank, though. You can have practical sections, like in the CCIE. You can have long-form scenarios you have to analyze, like on part 3 of the CFA. You can have the exams given only on specific dates at specific times of the year, as is done on several exams, so that the questions are always secret and never reused.
The highest quality certifications combine more than one of the above techniques. For instance, you might make phase 1 just a multiple-choice test. This mainly acts as a filter so that not too many resources are wasted scoring the more comprehensive sections for test takers who aren't really serious. Then phase 2 testers might take a more open-ended test with essays, etc. Finally, people who pass phase 2 might have some kind of practical, similar to medical boards or the network building in the CCIE.
This kind of process might not be cheap, either. A lot of people's time is involved in creating, administering, and grading this kind of rigorous testing process, but even if it costs $2000 - $3000, it is still much cheaper than a university. It is also much more egalitarian, as it is not open only to a chosen few.
> Grad schools look at GRE results, but usually only to make sure you didn't completely bomb the test.
You will not get into a top ten Economics grad programme without a perfect GRE Math score unless you are really impressive on some other metric. Neither (near) perfect grades nor undergraduate research of a high calibre will get you in on their own; these are basically expected. To my knowledge Math and Physics have equivalent standards. These programmes (top 10) graduate the majority of Ph.D.s in their fields. Within them it looks like the GRE doesn't matter, but that's because everyone there clears a high minimum.
But GRE scores predict research productivity, so they're definitely measuring something meaningful. Programmes that are less selective have worse outcomes; they're less rigorous than programmes that can assume greater ability and preparation of their students.
Many people love to hate on standardised tests but IQ tests and IQ test equivalents like the GRE, GMAT, LSAT, SAT etc. are the best predictor of workplace performance besides work sample tests. For the time investment involved they are absurdly effective. If you want to improve the usefulness of certifications like A+ you could just show percentile score rather than a simple Pass/Fail.
I regret that being on my phone renders me unable to provide citations, but if you HN-search "tokenadult work-sample tests" you'll come up with some relevant stuff.
I wouldn't be so quick to dismiss certifications. Having an A+ cert at least reasonably signifies someone could format a hard drive and install ram.
These tests do not have to be on paper alone, either. Look at the CCIE from Cisco. That has an extremely difficult lab component that no amount of "teaching to the test" is going to help you overcome (which is why having one pretty much guarantees a six figure income).
There are legitimate ways of doing standardized testing. I suspect the professions of engineering and law can be useful guideposts. There have been discussions before on how engineering as a whole is like or unlike software engineering... I'll leave that bone of contention alone.
But I don't think Khan is suggesting that every profession be so assessed: just that, if an assessment is merited, just put it out there as available rather than required.
My daughter (6 years old) asked me the other night at dinner when she would learn which berries in the forest are poisonous, high school or college. I explained to her that she would not be learning these things in school, she will have to learn them from other places like books. At which point she tells me she is not sure she wants to go to college if she isn't going to be learning important things.
Even if not real, this anecdote is cute. The sad truth, however, is that she will learn from other places that without a college degree she is less likely to be employed...
But look at where it has already been done before, like Microsoft or IT certifications. They are okay, but don't really have a whole lot of value because it is too easy to 'cram for the exam' - there are even services in many countries that help with this. I wouldn't want to hire someone solely based on their certificates or badges - I'd want to see a portfolio of their work, hear from previous co-workers, and of course speak with the person to see about their skills in communicating and working with others.
And outside of some technical skills or rote knowledge, most skills can't be so easily measured by test questions.
A better argument for a 'new college' would highlight how right now a college degree doesn't really indicate what a person has learned, and the evidence points to not much actual learning happening in college (see Academically Adrift: http://www.insidehighered.com/news/2011/01/18/study_finds_la...). This isn't surprising since most professors aren't trained in how to teach effectively and properly support learning, and neither are most in industry, either.
All lawyers take the bar exam. No law firm or government organization asks for a lawyer's score on the bar. The only thing that matters is your GPA, given to you by your own university. Law GPAs are heavily curved; most students can only receive A's or B's. Everyone knows that law schools regularly adjust their curves to give their students higher GPAs and therefore an advantage in the hiring market. Still, nothing changes in legal education. If anything, legal education changes less than other forms.
How much material is covered by any single MS certificate? A semester's worth? If an entire 4-year curriculum were covered by the certification exam, it would be near impossible IMO to cram.
Of course, having your entire college record decided by a single exam is quite a scary prospect.
“Traditional universities proudly list the Nobel laureates they have on campus (most of whom have little to no interaction with students),”
They only have little to no interaction with students because students are stupid. When I started college, I was a physics major. After one year, I found the field of solid-state physics particularly interesting. I found out that Prof. Albert Overhauser was at Purdue (if you don't know who he is, just know that there is a very important spin-transfer effect named after him), and it turns out there were many opportunities to interact with him that almost nobody took advantage of.
Consider office hours alone: You've signed up for a lecture from a smart and talented person in a field you are interested in, and they are paid to set aside time to talk to you. If you are most students, you don't bother to go.
* mentors
* hands on experience and dives into theory/standard learning
* live "on campus"
I was lucky to have all of these at a standard liberal arts college, albeit not in the exact doses that Khan is talking about. It made my education great.
One of the great successes of the university I attended was its several weekly hour-long two-on-one sessions (two students, one supervisor) with either a professor or a grad student, talking about that week's lectures.
The regularity and length of them allowed a relationship to be built up over the course of the semester, and you could be damn sure you'd learn a lot more from that dialogue than you ever would from sitting bleary-eyed in the lecture hall at 9am.
I've heard that's what you get at Oxford. Seems great but as a mere mortal who attended a slightly less prestigious institution, I had to make do with tutorial groups where 18-20 people would cram into a room designed for 12.
You get it at a few places. It has its downsides though: expensive for the university, and a big time suck for the supervisors, but honestly it's a really successful system. I'd go as far to say that if there's any one thing an institution can do to improve the abilities of their students, it would be to adopt a system like that.
I'm not sure how students can be expected to learn from a one-sided teaching system. We're not photocopiers. There's so much value from a meaningful, extended dialogue between a student and a "mentor." Unfortunately, a lot of people have to wait until postgraduate programs to get that, and even then may not.
A lot of US institutions have office hours which could offer much of the same if students knew to take advantage of them in the right way.
Liberal arts education seems to get a bad rap around here (I did take less higher math and science than would have been required at a state university), but the average class size for my college was twenty. Occasionally more, but there's also the counter case, where I had an English class with two other students. One had a voracious appetite for skipping, and the other was consistently absent for mock trial.
I found it substantially raised the perceived importance of finishing the required reading when I thought there was a 50/50 chance of being the only one there to be questioned about it. :) I had a huge motivation problem end of high school/early college, and I think if I hadn't had the individual attention of a liberal arts college, I surely would have dropped out.
I've followed a similar background becoming an Engineer. I did the co-operative education while an undergrad and got valuable mentoring and working experience in the marketplace. Now I'm an organizer for a local R users group which meets at a university. We do a lot of hands on and practical theory which has benefited me and our data science community.
As a current PhD. student in the liberal arts my position likely veers significantly from the mean around here. But having read a lot of these sorts of "higher education is in tatters" sorts of discussions--and tending toward skepticism about the general fitness of most humans to be involved in higher education of any kind--I find myself pushing back against the critics. My classroom experience as a philosophy and psychology major at a mid-tier, state university left very little to be desired. I'm not sure I am capable of being crafted into a good employee, whatever the skills of the institution attempting the task. But I learned a whole lot about the world, how to reason, how to communicate, etc. And I wasn't even a particularly good student.
As an undergraduate in our college of arts the average student, regardless of major, was required to write scores of pages per semester, all to be read and responded to by someone more than qualified to critique even the especially precocious 19 year old. Most classes were of less than 20 students and involved lively discussion with a professor who, more often than not, was well published and read in his or her field. Setting aside employment for the moment, I just can't imagine a much better way to grow a person.
There are certainly problems with higher education--too many to list really. But the core of what universities do, as it relates to undergraduate education, they do pretty well by and large. It is true due largely to economic distortions and poor alternatives too many who would be better suited by something more vocational go to four year universities looking for something the university is not designed to provide. Hence the present discussion I suppose. But for those who are looking for education in the broader sense, I find universities to be serving as advertised.
>“Traditional universities proudly list the Nobel laureates they have on campus (most of whom have little to no interaction with students),” he writes. “Our university would list the great entrepreneurs, inventors, and executives serving as student advisers and mentors.”
Why would "great" entrepreneurs, inventors and executives have more time than Nobel laureates would to spend mentoring students?
Mr. Khan's model can work as a one-of-a-kind college, but scaled up to the masses it will probably fail miserably.
The best model for future higher education is probably being developed at MIT and Harvard through their joint venture edX, where students are educated and trained both on and offline. In other words, these finest educators are taking advantage of both the latest technologies and traditional methods to educate their students. Hopefully, in a few years, lessons will be learned and we can all benefit.
I disagree with A, B, and randomdata's C. (I mostly disagree with randomdata because that's way too broad a stroke.)
The purpose of universities is to provide an umbrella for intellectual specializations to conduct research. There are two reasons they admit students: to pay the bills, and to acquire apprentices.
It's about time we see some disruption in education. The old-school methods of schooling will not work in this day and age. For example, in the tech sector, more companies are looking for graduates that have experience rather than just a CS degree. There are free resources to learn how to code like Codecademy, Udacity, Coursera, Mozilla’s P2PU, Google Code University, and MIT OpenCourseWare. There are also more and more high-quality paid resources, both online and off, like General Assembly, Treehouse, or Bloc, not to mention local continuing ed classes. Mr. Khan and Peter Thiel are headed in the right direction.
Do not confuse old with crappy. The fact that most lecture-based schools are crappy does not diminish the value of a lecture (cf. Oxford etc.). The problem is not necessary the method but the numbers.
Of course, a "lecture" on Perl is probably as useless as a "practical with computers" on Metaphysics.
This is a very important point. The problem, particularly in the US but not exclusively so, is that at the top universities the best faculty are not necessarily the best teachers. You can be a world class researcher and still be a terrible lecturer.
A truly great lecture is akin to a piece of theatre, especially in the humanities. Really, that's one of the problems with the methods that Khan promotes: they appear to work very well for quantitative subjects that can be accurately tested, but that doesn't apply to many of the arts.
I don't see why the humanities should be relegated to "ungraded seminars at night on art and literature". And, as others have pointed out here, one of the reasons Nobel laureates who are faculty may have little time for students is because they have a lot of other things to do: the same is surely true for the professionals Khan wants to advise students?
I don't want to sound as if I'm against the Khan Academy - it's fantastic. I think there is real value in implementing many of its ideas into standard education, and I know that's already happening with projects like edX and the like.
Not all lecture-based schools are crappy. However, the reality is that the more practical experience people have, the faster they learn. Having a boring "lecture" on Perl is less productive than having a quick 10-minute session on building a Hello World app using Perl. People essentially learn by doing; it's as simple as that.
I suppose working with mentors could help students be better prepared to join the workforce, but students can already do that on their own by getting internships.
Classes could be self-paced, but then likely many students would lose the motivation to learn. People are strongly driven by competition and deadlines, two things schools excel at providing.
There already are internationally recognized assessments -- standardized tests. Though they are typically useless without a degree. It would be difficult to change this, but if you could then you might make colleges obsolete. Many people would not spend 4 years and $50,000 on a college degree if they could study for 6 months, take a test, and get the same job. But I don't think this could work. For example, even though we already have the GMAT, GRE, and MCAT, it's unlikely you will ever convince medical schools to let in students with an MCAT and no undergraduate degree. And if an employer is choosing between someone with a perfect GMAT and no undergrad, and a Stanford graduate with a mediocre GMAT, the choice will likely be in favor of the Stanford grad. I don't think this would change even if you made the test more comprehensive and difficult.
Not being graded: this may increase students' intrinsic motivation to learn, but it also may decrease their motivation, since there would be no penalty for not learning the material. But if you maintain that students must take an internationally recognized test at the end, then test-taking abilities will still be measured, so you're just playing with motivation here and not really changing much.
These ideas are interesting, but even if schools enacted them, what great improvements would they bring to our society?
“… the faculty would consist of professionals the students would work with as well as traditional professors ... by de-emphasizing or eliminating lecture-based courses, having their students more engaged in research and co-ops in the broader world …”
The problem lies in the teaching materials and the teachers. In most cases, faculty members don't have real-world working experience, so they just provide their students with examples from textbooks. And textbooks are written by faculty who mostly don't have real-world experience. Also, the race to produce research artifacts is extremely distracting for faculty members. Universities need to distinguish between teaching faculty and research faculty.
As a very simple example, a state machine is a very helpful design tool if used properly. But I've noticed that almost none of our new hires know that a state machine is best used for modeling the physical world; people keep using the concept in very inefficient ways (here is a concise version of what I teach: http://www.drdacademy.com/?id=13).
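To make the point concrete, here is a minimal sketch (mine, not from the linked page) of a state machine modeling a physical device, using the textbook turnstile example. The point is that states and events come straight from the physical world, so the transition table stays small and readable:

```python
# A tiny finite state machine for a coin-operated turnstile.
# States and events mirror the physical device directly:
# a coin unlocks the arm, pushing through locks it again.
TRANSITIONS = {
    ("locked", "coin"): "unlocked",    # coin unlocks the arm
    ("locked", "push"): "locked",      # pushing a locked arm does nothing
    ("unlocked", "push"): "locked",    # walking through re-locks it
    ("unlocked", "coin"): "unlocked",  # extra coins are wasted
}

def step(state, event):
    """Return the next state for a given (state, event) pair."""
    return TRANSITIONS[(state, event)]

state = "locked"
for event in ["push", "coin", "coin", "push"]:
    state = step(state, event)
print(state)  # -> locked
```

When the thing being modeled isn't physical (business workflows, UI flows), the state space tends to explode, which is one way people end up using the concept inefficiently.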
"Also, the race in producing research artifacts is extremely distracting for the faculty members. Universities need to distinguish between teaching faculty and research faculty."
I disagree. When you make research faculty teach undergraduates, you are basically forcing them to interact with the undergraduates, which is a mutually beneficial arrangement.
Yes, it is mutually beneficial. But, I have seen how new hires are suffering due to the fact that they have not gained practical knowledge at school. So, there is a problem.
This sounds a lot like an apprenticeship with a book club and some arts and crafts attached.
Perhaps that is fine, it seems to me that to most people (especially parents) what is important is not so much what is taught or how it is taught but the fact that they can say they have been to a place called a "university".
My first thought (as a student) is that it is not easy. During my internships, I was usually spent at the end of the day; I only wanted to chill out after work. I tried to take online courses at the same time, and it turned out to be horrible. It would require a lot of discipline and motivation to make it work. This is not for everyone.
It is better to separate the working and the learning. A model like the co-op program offered by the University of Waterloo may work better. In that program, students alternate between studying and internships every 4 months. By the time they graduate (after 5 years), students have 5-6 internships on their CV. So far, most students in the program that I have met are excellent!
There are several interesting comments on this article that relate to company hiring procedures. With the help of other HN participants, I have gradually compiled references to build a FAQ file on that subject,
which seems to be well liked by other participants here. To relate what the research says to what Salman Khan is proposing, it is a widely replicated finding all over the world, for many categories of jobs, that work-sample tests based on actual job tasks do much better at identifying successful workers than higher education credentials. Any employer that doesn't want to be a chump should hire most workers on the basis of a work-sample test.
What an institution of higher education could do to respond to what research says about preparing learners to succeed in finding jobs and doing jobs well is make various work-sample tests part of a graduation comprehensive examination. Currently higher education degrees are based mostly on "seat time" (so that the saying is, "All you need to get a degree is a heartbeat and a check"), rather than on demonstrated competencies for doing any useful form of work.
A college or university that says to the world, "Our graduates who obtain our diplomas demonstrate their competence with work-sample tests, the results of which we list on their official academic records" would rapidly gain notice from companies with research-based hiring procedures. (And why would anyone want to work for a company with hiring procedures that are contrary to the best research?)
That could provide the new college or university leverage to gain market share as compared to other colleges and universities. That would be enough for it to thrive as a new institution of higher education.
Yes, there are other reasons to go to college besides getting a job after graduation. But the research shows that many of those purported reasons are poorly achieved by current higher education practice,
and a new model of college or university might be able to do some outside-the-box innovation to also achieve better in that regard. Trying something new would not be a bad idea.
"A college or university that says to the world, "Our graduates who obtain our diplomas demonstrate their competence with work-sample tests, the results of which we list on their official academic records" would rapidly gain notice from companies with research-based hiring procedures. (And why would anyone want to work for a company with hiring procedures that are contrary to the best research?)"
This is exactly the line of reasoning that ITT Technical College takes [1] and yet few, if any, of those graduates go to work for Google or Microsoft or Apple in their engineering programs. (all companies that pride themselves on having 'enlightened' hiring practices)
What this means for the entrepreneur is that you can hire top technical talent that these other guys won't touch because that talent either graduated from a more 'trade' oriented school than an Ivy League school, or if that talent trained themselves differently (online/self taught/Etc).
[1] "At ITT Technical Institutes, we are committed to helping men and women develop skills and knowledge to pursue opportunities in many of today's promising career fields, including electronics, drafting and design, criminal justice, business, information technology, health sciences and nursing." - http://www.itt-tech.edu/
I once had an education entrepreneur who had built 3 schools tell me in private that ITT Tech's business model is overtly designed to target, quote, "the welfare market". ITT is ignored because it's a known parasite. If you look at ripoffreport.com and search for ITT Tech you'll get a list of complaints. The cultural underpinning and intent of an institution actually has a strong effect on its "cultural" outgrowth, imho. You can't really measure this, of course. I work at a school that is intertwined with the entertainment industry, and unfortunately a large part of it is more culturally "entertainment industry"-ish (with all the shallow business and marketing psychology) than education-ish. So there is a rift between promoting something as a marketing strategy and really having the idea as part of the cultural framework that you believe in and intend to deliver on.
Also, think of ITT as a negative filter. If you go to ITT or Phoenix, rather than say, a traditional college, employers could make generalizations about your background almost immediately (destitute, not serious about schooling, bad background, high risk) regardless of your real capabilities.
> A college or university that says to the world, "Our graduates who obtain our diplomas demonstrate their competence with work-sample tests, the results of which we list on their official academic records" would rapidly gain notice from companies with research-based hiring procedures. (And why would anyone want to work for a company with hiring procedures that are contrary to the best research?)
In Randy Pausch's Last Lecture, he talks about how companies had actually sent his department letters that explicitly said, "We will hire your graduates." Which, as he noted, is pretty amazing.
But there is no incentive for colleges and universities to change anything about their practices. Demand for college education has never been higher and is still increasing.
The question to ask is: if our higher education system is in such dire need of reform, how come it is so popular? Why are people willing to go into a lifetime of debt just to participate in a system which, according to your linked research, is so woefully inadequate?
One answer is that higher education may be experiencing a bubble right now. It's true that more people are going to college than ever before and tuition has never been higher. But that only means that college degrees have perceived value, and not necessarily value that comes from their contribution to the productivity of society.
The benefits of a degree to employability are obvious, but for a lot of students the benefit to one's knowledge and skills is dubious (e.g. low-quality undergrad business programs). The cost of college, on the other hand, plus the opportunity cost of four years of lost productivity, is huge.
If you run a university, this is a scary position to be in - bubbles can end quickly. If this is indeed a bubble, then a shift in employer perception of a degree's value could end it.
Education intersects with two problem areas that have historically been correlated:
1. Knowledge sharing. Books no longer take three months of a person's time to copy, but <$0.01 of electricity. Video lectures are now freely available. This is awesome! But it's not the full picture when it comes to education. You also need frisbee games and poetry nights.
2. "Social". This includes credentialing, social networking, and creation of an environment (in college, a mostly parentally funded and artificial one) conducive to rapid learning.
The second of these is much harder, because people still believe that exclusivity is necessary to preserve quality (and, sadly, they are not entirely wrong). I've said before that there are two sub-problems in social. One is documenting existing relationships. That's lucrative but does little real good for the world. The second is to expand the social graph. That does a lot of good but it's also risky and rarely remunerative because some people (and the vast majority of the powerful) like exclusivity. The problem with "disruption" is that it assumes the disrupted are old fogeys who won't fight back, and that's never true.
While there are differences, fundamentally the proposition of mentors, internship-based learning, and living on site is not that different from apprenticeships. There's a wider exposure to ideas, perhaps (because maybe you jump between internships instead of staying with one), but at its most basic level the proposed idea is training people to do jobs instead of trying to expand their minds. Yes, there's lip service given to seminars, but it's evident in the quote (“Traditional universities… list the Nobel laureates… ours… would list the… entrepreneurs, inventors, and executives”) that academics are held in a certain amount of contempt.
Let me be clear: I don't think there's anything wrong with this vision. This approach would probably be a success. It may be needed. The problem is this: somehow we mutated universities into a place where everyone goes in order to get a job. I think we benefit from having more people educated at a university level, but that benefit is not purely practical. I don't think the point of getting higher education is getting a job. I think it's an extension of the same reason we go to school: because an educated population drives knowledge further. Because when a large part of the population gets a Bachelor's of some sort, they can reason better. And because when there are more people at that knowledge level, a larger part than before goes on to a Ph.D. And that helps society and humanity move forward faster.
There's a tension right now between the university as a place to expand your mind and learn for the sake of learning, and the university as a place where you prepare for a job. I think it's a good idea to play with that tension and see which balance leads to better results in which ways. But I think there is absolutely a place for universities as they are today, and I think their existence is part of what has driven us forward at the frenetic pace we have seen over the last several decades.