Unlike MOOCs, which mainly expect students to use resources online to
complete courses themselves, competency-based programs depend on faculty
mentors to walk learners through the learning process. It’s almost the
opposite of the university lecture. Rather than a professor talking to a
roomful of students, a professor talks to students one-on-one while they
learn information at their own pace.
This sounds almost exactly like the 'tutorial' system that Oxford and Cambridge have used for centuries with undergraduates. Students meet with their tutors weekly and are given suggested reading, and usually an essay to write. Once a term, proctored exams are held.
Edited to add: One difference between university in the U.S. and U.K. is that in the U.K., 'breadth' classes are almost unheard of; the idea of a chemistry major taking an anthropology class, a psychology class, or an art class is strange. That's one of the reasons undergraduates there finish in three years, usually.
While I didn't particularly like it as a student, looking back on university ~15 years later, the educational component I feel I got the most out of was being forced to take a variety of non-major classes. Most of the major material I probably would have learned on my own anyway; it was the other 'breadth' classes that, well, broadened my thinking in ways I did not expect.
Though from a purely vocational standpoint, eliminating the breadth classes would seemingly allow not only faster graduation but also room for more major classes. That may well be a good thing too.
Agreed. Outside of my comp sci major, I took courses in cognitive psychology, history of science, and technical writing, and looking back, those were very useful.
However, it's frustrating that it's regarded as perfectly fine to require STEM students to take courses in breadth subjects, but completely preposterous to require liberal arts students to take breadth courses in hard science.
I think at most colleges, you do have to get at least a couple of science credits. Of course, they are usually along the lines of "Stars for Stoners" (Intro to Astronomy), "Rocks for Jocks" (Intro to Geology), or "DNA for Dumbasses" (Intro to Biology).
I'll be honest, I took Stars for Stoners, and it was a very interesting class. But it certainly was not hard, and given the choice to do it again, without having to fill the general reqs, I'd spend the time reading some good hard science fiction and taking another higher-level history or computer science course instead.
The intro-level humanities courses may not have nicknames as clever, but they bear the same relationship to upper-level courses and the discipline as a whole as those science classes do.
A 100-level history course aimed at non-majors looking for distribution classes isn't going to teach much "real history". Not even close. At best you are getting a small morsel of the methods and philosophies of the field buried deep inside a large scoop of crowd-pleasing "and then, and then" readings and discussion.
Not that there's anything wrong with that approach to pedagogy for either the scientists or the historians.
> it's regarded as perfectly fine to require STEM students to take courses in breadth subjects, but completely preposterous to require liberal arts students to take breadth courses in hard science.
Hmmm ... at my university, science was required for everyone. Are you sure that wasn't the case at yours? They weren't required to take the most difficult science classes, but the same is true for STEM majors taking liberal arts courses.
(I'm using "liberal arts" to mean humanities, social sciences, and the arts.)
Interesting. I went to an institute of technology, so I'm pretty sure they didn't require science classes for humanities and social sciences majors (thanks for the correction!), since they didn't offer those majors. :-)
But I know the neighbouring universities offered science classes for non-science majors, although it was never a requirement, and suggesting that maybe it ought to be would make most people upset. It's been twenty years now, but I don't think the situation has changed, sadly.
I have to agree. Psych 101 was the best course I took in college outside of my Comp Sci classes. Classes with group projects in business and English were really helpful from a sheer communication standpoint, but there were parts that were overdone.
At my college, my major had specific classes we could choose from for the electives and only a few "true" electives. I think it would have been much better to allocate the rest of those hours to pure electives and just specify a category.
Choosing what I want to learn about vs being told "learn one of these three things" would have led me to be more interested in the subjects in the first place.
I have very mixed emotions about this. There were certainly some non-major-specific courses I had to take that did have a positive effect on my thinking and reasoning ability. The majority, however, were a giant waste of time and money, and I still resent being forced to take them. That may be just as much a reflection on my alma mater as on me.
My college's critical thinking, diversity, religion, and environmental science classes were outside my major. Yet, they made for fun discussions and learning experiences. Met some good friends, too. I can tell that things I learned in those classes have had a long-term effect.
I spent a term at Cambridge in Electrical Engineering. Our tutorial sessions were very similar to recitations in America and classes still had traditional lectures. I'm not sure if this is different for other fields.
I'm interested to see where this concept goes, particularly in its stated mission to provide an educational experience for the underserved. Most non-traditional students, no matter how motivated, simply don't have the time or resources to trudge their way through the traditional college experience. Shouldn't we be concerned about educating them as well as we can? I believe so, and some of the barriers to traditional college for someone in their 30s, say, the requirement that a student be physically onsite, just seem absurd.
I think it's also important to note that this is not intended as an all-out replacement of traditional colleges, just a potential alternative. In an institution as old and important as this, cultivating innovation rather than hiding from it is absolutely vital.
Have you looked at Western Governors University's B.S. in Software Development? It's a competency-based program from a non-profit, regionally accredited U.S. university (the real kind of accreditation, as opposed to the national accreditation most for-profits have).
How would a degree like this look in the job market compared to a traditional CS degree? This looks more vocational, which could actually prepare a student better for many software jobs.
One of my coworkers holds their M.S. in IT Management. We work at a three-year-old, ~150-person start-up in Mountain View. That's the only data point I can provide.
Edit: I should point out he already had many years of experience before he picked up that M.S.
At the same time, I would be concerned about having employers recognize the degree as valuable. Looks like there are some mixed opinions online from students and employers about the quality.
I mostly agree regarding universities and CS programs. It seems that most competency-driven stuff comes from the "hands-on" schools like ITT Tech (a local example). Even their quality varies widely, based on reviews I've seen. In security, the only thing I ran into was the SANS GSE certification, which involved interviews, paper tests, and hands-on demonstration of skills. That's a nice combo if it follows an education that teaches fundamentals and practice.
Anyone seen something like that in accredited institutions or even affordable training companies?
Indeed, although I would wager that current CS programs don't effectively gauge that either.
I think part of what's interesting is that there is this massive proliferation of non-college sources of education for CS, but none of them seem to have the clout and accreditation that a lot of employers look for. So why aren't universities copying the coding bootcamp model as a hybrid with CBE? I hope it's not just because they're being sticks in the mud.
I don't know whether or not judging programming ability is hard. But judging ability in computer science, which is what we're talking about, is not at all "very hard".
Do you see local technical colleges and career colleges succeeding in the future? Dental schools, technical assistance/maintenance, welding/plumbing trades, etc.? It seems there is a huge swath of different types of trade schools out there... I am trying to figure out which ones are going to survive with physical schools and classrooms/labs vs. which ones will migrate online.
That is a great way to look at it. I am a real estate investor looking into the pros and cons of investing in the campuses of technical schools. They are everywhere. I think your sentiment is a great way to sort the healthy, longer-lasting schools from the ones exposed to technology change. Fields like nursing, dental hygiene, and server maintenance will be more likely to succeed than paralegal, medical billing, etc. Some jobs are inherently (and always will be) onsite.
> I am a real estate investor looking into the pros and cons of investing in the campuses of technical schools.
I don't know anything more than what you provided in this post, but it sounds like you're talking about for-profit education centers. These are generally looked down upon and have come under heavy scrutiny from the U.S. government as of late for their false advertising and broken promises.
Yes, for-profit education centers. I'd really like to look into this more... Do you have any sources I could read regarding the decline and/or stigma? I feel there is more of a stigma on the larger "vocational" schools like University of Phoenix, which offer college-esque experiences and soft skills, than on blue-collar/technical schools that deliver clear job paths and hard skills after graduation. True? Or are they all stigmatized?
I don't think they're stigmatized, I think we need more of them. I recruit employees at some of these 'hard' technical schools for things like CNC machinists.
Given the multitude of things you can learn to fix in your house using YouTube videos, (plumbing, electrical, appliances, finish work, etc) I think it's clear that you don't need an onsite instructor to learn trade work.
It would obviously be tricky to learn plumbing without wrenches or pipes, but I don't think an onsite instructor is required.
There's even an app called Fountain that lets you facetime with a plumber who then tells you what to buy/fix. I tried it when my bathtub faucet wouldn't shut off. It was surprisingly convenient.
This new model favors mentorship (ad hoc and scheduled) but, for the most part, self-education. And, as mentioned in the article, graduation rates are kind of variable due to the lack of physical accountability (lectures, fellow students, etc.).
But. It's self-education. The way I'm interpreting this is as a new, viable way for universities to increase profit margins by accepting more students and delegating more of the core offering ("learning") to the students themselves, while saving costs on traditional overhead (facilities, professors) AND increasing sales outreach (online).
Now don't get me wrong, self-education and finding your own way in learning is the key to real, personal potential. And honestly, it's really the only way a student can succeed in college anyway. However, encouraging institutions to scale back even further on what most already view as a secondary function (yes, most view research as their primary one) seems wrong.
Now if the cost of these degrees were to plummet, I'd understand it a bit more. But I highly doubt the big-name institutions will ever do this.
I don't know, maybe I missed something; after all, it was a long article, and I have to get back to my kickass programming gig ... which is a skill I learned via self-education and experience. (Yes, I went to college for economics and marketing. No, I haven't used it at all professionally.)
I've been scratching my head on this for quite a while as well. It seems to me that the value of the degree is going down, yet the price continues to rise.
The self-education model (ignoring the complexities of managing it) is often written off as lacking in comparison to attending a college environment, which I completely understand. Yet institutions seem to be scaling back in ways that will soon make them look more and more dated as self-education becomes... viable? Changed, at the very least.
'“The essential manner of delivering education has not taken advantage of technological innovation in the way we’ve seen in other universities,” Josh Wyner, the executive director of the College Excellence Program at the Aspen Institute, told me.'
I find the first sentence hard to read. I believe it means that a few universities are far ahead of the others in using technology.
I love this! It's a great way to 'not stop' students' creativity and imagination. My college doesn't do much to promote startups and creative new ideas. They're more focused on getting jobs and good salary packages from campus interviews. One of my professors literally asked me to focus on getting better marks and to stop thinking about new ideas and startups until I complete college - not all professors help their students develop their ideas or startups. A student like me would definitely love this kind of college! :D
Your professor has some good advice. The problem I see is that many students discard the basics as 'boring' and 'pointless' and want to jump straight into new and fresh ideas. The end result is huge knowledge gaps.
If you want a startup, don't go to college. If you want to learn, go to college. When you try to do both at the same time, you just end up doing each half-assed... and both will eventually fail.
I agree with what you just said. But sometimes there's nothing to be learned when you already know the material, and I couldn't agree more with the second paragraph. I am going to think it over carefully and make a decision very soon. :)