I work in post-secondary administration, so I think I have some perspective here. Part of the problem, at least in the US and Canada (where I live), is that post-secondary institutions are positioning themselves less and less as places to get an education and more and more as places to go for an "experience." It's no longer enough to provide a quality education; universities now sell themselves on their facilities, their "student life," and all the other intangibles that are secondary to actual education. This leads to the administrative bloat we're seeing: now that many schools are functioning more like glorified four-year spas, they have to have departments filled with staff to plan events, throw parties, Snapchat sports games, provide "safe spaces," etc.
I haven't been in the sector long enough to have a real handle on when or why this shift happened, but from my perspective it's the primary driver of the increasing administrative bloat. Schools are competing more on the intangibles, and so they need to invest more into these areas, which means more staff and more overhead.
Personally, I think the whole university model isn't long for this world, though, as there are plenty of ways competency can be signaled apart from a fancy foil-stamped piece of paper. Eventually, when the cost of a university education doesn't provide a positive return over any reasonable time horizon, students are going to start looking for alternatives en masse, and the market will innovate to meet that demand.
This is the view of people who have not really looked at university budgets. "Experiences" are cheap compared to headcount. Buildings and experiences are mainly funded by outside donations given specifically for the experience or building. People do not often donate to "expand salaries and headcount for faculty and administrators."
The result is that students pay for headcount and donors pay for experiences. And it isn't just administrative headcount; it is also teaching headcount. If you want a low price, you cannot have a $200,000-a-year professor teaching 30 students and doing some research unless the research is funding their salary.
We need to lower the cost of colleges in the United States dramatically, and that means tough choices, but it won't happen until we reach the point where student loan defaults are so prevalent that the federal government can no longer subsidize them.
We have a crisis on our hands and our politicians are talking about lowering the cost of borrowing by 2%, and making colleges "free". They are not interested in driving down the real cost of an education.
On the research point, why are research and teaching commingled? Both are important, but it seems that they require very different skillsets. Would we be better off filling universities primarily with people who love teaching, and are great at it, and having some other kind of institution (or some separate division of universities) do the research? Is this heresy?
IIRC there's a fairly strong correlation between content mastery and teaching ability, even at the high school level, far stronger than the correlation between level of teacher training and teaching ability.
Obviously there are other factors (effort, as you point out).
Though I think it's partly an artifact of the days when an undergrad degree would bring students to the leading edge of research, so there was simply no one else qualified. There would have been a lot more synergy when researchers were explaining cutting-edge ideas to students (it still happens a bit, but it's the exception rather than the rule).
> Though I think it's partly an artifact of the days when an undergrad degree would bring students to the leading edge of research, so there was simply no one else qualified. There would have been a lot more synergy when researchers were explaining cutting-edge ideas to students (it still happens a bit, but it's the exception rather than the rule).
Yes, that was my suspicion too -- historically, students were only going to lectures they were actually interested in, and where they were much closer to the level of the lecturer. In that case, the bottleneck isn't "ability to be a good explainer to novices" but "ability to answer arbitrary questions."
IOW, it looked more like grad school today, where the good researchers naturally are a better fit for teaching, and don't mind it as much.
> On the research point, why are research and teaching commingled?
They aren't. To an ever increasing extent, undergraduate teaching is done by adjunct staff, who are not involved in research, and who work on a temporary basis.
Professors with good research funding can often buy themselves out of all or part of their teaching load. Many choose to teach only the upper level or graduate courses, where immersion in research is in fact a useful qualification for teaching.
Disclosure: I was an adjunct for one semester, many years ago.
It's in large part an unfortunate consequence of how academia works, but I think what you're asking for is largely happening already: the proportion of instructors who are lecturers and adjuncts, rather than tenure-track/tenured professors, has steadily increased over the years [1].
Meanwhile, tenure-track/tenured professors tend to focus primarily on teaching upper-level classes, which suits them infinitely better. The students they get are much more knowledgeable, which frees profs from worrying too much about background knowledge; by sheer virtue of selection bias, students that aren't interested have been filtered out; and the material is much closer to the work the profs themselves actually do.
The main reason they are commingled is that teachers need to have mastery of what they teach. As subjects become more difficult, fewer and fewer people have the required mastery to teach a subject. At the end of this (i.e. graduate level courses) the only ones with sufficient mastery are those that are doing the research.
For lower-level subjects (i.e. most undergraduate courses) this is much less of an issue, but the above still explains why we need researchers to also teach. Issues occur when you let a brilliant but eccentric mathematician teach calculus 101.
His research is in no way relevant to the course, and he might not be able to explain things at that basic a level.
> At the end of this (i.e. graduate level courses) the only ones with sufficient mastery are those that are doing the research.
Sadly, sufficient mastery of the subject often comes at the cost of sufficient mastery of teaching, which becomes especially important for such difficult subjects. If only some of that money spent on more staff was spent on assistant teachers whose main skill was teaching and coaching other teachers, and providing paid hours for both of them to fix the curriculum together...
Anyway, I fully agree with your second paragraph. Another issue is that of researchers not really seeming to care, having an attitude of "you shouldn't be in University unless you really want to learn this, and I'm not going to try to motivate you or help you with that." There's some truth to that, but it also sometimes feels like it's used as a way to mask their own teaching incompetence.
I was asked to be an assistant teacher after graduating from my master's, and I completely freaked out about the responsibility of making sure I wouldn't fuck up my students' education (I cared more than most of my students did). I almost declined the job offer! I was basically told, "yeah, we know; that's part of why we want you."
Anyway, I spent a lot of time online trying to find advice on the dos and don'ts of education. I can say that the two best lectures were Eric Mazur's Confessions of a Converted Lecturer[0] and John Corrigan's Are We Listening To Our Children?[1]. I think every teacher should watch these at least once.
Honestly, when mastery of the subject becomes detrimental to teaching the subject, you're not taking the highest-level courses.
A great researcher has to teach and convince themselves; those skills translate to teaching at that level.
I'm mostly thinking about the highest level math courses though.
I guess you're talking about the point where it's more collaboration on a half-solved problem than data transfer about solidified knowledge? Because then I agree, but that's a really different type of knowledge transfer.
There's a point before collaboration, where you ask a researcher to bring a student up to the level of the researcher. At that point, the student and researcher struggle with similar enough problems that the researcher can think from the student's perspective.
Anecdotally, I've had only one professor who excelled at conducting research and teaching concurrently. The rest either disliked teaching or were completely enthralled by their research, and we as students suffered.
AFAICT, there aren't that many people looking for jobs who love teaching and are great at it. Teaching is a really hard skill: it's hard to acquire, and it's hard to be good at. That idea is somewhat heretical in higher ed, where it's research that's supposed to be hard and teaching is the thing you do as a matter of course. Unfortunately, research and teaching are generally very different skills. It happens that sometimes you find people who are great at both. I happen to think that if you are great at one, you will be great at the other, but if you are only good at one, you could be totally terrible at the other. This might just be because people who are great at one thing could be great at anything.
There's a meme in research that teaching helps you do better at research. The story sounds plausible: to do a good job at teaching, you break the subject down into its component parts and tease out the essence, then package that up to deliver it. Perhaps through this process of repeatedly breaking your subject down to teach it, this process will eventually lead you to think about something differently, and then maybe you get an idea for your research. I've talked to people that say they have had this happen to them.
Partly because the idea that teaching is hard is heretical, though, professors really de-prioritize teaching when they are mentoring new professors. So not much in the way of classroom skill gets passed down from generation to generation, and it's up to the individual professor to re-discover how to teach when they start their career. Outcomes vary. Additionally, because of the idea that teaching is not hard, new professors (and professor trainees) are discouraged from spending too much time on teaching, so in my experience, when it's done well, it's done either by someone that takes more time than they ought (potentially at the expense of their career) or someone that is just so great at everything they do, that it works out okay.
This means there are just not that many people floating around who know how to teach and are amped up to do it. You see this when you try to staff "lecturer" positions at US universities, where that term means "someone who teaches but does no research." Those positions might even come with their own analogue to the tenure track! And you can interview dozens of people that can't really lecture, or teach.
If you go to grad school, you get indoctrinated very quickly into the idea that teaching is easy, that you should be able to do it with your eyes closed, and that you should put in the minimal amount of work required before doing something that really matters, like publishing papers. I think it's hard to find people who can teach because those people wash out of grad school very early, when they find out that their attempts at teaching well will not be rewarded.
This is not true if you look at the tiny liberal arts colleges, though. They actually care about teaching. However, if you go to grad school, you go to an R1 school (because tiny liberal arts colleges can't afford grad students) and R1 professors think that becoming a liberal arts college professor is a shade of failure, so they will strongly discourage you from going down that path.
I work at a big state university as well and completely agree. The amount of money spent on luxury dorms, extravagant academic buildings, athletic facilities, student entertainment, and administrative staff on top of administrative staff is stunning.
Some of the facility spending, especially in athletics, is paid for by donors, and athletics also has revenue from major sporting events, but there is huge spending and bloat on the academic side as well, funded by ever-increasing tuition and fees that I just cannot see being sustainable.
There are a number of comments in this thread similar to yours on the topic of university as an 'experience.' I started my undergrad degree in 1999, and I think that was very much the case at that time as well. The same goes for the dorm community mentioned as another reason in other comments.
All of that was pretty similar then, and this was when UC tuition was still very, very reasonable.
The main difference, maybe, was that at that time the experience was less controlled, and less managed. The dorm I lived in first year was run by fourth year undergrads and grad students, and their responsibility was basically to make sure that no one was opening vodka bottles in the common room, or setting off smoke detectors with bongs or puking in the elevator. They were compensated with free or heavily reduced room and board.
The experience was less controlled and less micromanaged across all aspects of university life at that time, and even more so before that. My admittedly unsubstantiated viewpoint is that the proliferation of control mechanisms, and the transition of traditionally subsidized experience management into formal salaried roles, is a big contributor to the unnecessary growth in bureaucracy, and that it has a negative long-term effect on learning from both the academic and the 'experience' side of things.
Yep; this is the same problem people talk about when they talk about bundling. Not interested in a gym membership? It doesn't matter, your tuition is still going to go to subsidize the gym at the university.
It's also worth pointing out how colleges end up benefiting from having captive customers. It's usually more difficult for a 17-18 year old to rent a place off campus and commute, and many schools even have policies against that. And a social environment is encouraged in dorms and dining halls, so students who aren't buying stuff from the "company store" end up at a loss. Whether or not the school is good at managing housing and food is often immaterial.
I think there's somewhat of a vicious cycle at work. Someone paying $50k/yr wants more than classes in a run-down building and a threadbare dorm to sleep in. That's a ton of money, and I don't think it's ridiculous for people to expect a pretty darned nice experience (inside and outside the classroom) for that. But of course, providing those facilities and experiences costs a lot, so tuition increases. And with the increased tuition, expectations increase too, and the cycle repeats. I'm not saying it's right, but it's what I've observed.
> I haven't been in the sector long enough to have a real handle on when or why this shift happened, but from my perspective it's the primary driver of the increasing administrative bloat. Schools are competing more on the intangibles, and so they need to invest more into these areas, which means more staff and more overhead.
If I had to guess, it probably started around the time they realized that the Millennial matriculation cohort was going to start "running down," leaving universities either to reduce their selectivity (which is death for a private uni) or to somehow retain their selectivity by attracting more of a limited applicant pool. At the same time, there's a secular trend of cutting per-student funding for state universities.
The result is that state schools need to bring in more tuition money (forcing prices up on their end, reducing price competition against over-expensive private schools), while private schools need to retain selectivity (so they compete harder for students who can pay).
I think it makes sense why people compete on experiences. Can you really differentiate how much you're learning between the top 50 (or even 300) schools? It really depends on personal motivation.
And in terms of non-educational successes, the school name alone often provides the value, and since it's virtually impossible to get an exact value, schools compete on other factors.
Combined with the fact that students are more and more willing to go further from home to get a degree, you need to offer something more than education. Going to a good school (and doing the work) will be challenging regardless of where it is, and I think most people realize this at some level -- so you look for the place that will make you happy otherwise. Slogging through finals week anywhere is tough, but it's hell if you also hate the school you're at and wish you were somewhere else.
As far as the "university model" goes -- it's got problems, definitely. But the value of the top schools has always been the name. Considering that graduates of top schools are so overrepresented at the highest levels of both the public and private sectors, it will be hard to break the top-down cycle of everyone buying into the idea of a 4-year degree. It will probably happen at some point, but it seems slow going.
The biggest schools (especially R1-level unis) fare much better in terms of providing funding and aid to students, so they aren't always as cost-prohibitive as many for-profit schools (where the vast majority of student loan debt is tied up).
I don't think that's it. Simultaneously, health care has seen the same massive growth of administration and shift of money away from the actual service (professors/physicians).
There is a huge new "student ethics" bureaucracy to handle that stuff that never used to exist. In the past, if you got drunk and did something you later regretted, that was your problem, as it is for any adult in the real world.
They were involved in a supervisory role. All male or all female dorms, mandatory on-campus housing, curfews, tightly controlled visitation and social events. Now, students get to do whatever they want and are only held to account after the fact. There is a huge ethics bureaucracy with committees conducting hearings, investigations, and various other CYA activities after the shit has hit the fan.
Exactly and eerily true. At my university this is a category 2 serious incident, like a fire, flood, or epidemic. Any student admitted to emergency medical care due to inebriation means the Serious Incident Officer is involved. How the SIO is supposed to find out, I don't know.
I struggle to understand why relatively high-level management or an SIO needs to be paid to get involved in what another 'adult' does outside of lectures/labs and campus.