U.S. News pulls Columbia University from its 2022 rankings (forbes.com/sites/michaeltnietzel)
440 points by selimthegrim on July 11, 2022 | 336 comments



Related:

An Investigation of Columbia's U.S. News Ranking - https://news.ycombinator.com/item?id=30603287 - March 2022 (46 comments)


Per the article this came in response to questions raised by a Columbia faculty member critiquing the university's data. I thought that report[1] was actually more interesting than the original article.

As per the quote at the top of the report:

> Rankings create powerful incentives to manipulate data and distort institutional behavior for the sole or primary purpose of inflating one’s score...

It would be shocking to me if Columbia was the only institution that fudged its numbers a bit in pursuit of a higher US News ranking. Given how important that ranking has become in US culture, I wonder if it's time for the methodology to change to require audited/auditable metrics rather than continuing to allow self-reporting.

1: http://www.math.columbia.edu/~thaddeus/ranking/investigation...


Your suspicion is correct.

I've taught at several US universities, and it's an open secret that the administration takes steps to manipulate numbers in order to secure better rankings.

Typical example: a small, prestigious liberal arts college always has smaller class sizes in the fall semester than in the spring semester. Why? Because classes with fewer than 30 students are ranked higher by US News, and US News only takes the fall semester into consideration. No one cares about class sizes in the spring.


Similarly, at a large land grant university, we talk a lot about improving our rankings, but acknowledge that one of the major metrics (% of students not accepted) is something that's contrary to our mission.

It was an open secret when I was in Boston that Harvard actively encourages people to "give it a shot" when they know full well they're not getting in to boost that metric.


I remember when I was applying to colleges, the only reason I applied to Harvard was that they didn't require any school-specific essay, unlike every other school. If they had, I probably would have just passed on them. At the time, I thought it was nice because I was tired of writing essays, but now I realize it was probably just a way to boost their selectivity.


"Recruit to deny" [1] has been getting more attention but still not enough.

1: https://www.nytimes.com/2019/11/29/us/harvard-admissions-rec...


Michigan also seemed to waitlist me and several of the people I knew at MIT. If they know they're the backup school, and they think you're going to get into your first choice, it hurts their acceptance rate to accept you outright.

Though, maybe it's just confirmation bias and ego. Maybe there was something in my application that gave Michigan pause but seemed fine to MIT. At that time, Michigan did ask which other schools you were applying to. I'm not sure if that's still the case.


I'm confused. How would Michigan know that they were your second choice?


They expect students of a certain caliber. If the applicant exceeds their expectations, the university assumes the student is using this application as a “safety school.”


My kid got waitlisted at UM, while a classmate with a similar but slightly lower GPA and extracurriculars got in. Maybe his essay wasn't as convincing, or maybe they thought he'd snub them for his first choice. He didn't get into the dream school either, but ended up in a great place at a school that likely fits him better anyway (and is cheaper, since UM's out-of-state tuition is insane).

It's still frustrating as both his parents are alumni. Thanks to Covid and test waivers, 2021 seemed like a rougher year than normal to be applying to the higher end of schools.


This is sometimes referred to as "Tufts Syndrome" or "Yield Protection": https://en.wikipedia.org/wiki/Yield_protection


cries in Jumbo alum


I used to live with an admissions officer - they've got data (and instinct) for "You're probably not going to go here". The same is true for graduate admissions in my experience.


Acceptance rate is a really poor choice of input for college rankings.

It's basically saying "other people seem to think this college is good, so we'll nudge them up a few points"

In other words, it's attempting to measure public perception of the college instead of the actual results it produces. How good the public _thinks_ a college is shouldn't factor into how good you _tell_ the public that college performs.


> It's basically saying "other people seem to think this college is good, so we'll nudge them up a few points"

So do PageRank and citation indexes - popularity begets visibility, which begets popularity.


The difference is that applicants by definition don’t have personal experience of how good the school is, it’s all perception. With page rank and citations the expectation is those linking to or citing something have read it and have reason to value it.


Same with students applying. They’re betting that students aren’t applying completely randomly to schools and that the research they do beforehand isn’t entirely noise and has some signal that will show up in the average.

I mean, it’s a pretty darn conservative proposition to say “we think that, averaged over the entire population of applicants, the schools that students spend their time applying to will trend, however slightly, toward better schools.”

Where the flip side of “it’s all noise and it’s impossible to evaluate a school better than flipping a coin without actually attending” is an incredibly strong assertion.


I'm in the UK, but my eldest is starting Uni this September and rankings played a huge role in her selection of colleges. Yes of course she looked at their prospectuses and visited a few of them, but the shortlist of the ones to look at and visit itself was based on rankings.

It's like the XKCD about Wikipedia entries being cited in articles, which get cited in papers, which then get cited in the same Wikipedia article.


I usually love this kind of feedback loop in systems, but this one is kinda sad.


Much (often most) of the value of a degree is in signaling. So public perception is very important


Not really. There are only a handful of schools where it matters; the rest, nobody has even heard of. Odds are half the people you are competing with have a degree from a university I've never heard of, and for the other half it's a local college that I know of but have no idea how it stacks up against yours. The local school, when you are local, can be a negative though - I know the bad local schools.

If you have the big-name degree it might push me to interview you when I'm otherwise borderline about tossing your resume, but that is it. I need to know you are good, not your school.


Perception is the result. Average SAT score relative to other schools tells you roughly the average IQ difference of the student body.

The rest is bunk. Undergraduates aren’t connecting with famous professors. A faculty team with deep expertise in a particular area is relevant but not in the rankings.


My impression has long been that the endowment per capita metric provides about as good a ranking as any of the third-party rankings within a category. Where it disagrees with US News, it usually matches my intuitions; and where it disagrees with my intuitions, I usually learn that the college in question has a better reputation than my impression.

It doesn't do a great job comparing across categories, but then, neither does US News...
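One nice thing about this metric is how trivial it is to compute compared to US News's composite. A quick sketch, with entirely made-up endowment and enrollment figures (the college names and numbers here are hypothetical, just to show the shape of the calculation):

```python
# Entirely hypothetical figures, purely to illustrate the metric.
colleges = {
    "College A": {"endowment": 50e9, "students": 23_000},
    "College B": {"endowment": 3e9, "students": 2_000},
    "College C": {"endowment": 1e9, "students": 30_000},
}

# Endowment per capita: dollars of endowment per enrolled student.
per_capita = {name: c["endowment"] / c["students"] for name, c in colleges.items()}

# Rank from richest to poorest per student.
ranking = sorted(per_capita, key=per_capita.get, reverse=True)
print(ranking)  # ['College A', 'College B', 'College C']
```

Note that the raw endowment totals alone would rank these the same way here, but per-capita ordering can flip: a small college with a modest endowment can outrank a huge university with a bigger one.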


Some top colleges don't have alumni/legacy preference, so that endowment measure may get skewed away from them. MIT, and I think Caltech.


They still do great on this metric, though! MIT comes out looking a bit short of Harvard, on par with Stanford. These schools ride a bit higher than the top tier of LACs (Williams/Amherst/Pomona/Swarthmore), which isn't necessarily the fairest comparison, but Caltech fits right in with those.

Maybe they'd be richer if their admissions was more open about taking quasi-bribes, but I suspect you're dramatically underestimating the orders of magnitude of difference that appear on this metric. You find clear tier gaps.

Don't get me wrong, this is a silly proxy metric about three degrees out from anything you'd actually want to know. But it gets surprisingly good results for how simple it is to tabulate.


Imagine college A, where 100 apply and all 100 get in, vs. college B, where 10,000 people applied and only 100 got in. Suppose both colleges choose students based on some measurable criterion, say SAT score. Would you expect higher average scores at college A or college B? Obviously college B would have students with better SAT scores, simply because it is so much more selective.

Assuming the college is selecting students for attributes that you are looking for in fellow students, then the college that denies more people will have students with more of the attributes you want to be around. So it isn't just about perception. There is some math behind it as well.
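That selection effect is easy to see in a quick simulation. A sketch, assuming (purely for illustration) that both applicant pools draw scores from the same bell curve:

```python
import random

random.seed(42)  # reproducible draws

def avg(xs):
    return sum(xs) / len(xs)

# Assume both applicant pools draw from the same score distribution
# (mean 1050, standard deviation 200 - made-up numbers).
pool_a = [random.gauss(1050, 200) for _ in range(100)]     # college A: 100 applicants
pool_b = [random.gauss(1050, 200) for _ in range(10_000)]  # college B: 10,000 applicants

admitted_a = pool_a                 # A admits all 100 applicants
admitted_b = sorted(pool_b)[-100:]  # B admits only its top 100 (a 1% acceptance rate)

print(round(avg(admitted_a)))  # roughly the population mean, ~1050
print(round(avg(admitted_b)))  # the top 1% of a big pool - several hundred points higher
```

The gap comes entirely from selection, not from either pool being "better": identical distributions, different acceptance rates.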


> If both colleges are choosing students based off of some measurable criteria, let's say the SAT score. Would you expect to have higher average scores at college A or college B?

Except "average SAT score of incoming class" is another field of the rankings. All the measurable criteria are handled as independent variables - there's no need for a poor proxy.


Even so, you don’t need to guess the reasons for acceptance on aggregate data - you can check if the SAT scores were higher. At least that would tell you for certain.


It depends: if school A costs $100,000 and school B is almost free, I would put my money on the 100 who spent $100,000 over the 10,000 who applied for free.


I didn't know that about Harvard. That's so sad. I've seen how devastated the rejects can be.

My anecdote is that I was once recruited by one of the best departments in the world in my field, at a very junior level ... and then, before I started, my contract was rewritten to be at a higher level, because they'd found out that doing so would allow them to count my grants in their metrics. Not only is this obviously dodgy; I think it's actually illegal in Australia (since better-qualified candidates weren't given the opportunity to apply for the post at the higher level). But they did it anyway.


> It was an open secret when I was in Boston that Harvard actively encourages people to "give it a shot" when they know full well they're not getting in to boost that metric.

The common app has made this worse. In "the old days" you had to fill out a full application, write different essays etc.


I attended a well known and regarded public university. I was admitted for spring, and for the preceding fall entered a program where I could live in a regular university dorm and take classes for full credit taught at the same university’s extension, but special extension classes just for this program (so all spring admit students). Then you stay in the dorm in spring as you become a “regular” student.

I always suspected it was a way to fudge numbers, but at the same time I knew I was genuinely on the bubble of being admitted so I took the opportunity gladly.

(My memory is that the other spring admits were, like me, from racial groups considered “over represented” so they may have been trying to affect diversity numbers too.)


Berkeley has been doing this for at least 35 years.


Yup, I got into Berkeley EECS this way back in the late 20th century. I'd guess it's legitimate load-balancing rather than an attempt to game the ratings. I assumed this meant that my application was borderline and I was grateful to be able to sneak in. The Extension program is a nice transition to college too.


Load balancing is the most likely reason. At any given college, a lot of first-year students decide college isn't for them, or they get homesick and drop out within the first semester. After that initial period, the class population is much more stable.


> Because classes with less than 30 students are ranked higher by US News, but they only take into consideration the fall semester. No one cares about class sizes in the spring.

What a bizarre and totally unnecessary restriction.


You add a simple and reasonable (until it gets abused) assumption to your model, which allows a lesser data-collection burden on thousands of universities. I wouldn't call it a restriction.


> which allows a lesser data collection burden on thousands of universities.

How's it supposed to do that? Universities all already collect the data on enrollment for every class they offer.


Spoken like a software engineer! Now imagine every form is being filled out manually by an administrator with a calculator.


Maybe the forms should be made too complex to fill out with a calculator, to encourage doing things the right way…


Wouldn’t work. Entire state budgets are accounted for by spreadsheets without formulas, emailed between people in an office with no revision control.

A few years ago, when $50 million went missing, I wasn’t surprised. I’d seen the budgeting process, and it was worse than you could possibly imagine.

There was an attempt to go the software route, but a nepotism deal left the budgeting department with a worse system than manual spreadsheet entry.


Lowest bidder took the $50 million. On a serious note though, this is craziness. Technical education and knowing how to code should be mandatory for these kinds of jobs.


Most of the budgeting department is high school graduates, with a few higher-ups with accounting degrees.

The pay is abysmal either way. You would make more money at an entry-level software job anywhere.


“Hire a coder” appears to be good advice for many organizations, but there aren’t enough coders for everyone to do it, so most will do without.


But it's not.


I don't agree that there's a good argument for ignoring the class sizes for half the year to decrease data collection burden. The data they're ignoring is of utmost importance to the statistic (well to the statistic as basically everyone will understand it since no one reads "average class size" and thinks "average class size of only half the classes").

In any case, the quality of the argument doesn't really matter. If the number is being gamed so trivially and yet US News continues to ignore spring classes, then the restriction has gone from bizarre and unnecessary to actively malicious. US News is basically colluding with the universities to engage in fraud.


The target audience here seems to be parents helping and guiding their near-adult children, in the final spring of high school, to apply to or choose universities where they would start that same fall. Fall is when "most" incoming freshmen will start.


I TA'd in the attendance office at my high school back in the day.

The city's tracking of student attendance only lasted a portion of the year or a portion of each quarter.

During those times, much more effort went into calling the homes of missing students and finding out why they were not showing up to class.

I remember asking about why all the calls were happening or not happening and it was sort of quietly explained by the staffer: measurement of attendance affected how the school was being evaluated by the district. Presumably, budget was hooked to it in some way.

I didn't make any waves about it because "working" in this office was super powerful and the staffer always brought in cookies for us. Amazing gig.


This is a plot point of the season of The Wire set in the school system. In Baltimore, it is definitely tied to funding--schools get per-pupil funding, but students have to actually be in attendance a certain percentage of the first six weeks of school in order to count for the funding formula.


In California, schools get paid based on daily attendance. I believe it would be considered fraud not to take daily attendance. They suspended this during the heart of the pandemic, but it came back last fall. Because daily attendance is down relative to pre-pandemic levels, public schools are facing a significant revenue gap (5-10% is typical).


In contrast, Michigan Public K-12 schools take attendance on specified Count Days to account for the number of students they serve to determine state funding. Fall count day is the 1st Wednesday in October and represents 90% of state funding. Spring count day is the 2nd Wednesday in February and represents 10% of state funding. As you might imagine, getting students in school on these days is a Big Deal.

Teachers take attendance on other days for internal use, but Count Day is special and you must not be absent!


How often have Count Days been snow days?


It's almost never a snow day in early October; I only remember one February count day that was also a snow day. They count on the next day:

> If instruction is cancelled on count day due to conditions not within the control of school authorities, with the approval of the State Superintendent, the affected instructional programs must use the immediately following day on which the district resumes session for count purposes

per the rules at:

https://www.michigan.gov/-/media/Project/Websites/mde/Year/2...

I do recall my cousins at a poor rural school in Barry County having a couple of snow days on Count Wednesday, and then subsequently calling one on Thursday when it wasn't really required and another on Friday when it really wasn't required, so that counts were as high as possible on Monday. Their snow days were much more frequent due to bad roads and bad road maintenance, and their state funding was much more important compared to my wealthy district's.


This sounds like the prompt to a horror story.


Ah yes, the ol’ warm-body count.


Great to know many people's graduations were delayed so someone could game some numbers.


I think they mean the size of individual classes, so they're paying for more sections/instructors/instruction time in the ranking-relevant periods.

Though shrinking the admit size is another well-worn tactic: it's the easiest way to increase how "selective" you are, which is another major ranking element, plus it helps you more cheaply achieve smaller average class sizes, though of course you're forgoing some revenue to get there.


Encouraging people to apply is also a common tactic. You can't reject them if they don't apply!


Applications are also a revenue source.


Apparently there’s a common application thing now where you can apply to large numbers of schools for undergrad without spending large amounts of money.


The common application just means you don't have to fill out unique admission forms for every school. You still pay an application fee to every school.


In some other countries there's a national system with a single modest fee. In the UK this is 'UCAS'[1], a nonprofit org. You apply once, and indicate your top N (say, 5) favorite school+program. The applications are sent to the schools, and they can make you an offer.

If not accepted for any of them, there's a follow-up matchmaking service (called 'clearing' in the UK) to help find an unallocated spot.

[1] https://en.wikipedia.org/wiki/UCAS


From the Common Application site: "While some colleges may charge an application fee, others have no fee to apply. And, many will offer fee waivers under certain circumstances, including financial need, veteran status, and more."

Based on a conversation with my nephew who applied to a double-digit number of colleges this past year, the application fees are apparently more the exception than the rule these days.


> You can't reject them if they don't apply!

You can if you admit on objective metrics. If you only take people who are 215 pounds or above, you can fairly claim to reject everyone lighter than that, regardless of whether they apply.


What are they going to do? Transfer out and have only half their credits recognized, pushing them back 4 semesters instead of 1?

It's a fantastically anti-consumer market they've managed to create.


Gotta pay for another semester then, right?


Why is small class size a good thing? Anecdotally, you get better class material and TA staff in larger lectures. Why aren't Universities able to innovate here?


My uncle was a math teacher in Brazil - mostly test-prep course schools for university exams, later he opened some of his own private schools.

He used to say two things: that he estimated he had taught at least 100 thousand people in his life (which sounds like crazy boasting, but is feasible when you work at 5-8 different schools per semester and each class has 100-250 people), and that a teacher is always teaching to the worst student in the class - so the only reason top universities are better is their selection process.

Forget class material, forget infrastructure and fancy facilities. What makes the best universities is the fact that the top students are there.

If you cannot (dramatically) change your students and make them smarter, you can make your classes smaller and segment by performance. This at least stops your top 20 percent from being dragged down by your bottom 20 percent.


This is 100% true. It's also why attempts to make things fairer by admitting lower-scoring kids into specialized schools fail. It just turns the specialized school into a normal school.


> that a teacher is always teaching for the worst student in the class

It's been my experience in college that most teachers just let the bottom quartile of the class fail (and likely switch into another major).


There is no "switching majors" in Brazil. You make the application for the major that you want, and you are competing against everyone that also applied to that course. If you decide to switch after starting a major, you almost certainly will be forced to drop out and re-apply.

But in reality, even that doesn't matter. At the best universities, your worst students will be better than the average students at second-rate institutions. If the teacher is simply teaching to the top 75 percent, the class will be better still at the top uni.


And yet this seems a self-fulfilling observation.

How can somebody rise from the bottom 20% if they get a subpar education?


First, why assume "subpar education"? One of the corollaries of my uncle's hypothesis is that there really isn't any secret held by some elite, nor are there special resources to be distributed to students during class. Simply put, the more a group is segmented, the more appropriate the class will be for them as a whole. If anything, you are increasing "utilization" of resources.

Second, there can be individual mobility. A poorly performing student with the potential to improve might receive help to overcome whatever is bringing them down, and get back to thriving. Even then, it helps to be in a smaller class, as it's much easier to distinguish, among the low performers, those who want to be helped from the lost causes who are a drag on everyone else.


This doesn't pass the smell test. If you teach a class of 250 students, you aren't even aware of the bottom percentile until the first exam is graded. That's many weeks after you've already taught many lectures. So how would your uncle even figure out how to tailor those initial lectures to that bottom percentile?

Where this matters is in something like a lab or music practicum, where the communication between instructor and students is the class. But I can tell you from experience that the variance among different years of students at a single university is large enough to disprove your uncle's point. I had one class that could have been confused for graduate students. Yet another might as well have been sent by the Trump administration with the goal of destroying the institution of ear training.


You gauge the level of the class by asking questions, get an idea of where they are, and vary the pace accordingly. You've also got to factor in the experience of the teacher in gauging the level of the class.

You don't need to know everyone's level, just a rough idea about what the worst student of the class might need.


> you aren't even aware of the bottom percentile until the first exam is graded.

The first exam that students in test-prep schools take is before classes start, usually a mock university test that can be used precisely to assess your current level and to gather information about the class overall.

You don't need to know who are the worst performers. But if you know that your class of 250 has an uneven distribution of aptitude, you'd try to break the class apart.


> you aren't even aware of the bottom percentile until the first exam is graded.

I've taught some college classes and guest lectured in many others. The bottom 10% is trivial to identify in a matter of minutes.


> If you teach a class of 250 students, you aren't even aware of the bottom percentile until the first exam is graded.

You don’t need to know which specific 50 students are in the bottom two deciles, just that they exist and that your policy is to teach to what you’ve estimated is the 3rd decile and above.

You will find out who they are in the exams, but you know they exist in next year’s class even before that class is admitted to the uni.


Small class sizes are better precisely because the class can then become more than a lecture. I am a professor, and if you give me a hall full of students I’ll give a lecture, because that’s all that can really happen under those circumstances.

If it’s a small group, I can actually learn everybody’s names, get to know them, personalize the material more and spend big chunks of the class just answering your questions and starting discussions based on them. The class becomes less of a mere information dump and more of a mentorship.


> If it’s a small group, I can actually learn everybody’s names, get to know them, personalize the material more and spend big chunks of the class just answering your questions and starting discussions based on them.

I went to a relatively small private engineering school. It made a world of difference in my education knowing the professors, and having them know me, especially given that with small classes the professors offered office hours where I could come and discuss the material that was covered during class, as well as problems I encountered on midterms.

I still remember nearly all of my professors, fondly, precisely because of the dynamic small class sizes allowed for.


Same experience here. I wouldn’t trade small class sizes for anything.


I think it's wrong to think that lectures are always inferior. There have been many cases where a professor trying for a more interactive class has made the experience worse for me.

This most often comes in the form of prolonged rhetorical questions. A teacher, fearing he is losing the class' interest might pause to ask "And, what is the benefit of the scientific method?" The point here is clearly to gauge whether the class understands and is following along in the train of thought. Usually, the rhetorical question has multiple interpretations and valid answers. And usually, the students are not so much wrong, as they simply fail to track what answer the teacher is hoping for. For example, the students might answer the rhetorical question above with "It helps us understand the world.", "It allows us to test theories.", etc. In fact, the teacher was looking for something like "It allows for the independent checking and confirmation of the findings of others."

In a real-world example, this sidebar of questions might take minutes and, more often than not, completely breaks my train of thought. Often, the teacher has paused a valid train of thought to either measure or enhance class participation, and in doing so has made it harder for me to understand the lesson.

Sometimes this goes even more poorly, and the teacher just wants to witness a "class discussion," which can entail spending much of the class polling the students for their thoughts. Depending on the subject, this can be a total waste of time. Most of the students in the class are poorly informed, and their thoughts are much less useful than a traditional lecture would have been.

I certainly admit that interactive methods of teaching can be particularly effective, but they are not an unalloyed good. They can often be worse than a simple lecture.


What are you even talking about? Have you actually had classes like that, or are you speculating? I had years of college classes with under 10 students — some with 5 or fewer — and never experienced what you’re describing.


> I think it's wrong to think that lectures are always inferior. There have been many cases where a professor trying for a more interactive class has made the experience worse for me.

A foundation of the bloated academic and teaching industry is the claim that professors cannot teach more than a small number of students at once, or at least not teach them well, and that people taught in more expensive schools with smaller classes must therefore be better off. All claims about teaching and learning should be considered with that in mind, especially when they come from the industry itself.

That being said, interactivity with professors probably works better in most other settings than a lecture format.


This kind of thing is much more likely to happen in the discussion section of a large lecture class than in a small seminar, because the discussion section is likely taught by a graduate student with little prior teaching experience.


In my three student organic chemistry III class, the professor had each of us reading ahead and taking turns teaching the material ourselves. We'd be corrected for mistakes, of course, but it was very much a fire drill. We moved a lot of electrons.

That was quite the course.


I’ve had amazing large lectures and poor small classes. It always came down to the individual professor.

I actually think that putting a bad teacher in a small class makes it worse than a large one.


There are real issues with using class size as they do. If you have someone who's really good at teaching a particular class, it's better to have them teach one section of 60 students than to have a second section taught by someone who does a mediocre job. You also have things like TA help for larger classes, which doesn't factor in at all. There's also the discontinuity thing: 30 is way better than 31, but equally better vs 300, which obviously makes no sense. Overall, the measures they use for class size in these rankings are garbage.

The same is true for selectivity. It doesn't account for education at all, only the difficulty of getting in. But since it doesn't account for the applicant pool, it doesn't account for the difficulty of getting in either. You could use things like SAT scores as a measure of the difficulty of getting in and ditch the acceptance % completely. I work at an R1 that gets hit hard because we admit most applicants - because, you know, we exist to educate folks, not to serve as a signalling device.


>Anecdotally, you get better class material and TA staff in larger lectures.

Where is this anecdote coming from? A billion years ago when I studied at a small liberal arts college with small class sizes, professors were keen to share with us exactly why they were teaching from particular texts or why they had selected certain materials to make it clear that while the class wasn't being taught by that famous guy from MIT, it was still his book and materials we were using. I knew all my professors by first name, knew where many of them lived, had access to them anytime it was needed, and that's directly counter to my partner's experience at a large university.


Exactly this.

I went to a liberal arts college and had small class sizes (most around 9-15 in size, with the largest being 30) - I could see my professors 1:1 pretty much anytime I wanted, but there was also a preceptor (TA) for each class who didn't teach, but was available to answer questions related to assignments, projects etc. I went to the house of many of my professors for food or drinks too. Generally also was on a first name basis with professors.

The downside is that liberal arts colleges aren't research institutions and they are generally undergraduate only. This means you don't have as many choices for advanced courses. However, the professors at liberal arts colleges really tend to be there for the teaching and tend to enjoy it. One of my math professors was the head of the Mathematical Association of America and also authored the books on Analysis (the mathematical subject - not a data science thing) we were using.


Many professors at small liberal arts colleges still conduct research. And because of small class sizes, there is a good chance for a motivated student to get to know their professor and participate in that research.

There are also research consortiums of small colleges. I did original geological research 2 out of my 3 summers in a small college via the Keck Consortium:

https://keckgeology.org/


I did research for a professor as an undergraduate at a big university. Once you are out of lower division courses and into your upper division major classes, your classes are much smaller and the professors start to know you as you see them around a lot.


>The downside is that liberal arts colleges aren't research institutions and they are generally undergraduate only.

What's ironic about this though is how many universities won't accept their own undergrads into their grad programs... IE: they know that their undergrad programs are sub-standard but they don't seem to take action to improve them because the money and the focus is on the grad programs.


When I applied for grad school (long ago) they just said it was better to experience another department. The undergrad classes were similar enough to the grad classes that it was good to see a different approach. So if you had a family reason (spouse's job, etc.) that meant you needed to stay in the area, then they'd let you apply. Otherwise, they wouldn't write recommendation letters to our department.


It has nothing to do with the quality of the undergrad degrees. Even most top-tier programs suggest going somewhere else for grad school because each department has its own research culture and it is good to pick up on more than one.


My undergrad was at UMass Amherst, and I currently attend GA Tech's OMSCS program. Professors love sharing info on why they chose to talk about one topic vs. another, and by and large were happy to connect with any student motivated enough to attend office hours or talk to them after class. I knew some personal information about the professors I worked with in the lab, but I don't generally see how that would improve my learning outcomes.

Class size naturally dropped as you got into niche subjects, of which there were many!


What I've heard anecdotally bears out what you are saying, IE: once you get through various mega weed-out classes, in many universities you still get the luxury of smaller class sizes and more direct access to professors as you reach those upper-level classes.

Where I was asking for clarification was on the idea that somehow people were getting better materials and a better overall experience in the 500+ person lecture-hall experiences, rather than those intimate 20-person classes.


My point was that a large 500 person lecture on "Intro to ElectroMagnetism" will be about the same as a small 20 person lecture in terms of material. The variability will come from

- Honors sessions

- Extra seminars

- Quality of material

- Quality of student support groups

All of these things get better as the class size grows. In my last class at Georgia Tech, there were 900 students studying a niche topic from a particular professor. Due to class size, there were ~12 TAs and an extremely active Slack discussion group where people could get support.

A smaller class just means the professor has limited resources to assist students.


One factor I don't see anyone mentioning here is the student. Some people can do great in a large lecture and that's great. But I teach in a SLAC and often students tell me they don't think they could do well without the smaller courses.


Fewer students per teacher is widely seen as a positive thing. A greater percentage of the professor's time and attention can be given to any one student. I've definitely experienced this myself; it's easy to feel like just a number when you're in a large lecture hall of hundreds versus a small classroom of a few dozen.


I already said this elsewhere, but I do want to reply to this comment, because it demonstrates a common mistake that may not be clear from my other comment.

If you have a given teacher, in a given classroom, with a given amount of TA help, blah blah, so that all else is equal, and then you reduce the class size, the students will do better.

The problem is that all else is far from equal in the data used in these rankings. Instructor quality drops off rapidly as you have more sections. TA help falls. The classroom may no longer be appropriate for the class, etc.

The rankings are not able to capture the effect you're talking about.


If what you need is lectures, you'd download them from MIT OCW or watch them on YouTube or Khan Academy. Or possibly just read the textbook, which may also be free online, or you can download it from Library Genesis (if that's legal in your country), or buy it used for US$35. Also, I can strongly recommend Wikipedia and... visualization videos that aren't exactly lectures, like 3Blue1Brown stuff.

You wouldn't pay tens of thousands of dollars in tuition for lectures unless you were the kind of person that universities like Columbia try very hard to not admit.

Classes, however, are very valuable. You can ask questions when you don't understand things, you can talk with the other students to build shared understanding, you can get your homework graded and confirm that the things you weren't sure about on it were actually correct, you can take exams and test your skills. And those things are much, much better when there are 16 of you in the class rather than 128 or 512.


For liberal arts? It's probably a pretty interactive format and smaller interactive classes are better than large classes that are more broadcast. (Often. I've also had great lecturers in the humanities.) But mostly don't look at many liberal arts classes (or case study, etc.) through the lens of an engineering lecture.


It's a perfect example of reverse causation.

Great classes imply larger classes (pure supply and demand); larger classes don't imply great classes.


It's an interesting question, and that's probably why it ends up in so many econometrics and stats courses. You have the classic causality problem, along with ethical concerns of generating experimental data for such a study.

Controlling for as many of the relevant variables as possible, there does seem to be a strong negative correlation between class size and test scores. Beyond that, who knows, but it makes sense that small classes would succeed more often (at least up to a certain point, of course).
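A toy simulation of the confound in question (every number here is invented): if better instructors tend to be given the bigger sections, the raw class-size/score correlation can come out strongly positive even when the true causal effect of size on scores is negative.

```python
import random

random.seed(0)
rows = []
for _ in range(1000):
    quality = random.gauss(0, 1)                                  # instructor quality (unobserved confounder)
    size = 100 + 40 * quality + random.gauss(0, 10)               # better teachers draw bigger sections
    score = 70 + 5 * quality - 0.02 * size + random.gauss(0, 2)   # true effect of size on score is negative
    rows.append((size, score))

def corr(xs, ys):
    # Pearson correlation, stdlib only.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

sizes, scores = zip(*rows)
print(corr(sizes, scores))  # strongly positive, despite the built-in negative causal effect
```

This is exactly why "controlling for" instructor quality matters so much in the observational studies, and why rankings built on raw class-size counts can't capture the causal story.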


But...none of those studies justify a discontinuity at a particular value. And if they control for things like TA help, they cease to provide useful information, because in the real world TA help increases with the class size.


Many courses rely on discussion!


One thing is closer interaction with faculty. Crucial if you want to apply for grad school and need letters of recommendation.


1) they aren’t rewarded for innovating

2) I guess they can, actually, by having massive classes in the spring and comparing to the fall.


Gaming and manipulation should absolutely be addressed, but it seems Columbia was just straight up lying about their numbers, not gaming them.


Goodhardt’s law in action.


Goodhart's Law

The key problem of using a measure as a metric is sometimes referred to as Goodhart’s law. The law is based on a 1975 paper on economic regulation, and is typically paraphrased as “When a measure becomes a target, it ceases to be a good measure.” When managing a system, this problem is critical, but is deeper than it at first seems, as I’ll argue.

https://www.ribbonfarm.com/2016/06/09/goodharts-law-and-why-...


Ctrl+f for "Goodhar" right away to see who called it first +1


What's that adage? Every metric that becomes an optimization target ...


I FOIA requested a lot of universities and man do they fight to keep their stuff secret.

https://austingwalters.com/foia-requesting-100-universities/

Often I’d find the person in charge on linkedin and email them directly (work email obviously). And dig around until I found the particular person.

UIUC put up a particular fight when I FOIA requested information about their post-graduate surveys. They’re definitely manipulating numbers there. I was in the process of filing a lawsuit when covid hit and gave up because I wasn’t sure if I’d need the funds (plus the courts were shut down at the time).


> I FOIA requested a lot of universities

FOIA only applies to federal government agencies; universities (other than ones operated by the federal government, like the service academies) aren't subject to it.

State universities might be covered by state sunshine laws that are similar to FOIA, and generally the laws and policies around those only require a request from the public and aren't too particular about form or citing a completely wrong and irrelevant legal basis, so in that case an “FOIA” request might actually have some legal effect. But most private universities would just see it as a particularly pompously written discretionary request.


No, there are definitely FOIA laws that apply to public universities. The post was calling out UIUC and this is their web page about it [1]

But I am really curious about the argument against the requested survey data. The poster linked to his web page, which specifically lists UIUC as being exemplary and providing data without even the need for a formal FOIA request.

[1] https://www.uillinois.edu/erc/foia


With the data I’d be able to prove a degree isn’t worth it, which was what I was planning on building: enter your degree and high school GPA, and we could predict the likely outcome.

I’m sure they knew that lol

I requested: country of origin, gender, degree, gpa, salary range, field of work and a couple other fields I forget now.


I have a friend who works on Duke University’s marketing analytics and performance metrics team. It’s filled with MBAs whose job is to massage the numbers and metrics that US News asks for every year and provide them to US News.


It's probably staffed with MBAs to boost the job placement numbers for their MBA program.


username checks out ;) you guys have a collab program with Duke


Funny how this reminded me of a PG essay[1]

    The most damaging thing you learned in school wasn't something you learned in any specific class. It was learning to get good grades.
[1]:http://paulgraham.com/lesson.html


The problem with that quote is that if you don't learn to get good grades (e.g., because you are actually learning and not studying for the test), then the resulting bad grades can be extremely damaging to your future.


How is that a problem with the quote? It sounds like a problem with the system, a problem the essay correctly identified.


Ok. Now fix the system in a timeframe that is compatible with your expected working life.

That's the problem with the quote. Grades may not be the most important thing, but they can be, and often are, the most important thing for any given individual. For every Bill Gates that dropped out, there are a million John Does drowning in student loan debt with no degree and limited future prospects to show for it.


Who are you asking to fix the system, and why? Me, maybe? Paul Graham? gofreddygo?

How is the thing you say a problem with the quote? These can both simultaneously be the case, and often are:

1. The most damaging thing John Doe learned in school was learning to get good grades.

2. Richard Roe didn't get good grades and is drowning in student loan debt with no degree and limited future prospects to show for it.

You have offered no reason to believe that if (2) is more true, or true in more cases, (1) is less true, or true in fewer cases, or less important. In fact, this tighter conjunction can also often be true:

1'. The most damaging thing John Doe learned in school was learning to get good grades.

2'. John Doe didn't get good grades (despite being damaged by learning how, he didn't learn how enough, or other things interfered) and is drowning in student loan debt with no degree and limited future prospects to show for it.

You've also offered no reason to believe that if (2') is more true, (1') is less true or less important.

I have an idea! Maybe you're misinterpreting the quote as meaning that, if you're in school, it would be a good strategy to stay there but not learn to get good grades. In that case there would be a sort of conflict: by diminishing the damage from (1') you may be increasing the risk of (2').

But that isn't what the essay advises at all, and I'm puzzled as to how you could have read it as suggesting that. It explicitly advises students to do exactly the opposite; it advises them to get good grades by hacking bad tests in the usual ways (cramming before exams, etc.) because everyone will judge them by their grades.

But that's not really the point of the essay, and it would be a pretty boring essay if it was. The point of the essay is its advice to people who aren't enrolled as students: unlearn how to get good grades, and instead of focusing on how to make your product look good, actually make a good product. Also, and maybe more importantly, try to work in fields of endeavor where the way to win is to do good work, rather than hacking bad tests like you do in school. (This might suggest not going into school at all, which would at least avoid the student loan debt problem.)


The essay could be summed up using the following quote from Ghostbusters (1984).

"Personally, I liked the university. They gave us money and facilities, we didn't have to produce anything! You've never been out of college! You don't know what it's like out there! I've WORKED in the private sector. They expect results."


That quote is related to the essay in that both the quote and the essay discuss incentive structures in academia and industry, but that is where the similarity ends; they say completely different things about those incentive structures.


Its title says "UN"learn. Suggesting it is learned by default for a student.

Going to college optimizing for "learning" and to gyms for "fitness" are poor uses of time and resources.

The essay is just pointing out that the strategy of hacking tests for grades stops working. You need to improve on it in startup land or in the kingdoms of BigCo.


My kid just went through college selection process and I can assure you that the US News rankings are largely irrelevant in terms of decision making. The revealed preference score[1] or global university rankings[2] are much closer to how students, their counselors, and their parents rank the universities. For instance, any ranking that does not have Stanford first among non specialized US universities (i.e. excluding MIT, Caltech, Julliard, etc.) is suspect.

[1] https://www.parchment.com/c/college/college-rankings.php?pag... [2] https://www.topuniversities.com/university-rankings/world-un...


I find revealed preference to be very useful in many situations, but a ranking that has Harvard just below Swarthmore, at #61, does not pass the smell test. I say this as a graduate of Swarthmore — I cannot take seriously a list with Harvard so far down.


I work at a university, and honestly have never heard of the ranking that you referenced in [1]. It also seems super weighted toward military academies. What's their model? Are kids in the States now looking at more global options?


The revealed preference rankings use co-admittance data (student got into Harvard and Stanford) along with that student's final choice (went to Stanford) to rank the schools. The military academies will have a huge edge because their decisions come out later and most students are highly committed once they continue their application to that point (they've gotten congressional recommendations, passed physical exams, etc.). So basically throw those out.


>What's their model? Are kids in the states now looking at more global options ?

According to their methodology[1] page, it looks like for students who get multiple offers they do an Elo-style ranking. The first problem, which probably explains the weight given to military academies, is that students tend to self-select into tiers of colleges: a group of students interested in the Air Force might not even apply to Harvard (even though the USAFA is a competitive school), and upon admission they opt to go there (after all, if you were even considering joining the Air Force, why would you reject your one shot there to go to any other school?).

[1] https://www.parchment.com/c/college/college-rankings.php
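For anyone curious what "Elo style" means here, a minimal sketch (the K-factor, starting rating, and update rule below are standard chess-Elo assumptions on my part, not Parchment's published parameters):

```python
# Each record is (chosen, declined): the school a co-admitted student enrolled
# at, and the school they turned down. All parameters are assumptions.
K = 32
ratings = {}

def expected(ra, rb):
    # Modeled probability that a student picks the first school over the second.
    return 1 / (1 + 10 ** ((rb - ra) / 400))

def update(chosen, declined):
    ra = ratings.setdefault(chosen, 1500)
    rb = ratings.setdefault(declined, 1500)
    ea = expected(ra, rb)
    ratings[chosen] = ra + K * (1 - ea)      # winner gains
    ratings[declined] = rb - K * (1 - ea)    # loser drops by the same amount

for chosen, declined in [("Stanford", "Harvard"), ("Harvard", "Yale"), ("Stanford", "Yale")]:
    update(chosen, declined)

ranked = sorted(ratings, key=ratings.get, reverse=True)
print(ranked)
```

The self-selection problem described above shows up directly in a scheme like this: a school only ever "plays" the schools its admits also got into, so a service academy that wins nearly all of its few, highly committed matchups can climb without ever being compared against most of the field.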


So this would favor specialized schools in general?


> For instance, any ranking that does not have Stanford first among non specialized US universities (i.e. excluding MIT, Caltech, Julliard, etc.) is suspect.

LOL wut? Stanford is modestly well known, but obviously Harvard is the one that is unambiguously first.


I'd give Stanford a bit more credit than this, but people I know who were admitted to both all went to Harvard. Stanford is surely the most prestigious general-purpose university west of, well, Harvard.


> Stanford is surely the most prestigious general-purpose university west of, well, Harvard.

Is that no longer Yale? From the perspective of someone who had a rather poor exposure to universities during my high school years—so, precisely what someone gets from pop-culture and osmosis without anyone in particular acting as a guide—my impression was that Harvard and Yale were unambiguously the Big Two, and everything else fell somewhere after them in terms of recognition and prestige.

Is this a CS perspective? I hear a lot about Stanford's CS program so I assume it's notably good.


I've seen a few places that Stanford and Harvard are close to 50/50 on cross admits now, and that Stanford has even passed Harvard a few times (which no school has ever done). Likely a lot to do with press around Stanford students becoming rich Silicon Valley founders. My original point is that there has been a lot of changes in how students perceive schools that may not get reflected in the US News ratings.


Interesting! My impressions were formed some time ago, so I'm not surprised they're out of date. Thanks.


I think Stanford is viewed as having eclipsed Yale and is competitive with Harvard. Princeton probably also eclipses Yale.


Insider who worked with a lot of schools: Yale is a much, much better school than Stanford. Stanford beats Yale on hype.

A lot of the elites are hype-over-substance, and it's often hard for a naive audience to screen out which ones.


It makes more sense to me that east-coast schools would show an artificial advantage on "revealed preference" type rankings, because more Americans live closer to the east coast schools. They're more "locally" well-known, you're closer to home (okay, a disadvantage for some), etc. I'd expect them to benefit more from hype than Stanford would.


And they're older and have had longer to work their way into the culture. Many of the eastern Ivy League schools are older than the USA. Some are even older than the state they're in (in the east, unlike in the west, states are often a little older than the USA itself, rather than a little newer).


Stanford certainly beats Yale in CS quality both in terms of peers and instruction.

I have significant experience with HYP, less so with Stanford. I don't think I am a naive audience; I know many people from these schools and have attended classes, either due to being a student there or via online videos in the case of Stanford & Princeton.


True, but CS isn't everything. Stanford is definitely much more of a tech school than Yale.

I'd never have gone to Yale because it was a poor fit for my interests. That's different from being a poor school. Overall school quality is less important than fit. Given my interests, I would have picked Stanford over Yale (and in turn picked Caltech, CMU, or Georgia Tech over Stanford).

That said, Yale is a phenomenal experience. It's very academic in the classic sense -- deep, philosophical discussions, and thoughtfully exploring topics in a way that's increasingly rare in the fast-paced elites. It gives a lot of what people assume they'll get out of a small liberal arts college (which is different from the reality; most small liberal arts colleges are expensive and horrible).

I have little knowledge of Princeton. I know both Harvard and Stanford in great depth.


Yeah, there’s some “new gold rush money” thinking going on in this thread. Harvard has had 5 US Presidents. Yale has had 3. Stanford has had just one. Not a single Supreme Court justice went to Stanford. 4 went to Yale and 4 went to Harvard.

Stanford tied with Harvard for 6 Senators in the 117th Congress, impressively, but obviously Harvard dominates that metric over a longer time.

Harvard still has more than twice as many billionaire alumni notwithstanding how many billionaires these days are in the tech industry.


Not taking a side on this, but those are all heavily lagging indicators. IE, those are all almost 30 years after college. The current ranking can be highly different from what it was a generation ago.


You are quite wrong about the Stanford supreme court justices: Sandra Day O'Connor and Stephen Breyer


He's talking about Law School of course - that's where promising students are drafted into the Supreme Court clerk position that's the first step on the cursus honorum to becoming a Supreme Court Justice. Undergrad doesn't matter as much except inasmuch it gets you into the prestigious law schools.


Um, Sandra Day O'Connor went to Stanford for undergrad and law school.

Also, this is a thread that's largely about undergrad, so I'm not sure Rayiner was talking about law school.

There are different rankings for law schools, and Yale has been considered the most prestigious for many years. I think SLS nabbed the top spot various times in the last decade, but Harvard is definitely not the uncontested leader in law schools.


Sorry I meant current Justice. I thought that was clear from the count (4 Harvard, 4 Yale, 1 ND).


You're forgetting Princeton, which is almost always grouped together with Harvard and Yale to form "the Big Three"[1]

[1] https://en.wikipedia.org/wiki/Big_Three_(colleges)


Fair point. I would guess there would be some people who would choose Yale over Stanford, and vice-versa. It probably depends mostly on which coast you want to be on, and if you prefer a very old institution or one that is much younger.


Yale also has a beautiful campus and a nice Oxford-style college/residential system. I didn't attend, but was somewhat jealous.


I think it's easy to underestimate the impact of rankings. Just from your description, it seems that one of the most important overall decision factors for you seems to have been some sort of vague personal sense of how you'd rank them. That subjective ranking is probably influenced by a lot of factors, including many rankings that you consciously claim to disregard.


My personal opinion is that students are highly influenced by where the best students in the grades above them chose to apply, where they got in, and where they eventually chose to go. The software the students use also ranks how likely they are to get into each college/university based on the past performance of students at their high school with similar grades and test scores.


That might be the one thing I would do over again: move to a "better" HS for my kid, especially since he missed out on 1.5 years due to Covid anyway. His HS ranks top 15 in our state in college readiness scores but falls out of the top 50 overall, since there are also a lot of poor students who are lucky to have a parent at home at all and often are the surrogate parent for their single mom or dad. Basically, the top students are about as strong as any school in the area, but the reputation gets dragged down by the kids who were never on the college track due to life circumstances and a limited support network.

As a result, if you have your sights set on anything other than the two in-state schools or BYU (strong Mormon influence here) the counselors don't know what to do with you and you are on your own.

It worked out and I have no doubt my kid will have strong opportunities post-college but he missed out on any chance at the truly life-changing schools, which I have no doubt he would have succeeded at had he gotten in.


It’s really weird Cambridge is ahead of Oxford in world rankings.

Cambridge has a reputation for strongly preferring UK students.


Cambridge has traditionally been ahead in science and mathematics (e.g., it's where Newton was a professor). On the other hand, Oxford until the 1800s was pretty much a seminary and theological college. As a non-Brit, I still get the impression that Oxford is where you would find the best philosophers and other humanities experts, while Cambridge is still slightly ahead in STEM


Hmm ... Cambridge had Russell and Wittgenstein, e.g., you have that bass-akwards wrt Philosophy departments.


I took an Econ class with Northwestern’s president, and he said that they straight up lie about their middle-50% GPA and standardized test scores. Can’t say I was super surprised. Every top-25 school is claiming 25% of their students got a 35 or better on the ACT; the numbers just didn’t add up.


Maybe it is because they "super score" (ie. take the best of each section if they take it multiple times).


Morty? He was getting pretty chatty nearing retirement.


Yea the class was basically him gossiping


This is a good example of Campbell's Law at work: https://en.wikipedia.org/wiki/Campbell%27s_law


Thanks for sharing. Sounds very similar to Goodhart's law:

https://en.wikipedia.org/wiki/Goodhart%27s_law


An obvious takeaway in this case: US News shouldn't be the primary ranker.

If statistics were distributed in open format, transparently, and there were multiple rankers (each with their own weights and mixes), universities would be less incentivized to game "the US News metrics".


It already exists and is a free US Dept of Education service. Schools have to report that data to the Dept of Ed already, so they have both an interactive search tool [1] and the raw data available [2].

[1] https://collegescorecard.ed.gov/search/?page=0&sort=threshol...

[2] https://collegescorecard.ed.gov/data
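Since the raw Scorecard files are plain CSVs, a reader can roll their own ranking with whatever weights they care about. A stdlib sketch with made-up rows; the column names (INSTNM, ADM_RATE, MD_EARN_WNE_P10) are my recollection of the Scorecard data dictionary and should be checked against the current one:

```python
# Tiny stand-in for rows from the College Scorecard institution-level CSV.
# Column names are assumed from the Scorecard data dictionary; the
# schools and figures below are entirely invented.
rows = [
    {"INSTNM": "School A", "ADM_RATE": 0.05, "MD_EARN_WNE_P10": 95000},
    {"INSTNM": "School B", "ADM_RATE": 0.60, "MD_EARN_WNE_P10": 62000},
    {"INSTNM": "School C", "ADM_RATE": 0.90, "MD_EARN_WNE_P10": 48000},
]

# Rank by outcome (median earnings 10 years after entry) rather than by
# selectivity, which is the substitution the parent comments argue for.
by_outcome = sorted(rows, key=lambda r: r["MD_EARN_WNE_P10"], reverse=True)
print([r["INSTNM"] for r in by_outcome])
```

Swapping the sort key to `ADM_RATE` would reproduce a selectivity-style ordering, which is exactly the kind of weighting choice that should be the ranker's decision, not the data provider's.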


Looking at [1] just now, it seems overwhelmingly oriented toward answering the question "How do I get in?". Scarcely at all toward "What can the school deliver to the student?" -- only relevant measure is Median Earnings.

The fine print:

>The median annual earnings of individuals that received federal student aid and began college at this institution 10 years ago, regardless of their completion status.

So for the anxious parent, not a lot to go on.


That seems naive to assume. We already live in a world where there are multiple rankings, and many of them (even some of the most important ones, like QS or Shanghai) rely only on independently collected or public data.

We already have what you suggested as a solution, and look at where we are. The problem is more fundamental.


It’s like doping in cycling. Everyone was getting caught but we’re supposed to believe the couple who snuck through weren’t like the rest. Once the tests got good enough there was hardly any one left standing.

https://en.m.wikipedia.org/wiki/Doping_at_the_Tour_de_France



this is also paraphrased in the paper by the Columbia professor.

> Almost any numerical standard, no matter how closely related to academic merit, becomes a malignant force as soon as universities know that it is the standard. A proxy for merit, rather than merit itself, becomes the goal.


Which raises the question: if you want to reach the top of the rankings, wouldn't you simply design the school around the objective ranking parameters and nothing else?

Ripe for abuse...


Not sure if this is still true, but at one point, the biggest factor in the ranking was peer reputation rating. That kind of results in a self-reinforcing vicious cycle of top schools remaining top schools, but short of bribery, there isn't really any way to manipulate that at least.


This is sort of what Columbia did… they juiced their faculty numbers by counting the med school which has much lower class sizes and more educated teachers.


I'm really curious what the personal cost to that faculty member is. Surely this report was not well liked by the administration.


From the conclusion of that professor's article:

> No one should try to reform or rehabilitate the ranking. It is irredeemable. In Colin Diver’s memorable formulation, “Trying to rank institutions of higher education is a little like trying to rank religions or philosophies. The entire enterprise is flawed, not only in detail but also in conception.”

A lot of HN commenters here are trying to "fix" the details of the rankings.


Adam Ruins Everything, Season 2, Episode 7 [1]

1: https://youtu.be/EtQyO93DO-Q


For context, U.S. News and World Report used to be a trusted periodical that reported national and international news. In the 80s they started ranking colleges and selling the rankings as a separate publication, which turned out to be pretty profitable, so they repeated the trick by ranking hospitals (1990), cars (2007), and then states (2017). The news business went online-only in 2011, then shuttered in 2015. Since then they've just been a rankings company, but they kept the name to trade on their reputation.


U.S. News and World Report was a highly respected & trusted news magazine in the 70s-90s, but its online incarnation now is just another clickbait site (similarly for Newsweek).


They’re the #1 best rankings company.


According to US News and World Report


They sort of put themselves on the map with college rankings starting in 1983, but I would hardly say it’s a fair ranking system. For example, they love to tie multiple schools together: 2x 2nd place, 2x 6th place, 4x 9th place, 3x 14th place, 2x 17th place, and so forth.

Here’s a classic case where if you do something long enough, people will trust you.


I remember reading the paper magazine for school in the 1990s. Teachers would assign parts as homework sometimes. It was on that level of trusted journalism

I don't know what the equivalent today would be. The Reuters website maybe?


I had first heard of them because Zork had a US News and Dungeon Report magazine, and it was such a specific name I knew it must refer to something in the real world. I never did get a chance to read the actual magazine, though.


Wow fascinating - so US News is now a rankings company.


If you think of clickbait as a crowdsourced ranking of emotionally manipulative hyperbole, that could be said of any corporate press rag.


US News was a clickbait company before there were clicks to bait.


I feel like it would not be possible to sell such a static and easily leaked thing as a ranking list


There's a very interesting article "How to Game the College Rankings" about how Northeastern University's president focused on improving the university's ranking. In 1996, it was a "third-tier, blue-collar, commuter-based university" rated #162. The new university president had a singleminded goal: to improve the ranking. He got the university into the top 100 in 2006 and into the top 50 by 2013.

https://www.bostonmagazine.com/news/2014/08/26/how-northeast...


Now the interesting question: did Northwestern^H^H^H^H^H^H^Heastern get meaningfully better for students between 1996 and 2013?

Did they graduate in higher numbers? Did they have better career outcomes? Did more get into better(?) grad schools? What is their median income? Are they happier, did they enjoy college more?


Just a note that this is Northeastern (in Boston) not Northwestern (in Illinois). Northwestern was and is typically a very high ranker.

As for what the improvements are like... hard to say. Some ways to improve the rankings like getting more students you don't want to come to apply (so you can appear more selective) don't actually help at all, while things like small classes even if done to "game" the rankings could actually help.

There's also just the cyclical aspect of things: the quality of your experience is going to be significantly determined by the quality of the student body, so if you have a better applicant pool because of your better rankings you maybe have a bit of a virtuous cycle that's somewhat disconnected from whether the things you changed actually improved anything real on their own other than the ranking.


What about North Central?


Almost certainly yes, because the school could presumably attract a higher-quality applicant pool in 2013 than in 1996. This, of course, makes it impossible to tell how much of that value was produced by the university and how much was just selection effects.


From first-hand experience, I can say that quality of selection makes a big difference in your university experience. My first university was /modestly/ selective. My second university was not-at-all selective. (I was a terrible university student.) The difference in class discussion and group projects cannot be overstated. Students from the more selective university were much more intellectually engaging.


Most colleges have the ability to graduate close to 100% of their students.

If a school only admits students who have at least two years at another college with high grades, their graduation numbers will skyrocket.

A highly selective college, like Harvard, could even refuse to grant credit for those two years.

There are less nefarious ways of selecting a class of likely graduates: leadership in extracurriculars is highly correlated with post-secondary graduation.

Foreign students with limited English fluency have very low graduation rates. They could (should?) be excluded.


bit OT but I'm curious what a college that ONLY selects extracurricular leaders would look like. This is coming from a software job where career advancement depends on being the lead in projects. IME people are mostly chill and happy to do the "grunt" work, but what if EVERYBODY wanted to be "team lead"?


West Point, Naval Academy, etc have a reputation for recruiting extra curricular leaders. And have good graduation rates.


> West Point, Naval Academy, etc have a reputation for recruiting extra curricular leaders.

Those are officer training schools, so it makes sense they're ostensibly selecting for leadership skills, because they'll be overseeing enlisted recruits.


>Did they graduate in higher numbers? Did they have better career outcomes? Did more get into better(?) grad schools? What is their median income? Are they happier, did they enjoy college more?

And make sure you control for your inputs. It's harder for the #2 campus of state school to wring "good outcomes" from the children of plumbers and truck drivers than it is for big brand name schools that have plenty of kids of doctors and lawyers in there to drag their average up because even if you do a middling job educating them and have squat for industry connections they will still land internships and go on to get good jobs.


Was it that hard to read and get my alma mater right? ;)

In my opinion, as a student who entered Northeastern in the mid-2010s, the answer to all your questions is an unequivocal yes. More competitive student body, better profile to attract employers, more student amenities to enhance on-campus life (we could have more in this regard, imho). Compared to the commuter school of yesteryear, it's for sure a much better educational experience.


"Better profile to attract employers" -- this is very important in the United States. If you are a good student from a below-average university, your job prospects are /much/ worse!


I actually think that Northeastern is really good at this even relative to its current peers in the rankings. I have no affiliation but just being around the Boston area I hear many good things about their co-op program.


I've worked with a number of Northeastern University grads, and they are consistently well prepared. NEU has a co-op program as a standard part of their undergrad, where students get about a year of work experience before graduating (a 5-year undergrad is the norm). Who knows if the school is what creates the difference, but grads are really well prepared in general.


Subjectively, I imagine that it did. Northeastern has had a good reputation for as long as I've been aware of it.


Northeastern.

Northwestern was already relatively prestigious.


Northwestern also - by the way. The president's goal when he joined was to break into the top 10.


I was at NEU from 2007-2012. Aoun, who became president in 2006, was not popular amongst the students. It was pretty apparent that he was making changes simply to make things appear better on paper, and some of them actually made the students really angry. A big one that I remember was that they eliminated the mandatory co-op requirement (which means most students graduate in 5 years instead of 4), which is a core tenet of NEU's philosophy!

The changes NEU made between 1996 and 2006 were seemingly genuinely good for students. My dad had graduated from NEU in '86, and he was really impressed with how far the school had come. But I wouldn't say anything that changed while I was there had any real impact on my experience, so my gut instinct says they were more gamey than anything.


speaking as an as-of-this-year Northeastern CS graduate

the university has definitely done things to improve its rankings over the past ~30 years, some legitimately good (hiring higher-quality staff, putting more emphasis on research, building more student accommodations like on-campus housing, expanding the co-op program) and some less-than-savory moves (the NUin program, which admits more freshman students without having to record their statistics in the incoming class record, and international campuses to "build the brand"). I think at least the CS and co-op programs are great, and - minus the cost - it is overall a great school. They definitely had the aim of improving the ranking, but in doing that they did improve the quality and size of the school along many important fronts. My read, then, is that USNews still has use in measuring universities, up to a certain point (say top 50 or top 100)


#49 now. I guess it's harder to game the system past a certain threshold.


The beginning of the conclusion of the original study [1] is worth repeating:

No one should try to reform or rehabilitate the ranking. It is irredeemable. In Colin Diver’s memorable formulation, “Trying to rank institutions of higher education is a little like trying to rank religions or philosophies. The entire enterprise is flawed, not only in detail but also in conception.”

Students are poorly served by rankings. To be sure, they need information when applying to colleges, but rankings provide the wrong information. As many critics have observed, every student has distinctive needs, and what universities offer is far too complex to be projected to a single parameter. These observations may partly reflect the view that the goal of education should be self-discovery and self-fashioning as much as vocational training. Even those who dismiss this view as airy and impractical, however, must acknowledge that any ranking is a composite of factors, not all of which pertain to everyone. A prospective engineering student who chooses the 46th-ranked school over the 47th, for example, would be making a mistake if the advantage of the 46th school is its smaller average class sizes. For small average class sizes are typically the result of offering more upper-level courses in the arts and humanities, which our engineering student likely will not take at all.

[1]: http://www.math.columbia.edu/~thaddeus/ranking/investigation... (section 8)


> Trying to rank institutions of higher education is a little like trying to rank religions

Did anyone try to make a ranking for religions similar to those for universities? Sounds like it would be a fun project making a great point.

Measures like "Fees", "Diversity", "Alumni Salary", "Financial Aid Provided" would all be interesting to see for each religion.


I’d love to see that too. Mormonism for example would rank very high in “Alumni Salary” and “Quality of Alumni Network”, but low in “Diversity”, “Fees”, and “Amount of Bullshit You Need to Swallow”.


Here's the response:

Universities are mostly unimportant in terms of what they offer: class sizes, education, curriculum, etc... All mostly bs that doesn't matter.

The real role of universities is to gather together smart people as they develop. This requires mostly a sort of self-selection of applicants, who need to agree independently to go to the same university. Hence rankings, prestige, and all that nonsense.


This is just the wrong way to look at it. Clearly, there is real demand for rankings by students. No one is stupid enough to think that there is some real difference between #47 and #48. But obviously #47 is very different from #26.

Just because you can't get an exact measurement does not mean that a metric does not exist or is not useful.


I would argue that the difference between 47 and 26 comes down largely to field of study or cost of attendance.

The top 10-15 offer an almost indisputable advantage with the top 5 or so being a tier unto itself. Anything outside of those groups is largely "it depends" and 50-100 forms another tier where total cost of attendance largely dictates whether one school is "better" than another.


I couldn’t agree more. The schools I got into were ranked something like 12, 13, 25, and 30. I went to 12 for only that reason and always regretted it. Was it my own dumb fault? Of course. I was 17.


I've been looking at the bias in rankings for a little while. I think one way to identify and raise awareness of the biases, is just put rankings together side-by-side. I did this for computer science programs, and there's some interesting differences that I noticed:

https://jeffhuang.com/computer-science-open-data/#bias-in-co...
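One cheap way to quantify how much two published rankings agree when you put them side by side is a rank correlation. A minimal sketch using Spearman's formula; the school labels and positions below are made up purely for illustration, not taken from any real ranking:

```python
def spearman(rank_a, rank_b):
    """Spearman rank correlation between two rankings (dicts of school -> position).

    Assumes both rankings cover the same schools with no ties.
    """
    schools = sorted(rank_a)
    n = len(schools)
    # Sum of squared differences between each school's two positions.
    d_sq = sum((rank_a[s] - rank_b[s]) ** 2 for s in schools)
    return 1 - (6 * d_sq) / (n * (n ** 2 - 1))

# Hypothetical positions of five schools in two different rankings.
ranking_x = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}
ranking_y = {"A": 2, "B": 1, "C": 3, "D": 5, "E": 4}

print(spearman(ranking_x, ranking_y))  # fairly high agreement despite local swaps
```

A correlation near 1 means the two lists mostly agree; large divergences in the per-school differences are exactly the "interesting differences" worth investigating for bias.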


The focus on best paper awards is odd as the major conferences of some CS subfields dole out awards as if they were party favors, and those of other subfields don't have best paper awards at all.

For instance SIGCHI 2021 had 28 best papers out of 747 accepted papers (or 3.7%) whereas CVPR 2021 had one best paper out of 1660 accepted papers (0.06%).

I have no opinion about whether it's "better" to be stingy or generous with best paper awards. But obviously any kind of ranking that doesn't account for differences between conferences and subfields is going to be quite suspect.
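The size of that gap is easy to sanity-check with quick arithmetic (the paper counts are the ones quoted above, treated as approximate):

```python
# Best-paper award rates at two 2021 conferences, per the counts above.
sigchi_rate = 28 / 747    # SIGCHI 2021: 28 best papers of 747 accepted
cvpr_rate = 1 / 1660      # CVPR 2021: 1 best paper of 1660 accepted

print(f"SIGCHI: {sigchi_rate:.2%}, CVPR: {cvpr_rate:.2%}")
print(f"rate ratio: {sigchi_rate / cvpr_rate:.0f}x")
```

So per accepted paper, a SIGCHI best-paper award was roughly 60x more common than a CVPR one that year, which is why counting awards across subfields without normalization is so distorting.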


Fair point, it's something I'm aware of and it's a bit intentional to counterbalance the "normalization bias" already done in other rankings, if that's what you meant by "account for differences".

To put it another way, to count such that a CVPR best paper is worth 28 times a CHI best paper is also quite suspect. There's a rabbit hole you can go down to find the best conference to submit to, where the submission-to-best-paper ratio is low, to optimize for this.

PS: note that there's an upper bound to best paper awards, which is "< 1% of submitted papers".


This is completely true of undergraduate studies. There is a very real reason to think that department (not university) rankings in graduate studies matter.


In the same way that one restaurant flouted Yelp ratings[1], couldn't all the Ivy League schools just refuse to participate in US News' annual rankings? What would happen if Harvard, Yale, Princeton, Columbia, et al. decided that this ranking is not helpful, so they won't supply any info?

Readers of US News would more likely start to lose trust in the rankings rather than move their assessments of the Ivies downward, right?

[1] https://arstechnica.com/information-technology/2014/09/why-t...


These rankings consistently tell people that Harvard, Yale, Princeton, etc are the best schools in the country. Why would they boycott rankings that praise them?


Because it puts other 'less desirable' non-elite schools in the same lists for top twenty/thirty/forty, etc.


The Ivies rank 1, (UNLISTED), 2, 5, 8, 13, 14, and 17. All of the "less desirable, non-elite schools" are listed BELOW these schools, which keeps them looking good.


Is that even really a negative?


But it’s a prisoner’s dilemma situation, right?

If none of the Ivies are on US News, well, it’s probably a shitty ranking.

But if e.g. just Penn falls off the list… most people are just gonna assume that Penn got worse.


This is why it is really just Harvard (likely Stanford, Yale, and Princeton too, but definitely Harvard) that can make a statement boycotting the list, just like a billionaire can get away with wearing a plastic $10 watch but mere millionaires feel the need to wear a Rolex to show off their money. In fact, it wouldn't surprise me if Harvard did the least to game the US News metrics of any top 25 university, because the US News staff would just tweak their formula to keep Harvard in the top 5 regardless.


Harvard doesn't need to game the rankings because Harvard grads own the friggin' board. The rankings will fall in line accordingly.


Yeah, they should stick all the Ivies, MIT, etc. in a top "non-rankable" rank. Like, realistically a student will be happy to go to whichever one they get into, and if someone gets into multiple, they probably won't pick based on position in some list.


> if someone gets into multiple, they probably won't pick based on position in some list

Not trying to be contrarian, but I had two friends literally pick between Ivies based on this very list (in 2015).

It’s generally accepted as “The List” by a lot of people. If you’re a parent with no other frame of reference, The List has a serious impact.


I’ll chime in with contrary anecdotal experience having gone to a high school with lots of selective college placements. I didn’t know anyone who chose between two elite schools based on their relative US News ranking.


Mine neither. Maybe it’s because they’re the prep school set, but what turned out to be the tie breaker for my multi-Ivy admit friends was the campus visit. I remember one who was adamant about getting in shape in college and chose the one where the freshman dorms were closest to the campus recreation building.

If you’re of a certain background, it’s really your mom or dad’s alma mater, and then one of a few “perfectly acceptable, fine schools.”

I think the difference between say #4 and #12 in a given year exists in the minds of middle class strivers. I went to a public university and a private one for undergraduate and graduate school, and at the top levels it really comes down to the professors on an individual level and perhaps the department, more than the institution itself.

And in any case, if you pay attention to the finer movements in the rankings, schools rarely keep the same position between publications. It’s just the same ones playing musical chairs with each other year after year.


I find this hard to believe. Obviously everyone had more important considerations, but I know a few people that got full rides to a few top schools and absolutely chose based off rankings.


I'm saying among people I knew, if you got into #4 and #10, for example, you didn't pick #4 because it was higher rated. At that point the difference came down to other things.

If you got into #4 and #40, then yes, you were going with #4.


Call me crazy but... Perhaps there are people who did, and others who did not.


These rankings absolutely influence people's decisions.


I didn't say that they don't. I'm saying among people I knew, if you got into #4 and #10, for example, you didn't pick #4 because it was higher ranked. At that point the difference came down to other things.


I got into 2 & 4 and even then it had some impact on my decision.


Well, that's more direct experience than I have, at least.

In any case, I bet Columbia will keep getting more really high quality applicants than they have seats.


If you get to the point where you are choosing between Harvard, Yale, and Princeton, you may as well choose based on the color of the curtains in your dormitory because it really isn't going to matter.


What happens if your pick drops in ranking by the time you graduate?


Not much. Most peers and hiring managers stopped paying attention to these ratings when they got into college, so the market's perception of your degree is some blended average of the programs' ratings over the past ~40 years. By the time rankings after you matriculated represent a meaningful portion of the average, your alma mater is no longer a particularly relevant part of your resume.


I know someone who switched law schools after the rankings came out, because #1 changed.


If I'm not mistaken the news that Columbia "won't participate" in the rankings came a bit sooner[^1], but if anything that struck me as a "you can't fire me, I quit" sort of preempting of what was inevitable.

Honestly though, college rankings are toxic, as is this whole prestige economy we've built around them. I don't even really care about colleges faking numbers so much as other shady practices, like sending students false promises to get their hopes up - pretty much for the express purpose of rejecting them down the line. As any unfortunate student who has had to apply to college (or parent of someone applying) knows, you can't avoid the "chance me" threads or the endless HYPSM dick measuring contest while trying to get legitimate information.

[^1]: https://www.nytimes.com/2022/06/30/us/columbia-us-news-ranki...


Columbia announced they won’t participate in next year’s rankings. The delisting was for this year’s report.


US colleges foresee a future where they are under attack for being absurdly selective, what some are now calling "highly rejective". I think they want to avoid being targeted by the fairness wonks any more than they already are.

Also, US News and World Report rankings have major problems: for instance, the ranking algorithm considers campus aesthetics and food quality, but doesn't account for price, giving schools incentive to raise prices to fund campus improvements that boost rank, in turn boosting applications, in turn reducing % acceptance (the principal indicator of "quality" in US schools). This was how TCU went from a meh Christian school to a "selective private college" in just a couple of years. US News now publishes a "best dollar value" report to account for this, but few read it.


The gap between the schools' rhetoric and reality is pretty funny; I was just writing about it the other day: https://seliger.com/2022/07/06/nonprofit-boards-of-directors...:

You can see a lot of hypocrisy that’s uncritically accepted by a lot of organizations, including nonprofits. Exclusionary higher education is a particularly notable example, given the soaring rhetoric of “inclusion” spouted by some people involved with higher ed, versus the reality of those same schools seeking to reject as many applicants as possible. Princeton University’s president, Chris Eisgruber, has, for example, blathered extensively about the school’s efforts to “combat systemic racism.” Princeton has a $37 billion endowment. The school’s undergrad acceptance rate is 5.6% and it charges a sticker price of $73,000 a year (yes, the school does accept a handful of token low-income students every year, but the school’s overall demographics reflect its target: the wealthy). Does that sound like a school devoted to combating systemic racism to you? How can people make these kinds of arguments with a straight face? Colleges and universities are run largely for the benefit of their administrators. The other exclusionary schools are doing the same things, as are their private-school feeders, despite their vigorous marketing to the contrary.

Regarding the above paragraph, let me be clear: describing how something is, is not the same thing as approving of it.


It's truly mind-boggling. The Ivies were always bastions for maintaining the position of WASP elites. They still serve that function. Maybe there is a bit more wiggle room about skin color, but they still function to socialize the next generation of people to run WASP institutions like "JP Morgan Chase."


There's nothing inherently white, anglo-saxon or protestant about the oligarchy that these institutions preserve; they will very happily pass on to a new generation of westernised, college-educated liberals of all races and genders but no more variation in thought than the one they replaced. The cultural focus on such ephemera gives them a very useful fig leaf for excluding the demographic that it actually matters to exclude: the working class.


In this context, “WASP” refers to social status and cultural norms derived from northeastern British and Dutch colonists, but not limited to them. Working class southerners or Appalachians have never been “WASPs” in that sense despite fitting the definition literally.


I guess I must have hallucinated reading all those Stewart Alsop articles about the decline of the WASP Elite then.


Is that really so wrong though? I'd like to go to a college where the food doesn't suck and the campus is nice. Those are definitely factors that went into my choice. Value was definitely the overwhelming factor in my decision though, I went to a UC.

Also, even the value rankings are mostly just Ivies at the top. Funny enough, my school is in the ~30s for overall rankings, but in the low 100s for value, since they're looking at the out-of-state cost (which for UCs is around 50k vs. 15k in-state), and they include grants, which is extremely variable by individual student. If they looked at the in-state tuition, UCs would dominate the value rankings.

https://www.usnews.com/best-colleges/rankings/national-unive...


US News rankings considers neither food nor campus aesthetics.

https://www.usnews.com/education/best-colleges/articles/rank...


A big problem: admitting too many students actually decreases the number who apply.


"the fairness wonks"


> the ranking algorithm considers campus aesthetics and food quality.

wtf


Having eaten some absolutely miserable dorm food in my time, I get it.



The influence of US News on colleges is wild. One absurd tidbit that I've heard: the rankings weigh freshman stats much more than transfer student stats, so a lot of competitive schools that won't let you in as a freshman if you would hurt their SAT/GPA/etc averages will happily let you in as a transfer because they'll make more revenue and your stats no longer hurt their rankings. *

The ranking formula is also changed arbitrarily based on the reactions of colleges and readers. My school went from #9 to #1 one year, there were a lot of "wth??" reactions, and the following year the formula was tweaked so that our ranking dipped down to #4. I'm pretty sure nothing changed materially in the school over those 3 years, but the ranking moved around a lot.

The whole system is a farce.

* I haven't researched freshman vs transfer data myself, but it's something I've heard from multiple startup founders in the college education space.


Yeah, this was one of the whistleblower topics. Columbia has the highest number of transfers: http://www.math.columbia.edu/~thaddeus/ranking/investigation...


Depends on the school. At a place like Harvard or Yale it is incredibly difficult to get in as a transfer, much harder than to get in in the first place.


Yeah but that's Tier 0, it's not surprising that HYPSM don't feel the need to engage in as many number fudging tactics. It's a lot more impactful to Tier 1 top schools (e.g. a Dartmouth) where USN ranks them.


For UW this is 100% true. A 4.0 is like a 50/50 to get in the CS program but for transfers it’s easier.


One down, 99 to go. These university rankings are absurd, and just a status game that school administrators feel they are forced to play. Individual programs may be outstanding at otherwise middling colleges — it’s better to learn about specific departments than try to reduce the school as a whole to a number.


One of the most heavily weighted parts of their grading is just the opinions of academics. They just send out a survey and ask academics to rank universities. Of course they end up reproducing and reinforcing the existing rankings, as always.


Aren't there rankings for specific schools within universities, like engineering, etc.?


Sure, but excellence in computer engineering does not imply excellence in chemical engineering does not imply excellence in civil engineering.


But someone looking for a good chemical engineering school won't look in the computer science ranking table, no?


OP was talking about how rankings are bad. Are rankings for these schools more valid?


Universities are a status game in general. Reputation and networking opportunities matter more than education quality.


Having pursued a PhD at Columbia and having taught classes there, I am surprised it took this long for someone to speak up. Also, the fact that there are no checks in place to certify top-ranking academic institutions is fascinating.


could you expand on what you mean please?


Somewhat related - How Northeastern Gamed the College Rankings

https://www.bostonmagazine.com/news/2014/08/26/how-northeast...

There are probably some significant parallels to any executive who manages to game the compensation structure at any large corporation to their financial benefit.


I'm proud of my alma mater, Reed College, for refusing to participate in the US News rankings since 1995.


Unless a school is willing to accept an entirely random set of students, all schools’ success or lack thereof can be chalked up to selection and survivorship bias.

I’d have to find it, but someone did an analysis of folks who got into elite schools and didn’t go for whatever reason, and found that by mid-career they had more or less the same outcomes.

The Thiel Fellowship’s entire premise is built around this effect as well.


The US News rankings were already a bit of a mess, but leaving out one of the Ivy League schools is only going to be bad for US News.

An Ivy League school does not need help from the rankings -- but rankings that leave out one of the most prestigious and well-known universities in the country are useless.


> An Ivy League school does not need help from the rankings

I tend to agree, but why then did Columbia fudge the numbers?

I think there’s some envy from the “lesser Ivies” that wish they were Harvard or Princeton. A high USNews ranking proves little, but I guess Columbia wanted to be tied with Harvard.


The Ivy League is kinda obsolete now, esp. with places like Stanford, Duke, and Johns Hopkins better known than Dartmouth, Brown, and Cornell.


Columbia incentivized administration MBAs to increase the rankings, perhaps with bonuses attached. They did it unscrupulously by fudging the numbers, and so Columbia will hopefully dis-incentivize the behavior in the future, and perhaps other schools will too... I doubt it.


What is the point of university rankings? To gratify the human need for someone to dumb down a complex system to a simplistic measure, just so we don't have to live with the "unbearable" uncertainty that one university may not be strictly superior to another?

Once everyone starts optimizing for a metric, that metric ceases to be useful. It's only a matter of how long it takes people to realize that this has happened.


> just so we don't have to live with the "unbearable" uncertainty that one university may not be strictly superior to another?

Unfortunately, yes

> superior

Key word unfortunately


One interesting thing to come of these discussions: a lot of the problems with US News methodology arise from the attempt to push certain metrics from 90% to 100%. Class size, faculty with a "terminal" degree, admissions rate. Many of these things are treated by the rankings as "the higher (or lower) the better." As an analogy, I'd use body fat percentage for elite athletes. Generally speaking, lower is better, and in many sports elite athletes are all under 10% body fat. But much below this starts to become harmful, and would eventually be fatal.

I think this is kind of what's happening with Columbia here. For example, Columbia claimed that 100% of the faculty had a terminal degree. Prof. Thaddeus (if you read his full blog post) questioned this from two angles. The first was to say it can't possibly be true. The second (which I found more interesting) was: why would you even want this to be 100%? Seems like Columbia has valuable faculty who don't have a PhD or MFA or whatever is the "terminal" degree in the field.

The harm here isn't in wanting a terminal degree for a high percentage of faculty, it's trying to get this to 100%. I'd say that it probably is a good sign, overall, that most of the faculty has a doctoral degree, as this is the main training degree for research and teaching. But the way US news does it, you get more points for 95% than 90%, more points for 99% than 95%. Seems like a lot of harm comes from trying to wring out that last bit, since even if you do generally agree with the value of a PhD for research (or other terminal degree), it does seem like you'd build a stronger faculty overall with the ability to hire 1 in 10 or 1 in 20 based on some other credential.

So much else fits this pattern. Small class sizes... kind of the same thing. Sure, they're a good sign, but they're only an indicator, not a goal in and of themselves. Low admissions rates, high test scores. All good metrics, but things that can become not just harmful but maybe even fatal to a university if pursued with a single-minded intensity, as if total purity in the metric were the goal.
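The diminishing-returns point can be made concrete with a toy scoring rule. This is emphatically not US News's actual formula, just a sketch of the difference between rewarding a metric linearly all the way to 100% versus capping the reward at a "good enough" threshold:

```python
def linear_score(pct):
    # Linear scoring: every extra point of, say, "terminal degree %"
    # adds score, so pushing 95% -> 100% is always worth chasing.
    return pct

def capped_score(pct, threshold=90.0):
    # Capped scoring: anything at or above the threshold counts the same,
    # removing the incentive to wring out the last few percent.
    return min(pct, threshold)

for pct in (85.0, 90.0, 95.0, 100.0):
    print(pct, linear_score(pct), capped_score(pct))
```

Under the capped rule a school gains nothing past 90%, so it's free to hire that 1-in-10 faculty member on some other credential; under the linear rule, the rational move is exactly the purity-chasing described above.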

Honestly, overall we just have to reject the US News rankings. I appreciate what Prof. Thaddeus did here, and it was a useful and very well-reasoned and sourced takedown. But at this point I realize that there isn't much to be added, just new and interesting ways to say what everyone knows: yousnoozeandworlddistort rankings are pretty well idiotic. I actually think college rankings in general can be useful if managed cautiously and read critically, but this one is just a turkey.


I wonder how much these measurements actually matter, for a place like Columbia. I mean, I can see why, like, people might want to look at the list and see "hey this school I've never heard of is actually OK."

But like, people don't find out about Columbia from this list, right? It is, uh, older than the country "U.S. News" is named after and an Ivy League school. Not having them on the list just reflects poorly on the list.


Not really. Knowing about Columbia is kind of a New York, upper-class thing.

They don't have good sports teams; no way a person in Middle America would have heard of them if not for school rankings.


It might not be a good spot for people who pick their school based on sports leagues, but I mean... surely that's not everybody in Middle America.


Interesting, because they're a fairly famous university internationally - Obama was a graduate from there etc.


I absolutely know people who have had the choice of multiple top-tier schools and chose the one that ranked higher on U.S. News, so it definitely matters.


Why?

MIT vs Harvard vs Stanford are all amazing programs but are better for different types of people/interests/goals


On the other hand, Coca-Cola spends 4 billion dollars every year on advertising.


Yeah.

I dunno, nothing about advertising makes sense to me, so I guess I have trouble evaluating this kind of thing.

Common sense would tell you that you don't give 4 billion dollars to the guy who says "I'm really good at convincing people to part with their money -- whether they should or not!" But then I don't run Coca Cola, so I guess they've figured out something I've missed.


In 2010, Pepsi switched at least their superb owl advertising budget to feel-good community projects and lost 6% in sales vs a 4.3% overall decline for soda. Advertising drives sales, even if you can't tie a sale directly to an ad.


I dunno, Pepsi always seemed more faddish anyway, I'm not surprised their marketshare would be less durable.


I imagine there is value in being in the same company as more prestigious schools like Harvard and Princeton.


I really wonder why Columbia even cares: it's a globally well-known Ivy League school that obviously provides a good education, and it's in NYC.* Doesn't US News need Columbia more than the other way around?

* No I have no connection to the school except I do have some friends who teach there and some others who attended decades ago.


I think professors and the actual scholars there don't really care. However, the administration cares because ranking is tied both to how many undergraduate students apply and to how much they can charge those students. If college rankings didn't matter financially, then I hazard that Columbia wouldn't have had questionable numbers to begin with.

That said I agree that being an Ivy in NYC certainly was a major contributor to its more recent rise in the rankings and applications.


Having a high ranking is basically the entire job of administrators. Especially at Ivies, they're selling prestige, and being highly ranked gives you prestige.


If you're in the small top 'n' I think the prestige goes the other way. Will anyone care if MIT is +1 or -1 some year? I assume those top schools can just run themselves the way they want and still get highly ranked.

And if US News were serious they could simply have kicked Columbia out for a year, but it would have hurt them to do so. They'd probably do it for a small college ranked 100 or below, though.

I'd be embarrassed for Columbia that they gamed the numbers. Outside that small n though, I presume gaming the numbers is commonplace.


These rankings are rampant with manipulation. For instance, I heard of classes at Hopkins being taught half in-class, half pre-recorded, so that they could say the "classroom" size was half its actual enrollment. It's especially frustrating when this kind of manipulation actually hurts students' education.


If folks are curious about how schools are ranked, I highly encourage you to listen to the following podcast:

https://www.pushkin.fm/podcasts/revisionist-history/lord-of-...


This story is not necessarily related to fudging of numbers/stats etc but it does illustrate to what level schools will go to attract students:

- Small liberal arts school

- Is ranked Division 3 in the NCAA

- Realized they are losing students

- They start a football team!

- Why? For an outlay of $500,000/year for a coach + staff, using existing facilities (e.g. the soccer stadium), they get ~100 additional students who want to be able to say "yeah, I played football in college"

100 students * full-boat tuition is nothing to sneeze at and is a pretty good deal from an ROI perspective. It's not a big leap from "let's drop some cash for a very obvious improvement to our attendance" to "let's fudge some numbers that no one may ever really look at" for the same outcome.


Lol, colleges are more expensive country clubs than places of higher education. We really shouldn't be surprised by the corruption anymore after we found out how easy it was to get on the rowing team at most of these places.


Kind of sad that US News decided to unrank just Columbia rather than suspending the whole thing. If one college games your ranking to the point that a professor from the university is like "hey you guys this seems pretty sus", it's probably true that all the other universities have the same problem and just don't have anyone willing to talk about it. As it stands it seems like only Columbia did something wrong, when in reality they may just be the only university that did something wrong but also had someone who did something right.


Well, it's kinda their main business at this point.


Suspending the whole thing would be suspending pretty much US News's entire remaining raison d'être at this point.


"The picture coming into focus is that of a two-tier university, which educates, side by side in the same classrooms, two large and quite distinct groups of undergraduates: non-transfer students and transfer students.

The former students lead privileged lives: they are very selectively chosen, boast top-notch test scores, tend to hail from the wealthier ranks of society, receive ample financial aid, and turn out very successfully as measured by graduation rates.

The latter students are significantly worse off: they are less selectively chosen, typically have lower test scores (one surmises, although acceptance rates and average test scores for the Combined Plan and General Studies are well-kept secrets), tend to come from less prosperous backgrounds (as their higher rate of Pell grants shows), receive much stingier financial aid, and have considerably more difficulty graduating.

No one would design a university this way, but it has been the status quo at Columbia for years. The situation is tolerated only because it is not widely understood. "

https://www.math.columbia.edu/~thaddeus/ranking/investigatio...


For all the ways we can easily criticize the incentive structures and possible intentionality behind these bad numbers (which I agree merit scrutiny), I think it's worth noting and remembering that the issue was flagged by a mathematics professor at the same university: there are at least some well-aligned incentives, some measure of self-regulation, with respect to tenured positions and academic freedom.


The fact that U.S. News and World Report can pull a school from its own rankings and it makes such a strong impact is really a strange phenomenon.


Columbia’s downfall began in a single moment in the early 50’s. When I.I. Rabi refused to let a poor Mr. Bader go to grad school in Physics part time so he could help his family out by working. Mr. Bader later became Feynman’s high school physics teacher, and taught him the principle of least action. So everyone knows that the world will still turn without Columbia.


This is a bit of a non-sequitur (not to mention silly), but I'm curious about the details. Do you have any links to this story of Rabi refusing to let Bader study part-time?

The problem, of course, is that the timing doesn't make sense at all. You say this happened in the early 50's. That can't be the early 1850's, since Bader wasn't born yet. But by the 1950s, Feynman was already long gone from high school. Indeed, he'd already devised path integrals, and so he certainly knew about the principle of least action!

Perhaps you meant 30's?


You’re right somewhere in here. I’m certainly wrong about the 50’s. Columbia was a bastion of antisemitism until the 70’s, it had the world’s best Physics Dept. for a long time, but frittered it away. Meanwhile, Cornell built a world-class research university, partly on Columbia’s leavings.


University rankings are mostly bunk because they lean heavily on perception and lagging indicators. A university can deteriorate for decades while remaining high in the rankings.

The only criterion that matters for a student that wants to select a university is the fraction of graduates that go on to do impressive things.


This has been brought up multiple times now over the last decade. At least this is one step in the right direction.

Adam Ruins Everything, Season 2, Episode 7 [1]

1: https://youtu.be/EtQyO93DO-Q


Just when you think it's time to count higher education out because of bloated administrative spending, overwhelming student debt, and poor technical training for real-world jobs, a light shines in the darkness.

I guess there's still hope if higher education can learn from professors of the likes of Michael Thaddeus. He and others like him make pursuing a degree valuable. The administration can go...

Note: read the original document: http://www.math.columbia.edu/~thaddeus/ranking/investigation...



Wouldn't it make more sense to just make sure that all universities pass the same requirements to be a university, and beyond that not add some ranking or other entertainment nonsense into the mix?

This whole entertainifying business really isn't good for anyone, except perhaps for people making money off of numbers, which is still not good for most.


If colleges manipulate rankings like this en masse, I think we have a solution to the student debt issue.

An FTC review of college advertising would be interesting. If they are selling you a degree with basically no career prospects while proudly displaying these gamed rankings, that sounds like fraud and misrepresentation. Perhaps the students can get a refund?


How do we stay objective with these rankings?

What about a list of independent, objective tests (for the 100 different majors out there) to rank graduates from any school, acting like the SAT/ACT for jobs? Do make it optional though; if employers find it valuable, more college graduates will take it. It will work better than leetcode, I hope.


I have an interest in researching CS programs within a specific geographical area in the states but am having trouble finding good data on student outcomes and associated metrics. If this ranking data is bs or gamed, then what’s the best available resource for helping to determine what programs are good, bad or otherwise?


misleading headline (on the original article)

from TFA: "on Thursday of last week, just before the July 1 deadline for submitting new data to U.S. News... Provost Mary Boyce, said [Columbia] would not submit data this year, adding that the university had “embarked on a review of our data collection and submissions process”

and that is why they will not be in the rankings. It seems Forbes spun this story much like Columbia spun its submission, to skew the overall appearance.

the "scandal" could very well still be the scandal everybody thinks it is, but it seems to be an overstatement to draw the bold lines that Forbes is juicing their article with. And as others have said, unless we know the extent of fudging at other schools it's difficult to judge the severity of this case.

full disclosure, I have no Columbia or ivy affiliation, but I perceive it to be easily within the top 10


The article is correct. US News kicked Columbia out of the 2022 rankings. Columbia subsequently withdrew from submitting data for the 2023 rankings.


It's interesting to think what most universities would do to a student who tried to falsify their data like this.


Sure. But, why not doubt everything else US News said? It’s not like they uncovered their own incompetence. The takeaway here isn’t that Columbia cheated. Everyone cheated. US News failed in their duty and aren’t a worthy authority.


Polymatter did a great episode on the pressures that create outcomes like this in regards to rankings:

https://youtu.be/cQWlnTyOSig


a huge shift over the last 10 years, and most visible in technical roles, is the elimination of college rankings. i used to work at a company that had a ceo mandate: “all resumes must come from top 20 schools”. saying something like that now as an engineering manager would raise questions and probably make you look dumb. there are many reasons for this but the biggest is that it's impossible to do with a global workforce. this is great for high school students, one less area of pressure



I worked for a marketing agency that gets paid a shit ton of money to run ads targeted at the US News voters. Always seemed so stupid to me; it's just a popularity contest.


I honestly hope this starts an upheaval against the US News and World Report college rankings. That stupid magazine has way too much power in higher education for no reason.


A reminder that the growth and cost of College administration is one of the primary causes of the massive increase in College tuition across the U.S.


Explains how ASU has always been number 1 in innovation. It's a great school, but No. 1 for innovation is pushing it too much.


If Harvard, Yale, and Princeton mutually agreed to withdraw from these rankings it would go a long way towards killing them.


Nice cookie dialog you have there, Forbes. And I was mildly interested in the article. Not any more.


Rankings do not matter. Unless you are going into Big Law, or if you are focused on staying in academia.


Cool, now do all the Ivys.



