Hacker News
College admissions officers rank prospective students before they apply (washingtonpost.com)
90 points by pseudolus on Oct 17, 2019 | 34 comments



This article is pretty sloppy, mixing up students in different parts of the process. And given the headline about ranking students "before they apply", the opening anecdote is seriously misleading. The student visited the university website after applying, so the tracking software linked the visit back to the profile full of information the student had directly provided to the school's admissions office. That feels skeevy, especially if they aren't clear about it when you are applying, but the implication given the headline is that this detailed profile just popped up the first time the student visited the university website, which is not the case.

None of this is remotely new, of course. Colleges have been collecting information from standardized test companies and market researchers and building profiles of individual students for decades. I got mailboxes full of solicitations to apply to X or Y school 25 years ago when I was in high school, and now my daughter is dealing with the same.

Later, the article gets around to actual conflicts of interest and ethical issues. But the headline and the opening story are grossly misleading.


As someone on the research side of university education, I think it's rather outrageous how easily the admissions people can collect data on students without any sort of informed consent whatsoever. If my research group did it this way, we would be out of a job, and even tenured positions could be in jeopardy.


It's kind of outrageous the hoops we make nonprofit researchers operating transparently jump through when for-profit researchers who hide their results can basically do whatever they like as long as they make money.

Of course, I guess that's the thing here: the admissions officers are working for a nonprofit but they are a key part of revenue generation.


IRBs exist to protect academic integrity; that's not a factor in other contexts. A staff member can collect and use data associated with individuals as part of improving some aspect of a university's operations, but an academic would have to jump through those hoops to do the same thing for research.


It's hard for me to believe this is just sloppiness, even. The headline, subhead, and podcast link all say "before they apply". The lede is the story of a student visiting the site and being profiled, and all of the data listed could plausibly have come from somewhere outside an application. It's only ten paragraphs later that the article returns to the initial story and clarifies that the profile was derived from her application.

The use of cookies in this way is novel for universities, if not for other sites. And some of the other moves like classifying contractors as "school officials" look like blatant attempts to bypass privacy laws.

But the headline here vastly oversells what's happening; the amount of data available prior to an application doesn't seem much different than what I remember getting in unsolicited university ads. That's been happening at the very least since the College Board decided to sell all sorts of exam information to universities.


2018 Cost of Recruiting an Undergraduate Student Report

Benchmarking data for recruitment and admissions costs and enrollment staffing levels for four-year colleges and universities, based on a poll of campus officials

http://learn.ruffalonl.com/rs/395-EOG-977/images/RNL_2018_Co...

Median cost to recruit a single undergraduate student in 2017 was $2,357 for private 4-year institutions and $536 for public institutions. But there is a lot more data in the report. The institutions reported on do not seem to be the ones likely to have the most expensive recruiting campaigns, either academically or athletically.


Why on earth are they spending so much? Are seats going unfilled? Are the "best" students that more profitable for them?


The "best" students, those most likely to be successful within the school and after school, become walking advertisements. They're also more likely to be happy alumni who donate to the school.

I wonder what the data would show if you compared the money invested in admissions to that invested in alumni affairs & development, and how much is brought in by tuition and fees and how much is brought in as donations. From what I've seen, development is definitely an area that operates on the "you have to spend money to make money" principle.


We hit "peak university" in 2011. [1] In 2017 there were 1.25 million fewer enrolled students than in 2010. But this [2] is the zinger: in that same time frame, more than 300 new degree-granting institutions opened! It's becoming a hyper-competitive business. Good students are not only likely to stay enrolled for their full degree, but also increase the prestige of your school, which means more students, higher fees, and more profit.

I think a big issue is that the regulations on loans are removing any and all notion of supply and demand, so costs just keep getting crazier and crazier. Living on campus at an average private nonprofit university will now run you more than $50k a year. [3] Spending $2,357 to capture what may well be $200k+ over 4 years is a phenomenal return on investment.

As usual, non-profit in practice and non-profit in connotation really don't mean the same thing. I attended a top 10 university only to watch them constantly tear down perfectly functional buildings and replace them with more aesthetically pleasing ones that otherwise functioned just about identically. In some cases the new buildings had less room than the old ones. All it seemed to achieve was to 'Amazon' their profit figures. I don't really understand why they seem determined to waste so much money. The cynic in me imagines it's easier to justify a million-dollar administrative salary when that's a much smaller fraction of your gross, and a "non-profit" institution would have difficulty justifying constant cost increases if it were showing a massive profit margin.

[1] - https://nces.ed.gov/programs/digest/d18/tables/dt18_105.30.a...

[2] - https://nces.ed.gov/fastfacts/display.asp?id=84

[3] - https://nces.ed.gov/programs/coe/indicator_cua.asp
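The return-on-investment claim above can be checked with quick arithmetic. A minimal sketch, using the $2,357 median recruiting cost and the ~$50k/year sticker price cited in this thread; the assumption that a recruited student stays all four years at full price is mine, for illustration:

```python
# Back-of-the-envelope ROI on recruiting spend, using figures cited
# in this thread. The 4-years-at-full-price assumption is hypothetical.
recruit_cost = 2357    # median cost to recruit one student, private 4-year (2017)
annual_price = 50_000  # approx. on-campus cost, average private nonprofit
years = 4

revenue = annual_price * years                 # $200,000 over four years
roi = (revenue - recruit_cost) / recruit_cost  # net return per dollar spent

print(f"Revenue per recruited student: ${revenue:,}")   # → $200,000
print(f"ROI on recruiting spend: {roi:.0f}x")           # → 84x
```

Even if the student receives a substantial tuition discount or leaves early, the multiple stays large, which is why the recruiting spend keeps rising.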


>I think a big issue is that the regulations on loans are removing any and all notion of supply and demand,

There shouldn't be student loans from taxpayers in the first place. If voters want to give assistance to students, then pay all or a portion of their tuition. If the financial markets deem education to have a worthy return on investment, they will underwrite loans just like any other loan.


They are trying to get students who can:

1) Pay

2) Stay the full 4 years

So yes, the "best" students (the ones who will pay full price for 8 semesters) are very profitable for them.


This is the right answer for many small colleges in the US. Rate of attrition and individual discount rate are what keep up the revenue and the lights on year over year.


3) Take a semester or two of high margin 100 and 200 level classes (e.g. entry level classes that only require one professor and a lecture hall) before dropping out.


I think there is a demographic and/or preference shift occurring.

The small liberal arts college I attended recently had a year in which it could fill only 600 of 700 potential seats. The college filled all 700 seats the next year, but only by redoubling its targeting and recruiting and using more waitlist admits than usual.


I think colleges should just define a minimum SAT score and then select students through a lottery.


Waste of money imho. But I guess if higher ed is strictly a business...


It's a business in that in the long run revenue has to exceed expenses, and if it doesn't, eventually the institution will cease to exist.

I agree with you that most of these schemes are wastes of money, but spending money on recruiting itself is a required part of operating a higher education institution.


>It's a business in that in the long run revenue has to exceed expenses, and if it doesn't, eventually the institution will cease to exist.

Or it needs to be subsidised by another source of money (tax, benefactors, patents, research, sports, etc) if the benefit to society makes it worth keeping around even if it runs at a loss.


Those are all business revenues acquired by the government affairs department, the alumni affairs department, the intellectual property legal group, the provost for research, the athletics department, etc.


Is your high school student receiving mail from colleges? Then they have been 'ranked' by a college before they applied.

The OP's linkbait title is not actually surprising or new. This has been going on for decades.


This was my first thought. I just assumed this is how it works.


This (‘affinity index’) has been going on for at least twenty years, late 1990s.

The software seems a bit ‘better’ now.


Is it just me or are others also seeing a pdf file download unexpectedly from this link?


College is becoming, or has already become, what religion was thought to be a couple of decades ago: necessary. The recruitment stage will block new developers with potential unless they are already financially secure or have four years of past programming work experience. I can't see the tide changing as more college degrees flood the market and recruiters just treat degrees as the price of entry, followed by algorithm coding sit-downs.


There are multiple factors at play, and the everybody-must-go-to-college bandwagon is certainly a big one.

I'd add a second one. It used to be the case that you could hire programmers without screening based on degrees. But that was back when programming was off-putting and unappealing to anyone but nerds dedicated enough to become good at it.

Now STEM is hot in the collective consciousness, programming is seen as a well-paying job that's easier than becoming a doctor or lawyer, and corporate efforts are trying to recruit as many people into the candidate pool as possible.


When did programming become seen as easier than becoming a doctor or lawyer? I would say that isn't the case unless you're just creating basic websites.


LOL. In terms of bang for buck, being a developer/cloud-ops person is lucrative much more quickly than law or medicine.

Also keep in mind it's much more egalitarian. In law, yes, you can eventually make partner at a white-shoe firm and make millions (and make FAANG money as a Xth-year associate), but the competition is staggeringly intense.

Let's say I'm 18, intelligent and ambitious, I live in the USA, and I want the quickest path to making $500k per year and my choices are law, medicine, and computer science.

I sincerely hope that for everyone on HN, the answer is a no-brainer.


Mid-career quality of life is a thing.

The doctor making 200k stitching up chainsaw mishaps in some small city can live on several acres (or in the nicest suburb of that city if that's his preference), send his kids to private school, he drinks $2 milk and fills his car up with $2 gas. Try living like that in SF or any other "tech hub" on 200k.

I'm not saying tech isn't easier or doesn't start off making big bucks faster, but the doctor probably takes the lead around age 30 or so. If you want to make big bucks without going into management, doctor seems like the better route.


I suspect most MDs will never take the lead versus a programmer making FAANG-level money. The health toll and lower quality of life alone, from the studying and the work requirements of residency, plus having to work off hours and nights and whatnot, put them behind. But also, the compounded returns on income invested in one's 20s would require a lot of income in one's 30s to make up. Doctors do earn that much, but in the recent past, not enough to surpass top programmers.

Also, doctors have to deal with the byzantine paperwork nonsense in the health field, and work with the general public as well as in places that have infectious diseases floating around. And can't work from home as easily. I wouldn't trade not having to work with the general public for a couple hundred thousand.
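The compounding point can be made concrete with a toy model. All numbers here are illustrative assumptions of mine, not data from this thread: suppose a programmer invests $50k/year starting at 22, while a doctor, delayed by training and loans, invests a larger $80k/year starting at 32, both earning 5% real returns:

```python
# Toy model of the head-start effect: early saver vs. later, larger saver.
# All figures are hypothetical, chosen only to illustrate compounding.
def portfolio(annual_saving: float, start_age: int, end_age: int,
              rate: float = 0.05) -> float:
    """Value at end_age of saving annual_saving each year from start_age."""
    total = 0.0
    for _ in range(start_age, end_age):
        total = total * (1 + rate) + annual_saving  # grow, then add this year's saving
    return total

programmer = portfolio(50_000, start_age=22, end_age=50)  # ~$2.9M
doctor = portfolio(80_000, start_age=32, end_age=50)      # ~$2.3M

print(f"Programmer at 50: ${programmer:,.0f}")
print(f"Doctor at 50:     ${doctor:,.0f}")
```

Under these assumptions, the doctor's 60% higher savings rate still doesn't close the ten-year head start by age 50, which is the "a lot of income to make up" point above.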


$200k is a 22-year-old at a FAANG vs. a 30-year-old MD.

I was a somewhat successful premed student (99.9th percentile MCAT, CS major), and I went the FAANG route instead. Physician and tech salaries are similar in that the top makes a lot more than everyone else, although the bottom for physicians is a bit higher. For less competitive people who may have only gotten into a single med school and whose job prospects with a biology degree are limited, the med route can make sense from a purely financial point of view.


Can you really even have been a doctor for that long at age 30? That's barely even "start-of-career" by my calculation.


I have a four year degree and have been employed for eight years now. A lady I knew in college just finally finished all of her residencies and got her first actual doctor job this summer.

Becoming a doctor is significantly more work than becoming a programmer.

Plus, I work a set schedule, with easy, predictable hours. Doctors get to deal with on-calls and 12 hour shifts.


I was a lawyer.

Now I'm a software developer. Primary reason for the switch? It's so much easier for the same amount of pay.

I work the same hours, I get paid the same amount. The questions presented to me in both jobs were fascinating and the work was satisfying.

When I got home from being a lawyer I was dead to the world. Exhausted from a hard day's work.

When I get home from being a software developer I've still got enough vim and vinegar to take my son to the children's museum or the library. Every day.

Anecdotal yadda yadda yadda. But I assume the pool of people who have been both lawyers and programmers is, if not tiny, then small.


In the United States at least it takes a lot of time and money to become a doctor.

The attrition rate among "pre-med" freshmen trying to become doctors is much higher than among CompSci/Eng students trying to become SWEs.



