I’m a fan of community college, but I’m not surprised that no companies reached out. It is rough to try and jump into an entry-level position after only two years of studying programming. If you want to be a programmer, the way community college fits into that plan is as the first half of a four-year program. You take intro programming classes in community college, transfer to a four-year college, and then take your 300-level and 400-level algorithms / compilers / operating systems / databases classes. You end up with significantly less student debt and the same credentials, although this is not without its drawbacks (mainly, you may transfer but end up needing three years of study instead of two; you have to really be on top of shit w.r.t. bureaucracy to de-risk this one).
I’m pretty sure I could do 95% of my job without the advanced classes I took. The important things that I learned since I started my career were not taught at university.
There's always someone who will reply with something along the lines of "I don't use anything I learned in school on the job" or "Software engineering is not about coding, it's about X"
With due respect, in my experience as someone who did not major in CS, it was a rough first few years because I had weak knowledge of data structures, almost no knowledge of computer networking (which showed itself when managing containers, etc.), and didn't know even the very basics of bash/Linux.
Nowadays, when I work with CS majors vs non majors, it has noticeably been more difficult to get the latter group up to speed and productive. Of course there have been notable exceptions (there always are when it comes to people matters)
100% likely depends on what you’re doing, but I work in FAANGY tech and if anything CS majors lag behind people with real work experience, any real work experience. DS&A is not relevant to writing a CRUD app or service. It’s not relevant to collaborating or getting things done.
If 100% of a job is easy enough for a fresh community college grad, then it is easy enough to be outsourced for 1/10th the cost.
The 5% of the job that is hard is what you're paid for. It is what keeps the job onshore, and it is what keeps fresh community college grads from being hired into it.
Just because he said he didn’t need a big part of the university courses doesn’t mean that it’s easy. He said that he learned most of the things at the job, which I mostly agree with. For some domains it’s important to have the proper educational background though. But mostly it’s a different kind of difficult compared to how advanced courses are difficult.
Idk I have a lot more confidence because I know what I know, that is, fewer known unknowns, and I have high confidence that there aren't unknown unknowns lurking around corners. This helps me drive decision making across the org.
Ya but if you're really honest with yourself, did you learn what you needed to "drive decision making across the org" in Uni? You may have, I guess, but it seems like you really only get that confidence or those skills after you start doing whatever it is that you do. It's hard to imagine how anything I learned (or could have learned) in Uni would have helped with even the least complex tasks once I got started professionally, beyond what I'd already learned in technical college or on my own. That said, I wasn't a FAANG go-getter trying for a spot that required complexity analysis or whatever, or a game developer with a necessary background in maths and memory. I feel like the stuff you learn in Uni that's valuable comes up only in some special cases, and even then there are probably better environments for actually learning it rather than just figuring out how to test well on the material.
I personally found the humanities and hard sciences a bit more compelling than any CS courses; viewing the CS testing and teaching methodology through the lens of having already been in industry, it pained me more than it might have otherwise had I been a naive, impressionable student with no professional perspective. Sitting there writing out an ADT in Java with a pencil in a cramped physical space, thinking "why did I decide to do this crap again?", but also not being able to respond to the artificial pressure of the classroom with a fabricated stress response.
I think I agree with a lot of what you said - non-CS classes can be fundamental to learning some job skills (like writing, communicating, and deep thought). However when I say that it helps me drive decision making, I mean that I am literally more confident in what I'm saying because I'm reasonably sure of the boundaries in my knowledge.
For example, I went to grad school and studied databases. On that topic, I probably know more than anyone I work with; so if I put forward a proposal that is related to DBs in any way, I can be quite sure that there are no surprising edges that will come back to bite me (i.e. no one will come along and say "well what about this?", at least from a technical perspective). That's because I know what I know, and there are things I don't know, but I also feel like I know what those are (e.g. I don't feel as comfortable with low-level programming, mostly because I actually skipped a machine organization class and never picked those skills up separately).
I feel like, outside of the work experience that was baked into my degree, I could've spent those 4 years and all that money in a much more productive way.
(Note: I don't regret it, I had a lot of fun, but in terms of becoming a professional I could've been much more efficient)
As a result, when anyone comes to me asking how they can "break into" programming, I strongly discourage them from going to University at all.
I got a non-CS bachelor's, and then managed to get into development, partly because I had been programming all my life. But I went back for a master's and shored up the undergrad CS that I missed. Boy was that helpful! It wasn't necessary for 95% of my job, either, but it enables me to solve the other 5% that I would otherwise be unable to solve, or would solve with something that barely works. I've found compilers useful (for parsing; didn't have time to take finite state automata), and the OS class was excellent for understanding and being able to write threaded programs that work. Data structures is simply essential (but I took that in undergrad).
I am a much better "engineer" for having taken the advanced classes, but more to the point, I was a much better programmer for having taken the advanced classes.
The total amount of code I had written in my first two years of university was not very much; by the time I had taken some of the advanced courses I had written a lot more code and had thus become a better programmer (still not a good one, but better!). I had written probably 10x more code by the end of year 4 than the end of year 2 thanks to practicum courses, etc.
Not sure about your school, but part of college is a washout tool. Can someone handle BS for 4 years without giving up? That's a good indicator of their character. Another part is that most of the classes should eventually teach you how to teach yourself. You're given progressively harder challenges, but you have to fill the gaps with your own study and research.
Bingo. A BS is a great filter for hard working and higher IQ.
Guess who does really, really well in sales? People who play varsity or college sports. Is it a rule? Nope. Is it a great heuristic? Yep. Does football teach you sales? Not really... but it does filter for certain traits, one of which is probably pure animal magnetism and the base desire of humans to be influenced by the most fit individual.
I like to phrase it this way—in life, there’s a lot of bullshit. In college, there’s a lot of bullshit. If you can handle four years of bullshit in college, that’s a good sign.
The 12 years before are mandatory, so they're no indication, but they used to be back when they weren't mandatory. That's why high school diplomas don't matter as much as they did in, say, 1920.
We could probably do a better job in the years leading up to the 16-18 year old start of going away to college. But we'd probably need to spend the money to customize things a lot more. One of the fun reads I had over the past year or so was from the NY Times on basically a one-room schoolhouse in Alta Utah where the kids basically spent half the day skiing. I can think of worse things.
With AI, what we're going to see is the unbundling of school: education separately, socialization separately.
You can socialize with your ski buddies.
You can study with a self-selected, motivated group of teens, together with AI and a tutor (a college grad with AI teaching high schoolers, high schoolers with AI teaching middle schoolers, etc.).
This model will be decentralized, more like a Twitch group than a centrally administered school district.
College won't be needed because of AI; the entirety of liberal arts is already obsolete.
Engineering disciplines can be split into blue collar (like operating CNC), learned at vocational schools, and deep tech (research focused), which really only needs "elite" colleges, the top 1%. The other 99% of colleges might as well disappear.
this is my vision of the future, given the current rapid pace of development of ai.
just ask yourself: are you ready to pay $300k ($65k tuition + $20k living costs per year) for each of your kids to go to college, party, get drunk, and study data structures & algorithms and Java?
do you want to spend $300k so your kid understands the Cormen/Leiserson algorithms book, and do you think it is a good value?
and I am talking about CS - arguably one of the highest ROI degrees out there.
meanwhile a kid from India/Eastern Europe who got the same CS degree for $10k out of pocket will get the same knowledge and can get the same FAANG job
I'm not sure what AI has to do with that question to be honest. The discussion upthread was about secondary education which is required at some level and generally free (to the student's family) in the US today. There are tons of resources for self-study and other activities for motivated students/families.
As for college/trade schools, options range from essentially free for self-study to very expensive with everything in between. Again, I'm not sure to what degree AI changes most of that. I'm certainly prepared to believe that credentialism becomes less important in some fields (it sort of has in CS) but that has less to do with AI than some companies relying less on universities to do their vetting for them.
Other fields, of course, differ. You've got a lot better shot at a top-paying job in Big Law if you went to Harvard Law and clerked for a Supreme Court justice.
My argument is that secondary education is not about education, since standards have dropped with the introduction of Common Core and other improvements in the name of "education equity".
If you want to send your kid to daycare, please go ahead and send them to public school. But more and more parents opt for alternatives to public school: private schools, college prep programs, charter schools, extracurricular math and science classes, coaches/tutors, etc.
In the end it doesn't make sense to maintain a bloated school system that accomplishes nothing but being glorified childcare for teens, while all academic achievement is due to self-selection.
You can look at any top-rated high school and realize they are not doing anything much. They just have a self-selection mechanism that filters for the most motivated kids with strong parent support.
I don't agree at all. You're describing a trade school. A university does not care if you get a job. They only care that they've satisfied your academic interests enough to stay in the profession or continue your education. They offer extra services to help you find a job but it's completely voluntary so I don't equate extra motivation with "equipping".
Many report out this exact statistic. My alma mater claims 90% job placement for STEM grads.
> They offer extra services to help you find a job but it's completely voluntary so I don't equate extra motivation with "equipping".
They go out of their way to offer these services, but of course it all boils down to personal motivation. They hand students a tool and show them how it's used. That's my definition of "equipped."
And, at the risk of being totally cynical, they report out this statistic, care if you are successful in a noteworthy way, and make big contributions over time. (And may even care a bit if you are successful in an abstract sense.)
It’s weird that people equate Computer Science to Software Engineering. TOTALLY different!
In fact, you could argue that you could do compsci bachelors and at the end come out knowing everything theoretically but not know how to actually code well. Compare this to a software engineer undergrad - the whole time you should be coding!
I'm not sure about your #2. Self-promotion is not necessary at university, since the profs receive mandatory submissions. Very different from a job search, putting yourself and your paperwork "out there" in a targeted way.
I agree about classwork, but I think many universities put a lot of effort into placing their students. Job fairs, resume classes, internships, student research, etc.
Personally, I suck at self promotion. The career aids offered by my school made a world of difference.
Actually a university does neither. A university's job is to expose you to higher order thinking, and different ways of thinking. The rest is up to you.
Obviously it comes down to the motivation of the student, but that does not mean university does not impact your employability, particularly in your early 20s. Documented exposure to higher order thinking is an employable skill.
Agreed, but the OP's claim wasn't whether a university degree is needed for employment, the claim was whether a university trains you for a job. The answer to the latter is a no. Community colleges and trade schools train you for a job. Universities give you an education in a subject area, the rest is up to you.
Our industry is swimming in auto-didacts, dropouts, bootcampers, and community college graduates. There's no licensure, and most industry-adjacent academic programs spend a lot of time teaching concepts and abstractions far divorced from everyday industry needs. Committed tinkerers who are ready to get their hands dirty and start learning the trade are historically very welcome.
However, the job market is upside down right now, because we had an insane hiring bubble that has since burst, leaving countless people with real work experience in the candidate pool for jobs at all levels.
In this kind of market, which we're used to seeing come and go, there's temporarily a lower-than-usual demand for candidates with less vetting, and so the auto-didacts/etc can expect to have a hard time getting attention for a bit.
A lack of interest in community college graduates is an ephemeral market state that we're familiar with seeing sometimes, and definitely not a structural detail of the industry.
I think that's probably fair. I know very qualified people who are having trouble finding jobs. I don't personally care all that much but a number of opportunities post semi-retirement haven't really materialized. Which is fine but would probably have been different in different times.
When Seattle built their light rail system they ran it past UW and nearly every community college in the region.
It was already a known phenomenon for UW students to take CC classes to reduce the cost of gen-ed requirements, but after the trains that just got easier. Much faster than the buses.
I worked in the CC system and I am a huge advocate for what they bring to market--a low-cost alternative to primary institutions, cutting the high cost of a Bachelor's degree by at least 1/3. But those CCs were primarily focused on getting students to take their generals (call it what it is: an Undecided major). While this is great and all, most companies are expecting even their interns to have a 'hit the ground running' ability; right or wrong.
The only way for ANY college student to do this is through real-world experience (from an organization that is willing to take on someone very green or private repo work/portfolio). I am regularly asked by randoms on LinkedIn and/or my network as well as interns and/or candidates what they can be doing to improve their chances to get an internship or FT role. In those instances, I always recommend that these folks:
1. Come up with a project, end to end.
2. Develop the code and post it in a repo that can be accessed by companies during the hiring process.
3. Be able to explain the why behind the project, the entire process end to end, and--most importantly--why the work was important to do.
These are the items I believe will put those green students toward the top 20% of their competition in the market. I find entirely too many candidates who have a lot of coding and theory courses and not a whole lot of application of those learnings. Being able to show management of a project end to end, explain the development process coherently, and articulate the potential business value it brings is a big way for those folks to stand out from someone who can just write code.
Many average four-year colleges aren't doing this for their students; CCs most definitely aren't.
Expecting college students to hit the ground running with whatever stack a company is running is bad and unreasonable, and we should push back against it. Engineering, law, accounting, and medicine don't do this. Why should Computer Science? Don't reduce a university education to learning JavaScript frameworks.
I guess. I was never a programmer, and I guess at my last job I pretty much hit the ground running--at least I was told I did. But not sure how much it's the norm.
I went to the University of Illinois. That’s a solid tech school pretty much across the board, often cracking the top five in a number of subjects, including CS.
But the real secret of that school was that, at least when I was there, they had too much computer hardware for the number of students.
The number of hours in a day you could show up in a lab and not have a waitlist was fairly generous. Which meant a lot of the smarter students had no problem working on their own pet projects.
From a “hit the ground running” standpoint, a lot of these kids were only bothering with a 3.3-3.9 GPA because time spent getting three more points on a test or homework was time lost from working on your project, or talking to people about theirs.
> I worked in the CC system; but, they were primarily focused on getting students to take their generals (call it what it is: an Undecided major). While this is great and all, most companies are expecting even their interns to have a 'hit the ground running' ability; right or wrong.
I haven't worked in the CC system, but I've been around the Bellevue/Seattle area CCs for many years.
And at least for those, there are broadly 2 tracks. The first is what you say - the first 2 years of a bachelor's degree track.
The 2nd track is a variety of certifications. Sometimes these are industry certs like Microsoft, Cisco, etc. And some of them are CC certs. But they aim to get people productive in 2 years.
Okay, but it's plenty enough for a paid internship. Many of these students are going to transfer to a 4-year after this, so I don't know why this would be different.
I think it's just rough no matter what. I've worked with a lot of junior devs and the thing is I haven't noticed that the 24 year olds with a fresh CS masters do particularly better than code school grads who were working at walgreens or whatever the previous year.
Software development is a discipline in its own right that only very partially overlaps with a university CS curriculum.
CCSF alum here (AA, completed over 20 CS courses). I attended a similar CCSF job fair in 2017 and there were only 2-3 companies involved even then. If I recall, it was Lawrence Livermore National Lab and Mission Bit, maybe a third that must not have been interesting enough to stay in my memory. There were ~10 students at the fair.
The LLNL internship seemed too IT-oriented to interest me. Mission Bit involved teaching high school kids to code after school, which I did for over a year and received course credit at CCSF for.
So it's not that surprising to me that they went from ~2 companies to 0, even disregarding the major drop in attendance at CCSF since 2017 [1]. I didn't attend any other fairs so maybe I missed something, but I never got the impression that they were especially "bumping".
The post looks like it's SF area. Given that employers want 4-year credentials and there's a big Stanford/Berkeley/top-tier bias, what was the representation by companies prior to this at the community college level?
Looking back over the decades in my career, only a handful of the really standout people that I've worked with had CS degrees. Most common were music degrees or being self-taught.
I understand how people recruiting are filtering for degrees, but I think outside of true entry-level positions it's completely silly to do so.
There is this thing also: when you have a project or a passion, you don't waste 4 or 5 years on a degree when you've already spent every day for 5 years learning these skills.
Many talented people just skip the school and go straight to work (for themselves or others).
The courses also are usually outdated. Learn AI by yourself with the videos of Karpathy and you will know state-of-the-art. Follow courses from your local university, and you will learn the perceptron and OCaml.
(1) the actual message doesn’t say the fair is canceled - it is still on; the message says no employers specifically looking for software engineers will be attending.
(2) it says that IT employers will be attending the career fair.
(3) it also says that they plan to have more government employers for software engineers at the next career fair.
IT and CS overlap but imply a different range of activities. I wouldn't be surprised that network admins or helpdesk representatives would be sought out.
When I first started teaching college it was for my local community college. CS was situated with other tech careers like System Administration, Web Development, and IT Repair. One of the first issues I had to manage was requests to adopt a "Software Engineering" certification, which I fought against because unlike the other skills, CS doesn't benefit from holding a cert.
CS in CC is in a weird spot - many 4-year universities will only accept CC CS1 as an elective, meaning they'll need to retake CS1 when they transfer. There may be some partnerships between a university and their local CC, but outside of those are gray areas. This could be because the CC instructors haven't reached out to get the courses to transfer or due to quality assurance. I know my predecessor would post the solutions on a sheet of paper in the classroom and students could earn partial, but passing credit if they retyped it out.
Finally, local-area companies may not need software development, or don't need it on a continued-employment basis. The goal of the CC is to help give the community the skills people need to find employment - but with more focus on LOCAL employment. If there aren't any local software companies, CC graduates will need to compete against 4-year university students for the same entry-level position. As a consequence, the 2-year program only taught them how to program, but didn't give them those additional "why you program" skills or projects to build experience.
Stepping away from this particular case, what does a well-executed Associate's degree for computer science look like? At top-tier schools (at least in the US, which is all I'm familiar with), there's an expectation that most of the incoming students already have a reasonable familiarity with the basics of programming, but that's unreasonable to expect for everyone entering college. And since it's useful to have some practical familiarity with programming before taking an algorithms class, it further seems like getting some sort of software engineering two-year degree would be a good use of time before either entering the workforce or deciding to cover some theory in the following two years (algorithms, computer architecture, compilers, etc.).
What would a two-year degree like that cover? A variety of useful languages plus their most common frameworks? Basic data structures? Common industry tools (git, CI, docker, linux)? Even though it doesn't fit very naturally into the US college experience, I'm wondering if a well-executed two-year "bootcamp" (for lack of a better term) could actually fill a gap that exists right now. It at least allows people to choose if they're interested in theoretical computer science or not. Theory is quite helpful, but not everyone wants/needs to opt in to math, proofs, etc.
Finland has a model of vocational programs as an alternative to high school. But standards and funding are so low it is mostly useless; the places seem to be mostly day cares for students.
Applied science universities, however, are not as rigorous as traditional universities, but do offer suitable degrees with practice in actual software development, with a lighter load of pure computer science or advanced topics.
In the US there are technical high schools in many states. Not sure exactly what the curriculum looks like. I'm sure it involves programming but probably not CS topics.
That's not surprising... I remember after the dotcom crash was in full swing, I went to a career fair and there were like 20 companies and 4,000 people searching for jobs.
From what I've seen, things haven't hit dot-bomb nuclear winter. But at that time there were a lot of IT workers in the "would you like fries with that" category who never got back into the profession. I was very lucky to get rehired shortly after 9/11 but, to be honest, didn't really get a well-paying position until years later.
The thing that saved the computer industry after the dot bomb was:
1) Broadband. A lot of the 90s business models that didn't work out at the time did work out a few years later as people got faster internet.
2) A little bit later there were smart phones.
The problem I see now is we need some other big thing to come along that would make a lot of business models that didn't work before suddenly start working. We haven't had any big platform shift since smart phones and the industry is now entering into a "maturation state." Kinda like aerospace in the 80s/90s after SST didn't work out and everything became about getting another 3-5% efficiency at mach 0.81 every decade.
I'm very glad I didn't go to work for Boeing in the early eighties when I had a job offer without even an interview out of school. I had to actually ask to get flown out to Seattle to meet with some folks.
I'm not sure where I'd put my bets these days. Blockchain ended up pretty much a bust even outside of the crypto-scammers. Quantum feels like it's receding, and I know folks who have exited from day-to-day involvement. Not sure what's past Kubernetes/containers/etc., and many folks seem to get annoyed if I press too hard on the question. Of course, there are always new innovations, but it's unclear to me what's game-changing, which may imply compensation at very ordinary engineering/professional levels, as was the case outside of SW in some locations for the past couple of decades. Which people with visions of $500K comp packages for ICs may find disappointing.
Maybe. We'll see how it plays out and whether it's a positive or a negative from an employment perspective and what types/levels of employment it affects. I actually think AI is probably a big deal but the details are very unclear to me.
I basically agree. There were a lot of things: Web 2.0, smartphones (as you say and mobile more broadly), broadband had been coming in but really blossomed and a lot of things associated with it... So there was definitely a dot-bomb with a lot of knock-on effects in the tech sector but people who were able to wiggle through a few years to the other side, like I was albeit with not great comp, were mostly OK and generally weren't as affected by 2008 as a lot people outside tech were.
Not surprising. But always a good reminder that despite what they try to tell (lie to) you: the economy is in fact not in good shape. Or at least, companies forecast a bad economy even if they believe the current economy is good.
As an employer I'll tell you what the problem is: there's too many universities and all of them have different UIs. I'm only reaching out to the professors I know and they'll figure out how to advertise the jobs to their students. Career fairs? Even worse, who has the time to do that? We just want to increase our funnel, not increase the time it takes to reach out to junior engineers.
The proposal in these kind of situations is to have common job boards (like Who's hiring on HN) and check there. Students should do that and not assume that employers will come to them.
Good professors have relationships with the industry and will preemptively reach out for their students (that's what my professors did in my uni back in France).
The reverse API could also work, have a standard that all universities follow so that one can easily post job descriptions to all relevant universities.
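To make the "reverse API" idea concrete, here's a minimal sketch in Python. Everything in it is invented for illustration: no such standard exists, and the field names, endpoint URLs, and schema are all assumptions. The point is just that an employer would build one payload and broadcast it to every participating school's endpoint.

```python
import json

# Hypothetical minimal schema for a cross-university job posting.
# These field names are invented for illustration; no such standard exists.
REQUIRED_FIELDS = {"title", "company", "location", "description", "apply_url"}

def make_posting(title, company, location, description, apply_url):
    """Build a job-posting payload conforming to the sketched schema."""
    posting = {
        "title": title,
        "company": company,
        "location": location,
        "description": description,
        "apply_url": apply_url,
    }
    missing = REQUIRED_FIELDS - posting.keys()
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return posting

def broadcast(posting, university_endpoints):
    """Serialize once, then target the same JSON body at every school.

    Returns {endpoint: body} instead of actually POSTing, to keep the
    sketch network-free; in practice each entry would go through an
    HTTP POST to the school's (hypothetical) standardized endpoint.
    """
    body = json.dumps(posting)
    return {url: body for url in university_endpoints}

posting = make_posting(
    title="Junior Software Engineer",
    company="Example Corp",
    location="Seattle, WA",
    description="Entry-level role; CS or CC background welcome.",
    apply_url="https://example.com/jobs/123",
)
requests_out = broadcast(posting, [
    "https://careers.example-university.edu/api/postings",  # made-up URLs
    "https://jobs.example-cc.edu/api/postings",
])
```

The win is exactly the funnel-widening the parent comment wants: one schema, one integration, every participating school, no per-university UI to learn.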
I've seen companies do some really stupid stuff recruiting-wise post-covid.
Like one company I'm familiar with that after a two-decade relationship with Waterloo, stopped sending recruiters there because the recruiters no longer wanted to travel so far. Literally their best hires came from there.
I know the California CC system pretty well and have taken a few courses. In tech, there are usually two main paths: either transferring to a 4-year school for CS/Engineering, or specializing in something like Cybersecurity, AWS, Cisco, etc. For Software Engineering, there’s not gonna be any employer demand for those students at CCs since most people going that route tend to transfer to a university and their education/experience isn't really complete for that role.
I think CS is starting to become more and more like being a lawyer. Nobody cares if you have the credentials unless it’s from a good school, but if it is they’re willing to pay a lot
That doesn't make sense because a lawyer's pedigree is valuable to the firm. They want to tell their clients that "we have Harvard lawyers working on your case."
Nobody really cares about the pedigree of a computer programmer. There are some narrow areas where they may want to have a highly credentialed person serving as "chief scientist" or something, to impress investors. A startup may want to tell investors that "our technology team is led by an MIT PhD in AI," but beyond that the credentials of your dev team are not marketable.
Contract firms care a lot, although I think the degree is intended to convey competence. An exceptional programmer is worth a lot more than two mediocre programmers, and so if you can convince someone you're exceptional, you can make a lot of money. Although I don't always agree with the logic, going to a top school can help convey this sentiment.
The situation with law is that you need credentials from a good school, with a decent class rank, and probably a good clerkship to get your foot in the door with a well-paying white shoe law firm. Otherwise there are probably plenty of law positions but they may pay pretty poorly in many cases.
Yeah, I think people arguing about "they want 4 year degrees" and "they use professors as contacts" may be missing the forest for the trees.
Career fairs are a very effective advertising route (guaranteed real people, high chance people are developing in relevant fields, chance to network with future recruiting wings) on top of a way to try and get interns or junior level candidates. Especially small local businesses who have no chance attracting national talent.
To tell people that you aren't interested in advertising (a business's favorite thing) paints a dire situation, reflecting how bad times really are. Especially in an area like SF.
And as a fun little dystopia to end off on, there are many non-school career fairs companies go to in 2023/4 with no intention of even interviewing. A few will straight up tell you they aren't and just want to "bring brand awareness". Yay economy.
Outside of courses, a college's alumni network is also pretty powerful. Four-year colleges tend to have a pretty robust network where alumni will advocate for their schools and come back to do recruiting, including working with professors they had.
In addition, colleges like to boast about their placement rates so they're incentivized to help keep that network robust
Let's all hope so, because when it's an employer's market, they tend to lean toward exploitation and burning out the people who do have jobs, which isn't exactly healthy.
renewiltord did not claim education provided at CCSF is low quality because it is free.
However, the barrier to entry to a higher education institution is commonly used as a proxy for the quality of the students coming out of it.
Someone who is good at learning can just learn for free; only someone who is not confident in their abilities would pay for the name of a college attached to a degree (with likely fewer actual skills transferred, because paid colleges depend on throughput, not on results). The actually good students at paid colleges are the ones who got an academic scholarship, i.e. free, like community college.
You're right and I read your comment history and really enjoyed the fact that you often correctly comprehend other posters who others pile onto. Pretty interesting about language comprehension. Love to see it.