Ask HN: Given AI advancements, is a master’s degree in CS worthless?
63 points by lisplist on Dec 20, 2022 | 117 comments
Hi all,

I’ve got a BS in Computer Science and have been considering pursuing a Master’s degree part-time with a focus on ML/AI.

I know the common narrative is that a Master’s in CS really isn’t worth it if you’re just looking for a pay raise. However machine learning is an area I’m interested in but lack the requisite background. I just really worry the degree will mostly be worthless by the time I graduate considering the rate at which AI is advancing.

The degree would mostly be for personal knowledge/fulfillment, but I don’t want to bother with it if we’re all going to be unemployable in a few years anyways. Another alternative I’m considering is learning HVAC repair as a fallback career.

What are your thoughts?




Disclaimer: I am a CS professor.

I don't think AI advancements will cause a problem for the value of the degree (or rather, if they do, then it wasn't a very good MS degree). The value of formal university CS education done well, at both the BS and MS levels, is learning skills in a context that integrates those skills into a knowledge framework that transcends any particular technology and hopefully outlasts several trend changes. The specific ML algorithms you would learn in an ML-focused MS will likely be out of date soon; the training on problem formulation, data preparation, fundamental limits of learning, and the theory of how ML works will not only outlast many technology shifts, but give you a good framework for navigating those shifts and integrating new advances into your knowledge.

There are likely many programs that would not provide this kind of foundation. But in thinking about the value of an MS in general, this is how I would advise a student to approach it. (And on MS vs. BS: a BS usually provides some opportunity for specialization but is very much a generalist degree; an MS provides more opportunity for specialization, and credentialing on that specialization.)


Asks a drug dealer: "How do you feel legalization will impact your business?" /sarcasm

Disclaimer: I dropped out, but I do wish I had finished, just because it's sad to now be 36 and I hate leaving things undone.

In all seriousness, I think higher ed has issues to resolve regardless of whatever AI does to it. The ongoing imbalance between what a degree costs and the value one can extract from it has mostly affected students outside CS and other engineering degrees, but with a slower economy we may end up sucked into the problem other fields have long suffered from. Speak to anyone in the environmental field: hard to believe it's /the issue/ of our time, yet we value it so poorly.


>The value of formal university CS education done well, at both BS and MS levels, is learning skills in a context that integrates those skills into a knowledge framework that transcends any particular technology and hopefully outlasts several trend changes.

While I don't disagree with your main point re: the value of a CS degree, this is the same argument given verbatim by every English, History, and underwater-basket-weaving professor.


They’ve also got a point. The skills may not be technologically valuable, but they can teach critical thinking and give broader context for life. Philosophy majors tend to do better than average salary-wise as well.

That said, I also believe many fields have gone bonkers. The whole "everybody needs a degree" mindset also creates incentives for degree factories.


Outside of ML/AI what would you say are areas of CS in which a lot of active research is being conducted?


Programming language theory and formal verification have been relatively hot during the last 10-15 years and show no signs of slowdown. Still, a relatively niche area.

Also the intersection of CS, probability and statistics is a very interesting area to work on. Less trendy than deep ML, but really practical. See e.g. Stan, Pyro, Andrew Gelman's books, etc.
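To give a flavor of that CS/probability/statistics intersection, here's a tiny hand-rolled sketch (not Stan or Pyro code; those tools automate inference for far more complex models). A conjugate Beta-Binomial update is Bayesian inference reduced to one line of arithmetic:

```python
# Bayesian inference by hand: Beta prior + Binomial likelihood.
# For this conjugate pair the posterior has a closed form; tools
# like Stan and Pyro exist for the many models where it doesn't.

def beta_binomial_update(alpha, beta, successes, trials):
    """Return the posterior Beta(alpha', beta') after observing data."""
    return alpha + successes, beta + (trials - successes)

# Flat Beta(1, 1) prior; observe 7 successes in 10 trials.
a, b = beta_binomial_update(1.0, 1.0, successes=7, trials=10)
posterior_mean = a / (a + b)  # (1 + 7) / (2 + 10) = 8/12, about 0.667
print(a, b, posterior_mean)
```

Gelman's books cover why you'd want the full posterior (here a Beta(8, 4)) rather than just a point estimate.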


Thanks for the insight. My Software Quality prof gifted me a copy of one of Gelman's texts but I haven't had time to take it in; I should change that...

It's weird to me that formal verification isn't more widely used; I would think it would be common at least in safety critical systems development.
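For a rough sense of what verification buys you, here is a toy "bounded check" in Python (my own illustration, not a real verification tool): exhaustively testing a spec over a small finite domain. Real formal-verification tools such as SPARK, TLA+, or Coq prove the same kind of property symbolically for all inputs, which is exactly why they appeal to safety-critical work:

```python
from itertools import product

# Bounded "verification" of a trivial helper: exhaustively check the
# spec for every input pair drawn from a small domain. This is testing,
# not proof, but it shows the shape of the properties FV tools prove.

def my_max(a, b):
    return a if a >= b else b

def check_spec(domain=range(-3, 4)):
    for a, b in product(domain, repeat=2):
        m = my_max(a, b)
        assert m in (a, b)        # result is one of the inputs
        assert m >= a and m >= b  # result dominates both inputs
    return True

print(check_spec())  # -> True
```

The catch, and one likely reason FV isn't more widespread, is that real systems have state spaces far too large to enumerate, so proving properties requires specialized tooling and expertise.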


> I just really worry the degree will mostly be worthless by the time I graduate considering the rate at which AI is advancing... The degree would mostly be for personal knowledge/fulfillment, but I don’t want to bother with it if we’re all going to be unemployable in a few years anyways.

I went to university from 2005-2008. Back then, with dot-com scars still fresh in everyone's mind, an extremely common piece of "advice" I received was: Don't bother going into programming; software development is going to all be outsourced to offshore developers. You'll never make more than $50k/year in your career as a developer, the competition from India and Bangladesh will be too high.

As much as futurists hate to admit it, coding AI is still way worse at many things than even the now-near-universally-loathed "outsourced dev" boogeyman. Your job as a software developer isn't to write functions that reverse a binary tree or solve the Towers of Hanoi puzzle. I haven't seen any evidence that AI can evaluate a legacy codebase and determine the best integration path forward. I haven't seen any evidence that an AI can figure out how to put together a backwards-compatible API. I haven't seen any evidence that an AI can put together a build pipeline.

Your question is based on the assumption that, because ChatGPT can spit out some pretty impressive stuff, an entire career path isn't going to be viable. I will tell you emphatically that assumption is wrong. Spend a few years in the industry and you'll understand that ChatGPT is impressive, but it only touches about 5-10% of what a software developer really needs to do.

It will be an important tool for developers going forward, and maybe reduce the overall number of devs needed in the world due to increased efficiency, but no, it's not going to replace software developers. Not even juniors.


This! Tech influencers on YouTube are using these bad examples to scare people into believing that ChatGPT will take over programming jobs.

The day AI technology can fix bugs in a multi-million-line codebase and make improvements to it, that's the day I will start worrying. That day is far, far away.


Couldn't that mean they're at least partially right?

For instance, I don't think it's unreasonable to suggest that the low-hanging fruit of developer AI will be combining AI with no-code, so that a small or medium business owner can build their own website or apps using the English language, assuming their needs are pretty typical. This is more or less already possible with sites like Wix, just without the AI part; AI would add some flexibility that Wix and Squarespace lack. A business owner would then be able to say "put a widget on my homepage that shows the latest video from my YouTube channel" and AI would probably be able to do it where an existing component either doesn't exist or isn't as straightforward.

So what ends up happening is that human software development more or less becomes an exercise in shoveling dung rather than building new things from scratch, which I think we're already seeing anyway, regardless of AI.


Yeah. The skill floor and skill ceiling keep rising as tools get better. AI is just another tool in the toolbox.

As an example, even as late as the '00s, you could still get a job as a "web developer" where you only made static sites with HTML, CSS, and basic Javascript. First, tools like Dreamweaver or FrontPage, and now sites like Wix, made that kind of position obsolete. However, the "web developer" is now called a "front-end developer" and is still very much alive, just focusing on different things.

Or another example, there is this nearly extinct breed of people known as "database administrators." You used to be able to get a job as a DBA by just knowing how to set up backup scripts, optimize indexes and set up disk space monitoring. (If you could set up a read replica, you were top-tier!) Now cloud tooling has made all of those things trivial. Yet those same people are now very likely "DevOps Engineers" or "Cloud Engineers" which, again, are in extremely high demand.

You should only feel threatened by advances in tech if your life plan was to learn how to do one thing and then never develop any new skills. In the tech industry, that's been a path to failure since the beginning. For most of us, AI will, in the best case, be another tool and allow developers as a whole to move on to the next big thing.


I'm pretty sure there's still reason to feel threatened even if one's life plan isn't to learn just one thing. Being willing to continually learn doesn't mean there's infinite capacity for how many skills a person can learn, frequently or simultaneously. Being "willing to learn" sounds cool when you're in your 20s, but eventually the constant (and 99.9% needless) churn is going to become tiring and even insulting to one's intelligence. I feel sorry for anyone still in constantly-learning mode in their 30s or beyond, the time in their life when they should be doing other things besides hustling.

It's not that I don't agree with you up to a point, but I wonder just how sustainable this trajectory really is. AI hasn't really been a "thing" in everyday life until relatively recently.


I am working on a natural-language programming tool using the OpenAI APIs. I have a version, already started, which I believe can do your task if there is a file in the directory named "latest_youtube" or something and the home page is less than 12kb. I believe text-davinci-003 knows how to embed a YouTube video, knows how to select files to look in, and will be able to find the latest entry in the JSON file.
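The flow described above could be sketched roughly like this (a hypothetical illustration, not the commenter's actual code; `MAX_PAGE_BYTES` and `build_prompt` are invented names, and the model call itself is elided):

```python
import json

# Hypothetical guard-then-prompt flow: refuse pages too large to fit
# in one completion, then hand the model the video metadata plus the
# current homepage HTML and ask for the edit.

MAX_PAGE_BYTES = 12 * 1024  # the "less than 12kb" guard mentioned above

def build_prompt(page_html, latest_video):
    if len(page_html.encode("utf-8")) >= MAX_PAGE_BYTES:
        raise ValueError("page too large for a single completion")
    return (
        "Add a widget to the homepage that embeds this YouTube video.\n"
        f"Video metadata: {json.dumps(latest_video)}\n"
        f"Current homepage HTML:\n{page_html}"
    )

prompt = build_prompt("<html><body></body></html>",
                      {"id": "abc123", "title": "Latest upload"})
# The prompt would then be sent to a completion model (e.g.
# text-davinci-003) and the reply written back to disk.
print("abc123" in prompt)  # -> True
```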


And if it’s not far away, we’ll have many other things to worry about! For me, accepting my total helplessness on world-altering events like this makes the anxiety go away. There could be a zombie apocalypse tomorrow, or a giant meteor could be headed right toward earth, or I could be diagnosed with terminal cancer. I’ll either survive or I won’t.


I think it's quite possible that "code monkey" type jobs will shrink in number as automation of boilerplate tasks accelerates. I think a lot of "full stack developer" jobs are in large part full of these types of tasks: know how to wire up the framework and shuttle data from point A to point B in the canonical fashion. That work will still exist, but it will be primarily supervisory. I hope.

I think there will still need to be highly qualified people 'directing' the tools, teaching, and determining architecture.

I suspect there'll be fewer of us and the job will change, but the market need will still be there. I think in this context a master's will indeed be quite useful.

(Saying all of this as a person who has gone their whole career without a degree.)


I remember around 2008 my older brother wrinkled his nose when I mentioned I wanted to learn programming. He said something about how programmers just did what they were told and it wasn't a good career choice. Sure glad I didn't listen.


I think the right question isn't whether or not AI such as ChatGPT will replace developers. The right question is how much more efficient it can make a developer, especially when AI tools are purpose-built for developers and those developers are good at using such tools.


I thought a master's degree was just a trick/glide path to get a visa?

My own experience is that a master's is for people who need to immigrate to the US for work and can afford one. When I was in school, the master's-level CS students weren't expected to know CS well going in, so it was kind of like cramming a full CS degree into a two-year window… with not-amazing results. Obviously that depends on the student. There were also lots of students who just didn't want to work professionally yet, since you can get student loans to keep working on a master's.

All that is to say I’m extremely surprised by the people saying the degree is valuable on the merits and not for some other instrumental reason.

P.S. Whatever your education level, programming, at least at software companies, is an extremely privileged career regardless of pay. Hours, work environment, remote availability, and treatment of labor by management are all better than I can imagine in even comparably paid trades positions, especially if you've already invested in the bachelor's. I think people in the software engineering bubble can sometimes fail to appreciate how good they have it relative to others (especially if they get caught up comparing themselves to FAANG, or have ever complained about an equity package).


> people saying the degree is valuable on the merits

The author of this post: https://blog.alinelerner.com/how-different-is-a-b-s-in-compu... suggests that Master's degrees are actually an indication that you're incompetent, unless you're a foreign student angling for a visa.


US universities don't really have research-focused master's programs, so they are shitty cash-cow degrees. I think a research master's from a good European university can teach you a lot and make you a better programmer. It's kind of like doing a few years of a PhD in the States and then dropping out.


Heh, in some fields that's exactly how you get a master's in the US.


Interesting. My dad leads the AI division for a major tech company (one of the top 10) and loathes interviewing candidates with masters degrees for the same reason. "They are smart when it comes to research but fucking clueless for implementation" I believe were his exact words.

I'm a high-school dropout and I've been doing software dev for 17 years or so, currently as a senior dev in FAANG. My experience with master's degree holders has been about the same, and I often surpass them on teams I've been on when it comes to promotions or recognition. To be fair, though, two of the most brilliant devs I've ever met in my career had master's degrees.


> To be fair, though, two of the most brilliant devs I've ever met in my career had master's degrees

From what era/decade? The world isn't static. An MS in CS earned in the '80s means a different thing than one earned today.


I wonder if that's more true for people who went straight from a BS into an MS without spending any time in industry than for people who got their MS after, or while, working in industry.


It's funny how, while this might make some sense for the US, it doesn't apply at all to Europe:

- many positions in the EU, especially at the leadership level, outright require a Master's degree.

- most master's programs are consecutive, meaning you need a bachelor's in the same or an adjacent field in order to pursue the master's.

- there aren't high fees, and the admission process is often more competitive than for the bachelor's. Employers would sooner wonder why you left university early. A bachelor's still has the reputation of not really being a full degree.


Gatekeeping. Make it about anything other than the actual skills required.


> If your undergrad degree was in some other field, you can get through an MS in CS without ever taking an algorithms or data structures class.

Hate to admit it, but she has a point. Though my CS master's program (BSU, US) only accepted me on a provisional basis until I finished the undergrad CS algorithms courses. Those were more challenging than many of the MS courses, partly due to the challenge of entering a foreign discipline.


> I know the common narrative is that a Master’s in CS really isn’t worth it

Having interviewed a bunch of job candidates with CS Masters and CS Bachelors degrees, plus a shmear of PhDs, for jobs in a fairly small, research-y group, I can say that level of education absolutely matters. The more education, the more prepared people are to think well on their feet. The average high-school dropout can think; thinking well requires training. Generally, a CS Masters hits a sweet spot: they can hit the ground running and mold themselves to the job. They may need a bit more guidance to understand the space around the problem. The PhDs have often self-selected into the job and have a good grip on the problem, but take a bit more guidance on what not to do. The Bachelors folks need strong leadership and a team around them. While you may get that in larger orgs, in a smaller org it may not be reliably available at all times.

> considering is learning HVAC repair as a fallback career.

Having had HVAC guys install and repair A/C systems at various homes, I think you would quickly find the baseline work mind-numbing. That said, if you make it through the apprenticeship, you might be in a good position to build a startup.


From experience, people with CS Master's degrees are able to navigate uncertainty better than bachelor's folks, show more initiative, and are better at accumulating information, though I note that I may be biased.


Accountants used to spend an enormous amount of time manually doing arithmetic and balancing spreadsheets. When the first computerized spreadsheets came out, it was a massive improvement in accountant efficiency, and you would've expected there to be fewer accountants at the end of the revolution. Instead, my understanding is that there are more accountants per capita today than there were 50 years ago.

Similarly, even without ChatGPT, most individual programming tasks are easier today than they were 50 years ago. No punchcards is an amazing development. Yet there are way more software engineers today than there were 50 years ago.

It’s no guarantee, but often an improvement in the efficiency of a resource does not reduce the overall usage of that resource, because it increases demand for it. It’s hard to predict what effect a new technology like AI-driven programming will have, but the range of outcomes includes increasing demand for adjacent skills as well as reducing it.

I doubt a CS master's skills will forever be as highly remunerative as they have been, and the work will be different, but that is different from becoming less valuable than an HVAC certification.

An AI/ML focus seems especially valuable, as understanding how these new systems actually work, how to best use them, and what their pitfalls may be is likely to be a hot skill set for some time.


Current AI models are like an airplane, and programming jobs are like flying to the moon. No number of incremental advancements on current technology is going to replace programming jobs. That only seems possible if you don't understand the details.

You need a categorically different technology (a rocket) to do space travel.

A layperson seeing the first airplane and the rapid advancement of that industry could easily think, "Look, we've been flying higher and higher; soon airplanes will fly to other planets." If you were to point out that that's not how airplanes work, they might retort that people used to think flight was impossible at all.

We're at the Wright-flyer-in-1903 moment in ML/AI right now. It will get a lot better and change many things, but it will ultimately still be just a tool of programming rather than a replacement for it until someone invents AGI, and even as someone who is optimistic about that happening, I'd guess we're still decades away.


> The degree would mostly be for personal knowledge/fulfillment

I feel like this may answer your own question? It may be trite, but even in a world where AI makes us obsolete, there's still value in doing something fulfilling that you're interested in. Just because DALL-E 2 can replicate an oil painting doesn't mean that there isn't personal value in physically painting anymore.


> Another alternative I’m considering is learning HVAC repair as a fallback career.

Why not both? My neighbor configures HVAC for datacenters for a living. From what I've understood in our chats, there's a lot of expert-system processing going on, and it's only going to grow as ML/AI does. When there's a gold rush, don't invest in gold; invest in pickaxes.


My father-in-law was an "HVAC guy" and got to know some commercial real-estate owners through networking. I think he made $250k/year working when he wanted. Granted, it was very hard physical work.


No. AI still has to run on an operating system, and people will always need computer science to understand the fundamentals of how these things work.

Can’t comment on the state of machine learning/AI as it is not my field of expertise, but I would also guess no. The mathematical basis of AI is not going to radically change; linear algebra etc. is always going to be useful no matter what happens.

Neil Gaiman says he visualised becoming a writer as climbing a mountain and weighed up every decision by whether it took him closer to the summit. I would say you need to get clarity on what your ultimate goal is, i.e. which mountain you want to climb, and then evaluate the master's in relation to that. We can never be sure we'll make it to the top regardless, so you've got to have a little bit of faith. And you could very well fail at whatever your backup plan is, so you might as well take a chance on the thing you actually enjoy.


>I just really worry the degree will mostly be worthless by the time I graduate considering the rate at which AI is advancing.

A master's is two years. What significant AI developments could possibly happen in that time that would reduce your job security?

Do you plan on your job being writing 20-liners that solve the most common CS questions on SO? If yes, SO has already replaced you; if no, what competition from AI do you have?

I don't want to make any grand predictions, but current generation LLMs will not make a dent into programmer jobs.


A few years ago we had GANs that generated meaningless but somewhat legible code that didn't compile.

A couple years ago the pointless code started to compile.

A year ago we got copilot.

Over the past week or so I have been pasting React components into ChatGPT, and it successfully tells me what they do; I can ask it to change them, and it changes my program for me.

I would say that in a couple of years AI may be in complete control of entire Git repos for backends, front ends, etc., and you will be able to modify them by describing the new behavior in plain English.

I mean, right now it's the same situation as Stable Diffusion, but with code: it's "mostly right". When that crosses over, it's going to take out this profession. I honestly don't know what to do.


AI which produces 99% correct code is still useless.

Current LLMs will not threaten programming jobs in any way.


ride the tiger lol


Lots of confidence here that our careers are going to be OK... I'm suspicious, because if there were sufficient AI to replace us, we would be in the position of our salaries depending on us not understanding that fact.

Seems to me that the only thing stopping AI from doing our jobs in 5 years will be the legal department. At that point I guess my last job will be to set up the on-prem/cloud infra that runs the model; AI doesn't change the requirement of owning your own data. After that, I dunno. What jobs will be left? AI trainer, mascot at Disneyland, and CEO?

I might have the time horizon wrong, but I do think it's going to be a bloodbath. There's nothing about previous industrial revolutions to suggest it would be anything but. And if you think you'll be able to see it coming, have a look at https://intelligence.org/2017/10/13/fire-alarm/ . This problem space is starting to feel less like fusion and more like flight.


Are you picturing a world where companies are still making mostly the same kind of products as today, just with robots instead of people?

The change AI brings could be a lot more fundamental than that. What that means for jobs, though, is really hard to predict.


I'm picturing a curve of increasing steepness that we'll all inevitably fall off of eventually, but where the marginal utility of hanging on for one more cycle increases.


Is it that hard to predict? In case of AGI, humanity is likely to go extinct - we already live a perilous existence on this dot. Jobs will be the least of our worries.


It's possible we'll be able to blow up the AGI before it's finished with us. Maybe the best possible world is one where an early AGI destroys a huge amount of capital and ruins the reputation of its successors, so they're never turned on.


I meant this decade or so.


Instead of thinking about jobs, think about startups that solve problems. You can "hire" the perfect AI or robots as employees.


for very rich values of "you"


> a Master’s in CS really isn’t worth it if you’re just looking for a pay raise

Nope.

> the degree will mostly be worthless by the time I graduate

A Master's degree's value is not in the information content you managed to download into your brain, and its value doesn't have to expire as a domain advances. As with anything academic, most of the value depends on how much sweat and time you put into the thing (independently of how good the classes or profs are), and if you do it right, your competitive advantage will be in having learned how to learn better and faster.

> The degree would mostly be for personal knowledge/fulfillment

Assuming you can afford to study, and to do it well, this is the only thing you need to focus on.

> Another alternative I’m considering is learning HVAC repair as a fallback career

This comment makes me wonder whether you're being honest with yourself about what your motivation is. Nothing stops you from learning about HVAC as well, if you're interested in that.


My motivation is learning useful, relevant skills that

A) Are intellectually fulfilling

B) Help me remain employable in the current rapidly changing world of tech

I brought up HVAC repair because it’s a skill that is at least currently difficult to automate and pays reasonably well if you own your own business.


More schooling won’t help you stay employable. Work at jobs that use newer, widely used technologies, and don’t let yourself stagnate. Develop a variety of skills and keep an eye on whether your team, your company, or you personally are getting into a rut.


AIs are not going to take your job (though they may change its nature) for the foreseeable future, and a degree in Computer Science is still valuable. It's always valuable to diversify your skillset, but in general it's highly likely that you will get more out of a CS degree than an HVAC qualification, if CS is what you're interested in.


It’s still worth it because the value in a CS degree is NOT the domain knowledge you will learn, it’s the fact that it will teach you how to think.

You will learn (or at least be exposed to):

problem decomposition, systems thinking, good interface design, encapsulation, abstraction and generalisation, optimisation, algorithmic complexity, time/space trade offs, searching algorithms, and much much more

All of these apply to many fields outside of programming. I have used binary search to bisect real-world problems, systems thinking to help with social interactions, abstract thinking to plan life events, and optimisation to make sure all my tasks get done efficiently.
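The bisecting trick mentioned above is worth spelling out, since it's exactly what git bisect does (a generic sketch, not tied to any particular tool): if some predicate flips from "good" to "bad" exactly once in an ordered history, binary search finds the flip point in O(log n) checks instead of n.

```python
# Find the first "bad" version in an ordered history by binary search,
# the same idea behind git bisect.

def first_bad(n_versions, is_bad):
    """Return the index of the first version where is_bad(i) is True,
    or n_versions if none is bad. Assumes is_bad flips at most once."""
    lo, hi = 0, n_versions  # invariant: answer lies in [lo, hi]
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(mid):
            hi = mid        # bad at mid => first bad is mid or earlier
        else:
            lo = mid + 1    # good at mid => first bad is after mid
    return lo

# Toy history: versions 0-99, bug introduced at version 42.
print(first_bad(100, lambda v: v >= 42))  # -> 42
```

The same shape works for any monotone real-world question: which config change broke the build, which dose crosses a threshold, which date a dataset went stale.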

Let me tell you something really, really important about university: it’s about learning to think, NOT about gathering and hoarding some domain knowledge and then exploiting it for the rest of your career. The worst type of university graduate is the one who spends a few years learning some bits of knowledge in a field, then sanctifies that knowledge and sits there with a sense of entitlement, hoarding what they think is valuable. Everyone can look up stuff on Google now; hoarding knowledge to appear valuable is a mug's game. It’s not 1850 anymore, where that strategy worked.

The valuable part is learning all of the different ways to think, then applying them and keeping your mind fluid and flexible. Of all the degrees, I think CS is one of the best for this.


AI may take over all programming jobs one day (assuming we don’t destroy human civilization first), but we’re still at the Ford Model T stage of AI development. Do you think in a few years we’d just have CEOs mumbling to AIs and nobody in between? Far more likely is that developers will use AI enabled tools to generate more and better quality code in the same amount of time. In that scenario, having a Master’s in ML/AI would absolutely be an advantage.


The masters might help you better explore why this is not a realistic concern :) I got a lot of value from mine.

The risk in CS that I worry about right now is suppressed wages as more folks enter the field and the large companies get better at paying us less.


This is a big concern of mine. Maybe all the dev jobs don’t go away, but our wages are suppressed enough that I might as well do something else.


Yes, get the degree (as long as you’re ok with the cost / debt burden, which you did not mention).

It’s very, very unlikely IMO that AI will take our jobs in the next 5-10 years. But I think the demand for good ML engineers will stay strong. Having a master’s in this is a big leg up, it’s hard for people to learn ML well without formal education.

Also, education is a great way to ride out the current crappy job market.


> Also, education is a great way to ride out the current crappy job market.

Don’t sleep on this.

In 2008 I had a few friends go back for higher education rather than stay at/take low quality jobs. They graduated with additional debt, but also with credentials that accelerated their job search and earnings during the recovery.

If your personal situation allows it, it’s a good idea to consider.


This has definitely been on my mind. A Master’s might help you stand out a little more when everyone is fighting for roles in the downturn.


So, so much of our job is not writing code; it's understanding what is going on, communicating it to non-technical people, etc. It is very clear that these tools don't have an understanding of what they're doing; it is all an advanced form of pattern recognition.

Some of our jobs may be automated by AI in the next N years, but I think we are quite a ways away from our jobs being truly replaced, as opposed to simply augmented.

Don't forget that in many ways these tools may (and are likely to) spawn the need for more programmers while decreasing the total cost of development, but we will absolutely still be needed.

Remember, a computer can't be "fired". It can't be "blamed" when the site goes down, etc etc. We work in human organizations and fill in many needs beyond simply the time we spend in emacs.


> Another alternative I’m considering is learning HVAC repair as a fallback career.

Not sure if this is a joke or not. If it's not a joke, I think you should go into HVAC. AI/ML is a competitive academic field, and only the top few percent of people are going to make an impact. It will take extraordinary passion to make the cut, and if you're the type of person who is wavering, looking for a career change, and seriously considering something like HVAC repair, please save yourself the pain and go for HVAC - it's a solid option with far less demands or risk.


> please save yourself the pain and go for HVAC - it's a solid option with far less demands or risk.

The physical demands of the trades are very real. Many people retire with disintegrated knees, backs and life-long chronic pain. The risk of learning some AI/ML is basically zero compared to it -- a couple years of an income hit and ego damage.


Why do you think ChatGPT will affect employment for programmers? If ChatGPT makes a few more major leaps forward, it might be able to make me more productive (it is not even close currently for my work), but I don't see how current AI models will ever be able to replace my job.

If it improves an order of magnitude it might be able to make me more productive and I will be able to charge more for my time.

I wouldn’t even begin to know how to ask ChatGPT to help me with some of the more complex problems I am dealing with on a day-to-day basis. As far as I understand, these models don’t build up any understanding, and even their memory is an illusion: the previous conversation is just re-evaluated with every new question. I assume this limits the complexity of what it can do, and it does not seem scalable to me.

I see it being helpful in generating function names etc, perhaps adding more context to autocompletion tools for better suggestions, or as a better front-end to Google so search becomes more efficient. And hopefully it helps me write the boilerplate in every email.

I honestly hope it improves a lot; I see only benefits for our jobs. But it's currently not ready for serious work, although it is a very impressive feat of engineering.


Isn’t this a bit like saying that cars will never replace horse-drawn carriages, as they max out at 5mph, but might be useful for cleaning up manure?

Or perhaps Ke Jie who claimed that he would be able to beat AlphaGo, after seeing Lee Sedol beat it in one game. AlphaGo won easily.

This is not a criticism per se, everybody else in this thread is also assessing AI from its current capability, when OP asked about future capabilities.

There’s far too much complacency.


I am not saying AI will never replace my job, just not the current generation models.


What are the problems you are dealing with on a day to day basis that ChatGPT could not possibly help with?


Absolutely not!

A master's degree in computer science can be a valuable asset. While it is true that AI technologies are advancing rapidly, a master's degree in computer science can still provide a solid foundation in fundamental concepts and technologies that are likely to remain relevant for many years to come. In addition, a master's degree can help you develop advanced skills and expertise in a particular niche.

Moreover, a master's degree can also benefit those interested in pursuing academic or research careers in computer science.

Also, it is important to note that a "degree" or a "certificate" is just one factor. Other factors, such as your technical skills, ability to work well in a team, and problem-solving abilities, will also determine your success in the industry. Lastly, going to a University can also help you network with your peers, which is helpful in the long run.


> I just really worry the degree will mostly be worthless by the time I graduate considering the rate at which AI is advancing.

Unlikely. There might be less routine work grinding out boilerplate code to be done but competent developers are far from being replaced.

> if we’re all going to be unemployable in a few years anyways

I'd be more worried about the economic outlook for the next couple years.


As someone who hires a lot of ML engineers: a master's can be a good way to learn but isn't a strong signal on a resume.

Hands on experience and side projects (especially ones in investing/betting where you actually put money on the line) count for way more.

A good master's will help you nail down the fundamentals of linear regression, metrics, regularization, gradient descent, etc. That knowledge doesn't go out of date, and you do use it and get asked about it in interviews.
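To make that concrete, here's a minimal, illustrative sketch (plain NumPy, not meant as a production implementation) of exactly those fundamentals in one place: linear regression fit by gradient descent with an L2 (ridge) regularization penalty:

```python
import numpy as np

def fit_ridge_gd(X, y, lr=0.1, l2=0.001, steps=2000):
    """Linear regression with an L2 penalty, fit by plain gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(steps):
        pred = X @ w + b
        err = pred - y
        grad_w = X.T @ err / n + l2 * w   # ridge penalty applies to weights only
        grad_b = err.mean()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Noise-free data from y = 2x + 1; the fit should recover roughly those values
X = np.linspace(0, 1, 50).reshape(-1, 1)
y = 2 * X.ravel() + 1
w, b = fit_ridge_gd(X, y)
```

The interview questions tend to probe exactly the knobs shown here: what the penalty term does to the solution, why the bias is usually left unregularized, and how the learning rate trades off speed against divergence.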

Large language models etc are growing rapidly, but how most practitioners use them changes much more slowly. People still use BERT and the smaller models for computational reasons, and the ways you fine-tune, evaluate, prompt-engineer etc the larger ones don't change that quickly.

There is also the large area of debugging, monitoring, performance tuning and improving large training and production systems. Even if generative models could write great code for an entire system, I don't see this area going away any time soon.


My thoughts are: given the current state of projects, and the fact that so many of them require maintenance (which is actually 3 to even 10 times more costly than the initial development stage), there are too many wannabe programmers in the world who only do a half-assed job. So if AI development scares the living shit out of those people into pursuing other careers, then it's all for the greater good.

As for people who really like programming, who are passionate about it, they have nothing to be scared of. Coding camps and universities churning out software developers — all of this needs to die. In the past, plenty of people became miserable doctors because doctoring was a good-paying job and their parents forced them into it, only to either ruin a patient for life or grind through a soulless job just for the financial and social status reward. Now software developers/programmers are the new doctors.


Visual Studio recently augmented tab completion with an AI based system to suggest small parts of code.

Although it's impressive, the suggestions are so frequently wrong that I doubt AI will be able to write code as good as a human during either of our careers.

Instead, AI will merely become another tool in our toolbelt that helps us in our day-to-day jobs.


Someone reverse engineered Copilot and found that it was using the weaker, faster OpenAI coding model. So they are not using code-davinci-002 or text-davinci-003, both of which are better at code generation.

Also, several groups including OpenAI are racing to advance the state of the art in text (and code) generation. We should anticipate better models being available within the next year or so. In fact, I believe some papers from within the last month or so have improved the SOTA in their contexts.

Try the Codex Playground stuff on the OpenAI site.


Just think how productive you could be with your background if you leverage AI to help you.

That is how you should be thinking.

(I typed this comment in between writing technical articles with ChatGPT doing 90% of the actual writing while I tell it what to write about and make corrections.)


I have witnessed software development jobs being threatened by one thing or another for the last 20 years. At this point I see the jobs moving from CA to states with sane politics/CoL (or even to Europe, at least judging from the latest layoff trends) courtesy of WFH as a much more credible threat.

AI/ML is the only domain which I find completely unlike regular software development. It requires more than simply common sense and love for computers. Possibly even a completely different mindset/psychological traits. I'm not sure I'd be able to do it even with years of data engineering experience.

I'd feel much more secure about my future in software if I had a stronger background in stats and some formal ANN training.


(1) If you want to learn AI/ML, there are ample resources online that do not require a costly, multi-year academic stint. The field is also maturing rapidly, so you'll need to stay up-to-date with online materials anyway.

(2) If you want to go deeper into multiple areas of CS (e.g. non-AI/ML course requirements), then it could certainly make sense.

(3) Getting the MS from a "premier" school will likely improve your career prospects -- it's a matter of signaling, but still matters less than your work ethic & the quality of work you perform on the job.

(4) If AI makes programming obsolete, the logic & foundations will still be valuable -- e.g. to instruct or query the AI. Besides, this is a decade+ out (at least).


From an experiential point of view, this reminds me a lot of the "computer science is useless because of outsourcing" claims I heard when I was in college. It didn't prove true then. It could be true now, but I admit it's harder to take the sky-is-falling thesis seriously now.

From a theoretical point of view, I believe we are a ways off from AI replacing software engineers. The hardest part of most real world software development is the translation of requirements into a workable product. I don't have firm proof for this, but my argument would be that that translation is itself something that would require general intelligence AI and that we are a ways out.


It was worthless even before the AI advancements. I don’t think my MS degree contributed in any significant way, either in the day-to-day or in the overall arch of my career. I seriously consider it to have been a waste of time and money.


It's tempting to think that way but maybe it's worthwhile rephrasing the question as "is it worth getting a Masters degree at all?" In some fields the answer is clearly yes, in others not so much. Where does CS fall in that spectrum? I think you could spend a couple years doing some great research that will serve you well in your future career, even if advances in the area you're interested in are outpacing your work. A lot of it is about the mindset of a researcher vs. a practitioner and of course we all want to be both.

Some professors are better than others, though, I think the value you get will depend a lot on that.


Many on HN have been telling us that most software jobs do not require a CS education and, unfortunately, they're not really wrong about that. However, those jobs are the first that will be replaced with AI. Given that you have a CS education, you should be able to identify whether you're in jeopardy by how much of it you're using, and you are also in a better position to leverage that education to move to another subfield of the software industry if you are. An MS in CS can help you with that.

As for fallback careers, I'd say go with plumbing. A trustworthy plumber will never go hungry.


HVAC repair will likely be done by robots within 10 years. It could even be 5 years. So I don't think it would be a long career.

I think that we should plan to create businesses rather than be employees. And to do that we can leverage AIs and robots ourselves to work as _our_ employees.

Studying ML can help you find and apply the AI advancements that you could deploy for your business.

The AI (and within a few years) robots will be strong multipliers for those who know how to use them. So as long as you keep one eye on applications, the CS/ML masters might actually be ideal.


yeh I dunno about this for HVAC. Unless the AI can drive down capital costs for robots, humans will have a distribution and cost advantage


If AI reaches a point where programmers are redundant then whole swathes of the workforce will be too. I’m not sure there is any point even trying to predict how such a scenario would play out.


what if you have children? Not everybody can afford the luxury of shrugging on this one.


You cannot plan for every possible scenario in an unpredictable world, it is just impossible, be adaptable instead.


I suppose you could try to hedge. Buy shares in companies likely to capture the value of the AI revolution? I honestly can’t think of a profession that would be totally safe from AI - that’s what AGI means after all


I dunno, it still looks like human bodies will be useful for a while yet, if only because of the massive investment in legacy infrastructure designed for humans. So knowing the trajectory of AI better might help you determine whether to say retool as say a drill rig operator; the crossover point of extractable income in sw eng vs. petrochem could be meaningfully impacted by tech on the horizon. Of course you have to sacrifice your body but that's an expected tradeoff... could always try to pack into one of the gov/admin lifeboats I suppose.

Speaking of children though, what the hell are we supposed to teach them?


Given the nature of programmers, I think the bigger threat is that AI makes the jobs so mind numbingly boring and unchallenging that they quit before AI gets the chance to replace them.


Anyone more worried about AI than the incredible tech bubble we're still deep in is not paying attention.

AI isn't going to take away your job, the contraction of the tech market will.


Do it!

You are clearly interested, which to me is the primary reason to do anything like this. I did it and never regretted it. I learned a lot and met a lot of cool people.

I've used chatGPT to ask about my area of expertise, and the answers are usually frighteningly good. It can also give completely wrong answers with an equal level of confidence.

So I think there will always be a need for skilled people to interpret the answers these Delphic oracles give. Be one of the experts!


I think the Legal Profession is the low-hanging fruit for systems like ChatGPT. Paralegals will be hit the hardest, being virtually eliminated or transformed into “AI herders”, with a requisite tech background. Attorneys will be able to prep more letters/cases/pleadings more rapidly, leading to a slow darwinistic downsizing of their profession. Probably the first ones who blink and lower their rates will be the survivors.


How is this any different from just re-using a bunch of template letters? If I was a lawyer I would be worried that the AI doesn't say something just right, potentially opening up contract loopholes. I would expect that you would need to proofread/audit everything anyways, which is the real time killer.

Would a lawyer sign something they didn't read?


“ChatGPT, form a legal defense against _____ referencing all cases containing keywords _____, ______, and ______”

Nobody has to search lexis/nexis and form a theory of defense. Lawyer of course proofs the output.


Thank you all for your responses!

I realize I’m being a bit hyperbolic and it’s really hard to predict where things will be in a decade, but I’m pretty early in my career and am just concerned what my prospects will look like as this stuff evolves.

It feels like I’ve got two paths:

A) Keep learning and hope it’s enough to keep up

B) Quit now and focus on learning skills that are difficult to automate in order to get a head start on the transition away from dev work


I have a MSCS from Georgia Tech (in-person, not online) that I got while working full-time. It's had zero impact on my career progression.


I started this MSCS (online version) and passed 2 courses (out of 10), but an opportunity came up and I decided to leave it.

Overall, I'd say it opened some doors (getting noticed more on LinkedIn). I was able to switch careers (coming from Telecom Engineering), went to work for a big company, and then for a startup.

I had a MS (Telecom Engineering) before getting into Georgia Tech so I don't feel like I'm missing that badge. Companies still care for credentials though.


If programming is out, what could possibly survive the onslaught, besides, I suppose being a plumber or electrician?


This, if AI solves programming any time soon we are close to if not at AGI and we are not the only ones in trouble. I won’t hold my breath.


This. By the time this happens, almost every job will be gone.

The only ones left will be physical work, and this will also go really fast since we already have the tools to create robots with the newly found intelligence. Everyone will be out of a job. This will be an interesting future, to say the least.


Anything that requires physical presence.


Imo as useless before as after. Software engineering is life long learning. 5 years in school is better spent solving real problems anyways.

All these Ai advancements help you solve more problems. Just like higher level languages or your favorite library or the cloud. More tools to solve things. That's our job.


I predict the first big crunch is going to happen once we have on-prem models in closed loops with company software. Fwiw looks like writing code will die before reading it will, so maybe get real good at making PR comments and close reading? That might buy a few years.


Who are making the breakthroughs? Are they made by people not holding a degree?

If the breakthroughs are made by people with a MSc degree or higher, do you want to be a part of that group?

If you're in it for the 80th to 90th percentile pay, sure, with enough grit you can do without a degree.


Who's doing the marketing / promotion over at OpenAI? Very underrated effort IMO.


Someone still has to be a subject matter expert in order to have an AI produce something. There will be room for employment even if the technology was ready for production tomorrow which it isn't.


I really dislike this fatalistic attitude that people have regarding AI. Your self-learning should never be replaced by an AI. You are just robbing yourself of opportunities by doing nothing.


If anything, the value of your degree would increase? If things start to accelerate, someone would need to maintain these AIs, and you'd be in position to help them.


A quality Computer Science degree will cover AI and, therefore, will be one of the best investments in this case.


That’s a lot of money to spend on personal fulfillment. I’d only do it if you don’t have to go into debt.


an ml degree should be a mix of heavy-ish mathematics and cs. in most cases they are a cs specialization

for example, at my university ml graduates are expected to excel at writing performant parallelizable code and to mathematically understand the models they are building


What's the use of MS level math in ML? Post-hoc rationalization of ML architectures invented by a good guess? IIRC, the batch-norm paper is a shining example: while the technique is useful, it was invented by intuition, and its post-hoc rationalization - all that impressive looking math - turned out to be bs.
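(For context on the batch-norm example: the operation itself is just a per-feature standardization plus a learned scale and shift; the contested part was the "internal covariate shift" theory of why it helps, not the mechanics. A rough NumPy sketch of the forward pass, purely illustrative:)

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch normalization: standardize each feature over the batch, then scale and shift."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)   # zero mean, unit variance per feature
    return gamma * x_hat + beta               # gamma/beta are learned in a real network

# A batch of 64 samples with 3 features, far from standardized
x = np.random.default_rng(1).normal(loc=5.0, scale=2.0, size=(64, 3))
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
# each feature of `out` now has roughly zero mean and unit variance
```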


have you ever read an ml paper? besides, good luck getting ml solutions in high-stakes problems to production without good math to back it

doing ml without knowing math is like doing programming without knowing alg & ds


Again, what's the use of MSc+ level math in ML? A good BSc course teaches you in the 1st year linear algebra up to the eigenvalues theory, and diff equations up to fairly involved numerical methods. Just these two subjects are way more than necessary for applied ML.

I'd go even further and claim that ML has no theory of its own: it's a bunch of methods based on simple linear algebra. Unlike a proper math subject that begins with axioms, definitions and theorems, an ML course has nothing like that. One shining example is convergence of an ML model: there is no theory that would predict convergence, so the only way is ad-hoc attempts to massage the data and hyperparameters.
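To make the "simple linear algebra" claim concrete, a toy sketch: PCA, one of the workhorse ML methods, is nothing more than the eigendecomposition of a covariance matrix — exactly the first-year material mentioned above:

```python
import numpy as np

rng = np.random.default_rng(0)
# 200 points with one dominant axis of variance, rotated onto the (1, 1) direction
data = rng.normal(size=(200, 2)) @ np.diag([3.0, 0.3])
rot = np.array([[1.0, -1.0], [1.0, 1.0]]) / np.sqrt(2)
data = data @ rot.T

# PCA: eigendecomposition of the sample covariance matrix
cov = np.cov(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
top = eigvecs[:, -1]                     # principal component = top eigenvector

# `top` should align (up to sign) with (1, 1)/sqrt(2), the direction of most variance
```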


i'd go even further and claim you have never picked up an ml textbook, let alone an ml paper


Maybe you should ask ChatGPT?


Further education is never worthless.


Pure CS might be worthless in the long run. But if you specialize in AI or robotics, I think you will pretty much guarantee employment.


The advancements in AI are as incredible as they are overstated. AI is nowhere near making human coders obsolete.


Do you want to do research?



