Your waitress, your professor (nytimes.com)
148 points by nvader on Dec 20, 2014 | 140 comments



The other day my wife and I were at a baby clothing store. We were chatting with the young woman at the counter, and she was complaining about her student loans. I asked her what her major was, and she said it was physics. I said "oh" with a degree of surprise that made me cringe. I mentioned my brother had majored in physics. She asked me where he went to school, and I told her he went to Yale. She said "oh, my parents went to Yale, but I went to MIT."

I felt bad about the encounter, and I couldn't figure out why. Statistically, you're safe in assuming that any random person working at a clothing store isn't an MIT grad making pocket money while working on her PhD. I realized later that I felt bad that it mattered to me. That when she revealed she was smart and educated, in my head I moved her from one class of people into another.


I have a hard time reconciling the idea that we need more STEM graduates with the difficulty that actual STEM graduates have finding work in their fields.

Even if it's part-time work while pursuing a PhD, if there were anything resembling a real shortage, these things wouldn't happen. And it's not a rare story.


The problem is one of miscommunication between tech and media/government.

There is definitely a shortage of developers, a shortage which tech companies have regularly been complaining about.

Unfortunately, politicians and journalists just hear "we need more nerds" so assume there is a STEM shortage (in their non-technical minds, nerds are all equivalent). Thus the broad and pointless push for STEM graduates when what's really needed is a specific subsection of Technology.


I wouldn't even say we have a shortage of "good developers." As commenters pointed out a few days ago on the "how to make it in the tech mecca" post, there is a shortage of "ideal" developers willing to work for less, and a strong employer willingness to pass on 100 or 1000 "ok" or "good" candidates that they could train in favor of that "ideal" candidate.

So I would argue that it's a problem with an asymmetrical job marketplace and poor long term decision making on the part of employers.


Training doesn't always turn an "ok" or "good" candidate into a great one. The requirement is for developers. Either junior developers or senior developers. Hiring a trainee actually costs you developer time in the short run, for no guaranteed output in the long run as many of the trainees will just wash out.

Besides, all you need is some books and open source software and you can train yourself, at least to the level that you're hirable. OK, some things you have to learn from experience in ways that are hard to accomplish outside of working for a big company (large scale distributed systems), but most of the companies that have large scale distributed systems are willing to hire people straight out of college and, effectively, train them!

So, the rational thing to do is to let the "ok" developers develop their own skills and then hire the ones that end up being good. But then, I don't think the standard is unreasonably high in the first place.


You're correct in that most "good" developers can self-train on whatever stack the employer uses. The problem is that many employers are not even willing to buy one book and wait two weeks. If the candidate cannot be instantly profitable from day 0, they are deemed to be "not qualified" for the position.

There are too many tech fads and trends for a developer to self-train on everything a potential employer could possibly want before knowing exactly what that is.

In my opinion, tech employers should stop being so specific with their requirements and simply allocate some time for the new people to adjust to their in-house way of doing things. As many of us know well, every development team has its own slightly different way of doing things, which has to be learned in order to work more effectively. Nobody "hits the ground running", because the tech skills are hardly ever the limiting factor in hitting full productivity.

And I assure you that any company that wants to hire people to "hit the ground running" will forget to mention their new hires will be doing that running as a steeplechase over mismanagement hurdles.

Training doesn't improve the quality of a candidate. It ensures that the candidates you can find at the price level you have chosen are familiar with the specific technologies and processes that your company uses.


Then there's a pretty big disconnect that I've seen. The bigger companies that try to hire all the best programmers--Google, Amazon, Microsoft--don't actually care much about tech stacks unless they're hiring for a senior specialist in some specific technology, and even then it'll have more to do with a problem domain than a programming language. The employers who care about tech stacks tend to be either mediocre large companies (who by definition aren't very clueful) or small companies and startups (who really do need you to hit the ground running because they really do have limited resources).


I think it comes down to whether the person directing the hiring process knows anything about programming or not. At large, development-heavy companies, the person hiring may actually have been a 100% software developer at some point, before taking on management responsibilities. Then there is a big doughnut hole, where the company is large enough to need developers but too small to want to promote any of them, and that is a wide wasteland of frustration before you get down to small companies and start-ups whose business is so strongly focused on tech that they can afford to have a full-time developer, but not so much that they can afford a bad one. Those tend to be split between companies that hire for competence and those that want to hire someone they can use up fast and burn out. Even if the latter are fewer in number, they have to hire more often, so their influence on the hiring market is larger.

The mix is very heavily influenced by geographical locale. Around me, the market is mostly the large-but-not-huge companies that hire a lot of software developers but never promote them. So there's my bias. Google, Amazon, Microsoft, et al. do not have a presence here, so they cannot improve the behavior of other companies with their competitive pressure.


I am unemployed because of this.

I worked for too long with Lua, and no one hires a Lua coder, so I am screwed :(


Very few people want to train developers. Essentially, that is the problem in the first place. Employers want educators to provide job training, when that's not what is supposed to happen in higher education.

In the 1990s (and probably earlier), a lot of for-profit educational institutions, in collaboration with a number of large tech companies, produced a lot of "certifications" which tried to fill this gap. It failed pretty miserably for everyone but the for-profit educational institutions.

What's left? Most of the employers want to hire at a lower pay level than most of the qualified applicants are willing to take, especially since regular pay increases have dried up in many markets.

I've been on both sides of this coin, though. I know it's difficult to hire people even if you have plenty of good applicants, because, even when you're willing to train people on the specifics of the position you're hiring them to fill, truly qualified applicants may not work out after 6 months.

In a perfect world (sometimes called "once upon a time"), companies take all of this into account and consider it a part of doing business. In the modern world, companies have found that they can get more work out of their existing employees and out of new-hires by behaving as if all training can be done on-the-job. You might give the new-hires a little less work to compensate for the extra load of learning the job, but the rest of the team has to make sure the job is still getting done AND train the new-hire.


Honestly, the better hiring processes I've seen focus on basic algorithms and data structures as the foundation of judging a candidate's technical ability. That is the stuff you learn in CS. That's also the foundational kind of stuff where, if you don't know it, it doesn't really help if you learn a specific tech stack.

I've also seen C++ shops that quiz you on C++ trivia. Either they were doing a poor job or they were very small and worked only in C++ so it was reasonable for them to ask questions that you'd know the answer to if you just read a book before the interview.


Your post seems to have shifted its goalposts from good to great and back again a couple of times. I would not say that someone becomes great at what they do given only books and practice at home.


I dunno, it feels to me that there are lots of okay developers, but few good ones. Given the nature of the tools available, I am incurring technical debt at a significant rate unless I hire a good developer.


Can you not design a process that can achieve your objectives with the help of 'OK' developers?


If we knew how to do that, we wouldn't be in the mess we're in.

There are things that can help teach OK developers to become good developers though.


In the absence of this process we do not have, what are these things which can help elevate "ok developers" to "good developers"?


> There is definitely a shortage of developers

Even this isn't quite true - I'd qualify 'developers' with 'good'. There are a lot of average/bad developers out there and companies don't really want to hire them...but many companies don't have the huge pool of candidates to choose from that Facebook and Google do.


I see this miscommunication also extended to within the schools. Both my daughter's schools (elementary and middle) have STEM-specific programs, but they are entirely focused on the Science or Math aspect (where job prospects are not easy to come by) with virtually no emphasis on Technology or Engineering (where job prospects are more available). When I asked my daughter's teacher, who is one of the STEM chairs, how they would better incorporate technology, she said she hadn't thought about it. We've created this acronym when really we need to be bumping up the T and E; I feel the availability of S and M already present in primary education is sufficient.


I feel a lot of the STEM stuff in K-12 boils down to hypothesis testing. Do X a hundred times, plug and chug into this formula, and presto! the outcome is successful with confidence Y%. The variation is just on what X is.


A professor I worked with absolutely hated the acronym STEM. Each of its letters covers something very different from the others. It also brings with it an oversimplification of education and the issues around it. It does make for a handy political acronym, though.


STEM as a term might have made sense if it had only been used to refer to middle and high school teaching. Getting "more nerds" pre-career-path would mean more people would at least be prepared to pursue whatever subsection of tech is in demand by the time they get to college.


Maybe because "STEM" isn't a thing that exists.

"Oh, we're looking to hire a virologist. Or a algebreic geometer. Mechanical engineer maybe. Something like that."


That's pretty much Google's recruiting philosophy you quoted.


We don't need more STEM graduates, we need economic incentives for STEM graduates, including companies actually willing to take on and train green grads (everything requires a few years of very specific experience these days), willing to pay them good wages for part-time work, willing to pay them good wages at all... etc. etc.

The problem is managers aren't scientists, few companies care about pure research and long term benefits coming from it, and nobody is willing to spend effort and resources training inexperienced average graduates any more.

It is also, in many ways, too easy to get a STEM degree by learning how to be a good University student as opposed to learning your subject well.

The core problem is university needs to be something special and uncommon, not something necessary. Most jobs have little to do with the University training that comes with them and there's not the budget for 1/3 of the population to do science.


Nowadays the economic incentives are the other way around. It used to be that pharmaceutical companies did everything in-house, from early discovery to clinical testing, and Central Research did discovery and lead optimization. That system provided stability for the employees.

Compare that to the pharmco carousel that you can see in action today in Boston and San Diego. The major players are IP brokers that buy startups with promising compounds (good if you are a lawyer or in marketing), but the science is done by university startups which are either bought out or fail, and the number of very capable people on postdocs (because they are between startups) is plain staggering. Try putting your kids through college doing that.

Is anyone surprised that I'm very suspicious of anyone preaching the startup gospel? They are inevitably venture capitalists or marketeers, or fresh out of college and still very green.


Add to that companies like Valeant whose business model is to buy companies that already have drugs on the market and fire most of the research staff. This is predicated on the belief that in-house R&D is nearly always a long term money loser. Better to crowdsource R&D and only pay for the winners.


"the number of very capable people on postdocs ... is plain staggering"

What's wrong with doing a postdoc? I'm in the UK, but here a postdoc is a normal part of the science career path. It pays reasonably well for academia and is a first step towards being a lecturer (assistant professor).


We are not talking about the academic career ladder, we are talking about the career path of med. chemists in general. To an increasing degree, early discovery happens at academic startups, and when the company fails (because its two or three compounds fell through) all the scientists are out of work. The fastest new gig available is a postdoc, and the salary is a real problem when you try to put kids through college on it.

This model takes the "scientists are cogs in a machine" paradigm to a new level. Of course med. chemists are cogs in machines, lead discovery is nothing but finding ligands for targets (perhaps it's also finding the proper target), and your scientific training hopefully equipped you for that. If one project fails, there will be others, and it's management that chooses those that will go forward. But it used to be that you didn't change your employer with your project, Big Pharmco always had plenty of projects going. Nowadays it's the employee that bears all the risks, and that includes the risk of poor management. It's no longer cogs in machines, it's cogs with legs.


Another popular term is the 'postdocalypse', as seen here:

http://www.motherjones.com/environment/2014/03/inquiring-min...

Basically, fewer positions, more PhDs, less NIH money to fewer scientists and the situation becomes pretty terrible.


Have you heard of the postdoctoral treadmill? That's what happens in the States. The postdoc treadmill is another source of cheap labor for labs, without any stability or job prospects.


> there's not the budget for 1/3 of the population to do science

Then what should they do? There is no industrial demand. We could theoretically replace almost the entire service industry overnight with autonomous vehicles and online ordering.

Sounds like a record on repeat nowadays, but labor is dying. It takes one person to feed a thousand, and if you average all the labor required to supply one person's needs it comes out to much less than a whole person. I.e., the sum of the plumbing, electrical, power production, road maintenance, car construction, furniture building, home building, medical, agricultural, etc. production necessary per person is less than a whole person, and thus you have an excess of people who have no real value-add labor to do.

Science is really the only thing you can propose in a labor vacuum - well, we don't need welders or farmers or auto mechanics or factory workers or people digging holes, so why not go figure out the next big thing? Too bad that, one, not everyone can do that, and two, research has to be paid for by someone, and considering the gross wealth concentration in the US, it's either philanthropy or the government paying for it. And as can be evidenced by the real demand for science, nobody is.


We need new industries to emerge. New industries that hire lots of scientists tend to come from big government spending initiatives: the ramp-up in research during WW2, the moon race, etc. Plenty of industries came from that.

We need to push further into space, attack cancer, deal with antibiotic resistance, dig into groundbreaking physics, etc. R&D is only 3% of the US government budget.


Isn't it funny how we chastise companies for treating workers like replaceable cogs, and then we do the same in these discussions? If I need an expert in neurobiology, a glut of physics majors is unlikely to be helpful.


Then we should be saying we need more neurobiology grads.

But what's being said is we need more STEM grads.

74% of STEM grads don't work in their fields. That's not a small percentage.

http://www.census.gov/newsroom/press-releases/2014/cb14-130....


The first failure of STEM education: statistics.

1. It says they don't work in STEM fields, not just that they don't work in their own fields. My guess would be that the percentage not working in their specific field would be much higher, but in many cases it's still just a statistic.

2. There is no path for job growth in many STEM fields except out of STEM and into management. Most people staying in a STEM job for 20+ years find themselves under pressure to move to management, in part because it is somehow easier for companies to pay the salary in middle management than in STEM for someone with a lot of time in the field.

Still, some people just change their minds, or actually want to work in business or some field considered non-STEM, but do their undergraduate work in STEM. Additionally, many STEM fields require a level of specialization that usually requires an advanced degree or long-term work in the field to acquire that specialized knowledge. Many people get discouraged by this when they see their student debt and their lack of employment options after getting a Bachelor's degree, so, if they continue in higher education at all, they look at getting a degree in something else (like an MBA).


The reason for that is that it was the industry itself that framed the shortage in that way to begin with.


There is no fixed number of STEM jobs... and it's probably true that a dearth of job candidates actually reduces the number of jobs in the long run. If you can't find the workers you need, in the long term you shift to a strategy that doesn't require them.

After all, the United States isn't the only place to find them.


I thought the argument for increasing the number of H-1B visas is that we need more technology people? The current limit is 65,000 people every year, and the claim is that's too low.


There is a shortage... Of tech workers willing to indenture themselves, or to work for a fraction of what they're worth. But the real elephant in the corner of the room is ageism.


ageism how?

I am 27 (or I will be in some hours... right now 26, in some hours 27).

I have been jobless for a while, and in my country, industry research showed that companies are hiring fewer and fewer people in their 20s and more people who are 50+.

Any position I see with reasonable pay (i.e. pay that covers more than my rent + food) requires 20+ years of experience; since I am 27 I cannot have 20 years of experience (I learned to code at 6, but I didn't get my first job at 7 :P)


Which country? I'm guessing not UK or US.


Brazil.

But general stats on US jobs (not just the IT industry stats) also show the same pattern.


To be fair, this was a little college town and I don't imagine it's easy to find any sort of physics-related job that has much schedule flexibility.


This is likely because you've been receiving the message that it's a sin to have preconceived notions and to even formulate a mental model of someone that contains any negative elements.

It's not. Instead, I suggest taking pride in the fact that you are not restricted to initial impressions and were able to move her to a different box upon receiving additional data points.


I don't wanna reveal her identity. There was this girl in a group I am familiar with. She did a BS in physics and a PhD in astronomy/astrophysics at Yale. She works as a data scientist at one of the Valley companies. Even though she is a data scientist, what she does most of the time is write SQL queries against Hive, which is a SQL interface to Hadoop/big data (a rough sketch of that kind of query is below).

If she is indeed an MIT grad, there are many companies who wanna hire her just to do SQL queries.
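
For readers who haven't touched Hive: here is a minimal, purely illustrative sketch of that kind of query, written in Python. The PyHive client, the host name, and the "events" table are assumptions made up for the example, not details from the comment above.

    # A hedged sketch: PyHive against a hypothetical "events" table.
    from pyhive import hive

    conn = hive.connect(host="hive.example.com", port=10000, database="analytics")
    cursor = conn.cursor()

    # HiveQL reads like ordinary SQL; Hive runs it as batch jobs over data stored in Hadoop.
    cursor.execute("""
        SELECT user_id, COUNT(*) AS sessions
        FROM events
        WHERE dt >= '2014-12-01'
        GROUP BY user_id
        ORDER BY sessions DESC
        LIMIT 100
    """)

    # Print the most active users for the month.
    for user_id, sessions in cursor.fetchall():
        print(user_id, sessions)

    cursor.close()
    conn.close()

That is why the day-to-day work can look like plain SQL even at "big data" scale: the query language is ordinary, and the distributed machinery is hidden behind it.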


I am actually in need of an underemployed physicist. I'm writing a book about quantum mechanics and I need a consultant to make sure I get the math right. I was going to put an ad on Craigslist, but maybe someone here has a lead?


I don't know if he does this sort of thing, but I recommend Jonathan Walgate. He has a Ph.D. from Oxford in quantum physics and a business helping people to write grants. He's also a great guy! (I used to do quantum physics.)

https://www.linkedin.com/pub/jonathan-walgate/35/965/553


Thanks for the lead!


It would be amazing if the guy you replied to somehow ended up hooking that cashier up with a job.


Well, it would just be a small consulting gig, not really a job. But yeah, it would be cool.


StackExchange and/or AskScience at reddit can be useful for this sort of thing (though possibly not for lots of repeat queries).


That doesn't seem odd to me. If there's a young person working a retail or service job, I usually assume they're doing it for extra cash while they finish their degree. 9 out of 10 times it's true and completely predictable, as these sorts of jobs provide far more of the necessary flexibility than traditional "professional" jobs.

On the other hand, if they're 30+ my preconception is completely different.


If she's smart enough to get a physics degree from MIT, she can probably teach herself enough to become a developer and get a big salary boost. Heck, with the way MIT STEM programs work, she probably already can program and pass most technical interview questions.

(She also might not have been entirely truthful with you.)


Couldn't it be because her skills are obviously being underutilized? Society could benefit from her a lot more, and she could get more resources at the same time, but for whatever reason that's not happening. And that is rationally upsetting.


It was upsetting to me because I don't think we should value and respect people who are MIT grads more than anyone else. Someone who is working retail because that's all they can do is just as valuable as someone who does it while working on something better. I was upset because I realized how classist I was being.


That's over $160,000 in college tuition. If it doesn't pay for itself, it's not a very effective use of capital, regardless of whether her parents paid for college.


Was this Cambridgeside Galleria? Not that uncommon for students to work at a Mall.


Well at least you're in good company with that attitude, on this site.


We go to school because it is a tradition. Parents want to see their kid in the funny hat. You are almost shunned when you drop out.

Not to say that school is useless; I loved learning. But I knew that the computer science degree I was working towards wouldn't make much of a difference in the field I was hoping to work in.

Going to school should be an investment. It is too bad that most of us are too young to understand the risk we take when taking loans or choosing an education path.


I wish there were more options for technical "trade schools" in the US. At my university I knew I wanted a career in information security and development, and I had a choice between Computer Science and Information Systems. Computer Science was mostly theory with not that much hands-on programming, while Information Systems had some more hands-on real-world programming but was heavily business/management oriented and essentially assumed only the most rudimentary programming and technical skills even in the highest level classes. I ended up going with Information Systems, and I'm still not sure whether I regret it or not.

I was fortunate enough to be a pretty self-motivated learner and become skilled at the things I was interested in through the Internet and self-teaching, and landed a pretty good job straight out of college, but I can't help but feel I wasted 4 years of time and money.

I would've loved to go to a university with practical Information Security/Assurance and Software Engineering programs at the undergraduate level, but sadly 1) none were anywhere near my area and 2) they're considerably more expensive, and college is expensive enough already in the US.


It's too bad that schools don't prepare graduates for real world jobs, especially for what they are charging. Personally, I don't know if most 10 yeared professors could keep up with the pace of change in the real world. I think what a lot of people leave out, when they are espousing their college degree, is the amount of self-learning, or who they knew after graduation, that it took to keep/get their job. We still have a lot of naive sheep who will judge you on whether you finished that degree, so you will probably be glad you stuck it out and graduated. As to the dude above me who automatically put that salesperson in a different "class" once he found she was a graduate of MIT: I'll give you a break, but repeat a quote I will never forget: "Chubby--you need to stop judging people on the way they look" by Louis Medlock (Deliverance).

When I was in college I knew it was 90% B.S., but society expected me to finish the degree. Would I go today? I'm not sure. You need to be exposed to a few facts in life, so you don't end up believing in things that have no scientific validity. On the other hand, my sister is a multimillionaire and believes in psychics. She was also once very attractive, and thinks nothing of taking advantage of people for what they can offer her; she used to call it "They would make a good contact!" Now they call it networking. Personally, I couldn't stomach it, but my sister is doing great financially. Now if she could get her family to like or trust her, or find a partner who actually liked her for her personality, she would have it all. Happy Holidays!


Just fyi, the word you're looking for is "tenured professors could keep up..."


I'm not sure trade schools, per se, are the answer.

The problem with trade schools is that you're just making a different bet on what jobs are going to be available in the future. Maybe the numbers are better than for some traditional university degrees, but it's fundamentally the same sort of bet: you educate yourself up in some direction, and then go look for jobs.

An alternate route, which has been used at different times in different places, is to first get hired at a company based on general competence, and then to get trained up in whichever direction they want you to work. That way, you have a guaranteed job after you finish training.

It's a much better system for the workers, but it requires different attitudes on the part of the corporations -- they need to be focused much more on the long term; they need to be willing to pay upfront in time and money to train up their workforce, and be stable enough to employ workers long enough to make back that investment. Fly-by-night operations that explode rapidly for a couple of years and then collapse can't really afford that.


That's how trade schools work in Germany. About two thirds (up to 3/4) of the time you are at work and learn there; the rest of the time you are at a normal school. After you graduate (it takes about 3 years) you usually stay on at the same place to work.


I work at one! New York Code + Design Academy. There are similar schools across the country and world. People graduate having enough skills to land an entry level job as a junior developer without having to go incredibly deep into theory, we focus on implementation and actually building products.


I went through the M.S. Information Security program at Georgia Tech and am glad that I did it. Though at this point in my life I work mostly on building out web applications with Ruby on Rails for startups. Security is a tough sell in most of the industry. Bolting it on after the fact is almost always impossible, which is why I have had to learn so much about startups, their lifecycle, business model development, the customer discovery process, etc., in order to be able to attempt to build more secure software for startups before it's too late in the process and bandaids are the only options remaining.

Point is, no matter what you do, there is a tremendous amount of learning to do outside of the university curriculum of your choice.


Any particular reason why you aren't taking a job in the infosec industry directly? Compared to being a RoR dev, I think you could be paid much more and have more interesting work.

Helping manage secure dev practices for startups is of course very important, but I think that would be much easier to do if you're either in some kind of CISO or appsec management position in the company, or are part of a third party auditing/pentesting firm that deals with startups.


I was in a similar boat as you were (wanting to go to a "trade school" type thing for IT, ended up going to a generalist school with business knowledge), and I feel better for it.

Granted, my schooling let me spend some time in Japan on a school exchange, which led to a whole other sequence of events. But I also got some classes on management, economics, and accounting that do help me understand how things are happening around me. Plus, having a degree is pretty much mandatory if you want to work abroad (for visa reasons). My school prepared me to deal with the human aspect of work, I think.

I did do a lot of self-teaching though, so I didn't end up learning much on the programming side of things.


The logic here seems... hypocritical, or at least inconsistent.

At the beginning: "In class I emphasize the value of a degree as a means to avoid the sort of jobs that I myself go to when those hours in the classroom are over."

At the end: "My perhaps naïve hope is that when I tell students I’m not only an academic, but a “survival” jobholder, I’ll make a dent in the artificial, inaccurate division society places between blue-collar work and “intelligent” work. (also, that diaresis...)

It feels to me like the first statement says "I lie to my students and present them with unrealistic expectations of the world" and the second says "If I say something, I'll let them know that they are deriving little value from their education... and thus put my career further behind."


That may be the intent of the article. To lead you into the thinking that the author wrestles with each day.


As someone who took a degree because the "adults" forced me to, and spent years of my life just paying for it, I count 10 years wasted (I am 26 today, my assets are the same as they were when I was 16, and soon I will have less than what I had at 16 because I am unemployed).

My parents, academia, my school teachers: they all lied to me. They told me that getting a degree was a good thing.

It is not. It has basically fucked up my life so far. If I could go back in time only once to give myself a single piece of advice, it would be to tell my 18-year-old self to get a job and skip college, or go to a trade school.


I dropped out of college when I was 21, in part because I knew two different people with bachelor's degrees who had jobs like delivering newspapers or selling shoes at a department store. Everyone around me seemed to think I needed a degree in order to have a career and that a degree automagically would give me a career. The lives of these two personal acquaintances suggested otherwise.

I am currently personally acquainted with two other people who completed college and are thus saddled with huge student loans. Neither one is making enough money to deal with those loans, much less justify them. One owes $100k, the other owes $250k. Even if you successfully declare bankruptcy, you cannot write off student loans. These two people may never manage to get their loans paid off. I cannot imagine how awful that must be, and I used to have more than $50k in personal debt, which I have been slowly resolving in recent years. (Part of mine is also student debt, but not all of it. It is one of the reasons that although I have considered declaring bankruptcy, I have never pursued it in earnest. Part of what I owe has to be paid regardless.)

There is something very wrong with the system here in the U.S. A college degree should not be a path to permanent poverty because you cannot really afford the student loans you racked up getting it.


Contrary to popular belief, while it is difficult, it is possible in certain situations to have student loans forgiven, canceled, or discharged.

https://studentaid.ed.gov/repay-loans/forgiveness-cancellati...

Edit:https://studentaid.ed.gov/repay-loans/forgiveness-cancellati...


Thank you.


Anyone running an online business will tell you that words are valuable. Text is the primary way we communicate with our customers. Carefully chosen words can have a massive impact on revenue. The author possesses a valuable skill but is not fairly compensated for it. What's going on?

Inequality is increasing in the US (http://en.wikipedia.org/wiki/Income_inequality_in_the_United...) and many other countries. Technology is giving capital an increasing advantage over labour. A job as, say, a doctor, lawyer, or university professor, which used to guarantee you a position at the top of the social hierarchy, no longer does. Inequality is generally regarded as a bad thing, both socially and economically (see, e.g., http://en.wikipedia.org/wiki/Economic_inequality). Addressing inequality at a macro scale is beyond my knowledge, but I do know how to address it at the micro scale: own your work.

Technology contributes to inequality by allowing successful businesses to scale further with fewer employees. However, you can use the same infrastructure to scale out your microbusiness. Within our community, patio11, Amy Hoy, Brennan Dunn, Nathan Barry, and others have shown how to do this. It is noticeable that many of the above have made a substantial chunk of change from publishing a book. (E.g. http://nathanbarry.com/2014-review/ ~$260K from book sales.) The OP is in a great position to replicate this -- they have all the skills, namely command over the written word, to produce great content and great copy to sell it.

On a wider scale, I think we can use technology to redress some of the growth in inequality. Presently it mostly allows relatively few businesses to concentrate wealth. If more people retake ownership over their work, which technology in many cases enables, then perhaps we can do something to address inequality. It's not the whole solution but it might be part of it.


Somewhat off topic, but to your question about why we don't value certain kinds of work...

Culture provides a kind of distributed wage-fixing. The free market fallacy gives people the idea that wages are optimized and reflect true economic value. But in reality employers pay as little as they can get away with.

Imagine two societies. In the first, it's generally accepted that people in any position, including nurses and receptionists and customer support people, can create immense business value if they excel at their jobs. A customer service agent who makes big impacts on retention could expect to make a six-figure salary. Others who are less useful might make minimum wage.

Now imagine another world where the general social consensus is that your job title implies a ceiling on how much value you can bring to an organization. In this world a programmer can make millions but a customer service rep is capped at $25 or so per hour. General belief is that if you want to make more you can't just create more value, you have to get a different job title.

In the first world, corporations have to compete for every employee, and overall compensation costs are higher. In the latter world, only the top slice of jobs is subject to market forces, and costs go down. It also has social side effects that appeal to the ruling class, but that's another story.

I would argue that businesses who think this way are actually leaving money on the table, and that by removing title-based compensation caps, you have a more accurate picture of the economics of your business and can harvest more value, but most people don't seem to think that way.


> In the first, it's generally accepted that people in any position, including nurses and receptionists and customer support people, can create immense business value if they excel at their jobs. A customer service agent who makes big impacts on retention could expect to make a six-figure salary. Others who are less useful might make minimum wage.

Sorry, but as a matter of real-world fact, individual receptionists and customer service representatives can't create immense business value. Or rather, the individuals in those roles might be able to, but only if they are moved to different, higher-leverage roles. (If you can retain hundreds of thousands to millions of dollars worth of business by talking to individual customers, "customer service" is not your job title. Maybe "sales" is.)

The market has two parts: supply and demand. If you do a job that just about anybody could do, you don't get paid much because there are a lot of people who can do a job just about anybody could do.


That's largely because of the deliberate deskilling and commoditization of labor by capital. If you structure your business so that most labor roles have low requirements and low productivity, you shouldn't be surprised when you have low productivity.

Of course, this model fails to account for all the low-paid but high-skilled employees in technical, high-productivity roles, who are being quite simply exploited.


> That's largely because of the deliberate deskilling and commoditization of labor by capital. If you structure your business so that most labor roles have low requirements and low productivity, you shouldn't be surprised when you have low productivity.

Is anyone surprised by this? Is anyone expecting receptionists to deliver millions of dollars of value to their business? There are just some roles that have to be done where the value delivered per worker is very low. Labor intensive work. And, in fact, some of it is paid proportional to the value delivered. Fruit pickers are paid proportional to the amount of fruit they pick, for instance.

> Of course, this model fails to account for all the low-paid but high-skilled employees in technical, high-productivity roles, who are being quite simply exploited.

Who might these be?


>Who might these be?

The geneticists, molecular biologists, and chemists in your average biotechnology company.


This is why the myth being spread that "anyone can code" is so dangerous in the long term. Because it doesn't matter that it isn't true (beyond trivial stuff like websites): it will still drive down what people in general think programmers are worth.


Consider that you don't see "Anyone can be a physician" campaigns. Then again, physicians are unionised, if you consider the AMA a union. Why is it that unions are anathema in the programmer community when you can see that they work nicely for other professions?


There are a few big fears...

First, a professional association like the AMA restricts supply by demanding substantial credentials. So you wouldn't be allowed to work with any developer without a degree from a program recognised by the association. That means you could get kicked out for working with people like John Carmack, or Steve Wozniak before he went back to school.

Next, in the US most unions were built around unskilled or semi skilled labour and the culture reflects that. So seniority is king and productivity suffers. Unions will fight to keep old obsolete technologies around to protect existing workers.

That doesn't produce an environment conducive to making cool things.


We don't need a union, we need a guild.


When a startup company I was at years ago started failing and laying people off, someone proposed unionizing to prevent it. In the serious discussions that followed, many reasons came out. I don't agree with them all, but these were floated by people at that time:

1. Unions are about more than simply the jobs 'n' wages. There's lots of favoritism, a whole new layer of politics beyond the question of "who works enough to earn their paycheck".

2. There was insufficient inertia. A chicken and egg problem. No one was used to unions and no one knew exactly what to do or how to determine if it was being done properly.

3. Some people were uncomfortable having a more exact spotlight shone on their work. In more physical, less individualized jobs like plumbing, pipe-fitting, etc., people are more interchangeable and less mysterious. Programmers at this company preferred to endure the risk of layoff rather than have their art/handiwork/craftwork get recast as something that another individual could do just as well.

4. At least in the startup we were in, failure was more of an anticipated outcome than in other professions. Unlike a tradesman, most of us didn't invest our whole lives in a set of skills with a known or "background" level of demand. In other words, many of us just said "fuck it" and went to a new startup or a new white collar job.

5. Elitism: At least some people viewed unions as something "lower-class" people have to do to earn their wages. Some people at this company felt they were of a higher class and didn't need to "lower" themselves by unionizing. (Yes, the cognitive dissonance here is strong.)

Personally, I still don't know the answer, but it can only help us to give these objections some thought from time to time in considering the next several decades of our trade.


"when I [sic] startup company I was at years ago started failing and laying people off, someone proposed unionizing to prevent it."

Attitudes like this are why I feel unions are harmful. The company is failing, but people want a union to step in to prevent cost-cutting via layoffs? Sounds like a sure-fire way to destroy any hope the company has of recovering. Now everyone loses their jobs.

Union mentality seems totally divorced from business sense. Union employees' demands take precedence over business wants and needs - e.g. seniority takes precedence over competence, wages are forced to be higher than those of competitors who pay market rates, etc. It's a long-term spiral into company failure.


I'll wager the management made sure they were looked after, no cost cutting for them until the coffers were bare.


Possibly. But that's a different issue. The owners of the company, unlike the employees, have invested their personal hard-earned savings in the venture. It's their responsibility to be monitoring management to make sure that funds are spent appropriately.


Not as simple as that. A founder invests his savings, but that might be less than the total difference between what the employees are paid and the market rate. So they have given up more than he has invested, but for a fraction of a fraction of his stake. How is that a sensible deal for them? There needs to be some balance.


"A founder invests his savings, but that might be less than the total salary difference for employees and the market rate [...] There needs to be some balance"

Irrelevant. The employer made an offer of employment. The employee weighed the pros and cons of the offer and accepted it or not. "Balance" makes no more sense than the kid I pay to mow my lawn asking for a balance between my household and himself.

"How is that a sensible deal for them?"

Not for me or you to say. Each and every employee needs to take personal responsibility and determine whether or not an employment "deal" is sensible for their situation. If an employer makes only non-sensible offers, they won't attract a quality workforce and will fail. Market forces in action. Happens every day.


The reason the AMA gets to limit the number of seats at med school is that the government limits the number of residency spots. You can't be a real doc without doing a residency.

The American Bar Association got sued for antitrust violations for trying to restrict the number of law schools.


But in reality employers pay as little as they can get away with.

And you don't?


There are a lot of writers out there. It hasn't gotten easier to make it big, even if the size of the check once you do is bigger. Google "death of the midlist" for more information about this trend.


Part of the point is that if you own the product you get 100% (well, more like 95%) of the income, rather than the 10% you might get via royalties. 1000 sales for a nonfiction book is a solid year's income.


True, but the 90% that you don't see through a traditional publishing arrangement doesn't go up in smoke - it pays for marketing, distribution, accounting, and a bunch of other stuff. Now, historically publishers sometimes rip people off. But the problem is that by focusing on that narrative (especially in the music business), we have collectively thrown the baby out with the bathwater in many cases, overlooking the fact that authorship and publication are often entirely different fields of competence. Some authors are also good at publishing - and more power to them. But economic efficiency is maximized when actors specialize in those fields where they have a comparative advantage; if you're a writer, it's not really a good use of your time to be organizing your own book-signing tour unless you really enjoy that aspect of things.


You also take 100% of the risk.


Anyone running an online business will tell you that words are valuable. Text is the primary way we communicate with our customers. Carefully chosen words can have a massive impact on revenue. The author possesses a valuable skill but is not fairly compensated for it. What's going on?

She's not actually using that skill to increase revenue of some business?


She's working as a college professor! Why is this not considered worthy of an economically sustainable income? Most schools seem to have a lot of money sloshing about, it just goes into the pockets of administrators and sports coaches and people erecting buildings. The people delivering the actual service of education seem to be the most poorly rewarded despite the significant degree of competence required.


Is it not worthy of an economically sustainable income? According to their salary schedule, an assistant professor makes a minimum of $60k, and the median makes $90k. I'm not an American, but isn't that a decent salary? And from what I understand, Las Vegas isn't a particularly expensive place to live.


Most "professors" are actually adjuncts (earning <$20k/year, if that), not assistant professors. Assistant professorships are rare and coveted.


The UN salary schedule doesn't seem to include adjunct professors, only professors, associate professors and assistant professors. It does include a grade below called "instructors"; can that be it? They still make a median income of almost $66k.


What UN salary schedule? The American rank at which the plurality of academic instructors work is adjunct.

>Adjunct professors now make up half of all college faculties, and 76 percent of instructional positions are filled on a contingent basis, according to the American Association of University Professors’ annual report on the “economic status of the profession.” There’s no starker way to consider adjuncts’ economic status than to hear that they’re paid an average of $2,000-$3,000 per class, with few to no benefits.

http://www.pbs.org/newshour/making-sense/one-professors-amer...



Adjuncts typically don't get paid a salary; they get paid a certain amount for each course they teach, so they aren't on that schedule. UN pays their adjuncts $1,130 per credit[1, Section 6], so someone teaching four three-credit courses each semester (24 credits a year) would make $27,120 per year.

[1] http://system.nevada.edu/tasks/sites/Nshe/assets/File/BoardO...


I find it fairly surprising how certain the author seems to feel that getting a doctorate would somehow equate to higher income and job stability -- English Ph.Ds aren't exactly a credential in high demand. In fact, reality indicates the opposite is true.


People get stuck in bubbles. In the academia bubble, the only people making a living are professors and to become a professor you need a Ph.D. So they are the example people follow. They don't realize that becoming, say, an accountant might be a better move because there aren't any accountants in the academia bubble.


People who get into PhD programs should have the general intelligence to realize that not every PhD candidate can become a professor. The numbers don't add up in any hierarchical organization like that.


If you understood how people actually think, in a cognitive-science sense, you would understand why "X should have the general intelligence to realize Y" is a complete load in almost all cases.


I'm sorry. Clearly some of the self-appointed smart people of HN were offended by my naive remark, uninformed by any research in cognitive science or other relevant discipline.


Incorrect. There are many professors at colleges that do not have PhD programs.


I was curious what an English (literature) PhD might entail, still am, but the Birmingham.ac.uk website had this:

>"Over the past five years, over 90% of English Literature postgraduates were in work and/or further study six months after graduation. " //

Against an unemployment rate of ~20% for 16-24 year olds, http://www.ibtimes.co.uk/unemployment-four-ten-uk-graduates-..., that seems quite good. I'm sure there are some damn lies in there though.

More on stats for UK graduate employment, http://www.bbc.co.uk/news/education-29327590.


By the time you've completed a PhD you aren't 16-24. Roughly speaking, Bachelor degree will take 3 years, PhD will take at least 4, so you'll be at least 25.

That's also 90% from a relatively reputable university - http://www.birmingham.ac.uk/International/impact/global-rank... claims top 100 in the world.

The University of Nevada in comparison is apparently the 113th best college in the US for English: http://colleges.usnews.rankingsandreviews.com/best-colleges/... - so much further down the rankings.


I'm not really interested enough to hunt it down but presumably UNLV's employment figures for those who've taken post-graduate degrees are a matter of record too?

I'm surprised to some extent that the 113th best college for English offers a postgrad program in English.


I like this article, but I wish it would do more to point out that high-education low-wage workers, like adjunct university teachers, are usually in their low-wage jobs by choice, and their education gives them a lot more opportunities even if they don't take them. They are in a way very financially secure, since they could easily give up on academics and move into high-wage industry jobs (at least in the applicable fields). There are a lot of people who really want academic jobs anyway, so schools are happy to pay low.

Note: I'm an academic and I have a second job to supplement my income. I have often considered moving into industry where I would make 5-10x my teaching income.


But, but, there is no industry left in the US!

(Or so the internet tells me. Meanwhile I drive past half a dozen bustling factories every day on my way to work at an electronics manufacturer.)


The real problem is that neither of her jobs affords her a career.

Why should her teaching position be just above the poverty line? People should regard that as horrific.

Why should her service job not be a career? People should regard the fact that service jobs are so unstable as horrific.


I just don't get all the complaints.

Since when did we become entitled to a well-paid job?

If, let's say, tomorrow they invent an AI that can write great code, putting me out of business - I will say, well, time to switch to something that pays. And there are still TONs of jobs that pay - electricians, plumbers, carpenters, construction equipment operators, elevator repairmen, illustrators, real estate brokers - they all make good money. For now at least.

Why would I even waste my time as a waiter? That time can be spent learning a skill that's in demand, not something literally anybody can do.


Because neither your landlord nor your supermarket proprietor is going to wait around while you learn the skills to become employable in some other field. You don't just decide to take up plumbing and bootstrap yourself from fixing leaky buckets to running your own plumbing company. As well as the skills, you need a pile of equipment, a vehicle, you probably need to be licensed in some fashion, you need to be bonded or insured before people will do business with you, and you need to spend a good bit of time networking.

You can of course bootstrap yourself into a lot of jobs; I've done it in more than one field. But it can take quite a while to break even, and it's not like there isn't competition. I was particularly perplexed by your mention of illustrators - for sure some illustrators can make a lot of money, but I'm pretty sure that, like in every other branch of the arts, the average illustrator starting out is beset with requests to work for free or on spec on the grounds that 'this will be great for your portfolio!'


I have savings and my credit history is nearly perfect. I'm pretty sure I can take out a loan and start one of the above businesses.

It's important to have other skills in life as a backup plan. That, and because it's interesting :)

I did wedding photography for fun, so I can tell you for certain - nobody who does it professionally bills less than $2K per wedding (which is 6-8 hours of shooting plus around 20 hours of photo editing). It's mostly $3-6K. If they want a professional looking album, it's another 5-10 hours of laying it out and ordering, and you can bill them another $500+ on top of the album cost. Of course, you need a portfolio, and you need skills, but I billed for my second wedding shoot - the first one was free for a friend.

How long will it take me to start a full blown wedding business? I'd say I will land my first one within two-three weeks. I will probably not have many clients at the beginning. In 6 months I will have more than I can handle. It's true of every single good wedding photographer - they are booked months in advance.


>If, let's say, tomorrow they invent an AI that can write great code, putting me out of business - I will say, well, time to switch to something that pays.

This is an absolutely absurd limiting case to take. "An AI that can write great code" can rewrite itself and "go all Singularity" on us. If that happens, there will not be any damn jobs.

>Since when did we become entitled to a well-paid job?

As long as there is any such thing as a job, yes: the social contract dictates that in exchange for forcing people to work to eat, with no access to the commons and all the Earth enclosed as private property, we give people a damn job to live off. Hell, basic morality demands it.


> Since when did we become entitled to a well-paid job?

This is a funny mindset. Since when were you entitled to claim large swaths of this planet as belonging to an arbitrary group of people, and subsequently feel entitled to this land's resources and bounty and the rights to dictate how it would be reaped and exploited in a very arbitrary manner as defined by the current so called "modern society"?

You have the cart and the horse backwards. Humans are primates, and we fight for our right to live. If a group of people is impeding that right unreasonably, they will face the consequences.

Don't be naive, and don't take this snapshot of the present era as an axiomatic law of the universe.

Nature is a cruel mistress.


> Since when were you entitled to claim large swaths of this planet as belonging to an arbitrary group of people, and subsequently feel entitled to this land's resources and bounty and the rights to dictate how it would be reaped and exploited in a very arbitrary manner as defined by the current so called "modern society"?

And where did I claim that?


Not to worry: in 10 to 20 more years, robots will take your survival job away too. Then you can teach college and be one of the homeless at the same time!


>English instructor

The most important piece of information was at the end of the article! The whole time I was reading, I kept thinking, "what type of professor is being economically valued at a third of what a cocktail hostess makes?"

While this is ridiculous and seems like an example of the free market gone wrong, I think it's very particular to Vegas. Seriously, I know friends doing the equivalent of $50/hour thanks to tips. However, if this were true in, say, Seattle, then I'd be much more worried.


Not surprised when 1% of the country holds 40% of the wealth.


I find it hard to sympathize with these articles.

She was aware that the academic job market is bleak, especially in English, yet chose to pursue it nonetheless. Thus, her plight is squarely on her shoulders. It also doesn't mean her students will fare worse: some of them might be making rational choices to get degrees in useful, lucrative fields. Especially at a third-rate university, nobody really has the luxury to major in English.

Nothing in our economic system entitles you to making a living at something just because you like it and practice it. I could train for decades to be the world's foremost expert on modern cave painting, but that doesn't entitle me to a cent of wages unless the market deems that skill is useful.


Any halfway decent programmer I've ever met or heard of loves programming. They'd do it even if it didn't happen to be hyper-lucrative. We just got lucky; we're passionate about something that the owning class happens to currently consider fashionable. Enjoy it while it lasts and try not to pretend it makes you better than anyone else.


Yeah, there's this narrative about programmers being born, not made. I have come to believe it is mostly false.

First of all, when I was in my early 20s, I loved a lot of things: writing, music, graphic design. For a time, I defined myself as someone who used to program. When the internet happened, I just dropped back into it. If I had been born in a different decade, I most certainly would have done different things.

Secondly, I know a number of people in SF who have jumped on the programming bandwagon since 2011 or so, in the hope that they could remain employable in an increasingly topsy-turvy economic environment. And... they aren't bad. Not great, but not terrible either. A friend of mine who couldn't write a line of code two years ago cracks jokes about D3.js today. Some of them were women who might have had STEM-ready brains but were scared off from science by the usual sexism in childhood. But others, as far as I can tell, were pretty average and don't have a great aptitude for it. They just learned it, by learning it.

My feeling is that, in North America, they just forgot how to teach computer science for a few decades. There's an effect some people have found (anecdotally) where computer science starts to skew male when, in a given country, home computers become widely available as gaming machines. So it seems to me that CS departments relied on amateur geeks (usually male) who already knew how to program to be the top students in the class, and explained away the high failure rate (and a very skewed gender ratio) because computer science is just so hard.


Those programmers were born, not made; they just didn't discover it as early as some others. This isn't a myth. It's supported by various studies. Professors in the field I've talked to have all agreed. It's a double bell curve: http://blog.codinghorror.com/separating-programming-sheep-fr...

(For some value of born. It might be innate, it might be due to experiences during formative years; all that's known is that by the college age it appears to either be there or not.)


Yeah, but isn't that study already telling? They're saying that if you are already familiar with the idea that a computer system has to be perfectly consistent, then programming makes sense to you. (Also, I think they had issues replicating this.)

Anyway, it doesn't say that this is an innate quality. So perhaps they are measuring the gaps in how we educate people.

If you want another graph, look at this one:

http://www.npr.org/blogs/money/2014/10/17/356944145/episode-...

And here's yet another one, which shows the percentage of degrees conferred to women. Note how weird the graph is for CS, and how it doesn't track fields you might think would indicate innate talent, like math and statistics, or physics.

http://www.randalolson.com/2014/06/14/percentage-of-bachelor...

What is the explanation here? Coding was way harder and more mathematical before the advent of high-level languages, and yet the participation of women crashes right around 1982. I suspect that this is just the most obvious effect; we've probably been turning off men who would have been perfectly competent as well.

Ask yourself what the more reasonable hypothesis is:

- Computer programming is unique among all technical skills, in that talent predominates over education, to a higher degree than medicine, or even mathematics.

- Something is wrong in how we teach computer programming, and it might be related to the introduction of the personal computer.


That test makes me wonder if we've somehow turned off a generation of potential computer scientists by the unfortunate choice to go with C/Fortran-style assignment instead of Pascal/Ada-style.
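To make the distinction concrete, here's a minimal C sketch of the kind of assignment question that aptitude test asks students to predict (the exact variable names and values here are mine, not the study's), with the Pascal/Ada spelling shown in a comment:

    #include <stdio.h>

    int main(void) {
        int a = 10;
        int b = 20;

        /* C/Fortran style: `=` means "copy the value on the right into the
         * variable on the left", which reads uncomfortably like mathematical
         * equality. Pascal/Ada spell the same operation `a := b;`, which
         * arguably makes the direction of the copy harder to misread. */
        a = b;

        /* Typical novice misreadings: "a and b are now linked", or "the
         * values swap". What actually happens: */
        printf("a = %d, b = %d\n", a, b);   /* prints: a = 20, b = 20 */
        return 0;
    }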


While I do indeed love programming, I also enjoy many other things. For most of high school, I thought I wanted to be a lawyer. For the first two years of college, I thought I wanted to work in finance. Heck, for a brief period I was infatuated with architecture.

However, as a rational actor looking at the career landscape, it was clear that CS was a much more productive and lucrative field—so I went down that path. I'd expect any other human to do the same.

I guess I don't entirely blame her though. For years we've been spouting atrocious career advice to "follow your passion" when what people really need is passion for what they do.


>We just got lucky; we're passionate about something that the owning class happens to currently consider fashionable. Enjoy it while it lasts

I imagine programming and related fields (networking, system administration and integration, other IT operations) will still be very lucrative in 100 and 200 years from now. It might be slightly less lucrative (or it could be more lucrative, who knows), but programmers will likely be very important to organizations for at least a few centuries, if not millennia.


>I imagine programming and related fields (networking, system administration and integration, other IT operations) will still be very lucrative in 100 and 200 years from now.

I'm going to quote something told to me by an MIT professor in Computer Science. I won't say who, but he said it this past September.

"In 500 years, all economic activity will be carried out by algorithms, if anyone at all is left."


A few centuries is a long-ass time; far too long for such cocky predictions to be taken seriously. (And millennia? Jesus...)


Unless you think computers are suddenly going to vanish forever or will achieve absolute sentience and hyperintelligence in under a few centuries, I'm not quite sure how I could be wrong.


> I'm not quite sure how I could be wrong

Supply eventually will greatly exceed demand. The bar for entry will be lowered.

Being able to write well is a good skill, probably comparable to computer programming, since being able to communicate with other humans is, y'know, important. But it seems to me that humanities majors and other degrees for which writing and understanding writing is a critical skill aren't exactly in high demand.


>Supply eventually will greatly exceed demand.

This could only come true if there were a shift in programming technology that made it a lot easier for someone without aptitude for programming as we know it, or if the average person started to end up with more programming aptitude (maybe due to a revolution in early education or parenting methods). Programming as we know it is very highly dependent on aptitude. Read this: http://blog.codinghorror.com/separating-programming-sheep-fr...


It doesn't look like she is looking for sympathy, and it seems to be her opinion that her career was her own choice.


Okay, Brittany Bronson, it is what it is... You teach the young grown-ups / old kids at the college, and then you shuck it in your part-time service job... It is what it is...



