U.S. Software developer wages fall 2% as workforce expands (computerworld.com)
77 points by biotech_anon on May 19, 2013 | 55 comments



This does not necessarily mean that wages are falling. It could be that a lot of the new jobs created are "beginner" jobs with lower wages that pull the average down, but the existing jobs still pay what they used to or more.


What is driving (and will keep driving) wages down is that becoming an experienced programmer has gotten easier: the work required per task has shrunk, and the barrier to entry has dropped (compare writing C++ in 2002 to 2012, and the learning resources now available). On top of that, most use cases are best served by high-level dynamic languages like Python or Ruby, which require significantly less investment to get results than old favorites like Java or C.

Most people don't need a tech savant. As that becomes more apparent, and development turns into snapping Lego blocks together rather than casting every block yourself, I expect the mean salary to drop.


I'd also point out that the expansion of SaaS, open source, and mobile app marketplaces impacts this.

Look at services like FreshBooks and Basecamp. The engineering effort to implement comparable solutions used to be immense; now you can buy into online services for less than the price of lunch, with no long-term contracts.

Similarly, look at open source solutions like WordPress. An afternoon with a $5 hosting account and a few plugins will get you a site that, feature-wise, would have cost you $100K+ in 1998.

I'm also amazed at how often a few minutes searching for an app will get me something that fits the need perfectly - and costs less than $4.


This is somewhat offset by higher expectations. Every app better have 9 trillion features and social network integration etc.

The complexities of twiddling with memory have been replaced with the complexities of making a ton of different systems work together in a sane way.


What's up with C++ in 2002 versus today? If you knew C with classes in 2002 (MFC style), you were good; if you knew GoF patterns, you were elite.

Today you have to know the STL, auto/smart/shared pointers, and all the *_cast variants; you have to know Boost (a huge code base), and on top of that you're expected to know C++0x (or whatever it's called now) with its lambdas, autos, and such, while still understanding the whole "classic" C++.

(But at least nobody cares about MFC and ATL anyway)

And don't forget templates of templates of templates, which crept into many code bases. And 64-bit (or ARM) is now a reality you have to account for.

If anything, you have to know ten times more in C++ in 2012 compared to 2002.


> But at least nobody cares about MFC and ATL anyway

About a decade ago I did a lot of MFC/C++.

From what I see here in Australia, one of the reasons nobody cares about MFC and ATL is that for the last 5 years C++ has been pretty much dead on the Windows platform.

Over here in Oz C# and .Net are the kings of the Windows platform.


Your example is flawed.

C++0x is easy to learn. For example, auto is just the ability to say "auto myValue = 5.0;" and have the type deduced for you (double, in this case).

Lambdas in C++ are fairly easy. The *_cast variants? Trivial [especially with auto].

Templates of templates of templates are easy as well.

64-bit? ARM? The whole point of C/C++ is to abstract you away from assembly. You shouldn't have to worry about 64-bit vs 32-bit as long as you use the proper types.
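
For what it's worth, here's a minimal sketch of the features being discussed - auto deduction, a lambda, a named cast, and the <cstdint> fixed-width types that keep code portable across 32-bit, 64-bit, and ARM (names and values are mine, purely for illustration):

    #include <algorithm>
    #include <cstdint>
    #include <iostream>
    #include <vector>

    int main() {
        auto value = 5.0;                      // deduced as double (5.0f would give float)
        std::vector<std::int32_t> v{3, 1, 2};  // int32_t is 32 bits on every platform

        // Lambda: sort descending without writing a comparator functor.
        std::sort(v.begin(), v.end(),
                  [](std::int32_t a, std::int32_t b) { return a > b; });

        // Named cast: the intent is explicit, unlike a C-style cast.
        auto truncated = static_cast<std::int32_t>(value);

        std::cout << v[0] << " " << truncated << "\n";  // prints "3 5"
    }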


This. My point was that you can use make_shared (and eventually C++14's make_unique) instead of manual memory management, you can get away with a lot more reference passing instead of raw pointer manipulation, the std library finally has an actual hashmap implementation, auto and lambdas make things faster to write, etc.

Modern C++ versus the second-iteration standard C++ of the wild-west days when I was first learning it is an entirely different affair.

I also contest having to know Boost - I work on some KDE projects, and Boost is a dependency in only around 1 in 4 that I have found. Though Qt in many ways becomes the surrogate stack to learn.
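
As a rough sketch of what that buys you (my own toy example, not from any of the projects above): make_shared for automatic cleanup, reference passing instead of raw pointers, and std::unordered_map as the long-missing standard hashmap:

    #include <iostream>
    #include <memory>
    #include <string>
    #include <unordered_map>

    struct Widget {
        std::string name;
    };

    // Pass by reference: no raw-pointer juggling, no manual delete.
    void rename(Widget& w, const std::string& name) { w.name = name; }

    int main() {
        // make_shared: a single allocation, freed automatically when
        // the last shared_ptr goes out of scope.
        auto w = std::make_shared<Widget>();
        rename(*w, "gadget");

        // std::unordered_map is the hashmap the old standard lacked.
        std::unordered_map<std::string, std::shared_ptr<Widget>> registry;
        registry[w->name] = w;

        std::cout << registry.at("gadget")->name << "\n";  // prints "gadget"
    }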


Very true. What % of developers are writing C++, however? I'd think it's a reasonable assumption that that share is going down (and a cursory glance at Indeed shows this).

So some jobs will always pay well; the average salary should go down if the average skill required goes down, however. (Ignoring for the moment that some less-skilled development jobs actually pay better as a result of demand.)


If we take ruby, you still have to know ruby, rails, js, jquery, sql, quite a few libs, html, css, be familiar with ie/webkit/firefox, be aware of sql injection, xss, csrf and stuff. Do not forget http, rest, be able to understand caching, dns, proxying, load balancing. It's a lot of knowledge. In 2002 you could get away with a perl script!

use CGI;


Yeah it is very difficult to find Rails developers who are on the market. Wages for Rails developers have certainly been going up in the Bay Area, even including the tons of junior hires, because no one can find senior talent.

Maybe this isn't the case in the middle of the country where businesses don't realize that a 10x programmer is easily worth 2x the salary of a brand new programmer.

Ruby may be easier to pick up than C++ but the entire web stack still requires quite a lot of knowledge. Especially when you start talking about scaling databases, distributed systems, fault tolerant cloud architectures, etc.


Even a 1x programmer still has to know a lot of topics, or his products will be lacking.


Wasn't the same thing said when we went from hard-wiring programs to writing machine code, then from machine code to assembly, then to 3rd generation programming languages, and so on?


Possibly. However, the better you are, the faster you work, and the higher your value per hour. If I can be 3x as efficient as a junior dev, I'm still worth more.


If you are introducing bugs that will at some point need to be addressed while "working fast", that'll cut away at a lot of the added value.


This is where experience helps. You know how to be efficient without sacrificing code quality, and you also develop a better intuition for tracking down the bugs you do have.


Yep, and you know what to write tests for


I think even in Silicon Valley, salaries anywhere other than the selective big players have not kept up with inflation. A developer with 2 years of experience still gets between 80-100k or less. This is what was being paid in 2000 as well. The situation is probably the same elsewhere.

Of course, my numbers above are from my own poor sampling.


I'm not sure what part of 2000 you're talking about, but the middle of 2000 was the beginning of the bust. I don't think that's a good place to take the sample.


Well, I am talking about the period between 2000 and 2012. Salaries have remained in the same range since then for the most part, based on my conversations with people I used to work with back then in Silicon Valley.


And without consideration of inflation.


My comment was considering inflation. Can you please elaborate on what you were alluding to?


Which, to be fair, the article points out right at the top:

"There are multiple theories for the decline in pay, but a common one cited by analysts is simply that the new people being hired are paid less than those already on the job."


But these are not necessarily "beginner" jobs: these unemployed people may have been hired back for the same types of jobs they had before, but at lower salaries. If you've been unemployed for a couple of years, you're probably willing to settle for less money just so you could have a job again.


Some such instances could conceivably be a simple market correction, too. Some localities particularly entangled by the housing bubble could have had inflated tech salaries as part of general market/cost-of-living forces.


This is unlikely, since the housing market topped 5 years ago and may actually be coming off the bottom finally (we won't know for a few more years).

Another possibility is that there are companies hiring "software developers" for jobs that most HNers wouldn't find interesting, because the level of software development is low and hence has a lower wage associated with the value generated.


> This is unlikely since the housing market topped 5 years ago and may be actually coming out of the bottom finally

No, that's exactly my point. Hiring would be occurring now because the economy is recovering from the recession, but at corrected wages that do not reflect an artificially bloated cost of living in particular localities.


"new people being hired are paid less than those already on the job."

Is this surprising at all?


I...hope not? The point is that a sudden influx of new workers can drive down the average salary, even if the salary distribution isn't changing. By "salary distribution" I mean the curve of salary vs experience/skill level.


I really hate DOL stats. They have a menu of jobs, and you have to slot people into them. Often you have jobs that are halfway between two categories and you don't know which is "better", so you just pick one (e.g. I write a lot of Excel macros; does that make me a programmer or an analyst? To me, it's obviously not a programmer, but I wouldn't be surprised if that judgment isn't applied universally). Also, categorizing people is often left to the discretion of the employer, and sometimes they have an incentive to lie (R&D tax credit, anyone?). These stats include tech support and also capture a wide range, from "Web Developers" to "Computer and Information Research Scientists".

http://www.bls.gov/soc/2010/soc151130.htm


Am I the only one driven crazy by reporting that states an ambiguous "average" for something as asymmetric as salary, when the choice between median and mean as a measure of central tendency makes all the difference in the world? With a long right tail - say, hypothetical salaries of 60, 70, 80, 90, and 300 - the median is 80 while the mean is 120.


You are not alone, amigo.


The Dice salary survey says salaries went up. I wonder which is right: http://media.dice.com/report/2013-2012-dice-salary-survey/


To compare, you'd need to know the statistical and systematic uncertainties of each measurement.
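
A back-of-envelope sketch of what that comparison would look like (the +/-2% error bars here are invented purely for illustration; neither survey publishes them in the article):

    #include <cmath>
    #include <cstdio>

    int main() {
        // Hypothetical: Dice reports +3% +/- 2%, DOL reports -2% +/- 2% (1 sigma).
        double diff = 3.0 - (-2.0);                       // 5 percentage points apart
        double sigma = std::sqrt(2.0 * 2.0 + 2.0 * 2.0);  // combined uncertainty, ~2.8
        std::printf("difference: %.1f points (%.1f sigma)\n", diff, diff / sigma);
        // ~1.8 sigma: with errors that large, the two surveys
        // wouldn't even clearly contradict each other.
    }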


Ugh. Those data visualizations are unclear at best, misleading at worst.


I think this is due to an expansion of entry-level jobs, i.e. Code/App Academy graduates and people learning Ruby on Rails and going into entry-level jobs. Since tools are getting better and better, there is an expansion at the lower end of engineering. They should have measured people with the same level of experience (e.g. 5 years of experience) and seen how they stack up.

In my personal experience, and from what I have seen around, experienced devs can expect 5%-6% increases annually (a few percentage points above inflation). It is never linear, though. Usually you'll see this as a modest 3%-4% on your current job, and then, if you switch jobs every 3 years or so, another 10% or sometimes more from your new employer. This is for experienced engineers (5+ years). If you are very early in your career, you will probably see higher jumps in your early years.
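
To put rough numbers on that (purely hypothetical figures): 3.5% raises for two years plus a 10% bump on a job switch in year three annualizes to right around that 5-6% range:

    #include <cmath>
    #include <cstdio>

    int main() {
        // Hypothetical: 3.5% annual raises for two years, then a 10% jump
        // when switching jobs in year three.
        double growth = 1.035 * 1.035 * 1.10;            // ~1.178 over 3 years
        double annualized = std::pow(growth, 1.0 / 3.0); // ~1.056, i.e. ~5.6%/yr
        std::printf("3-year growth: %.1f%%, annualized: %.1f%%\n",
                    (growth - 1) * 100, (annualized - 1) * 100);
    }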


Code/App Academy, people learning Ruby on Rails and going to entry level jobs

I kind of doubt it's Code Academy.

This article as of Jan 2013 said Udacity (which IMO is more rigorous than Code Academy) has only placed 20 people and Coursera only a 'handful'.

http://online.wsj.com/article/SB1000142412788732433920457817...

About 350 companies have signed up to access Udacity's job portal in recent months, though it has placed just about 20 students so far.

But the company (Coursera) matched only a handful of students in its months-long pilot


Oh yeah, blame it on Rails. Like JEE garbage makes you look smart...


This title seems very misleading. There's a difference between "U.S. Software developer wages [falling] 2%" and the average falling by 2%. Since the demand for software engineers is so high, perhaps companies have been lowering the bar for how qualified an engineer needs to be in order to be hired. Less qualified generally means paid less. That's just a theory but it's what I think is happening.

I've heard from industry sources that the same engineer is worth as much as 7% more than he was last year (not including the experience he gained since last year). I'd believe it. I think the qualification bar is just lowered because the demand is so high and supply so low.


I wonder how much of the $2k decrease could be attributed to geography? I think we are seeing more opportunities for technical jobs throughout the US, lowering the percentage of developers who need to be paid a Silicon Valley wage.


I really wish they would factor geographic location into their stats. I live in TN, where around half that $99k average wage is the norm for CS grads, and that's considered a good salary. Cost of living is much lower here than in Silicon Valley, so it partially evens out. A country-wide average is really no help to anyone with such disparity.


I was wondering the same. You can't talk about average wages without adding location information in with those wages as that's a big part of it.

A good indicator of the usefulness of this info is that nobody can explain why these wages have dropped. Sounds like they need more info.


Yes, this could be an instance of Simpson's paradox. If more jobs were created in low paying areas, the salaries could rise in every single geographic area, but still decrease overall.
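
A toy example with made-up numbers: say the Bay Area goes from 100 devs at $120k to 100 devs at $125k, while a cheaper region goes from 100 devs at $60k to 300 devs at $65k. Salaries rose in both places, yet the overall average falls from $90k to $80k:

    #include <cstdio>

    int main() {
        // Hypothetical headcounts and salaries (in $k) for two regions.
        double before = (100 * 120.0 + 100 * 60.0) / (100 + 100);  // 90.0
        double after  = (100 * 125.0 + 300 * 65.0) / (100 + 300);  // 80.0
        std::printf("average before: $%.0fk, after: $%.0fk\n", before, after);
        // Both regions' salaries went up, but the low-wage region grew
        // faster, dragging the combined average down: Simpson's paradox.
    }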


U.S. Software productivity falls 200% as legions of the technically illiterate conspire to make it look like they are doing work they can't actually do.


So does this mean the U.S. needs to issue more work visas?


Yes, because there is a shortage of X workers until wages hit the federally mandated minimum.


Two related thoughts that always occur to me in these discussions:

* Many American programmers and IT workers (I wouldn't say most, but at least a sizeable minority) display reservations about the policy of issuing visas such as H1Bs to a large number of foreign workers, because it's perceived as unnecessary, or as catering purely to the interests of megacorps. I wonder, do those software developers have a problem with buying products manufactured in China or other low-wage countries? Buying stuff made in China, including from American companies that outsource to China, effectively lowers American blue-collar wages below the mandated minimum.

* It seems a lot of people on HN are favourable to policies allowing working from home. Most companies still don't have such policies, but I suspect that if telecommuting ever becomes commonplace (e.g. due to software development process or technological changes), it will depress salaries far more than the measly 60~80k H1Bs per year have ever done. For every H1B holder there are certainly many others who are skillful enough programmers and speak English, but cannot work in the US due to the limited number of visas, or don't have a degree, or don't want to take the risk of working under the constraints of H1B, or simply don't want to move to the US for any old reason.


I think you're overly focused on global economics.

H1B abuse presents local economic problems. Wages for programmers working in San Francisco are higher than wages for the same programmers in, say, Lexington; both are higher than in China; and all are related to local cost of living.

If a company abuses H1Bs to import cheap programmers from China to San Francisco and keep paying them Chinese rates, San Francisco programmer wages are depressed and SF programmers are disadvantaged relative to the entire local San Francisco economy.

Note I keep saying "abuse". The problem people fear with H1Bs is that their nature opens them to abuse. Not abuse by the immigrant programmers, but abuse by the corporations employing them, which can use the conditions of the H1B program to essentially hold immigrant workers hostage in below-market-rate jobs.

I think you'll find that if you talk to programmers reasonably informed about the nature of H1B visas, most will have no general objection at all to programmers immigrating to the US from China, India, or anywhere else. Only to the particular circumstances of the H1B program, which break local market forces.


This is something I have been thinking a lot about recently, since I just started a new job at a well-known software company. Of the 9 people on my team, I am one of 3 who are NOT working on an H1B visa. This ratio seems typical company-wide, if I had to guess.

I think as a business it makes sense to take advantage of this loophole, but obviously as a developer my wages are being suppressed.

I say either get rid of the H1B program (it's obviously a price issue and not a talent issue) or modify it to give them a fast track to citizenship or some other status that allows them to change jobs.


Outsourcing manufacturing also depresses local salaries. It's just that it's not very visible anymore. I think it's fair to assume that, out of all unemployed Americans, there are some who would happily take a factory job paying $X; but $X is too high compared to the equivalent Chinese worker salary.

I totally understand that some programmers feel more competition is not in their best interest. If they would just say they are protecting their turf, then I'd understand. Doctors and lawyers limit the number of licenses, certain trades restrict their jobs to unionized workers, etc.

I'm just wondering if their objections have other grounds, e.g. moral or public policy principles, and in that case whether those principles would apply to e.g. blue collar workers (or phone technical support, or farm workers etc), and whether they would be willing to pay more for US-made products in order to support American-based manufacturing.

> If a company abuses H1Bs to import cheap programmers from China to San Francisco and keep paying them Chinese rates,

I think you're resorting to unnecessary hyperbole here. There's public data on H1B salaries and they're far above median Chinese salaries.

In any case, my question above concedes the assumption that programmer salaries are depressed to some extent.


> Outsourcing manufacturing also depresses local salaries.

No, it moves jobs, generally entire categories of jobs, to lower-cost areas. If someone wishes to continue working in that type of job, they must do it in a place with lower cost of living.

> I totally understand that some programmers feel more competition is not in their best interest.

That has nothing to do with anything I said, and in fact I pretty clearly articulated that such a view has nothing to do with the fear some people have of the H1B program.

> I think you're resorting to unnecessary hyperbole here.

I think you're reading things into my comment that are not there, because you wrongly assume I oppose H1B visas.


* It seems a lot of people on HN are favourable to policies allowing working from home. Most companies still don't have such policies, but I suspect that if telecommuting ever becomes commonplace (e.g. due to software development process or technological changes), it will depress salaries far more than the measly 60~80k H1Bs per year have ever done. For every H1B holder there are certainly many others who are skillful enough programmers and speak English, but cannot work in the US due to the limited number of visas, or don't have a degree, or don't want to take the risk of working under the constraints of H1B, or simply don't want to move to the US for any old reason.*

Or live in states with poor economies and no tech jobs of their own and are willing to work for less than what a Bay Area programmer is.


It depends on the job. While I don't doubt that is true for a simple web development task, most programmers who live in states with poor economies and no tech jobs probably have not written service-oriented distributed systems on AWS that have handled Google/Amazon/Dropbox/insert-big-tech-company-here size loads reliably with fault tolerance etc. I can see how that knowledge will eventually become more common all over the world, but the companies in the Bay Area will be on to the next technologies and want senior people with 5+ years in SomeNewWebScaleDB and production experience with EvenMoreTrendyLanguage which is still likely to be less common outside of tech hubs.

We'll certainly tend towards more telecommuting and less emphasis on physical location though.


Exactly. In fact, I was thinking not only of India and China, countries that are traditional sources of H1B programmers, but also Western Europe, Mexico, Brazil, etc.


What is the margin of error for a statistic like this? Seems like 2% could be within measurement error, unless the DOL data is counting every single software job in the US.



