The Rise of Developeronomics (forbes.com/sites/venkateshrao)
282 points by ryanackley on Dec 6, 2011 | 118 comments



> if you are a baker with a small business, you are effectively useless, not because bread isn’t important, but because surviving in the bread business is now a matter of having developers on your side who can help you win in a game that Yelp, Groupon and other software companies are running to their advantage

I don't think this is true right now; I very much doubt it will ever be true.

I live within walking distance of not less than five excellent boulangeries and have never used any sort of "social" tool or other technology to select the one I'm going to buy bread from. I try them all regularly, know what they do well or less well (one is better at croissants than the others) and basically go to the one that offers the best version of the product I'm looking for.

Looking for "bargains" in bakery is asking for trouble; choosing your bread according to your "social network" (who's composed of people who do not live right next to you) sounds completely useless.

It doesn't seem like bakers need to worry about Groupon, or Yelp. Is this just a poor example or am I missing something?


>It doesn't seem like bakers need to worry about Groupon, or Yelp. Is this just a poor example or am I missing something?

You're missing the leverage that software gives you. A bakery that keeps track of its Yelp reputation and works hard to maintain a 5-star rating will attract customers from all over the city. It might even start attracting out-of-towners who are in for a visit. This bakery's "market" is effectively unlimited because in any major city, there are going to be out-of-towners flowing in and out on a regular basis, and some of them will be looking for a bakery.

A bakery that doesn't do as well at maintaining a good reputation on Yelp will have its market limited to the reach of its advertising and the word-of-mouth of its customers. Now, in a relatively tech-heavy town like San Francisco, that might be fine. If you provide good croissants, you'll get mentions on blogs and Twitter, which would magnify the word-of-mouth effect in a manner similar to Yelp. But if your customers aren't bloggers or heavy users of social networks? You could have the best bread in the world and not be rewarded commensurately, simply because not enough people know that you have the best bread in the world.


Is there such a thing as "the best bread in the world"?

There are "bad breads", or even "mediocre breads", but beyond that, all breads are interesting, each in its own way. I'm not sure I would want to eat the same bread everyday just because it's the "best" one (according to a bunch of Yelp users).

I don't think this "winner takes all" analysis really works for food.

Let's think about something else outside of food. There is probably such a thing as "the world's best laundry service", but do I care? I won't drive ten miles (or even 1 mile) to find a better laundry service, be it marginally better or even incredibly better, if the one that's nearest me is affordable and good enough. I will never know they exist because I won't look for them.

So I'm not sure "winner takes all" works for proximity services in general.

That's not to say that technology cannot disrupt those markets; someone could build a machine that lets people bake incredible bread at home (it's been tried, and failed miserably up to now, but that doesn't mean it can't happen).

But the whole "bakers need to become experts at managing their online social reputation" line of reasoning strikes me as misguided and, frankly, the root of a lot of evil in modern society.

I'd rather patronize bakers who are experts at baking. There are many of them.


"A bakery that keeps track of its Yelp reputation and works hard to maintain a 5-star rating will attract customers.."

Never used Yelp, but do you really need a developer to do that? Would think the easiest way to get a good rating would be to hire some people to write good reviews for you and trash the competition.

(OT, but I bet sites like that will get less and less useful as more companies start gaming them)


Hiring people to "write good reviews for you and trash the competition" is a quick way to establish a very short-lived reputation boost.

However, it's inevitable (or at least hugely likely) that you will be found out and your reputation ruined. Competitors will likely sue.

It's probably cheaper and easier to hire a developer who can write tools to monitor reviews posted online, manage a website, run your facebook group/app, write an app to make it easy for customers to order bread from their mobiles, and so on.

All of those things are legitimate activities and add genuine value to your business. They are also things that are impossible to achieve if you don't have technical knowledge.
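
To give a sense of how little code the first of those tools takes, here is a minimal sketch: a script that polls a listing page and reports reviews it hasn't seen before. The URL and the parsing function are placeholders, since the real markup depends on whichever review site you're watching:

  import hashlib
  import json
  import time
  import urllib.request

  LISTING_URL = "https://example.com/biz/my-bakery"  # placeholder page to watch
  SEEN_FILE = "seen_reviews.json"                     # where previously seen reviews are remembered

  def extract_reviews(html):
      # Placeholder parser: in practice, parse the site's actual review markup here.
      return [line.strip() for line in html.splitlines() if "review-text" in line]

  def check_for_new_reviews():
      html = urllib.request.urlopen(LISTING_URL).read().decode("utf-8", "replace")
      try:
          with open(SEEN_FILE) as f:
              seen = set(json.load(f))
      except FileNotFoundError:
          seen = set()
      new = []
      for review in extract_reviews(html):
          key = hashlib.sha1(review.encode("utf-8")).hexdigest()
          if key not in seen:
              new.append(review)
              seen.add(key)
      with open(SEEN_FILE, "w") as f:
          json.dump(sorted(seen), f)
      return new

  if __name__ == "__main__":
      while True:
          for review in check_for_new_reviews():
              print("New review:", review[:80])  # or email/text the owner
          time.sleep(3600)  # poll hourly

Nothing here is specific to Yelp; the point is just that a few dozen lines of scripting put the monitoring on autopilot and free the baker to bake.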


  > > Hiring people to "write good reviews for you and trash the
  > > competition" is a quick way to establish a very short-lived
  > > reputation boost.
  >
  > However, it's inevitable (or at least hugely likely) that you will
  > be found out and your reputation ruined.
Short of your reviewers deliberately snitching on you, how exactly is anyone ever going to find out?

I'd really, sincerely like to know -- as I can't tell which strongly positive or negative reviews on various sites like Amazon, IMDB, etc., are written by hired reviewers and which are real.

Of course, I have my suspicions... but no real proof. And I don't see any way to ever get real proof -- unless the reviewer confesses.

So.. what's the secret?


First of all, I think it's very likely that one of your own (ex-)employees will either go public with the information or blackmail you into paying for their silence.

Secondly, a sudden increase in the number of positive reviews should be fairly easy to spot by a competitor who has already hired a developer to watch for such things and do automated sentiment analysis on the reviews of competitors.

Thirdly, it's entirely possible that a broader-spectrum monitoring of reviews (perhaps provided by a third-party service) could identify common authors and/or text that is likely to be fake by analysing the content of the reviews themselves. This would help pick up paid reviews that were placed over a long period of time.
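
To make the second and third points concrete, here is a rough Python sketch of the kind of spike detection such a watcher might run. It assumes the reviews have already been collected as (date, star-rating) pairs; the weekly bucketing and the threshold factor are arbitrary and purely illustrative:

  from collections import Counter
  from datetime import date

  def weekly_positive_counts(reviews):
      # Count 4- and 5-star reviews per ISO week from (date, stars) pairs.
      counts = Counter()
      for day, stars in reviews:
          if stars >= 4:
              counts[day.isocalendar()[:2]] += 1  # key on (ISO year, ISO week)
      return counts

  def flag_spikes(reviews, factor=3.0):
      # Flag weeks whose positive-review count exceeds `factor` times the average week.
      counts = weekly_positive_counts(reviews)
      if not counts:
          return []
      baseline = sum(counts.values()) / len(counts)
      return [week for week, n in counts.items() if n > factor * baseline]

  # Example: three quiet weeks, then a sudden burst of five-star reviews.
  history = [(date(2011, 11, 7), 4), (date(2011, 11, 14), 5), (date(2011, 11, 21), 4)]
  history += [(date(2011, 11, 28), 5)] * 10
  print(flag_spikes(history))  # -> [(2011, 48)]

A real service would layer on the author-overlap and text-similarity checks from the third point, but even something this crude would surface the clumsiest campaigns.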


"First of all, I think it's very likely that one of your own (ex-)employees will either go public with the information or blackmail you in order to buy your silence."

Why would they go public? They're presumably making (what they consider to be) good money writing fake reviews. By going public they'd be 1 - losing their revenue from you, 2 - undermining the public's confidence in reviews, which would translate into a less lucrative business for the provider of fake reviews.

As for blackmail, I'm sure it happens from time to time. But I don't know how to estimate the frequency with which fake reviewers blackmail their clients. What is your basis for estimating the probability of such blackmail as "very likely"?

Furthermore, blackmail is an obvious risk for any secret/illegal undertaking, and yet people try to do things in secret and break the law all the time. Clearly, for them the risk seems worth it (at the time).

"Secondly, a sudden increase in the number of positive reviews should be fairly easy to spot by a competitor who has already hired a developer to watch for such things and do automated sentiment analysis on the reviews of competitors."

So, the fake positive reviews might be staggered over a reasonable period of time. But even if the business receives many positive reviews in a short period of time, that's not proof the reviews were fake.

"Thirdly, it's entirely possible that a broader-spectrum monitoring of reviews (perhaps provided by a third-party service) could identify common authors and/or text that is likely to be fake by analyzing the content of the reviews themselves. This would help pick up paid reviews that were placed over a long period of time."

Are there any such services which claim to be able to detect fake reviews? Or is this mere speculation on your part? Regardless, just because some service or algorithm claims a review is fake doesn't mean it is. Once again, their guess/estimate is not proof.


> Why would they go public?

Sorry, my bracketed "(ex-)employees" wasn't really clear enough. I was trying to suggest a vengeful motive after they had been fired.

> What is your basis for estimating the probability of such blackmail as "very likely"?

Basic human nature. If you're directly employing people whose job is clearly unethical then you're painting yourself into a corner. Either they do in fact actively blackmail you or else you will feel unable to fire them in case they start telling people what you're up to - I'm regarding this implicit threat as passive blackmail here.

I agree with you that there are lots of people who seem perfectly happy to do unethical or illegal things in order to make a living. They calculate (if not always very accurately) that the potential gains outweigh the potential costs.

However, a lot of these people make such a decision in light of particular circumstances e.g. their low intelligence, or their lack of employability (due in part perhaps to an existing criminal record), or their unwillingness to work.

Someone who has established a business that is making enough money to pay the bills should clearly be reaching a different conclusion to these sorts of people (at least most of the time). Their potential losses are much higher (i.e. loss of the business) and the potential gains are actually not all that favourable compared to doing things the "right way" - for example, by hiring a developer who can boost your business in other ways.

> But even if the business receives many positive reviews in a short period of time, that's not proof the reviews were fake.

It's evidence. What a business owner might do if they suspected a competitor had hired someone to "opinion spam" is up to them. If they thought a competitor was paying someone to post negative reviews of their own business then they could very well sue, and request access logs etc. from the review site in order to further the investigation.

> Are there any such services which claim to be able to detect fake reviews?

I haven't looked. But it would certainly be possible to create such a service without too much difficulty.



I think it's a very insightful article regardless of whether some tech startups successfully become gatekeepers (internet landlords?) who dominate bakers, like Amazons of baked goods. I'm willing to accept a lot of incorrectness if the author is taking a few good concepts and pushing them to one logical extreme (which is essentially a process of exaggeration; and I expect a mental model to diverge from reality, like in any science).


I'm wary of the fact that it basically tells us what we want to hear: the only talent that matters is developers; companies who "get" developers will survive and those who don't, won't, no matter the industry.

That's a little too good to be true, and the bakery example struck me as very ill-suited.

I enjoyed the article, though, and I wish it were true that clueless incumbents would die rapidly and developers would get infinite leverage.

What I think is true is that developers will get a little more leverage while clueless incumbents will take forever to disappear.

(Edit: in the long run, everyone will be a developer. There will still be good and bad developers, but there won't be non-developer professionals, just like today there are not many people with a job who can't read or write.)


Regarding everyone in the long run being a developer, there is little historical evidence for this.

We are not all mechanics. In fact, the knowledge of how to make a simple repair like fixing a puncture in a tire is much less prevalent now than it was forty years ago. Never mind the knowledge of how to troubleshoot a vehicle that won't start.


It's pretty easy to decouple transportation from what you need transported, without losing value. The author's thesis is that it is impossible to decouple development from what you need developed without losing value. It's not exactly proven in the article, but I think it has some merit.


The modern vehicle has evolved to a highly specialized state. Though we have the capacity to design a vehicle that the common man could fix himself on the weekend, we clearly don't have enough demand for one, and the regulatory requirements for pollution and safety might very well make it impossible.

The consequence of this is that knowledge of the mechanics of the complex system that is a modern vehicle is typically beyond any one person, and certainly well beyond the majority of its users.

Today there is leverage in far-flung industries for the individual who has specialized knowledge of his field AND knowledge of how to apply software development to that field. But these opportunities will not last long. Any software that can increase profits in an industry will attract competition from other software developers. That software will be improved and refined by software specialists. It will evolve (and in places already has) to a systemic complexity that equals that of the modern vehicle.

Software is a tool. The civil engineer who uses AutoCAD to design your road does not need to know how to code his own AutoCAD.


So, once your niche has sufficiently good software tailored for it, there's no profit in being a developer. That makes sense.

Although industry standard tools with scripting environments often have active communities of people developing sophisticated tools on top of them. Encase is one example; the Canadian Mounties write quite a bit. Splunk is another. Niches may asymptotically outrun good software.


Maybe not, but a lot of them are very well versed in AutoLISP and customize their environment with it. That's what's interesting about software, in my view. Once the low-level plumbing is done, you can always move up to a higher level and continue to build. I'm not sure what the maximum might be. Maybe one day programming will be much less direct than it is now and will be done by non-programmers, but I won't hold my breath that this will happen soon enough to put most of us out of a job.


The way it was framed might be a tad hyperbolic at this stage. However, managing an online presence, if done properly, can be a cheaper way to acquire new customers than conventional bricks-and-mortar advertising and marketing channels. As many businesses experience a relatively constant attrition of customers due to factors having nothing to do with the business' service delivery, etc., acquiring new customers inexpensively is a constant activity.

A lot depends upon the focus and goals of the business owner. The owner who is happy staying a corner boulangerie could accomplish just as much, far more cheaply, by making sure they are active on a neighborhood mailing list that distributes a hardcopy newsletter to each new resident arriving in the neighborhood, as by paying someone to manage their Yelp presence. Someone who wants to run a local chain of coffeeshops that push their baked goods, on the other hand, might find it more feasible to pay someone to manage their Yelp presence.

The author doesn't make this clear, but I suspect (due to the Forbes origination of the content) he assumes the reader is someone who intends to run a scaled-up business as opposed to a classical "small Mom & Pop" operation. Otherwise, I find the "software as core" argument he puts forward a little difficult to buy into.


You nailed it with the "writing for Forbes" comment, and it is indeed a little hyperbolic, but consider: even the small corner coffee roaster now needs to compete with a small coffee-plus-software operation like http://tonx.org

(which does have excellent coffee, by the way...)


You are right and wrong. It is about consumer behaviour. I can't think of another technology that affects consumer behaviour the way the internet does, and now it has legs with mobile. If it hasn't affected how you choose a bakery, it surely will affect your children.


Where would my children have to live to need technology to choose a bakery? It would have to be a place with either thousands of bakeries to choose from (not likely), or, more probably, a place with almost no bakery.

If you have to drive 30 miles to find a bakery then it makes sense to try to find out which one is worth the effort.

Which makes me wonder: is this an American problem? In cities where

1. almost anything is accessible by foot

2. driving is extremely annoying and expensive

you have an enormous incentive to choose services based on proximity rather than their current Yelp rating (or lack thereof).


> Which makes me wonder: is this an American problem?

It is a combination of hipster fantasies, delusions, and a hope for a business model.


As a counterexample, there's a bakery literally around the corner from where I sit.

Yet, I've never eaten anything from it, preferring several alternatives that are several blocks away. Why? My coworkers warned me of dubious sanitary practices (it even looks dirty), and none of them eats there. So "social" made me not choose that bakery.

A new worker at my building might not know of that, and thus might ignorantly go to that bakery. More likely, a contractor, deliveryman or similar, would find such a service very useful.


That's basically a problem of the social networks not being good at that right now (really local/location based recommendations and coupons, not citywide but neighbourhood-wide).

As usual, there are a few startups working on that :) (I know of one).

I still like the "trying out new stuff" approach, but I can see that experience being augmented by social networks.


At the end of the article, the author explains how "once a good developer recognizes his/her own value, [she turns] to either an individualist-mercenary mindset or a collectivist guild-like mindset."

He elaborates on the "guild-like mindset":

> The other kind of developer turns to guild-like structures, which serve as centers of balance-of-power politics in the constant wars against the developer-capitalists. Except that instead of taking on the dynamics of class warfare along an upper-lower dimension, the conflict takes the form of exit warfare along an inside-outside dimension. Rather than form a union to negotiate with management, the talented developer will simply exit a situation he/she does not like, and use guild-like resources to move to a better situation. Stock options are simply not as effective in limiting mobility as the power of Russian nobility to whip serfs into immobility once was.

I've tried to make sense of what that means, but I'm lost in the balance-of-power, upper-lower/inside-outside, and guild references. If someone has a moment, could they please explain this paragraph in clearer terms? Perhaps as a basic narrative of what a guild-minded developer would do when she recognizes her value?


In political science, a fairly well-known basic idea is that the two ways to dissent are exit and voice. Exit means you leave and go somewhere else, and was common in early political eras when most civilizations were small and surrounded by plenty of nomadic/lawless regions to retreat to. As populations increased, voice (i.e. protest, class warfare etc.) became increasingly common. The classic reference on this is "Exit, Voice and Loyalty" by Albert O. Hirschman. It's been on my reading list for a while.

http://en.wikipedia.org/wiki/Exit,_Voice,_and_Loyalty

For developers, it is swinging back towards exit, since the Internet can be considered a kind of virtual equivalent of the nomad regions to retreat to.

"Center-periphery" dynamics is the common term in geopolitics for dissent dynamics driven by movement inside/outside a core. Class warfare is the better known kind of dissent involving unionization etc. and involves fighting up and down a class structure.

Guilds, historically, were a medieval kind of institution that had characteristics of both exit and voice. The classic guild professions (weaving, masonry and in more recent times, things like typesetting in pre-lithography days) used their portable skills to leave kingdoms/cities and move elsewhere if they didn't like their current situation. This option was not really available to laborers tied to the land, and was essentially an urban phenomenon. This is the origin of the term "journeyman" for instance... apprentices who would follow master craftsmen around until they became masters in their own right.


> This is the origin of the term "journeyman" for instance... apprentices who would follow master craftsmen around until they became masters in their own right.

No, that's not right. When an apprentice finished their apprenticeship, they would usually have to leave their master's service. The openings for masters would be fixed per town by the guild, so journeymen would work itinerantly until they found an opening to become a master. Journeymen would work for a series of masters, and their relationship to these masters was entirely different to the relationship of apprentices.

The status of journeyman became institutionalised, so that the criterion for taking up a mastership was that one had travelled widely enough as a journeyman for some length of time and had crafted a masterpiece.

I've never heard of the idea of guilds organising flights: I should think that master guildsmen had more to lose than agricultural laborers from relocation. Perhaps if several guilds coordinated, it could be less than massively destructive. Where did you get this idea?


It may help to know that while journeyman and journey come from the same French word, journée (meaning, essentially, day), journeyman simply means someone who is entitled to charge a day rate for his work. (Masters were the "contractors" of their day; they were paid for the project, and apprentices got to eat -- they usually paid for the privilege.)

Yes, the guild system and town laws often made the journeyman itinerant, but it wasn't travel that put the "journey" in "journeyman".


Interesting, thanks for clarifying. I assumed too quickly that journey in journeyman meant what it means to us today.


Don't quite recall, but I think it was from a book about cathedral builders or something. But I'll admit my knowledge of medieval guilds is largely superficial and drawn from books where it was a side issue rather than the main topic. Your description sounds like you know what you're talking about. Got any good references (books, not articles) worth reading to get a clearer idea of how guild economies worked?


I'm not an expert: I spent a bit of time about ten years ago researching the topic, back when there was a lot of talk about guild structures being appropriate to software development.

Epstein has an article on craft guilds in the Oxford Encyclopedia of Economic History: http://www.oup.com/us/pdf/economic.history/craft.pdf


Dude, that is fascinating.

Is there some way I can contact you?


I wrote the original piece. You can contact me via my blog http://ribbonfarm.com


Oh, it's you. I am already in touch with you, heh.


Drawing a superficial parallel to some other poorly understood or fictional organizing principle seems to be the author's main schtick.

http://www.ribbonfarm.com/2009/10/07/the-gervais-principle-o...


One man’s superficial parallel is another’s thought-provoking analogy. I think a lot of what Venkat says loses substance – or even falls apart – under scrutiny, but I keep reading, because another lot of it hits on some key insights.

This is a style of writing that should be valued more for the questions it raises than for the ones it answers. As another example, take Marshall McLuhan; much if not most of what McLuhan says in e.g. Understanding Media is somewhere between flimflam and complete bullshit... but the mode of thought starts the wheels turning, and some otherwise solid seeming cultural assumptions are questioned at their core... and at the end I’m really glad I spent the time reading it.


Perhaps. But the danger of clean, attractive, reductionist metaphors and analogies is that the general public -- including decisionmakers in the business world -- tend to reify the analogies, and treat them like immutable laws of physics.

Witness, for instance, the rise and fall (and rise again?) of "The Long Tail." As a concept, the long tail certainly applies to some businesses. But not to all, and not nearly as successfully as the concept's author once claimed. But the business world was frenzied with long-tail fever for years on end. "Long tail" became the buzz-phrase of the day, even in companies for whom the concept made no sense whatsoever. People with no understanding of the underlying principles could, nevertheless, grasp the surface-layer metaphor -- and, in so doing, assume that they understood the whole thing. That's a dangerous mindset.

So there's a sort of Faustian bargain in all of this. Appending a nice metaphor on top of a concept increases the likelihood that the concept will be disseminated, talked about, and taken up. But, at the same time, it invites intellectual laziness and fads of half-baked thought.


I totally agree. The problem here is not the creation of these metaphors though, but their elaboration and reception. And the bigger problem is that we already carry around so many obsolete or suboptimal cultural concepts without subjecting them to scrutiny despite changed social circumstances or improved technology. I think continuously trying to both (a) collect granular data by immersing ourselves in its context so we can sniff out bad aggregate explanations and (b) come up with new metaphors for what we see is essential – the former keeps us honest, and the latter extends our mental reach.

As you say, the problem is just grasping the surface metaphor without examining the details. For myself, I like having the surface metaphors as a way of storing and communicating ideas, because once I’ve established that someone has a deep understanding of a concept, I can talk about it at a high level with a high data transmission rate.


These clean and crisp ideas are dangerous if you just repeat them (news stories, cocktail parties, management retreats), but not if you use them as a way to evaluate your other ideas, plans and so on. I do agree completely with how annoying these things get, like your long tail example, when repeated so often.


Rather than move the mountain by unionizing and forcing big employers to pay up, skilled and connected developers can simply go around the mountain and disappear to the slightly better job down the street. As long as you are wired into a community of talented developers (and perhaps willing to relocate) you will have options. Spending years trying to "fix" a BigCo IT group from the bottom up is not likely to be the most rewarding use of a talented dev's time.


Just to let you know you're not alone, that one went completely over my head as well.


Based on the other replies (which I greatly appreciate - thank you), I think I have a better understanding of the paragraph. I was particularly interested in understanding the author's distinction between individualistic and guild-minded developers. It seems to me that he is suggesting the following:

When a developer finds herself working in a company that undervalues her, she will typically react one of two ways:

1. "Individualistically": Such developers might begin by bargaining for better terms at their company, but they are quick to leave the company to 1) consult, 2) join/found a startup, or (following way in the distance) 3) join another company. Individualistic developers invest time in learning new technologies to stay current, because their ability to be so mobile exists primarily because they can work in such varied environments. These developers still seek large windfalls, so they "hedge their bets" - taking high-risk jobs more often than most people. Individualistic developers are probably younger than other kinds of developers.

2. "Guild-Minded-ly": Such developers will respond to workplace injustice by leveraging their connections to find another job - probably at a company with similar specifications to the one they are leaving. A guild-minded developer might be an ex-Google employee who - over the years - has built relationships with people at Microsoft and Facebook. In order of preference, such developers seek jobs with 1) another company, 2) a startup, 3) themselves (consulting). Stock options are less effective at retaining guild-minded talent, because they are not as concerned with large windfalls. Guild-minded developers are probably older than individualistic developers, and they also probably want to stay at a job for longer.

The above is probably not a 100% accurate representation of what the author was trying to say. But if it is, I don't much care for it.

While it's true that some of us are more likely to work for ourselves, or tend to be risk-averse, or desire long-term relationships with a company, I don't think the author's division of mindsets encapsulates these motivations.

There also seems to be a lot of overlap between his divisions: Both sets of developers are active in the community (individualistic developers via open-source, and guild-minded developers via whatever a guild is - and probably via open-source, too); both developers are very mobile despite attempts to retain them; and both groups supposedly represent highly talented ("10X") developers as opposed to average ("1X") developers.

I found the article interesting - though I'll need time to digest it and form an opinion. But his decision to close with an articulation of this division is strange. It seems tangential to the larger point, which is that good developers (of whatever group) are important.


It's quite funny because a group of friends and I used to call ourselves the mercenaries, because we had no loyalty to any existing company and would leave in a heartbeat if the project was doomed or a bad manager was drafted in.

But there was also an element of guild behaviour in that anyone who found themselves in a good project/company tended to bring in others.


I think this is what the author was trying to convey, and the article does a fairly good job of providing information from an insider's perspective, even if it does objectify developers. I think most developers move between the two; what you stated seems to be very normal, and 10x'ers seem to pick up other 10x'ers along their way. I have about 8 guys in my (guild) now and we float between contracts and start-ups. I am working on funding for our next venture and I will be calling them in. When I do, they will come, and that is very hard for someone on the outside to understand. Our dynamics work very differently from other industries'. That is probably the thing that made Steve Jobs so formidable: his understanding of the dynamics of creative employees.


I think you hit the nail on the head already with the first part of this paragraph:

>>While it's true that some of us are more likely to work for ourselves, or tend to be risk-averse, or desire long-term relationships with a company [...]

Basically, developers with more of a mercenary mindset are more willing to work for themselves, while developers with a more guild-like disposition tend to be more risk-averse and/or desire long(er) term relationships with a company.

Both groups definitely overlap in terms of their achievements and community involvement. But how they see themselves in the market, and more importantly how they react to changing conditions, seems to be the primary distinction between the two.

Given that it's a Forbes article and given the general audience the author seems to be writing for, I doubt that much of this is anything new for an HN audience. However, it's probably a good radar ping/wake-up call for the more general audience.

tl;dr Polarizing divisions make it easier for people to retain information.


> As a developer ages, and finds it harder and harder to switch technologies, at some point, he or she is considered hooked for the rest of their natural lives to some technology — Java or C++ or the Facebook API say — that they can be expected to grow old with.

Sigh. ::Closes eclipse. Googles "ruby tutorials"::


thanx for the laugh, made my day!


My "telling me what I want to hear" cognitive alarm bells are going off.

On the other hand, this is simply the flip side of the frequently-heard argument around here that technology is driving people who can't cope out of the ability to hold a job at all, because they become unable to produce value in excess of what they cost. If that is true, then it follows that as the supply of people who can still be productive in the new economy drops, then they must also go up in price.


I don't think that's right. If 50% of people became replaceable by robots, the other half doesn't become more valuable. Rather the value shifts to the robot owners.

The article paints a simplistic picture, which is probably not correct even to first order, but it may be a useful viewpoint. The stratification should only increase, with the super devs obsoleting the ordinary code monkeys, but the time frame on that might be large.


The value shifts to the robot developers. They are the ones that capture more of the value originally pooled by the entire workforce.

The robot owners get value in that they can lower their prices because of the replacement of labour with robots, and overall, society benefits because whatever the robots are making is now less costly (ie, takes less of your own hours working to purchase).

The key to understanding all of this is productivity. Most people tend to think in terms of raw job numbers, rather than productivity numbers.

Employing robots results in a more productive economy because the same (or greater) amount of output is achieved with less human input. This does release labour from that activity, but it also creates opportunities in other areas, as the savings people make on purchasing the robot-cheaper goods can now be spent in other areas, which should create jobs in other areas.

Of course, rapidly advancing technology produces lumps of unemployed people who take time to retrain and redeploy themselves elsewhere, and it sucks to be one of them. But eventually they will find employment, assuming something else isn't working against that (like oversized debt accumulation, which works against new enterprise creation by causing a negative investment flow, but that's another story).


You hypothesize an either-or where it's a both-and. The robot owners need somebody to program the robots, too, or they're expensive-yet-worthless hunks of metal. And if that talent is scarce compared to demand, the price is going to go up.


Maybe I'm nitpicking, but I'd say that effect is due to an increase in demand for robot programmers, not due to a decrease in the supply of the productive workers.


I had the same response w.r.t "telling me what I want to hear".

After thinking about it, there is an interesting notion in there, if you consider how the author may benefit from writing such a piece (as opposed to any other piece that would have resulted in him receiving the same payment).

If we assume that hypothetically he is simply writing what developers like to hear, then it is likely that he is trying to curry favour with either a specific developer or developers in general. If he was trying to do that then it implies he sincerely believes what he wrote and therefore wasn't just writing what people might want to hear.

Admittedly it does fall apart if it transpires that his motives were 100% financial, however I found the concept of a self-negating idea to be an interesting one.


Ok, I'm calling it-- when I find myself getting a bunch of hot air blown up my skirt by Forbes, we're officially in a tech bubble.

That said, I found the point about being a capitalist or a commodity relevant to the startup scene in many important ways. There are a lot of people who want me to be very smart with code and very stupid with money.


The author is arguing on several levels.

First, he is telling developers that they are awesome. That's nice to hear, and of course will guarantee a friendly reception by developers. :-)

Secondly, he is arguing that all businesses will be heavily software driven, and if they don't invest in IT, they will wind up buying from a "cloud" provider.

Thirdly, he is arguing that management needs to take a long position on hiring (non-pedantic note: a long position is where you expect a stock to go up; it is the "buy low, sell high" approach), because it will pay off massively. Investing in software creators on the 50-year plan will provide heavy returns, because businesses are becoming software-ized, and it will pay off.

I think in the scope of the next fifty years, the author is correct in that software will be one of the core strata of societies across the board (I don't see us there yet by a long shot - some places don't even have electricity still).



That's pretty amazing!

But, you know, I would argue that someone in that region needs electricity to hook up the facebook. :-)


> The individualists turn into hard bargainers as they carefully probe their own market value and frequently re-negotiate relationships. They carefully invest in keeping their skill-base current and avoiding being shunted into the sunset end of the ecosystem for as long as possible. This sort of developer likes to hedge bets, stay invested in multiple projects and keep one foot in the open source world at all times. They position themselves for massive upsides when that is a possibility, and the ability to walk away from failures with their own reputations intact where there is real risk.

Full agreement on this point. Once I started looking at my own career from this perspective I couldn't stop. Books like The Passionate Programmer and The Pragmatic Programmer as well as heaps of prose from pg, patio11, edw519, and tptacek guide my continuing quest for increased leverage.


The article described old-guard programmers who mastered a technology and then couldn't find work when the technology died off.

I have a hard time understanding this: programming skills are extremely transferable and learning the hot language/platform of the month isn't extremely difficult. Is it just inertia that prevents them from doing this?


> Is it just inertia that prevents them from doing this?

I don't think it's just inertia, at least not in my case. I know that my example is just anecdotal, but I'll write it down anyway.

I'm in my early 30s; I've been programming in Python for 7 years now (and during the last 6 years I even got paid for the privilege), along with other languages (PHP, JS, etc.).

The thing is, there's only so much that you can learn about the world around you by focusing only on programming- or tech-related subjects. Learning Erlang or Caml, for example, is an extremely interesting thing to do, intellectually speaking, but when I realized that by learning to speak Arabic or Farsi I could potentially interact with and better understand the culture of tens if not hundreds of millions of people, my interests slightly changed.

Of course, it wouldn't help me one bit if I were to tell a potential future interviewer "hey, I learned to speak Farsi so now I can really read Hafiz's poetry the way it's supposed to be read", but as long as that experience enhances my understanding of the world around me, it's all for the best.

Re-reading what I wrote, I realize it may not make much sense. Anyway, what I wanted to say is that most of us programmers are (really) smart people, and there comes a point in one's life when you realize that there are other things worth spending your time on.


Good point. I learned to dance tango in a similar situation, and I had never danced anything in my life before that. I'm approaching the magic number 42 myself.


Ian Eslick put it interestingly (http://ianeslick.com/using-100-of-your-brain):

"If we really want to become smarter, we will have to resort to the hard work of building more effective representations within the existing processing capabilities of our brain. Modern neuroscience backs me up here (e.g. summary in nice NPR interview); the process of compiling our experience into higher and higher order patterns that we use to structure future experience is what makes the judgement of experts more efficient than that of a novice (not always more effective, as the higher order patterns can also limit our generative abilities)."

You can be a very effective developer (at least in certain contexts) without building robust mental representations which allow you to assimilate newer techniques easily. And there can be cases where this is an advantage.

So more concretely, people recommend learning tools like Lisp, to give your mind practice with a more abstract, meta way of working. (So you can explore and rejigger new ideas without necessarily writing a classical compiler and idiosyncratic language from scratch.)


As a full-time developer with a family, it's not as easy as it seems to pick up the hot language/platform of the month, especially to learn it deeply enough to compete with people who have been working with it for years. (As everyone knows learning requires doing, and that takes time that is hard to come by.)


Inertia, other life priorities, all types of things.

I think one of the things that's hard for people is that thinking along the lines of a particular technology creates a type of mental rut, which means subsequent thinking tends to go down the same rut.

By this, I don't mean it so much as the 'stuck in a rut' sense of being unable to escape, but more of the fact that it's hard to approach a problem without using an existing mental pathway you've spent years using.

It's like trying to learn a new (spoken) language - you tend to try and think in your old language and do the conversion, rather than think in the new language. The people who can pick up new languages quickly apparently have a good ability to skip over these mental ruts and embrace new types of thinking quickly.

So people struggle initially to understand new technologies because it might require a different type of thinking. And they may also have no chance to practice these skills without leaving their existing job, which might involve a significant cut in income and/or a relocation.

Of course the end-run around this is to be involved in an open source project in the new technology, and try and pick up work through those avenues.

The other issue is that, as a technology gets mature, as long as it is still being used, the dollar rewards tend to go upwards as time goes by. I personally know of guys who still work on database systems written in the 1980s which predate relational databases. They are the only ones who know it, and are pretty much untouchable for both domain and technical knowledge. They get to work from home for large money and are off-limits as far as cutbacks and layoffs go. Of course, this all happens up to the point a new system is put in place, but, as the article says, old systems are geological layers, so businesses tend to build on top of the old layer and glue it together with middleware, rather than do a bottom-up rebuild.

So if you're making big money doing something you can do with your eyes closed, and are indispensable, there's not a lot of incentive (beyond intellectual curiosity) to switch out into a new stream. The people I know are financially set anyway, and they just see it as a revenue stream that will one day dry up, but are happy to keep taking it while it comes in.


From a personal perspective, besides the many aspects already mentioned by others, there's also the simple lack of challenge.

I've already mastered an x-number of technologies in my 25 years as a developer, I'm 100% certain I can deal with the umpteenth new one (albeit a bit slower than I used to). I already know I can program, and what my strengths and weaknesses are in that area. I'm more motivated to take on challenges I might actually fail at, or learn something new about myself.


It isn't an old guard thing. It is a personality and strengths thing. I am not particularly old guard (I am 37), and while I am about average in the technologies I do know, I suck at learning new platforms/technologies. It's like pulling teeth.

Those who are good at it just don't get how hard it can be for those who are not quick learners in this sense. This is part of not knowing their own value -- claiming that learning new languages is "easy." It is. For them.


I dunno, I guess I'm of the mindset that the waves are coming, and you're either on them or under them. These days, I get antsy if I get enough time in one stack that I really feel like I have it cooking. Which is how I've clawed my way up from the embedded programming basement all the way up to javascript now. (Aside: sweet jesus, how do you web guys put up with javascript? It's like java and ml's thalidomide baby.)


Why do you think we've turned to Coffeescript in droves?


The article lightly alluded to the fact that the 10x'ers are able to transcend this; as a developer in his late 30's and a 10x'er, I have seen it: my (guild) is getting younger and younger. Burnout has taken some of my age peers, while comfort or riches have taken others. Meanwhile, I keep on ticking, adopting the new stuff as soon as it comes out. There does come an age, though, where burnout sets in, and you either just start doing a job, or you drop out of the industry altogether. I had one bad burnout in my early 30's in which I tried to leave the industry, but being put out to pasture just does not seem to be in my nature. We will see when I am in my late 40's.


I enjoyed this article, with the mindset that there are kernels of truth overlaid with layers of rhetoric and bandwagon-hitching.

I think though, that for the general audience it is intended for, it might wake a few people up to the fact that software is a core competency for just about every business and not a cost centre, and that competitors who can use software well are well positioned to outstrip those who can't.

Further to that point is that, to get good software, you need good developers. And if you aren't actively trying to work out how to get the best developers, you're getting the leftovers.

This fits neatly into the point I was making recently about government projects always being doomed because they end up with bad developers on projects with too many owners, and that governments everywhere should just buy off-the-shelf solutions and fit their way of working to them, rather than the other way around.


"This fits neatly into the point I was making recently about government projects always being doomed because they end up with bad developers on projects with too many owners, and that governments everywhere should just buy off-the-shelf solutions and fit their way of working to them, rather than the other way around."

Initially I had a strong reaction against what you're saying here, but I see you chose your words carefully, saying gov projects "end up with bad developers on projects with too many owners". Sadly I have to agree, since I am in that world in the federal gov and see it first hand. But there are many highly talented developers here! They typically don't stay on projects long enough, so "end up with..." is exactly right.

And that does tie in to your point that "to get good software, you need good developers", so it's a never-ending challenge trying to find and keep those 10xers engaged and happy with their work so they tend to produce that 'good software'...


...or "buy off-the-shelf solutions" as brc suggested?


I don't think buying COTS or GOTS actually saves the government money in the long term. I don't have actual facts to back this claim, of course; it's just from many years of experience in this scene.

Many times the requirements of an agency/office/mission are too specific for something off the shelf. The solution has to be either completely custom or a mix of COTS/GOTS. And when you have a mix you have integration. Software is a smaaaaalll part of the equation in government. It's the solution engineering (read: consultants/contractors) that costs the most. And I bet you'll always need/want highly skilled engineers for all sorts of roles. Innovation is one, but keeping mission-critical systems alive and operational is important too. What qualifies as 'mission critical' is another subject for another day, and can vary wildly depending on who you talk to...

However, a lot of times, the requirements are truly simple, or simply met. And I think that's why you see a TON of Drupal being used in the gov these days. For many offices, that's all they ever needed. But even that's a smaaaaaal part of the pie. The government is huge! You can't imagine how many thousands of software projects are all going on at the same time right now...


This is my original point.

The vast majority of government IT systems are very ordinary things that are used all over the place.

Things like: payroll, issue tracking, asset tracking, emergency services management, waste disposal, vehicle registration, fines management and payment - the list goes on and on and on.

Yet time after time we find a government department insisting that their payroll system is unique because of x, y and z, and that their asset tracking system requires a custom solution.

This should be flipped on its head; instead it should be: here are the top 3 whatever-purpose systems available in the market. We will choose which one suits our needs best, then we will change our work practices to best match the system.

Because changing the work practices will be easier and more successful than trying to build out (or customise) their own IT project.

But this requires people to think about practical solutions, and not to have IT empire builders.

No government tries to re-invent railways, roads or sewerage to suit its own needs - they look at what works elsewhere, and adapt to the technology available. Yet they go in the opposite direction when it comes to building computer systems. When you look at the graveyard of wasted billions on customised government computer systems, it's time for a new approach.


"Things like: payroll, issue tracking, asset tracking, emergency services management, waste disposal, vehicle registration, fines management and payment - the list goes on and on and on."

Most of those on the list are things that the gov already buys off-the-shelf software for (I have worked as a network administrator at a large state university in the past). Our payroll system was off the shelf from Oracle, asset tracking was part of that ERP system, our course management system was the off-the-shelf Blackboard, and the security access system was from another vendor. It was rare that a project would be 100% built in house.

Using payroll as an example, for the size of these organizations, the off-the-shelf components you will find are usually built with the expectation that they will be customized anyway. (Not all organizations in government work the same way, for a very good reason: laws.) They are incomplete in a lot of cases and require integration with various systems. The above poster was right: the problem is often integration. How do I plug my directory services into my ERP system that connects to a system for another agency somewhere? You will have a hard time getting a unified system across all agencies when you go by a bidding system that is anywhere close to fair.

The companies that play in that space Oracle/SAP/HP/Novell whatever are all built on the assumption that you will buy something and need their help with it for years to come.

The problem is huge; I feel you are underestimating it.


I'm on page two of the story but I am already starting to doubt the writer's basic understanding of technology and question whether continuing to read it is a waste of time:

1) "If your bakery doesn’t have an iPhone, it will soon be at the mercy of outfits like Yelp." - What does that even mean?

2) "A little known fact about Google, for instance, is that its investment in Python (one of the three languages the company uses for its work)" - Gonna go out on a limb here and state that Google uses more than three languages.


For 1. I'm being generous and reading it as shorthand for: "If your bakery doesn't manage its presence on the network and patrol its reviews and its image online, its business will be eaten by those of your competitors who 'get it'." I'm not sure why an iPhone is required equipment, but the point he's reaching for isn't entirely lost.

For 2. I'd say he's right in that python has played a fairly large role in Google's developer outreach and seems to be the lingua franca for expressing ideas in code within Google.

The first point is just sloppy, and suggests that Forbes doesn't engage in the outdated and unfashionable practice of editing its writers. The second is a fairly pedestrian observation about the software industry, that languages and technologies are identified with some entities more than others. That the languages a developer knows may affect his view of companies that use them is not in question; whether it's a benefit to the companies in question is unknowable.


Forbes contributors (staff and outsiders) are generally copyedited for style but my understanding is that content is generally the responsibility of the author.

This model may not be for everyone but it allows us to bring in great non-staff writers like Timothy Lee (http://blogs.forbes.com/timothylee/).

While contributors can still pitch mag stories there's still a lot of editorial involvement in the print publication.


I think the intent of #1 is clear (I agree with your reading of it) but the sloppiness of the statement is distracting. Same really for #2, though it looks like I'm wrong and Google does have three official languages... still, I have a hard time believing a 30,000-employee company would not use many more languages internally in some capacity.


>still, I have a hard time believing a 30,000-employee company would not use many more languages internally in some capacity.

There are likely uses of other languages internally for small projects or one-off kind of things, but large companies generally prefer some standard practices/patterns/languages. For well-written code, you have to realize, your code will likely outlast your employment. That means someone must maintain it, fix bugs, augment its functionality, etc. It is MUCH easier to manage that whole flow with strict(ish) stylistic standards (Google has these) and a small range of possible languages. If you've ever seen code written 15 years ago in the (then) language du jour, with no sort of consistent/recognizable style/coding patterns, etc., you will understand why companies do this kind of thing.


The first quote is difficult to describe for me, but let's call it "poetic", as my brain is too fried to think of a better term for it. Basically, my interpretation of what he's saying is: if your bakery isn't tech-savvy, even at a basic level, they could be getting killed by reviewers (or some other bogeyman) online without knowing it.

The second quote is pretty easy to support with sources. Google has three _official_ languages that can be deployed into production, and Python is one of them. See [this post from a Googler](http://everythingsysadmin.com/2011/01/python-is-better-than-...), for instance. I have no idea how hard a rule this is, but many a Googler has mentioned it. I think it's worthy of being called a 'fact', and thus completely valid in the article in question.


Agree with yours (and olefoo's) interpretation of #1; I singled it out because it raised a red flag for me as the kind of statement non-tech savvy people would make, thus calling into question the validity of this piece as a whole. (As much as I, as a software developer, would like to believe it!)


2) "A little known fact about Google, for instance, is that its investment in Python (one of the three languages the company uses for its work)" - Gonna go out on a limb here and state that Google uses more than three languages

For a long time the three "approved" languages for use in production systems at Google were Java, Python, and (a subset of) C++ (JavaScript was a fourth language, but a special case, used only in the browser) [1][2].

I believe that has changed recently with Sawzall, Go, Dart, etc., but he is correct in principle.

[1] http://panela.blog-city.com/python_at_google_greg_stein__sdf...

[2] http://dave-ford.blogspot.com/2008/01/googles-4-approved-pro...



I remember hearing about this. Sounds like she should have placed a limit on the number of deals Groupon could sell. But what's the connection to his line about having an iPhone?


If you had your own iPhone app and knew how to market it properly, you wouldn't be at the mercy of, for instance, Groupon's well-known strong-arm tactics in dealing with small businesses. Or you could find some other tech-savvy way to market your business so you aren't killed by a few bad Yelp reviews, or by discounts from the social commerce/coupon startups that you don't know can hurt you until you try them. Many of the more prosperous small businesses are already starting to vie for control over their customer acquisition channels.


I also thought maybe he meant to write "iPhone app" and accidentally left out that word.


After the first page I was waiting for the "but seriously" part of the article. Are we already too young to remember Webvan? And the promised method of investing in developers never really materialized.

Anyway, I don't think this article is going to help the author find the "technical cofounder" he's looking for. The fact that he even uses that term pretty much guarantees that he's not going to find a 10xer, even if that mythical creature existed.

I for one can't wait for the social bubble to burst so programmers can flow back into the currently neglected branches of software development - we're getting old and lonely and could use some fresh talent.


If this guy believes what he writes, why isn't he a developer? Certainly these are skills that would be worth investing his time in. Is it because he fancies himself a 'capitalist'? The cat came out of the bag on this one when I reached his parenthetical, half-joking-but-not-really, wink-wink plea for a 'star' iOS developer to come out of the woodwork and build whatever iPad app is on his mind... nice of the Forbes people to give him an audience for his recruitment pitch... content on the web has become so self-serving... I miss reading, and awkwardly trying to fold, the printed NYTimes. Yeah, I said it.


For the same reason that there is a limit to the number of people who will go into math and science. It is looked at as one of the hard jobs, where the smart technical people go. A majority of the population does not fancy themselves technically smart, and they hate logical puzzles. They view it as something other people do.

It is the same mindset as the founder who is "just looking for someone to build it". The thought of learning it themselves literally never crosses their mind, because they are not of that personality type.

It's akin to saying that people who went into finance were worth their weight in gold in the '90s, so why didn't you go into finance? Many of us did, in a way, but most of us used a wealth manager, broker, or mutual fund to do so. Developers are like that wealth manager: they do all the technical work that investors do not want to do. Some people take the time to learn the models and self-direct their investments, but they are the minority.

Most use a professional, though I doubt many investment professionals had to deal with the "hey, I have an idea I want you to build; I have no money, but I will cut you in for 10%" propositions.


For a Forbes article on developers, there sure are a lot of markup mistakes showing up throughout the text...


This guy is a bit too heavy-handed with the "technology is everything" angle and provides some not-so-convincing examples. I get his point, but I'm pretty skeptical that the bakery's core business is now software - in fact, I'd rather not taste the products of a bakery whose core business is software. Yes, software is becoming increasingly important to business, but many times (outside SV) software is a tool that betters the core service, not the core service itself.


If you run a bakery and know how to write software (or convince other talented people to do so for you), do you think you'll be more or less competitive than a bakery owner that doesn't?

Think about what software can do for a bakery:

- use the web to sell more product

- save gas money and time when delivering to wholesale customers

- make production more efficient

- use just-in-time ingredient ordering to keep spoilage and inventory costs down (a toy sketch of this follows the list)
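
To make that last item concrete, here's a minimal sketch (in Python) of what just-in-time ingredient ordering could look like; the ingredients, lead times, and usage rates are invented purely for illustration:

  # Toy just-in-time ordering: reorder an ingredient only when the stock on hand
  # won't cover expected usage over the supplier's lead time plus a small buffer.
  # All numbers below are invented for illustration.
  DAILY_USAGE_KG = {"flour": 40, "butter": 8, "yeast": 0.5}
  LEAD_TIME_DAYS = {"flour": 2, "butter": 3, "yeast": 7}
  SAFETY_DAYS = 1

  def reorder_list(stock_kg):
      """Return {ingredient: kg to order} given current stock levels."""
      orders = {}
      for item, daily in DAILY_USAGE_KG.items():
          needed = daily * (LEAD_TIME_DAYS[item] + SAFETY_DAYS)
          on_hand = stock_kg.get(item, 0)
          if on_hand < needed:
              orders[item] = round(needed - on_hand, 1)
      return orders

  print(reorder_list({"flour": 60, "butter": 10, "yeast": 5}))
  # -> {'flour': 60, 'butter': 22}  (yeast is still covered for its lead time)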

Perhaps you build those systems in-house and they (in addition to a killer baguette) are what makes you special.

Or perhaps you productize your software and make millions selling it to bakeries (or spin off a company to do so).

How will the baker who knows how to make good bread but doesn't know how to leverage software compete with you?


I have a buddy who owns a vending company. He lost a pretty big account and was hurting, and when I thought about how I could help him, the obvious answer was to teach him to build a web presence and use AdWords. He learned, and learned how to program in the process; I was amazed at how much, given his inside knowledge of the business, he had built to support new ways of doing business. Not only did he replace the lost revenue, he created an entirely new revenue stream. I always thought, hey, coin-operated vending is just coin-operated vending, but a guy armed with the ability to develop and with deep knowledge of his industry is a formidable businessman. He told me he should have learned to develop a long time ago. We may not know how software can help a baker, but a baker sure would if he knew what software can do and could assemble it. The day someone can translate thoughts directly into applications is the day we fundamentally change the world again. It probably won't be pretty for people in pure development roles.


The emphasis is on "core" - software is just a tool to enhance the core business as you're pointing out. If the bakery's core business sucks but they are great at software they will still suck as a bakery.


I may have missed this, but I didn't get that the article talked about every company's core business now being software.

I understood his argument to be that software developers are becoming increasingly valuable, and that outsized returns will go to firms that understand this and learn how to treat software developers well, leveraging what they're able to do in any industry.


...but I didn't get that the article talked about every company's core business now being software.

Here is a quote from the article where he lays out the Andreessen hypothesis [1] and links to David Kirkpatrick's article, which restates that hypothesis in a format more readily grasped by business readers [2].

Which brings us to David Kirkpatrick's now famous line that every company is now a software company.

We are only just beginning to understand how software is now the core function of every company, no matter what it makes or what service it actually provides.

I have some reservations about these kinds of sweeping proclamations (which, if I posted them, would turn into a tl;dr for most people), as they miss a lot of the nuance of the situation on the ground. That might be deliberate, as part of consulting at these guys' level is to get clients hooked on the sizzle of a catchy idea, then sell them the steak of the nuances you've worked out.

[1] http://online.wsj.com/article/SB1000142405311190348090457651... Not that Andreessen was the first to realize this, of course, but he happened to catch the news cycle with his verbalization of the observation at the right time for it to break into a more mainstream meme, one that many in the industry already acknowledge either tacitly or explicitly.

[2] http://www.forbes.com/sites/techonomy/2011/11/30/now-every-c...


Thanks for this. I think there's a subtle distinction at play here: having software as a core function of your company vs its core business.

I think you're still in the publishing / music / insert_industry business; software becomes a core function of your business.

But I don't think your core business necessarily becomes software (though it could).


I'm wondering if everyone enamored of this meme is thinking that software will allow industries to change the very definition of "core". I don't know anything about the bakery business, but scratching my head over your very good question led me to posit some "what ifs" to try to find ways to change the "core" of a bakery business. Hopefully these corny off-the-cuff examples get across the gist of how some people think this trend will redefine the "core" of various businesses.

What if the bakery was built around an automated production system driven by software that lets a master baker key in the latest recipe tweaks to a baked good that works great in a small batch, automatically works out the changes necessary to scale it up (or maybe it is so automated that it can mass-produce in small batches), sets up A/B testing in limited distribution areas, and measures the results... the very same day the tweaks are made? Or it could permute variables in the first production run, yielding a few dozen samples laid out in an array, let the master baker (or apprentices) taste-test, tell the system what to modify, and dynamically adjust, permuting in a day of testing what would take a conventional bakery a week or more to work out in production?
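
To be concrete about the "permute variables" bit: generating the variant grid is the trivial part, and a toy Python sketch like the one below (the parameter names and ranges are invented for illustration) would cover it; the expensive part is the hardware that actually bakes and scores the samples:

  from itertools import product

  # Candidate recipe tweaks for a small tasting batch; names and values are invented.
  variables = {
      "hydration_pct": [65, 70, 75],
      "proof_hours":   [12, 18, 24],
      "salt_pct":      [1.8, 2.0],
  }

  # Every combination of the tweaks above: 3 * 3 * 2 = 18 samples to bake and score.
  variants = [dict(zip(variables, combo)) for combo in product(*variables.values())]
  print(len(variants))   # 18
  print(variants[0])     # {'hydration_pct': 65, 'proof_hours': 12, 'salt_pct': 1.8}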

What if the automated system is so modular and flexible that it can do whatever a human baker can do, simply by watching a baker perform a single rendition (with appropriately instrumented tools to sense pressure, flow, orientation, etc.)? What kind of competitive advantage would it confer (if any) to allow a bakery to start mass-producing baked goods on demand for special promotions and occasions, in the time it takes a conventional bakery to produce a single prototype?

What if the marketing department has software that enables them to set up A/B testing for different package artwork and tweak it day by day, comparing against historical sales data and adjusting for economic conditions?
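
The measurement half of that is an ordinary two-sample comparison; here's a rough Python sketch, with daily sales figures invented for illustration and none of the historical or economic adjustments mentioned above:

  from statistics import mean, stdev
  from math import sqrt

  # Daily units sold in stores showing each package design; numbers are made up.
  design_a = [412, 398, 430, 405, 388, 421, 417]
  design_b = [445, 460, 431, 470, 452, 449, 466]

  def welch_t(a, b):
      """Welch's t statistic for two independent samples."""
      var_a = stdev(a) ** 2 / len(a)
      var_b = stdev(b) ** 2 / len(b)
      return (mean(b) - mean(a)) / sqrt(var_a + var_b)

  # A large positive t suggests design B is outselling design A.
  print(round(welch_t(design_a, design_b), 2))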

What if the sales department has software that can deduce competitors' relative stock levels from phone-cam video taken by the bakery's delivery and stocking employees at grocery retailers, and pounce to deliver more stock when the competitors are running low but won't know it until their next regularly scheduled stock delivery a couple of days later?

These what-ifs do not really apply to the baker who only wants a small corner bakery in the European style, serving only the neighborhood within walking distance and taking a craftsman's approach to the pursuit of their art and livelihood. For that context, I believe you are right: the core business is all that really matters, and software is merely a tool for reducing the frictional costs of doing business. I suspect Andreessen, Kirkpatrick, and Rao are thinking of mid-size and larger businesses when they discuss this observation/trend.

The huge assumption here is that the flexible fabric of software bytes can be translated into directable and malleable bits in the real world. That happens, but it often doesn't come cheap. I wonder if the proponents of this trend are vastly underestimating those costs.


I've noticed in my industry that those who can't program, manage.

Those who manage tend to blog more, because they have less to do in front of a computer.

Kind of useless, and filled with false assumptions.


Key takeaway: "For practical purposes, [developers] are [products], since the vast majority of them haven’t found a way to use their own scarcity to their advantage"


The ideas in this article are eerily reminiscent of what people were saying in the late '90s/early '00s, particularly with respect to companies like Enron. Malcolm Gladwell wrote an article making the opposite point to this one. http://www.gladwell.com/2002/2002_07_22_a_talent.htm

I suspect after the bubble pops we'll be reading 'the talent myth'-type articles in Forbes.


The article you linked is so much better than the one we're discussing that there's no comparison.


I think this "Developeronomics" article is a perfect example of what this other article was referring to: http://blogs.hbr.org/pallotta/2011/12/i-dont-understand-what...

Honestly, I think there are some interesting ideas there, but I am not sure whether those ideas are the author's own or just a random collection of memes he picked up from the Internet.


Clearly, there's one thing critically wrong with his hypothesis. A developer is never actually tied down to any one technology. We're human beings, not cogs, and our fundamental programming technologies are all derived from the same basic, mathematical algorithms.

I started programming when I was six years old. BASIC on the TRS-80 Model 1. I aggressively studied programming because I thought computer games were cool, not because job prospects were hot.

If a recruiter today tried to tell me that I had to stick to BASIC on TRS-80 because I started out there, I'd think he was nuts. Many recruiters actually believe this ludicrous idea and ARE nuts. It's not "re-inventing yourself" to learn a new programming language mid-career or at any time. Good hackers know it's simply learning another way to apply your skills to a new set of challenges.


Anyone know how to toggle a single-page/print view for this?


Readability seems to handle it ok:

http://www.readability.com/articles/c1xma652


Amazing: I've reached the point where I will actually refuse to read an article if I can't get it into Instapaper easily. I guess I won't read this one, then…


I skip those articles, too. Same for the "top xyz" lists/galleries. They should be one page, like http://www.theatlantic.com/infocus/


ReadItLater works.


Please post it when you find it. Thanks.


He may not get every single detail right, but his overarching thesis is correct (or at least very useful), IMO. The author provides a good model for /thinking/ about these issues, even if some things could use a cleanup (including the formatting).


How hard is it to turn into developers adults who trained in other areas but presumably have the cognitive skills suited to programming? I am thinking of people 35-50 years old who would be intelligent enough (i.e., only a percentage of this cohort would be fit to become developers), but who, for various historical and personal reasons, work in industries where the prospects are less interesting than in programming.

Are there any studies showing how the potential to learn programming decreases (or not) with age? I have looked on Google Scholar but I found nothing.


  >> thinking of people of 35-50 years old ...

You are in a tough situation if you are in your 40s and need to retrain, because there is not a long period of time to amortize the investment over.

http://norvig.com/21-days.html (Teach Yourself Programming in Ten Years)


I read all of this but had the uncanny feeling that it could have been said in about 10% of the space. Lots of waffle and fluff. Good points though... I think? I can hardly even remember what page 1 was about.


"When the last veterans of the earliest still-in-use software layers start to die, we will be in historically unprecedented territory."

So true, so true. We have already seen a spate of that this Fall, unfortunately.


> Risk Management in Software Talent Investment

Stopped reading one paragraph into that section.


Reading stuff like this reminds me of 1999; the bottom fell out of the job market a few years later:

>>You need to find a way to invest in software developers.

Sigh, the gold rush will start again...



