This problem will be faced by many developers soon. The Internet is huge, and the big companies are going to be dealing with huge data. You'll need to understand algorithms and math, and frankly, this stuff is a bit difficult to learn on your own. I thought I knew it all till I went into the algorithms class - that's when I realized that not only did I not know it all, I was not as smart as I thought I was, and I would never have had the motivation to go through with this if I had not been forced to. And that goes for many developers.

Programmers are scarce now, but the simple stuff will soon be done by too many people. Software development will become a real engineering discipline. In 20 years, the age of the code monkey will be gone.


Oh no, it will be cyclical. Programming is all about attaining higher abstractions, hiding more technology under a simple interface. Every now and then some new set of abstractions will be useful enough that we'll need a bunch of people to explore a field of opportunity. These explorers are called entrepreneurs.

Maybe in 2016, you're going to need deep credentials to be a useful web dev, but none whatsoever to start something useful with 3-D printing.

EDIT: that said, nothing makes you more employable than knowing things at a deep level. A friend of mine, a former Plan 9 kernel contributor, quit the tech industry after the first bubble to become a wildlands firefighter. Returned to the tech industry in 2008 and resumed being a highly-paid infrastructure geek like nothing had happened.


I really agree with this.


I think it will be the exact opposite. As computers take over more and more jobs from us, we will need fewer "office workers" who know how to shuffle documents around, and more workers with programming skills.

You want to be a mathematician? You need to know how to program. You want to be a "secretary"? You need to be able to dig through your boss's e-mail using regexes when he needs to find something. You're a dentist? You'll install your own scripts on your website, because you learned how to do it in high school.
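To make that concrete: the e-mail-digging case is only a few lines of scripting. A minimal sketch in Python, assuming the mail has been exported to a plain-text file (the file name and pattern here are invented for illustration):

  import re

  # Assumption: the mailbox has been exported to a plain-text file, and
  # we're hunting for invoice numbers like "INV-2012-0042" (a made-up format).
  pattern = re.compile(r"INV-\d{4}-\d{4}")

  with open("inbox.txt") as mailbox:
      for line_number, line in enumerate(mailbox, start=1):
          for match in pattern.findall(line):
              print(line_number, match)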

In 2016 (or 2012) it won't be "oh, we need more skilled programmers", it will be "sure, you know programming, everyone does, but what do you really know?". Programming will have the same place on a CV as "MS Office" or "keyboard typing" has right now. No big deal if you know it, but much harder to find a job if you don't.

Of course there will still be a place for real computer experts - algorithm designers et al. - but the basics will be known to more and more people.


This isn't going to manifest itself as more people knowing how to program, though. Easier interfaces and smarter searching, but not programming as we know it.

In 1980 you'd say that in 20 years the average office worker would be performing calculations on thousands of rows of data, generating charts, typesetting documents, creating full-color presentations, and doing business with clients on multiple continents, and people would wonder how anyone could cope with the increase in cognition required to do all that. But it's just button-pressing for most people.

In 2016 they'll say "we have a database with 4 billion data points and we need to infer customer behavior patterns from it". You'll say "Sure.", sit at a desk, click "Segment", click "Demographic: 18-21", click "Intersect", click "Products", click "Make Recommendations", click "Apply", and a discount coupon for "Justin Bieber's Comeback Tour" will be beamed directly into the eye sockets of anyone who bought canned salmon last fall.

I don't see a society where 80% programs, I see a society where 10% builds things for the other 90% and a huge part of the middle class will be automated out of existence. This, to me, is the big issue that will shape this generation and the next.


Excellent thoughts. I think that more people need to know how to program their computers. But, as you have so elegantly pointed out, the inexorable march of progress will not bring this to pass. It hurts a little to think about, but in a large way you seem to be on the money.

These ideas are worthy of more than a two paragraph comment on HN. I third the notion that you should pen a full blog post.


I second the need for a blog post on this!


Have you written a full blog post or article on the subject by any chance? I'm really interested.



Thanks.


Give this test to the next 5 random non-technical friends and family you talk to:

A = 1

B = 2

C = 3

A = B

What does A equal?

I'm not saying people can't be taught. But think about how big the workforce actually is, and think about how widespread MS Office skills are. For every power-user analyst and project manager who's really taking Excel out for a workout, there are 10, 20, 50 people who use Office for everyday, non-challenging tasks.

I've given that little test to my MBA wife and a GP family member and several other people. Hardly anybody gets it right.

Edit:

The x-factor here, btw, that determines whether or not somebody understands it, is whether they see that assignment is happening, rather than some sort of "wha? 1 equals 2? what is that?" And those who didn't just get it stayed puzzled even after I explained assignment. The very concept of variable symbols confused and (I presume) disinterested them.
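For anyone following along, here is the test written out as an actual program (Python here, but any imperative language behaves the same way):

  A = 1
  B = 2
  C = 3
  A = B  # assignment: A is rebound to B's current value

  print(A)  # 2 -- the answer the test is looking for
  print(B)  # 2 -- B itself is unchanged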

What we do here every day is difficult, challenging stuff that I don't think most of the workforce will ever understand. Instead, people like us will be busy for decades to come, building tools so they don't have to.

There was a time when machines were a new concept rather than simple tools. You could say, in the early stages of the industrial revolution, that soon everybody would understand and be able to fix their machines. But machine complexity has outpaced the desire and ability to learn those skills.

Software is no different, I don't think.

Edit Two:

http://www.codinghorror.com/blog/2006/07/separating-programm...


I don't think it's unreasonable for people to assume that = means equality, not assignment.

Edit:

In particular, the link posted in the second edit has a rather poor test using a and b, because it uses = in two different ways with no indication that the meaning of the symbol has changed. Maybe the problem with the test isn't just the people; it's the sloppy notation that assumes people with no programming background can infer when we mean equality and when we mean assignment.


The population that took that test were self-selected computer science undergrads!

And even after three weeks of instruction most of the people who didn't understand it immediately never understood it. I'll quote from the article:

"Either you had a consistent model in your mind immediately upon first exposure to assignment, the first hurdle in programming-- or else you never developed one!"

My wife is a brilliant woman, fantastic at what she does. The GP I mentioned in my post is a very good doctor who had no problems getting into a medical school, passing his boards, or running a successful practice. But that doesn't mean that everybody is meant to be able to understand the abstract concepts you have to master in our line of work.


I wonder how the experiment would change if you change it to "let A = 1, let B = 2, let C = 3, let A = B", or "make", or some other verb that seems more like assignment.


Or if this were explained to be a sequential process and not just a descriptive list of unrelated declarations. It seems reasonable to think that the list contains a contradiction if you've never been exposed to these concepts before.


So education and the availability of technology will eventually give many people basic programming skills: computer literacy not only in the "I can use MS Office" sense, but in the "sure, I can hammer out a Python script to automate this task or sift through that data" sense.
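To be concrete about what "sift through that data" might mean in practice, here is a minimal Python sketch (the file name and column names are invented for illustration):

  import csv

  # Hypothetical sales.csv with "region" and "amount" columns.
  total = 0.0
  with open("sales.csv", newline="") as f:
      for row in csv.DictReader(f):
          if row["region"] == "EMEA":
              total += float(row["amount"])

  print("EMEA total:", total)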

Is there some sort of disruption of (basic) programming skills coming, the way blogging disrupted journalism?


I'm of two minds on this one. Part of me agrees completely with what you say. Further tools will exist that automate a lot of what the code-monkey does. The level of work done by a lot of us will be push-button, or "plug these couple of things together".

On the other hand, in 1998, as a nerdy guy getting out of high school with minimal HTML, JavaScript and programming experience, running Linux on a Pentium Pro, I faced a big choice. I went to the local ISP to pick up a "real modem" (vs a winmodem) to connect my awesome unix box to the internet. The guy there asked why I wanted these modems vs going to Circuit City for some amazing sale they were having on a faster winmodem. When I told him I was running Linux, I was offered a $40K/yr job on the spot - just for getting Linux installed on a computer and understanding the basics (quote: "we can teach you anything else you need to know, you got the spark"). I was 18, and that was a HUGE deal. Anyway, this wasn't uncommon; at the time Wired was running stories about "HTML factories" where people were making pages and pages by hand. Minimal programming skills got you a job.

Some of this was just normal boom-time labor shortage. There were lots of stories about how after the crash these guys would never work again. Some aspect of this was true, but some of it was bunk. The 2005 version of this story would be "sure you can do HTML, and you can do CGI, and you understand HTTP headers and can whip up a server, but we need people who understand SQL and how to work with record objects and how to do live updates to a system, stuff you need a real degree for. It's 2005, not 2000."

So basically I am suggesting that while Rails may become a non-skill (like HTML has), good REST APIs may be auto-generated, and some future version or successor of Backbone.js may do most of our tricky JS stuff, there will likely be good toolkits that let people plug together data mining and data management without needing super-deep algorithmic understanding. We are already seeing the emergence of such tools.

So the other part of me disagrees, the code-monkey will be needed, just that they will be putting together different bits than they are today.


> So the other part of me disagrees, the code-monkey will be needed, just that they will be putting together different bits than they are today.

It seems to me the Hiring Manager in the story was fishing for a jack-of-all-trades kind of applicant. The interviewee was obviously a web developer, but the position to be filled was a data analytics job.

What difference does it make if Rails/PHP/Node.js/Backbone/etc become commodity jobs? In order to gather the raw data that those complex, high-speed algorithms will parse into usable data, you'll still need a website, built by a Rails/Backbone code monkey and a Photoshop designer, with the help of a decent DBA at the very least. The data position, if it ever comes to that, will just be another job type that a well-rounded development team will need to fill, not a replacement for everyone else on the team.


"When I told him I was running linux I was offered a $40K/yr job on the spot - just for getting linux installed on a computer and understanding the basics (quote "we can teach you anything else you need to know, you got the spark")."

Yeah, my technical screen for an 8-year career at Schlumberger was exactly this.


The Internet may be a huge place with a huge number of developers.

It is also an even huger place when it comes to demand for people who can do simple stuff.


Math is overrated. Nobody really needs math; if you find yourself needing some math you're probably reinventing some kind of wheel. Same with algorithms.

How do you deal with huge data: you just do. There are tools for that, and you apply those and perhaps make new tools yourself, but math and algorithms tend to never enter the equation.


I am reminded of Jeff Jonas' "Data Beats Math": http://jeffjonas.typepad.com/jeff_jonas/2011/04/data-beats-m...


Who do you think makes those tools?


Not computer scientists. Programmers themselves, based more on their experience than theory.


Congrats. Continue the fine tradition of traditional science: dismissing everything that falls outside the known worldview without thinking about or discussing it.


Physicists are generally quite busy people, and the fact that it was published in a crackpot journal is a pretty good indicator that reading it will be a waste of time. If the authors are physicists, they know what the reception of an article in a crackpot journal will be. If they care, they will publish it in a journal with a good reputation. If they cannot, it means it's not worth reading.


Yes. Let's have a discussion with crackpots.

Dr. Amrit Sorli is a researcher with the Osho Miasto, Institute for Meditation and Spiritual Growth, Siena, Italy. His research subjects are Unknown Vacuum Energies in Living Organisms and Direct Scientific Experience. Dr. Sorli is the author of several books and articles and currently gives courses on this theme.


More ad hominem. You can have batshit beliefs and still do proper math/science on the side. Prejudice due to the former may prevent a rational evaluation of the latter, as happens here, and may also happen during peer review or editorial decisions.

I can't read the full text, but the abstract claims that their views are based on the interpretation of experimental data.

If you can read it, could you tell us if it is a mere thought experiment, or if they performed it? In the latter case, what's the flaw in their experimental setup or reasoning?

Edit: I've done a quick Google Scholar survey of the second author, and he too is used to delving far into "not even wrong" territory... That being said, neither the abstract nor the phys.org summary raises any objective crackpot flag to my untrained eye. The claim that they proved Einstein wrong, their track record, and the journal the paper was published in are of course big red warnings. But that doesn't mean that the core of their argument has to be dismissed out of hand.


The problem with engaging with everybody that has an idea is that we don't have the resources to do so.

This was actually discussed in an episode of the Scientific American podcast. I can't find the episode, but they mentioned that they would need a staff member working full time just to reply to and debunk all the crackpot theories.

What's more important: staff doing research and teaching, or answering crackpots who cannot be persuaded?


I know that. You don't need to engage with the authors, but assessing some of the papers once in a while may not hurt either.

In the abstract of this paper, they claim to have experimental support for their theory. This is enough to lower my guard and make me want to know 1) whether they indeed do and 2) whether they bring something novel to the table.

The explanation given by phys.org sounds like it could make some sense (even though they are most probably beating the dead horse of an old misconception).

That's why I wouldn't mind if a physicist was kind enough to skim and debunk or validate the paper beyond simply bashing the authors.


"That's why I wouldn't mind if a physicist was kind enough to skim and debunk or validate the paper beyond simply bashing the authors."

So hire one and pay him to do it. What do you think a physicist would rather read: a paper published in a reputable journal that is highly likely to teach him something new, or a paper that is enormously unlikely to be anything other than crackpot theory?

Has there ever been any actual progress in physics coming from someone without academic credentials and/or published in a crackpot journal, hm?


Attacking someone's work is not ad hominem. If he had called him an asshole or thrown some other unrelated insult, that would be ad hominem.

If just listing the guy's work comes off as insulting, you gotta wonder...


By attacking his previous work, you don't address the question at hand, and sidetrack the debate on the worth of the messenger.

Strictly speaking, you could say it's an ad opus attack, which is still fallacious.


Moving to parts of South East Asia or Africa.


Tip: Instead of writing the articles via your keyboard, get a good speech-to-text device (like the iPad) and simply free-talk about the topic. In 30 minutes, you can cover quite a few topics. Totally unique content.


Great idea! Does anybody know if there are any good dictation apps for Android that would work while driving in a car? All of my attempts at voice dictation to Google Maps while driving have failed miserably. This would be super productive to do while driving around town.


One workflow is to create audio files which you later process with a speech recognition application.
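For example, here is a minimal sketch of the second half of that workflow in Python, using the third-party SpeechRecognition package and its hook into Google's free web API (the file name is a placeholder; in practice you'd loop over a folder of recordings):

  import speech_recognition as sr  # pip install SpeechRecognition

  recognizer = sr.Recognizer()

  # Assumption: "memo.wav" is a note you dictated earlier, e.g. in the car.
  with sr.AudioFile("memo.wav") as source:
      audio = recognizer.record(source)  # read the whole file

  # Send the audio off for transcription; raises an exception on failure.
  text = recognizer.recognize_google(audio)
  print(text)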


No, don't believe this. The world is a very complex place, and what makes for a good story is often very different from reality.

When American journalists write about the world, they write as if the world is America. Stories are broken down into us vs them, people become black and white, some people are assigned the label "good" and some the label "bad". Everything gets Americanised; even the American obsession with skin colour or racial differences works itself into every conflict that is reported on.

Whatever the real story in Mauritania is, it's made of centuries and centuries of history, a land caught between the Arabs and the Africans, a culture that has lasted hundreds of years.

This cannot be captured in a short article on CNN that reduces it all to the archetypal American "black people enslaved by people who are not us". After reading this article, you still know nothing about Mauritania. You have no understanding of the complexities of their society. All you have is this single story, this opinion piece by a single author.

Your knowledge is second-hand and it's second-rate. You do yourself, Mauritania and history a great disservice if you read the article and believe it. A single story should only ever be something that invites you to discover the real history of a place and a people (http://www.ted.com/talks/chimamanda_adichie_the_danger_of_a_...)

When you read an article like this about Mauritania, don't just accept it. Realise that it's like overhearing a conversation between two strangers. You lack context, you lack understanding.

That's why you should not believe this story. It's a single story about a place you know nothing about.


While I think it's reasonable to ask readers to consider secondary sources, you have offered neither actual evidence that the article contains falsehoods or "Americanized opinions" nor alternative sources that contradict this article.

I would argue that it is an equal disservice to brush aside issues in foreign countries as being the result of "centuries and centuries of history" or cultural complexities. Based on your rationale, I should not believe anything I read about any place I am not intimately familiar with because I "lack context" about it.


Ok, that's fine if you're talking about the opinions in the article, but if you want to refute the items presented as facts in the story, you need to back it up with facts yourself. Otherwise I would just as easily not believe you.


And that's the correct attitude. I'm not saying anything to refute the points in the article. I'm not against the article either. I'm pointing out that a single story does not describe a phenomenon accurately.

But everyone is jumping on my comment as if I said that anything in the article is wrong.


You said "No, don't believe this," as if you wanted to say everything in the article is wrong.


But I then did not proceed to do so; instead I explained how a lack of context generally makes articles difficult for people to understand. People react too fast, before understanding, as most people did with my comment.


The video is better. It points out that there's a continuum. I couldn't help but think of tenant farmers or sharecroppers. Tenant farmers could be free, or bonded to the land. There was a continuum from wealthy tenants (who hired workers) to serfs to slaves. It seems that a lot of Mauritania sits all over that continuum.

Most of the problem could simply be that they don't have anywhere else to go. But it's a little spooky that their government seems to be in cahoots with the slave owning classes.

On the other hand - look what happened to Rhodesia (now Zimbabwe) when the landowners were disenfranchised and their farms were given to cronies of the new government: the whole place fell into famine because the new "masters" couldn't farm.

There's no surefire way (to my knowledge) to reconstruct a country which has just come out of slavery. My gut reaction would be incrementalism - find ways of punishing the worst offenders so the bad (but not terrible) land/slave owners don't feel threatened. Then slowly start moving up the chain - punish physical abuse, then ensure freedom of movement, then look at land redistribution and better political representation. At no stage should a large portion of the population (i.e. every land / slave owner) feel threatened, just a small fraction of them.

You'd make a false distinction between "landlords" and "slavers". First you go for the people who murder their slaves, and the "landlords" won't resist because they don't murder their slaves. Then you go for the ones who beat their slaves, then the ones who restrict movement, then the ones who don't pay wages ...


I see you repeating this again and again without providing any background information or facts. What exactly is your problem with the article? Have you found any errors or biased material? If so, please highlight it for us.

I get the strange feeling that you have a problem with the article because it shows that not only white people are able to enslave other human beings. Maybe you should have a look at the history of slavery in Islam.


I think you misunderstand what I am saying. If you read my comment, you'll see I'm not talking about the content of the article or saying slavery does not exist.

It's difficult to tell people "think before you decide" when people instantly assume that I am saying something that attacks them and immediately start aggressively defending their opinion.


I think the problem is the rabbit hole of self-delusion.

1. This is bad, and we all agree.
2. Something should be done about it.
3. Old white people (Republicrats and Democricuns) should do something about it.
4. That "something" can and may be violence, intimidation, economic sabotage, ...
5. Oversight is provided by the same people planning and executing such measures in 4.
6. Somehow, all "we" are doing is for the greater good, all "they" are doing is wrong and backwards.
7. If you disagree with 6, you hate apple pie and mom.
8. WTF??


Actually, I do know about Mauritania. I've been there. I've seen the cities and the desert. I've travelled thousands of kilometres in the desert, slept in the desert. It is an open secret that slavery exists there on a wide scale.


And why do you assume that I am saying it does not exist? If you are widely traveled then you will understand the difference between articles and reality.


Except this is not a single story but something that's been known for quite some time:

https://www.google.nl/search?hl=en&noj=1&site=webhp&...


Your search seems to show only Western articles.


Here's an example of what maxklein might mean.

Ask yourself: is the relationship between Moulkheir Mint Yarba and her master typical of Mauritanian slavery? Or is it very atypical and unusual? Or in between?

I'll bet you have no idea. But if you knew - wouldn't that change the way you react to this reality? (Assuming it is a reality. Mike Daisey's reality sure wasn't.)

I have no idea, either, because Mauritania is a foreign country to me. I'm happy to let it remain so. I have enormous doubts that American social engineers can improve it from afar with pallets of dollars, or drone strikes, or whatever we're using these days. (We might improve our own country first, although it's admittedly harder because we can't use drone strikes...)


Such fluff stories were also very common in 1998 and 1999. You never read such stories about car designers, do you?


One core difference that I notice between Africa and the West, irrespective of the current development stage, is that people in Africa are mostly optimistic that things are better than they were, and they are expecting things to get much better. So they see the future as bright and full of opportunities.

People in the West seem to have the opposite viewpoint - they see the past as having been better, and the future as bringing doom and gloom. So their outlook is pessimistic.

So there is much more of this let's-work-forward energy in Africa.


And they would both be right. The two cultures are approaching the future from very different socioeconomic positions. In the past (30-40 years ago), things were bad in Africa: at any point 3-5 countries were engaged in civil war, despotic rule or famine. At the same time, the West was racking up incredible scientific achievements, building wealth and living longer lives.

If you're at the bottom you can only go up. Now that Africa is getting into the resources business in a major way the infusion of cash can only accelerate this.

On the other hand, many Western countries are now trying to figure out how to survive in a world they have largely built. The price of information has fallen, but the prices of most goods have gone up (especially food and oil). Many people grew up with what their parents had and have realized they might not be able to have that lifestyle, and it scares them.

Edit: I'd also like to add that Africa's economy has been growing for years. There are 49-50 countries all moving very quickly, but all the Western news wants to report on is the few problem areas with wars, dictators and famine. Hardly anything was said about the farming commodities market in west Kenya going fully computerized in '05.


Paul Graham said: "We're trying to figure out why this YC batch did so well. One theory: they all used Convore (http://convore.com)."

What actually made the startups switch away from Convore? At the point pg said that, quite a number must have been using it. Then they just stopped?


There are many reasons why Paul Graham might speak too highly of Convore, but one theory is the endowment effect:

http://en.wikipedia.org/wiki/Endowment_effect

People think more highly of things that they have an ownership stake in.


It's not like every YC startup was a paying customer of Convore. I'm pretty sure he meant that they used Convore instead of/as well as a mailing list to communicate with the rest of the batch. He was making a claim that Convore leads to better communication than a mailing list, and that in the case of that YC batch, it was enough better that he thinks it made the companies noticeably better overall.

But 200 people using it for free (and probably naturally fading out of using it once the summer is over anyway) is not going to keep them afloat.


"What actually made the startups switch away from convore?"

I don't think anyone switched away from Convore as much as they just stopped using it over time. The problem with Convore was that it was too focused on information, and people don't care about information; they care about people. So what started as a better version of chat/forums eventually devolved into a less-good version of Quora.

The best thing Convore had going for it was that all of the content was extremely lightweight, but in the end I think this is also what killed it, because the emotional connection just wasn't there.


Their new startup is Grove, which is hosted IRC for teams. I'm guessing this is what most of the YC batch switched to as a replacement for Convore, since it hits their need more directly.


Which is more likely:

1) he just said it to promote Convore

2) he really believed that (in an environment with hundreds of factors) companies' use of Convore had a noticeable effect on their performance?

I know where I'd put my money...


Very faulty logic. 1 in 5 logged in with GitHub, and 1 in 7 logged in with Twitter, and this implies you don't need Facebook? Measuring two other things tells you nothing about a third, unmeasured metric. One should add a Facebook button and then measure how many log in via Facebook before making such a statement.


Not at all! I think it's totally valid to use some gut instinct here.

My instinct is that there's a scale from Facebook -> Twitter -> GitHub in the services that our target audience uses (less nerdy -> more nerdy).

Since GitHub performed better than Twitter (by a significant amount, although that's debatable), I feel that the Facebook end of that scale isn't worth even testing. Instead, it would be more useful to try services on the GitHub end of the scale, such as OpenID or maybe something more novel like BitBucket, Heroku, Dribbble etc.


I get what you mean, and it's probably a fair assumption. But the data you showed and the conclusion you draw ("Won't be adding that FB button soon") are totally illogical. You have no idea how many people come to Grove and want to sign in but don't, because they have neither a Twitter nor a GitHub account and/or don't want to sign up with one.

You need to add an FB signup button, test the increase (or non-increase) in signups, and then draw conclusions. Or measure bounce rates in some smart way. Your current method isn't very scientific.
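For what it's worth, the comparison you'd want after adding the button is a plain two-proportion z-test on signup conversion. A sketch in Python (the counts are invented):

  from math import sqrt
  from statistics import NormalDist

  # Hypothetical A/B counts: visitors shown each variant, and signups.
  visitors_a, signups_a = 1000, 70   # current buttons only
  visitors_b, signups_b = 1000, 95   # with an FB button added

  p_a = signups_a / visitors_a
  p_b = signups_b / visitors_b
  pooled = (signups_a + signups_b) / (visitors_a + visitors_b)

  # Standard two-proportion z-test on the difference in conversion rates.
  se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
  z = (p_b - p_a) / se
  p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided

  print("z =", round(z, 2), "p =", round(p_value, 4))

A small p-value suggests the lift isn't noise; otherwise, keep collecting data before deciding.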


What if they based merit not just on where you are, but rather on how much you have achieved?


And there's also the question of how meritorious high marks in classes and nominal positions of leadership in common high school organizations are. If everyone can do it, it's not exceptional.

