And furthermore, an instance of the more general case of 'unverifiable claim.' The list doesn't stop there, either.
The first paragraph is a misleading hypothetical. You're left to infer the statistic from the author's anecdote. The new guy can't code? Is that every new guy? One in a hundred? How would you know? His first paragraph doesn't actually say. Rather than making a factual claim, the author is trying to inflame passions.
"It's a big problem...[because] there's a boom on." Really? A boom? And then the "50 times more productive" trope? Then "negative productivity?" Ah. This time, the payoff is actually stated: "terrible mistake," "catastrophic company-killer" that "happen[s] so often." LOOK! THE BOOGEYMAN IS RIGHT HERE!
And then, Microsoft's fault. Of course. What isn't? It's here that we find that 'unverifiable claim' is just one of many classes of things that the author is using as a means to pander. Certainly, MS's interview tactics might have worked when initially introduced. Certainly they may have become a fad, and ineffective through overuse and the mere passage of time. But "fault" is a complex thing, and ultimately of little use, here. What if this was a deliberate plan by MS to wreck interviewing forever? So what? It gets you nowhere, except indignant, which is what the author seems to be counting on.
The article just goes on (and on) in this fashion. I won't bore you, but, now that you've got an idea of what the author is doing, go back to the article and find every spurious or unverifiable claim and ask yourself, "What could the author be trying to accomplish since he's clearly not trying to relate factual knowledge?" (Remember that the author included 9 links in as many paragraphs.)
The article is flippant, and is written in a breezy we-all-know-it's-true voice. How tripe like this makes the front page is apparently a moot point, but with a little work you can learn to quickly sift through crap like this and move on to meatier fare.
Agree. I've had professors that were uber hackers, prolific authors, grant magnets, inspiring educators and thorough gentlemen to boot. That's a lot of 'getting things done'.
The TechCrunch definition of getting something done is being on their blog: you've announced a (fill in buzzword here) product, gotten cash, made a presentation that shows you had potential, said something about an IPO, or otherwise drawn their attention. Hard-core academics aren't in that business or startup space: what they're starting is their research.
To the extent that there is a problem here, it's because academia optimizes for getting different things done than businesses do. It takes even smart people a little while to make the psychological switch-over, the more so since most of them, and their bosses, don't realize the need. It is just another kind of cost for new hires, like their learning your technology stack and coding conventions.
Is this really true? Will I be wasting at least five years of my life by taking various CS degrees? I have some professional experience (web dev for a startup), but that's really all I have time for besides school, and I'm not the only one in this situation.
Some parts of the article were spot-on, but certain assertions seem rather drastic and biased.
You're not wasting the time if you're learning something, but don't expect the degree itself to be generally considered a notable accomplishment.
Additionally, whether or not 5 years and $X dollars is worth the reward depends heavily on you, and where you go to school. Some people might find more value in spending that time and money elsewhere.
No, you won't. Academia taught me how to be a better programmer: compiler construction, functional programming, the theory behind OO programming, Prolog, how databases actually work (i.e., not just some random subset of the SQL language), how 3D graphics work (including how to compute lighting and how you go from a mesh of triangles to an image on a screen), Haskell, SVN, and how to work on large projects with others (including the many issues that are not just technical, etc.). While there were a few classes that were a waste of time (how to design enterprise systems and usability come to mind, since the first is just a matter of ticking off enough boxes and the second is "do what the user expects"), most of them weren't.
There is a real bias against CS degrees in certain circles, which I suspect is fueled in part by a couple of bad experiences with no-good hires (which exist in all professions) and in part by a sense of inadequacy among some who never got a degree.
As a hiring manager, my 'bias' against CS degrees, such as it exists, is that they don't actually tell me much of anything.
Individuals can learn "compiler construction, functional programming, the theory behind OO programming, Prolog, how databases actually work," and so on, independently of a degree, and yet having that degree doesn't actually guarantee that a potential hire understands any of the above.
I had to write a binary search a few weeks ago in the course of Real Work.
I agree technical interviews aren't the only way of assessing a candidate's qualifications. Their open source projects, their writing on technical topics, and just having a geeky conversation with them are all good indicators of cluefulness.
Not saying I don't believe you, as there are times when it does come up, but IMO even if you do have to write something search-like, it's usually better to do a linear search for the sake of having more readable code.
You have no idea what I was doing (or even what language I was using), so don't presume.
I benchmarked my binary search, a hand-rolled linear search, and the search function that was built into the container I was using.
    binary   0.019003 ms
    linear   3.402408 ms
    builtin  9.007484 ms
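Out of curiosity, here is a minimal sketch of how one might run that kind of comparison in Python; the data, the worst-case target, and using list.index as the "builtin" are my assumptions rather than the parent's actual setup, and relative numbers will vary by language and container:

    import bisect
    import timeit

    data = list(range(500_000))   # assumed: a large sorted list of ints
    target = 499_999              # worst case for a linear scan

    def binary(xs, x):
        i = bisect.bisect_left(xs, x)          # binary search via the stdlib
        return i if i < len(xs) and xs[i] == x else -1

    def linear(xs, x):
        for i, v in enumerate(xs):             # hand-rolled linear scan
            if v == x:
                return i
        return -1

    searches = [("binary", binary), ("linear", linear),
                ("builtin", lambda xs, x: xs.index(x))]
    for name, fn in searches:
        per_call = timeit.timeit(lambda: fn(data, target), number=100) / 100
        print(f"{name:8s} {per_call * 1000:.6f} ms")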
> it's usually better to do a linear search for the sake of having more readable code
For small inputs, maybe. But a binary search takes only marginally more time to test and write than a hand-rolled linear search. And anyone halfway competent should be able to recognize one when reading code. When you have more than a dozen elements or so, a linear search is simply Doing It Wrong.
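To put "marginally more time to write" in concrete terms, here is a sketch of both searches over a sorted list (Python; illustrative, not anyone's production code):

    def linear_search(xs, x):
        # O(n): scan every element until we hit x.
        for i, v in enumerate(xs):
            if v == x:
                return i
        return -1

    def binary_search(xs, x):
        # O(log n): halve the sorted range at each step.
        lo, hi = 0, len(xs)
        while lo < hi:
            mid = (lo + hi) // 2
            if xs[mid] < x:
                lo = mid + 1
            else:
                hi = mid
        return lo if lo < len(xs) and xs[lo] == x else -1

    assert binary_search([2, 3, 5, 7, 11], 7) == linear_search([2, 3, 5, 7, 11], 7) == 3

The binary version is only a handful of lines longer, and the lo/hi/mid shape is recognizable on sight.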
I once needed to find several elements in a ~500,000 object array several times a second. And the criterion by which I could judge whether they were the objects I wanted was expensive to compute, and often complex -- involving looking at neighboring or seemingly unrelated objects, or the general program state. And in special cases, I needed to return a filtered slice of the array.
There wasn't a library function to do what I needed, and that's a pretty serious understatement. I wrote my own libraries for the job, and still wound up writing one flavor or another of tweaked binary search about once a week.
Regarding the last bit, how hard can I say "wtf"? Writing readable code and writing a binary search are hardly at odds; they're orthogonal.
Regarding the first bit, there are lots of situations where the algorithm needed is a binary search, but for which your generic search-a-sorted-array won't do. For almost any non-trivial need, really...
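As an illustration of one such flavor (my guess at the kind of tweak meant, not the parent's actual code), here is a binary search over a monotonic predicate rather than a key, which is what you want when the test is expensive or looks beyond the element itself:

    def first_true(xs, pred):
        # Index of the first element for which pred holds, assuming pred is
        # monotonic over xs (some prefix is False, the rest True).
        # Calls pred only O(log n) times, the whole point when pred is expensive.
        lo, hi = 0, len(xs)
        while lo < hi:
            mid = (lo + hi) // 2
            if pred(xs[mid]):
                hi = mid
            else:
                lo = mid + 1
        return lo  # == len(xs) if pred is never true

    # e.g. the first object whose (expensive) score crosses a threshold
    # (objects/score/threshold are hypothetical):
    #   idx = first_true(objects, lambda o: score(o) >= threshold)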
A project I worked on needed more than just the value (or lack of value) from a sorted array---it also needed the index where the item would have been had it been in the array. The standard binary search in the language I was using didn't return that information.
Sometimes, the requirements exceed what the standard stuff gives you.
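For what it's worth, this is exactly what Python's bisect.bisect_left returns; in a language whose standard search doesn't expose it, the hand-rolled version is short. A sketch:

    import bisect

    def insertion_point(xs, x):
        # Index of x in sorted xs, or the index where it would be inserted.
        lo, hi = 0, len(xs)
        while lo < hi:
            mid = (lo + hi) // 2
            if xs[mid] < x:
                lo = mid + 1
            else:
                hi = mid
        return lo

    xs = [10, 20, 30, 40]
    assert insertion_point(xs, 30) == bisect.bisect_left(xs, 30) == 2  # present
    assert insertion_point(xs, 25) == bisect.bisect_left(xs, 25) == 2  # would go here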
In a few years of experience, I have come across 2 rather bad developers, one of whom outpaced the other by a mile in the race of 'Suck'. Both were fairly recent hires, and I can't fathom how shortcomings of such magnitude can go unseen.
I imagine a person can get very far on BS, and co-workers are usually the backstop to those antics. If they are not, they should have absolute permission to call on the manager to verify someone else's competency.
In addition, I find it rather offensive for a person to have to learn on the job after it becomes evident that the resume, LinkedIn profile, or any other point of reference was embellished.
Le sigh. Without painting a group with a broad brush, I see PHP as a gateway language as more folks try to pivot into applying for programming jobs. I've seen enough this year alone to turn me into a skeptic.
This post is obviously overly-biased against technical interviews. Steve Yegge said it best: "It's a bit easier to tell if someone's in great shape physically than if they're in great shape mentally. You can't just stare at their brain and hope to find a six-pack in all those folds." - http://sites.google.com/site/steveyegge2/practicing-programm...
I think this is one of the biggest misconceptions in our industry. We software developers REALLY need to get over ourselves.
Bad development practices and bad coding have not (as far as I have ever heard) killed any company.
I would like to be shown at least one example where bad software killed the software company.
For crying out loud, Windows was 'bad' until Win 2000+. Don't forget WinME.
The first version of Netscape was horrible. Internet Explorer was absolutely horrendous.
Norton Anti-virus became more horrible with every release until they slimmed it down recently.
I was recently reading questions asked by Larry Page on Java's forums, and it is CLEAR that he was NOT an 'A-grade' Java programmer.
Twitter's early versions kept dying - under what could conceivably be argued is bad software development - yet they still thrive today.
I am pretty sure that the first versions of Facebook sucked ass too. Developers get better over time... you know how? By sucking, making mistakes, learning and improving. Not by being awesome all the time.
So we, as developers, need to get over ourselves and realize that we are just one tool in the tool belt for company success. You can build the best product on the planet, but if you have no sales (assuming that the market is rational) and no VC financing, you are dead. Finito. Kaput. Gone.
Kapish?
Edit: Sorry if this comes off as belligerent; I am just tired of the stereotypes. If developers don't start off as bad, how do they become good? Plus, having the most efficient sorting algorithm will not let one software company win over the other, unless you are selling sorting algorithms, of course.
Bad development practices can and do kill companies, although in those cases the cause of death is usually considered "failed to innovate" or something similar. Accumulate enough technical debt and you'll be unable to move very quickly for fear of breaking things. After code starts breaking expect to add processes on top of processes, which may stop the breakage but only slow things down even further.
That's not to say that bad programming is the only reason companies fail to innovate, but if the code gets bad enough, things will slow down and newer competitors will overtake you.
As I asked in my response: please provide examples.
Bad coding does not mean 'failed to innovate'. Digg.com innovated with the Digg button, but according to Kevin they had a bunch of 'bad coders' doing 'bad coding'.
Bad coding does not mean "failed to innovate" but bad coding will certainly lead to failure to innovate.
There are many, many companies that fall into that category but if you want a direct example of bad coding -> failure then look at Friendster:
"But the board also lost sight of the task at hand, according to Kent Lindstrom, an early investor in Friendster and one of its first employees. As Friendster became more popular, its overwhelmed Web site became slower. Things would become so bad that a Friendster Web page took as long as 40 seconds to download. Yet, from where Mr. Lindstrom sat, technical difficulties proved too pedestrian for a board of this pedigree."
There is hardly ever a single reason why a company fails, but site performance and the inability/unwillingness to address it played a large part in Friendster's decline.
We really are arguing in circles here, because there could be many reasons for the slowness at Friendster. Didn't have to be bad coding. It could have been insufficient hardware. No matter how good the developers were, if their hardware can only manage X number of concurrent users with max optimizations, when growth brought in X^2, no amount of 'good development' would solve that issue.
However, I do agree with your point that there isn't just one thing that kills a company. That was my original point.
The TC article made it sound as if 'bad coders' killed companies. That is very delusional and it flatters developers too much. They are important, but not THAT important. That's my point.
Please don't flame me; as a developer myself, I am fully aware of where my expertise (and value-add) stops.
This writing has a few awesome hum-dingers: "Like many of the hangovers that haunt modern software engineering, this is ultimately mostly Microsoft’s fault," "Yes, I am being deliberately sexist here, because in my experience those women who write code are consistently good at it," and "So what should a real interview consist of? Let me offer a humble proposal: don’t interview anyone who hasn’t accomplished anything. Ever."
When I read his work, I feel like this author writes in a binary way: Either an entity's right, or they're wrong. Period. (Jon Evans or the highway!)
Yes. Same for his dissing of Simonyi. Almost seems personal. He might have invented Hungarian notation, but he also oversaw the development of the most successful enterprise software ever shipped. Most of us are not exactly MS fans, but let's acknowledge 'getting things done' where it's due.
I agree with a lot of this article, but the slam on Charles Simonyi was fairly random to the point of making me wonder if the author is just ignorant of software development history or straight out trolling.
I've never liked Hungarian Notation either, but that doesn't change the fact that during his years at Xerox PARC and Microsoft Simonyi exemplified "Smart and Gets Things Done".
The author also seems unaware that Joel Spolsky, of whom he otherwise speaks approvingly, has explained and defended why using a sane version of Hungarian notation can make perfect sense.
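For context, the sane version Spolsky defends (in "Making Wrong Code Look Wrong") is Apps Hungarian, where the prefix encodes semantic kind rather than compiler-visible type. A tiny Python illustration of the idea; the helper names here are invented for the example:

    import html

    def get_request_param(name):
        # Hypothetical helper: returns raw, attacker-controlled input.
        return "<script>alert(1)</script>"

    def write_to_page(s_text):
        # Hypothetical sink: expects an already-escaped ("s") string.
        print(s_text)

    # Apps Hungarian: "us" = unsafe (raw user input), "s" = safe (escaped).
    us_comment = get_request_param("comment")
    s_comment = html.escape(us_comment)    # the one place "us" becomes "s"

    write_to_page(s_comment)     # prefixes match: looks right
    # write_to_page(us_comment)  # prefixes clash: looks wrong at a glance (the point)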
Given a problem and access to a search engine, see if he/she can break the problem down, find the most accurate solution, and tailor it to the problem.
This should be a reasonable measure of a programmer's aptitude.
The following are some of the main misconceptions floating around among the blog, the TC comments, and the HN comments.
Misconception 1: answering programming questions tells you whether the developer is awesome
No, it doesn't. When an interviewer asks you a programming-language-specific question, they want to know how good you are with that language, to see if you can hit the ground running on your first day. These sorts of questions only tell you one thing, though: that the person you are interviewing has spent a lot of hours in front of that one programming language. I personally do not rate these types of questions for an interview, because anyone can learn syntax, data structures, and the best ways to implement language-dependent code.
Misconception 2: brain-teasers don't tell you anything
This couldn't be more wrong... The reason an interviewer throws you a brain teaser or design question is to understand your logic and problem-solving skills. While you talk through how you would solve the problem, they are assessing your communication skills, your problem-solving process, and what knowledge you have gained from experience.
Misconception 3: degrees don't tell you anything
There is a lot of "show us your projects" being thrown around. While this is a fair call, one should not dismiss the degree. Simply put, the degree is a project. It means that the candidate has had to spend three to five years juggling multiple subjects (read as 'projects'), while working part-time (read as 'projects') and managing their social life (read as 'drinking beer' and 'tuning hot people'). A degree is a testament to the student's ability to see something through from start to finish... it's an example of their dedication.
Misconception 4: degrees aren't teaching students how to code, so how are they expected to code
Again, this is a fallacy. The degree teaches students how to collaborate through group projects, and how to work unsupervised and be resourceful while doing so. It teaches the fundamentals so they can pick up any programming language (just another tool) and apply what they have been taught.
I strongly agree with @marcamillion's statement about "developers being better over time". This is why I disagree with Misconception 1, as all this is doing is showing how much experience the interviewer has with the language they are quizzing someone on.
Overall, give the new coder a break. They most likely got hired because they:
- fit into the work culture
- possess strong problem solving abilities
- can work unsupervised
- can work within a team environment
- have imagination
And if the new guy is asking you a question, it's because they want to learn, so respect that; they're trying to be awesome like you.
Disclaimer: Sometimes people make mistakes though, and a dud ends up being 'that' coder that can't code ;)
> Misconception 2: brain-teasers don't tell you anything
> This couldn't be more wrong... The reason an interviewer throws you a brain teaser or design question is to understand your logic and problem-solving skills. While you talk through how you would solve the problem, they are assessing your communication skills, your problem-solving process, and what knowledge you have gained from experience.
The problem here is that a lot of the brain-teasers I've seen bandied about for interviews are the kind of problems where you either get them via an a-ha insight, or you don't get them at all. That's kind of what makes them good brain-teasers.
Yeah, I've never really understood how you can "talk through your thought process" for a question like "Why is a man-hole cover round?" It seems like you'd either come up with a reason or you wouldn't be able to come up with anything at all.
Perhaps you can say things like "Well, let's see... you need to be able to pick the man-hole cover up out of its hole, then put it down somewhere, then maybe move it, ..." but I really think these types of ramblings are just stalling for time rather than an actual English narration of what I believe is an inexplicable thought process that would go into coming up with an actual answer to such a question.
1: I want to see if someone knows language foo, and I want to know that they can implement algorithms and data structures. In short I want to see that they've spent a lot of hours on this platform.
Of course anyone with the hacker nature can learn any language, but that doesn't mean I want them to learn it on the clock. When I want an electrician I call an electrician, not a smart friend who I think can learn it.
2: I don't ask brain teasers because someone who has heard the question before has a tremendous advantage over someone who has not. That's not what I want to test for.
RE: 3 and 4, I don't much care as long as they pass #1.
I have a completely different approach. I hire people who have worked with a particular technology or ecosystem for a while, because I think that builds the relevant experience and habits. If they are good at things similar to what we are doing, they should be able to pick it up and integrate quickly, and learn more stuff.
And don't underestimate the value of good documentation.
But of course, one prerequisite is absolutely necessary: the ability to approach problems logically. That is what one should look for on top of the domain knowledge.
Besides interview questions, give an interview ASSIGNMENT. Meaning, before you are hired, you have to complete some task. This shows a) commitment to wanting to work here, and b) that you can solve the kinds of problems we had to face before.
You know your version control history, where someone had to go in and fix some bugs? Well, encapsulate those fixes as minimum examples (with, e.g., JavaScript errors, maybe a stray comma or something in JSON) and then tell the guy "it doesn't work on IE, please fix it." You know it can be done because it WAS done. That's the kind of stuff they'll need to do at work.
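A minimal sketch of what one such encapsulated assignment might look like, using the stray-comma-in-JSON case (the fixture and test here are invented for illustration):

    import json

    # The "minimum example" handed to the candidate: this fixture is the bug.
    FIXTURE = '{"items": [1, 2, 3,], "total": 3}'   # stray trailing comma

    def test_fixture_parses():
        # The task: make this pass without weakening the test.
        data = json.loads(FIXTURE)    # as given, raises json.JSONDecodeError
        assert data["total"] == len(data["items"])

Because the fix was already made once in real life, you know the assignment is fair and has a known-good answer.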
> But of course, one prerequisite is absolutely necessary: the ability to approach problems logically. That is what one should look for on top of the domain knowledge.
This is something that many people in the enterprise and in mainstream programming are incredibly in denial about. I would guess that fully half of enterprise software decisions are made on an irrational basis, or at best on stuff that amounts to no more than hearsay.
Actually, most people are partially in denial, in that they are unconsciously trained to operate on groupthink. Most companies are not startups, so they have some proven formula for making money. The best strategy in this case is to let that work and to not rock the boat. In this case, being an obedient corporate drone is desirable. In the startup case, one is actively in search of a new formula, so it is most certainly not!
> Besides interview questions, give an interview ASSIGNMENT. Meaning, before you are hired, you have to complete some task. This shows a) commitment to wanting to work here, and b) that you can solve the kinds of problems we had to face before.
Are you hiring for Facebook? Because if not, you will miss the developers who aren't desperate to work for just anyone.
Hmm: Poster vineet's account was created 1120 days ago, has 112 karma, and average karma of 1.12. Hacked, a coincidence, or am I missing something? Is 112 part of some number like pi or e?