>Our hiring “secret sauce” largely stems from the fact that it seems to take significantly less time for someone with leadership and community skills to develop technical skills than the other way around. I’m seeing a large number of people who graduated from code bootcamps 3 and even 2 years ago now handily and gracefully filling the role of senior developer.
This statement makes me concerned that they have greatly devalued technical skills. There are not a whole lot of areas where we would consider a three-month course plus two years of experience anywhere close to senior. According to http://www.plumbingapprenticeshipshq.com/how-to-become-a-jou... becoming a journeyman plumber (mid-level) requires a 4-5 year apprenticeship!
When technical ability is devalued so much in a company, there is a real danger that this turns into a Dunning-Kruger clique, where "senior" developers who have been programming for 2.5 years automatically favor hiring experienced business people over experienced developers (remember, you can turn someone who has never programmed into a senior developer in a little over 2 years).
Think about law, medicine, engineering, or the military. We would never call a lawyer who started training 3 years ago a senior attorney. A doctor fresh out of 4 years of medical school is called an intern and is basically expected to mess things up. As mentioned above, even a plumber with 3 years of experience is still an apprentice. Why should we as software developers devalue our craft so much?
>This statement makes me concerned that they have greatly devalued technical skills. [...] When technical ability is devalued so much in a company,
I'm not going to justify their thinking but I'll attempt to explain where it probably comes from.
The context for their perspective is crucial. Notice that their assertion for "less time to develop technical skills" is followed by a sentence praising graduates of "code bootcamps". They also prominently espouse Ember.js[1].
To make sense of that, we can (roughly) divide programmers into two groups: (1) CRUD/LOB (Line-of-Business) and (2) algorithmic/embedded.
(1) is programming the enterprisey, forms-and-fields, "back office" apps. It was COBOL, dBASE/Clipper, Visual Basic, Microsoft Access, C# WinForms, 4GLs like Oracle Forms & SAP ABAP, and now JavaScript frameworks such as Ember.js/AngularJS. Basically, slapping a client GUI in front of a database backend. Whether that client GUI technology is Visual Basic, mobile-phone JavaScript, or an iOS Swift app... that choice is more about whatever programming zeitgeist you happen to be living in than about any inherent difficulty differences between the technologies. The idea is to take high-level frameworks + libraries and glue them together to deliver value to the business.
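To make the "glue" style concrete, here's a minimal sketch of my own (not from the comment above) of a group (1)-style CRUD layer, assuming Python's built-in sqlite3 as the database backend; the table and function names are illustrative:

```python
import sqlite3

# The "glue" pattern: no custom algorithms, just wiring simple
# operations onto a database backend. An in-memory DB for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

def create_customer(name):
    # INSERT and return the new row's id
    cur = conn.execute("INSERT INTO customers (name) VALUES (?)", (name,))
    conn.commit()
    return cur.lastrowid

def read_customer(cid):
    # SELECT a single row; None if it doesn't exist
    row = conn.execute("SELECT name FROM customers WHERE id = ?", (cid,)).fetchone()
    return row[0] if row else None

def update_customer(cid, name):
    conn.execute("UPDATE customers SET name = ? WHERE id = ?", (name, cid))
    conn.commit()

def delete_customer(cid):
    conn.execute("DELETE FROM customers WHERE id = ?", (cid,))
    conn.commit()
```

A real LOB app layers a GUI (forms and fields) over exactly these four verbs; the framework changes with the zeitgeist, but the shape of the work doesn't.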
(2) is programming of realtime kernel schedulers, complex distributed-computing algorithms, search engines, database storage engines, machine learning, ray-tracing graphics and physics engines for video games, audio DSP, traversing graph nodes, control theory for drones and Mars rovers, etc. This would be more "engineering"-type coding rather than "integration/glue" coding. Typical programmers we'd think of in this group would be Jeff Dean (Google MapReduce/TensorFlow), John Carmack (Doom), and Fabrice Bellard (FFmpeg).
The programmers in group (2) wouldn't say it's easy to take "leaders" and add programming skills to them. However, that sentiment is often expressed by group (1) programmers. I'm not saying it's the "right" philosophy, but it's an observation I've seen repeatedly. CRUD programming is often seen as just a longer, more elaborate version of programming the TiVo / thermostat / lawn-sprinkler system. It doesn't seem like group (1) is "devaluing" themselves; instead, they honestly just think "programming skill" isn't really a big deal.
I like this. I'd elaborate on your groups this way:
Group 1's job is to save the company money, or make existing systems easier to use/more efficient. They help increase sales by implementing metrics/ab testing/etc. A top member of group 1 will be leveraging technology to automate costs & complexity away. If the company nets more money because a Group 1 person is employed, they're doing their job.
Group 2's job is to invent core technology, often from scratch. They are well-versed in theory and application, and their deliverables take lots of time. If the company gains a patent because of a Group 2 person, they're doing their job.
Group 1 is business (and sometimes public) facing, so soft skills are much more important. Group 2 IS usually the business, so tech skills are #1. The two groups have the same components in their pie charts, just in different proportions.
That's not always true - many businesses make a lot of money with a core product in Group 1.
Group 2 is only necessary for certain companies, where leveraging technical advantages is core to the business. Most startups are not this, though being one may help a startup's chances of success.
That division of programmers is a very good model. I spent many years as a self-taught programmer of group (1) but I really wanted to do the work of group (2). I found the skill gap to be impossible to bridge on my own and went to college to do so. CS departments tend to focus on group (2) knowledge and only have a few electives for group (1). I've made several friends in their late 20's and early 30's who also went back to school to get out of group (1).
This model also works well for understanding many comments here on HN.
IMHO most of the world runs on project management, social skills, soft skills, "changing the world" by making a better mousetrap: increasing the conversion rate by 0.5%, decreasing the customer-support response cycle by 2.5 minutes, raising the satisfaction/loyalty survey by 3 points.
I think most people have moved onto (1) because of the universal adoption of the web. (1) makes the most splashes because it's more relatable, accessible and, dare I say, more profitable? (2) still exists, but seems to have declined; in the late 90's its practitioners seemed to be the dominant icons/heroes of computing (Phrack/2600, compared to today's "founders"). It takes more fortitude and individual direction to push yourself to dive deeper into the stack and take on kernel and distributed-computing problems, without the extrinsic reward of money.
In a sense, it's kind of like students choosing business consulting vs. going to (pure science or humanities, not pre-professional) grad school.
These types of programming are different, but not in the way you describe. It's not the cliche of CRUD LOB vs. "Cool things", but of making apps targeted at the end user vs. creating a shared infrastructure (whether it is a library or a cloud service), and there are many levels down that rabbit hole.
But that being said, unless you are working for a very conventional small or midsize business with no plans for rapid growth (and businesses like these rarely employ large teams that need leadership anyway), it's never going to be "just CRUD" or "just slapping a client GUI in front of a database backend". With large datasets or many users you have to plan your storage and query patterns for scalability; when precision is crucial you have to design for consistency; when a real-time response is required you need to reduce latency; and when business requirements change often you need to design for flexibility. When your company has multiple clients you need to design a good API, and so on. In most cases you must balance several requirements, and I haven't even gotten to security yet, which is a too-often-neglected requirement for almost every system out there. I also didn't touch on all the peripheral requirements: having automated tests (and designing for testability), designing a CI pipeline, and educating junior devs about revision-control workflows and DevOps practices.
Handling all of that doesn't require the same technical skills required for writing kernel code, a ray tracing engine or a database engine (all of them different skills in their own respect), but to be able to design a good system and give guidance to junior developers you definitely need more than 2-3 years of experience.
Yes, many companies choose to ignore what I've just said, hire low-skilled programmers anyway, and often put an MBA (who did some formulas in Excel back in the day) in place to (micro-)manage them. Of course the products they turn out are buggy, hard to upgrade, fall to pieces under the smallest hint of load, end up requiring expensive hardware, and have gaping security holes; all in all, their failings end up costing the business dearly.
This blasé approach to group 1 development is very common in the so-called "enterprise world", but that's just because the organizational elite over there (as opposed to most of Silicon Valley) is not even tangentially technical. That's understandable: take a CS-driven company like Google or an engineer-driven company like Facebook, and you'll get the mirror image. But there's no cold truth about programming skill not being a big deal here. Group 1 development is at the very least as demanding a skill as plumbing.
>Why should we as software developers devalue our craft so much?
Frankly, there's too much money being made at average skill levels that don't require any degree. I don't think there's an economic incentive to go for more regulation. Better yet, there's a ton of need, and some of the work is just plain easy once you do have 1-2 years of experience.
There are positions that don't directly affect life/liberty, unlike law, medicine, engineering, and the military. Making our industry like theirs would require legal regulation of titles and work.
Open source would then become a legal grey area (because open-source code is used right now in software that affects life/liberty) unless you are willing to police open source and make sure all contributors are qualified, or hold companies responsible for only using open source with qualified contributors. Either way, it's a pretty massive blow to the whole idea of open source.
If you want this to happen organically, then that's just going to take time on the scale of generations. Law, medicine, engineering, and military have all had a ton of time to develop this versus software engineering. Right now we're just cavemen protecting and healing our villages with shell scripts and compilers.
Frankly, I hope that regulation never happens. It sounds like a dystopian nightmare to me. The biggest reason I became a programmer, besides the fact that I really enjoy programming, is that there was no one to tell me I couldn't become one. Growing up I wanted to be all sorts of things, only to find out there were entrenched gatekeepers and a multitude of barriers to each profession. For whatever reason, jumping through other people's hoops (even if their existence is totally justifiable) had no appeal to me. I shudder to think of a world where kids like me grow up and find out that programming has just as many licensing requirements and regulations as everything else they might want to be. Had that been the case, I probably would have ended up just working a boring retail job or something.
But, on the other hand, companies are throwing away countless dollars on bad software. The problem is that it's relatively easy to write code that makes something look somewhat functional, but completing it is a different level. The mistakes of the quick approaches take time to show, and by then it's too late. It takes way more time and money to fix the issues later.
This has been a debate that's been going on for decades. There's the camp that wants software to get to the rigor of other engineering domains. But, there are a lot more who don't. Personally, I think we should be cautious. Software is such a subjective and broad field that it would be harmful to emulate the methods of other engineering domains.
That said, I have seen a trend over the last decade of 'juniors' disregarding the wisdom and experience of 'seniors'. Maybe it's because we are too slow, too methodical, not well informed on what's new or recently hyped. Maybe we are more strict, more ambivalent (seeing everything as trade-offs), or maybe we are just not fun. Who knows? But it's incredibly frustrating to continuously fix poorly developed systems. It's irritating to give advice and guidance, only to have it disregarded because some celebrity consultant, blogger, or tweeter invented another silver bullet.
Contrary to what some may think, it's really not fun to be able to say, 'I told you so.'. It just means that we have another mess to clean up.
But, back to the topic at hand: when I hire a 'senior', I expect them to not require lots of guidance, make good decisions, not break anything, have a breadth of experience, be an awesome team player, be calm and tenacious, pick up new things with ease, and have a fundamental understanding of the whole stack, algorithms, and data structures.
Going to play devil's advocate here in regard to 'juniors' disregarding the wisdom and experience of 'seniors'. Maybe the advice and experience you're giving are not backed up with any evidence? They may be taking an analytical, evidence-based approach and simply choosing between your advice and a celebrity tweeter's, seeing both as lacking evidence and thus equal. Possibly the silver bullet gives them the upper hand of having the latest marketing buzzwords for recruiters, to land their next job that pays 10K more?
I find it a bit of a contradiction that on one hand you're undecided about bringing more rigorous engineering approaches and certification to building software, while on the other hand you're frustrated that your advice and experience are not valued. To me it seems like you've answered your own question of why your advice isn't valued by the hip new guys.
Honestly, it's difficult to back things with evidence in this field. Ever try to get someone to read a paper on software engineering methods? Before one can get them to understand the evidence or even look at the data, they've already whipped together another couple thousand lines of code to demonstrate the opposite.
Look, I'm definitely not a top 1% (or even 10%) architect. But I have had enough successes under my belt that it's statistically improbable that I don't know what I'm talking about. However, my experience is usually discounted because 'times change, and old ideas aren't relevant', as they say.
> That said, I have seen a trend over the last decade of 'juniors' disregarding the wisdom and experience of 'seniors'. Maybe it's because we are too slow, too methodical, not well informed on what's new or recently hyped. Maybe we are more strict, more ambivalent (seeing everything as trade offs), or maybe we are just not fun. Who knows? But, it's incredibly frustrating to continuously fix poorly developed systems. It's irritating to give advice and guidance, only to have it disregarded because some celebrity consultant, blogger, or tweeter invented another silver bullet.
Maybe you're just getting older, and more aware/tired of it.
I'm not a proponent of blind regulation, and actually, I prefer less regulation, generally. But, do you have evidence to back up your implicit claim that less regulation yields better results?
I ask that with complete curiosity and humility - my perception is of experts telling me for decades that awesomeness is attainable with more deregulation. And, when awesome doesn't come, they say that it's because we didn't deregulate enough. Then, they ask for more.
But, I truly want to know if my perceptions are off - i.e. changes could be happening slowly enough that I don't see the immediate results.
There are scenarios in game theory (which I have not studied myself, only read about indirectly) that regularly occur in the market, that can only be solved by regulation enforced by a non-participant. Many of those scenarios relate to externalized costs (e.g. pollution) and information asymmetries.
Health care in the countries with the best results is also heavily regulated. Regulations can be very beneficial, but only if they're well developed and well protected from regulatory capture.
You made the following statement: `Health care in countries that have the best results are also heavily regulated`. Please give a citation backing it up.
I made a set of statements, so I was asking which one you wanted cited. The one you're interested in is pretty much like citing evidence that the sky is blue, but if you'd like to see it, here's a ranking.
The reams of regulations for other countries' health care systems are mostly not hard to find. You can start there, and then check here for more details:
Yeah, but part of my point, which I suppose I could have made more explicit, was that I don't have to work for a large tech company. Or any company at all. I can, if I really want to, strike out on my own and do it myself. That's how I got started getting paid to program. That path isn't for everyone, but I think that it's possible at all is a wonderful thing. I get a sinking feeling in my gut when I even contemplate a future where doing so just isn't a real option in the same way that one can't just become a doctor by obsessively practicing medicine on their own every night in their free time. I know I'm not the only one who got into programming that way and I think it would be really sad to close that avenue off for potential future programmers.
I unfortunately think the reason programming has been accessible is that we didn't know better. The standard for what can be publicly accessible on the Internet, or what will compete for users' attention, is much higher today. I think for programming to remain open there must be formal ways to acquire knowledge or to assess the quality of a piece of software.
Seems like they're describing what I think of as a project manager, albeit one blessed with enough technical skill to provide well-informed leadership of a project, including working with the senior developer.
The article says that someone considered Senior in the other two areas, once reaching a mid-level of technical competence, would be considered senior overall.
To bring the idea into a couple of the other fields you mention, consider:
A lawyer, just out of law school but with 10 years of software experience and recent work in patent submission.
A doctor, just out of medical school but with 10 years' experience as a registered nurse.
An engineer, just out of college but with 10 years in construction, specifically welding structural beams and overseeing other welders.
The one that I have personal experience with is the military. Soldiers respect as senior their peers with significant outside experience, even if the regulations do not allow them to be promoted officially.
You can also consider, among officers, the brand-new 2LT who was enlisted for many years prior. While they are a 2LT just like any butterbar, they are respected beyond their rank.
I don't think that the article is claiming that everyone 2 years out of a code bootcamp can be considered a senior developer, but that there are exceptional individuals who can.
There's a vast, vast divide between people who can use Rails and people who can design and build Rails, and an even bigger divide between the people who can use AWS and people who can design and build AWS.
The company in the OP isn't in the business of systems engineering, they're in the business of software integration. I suspect if they were in the business of engineering they'd have a much different view of what it takes to be "senior". Because they'd be able to measure it in terms of lawsuits, audits, and fines rather than in slipped deadlines for web apps.
I had the same reaction -- all good until that same paragraph. This kind of broke it all -- do these people understand what they are writing about?
I recall my own feeling of getting into another universe after I finally started to systematically look into the assembly code generated by the compiler, and it took me 6 years to develop the need for this.
It has been at least 5 years since I removed 'Senior' from my LinkedIn profile ;-)
I find it amusing that you liked an article which argued against "X years' experience" as a way to define a senior developer, but your disagreement is that you want to insist on "X years' experience" as a definition.
And as others have pointed out, it's not "technical ability" that is being devalued; more likely it's that a particular subset of technical knowledge is devalued -- perhaps appropriately for the situation -- but it just happens to be a subset that you personally feel is extremely important (and I might well disagree with you on whether that subset, or any specific subset, is actually essential).
And that's without getting into the irony of how much we as an industry approve of successful high school or college dropouts who learn to code well without completing a formal CS degree, but then turn around and, frankly, shit on the idea of actually hiring any such people because it'd require lowering our technical standards.
I can't speak for the parent comment, but I will say that while there are plenty of developers with "X years experience" who are not operating at a senior level, it is unrealistic to expect that someone will be operating at senior level without those years of experience.
I liked their ideas around direction given vs. direction required. I liked their distrust of the notion of "cultural fit". I liked that they identified leadership and connectedness as distinct skills, and that a senior developer has both technical skills and at least one of the other two (leadership or connectedness).
But true technical expertise requires the ability to make holistic, contextual decisions. That kind of stuff takes experience shipping multiple products, full-stack exposure to at least one or two mature software stacks, and awareness of the historical context of how technology has changed. That is, it's not so much about "years of experience" as it is "diversity and quality of experiences".
This is the kind of thing that bugs me, though, this sort of software-engineer exceptionalism. We like to pretend that somehow we're the only ones who have complex projects with tight deadlines and changing requirements. We like to pretend that if you haven't been coding since you were in diapers you can't possibly have enough context or experience or sense of the history of tech to make good decisions. We like to pretend that somehow the only people qualified to be software engineers are the kinds of people who've traditionally been software engineers.
But that's just bullshit. People with CS degrees from big-name schools and years of experience still do a shit job of managing software projects. They still do a shit job of picking tech stacks. Our literature, such as it is, is positively overflowing with stories demonstrating just how fundamentally horrifically awfully bad we are at this stuff. Often, they're bad at this stuff precisely because of the background that's supposed to make them exceptionally qualified; they're bad at it because they got extensive training in how to coordinate a dozen threads in an application but precisely zero in how to coordinate a dozen people on a team. They're bad at it because they got a bunch of ideology about theoretical purity shoveled into their heads in college instead of a bunch of pragmatism. They're bad at it because in the easy-VC-money days nobody cared what your tech stack was, as long as you picked something shiny and new. They're bad at it because they literally do not have the relevant background to be acting in a senior-engineer role.
But what about someone who gets tired of being, say, a lawyer and decides to do a coding bootcamp? They come in from a field where research, shifting requirements, teamwork and pragmatic approaches to dealing with a client's wants are all front-and-center required skills. The same is true for plenty of professions: we can take in somebody with the project-management skills and teach them about tech and coding. We're good at teaching tech and coding. What's hard is taking the person who has only ever learned tech and coding, and teaching them all the other skills.
I'm going to make myself very plain: For anyone who intends to pursue a technical/engineering career path, as opposed to switching to the managerial/entrepreneurial path, you're going to need to master some very difficult technical subjects. The point of calling a profession "engineering" implies that there is quite a bit of depth to the technical side. Therefore, calling someone a "senior" engineer implies that they have a good grasp of that deep skill-set.
> This is the kind of thing that bugs me, though, this sort of software-engineer exceptionalism. We like to pretend that somehow we're the only ones who have complex projects with tight deadlines and changing requirements.
In other news, the lead civil engineer on the local levee project has only two years' experience, and their only training was an associate's degree in CAD, but don't worry, because they used to be a lawyer and so they know all about "shifting requirements, teamwork and pragmatic approaches to dealing with a client's wants".
>They're bad at it because they got a bunch of ideology about theoretical purity shoveled into their heads in college instead of a bunch of pragmatism.
And the very first comment on that post is a better rebuttal than I have time to write up:
But working as a consultant to a lot of big companies (and a lot of small ones) over the past few years, I don't think the degree matters so much. The thing that matters to the exclusion of everything else is which side you're on: are you working to make something great or are you collecting a paycheck and trying to stay out of the line of fire?
A CS degree, X years of experience in tech, writing PDP-11 assembly in your diapers, etc. are terrible indicators of which side of that divide someone will be on.
Our company recently ran most of us through interview/hiring training. One of the core messages of this training was "hire for soft skills; technical skills can be trained."
This included a slide with a hypothetical "candidate A" who had no technical experience, but great communication skills, and "candidate B" who had extensive technical experience but no "soft skills". Who do you hire for a hypothetical technical role? "Candidate A" was presented to us as the correct answer.
Never mind the fact that our company has never historically demonstrated the ability to technically train an employee...
So, yes. Devaluing technical skill appears to be all the rage in HR these days.
At most of the places I've worked, I've been able to easily get consensus on the assertion that if a person can't explain what they're thinking to you then you won't be able to correct them if they're wrong.
You'd rather have an honest disagreement than have someone bumbling around the code quietly (and possibly quickly) making a mess that nobody notices until the debt is piled so high it blots out the sun.