I got an impression of who their target audience is, based on the examples used: people drinking kombucha and listening to Spotify.
All the young, beautiful people who wouldn't have ever taken a computer science course if it weren't such a lucrative industry to be in right now.
Maybe with this guide they can pass an interview at a big company where they just twiddle with bits all day. I won't be holding my breath until they can produce something useful.
This is what you get when you commoditise Computer Science. The article is about coding interviews and then goes on to deliver a lecture about "Computer science in plain English". Isn't it common knowledge that Computer Science and coding are like chalk and cheese?
People used to study data structures for a whole semester to get a deep understanding of how they work and how time/space complexities affect system design. But now we get a pop-culture-laced listicle that people will use to ace interviews and write mediocre software. Does anyone wonder why there aren't listicles like this for structural engineering, thermal power engineering, mechanical engineering, etc.?
No wonder we get data breaches, password leaks, and system outages: we continue to treat Computer Science/Software Engineering as the next fad for making a quick buck, and not as a science.
It isn't science, or engineering, because many of the things that count in computing defy measurement:
- code quality
- software productivity
- expected time between failures
- tolerance to error
- expected life in field
- usefulness to users
Because the science and engineering cultures of computing have failed to address these effectively, or even to create cultural norms that support their development, a craft culture has evolved instead. Coders are artizans, rather like clockmakers in the 16th century.
> Because the science and engineering cultures of computing have failed to address these effectively
That doesn't automatically make coders 'Artists'. There is a huge gamut of software outside the CRUD world of HN: software that runs mission-critical applications like Mars rovers, power plants, public transport systems, etc. We should have a fair amount of rigour to ensure the software being written is rock solid, and taking shortcuts to learn the basics of CS is bad.
Artizans, not artists. But I agree with your point: failure to understand hashtables is like failure to understand the secret nail in carpentry (I chose this because I've never understood the secret nail in carpentry and therefore do not count myself much of a carpenter).
We shouldn't delude ourselves: software (at the moment) is a craft discipline, and craft disciplines can have huge blind spots. Comp-sci needs to spend real effort on the hard questions, like working out how to make systems usable, and other corners that are ignored in favour of reams of papers about verifiability and modular composition; note I am not against this work, but I just don't think it should be funded while the experience of watching a six-year-old trying to use Google or a Mac is as humiliating (to a professional) as it currently is... and boy is it.
I second the artisanal mindset you have elaborated; it would be better for everyone if we adopted the craftsman's mindset (which Cal Newport also talks about). I personally feel that we should model software engineering on the apprenticeship model in Germany. It would be interesting to see how it fares.
I had someone reply to one of my comments that they 'didn't know about the Vietnam War because they weren't born then'.
... I feel like our discipline suffers from a lot of the same problems. IBM mainframe experience doesn't translate into knowing the intricacies of React, but it does provide wisdom on systems and usability problems people coding in React are going to run into too.
Glorify the new, but be informed by that which came before. Otherwise doom, repeating, etc.
Not that I'm disagreeing with you there, but I've also seen the opposite. I've had conversations with people who know their mainframe environment inside out but have no idea how a modern computer really works.
I'm under the impression that it often comes down to a combination of the Dunning-Kruger effect and a certain unwillingness to inform oneself about "new" things (i.e. things one doesn't know about yet) - and those seem to exist in both camps. So, just maintaining Aristotle's mindset and keeping an open mind seems like a good starting point to me in that regard.
Job offerings look for "engineers" when they mean coders. A young person who thinks he or she does not count as an engineer will pass over that job, despite being fully able to get it and do the work. "You are not actually an engineer" is bad advice in the current job market.
(I used to be that young person and used to pass on opportunities because of superficial reasons like that.)
Better for the competitor who gets the job with the same qualifications and skills, maybe; better for you personally (in terms of both what you learn and how good a job you get), no.
I have an actual engineering degree and write code that controls some sophisticated hardware. Am I allowed to be called an engineer and not an artisan or whatever?
As my dad always says, "you can call me a jar, as long as you don't smash me to bits". Since Slavic proverbs don't always translate well to English, it might be best to clarify: what you call yourself is not as important as what you do.
What @sgt101 wrote about measurements is an important and insightful point. Does it mean we can't call ourselves engineers? That's a discussion that won't be resolved any time soon. Personally, I care more about the job itself than about the label you slap on it.
I don't even have a degree and I write web stuff and I still call myself an engineer.
We "design, construct and test structures, materials and systems while considering the limitations imposed by practicality, regulation, safety, and cost."
Anyone who doesn't take a systematic approach might, I suppose, be an artisan, but the vast majority of software engineers I've worked with in my career at least attempt to use a systematic approach to learning and to building processes.
Blame the young, beautiful people all you want for not slaving away over a 4-year degree that puts them in the same position as 1 year of self-learning would, OR blame the business yuppies who keep coming up with SaaS business models that don't need anything more than plug-and-play code monkeys.
I understand that memorizing data structures does not make a good engineer, but honestly, who is still asking for cream-of-the-crop engineers, especially at the entry/junior levels?
At the risk of sounding like a cranky old codger, I'd like to point a couple of things out. If all you have are "plug and play code monkeys", all you'll have is shitty software. The fact that shitty software is "enough" for so many businesses is just another symptom of the biggest problem with our society: we optimize for profit and damn everything else.
Oh I agree completely. I just take issue with the fact that it somehow is a "youth problem", as is being pointed out by the parent comment. It's a business problem, not a lazy millennial/education problem.
Stop hiring shitty coders and you'll stop getting shitty interviewees. Fewer shitty interviewees means less need for these types of "beat the coding interview" services/blogs.
Honestly, a lot of the time the comments in these threads amount to "I was a geek and it wasn't cool, I know CS from the ground up, and nobody deserves to have an easy path to being a coder". It's a tired form of gatekeeping and doesn't make anything better for anyone.
I'm not defending such comments, but there's more to them than just gatekeeping. Speaking from my own point of view, the "I was a geek and it wasn't cool" sentiment comes from feeling betrayed by having your lifestyle become an industry that too often focuses on profit more than on quality; the "I know CS from the ground up" comes from frustration with all the people who dismiss learning from the ground up without understanding all the insights it gives you; and "nobody deserves to have an easy path to being a coder" is an exaggeration of "when you try to make learning easier than it can really be, you end up dumbing things down".
Recently I found myself struggling to formulate my attitude towards software development and the best I could come up with is "lifestyle coding": sure, it's important to me to make something people will use and like, something that will improve life in some aspect, but in the end, I'm in this because I love to create programs. To me programming is more than my job, more than just means to an end, it's what I truly enjoy. People like me will often feel bitter about many aspects of our industry and it takes a conscious effort to keep aware of that feeling and to make sure it doesn't taint our decisions.
A four year degree at a quality program will open up a lot of jobs that the one year of self learning will not prepare you for.
That's not to say one year of self learning won't prepare you for a lot of programming jobs, or that the person with the four year degree won't be starting at one of those same jobs. But there are a lot of types of software I would only want someone with a real CS degree writing.
(Of course, 1 year of self learning, plus 3 to 4 years of real world coding plus further supplemental self study, could very well get you to the same level as a CS grad's knowledge, or beyond.)
I wasn't suggesting that a 4 year degree confers any advantage. I think it's merely status signalling to potential employers. Practically speaking, my degree only helped me at one point in my life: moving to a foreign country.
There's nothing wrong with software that only needs code monkeys to write. It doesn't have to be technical masterpiece to be useful, or valuable to business. Just think of all the code monkeys with jobs.
Basically every place I've worked for in the last 35 years has been desperately seeking new grads who can actually program a computer. And it's getting harder to find them.
Are we now blaming young people for looking at employment prospects before picking a major? Or are we blaming them for trying to learn?
Because the last time students were mentioned, they were blamed for picking impractical majors, unlike everyone older in STEM who supposedly made more rational choices (as supposedly proven by the fact that they didn't follow their passion into the humanities).
This is not blaming inadequate teaching methods:
> All the young, beautiful people who wouldn't have ever taken a computer science course if it weren't such a lucrative industry to be in right now.
> Maybe with this guide they can pass an interview at a big company where they just twiddle with bits all day. I won't be holding my breath until they can produce something useful.
This is bullshit. Are you also the kind of guy (I know you are a guy) who is on the "lookout" for "fake" geek girls? The homepage looks a lot like the programmers I meet. Do they need to be overweight and not care about how they look to be real programmers who can actually produce something useful?
The projection in your comment is unreal: a just-world fallacy where only unattractive people can be smart and attractive people are mindless, which was definitely not the point.
How exactly is this projection? Your original comment was stereotyping real developers based on their appearance. They were not "legit developers" in your eyes because of how they looked. I am looking at the homepage photos, and it is not as if they all look like models. They are younger and better-looking than average, but they would not look out of place on any dev team. This nonsense about what a developer looks like is pure bullshit. Why assume that a person is any less of a legit developer based on your image of how devs look? Why is it more OK than any other kind of stereotyping? How is it better than stereotypes about women and driving?
It was clearly not my intention to judge programming competency by appearance. But it is also undeniable that the industry does select for young, good-looking people.
To be honest, your comment doesn't make you seem like you're someone who actually knows anything about or is particularly good at coding -- rather, it makes you seem like someone desperate not to have to compete with a larger field of candidates, and grasping for any reason you can find to claim they shouldn't be allowed to try for the same jobs as you.
Also, I'm quite sure someone could easily come up with comments just as arrogant as yours, about you.
Usually, any "(X) for Interviews" post is a review of things that in theory a programmer learns very early on and then forgets due to disuse (since most programming interviews are essentially pop quizzes on such things).
The ironic thing is that this biases toward the most newly-minted programmer; actual experienced working programmers rarely need to implement basic data structures or their relevant algorithms from scratch (they rely on existing implementations), and so move them to dusty disused corners of their minds, while newly-trained programmers with no job experience have been regurgitating these things on exams quite recently.
So even the dismissiveness is wrong -- the original commenter is, to be honest, less likely to pass such an interview without remedial study, compared to the "young, beautiful people drinking kombucha and listening to Spotify" being sneered at, who probably have been taught more recently and have it fresher in their minds.
It should be mandatory for programmers to be able to concoct even bad examples of a sorting algorithm in real time and to create data structures for anything from a linked list to a binary tree.
It is also important to have candidate code samples for the interview, with dissection and analysis by the candidate, to understand where the programmer is in their professional development: how much is copy-paste and how much is functional and design integration. All this demonstrates the candidate's level of knowledge and their productive approaches.
Having to teach programmers how to deal with recursion and other fundamental concepts, or language-specific approaches like pointer arithmetic, or interpreted-language nuances like lambdas and list comprehensions, should not be on the table.
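To be concrete about what I mean by "even bad examples": something on the level of the rough sketch below, written from memory, is plenty (Python purely for brevity; a minimal illustration, not production code).

```python
# Rough whiteboard-level sketch: a minimal singly linked list plus a naive
# recursive sort over its values. Deliberately simple, not production code.

class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

class LinkedList:
    def __init__(self):
        self.head = None

    def prepend(self, value):
        # O(1) insertion at the head
        self.head = Node(value, self.head)

    def to_list(self):
        out, node = [], self.head
        while node:
            out.append(node.value)
            node = node.next
        return out

def quicksort(xs):
    # Naive recursive quicksort: O(n log n) on average, O(n^2) worst case
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))

if __name__ == "__main__":
    ll = LinkedList()
    for v in (3, 1, 4, 1, 5):
        ll.prepend(v)
    print(quicksort(ll.to_list()))  # [1, 1, 3, 4, 5]
```

Nothing here requires memorized trivia, just a basic grip on references, recursion, and complexity.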
All modern software is built on layers upon layers of leaky abstractions. If you view fundamentals as nothing more than hazing rituals, you won't even be aware of just where the abstractions start to leak and you'll end up writing shitty code. Of course, it's perfectly possible nowadays to do so and let it be an SEP (Somebody Else's Problem).
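A small illustration of what I mean by a leak (Python, just as a sketch): two calls that look equally innocent hide very different costs, and you only notice if you know what the structure underneath is doing.

```python
# Both functions "just remove items from the front of a queue", but the
# abstraction hides very different costs underneath.
from collections import deque
import timeit

def drain_list(n):
    xs = list(range(n))
    while xs:
        xs.pop(0)      # looks cheap, but shifts every remaining element: O(n) per call

def drain_deque(n):
    q = deque(range(n))
    while q:
        q.popleft()    # genuinely O(1); the underlying structure makes the difference

n = 20_000
print("list :", timeit.timeit(lambda: drain_list(n), number=3))
print("deque:", timeit.timeit(lambda: drain_deque(n), number=3))
```

If you treat the list as a black box, nothing warns you until it falls over at scale.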
> I got an impression of who their target audience is, based on the examples used: people drinking kombucha and listening to Spotify.
> All the young, beautiful people who wouldn't have ever taken a computer science course if it weren't such a lucrative industry to be in right now.
> Maybe with this guide they can pass an interview at a big company where they just twiddle with bits all day. I won't be holding my breath until they can produce something useful.
You don't see any arrogance here? All it needed was an avocado-toast reference to be indistinguishable from a "why millennials are terrible and my generation is much better than them" thinkpiece.
I'm a millennial myself, not older than the people I am critiquing. (After looking up your profile, I realize that I am in fact much younger than you.)
My comment was made in jest, pointing out how incongruent technical interviews can be with the job they're screening for, which I think doesn't imply any arrogance. Maybe you are reading it differently.