Are you joking? Without understanding binary, you can't understand:
- Numeric types, which numbers can be represented exactly, their failure modes, etc. (see the sketch below)
- Bit-field flags, e.g. for enums
- IP address masks and other bitmasks
- Anything at all about modern cryptography
- Anything at all about data compression
- The various ways color is represented in images
- Any custom binary format, MIDI, USB, anything low-level at all
Honestly the list goes on and on. It's absolutely insane to me to hear people say that you can be a competent software engineer without understanding binary.
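To make the first two bullets concrete, here's a minimal Java sketch (the class and variable names are mine, purely for illustration):

    // Why representation matters: IEEE 754 doubles store binary fractions,
    // so 0.1 and 0.2 have no exact binary representation.
    public class BinaryBasics {
        public static void main(String[] args) {
            System.out.println(0.1 + 0.2);         // 0.30000000000000004, not 0.3
            System.out.println(0.1 + 0.2 == 0.3);  // false

            // Bitmasks: do two IPv4 addresses share a /24 subnet?
            int mask = 0xFFFFFF00;                              // 255.255.255.0
            int a = (192 << 24) | (168 << 16) | (1 << 8) | 10;  // 192.168.1.10
            int b = (192 << 24) | (168 << 16) | (1 << 8) | 200; // 192.168.1.200
            System.out.println((a & mask) == (b & mask));       // true: same subnet
        }
    }

None of this is exotic; it's the kind of thing that bites you the first time a currency total comes out a cent off.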
The average web CRUD developer never needs to touch any of this stuff.
- Numeric types? Who cares? I know min, I know max. I take number from user and insert it in database. For calculations with money, I use integer cents.
- Bit-fields? I work in Java, what are bitfields?
- IP addresses? I am web developer, not network engineer. I don't need to deal with netmasks.
- Cryptography? Me no understand. Me use Let's Encrypt. Is secure, no?
- Compression? Browser do gzip for me. Me no care.
- Colors? I pick the nice color from the color wheel.
- Binary? What is binary? I only use binary when I want users to upload a file. Then I put the files on S3.
Well I do embedded dev as a hobby now so I know this stuff. But for a long time I didn't know how many bits were in a byte simply because I never really needed that knowledge.
Look, it's fine if you want to be a hobbyist making some personal website that doesn't store PII or take user submissions. But we're talking about career software developers here. If you want to make a career out of it, this attitude is not only harmful to your career, but dangerous to your company. Besides: do you really not want to actually be good at what you do?
My attitude is: I learn whatever I need to learn to get things done.
And for a long time, I've just never needed to learn about how many bits were in a byte.
The only time I've ever needed to deal with individual bits in Java was when I was working with flags or parsing binary formats. Both of which are extremely rare when doing generic server dev.
You can even do rudimentary binary reverse engineering without any knowledge of bits. I remember running programs through a disassembler, looking for offending JMPs and then patching them out in a hex editor with 0x90.
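For the record, here's a minimal sketch of that kind of patch, assuming an x86 target (where 0x90 is the single-byte NOP); the file name and offset are hypothetical, stand-ins for whatever the disassembler gave you:

    import java.io.RandomAccessFile;

    // Overwrite a 2-byte short JMP (EB xx) with two NOPs (0x90 0x90).
    public class NopPatch {
        public static void main(String[] args) throws Exception {
            long offset = 0x1A2B;  // hypothetical file offset from the disassembler
            try (RandomAccessFile f = new RandomAccessFile("target.exe", "rw")) {
                f.seek(offset);
                f.write(0x90);     // opcode -> NOP
                f.write(0x90);     // operand -> NOP
            }
        }
    }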
Not having knowledge is not a problem as long as you know that you are missing that knowledge.
You have an artificially high bar so you can gatekeep and appoint yourself the arbiter of who's smart and who's not. What you don't realize is that most people don't give a crap, and hundreds of billions of dollars' worth of software is sold every year by developers who don't know about anything you mentioned.
Your attitude is also inappropriate for a place called "hacker news", where people are resourceful and try to do the most with whatever they have. Maybe you want to go to /r/compsci.
You don't have to be a "competent software engineer" to be a developer, and in fact, we were never talking about being a "competent software engineer".
These developers do not get jobs as "competent software engineer"s, do not train to be a "competent software engineer", and do not care about being a "competent software engineer". And yet they make things that work perfectly fine! I'm sorry that you think handling PII or having a career (ooo) has anything to do with being a "competent software engineer".
> But we're talking about career software developers here.
I don't remember the name of this fallacy, but it sucks. Knock it off.
It's absolutely insane to pretend most software engineers need to do any of these things. We're making billions of dollars in aggregate without knowing this stuff. Most people aren't writing their own data compression algorithms, and we sure as shit aren't writing cryptographic algorithms because we're told to leave that one to the experts (i.e., mathematicians).
I'm guessing 80% don't even know bit-field flags and never have to use them, despite them being relatively simple. It would take someone moderately capable at math less than an hour to learn them. I learned them when I was 13 because I fucked around with MUD codebases, and I don't think I am special for it.
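For anyone who hasn't seen them, here's the whole trick in a few lines of Java (the flag names are made up, the kind of thing a MUD codebase would define):

    public class Flags {
        // Each flag gets its own bit, so one int can hold all of them.
        static final int CAN_READ  = 1 << 0;  // 0b001
        static final int CAN_WRITE = 1 << 1;  // 0b010
        static final int CAN_EXEC  = 1 << 2;  // 0b100

        public static void main(String[] args) {
            int perms = CAN_READ | CAN_WRITE;              // set two flags
            System.out.println((perms & CAN_WRITE) != 0);  // true: flag is set
            perms &= ~CAN_WRITE;                           // clear one flag
            System.out.println((perms & CAN_WRITE) != 0);  // false
        }
    }

That really is all there is to it, which is sort of my point.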
Do you realize that once you write code to solve a problem, that exact problem never needs to be solved again? Either you're solving new problems (and "new" can be slight -- new situation, new context, new hardware, new people, new company, whatever), or you're doing the compiler's job. If most people aren't solving new problems, then their job is bullshit. I frankly don't even understand how you can be confident that code copy-pasted from Stack Overflow actually does what you need it to do without understanding the fundamentals.
> we sure as shit aren't writing cryptographic algorithms because we're told to leave that one to the experts.
Shouldn't we all be striving to be an expert in something? If you're not working your way toward expertise, what are you doing? Why are the absolute fundamental basic building blocks for understanding how computers work and what programming languages are doing something that only "they" need to bother to learn?
Most of the problems software engineers are solving today are business problems defined by stakeholders in a business.
And yes, I agree we should be an expert in something. Harping on binary seems like a waste of time, however. I would certainly like that people are interested enough in the field that they spend their time learning as much as they can about CS, but I'm under no illusion that I'm going to automatically be more productive or better as a software engineer because I know bit-fields.
We're "harping" on it because it's so basic. There are a hundred equally basic things you should also know.
> Most of the problems software engineers are solving today are business problems defined by stakeholders in a business.
Even understanding the business problems usually requires a lot of fundamental knowledge in a wide variety of subjects. Being a good software engineer is hard.
And regardless of the problem, if the solution is going to be in code, you can't get away from binary. I actually don't think most programmers should be learning assembly language, because you can completely abstract away from it (and you should, because you don't know which assembly language you're programming for). But you can't abstract away from addition, the alphabet, binary, strings, algorithm complexity, and other basics.
PS: I didn't downvote you. I don't downvote people for disagreeing with me. I like disagreement, and besides, it would reduce the visibility of my pithy responses! I only downvote comments that aren't even worth arguing with. So the very fact that I'm taking the time to respond means I have at least some respect for your argument.