
This is my field :)

Category theory is about connecting the dots between different areas of maths. The "general application" is to allow you to reason over the structure of a problem you're interested in, while throwing away all the superfluous details. It arose when geometers and topologists realised they were working on the same problems, dressed up in different ways. I think the utility for technical people, from this perspective, is pretty clear.

As for the general working person? I think it's just an exercise in learning to do abstractions correctly, which is valuable in any line of work.

There are actually people who advocate that we should base maths education on category theory much earlier (much as New Math was interested in teaching set theory early on, as a foundational topic). CT is an unreasonably effective tool in a large section of pure maths, so this doesn't sound unreasonable to me; it wouldn't be nearly so scary if it were introduced gently much earlier on (in the same way we start to learn about things like induction in the UK in secondary school, long before formalities like ordinals are introduced at uni). Currently only a very specific, highly-specialised section of the population learn CT, but if something like this were to happen, I'm sure we'd see lots of benefits which are hard to identify at the moment.


Don't need to convince me ;), but I mean this is for the very average person who argues like "why do I need more math than adding two numbers"... and even just "allow you to reason over the structure of a problem you're interested in, while throwing away all the superfluous details" and "learning to do abstractions correctly" would not seem more approachable to them than the calculus description I gave above? imo. Is there a simple real-life problem or model everyone should know or has touched in school this can relate to? I'd be curious, because as you frame it I may even need to revisit my faded memory, hm.


I honestly don't know. This might be a skill issue on my part (very much not an educator), but I think of it as a language for thinking about structural abstraction, so to me the question is akin to "is there a real-life problem that German relates to?"; I can certainly think of lots of problems that would be made much much easier by understanding the language (e.g., getting around Germany, i.e. noticing abstractions), but it's tough to point to anything for this explicit question other than "conversing with someone in German".

I guess to try and mirror your calculus example, I'd try and motivate why someone should care about abstraction itself, perhaps with examples like 'calculating my taxes each year is exactly the same problem, except the raw numbers have changed'.

Alternatively it might go over better to say something like: "Imagine you have a map with a bunch of points, and paths which you can walk between them. CT is the study of the paths themselves, the impact of walking down them in various routes: for mathematicians, this means looking at things like turning sentences such as 'think of a number, add 4 to it then divide by 2 then add 6 then subtract 1' into 'think of a number, halve it and add 7'. Once you've spotted this shortcut on this silly toy map, you'll recognise the same paths and the same shortcut when you see on your tax form 'take your income, add £400 to it, divide by 2, add £600 and subtract £100'."
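If it helps make the shortcut concrete, here's a minimal Python sketch of the same idea (toy functions, nothing more); composing the four small steps gives exactly the same function as the one-step path:

  # the long path: four small steps composed
  def long_path(x):
      return ((x + 4) / 2 + 6) - 1

  # the shortcut: one step with the same effect
  def shortcut(x):
      return x / 2 + 7

  assert all(long_path(n) == shortcut(n) for n in range(100))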


This text comes to mind: https://math.mit.edu/~dspivak/CT4S.pdf

I've only read pieces of it, but I think this moves in the right direction towards making category theory useful to day-to-day life in non-trivial ways.


This makes the same mistake being called out in the comment you're replying to. The point isn't about the mechanics of solving a differential equation, it's about gathering the intuition about a way of approaching problems.

(Also, while they might not be tools the average homeowner needs, there are plenty of optimisation problems similar to "how much fence do I need" which are most easily solved by solving the Euler-Lagrange equations)


Interesting, I've seen the opposite in highly trained mathematicians: they can really struggle with taking an intuitive leap without the safety net of working out the numbers.

While it's always better to work out the numbers if possible, sometimes it isn't possible and people get trapped trying.


Not a solution but FWIW, in my time working at Apple, they encouraged you to log in to work devices with your personal ID (with the expectation you don't allow files to sync). There were some people who kept them apart, but the overwhelming majority just used their personal with no issues. Personally I made a new one, but at the time it was my first Apple device so wasn't really a decision.



Honestly I just think the Sonarr/Radarr concept doesn't translate well to music; I find Lidarr fiddly in general.

In particular I'd add to your list that the overnight scans to update cover art are an absolute mess. It's not so big a deal on my libraries in Sonarr and Radarr, but for Lidarr? Jesus Christ. I have a reasonably sized music library (~400-500 GB), and every night Lidarr starts phoning out to check, for every single album and artist, whether the associated cover art or artist image has changed. This takes hours, is completely unnecessary, and cannot be turned off. I've resorted to just blocking the addresses it does this on, but this breaks things when I try and use it to add new music.


The point here is that this isn't some example from a textbook or even stack overflow, but licensed pieces of work with all the legal complications that come with that. This is about the potential use of this code in proprietary code (or code otherwise incompatible with the original licenses), and I really don't think anyone would say it is "accepted best practice" to copy out someone else's work you find online, licenses be damned, in a professional setting.


> this isn't some example from a textbook or even stack overflow, but licensed pieces of work with all the legal complications that come with that

I understand why these might feel different to you, but textbooks and stack overflow are also proprietary, licensed pieces of work. I don’t see why there would be much of a legal distinction.


No, you're missing the point.

There are two worlds.

In one, every time someone publishes code with a license attached, they've taken a chunk out of the set of valid lines of software capable of being permissibly written without license encumbrance. This is the world the poster you are replying to is imagining we're headed toward, and this case basically does a fantastic job of laying out a test case/precedent for it.

The other world is one where everyone accepts all programming code is math, and copyrighting things is like erecting artificial barriers to facilitate information asymmetry. I.e. trying to own 2 + 2. In this second hypothetical world, we summarily reject IP as a thing.

The 2nd world is what I'd rather live in, as the first truly feels more and more like hell to me. However, given the first one is the world we're in, I'd like to see the mental gymnastics employed to undermine Microsoft's original software philosophy.

EDIT: Voir dire will be a hoot. Any wagers on how many software people make it onto the jury if any?


> In one, every time someone publishes code with a license attached, they've taken a chunk out of the set of valid lines of software capable of being permissibly written without license encumbrance.

If this were true of copyright, we would’ve run out of permissible novels a long time ago. There’s plenty to complain about with how software IP works, but copyright seems pretty sane. The alternative of protecting IP via trade secret is not a world I want to live in. That seems bad for open source.


Code is a more restrictive space than prose. Prose has to be grammatical and meaningful, but code has to compile and efficiently serve a useful specification.

The central idea of programming languages is that the grammar is very restrictive compared to natural languages. It's quite likely that, with the exception of variable names and whitespace, some function you wrote to implement a circular buffer is coincidentally identical to code that exists in Sony's or Lockheed Martin's codebases.
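As a rough illustration (a toy sketch, nobody's actual codebase), the core of a fixed-size circular buffer leaves very little room for expressive variation:

  class RingBuffer:
      def __init__(self, capacity):
          self.buf = [None] * capacity
          self.head = 0    # index of the oldest element
          self.count = 0

      def push(self, item):
          # overwrite the oldest element once the buffer is full
          tail = (self.head + self.count) % len(self.buf)
          self.buf[tail] = item
          if self.count == len(self.buf):
              self.head = (self.head + 1) % len(self.buf)
          else:
              self.count += 1

      def pop(self):
          if self.count == 0:
              raise IndexError("empty buffer")
          item = self.buf[self.head]
          self.head = (self.head + 1) % len(self.buf)
          self.count -= 1
          return item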

Plus there's the birthday problem -- coincidences can happen way more than you expect. And even with prose, constraints like non-fiction can narrow things down quickly. If everyone on HN had to write a three-sentence summary of, say, how a bicycle works, there would probably be coincidentally identical summaries.


Three sentence summaries probably wouldn’t qualify for copyright protection. The same should be true of code - if we think the standard for copyright protection is too low, we should raise the bar on complexity requirements, not throw out copyright.

Even if a programming grammar is more restrictive, there’s some length where things become almost certainly unique.


ReactOS actually got sued by Microsoft for stealing code and one of their proofs was a piece of code (can’t remember exactly what it did) that basically matched the same function in Windows' code with a few things changed.

It was ASM code I think, and their defense was that there was basically one way to write a function that does this.


I think you're misremembering here; as far as I know (and as far as I can tell from searching just now) MS has never sued ReactOS. There was a claim made back in 2006 on the mailing list that a portion of syscall.S was copied, and this caused ReactOS to do their own audit:

https://en.wikipedia.org/wiki/ReactOS#Internal_audit


It raises an interesting question though.

Aside from obligatory syntactic bits, what is the most common line of code across all software ever developed?

It'll probably be C or Java. HTML doesn't count.

And it's probably something boring like:

  i++;


I don't think this dichotomy is at all fair. Just because someone makes a piece of software public does not mean they want it freely copied, and I think that can be a completely reasonable stance to have. I'm struggling to make sense of your argument unless you believe either:

- Code is not intellectual property; I don't see this as easily defensible. It takes time, effort, and in some cases seriously heavy resources to come up with some of the tech that companies rely on. Should all private companies rescind copyright on literally everything their staff write?

- Intellectual property is a nonsense concept altogether; in this case, I don't think you're ever going to get your way in the court of public opinion.



In many cases a snippet, routine, proc, whatever you work with, is rote procedure, such as device access, i.e. retrieving a directory listing.

Code that reverts to a conserved sequence of bytes, interchangeable with no functional variation.

Code that is so common knowledge it has become street graffiti belongs in world 2,

versus code that creates functionality not available by direct command, which is innovative and should be attributed. That sounds like what the 1st world should be.


That’s not actually how it works. Purely functional code, such as code that is written in a certain way to achieve maximum performance, is not deemed expressive and therefore not covered by copyright. This code would be covered by patent.


I think we are actually talking about the same thing.

In simple terms:

mov ebx, eax ; an obvious function; no IP

mov eax, [eax] ; seems useless unless you know what de-referencing is. probably IP

This is of course an example not considering granularities at the level of patents on a language, or macro directives.


Even from this page there appears to be a fair bit missing. E.g. one of the biggest UK gov orgs is @alphagov (~1.6k repos), which I can't seem to see on your dash?


> Today's developers didn't learn binary before learning Python, why should you learn how to code without the most modern tools?

This phrasing makes me wary. There's a difference between being self-taught, and not even bothering to teach yourself the absolute fundamentals of computing, like binary...


Learning binary and knowing what binary is are two separate things. I don't think you need to learn binary to learn to code. Been doing this for 25 years working at some big name companies, I don't know binary other than 1 is on, 0 is off.

I'm starting to feel like this whole notion, that newbies can't come in and learn, say, React or modern JavaScript without having to go all the way back to that day in 1995 when they were hashing out the std lib for JavaScript and learn everything about it before that "Hello World!" in React starts, is becoming a gatekeeping method.


> Learning binary and knowing what binary is are two separate things. I don't think you need to learn binary to learn to code. Been doing this for 25 years working at some big name companies, I don't know binary other than 1 is on, 0 is off.

You're wearing your lack of CS knowledge like some kind of badge of honor. Of course you don't need deep CS knowledge to be a competent programmer. But almost undoubtedly you'll be a better programmer if you do know CS (I'm using binary here as a proxy for CS since I can't imagine having a deep knowledge of CS without knowing something fundamental like binary). How many times over those 25 years could a problem have been more efficiently solved (in programmer time or CPU time) if you had known about problem solving techniques (perhaps those related to binary knowledge for example) that you don't know from the world of CS? You'll never know... You may guess, but you'll never know what you don't know.

It's not gatekeeping to say that to give someone a complete education to be a software developer we should provide them a knowledge of binary. We can teach programming in an approachable fashion AND teach binary in an approachable fashion. We do it every day at my college.


Why are you talking about CS and nonexistent "problem solving related to binary"? By "knowing binary" we are not talking about knowing machine code instructions or the details of how they are executed, but literally knowing how to read and work with binary numbers (using bitwise operations). Which isn't necessary for problem-solving or implementing most algorithms.

(Yes, there are algorithms that use bitwise operations. They're technically expendable and it doesn't make you any less of a programmer not to know everything. Especially if you're using Python or JavaScript!)


> nonexistent "problem solving related to binary"

Are you joking? Without understanding binary, you can't understand:

- Numeric types, which numbers can be represented exactly, their failure modes, etc.

- Bit-field flags, e.g. for enums

- IP address masks and other bitmasks

- Anything at all about modern cryptography

- Anything at all about data compression

- The various ways color is represented in images

- Any custom binary format, MIDI, USB, anything low-level at all

Honestly the list goes on and on. It's absolutely insane to me to hear people say that you can be a competent software engineer without understanding binary.
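To pick just one bullet from that list, bit-field flags are nothing more exotic than this (a toy Python sketch with made-up flag names):

  READ, WRITE, EXECUTE = 0b001, 0b010, 0b100   # one bit per permission

  perms = READ | WRITE              # combine flags       -> 0b011
  can_write = bool(perms & WRITE)   # test a flag         -> True
  perms &= ~EXECUTE                 # clear a flag (already clear here)
  perms |= EXECUTE                  # set a flag          -> 0b111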


The average web CRUD developer never needs to touch any of this stuff.

- Numeric types? Who cares? I know min, I know max. I take number from user and insert it in database. For calculations with money, I use integer cents.

- Bit-fields? I work in Java, what are bitfields?

- IP addresses? I am web dev loper, not network engineer. I don't need to deal with netmasks.

- Cryptography? Me no understand. Me use Let's Encrypt. Is secure, no?

- Compression? Browser do gzip for me. Me no care.

- Colors? I pick the nice color from the color wheel.

- Binary? What is binary? I only use binary when I want users to upload a file. Then I put the files on S3.

Well I do embedded dev as a hobby now so I know this stuff. But for a long time I didn't know how many bits were in a byte simply because I never really needed that knowledge.


Look, it's fine if you want to be hobbyist making some personal website that doesn't store PII or take user submissions. But we're talking about career software developers here. If you want to make a career out of it, this attitude is not only harmful to your career, but dangerous to your company. Besides: do you really not want to actually be good at what you do?


My attitude is: I learn whatever I need to learn to get things done.

And for a long time, I've just never needed to learn about how many bits were in a byte.

The only time I've ever needed to deal with individual bits in Java was when I was working with flags or parsing binary formats. Both of which are extremely rare when doing generic server dev.

You can even do rudimentary binary reverse engineering without any knowledge of bits. I remember running programs through a disassembler, looking for offending JMPs and then patching them out in a hex editor with 0x90.

Not having knowledge is not a problem as long as you know that you are missing that knowledge.


You're being facetious.

You have an artificially high bar so you can gatekeep people from being smart and be the arbiter of who's smart and who's not. What you don't realize is most people don't give a crap and easily hundreds of billions worth of software is sold every year by developers who don't know about anything you mentioned.

Your attitude is also inappropriate for a place called "hacker news" where people are resourceful and try to do the most with whatever they have. Maybe you want to go to /r/compsci


> competent software engineer

Interesting term to conflate with "developer".

You don't have to be a "competent software engineer" to be a developer, and in fact, we were never talking about being a "competent software engineer".

These developers do not get jobs as "competent software engineer"s, do not train to be a "competent software engineer", and do not care about being a "competent software engineer". And yet they make things that work perfectly fine! I'm sorry that you think handling PII or having a career (ooo) has anything to do with being a "competent software engineer".

> But we're talking about career software developers here.

I don't remember the name of this fallacy but it sucks, knock it off.


It's absolutely insane to pretend most software engineers need to do any of these things. We're making billions of dollars in aggregate without knowing this stuff. Most people aren't writing their own data compression algorithms, and we sure as shit aren't writing cryptographic algorithms because we're told to leave that one to the experts (i.e., mathematicians).

I'm guessing 80% don't even know bit-field flags and never have to use them, despite them being relatively simple. It would also take someone moderately capable at understanding math less than an hour to "learn." I learned them when I was 13 because I fucked around with MUD codebases, and I don't think I am special for it.


Do you realize that once you write code to solve a problem, that exact problem never needs to be solved again? Either you're solving new problems (and "new" can be slight -- new situation, new context, new hardware, new people, new company, whatever), or you're doing the compiler's job. If most people aren't solving new problems, then their job is bullshit. I frankly don't even understand how you can be confident that code copy-pasted from Stack Overflow actually does what you need it to do without understanding the fundamentals.

> we sure as shit aren't writing cryptographic algorithms because we're told to leave that one to the experts.

Shouldn't we all be striving to be an expert in something? If you're not working your way toward expertise, what are you doing? Why are the absolute fundamental basic building blocks for understanding how computers work and what programming languages are doing something that only "they" need to bother to learn?


Most of the problems software engineers are solving today are business problems defined by stakeholders in a business.

And yes, I agree we should be an expert in something. Harping on binary seems like a waste of time, however. I would certainly like that people are interested enough in the field that they spend their time learning as much as they can about CS, but I'm under no illusion that I'm going to automatically be more productive or better as a software engineer because I know bit-fields.

PS: Thank you for downvoting me.


> Harping on binary

We're "harping" on it because it's so basic. There are a hundred equally basic things you should also know.

> Most of the problems software engineers are solving today are business problems defined by stakeholders in a business.

Even understanding the business problems usually requires a lot of fundamental knowledge in a wide variety of subjects. Being a good software engineer is hard.

And regardless of the problem, if the solution is going to be in code, you can't get away from binary. I actually don't think most programmers should be learning assembly language, because you can actually completely abstract away from it (and you should, because you don't know which assembly language you're programming for). But you can't abstract away from addition, the alphabet, binary, strings, algorithm complexity, and other basics.

PS: I didn't downvote you. I don't downvote people for disagreeing with me. I like disagreement, and besides, it would reduce the visibility of my pithy responses! I only downvote comments that aren't even worth arguing with. So the very fact that I'm taking the time to respond means I have at least some respect for your argument.


"Knowing" binary isn't some deep, fundamental CS knowledge, it's a party trick. Knowing about different number systems in general is nice, but hardly foundational.

The actual choice of what we consider part of the "core" CS curriculum is pretty arbitrary. Why do we consider, say, binary part of that but semantics not? Would it be "gatekeeping" to say that you can't be a good programmer without a passing knowledge of, say, operational and denotational semantics? Do you really have a "complete" CS education without it?


> "Knowing" binary isn't some deep, fundamental CS knowledge, it's a party trick.

Reread my comment; I never said binary in and of itself is "deep" knowledge. I said not knowing it is a proxy for a lack of CS knowledge more generally.


> But almost undoubtedly you'll be a better programmer if you do know CS (I'm using binary here as a proxy for CS since I can't imagine having a deep knowledge of CS without knowing something fundamental like binary). How many times over those 25 years could a problem have been more efficiently solved (in programmer time or CPU time) if you had known about problem solving techniques (perhaps those related to binary knowledge for example) that you don't know from the world of CS? You'll never know...

You're both right and wrong.

Principles are a lot of cognitive debt to take on, and not strictly necessary to be functional. Literally anybody can sit down with an IDE and write code to merge spreadsheets or whatever the actual business need is. We teach this stuff to anybody with any interest. Kids, even!

If someone thinks they want to be a plumber, they shouldn't start with a 4-year commitment to learning principles of hydraulics and mechanical engineering-- makes much more sense to apprentice, lay some pipe, and if it's the job for you, then go back and learn it at a deeper level. Doing it the other way is why college is so ridiculously expensive and most spend 5 to 6 figures on a major and then go manage at Target.

After failing discrete math three times and giving up, I managed to make it almost 20 years in this industry before needing to learn anything about binary math-- and even then, it was only so I could understand how flags in old MUD code worked, for fun. Truth tables are important, but there has never come a logic problem I couldn't solve by taking a few extra steps to be explicit where you might reduce the logic to a compact blob of symbols. I'll never optimize code as well as someone classically-taught. I don't know what Big-O is-- and outside of a botched Google interview absolutely not a single employer or client has ever given a shit. Nobody has ever been victimized by my code. The "CS" way implies the academic approach is the only way to do things. It's the textbook definition of gatekeeping. All academia did was appropriate and institutionalize experiences yahoos like myself have always learned through ingenuity and trial-and-error.

You've learned more than I have, so you have more tools in your bag. The only thing that sets us apart is that I won't be advancing the industry or publishing anything on arxiv anytime soon. Outside of that, you're going to struggle with quantifying how you're better than the self-taught without resorting to speculation or guild/union mentality (gatekeeping).


> Nobody has ever been victimized by my code.

You don't know that. Actually all code that is inefficient is victimizing both the user (via performance) and the environment (unnecessary energy usage). I'm not saying you've done anything wrong, I'm just saying we all don't know what we don't know. I'm sure my inefficient code has had many users and electric bills as victims. I've released software whose subsequent versions, once I had better CS techniques, improved 100-fold in performance. I wasted my users' time before I learned how to do it better.

> You've learned more than I do, so you have more tools in your bag. The only thing that sets us apart is that I won't be advancing the industry or publishing anything on arxiv anytime soon. Outside of that, you're going to struggle with quantifying how you're better than the self-taught without resorting to speculation or guild/union mentality (gatekeeping).

You made a lot of assumptions about me without knowing my background. I was a self-taught programmer as a child/teenager and my undergrad degree was in economics, not CS. I went back to school to get a masters in CS which was difficult coming from a self-taught background. I did programming both paid and hobbyist for over a decade before that more formal education. And I write books read largely by self-taught programmers.

Saying a full software development education includes binary and the fundamentals of CS is not gatekeeping, it's a low bar if we don't want inefficient software wasting users' time and sucking energy. I'm not saying you have to start there, I'm saying it should be part of your education as a programmer whether self-taught or formally taught.


You're obligated to define exactly what you mean when you talk about "knowing" binary.


You're absolutely right, it's definitely gatekeeping.

That said, there's also a point in here that's often underappreciated. There's a big difference between someone who learns today's modern tools from nothing and someone who has a good foundation learning today's modern tools. I think it's fundamentally one of approach. The former treats coding as a trade where you just need to learn a few tools. The latter treats it as a vocation where fundamentals allow you to learn whatever tools you need.

The one makes sense short-term - it gets people to paying work with modern tools in a minimum of time. The other makes sense long-term - it keeps people in paying work with modern tools for decades.

When you're just getting started and looking to get that first job with the life-changing six figure paycheck, all that farting around with fundamentals that the gatekeepers are talking about seems like an absolutely massive waste of time.

It is gatekeeping, but gatekeeping can and sometimes does serve a purpose that is not just the purely selfish.


I've found in my years mentoring and teaching that showing students something is much better at keeping them learning and interested than "Sit down and read 30 years of outdated documentation and coding practices just so you can feel the pain and agony I had, then and only then when you've proven yourself can you spit out that hello world on the screen, filthy scum!"

Give them a codepen with modern React already bootstrapped so they can start just tinkering with it and changing things, man watch their EYES LIGHT UP at the possibilities... Every time I see this happen it takes me back to 1997 when I was first learning to build websites.


You're completely right. That's a wonderfully kind, empathetic, and compassionate approach that's incredibly effective for teaching people what kind of power they are starting to have access to.

I've found it's also one that is very expensive as measured by instructional time and energy. I've also found it relatively ineffectual for teaching fundamentals.

I do not have to lecture someone about how they are unworthy and useless to know that understanding a bit of discrete mathematics, HTTP fundamentals, DNS, or relational algebra will make them better software engineers. There are absolutely people in this world who, when learning the glories of React with a codepen, will ask how things work several times and learn big-O notation... but there are far more who won't ask but would benefit from knowing anyway.

Do you think it's perhaps possible that people benefit from both a solid grasp of the often-boring fundamentals as well as feeling the joy of tinkering?


> Do you think it's perhaps possible that people benefit from both a solid grasp of the often-boring fundamentals as well as feeling the joy of tinkering?

I don't think they were explicitly excluding the former, but rather saying it's important to get someone interested before they even become interested in learning the fundamentals.


This seems like a good context for the quote, “if you want to build a ship, […] teach them to yearn for the vast and endless sea.” I don’t think that Saint-Exupery intended to suggest that the yearning was enough on its own, but it makes the process so much more effective.


The trick is doing so without demeaning the value of basic carpentry. Which would be obviously silly in building a ship, but in computing we frequently have people looking to become software engineers without encountering or learning the fundamentals of the field.

This particular project comes from people who regard fundamentals as optional.


There is a difference between going all the way back to 1995 / reading old documentation vs learning the basics of CS.

This strawman is used in several comments. CS is not "some old knowledge you might never use". Knowing that it's way faster to search by key in a hashmap rather than iterating through a whole array is useful. Knowing why it's a bad practice not to have a primary key (and other DB knowledge) is useful. Knowing the stages of an HTTP request is useful.

You can get a job and actually do some productive work without any of that, but at some point not knowing all those basics is going to harm your work.
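For what it's worth, the hashmap point is easy to demonstrate for yourself; a rough Python sketch (exact numbers vary by machine, and the names are made up):

  import timeit

  n = 1_000_000
  as_list = list(range(n))
  as_dict = {k: True for k in as_list}

  # scans up to a million items each time
  print(timeit.timeit(lambda: (n - 1) in as_list, number=100))
  # hashes the key and jumps straight to it
  print(timeit.timeit(lambda: (n - 1) in as_dict, number=100))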


+1, and also, CS fundamentals absolutely can be learned & exercised in the use of high-level tools. One of the beauties of that knowledge is that its concepts are transferable across domains and layers of the computational stack.

Pro tip for anyone working w/ junior devs, especially those who came through bootcamps and the like: you can point them to CS knowledge without actually calling it CS.


> I don't know binary other than 1 is on, 2 is off.

Either an epic troll or a great illustration :-D


I guess the person you replied to edited their comment. Still, "1 is on, 2 is off" made me smile.


haha you caught that before I fixed it, that my friend is the result of my failure of math in my great edumacation in the US public school system hahaha.

Also it's 9am and I just woke up a bit ago.


You should have left it as it was :)


> Also it's 9am and I just woke up a bit ago.

You should try waking up at 6 AM


I woke up at 8AM Chicago time. Which is 6AM San Francisco time. Same thing.


I guess jokes are no-longer a thing in our culture?

Let me spell it out: You should have woken up at 6 AM. The error would have been even funnier.


> I guess jokes are no-longer a thing in our culture?

Nah I just think people here like to be liberal with their downvotes. "I didn't find that funny, downvote"

While I didn't get the joke, I still appreciate the attempt.


I've literally had to use knowledge of binary and number representation just last month in order to implement a lower level protocol. You may not use it in your simple CRUD app jobs but it's absolutely not "some old thing people only had to use back in the day."


I occasionally have to manipulate binary numbers too. But, aside from bitwise operations, I almost always forget what I had learned last time and have to re-read the material again.


I've gotten to use bitwise operations like twice in a 20-year career.

Trivial applications of things that might go in the first couple weeks of an algo class, about the same rate of use.

I always get a little excited when I get to use those things. Like spotting an old acquaintance in a restaurant in another city. "Hey! It's that thing they said was really important but in fact my career would basically be identical if I never learned it at all! Long time no see!"

[EDIT] Still waiting to use math past what I learned in 6th grade, aside from extremely rare use of very simple linear algebra or plugging in stats formulas that I looked up anyway because I don't trust my recollection since I use them so rarely. Doubting I'll even once need any of it before I retire, at this point. Which is great, because I've entirely forgotten all of it, on account of never needing it.


> You may not use it in your simple CRUD app jobs but it's absolutely not "some old thing people only had to use back in the day."

My point is that most jobs are simple CRUD app jobs or positioning divs on a screen, not deep-seated CS stuff.


> My point is that most jobs are simple CRUD app jobs or positioning divs on a screen...

It's really not "most jobs." Although I do agree that a CS or software engineering degree is overkill for that type of stuff.

Also, "knowing binary" is a strawman, and not a very good one. A newbie developer getting confused by bit flags isn't a big deal. Point them to Wikipedia and let them read about it.

The much bigger problem is when inexperienced developers go off and write a ton of bad spaghetti code, re-invent a bunch of wheels they never learned about, and generally just write crap because they're clueless about best practices (or even any practices at all). Now the clueless newbie is slowing down everybody else and creating a maintenance nightmare.

TBH, most new developers are pretty bad (self taught, university taught, or whatever). The important thing is having experienced people around to point them in the right direction and help them get "real world" experience.

Gate keeping isn't always a bad thing.



> I don't think you need to learn binary to learn to code. Been doing this for 25 years working at some big name companies, I don't know binary other than 1 is on, 0 is off.

I... guess?

But the full explanation of binary is only a paragraph long. At a certain point that seems like something you'd have to avoid on purpose.


> Learning binary and knowing what binary is are two separate things.

Really? Binary isn't a language like Python, it's a notation; either you understand it or you don't (you could say it's 'binary', I suppose). If you don't understand it, you don't know what it is.


> Learning binary and knowing what binary is are two separate things.

Knowing what binary is lets you generate smooth sounding sentences about it, which someone else who also only knows what binary is accepts as true.

Learning binary lets you actually exploit its properties in a solution.


So when you're reading code and you get to a bitmask... what do you do?


Let’s hope you never have to parse an archaic binary format then


Most people, in fact, do not have to do this. That’s an exceptionally rare thing to do that I would estimate a fraction of a percent of developers will ever encounter.


I would have put the number at 75% rather than a fraction of a percent. What do all these people do, these "developers" that never have to use bitstrings in python, deal with encodings, endianness, interface with hardware, etc.? Are they all UI people?


> use bitstrings in python, deal with encodings, endianness, interface with hardware

is nowhere even remotely close to

> parse an archaic binary format

The goalposts are on the other side of the field.


In my experience, most files in userspace are in an archaic binary format. And most hardware talks archaic binary.


Most developers interacting with files are going to be doing it through a higher level API, so while sure, technically they're stored like that, it's not like that's what the developer actually has to deal with.

And most developers don't interact with hardware day to day, no. That's an exceptionally small set of people.


It hurts to be called exceptionally small XD

But also... everyone interacts with hardware whenever they use a computer.

I think our difference of opinion has to do with abstraction layers here. Just so you know where I'm coming from, I work on web apps in java, python, and js, C++ desktop applications, and I operate a mars rover using various domain specific languages. Before switching to engineering I was a scientist and had to deal with data collection in the lab and field from all sorts of instrumentation which often required understanding low-level protocols and the movement of bits and bytes.

It's hard for me to imagine the world you're describing... sounds like it's full of script kiddies.


I didn't know there were computer nerds who can't at least count in binary.


And hex, or any other base for that matter. Fundamental concepts and not difficult... I mean, hopefully people learned about place value in elementary school...?


> There's a difference between being self-taught, and not even bothering to teach yourself the absolute fundamentals of computing, like binary

Been doing this for almost 30 years now, about 15 of those professionally. I think the last time I used binary was in a microcontrollers class in high school.

Even in college, yeah we talked a lot about binary, we learned the single most useful thing in my career – truth tables – and we dived deep into how CPUs toss bits around. Did we ever use binary for anything practical? No of course not, that’s what the compiler is for.

I mean I guess the real question is: What does “learn binary” even mean? Knowing it exists? That’s easy. Knowing that your code is eventually all binary? Yeah great. Knowing how NAND gates and such work? Well that’s discrete mathematics, bool algebra, quantum physics, circuit design, and whatever field of math talks about translating problems from one to another, not “binary”. Being able to calculate, by hand, numbers from binary to octal and decimal? Meh you can have a computer do that or google the algorithm when you need it. Does “learning binary” mean memorizing the ASCII table so you can read hex dumps or whatever? Maybe, doubt a lot of modern engineers still do that tho.


>Did we ever use binary for anything practical? No of course not, that’s what the compiler is for.

How to tell people you have never implemented anything performance sensitive in your life without spelling it out bluntly in a nutshell. The code of any popular image processing/decoder/encoder/compression/cryptographic libraries is littered with the use of bit operators because they operate at the fundamental building block level of computing, they are the most efficient and the "sufficiently smart compiler" that supposedly always produces the best interpretation is a lie. You merely need to skim through any implementation of jpeg, of h264 or anything that actually matters in this world to see the practical application of working on bits in the real world.

But sure, understanding computer architecture is meaningless. Trust the compiler. Thank gods I can still see stutters scrolling a web page on a 16 core CPU with 64gb of ram. I don't know how I could live my life if people actually knew how to make proper programs!


> How to tell people you have never implemented anything performance sensitive in your life without spelling it out bluntly in a nutshell

And that’s okay. Millions of engineers work on software that leverages those fine-tuned optimizations. Hundreds of engineers make them.

Plus I grew up in an era of cheap compute. Everything got faster every 18 months for free. Even now computers as a whole keep getting faster and faster despite the cap on single core performance.

Even at my relatively high level, the performance concerns I spent months learning about in 2005, just don’t matter anymore when you can get a single server with terabytes of ram and petaflops of compute.

99% of [user facing] code just isn’t performance bound these days. What does it matter if your loop takes 10ms or 20ms when the data is a 300ms network call away


I'll argue: bitwise operations are fundamentals too, but

if you aren't working close to hardware, then doing bitwise operations, for example, may be really rare.

I don't think I met a problem that required them even once during my first few years as an SE.

I had a CS degree and years of experience and I wasn't proficient with those; I needed to write it down in steps,

and then I started working closer to hardware, where those operations were common, and I learned them.

I don't even remember whether we did them in school.


Boolean logic is close to bitwise operations and it is the knowledge without which nobody should dare to call themselves a software engineer.


It's close, yet not the same.

There's a difference between "do you understand this if ladder" or "can you simplify this boolean expression"

and "clear bits 3, 25, 26, 27 of this register and set bits 25, 26, 27 to value 0b101".

I'm not saying that this is hard or something, just that the third thing is really rare unless you work close to hardware or on some other specific tasks.
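(For what it's worth, once you've seen the pattern that third thing is only a couple of lines; a sketch against an imaginary register value:)

  reg = 0xFFFF_FFFF                  # pretend 32-bit register value
  mask = (1 << 3) | (0b111 << 25)    # bits 3, 25, 26, 27
  reg &= ~mask                       # clear them
  reg |= 0b101 << 25                 # set bits 25-27 to 0b101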


I think the generalization skill that allows one to go from one to the other is essential. If someone has only some basic understanding of boolean expressions, but cannot apply this knowledge to binary numbers, this is a good characterization of their programming skills in general.


I work in networking, not particularly close to the hardware. I have colleagues who manipulate IP addresses by looking at the string representation of the address, splitting on the `.`, casting each part back to an integer, etc. Their code breaks as soon as we use a netmask other than /24 or (shock, horror!) we run in an IPv6 environment.
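For anyone curious, the standard library already gets this right; a small sketch using Python's ipaddress module and documentation-range addresses:

  import ipaddress

  net = ipaddress.ip_network("192.0.2.0/26")
  addr = ipaddress.ip_address("192.0.2.77")

  print(addr in net)                      # False: /26 only covers .0-.63
  print(ipaddress.ip_address("2001:db8::1") in
        ipaddress.ip_network("2001:db8::/32"))   # True: same code path for IPv6
  print(net.netmask)                      # 255.255.255.192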


I don't really understand the hurdle to "learning binary". It's not necessary to understand how binary works to complete a hello world program, but I think it's something that you want to get a handle on pretty quickly.

If I recall correctly, we only spent a few minutes on it in a larger lesson about different schemes for representing numbers in Comp Sci I. I don't think I've performed binary arithmetic since that class, but it's good to know how it works and I could always look up/figure out how to do those calculations again if I needed to.


It's good to know how to eat healthy, avoid processed foods, maintain healthy sleep patterns, avoid PUFAs, maintain strong mind-muscle connection in the body, communicate effectively and succinctly and empathetically with others, and many other life skills as well. And they'd all make you a healthier, happier, more robust, performant human. Which translates to better productivity. Better code.

So, should this new programmer start there, or start with binary?


Honestly, learning binary isn't something that will take any significant amount of time. This is not a tradeoff you need to make.


I mean, you definitely should be doing all that. 2 hours at the gym + supplements + cutting out processed food + getting at least 150 grams of protein a day + drinking purified water have all contributed significantly to keeping my brain active and capable at 38.


How do you know that you wouldn't have the same capabilities if you didn't do those things?


3 years ago I weighed 210, had persistent fatigue and chronic pain. Now I'm 187, and feel better than when I was in my 20's. It's made a difference.


38 here as well. Agreed on all counts except the need for supplements if you're eating grassfed beef, lamb, organs (liver, etc) and other high quality whole foods. Also 180g protein for me, since I weigh 205lbs.


> So, should this new programmer start there, or start with binary?

The new programmer should start with the life skills. As a toddler. They should also learn the alphabet and how to wipe their own ass. Why do you think this is a gotcha question?


By your own admission you have never used the knowledge you learned.

Why exactly is it “good to know how it works” if you literally have never used that knowledge? Why is it “something you want to get a handle on pretty quickly” if you don’t touch binary?

Are there places where it would come up? Most certainly. Is it required learning for every single dev out there? Highly debatable.


I've certainly benefited from knowing about floating point error. I likely would have spent a lot of time confused about why certain kinds of math kept coming out wrong without know about the underlying representation of floats, how it results in error, why this tradeoff is good for most scenarios, and other options.

The problem is that this is the sort of foundational knowledge that isn't easily gained through the learn-as-needed approach that applies to higher level things. Most people can notice when they don't know how to use a library. It's probably not obvious to most people who don't already know about it that their computer can handle numbers wrong.
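A quick sketch of the kind of surprise I mean, along with the usual mitigation for money-like values (Python's stdlib decimal module):

  from decimal import Decimal

  print(0.1 + 0.2 == 0.3)    # False: binary floats can't represent 0.1 exactly
  print(0.1 + 0.2)           # 0.30000000000000004

  # one common mitigation for money-like values
  print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))   # True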


> It's probably not obvious to most people who don't already know about it that their computer can handle numbers wrong.

0.1+0.2=0.30000000000000004 is not obvious? When a floating point error happens, it’s quite plainly obvious. At that point, someone would look up a SO article like https://stackoverflow.com/questions/588004/is-floating-point... and learn about it. And from there, FPE mitigations.

Would you get to the mitigations faster if you knew binary? Sure, the first time you ever hit it. But that seems to be bottom of the barrel optimization, IMO.


Most manifestations will be in the middle of something more nuanced than a demo statement, the people in question will often lack the vocabulary to describe what they're seeing, and it's rarely anyone's first or even fifth thought that addition is going wonky. With that in mind, I think it's perhaps a stretch to call floating point error plainly obvious.

You're right, of course. This would be only marginally faster if the person knew binary. That said, there's a large swath of very similar things that crop up where the person benefits from a familiarity with the fundamentals of computing. There's enough of these things that a reasonable person might conclude that a software engineer benefits from such a knowledge base inside their head. That way a person can benefit from all those marginal gains at once.


The example is overly simplified, sure. But even if it’s in the middle of a swath of other operations, it will still result in very similar behavior. You’re likely never to get a clean number once a floating point error happens, and the result will be slightly off and seem to not be rounded.

Searching “why is my arithmetic operation not rounded”, I got https://docs.python.org/3/tutorial/floatingpoint.html as the third answer. I obviously can’t unlearn what floating point arithmetic is, but it feels like someone without any knowledge of it would likely be able to get a similar result relatively quickly as long as they are good at searching for answers (a much more important skill, IMO, which should be considered foundational)

> That said, there's a large swath of very similar things that crop up where the person benefits from a familiarity with the fundamentals of computing.

We’re talking specifically about binary, not fundamentals in general. Some fundamental knowledge is more important than others, and I posit that binary is on the lower end of that spectrum.


IMO, being good at searching is a skill that's only really useful when you have some idea what kind of question you're looking for an answer to.

I don't think we're actually talking specifically about binary. I am treating the reference to binary as a stand-in for the mathematical fundamentals of computing, rather than a narrow comment on understanding binary and bitwise operations.


> IMO, being good at searching is a skill that's only really useful when you have some idea what kind of question you're looking for an answer to.

"why is my arithmetic operation not rounded" seems like something anyone facing the problem would ask. That's pretty much the root issue in words.

> I don't think we're actually talking specifically about binary.

I mean, that is literally what the thread you replied to was discussing.

The comment I replied to explicitly talks about binary and their use of it, the comment they replied to originally also specifically calls out the OP’s quote which talks directly about binary. You can choose to deviate from that if you want to make a point, but the thread has always explicitly been about binary and that is what we should actually be discussing.


> the OP’s quote which talks directly about binary

The OP's quote is "fundamentals of computing, like binary..." It seems reasonable for a person to think the discussion is about "fundamentals of computing" more generally than the specific example given of "binary". More precisely, the fact that a thread caught on to one specific aspect of the discussion doesn't mean that a commenter can't keep in mind the greater context.


> Highly debatable.

For the sake of debate: call me old-fashioned, but I don't ever want to rely on code from a "developer" who isn't familiar with binary notation.


How does knowing binary produce meaningfully better code? Most of us aren't working at a low enough level for it to be substantial.

You should be judging the code, not the person who wrote it.


For nearly all of the software I rely on, I haven't examined the code. I'm not in a position to judge it.

For most of that software, of course, I'm not in a position to judge the developer either; but if all I know is that the developer isn't familiar with binary and hex, then I wouldn't expect her to be competent to write, for example, a brochure website, let alone a webserver. URL encoding depends on hex. Debugging often depends on hex. Arithmetic overflow and carry are binary. Twos-complement notation for signed integers is a binary convention.
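A few of those, spelled out in a throwaway Python sketch (just to show how small these ideas are):

  from urllib.parse import quote

  print(quote("a b"))        # 'a%20b': URL encoding is hex-encoded bytes
  print(0xFF, 0b1111_1111)   # 255 255: same number, different notations
  print((-1) & 0xFF)         # 255: two's complement, -1 is "all ones"
  print(255 + 1 == 0x100)    # True: a carry out of the low byte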

I wouldn't hire a developer who couldn't explain binary notation. In fact I don't think I've ever met one like that.

As I suggested, perhaps I'm old-fashioned.


I could see there being a future course about binary on codeamigo actually. I'm not saying people shouldn't learn the fundamentals of computing, rather, it's not knowledge that's a requirement to have before building most modern applications.

Before it was "learn C before learning Python" but some people didn't love that either...I guess my point is, we've been moving to higher and higher abstractions ever since computer programming was invented, the next abstraction is probably going to be talking to an AI to write some code that you need to vet, the above is just marketing speak for that.


How are courses created? Who creates them?

>Made with in

How did all the haze treat you yesterday?


My friend/co-founder and I wrote these courses.

Thanks for asking. I didn't leave my apartment. Hope everyone who is forced to be outside for work is safe, and this passes soon.


It would be unsettling for a software engineer to have little knowledge of the fundamentals.

But software engineers aren't the only people using python. I work with data scientists - with degrees in data engineering from computer science departments in very good universities - and I am certain that they believe a computer to be a magical box. I know for sure they're terrified of binary. Honestly, I'm looking forward to the day that they actually use functions and classes properly.

I wish I was exaggerating, I really do. It'd make my life easier. And it is no surprise - I've seen the supplementary material attached to papers that come out of those departments. I won't go into too much detail, but I don't know how any codebase could more closely resemble a house of cards and still function.

They still have successful careers in what they're good at. After all, one of the main reasons that python is so successful is that it can be used by people who don't know much - or care much - about programming. It can obviously be used by far more capable hands to do many more things, but for applied tasks it takes the pain out of learning something that they consider tangential.


Realistically, one can achieve quite a lot without needing to ever think about binary, and today’s languages/frameworks explicitly enable this.

I’m not arguing that someone shouldn’t eventually teach themselves more fundamentals as they mature their skillset, but most modern languages are so many abstractions above binary that it’s more of a distraction while learning about the basics of code in the context of real world use cases.

Understanding assembly on some level is in a similar category.

I think of this more as an avenue for specialization. One need not learn these things to get started, but they may very well need these things if they want to continue their journey past a certain point.


Seconded. And I'd posit that learning CS theory and fundamentals like binary shouldn't be too difficult if the individual is savvy enough to grok the finer points of Python.


I dislike this perspective.

I taught myself programming at age 12, well before undergrad.

If I'd been forced to learn binary before I could make websites and games, I would have given up. I might have even avoided programming forever.

I think the very first thing someone should learn is the fastest path to build something that interests and delights them. Theory can come later, when they're ready to appreciate it.


You could have made learning binary a game. Learning binary isn't difficult and should only take a few days to a week of lecture to grasp a sufficient understanding.


> should only take a few days

This doesn't work for everybody!

Let people learn in the direction that interests them. The highest energy reward function first.

Once they feel rewarded, then let them learn theory. They'll have a deeper appreciation and the stamina to press deeper.


Hard pill to swallow: Doing things that are good for us that we don’t necessarily like is a part of growing up.


Forcing someone to learn something is a quick way to turn them off forever.


Yes, I can see that now that I’ve tried to teach you this point.


Unsubscribe [1].

I get your point. But you can't convince me that teaching binary before the fun parts of programming is the most effective way to bring more people into the subject matter.

It's like trying to onboard game developers with linear algebra before they play around with pygame.

Fun first, rigor once they're hooked on the magic.

[1] (Tongue in cheek, obviously.)


It's not about learning hard things before fun; it's about learning it at all. Many folks, like OP, say that they should never want/need to understand such things at all in their career. That they are superfluous.

Sure, you can do that, but you can't convince me that that's not troubling.


Don't you learn about number bases in maths lessons in school in the US? We did that in the late 60s in England.


We do, but it's taught as trivia. Applications and various "tricks" are what matter for binary in CS, and they don't teach that. We're taught about non-10 bases, but not taught why we should give a shit about them.

Same as most of the rest of primary and (especially) secondary school math, really. I doubt 1% of recent high school grads can tell you a single reason why anyone should care about quadratic equations, even though they likely spent months of their lives jacking around with them (for unclear reasons). Most of it's just taught is extremely-painful-to-learn trivia. Trig and calc get a little bit of justification & application, but not much. Stats probably comes off the best, as far as kids having even half a clue WTF they can do with it after the class ends.


> Most of it's just taught as extremely-painful-to-learn trivia.

That's weird. When I was in school in the UK (1960..1974) mathematics was illustrated with practical applications. This was especially so as we progressed into more sophisticated physics and chemistry.


You'd get examples sometimes, and (infamously) contrived word problems, but not enough and most of it wasn't at all relatable. Some whole topics were totally lacking in anything but horribly-contrived motivations, including covering bases other than 10. You might get "computers use binary!" which... OK, cool, so what?

But yes, we'd see some applications of usually limited and relatively simple math from e.g. calculus or algebra in other classes, which typically amounted to plugging values into a handful of formulas (which, to be fair to those classes, is what the vast majority of "using math" is in the adult world, aside from basic arithmetic). Not in math class, though, and not at all for many topics.


In my case, I started with the FET junction (electronics).

My first program was Machine Code, designed on a pad of lined paper, and punched directly into RAM, via a hex keypad.

These days, I write Swift. It's nice to not need to deal with the way Machine Code works, but I'm glad I learned it.

That said, everything we learn takes up NVR, so there's a strong argument to be had, against the need to learn below a certain fundamental floor.


> My first program was Machine Code, designed on a pad of lined paper, and punched directly into RAM, via a hex keypad.

Yup. That's also where I started. In fact, I still own one of these in mint condition:

https://www.hewlettpackardhistory.com/item/making-a-case/

I truly believe the low level fundamentals are crucially important. Even today, in the days of Python and Javascript.

Not to go too far, about a year ago I was working on an embedded board we designed which used MicroPython. The communications protocol required IBM CRC-16 to be calculated. MicroPython could not run this calculation fast enough. Since MicroPython allows you to insert code in assembler, I just wrote the CRC routine (and many others) in ARM assembler. The performance boost was, as one would expect, massive. Writing the code wasn't a problem at all given my background.
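For reference, the bit-by-bit version of that CRC in plain Python looks roughly like this (the reflected 0xA001 polynomial usually meant by "IBM CRC-16"; the real routine was the ARM assembler equivalent, and a table-driven version is faster still):

  def crc16_ibm(data: bytes) -> int:
      # bit-by-bit CRC-16/ARC (a.k.a. IBM), reflected polynomial 0xA001
      crc = 0x0000
      for byte in data:
          crc ^= byte
          for _ in range(8):
              if crc & 1:
                  crc = (crc >> 1) ^ 0xA001
              else:
                  crc >>= 1
      return crc

  assert crc16_ibm(b"123456789") == 0xBB3D   # standard check value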

Having this knowledge also allows you to reach for optimizations that someone who has only seen high level languages and (as posters elsewhere in the thread have revealed) don't understand low level code or even binary. A simple example of this is that comparing to zero is faster than comparing to a value. The simple explanation being that you have to juggle registers and perhaps even read and store that value from memory every time you go around a loop. All processors have the equivalent of a JZ and JNZ (jump if zero, jump if not zero) instruction, which does not require anything to be fetched from memory or register swapping. Of course, this is processor and task dependent.

And then I wonder about such things as DeMorgan, bit-wise operations, masking, etc. I was surprised to read some of the comments on this thread about people not being comfortable with binary beyond 1=true and 0=false. I don't understand how someone can develop non-trivial software without having a solid foundation. I mean, even for CSS you should understand what "FFC21A" or "3FE" means, where it comes from and how to work with these numbers.


The pathways formed by all those fundamental and low level operations create capabilities and reasoning strategies that are very valuable. Perhaps they are even unachievable in other ways.


> Today's developers didn't learn binary before learning Python

I didn't learn binary before Python. I learned Lua before Python. But now I'm messing around with low-level C++, and Rust.

Where you started doesn't necessarily have anything to do with where you are now.


I genuinely think this is a UK/US thing (limiting to those countries because that's the limited experience I have); in the UK this is absolutely true. There is a definite cultural thing going on where people play down how much effort they put in to things. It's definitely 'uncool', throughout school, higher education and in the workplace, to be super keen.

Contrast to the US: American colleagues I've had have all agreed that it's kind of the reverse on that side of the pond. I've heard people brag about the hours they've put in to work; that would be a complete embarrassment over here, especially if the work turns out not to be top-tier.


Well this is wildly wrong. Scales, arpeggios, etc. are taught not just for the theoretical understanding they help with but because they occur VERY frequently in actual pieces

