I'm always baffled by how many software engineers with CS degrees don't understand IEEE 754 floats. I had a coworker who bugged the shit out of me by claiming in several meetings that Macs had a "bug", when really he just didn't understand floats. Recently I heard someone else make the same claim about Ruby. WTF are they teaching people in CS programs?
They're teaching them that floats and doubles are numbers with decimal places, and that's where most stop. There's a reason articles like "Myths programmers believe about floating point" are written and usually get highly upvoted.
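For anyone wondering what the supposed "bug" usually looks like, here's a minimal sketch in Python; the same behavior shows up in Ruby, on Macs, and anywhere else IEEE 754 doubles are used, because it's a property of the format, not the platform:

    import math

    # 0.1, 0.2, and 0.3 have no exact binary representation,
    # so the nearest representable doubles are used instead.
    a = 0.1 + 0.2
    print(a)         # 0.30000000000000004
    print(a == 0.3)  # False -- this is the moment people file the "bug"

    # Comparing with a tolerance is the usual workaround.
    print(math.isclose(a, 0.3))  # True

The surprise isn't that the math is wrong; it's that decimal fractions like 0.1 can't be stored exactly in binary, the same way 1/3 can't be written exactly in decimal.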