A bit off-topic, but I want to know: is binary code (0s and 1s) still used today in programming/coding? And for what applications?



Maybe not for general-purpose computing. I've used it for on-the-fly code generation (hacking display rotation into the Windows 3.x BitBlt engine) and for programming special-purpose media accelerators. In both cases you end up creating a bunch of convenience #defines or macros that generate the bits, which immediately takes you back into tiny-language territory rather than pure machine code. The relative ease of creating new programmable hardware in FPGAs is another place this might come up.
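
A minimal sketch of what those convenience macros tend to look like, assuming a hypothetical accelerator whose 32-bit instruction word packs an 8-bit opcode, two 4-bit register fields, and a 16-bit immediate (the layout, field names, and opcode value are all invented for illustration):

    #include <stdint.h>

    // Hypothetical instruction layout: [opcode:8][dst:4][src:4][imm:16]
    #define ENC_OP(op)  ((uint32_t)(op) << 24)
    #define ENC_DST(r)  ((uint32_t)((r) & 0xF) << 20)
    #define ENC_SRC(r)  ((uint32_t)((r) & 0xF) << 16)
    #define ENC_IMM(v)  ((uint32_t)((v) & 0xFFFF))

    // Emit a (made-up) load-immediate instruction into a code buffer.
    #define EMIT_LOADI(buf, dst, v) \
        (*(buf)++ = ENC_OP(0x01) | ENC_DST(dst) | ENC_IMM(v))

Once you have a page of these, you are effectively writing in a tiny assembly language, just without an assembler.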


This seems like a strange question. All computing today uses binary code. We often write things in other bases for convenience, but at the hardware level, it is nothing but ones and zeroes.

But maybe you are asking about uses of individual bits. A maximally compact representation of a set drawn from a fixed universe of possible members uses one bit position in a large-enough integer type for the presence or absence of each member. Bitwise AND and OR then correspond to set intersection and union. C++ provides std::bitset for this use. Useful such sets include days of the week or month, and letters of the alphabet.
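
A minimal sketch with std::bitset, using days of the week as the fixed universe (the enum and the set contents are just for illustration):

    #include <bitset>
    #include <iostream>

    enum Day { Mon, Tue, Wed, Thu, Fri, Sat, Sun };  // bit positions 0..6

    int main() {
        std::bitset<7> open;                  // days a shop is open
        open.set(Mon).set(Wed).set(Sat);

        std::bitset<7> weekend;
        weekend.set(Sat).set(Sun);

        std::bitset<7> open_weekend = open & weekend;  // intersection: {Sat}
        std::bitset<7> either       = open | weekend;  // union: {Mon,Wed,Sat,Sun}

        std::cout << open_weekend.count() << " open weekend day(s)\n";  // 1
        std::cout << either.count() << " day(s) in the union\n";        // 4
    }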

The article mentions chess, where you might have a 64-bit word to represent the positions of (say) all the pawns on the board. Fairly simple bitwise operations identify all the positions those pawns threaten.
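
A sketch of that trick, assuming the common little-endian square numbering (a1 = bit 0, b1 = bit 1, ..., h8 = bit 63); the function name is mine, but the shift-and-mask pattern is the standard bitboard idiom:

    #include <cstdint>

    // File masks stop a pawn on the a- or h-file from "wrapping around"
    // to the opposite edge of the board when shifted.
    constexpr uint64_t FILE_A = 0x0101010101010101ULL;
    constexpr uint64_t FILE_H = 0x8080808080808080ULL;

    // Every square attacked by at least one white pawn in `pawns`.
    uint64_t white_pawn_attacks(uint64_t pawns) {
        uint64_t up_left  = (pawns << 7) & ~FILE_H;  // captures toward the a-file
        uint64_t up_right = (pawns << 9) & ~FILE_A;  // captures toward the h-file
        return up_left | up_right;
    }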

Modern symmetric cryptographic primitives are often built principally from bitwise operations, including shifts and rotations.
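
The ChaCha quarter-round (RFC 8439) is a good illustration: it mixes four 32-bit words using nothing but addition, XOR, and rotation:

    #include <cstdint>

    // Rotate left; n is always 7..16 here, so no undefined shift-by-32.
    static inline uint32_t rotl32(uint32_t x, int n) {
        return (x << n) | (x >> (32 - n));
    }

    // One ChaCha quarter-round: add, XOR, rotate. No table lookups or
    // data-dependent branches, which helps it run in constant time.
    void quarter_round(uint32_t& a, uint32_t& b, uint32_t& c, uint32_t& d) {
        a += b; d ^= a; d = rotl32(d, 16);
        c += d; b ^= c; b = rotl32(b, 12);
        a += b; d ^= a; d = rotl32(d, 8);
        c += d; b ^= c; b = rotl32(b, 7);
    }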



