And it maybe opens the possibility of building more efficient hardware. Floating point units take up a lot of transistors (on the order of ~100K AFAIK), whereas boolean logic is tiny.
Going to 1-bit precision is probably overkill, but papers have shown that neural networks running at 8 bits of precision on simple integer ALUs can match the results of full FPUs (rough sketch of the idea below).
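Here's a minimal sketch of what that looks like in software, assuming simple symmetric per-tensor quantization; the quantize_int8/int8_matmul names are just illustrative, not from any specific paper. The point is that the bulk of the work becomes int8 multiplies with int32 accumulation, and a single float scale recovers an approximation of the full-precision result.

```python
import numpy as np

def quantize_int8(w):
    """Map a float array to int8 plus a per-tensor scale factor."""
    scale = np.max(np.abs(w)) / 127.0               # largest magnitude maps to 127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def int8_matmul(x, w):
    """Approximate x @ w using int8 operands and integer accumulation."""
    qx, sx = quantize_int8(x)
    qw, sw = quantize_int8(w)
    acc = qx.astype(np.int32) @ qw.astype(np.int32)  # cheap integer ALU work
    return acc * (sx * sw)                           # dequantize once at the end

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 64)).astype(np.float32)
w = rng.standard_normal((64, 16)).astype(np.float32)

exact = x @ w
approx = int8_matmul(x, w)
print(np.max(np.abs(exact - approx)))  # small error relative to the output scale
```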