>"The reason for wanting a definition for zero divided by zero is given by Falkoff and Iverson in The Design of APL as "it was deemed desirable to avoid an error trap in this case."
I think both decisions are crazy. Any calculation that leads to division by zero indicates a conceptual problem, and the user should be alerted to it by an explicit error, not by a silently returned value. I suppose this is mitigated by the fact that these languages are meant to be used interactively, by people who know what they are doing.
As an alternative to the expensive, proprietary implementations of APL, one can also try NARS2000, a free, open-source implementation of ISO/IEC 13751 Extended APL.
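For the record, here is what this looks like at the keyboard. The transcript below is from a Dyalog APL session (I would expect NARS2000 to behave the same way, but I have not verified it, and the exact error message varies by implementation): only the 0÷0 case gets the silent definition, while dividing a nonzero number by zero still signals an error.

          0÷0        ⍝ the case Falkoff and Iverson defined away
    1
          1÷0        ⍝ nonzero divided by zero is still trapped
    DOMAIN ERROR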
In practice I find that division by zero usually turns up in some edge case that the default value handles sensibly, since in APL you are typically working over whole lists of numbers at once, and it doesn't really come up that often.
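For example, a typical (made-up) case: elementwise ratios over two lists where one pair happens to be 0 over 0.

          3 0 5 ÷ 6 0 10
    0.5 1 0.5

The 0÷0 pair quietly comes out as 1; whether that is the right edge value depends entirely on what the ratios mean.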
But which is preferable, 0 or 1? Maybe something in between, or maybe even something surprising like -1/12? Which choice will lead to the least error in the long run?
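There is a case for each: 1 preserves the identity x÷x, while 0 preserves 0÷x. For what it's worth, you don't have to commit up front: Dyalog, at least, lets you switch between the two via the ⎕DIV system variable (the default, ⎕DIV←0, gives the behaviour quoted above).

          ⎕DIV←1     ⍝ now x÷0 is 0 for every x, including 0÷0
          0÷0
    0
          1÷0
    0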