How did anyone do math in Roman numerals? (2017) (washingtoncitypaper.com)
128 points by pmontra on July 27, 2020 | 124 comments



In An Introduction to Mathematics (1911) Alfred North Whitehead wrote:

By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and, in effect, increases the mental power of the race. Before the introduction of the Arabic notation, multiplication was difficult, and the division even of integers called into play the highest mathematical faculties. Probably nothing in the modern world would have more astonished a Greek mathematician than to learn that ... a large proportion of the population of Western Europe could perform the operation of division for the largest numbers. This fact would have seemed to him a sheer impossibility ... Our modern power of easy reckoning with decimal fractions is the almost miraculous result of the gradual discovery of a perfect notation. [...] By the aid of symbolism, we can make transitions in reasoning almost mechanically, by the eye, which otherwise would call into play the higher faculties of the brain. [...] It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilisation advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.

John Allen Paulos discusses the power of notation in his book Beyond Numeracy:

A German merchant of the fifteenth century asked an eminent professor where he should send his son for a good business education. The professor responded that German universities would be sufficient to teach the boy addition and subtraction but he would have to go to Italy to learn multiplication and division. Before you smile indulgently, try multiplying or even just adding the Roman numerals CCLXIV, MDCCCIX, DCL, and MLXXXI without first translating them.

Numbers may be eternal and invariant, but numerals, the symbols used to represent numbers, are not, and the above anecdote illustrates how easy it is to take for granted the Hindu-Arabic numerals we use today. The history of numeration systems is a long one extending from prehistoric times to the adoption in the Renaissance of our present system. The heroes of the story are the nameless scribes, accountants, priests, and astronomers who discovered the principles of representing numbers systematically.


Paulos in the quoted snippet is arguing against a straw-man version of the past, set up as a foil to prove some unrelated point, rather than taking it on its own terms and trying to understand it. This is a common conceit of modern writers, but it does a disservice to readers.

Roman numerals by and large were not used for calculating, but for recording calculations. Those calculations were done (as the very word “calculate” suggests) using pebbles or other tokens, on some kind of counting board.

The point of Roman numerals is to be as direct as possible a representation of the state of the counting board. They are a serialization format.

Think of it as the JSON (or s-expressions if you like) of the ancient world. You don’t run your algorithm by writing and rewriting JSON literals over and over with pen and paper. They are just a record; your computation is done using a different automatic tool.

When someone writes down the state of a graph structure as a big nested JSON literal, you don’t laugh at them for having a uselessly unwieldy written expression.


Nice quotes. The first one is how I think about programming languages as well. We've moved from binary/punchcards, to assembly, to imperative and declarative layers, to OO/functional code that can be statically analyzed, and systems designs that can be automatically verified. There's all sorts of memory management tools from manual allocation and freeing, to garbage collection and reference counting, and optionals to avoid nulls that can be statically checked. Each thing has solved entire classes of problems and we're getting more and more software solving different kinds of problems, faster.


Whiggish bullshit. Functional code and GC were invented in 1959. Some level of static typing was de rigueur in most application-development languages after assembly and before the scripting boom starting in the late 80s. In Coders At Work, Frances Allen bemoaned the effect C's popularity has had on automated program analysis since 1970:

> C has destroyed our ability to advance the state of the art in automatic optimization, automatic parallelization, automatic mapping of a high-level language to the machine.

"Progress" in CS remains fad-driven pop culture churn.


So Rust, Python, Julia etc are no more productive, safe, or easy to use than programming was in 1959?


The question is not whether any particular language you might pick today is better than any other language you might pick in 1959, but whether there is some kind of teleology or "progress" to which languages are aspiring or at least slouching.

GC (and "memory safety" more generally) was not invented to solve the problems of C after C somehow revealed them solving the problems of e.g. Fortran. C variously sidestepped and ignored the work on program analysis including GC and memory safety for various commercial, aesthetic, and incidental reasons. Similar things are the case for C++ (vs. e.g. Object Pascal / Simula), Objective-C and Swift (vs. Smalltalk and Self), JavaScript and PHP (vs. nearly everything).

Lisp from 1959 stacks up incredibly well against Python today. Fortran still autovectorizes better than most modern languages. Pascal remains better to teach structured programming, we just don't teach that much anymore (and you can tell just by grabbing a half dozen loops at random and trying to figure out how well their conditions capture their invariants). Languages don't get better over time. They do get marketing budgets unimaginable before the 90s ("thanks" largely to Sun and Java for kicking this off), and for the past 20 years or so weird personal identity arguments on top of that (probably somehow Perl's fault).


Rust maybe, Python definitely not, and Julia I don't know a lot about.


How is Python definitely not "safe, productive and easy to use"? It has faults, but these three are not among them.


This is absolutely the case for programming languages, from Haskell to Rust. Programming languages shape the way we think and reason about the problem. Sometimes a language does this so well that it leaves us staring into the abyss, facing the problem domain dead on—after all the side quests have been removed. Once you can delegate away all the busy work, you can focus on bigger and better abstractions.


Both Haskell and Rust are poor examples because one has to focus more on the notation (i.e. types, declarations, etc).

Python is better.


Types in Haskell are generally used to describe the structure of the data. Defining a type in Haskell is like writing documentation and a test all in one. I can mentally offload certain concerns to the compiler, which will tell me whether or not the shape of the data in my head matches the code I'm writing. I pretty frequently use the compiler to guide my development, especially with a tool called ghcid, which gives instantaneous feedback as I'm programming. It's also like a million times easier to refactor: I tweak a type signature and the compiler will basically give me a list of things I need to update. No need to hunt for and test every instance of that kind of input; the compiler just tells me where it is, and what is wrong.

You also don't have to write out type signatures 99.9% of the time after declaring a type, the compiler can infer them. People in the Haskell community tend to write out signatures for functions because it makes it easier to understand, but it's pretty rare to see a type signature in the body of a function.

That's not to say thinking in Haskell is always as natural or more natural than Python, but I'd argue that immutability and laziness more than types can require a lot more mental effort that can feel like fitting square pegs into round holes for certain problems. When it comes to types (at least Haskell vs. Python), I'd say it's much more a matter of personal taste.


That's not relevant unless you can do all of the above instinctively and without involving the higher faculties of the brain.


You had to learn 0,1,2,3, etc. at some point. The thing with notation is that it takes time to learn, but it proves profoundly useful over time. The benefit of notation integrated over your career outweighs the adoption costs.

Arabic numbers are like this (no child immediately ‘groks’ Arabic numbers), and Haskell notation is making the same case. Haskell abbreviates abstract structures so that they become fluid.

If focusing on notation is a problem, then we should all have grown up to use our fingers for counting, not Arabic numerals.


You have to remember the context in order to understand my reply. Here it is once again "It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilisation advances by extending the number of important operations which we can perform without thinking about them."

First of all programming without thinking about it (i.e. instinctively) is arguably not even possible. But let's be generous and assume that it is somehow.

Are you seriously trying to argue that it's possible to program in a complex programming language which not only requires types, but almost requires encoding complex relationships in them without involving the "higher faculties of the brain"? This is such an extraordinary claim that it requires evidence to even consider it.


Imagine I'm consuming code written by someone else, without looking deeply into its implementation. Would I now be disadvantaged by careful and explicit typing? How much time is spent consuming code versus writing it, of course, is a bit situational.


In Python you have to focus more on debugging where a run-time type error came from. Also, Haskell has type inference, unlike the other two.


Python merely allows you to program quite close to a state where you focus on the problem and less on the infrastructure and syntax required to solve the problem. At least in exploratory phases and when working with smaller pieces of code.

Other languages almost always require more thinking and design upfront, even when doing exploratory programming.


> and less on the infrastructure and syntax required to solve the problem

I do not think that this is true for python any more than it is true for haskell for example.


I’m not sure. I find this in people that say JavaScript is better than TypeScript. Most people that learn TypeScript or Flow learn to love it even more than JavaScript once they get over the initial hurdles. Why? It’s more self documenting. It actually provides better auto completion. It scales much better for larger Frontend projects. Going without it is almost painful to me now. I now know a bit better a few of the big reasons why a lot of devs hate JavaScript. Because they are used to the great properties, tools, and features that come from having a well typed language. Languages seem to have big hurdles for no reason when it comes to their type systems, but those hurdles pay off.


Which one requires more focus on the notation?

    def square(x):
        return x * x

    square x = x * x


I wonder if arabic numerals really are the best system.

Too much inertia is invested in them now but I wonder if a yet easier representation exists. For example, I know that there was a brief push to use quaternions in physics.


Base 12 is likely better than base 10. Twelve is the number of phalanges on your hand that you can touch with your thumb, so if that counting system had caught on and stuck we'd likely have a better base for divisibility.


If you add the fingertips you can easily work in base 16.


Incidentally there are 12 inches in a foot.


Don't you mean the other way around


Cool idea.


Adding Roman numerals, at least, isn’t hard. If it was I doubt Roman numerals would have ever lasted. Arabic is still, in my opinion, easier to add—from the perspective of a lifetime spent exclusively doing arithmetic in Arabic numerals—but it’s not much easier until you’ve memorized all 50 unique sums of one-digit numbers. Multiplication, though, that’s the real difference maker.


> Adding Roman numerals, at least, isn’t hard.

I find it insanely difficult - but as you note, a lifetime of arabic numerals, and a lack of skill in appropriate tools (eg. an abacus) will skew that comparison.

> ... but it’s not much easier until you’ve memorized all 50 unique sums of one-digit numbers.

This doesn't feel right.

I don't think I have memorised the sum of all pairs of 1-digit numbers - but contemplating this now, it's impossible for me to be sure. I'm not sure where multiplication kicks in for breaking down larger numbers into quotient and divisor for me, let alone 'most people'.

Given the maximum value (sum) of two 1-digit numbers is 18, a naive assumption is the permutations don't really number 50 (I get 45) - and given that Roman numerals didn't have a zero, a fairer comparison would be [1-9][1-9] (36 unique combinations).

Either way, in any counting system there's presumably a similar 'memorisation' gate you have to pass for the fundamental set. With Roman numerals there were historically 7, I think - I, V, X, L, C, D, M - and summing those wouldn't be anywhere near as straightforward as summing sets of single digits, so I don't think the comparison of this requirement is as skewed against Arabic numerals as you suggest.


I was going off the rough estimate that 0-9 is 10 digits and 10x10 is 100 but that reduces to 50 thanks to commutativity. I think 45 if you don’t count zeroes—I don’t know where you get 36 from.

I also use the term “memorize” pretty loosely—I remember that memorizing times tables was a thing but not so much for plus—but addition is simple enough that most people can kind of intuit what 7+4 is, heuristically, if they’re sat down and forced to do arithmetic as small children for long enough. (Also I’ve never had the patience for memorization; I just rely on my brain to cache things that I use frequently and it ended up working for times tables. Also other things.)

But I do want to acknowledge that Arabic numerals make multiplication nearly as easy as addition, which is a staggering achievement over Roman numerals.

Though I will say, on the other hand, that the Romans weren’t that stupid and neither were their medieval successors prior to the adoption of Arabic numerals. They could add things up and we’ve discussed in a parallel thread how that algorithm works. The Roman numeral system isn’t as optimized informationally—let’s not underestimate the sheer awesomeness of seamlessly expressing numbers as large as 108730026190037365462849562635965—but that would be useless to most cultures that used Roman numerals.

I would even question one more thing. To someone who doesn’t know a numeral system to begin with, do Arabic numerals actually make addition harder? I mean, very small children (and programming languages like JavaScript if you accidentally express one number as a string) sometimes make the mistake of thinking 11+8=118, but in Roman numerals that’s just like saying XI + VIII = XIVIII, which is also wrong, but not as wrong as 118. A Roman child could easily be taught no, that’s XVIIII since V’s go before I’s, and then maybe reduce to XVIV. A child today is like, “wait wtf are places?” Roman numeral users never have to learn the concepts of places, carry, or borrow, which honestly sounds like a good trade off for a civilization that doesn’t have to do multiplication and division that easily.


I think I'd calculated combinations, but dubiously ignored where both were identical - that's where I ended up with 36.

OTOH do we memorise any of the x+1 combinations? I hope we don't, but perhaps we do. I genuinely can't say at this point. I was trying to work out how I processed sums such as 8+7 earlier, and concluded that, as far as I can tell, I work out the difference of one of those numbers from 10, subtract it from the other, then it's a very simple addition - ie that becomes 10+5. But I'm now unsure if that's what I do as a general rule, and am even less sure what other people may do.

Times table I vaguely recall learning by rote in formative school, but that's an awfully long time ago, and trying to self-analyse my mechanisms for multiplications is highly challenging. It feels like I try to move those back to multiples of 10 or 100, again, too.

I recall reading aeons ago that the only intuitive interface is the nipple - beyond that, everything is learned. So what makes for an intuitive or sensible mathematical representation of things is probably so arbitrary as to be pointless arguing about. It feels that the kinds of things we do, day to day, with numbers, that base-10 arabic number system is optimum, but that may simply be the lack of exposure to a better system.


As a parent of young children and former math teacher, yes, we do absolutely memorize those sums. It usually happens at an early enough age that the process of doing so evaporates early in life and becomes part of your base mental code. I spent a significant part of my teaching life teaching high school math for college students (and occasionally grade school math for college students) and there are very much people who never managed to get that memorization step completed. What's really fascinating is that it's an orthogonal skill to higher mathematics. I've seen students who needed to use a calculator to do 6+5 and yet managed to be able to solve algebraic problems. This is, I must add, uncommon, but it's less because one skill depends on the other and more because the failure to gain the basic math skill leads to an unwillingness to try to gain the more abstract math skill.


I agree that Arabic numerals are better, but I did want to make the point that Roman numerals aren’t quite as bad as you might think having already learned Arabic numerals.

Your method for summing 8+7 is what I think I do for things like 7+4 (since I can visualize 7 as “three less than 10” and 4 as “one more than three” all in the same thought to reach 11), but for 7+8 my brain noticeably spits out 15 immediately and only a moment later does it actually do the processing you mention.


This is only tangentially related, but until now I never really thought about how I sum single digit numbers, but it's not by having memorized all combinations. I'd say I have all combinations that sum to 10 or less memorized, which adds up to 25 unique combos (plus the rule that number + 0 = number).

If I can tell it sums to more than 10, I break it up mentally into [larger number] + [smaller number] = [larger number] + ([smaller number] - remainder) + remainder, where [larger number] + ([smaller number] - remainder) = 10
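That decomposition is mechanical enough to write down. A minimal Python sketch of it (my own rendering, purely illustrative):

    def add_single_digits(a, b):
        # Make-ten strategy: complete the larger addend to 10, carry over the rest.
        larger, smaller = max(a, b), min(a, b)
        if larger + smaller <= 10:
            return larger + smaller        # one of the memorized small sums
        to_ten = 10 - larger               # part of the smaller addend that fills the gap to 10
        remainder = smaller - to_ten
        return 10 + remainder

    print(add_single_digits(8, 7))  # 15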

You could break it down further by just memorizing what each digit less than 10 is when you add 1. Then you can do addition like 5 + 4 = 1 + 1 + 1 + ... = 2 + 1 + ... = 8 + 1 = 9. Then you'd only have to memorize 9 things (10 if you include 0). I guess this assumes that you know the order numbers go in though, whereas memorizing all of the combinations doesn't require that.


Honestly I feel like I do the same thing when I think about it, but it’s so well cached that whenever my brain sees 5+7 or 7+5, I simultaneously think “12”, forget the specific order, and imagine 5+5=10 with 2 overflow all at once and I can’t honestly tell you which concept arrives first.

And yeah, you can get by with less memorization and more counting, but eh...either you’re taught not to do that or you do enough arithmetic that your brain just caches the whole table eventually anyway.


Please explain the easy method for adding, for example MCMLXVII and LXV. I mean that seriously, I'm curious what the trick is.


Tokenize, concatenate, sort, reduce.

MCMLXVII + LXV

M CM L X V I I + L X V

M CM L X V I I L X V

M CM L L X X V V I I

V+V=X, L+L=C, C+CM=M so MMXXXII. Convert everything to decimal if you wanna check my work.

Note that converting CM to DCCCC is actually pretty unnecessary since you can just combine CM and C to make M instead of having to count up lots of C’s. A computer algorithm would be simpler by reducing CM to DCCCC, but adjusting for human fallibility, allowing for CM + C = M makes things a little easier.

Also note that this method scales to any number of sums, not just adding two numbers together.
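If you want to play with it, here's a minimal Python sketch of that tokenize-concatenate-sort-reduce procedure (my own code, assuming the standard symbol values and subtractive pairs; obviously not anything the Romans ran):

    VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
    SUBTRACTIVE = {"CM": "DCCCC", "CD": "CCCC", "XC": "LXXXX",
                   "XL": "XXXX", "IX": "VIIII", "IV": "IIII"}
    COMBINE = [("IIIII", "V"), ("VV", "X"), ("XXXXX", "L"),
               ("LL", "C"), ("CCCCC", "D"), ("DD", "M")]

    def add_roman(*numerals):
        # expand subtractive pairs so every symbol is purely additive
        expanded = []
        for n in numerals:
            for pair, repl in SUBTRACTIVE.items():
                n = n.replace(pair, repl)
            expanded.append(n)
        # concatenate all the symbols and sort them largest-first
        result = "".join(sorted("".join(expanded), key=lambda s: -VALUES[s]))
        # keep combining runs of equal symbols into the next symbol up
        changed = True
        while changed:
            changed = False
            for run, repl in COMBINE:
                if run in result:
                    result = result.replace(run, repl, 1)
                    result = "".join(sorted(result, key=lambda s: -VALUES[s]))
                    changed = True
        return result

    print(add_roman("MCMLXVII", "LXV"))  # MMXXXII

The output stays in additive form; rewriting it with the subtraction rule (e.g. DCCCC back to CM) would be a final cosmetic pass.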


I think originally Roman numerals didn't have shorthands like IV instead of IIII. In that case, to add Roman numerals, you just write the letters all together, sort them, and combine smaller digits into larger digits as necessary.


MCMLXVII+LXV = MCMLLXXVVII = MMXXXII

Addition in Roman numerals is dead simple: you do it by just bunching symbols together. The only difficulty is subtractive notation, which wasn’t really used in Ancient Rome.


  MCMLXVII + LXV
   = MCCCCCCCCCLXVII + LXV (canonicalize)
   = MCCCCCCCCCLXVIILXV    (concatenate)
   = MCCCCCCCCCLLXXVVII    (sort)
   = MCCCCCCCCCLLXXXII     (combine, VV => X)
   = MCCCCCCCCCCXXXII      (... keep combining, LL => C)
   = MMXXXII               (... C{10} => M, nothing left to combine)
   = MMXXXII               (optionally, look for ways to re-write with the subtraction rule)


I originally thought canonicalize was an important step, but it actually isn’t for humans. For humans, CCCCCCCCC requires a lot more tedious counting than CM or even DCCCC, leading to more errors than simply allowing the human to notice that CM+C=M.


I think the practical way of thinking would be:

- seeing L + L = C and then C + CM = M (basically striking the two Ls and turning CM into M)

- then V + V = X (striking the Vs and adding an X)

Then write what's left:

MMXXXII

Then try to re-write.


Amazing, the nth-level effects a superior notation can have on problem solving—the same consequences as a higher-order programming language.


Thank you for the quotes. I think this essentially answers (or at least provides a hypothesis for) the question I posted about how the numerical system influenced their math.


I'm reading a book called Mathematics from the Birth of Numbers by Jan Gullberg and it covers these systems from various cultures in depth. Great book and easy to read even for the non mathematician. It reads like a cross between a history book and a technical reference. This is literally the first topic in the book.


> Our modern power of easy reckoning with decimal fractions is the almost miraculous result of the gradual discovery of a perfect notation.

Great quote. I wonder if this process will continue with adoption of duodecimal numbers.


I feel like the article misses the most interesting question about Roman numerals and Roman (Greek really) math. How did the numerical system influence the math that they developed and used?

The Greeks were really into geometry using the compass and straight edge so they actually did a lot of math without really needing numbers at all. They viewed calculation as less worthy of mathematicians and my understanding is that we don't have a lot of evidence for how merchants and engineers did basic calculations since most of the great Greek math texts ignored it.

Algorithms and algebra probably existed in some informal way but they weren't really formalized until the Arabs did it with the help of Arabic numerals.

So, while you can do some calculations in roman numerals or using an abacus, the interesting question to me is: Did the Greeks (and the Romans) not develop algebra or use Arabic numerals because they weren't that into numbers as compared to geometry? Or was it the other way around? Did the clumsiness of doing calculations in Roman numerals keep them from developing more complex systems of numerical calculation?

I'm not an expert on the subject at all but it's always interested me. It makes me think of Bret Victor's Media for Thinking the Unthinkable (http://worrydream.com/MediaForThinkingTheUnthinkable/)


>Did the clumsiness of doing calculations in Roman numerals keep them from developing more complex systems of numerical calculation?

The book Mathematics for the Million[0] suggests a number of limitations were created by the Roman numeral system, largely due to the difficulty of division, making infinite series and even large (and extremely small) numbers difficult to work with, intuit, and even see outright. As a specific example, it suggests the Achilles and the tortoise paradox[1] is trivially intuited and resolved in the decimal system, whereas no relationship between each division is made clear in Roman numerals.

[0] https://archive.org/details/HogbenMathematicsForTheMillion/p...

[1] https://en.wikipedia.org/wiki/Zeno%27s_paradoxes#Achilles_an...


> The Greeks were really into geometry using the compass and straight edge so they actually did a lot of math without really needing numbers at all. They viewed calculation as less worthy of mathematicians and my understanding is that we don't have a lot of evidence for how merchants and engineers did basic calculations since most of the great Greek math texts ignored it.

Greek society was awash with arithmetical calculations. It wasn’t written about probably mostly because it was considered so obvious and commonplace. (Though it’s also a bit hard to say quite what was written about, since we’ve lost the vast majority of the books from the time.)

http://worrydream.com/refs/Netz%20-%20Counter%20Culture%20-%...

* * *

> Did the Greeks (and the Romans) not develop algebra or use Arabic numerals because they weren't that into numbers as compared to geometry?

Algebra and arabic numerals really took off with the introduction of cheap paper and printed books.

It’s really hard to transmit the oral culture of skilled use of a counting board via printed book (we might call it tacit knowledge), but you can pretty straight-forwardly print out a pen-and-paper arithmetic algorithm.


> but it's always interested me.

There's Morris Kline's Mathematical Thought from Ancient to Modern Times which is 3 glorious fat volumes of just this stuff.


The Greeks had several competing numeral systems (https://en.wikipedia.org/wiki/Greek_numerals) including a decimal based system. What I do not get is how come the Romans adopted (if they adopted from the Greeks) the most unwieldy one.


"Did the clumsiness of doing calculations in Roman numerals keep them from developing more complex systems of numerical calculation?"

Probably? I mean, look what the world achieved after it left roman numerals behind.


> look what the world achieved after it left roman numerals behind.

Look what the world achieved after we started wearing button shirts.


Perhaps it would be better expressed as "Look at what mathematics achieved after it left Roman numerals behind."


Nope. Still an unproven correlation.


> Probably? I mean, look what the world achieved after it left roman numerals behind.

Yet the Romans were able to construct aqueducts that are still standing, and a road network spanning thousands of miles, and many other great feats of civil engineering.


It's pretty likely that architects of the time understood place-based arithmetic and used calculators similar to an abacus, even if society as a whole used Roman numerals. The tools a layman uses are sometimes different from the ones a professional uses.

We know that other civilizations in the region were adept with arithmetic and geometry. And Rome's straight roads and aqueducts are evidence that their citizens understood the practical applications of such mathematics. So it stands to reason that someone in the process understood how to perform arithmetic using a place-based notation, even if they were only generating calculation tables used by field engineers.

I'm sure people devised clever tools that allowed builders to actually build these structures. Much like a roofer today doesn't need to perform any calculations beyond measurements, because they are taught how to use a speed-square to quickly find the correct angles for cutting rafters.


Plus Eratosthenes and Cleomedes got pretty close on calculating the Earth's circumference.


Sure. But they did those things by experience and rules of thumb. They didn't do a real stress analysis on those aqueducts, for instance.


For those interested in reading further, the ideal (unloaded) shape of an arch isn't a semicircle, but a catenary.

It sounds so simple: so hangs the chain, stands the arch. Took until Hooke in the 17th century before that was written down, though there are earlier (15th century) examples in architecture.

The Romans were still working on the Greek ideology that the circle was the perfect shape. Not to belittle what they did, but the key advances were really in concrete and having an authoritarian empire giving unprecedented resources to public works.


"Ut pendet continuum flexile, sic stabit contiguum rigidum inversum -- As hangs a flexible cable, so inverted stand the touching pieces of an arch." (although they figured out at some point that this curve was close to being a parabola)


> They didn't do a real stress analysis on those aqueducts, for instance.

The Romans tested bridges by having the engineers stand under while a legion marched over.


In general, with an abacus. Roman numerals were generally used for recording information, not for calculating with. The article emphasizes how easy it is to add and subtract with Roman numeral notation, but everything else I've read emphasizes the abacus even for that. After all, most people today calculate with calculators as well, and we have a snazzy Hindu-Arabic system for numbers.


It's actually somewhat surprising to me that the Romans didn't invent the concept of zero, when they used the abacus for daily calculations. With the benefit of 20:20 hindsight you'd think that the concept of zero would follow quite naturally from the concept of an empty column on the abacus.


But that's the thing about Roman numerals: you don't need a placeholder number to represent empty columns. And for "what is XVI subtracted from XVI" they could just use a word meaning "nothing", such as nihil or nihilum. The need for the concept of zero as we understand it really only arises together with a place-value system.


Thanks for explaining that clearly. I've always been so baffled by people who claim that some society didn't have a concept for zero, as if "inventing" zero marks some major advance in intelligence.

Every culture has a concept of "nothing" which works for zero. The ancient Greeks debated over whether nothing was a number or not, but that's just a semantic splitting of hairs.

At some point a symbol for nothing becomes useful so you invent a number-like notation for it. But that's just a matter of convenience. It's not some great conceptual leap.


I think that for the majority of people throughout history, numbers were inseparable from numerals, i.e. notation for numbers. This would explain why people are far more comfortable with the notion of real numbers (which despite their name are very very strange in a lot of ways) than imaginary and complex numbers. Even their names betray the difference. However, one has a common notation that everyone has learned whereas the other has a more confusing and less well-known notation.

Therefore I think ancient arguments over whether 0 is a number (and acceptance thereof) are representative of a greater paradigmatic shift, similar in essence to the arguments over whether the square root of -1 is truly a "number."

Viewed that way 0 is the first step in a journey of an understanding of numbers from purely counting discrete entities, to abstract parts of computation.

So basically I would posit that it is in fact a great conceptual leap (just as the negative numbers are) that only seems like an obvious fluke of notation when every schoolchild has learned it.


I would clarify that to say numbers were inseparable from words for them. Writing has only existed a short time of our history, so numerals are pretty recent.

And the "meaning" of zero as a number like others along a number line, rather than as mere notation for "nothing", I assume only ever became necessary with the invention of negative numbers.

With addition, multiplication and division, zero simply does nothing or annihilates a number, and so doesn't need to be treated like other numbers. AFAIK, zero as a number arises first in figuring out how to "get to" negative numbers, e.g. what is two minus four (one, zero, negative one, negative two), where zero is required as a numeric concept.

Negative numbers were a big step forwards. Zero, I still don't see it -- either it was just convenient notation for "nothing", or part and parcel of the shift to negative numbers. Unless I'm missing something in the historical record?


From my understanding (and I well could be wrong here), Europeans inherited from Greek a mathematics system that favored geometry and tended to abhor algebra. Concepts like integers, rationals, and irrational numbers are all pretty easy to explore and explain with geometry. By contrast, zero, negative numbers, and imaginary numbers create absurdities in geometry (how can a line have length 0? -2? 3 + 4i?). Moreover, even as algebra is introduced to Europeans via the Arabs, I can see people resisting algebra in part because it introduces these absurdities and paradoxes that need explanation.

As far as I can tell from the historical record (and it doesn't help that modern histories tends to describe historical mathematical discoveries in modern terms, meaning it's difficult to work out as a lay person in what terms the historical discoverer understood their own work), it looks like the acceptance of zero, negative numbers, and complex numbers are more or less concurrent, and this also seems to coincide with the shift in mathematics from being predominantly geometric to algebraic.


You need zero for place based notation. Eg 210 is "two in the hundreds place, one in the tens place, none in the ones place." Place based notation gets you representation of multiples all the way to infinity. Otherwise you need to represent multiples with unique symbols like in Roman Numerals. Eg X means 10 no matter how many symbols are to the right of it.


This is interesting to me.

I suspect most HN readers probably live in a world of binary logic where things are either "true" or "false", but I come from data-land, where ternary logic is the norm, and where "zero" is a very different concept than "null".

"Zero" is something that can be compared to other numbers, mathematical operations can be performed on it, etc., but "nothing" is just that... nothing.

I'm sure the ancient Greeks used one term to refer to both concepts, and that's exactly what people are pointing out when they refer to other cultures "inventing" the concept of zero. These cultures correctly realized that these are two distinct concepts, and so they created a new, separate term for zero.


"Is zero a number?" isn't just a semantic splitting of hairs but has an important outcome. If zero is a number then you can do arithmetic with it. In particular, having the concept of zero as a number that you can do maths with is a prerequisite for algebra.


If you can debate whether or not zero is a number then you haven't invented it.

If zero might not be a number then you can't state that 0 * n = 0 and 0 + n = n and n - n = 0, etc


> Every culture has a concept of "nothing" which works for zero. The ancient Greeks debated over whether nothing was a number or not, but that's just a semantic splitting of hairs.

Nonsense. Zero being a number is the conceptual leap. It's important.


Not really related to the article per se but I always find it interesting how one may become tempted to say "this alternative to a thing I already know makes so much sense, why don't we always use it?"

I felt the same way when encountering Chinese numbers via Japanese. If 二 is two, 十 is ten, 四 is four, and twenty-four is 二十四, that's so clear! Two tens and four!

I quickly decided that this number system, though something I'd obviously need to learn and become acquainted with if my Japanese learning were ever to progress, wasn't necessarily as easy as I initially imagined. Yes, there are no places, but numbers in this system are grouped at different boundaries — not every thousand but every ten-thousand.

So 六十七億八千三百一万五千四百二十一 breaks up as sixty-seven hundred-millions eight-thousand-three-hundred-and-one ten-thousands five-thousand-four-hundred and two-tens-and-one — and obviously, that's not quite how we would represent six-(billion/thousand-million) seven-hundred-and-eighty-three-million fifteen-thousand-four-hundred-and-twenty-one, or rather 6,783,015,421.
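A minimal Python sketch (mine, purely illustrative) of that grouping by myriads (10^4) instead of thousands (10^3):

    def myriad_groups(n):
        # split into groups of 10,000 (man / oku / cho ...), Chinese/Japanese style,
        # rather than the Western groups of 1,000
        groups = []
        while n > 0:
            groups.append(n % 10_000)
            n //= 10_000
        return list(reversed(groups))

    print(myriad_groups(6_783_015_421))  # [67, 8301, 5421] -> 67 oku, 8301 man, 5421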

I post this not to discuss the positives or negatives of the Chinese number system compared to the Arabic one or vice-versa. Rather, just how one's imagination can be so easily captured by the apparent simplicity of an alternative to that with which one is familiar, almost to the point of wanting to adopt it altogether. The realisation of where things get tricky for oneself often doesn't come until quite a bit later.

For myself, I tried using roman numerals for my own math for a long time but stopped when I found division too brain-breaking!


Living in Japan, I became accustomed to using numbers for up to around 10,000 yen ($100USD) due to interactions at the stores and around town, but when I would hear the price of a car (1,000,000+ yen or 100 myriad yen) or a house, it would just confuse me and not register at all. It’s all just based on your personal experience, I think.


I have the same experience with the German reversed way of speaking numerals, yet I speak the language fluently, including a good understanding of Swiss German as well.

But reversing back the numbers into my Portuguese brain, just doesn't work after a certain size.


There is (for me at least) something very deep (in the brain) regarding numerals and basic math. I am quite proficient in English, but always do arithmetic in my native Bulgarian and then have to translate the result (unlike other speech, which flows freely). And not for the lack of trying. And that is even though Bulgarian numerals do translate 1:1 to English.


I once read a fascinating article, which I've never been able to find again, that said the brain actually has internal biological representations of zero and one, along with a rudimentary representation of two, but nothing above that.


Similarly, Indians express large numbers in terms of lakh (10^5) or crore (10^7) which is confusing to people using the thousands-based system.


Is it blogspam when it's called syndication?

Here's the "original" source of the column, which is on a less... determinedly fashionable website, so it might be friendlier to mobile users and people who dislike fixed headers:

https://www.straightdope.com/columns/read/3330/how-did-anyon...

Also, previously:

https://news.ycombinator.com/item?id=14818633


Fixed headers are better than fixed autoplay ad videos.


When I got my math teaching credential, there were bunches of interesting historical things we learned along the way including Egyptian Fractions https://en.wikipedia.org/wiki/Egyptian_fraction

Never actually used any of it so most of it has evaporated from my memory along with calculating square roots by hand, but it's nice to know at least enough to be able to look up the information if I want it.


Egyptian fractions have an interesting property that may have been useful back then.

Consider the problem "How do you divide five things for eight people?"

Simple: cut everything into eighths and give each person five.

But Egyptian fractions give an even easier way. 5/8 is 1/2 + 1/8. Divide four of the wholes into halves and give each person one half. Take the remaining whole, divide it into eighths, and give each person one. The answer to this problem is the way you write the number itself. This makes division of the items easier and simpler.
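For the curious, the greedy unit-fraction decomposition is only a few lines of Python (this is Fibonacci's greedy method rather than the Egyptians' own table-based approach, but it reproduces the 5/8 = 1/2 + 1/8 split):

    from fractions import Fraction
    from math import ceil

    def egyptian(frac):
        # repeatedly peel off the largest unit fraction that still fits
        parts = []
        while frac > 0:
            unit = Fraction(1, ceil(1 / frac))
            parts.append(unit)
            frac -= unit
        return parts

    print(egyptian(Fraction(5, 8)))  # [Fraction(1, 2), Fraction(1, 8)]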


That's how I cut up bell peppers into the right sizes. They're either 4-lobed or 3, so cutting them into 3 or 4 equal-sized piles is a challenge.


It is well-known that the place-value system introduced into Europe from India via the Arabs played an invaluable role in modern arithmetic.

But, if you read the "Sand Reckoner" by Archimedes, what he lays out are the rudiments of a place-value system. He essentially describes the modern notation, but not rules for addition, subtraction, multiplication and division using this notation.

Another tidbit: if you see the recent movie about Shannon "The Bit Keeper", he shows the journalist a mechanical device (he designed?) which can do calculations using Roman numerals.


I'd never heard of this documentary, so I took a look. I assume you meant "The Bit Player"? ( https://www.imdb.com/title/tt5015534/ ). It looks interesting, and it's on Amazon prime video so I'll give it a watch. Thanks!


Sorry, you are right.


Romans essentially used the same binary arithmetic computers use today for multiplication; doubling and halving.

http://www.phy6.org/outreach/edu/roman.htm
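A minimal Python sketch of that doubling-and-halving method (Egyptian/Russian peasant multiplication), which is binary shift-and-add in disguise:

    def peasant_multiply(a, b):
        # halve a (dropping remainders), double b, and sum the b's that
        # sit next to an odd a -- i.e. the set bits of a's binary expansion
        total = 0
        while a > 0:
            if a % 2 == 1:
                total += b
            a //= 2
            b *= 2
        return total

    print(peasant_multiply(19, 24))  # 456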


Yeah, this was discussed previously: https://news.ycombinator.com/item?id=13636277

And division basically works the same.

Discussed article is https://thonyc.wordpress.com/2017/02/10/the-widespread-and-p...

Nowhere near as complex as it's made out to be.


I am very wary of just about anything published on the history of science or mathematics. Much of it tends to be very low quality full of just-so stories, or at best whiggishness. Unfortunately much of the work by academics on the subject is not much better as the field is full of former mathematicians or scientists with an interest in history rather than lots of historians with an interest in mathematics or science.


Or feet and inches. I've been watching videos online of people making things with the added challenge of using imperial measurements. 7 29/32 less a margin of 5 3/8 plus a 1/16 offset... Is metric too easy for Americans?


Why would it be significantly harder than using some other system of numerals?


They probably had algorithms for it, but even then it sounds challenging.

Addition sounds easy and works mostly like base-10 addition. I imagine they would first handle the sub-5 part, which is a bit exceptional and would have to be done manually, and then start grouping letters together like we do, creating carries when they reach the next letter.

Subtraction sounds harder. It's close enough to our base-10 system, but borrowing from the next digit is much more complicated. Take subtracting D - I (500 - 1): the answer is CDXCIX, but I'm not sure how to get there. Now imagine this with more complex numbers.

But multiplication and division, in that weird base format? I won't even try.

Keep in mind that they didn't convert, or even think of, numbers in a base-10 system like we do.
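To make the subtraction case concrete: here's a minimal Python sketch (my own, not a reconstruction of any attested Roman procedure) that subtracts by cancelling symbols and 'borrowing' by breaking a larger symbol into smaller ones, which answers the D - I example above:

    from collections import Counter

    ORDER = "IVXLCDM"                                 # symbols, smallest to largest
    EXPAND = {"V": "IIIII", "X": "VV", "L": "XXXXX",  # one-step change-making expansions
              "C": "LL", "D": "CCCCC", "M": "DD"}
    SUBS = {"CM": "DCCCC", "CD": "CCCC", "XC": "LXXXX",
            "XL": "XXXX", "IX": "VIIII", "IV": "IIII"}

    def additive(numeral):
        # rewrite subtractive pairs so the numeral is purely additive
        for pair, repl in SUBS.items():
            numeral = numeral.replace(pair, repl)
        return numeral

    def subtract(a, b):
        # compute a - b (assumes a >= b) by cancelling matching symbols,
        # borrowing by breaking a larger symbol whenever we run short
        have = Counter(additive(a))
        need = Counter(additive(b))
        for sym in ORDER:                 # settle the smallest symbols first
            while need[sym] > 0:
                if have[sym] > 0:
                    have[sym] -= 1
                    need[sym] -= 1
                else:
                    bigger = next(s for s in ORDER[ORDER.index(sym) + 1:] if have[s] > 0)
                    have[bigger] -= 1
                    have.update(EXPAND[bigger])
        return "".join(s * have[s] for s in reversed(ORDER))

    print(subtract("D", "I"))  # CCCCLXXXXVIIII, i.e. 499 (CDXCIX in subtractive shorthand)

The result comes out in additive form; rewriting it with the subtractive shorthand is a separate, optional step.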


How numbers are written down does not necessarily correspond to how you do calculations. Given that 499 was called 499 I’m pretty sure they thought in base 10, so subtracting 1 from 500 would be trivial. Writing it down took a few more symbols but so what?


Hmm, I was assuming they didn't have a concept of base 10. If they did have it, I wonder how they didn't make the connection and write stuff in base 10 as well, instead of a mixed base with weird rules.

Then I guess the real challenge is converting numbers around?


Well, you can try and then you'll appreciate the difference, I am sure. Even if implemented in computer hardware, operating with Roman numerals would either be slow or take many more transistors (or both). As to why in a philosophical sense, it is because the positional system was invented specifically as a computational device, which only happened many years after people learned how write numbers down. Optimization effort takes time, and a random solution is not guaranteed to be optimally suited for a particular application (such as performing calculations).


We all do the same thing. Our calculators work in binary, and we only do i/o in decimal. Their calculators (slaves who pushed pebbles, calculi, around) worked in a decimal place-value system (or used egyptian/russian peasant multiplication), and only the i/o was done in roman numerals.


Sure, but the essence of the discovery, which is in fact far from trivial, is that it was found possible to perform calculations by manipulating written symbols themselves (i.e words) rather than using pebbles or an abacus. (This is by the way how mathematical notation began - as the means of reasoning by performing formal manipulations with marks on paper.)


When I was learning programming, one of the beginner's exercises was to implement addition, subtraction, and multiplication of natural numbers "by hand" (division wasn't covered). There you learn to formalize the grade school algorithms you know by heart into concrete code, a necessary mental process for beginning programmers.

At the end of the session, the teacher asked us to try the same with Roman numerals. That was when I knew it's significantly harder.


Roman numerals are very economical.

The base with best radix economy (except for e) is 3. But roman numerals are better still.

For representing 0-999, you would need 19 base-3 digits, but only 15 roman numerals (plus a symbol for zero).


...in terms of digits, which is a very odd quality to optimize for.


A good one to optimize for if you need to carve them into stone columns :)


Hah, perhaps the best case to optimize for!


...in terms of only the worst case numeral.

Common numerals like II III VII VIII have worse length, while I IV VI are same, and V X IX are better.

And factoring in per-digit cost, Roman numerals up to 999 have 5 distinct digits, 46% more cost per digit than base 3, making it worse than base 3 in almost every case, information theoretically. (You could win some back with a huffman encoding, though)


Note that as mentioned in the article, Romans very rarely used the subtractive notation. So in general 4 was IIII and 9 was VIIII.


Suppose there was a basketball game between Athens and Rome (go Athens!) and you have a 3 digit scoreboard.

You would need 19 base 3 digits and only 15 roman numerals (plus N for zero) in order to represent every number.

In base three, you sometimes have all three digits equal. With roman numerals, you can reuse the same digit in different positions.


I don't understand this. 999 is less than 3^7, so you can represent any number up to 999 with just seven base-3 digits. Where does the 19 come from?


999 is 1101000 in base 3. So for the MSB, you only need a 1 (when it's zero, you just don't put anything there). All the other positions can be 0, 1, or 2. So you would need 1 + 6*3 = 19 digits for your scoreboard. You can't have fewer than six 2s, for example, because your score might be 222222.


I see what you mean now. That's 19 with additive Roman numerals only, right? And for good measure, you don't even have to rearrange the digits, you can just order them as DCCCCLXXXXVIIII and cover the unused ones with a cloth.

With subtractive Roman numerals, you can get away with just 17 digits, but you lose the feature of not having to rearrange them.

Edit: actually, how did we get 19 here? I agreed at first, but my number above is just 15 digits.


I play village cricket, and the scoreboard has cards with numbers on, then hooks to hang them up depending on the score.

The problem is: Given the full range of possible (or at least plausible) scores, how many of the cards do we need for a full set?

So let's simplify it to just a run tally. You could be 111, so you'd need at least 3 of the 1 cards etc. Allow for scoring up to 999 (unlikely) and that's 29 cards to keep somewhere (only 2 zeros needed)

In base 3, you need 7 digits, but only 3 cards per, so we are doing better with 19 cards needed (21=3*7, but don't need all zeros, and that gets you to 1093 so for 999 you could save another)

In roman numerals, You'd need an M, a D, 3 Cs, 3 Ls, 3 Xs, 1 V and 3 Is. Total is 15 cards.

Can we do better? Good question.


Of course we can do better. Every card that is an identical copy of another in the set loses you some flexibility in picking different sequences.

6 different cards give you

  - 1 zero-card sequence
  - 6 different 1-card sequences
  - 30 different 2-card sequences
  - 120 different 3-card sequences
  - 360 different 4-card sequences
  - 720 different 5-card sequences
  - 720 different 6-card sequences
That’s enough to almost get you up to 2000. And I don’t think the resulting encoding is objectively weirder than Roman digits.


The answer is: It reduces to a question of permutations. 7! > 999, so we could do it with different arrangements of just 7 cards.

Can we do better? Good question...


Now if we arranged them on a 2d grid, could use 4 cards, then the different shapes (even discounting the similar looking shapes) would get you to >1000. You'd have to allow more than just the standard tetrominoes.


Ahh, I see the error of my ways.


Contrast Roman numerals with the rod calculus[0] invented in Ancient China. Wikipedia has a list of algorithms for calculating with rods, from the usual arithmetic operations to fractions, division, square and cube roots, Gaussian elimination, and solving polynomials.

It would seem that such tasks would be extremely difficult for someone working with the Roman numerals.

[0] https://en.wikipedia.org/wiki/Rod_calculus


I'd previously read that Fibonacci had helped popularize the Hindu-Arabic notation in Europe. Wikipedia says "In 1202, he completed the Liber Abaci (Book of Abacus or The Book of Calculation)," which included lessons and examples.

Interestingly, he grew up with a merchant father based in Northern Africa and had internalized it.


Add and subtract on a good abacus (like a Japanese soroban) is quite fast. Multiply and divide are miserable, but in ordinary trade, it's mostly add and subtract with the occasional multiply.

With a soroban, a slide rule, and a book of tables, you can do most classical engineering math. Slowly.


A millennium later, humans will look back and wonder the same about how we did math in the decimal system, while binary is far superior. I expect they will even genetically remove one finger from each hand so they can do binary math better with their fingers.


> The IIII-for-4 notation survives today on the faces of clocks.

What? On which clocks?


When I search for "clock with Roman numerals" on Google images I see about 50% use IIII.


On this new Omega Constellation, just as a random example

https://www.omegawatches.com/watch-omega-constellation-omega...


I was as baffled as you when I first saw IIII instead of IV...


Most merchants in Roman times used Greek numerals, which have much easier arithmetic, comparable to Arabic.


I have read that the uptake of Arabic numerals was actually fairly slow and fraught in Europe, but of course can't put my hands on any references.

There's a reddit thread [0] that might be some of what I saw, and my wife does paleography work where she runs across books of accounts that are rendered in Roman numerals, because that's how formal accounts were prepared, even if the actual accounting was done by other means.

That same reddit thread has link to an "algorists vs abacists " article [1] which purports to back this up, but I can't confirm because the article is paywalled for me.

Edit: Moved/fixed links.

[0] https://www.reddit.com/r/AskHistorians/comments/12m0vp/how_a...

[1] https://www.jstor.org/stable/pdf/2686479.pdf?seq=1



