How inevitable is the concept of numbers? (stephenwolfram.com)
182 points by perardi on May 25, 2021 | 202 comments



I think numbers are wonderful and cool abstract thingies. We grow up being taught them (in my case Arabic numerals in base 10) and they become such a part of our being that we start to think that we know what “5” is. Then I was introduced to Roman numerals. Didn’t like those much. That one taught me that 4 is more related to 5 than it is to 3. Then some of us learn a base 2 number system or base 16 number system, and for a brief moment we question whether we actually knew what “5” was. And then we may learn about Peano and the successor(s…) of zero. Did we really know what “5” was then? What is five-ness really? Perhaps fiveness is a set within a set within a set…

Every interaction our human minds have with fiveness is with a representation or encoding of 5. With each new representation, I’ve experienced fiveness slightly differently.

But I like computers and math. For me, the best way to experience oneness with fiveness is through computation with Lambda Calculus and Church Numerals. I'm very fond of that encoding.
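
For the curious, here's a minimal sketch of that encoding in Python (plain lambdas standing in for the lambda calculus): a Church numeral n is just the function that applies its argument n times.

    # Church numerals (Python sketch): the number n is "apply f, n times".
    zero = lambda f: lambda x: x
    succ = lambda n: lambda f: lambda x: f(n(f)(x))

    five = succ(succ(succ(succ(succ(zero)))))

    # Decode by counting applications of "+1" starting from 0.
    assert five(lambda k: k + 1)(0) == 5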


> Did we really know what “5” was then? What is five-ness really?

Ah, but we do, and it’s very simple: it is what five cows and five fingers have in common in the most obvious sense. There’s no mystery there.

Or, stated in another way, “fiveness is something you’ll be very sad not to see when you look at your hand.”


I lost one of my fingers when I was a child, so after getting used to it I don't experience sadness when I look at it. That statement also checks out for four, in this case.


Ok


This presumes the existence of a category like 'cow' or 'finger'. I'm pretty sure there's nothing intrinsic about 'cows' and 'fingers'; these are artifacts of human cognition.

In base reality, there are no such things as 'cows' and 'fingers'. Just systems with similar properties. And if you have to be very pedantic, there are not even discrete systems, since everything is linked by information. You cannot separate a river from the ocean - it's part of the same system.

So, given that there are no categories to put discrete things into - they just exist in our imagination - and there are probably not even discrete things - these also just exist in our imagination - the question of five-ness is not really answered.


> You cannot separate a river from the ocean - it's part of the same system.

> discrete things - these also just exist in our imagination

What leads you to separate "imagination" from "exists"? Isn't imagination also "part of the same system"?

Imagination definitely has the capacity to affect the world, so it's not obvious in what way it's "not real". Why do you draw a hard line there?


More to the point, the idea that "you cannot separate a river from the ocean" seems to be demonstrably wrong - otherwise we wouldn't have the (very useful in practice) notions of 'river' and 'ocean' in the first place. If you look at a map, the differences stand out pretty clearly - e.g., a river has a starkly different topology from that of the ocean. So, no, these differences are not just figments of our imagination, as, in particular, everyday practice shows.

In general, the failure to perceive emergent phenomena as something distinct from their particular substrate, and to consider them separately - for instance, the failure to see that nature is not just a bunch of atoms moving around - has a name: reductionism. It is a form of intellectual blindness (not to be confused with the ability to think abstractly).


I'm not drawing a hard line at all; yes, imagination is part of the same system. And of course it has the capacity to affect the world.

I'm just pointing out that without arbitrarily dividing the world into separate parts, numbers don't arise.


But "world", "separate" and "parts" are all language concepts too. I feel the inconsistency in your (circular) argument is not getting through.

That divisions are arbitrary and "exist just in our imagination" doesn't line up with your admission that imagination is real, nor with your use of words to describe it. That line between "real" vs "arbitrary / imaginary" is not as clear cut as you (unconsciously, apparently) draw it.


> I feel the inconsistency in your (circular) argument is not getting through.

I guess it's not, since I'm not convinced that mine is a circular argument.

My argument - to put it simply - is that we are all part of the same system and that there are no divisions. Without divisions, no numbers. I don't need any distinction between 'real' and 'arbitrary' for this to hold; that's a dichotomy you assume on your part.


Yes, we may be all part of the same whole. But there definitely are divisions – you are manifesting them with your own words. QED.

Let me try differently. Your premise seems to be that "base reality" (your words) is a single teeming interconnectedness, indivisibly unique, from which it follows "there are no discrete systems", no categories, from which it follows that counting discrete things is an "artifact of human cognition".

Correct? Did I get your position right?

I simply pointed out that human cognition / imagination, including language and categories and logic and numbers, is as much a part of the base reality (that same teeming interconnectedness) as anything else. You manifest your words = they capture a pattern, patently recognizable from other patterns, transmittable (e.g. to me), with a potential to affect me and others and our future.

The world being interconnected doesn't mean it's undifferentiated. All work is still ahead of you in showing that categories and words are somehow "not intrinsic" (again, your words). Yes they may be a teeming that refers to other teeming, but you haven't shown that's anything special, worthy of singling out as extrinsic.


> Let me try differently. Your premise seems to be that "base reality" (your words) is a single teeming interconnectedness, indivisibly unique, from which it follows "there are no discrete systems", no categories, from which it follows that counting discrete things is an "artifact of human cognition".

Yes, I think you summarized it pretty well! Thanks for that.

> Yes, we may be all part of the same whole. But there definitely are divisions – you are manifesting them with your own words. QED.

Woah, not so fast, there! Where exactly do my words manifest divisions? The words you read on the screen, which manifest in your mind as meaning, are not discrete things themselves. No word stands for itself, else we would not need dictionaries. They are fuzzy things that often change their meaning, depending on context, place and time, on the reader and what she ate in the morning. To claim that words are discrete things that exist in their own right seems untenable.

> The world being interconnected doesn't mean it's undifferentiated.

I agree. Just as there are patterns in stellar nebulae that swirl and dance but never quite separate, all the world manifests its decay in wonderful shapes and patterns.

But none of these patterns can be isolated from the others, for none of the patterns can a definite line be drawn to say "your existence starts here and ends here". The boundary of every definition, of every category can be shifted and shifted and shifted some more. They are - as already mentioned - artifacts of cognition that allow us to form a mental model of the world that surrounds us, replacing the unfathomable interconnectedness with a simple game of blocks and strings and forces that fits into the limited capabilities of the simulation we run in our heads.

> All work is still ahead of you in showing that categories and words are somehow "not intrinsic" (again, your words).

Categories and words, numbers and letters and the human mind are fleeting. How can they be intrinsic if they exist for only an eye blink?


In your world without numbers, do letters exist? Does the letter "n" exist? Is there any relation between "n" and "nn" and "nnn"? If so, how would you describe that relation?


Do letters have an existence outside of our heads? I doubt it.

With regards to the relationship between repeating sequences: we (humans) invented funny games to describe these patterns as we perceive them. Some of these games have very strict and elaborate rules, although most of them are either inconsistent, incomplete, undecidable, or trivial.

Even constructive mathematics offers no help here.


> I'm just pointing that without arbitrarily dividing the world into separate parts, numbers don't arise.

I agree. Though I think a stronger statement is also true: Without "arbitrarily" dividing the world into separate parts, non-trivial thought is not possible.

Assuming by "parts" you mean "categories", I think my statement is still true. And, once we have categories, we can use numbers to compress information, which in turn allows us to perform more complex computation/thought with our limited computational power.

As to how arbitrary our categories are, one could argue that some are hardwired into our DNA, as claimed by Chomsky [1].

[1]: https://en.wikipedia.org/wiki/Universal_grammar


> This presumes the existence of a category like 'cow' or 'finger'. I'm pretty sure there's nothing intrinsic about 'cows' and 'fingers'; these are artifacts of human cognition. In base reality, there are no such things as 'cows' and 'fingers'. Just systems with similar properties. And if you have to be very pedantic, there are not even discrete systems, since everything is linked by information. You cannot separate a river from the ocean - it's part of the same system.

When I cut my finger the cow does not share in my pain. When we kill the cow for its meat, we do not share in its pain. That the cow becomes part of us through its consumption does not seem to invalidate this point—discrete systems do exist in our experience.

Of course, if you zoom out far enough you might refer to the sum of those discrete parts as some singular, complex system, but it seems the human experience is fairly limited in exposing this subtlety (not to mention that it is often useful to discuss the parts themselves without considering their relationship to the entire universe).


Also, the concepts "artifacts", "human", "human cognition" and "artifacts of human cognition" are all themselves artifacts of human cognition.

As are "circular", "reasoning" and "circular reasoning", as well as "irony".


I don't know about that. Reality is strongly suggestive. (Of course you can argue that "reality" also is a figment of our imagination or, for example, it's not a thing in the first place, but I personally wouldn't go that far.)


Does the base really have anything to do with 5? I can represent 5 in base 10 as 5, or I can represent it in base 2 as 101, but both encode the same information.

Using different bases, to me, is just about storing information: how many symbols do you want to create before you increase the length of your number and start re-using them?
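
You can check this directly in, say, Python: the base changes only the spelling, never the value.

    # Same number, different spellings (Python):
    assert int("101", 2) == int("5", 10) == int("5", 16) == 5
    assert format(5, "b") == "101"   # render the value back out in base 2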


In fact, you're talking about digits and numerals, not numbers. But this actually drives your point home, I think.


> What is five-ness really?

It is everything that has some relation to something that has that same relation to something that has that same relation to something that has that same relation to something that has that same relation to some unique thing that does NOT have that same relation to anything.


Is this the spoken form of a Church Numeral or something?


Yes. Exactly. (You could also see it as a rendering of some of the Peano axioms.)
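
(If it helps, here is that chain of relations as a literal sketch in Python; the names zero/succ are mine, just mirroring the Peano construction: one unique starting thing, and everything else reached from it by the same "successor of" relation.)

    # Peano-style sketch (Python): zero is the unique starting thing;
    # every other number is "successor of" something.
    zero = None
    succ = lambda n: ("S", n)

    five = succ(succ(succ(succ(succ(zero)))))

    def count(n):              # walk the relation back down to zero
        steps = 0
        while n is not None:
            _, n = n
            steps += 1
        return steps

    assert count(five) == 5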


Username checks out. :)


I would say that positional notation, Zermelo ordinals and Church numerals are just ways to encode numbers. I would never call them numbers in the strictest sense of the word. Just as your C code is not an algorithm.

I would say I knew what 5 was even before I learned any notation or written language whatsoever.
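
(For reference, the Zermelo encoding mentioned above really is "a set within a set within a set", as the top comment guessed; a quick Python sketch using frozensets:)

    # Zermelo ordinals (Python sketch): 0 is the empty set, and the
    # successor of n is the singleton set {n}. Five is five sets deep.
    zero = frozenset()
    succ = lambda n: frozenset({n})

    five = succ(succ(succ(succ(succ(zero)))))

    def depth(n):
        d = 0
        while n:               # the empty set is falsy
            (n,) = n           # unwrap the lone element: the predecessor
            d += 1
        return d

    assert depth(five) == 5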


> Did we really know what “5” was then? What is five-ness really? Perhaps fiveness is a set within a set within a set…

The reason it seems mushy is because there are two related concepts at play: a numeral vs. a number.

A numeral is the symbol: 5 in Arabic numerals in Latin script or ٥ in Arabic script; V in Roman numerals etc.

A number is the quantity, the amount being expressed: five (English), ḫamsa (Arabic), cinq (French) etc.


Yes; the difference between token and denoted concept. They are not the same thing (as René Magritte probably wanted to say with his "This is not a pipe" painting). In computer contexts: each representation of 5, in whatever base or language, would point to the same element in a lookup table.

But for us humans, the tokens and concepts blend to some degree: maybe not so much with regards to numerals (although there seem to be exceptions ...). So even if 'Amour', 'Love' and 'Liebe' essentially point to the same concept, we might conceive of them differently, ever so slightly, depending on the language we chose to use.

Which gives an inkling of the vast difference between computers and humans, at the current stage.


Five's how many fingers I have.


Sorry to hear that... May I ask how you lost the other 5?

(My inner Asperger's could not resist.)


Five Five Five Five Five

There is no Antimemetics Division


5 means territory


I like that he keeps it open-minded, but I've thought a lot about this, and to me the answer to whether numbers are inevitable is decisively yes. Every adaptable system evolves to adapt to a changing environment by detecting "modes" (categories) and adapting to each mode. Then we start noticing categories come in instances. There's a tree, there are more trees. So now it's useful to count them... Then it's useful to have fractions. And measurements (like weight and length). Basic operations like + - emerge, and with that multiplication. And with that division... and so on.

I think the fundamental properties of math are one of those things that will keep emerging for every thinking system. Not because numbers are fundamental to the universe. Rather, they're a fundamental aspect of intelligently adapting to (i.e. thinking about and making predictions about) the universe.


> There's a tree, there are more trees. So now it's useful to count them...

It's not that useful to count them. My personal interactions with trees have likely never involved counting them; not even when I was involved in woodland management.

Trees control their numbers according to whether or not they can reasonably live in a particular density of tree coverage.

Only humans want to count trees, perhaps in general to claim (abstract) ownership of a particular large number.


Let's say, based on your nick, you are teaching two classes. Both consist of individuals.

If we were to count the classes, we would say that one class consists of 2 people, and the other of 20000.

But those are just numbers.

My point is that numbers are useful for us in understanding how to interact with the world. If you don't plan to interact with the trees, you probably don't need to count them. But if you are going to use them for firewood or building materials, then knowing the difference between 2 trees and 20000 is useful.


We agree that the difference between 2 and 20000 is significant, but I can easily observe this difference without counting all the way up to 20000.

Identifying the difference between 19,999 and 20,000 is certainly not significant in this way. 19,999 is already "too many" students; or, "plenty" of firewood. I don't need to count them to know this.

I'd say the difference between 5 and 6 is already relatively unimportant in these contexts.


My point is not that you need the number to be exactly accurate; my claim is that for the number 2 you don't even consider the student body as a quantity, but for 20000 it very much becomes a quantity.

Also, if 20000 students need training, as a teacher you can still provide training, you just need to do so in other ways. Either you can recruit more teachers to help you (and then it matters whether the number is 10000 students or 20000 students), or you can switch to another medium (a book, a video, online training).

Treating a student body as a number will make it more manageable for you to provide/organize training for them than just hiring more teachers at random until each student gets enough attention.

Likewise, 2 trees may keep you warm for a week, 20000 may keep you and half your (small) village warm for a year, with enough trees left over for the forest to replenish.


> Every adaptable system evolves to adapt to a changing environment by detecting "modes" (categories) and adapting to each mode. Then we start noticing categories come in instances. There's a tree, there are more trees. So now it's useful to count them...

I really like this part of the explanation, although not all adaptive systems are intelligent. E.g. if I add a grain of salt to a heap of salt and it suddenly collapses, that is an adaptive system, but not as a result of intelligent modeling of how it should behave.

> Not because numbers are fundamental to the universe. Rather they're fundamental aspect of intelligently adapting to (i.e. thinking about and making predictions about) the universe.

Here the problem is assuming that intelligent adaptivity, and the agent that has it, are something separate from the universe. Being able to have a model of the universe inherently requires internally resembling the structural-functional organization of the universe, a mutual conformity if you will, and therefore if numbers, counting, modes, categories etc. are fundamentally useful constructs, I think they also at least resemble a fundamental part of the universe.


A tree adapts to its environment and I would bet money that trees in their phenomenology can’t count.


Nature counts all the time: tree rings, the angles of petals or leaves, the packing shapes for seeds.


Venus flytraps even count the number of times their trigger hairs are touched during a certain interval of time to know whether to close or stay open.


Lots of animals have been verified to understand and use abstract counts, which is what numbers are at their most basic.

Plants are much less adaptive than animals by definition. The problem isn’t whether they know numbers, but how you even test what plants “know”.


Well, in a very primitive and non-accurate sense, it counts/measures the number of (photosynthesis-generating) photons hitting each leaf, and then grows in directions that maximize its photosynthesis.

Since a tree has a much lower number of branches and leaves than there are photons, there has to be some aggregation of information involved in this computational process.


They don't design spaceships either.


Of which we are aware.


mycelia microcosm intensifies


Your argument is tautological.

Do we really know if adaptation is responding to discrete classifications of environmental pressure?

There could be other ways to describe how systems adapt. You are ascribing a lot to how you think it emerged.

I think your opinions are great, but I just have a hard time having beliefs about things that are as fundamental as numbers.


Another perspective could be that we are creatures emergent out of the physical and chemical processes of the universe.

There is some similarity or equivalence to some of these processes with mathematics, which allows us to model a subset of physics/chemistry with mathematics.

Because physics/chemistry can create us using rules that follow math (or even some undiscovered rules), this same physics/chemistry gives us the faculties of reason that let us think about and apply math in some universal way.

The physical, chemical, and mathematical substrate to which we are born allow us to even discover that the rules of math can be applied to some subset of the laws (or chaos?) of the universe.


Physics creates us, universe creates physics, … creates universe


Math creates universe? The answer normally lies in the open. (I don't know though)


... creates math.


42 creates …


That is only circumstantial evidence.


> Do we really know if adaptation is responding to discrete classifications of environmental pressure?

Yeah we do know.

Take any system of variables that interact, and change them smoothly, and inflection points will emerge. This is true literally any way you look at it. In space. In time.

Heck, aggregate states of matter represent "categorical adaptation" of matter to smooth alteration of temperature.

Point being, this process happens before we can even call a system "thinking". It's absolutely inevitable.

We can keep our options open forever and say "I don't know", that's fine. But if we analyze what we do know... turns out we DO know.


You are just saying that interacting systems have phase transitions.

I think this is a far cry from "emergent numbers".

Edit: Maybe self-organizing matter that lives on a phase transition boundary benefits from being aware of the boundary? That's the only analogy I could think of that could logically connect.


> You are just saying that interacting systems have phase transitions.

Right.

> I think this is a far cry from "emergent numbers".

Because it was just a clarification remark on everything else I said before that, which did connect it to numbers.

> Maybe self organizing matter that lives on a phase transition boundary benefits from being aware of the boundary? Thats the only analogy I could think of that could logically connect.

That's also what I initially said, but think about it a little more broadly. Whether you "live on a boundary", or you live in a system that experiences such boundaries, or your own system internally has such boundaries, or the INTERACTION between you and your environment creates these boundaries, it doesn't matter. You benefit from being aware of them.

And here's the thing. You will experience such boundaries, because it takes infinitely more "resistance" to change, in order to survive a changing environment without changing yourself, than it takes energy to adapt to a changing environment so you resist less, and it resists you less (you become more compatible).

When leaves drop in winter it doesn't matter whether there's a specific day and hour and second when we switch from summer to winter; trees benefit from recognizing the overall "shift" over time and adapting to it through their own shift. I'm deliberately using "non-thinking" adaptations to show that categories are precursors to how thinking works, rather than thinking inventing the idea of categories for no fundamental reason.

Our recognition of objects and entities is the same phenomenon. We recognize a boundary (inside and outside the entity/object) where there's a shift of overall behavior in that local region of spacetime, compared to its surroundings. We benefit tremendously from recognizing that a specific region of a field of grass looks and acts more like a hungry lion than like grass. Technically, objects/entities are not a perfectly defined thing. A lion is in constant exchange and interaction with its environment; it's not a closed system. And anyway I don't feel like repeating the rest of this again.

TLDR; There be phases/categories in N dimensions/parameters. There be instances of them (repetition of patterns). There be adaptation by recognizing the phases/categories and their repetition, and counting and measuring them, in order to optimize the quality of our predictions.


When I used to teach a class called “Mathematics for Liberal Arts Majors,” the first class began with having students “draw six.¹” I refused to give more explicit instructions. It was fascinating to see the results that came out of this, along with the ideas about numbers that it sparked.

1. I have to confess that the choice of six was not arbitrary.


Need you to expound pretty please


This is really intriguing. Do you have any examples of what people came up with?


Why six?


Mathematically, 6 is the most interesting small number. For example, one might represent it as

  XXXXXX

  XXX
  XXX

  XX
  XX
  XX

  X
  XX
  XXX
or the sides of a cube, hexagons or six-spoked wheels. Occasionally someone would draw a rectangular prism with sides of length 1, 2 and 3. I often had students who might also draw 六 or 陸 or ٦ or ৬ (I had a diverse student body).


If any students hear the word wrong, you'll get some really funny drawings. (my guess)


Or if any of the students are from New Zealand


NZ = Sux

Aus = Sex


Most Kiwi accents I've heard noticeably pronounced other vowels as "i". e.g. "dick" instead of "deck", "tinnis" instead of "tennis"


He starts his article, in the very first sentence, by referencing another article written by himself, in which he praises his achievement of having invented interstellar travel during a Hollywood movie shoot for which he was the science advisor, and which showcased the language invented by him and called by his name.

Stephen Wolfram is so self-centred, I sometimes wonder if it's actually a stage character invented by a really talented comedian.


If Mathematics is a language to describe reality, maybe numbers are not its whole alphabet. Inherent complexity, things that are not discrete, and emergent properties may not be described adequately with numbers, and maybe a different alphabet or even language is needed.

Numbers may be (or not, it may depend on our biology) a good initial concept, but maybe something else may be developed, something more "correct" to deal with the tasks of describing reality, something like metric vs imperial units.

The idea of integrating time into your vision of reality in the way it is used in Ted Chiang's Story of Your Life (the movie Arrival was based on it) could be a good approach.


Almost all of modern mathematics is not about numbers, although a lot of the objects studied can eventually be related to numbers in some way.


It's not about numbers, but it's built on concepts which wouldn't exist without numbers as a foundational abstraction.

There may be entire categories of representational concepts which we abstract poorly, or don't abstract at all, or possibly don't perceive at all - because from our point of view the relationships are so complex and remote they're effectively invisible.


I've been thinking about this since I read about the Pirahã people. The concept of number enables engineering and business (accounting) (and perhaps timekeeping). Someone unfamiliar with numbers would find it very difficult to function in modern society, and a group of such people wouldn't build modern civilization (unless they discovered numbers, which someone did at some point, of course). Now, here's an example of my crazy dream: imagine the possibility of currently inaccessible ways of perceiving or thinking that, if they were to become available, would enable a four year old child to gain an understanding of, for example, elementary particle physics from zero to PhD level in mere minutes. Of course, that way of thinking might be so different that said understanding might make no use of (our current) mathematics or the concept of elementary particles at all.


It's an aside, but it's worth remembering that mathematics was not humanity's first attempt at explaining reality. For a very long time we tried to explain it using will. There were a multitude of beings, all with their own different motivations, and they could be angry, or happy or jealous or any other emotion, and it was these feelings of theirs that guided their actions and explained how everything came to happen.


> If Mathematics is a language to describe reality, maybe numbers are not its whole alphabet.

Who would claim they are? Mathematics initially co-evolved with our ideas of numbers, but many other important objects have been thought about for a very long time now.


An analog computer with huge resources would better use geometry than numbers to do calculations and obtain exact results. For example, calculus was first discovered thousands of years ago using geometry:

https://youtu.be/GAcUZ3my6E0?t=480

Actually, such a computer would look like a universe, moving particles around in a continuous space.


There's the related concept of 'sortals', the preconditions for countability. From the Stanford encyclopedia of philosophy[1]:

The three main ideas are that a sortal

    tells us what the essence of a thing is
    tells us how to count things of that kind, which requires knowing which things are different and which are the same
    tells us when something continues to exist, and when it goes out of existence
[1] https://plato.stanford.edu/entries/sortals/


NASA found it simpler to explain numbers with hydrogen atoms than with Peano arithmetic for their Voyager probes.


Does "easier" equate to "universal". I think it does, just thinking about your statement out loud.

Since to humans, explaining numbers with hydrogen would be harder, I think. The common ground between humans is larger, so we can rely on something less fundamental and abstract.


This was about finding some kind of common understanding to anchor the rest of the message to. Hydrogen, being the most abundant element in the universe, is likely to be known and studied by any civilization advanced enough to detect Voyager or its signals.


Interesting. Any source or context on this?


Not OP, but I'm guessing OP is referring to the Voyager Golden Record attached to the first Voyager probe, specifically the playback instructions using the hydrogen atom to derive time units for playing the record: https://en.m.wikipedia.org/wiki/Voyager_Golden_Record#Playba...

Many of the ideas for the record came from Carl Sagan and a committee he led, working with NASA.


I can understand how natural numbers can be “constructed” (for lack of a better word) as a byproduct of counting; what I could never understand on a deeper level are negative numbers. I can’t see how a number, i.e. a count, can be lower than zero.

Maybe related: while I can also partially understand multiplication (syntactic sugar for adding), I could never understand multiplication by zero, meaning how it is that when you multiply a number (no matter how big) by zero you get zero as a result.

Maybe there’s some Wittgenstein-like material somewhere that will better explain this, in which case I’ll be very happy for some references.


I'm not sure if it is what you want, but Intuitionism [1] is one area that challenges modern fashions in mathematical thinking. It suffered greatly under the formalist approach led by David Hilbert and still has little mainstream support despite Gödel and his incompleteness proofs. Veritasium's latest video on Gödel's incompleteness [2] gives a pretty fair account of how we settled on the current fashionable foundations of math (including nods to Cantor and Hilbert). For a more formal history there is a book, "The Philosophy of Set Theory" [3], that sketches out how we got to where we are.

I have always been unsatisfied with the current foundations of math and their obtuse basis in Set Theory. Although I must admit, Category Theory has alleviated that quite a lot, especially with the relaxing of equivalence compared to equality.

1. https://en.wikipedia.org/wiki/Intuitionism

2. https://www.youtube.com/watch?v=HeQX2HjkcNo

3. https://www.maa.org/press/maa-reviews/the-philosophy-of-set-...


I am somewhat familiar with the standard foundation of maths in set theory, and slightly less familiar with the program to ground maths in Category theory (although I do know a fair bit of category theory).

Maybe I am biased, but I do not find the category theory foundations any less obtuse than the standard formulations. Like, it's pretty cool that you can do this stuff with category theory, but I think there is a reason set theory was done first.


I didn't mean to imply that Category Theory addressed the obtuseness of Set Theory. Rather, I was alluding to work in Category Theory that helps to redefine our ideas of equality and equivalence. A discussion of that is available in Quanta Magazine [1].

1. https://www.quantamagazine.org/with-category-theory-mathemat...


No need for Wittgenstein.

Fish have 0 legs. So how many legs do N fish have? N*0 = 0.

Now division by zero, that is Devil's idea.


The spatial metaphor is the number line.

Positive operations and numbers move you to the right. Negative operations and numbers move you to the left.

So a negative number is simply a count of a change in direction.

This works very literally. Ten miles west followed by five miles east is five miles west. There's nothing mysterious or weird about the "absence of westness" when you've turned around and started off in the opposite direction.

It generalises neatly to complex numbers where i is a rotation in the complex plane - instead of being pointlessly-weird-for-the-sake-of-it: "the square root of -1 which we've spent years telling you can't exist, and now we're telling you it can."

Technically this is a form of basis vector. Beyond that things get complicated.

But the idea is still valid - you define your direction markers (even if they're functions instead of constants) and then you can work out where you are in the space you're exploring, and what "movement", "counting" and "position" mean in that space.

Multiplying by zero is always a null movement of no distance.
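
For what it's worth, this picture maps directly onto Python's built-in complex numbers: -1 is a half turn on the line, i a quarter turn in the plane, and 0 the null movement.

    # Direction and rotation (Python sketch):
    west, east = -10, 5
    assert west + east == -5        # ten miles west plus five east: five west

    z = 1 + 0j
    assert z * 1j == 1j             # one quarter turn
    assert z * 1j**2 == -1 + 0j     # two quarter turns: facing the other way
    assert z * 1j**4 == z           # a full circle brings you back
    assert z * 0 == 0               # multiplying by zero: no movement at all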


It might be better to conceptualize negative numbers as numbers in a different direction or along a different axis.

If I cover lunch for you, you would owe me $5.

In a way, you now have -$5. You won't have $0 until you pay me back the $5 you owe. We can also say you have $5 of debt. That makes the number positive while still representing the amount. But when reconciling your books, it'll be subtracted from your total.


I would highly recommend Paul Benacerraf's article "What Numbers Could Not Be" for a philosophical if somewhat technical take on the subject (PDF link): http://michaeljohnsonphilosophy.com/wp-content/uploads/2015/...

The thesis, broadly, revolves around contemplation about whether a number is an object, but the larger question is whether it's possible to have any small part of mathematics, even a single integer, which doesn't imply the whole (or at least a significant structure).


Thanks a lot for the link/reference, that's the sort of stuff I was looking for.


> I can understand how natural numbers can be “constructed” (for lack of a better word) as a byproduct of counting

So you start off with zero, and the successor to zero, and the successor to that number, etc, and they're the counting numbers...

> what I could never understood on a deeper level are negative numbers, I can’t see how a number i.e. a count can be lower than zero.

...and there are all sorts of numbers that aren't counting numbers, but can be manipulated by the same rules of arithmetic. E.g.:

    h*2=1 -- fractions

    n+1=0 -- negative numbers

    r*r=2 -- irrational numbers

    i*i+1=0 -- imaginary numbers


It’s fun to posit these ideas by going in the intuitive direction, then going in the reverse direction and seeing what happens.

I recently found this to be helpful when investigating fixed- and floating-point numbers. If "1011" means 2^3 + 2^1 + 2^0, which is 11, then "1011.101" means the same thing but with an additional 2^-1 + 2^-3, aka 5/8. Negative powers seem weird to begin with, but they're kind of just there to discover, due to the symmetry.

This kind of arithmetic is far less fundamental than what you and the article are talking about — it is really just representation, in computers — but I think it is a good example of how, if you can tread a path in one direction, then turning around, coming back to where you started, and carrying on in the other direction is a useful tool for teaching and learning.
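
Concretely, that symmetric reading fits in a few lines of Python (a throwaway sketch, not how real floating point is implemented):

    # Positional weights extended past the point: ..., 2^1, 2^0, 2^-1, 2^-2, ...
    def parse_fixed_binary(s):
        int_part, _, frac_part = s.partition(".")
        value = 0.0
        for i, bit in enumerate(reversed(int_part)):
            value += int(bit) * 2**i      # non-negative powers of two
        for i, bit in enumerate(frac_part, start=1):
            value += int(bit) * 2**-i     # negative powers of two
        return value

    assert parse_fixed_binary("1011") == 11
    assert parse_fixed_binary("1011.101") == 11 + 5/8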


We can informally derive other types of number by extending operations on our existing set of natural numbers, seeking a "closure" for that operation, and seeing if we get consistent results.

So extend the concept of differences between natural numbers by subtracting a large number from a smaller number and defining the result as belonging to a new class of number. Similarly, get fractions by defining non-integer ratios between integers, real numbers by defining non-fractional limits of infinite sums of fractions, imaginary numbers by defining non-real roots of polynomials with real coefficients, and so on.

Each time, we have an operation that works for some subset of our numbers, so we look at what does not work, and see if we can make it work anyway by defining the result of such an operation as a new type of number. And then we repeat with a different operation.

But there are things we cannot consistently "extend", like division by zero, or 0^0, so we leave just those out.
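
The first of those extensions fits in a few lines of Python, if it helps make the move concrete (the names are mine, purely illustrative): a pair (a, b) of naturals stands for the difference a - b, which makes subtraction total.

    # Integers as formal differences of naturals (sketch).
    def add(x, y):
        return (x[0] + y[0], x[1] + y[1])

    def equal(x, y):
        # (a, b) and (c, d) encode the same integer iff a + d == c + b,
        # so we never have to subtract a large natural from a small one.
        return x[0] + y[1] == y[0] + x[1]

    neg_two = (3, 5)                             # "3 - 5"
    assert equal(neg_two, (0, 2))                # same integer, different pair
    assert equal(add(neg_two, (2, 0)), (0, 0))   # -2 + 2 = 0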


0^0 is near universally regarded as 1, except in specific domains. If you define your terms carefully, you usually get a reasonable answer.

0^-1 = 0^0/0 = 1/0 is undefined, as are all the negative powers of 0.

https://www.maa.org/book/export/html/116806


On multiplication: it's better not to think of multiplication as repeated addition, but instead to think of it in terms of areas. Taking that approach, it becomes easier to consider, e.g., (x² + 2x - 1)×(3x² - 4x + c), since we can write the terms along the sides of a rectangle and divide the rectangle into the products of the terms.
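
For instance, sympy will happily do the rectangle bookkeeping for that very product (a quick check, assuming sympy is available):

    # Each cell of the rectangle is one term from each factor; the expanded
    # product is the sum of all the cells.
    from sympy import symbols, expand

    x, c = symbols("x c")
    product = expand((x**2 + 2*x - 1) * (3*x**2 - 4*x + c))
    expected = 3*x**4 + 2*x**3 + (c - 11)*x**2 + (2*c + 4)*x - c
    assert expand(product - expected) == 0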

One of the things I did in the math for liberal arts majors class I used to teach was a fictionalized version of the expansion of the concept of numbers from *N* to the whole numbers to the integers to the rationals to the algebraic numbers to the reals to the complex numbers. I say fictionalized because whole numbers and negative integers historically come after (positive) rationals, algebraic numbers and transcendental reals, but for pedagogical purposes it's easier to go in order of increasing supersets.


From what I read, negative numbers took a while to be accepted for the same reason, so you are not alone ;)

IMO negative numbers, complex numbers for circles, etc. can be thought of as practical overloads. If we agree that the symbol minus means a debt, we can use it like that. We could agree on *5 as meaning something, and if enough people find it useful, in 200 years we will teach it in school as obvious.

I think the negative number is obviously useful (you can add up positive and negative numbers and get a balance, it has an easy way to be plotted, etc)

P.S. another example is the Orbifold notation, from Conway. https://en.wikipedia.org/wiki/Orbifold_notation


I prefer to think of natural numbers as 1D vectors. Negative numbers are therefore just a function of the direction of the vector.

I don't have a good explanation of what zero is in this context... Would someone else have a way to explain it (i.e. maybe related to the null space?).


So I'm no philosopher of math, but the intuitive way I think of it is that negative numbers are like borrowing a place holder.

Think about how electrical charge and currents work at the physical level. There's electrons, which have negative charge, or holes, which have positive charge. Holes are just an empty place an electron can go, rather than an extant particle.

Similarly, when we think of negative numbers in relation to counting numbers, we're just using a notational trick to keep track of a place holder or hole, a spot where a unary count can potentially go later to cancel it out.

There's a pretty fascinating book named Quantum Computing Since Democritus by Scott Aaronson, one of the leaders in that field. The idea of the book is "Could the ancient Greeks have discovered quantum mechanics?"

Much of the higher level math in the book is past my familiarity, but the central theme is clear and intuitive: if you take ordinary probabilities, generalize them to allow negative probabilities, and then generalize those to allow complex numbers, out pops quantum mechanics quite naturally.

Why do these two generalization steps make sense? What the heck is a negative probability of an event? It's exactly the same notational borrowing as above. Why generalize to complex numbers? That's more tricky, but I think of it from two directions: 1. It allows you to model partial constructive and destructive interference of probabilities, rather than just simple union or intersection. 2. The complex numbers are algebraically closed, while simpler number systems are not. So it feels natural that our ultimate number system for modeling nature would need to extend all this way.

I realize some math heavy folks would find my way of thinking of this a bit hand wavy, but it really has helped me cut through the confusion and mystery. It also fits in very well with a bayesian perspective on probabilities.

This, the "fields are what's real" perspective on physics, and bayesian epistemology in general have greatly simplified the way I think of these big idea question topics.


I believe the concept of negative number doesn't arise naturally from counting, you need equations – even though we teach it all at once to kids when introducing the number line.


You don't really need equations, you just need missing elements, or debt. Do I have extra bricks, just enough (0 extra) of them or am I missing some? The case of missing bricks is naturally modelled with negative numbers.


At t0, you say zero.

At t1, you say one.

...

At t7, you say seven.

Positive seven is what you say while your clock says t7. Negative seven is what you plan and "will" say at t7 while your clock still says t0.


How much do you add to five to get two?

edit: another intuition is that ordering is a property of numbers. Positive numbers are ordered in one direction. Can numbers be ordered in other directions?


Isn't multiplication by 0 natural?

If I'm at a party and everybody wants 2 beers, then if there is 1 person I need 2 beers (1 * 2), 2 people need 4 beers (2 * 2), but 0 people need 0 beers (0 * 2).


“Zero people” and “needing” is an oxymoron; nothing (zero) can’t be associated with any natural thing (like “needing”). That’s what I was trying to say in my not so clear comment above.

Writing this down, I realized we’re still trying to write in fancier words what the pre-Socratics had a clear understanding of 2500+ years ago. And speaking of the Greeks, it’s too bad that Wolfram didn’t mention Plato by name in the first few paragraphs; the chair example is practically taken from him (expecting a Parmenides quote would have probably been too much).


Say you have 5 bags with 5 apples in each. How many apples do you have? 5 x 5 = 25.

If you have 5 empty bags, 5 x 0 = 0.


When you combine understanding language and numbers you will have a hard time.


No one needs a fancy car. No one = zero people.


I think I understand what you're trying to say. Let me try to motivate algebra in less explicitly algebraic terms for you:

Zero is an algebraic concept of nothing. While it refers to no physical thing, its existence in algebra is necessary to describe several mathematical laws, and several properties it has in algebra derive inherently from its being the algebraic concept of nothing.

Let's define both addition and multiplication [1]. Addition is an abstract representation of combination: you combine a pile of 2 things and a pile of 3 things to get a pile of 5 things. Now zero is the model for what you can combine with anything else without doing anything: combining a pile of 0 things (that is, nothing) with a pile of 3 things leaves you a pile of 3 things. Negative numbers represent undoing a combination: combining a pile of 5 things with a pile of -3 things (i.e., "take 3 things from the pile") leaves you with a pile of 2 things. Negative numbers and zero may not necessarily have a direct physical analogue, but by introducing them for algebraic purposes, things actually become simpler: you use the same terminology and logic to deal with both adding and removing things, or perhaps coming to the conclusion that in the end there's no net effect.

Now, multiplication is scaling. Half of a pile of 2 things is 1 thing. Scaling and combining interact with each other, too. Taking half of a pile of 3 things and half of a pile of 5 things is the same as taking half of a pile of 8 things. Or I can say that doubling a pile and then adding another of the original is the same as tripling a pile (i.e., 2x + x = 3x).

This is where things get interesting. We can do nothing by adding a pile of something and immediately taking it away, leaving us with what we started (i.e., 0 = a·x - a·x). From above, we can also see that that is the same as adding a pile whose scale is 0 (i.e., a·x - a·x = (a - a)·x). Simplifying the equation a bit, we end up with 0 = 0·x: multiplying by 0 must yield 0 to make both addition and multiplication make sense. So the concept of nothing times anything yielding nothing isn't a requirement of nothing itself, but it's a requirement of how addition and multiplication work, and how nothing itself interacts with those operations.
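
(The same chain of steps, spot-checked numerically in Python for a few values:)

    # 0 = a*x - a*x = (a - a)*x = 0*x, whatever a and x are.
    for a in (2, -3, 7):
        for x in (5, -1, 4):
            assert a*x - a*x == (a - a)*x == 0*x == 0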

Incidentally, the deeper you dive into mathematics, the more important you realize the concept of 0--of nothing--actually is. The most powerful ways to describe operations are based on how they arrive at doing nothing in interestingly nontrivial ways. And things that don't have ways to do nothing tend not to be very interesting structures to look at.

[1] I'm alluding to vector spaces here, although by glossing over the difference between scalars and vectors, it could also be viewed in terms of rings instead.


Given how long it took to figure out that zero exists, I don't think it is natural, or at least it is a lot less natural than 1, 2, etc.

https://en.wikipedia.org/wiki/History_of_ancient_numeral_sys...: “Abstract numerals, dissociated from the thing being counted, were invented about 3100 BC”

https://en.wikipedia.org/wiki/0#History: “By 1770 BC, the Egyptians had a symbol for zero in accounting texts”

That’s over a thousand years, and that’s only the use of zeroes as a placeholder inside numbers. The use of a lone ‘zero’ symbol for the number zero seems to have taken over 2,000 more years (https://en.wikipedia.org/wiki/Brahmagupta#Zero)


Nobody needs no beers, therefore everybody needs at least one beer?

Language is funny.


As a kid, I recall that the thing that really made negative numbers click for me was underground floors, particularly when pressing on elevator buttons.


Because multiplication is just the number of times you add something to 0. If you add 2 to 0 three times you get 6; if you add 2 to 0 zero times you get 0.
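
In code form (a Python sketch), the "zero times" case needs nothing special; the loop body simply never runs:

    def times(n, k):
        total = 0
        for _ in range(n):    # add k to 0, n times
            total += k
        return total

    assert times(3, 2) == 6
    assert times(0, 2) == 0   # zero additions leave the starting 0 untouched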


As I’ve mentioned in a comment above, I find nothing natural in doing operations related to zero, i.e. related to nothingness, meaning that I see “adding” and “zero times” as an oxymoron (you cannot associate an action, “adding”, with nothingness, i.e. with “zero”).


Well by that logic if you have nothing, there's nothing you can do to it, so you will always have nothing, which reconciles quite nicely with multiplying numbers by 0 resulting in 0.


Yep.

Q: "How many times has it rained this week?"

A: "It has rained zero times"

Zero is a perfectly natural state.


I think humans could predict solar eclipses and solve quite a few differential equations before they managed to give that answer. The normal answer to that question was “It hasn’t rained this week” or even “What do you mean? It hasn’t rained this week”.

https://en.wikipedia.org/wiki/Brahmagupta#Zero: ”The Brāhmasphuṭasiddhānta is the earliest known text to treat zero as a number in its own right, rather than as simply a placeholder digit in representing another number”

That book is from around AD 628 (https://en.wikipedia.org/wiki/Brāhmasphuṭasiddhānta)


> I find nothing natural in doing operations related to zero i.e. related to nothingness, meaning that I see “adding” and “zero times” as an oxymoron (you cannot associate an action, “adding”, to nothingness, i.e. to “zero”).

Seems related to [this problem](https://stackoverflow.com/questions/47783926/why-are-loops-a...).

That is, a proper loop is like

    while (condition)
    {
        // ... body ....
    }
, where each iteration checks for the condition. But some folks might feel like a loop should always just do something first and then consider if it should repeat, i.e.

    do
    {
        // ... body ....
    }
    while (condition)
From that sort of perspective, zero might seem kinda unnatural and contrived, sorta like

    if (!zero)
    {
        do
        {
            // ... body ....
        }
        while (condition)
    }
In other words, the fallacy might lie in some assumption that doing something multiple times means doing it once and then possibly more times.


I think historically there were a number of cultures who had trouble with the mathematical concept of zero, so it's not universal.


I can envision relativity if our brain has relative anatomic constructs. That said, the symbolic idea is a source of struggle even for teens.


Start with ten fingers. Then lose two. You now have 8 fingers. What is the number of lost fingers, if not a negative number?


What do you mean? The number of "lost" fingers is 2.


My guess is that the aliens of which Wolfram speaks would know about numbers. After all, animals do (see e.g. https://www.bbc.com/future/article/20121128-animals-that-can...).


What if that's only a feature of animals that went through biological evolution on earth?


Apes together count.


"The brain does much more than just recollect; it inter-compares, it synthesizes, it analyzes. It generates abstractions.

The simplest thought like the concept of the number one has an elaborate logical underpinning; the brain has its own language for testing the structure and consistency of the world."

Thank you, Carl. Sit down, Stephen.


I guess one of the most fundamental differences between Wolfram’s model for fundamental physics and traditional physics is that Wolfram’s doesn’t have the concept of measure or of a continuum at the fundamental level. Space and time, according to Wolfram’s model of the universe, are emergent properties of ‘the network’. Without such things as space and measures, there are no numbers in the fundamental “equations” that drive the system.

While I am not a big fan of Wolfram’s physics (I think it’s a bit backwards that he took something that he knows very well and somehow finds that it’s how the universe works), I kind of like the idea that space and time emerge from something more computational.

I think that’s what the article should have been focused on...


>I think it’s a bit backwards that he took something that he knows very well and somehow finds that it’s how the universe works

That's in a sense how every model works. All theoretical models of the world are human inventions that aid in making sense of the world in terms familiar to us and nothing more. If you get good results from thinking the world is made of atoms, then the world is made out of atoms. When someone comes along and explains the same thing with strings, then the world is made out of strings. Models are just 'manners of speaking'. It might very well be that you have a dozen entirely different, but equally accurate fundamental ways to talk about a thing.


Well yes in a way. But it somehow makes the claim more dubious.

It’s like when you go see an osteopath and they tell you that your stomach problem is due to your bad posture...


It's so sad to see that he takes an interesting question and turns it into a sales pitch for his computational universe theory.

It's as if Stephen Wolfram's mental horizon keeps getting more and more restricted over the years, since he tries to frame everything he sees in terms of his physical theory.


Or maybe he has an interesting theory he understands deeply, and thinks it's valuable to show how this theory can give insight into the problem in question. Why be so cynical?


I had that view initially. And then I followed his writings for quite a few years and it started getting old.

And long. Loooong. His sales pitches seem to always go in circles. Computational irreducibility here, causal graph there, namedrop Wolfram Language a few times with pretty pictures, make sure that "in our models" Einstein's theory and quantum mechanics are mentioned as special cases and "surprisingly everything works out beautifully". Yeah, no, right.

Time to move on. I've read enough of his 20-mile-long self-praising sales pitch novels. He is just going in circles.


Repetition or iteration is fundamental to mathematics. Symbolic mathematics is iteratively applying axioms and schemas to elements of a language. Computation is iteratively applying rules or functions to sets. The correlation between numbers and iterations is strong; zero iterations is identity, one iteration is the unit of computing/application/doing-something. More iterations yield ordinal numbers, and there we are at numbers being inevitable.
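
A minimal illustration in Python: zero iterations really is the identity, and each count is just "one more application".

    def iterate(f, n, x):        # apply f to x, n times
        for _ in range(n):
            x = f(x)
        return x

    tally = lambda s: s + "|"
    assert iterate(tally, 3, "") == "|||"   # three applications: a tally of three
    assert iterate(tally, 0, "") == ""      # zero applications: x unchanged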

In theory a universe could be entirely continuous with no need to "compute" in any discrete steps but quantization in our universe suggests that numbers also have a place in correspondence to quantized aspects of reality.


‘If a lion could speak, we could not understand him’.

- Wittgenstein

So even things we understand as fundamental truths of the universe are so deeply rooted in human experience that it's possible another living entity could be so different than ourselves that the capacity to sympathize with one another about certain things may simply not exist. Would a lion understand a corsage? Would an alien know what 12 x 12 means? The universe is just one hella big ¯\_( ͡° ͜ʖ ͡°)_/¯


When we train neural networks, we don't tell them what to think and how to think about it. We mostly focus on overall structure, layers and we want useful outcomes. And yet we see lots of patterns that repeat ways in which we think.

I think if a lion could speak, we'd understand him, because we're much closer to "a speaking lion" ourselves than we realize. Our intelligence didn't just happen at the last mile between ape and humans. Our intelligence has been emerging from the simplest animal there is throughout our entire evolutionary history.

And even then we see birds, octopuses and so on, which branched off from us much earlier, use tools and solve problems very similarly to us. There are many different types of minds that can evolve. But I believe the fundamentals will always be the same. They're fundamentals of information processing systems. And BTW, we know dolphins, parrots and so on can do basic math, like counting, and translating that count to another set of objects. So they understand numbers; that's what numbers are at their most basic.

We marvel at how enormous the universe is, and how varied life in it can be, how different the minds of aliens may be. But in that seemingly humble pondering, there underlies something very arrogant. We think our mind is special. We think it's unique. And other minds will think extremely differently.

I think we'll figure out sooner or later, that our mind is basically inevitable, the broad strokes will be replicated for every species throughout the universe. This doesn't mean it'll be immediately easy to figure out aliens and their culture. But those are the details, not the fundamentals.


The way I see the universe is that all this complexity is just the interplay of a finite few axioms (the fundamental laws of physics). Given enough time and scale, the seemingly unique complexity converges back into a few (relatively speaking) patterns. If that wasn't the case the universe would be pure chaos.


Wouldn't it suck if you and I and the rest of the universe are the inevitable result of a tweet-long equation, and any change to the equation renders it unviable?

So everything is just one formula, and me picking my nose right now came from that formula.


Related to this I have often wondered if childhood amnesia is (by a crude analogy) a "format problem".

In this analogy, the lion is our younger selves. Perhaps our brain changes so much that we cannot understand (or access) what we were thinking or experiencing anymore.

If we imagine that long term memories are encoded in a way that they can later be interpreted actively by the structures of the brain, we could expect that major developmental changes in brain function might impact the ability to interpret long term memory formed before the change.

Physical trauma, the major reorganisations of infancy and language acquisition all prompt changes in organisation and processing that could be what renders earlier memories inaccessible.

This (admittedly hand-wavey) idea seems to mesh well with the observed details of the phenomenon. Frequently accessed memories may be re-encoded, leading to some continuity. Children themselves experience fairly continuous long-term memory. Types of memories where relevant processing might not have changed as much (smell, visual memory) might be more resilient. We would expect age of language acquisition to play a part. Etc.


While the post covers quite a bit of ground, it feels (to me) like it conflates knowledge representation, language, biological systems (i.e., the messiness of implementation), computability, and realism.

Regarding numbers in particular, there is a practically uncountable infinity of mathematical truths that apply equally to numbers and to other abstract (non-numerical) mathematical ideas.

I would rather see depth in one area or another than a conflation of ideas providing food for thought. Otherwise there are far too many variables to consider for a worthwhile analysis.


> While the post covers quite a bit of ground, it feels (to me) like it conflates knowledge representation, language, biological systems (i.e., the messiness of implementation), computability, and realism.

I understand that many of those are well-developed fields which exist on their own terms, with standard methods, standard questions, and standard approaches to moving towards answers.

The article jumps between these fields to grope for an answer to a simple question that in many ways can't be asked or answered within any of them.

One thing to consider is that present-day computers can follow the mechanical production of mathematical propositions almost completely. But computers have a lot of trouble producing or following arguments like this one, in "natural language", which have a definite logic to them but whose operation is not based only on explicit, codified rules.

Edit: To me, this sort of speculation is what philosophy actually should be doing. The questions that are "ill-defined but compelling" are the questions that have led to significant intellectual progress: how Zeno's paradox led (or at least was related) to the invention of calculus, how Einstein's thought experiments led to relativity, etc.


> conflates knowledge representation, language, biological systems [...], computability, and realism.

You have pretty concisely described Wolfram's wheelhouse.


Because of the limitations of language, there is only a countable number of mathematical truths that can be proven or written down. So for all practical purposes there are countably many.


There are countably many proofs, but a proof does not entail just a single truth. Many truths can be entailed by a single proof; in fact, an uncountable number of truths can be entailed by a single proof.

That does not mean that all truths can be written down or enumerated. But strictly speaking, the fact that the set of all proofs is countable is not sufficient to conclude that the set of all truths entailed by those proofs is also countable.

None of this should be taken to violate Gödel's incompleteness theorems.

Finally, it's worth mentioning that not all formal systems are limited to finite proofs. There are formal systems where theorems as well as proofs can be countably infinite in length and where there are uncountably many proofs. These systems, known as infinitary logics, are often reducible to second-order logic and hence are incomplete.


Could you prove, for every real value between 0 and 1, that it's greater than or equal to 0? That would be an uncountable number of proofs.


No, because proofs have to consist of a finite number of words. Thus there are only countably many proofs of anything. In particular, there are only countably many reals between 0 and 1 which can be expressed in a finite number of words.
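
One way to see the countability: fix the (finite) alphabet proofs are written in, and you can list every finite string in a single sequence, so every written-down proof sits at some finite index. A minimal sketch in Python, assuming nothing beyond a finite symbol set:

    from itertools import count, product

    ALPHABET = "ab"   # stand-in for whatever finite symbol set proofs use

    def all_finite_strings():
        # Yield every finite string over ALPHABET exactly once:
        # all strings of length 1, then length 2, and so on.
        for n in count(1):
            for chars in product(ALPHABET, repeat=n):
                yield "".join(chars)

    # Any given finite string -- hence any written-down proof -- appears at
    # some finite position in this one list, which is what "countable" means.
    gen = all_finite_strings()
    print([next(gen) for _ in range(6)])   # ['a', 'b', 'aa', 'ab', 'ba', 'bb']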


Are there a finite number of words? Language seems to grow and adapt to new concepts as needed, perhaps there is an infinity of linguistic descriptions available to us.


It's not. It's a single proof about a set, a set that's assumed to be uncountable in standard ZF set theory.

The "axiom system" that (supposedly) contain a countable number of axioms. But these too are constructs of set theory. We still create proofs one by one of theories about axiom systems with infinite axiom - so we have a countable/enumerable set of such theories.

The proof systems we can see or touch all have this enumerability property. Perhaps you could change that with an analogue computer that a person could input "any" "quantity" into, but that's outside math as things stand.


Do you mean the proof that 0.25 >= 0 and the proof that 1/e >= 0 count as the same one, because there's a more general proof that a set of values including those is >= 0? But then where do you draw the line? When can you consider 2 proofs different enough to count as different ones?


I think you have a slightly stricter definition of "a proof" than me. I would consider a proof that all the numbers in (0,1) are positive to also be a proof that the number 0.5 is positive, as well as the number 1/e, and Champernowne's constant.

Since the original question was about uncountably many mathematical truths I would say we have one proof that proves uncountably many mathematical truths.


It is an abstract proof that acts as a generator for concrete proofs for specific assignments to the variables. The potential is uncountable, but only a countable subset will ever be invoked.


I don't know what it really means for a proof to be invoked, and I also don't really like the idea of separating proofs into concrete and abstract proofs. Either it proves something or it doesn't.


Only a countable subset of such potential proofs exists, which is more than enough to answer every question of that type that can ever be asked.

Countability is not just a limit on our ability, it's also a limit on our needs.


But we can make new words.


Only an (unbounded) finite number of them.


how so?


Numbers are a consequence of conservation laws. If you relax the definition of a natural number as something that is stable in time and space, you will see that another system is possible. Gosh, even the periodic table is a good example of what is possible if no strict proton-oriented association is in mind.


Numbers are inevitable.

How many people are we? How many berries do we need to collect for each?

Later: How many seeds do we need to sow. How big a foundation is needed for this building… How many suns until the weather gets warmer again?

I suppose you could do without if you lived in a cave with endless food readily available outside.


> How many people are we?

We are Steven, Simon, and Sarah.

> How many berries do we need to collect for each?

A handful for each.

> How big a foundation is needed for this building.

What building? You haven't even built the foundation yet.

And so on. You can do without numbers. And in fact, for much of the history of humanity, people did all of the above without measuring quantities in any meaningful way.


We use the natural numbers as an abstraction to understand computation. Arithmetic and number theory have isomorphisms to other areas of mathematics. The proofs-as-programs equivalence shows that math and computation are the same. So I like to think numbers are how we perceive the computational aspect of reality.

https://en.wikipedia.org/wiki/Curry%E2%80%93Howard_correspon...
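
A tiny, hedged illustration of the correspondence: under Curry-Howard, a program of type (A -> B) -> (B -> C) -> (A -> C) is a proof that implication is transitive, and writing the program is writing the proof. Sketched in Python's type-hint notation (names are mine, just for illustration):

    from typing import Callable, TypeVar

    A = TypeVar("A")
    B = TypeVar("B")
    C = TypeVar("C")

    # Proposition: (A -> B) and (B -> C) together imply (A -> C).
    # The function body is the proof: it constructs the implication.
    def transitive(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
        return lambda a: g(f(a))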


Arguably, number came from money. Not just the counting, but the generalization/fungibility - money can be exchanged for anything (unlike barter); number can represent a quantity of anything.

Trade and money have more direct survival advantages than number does, providing an evolutionary gradient for improving the cognitive capacity that supports this generalization.


Shouldn't language predate money?

Language or cave paintings or whatever use discrete symbols to describe physical reality.


Interesting! You can have a 1:1 mapping between symbols and objects, i.e. without generalization.

Sidenote: the oldest writing found (so far) is of accounting...


Numbers are not fundamental. What is fundamental is the laws of arithmetic: 1 + 2 == 2 + 1. Laws like that can only be expressed in some language, which we call "numbers". But it is only by knowing and using such laws that "numbers" become useful.

So in some weird sense numbers allow us to define the laws and the laws define numbers. No?


The concept of symbols is inevitable.


When my two year old can't count the items in front of him he just says it's "many".


IIRC (edit: and as alluded to in the article) there are languages which count as "one, two, three, many".

But one, two, and three are also innate (immediately recognizable) quantities, are they not? So does a person using such a number system actually count, or only recognize and categorize?


This Lexicon Valley podcast goes into the “having a vocabulary for large numbers is actually weird” thing.

https://slate.com/podcasts/lexicon-valley/2021/03/english-la...


Great addition (no pun intended), thanks.


The Pirahã language is probably the most famous non-counting language. They appear to recognise - https://slate.com/human-interest/2013/10/piraha-cognitive-an...


Depends on context, maybe; Koreans have different number names depending on context.


One two infinity


Eh, idk. Something that moves uses energy, and something that uses energy needs to replenish said energy. A natural question becomes how many recharges traveling a path needs. Of course you can think about it as positions, but the stops themselves are intrinsically countable.


To me the concept of numbers is inherent in the dualistic minds we all have. If there is "me" and "other", there is 1 and 1, together making 2... a grouping of similar "others" is 3, and so on. It's simply just our nature.


Odd, I was thinking about this earlier today. I came to the conclusion that an axiomatization of math sans numbers doesn't make sense because you have to have a certain number of axioms. I'm certain a counterargument could be made though.


Well [Kind](https://github.com/uwu-tech/kind) has only one "axiom" (the lambda), so, I don't know? This is a really inspiring thought and I'm glad people are debating it for the content and not attacking Wolfram just because.
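
For what it's worth, you really can bootstrap numbers out of the lambda alone. A minimal sketch in Python of the standard construction (this is not Kind's actual syntax, just the general idea):

    # Church numerals: the number n is "apply f, n times". Nothing here
    # but lambdas (plus a peek function so we can print a result).
    zero = lambda f: lambda x: x
    succ = lambda n: lambda f: lambda x: f(n(f)(x))
    add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

    def to_int(n):
        return n(lambda k: k + 1)(0)

    two   = succ(succ(zero))
    three = succ(two)
    print(to_int(add(two)(three)))   # 5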


Gödel showed that 'less than', 'equal to', and 'more than' (i.e. pairing) are essential. Individual natural numbers are cute but not compelling. Even if God created them.


One of my favorite thought games is trying to imagine an alien species who started off with a different way of counting/measuring things. Say, maybe going with complex numbers right away.


As late as the development of any resource-gathering society.


I've always thought of numbers, and the broader system of mathematics, as equivalent to the Newtonian understanding of physics - an adequate tool to explain observed reality - but the edge cases that break the system are starting to pile up, and we are just waiting for someone to discover the quantum/general-relativity theory of mathematics.


As someone who has recently started diving deeper into math, that is very interesting and something I was totally unaware of. Could you point me to some of these edge cases?


"God made the integers, all else is the work of man. " Leopold Kronecker


A more modern mathematician might say something like "God made the empty set, all else is the work of man".


In the beginning was the primordial distinction between darkness and light, nothing and something, before and after. Its name was the Unit, its faces named zero and one.

And with the Unit came the Successor, the primordial operation, for what is cleaved may ever be joined. From one comes two, from two comes three, and ever on until forever and always, with each successor given a name as a number.

And with these numbers came a set comprising them, and the set was Natural and good, and from it came many wondrous things.

For from repetition of the Successor came Addition, and the set was closed under Addition, and the Counter saw that this was good.

And from repetition of Addition came Multiplication, and the set was closed under Multiplication, and the Counter saw that this was good.

And from repetition of Multiplication came Exponentiation, and the set was closed under Exponentiation, and the Counter saw that this was good.

But if a thing can be done it can be undone. What is given can be taken away. If there is Addition there must be Subtraction. A shadow fell over the face of the Counter for under Subtraction the set was not closed.

Yet the set of Natural numbers had its closure under Subtraction, and this closure was another set named Integers, and the Counter saw that the Integers were good.

But if Addition of a Natural number has an inverse, so too must Multiplication by a Natural number, and this inverse was Division. A shadow passed again across the face of the Counter for under Division by a Natural number the set of Integers was not closed.

Yet the set of Integers had its closure under Division by a Natural number, and the closure was another set named Rational numbers, and the Counter saw that the Rational numbers were good and rejoiced at their scope, for between any two Rational numbers was an infinity of other Rational numbers, each with its own name.

But if Addition and Multiplication by Natural numbers have inverses, so too must Exponentiation, and indeed, so must the combination of Addition, Multiplication, and Exponentiation in a polynomial with Integer coefficients, and this inverse was the finding of Roots. A shadow passed again across the face of the Counter for almost never were the Roots of polynomials Rational.

Yet the Roots of polynomials with Integer coefficients gave rise to a new set, the set of Algebraic numbers, and the Counter saw that the Algebraic numbers were good and rejoiced at their scope, for the Algebraic numbers have complexities that delight and amaze, and each has its own name.

And yet.

Almost no number is Algebraic.

Almost every number belongs instead to a Transcendental realm where there are many terrors and almost nothing can be named.
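
The tower described above is just iterated application, which is easy to sketch. A toy in Python, with the caveat that the real construction is axiomatic, not executable code:

    zero = 0
    def successor(n): return n + 1             # the primordial operation

    def repeat(op, times, start):
        # Fold an operation over itself: the engine behind each rung.
        result = start
        for _ in range(times):
            result = op(result)
        return result

    def add(a, b):   return repeat(successor, b, a)               # repeated Successor
    def mul(a, b):   return repeat(lambda x: add(x, a), b, zero)  # repeated Addition
    def power(a, b): return repeat(lambda x: mul(x, a), b, 1)     # repeated Multiplication

    print(add(2, 3), mul(2, 3), power(2, 3))   # 5 6 8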


Did you write this?


"We take in some visual scene. But when we describe it in human language we’re always in effect coming up with a symbolic description of the scene."

The human mind works in a qualitative manner; we need symbols to translate qualitative perception into quantitative abstraction. This started as a need of economic and social organization (geometry), but later evolved into Mathematics and came to overcome the shortcomings of natural language in describing reality (philosophy became science).

Numbers are just symbols that map human perception to a reality that is inherently quantitative.

I fail to grasp what Mr. Wolfram tries to explain here, but it looks to me as if he is regressing into philosophy.


It's inevitable.


And memory


Every number is a random number.


All structures in our universe are trees^. Numbers are a metalanguage for describing those trees. It is possible that there are other independent universes that we can't perceive, but I'd expect any aliens that we eventually interact with to be operating only in the treeverse, and so also be fluent in a language like our numbers.

^ Perhaps there are other independent universes out there that we don't perceive, but human brains are trees and all structures we can perceive and communicate about also are trees. We live in a treeverse.


I genuinely don't understand what this means. How is a beach ball a tree? What about the symmetry group of the beach ball?


> How is a beach ball a tree?

      _ 
    /   \
    |   |
    \ _ /


That is neither a beach ball, nor a tree.


This is some serious science happening.


The search for nature's joints.


The word "finger" did not occur so I skipped it.


"Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something."

https://news.ycombinator.com/newsguidelines.html


It's certainly a big reason why we're so adept at counting, but don't you think speech plays a big role too (and could be sufficient)?

I mean, numbers are fingers, but they're also a kind of song ("one, two, three...") that you map onto things.


> also a kind of song ("one, two, three...") that you map onto things.

Apparently it's as easy as "A, B, C...".


There is a Brazilian tribe (piranhã) that knows no concept of quantity besides one, two, and many. Also, in their language, verbs are not inflected for time. This is probably due to their lifestyle, which needs no long-term planning, discussions about the past, or managing of multiple instances of the same resource.


Basically every single claim about Pirahã (not Piranha) needs to be taken with a huge grain of salt. There's just not enough people who have studied the language and the claims are so strong and unparalleled that we really ought to have more evidence between making any conclusive statements.


Hmmm... I stand corrected. Although I lived in the Amazon region for a few years (1993 to 2002 in Rondônia), I never formally studied any native language and mostly blindly believed the Wikipedia article on the Pirahã people[0], and probably mixed in things from the Hopi time controversy[1].

Thanks for such info.

Ironically, my English was mostly learnt from people from Europe who came there to conduct research. This is amazing and sad at the same time. But the fact that some tribes can whistle names, words, and entire phrases, and use that skill to communicate while hunting in the jungle, is something I was told by a native speaker; I can't remember which tribe or language it was - probably Caripuna or Suruí.

[0] https://en.wikipedia.org/wiki/Pirah%C3%A3_language#Verbs

[1] https://en.wikipedia.org/wiki/Hopi_time_controversy


I'm not claiming that what Everett et al. are saying is wrong, just that the claims are so spectacular that I'd like to see more evidence before I believe them.


Interesting. I would have thought that just by looking at their fingers and toes they'd arrive at a set of 1 to 10 at least...


Saying that the universe is irreducible with pockets of reducibility is a total oxymoron. It's not "irreducible" if parts of it are reducible.

"This wall is impenetrable, except for the holes there. Don't mind them I'm making a point here!"


He's using a technical definition of irreducibility [https://en.wikipedia.org/wiki/Irreducibility] for which the claim is true. It doesn't mean "cannot be expressed more concisely at all".
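
Roughly: "computationally irreducible" means there is no general shortcut past simulating every step, which is perfectly compatible with special cases that do have shortcuts. A small sketch (my own toy, not Wolfram's code): Rule 30 is the standard example believed to be irreducible, while Rule 90 is a reducible "pocket", since its rows are just Pascal's triangle mod 2:

    def step(cells, rule):
        # One step of an elementary cellular automaton (wrap-around row).
        n = len(cells)
        return [(rule >> (cells[(i - 1) % n] * 4 +
                          cells[i] * 2 +
                          cells[(i + 1) % n])) & 1
                for i in range(n)]

    row = [0] * 31
    row[15] = 1
    for _ in range(10):         # Rule 30: no known way to jump ahead --
        row = step(row, 30)     # you pay for every single step.

    # Rule 90, by contrast, has a closed form (its rows are the binomial
    # coefficients mod 2), so you can compute step one million directly:
    # a reducible pocket inside an otherwise irreducible universe.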


It is if I'm an idiot



