I've just checked a few math books I like, and they use ℕ and ℕ₀ throughout. What I don't like about ℤ⁺ is that 1) it talks about integers, although negative numbers might not be relevant in the current context at all, and 2) it's not really clear whether ℤ⁺ denotes ℕ or ℕ₀. Of course, you have the same ambiguity problem with ℕ itself.

Oh, see how I numbered this 1) and 2), not 0) and 1)?




> Oh, see how I numbered this 1) and 2), not 0) and 1)?

So what?

> 2) it's not really clear if ℤ⁺ denotes ℕ or ℕ₀.

On the contrary. ℕ is ambiguous between the positive integers and nonnegative integers, though in my experience it's usually the nonnegative integers.

But ℤ⁺ is absolutely unambiguous. It's the positive integers.

And since you frequently need to refer to the nonnegatives, it makes sense to have a symbol for them too. There's almost no benefit to having ℕ available as a synonym for ℤ⁺. There's a lot of benefit to having it available as a synonym for ℤ\ℤ⁻.
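
Spelled out in set-builder notation (my own summary in LaTeX, not from any one reference):

    \mathbb{Z}^{+} = \{\, n \in \mathbb{Z} : n > 0 \,\} = \{1, 2, 3, \dots\}
    \mathbb{Z}^{-} = \{\, n \in \mathbb{Z} : n < 0 \,\} = \{\dots, -3, -2, -1\}
    \mathbb{Z} \setminus \mathbb{Z}^{-} = \{\, n \in \mathbb{Z} : n \ge 0 \,\} = \{0, 1, 2, \dots\}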


> So what?

Well, it means I usually need either ℕ or ℤ, not ℕ₀.

> And since you frequently need to refer to the nonnegatives

It depends on the context. You don't use ℕ₀ much if you agree that counting starts at 1; you then use either ℕ or ℤ, which just looks cleaner. There is no reason to complicate such a simple concept as {1, 2, 3, ...} with a complicated symbol like ℤ⁺. As I said before, to understand {1, 2, 3, ...} you don't need the concept of negative numbers at all, so why drag it in through confusing symbols?

> But ℤ⁺ is absolutely unambiguous. It's the positive integers.

Or the non-negative integers. And that's not just my opinion. Here is what ChatGPT has to say about that:

*system*: You are a mathematics professor, and happy to chat about mathematics and the philosophy of mathematics.

--------------------

*user*: What would you say ℤ⁺ stands for? Are there multiple options?

--------------------

*assistant*: ℤ⁺ typically stands for the set of positive integers. This usually includes all the whole numbers greater than zero. However, there can be some variation in its definition depending on the context.

In some contexts, ℤ⁺ might be defined to include zero, so it would represent the set of non-negative integers. This is less common, but it does occur in some areas of mathematics.

The symbol ℤ comes from the German word "Zahlen," which means "numbers." The superscript "+" is used to denote that we're only considering the positive members of this set (or non-negative, in some cases).

So, yes, there can be multiple options for what ℤ⁺ represents, but they are all closely related. It's always a good idea to clarify these kinds of notational issues when they arise, to avoid any potential confusion.


> if you agree that counting starts at 1

I argue that counting really starts at 0 in https://news.ycombinator.com/item?id=33022031


I guess it is the difference between measuring the size of a set and labelling the elements of the set with a number. Both can be called "counting", but the meaning is different. The first of something should always be labelled 1, but obviously the empty set has size 0.

That's the main problem I have with using ℕ for {0, 1, 2, ...}. It's easy to get into the habit of writing stuff like x_0, ..., x_{n-1} for n elements, and that's just ugly. x_1, ..., x_n is much better and clearer. On the other hand, 0 is useful when it comes to measuring the size of something (an offset, for example, or age).
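
To make the two uses concrete, here is a small Python sketch (my own illustration, not anything from upthread):

    # Sizing: the empty collection has size 0.
    items = ["a", "b", "c"]
    print(len(items))  # 3
    print(len([]))     # 0

    # Labelling: the first element gets label 1.
    for label, item in enumerate(items, start=1):
        print(label, item)  # 1 a, 2 b, 3 c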

I think ℕ and ℕ₀ is a good way out of this dilemma. ℕ is the natural numbers, and ℕ₀ is the natural numbers with, well, 0.

The other way out of this dilemma is what most here prefer, I guess: to say a label is just a label, and starting with label 0 when labelling the elements of a set is just as good as starting with label 1. Then you just need ℕ = {0, 1, ...} and ℤ for the integers, and you will not have much use for ℕ⁺ = {1, 2, ...}, because now sizing something and labelling something are one and the same. So you write x_0, x_1, ..., x_{n-1}, and you start counting from 0. I don't know, I just don't like it, but in the long run maybe it is less confusing, because it unifies the concepts of sizing and labelling, and now you can call both of them just counting.
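
In code, this unified view is what 0-based languages give you anyway; a minimal Python sketch of the idea:

    # With 0-based labels, sizing and labelling coincide:
    # an n-element sequence has labels 0, ..., n-1, and each
    # label equals the number of elements before it (its offset).
    xs = ["a", "b", "c"]
    n = len(xs)          # sizing: n == 3
    for i in range(n):   # labelling: labels 0, 1, ..., n-1
        print(i, xs[i])  # each label i is also the offset of xs[i]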


> I think ℕ and ℕ₀ is a good way out of this dilemma.

I think it's no better than using ℕ and ℕ⁺ [1]. Note that all formal definitions on that Wikipedia page take ℕ = ℕ₀.

[1] https://en.wikipedia.org/wiki/Natural_number#Notation


I added a bit to my previous answer before seeing your reply. But yes, in terms of notation it does not really matter whether you use ℕ and ℕ₀, or ℕ⁺ and ℕ. Both ℕ₀ and ℕ⁺ are slightly annoying compared to just ℕ, though, so the choice nudges where you start counting from: 0 or 1. If you start counting from 0, you will mostly not need ℕ⁺ and will mostly just use ℕ and ℤ. If you start counting from 1, you will use ℕ more than ℕ₀, but you will use ℕ₀ often enough that you need all three: ℕ, ℕ₀ and ℤ.

Logic likes to unify things, so formal definitions usually start with 0, and conflate sizing and labelling. Note that Peano first started counting from 1. Later on he changed it to 0. Doesn't mean that's the right thing to do, though. Maybe these two concepts should be kept separate: ℕ for (default) labelling, and ℕ₀ for sizing.
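
For reference, a sketch of the Peano-style construction starting at 0 (in LaTeX; just the generation rules, omitting the injectivity and induction axioms). Starting the same scheme at 1 instead gives {1, 2, 3, ...}:

    0 \in \mathbb{N}
    n \in \mathbb{N} \implies S(n) \in \mathbb{N}
    \mathbb{N} = \{0,\, S(0),\, S(S(0)),\, \dots\} = \{0, 1, 2, \dots\}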



