JSON’s numbers are not IEEE-754. They’re decimal literals with an unbounded number of digits; the grammar puts no limit on precision. It’s up to each parser to decide how to represent them. Python, for instance, parses them into arbitrary-precision integers when there isn’t a decimal point.
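
For example, a quick sketch with Python’s standard json module (the literal values here are just illustrative):

    import json

    # No decimal point: the stdlib parser returns a Python int, which is
    # arbitrary precision, so even a huge literal survives intact.
    print(json.loads("90071992547409931234"))      # 90071992547409931234
    print(type(json.loads("42")))                  # <class 'int'>

    # With a decimal point (or exponent) it returns a float, i.e. an
    # IEEE-754 double, so the usual precision caveats apply.
    print(json.loads("0.1") + json.loads("0.2"))   # 0.30000000000000004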

JavaScript is in the name, but be careful not to assume JSON numbers behave like JavaScript numbers. JSON is not JavaScript.




You wrote this as if it’s a defense, but honestly I feel even more terrified of JSON numbers now than I was before entering this thread and reading your comment.

Not following a set standard is effectively undefined behaviour, and leaving things up to the implementation is a well-known source of problems in other areas of computer science, such as C compilers.


Yes, but this is a necessary limitation for any human-readable number format. The consuming context decides what to deserialize into, and different contexts/languages will choose bigint vs. i64 vs. u64 vs. i32 vs. double vs. quad vs. float, whatever is convenient for them.

Heck, some of them will even choose a different endianness, and sometimes it will matter.

I still remember the first time I dealt with a Java developer who was trying to send us a 64-bit ID, and trying to explain to him that JavaScript can only represent integers exactly up to 53 bits, and how his eyes widened in such earnest disbelief that anybody would ever accept something so ridiculous. (The top bits were not discardable; they redundantly differentiated between the environments the objects lived in, so all of our dev testing had been fine because the top bits were zero for the dev server in Europe, but then you put us on this cluster in your Canadian datacenter and now the top bits are not all zero. Something like a shard of the database.) We have bigints now, but JSON.parse() still can’t return them by default. "Please, it’s an ID, why are you even sending it as a number anyway, just make it a string." But they had other customers whom they didn’t want to break. It was an early, powerful argument for UUIDs, hah!
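
That failure mode is easy to reproduce even without JavaScript; here’s a small Python sketch with a made-up ID, using float as a stand-in for the double that JSON.parse produces:

    # A made-up 64-bit ID with the top bits set, like the shard bits above.
    big_id = 0xF000_0000_DEAD_BEEF

    # JSON.parse stores every number as an IEEE-754 double, which can only
    # represent integers exactly up to 2**53; round-tripping through a
    # double silently changes the value.
    print(big_id == int(float(big_id)))            # False

    # The boundary itself: 2**53 survives, 2**53 + 1 does not.
    print(int(float(2**53)) == 2**53)              # True
    print(int(float(2**53 + 1)) == 2**53 + 1)      # False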


It also means you can use JSON for incredibly high-precision cases by having your parser read numbers into a Decimal type. You couldn’t do that if those limits were baked into the format itself.

Edit: Omg, that story. Eep. I guess if someone provided too-large numbers in JSON, you could use a custom parser to accept them as strings or bigints. Still, that must not have been a fun time.
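
In Python, for instance, that customisation is just the parse_float/parse_int hooks on json.loads (the document below is invented):

    import json
    from decimal import Decimal

    doc = '{"id": 18446744073709551615, "price": 0.1000000000000000055511}'

    # Keep full precision instead of collapsing everything to a double.
    data = json.loads(doc, parse_float=Decimal)
    print(data["price"])            # 0.1000000000000000055511
    print(data["id"])               # 18446744073709551615 (Python ints are bignums)

    # Or hand oversized integers to downstream code as strings.
    as_strings = json.loads(doc, parse_int=str)
    print(repr(as_strings["id"]))   # '18446744073709551615'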


Yeah, I believe I hand-patched Crockford’s json2 parser? It was something like that.


JSON isn’t intended to pin down every detail. That’s up to the producer and consumer. If you use JSON, you specify these details in your API. JSON isn’t an API.

I wonder how many times this gets violated though, and how many times this “I dunno… you decide” approach causes problems.


If you want something stricter, specify it in a JSON Schema and use that[1].

You could declare your own "int32" type[2], for example, and use that, then validate the input JSON against the schema before processing it further (a rough sketch follows the links below).

[1]: https://datatracker.ietf.org/doc/html/draft-bhutton-json-sch...

[2]: https://json-schema.org/draft/2020-12/json-schema-core.html#...
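
Here’s that idea sketched with the third-party jsonschema package in Python; the schema and the "count" property are invented for illustration:

    import json
    import jsonschema

    # Invented schema constraining "count" to the int32 range.
    schema = {
        "type": "object",
        "properties": {
            "count": {
                "type": "integer",
                "minimum": -(2**31),
                "maximum": 2**31 - 1,
            }
        },
        "required": ["count"],
    }

    doc = json.loads('{"count": 3000000000}')

    # Raises jsonschema.ValidationError: 3000000000 is above 2**31 - 1.
    jsonschema.validate(instance=doc, schema=schema)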


You could invent a data format that is very explicit about having integers, and a JavaScript implementation would still spit out floating-point values, because that’s all the language has.

So either you don't target JavaScript (which would be a bit silly in the case of JSON), or you go the other way and forbid integers even in languages that do support them, which is also kind of silly.

Ultimately the real issue is that JavaScript's Number type has no integers, and if you're interacting with it you need to be aware of that, JSON or not.


Doesn't matter.

The baseline is anything written in C or C++, which don't have built-in bignum or decimal types, so their JSON libraries more or less always parse numbers into an int64 or a double at best.



