Why not JSON?

Posted Aug 11, 2017 18:26 UTC (Fri) by nybble41 (subscriber, #55106)
In reply to: Why not JSON? by BenHutchings
Parent article: An alternative device-tree source language

> JSON doesn't have 64-bit integers.

The JSON "number" type places no limit on range or precision. All 64-bit integers, both signed and unsigned, can be encoded as JSON numbers. It is true that not all JSON parsers *read* such integers accurately; JavaScript, for example, treats all numbers as double-precision floating point and thus can only handle integers up to 2^53 without rounding. That, however, is a limitation of the parser, not of the file format. One needs to be careful when selecting tools to manipulate JSON files to ensure that the full precision of the original data is preserved.
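
As a concrete illustration, here is a minimal sketch in Python, whose standard json module parses integer literals as arbitrary-precision ints (any parser with an exact integer representation behaves the same way):

    import json

    # 2**53 + 1: the smallest positive integer a 64-bit double
    # cannot represent exactly.
    text = "9007199254740993"

    # Python's json module parses integer literals as Python ints,
    # so the full value survives the round trip.
    print(json.loads(text))    # 9007199254740993

    # A parser that stores every number as a double (as JavaScript's
    # JSON.parse does) silently rounds to the nearest representable value.
    print(int(float(text)))    # 9007199254740992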



Why not JSON?

Posted Aug 12, 2017 9:51 UTC (Sat) by mbunkus (subscriber, #87248)

Exactly. A lot of folks assume that JSON's number type is limited the same way JavaScript's is, but that simply isn't true. The JSON spec does not limit it at all. JSON is independent of JavaScript.

As an example of how widespread this false assumption is, look at Qt[1]. Their JSON implementation stores all numbers internally as doubles, losing significant precision for large integer values. This bug has been present for the whole of Qt 5's lifetime and is still unresolved, making the implementation completely unusable for interfacing with external systems where you do not control the data format.

[1] https://bugreports.qt.io/browse/QTBUG-28560
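
To see the scale of the loss, here is a minimal sketch in Python, standing in for any JSON implementation that stores numbers as doubles (Qt's included), round-tripping INT64_MAX through a double:

    INT64_MAX = 2**63 - 1               # 9223372036854775807

    # Storing the value in a double, as a double-backed JSON
    # implementation would, changes it: the nearest double is 2**63.
    as_double = float(INT64_MAX)
    print(int(as_double))               # 9223372036854775808
    print(int(as_double) == INT64_MAX)  # False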

