Why not JSON?
Posted Aug 11, 2017 18:26 UTC (Fri) by nybble41 (subscriber, #55106) In reply to: Why not JSON? by BenHutchings
Parent article: An alternative device-tree source language
The JSON "number" type places no limit on range or precision. All 64-bit integers, both signed and unsigned, can be encoded as JSON numbers. It is true that not all JSON parsers *read* such integers accurately; for example, JavaScript treats all numbers as double-precision floating point, and thus can only handle integers up to 53 bits without rounding. However, this is a limitation of the parser, not of the file format. One needs to be careful when selecting tools to manipulate JSON files to ensure that the full precision of the original data is preserved.
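A quick sketch of the distinction, using Python's `json` module (which parses integers with arbitrary precision) as the well-behaved parser and a cast to `float` to stand in for a double-based parser:

```python
import json

# A 64-bit value well outside the 53-bit range that a double can
# represent exactly
big = 2**63 - 1  # 9223372036854775807

# Python's json module reads JSON integers as arbitrary-precision
# ints, so the round trip through the *format* is lossless
assert json.loads(json.dumps(big)) == big

# A parser that stores every number as a double (as JavaScript does)
# rounds the same value; the information is lost at parse time, not
# in the JSON text itself
assert int(float(big)) != big
print(int(float(big)))  # 9223372036854775808 -- off by one
```

The same JSON text produces exact or rounded results depending solely on which parser reads it, which is the point being made above.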
Posted Aug 12, 2017 9:51 UTC (Sat) by mbunkus (subscriber, #87248)
As an example of how widespread this false assumption is, look at Qt[1]. Their JSON implementation stores all numbers internally as doubles, losing precision for large integer values. This bug has been present for the whole of Qt 5's lifetime and is still unresolved, making the implementation unusable for interfacing with external systems where you do not control the data format.