Description
Context
JSON technically places no limits on the size or precision of numbers. In practice, however, the programming language or library used to interpret the JSON will make assumptions.
I think it would be reasonable to put at least a size limit of 2^63 - 1 on (positive) JSON integers so that they can easily be processed by languages with something like the C# or Java (signed) "long" type, the default PHP integer type on 64-bit systems, etc.
However, I would suggest going further and reducing this limit to 2^53 - 1, which is the largest "safe" integer representable in JavaScript's primitive number type. JavaScript represents a large part of the OpenActive community, and an absolutely necessary one due at least to web applications.
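To illustrate why 2^53 - 1 is the boundary: JavaScript numbers are IEEE 754 doubles, so integers above `Number.MAX_SAFE_INTEGER` collapse onto their neighbours, and `JSON.parse` silently alters them:

```javascript
// Number.MAX_SAFE_INTEGER is 2^53 - 1: the largest integer a JavaScript
// number can represent unambiguously.
const maxSafe = Number.MAX_SAFE_INTEGER; // 9007199254740991

// Above that boundary, distinct integers collapse to the same value:
console.log(2 ** 53 === 2 ** 53 + 1); // true

// JSON.parse inherits the problem, silently altering the number:
const parsed = JSON.parse('{"n": 9007199254740993}');
console.log(parsed.n); // 9007199254740992
```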
Proposal
- Impose a limit on all integers, so that the validator rejects any value outside the range -(2^53 - 1) ≤ x ≤ 2^53 - 1
- This would be imposed on schema data but also, importantly, on the RPDE `modified` field, where imprecision could lead to the wrong update being considered the "latest"
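To make the `modified` risk concrete, here is a hypothetical RPDE page (item IDs and values are illustrative) where two genuinely different `modified` values become indistinguishable after parsing, so ordering by `modified` can no longer identify the latest update:

```javascript
// Two items whose "modified" values exceed 2^53 - 1 and differ by 1.
// After JSON.parse both collapse to the same double.
const page = JSON.parse(
  '[{"id": "a", "modified": 9007199254740993},' +
  ' {"id": "b", "modified": 9007199254740992}]'
);

// Ordering by "modified" can no longer tell which update is newer:
console.log(page[0].modified === page[1].modified); // true
```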
Checklist
- Implement
- Document (e.g. in the Data Publisher Docs). Include instructions not to use the particular SQL Server timestamp mechanism that outputs numbers over 2^60
- Remove this section about dealing with large integers, which will no longer be needed: https://developer.openactive.io/using-data/harvesting-opportunity-data#storing-rpde-modified-with-less-than-64-bit-integers
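The "Implement" item could look something like the following sketch (the function name is illustrative, not the actual OpenActive validator API):

```javascript
// Bounds from the proposal: integers must satisfy
// -(2^53 - 1) <= x <= 2^53 - 1.
const MAX_SAFE = 2 ** 53 - 1;
const MIN_SAFE = -(2 ** 53 - 1);

// Hypothetical validator check: accept only integers within the safe range.
function isAcceptableInteger(value) {
  return Number.isInteger(value) && value >= MIN_SAFE && value <= MAX_SAFE;
}

console.log(isAcceptableInteger(1234567890)); // true
console.log(isAcceptableInteger(2 ** 53));    // false: above the limit
console.log(isAcceptableInteger(2 ** 60));    // false: e.g. SQL Server timestamps
```

Note that a JSON value like 9007199254740993 would already have been rounded to 2^53 by the time this check runs, but since 2^53 itself is outside the range, such values are still rejected.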
RE BigInt
JavaScript has a BigInt type, which can represent larger integers, but it is awkward to use with JavaScript's standard libraries and with OpenActive's data as it stands. MDN suggests using JSON.parse's `reviver` argument to parse large numbers into BigInts (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt#use_within_json), but this can only work if the number was serialized into a string, and integers are not serialized into strings in OpenActive data.
The only other alternatives that would allow a JavaScript user to parse OpenActive data containing large integers are to write a custom JSON parser (!) or to use a library such as https://github.com/sidorares/json-bigint or https://github.com/josdejong/lossless-json, and there unfortunately don't seem to be many options.
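The reviver limitation can be demonstrated directly: the reviver only sees the value after `JSON.parse` has already converted the token to a double, so the precision is gone before a BigInt can be constructed (the `reviver` function below is a sketch of the MDN-style approach applied to bare numbers):

```javascript
// Attempt to rescue large integers with a reviver. By the time the reviver
// runs, the number token has already been parsed as a double, so converting
// to BigInt merely preserves the already-rounded value.
const reviver = (key, value) =>
  typeof value === "number" && !Number.isSafeInteger(value)
    ? BigInt(value)
    : value;

const result = JSON.parse('{"modified": 9007199254740993}', reviver);
console.log(result.modified); // 9007199254740992n: already off by one
```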