I just spent two days building a timezone selector for a React form that changes the displayed date/time as you switch timezones, while the underlying UTC representation stays the same.
All this without loading 500 kB of JS timezone libraries. I only used the Intl API and tz.js [1].
The trick was simply to temporarily shift the date by the offset difference between the local time zone and the time zone being edited. :)
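For anyone curious, the shift can be sketched with nothing but Intl. This is a reconstruction of the idea, not the actual code -- the function names are made up, and it assumes whole-second instants:

```javascript
// Wall-clock fields of `date` in an IANA `timeZone`, via Intl only.
function wallClockParts(date, timeZone) {
  const dtf = new Intl.DateTimeFormat('en-US', {
    timeZone,
    year: 'numeric', month: '2-digit', day: '2-digit',
    hour: '2-digit', minute: '2-digit', second: '2-digit',
    hourCycle: 'h23',
  });
  const parts = {};
  for (const { type, value } of dtf.formatToParts(date)) {
    parts[type] = value;
  }
  return parts;
}

// Offset (ms) of `timeZone` at the instant `date`.
function tzOffsetMs(date, timeZone) {
  const p = wallClockParts(date, timeZone);
  const asUTC = Date.UTC(
    Number(p.year), Number(p.month) - 1, Number(p.day),
    Number(p.hour), Number(p.minute), Number(p.second),
  );
  // formatToParts drops milliseconds, so compare at second precision.
  return asUTC - Math.floor(date.getTime() / 1000) * 1000;
}

// Shift an instant so its *local* rendering shows the wall clock of
// `timeZone` -- the temporary "edit view". The stored UTC value is
// untouched; you shift back after editing.
function shiftForEditing(date, timeZone) {
  const localOffset = -date.getTimezoneOffset() * 60000;
  return new Date(date.getTime() + tzOffsetMs(date, timeZone) - localOffset);
}
```

The datetime input then edits the shifted value, and the inverse shift recovers the real UTC instant.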
Yeah, not to be rude (we've all been there), but that hack is bound to be broken somehow.
Really, this time and timezone stuff should be handled by the OS; system calls or libs should be sophisticated enough that we aren't "solving" these problems over and over again.
But I'm about to start ranting about Unicode, so I'll shut up now... ;-)
I agree, but I'd like a generic implementation of that. JSON Schema never really took off, and I believe part of the reason is that there's no way to indicate what type might be contained in a string or number (I'm okay with JSON booleans and null). As ugly as it could get, attaching XML Schemas to XML documents did in fact help the parser.
The reason I said I'd like the generic version is that there are other types that we use consistently. There's a very nice RFC available for telephone numbers that we've started following, and we can marshal/unmarshal pretty easily from strongly typed languages (where we control the code), but wouldn't it be nice if there were a standard way (within the JSON) to let systems know it was a telephone number?
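For strings, JSON Schema's `format` keyword already gets partway there. A sketch -- note that `tel` as a format name is an assumption, not a registered format, so a `pattern` fallback is included since validators may ignore unknown formats:

```javascript
// Hypothetical schema tagging a string field as a telephone number.
const guestSchema = {
  type: 'object',
  properties: {
    name: { type: 'string' },
    phone: {
      type: 'string',
      format: 'tel',                    // hint for consuming systems
      pattern: '^\\+[1-9][0-9]{1,14}$', // E.164-style digit check
    },
  },
};

// Minimal check using just the pattern; a real validator (e.g. Ajv)
// would enforce the whole schema.
function matchesPhonePattern(value) {
  return new RegExp(guestSchema.properties.phone.pattern).test(value);
}
```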
I dunno, JSON Schema certainly seems to have found a lot of traction where it makes sense--web responses, etc. That may to some extent be a POV thing, as I work on a lot of OpenAPI-consuming and -producing systems myself, but I never really have to dig around to find schemas and the like for stuff I really care about.
VSCode even has JSON Schema support built in, which is cool. I use that a lot.
> the objects being mapped to implicitly are the schema
Dynamic languages don't have an enforced schema. You can do it manually, but a schema language is easier: special-purpose and declarative.
Plus, data schema languages are typically far more expressive than programming languages. E.g., Java doesn't even have non-nullable types (which C had, as non-typedef structs). They're closer to a database schema.
a) No mandatory quotes for dict keys.
b) Dates and time intervals in ISO 8601 format.
c) Optional type specifiers for strings, so we can add e.g. IPv4 and IPv6 addresses. E.g. { remote: "127.0.0.1"#ip4 }
e.g.
{ guests: 42, arrival: @2020-02-17T17:22:45+00:00, duration: @@13:47:30 }
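Absent such literals, the closest plain JSON gets today is a `JSON.parse` reviver that sniffs ISO 8601 strings -- a sketch (the regex is an assumption and deliberately strict):

```javascript
// Recover Date objects from plain JSON by recognizing ISO 8601
// timestamp strings during parsing. Real data may need a looser regex.
const ISO_DATETIME =
  /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.\d+)?(?:Z|[+-]\d{2}:\d{2})$/;

function parseWithDates(json) {
  return JSON.parse(json, (key, value) =>
    typeof value === 'string' && ISO_DATETIME.test(value)
      ? new Date(value)
      : value,
  );
}

const booking = parseWithDates(
  '{"guests": 42, "arrival": "2020-02-17T17:22:45+00:00"}',
);
// booking.arrival is now a Date; booking.guests stays a number.
```

The obvious downside is that any string that happens to look like a timestamp gets converted, which is exactly why people want a real literal in the grammar.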
The name "JSON" would be weird, because it wouldn't be JavaScript-compatible syntax anymore. I know few people would agree with me, but I would propose `new Date(2020, 2, 17, 17, 22, 45)` syntax; even if nobody uses `eval` to parse JSON, keeping the historical heritage is important. And if you need a timezone, something like `Date.parse("2011-10-10T14:48:00.000+09:00")` could be used. It wouldn't be real constructor or function calls, just syntax, but it would still be compatible with JavaScript.
18.2.2020 is easy if you recognize that 2020 must be a year and 18 can't be a month. But if you want to parse 18.2.2020, you probably want the same parser to handle 1.2.2020.
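A sketch of such a parser that handles both padded and unpadded day/month (the function name and regex are illustrative):

```javascript
// Parse D.M.YYYY dates, accepting one- or two-digit day and month,
// so "18.2.2020" and "1.2.2020" go through the same code path.
function parseDotted(s) {
  const m = /^(\d{1,2})\.(\d{1,2})\.(\d{4})$/.exec(s);
  if (!m) return null;
  const [, day, month, year] = m.map(Number);
  return new Date(Date.UTC(year, month - 1, day));
}
```

Of course this only works because the format is declared up front; "recognize which field must be the year" is exactly the guessing game a self-describing format is supposed to avoid.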
To be universally applicable it would have to support at least date, time, time zone and time zone database version (since time zone definitions keep changing). You would then have to define how each of these look in a backward- and forward-compatible manner and define what every combination of them means. For example, a time and time zone but no date or time zone database could mean that time of day in that time zone on any day, using the time zone database active at that date and time. Not saying it can't be done, but it's a big job.
Unix time still has issues. It officially pauses for a second when leap seconds happen. You can't actually calculate a delta in seconds between two unix timestamps without a database of when the leap seconds were.
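A sketch of the correction, with a deliberately tiny excerpt of the leap-second table (the real list is maintained by IERS; treat these two entries as illustrative):

```javascript
// Unix time of the first labeled second *after* each inserted leap
// second -- the instant where the naive count falls one short.
const LEAP_SECONDS_UTC = [
  Date.UTC(2015, 5, 30, 23, 59, 59) / 1000 + 1,  // after 2015-06-30T23:59:60
  Date.UTC(2016, 11, 31, 23, 59, 59) / 1000 + 1, // after 2016-12-31T23:59:60
];

// Elapsed SI seconds between two unix timestamps: the naive
// difference plus one for every leap second inserted in between.
function elapsedSeconds(unixA, unixB) {
  const [lo, hi] = unixA <= unixB ? [unixA, unixB] : [unixB, unixA];
  const leaps = LEAP_SECONDS_UTC.filter((t) => t > lo && t <= hi).length;
  return hi - lo + leaps;
}
```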
On calculating a delta, isn't there the exact same problem with UTC timestamps? Unless one of the ends of your delta is the exact 23:59:60 moment, there's no way to account for possible leap seconds in the middle of your range without just having a list of them.
Totally! Just pointing out that unix timestamps don't solve everything (even before getting to relativity).
International Atomic Time (TAI), which differs from UTC by 37 seconds since it doesn't count leap seconds, solves everything I know of. Although the clocks aren't in a single reference frame, the procedure for measuring their differences and averaging them to define TAI is well defined, and so sets an objective standard for "what time is it on Earth".
Presumably you mean unix time as a numeric scalar in the JSON. That is still not self-describing - is it time, or just a number? Which scalar data type should your parser use? And is it seconds since epoch or milliseconds since epoch?
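A sketch of that ambiguity in practice (the field name is made up):

```javascript
// The same bare number in a JSON payload decodes to wildly different
// instants depending on whether it's read as seconds or milliseconds.
const raw = JSON.parse('{"created": 1582000000}').created;

const asMillis = new Date(raw);         // ~18 days after the epoch, in 1970
const asSeconds = new Date(raw * 1000); // February 2020

// Nothing in the JSON itself says which reading is right.
```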
It should be maintained as a numeric scalar until you are going to do something with the value... and if you are going to do something with the value, you should know if it is a date or not.
JSON isn't meant to be a self-describing format. There is JSON Schema or the like if that is what you are after.
And yet to a very large extent, it is. Strings, numbers, booleans, arrays, and associative maps made the cut. Timestamps would be a pretty reasonable addition. It would certainly cut out all the controversy here.
Then it is a complete non-starter for "just use ______". I work on software that requires millisecond precision (honestly it would benefit from even greater precision) at the transport layer. It's not even really doing anything spectacularly complex or unusual. Seconds simply aren't sufficient for tons and tons of use cases.