Hacker News

Temporal literals really deserve dedicated syntax in JSON. I'm tired of parsing yet another "18.2.2020".


That's a can of hairy worms right there.

Check out "Falsehoods programmers believe about time": https://infiniteundo.com/post/25326999628/falsehoods-program...


Hairy indeed.

I just spent 2 days programming a timezone selector in a React form that changes the displayed date/time as you switch timezones but the underlying UTC representation wouldn't change.

All this without loading 500k JS timezone libraries. I only used Intl API and tz.js [1].

The trick simply was to temporarily shift the date by the difference between the local time and the edited time zone. :)

[1] https://github.com/dynasty-com/tzjs
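A rough sketch of that shifting trick, with hypothetical helper names (these are illustrative, not the tz.js API), using only built-in `Date` and `toLocaleString` with a `timeZone` option:

```javascript
// Sketch of the "shift" trick described above: render a fixed UTC instant
// as wall-clock time in another zone by offsetting a throwaway Date.
// getTimeZoneOffset and shiftForDisplay are illustrative names.
function getTimeZoneOffset(date, timeZone) {
  // Format the same instant in the target zone and in UTC, then diff them.
  const asZone = tz => new Date(date.toLocaleString("en-US", { timeZone: tz }));
  return asZone(timeZone) - asZone("UTC"); // offset in milliseconds
}

function shiftForDisplay(utcDate, timeZone) {
  // The underlying UTC value is untouched; only the displayed Date moves.
  return new Date(utcDate.getTime() + getTimeZoneOffset(utcDate, timeZone));
}
```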


> The trick simply was to temporarily shift the date by the difference between the local time and the edited time zone. :)

If you're near a change in daylight savings then that will go wrong.


Yeah, not to be rude (we've all been there), but that hack is bound to be broken somehow.

Really, this time & timezone stuff should be handled by the OS; system calls or libs should be more sophisticated so we aren't "solving" these problems over and over again.

But I'm about to start ranting about Unicode, so I'll shut up now... ;-)


I agree but I'd like the generic implementation of that. JSON schema never really took off and I believe that part of the reason is that there's not a way to indicate what type might be contained in a string or number (I'm okay with JSON booleans and null). As ugly as it could get, adding XML Schemas to XML documents did in fact help the parser.

The reason I said I'd like the generic version is that there are other types we use consistently. There's a very nice RFC for telephone numbers that we've started following, and we can marshal/unmarshal pretty easily from strongly typed languages (where we control the code), but wouldn't it be nice if there were a standard way (within the JSON) to let systems know it was a telephone number?

- https://tools.ietf.org/html/rfc3966


I dunno, JSON Schema seems to have certainly found a lot of traction where it makes sense--web responses, etc. That may to some extent be a POV thing, as I work on a lot of OpenAPI-consuming and -producing systems myself, but I never really have to dig around to find schemas and the like for stuff I really care about.

VSCode even has JSON Schema support built in, which is cool. I use that a lot.
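For what it's worth, JSON Schema's `format` keyword already covers the date case for strings; a minimal sketch (note that validators treat `format` as an optional, non-binding annotation unless configured otherwise):

```json
{
  "type": "object",
  "properties": {
    "guests":  { "type": "integer" },
    "arrival": { "type": "string", "format": "date-time" }
  },
  "required": ["guests", "arrival"]
}
```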


Adding a schema meta language just creates unnecessary complexity. Instead of one language, now there are two.

If you're going to invent a new language like json-schema, you might as well skip to json2.

People who care about schemas already enforce this at the serialization layer where the objects being mapped to implicitly are the schema.

Pushing the schema down to the protocol just adds complexity and bloat.


> the objects being mapped to implicitly are the schema

Dynamic languages don't have an enforced schema. You can do it manually, but a schema is easier, special-purpose, declarative.

Plus, data schema languages are typically far more expressive than programming languages. e.g. java doesn't even have nonnullable (which C had, as non-typedef structs). They're closer to a database schema.


I think it's time for a JSON version 2.

a) no mandatory quotes for dict keys

b) dates and time intervals in ISO 8601 format

c) optional type specifiers for strings, so we can add e.g. IPv4 and IPv6 addresses, e.g. { remote: "127.0.0.1"#ip4 }

e.g. { guests: 42, arrival: @2020-02-17T17:22:45+00:00, duration: @@13:47:30 }
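Until something like that exists, a common workaround in today's JSON is a `JSON.parse` reviver that sniffs ISO 8601 strings (a sketch; the regex here is deliberately strict and only a rough heuristic):

```javascript
// JSON has no date type, so revive ISO 8601 timestamp strings into Dates.
const ISO_8601 =
  /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.\d+)?(?:Z|[+-]\d{2}:\d{2})$/;

function parseWithDates(text) {
  return JSON.parse(text, (key, value) =>
    typeof value === "string" && ISO_8601.test(value) ? new Date(value) : value
  );
}

const booking = parseWithDates(
  '{"guests": 42, "arrival": "2020-02-17T17:22:45+00:00"}'
);
// booking.arrival is now a Date object, not a string
```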


There’s a corollary in there to Greenspun’s tenth.

Any sufficiently complicated serialization technology contains an ad-hoc, informally-specified, bug-ridden, slow implementation of half of ASN.1.


The name "JSON" would be weird, because it wouldn't be JavaScript-compatible syntax anymore. I know that few people will agree with me, but I would propose `new Date(2020, 2, 17, 17, 22, 45)` syntax; even if nobody uses `eval` to parse JSON anymore, keeping the historical heritage is important. And if you need a timezone, something like `Date.parse("2011-10-10T14:48:00.000+09:00")` could be used. It wouldn't be a real constructor or function call, just syntax, but it would still be compatible with JavaScript.


And comments.


CBOR (https://cbor.io/) allows custom types, but it's a binary format and not anywhere near as popular.


My theory is that XML has got all the uglies you could ever want (and more), so why not just use that instead of defiling JSON?

Also, Crockford aggressively protected JSON's simplicity, banishing comments because people were using them as parser directives.


> JSON schema never really took off

I see JSON Schemas used all over. Advertising, Medical, Banking, etc.


"18.2.2020" seems easy. I once walked into a meeting room where the whiteboard said "Deadline: 4/7/3". No idea what I was reading.


18.2.2020 is easy if you recognize that 2020 must be a year and 18 can't be a month. But if you want to parse 18.2.2020, you probably want the same parser to handle 1.2.2020.
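The ambiguity can be made concrete with a small sketch (the `parseDotted` helper and its `dayFirst` flag are hypothetical, just to illustrate the point):

```javascript
// "18.2.2020" only parses one way because 18 can't be a month, but
// "1.2.2020" is Feb 1 or Jan 2 depending on convention.
function parseDotted(s, dayFirst = true) {
  const [a, b, year] = s.split(".").map(Number);
  const [day, month] = dayFirst ? [a, b] : [b, a];
  if (month < 1 || month > 12) throw new Error(`no month ${month} in "${s}"`);
  return new Date(Date.UTC(year, month - 1, day));
}
```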


To be universally applicable it would have to support at least date, time, time zone and time zone database version (since time zone definitions keep changing). You would then have to define how each of these look in a backward- and forward-compatible manner and define what every combination of them means. For example, a time and time zone but no date or time zone database could mean that time of day in that time zone on any day, using the time zone database active at that date and time. Not saying it can't be done, but it's a big job.


Just use Unix time everywhere.


I would just use ISO 8601 everywhere. I would happily use it in ordinary life! But it's not always me who builds the API.


Yeah but what about relativity?

ISO8601 won't scale to universe-level applications!

:)


Unix time still has issues. It officially pauses for a second when leap seconds happen. You can't actually calculate a delta in seconds between two unix timestamps without a database of when the leap seconds were.


On calculating a delta, isn't there the exact same problem with UTC timestamps? Unless one of the ends of your delta is the exact 23:59:60 moment, there's no way to account for possible leap seconds in the middle of your range without just having a list of them.


Totally! Just pointing out that unix timestamps don't solve everything (even before getting to relativity).

International Atomic Time (TAI), which differs from UTC by 37 seconds since it doesn't count leap seconds, solves everything I know of. Although the clocks aren't in a single reference frame, the procedure for measuring their differences and averaging them to define TAI is well defined and so sets an objective standard for "what time is it on Earth".


Presumably you mean unix time as a numeric scalar in the JSON. That is still not self-describing - is it time, or just a number? Which scalar data type should your parser use? And is it seconds since epoch or milliseconds since epoch?


It should be maintained as a numeric scalar until you are going to do something with the value... and if you are going to do something with the value, you should know if it is a date or not.

JSON isn't meant to be a self-describing format. There is JSON Schema or the like if that is what you are after.


> JSON isn't meant to be a self-describing format.

And yet to a very large extent, it is. Strings, numbers, booleans, arrays, and associative maps made the cut. Timestamps would be a pretty reasonable addition. It would certainly cut out all the controversy here.


> is it time, or just a number?

Yes :)

That's the beauty of it: time is just a number (of seconds since 1 January 1970).

> Which scalar data type should your parser use?

Integer, since it's an integer number of seconds.

> And is it seconds since epoch or milliseconds since epoch?

Unix time is always seconds.


> > That is still not self-describing - is it time, or just a number?

> That's the beauty of it: time is just a number (of seconds since 1 January 1970).

Having a bare number where the units and zero point rely on out-of-band information is not self-describing.


JSON has never been self-describing. Literally every use of JSON is fundamentally dependent upon some degree of out-of-band information.

If you want something self-describing, maybe look into XML?


> > is it time, or just a number?

> Yes :)

> That's the beauty of it: time is just a number (of seconds since 1 January 1970).

> > Which scalar data type should your parser use?

> Integer, since it's an integer number of seconds.

> > And is it seconds since epoch or milliseconds since epoch?

> Unix time is always seconds.

Whilst time can indeed be modelled as "just a number", Unix time spectacularly fails to achieve even that.

* It can go backwards

* Simple subtractions do not give accurate intervals

It's not a good measure of time. Don't use it as such.


> Unix time is always seconds.

Then it is a complete non-starter for "just use ______". I work on software that requires millisecond precision (honestly it would benefit from even greater precision) at the transport layer. It's not even really doing anything spectacularly complex or unusual. Seconds simply aren't sufficient for tons and tons of use cases.


> Unix time is always seconds.

Javascript time (that 'JS' in JSON) is always milliseconds.
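Which makes the seconds-vs-milliseconds mixup a classic bug when a bare number crosses a JSON boundary; a sketch:

```javascript
// JS Date counts milliseconds since the epoch; Unix time counts seconds.
const ms = Date.UTC(2020, 1, 17, 17, 22, 45); // milliseconds for an instant
const unixSeconds = Math.floor(ms / 1000);    // the same instant in Unix time

const wrong = new Date(unixSeconds);          // seconds fed as ms: lands in Jan 1970
const right = new Date(unixSeconds * 1000);   // back to the original instant
```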


> Unix time is always seconds.

Wasn't there a BSD that used double instead of long for time_t for a while ages ago?



