Implementing Timezones

20 May 2018

I sometimes find myself giving technical advice on how to handle different issues to founders or engineers at their companies. One thing that usually comes up is timezones, so I'm writing down my advice here.

I've implemented timezones a number of times in my career as an engineer. A calendar app whose concept of a timezone was only the offset (-5) from UTC was one of my first implementations. Each time I've implemented timezones, I've done it better and my understanding has improved.

First, what is a timezone?

A timezone is an abstraction of time based on politics. It is essentially a geographic region combined with rules and data about when the timezone's offset from UTC changed in the past, for events like Daylight Saving Time, and when it will change in the future.

A timezone is usually applied to a timestamp when shown to a user based on their computer's local time.

It looks like this: America/Chicago

It does not look like this: -05:00, -5, CST, CDT

Whether or not you are in Daylight time or Standard time is a function of the date of your timestamp and the rules making up the timezone.

If you are storing the raw timezone (without a datetime) in any other way, you are probably doing it wrong.
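To make the "timezone is rules plus data, not an offset" point concrete, here's a small sketch using Python's standard `zoneinfo` module: the same zone name yields different offsets and abbreviations depending on the date.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

chicago = ZoneInfo("America/Chicago")

# The same timezone has different offsets depending on the date:
winter = datetime(2018, 1, 15, 12, 0, tzinfo=chicago)
summer = datetime(2018, 7, 15, 12, 0, tzinfo=chicago)

print(winter.strftime("%Z %z"))  # CST -0600 (standard time)
print(summer.strftime("%Z %z"))  # CDT -0500 (daylight time)
```

Storing `-06:00` or `CST` would have frozen one of these two answers; storing `America/Chicago` lets the zone's rules pick the right one for any given timestamp.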

Timezones are usually only used in two ways in an application:

  1. To convert timestamps stored in the database to the local time of the client when the date and time is presented to the client.

  2. To parse information supplied by humans or other data sources where timezone or offset information associated with the datetime doesn't exist or can't be trusted.

Converting timestamps

Any timestamp that is generated by a machine usually has some sort of offset associated with it (Tue May 15 2018 23:02:07 GMT-0500 (CDT)) or is already in UTC (2018-05-16T04:02:07.247Z).

A timestamp without an offset from UTC is a datetime. Attempting to convert datetimes to local time will have mixed results. If you know the timezone it was based in, or can guess its offset from UTC, you can probably convert it. However, the offset from UTC now may not be the offset from UTC then.

A date alone is not a point in time, nor should it be converted to a local timezone. Sometimes we get lazy and store dates as datetimes or timestamps in the DB with T00:00:00Z for the time. Converting a date, or one of these other representations of a date, to a local timezone will likely cause the date to shift backwards a day for any offset less than the one stored.
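The backwards-shift problem is easy to demonstrate: a date stored as midnight UTC lands on the previous calendar day once converted to any zone with a negative offset.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A date lazily stored as a timestamp at midnight UTC...
stored = datetime(2018, 5, 21, 0, 0, tzinfo=timezone.utc)

# ...converted to a zone behind UTC lands on the previous day.
local = stored.astimezone(ZoneInfo("America/Chicago"))
print(local.date())  # 2018-05-20
```

This is why dates (birthdays, due dates) are best kept as plain dates and never run through timezone conversion.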

A unix timestamp is by definition based on UTC.

Unix time is a system for describing a point in time, defined as the number of seconds that have elapsed since 00:00:00 Coordinated Universal Time (UTC), Thursday, 1 January 1970.
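A quick sketch of that definition: a unix timestamp identifies the same point in time everywhere, and only the rendering changes with the zone you view it in.

```python
from datetime import datetime, timezone

# Unix time 0 is the epoch: 1970-01-01T00:00:00Z
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch.isoformat())  # 1970-01-01T00:00:00+00:00

# The same integer, viewed as UTC:
ts = 1526936400
print(datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())
# 2018-05-21T21:00:00+00:00
```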

Any date and time that has an offset from UTC can be described as a "point in time", this is what I have been referring to as a timestamp.

A point in time can be converted to any timezone.

2018-05-21T22:00:00+01:00 is the same point in time as 2018-05-21T16:00:00-05:00 and the same point in time as 2018-05-21T21:00:00Z.

If you subtract any point in time from another point in time, the difference should always be the same:

2018-05-21T22:00:00+01:00 - 2018-05-21T16:00:00-05:00 is 0.

2018-05-21T21:00:00Z - 2018-05-21T14:00:00-05:00 is 2 hours.

2018-05-21T22:00:00+01:00 - 2018-05-21T14:00:00-05:00 is also 2 hours.
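The arithmetic above can be checked directly; Python's `datetime` subtracts offset-aware timestamps as points in time, so the offsets fall out of the math.

```python
from datetime import datetime

a = datetime.fromisoformat("2018-05-21T22:00:00+01:00")
b = datetime.fromisoformat("2018-05-21T16:00:00-05:00")
c = datetime.fromisoformat("2018-05-21T14:00:00-05:00")

print(a - b)  # 0:00:00  -- the same point in time
print(a - c)  # 2:00:00
```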

This means the only time you really need to worry about a timezone here is when you are converting these timestamps, or "points in time", to be displayed to the end user in their local timezone.

When your application is server-rendered, you usually have to have a user setting for this. When the application is client-rendered, you can just convert a timestamp to local time on the client and format it however you like. This functionality is usually built into the programming language the client is written in, or at least into its standard library.

Parsing and Storing datetimes

The other place besides the presentation layer you have to worry about timezones is anytime you are accepting dates and times as input from humans, or other sources that may not have an offset from UTC.

There are a number of ways this can be handled, depending on the context of information provided.

  1. Most of the time, using the client's timezone will be sufficient.

  2. If you need to parse it on the server, you will likely need to have a user setting for timezones that can be set.

  3. If the information is related to a location, you will want to use the timezone of that location. Ideally, you do this automatically for the user with the right geography dataset. There might even be a library for your language that does this.
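The second case above can be sketched as a small helper. This assumes an IANA zone name stored as a per-user setting (`user_tz` here is a hypothetical field, not any particular framework's API): interpret the naive input in the user's zone, then normalize to UTC.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def parse_user_datetime(text: str, user_tz: str) -> datetime:
    """Parse a datetime string that has no offset, interpreting it in
    the user's configured timezone, then normalize it to UTC for storage.
    (parse_user_datetime and user_tz are illustrative names.)"""
    naive = datetime.fromisoformat(text)             # no offset in the input
    local = naive.replace(tzinfo=ZoneInfo(user_tz))  # attach the user's zone
    return local.astimezone(timezone.utc)

# "9am on May 21" from a user in Chicago is 14:00 UTC (CDT is -05:00):
print(parse_user_datetime("2018-05-21T09:00:00", "America/Chicago"))
```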

Ideally, we don't want to store any datetimes in our database, but only timestamps.

Modern databases like Postgres and Mongo handle timezones natively. They'll automatically store everything as UTC and Postgres will convert datetimes (without information about offset from UTC) to timestamps automatically based on the client's timezone, which is usually what you want.

If you aren't using a database like Postgres or Mongo that handles timezones natively, some advice in addition to the above:

  1. Run your servers on UTC time. Both application servers and database servers.

  2. Convert all timestamps to UTC before being stored in the database.
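The second rule can be enforced at one choke point in your code. A minimal sketch (`to_utc_for_storage` is a hypothetical helper, not a library API) that refuses naive datetimes rather than guessing an offset:

```python
from datetime import datetime, timezone

def to_utc_for_storage(ts: datetime) -> datetime:
    """Normalize an offset-aware timestamp to UTC before writing it to
    the database. Rejects naive datetimes instead of guessing an offset."""
    if ts.tzinfo is None:
        raise ValueError("refusing to store a naive datetime")
    return ts.astimezone(timezone.utc)

# An aware timestamp in any offset normalizes to the same UTC value:
print(to_utc_for_storage(datetime.fromisoformat("2018-05-21T16:00:00-05:00")))
# 2018-05-21 21:00:00+00:00
```

Funneling every write through a function like this keeps mixed-offset data out of the database entirely.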

Doing this should minimize timezone issues with your application and allow you to worry about bigger problems.