This post was cross-posted to Dev.To, an awesome community!
Recently my colleague and friend Donovan Brown was in Switzerland for customer visits and for the DotNet Day conference in Zurich. I took the occasion to spend some time with him and we got to talk (as you would expect) about geeky and "cloudy" stuff.
One topic of discussion (also raised by one of the clients) was: what do you need to do to get ready for the cloud? And one of the answers was: time zones!
What time is it, really?
With all the regions that Azure offers, your code suddenly has the potential to go global easily :) So here's a fun experiment you can do:
Create a new web application. Here I use Visual Studio to create an ASP.NET Core web application with Razor pages.
In the HTML code, display the local time and the UTC time. In ASP.NET, I can use the following code in the index.cshtml:
Local time: @DateTime.Now
UTC time: @DateTime.UtcNow
In Razor pages, you can call C# code directly from the CSHTML markup, which is very convenient.
Run the application locally. If like me you are based in Western Europe, you should see something like the below (notice the 2-hour difference between local time and UTC). If you are based somewhere else, the numbers will vary, but chances are very good that the local time and the UTC time will be different (unless, of course, you happen to be in a time zone where local time matches UTC).
Now deploy your application to Azure, into the West US region. You can do that from Visual Studio directly, or create an App Service in the Azure Portal and deploy there.
Run the West US application. Now you should see the same time displayed for Local and for UTC.
So that might be a surprise, and in fact I wasn't expecting that when I first did this experiment a few months ago.
What's going on here is that the Azure architects decided to make it easy to migrate web applications (App Services) from region to region without changing any code that calculates times and time differences: all Azure servers run on UTC. On the other hand, this has the disadvantage that you might have to modify your code to take this into account when you migrate from on-premises to the cloud.
For example, if I am based in Switzerland and deploy my application to West Europe (which is located in the Netherlands), I would expect my application to show the exact same
DateTime.Now in Azure as locally. And yet...
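If you want to see this for yourself without deploying a full web application, a small console snippet is enough. This is just an illustrative sketch: run it on your machine and then on an App Service (for example from a WebJob or the Kudu console), and compare the output.

```csharp
using System;

class Program
{
    static void Main()
    {
        // On an Azure App Service this reports UTC with a zero offset;
        // on a typical on-premises machine it reports that machine's zone.
        Console.WriteLine($"Time zone: {TimeZoneInfo.Local.Id}");
        Console.WriteLine($"Offset:    {TimeZoneInfo.Local.GetUtcOffset(DateTime.Now)}");
        Console.WriteLine($"Now:       {DateTime.Now}");
        Console.WriteLine($"UtcNow:    {DateTime.UtcNow}");
    }
}
```

On Azure, `Now` and `UtcNow` print the same time because the server's local zone *is* UTC.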
The moral of the story...
The big lesson here is that if you are looking to migrate to the cloud, time zones suddenly become very relevant, but not necessarily in the way you would imagine. It's not that you need to know where your code will be running; it's that you need to know that these regional differences are abstracted away (in Azure's case, by running every server on UTC).
A great way to prepare for the migration is to do the right thing on-premises already: do not use
DateTime.Now in your code but use
DateTime.UtcNow instead, and do the conversions where needed. This way you are already abstracting the location of your code, and when you migrate to the cloud and the location becomes irrelevant or unpredictable, your code will continue to work unaffected.
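Here is a minimal sketch of that approach: compute and store in UTC, and convert only at the edges, when displaying to a user. The fixed UTC+2 zone below is an assumption to keep the example deterministic; in real code you would look up the user's zone with TimeZoneInfo.FindSystemTimeZoneById (and note that zone IDs differ between Windows and Linux).

```csharp
using System;

class Program
{
    static void Main()
    {
        // Always capture the moment in UTC; this value is the same
        // no matter which region the server runs in.
        DateTime utcNow = DateTime.UtcNow;

        // Convert only for display. A custom fixed-offset zone stands in
        // here for the user's real time zone (an assumption for the demo).
        TimeZoneInfo userZone = TimeZoneInfo.CreateCustomTimeZone(
            "UTC+02", TimeSpan.FromHours(2), "UTC+02", "UTC+02");

        DateTime userLocal = TimeZoneInfo.ConvertTimeFromUtc(utcNow, userZone);

        Console.WriteLine($"UTC:        {utcNow:o}");
        Console.WriteLine($"User local: {userLocal:o}");
    }
}
```

Because the code never relies on the server's own zone, it behaves identically on your laptop in Zurich and in an App Service in West US.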