IoT with an ESP8266 (Part 5) – IoT with Azure

Apr 24, 2017

As I mentioned in earlier posts, my “Environment Watch” system needed a publicly accessible web application so my devices, once connected to Wi-Fi, would have a consistent target to which they could post measurement data.

I briefly debated setting up a simple web-server at home and exposing port 80 through my router to accomplish this task, but I also knew that I had a certain amount of monthly Azure credit with my MSDN subscription. So I decided to try deploying my IoT application to Azure.

Azure has an online dashboard/portal that can be used to configure services. I was able to log into this portal using my MSDN subscription credentials. Azure requires that you specify – or create – resource groups and sometimes service levels for everything you create. As such, to create my database I would need to use their portal.

New SQL Server

Image of creating an instance of a logical SQL Server.

The first step was to create an instance of a logical SQL Server. To do this from the Azure portal:

  • Click on More services > on the left side
  • Scroll in next panel and find section titled DATABASES
  • Click on SQL Servers
  • From the new window click Add

I picked out a lower-case server name (as required) as well as an admin login and password.

In my case there was only one selection possible for Subscription so I used that.

For the Resource group I stuck with an existing resource group rather than creating a new one.

Location gave me quite a list of options, but as the only likely user of this application, I chose what I thought was closest to my own location and selected North Central US.

Since I’d likely use this database for other projects going forward, I also selected Allow Azure services to access server.

Once that was all set, I clicked Create to finish the process. Doing this briefly showed a window indicating that deployment had started, which quickly disappeared with no further feedback. Clicking refresh at the top of the pane showed me that my database server had been created:

Picture showing that my database server had been created.

New SQL Database

When creating a new database, Azure expects a “pricing tier” to be specified. As such, I couldn’t create my database in the cloud using the “Update-Database” command from the package manager console as I did with the Local DB. Instead, I had to create an empty database from the Azure portal first and then run that command against it.

To create a new database from the Azure portal:

Image of creating a new database from the Azure portal.

  • Click SQL Databases on the left
  • Click the large plus sign and “Add” in the upper corner of the new window

As with the creation of the SQL Server, doing this presented me with a series of text boxes I needed to fill in.

So I named my database something obvious, stuck with the default Subscription (there was no other choice) and again used an existing Resource group.

Since I was creating a new database and, hopefully, seeding it with some default data, I left the source as a Blank database by default. The server, of course, was my newly created SQL Server instance.

I had never used an “elastic pool” before, so I clicked on the link to find out more. In short, here’s what the MS docs have to say:

SQL DB elastic pools provide a simple cost effective solution to manage the performance goals for multiple databases that have widely varying and unpredictable usage patterns.

Being the sole producer and consumer of data, this seemed unnecessary, so I left this option turned off for the time being. That left pricing tier and Collation.

For pricing tier, clicking on the arrow next to it showed me a number of options. Since I’d be working within the limitations of the MSDN subscription budget, the cheapest, B – Basic, sounded fine to me.

Image of the cheapest pricing tier, B – Basic, and its options.

This pricing plan was limited to 2 GB. The amount of data one device (so far) could create seemed fairly minimal, even with said device taking measurements every minute: that works out to about 525,600 rows per year, and even at a generous 1 KB per row, only around 0.5 GB per year. So at least initially, 2 GB would suffice.

I selected that which closed the window and updated the selection in the previous window. For Collation I stuck with the default.

I clicked the Create button to create my new database. Once again, Azure started the process and then showed no updates on the current screen. A quick refresh, though, showed me my database had been created.

Image of the new Azure database being created.

Publishing the Database

Before attempting to publish my database, or at least the code-first migrations and seed info, I decided to try and connect to it. So from within Visual Studio, I did the following:

  • Click View > SQL Server Object Explorer
  • Click the “Add SQL Server” button near the top

Image of connecting to the new Azure database.

From here I selected Azure and could see my database along with server name and so forth. I proceeded to fill in the blanks and then clicked Connect. What followed was another dialog which requested that I add a firewall rule:

Image of another dialog which requested that I add a firewall rule.

Clicking OK granted me access to my database from within Visual Studio. I was able to see my database and could see that there were no tables yet. To add tables I first needed to point the app at the new server. Within the appsettings.json file I updated the connection string to point to my new database:

"DefaultConnection": "Server=winksnewdbserver.database.windows.net;Database=EnvWatchCloudDb;User Id=sauser;Password=**********;"
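For context, this entry sits under the ConnectionStrings section of appsettings.json, so the surrounding file looks roughly like this (the server, database, and user names are from my setup; yours will differ):

```json
{
  "ConnectionStrings": {
    "DefaultConnection": "Server=winksnewdbserver.database.windows.net;Database=EnvWatchCloudDb;User Id=sauser;Password=**********;"
  }
}
```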

Next I ran a quick rebuild to ensure I hadn’t inadvertently broken anything. I hadn’t, so I opened the package manager console and typed in and ran Update-Database.

This completed fairly quickly. I opened the SQL Server Object Explorer, clicked refresh, then located my database and looked at the tables. Sure enough, my tables were there.

Image of the SQL Server database tables.

Right-clicking on tables with seed data (such as DeviceType) and selecting View Data showed me there was no data yet. This was because I had not yet run the application against this database. If you’ll recall, I had attached my seed process to the startup of the application. So I started the application and afterwards confirmed it had seeded the database by once again right-clicking and selecting View Data.
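To be clear about what “attached my seed process to the startup of the application” looks like, here is a rough sketch. The context name EnvWatchContext, the DeviceTypes set, and the seed value are stand-ins, not the actual code from Part 3:

```csharp
// Sketch only: EnvWatchContext, DeviceTypes, and DeviceType are
// placeholders for the real EF Core types from Part 3 of this series.
using System.Linq;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

public static class SeedExtensions
{
    public static void EnsureSeedData(this IApplicationBuilder app)
    {
        // Resolve the EF Core context from the app's service container.
        using (var scope = app.ApplicationServices
            .GetRequiredService<IServiceScopeFactory>().CreateScope())
        {
            var db = scope.ServiceProvider.GetRequiredService<EnvWatchContext>();

            // Only insert defaults when the table is empty, so a
            // restart doesn't duplicate rows.
            if (!db.DeviceTypes.Any())
            {
                db.DeviceTypes.Add(new DeviceType { Name = "ESP8266" });
                db.SaveChanges();
            }
        }
    }
}
```

Calling this from Startup.Configure means every fresh database gets its defaults the first time the application runs against it.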

Publishing the Application

So now I had a database in the cloud, seeded with data. Next I needed my application to run in the cloud. To do this I decided to use tools available to me within Visual Studio 2015.

  • Right-click on the project from within Solution Explorer
  • Select Publish… from within the menu that shows
  • This brought up another window which asked what my publish target was:

Image of window asking to select a publish target.

  • I selected Microsoft Azure App Service, which brought up another dialog:

Image of app service window.

  • The subscription setting looked right as did the view. Having no application yet, I clicked New… and got another dialog:

Image of naming your IoT web application and creating a new app service plan.

  • I named my new web app something memorable, stuck with the subscription as well as the default resource group. For App Service Plan I decided to create a new one, which produced yet another dialog:

  • I named the plan and selected a location close to me. For size I selected Free.

Clicking OK to these two dialogs started the process of creating the app and returned me to the publish dialog:

Image of the publish window and button for validating connection.

I decided to confirm this was created by clicking Validate Connection. This spun for a bit and returned a green checkmark. Before proceeding to the next step, however, I decided to check on the Azure portal for my new app service. Sure enough, it was there:

Image that shows a check of the Azure portal for my new app service.

Clicking next moved on to settings, which I spent a little time with:

Image of the settings for the new database.

The only change I made here was to ensure that the connection string I used at runtime was correct and that its checkbox was checked. Then I clicked Next. On the next dialog I decided to preview changes. This gave me a list of all the files that would be pushed to my new application.

Image showing a list of all the files that would be pushed to my new application.

Everything seemed in order here, so I proceeded to publish. Doing this initiated the web deploy process, which I could watch via the output window:

1>------ Publish started: Project: EnvironmentWatch, Configuration: Release Any CPU ------

rmdir /S /Q "C:\Users\jwink\AppData\Local\Temp\PublishTemp\EnvironmentWatch88\"

Environment variables:

DOTNET_CONFIGURE_AZURE=1

Path=.\node_modules\.bin;C:\Program Files (x86)\Microsoft Visual Studio 14.0\Web\External;%PATH%;C:\Program Files (x86)\Microsoft Visual Studio 14.0\Web\External\git

C:\Program Files\dotnet\dotnet.exe publish "C:\Users\jwink\Source\Repos\EnvironmentWatch\src\EnvironmentWatch" --framework netcoreapp1.0 --output "C:\Users\jwink\AppData\Local\Temp\PublishTemp\EnvironmentWatch88" --configuration Release --no-build

Publishing EnvironmentWatch for .NETCoreApp,Version=v1.0

Bundling with configuration from C:\Users\jwink\Source\Repos\EnvironmentWatch\src\EnvironmentWatch\bundleconfig.json

Processing wwwroot/css/site.min.css

Processing wwwroot/js/site.min.js

Configuring the following project for use with IIS: 'C:\Users\jwink\AppData\Local\Temp\PublishTemp\EnvironmentWatch88'

Updating web.config at 'C:\Users\jwink\AppData\Local\Temp\PublishTemp\EnvironmentWatch88\web.config'

Configuring project completed successfully

publish: Published to C:\Users\jwink\AppData\Local\Temp\PublishTemp\EnvironmentWatch88

Published 1/1 projects successfully

Publishing with publish method [MSDeploy]

Executing command ["C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" -source:manifest='C:\Users\jwink\AppData\Local\Temp\PublishTemp\obj\EnvironmentWatch88\SourceManifest.xml' -dest:manifest='C:\Users\jwink\AppData\Local\Temp\PublishTemp\obj\EnvironmentWatch88\DestinationManifest.xml',ComputerName='https://envwatchcloudweb.scm.azurewebsites.net/msdeploy.axd?site=EnvWatchCloudWeb',UserName='$EnvWatchCloudWeb',Password='{PASSWORD-REMOVED-FROM-LOG}',IncludeAcls='False',AuthType='Basic' -verb:sync -enablerule:AppOffline -enableRule:DoNotDeleteRule -retryAttempts:20]

Info: Using ID 'b78e14dc-00c3-4001-9ac5-ee21040c6522' for connections to the remote server.

Info: Adding directory (EnvWatchCloudWeb\refs).

Info: Adding file (EnvWatchCloudWeb\wwwroot\_references.js).

Total changes: 373 (373 added, 0 deleted, 0 updated, 0 parameters changed, 23860034 bytes copied)

Web App was published successfully http://envwatchcloudweb.azurewebsites.net/

========== Build: 0 succeeded, 0 failed, 1 up-to-date, 0 skipped ==========

========== Publish: 1 succeeded, 0 failed, 0 skipped ==========

When the process completed, it opened the site in a new browser window.

Image of the opened site in a new browser window.

While this new site had the seed data, it did not yet have any measurements, so I updated my sketch to point the device at the new site:

char* host = "envwatchcloudweb.azurewebsites.net";

After doing this and letting my device run for a few minutes, I checked again. There was data, though only one entry since I was averaging by hour. I also noticed that time looked off – it was coming back as 9 PM, even though at the time I’d started gathering data, it was only around 3:45 PM CST.

After some digging, it turns out Azure sites assume UTC by default. So when my app grabs DateTime.Now, it gets Coordinated Universal Time, which is six hours ahead of Central Time (US), where I’m reporting from. So the times I see in my database are six hours later than the local time at which they were recorded.

There are a few different ways to resolve this issue.

  • Use a JavaScript utility (such as jsTimezoneDetect) to detect the caller’s time zone and adjust UTC to that.
  • Adjust to a specific time zone in code: get DateTime.UtcNow (which is effectively what DateTime.Now returns where the app runs) and then convert it with TimeZoneInfo.ConvertTimeFromUtc.
  • Include a setting on the Azure app service to force it into a specific time zone.
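For reference, the in-code approach (the second option above) is only a couple of lines. Note that “Central Standard Time” is the Windows time zone ID, which is what an Azure App Service (Windows) host uses:

```csharp
using System;

class TimeExample
{
    static void Main()
    {
        // On an Azure site the server clock is UTC, so DateTime.Now
        // and DateTime.UtcNow return effectively the same value there.
        DateTime utcNow = DateTime.UtcNow;

        // "Central Standard Time" is the Windows time zone ID; the
        // conversion handles daylight saving (CST/CDT) automatically.
        TimeZoneInfo central = TimeZoneInfo.FindSystemTimeZoneById("Central Standard Time");
        DateTime localNow = TimeZoneInfo.ConvertTimeFromUtc(utcNow, central);

        Console.WriteLine(localNow);
    }
}
```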

This last method was what I went with. While the usual recommendation is to store date/time info in UTC and adjust for the caller, my little app is unlikely to ever move anywhere else or report data anywhere else. My devices will be on Central Time, as will any calls to the site. If this ever changed, I could adjust later.

To set an Azure site’s time zone, I needed to add an application setting to set the website time zone:

  • Click on App Services on the left
  • Find and click on my app, or EnvWatchCloudWeb
  • On the pane that shows next, find and click on Application settings
  • Find App Settings in the next pane and enter a new key (just below WEBSITE_NODE_DEFAULT_VALUE)
    • Key: WEBSITE_TIME_ZONE
    • Value: Central Standard Time
More on why it’s a bad idea – and yet how to get around it – can be found here.

Image showing how to enter a new key.

I saved these settings (save icon at the top of this Application Settings pane) and restarted the site:

  • Right-click on my app, or EnvWatchCloudWeb
  • Select Restart
  • Answer yes to the next question:

A restart app window.

After restarting – and allowing the device to send a few more entries – I ran a quick query against the database from within Visual Studio:

SELECT * FROM Measurement ORDER BY MeasurementId DESC;

This revealed that the new entries were indeed now listed with CST:

Image showing the new entries were now listed with CST.

But this also meant that my initial entries were now wrong. I corrected this with more SQL:

UPDATE Measurement
SET    MeasuredDate = DATEADD(hh, -6, MeasuredDate)
WHERE  MeasuredDate >= '2016-12-13 21:00:00'

Once this was corrected, I proceeded to gather data for several days. As mentioned in a previous post, much of this data was gathered near the window of my basement office at my house during a cold snap here in Minnesota. Negative external temperatures meant that internal temperatures in my office – near my somewhat leaky window – were also very low. Not freezing, thankfully, but still rather cold.

Image of all the data gathered for this IoT with Azure project.

Conclusions

Even with one device, I was rather pleased with the results. I could fairly easily move my device to different locations and start measuring for an extended period of time. It would be interesting to see how multiple devices would work and what it would take to aggregate multiple devices and measurements for an entire building.

Setup with the Arduino software was a little inconvenient, though pre-loading with several different Wi-Fi networks helped quite a bit since it meant I could move between some pre-set locations without much effort.

It is worth mentioning that security with this device is a bit of an issue, at least in my example. I elected to use port 80 for my communications, which is not secure. Granted, the data I am sending isn’t a secret, but it’s still possible the setup could be adapted for a purpose that would require better security. Adafruit has a library that can be used to communicate via SSL/TLS; possibly in later posts I’ll pursue this.

Give the other posts in this series a read:

IoT with an ESP8266 (Part 1) – The Hardware

IoT with an ESP8266 (Part 2) – Arduino Sketch

IoT with an ESP8266 (Part 3) – ASP.NET Core Project and Database Setup

IoT with an ESP8266 (Part 4) – IoT Web Application

Working on IoT with Azure or another IoT project at your company? Intertech has experienced IoT developers that can help.