I have 21 hours of budget to finish a substantial project at work, and then another project to finish by the end of May. Posting may be iffy the next couple of days.
Coming up, the final figures on how much moving to Azure saved me.
On March 10th, I completed moving Weather Now to Windows Azure, and shut down the Inner Drive Technology International Data Center. I had already received my lowest electric bill ever for this location, thanks to a 25% rate reduction negotiated by the City.
Earlier this week I got my March electric bill, for my electricity use between March 8th and April 7th. Take a look:
My electricity use in March 2013 was just 26% of my March 2012 use (243 kWh in 2013 against 933 kWh in 2012). The bill was 80% lower, too.
My telecommunications bill also went down considerably. Once I have complete data on these cost differences in mid-May, I'll post a full rundown of how much moving to Azure saves me every month.
Too much going on:
Now, I will go back to drafting documentation while I wait for AT&T to reconfigure my DSL and kill my landline. I've had a POTS ("plain old telephone service") twisted-pair line longer than most people on earth have been alive. After today, no longer. I don't think I'll miss it, either. I only have it because I have a business-class DSL, which I don't need anymore, and the only people who call it want money from me.
I've fixed seven annoying bugs and added three minor features to Weather Now, including:
- Fixed searching from the search box so you can enter an airport code directly;
- Fixed the Last 24 hours page to show day and night icons properly;
- Added a status page so users can peek under the hood; and
- Tweaked a few things in the background worker process around logging and status update alerts.
A minor bug-fix release like this used to take a couple of hours to deploy, because I had to update the code running on the web server file by file. I got the process down to about an hour, but I still had to take the application offline to make the update.
Since I put it up in Microsoft Windows Azure, publishing an update takes about 15 minutes, is completely automated, and doesn't require taking the site down. The great Inner Drive migration continues to pay dividends.
I'm paying 90% of my attention right now to a Windows Azure online training class. I already knew a lot of the material presented so far, but not all of it. It's like re-taking a class you took as an undergraduate; the 10% you didn't know is actually really helpful.
Like next week's class, which will go over infrastructure as a service (IaaS): a lot has changed in the last year, so it should be valuable.
Apparently, though, my homework is to build an Azure web site this week. Not a multi-tier application with a worker role. Just a web site. How adorable.
The Inner Drive Technology International Data Center is no more.
This morning around 8:15 CDT I updated the master DNS records for Weather Now, and shut down the World Wide Web service on my Web server an hour later. All the databases are backed up and copied; all the logs are archived.
More to the point, all the servers (except my domain controller, which also acts as a storage device) are off. Not just off, but unplugged. The little vampires continue to draw tens of watts of power even when they're off.
The timing works out, too. My electric meter got read Thursday or Friday, and my Azure billing month starts today. That means I have a clean break between running the IDTIDC and not running it,* and by the beginning of May I'll have more or less the exact figures on how much I saved by moving everything to the Cloud.
Meanwhile, my apartment is the quietest it's ever been.** The domain controller is a small, 1U server with only one cooling fan. Without the two monster 2U units and their four cooling fans (plus their 12 hard drives), I can suddenly hear the PDC...and now I want to shut it down as well.
* Except for the DSL and land-line, which should be down in a couple of weeks. I'll still have all the expense data by May.
** Except for the two blackouts. Now, of course, I never need worry about a blackout again—unless it hits the entire country at once, which would create new problems for me.
Weather Now is fully deployed to the Cloud. As soon as the Worker Role finishes parsing the last few hours of weather, I'll cut over the DNS change, and it will be live.
Actually, that's not entirely true; I'm going to cut over the DNS in the morning, after I know I fixed the bugs I found during this past week's shake-down cruise.* So if you want to see what a weather site looks like while it's back-filling its database, you can go to its alias, http://wx-now.cloudapp.net. (Because of how Azure works, this will remain its alias forever.)
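The post doesn't say how the DNS records are set up, but the usual pattern for an Azure cloud service is to point a CNAME at the *.cloudapp.net alias rather than using an A record, since the underlying IP address can change while the alias never does. A hypothetical zone fragment (domain, host name, and TTL invented for illustration):

```
; Illustrative only -- the real zone isn't in the post.
; Point the public host at the permanent cloudapp.net alias.
www    3600    IN    CNAME    wx-now.cloudapp.net.
```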
Time to meet my friends, who are wondering where I am, no doubt.
* Bugs fixed: 13. Total time: 6.9 hours (including 2.4 to import and migrate the Gazetteer).
The final deployment of Weather Now encountered a hitch after loading exactly 3 million (of 7.2 million) place names. I've now kludged a workaround for the remaining 4.2 million rows, and I have a contingency plan should that upload fail.
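The post doesn't describe the workaround, but a common way to survive a crash mid-upload is to commit rows in batches and checkpoint progress after each commit, so a restart resumes where it left off rather than re-sending 3 million rows. A minimal sketch (function names, file name, and batch size are all invented; `upload_batch` stands in for whatever actually writes the rows):

```python
# Sketch of a resumable batched upload: after each committed batch, record how
# many rows are done, so a crash resumes from the checkpoint instead of row 0.
import os

CHECKPOINT = "gazetteer.checkpoint"   # illustrative file name
BATCH_SIZE = 10_000                   # illustrative batch size

def load_checkpoint():
    # Number of rows already committed; 0 on a fresh run.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return int(f.read().strip() or 0)
    return 0

def save_checkpoint(rows_done):
    with open(CHECKPOINT, "w") as f:
        f.write(str(rows_done))

def upload_all(rows, upload_batch):
    done = load_checkpoint()
    while done < len(rows):
        batch = rows[done:done + BATCH_SIZE]
        upload_batch(batch)           # may raise; checkpoint not yet advanced
        done += len(batch)
        save_checkpoint(done)
    return done
```

If a batch raises, the checkpoint still points at the last committed batch, so re-running `upload_all` retries only the unsent rows.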
Meanwhile, I have a saturated Internet connection. So rather than sit here and watch paint dry, so to speak, I'm going back to some of the bugs I had decided to postpone fixing. The end result, I hope, will be a better-quality application than I'd planned to release, and a rainy Saturday made useful.
Tomorrow morning, shortly after I have my coffee, I will finally turn off the last two production servers in my apartment, the IDTIDC. The two servers in question, Cook and Kendall, have run more or less continuously since November 2006,* gobbling up power and making noise the whole time.
As I write this, I'm uploading the production Weather Now deployment along with the complete Inner Drive Gazetteer, a 7.2-million-row catalog of place names that the site uses to find people's local weather. It takes a while to upload 7.2 million of anything, of course, and it's only 35% done after two hours. Trying to deploy the Cloud package at the same time may not have made the most sense, but I need the weather downloader to start running now so that when I cut over to the new site, it has actual weather to show people.
I started this project on November 3rd, logging almost exactly 100 hours on it until today. I'm through the tunnel and almost done climbing up the embankment. One more night of whirring fans and then...quiet.
Update: Crap. The Gazetteer upload crashed after 3 million rows. Now Plan B...
* Yes, I did just link to the Wayback Machine there. The original Inner Drive blog is offline for the time being. I have a task to restore it, but as I haven't updated it since 2008, it's not a priority.
Another update: the original link at (*) pointed to Wayback Machine, but after reconstituting the old blog I corrected the link. That's why the footnote above no longer makes a lot of sense.
I don't know what to do with myself the rest of the day. I've just deployed the completely-redesigned Weather Now application. I feel 10 kilos lighter.
Check out the preview on Windows Azure.
The application started in mid-1997 as a feature of the now-defunct braverman.org, my proto-blog. The last major changes happened in 2006, when I gave it a face-lift. I've occasionally pushed some bug fixes, but really, until today it has looked and acted essentially the same way for 6 years. (The GetWeather application, which downloads and parses data from outside sources, hasn't changed significantly since 2002.)
So what's new? In sum:
- The application now runs on Microsoft Windows Azure, up in the cloud. (Check out the preview!)
- This means it also runs on Azure SQL Database instead of on-site SQL Server.
- Since I had to port the database anyway, I completely re-architected it.
- The database re-architecture included moving its archives to Azure Storage, which will pay off once I update the UI to take advantage of it.
- The ancient (1997, with revisions in 1999, 2002, and 2005) GetWeather application, which downloads weather data from outside sources, got rebuilt from byte 0 as well.
- Finally, I fixed 35 bugs that the old architecture either caused or made fixing overly difficult.
There are a few bugs in the preview, of course. This morning I found and fixed 6 of them, all related to architectural changes under the hood that the creaky user interface didn't understand. And just now, I discovered that it thinks the sun never shines anywhere—again, almost certainly a problem related to changing from using the broken System.DateTime object to its replacement, System.DateTimeOffset. Always another bug to fix...
Still: I'm done with the port to Azure. I'll bang away on it for the next week, and if all works out, on Saturday I'll finally, finally, finally turn off my servers.