The Daily Parker

Politics, Weather, Photography, and the Dog

The Romney campaign's software epic fail

Politico's Burns & Haberman explain:

[ORCA] was described as a mega-app for smartphones that would link the more than 30,000 operatives and volunteers involved in get-out-the-vote efforts. PBS profiled it a few days before the election. The app was created and managed by the Romney campaign and was kept a secret among a close circle in Boston, according to POLITICO sources.

It's been reported the system crashed at 4 p.m., but multiple sources familiar with the war room operation said it had actually been crashing throughout the day. Officials mostly got information about votes either from public news sources tracking data, like CNN.com, or by calling the counties for information, the source said. Officials insisted the day after the election that they had still believed they were close, and that they had hit their numbers where they needed to, even as Fox News and other outlets called the race.

The post links back to a Romney volunteer's description of how ORCA failed in the field:

On one of the last conference calls (I believe it was on Saturday night), they told us that our packets would be arriving shortly. Now, there seemed to be a fair amount of confusion about what they meant by "packet". Some people on Twitter were wondering if that meant a packet in the mail or a pdf or what. Finally, my packet arrived at 4PM on Monday afternoon as an emailed 60 page pdf. Nothing came in the mail. Because I was out most of the day, I only got around to seeing it at around 10PM Monday night. So, I sat down and cursed as I would have to print out 60+ pages of instructions and voter rolls on my home printer. ... They expected 75-80 year old veteran volunteers to print out 60+ pages on their home computers? The night before election day? From what I hear, other people had similar experiences. In fact, many volunteers never received their packets at all.

Now a note about the technology itself. For starters, this was billed as an "app" when it was actually a mobile-optimized website (or "web app"). For days I saw people on Twitter saying they couldn't find the app on the Android Market or iTunes and couldn't download it. Well, that's because it didn't exist. It was a website. This created a ton of confusion. Not to mention that they didn't even "turn it on" until 6AM in the morning, so people couldn't properly familiarize themselves with how it worked on their personal phone beforehand.

The project management antipatterns are apparent: Blowhard Jamboree, Smoke and Mirrors, Throw It Over the Wall, and basic Project Mismanagement, for starters. I haven't seen the code, but I can't imagine the management and deployment problems didn't lead to design and construction problems as well.

We software professionals have learned, through painful experience, that software developers have a better understanding of how to develop software than corporate executives. Go figure. Since Mitt Romney ran as a father-knows-best, authoritarian candidate, it should surprise no one that the people he hired couldn't run a software project.

Lowest electricity usage ever

Last month I used less electricity than ever before at my current address, mainly because two of the five servers in the Inner Drive Technology Worldwide Data Center have had their duties migrated to Microsoft Windows Azure.

This past month, I used even less:

It wasn't my smallest-ever bill, though, thanks to Exelon's recent rate increases. But still: here's some more concrete evidence that the Cloud can save money.

And before people start pointing to the New York Times article from September about how wasteful the Cloud is, I can't help but point out that the writer left out the part where moving to the Cloud lets businesses turn off their own on-premises servers. And given the way Azure works (and, I assume, Amazon's and Google's equivalents), instead of dedicated servers doing just about nothing all day, you have shared servers handling sometimes dozens of different virtual machines for a fraction of the cost.

Anyway, except for the part where I fly 100,000 km every year, I feel like moving to the Cloud is helping everyone.

Chicago electricity aggregation passes

Voters in the City of Chicago (including me) passed a referendum giving the city the authority to negotiate electricity prices on behalf of everyone. Implementation will be swift:

The timing of the deal is important because Chicagoans stand to save the most money over Commonwealth Edison's rate between now and June 2013, when ComEd's prices are expected to drop because pricey contracts they entered into years ago will expire. The timeline has Chicagoans moving to the new supplier in February 2013.

Michael Negron, deputy chief of policy and strategic planning for the mayor's office, said electricity suppliers have shown great interest in snagging Chicago's service. Nearly 100 people packed a conference Monday for the city's "request for qualifications" process. The bidders ranged from multi-billion-dollar corporations to smaller providers from all over the country, he said. Industry analysts say the deal could be worth hundreds of millions of dollars to the winning supplier or suppliers.

Residents and businesses may opt out of the scheme and negotiate supply prices separately. As readers of this blog know, I'm desperate for lower prices, and eagerly looking forward to my electric bills next year, once the new rate takes effect right after I shut down the Inner Drive Technology Worldwide Data Center.

Starting the oldest item on my to-do list

I mentioned a few weeks ago that I've had some difficulty moving the last remaining web application in the Inner Drive Technology Worldwide Data Center, Weather Now, into Microsoft Windows Azure. Actually, I have two principal difficulties: first, I need to re-write almost all of it, to end its dependency on a Database of Unusual Size; and second, I need the time to do this.

Right now, the databases hold about 2 GB of geographic information and another 20 GB of archival weather data. Since these databases run on my own hardware, I don't have to pay for them beyond the server's electricity costs. In Azure, that amount of database space costs more than $70 per month, well above the $25 or so my database server costs me.

I've finally figured out the architecture changes needed to get the geographic and weather information into cheaper (or free) repositories. Some of the strategy involves not storing the information at all, and some will use the orders-of-magnitude-less-expensive Azure table storage. (In Azure storage, 25 GB costs $3 per month.)

Unfortunately for me, the data layer is about 80% of the application, including the automated processes that go out and get weather data. So, to solve this problem, I need a ground-up re-write.

The other problem: time. Last month, I worked 224 hours, which doesn't include commuting (24 hours), traveling (34 hours), or even walking Parker (14 hours). About my only downtime was during that 34 hours of traveling and while sitting in pubs in London and Cardiff.

I have to start doing this, though, because I'm spending way too much money running two servers that do very little. And I've been looking forward to it—it's not a chore, it's fun.

Not to mention, it means I get to start working on the oldest item on my to-do list, Case 46 ("Create new Gazetteer database design"), opened 30 August 2006, two days before I adopted Parker.

And so it begins.

Changing the way I read

Last week, I bought an ASUS Transformer TF700, in part to help out with our seriously cool Galahad project, and in part so I could read a bunch of heavy technical books on tonight's flight to London. And yes, I had a little tablet-envy after taking the company's iPad home overnight. It was not unlike fostering a puppy, in the sense that you want to keep it, but fortunately not in the sense of needing to keep Nature's Miracle handy.

Then yesterday, Scott Hanselman pointed out a great way to get more use out of the pad: Instapaper. I'm hooked. As Hanselman points out,

Here's the idea. You get a bunch of links that flow through your life all week long. These are often in the form of what I call "long-form reading." Hackernews links, NYTimes stories, academic papers, etc. Some folks make bookmarks, have folders called "Links" on their desktops, or email themselves links.

I have these websites, papers and interesting links rolled up and delivered automatically to my Kindle every week. Think about how amazing that is and how it can change your relationship with content on the web. The stress and urgency (and open tabs) are gone. I am naturally and organically creating a personalized book for weekend reading.

I have a bookmarklet from Instapaper that says "Read Later" on my browser toolbar. I've put it in every browser I use, even Mobile Safari. I've also logged into Instapaper from all my social apps so that I can Read Later from my iPhone Twitter Client for example. You'd be surprised how many apps support Instapaper once you start looking for this.

What this means is that Instapaper is ready and waiting for me in every location where an interesting piece of long-form reading could present itself. I don't stress, I click Read Later and the document is shipped off to Instapaper.

I'm sold. I actually have it updating my tablet every 12 hours, because I do a lot of my reading on the 156 bus. Or, today, British Airways 226.

Direct effects of moving to Azure

I still haven't moved everything out of the Inner Drive Technology Worldwide Data Center to Microsoft Windows Azure, because the architecture of Weather Now simply won't support the move without extensive refactoring. But this week I saw the first concrete, irrefutable evidence of cost savings from the completed migrations.

First, I got a full bill for a month of Azure service. It was $94. That's actually a little less than I expected, though in fairness it doesn't include the 5–10 GB database that Weather Now will use. Keep in mind, finishing the Azure migration means I get to shut off my DSL and landline phone, which cost me $55 and $100 a month from AT&T, respectively.
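The arithmetic behind the savings claim, using only the figures from the post, fits in a few lines:

```shell
# Back-of-the-envelope: monthly figures from the post
azure=94      # a full month of Azure service
dsl=55        # AT&T DSL, which the migration lets me shut off
phone=100     # AT&T landline, ditto
echo "net monthly savings: \$$(( dsl + phone - azure ))"
# prints: net monthly savings: $61
```

Roughly $61 a month back, before even counting the electricity.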

I also found out how much less energy I'm using with 3 of 5 servers shut down. Here is my basic electricity use for the past two years:

The spikes are, obviously, air conditioning—driven very much by having the server rack. Servers produce heat, and they require cooling. I have kept the rack under 25°C, and even at that temperature, the servers spin up their cooling fans and draw even more power. The lowest usage periods, March to May and October to December, are cool and moist, so I don't use either air conditioning or humidifiers.

Until this month, my mean electricity use was 1100 kWh per month overall, 1386 kWh in the summer, and 908 kWh in the shoulder seasons. In the last two years, my lowest usage was 845 kWh.

Last month it was 750 kWh. Also notice how, during the much hotter summer in 2012 (compared with 2011), my electricity use was slightly lower. It was just harder to see the savings until now.

Including taxes, that means the bill was only $20 less than the usual shoulder-season bill. But I'm not General Motors; that $20 savings is 20% of the bill. Cutting my electricity bills 20% seems like a pretty good deal. And next summer, with no servers in the house, I'll be able to run less air conditioning, and the A/C won't have to compete with the heat coming off the server rack.

Now I've just got to figure out how to migrate Weather Now...

Windows Azure deployment credentials

My latest entry is up on the 10th Magnitude tech blog:

We've taken a little more time than we'd hoped to figure out how to deal with Azure deployment credentials and profiles properly. In an effort to save other development teams some of our pain, we present our solution. First, the general principle: Publication profiles are unique to each developer, so each developer should have her own management certificate, uploaded by hand to each relevant subscription.

When you deploy a project to a Windows Azure Cloud Service instance, you have to authenticate against the Azure subscription using a management certificate. The Publish Windows Azure Application wizard in Visual Studio presents you with a helpful link to sign in to your Azure subscription and download credentials. If you do this every time you publish to a new subscription, you (a) rapidly run up against the 10-certificate limit in Azure; and (b) get ridiculous credential files called things like "WorkSubscription1-AzDem12345-JoesSubscription-MySecretProjectThatMyBossDoesntKnowAboutSubscription.publishsettings" which, if you're not paying attention, soon shows up on a Subversion commit report (and gives your boss access to that personal project you forgot to mention to her).

Don't do that. Instead, do this:

1. Create a self-signed certificate using IIS. Name it something clear and unique; I used "david.10thmagnitude.com," for instance.
Image of creating a self-signed certificate
Then export it to a private folder.
Image of exporting a certificate from IIS to a folder

2. Import the .pfx file into your local certificate store.
Image of importing a private key

3. Export the same certificate as a .cer file.
Image of exporting a cer file

4. Go to the Azure management portal's management certificate list.

5. Upload the certificate you just created to the subscriptions to which you want to publish cloud services.
 Image of uploading a cer file
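If you'd rather script steps 1–3 than click through the IIS dialogs, the equivalent with openssl looks roughly like this (the specific commands and flags are my assumption, not part of the original post; the certificate name comes from step 1):

```shell
# Step 1 equivalent: generate a self-signed certificate and private key
openssl req -x509 -nodes -days 730 -newkey rsa:2048 \
  -keyout david.10thmagnitude.com.key \
  -out david.10thmagnitude.com.pem \
  -subj "/CN=david.10thmagnitude.com"

# Step 3 equivalent: Azure's management portal expects a DER-encoded .cer
openssl x509 -in david.10thmagnitude.com.pem -outform DER \
  -out david.10thmagnitude.com.cer

# Step 2 equivalent: bundle key + certificate into a .pfx for your local store
# (password "changeit" is a placeholder; pick your own)
openssl pkcs12 -export -passout pass:changeit \
  -in david.10thmagnitude.com.pem \
  -inkey david.10thmagnitude.com.key \
  -out david.10thmagnitude.com.pfx
```

Either way, you end up with the same two artifacts: a .cer to upload to the portal and a .pfx for your certificate store.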

Now you have a single certificate for all your subscriptions. Next, create a publishing profile with the certificate:

6. In your Azure cloud service project, right-click the project node and choose "Publish…" to bring up the Publish Windows Azure Application wizard.

7. Drop down the "Choose your subscription" list and click "<Manage...>".

8. Click "new".

9. In the "Create or select..." drop down, find the certificate you just created and choose it.

10. Continue setting up your publishing profile as you've done before.

That's it. Except for one other thing.

If you have more than 0 developers working on a project, at some point you'll use source control. Regardless of whether you have Subversion, Mercurial, or whatever, you need to avoid committing keys, certificates, and publishing profiles into your VCS. Make sure that your VCS ignores the following extensions: *.pfx, *.cer, *.publishsettings, and *.azurePubxml.

You want to ignore pfx and publishsettings files because they contain secrets. (I hope everyone knows this already. Yes, pfx files use passwords, but publishsettings don't; and anyway, why would you want to risk anyone else authenticating as you without your knowledge?) Ignore cer files because they're not necessary in an Azure project. And ignore azurePubxml files because every developer who publishes to Azure will wind up overwriting the files, or creating new ones that no one else uses.
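For Mercurial, the ignore list is a one-time append to .hgignore; Subversion users would set the svn:ignore property on the relevant directories to the same patterns. A minimal sketch:

```shell
# Append the sensitive extensions to the repository's .hgignore
cat >> .hgignore <<'EOF'
syntax: glob
*.pfx
*.cer
*.publishsettings
*.azurePubxml
EOF
```

Commit the ignore file itself, of course; it's the one artifact here that every developer should share.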

Quick link roundup

I haven't any time to write today, but I did want to call attention to these:

Back to the mines...

My poor, dead SSD

My laptop's solid-state drive died this afternoon. It had a long, long life (23 months—almost double what they usually get). I am thankful to the departed SSD for that, and:

  • for dying after the client presentation, not before;
  • for dying on the first day of a three-week project, not the last; and
  • for living 23 months, which is about as spectacular as a dog living 23 years.

I am now rebuilding my laptop on a larger but slightly slower SSD, which I hope lasts nearly as long.