The Daily Parker

Politics, Weather, Photography, and the Dog

It's 5pm somewhere

Actually, it's 5pm here. And I have a few stories queued up:

Finally, author John Scalzi puts Rogue One in third place on his ranked list of Star Wars films, with some good reasons.

Web3 is coming for your kitchen

Via Molly White, a new company called Gripnr wants to monetize your D&D campaign, and it's as horrible as it sounds:

Gripnr plans to generate 10,000 random D&D player characters (PCs), assign a “rarity” to certain aspects of each (such as ancestry and class), and mint them as non-fungible tokens, or NFTs. Each NFT will include character stats and a randomly-generated portrait of the PC designed in a process overseen by Gripnr’s lead artist Justin Kamerer. Additional NFTs will be minted to represent weapons and equipment.

Next, Gripnr will build a system for recording game progress on the Polygon blockchain. Players will log into the system and will play an adventure under the supervision of a Gripnr-certified Game Master. After each game session is over, the outcome will be logged on-chain, putting data back onto each NFT via a new contract protocol that allows a single NFT to become a long record of the character’s progression. Gripnr will distribute the cryptocurrency OPAL to GMs and players as in-game capital. Any loot, weapons, or items garnered in-game will be minted as new sellable NFTs on OpenSea, a popular NFT-marketplace.

As a D&D veteran who once played a character (for 5 minutes) with Gary Gygax* as DM, I can't see how any gamer would want to do this. Molly White has spent the last two years documenting the ways scammers and grifters have used "the blockchain" and "NFTs" and other Web3 buzzwords to steal (or, as I believe, launder) billions of dollars. Gripnr seems like just one more scam, but I could be wrong: Gripnr could just be a lazy get-rich-quick scheme for its creators.

All the rows in the world

When I launched the final weather archive import on Tuesday, I predicted it would finish around 1pm today. See my accuracy for yourself:

2022-04-08 12:54:05.0975|INFO|Moved 118,773,651 weather archives from v3 to v5
2022-04-08 12:54:05.0975|INFO|Finished importing; duration 3.03:41:19.2445019
2022-04-08 12:54:05.0975|INFO|Import finished

Not a bad prediction.

So Weather Now 5 now has about 260 million historical records going back to 2006, including Chicago's weather from 15 years ago this hour. And where the weather station reported climate records, we've got those too.

Microsoft Azure recalculates storage use daily around 11 am Central time, so I don't have the complete picture yet, but it looks like I transferred about 245 GB of data. I'll find out for sure tomorrow, and in 3-4 days I'll get an accurate view of the storage cost.

Whew. I'm glad that's over.

Somebody call lunch!

I've gotten two solid nights of sleep in a row, and I've got a clean desk for the first time in weeks. I hope that this becomes the norm, at least until November, when I'll have a packed musical schedule for six weeks as the Apollo Chorus rehearses or performs about 30 times. But that's seven months off.

That gives me plenty of time to listen to or read these:

And finally, in compiling geographic source data for Weather Now, I discovered that the International Civil Aviation Organization (ICAO) assigned an official designator to the location where the Ingenuity helicopter landed on Mars: JZRO, for Jezero Crater.

Archive import finished; final archive import starting today

On Friday, I used Arithmetic™ to predict that the 162-million-row weather data transfer from Weather Now v3 to v5 would end around 7pm last night. Let's check the logs:

2022-04-04 18:48:30.7196|INFO|Clearing v3 archival records for ZYTX
2022-04-04 18:49:27.7471|INFO|Moved 157,408,921 weather archives from v3 to v5
2022-04-04 18:49:27.7471|INFO|Finished importing; duration 4.04:14:55.0952715 

Nice prediction. (It logged only 157 million rows because I made a performance tweak and restarted the app after the first 5 million.)

As I've mentioned, those 162 million rows only go back to September 2009. But v3 launched in January 2007. And it turns out I have a second archive, also containing about 25 GB of data, going back to August 2006. I had to think back to decisions I never documented to piece together why.

In August 2006, I had one big machine that served as my domain controller, Web server, and Exchange endpoint, plus a second big machine that served as the database server. In this photo from 31 July 2006, the Exchange/Web/DC server is the white one on the right ("DOPPELKUH") and the SQL server ("BULLE") is the big black one on the left:

Now, BULLE was a huge machine for the time, with about 200 GB of disk space and (I think) 16 GB of memory. That 200 GB expanse tempted me to turn off a data-purge feature from more parsimonious days, and apparently I deployed that change around 9am on 11 August 2006. (Sadly, the purge feature worked as designed, and I have no archival data before then.)

In October 2006, I finally bought a server rack and moved the database to an even bigger set of disks:

Everything ticked along until around the time I deployed the Weather Now v3.5 refresh and discovered that the un-purged data file had grown to 25 GB. So on 3 September 2009, I simply created a new database and changed the data connection strings to point to it. That new database kept growing until I switched the archival data store to the Cloud in 2013.

And now, I get to take advantage of the triviality of that change by making an equally trivial change to the import controller's data connection string. The 2006 data file has 118,774,028 rows covering 3,988 stations from 11 August 2006 to 3 September 2009. At 435 rows per second, the final archival import should finish around...let's see...1 pm on Friday. At that point, every single byte of data Weather Now has collected in the past 15 years will be available through the App for you to see.
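
In code, that 1pm prediction is just rows divided by rate; the 9am start time in this sketch is an assumption, not something from the logs:

using System;

const long rowsToImport = 118_774_028;
const double rowsPerSecond = 435;

var duration = TimeSpan.FromSeconds(rowsToImport / rowsPerSecond);
Console.WriteLine(duration);                    // about 3 days, 3 hours, 50 minutes

var start = new DateTime(2022, 4, 5, 9, 0, 0);  // assumed Tuesday-morning start
Console.WriteLine(start + duration);            // lands early Friday afternoon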

More from the archives:

Quick update

The Apollo Chorus performed last night at the Big Foot Arts Festival in Walworth, Wis., so I haven't done a lot of useful things today. I did take a peek at the other weather archive I have lying around, and discovered (a) it has the same schema as the one I'm currently importing into Weather Now 5, and (b) it only goes back to August 2006.

Somewhere I may have even older archives that I need to find... but if not, NOAA might have some.

The update rolls on...

As of 17:16 CDT, the massive Weather Now v3 to v5 import had 115,441,906 records left to transfer. At 14:28 CDT yesterday, it had 157,409,431 left, giving a rate of (41,967,525 / 96,480 seconds =) 435 records per second. A little more math gives another 265,392 seconds, or 3 days, 1 hour, and 43 minutes, to go.
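
The same arithmetic as a quick sketch, using only the figures above:

using System;

var remainingNow = 115_441_906L;         // records left at 17:16 CDT today
var remainingYesterday = 157_409_431L;   // records left at 14:28 CDT yesterday
var elapsed = TimeSpan.FromHours(26) + TimeSpan.FromMinutes(48);   // 96,480 seconds

var rate = (remainingYesterday - remainingNow) / elapsed.TotalSeconds;  // about 435/second
var timeLeft = TimeSpan.FromSeconds(remainingNow / rate);
Console.WriteLine($"{rate:F0} records/s; {timeLeft} left");   // about 3 days, 1:43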

So, OK then, what's the over-under on this thing finishing before 7pm Monday?

It's just finished station KCKV (Outlaw Field, Clarksville, Tenn.), with another 2,770 stations left to go. Because it's going in alphabetical order, that means it has finished all of the Pacific Islands (A stations), Northern Europe (B and E), Canada (C), and Africa (D, F, G, and H). There are no I or J stations (at least not on Earth). K is by far the largest swath, as it encompasses all of the continental US, which has more airports than any other land mass.

Once it finishes the continental US, it'll have only 38 million left to do! Whee!

How long will this take?

It takes a while to transfer 162.4 million rows of data from a local SQL database to a remote Azure Tables collection. So far, after 4 hours and 20 minutes, I've transferred just over 4 million rows. That works out to about 260 rows per second, or 932,000 per hour. So, yes, the entire transfer will take 174 hours.
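
Spelled out, with the rate rounded to 260 as above:

using System;

var elapsedSeconds = (4 * 60 + 20) * 60.0;        // 4 hours 20 minutes = 15,600 seconds
var rowsPerSecond = 4_000_000 / elapsedSeconds;   // roughly 256, call it 260
var totalHours = 162_400_000 / (260.0 * 3600);    // about 174 hours at that pace
Console.WriteLine($"{rowsPerSecond:F0} rows/s, about {totalHours:F0} hours in all");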

Good thing it can run in the background. Also, because it cycles through three distinct phases (disk-intensive data read, processor-intensive data transformation, and network-intensive data write), it doesn't take much effort for my computer to handle. In fact, network writes take 75% of the cycle time for each batch of reports, because the Azure Tables API accepts at most 100 rows per batch.
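
For context, 100-row batched writes with the Azure.Data.Tables SDK look roughly like this sketch; the table name, entity shape, and connection string are illustrative, not the app's actual import code:

// Rough sketch of 100-row batched writes to Azure Tables using Azure.Data.Tables.
// Table name, entity shape, and connection string are illustrative only.
using System;
using System.Collections.Generic;
using Azure.Data.Tables;

var connectionString = Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING")!;
var table = new TableClient(connectionString, "WeatherArchive");

// A couple of fake reports for one station: PartitionKey = station, RowKey = timestamp.
var reports = new[]
{
    new TableEntity("KORD", "2009-09-01T00:51Z") { ["TemperatureC"] = 21.1 },
    new TableEntity("KORD", "2009-09-01T01:51Z") { ["TemperatureC"] = 20.0 },
};

var batch = new List<TableTransactionAction>(capacity: 100);
foreach (var report in reports)
{
    batch.Add(new TableTransactionAction(TableTransactionActionType.UpsertReplace, report));

    // A single transaction holds at most 100 entities, all with the same
    // PartitionKey, hence the 100-row batches mentioned above.
    if (batch.Count == 100)
    {
        await table.SubmitTransactionAsync(batch);
        batch.Clear();
    }
}
if (batch.Count > 0)
    await table.SubmitTransactionAsync(batch);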

Now, you might wonder why I don't just push the import code up to Azure and speed up the network writes a bit. Well, yes, I thought of that, but decided against the effort and cost. To do that, I would have to upload a SQL backup file to an Azure VM big enough to run a SQL Server instance. Any VM big enough to do that would cost about 67¢ per hour. So even if I cut the total time in half, it would still cost me $60 or so to do the transfer. That's an entire bottle of bourbon's worth just to speed up a hobby project by a couple of days.
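
The back-of-the-envelope math, for the record:

using System;

var hours = 174 / 2.0;               // assume the Azure VM roughly halves the transfer time
var dollarsPerHour = 0.67;
Console.WriteLine($"${hours * dollarsPerHour:F2}");   // $58.29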

Speaking of cost, how much will all this data add to my Azure bill? Well, I estimate the entire archive of 2009-2022 data will come to about 50 gigabytes. The 2003-2009 data will probably add another 30. Azure Tables cost 6¢ per gigabyte per month for the geographically-redundant storage I use. I will leave the rest of the calculation as an exercise for the reader.
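
(For anyone checking the homework: a rough sketch using the estimates above, ignoring per-transaction charges.)

using System;

var gigabytes = 50 + 30.0;           // 2009-2022 estimate plus 2003-2009 estimate
var dollarsPerGbMonth = 0.06;
Console.WriteLine($"${gigabytes * dollarsPerGbMonth:F2} per month");   // $4.80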

Update: I just made a minor change to the import code, and got a bit of a performance bump. We're now up to 381 rows per second, 46% faster than before, which means the upload should finish in only 114 hours or 4.7 days. All right, let's see if we're done early Monday morning after all! (It's already almost done with Canada, too!)

Strange data patterns

The data transfer from Weather Now v3 to v5 continues in the background. Before running it, I did a simple SQL query, something like the sketch below, to find out how many readings each station reported between September 2009 and March 2013, and the results surprised me a bit.
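
The sketch, with hypothetical table and column names rather than the real v3 schema:

// Hypothetical version of the per-station count query against the v3 database.
using System;
using Microsoft.Data.SqlClient;

var connectionString = Environment.GetEnvironmentVariable("WX_V3_SQL")!;   // assumed
const string sql = @"
    SELECT StationCode, COUNT(*) AS Readings
    FROM ArchivedReadings
    WHERE ReadingTime >= '2009-09-01' AND ReadingTime < '2013-04-01'
    GROUP BY StationCode
    ORDER BY Readings DESC;";

using var connection = new SqlConnection(connectionString);
connection.Open();
using var command = new SqlCommand(sql, connection);
using var reader = command.ExecuteReader();
while (reader.Read())
    Console.WriteLine($"{reader.GetString(0)}: {reader.GetInt32(1):N0}");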

The v3 database recorded 162.4 million readings from 4,071 stations. Fully 75 of them only have one report, and digging in I can see that a lot of those don't have any data. Another 185 have fewer than 100, and a total of 573 have fewer than 10,000.

At the other end, Anderson AFB on Guam has 123,394 reports, Spangdahlem AB in Germany has 123,297, and Leeuwarden, Netherlands, has 119,533. In fact, seven Dutch weather stations account for 761,000 reports of the 162 million in the archive. I don't know why, but I'll find out at some point. (It looks like some of them have multiple weather recording devices with color designations. I'll do some more digging.)

How many should they have? Well, the archive contains 1,285 days of records. That's about 31,000 hourly reports or 93,000 20-minute updates—exactly where the per-station counts plateau. Chicago O'Hare, which reports hourly plus when the weather shifts significantly, had 37,069 reports. Half Moon Bay, Calif., which just ticks along on autopilot without a human weather observer to trigger special reports, had 90,958. So the numbers check out pretty well. (The most prolific US-based station, whose 91,083 reports made it the 10th most prolific in the world, was Union County Airport in Marysville, Ohio.)
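
Those expected totals are just days times reports per day:

using System;

const int days = 1285;
Console.WriteLine(days * 24);   // 30,840 hourly reports (about 31,000)
Console.WriteLine(days * 72);   // 92,520 twenty-minute reports (about 93,000)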

Finally, I know the App has a lot of data sloppiness right now. After I transfer over these archives, I'll work on importing the FAA Airports database, which will fix the names and locations of most of the US stations.

Chugging along with data consolidation

Sunday night I finished moving all the Weather Now v4 data to v5. The v4 archives went back to March 2013, but the UI made that difficult to discover. I've also started moving v3 data, which will bring the archives back to September 2009. Once I get that done, I think moving the v2 data (back to early 2003) will be as simple as connecting the 2009 import to the 2003 database. Then, someday, I'll import data from other sources, like NCEI (formerly NCDC) and the Met*, to really flesh out the archives.

One of the coolest parts of this is that you can get to every single archival report through a simple URL. For example, to see the weather in Chicago five years ago, simply go to https://wx-now.com/History/KORD/2017/03/30. From there, you can drill into each individual report (like the one from 6pm) or use the navigation buttons at the bottom to browse the data.
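
The history URLs follow a simple station-and-date pattern. A quick sketch (the per-report URL format, like that 6pm link, isn't reproduced here):

using System;

static string HistoryUrl(string stationId, DateOnly date) =>
    $"https://wx-now.com/History/{stationId}/{date.Year}/{date.Month:D2}/{date.Day:D2}";

Console.WriteLine(HistoryUrl("KORD", new DateOnly(2017, 3, 30)));
// https://wx-now.com/History/KORD/2017/03/30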

Meanwhile, work continues apace on importing geographic data. And I have discovered a couple of UI bugs, including a memory leak that caused the app to crash twice since launch. Oops.

* The Met has really cool archives, some of which go back to the 1850s.