The Daily Parker

Politics, Weather, Photography, and the Dog

New project meetings

Yesterday and today I've been in meetings all day starting a new project at work. Unusually for my career, the project is not only a matter of public record, but the work will be in the public domain. That's right: I'm doing a project for the largest organization in the world, the United States Government.

Some parts of the project touch on confidential information, and I'm going to remain professionally discreet about the project details. But the project itself is unclassified, and we have permission from the sponsor to discuss it openly.

I'll have more about it tomorrow, including a photo or two I never thought I'd be able to take, let alone share publicly. Stay tuned.

Hidden complexity in software could be a problem

The Atlantic worries that there's a "coming software apocalypse:"

There will be more bad days for software. It's important that we get better at making it, because if we don't, and as software becomes more sophisticated and connected—as it takes control of more critical functions—those days could get worse.

The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little. There is a small but growing chorus that worries the status quo is unsustainable. “Even very good programmers are struggling to make sense of the systems that they are working with,” says Chris Granger, a software developer who worked as a lead at Microsoft on Visual Studio, an IDE that costs $1,199 a year and is used by nearly a third of all professional programmers. He told me that while he was at Microsoft, he arranged an end-to-end study of Visual Studio, the only one that had ever been done. For a month and a half, he watched behind a one-way mirror as people wrote code. “How do they use tools? How do they think?” he said. “How do they sit at the computer, do they touch the mouse, do they not touch the mouse? All these things that we have dogma around that we haven’t actually tested empirically.”

The findings surprised him. “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” 

I'm not sure that there's a coming apocalypse. Things get more complex; we have adapted pretty well as a species. I imagine taking any of today's top technologists forward 1000 or 2000 years (or even 100 or 200) and watching their heads explode. A bronze-age Egyptian wouldn't understand a telescope. An iron-age Roman wouldn't understand movable type. And Gutenberg himself wouldn't understand a light bulb, let alone the 1920x1200 LED monitors I have in front of me.

So I'm not too worried about an apocalypse. But as a programmer, I'm very worried about crappy software.

Also, it's interesting that the author singled out Visual Studio, which is the tool I use most often to write software. (I wrote all this blog's customizations with it, for example.)

Welcome (and overdue) feature in Chrome

The January release of Google Chrome will prevent videos from auto-playing:

Starting in Chrome 64, which is currently earmarked for a January 2018 release, auto-play will only be allowed when the video in question is muted or when a "user has indicated an interest in the media."

The latter applies if the site has been added to the home screen on mobile or if the user has frequently played media on the site on desktop. Google also says auto-play will be allowed if the user has "tapped or clicked somewhere on the site during the browsing session."

"Chrome will be making auto-play more consistent with user expectations and will give users more control over audio," writes Google in a blog post. "These changes will also unify desktop and mobile web behavior, making web media development more predictable across platforms and browsers."

I mean, really. The more advertisers annoy the shit out of us, the less effective their advertising will be.

Slosh modeling started here

The science of modeling hurricane storm surges started here in Chicago after the seiche of 1954:

When the surge hit Chicago, it hit a city that housed one of the world’s great meteorology departments, at the University of Chicago. One of its professors was the meteorologist George Platzman....

The meeting of those two freak concepts—real but rare deadly Great Lakes storm surges, and the bizarre possibility of an atomic bomb detonating in Lake Michigan—along with his computer-forecasting experiments, led Platzman to take up the nonexistent science of storm-surge prediction, beginning with an attempt to reverse-engineer the 1954 tragedy. His first model, in 1958, got the timing right, but was off by half on the height of the surge; nonetheless, it was used to accurately predict a 1960 Lake Michigan storm surge on Chicago, resulting in a public warning that may have saved lives.

Five years later, Platzman published a much more ambitious run at the phenomenon, crunching 20 years of hourly wind and water-level data at six weather stations on Lake Erie. He also used a much more sophisticated model than his 1958 study—which didn’t include wind stress—a level of complexity only possible in the computer age. And it worked, with an accuracy of about 90 percent.

The models improved into today's SLOSH model, which meteorologists have been using with abandon the past two weeks.

Software frustrations

I'm on the Board of Directors for the Apollo Chorus of Chicago, and information technology is my portfolio. Under that aegis, I'm in the process of taking all of our donor and membership spreadsheets and stuffing them into a new Neon CRM setup.

So far, it's going well, and it's going to make the organization a lot more effective at managing membership, events, and donations.

That said, in the last 24 hours I've logged five bug reports, including one of the most frustrating user experience (UX) bugs possible: a broken back button. This UX failure is so well-known and so irritating that we were talking about it when I started developing Web apps in the late 1990s. Jakob Nielsen called it the #1 web design mistake...of 1999:

The Back button is the lifeline of the web user and the second-most-used navigation feature (after following hypertext links). Users happily know that they can try anything on the web and always be saved by a click or two on Back to return them to familiar territory.

Except, of course, for those sites that break Back by committing one of these design sins:

  • opening a new browser window (see mistake #2)
  • using an immediate redirect: every time the user clicks Back, the browser returns to a page that bounces the user forward to the undesired location
  • preventing caching such that the Back navigation requires a fresh trip to the server; all hypertext navigation should be sub-second and this goes double for backtracking

Neon, however, has made some alternative design choices, and even has a FAQ explaining how they've broken the rules.

Seriously, guys. It's a good product, but wow, is that irritating.

Reactions to the weekend

Apparently, life went on in the US while I was abroad last week. First, to James Damore:

Of course, that wasn't the big story of the weekend. About the terrorist attack and armed ultra-right rally in Virginia, there have been many, many reactions:

Can we have a discussion about right-wing domestic terrorism now? Before we have another Oklahoma City?

Not sure that's a bad thing...

I just saw a comment on a review site listing the following as a "con" for a particular Web-based product:

I really feel like this company doesn't fix problems that only affect a couple of customers. Instead they prioritize fixes that affect the whole system and only fix specific problems when they have time.

Yes. Also, you might be interested to learn that businesses try to make profits by selling things for more than it cost to obtain them.

On behalf of the company in question—a small business in Chicago whose principal constituents are non-profit organizations with budgets under $1m—you're either new to this whole "commerce" thing or you have a magnificently droll sense of humor. Either way, good day to you, sir. I said good day!

Happy 1.5 Gigaseconds!

Tonight at 02:40 UTC, all Unix-based computers (including Macs running OS X) will pass a milestone: 1.5 Gs since the beginning of time (at least as far as Unix is concerned).

Unix keeps track of time by counting the number of seconds since 1 January 1970 at midnight UTC, which (at this writing) was 1,499,962,035 seconds ago. Tonight at 21:40:00 Chicago time will be 1.5 billion seconds since that point.

If you miss this anniversary, don't worry; it'll be 2.0 Gs into the Unix time epoch on 18 May 2033 at 03:33:20 UTC. Mark your calendars now!
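If you want to check the arithmetic yourself, a couple of lines of Python will do it (the milestone values here are just the round numbers from above, converted from the Unix epoch):

```python
from datetime import datetime, timezone

def epoch_plus(seconds):
    """Return the UTC datetime that is `seconds` after the Unix epoch."""
    return datetime.fromtimestamp(seconds, tz=timezone.utc)

# 1.5 Gs after the epoch: 2017-07-14 02:40:00 UTC
print(epoch_plus(1_500_000_000))

# 2.0 Gs after the epoch: 2033-05-18 03:33:20 UTC
print(epoch_plus(2_000_000_000))
```

Note that `fromtimestamp` with an explicit `tz` gives you UTC directly; without it you'd get the answer in your machine's local time zone, which is how Chicagoans end up celebrating at 21:40 the night before.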

Don't do this. Just don't.

It's a general rule of software security that, if I have physical access to your computer, I own it.

I'm analyzing a piece of software so that I can transfer its data to another application. The software runs on a local machine and is written in .NET, with a SQL Express back-end. I have administrator access to the SQL database, the machine, and therefore, to the software.

It took me all of an hour to find the master encryption key in one of the DLLs that make up the software, and another hour to build an applet—using the software's own assemblies—that can read and decrypt every byte in the database.

Good thing I'm covered by a confidentiality agreement and the owner of the data has engaged my company to do exactly what I'm doing. But wow, we really need to migrate this stuff quickly, and get it the hell off this computer.