The Daily Parker

Politics, Weather, Photography, and the Dog

How to protect your data from being stolen

Sadly, you can't. But you can protect yourself from identity theft, as Bruce Schneier explains:

The reality is that your sensitive data has likely already been stolen, multiple times. Cybercriminals have your credit card information. They have your social security number and your mother's maiden name. They have your address and phone number. They obtained the data by hacking any one of the hundreds of companies you entrust with the data -- and you have no visibility into those companies' security practices, and no recourse when they lose your data.

Given this, your best option is to turn your efforts toward trying to make sure that your data isn't used against you. Enable two-factor authentication for all important accounts whenever possible. Don't reuse passwords for anything important -- and get a password manager to remember them all.

Do your best to disable the "secret questions" and other backup authentication mechanisms companies use when you forget your password -- those are invariably insecure. Watch your credit reports and your bank accounts for suspicious activity. Set up credit freezes with the major credit bureaus. Be wary of email and phone calls you get from people purporting to be from companies you do business with.

At the very least, download a password safe (like the one Schneier himself helped write) and make sure that you use a different, random password for everything.
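
To make "random" concrete, here's a toy Scala sketch of generating a password using only the JDK's SecureRandom; the alphabet and length are arbitrary illustrative choices, and a real password safe does this for you and, crucially, remembers the result:

    import java.security.SecureRandom

    object RandomPassword {
      // Characters drawn from a cryptographically secure RNG; the alphabet
      // and the 20-character default are arbitrary choices for illustration.
      private val alphabet: IndexedSeq[Char] =
        ('a' to 'z') ++ ('A' to 'Z') ++ ('0' to '9') ++ "!@#$%^&*-_=+".toSeq
      private val rng = new SecureRandom()

      def generate(length: Int = 20): String =
        Seq.fill(length)(alphabet(rng.nextInt(alphabet.length))).mkString

      def main(args: Array[String]): Unit =
        println(generate())   // different every run, and memorable to no one
    }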

Is it time to break up Facebook?

Facebook co-founder Chris Hughes thinks so:

America was built on the idea that power should not be concentrated in any one person, because we are all fallible. That’s why the founders created a system of checks and balances. They didn’t need to foresee the rise of Facebook to understand the threat that gargantuan companies would pose to democracy. Jefferson and Madison were voracious readers of Adam Smith, who believed that monopolies prevent the competition that spurs innovation and leads to economic growth.

A century later, in response to the rise of the oil, railroad and banking trusts of the Gilded Age, the Ohio Republican John Sherman said on the floor of Congress: “If we will not endure a king as a political power, we should not endure a king over the production, transportation and sale of any of the necessities of life. If we would not submit to an emperor, we should not submit to an autocrat of trade with power to prevent competition and to fix the price of any commodity.” The Sherman Antitrust Act of 1890 outlawed monopolies. More legislation followed in the 20th century, creating legal and regulatory structures to promote competition and hold the biggest companies accountable. The Department of Justice broke up monopolies like Standard Oil and AT&T.

For many people today, it’s hard to imagine government doing much of anything right, let alone breaking up a company like Facebook. This isn’t by coincidence.

Starting in the 1970s, a small but dedicated group of economists, lawyers and policymakers sowed the seeds of our cynicism. Over the next 40 years, they financed a network of think tanks, journals, social clubs, academic centers and media outlets to teach an emerging generation that private interests should take precedence over public ones. Their gospel was simple: “Free” markets are dynamic and productive, while government is bureaucratic and ineffective. By the mid-1980s, they had largely managed to relegate energetic antitrust enforcement to the history books.

This shift, combined with business-friendly tax and regulatory policy, ushered in a period of mergers and acquisitions that created megacorporations. In the past 20 years, more than 75 percent of American industries, from airlines to pharmaceuticals, have experienced increased concentration, and the average size of public companies has tripled. The results are a decline in entrepreneurship, stalled productivity growth, and higher prices and fewer choices for consumers.

The same thing is happening in social media and digital communications. Because Facebook so dominates social networking, it faces no market-based accountability. This means that every time Facebook messes up, we repeat an exhausting pattern: first outrage, then disappointment and, finally, resignation.

Hughes makes excellent points. Just because today's industries look different from those of the 1890s doesn't mean they haven't consolidated too much. History doesn't repeat itself, but it does rhyme.

Azure DNS failure causes widespread outage

Yesterday, Microsoft made an error during a nameserver delegation change (essentially, switching which computers hold part of their internal address book), causing large swaths of Azure to lose track of itself:

Summary of impact: Between 19:43 and 22:35 UTC on 02 May 2019, customers may have experienced intermittent connectivity issues with Azure and other Microsoft services (including M365, Dynamics, DevOps, etc). Most services were recovered by 21:30 UTC with the remaining recovered by 22:35 UTC. 

Preliminary root cause: Engineers identified the underlying root cause as a nameserver delegation change affecting DNS resolution and resulting in downstream impact to Compute, Storage, App Service, AAD, and SQL Database services. During the migration of a legacy DNS system to Azure DNS, some domains for Microsoft services were incorrectly updated. No customer DNS records were impacted during this incident, and the availability of Azure DNS remained at 100% throughout the incident. The problem impacted only records for Microsoft services.

Mitigation: To mitigate, engineers corrected the nameserver delegation issue. Applications and services that accessed the incorrectly configured domains may have cached the incorrect information, leading to a longer restoration time until their cached information expired.

Next steps: Engineers will continue to investigate to establish the full root cause and prevent future occurrences. A detailed RCA will be provided within approximately 72 hours.
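
For what it's worth, here's a minimal Scala sketch of asking your own resolver what it currently returns for a name (the hostname is just a placeholder); as the mitigation note says, cached answers can outlive the fix, so different machines may disagree for a while:

    import java.net.InetAddress

    object WhoAnswers {
      def main(args: Array[String]): Unit = {
        // Resolve a name through the local stack (and whatever caches sit
        // behind it). During an incident like this, the answer can vary from
        // machine to machine until stale records expire.
        val host = if (args.nonEmpty) args(0) else "www.example.com"
        InetAddress.getAllByName(host).foreach(addr => println(addr.getHostAddress))
      }
    }

Plain old nslookup or dig does the same job from a command line, of course.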

If you tried to get to the Daily Parker yesterday afternoon Chicago time, you might have gotten nothing at all, or you might have gotten the whole blog as usual; the outage was intermittent. All I know is I spent half an hour tracking it down from my end before Microsoft copped to the problem.

That's not a criticism of Microsoft. In fact, they're a lot more transparent about problems like this than most other organizations. And to someone who has spent a lot of time trying to figure out why something has broken, half an hour doesn't seem like much.

So, bad for Microsoft that they tanked their entire universe with a botched DNS delegation. Good for them that they fixed it completely in under three hours.

I hate Scala

"It is no one's right to despise another. It is a hard-won privilege over long experience."—Isaac Asimov, "C-Chute"

For the past three months, I've worked with a programming language called Scala. When I started with it, I figured it would be a challenge to learn, but that the effort would ultimately be worth it. Scala is derived from Java, which in turn is a C-based language. C#, my primary language of the last 18 years, is also a C-based language.

So I analogized thus: C# : Java :: Spanish : Italian; therefore, C# : Scala :: Spanish : Neapolitan Italian. Just as my decent Spanish skills make it relatively easy to navigate Italy, and my decent C# skills make it relatively easy to navigate Java, I expected Scala to be just a dialect I could pick up.

Wow, was I wrong. After three months, I have a better analogy: C# : Scala :: English : Cockney Rhyming Slang. Or perhaps Jive. Or maybe even, C# : Scala :: German : Yiddish. That's right: I submit that Scala is a creole, probably of Java and Haskell.

Let me explain.

Most C-based languages use similar words to express similar concepts. In Java, C#, and other C-based languages, an instance of a class is an object. Classes can have instance and static members. If you have a list of cases to switch between, you switch...case over them. And on and on.

Scala has a thing called a class, a thing called an object, a thing called a case class, and a thing called a match. I'll leave discovering their meanings as an exercise for the reader. Hint: only one of the four works the same way as in the other related languages. In other words, Scala bases itself on one language, but changes so many things that the language is incomprehensible to most people without a lot of study.
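
For readers who would rather skip the exercise, here's a minimal sketch of all four terms in action (the Dog/Kennel/Treat names are mine, invented purely for illustration):

    object ScalaGlossary {
      class Dog(val name: String)        // "class": the one term that behaves the way
                                         // a Java or C# developer expects

      object Kennel {                    // "object": a singleton instance, roughly where
        val capacity = 12                // other languages would reach for static members
      }

      case class Treat(flavor: String)   // "case class": an immutable value type with
                                         // equality, hashCode, toString, and copy() built in

      def describe(t: Treat): String =
        t match {                        // "match": pattern matching, a superset of switch...case
          case Treat("bacon") => "the good stuff"
          case Treat(other)   => s"merely $other"
        }
    }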

Further, though Scala comes from object-oriented Java, and can work in an object-oriented fashion with a lot of coercion, it is fundamentally a functional language. Its classes don't know things and do things; they just do things.

As in Yiddish or Gullah, while you can see the relationship from the creole back to the dominant language, no deductive process will get you from the dominant to the creole. "Answer the dog," a perfectly valid way to say "pick up the phone" in Cockney (dog <- dog and bone <- rhymes with phone), makes no sense to anyone outside the East End. Nor does the phrase "ich hab' in bodereim," which translates directly from Yiddish as "I have (it) in the bathtub," but really means "fuck it." The fact that Yiddish uses the Hebrew alphabet to express a Germanic creole only adds to its incomprehensibility to outsiders.

These are features, not bugs, of Jive, Gullah, Yiddish, and other creoles and slangs. They take words from the dominant language, mix them with a bunch of other languages, and survive in part as a way to distinguish the in-group from the out-group, where the out-group are the larger culture's dominant class.

I submit that Scala fits this profile.

Scala comes from academic computing, and its creator, German computer scientist Martin Odersky, maintains tight control over the language definition. Fast adoption was never his goal; nor, I think, was producing commercial software. He wanted to solve certain programming problems that current computer science thinking worries about. That it elevates a densely mathematical model of software design into commercial development is incidental, but it does increase the barriers to entry into the profession, so...maybe that's also a feature rather than a bug?

Scala isn't a bad language. It has some benefits over other languages. First, Scala encourages you to create immutable objects, meaning once an object has a value, that value can never change. This prevents threading issues, where one part of the program creates an object and another part changes the object such that the first part gets totally confused. Other languages, like C#, require developers to put guardrails around parts of the code to prevent that from happening. If all of your objects are immutable, however, you don't need those guardrails.

A consequence of this, however, is that some really simple operations in C# become excruciating in Scala. Want to create a list of things that change state when they encounter other things? Oh, no, no, no! Changing state is bad! Your two-line snippet that flips a bit on some objects in a list should instead be 12 lines of code (16 if it's readable) that create two copies of every object in your list through a function that operates on the list. I mean, sure, that's one way to do it, I suppose...but it seems more like an interview question with an arbitrary design constraint than a way to solve a real problem.
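
Concretely, here's roughly what that copy-everything approach looks like on a toy example (the Doc type and the archiving rule are invented for illustration):

    object FlipBits {
      final case class Doc(id: Int, archived: Boolean)

      // Instead of flipping a flag in place, build a new list of new objects,
      // copying each element and changing only the field that matters.
      def archiveBefore(docs: List[Doc], cutoff: Int): List[Doc] =
        docs.map { d =>
          if (d.id < cutoff) d.copy(archived = true) else d
        }

      def main(args: Array[String]): Unit = {
        val docs = List(Doc(1, archived = false), Doc(2, archived = false), Doc(3, archived = false))
        println(archiveBefore(docs, cutoff = 3))
        // List(Doc(1,true), Doc(2,true), Doc(3,false)) -- the originals are untouched
      }
    }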

Second, this same characteristic makes the language very, very scalable, right out of the box. That is actually the point: "Scala" is a portmanteau of "scalable" and "language," after all. That said, C# is scalable enough to run Xbox, NBC.com, Office Online, and dozens of other sites with millions of concurrent users, so Scala doesn't exactly have a monopoly on this.

Not to mention, Scala gives frustrated Haskell programmers a way to show off their functional-programming and code-golf skills in a commercial environment. Note that this makes life a lot harder for people who didn't come from an academic background, as the code these people write in Scala is just as incomprehensible to newbies as Haskell is to everyone else.

All of this is fine, but I have developed commercial software for 25 years, and rarely have I encountered a mainstream language as ill-suited to the task of producing a working product in a reasonable time frame as this one. Because the problem that I want to solve, as a grizzled veteran of this industry, is how I can do my job effectively. Scala is making that much more difficult than any language I have ever dealt with. (There are a number of other factors in my current role making it difficult to do my job, but that is not the point of this post.)

Maybe I'm just old. Maybe I have thought about software in terms of objects with behavior and data for so long that thinking about it in terms of functions that operate over data just doesn't penetrate. Maybe I've been doing this job long enough to see functional programming as the pendulum swinging back to the 1980s and early 1990s, when the implementationists ruled the day, and I have no desire to go back to those dark ages. (That's another post, soon.)

This cri de coeur aside, I'll continue learning Scala, and continue doing my job. I just really, really wish it had fewer barriers to success.

Edited to finish an incomplete sentence.

Readings between meetings

On my list today:

Back to meetings...

Today's Google Doodle

Today is Johann Sebastian Bach's 334th birthday, and to celebrate, Google has created a Doodle that uses artificial intelligence to harmonize a melody that you can supply:

Google says the Doodle uses machine learning to "harmonize the custom melody into Bach's signature music style." Bach's chorales were known for having four voices, each carrying its own melodic line.

To develop the AI Doodle, Google teams created a machine-learning model that was trained on 306 of Bach's chorale harmonizations. Another team worked to allow machine learning to occur within the web browser instead of on its servers.

The results are...interesting. (I'm about to get my music theory nerd on, so if that's not your bag you can wait until I post something political this afternoon.)

Here's a little d-minor melody I whipped up:

The basic harmonic structure of this melody is i-V7/V-V-vi-V-i. Even though I haven't taken a music theory course in [redacted] years, I can probably harmonize this melody ten times without breaking a sweat. Basically, on the beats, you've got d minor, a minor on &2, E major, A major; then in the second bar, Bb major, g minor, E major, A major, d minor. (Note that some of those are passing harmonies that expand other harmonies.)

Google's AI did not do that. It actually got a little flummoxed. Here's one example:

Oh, dearie dearie me. I think one of the problems is that it thought I had done something really weird in F-major instead of something prosaic in d-minor. And it doesn't provide any way of tweaking the harmonization.

So, really very cool AI going on there, but not yet ready for prime time. Still worth playing around with. If I had more time I'd try some simpler melodies to see if it helps.

(If you liked this post, by the way, you will love what I do in April.)

This sort of thing has cropped up before

...and it has always been due to human error.

Today, I don't mean the HAL-9000. Amtrak:

Amtrak said “human error” is to blame for the disrupted service yesterday at Union Station.

A worker fell on a circuit board, which turned off computers and led to the service interruption, according to U.S. Sen. Dick Durbin.

The delay lasted more than 12 hours and caused significant overcrowding at Union Station.

The error affected more than 60,000 Amtrak and Metra passengers taking trains from Union to the suburbs, according to reports. Some riders resorted to taking the CTA or using ride-sharing services to get home, Chicago Tribune reported.

An analysis of the signal system failures determined they were caused by “human error in the process of deploying a server upgrade in our technology facility that supports our dispatch control system” at Union Station, Amtrak said in a statement. Amtrak apologized in the statement for failing to provide the service that’s expected of it.

Which led my co-workers to wonder, why the hell were they doing a critical server upgrade in the middle of the day?

Olé, olé olé olé!

Oh, I love these stories. On today's Daily WTF, editor Remy Porter describes the world I grew up in, where dates were dates and 30 December 1899 ruled them all:

If you wanted to set a landmark, you could pick any date, but a nice round number seems reasonable. Let's say, for example, January 1st, 1900. From there, it's easy to just add and subtract numbers of days to produce new dates. Oh, but you do have to think about leap years. Leap years are more complicated: a year is a leap year if it's divisible by four, but not if it's divisible by 100, unless it's also divisible by 400. That's a lot of math to do if you're trying to fit a thousand rows in a spreadsheet on a computer with less horsepower than your average 2019 thermostat.

So you cheat. Checking if a number is divisible by four doesn't require a modulus operation—you can check that with a bitmask, which is super fast. Unfortunately, it means your code is wrong, because you think 1900 is a leap year. Now all your dates after February 28th are off-by-one. Then again, you're the one counting. Speaking of being the one counting, while arrays might start at zero, normal humans start counting at one, so January 1st should be 1, which makes December 31st, 1899 your "zero" date.

Our macro language is off-by-one for the first few months of 1900, but that discrepancy is acceptable, and no one at Microsoft, including Bill Gates who signed off on it, cares.

The Basic-derived macro language is successful enough inside of Excel that it grows up to be Visual Basic. It is "the" Microsoft language, and when they start extending it with features like COM for handling library linking and cross-process communication, it lays the model. Which means when they're figuring out how to do dates in COM… they use the Visual Basic date model. And COM was the whole banana, as far as Windows was concerned: everything on Windows touched COM or its successors in some fashion. It wasn't until .NET that the rule of December 30th, 1899 was finally broken, but it still crops up in Office products and SQL Server from time to time.

The .NET epoch begins on 1 January 0001, for both DateTime and DateTimeOffset values, using a proleptic Gregorian calendar that extends back long before the calendar actually existed. SQL Server's venerable datetime type, meanwhile, can't represent anything before 1 January 1753, just after Britain finally adopted the Gregorian calendar; the newer datetime2 type goes back to year 1 as well.
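
To make the quirks above concrete, here's a quick sketch using nothing but the JDK's java.time classes: the correct leap-year rule, the cheap divisible-by-four shortcut that gives 1900 a phantom 29 February, and a serial number counted from the OLE day zero of 30 December 1899 (chosen so that, with a correct calendar, the serial numbers still line up with the spreadsheet's from March 1900 onward):

    import java.time.LocalDate
    import java.time.temporal.ChronoUnit

    object DayZero {
      // The correct Gregorian rule...
      def isLeap(year: Int): Boolean =
        (year % 4 == 0 && year % 100 != 0) || year % 400 == 0

      // ...and the cheap shortcut described above, which thinks 1900 is a leap year.
      def isLeapCheap(year: Int): Boolean = (year & 3) == 0

      def main(args: Array[String]): Unit = {
        println(isLeap(1900))       // false
        println(isLeapCheap(1900))  // true -- hence the phantom 29 February 1900

        // OLE Automation / Visual Basic "day zero": 30 December 1899.
        val oleDayZero = LocalDate.of(1899, 12, 30)
        val serial = ChronoUnit.DAYS.between(oleDayZero, LocalDate.of(2019, 5, 4))
        println(serial)             // that date's OLE-style serial number
      }
    }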

The bottom line: dates are hard.

Rally nice view

I'm happy to announce that I started a new role on the 14th at Rally Health, a software company wholly owned by UnitedHealth Group. I'll have more to say later (still figuring out the social media policies), but for now I can say, look at the view:

And here's the view from the north:

Today, by the way, is the first day since I started that we've had anything approaching full sunlight. Of course, it's frighteningly cold out, but hey: nice view.

(I'll update Facebook and LinkedIn over the weekend, for those of you who care about those things.)