The Daily Parker

Politics, Weather, Photography, and the Dog

What to teach new coders

Scott Hanselman recommends teaching systems thinking over technical coding:

I told this young person to try not to focus on the syntax of C# and the details of the .NET Framework, and rather to think about the problems that it solves and the system around it.

This advice was .NET specific, but it can also apply to someone learning Rails 3 talking to someone who knows Rails 5, or someone who learned original Node and is now reentering the industry with modern JavaScript and Node 12.

Do you understand how your system talks to the file system? To the network? Do you understand latency and how it can affect your system? Do you have a general understanding of "the stack" from when your backend gets data from the database, makes angle brackets or curly braces, sends them over the network to a client/browser, and what that next system does with the info?

Squeezing an analogy, I'm not asking you to be able to build a car from scratch, or even rebuild an engine. But I am asking you for a passing familiarity with internal combustion engines, how to change a tire, or generally how to change your oil. Or at least know that these things exist so you can google them.

This is why I'm a fan of Hanselman. He's right. Learning technical skills is easy; learning how to think is hard.
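To make his "stack" question concrete, here's a minimal sketch in Scala: a pretend database read, turned into curly braces (JSON), and sent over the network by the JDK's built-in HTTP server. All the names are mine, invented for illustration.

```scala
import com.sun.net.httpserver.{HttpExchange, HttpHandler, HttpServer}
import java.net.InetSocketAddress

object StackSketch {
  case class User(id: Int, name: String)

  // "Your backend gets data from the database" -- faked here with a constant
  def fetchFromDatabase(): User = User(42, "Parker")

  // "...makes angle brackets or curly braces" -- hand-rolled JSON
  def toJson(u: User): String = s"""{"id": ${u.id}, "name": "${u.name}"}"""

  def main(args: Array[String]): Unit = {
    val server = HttpServer.create(new InetSocketAddress(8080), 0)
    server.createContext("/user", new HttpHandler {
      def handle(exchange: HttpExchange): Unit = {
        // "...sends them over the network to a client/browser"
        val body = toJson(fetchFromDatabase()).getBytes("UTF-8")
        exchange.getResponseHeaders.set("Content-Type", "application/json")
        exchange.sendResponseHeaders(200, body.length.toLong)
        exchange.getResponseBody.write(body)
        exchange.close()
      }
    })
    server.start() // then: curl http://localhost:8080/user
  }
}
```

If you can trace every line of that sketch, and say what happens when the network is slow or the database is down, you're thinking about the system rather than the syntax.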

How to protect your data from being stolen

Sadly, you can't. But you can protect yourself from identity theft, as Bruce Schneier explains:

The reality is that your sensitive data has likely already been stolen, multiple times. Cybercriminals have your credit card information. They have your social security number and your mother's maiden name. They have your address and phone number. They obtained the data by hacking any one of the hundreds of companies you entrust with the data -- and you have no visibility into those companies' security practices, and no recourse when they lose your data.

Given this, your best option is to turn your efforts toward trying to make sure that your data isn't used against you. Enable two-factor authentication for all important accounts whenever possible. Don't reuse passwords for anything important -- and get a password manager to remember them all.

Do your best to disable the "secret questions" and other backup authentication mechanisms companies use when you forget your password -- those are invariably insecure. Watch your credit reports and your bank accounts for suspicious activity. Set up credit freezes with the major credit bureaus. Be wary of email and phone calls you get from people purporting to be from companies you do business with.

At the very least, download a password safe (like the one Schneier himself helped write) and make sure that you use a different, random password for everything.
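A password safe does the generating for you, but the principle is simple enough to sketch. Here's a minimal example, with an arbitrary alphabet and length of my own choosing, that draws from a cryptographically secure source:

```scala
import java.security.SecureRandom

object PasswordSketch {
  private val random = new SecureRandom()
  private val alphabet =
    ('a' to 'z') ++ ('A' to 'Z') ++ ('0' to '9') ++ "!@#$%^&*-_"

  // Long, random, and unique per account -- the three properties that matter
  def randomPassword(length: Int = 20): String =
    Seq.fill(length)(alphabet(random.nextInt(alphabet.length))).mkString

  def main(args: Array[String]): Unit =
    println(randomPassword()) // different every run; let the safe remember it
}
```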

Is it time to break up Facebook?

Facebook co-founder Chris Hughes thinks so:

America was built on the idea that power should not be concentrated in any one person, because we are all fallible. That’s why the founders created a system of checks and balances. They didn’t need to foresee the rise of Facebook to understand the threat that gargantuan companies would pose to democracy. Jefferson and Madison were voracious readers of Adam Smith, who believed that monopolies prevent the competition that spurs innovation and leads to economic growth.

A century later, in response to the rise of the oil, railroad and banking trusts of the Gilded Age, the Ohio Republican John Sherman said on the floor of Congress: “If we will not endure a king as a political power, we should not endure a king over the production, transportation and sale of any of the necessities of life. If we would not submit to an emperor, we should not submit to an autocrat of trade with power to prevent competition and to fix the price of any commodity.” The Sherman Antitrust Act of 1890 outlawed monopolies. More legislation followed in the 20th century, creating legal and regulatory structures to promote competition and hold the biggest companies accountable. The Department of Justice broke up monopolies like Standard Oil and AT&T.

For many people today, it’s hard to imagine government doing much of anything right, let alone breaking up a company like Facebook. This isn’t by coincidence.

Starting in the 1970s, a small but dedicated group of economists, lawyers and policymakers sowed the seeds of our cynicism. Over the next 40 years, they financed a network of think tanks, journals, social clubs, academic centers and media outlets to teach an emerging generation that private interests should take precedence over public ones. Their gospel was simple: “Free” markets are dynamic and productive, while government is bureaucratic and ineffective. By the mid-1980s, they had largely managed to relegate energetic antitrust enforcement to the history books.

This shift, combined with business-friendly tax and regulatory policy, ushered in a period of mergers and acquisitions that created megacorporations. In the past 20 years, more than 75 percent of American industries, from airlines to pharmaceuticals, have experienced increased concentration, and the average size of public companies has tripled. The results are a decline in entrepreneurship, stalled productivity growth, and higher prices and fewer choices for consumers.

The same thing is happening in social media and digital communications. Because Facebook so dominates social networking, it faces no market-based accountability. This means that every time Facebook messes up, we repeat an exhausting pattern: first outrage, then disappointment and, finally, resignation.

Hughes makes excellent points. Just because the industries look different than those in the 1890s doesn't mean they haven't consolidated too much. History doesn't repeat itself, but it does rhyme.

Azure DNS failure causes widespread outage

Yesterday, Microsoft made an error in a nameserver delegation change (switching which computers serve part of its internal address book), causing large swaths of Azure to lose track of itself:

Summary of impact: Between 19:43 and 22:35 UTC on 02 May 2019, customers may have experienced intermittent connectivity issues with Azure and other Microsoft services (including M365, Dynamics, DevOps, etc). Most services were recovered by 21:30 UTC with the remaining recovered by 22:35 UTC. 

Preliminary root cause: Engineers identified the underlying root cause as a nameserver delegation change affecting DNS resolution and resulting in downstream impact to Compute, Storage, App Service, AAD, and SQL Database services. During the migration of a legacy DNS system to Azure DNS, some domains for Microsoft services were incorrectly updated. No customer DNS records were impacted during this incident, and the availability of Azure DNS remained at 100% throughout the incident. The problem impacted only records for Microsoft services.

Mitigation: To mitigate, engineers corrected the nameserver delegation issue. Applications and services that accessed the incorrectly configured domains may have cached the incorrect information, leading to a longer restoration time until their cached information expired.

Next steps: Engineers will continue to investigate to establish the full root cause and prevent future occurrences. A detailed RCA will be provided within approximately 72 hours.
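That note about caching in the mitigation is the key detail. Here's a minimal sketch of the client side, using nothing but the JDK resolver (the hostname is just an example): once a bad answer lands in a cache, fixing the authoritative records doesn't help until the cached entry expires.

```scala
import java.net.InetAddress

object DnsSketch {
  def main(args: Array[String]): Unit = {
    // The first lookup goes out through the resolver chain...
    val first = InetAddress.getByName("azure.microsoft.com")
    println(s"resolved to ${first.getHostAddress}")

    // ...but the JVM, like most clients and intermediate resolvers, caches
    // the answer, so a repeat lookup returns the cached address -- right or
    // wrong -- until the entry's time-to-live runs out.
    val second = InetAddress.getByName("azure.microsoft.com")
    println(s"cached answer: ${second.getHostAddress}")
  }
}
```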

If you tried to get to the Daily Parker yesterday afternoon Chicago time, you might have gotten nothing, or gotten the whole blog. All I know is I spent half an hour tracking it down from my end before Microsoft copped to the problem.

That's not a criticism of Microsoft. In fact, they're a lot more transparent about problems like this than most other organizations. And having spent plenty of my own time figuring out why something has broken, I can say half an hour doesn't seem like a lot of time.

So, bad for Microsoft that they tanked their entire universe with a misconfigured DNS server. Good for them that they fixed it completely in just over an hour.

I hate Scala

"It is no one's right to despise another. It is a hard-won privilege over long experience."—Isaac Asimov, "C-Chute"

For the past three months, I've worked with a programming language called Scala. When I started with it, I thought it would present a challenge to learn, but that it would ultimately be worth it. Scala is derived from Java, which in turn is a C-based language. C#, my primary language of the last 18 years, is also a C-based language.

So I analogized thus: C# : Java :: Spanish : Italian; therefore C# : Scala :: Spanish : Neapolitan Italian. So, just as my decent Spanish skills made it relatively easy to navigate Italy, and my decent C# skills made it relatively easy to navigate Java, I expected Scala to be just a dialect I could pick up.

Wow, was I wrong. After three months, I have a better analogy: C# : Scala :: English : Cockney Rhyming Slang. Or perhaps Jive. Or maybe even, C# : Scala :: German : Yiddish. That's right: I submit that Scala is a creole, probably of Java and Haskell.

Let me explain.

Most C-based languages use similar words to express similar concepts. In Java, C#, and other C-based languages, an instance of a class is an object. Classes can have instance and static members. If you have a list of cases to switch between, you switch...case over them. And on and on.

Scala has a thing called a class, a thing called an object, a thing called a case class, and a thing called a match. I'll leave discovering their meanings as an exercise for the reader. Hint: only one of the four works the same way as in the other related languages. In other words, Scala bases itself on one language, but changes so many things that the language is incomprehensible to most people without a lot of study.
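For the impatient, here's the cheat sheet, with names of my own invention; the comments give away the exercise:

```scala
class Dog(val name: String)     // a class: works the way Java and C# people expect

object Kennel {                 // an "object": a singleton, not an instance of anything
  val capacity = 10             // accessed like a static member: Kennel.capacity
}

case class Toy(name: String)    // a "case class": a class with structural equality,
                                // copy(), and pattern-matching support built in

object MatchDemo {
  def describe(toy: Toy): String =
    toy match {                 // "match": pattern matching, far beyond switch...case
      case Toy("ball") => "fetch!"
      case Toy(other)  => s"some toy called $other"
    }

  def main(args: Array[String]): Unit = {
    println(describe(Toy("ball")))      // fetch!
    println(Toy("bone") == Toy("bone")) // true -- equality by value, not reference
  }
}
```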

Further, though Scala comes from object-oriented Java, and while it can work in an object-oriented fashion with a lot of coercion, what it really wants is for you to write functional code in the Haskell style.

As in Yiddish or Gullah, while you can see the relationship from the creole back to the dominant language, no deductive process will get you from the dominant to the creole. "Answer the dog," a perfectly valid way to say "pick up the phone" in Cockney (dog <- dog bone <- rhymes with phone), makes no sense to anyone outside the East End. Nor does the phrase "ich hab' in bodereim," which translates directly from Yiddish as "I have (it) in the bathtub," but really means "fuck it." The fact that Yiddish uses the Hebrew alphabet to express a Germanic creole only adds to its incomprehensibility to outsiders.

These are features, not bugs, of Jive, Gullah, Yiddish, and other creoles and slangs. They take words from the dominant language, mix them with a bunch of other languages, and survive in part as a way to distinguish the in-group from the out-group, where the out-group are the larger culture's dominant class.

I submit that Scala fits this profile.

Scala comes from academic computing, and its creator, German computer scientist Martin Odersky, maintains tight control over the language definition. Fast adoption was never his goal; nor, I think, was producing commercial software. He wanted to solve certain programming problems that current computer science thinking worries about. That it elevates a densely mathematical model of software design into commercial development is incidental, but it does increase the barriers to entry into the profession, so...maybe that's also a feature rather than a bug?

Scala isn't a bad language. It has some benefits over other languages. First, Scala encourages you to create immutable objects, meaning once an object has a value, that value can never change. This prevents threading issues, where one part of the program creates an object and another part changes the object such that the first part gets totally confused. Other languages, like C#, require developers to put guardrails around parts of the code to prevent that from happening. If all of your objects are immutable, however, you don't need those guardrails.
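A minimal sketch of what that buys you, with a hypothetical Account type of my own:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object ImmutableSketch {
  case class Account(owner: String, balance: BigDecimal)

  def main(args: Array[String]): Unit = {
    val shared = Account("Parker", BigDecimal(100))

    // Two tasks read the shared object concurrently -- no locks, no
    // guardrails, because neither can mutate it out from under the other.
    val reads = Future.sequence(Seq(Future(shared.balance), Future(shared.balance)))
    println(Await.result(reads, 5.seconds)) // List(100, 100)

    // "Changing" an immutable object means making a modified copy; the
    // original is untouched, so no thread ever sees a half-finished change.
    val deposited = shared.copy(balance = shared.balance + 50)
    println(s"${shared.balance} / ${deposited.balance}") // 100 / 150
  }
}
```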

A consequence of this, however, is that some really simple operations in C# become excruciating in Scala. Want to create a list of things that change state when they encounter other things? Oh, no, no, no! Changing state is bad! Your two-line snippet that flips a bit on some objects in a list should instead be 12 lines of code (16 if it's readable) that create two copies of every object in your list through a function that operates on the list. I mean, sure, that's one way to do it, I suppose...but it seems more like an interview question with an arbitrary design constraint than a way to solve a real problem.
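Here's the pattern boiled down, with a hypothetical example of my own; in real codebases these structures nest several levels deep, which is where the 12 lines come from:

```scala
object FlipSketch {
  case class Order(id: Int, expedited: Boolean)

  // Instead of flipping a flag in place, you map over the list, copying
  // the objects that need the new value and passing the rest through.
  def expedite(orders: List[Order], ids: Set[Int]): List[Order] =
    orders.map(o => if (ids.contains(o.id)) o.copy(expedited = true) else o)

  def main(args: Array[String]): Unit = {
    val orders = List(Order(1, false), Order(2, false), Order(3, false))
    println(expedite(orders, Set(2, 3)))
    // List(Order(1,false), Order(2,true), Order(3,true))
  }
}
```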

Second, this same characteristic makes the language very, very scalable, right out of the box. That is actually the point: "Scala" is a portmanteau of "scalable" and "language," after all. That said, C# is scalable enough to run Xbox, NBC.com, Office Online, and dozens of other sites with millions of concurrent users, so Scala doesn't exactly have a monopoly on this.

Not to mention, Scala gives frustrated Haskell programmers a way to show off their functional-programming and code-golf skills in a commercial environment. Note that this makes life a lot harder for people who didn't come from an academic background, as the code these people write in Scala is just as incomprehensible to newbies as Haskell is to everyone else.
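To see what I mean, compare two ways of writing the same trivial computation; both are mine, and both work, but only one is kind to the next person on the team:

```scala
object GolfSketch {
  def main(args: Array[String]): Unit = {
    val xs = List(1, 2, 3, 4, 5, 6)

    // The dense, ex-Haskeller version: sum of squares of the evens, one line
    println(xs.foldLeft(0)((acc, n) => if (n % 2 == 0) acc + n * n else acc))

    // The same computation, spelled out
    val evens   = xs.filter(n => n % 2 == 0)
    val squares = evens.map(n => n * n)
    println(squares.sum) // both print 56
  }
}
```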

All of this is fine, but I have developed commercial software for 25 years, and rarely have I encountered a mainstream language as ill-suited as this one to the task of producing a working product in a reasonable time frame. Because the problem that I want to solve, as a grizzled veteran of this industry, is how to do my job effectively. Scala makes that much more difficult than any language I have ever dealt with. (There are a number of other factors in my current role making it difficult to do my job, but that is not the point of this post.)

Maybe I'm just old. Maybe I have thought about software in terms of objects with behavior and data for so long that thinking about it in terms of functions that operate over data just doesn't penetrate. Maybe I've been doing this job long enough to see functional programming as the pendulum swinging back to the 1980s and early 1990s, when the implementationists ruled the day, and not wanting to go back to those dark ages. (That's another post, soon.)

This cri de coeur aside, I'll continue learning Scala, and continue doing my job. I just really, really wish it had fewer barriers to success.

David Graeber on Bullshit Jobs

I've just started reading anthropologist David Graeber's book Bullshit Jobs. It's hilarious and depressing at the same time. For a good summary, I would point you to Graeber's own essay "On the Phenomenon of Bullshit Jobs" that ran in Strike seven years ago:

A recent report comparing employment in the US between 1910 and 2000 gives us a clear picture (and I note, one pretty much exactly echoed in the UK). Over the course of the last century, the number of workers employed as domestic servants, in industry, and in the farm sector has collapsed dramatically. At the same time, ‘professional, managerial, clerical, sales, and service workers’ tripled, growing ‘from one-quarter to three-quarters of total employment.’ In other words, productive jobs have, just as predicted, been largely automated away (even if you count industrial workers globally, including the toiling masses in India and China, such workers are still not nearly so large a percentage of the world population as they used to be.)

But rather than allowing a massive reduction of working hours to free the world's population to pursue their own projects, pleasures, visions, and ideas, we have seen the ballooning of not even so much of the ‘service’ sector as of the administrative sector, up to and including the creation of whole new industries like financial services or telemarketing, or the unprecedented expansion of sectors like corporate law, academic and health administration, human resources, and public relations. And these numbers do not even reflect on all those people whose job is to provide administrative, technical, or security support for these industries, or for that matter the whole host of ancillary industries (dog-washers, all-night pizza delivery) that only exist because everyone else is spending so much of their time working in all the other ones.

These are what I propose to call ‘bullshit jobs’.

It's as if someone were out there making up pointless jobs just for the sake of keeping us all working. And here, precisely, lies the mystery. In capitalism, this is precisely what is not supposed to happen. Sure, in the old inefficient socialist states like the Soviet Union, where employment was considered both a right and a sacred duty, the system made up as many jobs as they had to (this is why in Soviet department stores it took three clerks to sell a piece of meat). But, of course, this is the sort of very problem market competition is supposed to fix. According to economic theory, at least, the last thing a profit-seeking firm is going to do is shell out money to workers they don't really need to employ. Still, somehow, it happens.

The book expands on the essay's themes and adds scholarship, making it even more depressing than the original column. But he suggests an alternative: public policies that redistribute wealth back to the people who created it and actually free up our time from these bullshit jobs.

Short distance office move

My team has moved to a new space we've leased on a different floor of Chicago's Aon Center. This morning, this was my view:

And now, one floor lower and facing the opposite direction, this is my view:

I actually prefer the south view, but only marginally. In fact, I'll probably keep taking photos of the south view. But neither view sucks.

Critics of the Web—30 years ago

Alexis Madrigal takes a look at criticisms of the World Wide Web from when it was new:

Thirty years ago this week, the British scientist Tim Berners-Lee invented the World Wide Web at CERN, the European scientific-research center. Suffice it to say, the idea took off. The web made it easy for everyday people to create and link together pages on what was then a small network. The programming language was simple, and publishing was as painless as uploading something to a server with a few tags in it.

Just a few years after the internet’s creation, a vociferous set of critics—most notably in Resisting the Virtual Life, a 1995 anthology published by City Lights Books—rose to challenge the ideas that underlay the technology, as previous groups had done with other, earlier technologies.

Maybe as a major technological movement begins to accelerate—but before its language, corporate power, and political economics begin to warp reality—a brief moment occurs when critics see the full and awful potential of whatever’s coming into the world. No, the new technology will not bring better living (at least not only that). There will be losers. Oppression will worm its way into even the most seemingly liberating spaces. The noncommercial will become hooked to a vast profit machine. People of color will be discriminated against in new ways. Women will have new labors on top of the old ones. The horror-show recombination of old systems and cultures with new technological surfaces and innards is visible, like the half-destroyed robot face of Arnold Schwarzenegger in Terminator 2.

Then, if money and people really start to pour into the technology, the resistance will be swept away, left dusty and coughing as what gets called progress rushes on.

The whole piece is worth a read.

Two on data security

First, Bruce Schneier takes a look at Facebook's privacy shift:

There is ample reason to question Zuckerberg's pronouncement: The company has made -- and broken -- many privacy promises over the years. And if you read his 3,000-word post carefully, Zuckerberg says nothing about changing Facebook's surveillance capitalism business model. All the post discusses is making private chats more central to the company, which seems to be a play for increased market dominance and to counter the Chinese company WeChat.

We don't expect Facebook to abandon its advertising business model, relent in its push for monopolistic dominance, or fundamentally alter its social networking platforms. But the company can give users important privacy protections and controls without abandoning surveillance capitalism. While some of these changes will reduce profits in the short term, we hope Facebook's leadership realizes that they are in the best long-term interest of the company.

Facebook talks about community and bringing people together. These are admirable goals, and there's plenty of value (and profit) in having a sustainable platform for connecting people. But as long as the most important measure of success is short-term profit, doing things that help strengthen communities will fall by the wayside. Surveillance, which allows individually targeted advertising, will be prioritized over user privacy. Outrage, which drives engagement, will be prioritized over feelings of belonging. And corporate secrecy, which allows Facebook to evade both regulators and its users, will be prioritized over societal oversight. If Facebook now truly believes that these latter options are critical to its long-term success as a company, we welcome the changes that are forthcoming.

And Cory Doctorow describes a critical flaw in Switzerland's e-voting system:

[E]-voting is a terrible idea and the general consensus among security experts who don't work for e-voting vendors is that it shouldn't be attempted, but if you put out an RFP for magic beans, someone will always show up to sell you magic beans, whether or not magic beans exist.

The belief that companies can be trusted with this power [to fix security defects while preventing people from disclosing them] defies all logic, but it persists. Someone found Swiss Post's embrace of the idea too odious to bear, and they leaked the source code that Swiss Post had shared under its nondisclosure terms, and then an international team of some of the world's top security experts (including some of our favorites, like Matthew Green) set about analyzing that code, and (as every security expert who doesn't work for an e-voting company has predicted since the beginning of time), they found an incredibly powerful bug that would allow a single untrusted party at Swiss Post to undetectably alter the election results.

You might be thinking, "Well, what is the big deal? If you don't trust the people administering an election, you can't trust the election's outcome, right?" Not really: we design election systems so that multiple, uncoordinated people all act as checks and balances on each other. To suborn a well-run election takes massive coordination at many polling- and counting-places, as well as independent scrutineers from different political parties, as well as outside observers, etc.

And even other insecure e-voting systems like the ones in the USA are not this bad: they're decentralized, and would-be vote-riggers would have to compromise many systems, all around the nation, in each poll that they wanted to alter. But Swiss Post's defect allows a single party to alter all the polling data, and subvert all the audit systems. As Matthew Green told Motherboard: "I don’t think this was deliberate. However, if I set out to design a backdoor that allowed someone to compromise the election, it would look exactly like this."

Switzerland is going ahead with the election anyway, because that's what people do when they're called out on stupidity.

Weekend reading list

Just a few things I'm reading that you also might want to read:

And finally, it's getting close to April and the Blogging A-to-Z Challenge. Stay tuned.