The Daily Parker

Politics, Weather, Photography, and the Dog

It was 20 years ago today

On 13 May 1998, just past midnight New York time, I posted my first joke on my brand-new braverman.org website from my apartment in Brooklyn.

My first website of any kind was a page of links I maintained for myself starting in April 1997. Throughout 1997 and into 1998 I gradually experimented with Active Server Pages, the new hotness, and built out some rudimentary weather features. That site launched on 19 August 1997.

By early April 1998, I had a news feed, photos, and some other features. On April 2nd, I reserved the domain name braverman.org. Then on May 6th, I launched a redesign that filled out our giant 1024 x 768 CRT displays. Here's what it looked like; please don't vomit:

On May 13th, 20 years ago today, I added a Jokes section. That's when I started posting things for the general public, not just for myself, which made the site a proto-blog. That's the milestone this post is commemorating.

Shortly after that, I changed the name to "The Write Site," which lasted until early 2000.

In 1999, Katie Toner redesigned the site. The earliest Wayback Machine image shows how it looked after that. Except for the screenshot above, I have no records of how the site looked prior to Katie's redesign, and no easy way of recreating it from old source code.

I didn't call it a "blog" until November 2005. But on the original braverman.org site, I posted jokes, thoughts, news, my aviation log, and other bits of debris somewhat regularly. What else was it, really?

Today, The Daily Parker has 6,209 posts in dozens of categories. Will it go 20 more years? It might. Stick around.

Odd little personal milestone

No, this isn't one of the two Daily Parker milestones we'll see this month. It's trivial and personal.

On this day in 1988, 30 years ago, I bought my first CD. It was an almost-new technology—the first CDs went on sale commercially in 1982—and it sounded a lot better than scratchy old vinyl records.

Just looking back at what I posted 10 years ago confirms I haven't bought that many CDs lately. I don't have the number in front of me, but I believe I've now got 940 of them, meaning I've bought an average of 12 a year since 2008. That's far fewer than the 12 a month I bought in 1990.

For historical context, when I bought my first CD, Ronald Reagan was president, it looked likely (though not certain) that Michael Dukakis and George H.W. Bush would be the candidates to replace him, and our arch-rival for world domination was the Union of Soviet Socialist Republics. A Toyota Corolla cost $10,000, a gallon of gas or a gallon of milk cost 96¢, and you could buy a 3-bedroom house in my home town for $200,000. (The same house is now close to $750,000.)

Disappearing Midwestern accents

Edward McClelland essays on the decline of the white blue-collar Midwest, as expressed linguistically:

The “classic Chicago” accent, with its elongated vowels and its tendency to substitute “dese, dem, and dose” for “these, them, and those,” or “chree” for “three,” was the voice of the city’s white working class. “Dese, Dem, and Dose Guy,” in fact, is a term for a certain type of down-to-earth Chicagoan, usually from a white South Side neighborhood or an inner-ring suburb.

The classic accent was most widespread during the city’s industrial heyday. Blue-collar work and strong regional speech are closely connected: If you were white and graduated high school in the 1960s, you didn’t need to go to college, or even leave your neighborhood, to get a good job, and once you got that job, you didn’t have to talk to anyone outside your house, your factory, or your tavern. A regular-joe accent was a sign of masculinity and local cred, bonding forces important for the teamwork of industrial labor.

The classic Chicago accent is heard less often these days because the white working class is less numerous, and less influential, than it was in the 20th century. It has been pushed to the margins of city life, both figuratively and geographically, by white flight, multiculturalism and globalization: The accent is most prevalent in blue-collar suburbs and predominantly white neighborhoods in the northwest and southwest corners of the city, now heavily populated by city workers whose families have lived in Chicago for generations.

There’s a conception that television leveled local accents, by bringing so-called “broadcaster English” into every home. I don’t think this is true. No one watched more television than the Baby Boomers, but their accents are much stronger than those of their children, the Millennials.

What’s really killing the local accent is education and geographic mobility, which became economic necessities for young Rust Belters after the mills closed down. But as blue-collar jobs have faded, so has some of our linguistic diversity.

McClelland adapted his CityLab essay from his 2016 book How to Speak Midwestern, which is obviously now on my Amazon wish list.

Construction complication in Dunning

No, not the Dunning of Kruger fame; Dunning, the community area on the far northwest side of Chicago.

Workers building a new school in the neighborhood discovered not only that it was the former site of a poorhouse, but also that 38,000 people may be buried there:

“There can be and there have been bodies found all over the place,” said Barry Fleig, a genealogist and cemetery researcher who began investigating the site in 1989. “It’s a spooky, scary place.”

Workers have until April 27 to excavate and clear the site, remediate the soil and relocate an existing sewer line. The school is scheduled to open in time for the 2019-20 academic year, though a spokesperson for Chicago Public Schools would not say what type of school it will be.

Fleig said he’s “nearly certain” there are no intact caskets buried underneath the proposed school grounds — bodies were primarily buried in two formal cemeteries, though scattered human remains have been discovered during previous construction projects near the campus.

In 1854, the county opened a poorhouse and farm and gradually added an insane asylum, infirmary and tuberculosis hospital to the property. At its peak, a thousand people were buried on the grounds each year.

The state took over in 1912 and changed the official name to Chicago State Hospital. Buildings were shuttered in 1970 and operations moved west of Oak Park Avenue to what is now Chicago-Read Mental Health Center. 

In 1854, the site would have been a few hours' ride from the city. So I'm glad to see that the American tradition of dumping the poor in places where they can't possibly thrive was as strong then as now. I'm a little shocked that a pauper's cemetery acquired so many corpses in sixty years, though.

A lie wrapped in a fabrication, lacquered over with a falsehood

In a powerful June 2016 column for Slate, Dahlia Lithwick laid out the NRA's (and the right's) second-amendment hoax. It's worth revisiting:

The Supreme Court ... most famously in a 1939 case called U.S. v. Miller [ruled] that since the possession or use of a “shotgun having a barrel of less than eighteen inches in length” had no reasonable relationship to the “preservation or efficiency of a well regulated militia,” the court simply could not find that the Second Amendment guaranteed “the right to keep and bear such an instrument.” Period, full stop. And that was the viewpoint adopted by the courts for years.

What changed? As Cass Sunstein and others have explained, what changed things was a decades-long effort by exceptionally well-organized, well-funded interest groups that included the National Rifle Association—all of whom “embarked on an extraordinary campaign to convince the public, and eventually the courts, to understand the Second Amendment in their preferred way.”

The larger fabrication is the idea that the Second Amendment—unlike other provisions of the Constitution—cannot be subject to any reasonable restriction.

Hoax number three: Obama, Clinton, Democrats, liberals, the media, whomever are coming for your guns. They are Coming. For your Guns!!! This is the crunchy candy shell that makes the other two lies seem almost reasonable.

Meanwhile, as Lithwick and others keep saying, we're the only country in the OECD where you're more likely to get shot than get hit by lightning. (Seriously, in every other country the incidence of gun death is less than 0.5 per 100,000—about the incidence of being injured or killed by lightning. In the U.S., the incidence of gun murder, not just getting shot, is around 3.6 per 100,000.)

And to think, this is all driven by a trade association. Imagine if the National Association of Dental Hygienists had that much power.

And kudos to Lyft, who announced they'll give free rides to anti-gun rallies. This is one more reason I use them and not the other guys.

One sentence that sums everything up concisely

From Josh Marshall:

[D]ecoupling the United States from the major states and economies of Western Europe has been the central foreign policy goal of Russia for about 70 years.

We defeated the Soviet Union by allying ourselves with most of the world. Now the President of the United States is undoing 70 years of work and handing Russia their own sphere of influence.

Great work, Mr President.

On the radar today

I'm actually coughing up a lung at home today, which you'd think would give me more time to read, but it doesn't. Really I just want a nap.

Now I have to decide whether to debug some notoriously slow code of mine, or...nap.

It's official: The Millennial Generation is 1981-1996

At least according to Pew Research:

Pew Research Center has been studying the Millennial generation for more than a decade. But as we enter 2018, it’s become clear to us that it’s time to determine a cutoff point between Millennials and the next generation. Turning 37 this year, the oldest Millennials are well into adulthood, and they first entered adulthood before today’s youngest adults were born.

In order to keep the Millennial generation analytically meaningful, and to begin looking at what might be unique about the next cohort, Pew Research Center will use 1996 as the last birth year for Millennials for our future work. Anyone born between 1981 and 1996 (ages 22-37 in 2018) will be considered a Millennial, and anyone born from 1997 onward will be part of a new generation. Since the oldest among this rising generation are just turning 21 this year, and most are still in their teens, we think it’s too early to give them a name – though The New York Times asked readers to take a stab – and we look forward to watching as conversations among researchers, the media and the public help a name for this generation take shape. In the meantime, we will simply call them “post-Millennials” until a common nomenclature takes hold.

Generational cutoff points aren’t an exact science. They should be viewed primarily as tools, allowing for the kinds of analyses detailed above. But their boundaries are not arbitrary. Generations are often considered by their span, but again there is no agreed upon formula for how long that span should be. At 16 years (1981 to 1996), our working definition of Millennials will be equivalent in age span to their preceding generation, Generation X (born between 1965 and 1980). By this definition, both are shorter than the span of the Baby Boomers (19 years) – the only generation officially designated by the U.S. Census Bureau, based on the famous surge in post-WWII births in 1946 and a significant decline in birthrates after 1964.

I've always been solidly an X-er, but some of my friends will be surprised to learn that they, too, are now officially Gen X.

Bronze age defenses, modern attacks

Via Bruce Schneier, DHS Senior Analyst Jack Anderson describes how walls are still a dominant security metaphor, and the consequences of that choice:

Walls don’t fail gracefully. But there is a bewitching tendency to trust them more than we should, and this leads to dangerous liabilities. Extreme risk prognosticator Pasquale Cirillo calls this tendency to depend too much on controls we’ve put in place the “fence paradox.” By protecting things — which they must — organizations can encourage situations where they stand to lose a lot if their wall is breached. When that fortification fails (and eventually, every fortress fails) it fails catastrophically. The scale of the Equifax hack in 2017 and the Brussels bombings in 2016 both illustrate the way that organizations and systems organize risk, tending to put together massive targets for potential threats. Walls actually encourage this kind of thinking. If you build walls to protect something, it makes sense to expect them to work. But network architects and airport security designers both need to listen to de Montluc, the 16th century French military mastermind: “Nothing is impregnable.”

We need a new awareness of what walls do. It’s tempting to think of them as blocking threats, but they don’t. They behave more like filters — winnowing out only those threats not serious enough to circumvent them. And this implies a secondary problem apart from the fence paradox. A wall that prevents large-scale foot traffic across unsecured locations in the U.S. border means that only determined, capable adversaries will be able to cross the wall. The people who are the least threatening are the only ones who are easily deflected. It may prevent smaller scale losses, but it actually encourages your biggest threat to innovate, leaving room for catastrophe. Bag checks and barricades moved a perimeter outward at the Mandalay Bay Casino last October, but Stephen Paddock circumvented this by moving his position upward. As Washington considers the marginal benefits of a massive border wall, it needs to think equally of this revenge effect.

This weakness is where the idea of “defense in depth” (layered security) comes from. A good summary of the reasons for defense in depth comes from a 1921 Infantry Journal, published by the U.S. Infantry Association: “All essential elements of the defense should be organized in depth. If the forward defensive areas are captured, resistance is continued by those in the rear.”

That's bronze-age wisdom, in fact. And yet security designers don't seem to learn. And the President's wall around Fantasyland will not prevent the threats he fears, not one little bit.
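For software people, the Infantry Journal's point translates directly into how we structure access checks. Here's a minimal sketch of defense in depth in Python — the layer names and rules are entirely hypothetical, invented for illustration, but the shape is the point: several independent checks, any one of which can reject a request, so a single breached layer doesn't hand over the whole system.

```python
# A hypothetical request is just a dict; each "layer" is an independent
# check that can reject it on its own. These layers and rules are made
# up for illustration -- not any particular product's architecture.

def layer_firewall(request):
    # Outer wall: block traffic from a (hypothetical) banned source.
    return request.get("source_ip") not in {"10.0.0.66"}

def layer_authentication(request):
    # Second layer: traffic that passes the wall must still prove identity.
    return request.get("token") == "valid-token"

def layer_authorization(request):
    # Inner layer: an authenticated caller still needs the right permission.
    return request.get("role") == "admin"

LAYERS = [layer_firewall, layer_authentication, layer_authorization]

def handle(request):
    # If any single layer is breached or bypassed, the remaining layers
    # still resist -- "resistance is continued by those in the rear."
    for layer in LAYERS:
        if not layer(request):
            return "rejected"
    return "accepted"
```

So a request with a stolen valid token but the wrong role still gets rejected at the innermost layer — whereas a design with only the outer wall would have failed catastrophically once the token leaked.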

"Told you so."—George Washington, 1796

Thomas Pickering and James Stoutenberg, writing for the New York Times, point out that George Washington warned us about someone like the modern Republican Party or Donald Trump taking power in the U.S.:

In elaborate and thoughtful prose, Washington raised red flags about disunity, false patriotism, special interests, extreme partisanship, fake news, the national debt, foreign alliances and foreign hatreds. With uncanny foresight, he warned that the most serious threat to our democracy might come from disunity within the country rather than interference from outside. And he foresaw the possibility of foreign influence over our political system and the rise of a president whose ego and avarice would transcend the national interest, raising the threat of despotism.

He wrote that should one group, “sharpened by the spirit of revenge,” gain domination over another, the result could be “a more formal and permanent despotism.” The despot’s rise would be fueled by “disorders and miseries” that would gradually push citizens “to seek security and repose in the absolute power of an individual.”

“Sooner or later,” he concluded, “the chief of some prevailing faction, more able and more fortunate than his competitors, turns this disposition to the purpose of his own elevation on the ruins of public liberty.”

And then he arrived at one of his greatest concerns: The ways in which hyperpartisanship could open the door “to foreign influence and corruption, which find a facilitated access to the government itself through the channels of party passions. Thus the policy and the will of one country are subjected to the policy and will of another.”

As someone with a degree in history, all I can do is watch the train wreck and hope to survive it.