One of my Facebook friends just posted a photo of our high school graduation program—from 5th June 1988. Thirty years ago.
I am screaming in my head, not just because I missed the anniversary yesterday, but also because 30 YEARS.
Writing in this month's Atlantic (a magazine by and for the very people he writes about), Matthew Stewart says the 9.9%, not the 0.1%, are the real story in American inequality:
Let’s talk first about money—even if money is only one part of what makes the new aristocrats special. There is a familiar story about rising inequality in the United States, and its stock characters are well known. The villains are the fossil-fueled plutocrat, the Wall Street fat cat, the callow tech bro, and the rest of the so-called top 1 percent. The good guys are the 99 percent, otherwise known as “the people” or “the middle class.” The arc of the narrative is simple: Once we were equal, but now we are divided. The story has a grain of truth to it. But it gets the characters and the plot wrong in basic ways.
It is in fact the top 0.1 percent who have been the big winners in the growing concentration of wealth over the past half century. According to the UC Berkeley economists Emmanuel Saez and Gabriel Zucman, the 160,000 or so households in that group held 22 percent of America’s wealth in 2012, up from 10 percent in 1963. If you’re looking for the kind of money that can buy elections, you’ll find it inside the top 0.1 percent alone.
In between the top 0.1 percent and the bottom 90 percent is a group that has been doing just fine. It has held on to its share of a growing pie decade after decade. And as a group, it owns substantially more wealth than do the other two combined. In the tale of three classes (see Figure 1), it is represented by the gold line floating high and steady while the other two duke it out. You’ll find the new aristocracy there. We are the 9.9 percent.
I recommend reading the whole article. But his conclusions jibe with things I've worried about for most of my adult life:
The toxic wave of wealth concentration that arose in the Gilded Age and crested in the 1920s finally crashed on the shoals of depression and war. Today we like to think that the social-welfare programs that were planted by the New Deal and that blossomed in the postwar era were the principal drivers of a new equality. But the truth is that those efforts belong more to the category of effects than causes. Death and destruction were the real agents of change. The financial collapse knocked the wealthy back several steps, and war empowered labor—above all working women.
That gilded, roaring surge of destruction was by no means the first such destabilizing wave of inequality to sweep through American history. In the first half of the 19th century, the largest single industry in the United States, measured in terms of both market capital and employment, was the enslavement (and the breeding for enslavement) of human beings. Over the course of the period, the industry became concentrated to the point where fewer than 4,000 families (roughly 0.1 percent of the households in the nation) owned about a quarter of this “human capital,” and another 390,000 (call it the 9.9 percent, give or take a few points) owned all of the rest.
The slaveholding elite were vastly more educated, healthier, and had much better table manners than the overwhelming majority of their fellow white people, never mind the people they enslaved. They dominated not only the government of the nation, but also its media, culture, and religion. Their votaries in the pulpits and the news networks were so successful in demonstrating the sanctity and beneficence of the slave system that millions of impoverished white people with no enslaved people to call their own conceived of it as an honor to lay down their life in the system’s defense.
That wave ended with 620,000 military deaths, and a lot of property damage. It did level the playing field in the American South for a time—though the process began to reverse itself all too swiftly.
I like where I am, no lie. But I recognize, as does Stewart, that we have a choice to make in how we reverse the trending inequality that has historically led to revolution. Food for thought.
On 13 May 1998, just past midnight New York time, I posted my first joke on my brand-new braverman.org website from my apartment in Brooklyn.
My first website of any kind was a page of links I maintained for myself starting in April 1997. Throughout 1997 and into 1998 I gradually experimented with Active Server Pages, the new hotness, and built out some rudimentary weather features. That site launched on 19 August 1997.
By early April 1998, I had a news feed, photos, and some other features. On April 2nd, I reserved the domain name braverman.org. Then on May 6th, I launched a redesign that filled out our giant 1024 x 768 CRT displays. Here's what it looked like; please don't vomit:
On May 13th, 20 years ago today, I added a Jokes section. That's when I started posting things for the general public, not just for myself, which made the site a proto-blog. That's the milestone this post is commemorating.
Shortly after that, I changed the name to "The Write Site," which lasted until early 2000.
In 1999, Katie Toner redesigned the site. The earliest Wayback Machine image shows how it looked after that. Except for the screenshot above, I have no records of how the site looked prior to Katie's redesign, and no easy way of recreating it from old source code.
I didn't call it a "blog" until November 2005. But on the original braverman.org site, I posted jokes, thoughts, news, my aviation log, and other bits of debris somewhat regularly. What else was it, really?
Today, The Daily Parker has 6,209 posts in dozens of categories. Will it go 20 more years? It might. Stick around.
No, this isn't one of the two Daily Parker milestones we'll see this month. It's trivial and personal.
On this day in 1988, 30 years ago, I bought my first CD. It was an almost-new technology—the first CDs went on sale commercially in 1982—and it sounded a lot better than scratchy old vinyl records.
Just looking back at what I posted 10 years ago confirms I haven't bought that many CDs lately. I don't have the number in front of me, but I believe I've now got 940 of them, meaning I've bought an average of 12 a year since 2008. That's slightly fewer than the 12 a month I bought in 1990.
For historical context, when I bought my first CD, Ronald Reagan was president, it looked likely (though it wasn't yet certain) that Michael Dukakis and George H.W. Bush would be the candidates to replace him, and our arch-rival for world domination was the Union of Soviet Socialist Republics. A Toyota Corolla cost $10,000, a gallon of gas or a gallon of milk cost 96¢, and you could buy a 3-bedroom house in my home town for $200,000. (The same house is now close to $750,000.)
Edward McClelland essays on the decline of the white blue-collar Midwest, as expressed linguistically:
The “classic Chicago” accent, with its elongated vowels and its tendency to substitute “dese, dem, and dose” for “these, them, and those,” or “chree” for “three,” was the voice of the city’s white working class. “Dese, Dem, and Dose Guy,” in fact, is a term for a certain type of down-to-earth Chicagoan, usually from a white South Side neighborhood or an inner-ring suburb.
The classic accent was most widespread during the city’s industrial heyday. Blue-collar work and strong regional speech are closely connected: If you were white and graduated high school in the 1960s, you didn’t need to go to college, or even leave your neighborhood, to get a good job, and once you got that job, you didn’t have to talk to anyone outside your house, your factory, or your tavern. A regular-joe accent was a sign of masculinity and local cred, bonding forces important for the teamwork of industrial labor.
The classic Chicago accent is heard less often these days because the white working class is less numerous, and less influential, than it was in the 20th century. It has been pushed to the margins of city life, both figuratively and geographically, by white flight, multiculturalism and globalization: The accent is most prevalent in blue-collar suburbs and predominantly white neighborhoods in the northwest and southwest corners of the city, now heavily populated by city workers whose families have lived in Chicago for generations.
There’s a conception that television leveled local accents, by bringing so-called “broadcaster English” into every home. I don’t think this is true. No one watched more television than the Baby Boomers, but their accents are much stronger than those of their children, the Millennials.
What’s really killing the local accent is education and geographic mobility, which became economic necessities for young Rust Belters after the mills closed down. But as blue-collar jobs have faded, so has some of our linguistic diversity.
McClelland adapted his CityLab essay from his 2016 book How to Speak Midwestern, which is obviously now on my Amazon wish list.
No, not the Dunning of Kruger fame; Dunning, the community area on the far northwest side of Chicago.
Workers building a new school in the neighborhood discovered not only that it was the former site of a poorhouse, but also that 38,000 people may be buried there:
“There can be and there have been bodies found all over the place,” said Barry Fleig, a genealogist and cemetery researcher who began investigating the site in 1989. “It’s a spooky, scary place.”
Workers have until April 27 to excavate and clear the site, remediate the soil and relocate an existing sewer line. The school is scheduled to open in time for the 2019-20 academic year, though a spokesperson for Chicago Public Schools would not say what type of school it will be.
Fleig said he’s “nearly certain” there are no intact caskets buried underneath the proposed school grounds — bodies were primarily buried in two formal cemeteries, though scattered human remains have been discovered during previous construction projects near the campus.
In 1854, the county opened a poorhouse and farm and gradually added an insane asylum, infirmary and tuberculosis hospital to the property. At its peak, a thousand people were buried on the grounds each year.
The state took over in 1912 and changed the official name to Chicago State Hospital. Buildings were shuttered in 1970 and operations moved west of Oak Park Avenue to what is now Chicago-Read Mental Health Center.
In 1854, the site would have been a few hours' ride from the city. So I'm glad to see that the American tradition of dumping the poor in places where they can't possibly thrive was as strong then as now. I'm a little shocked that a pauper's cemetery acquired so many corpses in sixty years, though.
In a powerful June 2016 column for Slate, Dahlia Lithwick laid out the NRA's (and the right's) second-amendment hoax. It's worth revisiting:
The Supreme Court ... most famously in a 1939 case called U.S. v. Miller [ruled] that since the possession or use of a “shotgun having a barrel of less than eighteen inches in length” had no reasonable relationship to the “preservation or efficiency of a well regulated militia,” the court simply could not find that the Second Amendment guaranteed “the right to keep and bear such an instrument.” Period, full stop. And that was the viewpoint adopted by the courts for years.
What changed? As Cass Sunstein and others have explained, what changed things was a decades-long effort by exceptionally well-organized, well-funded interest groups that included the National Rifle Association—all of whom “embarked on an extraordinary campaign to convince the public, and eventually the courts, to understand the Second Amendment in their preferred way.”
The larger fabrication is the idea that the Second Amendment—unlike other provisions of the Constitution—cannot be subject to any reasonable restriction.
Hoax number three: Obama, Clinton, Democrats, liberals, the media, whomever are coming for your guns. They are Coming. For your Guns!!! This is the crunchy candy shell that makes the other two lies seem almost reasonable.
Meanwhile, as Lithwick and others keep saying, we're the only country in the OECD where you're more likely to get shot than get hit by lightning. (Seriously: in every other country the incidence of gun death is less than 0.5 per 100,000—about the incidence of being injured or killed by lightning. In the U.S., the incidence of gun murder alone, never mind all shootings, is around 3.6 per 100,000.)
And to think, this is all driven by a trade association. Imagine if the National Association of Dental Hygienists had that much power.
And kudos to Lyft, who announced they'll give free rides to anti-gun rallies. This is one more reason I use them and not the other guys.
From Josh Marshall:
[D]ecoupling the United States from the major states and economies of Western Europe has been the central foreign policy goal of Russia for about 70 years.
We defeated the Soviet Union by allying ourselves with most of the world. Now the President of the United States is undoing 70 years of work and handing Russia their own sphere of influence.
Great work, Mr President.
I'm actually coughing up a lung at home today, which you'd think would give me more time to read, but it doesn't. Really I just want a nap.
Now I have to decide whether to debug some notoriously slow code of mine, or...nap.
At least according to Pew Research:
Pew Research Center has been studying the Millennial generation for more than a decade. But as we enter 2018, it’s become clear to us that it’s time to determine a cutoff point between Millennials and the next generation. Turning 37 this year, the oldest Millennials are well into adulthood, and they first entered adulthood before today’s youngest adults were born.
In order to keep the Millennial generation analytically meaningful, and to begin looking at what might be unique about the next cohort, Pew Research Center will use 1996 as the last birth year for Millennials for our future work. Anyone born between 1981 and 1996 (ages 22-37 in 2018) will be considered a Millennial, and anyone born from 1997 onward will be part of a new generation. Since the oldest among this rising generation are just turning 21 this year, and most are still in their teens, we think it’s too early to give them a name – though The New York Times asked readers to take a stab – and we look forward to watching as conversations among researchers, the media and the public help a name for this generation take shape. In the meantime, we will simply call them “post-Millennials” until a common nomenclature takes hold.
Generational cutoff points aren’t an exact science. They should be viewed primarily as tools, allowing for the kinds of analyses detailed above. But their boundaries are not arbitrary. Generations are often considered by their span, but again there is no agreed upon formula for how long that span should be. At 16 years (1981 to 1996), our working definition of Millennials will be equivalent in age span to their preceding generation, Generation X (born between 1965 and 1980). By this definition, both are shorter than the span of the Baby Boomers (19 years) – the only generation officially designated by the U.S. Census Bureau, based on the famous surge in post-WWII births in 1946 and a significant decline in birthrates after 1964.
I've always been solidly an X-er, but some of my friends will be surprised to learn that they, too, are now officially Gen X.