The Daily Parker

Politics, Weather, Photography, and the Dog

Anti-intellectualism lives on both sides

Williams College Biology Professor Luana Maroja sounds the alarm as she sees students challenging long-established science on political grounds:

The trouble began when we discussed the notion of heritability as it applies to human intelligence.

I asked students to think about the limitations of the data, which do not control for environmental differences, and explained that the raw numbers say nothing about whether observed differences are indeed “inborn”—that is, genetic.

There is, of course, a long history of charlatans who have cited dubious “science” as proof that certain racial and ethnic groups are genetically superior to others. My approach has been to teach students how to see through those efforts, by explaining how scientists understand heritability today, and by discussing how to interpret intelligence data—and how not to.

In class, though, some students argued instead that it is impossible to measure IQ in the first place, that IQ tests were invented to ostracize minority groups, or that IQ is not heritable at all. None of these arguments is true. In fact, IQ can certainly be measured, and it has some predictive value. While the score may not reflect satisfaction in life, it does correlate with academic success. And while IQ is very highly influenced by environmental differences, it also has a substantial heritable component; about 50 percent of the variation in measured intelligence among individuals in a population is based on variation in their genes. Even so, some students, without any evidence, started to deny the existence of heritability as a biological phenomenon.

Similar biological denialism exists about nearly any observed difference between human groups, including those between males and females. Unfortunately, students push back against these phenomena not by using scientific arguments, but by employing an a priori moral commitment to equality, anti-racism, and anti-sexism. They resort to denialism to protect themselves from having to confront a worldview they reject—that certain differences between groups may be based partly on biology.

She concludes that this has a chilling effect on education and research. It's pretty scary.

Two stories of North American irrationality

First, William Giraldi, writing in Medium, proclaims "[e]verything you need to know about the mess that is America in 2019 can be explained by our deepening national belief that Bigfoot is real:"

Bigfooters believe they are questing for bipedal apes in California, but they are really questing for their own lost boyhoods, their Boy Scout days, those formative experiences in the woodlands of fancy and faith, and for the thrill of certain belief as it was before the adult world broke in to bludgeon it.

Remember that preadolescent frisson, the dread-tinged excitement of knowing, absolutely knowing, that monsters were real, not the myths, folklores, and allegories that adulthood insists they are? If Wordsworth laments adulthood’s injection of sobriety and rationality into the childhood sublime, Bigfooters aren’t having it. They’ve found a means of resurrecting that boyish wonder, of plugging back into the child’s reciprocal, imaginative bond with nature. If it comes at the cost of evidence — to say nothing of dignity — since when have children ever bothered with evidence? These scientists and their mocking, scoffing facts are a drag. What did John Keats say about Isaac Newton’s achievements with light? “He destroyed the poetry of a rainbow by reducing it to a prism.”

In concert with their wish to plug back into their boyhoods, these men, loose in the woods, are searching for the approval and acceptance of other men. No optional male group endeavor, none, is exempt from this law, one that hearkens back to the mastodon hunt, during which a male proved himself worthy of the clan and thus worthy of the protection and resources the clan controlled.

Which isn’t to say that time in the woods is idyllic. Believer and journalist John Green, the grandfather of Sasquatchia, once wrote, “The average sasquatch hunter is so pig-headed that two of them together are pretty sure to have a falling out before long… People who will go hunting for an animal that is rejected by the world of science and almost everybody else are bound to be people who don’t pay much attention to any opinion but their own, and expect not only to have an opinion but to act on it.”

Remember Jonathan Swift: “Reasoning will never make a Man correct an ill Opinion, which by Reasoning he never acquired.”

Our second story comes from the CBC via IFLScience: our Canadian neighbors have a disturbing lack of immunity to American anti-intellectualism.

It seems a Vancouver dad worried so much about the (totally discredited) myth that vaccines cause autism that he didn't get his kids vaccinated. Flash forward 10 years and add a family trip to Vietnam, and one of his kids became Patient Zero in a British Columbia measles outbreak:

[Emmanuel] Bilodeau believes one of his three sons contracted measles during a family trip to Vietnam earlier this year and that it has since spread at the French-language schools his children attend.

Bilodeau said he brought his sons to a travel clinic on Broadway Street before their trip where they received other vaccinations, but not for measles.

It was on the plane ride home that his 11-year-old son began experiencing symptoms, including fever.

Measles. In 2019. In Canada, which has perhaps one of the three or four most advanced health-care systems the world has ever seen.

So, go ahead and believe in 3-meter-tall ape-people wandering the forests of Oregon, but try to apply some logic and rationality to life-or-death decisions, like whether to vaccinate against a disease so easy to prevent that we had nearly eradicated it before the beginning of this century.

University of Wisconsin kills liberal education

Wisconsin, founded in a tradition of liberalism, is shifting its world-class university away from actually educating students into giving them vocational training instead:

In March 2018, the school’s administration offered a proposal to deal with the deficit. Cuts were necessary, the administration said. Liberal-arts staples such as English, philosophy, political science, and history would have to be eliminated. All told, the university planned to get rid of 13 majors. Not enough students were enrolled in them to make them worth the cost, the university argued. “We’re facing some changing enrollment behaviors,” Greg Summers, the provost and vice chancellor at Stevens Point, told me. “And students are far more cost-conscious than they used to be.”

Instead, administrators wanted to focus the school’s limited resources on the academic areas that students were flocking to and that the state’s economy could use straightaway—though they maintained that the liberal arts more generally would remain central to the curriculum, even if these specific majors were gone. “We remain committed to ensuring every student who graduates from UW-Stevens Point is thoroughly grounded in the liberal arts, as well as prepared for a successful career path,” Bernie Patterson, the institution’s chancellor, said in a message to the campus. The changes would reflect “a national move among students towards career pathways,” administrators argued. The proposal planned to add majors in chemical engineering, computer-information systems, conservation-law enforcement, finance, fire science, graphic design, management, and marketing. By focusing more on fields that led directly to careers, the school could better provide what businesses wanted—and students, in theory, would have an easier time finding jobs and career success.

Fierce backlash to the proposal from students, faculty, and alumni pushed the administration to reconsider its original plan. By the time the final proposal was released in mid-November 2018, it was less expansive, though still forceful. Six programs would be cut, including the history major. The university seemed to be eyeing degree programs with low numbers of graduates, and nationally, the number of graduates from bachelor’s programs in history has had the steepest decline of any major in recent years, according to the National Center for Education Statistics.

If the proposal, which is now in the middle of a public-comment period, is finalized, history classes will still be offered, but Willis said that cutting the major may ultimately lead to a reduction of staff and upper-level courses, such as the spring seminar on the Holocaust and the major’s emphasis on race and ethnicity. To Willis, this isn’t just an educational loss, but a societal one as well. “You never know when a historical metaphor is going to arise,” he quipped, pointing to the recent incident in Baraboo, Wisconsin, where high-school students gestured the Nazi salute in a photo.

Here's a tip: liberal education, especially in areas like English and history, is job training. Anyone can write code; many people can manage; actually figuring out how to run a business is different.

It's yet another example of the Dunning-Kruger effect: you need a liberal education to understand how beneficial—how useful—it is. And people like Scott Walker don't have those tools.

In praise of B students

Research (and life experience) suggest strongly that kids who get straight As in school may not actually have the best preparation for real life:

The evidence is clear: Academic excellence is not a strong predictor of career excellence. Across industries, research shows that the correlation between grades and job performance is modest in the first year after college and trivial within a handful of years. For example, at Google, once employees are two or three years out of college, their grades have no bearing on their performance. (Of course, it must be said that if you got D’s, you probably didn’t end up at Google.)

In a classic 1962 study, a team of psychologists tracked down America’s most creative architects and compared them with their technically skilled but less original peers. One of the factors that distinguished the creative architects was a record of spiky grades. “In college our creative architects earned about a B average,” Donald MacKinnon wrote. “In work and courses which caught their interest they could turn in an A performance, but in courses that failed to strike their imagination, they were quite willing to do no work at all.” They paid attention to their curiosity and prioritized activities that they found intrinsically motivating — which ultimately served them well in their careers.

Straight-A students also miss out socially. More time studying in the library means less time to start lifelong friendships, join new clubs or volunteer. I know from experience. I didn’t meet my 4.0 goal; I graduated with a 3.78. (This is the first time I’ve shared my G.P.A. since applying to graduate school 16 years ago. Really, no one cares.) Looking back, I don’t wish my grades had been higher. If I could do it over again, I’d study less. The hours I wasted memorizing the inner workings of the eye would have been better spent trying out improv comedy and having more midnight conversations about the meaning of life.

I've known all of this since first grade, when I realized that getting straight As would require me to do hundreds of pointless math problems every night for six months. Around that time I encountered the Terrible Trivium in The Phantom Tollbooth and the pieces fell into place. (For the record, I do arithmetic on paper just fine—and I also have Microsoft Excel to do it for me.)

Me and Julio

Yesterday was my [redacted] high school reunion. We started with a tour of the building, which has become a modern office park since we graduated:

The glass atrium there is new, as of 2008. The structure back to the left is the Center for the Performing Arts, which featured proudly in Ferris Bueller's Day Off.

I didn't get a picture of the Michelin-starred student food court, but this, this I had to snap:

So, at some point I'll post about Illinois school-funding policies and why one of the richest school districts in the world has 20 or so exercise bikes, each of which costs as much as feeding a child lunch every school day for three years...but this is not that post.

It was great seeing some of my classmates and walking around the old school. Also, shout out to the server at Northbrook's Landmark Inn for dealing with about 100 old people who didn't remember their own drinks.

 

It was 30 years ago today

I found this photo just in time for its 30th anniversary. That's me on my first full day on campus, 27 August 1988. The guy in the '80s mesh shirt is my first college roommate.

That night, he and a few of his new friends did beer funnels in the room, forcing me to go to sleep with two drunk idiots lying on our floor in pools of beer.

I got a new roommate the day room moves opened up 4 weeks later. I have no idea what became of the guy, but I imagine he sells insurance or something.

Update: According to his Facebook profile, he's a chiropractor now. I would never have guessed that. Never.

Drunken fumbling becomes a Constitutional case

A student at the University of Cincinnati has filed a fascinating lawsuit against the school for discrimination in its enforcement of sexual misconduct:

Is it possible for two people to simultaneously sexually assault each other? This is the question—rife with legal, anatomical, and emotional improbabilities—to which the University of Cincinnati now addresses itself, and with some urgency, as the institution and three of its employees are currently being sued over an encounter that was sexual for a brief moment, but that just as quickly entered the realm of eternal return. The one important thing you need to know about the case is that according to the lawsuit, a woman has been indefinitely suspended from college because she let a man touch her vagina.

The event in précis, as summarized by Robby Soave of Reason magazine: “Male and female student have a drunken hookup. He wakes up, terrified she's going to file a sexual misconduct complaint, so he goes to the Title IX office and beats her to the punch. She is found guilty and suspended.”

By some kind of weird alchemy involving the sum of its parts, this strange little event manages to hit upon almost every troubling aspect of the way that these cases are interpreted and punished on the contemporary campus. It proceeds from the assumption that if two drunk college students make out, one of them—and only one of them—is a victim of the event. It resulted in a fairly common but extremely severe consequence meted on students found guilty of very minor offenses—banishment from the university until the complainant graduates. And it suggests how easily the system can be manipulated by a student with an alleged grudge.

I'm curious to see how this turns out. It seems hard to believe that I went to university during a brief, golden age when we were treated as adults. I served five semesters on the Student Judiciary Board and heard plenty of stories of ridiculous student behavior, yet only a handful of times did we find conduct odious enough to ban the offender from campus. I fear for our children that they won't ever grow up, because their parents and institutions won't let them.

Z is for Zero

Today is the last day of the 2018 Blogging A-to-Z challenge. Today's topic: Nothing. Zero. Nada. Zilch. Null.

The concept of "zero" made it into Western mathematics only a few centuries ago, and it still hasn't made it into many developers' brains. The problem arises in particular when dealing with arrays and with unexpected nulls.

In C#, arrays are zero-based. An array's first element appears at position 0:

var things = new[] { 1, 2, 3, 4, 5 };
Console.WriteLine(things[1]);

// -> 2

This causes no end of headaches for new developers who expect that, because the array above has a length of 5, its last element is #5. But doing this:

Console.WriteLine(things[5]);

...throws an IndexOutOfRangeException.
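A simple way to stay inside the bounds is to derive the last index from the array's Length, which keeps the off-by-one arithmetic in one obvious place:

```csharp
var things = new[] { 1, 2, 3, 4, 5 };

// Length counts elements; the last index is always Length - 1.
Console.WriteLine(things[things.Length - 1]);

// -> 5, the last element

Console.WriteLine(things.Length);

// -> 5, the count, which is NOT a valid index
```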

You get a similar problem when you try to read a string, because if you recall, strings are basically just arrays of characters:

var word = "12345";
Console.WriteLine(word[4]);

// -> 5

Console.WriteLine(word[5]);

// IndexOutOfRangeException

The funny thing is, both the array things and the string word have a length of 5.

The other bugaboo is null. Null means nothing: it is the absence of any value. In some languages (SQL, famously) null doesn't even equal itself, though in C# null == null is true.
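In C# specifically, a null does compare equal to another null, but ordered comparisons involving a null never succeed; a minimal sketch:

```csharp
object nothing = null;
Console.WriteLine(nothing == null);

// -> True

int? q = null;
Console.WriteLine(q == q);

// -> True (lifted equality treats two nulls as equal)

Console.WriteLine(q < 0);
Console.WriteLine(q >= 0);

// -> False, both times: no ordered comparison with null is ever true
```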

Reference types can be null, and value types cannot. That's because value types always have to have a value, while reference types can simply be a reference to nothing. That said, the Nullable<T> structure gives value types a way into the nulliverse that even comes with its own cool syntax:

int? q = null;
int r = 0;
Console.WriteLine((q ?? 0) + r);
// 0

(What I love about this "struct?" syntax is you can almost hear it in a Scooby Doo voice, can't you?)

Line 1 defines a nullable System.Int32 as null. Line 2 defines a bog-standard Int32 equal to zero. Adding them directly doesn't throw, by the way: the + operator is "lifted" over the null, so q + r simply evaluates to null; it's calling q.Value on a null q that throws (an InvalidOperationException). Line 3 shows the null-coalescing operator, which contracts both of the statements below into a succinct little fragment. (One caution: ?? binds more loosely than +, so the expression needs parentheses, as in (q ?? 0) + r; without them, q ?? 0 + r parses as q ?? (0 + r), which silently drops the + r whenever q has a value.)

// Long form:
int result;
if (q.HasValue)
{
	result = q.Value + r;
}
else
{
	result = 0 + r;
}

// Shorter form:
int result = (q.HasValue ? q.Value : 0) + r;

// Shortest form:
int result = (q ?? 0) + r;
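An equivalent contraction, assuming the same q and r as above, uses Nullable&lt;T&gt;.GetValueOrDefault(), which returns the wrapped value, or the type's default (0 for an int) when there is none:

```csharp
int? q = null;
int r = 0;

// GetValueOrDefault() never throws, even when q is null.
int result = q.GetValueOrDefault() + r;
Console.WriteLine(result);

// -> 0
```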

And so the Daily Parker concludes the 2018 Blogging A-to-Z challenge with an entire post about nothing. I hope you've enjoyed the posts this month. Later this morning, I'll post the complete list of topics as a permanent page. Let me know what you think in the comments. It's been a fun challenge.

Y is for Y2K (and other date/time problems)

I should have posted day 25 of the Blogging A-to-Z challenge yesterday, but life happened, as it has a lot this month. I'm looking forward to June, when I might not have the over-scheduling I've experienced since mid-March. We'll see.

So it's appropriate that today's topic involves one of the things most programmers get wrong: dates and times. And we can start 20 years ago when the world was young...

A serious problem loomed in the software world in the late 1990s: programmers, starting as far back as the 1950s, had used 2-digit fields to represent the year portion of dates. As I mentioned Friday, it's important to remember that memory, communications, and storage cost a lot more than programmer time until the last 15 years or so. A 2-digit year field makes a lot of sense in 1960, or even 1980, because it saves lots of money, and why on earth would people still use this software 20 or 30 years from now?

You can see (or remember) what happened: the year 2000. If today is 991231 and tomorrow is 000101, what does that do to your date math?
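With naive two-digit arithmetic, the rollover produces nonsense; a toy illustration:

```csharp
int yesterday = 99; // 31 December 1999, year stored as "99"
int tomorrow = 0;   // 1 January 2000, year stored as "00"

// Subtracting the two-digit years makes tomorrow look 99 years in the past.
Console.WriteLine(tomorrow - yesterday);

// -> -99
```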

It turns out, not a lot, because programmers generally planned for it way more effectively than non-technical folks realized. On the night of 31 December 1999, I was in a data center at a brokerage in New York, not doing anything. Because we had fixed all the potential problems already.

But as I said, dates and times are hard. Start with times: 24 hours, 60 minutes, 60 seconds...that's not fun. And then there's the calendar: 12 months, 52 weeks, 365 (or 366) days...also not fun.

It becomes pretty obvious even to novice programmers who think about the problem that days are the best unit to represent time in most human-scale cases. (Scientists, however, prefer seconds.) I mentioned on day 8 that I used Julian day numbers very, very early in my programming life. Microsoft's older COM-era date type also used the day as its base unit; the .NET platform counts 100-nanosecond ticks instead, and it likewise relegates the display of date information to a different set of classes.

I'm going to skip the DateTime structure because it's basically useless. It will give you no end of debugging problems with its asinine DateTime.Kind member. This past week I had to fix exactly this kind of thing at work.

Instead, use the DateTimeOffset structure. It represents an unambiguous point in time, pairing a DateTime value for the date and time with a TimeSpan value for the offset from UTC. As Microsoft explains:

The DateTimeOffset structure includes a DateTime value, together with an Offset property that defines the difference between the current DateTimeOffset instance's date and time and Coordinated Universal Time (UTC). Because it exactly defines a date and time relative to UTC, the DateTimeOffset structure does not include a Kind member, as the DateTime structure does. It represents dates and times with values whose UTC ranges from 12:00:00 midnight, January 1, 0001 Anno Domini (Common Era), to 11:59:59 P.M., December 31, 9999 A.D. (C.E.).

The time component of a DateTimeOffset value is measured in 100-nanosecond units called ticks, and a particular date is the number of ticks since 12:00 midnight, January 1, 0001 A.D. (C.E.) in the GregorianCalendar calendar. A DateTimeOffset value is always expressed in the context of an explicit or default calendar. Ticks that are attributable to leap seconds are not included in the total number of ticks.

Yes. This is the way to do it. Except...well, you know what? Let's skip how the calendar has changed over time. (Short answer: the year 1 was not the year 1.)

In any event, DateTimeOffset gives you methods to calculate times and dates accurately across a nearly 10,000-year range.
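A quick sketch (the dates here are just illustrative) of the key behavior: two DateTimeOffset values compare by the UTC instant they represent, not by their local representations:

```csharp
// The same instant, expressed with two different offsets:
var chicago = new DateTimeOffset(2019, 2, 5, 9, 30, 0, TimeSpan.FromHours(-6));
var utc     = new DateTimeOffset(2019, 2, 5, 15, 30, 0, TimeSpan.Zero);

Console.WriteLine(chicago == utc);

// -> True: equality compares the underlying UTC instant

Console.WriteLine((utc - chicago).TotalHours);

// -> 0

Console.WriteLine(chicago.Offset);

// -> -06:00:00, the stored offset, preserved alongside the instant
```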

Which is to say nothing of time zones...