Media reports, including the XPOTUS's own social-media posts, suggest the Manhattan District Attorney will obtain a historic indictment on Tuesday:
The Manhattan district attorney's office is expected to issue criminal charges against Trump in a case centering on a payment that Michael Cohen, Trump's attorney and fixer at the time, made to the adult film star Stormy Daniels in the final weeks of the 2016 presidential election.
Cohen told CNN Thursday that he believed an indictment of Trump was "imminent."
Trump has maintained his innocence in the case and claims he did not have an affair with Daniels. His attorneys have also argued the investigation is politically motivated. Trump attacked Daniels Wednesday on his social media platform Truth Social.
To secure a conviction, prosecutors would have to prove Trump knowingly broke state law by reimbursing Cohen for his payment to Daniels and then falsifying his business records to cover it up.
There is also no guarantee the case will go to trial.
Of course this won't go to trial. The XPOTUS may have massive lacunae in his higher functions, but I'm sure he's canny enough to realize that he can't afford politically to have Stormy Daniels take the stand.
If you think the Democratic Party wouldn't be as hard on one of our own as we think the Justice Department should be about the XPOTUS, here's just one of the things I wrote about Democratic Illinois Governor Rod Blagojevich—who I fucking voted for—when it turned out he was unfit for office. Heck, read all of the things I wrote.
See, it's not about partisan politics; it's about not wanting our politicians to do crimes. And it's about wanting something approaching ethics based on a simple fear of consequences to guide these narcissists, as actual moral philosophy is simply beyond them.
Also, this is likely only the first indictment coming for the XPOTUS. There are at least two other grand jury investigations in other jurisdictions, operating on their own timetables. The next election will not be fun.
Twenty years ago today, the United States invaded a neutral country that hadn't taken a shot at us for over a decade. This had predictable results for the region, including making our long-time adversary Iran a major player:
The invasion “was the original sin,” said Emile Hokayem, a senior fellow for Middle East security at the International Institute for Strategic Studies, a British think tank. “It helped Iran bolster its position by being a predator in Iraq. It’s where Iran perfected the use of violence and militias to obtain its goals. It eroded the U.S.’s image. It led to fragmentation in the region.”
All of that was enabled by the political changes that the American invasion of Iraq on March 20, 2003, set in motion. Later on, the 2014 takeover of a large swathe of northern Iraq by the Islamic State terrorist group prompted Iraq to turn to Iran as well as the United States for help, cementing Iran’s grip.
Under the Iraqi dictatorship, the Sunni minority had formed the base of Mr. Hussein’s power; once he was killed, Iran set up loyal militias inside Iraq. It also went on to dismay Saudi Arabia and the other Gulf monarchies and Israel by supporting proxies and partners, such as the Houthi militia in Yemen, that brought violence right to their doorsteps.
People on my side of things in 2003 felt incandescent rage at President Bush and Secretary of State Powell lying through their teeth about Iraq's supposed cache of weapons of mass destruction (WMDs). Robert Wright points out that the invasion's premises were already dishonest, since the United Nations was already there doing what we claimed our invasion would do:
The fog of time makes it easy to lose sight of one of the most amazing facts about that war: In order to invade Iraq and start looking for weapons of mass destruction, the US had to first kick out UN inspectors who were in Iraq looking for weapons of mass destruction.
And they’d been looking intensively! Over the previous four months they had inspected more than 500 sites and found no WMDs and no signs of a WMD program.
Given that those inspected sites included the sites US intelligence agencies had deemed most likely to yield paydirt, this result—zero-for-500—suggested to the attentive observer that information coming from the US government about Saddam Hussein’s activities was not to be trusted.
But let’s leave that aside. Suppose the US government hadn’t been thus discredited—suppose that on the eve of the invasion there was still good reason to think that WMDs were out there somewhere. Why not let the UN inspectors—who had been allowed by the Iraqi government to inspect every site they had asked to inspect—keep looking? There just isn’t an answer to this question that holds water.
By dividing our attention between Iraq and Afghanistan, we failed to accomplish any of our claimed long-term goals in either country—and made the world a much more dangerous place in the process.
I refuse to purchase tickets from the Live Nation/Ticketmaster monopoly, no matter how much I love the act or believe that going to a show would bring about world peace. The Cure's Robert Smith makes it clear the artists themselves hate the monopoly as well:
Hours after Ticketmaster began the “verified fan” process on March 15 to distribute tickets for the band’s first American tour in years — an additional layer of security that Smith insisted upon to prevent scalpers and astronomical prices — the front man wrote an angry screed against the company for the mandatory fees they snuck in for buyers. “I am as sickened as you all are by today’s Ticketmaster ‘fees’ debacle,” he wrote in an all-caps Twitter thread. “To be very clear, the artist has no way to limit them. I have been asking how they are justified. If I get anything coherent by way of an answer I will let you all know … There are tickets available, it is just a very slow process. I will be back if I get anything serious on the TM fees.”
One particular tweet gained virality for showcasing the extent of the company’s malpractice: A fan’s reasonable ticket price of $20 was more than doubled due to processing fees and charges.
At least The Cure have enough clout to get some changes made. Ticketmaster backed down ever so slightly from the 110% surcharges after Smith's complaints:
“After further conversation, Ticketmaster have agreed with us that many of the fees being charged are unduly high, and as a gesture of goodwill have offered a $10 per ticket refund to all verified fan accounts for the lowest ticket price transaction,” [Smith Tweeted]. “And a $5 per ticket refund to all verified fan accounts for other ticket price transactions for all Cure shows at all venues.”
Unregulated capitalism produces monopolies in short order; that's why we have regulation. But having a history degree means watching everything in the present rhyme with everything in the past. So while the monopolies of today have their moment of rapacious greed, I fully expect that we'll see some serious trust-busting soon, and then, 60 years from now, our grandchildren will have forgotten why.
I'm arguing with the Blazorise framework right now because their documentation on how to make a layout work doesn't actually work. Because this requires repeated build/test cycles, I have almost no time to read all of this:
Finally, a group of Chicago aldermen have proposed that the city clear sidewalks of snow and ice when property owners don't. Apparently the $500 fines, which don't happen often, don't work often either.
We had four completely overcast days in a row, including one with some blowing snow, so I'm happy today has been completely clear. Tomorrow might even get above 10°C—which would at least get us into normal March temperatures. This whole winter has been weird, as the next few will likely be, until temperature increases start leveling out.
In other news:
Finally, Bruce Schneier and Nathan Sanders explain how AI could write our laws in the future.
The New York Times today has an interactive feature explaining how converting pre-war offices to apartments is a lot easier than converting modern office buildings. Simply put, before the 1940s, no one had air conditioning, so the buildings had more light and air:
These kinds of buildings, often dating to the early 20th century, make for simpler conversions because the same logic that shaped how they were designed as offices a century ago determines how apartments are planned today. Both share a rule of thumb that no interior space be more than 8 or 9 meters from a window that opens.
Iconic prewar skyscrapers like the Empire State Building were designed to this standard, and with this smallest unit in mind: a single rentable office 3 to 6 meters wide and about 8 meters from the windows to the common corridor. That was just the right amount of space for a receptionist’s anteroom and a windowed office.
Dan Kaplan, a senior partner with the architecture firm FXCollaborative in New York, identifies the private-eye suite in any film noir as a classic example: frosted glass doors, a secretary framed by interior transom windows, and then the detective in his private office flooded with natural light.
But the conversion puzzle gets more complex with offices built after World War II. That’s because the modern office has strayed far — increasingly far — from the window rule.
Two inventions liberated office space from the window: air-conditioning and the fluorescent light bulb. Just as the elevator and steel-cage construction enabled buildings to grow taller in the late 19th century, the architectural historian Carol Willis has written, fluorescent lighting and air-conditioning enabled their floor plates to become much deeper.
Then local rules add still more complexity: Maybe the building has to meet stricter seismic requirements as an apartment than as an office (much of the West Coast), or the whole facade must be replaced to meet current wind-load standards (hurricane-prone places). Or you can only convert 18 of the 32 existing office floors into residential use (in Manhattan, such use caps depend on a building’s age and location). Or units must average at least 500 square feet in size per building (downtown Chicago). Or every legal bedroom must have its own working window (New York requires this but Philadelphia and San Francisco don’t).
Still, the commercial real-estate collapse of the last three years has made conversions imperative in big-city downtowns like the Chicago Loop.
The result, probably in only a few years, will be to transform former dense commercial districts like the Loop into dense mixed-use districts that people want to live in.
To paraphrase Hemingway, the pandemic began gradually, then suddenly. Three years ago today, we started March 12th with some trepidation and ended it by closing the world.
What a strange three years we've had.
After my work conference this week, and flying home yesterday, I had a rehearsal this morning and I've got a performance tomorrow. I'll try to catch up on some posts tomorrow morning.
We had several options for group activities today. I did not choose the golf or spring training options. I chose this:
I should have photos of this and other bits (including two extra Brews & Choos stops!) over the weekend.
I'm in Phoenix for my company's Tech Forum, where all the technology professionals come together for a few days of panel discussions and heavy drinking networking events. This morning's lineup, including the keynote speaker, emphasized to me the dangers in the United States' declining ability to teach kids English and history.
I will have more details later, but for now I'll mention these three things. First, if you show the ubiquitous graph of the growing gap between productivity and wages that the US and UK have experienced since the mid-1970s and blame technology for this gap, I'm going to point you to Ronald Reagan, Margaret Thatcher, and the history of capitalism as possibly contributing factors. I mean, there was a similar wage-productivity gap in the southern US from about 1800 to 1865, which technology certainly made possible, but ultimately public policy had a lot more to do with it.
Second, if you present your company's most exciting new AI technology, and someone in the audience asks whether you can show some non-scripted input, saying "no" calls your entire presentation into question. But that's OK; it was already the most boring presentation on an exciting topic I'd seen in years, so I suspect the guy may have challenged them to go off-script with less-than-honorable intentions.
Finally, to the junior developer presenting for the first time to other professionals: if your slide has content on it obviously copied and pasted from the previous slide, your colleagues will forgive you with a little razzing. If you then cannot for the life of you figure out what the content should be, your colleagues—particularly the more senior ones—will think you've blown off your homework and as a consequence your presentation has wasted their time. Because what am I learning from you anyway, if you have not learned it yourself?
What does this have to do with humanities education? I guarantee all of these presenters were engineers without much history or English study, and their lack of breadth showed.
Next up: the "Sonoran Desert Hike" experience, with 45 of my best friends. It's cool and cloudy right now, so I anticipate I will enjoy it immensely.