Via Raymond Chen, Eric Schlaepfer built a working, transistor-scale replica of the 6502 out of discrete components:
The MOnSter 6502
A dis-integrated circuit project to make a complete, working transistor-scale replica of the classic MOS 6502 microprocessor.
How big is it?
It's a four layer circuit board, 12 × 15 inches, 0.1 inches thick, with surface mount components on both sides.
Can you hook it up inside an Apple ][ and run Oregon Trail?
No, not directly. It's neat to think of plugging the MOnSter 6502's in-circuit emulator (ICE) in-circuit replica (ICR) cable directly into a socket inside an Apple ][, but that wouldn't actually work. The Apple ][ design relies on a number of clever tricks that derive timing for video generation and peripheral control from the main clock signal — all of which will fail if you need to run at a slower speed.
Are you going to make one out of vacuum tubes next?
Make sure you watch the 2-minute video.
Just an historical note: as of today, I've been working with Microsoft .NET for 17 years. The first time I picked it up was 10 September 2001, which, if you think about it, is a very easy date to remember.
Atlantic editor Adam Serwer draws a straight line from the Redemption court of the 1870s, which paved the way for the Gilded Age and Jim Crow, to the Roberts court now (and especially with Brett Kavanaugh on it), which is returning to those halcyon days:
The decision in Cruikshank set a pattern that would hold for decades. Despite being dominated by appointees from the party of abolition, the Court gave its constitutional blessing to the destruction of America’s short-lived attempt at racial equality piece by piece. By the end, racial segregation would be the law of the land, black Americans would be almost entirely disenfranchised, and black workers would be relegated to a twisted simulacrum of the slave system that existed before the Civil War.
The justices did not resurrect Dred Scott v. Sandford’s antebellum declaration that a black man had no rights that a white man was bound to respect. Rather, they carefully framed their arguments in terms of limited government and individual liberty, writing opinion after opinion that allowed the white South to create an oppressive society in which black Americans had almost no rights at all. Their commitment to freedom in the abstract, and only in the abstract, allowed a brutal despotism to take root in Southern soil.
The conservative majority on the Supreme Court today is similarly blinded by a commitment to liberty in theory that ignores the reality of how Americans’ lives are actually lived. Like the Supreme Court of that era, the conservatives on the Court today are opposed to discrimination in principle, and indifferent to it in practice. Chief Justice John Roberts’s June 2018 ruling to uphold President Donald Trump’s travel ban targeting a list of majority-Muslim countries, despite the voluminous evidence that it had been conceived in animus, showed that the muddled doctrines of the post-Reconstruction period retain a stubborn appeal.
Roberts wrote that since the declaration itself was “facially neutral toward religion” and did not discriminate against all Muslims, it did not run afoul of the Constitution. In doing so, he embraced the logic of decades of jurisprudence from his predecessors on the high court, whose rulings ensured that the Constitution would not interfere with the emergence of Jim Crow in the American South. The nation’s founding document is no match for a dedicated majority of justices committed to circumventing its guarantees.
He notes that the Roberts court, at least, is not vociferously white supremacist. But its deference to corporate rights, he points out, almost guarantees another generation of increasing wealth disparities in America.
Unless we win all three branches of government and pass an amendment or two. But it'll have to get a lot worse before we do that, if history is any guide.
Update: Longtime reader MB sent this: "At every crossroads on the path that leads to the future, tradition has placed 10,000 men to guard the past."—Maurice Maeterlinck
Most people starting college this year were born in 2000. Let that sink in. Then read this:
- They are the first class born in the new millennium, escaping the dreaded label of “Millennial,” though their new designation—iGen, GenZ, etc. — has not yet been agreed upon by them.
- Outer space has never been without human habitation.
- They have always been able to refer to Wikipedia.
- They have grown up afraid that a shooting could happen at their school, too.
- People loudly conversing with themselves in public are no longer thought to be talking to imaginary friends.
It gets worse from there. (Worse, I suppose, if you realize that these kids are 30 years younger than you are.)
I'm traveling today, so this may be my last post of the Summer of 2018. Posting resumes from the Ancestral Homeland tomorrow.
Chicago-based writer Daniel Kay Hertz finds that reactions to gentrification, and its effects, have remained the same for over a century:
I’ve been struck by the Groundhog Day quality of thinking on these changes. Decade after decade, observers alternately wonder at the latest clique of young, middle-class white people to have chosen to live in a less privileged urban neighborhood, and then predict that clique’s imminent demise, a return to the “natural” order of things.
As early as the 1920s, the sociologist Harvey Zorbaugh quoted people who swore that time was up for the residents of Tower Town, Chicago’s bohemian answer to New York City’s Greenwich Village, as young artists abandoned it. (Many of those who left just settled a short walk up the lakefront in what we now call Old Town.) Zorbaugh himself was convinced that the Gold Coast, the last inner city stronghold of Chicago’s upper class, had barely ten years left before the rich realized they would have fewer headaches farther from the chaos of the downtown Loop. (A century later, the Gold Coast is still, well, Gold.)
Often, even the gentrifiers themselves don’t quite believe that what they’ve created can last. Into the 1970s—when parts of Lincoln Park had already become wealthier than many white-collar suburbs—a Lincoln Park neighborhood association director fretted that one wrong development might push the area towards a “ghetto.”
Why have we found it so hard to believe that a generations-old trend of growing affluence at the core of a major city could be durable? And why has it proven so durable?
Hertz provides some pretty compelling and well-researched answers.
I found this photo just in time for its 30th anniversary. That's me on my first full day on campus, 27 August 1988. The guy in the '80s mesh shirt is my first college roommate.
That night, he and a few of his new friends did beer funnels in the room, forcing me to go to sleep with two drunk idiots lying on our floor in pools of beer.
I got a new roommate the day room moves opened up 4 weeks later. I have no idea what became of the guy, but I imagine he sells insurance or something.
Update: According to his Facebook profile, he's a chiropractor now. I would never have guessed that. Never.
The Post has a long-form profile of our greatest (and longest-serving!) former president, Jimmy Carter:
When Carter left the White House after one tumultuous term, trounced by Ronald Reagan in the 1980 election, he returned to Plains, a speck of peanut and cotton farmland that to this day has a nearly 40 percent poverty rate.
The Democratic former president decided not to join corporate boards or give speeches for big money because, he says, he didn’t want to “capitalize financially on being in the White House.”
Presidential historian Michael Beschloss said that Gerald Ford, Carter’s predecessor and close friend, was the first to fully take advantage of those high-paid post-presidential opportunities, but that “Carter did the opposite.”
Since Ford, other former presidents, and sometimes their spouses, routinely earn hundreds of thousands of dollars per speech.
“I don’t see anything wrong with it; I don’t blame other people for doing it,” Carter says over dinner. “It just never had been my ambition to be rich.”
Carter decided that his income would come from writing, and he has written 33 books, about his life and career, his faith, Middle East peace, women’s rights, aging, fishing, woodworking, even a children’s book written with his daughter, Amy Carter, called “The Little Baby Snoogle-Fleejer.”
With book income and the $210,700 annual pension all former presidents receive, the Carters live comfortably. But his books have never fetched the massive sums commanded by more recent presidents.
That's not the only way he's stayed modest and given back to the nation. It's a good read.
On 8 August 1988, the Chicago Cubs played their first night game at Wrigley Field. The Tribune rounds up memories from people who supported and opposed the installation of lights at the park:
Ryne Sandberg, Cubs second baseman, 1982-1997: Leading up to ’88, the talk within the organization was that lights were necessary to create a schedule more conducive to resting the home team, getting us out of the sun. Before that, with some of those 10-day homestands with all day games (it was) in 90-plus temperatures.
Rick Sutcliffe, Cubs pitcher, 1984-1991: There's nothing better than playing a day game and going home to have dinner with your family. But when you come back from a West Coast trip, and let’s say you had a long game … sometimes we went straight from the airport to the ballpark. It’s really difficult that whole homestand. You just feel wiped out. … I would throw nine innings at Dodger Stadium and might lose anywhere from 2 to 4 pounds. There were times at Wrigley Field during that heat that I lost 10 to 15 pounds. I would love to go start a game to lose 15 right now!
Did lights help the Cubs? Probably, but there's no definitive way to say.
Yesterday was the 73rd anniversary of our nuclear attack on Hiroshima, Japan. On the event's 50th anniversary, The Atlantic asked, "Was it right?"
I imagine that the persistence of that question irritated Harry Truman above all other things. The atomic bombs that destroyed the cities of Hiroshima and Nagasaki fifty years ago were followed in a matter of days by the complete surrender of the Japanese empire and military forces, with only the barest fig leaf of a condition—an American promise not to molest the Emperor. What more could one ask from an act of war? But the two bombs each killed at least 50,000 people and perhaps as many as 100,000. Numerous attempts have been made to estimate the death toll, counting not only those who died on the first day and over the following week or two but also the thousands who died later of cancers thought to have been caused by radiation. The exact number of dead can never be known, because whole families—indeed, whole districts—were wiped out by the bombs; because the war had created a floating population of refugees throughout Japan; because certain categories of victims, such as conscript workers from Korea, were excluded from estimates by Japanese authorities; and because as time went by, it became harder to know which deaths had indeed been caused by the bombs. However many died, the victims were overwhelmingly civilians, primarily the old, the young, and women; and all the belligerents formally took the position that the killing of civilians violated both the laws of war and common precepts of humanity. Truman shared this reluctance to be thought a killer of civilians. Two weeks before Hiroshima he wrote of the bomb in his diary, "I have told [the Secretary of War] Mr. Stimson to use it so that military objectives and soldiers and sailors are the target and not women and children." The first reports on August 6, 1945, accordingly described Hiroshima as a Japanese army base.
This fiction could not stand for long. The huge death toll of ordinary Japanese citizens, combined with the horror of so many deaths by fire, eventually cast a moral shadow over the triumph of ending the war with two bombs.
It's a sobering essay. It's also a good argument, indirectly, in favor of making sure nuclear weapons are never used again.
Sediment under Lake Chichancanab on the Yucatan Peninsula has offered scientists a clearer view of what happened to the Mayan civilization:
Scientists have several theories about why the collapse happened, including deforestation, overpopulation and extreme drought. New research, published in Science Thursday, focuses on the drought and suggests, for the first time, how extreme it was.
[S]cientists found a 50 percent decrease in annual precipitation over more than 100 years, from 800 to 1000 A.D. At times, the study shows, the decrease was as much as 70 percent.
The drought was previously known, but this study is the first to quantify the rainfall, relative humidity and evaporation at that time. It's also the first to combine multiple elemental analyses and modeling to determine the climate record during the Mayan civilization's demise.
Many theories about the drought triggers exist, but there is no smoking gun some 1,000 years later. The drought coincides with the beginning of the Medieval Warm Period, thought to have been caused by a decrease in volcanic ash in the atmosphere and an increase in solar activity. Previous studies have shown that the Mayans’ deforestation may have also contributed. Deforestation tends to decrease the amount of moisture and destabilize the soil. Additional theories for the cause of the drought include changes to the atmospheric circulation and decline in tropical cyclone frequency, Evans said.
What this research has to do with the early 21st Century I'll leave as an exercise for the reader.