Today is the 20th birthday of the Microsoft .NET Framework. I remember it vividly, because of the job I had then and its weirdly coincidental start and end dates.
In 2001, I joined a startup in Chicago to write software using the not-yet-released .NET Framework. My first day of work was September 10th. No one showed up to work the next morning.
Flash forward to February 2002 and our planned release date of Monday, February 18th, chosen to coincide with the official release of .NET. (We couldn't release software to production using the unreleased beta Framework code.) Microsoft moved its date up to the 14th, but we held to ours, because releasing new software on a Thursday night in the era before automated DevOps pipelines was just dumb.
I popped out to New York to see friends on Saturday the 16th. Shortly after I got back to my house on the 17th, our CTO called to let me know about a hitch in our release plans: the CEO had gotten caught with his hand in the till. We all wound up working at minimum wage (then $5.15 an hour) for two weeks, with the rest of our compensation deferred until, it turned out, mid-2004.
So, happy birthday, .NET Framework. Your release to manufacturing date meant a lot more to me than I could have imagined at the time.
I've spent today alternating between upgrading the code base for my real job to .NET 6.0 and preparing for the Apollo Chorus performances of Händel's Messiah on December 11th and 12th.
Cassie, for her part, enjoys it when I work from home, even if we haven't spent a lot of time outside today because (a) I've had a lot to do and (b) it rained from 11am until just about now.
So, as I wait for the .NET 6 update to build and deploy on our dev/test CI/CD instance (I think I set the new environments on our app services correctly), I have a few things to read:
OK, the build has...well, crap. I didn't set the environment correctly after all.
Update: Fixed the build bit. And the rain stopped. But the test platform is the wrong version. FFS.
Update: Well, I have to pick something up from a store before 6, so I'll come back to this task later.
Update: Even though I've made four tiny commits fixing minor things that broke with the .NET 6 upgrade, this hasn't gone poorly. Kudos to Microsoft for providing a straightforward upgrade path.
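For anyone doing a similar upgrade, the pipeline pieces involved look roughly like this. It's a minimal sketch, assuming an Azure DevOps YAML pipeline that uses the UseDotNet@2 and AzureWebApp@1 tasks; the service connection, app name, and environment value are placeholders, not my actual settings, and in a real pipeline the SDK step lives in the build stage while the deploy step lives in the release stage.

# Make the build agent use the .NET 6 SDK instead of whatever it has cached.
- task: UseDotNet@2
  inputs:
    packageType: 'sdk'
    version: '6.0.x'

# Deploy, and set the environment explicitly so the app service picks up the right config.
- task: AzureWebApp@1
  inputs:
    azureSubscription: 'my-service-connection'   # placeholder
    appName: 'my-dev-test-app'                   # placeholder
    package: '$(Pipeline.Workspace)/drop/**/*.zip'
    appSettings: '-ASPNETCORE_ENVIRONMENT Development'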
Also known as: read all error messages carefully.
I've just spent about 90 minutes debugging an Azure DevOps pipeline after upgrading from .NET Core 3.1 to .NET 5 RC2. Everything compiled cleanly and all tests passed locally, but the Test step of my pipeline failed with this error message:
##[error]Unable to find D:\a\1\s\ProjectName.Tests\bin\Debug\net5.0\ref\ProjectName.Tests.deps.json.
Make sure test project has a nuget reference of package "Microsoft.NET.Test.Sdk".
The test step had this Test Files configuration:
**\bin\$(BuildConfiguration)\**\*Tests.dll
!**\*TestAdapter.dll
!**\obj\**
I'll spare you all the steps I went through to determine that the .NET 5 build step copied only .dlls into the ref folder, without copying anything else (like the dependencies definition file), so the wildcard above matched a copy of the test assembly that couldn't actually run. The solution turned out to be adding one line to the configuration:
**\bin\$(BuildConfiguration)\**\*Tests.dll
!**\ref\**
!**\*TestAdapter.dll
!**\obj\**
Excluding the ref folder fixed it. And I hope this post saves someone else 90 minutes of debugging.
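In case it saves you even more time, here's roughly where that filter lives if your pipeline is defined in YAML. This is a sketch assuming the VSTest@2 task; the classic (UI) editor's "Test files" box takes the same patterns, and the searchFolder value shown is just the default.

- task: VSTest@2
  displayName: 'Run tests'
  inputs:
    testSelector: 'testAssemblies'
    testAssemblyVer2: |
      **\bin\$(BuildConfiguration)\**\*Tests.dll
      !**\ref\**
      !**\*TestAdapter.dll
      !**\obj\**
    searchFolder: '$(System.DefaultWorkingDirectory)'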
The Cloud—known to us in the industry as "someone else's computers"—takes a lot of power to run. Which is why our local electric utility, ComEd, is beefing up their service to the O'Hare area:
Last month, it broke ground to expand its substation in northwest suburban Itasca to increase its output by about 180 megawatts by the end of 2019. Large data centers with multiple users often consume about 24 megawatts. For scale, 1 megawatt is enough to supply as many as 285 homes.
ComEd also has acquired land for a new substation to serve the proposed 1 million-square-foot Busse Farm technology park in Elk Grove Village that will include a data center component. The last time ComEd built a substation was in 2015 in Romeoville, to serve nearby warehouses. In the past year, Elk Grove Village issued permits for four data center projects totaling 600,000 square feet and $175 million in construction. If built, it's a 40 percent increase in total data center capacity in the village.
Insiders say Apple, Google, Microsoft and Oracle have taken on more capacity at data centers in metro Chicago in the past year or so.
One deal that got plenty of tongues wagging was from DuPont Fabros Technology, which started work earlier this year on a 305,000-square-foot data center in Elk Grove Village. DuPont, which recently was acquired by Digital Realty Trust, pre-leased half of it, or about 14 megawatts, to a single customer, believed to be Apple.
One of the oldest cloud data centers, Microsoft's North Central Azure DC, is about three kilometers south of the airport here. Notice the substation just across the tollway to the west.
The Microsoft Windows operating system has millions of lines of code maintained by thousands of developers. And in the past three months, Microsoft has moved 90% of its code to the open-source Git version control system:
The switch to Git has been driven by a couple of things. In 2013, the company embarked on its OneCore project, unifying its different strands of Windows development and making the operating system a more cleanly modularized, layered platform. At the time, Microsoft was using SourceDepot, a customized version of the commercial Perforce version control system, for all its major projects.
SourceDepot couldn't handle a project the size of Windows, so rather than having the whole operating system in a single repository, the Windows code was actually divided among 65 different repositories, with a kind of virtualization layer on top to produce a unified view of all the code. Some of these 65 repos contained nicely isolated, standalone components; others took vertical or horizontal slices through the operating system; others were just grab bags of different code. As such, the repo structure didn't correspond with OneCore's module boundaries.
Due to widespread developer familiarity and strong support for creating lots of branches with low overhead, the decision was made to use Git as the new system. But Git isn't designed to handle 300GB repositories made up of 3.5 million files. Microsoft had to embark on a project to customize Git to enable it to handle the company's scale.
You read that right: Windows contains 3.5 million individual code files, and there are so many changes to them (8,500 per day on average) that they had to create their own super-charged version of Git.
Programming nerds will want to read the whole article. Non-nerds can scroll down for political stuff.
Senior Microsoft programmer Raymond Chen describes a feature in Windows 10 that is unusually useful:
Windows 10 brings the Xbox Game DVR feature to the PC. The Game DVR feature lets you record yourself playing a video game, so you can share the recording with your friends.
Suppose you have some program that you want to record, say for a bug report or for an instructional video. Just pretend it's a game:
- Put focus on the program you want to record.
- Press Win+G to open the Game Bar. If it asks whether you want to open the Game Bar, say, "Yes, this is a game."
- Press the red circle to start recording, or press Win+Alt+R.
- Do the thing you want to record.
- Open the Game Bar and press the red square to stop recording. Or use the hotkey Win+Alt+R.
- Optional: Open the Game Bar, click the gear icon, and uncheck "Remember this app as a game."
The recording is placed in your Videos\Captures folder.
Cool, right?
Despite being a longtime .NET guy, despite thinking Java has lagged significantly in language features and power over the years, and despite the ludicrous claim that .NET isn't portable, I laughed very hard at this Norwegian video:
No, really. In 1998, Microsoft wanted to demonstrate its SQL Server database engine with a terabyte-sized database, so it built an online satellite map called Terraserver. Motherboard's Jason Koebler has the story:
Terraserver could have, should have been a product that ensured Microsoft would remain the world’s most important internet company well into the 21st century. It was the first-ever publicly available interactive satellite map of the world. The world’s first-ever terabyte-sized database. In fact, it was the world’s largest database for several years, and that Compaq was—physically speaking—the world's largest computer. Terraserver was a functional and popular Google Earth predecessor that launched and worked well before Google even thought of the concept. It let you see your house, from space.
So why aren’t we all using Terraserver on our smartphones right now?
Probably for the same reason Microsoft barely put up a fight as Google outpaced it with search, email, browser, and just about every other consumer service. Microsoft, the corporation, didn't seem to care very much about the people who actually used Terraserver, and it didn’t care about the vast amount of data about consumers it was gleaning from how they used the service.
In sum, Microsoft saw itself as a software company, not an information company. It's similar to how Borders got destroyed: it thought of itself as a bookstore, while Amazon thought of itself as a delivery service.
I remember how cool Terraserver was, and how sad I felt when it languished for a couple of years before Google Earth took over the space it had pioneered.