11 Comments
Mar 8, 2021 · Liked by Steven Sinofsky

The Usenet postings at the bottom of the BillG memo are terrific. I loved reading those at the time, the industry speculation about DOS/OS2/Windows/NT. So many keystrokes, so much bloviating and prediction and posturing.

Mar 21, 2021 · Liked by Steven Sinofsky

"The management part was an add-on. There was no such thing as a manager who didn’t also code. " : IMO this was one of the great strengths of Microsoft at least early on. As a "manager" you had firm understanding of the design and architecture and the schedule, As company grew larger and "management" required more time and focus it created several challenges and problems. I'm hoping you'd get to those later in your posts.


(adapted from my own memoir writing drafts folder - seemed to fit your post)

I had been brought back to Autodesk in the Fall of 1992 to lead development of a completely new architecture for AutoCAD, one that would let it play well in a world where all desktop computing was going to be based on GUI development. I was chosen for this because I had been developing Windows apps since 1986, had three years of NeXT development experience, and had become the leader for AutoCAD development on the Mac. Historically we did all core AutoCAD development on Sun workstations and shipped on DOS and Windows using one- to two-person porting teams; those products, while functionally identical, had user-experience and architectural shortcomings that kept them from working well with the interaction models users had come to expect. That was fine in a world where AutoCAD on DOS was a bespoke environment for users on a platform with no standards for interaction models, but the GUI was clearly where all our platforms were going. AutoCAD was built around its famous command line, which relied on a keyboard-polling architecture that was unsuited to event-driven environments. It could not support newly expected idioms such as modeless dialog boxes, floating palettes, dynamic tear-off menus that reflected program state, and so on.
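
Roughly, the difference looked like this (a minimal hypothetical sketch, not AutoCAD's actual code; `DispatchCommand` is made up):

```cpp
#include <windows.h>
#include <cstdio>

// Hypothetical stand-in for the command interpreter.
void DispatchCommand(const char* line) { std::printf("(would run: %s)\n", line); }

// Polling style: the application owns one blocking loop, and nothing else
// can happen while it waits for the next line of keyboard input.
void CommandLoop()
{
    char line[256];
    for (;;)
    {
        std::printf("Command: ");
        if (!std::fgets(line, sizeof line, stdin))   // blocks on the keyboard
            break;
        DispatchCommand(line);
    }
}

// Event-driven style: Windows owns the loop and pushes events (keystrokes,
// mouse, paint) to the app, which must handle each one and return quickly;
// that is what makes modeless dialogs, floating palettes, and state-driven
// menus possible.
int RunMessageLoop()
{
    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0)
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);   // routes the event to the target window's WndProc
    }
    return static_cast<int>(msg.wParam);
}
```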

I had successfully convinced the company that Windows would be the dominant platform, and I had successfully, and painfully, convinced the founders that yes, one could develop the very large AutoCAD and its associated add-ons using the newly released NT. We would eliminate the portable C idioms we had used for over a decade and move to portable C++, with Win32 APIs used for the basic architecture. We would rely on portability frameworks to provide Win32 APIs on non-Windows platforms (Bristol Wind/U and Mainsoft - yeah, those guys - for Unix, and Mewel for DOS), and we considered and prototyped with all the C++ or Objective-C Windows application frameworks that would allow us to develop in a style akin to NeXTSTEP's AppKit.

We tried them all - XVT, C++Views, Objective-C Views, a few more I can't remember, and Zinc from some friendly guys in Utah that we were able to get working on DOS using Mewel, a Win32 API implementation for DOS. After a lot of prototyping and examination by two dozen engineers, we chose Zinc. We paid $1-2M for a corporate license and began work. Zinc shipped us a pallet of boxes of the software, and we distributed copies to 100 engineers. We were building AutoCAD with MSC for the first time ever (it had been a MetaWare High C app on PCs for years, like dBase, Paradox, and other large DOS programs), and all was going well with the AutoCAD-on-Zinc design and the MSC conversion of the code base.

Then Visual C++ came out with MFC 2.0. I should say, MFC 1.0 had been looked at and was a non-starter, for reasons you would know. There were only two engineers in the entire company who had worked on the NeXT platform - myself and a guy I had brought with me from the company where we had done that work. We opened the box, pored over the manuals, installed it, and got to coding over a delirious two days. We were shocked, to say the least.

The framework, the wizards and everything else reminded us of Interface Builder/Project Builder systems, and the rest of the environment was particularly good. Our experience and that of many other people I knew in the valley (including my many NeXT friends) told us that using frameworks like AppKit had a 10x effect on productivity and program understanding for newly onboarded people. MFC 2.0 seemed like it would have the same effect.
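
For anyone who never saw it, the shape of an MFC 2.0 app looked roughly like this (an illustrative sketch with invented class names, not our code): the framework supplies WinMain and the message loop, and you plug behavior in through message maps, much the way an AppKit app plugs into the NeXT run loop.

```cpp
#include <afxwin.h>

class CSketchWnd : public CFrameWnd
{
public:
    CSketchWnd() { Create(NULL, _T("MFC sketch")); }

protected:
    afx_msg void OnPaint()
    {
        CPaintDC dc(this);                            // framework-managed device context
        dc.TextOut(10, 10, _T("Hello from the framework"));
    }
    DECLARE_MESSAGE_MAP()
};

BEGIN_MESSAGE_MAP(CSketchWnd, CFrameWnd)
    ON_WM_PAINT()                                     // wires WM_PAINT to OnPaint
END_MESSAGE_MAP()

class CSketchApp : public CWinApp
{
public:
    virtual BOOL InitInstance()
    {
        m_pMainWnd = new CSketchWnd;                  // framework deletes it on close
        m_pMainWnd->ShowWindow(m_nCmdShow);
        return TRUE;
    }
};

CSketchApp theApp;                                    // the framework's single app object
```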

Since our portability strategy for other operating systems was based on a Win32 layer, I called Bristol and they agreed that they would have Wind/U working with MFC ASAP. Our DOS guy thought he could get it to work.

Now I had to convince our VP of Engineering John Lynch that, despite all the effort that had gone into engineering a solution on Win32 and Zinc under my leadership, despite spending millions on Zinc software and months of training, **this** was the platform we needed to use instead. John was livid, saying that we'd look stupid and indecisive, but I pointed out that we had not been told about MFC 2.0 until that week and had jumped on it and analyzed it inside and out. None of that mattered anyway, I said; moving exclusively to MFC was the right thing to do to create a Windows-centric code base for all our products, and everybody would have higher job satisfaction working with MFC (better on your resume, etc.). He said fine, but that I needed to present this about-face to our CEO myself.

In the meeting with Carol Bartz, I walked through the details of the decision process, and was reminded by her questions that she had a computer science background and had an immediate understanding of most of the issues. She can be very intimidating (ask Steve who won in the Yahoo deal). Finally, she said, "it sounds well thought out; it's just too bad that NeXT didn't win, because all of these advantages would be available to us on a 4-year-old platform".

My response was, "MFC provides almost all of the same advantages but with an important advantage - it runs on a platform people actually buy".

Carol said, "That’s it. Go do it". We did.

To this day, all of Autodesk's core revenue-producing software has been developed using MFC, on Windows. That decision simplified our product development and enabled us to go from $200M to $1.8B while I was there. I should thank you for my career.

My tagline when explaining my decision and why I moved the company in this direction became "MFC is NeXTSTEP on a platform people actually buy". You couldn't use it, but I could.

I was right for 15 years, anyway.

May 20, 2021 · Liked by Steven Sinofsky

In your opinion, does the Windows architecture have a bit of an oopaholic problem? I have often wondered that, based on evidence that I perhaps misinterpret. First, the Registry, with its tree structure, varied data types, and key-value orientation, always seemed to me ideal for loading data into memory as objects, as a basic system goal. This is in contrast to Linux, which still prefers to load from lines of plain text and retains its "everything is a file" nature. Second, the "fail fast" philosophy of the Windows "System" model implied to me that the memory structures used are complex enough that determining where corruption lies is difficult--therefore failing as soon as corruption is detected is preferable. Complex memory structures imply objects to me (and of course the Windows Internals books refer to various memory objects, but those could be red herrings for my hypothesis).
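
To illustrate the contrast I mean (a sketch with hypothetical paths and value names, not real system keys): the Registry hands back typed values from a tree of keys through an API, while the Unix-style approach parses lines of plain text out of a file.

```cpp
#include <windows.h>
#include <cstdio>

// Windows style: a typed value read out of a hierarchical key.
DWORD ReadTimeoutFromRegistry()
{
    HKEY hKey;
    DWORD timeout = 0;
    DWORD size = sizeof(timeout);
    if (RegOpenKeyExW(HKEY_CURRENT_USER, L"Software\\ExampleApp", 0,
                      KEY_READ, &hKey) == ERROR_SUCCESS)
    {
        // A REG_DWORD comes back as a 32-bit integer, not a string to parse.
        RegQueryValueExW(hKey, L"TimeoutSeconds", nullptr, nullptr,
                         reinterpret_cast<BYTE*>(&timeout), &size);
        RegCloseKey(hKey);
    }
    return timeout;
}

// Unix style: everything is a file; the "type" exists only in the parsing code.
long ReadTimeoutFromTextFile()
{
    long timeout = 0;
    if (FILE* f = std::fopen("/etc/exampleapp.conf", "r"))
    {
        char line[128];
        while (std::fgets(line, sizeof line, f))
            std::sscanf(line, "timeout_seconds=%ld", &timeout);
        std::fclose(f);
    }
    return timeout;
}
```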

It's no great crime if the Windows architecture is heavily OO-influenced, but it does seem to me to mark it as being of its era--late 80s to early 90s. Java, designed in the same time frame, basically leads with object-orientation, and that is its overriding principle. Your references to oopaholic tendencies reverberate in what I know of Windows.

Mar 13, 2021 · Liked by Steven Sinofsky

I can relate very much to the strategy of using "C++ as a better C" and to "stick to a sane subset" of C++. And I have experienced that even today, it can be a valid strategy! Let me elaborate. :-)

Just a few years ago, I was the software architect responsible for a 100 kLOC firmware code base for hearing aids. The code ran on an ARM Cortex-M0 CPU at 5 MHz with 144 KB of RAM and 128 KB of ROM.

The code base was written in C, but I pushed for it to become C++. Not because of religious reasons, or because I wanted to use all the fancy features of C++. But because "C++ as a better C" would allow us to write code that's more type-safe, avoid macros and maybe use a class here and there (for when we used a "class-like" struct in C anyway). And since we were so memory-constrained, the move to C++ was not allowed to increase compiled code size! Ok, maybe by a few bytes. ;-)
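
To give a flavor of what "C++ as a better C" meant in practice, here's a made-up example (not our actual hearing-aid code): the same logic first in C with a macro and an init function, then with a template and a small class, keeping the same memory layout and essentially the same generated code.

```cpp
// C style: a macro and a struct with free functions operating on it.
#define CLAMP(x, lo, hi) ((x) < (lo) ? (lo) : ((x) > (hi) ? (hi) : (x)))

struct filter_c { int gain; };

void filter_c_init(struct filter_c* f, int gain) { f->gain = gain; }
int  filter_c_apply(const struct filter_c* f, int sample)
{
    return CLAMP(sample * f->gain, -32768, 32767);
}

// "Better C" style: type-safe, scoped, and just as small once inlined.
template <typename T>
inline T clamp_value(T x, T lo, T hi) { return x < lo ? lo : (x > hi ? hi : x); }

class Filter
{
public:
    explicit Filter(int gain) : gain_(gain) {}          // replaces filter_c_init
    int apply(int sample) const
    {
        return clamp_value(sample * gain_, -32768, 32767);
    }
private:
    int gain_;                                          // same footprint as struct filter_c
};
```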

The push towards C++ ended up being successful, and after a 3-year transition period the entire code base was C++. Throughout those 3 years the code base remained in productive daily use and shipped in multiple hearing aid product launches. Compiled code size did not increase noticeably.

Older code in that code base is still very C-like today, but newly added code started to use more advanced C++ language features. The code base is now a mix of those styles, and maybe surprisingly, that hasn't really caused problems.

It was definitely a very interesting "journey" with a lot of learnings, so much so that I presented them to a few hundred people at two European embedded-systems conferences in 2018 and 2019. In case anyone is interested in more details, there's a recording (https://youtu.be/nuwOJ-xUhFU) and a white paper (https://drive.google.com/file/d/1kQEBfVCSBYoqD233HjmifBo4dL4VCcMP/view).

Regarding AFX/MFC, I am curious to learn how the "C++ as a better C" approach progressed over time. Was it kept to strictly, or were more C++ language features adopted as the years went by? (Disclaimer: I have never programmed with MFC.)
