The Usenet postings at the bottom of the BillG memo are terrific. I loved reading those at the time - all the industry speculation about DOS, OS/2, Windows, and NT. So many keystrokes, so much bloviating and prediction and posturing.
You used to be obsessed with one InfoWorld weekly columnist who was always horribly wrong because he was such an OS/2 cheerleader. Was it Zackman?
Nicholas Petreley! Such an ass. He once wrote a column claiming IBM was /just about/ to drop the hammer on MS over (?) OpenDoc patents. I'd seen just the day before that we'd signed some eternal cross-patent deal with IBM. Ha ha, Nick.
I got a job offer from Steve Jobs because of one of these. I couldn't believe it. He really did spend time in the early 90s poring over comp.sys.next.advocacy and the like.
"The management part was an add-on. There was no such thing as a manager who didn’t also code. " : IMO this was one of the great strengths of Microsoft at least early on. As a "manager" you had firm understanding of the design and architecture and the schedule, As company grew larger and "management" required more time and focus it created several challenges and problems. I'm hoping you'd get to those later in your posts.
(adapted from my own memoir-writing drafts folder - it seemed to fit your post)
I had been brought back to Autodesk in the fall of 1992 to lead development of a completely new architecture for AutoCAD, one that would let it play well in a world where all desktop computing was going to be based on GUIs. I was chosen because I had developed Windows apps since 1986, had 3 years of NeXT development experience, and had become the leader for AutoCAD development on the Mac. Though we historically did all core AutoCAD development on Sun workstations and shipped on DOS and Windows using 1-2-man porting teams, these products, while functionally identical, had user-experience and architectural shortcomings that kept them from supporting the interaction models users now expected. That was fine in a world where AutoCAD on DOS was a bespoke environment on a platform with no interaction-model standards to speak of, but the GUI was clearly where all our platforms were going.

AutoCAD was built around its famous command line, which relied on a keyboard-polling architecture unsuited to event-driven environments. That made newly expected idioms impossible: modeless dialog boxes, floating palettes, dynamic tear-off menus that reflected program state, and so on.
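For those who never lived through it, the clash looks roughly like this - a minimal sketch with hypothetical names, nothing from the actual AutoCAD code base:

```cpp
#include <cstdio>
#include <windows.h>

// Stand-in for AutoCAD's command dispatcher (hypothetical).
void dispatchCommand(const char* cmd) { printf("running: %s", cmd); }

// Old model: the app owns the loop and blocks on the keyboard.
// While fgets() waits for Enter, no palette or modeless dialog
// elsewhere in the app can react to anything.
void pollingCommandLoop() {
    char buf[256];
    for (;;) {
        fputs("Command: ", stdout);
        if (!fgets(buf, sizeof buf, stdin)) break;  // blocks here
        dispatchCommand(buf);
    }
}

// Event-driven model: the system owns the loop and delivers each
// event to the right window, so every window stays live between
// keystrokes -- exactly what modeless UI requires.
void eventLoop() {
    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);  // routed to each window's WndProc
    }
}
```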
I had successfully convinced the company that Windows would be the dominant platform, and I had successfully (and painfully) convinced the founders that yes, one could develop the very large AutoCAD and its associated add-ons using the newly released NT. We would eliminate the portable C idioms we had used for over a decade and move to portable C++, with Win32 APIs as the basic architecture. We would rely on portability frameworks to provide Win32 APIs on non-Windows platforms (Bristol Wind/U and Mainsoft - yeah, those guys - for Unix, and Mewel for DOS), and we considered and prototyped with all the C++ or Objective-C Windows application frameworks that might allow us to develop in a style akin to NeXTStep's AppKit.
We tried them all - XVT, C++Views, Objective-C Views, a few more I can't remember, and Zinc from some friendly guys in Utah, which we were able to get working on DOS via Mewel's Win32 API implementation. After a lot of prototyping and examination by two dozen engineers, we chose Zinc. We paid $1-2M for a corporate license and began work. Zinc shipped us a pallet of boxes of the software, and we distributed copies to 100 engineers. We were building AutoCAD with MSC for the first time ever (it had been a MetaWare High C app on PCs for years, like dBase, Paradox, and other large DOS programs), and all was going well with the AutoCAD-on-Zinc design and the MSC conversion of the code base.
Then Visual C++ came out with MFC 2.0. (MFC 1.0 had been looked at and was a non-starter, for reasons you would know.) There were only two engineers in the entire company who had worked on the NeXT platform - myself and a guy I had brought with me from the company where we had done that work. We opened the box, pored over the manuals, installed it, and got to coding over a delirious 2 days. We were shocked, to say the least.
The framework, the wizards, and everything else reminded us of the Interface Builder/Project Builder systems, and the rest of the environment was particularly good. Our experience, and that of many other people I knew in the valley (including my many NeXT friends), was that frameworks like AppKit had a 10x effect on productivity and on program understanding for newly onboarded people. MFC 2.0 looked like it would have the same effect.
Since our portability strategy for other operating systems was based on a Win32 layer, I called Bristol and they agreed that they would have Wind/U working with MFC ASAP. Our DOS guy thought he could get it to work.
Now I had to convince our VP of Engineering, John Lynch, that despite all the effort that had gone into engineering a solution on Win32 and Zinc under my leadership, and despite spending millions on Zinc software and months of training, **this** was the platform we needed to use instead. John was livid, saying we'd look stupid and indecisive, but I pointed out that we hadn't been told about MFC 2.0 until that week and had immediately jumped on it and analyzed it inside and out. None of that mattered anyway, I said: moving exclusively to MFC was the right thing to do to create a Windows-centric code base for all our products, and everybody would have higher job satisfaction working with MFC (better on your resume, etc.). He said fine, but that I needed to present this about-face to our CEO myself.
In the meeting with Carol Bartz, I walked through the details of the decision process and was reminded by her questions that she had a computer science background and an immediate grasp of most of the issues. She can be very intimidating (ask Steve who won in the Yahoo deal). Finally, she said, "It sounds well thought out; it's just too bad that NeXT didn't win, because all of these advantages would be available to us on a 4-year-old platform."
My response was, "MFC provides almost all of the same benefits, but with one important advantage - it runs on a platform people actually buy."
Carol said, "That’s it. Go do it". We did.
All of Autodesk's core revenue-producing software today was developed using MFC, on Windows. That decision simplified our product development and enabled us to grow from $200M to $1.8B while I was there. I should thank you for my career.
My tagline when explaining my decision and why I moved the company in this direction became "MFC is NeXTSTEP on a platform people actually buy". You couldn't use it, but I could.
I was right for 15 years, anyway.
In your opinion, does Windows architecture have a bit of an oopaholic problem? I have often wondered that, based on evidence that I perhaps misinterpret. First, the Registry, with its tree structure, varied data types, and key-value orientation, always seemed to me ideal for loading data into in-memory objects as a basic system goal. This is in contrast to Linux, which still prefers to load from lines of plain text and retains its "everything is a file" nature. Second, the "fail fast" philosophy of the Windows "System" model implied to me that the memory structures used are complex enough that determining where corruption lies is difficult--therefore failing as soon as corruption is detected is preferable. Complex memory structures imply objects to me (and of course the Windows Internals books refer to various memory objects, but these could be red herrings for my hypothesis).
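For concreteness, the contrast I mean looks roughly like this (illustrative key and file names, nothing from a real product):

```cpp
#include <cstdio>
#include <cstring>
#include <string>
#include <windows.h>

// Windows style: a typed key-value lookup in a tree. The shape maps
// naturally onto filling the fields of an in-memory object.
std::string readFromRegistry() {
    char buf[256];
    DWORD size = sizeof buf;
    if (RegGetValueA(HKEY_CURRENT_USER, "Software\\ExampleApp", "DataDir",
                     RRF_RT_REG_SZ, nullptr, buf, &size) == ERROR_SUCCESS)
        return buf;
    return "";
}

// Unix style: configuration is lines of plain text that the program
// re-parses as strings -- "everything is a file".
std::string readFromTextConfig() {
    FILE* f = fopen("/etc/exampleapp.conf", "r");  // e.g. "datadir=/var/lib/app"
    if (!f) return "";
    char line[256];
    std::string result;
    while (fgets(line, sizeof line, f))
        if (strncmp(line, "datadir=", 8) == 0) { result = line + 8; break; }
    fclose(f);
    return result;
}
```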
It's no great crime if the Windows architecture is heavily OO-influenced, but it does seem to me to mark it as of its era--late 80s to early 90s. Java, designed in the same time frame, basically leads with object orientation as its overriding principle. Your references to oopaholic tendencies reverberate with what I know of Windows.
Good question. I am not sure it is an OOP problem so much as a point of view that tended to embrace complexity in the name of being sophisticated. The debate over the binary registry versus text files was pretty significant, as many had a lot of experience with text files. It was the potential “benefits” of a binary file (scale, database-like functionality, etc.) that won out. We had a similar debate over setup, even after we’d all seen the implementation in NeXTStep. Many of us really wanted to do stateless install, but the view that management tools would be better off with a per-machine install database won out.
I can relate very much to the strategy of using "C++ as a better C" and sticking to a "sane subset" of C++. And I have experienced that even today it can be a valid strategy! Let me elaborate. :-)
Just a few years ago, I was the software architect responsible for a 100 kLOC code base of hearing-aid firmware. The code ran on an ARM Cortex-M0 CPU at 5 MHz with 144 KB of RAM and 128 KB of ROM.
The code base was written in C, but I pushed for it to become C++. Not for religious reasons, or because I wanted to use all the fancy features of C++, but because "C++ as a better C" would let us write code that's more type-safe, avoid macros, and maybe use a class here and there (where we were using a "class-like" struct in C anyway). And since we were so memory-constrained, the move to C++ was not allowed to increase compiled code size! OK, maybe by a few bytes. ;-)
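For a flavor of what that "better C" subset looked like, roughly this kind of thing (illustrative snippets, not our actual firmware code):

```cpp
#include <cstdint>

// C: untyped macro; the argument is textually pasted and evaluated twice.
#define CLAMP_TO_U8(x) ((x) > 255 ? 255 : (x))

// "Better C": typed and inlined, with no code-size penalty when optimized.
constexpr uint8_t clampToU8(int x) {
    return x > 255 ? 255 : static_cast<uint8_t>(x);
}

// C: a "class-like" struct plus free functions taking a pointer to it.
struct BiquadC { int16_t state[2]; };
void biquadC_reset(BiquadC* f) { f->state[0] = f->state[1] = 0; }

// "Better C": same memory layout and same generated code, but the data
// and the operations on it are tied together and checked by the compiler.
struct Biquad {
    int16_t state[2];
    void reset() { state[0] = state[1] = 0; }
};
```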
The push towards C++ ended up being successful, and after a 3-year transition period, the entire code base was C++. During those 3 years, the code base was in productive daily use and shipped in multiple hearing aid product launches. Compiled code size did not increase noticeably.
Older code in that code base is still very C-like today, but newly added code started to use more advanced C++ language features. The code base is now a mix of those styles, and maybe surprisingly, that hasn't really caused problems.
It was definitely a very interesting "journey" with a lot of learnings, so much so that I presented them to a few hundred people at two European embedded-systems conferences in 2018 and 2019. In case anyone is interested in more details, there's a recording (https://youtu.be/nuwOJ-xUhFU) and a white paper (https://drive.google.com/file/d/1kQEBfVCSBYoqD233HjmifBo4dL4VCcMP/view).
Regarding AFX/MFC, I am curious to learn how the "C++ as a better C" approach progressed over time. Was it kept strictly? Or were more C++ language features used over time? (disclaimer: I have never programmed with MFC)
Oh, and in our case, over time we eventually allowed the use of deeper C++ constructs, as people became familiar with the language and knew what was going on at the bare-metal level.
Maybe MSFT was different from us and every other company that shipped products on a schedule. Would be interesting to hear.
For AutoCAD, we tested people by asking them to identify every memory allocation occurring in code that used references. The failure rate at spotting where copy-constructor activity occurred was quite high - over 90%. In an application as large as AutoCAD, you manage your own memory very carefully, deliberately co-locating items on VM pages and tracking every malloc(), especially in loops. So this failure rate wasn't going to fly, and we didn't have time for people to learn.
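The kind of thing people missed, in a made-up example (not our actual test code) - where exactly does the copy constructor run?

```cpp
#include <cstdio>

struct Entity {
    double coords[16];
    Entity() { puts("default ctor"); }
    Entity(const Entity&) { puts("copy ctor -- a hidden allocation-sized copy"); }
};

Entity makeEntity() { return Entity(); }  // copy may be elided or not (pre-C++17)
void byValue(Entity e) {}                 // copies on every call
void byRef(const Entity& e) {}            // never copies

int main() {
    Entity a;                 // default ctor
    Entity b = a;             // copy ctor -- looks like plain assignment
    byValue(a);               // copy ctor, hidden at the call site
    byRef(a);                 // no copy
    Entity c = makeEntity();  // the one most people got wrong
}
```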
Noticing that Objective-C users did not have this issue - every object reference is a pointer to something you know you allocated - we mandated the Objective-C approach to object references.
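The mandated style, sketched with a hypothetical class (this shows the convention, not real AutoCAD code): every object lives behind a pointer you allocated yourself, and accidental copies simply don't compile.

```cpp
class DbEntity {
    DbEntity(const DbEntity&);             // declared private and never
    DbEntity& operator=(const DbEntity&);  // defined (the pre-C++11 idiom):
public:                                    // accidental copies won't compile
    DbEntity() {}
};

void process(DbEntity* e) { /* operate through the pointer */ }

void usage() {
    DbEntity* e = new DbEntity();  // you made this allocation; you can see it
    process(e);                    // all references to the object are pointers
    delete e;                      // and you release it explicitly
    // DbEntity copy = *e;         // would not compile: copy ctor is private
}
```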
Problem solved.