6 Comments
Feb 15, 2021 · Liked by Steven Sinofsky

Cross platform has quite a history at Microsoft. Steven is right to point out the ebb and flow of the benefit of cross-platform approaches as platforms and computing eras go through their life cycles. By that I mean that OSes like Mac OS and Windows are platforms, while GUI (local) and Internet ("server") based apps are eras. (Your taxonomy mileage may vary :))

I joined Microsoft at the beginning of the Mac OS platform and GUI era but before Windows and OS/2. I worked on the Mac apps File, Multiplan, and Excel 1.0 (barely). These early apps did a lot to teach Microsoft about the GUI era, lessons that were critical to winning the apps market. Other ISVs were completely focused on MS-DOS and ignored the GUI era at their peril. I give credit to BillG (Bill Gates) for having the strategic forethought about GUI, even if DougK hated it at first. It was a "Big Bet," as GUI needed more of everything: RAM, disk, graphics, and peripherals (think mice and laser printers).

Since Mac OS was the only GUI platform at first, there was no cross-platform need in the early GUI era. This fit with the resource constraints: to make things perform well, the closer the app stayed to the platform, the better. Windows was on the horizon, and after Excel 1.0 shipped exclusively on Mac OS, a team was put together to port it to Windows (including me). At this point Windows 1.0 had just shipped, with virtually zero ISV support, and the Windows team was given the charter to make sure Windows 2.0 would garner ISV support. In addition to Excel, the Windows team was keen to make sure Aldus PageMaker would work on Windows. I sometimes say that the Excel, PageMaker, and Windows teams wrote Windows 2.0 together. Another important piece of the Windows story happened during this time: in the second half of this project, Microsoft acquired Dynamical Systems Research Inc. and its programming team, including DavidW. There are many stories to that adventure, but this is a post about cross platform!

After shipping Excel 2.0 on Windows, with the OS/2 Presentation Manager platform on the horizon, we started thinking about cross-platform strategies for Excel. Steven and the tools team were biding their time for C++ and other ideas to come together. Word was not yet shipping for Windows. So the Excel team was doing this on its own.

Cross-platform programming is an exercise in indirection. The "core" code is written to a model platform, and the model platform is then implemented on a real platform like Mac OS. One of the biggest decisions to make, one that has ramifications for performance and stability, is "what should the model platform be?" Steven will have to speak to the thinking in the tools team, but I remember their attitude being that they should build an "ideal" model platform using OOP paradigms. For Excel we took a very pragmatic approach. Our design philosophy was to make the model platform implementable with as little code as possible, driven by the maxim, "Choose the mechanism which will suck the least on the other platforms." The "Excel Layer" model handled eventing and windowing using the Windows model, and for graphics used a hybrid of the Mac OS model, with global GrafPorts but Windows-style handles for pens, brushes, and fonts. The memory management model was the HUGE pointer allocator that already spanned the two platforms from Excel 1.0 and 2.0. The goal was to have the core code contain zero platform-specific code.
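To make the indirection concrete, here is a minimal C++ sketch of what such a model-platform seam might look like. Every name here (XlPen, XlCreatePen, DrawCellBorder, and so on) is a hypothetical illustration, not Excel's actual identifiers; the point is only the shape: core code calls a small model API, and each real platform supplies the bodies.

```cpp
#include <cstdio>

// Hypothetical "model platform" seam (illustrative names, not Excel's API).
// Core code sees only these declarations; each real platform supplies bodies.
struct XlPen;                     // opaque, Windows-style handle for a pen
XlPen* XlCreatePen(int widthPx);
void   XlSelectPen(XlPen* pen);  // drawing goes to an implicit, GrafPort-like
void   XlMoveTo(int x, int y);   //   global context, in the Mac style
void   XlLineTo(int x, int y);
void   XlDestroyPen(XlPen* pen);

// "Core" code: zero platform-specific lines.
void DrawCellBorder(int left, int top, int right, int bottom) {
    XlPen* pen = XlCreatePen(1);
    XlSelectPen(pen);
    XlMoveTo(left, top);
    XlLineTo(right, top);
    XlLineTo(right, bottom);
    XlLineTo(left, bottom);
    XlLineTo(left, top);
    XlDestroyPen(pen);
}

// A stand-in "port" for testing (a real one would call CreatePen/LineTo on
// Windows, or PenSize/Line against the current GrafPort on Mac OS).
struct XlPen { int width; };
XlPen* XlCreatePen(int w)     { return new XlPen{w}; }
void   XlSelectPen(XlPen*)    { }
void   XlMoveTo(int x, int y) { std::printf("move %d,%d\n", x, y); }
void   XlLineTo(int x, int y) { std::printf("line %d,%d\n", x, y); }
void   XlDestroyPen(XlPen* p) { delete p; }

int main() { DrawCellBorder(0, 0, 100, 20); }
```

Picking the least-bad convention from each platform (Windows eventing, Mac-style global drawing context, Windows-style handles) is exactly what kept each port's implementation of this seam thin.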

Why pursue cross-platform programming? The intention is that most of the programming effort of an app is done as core code. People remember the mantra "Write once, run anywhere," and that was the hope for Excel. Any new feature would be written once. The core code bugs would be worked out on the first platform shipped, and the subsequent platform versions would thus take much less effort to ship. The Excel team would have all platforms working up until the code-complete stage of the development process, then focus on the first shipping platform exclusively (keeping the other platforms compiling and passing a basic battery of automated testing), then ship the other platforms in turn.

The Excel Layer worked very well, and it was used to deliver Excel 2.2, 3.0, and 4.0 on all platforms, including OS/2 when it arrived (and then not, when IBM and Microsoft split). However, eventually Excel's layer stopped working well as the platforms began diverging. From a memory management perspective, Windows with protected mode vastly outstripped Mac OS in the size of applications it could support. Apple would fix this with OS X, but that was many years off. Another set of issues arose when feature areas with end-user abstractions (think desktop, window, menu, scroll bar) were completely different on each platform. The best example I can remember is the Windows concept of OLE (Object Linking and Embedding) versus the Mac OS answers, Publish and Subscribe and later OpenDoc. The user models, semantics, and data models were so different that there was no ready model-platform approach to preserve the core code paradigm. Platform-specific code proliferated in the core.

Along this timeline Microsoft acquired Forethought, Inc. and its app PowerPoint. PowerPoint also started on the Mac; its team wrote a Windows-specific version and then its own cross-platform layer. Steven's tools team shipped AFX for Windows (he will tell that story in much detail, I am sure). Word used parts of that to ship a core-code Word for Mac OS. Also important along this timeline is a mini era shift, away from stand-alone applications to office suites.

These factors led to general displeasure among end users. People started complaining about "least common denominator apps," and the Mac version of Office stretched Mac OS to its limit. At this time, I was the VP responsible for Office. The Mac business was important to Microsoft. In those days SteveB (Steve Ballmer) would use "dollars per PC" (where PC included Macs) as a top-level management metric, and ironically, because Office had a much higher usage percentage on Macs, Microsoft made more money per Mac sold than per IBM-style PC sold. It was clear that Office's cross-platform strategy was no longer making good enough applications for Mac OS. So I made the decision to abandon the strategy and formed an independent Mac Business Unit led by BenW (Ben Waldman).

That, my friends, is a real-world, life-and-death story of one cross-platform strategy. To compete in an era, platforms start with the similarities applicable to that era, but then diverge as they compete for dominance of it. Cross-platform approaches work for a while, and then they don't.

Steven Sinofsky (author)

Thank you for so much fun color and detail!

May 14, 2021 · Liked by Steven Sinofsky

You, and others, were too far ahead of what the technology could support. Usually the way this story ends is that the "ahead of its time" approach is either abandoned or the company goes under. What is interesting here is that Microsoft abandoned the approach (the right call, given how good the MacBU stuff was in the end) but kept it in its back pocket, because today, as we all know, all iterations of Office (PC, web, mobile, Mac) are built from a common code base. Obviously there have been lots of twists and turns in the meantime, but the takeaway is simple: if you have the ultimately right approach, be pragmatic in shifting when the time is not yet right, but don't forget what you "should" be doing when the circumstances for success are finally in place.

Feb 22, 2021 · Liked by Steven Sinofsky

Hi Steven, great chapter. Speaking of Excel and cross platform: Excel 2 for Windows (the first version for Windows) had x86-specific assembly code for certain parts, like the SUM function, so that Excel would win shoot-outs in the PC press at the time. The press would compare how fast Excel would add a column or row of numbers against other spreadsheets like Lotus 1-2-3.
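As a rough illustration of the kind of hot loop being raced: the shipped code was hand-written x86 assembly, so this C++ sketch with manual four-way unrolling is only my reconstruction of the general technique, not Microsoft's code.

```cpp
#include <cstddef>
#include <cstdio>

// Four-way unrolled sum: four adds per loop test/branch, the sort of thing
// 1980s compilers wouldn't do for you. (Multiple accumulators reorder the
// floating-point adds, so rounding can differ slightly from a naive loop.)
double SumRange(const double* cells, std::size_t n) {
    double a0 = 0, a1 = 0, a2 = 0, a3 = 0;
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        a0 += cells[i];
        a1 += cells[i + 1];
        a2 += cells[i + 2];
        a3 += cells[i + 3];
    }
    for (; i < n; ++i) a0 += cells[i];   // leftover elements
    return (a0 + a1) + (a2 + a3);
}

int main() {
    double col[] = {1.5, 2.5, 3.0, 4.0, 5.0};
    std::printf("%g\n", SumRange(col, 5));  // prints 16
}
```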

Feb 23, 2021 · Liked by Steven Sinofsky

The connection of OOP and Smalltalk-80 is fascinating. To the best of my knowledge, Smalltalk was not used to develop the system or the apps on the Xerox workstation. There was a point where a Xerox workstation configured around InterLISP was offered in an early round of the AI hype cycle, though.

My unhappiness with Smalltalk was not so much with its intended purpose as with the fact that it put too much work at the leaves of computations. My notion has mainly been that the leaves are supposed to be simple and very fast operations, with more complicated things emerging higher up in the computation tree. Of course, some of that has been moved onto the processor these days, but I'll stick by that just the same.
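A small C++ sketch of what "work at the leaves" means in practice (my illustration, not the commenter's code): in a Smalltalk-style design, even the innermost operation is a dynamically dispatched message, whereas keeping leaves primitive pays one machine add per element and moves the abstraction higher in the tree.

```cpp
#include <cstddef>
#include <cstdio>

// Smalltalk-style: even the innermost add is a dynamically dispatched message.
struct Number {
    virtual ~Number() = default;
    virtual double addTo(double acc) const = 0;
};
struct Float64 : Number {
    double v;
    explicit Float64(double v) : v(v) {}
    double addTo(double acc) const override { return acc + v; }
};

double SumDispatched(const Number* const* xs, std::size_t n) {
    double acc = 0;
    for (std::size_t i = 0; i < n; ++i)
        acc = xs[i]->addTo(acc);     // indirect call at every leaf
    return acc;
}

// Primitive leaves: one machine add per element; abstraction lives higher up.
double SumPrimitive(const double* xs, std::size_t n) {
    double acc = 0;
    for (std::size_t i = 0; i < n; ++i) acc += xs[i];
    return acc;
}

int main() {
    Float64 a(1), b(2), c(3);
    const Number* objs[] = {&a, &b, &c};
    double raw[] = {1, 2, 3};
    std::printf("%g %g\n", SumDispatched(objs, 3), SumPrimitive(raw, 3));
}
```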

At the 40th Anniversary ACM event in Houston, Adele Goldberg introduced me to one of the ParcPlace guys, and I lamented about this. I was then told about all the things that were being done in hardware designs to accelerate Smalltalk.

This reminded me of the Burroughs situation, where scaling up involved many kinds of hardware cache and other acceleration techniques to deal with the handling of an ALGOL-like (but defective) computation model.

My assumption was that there were far better things to do with all those transistors, and it is an opportunity cost situation. These days, maybe we have transistors to burn and the only way to expand processing capacity is by such measures. My single-threaded brain has not dug into any of that.

Of course there has always been pipelining, branch prediction, and all manner of other acceleration methodologies. Also, as we see with .NET (and C++ somewhat), there are ways to optimize around having every leaf be poised to provide full object-messaging goo.

I had a tangential connection with the Simula project, since it used the ALGOL 60 compiler written at (then-named) Case Institute for the Univac 1107. I did not understand it, but the Case team was pretty excited.

In a kind of "for want of a nail" situation, I did a dealer table at a Trenton Computer Festival where I suggested the C++ convention for calling into methods was suspect. That got me a phone call from Bjarne, where he explained the constraints he was under (having to compile down to C), and also that he thought there was some context-switching cost to doing (virtual) method entries the way I claimed was better. He may have been right. In any case, there is no help for it at this point.

Sometimes details in code generation and/or hardware instruction architecture can have unexpectedly high costs. I will mention three: (1) required boundary alignment of data versus how the (maybe-hardware) stack works, (2) indirect addressing that changes register state and makes memory-access violations not restartable, so virtual memory becomes a problem, and (3) subroutine calling conventions that require the caller to provide the working data (e.g., object instance state) the called function needs. This last includes the IBM OS/360 "Type 2" subroutine calling convention and the C++ method-entry convention. As a functional-programming adherent already, I had figured out a workaround for Type 2 linkage, and I thought it was simply cleaner for closures to be enterable without the caller knowing anything about the callee's innards; I had succeeded nicely at that using a minicomputer assembly language. There are other nasties that make garbage collection painful as well.
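For readers who want that third distinction concrete, here is a modern C++ sketch (my illustration; the original workaround was done in minicomputer assembly). In the caller-supplies-state shape, the call site must know about, and hand over, the callee's working data; a closure instead carries its own environment, so the caller can enter it knowing nothing about the innards.

```cpp
#include <functional>
#include <iostream>
#include <memory>

// Caller-supplies-the-state shape (the OS/360 "Type 2" / C++ `this` style):
// the call site must know about, and hand over, the callee's working data.
struct Counter { int count = 0; };
void BumpExplicit(Counter* state) { ++state->count; }

// Closure shape: the callable carries its own environment, so the caller can
// enter it without knowing anything about the callee's innards.
std::function<int()> MakeBumper() {
    auto state = std::make_shared<Counter>();
    return [state] { return ++state->count; };
}

int main() {
    Counter c;
    BumpExplicit(&c);              // caller owns and passes the innards
    std::cout << c.count << '\n';  // 1

    auto bump = MakeBumper();      // environment sealed inside the closure
    std::cout << bump() << '\n';   // 1
    std::cout << bump() << '\n';   // 2
}
```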

Steven Sinofsky (author)

Wonderful stories!
