022. Injecting New Ideas and IQ: The Information Superhighway
“We’ll need people on roller skates—like in the old days when computers had vacuum tubes—replacing disks as they failed.” —NathanM
In 1993, it would have been difficult to overstate the hype surrounding the “Information Superhighway.” Whatever its definition or capabilities, it consumed the imaginations of everyone from Wall Street to Main Street with magazine covers, morning news show pieces, investor conferences, and more. Microsoft had risen with the juggernauts of MS-DOS, Windows, and soon Office, and found itself, surprisingly, at the nexus of Hollywood, newspapers, cable TV companies, and telephone companies, each believing it would come to dominate the highway. Only one thing was missing, and that was some software to power it. Could Microsoft be that “vendor,” or would software be so central that Microsoft would come to dominate the very nature of information delivered to the home, as some felt it already dominated computing? Fear of Microsoft, and fear of Bill Gates, began to dominate. Gone were the wonders of the programming nerds in the Pacific Northwest.
Whenever I was feeling caught between shipping products and big vision or hearing about a product that was deep in bugs and had an unpredictable ship date, I could be rescued from my supply closet of an office by a demonstration from Microsoft Research (MSR) or the Advanced Consumer Technology group (ACT).
Bill viewed the gathering of this level of “IQ” and experience in these groups with great pride, and he invested a good deal of personal effort, often recruiting them himself. Icons filled the rosters of the two teams over the years: Jim Gray, pioneer in databases; Chuck Thacker, co-inventor of Ethernet networking; Gary Starkweather, inventor of the laser printer; Alvy Ray Smith, Academy Award winner, cofounder of Pixar, and inventor of the alpha channel in graphics; and Butler Lampson, founding member of Xerox PARC and inventor of the personal computer—and those were just the people hired in the early 1990s. I sometimes had to pinch myself that I even got to meet these legends. When Butler Lampson (BLampson) was being recruited I was asked to take him to lunch, but I was mostly starstruck.
NathanM and Craig Mundie (CraigMu, who at the time reported to Nathan) were a yin and yang. Nathan brought an eclectic background in physics and science, having founded a PC software company whose acquisition by Microsoft brought him (and, among others, DavidW, who was key to Windows early on) to the company. CraigMu was an industry veteran who had seen the entire arc of computing. He got his start at the legendary Data General and ultimately started a well-known supercomputer company, Alliant. Alliant fell victim to the advances of Moore’s Law, which brought him to Microsoft, “having seen failure,” as BillG used to say. Officially Craig was leading the Windows CE project, Microsoft’s first effort in mobile, but he often managed, formally or informally, advanced projects and an ever-expanding portfolio. Part of the yin to Nathan’s yang was that Craig was a former CEO and a technology industry veteran.
The information superhighway, as it was called, was front and center of all the future discussions BillG and NathanM were involved in. The internet, as we think of it today, was more than a year away when I first started as TA, though the first version of the Mosaic browser was released in the summer of 1993. The highway was how the phone companies and cable television companies described accessing information over their respective networks. Bill’s first book, which he began writing around this time, was titled The Road Ahead (1995) and spoke quite a bit about this metaphorical highway. The cover photo by Annie Leibovitz even featured Bill on a lone stretch of highway. In public, Bill was a bit of a realist about the timeline and who would “win.” The highway was the first time as a public company that Microsoft faced a huge mainstream hype cycle. Part of this cycle were alternating predictions about how Microsoft would come to dominate the superhighway or how Microsoft was missing out.
Prior to the internet, much of the discussion in BillG meetings looked like the internet, only it used proprietary software and required dedicated hardware devices from Microsoft, phone companies, or cable companies, and mostly worked only over private networks of leased lines running proprietary protocols or archaic phone company standards. The Information at Your Fingertips vision (more on this in a future post), first articulated in 1990, looked a lot like the internet would come to be in a very short time, but with an entirely different implementation.
While there was no single definition of the Superhighway, the common articulation was the idea that a wide variety of consumer services would be available directly to home computers using a new type of data connectivity offered by phone or cable companies (the obvious incumbents with wires to the house). Common examples offered were news, weather, sports, stock market, movie listings, shopping (like at a mall), along with communication services over voice and video. So basically, not unlike a local newspaper.
Cable companies were extremely interested in offering one of the most forward-looking services one could imagine at the time: video on demand. Imagine watching any show anywhere, anytime. Most of us were still kicking ourselves if we forgot to put a blank tape in the VCR to record Seinfeld, or if we had to go to Blockbuster in the rain in the hopes of finding a movie to rent. The early time-shifting digital recorder from ReplayTV was not yet released. In fact, there was not even an electronic program guide to know what was on television (some cable systems had a guide on a separate channel that scrolled by continuously).
No one really had any idea how to deliver video on demand, especially in a world where there was intense fear that a movie could be recorded, or stolen, copied onto videotapes, and sold on the street corner. The technology required to digitally encode and deliver video was enormous. A system had to store a massive amount of information (CD-ROMs held about 700 megabytes, which was about seven minutes of TV-quality video), transmit the video to households reliably, provide program listings, and support remote-control functions, such as pausing for breaks, and so on. In 1994 we were a long way from home DVRs or DVD movies. Even HDTV remained years off.
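To make the storage problem concrete, here is a back-of-the-envelope sketch of the math. The bitrate and library-size figures are illustrative assumptions of mine, not numbers from any actual deployment of the era:

```python
# Rough storage arithmetic for early-1990s video on demand.
# All figures are illustrative assumptions for this sketch.
MBIT_PER_MB = 8

def movie_size_mb(bitrate_mbps, minutes):
    """Approximate size in megabytes of a video stream at a constant bitrate."""
    return bitrate_mbps * minutes * 60 / MBIT_PER_MB

# An MPEG-1-class stream at ~1.5 Mbps: a two-hour movie is about 1,350 MB,
# i.e., roughly two CD-ROMs' worth of data for a single title.
print(movie_size_mb(1.5, 120))  # 1350.0

# A library of 1,000 such movies is on the order of 1.35 terabytes --
# at a time when a large server disk held only a few gigabytes.
print(movie_size_mb(1.5, 120) * 1000 / 1e6, "TB")
```

Even with heavy compression, a modest movie library implied hundreds of disk drives, which is exactly where the reliability problem below comes from.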
Yet, somehow, one day I walked into a conference room with BillG and we saw a demonstration of all of those pieces working. A program guide of movies to choose from, select a movie, instantly begin to watch it. It was one of the more impressive demonstrations I had seen—it wasn’t a fake demo made with Director or AuthorWare but real code from Microsoft Research. NathanM called the product Tiger, which was a code name for an MSR lab project developing a file server that could reliably move files around in real time. For this demo, the Tiger file system was used to deliver compressed video streams to a PC. The demo showed many PCs receiving different movies, each with their own pause/rewind/fast-forward controls. Mind blown.
Nathan detailed how this system worked, using a dozen or more Windows NT Servers connected to large collections of high-speed disk drives, linked together with the highest speed networking available. He spoke poetically about the architecture, and about the biggest problem they could foresee. Storing thousands of movies required a lot of disk drives, and unfortunately disk drives on PC servers were not particularly reliable. Nathan walked through the math for us—computing the mean-time-between-failure of drives, number of drives, and number of movies. His eyes got bigger and bigger as he got to the punchline, which was that a system like this would see failing disk drives as the bottleneck and “we’ll need people on roller skates—like in the old days when computers had vacuum tubes—replacing disks as they failed.” It was a colorful metaphor for a system decades ahead of its time.
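Nathan’s punchline follows from simple arithmetic. A minimal sketch of the calculation, using illustrative numbers of my own rather than the figures from his talk:

```python
# Back-of-the-envelope version of Nathan's disk-failure math.
# The drive count and MTBF below are illustrative assumptions.
def expected_failures_per_day(num_drives, mtbf_hours):
    """Expected drive failures per day, assuming independent, memoryless failures."""
    return num_drives * 24.0 / mtbf_hours

# A video server farm with 10,000 drives, each with a (generous)
# 100,000-hour MTBF, still loses a couple of drives every single day:
print(expected_failures_per_day(10_000, 100_000))  # 2.4
```

At that scale, drive replacement stops being an occasional repair and becomes a continuous operational process—hence the roller skates.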
Tiger generated a huge amount of interest from cable companies. Many wanted to deploy it immediately, but the networking capabilities into homes they had been counting on did not yet exist. Tiger also reinforced the collective view of Microsoft’s immense industry power—if it could conjure a remarkable system like this and extend the control of the PC to the living room, what could Microsoft not accomplish? The press reports on Tiger simply stated Microsoft’s control of the information superhighway as fact, too.
Tiger, which really was a research project and had no product team structure around it at the time, became somewhat of a lynchpin in the multi-way discussions Bill, Nathan, and Craig found themselves involved in. On the one hand, there was an industry that made cable TV tuner boxes, each company jockeying for a role in making the device that would sell millions to the cable companies to replace low-tech CATV tuners. On the other hand were the carriers, cable and telephone, trying to out-maneuver each other to gain the upper hand in being the pipeline to the home—and with that the ability to control the flow of information and the services offered, and to earn incremental revenue for everything on the system. There were also the content providers, who owned everything interesting. For months, Bill, Nathan, and Craig met with CEOs across these industries. At one point Microsoft entered into a pilot project with Time Warner that garnered ongoing national news about the rollout of the superhighway.
Microsoft found itself in a new position. By some accounts it was poised to dominate the future of information services to the home. By other accounts it was going to provide plumbing to the massive companies already providing cable and voice services. But with announcements from all those companies almost constantly about pilots, prototypes, partnerships and more, many thought Microsoft was behind because it wasn’t in all the announcements. Microsoft was at the same time in the early days of being one of the most dominant US companies, including early investigations by the Federal Trade Commission. Microsoft the nerdy tech company found itself front and center in entirely new industries. It was crazy. And all we had was Tiger and Windows PCs. What product would Microsoft even make?
Like so many of the innovation-oriented projects, Tiger was so far ahead as to be unable to connect to the here and now. Customers were not prepared to deploy and manage thousands of Windows Servers, consumers did not have high-speed networking, and even Hollywood was not ready to distribute video this way. The path from Tiger to today’s streaming services (which Microsoft is not part of) does not represent a series of missteps by Microsoft, but rather a series of additional technologies and marketplace expectations that diverged completely from how Tiger approached the problem. It is fun to think about how early the vision was, but as is so often the case, when you dive in you realize all the assumptions made were the wrong ones and all the technologies needed were generations away from being ready to solve the problem.
Fumbling the Future: How Xerox Invented, then Ignored, the First Personal Computer, a book detailing how Xerox invented the PC but failed to capitalize on it, was top of mind for BillG. It was a tragic story, and one Microsoft did not want to repeat. Being aware of the challenge and avoiding falling victim to it are different things. Nobody wants to be wildly underestimated or misunderstood when history is presented.
The technology world eventually solved Nathan’s disk drive problem, but it wasn’t Microsoft’s answer.
Microsoft and the world around Windows NT followed the path of IBM mainframes, continuing to work to make disk drives and the software more reliable—redundancy, quality control, and more increased the cost per megabyte and made it even more difficult to scale to huge data volumes. A new company came along to solve this problem, taking the exact opposite approach. Google, not even a company for another five years, would invent early in the millennium what came to be known as GFS, the Google File System. It used cheap commodity disks (like the kind in a home PC, not those used in data centers) in a system designed on the assumption that disks would fail. As such, the idea of replacing them quickly like vacuum tubes was unnecessary. Coincidentally, the lead inventor of GFS was a classmate of mine from Cornell who also worked on the Cornell Program Synthesizer project—that revolutionary programming tool that influenced many of our ideas in Visual C++. Small world.
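The core of the opposite approach is replication: keep multiple copies of each chunk of data on different cheap disks and re-create copies when a disk dies. A toy illustration of why that works (this is my own sketch of the probability argument, not GFS code, and the failure rate is an assumed number):

```python
# Illustrative sketch of why replication tames unreliable commodity disks.
# The 1% failure probability below is an assumption for the example.
def p_chunk_loss(p_disk_fail, replicas):
    """Probability that every replica of a data chunk is lost,
    assuming the replica disks fail independently."""
    return p_disk_fail ** replicas

# If any one disk has a 1% chance of dying before its data is
# re-replicated elsewhere, three-way replication drops the chance
# of losing a chunk to about one in a million:
print(p_chunk_loss(0.01, 3))
```

Software absorbs the failures, so nobody needs roller skates: dead drives can sit in the rack until a convenient maintenance window.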
In hindsight, it was easy, especially during the down times of Microsoft, to become cynical over the company’s inability to commercialize legitimate inventions such as Tiger. Beyond Bill’s expansive vision for the role software and computing could play, it was also a management and organization approach that allowed projects to incubate. In particular, the very experience that influenced both Apple Macintosh and Microsoft Windows, visits to Xerox PARC, the birthplace of the PC and graphical operating system, was front and center of Bill’s efforts to develop innovative projects and to commercialize them.
What I learned, and only in hindsight, is that visualizing the future and even providing working prototypes of it cannot account for the ability of the marketplace, customers, or even the most cost-effective technology approaches to make something a reality. The technology industry is littered with ideas before their time, and to find fault in those companies or leaders for not capitalizing is almost always in error. Tiger was not on a path to be a commercial streaming service. Deploying it, as we saw it in 1994, wasn’t remotely possible.
Years later, as Microsoft hit rough patches in innovation and leadership, people pointed to many of the projects from the 1990s and asked what happened. BillG led and empowered visions across nearly every computing domain, but the underpinning of them all was the set of assumptions that built Microsoft—PCs, Windows, Win32, and then Windows Server, held together with a client-server architecture. If those weren’t the right ingredients, then building with them didn’t create a path to some future. At the time, however, those were the only ingredients, so it made total sense to be using them. Faulting Microsoft for building with Windows in the 1990s would have been as crazy as faulting IBM for using mainframes in the 1970s or criticizing Detroit for building cars with internal combustion engines in the 1990s. What is impossible to see is when those ingredients cease to be assets and turn into liabilities. What really happens is that new entrants seeking to invent a new future deliberately take different approaches to solving problems and in doing so intentionally avoid following in the footsteps of the incumbent. They too often fail, but the world hardly notices…until one succeeds.
The pattern of Microsoft being early to so many innovative spaces often omits the challenges that existed at the time. When you’re early many of the ingredients required—like the internet, high-speed networking, faster processors, long battery life, touch screens, pen digitizers, and so on—are simply not there. That means the products aren’t ready to be built. When a new generation of product takes off it is rarely a ground-up invention; rather, it builds on the many early failures that came before.
In my role I was supposed to be eyes and ears, but I struggled with how to overcome what I saw as potential blind spots. It was never difficult to show new technologies to Bill, and he was ready, willing, and able to absorb new information and incorporate it (or not) into his world view. At the same time, he seemed to over-index on the complex or sophisticated, seeing those as a moat or strategic advantage. Simple solutions did not have the appeal that a complex solution did, one that required the IQ of MSR or “deep architectural thinking.” The researchers also valued complexity. The product teams tended to avoid complexity, seeing edge cases and boundary conditions as the enemy of shipping. This aversion to complexity looked almost lazy, as though there was a fear of taking on the hard problems and solving them. Worse, simplicity looked expedient, as though there was an attempt to get full credit for a solution by doing only part of the work. Seriously though, who was I to raise these questions? What did I know?
Bill saw products as built out of components of technology, and each of those components needed to be the most sophisticated and singular across the company. The best text control, the best forms package, the best directory, and the best database were each ingredients that allowed him to take a product like the information superhighway or Lotus Notes, break it down, and assign those components to the very best people to build the parts. There was a blind spot there.
Who would stitch those pieces together to make a product? Was Lotus Notes really a database, a forms package, and a programming language? Was video on demand really Tiger plus some user-interface code? The hands-on experience with MS-DOS and Windows seemed to have enshrined the lesson that building components was the winning strategy and developers would provide the rest of the technology to create a full product experience. The Applications teams were learning the exact opposite lesson—that winning was about the complete experience, and having the most advanced high-tech pieces without stitching them into a product was not all that useful.
The two cultures at Microsoft (MikeMap’s two gardens) yielded different results for different reasons, and they both worked.
What I did have a handle on was shipping and products. In my (only) five years, and with the innumerable stories about shipping I had heard, I definitely believed I could tell the difference between something that was real and something that was mostly slides. Bill did not always see things this way. He gravitated towards the technology view and was less interested in the process and mechanics of shipping. When a project like Cairo or Tiger needed him to push on shipping, he was more comfortable continuing the technology discussion. The NT and EMS teams were happy to engage in the technology discussion, but in a sense kept it at arm’s length while they dedicated themselves to making the tradeoffs required to ship.
Most any leader setting this tone would have had far more projects ultimately end like Tiger, but Bill had one enormous strength that made the company what it was. He did not hire only people in his image. Rather he balanced his technology leadership with product (and sales, marketing, operations, etc.) leadership and also empowered those people to get their jobs done. They just had to put up with those deep technology discussions and have good answers to them. MikeMap, PaulMa, PeteH, BradSi, ChrisP, JonDe, and on and on filled out the product building ranks and were given the latitude to execute. I soon found myself spending my time as a TA trying to amplify those voices, while (too) often showing the effort required to go from technology to product.
On to 023. ThinkWeek