Tag Archives: The Innovators

Isaacson’s The Innovators (final thoughts) – is the future about thinking machines?

It is always very sad to finish reading a great book, but Isaacson beautifully ends his with Ada Lovelace’s considerations (from the 19th century!) about the role of computers. “Ada might also be justified in boasting that she was correct, at least thus far, in her more controversial contention that no computer, no matter how powerful, would ever truly be a “thinking” machine. A century after she died, Alan Turing dubbed this “Lady Lovelace’s Objection” and tried to dismiss it by providing an operational definition of a thinking machine. […] But it’s now been more than sixty years, and the machines that attempt to fool people on the test are at best engaging in lame conversation tricks rather than actual thinking. Certainly none has cleared Ada’s higher bar of being able to “originate” any thoughts of its own. […] Artificial intelligence enthusiasts have long been promising, or threatening, that machines like HAL would soon emerge and prove Ada wrong. Such was the promise of the 1956 conference at Dartmouth organized by John McCarthy and Marvin Minsky, where the field of artificial intelligence was launched. The conference concluded that a breakthrough was about twenty years away. It wasn’t.” [Page 468]

Ada, Countess of Lovelace, 1840

John von Neumann realized that the architecture of the human brain is fundamentally different. Digital computers deal in precise units, whereas the brain, to the extent we understand it, is also partly an analog system which deals with a continuum of possibilities, […] not just binary yes-no data but also answers such as “maybe” and “probably” and infinite other nuances, including occasional bafflement. Von Neumann suggested that the future of intelligent computing might require abandoning the purely digital approach and creating “mixed procedures”. [Page 469]

“Artificial Intelligence”

Discussion about artificial intelligence flared up a bit, at least in the popular press, after IBM’s Deep Blue, a chess-playing machine, beat the world champion Garry Kasparov in 1997, and then Watson, its natural-language question-answering computer, won at Jeopardy! But […] these were not breakthroughs of human-like artificial intelligence, as IBM’s CEO was first to admit. Deep Blue won by brute force. […To] one question about the “anatomical oddity” of the former Olympic gymnast George Eyser, Watson answered “What is a leg?” The correct answer was that Eyser was missing a leg. The problem was understanding “oddity”, explained David Ferrucci, who ran the Watson project at IBM. “The computer wouldn’t know that a missing leg is odder than anything else.” […]
“Watson did not understand the questions, nor its answers, nor that some of its answers were right and some wrong, nor that it was playing a game, nor that it won – because it doesn’t understand anything,” according to John Searle [a Berkeley philosophy professor]. “Computers today are brilliant idiots,” said John E. Kelly III, IBM’s director of research. “These recent achievements have, ironically, underscored the limitations of computer science and artificial intelligence,” said Professor Tomaso Poggio, director of the Center for Brains, Minds, and Machines at MIT. “We do not yet understand how the brain gives rise to intelligence, nor do we know how to build machines that are as broadly intelligent as we are.” Ask Google “Can a crocodile play basketball?” and it will have no clue, even though a toddler could tell you, after a bit of giggling.
[Pages 470-71] I tried the question on Google and guess what: it gave me the extract from Isaacson…

The human brain not only combines analog and digital processes, it is also a distributed system, like the Internet, rather than a centralized one. […] It took scientists forty years to map the neurological activity of the one-millimeter-long roundworm, which has 302 neurons and 8,000 synapses. The human brain has 86 billion neurons and up to 150 trillion synapses. […] IBM and Qualcomm each disclosed plans to build “neuromorphic”, or brain-like, computer processors, and a European research consortium called the Human Brain Project announced that it had built a neuromorphic microchip that incorporated “fifty million plastic synapses and 200,000 biologically realistic neuron models on a single 8-inch silicon wafer.” […] These latest advances may even lead to the “Singularity”, a term that von Neumann coined and the futurist Ray Kurzweil and the science fiction writer Vernor Vinge popularized, which is sometimes used to describe the moment when computers are not only smarter than humans but also can design themselves to be even supersmarter, and will thus no longer need us mortals. Isaacson is wiser than I am (as I feel that these ideas are stupid) when he adds: “We can leave the debate to the futurists. Indeed, depending on your definition of consciousness, it may never happen. We can leave “that” debate to the philosophers and theologians. “Human ingenuity,” wrote Leonardo da Vinci, “will never devise any inventions more beautiful, nor more simple, nor more to the purpose than Nature does”. [Pages 472-74]

Computers as a Complement to Humans

Isaacson adds: “There is, however, another possibility, one that Ada Lovelace would like. Machines would not replace humans but would instead become their partners. What humans would bring is originality and creativity” [page 475]. After explaining that in a 2005 chess tournament, “the final winner was not a grandmaster nor a state-of-the-art computer, not even a combination of both, but two American amateurs who used three computers at the same time and knew how to manage the process of collaborating with their machines” (page 476) and that “in order to be useful, the IBM team realized [Watson] needed to interact [with humans] in a manner that made collaboration pleasant” (page 477), Isaacson further speculates:

Let us assume, for example, that a machine someday exhibits all of the mental capabilities of a human: giving the outward appearance of recognizing patterns, perceiving emotions, appreciating beauty, creating art, having desires, forming moral values, and pursuing goals. Such a machine might be able to pass a Turing test. It might even pass what we could call the Ada test, which is that it could appear to “originate” its own thoughts that go beyond what we humans program it to do. There would, however, be still another hurdle before we could say that artificial intelligence has triumphed over augmented intelligence. We call it the Licklider Test. It would go beyond asking whether a machine could replicate all the components of human intelligence to ask whether the machine accomplishes these tasks better when whirring away completely on its own or when working in conjunction with humans. In other words, is it possible that humans and machines working in partnership will be indefinitely more powerful than an artificial intelligence machine working alone? If so, then “man-computer symbiosis,” as Licklider called it, will remain triumphant. Artificial intelligence need not be the holy grail of computing. The goal instead could be to find ways to optimize the collaboration between human and machine capabilities – to forge a partnership in which we let the machines do what they do best, and they let us do what we do best. [Pages 478-79]

Ada’s Poetical Science

At his last such appearance, for the iPad 2 in 2011, Steve Jobs declared: “It’s in Apple’s DNA that technology alone is not enough – that it’s technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing”. The converse to this paean to the humanities, however, is also true. People who love the arts and humanities should endeavor to appreciate the beauties of math and physics, just as Ada did. Otherwise they will be left as bystanders at the intersection of arts and science, where most digital-age creativity will occur. They will surrender control of that territory to the engineers. Many people who celebrate the arts and the humanities, who applaud vigorously the tributes to their importance in our schools, will proclaim without shame (and sometimes even joke) that they don’t understand math or physics. They extoll the virtues of learning Latin, but they are clueless about how to write an algorithm or tell BASIC from C++, Python from Pascal. They consider people who don’t know Hamlet from Macbeth to be Philistines, yet they might merrily admit that they don’t know the difference between a gene and a chromosome, or a transistor and a capacitor, or an integral and a differential equation. These concepts may seem difficult. Yes, but so, too, is Hamlet. And like Hamlet, each of these concepts is beautiful. Like an elegant mathematical equation, they are expressions of the glories of the universe. [Pages 486-87]

The last page of Isaacson’s book presents da Vinci’s Vitruvian Man, 1492

Halt and Catch Fire – the TV series about innovation (without Silicon Valley and start-ups)

I will always remember the day when one of my former bosses told me I should focus on (watching, making) videos rather than (reading, writing) books. I am a book person so I will probably not follow his advice! Still, from time to time, I discover movies about high-tech innovation, entrepreneurship and start-ups.

Halt and Catch Fire is not precisely about start-ups; it is not a documentary, and it is not a movie. It is a TV series that is certainly more serious (and less fun) than HBO’s Silicon Valley. It is an interesting accident that I began watching it while reading Isaacson’s The Innovators. Both talk about the early days of personal computers in a (rather) dramatic manner.

I am still at the beginning of Season 1, so my comments come as much from what I read as from what I saw! Halt and Catch Fire takes place in Texas (not in Silicon Valley), in an established company, Cardiff Electric (not a start-up), where three individuals who should probably never have met – a salesman, an engineer and a geek (not entrepreneurs) – will try to prove to the world that they can change it. So why Texas? According to French Wikipedia: “Season 1 (which takes place in 1983-1984) is inspired by the creation of Compaq, launched in 1982 to develop the first IBM-compatible portable PC. Compaq’s engineers had to reverse engineer the IBM BIOS by disassembling it and then have a compatible version rewritten by people who had never seen the IBM BIOS, in order not to violate copyright.” (My Compaq cap. table below.)


Scoot McNairy as Gordon Clark, Mackenzie Davis as Cameron Howe and Lee Pace as Joe MacMillan – Halt and Catch Fire _ Season 1, Gallery – Photo Credit: James Minchin III/AMC

I should credit Marc Andreessen for helping me discover this new AMC TV series. In a long portrait by the New Yorker, the Netscape founder mentions the series: “He pushed a button to unroll the wall screen, then called up Apple TV. We were going to watch the final two episodes of the first season of the AMC drama “Halt and Catch Fire,” about a fictional company called Cardiff, which enters the personal-computer wars of the early eighties. The show’s resonance for Andreessen was plain. In 1983, he said, “I was twelve, and I didn’t know anything about startups or venture capital, but I knew all the products.” He used the school library’s Radio Shack TRS-80 to build a calculator for math homework.” […] “The best scenes with Cameron were when she was alone in the basement, coding.” I said I felt that she was the least satisfactory character: underwritten, inconsistent, lacking in plausible motivation. He smiled and replied, “Because she’s the future.”

According to Wikipedia’s article about the series, “the show’s title refers to computer machine code instruction HCF, the execution of which would cause the computer’s central processing unit to stop working (“catch fire” was a humorous exaggeration).” If the series is not about entrepreneurship and start-ups so far, it is about rebellion and mutiny. There is a beautiful moment where one of the heroes convinces his two colleagues to follow him when they are about to stop. They are on a quest.


I haven’t seen many movies and videos about my favorite topic so let me try and recapitulate:
– I began with Something Ventured, a documentary about the early days of Silicon Valley entrepreneurs and venture capitalists.
– The Startup Kids is another documentary about young (mostly) web entrepreneurs. Often very moving.
– HBO’s Silicon Valley is funnier than HCF but maybe not as good. Only time will tell.
– I saw The Social Network which seems to remain the best fiction movie about all this, but
– I have not seen the two movies about Steve Jobs. It’s apparently not worth watching Jobs (2013) but I will probably try not to miss Steve Jobs (2015).

So as a conclusion, watch the trailer.

The Compaq Capitalization Table at IPO


Walter Isaacson’s The Innovators (part 4) – Steal… or Share?

How many times will I say what a great book Walter Isaacson’s The Innovators – How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution is? And how many posts will I write about it? This is now the 4th part! Isaacson shows how collaboration in software contributed to unique value creation. This may mean sharing, but also stealing!

Gates complained to the members of the Homebrew Computer Club about this: “Two surprising things are apparent, however, 1) Most of these “users” never bought BASIC (less than 10% of all Altair owners have bought BASIC), and 2) The amount of royalties we have received from sales to hobbyists makes the time spent on Altair BASIC worth less than $2 an hour. Why is this? As the majority of hobbyists must be aware, most of you steal your software. Hardware must be paid for, but software is something to share. Who cares if the people who worked on it get paid? Is this fair? One thing you don’t do by stealing software is get back at MITS for some problem you may have had. MITS doesn’t make money selling software. […] The thing you do is theft. I would appreciate anyone who wants to pay.” [Page 342 and http://www.digibarn.com/collections/newsletters/homebrew/V2_01/gatesletter.html]


But Isaacson adds nuance: “Still there was a certain audacity to the letter. Gates was, after all, a serial stealer of computer time, and he had manipulated passwords to hack into accounts from eighth grade through his sophomore year at Harvard. Indeed, when he claimed in his letter that he and Allen had used more than $40,000 worth of computer time to make BASIC, he omitted the fact that he had never actually paid for that time. […] Also, though Gates did not appreciate it at the time, the widespread pirating of Microsoft BASIC helped his fledgling company in the long run. By spreading so fast, Microsoft BASIC became a standard, and other computer makers had to license it.” [Page 343]

And what about Jobs and Wozniak? Everyone knows how phone phreaks had created a device that emitted just the right tone chirps to fool the Bell System and cadge free long-distance calls. […] “I have never designed a circuit I was prouder of. I still think it was incredible”. They tested it by calling the Vatican, with Wozniak pretending to be Henry Kissinger needing to speak to the pope; it took a while, but the officials at the Vatican finally realized it was a prank before they woke up the pontiff. [Page 346]

Gates, Jobs and the GUI

And the greatest robbery may have been the GUI – the Graphical User Interface. But who stole? Later, when he was challenged about pilfering Xerox’s ideas, Jobs quoted Picasso: “Good artists copy, great artists steal. And we have always been shameless about stealing great ideas. They were copier-heads who had no clue about what a computer could do.” [Page 365]

However, when Microsoft copied Apple for Windows, it was a different story… “In the early 1980s, before the introduction of the Macintosh, Microsoft had a good relationship with Apple. In fact, on the day that IBM launched its PC in August 1981, Gates was visiting Jobs at Apple, which was a regular occurrence since Microsoft was making most of its revenue writing software for the Apple II. Gates was still the supplicant in the relationship. In 1981, Apple had $334 million in revenue, compared to Microsoft’s $15 million. […] Jobs had one major worry about Microsoft: he didn’t want it to copy the graphical user interface. […] His fear that Gates would steal the idea was somewhat ironic, since Jobs himself had filched the concept from Xerox.” [Pages 366-67]

Things would get worse… “Well, Steve, I think there’s more than one way of looking at it. I think it’s more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and I found out that you had already stolen it”. [Page 368]

Stallman, Torvalds, free- and open-source

There would be other disputes. The hacker corps that grew up around GNU [Stallman’s free software] and Linux [Torvalds’ open-source software] showed that emotional incentives, beyond financial rewards, can motivate voluntary collaboration. “Money is not the greatest of motivations,” Torvalds said. “Folks do their best work when they are driven by passion. When they are having fun. This is as true for playwrights and sculptors and entrepreneurs as it is for software engineers.” There is also, intended or not, some self-interest involved. “Hackers are also motivated, in large part, by the esteem they can gain in the eyes of their peers, improve their reputation, elevate their social status. Open source development gives programmers the chance.” Gates’s “Letter to Hobbyists”, complaining about the unauthorized sharing of Microsoft BASIC, asked in a chiding way, “who can afford to do professional work for nothing?”. Torvalds found that an odd outlook. He and Gates were from two very different cultures, the communist-tinged radical academia of Helsinki versus the corporate elite of Seattle. Gates may have ended up with the bigger house, but Torvalds reaped anti-establishment adulation. “Journalists seemed to love the fact that, while Gates lived in a high-tech lakeside mansion, I was tripping over my daughter’s playthings in a three-bedroom ranch house with bad plumbing in boring Santa Clara,” he said with ironic self-awareness. “And that I drove a boring Pontiac. And answered my own phone. Who wouldn’t love me?” [Pages 378-79]

Which does not make open source a friend of free software. The disputes went beyond mere substance and became, in some ways, ideological. Stallman was possessed by a moral clarity and unyielding aura, and he lamented that “anyone encouraging idealism today faces a great obstacle: the prevailing ideology encourages people to dismiss idealism as ‘impractical'”. Torvalds, on the contrary, was unabashedly practical, like an engineer. “I led the pragmatists,” he said. “I have always thought that idealistic people are interesting, but kind of boring and scary.” Torvalds admitted to “not exactly being a huge fan” of Stallman, explaining, “I don’t like single-issue people, nor do I think that people who turn the world into black and white are very nice or ultimately very useful. The fact is, there aren’t just two sides to any issue, there’s almost always a range of responses, and ‘it depends’ is almost always the right answer to any big question.” He also believed it should be permissible to make money from open-source software. “Open-source is about letting everybody play. Why should business, which fuels so much of society’s technological advancement, be excluded?” Software may want to be free, but the people who write it may want to feed their kids and reward their investors. [Page 380]

The Innovators by Walter Isaacson – part 3: (Silicon) Valley

Innovation is about business models – the Atari case

Innovation in (Silicon) Valley: after the chip, innovation saw the arrival of games, software and the Internet. “As they were working on the first Computer Space consoles, Bushnell heard that he had competition. A Stanford grad named Bill Pitts and his buddy Hugh Tuck from California Polytechnic had become addicted to Spacewar, and they decided to use a PDP-11 minicomputer to turn it into an arcade game. […] Bushnell was contemptuous of their plan to spend $20,000 on equipment, including a PDP-11 that would be in another room and connected by yards of cable to the console, and then charge ten cents a game. “I was surprised at how clueless they were about the business model,” he said. “Surprised and relieved. As soon as I saw what they were doing, I knew they’d be no competition”.
Galaxy Game by Pitts and Tuck debuted at Stanford’s Tresidder student union coffeehouse in the fall of 1971. Students gathered around each night like cultists in front of a shrine. But no matter how many lined up their coins to play, there was no way the machine could pay for itself, and the venture eventually folded. “Hugh and I were both engineers and we didn’t pay attention to business issues at all,” conceded Pitts. Innovation can be sparked by engineering talent, but it must be combined with business skills to set the world afire.
Bushnell was able to produce his game, Computer Space, for only $1,000. It made its debut a few weeks after Galaxy Game at the Dutch Goose bar in Menlo Park near Palo Alto and went on to sell a respectable 1,500 units. Bushnell was the consummate entrepreneur: inventive, good at engineering, and savvy about business and consumer demand. He was also a great salesman. […] When he arrived back at Atari’s little rented office in Santa Clara, he described the game to Alcorn [Atari’s co-founder], sketched out some circuits, and asked him to build the arcade version of it. He told Alcorn he had signed a contract with GE to make the game, which was untrue. Like many entrepreneurs, Bushnell had no shame about distorting reality in order to motivate people.”
[Pages 209-211]
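A quick back-of-the-envelope calculation (mine, not the book’s) makes the business-model point concrete: at ten cents a game, the $20,000 of equipment behind Galaxy Game would need $20,000 / $0.10 = 200,000 paid plays just to recover the hardware cost, before counting space, maintenance or anything else, whereas Bushnell’s $1,000 Computer Space console needed only about 10,000 plays to break even.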

“Innovation requires having three things: a great idea, the engineering talent to execute it, and the business savvy (plus deal-making moxie) to turn it into a successful product. Nolan Bushnell scored a trifecta when he was twenty-nine, which is why he, rather than Bill Pitts, Hugh Tuck, Bill Nutting, or Ralph Baer, goes down in history as the innovator who launched the video game industry.” [page 215]

You may also listen to Bushnell directly. This is in Something Ventured; the Atari story runs from 30’07” until 36’35” (you may go to YouTube directly for the right timing).

The debate about intelligence of machines

Chapter 7 is about the beginnings of the Internet. Isaacson addresses a topic which has come back as a hot debate these days: will machines, and the computer in particular, replace humans, with or despite their intelligence, creativity and innovation capabilities? I feel close to Isaacson, whom I quote from page 226: “Licklider sided with Norbert Wiener, whose theory of cybernetics was based on humans and machines working closely together, rather than with their MIT colleagues Marvin Minsky and John McCarthy, whose quest for artificial intelligence involved creating machines that could learn on their own and replace human cognition. As Licklider explained, the sensible goal was to create an environment in which humans and machines “cooperate in making decisions.” In other words, they would augment each other. “Men will set the goals, formulate the hypotheses, determine the criteria, and perform the evaluations. Computing machines will do the routinizable work that must be done to prepare the way for insights and decisions in technical and scientific thinking.”

The Innovator’s dilemma

In the same chapter, which tries to describe who the inventors (more than the innovators) of the Internet were – J.C.R. Licklider, Bob Taylor, Larry Roberts, Paul Baran, Donald Davies, or even Leonard Kleinrock – and why it was invented – an unclear motivation between the military objective of protecting communications in case of a nuclear attack and the civilian one of helping researchers share resources – Isaacson shows once again the challenge of convincing established players.

Baran then collided with one of the realities of innovation, which was that entrenched bureaucracies are resistant to change. […] He tried to convince AT&T to supplement its circuit-switched voice network with a packet-switched data network. “They fought it tooth and nail,” he recalled. “They tried all sorts of things to stop it.” [AT&T went as far as organizing a series of seminars that involved 94 speakers.] “Now do you see why packet switching wouldn’t work?” Baran simply replied, “No”. Once again, AT&T was stymied by the innovator’s dilemma. It balked at considering a whole new type of data network because it was so invested in traditional circuits. [Pages 240-41]

[Davies] came up with a good old English word for them: packets. In trying to convince the General Post Office to adopt the system, Davies ran into the same problem that Baran had knocking at the door of AT&T. But they both found a fan in Washington. Larry Roberts not only embraced their ideas; he also adopted the word packet.

The entrepreneur is a rebel (who loves power)

One hard-core hacker, Steve Dompier, told of going down to Albuquerque in person to pry loose a machine from MITS, which was having trouble fulfilling orders. By the time of the third Homebrew meeting in April 1975, he had made an amusing discovery. He had written a program to sort numbers, and while he was running it, he was listening to a weather broadcast on a low-frequency transistor radio. “The radio started going zip-zzziiip-ZZZIIIPP at different pitches,” and Dompier said to himself, “Well, what do you know! My first peripheral device!” So he experimented. “I tried some other programs to see what they sounded like, and after about eight hours of messing around, I had a program that could produce musical tones and actually make music”. [Page 310]

“Dompier published his musical program in the next issue of the People’s Computer Company, which led to a historically noteworthy response from a mystified reader. “Steven Dompier has an article about the musical program that he wrote for the Altair in the People’s Computer Company,” Bill Gates, a Harvard student on leave writing software for MITS in Albuquerque, wrote in the Altair newsletter. “The article gives a listing of his program and the musical data for ‘The Fool on the Hill’ and ‘Daisy.’ He doesn’t explain why it works and I don’t see why. Does anyone know?” The simple answer was that the computer, as it ran the programs, produced frequency interference that could be controlled by the timing loops and picked up as tone pulses by an AM radio.
By the time his query was published, Gates had been thrown into a more fundamental dispute with the Homebrew Computer Club. It became archetypal of the clash between the commercial ethic that believed in keeping information proprietary, represented by Gates [and Jobs], and the hacker ethic of sharing information freely, represented by the Homebrew crowd [and Wozniak].” [Page 311]
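To make the timing-loop explanation a bit more concrete, here is a minimal sketch in C (my own illustration, not Dompier’s actual Altair program, which was machine code): the inner busy loop sets how fast the processor hammers memory, which shapes the radio-frequency noise the machine radiates and hence the pitch an AM radio picks up, while the outer loop sets each note’s duration. On modern hardware you would most likely hear nothing; the point is only the principle.

#include <stdint.h>

/* "Play" one note by busy-waiting: the inner delay controls the pitch of the
 * radiated interference, the outer count controls how long the note lasts. */
static void play_note(uint32_t pitch_delay, uint32_t duration_cycles) {
    for (uint32_t i = 0; i < duration_cycles; i++) {
        for (volatile uint32_t d = 0; d < pitch_delay; d++) {
            /* the memory traffic generated here is what leaks as RF noise */
        }
    }
}

int main(void) {
    /* hypothetical note table: shorter delays would mean higher pitches */
    const uint32_t melody[] = { 900, 800, 700, 800, 900 };
    for (unsigned i = 0; i < sizeof melody / sizeof melody[0]; i++) {
        play_note(melody[i], 200000);
    }
    return 0;
}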

Isaacson, through his description of Gates and Jobs, explains what an entrepreneur is.

“Yes, Mom, I’m thinking,” he replied. “Have you ever tried thinking?” [P.314] Gates was a serial obsessor. […] he had a confrontational style [… and he] would escalate the insult to be “the stupidest thing I’ve ever heard.” [P.317] Gates pulled a power play that would define his future relationship with Allen. As Gates describes it, “That’s when I say ‘Okay, but I’m going to be in charge. And I’ll get used to being in charge, and it’ll be hard to deal with me from now on unless I’m in charge. If you put me in charge, I’m in charge of this and anything else we do.’ ” [P.323] Like many innovators, Gates was rebellious just for the hell of it. [P.331] “An innovator is probably a fanatic, somebody who loves what they do, works day and night, may ignore normal things to some degree and therefore be viewed as a bit imbalanced. […] Gates was also a rebel with little respect for authority, another trait of innovators. [P.338]

Allen assumed that his partnership with Gates would be fifty-fifty. […] but Gates had insisted on being in charge. “It’s not right for you to get half. […] I think it should be sixty-forty.” […] Worse yet, Gates insisted on revisiting the split two years later. “I deserve more than 60 percent.” His new demand was that the split be 64-36. Born with a risk-taking gene, Gates would cut loose late at night by driving at terrifying speeds up the mountain roads. “I decided it was his way of letting off steam,” Allen said. [P.339]

Gates arrested for speeding, 1977. [P.312]

“There is something indefinable in an entrepreneur, and I saw that in Steve,” Bushnell recalled. “He was interested not just in engineering, but also in the business aspects. I taught him that if you act like you can do something, then it will work. I told him, pretend to be completely in control and people will assume that you are.” [P.348]

The concept of the entrepreneur as a rebel is not new. In 2004, Pitch Johnson, one of the earliest VCs in Silicon Valley, claimed that “Entrepreneurs are the revolutionaries of our time.” Freeman Dyson has written “The Scientist as a Rebel”. And you should read Nicolas Colin’s analysis of entrepreneurial ecosystems: Capital + know-how + rebellion = entrepreneurial economy. Yes, rebels who love power…

The Innovators by Walter Isaacson – part 2: Silicon (Valley)

What I am reading now, following my recent post The Complexity and Beauty of Innovation according to Walter Isaacson, is probably much better known: innovation in Silicon Valley in the days of silicon – Fairchild, Intel and the other Fairchildren. I have my own archive of nice posters from those days: one about the start-up / entrepreneur genealogy, with a zoom on Fairchild and one on Intel, and one about the investor genealogy.

Entrepreneurs…




“There were internal problems in Palo Alto. Engineers began defecting, thus seeding the valley with what became known as Fairchildren: companies that sprouted from spores emanating from Fairchild.” [Page 184] “The valley’s main artery, a bustling highway named El Camino Real, was once the royal road that connected California’s twenty-one mission churches. By the early 1970s – thanks to Hewlett-Packard, Fred Terman’s Stanford Industrial Park, William Shockley, Fairchild and its Fairchildren – it connected a bustling corridor of tech companies. In 1971, the region got a new moniker. Don Hoefler, a columnist for the weekly trade paper Electronic News, began writing a series of columns entitled “Silicon Valley USA,” and the name stuck.” [Page 198]

… and Investors



“In the eleven years since he had assembled the deal for the traitorous eight to form Fairchild Semiconductor, Arthur Rock had helped to build something that was destined to be almost as important to the digital age as the microchip: venture capital.” [Page 185] “When he had sought a home for the traitorous eight in 1957, he pulled out a single piece of legal-pad paper, wrote a numbered list of names, and methodically phoned each one, crossing off the names as he went down the list. Eleven years later, he took another sheet of paper and listed people who would be invited to invest and how many of the 500,000 shares available at $5 apiece he would offer to each. […] It took them less than two days to raise the money. […] All I had to tell people was that it was Noyce and Moore. They didn’t need to know much else.” [Pages 187-88]


The Intel culture

“There arose at Intel an innovation that had almost as much of an impact on the digital age as any [other]. It was the invention of a corporate culture and management style that was the antithesis of the hierarchical organization of East Coast companies.” [Page 189] “The Intel culture, which would permeate the culture of Silicon Valley, was a product of all three men [Noyce, Moore and Grove]. […] It was devoid of the trappings of hierarchy. There were no reserved parking places. Everyone, including Noyce and Moore, worked in similar cubicles. […] “There were no privileges anywhere,” recalled Ann Bowers, who was the personnel director and later married Noyce [she would then become Steve Jobs’ first director of human resources]. “We started a form of company culture that was completely different than anything that had been before. It was a culture of meritocracy.”
It was also a culture of innovation. Noyce had a theory that he developed after bridling at the rigid hierarchy at Philco. The more open and unstructured a workplace, he believed, the faster new ideas would be sparked, disseminated, refined and applied.” [Pages 192-193]

The Complexity and Beauty of Innovation according to Walter Isaacson

The Innovators by Walter Isaacson is a great book because of its balanced description of the role of geniuses or disruptive innovators as much as of teamwork in incremental innovation. “The tale of their teamwork is important because we don’t often focus on how central their skill is to innovation. […] But we have far fewer tales of collaborative creativity, which is actually more important in understanding how today’s technology evolution was fashioned.” [Page 1] He also goes deeper: “I also explore the social and cultural forces that provide the atmosphere for innovation. For the birth of the digital age, this included a research ecosystem that was nurtured by government spending and managed by a military-industrial collaboration. Intersecting with that was a loose alliance of community organizers, communal-minded hippies, do-it-yourself hobbyists, and homebrew hackers, most of whom were suspicious of centralized authority.” [Page 2] “Finally, I was struck by how the truest creativity of the digital age came from those who were able to connect the arts and sciences.” [Page 5]


The computer

I was a little more cautious with chapter 2 as I have the feeling that the story of Ada Lovelace and Charles Babbage is well known. I may be wrong. But chapter 3 about the early days of the computer was mostly unknown to me. Who invented the computer? Probably many different people in different locations in the US, the UK and Germany, around WWII. “How did they develop this idea at the same time when war kept their two teams isolated? The answer is partly that advances in technology and theory made the moment ripe. Along with many innovators, Zuse and Stibitz were familiar with the use of relays in phone circuits, and it made sense to tie that to binary operations of math and logic. Likewise, Shannon, who was also very familiar with phone circuits, would be able to perform the logical tasks of Boolean algebra. The idea that digital circuits would be the key to computing was quickly becoming clear to researchers almost everywhere, even in isolated places like central Iowa.” [Page 54]
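To make the link between relays and “binary operations of math and logic” concrete, here is a minimal sketch (my own illustration in C, not anything from the book): treat a closed relay contact as 1 and an open one as 0; relays wired in series then behave like AND, relays in parallel like OR, a normally-closed contact like NOT, and a one-bit adder falls out of a handful of such “gates” – which is essentially Shannon’s insight that relay circuits can carry out Boolean algebra.

#include <stdio.h>
#include <stdbool.h>

/* A relay contact modeled as a boolean: closed = 1, open = 0. */
static bool series(bool a, bool b)   { return a && b; }  /* two relays in series   -> AND */
static bool parallel(bool a, bool b) { return a || b; }  /* two relays in parallel -> OR  */
static bool inverted(bool a)         { return !a; }      /* normally-closed contact -> NOT */

/* A one-bit half adder built only from the "relay" gates above. */
static void half_adder(bool a, bool b, bool *sum, bool *carry) {
    *sum   = series(parallel(a, b), inverted(series(a, b)));  /* XOR = (a OR b) AND NOT(a AND b) */
    *carry = series(a, b);
}

int main(void) {
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++) {
            bool s, c;
            half_adder(a, b, &s, &c);
            printf("%d + %d = carry %d, sum %d\n", a, b, c, s);
        }
    return 0;
}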

There would be a patent fight I did not know about. Read pages 82-84. You can also read the following on Wikipedia: “On June 26, 1947, J. Presper Eckert and John Mauchly were the first to file for patent on a digital computing device (ENIAC), much to the surprise of Atanasoff. The ABC [Atanasoff–Berry Computer] had been examined by John Mauchly in June 1941, and Isaac Auerbach, a former student of Mauchly’s, alleged that it influenced his later work on ENIAC, although Mauchly denied this. The ENIAC patent did not issue until 1964, and by 1967 Honeywell sued Sperry Rand in an attempt to break the ENIAC patents, arguing the ABC constituted prior art. The United States District Court for the District of Minnesota released its judgement on October 19, 1973, finding in Honeywell v. Sperry Rand that the ENIAC patent was a derivative of John Atanasoff’s invention.” [The trial had begun in June 1971 and the ENIAC patent was therefore made invalid]

I also liked his short comment about complementary skills. “Eckert and Mauchly served as counterbalances for each other, which made them typical of so many digital-age leadership duos. Eckert drove people with a passion for precision; Mauchly tended to calm them and make them feel loved.” [Pages 74-75]

Women in Technology and Science

It is in chapter 4, about Programming, that Isaacson addresses the role of women. “[Grace Hopper’s] education wasn’t as unusual as you might think. She was the eleventh woman to get a math doctorate from Yale, the first being in 1895. It was not at all uncommon for a woman, especially from a successful family, to get a doctorate in math in the 1930s. In fact, it was more common than it would be a generation later. The number of American women who got doctorates in math during the 1930s was 133, which was 15 percent of the total number of American math doctorates. During the decade of the 1950s, only 106 American women got math doctorates, which was a mere 4 percent of the total. (By the first decade of the 2000s, things had more than rebounded and there were 1,600 women who got math doctorates, 30 percent of the total.)” [Page 88]

Not surprisingly, in the early days of computer development, men worked more in hardware whereas women would be in software. “All the engineers who built ENIAC’s hardware were men. Less heralded by history was a group of women, six in particular, who turned out to be almost as important in the development of modern computing.” [Page 95] “Shortly before she died in 2011, Jean Jennings Bartik reflected proudly on the fact that all the programmers who created the first general-purpose computer were women. “Despite our coming of age in an era when women’s career opportunities were generally quite confined, we helped initiate the era of the computer.” It happened because a lot of women back then had studied math and their skills were in demand. There was also an irony involved: the boys with their toys thought that assembling the hardware was the most important task, and thus a man’s job. “American science and engineering was even more sexist than it is today,” Jennings said. “If the ENIAC’s administration had known how crucial programming would be to the functioning of the electronic computer and how complex it would prove to be, they might have been more hesitant to give such an important role to women.” [Pages 99-100]

The sources of innovation

“Hopper’s historical sections focused on personalities. In doing so, her book emphasized the role of individuals. In contrast, shortly after Hopper’s book was completed, the executives at IBM commissioned their own history of the Mark I that gave primary credit to the IBM teams in Endicott, New York, who had constructed the machine. “IBM interests were best served by replacing individual history with organizational history,” the historian Kurt Beyer wrote in a study of Hopper. “The locus of technological innovation, according to IBM, was the corporation. The myth of the lone radical inventor working in the laboratory or basement was replaced by the reality of teams of faceless organizational engineers contributing incremental advancements.” In the IBM version of history, the Mark I contained a long list of small innovations, such as the ratchet-type counter and the double-checked card feed, that IBM’s book attributed to a bevy of little-known engineers who worked collaboratively in Endicott.
The difference between Hopper’s version of history and IBM’s ran deeper than a dispute over who should get the most credit. It showed fundamentally contrasting outlooks on the history of innovations. Some studies of technology and science emphasize, as Hopper did, the role of creative inventors who make innovative leaps. Other studies emphasize the role of teams and institutions, such as the collaborative work done at Bell Labs and IBM’s Endicott facility. This latter approach tries to show that what may seem like creative leaps – the Eureka moment – are actually the result of an evolutionary process that occurs when ideas, concepts, technologies, and engineering methods ripen together. Neither way of looking at technological advancement is, on its own, completely satisfying. Most of the great innovations of the digital age sprang from an interplay of creative individuals (Mauchly, Turing, von Neumann, Aiken) with teams that knew how to implement their ideas.”
[Pages 91-92]

Google about Disruptive and Incremental Innovation

This is very similar to what I read about Google and posted recently in The Importance and Difficulty of Culture in Start-ups: Google again…: “To us, innovation entails both the production and implementation of novel and useful ideas. Since “novel” is often just a fancy synonym for “new”, we should also clarify that for something to be innovative, it needs to offer new functionality, but it also has to be surprising. If your customers are asking for it, you aren’t being innovative when you give them what they want; you are just being responsive. That’s a good thing, but it’s not innovative. Finally, “useful” is a rather underwhelming adjective to describe that innovation hottie, so let’s add an adverb and make it radically useful. Voilà: For something to be innovative, it needs to be new, surprising, and radically useful.” […] “But Google also releases over five hundred improvements to its search every year. Is that innovative? Or incremental? They are new and surprising, for sure, but while each one of them, by itself, is useful, it may be a stretch to call it radically useful. Put them all together, though, and they are. […] This more inclusive definition – innovation isn’t just about the really new, really big things – matters because it affords everyone the opportunity to innovate, rather than keeping it to the exclusive realm of these few people in that off-campus building [Google[x]] whose job is to innovate.” [How Google Works – Page 206]

Maybe more about The Innovators soon…