The Innovators: How a Group of Inventors, Hackers, Geniuses, and Geeks Created the Digital Revolution, by Walter Isaacson


Bush’s machine, however, was not destined to be a major advance in computing history because it was an analog device. In fact, it turned out to be the last gasp for analog computing, at least for many decades.

New approaches, technologies, and theories began to emerge in 1937, exactly a hundred years after Babbage first published his paper on the Analytical Engine. It would become an annus mirabilis of the computer age, and the result would be the triumph of four properties, somewhat interrelated, that would define modern computing:

DIGITAL. A fundamental trait of the computer revolution was that it was based on digital, not analog, computers. This occurred for many reasons, as we shall soon see, including simultaneous advances in logic theory, circuits, and electronic on-off switches that made a digital rather than an analog approach more fruitful. It would not be until the 2010s that computer scientists, seeking to mimic the human brain, would seriously begin working on ways to revive analog computing.

BINARY. Not only would modern computers be digital, but the digital system they would adopt would be binary, or base-2, meaning that it employs just 0s and 1s rather than all ten digits of our everyday decimal system. Like many mathematical concepts, binary theory was pioneered by Leibniz in the late seventeenth century. During the 1940s, it became increasingly clear that the binary system worked better than other digital forms, including the decimal system, for performing logical operations using circuits composed of on-off switches.
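The fit between base-2 and on-off switches can be made concrete. A minimal sketch (plain Python; the function names are illustrative, not from the book): repeated division by 2 decomposes a decimal number into 0s and 1s, and a logical AND on single bits is exactly what two switches wired in series compute.

```python
def to_binary(n):
    """Repeatedly divide by 2; the remainders are the base-2 digits."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))
        n //= 2
    return "".join(reversed(bits))

def and_gate(a, b):
    """Output 1 only if both inputs are 1 -- two on-off switches in series."""
    return a & b

# 13 = 8 + 4 + 1, so its base-2 form is 1101
print(to_binary(13))          # 1101
print(and_gate(1, 1))         # 1
print(and_gate(1, 0))         # 0
```

Each digit needs only two states, which is why a circuit of simple switches can carry out logical operations that would be far clumsier to wire up in base-10.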

ELECTRONIC. In the mid-1930s, the British engineer Tommy Flowers pioneered the use of vacuum tubes as on-off switches in electronic circuits. Until then, circuits had relied on mechanical and electromechanical switches, such as the clacking electromagnetic relays that were used by phone companies. Vacuum tubes had mainly been employed to amplify signals rather than as on-off switches. By using electronic components such as vacuum tubes, and later transistors and microchips, computers could operate thousands of times faster than machines that had moving electromechanical switches.

GENERAL PURPOSE. Finally, the machines would eventually have the ability to be programmed and reprogrammed—and even reprogram themselves—for a variety of purposes. They would be able to solve not just one form of mathematical calculation, such as differential equations, but could handle a multiplicity of tasks and symbol manipulations, involving words and music and pictures as well as numbers, thus fulfilling the potential that Lady Lovelace had celebrated when describing Babbage’s Analytical Engine.

Innovation occurs when ripe seeds fall on fertile ground. Instead of having a single cause, the great advances of 1937 came from a combination of capabilities, ideas, and needs that coincided in multiple places. As often happens in the annals of invention, especially information technology invention, the time was right and the atmosphere was charged. The development of vacuum tubes for the radio industry paved the way for the creation of electronic digital circuits. That was accompanied by theoretical advances in logic that made circuits more useful. And the march was quickened by the drums of war. As nations began arming for the looming conflict, it became clear that computational power was as important as firepower. Advances fed on one another, occurring almost simultaneously and spontaneously, at Harvard and MIT and Princeton and Bell Labs and an apartment in Berlin and even, most improbably but interestingly, in a basement in Ames, Iowa.

Underpinning all of these advances were some beautiful—Ada might call them poetic—leaps of mathematics. One of these leaps led to the formal concept of a “universal computer,” a general-purpose machine that could be programmed to perform any logical task and simulate the behavior of any other logical machine. It was conjured up as a thought experiment by a brilliant English mathematician with a life story that was both inspiring and tragic.

ALAN TURING

Alan Turing had the cold upbringing of a child born on the fraying fringe of the British gentry.1 His family had been graced since 1638 with a baronetcy, which had meandered down the lineage to one of his nephews. But for the younger sons on the family tree, which Turing and his father and grandfather were, there was no land and little wealth. Most went into fields such as the clergy, like Alan’s grandfather, and the colonial civil service, like his father, who served as a minor administrator in remote regions of India. Alan was conceived in Chhatrapur, India, and born on June 23, 1912, in London, while his parents were on home leave. When he was only one, his parents went back to India for a few years, and handed him and his older brother off to a retired army colonel and his wife to be raised in a seaside town on the south coast of England. “I am no child psychologist,” his brother, John, later noted, “but I am assured that it is a bad thing for an infant in arms to be uprooted and put into a strange environment.”2

When his mother returned, Alan lived with her for a few years and then, at age thirteen, was sent to boarding school. He rode there on his bicycle, taking two days to cover more than sixty miles, alone. There was a lonely intensity to him, reflected in his love of long-distance running and biking. He also had a trait, so common among innovators, that was charmingly described by his biographer Andrew Hodges: “Alan was slow to learn that indistinct line that separated initiative from disobedience.”3

In a poignant memoir, his mother described the son whom she doted upon:

Alan was broad, strongly built and tall, with a square, determined jaw and unruly brown hair. His deep-set, clear blue eyes were his most remarkable feature. The short, slightly retroussé nose and humorous lines of his mouth gave him a youthful—sometimes a childlike—appearance. So much so that in his late thirties he was still at times mistaken for an undergraduate. In dress and habits he tended to be slovenly. His hair was usually too long, with an overhanging lock which he would toss back with a jerk of his head. . . . He could be abstracted and dreamy, absorbed in his own thoughts which on occasion made him seem unsociable. . . . There were times when his shyness led him into extreme gaucherie. . . . Indeed he surmised that the seclusion of a mediaeval monastery would have suited him very well.4

At the boarding school, Sherborne, he realized that he was homosexual. He became infatuated with a fair-haired, slender schoolmate, Christopher Morcom, with whom he studied math and discussed philosophy. But in the winter before he was to graduate, Morcom suddenly died of tuberculosis. Turing would later write Morcom’s mother, “I simply worshipped the ground he trod on—a thing which I did not make much attempt to disguise, I am sorry to say.”5 In a letter to his own mother, Turing seemed to take refuge in his faith: “I feel that I shall meet Morcom again somewhere and that there will be work for us to do together there as I believed there was for us to do here. Now that I am left to do it alone, I must not let him down. If I succeed I shall be more fit to join his company than I am now.” But the tragedy ended up eroding Turing’s religious faith. It also turned him even more inward, and he never again found it easy to forge intimate relationships. His housemaster reported to his parents at Easter 1927, “Undeniably he’s not a ‘normal’ boy; not the worse for that, but probably less happy.”6

In his final year at Sherborne, Turing won a scholarship to attend King’s College, Cambridge, where he went in 1931 to read mathematics. One of three books he bought with some prize money was The Mathematical Foundations of Quantum Mechanics, by John von Neumann, a fascinating Hungarian-born mathematician who, as a pioneer of computer design, would have a continuing influence on his life. Turing was particularly interested in the math at the core of quantum physics, which describes how events at the subatomic level are governed by statistical probabilities rather than laws that determine things with certainty. He believed (at least while he was young) that this uncertainty and indeterminacy at the subatomic level permitted humans to exercise free will—a trait that, if true, would seem to distinguish them from machines. In other words, because events at the subatomic level are not predetermined, that opens the way for our thoughts and actions not to be predetermined. As he explained in a letter to Morcom’s mother:
