Immediately following the Second World War, electrical engineers grappled with a fundamental but open question: How should electronic digital computers be built? What kind of switch would serve best for logic circuits? And what should be used for main memory?
Among several options, they quickly settled on the speedy vacuum tube as the basic logic switch, with each machine requiring thousands of them. (The transistor wasn’t yet a serious candidate, having just emerged from Bell Telephone Laboratories.) The options for main memory in the earliest systems were also diverse: specialized cathode-ray tubes, mercury-filled pipes, and spinning drums covered with magnetic paint. But in the early 1950s, the technical community began to converge on another memory technology—magnetic cores. These small rings of ferromagnetic material each held a single bit of data when magnetized in one direction or the other.
Through the mid-1950s, “big iron” mainframes containing vacuum-tube logic and magnetic-core memory dominated the budding world of electronic digital computers. In time, tubes gave way to transistors, and discrete transistors to silicon integrated circuits, for both logic and main memory. But this progression was not inevitable. In the 1950s and early 1960s, groups of engineers actively explored radically different paths for the digital computer.
One of the most original of these explorers was Dudley Allen Buck, who worked at MIT from 1950 until his sudden death in 1959 at age 32. Buck made important early contributions to the development of microcircuitry—the pursuit of highly miniaturized circuits fabricated as integral wholes rather than from discrete components wired together. What’s more, Buck invented the “cryotron,” a superconducting switch he hoped would become the fundamental building block for future digital computers. Inspired by Buck’s vision, GE, IBM, RCA, and the U.S. military all mounted major cryotron-research programs in the late 1950s and early 1960s before shifting their focus to silicon microchips for computer logic and memory.
Buck’s vision outlived him. It survives even today: The cryotron is at the root of efforts at IBM and elsewhere to make superconducting quantum bits—qubits—in pursuit of quantum computing.
Despite the decades of work the cryotron sparked, both the device and its inventor have faded from memory. Most electrical engineers today know nothing about this technology. So let me offer here a sketch of Buck’s work and his now-forgotten cryotron computer.
After graduating from the University of Washington in 1948 with a bachelor’s degree in electrical engineering, Dudley Buck joined the Navy’s cryptologic organization in Washington, D.C., where he worked with early digital computers. In 1950, he moved to MIT and began graduate studies in electrical engineering under physicist Arthur von Hippel. Buck also became a research assistant on MIT’s pioneering Whirlwind computer, a behemoth intended for military use.
Jay Last, who was a fellow graduate student with Buck at MIT and who later led the team that created the first planar silicon integrated circuit at Fairchild Semiconductor, recalls him as being both a “great visionary” and a “good person…close to obnoxiously good.” A clue as to why Last got this impression can be found in a letter that Buck wrote in 1954, when he was just 27: “I have a foster son, aged 17 who has been with me for 4 years, have been a Scoutmaster for 6 years, and I am a lay speaker in the Methodist Church, where I occasionally fill the Sunday morning pulpit. I enjoy working with the human values as well as the engineering values.”
Despite the many commitments of a rich personal life, Buck poured enormous energy into his research at MIT. Early on, as part of the Whirlwind effort, he investigated various materials for making magnetic cores. He also searched for materials with dramatic physical properties that could be useful as the basis for improved switches from which to make advanced digital computers.
In 1952, Buck’s attention alighted on the chemical element bismuth, which exhibits strong magnetoresistance: Its electrical resistivity rises dramatically in response to an applied magnetic field, especially at low temperatures. At the boiling point of liquid helium (4.2 kelvins), the electrical resistance of bismuth varies by a factor of tens of millions with the application of a strong magnetic field. Buck thought this behavior could be useful for building computers. A relatively small current in a control wire, and the magnetic field it produces, could bring about an enormous change in the resistivity of a piece of bismuth, abruptly halting or allowing current to travel through it. He would have an electronic switch.
By 1954, Buck began to focus on an even more extreme quirk of electromagnetism found at the low temperatures of liquid helium: superconductivity. The phenomenon, while peculiar, was well established. Since the early 20th century, physicists had known that when cooled to temperatures around the boiling point of liquid helium, various metals lose their electrical resistance entirely.
Superconductivity also has a magnetic aspect, known as the Meissner effect. A piece of superconducting material excludes magnetic fields—but only up to a point. If a sufficiently large magnetic field is applied, the material is driven, nearly instantaneously, into the resistive state. If the magnetic field is removed, the material returns to the superconducting state.
Buck saw in this phenomenon the possibility for a new, singular building block for electronic digital computers: a magnetically controlled, superconductive switch. He thought it might beat both tubes and cores. A superconductive switch could be very small and fast and consume very little power.
Buck dubbed his invention the cryotron, using a futuristic, quintessentially 1950s evocation of cryo (Greek for “icy cold”) in a play on the word electronics. But he didn’t just conceive and name it. He immediately built and tested dozens of prototypes.
Buck’s first cryotrons were fantastically simple. They involved nothing more than a short length of tantalum wire around which he wound some copper wire in a tight helix. He then attached electrical leads to both ends of the tantalum and copper wires, so that the cryotron could be dipped into a container of liquid helium while still connected to external circuitry.
By sending a current through the copper helix, thereby creating a magnetic field, Buck could drive the tantalum wire from superconductivity to resistivity. What’s more, his prototypes showed gain. That is, a small current in the copper winding could control a much larger current in the tantalum wire. Like triode vacuum tubes and transistors, Buck’s cryotron could act as a digital computer’s logic switch.
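The switching behavior described above can be captured in a toy model. The sketch below is my own illustration, not Buck's analysis; the critical field, winding density, and currents are assumed round numbers chosen only to show the mechanism, in which the field of the copper control helix drives the tantalum gate wire from the superconducting to the resistive state.

```python
import math

# Toy model of Buck's wire-wound cryotron (illustrative values, not
# Buck's measurements). The tantalum "gate" wire carries no resistance
# until the field from the copper control winding exceeds tantalum's
# critical field, at which point the gate turns resistive.

TA_CRITICAL_FIELD = 0.008        # tesla; assumed order of magnitude near 4.2 K
TURNS_PER_METER = 20_000         # winding density of the copper helix (assumed)
MU_0 = 4e-7 * math.pi            # vacuum permeability, T*m/A

def solenoid_field(control_current_amps: float) -> float:
    """Field inside the copper helix: B = mu_0 * n * I."""
    return MU_0 * TURNS_PER_METER * control_current_amps

def gate_is_superconducting(control_current_amps: float) -> bool:
    """The tantalum gate stays superconducting below the critical field."""
    return solenoid_field(control_current_amps) < TA_CRITICAL_FIELD

# Gain: with these assumed values, a control current of roughly 0.3 A
# quenches superconductivity, so a small control current can gate a
# much larger current flowing through the tantalum wire itself.
print(gate_is_superconducting(0.1))  # True: gate passes current freely
print(gate_is_superconducting(0.5))  # False: gate driven resistive
```

The gain comes from the asymmetry the model makes explicit: the control winding only has to supply a threshold field, while the gate wire can carry a far larger current with zero dissipation until that threshold is crossed.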
Buck was seized by the promise of his new superconducting device. He imagined making large arrays of cryotrons using the printed-circuitry techniques that he had contemplated in his master’s thesis. From these, or even from his wire-wound cryotrons, Buck believed an all-cryotron digital computer could be built, with cryotrons serving for logic and memory alike. But he was concerned about the switching speed of his prototypes, which were disappointingly slow—barely better than electromechanical relays.
In a quest for better performance, Buck tried many different materials. A combination of lead wound with niobium, for instance, offered a switching time of 5 microseconds—not bad, but still much slower than the speediest transistors of the era, which switched 100 times as fast. But Buck believed that by reducing their physical dimensions, he could build cryotrons that matched even the best transistors.
In the meanwhile, wiring together several of his hand-wound cryotrons, Buck successfully fabricated a logic gate, a flip-flop, and a fan-out amplifier. He thus created all the basic circuits required for digital computer memory and logic using cryotrons alone. The all-cryotron superconducting computer was not an idle dream.
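Cryotron logic worked by current steering: a supply current shared between two superconducting branches flows entirely through whichever branch remains superconducting, so driving one branch's gate resistive "writes" a binary value. The sketch below is a simplified framing of that idea, not a transcription of Buck's actual circuit diagrams; the function names and the complementary-control arrangement are my own.

```python
# Current-steering sketch of cryotron logic (a simplified model, not
# Buck's published circuits). Two superconducting branches share one
# supply current; making a branch's gate resistive steers the entire
# current into the other, still-superconducting branch.

def steer(control_a: bool, control_b: bool) -> str:
    """Return which branch carries the supply current.
    A True control means that branch's gate is driven resistive."""
    if control_a and not control_b:
        return "b"   # branch a is resistive, current flows in b
    if control_b and not control_a:
        return "a"   # branch b is resistive, current flows in a
    # Neither (or both) controls driven: the circuit is operated so
    # that this ambiguous state never arises.
    return "undefined"

def invert(x: bool) -> bool:
    # An inverter built from complementary controls: an input current on
    # branch a's control steers the supply away from branch a, so the
    # output (current present in branch a) is the logical NOT of x.
    return steer(control_a=x, control_b=not x) == "a"

print(invert(False))  # True
print(invert(True))   # False
```

Because the output of one steering stage is itself a current, it can drive the control winding of the next stage, which is what let Buck chain cryotrons into gates, flip-flops, and amplifiers.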
At this point, Buck’s research program, and his ambitions for it, expanded dramatically. He believed that by using microminiaturization techniques he’d be able to fashion a computer containing tens of thousands of cryotrons. Buck’s computer would have roughly the computing power of Whirlwind, which at the time was among the world’s most advanced digital computers yet required many rooms packed full of electronic equipment and consumed 150 kilowatts of electricity.
The 28-year-old engineer was in essence proposing to squeeze Whirlwind down to the size of a radio set, submerge it in a tub of liquid helium, and run it using no more power than what a Christmas-tree bulb consumes. His vision was audacious, but his arguments, enthusiasm, and results convinced his colleagues that the cryotron had merit.
Cryotron research now became Buck’s official job at MIT’s Lincoln Laboratory. While he continued to work on smaller, faster, lower-power cryotrons, he simultaneously began a project to create a large computer memory, for which the slow switching speeds of existing cryotrons would not matter.
Buck proposed using 75 000 cryotrons to form what is known today as a content-addressable memory. Buck himself would come to refer to it as a “recognition unit.” That’s because each of the many memory locations was simultaneously checked to see whether it contained a desired piece of information.
Such a memory had particular advantages for cryptanalysis, in which the identification of patterns is often paramount. I suspect that Buck’s motivation for building it stemmed from his earlier work on code-breaking machines for the Navy and from his ongoing consulting work while at MIT for the newly minted National Security Agency (NSA). In any event, his all-cryotron recognition unit would be only about as large as a briefcase, and yet at 3.2 kilobytes, it would roughly match the main magnetic-core memory of Whirlwind.
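The defining behavior of a content-addressable memory is the reverse of an ordinary one: instead of supplying an address and reading back a word, you supply a word and read back every address that holds it. The sketch below simulates that behavior in software; the class name, word size, and sequential loop are my own illustrative choices, since the cryotron hardware would have compared all locations simultaneously.

```python
# Sketch of a content-addressable ("recognition") memory of the kind
# Buck proposed. A query asks every location at once, "do you hold this
# pattern?" and gets back the matching addresses. Real cryotron hardware
# would perform all comparisons in parallel; this simulation merely
# reproduces the input/output behavior.

from typing import List

class RecognitionUnit:
    def __init__(self, num_words: int, word_bits: int = 16):
        self.word_bits = word_bits
        self.cells = [0] * num_words

    def write(self, address: int, word: int) -> None:
        # Store a word, truncated to the memory's word width.
        self.cells[address] = word & ((1 << self.word_bits) - 1)

    def match(self, pattern: int) -> List[int]:
        """Return every address whose contents equal the pattern."""
        return [addr for addr, word in enumerate(self.cells) if word == pattern]

mem = RecognitionUnit(num_words=8)
mem.write(2, 0xBEEF)
mem.write(5, 0xBEEF)
mem.write(6, 0x1234)
print(mem.match(0xBEEF))  # [2, 5]
```

For a cryptanalyst scanning for a recurring pattern, this turns a search over the whole memory into a single query, which is why such a unit appealed to the NSA.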
As Buck prepared a patent application on the cryotron in mid-1955, news of his effort to build a content-addressable memory percolated through U.S. cryptological and computing circles, generating considerable interest. In July of that year, John McPherson, an IBM vice president who was a leader in the firm’s efforts in electronic computing, wrote to Buck, explaining that William Friedman, the chief cryptologist of the NSA, was “very interested” in Buck’s superconducting computer components.
Buck filed his patent application just days after receiving McPherson’s letter. The patent, for a “Magnetically Controlled Gating Element,” contained broad claims for the cryotron and its use in computers.
At this point, Buck’s cryotron research expanded beyond MIT, albeit just down Memorial Drive. He signed a consulting agreement on cryotron technology with the contract-research firm Arthur D. Little. Named after its MIT chemist founder, A.D. Little was adjacent to the MIT campus and in the 1950s had become a leading maker of cryostats for producing liquid helium. With NSA sponsorship, Buck and researchers at A.D. Little began development of Buck’s cryotron recognition unit, starting with a smaller proof-of-concept memory array.
Through the rest of 1955, Buck’s personal cryotron research at MIT focused on creating miniature cryotrons, and even integrated cryotron arrays, using evaporated thin films. Instead of winding small wires about one another, he wanted to evaporate metal through a mask, like a stencil, onto a substrate to create a patterned thin film of superconducting material. Atop this film, he would evaporate the control wiring through another mask. He would thus be able to print cryotrons, arrays of them.
In preparation, Buck tested a variety of films made of alloys of lead, bismuth, strontium, indium, and other elements. During these experiments, he produced a 100-nanometer-thick film of a lead-bismuth-strontium alloy that could switch between superconducting and resistive states in 0.1 microsecond—close to the speed of the fastest transistors of the time. Buck also designed a wide range of binary circuits that could be constructed solely from cryotrons, including flip-flops, gates, multivibrators, adders, and accumulators.
With his patent filed and significant research completed, Buck was ready to announce the cryotron to the world. He submitted a paper titled “The Cryotron—A Superconductive Computer Component” to the Institute of Radio Engineers, one of IEEE’s predecessor organizations, in November 1955. In the paper, Buck detailed the wire-wound cryotron and a range of basic digital circuits that could be made with it, stressing the implications of this superconductive device for computing. “The cryotron in its present state of development…can be used as an active element in logical circuits,” Buck wrote. He did not restrain himself from sharing his conviction that, in the near term, “a large-scale digital computer can be made to occupy one cubic foot.… The power required by such a machine extrapolates to about one-half watt.”
Buck’s discussion of switching speeds in this paper was, in contrast, coy: “The device is at present somewhat faster than electromechanical relays, but far slower than vacuum tubes and transistors. A program is under way to increase the speed.” Although he had already tested thin-film cryotrons that could come close to the fastest transistors, Buck kept news of this development and the ongoing work on the cryotron recognition unit to himself.
By the time Buck’s article appeared in the Proceedings of the IRE (April 1956), he was regularly creating and testing thin-film cryotrons. The work at A.D. Little on the proof-of-concept cryotron memory unit was under way, and NSA engineer Albert Slade had begun his own investigations of cryotron circuitry with advice from Buck.
Around this time, Buck submitted to von Hippel his ideas for a doctoral thesis. The outline came as no surprise: Buck would investigate evaporated thin films of superconducting materials and study ways of controlling their thickness and geometry to create fast-switching cryotrons. Von Hippel signed off on the proposal, which promised to yield exciting results promptly.
Until his death in 1959, Buck was at the center of expanding and intensifying efforts to develop integrated cryotron microcircuits. Albert Slade, for example, moved from the NSA to A.D. Little to work on the cryotron recognition unit. Another NSA researcher, Horace Tharp Mann, began studying evaporated thin-film cryotrons in consultation with Buck. In 1957, IBM and RCA each initiated their own NSA-funded programs to develop high-speed thin-film cryotron circuitry. General Electric added to the momentum with a self-funded program of cryotron research. Buck, still the MIT graduate student, now had some stiff competition.
In characteristic fashion, Buck responded by setting even higher goals for his research. He did that in a collaboration with Kenneth R. Shoulders, who, like Buck, was a member of von Hippel’s MIT laboratory. Shoulders was actively pursuing a different innovation: using electron beams to “micro-machine,” or etch, extremely small microcircuits. This approach, later dubbed electron-beam lithography, has become indispensable in making silicon microchips. In the mid-1950s, Shoulders was aiming to construct electronic devices with features as small as 100 nm—smaller than a virus and orders of magnitude smaller than anything anyone had ever attempted to make. Shoulders’s ambitions aligned precisely with Buck’s desire to increase the speed of cryotrons through miniaturization and to create large-scale integrated arrays of them.
In their work together, Shoulders explored various ways of manipulating electron beams while Buck evaluated a wide variety of superconducting alloys and the resist materials for electron-beam etching. The pair worked together until the middle of 1958, when Buck earned a doctorate and a position as an assistant professor in MIT’s department of electrical engineering and Shoulders left MIT for the Stanford Research Institute in Menlo Park, Calif.
As a capstone to their collaboration, Buck and Shoulders presented a paper titled “An Approach to Microminiature Printed Systems” to the Eastern Joint Computer Conference in December 1958. This paper expressed their now-shared conviction in the future of massively integrated microcircuitry. “The day is rapidly drawing near when digital computers will no longer be made by assembling thousands of individually manufactured parts into plug-in assemblies,” they wrote. “Instead, an entire computer or a very large part of a computer probably will be made in a single process.”
Five months after presenting this paper, Buck died suddenly. The last entry in his lab notebook, dated 18 May 1959, describes his effort to deposit a film of the element boron. Stricken in the following days by respiratory distress, Buck perished on 21 May. Not one month had passed since his 32nd birthday.
Although his death was attributed at the time to viral pneumonia, I believe that his deposition experiments may have been to blame. Buck’s work of 18 May involved two substances that require the utmost care. His source of boron was boron trichloride gas, and the process for depositing the boron film generates hydrogen chloride gas. Exposure to either gas, to say nothing of their combination, can cause fatal pulmonary edema to develop, with symptoms similar to pneumonia. And while he had taken courses in the subject at MIT, Buck was not a chemist. He may not have appreciated the danger or have had sufficient bench experience to safely contend with these gases. In any event, for his colleagues, Buck’s death was a tragedy.
Cryotron research did not end with Buck. Strong efforts to build cryotron computers continued into the 1960s. Mann, who had worked on thin-film cryotrons at the NSA, moved to TRW’s Space Technology Laboratories in Los Angeles in the late 1950s. There, he pursued electron-beam lithography to make thin-film cryotrons until 1966. And researchers at A.D. Little continued to develop cryotron memory arrays in an effort to build Buck’s recognition unit.
Meanwhile, GE, IBM, and RCA developed thin-film cryotron microcircuitry, particularly for memory, through the early 1960s. By 1961, GE researchers had produced a working integrated shift register made with thin-film cryotrons, matching the complexity of silicon integrated circuits at the time. Within two years, GE’s cryotron microcircuitry had surpassed silicon microchips in its level of integration. Researchers there even fabricated an experimental working computer from three arrays of integrated cryotrons.
Despite all these efforts, the rapid development of silicon microchips during the 1960s, particularly their ability to lower the cost of electronics, eclipsed the advances in cryotrons, leading to digital computers dominated by silicon logic and magnetic-core memory. By the mid-1960s, most cryotron researchers had abandoned the superconducting switch, shifting their attention to silicon.
Some persisted, however. Their attention focused on special cryotrons that exhibited a quantum-mechanical phenomenon called the Josephson effect. In the early 1970s, IBM researchers created modified cryotrons known as Josephson junctions. These were at the center of a huge effort at IBM to build superconducting computers, which lasted into the 1980s. And Josephson junctions continue to be a mainstay of quantum-computing research at IBM and elsewhere.
So Buck’s cryotron never really disappeared. It has survived, in different forms and under different names, in the long shadow of the silicon microchip. We can only wonder what more Buck might have explored had he lived longer.
This article originally appeared in print as “Dudley Buck and the Computer That Never Was.”
About the Author
Brock is a senior research fellow at the Chemical Heritage Foundation’s Center for Contemporary History and Policy. While writing Makers of the Microchip: A Documentary History of Fairchild Semiconductor (MIT Press, 2010), he learned of efforts to build briefcase-size superconducting computers during the 1950s. “I asked my history-of-technology friends, and nobody had ever heard about it,” says Brock.