The Man Who Invented the Computer: The Biography of John Atanasoff, Digital Pioneer
By Jane Smiley. 256 pages, Doubleday. $25.95
One night a young physics professor named John Atanasoff got up from his dining room table and went for a drive along the Iowa back roads. "I was in such a mental state," he recalled dramatically, "that no resolution was possible." When he crossed into Illinois, though, Atanasoff spotted a roadside tavern. He went inside, ordered a bourbon and soda, took a sip, and then, with the crispness of a Frank Capra scene, the moment he'd waited for his entire life revealed itself. The design of the world's first working computer flashed before his eyes. He jotted down the plan on a cocktail napkin. It was 1937.
In The Man Who Invented the Computer, author Jane Smiley—yes, that Jane Smiley—tells the story of Atanasoff and of the subsequent race to build the world's first computer with all the grace expected from a fine novelist. In the first of publisher Doubleday's series on innovators, Smiley benefits from a subject who could have been culled from fiction. The son of a Bulgarian immigrant, Atanasoff grew up in tiny Brewster, Fla. As a child, he memorized the manual for his father's Model T Ford and wired his family's home for electricity. As he got older, he sought the attention of girls by teaching himself to crochet. He spoke often about nondecimal ways of counting and considered himself a precocious mathematician. He often got into fistfights. By the time he was in graduate school, he was known as the Mad Russian. When Smiley quotes Atanasoff as confessing that he had been "unhappy to an extreme degree" as he struggled with the design of a computing machine, it isn't difficult to understand why.
While earning his PhD at the University of Wisconsin at Madison, Atanasoff became sick of using a Monroe calculator—a typewriter-like device whose power was limited to solving simple equations. After landing a job teaching quantum mechanics at Iowa State University, he tinkered with existing calculators, including one created by IBM (IBM), to make them more powerful. With a graduate student named Clifford Berry and a $650 grant, he built a prototype of his computer, the ABC, in 1939.
Atanasoff's computer may have been, as Smiley attests, the most important invention of the 20th century. The race to get there first, against the backdrop of war, is eerily similar to today's battle to control the Internet, albeit without the black hoodies and flip-flops. Which is to say it was filled with greed, hubris, and deception—and also genius, luck, and a cast of colorful characters. One of Atanasoff's competitors, Alan Turing, the brilliant British mathematician who cracked Nazi codes, was so focused on his work that he forgot Christmas arrived on the same day every year. Another, Konrad Zuse, worked on his machine in nearly complete isolation in Berlin, which made his eventual creation of the Z3—the world's first programmable computer—that much more extraordinary. "Oddly enough," Smiley writes, "no inventor of the computer got rich off the invention, even though a few tried."
Never heard of Atanasoff? There's a reason. Having been called to work on government war programs, he wasn't able to patent his invention. Only in 1973, after a lengthy court battle, was he credited with being the inventor of the first digital computer, which was assembled in the basement of the Iowa State physics building.
The ABC (Atanasoff-Berry Computer) combined vacuum tubes with regenerative capacitor memory elements—capable of storing bits of data—in a machine 6 feet long and more than 3 feet tall, about the size of a souped-up Weber barbecue grill.
In our Jobsian era of apps, Atanasoff's story is a reminder of computing's original mission: to solve equations and problems that humans were incapable of handling themselves. Smiley notes that Atanasoff envisioned a machine much like the human brain, with regenerative memory. As Atanasoff once wrote: "I had been forced to the conclusion that if I wanted a computer suited to the general needs of science and, in particular, suited to solving systems of linear equations, I would have to build it myself."
It was a noble purpose, and one wonders what Atanasoff would make of his invention's spawn—the Human Genome Project, new biotech drugs, and terrorist-tracking software aside. However, the value the public and investors place on a computer—whether an iPad, Droid, or BlackBerry—is inversely related to its size. As Steve Jobs has said over and over, the smaller computers become, the more "magical" they are in our lives. He has a point. Back then, it would have taken hundreds of Atanasoff machines to provide the computing muscle of 15 minutes on an iPhone. Not everyone has a basement with that kind of storage space. And maybe the young Atanasoff would have found more friends had he been on Facebook.
Yet as computers have gotten smaller, critics say, so have many of their objectives. Smartphones have become low-hanging fruit for software developers salivating at the chance to make a quick buck. Their development isn't just preoccupying the media; it's also dominating venture capital dollars. Kleiner Perkins Caufield & Byers, the firm that helped launch Amazon.com (AMZN) and Google (GOOG), recently started a $200 million funding effort for apps.
We're unwittingly headed toward something called lock-in. The more we use these tiny computers for tasks we've always done without gadgets, the further removed we'll become from real innovation. For all the pleasure and time-saving mechanisms these tiny computers afford, there is this disturbing iFact: They aren't solving any real problems. The best-selling app on the iPhone, Angry Birds, lets players sling digital birds at virtual pigs. Apple (AAPL), once at the forefront of computing, is now, according to Jobs himself, a mobile device company—and who can blame it? Apple sold more than $7 billion worth of iPhones and iPads in the third quarter. Mac sales: $4.4 billion.
Fortunately, there are initiatives to push the new Atanasoffs in different directions. Web guru Tim O'Reilly is behind a program called Gov 2.0, an effort to get developers to improve government through apps. The U.S. Army recently held a contest for new apps; winners included tools for delivering disaster-relief aid, calculating routes, and helping young soldiers understand the ranking system. Federal health officials are currently giving developers access to data on drugs and diseases.
All of this would likely please Atanasoff. However, apps that seek to innovate are still a long way from sexy. Earlier this year at TechCrunch Disrupt, a hacking event attended by some of the industry's sharpest computing minds, the most successful program to emerge was a smartphone app called GroupMe. It lets users create instant chat groups for trading private text messages—you know, instead of getting together for a beer. When Doubleday publishes the story of this generation of innovators, it'll probably be as an app, not a book.