The first freely programmable computer was invented in 1936 by Konrad Zuse, a German engineer who worked for the Henschel Aircraft Company in Berlin. Called the Z1, it was a mechanical binary calculating machine with a memory that could store intermediate results for later use. Zuse used it to develop key technologies that now form the basis of modern computing, including floating-point arithmetic and the yes/no principle (base 2, or binary). He stored his programs on old movie film because paper was scarce. By 1945 he had also designed Plankalkül, the first high-level programming language, and used it to write the world's first chess-playing routines. Unable to secure further funding from the Nazi government, he fled Berlin near the end of the war, hauling his final model-in-progress, the Z4, partly by horse-drawn cart; the machine was eventually installed at ETH Zurich.
While the Z1 was mechanical, the first electronic digital computer was developed between 1939 and 1942 at Iowa State College by Professor John Atanasoff and graduate engineering student Clifford Berry. It was later named the ABC, for Atanasoff-Berry Computer. It weighed 700 pounds, was about the size of a large desk, and contained over a mile of wire. It could perform one operation every 15 seconds (modern computers are roughly 200 billion times faster). Remarkably, while Atanasoff was away working on military and defense projects for the government during the war, the physics department dismantled the machine to reclaim the space. Only a few parts of this important artifact survive.
The next breakthrough was the Mark series of computers, built at Harvard University for the United States Navy under Howard Aiken, with Grace Hopper among its first programmers, beginning in 1944. The Mark I filled a room: it was 55 feet long, 8 feet high, and weighed 5 tons. The ENIAC (Electronic Numerical Integrator and Computer) was even bigger: it took up 1,800 square feet, weighed 30 tons, and cost about $500,000. It drew so much power that, according to a popular (if exaggerated) story, the lights of Philadelphia dimmed whenever it was switched on. It was a thousand times faster than its electromechanical contemporaries, but reprogramming it meant rewiring it by hand, a job that could take weeks.
The invention of the transistor to replace the vacuum tube revolutionized computers and allowed them to begin shrinking. Then Jack Kilby and Robert Noyce, working independently, came up with the idea of eliminating the miles of internal wiring by fabricating all of the components together on a single "chip" of semiconductor material (germanium in Kilby's case, silicon in Noyce's): the integrated circuit.
Today's computers use microprocessors, first introduced way back in 1971 by Intel Corp. with its 4004 chip. The next breakthrough may arrive in the form of quantum computers, which promise to explore vast numbers of computational states simultaneously, rather than performing calculations one at a time as today's machines do.