The History of Electronics
The history of electronics is a story of the twentieth century and three key components: the vacuum tube, the transistor, and the integrated circuit. In 1883, Thomas Alva Edison discovered that electrons will flow from one metal conductor to another through a vacuum. This discovery of conduction became known as the Edison effect. In 1904, John Fleming applied the Edison effect in inventing a two-element electron tube called a diode, and Lee De Forest followed in 1906 with the three-element tube, the triode. These vacuum tubes made it possible to manipulate electrical energy so that it could be amplified and transmitted.
Communications technology made huge advances before World War II as more specialized tubes were developed for many applications. Radio's place as the primary form of education and entertainment was soon challenged by television, which was invented in the 1920s but did not become widely available until 1947. Bell Laboratories publicly unveiled television in 1927, and its first forms were electromechanical. When an electronic system proved superior, Bell Labs engineers introduced the cathode ray picture tube and color television. But Vladimir Zworykin, an engineer with the Radio Corporation of America (RCA), is considered the "father of television" because of his inventions, the picture tube and the iconoscope camera tube.
After the war, electron tubes were used to develop the first computers, but these machines were impractical because of the size of their electronic components. In 1947, the transistor was invented by a team at Bell Laboratories. John Bardeen, Walter Brattain, and William Shockley received a Nobel Prize for their creation, but few could envision how quickly and dramatically the transistor would change the world. The transistor functions like the vacuum tube, but it is tiny by comparison, weighs less, consumes less power, is much more reliable, and, with its combination of metal contacts and semiconductor materials, is cheaper to manufacture.
The concept of the integrated circuit was proposed in 1952 by Geoffrey W. A. Dummer, a British electronics expert with the Royal Radar Establishment. Throughout the 1950s, transistors were mass produced on single wafers and cut apart. The complete semiconductor circuit was a simple step away from this: it combined transistors and diodes (active devices) with capacitors and resistors (passive devices) on a single planar unit, or chip. The semiconductor industry and the silicon integrated circuit (SIC) evolved simultaneously at Texas Instruments and Fairchild Semiconductor Company. By 1961, integrated circuits were in full production at a number of firms, and equipment designs changed rapidly and in several directions to adapt to the technology. Bipolar transistors and digital integrated circuits were made first, but analog ICs, large-scale integration (LSI), and very-large-scale integration (VLSI) followed by the mid-1970s. VLSI places thousands of circuits, with on-and-off switches or gates between them, on a single chip. Microcomputers, medical equipment, video cameras, and communication satellites are only a few examples of the devices made possible by integrated circuits.