Computer chips control most appliances in the modern world, from communications equipment to refrigerators and alarm systems. They are usually made from silicon, the second most abundant element in the Earth's crust. The story of how one of the world's most common elements became its most common electronic material is a history of modern science.
Electricity is the energy that powers heating, lighting, motors and numerous modern appliances. Electronics is the science of how this electrical energy is controlled. In the early 20th century, this was done with the triode valve, a vacuum tube with three terminals in which one terminal controls the flow of current between the other two, allowing the tube to amplify, diminish or switch a signal. But vacuum tubes, like light bulbs, burn out and leak, and are too heavy and fragile for the complicated circuits of computers. They were replaced by transistors: small three-terminal devices made of semiconducting materials.
Semiconductors are materials whose electrical properties lie somewhere between those of conductors and insulators. Adding small amounts of impurities to these materials, a process called doping, can change their electrical properties dramatically. Light, temperature and pressure can also change their conductivity, causing a current to flow or a voltage to appear between two terminals on the semiconductor.
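Just how dramatic the change from doping can be is easy to estimate with the standard conductivity formula (conductivity = charge × carrier density × mobility). The sketch below uses approximate textbook values for silicon at room temperature; the exact figures vary with temperature and crystal quality, so treat the numbers as illustrative only.

```python
# Rough sketch of how doping changes silicon's conductivity.
# All constants are approximate textbook values for silicon at ~300 K.

Q = 1.602e-19         # electron charge (coulombs)
N_INTRINSIC = 1.0e10  # intrinsic carrier concentration of Si (per cm^3)
MU_E = 1350.0         # electron mobility (cm^2 / V s)
MU_H = 480.0          # hole mobility (cm^2 / V s)

def conductivity_intrinsic():
    """Conductivity of pure silicon: both electrons and holes carry current."""
    return Q * N_INTRINSIC * (MU_E + MU_H)

def conductivity_n_doped(donor_density):
    """Conductivity when donor atoms (e.g. phosphorus) each free one electron."""
    return Q * donor_density * MU_E

pure = conductivity_intrinsic()       # roughly 3e-6 S/cm
doped = conductivity_n_doped(1.0e16)  # roughly 2 S/cm
print(f"pure: {pure:.2e} S/cm, doped: {doped:.2e} S/cm, ratio: {doped/pure:.0f}x")
```

Even at a doping level of one impurity atom per few million silicon atoms, the conductivity jumps by several hundred thousand times, which is why doping gives such fine control over a chip's electrical behaviour.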
How they work
Semiconductor devices are usually made of elements from group 14 (IV) of the Periodic Table, chiefly silicon and germanium. Atoms of these elements have four outer electrons, which hold on to neighbouring atoms through strong covalent bonds, enabling the elements to form firm, stable crystals. Silicon forms crystals that resemble metals, but in pure form it conducts electricity poorly. When the crystal is doped with small amounts of impurity atoms such as phosphorus or boron, some outer electrons are freed from the bonds and can move through the crystal, turning the semiconductor into a good conductor of electricity. Energised electrons in a semiconductor can also emit light of varying colours when a voltage is applied to it.
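The colour of that emitted light follows from the semiconductor's band gap: when an energised electron falls back across the gap, it releases a photon whose wavelength in nanometres is roughly 1240 divided by the gap in electron-volts. The sketch below applies this rule to a few common LED materials; the band-gap values are approximate figures assumed for illustration.

```python
# Sketch: estimating an LED's emission colour from its band gap.
# wavelength (nm) ~= 1240 / band gap (eV), since h*c ~= 1239.84 eV*nm.

PLANCK_EV_NM = 1239.84  # Planck's constant times the speed of light, in eV*nm

def emission_wavelength_nm(band_gap_ev):
    """Wavelength of the photon released when an electron crosses the gap."""
    return PLANCK_EV_NM / band_gap_ev

# Approximate band gaps (eV) for a few LED semiconductors
materials = {
    "GaAs (infrared)": 1.42,
    "GaAsP (red)": 1.9,
    "GaN (near-ultraviolet)": 3.4,
}

for name, gap in materials.items():
    print(f"{name}: ~{emission_wavelength_nm(gap):.0f} nm")
```

A larger band gap means a more energetic photon and a shorter wavelength, which is why blue and ultraviolet LEDs had to wait for wide-gap materials such as gallium nitride.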
The invention of the integrated circuit in 1958 by American physicist Jack Kilby pioneered the modern computer chip. Rather than soldering individual transistors onto a printed circuit board to make an electronic device, he miniaturised the process: the basic idea was to make the transistors and the circuit out of the same piece of material. Today, chips are produced by ultraviolet photolithography, in which UV light is shone through a mask carrying the circuit pattern and that pattern is then etched onto a single silicon crystal.
Availability and purity
Silicon and germanium are the most common elements used as semiconductors, along with the compound gallium arsenide. Silicon rose in importance because of its availability and relative lack of defects. It is important to start with the purest possible material before doping, so that the electronic properties can be set precisely. Silicon can be produced in purer form than germanium, the first element used for computer chips. Silicon oxides, the raw material for silicon, are among the most common compounds on the planet and are found in sands, gravels and rocks. This abundance keeps the cost of producing chips from silicon far below that of other materials.