The concept of the integrated circuit, the basis for all modern computers, was first published by Geoffrey Dummer.

Geoffrey William Arnold Dummer, MBE (1945), C. Eng., IEE Premium Award, FIEEE, MIEE, USA Medal of Freedom with Bronze Palm (25 February 1909 – 9 September 2002) was an English electronics engineer and consultant who is credited as being the first person to popularise the concepts that ultimately led to the development of the integrated circuit, commonly called the microchip, in the late 1940s and early 1950s. Dummer developed the first radar trainers and became a pioneer of reliability engineering at the Telecommunications Research Establishment in Malvern in the 1940s.


Born in Hull, Dummer studied electrical engineering at Manchester College of Technology starting in the early 1930s. By the early 1940s he was working at the Telecommunications Research Establishment in Malvern (later to become the Royal Radar Establishment).

His work with colleagues at TRE led him to the belief that it would be possible to fabricate multiple circuit elements on and into a substance like silicon. In 1952 he became one of the first people to speak publicly on the topic of integrated circuits, presenting his conceptual work at a conference in Washington, DC. As a result, he has been called "the prophet of the integrated circuit".

Dummer was admitted to a nursing home in Malvern in 2000 due to a stroke and died in September 2002, aged 93.

An integrated circuit or monolithic integrated circuit (also referred to as an IC, a chip, or a microchip) is a set of electronic circuits on one small flat piece (or "chip") of semiconductor material, usually silicon. Large numbers of tiny MOSFETs (metal–oxide–semiconductor field-effect transistors) are integrated into a small chip. This results in circuits that are orders of magnitude smaller, faster, and less expensive than those constructed of discrete electronic components. The IC's mass production capability, reliability, and building-block approach to integrated circuit design have ensured the rapid adoption of standardized ICs in place of designs using discrete transistors. ICs are now used in virtually all electronic equipment and have revolutionized the world of electronics. Computers, mobile phones, and other digital home appliances are now inextricable parts of the structure of modern societies, made possible by the small size and low cost of ICs such as modern computer processors and microcontrollers.

Very-large-scale integration was made practical by technological advancements in metal–oxide–semiconductor (MOS) device fabrication. Since their origins in the 1960s, the size, speed, and capacity of chips have progressed enormously, driven by technical advances that fit more and more MOS transistors on chips of the same size – a modern chip may have many billions of MOS transistors in an area the size of a human fingernail. These advances, roughly following Moore's law, mean that the computer chips of today have millions of times the capacity and thousands of times the speed of the computer chips of the early 1970s.
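The "millions of times the capacity" figure follows directly from the doubling arithmetic behind Moore's law. A minimal sketch, assuming a roughly two-year doubling period and using the widely cited ~2,300-transistor count of the Intel 4004 (1971) as an illustrative starting point:

```python
def moore_estimate(year, base_year=1971, base_count=2300, doubling_years=2):
    """Estimated transistors per chip under an idealized Moore's law.

    base_year/base_count are illustrative assumptions (Intel 4004, 1971,
    ~2,300 transistors); doubling_years=2 is the commonly quoted period.
    """
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

# Fifty years gives 25 doublings: a factor of 2**25, roughly 33 million.
growth = moore_estimate(2021) / moore_estimate(1971)
print(f"~{moore_estimate(2021):.1e} transistors, ~{growth:,.0f}x growth")
```

Twenty-five doublings yield a growth factor of about 33 million, consistent with the "millions of times the capacity" claim above and with modern chips carrying tens of billions of transistors.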

ICs have two main advantages over discrete circuits: cost and performance. The cost is low because the chips, with all their components, are printed as a unit by photolithography rather than being constructed one transistor at a time. Furthermore, packaged ICs use much less material than discrete circuits. Performance is high because the IC's components switch quickly and consume comparatively little power, owing to their small size and close proximity. The main disadvantage of ICs is the high cost of designing them and fabricating the required photomasks. This high initial cost means ICs are only commercially viable when high production volumes are anticipated.