The first-ever computer-to-computer link is established on ARPANET, the precursor to the Internet.
The Advanced Research Projects Agency Network (ARPANET) was the first wide-area packet-switched network with distributed control and one of the first networks to implement the TCP/IP protocol suite. Both technologies became the technical foundation of the Internet. The ARPANET was established by the Advanced Research Projects Agency (ARPA) of the United States Department of Defense.

Building on the ideas of J. C. R. Licklider, Bob Taylor initiated the ARPANET project in 1966 to enable access to remote computers. Taylor appointed Larry Roberts as program manager, and Roberts made the key decisions about the network design. He incorporated Donald Davies' concepts and designs for packet switching and sought input from Paul Baran. ARPA awarded the contract to build the network to Bolt Beranek & Newman, which developed the first protocol for the network. Roberts engaged Leonard Kleinrock at UCLA to develop mathematical methods for analyzing the packet network technology.

The first computers were connected in 1969, and the Network Control Protocol was implemented in 1970. The network was declared operational in 1971. Further software development enabled remote login, file transfer and email. The network expanded rapidly, and operational control passed to the Defense Communications Agency in 1975.
Internetworking research in the early 1970s, led by Bob Kahn at DARPA and Vint Cerf at Stanford University and later DARPA, produced the Transmission Control Program, which incorporated concepts from the French CYCLADES project. As this work progressed, a protocol was developed by which multiple separate networks could be joined into a network of networks. Version 4 of TCP/IP was installed in the ARPANET for production use in January 1983, after the Department of Defense made it the standard for all military computer networking.

Access to the ARPANET was expanded in 1981, when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In the early 1980s, the NSF funded the establishment of national supercomputing centers at several universities and provided network access and interconnectivity with the NSFNET project in 1986. The ARPANET was formally decommissioned in 1990, after partnerships with the telecommunications and computer industries had assured private-sector expansion and the future commercialization of an expanded worldwide network, known as the Internet.
A computer is a digital electronic machine that can be programmed to carry out sequences of arithmetic or logical operations (computation) automatically. Modern computers can perform generic sets of operations known as programs. These programs enable computers to perform a wide range of tasks. A computer system is a "complete" computer that includes the hardware, operating system (main software), and peripheral equipment needed and used for "full" operation. This term may also refer to a group of computers that are linked and function together, such as a computer network or computer cluster.
A broad range of industrial and consumer products use computers as control systems, from simple special-purpose devices such as microwave ovens and remote controls, to factory equipment such as industrial robots and computer-aided design systems, to general-purpose devices such as personal computers and mobile devices such as smartphones. Computers also power the Internet, which links billions of computers and their users.
Early computers were meant to be used only for calculations. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit (IC) chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power and versatility of computers have been increasing dramatically ever since, with transistor counts increasing at a rapid pace (as predicted by Moore's law), leading to the Digital Revolution during the late 20th to early 21st centuries.
Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, along with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitor screens, printers, etc.), and input/output devices that perform both functions (e.g., the 2000s-era touchscreen). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved.
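The stored-program idea sketched above can be illustrated with a short, purely hypothetical example: a list stands in for memory holding instructions, a program counter selects the next instruction, and a conditional jump shows how a control unit can change the order of operations in response to stored information. The instruction names and the tiny program are illustrative assumptions, not any particular machine's design.

```python
def run(program, data):
    pc = 0  # program counter: index of the next instruction
    while pc < len(program):
        op, *args = program[pc]
        if op == "ADD":              # arithmetic: data[a] <- data[a] + data[b]
            a, b = args
            data[a] += data[b]
        elif op == "JUMP_IF_NEG":    # control: jump to target if data[a] < 0
            a, target = args
            if data[a] < 0:
                pc = target
                continue
        elif op == "HALT":           # stop execution
            break
        pc += 1
    return data

# Illustrative program: keep adding data[1] to data[0] until data[0] is no longer negative.
program = [
    ("ADD", 0, 1),          # instruction 0: data[0] += data[1]
    ("JUMP_IF_NEG", 0, 0),  # instruction 1: if data[0] < 0, jump back to instruction 0
    ("HALT",),              # instruction 2: stop
]
print(run(program, {0: -5, 1: 2}))  # prints {0: 1, 1: 2}
```

In this toy model the same memory-like structure that holds data could equally hold the instruction list, which is the essence of the stored-program computer: the sequence of operations is itself data that the machine reads and acts on.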