Digital Revolution

The Digital Revolution, also known as the Third Industrial Revolution, is the shift from mechanical and analogue electronic technology to digital electronics, which began anywhere from the late 1950s to the late 1970s with the adoption and proliferation of digital computers and digital record keeping, and which continues to the present day. Implicitly, the term also refers to the sweeping changes brought about by digital computing and communication technology during (and after) the latter half of the 20th century. Analogous to the Agricultural Revolution and the Industrial Revolution, the Digital Revolution marked the beginning of the Information Age.

Central to this revolution is the mass production and widespread use of digital logic, MOS transistors, and integrated circuits, and the technologies derived from them, including computers, microprocessors, digital cellular phones, and the Internet. These innovations have transformed traditional production and business techniques.

Some of the underlying technology dates to the 19th century, including Babbage's Analytical Engine and the telegraph, but digital communication became economical for widespread adoption only after the invention of the personal computer. Claude Shannon, a Bell Labs mathematician, is credited with laying the foundations of digitalization in his pioneering 1948 article, "A Mathematical Theory of Communication".

The Digital Revolution converted technology that had been analog into a digital format. By doing this, it became possible to make copies that were identical to the original. In digital communications, for example, repeating hardware could regenerate the digital signal and pass it on with no loss of information. Of equal importance was the ability to easily move digital information between media, and to access or distribute it remotely.

A turning point of the revolution was the change from analogue to digitally recorded music: during the 1980s the digital format of optical compact discs gradually replaced analog formats, such as vinyl records and cassette tapes, as the popular medium of choice.

The invention of the transistor in 1947 led the way to more advanced digital computers. From the late 1940s, universities, the military, and businesses developed computer systems to digitally replicate and automate previously manual mathematical calculations, with LEO being the first commercially available general-purpose computer.

The next major milestones in semiconductor technology were the invention of the integrated circuit at Texas Instruments in 1958 and of the metal-oxide-semiconductor (MOS) transistor at Bell Labs in 1959. The MOS transistor subsequently became the basic building block of the Digital Revolution. A key step toward mass commercialization was Jean Hoerni's invention of the planar process at Fairchild Semiconductor in 1959, which enabled silicon integrated circuits to be mass-produced using photolithography.

The development of the silicon-gate MOS integrated circuit at Fairchild Semiconductor in 1968 led to the Intel 4004, the first single-chip microprocessor, released by Intel in 1971. It laid the foundations for the microcomputer revolution that began in the 1970s.

The public was first introduced to the concepts that would lead to the Internet when a message was sent over the ARPANET in 1969.
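The regeneration property described above, which is what makes perfect digital copying possible, is easy to see in a short simulation. The sketch below is illustrative only: the Gaussian channel model, the noise level, and the hop count are arbitrary assumptions, not a description of any real transmission system.

    import random

    def analog_hop(signal, noise=0.02):
        # An analog repeater re-amplifies signal *and* noise:
        # every hop adds distortion that can never be removed.
        return [s + random.gauss(0, noise) for s in signal]

    def digital_hop(bits, noise=0.02):
        # A digital repeater sees the same noisy channel, but it
        # thresholds each sample back to an exact 0 or 1 (regeneration),
        # so the accumulated noise is discarded at every hop.
        noisy = [b + random.gauss(0, noise) for b in bits]
        return [1 if s > 0.5 else 0 for s in noisy]

    bits = [random.randint(0, 1) for _ in range(1000)]
    analog, digital = [float(b) for b in bits], bits
    for _ in range(50):                       # 50 repeaters in a row
        analog = analog_hop(analog)
        digital = digital_hop(digital)

    drift = sum(abs(a - b) for a, b in zip(analog, bits)) / len(bits)
    errors = sum(d != b for d, b in zip(digital, bits))
    print(f"mean analog drift after 50 hops: {drift:.3f}")  # grows with hops
    print(f"digital bit errors after 50 hops: {errors}")    # almost surely 0

The analog copy degrades a little at every hop; the digital copy is restored to exact bits at every hop, so the final message is identical to the original.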
Packet-switched networks such as the ARPANET, the NPL Mark I, CYCLADES, the Merit Network, Tymnet, and Telenet were developed in the late 1960s and early 1970s using a variety of protocols. The ARPANET in particular led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks.
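The internetworking idea, joining independent networks through gateways that forward self-describing packets, can be sketched in a few lines. This toy model is an illustration of the concept only: the class names and the two-part "network.host" addresses are invented for the example and do not correspond to the actual ARPANET or CYCLADES protocols.

    from dataclasses import dataclass

    @dataclass
    class Datagram:
        # A self-contained packet: the header carries everything a
        # gateway needs, so the networks it crosses stay independent.
        src: str       # e.g. "arpanet.ucla" (hypothetical address scheme)
        dst: str
        payload: str

    class Network:
        def __init__(self, name):
            self.name, self.hosts = name, {}

        def attach(self, host, inbox):
            self.hosts[host] = inbox

        def deliver(self, dgram):
            net, host = dgram.dst.split(".")
            if net == self.name:              # local traffic: deliver it
                self.hosts[host].append(dgram)
                return True
            return False                      # not ours; a gateway must forward

    class Gateway:
        # Joins separate networks into a "network of networks" by
        # offering each datagram to every attached network in turn.
        def __init__(self, *networks):
            self.networks = networks

        def route(self, dgram):
            if not any(net.deliver(dgram) for net in self.networks):
                raise LookupError(f"no route to {dgram.dst}")

    # Two independent networks joined by a single gateway.
    arpanet, cyclades = Network("arpanet"), Network("cyclades")
    inbox = []
    cyclades.attach("host1", inbox)
    gw = Gateway(arpanet, cyclades)
    gw.route(Datagram(src="arpanet.ucla", dst="cyclades.host1", payload="LO"))
    print(inbox[0].payload)   # delivered across a network boundary

The point of the sketch is that the datagram's own header, not any single network, carries the addressing information; that is the property that lets separately built networks compose into one internetwork.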

[ "Media studies", "Public relations", "Telecommunications", "Law" ]
Parent Topic
Child Topic
    No Parent Topic