Sound design is the art and practice of creating soundtracks for a variety of needs. It involves specifying, acquiring or creating auditory elements using audio production techniques and tools. It is employed in a variety of disciplines including filmmaking, television production, video game development, theatre, sound recording and reproduction, live performance, sound art, post-production, radio and musical instrument development. Sound design commonly involves performing (see e.g. foley) and editing previously composed or recorded audio, such as sound effects and dialogue, for the purposes of the medium. A sound designer is one who practices sound design.

The use of sound to evoke emotion, reflect mood and underscore actions in plays and dances began in prehistoric times. At its earliest, it was used in religious practices for healing or recreation. In ancient Japan, theatrical events called kagura were performed in Shinto shrines with music and dance. In Renaissance Italy, plays were performed in a form of theatre called commedia dell'arte, which used music and sound effects to enhance performances. The Elizabethan theatre followed, in which music and sound effects were produced off stage using devices such as bells, whistles, and horns.
Cues would be written in the script for music and sound effects to be played at the appropriate time.

Italian composer Luigi Russolo built mechanical sound-making devices, called intonarumori, for futurist theatrical and music performances starting around 1913. These devices were meant to simulate natural and man-made sounds, such as trains and bombs. Russolo's treatise, The Art of Noises, is one of the earliest written documents on the use of abstract noise in the theatre. After his death, his intonarumori were used in more conventional theatre performances to create realistic sound effects.

Possibly the first use of recorded sound in the theatre was a phonograph playing a baby's cry in a London theatre in 1890. Sixteen years later, Herbert Beerbohm Tree used recordings in his London production of Stephen Phillips' tragedy Nero. The event is marked in the Theatre Magazine (1906) with two photographs: one showing a musician blowing a bugle into a large horn attached to a disc recorder, the other showing an actor recording the agonizing shrieks and groans of the tortured martyrs. The article states: "these sounds are all realistically reproduced by the gramophone". As cited by Bertolt Brecht, a 1927 play about Rasputin, written by Alexej Tolstoi and directed by Erwin Piscator, included a recording of Lenin's voice.

Whilst the term 'sound designer' was not yet in use at this time, a number of stage managers specialised as 'effects men', creating and performing offstage sound effects using a mix of vocal mimicry, mechanical and electrical contraptions, and gramophone records. A great deal of care and attention was paid to the construction and performance of these effects, both naturalistic and abstract. Over the course of the twentieth century, recorded sound effects began to take over from live sound effects, though it was often the stage manager's duty to find the recordings and an electrician's duty to play them during performances.
Between 1980 and 1988, Charlie Richmond, USITT's first Sound Design Commissioner, oversaw the efforts of its Sound Design Commission to define the duties, responsibilities, standards and procedures that might normally be expected of a theatre sound designer in North America. The subject is still regularly discussed by that group, but during that period substantial conclusions were drawn, and Richmond wrote a document which, although now somewhat dated, provides a succinct record of what was expected at the time. It was subsequently provided to both the ADC and David Goodman at the Florida USA local when they were planning to represent sound designers in the 1990s.

MIDI and digital audio technology contributed to the evolution of sound production techniques during the 1980s and 1990s. Digital audio workstations, and the variety of digital signal processing algorithms applied in them, allow more complex soundtracks, with more tracks and auditory effects, to be realized. Features such as unlimited undo and sample-level editing allow fine control over the soundtrack.

In theatre sound, features of computerized theatre sound design systems were also recognized as essential for live show control systems at Walt Disney World and, as a result, Disney used systems of that type to control many facilities at its Disney-MGM Studios theme park, which opened in 1989. These features were incorporated into the MIDI Show Control (MSC) specification, an open communications protocol for interacting with diverse devices. The first show to fully utilize the MSC specification was the Magic Kingdom Parade at Walt Disney World's Magic Kingdom in September 1991. The rise of interest in game audio has also brought more advanced interactive audio tools that are accessible without a background in computer programming.
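As a concrete illustration of the MSC protocol mentioned above: an MSC command is carried inside a MIDI System Exclusive (SysEx) message. The sketch below assembles a "GO" cue command in Python; the device ID, command format and cue number used here are illustrative assumptions, not values from any particular show.

```python
# Minimal sketch of building a MIDI Show Control (MSC) "GO" message.
# MSC rides in a MIDI System Exclusive frame laid out as:
#   F0 7F <device_ID> 02 <command_format> <command> <data> F7
# where sub-ID 0x02 identifies MSC, command format 0x01 is Lighting
# (General), and command 0x01 is GO. The cue number travels as ASCII.

def msc_go(device_id: int, command_format: int, cue: str) -> bytes:
    """Assemble an MSC GO command (command 0x01) for the given cue number."""
    return bytes(
        [0xF0, 0x7F, device_id, 0x02, command_format, 0x01]
        + [ord(c) for c in cue]  # cue number as ASCII text, e.g. "5.2"
        + [0xF7]                 # end of SysEx
    )

# Example: tell lighting device 1 to GO on cue 5.2 (values assumed).
msg = msc_go(device_id=0x01, command_format=0x01, cue="5.2")
print(msg.hex(" "))  # prints "f0 7f 01 02 01 01 35 2e 32 f7"
```

In practice such bytes would be sent over a MIDI interface to a lighting desk, sound computer or show controller; the point of the open protocol is that diverse devices can all be driven by the same cue messages.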
Some of these software tools (termed 'implementation tools' or 'audio engines') feature a workflow similar to that of conventional digital audio workstation programs, and can allow sound production personnel to undertake some of the more creative interactive sound tasks (considered part of sound design for computer applications) that would previously have required a computer programmer. Interactive applications have also given rise to a plethora of techniques in 'dynamic audio', which loosely means sound that is 'parametrically' adjusted during the run-time of the program. This allows for broader expression in sound, closer to that of film, because the sound designer can, for example, create footstep sounds that vary in a believable, non-repeating way and that also correspond to what is seen in the picture. A traditional digital audio workstation cannot directly 'communicate' with a game engine, because a game's events often occur in an unpredictable order, whereas traditional digital audio workstations, like so-called linear media (TV, film etc.), have everything occur in the same order every time the production is run. Games in particular have also brought in dynamic or adaptive mixing.
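The footstep example above can be sketched as run-time parameter variation: instead of replaying one fixed recording, the engine picks a sample variation each step and nudges its playback parameters, optionally shaped by game state. This is a minimal illustration only; the sample names, parameter ranges and the wetness-to-filter mapping are assumptions, not taken from any particular audio engine.

```python
import random

# Sketch of 'dynamic audio': each footstep event gets a randomly chosen
# sample variation plus small random pitch/volume offsets, so repeated
# footsteps never sound identical. A continuous game-state parameter
# (surface wetness here, as an assumed example) shapes the sound further.

FOOTSTEP_SAMPLES = ["step_dirt_01", "step_dirt_02", "step_dirt_03"]

def next_footstep(surface_wetness: float) -> dict:
    """Return playback parameters for one footstep event (0.0-1.0 wetness)."""
    return {
        "sample": random.choice(FOOTSTEP_SAMPLES),
        "pitch": random.uniform(0.95, 1.05),    # slight detune per step
        "volume": random.uniform(0.85, 1.0),    # slight level variation
        # wetter ground -> duller sound, via a lower low-pass cutoff:
        "lowpass_cutoff_hz": 20000 - 12000 * surface_wetness,
    }

for _ in range(3):
    print(next_footstep(surface_wetness=0.5))
```

Middleware audio engines expose essentially this idea through graphical tools, letting a sound designer author the sample pools, randomization ranges and parameter mappings without writing code.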