
Pipeline (Unix)

In Unix-like computer operating systems, a pipeline is a mechanism for inter-process communication using message passing. A pipeline is a set of processes chained together by their standard streams, so that the output text of each process (stdout) is passed directly as input (stdin) to the next one. The first process is not completed before the second is started; rather, they are executed concurrently. The concept of pipelines was championed by Douglas McIlroy at Unix's ancestral home of Bell Labs, during the development of Unix, shaping its toolbox philosophy. It is named by analogy to a physical pipeline. A key feature of these pipelines is their 'hiding of internals' (Ritchie & Thompson, 1974). This in turn allows for more clarity and simplicity in the system.

This article is about anonymous pipes, where data written by one process is buffered by the operating system until it is read by the next process, and this uni-directional channel disappears when the processes are completed. This differs from named pipes, where messages are passed to or from a pipe that is named by making it a file, and which remains after the processes are completed.

The standard shell syntax for anonymous pipes is to list multiple commands, separated by vertical bars ('pipes' in common Unix verbiage). For example, to list files in the current directory (ls), retain only the lines of ls output containing the string 'key' (grep), and view the result in a scrolling page (less), a user types the following into the command line of a terminal:

ls -l | grep key | less

'ls -l' produces a process, the output (stdout) of which is piped to the input (stdin) of the process for 'grep key'; and likewise for the process for 'less'. Each process takes input from the previous process and produces output for the next process via standard streams. Each '|' tells the shell to connect the standard output of the command on the left to the standard input of the command on the right by an inter-process communication mechanism called an (anonymous) pipe, implemented in the operating system. Pipes are unidirectional; data flows through the pipeline from left to right.

All widely used Unix shells have a special syntax construct for the creation of pipelines. In all usage one writes the commands in sequence, separated by the ASCII vertical bar character '|' (which, for this reason, is often called the 'pipe character'). The shell starts the processes and arranges for the necessary connections between their standard streams (including some amount of buffer storage).
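To illustrate the difference between the anonymous pipe used above and a named pipe, the following sketch creates a FIFO as a filesystem object; the path /tmp/demo_fifo is an arbitrary placeholder, not part of the original example:

# The anonymous pipe in 'ls -l | grep key | less' disappears when the processes exit.
# A named pipe is created explicitly as a file and outlives the processes:
mkfifo /tmp/demo_fifo        # create the FIFO as a filesystem object
ls -l > /tmp/demo_fifo &     # writer: blocks until a reader opens the FIFO
grep key < /tmp/demo_fifo    # reader: receives the writer's output through the named pipe
rm /tmp/demo_fifo            # the pipe file persists until removed explicitly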
By default, the standard error streams ('stderr') of the processes in a pipeline are not passed on through the pipe; instead, they are merged and directed to the console. However, many shells have additional syntax for changing this behavior. In the csh shell, for instance, using '|&' instead of '|' signifies that the standard error stream should also be merged with the standard output and fed to the next process. Bourne-style shells can achieve the same effect by writing '2>&1' before the pipe, and bash also supports '|&' since version 4.0; standard error can likewise be redirected to a separate file instead of the pipe.
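The following sketch illustrates these variants, assuming bash 4.0 or later; missing_dir and err.log are arbitrary placeholders chosen to produce an error message and to capture it:

# Default behavior: only stdout enters the pipe; the error message still appears on the terminal.
ls -l missing_dir | grep key

# csh, and bash 4.0 or later: '|&' sends stderr through the pipe as well.
ls -l missing_dir |& grep key

# Portable Bourne-shell form: duplicate stderr onto stdout before the pipe.
ls -l missing_dir 2>&1 | grep key

# Alternatively, send stderr to a separate file instead of the pipe.
ls -l missing_dir 2> err.log | grep key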
