Working with Python subprocess - Shells, Processes, Streams, Pipes, Redirects and More
Posted: 2009-04-28 15:20
Tags: Python
Note
Much of the "What Happens When you Execute a Command?" section is based on information in
http://en.wikipedia.org/wiki/Redirection_(computing) so go there for the latest version. This post
is released under the GFDL.
Contents
What Happens When you Execute a Command?
Streams
Working with the Shell
Redirecting Standard input and Standard Output to Files
Piping
Redirecting Standard Input and Standard Output to and from the Standard File Handles
Chained pipelines
Redirect to multiple outputs
Here Documents
Introducing subprocess
Using the Shell
Strings or Argument Lists
Without the Shell
Checking a Program is on the PATH
Reading from Standard Output and Standard Error
Redirecting stderr to stdout
Writing to Standard Input
The File-like Attributes
Reading and Writing to the Same Process
Accessing Return Values, poll() and wait()
Convenience Functions
Understanding sys.argv
Shell Expansion
Further Reading
In my last post I wrote about how to build a command line interface with sub-commands in
Python. In this post I'm going to look at how you can interact with other command line programs
using Python's subprocess module.
What I want to be able to do is:
Find out exactly what happens when you run commands on a command line
Find out if a command exists and where it actually is
Execute a command from Python either directly or via a shell
Read from STDOUT and write to STDIN of a running process
Check the exit status of a process
Understand the role of Bash in interpreting patterns and sending them to command line
programs
What Happens When you Execute a Command?
When you click on the Terminal icon on your Desktop you are loading a program which in turn
loads a shell. The commands you write are not executed directly by the kernel but are first
interpreted by the shell.
Command (eg. `ls -l')
  -> Terminal Program (eg. `gnome-terminal')
  -> Shell (eg. Bash)
  -> Kernel (eg. Linux 2.6.24)
More information about shells:
http://en.wikipedia.org/wiki/Unix_shell
More information about how processes are actually started:
http://pangea.stanford.edu/computerinfo/unix/shell/processes/processes.html
When you execute a program from Python you can choose to execute it directly with the kernel or
via a shell. If you are executing it directly you won't be able to write your commands in quite the
same way as you would when using a shell like Bash.
Let's look at the different functionality you will be used to using on the shell before looking at
how to achieve similar results with subprocess.
Streams
In UNIX and Linux, there are three I/O channels known as streams which connect a computer
program with its environment such as a text terminal (eg gnome-terminal running Bash) or
another computer program (eg a Python program using the subprocess module). These I/O
channels are called the standard input, standard output and standard error respectively and can
also be referred to by their corresponding file descriptors which are the numbers 0, 1 and 2
respectively.
Handle Name Description
0 stdin Standard input
1 stdout Standard output
2 stderr Standard error
Redirecting Standard Input and Standard Output to Files

A program's standard input and standard output can be redirected to files with the < and >
operators. For example, this command makes program1 read its standard input from file1 and
write its standard output to file2:

$ program1 < file1 > file2
There may be times when you want the output from one program to be read as the input to another
program. You can achieve this using temporary files like this:
$ program1 > tempfile1
$ program2 < tempfile1
$ rm tempfile1
This is a bit cumbersome though so shells provide a facility called piping.
Piping
Piping allows the standard output from one program to be fed directly into the standard input of
another without the need for a temporary file:
$ program1 | program2
The | character is known as the pipe character and so this process is known as piping.
Here's an example of piping the output from find . (which recursively prints the paths of the
files and directories in the current directory) into the grep program to find just a particular file:
find . | grep "The file I'm after.txt"
Data from the first program is piped into the second program line by line so the first program
doesn't have to finish before the second program can start using it.
Redirecting Standard Input and Standard Output to and from the Standard File Handles
As well as redirecting the standard output, you can also redirect other streams, for example to send
the standard error data to standard output. In Bash the < and > operators we've already
discussed can also be prefixed by the file descriptor (remember the numbers 0, 1 and 2 in the table
earlier) to redirect that stream. If the number is omitted it is assumed to be 1 for the standard
output which is why the commands we've used so far work.
This command executes program1 and sends any data it writes to standard error to file1:
program1 2> file1
Here's an example program you can use to test it. Save it as redirect1.py:
import sys

while 1:
    try:
        input = sys.stdin.readline()
        if input:
            sys.stdout.write('Echo to stdout: %s' % input)
            sys.stderr.write('Echo to stderr: %s' % input)
    except KeyboardInterrupt:
        sys.exit()
This program constantly polls stdin and echoes any message it receives to both stdout and stderr.
In shells derived from csh (the C shell), the syntax instead appends the & character to the redirect
characters, thus achieving a similar result.
Another useful capability is to redirect one standard file handle to another. The most popular
variation is to merge standard error into standard output so error messages can be processed
together with (or alternately to) the usual output. Example:
find / -name .profile > results 2>&1
will try to find all files named .profile. You need the & character even in Bash this time.
Executed without redirection, it will output hits to stdout and errors (e.g. for lack of privilege to
traverse protected directories) to stderr. If standard output is directed to file results, error messages
appear on the console. To see both hits and error messages in file results, merge stderr (handle 2)
into stdout (handle 1) using 2>&1.
It's possible to use 2>&1 before ">" but it doesn't work. In fact, when the interpreter reads 2>&1, it
doesn't know yet where standard output is redirected, so standard error isn't merged.
If the merged output is to be piped into another program, the merge sequence 2>&1 must
precede the pipe symbol, thus:
find / -name .profile 2>&1 | less
In Bash a simplified form of the command:
command > file 2>&1
king with Python subprocess - Shells, Processes, Streams, Pipes, R... http://jimmyg.org/blog/2009/working-with-python-subprocess.html
16 2/4/2014 12:11 PM
8/13/2019 Working With Python Subprocess - Shells, Processes, Streams, Pipes, Redirects and More
6/16
is:
command &>file
or:
command >&file
but don't use these shortcuts or you might get confused. It is better to be more verbose and explicit.
The &> operator redirects standard output (stdout) and standard error (stderr) at the same time.
This is simpler to type than the Bourne shell equivalent command > file 2>&1.
Chained pipelines
The redirection and piping tokens can be chained together to create complex commands. For
example:
ls | grep '\.sh' | sort > shlist
lists the contents of the current directory, filters this output to only the lines which
contain .sh, sorts the result lexicographically, and places the final output in shlist. This
type of construction is used very commonly in shell scripts and batch files.
Redirect to multiple outputs
The standard command tee can redirect output from a command to several destinations.
ls -lrt | tee xyz
This directs the file list output both to standard output and to the file xyz.
Here Documents
Most shells, including Bash, support here documents, which enable you to embed text as part of a
command using the << operator followed by a label that marks the end of the embedded text. By
default, variables and commands in backticks are evaluated:

$ cat << EOF
> Working dir $PWD
> EOF
Working dir /home/user

This can be disabled by putting the label in single or double quotes:

$ cat << 'EOF'
> Working dir $PWD
> EOF
Working dir $PWD
Introducing subprocess
Now that we've discussed the sort of functionality offered on the command line, let's experiment
with the subprocess module. Here's a simple command you can run on the command line:
$ echo "Hello world!"
Hello world!
Let's try to run this from Python.
In the past, process management in Python was dealt with by a large range of different Python
functions from all over the standard library. Since Python 2.4 all this functionality has been
carefully and neatly packaged up into the subprocess module, which provides one class called
Popen which is all you need to use.
Note
If you are interested in how the new Popen class replaces the old functionality, the subprocess
documentation has a section explaining how things used to be done and how they are done now.
The Popen class takes the following options which are all described in detail at
http://docs.python.org/library/subprocess.html#using-the-subprocess-module:
subprocess.Popen(args, bufsize=0, executable=None, stdin=None, stdout=None,
                 stderr=None, preexec_fn=None, close_fds=False, shell=False,
                 cwd=None, env=None, universal_newlines=False,
                 startupinfo=None, creationflags=0)
Luckily most of the time you will just need to use args and shell.
Using the Shell
Let's start with a simple example and run the Hello World! example in the same way as before,
passing the command through the shell:
>>> import subprocess
>>> subprocess.Popen('echo "Hello world!"', shell=True)
Hello world!
As you can see, this prints Hello world! to the standard output as before but the interactive
console also displays that we have created an instance of the subprocess.Popen class.
If you save this as process_test.py and execute it on the command line you get the same result:
$ python process_test.py
Hello world!
So far so good.
You might be wondering which shell is being used. On Unix, the default shell is /bin/sh. On
Windows, the default shell is specified by the COMSPEC environment variable. When you specify
shell=True you can customise the shell to use with the executable argument.
>>> subprocess.Popen('echo "Hello world!"', shell=True, executable="/bin/bash")
Hello world!
This example works the same as before but if you were using some shell-specific features you
would notice the differences.
Let's explore some other features of using the shell:
Variable expansion:
>>> subprocess.Popen('echo $PWD', shell=True)
/home/james/Desktop
Pipes and redirects:
>>> subprocess.Popen('echo "Hello world!" | tr a-z A-Z 2> errors.txt', shell=True)
HELLO WORLD!
The errors.txt file will be empty because there weren't any errors. Interestingly, on my computer
the Popen instance is displayed before the HELLO WORLD! message is printed to the standard output
this time. Pipes and redirects clearly work anyway.
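The same pipeline can also be built without the shell by chaining Popen objects, connecting one process's standard output to the next one's standard input. A minimal sketch (universal_newlines is used so the result comes back as text rather than bytes):

```python
import subprocess

# Equivalent of: echo "Hello world!" | tr a-z A-Z, with no shell involved.
echo = subprocess.Popen(['echo', 'Hello world!'], stdout=subprocess.PIPE)
tr = subprocess.Popen(['tr', 'a-z', 'A-Z'], stdin=echo.stdout,
                      stdout=subprocess.PIPE, universal_newlines=True)
# Close our copy of echo's stdout so echo receives SIGPIPE if tr exits early.
echo.stdout.close()
output = tr.communicate()[0]
print(output)
```

The point is that each stage's stdout file object can be passed directly as the next stage's stdin.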
Here documents:
>>> subprocess.Popen("""
... cat << EOF > new.txt
... Hello World!
... EOF
... """, shell=True)
The new.txt file now exists with the content Hello World!.
Unsurprisingly the features that work with a command passed to the shell via the command line
also work when passed to the shell from Python, including shell commands.
Strings or Argument Lists
While it is handy to be able to execute commands pretty much as you would on the command line,
you often need to pass variables from Python to the commands you are using. Let's say we wanted
to rewrite this function to use echo:
def print_string(string):
    print string
You might do this:
def print_string(string):
    subprocess.Popen('echo "%s"' % string, shell=True)
This will work fine for the string Hello World!:
>>> print_string('Hello world!')
Hello world!
But not for this:
>>> print_string('nasty " example')
/bin/sh: Syntax error: Unterminated quoted string
The command being executed is echo "nasty " example" and as you can see, there is a problem
with the escaping.
One approach is to deal with the escaping in your code but this can be cumbersome because you
have to deal with all the possible escape characters, spaces etc.
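If you do need to go through the shell, the standard library can do the quoting for you: shlex.quote() on Python 3 (the older pipes.quote() behaves similarly). A sketch:

```python
try:
    from shlex import quote  # Python 3.3+
except ImportError:
    from pipes import quote  # older Pythons

# quote() wraps the argument so the shell treats it as one literal word.
arg = 'nasty " example'
command = 'echo %s' % quote(arg)
print(command)
```

The quoted string can then be passed with shell=True without the unterminated-quote problem above.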
Python can handle the escaping for you, but you have to avoid using the shell. Let's look at this next.
Without the Shell
Now let's try the same thing without the shell:
def print_string(string):
    subprocess.Popen(['echo', string], shell=False)
>>> print_string('Hello world!')
Hello world!
>>> print_string('nasty " example')
nasty " example
As you can see, the escaping is now handled correctly.
Note
You can actually specify a single string as the argument when shell=False but it must be the
program itself and is no different from just specifying a list with one element for args. If you try
to execute the same sort of command you would when shell=True you get an error:
>>> subprocess.Popen('echo "Hello world!"', shell=False)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.5/subprocess.py", line 594, in __init__
    errread, errwrite)
  File "/usr/lib/python2.5/subprocess.py", line 1147, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory
Since we are still passing it as a string, Python assumes the entire string is the name of the
program to execute and there isn't a program called echo "Hello world!" so it fails. Instead you
have to pass each argument separately.
Checking a Program is on the PATH
Here's a function to find the actual location of a program:

import os

def whereis(program):
    for path in os.environ.get('PATH', '').split(os.pathsep):
        if os.path.exists(os.path.join(path, program)) and \
                not os.path.isdir(os.path.join(path, program)):
            return os.path.join(path, program)
    return None
Let's use this to find out where the echo program really is:
>>> location = whereis('echo')
>>> if location is not None:
... print location
/bin/echo
This function is also very useful for checking whether a user has a program your Python program
requires installed on their PATH.
Of course you can also find out the location of programs with the whereis command on the
command line:
$ whereis echo
echo: /bin/echo /usr/share/man/man1/echo.1.gz
Note
Notice that whether shell is True or False we haven't had to specify the full path to an
executable. As long as the executable is on the PATH environment variable, you can execute it
from Python. Of course, there is no harm in specifying the full path if you prefer.
If you want to be slightly perverse you can specify the executable argument rather than having
the executable as the first argument in the args list. This doesn't seem well documented but this is
what it does on my computer:
Reading from Standard Output and Standard Error
>>> print process.communicate()
('Message to stdout\n', 'Message to stderr\n')
This time both stdout and stderr can be accessed from Python.
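A sketch of the kind of setup that produces this result: pass stdout=subprocess.PIPE and stderr=subprocess.PIPE to Popen, then read both streams with communicate(). The helper script here is written to a temporary file and stands in for the example program (its exact contents are an assumption):

```python
import os
import subprocess
import sys
import tempfile

# A helper script that writes one message to each output stream.
helper = ("import sys\n"
          "sys.stdout.write('Message to stdout\\n')\n"
          "sys.stderr.write('Message to stderr\\n')\n")
fd, path = tempfile.mkstemp(suffix='.py')
with os.fdopen(fd, 'w') as f:
    f.write(helper)

# Open a pipe to each stream, then read both with communicate().
process = subprocess.Popen([sys.executable, path],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           universal_newlines=True)
out, err = process.communicate()
print((out, err))
os.remove(path)
```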
Now that all the messages have been printed, if we call communicate()again we get an error:
>>> print process.communicate()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.5/subprocess.py", line 668, in communicate
    return self._communicate(input)
  File "/usr/lib/python2.5/subprocess.py", line 1207, in _communicate
    rlist, wlist, xlist = select.select(read_set, write_set, [])
ValueError: I/O operation on closed file
The communicate() method reads data from stdout and stderr until end-of-file is reached, and it can only be called once.
Redirecting stderr to stdout
If you want messages to stderr to be piped to stdout, you can set a special value for the stderr
argument: stderr=subprocess.STDOUT .
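A quick sketch of the effect: with stderr=subprocess.STDOUT, both messages arrive through the single stdout pipe and no separate stderr pipe exists:

```python
import subprocess
import sys

# Run a child that writes to both streams, merging stderr into stdout.
code = ("import sys; sys.stdout.write('out\\n'); sys.stdout.flush(); "
        "sys.stderr.write('err\\n')")
process = subprocess.Popen([sys.executable, '-c', code],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.STDOUT,
                           universal_newlines=True)
out, err = process.communicate()
print(out)   # contains both messages, in the order they were flushed
print(err)   # None - no separate stderr pipe was opened
```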
Writing to Standard Input
Writing to a process is very similar. In order to write to the process you have to open a pipe to
stdin. Once again this can be done by specifying stdin=subprocess.PIPE as an argument to
Popen.
To test it let's write another program which simply prints Received: followed by the message
you send it. Call this test2.py:
import sys

input = sys.stdin.read()
sys.stdout.write('Received: %s' % input)
To send a message to stdin, you pass the string you want to send as the input argument to
communicate(). Let's try the program:
>>> process = subprocess.Popen(['python', 'test2.py'], shell=False, stdin=subprocess.PIPE)
>>> print process.communicate('How are you?')
Received: How are you?(None, None)
Notice that the message generated in the test2.py process was printed to stdout and then the
return value (None, None) was printed because no pipes were set up to stdout or stderr.
You can also specify stdout=subprocess.PIPE and stderr=subprocess.PIPE just as before to
set up the pipes.
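The same exchange can be sketched without a helper file by running the child with -c via sys.executable, with stdout piped as well so the reply comes back to Python rather than going to the console:

```python
import subprocess
import sys

# A child that echoes its stdin back with a prefix, like test2.py above.
code = "import sys; sys.stdout.write('Received: %s' % sys.stdin.read())"
process = subprocess.Popen([sys.executable, '-c', code],
                           stdin=subprocess.PIPE,
                           stdout=subprocess.PIPE,
                           universal_newlines=True)
out, err = process.communicate('How are you?')
print(out)
```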
The File-like Attributes
The Popen instance also has attributes stdout and stderr which you can read from as if they were
file-like objects, and a stdin attribute which you can write to as if it were a file. You can use these
instead of sending and receiving data via communicate() if you prefer. We'll see them next.
Reading and Writing to the Same Process
Here's another program, save it as test3.py:
import sys

while True:
    input = sys.stdin.readline()
    sys.stdout.write('Received: %s' % input)
    sys.stdout.flush()
This program simply echoes the value it receives. Let's test it like this:
>>> import time
>>> process = subprocess.Popen(['python', 'test3.py'], shell=False, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
>>> for i in range(5):
...     process.stdin.write('%d\n' % i)
...     output = process.stdout.readline()
...     print output
...     time.sleep(1)
...
Received: 0
Received: 1
Received: 2
Received: 3
Received: 4
>>>
Each line of output appears after a second.
You should now have enough knowledge to be able to reproduce in Python all the functionality
you might have relied on the shell for in the past. One caveat with reading and writing a process's
pipes directly: if an OS pipe buffer fills up while you are writing a large amount of data, the two
processes can deadlock, which is why communicate() is usually recommended.
Accessing Return Values, poll() and wait()
When a program exits it can return an integer value to indicate the exit status. Zero is considered
successful termination and any nonzero value is considered abnormal termination by shells
and the like. Most systems require it to be in the range 0-127, and produce undefined results
otherwise. Some systems have a convention for assigning specific meanings to specific exit codes,
but these are generally underdeveloped; Unix programs generally use 2 for command line syntax
errors and 1 for all other kinds of errors.
You can access the return code from an exited child process via the .returncode attribute of a
Popen instance. Here's an example:
>>> process = subprocess.Popen(['echo', 'Hello world!'], shell=False)
>>> process.poll()
>>> print process.returncode
None
>>> process.poll()
0
>>> print process.returncode
0
The returncode value is not ever set by the child process; it starts off with a default value of
None and remains None until you call a method on the Popen instance such as poll() or wait().
Those methods set and then return returncode. As a result, if you want to know the status of
the child process, you have to call either poll() or wait().
The poll() and wait() methods are subtly different:

Popen.poll()
    Check if the child process has terminated. Set and return the .returncode attribute.

Popen.wait()
    Wait for the child process to terminate. Set and return the .returncode attribute.
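A small sketch of the difference, using a child that just sleeps for half a second:

```python
import subprocess
import sys

# Start a child that stays alive briefly so poll() can see it running.
process = subprocess.Popen([sys.executable, '-c',
                            'import time; time.sleep(0.5)'])
first = process.poll()    # None: the child has not terminated yet
status = process.wait()   # blocks until the child exits, returns the status
print((first, status, process.returncode))
```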
Convenience Functions
The subprocess module has some convenience functions to make executing a command in the
simple case more straightforward. I don't use them though.
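The most commonly seen one is subprocess.call(), which is roughly Popen(args) followed by wait(): it runs the command, waits for it to finish and returns the exit status:

```python
import subprocess

# Run a command and get its exit status in one step.
status = subprocess.call(['echo', 'Hello world!'])
print(status)
```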
Understanding sys.argv
If you are writing a Python program to accept command line arguments, the arguments passed on
the command line are accessed as sys.argv. Here's a simple example, save it as command.py:
#!/usr/bin/env python

if __name__ == '__main__':
    import sys
    print "Executable: %s" % sys.argv[0]
    for arg in sys.argv[1:]:
        print "Arg: %s" % arg
The if __name__ == '__main__': line ensures the code beneath it only gets run if the file is
executed directly, not if it is imported. Make this file executable:
$ chmod 755 command.py
Here are some examples of this being run:
$ python command.py
Executable: command.py
$ python command.py arg1
Executable: command.py
Arg: arg1
$ python command.py arg1 arg2
Executable: command.py
Arg: arg1
Arg: arg2
Notice that sys.argv[0] always contains the name of the script which was executed, regardless
of how Python was invoked. sys.argv[1] and onwards are the command line arguments. You can
also invoke the program using Python's -m switch to run it as a module import:
$ python -m command
Executable: /home/james/Desktop/command.py
$ python -m command arg1
Executable: /home/james/Desktop/command.py
Arg: arg1
$ python -m command arg1 arg2
Executable: /home/james/Desktop/command.py
Arg: arg1
Arg: arg2
Once again, Python works out that python -m command is all part of the invoking command so
sys.argv[0] only contains the path of the Python script. Let's try executing it directly:
$ ./command.py
Executable: ./command.py
$ ./command.py arg1
Executable: ./command.py
Arg: arg1
$ ./command.py arg1 arg2
Executable: ./command.py
Arg: arg1
Arg: arg2
As you can see, sys.argv[0] still contains the Python script and sys.argv[1] and onwards
represent each argument.
Shell Expansion
One small complication when running programs from within a shell is that the shell will
sometimes substitute a pattern for a set of arguments. For example, consider this run in the Bash
shell:
$ ./command.py *.txt
You might expect the following output:
Executable: ./command.py
Arg: *.txt
but this isn't what you get. Instead the output depends on the number of .txt files in the current
working directory. When I run it I get this:
Executable: ./command.py
Arg: errors.txt
Arg: new.txt
Arg: output.txt
The Bash shell substitutes the *.txt pattern for the filenames of all the .txt files in the current
directory so your program receives more arguments than you might have expected.
You can disable shell expansion by quoting the argument, but actually most of the time it is a very
useful feature once you are aware of it.
$ ./command.py "*.txt"
Executable: ./command.py
Arg: *.txt
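When the shell is bypassed (shell=False) no such expansion happens, but Python's glob module performs the same filename expansion explicitly. A sketch using a scratch directory so the matches are predictable:

```python
import glob
import os
import tempfile

# Create a scratch directory with a mix of files.
workdir = tempfile.mkdtemp()
for name in ('errors.txt', 'new.txt', 'output.txt', 'notes.md'):
    open(os.path.join(workdir, name), 'w').close()

# Expand *.txt the same way Bash would before invoking a command.
matches = sorted(glob.glob(os.path.join(workdir, '*.txt')))
print([os.path.basename(path) for path in matches])
```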
To learn more about Bash shell expansions, see this: http://www.gnu.org/software/bash/manual/bashref.html#Filename-Expansion
Further Reading
See also:
http://www.doughellmann.com/PyMOTW/subprocess/ (and its O'Reilly copy here)
http://docs.python.org/library/subprocess.html
http://webpython.codepoint.net/cgi_shell_command
http://www.artima.com/weblogs/viewpost.jsp?thread=4829(About writing main()
functions)
Topics for a future post:
Send and receive signals between running processes
Run a program in the background or the foreground
Copyright James Gardner 1996-2009. All Rights Reserved. http://jimmyg.org/blog/2009/working-with-python-subprocess.html