concurrent.futures
The concurrent.futures module provides a high-level interface for asynchronously executing callables. It abstracts away the complexity of thread and process management, letting you focus on the actual work. You can execute tasks using threads via ThreadPoolExecutor or separate processes via ProcessPoolExecutor. Both implement the same interface, so switching between them often requires just a one-line change.
Added in Python 3.2.
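The interchangeable interface can be sketched as follows; swapping the executor class is the only change needed (`work` is a stand-in workload, not part of the module):

```python
import concurrent.futures

def work(x):
    # Stand-in workload; any function works with threads,
    # and any picklable function works with processes.
    return x * x

with concurrent.futures.ThreadPoolExecutor() as executor:
    # Swapping to processes is the one-line change:
    # with concurrent.futures.ProcessPoolExecutor() as executor:
    results = list(executor.map(work, range(5)))

print(results)
# [0, 1, 4, 9, 16]
```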
ThreadPoolExecutor
ThreadPoolExecutor is an Executor subclass that uses a pool of threads to execute calls asynchronously. It’s ideal for I/O-bound tasks like making HTTP requests, reading files, or querying databases. Threads share memory, so you can pass most Python objects directly without serialization.
Syntax
```python
class concurrent.futures.ThreadPoolExecutor(
    max_workers=None,
    thread_name_prefix='',
    initializer=None,
    initargs=()
)
```
Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| max_workers | int | None | Maximum number of threads. Defaults to min(32, os.cpu_count() + 4). |
| thread_name_prefix | str | '' | Prefix for thread names, for easier debugging. |
| initializer | callable | None | Callable run at the start of each worker thread. |
| initargs | tuple | () | Arguments passed to the initializer. |
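A quick sketch of the debugging-oriented parameters (`init_worker`, `whoami`, and the "scraper" prefix are illustrative names, not part of the API):

```python
import concurrent.futures
import threading

def init_worker(tag):
    # Runs once in each worker thread before it executes any task.
    print(f"{threading.current_thread().name} initialized with tag={tag!r}")

def whoami(_):
    return threading.current_thread().name

with concurrent.futures.ThreadPoolExecutor(
    max_workers=2,
    thread_name_prefix="scraper",
    initializer=init_worker,
    initargs=("demo",),
) as executor:
    names = set(executor.map(whoami, range(4)))

print(names)  # every worker name starts with the "scraper" prefix
```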
Examples
Basic usage with submit()
```python
import concurrent.futures
import time

def slow_square(x):
    time.sleep(1)
    return x ** 2

with concurrent.futures.ThreadPoolExecutor(max_workers=3) as executor:
    future = executor.submit(slow_square, 5)
    result = future.result()
    print(result)
# 25
```
Using map() for multiple inputs
```python
import concurrent.futures

def double(x):
    return x * 2

with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
    results = executor.map(double, [1, 2, 3, 4, 5])
    print(list(results))
# [2, 4, 6, 8, 10]
```
ProcessPoolExecutor
ProcessPoolExecutor uses a pool of processes instead of threads. Each worker runs in its own process with its own Python interpreter and Global Interpreter Lock (GIL). This makes it suitable for CPU-bound tasks like mathematical computations, image processing, or data analysis. The tradeoff is that only picklable objects can be passed to and returned from worker functions.
Syntax
```python
class concurrent.futures.ProcessPoolExecutor(
    max_workers=None,
    mp_context=None,
    initializer=None,
    initargs=(),
    max_tasks_per_child=None
)
```
Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| max_workers | int | None | Maximum number of processes. Defaults to os.cpu_count(); on Windows it must be at most 61. |
| mp_context | multiprocessing.Context | None | Multiprocessing context for launching workers. |
| initializer | callable | None | Callable run at the start of each worker process. |
| initargs | tuple | () | Arguments passed to the initializer. |
| max_tasks_per_child | int | None | Maximum tasks per worker before replacement (Python 3.11+). |
Examples
CPU-bound computation
```python
import concurrent.futures
import math

def is_prime(n):
    if n < 2:
        return False
    if n == 2:
        return True
    if n % 2 == 0:
        return False
    for i in range(3, int(math.sqrt(n)) + 1, 2):
        if n % i == 0:
            return False
    return True

numbers = [1000003, 1000033, 1000037, 1000039, 1000081]

if __name__ == "__main__":
    with concurrent.futures.ProcessPoolExecutor() as executor:
        results = executor.map(is_prime, numbers)
        for n, prime in zip(numbers, results):
            print(f"{n} is prime: {prime}")
# 1000003 is prime: True
# 1000033 is prime: True
# 1000037 is prime: True
# 1000039 is prime: True
# 1000081 is prime: True
```
Future Objects
A Future represents the asynchronous execution of a callable. Futures are created by calling Executor.submit() (note that Executor.map() returns an iterator of results, not Futures). The Future object lets you check status, cancel execution, or retrieve the result.
Syntax
class concurrent.futures.Future
Methods
| Method | Description |
|---|---|
| cancel() | Attempts to cancel the call. Returns True if it was cancelled, False if it is already running or finished. |
| cancelled() | Returns True if the call was successfully cancelled. |
| running() | Returns True if the call is currently executing and cannot be cancelled. |
| done() | Returns True if the call was cancelled or finished running. |
| result(timeout=None) | Returns the result, waiting up to timeout seconds; raises TimeoutError if the call hasn't completed in time. |
| exception(timeout=None) | Returns the exception raised by the call, or None if it completed without raising; waits up to timeout seconds. |
| add_done_callback(fn) | Attaches a callback to be called when the Future completes or is cancelled. |
Examples
Checking status and getting result
```python
import concurrent.futures
import time

def task(n):
    time.sleep(0.5)
    return n * 2

with concurrent.futures.ThreadPoolExecutor() as executor:
    future = executor.submit(task, 10)
    print(f"Done: {future.done()}")
    # Done: False
    result = future.result()
    print(f"Result: {result}")
    # Result: 20
    print(f"Done: {future.done()}")
    # Done: True
```
Using callbacks
```python
import concurrent.futures

def task():
    return 42

def callback(future):
    print(f"Task completed with result: {future.result()}")

with concurrent.futures.ThreadPoolExecutor() as executor:
    future = executor.submit(task)
    future.add_done_callback(callback)
# Task completed with result: 42
```
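cancel() only succeeds while a Future is still queued; once a worker has picked the task up, it can no longer be cancelled. A sketch using a single-worker pool to keep the second submission pending (`slow` is an illustrative helper):

```python
import concurrent.futures
import time

def slow():
    time.sleep(1)
    return "done"

with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
    running = executor.submit(slow)   # picked up immediately
    queued = executor.submit(slow)    # waits behind the first task
    time.sleep(0.1)                   # give the first task time to start
    ok_running = running.cancel()     # False: already executing
    ok_queued = queued.cancel()       # True: still pending
    print(ok_running, ok_queued, queued.cancelled())
    # False True True
```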
Module Functions
wait()
Wait for multiple Futures to complete. Returns a named 2-tuple of sets: done contains completed Futures, and not_done contains those still pending.
concurrent.futures.wait(fs, timeout=None, return_when=ALL_COMPLETED)
| Parameter | Type | Default | Description |
|---|---|---|---|
| fs | iterable of Future | — | Futures to wait on. |
| timeout | float | None | Maximum seconds to wait. |
| return_when | str | ALL_COMPLETED | When to return: FIRST_COMPLETED, FIRST_EXCEPTION, or ALL_COMPLETED. |
```python
import concurrent.futures
import time

def work(n):
    time.sleep(n)
    return n

with concurrent.futures.ThreadPoolExecutor() as executor:
    f1 = executor.submit(work, 2)
    f2 = executor.submit(work, 1)
    f3 = executor.submit(work, 3)
    done, not_done = concurrent.futures.wait(
        [f1, f2, f3],
        return_when=concurrent.futures.FIRST_COMPLETED
    )
    print(f"Done: {len(done)}, Not done: {len(not_done)}")
    # Done: 1, Not done: 2
```
as_completed()
Returns an iterator that yields Futures as they complete. This is useful when you want to process results as soon as they’re available rather than waiting for all submissions.
concurrent.futures.as_completed(fs, timeout=None)
| Parameter | Type | Default | Description |
|---|---|---|---|
| fs | iterable of Future | — | Futures to iterate over. |
| timeout | float | None | Maximum seconds to wait for the next completion. |
```python
import concurrent.futures
import time

def work(n):
    time.sleep(n)
    return n

with concurrent.futures.ThreadPoolExecutor() as executor:
    futures = [executor.submit(work, i) for i in [3, 1, 2]]
    for future in concurrent.futures.as_completed(futures):
        print(f"Result: {future.result()}")
# Result: 1
# Result: 2
# Result: 3
```
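Because as_completed() yields Futures in completion order, the association with the original inputs is lost. A common idiom is to key a dict by Future; a sketch (`future_to_input` and `work` are illustrative names):

```python
import concurrent.futures
import time

def work(n):
    time.sleep(n * 0.1)
    return n * 10

with concurrent.futures.ThreadPoolExecutor() as executor:
    # Map each Future back to the input that produced it.
    future_to_input = {executor.submit(work, n): n for n in [3, 1, 2]}
    pairs = []
    for future in concurrent.futures.as_completed(future_to_input):
        pairs.append((future_to_input[future], future.result()))

print(pairs)  # completion order, shortest sleep first
```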
Common Patterns
Context manager for automatic cleanup
Always use executors as context managers. This ensures resources are freed properly and all pending tasks complete before the program exits.
```python
with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
    futures = [executor.submit(task, arg) for arg in args]
    for f in concurrent.futures.as_completed(futures):
        print(f.result())
# Executor shuts down automatically here
```
Handling exceptions
When a callable raises an exception, it’s captured in the Future and re-raised when you call result(). You can also check for exceptions without raising them.
```python
import concurrent.futures

def failing_task():
    raise ValueError("Something went wrong")

with concurrent.futures.ThreadPoolExecutor() as executor:
    future = executor.submit(failing_task)
    try:
        future.result()
    except ValueError as e:
        print(f"Caught: {e}")
# Caught: Something went wrong
```
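To inspect a failure without triggering the raise, Future.exception() returns the exception object (or None on success); a minimal sketch:

```python
import concurrent.futures

def failing_task():
    raise ValueError("Something went wrong")

with concurrent.futures.ThreadPoolExecutor() as executor:
    future = executor.submit(failing_task)
    exc = future.exception()  # blocks until done; returns instead of raising
    if exc is not None:
        print(f"Failed with: {exc!r}")
# Failed with: ValueError('Something went wrong')
```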
Errors
| Error | When it occurs |
|---|---|
| TimeoutError | Result not available within the specified timeout. |
| CancelledError | Future was cancelled before completing. |
| BrokenExecutor | Worker pool crashed (e.g., an initializer failed). |
| InvalidStateError | Operation on a Future in an invalid state (e.g., setting a result twice). |