yield
The yield keyword in Python turns a function into a generator function: calling it returns a generator object instead of running the body immediately. Rather than computing a single value and stopping, a generator can pause execution and yield multiple values over time. This enables lazy evaluation — producing values on demand rather than building entire lists in memory.
Syntax
def generator_function():
    # Generator body
    yield value1
    yield value2
    # ...
Each yield suspends the function, saving its local state so that execution resumes exactly where it left off the next time a value is requested.
Basic Examples
A Simple Counter Generator
def count_up_to(n):
    i = 1
    while i <= n:
        yield i
        i += 1

for num in count_up_to(5):
    print(num)
# 1
# 2
# 3
# 4
# 5
Yielding from an Iterable
def double_values(items):
    for item in items:
        yield item * 2

for doubled in double_values([1, 2, 3]):
    print(doubled)
# 2
# 4
# 6
How Generators Work
Lazy Evaluation
Generators produce values one at a time, only when requested:
def lazy_range(n):
    print("Starting")
    for i in range(n):
        yield i
    print("Done")
gen = lazy_range(3)
# Nothing printed yet — generator is idle
print(next(gen)) # "Starting" then 0
print(next(gen)) # 1
print(next(gen)) # 2
print(next(gen)) # prints "Done", then raises StopIteration
Memory Efficiency
Generators shine when working with large datasets:
# Bad — builds entire list in memory
def get_millions_bad():
    return [i for i in range(1_000_000)]

# Good — yields one at a time
def get_millions_good():
    for i in range(1_000_000):
        yield i
# Both can be iterated, but generator uses constant memory
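To make the difference concrete, sys.getsizeof shows the gap between the two (the exact byte counts are CPython implementation details, so treat the numbers as illustrative):

```python
import sys

# A list stores every element up front; a generator stores only its
# paused stack frame, so its size does not grow with the range.
big_list = [i for i in range(1_000_000)]
big_gen = (i for i in range(1_000_000))

print(sys.getsizeof(big_list))  # several megabytes
print(sys.getsizeof(big_gen))   # a few hundred bytes at most
```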
Generator Methods
next()
Get the next value from a generator:
def numbers():
    yield 1
    yield 2
    yield 3

gen = numbers()
print(next(gen)) # 1
print(next(gen)) # 2
print(next(gen)) # 3
# print(next(gen)) # StopIteration
send()
Send a value into a generator:
def accumulator():
    total = 0
    while True:
        value = yield total
        if value is not None:
            total += value
gen = accumulator()
print(next(gen)) # 0 — prime the generator
print(gen.send(10)) # 10
print(gen.send(5)) # 15
This allows two-way communication with generators.
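As another sketch of this pattern, a hypothetical running_average coroutine (not part of the standard library) recomputes its state from each value sent in:

```python
def running_average():
    """Coroutine that yields the mean of all values sent so far."""
    total = 0.0
    count = 0
    average = None
    while True:
        value = yield average  # receive the next sample from send()
        total += value
        count += 1
        average = total / count

avg = running_average()
next(avg)            # prime the coroutine (yields None)
print(avg.send(10))  # 10.0
print(avg.send(20))  # 15.0
print(avg.send(30))  # 20.0
```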
throw()
Inject an exception into a generator:
def fragile():
    try:
        yield 1
    except ValueError:
        yield "Caught!"
gen = fragile()
print(next(gen)) # 1
print(gen.throw(ValueError)) # "Caught!"
close()
Terminate a generator:
def infinite():
    while True:
        yield 1
gen = infinite()
print(next(gen)) # 1
gen.close() # Generator closed
# next(gen) raises StopIteration
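close() works by raising GeneratorExit at the paused yield, so a try/finally inside the generator is a reliable place for cleanup. A minimal sketch, using a log list to make the ordering observable:

```python
log = []

def resource_reader():
    """Yields data until closed; the finally block always runs."""
    log.append("open")
    try:
        while True:
            yield "data"
    finally:
        # close() raises GeneratorExit at the paused yield,
        # so this cleanup is guaranteed to run
        log.append("close")

gen = resource_reader()
print(next(gen))  # "data" (log records "open")
gen.close()       # log records "close"
print(log)        # ['open', 'close']
```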
yield from
Delegating to Sub-generators
yield from delegates to another iterable:
def chain(*iterables):
    for it in iterables:
        yield from it

for item in chain([1, 2], [3, 4], [5]):
    print(item)
# 1
# 2
# 3
# 4
# 5
Equivalent to a Loop
# These are equivalent:
def gen1():
    yield from [1, 2, 3]

def gen2():
    for item in [1, 2, 3]:
        yield item
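One thing yield from does that the plain loop cannot: it forwards the sub-generator's return value. A generator's return value is stored on StopIteration.value, and yield from hands it back as the value of the expression:

```python
def inner():
    yield 1
    yield 2
    return "inner done"   # stored on StopIteration.value

def outer():
    result = yield from inner()   # captures inner's return value
    yield f"inner said: {result}"

print(list(outer()))  # [1, 2, 'inner said: inner done']
```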
Common Patterns
Processing Large Files
def read_lines(filepath):
    with open(filepath) as f:
        for line in f:
            yield line.strip()

# Process a million-line file without loading it all
for line in read_lines("huge_file.txt"):
    process(line)  # One line at a time
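Generator stages like this compose into grep-style filters. A hypothetical grep stage (the name and the sample lines are illustrative) works on any iterable of lines, including read_lines above:

```python
def grep(pattern, lines):
    """Filter stage: pass through only lines containing pattern."""
    for line in lines:
        if pattern in line:
            yield line

lines = ["INFO start", "ERROR disk full", "INFO done", "ERROR timeout"]
print(list(grep("ERROR", lines)))  # ['ERROR disk full', 'ERROR timeout']
```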
Infinite Sequences
def fibonacci():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

fib = fibonacci()
for _ in range(10):
    print(next(fib))
# 0
# 1
# 1
# 2
# 3
# 5
# 8
# 13
# 21
# 34
Pipeline Processing
Chain generators for data pipelines:
def numbers():
    yield from range(10)

def double(n):
    return n * 2

def is_even(n):
    return n % 2 == 0

# Chain: numbers -> filter evens -> double -> collect
result = [double(n) for n in numbers() if is_even(n)]
print(result)  # [0, 4, 8, 12, 16]
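The same pipeline can also be written as chained generator stages, where each stage lazily pulls from the previous one and nothing is computed until the result is consumed:

```python
def numbers():
    yield from range(10)

def evens_only(nums):
    # Filter stage
    for n in nums:
        if n % 2 == 0:
            yield n

def doubled(nums):
    # Transform stage
    for n in nums:
        yield n * 2

# Each stage pulls one value at a time from the stage before it
pipeline = doubled(evens_only(numbers()))
print(list(pipeline))  # [0, 4, 8, 12, 16]
```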
Generator Expressions
For simple transformations, use generator expressions:
# Generator expression (lazy)
gen = (x * 2 for x in range(10))
# List comprehension (eager)
lst = [x * 2 for x in range(10)]
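Because a generator expression is itself an iterator, it can be passed straight to a consuming function such as sum; the extra parentheses may be dropped when it is the sole argument:

```python
# Sum of squares without materializing an intermediate list
total = sum(x * x for x in range(10))
print(total)  # 285
```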
Coroutines with send()
Two-Way Communication
def transformer(source):
    threshold = yield
    for item in source:
        if item > threshold:
            threshold = yield item

gen = transformer([1, 5, 3, 8, 2])
next(gen)            # prime the generator
print(gen.send(4))   # 5 (first item greater than 4)
print(gen.send(6))   # 8 (first remaining item greater than 6)
Best Practices
Use Generators for Large Data
# Bad — loads everything
data = [process(item) for item in huge_list]
# Good — processes one at a time
def process_all(items):
    for item in items:
        yield process(item)
Prefer yield from Over Manual Loops
# Verbose
def flatten(list_of_lists):
    for sublist in list_of_lists:
        for item in sublist:
            yield item

# Clean
def flatten(list_of_lists):
    for sublist in list_of_lists:
        yield from sublist
Remember Generators Are One-Way
def gen():
    yield 1
    yield 2
g = gen()
list(g) # [1, 2]
list(g) # [] — exhausted!
You cannot rewind a generator. Create a new one if needed.
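When two passes over the same stream are genuinely needed, itertools.tee splits one iterator into independent branches, at the cost of buffering values that one branch has consumed and the other has not:

```python
from itertools import tee

def numbers_gen():
    yield 1
    yield 2

# tee buffers values so each branch can be consumed independently
a, b = tee(numbers_gen())
print(list(a))  # [1, 2]
print(list(b))  # [1, 2]
```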
Use itertools for Complex Operations
from itertools import islice, takewhile
def numbers():
    n = 0
    while True:
        yield n
        n += 1
# Take first 10
print(list(islice(numbers(), 10))) # [0, 1, 2, ..., 9]
# Take while condition is true
print(list(takewhile(lambda x: x < 5, numbers()))) # [0, 1, 2, 3, 4]
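itertools.count is the library version of the hand-written infinite counter above, and it accepts an optional start and step:

```python
from itertools import count, islice

print(list(islice(count(), 5)))       # [0, 1, 2, 3, 4]
print(list(islice(count(10, 2), 5)))  # [10, 12, 14, 16, 18]
```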
See Also
- return keyword — returning from functions
- def keyword — function definitions
- async keyword — async generators with async def
- lambda keyword — anonymous functions and generator expressions