Generators in Python - Interview Questions and Answers
A generator is a special type of iterator that allows lazy evaluation. It is defined using a function with the yield keyword instead of return.
A normal function returns a value and terminates, whereas a generator yields values one at a time and can resume execution from where it left off.
The yield keyword is used to return a value from a generator function while preserving its state for subsequent calls.
A generator is created using a function that contains one or more yield statements.
def my_generator():
    yield 1
    yield 2
    yield 3
You can use a for loop or the next() function to iterate over a generator.
gen = my_generator()
print(next(gen)) # Output: 1
print(next(gen)) # Output: 2
print(next(gen)) # Output: 3
A StopIteration exception is raised when a generator has no more values to yield.
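As a quick sketch (the generator name here is illustrative), calling next() on an exhausted generator raises StopIteration, which a for loop catches for you automatically:

```python
def pair():
    yield "a"
    yield "b"

g = pair()
print(next(g))  # a
print(next(g))  # b
try:
    next(g)  # exhausted: raises StopIteration
except StopIteration:
    print("done")
```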
Lazy evaluation reduces memory usage by generating values only when needed, rather than storing all values in memory.
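A quick illustration of the memory difference: a list comprehension stores every value up front, while the equivalent generator expression stores only its current state. The exact byte counts vary by Python version, so the numbers in the comments are indicative only.

```python
import sys

squares_list = [n * n for n in range(100_000)]  # stores every value up front
squares_gen = (n * n for n in range(100_000))   # stores only the "recipe"

print(sys.getsizeof(squares_list))  # hundreds of kilobytes
print(sys.getsizeof(squares_gen))   # a couple hundred bytes
```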
Yes, a generator can have multiple yield statements, allowing it to produce multiple values.
Yes, but a return statement terminates the generator and raises a StopIteration exception; any returned value is attached to that exception.
You can check using isinstance(obj, types.GeneratorType).
Yes, but it requires yield from to delegate to the recursive call properly.
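As a sketch, here is a recursive generator that flattens arbitrarily nested lists (the name flatten is chosen here for illustration):

```python
def flatten(items):
    for item in items:
        if isinstance(item, list):
            yield from flatten(item)  # delegate to the recursive call
        else:
            yield item

print(list(flatten([1, [2, [3, 4]], 5])))  # [1, 2, 3, 4, 5]
```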
The yield from statement is used to delegate part of a generator’s operations to another generator.
def sub_generator():
    yield 1
    yield 2

def main_generator():
    yield from sub_generator()
    yield 3
No, once a generator is exhausted, it cannot be restarted. You need to create a new instance.
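For example, iterating the same generator object twice yields nothing the second time; calling the generator function again creates a fresh, independent instance (make_gen is an illustrative name):

```python
def make_gen():
    yield 1
    yield 2

g = make_gen()
print(list(g))           # [1, 2]
print(list(g))           # [] -- already exhausted
print(list(make_gen()))  # [1, 2] -- a fresh instance works again
```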
Use the .close() method to stop a generator; it raises GeneratorExit inside the generator at the paused yield.
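A minimal sketch of .close(), assuming an illustrative counter generator that performs cleanup when closed:

```python
def counter():
    n = 0
    try:
        while True:
            yield n
            n += 1
    except GeneratorExit:
        # runs when .close() is called; a good place for cleanup
        pass

c = counter()
print(next(c))  # 0
print(next(c))  # 1
c.close()       # stops the generator; further next() raises StopIteration
```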
Use the .send() method to send values into a generator.
return stops the function permanently, while yield pauses it so execution can resume later.
Yes, just like normal functions.
You cannot reset a generator, but you can create a new instance.
Yes, an infinite generator keeps yielding values indefinitely.
def infinite_counter():
    count = 0
    while True:
        yield count
        count += 1
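Since an infinite generator never ends on its own, you typically take a bounded slice of it, for example with itertools.islice:

```python
from itertools import islice

def infinite_counter():
    count = 0
    while True:
        yield count
        count += 1

# islice stops after 5 items even though the generator never would
print(list(islice(infinite_counter(), 5)))  # [0, 1, 2, 3, 4]
```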
It converts the generator into a list, consuming all values.
Generators help process large files line by line without loading everything into memory.
Generators can be used to yield page content one by one, reducing memory usage.
Streaming large logs or API responses efficiently.
Generators can yield training data batch by batch instead of loading it all at once.
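A minimal batching sketch (the helper name batches and the batch_size parameter are illustrative):

```python
def batches(data, batch_size):
    # yield successive fixed-size chunks instead of loading everything at once
    for i in range(0, len(data), batch_size):
        yield data[i:i + batch_size]

for batch in batches(list(range(10)), batch_size=4):
    print(batch)  # prints [0, 1, 2, 3] then [4, 5, 6, 7] then [8, 9]
```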
Handling multiple connections efficiently without blocking execution.
Use list(generator), but be aware that it consumes all elements.
You can store the generator object, but it doesn’t store its values.
In pagination, a generator yields a fixed number of items per page, so each request loads only what it needs.
It keeps only the current yielded value in memory, reducing memory footprint.
The generator can catch and handle exceptions.
All generators are iterators, but not all iterators are generators. Generators use yield, while custom iterators must implement __iter__() and __next__() explicitly.
No, yield cannot be used inside a lambda; the closest alternative is a generator expression.
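A generator expression gives the same lazy behavior in a single expression, which is usually what is wanted in place of a lambda:

```python
squares = (n * n for n in range(5))  # lazy, like a generator function
print(next(squares))  # 0
print(list(squares))  # [1, 4, 9, 16] -- the remaining values
```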
It allows a generator to yield values from another generator directly.
Use next(), for loops, or print statements to inspect behavior.
Yes, using threading or multiprocessing.
It sends an exception into the generator to handle errors.
It keeps the context manager’s scope open between yields, so the managed resource stays available each time the generator resumes.
Use a return statement, which ends the generator by raising StopIteration.
Coroutines are enhanced generators that can receive values via .send().
It returns the next item or a default value if exhausted.
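For example, the two-argument form next(gen, default) avoids StopIteration by returning the default once the generator is exhausted:

```python
g = (x for x in [1])
print(next(g, -1))  # 1
print(next(g, -1))  # -1 (exhausted, so the default is returned)
```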
The .throw() method is used to raise an exception inside a generator at the point where it was paused.
def gen():
    try:
        yield 1
    except ValueError:
        yield "Exception handled"

g = gen()
print(next(g))  # Output: 1
print(g.throw(ValueError))  # Output: "Exception handled"
The .send() method allows sending values into a generator to modify its behavior dynamically.
def my_gen():
    value = yield
    yield f"Received {value}"

g = my_gen()
next(g)  # Prime the generator
print(g.send("Hello"))  # Output: Received Hello
Sending None is equivalent to calling next() and is used to start the generator initially.
Yes, yield can be used inside a with statement, ensuring resource cleanup.
def file_reader(filename):
    with open(filename) as f:
        for line in f:
            yield line.strip()
It refers to combining multiple generators using yield from, making code more modular.
You can use try-except blocks to handle exceptions within a generator.
No, generators maintain state and cannot be directly copied using copy.deepcopy().
Use memory profiling tools such as memory_profiler; note that sys.getsizeof(generator) reports only the size of the generator object itself, not the values it will produce.
Convert its output into a list using list(generator()) and compare it with the expected results.
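A minimal test sketch (countdown is an illustrative generator; the bare assert works the same under pytest):

```python
def countdown(n):
    while n > 0:
        yield n
        n -= 1

def test_countdown():
    # consume the whole generator and compare against the expected list
    assert list(countdown(3)) == [3, 2, 1]

test_countdown()
print("ok")
```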
Generators are already iterators, since they implement __iter__() and __next__() automatically.