Here’s a curated list of Python-specific functions, constructs, and concepts (decorators, wrappers, generators, and more) that come up frequently in Python interviews, especially for developer, data engineer, and backend roles.
✅ Core Python Functional Concepts (Highly Asked)
Decorators: wrap a function to modify or extend its behavior. Example: @decorator_name, functools.wraps
Wrapper Functions: the inner function that wraps the logic (used inside decorators); built with closures
Lambda Functions: anonymous short functions, e.g. lambda x: x + 1
Map / Filter / Reduce: functional operations on iterables; often paired with lambda
Generators: lazy iterables using yield, e.g. def gen(): yield 1
Iterators / Iterables: the __iter__() / __next__() protocol, for custom looping logic
List / Dict Comprehension: compact way to build lists/dicts/sets, e.g. [x for x in iterable if x > 0]
Closures: a function that remembers variables from its enclosing scope; the basis of decorators
Context Managers: the with statement plus __enter__() / __exit__(), e.g. with open() or custom resource management
Partial Functions: fix some arguments of a function in advance using functools.partial()
itertools: lazy iteration helpers (chain, cycle, product, etc.)
contextlib: @contextmanager, suppress, closing
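A quick hedged sketch of those last two rows in action (the file name is made up for illustration):

import itertools
from contextlib import suppress

# itertools: lazy building blocks for iteration
print(list(itertools.chain([1, 2], [3, 4])))             # [1, 2, 3, 4]
print(list(itertools.islice(itertools.cycle("ab"), 5)))  # ['a', 'b', 'a', 'b', 'a']
print(list(itertools.product("ab", [1, 2])))             # [('a', 1), ('a', 2), ('b', 1), ('b', 2)]

# contextlib.suppress: swallow one specific, expected exception
with suppress(FileNotFoundError):
    open("missing.txt")  # no crash if the file doesn't exist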
✅ Suggested Top Interview Code Snippets to Master (the ones not shown in the sections below are sketched right after this list):
Custom decorator with @wraps
Closure with captured variables
Generator with yield and yield from
Custom context manager using class or @contextmanager
Using map, filter, reduce with lambda
*args, **kwargs demo and how they’re passed
Writing a callable class with __call__
Custom sorting using sorted(iterable, key=lambda x: ...)
Using partial() to simplify function calls
Mocking or monkey-patching a function
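Decorators, generators, context managers, sorting, and *args/**kwargs all get full treatment below. For the rest, here is one minimal, hedged sketch; every name and value in it is illustrative:

from functools import partial, reduce
from unittest import mock

# Closure: the inner function remembers n from the enclosing scope
def make_multiplier(n):
    def multiply(x):
        return x * n
    return multiply

double = make_multiplier(2)
print(double(5))  # 10

# map / filter / reduce with lambda
nums = [1, 2, 3, 4]
print(list(map(lambda x: x * 2, nums)))          # [2, 4, 6, 8]
print(list(filter(lambda x: x % 2 == 0, nums)))  # [2, 4]
print(reduce(lambda a, b: a + b, nums))          # 10

# Callable class: instances behave like functions via __call__
class Adder:
    def __init__(self, n):
        self.n = n
    def __call__(self, x):
        return x + self.n

add5 = Adder(5)
print(add5(3))  # 8

# partial(): pre-fill arguments of an existing function
int_from_hex = partial(int, base=16)
print(int_from_hex("ff"))  # 255

# Monkey-patching with unittest.mock: swap out time.sleep during a test
import time
with mock.patch("time.sleep") as fake_sleep:
    time.sleep(100)  # intercepted; returns instantly
    fake_sleep.assert_called_once_with(100)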
Here is the inline version: ✅ top Python interview concepts ✅ code examples ✅ a mindmap-style summary ✅ notebook-style practice.
🧠 1. Decorators and Wrapper Functions
from functools import wraps

def my_decorator(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        print("Before the function runs")
        result = func(*args, **kwargs)
        print("After the function runs")
        return result
    return wrapper

@my_decorator
def greet(name):
    print(f"Hello, {name}!")

greet("Rajeev")
🦆 8. Duck Typing

class Duck:
    def quack(self):
        print("Quack!")

class Person:
    def quack(self):
        print("I can quack like a duck!")

def in_the_forest(duckish):
    duckish.quack()

in_the_forest(Duck())    # Quack!
in_the_forest(Person())  # I can quack like a duck!
📊 9. Sorting with Key
data = ["apple", "banana", "cherry"]
sorted_data = sorted(data, key=lambda x: len(x)) # by length
print(sorted_data) # ['apple', 'banana', 'cherry']
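The same key= pattern extends to records and to multi-level sorts; a minimal sketch with made-up data:

people = [{"name": "Asha", "age": 31}, {"name": "Ravi", "age": 25}]
by_age = sorted(people, key=lambda p: p["age"])
print([p["name"] for p in by_age])  # ['Ravi', 'Asha']

# Sort by length, then alphabetically, descending
words = ["bb", "a", "ccc"]
print(sorted(words, key=lambda w: (len(w), w), reverse=True))  # ['ccc', 'bb', 'a']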
📚 10. *args, **kwargs
def print_all(*args, **kwargs):
    for a in args:
        print(a)
    for key, value in kwargs.items():
        print(f"{key} = {value}")

print_all(1, 2, 3, name="Rajeev", job="Engineer")
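The same * and ** syntax works in reverse at the call site, unpacking a sequence and a dict into positional and keyword arguments:

args = (1, 2, 3)
kwargs = {"name": "Rajeev", "job": "Engineer"}
print_all(*args, **kwargs)  # equivalent to the call above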
🧠 MINDMAP SUMMARY (Visualized as Indented Bullets)

Python Interview Concepts
    Functions
        Decorators, wrapper functions, closures
        Lambda, map / filter / reduce, functools.partial()
    Iteration
        Generators (yield, yield from), iterators (__iter__ / __next__)
        Comprehensions, itertools
    Resources
        Context managers (with, __enter__ / __exit__, contextlib)
    Flexibility
        *args / **kwargs, duck typing, callable classes (__call__)

🧪 NOTEBOOK-STYLE PRACTICE
# Practice 1: Write a decorator to log function execution time
import time
from functools import wraps

def timer(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        end = time.time()
        print(f"{func.__name__} took {end - start:.4f}s")
        return result
    return wrapper

@timer
def slow_add(x, y):
    time.sleep(1)
    return x + y

slow_add(3, 5)
# Practice 2: Generator to read a file line by line (lazy)
def read_lines(filename):
    with open(filename) as f:
        for line in f:
            yield line.strip()

# for line in read_lines('myfile.txt'):
#     print(line)
# Practice 3: Custom context manager using @contextmanager
from contextlib import contextmanager

@contextmanager
def open_file(name):
    f = open(name, 'r')
    try:
        yield f
    finally:
        f.close()

# with open_file('data.txt') as f:
#     print(f.read())
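The snippet list above also mentions the class-based form of a context manager; here is the equivalent sketch implementing __enter__ and __exit__ directly:

class OpenFile:
    def __init__(self, name):
        self.name = name
    def __enter__(self):
        self.f = open(self.name, 'r')
        return self.f
    def __exit__(self, exc_type, exc_value, traceback):
        self.f.close()
        return False  # don't suppress exceptions raised inside the with block

# with OpenFile('data.txt') as f:
#     print(f.read())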
Generators in Python are used to create iterators in a memory-efficient, lazy way. Their primary purpose is to yield items one at a time, only when requested, instead of generating all items at once and storing them in memory like lists.
✅ Why Use Generators?
🔁 Lazy Evaluation: produces values on the fly instead of computing everything at once
🧠 Memory Efficient: doesn't store the whole dataset in memory
⚡ Faster for Streaming Data: ideal for processing large files or streams
🔗 Composable Pipelines: can be chained together like UNIX pipes
🧰 Simplifies Complex Iterators: cleaner code using yield instead of managing __iter__() and __next__() manually
🧪 Example: Generator vs List
❌ List (eager evaluation)
squares = [x**2 for x in range(1_000_000)]
# Uses lots of memory
✅ Generator (lazy)
def square_gen():
    for x in range(1_000_000):
        yield x**2

gen = square_gen()  # only computes one value at a time
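To see the gap concretely, sys.getsizeof on a generator expression stays tiny no matter how large the range (exact byte counts vary by Python version and platform):

import sys
squares = [x**2 for x in range(1_000_000)]
lazy = (x**2 for x in range(1_000_000))
print(sys.getsizeof(squares))  # millions of bytes
print(sys.getsizeof(lazy))     # a couple hundred bytes, independent of the range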
💡 Real-Life Use Cases
1. Reading large files line by line
def read_large_file(filename):
    with open(filename) as f:
        for line in f:
            yield line.strip()

# for line in read_large_file("huge.log"):
#     process(line)
2. Infinite or long sequences
def infinite_counter():
    i = 0
    while True:
        yield i
        i += 1

counter = infinite_counter()
print(next(counter))  # 0
print(next(counter))  # 1
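An infinite generator can never be passed to list() directly; itertools.islice is the standard way to take a bounded slice:

from itertools import islice
print(list(islice(infinite_counter(), 5)))  # [0, 1, 2, 3, 4]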
3. Data Pipelines / ETL
Generators help build staged data transformation pipelines:
def read_numbers():
    for i in range(100):
        yield i

def filter_even(nums):
    for n in nums:
        if n % 2 == 0:
            yield n

def square(nums):
    for n in nums:
        yield n * n

# Pipeline
result = square(filter_even(read_numbers()))
for r in result:
    print(r)
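The snippet list at the top also mentions yield from, which delegates to a sub-iterable and is handy for flattening in pipelines like this one. A minimal sketch:

def flatten(list_of_lists):
    for sub in list_of_lists:
        yield from sub  # delegate: yields each item of sub in turn

print(list(flatten([[1, 2], [3], [4, 5]])))  # [1, 2, 3, 4, 5]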
🧠 yield vs return
return: used in a normal function; ends the function and hands back a single value
yield: used in a generator; pauses and resumes the function, producing values one at a time
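The difference is easy to see with next(): a bare return inside a generator simply ends iteration (it raises StopIteration under the hood):

def one_two():
    yield 1
    yield 2
    return  # ends the generator

g = one_two()
print(next(g))  # 1
print(next(g))  # 2
# next(g) would now raise StopIteration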