Here's a curated list of Python-specific functions, constructs, and concepts (decorators, wrappers, generators, and more) that come up frequently in Python interviews, especially for developer, data engineer, or backend roles.


✅ Core Python Functional Concepts (Highly Asked)

| Feature / Function | Purpose / Use Case | Example / Notes |
| --- | --- | --- |
| Decorators | Wrap a function to modify or extend its behavior | @decorator_name, functools.wraps |
| Wrapper Functions | Inner function that wraps logic (used inside decorators) | Used with closures |
| Lambda Functions | Anonymous short functions | lambda x: x + 1 |
| Map / Filter / Reduce | Functional operations on iterables | Often used with lambda |
| Generators | Lazy iterables using yield | def gen(): yield 1 |
| Iterators / Iterables | __iter__() / __next__() protocol | Custom looping logic |
| List / Dict Comprehension | Compact way to build lists/dicts/sets | [x for x in iterable if x > 0] |
| Closures | Function remembering variables from its enclosing scope | Used in decorators |
| Context Managers | with statement + __enter__() / __exit__() | with open() or custom resource management |
| Partial Functions | Fix arguments of a function using functools.partial() | Used in currying |
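
Most of these concepts get full code snippets later in this post; the iterator protocol and comprehensions do not, so here is a minimal sketch (the Countdown class is just an illustrative name):

class Countdown:
    def __init__(self, start):
        self.current = start

    def __iter__(self):
        return self                # an iterator returns itself

    def __next__(self):
        if self.current <= 0:
            raise StopIteration    # signals the end of iteration
        self.current -= 1
        return self.current + 1

print(list(Countdown(3)))          # [3, 2, 1]

# Comprehensions: compact list / dict construction
squares = [x * x for x in range(5) if x % 2 == 0]       # [0, 4, 16]
lengths = {word: len(word) for word in ("hi", "hey")}   # {'hi': 2, 'hey': 3}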

🔁 Introspection / Metaprogramming

| Concept | Description |
| --- | --- |
| dir() | List attributes/methods of an object |
| getattr(), setattr() | Access/set attributes dynamically |
| hasattr() | Check if an object has an attribute |
| type() / isinstance() | Type checking |
| id() | Memory address (identity) of an object |
| callable() | Check if an object is callable |
| __name__, __doc__ | Access function metadata |
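
A quick sketch of these introspection helpers in action; the sample function is purely illustrative:

def shout(text):
    """Return text in upper case."""
    return text.upper()

print(callable(shout))                   # True
print(shout.__name__, shout.__doc__)     # shout Return text in upper case.
print(type(42), isinstance(42, int))     # <class 'int'> True
print(hasattr(shout, "__call__"))        # True
print(getattr(shout, "__name__"))        # shout
setattr(shout, "version", 1)             # attach an attribute dynamically
print(shout.version)                     # 1
print(len(dir(shout)), id(shout))        # attribute count and object identity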

⚙️ Advanced Function Handling

| Concept | Description |
| --- | --- |
| *args, **kwargs | Variable-length arguments |
| functools.lru_cache() | Caching function results |
| functools.wraps() | Preserve metadata in decorated functions |
| @staticmethod, @classmethod | Used in class-based logic |
| __call__() | Make an object callable like a function |
| __str__(), __repr__() | Custom string representation |
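
A brief sketch tying several of these together; lru_cache, the decorators, and the dunder methods behave as shown, and the Point class is just an illustrative example:

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))                     # 832040, computed quickly thanks to caching

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __repr__(self):            # unambiguous, aimed at developers
        return f"Point({self.x}, {self.y})"

    def __str__(self):             # readable, aimed at end users
        return f"({self.x}, {self.y})"

    @classmethod
    def origin(cls):               # alternative constructor
        return cls(0, 0)

    @staticmethod
    def distance(a, b):            # helper that needs no instance state
        return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5

p = Point.origin()
print(repr(p), str(p))                     # Point(0, 0) (0, 0)
print(Point.distance(p, Point(3, 4)))      # 5.0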

🧠 Interview-Favorite Special Concepts

| Concept | Description |
| --- | --- |
| yield from | Delegates part of a generator's work to a sub-generator |
| nonlocal | Modify a variable in an enclosing (non-global) scope |
| globals() / locals() | Current global or local symbol table |
| property() | Turns a method into a getter/setter |
| Monkey patching | Dynamically modifying classes or methods at runtime |
| Duck typing | Behavior over type: "if it quacks…" |
| First-class functions | Functions can be assigned, passed as arguments, and returned |
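
A compact sketch of nonlocal, property(), and monkey patching (yield from and duck typing get their own examples later); make_counter and Celsius are purely illustrative names:

def make_counter():
    count = 0
    def increment():
        nonlocal count            # rebind the enclosing, non-global variable
        count += 1
        return count
    return increment

counter = make_counter()
print(counter(), counter())       # 1 2

class Celsius:
    def __init__(self, temp=0):
        self._temp = temp

    @property
    def temp(self):               # getter
        return self._temp

    @temp.setter
    def temp(self, value):        # setter with validation
        if value < -273.15:
            raise ValueError("below absolute zero")
        self._temp = value

c = Celsius()
c.temp = 25                       # goes through the setter
print(c.temp)                     # 25

# Monkey patching: add or replace a method on the class at runtime
Celsius.describe = lambda self: f"{self.temp} degrees C"
print(c.describe())               # 25 degrees C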

🔐 Error Handling and Contexts

| Concept | Use Case |
| --- | --- |
| try/except/finally | Error handling |
| raise | Raise exceptions explicitly (including custom ones) |
| Custom exceptions | Inherit from Exception |
| assert | Validate assumptions during debugging |
| with statement | Automatic resource management (context manager) |
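
A short sketch combining these; InsufficientFunds and withdraw are hypothetical names used only for illustration:

class InsufficientFunds(Exception):
    """Custom exception: simply inherit from Exception."""

def withdraw(balance, amount):
    assert amount > 0, "amount must be positive"      # debugging-time sanity check
    if amount > balance:
        raise InsufficientFunds(f"needed {amount}, have {balance}")
    return balance - amount

try:
    withdraw(100, 250)
except InsufficientFunds as exc:
    print(f"Failed: {exc}")       # Failed: needed 250, have 100
finally:
    print("Done")                 # always runs, success or failure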

📚 Miscellaneous Built-in Functions (Commonly Asked)

| Function | Use Case |
| --- | --- |
| zip() | Combine multiple iterables |
| enumerate() | Index + value in iteration |
| any() / all() | Boolean checks on an iterable |
| sorted() + key | Custom sorting |
| reversed() | Reverse an iterable |
| sum(), max(), min() | Aggregations |
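
One sketch exercising all of these built-ins on small illustrative data:

names = ["Asha", "Bo", "Chen"]
scores = [88, 95, 72]

print(list(zip(names, scores)))                # [('Asha', 88), ('Bo', 95), ('Chen', 72)]

for i, name in enumerate(names, start=1):      # index + value
    print(i, name)

print(any(s > 90 for s in scores))             # True
print(all(s > 90 for s in scores))             # False
print(sorted(names, key=len))                  # ['Bo', 'Asha', 'Chen']
print(list(reversed(scores)))                  # [72, 95, 88]
print(sum(scores), max(scores), min(scores))   # 255 95 72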

🔧 Libraries Often Used With Above Concepts

  • functools → wraps, lru_cache, partial
  • itertools → lazy iteration helpers (chain, cycle, product, etc.)
  • contextlib → @contextmanager, suppress, closing (see the sketch right after this list)
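
A small sketch of the itertools and contextlib pieces named above (the file name is hypothetical):

from itertools import chain, cycle, islice, product
from contextlib import suppress, closing

print(list(chain([1, 2], [3, 4])))         # [1, 2, 3, 4]
print(list(islice(cycle("ab"), 5)))        # ['a', 'b', 'a', 'b', 'a']
print(list(product([1, 2], "xy")))         # [(1, 'x'), (1, 'y'), (2, 'x'), (2, 'y')]

with suppress(FileNotFoundError):          # silently ignore a missing file
    open("missing.txt").close()

# closing() wraps anything with a .close() method in a context manager:
# with closing(connect_somewhere()) as conn:
#     conn.send(data)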

✅ Suggested Top Interview Code Snippets to Master:

  1. Custom decorator with @wraps
  2. Closure with captured variables
  3. Generator with yield and yield from
  4. Custom context manager using class or @contextmanager
  5. Using map, filter, reduce with lambda
  6. *args, **kwargs demo and how they’re passed
  7. Writing a callable class with __call__
  8. Custom sorting using sorted(iterable, key=lambda x: ...)
  9. Using partial() to simplify function calls
  10. Mocking or monkey-patching a function

Below is an inline, worked-through version of:

Top Python interview concepts
Code examples
Mindmap-style summary
Notebook-style practice


🧠 1. Decorators and Wrapper Functions

from functools import wraps

def my_decorator(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        print("Before the function runs")
        result = func(*args, **kwargs)
        print("After the function runs")
        return result
    return wrapper

@my_decorator
def greet(name):
    print(f"Hello, {name}!")

greet("Rajeev")

🔁 2. Closures

def outer(msg):
    def inner():
        print(f"Message: {msg}")
    return inner

my_func = outer("Hello from closure")
my_func()

⚙️ 3. Generators & yield

def count_up_to(n):
    i = 1
    while i <= n:
        yield i
        i += 1

for num in count_up_to(3):
    print(num)  # Output: 1 2 3
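
The snippet list above also mentions yield from; a minimal sketch of generator delegation, reusing count_up_to from this section:

def count_twice(n):
    yield from count_up_to(n)    # delegate to the inner generator
    yield from count_up_to(n)

print(list(count_twice(2)))      # [1, 2, 1, 2]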

🔁 4. map, filter, reduce

from functools import reduce

nums = [1, 2, 3, 4]

# map: square each number
squared = list(map(lambda x: x**2, nums))  # [1, 4, 9, 16]

# filter: keep even numbers
evens = list(filter(lambda x: x % 2 == 0, nums))  # [2, 4]

# reduce: sum all numbers
summed = reduce(lambda x, y: x + y, nums)  # 10

🧮 5. Partial Function

from functools import partial

def power(base, exponent):
    return base ** exponent

square = partial(power, exponent=2)
print(square(5))  # Output: 25

🔁 6. Context Manager (Custom)

class FileOpener:
    def __init__(self, filename):
        self.filename = filename

    def __enter__(self):
        self.file = open(self.filename, 'r')
        return self.file

    def __exit__(self, exc_type, exc_value, traceback):
        self.file.close()

# Usage
# with FileOpener('myfile.txt') as f:
#     print(f.read())

👀 7. Callable Object with __call__

class Greeter:
    def __init__(self, greeting):
        self.greeting = greeting

    def __call__(self, name):
        print(f"{self.greeting}, {name}!")

hello = Greeter("Hello")
hello("Rajeev")  # Output: Hello, Rajeev!

🔍 8. Duck Typing

class Duck:
    def quack(self): print("Quack!")

class Person:
    def quack(self): print("I can quack like a duck!")

def in_the_forest(duckish):
    duckish.quack()

in_the_forest(Duck())    # Quack!
in_the_forest(Person())  # I can quack like a duck!

📊 9. Sorting with Key

data = ["apple", "banana", "cherry"]
sorted_data = sorted(data, key=lambda x: len(x))  # by length
print(sorted_data)  # ['apple', 'banana', 'cherry'] (lengths 5, 6, 6; ties keep their original order)

📚 10. *args, **kwargs

def print_all(*args, **kwargs):
    for a in args:
        print(a)
    for key, value in kwargs.items():
        print(f"{key} = {value}")

print_all(1, 2, 3, name="Rajeev", job="Engineer")

🧠 MINDMAP SUMMARY (Visualized as Indented Bullets)

Python Interview Concepts
├── Functional Programming
│   ├── lambda
│   ├── map, filter, reduce
│   └── functools.partial
├── Closures and Decorators
│   ├── Closures
│   ├── Decorators (@)
│   └── functools.wraps
├── Generators
│   ├── yield
│   └── yield from
├── Object-Oriented Tools
│   ├── __call__()
│   ├── @staticmethod / @classmethod
│   └── Duck Typing
├── Context Management
│   ├── with statement
│   ├── __enter__, __exit__
│   └── contextlib.contextmanager
├── Introspection
│   ├── dir(), type(), id()
│   ├── hasattr(), getattr()
│   └── callable(), isinstance()
├── Data Handling
│   ├── *args, **kwargs
│   ├── zip(), enumerate()
│   └── sorted(), key=
└── Error Handling
    ├── try/except/finally
    └── raise, assert

📓 Notebook Practice Style Inline

# Practice 1: Write a decorator to log function execution time
import time
from functools import wraps

def timer(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        end = time.time()
        print(f"{func.__name__} took {end - start:.4f}s")
        return result
    return wrapper

@timer
def slow_add(x, y):
    time.sleep(1)
    return x + y

slow_add(3, 5)
# Practice 2: Generator to read a file line by line (lazy)
def read_lines(filename):
    with open(filename) as f:
        for line in f:
            yield line.strip()

# for line in read_lines('myfile.txt'):
#     print(line)
# Practice 3: Custom context manager using @contextmanager
from contextlib import contextmanager

@contextmanager
def open_file(name):
    f = open(name, 'r')
    try:
        yield f
    finally:
        f.close()

# with open_file('data.txt') as f:
#     print(f.read())

Generators in Python are used to create iterators in a memory-efficient, lazy way. Their primary purpose is to yield items one at a time, only when requested, instead of generating all items at once and storing them in memory like lists.


Why Use Generators?

| Reason | Description |
| --- | --- |
| 🔁 Lazy Evaluation | Produces values on the fly instead of computing everything at once |
| 🧠 Memory Efficient | Doesn't store the whole dataset in memory |
| Faster for Streaming Data | Ideal for processing large files or streams |
| 🔗 Composable Pipelines | Can be chained together like UNIX pipes |
| 🧰 Simplifies Complex Iterators | Cleaner code using yield instead of managing __iter__() and __next__() manually |

🧪 Example: Generator vs List

❌ List (eager evaluation)

squares = [x**2 for x in range(1_000_000)]
# Uses lots of memory

✅ Generator (lazy)

def square_gen():
    for x in range(1_000_000):
        yield x**2

gen = square_gen()  # Only computes one value at a time

💡 Real-Life Use Cases

1. Reading large files line by line

def read_large_file(filename):
    with open(filename) as f:
        for line in f:
            yield line.strip()

# for line in read_large_file("huge.log"):
#     process(line)

2. Infinite or long sequences

def infinite_counter():
    i = 0
    while True:
        yield i
        i += 1

counter = infinite_counter()
print(next(counter))  # 0
print(next(counter))  # 1

3. Data Pipelines / ETL

Generators help build staged data transformation pipelines:

def read_numbers():
    for i in range(100):
        yield i

def filter_even(nums):
    for n in nums:
        if n % 2 == 0:
            yield n

def square(nums):
    for n in nums:
        yield n * n

# Pipeline
result = square(filter_even(read_numbers()))
for r in result:
    print(r)

🧠 yield vs return

| Keyword | Used in | Behavior |
| --- | --- | --- |
| return | Normal function | Ends the function and returns a value |
| yield | Generator function | Pauses and resumes the function, returning values one at a time |
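
A tiny contrast to make the table concrete:

def give_once():
    return 1                     # the function ends here; a second call starts over

def give_many():
    yield 1                      # pause here, resume on the next next() call
    yield 2

print(give_once())               # 1
g = give_many()
print(next(g), next(g))          # 1 2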

🚀 Summary

  • Use generators when:
    • Working with huge datasets
    • Reading large files
    • Needing infinite sequences
    • Building modular data transformation pipelines
    • Wanting to optimize memory and performance
