Deep dive into SOLID principles, metaprogramming with Decorators, and memory-efficient Generators.
Practical Python implementation of each principle.
Single Responsibility Principle (SRP): a class should be responsible for only one specific domain. This makes testing and debugging easier.
class User:
    def __init__(self, name: str):
        self.name = name

class UserDB:
    def save(self, user: User):
        print(f"Saving {user.name} to DB")

# User only handles data, not the DB.
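A minimal sketch of how this separation pays off: persistence details can change without touching User. The in-memory list standing in for a real database is an assumption for illustration.

```python
class User:
    """Holds user data only; knows nothing about storage."""
    def __init__(self, name: str):
        self.name = name

class UserDB:
    """Handles persistence only; sketched here with an in-memory list."""
    def __init__(self):
        self.saved = []

    def save(self, user: User):
        self.saved.append(user.name)

db = UserDB()
db.save(User("Alice"))
print(db.saved)  # ['Alice']
```

If storage later moves to a real database, only UserDB changes; User and its tests stay untouched.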
Open/Closed Principle (OCP): don't touch old code to add new features. Extend functionality using interfaces and inheritance.
from abc import ABC, abstractmethod

class Shape(ABC):
    @abstractmethod
    def area(self): pass

class Circle(Shape):
    def area(self): return 3.14 * (5 ** 2)

class Square(Shape):
    def area(self): return 10 * 10
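To see OCP in action, here is a sketch where a new shape is added without modifying any existing class; Triangle and the helper total_area are illustrative names, not part of the original.

```python
import math
from abc import ABC, abstractmethod

class Shape(ABC):
    @abstractmethod
    def area(self) -> float: ...

class Circle(Shape):
    def __init__(self, radius: float):
        self.radius = radius
    def area(self) -> float:
        return math.pi * self.radius ** 2

# New feature: a new subclass, with no edits to existing code.
class Triangle(Shape):
    def __init__(self, base: float, height: float):
        self.base = base
        self.height = height
    def area(self) -> float:
        return 0.5 * self.base * self.height

def total_area(shapes) -> float:
    # Works unchanged for any current or future Shape subclass.
    return sum(s.area() for s in shapes)

print(total_area([Circle(1), Triangle(4, 3)]))  # pi + 6
```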
Liskov Substitution Principle (LSP): a child class should always fit in place of its parent class without breaking the logic.
class Bird: pass

class FlyingBird(Bird):
    def fly(self): print("Flying...")

class Ostrich(Bird):
    # An ostrich cannot fly,
    # so it does not inherit from FlyingBird.
    pass
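A sketch of why this hierarchy keeps substitution safe: a hypothetical migrate function (an assumed name for illustration) accepts any FlyingBird, and since Ostrich is never a FlyingBird, no subclass ever has to override fly() with an exception.

```python
class Bird:
    def eat(self) -> str:
        return "eating"

class FlyingBird(Bird):
    def fly(self) -> str:
        return "flying"

class Sparrow(FlyingBird):
    pass

class Ostrich(Bird):
    # No fly() to break; Ostrich still substitutes fine wherever
    # a plain Bird is expected.
    pass

def migrate(bird: FlyingBird) -> str:
    # Any FlyingBird subtype can substitute here without surprises.
    return bird.fly()

print(migrate(Sparrow()))  # flying
```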
Interface Segregation Principle (ISP): break large interfaces into smaller ones so that classes implement only the methods they actually need.
class Printer(ABC):
    @abstractmethod
    def print_doc(self): pass

class Scanner(ABC):
    @abstractmethod
    def scan_doc(self): pass

class OldPrinter(Printer):
    def print_doc(self): print("Printing...")
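The flip side of the split: a device that does support both capabilities can simply implement both small interfaces. MultiFunctionDevice is an illustrative name assumed for this sketch.

```python
from abc import ABC, abstractmethod

class Printer(ABC):
    @abstractmethod
    def print_doc(self) -> str: ...

class Scanner(ABC):
    @abstractmethod
    def scan_doc(self) -> str: ...

class OldPrinter(Printer):
    # Implements only what it supports; no dummy scan_doc needed.
    def print_doc(self) -> str:
        return "printing"

class MultiFunctionDevice(Printer, Scanner):
    # Opts into both capabilities by implementing both interfaces.
    def print_doc(self) -> str:
        return "printing"
    def scan_doc(self) -> str:
        return "scanning"

mfd = MultiFunctionDevice()
print(mfd.print_doc(), mfd.scan_doc())  # printing scanning
```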
Dependency Inversion Principle (DIP): high-level modules should depend on interfaces (abstractions), not on low-level classes.
class DBInterface(ABC):
    @abstractmethod
    def connect(self): pass

class App:
    def __init__(self, db: DBInterface):
        # Does not depend on a specific DB.
        self.db = db
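A sketch of the injection in practice: two interchangeable implementations (PostgresDB and InMemoryDB are assumed names for illustration) plug into the same App, which is also what makes App easy to test with a fake.

```python
from abc import ABC, abstractmethod

class DBInterface(ABC):
    @abstractmethod
    def connect(self) -> str: ...

class PostgresDB(DBInterface):
    def connect(self) -> str:
        return "connected to postgres"

class InMemoryDB(DBInterface):
    # A fake used in tests: another payoff of depending on the abstraction.
    def connect(self) -> str:
        return "connected to memory"

class App:
    def __init__(self, db: DBInterface):
        self.db = db  # depends on the interface, not a concrete DB
    def start(self) -> str:
        return self.db.connect()

print(App(PostgresDB()).start())
print(App(InMemoryDB()).start())
```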
A decorator wraps the original function. We use @functools.wraps so that the function's metadata (name, docstring) is not lost.
Authentication: check the user before the function executes.
Performance: measure how long the function took.
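The authentication use case above can be sketched as a decorator too. The current_user dict and require_admin are hypothetical stand-ins for a real session check, assumed only for this example.

```python
import functools

# Hypothetical session state for the sketch.
current_user = {"name": "alice", "is_admin": True}

def require_admin(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # Runs before the wrapped function executes.
        if not current_user.get("is_admin"):
            raise PermissionError("admin required")
        return func(*args, **kwargs)
    return wrapper

@require_admin
def delete_everything():
    return "deleted"

print(delete_everything())  # "deleted" while is_admin is True
```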
import functools
import time

def timer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        end = time.perf_counter()
        print(f"{func.__name__} took {end - start:.4f}s")
        return result
    return wrapper

@timer
def heavy_task():
    time.sleep(1)
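To see why @functools.wraps matters, the timer can be restated self-contained and its metadata checked after decoration; without wraps, heavy_task.__name__ would report "wrapper" instead.

```python
import functools
import time

def timer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        end = time.perf_counter()
        print(f"{func.__name__} took {end - start:.4f}s")
        return result
    return wrapper

@timer
def heavy_task():
    """Simulates slow work."""
    time.sleep(0.1)

heavy_task()
# Thanks to functools.wraps, the metadata survives:
print(heavy_task.__name__)  # heavy_task
print(heavy_task.__doc__)   # Simulates slow work.
```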
Generators don't hold the whole list in memory. They use "lazy evaluation": values are generated only when the data is actually needed.
def large_file_reader(path):
    with open(path) as f:
        for line in f:
            yield line  # one line at a time
You can chain multiple generators one after another without increasing memory usage.
data = (i for i in range(10**10))
filtered = (x for x in data if x % 2 == 0)
# Computed only when consumed.
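Pulling a few items from the chained pipeline with itertools.islice shows the laziness concretely: despite the 10**10 range, only the handful of requested values is ever computed.

```python
from itertools import islice

data = (i for i in range(10**10))           # no list is built
filtered = (x for x in data if x % 2 == 0)  # still nothing computed

# Only now do values flow, and only as many as requested.
first_five = list(islice(filtered, 5))
print(first_five)  # [0, 2, 4, 6, 8]
```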