Python Notes
Started: 01 May 2025
Updated: 01 May 2026
Here are some advanced topics in Python programming that one should know.
- functional enhancements | decorators and closures
- decorators are higher-order functions that let you modify the behaviour of a function without changing its source code
- useful in logging, timing simulations, enforcing constraints
```python
import functools
import time

def timer(func):
    """A decorator to measure the execution time of a simulation step."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start_time = time.perf_counter()
        result = func(*args, **kwargs)
        end_time = time.perf_counter()
        print(f"Function {func.__name__} took {end_time - start_time:.4f}s")
        return result
    return wrapper

@timer
def compute_forces(atoms):
    # Simulated heavy computation
    time.sleep(0.5)
    return "Force Vector"
```
- Context managers
- guarantee setup and teardown code runs, even if an exception is raised
- the `with` statement is the syntax: it calls `__enter__` on entry and `__exit__` on exit
- class-based: implement `__enter__` and `__exit__` directly

```python
import time

class SimulationTimer:
    def __enter__(self):
        self.start = time.perf_counter()
        return self  # bound to the `as` variable

    def __exit__(self, exc_type, exc_val, exc_tb):
        elapsed = time.perf_counter() - self.start
        print(f"Simulation took {elapsed:.4f}s")
        return False  # False = do not suppress exceptions

with SimulationTimer() as t:
    run_md_simulation()
```

- function-based shorthand: `@contextlib.contextmanager`, where `yield` splits setup from teardown

```python
from contextlib import contextmanager

@contextmanager
def open_trajectory(path):
    f = open(path, 'r')
    try:
        yield f  # the body of the `with` block runs here
    finally:
        f.close()  # always runs, even on exception

with open_trajectory("traj.xyz") as f:
    frames = parse(f)
```

- why `finally` matters: if you use `try`/`except` instead, a bare `except` that re-raises still skips cleanup; `finally` is unconditional
- multiple context managers on one line:

```python
with open("input.xyz") as src, open("output.xyz", "w") as dst:
    dst.write(convert(src.read()))
```

- common built-in uses: `open()`, `threading.Lock()`, `numpy.errstate()`, `unittest.mock.patch()`
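A quick sketch of two of these used in `with` statements, `threading.Lock` and `unittest.mock.patch` (the `bump` helper is illustrative):

```python
import os
import threading
from unittest.mock import patch

lock = threading.Lock()
counter = 0

def bump():
    global counter
    with lock:  # lock acquired on entry, released on exit, even on error
        counter += 1

bump()
print(counter)  # 1

# unittest.mock.patch restores the patched attribute when the block exits
with patch.object(os, "sep", "/mocked/"):
    print(os.sep)  # /mocked/
print(os.sep == "/mocked/")  # False: the real value is restored
```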
- Iterators and Generators (memory efficiency)
- use of `yield` to turn a function into a generator

```python
def stream_trajectory(file_path):
    """Generator to read a massive trajectory file line-by-line."""
    with open(file_path, 'r') as f:
        for line in f:
            if line.startswith("ATOM"):
                yield parse_line(line)  # returns an object only when requested
```
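Generator expressions give the same laziness inline; a sketch of a pipeline that never materialises a full list in memory:

```python
# Each stage pulls one item at a time from the stage before it,
# so no intermediate list of a million elements is ever built.
squares = (x * x for x in range(10**6))        # generator expression, no list
evens = (s for s in squares if s % 2 == 0)     # lazy filter stage
total = sum(evens)                             # consumes the pipeline item by item
print(total)
```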
- metaprogramming and dunder ("double underscore") methods
- `__call__` makes an instance callable; `__getitem__` enables indexing and slicing
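A minimal sketch of both methods on a hypothetical `Trajectory` container (the class and its frames are illustrative, not from any library):

```python
class Trajectory:
    """Hypothetical container demonstrating __getitem__ and __call__."""
    def __init__(self, frames):
        self.frames = frames

    def __getitem__(self, index):
        # Enables traj[0] and slicing like traj[1:3]
        return self.frames[index]

    def __call__(self, step):
        # Makes the instance callable like a function
        return f"frame at step {step}: {self.frames[step]}"

traj = Trajectory(["A", "B", "C"])
print(traj[0])      # A
print(traj[1:3])    # ['B', 'C']
print(traj(2))      # frame at step 2: C
```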
- performance Python: bridging to C++/Fortran
- `ctypes` and `cffi`: calling C functions directly
- `pybind11`: modern standard for creating Python bindings for C++
- `numba`: JIT compiler that converts Python functions into machine code
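As a taste of the `ctypes` route, a sketch that calls the C library's `sqrt` directly (assumes a POSIX system where `libm` can be located):

```python
import ctypes
import ctypes.util

# Locate the C math library; on some platforms find_library returns None,
# in which case CDLL(None) searches symbols already loaded into the process
# (POSIX-only behaviour).
path = ctypes.util.find_library("m")
libm = ctypes.CDLL(path)

# Declare the C signature: double sqrt(double)
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # ~1.4142...
```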
- memory management
- `__slots__` to drop the per-instance `__dict__`
- NumPy memory layout: `C_CONTIGUOUS` and `F_CONTIGUOUS`
- `memoryview` for zero-copy slicing
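A small sketch of two of these ideas, `__slots__` and `memoryview` (the `Atom` classes are hypothetical):

```python
class Atom:                      # regular class: each instance carries a __dict__
    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z

class SlotAtom:                  # __slots__: fixed attribute slots, no __dict__
    __slots__ = ("x", "y", "z")
    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z

print(hasattr(Atom(0, 0, 0), "__dict__"))      # True
print(hasattr(SlotAtom(0, 0, 0), "__dict__"))  # False

# memoryview: slice a buffer without copying the underlying bytes
data = bytearray(b"ATOM  1.0 2.0 3.0")
view = memoryview(data)[:4]
print(view.tobytes())  # b'ATOM'
data[0:4] = b"atom"    # same-length mutation of the source is visible in the view
print(view.tobytes())  # b'atom'
```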
- profiling and optimization
- `cProfile`: CPU time profiler (built-in)
- `line_profiler`: line-by-line timing
- `memory_profiler`: memory usage per line
- `tracemalloc`: built-in memory tracing
- `Scalene`: modern all-in-one profiler (CPU + GPU + memory); distinguishes Python time vs native time
- workflow: `timeit` (is it slow?) → `cProfile` (which function?) → `line_profiler` (which line?) → `memory_profiler` / `tracemalloc` (is it memory, not CPU?) → `Scalene` / `py-spy` (production or long-running jobs)
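The first two steps of that workflow can be sketched with the standard library alone (`simulate` is a stand-in workload):

```python
import cProfile
import io
import pstats
import timeit

def simulate(n):
    # Stand-in workload: sum of squares
    return sum(i * i for i in range(n))

# Step 1: is it slow?  timeit gives a quick wall-clock number.
t = timeit.timeit(lambda: simulate(10_000), number=100)
print(f"100 calls took {t:.3f}s")

# Step 2: which function?  cProfile attributes CPU time per call.
profiler = cProfile.Profile()
profiler.enable()
simulate(100_000)
profiler.disable()

buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())
```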
- Concurrency and parallelism
- `multiprocessing` vs `asyncio`: `multiprocessing` runs separate processes to side-step the GIL for CPU-bound work; `asyncio` overlaps I/O-bound tasks in a single thread
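A minimal `asyncio` sketch of the I/O-bound side (the `fetch` coroutine just simulates a wait):

```python
import asyncio

async def fetch(name, delay):
    # Simulated I/O wait: the event loop runs other tasks meanwhile
    await asyncio.sleep(delay)
    return f"{name} done"

async def main():
    # Three 0.1s waits overlap, so the batch takes ~0.1s, not 0.3s
    return await asyncio.gather(
        fetch("a", 0.1), fetch("b", 0.1), fetch("c", 0.1)
    )

print(asyncio.run(main()))  # ['a done', 'b done', 'c done']
```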
- Testing patterns
- `pytest` fixtures, `parametrize`
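A short sketch of both patterns (the `water` fixture and the tests are illustrative; run the file with the `pytest` command):

```python
import pytest

@pytest.fixture
def water():
    # Fixture: pytest builds this and injects it into tests that name it
    return {"formula": "H2O", "atoms": 3}

def test_atom_count(water):
    assert water["atoms"] == 3

# parametrize runs the same test body once per argument tuple
@pytest.mark.parametrize("n, expected", [(1, 1), (2, 4), (3, 9)])
def test_square(n, expected):
    assert n * n == expected
```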
- Packaging and environments:
- `pyproject.toml`, `src/` layout
- editable installs, building a wheel
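A minimal `pyproject.toml` sketch for a `src/` layout (the project name `mdtools` and the version pins are hypothetical):

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "mdtools"          # hypothetical package name
version = "0.1.0"
requires-python = ">=3.9"

[tool.setuptools.packages.find]
where = ["src"]           # src/ layout: packages live under src/
```

With this in place, `pip install -e .` gives an editable install and `python -m build` produces a wheel.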