Design Patterns
Although design patterns are a topic with their own history, intricacies, and opinionated implementations (not to mention that they vary across programming languages), this section focuses on the most common design choices available to developers when integrating user code with the functionality provided by Python libraries.
Most of the time, library code does not require a complex design pattern for an object to be compliant. Instead, it provides foundational conventions for how an interface is accepted and how a generic piece of code can modify, attach to, or benefit from its implementation.
Decorator
One of the most popular design choices in Python is the use of decorators, which stands out because of the @ character, a piece of syntax almost exclusively reserved for the implementation of this pattern1.
The definition of a decorator is very concise: a function that modifies the behavior of an object.
Decorators themselves are not complicated, but many examples implement them with poor type hinting, which makes the abstraction visually confusing.
The following is an example of the simplest possible decorator:
import typing as t

def decorator[T, **P](
    func: t.Callable[P, T],
) -> t.Callable[P, T]:
    """
    Args:
        func: any callable

    Returns:
        func's signature, with additional attribute `__func__`
    """
    setattr(func, "__func__", "__func__")
    return func
# Using syntactic sugar, apply on statement
@decorator
def syntactic_sugar() -> None: ...
# Without syntactic sugar, as first-class citizens
def func() -> None: ...
syntactic_salt = decorator(func)
Even though this code snippet is quite simple, it represents a complete decorator implementation. Note that it neither creates a new callable nor consumes the function internally; it simply returns the same function after attaching some metadata to it.
Callable
Some definitions are very simple and describe a decorator as a function that accepts and returns a function (sometimes the same function).
These decorators are used for attribute-based modifications and often require the context of another object to embed some logic.
import typing as t

def zero_lvl_func_decorator[T, **P](
    func: t.Callable[P, T],
) -> t.Callable[P, T]:
    print("extend func behavior on DEFINITION")
    setattr(func, "__func__", "__func__")
    return func
For example, the standard library's abc.abstractmethod decorator simply sets a flag on the decorated methods, which determines which methods of the class must be overridden by its children before an instance can be created.
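A minimal sketch of that behavior, using a hypothetical Base/Child hierarchy: abc.abstractmethod only sets the __isabstractmethod__ flag, and abc.ABC refuses to instantiate classes that still carry unimplemented abstract methods.

import abc

class Base(abc.ABC):
    @abc.abstractmethod
    def run(self) -> None: ...

print(Base.run.__isabstractmethod__)  # True: the decorator only marked the method

class Child(Base):
    def run(self) -> None:
        print("implemented")

Child().run()  # fine: the abstract method was overridden
# Base()       # would raise TypeError: can't instantiate abstract class Base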
Another decorator option is one that extends the function on its call. For that, a wrapper callable is required: the decorator returns the wrapper in place of the original function, and when the decorated function is invoked, the wrapper calls the original and executes the additional functionality around it.
import typing as t
from functools import wraps

def one_lvl_func_decorator[T, **P](
    func: t.Callable[P, T],
) -> t.Callable[P, T]:
    print("extend func behavior ON DEFINITION")

    # although a new callable is being returned, it accepts the same params as func
    # to transparently extend/decorate its functionality
    @wraps(func)
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> T:
        print("extend func behavior ON CALL")
        # returning the same value as the decorated callable
        return func(*args, **kwargs)

    return wrapper
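Applying this decorator makes the separation between definition time and call time visible; for example, with a hypothetical greet function:

@one_lvl_func_decorator
def greet(name: str) -> str:
    return f"hello {name}"
# prints "extend func behavior ON DEFINITION" at the statement

greet("world")
# prints "extend func behavior ON CALL", then runs the original function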
To allow users to customize the functionality of the decorator or modify certain options, the decorator can accept parameters as well. One might be tempted to modify the decorator’s signature like this:
import typing as t
from functools import wraps

def one_lvl_func_decorator[T, **P](
    func: t.Callable[P, T],
    print_func_return: bool = False,
) -> t.Callable[P, T]:
    print("extend func behavior ON DEFINITION")

    # since a new callable is being returned, respect the original params
    @wraps(func)
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> T:
        print("extend func behavior ON CALL")
        if print_func_return:
            returns = func(*args, **kwargs)
            print(f"{returns=}")
            return returns
        # returning the same value as the decorated callable
        return func(*args, **kwargs)

    return wrapper
However, this code will fail at runtime:
@one_lvl_func_decorator(print_func_return=True)
def func() -> None: ...
$ uv run main.py
Traceback (most recent call last):
File "/mle/code/python/standards/design-patterns/04/main.py", line 26, in <module>
@one_lvl_func_decorator(print_func_return=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: one_lvl_func_decorator() missing 1 required positional argument: 'func'
This happens because the decorator syntactic sugar implicitly passes the decorated function as the first argument of the decorator. When parentheses are used to specify options, however, the expression after @ is evaluated first: the decorator is called with only the keyword argument, so the required func parameter is missing, and only its return value would then be applied to the function.
To keep using the decorator syntax on function statements while also accepting parameters, a two-level decorator structure can be used.
import typing as t
from functools import wraps

def two_lvl_func_decorator[T, **P](
    a: int,
    b: int,
) -> t.Callable[
    [t.Callable[P, T]],
    t.Callable[P, T],
]:
    """
    Parametrized decorator that extends the behavior of a common function
    """
    print(f"modify inner behavior based on {a=}")

    def decorator(
        func: t.Callable[P, T],
    ) -> t.Callable[P, T]:
        # since a new callable is being returned, respect the original params
        @wraps(func)
        def wrapper(*args: P.args, **kwargs: P.kwargs) -> T:
            print("extend func behavior ON CALL")
            print(f"modify inner behavior based on {b=}")
            return func(*args, **kwargs)

        return wrapper

    return decorator
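With the extra level, the parenthesized expression returns the actual decorator, so the syntactic sugar works again; a quick usage sketch:

@two_lvl_func_decorator(a=1, b=2)
def func() -> None: ...
# prints "modify inner behavior based on a=1" at the statement

func()
# prints "extend func behavior ON CALL" and "modify inner behavior based on b=2"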
Throughout these examples, typing.ParamSpec has been included in all wrapper callables through the **P type parameter. The benefit is that, if a user wants or needs to store the functions in a different module rather than decorating them at the function statement (for file-layout conventions or any other reason), they can still take advantage of static type checking.
def func(c: str) -> str:
    return c

two_lvl_func_decorator(a=1, b=2)(func)(c=3)
$ uv run mypy main.py
main.py:36: error: Argument "c" has incompatible type "int"; expected "str" [arg-type]
Found 1 error in 1 file (checked 1 source file)
Class
The simplest form is, again, an abstraction for attribute-based modifications:
def zero_lvl_class_decorator[T: type](some_class: T) -> T:
    """
    Extend class behavior on DEFINITION
    """
    setattr(some_class, "__type__", "__type__")
    return some_class
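Used on a class statement, it only attaches the attribute and returns the same class; a quick check with a hypothetical Config class:

@zero_lvl_class_decorator
class Config: ...

print(Config.__type__)  # '__type__': metadata attached at definition time
print(Config)           # still the original Config class, not a wrapper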
To define a class decorator that returns the same object and accepts parameters for customization, the following approach can be used:
import typing as t

def one_lvl_class_decorator[T: type](
    some_value: int,
) -> t.Callable[[T], T]:
    print("run operation ON DEFINITION")

    def decorator(some_class: T) -> T:
        print("extend class behavior on DEFINITION")

        # modify methods of the original class, e.g.
        def some_method(self, x=some_value) -> int:
            return x

        setattr(some_class, "some_method", some_method)
        return some_class

    return decorator
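Applied with a parameter, the injected method becomes available on instances of the decorated class; for example, with a hypothetical Service class:

@one_lvl_class_decorator(some_value=5)
class Service: ...

instance = Service()
print(instance.some_method())  # 5: method injected by the decorator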
A two-level decorator structure is not necessary for classes to accept parameters. Even if the decorator's purpose were to return an object of a different type, the class abstraction itself contains enough information to eliminate the need for a wrapper-like object. However, a decorator that returns a different object type can be confusing and is not considered good practice.
Object-Oriented
Python, by design, is an object-oriented programming language. It does not enforce rigidity in this principle, which makes multiparadigm designs easy to implement in codebases, but its nature relies heavily on the concept of objects and classes. Any struct-like interface, trait, or even Enum must be expressed using the class keyword.
dataclasses are a popular example of this. They are commonly used to represent a structure that includes methods without relying on heavy Object-Oriented Programming (OOP) techniques, yet they still must be declared with the class statement. The same applies to Pydantic base models, which require inheritance from pydantic.BaseModel. Even though these objects are not necessarily meant to incorporate traditional OOP design patterns, they still follow the same construction experience.
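A small sketch of that point, with a hypothetical Point type: a dataclass is little more than data plus a couple of methods, yet it is still spelled as a class.

from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

    def norm(self) -> float:
        return (self.x**2 + self.y**2) ** 0.5

p = Point(x=3.0, y=4.0)
print(p.norm())  # 5.0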
Regardless, mid-level Object-Oriented Programming features are part of writing idiomatic Python. Not all libraries enforce this design, but understanding the core concepts most commonly used in the language enhances development:
- Object Instantiation: objects that have a set of methods and attributes required by an operation. This is common with standard library objects like io.BytesIO. The instance of the object will comply with all the attributes/methods required by a different operator.
- Abstract Methods: inheritance from an object that requires user implementation. The standard library uses this pattern to define concepts like Iterables, Mappings, Awaitables... In third-party libraries, it is more common to find them as a method that must be overridden, respecting some input parameters and returning a specific type.
- Special Methods: dunder (double underscore) methods, also known as magic methods. These methods are finite, and implementing some of them will have an effect when using built-in keywords, operators, and global functions. Others are more object-centric but are still common and relevant:
  - __init__: initializes the contents of a class and binds objects to the instance of the type. __init__ does not return.

    class Class:
        def __init__(self) -> None: ...  # signature returns None
  - __new__: called when a class is constructed and is responsible for creating the instance. It must return an instance of the class.

    class Class:
        def __new__(cls) -> "Class": ...  # modify creational behavior; must return the type
  - __call__: the method accessed when using () on an initialized object. Python functions defined with the def keyword also adhere to this protocol; this is the abstraction that enables collections.abc.Callable. A class that implements __call__ is as callable as a function object defined with def.

    class Class:
        def __call__(self) -> str:
            return "called"

    my_instance = Class()  # does not run __call__, the object has to be created first
    my_instance()  # 'called'

    def func() -> str:
        return "called"

    func.__call__()  # 'called'

    Accessing the type of a function at runtime returns the function type. However, function is neither a reserved keyword in Python nor an exposed type name, and it cannot be used as a base class in a class definition.

    def func() -> None: ...

    FunctionType = type(func)

    class Function(FunctionType): ...

    Traceback (most recent call last):
      File "/mle/code/python/standards/design-patterns/11/main.py", line 7, in <module>
        class Function(FunctionType): ...
    TypeError: type 'function' is not an acceptable base type
  Even some built-in objects like dict define dunder methods that were not originally part of the type, in order to enable more convenient operations. For example, the | operator (__or__), originally introduced to represent the binary OR operation between numbers, can also be used to merge dictionaries. Libraries like LangChain use this operator in a pipe-centric way to stream the output of Runnables, similar to %>% in the R programming language, inspired by Unix pipelines.
- Metaclasses: these go beyond the capabilities of methods like __new__. They allow modification of a class's behavior before it is even created, providing access to the __dict__ that contains the entire class body, the base classes it inherits from, and more. The use of metaclasses is part of what makes complex abstractions like sqlalchemy.orm.DeclarativeBase and pydantic.BaseModel possible in modern Python applications; a minimal sketch follows this list.
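A minimal metaclass sketch, using a hypothetical RegisteredMeta that records every class created with it (the names are illustrative, not from any library); this kind of registry is broadly how model/ORM base classes keep track of declared subclasses:

registry: dict[str, type] = {}

class RegisteredMeta(type):
    def __new__(mcls, name, bases, namespace):
        # name, bases, and the class body (namespace) are visible before the class exists
        cls = super().__new__(mcls, name, bases, namespace)
        registry[name] = cls
        return cls

class BaseModel(metaclass=RegisteredMeta): ...

class User(BaseModel):
    name: str

print(registry)  # both BaseModel and User were recorded at class-creation time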
1. Decorators are not the only grammatical construct in Python that uses the @ character. There is also the @ operator, which can be used when an object defines the __matmul__ magic method.