Part of a series - Stop doing things
2024-11-05
Pytest fixtures are great for anything that needs a teardown - these kinds of things:
@pytest.fixture(scope="session")
def db() -> Iterator[str]:
    create_db()
    yield db_url
    teardown_db()
@pytest.fixture()
def conn(db: str) -> Iterator[psycopg.Connection]:
    with psycopg.connect(db) as conn:
        yield conn
        clean_up_tables()
@pytest.fixture()
def monkey_patch_thing() -> Iterator[None]:
    before = foo.bar
    foo.bar = TEST_VALUE
    yield
    foo.bar = before
Pytest fixtures are bad for everything else. Fixtures that just return a value, like:
@pytest.fixture()
def create_user(conn: psycopg.Connection) -> User:
    return db.add_user(username="test-user", ...)
They evolve into monstrosities like:
@pytest.fixture()
def create_user(conn: psycopg.Connection) -> Callable[[str], User]:
    def _create_user(username: str) -> User:
        return db.add_user(username=username, ...)
    return _create_user
In general, why introduce a whole new (and fairly hairy) construct - fixtures - when boring function calls will suffice?
OK, so what should I do instead?
Have a very small number of primitives in your conftest.py that require teardown: things like db, conn, monkeypatch, etc.
Have various helpers.py/factories.py/whatever that contain normal functions that you import and call, like:
def create_user(conn: psycopg.Connection) -> User:
    return db.add_user(username="test-user", ...)
Call them in your tests as normal functions:
from tests.foo import factories

def test_foo(conn: psycopg.Connection) -> None:
    user = factories.create_user(conn)
All of your favourite features of plain ol' functions (the ability to add arguments, refactor, reason about imports, etc.) are provided out of the box!
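For example, growing the factory is just adding a keyword argument - no fixture-of-a-closure needed. A minimal sketch, where User and the list-backed "conn" are stand-ins for the real model and database connection:

```python
from dataclasses import dataclass


@dataclass
class User:  # stand-in for the real model
    username: str
    is_admin: bool


def create_user(conn: list, username: str = "test-user", is_admin: bool = False) -> User:
    # A real version would INSERT via the psycopg connection;
    # here "conn" is just a list acting as a fake table.
    user = User(username=username, is_admin=is_admin)
    conn.append(user)
    return user


conn: list = []
default = create_user(conn)
admin = create_user(conn, username="admin-user", is_admin=True)
print(default.username, admin.is_admin)  # test-user True
```

Callers that only want the default keep calling `create_user(conn)`; the test that needs an admin passes the new argument, and nothing else changes.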
There are various libraries out there that promise to remove boilerplate by inspecting your models and filling in random values for any fields you don't provide at __init__ time. These lead to opaque call sites like (factory name illustrative):

SomeFactory(
    inner__nested_thing__add=0
)

Bad!
Don't bother. Instead, write plain functions that construct your standard instances, and derive variants from them (reaching for deepcopy where mutation would alias), e.g.:

def make_standard_item() -> Item:
    return Item(...)

def make_weird_item() -> Item:
    item = make_standard_item()
    item.weird_exception = 42
    return item
Wouldn't it be nice to have a typed monkeypatch with a nice interface, where mypy can catch any errors? You can!
Usage:
def test_foo(patch: conftest.MonkeyPatch) -> None:
    patch(my.module.f).to(_dummy_f)  # typechecked!
    ...
Mildly terrifying implementation:
import inspect
import re
from dataclasses import dataclass
from typing import Any, Generic, Iterator, TypeVar

import pytest

T = TypeVar("T")


@dataclass
class _MonkeyPatchSetAttr(Generic[T]):
    monkeypatch: Any
    module: Any
    attr: str

    def to(self, to: T) -> None:
        self.monkeypatch.setattr(self.module, self.attr, to)


@dataclass
class MonkeyPatch:
    monkeypatch: Any

    def __call__(self, from_: T) -> _MonkeyPatchSetAttr[T]:
        call_site = inspect.stack()[1]
        code: str = call_site.code_context[0]
        match = re.match(r".+patch\(([\w+.]+)\)", code)
        assert match is not None
        module_name, _, attr = match.groups()[0].rpartition(".")
        module = eval(module_name, call_site.frame.f_globals, call_site.frame.f_locals)
        return _MonkeyPatchSetAttr(self.monkeypatch, module, attr)


@pytest.fixture
def patch(monkeypatch: Any) -> Iterator[MonkeyPatch]:
    yield MonkeyPatch(monkeypatch)
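The load-bearing (and terrifying) part is the regex over the caller's source line. Its behaviour can be checked in isolation - the source line below is a made-up example of what inspect.stack() would report at the call site:

```python
import re

# A call-site line as inspect.stack()[1].code_context[0] would report it.
code = "    patch(my.module.f).to(_dummy_f)  # typechecked!"

# Same regex as the implementation: grab the dotted path inside patch(...)
match = re.match(r".+patch\(([\w+.]+)\)", code)
assert match is not None

# Split the dotted path into the module part and the attribute to patch.
module_name, _, attr = match.groups()[0].rpartition(".")
print(module_name, attr)  # my.module f
```

The module part is then eval'd in the caller's frame to get the actual module object, so `setattr` can be applied to the right target.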
Instead of test files looking like:
def test_make_a_user(
    conn: psycopg.Connection,
    patch_settings: None,
) -> None:
    ...
    assert foo == bar
Could we ditch a whole bunch of the pytest magic and just have files like:
with (
    pytest.test("Make a user"),
    conftest.conn() as conn,
    patch_settings(),
):
    ...
    assert foo == bar
Implementation something along the lines of:
@contextmanager
def test(name: str) -> Iterator[pytest.Test]:
    if "PYTEST_UNDER_TEST" in os.environ:
        yield pytest.Test(name)
    else:
        raise RuntimeError("Not possible to import from test files.")


@contextmanager
def db() -> Iterator[str]:
    if pytest.is_first_test():
        create_db()
    yield db_url
    if pytest.is_last_test():
        teardown_db()


@contextmanager
def conn() -> Iterator[psycopg.Connection]:
    with conftest.db() as db:
        with psycopg.connect(db) as conn:
            yield conn
            clean_up_tables()
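The pytest hooks above (pytest.Test, is_first_test, is_last_test) don't exist today, but the core composition already works with plain contextlib: setup runs outermost-first, teardown innermost-first. A minimal stand-in, with an events list in place of real database calls:

```python
from contextlib import contextmanager
from typing import Iterator

events: list[str] = []


@contextmanager
def db() -> Iterator[str]:
    events.append("create_db")
    yield "postgres://test"  # stand-in for db_url
    events.append("teardown_db")


@contextmanager
def conn() -> Iterator[str]:
    # Composing context managers: conn's setup nests inside db's.
    with db() as db_url:
        events.append(f"connect {db_url}")
        yield "conn"
        events.append("clean_up_tables")


with conn() as c:
    events.append(f"test body using {c}")

print(events)
# ['create_db', 'connect postgres://test', 'test body using conn',
#  'clean_up_tables', 'teardown_db']
```

The ordering is exactly what fixtures give you - each teardown runs after the test body, in reverse order of setup - just with nothing but the standard library.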