With the Paper A sprint starting ahead of the public release, reserving the PyPI name now avoids the risk of a name-squatter grabbing it once the project surfaces on arXiv or JOSS. The 0.1.0.dev0 version is a PEP 440 dev release: pip install neuropose won't resolve to it without --pre, so there are no accidental installs until 0.1.0 final. PYPI_README.md is a minimal PyPI-specific project page: a name-reserved notice, a pointer to the source URL, and the author list. It is kept separate from the contributor-focused README so the two can evolve independently: README.md speaks to someone cloning the repo, PYPI_README.md to someone landing on the PyPI page.
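The pre-release claim is easy to verify with the packaging library, which implements the same PEP 440 version-ordering rules pip follows:

```python
# PEP 440: a .dev0 version is a pre-release and sorts strictly before the
# final release, which is why plain `pip install neuropose` skips it
# unless --pre is passed.
from packaging.version import Version

dev = Version("0.1.0.dev0")
final = Version("0.1.0")

assert dev.is_prerelease        # dev releases count as pre-releases
assert not final.is_prerelease
assert dev < final              # 0.1.0.dev0 orders before 0.1.0
```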
Top-level contents:

.github/workflows/
benchmarks/
docs/
examples/analysis/
scripts/
src/neuropose/
tests/
.dockerignore
.gitignore
.pre-commit-config.yaml
.python-version
AUTHORS.md
CHANGELOG.md
CITATION.cff
Dockerfile
LICENSE
PYPI_README.md
README.md
mkdocs.yml
pyproject.toml
uv.lock
README.md
NeuroPose (rewrite)
Ground-up rewrite of the prior NeuroPose internal prototype. The repository is private while the IRB data-handling policy is being authored; this README is aimed at contributors working on the rewrite, not external users.
Layout
neuropose/
├── .github/workflows/ # CI: ruff + pyright + pytest (ci.yml), mkdocs (docs.yml)
├── src/neuropose/
│ ├── __init__.py # version only
│ ├── config.py # pydantic-settings Settings class
│ ├── io.py # prediction schema, load/save helpers
│ ├── estimator.py # per-video MeTRAbs worker
│ ├── interfacer.py # filesystem-polling daemon
│ ├── visualize.py # per-frame 2D + 3D overlay rendering
│ ├── cli.py # typer app (watch | process | analyze)
│ ├── _model.py # MeTRAbs download + SHA-256 verify + load
│ └── analyzer/ # post-processing subpackage
│ ├── dtw.py # FastDTW helpers
│ └── features.py # normalization, padding, joint angles, stats
├── tests/
│ ├── conftest.py # env isolation, synthetic video, fake model
│ ├── unit/ # fast, no model download
│ └── integration/ # marked @slow, downloads the real MeTRAbs model
├── docs/ # mkdocs-material site (mkdocs.yml at repo root)
├── scripts/download_model.py # pre-warm the model cache
├── pyproject.toml # hatchling build, dev group (PEP 735)
├── Dockerfile # CPU image, non-root, /data volume
├── CHANGELOG.md # Keep a Changelog format
├── RESEARCH.md # DTW methodology + MeTRAbs self-hosting R&D log
├── AUTHORS.md
├── CITATION.cff
└── LICENSE # MIT
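The SHA-256 verification step noted for _model.py can be sketched as a generic checksum helper (illustrative, not the actual implementation; the real pinned digest and download URL are not shown here):

```python
# Minimal sketch of post-download integrity checking, assuming the model
# tarball path and its pinned hex digest are supplied by the caller.
import hashlib
from pathlib import Path

def sha256_matches(path: Path, expected_hex: str) -> bool:
    """Stream the file in 1 MiB chunks and compare against the pinned digest."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex.lower()
```

Streaming in chunks keeps memory flat even for a ~2 GB tarball; comparing lowercase hex avoids spurious mismatches from case differences in the pinned value.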
Architecture
Three stages, one module each:
- estimator: per-video worker. Streams frames from an input video via OpenCV, runs MeTRAbs on each frame, and returns a validated VideoPredictions (per-frame boxes, poses3d, poses2d, plus a VideoMetadata envelope with frame count, fps, and resolution). Pure library: no filesystem semantics.
- interfacer: filesystem-polling daemon. Watches the configured input directory for new job subdirectories, dispatches each to an Estimator, and persists job state (status.json) across crashes and restarts. A single instance is enforced via fcntl.flock. Owns the input → output → failed directory lifecycle.
- analyzer: post-processing subpackage. FastDTW-based motion comparison (dtw_all, dtw_per_joint, dtw_relation) and joint-angle / feature-statistics helpers. Pure functions operating on VideoPredictions. Heavy dependencies (fastdtw, scipy) are lazy-imported so import neuropose.analyzer works without the analysis extra.
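The single-instance guard in interfacer is a standard fcntl.flock pattern; a minimal sketch, with the lock-file path and function name chosen here for illustration:

```python
# Acquire an exclusive, non-blocking lock on a well-known file. If another
# process (or another open of the same file) already holds it, flock raises
# immediately instead of blocking, so a second daemon can bail out cleanly.
import fcntl

def try_acquire_lock(lock_path: str):
    """Return the open lock-file handle on success, None if already held.

    The handle must stay open for the daemon's lifetime: closing it
    releases the lock.
    """
    handle = open(lock_path, "w")
    try:
        fcntl.flock(handle, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except OSError:
        handle.close()
        return None
    return handle
```

flock ties the lock to the open file description, so a crash releases it automatically when the kernel closes the process's file handles, which fits the crash/restart story above.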
Configuration is centralized in src/neuropose/config.py (a
pydantic-settings Settings class). The runtime data directory defaults to
$XDG_DATA_HOME/neuropose/jobs/ and never lives inside the repository.
Development setup
Requires Python 3.11 and uv.
git clone https://git.levineuwirth.org/neuwirth/neuropose.git
cd neuropose
uv sync --group dev
uv sync --group dev creates .venv/ automatically and installs the
runtime stack (pydantic, typer, OpenCV, TensorFlow, matplotlib) plus the
full dev toolchain (pytest, ruff, pyright, pre-commit, mkdocs-material,
fastdtw, scipy). First sync downloads ~600 MB of TensorFlow; subsequent
runs hit the uv cache.
Install the pre-commit hooks:
uv run pre-commit install
Running tests
uv run pytest # unit tests only (default)
uv run pytest --runslow # unit + integration; downloads ~2 GB MeTRAbs model
uv run pytest -m "not slow" # explicitly exclude slow tests
Integration tests live under tests/integration/ and are gated behind
@pytest.mark.slow plus a custom --runslow flag implemented in
tests/conftest.py. Without the flag, slow tests are skipped at collection
time. The first --runslow run downloads the pinned MeTRAbs tarball
(~2 GB) into a session-scoped temp cache; subsequent tests in the same run
reuse it.
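The gate is the standard pytest recipe for opt-in slow tests; a sketch of what tests/conftest.py likely contains (the actual hooks may differ in detail):

```python
# Register --runslow and, when it is absent, mark every @pytest.mark.slow
# test as skipped during collection.
import pytest

def pytest_addoption(parser):
    parser.addoption("--runslow", action="store_true", default=False,
                     help="also run tests marked @pytest.mark.slow")

def pytest_collection_modifyitems(config, items):
    if config.getoption("--runslow"):
        return  # flag present: leave slow tests runnable
    skip_slow = pytest.mark.skip(reason="slow test: pass --runslow to run")
    for item in items:
        if "slow" in item.keywords:
            item.add_marker(skip_slow)
```

Skipping at collection time (rather than inside each test) is what makes the default `uv run pytest` invocation fast: the model download never even starts.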
Lint and type-check
uv run ruff check .
uv run ruff format .
uv run pyright
CI runs lint, typecheck, and test as three parallel jobs on every push and
PR to main — see .github/workflows/ci.yml.
Docs
uv run mkdocs serve # live-reload preview at http://127.0.0.1:8000
uv run mkdocs build --strict # same check CI runs
The API reference pages under docs/api/ are auto-generated from source
docstrings via mkdocstrings, so they cannot drift out of sync.