Merged
Changes from 2 commits
22 changes: 10 additions & 12 deletions .github/workflows/test-ml.yml
Original file line number Diff line number Diff line change
@@ -1,6 +1,3 @@
# This workflow will install Python dependencies, run tests and lint with a single version of Python
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: Test ML pipeline

on:
@@ -17,16 +14,17 @@ jobs:
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4

- name: Install uv
uses: astral-sh/setup-uv@v6

- name: Set up Python 3.10
uses: actions/setup-python@main # Need latest version to use pyproject.toml instead of requirements.txt
with:
python-version: "3.10"
cache: "pip"
run: uv python install 3.10

- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install .
run: uv sync --no-default-groups

- name: Run ML tests
run: |
ami test pipeline
uv run ami test pipeline
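
Assembled, the migrated `test-ml.yml` steps read roughly as follows (a sketch reconstructed from the hunks above, not the full committed file):

```yaml
steps:
  - uses: actions/checkout@v4

  - name: Install uv
    uses: astral-sh/setup-uv@v6

  - name: Set up Python 3.10
    run: uv python install 3.10

  - name: Install dependencies
    # --no-default-groups skips the dev dependency group, so CI installs
    # only the runtime dependencies declared in pyproject.toml.
    run: uv sync --no-default-groups

  - name: Run ML tests
    run: uv run ami test pipeline
```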
37 changes: 11 additions & 26 deletions .github/workflows/test.yml
@@ -13,40 +13,25 @@ jobs:
test:
name: Run Python Tests
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
python-version: ["3.10", "3.12"]

steps:
- uses: actions/checkout@v4

- name: Set up Python 3.10
uses: actions/setup-python@v5
with:
python-version: "3.10"
cache: "pip"
cache-dependency-path: |
poetry.lock
pyproject.toml

- name: Install Poetry
uses: snok/install-poetry@v1
with:
version: latest
virtualenvs-create: true
virtualenvs-in-project: true

- name: Load cached Poetry virtualenv
id: cached-poetry-dependencies
uses: actions/cache@v3
with:
path: .venv
key: venv-${{ runner.os }}-${{ hashFiles('**/poetry.lock') }}
- name: Install uv
uses: astral-sh/setup-uv@v6

- name: Set up Python ${{ matrix.python-version }}
run: uv python install ${{ matrix.python-version }}

- name: Install dependencies
if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
run: poetry install --no-interaction
run: uv sync
Comment on lines +27 to +31
⚠️ Potential issue | 🟠 Major

🌐 Web query:

In GitHub Actions, does `uv python install` also force subsequent `uv sync` and `uv run` to use that exact interpreter, or should `UV_PYTHON` (or the setup-uv `python-version` input) be set explicitly?

💡 Result:

No, uv python install does not force subsequent uv sync and uv run to use that exact interpreter automatically. uv selects the Python interpreter via a discovery process that prioritizes: UV_PYTHON environment variable (if set to a path), .python-version files (project-local or parent dirs), pyproject.toml requires-python, managed Pythons in UV_PYTHON_INSTALL_DIR (newer first), then PATH executables (first matching). uv python install adds the version to managed Pythons and PATH (via shims in ~/.local/bin), making it discoverable and preferred if no higher-priority specifier exists (e.g., no .python-version or UV_PYTHON), but does not set UV_PYTHON or create .python-version. In GitHub Actions, use setup-uv's python-version input (sets UV_PYTHON to the path) or explicitly set UV_PYTHON=/path/to/python for guarantee. Official docs recommend explicit uv python install after setup-uv for projects without pyproject.toml requires-python, but pairing with setup-uv python-version or UV_PYTHON ensures the exact interpreter.

Set `UV_PYTHON` to enforce the matrix Python version for `uv sync` and `uv run` commands.

The workflow installs the matrix Python version via `uv python install` but doesn't explicitly pin it. While pyproject.toml constrains the range to >=3.10,<3.13, that doesn't guarantee the exact matrix version (e.g., 3.10) is used if other compatible interpreters exist on the runner's PATH. This risks silent matrix coverage drift.

Suggested hardening
       - name: Set up Python ${{ matrix.python-version }}
         run: uv python install ${{ matrix.python-version }}
+
+      - name: Pin uv interpreter for this job
+        run: echo "UV_PYTHON=${{ matrix.python-version }}" >> "$GITHUB_ENV"

       - name: Install dependencies
         run: uv sync

Alternatively, pass python-version to the setup-uv action:

       - name: Install uv
-        uses: astral-sh/setup-uv@v6
+        uses: astral-sh/setup-uv@v6
+        with:
+          python-version: ${{ matrix.python-version }}
- name: Run tests
run: |
# Clean any cached Python files before running tests
find . -type d -name "__pycache__" -exec rm -r {} +
find . -type f -name "*.pyc" -delete
poetry run pytest --import-mode=importlib
uv run pytest --import-mode=importlib
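
With the reviewer's interpreter pin applied, the matrix job might look like this (a sketch assembled from the diff and the `setup-uv` suggestion, not the committed file):

```yaml
jobs:
  test:
    name: Run Python Tests
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.10", "3.12"]
    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v6
        with:
          # Sets UV_PYTHON so later uv sync / uv run use the matrix interpreter.
          python-version: ${{ matrix.python-version }}

      - name: Install dependencies
        run: uv sync

      - name: Run tests
        run: uv run pytest --import-mode=importlib
```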
1 change: 0 additions & 1 deletion MANIFEST.in

This file was deleted.

23 changes: 8 additions & 15 deletions README.md
@@ -18,7 +18,7 @@ Desktop app for analyzing images from autonomous insect monitoring stations usin

## Dependencies

- Requires Python 3.10. Use [Anaconda](https://www.anaconda.com/) (or [miniconda](https://docs.conda.io/en/latest/miniconda.html)) if you need to maintain multiple versions of Python or are unfamiliar with using Python and scientific packages, it is especially helpful on Windows. [PyEnv](https://github.com/pyenv/pyenv) is also a popular tool for managing multiple versions of python if you are familiar with the command line.
- Requires Python >=3.10, <3.13 (tested on 3.10 and 3.12). Use [uv](https://docs.astral.sh/uv/) (recommended), [Anaconda](https://www.anaconda.com/), or [PyEnv](https://github.com/pyenv/pyenv) to manage Python versions.

## Installation (for non-developers)

@@ -36,37 +36,30 @@ ami test pipeline

## Installation (for developers)

Create an environment just for AMI and the data companion using conda (or virtualenv)

```sh
conda create -n ami python=3.10 anaconda
```

Clone the repository using the command line or the GitHub desktop app.
Install [uv](https://docs.astral.sh/uv/getting-started/installation/) then clone and sync:

```sh
git clone git@github.com:RolnickLab/ami-data-companion.git
cd ami-data-companion
uv sync
```

Install as an editable package. This will install the dependencies and install the `trapdata` console command
To include the desktop GUI dependencies:

```sh
cd ami-data-companion
pip install -e .
uv sync --extra gui
```

Test the whole backend pipeline without the GUI using this command

```sh
python trapdata/tests/test_pipeline.py
# or
ami test pipeline
uv run ami test pipeline
```

Run all other tests with:

```sh
ami test all
uv run ami test all
```

## GUI Usage
3,552 changes: 0 additions & 3,552 deletions poetry.lock

This file was deleted.

112 changes: 59 additions & 53 deletions pyproject.toml
@@ -1,67 +1,73 @@
[tool.poetry]
[project]
name = "trapdata"
version = "0.6.0"
description = "Companion software for automated insect monitoring stations"
authors = ["Michael Bunsen <notbot@gmail.com>"]
license = "MIT"
authors = [{ name = "Michael Bunsen", email = "notbot@gmail.com" }]
license = { text = "MIT" }
readme = "README.md"
homepage = "https://github.com/RolnickLab/ami-data-manager"
repository = "https://github.com/RolnickLab/ami-data-manager"
requires-python = ">=3.10,<3.13"
urls = { Homepage = "https://github.com/RolnickLab/ami-data-companion", Repository = "https://github.com/RolnickLab/ami-data-companion" }

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
dependencies = [
"alembic>=1.10.2",
"boto3>=1.33",
"fastapi>=0.100",
"gradio>=4.41",
"imagesize>=1.4.1",
"numpy>=1.26",
"pandas>=2.0",
"pillow>=10.0",
"pydantic>=2.5",
"pydantic-settings>=2.1",
"python-dateutil>=2.8.2",
"python-dotenv>=1.0",
"requests>=2.28",
"rich>=13.3",
"sentry-sdk>=1.18",
"sqlalchemy>=2.0",
"sqlalchemy-utils>=0.40",
"structlog>=22.3",
"timm>=0.6.13",
"torch>=2.5",
"torchvision>=0.20",
"typer>=0.12.3,<1",
"uvicorn>=0.20",
]

[project.optional-dependencies]
gui = [
"kivy[base]>=2.3",
"plyer>=2.1",
"pyobjus>=1.2.1; sys_platform == 'darwin'",
]
postgres = [
"psycopg2-binary>=2.9.5",
]

[tool.poetry.dependencies]
python = "^3.10,<3.12"
pillow = "^9.5.0"
python-dateutil = "^2.8.2"
python-dotenv = "^1.0.0"
pydantic = "^2.5.0"
rich = "^13.3.3"
pandas = "^1.5.3"
sqlalchemy = ">2.0"
sqlalchemy-utils = "^0.40.0"
alembic = "^1.10.2"
psycopg2-binary = { version = "^2.9.5", optional = true }
sentry-sdk = "^1.18.0"
imagesize = "^1.4.1"
torch = "^2.1.0"
torchvision = "^0.16.0"
timm = "^0.6.13"
structlog = "^22.3.0"
kivy = { extras = ["base"], version = "^2.3.0" }
plyer = "^2.1.0"
pyobjus = { version = "^1.2.1", markers = "sys_platform == 'darwin'" }
[dependency-groups]
dev = [
"black>=23.3",
"flake8>=6.0",
"ipython>=8.11",
"isort>=5.12",
"mypy-boto3-s3>=1.29",
"pytest>=7",
"pytest-asyncio>=0.21",
"pytest-cov>=4.0",
]

# [tool.poetry.group.dev.dependencies] # Can't install these dev deps with pip, so they're in the main deps
black = "^23.3.0"
flake8 = "^6.0.0"
ipython = "^8.11.0"
pytest-cov = "^4.0.0"
pytest-asyncio = "^0.21.0"
pytest = "*"
numpy = "^1.26.2"
pip = "^23.3.1"
pydantic-settings = "^2.1.0"
boto3 = "^1.33.0"
botocore = "^1.33.0"
mypy-boto3-s3 = "^1.29.7"
typer = ">=0.12.3,<1.0.0"
gradio = "^4.41.0"
[project.scripts]
trapdata = "trapdata.ui.main:run"
trapdata-test = "trapdata.tests.test_pipeline:process_deployments"
ami = "trapdata.cli.base:cli"

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.pytest.ini_options]
asyncio_mode = 'auto'
asyncio_mode = "auto"
testpaths = ["trapdata/tests", "trapdata/**/tests"]

[tool.isort]
profile = "black"

[tool.poetry.scripts]
trapdata = 'trapdata.ui.main:run'
trapdata-test = 'trapdata.tests.test_pipeline:process_deployments'
ami = 'trapdata.cli.base:cli'

# [tool.setuptools.package_data]
# trapdata = ["*.kv", "*.txt", "*.json", "*.ini", "*.md"]
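
The poetry caret constraints above were translated by hand to PEP 440 ranges (e.g. `^2.8.2` becomes `>=2.8.2,<3`). For versions with major >= 1 the rule is mechanical; a sketch (`caret_to_pep440` is a hypothetical helper, and note that poetry treats `^0.x` differently, bumping the minor component instead — hence the relaxed `>=0.40` style used in the diff):

```shell
# Translate a poetry caret constraint (major >= 1) into the equivalent
# PEP 440 range: ^X.Y.Z  ->  >=X.Y.Z,<X+1
caret_to_pep440() {
  local ver=${1#^}         # strip the leading caret
  local major=${ver%%.*}   # caret pins the major component
  echo ">=${ver},<$(( major + 1 ))"
}

caret_to_pep440 "^2.8.2"    # -> >=2.8.2,<3
caret_to_pep440 "^13.3.3"   # -> >=13.3.3,<14
```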
65 changes: 65 additions & 0 deletions scripts/job_adc_worker.sh
@@ -0,0 +1,65 @@
#!/bin/bash
#SBATCH --job-name=adc-worker
#SBATCH --account=def-YOUR_ACCOUNT # <-- Replace with your DRAC allocation
#SBATCH --time=12:00:00
#SBATCH --mem=32G
#SBATCH --cpus-per-task=4
#SBATCH --gres=gpu:1 # Request 1 GPU (see GPU options below)
#SBATCH --output=%x-%j.out
#SBATCH --error=%x-%j.err
#
# Example SLURM job script for running the ADC Antenna worker on DRAC Alliance
# HPC clusters (Fir, Rorqual, Narval, etc.).
#
# GPU options (adjust --gres):
# Fir/Rorqual: gpu:h100:1
# Narval: gpu:a100:1
# Cedar: gpu:v100l:1
#
# ──────────────────────────────────────────────────────────────────────
# One-time setup (run interactively, NOT in this script):
#
# module load python/3.12 cuda/12.6
# uv venv ~/venvs/adc --python 3.12
# source ~/venvs/adc/bin/activate
# cd ~/projects/ami-data-companion
# # --no-deps: let the lockfile control versions; avoids conflicts with
# # system packages on DRAC nodes.
# uv pip install --no-deps -r <(uv export --no-hashes --frozen)
# uv pip install --no-deps .
#
# # Create .env with your Antenna credentials:
# cat > .env <<EOF
# AMI_ANTENNA_API_BASE_URL=https://antenna.insectai.org/api/v2
# AMI_ANTENNA_API_AUTH_TOKEN=your_token_here
# AMI_ANTENNA_API_BATCH_SIZE=4
# EOF
#
# # Register worker with Antenna (once):
# ami worker register "DRAC Worker"
# ──────────────────────────────────────────────────────────────────────

set -euo pipefail

# Catch unedited placeholder in #SBATCH --account
if [[ "${SLURM_JOB_ACCOUNT:-}" == *YOUR_ACCOUNT* ]]; then
echo "ERROR: Replace --account in this script with your DRAC allocation." >&2
exit 1
fi

module load python/3.12 cuda/12.6

source ~/venvs/adc/bin/activate
cd ~/projects/ami-data-companion
if [[ ! -f .env ]]; then
echo "ERROR: .env not found in $(pwd). See one-time setup instructions in this script." >&2
exit 1
fi
set -a; source .env; set +a

# Register pipelines on each run (idempotent)
ami worker register "DRAC Worker - $(hostname)"

# Run with a timeout slightly shorter than the SLURM --time limit to allow a
# clean shutdown. The worker handles SIGTERM gracefully and finishes the
# current batch. (GNU timeout takes a single duration, so 715m, not 11h55m.)
timeout --signal=SIGTERM 715m ami worker
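
The timeout value is the 12-hour walltime minus a roughly 5-minute shutdown margin. It can be derived from the SLURM `--time` limit instead of hard-coded; a minimal sketch (`slurm_timeout` is a hypothetical helper, not part of the script above):

```shell
# Derive a GNU-timeout duration from a SLURM walltime (HH:MM:SS),
# leaving a margin so the worker can finish its current batch.
slurm_timeout() {
  local walltime=$1 margin_min=$2
  local h m s
  IFS=: read -r h m s <<< "$walltime"
  # 10# prefix: keep bash from treating zero-padded fields as octal (e.g. "08").
  echo "$(( 10#$h * 60 + 10#$m - margin_min ))m"
}

slurm_timeout "12:00:00" 5   # -> 715m
```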
2 changes: 1 addition & 1 deletion trapdata/ml/models/tracking.py
@@ -456,7 +456,7 @@ def post_process_batch(self, output) -> np.ndarray:
# output = output.view(-1, output.size(0)).cpu()
# output = output.reshape((output.shape[0],))
batch_size = output.shape[0]
num_features = np.product(output.shape[1:])
num_features = np.prod(output.shape[1:])

output = output.reshape(batch_size, num_features)
output = output.cpu().numpy()