# PyCharm Setup
PyCharm has two distinct needs when working with this project:
- Code analysis (autocompletion, type checking, go-to-definition): needs a Python interpreter with all packages installed
- Test execution / debugging: needs the Docker environment (database, tools, environment variables)
The recommended approach is to configure a local venv for code analysis and a Docker Compose interpreter for running and debugging tests.
## Code analysis: local venv
- Run `make local_venv` to create per-package virtual environments.
- In PyCharm, go to Settings → Project → Python Interpreter → Add Interpreter → Add Local Interpreter.
- Select Existing and point it to the venv for the package you're working on:
    - For `core`/`common`: `packages/core/.venv/bin/python`
    - For `floors_estimation`: `packages/floors_estimation/.venv/bin/python`
    - For `party_walls`: `packages/party_walls/.venv/bin/python`
- Mark source roots so PyCharm resolves cross-package imports correctly:
    - Right-click `packages/core/src` → Mark Directory as → Sources Root
    - Right-click `packages/common/src` → Mark Directory as → Sources Root
This gives full autocompletion and type checking. External tool binaries (roofer, tyler, etc.) are not needed for code analysis.
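As a quick sanity check, you can confirm that a per-package venv was created and that the project code is importable from it. A minimal sketch, assuming the project is importable under a `bag3d` namespace (adjust the import name if it differs):

```bash
# Confirm the venv exists and reports a sensible interpreter version
packages/core/.venv/bin/python --version

# Optional: check that the project package is importable from this venv
# (the "bag3d" namespace is an assumption; use the real import name of the core package)
packages/core/.venv/bin/python -c "import bag3d; print(bag3d.__file__)"
```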
## Test execution and debugging: Docker Compose interpreter
### Prerequisites
Start the containers with bind-mounted source so that breakpoints in your local files map directly to the running code:
```bash
make docker_dev
```
The `docker_dev` target starts all services with `docker/compose.dev.yaml` layered on top of the base compose file. The pipeline containers bind-mount the host source directories, so edits are immediately visible inside the containers.
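To see exactly what the dev overlay adds (bind mounts, environment, profiles), you can render the merged configuration; this is only an inspection step and starts nothing:

```bash
# Render the merged base + dev configuration without starting containers
BAG3D_DOCKER_IMAGE_TAG=develop docker compose -p bag3d-dev \
  -f docker/compose.yaml -f docker/compose.dev.yaml config | less
```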
To start only the core service (faster for most development):
```bash
BAG3D_DOCKER_IMAGE_TAG=develop docker compose -p bag3d-dev \
  -f docker/compose.yaml -f docker/compose.dev.yaml up -d
```
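Once the containers are up, you can verify that the host source directories are actually mounted into the core container. The container name below follows Compose's `<project>-<service>-<index>` convention and may differ on your machine:

```bash
# List the running services in the dev project
docker compose -p bag3d-dev ps

# Show the mounts of the core container (name assumed from Compose's default naming)
docker inspect bag3d-dev-bag3d-core-1 --format '{{ json .Mounts }}'
```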
### Configure the Docker Compose interpreter in PyCharm
- Go to Settings → Project → Python Interpreter → Add Interpreter → On Docker Compose.
- Set Configuration files: `docker/compose.yaml` and `docker/compose.dev.yaml` (add the second file with the + button).
- Set Service: `bag3d-core` (or the package you're testing).
- Set Python interpreter path: `/opt/3dbag-pipeline/venv/bin/python`.
- Under Path mappings, add:
| Local path | Remote path |
|---|---|
| `<project-root>/packages` | `/opt/3dbag-pipeline/packages` |
- Click OK and wait for PyCharm to index the remote interpreter.
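If PyCharm cannot find the interpreter, confirm from the host that the path exists inside the running service; a quick check, assuming the containers are already up:

```bash
# Verify the interpreter path that PyCharm will use inside the bag3d-core service
BAG3D_DOCKER_IMAGE_TAG=develop docker compose -p bag3d-dev \
  -f docker/compose.yaml -f docker/compose.dev.yaml \
  exec bag3d-core /opt/3dbag-pipeline/venv/bin/python --version
```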
### Running tests with debugging
Once the Docker Compose interpreter is configured and the containers are running with bind-mounted source (`make docker_dev`):
- Right-click any test file or function and choose Debug — breakpoints in your local source files will be hit inside the container.
- The test environment (PostgreSQL, environment variables from `docker/.env`) is available automatically because the test runs inside the container.
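The same tests can also be run non-interactively from the host by executing pytest inside the container, which is handy for a quick check without the debugger. A sketch, in which the test path and working directory are assumptions to adjust for the package you're working on:

```bash
# Run the core package's tests inside the running container
# (test path and working directory are assumptions; adjust as needed)
BAG3D_DOCKER_IMAGE_TAG=develop docker compose -p bag3d-dev \
  -f docker/compose.yaml -f docker/compose.dev.yaml \
  exec bag3d-core /opt/3dbag-pipeline/venv/bin/python -m pytest packages/core/tests -x -q
```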
## After editing Python files
With bind-mounted source (`make docker_dev`), file changes are visible in the container immediately. However, Dagster's gRPC code server does not auto-detect file changes. After editing code, click *Reload* in the Dagster UI at http://localhost:3000 to reload the code location.
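If you prefer not to click through the UI, Dagster's GraphQL API exposes a reload mutation that can be called from the command line. The code-location name `core` below is an assumption; use the location name shown in the Dagster UI:

```bash
# Reload a code location via Dagster's GraphQL endpoint
# (location name "core" is an assumption; check the name listed under Deployment in the UI)
curl -s http://localhost:3000/graphql \
  -H 'Content-Type: application/json' \
  -d '{"query": "mutation { reloadRepositoryLocation(repositoryLocationName: \"core\") { __typename } }"}'
```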
If you prefer automatic reloads on save, use `make docker_watch` instead (the Stage 1 sync+restart workflow). That triggers a container restart whenever source files change, at the cost of a few seconds of downtime per reload.
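For reference, `make docker_watch` most likely wraps Compose's file-watch mode; a rough manual equivalent is sketched below, though the exact invocation depends on the Makefile:

```bash
# Approximate manual equivalent of make docker_watch (exact flags depend on the Makefile)
BAG3D_DOCKER_IMAGE_TAG=develop docker compose -p bag3d-dev \
  -f docker/compose.yaml -f docker/compose.dev.yaml watch
```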
## Selective service startup (profiles)
The dev overlay defines compose profiles so you can start only the services you need:
```bash
# Start only core + databases + dagster (default, no profile flag needed)
make docker_dev

# Also start floors-estimation
BAG3D_DOCKER_IMAGE_TAG=develop docker compose -p bag3d-dev \
  -f docker/compose.yaml -f docker/compose.dev.yaml --profile floors up -d

# Also start party-walls
BAG3D_DOCKER_IMAGE_TAG=develop docker compose -p bag3d-dev \
  -f docker/compose.yaml -f docker/compose.dev.yaml --profile party-walls up -d

# Start everything
BAG3D_DOCKER_IMAGE_TAG=develop docker compose -p bag3d-dev \
  -f docker/compose.yaml -f docker/compose.dev.yaml --profile all up -d
```
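To see which services a given profile would bring up without actually starting them, or to check what is currently running, Compose offers a couple of quick commands:

```bash
# List the services selected by a given profile (nothing is started)
BAG3D_DOCKER_IMAGE_TAG=develop docker compose \
  -f docker/compose.yaml -f docker/compose.dev.yaml --profile floors config --services

# Show what is currently running under the dev project
docker compose -p bag3d-dev ps
```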