The goal of this project is to provide a Python-based starter API that comes pre-configured with tools to accelerate development of both Comet and general Python APIs. Some of these tools are as follows:
- Platform: Python
- Web Framework: FastAPI
- Database: SQLite, Alembic
- ORM: SQLAlchemy
- Data Validation: Pydantic
- Unit Testing: pytest
- Code Quality: Ruff, Pylint, Black, isort
- Authentication support: JWT
- Documentation: Swagger and ReDoc
## Table of Contents
- Running the Project Locally
- Running with Docker
- Initializing PostgreSQL Database
- Running Unit Tests
- Running Code Quality Checks
- Running Code Formatting
- Publishing Updated Docs
- Contributing
- Next Steps
## Running the Project Locally

To override default environment variables, add a `.env` file to the `comet-api` directory and update as needed (optional):

```shell
API_PREFIX=[SOME_ROUTE] # Ex: '/api'
DATABASE_URL=[SOME_URL] # Ex: 'postgresql://username:password@localhost:5432/database_name'
OIDC_CONFIG_URL=[SOME_URL] # Ex: 'https://keycloak.auth.metrostar.cloud/auth/realms/dev/.well-known/openid-configuration'
LOG_LEVEL=[LOG_LEVEL] # Ex: 'DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL' (Default: 'INFO')
```
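The API presumably reads these variables with sensible fallbacks. As a rough sketch of that pattern using only the standard library (the actual project likely loads settings via Pydantic; the fallback values below are illustrative assumptions, except the documented `LOG_LEVEL` default of `INFO`):

```python
import os

# Hypothetical settings loader. Variable names mirror the .env example above;
# the default values here are assumptions for illustration only.
def load_settings() -> dict:
    return {
        "api_prefix": os.environ.get("API_PREFIX", "/api"),
        "database_url": os.environ.get("DATABASE_URL", "sqlite:///./app.db"),
        "oidc_config_url": os.environ.get("OIDC_CONFIG_URL", ""),
        "log_level": os.environ.get("LOG_LEVEL", "INFO"),
    }

settings = load_settings()
```

Any value present in the environment (for example, loaded from `.env`) overrides the in-code default.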
- To install project dependencies, run the following:

  ```shell
  uv sync
  ```

- To install dev dependencies, run the following:

  ```shell
  uv sync --extra dev
  ```

- To start the app, run the following:

  ```shell
  uv run uvicorn app.main:app --reload --host=0.0.0.0 --port=5000
  ```

- Access the Swagger docs by navigating to:

  http://0.0.0.0:5000/docs
- To create an environment, run the following:

  ```shell
  virtualenv -p python3 venv
  source venv/bin/activate
  ```

- To install project dependencies, run the following:

  ```shell
  pip install .
  ```

- To install dev dependencies, run the following (optional):

  ```shell
  pip install -e ".[dev]"
  ```

- To start the app, run the following:

  ```shell
  uvicorn app.main:app --reload --host=0.0.0.0 --port=5000
  ```

- Access the Swagger docs by navigating to:

  http://0.0.0.0:5000/docs
## Running with Docker

- To build the image, run the following:

  ```shell
  docker build . -t comet-api
  ```

- To run the container, run the following:

  ```shell
  docker run -p 5000:5000 --name comet-api comet-api
  ```

- Access the Swagger docs by navigating to:

  http://0.0.0.0:5000/docs
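The repository ships its own Dockerfile; for orientation only, a minimal sketch of the kind of image involved might look like the following (base image, install method, and start command are assumptions — consult the actual Dockerfile in the repository):

```dockerfile
# Illustrative sketch, not the project's actual Dockerfile
FROM python:3.12-slim
WORKDIR /app
COPY . .
RUN pip install .
# Port matches the docker run -p 5000:5000 mapping above
EXPOSE 5000
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "5000"]
```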
## Initializing PostgreSQL Database

If you're using PostgreSQL instead of SQLite, you can use the provided initialization script to set up your database:

- Ensure your `.env` file contains a PostgreSQL `DATABASE_URL`:

  ```shell
  DATABASE_URL=postgresql://username:password@localhost:5432/database_name
  ```

- Run the initialization script using either method:

  Using the shell script:

  ```shell
  ./scripts/init_db.sh
  ```

  Or using Python directly:

  ```shell
  python scripts/init_postgres.py
  ```

- To seed initial data along with the schema (optional):

  ```shell
  ./scripts/init_db.sh --seed
  ```

Script options:

- `--seed`: Seed initial data after running migrations
- `--skip-create`: Skip database creation (only run migrations)
The script will:
- Create the database if it doesn't exist
- Run all Alembic migrations to set up the schema
- Optionally seed initial data
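Creating the database before running migrations means connecting to the server without the target database, which requires splitting `DATABASE_URL` into a server part and a database name. A hypothetical standard-library sketch of that parsing step (the real init script may do this differently, e.g. via SQLAlchemy):

```python
from urllib.parse import urlsplit

def split_database_url(url: str) -> tuple[str, str]:
    """Return (server_url, database_name) for a PostgreSQL URL.

    The server URL points at the default 'postgres' maintenance database,
    which is what a creation script would connect to before the target
    database exists. Purely illustrative; ignores query strings.
    """
    parts = urlsplit(url)
    db_name = parts.path.lstrip("/")
    server_url = url[: len(url) - len(parts.path)] + "/postgres"
    return server_url, db_name
```

With the example URL from above, this yields the maintenance connection URL plus `database_name`, after which `CREATE DATABASE` and the Alembic migrations can run.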
## Running Unit Tests

- To run unit tests, run the following:

  ```shell
  pytest
  ```

- To run unit tests with code coverage, run the following:

  ```shell
  coverage run -m pytest && coverage html
  ```

## Running Code Quality Checks

- To run code quality checks, run the following:

  ```shell
  ruff check .
  ```

## Running Code Formatting

- To run code formatting, run the following:

  ```shell
  ruff format .
  ```

## Publishing Updated Docs

Documentation is automatically updated via GitHub Actions when changes are pushed to the main branch.
The GitHub Actions workflow (`.github/workflows/update-docs.yaml`) automatically:

- Starts the API server
- Downloads the OpenAPI specification from `/openapi.json`
- Generates HTML documentation using `@redocly/cli`
- Commits and pushes the updated `docs/openapi.json` and `docs/index.html` files
The workflow is triggered:
- Automatically when changes to `app/**` files are pushed to `main`
- Manually via the GitHub Actions UI (`workflow_dispatch`)
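In outline, the steps and triggers described above might resemble the following workflow file (step names, action versions, and exact commands are illustrative assumptions, not the actual contents of `update-docs.yaml`):

```yaml
# Illustrative sketch of .github/workflows/update-docs.yaml
name: Update Docs
on:
  push:
    branches: [main]
    paths: ["app/**"]
  workflow_dispatch:

jobs:
  update-docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Start the API in the background, then fetch the generated spec
      - run: |
          uvicorn app.main:app --host 0.0.0.0 --port 5000 &
          sleep 5
          curl http://0.0.0.0:5000/openapi.json -o docs/openapi.json
      # Render static HTML docs from the spec
      - run: npx @redocly/cli build-docs docs/openapi.json --output docs/index.html
      # Commit and push the updated docs
      - run: |
          git config user.name github-actions
          git add docs/openapi.json docs/index.html
          git commit -m "Update docs" && git push
```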
If you need to update docs manually:
- Access the ReDoc docs by navigating to:

  http://0.0.0.0:5000/redoc

- Click the Download button
- Copy the downloaded file into the `docs` directory
- To convert the JSON into HTML, run the following:

  ```shell
  npx @redocly/cli build-docs docs/openapi.json --output docs/index.html
  ```

- Commit the spec and HTML files and merge into `main` to publish docs
## Contributing

- Fork the Project
- Create your Feature Branch (`git checkout -b feature_a`)
- Commit your Changes (`git commit -m 'Added new feature_a'`)
- Push to the Branch (`git push origin feature_a`)
- Open a Pull Request
## Next Steps

The following is a short list of potential next steps for this project, whether to build on this baseline or purely for learning purposes.
- Add/Update existing endpoints with more applicable entities and/or columns
- Update applicable endpoints to require JWT
- Replace the default database with an external database (Ex: PostgreSQL)
- Deploy to cloud infrastructure
- Automate doc publishing process
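One of the steps above is requiring a JWT on protected endpoints. As a rough illustration of what the signature check involves, here is a standard-library sketch of HS256 signing and verification (function names are hypothetical; a deployment validating tokens from the OIDC provider configured via `OIDC_CONFIG_URL` would typically verify RS256 signatures against the provider's published keys, e.g. with a library like `python-jose` or `PyJWT`):

```python
import base64
import hashlib
import hmac
import json
from typing import Optional

def _b64url_decode(data: str) -> bytes:
    # JWTs use unpadded base64url; restore padding before decoding
    return base64.urlsafe_b64decode(data + "=" * (-len(data) % 4))

def _b64url_encode(raw: bytes) -> str:
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def make_jwt(payload: dict, secret: bytes) -> str:
    """Build an HS256-signed JWT (hypothetical helper, for illustration)."""
    header = _b64url_encode(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url_encode(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(secret, signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url_encode(sig)}"

def verify_jwt(token: str, secret: bytes) -> Optional[dict]:
    """Return the payload if the HS256 signature checks out, else None.

    Real validation would also check registered claims such as exp,
    iss, and aud, which this sketch omits.
    """
    try:
        header, body, sig = token.split(".")
    except ValueError:
        return None
    signing_input = f"{header}.{body}".encode()
    expected = hmac.new(secret, signing_input, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, _b64url_decode(sig)):
        return None
    return json.loads(_b64url_decode(body))
```

In a FastAPI app, a check like `verify_jwt` would usually live behind a dependency so that protected routes reject requests with missing or invalid tokens before the handler runs.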