Compare commits

...

60 Commits

Author SHA1 Message Date
cal
29f9875718 Merge pull request 'fix: drop :latest tag from CI, make worker count configurable' (#116) from fix/ci-tags-and-workers into main
All checks were successful
Build Docker Image / build (push) Successful in 1m18s
2026-04-09 16:32:52 +00:00
Cal Corum
6efba473a0 fix: use exec form CMD so uvicorn receives SIGTERM as PID 1
Shell form CMD makes /bin/sh PID 1 — SIGTERM from docker stop goes to
the shell, not uvicorn, causing SIGKILL after the stop timeout instead
of graceful shutdown. Using CMD ["sh", "-c", "exec uvicorn ..."] lets
the shell expand $WEB_WORKERS and then exec-replace itself with uvicorn,
restoring correct signal delivery.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-09 09:01:18 -05:00
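The exec semantics this commit relies on can be demonstrated in isolation. The following is a standalone POSIX sketch (not the project's Dockerfile): `exec` replaces the shell in place, so the exec'd process keeps the shell's PID, which is why signals sent to PID 1 reach uvicorn.

```python
import subprocess

# The outer sh prints its own PID, then execs a new sh that prints $$ again.
# Because exec replaces the process image without forking, both PIDs match —
# the same mechanism that lets uvicorn inherit PID 1 inside the container.
result = subprocess.run(
    ["sh", "-c", "echo $$; exec sh -c 'echo $$'"],
    capture_output=True, text=True, check=True,
)
pid_before, pid_after = result.stdout.split()
print(pid_before == pid_after)  # True: exec kept the same PID
```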
Cal Corum
0095d2a792 fix: drop :latest tag from CI, make worker count configurable
Remove :latest Docker tag to match Paper Dynasty convention — only
:version and :environment tags are pushed. Add WEB_WORKERS env var
to Dockerfile (default 2) so prod can override via docker-compose.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 08:41:47 -05:00
cal
1df1abbd7c Merge pull request 'fix: replace removed tiangolo base image with python:3.13-slim' (#115) from fix/dockerfile-base-image into main
All checks were successful
Build Docker Image / build (push) Successful in 3m36s
Reviewed-on: #115
2026-04-09 12:51:57 +00:00
Cal Corum
1187c2c99b fix: use python:3.12-slim to match pinned numpy==1.26.4 compatibility
numpy==1.26.4 has no Python 3.13 wheel and slim images have no build
toolchain, so the build would fail. python:3.12-slim matches the Python
version from the removed tiangolo base image.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-09 07:31:18 -05:00
Cal Corum
4bcc798082 fix: replace removed tiangolo base image with python:3.13-slim
The tiangolo/uvicorn-gunicorn-fastapi:python3.12 image was removed from
Docker Hub, breaking CI builds. Switches to official python:3.13-slim
with explicit uvicorn CMD. Fixes COPY path to match WORKDIR and adds
2 workers to replace the multi-worker gunicorn setup.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 06:49:46 -05:00
cal
d8a020fe55 Merge pull request 'ci: add dev tag trigger and environment-based image tagging' (#114) from ci/dev-tag-support into main
Some checks failed
Build Docker Image / build (push) Failing after 1m25s
2026-04-09 04:47:41 +00:00
Cal Corum
6b957fc61c ci: add dev tag trigger and environment-based image tagging
Adds support for the 'dev' tag to trigger CI builds, pushing :dev Docker
tag for the dev environment. CalVer tags now also push :production alongside
:latest for future migration.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 20:12:45 -05:00
cal
f9e24eb4bc Merge pull request 'fix: update test limit to respect MAX_LIMIT=500 (#110)' (#112) from issue/110-fix-test-batting-sbaplayer-career-totals-returns-4 into main
Reviewed-on: #112
2026-04-08 12:55:55 +00:00
Cal Corum
36b962e5d5 fix: update test limit to respect MAX_LIMIT=500 (#110)
Closes #110

The test was sending limit=999 which exceeds MAX_LIMIT (500), causing
FastAPI to return 422. Changed to limit=500, which is sufficient to
cover all seasons for any player.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-08 12:55:35 +00:00
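The behavior the test now respects can be sketched in plain Python. FastAPI enforces the cap declaratively (roughly `Query(le=MAX_LIMIT)`) and rejects over-limit values with a 422 rather than clamping them; this dependency-free stand-in mirrors that reject-don't-clamp behavior:

```python
MAX_LIMIT = 500

def validate_limit(limit: int) -> int:
    # Mirror FastAPI's le=MAX_LIMIT constraint: over-limit values are
    # rejected (FastAPI answers HTTP 422), never silently clamped.
    if limit > MAX_LIMIT:
        raise ValueError(f"limit must be <= {MAX_LIMIT}, got {limit}")
    return limit

print(validate_limit(500))  # 500 — the updated test value passes
try:
    validate_limit(999)     # the old test value is rejected
except ValueError as exc:
    print("rejected:", exc)
```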
cal
5d5df325bc Merge pull request 'feat: increase MAX_LIMIT to 1000 for plays batting/fielding/pitching (#111)' (#113) from issue/111-feat-increase-max-limit-to-1000-for-plays-fielding into main
Reviewed-on: #113
2026-04-08 12:53:38 +00:00
Cal Corum
682b990321 feat: increase MAX_LIMIT to 1000 for plays batting/fielding/pitching (#111)
Closes #111

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-08 06:32:46 -05:00
cal
5b19bd486a Merge pull request 'fix: preserve total_count in get_totalstats instead of overwriting with page length (#101)' (#102) from issue/101-fieldingstats-get-totalstats-total-count-overwritt into main 2026-04-08 04:08:40 +00:00
Cal Corum
718abc0096 fix: preserve total_count in get_totalstats instead of overwriting with page length (#101)
Closes #101

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 23:08:10 -05:00
cal
52d88ae950 Merge pull request 'fix: add missing indexes on FK columns in stratplay and stratgame (#74)' (#95) from issue/74-add-missing-indexes-on-foreign-key-columns-in-high into main 2026-04-08 04:06:06 +00:00
Cal Corum
9165419ed0 fix: add missing indexes on FK columns in stratplay and stratgame (#74)
Closes #74

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 23:05:25 -05:00
cal
d23d6520c3 Merge pull request 'fix: batch standings updates to eliminate N+1 queries in recalculate (#75)' (#93) from issue/75-fix-n-1-query-pattern-in-standings-recalculation into main 2026-04-08 04:03:39 +00:00
Cal Corum
c23ca9a721 fix: batch standings updates to eliminate N+1 queries in recalculate (#75)
Replace per-game update_standings() calls with pre-fetched dicts and
in-memory accumulation, then a single bulk_update at the end.
Reduces ~1,100+ queries for a full season to ~5 queries.

Closes #75

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 23:03:11 -05:00
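The batching pattern described above can be sketched with illustrative data (names and shapes here are hypothetical, not the project's models): accumulate results in memory and write once at the end, instead of issuing one UPDATE per game.

```python
from collections import defaultdict

# Stand-in for the pre-fetched games; in the real code these come from
# a single StratGame query instead of per-game update_standings() calls.
games = [
    {"home": 1, "away": 2, "home_score": 5, "away_score": 3},
    {"home": 2, "away": 1, "home_score": 2, "away_score": 6},
    {"home": 1, "away": 2, "home_score": 0, "away_score": 4},
]

standings = defaultdict(lambda: {"wins": 0, "losses": 0})
for g in games:
    winner = g["home"] if g["home_score"] > g["away_score"] else g["away"]
    loser = g["away"] if winner == g["home"] else g["home"]
    standings[winner]["wins"] += 1
    standings[loser]["losses"] += 1

# At this point one bulk_update replaces len(games) individual saves.
print(dict(standings))
```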
cal
1db06576cc Merge pull request 'fix: replace integer comparisons on boolean fields with True/False (#69)' (#94) from issue/69-boolean-fields-compared-as-integers-sqlite-pattern into main 2026-04-08 03:57:35 +00:00
Cal Corum
7a5327f490 fix: replace integer comparisons on boolean fields with True/False (#69)
Closes #69

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:57:01 -05:00
cal
a2889751da Merge pull request 'fix: remove SQLite fallback code from db_engine.py (#70)' (#89) from issue/70-remove-sqlite-fallback-code-from-db-engine-py into main 2026-04-08 03:56:11 +00:00
Cal Corum
eb886a4690 fix: remove SQLite fallback code from db_engine.py (#70)
Removes DATABASE_TYPE conditional entirely. PostgreSQL is now the only
supported backend. Moves PooledPostgresqlDatabase import to top-level
and raises RuntimeError at startup if POSTGRES_PASSWORD is unset,
preventing silent misconnection with misleading errors.

Closes #70

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:55:44 -05:00
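The fail-fast pattern from this commit can be sketched as a small helper (the function name is illustrative; the actual code raises inline in db_engine.py): raise at startup when a required variable is missing, instead of failing later with a misleading connection error.

```python
import os

def require_env(name: str) -> str:
    # Fail fast at import/startup; a missing credential should never
    # surface later as a confusing connection failure.
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"{name} environment variable is not set.")
    return value

os.environ["POSTGRES_PASSWORD"] = "example-password"  # simulate configured env
print(require_env("POSTGRES_PASSWORD"))
```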
cal
0ee7367bc0 Merge pull request 'fix: disable autoconnect and set pool timeout on PooledPostgresqlDatabase (#80)' (#87) from issue/80-disable-autoconnect-and-set-pool-timeout-on-pooled into main 2026-04-08 03:55:05 +00:00
Cal Corum
6637f6e9eb fix: disable autoconnect and set pool timeout on PooledPostgresqlDatabase (#80)
- Set timeout=5 so pool exhaustion surfaces as an error instead of hanging forever
- Set autoconnect=False to require explicit connection acquisition
- Add HTTP middleware in main.py to open/close connections per request

Closes #80

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:54:27 -05:00
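With autoconnect off, every request must acquire and release a connection explicitly. A minimal sketch of the per-request lifecycle (a generic stand-in, not the actual peewee/FastAPI middleware): open before the handler, close in a finally so the connection returns to the pool even when the handler raises.

```python
class FakeDB:
    # Stand-in for the pooled database object, tracking open/closed state.
    def __init__(self):
        self.open = False
    def connect(self):
        self.open = True
    def close(self):
        self.open = False

def handle_request(db, handler):
    # Shape of the HTTP middleware: explicit acquire, guaranteed release.
    db.connect()
    try:
        return handler()
    finally:
        db.close()  # runs even if the handler raises

db = FakeDB()
print(handle_request(db, lambda: "ok"), db.open)  # ok False
```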
cal
fa176c9b05 Merge pull request 'fix: enforce Literal validation on sort parameter in GET /api/v3/players (#66)' (#68) from ai/major-domo-database-66 into main 2026-04-08 03:54:02 +00:00
Cal Corum
ece25ec22c fix: enforce Literal validation on sort parameter in GET /api/v3/players (#66)
Closes #66

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:53:33 -05:00
cal
580e8ea031 Merge pull request 'fix: align CustomCommandCreator.discord_id model with BIGINT column (#78)' (#88) from issue/78-fix-discord-id-type-mismatch-between-model-charfie into main 2026-04-08 03:49:26 +00:00
Cal Corum
18394aa74e fix: align CustomCommandCreator.discord_id model with BIGINT column (#78)
Closes #78

Change CharField(max_length=20) to BigIntegerField to match the BIGINT
column created by the migration. Remove the str() workaround in
get_creator_by_discord_id() that was compensating for the type mismatch.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:48:55 -05:00
cal
8e7466abd7 Merge pull request 'fix: remove token value from Bad Token log warnings (#79)' (#85) from issue/79-stop-logging-raw-auth-tokens-in-warning-messages into main 2026-04-08 03:43:13 +00:00
Cal Corum
d61bc31daa fix: remove token value from Bad Token log warnings (#79)
Closes #79

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:42:49 -05:00
cal
ca15dfe380 Merge pull request 'fix: replace row-by-row DELETE with bulk DELETE in career recalculation (#77)' (#92) from issue/77-replace-row-by-row-delete-with-bulk-delete-in-care into main 2026-04-08 03:24:58 +00:00
cal
1575be8260 Merge pull request 'fix: update Docker base image from Python 3.11 to 3.12 (#82)' (#91) from issue/82-align-python-version-between-docker-image-3-11-and into main 2026-04-08 03:24:46 +00:00
cal
7c7405cd1d Merge pull request 'feat: add migration tracking system (#81)' (#96) from issue/81-add-migration-tracking-system into main 2026-04-08 03:23:41 +00:00
cal
0cc0cba6a9 Merge pull request 'fix: replace deprecated Pydantic .dict() with .model_dump() (#76)' (#90) from issue/76-replace-deprecated-pydantic-dict-with-model-dump into main 2026-04-08 03:23:30 +00:00
cal
41fe4f6ce2 Merge pull request 'fix: add type annotations to untyped query parameters (#73)' (#86) from issue/73-add-type-annotations-to-untyped-query-parameters into main 2026-04-08 03:22:17 +00:00
cal
14234385fe Merge pull request 'fix: add combined_season classmethod to PitchingStat (#65)' (#67) from ai/major-domo-database-65 into main 2026-04-08 03:22:15 +00:00
cal
07aeaa8f3e Merge pull request 'fix: replace manual db.close() calls with middleware-based connection management (#71)' (#97) from issue/71-refactor-manual-db-close-calls-to-middleware-based into main 2026-04-08 02:42:09 +00:00
Cal Corum
701f790868 ci: retrigger build after transient Docker Hub push failure 2026-04-07 21:30:36 -05:00
Cal Corum
b46d8d33ef fix: remove empty finally clauses in custom_commands and help_commands
After removing db.close() calls, 22 finally: blocks were left empty
(12 in custom_commands.py, 10 in help_commands.py), causing
IndentationError at import time. Removed the finally: clause entirely
since connection lifecycle is now handled by the middleware.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 21:30:36 -05:00
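The failure mode described above is easy to reproduce: a `finally:` clause whose body was deleted is invalid syntax, so the module fails at import time. `compile()` shows it without needing a file on disk:

```python
# A finally: clause with no indented body does not compile. Depending on
# the Python version the error is IndentationError or a plain SyntaxError
# (IndentationError is a subclass), so we catch the base class.
bad = "try:\n    x = 1\nfinally:\n"
try:
    compile(bad, "<demo>", "exec")
    compiled = True
except SyntaxError as err:
    compiled = False
    print(type(err).__name__)

print(compiled)  # False: the empty finally: clause is rejected
```

Removing the whole `try`/`finally` scaffolding, as the commit does, avoids this entirely.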
Cal Corum
cfa6da06b7 fix: replace manual db.close() calls with middleware-based connection management (#71)
Closes #71

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 21:30:36 -05:00
cal
40897f1cc8 Merge pull request 'fix: move hardcoded Discord webhook URL to env var' (#83) from fix/remove-hardcoded-webhook into main
Reviewed-on: #83
Reviewed-by: Claude <cal.corum+openclaw@gmail.com>
2026-04-08 02:28:20 +00:00
12a76c2bb5 Merge branch 'main' into fix/remove-hardcoded-webhook 2026-04-08 02:24:10 +00:00
cal
aac4bf50d5 Merge pull request 'chore: switch CI to tag-triggered builds' (#107) from chore/tag-triggered-ci into main
Reviewed-on: #107
2026-04-06 16:59:02 +00:00
Cal Corum
4ad445b0da chore: switch CI to tag-triggered builds
Match the discord bot's CI pattern — trigger on CalVer tag push
instead of branch push/PR. Removes auto-CalVer generation and
simplifies to a single build step.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 16:58:45 +00:00
cal
8d9bbdd7a0 Merge pull request 'fix: increase get_games limit to 1000' (#106) from fix/increase-get-game-limit into main
All checks were successful
Build Docker Image / build (push) Successful in 1m6s
Reviewed-on: #106
2026-04-06 15:30:47 +00:00
cal
c95459fa5d Update app/routers_v3/stratgame.py
All checks were successful
Build Docker Image / build (pull_request) Successful in 4m51s
2026-04-06 14:58:36 +00:00
cal
d809590f0e Merge pull request 'fix: correct column references in season pitching stats SQL' (#105) from fix/pitching-stats-column-name into main
All checks were successful
Build Docker Image / build (push) Successful in 2m11s
2026-04-02 16:57:30 +00:00
cal
0d8e666a75 Merge pull request 'fix: let HTTPException pass through @handle_db_errors' (#104) from fix/handle-db-errors-passthrough-http into main
Some checks failed
Build Docker Image / build (push) Has been cancelled
2026-04-02 16:57:12 +00:00
Cal Corum
bd19b7d913 fix: correct column references in season pitching stats view
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m4s
sp.on_first/on_second/on_third don't exist — the actual columns are
on_first_id/on_second_id/on_third_id. This caused failures when
updating season pitching stats after games.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-02 11:54:56 -05:00
Cal Corum
c49f91cc19 test: update test_get_nonexistent_play to expect 404 after HTTPException fix
All checks were successful
Build Docker Image / build (pull_request) Successful in 1m3s
After handle_db_errors no longer catches HTTPException, GET /plays/999999999
correctly returns 404 instead of 500. Update the assertion and docstring
to reflect the fixed behavior.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-02 09:30:39 -05:00
Cal Corum
215085b326 fix: let HTTPException pass through @handle_db_errors unchanged
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m34s
The decorator was catching all exceptions including intentional
HTTPException (401, 404, etc.) and re-wrapping them as 500 "Database
error". This masked auth failures and other deliberate HTTP errors.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-02 08:30:22 -05:00
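The corrected decorator shape can be sketched without FastAPI (the `HTTPException` class below is a dependency-free stand-in for `fastapi.HTTPException`): re-raise intentional HTTP errors, wrap only the unexpected ones.

```python
import functools

class HTTPException(Exception):
    # Stand-in for fastapi.HTTPException so the sketch is self-contained.
    def __init__(self, status_code: int, detail: str = ""):
        self.status_code = status_code
        super().__init__(detail)

def handle_db_errors(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except HTTPException:
            raise  # deliberate 401/404/... pass through unchanged
        except Exception as exc:
            raise HTTPException(500, f"Database error: {exc}") from exc
    return wrapper

@handle_db_errors
def get_play(play_id: int):
    raise HTTPException(404, "Play not found")

try:
    get_play(999999999)
except HTTPException as e:
    print(e.status_code)  # 404 rather than a re-wrapped 500
```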
Cal Corum
b35b68a88f Merge remote-tracking branch 'origin/main' into fix/remove-hardcoded-webhook
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m58s
2026-03-28 02:27:14 -05:00
Cal Corum
eccf4d1441 feat: add migration tracking system (#81)
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m11s
Adds schema_versions table and migrations.py runner to prevent
double-application and missed migrations across dev/prod environments.

- migrations/2026-03-27_add_schema_versions_table.sql: creates tracking table
- migrations.py: applies pending .sql files in sorted order, records each in schema_versions
- .gitignore: untrack migrations.py (was incorrectly ignored as legacy root file)

First run on an existing DB will apply all migrations (safe — all use IF NOT EXISTS).

Closes #81

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 05:34:13 -05:00
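A miniature of the migrations.py pattern, using an in-memory SQLite database and inline SQL as stand-ins for the real .sql files on disk: apply pending migrations in sorted order and record each in `schema_versions` so re-runs are no-ops.

```python
import sqlite3

# Hypothetical migration contents keyed by filename (the real runner
# reads *.sql files from the migrations/ directory instead).
migrations = {
    "2026-03-27_add_schema_versions_table.sql":
        "CREATE TABLE IF NOT EXISTS schema_versions (name TEXT PRIMARY KEY)",
    "2026-03-28_add_widgets.sql":
        "CREATE TABLE IF NOT EXISTS widgets (id INTEGER PRIMARY KEY)",
}

def run_migrations(conn: sqlite3.Connection) -> None:
    # Bootstrap the tracking table, then apply only what is not yet recorded.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_versions (name TEXT PRIMARY KEY)"
    )
    applied = {row[0] for row in conn.execute("SELECT name FROM schema_versions")}
    for name in sorted(migrations):
        if name in applied:
            continue
        conn.executescript(migrations[name])
        conn.execute("INSERT INTO schema_versions (name) VALUES (?)", (name,))
    conn.commit()

conn = sqlite3.connect(":memory:")
run_migrations(conn)
run_migrations(conn)  # second run skips everything already recorded
print(sorted(row[0] for row in conn.execute("SELECT name FROM schema_versions")))
```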
Cal Corum
d8c6ce2a5e fix: replace row-by-row DELETE with bulk DELETE in career recalculation (#77)
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m4s
Closes #77

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 03:02:55 -05:00
Cal Corum
665f275546 fix: update Docker base image from Python 3.11 to 3.12 (#82)
Some checks failed
Build Docker Image / build (pull_request) Failing after 49s
Closes #82

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 02:32:01 -05:00
Cal Corum
75a8fc8505 fix: replace deprecated Pydantic .dict() with .model_dump() (#76)
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m26s
Closes #76

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 02:02:49 -05:00
Cal Corum
dcaf184ad3 fix: add type annotations to untyped query parameters (#73)
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m11s
Closes #73

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 00:02:57 -05:00
Cal Corum
1bcde424c6 Address PR review feedback for DISCORD_WEBHOOK_URL env var
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m32s
- Add DISCORD_WEBHOOK_URL to docker-compose.yml api service environment block
- Add empty placeholder entry in .env for discoverability
- Move DISCORD_WEBHOOK_URL constant to the env-var constants section at top of dependencies.py

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-26 23:23:26 -05:00
Cal Corum
3be4f71e22 fix: move hardcoded Discord webhook URL to environment variable
All checks were successful
Build Docker Image / build (pull_request) Successful in 3m42s
Replace inline webhook URL+token with DISCORD_WEBHOOK_URL env var.
Logs a warning and returns False gracefully if the var is unset.

The exposed webhook token should be rotated in Discord.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 23:15:21 -05:00
Cal Corum
a21bb2a380 fix: add combined_season classmethod to PitchingStat (#65)
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m17s
Closes #65

`PitchingStat.combined_season()` was referenced in the `get_pitstats`
handler but never defined, causing a 500 on `s_type=combined/total/all`.

Added `combined_season` as a `@staticmethod` matching the pattern of
`BattingStat.combined_season` — returns all rows for the given season
with no week filter (both regular and postseason).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 17:33:47 -05:00
41 changed files with 522 additions and 447 deletions

.env

@@ -6,6 +6,9 @@ SBA_DB_USER_PASSWORD=your_production_password
# SBa API
API_TOKEN=Tp3aO3jhYve5NJF1IqOmJTmk
# Integrations
DISCORD_WEBHOOK_URL=
# Universal
TZ=America/Chicago
LOG_LEVEL=INFO

View File

@@ -1,20 +1,21 @@
# Gitea Actions: Docker Build, Push, and Notify
#
# CI/CD pipeline for Major Domo Database API:
# - Builds Docker images on every push/PR
# - Auto-generates CalVer version (YYYY.MM.BUILD) on main branch merges
# - Pushes to Docker Hub and creates git tag on main
# - Triggered by pushing a CalVer tag (e.g., 2026.4.5) or the 'dev' tag
# - CalVer tags push with version + "production" Docker tags
# - "dev" tag pushes with "dev" Docker tag for the dev environment
# - Sends Discord notifications on success/failure
#
# To release: git tag -a 2026.4.5 -m "description" && git push origin 2026.4.5
# To test: git tag -f dev && git push -f origin dev
name: Build Docker Image
on:
push:
branches:
- main
pull_request:
branches:
- main
tags:
- '20*' # matches CalVer tags like 2026.4.5
- 'dev'
jobs:
build:
@@ -24,7 +25,21 @@ jobs:
- name: Checkout code
uses: https://github.com/actions/checkout@v4
with:
fetch-depth: 0 # Full history for tag counting
fetch-depth: 0
- name: Extract version from tag
id: version
run: |
VERSION=${GITHUB_REF#refs/tags/}
SHA_SHORT=$(git rev-parse --short HEAD)
echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "sha_short=$SHA_SHORT" >> $GITHUB_OUTPUT
echo "timestamp=$(date -u +%Y-%m-%dT%H:%M:%SZ)" >> $GITHUB_OUTPUT
if [ "$VERSION" = "dev" ]; then
echo "environment=dev" >> $GITHUB_OUTPUT
else
echo "environment=production" >> $GITHUB_OUTPUT
fi
- name: Set up Docker Buildx
uses: https://github.com/docker/setup-buildx-action@v3
@@ -35,80 +50,47 @@ jobs:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Generate CalVer version
id: calver
uses: cal/gitea-actions/calver@main
# Dev build: push with dev + dev-SHA tags (PR/feature branches)
- name: Build Docker image (dev)
if: github.ref != 'refs/heads/main'
- name: Build and push Docker image
uses: https://github.com/docker/build-push-action@v5
with:
context: .
push: true
tags: |
manticorum67/major-domo-database:dev
manticorum67/major-domo-database:dev-${{ steps.calver.outputs.sha_short }}
manticorum67/major-domo-database:${{ steps.version.outputs.version }}
manticorum67/major-domo-database:${{ steps.version.outputs.environment }}
cache-from: type=registry,ref=manticorum67/major-domo-database:buildcache
cache-to: type=registry,ref=manticorum67/major-domo-database:buildcache,mode=max
# Production build: push with latest + CalVer tags (main only)
- name: Build Docker image (production)
if: github.ref == 'refs/heads/main'
uses: https://github.com/docker/build-push-action@v5
with:
context: .
push: true
tags: |
manticorum67/major-domo-database:latest
manticorum67/major-domo-database:${{ steps.calver.outputs.version }}
manticorum67/major-domo-database:${{ steps.calver.outputs.version_sha }}
cache-from: type=registry,ref=manticorum67/major-domo-database:buildcache
cache-to: type=registry,ref=manticorum67/major-domo-database:buildcache,mode=max
- name: Tag release
if: success() && github.ref == 'refs/heads/main'
uses: cal/gitea-actions/gitea-tag@main
with:
version: ${{ steps.calver.outputs.version }}
token: ${{ github.token }}
- name: Build Summary
run: |
echo "## Docker Build Successful" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "**Version:** \`${{ steps.version.outputs.version }}\`" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "**Image Tags:**" >> $GITHUB_STEP_SUMMARY
echo "- \`manticorum67/major-domo-database:latest\`" >> $GITHUB_STEP_SUMMARY
echo "- \`manticorum67/major-domo-database:${{ steps.calver.outputs.version }}\`" >> $GITHUB_STEP_SUMMARY
echo "- \`manticorum67/major-domo-database:${{ steps.calver.outputs.version_sha }}\`" >> $GITHUB_STEP_SUMMARY
echo "- \`manticorum67/major-domo-database:${{ steps.version.outputs.version }}\`" >> $GITHUB_STEP_SUMMARY
echo "- \`manticorum67/major-domo-database:${{ steps.version.outputs.environment }}\`" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "**Build Details:**" >> $GITHUB_STEP_SUMMARY
echo "- Branch: \`${{ steps.calver.outputs.branch }}\`" >> $GITHUB_STEP_SUMMARY
echo "- Commit: \`${{ github.sha }}\`" >> $GITHUB_STEP_SUMMARY
echo "- Timestamp: \`${{ steps.calver.outputs.timestamp }}\`" >> $GITHUB_STEP_SUMMARY
echo "- Commit: \`${{ steps.version.outputs.sha_short }}\`" >> $GITHUB_STEP_SUMMARY
echo "- Timestamp: \`${{ steps.version.outputs.timestamp }}\`" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
if [ "${{ github.ref }}" == "refs/heads/main" ]; then
echo "Pushed to Docker Hub!" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "Pull with: \`docker pull manticorum67/major-domo-database:latest\`" >> $GITHUB_STEP_SUMMARY
else
echo "_PR build - image not pushed to Docker Hub_" >> $GITHUB_STEP_SUMMARY
fi
echo "Pull with: \`docker pull manticorum67/major-domo-database:${{ steps.version.outputs.version }}\`" >> $GITHUB_STEP_SUMMARY
- name: Discord Notification - Success
if: success() && github.ref == 'refs/heads/main'
if: success()
uses: cal/gitea-actions/discord-notify@main
with:
webhook_url: ${{ secrets.DISCORD_WEBHOOK }}
title: "Major Domo Database"
status: success
version: ${{ steps.calver.outputs.version }}
image_tag: ${{ steps.calver.outputs.version_sha }}
commit_sha: ${{ steps.calver.outputs.sha_short }}
timestamp: ${{ steps.calver.outputs.timestamp }}
version: ${{ steps.version.outputs.version }}
image_tag: ${{ steps.version.outputs.version }}
commit_sha: ${{ steps.version.outputs.sha_short }}
timestamp: ${{ steps.version.outputs.timestamp }}
- name: Discord Notification - Failure
if: failure() && github.ref == 'refs/heads/main'
if: failure()
uses: cal/gitea-actions/discord-notify@main
with:
webhook_url: ${{ secrets.DISCORD_WEBHOOK }}

.gitignore

@@ -55,7 +55,6 @@ Include/
pyvenv.cfg
db_engine.py
main.py
migrations.py
db_engine.py
sba_master.db
db_engine.py

View File

@@ -40,7 +40,7 @@ python migrations.py # Run migrations (SQL files in migrat
- **Bot container**: `dev_sba_postgres` (PostgreSQL) + `dev_sba_db_api` (API) — check with `docker ps`
- **Image**: `manticorum67/major-domo-database:dev` (Docker Hub)
- **CI/CD**: Gitea Actions on PR to `main` — builds Docker image, auto-generates CalVer version (`YYYY.MM.BUILD`) on merge
- **CI/CD**: Gitea Actions — tag-triggered Docker builds. Push a CalVer tag to release: `git tag -a 2026.4.5 -m "description" && git push origin 2026.4.5`
## Important

View File

@@ -1,5 +1,5 @@
# Use specific version for reproducible builds
FROM tiangolo/uvicorn-gunicorn-fastapi:python3.11
# Use official Python slim image
FROM python:3.12-slim
# Set Python optimizations
ENV PYTHONUNBUFFERED=1
@@ -20,11 +20,15 @@ RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir -r requirements.txt
# Copy application code
COPY ./app /app/app
COPY ./app /usr/src/app/app
# Create directories for volumes
RUN mkdir -p /usr/src/app/storage
# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=10s --retries=3 \
CMD curl -f http://localhost:80/api/v3/current || exit 1
CMD curl -f http://localhost:80/api/v3/current || exit 1
# Start uvicorn
ENV WEB_WORKERS=2
CMD ["sh", "-c", "exec uvicorn app.main:app --host 0.0.0.0 --port 80 --workers $WEB_WORKERS"]

View File

@@ -8,39 +8,30 @@ from typing import Literal, List, Optional
from pandas import DataFrame
from peewee import *
from peewee import ModelSelect
from playhouse.pool import PooledPostgresqlDatabase
from playhouse.shortcuts import model_to_dict
# Database configuration - supports both SQLite and PostgreSQL
DATABASE_TYPE = os.environ.get("DATABASE_TYPE", "sqlite")
if DATABASE_TYPE.lower() == "postgresql":
from playhouse.pool import PooledPostgresqlDatabase
_postgres_password = os.environ.get("POSTGRES_PASSWORD")
if _postgres_password is None:
raise RuntimeError(
"POSTGRES_PASSWORD environment variable is not set. "
"This variable is required when DATABASE_TYPE=postgresql."
)
db = PooledPostgresqlDatabase(
os.environ.get("POSTGRES_DB", "sba_master"),
user=os.environ.get("POSTGRES_USER", "sba_admin"),
password=_postgres_password,
host=os.environ.get("POSTGRES_HOST", "sba_postgres"),
port=int(os.environ.get("POSTGRES_PORT", "5432")),
max_connections=20,
stale_timeout=300, # 5 minutes
timeout=0,
autoconnect=True,
autorollback=True, # Automatically rollback failed transactions
)
else:
# Default SQLite configuration
db = SqliteDatabase(
"storage/sba_master.db",
pragmas={"journal_mode": "wal", "cache_size": -1 * 64000, "synchronous": 0},
_postgres_password = os.environ.get("POSTGRES_PASSWORD")
if _postgres_password is None:
raise RuntimeError(
"POSTGRES_PASSWORD environment variable is not set. "
"This variable is required when DATABASE_TYPE=postgresql."
)
db = PooledPostgresqlDatabase(
os.environ.get("POSTGRES_DB", "sba_master"),
user=os.environ.get("POSTGRES_USER", "sba_admin"),
password=_postgres_password,
host=os.environ.get("POSTGRES_HOST", "sba_postgres"),
port=int(os.environ.get("POSTGRES_PORT", "5432")),
max_connections=20,
stale_timeout=300, # 5 minutes
timeout=5,
autoconnect=False,
autorollback=True, # Automatically rollback failed transactions
)
date = f"{datetime.datetime.now().year}-{datetime.datetime.now().month}-{datetime.datetime.now().day}"
logger = logging.getLogger("discord_app")
@@ -523,12 +514,12 @@ class Team(BaseModel):
all_drops = Transaction.select_season(Current.latest().season).where(
(Transaction.oldteam == self)
& (Transaction.week == current.week + 1)
& (Transaction.cancelled == 0)
& (Transaction.cancelled == False)
)
all_adds = Transaction.select_season(Current.latest().season).where(
(Transaction.newteam == self)
& (Transaction.week == current.week + 1)
& (Transaction.cancelled == 0)
& (Transaction.cancelled == False)
)
for move in all_drops:
@@ -606,12 +597,12 @@ class Team(BaseModel):
all_drops = Transaction.select_season(Current.latest().season).where(
(Transaction.oldteam == sil_team)
& (Transaction.week == current.week + 1)
& (Transaction.cancelled == 0)
& (Transaction.cancelled == False)
)
all_adds = Transaction.select_season(Current.latest().season).where(
(Transaction.newteam == sil_team)
& (Transaction.week == current.week + 1)
& (Transaction.cancelled == 0)
& (Transaction.cancelled == False)
)
for move in all_drops:
@@ -689,12 +680,12 @@ class Team(BaseModel):
all_drops = Transaction.select_season(Current.latest().season).where(
(Transaction.oldteam == lil_team)
& (Transaction.week == current.week + 1)
& (Transaction.cancelled == 0)
& (Transaction.cancelled == False)
)
all_adds = Transaction.select_season(Current.latest().season).where(
(Transaction.newteam == lil_team)
& (Transaction.week == current.week + 1)
& (Transaction.cancelled == 0)
& (Transaction.cancelled == False)
)
for move in all_drops:
@@ -1365,6 +1356,10 @@ class PitchingStat(BaseModel):
def select_season(season):
return PitchingStat.select().where(PitchingStat.season == season)
@staticmethod
def combined_season(season):
return PitchingStat.select().where(PitchingStat.season == season)
@staticmethod
def regular_season(season):
if season == 1:
@@ -1529,7 +1524,18 @@ class Standings(BaseModel):
with db.atomic():
Standings.bulk_create(create_teams)
# Iterate through each individual result
# Pre-fetch all data needed for in-memory processing (avoids N+1 queries)
standings_by_team_id = {
s.team_id: s
for s in Standings.select().where(Standings.team << s_teams)
}
teams_by_id = {t.id: t for t in Team.select().where(Team.season == season)}
divisions_by_id = {
d.id: d
for d in Division.select().where(Division.season == season)
}
# Iterate through each individual result, tallying wins/losses in memory
# for game in Result.select_season(season).where(Result.week <= 22):
for game in (
StratGame.select()
@@ -1540,8 +1546,121 @@
)
.order_by(StratGame.week, StratGame.game_num)
):
# tally win and loss for each standings object
game.update_standings()
away_stan = standings_by_team_id.get(game.away_team_id)
home_stan = standings_by_team_id.get(game.home_team_id)
away_team_obj = teams_by_id.get(game.away_team_id)
home_team_obj = teams_by_id.get(game.home_team_id)
if None in (away_stan, home_stan, away_team_obj, home_team_obj):
continue
away_div = divisions_by_id.get(away_team_obj.division_id)
home_div = divisions_by_id.get(home_team_obj.division_id)
if away_div is None or home_div is None:
continue
# Home Team Won
if game.home_score > game.away_score:
home_stan.wins += 1
home_stan.home_wins += 1
away_stan.losses += 1
away_stan.away_losses += 1
if home_stan.streak_wl == 'w':
home_stan.streak_num += 1
else:
home_stan.streak_wl = 'w'
home_stan.streak_num = 1
if away_stan.streak_wl == 'l':
away_stan.streak_num += 1
else:
away_stan.streak_wl = 'l'
away_stan.streak_num = 1
if game.home_score == game.away_score + 1:
home_stan.one_run_wins += 1
away_stan.one_run_losses += 1
if away_div.division_abbrev == 'TC':
home_stan.div1_wins += 1
elif away_div.division_abbrev == 'ETSOS':
home_stan.div2_wins += 1
elif away_div.division_abbrev == 'APL':
home_stan.div3_wins += 1
elif away_div.division_abbrev == 'BBC':
home_stan.div4_wins += 1
if home_div.division_abbrev == 'TC':
away_stan.div1_losses += 1
elif home_div.division_abbrev == 'ETSOS':
away_stan.div2_losses += 1
elif home_div.division_abbrev == 'APL':
away_stan.div3_losses += 1
elif home_div.division_abbrev == 'BBC':
away_stan.div4_losses += 1
home_stan.run_diff += game.home_score - game.away_score
away_stan.run_diff -= game.home_score - game.away_score
# Away Team Won
else:
home_stan.losses += 1
home_stan.home_losses += 1
away_stan.wins += 1
away_stan.away_wins += 1
if home_stan.streak_wl == 'l':
home_stan.streak_num += 1
else:
home_stan.streak_wl = 'l'
home_stan.streak_num = 1
if away_stan.streak_wl == 'w':
away_stan.streak_num += 1
else:
away_stan.streak_wl = 'w'
away_stan.streak_num = 1
if game.away_score == game.home_score + 1:
home_stan.one_run_losses += 1
away_stan.one_run_wins += 1
if away_div.division_abbrev == 'TC':
home_stan.div1_losses += 1
elif away_div.division_abbrev == 'ETSOS':
home_stan.div2_losses += 1
elif away_div.division_abbrev == 'APL':
home_stan.div3_losses += 1
elif away_div.division_abbrev == 'BBC':
home_stan.div4_losses += 1
if home_div.division_abbrev == 'TC':
away_stan.div1_wins += 1
elif home_div.division_abbrev == 'ETSOS':
away_stan.div2_wins += 1
elif home_div.division_abbrev == 'APL':
away_stan.div3_wins += 1
elif home_div.division_abbrev == 'BBC':
away_stan.div4_wins += 1
home_stan.run_diff -= game.away_score - game.home_score
away_stan.run_diff += game.away_score - game.home_score
# Bulk save all modified standings
with db.atomic():
Standings.bulk_update(
list(standings_by_team_id.values()),
fields=[
Standings.wins, Standings.losses,
Standings.home_wins, Standings.home_losses,
Standings.away_wins, Standings.away_losses,
Standings.one_run_wins, Standings.one_run_losses,
Standings.streak_wl, Standings.streak_num,
Standings.run_diff,
Standings.div1_wins, Standings.div1_losses,
Standings.div2_wins, Standings.div2_losses,
Standings.div3_wins, Standings.div3_losses,
Standings.div4_wins, Standings.div4_losses,
]
)
# Set pythag record and iterate through last 8 games for last8 record
for team in all_teams:
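The single `bulk_update` above works because every game mutates shared `Standings` objects held in `standings_by_team_id`, so each team's row is written once at the end no matter how many games touched it. A toy stand-in for that accumulate-then-flush pattern (hypothetical class, not the real peewee model):

```python
class Standings:
    """Minimal stand-in for the peewee Standings row."""
    def __init__(self, team_id: int):
        self.team_id = team_id
        self.wins = 0
        self.losses = 0

def tally(results, standings_by_team_id):
    """results: iterable of (winner_id, loser_id); mutates the shared rows."""
    for winner_id, loser_id in results:
        standings_by_team_id[winner_id].wins += 1
        standings_by_team_id[loser_id].losses += 1
    # single flush point -- the real code calls Standings.bulk_update(...) here
    return standings_by_team_id

standings = {1: Standings(1), 2: Standings(2)}
tally([(1, 2), (1, 2), (2, 1)], standings)
```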
@ -1590,9 +1709,7 @@ class BattingCareer(BaseModel):
@staticmethod
def recalculate():
# Wipe existing data
delete_lines = BattingCareer.select()
for line in delete_lines:
line.delete_instance()
BattingCareer.delete().execute()
# For each seasonstat, find career or create new and increment
for this_season in BattingSeason.select().where(
@ -1655,9 +1772,7 @@ class PitchingCareer(BaseModel):
@staticmethod
def recalculate():
# Wipe existing data
delete_lines = PitchingCareer.select()
for line in delete_lines:
line.delete_instance()
PitchingCareer.delete().execute()
# For each seasonstat, find career or create new and increment
for this_season in PitchingSeason.select().where(
@ -1709,9 +1824,7 @@ class FieldingCareer(BaseModel):
@staticmethod
def recalculate():
# Wipe existing data
delete_lines = FieldingCareer.select()
for line in delete_lines:
line.delete_instance()
FieldingCareer.delete().execute()
# For each seasonstat, find career or create new and increment
for this_season in FieldingSeason.select().where(
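The three `recalculate()` changes above replace a per-row `delete_instance()` loop with one set-based `Model.delete().execute()`. Counting round-trips makes the difference concrete (toy stand-in class; the real code goes through peewee):

```python
class FakeTable:
    """Counts how many queries each delete strategy issues."""
    def __init__(self, rows):
        self.rows = list(rows)
        self.queries = 0

    def delete_instance(self, row):
        self.queries += 1          # one DELETE per row
        self.rows.remove(row)

    def delete_all(self):
        self.queries += 1          # one DELETE total
        self.rows.clear()

loop_table = FakeTable(range(100))
for row in list(loop_table.rows):
    loop_table.delete_instance(row)

bulk_table = FakeTable(range(100))
bulk_table.delete_all()
```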
@ -2369,6 +2482,12 @@ class StratGame(BaseModel):
home_stan.save()
away_stan.save()
class Meta:
indexes = (
(("season",), False),
(("season", "week", "game_num"), False),
)
class StratPlay(BaseModel):
game = ForeignKeyField(StratGame)
@ -2443,6 +2562,14 @@ class StratPlay(BaseModel):
re24_primary = FloatField(null=True)
re24_running = FloatField(null=True)
class Meta:
indexes = (
(("game",), False),
(("batter",), False),
(("pitcher",), False),
(("runner",), False),
)
class Decision(BaseModel):
game = ForeignKeyField(StratGame)
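The `Meta.indexes` tuples added above compile down to plain `CREATE INDEX` statements; the trailing `False` marks each index as non-unique, and the three-column tuple produces a composite index. The equivalent raw SQL, shown with stdlib `sqlite3` on a simplified table shape (index names here are illustrative, not what peewee would generate):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE stratgame (id INTEGER PRIMARY KEY, season INT, week INT, game_num INT)"
)
# (("season",), False)  ->  a single-column, non-unique index
conn.execute("CREATE INDEX stratgame_season ON stratgame (season)")
# (("season", "week", "game_num"), False)  ->  a composite, non-unique index
conn.execute(
    "CREATE INDEX stratgame_season_week_game_num ON stratgame (season, week, game_num)"
)
names = [row[1] for row in conn.execute("PRAGMA index_list('stratgame')")]
```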
@ -2465,8 +2592,7 @@ class Decision(BaseModel):
class CustomCommandCreator(BaseModel):
"""Model for custom command creators."""
discord_id = CharField(max_length=20, unique=True) # Discord snowflake ID as string
discord_id = BigIntegerField(unique=True)
username = CharField(max_length=32)
display_name = CharField(max_length=32, null=True)
created_at = DateTimeField()
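The `CharField(max_length=20)` column stored Discord snowflakes as strings, forcing `str()` casts at every lookup; `BigIntegerField` holds them natively because snowflakes fit in 64 bits. Their top 42 bits are a millisecond offset from the Discord epoch (2015-01-01). A sketch of that layout, using the example ID from Discord's documentation rather than any ID from this project:

```python
DISCORD_EPOCH_MS = 1_420_070_400_000  # 2015-01-01T00:00:00Z in unix ms

def snowflake_timestamp_ms(snowflake: int) -> int:
    """Extract the creation timestamp (unix ms) packed into a snowflake."""
    return (snowflake >> 22) + DISCORD_EPOCH_MS

example_id = 175928847299117063  # sample snowflake from Discord's docs
```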


@ -22,6 +22,9 @@ logger = logging.getLogger("discord_app")
# level=log_level
# )
# Discord integration
DISCORD_WEBHOOK_URL = os.environ.get("DISCORD_WEBHOOK_URL")
# Redis configuration
REDIS_HOST = os.environ.get("REDIS_HOST", "localhost")
REDIS_PORT = int(os.environ.get("REDIS_PORT", "6379"))
@ -379,14 +382,14 @@ def update_season_pitching_stats(player_ids, season, db_connection):
-- RBI allowed (excluding HR) per runner opportunity
CASE
WHEN (SUM(CASE WHEN sp.on_first IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_second IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_third IS NOT NULL THEN 1 ELSE 0 END)) > 0
WHEN (SUM(CASE WHEN sp.on_first_id IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_second_id IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_third_id IS NOT NULL THEN 1 ELSE 0 END)) > 0
THEN ROUND(
(SUM(sp.rbi) - SUM(sp.homerun))::DECIMAL /
(SUM(CASE WHEN sp.on_first IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_second IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_third IS NOT NULL THEN 1 ELSE 0 END)),
(SUM(CASE WHEN sp.on_first_id IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_second_id IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_third_id IS NOT NULL THEN 1 ELSE 0 END)),
3
)
ELSE 0.000
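The fix above matters because peewee stores a `ForeignKeyField` in a database column suffixed with `_id` (unless overridden), so raw SQL must reference `sp.on_first_id`; `sp.on_first` is not a real column. A miniature reproduction of the corrected runner-opportunity aggregate with stdlib `sqlite3` (toy table mirroring the column names in the query):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE strat_play (on_first_id INT, on_second_id INT, on_third_id INT, rbi INT, homerun INT)"
)
rows = [
    (101, None, None, 1, 0),   # runner on first, one RBI driven in
    (None, 202, None, 0, 0),   # runner on second, no RBI
    (None, None, None, 1, 1),  # solo HR: an RBI with no runners on
]
conn.executemany("INSERT INTO strat_play VALUES (?, ?, ?, ?, ?)", rows)
opportunities, net_rbi = conn.execute("""
    SELECT SUM(CASE WHEN on_first_id  IS NOT NULL THEN 1 ELSE 0 END)
         + SUM(CASE WHEN on_second_id IS NOT NULL THEN 1 ELSE 0 END)
         + SUM(CASE WHEN on_third_id  IS NOT NULL THEN 1 ELSE 0 END),
           SUM(rbi) - SUM(homerun)
    FROM strat_play
""").fetchone()
```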
@ -516,7 +519,12 @@ def send_webhook_message(message: str) -> bool:
Returns:
bool: True if successful, False otherwise
"""
webhook_url = "https://discord.com/api/webhooks/1408811717424840876/7RXG_D5IqovA3Jwa9YOobUjVcVMuLc6cQyezABcWuXaHo5Fvz1en10M7J43o3OJ3bzGW"
webhook_url = DISCORD_WEBHOOK_URL
if not webhook_url:
logger.warning(
"DISCORD_WEBHOOK_URL env var is not set — skipping webhook message"
)
return False
try:
payload = {"content": message}
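The guard above turns a hard-coded webhook URL into env-driven config that degrades gracefully when unset. A minimal sketch of the same pattern (the actual HTTP POST is stubbed out with a comment; the real code sends the payload over the network):

```python
import os

def send_webhook_message(message: str) -> bool:
    """Return False and skip sending when DISCORD_WEBHOOK_URL is unset."""
    webhook_url = os.environ.get("DISCORD_WEBHOOK_URL")
    if not webhook_url:
        # unconfigured environments skip the webhook instead of crashing
        return False
    payload = {"content": message}
    # real code: requests.post(webhook_url, json=payload) and check the response
    return True
```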
@ -807,6 +815,10 @@ def handle_db_errors(func):
return result
except HTTPException:
# Let intentional HTTP errors (401, 404, etc.) pass through unchanged
raise
except Exception as e:
elapsed_time = time.time() - start_time
error_trace = traceback.format_exc()
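The `except HTTPException: raise` added above lets intentional 401/404 responses propagate while unexpected failures are still caught and wrapped. A self-contained sketch of that ordering (stand-in `HTTPException` class; the real one comes from fastapi, and the real wrapper also logs timing and tracebacks):

```python
class HTTPException(Exception):
    """Stand-in for fastapi.HTTPException."""
    def __init__(self, status_code: int, detail: str = ""):
        self.status_code = status_code
        self.detail = detail

def handle_db_errors(func):
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except HTTPException:
            raise                      # intentional HTTP errors pass through unchanged
        except Exception as e:
            raise HTTPException(500, f"db error: {e}")
    return wrapper

@handle_db_errors
def not_found():
    raise HTTPException(404, "missing")

@handle_db_errors
def broken():
    raise ValueError("boom")
```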


@ -8,6 +8,8 @@ from fastapi import Depends, FastAPI, Request
from fastapi.openapi.docs import get_swagger_ui_html
from fastapi.openapi.utils import get_openapi
from .db_engine import db
# from fastapi.openapi.docs import get_swagger_ui_html
# from fastapi.openapi.utils import get_openapi
@ -68,9 +70,33 @@ app = FastAPI(
)
@app.middleware("http")
async def db_connection_middleware(request: Request, call_next):
db.connect(reuse_if_open=True)
try:
response = await call_next(request)
finally:
if not db.is_closed():
db.close()
return response
logger.info(f"Starting up now...")
@app.middleware("http")
async def db_connection_middleware(request: Request, call_next):
from .db_engine import db
db.connect(reuse_if_open=True)
try:
response = await call_next(request)
return response
finally:
if not db.is_closed():
db.close()
@app.middleware("http")
async def strip_empty_query_params(request: Request, call_next):
qs = request.scope.get("query_string", b"")
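The `strip_empty_query_params` body is truncated here, but the filtering it implies can be sketched with stdlib URL parsing. This assumes the middleware drops parameters whose value is empty before rewriting `request.scope["query_string"]`:

```python
from urllib.parse import parse_qsl, urlencode

def strip_empty(qs: bytes) -> bytes:
    """Drop query parameters with empty values, e.g. b'a=1&b=' -> b'a=1'."""
    pairs = parse_qsl(qs.decode(), keep_blank_values=True)
    kept = [(key, value) for key, value in pairs if value != ""]
    return urlencode(kept).encode()
```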


@ -78,7 +78,6 @@ async def get_awards(
"count": total_count,
"awards": [model_to_dict(x, recurse=not short_output) for x in all_awards],
}
db.close()
return return_awards
@ -87,10 +86,8 @@ async def get_awards(
async def get_one_award(award_id: int, short_output: Optional[bool] = False):
this_award = Award.get_or_none(Award.id == award_id)
if this_award is None:
db.close()
raise HTTPException(status_code=404, detail=f"Award ID {award_id} not found")
db.close()
return model_to_dict(this_award, recurse=not short_output)
@ -109,12 +106,11 @@ async def patch_award(
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_player - Bad Token: {token}")
logger.warning("patch_player - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_award = Award.get_or_none(Award.id == award_id)
if this_award is None:
db.close()
raise HTTPException(status_code=404, detail=f"Award ID {award_id} not found")
if name is not None:
@ -136,10 +132,8 @@ async def patch_award(
if this_award.save() == 1:
r_award = model_to_dict(this_award)
db.close()
return r_award
else:
db.close()
raise HTTPException(status_code=500, detail=f"Unable to patch award {award_id}")
@ -147,7 +141,7 @@ async def patch_award(
@handle_db_errors
async def post_award(award_list: AwardList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"patch_player - Bad Token: {token}")
logger.warning("patch_player - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
new_awards = []
@ -178,12 +172,11 @@ async def post_award(award_list: AwardList, token: str = Depends(oauth2_scheme))
status_code=404, detail=f"Team ID {x.team_id} not found"
)
new_awards.append(x.dict())
new_awards.append(x.model_dump())
with db.atomic():
for batch in chunked(new_awards, 15):
Award.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_awards)} awards"
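The `.dict()` to `.model_dump()` swaps throughout this diff track pydantic v2's renamed serialization API; the surrounding insert path then batches rows through peewee's `chunked()` helper, which is just fixed-size slicing. A stdlib sketch of that batching:

```python
def chunked(iterable, n):
    """Yield successive lists of up to n items (mirrors peewee's chunked)."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == n:
            yield batch
            batch = []
    if batch:
        yield batch  # final short batch

batches = list(chunked(range(37), 15))
```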
@ -192,16 +185,14 @@ async def post_award(award_list: AwardList, token: str = Depends(oauth2_scheme))
@handle_db_errors
async def delete_award(award_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"patch_player - Bad Token: {token}")
logger.warning("patch_player - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_award = Award.get_or_none(Award.id == award_id)
if this_award is None:
db.close()
raise HTTPException(status_code=404, detail=f"Award ID {award_id} not found")
count = this_award.delete_instance()
db.close()
if count == 1:
return f"Award {award_id} has been deleted"


@ -93,17 +93,14 @@ async def get_batstats(
if "post" in s_type.lower():
all_stats = BattingStat.post_season(season)
if all_stats.count() == 0:
db.close()
return {"count": 0, "stats": []}
elif s_type.lower() in ["combined", "total", "all"]:
all_stats = BattingStat.combined_season(season)
if all_stats.count() == 0:
db.close()
return {"count": 0, "stats": []}
else:
all_stats = BattingStat.regular_season(season)
if all_stats.count() == 0:
db.close()
return {"count": 0, "stats": []}
if position is not None:
@ -129,7 +126,6 @@ async def get_batstats(
if week_end is not None:
end = min(week_end, end)
if start > end:
db.close()
raise HTTPException(
status_code=404,
detail=f"Start week {start} is after end week {end} - cannot pull stats",
@ -147,7 +143,6 @@ async def get_batstats(
# 'stats': [{'id': x.id} for x in all_stats]
}
db.close()
return return_stats
@ -350,7 +345,6 @@ async def get_totalstats(
"bplo": x.sum_bplo,
}
)
db.close()
return return_stats
@ -366,15 +360,16 @@ async def patch_batstats(
stat_id: int, new_stats: BatStatModel, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f"patch_batstats - Bad Token: {token}")
logger.warning("patch_batstats - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
if BattingStat.get_or_none(BattingStat.id == stat_id) is None:
raise HTTPException(status_code=404, detail=f"Stat ID {stat_id} not found")
BattingStat.update(**new_stats.dict()).where(BattingStat.id == stat_id).execute()
BattingStat.update(**new_stats.model_dump()).where(
BattingStat.id == stat_id
).execute()
r_stat = model_to_dict(BattingStat.get_by_id(stat_id))
db.close()
return r_stat
@ -382,7 +377,7 @@ async def patch_batstats(
@handle_db_errors
async def post_batstats(s_list: BatStatList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_batstats - Bad Token: {token}")
logger.warning("post_batstats - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
all_stats = []
@ -410,7 +405,7 @@ async def post_batstats(s_list: BatStatList, token: str = Depends(oauth2_scheme)
status_code=404, detail=f"Player ID {x.player_id} not found"
)
all_stats.append(BattingStat(**x.dict()))
all_stats.append(BattingStat(**x.model_dump()))
with db.atomic():
for batch in chunked(all_stats, 15):
@ -418,5 +413,4 @@ async def post_batstats(s_list: BatStatList, token: str = Depends(oauth2_scheme)
# Update career stats
db.close()
return f"Added {len(all_stats)} batting lines"
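The scattered `db.close()` deletions across these routers are safe because the middleware added in main.py now closes the connection after every request, exception or not. A context-manager sketch of that lifecycle (`FakeDb` is a hypothetical stand-in for peewee's `db` object):

```python
from contextlib import contextmanager

class FakeDb:
    """Stand-in mimicking the small slice of peewee's Database API used here."""
    def __init__(self):
        self.open = False
    def connect(self, reuse_if_open=False):
        self.open = True
    def is_closed(self):
        return not self.open
    def close(self):
        self.open = False

db = FakeDb()

@contextmanager
def request_connection():
    db.connect(reuse_if_open=True)
    try:
        yield db
    finally:
        # runs on success and on exception, so handlers never need db.close()
        if not db.is_closed():
            db.close()
```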


@ -41,7 +41,6 @@ async def get_current(season: Optional[int] = None):
if current is not None:
r_curr = model_to_dict(current)
db.close()
return r_curr
else:
return None
@ -65,7 +64,7 @@ async def patch_current(
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_current - Bad Token: {token}")
logger.warning("patch_current - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -100,10 +99,8 @@ async def patch_current(
if current.save():
r_curr = model_to_dict(current)
db.close()
return r_curr
else:
db.close()
raise HTTPException(
status_code=500, detail=f"Unable to patch current {current_id}"
)
@ -113,17 +110,15 @@ async def patch_current(
@handle_db_errors
async def post_current(new_current: CurrentModel, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"patch_current - Bad Token: {token}")
logger.warning("patch_current - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_current = Current(**new_current.dict())
this_current = Current(**new_current.model_dump())
if this_current.save():
r_curr = model_to_dict(this_current)
db.close()
return r_curr
else:
db.close()
raise HTTPException(
status_code=500,
detail=f"Unable to post season {new_current.season} current",
@ -134,7 +129,7 @@ async def post_current(new_current: CurrentModel, token: str = Depends(oauth2_sc
@handle_db_errors
async def delete_current(current_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"patch_current - Bad Token: {token}")
logger.warning("patch_current - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
if Current.delete_by_id(current_id) == 1:


@ -175,7 +175,7 @@ def delete_custom_command(command_id: int):
def get_creator_by_discord_id(discord_id: int):
"""Get a creator by Discord ID"""
creator = CustomCommandCreator.get_or_none(
CustomCommandCreator.discord_id == str(discord_id)
CustomCommandCreator.discord_id == discord_id
)
if creator:
return model_to_dict(creator)
@ -363,8 +363,6 @@ async def get_custom_commands(
except Exception as e:
logger.error(f"Error getting custom commands: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
# Move this route to after the specific string routes
@ -377,7 +375,7 @@ async def create_custom_command_endpoint(
):
"""Create a new custom command"""
if not valid_token(token):
logger.warning(f"create_custom_command - Bad Token: {token}")
logger.warning("create_custom_command - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -429,8 +427,6 @@ async def create_custom_command_endpoint(
except Exception as e:
logger.error(f"Error creating custom command: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.put("/{command_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@ -440,7 +436,7 @@ async def update_custom_command_endpoint(
):
"""Update an existing custom command"""
if not valid_token(token):
logger.warning(f"update_custom_command - Bad Token: {token}")
logger.warning("update_custom_command - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -490,8 +486,6 @@ async def update_custom_command_endpoint(
except Exception as e:
logger.error(f"Error updating custom command {command_id}: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.patch("/{command_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@ -508,7 +502,7 @@ async def patch_custom_command(
):
"""Partially update a custom command"""
if not valid_token(token):
logger.warning(f"patch_custom_command - Bad Token: {token}")
logger.warning("patch_custom_command - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -575,8 +569,6 @@ async def patch_custom_command(
except Exception as e:
logger.error(f"Error patching custom command {command_id}: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.delete("/{command_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@ -586,7 +578,7 @@ async def delete_custom_command_endpoint(
):
"""Delete a custom command"""
if not valid_token(token):
logger.warning(f"delete_custom_command - Bad Token: {token}")
logger.warning("delete_custom_command - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -612,8 +604,6 @@ async def delete_custom_command_endpoint(
except Exception as e:
logger.error(f"Error deleting custom command {command_id}: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
# Creator endpoints
@ -683,8 +673,6 @@ async def get_creators(
except Exception as e:
logger.error(f"Error getting creators: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.post("/creators", include_in_schema=PRIVATE_IN_SCHEMA)
@ -694,7 +682,7 @@ async def create_creator_endpoint(
):
"""Create a new command creator"""
if not valid_token(token):
logger.warning(f"create_creator - Bad Token: {token}")
logger.warning("create_creator - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -728,8 +716,6 @@ async def create_creator_endpoint(
except Exception as e:
logger.error(f"Error creating creator: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.get("/stats")
@ -854,8 +840,6 @@ async def get_custom_command_stats():
except Exception as e:
logger.error(f"Error getting custom command stats: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
# Special endpoints for Discord bot integration
@ -921,8 +905,6 @@ async def get_custom_command_by_name_endpoint(command_name: str):
except Exception as e:
logger.error(f"Error getting custom command by name '{command_name}': {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.patch("/by_name/{command_name}/execute", include_in_schema=PRIVATE_IN_SCHEMA)
@ -932,7 +914,7 @@ async def execute_custom_command(
):
"""Execute a custom command and update usage statistics"""
if not valid_token(token):
logger.warning(f"execute_custom_command - Bad Token: {token}")
logger.warning("execute_custom_command - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -990,8 +972,6 @@ async def execute_custom_command(
except Exception as e:
logger.error(f"Error executing custom command '{command_name}': {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.get("/autocomplete")
@ -1027,8 +1007,6 @@ async def get_command_names_for_autocomplete(
except Exception as e:
logger.error(f"Error getting command names for autocomplete: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.get("/{command_id}")
@ -1077,5 +1055,3 @@ async def get_custom_command(command_id: int):
except Exception as e:
logger.error(f"Error getting custom command {command_id}: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()


@ -143,7 +143,6 @@ async def get_decisions(
"count": all_dec.count(),
"decisions": [model_to_dict(x, recurse=not short_output) for x in all_dec],
}
db.close()
return return_dec
@ -163,12 +162,11 @@ async def patch_decision(
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_decision - Bad Token: {token}")
logger.warning("patch_decision - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_dec = Decision.get_or_none(Decision.id == decision_id)
if this_dec is None:
db.close()
raise HTTPException(
status_code=404, detail=f"Decision ID {decision_id} not found"
)
@ -194,10 +192,8 @@ async def patch_decision(
if this_dec.save() == 1:
d_result = model_to_dict(this_dec)
db.close()
return d_result
else:
db.close()
raise HTTPException(
status_code=500, detail=f"Unable to patch decision {decision_id}"
)
@ -207,7 +203,7 @@ async def patch_decision(
@handle_db_errors
async def post_decisions(dec_list: DecisionList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_decisions - Bad Token: {token}")
logger.warning("post_decisions - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
new_dec = []
@ -221,12 +217,11 @@ async def post_decisions(dec_list: DecisionList, token: str = Depends(oauth2_sch
status_code=404, detail=f"Player ID {x.pitcher_id} not found"
)
new_dec.append(x.dict())
new_dec.append(x.model_dump())
with db.atomic():
for batch in chunked(new_dec, 10):
Decision.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_dec)} decisions"
@ -235,18 +230,16 @@ async def post_decisions(dec_list: DecisionList, token: str = Depends(oauth2_sch
@handle_db_errors
async def delete_decision(decision_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_decision - Bad Token: {token}")
logger.warning("delete_decision - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_dec = Decision.get_or_none(Decision.id == decision_id)
if this_dec is None:
db.close()
raise HTTPException(
status_code=404, detail=f"Decision ID {decision_id} not found"
)
count = this_dec.delete_instance()
db.close()
if count == 1:
return f"Decision {decision_id} has been deleted"
@ -260,16 +253,14 @@ async def delete_decision(decision_id: int, token: str = Depends(oauth2_scheme))
@handle_db_errors
async def delete_decisions_game(game_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_decisions_game - Bad Token: {token}")
logger.warning("delete_decisions_game - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_game = StratGame.get_or_none(StratGame.id == game_id)
if not this_game:
db.close()
raise HTTPException(status_code=404, detail=f"Game ID {game_id} not found")
count = Decision.delete().where(Decision.game == this_game).execute()
db.close()
if count > 0:
return f"Deleted {count} decisions matching Game ID {game_id}"


@ -55,7 +55,6 @@ async def get_divisions(
"count": total_count,
"divisions": [model_to_dict(x) for x in all_divisions],
}
db.close()
return return_div
@ -64,13 +63,11 @@ async def get_divisions(
async def get_one_division(division_id: int):
this_div = Division.get_or_none(Division.id == division_id)
if this_div is None:
db.close()
raise HTTPException(
status_code=404, detail=f"Division ID {division_id} not found"
)
r_div = model_to_dict(this_div)
db.close()
return r_div
@ -85,12 +82,11 @@ async def patch_division(
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_division - Bad Token: {token}")
logger.warning("patch_division - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_div = Division.get_or_none(Division.id == division_id)
if this_div is None:
db.close()
raise HTTPException(
status_code=404, detail=f"Division ID {division_id} not found"
)
@ -106,10 +102,8 @@ async def patch_division(
if this_div.save() == 1:
r_division = model_to_dict(this_div)
db.close()
return r_division
else:
db.close()
raise HTTPException(
status_code=500, detail=f"Unable to patch division {division_id}"
)
@ -121,17 +115,15 @@ async def post_division(
new_division: DivisionModel, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f"post_division - Bad Token: {token}")
logger.warning("post_division - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_division = Division(**new_division.dict())
this_division = Division(**new_division.model_dump())
if this_division.save() == 1:
r_division = model_to_dict(this_division)
db.close()
return r_division
else:
db.close()
raise HTTPException(status_code=500, detail=f"Unable to post division")
@ -139,18 +131,16 @@ async def post_division(
@handle_db_errors
async def delete_division(division_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_division - Bad Token: {token}")
logger.warning("delete_division - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_div = Division.get_or_none(Division.id == division_id)
if this_div is None:
db.close()
raise HTTPException(
status_code=404, detail=f"Division ID {division_id} not found"
)
count = this_div.delete_instance()
db.close()
if count == 1:
return f"Division {division_id} has been deleted"


@ -32,7 +32,6 @@ async def get_draftdata():
if draft_data is not None:
r_data = model_to_dict(draft_data)
db.close()
return r_data
raise HTTPException(status_code=404, detail=f'No draft data found')
@ -45,12 +44,11 @@ async def patch_draftdata(
pick_deadline: Optional[datetime.datetime] = None, result_channel: Optional[int] = None,
ping_channel: Optional[int] = None, pick_minutes: Optional[int] = None, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'patch_draftdata - Bad Token: {token}')
logger.warning('patch_draftdata - Bad Token')
raise HTTPException(status_code=401, detail='Unauthorized')
draft_data = DraftData.get_or_none(DraftData.id == data_id)
if draft_data is None:
db.close()
raise HTTPException(status_code=404, detail=f'No draft data found')
if currentpick is not None:
@ -68,7 +66,6 @@ async def patch_draftdata(
saved = draft_data.save()
r_data = model_to_dict(draft_data)
db.close()
if saved == 1:
return r_data


@ -40,7 +40,7 @@ async def get_draftlist(
offset: int = Query(default=0, ge=0),
):
if not valid_token(token):
logger.warning(f"get_draftlist - Bad Token: {token}")
logger.warning("get_draftlist - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
all_list = DraftList.select()
@ -55,7 +55,6 @@ async def get_draftlist(
r_list = {"count": total_count, "picks": [model_to_dict(x) for x in all_list]}
db.close()
return r_list
@ -63,7 +62,7 @@ async def get_draftlist(
@handle_db_errors
async def get_team_draftlist(team_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_draftlist - Bad Token: {token}")
logger.warning("post_draftlist - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_team = Team.get_or_none(Team.id == team_id)
@ -76,7 +75,6 @@ async def get_team_draftlist(team_id: int, token: str = Depends(oauth2_scheme)):
"picks": [model_to_dict(x) for x in this_list],
}
db.close()
return r_list
@ -86,7 +84,7 @@ async def post_draftlist(
draft_list: DraftListList, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f"post_draftlist - Bad Token: {token}")
logger.warning("post_draftlist - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
new_list = []
@ -100,13 +98,12 @@ async def post_draftlist(
DraftList.delete().where(DraftList.team == this_team).execute()
for x in draft_list.draft_list:
new_list.append(x.dict())
new_list.append(x.model_dump())
with db.atomic():
for batch in chunked(new_list, 15):
DraftList.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_list)} list values"
@ -114,9 +111,8 @@ async def post_draftlist(
@handle_db_errors
async def delete_draftlist(team_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_draftlist - Bad Token: {token}")
logger.warning("delete_draftlist - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
count = DraftList.delete().where(DraftList.team_id == team_id).execute()
db.close()
return f"Deleted {count} list values"


@ -124,7 +124,6 @@ async def get_picks(
for line in all_picks:
return_picks["picks"].append(model_to_dict(line, recurse=not short_output))
db.close()
return return_picks
@ -136,7 +135,6 @@ async def get_one_pick(pick_id: int, short_output: Optional[bool] = False):
r_pick = model_to_dict(this_pick, recurse=not short_output)
else:
raise HTTPException(status_code=404, detail=f"Pick ID {pick_id} not found")
db.close()
return r_pick
@ -146,15 +144,14 @@ async def patch_pick(
pick_id: int, new_pick: DraftPickModel, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f"patch_pick - Bad Token: {token}")
logger.warning("patch_pick - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
if DraftPick.get_or_none(DraftPick.id == pick_id) is None:
raise HTTPException(status_code=404, detail=f"Pick ID {pick_id} not found")
DraftPick.update(**new_pick.dict()).where(DraftPick.id == pick_id).execute()
DraftPick.update(**new_pick.model_dump()).where(DraftPick.id == pick_id).execute()
r_pick = model_to_dict(DraftPick.get_by_id(pick_id))
db.close()
return r_pick
@ -162,7 +159,7 @@ async def patch_pick(
@handle_db_errors
async def post_picks(p_list: DraftPickList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_picks - Bad Token: {token}")
logger.warning("post_picks - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
new_picks = []
@ -171,18 +168,16 @@ async def post_picks(p_list: DraftPickList, token: str = Depends(oauth2_scheme))
DraftPick.season == pick.season, DraftPick.overall == pick.overall
)
if dupe:
db.close()
raise HTTPException(
status_code=500,
detail=f"Pick # {pick.overall} already exists for season {pick.season}",
)
new_picks.append(pick.dict())
new_picks.append(pick.model_dump())
with db.atomic():
for batch in chunked(new_picks, 15):
DraftPick.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_picks)} picks"
@ -191,7 +186,7 @@ async def post_picks(p_list: DraftPickList, token: str = Depends(oauth2_scheme))
@handle_db_errors
async def delete_pick(pick_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_pick - Bad Token: {token}")
logger.warning("delete_pick - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_pick = DraftPick.get_or_none(DraftPick.id == pick_id)
@ -199,7 +194,6 @@ async def delete_pick(pick_id: int, token: str = Depends(oauth2_scheme)):
raise HTTPException(status_code=404, detail=f"Pick ID {pick_id} not found")
count = this_pick.delete_instance()
db.close()
if count == 1:
return f"Draft pick {pick_id} has been deleted"


@ -46,17 +46,14 @@ async def get_fieldingstats(
if "post" in s_type.lower():
all_stats = BattingStat.post_season(season)
if all_stats.count() == 0:
db.close()
return {"count": 0, "stats": []}
elif s_type.lower() in ["combined", "total", "all"]:
all_stats = BattingStat.combined_season(season)
if all_stats.count() == 0:
db.close()
return {"count": 0, "stats": []}
else:
all_stats = BattingStat.regular_season(season)
if all_stats.count() == 0:
db.close()
return {"count": 0, "stats": []}
all_stats = all_stats.where(
@ -86,7 +83,6 @@ async def get_fieldingstats(
if week_end is not None:
end = min(week_end, end)
if start > end:
db.close()
raise HTTPException(
status_code=404,
detail=f"Start week {start} is after end week {end} - cannot pull stats",
@ -124,7 +120,6 @@ async def get_fieldingstats(
],
}
db.close()
return return_stats
@ -281,6 +276,4 @@ async def get_totalstats(
}
)
return_stats["count"] = len(return_stats["stats"])
db.close()
return return_stats


@ -138,8 +138,6 @@ async def get_help_commands(
except Exception as e:
logger.error(f"Error getting help commands: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@ -149,7 +147,7 @@ async def create_help_command_endpoint(
):
"""Create a new help command"""
if not valid_token(token):
logger.warning(f"create_help_command - Bad Token: {token}")
logger.warning("create_help_command - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -187,8 +185,6 @@ async def create_help_command_endpoint(
except Exception as e:
logger.error(f"Error creating help command: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.put("/{command_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@ -198,7 +194,7 @@ async def update_help_command_endpoint(
):
"""Update an existing help command"""
if not valid_token(token):
logger.warning(f"update_help_command - Bad Token: {token}")
logger.warning("update_help_command - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -238,8 +234,6 @@ async def update_help_command_endpoint(
except Exception as e:
logger.error(f"Error updating help command {command_id}: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.patch("/{command_id}/restore", include_in_schema=PRIVATE_IN_SCHEMA)
@ -249,7 +243,7 @@ async def restore_help_command_endpoint(
):
"""Restore a soft-deleted help command"""
if not valid_token(token):
logger.warning(f"restore_help_command - Bad Token: {token}")
logger.warning("restore_help_command - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -277,8 +271,6 @@ async def restore_help_command_endpoint(
except Exception as e:
logger.error(f"Error restoring help command {command_id}: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.delete("/{command_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@ -288,7 +280,7 @@ async def delete_help_command_endpoint(
):
"""Soft delete a help command"""
if not valid_token(token):
logger.warning(f"delete_help_command - Bad Token: {token}")
logger.warning("delete_help_command - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -309,8 +301,6 @@ async def delete_help_command_endpoint(
except Exception as e:
logger.error(f"Error deleting help command {command_id}: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.get("/stats")
@ -368,8 +358,6 @@ async def get_help_command_stats():
except Exception as e:
logger.error(f"Error getting help command stats: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
# Special endpoints for Discord bot integration
@ -402,8 +390,6 @@ async def get_help_command_by_name_endpoint(
except Exception as e:
logger.error(f"Error getting help command by name '{command_name}': {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.patch("/by_name/{command_name}/view", include_in_schema=PRIVATE_IN_SCHEMA)
@ -411,7 +397,7 @@ async def get_help_command_by_name_endpoint(
async def increment_view_count(command_name: str, token: str = Depends(oauth2_scheme)):
"""Increment view count for a help command"""
if not valid_token(token):
logger.warning(f"increment_view_count - Bad Token: {token}")
logger.warning("increment_view_count - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
@ -439,8 +425,6 @@ async def increment_view_count(command_name: str, token: str = Depends(oauth2_sc
except Exception as e:
logger.error(f"Error incrementing view count for '{command_name}': {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.get("/autocomplete")
@ -470,8 +454,6 @@ async def get_help_names_for_autocomplete(
except Exception as e:
logger.error(f"Error getting help names for autocomplete: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
@router.get("/{command_id}")
@ -499,5 +481,3 @@ async def get_help_command(command_id: int):
except Exception as e:
logger.error(f"Error getting help command {command_id}: {e}")
raise HTTPException(status_code=500, detail=str(e))
finally:
db.close()
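The hunks above delete every per-endpoint `finally: db.close()` pair. A minimal sketch of how a decorator such as the `@handle_db_errors` seen on other routes in this diff could centralize that cleanup — the decorator body here is an assumption for illustration, not the repo's actual implementation:

```python
import asyncio
import functools

class _FakeDB:
    """Stand-in for the module-level peewee `db` object (assumption)."""
    def __init__(self):
        self.close_calls = 0

    def close(self):
        self.close_calls += 1

db = _FakeDB()

def handle_db_errors(func):
    """Close the connection after every call, success or failure.

    Hypothetical body: the real decorator may also translate peewee errors
    into HTTPExceptions; only the close-in-finally idea is shown here.
    """
    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        try:
            return await func(*args, **kwargs)
        finally:
            db.close()  # one close point replaces per-endpoint finally blocks

    return wrapper

@handle_db_errors
async def get_help_command_stats():
    # trivial fake endpoint body for the sketch
    return {"count": 0}

result = asyncio.run(get_help_command_stats())
```

With cleanup hoisted into the decorator, the handlers can drop their `finally` blocks without leaking connections.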

View File

@ -75,7 +75,6 @@ async def get_injuries(
"count": total_count,
"injuries": [model_to_dict(x, recurse=not short_output) for x in all_injuries],
}
db.close()
return return_injuries
@ -87,12 +86,11 @@ async def patch_injury(
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_injury - Bad Token: {token}")
logger.warning("patch_injury - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_injury = Injury.get_or_none(Injury.id == injury_id)
if this_injury is None:
db.close()
raise HTTPException(status_code=404, detail=f"Injury ID {injury_id} not found")
if is_active is not None:
@ -100,10 +98,8 @@ async def patch_injury(
if this_injury.save() == 1:
r_injury = model_to_dict(this_injury)
db.close()
return r_injury
else:
db.close()
raise HTTPException(
status_code=500, detail=f"Unable to patch injury {injury_id}"
)
@ -113,17 +109,15 @@ async def patch_injury(
@handle_db_errors
async def post_injury(new_injury: InjuryModel, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_injury - Bad Token: {token}")
logger.warning("post_injury - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_injury = Injury(**new_injury.dict())
this_injury = Injury(**new_injury.model_dump())
if this_injury.save():
r_injury = model_to_dict(this_injury)
db.close()
return r_injury
else:
db.close()
raise HTTPException(status_code=500, detail=f"Unable to post injury")
@ -131,16 +125,14 @@ async def post_injury(new_injury: InjuryModel, token: str = Depends(oauth2_schem
@handle_db_errors
async def delete_injury(injury_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_injury - Bad Token: {token}")
logger.warning("delete_injury - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_injury = Injury.get_or_none(Injury.id == injury_id)
if this_injury is None:
db.close()
raise HTTPException(status_code=404, detail=f"Injury ID {injury_id} not found")
count = this_injury.delete_instance()
db.close()
if count == 1:
return f"Injury {injury_id} has been deleted"
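This file also swaps `new_injury.dict()` for `new_injury.model_dump()`, the Pydantic v2 rename. A stdlib-only sketch of a compatibility shim that tolerates either major version (the model class below is a stand-in so the example runs without pydantic installed):

```python
def dump_model(model):
    """Serialize a Pydantic model under either major version (compat sketch)."""
    dump = getattr(model, "model_dump", None)
    if callable(dump):       # Pydantic v2: .model_dump()
        return dump()
    return model.dict()      # Pydantic v1: the since-deprecated .dict()

class _FakeInjuryModel:
    """Stand-in for a Pydantic v2 model (assumption, not the repo's model)."""
    def model_dump(self):
        return {"player_id": 7, "is_active": True}

payload = dump_model(_FakeInjuryModel())
```

The diff instead commits to v2 outright, which is the cleaner choice once the dependency is pinned.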

View File

@ -55,7 +55,6 @@ async def get_keepers(
"count": total_count,
"keepers": [model_to_dict(x, recurse=not short_output) for x in all_keepers],
}
db.close()
return return_keepers
@ -69,7 +68,7 @@ async def patch_keeper(
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_keeper - Bad Token: {token}")
logger.warning("patch_keeper - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_keeper = Keeper.get_or_none(Keeper.id == keeper_id)
@ -85,10 +84,8 @@ async def patch_keeper(
if this_keeper.save():
r_keeper = model_to_dict(this_keeper)
db.close()
return r_keeper
else:
db.close()
raise HTTPException(
status_code=500, detail=f"Unable to patch keeper {keeper_id}"
)
@ -98,17 +95,16 @@ async def patch_keeper(
@handle_db_errors
async def post_keepers(k_list: KeeperList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_keepers - Bad Token: {token}")
logger.warning("post_keepers - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
new_keepers = []
for keeper in k_list.keepers:
new_keepers.append(keeper.dict())
new_keepers.append(keeper.model_dump())
with db.atomic():
for batch in chunked(new_keepers, 14):
Keeper.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_keepers)} keepers"
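`post_keepers` batches rows with peewee's `chunked` helper so each `insert_many` stays under the backend's bound-parameter limit. A stdlib-only equivalent, for illustration:

```python
from itertools import islice

def chunked(iterable, n):
    """Yield successive lists of at most n items (mirrors peewee.chunked)."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, n))
        if not batch:
            return
        yield batch

# 10 rows in batches of 4 -> batch sizes 4, 4, 2
batches = list(chunked(range(10), 4))
```

Wrapping the loop in `db.atomic()`, as the endpoint does, keeps the whole multi-batch insert in one transaction.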
@ -117,7 +113,7 @@ async def post_keepers(k_list: KeeperList, token: str = Depends(oauth2_scheme)):
@handle_db_errors
async def delete_keeper(keeper_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_keeper - Bad Token: {token}")
logger.warning("delete_keeper - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_keeper = Keeper.get_or_none(Keeper.id == keeper_id)
@ -125,7 +121,6 @@ async def delete_keeper(keeper_id: int, token: str = Depends(oauth2_scheme)):
raise HTTPException(status_code=404, detail=f"Keeper ID {keeper_id} not found")
count = this_keeper.delete_instance()
db.close()
if count == 1:
return f"Keeper ID {keeper_id} has been deleted"

View File

@ -84,7 +84,6 @@ async def get_managers(
],
}
db.close()
return return_managers
@ -94,7 +93,6 @@ async def get_one_manager(manager_id: int, short_output: Optional[bool] = False)
this_manager = Manager.get_or_none(Manager.id == manager_id)
if this_manager is not None:
r_manager = model_to_dict(this_manager, recurse=not short_output)
db.close()
return r_manager
else:
raise HTTPException(status_code=404, detail=f"Manager {manager_id} not found")
@ -111,12 +109,11 @@ async def patch_manager(
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_manager - Bad Token: {token}")
logger.warning("patch_manager - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_manager = Manager.get_or_none(Manager.id == manager_id)
if this_manager is None:
db.close()
raise HTTPException(
status_code=404, detail=f"Manager ID {manager_id} not found"
)
@ -132,10 +129,8 @@ async def patch_manager(
if this_manager.save() == 1:
r_manager = model_to_dict(this_manager)
db.close()
return r_manager
else:
db.close()
raise HTTPException(
status_code=500, detail=f"Unable to patch manager {this_manager}"
)
@ -145,17 +140,15 @@ async def patch_manager(
@handle_db_errors
async def post_manager(new_manager: ManagerModel, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_manager - Bad Token: {token}")
logger.warning("post_manager - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_manager = Manager(**new_manager.dict())
this_manager = Manager(**new_manager.model_dump())
if this_manager.save():
r_manager = model_to_dict(this_manager)
db.close()
return r_manager
else:
db.close()
raise HTTPException(
status_code=500, detail=f"Unable to post manager {this_manager.name}"
)
@ -165,18 +158,16 @@ async def post_manager(new_manager: ManagerModel, token: str = Depends(oauth2_sc
@handle_db_errors
async def delete_manager(manager_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_manager - Bad Token: {token}")
logger.warning("delete_manager - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_manager = Manager.get_or_none(Manager.id == manager_id)
if this_manager is None:
db.close()
raise HTTPException(
status_code=404, detail=f"Manager ID {manager_id} not found"
)
count = this_manager.delete_instance()
db.close()
if count == 1:
return f"Manager {manager_id} has been deleted"

View File

@ -78,17 +78,14 @@ async def get_pitstats(
if "post" in s_type.lower():
all_stats = PitchingStat.post_season(season)
if all_stats.count() == 0:
db.close()
return {"count": 0, "stats": []}
elif s_type.lower() in ["combined", "total", "all"]:
all_stats = PitchingStat.combined_season(season)
if all_stats.count() == 0:
db.close()
return {"count": 0, "stats": []}
else:
all_stats = PitchingStat.regular_season(season)
if all_stats.count() == 0:
db.close()
return {"count": 0, "stats": []}
if team_abbrev is not None:
@ -114,7 +111,6 @@ async def get_pitstats(
if week_end is not None:
end = min(week_end, end)
if start > end:
db.close()
raise HTTPException(
status_code=404,
detail=f"Start week {start} is after end week {end} - cannot pull stats",
@ -133,7 +129,6 @@ async def get_pitstats(
"stats": [model_to_dict(x, recurse=not short_output) for x in all_stats],
}
db.close()
return return_stats
@ -307,7 +302,6 @@ async def get_totalstats(
"bsv": x.sum_bsv,
}
)
db.close()
return return_stats
@ -317,15 +311,16 @@ async def patch_pitstats(
stat_id: int, new_stats: PitStatModel, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f"patch_pitstats - Bad Token: {token}")
logger.warning("patch_pitstats - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
if PitchingStat.get_or_none(PitchingStat.id == stat_id) is None:
raise HTTPException(status_code=404, detail=f"Stat ID {stat_id} not found")
PitchingStat.update(**new_stats.dict()).where(PitchingStat.id == stat_id).execute()
PitchingStat.update(**new_stats.model_dump()).where(
PitchingStat.id == stat_id
).execute()
r_stat = model_to_dict(PitchingStat.get_by_id(stat_id))
db.close()
return r_stat
@ -333,7 +328,7 @@ async def patch_pitstats(
@handle_db_errors
async def post_pitstats(s_list: PitStatList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_pitstats - Bad Token: {token}")
logger.warning("post_pitstats - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
all_stats = []
@ -350,11 +345,10 @@ async def post_pitstats(s_list: PitStatList, token: str = Depends(oauth2_scheme)
status_code=404, detail=f"Player ID {x.player_id} not found"
)
all_stats.append(PitchingStat(**x.dict()))
all_stats.append(PitchingStat(**x.model_dump()))
with db.atomic():
for batch in chunked(all_stats, 15):
PitchingStat.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Added {len(all_stats)} batting lines"

View File

@ -4,7 +4,7 @@ Thin HTTP layer using PlayerService for business logic.
"""
from fastapi import APIRouter, Query, Response, Depends
from typing import Optional, List
from typing import Literal, Optional, List
from ..dependencies import (
oauth2_scheme,
@ -27,8 +27,10 @@ async def get_players(
pos: list = Query(default=None),
strat_code: list = Query(default=None),
is_injured: Optional[bool] = None,
sort: Optional[str] = None,
limit: Optional[int] = Query(default=None, ge=1),
sort: Optional[Literal["cost-asc", "cost-desc", "name-asc", "name-desc"]] = None,
limit: Optional[int] = Query(
default=None, ge=1, description="Maximum number of results to return"
),
offset: Optional[int] = Query(
default=None, ge=0, description="Number of results to skip for pagination"
),
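Constraining `sort` to a `Literal` lets FastAPI reject unknown values with a 422 instead of silently ignoring them. A stdlib-only sketch of the same validation idea:

```python
from typing import Literal, Optional, get_args

SortOption = Literal["cost-asc", "cost-desc", "name-asc", "name-desc"]

def validate_sort(value: Optional[str]) -> Optional[str]:
    """Mimic the 422-style rejection FastAPI applies to Literal params."""
    allowed = get_args(SortOption)  # the four permitted sort strings
    if value is not None and value not in allowed:
        raise ValueError(f"sort must be one of {allowed}")
    return value

ok = validate_sort("cost-asc")
```

The same pattern documents the parameter in the OpenAPI schema for free, since the allowed values are carried in the type.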

View File

@ -85,7 +85,6 @@ async def get_results(
"count": total_count,
"results": [model_to_dict(x, recurse=not short_output) for x in all_results],
}
db.close()
return return_results
@ -97,7 +96,6 @@ async def get_one_result(result_id: int, short_output: Optional[bool] = False):
r_result = model_to_dict(this_result, recurse=not short_output)
else:
r_result = None
db.close()
return r_result
@ -116,7 +114,7 @@ async def patch_result(
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_player - Bad Token: {token}")
logger.warning("patch_player - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_result = Result.get_or_none(Result.id == result_id)
@ -149,10 +147,8 @@ async def patch_result(
if this_result.save() == 1:
r_result = model_to_dict(this_result)
db.close()
return r_result
else:
db.close()
raise HTTPException(
status_code=500, detail=f"Unable to patch result {result_id}"
)
@ -162,7 +158,7 @@ async def patch_result(
@handle_db_errors
async def post_results(result_list: ResultList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"patch_player - Bad Token: {token}")
logger.warning("patch_player - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
new_results = []
@ -187,12 +183,11 @@ async def post_results(result_list: ResultList, token: str = Depends(oauth2_sche
status_code=404, detail=f"Team ID {x.hometeam_id} not found"
)
new_results.append(x.dict())
new_results.append(x.model_dump())
with db.atomic():
for batch in chunked(new_results, 15):
Result.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_results)} results"
@ -201,16 +196,14 @@ async def post_results(result_list: ResultList, token: str = Depends(oauth2_sche
@handle_db_errors
async def delete_result(result_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_result - Bad Token: {token}")
logger.warning("delete_result - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_result = Result.get_or_none(Result.id == result_id)
if not this_result:
db.close()
raise HTTPException(status_code=404, detail=f"Result ID {result_id} not found")
count = this_result.delete_instance()
db.close()
if count == 1:
return f"Result {result_id} has been deleted"
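The `Result.get_or_none(...)` then raise-404 pattern recurs in nearly every handler in this diff. A compact stdlib sketch of that control flow (the exception class and in-memory table are invented stand-ins):

```python
class HTTPException(Exception):
    """Minimal stand-in for fastapi.HTTPException (sketch only)."""
    def __init__(self, status_code, detail):
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail

_RESULTS = {1: {"id": 1, "away_score": 3}}  # fake table for the sketch

def get_result_or_404(result_id):
    row = _RESULTS.get(result_id)  # plays the role of Result.get_or_none(...)
    if row is None:
        raise HTTPException(status_code=404, detail=f"Result ID {result_id} not found")
    return row

row = get_result_or_404(1)
```

With `db.close()` handled centrally, the 404 branch no longer needs its own cleanup call before raising.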

View File

@ -102,7 +102,6 @@ async def get_players(
if csv:
return_val = query_to_csv(all_players)
db.close()
return Response(content=return_val, media_type="text/csv")
total_count = all_players.count()
@ -112,7 +111,6 @@ async def get_players(
"count": total_count,
"players": [model_to_dict(x) for x in all_players],
}
db.close()
return return_val
@ -121,13 +119,11 @@ async def get_players(
async def get_one_player(player_id: int):
this_player = SbaPlayer.get_or_none(SbaPlayer.id == player_id)
if this_player is None:
db.close()
raise HTTPException(
status_code=404, detail=f"SbaPlayer id {player_id} not found"
)
r_data = model_to_dict(this_player)
db.close()
return r_data
@ -144,8 +140,7 @@ async def patch_player(
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logging.warning(f"Bad Token: {token}")
db.close()
logging.warning("Bad Token")
raise HTTPException(
status_code=401,
detail="You are not authorized to patch mlb players. This event has been logged.",
@ -153,7 +148,6 @@ async def patch_player(
this_player = SbaPlayer.get_or_none(SbaPlayer.id == player_id)
if this_player is None:
db.close()
raise HTTPException(
status_code=404, detail=f"SbaPlayer id {player_id} not found"
)
@ -173,10 +167,8 @@ async def patch_player(
if this_player.save() == 1:
return_val = model_to_dict(this_player)
db.close()
return return_val
else:
db.close()
raise HTTPException(
status_code=418,
detail="Well slap my ass and call me a teapot; I could not save that player",
@ -187,8 +179,7 @@ async def patch_player(
@handle_db_errors
async def post_players(players: PlayerList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logging.warning(f"Bad Token: {token}")
db.close()
logging.warning("Bad Token")
raise HTTPException(
status_code=401,
detail="You are not authorized to post mlb players. This event has been logged.",
@ -207,7 +198,6 @@ async def post_players(players: PlayerList, token: str = Depends(oauth2_scheme))
)
if dupes.count() > 0:
logger.error(f"Found a dupe for {x}")
db.close()
raise HTTPException(
status_code=400,
detail=f"{x.first_name} {x.last_name} has a key already in the database",
@ -218,7 +208,6 @@ async def post_players(players: PlayerList, token: str = Depends(oauth2_scheme))
with db.atomic():
for batch in chunked(new_players, 15):
SbaPlayer.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_players)} new MLB players"
@ -227,8 +216,7 @@ async def post_players(players: PlayerList, token: str = Depends(oauth2_scheme))
@handle_db_errors
async def post_one_player(player: SbaPlayerModel, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logging.warning(f"Bad Token: {token}")
db.close()
logging.warning("Bad Token")
raise HTTPException(
status_code=401,
detail="You are not authorized to post mlb players. This event has been logged.",
@ -243,20 +231,17 @@ async def post_one_player(player: SbaPlayerModel, token: str = Depends(oauth2_sc
logging.info(f"POST /SbaPlayers/one - dupes found:")
for x in dupes:
logging.info(f"{x}")
db.close()
raise HTTPException(
status_code=400,
detail=f"{player.first_name} {player.last_name} has a key already in the database",
)
new_player = SbaPlayer(**player.dict())
new_player = SbaPlayer(**player.model_dump())
saved = new_player.save()
if saved == 1:
return_val = model_to_dict(new_player)
db.close()
return return_val
else:
db.close()
raise HTTPException(
status_code=418,
detail="Well slap my ass and call me a teapot; I could not save that player",
@ -267,8 +252,7 @@ async def post_one_player(player: SbaPlayerModel, token: str = Depends(oauth2_sc
@handle_db_errors
async def delete_player(player_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logging.warning(f"Bad Token: {token}")
db.close()
logging.warning("Bad Token")
raise HTTPException(
status_code=401,
detail="You are not authorized to delete mlb players. This event has been logged.",
@ -276,13 +260,11 @@ async def delete_player(player_id: int, token: str = Depends(oauth2_scheme)):
this_player = SbaPlayer.get_or_none(SbaPlayer.id == player_id)
if this_player is None:
db.close()
raise HTTPException(
status_code=404, detail=f"SbaPlayer id {player_id} not found"
)
count = this_player.delete_instance()
db.close()
if count == 1:
return f"Player {player_id} has been deleted"
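Every handler in this diff stops interpolating the raw bearer token into its warning messages, which keeps credentials out of log aggregation. If some breadcrumb for correlating repeated failures were still wanted, one hedged option is logging a short one-way fingerprint instead (the helper below is invented, not part of this repo — the diff's choice of dropping the token entirely is the simplest safe one):

```python
import hashlib

def token_fingerprint(token: str) -> str:
    """Short, non-reversible fingerprint that is safe to log.

    Hypothetical helper: lets operators see "same bad token, 40 attempts"
    without the log ever containing the credential itself.
    """
    return hashlib.sha256(token.encode()).hexdigest()[:8]

fp = token_fingerprint("super-secret-token")
```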

View File

@ -80,7 +80,6 @@ async def get_schedules(
"count": total_count,
"schedules": [model_to_dict(x, recurse=not short_output) for x in all_sched],
}
db.close()
return return_sched
@ -92,7 +91,6 @@ async def get_one_schedule(schedule_id: int):
r_sched = model_to_dict(this_sched)
else:
r_sched = None
db.close()
return r_sched
@ -108,7 +106,7 @@ async def patch_schedule(
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_schedule - Bad Token: {token}")
logger.warning("patch_schedule - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_sched = Schedule.get_or_none(Schedule.id == schedule_id)
@ -134,10 +132,8 @@ async def patch_schedule(
if this_sched.save() == 1:
r_sched = model_to_dict(this_sched)
db.close()
return r_sched
else:
db.close()
raise HTTPException(
status_code=500, detail=f"Unable to patch schedule {schedule_id}"
)
@ -147,7 +143,7 @@ async def patch_schedule(
@handle_db_errors
async def post_schedules(sched_list: ScheduleList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_schedules - Bad Token: {token}")
logger.warning("post_schedules - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
new_sched = []
@ -172,12 +168,11 @@ async def post_schedules(sched_list: ScheduleList, token: str = Depends(oauth2_s
status_code=404, detail=f"Team ID {x.hometeam_id} not found"
)
new_sched.append(x.dict())
new_sched.append(x.model_dump())
with db.atomic():
for batch in chunked(new_sched, 15):
Schedule.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_sched)} schedules"
@ -186,7 +181,7 @@ async def post_schedules(sched_list: ScheduleList, token: str = Depends(oauth2_s
@handle_db_errors
async def delete_schedule(schedule_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_schedule - Bad Token: {token}")
logger.warning("delete_schedule - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_sched = Schedule.get_or_none(Schedule.id == schedule_id)
@ -196,7 +191,6 @@ async def delete_schedule(schedule_id: int, token: str = Depends(oauth2_scheme))
)
count = this_sched.delete_instance()
db.close()
if count == 1:
return f"Schedule {this_sched} has been deleted"

View File

@ -69,7 +69,6 @@ async def get_standings(
"standings": [model_to_dict(x, recurse=not short_output) for x in div_teams],
}
db.close()
return return_standings
@ -88,19 +87,18 @@ async def get_team_standings(team_id: int):
@router.patch("/{stan_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_standings(
stan_id,
stan_id: int,
wins: Optional[int] = None,
losses: Optional[int] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_standings - Bad Token: {token}")
logger.warning("patch_standings - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
this_stan = Standings.get_by_id(stan_id)
except Exception as e:
db.close()
raise HTTPException(status_code=404, detail=f"No team found with id {stan_id}")
if wins:
@ -109,7 +107,6 @@ async def patch_standings(
this_stan.losses = losses
this_stan.save()
db.close()
return model_to_dict(this_stan)
@ -118,7 +115,7 @@ async def patch_standings(
@handle_db_errors
async def post_standings(season: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_standings - Bad Token: {token}")
logger.warning("post_standings - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
new_teams = []
@ -129,7 +126,6 @@ async def post_standings(season: int, token: str = Depends(oauth2_scheme)):
with db.atomic():
for batch in chunked(new_teams, 16):
Standings.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_teams)} standings"
@ -138,11 +134,10 @@ async def post_standings(season: int, token: str = Depends(oauth2_scheme)):
@handle_db_errors
async def recalculate_standings(season: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"recalculate_standings - Bad Token: {token}")
logger.warning("recalculate_standings - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
code = Standings.recalculate(season)
db.close()
if code == 69:
raise HTTPException(status_code=500, detail=f"Error recreating Standings rows")
return f"Just recalculated standings for season {season}"

View File

@ -13,7 +13,6 @@ from ..dependencies import (
PRIVATE_IN_SCHEMA,
handle_db_errors,
update_season_batting_stats,
MAX_LIMIT,
DEFAULT_LIMIT,
)
@ -61,7 +60,7 @@ async def get_games(
division_id: Optional[int] = None,
short_output: Optional[bool] = False,
sort: Optional[str] = None,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=1000),
offset: int = Query(default=0, ge=0),
) -> Any:
all_games = StratGame.select()
@ -130,7 +129,6 @@ async def get_games(
"count": total_count,
"games": [model_to_dict(x, recurse=not short_output) for x in all_games],
}
db.close()
return return_games
@ -139,11 +137,9 @@ async def get_games(
async def get_one_game(game_id: int) -> Any:
this_game = StratGame.get_or_none(StratGame.id == game_id)
if not this_game:
db.close()
raise HTTPException(status_code=404, detail=f"StratGame ID {game_id} not found")
g_result = model_to_dict(this_game)
db.close()
return g_result
@ -160,12 +156,11 @@ async def patch_game(
scorecard_url: Optional[str] = None,
) -> Any:
if not valid_token(token):
logger.warning(f"patch_game - Bad Token: {token}")
logger.warning("patch_game - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_game = StratGame.get_or_none(StratGame.id == game_id)
if not this_game:
db.close()
raise HTTPException(status_code=404, detail=f"StratGame ID {game_id} not found")
if game_num is not None:
@ -241,7 +236,7 @@ async def patch_game(
@handle_db_errors
async def post_games(game_list: GameList, token: str = Depends(oauth2_scheme)) -> Any:
if not valid_token(token):
logger.warning(f"post_games - Bad Token: {token}")
logger.warning("post_games - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
new_games = []
@ -255,12 +250,11 @@ async def post_games(game_list: GameList, token: str = Depends(oauth2_scheme)) -
status_code=404, detail=f"Team ID {x.home_team_id} not found"
)
new_games.append(x.dict())
new_games.append(x.model_dump())
with db.atomic():
for batch in chunked(new_games, 16):
StratGame.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_games)} games"
@ -269,12 +263,11 @@ async def post_games(game_list: GameList, token: str = Depends(oauth2_scheme)) -
@handle_db_errors
async def wipe_game(game_id: int, token: str = Depends(oauth2_scheme)) -> Any:
if not valid_token(token):
logger.warning(f"wipe_game - Bad Token: {token}")
logger.warning("wipe_game - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_game = StratGame.get_or_none(StratGame.id == game_id)
if not this_game:
db.close()
raise HTTPException(status_code=404, detail=f"StratGame ID {game_id} not found")
this_game.away_score = None
@ -285,10 +278,8 @@ async def wipe_game(game_id: int, token: str = Depends(oauth2_scheme)) -> Any:
if this_game.save() == 1:
g_result = model_to_dict(this_game)
db.close()
return g_result
else:
db.close()
raise HTTPException(status_code=500, detail=f"Unable to wipe game {game_id}")
@ -296,16 +287,14 @@ async def wipe_game(game_id: int, token: str = Depends(oauth2_scheme)) -> Any:
@handle_db_errors
async def delete_game(game_id: int, token: str = Depends(oauth2_scheme)) -> Any:
if not valid_token(token):
logger.warning(f"delete_game - Bad Token: {token}")
logger.warning("delete_game - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_game = StratGame.get_or_none(StratGame.id == game_id)
if not this_game:
db.close()
raise HTTPException(status_code=404, detail=f"StratGame ID {game_id} not found")
count = this_game.delete_instance()
db.close()
if count == 1:
return f"StratGame {game_id} has been deleted"

View File

@ -17,7 +17,6 @@ from ...dependencies import (
add_cache_headers,
cache_result,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
from .common import build_season_games
@ -58,7 +57,7 @@ async def get_batting_totals(
risp: Optional[bool] = None,
inning: list = Query(default=None),
sort: Optional[str] = None,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=1000),
short_output: Optional[bool] = False,
page_num: Optional[int] = 1,
week_start: Optional[int] = None,
@ -598,5 +597,4 @@ async def get_batting_totals(
}
)
db.close()
return return_stats

View File

@ -20,10 +20,8 @@ logger = logging.getLogger("discord_app")
@handle_db_errors
async def get_one_play(play_id: int):
if StratPlay.get_or_none(StratPlay.id == play_id) is None:
db.close()
raise HTTPException(status_code=404, detail=f"Play ID {play_id} not found")
r_play = model_to_dict(StratPlay.get_by_id(play_id))
db.close()
return r_play
@ -33,16 +31,14 @@ async def patch_play(
play_id: int, new_play: PlayModel, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f"patch_play - Bad Token: {token}")
logger.warning("patch_play - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
if StratPlay.get_or_none(StratPlay.id == play_id) is None:
db.close()
raise HTTPException(status_code=404, detail=f"Play ID {play_id} not found")
StratPlay.update(**new_play.dict()).where(StratPlay.id == play_id).execute()
StratPlay.update(**new_play.model_dump()).where(StratPlay.id == play_id).execute()
r_play = model_to_dict(StratPlay.get_by_id(play_id))
db.close()
return r_play
@ -50,7 +46,7 @@ async def patch_play(
@handle_db_errors
async def post_plays(p_list: PlayList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_plays - Bad Token: {token}")
logger.warning("post_plays - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
new_plays = []
@ -88,12 +84,11 @@ async def post_plays(p_list: PlayList, token: str = Depends(oauth2_scheme)):
if this_play.pa == 0:
this_play.batter_final = None
new_plays.append(this_play.dict())
new_plays.append(this_play.model_dump())
with db.atomic():
for batch in chunked(new_plays, 20):
StratPlay.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_plays)} plays"
@ -102,16 +97,14 @@ async def post_plays(p_list: PlayList, token: str = Depends(oauth2_scheme)):
@handle_db_errors
async def delete_play(play_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_play - Bad Token: {token}")
logger.warning("delete_play - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_play = StratPlay.get_or_none(StratPlay.id == play_id)
if not this_play:
db.close()
raise HTTPException(status_code=404, detail=f"Play ID {play_id} not found")
count = this_play.delete_instance()
db.close()
if count == 1:
return f"Play {play_id} has been deleted"
@ -125,16 +118,14 @@ async def delete_play(play_id: int, token: str = Depends(oauth2_scheme)):
@handle_db_errors
async def delete_plays_game(game_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_plays_game - Bad Token: {token}")
logger.warning("delete_plays_game - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
this_game = StratGame.get_or_none(StratGame.id == game_id)
if not this_game:
db.close()
raise HTTPException(status_code=404, detail=f"Game ID {game_id} not found")
count = StratPlay.delete().where(StratPlay.game == this_game).execute()
db.close()
if count > 0:
return f"Deleted {count} plays matching Game ID {game_id}"
@ -148,12 +139,11 @@ async def delete_plays_game(game_id: int, token: str = Depends(oauth2_scheme)):
@handle_db_errors
async def post_erun_check(token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_erun_check - Bad Token: {token}")
logger.warning("post_erun_check - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
all_plays = StratPlay.update(run=1).where(
(StratPlay.e_run == 1) & (StratPlay.run == 0)
)
count = all_plays.execute()
db.close()
return count
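`post_erun_check` issues one bulk UPDATE and returns the affected-row count from `execute()`. The same shape in raw stdlib sqlite3, with table and columns invented for the sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE strat_play (id INTEGER PRIMARY KEY, e_run INT, run INT)")
conn.executemany(
    "INSERT INTO strat_play (e_run, run) VALUES (?, ?)",
    [(1, 0), (1, 0), (0, 0), (1, 1)],  # two rows match the predicate below
)
# Same shape as StratPlay.update(run=1).where((e_run == 1) & (run == 0)).execute()
cur = conn.execute("UPDATE strat_play SET run = 1 WHERE e_run = 1 AND run = 0")
count = cur.rowcount  # affected-row count, as the endpoint returns
conn.close()
```

A single set-based UPDATE like this is both atomic and far cheaper than loading and re-saving each matching play.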

View File

@ -17,7 +17,6 @@ from ...dependencies import (
handle_db_errors,
add_cache_headers,
cache_result,
MAX_LIMIT,
DEFAULT_LIMIT,
)
from .common import build_season_games
@ -57,7 +56,7 @@ async def get_fielding_totals(
team_id: list = Query(default=None),
manager_id: list = Query(default=None),
sort: Optional[str] = None,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=1000),
short_output: Optional[bool] = False,
page_num: Optional[int] = 1,
):
@ -365,5 +364,4 @@ async def get_fielding_totals(
"week": this_week,
}
)
db.close()
return return_stats

View File

@ -20,7 +20,6 @@ from ...dependencies import (
handle_db_errors,
add_cache_headers,
cache_result,
MAX_LIMIT,
DEFAULT_LIMIT,
)
from .common import build_season_games
@ -57,7 +56,7 @@ async def get_pitching_totals(
risp: Optional[bool] = None,
inning: list = Query(default=None),
sort: Optional[str] = None,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=1000),
short_output: Optional[bool] = False,
csv: Optional[bool] = False,
page_num: Optional[int] = 1,
@@ -352,7 +351,6 @@ async def get_pitching_totals(
)
return_stats["count"] = len(return_stats["stats"])
db.close()
if csv:
return Response(
content=complex_data_to_csv(return_stats["stats"]), media_type="text/csv"

View File

@@ -210,5 +210,4 @@ async def get_plays(
"count": all_plays.count(),
"plays": [model_to_dict(x, recurse=not short_output) for x in all_plays],
}
db.close()
return return_plays

View File

@@ -78,14 +78,14 @@ async def get_transactions(
transactions = transactions.where(Transaction.player << these_players)
if cancelled:
transactions = transactions.where(Transaction.cancelled == 1)
transactions = transactions.where(Transaction.cancelled == True)
else:
transactions = transactions.where(Transaction.cancelled == 0)
transactions = transactions.where(Transaction.cancelled == False)
if frozen:
transactions = transactions.where(Transaction.frozen == 1)
transactions = transactions.where(Transaction.frozen == True)
else:
transactions = transactions.where(Transaction.frozen == 0)
transactions = transactions.where(Transaction.frozen == False)
transactions = transactions.order_by(-Transaction.week, Transaction.moveid)
@@ -101,7 +101,6 @@ async def get_transactions(
],
}
db.close()
return return_trans
@@ -114,12 +113,11 @@ async def patch_transactions(
cancelled: Optional[bool] = None,
):
if not valid_token(token):
logger.warning(f"patch_transactions - Bad Token: {token}")
logger.warning("patch_transactions - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
these_moves = Transaction.select().where(Transaction.moveid == move_id)
if these_moves.count() == 0:
db.close()
raise HTTPException(status_code=404, detail=f"Move ID {move_id} not found")
if frozen is not None:
@@ -131,7 +129,6 @@ async def patch_transactions(
x.cancelled = cancelled
x.save()
db.close()
return f"Updated {these_moves.count()} transactions"
@@ -141,7 +138,7 @@ async def post_transactions(
moves: TransactionList, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f"post_transactions - Bad Token: {token}")
logger.warning("post_transactions - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
all_moves = []
@@ -175,13 +172,12 @@ async def post_transactions(
status_code=404, detail=f"Player ID {x.player_id} not found"
)
all_moves.append(x.dict())
all_moves.append(x.model_dump())
with db.atomic():
for batch in chunked(all_moves, 15):
Transaction.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"{len(all_moves)} transactions have been added"
@@ -189,13 +185,12 @@ async def post_transactions(
@handle_db_errors
async def delete_transactions(move_id, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_transactions - Bad Token: {token}")
logger.warning("delete_transactions - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
delete_query = Transaction.delete().where(Transaction.moveid == move_id)
count = delete_query.execute()
db.close()
if count > 0:
return f"Removed {count} transactions"
else:

View File
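The bulk insert in `post_transactions` batches rows with peewee's `chunked` helper inside a single `db.atomic()` transaction, so a 35-row payload becomes three `insert_many` statements rather than 35 single-row inserts. The batching behavior can be sketched in plain Python (a re-implementation for illustration, not peewee's own code):

```python
from itertools import islice


def chunked(iterable, n):
    """Yield successive lists of at most n items, like peewee.chunked."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, n))
        if not batch:
            return
        yield batch


moves = [{"moveid": i} for i in range(35)]
batches = list(chunked(moves, 15))
# 35 rows split into batches of 15, 15, and 5
```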

@@ -124,7 +124,7 @@ async def refresh_season_batting_stats(
Useful for full season updates.
"""
if not valid_token(token):
logger.warning(f"refresh_season_batting_stats - Bad Token: {token}")
logger.warning("refresh_season_batting_stats - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
logger.info(f"Refreshing all batting stats for season {season}")
@@ -270,7 +270,7 @@ async def refresh_season_pitching_stats(
Private endpoint - not included in public API documentation.
"""
if not valid_token(token):
logger.warning(f"refresh_season_batting_stats - Bad Token: {token}")
logger.warning("refresh_season_batting_stats - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
logger.info(f"Refreshing season {season} pitching stats")
@@ -318,7 +318,7 @@ async def get_admin_cache_stats(token: str = Depends(oauth2_scheme)) -> dict:
Private endpoint - requires authentication.
"""
if not valid_token(token):
logger.warning(f"get_admin_cache_stats - Bad Token: {token}")
logger.warning("get_admin_cache_stats - Bad Token")
raise HTTPException(status_code=401, detail="Unauthorized")
logger.info("Getting cache statistics")

View File

@@ -34,6 +34,7 @@ services:
- REDIS_HOST=sba_redis
- REDIS_PORT=6379
- REDIS_DB=0
- DISCORD_WEBHOOK_URL=${DISCORD_WEBHOOK_URL}
depends_on:
- postgres
- redis

migrations.py Normal file (+88 lines)
View File

@@ -0,0 +1,88 @@
#!/usr/bin/env python3
"""Apply pending SQL migrations and record them in schema_versions.
Usage:
python migrations.py
Connects to PostgreSQL using the same environment variables as the API:
POSTGRES_DB (default: sba_master)
POSTGRES_USER (default: sba_admin)
POSTGRES_PASSWORD (required)
POSTGRES_HOST (default: sba_postgres)
POSTGRES_PORT (default: 5432)
On first run against an existing database, all migrations will be applied.
All migration files use IF NOT EXISTS guards so re-applying is safe.
"""
import os
import sys
from pathlib import Path
import psycopg2
MIGRATIONS_DIR = Path(__file__).parent / "migrations"
_CREATE_SCHEMA_VERSIONS = """
CREATE TABLE IF NOT EXISTS schema_versions (
filename VARCHAR(255) PRIMARY KEY,
applied_at TIMESTAMP NOT NULL DEFAULT NOW()
);
"""
def _get_connection():
password = os.environ.get("POSTGRES_PASSWORD")
if password is None:
raise RuntimeError("POSTGRES_PASSWORD environment variable is not set")
return psycopg2.connect(
dbname=os.environ.get("POSTGRES_DB", "sba_master"),
user=os.environ.get("POSTGRES_USER", "sba_admin"),
password=password,
host=os.environ.get("POSTGRES_HOST", "sba_postgres"),
port=int(os.environ.get("POSTGRES_PORT", "5432")),
)
def main():
conn = _get_connection()
try:
with conn:
with conn.cursor() as cur:
cur.execute(_CREATE_SCHEMA_VERSIONS)
with conn.cursor() as cur:
cur.execute("SELECT filename FROM schema_versions")
applied = {row[0] for row in cur.fetchall()}
migration_files = sorted(MIGRATIONS_DIR.glob("*.sql"))
pending = [f for f in migration_files if f.name not in applied]
if not pending:
print("No pending migrations.")
return
for migration_file in pending:
print(f"Applying {migration_file.name} ...", end=" ", flush=True)
sql = migration_file.read_text()
with conn:
with conn.cursor() as cur:
cur.execute(sql)
cur.execute(
"INSERT INTO schema_versions (filename) VALUES (%s)",
(migration_file.name,),
)
print("done")
print(f"\nApplied {len(pending)} migration(s).")
finally:
conn.close()
if __name__ == "__main__":
try:
main()
except Exception as e:
print(f"Error: {e}", file=sys.stderr)
sys.exit(1)

View File
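The runner's core bookkeeping — apply each `.sql` file at most once, in sorted filename order, recording each in `schema_versions` — can be exercised without a database. A simplified model of that logic (the filenames here are hypothetical examples, not the repository's actual migration names):

```python
def pending_migrations(all_files, applied):
    """Return migration filenames not yet recorded, in apply order.

    Mirrors migrations.py: sorted() fixes the order, set membership
    against schema_versions rows prevents double-application.
    """
    return [f for f in sorted(all_files) if f not in applied]


# Hypothetical filenames for illustration only.
files = ["002_add_indexes.sql", "001_schema_versions.sql"]
first_run = pending_migrations(files, applied=set())
second_run = pending_migrations(files, applied=set(first_run))
# second_run is empty: once recorded, a migration is never re-applied
```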

@@ -0,0 +1,9 @@
-- Migration: Add schema_versions table for migration tracking
-- Date: 2026-03-27
-- Description: Creates a table to record which SQL migrations have been applied,
-- preventing double-application and missed migrations across environments.
CREATE TABLE IF NOT EXISTS schema_versions (
filename VARCHAR(255) PRIMARY KEY,
applied_at TIMESTAMP NOT NULL DEFAULT NOW()
);

View File

@@ -0,0 +1,24 @@
-- Migration: Add missing indexes on foreign key columns in stratplay and stratgame
-- Created: 2026-03-27
--
-- PostgreSQL does not auto-index foreign key columns. These tables are the
-- highest-volume tables in the schema and are filtered/joined on these columns
-- in batting, pitching, and running stats aggregation and standings recalculation.
-- stratplay: FK join column
CREATE INDEX IF NOT EXISTS idx_stratplay_game_id ON stratplay(game_id);
-- stratplay: filtered in batting stats aggregation
CREATE INDEX IF NOT EXISTS idx_stratplay_batter_id ON stratplay(batter_id);
-- stratplay: filtered in pitching stats aggregation
CREATE INDEX IF NOT EXISTS idx_stratplay_pitcher_id ON stratplay(pitcher_id);
-- stratplay: filtered in running stats
CREATE INDEX IF NOT EXISTS idx_stratplay_runner_id ON stratplay(runner_id);
-- stratgame: heavily filtered by season
CREATE INDEX IF NOT EXISTS idx_stratgame_season ON stratgame(season);
-- stratgame: standings recalculation query ordering
CREATE INDEX IF NOT EXISTS idx_stratgame_season_week_game_num ON stratgame(season, week, game_num);

View File

@@ -81,9 +81,9 @@ class TestRouteRegistration:
for route, methods in EXPECTED_PLAY_ROUTES.items():
assert route in paths, f"Route {route} missing from OpenAPI schema"
for method in methods:
assert (
method in paths[route]
), f"Method {method.upper()} missing for {route}"
assert method in paths[route], (
f"Method {method.upper()} missing for {route}"
)
def test_play_routes_have_plays_tag(self, api):
"""All play routes should be tagged with 'plays'."""
@@ -96,9 +96,9 @@ class TestRouteRegistration:
for method, spec in paths[route].items():
if method in ("get", "post", "patch", "delete"):
tags = spec.get("tags", [])
assert (
"plays" in tags
), f"{method.upper()} {route} missing 'plays' tag, has {tags}"
assert "plays" in tags, (
f"{method.upper()} {route} missing 'plays' tag, has {tags}"
)
@pytest.mark.post_deploy
@pytest.mark.skip(
@@ -124,9 +124,9 @@ class TestRouteRegistration:
]:
params = paths[route]["get"].get("parameters", [])
param_names = [p["name"] for p in params]
assert (
"sbaplayer_id" in param_names
), f"sbaplayer_id parameter missing from {route}"
assert "sbaplayer_id" in param_names, (
f"sbaplayer_id parameter missing from {route}"
)
# ---------------------------------------------------------------------------
@@ -493,10 +493,9 @@ class TestPlayCrud:
assert result["id"] == play_id
def test_get_nonexistent_play(self, api):
"""GET /plays/999999999 returns an error (wrapped by handle_db_errors)."""
"""GET /plays/999999999 returns 404 Not Found."""
r = requests.get(f"{api}/api/v3/plays/999999999", timeout=10)
# handle_db_errors wraps HTTPException as 500 with detail message
assert r.status_code == 500
assert r.status_code == 404
assert "not found" in r.json().get("detail", "").lower()
@@ -570,14 +569,14 @@ class TestGroupBySbaPlayer:
# Get per-season rows
r_seasons = requests.get(
f"{api}/api/v3/plays/batting",
params={"group_by": "player", "sbaplayer_id": 1, "limit": 999},
params={"group_by": "player", "sbaplayer_id": 1, "limit": 500},
timeout=15,
)
assert r_seasons.status_code == 200
season_pas = [s["pa"] for s in r_seasons.json()["stats"]]
assert career_pa >= max(
season_pas
), f"Career PA ({career_pa}) should be >= max season PA ({max(season_pas)})"
assert career_pa >= max(season_pas), (
f"Career PA ({career_pa}) should be >= max season PA ({max(season_pas)})"
)
@pytest.mark.post_deploy
def test_batting_sbaplayer_short_output(self, api):