Compare commits

...

49 Commits

Author SHA1 Message Date
cal
6972cbe610 Merge pull request 'feat: expose running CalVer version via API (#126)' (#127) from issue/126-feat-expose-running-calver-version-via-api into main
Reviewed-on: #127
2026-04-10 15:54:13 +00:00
Cal Corum
4acf7b2afa feat: expose running CalVer version via API (#126)
Closes #126

- Dockerfile: ARG BUILD_VERSION=dev baked into ENV APP_VERSION + OCI label
- CI: passes BUILD_VERSION build-arg from git tag to docker build
- main.py: adds GET /health returning {"status": "ok", "version": "..."}
- Dockerfile: updates HEALTHCHECK to use /health (no SQL, lightweight)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-10 15:53:55 +00:00
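The resulting endpoint is tiny; a minimal sketch of the handler body (plain Python, no FastAPI wiring, with a made-up version value) shows the whole contract:

```python
import os

# Stand-in for the FastAPI route body: the handler just reads the
# APP_VERSION env var that the Dockerfile bakes in at build time.
def health() -> dict:
    return {"status": "ok", "version": os.environ.get("APP_VERSION", "dev")}

os.environ["APP_VERSION"] = "2026.4.10"  # simulates BUILD_VERSION from a CalVer tag
print(health())
```

Because it touches no database, the handler is safe to call from a container HEALTHCHECK every 30 seconds.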
cal
1b10933ca1 Merge pull request 'chore: remove vestigial env vars from docker-compose (#121)' (#123) from issue/121-chore-remove-vestigial-env-vars-from-docker-compos into main
Reviewed-on: #123
2026-04-10 15:53:32 +00:00
Cal Corum
99fbb3848b chore: remove vestigial env vars from docker-compose (#121)
Closes #121

WORKERS_PER_CORE, TIMEOUT, and GRACEFUL_TIMEOUT were consumed by the
old tiangolo/uvicorn-gunicorn-fastapi base image and are silently
ignored since switching to python:3.12-slim with an explicit uvicorn CMD.

Also applied the same removal to dev (ssh sba-db) and prod (ssh akamai)
docker-compose files directly. Added WEB_WORKERS=4 to prod to override
the Dockerfile default of 2.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-10 15:53:21 +00:00
cal
18d7a8a758 Merge pull request 'chore: drop libpq-dev from Dockerfile (#118)' (#124) from issue/118-chore-drop-libpq-dev-from-dockerfile into main
Reviewed-on: #124
2026-04-10 15:52:48 +00:00
Cal Corum
c22e6072a2 chore: drop libpq-dev from Dockerfile (#118)
psycopg2-binary bundles its own libpq and does not need libpq-dev at
build time. curl is kept for the HEALTHCHECK.

Closes #118

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-09 13:31:34 -05:00
cal
29f9875718 Merge pull request 'fix: drop :latest tag from CI, make worker count configurable' (#116) from fix/ci-tags-and-workers into main
All checks were successful
Build Docker Image / build (push) Successful in 1m18s
2026-04-09 16:32:52 +00:00
Cal Corum
6efba473a0 fix: use exec form CMD so uvicorn receives SIGTERM as PID 1
Shell form CMD makes /bin/sh PID 1 — SIGTERM from docker stop goes to
the shell, not uvicorn, causing SIGKILL after the stop timeout instead
of graceful shutdown. Using CMD ["sh", "-c", "exec uvicorn ..."] lets
the shell expand $WEB_WORKERS and then exec-replace itself with uvicorn,
restoring correct signal delivery.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-09 09:01:18 -05:00
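The fix hinges on signal delivery: a handler only fires in the process the kernel actually signals. A small self-contained sketch (simulating `docker stop` by signalling the current process; not the repo's code) illustrates the graceful-shutdown handler that uvicorn must be positioned to receive as PID 1:

```python
import os
import signal
import time

received = []

def handle_sigterm(signum, frame):
    # A graceful-shutdown handler like the one uvicorn installs; it only
    # runs if the signal reaches this process (PID 1 in the container).
    received.append(signum)

signal.signal(signal.SIGTERM, handle_sigterm)
os.kill(os.getpid(), signal.SIGTERM)  # simulate `docker stop`
time.sleep(0.05)  # give the interpreter a chance to run the handler
assert received == [signal.SIGTERM]
```

With shell-form CMD, `/bin/sh` would sit where this process sits and the handler inside uvicorn would never run.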
Cal Corum
0095d2a792 fix: drop :latest tag from CI, make worker count configurable
Remove :latest Docker tag to match Paper Dynasty convention — only
:version and :environment tags are pushed. Add WEB_WORKERS env var
to Dockerfile (default 2) so prod can override via docker-compose.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 08:41:47 -05:00
cal
1df1abbd7c Merge pull request 'fix: replace removed tiangolo base image with python:3.13-slim' (#115) from fix/dockerfile-base-image into main
All checks were successful
Build Docker Image / build (push) Successful in 3m36s
Reviewed-on: #115
2026-04-09 12:51:57 +00:00
Cal Corum
1187c2c99b fix: use python:3.12-slim to match pinned numpy==1.26.4 compatibility
numpy==1.26.4 has no Python 3.13 wheel and slim images have no build
toolchain, so the build would fail. python:3.12-slim matches the Python
version from the removed tiangolo base image.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-09 07:31:18 -05:00
Cal Corum
4bcc798082 fix: replace removed tiangolo base image with python:3.13-slim
The tiangolo/uvicorn-gunicorn-fastapi:python3.12 image was removed from
Docker Hub, breaking CI builds. Switches to official python:3.13-slim
with explicit uvicorn CMD. Fixes COPY path to match WORKDIR and adds
2 workers to replace the multi-worker gunicorn setup.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 06:49:46 -05:00
cal
d8a020fe55 Merge pull request 'ci: add dev tag trigger and environment-based image tagging' (#114) from ci/dev-tag-support into main
Some checks failed
Build Docker Image / build (push) Failing after 1m25s
2026-04-09 04:47:41 +00:00
Cal Corum
6b957fc61c ci: add dev tag trigger and environment-based image tagging
Adds support for the 'dev' tag to trigger CI builds, pushing :dev Docker
tag for the dev environment. CalVer tags now also push :production alongside
:latest for future migration.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 20:12:45 -05:00
cal
f9e24eb4bc Merge pull request 'fix: update test limit to respect MAX_LIMIT=500 (#110)' (#112) from issue/110-fix-test-batting-sbaplayer-career-totals-returns-4 into main
Reviewed-on: #112
2026-04-08 12:55:55 +00:00
Cal Corum
36b962e5d5 fix: update test limit to respect MAX_LIMIT=500 (#110)
Closes #110

The test was sending limit=999 which exceeds MAX_LIMIT (500), causing
FastAPI to return 422. Changed to limit=500, which is sufficient to
cover all seasons for any player.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-08 12:55:35 +00:00
cal
5d5df325bc Merge pull request 'feat: increase MAX_LIMIT to 1000 for plays batting/fielding/pitching (#111)' (#113) from issue/111-feat-increase-max-limit-to-1000-for-plays-fielding into main
Reviewed-on: #113
2026-04-08 12:53:38 +00:00
Cal Corum
682b990321 feat: increase MAX_LIMIT to 1000 for plays batting/fielding/pitching (#111)
Closes #111

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-08 06:32:46 -05:00
cal
5b19bd486a Merge pull request 'fix: preserve total_count in get_totalstats instead of overwriting with page length (#101)' (#102) from issue/101-fieldingstats-get-totalstats-total-count-overwritt into main 2026-04-08 04:08:40 +00:00
Cal Corum
718abc0096 fix: preserve total_count in get_totalstats instead of overwriting with page length (#101)
Closes #101

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 23:08:10 -05:00
cal
52d88ae950 Merge pull request 'fix: add missing indexes on FK columns in stratplay and stratgame (#74)' (#95) from issue/74-add-missing-indexes-on-foreign-key-columns-in-high into main 2026-04-08 04:06:06 +00:00
Cal Corum
9165419ed0 fix: add missing indexes on FK columns in stratplay and stratgame (#74)
Closes #74

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 23:05:25 -05:00
cal
d23d6520c3 Merge pull request 'fix: batch standings updates to eliminate N+1 queries in recalculate (#75)' (#93) from issue/75-fix-n-1-query-pattern-in-standings-recalculation into main 2026-04-08 04:03:39 +00:00
Cal Corum
c23ca9a721 fix: batch standings updates to eliminate N+1 queries in recalculate (#75)
Replace per-game update_standings() calls with pre-fetched dicts and
in-memory accumulation, then a single bulk_update at the end.
Reduces ~1,100+ queries for a full season to ~5 queries.

Closes #75

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 23:03:11 -05:00
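The pattern is generic: pre-fetch once, accumulate in memory, write once. A toy sketch with made-up game dicts (not the real StratGame/Standings models) shows the shape of the accumulation step:

```python
from collections import defaultdict

# Hypothetical games list standing in for pre-fetched StratGame rows
games = [
    {"home": "TC1", "away": "APL1", "home_score": 5, "away_score": 3},
    {"home": "APL1", "away": "TC1", "home_score": 2, "away_score": 1},
    {"home": "TC1", "away": "APL1", "home_score": 0, "away_score": 4},
]

# One pass, in memory: no per-game SELECT/UPDATE round trips
standings = defaultdict(lambda: {"wins": 0, "losses": 0})
for g in games:
    winner = g["home"] if g["home_score"] > g["away_score"] else g["away"]
    loser = g["away"] if winner == g["home"] else g["home"]
    standings[winner]["wins"] += 1
    standings[loser]["losses"] += 1

# A real implementation would now issue a single bulk_update over
# standings.values(); here we just inspect the accumulated totals.
print(dict(standings))
```

Swapping ~1,100 per-row updates for one pass plus one bulk write is what collapses the query count to ~5.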
cal
1db06576cc Merge pull request 'fix: replace integer comparisons on boolean fields with True/False (#69)' (#94) from issue/69-boolean-fields-compared-as-integers-sqlite-pattern into main 2026-04-08 03:57:35 +00:00
Cal Corum
7a5327f490 fix: replace integer comparisons on boolean fields with True/False (#69)
Closes #69

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:57:01 -05:00
cal
a2889751da Merge pull request 'fix: remove SQLite fallback code from db_engine.py (#70)' (#89) from issue/70-remove-sqlite-fallback-code-from-db-engine-py into main 2026-04-08 03:56:11 +00:00
Cal Corum
eb886a4690 fix: remove SQLite fallback code from db_engine.py (#70)
Removes DATABASE_TYPE conditional entirely. PostgreSQL is now the only
supported backend. Moves PooledPostgresqlDatabase import to top-level
and raises RuntimeError at startup if POSTGRES_PASSWORD is unset,
preventing silent misconnection with misleading errors.

Closes #70

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:55:44 -05:00
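The fail-fast idea generalizes; a sketch with a hypothetical `require_env` helper (not the repo's actual code) shows why raising at startup beats connecting with a bogus password and surfacing a misleading auth error later:

```python
import os

def require_env(name: str) -> str:
    # Fail fast at startup instead of silently misconnecting later
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"{name} environment variable is not set.")
    return value

os.environ["POSTGRES_PASSWORD"] = "s3cret"  # hypothetical value for the demo
assert require_env("POSTGRES_PASSWORD") == "s3cret"

os.environ.pop("MISSING_VAR", None)
try:
    require_env("MISSING_VAR")
except RuntimeError as exc:
    assert "MISSING_VAR" in str(exc)
```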
cal
0ee7367bc0 Merge pull request 'fix: disable autoconnect and set pool timeout on PooledPostgresqlDatabase (#80)' (#87) from issue/80-disable-autoconnect-and-set-pool-timeout-on-pooled into main 2026-04-08 03:55:05 +00:00
Cal Corum
6637f6e9eb fix: disable autoconnect and set pool timeout on PooledPostgresqlDatabase (#80)
- Set timeout=5 so pool exhaustion surfaces as an error instead of hanging forever
- Set autoconnect=False to require explicit connection acquisition
- Add HTTP middleware in main.py to open/close connections per request

Closes #80

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:54:27 -05:00
cal
fa176c9b05 Merge pull request 'fix: enforce Literal validation on sort parameter in GET /api/v3/players (#66)' (#68) from ai/major-domo-database-66 into main 2026-04-08 03:54:02 +00:00
Cal Corum
ece25ec22c fix: enforce Literal validation on sort parameter in GET /api/v3/players (#66)
Closes #66

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:53:33 -05:00
cal
580e8ea031 Merge pull request 'fix: align CustomCommandCreator.discord_id model with BIGINT column (#78)' (#88) from issue/78-fix-discord-id-type-mismatch-between-model-charfie into main 2026-04-08 03:49:26 +00:00
Cal Corum
18394aa74e fix: align CustomCommandCreator.discord_id model with BIGINT column (#78)
Closes #78

Change CharField(max_length=20) to BigIntegerField to match the BIGINT
column created by the migration. Remove the str() workaround in
get_creator_by_discord_id() that was compensating for the type mismatch.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:48:55 -05:00
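For context on why BIGINT is the right column type: Discord snowflake IDs are 64-bit integers, so they fit a signed BIGINT without string conversion. A quick check (the ID below is the example snowflake from Discord's documentation):

```python
# Discord snowflakes are 64-bit integers; a signed BIGINT column
# (max 2**63 - 1) holds them without the str() round-trip.
example_snowflake = 175928847299117063  # example ID from Discord's docs
assert isinstance(example_snowflake, int)
assert example_snowflake < 2**63  # fits in PostgreSQL BIGINT
```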
cal
8e7466abd7 Merge pull request 'fix: remove token value from Bad Token log warnings (#79)' (#85) from issue/79-stop-logging-raw-auth-tokens-in-warning-messages into main 2026-04-08 03:43:13 +00:00
Cal Corum
d61bc31daa fix: remove token value from Bad Token log warnings (#79)
Closes #79

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 22:42:49 -05:00
cal
ca15dfe380 Merge pull request 'fix: replace row-by-row DELETE with bulk DELETE in career recalculation (#77)' (#92) from issue/77-replace-row-by-row-delete-with-bulk-delete-in-care into main 2026-04-08 03:24:58 +00:00
cal
1575be8260 Merge pull request 'fix: update Docker base image from Python 3.11 to 3.12 (#82)' (#91) from issue/82-align-python-version-between-docker-image-3-11-and into main 2026-04-08 03:24:46 +00:00
cal
7c7405cd1d Merge pull request 'feat: add migration tracking system (#81)' (#96) from issue/81-add-migration-tracking-system into main 2026-04-08 03:23:41 +00:00
cal
0cc0cba6a9 Merge pull request 'fix: replace deprecated Pydantic .dict() with .model_dump() (#76)' (#90) from issue/76-replace-deprecated-pydantic-dict-with-model-dump into main 2026-04-08 03:23:30 +00:00
cal
41fe4f6ce2 Merge pull request 'fix: add type annotations to untyped query parameters (#73)' (#86) from issue/73-add-type-annotations-to-untyped-query-parameters into main 2026-04-08 03:22:17 +00:00
cal
14234385fe Merge pull request 'fix: add combined_season classmethod to PitchingStat (#65)' (#67) from ai/major-domo-database-65 into main 2026-04-08 03:22:15 +00:00
cal
07aeaa8f3e Merge pull request 'fix: replace manual db.close() calls with middleware-based connection management (#71)' (#97) from issue/71-refactor-manual-db-close-calls-to-middleware-based into main 2026-04-08 02:42:09 +00:00
Cal Corum
eccf4d1441 feat: add migration tracking system (#81)
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m11s
Adds schema_versions table and migrations.py runner to prevent
double-application and missed migrations across dev/prod environments.

- migrations/2026-03-27_add_schema_versions_table.sql: creates tracking table
- migrations.py: applies pending .sql files in sorted order, records each in schema_versions
- .gitignore: untrack migrations.py (was incorrectly ignored as legacy root file)

First run on an existing DB will apply all migrations (safe — all use IF NOT EXISTS).

Closes #81

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 05:34:13 -05:00
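The runner's core loop is simple; a self-contained sketch using sqlite3 and a temp directory (illustrative only; the real migrations.py targets PostgreSQL) demonstrates sorted application plus tracking:

```python
import os
import sqlite3
import tempfile

def run_migrations(conn, migrations_dir):
    # Track applied files in schema_versions so reruns are no-ops
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_versions (filename TEXT PRIMARY KEY)"
    )
    applied = {row[0] for row in conn.execute("SELECT filename FROM schema_versions")}
    for name in sorted(os.listdir(migrations_dir)):  # date-prefixed names sort in order
        if not name.endswith(".sql") or name in applied:
            continue  # prevents double-application
        with open(os.path.join(migrations_dir, name)) as f:
            conn.executescript(f.read())
        conn.execute("INSERT INTO schema_versions (filename) VALUES (?)", (name,))
    conn.commit()

with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "2026-03-27_create_demo.sql"), "w") as f:
        f.write("CREATE TABLE IF NOT EXISTS demo (id INTEGER PRIMARY KEY);")
    conn = sqlite3.connect(":memory:")
    run_migrations(conn, d)
    run_migrations(conn, d)  # second run applies nothing
    count = conn.execute("SELECT COUNT(*) FROM schema_versions").fetchone()[0]
    assert count == 1
```

The IF NOT EXISTS guards in the actual migrations make the first run on an existing DB safe, exactly as the commit message notes.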
Cal Corum
d8c6ce2a5e fix: replace row-by-row DELETE with bulk DELETE in career recalculation (#77)
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m4s
Closes #77

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 03:02:55 -05:00
Cal Corum
665f275546 fix: update Docker base image from Python 3.11 to 3.12 (#82)
Some checks failed
Build Docker Image / build (pull_request) Failing after 49s
Closes #82

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 02:32:01 -05:00
Cal Corum
75a8fc8505 fix: replace deprecated Pydantic .dict() with .model_dump() (#76)
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m26s
Closes #76

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 02:02:49 -05:00
Cal Corum
dcaf184ad3 fix: add type annotations to untyped query parameters (#73)
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m11s
Closes #73

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 00:02:57 -05:00
Cal Corum
a21bb2a380 fix: add combined_season classmethod to PitchingStat (#65)
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m17s
Closes #65

`PitchingStat.combined_season()` was referenced in the `get_pitstats`
handler but never defined, causing a 500 on `s_type=combined/total/all`.

Added `combined_season` as a `@staticmethod` matching the pattern of
`BattingStat.combined_season` — returns all rows for the given season
with no week filter (both regular and postseason).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 17:33:47 -05:00
37 changed files with 455 additions and 173 deletions


@@ -1,11 +1,13 @@
 # Gitea Actions: Docker Build, Push, and Notify
 #
 # CI/CD pipeline for Major Domo Database API:
-# - Triggered by pushing a CalVer tag (e.g., 2026.4.5)
-# - Builds Docker image and pushes to Docker Hub with version + latest tags
+# - Triggered by pushing a CalVer tag (e.g., 2026.4.5) or the 'dev' tag
+# - CalVer tags push with version + "production" Docker tags
+# - "dev" tag pushes with "dev" Docker tag for the dev environment
 # - Sends Discord notifications on success/failure
 #
 # To release: git tag -a 2026.4.5 -m "description" && git push origin 2026.4.5
+# To test: git tag -f dev && git push -f origin dev

 name: Build Docker Image
@@ -13,6 +15,7 @@ on:
   push:
     tags:
       - '20*'  # matches CalVer tags like 2026.4.5
+      - 'dev'

 jobs:
   build:
@@ -32,6 +35,11 @@
           echo "version=$VERSION" >> $GITHUB_OUTPUT
           echo "sha_short=$SHA_SHORT" >> $GITHUB_OUTPUT
           echo "timestamp=$(date -u +%Y-%m-%dT%H:%M:%SZ)" >> $GITHUB_OUTPUT
+          if [ "$VERSION" = "dev" ]; then
+            echo "environment=dev" >> $GITHUB_OUTPUT
+          else
+            echo "environment=production" >> $GITHUB_OUTPUT
+          fi

       - name: Set up Docker Buildx
         uses: https://github.com/docker/setup-buildx-action@v3
@@ -49,7 +57,9 @@
           push: true
           tags: |
             manticorum67/major-domo-database:${{ steps.version.outputs.version }}
-            manticorum67/major-domo-database:latest
+            manticorum67/major-domo-database:${{ steps.version.outputs.environment }}
+          build-args: |
+            BUILD_VERSION=${{ steps.version.outputs.version }}
           cache-from: type=registry,ref=manticorum67/major-domo-database:buildcache
           cache-to: type=registry,ref=manticorum67/major-domo-database:buildcache,mode=max
@@ -61,7 +71,7 @@
           echo "" >> $GITHUB_STEP_SUMMARY
           echo "**Image Tags:**" >> $GITHUB_STEP_SUMMARY
           echo "- \`manticorum67/major-domo-database:${{ steps.version.outputs.version }}\`" >> $GITHUB_STEP_SUMMARY
-          echo "- \`manticorum67/major-domo-database:latest\`" >> $GITHUB_STEP_SUMMARY
+          echo "- \`manticorum67/major-domo-database:${{ steps.version.outputs.environment }}\`" >> $GITHUB_STEP_SUMMARY
           echo "" >> $GITHUB_STEP_SUMMARY
           echo "**Build Details:**" >> $GITHUB_STEP_SUMMARY
           echo "- Commit: \`${{ steps.version.outputs.sha_short }}\`" >> $GITHUB_STEP_SUMMARY

.gitignore

@@ -55,7 +55,6 @@ Include/
 pyvenv.cfg
 db_engine.py
 main.py
-migrations.py
 db_engine.py
 sba_master.db
 db_engine.py


@@ -1,16 +1,21 @@
-# Use specific version for reproducible builds
-FROM tiangolo/uvicorn-gunicorn-fastapi:python3.11
+# Use official Python slim image
+FROM python:3.12-slim
+
+# Build-time version arg — passed by CI from the git tag
+ARG BUILD_VERSION=dev
+LABEL org.opencontainers.image.version=$BUILD_VERSION

 # Set Python optimizations
 ENV PYTHONUNBUFFERED=1
 ENV PYTHONDONTWRITEBYTECODE=1
 ENV PIP_NO_CACHE_DIR=1
+
+# Bake the CalVer version into the image so it's readable at runtime
+ENV APP_VERSION=$BUILD_VERSION

 WORKDIR /usr/src/app

 # Install system dependencies (PostgreSQL client libraries)
 RUN apt-get update && apt-get install -y --no-install-recommends \
-    libpq-dev \
     curl \
     && rm -rf /var/lib/apt/lists/*
@@ -20,11 +25,15 @@ RUN pip install --no-cache-dir --upgrade pip && \
     pip install --no-cache-dir -r requirements.txt

 # Copy application code
-COPY ./app /app/app
+COPY ./app /usr/src/app/app

 # Create directories for volumes
 RUN mkdir -p /usr/src/app/storage

 # Health check
 HEALTHCHECK --interval=30s --timeout=10s --start-period=10s --retries=3 \
-    CMD curl -f http://localhost:80/api/v3/current || exit 1
+    CMD curl -f http://localhost:80/health || exit 1
+
+# Start uvicorn
+ENV WEB_WORKERS=2
+CMD ["sh", "-c", "exec uvicorn app.main:app --host 0.0.0.0 --port 80 --workers $WEB_WORKERS"]


@@ -8,39 +8,30 @@ from typing import Literal, List, Optional
 from pandas import DataFrame
 from peewee import *
 from peewee import ModelSelect
+from playhouse.pool import PooledPostgresqlDatabase
 from playhouse.shortcuts import model_to_dict

-# Database configuration - supports both SQLite and PostgreSQL
-DATABASE_TYPE = os.environ.get("DATABASE_TYPE", "sqlite")
-
-if DATABASE_TYPE.lower() == "postgresql":
-    from playhouse.pool import PooledPostgresqlDatabase
-
-    _postgres_password = os.environ.get("POSTGRES_PASSWORD")
-    if _postgres_password is None:
-        raise RuntimeError(
-            "POSTGRES_PASSWORD environment variable is not set. "
-            "This variable is required when DATABASE_TYPE=postgresql."
-        )
-
-    db = PooledPostgresqlDatabase(
-        os.environ.get("POSTGRES_DB", "sba_master"),
-        user=os.environ.get("POSTGRES_USER", "sba_admin"),
-        password=_postgres_password,
-        host=os.environ.get("POSTGRES_HOST", "sba_postgres"),
-        port=int(os.environ.get("POSTGRES_PORT", "5432")),
-        max_connections=20,
-        stale_timeout=300,  # 5 minutes
-        timeout=0,
-        autoconnect=True,
-        autorollback=True,  # Automatically rollback failed transactions
-    )
-else:
-    # Default SQLite configuration
-    db = SqliteDatabase(
-        "storage/sba_master.db",
-        pragmas={"journal_mode": "wal", "cache_size": -1 * 64000, "synchronous": 0},
-    )
+_postgres_password = os.environ.get("POSTGRES_PASSWORD")
+if _postgres_password is None:
+    raise RuntimeError(
+        "POSTGRES_PASSWORD environment variable is not set. "
+        "This variable is required when DATABASE_TYPE=postgresql."
+    )
+
+db = PooledPostgresqlDatabase(
+    os.environ.get("POSTGRES_DB", "sba_master"),
+    user=os.environ.get("POSTGRES_USER", "sba_admin"),
+    password=_postgres_password,
+    host=os.environ.get("POSTGRES_HOST", "sba_postgres"),
+    port=int(os.environ.get("POSTGRES_PORT", "5432")),
+    max_connections=20,
+    stale_timeout=300,  # 5 minutes
+    timeout=5,
+    autoconnect=False,
+    autorollback=True,  # Automatically rollback failed transactions
+)

 date = f"{datetime.datetime.now().year}-{datetime.datetime.now().month}-{datetime.datetime.now().day}"
 logger = logging.getLogger("discord_app")
@@ -523,12 +514,12 @@ class Team(BaseModel):
         all_drops = Transaction.select_season(Current.latest().season).where(
             (Transaction.oldteam == self)
             & (Transaction.week == current.week + 1)
-            & (Transaction.cancelled == 0)
+            & (Transaction.cancelled == False)
         )
         all_adds = Transaction.select_season(Current.latest().season).where(
             (Transaction.newteam == self)
             & (Transaction.week == current.week + 1)
-            & (Transaction.cancelled == 0)
+            & (Transaction.cancelled == False)
         )

         for move in all_drops:
@@ -606,12 +597,12 @@ class Team(BaseModel):
         all_drops = Transaction.select_season(Current.latest().season).where(
             (Transaction.oldteam == sil_team)
             & (Transaction.week == current.week + 1)
-            & (Transaction.cancelled == 0)
+            & (Transaction.cancelled == False)
         )
         all_adds = Transaction.select_season(Current.latest().season).where(
             (Transaction.newteam == sil_team)
             & (Transaction.week == current.week + 1)
-            & (Transaction.cancelled == 0)
+            & (Transaction.cancelled == False)
         )

         for move in all_drops:
@@ -689,12 +680,12 @@ class Team(BaseModel):
         all_drops = Transaction.select_season(Current.latest().season).where(
             (Transaction.oldteam == lil_team)
             & (Transaction.week == current.week + 1)
-            & (Transaction.cancelled == 0)
+            & (Transaction.cancelled == False)
         )
         all_adds = Transaction.select_season(Current.latest().season).where(
             (Transaction.newteam == lil_team)
             & (Transaction.week == current.week + 1)
-            & (Transaction.cancelled == 0)
+            & (Transaction.cancelled == False)
         )

         for move in all_drops:
@@ -1365,6 +1356,10 @@ class PitchingStat(BaseModel):
     def select_season(season):
         return PitchingStat.select().where(PitchingStat.season == season)

+    @staticmethod
+    def combined_season(season):
+        return PitchingStat.select().where(PitchingStat.season == season)
+
     @staticmethod
     def regular_season(season):
         if season == 1:
@@ -1529,7 +1524,18 @@ class Standings(BaseModel):
         with db.atomic():
             Standings.bulk_create(create_teams)

-        # Iterate through each individual result
+        # Pre-fetch all data needed for in-memory processing (avoids N+1 queries)
+        standings_by_team_id = {
+            s.team_id: s
+            for s in Standings.select().where(Standings.team << s_teams)
+        }
+        teams_by_id = {t.id: t for t in Team.select().where(Team.season == season)}
+        divisions_by_id = {
+            d.id: d
+            for d in Division.select().where(Division.season == season)
+        }
+
+        # Iterate through each individual result, tallying wins/losses in memory
         # for game in Result.select_season(season).where(Result.week <= 22):
         for game in (
             StratGame.select()
@@ -1540,8 +1546,121 @@
             )
             .order_by(StratGame.week, StratGame.game_num)
         ):
-            # tally win and loss for each standings object
-            game.update_standings()
+            away_stan = standings_by_team_id.get(game.away_team_id)
+            home_stan = standings_by_team_id.get(game.home_team_id)
+            away_team_obj = teams_by_id.get(game.away_team_id)
+            home_team_obj = teams_by_id.get(game.home_team_id)
+            if None in (away_stan, home_stan, away_team_obj, home_team_obj):
+                continue
+            away_div = divisions_by_id.get(away_team_obj.division_id)
+            home_div = divisions_by_id.get(home_team_obj.division_id)
+            if away_div is None or home_div is None:
+                continue
+
+            # Home Team Won
+            if game.home_score > game.away_score:
+                home_stan.wins += 1
+                home_stan.home_wins += 1
+                away_stan.losses += 1
+                away_stan.away_losses += 1
+                if home_stan.streak_wl == 'w':
+                    home_stan.streak_num += 1
+                else:
+                    home_stan.streak_wl = 'w'
+                    home_stan.streak_num = 1
+                if away_stan.streak_wl == 'l':
+                    away_stan.streak_num += 1
+                else:
+                    away_stan.streak_wl = 'l'
+                    away_stan.streak_num = 1
+                if game.home_score == game.away_score + 1:
+                    home_stan.one_run_wins += 1
+                    away_stan.one_run_losses += 1
+                if away_div.division_abbrev == 'TC':
+                    home_stan.div1_wins += 1
+                elif away_div.division_abbrev == 'ETSOS':
+                    home_stan.div2_wins += 1
+                elif away_div.division_abbrev == 'APL':
+                    home_stan.div3_wins += 1
+                elif away_div.division_abbrev == 'BBC':
+                    home_stan.div4_wins += 1
+                if home_div.division_abbrev == 'TC':
+                    away_stan.div1_losses += 1
+                elif home_div.division_abbrev == 'ETSOS':
+                    away_stan.div2_losses += 1
+                elif home_div.division_abbrev == 'APL':
+                    away_stan.div3_losses += 1
+                elif home_div.division_abbrev == 'BBC':
+                    away_stan.div4_losses += 1
+                home_stan.run_diff += game.home_score - game.away_score
+                away_stan.run_diff -= game.home_score - game.away_score
+            # Away Team Won
+            else:
+                home_stan.losses += 1
+                home_stan.home_losses += 1
+                away_stan.wins += 1
+                away_stan.away_wins += 1
+                if home_stan.streak_wl == 'l':
+                    home_stan.streak_num += 1
+                else:
+                    home_stan.streak_wl = 'l'
+                    home_stan.streak_num = 1
+                if away_stan.streak_wl == 'w':
+                    away_stan.streak_num += 1
+                else:
+                    away_stan.streak_wl = 'w'
+                    away_stan.streak_num = 1
+                if game.away_score == game.home_score + 1:
+                    home_stan.one_run_losses += 1
+                    away_stan.one_run_wins += 1
+                if away_div.division_abbrev == 'TC':
+                    home_stan.div1_losses += 1
+                elif away_div.division_abbrev == 'ETSOS':
+                    home_stan.div2_losses += 1
+                elif away_div.division_abbrev == 'APL':
+                    home_stan.div3_losses += 1
+                elif away_div.division_abbrev == 'BBC':
+                    home_stan.div4_losses += 1
+                if home_div.division_abbrev == 'TC':
+                    away_stan.div1_wins += 1
+                elif home_div.division_abbrev == 'ETSOS':
+                    away_stan.div2_wins += 1
+                elif home_div.division_abbrev == 'APL':
+                    away_stan.div3_wins += 1
+                elif home_div.division_abbrev == 'BBC':
+                    away_stan.div4_wins += 1
+                home_stan.run_diff -= game.away_score - game.home_score
+                away_stan.run_diff += game.away_score - game.home_score
+
+        # Bulk save all modified standings
+        with db.atomic():
+            Standings.bulk_update(
+                list(standings_by_team_id.values()),
+                fields=[
+                    Standings.wins, Standings.losses,
+                    Standings.home_wins, Standings.home_losses,
+                    Standings.away_wins, Standings.away_losses,
+                    Standings.one_run_wins, Standings.one_run_losses,
+                    Standings.streak_wl, Standings.streak_num,
+                    Standings.run_diff,
+                    Standings.div1_wins, Standings.div1_losses,
+                    Standings.div2_wins, Standings.div2_losses,
+                    Standings.div3_wins, Standings.div3_losses,
+                    Standings.div4_wins, Standings.div4_losses,
+                ]
+            )

         # Set pythag record and iterate through last 8 games for last8 record
         for team in all_teams:
@@ -1590,9 +1709,7 @@ class BattingCareer(BaseModel):
     @staticmethod
     def recalculate():
         # Wipe existing data
-        delete_lines = BattingCareer.select()
-        for line in delete_lines:
-            line.delete_instance()
+        BattingCareer.delete().execute()

         # For each seasonstat, find career or create new and increment
         for this_season in BattingSeason.select().where(
@@ -1655,9 +1772,7 @@ class PitchingCareer(BaseModel):
     @staticmethod
     def recalculate():
         # Wipe existing data
-        delete_lines = PitchingCareer.select()
-        for line in delete_lines:
-            line.delete_instance()
+        PitchingCareer.delete().execute()

         # For each seasonstat, find career or create new and increment
         for this_season in PitchingSeason.select().where(
@@ -1709,9 +1824,7 @@ class FieldingCareer(BaseModel):
     @staticmethod
    def recalculate():
         # Wipe existing data
-        delete_lines = FieldingCareer.select()
-        for line in delete_lines:
-            line.delete_instance()
+        FieldingCareer.delete().execute()

         # For each seasonstat, find career or create new and increment
         for this_season in FieldingSeason.select().where(
@@ -2369,6 +2482,12 @@ class StratGame(BaseModel):
             home_stan.save()
             away_stan.save()

+    class Meta:
+        indexes = (
+            (("season",), False),
+            (("season", "week", "game_num"), False),
+        )
+

 class StratPlay(BaseModel):
     game = ForeignKeyField(StratGame)
@@ -2443,6 +2562,14 @@ class StratPlay(BaseModel):
     re24_primary = FloatField(null=True)
     re24_running = FloatField(null=True)

+    class Meta:
+        indexes = (
+            (("game",), False),
+            (("batter",), False),
+            (("pitcher",), False),
+            (("runner",), False),
+        )
+

 class Decision(BaseModel):
     game = ForeignKeyField(StratGame)
@@ -2465,8 +2592,7 @@ class Decision(BaseModel):

 class CustomCommandCreator(BaseModel):
     """Model for custom command creators."""

-    discord_id = CharField(max_length=20, unique=True)  # Discord snowflake ID as string
+    discord_id = BigIntegerField(unique=True)
     username = CharField(max_length=32)
     display_name = CharField(max_length=32, null=True)
     created_at = DateTimeField()


@@ -8,6 +8,8 @@ from fastapi import Depends, FastAPI, Request
 from fastapi.openapi.docs import get_swagger_ui_html
 from fastapi.openapi.utils import get_openapi

+from .db_engine import db
+
 # from fastapi.openapi.docs import get_swagger_ui_html
 # from fastapi.openapi.utils import get_openapi
@ -68,6 +70,17 @@ app = FastAPI(
) )
@app.middleware("http")
async def db_connection_middleware(request: Request, call_next):
db.connect(reuse_if_open=True)
try:
response = await call_next(request)
finally:
if not db.is_closed():
db.close()
return response
logger.info(f"Starting up now...") logger.info(f"Starting up now...")
@ -126,6 +139,11 @@ app.include_router(views.router)
logger.info(f"Loaded all routers.") logger.info(f"Loaded all routers.")
@app.get("/health")
async def health():
return {"status": "ok", "version": os.environ.get("APP_VERSION", "dev")}
@app.get("/api/docs", include_in_schema=False) @app.get("/api/docs", include_in_schema=False)
async def get_docs(req: Request): async def get_docs(req: Request):
logger.debug(req.scope) logger.debug(req.scope)

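The middleware added above is the usual peewee connection-per-request pattern. A stand-alone sketch (no FastAPI or peewee imports; `FakeDB` and `handle` are stand-ins for the database object and the middleware, not code from this repo) demonstrates the lifecycle it guarantees, including cleanup when a handler raises:

```python
# FakeDB mimics just enough of a peewee database for the sketch.
class FakeDB:
    def __init__(self):
        self.open = False

    def connect(self, reuse_if_open=False):
        if self.open and not reuse_if_open:
            raise RuntimeError("already connected")
        self.open = True

    def is_closed(self):
        return not self.open

    def close(self):
        self.open = False


def handle(db, handler):
    """Same shape as the middleware: open lazily, always close."""
    db.connect(reuse_if_open=True)
    try:
        return handler()
    finally:
        if not db.is_closed():
            db.close()


db = FakeDB()
assert handle(db, lambda: "ok") == "ok"
assert db.is_closed()  # closed after a successful request
try:
    handle(db, lambda: 1 / 0)
except ZeroDivisionError:
    pass
assert db.is_closed()  # closed even when the handler raises
```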
View File

@@ -106,7 +106,7 @@ async def patch_award(
     token: str = Depends(oauth2_scheme),
 ):
     if not valid_token(token):
-        logger.warning(f"patch_player - Bad Token: {token}")
+        logger.warning("patch_player - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_award = Award.get_or_none(Award.id == award_id)
@@ -141,7 +141,7 @@ async def patch_award(
 @handle_db_errors
 async def post_award(award_list: AwardList, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"patch_player - Bad Token: {token}")
+        logger.warning("patch_player - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     new_awards = []
@@ -172,7 +172,7 @@ async def post_award(award_list: AwardList, token: str = Depends(oauth2_scheme))
                 status_code=404, detail=f"Team ID {x.team_id} not found"
             )
-        new_awards.append(x.dict())
+        new_awards.append(x.model_dump())
     with db.atomic():
         for batch in chunked(new_awards, 15):
@@ -185,7 +185,7 @@ async def post_award(award_list: AwardList, token: str = Depends(oauth2_scheme))
 @handle_db_errors
 async def delete_award(award_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"patch_player - Bad Token: {token}")
+        logger.warning("patch_player - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_award = Award.get_or_none(Award.id == award_id)

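The recurring change in this and the following routers strips raw token values out of warning logs. As a hypothetical extra safeguard (not part of this PR), a stdlib `logging.Filter` can mask bearer-token-shaped strings before any record reaches a handler, in case a token slips into a log call elsewhere:

```python
# Hypothetical belt-and-braces guard, not code from this repository.
import logging
import re

TOKEN_RE = re.compile(r"(Bearer\s+)\S+", re.IGNORECASE)


class RedactTokens(logging.Filter):
    """Mask anything that looks like 'Bearer <token>' in log messages."""

    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = TOKEN_RE.sub(r"\1[REDACTED]", str(record.msg))
        return True  # never drop the record, only rewrite it


f = RedactTokens()
rec = logging.LogRecord("t", logging.WARNING, __file__, 1, "auth: Bearer abc123", None, None)
f.filter(rec)
assert rec.msg == "auth: Bearer [REDACTED]"
```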
View File

@@ -360,13 +360,15 @@ async def patch_batstats(
     stat_id: int, new_stats: BatStatModel, token: str = Depends(oauth2_scheme)
 ):
     if not valid_token(token):
-        logger.warning(f"patch_batstats - Bad Token: {token}")
+        logger.warning("patch_batstats - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     if BattingStat.get_or_none(BattingStat.id == stat_id) is None:
         raise HTTPException(status_code=404, detail=f"Stat ID {stat_id} not found")
-    BattingStat.update(**new_stats.dict()).where(BattingStat.id == stat_id).execute()
+    BattingStat.update(**new_stats.model_dump()).where(
+        BattingStat.id == stat_id
+    ).execute()
     r_stat = model_to_dict(BattingStat.get_by_id(stat_id))
     return r_stat
@@ -375,7 +377,7 @@ async def patch_batstats(
 @handle_db_errors
 async def post_batstats(s_list: BatStatList, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_batstats - Bad Token: {token}")
+        logger.warning("post_batstats - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     all_stats = []
@@ -403,7 +405,7 @@ async def post_batstats(s_list: BatStatList, token: str = Depends(oauth2_scheme)
                 status_code=404, detail=f"Player ID {x.player_id} not found"
             )
-        all_stats.append(BattingStat(**x.dict()))
+        all_stats.append(BattingStat(**x.model_dump()))
     with db.atomic():
         for batch in chunked(all_stats, 15):

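The `for batch in chunked(all_stats, 15)` idiom above batches bulk writes so each INSERT stays under the backend's bind-parameter limit. peewee ships its own `chunked`; a stdlib re-implementation for illustration:

```python
# Stand-alone equivalent of peewee's chunked() helper, for illustration only.
from itertools import islice


def chunked(iterable, n):
    """Yield successive lists of at most n items from iterable."""
    it = iter(iterable)
    while batch := list(islice(it, n)):
        yield batch


rows = [{"id": i} for i in range(37)]
batches = list(chunked(rows, 15))
assert [len(b) for b in batches] == [15, 15, 7]
```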
View File

@@ -64,7 +64,7 @@ async def patch_current(
     token: str = Depends(oauth2_scheme),
 ):
     if not valid_token(token):
-        logger.warning(f"patch_current - Bad Token: {token}")
+        logger.warning("patch_current - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:
@@ -110,10 +110,10 @@ async def patch_current(
 @handle_db_errors
 async def post_current(new_current: CurrentModel, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"patch_current - Bad Token: {token}")
+        logger.warning("patch_current - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
-    this_current = Current(**new_current.dict())
+    this_current = Current(**new_current.model_dump())
     if this_current.save():
         r_curr = model_to_dict(this_current)
@@ -129,7 +129,7 @@ async def post_current(new_current: CurrentModel, token: str = Depends(oauth2_sc
 @handle_db_errors
 async def delete_current(current_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"patch_current - Bad Token: {token}")
+        logger.warning("patch_current - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     if Current.delete_by_id(current_id) == 1:

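The `.dict()` → `.model_dump()` replacements here and throughout these routers are the Pydantic v2 rename of the same serialization call. If any code path had to run against both major versions during a migration, a small shim (illustrative only, not part of this PR) would keep it working:

```python
# Hypothetical compatibility helper: prefer Pydantic v2's model_dump(),
# fall back to v1's dict() when model_dump is absent.
def to_dict(model):
    dump = getattr(model, "model_dump", None)
    return dump() if callable(dump) else model.dict()


# FakeV2Model stands in for a Pydantic v2 model in this sketch.
class FakeV2Model:
    def model_dump(self):
        return {"a": 1}


assert to_dict(FakeV2Model()) == {"a": 1}
```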
View File

@@ -175,7 +175,7 @@ def delete_custom_command(command_id: int):
 def get_creator_by_discord_id(discord_id: int):
     """Get a creator by Discord ID"""
     creator = CustomCommandCreator.get_or_none(
-        CustomCommandCreator.discord_id == str(discord_id)
+        CustomCommandCreator.discord_id == discord_id
     )
     if creator:
         return model_to_dict(creator)
@@ -375,7 +375,7 @@ async def create_custom_command_endpoint(
 ):
     """Create a new custom command"""
     if not valid_token(token):
-        logger.warning(f"create_custom_command - Bad Token: {token}")
+        logger.warning("create_custom_command - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:
@@ -436,7 +436,7 @@ async def update_custom_command_endpoint(
 ):
     """Update an existing custom command"""
     if not valid_token(token):
-        logger.warning(f"update_custom_command - Bad Token: {token}")
+        logger.warning("update_custom_command - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:
@@ -502,7 +502,7 @@ async def patch_custom_command(
 ):
     """Partially update a custom command"""
     if not valid_token(token):
-        logger.warning(f"patch_custom_command - Bad Token: {token}")
+        logger.warning("patch_custom_command - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:
@@ -578,7 +578,7 @@ async def delete_custom_command_endpoint(
 ):
     """Delete a custom command"""
     if not valid_token(token):
-        logger.warning(f"delete_custom_command - Bad Token: {token}")
+        logger.warning("delete_custom_command - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:
@@ -682,7 +682,7 @@ async def create_creator_endpoint(
 ):
     """Create a new command creator"""
     if not valid_token(token):
-        logger.warning(f"create_creator - Bad Token: {token}")
+        logger.warning("create_creator - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:
@@ -914,7 +914,7 @@ async def execute_custom_command(
 ):
     """Execute a custom command and update usage statistics"""
     if not valid_token(token):
-        logger.warning(f"execute_custom_command - Bad Token: {token}")
+        logger.warning("execute_custom_command - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:

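The `str(discord_id)` removal above pairs with the model change from `CharField` to `BigIntegerField` for `discord_id`. That is safe capacity-wise: a Discord snowflake packs a millisecond timestamp (epoch 2015-01-01) into its upper bits, shifted left by 22, so present-day IDs fit comfortably in a signed 64-bit column. A quick stdlib check (the sample ID below is made up for the sketch):

```python
DISCORD_EPOCH_MS = 1_420_070_400_000  # 2015-01-01T00:00:00Z in Unix milliseconds


def snowflake_timestamp_ms(snowflake: int) -> int:
    """Recover the creation time (ms since Unix epoch) encoded in a snowflake."""
    return (snowflake >> 22) + DISCORD_EPOCH_MS


# A present-day, 19-digit snowflake still fits in a signed 64-bit integer.
example_id = 1_290_000_000_000_000_000
assert example_id < 2**63
assert snowflake_timestamp_ms(1 << 22) == DISCORD_EPOCH_MS + 1
```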
View File

@@ -162,7 +162,7 @@ async def patch_decision(
     token: str = Depends(oauth2_scheme),
 ):
     if not valid_token(token):
-        logger.warning(f"patch_decision - Bad Token: {token}")
+        logger.warning("patch_decision - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_dec = Decision.get_or_none(Decision.id == decision_id)
@@ -203,7 +203,7 @@ async def patch_decision(
 @handle_db_errors
 async def post_decisions(dec_list: DecisionList, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_decisions - Bad Token: {token}")
+        logger.warning("post_decisions - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     new_dec = []
@@ -217,7 +217,7 @@ async def post_decisions(dec_list: DecisionList, token: str = Depends(oauth2_sch
                 status_code=404, detail=f"Player ID {x.pitcher_id} not found"
             )
-        new_dec.append(x.dict())
+        new_dec.append(x.model_dump())
     with db.atomic():
         for batch in chunked(new_dec, 10):
@@ -230,7 +230,7 @@ async def post_decisions(dec_list: DecisionList, token: str = Depends(oauth2_sch
 @handle_db_errors
 async def delete_decision(decision_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_decision - Bad Token: {token}")
+        logger.warning("delete_decision - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_dec = Decision.get_or_none(Decision.id == decision_id)
@@ -253,7 +253,7 @@ async def delete_decision(decision_id: int, token: str = Depends(oauth2_scheme))
 @handle_db_errors
 async def delete_decisions_game(game_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_decisions_game - Bad Token: {token}")
+        logger.warning("delete_decisions_game - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_game = StratGame.get_or_none(StratGame.id == game_id)

View File

@@ -82,7 +82,7 @@ async def patch_division(
     token: str = Depends(oauth2_scheme),
 ):
     if not valid_token(token):
-        logger.warning(f"patch_division - Bad Token: {token}")
+        logger.warning("patch_division - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_div = Division.get_or_none(Division.id == division_id)
@@ -115,10 +115,10 @@ async def post_division(
     new_division: DivisionModel, token: str = Depends(oauth2_scheme)
 ):
     if not valid_token(token):
-        logger.warning(f"post_division - Bad Token: {token}")
+        logger.warning("post_division - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
-    this_division = Division(**new_division.dict())
+    this_division = Division(**new_division.model_dump())
     if this_division.save() == 1:
         r_division = model_to_dict(this_division)
@@ -131,7 +131,7 @@ async def post_division(
 @handle_db_errors
 async def delete_division(division_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_division - Bad Token: {token}")
+        logger.warning("delete_division - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_div = Division.get_or_none(Division.id == division_id)

View File

@@ -44,7 +44,7 @@ async def patch_draftdata(
     pick_deadline: Optional[datetime.datetime] = None, result_channel: Optional[int] = None,
     ping_channel: Optional[int] = None, pick_minutes: Optional[int] = None, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f'patch_draftdata - Bad Token: {token}')
+        logger.warning('patch_draftdata - Bad Token')
         raise HTTPException(status_code=401, detail='Unauthorized')
     draft_data = DraftData.get_or_none(DraftData.id == data_id)

View File

@@ -40,7 +40,7 @@ async def get_draftlist(
     offset: int = Query(default=0, ge=0),
 ):
     if not valid_token(token):
-        logger.warning(f"get_draftlist - Bad Token: {token}")
+        logger.warning("get_draftlist - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     all_list = DraftList.select()
@@ -62,7 +62,7 @@ async def get_draftlist(
 @handle_db_errors
 async def get_team_draftlist(team_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_draftlist - Bad Token: {token}")
+        logger.warning("post_draftlist - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_team = Team.get_or_none(Team.id == team_id)
@@ -84,7 +84,7 @@ async def post_draftlist(
     draft_list: DraftListList, token: str = Depends(oauth2_scheme)
 ):
     if not valid_token(token):
-        logger.warning(f"post_draftlist - Bad Token: {token}")
+        logger.warning("post_draftlist - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     new_list = []
@@ -98,7 +98,7 @@ async def post_draftlist(
     DraftList.delete().where(DraftList.team == this_team).execute()
     for x in draft_list.draft_list:
-        new_list.append(x.dict())
+        new_list.append(x.model_dump())
     with db.atomic():
         for batch in chunked(new_list, 15):
@@ -111,7 +111,7 @@ async def post_draftlist(
 @handle_db_errors
 async def delete_draftlist(team_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_draftlist - Bad Token: {token}")
+        logger.warning("delete_draftlist - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     count = DraftList.delete().where(DraftList.team_id == team_id).execute()

View File

@@ -144,13 +144,13 @@ async def patch_pick(
     pick_id: int, new_pick: DraftPickModel, token: str = Depends(oauth2_scheme)
 ):
     if not valid_token(token):
-        logger.warning(f"patch_pick - Bad Token: {token}")
+        logger.warning("patch_pick - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     if DraftPick.get_or_none(DraftPick.id == pick_id) is None:
         raise HTTPException(status_code=404, detail=f"Pick ID {pick_id} not found")
-    DraftPick.update(**new_pick.dict()).where(DraftPick.id == pick_id).execute()
+    DraftPick.update(**new_pick.model_dump()).where(DraftPick.id == pick_id).execute()
     r_pick = model_to_dict(DraftPick.get_by_id(pick_id))
     return r_pick
@@ -159,7 +159,7 @@ async def patch_pick(
 @handle_db_errors
 async def post_picks(p_list: DraftPickList, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_picks - Bad Token: {token}")
+        logger.warning("post_picks - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     new_picks = []
@@ -173,7 +173,7 @@ async def post_picks(p_list: DraftPickList, token: str = Depends(oauth2_scheme))
                 detail=f"Pick # {pick.overall} already exists for season {pick.season}",
             )
-        new_picks.append(pick.dict())
+        new_picks.append(pick.model_dump())
     with db.atomic():
         for batch in chunked(new_picks, 15):
@@ -186,7 +186,7 @@ async def post_picks(p_list: DraftPickList, token: str = Depends(oauth2_scheme))
 @handle_db_errors
 async def delete_pick(pick_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_pick - Bad Token: {token}")
+        logger.warning("delete_pick - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_pick = DraftPick.get_or_none(DraftPick.id == pick_id)

View File

@@ -276,5 +276,4 @@ async def get_totalstats(
         }
     )
-    return_stats["count"] = len(return_stats["stats"])
     return return_stats

View File

@@ -147,7 +147,7 @@ async def create_help_command_endpoint(
 ):
     """Create a new help command"""
     if not valid_token(token):
-        logger.warning(f"create_help_command - Bad Token: {token}")
+        logger.warning("create_help_command - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:
@@ -194,7 +194,7 @@ async def update_help_command_endpoint(
 ):
     """Update an existing help command"""
     if not valid_token(token):
-        logger.warning(f"update_help_command - Bad Token: {token}")
+        logger.warning("update_help_command - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:
@@ -243,7 +243,7 @@ async def restore_help_command_endpoint(
 ):
     """Restore a soft-deleted help command"""
     if not valid_token(token):
-        logger.warning(f"restore_help_command - Bad Token: {token}")
+        logger.warning("restore_help_command - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:
@@ -280,7 +280,7 @@ async def delete_help_command_endpoint(
 ):
     """Soft delete a help command"""
     if not valid_token(token):
-        logger.warning(f"delete_help_command - Bad Token: {token}")
+        logger.warning("delete_help_command - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:
@@ -397,7 +397,7 @@ async def get_help_command_by_name_endpoint(
 async def increment_view_count(command_name: str, token: str = Depends(oauth2_scheme)):
     """Increment view count for a help command"""
     if not valid_token(token):
-        logger.warning(f"increment_view_count - Bad Token: {token}")
+        logger.warning("increment_view_count - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:

View File

@@ -86,7 +86,7 @@ async def patch_injury(
     token: str = Depends(oauth2_scheme),
 ):
     if not valid_token(token):
-        logger.warning(f"patch_injury - Bad Token: {token}")
+        logger.warning("patch_injury - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_injury = Injury.get_or_none(Injury.id == injury_id)
@@ -109,10 +109,10 @@ async def patch_injury(
 @handle_db_errors
 async def post_injury(new_injury: InjuryModel, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_injury - Bad Token: {token}")
+        logger.warning("post_injury - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
-    this_injury = Injury(**new_injury.dict())
+    this_injury = Injury(**new_injury.model_dump())
     if this_injury.save():
         r_injury = model_to_dict(this_injury)
@@ -125,7 +125,7 @@ async def post_injury(new_injury: InjuryModel, token: str = Depends(oauth2_schem
 @handle_db_errors
 async def delete_injury(injury_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_injury - Bad Token: {token}")
+        logger.warning("delete_injury - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_injury = Injury.get_or_none(Injury.id == injury_id)

View File

@@ -68,7 +68,7 @@ async def patch_keeper(
     token: str = Depends(oauth2_scheme),
 ):
     if not valid_token(token):
-        logger.warning(f"patch_keeper - Bad Token: {token}")
+        logger.warning("patch_keeper - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_keeper = Keeper.get_or_none(Keeper.id == keeper_id)
@@ -95,12 +95,12 @@ async def patch_keeper(
 @handle_db_errors
 async def post_keepers(k_list: KeeperList, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_keepers - Bad Token: {token}")
+        logger.warning("post_keepers - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     new_keepers = []
     for keeper in k_list.keepers:
-        new_keepers.append(keeper.dict())
+        new_keepers.append(keeper.model_dump())
     with db.atomic():
         for batch in chunked(new_keepers, 14):
@@ -113,7 +113,7 @@ async def post_keepers(k_list: KeeperList, token: str = Depends(oauth2_scheme)):
 @handle_db_errors
 async def delete_keeper(keeper_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_keeper - Bad Token: {token}")
+        logger.warning("delete_keeper - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_keeper = Keeper.get_or_none(Keeper.id == keeper_id)

View File

@@ -109,7 +109,7 @@ async def patch_manager(
     token: str = Depends(oauth2_scheme),
 ):
     if not valid_token(token):
-        logger.warning(f"patch_manager - Bad Token: {token}")
+        logger.warning("patch_manager - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_manager = Manager.get_or_none(Manager.id == manager_id)
@@ -140,10 +140,10 @@ async def patch_manager(
 @handle_db_errors
 async def post_manager(new_manager: ManagerModel, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_manager - Bad Token: {token}")
+        logger.warning("post_manager - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
-    this_manager = Manager(**new_manager.dict())
+    this_manager = Manager(**new_manager.model_dump())
     if this_manager.save():
         r_manager = model_to_dict(this_manager)
@@ -158,7 +158,7 @@ async def post_manager(new_manager: ManagerModel, token: str = Depends(oauth2_sc
 @handle_db_errors
 async def delete_manager(manager_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_manager - Bad Token: {token}")
+        logger.warning("delete_manager - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_manager = Manager.get_or_none(Manager.id == manager_id)

View File

@@ -311,13 +311,15 @@ async def patch_pitstats(
     stat_id: int, new_stats: PitStatModel, token: str = Depends(oauth2_scheme)
 ):
     if not valid_token(token):
-        logger.warning(f"patch_pitstats - Bad Token: {token}")
+        logger.warning("patch_pitstats - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     if PitchingStat.get_or_none(PitchingStat.id == stat_id) is None:
         raise HTTPException(status_code=404, detail=f"Stat ID {stat_id} not found")
-    PitchingStat.update(**new_stats.dict()).where(PitchingStat.id == stat_id).execute()
+    PitchingStat.update(**new_stats.model_dump()).where(
+        PitchingStat.id == stat_id
+    ).execute()
     r_stat = model_to_dict(PitchingStat.get_by_id(stat_id))
     return r_stat
@@ -326,7 +328,7 @@ async def patch_pitstats(
 @handle_db_errors
 async def post_pitstats(s_list: PitStatList, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_pitstats - Bad Token: {token}")
+        logger.warning("post_pitstats - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     all_stats = []
@@ -343,7 +345,7 @@ async def post_pitstats(s_list: PitStatList, token: str = Depends(oauth2_scheme)
                 status_code=404, detail=f"Player ID {x.player_id} not found"
             )
-        all_stats.append(PitchingStat(**x.dict()))
+        all_stats.append(PitchingStat(**x.model_dump()))
     with db.atomic():
         for batch in chunked(all_stats, 15):

View File

@@ -4,7 +4,7 @@ Thin HTTP layer using PlayerService for business logic.
 """
 from fastapi import APIRouter, Query, Response, Depends
-from typing import Optional, List
+from typing import Literal, Optional, List
 from ..dependencies import (
     oauth2_scheme,
@@ -27,8 +27,10 @@ async def get_players(
     pos: list = Query(default=None),
     strat_code: list = Query(default=None),
     is_injured: Optional[bool] = None,
-    sort: Optional[str] = None,
-    limit: Optional[int] = Query(default=None, ge=1),
+    sort: Optional[Literal["cost-asc", "cost-desc", "name-asc", "name-desc"]] = None,
+    limit: Optional[int] = Query(
+        default=None, ge=1, description="Maximum number of results to return"
+    ),
     offset: Optional[int] = Query(
         default=None, ge=0, description="Number of results to skip for pagination"
     ),
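Tightening `sort` from `Optional[str]` to a `Literal` lets FastAPI reject unknown sort keys with a 422 and list the allowed values in the OpenAPI schema. The same check can be sketched with only the stdlib `typing` module (the helper name below is illustrative, not from the codebase):

```python
from typing import Literal, Optional, get_args

SortKey = Literal["cost-asc", "cost-desc", "name-asc", "name-desc"]


def validate_sort(value: Optional[str]) -> Optional[str]:
    """Return the value if it is None or a permitted sort key, else raise."""
    if value is None or value in get_args(SortKey):
        return value
    raise ValueError(f"sort must be one of {get_args(SortKey)}, got {value!r}")


print(validate_sort("cost-asc"))  # cost-asc
```

FastAPI performs this validation automatically for `Literal`-annotated query parameters; the sketch only makes the rule explicit.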


@@ -114,7 +114,7 @@ async def patch_result(
     token: str = Depends(oauth2_scheme),
 ):
     if not valid_token(token):
-        logger.warning(f"patch_player - Bad Token: {token}")
+        logger.warning("patch_player - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_result = Result.get_or_none(Result.id == result_id)
@@ -158,7 +158,7 @@ async def patch_result(
 @handle_db_errors
 async def post_results(result_list: ResultList, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"patch_player - Bad Token: {token}")
+        logger.warning("patch_player - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     new_results = []
@@ -183,7 +183,7 @@ async def post_results(result_list: ResultList, token: str = Depends(oauth2_sche
                 status_code=404, detail=f"Team ID {x.hometeam_id} not found"
             )
-        new_results.append(x.dict())
+        new_results.append(x.model_dump())
     with db.atomic():
         for batch in chunked(new_results, 15):
@@ -196,7 +196,7 @@ async def post_results(result_list: ResultList, token: str = Depends(oauth2_sche
 @handle_db_errors
 async def delete_result(result_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_result - Bad Token: {token}")
+        logger.warning("delete_result - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_result = Result.get_or_none(Result.id == result_id)


@@ -140,7 +140,7 @@ async def patch_player(
     token: str = Depends(oauth2_scheme),
 ):
     if not valid_token(token):
-        logging.warning(f"Bad Token: {token}")
+        logging.warning("Bad Token")
         raise HTTPException(
             status_code=401,
             detail="You are not authorized to patch mlb players. This event has been logged.",
@@ -179,7 +179,7 @@ async def patch_player(
 @handle_db_errors
 async def post_players(players: PlayerList, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logging.warning(f"Bad Token: {token}")
+        logging.warning("Bad Token")
         raise HTTPException(
             status_code=401,
             detail="You are not authorized to post mlb players. This event has been logged.",
@@ -216,7 +216,7 @@ async def post_players(players: PlayerList, token: str = Depends(oauth2_scheme))
 @handle_db_errors
 async def post_one_player(player: SbaPlayerModel, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logging.warning(f"Bad Token: {token}")
+        logging.warning("Bad Token")
         raise HTTPException(
             status_code=401,
             detail="You are not authorized to post mlb players. This event has been logged.",
@@ -236,7 +236,7 @@ async def post_one_player(player: SbaPlayerModel, token: str = Depends(oauth2_sc
             detail=f"{player.first_name} {player.last_name} has a key already in the database",
         )
-    new_player = SbaPlayer(**player.dict())
+    new_player = SbaPlayer(**player.model_dump())
     saved = new_player.save()
     if saved == 1:
         return_val = model_to_dict(new_player)
@@ -252,7 +252,7 @@ async def post_one_player(player: SbaPlayerModel, token: str = Depends(oauth2_sc
 @handle_db_errors
 async def delete_player(player_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logging.warning(f"Bad Token: {token}")
+        logging.warning("Bad Token")
         raise HTTPException(
             status_code=401,
             detail="You are not authorized to delete mlb players. This event has been logged.",


@@ -106,7 +106,7 @@ async def patch_schedule(
     token: str = Depends(oauth2_scheme),
 ):
     if not valid_token(token):
-        logger.warning(f"patch_schedule - Bad Token: {token}")
+        logger.warning("patch_schedule - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_sched = Schedule.get_or_none(Schedule.id == schedule_id)
@@ -143,7 +143,7 @@ async def patch_schedule(
 @handle_db_errors
 async def post_schedules(sched_list: ScheduleList, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_schedules - Bad Token: {token}")
+        logger.warning("post_schedules - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     new_sched = []
@@ -168,7 +168,7 @@ async def post_schedules(sched_list: ScheduleList, token: str = Depends(oauth2_s
                 status_code=404, detail=f"Team ID {x.hometeam_id} not found"
             )
-        new_sched.append(x.dict())
+        new_sched.append(x.model_dump())
     with db.atomic():
         for batch in chunked(new_sched, 15):
@@ -181,7 +181,7 @@ async def post_schedules(sched_list: ScheduleList, token: str = Depends(oauth2_s
 @handle_db_errors
 async def delete_schedule(schedule_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_schedule - Bad Token: {token}")
+        logger.warning("delete_schedule - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_sched = Schedule.get_or_none(Schedule.id == schedule_id)


@@ -87,13 +87,13 @@ async def get_team_standings(team_id: int):
 @router.patch("/{stan_id}", include_in_schema=PRIVATE_IN_SCHEMA)
 @handle_db_errors
 async def patch_standings(
-    stan_id,
+    stan_id: int,
     wins: Optional[int] = None,
     losses: Optional[int] = None,
     token: str = Depends(oauth2_scheme),
 ):
     if not valid_token(token):
-        logger.warning(f"patch_standings - Bad Token: {token}")
+        logger.warning("patch_standings - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     try:
@@ -115,7 +115,7 @@ async def patch_standings(
 @handle_db_errors
 async def post_standings(season: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_standings - Bad Token: {token}")
+        logger.warning("post_standings - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     new_teams = []
@@ -134,7 +134,7 @@ async def post_standings(season: int, token: str = Depends(oauth2_scheme)):
 @handle_db_errors
 async def recalculate_standings(season: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"recalculate_standings - Bad Token: {token}")
+        logger.warning("recalculate_standings - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     code = Standings.recalculate(season)


@@ -156,7 +156,7 @@ async def patch_game(
     scorecard_url: Optional[str] = None,
 ) -> Any:
     if not valid_token(token):
-        logger.warning(f"patch_game - Bad Token: {token}")
+        logger.warning("patch_game - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_game = StratGame.get_or_none(StratGame.id == game_id)
@@ -236,7 +236,7 @@ async def patch_game(
 @handle_db_errors
 async def post_games(game_list: GameList, token: str = Depends(oauth2_scheme)) -> Any:
     if not valid_token(token):
-        logger.warning(f"post_games - Bad Token: {token}")
+        logger.warning("post_games - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     new_games = []
@@ -250,7 +250,7 @@ async def post_games(game_list: GameList, token: str = Depends(oauth2_scheme)) -
                 status_code=404, detail=f"Team ID {x.home_team_id} not found"
             )
-        new_games.append(x.dict())
+        new_games.append(x.model_dump())
     with db.atomic():
         for batch in chunked(new_games, 16):
@@ -263,7 +263,7 @@ async def post_games(game_list: GameList, token: str = Depends(oauth2_scheme)) -
 @handle_db_errors
 async def wipe_game(game_id: int, token: str = Depends(oauth2_scheme)) -> Any:
     if not valid_token(token):
-        logger.warning(f"wipe_game - Bad Token: {token}")
+        logger.warning("wipe_game - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_game = StratGame.get_or_none(StratGame.id == game_id)
@@ -287,7 +287,7 @@ async def wipe_game(game_id: int, token: str = Depends(oauth2_scheme)) -> Any:
 @handle_db_errors
 async def delete_game(game_id: int, token: str = Depends(oauth2_scheme)) -> Any:
     if not valid_token(token):
-        logger.warning(f"delete_game - Bad Token: {token}")
+        logger.warning("delete_game - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_game = StratGame.get_or_none(StratGame.id == game_id)


@@ -17,7 +17,6 @@ from ...dependencies import (
     add_cache_headers,
     cache_result,
     handle_db_errors,
-    MAX_LIMIT,
     DEFAULT_LIMIT,
 )
 from .common import build_season_games
@@ -58,7 +57,7 @@ async def get_batting_totals(
     risp: Optional[bool] = None,
     inning: list = Query(default=None),
     sort: Optional[str] = None,
-    limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
+    limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=1000),
     short_output: Optional[bool] = False,
     page_num: Optional[int] = 1,
     week_start: Optional[int] = None,


@@ -31,13 +31,13 @@ async def patch_play(
     play_id: int, new_play: PlayModel, token: str = Depends(oauth2_scheme)
 ):
     if not valid_token(token):
-        logger.warning(f"patch_play - Bad Token: {token}")
+        logger.warning("patch_play - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     if StratPlay.get_or_none(StratPlay.id == play_id) is None:
         raise HTTPException(status_code=404, detail=f"Play ID {play_id} not found")
-    StratPlay.update(**new_play.dict()).where(StratPlay.id == play_id).execute()
+    StratPlay.update(**new_play.model_dump()).where(StratPlay.id == play_id).execute()
     r_play = model_to_dict(StratPlay.get_by_id(play_id))
     return r_play
@@ -46,7 +46,7 @@ async def patch_play(
 @handle_db_errors
 async def post_plays(p_list: PlayList, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_plays - Bad Token: {token}")
+        logger.warning("post_plays - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     new_plays = []
@@ -84,7 +84,7 @@ async def post_plays(p_list: PlayList, token: str = Depends(oauth2_scheme)):
         if this_play.pa == 0:
             this_play.batter_final = None
-        new_plays.append(this_play.dict())
+        new_plays.append(this_play.model_dump())
     with db.atomic():
         for batch in chunked(new_plays, 20):
@@ -97,7 +97,7 @@ async def post_plays(p_list: PlayList, token: str = Depends(oauth2_scheme)):
 @handle_db_errors
 async def delete_play(play_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_play - Bad Token: {token}")
+        logger.warning("delete_play - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_play = StratPlay.get_or_none(StratPlay.id == play_id)
@@ -118,7 +118,7 @@ async def delete_play(play_id: int, token: str = Depends(oauth2_scheme)):
 @handle_db_errors
 async def delete_plays_game(game_id: int, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_plays_game - Bad Token: {token}")
+        logger.warning("delete_plays_game - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     this_game = StratGame.get_or_none(StratGame.id == game_id)
@@ -139,7 +139,7 @@ async def delete_plays_game(game_id: int, token: str = Depends(oauth2_scheme)):
 @handle_db_errors
 async def post_erun_check(token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"post_erun_check - Bad Token: {token}")
+        logger.warning("post_erun_check - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     all_plays = StratPlay.update(run=1).where(


@@ -17,7 +17,6 @@ from ...dependencies import (
     handle_db_errors,
     add_cache_headers,
     cache_result,
-    MAX_LIMIT,
     DEFAULT_LIMIT,
 )
 from .common import build_season_games
@@ -57,7 +56,7 @@ async def get_fielding_totals(
     team_id: list = Query(default=None),
     manager_id: list = Query(default=None),
     sort: Optional[str] = None,
-    limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
+    limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=1000),
     short_output: Optional[bool] = False,
     page_num: Optional[int] = 1,
 ):


@@ -20,7 +20,6 @@ from ...dependencies import (
     handle_db_errors,
     add_cache_headers,
     cache_result,
-    MAX_LIMIT,
     DEFAULT_LIMIT,
 )
 from .common import build_season_games
@@ -57,7 +56,7 @@ async def get_pitching_totals(
     risp: Optional[bool] = None,
     inning: list = Query(default=None),
     sort: Optional[str] = None,
-    limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
+    limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=1000),
     short_output: Optional[bool] = False,
     csv: Optional[bool] = False,
     page_num: Optional[int] = 1,


@@ -78,14 +78,14 @@ async def get_transactions(
         transactions = transactions.where(Transaction.player << these_players)
     if cancelled:
-        transactions = transactions.where(Transaction.cancelled == 1)
+        transactions = transactions.where(Transaction.cancelled == True)
     else:
-        transactions = transactions.where(Transaction.cancelled == 0)
+        transactions = transactions.where(Transaction.cancelled == False)
     if frozen:
-        transactions = transactions.where(Transaction.frozen == 1)
+        transactions = transactions.where(Transaction.frozen == True)
     else:
-        transactions = transactions.where(Transaction.frozen == 0)
+        transactions = transactions.where(Transaction.frozen == False)
     transactions = transactions.order_by(-Transaction.week, Transaction.moveid)
@@ -113,7 +113,7 @@ async def patch_transactions(
     cancelled: Optional[bool] = None,
 ):
     if not valid_token(token):
-        logger.warning(f"patch_transactions - Bad Token: {token}")
+        logger.warning("patch_transactions - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     these_moves = Transaction.select().where(Transaction.moveid == move_id)
@@ -138,7 +138,7 @@ async def post_transactions(
     moves: TransactionList, token: str = Depends(oauth2_scheme)
 ):
     if not valid_token(token):
-        logger.warning(f"post_transactions - Bad Token: {token}")
+        logger.warning("post_transactions - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     all_moves = []
@@ -172,7 +172,7 @@ async def post_transactions(
                 status_code=404, detail=f"Player ID {x.player_id} not found"
             )
-        all_moves.append(x.dict())
+        all_moves.append(x.model_dump())
     with db.atomic():
         for batch in chunked(all_moves, 15):
@@ -185,7 +185,7 @@ async def post_transactions(
 @handle_db_errors
 async def delete_transactions(move_id, token: str = Depends(oauth2_scheme)):
     if not valid_token(token):
-        logger.warning(f"delete_transactions - Bad Token: {token}")
+        logger.warning("delete_transactions - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     delete_query = Transaction.delete().where(Transaction.moveid == move_id)


@@ -124,7 +124,7 @@ async def refresh_season_batting_stats(
     Useful for full season updates.
     """
     if not valid_token(token):
-        logger.warning(f"refresh_season_batting_stats - Bad Token: {token}")
+        logger.warning("refresh_season_batting_stats - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     logger.info(f"Refreshing all batting stats for season {season}")
@@ -270,7 +270,7 @@ async def refresh_season_pitching_stats(
     Private endpoint - not included in public API documentation.
     """
     if not valid_token(token):
-        logger.warning(f"refresh_season_batting_stats - Bad Token: {token}")
+        logger.warning("refresh_season_batting_stats - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     logger.info(f"Refreshing season {season} pitching stats")
@@ -318,7 +318,7 @@ async def get_admin_cache_stats(token: str = Depends(oauth2_scheme)) -> dict:
     Private endpoint - requires authentication.
     """
     if not valid_token(token):
-        logger.warning(f"get_admin_cache_stats - Bad Token: {token}")
+        logger.warning("get_admin_cache_stats - Bad Token")
         raise HTTPException(status_code=401, detail="Unauthorized")
     logger.info("Getting cache statistics")


@@ -23,9 +23,6 @@ services:
       - LOG_LEVEL=${LOG_LEVEL}
       - API_TOKEN=${API_TOKEN}
       - TZ=${TZ}
-      - WORKERS_PER_CORE=1.5
-      - TIMEOUT=120
-      - GRACEFUL_TIMEOUT=120
       - DATABASE_TYPE=postgresql
       - POSTGRES_HOST=sba_postgres
      - POSTGRES_DB=${SBA_DATABASE}

migrations.py (new file, 88 lines)

@@ -0,0 +1,88 @@
+#!/usr/bin/env python3
+"""Apply pending SQL migrations and record them in schema_versions.
+
+Usage:
+    python migrations.py
+
+Connects to PostgreSQL using the same environment variables as the API:
+    POSTGRES_DB       (default: sba_master)
+    POSTGRES_USER     (default: sba_admin)
+    POSTGRES_PASSWORD (required)
+    POSTGRES_HOST     (default: sba_postgres)
+    POSTGRES_PORT     (default: 5432)
+
+On first run against an existing database, all migrations will be applied.
+All migration files use IF NOT EXISTS guards so re-applying is safe.
+"""
+import os
+import sys
+from pathlib import Path
+
+import psycopg2
+
+MIGRATIONS_DIR = Path(__file__).parent / "migrations"
+
+_CREATE_SCHEMA_VERSIONS = """
+CREATE TABLE IF NOT EXISTS schema_versions (
+    filename VARCHAR(255) PRIMARY KEY,
+    applied_at TIMESTAMP NOT NULL DEFAULT NOW()
+);
+"""
+
+
+def _get_connection():
+    password = os.environ.get("POSTGRES_PASSWORD")
+    if password is None:
+        raise RuntimeError("POSTGRES_PASSWORD environment variable is not set")
+    return psycopg2.connect(
+        dbname=os.environ.get("POSTGRES_DB", "sba_master"),
+        user=os.environ.get("POSTGRES_USER", "sba_admin"),
+        password=password,
+        host=os.environ.get("POSTGRES_HOST", "sba_postgres"),
+        port=int(os.environ.get("POSTGRES_PORT", "5432")),
+    )
+
+
+def main():
+    conn = _get_connection()
+    try:
+        with conn:
+            with conn.cursor() as cur:
+                cur.execute(_CREATE_SCHEMA_VERSIONS)
+        with conn.cursor() as cur:
+            cur.execute("SELECT filename FROM schema_versions")
+            applied = {row[0] for row in cur.fetchall()}
+        migration_files = sorted(MIGRATIONS_DIR.glob("*.sql"))
+        pending = [f for f in migration_files if f.name not in applied]
+        if not pending:
+            print("No pending migrations.")
+            return
+        for migration_file in pending:
+            print(f"Applying {migration_file.name} ...", end=" ", flush=True)
+            sql = migration_file.read_text()
+            with conn:
+                with conn.cursor() as cur:
+                    cur.execute(sql)
+                    cur.execute(
+                        "INSERT INTO schema_versions (filename) VALUES (%s)",
+                        (migration_file.name,),
+                    )
+            print("done")
+        print(f"\nApplied {len(pending)} migration(s).")
+    finally:
+        conn.close()
+
+
+if __name__ == "__main__":
+    try:
+        main()
+    except Exception as e:
+        print(f"Error: {e}", file=sys.stderr)
+        sys.exit(1)
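The core of `main()` is the pending-migration selection: sorted `*.sql` filenames minus the set already recorded in `schema_versions`. That logic can be exercised in isolation (the helper below is an illustrative sketch, not part of the repo):

```python
import tempfile
from pathlib import Path


def pending_migrations(migrations_dir: Path, applied: set) -> list:
    """Sorted *.sql files not yet recorded as applied (mirrors main())."""
    return [f for f in sorted(migrations_dir.glob("*.sql")) if f.name not in applied]


with tempfile.TemporaryDirectory() as tmp:
    d = Path(tmp)
    # Hypothetical migration filenames; non-.sql files are ignored by the glob.
    for name in ("002_fk_indexes.sql", "001_schema_versions.sql", "notes.txt"):
        (d / name).write_text("-- placeholder\n")
    todo = pending_migrations(d, applied={"001_schema_versions.sql"})
    print([f.name for f in todo])  # ['002_fk_indexes.sql']
```

Lexicographic sorting is what guarantees a stable application order, so numeric filename prefixes matter.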


@@ -0,0 +1,9 @@
+-- Migration: Add schema_versions table for migration tracking
+-- Date: 2026-03-27
+-- Description: Creates a table to record which SQL migrations have been applied,
+--              preventing double-application and missed migrations across environments.
+
+CREATE TABLE IF NOT EXISTS schema_versions (
+    filename VARCHAR(255) PRIMARY KEY,
+    applied_at TIMESTAMP NOT NULL DEFAULT NOW()
+);
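The `IF NOT EXISTS` guard is what makes re-running a migration a no-op rather than an error. A quick stand-alone illustration using stdlib `sqlite3`, which accepts the same guard (the `DEFAULT NOW()` of PostgreSQL is swapped for SQLite's `CURRENT_TIMESTAMP` here):

```python
import sqlite3

# Same shape as the schema_versions migration, adapted for SQLite.
ddl = """
CREATE TABLE IF NOT EXISTS schema_versions (
    filename TEXT PRIMARY KEY,
    applied_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
)
"""

conn = sqlite3.connect(":memory:")
conn.execute(ddl)
conn.execute("INSERT INTO schema_versions (filename) VALUES (?)", ("001_init.sql",))
conn.execute(ddl)  # second application is a no-op, not an error
rows = conn.execute("SELECT filename FROM schema_versions").fetchall()
print(rows)  # [('001_init.sql',)]
```

Without the guard, the second `execute(ddl)` would raise "table schema_versions already exists" and existing rows could be at risk from naive DROP-and-recreate workarounds.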


@@ -0,0 +1,24 @@
+-- Migration: Add missing indexes on foreign key columns in stratplay and stratgame
+-- Created: 2026-03-27
+--
+-- PostgreSQL does not auto-index foreign key columns. These tables are the
+-- highest-volume tables in the schema and are filtered/joined on these columns
+-- in batting, pitching, and running stats aggregation and standings recalculation.
+
+-- stratplay: FK join column
+CREATE INDEX IF NOT EXISTS idx_stratplay_game_id ON stratplay(game_id);
+
+-- stratplay: filtered in batting stats aggregation
+CREATE INDEX IF NOT EXISTS idx_stratplay_batter_id ON stratplay(batter_id);
+
+-- stratplay: filtered in pitching stats aggregation
+CREATE INDEX IF NOT EXISTS idx_stratplay_pitcher_id ON stratplay(pitcher_id);
+
+-- stratplay: filtered in running stats
+CREATE INDEX IF NOT EXISTS idx_stratplay_runner_id ON stratplay(runner_id);
+
+-- stratgame: heavily filtered by season
+CREATE INDEX IF NOT EXISTS idx_stratgame_season ON stratgame(season);
+
+-- stratgame: standings recalculation query ordering
+CREATE INDEX IF NOT EXISTS idx_stratgame_season_week_game_num ON stratgame(season, week, game_num);


@@ -569,7 +569,7 @@ class TestGroupBySbaPlayer:
         # Get per-season rows
         r_seasons = requests.get(
             f"{api}/api/v3/plays/batting",
-            params={"group_by": "player", "sbaplayer_id": 1, "limit": 999},
+            params={"group_by": "player", "sbaplayer_id": 1, "limit": 500},
             timeout=15,
         )
         assert r_seasons.status_code == 200