Compare commits


34 Commits

Author SHA1 Message Date
cal
aac4bf50d5 Merge pull request 'chore: switch CI to tag-triggered builds' (#107) from chore/tag-triggered-ci into main
Reviewed-on: #107
2026-04-06 16:59:02 +00:00
Cal Corum
4ad445b0da chore: switch CI to tag-triggered builds
Match the discord bot's CI pattern — trigger on CalVer tag push
instead of branch push/PR. Removes auto-CalVer generation and
simplifies to a single build step.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 16:58:45 +00:00
cal
8d9bbdd7a0 Merge pull request 'fix: increase get_games limit to 1000' (#106) from fix/increase-get-game-limit into main
All checks were successful
Build Docker Image / build (push) Successful in 1m6s
Reviewed-on: #106
2026-04-06 15:30:47 +00:00
cal
c95459fa5d Update app/routers_v3/stratgame.py
All checks were successful
Build Docker Image / build (pull_request) Successful in 4m51s
2026-04-06 14:58:36 +00:00
cal
d809590f0e Merge pull request 'fix: correct column references in season pitching stats SQL' (#105) from fix/pitching-stats-column-name into main
All checks were successful
Build Docker Image / build (push) Successful in 2m11s
2026-04-02 16:57:30 +00:00
cal
0d8e666a75 Merge pull request 'fix: let HTTPException pass through @handle_db_errors' (#104) from fix/handle-db-errors-passthrough-http into main
Some checks failed
Build Docker Image / build (push) Has been cancelled
2026-04-02 16:57:12 +00:00
Cal Corum
bd19b7d913 fix: correct column references in season pitching stats view
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m4s
sp.on_first/on_second/on_third don't exist — the actual columns are
on_first_id/on_second_id/on_third_id. This caused failures when
updating season pitching stats after games.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-02 11:54:56 -05:00
Cal Corum
c49f91cc19 test: update test_get_nonexistent_play to expect 404 after HTTPException fix
All checks were successful
Build Docker Image / build (pull_request) Successful in 1m3s
After handle_db_errors no longer catches HTTPException, GET /plays/999999999
correctly returns 404 instead of 500. Update the assertion and docstring
to reflect the fixed behavior.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-02 09:30:39 -05:00
Cal Corum
215085b326 fix: let HTTPException pass through @handle_db_errors unchanged
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m34s
The decorator was catching all exceptions including intentional
HTTPException (401, 404, etc.) and re-wrapping them as 500 "Database
error". This masked auth failures and other deliberate HTTP errors.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-02 08:30:22 -05:00
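The pass-through fix above can be sketched as follows. This is a minimal synchronous stand-in: the real decorator is async and `HTTPException` comes from FastAPI; both are simplified here, and `get_play` is a hypothetical endpoint body used only to exercise the decorator.

```python
import functools

class HTTPException(Exception):
    """Minimal stand-in for fastapi.HTTPException, for illustration only."""
    def __init__(self, status_code, detail=""):
        self.status_code = status_code
        self.detail = detail

def handle_db_errors(func):
    # Sketch of the fixed decorator: re-raise HTTPException before the
    # catch-all so intentional 401/404 responses are not rewrapped as 500.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except HTTPException:
            raise  # deliberate HTTP errors pass through unchanged
        except Exception as e:
            raise HTTPException(500, f"Database error: {e}")
    return wrapper

@handle_db_errors
def get_play(play_id):
    # Hypothetical handler: very large IDs do not exist in this sketch.
    if play_id > 10**6:
        raise HTTPException(404, "play not found")
    return {"id": play_id}
```

With the `except HTTPException` clause removed, the 404 would instead surface as a 500 "Database error", which is exactly the masking described in the commit.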
cal
c063f5c4ef Merge pull request 'hotfix: remove output caps from GET /players' (#103) from hotfix/remove-players-output-caps into main
All checks were successful
Build Docker Image / build (push) Successful in 1m3s
2026-04-02 01:19:51 +00:00
Cal Corum
d92f571960 hotfix: remove output caps from GET /players endpoint
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m29s
The MAX_LIMIT/DEFAULT_LIMIT caps added in 16f3f8d are too restrictive
for the /players endpoint — bot and website consumers need full player
lists without pagination. Reverts limit param to Optional[int] with no
ceiling while keeping caps on all other endpoints.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-01 20:14:35 -05:00
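The reverted behavior can be sketched with a hypothetical helper (pure Python; in the real endpoint the limit is applied in the query layer, not on a list):

```python
from typing import Optional

def apply_limit(rows: list, limit: Optional[int] = None) -> list:
    # Sketch of /players after the hotfix: no default and no ceiling,
    # so omitting limit returns every row instead of a capped page.
    if limit is None:
        return rows
    return rows[:limit]
```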
cal
81baa54681 Merge pull request 'Fix unbounded API queries causing worker timeouts' (#99) from bugfix/limit-caps into main
All checks were successful
Build Docker Image / build (push) Successful in 1m9s
Reviewed-on: #99
2026-04-01 22:44:38 +00:00
Cal Corum
67e87a893a Fix fieldingstats count computed after limit applied
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m9s
Capture total_count before .limit() so the response count reflects
all matching rows, not just the capped page size. Resolves #100.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-01 17:40:02 -05:00
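The ordering fix above reduces to "count before you slice". A hypothetical in-memory sketch of the response shape (the real code calls `query.count()` before `.limit()` on the database query):

```python
def paged_response(rows, limit):
    # Capture the total before slicing so `count` reflects all matching
    # rows, not the capped page size (the bug this commit fixes).
    total_count = len(rows)   # analogous to query.count() before .limit()
    page = rows[:limit]       # analogous to query.limit(limit)
    return {"count": total_count, "results": page}
```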
Cal Corum
16f3f8d8de Fix unbounded API queries causing Gunicorn worker timeouts
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m32s
Add MAX_LIMIT=500 cap across all list endpoints, empty string
stripping middleware, and limit/offset to /transactions. Resolves #98.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-01 17:23:25 -05:00
cal
a1fa54c416 Merge pull request 'fix: remove hardcoded fallback password from DB connection' (#84) from fix/remove-default-db-password into main
All checks were successful
Build Docker Image / build (push) Successful in 2m46s
2026-03-28 07:26:55 +00:00
Cal Corum
c451e02c52 fix: remove hardcoded fallback password from PostgreSQL connection
All checks were successful
Build Docker Image / build (pull_request) Successful in 18m25s
Raise RuntimeError on startup if POSTGRES_PASSWORD env var is not set,
instead of silently falling back to a known password in source code.

Closes #C2 from postgres migration review.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 23:15:07 -05:00
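The fail-fast check can be sketched like this (`require_db_password` is a hypothetical name; the real code raises at module import time when building the connection):

```python
import os

def require_db_password() -> str:
    # No silent fallback: a missing POSTGRES_PASSWORD aborts startup
    # instead of connecting with a known password baked into the source.
    password = os.environ.get("POSTGRES_PASSWORD")
    if not password:
        raise RuntimeError("POSTGRES_PASSWORD environment variable is not set")
    return password
```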
cal
da679b6d1a Merge pull request 'Release: merge next-release into main' (#64) from next-release into main
All checks were successful
Build Docker Image / build (push) Successful in 1m6s
Reviewed-on: #64
2026-03-17 21:43:36 +00:00
Cal Corum
697152808b fix: validate sort_by parameter with Literal type in views.py (#36)
All checks were successful
Build Docker Image / build (pull_request) Successful in 4m13s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:29:58 -05:00
Cal Corum
c40426d175 fix: remove unimplementable skipped caching tests (#33)
The three skipped tests in TestPlayerServiceCache required caching
in get_players() (read-through cache) and cache propagation through
the cls() pattern in write methods — neither is implemented and the
architecture does not support it without significant refactoring.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:29:58 -05:00
Cal Corum
95ff5eeaf9 fix: replace print(req.scope) with logger.debug in /api/docs (#21)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:29:58 -05:00
Cal Corum
a0d5d49724 fix: address review feedback (#52)
Guard bulk ID queries against empty lists to prevent PostgreSQL
syntax error (WHERE id IN ()) when batch POST endpoints receive
empty request bodies.

Affected endpoints:
- POST /api/v3/transactions
- POST /api/v3/results
- POST /api/v3/schedules
- POST /api/v3/battingstats

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:29:58 -05:00
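The guard above can be sketched as a check before the SQL is built (table and column names here are illustrative, not the actual schema):

```python
def bulk_id_sql(ids):
    # An empty ID list must never reach the SQL builder, because
    # "WHERE id IN ()" is a PostgreSQL syntax error.
    if not ids:
        return None  # caller treats this as "no rows to validate"
    placeholders = ", ".join(["%s"] * len(ids))
    return f"SELECT id FROM team WHERE id IN ({placeholders})"
```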
Cal Corum
b0fd1d89ea fix: eliminate N+1 queries in batch POST endpoints (#25)
Replace per-row Team/Player lookups with bulk IN-list queries before
the validation loop in post_transactions, post_results, post_schedules,
and post_batstats. A 50-move batch now uses 2 queries instead of 150.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:29:58 -05:00
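The bulk pattern can be sketched as: collect every referenced ID, issue one lookup per table, then validate against the resulting maps. All names here (`validate_moves`, the fetch callables) are hypothetical stand-ins for the real endpoint code.

```python
def validate_moves(moves, fetch_teams, fetch_players):
    # Two bulk lookups up front replace one Team + one Player query per
    # row inside the validation loop (the N+1 this commit eliminates).
    team_ids = {m["team_id"] for m in moves}
    player_ids = {m["player_id"] for m in moves}
    teams = fetch_teams(team_ids)        # one IN-list query
    players = fetch_players(player_ids)  # one IN-list query
    # Rows referencing unknown IDs fail validation.
    return [m for m in moves
            if m["team_id"] not in teams or m["player_id"] not in players]
```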
Cal Corum
5ac9cce7f0 fix: replace bare except: with except Exception: (#29)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:29:21 -05:00
Cal Corum
0e132e602f fix: remove unused imports in standings.py and pitchingstats.py (#30)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:28:25 -05:00
Cal Corum
d92bb263f1 fix: invalidate cache after PlayerService write operations (#32)
Add finally blocks to update_player, patch_player, create_players, and
delete_player in PlayerService to call invalidate_related_cache() using
the existing cache_patterns. Matches the pattern already used in
TeamService.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:28:07 -05:00
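The `finally` pattern can be sketched as below (a toy write method: `store` stands in for the database, `invalidate` for `invalidate_related_cache()`):

```python
def update_player(player_id, data, store, invalidate):
    # Invalidate in a finally block so the cache is cleared even when
    # the write raises, matching the TeamService pattern cited above.
    try:
        store[player_id] = data
        return data
    finally:
        invalidate(f"players:{player_id}*")
```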
Cal Corum
9558da6ace fix: remove empty WEEK_NUMS dict from db_engine.py (#34)
Dead code - module-level constant defined but never referenced.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:28:07 -05:00
Cal Corum
6d1b0ac747 perf: push limit/offset to DB in PlayerService.get_players (#37)
Apply .offset() and .limit() on the Peewee query before materializing
results, instead of fetching all rows into memory and slicing in Python.
Total count is obtained via query.count() before pagination is applied.
In-memory (mock) queries continue to use Python-level slicing.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:28:07 -05:00
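The pushdown can be sketched against an in-memory stand-in for a Peewee query (`FakeQuery` and `get_players_page` are illustrative names, not the real API surface):

```python
class FakeQuery:
    """In-memory stand-in for a Peewee query, for illustration only."""
    def __init__(self, rows):
        self.rows = rows
    def count(self):
        return len(self.rows)
    def offset(self, n):
        return FakeQuery(self.rows[n:])
    def limit(self, n):
        return FakeQuery(self.rows[:n])
    def __iter__(self):
        return iter(self.rows)

def get_players_page(query, limit, offset):
    # Count first, then let the database paginate, instead of
    # materializing every row and slicing the list in Python.
    total = query.count()
    rows = list(query.offset(offset).limit(limit))
    return total, rows
```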
Cal Corum
a351010c3c fix: calculate lob_2outs and rbipercent in SeasonPitchingStats (#28)
Both fields were hardcoded to 0.0 in the INSERT. Added SQL expressions
to the pitching_stats CTE to calculate them from stratplay data, using
the same logic as the batting stats endpoint.

- lob_2outs: count of runners stranded when pitcher recorded the 3rd out
- rbipercent: RBI allowed (excluding HR) per runner opportunity

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:28:07 -05:00
cal
40a71c6f90 Merge pull request 'chore: pin all Python dependency versions in requirements.txt (#62)' (#63) from ai/major-domo-database-62 into main
All checks were successful
Build Docker Image / build (push) Successful in 1m8s
Reviewed-on: #63
2026-03-10 14:04:20 +00:00
Cal Corum
d076b7604c chore: pin all Python dependency versions in requirements.txt (#62)
All checks were successful
Build Docker Image / build (pull_request) Successful in 3m3s
- Pin all direct dependencies to exact versions captured from production
  via `docker exec sba_db_api pip freeze`
- Explicitly pin starlette==0.52.1 (root cause of 2026-03-09 outage)
- Move pytest/pytest-asyncio to new requirements-dev.txt

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 00:32:41 -05:00
cal
43a6ed1c74 Merge pull request 'fix: standardize all collection POST routes to use trailing slash' (#61) from fix/standardize-post-trailing-slashes into main
All checks were successful
Build Docker Image / build (push) Successful in 58s
Reviewed-on: #61
2026-03-10 00:41:19 +00:00
Cal Corum
9ec69f9f2c fix: standardize all collection POST routes to use trailing slash
All checks were successful
Build Docker Image / build (pull_request) Successful in 3m38s
aiohttp follows 307 redirects but converts POST to GET, silently
dropping the request body. Standardize all @router.post('') to
@router.post('/') so the canonical URL always has a trailing slash,
preventing 307 redirects when clients POST with trailing slashes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-09 19:34:28 -05:00
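The redirect hazard can be sketched with a toy router (paths are examples from this repo's API; the dispatch logic is a simplification of Starlette's slash-redirect behavior):

```python
# Routes registered with a trailing slash, per the commit above.
ROUTES = {"/api/v3/transactions/", "/api/v3/results/"}

def dispatch(path: str) -> int:
    # Exact match is served directly; a miss that differs only by the
    # trailing slash gets a 307 redirect, which aiohttp follows as a
    # GET, silently dropping the POST body.
    if path in ROUTES:
        return 200
    if path + "/" in ROUTES:
        return 307
    return 404
```

With `@router.post('/')` the canonical URL carries the slash, so clients that POST with trailing slashes hit the route directly and never see the 307.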
cal
bc36970aeb Merge pull request 'fix: case-insensitive team_abbrev filter in transactions endpoint' (#60) from fix/transaction-team-filter-case-insensitive into main
All checks were successful
Build Docker Image / build (push) Successful in 54s
Reviewed-on: #60
2026-03-08 22:37:45 +00:00
Cal Corum
6af278d737 fix: case-insensitive team_abbrev filter in transactions endpoint
All checks were successful
Build Docker Image / build (pull_request) Successful in 2m7s
The team_abbrev filter uppercased input but compared against mixed-case
DB values (e.g. "MKEMiL"), causing affiliate roster transactions to be
silently dropped from results. Use fn.UPPER() on the DB column to match.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-08 17:35:12 -05:00
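The comparison fix reduces to uppercasing both sides. A pure-Python sketch (the real query applies `fn.UPPER()` to the DB column so the work happens in PostgreSQL):

```python
def match_team(db_value: str, team_abbrev: str) -> bool:
    # Analogous to: fn.UPPER(Transaction.team_abbrev) == team_abbrev.upper()
    return db_value.upper() == team_abbrev.upper()
```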
38 changed files with 3967 additions and 2544 deletions

View File

@@ -1,20 +1,18 @@
# Gitea Actions: Docker Build, Push, and Notify
#
# CI/CD pipeline for Major Domo Database API:
# - Builds Docker images on every push/PR
# - Auto-generates CalVer version (YYYY.MM.BUILD) on main branch merges
# - Pushes to Docker Hub and creates git tag on main
# - Triggered by pushing a CalVer tag (e.g., 2026.4.5)
# - Builds Docker image and pushes to Docker Hub with version + latest tags
# - Sends Discord notifications on success/failure
#
# To release: git tag -a 2026.4.5 -m "description" && git push origin 2026.4.5
name: Build Docker Image
on:
push:
branches:
- main
pull_request:
branches:
- main
tags:
- '20*' # matches CalVer tags like 2026.4.5
jobs:
build:
@@ -24,7 +22,16 @@ jobs:
- name: Checkout code
uses: https://github.com/actions/checkout@v4
with:
fetch-depth: 0 # Full history for tag counting
fetch-depth: 0
- name: Extract version from tag
id: version
run: |
VERSION=${GITHUB_REF#refs/tags/}
SHA_SHORT=$(git rev-parse --short HEAD)
echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "sha_short=$SHA_SHORT" >> $GITHUB_OUTPUT
echo "timestamp=$(date -u +%Y-%m-%dT%H:%M:%SZ)" >> $GITHUB_OUTPUT
- name: Set up Docker Buildx
uses: https://github.com/docker/setup-buildx-action@v3
@@ -35,80 +42,47 @@ jobs:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Generate CalVer version
id: calver
uses: cal/gitea-actions/calver@main
# Dev build: push with dev + dev-SHA tags (PR/feature branches)
- name: Build Docker image (dev)
if: github.ref != 'refs/heads/main'
uses: https://github.com/docker/build-push-action@v5
with:
context: .
push: true
tags: |
manticorum67/major-domo-database:dev
manticorum67/major-domo-database:dev-${{ steps.calver.outputs.sha_short }}
cache-from: type=registry,ref=manticorum67/major-domo-database:buildcache
cache-to: type=registry,ref=manticorum67/major-domo-database:buildcache,mode=max
# Production build: push with latest + CalVer tags (main only)
- name: Build Docker image (production)
if: github.ref == 'refs/heads/main'
- name: Build and push Docker image
uses: https://github.com/docker/build-push-action@v5
with:
context: .
push: true
tags: |
manticorum67/major-domo-database:${{ steps.version.outputs.version }}
manticorum67/major-domo-database:latest
manticorum67/major-domo-database:${{ steps.calver.outputs.version }}
manticorum67/major-domo-database:${{ steps.calver.outputs.version_sha }}
cache-from: type=registry,ref=manticorum67/major-domo-database:buildcache
cache-to: type=registry,ref=manticorum67/major-domo-database:buildcache,mode=max
- name: Tag release
if: success() && github.ref == 'refs/heads/main'
uses: cal/gitea-actions/gitea-tag@main
with:
version: ${{ steps.calver.outputs.version }}
token: ${{ github.token }}
- name: Build Summary
run: |
echo "## Docker Build Successful" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "**Version:** \`${{ steps.version.outputs.version }}\`" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "**Image Tags:**" >> $GITHUB_STEP_SUMMARY
echo "- \`manticorum67/major-domo-database:${{ steps.version.outputs.version }}\`" >> $GITHUB_STEP_SUMMARY
echo "- \`manticorum67/major-domo-database:latest\`" >> $GITHUB_STEP_SUMMARY
echo "- \`manticorum67/major-domo-database:${{ steps.calver.outputs.version }}\`" >> $GITHUB_STEP_SUMMARY
echo "- \`manticorum67/major-domo-database:${{ steps.calver.outputs.version_sha }}\`" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "**Build Details:**" >> $GITHUB_STEP_SUMMARY
echo "- Branch: \`${{ steps.calver.outputs.branch }}\`" >> $GITHUB_STEP_SUMMARY
echo "- Commit: \`${{ github.sha }}\`" >> $GITHUB_STEP_SUMMARY
echo "- Timestamp: \`${{ steps.calver.outputs.timestamp }}\`" >> $GITHUB_STEP_SUMMARY
echo "- Commit: \`${{ steps.version.outputs.sha_short }}\`" >> $GITHUB_STEP_SUMMARY
echo "- Timestamp: \`${{ steps.version.outputs.timestamp }}\`" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
if [ "${{ github.ref }}" == "refs/heads/main" ]; then
echo "Pushed to Docker Hub!" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "Pull with: \`docker pull manticorum67/major-domo-database:latest\`" >> $GITHUB_STEP_SUMMARY
else
echo "_PR build - image not pushed to Docker Hub_" >> $GITHUB_STEP_SUMMARY
fi
echo "Pull with: \`docker pull manticorum67/major-domo-database:${{ steps.version.outputs.version }}\`" >> $GITHUB_STEP_SUMMARY
- name: Discord Notification - Success
if: success() && github.ref == 'refs/heads/main'
if: success()
uses: cal/gitea-actions/discord-notify@main
with:
webhook_url: ${{ secrets.DISCORD_WEBHOOK }}
title: "Major Domo Database"
status: success
version: ${{ steps.calver.outputs.version }}
image_tag: ${{ steps.calver.outputs.version_sha }}
commit_sha: ${{ steps.calver.outputs.sha_short }}
timestamp: ${{ steps.calver.outputs.timestamp }}
version: ${{ steps.version.outputs.version }}
image_tag: ${{ steps.version.outputs.version }}
commit_sha: ${{ steps.version.outputs.sha_short }}
timestamp: ${{ steps.version.outputs.timestamp }}
- name: Discord Notification - Failure
if: failure() && github.ref == 'refs/heads/main'
if: failure()
uses: cal/gitea-actions/discord-notify@main
with:
webhook_url: ${{ secrets.DISCORD_WEBHOOK }}

View File

@@ -40,7 +40,7 @@ python migrations.py # Run migrations (SQL files in migrat
- **Bot container**: `dev_sba_postgres` (PostgreSQL) + `dev_sba_db_api` (API) — check with `docker ps`
- **Image**: `manticorum67/major-domo-database:dev` (Docker Hub)
- **CI/CD**: Gitea Actions on PR to `main` — builds Docker image, auto-generates CalVer version (`YYYY.MM.BUILD`) on merge
- **CI/CD**: Gitea Actions — tag-triggered Docker builds. Push a CalVer tag to release: `git tag -a 2026.4.5 -m "description" && git push origin 2026.4.5`
## Important

File diff suppressed because it is too large

View File

@@ -11,8 +11,8 @@ from fastapi import HTTPException, Response
from fastapi.security import OAuth2PasswordBearer
from redis import Redis
date = f'{datetime.datetime.now().year}-{datetime.datetime.now().month}-{datetime.datetime.now().day}'
logger = logging.getLogger('discord_app')
date = f"{datetime.datetime.now().year}-{datetime.datetime.now().month}-{datetime.datetime.now().day}"
logger = logging.getLogger("discord_app")
# date = f'{datetime.datetime.now().year}-{datetime.datetime.now().month}-{datetime.datetime.now().day}'
# log_level = logger.info if os.environ.get('LOG_LEVEL') == 'INFO' else 'WARN'
@@ -23,10 +23,10 @@ logger = logging.getLogger('discord_app')
# )
# Redis configuration
REDIS_HOST = os.environ.get('REDIS_HOST', 'localhost')
REDIS_PORT = int(os.environ.get('REDIS_PORT', '6379'))
REDIS_DB = int(os.environ.get('REDIS_DB', '0'))
CACHE_ENABLED = os.environ.get('CACHE_ENABLED', 'true').lower() == 'true'
REDIS_HOST = os.environ.get("REDIS_HOST", "localhost")
REDIS_PORT = int(os.environ.get("REDIS_PORT", "6379"))
REDIS_DB = int(os.environ.get("REDIS_DB", "0"))
CACHE_ENABLED = os.environ.get("CACHE_ENABLED", "true").lower() == "true"
# Initialize Redis client with connection error handling
if not CACHE_ENABLED:
@@ -40,7 +40,7 @@ else:
db=REDIS_DB,
decode_responses=True,
socket_connect_timeout=5,
socket_timeout=5
socket_timeout=5,
)
# Test connection
redis_client.ping()
@@ -50,12 +50,19 @@ else:
redis_client = None
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")
priv_help = False if not os.environ.get('PRIVATE_IN_SCHEMA') else os.environ.get('PRIVATE_IN_SCHEMA').upper()
PRIVATE_IN_SCHEMA = True if priv_help == 'TRUE' else False
priv_help = (
False
if not os.environ.get("PRIVATE_IN_SCHEMA")
else os.environ.get("PRIVATE_IN_SCHEMA").upper()
)
PRIVATE_IN_SCHEMA = True if priv_help == "TRUE" else False
MAX_LIMIT = 500
DEFAULT_LIMIT = 200
def valid_token(token):
return token == os.environ.get('API_TOKEN')
return token == os.environ.get("API_TOKEN")
def update_season_batting_stats(player_ids, season, db_connection):
@@ -63,17 +70,19 @@ def update_season_batting_stats(player_ids, season, db_connection):
Update season batting stats for specific players in a given season.
Recalculates stats from stratplay data and upserts into seasonbattingstats table.
"""
if not player_ids:
logger.warning("update_season_batting_stats called with empty player_ids list")
return
# Convert single player_id to list for consistency
if isinstance(player_ids, int):
player_ids = [player_ids]
logger.info(f"Updating season batting stats for {len(player_ids)} players in season {season}")
logger.info(
f"Updating season batting stats for {len(player_ids)} players in season {season}"
)
try:
# SQL query to recalculate and upsert batting stats
query = """
@@ -217,12 +226,14 @@ def update_season_batting_stats(player_ids, season, db_connection):
sb = EXCLUDED.sb,
cs = EXCLUDED.cs;
"""
# Execute the query with parameters using the passed database connection
db_connection.execute_sql(query, [season, player_ids, season, player_ids])
logger.info(f"Successfully updated season batting stats for {len(player_ids)} players in season {season}")
logger.info(
f"Successfully updated season batting stats for {len(player_ids)} players in season {season}"
)
except Exception as e:
logger.error(f"Error updating season batting stats: {e}")
raise
@@ -233,17 +244,19 @@ def update_season_pitching_stats(player_ids, season, db_connection):
Update season pitching stats for specific players in a given season.
Recalculates stats from stratplay and decision data and upserts into seasonpitchingstats table.
"""
if not player_ids:
logger.warning("update_season_pitching_stats called with empty player_ids list")
return
# Convert single player_id to list for consistency
if isinstance(player_ids, int):
player_ids = [player_ids]
logger.info(f"Updating season pitching stats for {len(player_ids)} players in season {season}")
logger.info(
f"Updating season pitching stats for {len(player_ids)} players in season {season}"
)
try:
# SQL query to recalculate and upsert pitching stats
query = """
@@ -357,8 +370,28 @@ def update_season_pitching_stats(player_ids, season, db_connection):
WHEN SUM(sp.bb) > 0
THEN ROUND(SUM(sp.so)::DECIMAL / SUM(sp.bb), 2)
ELSE 0.0
END AS kperbb
END AS kperbb,
-- Runners left on base when pitcher recorded the 3rd out
SUM(CASE WHEN sp.on_first_final IS NOT NULL AND sp.on_first_final != 4 AND sp.starting_outs + sp.outs = 3 THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_second_final IS NOT NULL AND sp.on_second_final != 4 AND sp.starting_outs + sp.outs = 3 THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_third_final IS NOT NULL AND sp.on_third_final != 4 AND sp.starting_outs + sp.outs = 3 THEN 1 ELSE 0 END) AS lob_2outs,
-- RBI allowed (excluding HR) per runner opportunity
CASE
WHEN (SUM(CASE WHEN sp.on_first_id IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_second_id IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_third_id IS NOT NULL THEN 1 ELSE 0 END)) > 0
THEN ROUND(
(SUM(sp.rbi) - SUM(sp.homerun))::DECIMAL /
(SUM(CASE WHEN sp.on_first_id IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_second_id IS NOT NULL THEN 1 ELSE 0 END) +
SUM(CASE WHEN sp.on_third_id IS NOT NULL THEN 1 ELSE 0 END)),
3
)
ELSE 0.000
END AS rbipercent
FROM stratplay sp
JOIN stratgame sg ON sg.id = sp.game_id
JOIN player p ON p.id = sp.pitcher_id
@@ -402,7 +435,7 @@ def update_season_pitching_stats(player_ids, season, db_connection):
ps.bphr, ps.bpfo, ps.bp1b, ps.bplo, ps.wp, ps.balk,
ps.wpa * -1, ps.era, ps.whip, ps.avg, ps.obp, ps.slg, ps.ops, ps.woba,
ps.hper9, ps.kper9, ps.bbper9, ps.kperbb,
0.0, 0.0, COALESCE(ps.re24 * -1, 0.0)
ps.lob_2outs, ps.rbipercent, COALESCE(ps.re24 * -1, 0.0)
FROM pitching_stats ps
LEFT JOIN decision_stats ds ON ps.player_id = ds.player_id AND ps.season = ds.season
ON CONFLICT (player_id, season)
@@ -460,12 +493,14 @@ def update_season_pitching_stats(player_ids, season, db_connection):
rbipercent = EXCLUDED.rbipercent,
re24 = EXCLUDED.re24;
"""
# Execute the query with parameters using the passed database connection
db_connection.execute_sql(query, [season, player_ids, season, player_ids])
logger.info(f"Successfully updated season pitching stats for {len(player_ids)} players in season {season}")
logger.info(
f"Successfully updated season pitching stats for {len(player_ids)} players in season {season}"
)
except Exception as e:
logger.error(f"Error updating season pitching stats: {e}")
raise
@@ -474,26 +509,24 @@ def update_season_pitching_stats(player_ids, season, db_connection):
def send_webhook_message(message: str) -> bool:
"""
Send a message to Discord via webhook.
Args:
message: The message content to send
Returns:
bool: True if successful, False otherwise
"""
webhook_url = "https://discord.com/api/webhooks/1408811717424840876/7RXG_D5IqovA3Jwa9YOobUjVcVMuLc6cQyezABcWuXaHo5Fvz1en10M7J43o3OJ3bzGW"
try:
payload = {
"content": message
}
payload = {"content": message}
response = requests.post(webhook_url, json=payload, timeout=10)
response.raise_for_status()
logger.info(f"Webhook message sent successfully: {message[:100]}...")
return True
except requests.exceptions.RequestException as e:
logger.error(f"Failed to send webhook message: {e}")
return False
@@ -502,99 +535,106 @@ def send_webhook_message(message: str) -> bool:
return False
def cache_result(ttl: int = 300, key_prefix: str = "api", normalize_params: bool = True):
def cache_result(
ttl: int = 300, key_prefix: str = "api", normalize_params: bool = True
):
"""
Decorator to cache function results in Redis with parameter normalization.
Args:
ttl: Time to live in seconds (default: 5 minutes)
key_prefix: Prefix for cache keys (default: "api")
normalize_params: Remove None/empty values to reduce cache variations (default: True)
Usage:
@cache_result(ttl=600, key_prefix="stats")
async def get_player_stats(player_id: int, season: Optional[int] = None):
# expensive operation
return stats
# These will use the same cache entry when normalize_params=True:
# get_player_stats(123, None) and get_player_stats(123)
"""
def decorator(func):
@wraps(func)
async def wrapper(*args, **kwargs):
# Skip caching if Redis is not available
if redis_client is None:
return await func(*args, **kwargs)
try:
# Normalize parameters to reduce cache variations
normalized_kwargs = kwargs.copy()
if normalize_params:
# Remove None values and empty collections
normalized_kwargs = {
k: v for k, v in kwargs.items()
k: v
for k, v in kwargs.items()
if v is not None and v != [] and v != "" and v != {}
}
# Generate more readable cache key
args_str = "_".join(str(arg) for arg in args if arg is not None)
kwargs_str = "_".join([
f"{k}={v}" for k, v in sorted(normalized_kwargs.items())
])
kwargs_str = "_".join(
[f"{k}={v}" for k, v in sorted(normalized_kwargs.items())]
)
# Combine args and kwargs for cache key
key_parts = [key_prefix, func.__name__]
if args_str:
key_parts.append(args_str)
if kwargs_str:
key_parts.append(kwargs_str)
cache_key = ":".join(key_parts)
# Truncate very long cache keys to prevent Redis key size limits
if len(cache_key) > 200:
cache_key = f"{key_prefix}:{func.__name__}:{hash(cache_key)}"
# Try to get from cache
cached_result = redis_client.get(cache_key)
if cached_result is not None:
logger.debug(f"Cache hit for key: {cache_key}")
return json.loads(cached_result)
# Cache miss - execute function
logger.debug(f"Cache miss for key: {cache_key}")
result = await func(*args, **kwargs)
# Skip caching for Response objects (like CSV downloads) as they can't be properly serialized
if not isinstance(result, Response):
# Store in cache with TTL
redis_client.setex(
cache_key,
ttl,
json.dumps(result, default=str, ensure_ascii=False)
cache_key,
ttl,
json.dumps(result, default=str, ensure_ascii=False),
)
else:
logger.debug(f"Skipping cache for Response object from {func.__name__}")
logger.debug(
f"Skipping cache for Response object from {func.__name__}"
)
return result
except Exception as e:
# If caching fails, log error and continue without caching
logger.error(f"Cache error for {func.__name__}: {e}")
return await func(*args, **kwargs)
return wrapper
return decorator
def invalidate_cache(pattern: str = "*"):
"""
Invalidate cache entries matching a pattern.
Args:
pattern: Redis pattern to match keys (default: "*" for all)
Usage:
invalidate_cache("stats:*") # Clear all stats cache
invalidate_cache("api:get_player_*") # Clear specific player cache
@@ -602,12 +642,14 @@ def invalidate_cache(pattern: str = "*"):
if redis_client is None:
logger.warning("Cannot invalidate cache: Redis not available")
return 0
try:
keys = redis_client.keys(pattern)
if keys:
deleted = redis_client.delete(*keys)
logger.info(f"Invalidated {deleted} cache entries matching pattern: {pattern}")
logger.info(
f"Invalidated {deleted} cache entries matching pattern: {pattern}"
)
return deleted
else:
logger.debug(f"No cache entries found matching pattern: {pattern}")
@@ -620,13 +662,13 @@ def invalidate_cache(pattern: str = "*"):
def get_cache_stats() -> dict:
"""
Get Redis cache statistics.
Returns:
dict: Cache statistics including memory usage, key count, etc.
"""
if redis_client is None:
return {"status": "unavailable", "message": "Redis not connected"}
try:
info = redis_client.info()
return {
@@ -634,7 +676,7 @@ def get_cache_stats() -> dict:
"memory_used": info.get("used_memory_human", "unknown"),
"total_keys": redis_client.dbsize(),
"connected_clients": info.get("connected_clients", 0),
"uptime_seconds": info.get("uptime_in_seconds", 0)
"uptime_seconds": info.get("uptime_in_seconds", 0),
}
except Exception as e:
logger.error(f"Error getting cache stats: {e}")
@@ -642,34 +684,35 @@ def get_cache_stats() -> dict:
def add_cache_headers(
max_age: int = 300,
max_age: int = 300,
cache_type: str = "public",
vary: Optional[str] = None,
etag: bool = False
etag: bool = False,
):
"""
Decorator to add HTTP cache headers to FastAPI responses.
Args:
max_age: Cache duration in seconds (default: 5 minutes)
cache_type: "public", "private", or "no-cache" (default: "public")
vary: Vary header value (e.g., "Accept-Encoding, Authorization")
etag: Whether to generate ETag based on response content
Usage:
@add_cache_headers(max_age=1800, cache_type="public")
async def get_static_data():
return {"data": "static content"}
@add_cache_headers(max_age=60, cache_type="private", vary="Authorization")
async def get_user_data():
return {"data": "user specific"}
"""
def decorator(func):
@wraps(func)
async def wrapper(*args, **kwargs):
result = await func(*args, **kwargs)
# Handle different response types
if isinstance(result, Response):
response = result
@@ -677,38 +720,41 @@ def add_cache_headers(
# Convert to Response with JSON content
response = Response(
content=json.dumps(result, default=str, ensure_ascii=False),
media_type="application/json"
media_type="application/json",
)
else:
# Handle other response types
response = Response(content=str(result))
# Build Cache-Control header
cache_control_parts = [cache_type]
if cache_type != "no-cache" and max_age > 0:
cache_control_parts.append(f"max-age={max_age}")
response.headers["Cache-Control"] = ", ".join(cache_control_parts)
# Add Vary header if specified
if vary:
response.headers["Vary"] = vary
# Add ETag if requested
if etag and (hasattr(result, '__dict__') or isinstance(result, (dict, list))):
if etag and (
hasattr(result, "__dict__") or isinstance(result, (dict, list))
):
content_hash = hashlib.md5(
json.dumps(result, default=str, sort_keys=True).encode()
).hexdigest()
response.headers["ETag"] = f'"{content_hash}"'
# Add Last-Modified header with current time for dynamic content
response.headers["Last-Modified"] = datetime.datetime.now(datetime.timezone.utc).strftime(
"%a, %d %b %Y %H:%M:%S GMT"
)
response.headers["Last-Modified"] = datetime.datetime.now(
datetime.timezone.utc
).strftime("%a, %d %b %Y %H:%M:%S GMT")
return response
return wrapper
return decorator
@@ -718,52 +764,63 @@ def handle_db_errors(func):
Ensures proper cleanup of database connections and provides consistent error handling.
Includes comprehensive logging with function context, timing, and stack traces.
"""
@wraps(func)
async def wrapper(*args, **kwargs):
import time
import traceback
from .db_engine import db # Import here to avoid circular imports
start_time = time.time()
func_name = f"{func.__module__}.{func.__name__}"
# Sanitize arguments for logging (exclude sensitive data)
safe_args = []
safe_kwargs = {}
try:
# Log sanitized arguments (avoid logging tokens, passwords, etc.)
for arg in args:
if hasattr(arg, '__dict__') and hasattr(arg, 'url'): # FastAPI Request object
safe_args.append(f"Request({getattr(arg, 'method', 'UNKNOWN')} {getattr(arg, 'url', 'unknown')})")
if hasattr(arg, "__dict__") and hasattr(
arg, "url"
): # FastAPI Request object
safe_args.append(
f"Request({getattr(arg, 'method', 'UNKNOWN')} {getattr(arg, 'url', 'unknown')})"
)
else:
safe_args.append(str(arg)[:100]) # Truncate long values
for key, value in kwargs.items():
if key.lower() in ['token', 'password', 'secret', 'key']:
safe_kwargs[key] = '[REDACTED]'
if key.lower() in ["token", "password", "secret", "key"]:
safe_kwargs[key] = "[REDACTED]"
else:
safe_kwargs[key] = str(value)[:100] # Truncate long values
logger.info(f"Starting {func_name} - args: {safe_args}, kwargs: {safe_kwargs}")
logger.info(
f"Starting {func_name} - args: {safe_args}, kwargs: {safe_kwargs}"
)
result = await func(*args, **kwargs)
elapsed_time = time.time() - start_time
logger.info(f"Completed {func_name} successfully in {elapsed_time:.3f}s")
return result
except HTTPException:
# Let intentional HTTP errors (401, 404, etc.) pass through unchanged
raise
except Exception as e:
elapsed_time = time.time() - start_time
error_trace = traceback.format_exc()
logger.error(f"Database error in {func_name} after {elapsed_time:.3f}s")
logger.error(f"Function args: {safe_args}")
logger.error(f"Function kwargs: {safe_kwargs}")
logger.error(f"Exception: {str(e)}")
logger.error(f"Full traceback:\n{error_trace}")
try:
logger.info(f"Attempting database rollback for {func_name}")
db.rollback()
@@ -775,8 +832,12 @@ def handle_db_errors(func):
db.close()
logger.info(f"Database connection closed for {func_name}")
except Exception as close_error:
logger.error(f"Error closing database connection in {func_name}: {close_error}")
raise HTTPException(status_code=500, detail=f'Database error in {func_name}: {str(e)}')
logger.error(
f"Error closing database connection in {func_name}: {close_error}"
)
raise HTTPException(
status_code=500, detail=f"Database error in {func_name}: {str(e)}"
)
return wrapper
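The `except HTTPException: raise` clause is the crux of #104: without it, a deliberate 404 or 401 raised inside a handler is swallowed by the generic `except Exception` and re-surfaces as a 500. A stripped-down, synchronous sketch of the pattern (a stand-in exception class, not FastAPI's real one):

```python
import functools

class HTTPException(Exception):
    # Stand-in for fastapi.HTTPException.
    def __init__(self, status_code: int, detail: str = ""):
        self.status_code = status_code
        self.detail = detail

def handle_errors(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except HTTPException:
            # Intentional HTTP errors (401, 404, ...) pass through unchanged.
            raise
        except Exception as exc:
            # Everything else is reported as a server-side failure.
            raise HTTPException(500, f"Database error: {exc}")
    return wrapper

@handle_errors
def lookup(found: bool):
    if not found:
        raise HTTPException(404, "not found")
    return "ok"
```

The ordering matters: the narrow `except HTTPException` must precede the broad `except Exception`, exactly as in the decorator above.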


@@ -2,6 +2,7 @@ import datetime
import logging
from logging.handlers import RotatingFileHandler
import os
from urllib.parse import parse_qsl, urlencode
from fastapi import Depends, FastAPI, Request
from fastapi.openapi.docs import get_swagger_ui_html
@@ -10,38 +11,77 @@ from fastapi.openapi.utils import get_openapi
# from fastapi.openapi.docs import get_swagger_ui_html
# from fastapi.openapi.utils import get_openapi
from .routers_v3 import current, players, results, schedules, standings, teams, transactions, battingstats, pitchingstats, fieldingstats, draftpicks, draftlist, managers, awards, draftdata, keepers, stratgame, stratplay, injuries, decisions, divisions, sbaplayers, custom_commands, help_commands, views
from .routers_v3 import (
current,
players,
results,
schedules,
standings,
teams,
transactions,
battingstats,
pitchingstats,
fieldingstats,
draftpicks,
draftlist,
managers,
awards,
draftdata,
keepers,
stratgame,
stratplay,
injuries,
decisions,
divisions,
sbaplayers,
custom_commands,
help_commands,
views,
)
# date = f'{datetime.datetime.now().year}-{datetime.datetime.now().month}-{datetime.datetime.now().day}'
log_level = logging.INFO if os.environ.get('LOG_LEVEL') == 'INFO' else logging.WARNING
log_level = logging.INFO if os.environ.get("LOG_LEVEL") == "INFO" else logging.WARNING
# logging.basicConfig(
# filename=f'logs/database/{date}.log',
# format='%(asctime)s - sba-database - %(levelname)s - %(message)s',
# level=log_level
# )
logger = logging.getLogger('discord_app')
logger = logging.getLogger("discord_app")
logger.setLevel(log_level)
handler = RotatingFileHandler(
filename='./logs/sba-database.log',
filename="./logs/sba-database.log",
# encoding='utf-8',
maxBytes=8 * 1024 * 1024, # 8 MiB
backupCount=5, # Rotate through 5 files
)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
handler.setFormatter(formatter)
logger.addHandler(handler)
app = FastAPI(
# root_path='/api',
responses={404: {'description': 'Not found'}},
docs_url='/api/docs',
redoc_url='/api/redoc'
responses={404: {"description": "Not found"}},
docs_url="/api/docs",
redoc_url="/api/redoc",
)
logger.info(f'Starting up now...')
logger.info(f"Starting up now...")
@app.middleware("http")
async def strip_empty_query_params(request: Request, call_next):
qs = request.scope.get("query_string", b"")
if qs:
pairs = parse_qsl(qs.decode(), keep_blank_values=True)
filtered = [(k, v) for k, v in pairs if v != ""]
new_qs = urlencode(filtered).encode()
request.scope["query_string"] = new_qs
if hasattr(request, "_query_params"):
del request._query_params
return await call_next(request)
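The middleware rewrites the raw ASGI query string so that a request like `?season=&limit=5` reaches the route handlers as `?limit=5`, keeping FastAPI from rejecting blank values on typed parameters. The filtering step is plain stdlib and can be exercised in isolation:

```python
from urllib.parse import parse_qsl, urlencode

def strip_empty_params(query_string: str) -> str:
    # Drop any key whose value is the empty string; keep everything else.
    pairs = parse_qsl(query_string, keep_blank_values=True)
    return urlencode([(k, v) for k, v in pairs if v != ""])
```

`keep_blank_values=True` is what makes the blank pairs visible to the filter at all; without it `parse_qsl` silently discards them along with their keys.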
app.include_router(current.router)
@@ -70,18 +110,20 @@ app.include_router(custom_commands.router)
app.include_router(help_commands.router)
app.include_router(views.router)
logger.info(f'Loaded all routers.')
logger.info(f"Loaded all routers.")
@app.get("/api/docs", include_in_schema=False)
async def get_docs(req: Request):
print(req.scope)
return get_swagger_ui_html(openapi_url=req.scope.get('root_path')+'/openapi.json', title='Swagger')
logger.debug(req.scope)
return get_swagger_ui_html(
openapi_url=req.scope.get("root_path") + "/openapi.json", title="Swagger"
)
@app.get("/api/openapi.json", include_in_schema=False)
async def openapi():
return get_openapi(title='SBa API Docs', version=f'0.1.1', routes=app.routes)
return get_openapi(title="SBa API Docs", version=f"0.1.1", routes=app.routes)
# @app.get("/api")


@@ -4,15 +4,19 @@ import logging
import pydantic
from ..db_engine import db, Award, Team, Player, Manager, model_to_dict, chunked, fn
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/awards',
tags=['awards']
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/awards", tags=["awards"])
class AwardModel(pydantic.BaseModel):
name: str
@@ -30,13 +34,20 @@ class AwardList(pydantic.BaseModel):
awards: List[AwardModel]
@router.get('')
@router.get("")
@handle_db_errors
async def get_awards(
name: list = Query(default=None), season: Optional[int] = None, timing: Optional[str] = None,
manager_id: list = Query(default=None), player_id: list = Query(default=None),
team_id: list = Query(default=None), short_output: Optional[bool] = False,
player_name: list = Query(default=None)):
name: list = Query(default=None),
season: Optional[int] = None,
timing: Optional[str] = None,
manager_id: list = Query(default=None),
player_id: list = Query(default=None),
team_id: list = Query(default=None),
short_output: Optional[bool] = False,
player_name: list = Query(default=None),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
all_awards = Award.select()
if name is not None:
@@ -60,40 +71,51 @@ async def get_awards(
all_players = Player.select().where(fn.Lower(Player.name) << pname_list)
all_awards = all_awards.where(Award.player << all_players)
total_count = all_awards.count()
all_awards = all_awards.offset(offset).limit(limit)
return_awards = {
'count': all_awards.count(),
'awards': [model_to_dict(x, recurse=not short_output) for x in all_awards]
"count": total_count,
"awards": [model_to_dict(x, recurse=not short_output) for x in all_awards],
}
db.close()
return return_awards
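Note the ordering in `get_awards`: `count()` is taken on the unpaginated query, and `offset`/`limit` are applied afterwards, so `count` always reports the total number of matches rather than the size of the current page. The same envelope shape over a plain list, as a sketch:

```python
def paginate(rows: list, limit: int, offset: int) -> dict:
    # Total is computed before slicing, mirroring count() on the
    # unpaginated query followed by .offset(...).limit(...).
    return {
        "count": len(rows),
        "awards": rows[offset:offset + limit],
    }
```

Clients can then page with `offset += limit` until the returned slice is empty, while `count` stays constant.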
@router.get('/{award_id}')
@router.get("/{award_id}")
@handle_db_errors
async def get_one_award(award_id: int, short_output: Optional[bool] = False):
this_award = Award.get_or_none(Award.id == award_id)
if this_award is None:
db.close()
raise HTTPException(status_code=404, detail=f'Award ID {award_id} not found')
raise HTTPException(status_code=404, detail=f"Award ID {award_id} not found")
db.close()
return model_to_dict(this_award, recurse=not short_output)
@router.patch('/{award_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{award_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_award(
award_id: int, name: Optional[str] = None, season: Optional[int] = None, timing: Optional[str] = None,
image: Optional[str] = None, manager1_id: Optional[int] = None, manager2_id: Optional[int] = None,
player_id: Optional[int] = None, team_id: Optional[int] = None, token: str = Depends(oauth2_scheme)):
award_id: int,
name: Optional[str] = None,
season: Optional[int] = None,
timing: Optional[str] = None,
image: Optional[str] = None,
manager1_id: Optional[int] = None,
manager2_id: Optional[int] = None,
player_id: Optional[int] = None,
team_id: Optional[int] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f'patch_player - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_player - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_award = Award.get_or_none(Award.id == award_id)
if this_award is None:
db.close()
raise HTTPException(status_code=404, detail=f'Award ID {award_id} not found')
raise HTTPException(status_code=404, detail=f"Award ID {award_id} not found")
if name is not None:
this_award.name = name
@@ -118,26 +140,43 @@ async def patch_award(
return r_award
else:
db.close()
raise HTTPException(status_code=500, detail=f'Unable to patch award {award_id}')
raise HTTPException(status_code=500, detail=f"Unable to patch award {award_id}")
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_award(award_list: AwardList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'patch_player - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_player - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
new_awards = []
for x in award_list.awards:
if x.manager1_id is not None and Manager.get_or_none(Manager.id == x.manager1_id) is None:
raise HTTPException(status_code=404, detail=f'Manager ID {x.manager1_id} not found')
if x.manager2_id is not None and Manager.get_or_none(Manager.id == x.manager2_id) is None:
raise HTTPException(status_code=404, detail=f'Manager ID {x.manager2_id} not found')
if x.player_id is not None and Player.get_or_none(Player.id == x.player_id) is None:
raise HTTPException(status_code=404, detail=f'Player ID {x.player_id} not found')
if (
x.manager1_id is not None
and Manager.get_or_none(Manager.id == x.manager1_id) is None
):
raise HTTPException(
status_code=404, detail=f"Manager ID {x.manager1_id} not found"
)
if (
x.manager2_id is not None
and Manager.get_or_none(Manager.id == x.manager2_id) is None
):
raise HTTPException(
status_code=404, detail=f"Manager ID {x.manager2_id} not found"
)
if (
x.player_id is not None
and Player.get_or_none(Player.id == x.player_id) is None
):
raise HTTPException(
status_code=404, detail=f"Player ID {x.player_id} not found"
)
if x.team_id is not None and Team.get_or_none(Team.id == x.team_id) is None:
raise HTTPException(status_code=404, detail=f'Team ID {x.team_id} not found')
raise HTTPException(
status_code=404, detail=f"Team ID {x.team_id} not found"
)
new_awards.append(x.dict())
@@ -146,29 +185,27 @@ async def post_award(award_list: AwardList, token: str = Depends(oauth2_scheme))
Award.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f'Inserted {len(new_awards)} awards'
return f"Inserted {len(new_awards)} awards"
@router.delete('/{award_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.delete("/{award_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_award(award_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'patch_player - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_player - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_award = Award.get_or_none(Award.id == award_id)
if this_award is None:
db.close()
raise HTTPException(status_code=404, detail=f'Award ID {award_id} not found')
raise HTTPException(status_code=404, detail=f"Award ID {award_id} not found")
count = this_award.delete_instance()
db.close()
if count == 1:
return f'Award {award_id} has been deleted'
return f"Award {award_id} has been deleted"
else:
raise HTTPException(status_code=500, detail=f'Award {award_id} could not be deleted')
raise HTTPException(
status_code=500, detail=f"Award {award_id} could not be deleted"
)


@@ -3,15 +3,29 @@ from typing import List, Optional, Literal
import logging
import pydantic
from ..db_engine import db, BattingStat, Team, Player, Current, model_to_dict, chunked, fn, per_season_weeks
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/battingstats',
tags=['battingstats']
from ..db_engine import (
db,
BattingStat,
Team,
Player,
Current,
model_to_dict,
chunked,
fn,
per_season_weeks,
)
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/battingstats", tags=["battingstats"])
class BatStatModel(pydantic.BaseModel):
@@ -60,29 +74,37 @@ class BatStatList(pydantic.BaseModel):
stats: List[BatStatModel]
@router.get('')
@router.get("")
@handle_db_errors
async def get_batstats(
season: int, s_type: Optional[str] = 'regular', team_abbrev: list = Query(default=None),
player_name: list = Query(default=None), player_id: list = Query(default=None),
week_start: Optional[int] = None, week_end: Optional[int] = None, game_num: list = Query(default=None),
position: list = Query(default=None), limit: Optional[int] = None, sort: Optional[str] = None,
short_output: Optional[bool] = True):
if 'post' in s_type.lower():
season: int,
s_type: Optional[str] = "regular",
team_abbrev: list = Query(default=None),
player_name: list = Query(default=None),
player_id: list = Query(default=None),
week_start: Optional[int] = None,
week_end: Optional[int] = None,
game_num: list = Query(default=None),
position: list = Query(default=None),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
sort: Optional[str] = None,
short_output: Optional[bool] = True,
):
if "post" in s_type.lower():
all_stats = BattingStat.post_season(season)
if all_stats.count() == 0:
db.close()
return {'count': 0, 'stats': []}
elif s_type.lower() in ['combined', 'total', 'all']:
return {"count": 0, "stats": []}
elif s_type.lower() in ["combined", "total", "all"]:
all_stats = BattingStat.combined_season(season)
if all_stats.count() == 0:
db.close()
return {'count': 0, 'stats': []}
return {"count": 0, "stats": []}
else:
all_stats = BattingStat.regular_season(season)
if all_stats.count() == 0:
db.close()
return {'count': 0, 'stats': []}
return {"count": 0, "stats": []}
if position is not None:
all_stats = all_stats.where(BattingStat.pos << [x.upper() for x in position])
@@ -93,7 +115,9 @@ async def get_batstats(
if player_id:
all_stats = all_stats.where(BattingStat.player_id << player_id)
else:
p_query = Player.select_season(season).where(fn.Lower(Player.name) << [x.lower() for x in player_name])
p_query = Player.select_season(season).where(
fn.Lower(Player.name) << [x.lower() for x in player_name]
)
all_stats = all_stats.where(BattingStat.player << p_query)
if game_num:
all_stats = all_stats.where(BattingStat.game == game_num)
@@ -108,21 +132,18 @@ async def get_batstats(
db.close()
raise HTTPException(
status_code=404,
detail=f'Start week {start} is after end week {end} - cannot pull stats'
detail=f"Start week {start} is after end week {end} - cannot pull stats",
)
all_stats = all_stats.where(
(BattingStat.week >= start) & (BattingStat.week <= end)
)
all_stats = all_stats.where((BattingStat.week >= start) & (BattingStat.week <= end))
if limit:
all_stats = all_stats.limit(limit)
all_stats = all_stats.limit(limit)
if sort:
if sort == 'newest':
if sort == "newest":
all_stats = all_stats.order_by(-BattingStat.week, -BattingStat.game)
return_stats = {
'count': all_stats.count(),
'stats': [model_to_dict(x, recurse=not short_output) for x in all_stats],
"count": all_stats.count(),
"stats": [model_to_dict(x, recurse=not short_output) for x in all_stats],
# 'stats': [{'id': x.id} for x in all_stats]
}
@@ -130,52 +151,84 @@ async def get_batstats(
return return_stats
@router.get('/totals')
@router.get("/totals")
@handle_db_errors
async def get_totalstats(
season: int, s_type: Literal['regular', 'post', 'total', None] = None, team_abbrev: list = Query(default=None),
team_id: list = Query(default=None), player_name: list = Query(default=None),
week_start: Optional[int] = None, week_end: Optional[int] = None, game_num: list = Query(default=None),
position: list = Query(default=None), sort: Optional[str] = None, player_id: list = Query(default=None),
group_by: Literal['team', 'player', 'playerteam'] = 'player', short_output: Optional[bool] = False,
min_pa: Optional[int] = 1, week: list = Query(default=None)):
season: int,
s_type: Literal["regular", "post", "total", None] = None,
team_abbrev: list = Query(default=None),
team_id: list = Query(default=None),
player_name: list = Query(default=None),
week_start: Optional[int] = None,
week_end: Optional[int] = None,
game_num: list = Query(default=None),
position: list = Query(default=None),
sort: Optional[str] = None,
player_id: list = Query(default=None),
group_by: Literal["team", "player", "playerteam"] = "player",
short_output: Optional[bool] = False,
min_pa: Optional[int] = 1,
week: list = Query(default=None),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
if sum(1 for x in [s_type, (week_start or week_end), week] if x is not None) > 1:
raise HTTPException(status_code=400, detail=f'Only one of s_type, week_start/week_end, or week may be used.')
raise HTTPException(
status_code=400,
detail=f"Only one of s_type, week_start/week_end, or week may be used.",
)
# Build SELECT fields conditionally based on group_by to match GROUP BY exactly
select_fields = []
if group_by == 'player':
if group_by == "player":
select_fields = [BattingStat.player]
elif group_by == 'team':
elif group_by == "team":
select_fields = [BattingStat.team]
elif group_by == 'playerteam':
elif group_by == "playerteam":
select_fields = [BattingStat.player, BattingStat.team]
else:
# Default case
select_fields = [BattingStat.player]
all_stats = (
BattingStat
.select(*select_fields,
fn.SUM(BattingStat.pa).alias('sum_pa'), fn.SUM(BattingStat.ab).alias('sum_ab'),
fn.SUM(BattingStat.run).alias('sum_run'), fn.SUM(BattingStat.hit).alias('sum_hit'),
fn.SUM(BattingStat.rbi).alias('sum_rbi'), fn.SUM(BattingStat.double).alias('sum_double'),
fn.SUM(BattingStat.triple).alias('sum_triple'), fn.SUM(BattingStat.hr).alias('sum_hr'),
fn.SUM(BattingStat.bb).alias('sum_bb'), fn.SUM(BattingStat.so).alias('sum_so'),
fn.SUM(BattingStat.hbp).alias('sum_hbp'), fn.SUM(BattingStat.sac).alias('sum_sac'),
fn.SUM(BattingStat.ibb).alias('sum_ibb'), fn.SUM(BattingStat.gidp).alias('sum_gidp'),
fn.SUM(BattingStat.sb).alias('sum_sb'), fn.SUM(BattingStat.cs).alias('sum_cs'),
fn.SUM(BattingStat.bphr).alias('sum_bphr'), fn.SUM(BattingStat.bpfo).alias('sum_bpfo'),
fn.SUM(BattingStat.bp1b).alias('sum_bp1b'), fn.SUM(BattingStat.bplo).alias('sum_bplo'),
fn.SUM(BattingStat.xba).alias('sum_xba'), fn.SUM(BattingStat.xbt).alias('sum_xbt'),
fn.SUM(BattingStat.xch).alias('sum_xch'), fn.SUM(BattingStat.xhit).alias('sum_xhit'),
fn.SUM(BattingStat.error).alias('sum_error'), fn.SUM(BattingStat.pb).alias('sum_pb'),
fn.SUM(BattingStat.sbc).alias('sum_sbc'), fn.SUM(BattingStat.csc).alias('sum_csc'),
fn.SUM(BattingStat.roba).alias('sum_roba'), fn.SUM(BattingStat.robs).alias('sum_robs'),
fn.SUM(BattingStat.raa).alias('sum_raa'), fn.SUM(BattingStat.rto).alias('sum_rto'))
.where(BattingStat.season == season)
.having(fn.SUM(BattingStat.pa) >= min_pa)
BattingStat.select(
*select_fields,
fn.SUM(BattingStat.pa).alias("sum_pa"),
fn.SUM(BattingStat.ab).alias("sum_ab"),
fn.SUM(BattingStat.run).alias("sum_run"),
fn.SUM(BattingStat.hit).alias("sum_hit"),
fn.SUM(BattingStat.rbi).alias("sum_rbi"),
fn.SUM(BattingStat.double).alias("sum_double"),
fn.SUM(BattingStat.triple).alias("sum_triple"),
fn.SUM(BattingStat.hr).alias("sum_hr"),
fn.SUM(BattingStat.bb).alias("sum_bb"),
fn.SUM(BattingStat.so).alias("sum_so"),
fn.SUM(BattingStat.hbp).alias("sum_hbp"),
fn.SUM(BattingStat.sac).alias("sum_sac"),
fn.SUM(BattingStat.ibb).alias("sum_ibb"),
fn.SUM(BattingStat.gidp).alias("sum_gidp"),
fn.SUM(BattingStat.sb).alias("sum_sb"),
fn.SUM(BattingStat.cs).alias("sum_cs"),
fn.SUM(BattingStat.bphr).alias("sum_bphr"),
fn.SUM(BattingStat.bpfo).alias("sum_bpfo"),
fn.SUM(BattingStat.bp1b).alias("sum_bp1b"),
fn.SUM(BattingStat.bplo).alias("sum_bplo"),
fn.SUM(BattingStat.xba).alias("sum_xba"),
fn.SUM(BattingStat.xbt).alias("sum_xbt"),
fn.SUM(BattingStat.xch).alias("sum_xch"),
fn.SUM(BattingStat.xhit).alias("sum_xhit"),
fn.SUM(BattingStat.error).alias("sum_error"),
fn.SUM(BattingStat.pb).alias("sum_pb"),
fn.SUM(BattingStat.sbc).alias("sum_sbc"),
fn.SUM(BattingStat.csc).alias("sum_csc"),
fn.SUM(BattingStat.roba).alias("sum_roba"),
fn.SUM(BattingStat.robs).alias("sum_robs"),
fn.SUM(BattingStat.raa).alias("sum_raa"),
fn.SUM(BattingStat.rto).alias("sum_rto"),
)
.where(BattingStat.season == season)
.having(fn.SUM(BattingStat.pa) >= min_pa)
)
if True in [s_type is not None, week_start is not None, week_end is not None]:
@@ -185,16 +238,20 @@ async def get_totalstats(
elif week_start is not None or week_end is not None:
if week_start is None or week_end is None:
raise HTTPException(
status_code=400, detail='Both week_start and week_end must be included if either is used.'
status_code=400,
detail="Both week_start and week_end must be included if either is used.",
)
weeks["start"] = week_start
if week_end < weeks["start"]:
raise HTTPException(
status_code=400,
detail="week_end must be greater than or equal to week_start",
)
weeks['start'] = week_start
if week_end < weeks['start']:
raise HTTPException(status_code=400, detail='week_end must be greater than or equal to week_start')
else:
weeks['end'] = week_end
weeks["end"] = week_end
all_stats = all_stats.where(
(BattingStat.week >= weeks['start']) & (BattingStat.week <= weeks['end'])
(BattingStat.week >= weeks["start"]) & (BattingStat.week <= weeks["end"])
)
elif week is not None:
all_stats = all_stats.where(BattingStat.week << week)
@@ -204,14 +261,20 @@ async def get_totalstats(
if position is not None:
p_list = [x.upper() for x in position]
all_players = Player.select().where(
(Player.pos_1 << p_list) | (Player.pos_2 << p_list) | (Player.pos_3 << p_list) | ( Player.pos_4 << p_list) |
(Player.pos_5 << p_list) | (Player.pos_6 << p_list) | (Player.pos_7 << p_list) | ( Player.pos_8 << p_list)
(Player.pos_1 << p_list)
| (Player.pos_2 << p_list)
| (Player.pos_3 << p_list)
| (Player.pos_4 << p_list)
| (Player.pos_5 << p_list)
| (Player.pos_6 << p_list)
| (Player.pos_7 << p_list)
| (Player.pos_8 << p_list)
)
all_stats = all_stats.where(BattingStat.player << all_players)
if sort is not None:
if sort == 'player':
if sort == "player":
all_stats = all_stats.order_by(BattingStat.player)
elif sort == 'team':
elif sort == "team":
all_stats = all_stats.order_by(BattingStat.team)
if group_by is not None:
# Use the same fields for GROUP BY as we used for SELECT
@@ -227,56 +290,66 @@ async def get_totalstats(
all_teams = Team.select().where(Team.id << team_id)
all_stats = all_stats.where(BattingStat.team << all_teams)
elif team_abbrev is not None:
all_teams = Team.select().where(fn.Lower(Team.abbrev) << [x.lower() for x in team_abbrev])
all_teams = Team.select().where(
fn.Lower(Team.abbrev) << [x.lower() for x in team_abbrev]
)
all_stats = all_stats.where(BattingStat.team << all_teams)
if player_name is not None:
all_players = Player.select().where(fn.Lower(Player.name) << [x.lower() for x in player_name])
all_players = Player.select().where(
fn.Lower(Player.name) << [x.lower() for x in player_name]
)
all_stats = all_stats.where(BattingStat.player << all_players)
elif player_id is not None:
all_players = Player.select().where(Player.id << player_id)
all_stats = all_stats.where(BattingStat.player << all_players)
return_stats = {
'count': all_stats.count(),
'stats': []
}
total_count = all_stats.count()
all_stats = all_stats.offset(offset).limit(limit)
return_stats = {"count": total_count, "stats": []}
for x in all_stats:
# Handle player field based on grouping with safe access
this_player = 'TOT'
if 'player' in group_by and hasattr(x, 'player'):
this_player = x.player_id if short_output else model_to_dict(x.player, recurse=False)
this_player = "TOT"
if "player" in group_by and hasattr(x, "player"):
this_player = (
x.player_id if short_output else model_to_dict(x.player, recurse=False)
)
# Handle team field based on grouping with safe access
this_team = 'TOT'
if 'team' in group_by and hasattr(x, 'team'):
this_team = x.team_id if short_output else model_to_dict(x.team, recurse=False)
return_stats['stats'].append({
'player': this_player,
'team': this_team,
'pa': x.sum_pa,
'ab': x.sum_ab,
'run': x.sum_run,
'hit': x.sum_hit,
'rbi': x.sum_rbi,
'double': x.sum_double,
'triple': x.sum_triple,
'hr': x.sum_hr,
'bb': x.sum_bb,
'so': x.sum_so,
'hbp': x.sum_hbp,
'sac': x.sum_sac,
'ibb': x.sum_ibb,
'gidp': x.sum_gidp,
'sb': x.sum_sb,
'cs': x.sum_cs,
'bphr': x.sum_bphr,
'bpfo': x.sum_bpfo,
'bp1b': x.sum_bp1b,
'bplo': x.sum_bplo
})
# Handle team field based on grouping with safe access
this_team = "TOT"
if "team" in group_by and hasattr(x, "team"):
this_team = (
x.team_id if short_output else model_to_dict(x.team, recurse=False)
)
return_stats["stats"].append(
{
"player": this_player,
"team": this_team,
"pa": x.sum_pa,
"ab": x.sum_ab,
"run": x.sum_run,
"hit": x.sum_hit,
"rbi": x.sum_rbi,
"double": x.sum_double,
"triple": x.sum_triple,
"hr": x.sum_hr,
"bb": x.sum_bb,
"so": x.sum_so,
"hbp": x.sum_hbp,
"sac": x.sum_sac,
"ibb": x.sum_ibb,
"gidp": x.sum_gidp,
"sb": x.sum_sb,
"cs": x.sum_cs,
"bphr": x.sum_bphr,
"bpfo": x.sum_bpfo,
"bp1b": x.sum_bp1b,
"bplo": x.sum_bplo,
}
)
db.close()
return return_stats
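The `/totals` endpoint is a grouped SUM over per-game rows, and the SELECT list must name exactly the columns in the GROUP BY key, which is why `select_fields` is built once from `group_by` and reused for both. The aggregation itself, sketched over plain dicts instead of peewee rows and trimmed to two stat columns:

```python
from collections import defaultdict

def total_stats(rows, group_by="player"):
    # Sum pa/hit per grouping key, like fn.SUM(...) with GROUP BY.
    totals = defaultdict(lambda: {"pa": 0, "hit": 0})
    for row in rows:
        key = row[group_by]
        totals[key]["pa"] += row["pa"]
        totals[key]["hit"] += row["hit"]
    return dict(totals)
```

Switching `group_by` from `"player"` to `"team"` collapses the same rows along a different key, which is the in-memory analogue of swapping the `select_fields` list.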
@@ -287,15 +360,17 @@ async def get_totalstats(
# pass # Keep Career Stats table and recalculate after posting stats
@router.patch('/{stat_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{stat_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_batstats(stat_id: int, new_stats: BatStatModel, token: str = Depends(oauth2_scheme)):
async def patch_batstats(
stat_id: int, new_stats: BatStatModel, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f'patch_batstats - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_batstats - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
if BattingStat.get_or_none(BattingStat.id == stat_id) is None:
raise HTTPException(status_code=404, detail=f'Stat ID {stat_id} not found')
raise HTTPException(status_code=404, detail=f"Stat ID {stat_id} not found")
BattingStat.update(**new_stats.dict()).where(BattingStat.id == stat_id).execute()
r_stat = model_to_dict(BattingStat.get_by_id(stat_id))
@@ -303,22 +378,37 @@ async def patch_batstats(stat_id: int, new_stats: BatStatModel, token: str = Dep
return r_stat
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_batstats(s_list: BatStatList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'post_batstats - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"post_batstats - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
all_stats = []
all_team_ids = list(set(x.team_id for x in s_list.stats))
all_player_ids = list(set(x.player_id for x in s_list.stats))
found_team_ids = (
set(t.id for t in Team.select(Team.id).where(Team.id << all_team_ids))
if all_team_ids
else set()
)
found_player_ids = (
set(p.id for p in Player.select(Player.id).where(Player.id << all_player_ids))
if all_player_ids
else set()
)
for x in s_list.stats:
team = Team.get_or_none(Team.id == x.team_id)
this_player = Player.get_or_none(Player.id == x.player_id)
if team is None:
raise HTTPException(status_code=404, detail=f'Team ID {x.team_id} not found')
if this_player is None:
raise HTTPException(status_code=404, detail=f'Player ID {x.player_id} not found')
if x.team_id not in found_team_ids:
raise HTTPException(
status_code=404, detail=f"Team ID {x.team_id} not found"
)
if x.player_id not in found_player_ids:
raise HTTPException(
status_code=404, detail=f"Player ID {x.player_id} not found"
)
all_stats.append(BattingStat(**x.dict()))
@@ -329,4 +419,4 @@ async def post_batstats(s_list: BatStatList, token: str = Depends(oauth2_scheme)
# Update career stats
db.close()
return f'Added {len(all_stats)} batting lines'
return f"Added {len(all_stats)} batting lines"


@@ -4,15 +4,17 @@ import logging
import pydantic
from ..db_engine import db, Current, model_to_dict
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/current',
tags=['current']
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/current", tags=["current"])
class CurrentModel(pydantic.BaseModel):
week: Optional[int] = 0
@@ -29,7 +31,7 @@ class CurrentModel(pydantic.BaseModel):
injury_count: Optional[int] = 0
@router.get('')
@router.get("")
@handle_db_errors
async def get_current(season: Optional[int] = None):
if season is not None:
@@ -45,21 +47,33 @@ async def get_current(season: Optional[int] = None):
return None
@router.patch('/{current_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{current_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_current(
current_id: int, season: Optional[int] = None, week: Optional[int] = None, freeze: Optional[bool] = None,
transcount: Optional[int] = None, bstatcount: Optional[int] = None, pstatcount: Optional[int] = None,
bet_week: Optional[int] = None, trade_deadline: Optional[int] = None, pick_trade_start: Optional[int] = None,
pick_trade_end: Optional[int] = None, injury_count: Optional[int] = None, token: str = Depends(oauth2_scheme)):
current_id: int,
season: Optional[int] = None,
week: Optional[int] = None,
freeze: Optional[bool] = None,
transcount: Optional[int] = None,
bstatcount: Optional[int] = None,
pstatcount: Optional[int] = None,
bet_week: Optional[int] = None,
trade_deadline: Optional[int] = None,
pick_trade_start: Optional[int] = None,
pick_trade_end: Optional[int] = None,
injury_count: Optional[int] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f'patch_current - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_current - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
current = Current.get_by_id(current_id)
except Exception as e:
raise HTTPException(status_code=404, detail=f'Current id {current_id} not found')
raise HTTPException(
status_code=404, detail=f"Current id {current_id} not found"
)
if week is not None:
current.week = week
@@ -90,15 +104,17 @@ async def patch_current(
return r_curr
else:
db.close()
raise HTTPException(status_code=500, detail=f'Unable to patch current {current_id}')
raise HTTPException(
status_code=500, detail=f"Unable to patch current {current_id}"
)
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_current(new_current: CurrentModel, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'patch_current - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"post_current - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_current = Current(**new_current.dict())
@@ -108,17 +124,22 @@ async def post_current(new_current: CurrentModel, token: str = Depends(oauth2_sc
return r_curr
else:
db.close()
raise HTTPException(status_code=500, detail=f'Unable to post season {new_current.season} current')
raise HTTPException(
status_code=500,
detail=f"Unable to post season {new_current.season} current",
)
@router.delete('/{current_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.delete("/{current_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_current(current_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'patch_current - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"delete_current - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
if Current.delete_by_id(current_id) == 1:
return f'Deleted current ID {current_id}'
return f"Deleted current ID {current_id}"
raise HTTPException(status_code=500, detail=f'Unable to delete current {current_id}')
raise HTTPException(
status_code=500, detail=f"Unable to delete current {current_id}"
)
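
Every handler above is wrapped in `@handle_db_errors`, which (per the "let HTTPException pass through" fix referenced in this compare) should convert unexpected database failures into 500 responses while re-raising the deliberate 401/404/500 `HTTPException`s unchanged. A minimal sketch of that pass-through pattern, using a stand-in `HTTPException` class so it is self-contained — the real decorator lives in `app.dependencies` and wraps async handlers:

```python
import functools
import logging

logger = logging.getLogger("discord_app")

class HTTPException(Exception):
    # Stand-in for fastapi.HTTPException so the sketch is self-contained.
    def __init__(self, status_code: int, detail: str):
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail

def handle_db_errors(func):
    """Turn unexpected failures into a 500, but let HTTPException through."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except HTTPException:
            raise  # deliberate 401/404/500 responses propagate unchanged
        except Exception as exc:
            logger.exception("Unhandled error in %s", func.__name__)
            raise HTTPException(status_code=500, detail=str(exc))
    return wrapper
```

Without the bare `raise`, a handler's own 404 would be swallowed by the generic `except Exception` and resurface as a 500 — which is exactly what the test update in this compare (`test_get_nonexistent_play` now expecting 404) guards against.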


@@ -4,15 +4,28 @@ import copy
import logging
import pydantic
from ..db_engine import db, Decision, StratGame, Player, model_to_dict, chunked, fn, Team
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/decisions',
tags=['decisions']
from ..db_engine import (
db,
Decision,
StratGame,
Player,
model_to_dict,
chunked,
fn,
Team,
)
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/decisions", tags=["decisions"])
class DecisionModel(pydantic.BaseModel):
@@ -43,17 +56,31 @@ class DecisionReturnList(pydantic.BaseModel):
decisions: list[DecisionModel]
@router.get('')
@router.get("")
@handle_db_errors
async def get_decisions(
season: list = Query(default=None), week: list = Query(default=None), game_num: list = Query(default=None),
s_type: Literal['regular', 'post', 'all', None] = None, team_id: list = Query(default=None),
week_start: Optional[int] = None, week_end: Optional[int] = None, win: Optional[int] = None,
loss: Optional[int] = None, hold: Optional[int] = None, save: Optional[int] = None,
b_save: Optional[int] = None, irunners: list = Query(default=None), irunners_scored: list = Query(default=None),
game_id: list = Query(default=None), player_id: list = Query(default=None),
limit: Optional[int] = None, short_output: Optional[bool] = False):
all_dec = Decision.select().order_by(-Decision.season, -Decision.week, -Decision.game_num)
season: list = Query(default=None),
week: list = Query(default=None),
game_num: list = Query(default=None),
s_type: Literal["regular", "post", "all", None] = None,
team_id: list = Query(default=None),
week_start: Optional[int] = None,
week_end: Optional[int] = None,
win: Optional[int] = None,
loss: Optional[int] = None,
hold: Optional[int] = None,
save: Optional[int] = None,
b_save: Optional[int] = None,
irunners: list = Query(default=None),
irunners_scored: list = Query(default=None),
game_id: list = Query(default=None),
player_id: list = Query(default=None),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
short_output: Optional[bool] = False,
):
all_dec = Decision.select().order_by(
-Decision.season, -Decision.week, -Decision.game_num
)
if season is not None:
all_dec = all_dec.where(Decision.season << season)
@@ -79,7 +106,8 @@ async def get_decisions(
if (season is not None and 8 in season) or s8_teams:
all_teams = Team.select().where(Team.id << team_id)
all_games = StratGame.select().where(
(StratGame.away_team << all_teams) | (StratGame.home_team << all_teams))
(StratGame.away_team << all_teams) | (StratGame.home_team << all_teams)
)
all_dec = all_dec.where(Decision.game << all_games)
else:
all_teams = Team.select().where(Team.id << team_id)
@@ -109,34 +137,41 @@ async def get_decisions(
if irunners_scored is not None:
all_dec = all_dec.where(Decision.irunners_scored << irunners_scored)
if limit is not None:
if limit < 1:
limit = 1
all_dec = all_dec.limit(limit)
all_dec = all_dec.limit(limit)
return_dec = {
'count': all_dec.count(),
'decisions': [model_to_dict(x, recurse=not short_output) for x in all_dec]
"count": all_dec.count(),
"decisions": [model_to_dict(x, recurse=not short_output) for x in all_dec],
}
db.close()
return return_dec
@router.patch('/{decision_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{decision_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_decision(
decision_id: int, win: Optional[int] = None, loss: Optional[int] = None, hold: Optional[int] = None,
save: Optional[int] = None, b_save: Optional[int] = None, irunners: Optional[int] = None,
irunners_scored: Optional[int] = None, rest_ip: Optional[int] = None, rest_required: Optional[int] = None,
token: str = Depends(oauth2_scheme)):
decision_id: int,
win: Optional[int] = None,
loss: Optional[int] = None,
hold: Optional[int] = None,
save: Optional[int] = None,
b_save: Optional[int] = None,
irunners: Optional[int] = None,
irunners_scored: Optional[int] = None,
rest_ip: Optional[int] = None,
rest_required: Optional[int] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f'patch_decision - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_decision - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_dec = Decision.get_or_none(Decision.id == decision_id)
if this_dec is None:
db.close()
raise HTTPException(status_code=404, detail=f'Decision ID {decision_id} not found')
raise HTTPException(
status_code=404, detail=f"Decision ID {decision_id} not found"
)
if win is not None:
this_dec.win = win
@@ -163,22 +198,28 @@ async def patch_decision(
return d_result
else:
db.close()
raise HTTPException(status_code=500, detail=f'Unable to patch decision {decision_id}')
raise HTTPException(
status_code=500, detail=f"Unable to patch decision {decision_id}"
)
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_decisions(dec_list: DecisionList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'post_decisions - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"post_decisions - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
new_dec = []
for x in dec_list.decisions:
if StratGame.get_or_none(StratGame.id == x.game_id) is None:
raise HTTPException(status_code=404, detail=f'Game ID {x.game_id} not found')
raise HTTPException(
status_code=404, detail=f"Game ID {x.game_id} not found"
)
if Player.get_or_none(Player.id == x.pitcher_id) is None:
raise HTTPException(status_code=404, detail=f'Player ID {x.pitcher_id} not found')
raise HTTPException(
status_code=404, detail=f"Player ID {x.pitcher_id} not found"
)
new_dec.append(x.dict())
@@ -187,49 +228,53 @@ async def post_decisions(dec_list: DecisionList, token: str = Depends(oauth2_sch
Decision.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f'Inserted {len(new_dec)} decisions'
return f"Inserted {len(new_dec)} decisions"
@router.delete('/{decision_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.delete("/{decision_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_decision(decision_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'delete_decision - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"delete_decision - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_dec = Decision.get_or_none(Decision.id == decision_id)
if this_dec is None:
db.close()
raise HTTPException(status_code=404, detail=f'Decision ID {decision_id} not found')
raise HTTPException(
status_code=404, detail=f"Decision ID {decision_id} not found"
)
count = this_dec.delete_instance()
db.close()
if count == 1:
return f'Decision {decision_id} has been deleted'
return f"Decision {decision_id} has been deleted"
else:
raise HTTPException(status_code=500, detail=f'Decision {decision_id} could not be deleted')
raise HTTPException(
status_code=500, detail=f"Decision {decision_id} could not be deleted"
)
@router.delete('/game/{game_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.delete("/game/{game_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_decisions_game(game_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'delete_decisions_game - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"delete_decisions_game - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_game = StratGame.get_or_none(StratGame.id == game_id)
if not this_game:
db.close()
raise HTTPException(status_code=404, detail=f'Game ID {game_id} not found')
raise HTTPException(status_code=404, detail=f"Game ID {game_id} not found")
count = Decision.delete().where(Decision.game == this_game).execute()
db.close()
if count > 0:
return f'Deleted {count} decisions matching Game ID {game_id}'
return f"Deleted {count} decisions matching Game ID {game_id}"
else:
raise HTTPException(status_code=500, detail=f'No decisions matching Game ID {game_id} were deleted')
raise HTTPException(
status_code=500,
detail=f"No decisions matching Game ID {game_id} were deleted",
)
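
The signature change from `limit: Optional[int] = None` to `limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT)` moves bounds-checking into FastAPI's validation layer: an omitted limit gets the default, and an out-of-range value is rejected with a 422 instead of being silently clamped (the old `if limit < 1: limit = 1` branch). A rough equivalent of that validation in plain Python — the constant values here are assumptions; the real ones come from `app.dependencies`:

```python
from typing import Optional

DEFAULT_LIMIT = 100   # assumed value; actual constant lives in app.dependencies
MAX_LIMIT = 1000      # assumed value

def resolve_limit(limit: Optional[int]) -> int:
    """Mimic Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT):
    omitted -> default, out of range -> rejected outright."""
    if limit is None:
        return DEFAULT_LIMIT
    if not (1 <= limit <= MAX_LIMIT):
        raise ValueError(f"limit must be between 1 and {MAX_LIMIT}")
    return limit
```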


@@ -4,15 +4,19 @@ import logging
import pydantic
from ..db_engine import db, Division, Team, model_to_dict, chunked, fn
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/divisions',
tags=['divisions']
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/divisions", tags=["divisions"])
class DivisionModel(pydantic.BaseModel):
division_name: str
@@ -22,11 +26,17 @@ class DivisionModel(pydantic.BaseModel):
season: int = 0
@router.get('')
@router.get("")
@handle_db_errors
async def get_divisions(
season: int, div_name: Optional[str] = None, div_abbrev: Optional[str] = None, lg_name: Optional[str] = None,
lg_abbrev: Optional[str] = None):
season: int,
div_name: Optional[str] = None,
div_abbrev: Optional[str] = None,
lg_name: Optional[str] = None,
lg_abbrev: Optional[str] = None,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
all_divisions = Division.select().where(Division.season == season)
if div_name is not None:
@@ -38,40 +48,52 @@ async def get_divisions(
if lg_abbrev is not None:
all_divisions = all_divisions.where(Division.league_abbrev == lg_abbrev)
total_count = all_divisions.count()
all_divisions = all_divisions.offset(offset).limit(limit)
return_div = {
'count': all_divisions.count(),
'divisions': [model_to_dict(x) for x in all_divisions]
"count": total_count,
"divisions": [model_to_dict(x) for x in all_divisions],
}
db.close()
return return_div
@router.get('/{division_id}')
@router.get("/{division_id}")
@handle_db_errors
async def get_one_division(division_id: int):
this_div = Division.get_or_none(Division.id == division_id)
if this_div is None:
db.close()
raise HTTPException(status_code=404, detail=f'Division ID {division_id} not found')
raise HTTPException(
status_code=404, detail=f"Division ID {division_id} not found"
)
r_div = model_to_dict(this_div)
db.close()
return r_div
@router.patch('/{division_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{division_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_division(
division_id: int, div_name: Optional[str] = None, div_abbrev: Optional[str] = None,
lg_name: Optional[str] = None, lg_abbrev: Optional[str] = None, token: str = Depends(oauth2_scheme)):
division_id: int,
div_name: Optional[str] = None,
div_abbrev: Optional[str] = None,
lg_name: Optional[str] = None,
lg_abbrev: Optional[str] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f'patch_division - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_division - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_div = Division.get_or_none(Division.id == division_id)
if this_div is None:
db.close()
raise HTTPException(status_code=404, detail=f'Division ID {division_id} not found')
raise HTTPException(
status_code=404, detail=f"Division ID {division_id} not found"
)
if div_name is not None:
this_div.division_name = div_name
@@ -88,15 +110,19 @@ async def patch_division(
return r_division
else:
db.close()
raise HTTPException(status_code=500, detail=f'Unable to patch division {division_id}')
raise HTTPException(
status_code=500, detail=f"Unable to patch division {division_id}"
)
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_division(new_division: DivisionModel, token: str = Depends(oauth2_scheme)):
async def post_division(
new_division: DivisionModel, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f'post_division - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"post_division - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_division = Division(**new_division.dict())
@@ -106,27 +132,29 @@ async def post_division(new_division: DivisionModel, token: str = Depends(oauth2
return r_division
else:
db.close()
raise HTTPException(status_code=500, detail=f'Unable to post division')
raise HTTPException(status_code=500, detail="Unable to post division")
@router.delete('/{division_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.delete("/{division_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_division(division_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'delete_division - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"delete_division - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_div = Division.get_or_none(Division.id == division_id)
if this_div is None:
db.close()
raise HTTPException(status_code=404, detail=f'Division ID {division_id} not found')
raise HTTPException(
status_code=404, detail=f"Division ID {division_id} not found"
)
count = this_div.delete_instance()
db.close()
if count == 1:
return f'Division {division_id} has been deleted'
return f"Division {division_id} has been deleted"
else:
raise HTTPException(status_code=500, detail=f'Division {division_id} could not be deleted')
raise HTTPException(
status_code=500, detail=f"Division {division_id} could not be deleted"
)
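
Note the ordering in `get_divisions`: `total_count = all_divisions.count()` is captured *before* `.offset(offset).limit(limit)` is applied, so the response's `count` reports the whole filtered set rather than the size of one page. The same ordering in plain `sqlite3` (table name and data are illustrative only):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE division (id INTEGER PRIMARY KEY, season INTEGER)")
conn.executemany("INSERT INTO division (season) VALUES (?)", [(1,)] * 25)
conn.commit()

def get_divisions(season: int, limit: int = 10, offset: int = 0):
    # Count the full filtered set *before* applying LIMIT/OFFSET,
    # so "count" reflects the total, not the page size.
    (total,) = conn.execute(
        "SELECT COUNT(*) FROM division WHERE season = ?", (season,)
    ).fetchone()
    rows = conn.execute(
        "SELECT id FROM division WHERE season = ? LIMIT ? OFFSET ?",
        (season, limit, offset),
    ).fetchall()
    return {"count": total, "divisions": [r[0] for r in rows]}
```

Clients can then page through results (`offset += limit`) while still knowing the overall total from any single response.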


@@ -4,15 +4,19 @@ import logging
import pydantic
from ..db_engine import db, DraftList, Team, model_to_dict, chunked
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/draftlist',
tags=['draftlist']
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/draftlist", tags=["draftlist"])
class DraftListModel(pydantic.BaseModel):
season: int
@@ -26,13 +30,18 @@ class DraftListList(pydantic.BaseModel):
draft_list: List[DraftListModel]
@router.get('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.get("", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def get_draftlist(
season: Optional[int], team_id: list = Query(default=None), token: str = Depends(oauth2_scheme)):
season: Optional[int],
team_id: list = Query(default=None),
token: str = Depends(oauth2_scheme),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
if not valid_token(token):
logger.warning(f'get_draftlist - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"get_draftlist - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
all_list = DraftList.select()
@@ -41,47 +50,52 @@ async def get_draftlist(
if team_id is not None:
all_list = all_list.where(DraftList.team_id << team_id)
r_list = {
'count': all_list.count(),
'picks': [model_to_dict(x) for x in all_list]
}
total_count = all_list.count()
all_list = all_list.offset(offset).limit(limit)
r_list = {"count": total_count, "picks": [model_to_dict(x) for x in all_list]}
db.close()
return r_list
@router.get('/team/{team_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.get("/team/{team_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def get_team_draftlist(team_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'post_draftlist - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"get_team_draftlist - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_team = Team.get_or_none(Team.id == team_id)
if this_team is None:
raise HTTPException(status_code=404, detail=f'Team ID {team_id} not found')
raise HTTPException(status_code=404, detail=f"Team ID {team_id} not found")
this_list = DraftList.select().where(DraftList.team == this_team)
r_list = {
'count': this_list.count(),
'picks': [model_to_dict(x) for x in this_list]
"count": this_list.count(),
"picks": [model_to_dict(x) for x in this_list],
}
db.close()
return r_list
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_draftlist(draft_list: DraftListList, token: str = Depends(oauth2_scheme)):
async def post_draftlist(
draft_list: DraftListList, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f'post_draftlist - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"post_draftlist - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
new_list = []
this_team = Team.get_or_none(Team.id == draft_list.draft_list[0].team_id)
if this_team is None:
raise HTTPException(status_code=404, detail=f'Team ID {draft_list.draft_list[0].team_id} not found')
raise HTTPException(
status_code=404,
detail=f"Team ID {draft_list.draft_list[0].team_id} not found",
)
DraftList.delete().where(DraftList.team == this_team).execute()
@@ -93,16 +107,16 @@ async def post_draftlist(draft_list: DraftListList, token: str = Depends(oauth2_
DraftList.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f'Inserted {len(new_list)} list values'
return f"Inserted {len(new_list)} list values"
@router.delete('/team/{team_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.delete("/team/{team_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_draftlist(team_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'delete_draftlist - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"delete_draftlist - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
count = DraftList.delete().where(DraftList.team_id == team_id).execute()
db.close()
return f'Deleted {count} list values'
return f"Deleted {count} list values"
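
The `chunked(...)` / `insert_many(batch).on_conflict_ignore().execute()` pattern used by these POST handlers keeps bulk inserts under the database's bound-parameter limits by writing fixed-size batches. A self-contained sketch of the batching half — `chunked` here is a stand-in for the peewee helper imported from `db_engine`, and `insert_many` is a placeholder:

```python
from itertools import islice

def chunked(iterable, n):
    """Yield successive lists of up to n items (stand-in for peewee.chunked)."""
    it = iter(iterable)
    while batch := list(islice(it, n)):
        yield batch

inserted = []

def insert_many(batch):
    # Placeholder for Model.insert_many(batch).on_conflict_ignore().execute()
    inserted.extend(batch)

rows = [{"pick": i} for i in range(250)]
for batch in chunked(rows, 100):
    insert_many(batch)
```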


@@ -4,15 +4,19 @@ import logging
import pydantic
from ..db_engine import db, DraftPick, Team, model_to_dict, chunked
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/draftpicks',
tags=['draftpicks']
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/draftpicks", tags=["draftpicks"])
class DraftPickModel(pydantic.BaseModel):
overall: Optional[int] = None
@@ -32,15 +36,26 @@ class DraftPickReturnList(pydantic.BaseModel):
picks: list[DraftPickModel]
@router.get('')
@router.get("")
@handle_db_errors
async def get_picks(
season: int, owner_team_abbrev: list = Query(default=None), orig_team_abbrev: list = Query(default=None),
owner_team_id: list = Query(default=None), orig_team_id: list = Query(default=None),
pick_round_start: Optional[int] = None, pick_round_end: Optional[int] = None, traded: Optional[bool] = None,
overall: Optional[int] = None, overall_start: Optional[int] = None, overall_end: Optional[int] = None,
short_output: Optional[bool] = False, sort: Optional[str] = None, limit: Optional[int] = None,
player_id: list = Query(default=None), player_taken: Optional[bool] = None):
season: int,
owner_team_abbrev: list = Query(default=None),
orig_team_abbrev: list = Query(default=None),
owner_team_id: list = Query(default=None),
orig_team_id: list = Query(default=None),
pick_round_start: Optional[int] = None,
pick_round_end: Optional[int] = None,
traded: Optional[bool] = None,
overall: Optional[int] = None,
overall_start: Optional[int] = None,
overall_end: Optional[int] = None,
short_output: Optional[bool] = False,
sort: Optional[str] = None,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
player_id: list = Query(default=None),
player_taken: Optional[bool] = None,
):
all_picks = DraftPick.select().where(DraftPick.season == season)
if owner_team_abbrev is not None:
@@ -61,16 +76,25 @@ async def get_picks(
if owner_team_id is not None:
all_picks = all_picks.where(
(DraftPick.owner_id << owner_team_id) | (DraftPick.owner_id << owner_team_id)
DraftPick.owner_id << owner_team_id
)
if orig_team_id is not None:
all_picks = all_picks.where(
(DraftPick.origowner_id << orig_team_id) | (DraftPick.origowner_id << orig_team_id)
DraftPick.origowner_id << orig_team_id
)
if pick_round_start is not None and pick_round_end is not None and pick_round_end < pick_round_start:
raise HTTPException(status_code=400, detail=f'pick_round_end must be greater than or equal to pick_round_start')
if (
pick_round_start is not None
and pick_round_end is not None
and pick_round_end < pick_round_start
):
raise HTTPException(
status_code=400,
detail="pick_round_end must be greater than or equal to pick_round_start",
)
if player_id is not None:
all_picks = all_picks.where(DraftPick.player_id << player_id)
@@ -88,44 +112,45 @@ async def get_picks(
all_picks = all_picks.where(DraftPick.overall <= overall_end)
if player_taken is not None:
all_picks = all_picks.where(DraftPick.player.is_null(not player_taken))
if limit is not None:
all_picks = all_picks.limit(limit)
all_picks = all_picks.limit(limit)
if sort is not None:
if sort == 'order-asc':
if sort == "order-asc":
all_picks = all_picks.order_by(DraftPick.overall)
elif sort == 'order-desc':
elif sort == "order-desc":
all_picks = all_picks.order_by(-DraftPick.overall)
return_picks = {'count': all_picks.count(), 'picks': []}
return_picks = {"count": all_picks.count(), "picks": []}
for line in all_picks:
return_picks['picks'].append(model_to_dict(line, recurse=not short_output))
return_picks["picks"].append(model_to_dict(line, recurse=not short_output))
db.close()
return return_picks
@router.get('/{pick_id}')
@router.get("/{pick_id}")
@handle_db_errors
async def get_one_pick(pick_id: int, short_output: Optional[bool] = False):
this_pick = DraftPick.get_or_none(DraftPick.id == pick_id)
if this_pick is not None:
r_pick = model_to_dict(this_pick, recurse=not short_output)
else:
raise HTTPException(status_code=404, detail=f'Pick ID {pick_id} not found')
raise HTTPException(status_code=404, detail=f"Pick ID {pick_id} not found")
db.close()
return r_pick
@router.patch('/{pick_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{pick_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_pick(pick_id: int, new_pick: DraftPickModel, token: str = Depends(oauth2_scheme)):
async def patch_pick(
pick_id: int, new_pick: DraftPickModel, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f'patch_pick - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_pick - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
if DraftPick.get_or_none(DraftPick.id == pick_id) is None:
raise HTTPException(status_code=404, detail=f'Pick ID {pick_id} not found')
raise HTTPException(status_code=404, detail=f"Pick ID {pick_id} not found")
DraftPick.update(**new_pick.dict()).where(DraftPick.id == pick_id).execute()
r_pick = model_to_dict(DraftPick.get_by_id(pick_id))
@@ -133,21 +158,23 @@ async def patch_pick(pick_id: int, new_pick: DraftPickModel, token: str = Depend
return r_pick
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_picks(p_list: DraftPickList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'post_picks - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"post_picks - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
new_picks = []
for pick in p_list.picks:
dupe = DraftPick.get_or_none(DraftPick.season == pick.season, DraftPick.overall == pick.overall)
dupe = DraftPick.get_or_none(
DraftPick.season == pick.season, DraftPick.overall == pick.overall
)
if dupe:
db.close()
raise HTTPException(
status_code=500,
detail=f'Pick # {pick.overall} already exists for season {pick.season}'
detail=f"Pick # {pick.overall} already exists for season {pick.season}",
)
new_picks.append(pick.dict())
@@ -157,25 +184,26 @@ async def post_picks(p_list: DraftPickList, token: str = Depends(oauth2_scheme))
DraftPick.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f'Inserted {len(new_picks)} picks'
return f"Inserted {len(new_picks)} picks"
@router.delete('/{pick_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.delete("/{pick_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_pick(pick_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'delete_pick - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"delete_pick - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_pick = DraftPick.get_or_none(DraftPick.id == pick_id)
if this_pick is None:
raise HTTPException(status_code=404, detail=f'Pick ID {pick_id} not found')
raise HTTPException(status_code=404, detail=f"Pick ID {pick_id} not found")
count = this_pick.delete_instance()
db.close()
if count == 1:
return f'Draft pick {pick_id} has been deleted'
return f"Draft pick {pick_id} has been deleted"
else:
raise HTTPException(status_code=500, detail=f'Draft pick {pick_id} could not be deleted')
raise HTTPException(
status_code=500, detail=f"Draft pick {pick_id} could not be deleted"
)
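
The reflowed `pick_round_start`/`pick_round_end` guard in `get_picks` rejects an inverted range with a 400 up front, rather than running the query and returning an empty page that looks like a valid result. The check in isolation, with a plain `ValueError` standing in for `HTTPException(status_code=400, ...)`:

```python
from typing import Optional

def check_round_range(start: Optional[int], end: Optional[int]) -> None:
    """Stand-in for the get_picks range guard; raises instead of HTTP 400."""
    # Only validate when both bounds are supplied; a single bound is fine.
    if start is not None and end is not None and end < start:
        raise ValueError(
            "pick_round_end must be greater than or equal to pick_round_start"
        )
```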


@@ -3,40 +3,61 @@ from typing import List, Optional, Literal
import logging
import pydantic
from ..db_engine import db, BattingStat, Team, Player, Current, model_to_dict, chunked, fn, per_season_weeks
from ..dependencies import oauth2_scheme, valid_token, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/fieldingstats',
tags=['fieldingstats']
from ..db_engine import (
db,
BattingStat,
Team,
Player,
Current,
model_to_dict,
chunked,
fn,
per_season_weeks,
)
from ..dependencies import (
oauth2_scheme,
valid_token,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
@router.get('')
router = APIRouter(prefix="/api/v3/fieldingstats", tags=["fieldingstats"])
@router.get("")
@handle_db_errors
async def get_fieldingstats(
season: int, s_type: Optional[str] = 'regular', team_abbrev: list = Query(default=None),
player_name: list = Query(default=None), player_id: list = Query(default=None),
week_start: Optional[int] = None, week_end: Optional[int] = None, game_num: list = Query(default=None),
position: list = Query(default=None), limit: Optional[int] = None, sort: Optional[str] = None,
short_output: Optional[bool] = True):
if 'post' in s_type.lower():
season: int,
s_type: Optional[str] = "regular",
team_abbrev: list = Query(default=None),
player_name: list = Query(default=None),
player_id: list = Query(default=None),
week_start: Optional[int] = None,
week_end: Optional[int] = None,
game_num: list = Query(default=None),
position: list = Query(default=None),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
sort: Optional[str] = None,
short_output: Optional[bool] = True,
):
if "post" in s_type.lower():
all_stats = BattingStat.post_season(season)
if all_stats.count() == 0:
db.close()
return {'count': 0, 'stats': []}
elif s_type.lower() in ['combined', 'total', 'all']:
return {"count": 0, "stats": []}
elif s_type.lower() in ["combined", "total", "all"]:
all_stats = BattingStat.combined_season(season)
if all_stats.count() == 0:
db.close()
return {'count': 0, 'stats': []}
return {"count": 0, "stats": []}
else:
all_stats = BattingStat.regular_season(season)
if all_stats.count() == 0:
db.close()
return {'count': 0, 'stats': []}
return {"count": 0, "stats": []}
all_stats = all_stats.where(
(BattingStat.xch > 0) | (BattingStat.pb > 0) | (BattingStat.sbc > 0)
@@ -51,7 +72,9 @@ async def get_fieldingstats(
if player_id:
all_stats = all_stats.where(BattingStat.player_id << player_id)
else:
p_query = Player.select_season(season).where(fn.Lower(Player.name) << [x.lower() for x in player_name])
p_query = Player.select_season(season).where(
fn.Lower(Player.name) << [x.lower() for x in player_name]
)
all_stats = all_stats.where(BattingStat.player << p_query)
if game_num:
all_stats = all_stats.where(BattingStat.game == game_num)
@@ -66,72 +89,92 @@ async def get_fieldingstats(
db.close()
raise HTTPException(
status_code=404,
detail=f'Start week {start} is after end week {end} - cannot pull stats'
detail=f"Start week {start} is after end week {end} - cannot pull stats",
)
all_stats = all_stats.where(
(BattingStat.week >= start) & (BattingStat.week <= end)
)
all_stats = all_stats.where((BattingStat.week >= start) & (BattingStat.week <= end))
if limit:
all_stats = all_stats.limit(limit)
total_count = all_stats.count()
all_stats = all_stats.limit(limit)
if sort:
if sort == 'newest':
if sort == "newest":
all_stats = all_stats.order_by(-BattingStat.week, -BattingStat.game)
return_stats = {
"count": total_count,
"stats": [
{
"player": x.player_id
if short_output
else model_to_dict(x.player, recurse=False),
"team": x.team_id
if short_output
else model_to_dict(x.team, recurse=False),
"pos": x.pos,
"xch": x.xch,
"xhit": x.xhit,
"error": x.error,
"pb": x.pb,
"sbc": x.sbc,
"csc": x.csc,
"week": x.week,
"game": x.game,
"season": x.season,
}
for x in all_stats
],
}
db.close()
return return_stats
@router.get("/totals")
@handle_db_errors
async def get_totalstats(
season: int,
s_type: Literal["regular", "post", "total", None] = None,
team_abbrev: list = Query(default=None),
team_id: list = Query(default=None),
player_name: list = Query(default=None),
week_start: Optional[int] = None,
week_end: Optional[int] = None,
game_num: list = Query(default=None),
position: list = Query(default=None),
sort: Optional[str] = None,
player_id: list = Query(default=None),
group_by: Literal["team", "player", "playerteam"] = "player",
short_output: Optional[bool] = False,
min_ch: Optional[int] = 1,
week: list = Query(default=None),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
# Build SELECT fields conditionally based on group_by to match GROUP BY exactly
select_fields = []
if group_by == "player":
select_fields = [BattingStat.player, BattingStat.pos]
elif group_by == "team":
select_fields = [BattingStat.team, BattingStat.pos]
elif group_by == "playerteam":
select_fields = [BattingStat.player, BattingStat.team, BattingStat.pos]
else:
# Default case
select_fields = [BattingStat.player, BattingStat.pos]
all_stats = (
BattingStat.select(
*select_fields,
fn.SUM(BattingStat.xch).alias("sum_xch"),
fn.SUM(BattingStat.xhit).alias("sum_xhit"),
fn.SUM(BattingStat.error).alias("sum_error"),
fn.SUM(BattingStat.pb).alias("sum_pb"),
fn.SUM(BattingStat.sbc).alias("sum_sbc"),
fn.SUM(BattingStat.csc).alias("sum_csc"),
)
.where(BattingStat.season == season)
.having(fn.SUM(BattingStat.xch) >= min_ch)
)
if s_type is not None or week_start is not None or week_end is not None:
@@ -141,16 +184,20 @@ async def get_totalstats(
elif week_start is not None or week_end is not None:
if week_start is None or week_end is None:
raise HTTPException(
status_code=400,
detail="Both week_start and week_end must be included if either is used.",
)
weeks["start"] = week_start
if week_end < weeks["start"]:
raise HTTPException(
status_code=400,
detail="week_end must be greater than or equal to week_start",
)
else:
weeks["end"] = week_end
all_stats = all_stats.where(
(BattingStat.week >= weeks["start"]) & (BattingStat.week <= weeks["end"])
)
elif week is not None:
@@ -161,14 +208,20 @@ async def get_totalstats(
if position is not None:
p_list = [x.upper() for x in position]
all_players = Player.select().where(
(Player.pos_1 << p_list)
| (Player.pos_2 << p_list)
| (Player.pos_3 << p_list)
| (Player.pos_4 << p_list)
| (Player.pos_5 << p_list)
| (Player.pos_6 << p_list)
| (Player.pos_7 << p_list)
| (Player.pos_8 << p_list)
)
all_stats = all_stats.where(BattingStat.player << all_players)
if sort is not None:
if sort == "player":
all_stats = all_stats.order_by(BattingStat.player)
elif sort == "team":
all_stats = all_stats.order_by(BattingStat.team)
if group_by is not None:
# Use the same fields for GROUP BY as we used for SELECT
@@ -177,47 +230,57 @@ async def get_totalstats(
all_teams = Team.select().where(Team.id << team_id)
all_stats = all_stats.where(BattingStat.team << all_teams)
elif team_abbrev is not None:
all_teams = Team.select().where(
fn.Lower(Team.abbrev) << [x.lower() for x in team_abbrev]
)
all_stats = all_stats.where(BattingStat.team << all_teams)
if player_name is not None:
all_players = Player.select().where(
fn.Lower(Player.name) << [x.lower() for x in player_name]
)
all_stats = all_stats.where(BattingStat.player << all_players)
elif player_id is not None:
all_players = Player.select().where(Player.id << player_id)
all_stats = all_stats.where(BattingStat.player << all_players)
total_count = all_stats.count()
all_stats = all_stats.offset(offset).limit(limit)
return_stats = {"count": total_count, "stats": []}
for x in all_stats:
if x.sum_xch + x.sum_sbc <= 0:
continue
# Handle player field based on grouping with safe access
this_player = "TOT"
if "player" in group_by and hasattr(x, "player"):
this_player = (
x.player_id if short_output else model_to_dict(x.player, recurse=False)
)
# Handle team field based on grouping with safe access
this_team = "TOT"
if "team" in group_by and hasattr(x, "team"):
this_team = (
x.team_id if short_output else model_to_dict(x.team, recurse=False)
)
return_stats["stats"].append(
{
"player": this_player,
"team": this_team,
"pos": x.pos,
"xch": x.sum_xch,
"xhit": x.sum_xhit,
"error": x.sum_error,
"pb": x.sum_pb,
"sbc": x.sum_sbc,
"csc": x.sum_csc,
}
)
return_stats["count"] = len(return_stats["stats"])
db.close()
return return_stats
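The endpoints in this diff share one pagination pattern: compute the total row count before applying offset/limit, so the returned `count` reflects the full result set rather than just the page. A minimal pure-Python sketch of that pattern (no peewee; a plain `rows` list stands in for the query):

```python
def paginate(rows, limit, offset=0):
    # Count the full result set first, then slice only the page returned,
    # mirroring: total_count = query.count(); query.offset(offset).limit(limit)
    total_count = len(rows)
    page = rows[offset:offset + limit]
    return {"count": total_count, "stats": page}
```

Because `count` is the pre-slice total, a caller can tell whether more pages remain by checking `offset + limit < count`.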

View File

@@ -6,15 +6,17 @@ from pydantic import BaseModel, Field
from playhouse.shortcuts import model_to_dict
from peewee import fn
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
)
from ..db_engine import db, HelpCommand
logger = logging.getLogger("database_api")
router = APIRouter(prefix="/api/v3/help_commands", tags=["help_commands"])
# Pydantic Models for API
@@ -52,15 +54,16 @@ class HelpCommandStatsResponse(BaseModel):
# API Endpoints
@router.get("")
@handle_db_errors
async def get_help_commands(
name: Optional[str] = None,
category: Optional[str] = None,
is_active: Optional[bool] = True,
sort: Optional[str] = "name",
page: int = Query(1, ge=1),
page_size: int = Query(25, ge=1, le=100),
):
"""Get help commands with filtering and pagination"""
try:
@@ -82,16 +85,16 @@ async def get_help_commands(
# Apply sorting
sort_mapping = {
"name": HelpCommand.name,
"title": HelpCommand.title,
"category": HelpCommand.category,
"created_at": HelpCommand.created_at,
"updated_at": HelpCommand.updated_at,
"view_count": HelpCommand.view_count,
"display_order": HelpCommand.display_order,
}
if sort.startswith("-"):
sort_field = sort[1:]
order_by = sort_mapping.get(sort_field, HelpCommand.name).desc()
else:
@@ -111,10 +114,10 @@ async def get_help_commands(
for help_cmd in query:
cmd_dict = model_to_dict(help_cmd)
# Convert datetime objects to ISO strings
if cmd_dict.get("created_at"):
cmd_dict["created_at"] = cmd_dict["created_at"].isoformat()
if cmd_dict.get("updated_at"):
cmd_dict["updated_at"] = cmd_dict["updated_at"].isoformat()
try:
command_model = HelpCommandModel(**cmd_dict)
@@ -129,7 +132,7 @@ async def get_help_commands(
page=page,
page_size=page_size,
total_pages=total_pages,
has_more=page < total_pages,
)
except Exception as e:
@@ -139,22 +142,23 @@ async def get_help_commands(
db.close()
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def create_help_command_endpoint(
command: HelpCommandModel, token: str = Depends(oauth2_scheme)
):
"""Create a new help command"""
if not valid_token(token):
logger.warning(f"create_help_command - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
# Check if command name already exists
existing = HelpCommand.get_by_name(command.name)
if existing:
raise HTTPException(
status_code=409, detail=f"Help topic '{command.name}' already exists"
)
# Create the command
help_cmd = HelpCommand.create(
@@ -166,15 +170,15 @@ async def create_help_command_endpoint(
created_at=datetime.now(),
is_active=True,
view_count=0,
display_order=command.display_order,
)
# Return the created command
result = model_to_dict(help_cmd)
if result.get("created_at"):
result["created_at"] = result["created_at"].isoformat()
if result.get("updated_at"):
result["updated_at"] = result["updated_at"].isoformat()
return result
@@ -187,23 +191,23 @@ async def create_help_command_endpoint(
db.close()
@router.put("/{command_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def update_help_command_endpoint(
command_id: int, command: HelpCommandModel, token: str = Depends(oauth2_scheme)
):
"""Update an existing help command"""
if not valid_token(token):
logger.warning(f"update_help_command - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
# Get the command
help_cmd = HelpCommand.get_or_none(HelpCommand.id == command_id)
if not help_cmd:
raise HTTPException(
status_code=404, detail=f"Help command {command_id} not found"
)
# Update fields
if command.title:
@@ -222,10 +226,10 @@ async def update_help_command_endpoint(
# Return updated command
result = model_to_dict(help_cmd)
if result.get("created_at"):
result["created_at"] = result["created_at"].isoformat()
if result.get("updated_at"):
result["updated_at"] = result["updated_at"].isoformat()
return result
@@ -238,32 +242,33 @@ async def update_help_command_endpoint(
db.close()
@router.patch("/{command_id}/restore", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def restore_help_command_endpoint(
command_id: int, token: str = Depends(oauth2_scheme)
):
"""Restore a soft-deleted help command"""
if not valid_token(token):
logger.warning(f"restore_help_command - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
# Get the command
help_cmd = HelpCommand.get_or_none(HelpCommand.id == command_id)
if not help_cmd:
raise HTTPException(
status_code=404, detail=f"Help command {command_id} not found"
)
# Restore the command
help_cmd.restore()
# Return restored command
result = model_to_dict(help_cmd)
if result.get("created_at"):
result["created_at"] = result["created_at"].isoformat()
if result.get("updated_at"):
result["updated_at"] = result["updated_at"].isoformat()
return result
@@ -276,22 +281,23 @@ async def restore_help_command_endpoint(
db.close()
@router.delete("/{command_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_help_command_endpoint(
command_id: int, token: str = Depends(oauth2_scheme)
):
"""Soft delete a help command"""
if not valid_token(token):
logger.warning(f"delete_help_command - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
# Get the command
help_cmd = HelpCommand.get_or_none(HelpCommand.id == command_id)
if not help_cmd:
raise HTTPException(
status_code=404, detail=f"Help command {command_id} not found"
)
# Soft delete the command
help_cmd.soft_delete()
@@ -307,44 +313,56 @@ async def delete_help_command_endpoint(
db.close()
@router.get("/stats")
@handle_db_errors
async def get_help_command_stats():
"""Get comprehensive statistics about help commands"""
try:
# Get basic counts
total_commands = HelpCommand.select().count()
active_commands = (
HelpCommand.select().where(HelpCommand.is_active == True).count()
)
# Get total views
total_views = (
HelpCommand.select(fn.SUM(HelpCommand.view_count))
.where(HelpCommand.is_active == True)
.scalar()
or 0
)
# Get most viewed command
most_viewed = HelpCommand.get_most_viewed(limit=1).first()
most_viewed_command = None
if most_viewed:
most_viewed_dict = model_to_dict(most_viewed)
if most_viewed_dict.get("created_at"):
most_viewed_dict["created_at"] = most_viewed_dict[
"created_at"
].isoformat()
if most_viewed_dict.get("updated_at"):
most_viewed_dict["updated_at"] = most_viewed_dict[
"updated_at"
].isoformat()
most_viewed_command = most_viewed_dict
# Get recent commands count (last 7 days)
week_ago = datetime.now() - timedelta(days=7)
recent_count = (
HelpCommand.select()
.where(
(HelpCommand.created_at >= week_ago) & (HelpCommand.is_active == True)
)
.count()
)
return HelpCommandStatsResponse(
total_commands=total_commands,
active_commands=active_commands,
total_views=int(total_views),
most_viewed_command=most_viewed_command,
recent_commands_count=recent_count,
)
except Exception as e:
@@ -355,24 +373,27 @@ async def get_help_command_stats():
# Special endpoints for Discord bot integration
@router.get("/by_name/{command_name}")
@handle_db_errors
async def get_help_command_by_name_endpoint(
command_name: str, include_inactive: bool = Query(False)
):
"""Get a help command by name (for Discord bot)"""
try:
help_cmd = HelpCommand.get_by_name(
command_name, include_inactive=include_inactive
)
if not help_cmd:
raise HTTPException(
status_code=404, detail=f"Help topic '{command_name}' not found"
)
result = model_to_dict(help_cmd)
if result.get("created_at"):
result["created_at"] = result["created_at"].isoformat()
if result.get("updated_at"):
result["updated_at"] = result["updated_at"].isoformat()
return result
@@ -385,32 +406,31 @@ async def get_help_command_by_name_endpoint(
db.close()
@router.patch("/by_name/{command_name}/view", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def increment_view_count(command_name: str, token: str = Depends(oauth2_scheme)):
"""Increment view count for a help command"""
if not valid_token(token):
logger.warning(f"increment_view_count - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
help_cmd = HelpCommand.get_by_name(command_name)
if not help_cmd:
raise HTTPException(
status_code=404, detail=f"Help topic '{command_name}' not found"
)
# Increment view count
help_cmd.increment_view_count()
# Return updated command
result = model_to_dict(help_cmd)
if result.get("created_at"):
result["created_at"] = result["created_at"].isoformat()
if result.get("updated_at"):
result["updated_at"] = result["updated_at"].isoformat()
return result
@@ -423,11 +443,10 @@ async def increment_view_count(
db.close()
@router.get("/autocomplete")
@handle_db_errors
async def get_help_names_for_autocomplete(
q: str = Query(""), limit: int = Query(25, ge=1, le=100)
):
"""Get help command names for Discord autocomplete"""
try:
@@ -438,11 +457,11 @@ async def get_help_names_for_autocomplete(
# Return list of dictionaries with name, title, category
return {
"results": [
{
"name": help_cmd.name,
"title": help_cmd.title,
"category": help_cmd.category,
}
for help_cmd in results
]
@@ -455,7 +474,7 @@ async def get_help_names_for_autocomplete(
db.close()
@router.get("/{command_id}")
@handle_db_errors
async def get_help_command(command_id: int):
"""Get a single help command by ID"""
@@ -463,13 +482,15 @@ async def get_help_command(command_id: int):
help_cmd = HelpCommand.get_or_none(HelpCommand.id == command_id)
if not help_cmd:
raise HTTPException(
status_code=404, detail=f"Help command {command_id} not found"
)
result = model_to_dict(help_cmd)
if result.get("created_at"):
result["created_at"] = result["created_at"].isoformat()
if result.get("updated_at"):
result["updated_at"] = result["updated_at"].isoformat()
return result
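Every handler in this router repeats the same `created_at`/`updated_at` ISO conversion on the `model_to_dict` output. A small helper could centralize it; this is a sketch of that idea, not a function that exists in the codebase:

```python
from datetime import datetime

def serialize_timestamps(record: dict) -> dict:
    # Convert datetime fields to ISO-8601 strings in place, exactly as each
    # endpoint currently does inline after model_to_dict().
    for key in ("created_at", "updated_at"):
        if isinstance(record.get(key), datetime):
            record[key] = record[key].isoformat()
    return record
```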

View File

@@ -4,15 +4,19 @@ import logging
import pydantic
from ..db_engine import db, Injury, Player, model_to_dict, fn
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/injuries", tags=["injuries"])
class InjuryModel(pydantic.BaseModel):
season: int
@@ -25,12 +29,20 @@ class InjuryModel(pydantic.BaseModel):
is_active: bool = True
@router.get("")
@handle_db_errors
async def get_injuries(
season: list = Query(default=None),
player_id: list = Query(default=None),
min_games: int = None,
max_games: int = None,
team_id: list = Query(default=None),
is_active: bool = None,
short_output: bool = False,
sort: Optional[str] = "start-asc",
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
all_injuries = Injury.select()
if season is not None:
@@ -47,34 +59,41 @@ async def get_injuries(
all_players = Player.select().where(Player.team_id << team_id)
all_injuries = all_injuries.where(Injury.player << all_players)
if sort == "return-asc":
all_injuries = all_injuries.order_by(Injury.end_week, Injury.end_game)
elif sort == "return-desc":
all_injuries = all_injuries.order_by(-Injury.end_week, -Injury.end_game)
elif sort == "start-asc":
all_injuries = all_injuries.order_by(Injury.start_week, Injury.start_game)
elif sort == "start-desc":
all_injuries = all_injuries.order_by(-Injury.start_week, -Injury.start_game)
total_count = all_injuries.count()
all_injuries = all_injuries.offset(offset).limit(limit)
return_injuries = {
"count": total_count,
"injuries": [model_to_dict(x, recurse=not short_output) for x in all_injuries],
}
db.close()
return return_injuries
@router.patch("/{injury_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_injury(
injury_id: int,
is_active: Optional[bool] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_injury - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_injury = Injury.get_or_none(Injury.id == injury_id)
if this_injury is None:
db.close()
raise HTTPException(status_code=404, detail=f"Injury ID {injury_id} not found")
if is_active is not None:
this_injury.is_active = is_active
@@ -85,15 +104,17 @@ async def patch_injury(injury_id: int, is_active: Optional[bool] = None, token:
return r_injury
else:
db.close()
raise HTTPException(
status_code=500, detail=f"Unable to patch injury {injury_id}"
)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_injury(new_injury: InjuryModel, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_injury - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_injury = Injury(**new_injury.dict())
@@ -103,28 +124,27 @@ async def post_injury(new_injury: InjuryModel, token: str = Depends(oauth2_schem
return r_injury
else:
db.close()
        raise HTTPException(status_code=500, detail="Unable to post injury")
@router.delete("/{injury_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_injury(injury_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_injury - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_injury = Injury.get_or_none(Injury.id == injury_id)
if this_injury is None:
db.close()
raise HTTPException(status_code=404, detail=f"Injury ID {injury_id} not found")
count = this_injury.delete_instance()
db.close()
if count == 1:
return f"Injury {injury_id} has been deleted"
else:
raise HTTPException(
status_code=500, detail=f"Unable to delete injury {injury_id}"
)
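`get_injuries` maps the `sort` query value onto `order_by` chains over (week, game) pairs: `return-*` sorts on the projected return date, `start-*` on when the injury began. The same dispatch can be sketched in plain Python over hypothetical injury dicts (not the peewee models):

```python
def sort_injuries(injuries, sort="start-asc"):
    # Mirror the order_by chains: pick the field pair and direction,
    # defaulting to start-asc like the endpoint.
    field, reverse = {
        "return-asc": ("end", False),
        "return-desc": ("end", True),
        "start-asc": ("start", False),
        "start-desc": ("start", True),
    }.get(sort, ("start", False))
    return sorted(
        injuries,
        key=lambda i: (i[f"{field}_week"], i[f"{field}_game"]),
        reverse=reverse,
    )
```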

View File

@@ -4,15 +4,19 @@ import logging
import pydantic
from ..db_engine import db, Keeper, Player, model_to_dict, chunked, fn
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/keepers", tags=["keepers"])
class KeeperModel(pydantic.BaseModel):
season: int
@@ -25,11 +29,16 @@ class KeeperList(pydantic.BaseModel):
keepers: List[KeeperModel]
@router.get("")
@handle_db_errors
async def get_keepers(
season: list = Query(default=None),
team_id: list = Query(default=None),
player_id: list = Query(default=None),
short_output: bool = False,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
all_keepers = Keeper.select()
if season is not None:
@@ -39,26 +48,33 @@ async def get_keepers(
if player_id is not None:
all_keepers = all_keepers.where(Keeper.player_id << player_id)
total_count = all_keepers.count()
all_keepers = all_keepers.offset(offset).limit(limit)
return_keepers = {
"count": total_count,
"keepers": [model_to_dict(x, recurse=not short_output) for x in all_keepers],
}
db.close()
return return_keepers
@router.patch("/{keeper_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_keeper(
keeper_id: int,
season: Optional[int] = None,
team_id: Optional[int] = None,
player_id: Optional[int] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f"patch_keeper - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_keeper = Keeper.get_or_none(Keeper.id == keeper_id)
if not this_keeper:
raise HTTPException(status_code=404, detail=f"Keeper ID {keeper_id} not found")
if season is not None:
this_keeper.season = season
@@ -73,15 +89,17 @@ async def patch_keeper(
return r_keeper
else:
db.close()
raise HTTPException(
status_code=500, detail=f"Unable to patch keeper {keeper_id}"
)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_keepers(k_list: KeeperList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"post_keepers - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
new_keepers = []
for keeper in k_list.keepers:
@@ -92,26 +110,26 @@ async def post_keepers(k_list: KeeperList, token: str = Depends(oauth2_scheme)):
Keeper.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f"Inserted {len(new_keepers)} keepers"
@router.delete("/{keeper_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_keeper(keeper_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f"delete_keeper - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_keeper = Keeper.get_or_none(Keeper.id == keeper_id)
if not this_keeper:
raise HTTPException(status_code=404, detail=f"Keeper ID {keeper_id} not found")
count = this_keeper.delete_instance()
db.close()
if count == 1:
return f"Keeper ID {keeper_id} has been deleted"
else:
raise HTTPException(
status_code=500, detail=f"Keeper ID {keeper_id} could not be deleted"
)

@@ -4,15 +4,19 @@ import logging
import pydantic
from ..db_engine import db, Manager, Team, Current, model_to_dict, fn
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/managers',
tags=['managers']
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/managers", tags=["managers"])
class ManagerModel(pydantic.BaseModel):
name: str
@@ -21,45 +25,49 @@ class ManagerModel(pydantic.BaseModel):
bio: Optional[str] = None
@router.get('')
@router.get("")
@handle_db_errors
async def get_managers(
name: list = Query(default=None), active: Optional[bool] = None, short_output: Optional[bool] = False):
name: list = Query(default=None),
active: Optional[bool] = None,
short_output: Optional[bool] = False,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
if active is not None:
current = Current.latest()
t_query = Team.select_season(current.season)
t_query = t_query.where(
~(Team.abbrev.endswith('IL')) & ~(Team.abbrev.endswith('MiL'))
~(Team.abbrev.endswith("IL")) & ~(Team.abbrev.endswith("MiL"))
)
logger.info(f'tquery: {t_query}')
logger.info(f"tquery: {t_query}")
a_mgr = []
i_mgr = []
for x in t_query:
logger.info(f'Team: {x.abbrev} / mgr1: {x.manager1} / mgr2: {x.manager2}')
logger.info(f"Team: {x.abbrev} / mgr1: {x.manager1} / mgr2: {x.manager2}")
if x.manager1 is not None:
a_mgr.append(x.manager1)
logger.info(f'appending {x.manager1.name}')
logger.info(f"appending {x.manager1.name}")
if x.manager2 is not None:
a_mgr.append(x.manager2)
logger.info(f'appending {x.manager2.name}')
logger.info(f"appending {x.manager2.name}")
logger.info(f'a_mgr: {a_mgr}')
logger.info(f"a_mgr: {a_mgr}")
if active:
final_mgrs = [model_to_dict(y, recurse=not short_output) for y in a_mgr]
else:
logger.info(f'checking inactive')
logger.info("checking inactive")
for z in Manager.select():
logger.info(f'checking: {z.name}')
logger.info(f"checking: {z.name}")
if z not in a_mgr:
logger.info(f'+inactive: {z.name}')
logger.info(f"+inactive: {z.name}")
i_mgr.append(z)
final_mgrs = [model_to_dict(y, recurse=not short_output) for y in i_mgr]
return_managers = {
'count': len(final_mgrs),
'managers': final_mgrs
}
total_count = len(final_mgrs)
final_mgrs = final_mgrs[offset : offset + limit]
return_managers = {"count": total_count, "managers": final_mgrs}
else:
all_managers = Manager.select()
@@ -67,16 +75,20 @@ async def get_managers(
name_list = [x.lower() for x in name]
all_managers = all_managers.where(fn.Lower(Manager.name) << name_list)
total_count = all_managers.count()
all_managers = all_managers.offset(offset).limit(limit)
return_managers = {
'count': all_managers.count(),
'managers': [model_to_dict(x, recurse=not short_output) for x in all_managers]
"count": total_count,
"managers": [
model_to_dict(x, recurse=not short_output) for x in all_managers
],
}
db.close()
return return_managers
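The changes above apply the same pagination discipline in both branches: capture the total count before slicing, so the reported `count` reflects the full result set rather than just the returned page. A minimal sketch of the in-memory variant (the helper name is illustrative):

```python
def paginate(items, offset=0, limit=50):
    """Count before slicing, so the reported total covers the whole
    result set rather than just the returned page."""
    total = len(items)
    return {"count": total, "items": items[offset : offset + limit]}
```

The query-backed branch does the equivalent by calling `query.count()` before chaining `query.offset(offset).limit(limit)`.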
@router.get('/{manager_id}')
@router.get("/{manager_id}")
@handle_db_errors
async def get_one_manager(manager_id: int, short_output: Optional[bool] = False):
this_manager = Manager.get_or_none(Manager.id == manager_id)
@@ -85,22 +97,29 @@ async def get_one_manager(manager_id: int, short_output: Optional[bool] = False)
db.close()
return r_manager
else:
raise HTTPException(status_code=404, detail=f'Manager {manager_id} not found')
raise HTTPException(status_code=404, detail=f"Manager {manager_id} not found")
@router.patch('/{manager_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{manager_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_manager(
manager_id: int, name: Optional[str] = None, image: Optional[str] = None, headline: Optional[str] = None,
bio: Optional[str] = None, token: str = Depends(oauth2_scheme)):
manager_id: int,
name: Optional[str] = None,
image: Optional[str] = None,
headline: Optional[str] = None,
bio: Optional[str] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f'patch_manager - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_manager - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_manager = Manager.get_or_none(Manager.id == manager_id)
if this_manager is None:
db.close()
raise HTTPException(status_code=404, detail=f'Manager ID {manager_id} not found')
raise HTTPException(
status_code=404, detail=f"Manager ID {manager_id} not found"
)
if name is not None:
this_manager.name = name
@@ -117,15 +136,17 @@ async def patch_manager(
return r_manager
else:
db.close()
raise HTTPException(status_code=500, detail=f'Unable to patch manager {this_manager}')
raise HTTPException(
status_code=500, detail=f"Unable to patch manager {this_manager}"
)
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_manager(new_manager: ManagerModel, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'post_manager - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"post_manager - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_manager = Manager(**new_manager.dict())
@@ -135,25 +156,31 @@ async def post_manager(new_manager: ManagerModel, token: str = Depends(oauth2_sc
return r_manager
else:
db.close()
raise HTTPException(status_code=500, detail=f'Unable to post manager {this_manager.name}')
raise HTTPException(
status_code=500, detail=f"Unable to post manager {this_manager.name}"
)
@router.delete('/{manager_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.delete("/{manager_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_manager(manager_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'delete_manager - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"delete_manager - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_manager = Manager.get_or_none(Manager.id == manager_id)
if this_manager is None:
db.close()
raise HTTPException(status_code=404, detail=f'Manager ID {manager_id} not found')
raise HTTPException(
status_code=404, detail=f"Manager ID {manager_id} not found"
)
count = this_manager.delete_instance()
db.close()
if count == 1:
return f'Manager {manager_id} has been deleted'
return f"Manager {manager_id} has been deleted"
else:
raise HTTPException(status_code=500, detail=f'Manager {manager_id} could not be deleted')
raise HTTPException(
status_code=500, detail=f"Manager {manager_id} could not be deleted"
)

@@ -1,20 +1,31 @@
import datetime
import os
from fastapi import APIRouter, Depends, HTTPException, Query
from typing import List, Optional, Literal
import logging
import pydantic
from ..db_engine import db, PitchingStat, Team, Player, Current, model_to_dict, chunked, fn, per_season_weeks
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/pitchingstats',
tags=['pitchingstats']
from ..db_engine import (
db,
PitchingStat,
Team,
Player,
Current,
model_to_dict,
chunked,
fn,
per_season_weeks,
)
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/pitchingstats", tags=["pitchingstats"])
class PitStatModel(pydantic.BaseModel):
@@ -48,29 +59,37 @@ class PitStatList(pydantic.BaseModel):
stats: List[PitStatModel]
@router.get('')
@router.get("")
@handle_db_errors
async def get_pitstats(
season: int, s_type: Optional[str] = 'regular', team_abbrev: list = Query(default=None),
player_name: list = Query(default=None), player_id: list = Query(default=None),
week_start: Optional[int] = None, week_end: Optional[int] = None, game_num: list = Query(default=None),
limit: Optional[int] = None, ip_min: Optional[float] = None, sort: Optional[str] = None,
short_output: Optional[bool] = True):
if 'post' in s_type.lower():
season: int,
s_type: Optional[str] = "regular",
team_abbrev: list = Query(default=None),
player_name: list = Query(default=None),
player_id: list = Query(default=None),
week_start: Optional[int] = None,
week_end: Optional[int] = None,
game_num: list = Query(default=None),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
ip_min: Optional[float] = None,
sort: Optional[str] = None,
short_output: Optional[bool] = True,
):
if "post" in s_type.lower():
all_stats = PitchingStat.post_season(season)
if all_stats.count() == 0:
db.close()
return {'count': 0, 'stats': []}
elif s_type.lower() in ['combined', 'total', 'all']:
return {"count": 0, "stats": []}
elif s_type.lower() in ["combined", "total", "all"]:
all_stats = PitchingStat.combined_season(season)
if all_stats.count() == 0:
db.close()
return {'count': 0, 'stats': []}
return {"count": 0, "stats": []}
else:
all_stats = PitchingStat.regular_season(season)
if all_stats.count() == 0:
db.close()
return {'count': 0, 'stats': []}
return {"count": 0, "stats": []}
if team_abbrev is not None:
t_query = Team.select().where(Team.abbrev << [x.upper() for x in team_abbrev])
@@ -79,7 +98,9 @@ async def get_pitstats(
if player_id:
all_stats = all_stats.where(PitchingStat.player_id << player_id)
else:
p_query = Player.select_season(season).where(fn.Lower(Player.name) << [x.lower() for x in player_name])
p_query = Player.select_season(season).where(
fn.Lower(Player.name) << [x.lower() for x in player_name]
)
all_stats = all_stats.where(PitchingStat.player << p_query)
if game_num:
all_stats = all_stats.where(PitchingStat.game == game_num)
@@ -96,67 +117,91 @@ async def get_pitstats(
db.close()
raise HTTPException(
status_code=404,
detail=f'Start week {start} is after end week {end} - cannot pull stats'
detail=f"Start week {start} is after end week {end} - cannot pull stats",
)
all_stats = all_stats.where(
(PitchingStat.week >= start) & (PitchingStat.week <= end)
)
if limit:
all_stats = all_stats.limit(limit)
all_stats = all_stats.limit(limit)
if sort:
if sort == 'newest':
if sort == "newest":
all_stats = all_stats.order_by(-PitchingStat.week, -PitchingStat.game)
return_stats = {
'count': all_stats.count(),
'stats': [model_to_dict(x, recurse=not short_output) for x in all_stats]
"count": all_stats.count(),
"stats": [model_to_dict(x, recurse=not short_output) for x in all_stats],
}
db.close()
return return_stats
@router.get('/totals')
@router.get("/totals")
@handle_db_errors
async def get_totalstats(
season: int, s_type: Literal['regular', 'post', 'total', None] = None, team_abbrev: list = Query(default=None),
team_id: list = Query(default=None), player_name: list = Query(default=None),
week_start: Optional[int] = None, week_end: Optional[int] = None, game_num: list = Query(default=None),
is_sp: Optional[bool] = None, ip_min: Optional[float] = 0.25, sort: Optional[str] = None,
player_id: list = Query(default=None), short_output: Optional[bool] = False,
group_by: Literal['team', 'player', 'playerteam'] = 'player', week: list = Query(default=None)):
season: int,
s_type: Literal["regular", "post", "total", None] = None,
team_abbrev: list = Query(default=None),
team_id: list = Query(default=None),
player_name: list = Query(default=None),
week_start: Optional[int] = None,
week_end: Optional[int] = None,
game_num: list = Query(default=None),
is_sp: Optional[bool] = None,
ip_min: Optional[float] = 0.25,
sort: Optional[str] = None,
player_id: list = Query(default=None),
short_output: Optional[bool] = False,
group_by: Literal["team", "player", "playerteam"] = "player",
week: list = Query(default=None),
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
if sum(1 for x in [s_type, (week_start or week_end), week] if x is not None) > 1:
raise HTTPException(status_code=400, detail=f'Only one of s_type, week_start/week_end, or week may be used.')
raise HTTPException(
status_code=400,
detail="Only one of s_type, week_start/week_end, or week may be used.",
)
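The guard above counts how many of the mutually exclusive filters the caller supplied. The same check as a standalone sketch (parameter names mirror the endpoint's):

```python
def too_many_filters(s_type=None, week_start=None, week_end=None, week=None):
    """True when more than one of the s_type / week-range / week filters is set.
    week_start and week_end count as a single filter, as in the endpoint."""
    supplied = [s_type, (week_start or week_end), week]
    return sum(1 for x in supplied if x is not None) > 1
```

Note that `week_start or week_end` treats a value of 0 as unset — a quirk inherited from the truthiness check in the original expression.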
# Build SELECT fields conditionally based on group_by to match GROUP BY exactly
select_fields = []
if group_by == 'player':
if group_by == "player":
select_fields = [PitchingStat.player]
elif group_by == 'team':
elif group_by == "team":
select_fields = [PitchingStat.team]
elif group_by == 'playerteam':
elif group_by == "playerteam":
select_fields = [PitchingStat.player, PitchingStat.team]
else:
# Default case
select_fields = [PitchingStat.player]
all_stats = (
PitchingStat
.select(*select_fields,
fn.SUM(PitchingStat.ip).alias('sum_ip'),
fn.SUM(PitchingStat.hit).alias('sum_hit'), fn.SUM(PitchingStat.run).alias('sum_run'),
fn.SUM(PitchingStat.erun).alias('sum_erun'), fn.SUM(PitchingStat.so).alias('sum_so'),
fn.SUM(PitchingStat.bb).alias('sum_bb'), fn.SUM(PitchingStat.hbp).alias('sum_hbp'),
fn.SUM(PitchingStat.wp).alias('sum_wp'), fn.SUM(PitchingStat.balk).alias('sum_balk'),
fn.SUM(PitchingStat.hr).alias('sum_hr'), fn.SUM(PitchingStat.ir).alias('sum_ir'),
fn.SUM(PitchingStat.win).alias('sum_win'), fn.SUM(PitchingStat.loss).alias('sum_loss'),
fn.SUM(PitchingStat.hold).alias('sum_hold'), fn.SUM(PitchingStat.sv).alias('sum_sv'),
fn.SUM(PitchingStat.bsv).alias('sum_bsv'), fn.SUM(PitchingStat.irs).alias('sum_irs'),
fn.SUM(PitchingStat.gs).alias('sum_gs'), fn.COUNT(PitchingStat.game).alias('sum_games'))
.where(PitchingStat.season == season)
.having(fn.SUM(PitchingStat.ip) >= ip_min)
PitchingStat.select(
*select_fields,
fn.SUM(PitchingStat.ip).alias("sum_ip"),
fn.SUM(PitchingStat.hit).alias("sum_hit"),
fn.SUM(PitchingStat.run).alias("sum_run"),
fn.SUM(PitchingStat.erun).alias("sum_erun"),
fn.SUM(PitchingStat.so).alias("sum_so"),
fn.SUM(PitchingStat.bb).alias("sum_bb"),
fn.SUM(PitchingStat.hbp).alias("sum_hbp"),
fn.SUM(PitchingStat.wp).alias("sum_wp"),
fn.SUM(PitchingStat.balk).alias("sum_balk"),
fn.SUM(PitchingStat.hr).alias("sum_hr"),
fn.SUM(PitchingStat.ir).alias("sum_ir"),
fn.SUM(PitchingStat.win).alias("sum_win"),
fn.SUM(PitchingStat.loss).alias("sum_loss"),
fn.SUM(PitchingStat.hold).alias("sum_hold"),
fn.SUM(PitchingStat.sv).alias("sum_sv"),
fn.SUM(PitchingStat.bsv).alias("sum_bsv"),
fn.SUM(PitchingStat.irs).alias("sum_irs"),
fn.SUM(PitchingStat.gs).alias("sum_gs"),
fn.COUNT(PitchingStat.game).alias("sum_games"),
)
.where(PitchingStat.season == season)
.having(fn.SUM(PitchingStat.ip) >= ip_min)
)
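The aggregate query above compiles to a `SUM(...) AS sum_* ... GROUP BY ... HAVING` statement. A minimal `sqlite3` sketch of that shape (table and column names are illustrative, not the real schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE pitching_stat (player TEXT, ip REAL, so INTEGER)")
con.executemany(
    "INSERT INTO pitching_stat VALUES (?, ?, ?)",
    [("A", 5.0, 6), ("A", 2.0, 3), ("B", 0.1, 1)],
)
# Same shape as the peewee query: aliased SUMs grouped per player, with
# HAVING filtering out pitchers below the ip_min threshold.
rows = con.execute(
    """
    SELECT player, SUM(ip) AS sum_ip, SUM(so) AS sum_so
    FROM pitching_stat
    GROUP BY player
    HAVING SUM(ip) >= 0.25
    """
).fetchall()
# Player B (0.1 IP) is dropped by the HAVING clause.
```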
if s_type is not None or week_start is not None or week_end is not None:
@@ -166,16 +211,20 @@ async def get_totalstats(
elif week_start is not None or week_end is not None:
if week_start is None or week_end is None:
raise HTTPException(
status_code=400, detail='Both week_start and week_end must be included if either is used.'
status_code=400,
detail="Both week_start and week_end must be included if either is used.",
)
weeks["start"] = week_start
if week_end < weeks["start"]:
raise HTTPException(
status_code=400,
detail="week_end must be greater than or equal to week_start",
)
weeks['start'] = week_start
if week_end < weeks['start']:
raise HTTPException(status_code=400, detail='week_end must be greater than or equal to week_start')
else:
weeks['end'] = week_end
weeks["end"] = week_end
all_stats = all_stats.where(
(PitchingStat.week >= weeks['start']) & (PitchingStat.week <= weeks['end'])
(PitchingStat.week >= weeks["start"]) & (PitchingStat.week <= weeks["end"])
)
elif week is not None:
@@ -189,9 +238,9 @@ async def get_totalstats(
if not is_sp:
all_stats = all_stats.where(PitchingStat.gs == 0)
if sort is not None:
if sort == 'player':
if sort == "player":
all_stats = all_stats.order_by(PitchingStat.player)
elif sort == 'team':
elif sort == "team":
all_stats = all_stats.order_by(PitchingStat.team)
if group_by is not None:
# Use the same fields for GROUP BY as we used for SELECT
@@ -200,67 +249,79 @@ async def get_totalstats(
all_teams = Team.select().where(Team.id << team_id)
all_stats = all_stats.where(PitchingStat.team << all_teams)
elif team_abbrev is not None:
all_teams = Team.select().where(fn.Lower(Team.abbrev) << [x.lower() for x in team_abbrev])
all_teams = Team.select().where(
fn.Lower(Team.abbrev) << [x.lower() for x in team_abbrev]
)
all_stats = all_stats.where(PitchingStat.team << all_teams)
if player_name is not None:
all_players = Player.select().where(fn.Lower(Player.name) << [x.lower() for x in player_name])
all_players = Player.select().where(
fn.Lower(Player.name) << [x.lower() for x in player_name]
)
all_stats = all_stats.where(PitchingStat.player << all_players)
elif player_id is not None:
all_players = Player.select().where(Player.id << player_id)
all_stats = all_stats.where(PitchingStat.player << all_players)
return_stats = {
'count': all_stats.count(),
'stats': []
}
total_count = all_stats.count()
all_stats = all_stats.offset(offset).limit(limit)
return_stats = {"count": total_count, "stats": []}
for x in all_stats:
# Handle player field based on grouping with safe access
this_player = 'TOT'
if 'player' in group_by and hasattr(x, 'player'):
this_player = x.player_id if short_output else model_to_dict(x.player, recurse=False)
this_player = "TOT"
if "player" in group_by and hasattr(x, "player"):
this_player = (
x.player_id if short_output else model_to_dict(x.player, recurse=False)
)
# Handle team field based on grouping with safe access
this_team = 'TOT'
if 'team' in group_by and hasattr(x, 'team'):
this_team = x.team_id if short_output else model_to_dict(x.team, recurse=False)
return_stats['stats'].append({
'player': this_player,
'team': this_team,
'ip': x.sum_ip,
'hit': x.sum_hit,
'run': x.sum_run,
'erun': x.sum_erun,
'so': x.sum_so,
'bb': x.sum_bb,
'hbp': x.sum_hbp,
'wp': x.sum_wp,
'balk': x.sum_balk,
'hr': x.sum_hr,
'ir': x.sum_ir,
'irs': x.sum_irs,
'gs': x.sum_gs,
'games': x.sum_games,
'win': x.sum_win,
'loss': x.sum_loss,
'hold': x.sum_hold,
'sv': x.sum_sv,
'bsv': x.sum_bsv
})
# Handle team field based on grouping with safe access
this_team = "TOT"
if "team" in group_by and hasattr(x, "team"):
this_team = (
x.team_id if short_output else model_to_dict(x.team, recurse=False)
)
return_stats["stats"].append(
{
"player": this_player,
"team": this_team,
"ip": x.sum_ip,
"hit": x.sum_hit,
"run": x.sum_run,
"erun": x.sum_erun,
"so": x.sum_so,
"bb": x.sum_bb,
"hbp": x.sum_hbp,
"wp": x.sum_wp,
"balk": x.sum_balk,
"hr": x.sum_hr,
"ir": x.sum_ir,
"irs": x.sum_irs,
"gs": x.sum_gs,
"games": x.sum_games,
"win": x.sum_win,
"loss": x.sum_loss,
"hold": x.sum_hold,
"sv": x.sum_sv,
"bsv": x.sum_bsv,
}
)
db.close()
return return_stats
@router.patch('/{stat_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{stat_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_pitstats(stat_id: int, new_stats: PitStatModel, token: str = Depends(oauth2_scheme)):
async def patch_pitstats(
stat_id: int, new_stats: PitStatModel, token: str = Depends(oauth2_scheme)
):
if not valid_token(token):
logger.warning(f'patch_pitstats - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_pitstats - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
if PitchingStat.get_or_none(PitchingStat.id == stat_id) is None:
raise HTTPException(status_code=404, detail=f'Stat ID {stat_id} not found')
raise HTTPException(status_code=404, detail=f"Stat ID {stat_id} not found")
PitchingStat.update(**new_stats.dict()).where(PitchingStat.id == stat_id).execute()
r_stat = model_to_dict(PitchingStat.get_by_id(stat_id))
@@ -268,12 +329,12 @@ async def patch_pitstats(stat_id: int, new_stats: PitStatModel, token: str = Dep
return r_stat
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_pitstats(s_list: PitStatList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'post_pitstats - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"post_pitstats - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
all_stats = []
@@ -281,9 +342,13 @@ async def post_pitstats(s_list: PitStatList, token: str = Depends(oauth2_scheme)
team = Team.get_or_none(Team.id == x.team_id)
this_player = Player.get_or_none(Player.id == x.player_id)
if team is None:
raise HTTPException(status_code=404, detail=f'Team ID {x.team_id} not found')
raise HTTPException(
status_code=404, detail=f"Team ID {x.team_id} not found"
)
if this_player is None:
raise HTTPException(status_code=404, detail=f'Player ID {x.player_id} not found')
raise HTTPException(
status_code=404, detail=f"Player ID {x.player_id} not found"
)
all_stats.append(PitchingStat(**x.dict()))
@@ -292,4 +357,4 @@ async def post_pitstats(s_list: PitStatList, token: str = Depends(oauth2_scheme)
PitchingStat.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f'Added {len(all_stats)} batting lines'
return f"Added {len(all_stats)} pitching lines"
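The bulk-insert loop above batches rows through peewee's `chunked` before each `insert_many(...).on_conflict_ignore()`. The helper behaves roughly like this sketch:

```python
def chunked_sketch(iterable, size):
    # Yield successive lists of up to `size` items, like peewee's chunked().
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch
```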

@@ -6,7 +6,11 @@ Thin HTTP layer using PlayerService for business logic.
from fastapi import APIRouter, Query, Response, Depends
from typing import Optional, List
from ..dependencies import oauth2_scheme, cache_result, handle_db_errors
from ..dependencies import (
oauth2_scheme,
cache_result,
handle_db_errors,
)
from ..services.base import BaseService
from ..services.player_service import PlayerService
@@ -24,8 +28,10 @@ async def get_players(
strat_code: list = Query(default=None),
is_injured: Optional[bool] = None,
sort: Optional[str] = None,
limit: Optional[int] = Query(default=None, ge=1, description="Maximum number of results to return"),
offset: Optional[int] = Query(default=None, ge=0, description="Number of results to skip for pagination"),
limit: Optional[int] = Query(default=None, ge=1),
offset: Optional[int] = Query(
default=None, ge=0, description="Number of results to skip for pagination"
),
short_output: Optional[bool] = False,
csv: Optional[bool] = False,
):
@@ -41,9 +47,9 @@ async def get_players(
limit=limit,
offset=offset,
short_output=short_output or False,
as_csv=csv or False
as_csv=csv or False,
)
if csv:
return Response(content=result, media_type="text/csv")
return result
@@ -54,35 +60,29 @@ async def get_players(
@cache_result(ttl=15 * 60, key_prefix="players-search")
async def search_players(
q: str = Query(..., description="Search query for player name"),
season: Optional[int] = Query(default=None, description="Season to search (0 for all)"),
season: Optional[int] = Query(
default=None, description="Season to search (0 for all)"
),
limit: int = Query(default=10, ge=1, le=50),
short_output: bool = False,
):
"""Search players by name with fuzzy matching."""
return PlayerService.search_players(
query_str=q,
season=season,
limit=limit,
short_output=short_output
query_str=q, season=season, limit=limit, short_output=short_output
)
@router.get("/{player_id}")
@handle_db_errors
@cache_result(ttl=30 * 60, key_prefix="player")
async def get_one_player(
player_id: int,
short_output: Optional[bool] = False
):
async def get_one_player(player_id: int, short_output: Optional[bool] = False):
"""Get a single player by ID."""
return PlayerService.get_player(player_id, short_output=short_output or False)
@router.put("/{player_id}")
async def put_player(
player_id: int,
new_player: dict,
token: str = Depends(oauth2_scheme)
player_id: int, new_player: dict, token: str = Depends(oauth2_scheme)
):
"""Update a player (full replacement)."""
return PlayerService.update_player(player_id, new_player, token)
@@ -120,14 +120,28 @@ async def patch_player(
data = {
key: value
for key, value in {
'name': name, 'wara': wara, 'image': image, 'image2': image2,
'team_id': team_id, 'season': season,
'pos_1': pos_1, 'pos_2': pos_2, 'pos_3': pos_3, 'pos_4': pos_4,
'pos_5': pos_5, 'pos_6': pos_6, 'pos_7': pos_7, 'pos_8': pos_8,
'vanity_card': vanity_card, 'headshot': headshot,
'il_return': il_return, 'demotion_week': demotion_week,
'strat_code': strat_code, 'bbref_id': bbref_id,
'injury_rating': injury_rating, 'sbaref_id': sbaref_id,
"name": name,
"wara": wara,
"image": image,
"image2": image2,
"team_id": team_id,
"season": season,
"pos_1": pos_1,
"pos_2": pos_2,
"pos_3": pos_3,
"pos_4": pos_4,
"pos_5": pos_5,
"pos_6": pos_6,
"pos_7": pos_7,
"pos_8": pos_8,
"vanity_card": vanity_card,
"headshot": headshot,
"il_return": il_return,
"demotion_week": demotion_week,
"strat_code": strat_code,
"bbref_id": bbref_id,
"injury_rating": injury_rating,
"sbaref_id": sbaref_id,
}.items()
if value is not None
}
@@ -135,19 +149,13 @@ async def patch_player(
return PlayerService.patch_player(player_id, data, token)
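The reformatted dict comprehension above implements standard PATCH semantics: only the fields the caller actually supplied (non-None) reach the update. As a standalone sketch:

```python
def non_null_fields(**updates):
    """Drop None values so a PATCH only touches supplied fields.
    (This also means None itself can never be written via this route.)"""
    return {key: value for key, value in updates.items() if value is not None}
```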
@router.post("")
async def post_players(
p_list: dict,
token: str = Depends(oauth2_scheme)
):
@router.post("/")
async def post_players(p_list: dict, token: str = Depends(oauth2_scheme)):
"""Create multiple players."""
return PlayerService.create_players(p_list.get("players", []), token)
@router.delete("/{player_id}")
async def delete_player(
player_id: int,
token: str = Depends(oauth2_scheme)
):
async def delete_player(player_id: int, token: str = Depends(oauth2_scheme)):
"""Delete a player."""
return PlayerService.delete_player(player_id, token)

@@ -4,15 +4,19 @@ import logging
import pydantic
from ..db_engine import db, Result, Team, model_to_dict, chunked
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/results',
tags=['results']
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/results", tags=["results"])
class ResultModel(pydantic.BaseModel):
week: int
@@ -29,13 +33,20 @@ class ResultList(pydantic.BaseModel):
results: List[ResultModel]
@router.get('')
@router.get("")
@handle_db_errors
async def get_results(
season: int, team_abbrev: list = Query(default=None), week_start: Optional[int] = None,
week_end: Optional[int] = None, game_num: list = Query(default=None),
away_abbrev: list = Query(default=None), home_abbrev: list = Query(default=None),
short_output: Optional[bool] = False):
season: int,
team_abbrev: list = Query(default=None),
week_start: Optional[int] = None,
week_end: Optional[int] = None,
game_num: list = Query(default=None),
away_abbrev: list = Query(default=None),
home_abbrev: list = Query(default=None),
short_output: Optional[bool] = False,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
all_results = Result.select_season(season)
if team_abbrev is not None:
@@ -67,15 +78,18 @@ async def get_results(
if week_end is not None:
all_results = all_results.where(Result.week <= week_end)
total_count = all_results.count()
all_results = all_results.offset(offset).limit(limit)
return_results = {
'count': all_results.count(),
'results': [model_to_dict(x, recurse=not short_output) for x in all_results]
"count": total_count,
"results": [model_to_dict(x, recurse=not short_output) for x in all_results],
}
db.close()
return return_results
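The new `limit`/`offset` parameters rely on FastAPI's `Query(ge=..., le=...)` constraints, which reject out-of-range values with a 422 before the handler runs. A plain-Python sketch of the same bounds check (the concrete `DEFAULT_LIMIT`/`MAX_LIMIT` values below are assumptions — the real ones live in `dependencies.py`):

```python
DEFAULT_LIMIT = 100  # assumed value
MAX_LIMIT = 1000     # assumed value

def bounded_limit(value=None):
    """Emulate Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT)."""
    if value is None:
        return DEFAULT_LIMIT
    if not 1 <= value <= MAX_LIMIT:
        raise ValueError(f"limit must be between 1 and {MAX_LIMIT}")
    return value
```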
@router.get('/{result_id}')
@router.get("/{result_id}")
@handle_db_errors
async def get_one_result(result_id: int, short_output: Optional[bool] = False):
this_result = Result.get_or_none(Result.id == result_id)
@@ -87,20 +101,27 @@ async def get_one_result(result_id: int, short_output: Optional[bool] = False):
return r_result
@router.patch('/{result_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{result_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_result(
result_id: int, week_num: Optional[int] = None, game_num: Optional[int] = None,
away_team_id: Optional[int] = None, home_team_id: Optional[int] = None, away_score: Optional[int] = None,
home_score: Optional[int] = None, season: Optional[int] = None, scorecard_url: Optional[str] = None,
token: str = Depends(oauth2_scheme)):
result_id: int,
week_num: Optional[int] = None,
game_num: Optional[int] = None,
away_team_id: Optional[int] = None,
home_team_id: Optional[int] = None,
away_score: Optional[int] = None,
home_score: Optional[int] = None,
season: Optional[int] = None,
scorecard_url: Optional[str] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f'patch_player - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_result - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_result = Result.get_or_none(Result.id == result_id)
if this_result is None:
raise HTTPException(status_code=404, detail=f'Result ID {result_id} not found')
raise HTTPException(status_code=404, detail=f"Result ID {result_id} not found")
if week_num is not None:
this_result.week = week_num
@@ -132,22 +153,39 @@ async def patch_result(
return r_result
else:
db.close()
raise HTTPException(status_code=500, detail=f'Unable to patch result {result_id}')
raise HTTPException(
status_code=500, detail=f"Unable to patch result {result_id}"
)
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_results(result_list: ResultList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'patch_player - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"post_results - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
new_results = []
all_team_ids = list(
set(x.awayteam_id for x in result_list.results)
| set(x.hometeam_id for x in result_list.results)
)
found_team_ids = (
set(t.id for t in Team.select(Team.id).where(Team.id << all_team_ids))
if all_team_ids
else set()
)
for x in result_list.results:
if Team.get_or_none(Team.id == x.awayteam_id) is None:
raise HTTPException(status_code=404, detail=f'Team ID {x.awayteam_id} not found')
if Team.get_or_none(Team.id == x.hometeam_id) is None:
raise HTTPException(status_code=404, detail=f'Team ID {x.hometeam_id} not found')
if x.awayteam_id not in found_team_ids:
raise HTTPException(
status_code=404, detail=f"Team ID {x.awayteam_id} not found"
)
if x.hometeam_id not in found_team_ids:
raise HTTPException(
status_code=404, detail=f"Team ID {x.hometeam_id} not found"
)
new_results.append(x.dict())
@@ -156,27 +194,27 @@ async def post_results(result_list: ResultList, token: str = Depends(oauth2_sche
Result.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f'Inserted {len(new_results)} results'
return f"Inserted {len(new_results)} results"
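The rewritten validation above trades N `get_or_none` round-trips for a single query plus set-membership tests. The same idea as a standalone sketch (names mirror the endpoint):

```python
def check_team_ids(results, existing_ids):
    """Collect every referenced team id, then verify membership in one pass
    instead of one database lookup per row."""
    needed = {r["awayteam_id"] for r in results} | {r["hometeam_id"] for r in results}
    missing = needed - set(existing_ids)
    if missing:
        raise LookupError(f"Team ID(s) not found: {sorted(missing)}")
```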
@router.delete('/{result_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.delete("/{result_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_result(result_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'delete_result - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"delete_result - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_result = Result.get_or_none(Result.id == result_id)
if not this_result:
db.close()
raise HTTPException(status_code=404, detail=f'Result ID {result_id} not found')
raise HTTPException(status_code=404, detail=f"Result ID {result_id} not found")
count = this_result.delete_instance()
db.close()
if count == 1:
return f'Result {result_id} has been deleted'
return f"Result {result_id} has been deleted"
else:
raise HTTPException(status_code=500, detail=f'Result {result_id} could not be deleted')
raise HTTPException(
status_code=500, detail=f"Result {result_id} could not be deleted"
)
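The validation loop above replaces two `Team.get_or_none` calls per submitted result with a single batched `SELECT`, then checks membership in a set. A minimal, self-contained sketch of that pattern using stdlib `sqlite3` in place of the Peewee `Team` model (table name and sample IDs are illustrative, not from the repo):

```python
import sqlite3

# In-memory stand-in for the Team table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE team (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO team (id) VALUES (?)", [(1,), (2,), (3,)])

# Submitted results, each referencing an away and a home team id.
results = [
    {"awayteam_id": 1, "hometeam_id": 2},
    {"awayteam_id": 3, "hometeam_id": 99},  # 99 does not exist
]

# Collect every referenced id, then run ONE query instead of two
# per-row lookups (the N+1 pattern the diff removes).
all_team_ids = {r["awayteam_id"] for r in results} | {r["hometeam_id"] for r in results}
placeholders = ",".join("?" * len(all_team_ids))
rows = conn.execute(
    f"SELECT id FROM team WHERE id IN ({placeholders})", tuple(all_team_ids)
).fetchall()
found_team_ids = {row[0] for row in rows}

# Any id not in the set would trigger the 404 raised in the handlers.
missing = sorted(all_team_ids - found_team_ids)
print(missing)  # [99]
```

For a batch of N results this is one round trip instead of 2N, which matters most on the bulk `POST` endpoints.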

View File

@@ -7,15 +7,19 @@ import logging
import pydantic
from ..db_engine import db, SbaPlayer, model_to_dict, fn, chunked, query_to_csv
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/sbaplayers',
tags=['sbaplayers']
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/sbaplayers", tags=["sbaplayers"])
class SbaPlayerModel(pydantic.BaseModel):
first_name: str
@@ -30,19 +34,28 @@ class PlayerList(pydantic.BaseModel):
players: List[SbaPlayerModel]
@router.get('')
@router.get("")
@handle_db_errors
async def get_players(
full_name: list = Query(default=None), first_name: list = Query(default=None),
last_name: list = Query(default=None), key_fangraphs: list = Query(default=None),
key_bbref: list = Query(default=None), key_retro: list = Query(default=None),
key_mlbam: list = Query(default=None), sort: Optional[str] = None, csv: Optional[bool] = False):
full_name: list = Query(default=None),
first_name: list = Query(default=None),
last_name: list = Query(default=None),
key_fangraphs: list = Query(default=None),
key_bbref: list = Query(default=None),
key_retro: list = Query(default=None),
key_mlbam: list = Query(default=None),
sort: Optional[str] = None,
csv: Optional[bool] = False,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
all_players = SbaPlayer.select()
if full_name is not None:
name_list = [x.lower() for x in full_name]
all_players = all_players.where(
fn.lower(SbaPlayer.first_name) + ' ' + fn.lower(SbaPlayer.last_name) << name_list
fn.lower(SbaPlayer.first_name) + " " + fn.lower(SbaPlayer.last_name)
<< name_list
)
if first_name is not None:
name_list = [x.lower() for x in first_name]
@@ -60,76 +73,90 @@ async def get_players(
all_players = all_players.where(fn.lower(SbaPlayer.key_retro) << name_list)
if key_mlbam is not None:
all_players = all_players.where(SbaPlayer.key_mlbam << key_mlbam)
if sort is not None:
if sort in ['firstname-asc', 'fullname-asc']:
if sort in ["firstname-asc", "fullname-asc"]:
all_players = all_players.order_by(SbaPlayer.first_name)
elif sort in ['firstname-desc', 'fullname-desc']:
elif sort in ["firstname-desc", "fullname-desc"]:
all_players = all_players.order_by(-SbaPlayer.first_name)
elif sort == 'lastname-asc':
elif sort == "lastname-asc":
all_players = all_players.order_by(SbaPlayer.last_name)
elif sort == 'lastname-desc':
elif sort == "lastname-desc":
all_players = all_players.order_by(-SbaPlayer.last_name)
elif sort == 'fangraphs-asc':
elif sort == "fangraphs-asc":
all_players = all_players.order_by(SbaPlayer.key_fangraphs)
elif sort == 'fangraphs-desc':
elif sort == "fangraphs-desc":
all_players = all_players.order_by(-SbaPlayer.key_fangraphs)
elif sort == 'bbref-asc':
elif sort == "bbref-asc":
all_players = all_players.order_by(SbaPlayer.key_bbref)
elif sort == 'bbref-desc':
elif sort == "bbref-desc":
all_players = all_players.order_by(-SbaPlayer.key_bbref)
elif sort == 'retro-asc':
elif sort == "retro-asc":
all_players = all_players.order_by(SbaPlayer.key_retro)
elif sort == 'retro-desc':
elif sort == "retro-desc":
all_players = all_players.order_by(-SbaPlayer.key_retro)
elif sort == 'mlbam-asc':
elif sort == "mlbam-asc":
all_players = all_players.order_by(SbaPlayer.key_mlbam)
elif sort == 'mlbam-desc':
elif sort == "mlbam-desc":
all_players = all_players.order_by(-SbaPlayer.key_mlbam)
if csv:
return_val = query_to_csv(all_players)
db.close()
return Response(content=return_val, media_type='text/csv')
return Response(content=return_val, media_type="text/csv")
return_val = {'count': all_players.count(), 'players': [
model_to_dict(x) for x in all_players
]}
total_count = all_players.count()
all_players = all_players.offset(offset).limit(limit)
return_val = {
"count": total_count,
"players": [model_to_dict(x) for x in all_players],
}
db.close()
return return_val
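Note the ordering in the new pagination code: `count()` is taken from the filtered query *before* `.offset(offset).limit(limit)` is applied, so the response's `count` reports the full matching set rather than the page size. A small sketch of that contract with a plain list standing in for the query (names are illustrative):

```python
def paginate(items, limit, offset):
    """Mimic the handlers above: report the total matching count,
    then return only the requested window."""
    total_count = len(items)              # query.count() BEFORE slicing
    page = items[offset:offset + limit]   # .offset(offset).limit(limit)
    return {"count": total_count, "players": page}

players = [f"player-{i}" for i in range(25)]
resp = paginate(players, limit=10, offset=20)
print(resp["count"], len(resp["players"]))  # 25 5
```

Clients can therefore compute total pages from `count` without a separate request; taking `count()` after applying the limit would cap it at the page size.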
@router.get('/{player_id}')
@router.get("/{player_id}")
@handle_db_errors
async def get_one_player(player_id: int):
this_player = SbaPlayer.get_or_none(SbaPlayer.id == player_id)
if this_player is None:
db.close()
raise HTTPException(status_code=404, detail=f'SbaPlayer id {player_id} not found')
raise HTTPException(
status_code=404, detail=f"SbaPlayer id {player_id} not found"
)
r_data = model_to_dict(this_player)
db.close()
return r_data
@router.patch('/{player_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{player_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_player(
player_id: int, first_name: Optional[str] = None, last_name: Optional[str] = None,
key_fangraphs: Optional[str] = None, key_bbref: Optional[str] = None, key_retro: Optional[str] = None,
key_mlbam: Optional[str] = None, token: str = Depends(oauth2_scheme)):
player_id: int,
first_name: Optional[str] = None,
last_name: Optional[str] = None,
key_fangraphs: Optional[str] = None,
key_bbref: Optional[str] = None,
key_retro: Optional[str] = None,
key_mlbam: Optional[str] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logging.warning(f'Bad Token: {token}')
logging.warning(f"Bad Token: {token}")
db.close()
raise HTTPException(
status_code=401,
detail='You are not authorized to patch mlb players. This event has been logged.'
detail="You are not authorized to patch mlb players. This event has been logged.",
)
this_player = SbaPlayer.get_or_none(SbaPlayer.id == player_id)
if this_player is None:
db.close()
raise HTTPException(status_code=404, detail=f'SbaPlayer id {player_id} not found')
raise HTTPException(
status_code=404, detail=f"SbaPlayer id {player_id} not found"
)
if first_name is not None:
this_player.first_name = first_name
@@ -152,33 +179,38 @@ async def patch_player(
db.close()
raise HTTPException(
status_code=418,
detail='Well slap my ass and call me a teapot; I could not save that player'
detail="Well slap my ass and call me a teapot; I could not save that player",
)
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_players(players: PlayerList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logging.warning(f'Bad Token: {token}')
logging.warning(f"Bad Token: {token}")
db.close()
raise HTTPException(
status_code=401,
detail='You are not authorized to post mlb players. This event has been logged.'
detail="You are not authorized to post mlb players. This event has been logged.",
)
new_players = []
for x in players.players:
dupes = SbaPlayer.select().where(
((SbaPlayer.key_fangraphs == x.key_fangraphs) & (x.key_fangraphs is not None)) | ((SbaPlayer.key_mlbam == x.key_mlbam) & (x.key_mlbam is not None)) |
((SbaPlayer.key_retro == x.key_retro) & (x.key_retro is not None)) | ((SbaPlayer.key_bbref == x.key_bbref) & (x.key_bbref is not None))
(
(SbaPlayer.key_fangraphs == x.key_fangraphs)
& (x.key_fangraphs is not None)
)
| ((SbaPlayer.key_mlbam == x.key_mlbam) & (x.key_mlbam is not None))
| ((SbaPlayer.key_retro == x.key_retro) & (x.key_retro is not None))
| ((SbaPlayer.key_bbref == x.key_bbref) & (x.key_bbref is not None))
)
if dupes.count() > 0:
logger.error(f'Found a dupe for {x}')
logger.error(f"Found a dupe for {x}")
db.close()
raise HTTPException(
status_code=400,
detail=f'{x.first_name} {x.last_name} has a key already in the database'
detail=f"{x.first_name} {x.last_name} has a key already in the database",
)
new_players.append(x.model_dump())
@@ -188,32 +220,33 @@ async def post_players(players: PlayerList, token: str = Depends(oauth2_scheme))
SbaPlayer.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f'Inserted {len(new_players)} new MLB players'
return f"Inserted {len(new_players)} new MLB players"
@router.post('/one', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/one", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_one_player(player: SbaPlayerModel, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logging.warning(f'Bad Token: {token}')
logging.warning(f"Bad Token: {token}")
db.close()
raise HTTPException(
status_code=401,
detail='You are not authorized to post mlb players. This event has been logged.'
detail="You are not authorized to post mlb players. This event has been logged.",
)
dupes = SbaPlayer.select().where(
(SbaPlayer.key_fangraphs == player.key_fangraphs) | (SbaPlayer.key_mlbam == player.key_mlbam) |
(SbaPlayer.key_bbref == player.key_bbref)
(SbaPlayer.key_fangraphs == player.key_fangraphs)
| (SbaPlayer.key_mlbam == player.key_mlbam)
| (SbaPlayer.key_bbref == player.key_bbref)
)
if dupes.count() > 0:
logging.info(f'POST /SbaPlayers/one - dupes found:')
logging.info(f"POST /SbaPlayers/one - dupes found:")
for x in dupes:
logging.info(f'{x}')
logging.info(f"{x}")
db.close()
raise HTTPException(
status_code=400,
detail=f'{player.first_name} {player.last_name} has a key already in the database'
detail=f"{player.first_name} {player.last_name} has a key already in the database",
)
new_player = SbaPlayer(**player.dict())
@@ -226,30 +259,34 @@ async def post_one_player(player: SbaPlayerModel, token: str = Depends(oauth2_sc
db.close()
raise HTTPException(
status_code=418,
detail='Well slap my ass and call me a teapot; I could not save that player'
detail="Well slap my ass and call me a teapot; I could not save that player",
)
@router.delete('/{player_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.delete("/{player_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_player(player_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logging.warning(f'Bad Token: {token}')
logging.warning(f"Bad Token: {token}")
db.close()
raise HTTPException(
status_code=401,
detail='You are not authorized to delete mlb players. This event has been logged.'
detail="You are not authorized to delete mlb players. This event has been logged.",
)
this_player = SbaPlayer.get_or_none(SbaPlayer.id == player_id)
if this_player is None:
db.close()
raise HTTPException(status_code=404, detail=f'SbaPlayer id {player_id} not found')
raise HTTPException(
status_code=404, detail=f"SbaPlayer id {player_id} not found"
)
count = this_player.delete_instance()
db.close()
if count == 1:
return f'Player {player_id} has been deleted'
return f"Player {player_id} has been deleted"
else:
raise HTTPException(status_code=500, detail=f'Player {player_id} was not deleted')
raise HTTPException(
status_code=500, detail=f"Player {player_id} was not deleted"
)

View File

@@ -4,15 +4,19 @@ import logging
import pydantic
from ..db_engine import db, Schedule, Team, model_to_dict, chunked
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/schedules',
tags=['schedules']
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/schedules", tags=["schedules"])
class ScheduleModel(pydantic.BaseModel):
week: int
@@ -26,12 +30,19 @@ class ScheduleList(pydantic.BaseModel):
schedules: List[ScheduleModel]
@router.get('')
@router.get("")
@handle_db_errors
async def get_schedules(
season: int, team_abbrev: list = Query(default=None), away_abbrev: list = Query(default=None),
home_abbrev: list = Query(default=None), week_start: Optional[int] = None, week_end: Optional[int] = None,
short_output: Optional[bool] = True):
season: int,
team_abbrev: list = Query(default=None),
away_abbrev: list = Query(default=None),
home_abbrev: list = Query(default=None),
week_start: Optional[int] = None,
week_end: Optional[int] = None,
short_output: Optional[bool] = True,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
all_sched = Schedule.select_season(season)
if team_abbrev is not None:
@@ -62,15 +73,18 @@ async def get_schedules(
all_sched = all_sched.order_by(Schedule.id)
total_count = all_sched.count()
all_sched = all_sched.offset(offset).limit(limit)
return_sched = {
'count': all_sched.count(),
'schedules': [model_to_dict(x, recurse=not short_output) for x in all_sched]
"count": total_count,
"schedules": [model_to_dict(x, recurse=not short_output) for x in all_sched],
}
db.close()
return return_sched
@router.get('/{schedule_id}')
@router.get("/{schedule_id}")
@handle_db_errors
async def get_one_schedule(schedule_id: int):
this_sched = Schedule.get_or_none(Schedule.id == schedule_id)
@@ -82,19 +96,26 @@ async def get_one_schedule(schedule_id: int):
return r_sched
@router.patch('/{schedule_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{schedule_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_schedule(
schedule_id: int, week: list = Query(default=None), awayteam_id: Optional[int] = None,
hometeam_id: Optional[int] = None, gamecount: Optional[int] = None, season: Optional[int] = None,
token: str = Depends(oauth2_scheme)):
schedule_id: int,
week: list = Query(default=None),
awayteam_id: Optional[int] = None,
hometeam_id: Optional[int] = None,
gamecount: Optional[int] = None,
season: Optional[int] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f'patch_schedule - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_schedule - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_sched = Schedule.get_or_none(Schedule.id == schedule_id)
if this_sched is None:
raise HTTPException(status_code=404, detail=f'Schedule ID {schedule_id} not found')
raise HTTPException(
status_code=404, detail=f"Schedule ID {schedule_id} not found"
)
if week is not None:
this_sched.week = week
@@ -117,22 +138,39 @@ async def patch_schedule(
return r_sched
else:
db.close()
raise HTTPException(status_code=500, detail=f'Unable to patch schedule {schedule_id}')
raise HTTPException(
status_code=500, detail=f"Unable to patch schedule {schedule_id}"
)
@router.post('', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_schedules(sched_list: ScheduleList, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'post_schedules - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"post_schedules - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
new_sched = []
all_team_ids = list(
set(x.awayteam_id for x in sched_list.schedules)
| set(x.hometeam_id for x in sched_list.schedules)
)
found_team_ids = (
set(t.id for t in Team.select(Team.id).where(Team.id << all_team_ids))
if all_team_ids
else set()
)
for x in sched_list.schedules:
if Team.get_or_none(Team.id == x.awayteam_id) is None:
raise HTTPException(status_code=404, detail=f'Team ID {x.awayteam_id} not found')
if Team.get_or_none(Team.id == x.hometeam_id) is None:
raise HTTPException(status_code=404, detail=f'Team ID {x.hometeam_id} not found')
if x.awayteam_id not in found_team_ids:
raise HTTPException(
status_code=404, detail=f"Team ID {x.awayteam_id} not found"
)
if x.hometeam_id not in found_team_ids:
raise HTTPException(
status_code=404, detail=f"Team ID {x.hometeam_id} not found"
)
new_sched.append(x.dict())
@@ -141,24 +179,28 @@ async def post_schedules(sched_list: ScheduleList, token: str = Depends(oauth2_s
Schedule.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f'Inserted {len(new_sched)} schedules'
return f"Inserted {len(new_sched)} schedules"
@router.delete('/{schedule_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.delete("/{schedule_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def delete_schedule(schedule_id: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'delete_schedule - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"delete_schedule - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
this_sched = Schedule.get_or_none(Schedule.id == schedule_id)
if this_sched is None:
raise HTTPException(status_code=404, detail=f'Schedule ID {schedule_id} not found')
raise HTTPException(
status_code=404, detail=f"Schedule ID {schedule_id} not found"
)
count = this_sched.delete_instance()
db.close()
if count == 1:
return f'Schedule {this_sched} has been deleted'
return f"Schedule {this_sched} has been deleted"
else:
raise HTTPException(status_code=500, detail=f'Schedule {this_sched} could not be deleted')
raise HTTPException(
status_code=500, detail=f"Schedule {this_sched} could not be deleted"
)

View File

@@ -1,24 +1,33 @@
from fastapi import APIRouter, Depends, HTTPException, Query
from typing import List, Optional
import logging
import pydantic
from ..db_engine import db, Standings, Team, Division, model_to_dict, chunked, fn
from ..dependencies import oauth2_scheme, valid_token, PRIVATE_IN_SCHEMA, handle_db_errors
logger = logging.getLogger('discord_app')
router = APIRouter(
prefix='/api/v3/standings',
tags=['standings']
from ..dependencies import (
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
@router.get('')
router = APIRouter(prefix="/api/v3/standings", tags=["standings"])
@router.get("")
@handle_db_errors
async def get_standings(
season: int, team_id: list = Query(default=None), league_abbrev: Optional[str] = None,
division_abbrev: Optional[str] = None, short_output: Optional[bool] = False):
season: int,
team_id: list = Query(default=None),
league_abbrev: Optional[str] = None,
division_abbrev: Optional[str] = None,
short_output: Optional[bool] = False,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
standings = Standings.select_season(season)
# if standings.count() == 0:
@@ -30,55 +39,69 @@ async def get_standings(
standings = standings.where(Standings.team << t_query)
if league_abbrev is not None:
l_query = Division.select().where(fn.Lower(Division.league_abbrev) == league_abbrev.lower())
l_query = Division.select().where(
fn.Lower(Division.league_abbrev) == league_abbrev.lower()
)
standings = standings.where(Standings.team.division << l_query)
if division_abbrev is not None:
d_query = Division.select().where(fn.Lower(Division.division_abbrev) == division_abbrev.lower())
d_query = Division.select().where(
fn.Lower(Division.division_abbrev) == division_abbrev.lower()
)
standings = standings.where(Standings.team.division << d_query)
def win_pct(this_team_stan):
if this_team_stan.wins + this_team_stan.losses == 0:
return 0
else:
return (this_team_stan.wins / (this_team_stan.wins + this_team_stan.losses)) + \
(this_team_stan.run_diff * .000001)
return (
this_team_stan.wins / (this_team_stan.wins + this_team_stan.losses)
) + (this_team_stan.run_diff * 0.000001)
div_teams = [x for x in standings]
div_teams.sort(key=lambda team: win_pct(team), reverse=True)
total_count = len(div_teams)
div_teams = div_teams[offset : offset + limit]
return_standings = {
'count': len(div_teams),
'standings': [model_to_dict(x, recurse=not short_output) for x in div_teams]
"count": total_count,
"standings": [model_to_dict(x, recurse=not short_output) for x in div_teams],
}
db.close()
return return_standings
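Standings are sorted in Python rather than SQL because `win_pct` folds run differential in as a micro-tiebreaker, so pagination here is list slicing rather than `OFFSET`/`LIMIT`. A self-contained sketch (team names and records are made up; `SimpleNamespace` stands in for a `Standings` row):

```python
from types import SimpleNamespace

def win_pct(stan):
    # Mirrors the helper above: winning percentage plus 1e-6 per run of
    # differential, small enough never to outweigh a real pct difference.
    if stan.wins + stan.losses == 0:
        return 0
    return stan.wins / (stan.wins + stan.losses) + stan.run_diff * 0.000001

teams = [
    SimpleNamespace(name="A", wins=10, losses=10, run_diff=-5),
    SimpleNamespace(name="B", wins=10, losses=10, run_diff=12),
    SimpleNamespace(name="C", wins=12, losses=8, run_diff=0),
]
teams.sort(key=win_pct, reverse=True)   # C (.600), then B over A on run_diff

offset, limit = 0, 2
total_count = len(teams)               # captured before slicing
page = teams[offset:offset + limit]    # slicing stands in for SQL OFFSET/LIMIT
print([t.name for t in page])  # ['C', 'B']
```

As in the other endpoints, `count` comes from the full sorted list, not the slice.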
@router.get('/team/{team_id}')
@router.get("/team/{team_id}")
@handle_db_errors
async def get_team_standings(team_id: int):
this_stan = Standings.get_or_none(Standings.team_id == team_id)
if this_stan is None:
raise HTTPException(status_code=404, detail=f'No standings found for team id {team_id}')
raise HTTPException(
status_code=404, detail=f"No standings found for team id {team_id}"
)
return model_to_dict(this_stan)
@router.patch('/{stan_id}', include_in_schema=PRIVATE_IN_SCHEMA)
@router.patch("/{stan_id}", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def patch_standings(
stan_id, wins: Optional[int] = None, losses: Optional[int] = None, token: str = Depends(oauth2_scheme)):
stan_id,
wins: Optional[int] = None,
losses: Optional[int] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logger.warning(f'patch_standings - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"patch_standings - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
try:
this_stan = Standings.get_by_id(stan_id)
except Exception as e:
db.close()
raise HTTPException(status_code=404, detail=f'No team found with id {stan_id}')
raise HTTPException(status_code=404, detail=f"No team found with id {stan_id}")
if wins:
this_stan.wins = wins
@@ -91,35 +114,35 @@ async def patch_standings(
return model_to_dict(this_stan)
@router.post('/s{season}/new', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/s{season}/new", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_standings(season: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'post_standings - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"post_standings - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
new_teams = []
all_teams = Team.select().where(Team.season == season)
for x in all_teams:
new_teams.append(Standings({'team_id': x.id}))
new_teams.append(Standings({"team_id": x.id}))
with db.atomic():
for batch in chunked(new_teams, 16):
Standings.insert_many(batch).on_conflict_ignore().execute()
db.close()
return f'Inserted {len(new_teams)} standings'
return f"Inserted {len(new_teams)} standings"
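The bulk-insert loops throughout these routers feed `chunked(rows, 16)` into `insert_many(...).on_conflict_ignore()`, keeping each INSERT statement to at most 16 rows. `chunked` here is peewee's helper re-exported via `db_engine`; a stdlib-only sketch of what it does (batch size and row shape are illustrative):

```python
from itertools import islice

def chunked(iterable, n):
    """Minimal stand-in for peewee's chunked(): yield lists of up to n items."""
    it = iter(iterable)
    while batch := list(islice(it, n)):
        yield batch

rows = [{"team_id": i} for i in range(35)]
batches = list(chunked(rows, 16))
print([len(b) for b in batches])  # [16, 16, 3]
```

Bounding the batch size keeps each statement within parameter limits; `on_conflict_ignore()` then makes the whole loop safe to re-run against rows that already exist.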
@router.post('/s{season}/recalculate', include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/s{season}/recalculate", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def recalculate_standings(season: int, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logger.warning(f'recalculate_standings - Bad Token: {token}')
raise HTTPException(status_code=401, detail='Unauthorized')
logger.warning(f"recalculate_standings - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
code = Standings.recalculate(season)
db.close()
if code == 69:
raise HTTPException(status_code=500, detail=f'Error recreating Standings rows')
return f'Just recalculated standings for season {season}'
raise HTTPException(status_code=500, detail=f"Error recreating Standings rows")
return f"Just recalculated standings for season {season}"

View File

@@ -13,6 +13,7 @@ from ..dependencies import (
PRIVATE_IN_SCHEMA,
handle_db_errors,
update_season_batting_stats,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
@@ -59,6 +60,8 @@ async def get_games(
division_id: Optional[int] = None,
short_output: Optional[bool] = False,
sort: Optional[str] = None,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=1000),
offset: int = Query(default=0, ge=0),
) -> Any:
all_games = StratGame.select()
@@ -119,8 +122,11 @@ async def get_games(
StratGame.season, StratGame.week, StratGame.game_num
)
total_count = all_games.count()
all_games = all_games.offset(offset).limit(limit)
return_games = {
"count": all_games.count(),
"count": total_count,
"games": [model_to_dict(x, recurse=not short_output) for x in all_games],
}
db.close()
@@ -230,7 +236,7 @@ async def patch_game(
raise HTTPException(status_code=500, detail=f"Unable to patch game {game_id}")
@router.post("", include_in_schema=PRIVATE_IN_SCHEMA)
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_games(game_list: GameList, token: str = Depends(oauth2_scheme)) -> Any:
if not valid_token(token):

View File

@@ -13,7 +13,13 @@ from ...db_engine import (
fn,
model_to_dict,
)
from ...dependencies import add_cache_headers, cache_result, handle_db_errors
from ...dependencies import (
add_cache_headers,
cache_result,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
from .common import build_season_games
router = APIRouter()
@@ -52,7 +58,7 @@ async def get_batting_totals(
risp: Optional[bool] = None,
inning: list = Query(default=None),
sort: Optional[str] = None,
limit: Optional[int] = 200,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
short_output: Optional[bool] = False,
page_num: Optional[int] = 1,
week_start: Optional[int] = None,
@@ -423,8 +429,6 @@ async def get_batting_totals(
run_plays = run_plays.order_by(StratPlay.game.asc())
# For other group_by values, skip game_id/play_num sorting since they're not in GROUP BY
if limit < 1:
limit = 1
bat_plays = bat_plays.paginate(page_num, limit)
logger.info(f"bat_plays query: {bat_plays}")
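Changing the signature from `limit: Optional[int] = 200` to `limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT)` moves range enforcement into FastAPI itself (out-of-range values get a 422 before the handler runs), which is why the manual `if limit < 1` clamp is deleted. A stdlib sketch of the contract the `Query` constraints enforce (the constant values are assumptions; the real ones live in `app.dependencies`):

```python
# Hypothetical values standing in for the imported constants.
DEFAULT_LIMIT = 200
MAX_LIMIT = 1000

def validate_limit(raw):
    """Mimic Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT): apply the
    default when absent, reject anything outside [1, MAX_LIMIT] up front
    (FastAPI would answer 422), so handlers never need to clamp."""
    limit = DEFAULT_LIMIT if raw is None else raw
    if not (1 <= limit <= MAX_LIMIT):
        raise ValueError(f"limit must be between 1 and {MAX_LIMIT}, got {limit}")
    return limit

print(validate_limit(None))  # 200
print(validate_limit(500))   # 500
```

Centralizing the bounds in two constants also lets a change like PR #106's `le=1000` on `get_games` be made in one place per endpoint.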

View File

@@ -13,7 +13,13 @@ from ...db_engine import (
fn,
SQL,
)
from ...dependencies import handle_db_errors, add_cache_headers, cache_result
from ...dependencies import (
handle_db_errors,
add_cache_headers,
cache_result,
MAX_LIMIT,
DEFAULT_LIMIT,
)
from .common import build_season_games
logger = logging.getLogger("discord_app")
@@ -51,7 +57,7 @@ async def get_fielding_totals(
team_id: list = Query(default=None),
manager_id: list = Query(default=None),
sort: Optional[str] = None,
limit: Optional[int] = 200,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
short_output: Optional[bool] = False,
page_num: Optional[int] = 1,
):
@@ -237,8 +243,6 @@ async def get_fielding_totals(
def_plays = def_plays.order_by(StratPlay.game.asc())
# For other group_by values, skip game_id/play_num sorting since they're not in GROUP BY
if limit < 1:
limit = 1
def_plays = def_plays.paginate(page_num, limit)
logger.info(f"def_plays query: {def_plays}")

View File

@@ -16,7 +16,13 @@ from ...db_engine import (
SQL,
complex_data_to_csv,
)
from ...dependencies import handle_db_errors, add_cache_headers, cache_result
from ...dependencies import (
handle_db_errors,
add_cache_headers,
cache_result,
MAX_LIMIT,
DEFAULT_LIMIT,
)
from .common import build_season_games
router = APIRouter()
@@ -51,7 +57,7 @@ async def get_pitching_totals(
risp: Optional[bool] = None,
inning: list = Query(default=None),
sort: Optional[str] = None,
limit: Optional[int] = 200,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
short_output: Optional[bool] = False,
csv: Optional[bool] = False,
page_num: Optional[int] = 1,
@@ -164,8 +170,6 @@ async def get_pitching_totals(
if group_by in ["playergame", "teamgame"]:
pitch_plays = pitch_plays.order_by(StratPlay.game.asc())
if limit < 1:
limit = 1
pitch_plays = pitch_plays.paginate(page_num, limit)
# Execute the Peewee query

View File

@@ -16,6 +16,8 @@ from ...dependencies import (
handle_db_errors,
add_cache_headers,
cache_result,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
@@ -70,7 +72,7 @@ async def get_plays(
pitcher_team_id: list = Query(default=None),
short_output: Optional[bool] = False,
sort: Optional[str] = None,
limit: Optional[int] = 200,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
page_num: Optional[int] = 1,
s_type: Literal["regular", "post", "total", None] = None,
):
@@ -185,8 +187,6 @@ async def get_plays(
season_games = season_games.where(StratGame.week > 18)
all_plays = all_plays.where(StratPlay.game << season_games)
if limit < 1:
limit = 1
bat_plays = all_plays.paginate(page_num, limit)
if sort == "wpa-desc":

View File

@@ -6,16 +6,23 @@ Thin HTTP layer using TeamService for business logic.
from fastapi import APIRouter, Query, Response, Depends
from typing import List, Optional, Literal
from ..dependencies import oauth2_scheme, PRIVATE_IN_SCHEMA, handle_db_errors, cache_result
from ..dependencies import (
oauth2_scheme,
PRIVATE_IN_SCHEMA,
handle_db_errors,
cache_result,
MAX_LIMIT,
DEFAULT_LIMIT,
)
from ..services.base import BaseService
from ..services.team_service import TeamService
router = APIRouter(prefix='/api/v3/teams', tags=['teams'])
router = APIRouter(prefix="/api/v3/teams", tags=["teams"])
@router.get('')
@router.get("")
@handle_db_errors
@cache_result(ttl=10*60, key_prefix='teams')
@cache_result(ttl=10 * 60, key_prefix="teams")
async def get_teams(
season: Optional[int] = None,
owner_id: list = Query(default=None),
@@ -23,7 +30,7 @@ async def get_teams(
team_abbrev: list = Query(default=None),
active_only: Optional[bool] = False,
short_output: Optional[bool] = False,
csv: Optional[bool] = False
csv: Optional[bool] = False,
):
"""Get teams with filtering."""
result = TeamService.get_teams(
@@ -33,46 +40,41 @@ async def get_teams(
team_abbrev=team_abbrev if team_abbrev else None,
active_only=active_only or False,
short_output=short_output or False,
as_csv=csv or False
as_csv=csv or False,
)
if csv:
return Response(content=result, media_type='text/csv')
return Response(content=result, media_type="text/csv")
return result
@router.get('/{team_id}')
@router.get("/{team_id}")
@handle_db_errors
@cache_result(ttl=30*60, key_prefix='team')
@cache_result(ttl=30 * 60, key_prefix="team")
async def get_one_team(team_id: int):
"""Get a single team by ID."""
return TeamService.get_team(team_id)
@router.get('/{team_id}/roster')
@router.get("/{team_id}/roster")
@handle_db_errors
@cache_result(ttl=30*60, key_prefix='team-roster')
async def get_team_roster_default(
team_id: int,
sort: Optional[str] = None
):
@cache_result(ttl=30 * 60, key_prefix="team-roster")
async def get_team_roster_default(team_id: int, sort: Optional[str] = None):
"""Get team roster with IL lists (defaults to current season)."""
return TeamService.get_team_roster(team_id, 'current', sort=sort)
return TeamService.get_team_roster(team_id, "current", sort=sort)
@router.get('/{team_id}/roster/{which}')
@router.get("/{team_id}/roster/{which}")
@handle_db_errors
@cache_result(ttl=30*60, key_prefix='team-roster')
@cache_result(ttl=30 * 60, key_prefix="team-roster")
async def get_team_roster(
team_id: int,
which: Literal['current', 'next'],
sort: Optional[str] = None
team_id: int, which: Literal["current", "next"], sort: Optional[str] = None
):
"""Get team roster with IL lists."""
return TeamService.get_team_roster(team_id, which, sort=sort)
@router.patch('/{team_id}')
@router.patch("/{team_id}")
async def patch_team(
team_id: int,
token: str = Depends(oauth2_scheme),
@@ -95,31 +97,33 @@ async def patch_team(
data = {
key: value
for key, value in {
'manager1_id': manager1_id, 'manager2_id': manager2_id,
'gmid': gmid, 'gmid2': gmid2, 'mascot': mascot,
'stadium': stadium, 'thumbnail': thumbnail, 'color': color,
'abbrev': abbrev, 'sname': sname, 'lname': lname,
'dice_color': dice_color, 'division_id': division_id,
"manager1_id": manager1_id,
"manager2_id": manager2_id,
"gmid": gmid,
"gmid2": gmid2,
"mascot": mascot,
"stadium": stadium,
"thumbnail": thumbnail,
"color": color,
"abbrev": abbrev,
"sname": sname,
"lname": lname,
"dice_color": dice_color,
"division_id": division_id,
}.items()
if value is not None
}
return TeamService.update_team(team_id, data, token)
@router.post('')
async def post_teams(
team_list: dict,
token: str = Depends(oauth2_scheme)
):
@router.post("/")
async def post_teams(team_list: dict, token: str = Depends(oauth2_scheme)):
"""Create multiple teams."""
return TeamService.create_teams(team_list.get("teams", []), token)
@router.delete('/{team_id}')
async def delete_team(
team_id: int,
token: str = Depends(oauth2_scheme)
):
@router.delete("/{team_id}")
async def delete_team(team_id: int, token: str = Depends(oauth2_scheme)):
"""Delete a team."""
return TeamService.delete_team(team_id, token)

View File

@@ -10,6 +10,8 @@ from ..dependencies import (
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
@@ -36,7 +38,7 @@ class TransactionList(pydantic.BaseModel):
@router.get("")
@handle_db_errors
async def get_transactions(
season: int,
team_abbrev: list = Query(default=None),
week_start: Optional[int] = 0,
week_end: Optional[int] = None,
@ -46,6 +48,8 @@ async def get_transactions(
player_id: list = Query(default=None),
move_id: Optional[str] = None,
short_output: Optional[bool] = False,
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = Query(default=0, ge=0),
):
if season:
transactions = Transaction.select_season(season)
@ -54,7 +58,7 @@ async def get_transactions(
if team_abbrev is not None:
t_list = [x.upper() for x in team_abbrev]
these_teams = Team.select().where(fn.UPPER(Team.abbrev) << t_list)
transactions = transactions.where(
(Transaction.newteam << these_teams) | (Transaction.oldteam << these_teams)
)
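The fn.UPPER change makes the abbrev filter case-insensitive on both sides of the comparison. A plain-Python sketch of the same logic (peewee pushes it into SQL in the real code; the sample data is illustrative):

```python
t_list = [x.upper() for x in ["nyy", "Bos"]]  # normalize the query side
teams = [{"abbrev": "NYY"}, {"abbrev": "bos"}, {"abbrev": "LAD"}]

# Uppercase the column side too, mirroring fn.UPPER(Team.abbrev) << t_list
matched = [t for t in teams if t["abbrev"].upper() in t_list]
```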
@ -85,8 +89,13 @@ async def get_transactions(
transactions = transactions.order_by(-Transaction.week, Transaction.moveid)
total_count = transactions.count()
transactions = transactions.offset(offset).limit(limit)
return_trans = {
"count": total_count,
"limit": limit,
"offset": offset,
"transactions": [
model_to_dict(x, recurse=not short_output) for x in transactions
],
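The pagination hunk above takes total_count from the query before applying offset and limit; otherwise "count" would just echo the page size. A list-based sketch of the response shape (function name hypothetical):

```python
def paginate(rows, limit, offset):
    """Count the full result set first, then slice out one page."""
    total = len(rows)                   # count before slicing
    page = rows[offset:offset + limit]  # then apply offset/limit
    return {"count": total, "limit": limit, "offset": offset, "transactions": page}
```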
@ -126,7 +135,7 @@ async def patch_transactions(
return f"Updated {these_moves.count()} transactions"
@router.post("/", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def post_transactions(
moves: TransactionList, token: str = Depends(oauth2_scheme)
@ -137,16 +146,31 @@ async def post_transactions(
all_moves = []
all_team_ids = list(
set(x.oldteam_id for x in moves.moves) | set(x.newteam_id for x in moves.moves)
)
all_player_ids = list(set(x.player_id for x in moves.moves))
found_team_ids = (
set(t.id for t in Team.select(Team.id).where(Team.id << all_team_ids))
if all_team_ids
else set()
)
found_player_ids = (
set(p.id for p in Player.select(Player.id).where(Player.id << all_player_ids))
if all_player_ids
else set()
)
for x in moves.moves:
if x.oldteam_id not in found_team_ids:
raise HTTPException(
status_code=404, detail=f"Team ID {x.oldteam_id} not found"
)
if x.newteam_id not in found_team_ids:
raise HTTPException(
status_code=404, detail=f"Team ID {x.newteam_id} not found"
)
if x.player_id not in found_player_ids:
raise HTTPException(
status_code=404, detail=f"Player ID {x.player_id} not found"
)
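The rewritten validation above replaces one get_or_none query per move with a single bulk select per entity type, then checks set membership per move. A sketch of the idea with plain sets (the helper name is hypothetical):

```python
def check_ids(requested, found, label):
    """Raise on the first requested ID missing from the pre-fetched set."""
    for rid in requested:
        if rid not in found:
            raise LookupError(f"{label} ID {rid} not found")

found_team_ids = {1, 2, 3}                 # stand-in for Team.select(Team.id) results
check_ids([1, 3], found_team_ids, "Team")  # passes silently
```

For N moves this costs two queries total instead of up to 3N point lookups.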


@ -3,239 +3,333 @@ from typing import List, Literal, Optional
import logging
import pydantic
from ..db_engine import (
SeasonBattingStats,
SeasonPitchingStats,
db,
Manager,
Team,
Current,
model_to_dict,
fn,
query_to_csv,
StratPlay,
StratGame,
)
from ..dependencies import (
add_cache_headers,
cache_result,
oauth2_scheme,
valid_token,
PRIVATE_IN_SCHEMA,
handle_db_errors,
update_season_batting_stats,
update_season_pitching_stats,
get_cache_stats,
MAX_LIMIT,
DEFAULT_LIMIT,
)
logger = logging.getLogger("discord_app")
router = APIRouter(prefix="/api/v3/views", tags=["views"])
@router.get("/season-stats/batting")
@handle_db_errors
@add_cache_headers(max_age=10 * 60)
@cache_result(ttl=5 * 60, key_prefix="season-batting")
async def get_season_batting_stats(
season: Optional[int] = None,
team_id: Optional[int] = None,
player_id: Optional[int] = None,
sbaplayer_id: Optional[int] = None,
min_pa: Optional[int] = None, # Minimum plate appearances
sort_by: Literal[
"pa",
"ab",
"run",
"hit",
"double",
"triple",
"homerun",
"rbi",
"bb",
"so",
"bphr",
"bpfo",
"bp1b",
"bplo",
"gidp",
"hbp",
"sac",
"ibb",
"avg",
"obp",
"slg",
"ops",
"woba",
"k_pct",
"sb",
"cs",
] = "woba", # Sort field
sort_order: Literal["asc", "desc"] = "desc", # asc or desc
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = 0,
csv: Optional[bool] = False,
):
logger.info(
f"Getting season {season} batting stats - team_id: {team_id}, player_id: {player_id}, min_pa: {min_pa}, sort_by: {sort_by}, sort_order: {sort_order}, limit: {limit}, offset: {offset}"
)
# Use the enhanced get_top_hitters method
query = SeasonBattingStats.get_top_hitters(
season=season,
stat=sort_by,
limit=limit if limit != 0 else None,
desc=(sort_order.lower() == "desc"),
team_id=team_id,
player_id=player_id,
sbaplayer_id=sbaplayer_id,
min_pa=min_pa,
offset=offset,
)
# Build applied filters for response
applied_filters = {}
if season is not None:
applied_filters["season"] = season
if team_id is not None:
applied_filters["team_id"] = team_id
if player_id is not None:
applied_filters["player_id"] = player_id
if min_pa is not None:
applied_filters["min_pa"] = min_pa
if csv:
return_val = query_to_csv(query)
return Response(content=return_val, media_type="text/csv")
else:
stat_list = [model_to_dict(stat) for stat in query]
return {"count": len(stat_list), "filters": applied_filters, "stats": stat_list}
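The stats endpoints stack @add_cache_headers with @cache_result. The real decorators live in app.dependencies and are Redis-backed and async-aware; this much-reduced in-memory sketch only illustrates the ttl/key_prefix contract, and every name inside it is a stand-in:

```python
import functools
import time

def cache_result(ttl, key_prefix):
    """In-memory stand-in: cache by prefix + arguments, expire after ttl seconds."""
    store = {}

    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            key = (key_prefix, args, tuple(sorted(kwargs.items())))
            hit = store.get(key)
            if hit is not None and time.monotonic() - hit[0] < ttl:
                return hit[1]                       # fresh cached value
            value = fn(*args, **kwargs)
            store[key] = (time.monotonic(), value)  # cache with timestamp
            return value
        return inner
    return wrap
```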
@router.post("/season-stats/batting/refresh", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def refresh_season_batting_stats(
season: int, token: str = Depends(oauth2_scheme)
) -> dict:
"""
Refresh batting stats for all players in a specific season.
Useful for full season updates.
"""
if not valid_token(token):
logger.warning(f"refresh_season_batting_stats - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
logger.info(f"Refreshing all batting stats for season {season}")
try:
# Get all player IDs who have stratplay records in this season
batter_ids = [
row.batter_id
for row in StratPlay.select(StratPlay.batter_id.distinct())
.join(StratGame)
.where(StratGame.season == season)
]
if batter_ids:
update_season_batting_stats(batter_ids, season, db)
logger.info(
f"Successfully refreshed {len(batter_ids)} players for season {season}"
)
return {
"message": f"Season {season} batting stats refreshed",
"players_updated": len(batter_ids),
}
else:
logger.warning(f"No batting data found for season {season}")
return {
"message": f"No batting data found for season {season}",
"players_updated": 0,
}
except Exception as e:
logger.error(f"Error refreshing season {season}: {e}")
raise HTTPException(status_code=500, detail=f"Refresh failed: {str(e)}")
@router.get("/season-stats/pitching")
@handle_db_errors
@add_cache_headers(max_age=10 * 60)
@cache_result(ttl=5 * 60, key_prefix="season-pitching")
async def get_season_pitching_stats(
season: Optional[int] = None,
team_id: Optional[int] = None,
player_id: Optional[int] = None,
sbaplayer_id: Optional[int] = None,
min_outs: Optional[int] = None, # Minimum outs pitched
sort_by: Literal[
"tbf",
"outs",
"games",
"gs",
"win",
"loss",
"hold",
"saves",
"bsave",
"ir",
"irs",
"ab",
"run",
"e_run",
"hits",
"double",
"triple",
"homerun",
"bb",
"so",
"hbp",
"sac",
"ibb",
"gidp",
"sb",
"cs",
"bphr",
"bpfo",
"bp1b",
"bplo",
"wp",
"balk",
"wpa",
"era",
"whip",
"avg",
"obp",
"slg",
"ops",
"woba",
"hper9",
"kper9",
"bbper9",
"kperbb",
"lob_2outs",
"rbipercent",
"re24",
] = "era", # Sort field
sort_order: Literal["asc", "desc"] = "asc", # asc or desc (asc default for ERA)
limit: int = Query(default=DEFAULT_LIMIT, ge=1, le=MAX_LIMIT),
offset: int = 0,
csv: Optional[bool] = False,
):
logger.info(
f"Getting season {season} pitching stats - team_id: {team_id}, player_id: {player_id}, min_outs: {min_outs}, sort_by: {sort_by}, sort_order: {sort_order}, limit: {limit}, offset: {offset}"
)
# Use the get_top_pitchers method
query = SeasonPitchingStats.get_top_pitchers(
season=season,
stat=sort_by,
limit=limit if limit != 0 else None,
desc=(sort_order.lower() == "desc"),
team_id=team_id,
player_id=player_id,
sbaplayer_id=sbaplayer_id,
min_outs=min_outs,
offset=offset,
)
# Build applied filters for response
applied_filters = {}
if season is not None:
applied_filters["season"] = season
if team_id is not None:
applied_filters["team_id"] = team_id
if player_id is not None:
applied_filters["player_id"] = player_id
if min_outs is not None:
applied_filters["min_outs"] = min_outs
if csv:
return_val = query_to_csv(query)
return Response(content=return_val, media_type="text/csv")
else:
stat_list = [model_to_dict(stat) for stat in query]
return {"count": len(stat_list), "filters": applied_filters, "stats": stat_list}
@router.post("/season-stats/pitching/refresh", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def refresh_season_pitching_stats(
season: int, token: str = Depends(oauth2_scheme)
) -> dict:
"""
Refresh pitching statistics for a specific season by aggregating from individual games.
Private endpoint - not included in public API documentation.
"""
if not valid_token(token):
logger.warning(f"refresh_season_pitching_stats - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
logger.info(f"Refreshing season {season} pitching stats")
try:
# Get all pitcher IDs for this season
pitcher_query = (
StratPlay.select(StratPlay.pitcher_id)
.join(StratGame, on=(StratPlay.game_id == StratGame.id))
.where((StratGame.season == season) & (StratPlay.pitcher_id.is_null(False)))
.distinct()
)
pitcher_ids = [row.pitcher_id for row in pitcher_query]
if not pitcher_ids:
logger.warning(f"No pitchers found for season {season}")
return {
"status": "success",
"message": f"No pitchers found for season {season}",
"players_updated": 0,
}
# Use the dependency function to update pitching stats
update_season_pitching_stats(pitcher_ids, season, db)
logger.info(
f"Season {season} pitching stats refreshed successfully - {len(pitcher_ids)} players updated"
)
return {
"status": "success",
"message": f"Season {season} pitching stats refreshed",
"players_updated": len(pitcher_ids),
}
except Exception as e:
logger.error(f"Error refreshing season {season} pitching stats: {e}")
raise HTTPException(status_code=500, detail=f"Refresh failed: {str(e)}")
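The refresh endpoint first collects the distinct pitcher IDs that appeared in a season, then recomputes aggregates once per player. The dedup step, expressed over plain dicts instead of the peewee query (sample data is illustrative):

```python
plays = [
    {"pitcher_id": 7, "season": 12},
    {"pitcher_id": 7, "season": 12},   # duplicate appearance, same pitcher
    {"pitcher_id": 9, "season": 12},
    {"pitcher_id": 3, "season": 11},   # other season, excluded by the filter
]

# Set comprehension mirrors SELECT DISTINCT pitcher_id ... WHERE season = 12
pitcher_ids = sorted({p["pitcher_id"] for p in plays if p["season"] == 12})
```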
@router.get("/admin/cache", include_in_schema=PRIVATE_IN_SCHEMA)
@handle_db_errors
async def get_admin_cache_stats(token: str = Depends(oauth2_scheme)) -> dict:
"""
Get Redis cache statistics and status.
Private endpoint - requires authentication.
"""
if not valid_token(token):
logger.warning(f"get_admin_cache_stats - Bad Token: {token}")
raise HTTPException(status_code=401, detail="Unauthorized")
logger.info("Getting cache statistics")
try:
cache_stats = get_cache_stats()
logger.info(f"Cache stats retrieved: {cache_stats}")
return {"status": "success", "cache_info": cache_stats}
except Exception as e:
logger.error(f"Error getting cache stats: {e}")
raise HTTPException(
status_code=500, detail=f"Failed to get cache stats: {str(e)}"
)


@ -39,7 +39,7 @@ class PlayerService(BaseService):
cache_patterns = ["players*", "players-search*", "player*", "team-roster*"]
# Deprecated fields to exclude from player responses
EXCLUDED_FIELDS = ["pitcher_injury"]
# Class-level repository for dependency injection
_injected_repo: Optional[AbstractPlayerRepository] = None
@ -135,17 +135,21 @@ class PlayerService(BaseService):
# Apply sorting
query = cls._apply_player_sort(query, sort)
# Apply pagination at DB level for real queries, Python level for mocks
if isinstance(query, InMemoryQueryResult):
total_count = len(query)
players_data = cls._query_to_player_dicts(query, short_output)
if offset is not None:
players_data = players_data[offset:]
if limit is not None:
players_data = players_data[:limit]
else:
total_count = query.count()
if offset is not None:
query = query.offset(offset)
if limit is not None:
query = query.limit(limit)
players_data = cls._query_to_player_dicts(query, short_output)
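The branch above paginates at the database for real peewee queries and in Python for InMemoryQueryResult mocks, taking the total count before slicing in both paths. A simplified sketch in which a plain list stands in for the mock type:

```python
def paginate_query(query, limit, offset):
    """Return (total, page); lists model the in-memory mock path."""
    if isinstance(query, list):
        total = len(query)                  # mock: count, then slice in Python
        return total, query[offset:][:limit]
    total = query.count()                   # real query: count, then SQL LIMIT/OFFSET
    return total, list(query.offset(offset).limit(limit))
```

Pushing LIMIT/OFFSET into SQL avoids materializing the whole table just to serve one page.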
# Return format
if as_csv:
@ -154,7 +158,7 @@ class PlayerService(BaseService):
return {
"count": len(players_data),
"total": total_count,
"players": players_data,
}
except Exception as e:
@ -204,9 +208,9 @@ class PlayerService(BaseService):
p_list = [x.upper() for x in pos]
# Expand generic "P" to match all pitcher positions
pitcher_positions = ["SP", "RP", "CP"]
if "P" in p_list:
p_list.remove("P")
p_list.extend(pitcher_positions)
pos_conditions = (
@ -245,9 +249,9 @@ class PlayerService(BaseService):
p_list = [p.upper() for p in pos]
# Expand generic "P" to match all pitcher positions
pitcher_positions = ["SP", "RP", "CP"]
if "P" in p_list:
p_list.remove("P")
p_list.extend(pitcher_positions)
player_pos = [
@ -385,19 +389,23 @@ class PlayerService(BaseService):
# This filters at the database level instead of loading all players
if search_all_seasons:
# Search all seasons, order by season DESC (newest first)
query = (
Player.select()
.where(fn.Lower(Player.name).contains(query_lower))
.order_by(Player.season.desc(), Player.name)
.limit(limit * 2)
) # Get extra for exact match sorting
else:
# Search specific season
query = (
Player.select()
.where(
(Player.season == season)
& (fn.Lower(Player.name).contains(query_lower))
)
.order_by(Player.name)
.limit(limit * 2)
) # Get extra for exact match sorting
# Execute query and convert limited results to dicts
players = list(query)
@ -468,19 +476,29 @@ class PlayerService(BaseService):
# Use backrefs=False to avoid circular reference issues
player_dict = model_to_dict(player, recurse=recurse, backrefs=False)
# Filter out excluded fields
return {
k: v for k, v in player_dict.items() if k not in cls.EXCLUDED_FIELDS
}
except (ImportError, AttributeError, TypeError) as e:
# Log the error and fall back to non-recursive serialization
logger.warning(
f"Error in recursive player serialization: {e}, falling back to non-recursive"
)
try:
# Fallback to non-recursive serialization
player_dict = model_to_dict(player, recurse=False)
return {
k: v for k, v in player_dict.items() if k not in cls.EXCLUDED_FIELDS
}
except Exception as fallback_error:
# Final fallback to basic dict conversion
logger.error(
f"Error in non-recursive serialization: {fallback_error}, using basic dict"
)
player_dict = dict(player)
return {
k: v for k, v in player_dict.items() if k not in cls.EXCLUDED_FIELDS
}
@classmethod
def update_player(
@ -508,6 +526,8 @@ class PlayerService(BaseService):
raise HTTPException(
status_code=500, detail=f"Error updating player {player_id}: {str(e)}"
)
finally:
temp_service.invalidate_related_cache(cls.cache_patterns)
@classmethod
def patch_player(
@ -535,6 +555,8 @@ class PlayerService(BaseService):
raise HTTPException(
status_code=500, detail=f"Error patching player {player_id}: {str(e)}"
)
finally:
temp_service.invalidate_related_cache(cls.cache_patterns)
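Each mutation method now invalidates its related cache patterns in a finally block, so stale entries are cleared whether the write succeeds or raises. A skeleton of that shape (all names here are hypothetical stand-ins for the service pieces):

```python
invalidated = []

def patch_record(apply_change):
    """Run a mutation; always invalidate the related cache patterns afterward."""
    try:
        return apply_change()
    finally:
        invalidated.append("players*")  # stand-in for invalidate_related_cache(...)
```

The finally clause runs on both the return path and the exception path, which is exactly why the diff moves invalidation there.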
@classmethod
def create_players(
@ -567,6 +589,8 @@ class PlayerService(BaseService):
raise HTTPException(
status_code=500, detail=f"Error creating players: {str(e)}"
)
finally:
temp_service.invalidate_related_cache(cls.cache_patterns)
@classmethod
def delete_player(cls, player_id: int, token: str) -> Dict[str, str]:
@ -590,6 +614,8 @@ class PlayerService(BaseService):
raise HTTPException(
status_code=500, detail=f"Error deleting player {player_id}: {str(e)}"
)
finally:
temp_service.invalidate_related_cache(cls.cache_patterns)
@classmethod
def _format_player_csv(cls, players: List[Dict]) -> str:
@ -603,12 +629,12 @@ class PlayerService(BaseService):
flat_player = player.copy()
# Flatten team object to just abbreviation
if isinstance(flat_player.get("team"), dict):
flat_player["team"] = flat_player["team"].get("abbrev", "")
# Flatten sbaplayer object to just ID
if isinstance(flat_player.get("sbaplayer"), dict):
flat_player["sbaplayer"] = flat_player["sbaplayer"].get("id", "")
flattened_players.append(flat_player)
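The CSV flattening above reduces nested relation dicts to single scalar columns before export. Applied to one sample row (the data is illustrative):

```python
player = {
    "name": "A. Example",
    "team": {"id": 7, "abbrev": "NYY"},
    "sbaplayer": {"id": 42},
}

flat_player = player.copy()
# Flatten team object to just its abbreviation
if isinstance(flat_player.get("team"), dict):
    flat_player["team"] = flat_player["team"].get("abbrev", "")
# Flatten sbaplayer object to just its ID
if isinstance(flat_player.get("sbaplayer"), dict):
    flat_player["sbaplayer"] = flat_player["sbaplayer"].get("id", "")
```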

requirements-dev.txt Normal file

@ -0,0 +1,2 @@
pytest==9.0.2
pytest-asyncio==1.3.0


@ -1,11 +1,10 @@
fastapi==0.133.0
uvicorn==0.41.0
starlette==0.52.1
peewee==3.13.3
python-multipart==0.0.22
numpy==1.26.4
pandas==3.0.1
psycopg2-binary==2.9.11
requests==2.32.5
redis==7.3.0


@ -81,9 +81,9 @@ class TestRouteRegistration:
for route, methods in EXPECTED_PLAY_ROUTES.items():
assert route in paths, f"Route {route} missing from OpenAPI schema"
for method in methods:
assert method in paths[route], (
f"Method {method.upper()} missing for {route}"
)
def test_play_routes_have_plays_tag(self, api):
"""All play routes should be tagged with 'plays'."""
@ -96,9 +96,9 @@ class TestRouteRegistration:
for method, spec in paths[route].items():
if method in ("get", "post", "patch", "delete"):
tags = spec.get("tags", [])
assert "plays" in tags, (
f"{method.upper()} {route} missing 'plays' tag, has {tags}"
)
@pytest.mark.post_deploy
@pytest.mark.skip(
@ -124,9 +124,9 @@ class TestRouteRegistration:
]:
params = paths[route]["get"].get("parameters", [])
param_names = [p["name"] for p in params]
assert "sbaplayer_id" in param_names, (
f"sbaplayer_id parameter missing from {route}"
)
# ---------------------------------------------------------------------------
@ -493,10 +493,9 @@ class TestPlayCrud:
assert result["id"] == play_id
def test_get_nonexistent_play(self, api):
"""GET /plays/999999999 returns 404 Not Found."""
r = requests.get(f"{api}/api/v3/plays/999999999", timeout=10)
assert r.status_code == 404
assert "not found" in r.json().get("detail", "").lower()
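The expectation changed from 500 to 404 because @handle_db_errors now re-raises HTTPException untouched and only wraps genuine database failures. A self-contained sketch of that pass-through behavior (FastAPI's HTTPException is stood in by a local class so the example runs standalone, and the wrapped endpoint is hypothetical):

```python
class HTTPException(Exception):
    def __init__(self, status_code, detail=""):
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail

def handle_db_errors(fn):
    def inner(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except HTTPException:
            raise                                             # pass through unchanged
        except Exception as exc:
            raise HTTPException(500, f"Database error: {exc}")
    return inner

@handle_db_errors
def get_play(play_id):
    raise HTTPException(404, f"Play {play_id} not found")     # service-level 404
```

Without the explicit re-raise clause, the blanket except would swallow the 404 and the client would see a 500 instead.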
@ -575,9 +574,9 @@ class TestGroupBySbaPlayer:
)
assert r_seasons.status_code == 200
season_pas = [s["pa"] for s in r_seasons.json()["stats"]]
assert career_pa >= max(season_pas), (
f"Career PA ({career_pa}) should be >= max season PA ({max(season_pas)})"
)
@pytest.mark.post_deploy
def test_batting_sbaplayer_short_output(self, api):


@ -7,21 +7,18 @@ import pytest
from unittest.mock import MagicMock, patch
import sys
import os
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from app.services.player_service import PlayerService
from app.services.base import ServiceConfig
from app.services.mocks import MockPlayerRepository, MockCacheService, EnhancedMockCache
# ============================================================================
# FIXTURES
# ============================================================================
@pytest.fixture
def cache():
"""Create fresh cache for each test."""
@ -32,20 +29,73 @@ def cache():
def repo(cache):
"""Create fresh repo with test data."""
repo = MockPlayerRepository()
# Add test players
players = [
{
"id": 1,
"name": "Mike Trout",
"wara": 5.2,
"team_id": 1,
"season": 10,
"pos_1": "CF",
"pos_2": "LF",
"strat_code": "Elite",
"injury_rating": "A",
},
{
"id": 2,
"name": "Aaron Judge",
"wara": 4.8,
"team_id": 2,
"season": 10,
"pos_1": "RF",
"strat_code": "Power",
"injury_rating": "B",
},
{
"id": 3,
"name": "Mookie Betts",
"wara": 5.5,
"team_id": 3,
"season": 10,
"pos_1": "RF",
"pos_2": "2B",
"strat_code": "Elite",
"injury_rating": "A",
},
{
"id": 4,
"name": "Injured Player",
"wara": 2.0,
"team_id": 1,
"season": 10,
"pos_1": "P",
"il_return": "Week 5",
"injury_rating": "C",
},
{
"id": 5,
"name": "Old Player",
"wara": 1.0,
"team_id": 1,
"season": 5,
"pos_1": "1B",
},
{
"id": 6,
"name": "Juan Soto",
"wara": 4.5,
"team_id": 2,
"season": 10,
"pos_1": "1B",
"strat_code": "Contact",
},
]
for player in players:
repo.add_player(player)
return repo
@ -60,463 +110,453 @@ def service(repo, cache):
# TEST CLASSES
# ============================================================================
class TestPlayerServiceGetPlayers:
"""Tests for get_players method - 50+ lines covered."""
def test_get_all_season_players(self, service, repo):
"""Get all players for a season."""
result = service.get_players(season=10)
assert result["count"] >= 5  # We have 5 season 10 players
assert len(result["players"]) >= 5
assert all(p.get("season") == 10 for p in result["players"])
def test_filter_by_single_team(self, service):
"""Filter by single team ID."""
result = service.get_players(season=10, team_id=[1])
assert result["count"] >= 1
assert all(p.get("team_id") == 1 for p in result["players"])
def test_filter_by_multiple_teams(self, service):
"""Filter by multiple team IDs."""
result = service.get_players(season=10, team_id=[1, 2])
assert result["count"] >= 2
assert all(p.get("team_id") in [1, 2] for p in result["players"])
def test_filter_by_position(self, service):
"""Filter by position."""
result = service.get_players(season=10, pos=["CF"])
assert result["count"] >= 1
assert any(
p.get("pos_1") == "CF" or p.get("pos_2") == "CF" for p in result["players"]
)
def test_filter_by_strat_code(self, service):
"""Filter by strat code."""
result = service.get_players(season=10, strat_code=["Elite"])
assert result["count"] >= 2  # Trout and Betts
assert all("Elite" in str(p.get("strat_code", "")) for p in result["players"])
def test_filter_injured_only(self, service):
"""Filter injured players only."""
result = service.get_players(season=10, is_injured=True)
assert result["count"] >= 1
assert all(p.get("il_return") is not None for p in result["players"])
def test_sort_cost_ascending(self, service):
"""Sort by WARA ascending."""
result = service.get_players(season=10, sort="cost-asc")
wara = [p.get("wara", 0) for p in result["players"]]
assert wara == sorted(wara)
def test_sort_cost_descending(self, service):
"""Sort by WARA descending."""
result = service.get_players(season=10, sort="cost-desc")
wara = [p.get("wara", 0) for p in result["players"]]
assert wara == sorted(wara, reverse=True)
def test_sort_name_ascending(self, service):
"""Sort by name ascending."""
result = service.get_players(season=10, sort="name-asc")
names = [p.get("name", "") for p in result["players"]]
assert names == sorted(names)
def test_sort_name_descending(self, service):
"""Sort by name descending."""
result = service.get_players(season=10, sort="name-desc")
names = [p.get("name", "") for p in result["players"]]
assert names == sorted(names, reverse=True)
class TestPlayerServiceSearch:
"""Tests for search_players method."""
def test_exact_name_match(self, service):
"""Search with exact name match."""
result = service.search_players("Mike Trout", season=10)
assert result["count"] >= 1
names = [p.get("name") for p in result["players"]]
assert "Mike Trout" in names
def test_partial_name_match(self, service):
"""Search with partial name match."""
result = service.search_players("Trout", season=10)
assert result["count"] >= 1
assert any("Trout" in p.get("name", "") for p in result["players"])
def test_case_insensitive_search(self, service):
"""Search is case insensitive."""
result1 = service.search_players("MIKE", season=10)
result2 = service.search_players("mike", season=10)
assert result1["count"] == result2["count"]
def test_search_all_seasons(self, service):
"""Search across all seasons."""
result = service.search_players("Player", season=None)
# Should find both current and old players
assert result["all_seasons"] == True
assert result["count"] >= 2
def test_search_limit(self, service):
"""Limit search results."""
result = service.search_players("a", season=10, limit=2)
assert result["count"] <= 2
def test_search_no_results(self, service):
"""Search returns empty when no matches."""
result = service.search_players("XYZ123NotExist", season=10)
assert result["count"] == 0
assert result["players"] == []
class TestPlayerServiceGetPlayer:
"""Tests for get_player method."""
def test_get_existing_player(self, service):
"""Get existing player by ID."""
result = service.get_player(1)
assert result is not None
assert result.get("id") == 1
assert result.get("name") == "Mike Trout"
def test_get_nonexistent_player(self, service):
"""Get player that doesn't exist."""
result = service.get_player(99999)
assert result is None
def test_get_player_short_output(self, service):
"""Get player with short output."""
result = service.get_player(1, short_output=True)
# Should still have basic fields
assert result.get("id") == 1
assert result.get("name") == "Mike Trout"
class TestPlayerServiceCreate:
"""Tests for create_players method."""
def test_create_single_player(self, repo, cache):
"""Create a single new player."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
new_player = [
{
"name": "New Player",
"wara": 3.0,
"team_id": 1,
"season": 10,
"pos_1": "SS",
}
]
# Mock auth
with patch.object(service, "require_auth", return_value=True):
result = service.create_players(new_player, "valid_token")
assert "Inserted" in str(result)
# Verify player was added (ID 7 since fixture has players 1-6)
player = repo.get_by_id(7) # Next ID after fixture data
assert player is not None
assert player["name"] == "New Player"
def test_create_multiple_players(self, repo, cache):
"""Create multiple new players."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
new_players = [
{
"name": "Player A",
"wara": 2.0,
"team_id": 1,
"season": 10,
"pos_1": "2B",
},
{
"name": "Player B",
"wara": 2.5,
"team_id": 2,
"season": 10,
"pos_1": "3B",
},
]
with patch.object(service, "require_auth", return_value=True):
result = service.create_players(new_players, "valid_token")
assert "Inserted 2 players" in str(result)
def test_create_duplicate_fails(self, repo, cache):
"""Creating duplicate player should fail."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
duplicate = [
{
"name": "Mike Trout",
"wara": 5.0,
"team_id": 1,
"season": 10,
"pos_1": "CF",
}
]
with patch.object(service, "require_auth", return_value=True):
with pytest.raises(Exception) as exc_info:
service.create_players(duplicate, "valid_token")
assert "already exists" in str(exc_info.value)
def test_create_requires_auth(self, repo, cache):
"""Creating players requires authentication."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
new_player = [
{"name": "Test", "wara": 1.0, "team_id": 1, "season": 10, "pos_1": "P"}
]
with pytest.raises(Exception) as exc_info:
service.create_players(new_player, "bad_token")
assert exc_info.value.status_code == 401
class TestPlayerServiceUpdate:
"""Tests for update_player and patch_player methods."""
def test_patch_player_name(self, repo, cache):
"""Patch player's name."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
with patch.object(service, "require_auth", return_value=True):
result = service.patch_player(1, {"name": "New Name"}, "valid_token")
assert result is not None
assert result.get("name") == "New Name"
def test_patch_player_wara(self, repo, cache):
"""Patch player's WARA."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
with patch.object(service, "require_auth", return_value=True):
result = service.patch_player(1, {"wara": 6.0}, "valid_token")
assert result.get("wara") == 6.0
def test_patch_multiple_fields(self, repo, cache):
"""Patch multiple fields at once."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
updates = {"name": "Updated Name", "wara": 7.0, "strat_code": "Super Elite"}
with patch.object(service, "require_auth", return_value=True):
result = service.patch_player(1, updates, "valid_token")
assert result.get("name") == "Updated Name"
assert result.get("wara") == 7.0
assert result.get("strat_code") == "Super Elite"
def test_patch_nonexistent_player(self, repo, cache):
"""Patch fails for non-existent player."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
with patch.object(service, "require_auth", return_value=True):
with pytest.raises(Exception) as exc_info:
service.patch_player(99999, {"name": "Test"}, "valid_token")
assert "not found" in str(exc_info.value)
def test_patch_requires_auth(self, repo, cache):
"""Patching requires authentication."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
with pytest.raises(Exception) as exc_info:
service.patch_player(1, {"name": "Test"}, "bad_token")
assert exc_info.value.status_code == 401
class TestPlayerServiceDelete:
"""Tests for delete_player method."""
def test_delete_player(self, repo, cache):
"""Delete existing player."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
# Verify player exists
assert repo.get_by_id(1) is not None
with patch.object(service, "require_auth", return_value=True):
result = service.delete_player(1, "valid_token")
assert "deleted" in str(result)
# Verify player is gone
assert repo.get_by_id(1) is None
def test_delete_nonexistent_player(self, repo, cache):
"""Delete fails for non-existent player."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
with patch.object(service, "require_auth", return_value=True):
with pytest.raises(Exception) as exc_info:
service.delete_player(99999, "valid_token")
assert "not found" in str(exc_info.value)
def test_delete_requires_auth(self, repo, cache):
"""Deleting requires authentication."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
with pytest.raises(Exception) as exc_info:
service.delete_player(1, "bad_token")
assert exc_info.value.status_code == 401
class TestPlayerServiceCache:
"""Tests for cache functionality."""
@pytest.mark.skip(reason="Caching not yet implemented in service methods")
def test_cache_set_on_read(self, service, cache):
"""Cache is set on player read."""
service.get_players(season=10)
assert cache.was_called('set')
@pytest.mark.skip(reason="Caching not yet implemented in service methods")
def test_cache_invalidation_on_update(self, repo, cache):
"""Cache is invalidated on player update."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
# Read to set cache
service.get_players(season=10)
initial_calls = len(cache.get_calls('set'))
# Update should invalidate cache
with patch.object(service, 'require_auth', return_value=True):
service.patch_player(1, {'name': 'Test'}, 'valid_token')
# Should have more delete calls after update
delete_calls = [c for c in cache.get_calls() if c.get('method') == 'delete']
assert len(delete_calls) > 0
@pytest.mark.skip(reason="Caching not yet implemented in service methods")
def test_cache_hit_rate(self, repo, cache):
"""Test cache hit rate tracking."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
# First call - cache miss
service.get_players(season=10)
miss_count = cache._miss_count
# Second call - cache hit
service.get_players(season=10)
# Hit rate should have improved
assert cache.hit_rate > 0
class TestPlayerServiceValidation:
"""Tests for input validation and edge cases."""
def test_invalid_season_returns_empty(self, service):
"""Invalid season returns empty result."""
result = service.get_players(season=999)
assert result["count"] == 0 or result["players"] == []
def test_empty_search_returns_all(self, service):
"""Empty search query returns all players."""
result = service.search_players("", season=10)
assert result["count"] >= 1
def test_sort_with_no_results(self, service):
"""Sorting with no results doesn't error."""
result = service.get_players(season=999, sort="cost-desc")
assert result["count"] == 0 or result["players"] == []
def test_cache_clear_on_create(self, repo, cache):
"""Cache is cleared when new players are created."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
# Set up some cache data
cache.set("test:key", "value", 300)
with patch.object(service, "require_auth", return_value=True):
service.create_players(
[
{
"name": "New",
"wara": 1.0,
"team_id": 1,
"season": 10,
"pos_1": "P",
}
],
"valid_token",
)
# Should have invalidate calls
assert len(cache.get_calls()) > 0
class TestPlayerServiceIntegration:
"""Integration tests combining multiple operations."""
def test_full_crud_cycle(self, repo, cache):
"""Test complete CRUD cycle."""
config = ServiceConfig(player_repo=repo, cache=cache)
service = PlayerService(config=config)
# CREATE
with patch.object(service, "require_auth", return_value=True):
create_result = service.create_players(
[
{
"name": "CRUD Test",
"wara": 3.0,
"team_id": 1,
"season": 10,
"pos_1": "DH",
}
],
"valid_token",
)
# READ
search_result = service.search_players("CRUD", season=10)
assert search_result["count"] >= 1
player_id = search_result["players"][0].get("id")
# UPDATE
with patch.object(service, "require_auth", return_value=True):
update_result = service.patch_player(
player_id, {"wara": 4.0}, "valid_token"
)
assert update_result.get("wara") == 4.0
# DELETE
with patch.object(service, "require_auth", return_value=True):
delete_result = service.delete_player(player_id, "valid_token")
assert "deleted" in str(delete_result)
# VERIFY DELETED
get_result = service.get_player(player_id)
assert get_result is None
def test_search_then_filter(self, service):
"""Search and then filter operations."""
# First get all players
all_result = service.get_players(season=10)
initial_count = all_result["count"]
# Then filter by team
filtered = service.get_players(season=10, team_id=[1])
# Filtered should be <= all
assert filtered["count"] <= initial_count
# ============================================================================

"""
Tests for query limit/offset parameter validation and middleware behavior.
Verifies that:
- FastAPI enforces MAX_LIMIT cap (returns 422 for limit > 500)
- FastAPI enforces ge=1 on limit (returns 422 for limit=0 or limit=-1)
- Transactions endpoint returns limit/offset keys in the response
- strip_empty_query_params middleware treats ?param= as absent
These tests exercise FastAPI parameter validation which fires before any
handler code runs, so most tests don't require a live DB connection.
The app imports redis and psycopg2 at module level, so we mock those
system-level packages before importing app.main.
"""
import sys
import pytest
from unittest.mock import MagicMock, patch
# ---------------------------------------------------------------------------
# Stub out C-extension / system packages that aren't installed in the test
# environment before any app code is imported.
# ---------------------------------------------------------------------------
_redis_stub = MagicMock()
_redis_stub.Redis = MagicMock(return_value=MagicMock(ping=MagicMock(return_value=True)))
sys.modules.setdefault("redis", _redis_stub)
_psycopg2_stub = MagicMock()
sys.modules.setdefault("psycopg2", _psycopg2_stub)
_playhouse_pool_stub = MagicMock()
sys.modules.setdefault("playhouse.pool", _playhouse_pool_stub)
_playhouse_pool_stub.PooledPostgresqlDatabase = MagicMock()
_pandas_stub = MagicMock()
sys.modules.setdefault("pandas", _pandas_stub)
_pandas_stub.DataFrame = MagicMock()
@pytest.fixture(scope="module")
def client():
"""
TestClient with the Peewee db object mocked so the app can be imported
without a running PostgreSQL instance. FastAPI validates query params
before calling handler code, so 422 responses don't need a real DB.
"""
mock_db = MagicMock()
mock_db.is_closed.return_value = False
mock_db.connect.return_value = None
mock_db.close.return_value = None
with patch("app.db_engine.db", mock_db):
from fastapi.testclient import TestClient
from app.main import app
with TestClient(app, raise_server_exceptions=False) as c:
yield c
def test_limit_exceeds_max_returns_422(client):
"""
GET /api/v3/decisions with limit=1000 should return 422.
MAX_LIMIT is 500; the decisions endpoint declares
limit: int = Query(ge=1, le=MAX_LIMIT), so FastAPI rejects values > 500
before any handler code runs.
"""
response = client.get("/api/v3/decisions?limit=1000")
assert response.status_code == 422
def test_limit_zero_returns_422(client):
"""
GET /api/v3/decisions with limit=0 should return 422.
Query(ge=1) rejects zero values.
"""
response = client.get("/api/v3/decisions?limit=0")
assert response.status_code == 422
def test_limit_negative_returns_422(client):
"""
GET /api/v3/decisions with limit=-1 should return 422.
Query(ge=1) rejects negative values.
"""
response = client.get("/api/v3/decisions?limit=-1")
assert response.status_code == 422
def test_transactions_has_limit_in_response(client):
"""
GET /api/v3/transactions?season=12 should include 'limit' and 'offset'
keys in the JSON response body.
The transactions endpoint was updated to return pagination metadata
alongside results so callers know the applied page size.
"""
mock_qs = MagicMock()
mock_qs.count.return_value = 0
mock_qs.where.return_value = mock_qs
mock_qs.order_by.return_value = mock_qs
mock_qs.offset.return_value = mock_qs
mock_qs.limit.return_value = mock_qs
mock_qs.__iter__ = MagicMock(return_value=iter([]))
with (
patch("app.routers_v3.transactions.Transaction") as mock_txn,
patch("app.routers_v3.transactions.Team") as mock_team,
patch("app.routers_v3.transactions.Player") as mock_player,
):
mock_txn.select_season.return_value = mock_qs
mock_txn.select.return_value = mock_qs
mock_team.select.return_value = mock_qs
mock_player.select.return_value = mock_qs
response = client.get("/api/v3/transactions?season=12")
# If the mock is sufficient the response is 200 with pagination keys;
# if some DB path still fires we at least confirm limit param is accepted.
assert response.status_code != 422
if response.status_code == 200:
data = response.json()
assert "limit" in data, "Response missing 'limit' key"
assert "offset" in data, "Response missing 'offset' key"
def test_empty_string_param_stripped(client):
"""
Query params with an empty string value should be treated as absent.
The strip_empty_query_params middleware rewrites the query string before
FastAPI parses it, so ?league_abbrev= is removed entirely rather than
forwarded as an empty string to the handler.
Expected: the request is accepted (not 422) and the empty param is ignored.
"""
mock_qs = MagicMock()
mock_qs.count.return_value = 0
mock_qs.where.return_value = mock_qs
mock_qs.__iter__ = MagicMock(return_value=iter([]))
with patch("app.routers_v3.standings.Standings") as mock_standings:
mock_standings.select_season.return_value = mock_qs
# ?league_abbrev= should be stripped → treated as absent (None), not ""
response = client.get("/api/v3/standings?season=12&league_abbrev=")
assert response.status_code != 422, (
"Empty string query param caused a 422 — middleware may not be stripping it"
)