Compare commits


43 Commits

Author SHA1 Message Date
cal
d83a4bdbb7 Merge pull request 'feat: S3 upload pipeline for APNG animated cards (#198)' (#210) from issue/198-feat-s3-upload-pipeline-for-apng-animated-cards into main 2026-04-08 15:25:40 +00:00
Cal Corum
b29450e7d6 feat: S3 upload pipeline for APNG animated cards (#198)
Extends card_storage.py with build_apng_s3_key, upload_apng_to_s3, and
upload_variant_apng to handle animated card uploads. Wires get_animated_card
to trigger a background S3 upload on each new render (cache miss, non-preview).

Closes #198

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-08 10:04:21 -05:00
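The diff for card_storage.py is not included in this compare, so here is a minimal sketch of what a key builder like build_apng_s3_key could look like, assuming the S3 key convention documented in apng_generator.py (cards/cardset-{id}/{card_type}/{player_id}-{date}-v{variant}.apng); the real signature and name resolution may differ.

```python
# Hypothetical sketch of the key builder named in the commit; the actual
# implementation lives in card_storage.py and may differ.
def build_apng_s3_key(
    cardset_id: int, card_type: str, player_id: int, d: str, variant: int
) -> str:
    # Follows the documented path convention:
    #   cards/cardset-{id}/{card_type}/{player_id}-{date}-v{variant}.apng
    return f"cards/cardset-{cardset_id}/{card_type}/{player_id}-{d}-v{variant}.apng"
```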
cal
5ff11759f9 Merge pull request 'feat: add REFRACTOR_START_SEASON floor to evaluator queries (#195)' (#209) from issue/195-docs-document-cross-season-stat-accumulation-decis into main 2026-04-08 13:25:48 +00:00
Cal Corum
fd2cc6534a feat: add REFRACTOR_START_SEASON floor to evaluator queries (#195)
Adds REFRACTOR_START_SEASON constant (default 11, overridable via env var)
to db_engine.py and applies it as a season filter in both BattingSeasonStats
and PitchingSeasonStats queries in refractor_evaluator.py, ensuring pre-Season
11 stats are excluded from refractor progress accumulation.

Closes #195

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-08 08:03:07 -05:00
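The mechanism is a simple lower bound on the season column. Sketched here over plain dicts rather than the Peewee models (the function name and row shape are illustrative, not from the repo); the real code applies the equivalent `.where(season >= floor)` inside the SQL query.

```python
import os

# Default floor of 11, overridable via env var, as the commit describes.
REFRACTOR_START_SEASON = int(os.environ.get("REFRACTOR_START_SEASON", "11"))

def filter_refractor_seasons(rows):
    # Keep only stat rows at or after the Refractor start season; earlier
    # seasons are excluded from refractor progress accumulation.
    return [r for r in rows if r["season"] >= REFRACTOR_START_SEASON]
```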
cal
7701777273 Merge pull request 'feat: include animated_url in tier-up response for T3/T4 (#201)' (#208) from issue/201-feat-include-animated-url-in-tier-up-response-for into main 2026-04-08 10:25:53 +00:00
Cal Corum
4028a24ef9 feat: include animated_url in tier-up response for T3/T4 (#201)
Closes #201

Add animated_url to evaluate-game tier-up entries when new_tier >= 3.
URL is constructed from API_BASE_URL env var + the /animated endpoint
path, using today's date as the cache-bust segment.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-08 04:35:55 -05:00
cal
c8ec976626 Merge pull request 'fix: return image/apng media type from animated card endpoint (#196)' (#204) from issue/196-bug-apng-endpoint-returns-wrong-media-type-image-p into main 2026-04-08 02:26:04 +00:00
cal
e1f3371321 Merge branch 'main' into issue/196-bug-apng-endpoint-returns-wrong-media-type-image-p 2026-04-08 02:25:58 +00:00
cal
3cc4c65717 Merge pull request 'perf: batch image_url prefetch in list_card_states to eliminate N+1 (#199)' (#206) from issue/199-perf-n-1-query-in-build-card-state-response-for-im into main 2026-04-08 02:25:52 +00:00
Cal Corum
b7196c1c56 perf: batch image_url prefetch in list_card_states to eliminate N+1 (#199)
Replace per-row CardModel.get() in _build_card_state_response with a
bulk prefetch in list_card_states: collect variant player IDs, issue at
most 2 queries (BattingCard + PitchingCard), build a (player_id, variant)
-> image_url map, and pass the resolved value directly to the helper.

The single-card get_card_state path is unchanged and still resolves
image_url inline (one extra query is acceptable for a single-item response).

Closes #199

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 21:04:33 -05:00
cal
3852fe1408 Merge pull request 'fix: variant filter commented out on cards list endpoint (#197)' (#203) from issue/197-bug-variant-filter-commented-out-on-cards-list-end into main 2026-04-08 01:45:07 +00:00
cal
b69b4264e8 Merge branch 'main' into issue/197-bug-variant-filter-commented-out-on-cards-list-end 2026-04-08 01:45:00 +00:00
cal
6d857a0f93 Merge pull request 'feat: add template drift check and cache management to deploy tooling' (#205) from chore/deploy-tooling-templates into main 2026-04-08 01:36:35 +00:00
Cal Corum
900f9723e5 fix: address PR review — unknown flag guard, local var scope, container map
- Reject unknown --flags with error instead of silently treating as commit SHA
- Declare remote_hash as local to prevent stale values across loop iterations
- Use associative array for container names (consistent with DEPLOY_HOST pattern)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 20:32:12 -05:00
Cal Corum
cf7279a573 fix: also update cache hit path to return image/apng media type (#204)
The cache hit branch at line 773 still returned image/png, meaning
the MIME type fix was never seen in production since cached responses
dominate. Update it to match the cache miss path.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 20:31:11 -05:00
Cal Corum
91a57454f2 feat: add template drift check and cache management to deploy tooling
deploy.sh now checks local vs remote templates via md5sum on every
deploy and warns about drift. Pass --sync-templates to push changed
files. Also reports cached card image counts on the target server.

New clear-card-cache.sh script inspects or clears cached PNG/APNG
card images inside the API container, with --apng-only and --all
modes. Added scripts/README.md documenting all operational scripts.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 20:24:51 -05:00
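deploy.sh is a shell script; the drift check it performs amounts to comparing md5 digests of the local template against the copy on the server. The same idea in Python (function names here are illustrative, not from the repo):

```python
import hashlib
from pathlib import Path

def md5sum(path: Path) -> str:
    # Same digest md5sum(1) would print for the file's contents.
    return hashlib.md5(path.read_bytes()).hexdigest()

def has_drifted(local_template: Path, remote_copy: Path) -> bool:
    # Drift means the deployed template no longer matches the repo copy.
    return md5sum(local_template) != md5sum(remote_copy)
```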
Cal Corum
dcff8332a2 fix: return image/apng media type from animated card endpoint (#196)
Closes #196

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 20:01:48 -05:00
Cal Corum
ea36c40902 fix: uncomment variant filter and add variant to CSV export (#197)
Restores the commented-out `?variant=` query filter on GET /api/v2/cards
so callers can pass variant=0 for base cards or variant=N for a specific
refractor variant. Adds variant column to CSV output header and rows.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 19:31:48 -05:00
cal
19003215a3 Merge pull request 'fix: improve diamond tier indicator visibility' (#194) from fix/diamond-indicator-styling into main 2026-04-07 13:58:12 +00:00
Cal Corum
73be3dd6f3 fix: improve diamond tier indicator visibility
- Add silver backing div behind diamond so the X gap lines pop
- Darken unfilled quads for better contrast with filled tier color
- Make diamond grid background transparent (backing shows through gaps)
- Deduplicate shared positioning into a combined CSS rule
- Remove dead TIER_DIAMOND_COLORS dict and filled_bg from Python
  (template already computes these via Jinja)

NOTE: These are volume-mounted template files, NOT baked into the
Docker image. After merging, manually deploy to each server:

  scp storage/templates/tier_style.html <host>:<container-data>/storage/templates/
  scp storage/templates/player_card.html <host>:<container-data>/storage/templates/

Hosts:
  Dev:  ssh pd-database → /home/cal/container-data/dev-pd-database/
  Prod: ssh akamai → /root/container-data/paper-dynasty/

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 08:47:24 -05:00
cal
35fbe2082d Merge pull request 'fix: pass diamond tier colors to card template' (#193) from fix/diamond-tier-colors into main
All checks were successful
Build Docker Image / build (push) Successful in 8m48s
2026-04-07 05:13:02 +00:00
Cal Corum
a105d5412a fix: pass diamond tier colors to card template
The tier_style.html template references {{ filled_bg }} for diamond
quad backgrounds but it was never set in the rendering code, making
the tier indicator invisible.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 00:12:26 -05:00
cal
85f6783eea Merge pull request 'fix: correct apng version pin (0.3.4, not 3.1.1)' (#192) from fix/apng-version-pin into main
All checks were successful
Build Docker Image / build (push) Successful in 8m17s
2026-04-07 04:25:46 +00:00
Cal Corum
1fd681d595 fix: correct apng version pin (0.3.4, not 3.1.1)
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 23:23:48 -05:00
cal
7cb998561e Merge pull request 'chore: add deploy script for dev/prod CI releases' (#191) from chore/deploy-script-apply into main
Some checks failed
Build Docker Image / build (push) Failing after 1m20s
2026-04-07 03:55:29 +00:00
Cal Corum
321efa4909 chore: add deploy script for dev/prod tag-based CI releases
Closes PR #190 (chore/deploy-script — applied directly due to Gitea rebase conflict from stale branch base)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-06 22:49:40 -05:00
cal
0dc096be93 Merge pull request 'feat: refractor card art pipeline — S3 upload + image_url' (#187) from feat/refractor-card-art-pipeline into main 2026-04-07 03:29:24 +00:00
Cal Corum
20f7ac5958 merge: resolve requirements.txt conflict — include both apng and boto3
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 22:27:15 -05:00
cal
67d1f10455 Merge pull request 'feat: APNG animated card effects for T3/T4 refractor tiers (#186)' (#188) from issue/186-feat-apng-animated-card-effects-for-t3-t4-refracto into main
Reviewed-on: #188
2026-04-07 03:22:49 +00:00
Cal Corum
30b5eefa29 feat: APNG animated card effects for T3/T4 refractor tiers (#186)
Closes #186

- Add app/services/apng_generator.py: scrubs CSS animations via Playwright
  (negative animation-delay + running override), assembles frames with apng lib
- Add GET /{player_id}/{card_type}card/{d}/{variant}/animated endpoint: returns
  cached .apng for T3/T4 cards, 404 for T0-T2
- T3: 12 frames × 200ms (gold shimmer, 2.5s cycle)
- T4: 24 frames × 250ms (prismatic sweep + diamond glow, 6s cycle)
- Cache path: storage/cards/cardset-{id}/{type}/{player_id}-{d}-v{variant}.apng
- Add apng==3.1.1 to requirements.txt

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 03:22:05 +00:00
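The scrubbing trick this commit describes — freezing a CSS animation at evenly spaced points using a negative animation-delay plus a paused play-state — can be sketched independently of Playwright. Helper names below are illustrative; the real logic lives in app/services/apng_generator.py.

```python
def frame_delays(cycle_s: float, num_frames: int) -> list[float]:
    # Evenly spaced offsets into the animation cycle: frame i sits at
    # i/num_frames of the way through. T3: 12 frames over 2.5s; T4: 24 over 6s.
    return [cycle_s * i / num_frames for i in range(num_frames)]

def scrub_css(offset_s: float) -> str:
    # A negative animation-delay jumps the animation to offset_s into its
    # cycle; pausing it holds that frame steady for the screenshot.
    return (
        f"* {{ animation-delay: -{offset_s}s !important; "
        f"animation-play-state: paused !important; }}"
    )
```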
cal
02da6f9cc8 Merge pull request 'fix: ensure count is never null in GET /refractor/cards (#183)' (#185) from issue/183-bug-get-refractor-cards-returns-count-null-with-ve into main
Reviewed-on: #185
2026-04-07 03:15:43 +00:00
Cal Corum
543c8cddf6 fix: ensure count is never null in GET /refractor/cards (#183)
Guards against Peewee 3.17.9 returning None from .count() on a
complex multi-join query when 0 rows match the filter set.

Closes #183

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 03:15:21 +00:00
Cal Corum
fac9f66b3e docs: fix dev PostgreSQL container/db/user in CLAUDE.md
Dev environment uses sba_postgres container, paperdynasty_dev database,
sba_admin user — not pd_postgres/pd_master as previously documented.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 03:15:21 +00:00
Cal Corum
c75e2781be fix: address review feedback (#187)
- Move lazy imports to top level in card_storage.py and players.py (CLAUDE.md violation)
- Use os.environ.get() for S3_BUCKET/S3_REGION to allow dev/prod bucket separation
- Fix test patch targets from app.db_engine to app.services.card_storage (required after top-level import move)
- Fix assert_called_once_with field name: MockBatting.player → MockBatting.player_id

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-06 20:03:02 -05:00
Cal Corum
f4a90da629 fix: review feedback — pin boto3, use player_id consistently, add comment
- Pin boto3==1.42.65 to match project convention of exact version pins
- Use player_id (not player) for FK column access in card_storage.py
  to match the pattern used throughout the codebase
- Add comment explaining the tier is None guard in S3 upload scheduling

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 22:36:20 +00:00
Cal Corum
be8bebe663 feat: include image_url in refractor cards API response
Adds image_url field to each card state entry in the GET
/api/v2/refractor/cards response. Resolved by looking up the variant
BattingCard/PitchingCard row. Returns null when no image has been
rendered yet.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 22:36:20 +00:00
Cal Corum
4ccc0841a8 feat: schedule S3 upload for variant cards after Playwright render
Adds BackgroundTasks to the card render endpoint. After rendering a
variant card (variant > 0) where image_url is None, schedules
backfill_variant_image_url to upload the PNG to S3 and populate
image_url on the card row.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 22:36:20 +00:00
Cal Corum
534d50f1a8 feat: add card_storage S3 upload utility for variant cards
New service with S3 upload functions for the refractor card art
pipeline. backfill_variant_image_url reads rendered PNGs from disk,
uploads to S3, and sets image_url on BattingCard/PitchingCard rows.
18 tests covering key construction, URL formatting, upload params,
and error swallowing.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 22:36:20 +00:00
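The service itself is not among the visible diffs, so here is a dependency-injected sketch of the backfill flow described above. The behavior (read PNG from disk, upload, set image_url, swallow errors) comes from the commit text; the URL shape, bucket, and callable-based design are assumptions — the real code uses boto3 directly.

```python
from pathlib import Path

def build_card_url(bucket: str, region: str, key: str) -> str:
    # Assumed virtual-hosted-style S3 URL; the project may format this differently.
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

def backfill_variant_image_url(
    png_path: Path, key: str, upload, set_image_url,
    bucket: str = "example-bucket", region: str = "us-east-1",
) -> None:
    # Read the rendered PNG, upload it, then persist the public URL on the
    # card row. Errors are swallowed so a failed upload never breaks a render.
    try:
        upload(png_path.read_bytes(), key)
        set_image_url(build_card_url(bucket, region, key))
    except Exception:
        pass
```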
Cal Corum
088d30b96b docs: fix dev PostgreSQL container/db/user in CLAUDE.md
Dev environment uses sba_postgres container, paperdynasty_dev database,
sba_admin user — not pd_postgres/pd_master as previously documented.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 22:36:20 +00:00
cal
abb1c71f0a Merge pull request 'chore: pre-commit auto-fixes + remove unused Refractor tables' (#181) from chore/pre-commit-autofix into main
Reviewed-on: #181
2026-04-06 04:08:37 +00:00
Cal Corum
bf3a8ca0d5 chore: remove unused RefractorTierBoost and RefractorCosmetic tables
Speculative schema from initial Refractor design that was never used —
boosts are hardcoded in refractor_boost.py and tier visuals are embedded
in CSS templates. Both tables have zero rows on dev and prod.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 22:27:26 -05:00
Cal Corum
bbb5689b1f fix: address review feedback (#181)
- pre-commit: guard ruff --fix + git add with git stash --keep-index so
  partial-staging (git add -p) workflows are not silently broken
- pre-commit: use -z / xargs -0 for null-delimited filename handling
- install-hooks.sh: update echo messages to reflect auto-fix behaviour

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-04 23:03:06 -05:00
Cal Corum
0d009eb1f8 chore: pre-commit hook auto-fixes ruff violations before blocking
Instead of failing and requiring manual fix + re-commit, the hook now
runs ruff check --fix first, re-stages the fixed files, then checks
for remaining unfixable issues.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-04 22:19:37 -05:00
25 changed files with 1812 additions and 192 deletions

.githooks/install-hooks.sh (new executable file, 31 lines)
#!/bin/bash
#
# Install git hooks for this repository
#

REPO_ROOT=$(git rev-parse --show-toplevel 2>/dev/null)
if [ -z "$REPO_ROOT" ]; then
    echo "Error: Not in a git repository"
    exit 1
fi

HOOKS_DIR="$REPO_ROOT/.githooks"
GIT_HOOKS_DIR="$REPO_ROOT/.git/hooks"

echo "Installing git hooks..."

if [ -f "$HOOKS_DIR/pre-commit" ]; then
    cp "$HOOKS_DIR/pre-commit" "$GIT_HOOKS_DIR/pre-commit"
    chmod +x "$GIT_HOOKS_DIR/pre-commit"
    echo "Installed pre-commit hook"
else
    echo "pre-commit hook not found in $HOOKS_DIR"
fi

echo ""
echo "The pre-commit hook will:"
echo "  - Auto-fix ruff lint violations (unused imports, formatting, etc.)"
echo "  - Block commits only on truly unfixable issues"
echo ""
echo "To bypass in emergency: git commit --no-verify"

.githooks/pre-commit (new executable file, 53 lines)
#!/bin/bash
#
# Pre-commit hook: ruff lint check on staged Python files.
# Catches syntax errors, unused imports, and basic issues before commit.
# To bypass in emergency: git commit --no-verify
#

RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m'

REPO_ROOT=$(git rev-parse --show-toplevel)
cd "$REPO_ROOT"

STAGED_PY=$(git diff --cached --name-only --diff-filter=ACM -z -- '*.py')
if [ -z "$STAGED_PY" ]; then
    exit 0
fi

echo "ruff check on staged files..."

# Stash unstaged changes so ruff only operates on staged content.
# Without this, ruff --fix runs on the full working tree file (staged +
# unstaged), and the subsequent git add would silently include unstaged
# changes in the commit — breaking git add -p workflows.
STASHED=0
if git stash --keep-index -q 2>/dev/null; then
    STASHED=1
fi

# Auto-fix what we can, then re-stage the fixed files
printf '%s' "$STAGED_PY" | xargs -0 ruff check --fix --exit-zero
printf '%s' "$STAGED_PY" | xargs -0 git add

# Restore unstaged changes
if [ $STASHED -eq 1 ]; then
    git stash pop -q
fi

# Now check for remaining unfixable issues
printf '%s' "$STAGED_PY" | xargs -0 ruff check
RUFF_EXIT=$?

if [ $RUFF_EXIT -ne 0 ]; then
    echo ""
    echo -e "${RED}Pre-commit checks failed (unfixable issues). Commit blocked.${NC}"
    echo -e "${YELLOW}To bypass (not recommended): git commit --no-verify${NC}"
    exit 1
fi

echo -e "${GREEN}All checks passed.${NC}"
exit 0

CLAUDE.md

@@ -31,7 +31,7 @@ docker build -t paper-dynasty-db .  # Build image
 | **URL** | pddev.manticorum.com | pd.manticorum.com |
 | **Host** | `ssh pd-database` | `ssh akamai` `/root/container-data/paper-dynasty` |
 | **API container** | `dev_pd_database` | `pd_api` |
-| **PostgreSQL** | `pd_postgres` (port 5432) | `pd_postgres` |
+| **PostgreSQL** | `sba_postgres` / `paperdynasty_dev` / `sba_admin` | `pd_postgres` / `pd_master` |
 | **Adminer** | port 8081 | — |
 | **API port** | 816 | 815 |
 | **Image** | `manticorum67/paper-dynasty-database` | `manticorum67/paper-dynasty-database` |

app/db_engine.py

@@ -44,6 +44,10 @@ else:
         pragmas={"journal_mode": "wal", "cache_size": -1 * 64000, "synchronous": 0},
     )

+# Refractor stat accumulation starts at this season — stats from earlier seasons
+# are excluded from evaluation queries. Override via REFRACTOR_START_SEASON env var.
+REFRACTOR_START_SEASON = int(os.environ.get("REFRACTOR_START_SEASON", "11"))
+
 # 2025, 2005
 ranked_cardsets = [24, 25, 26, 27, 28, 29]
 LIVE_CARDSET_ID = 27
@@ -1257,50 +1261,15 @@ refractor_card_state_team_index = ModelIndex(
 RefractorCardState.add_index(refractor_card_state_team_index)

-class RefractorTierBoost(BaseModel):
-    track = ForeignKeyField(RefractorTrack)
-    tier = IntegerField()  # 1-4
-    boost_type = CharField()  # e.g. 'rating', 'stat'
-    boost_target = CharField()  # e.g. 'contact_vl', 'power_vr'
-    boost_value = FloatField(default=0.0)
-
-    class Meta:
-        database = db
-        table_name = "refractor_tier_boost"
-
-refractor_tier_boost_index = ModelIndex(
-    RefractorTierBoost,
-    (
-        RefractorTierBoost.track,
-        RefractorTierBoost.tier,
-        RefractorTierBoost.boost_type,
-        RefractorTierBoost.boost_target,
-    ),
-    unique=True,
-)
-RefractorTierBoost.add_index(refractor_tier_boost_index)
-
-class RefractorCosmetic(BaseModel):
-    name = CharField(unique=True)
-    tier_required = IntegerField(default=0)
-    cosmetic_type = CharField()  # 'frame', 'badge', 'theme'
-    css_class = CharField(null=True)
-    asset_url = CharField(null=True)
-
-    class Meta:
-        database = db
-        table_name = "refractor_cosmetic"
-
 class RefractorBoostAudit(BaseModel):
     card_state = ForeignKeyField(RefractorCardState, on_delete="CASCADE")
     tier = IntegerField()  # 1-4
     battingcard = ForeignKeyField(BattingCard, null=True)
     pitchingcard = ForeignKeyField(PitchingCard, null=True)
     variant_created = IntegerField()
-    boost_delta_json = TextField()  # JSONB in PostgreSQL; TextField for SQLite test compat
+    boost_delta_json = (
+        TextField()
+    )  # JSONB in PostgreSQL; TextField for SQLite test compat
     applied_at = DateTimeField(default=datetime.now)

     class Meta:

@@ -1313,8 +1282,6 @@ if not SKIP_TABLE_CREATION:
         [
             RefractorTrack,
             RefractorCardState,
-            RefractorTierBoost,
-            RefractorCosmetic,
             RefractorBoostAudit,
         ],
         safe=True,


@@ -79,8 +79,8 @@ async def get_cards(
         all_cards = all_cards.where(Card.pack == this_pack)
     if value is not None:
         all_cards = all_cards.where(Card.value == value)
-    # if variant is not None:
-    #     all_cards = all_cards.where(Card.variant == variant)
+    if variant is not None:
+        all_cards = all_cards.where(Card.variant == variant)
     if min_value is not None:
         all_cards = all_cards.where(Card.value >= min_value)
     if max_value is not None:

@@ -114,8 +114,8 @@ async def get_cards(
     if csv:
         data_list = [
-            ["id", "player", "cardset", "rarity", "team", "pack", "value"]
-        ]  # , 'variant']]
+            ["id", "player", "cardset", "rarity", "team", "pack", "value", "variant"]
+        ]
         for line in all_cards:
             data_list.append(
                 [

@@ -125,7 +125,8 @@ async def get_cards(
                     line.player.rarity,
                     line.team.abbrev,
                     line.pack,
-                    line.value,  # line.variant
+                    line.value,
+                    line.variant,
                 ]
             )
     return_val = DataFrame(data_list).to_csv(header=False, index=False)

players.py

@@ -2,7 +2,15 @@
 import datetime
 import os.path
 import pandas as pd
-from fastapi import APIRouter, Depends, HTTPException, Request, Response, Query
+from fastapi import (
+    APIRouter,
+    BackgroundTasks,
+    Depends,
+    HTTPException,
+    Request,
+    Response,
+    Query,
+)
 from fastapi.responses import FileResponse
 from fastapi.templating import Jinja2Templates
 from typing import Optional, List, Literal

@@ -32,7 +40,9 @@ from ..db_engine import (
 )
 from ..db_helpers import upsert_players
 from ..dependencies import oauth2_scheme, valid_token
+from ..services.card_storage import backfill_variant_image_url, upload_variant_apng
 from ..services.refractor_boost import compute_variant_hash
+from ..services.apng_generator import apng_cache_path, generate_animated_card

 # ---------------------------------------------------------------------------
 # Persistent browser instance (WP-02)

@@ -727,11 +737,149 @@ async def get_one_player(player_id: int, csv: Optional[bool] = False):
     return return_val
+@router.get("/{player_id}/{card_type}card/{d}/{variant}/animated")
+async def get_animated_card(
+    request: Request,
+    background_tasks: BackgroundTasks,
+    player_id: int,
+    card_type: Literal["batting", "pitching"],
+    variant: int,
+    d: str,
+    tier: Optional[int] = Query(
+        None, ge=0, le=4, description="Override refractor tier for preview (dev only)"
+    ),
+):
+    try:
+        this_player = Player.get_by_id(player_id)
+    except DoesNotExist:
+        raise HTTPException(
+            status_code=404, detail=f"No player found with id {player_id}"
+        )
+
+    refractor_tier = (
+        tier if tier is not None else resolve_refractor_tier(player_id, variant)
+    )
+    if refractor_tier < 3:
+        raise HTTPException(
+            status_code=404,
+            detail=f"No animation for tier {refractor_tier}; animated cards require T3 or T4",
+        )
+
+    cache_path = apng_cache_path(
+        this_player.cardset.id, card_type, player_id, d, variant
+    )
+    headers = {"Cache-Control": "public, max-age=86400"}
+    if os.path.isfile(cache_path) and tier is None:
+        return FileResponse(path=cache_path, media_type="image/apng", headers=headers)
+
+    all_pos = (
+        CardPosition.select()
+        .where(CardPosition.player == this_player)
+        .order_by(CardPosition.innings.desc())
+    )
+
+    if card_type == "batting":
+        this_bc = BattingCard.get_or_none(
+            BattingCard.player == this_player, BattingCard.variant == variant
+        )
+        if this_bc is None:
+            raise HTTPException(
+                status_code=404,
+                detail=f"Batting card not found for id {player_id}, variant {variant}",
+            )
+        rating_vl = BattingCardRatings.get_or_none(
+            BattingCardRatings.battingcard == this_bc, BattingCardRatings.vs_hand == "L"
+        )
+        rating_vr = BattingCardRatings.get_or_none(
+            BattingCardRatings.battingcard == this_bc, BattingCardRatings.vs_hand == "R"
+        )
+        if None in [rating_vr, rating_vl]:
+            raise HTTPException(
+                status_code=404,
+                detail=f"Ratings not found for batting card {this_bc.id}",
+            )
+        card_data = get_batter_card_data(
+            this_player, this_bc, rating_vl, rating_vr, all_pos
+        )
+        if (
+            this_player.description in this_player.cardset.name
+            and this_player.cardset.id not in [23]
+        ):
+            card_data["cardset_name"] = this_player.cardset.name
+        else:
+            card_data["cardset_name"] = this_player.description
+        card_data["refractor_tier"] = refractor_tier
+        card_data["request"] = request
+        html_response = templates.TemplateResponse("player_card.html", card_data)
+    else:
+        this_pc = PitchingCard.get_or_none(
+            PitchingCard.player == this_player, PitchingCard.variant == variant
+        )
+        if this_pc is None:
+            raise HTTPException(
+                status_code=404,
+                detail=f"Pitching card not found for id {player_id}, variant {variant}",
+            )
+        rating_vl = PitchingCardRatings.get_or_none(
+            PitchingCardRatings.pitchingcard == this_pc,
+            PitchingCardRatings.vs_hand == "L",
+        )
+        rating_vr = PitchingCardRatings.get_or_none(
+            PitchingCardRatings.pitchingcard == this_pc,
+            PitchingCardRatings.vs_hand == "R",
+        )
+        if None in [rating_vr, rating_vl]:
+            raise HTTPException(
+                status_code=404,
+                detail=f"Ratings not found for pitching card {this_pc.id}",
+            )
+        card_data = get_pitcher_card_data(
+            this_player, this_pc, rating_vl, rating_vr, all_pos
+        )
+        if (
+            this_player.description in this_player.cardset.name
+            and this_player.cardset.id not in [23]
+        ):
+            card_data["cardset_name"] = this_player.cardset.name
+        else:
+            card_data["cardset_name"] = this_player.description
+        card_data["refractor_tier"] = refractor_tier
+        card_data["request"] = request
+        html_response = templates.TemplateResponse("player_card.html", card_data)
+
+    browser = await get_browser()
+    page = await browser.new_page(viewport={"width": 1280, "height": 720})
+    try:
+        await generate_animated_card(
+            page,
+            html_response.body.decode("UTF-8"),
+            cache_path,
+            refractor_tier,
+        )
+    finally:
+        await page.close()
+
+    if tier is None:
+        background_tasks.add_task(
+            upload_variant_apng,
+            player_id=player_id,
+            variant=variant,
+            card_type=card_type,
+            cardset_id=this_player.cardset.id,
+            apng_path=cache_path,
+        )
+
+    return FileResponse(path=cache_path, media_type="image/apng", headers=headers)
 @router.get("/{player_id}/{card_type}card")
 @router.get("/{player_id}/{card_type}card/{d}")
 @router.get("/{player_id}/{card_type}card/{d}/{variant}")
 async def get_batter_card(
     request: Request,
+    background_tasks: BackgroundTasks,
     player_id: int,
     card_type: Literal["batting", "pitching"],
     variant: int = 0,

@@ -906,6 +1054,27 @@ async def get_batter_card(
     #     save_as=f'{player_id}-{d}-v{variant}.png'
     # )

+    # Schedule S3 upload for variant cards that don't have an image_url yet.
+    # Skip when tier is overridden (?tier= dev preview) — those renders don't
+    # correspond to real variant card rows.
+    if variant > 0 and tier is None:
+        CardModel = BattingCard if card_type == "batting" else PitchingCard
+        try:
+            card_row = CardModel.get(
+                (CardModel.player_id == player_id) & (CardModel.variant == variant)
+            )
+            if card_row.image_url is None:
+                background_tasks.add_task(
+                    backfill_variant_image_url,
+                    player_id=player_id,
+                    variant=variant,
+                    card_type=card_type,
+                    cardset_id=this_player.cardset.id,
+                    png_path=file_path,
+                )
+        except CardModel.DoesNotExist:
+            pass
     return FileResponse(path=file_path, media_type="image/png", headers=headers)


@@ -1,10 +1,11 @@
 import os
+from datetime import date
 from fastapi import APIRouter, Depends, HTTPException, Query
 import logging
 from typing import Optional
-from ..db_engine import model_to_dict
+from ..db_engine import model_to_dict, BattingCard, PitchingCard
 from ..dependencies import oauth2_scheme, valid_token
 from ..services.refractor_init import initialize_card_refractor, _determine_card_type

@@ -23,8 +24,12 @@ _NEXT_THRESHOLD_ATTR = {
     4: None,
 }

+# Sentinel used by _build_card_state_response to distinguish "caller did not
+# pass image_url" (do the DB lookup) from "caller passed None" (use None).
+_UNSET = object()

-def _build_card_state_response(state, player_name=None) -> dict:
+def _build_card_state_response(state, player_name=None, image_url=_UNSET) -> dict:
     """Serialise a RefractorCardState into the standard API response shape.

     Produces a flat dict with player_id and team_id as plain integers,

@@ -67,6 +72,29 @@ def _build_card_state_response(state, player_name=None) -> dict:
     if player_name is not None:
         result["player_name"] = player_name

+    # Resolve image_url from the variant card row.
+    # When image_url is pre-fetched by the caller (batch list path), it is
+    # passed directly and the per-row DB query is skipped entirely.
+    if image_url is _UNSET:
+        image_url = None
+        if state.variant and state.variant > 0:
+            card_type = (
+                state.track.card_type
+                if hasattr(state, "track") and state.track
+                else None
+            )
+            if card_type:
+                CardModel = BattingCard if card_type == "batter" else PitchingCard
+                try:
+                    variant_card = CardModel.get(
+                        (CardModel.player_id == state.player_id)
+                        & (CardModel.variant == state.variant)
+                    )
+                    image_url = variant_card.image_url
+                except CardModel.DoesNotExist:
+                    pass
+    result["image_url"] = image_url

     return result
@@ -211,15 +239,44 @@ async def list_card_states(
     if evaluated_only:
         query = query.where(RefractorCardState.last_evaluated_at.is_null(False))

-    total = query.count()
+    total = query.count() or 0
+    states_page = list(query.offset(offset).limit(limit))
+
+    # Pre-fetch image_urls in at most 2 bulk queries (one per card table) so
+    # that _build_card_state_response never issues a per-row CardModel.get().
+    batter_pids: set[int] = set()
+    pitcher_pids: set[int] = set()
+    for state in states_page:
+        if state.variant and state.variant > 0:
+            card_type = state.track.card_type if state.track else None
+            if card_type == "batter":
+                batter_pids.add(state.player_id)
+            elif card_type in ("sp", "rp"):
+                pitcher_pids.add(state.player_id)
+
+    image_url_map: dict[tuple[int, int], str | None] = {}
+    if batter_pids:
+        for card in BattingCard.select().where(BattingCard.player_id.in_(batter_pids)):
+            image_url_map[(card.player_id, card.variant)] = card.image_url
+    if pitcher_pids:
+        for card in PitchingCard.select().where(
+            PitchingCard.player_id.in_(pitcher_pids)
+        ):
+            image_url_map[(card.player_id, card.variant)] = card.image_url
+
     items = []
-    for state in query.offset(offset).limit(limit):
+    for state in states_page:
         player_name = None
         try:
             player_name = state.player.p_name
         except Exception:
             pass
-        items.append(_build_card_state_response(state, player_name=player_name))
+        img_url = image_url_map.get((state.player_id, state.variant))
+        items.append(
+            _build_card_state_response(
+                state, player_name=player_name, image_url=img_url
+            )
+        )
     return {"count": total, "items": items}
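For review context, the bulk-prefetch map in the hunk above reduces N per-row `CardModel.get()` calls to at most two queries, one per card table. The lookup semantics can be sketched standalone (plain dicts stand in for Peewee rows; the example values are illustrative, not from the real data):

```python
# Sketch of the (player_id, variant)-keyed image_url prefetch. Plain dicts
# stand in for Peewee card rows; the real code selects BattingCard and
# PitchingCard once each instead of issuing one get() per state row.
cards = [
    {"player_id": 42, "variant": 1, "image_url": "https://cdn.example/v1.png"},
    {"player_id": 42, "variant": 3, "image_url": "https://cdn.example/v3.png"},
]

image_url_map = {(c["player_id"], c["variant"]): c["image_url"] for c in cards}

# dict.get returns None on a miss, so states with no matching variant card
# simply produce image_url=None in the response instead of raising.
assert image_url_map.get((42, 3)) == "https://cdn.example/v3.png"
assert image_url_map.get((99, 1)) is None
```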
@@ -420,8 +477,14 @@ async def evaluate_game(game_id: int, token: str = Depends(oauth2_scheme)):
             # Non-breaking addition: include boost info when available.
             if boost_result:
-                tier_up_entry["variant_created"] = boost_result.get(
-                    "variant_created"
-                )
+                variant_num = boost_result.get("variant_created")
+                tier_up_entry["variant_created"] = variant_num
+                if computed_tier >= 3 and variant_num and card_type:
+                    d = date.today().strftime("%Y-%m-%d")
+                    api_base = os.environ.get("API_BASE_URL", "").rstrip("/")
+                    tier_up_entry["animated_url"] = (
+                        f"{api_base}/api/v2/players/{player_id}/{card_type}card"
+                        f"/{d}/{variant_num}/animated"
+                    )
             tier_ups.append(tier_up_entry)
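The URL assembly in this hunk can be exercised in isolation. The sketch below uses fixed example values (the base URL, date, and IDs are illustrative, not taken from a real deployment):

```python
import os
from datetime import date

# Hypothetical inputs; in the endpoint these come from the environment and
# the tier-up being processed.
os.environ["API_BASE_URL"] = "https://api.example.com/"
player_id, card_type, variant_num = 42, "batting", 3

# Same assembly as the hunk: rstrip("/") tolerates a trailing slash in the
# env var, and today's date acts as the cache-bust path segment.
d = date(2026, 4, 8).strftime("%Y-%m-%d")
api_base = os.environ.get("API_BASE_URL", "").rstrip("/")
animated_url = (
    f"{api_base}/api/v2/players/{player_id}/{card_type}card"
    f"/{d}/{variant_num}/animated"
)
assert animated_url == (
    "https://api.example.com/api/v2/players/42/battingcard/2026-04-08/3/animated"
)
```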


@@ -0,0 +1,125 @@
"""
APNG animated card generation for T3 and T4 refractor tiers.
Captures animation frames by scrubbing CSS animations via Playwright: each
frame is rendered with a negative animation-delay that freezes the render at a
specific point in the animation cycle. The captured PNGs are then assembled
into a looping APNG using the apng library.
Cache / S3 path convention:
Local: storage/cards/cardset-{id}/{card_type}/{player_id}-{date}-v{variant}.apng
S3: cards/cardset-{id}/{card_type}/{player_id}-{date}-v{variant}.apng
"""
import os
import tempfile
from apng import APNG
from playwright.async_api import Page
# ---------------------------------------------------------------------------
# Animation specs per tier
# Each entry: list of (css_selector, animation_duration_seconds) pairs that
# need to be scrubbed, plus the frame count and per-frame display time.
# ---------------------------------------------------------------------------
_T3_SPEC = {
"selectors_and_durations": [("#header::after", 2.5)],
"num_frames": 12,
"frame_delay_ms": 200,
}
_T4_SPEC = {
"selectors_and_durations": [
("#header::after", 6.0),
(".tier-diamond.diamond-glow", 2.0),
],
"num_frames": 24,
"frame_delay_ms": 250,
}
ANIM_SPECS = {3: _T3_SPEC, 4: _T4_SPEC}
def apng_cache_path(
cardset_id: int, card_type: str, player_id: int, d: str, variant: int
) -> str:
"""Return the local filesystem cache path for an animated card APNG."""
return f"storage/cards/cardset-{cardset_id}/{card_type}/{player_id}-{d}-v{variant}.apng"
async def generate_animated_card(
page: Page,
html_content: str,
output_path: str,
tier: int,
) -> None:
"""Generate an animated APNG for a T3 or T4 refractor card.
Scrubs each CSS animation by injecting an override <style> tag that sets
animation-play-state: running and a negative animation-delay, freezing the
render at evenly-spaced intervals across one animation cycle. The captured
frames are assembled into a looping APNG at output_path.
Args:
page: An open Playwright page (caller is responsible for lifecycle).
html_content: Rendered card HTML string (from TemplateResponse.body).
output_path: Destination path for the .apng file.
tier: Refractor tier; must be 3 or 4.
Raises:
ValueError: If tier is not 3 or 4.
"""
spec = ANIM_SPECS.get(tier)
if spec is None:
raise ValueError(
f"No animation spec for tier {tier}; animated cards are T3 and T4 only"
)
num_frames = spec["num_frames"]
frame_delay_ms = spec["frame_delay_ms"]
selectors_and_durations = spec["selectors_and_durations"]
frame_paths: list[str] = []
try:
for i in range(num_frames):
progress = i / num_frames # 0.0 .. (N-1)/N, seamless loop
await page.set_content(html_content)
# Inject override CSS: unpauses animation and seeks to frame offset
css_parts = []
for selector, duration in selectors_and_durations:
delay_s = -progress * duration
css_parts.append(
f"{selector} {{"
f" animation-play-state: running !important;"
f" animation-delay: {delay_s:.4f}s !important;"
f" }}"
)
await page.add_style_tag(content="\n".join(css_parts))
tmp = tempfile.NamedTemporaryFile(suffix=".png", delete=False)
tmp.close()
await page.screenshot(
path=tmp.name,
type="png",
clip={"x": 0.0, "y": 0, "width": 1200, "height": 600},
)
frame_paths.append(tmp.name)
dir_path = os.path.dirname(output_path)
if dir_path:
os.makedirs(dir_path, exist_ok=True)
apng_obj = APNG()
for frame_path in frame_paths:
# delay/delay_den is the frame display time in seconds as a fraction
apng_obj.append_file(frame_path, delay=frame_delay_ms, delay_den=1000)
apng_obj.save(output_path)
finally:
for path in frame_paths:
try:
os.unlink(path)
except OSError:
pass
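A quick sketch of the scrub arithmetic in `generate_animated_card`, using the T3 numbers from `_T3_SPEC` above: frame `i` of `N` maps to a negative `animation-delay` of `-(i/N) * duration`, so the N frames sample one full animation cycle without duplicating the first frame at the end, which is what makes the APNG loop seamlessly.

```python
# Scrub timing for the T3 spec: 12 frames across a 2.5 s CSS animation.
num_frames = 12      # _T3_SPEC["num_frames"]
duration = 2.5       # seconds, the #header::after animation cycle
frame_delay_ms = 200  # _T3_SPEC["frame_delay_ms"]

# Negative delays seek the animation backwards to each sample point.
delays = [-(i / num_frames) * duration for i in range(num_frames)]

assert delays[0] == 0.0                     # first frame: cycle start
assert round(delays[-1], 4) == -2.2917      # last frame: (11/12) * 2.5 into the cycle
assert num_frames * frame_delay_ms == 2400  # total APNG playback time in ms
```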


@@ -0,0 +1,310 @@
"""
card_storage.py: S3 upload utility for variant card images.
Public API
----------
get_s3_client()
Create and return a boto3 S3 client using ambient AWS credentials
(environment variables or instance profile).
build_s3_key(cardset_id, player_id, variant, card_type)
Construct the S3 object key for a variant card PNG image.
build_apng_s3_key(cardset_id, player_id, variant, card_type)
Construct the S3 object key for a variant animated card APNG.
build_s3_url(s3_key, render_date)
Return the full HTTPS S3 URL with a cache-busting date query param.
upload_card_to_s3(s3_client, png_bytes, s3_key)
Upload raw PNG bytes to S3 with correct ContentType and CacheControl headers.
upload_apng_to_s3(s3_client, apng_bytes, s3_key)
Upload raw APNG bytes to S3 with correct ContentType and CacheControl headers.
backfill_variant_image_url(player_id, variant, card_type, cardset_id, png_path)
End-to-end: read PNG from disk, upload to S3, update BattingCard or
PitchingCard.image_url in the database. All exceptions are caught and
logged; this function never raises (safe to call as a background task).
upload_variant_apng(player_id, variant, card_type, cardset_id, apng_path)
End-to-end: read APNG from disk and upload to S3. No DB update (no
animated_url column exists yet). All exceptions are caught and logged;
this function never raises (safe to call as a background task).
Design notes
------------
- S3 credentials are resolved from the environment by boto3 at call time;
no credentials are hard-coded here.
- The cache-bust ?d= param matches the card-creation pipeline convention so
that clients can compare URLs across pipelines.
"""
import logging
import os
from datetime import date
import boto3
from app.db_engine import BattingCard, PitchingCard
logger = logging.getLogger(__name__)
S3_BUCKET = os.environ.get("S3_BUCKET", "paper-dynasty")
S3_REGION = os.environ.get("S3_REGION", "us-east-1")
def get_s3_client():
"""Create and return a boto3 S3 client for the configured region.
Credentials are resolved by boto3 from the standard chain:
environment variables, then ~/.aws/credentials, then the instance profile.
Returns:
A boto3 S3 client instance.
"""
return boto3.client("s3", region_name=S3_REGION)
def build_s3_key(cardset_id: int, player_id: int, variant: int, card_type: str) -> str:
"""Construct the S3 object key for a variant card image.
Key format:
cards/cardset-{csid:03d}/player-{pid}/v{variant}/{card_type}card.png
Args:
cardset_id: Numeric cardset ID (zero-padded to 3 digits).
player_id: Player ID.
variant: Variant number (0 = base, 1-4 = refractor tiers).
card_type: Either "batting" or "pitching".
Returns:
The S3 object key string.
"""
return (
f"cards/cardset-{cardset_id:03d}/player-{player_id}"
f"/v{variant}/{card_type}card.png"
)
def build_s3_url(s3_key: str, render_date: date) -> str:
"""Return the full HTTPS S3 URL for a card image with a cache-bust param.
URL format:
https://{bucket}.s3.{region}.amazonaws.com/{key}?d={date}
The ?d= query param matches the card-creation pipeline convention so that
clients invalidate their cache after each re-render.
Args:
s3_key: S3 object key (from build_s3_key).
render_date: The date the card was rendered, used for cache-busting.
Returns:
Full HTTPS URL string.
"""
base_url = f"https://{S3_BUCKET}.s3.{S3_REGION}.amazonaws.com"
date_str = render_date.strftime("%Y-%m-%d")
return f"{base_url}/{s3_key}?d={date_str}"
def build_apng_s3_key(
cardset_id: int, player_id: int, variant: int, card_type: str
) -> str:
"""Construct the S3 object key for a variant animated card APNG.
Key format:
cards/cardset-{csid:03d}/player-{pid}/v{variant}/{card_type}card.apng
Args:
cardset_id: Numeric cardset ID (zero-padded to 3 digits).
player_id: Player ID.
variant: Variant number (1-4 = refractor tiers).
card_type: Either "batting" or "pitching".
Returns:
The S3 object key string.
"""
return (
f"cards/cardset-{cardset_id:03d}/player-{player_id}"
f"/v{variant}/{card_type}card.apng"
)
def upload_card_to_s3(s3_client, png_bytes: bytes, s3_key: str) -> None:
"""Upload raw PNG bytes to S3 with the standard card image headers.
Sets ContentType=image/png and CacheControl=public, max-age=300 (5 min)
so that CDN and browser caches are refreshed within a short window after
a re-render.
Args:
s3_client: A boto3 S3 client (from get_s3_client).
png_bytes: Raw PNG image bytes.
s3_key: S3 object key (from build_s3_key).
Returns:
None
"""
s3_client.put_object(
Bucket=S3_BUCKET,
Key=s3_key,
Body=png_bytes,
ContentType="image/png",
CacheControl="public, max-age=300",
)
def backfill_variant_image_url(
player_id: int,
variant: int,
card_type: str,
cardset_id: int,
png_path: str,
) -> None:
"""Read a rendered PNG from disk, upload it to S3, and update the DB row.
Determines the correct card model (BattingCard or PitchingCard) from
card_type, then:
1. Reads PNG bytes from png_path.
2. Uploads to S3 via upload_card_to_s3.
3. Fetches the card row by (player_id, variant).
4. Sets image_url to the new S3 URL and calls save().
All exceptions are caught and logged; this function is intended to be
called as a background task and must never propagate exceptions.
Args:
player_id: Player ID used to locate the card row.
variant: Variant number (matches the card row's variant field).
card_type: "batting" or "pitching"; selects the model.
cardset_id: Cardset ID used for the S3 key.
png_path: Absolute path to the rendered PNG file on disk.
Returns:
None
"""
try:
# 1. Read PNG from disk
with open(png_path, "rb") as f:
png_bytes = f.read()
# 2. Build key and upload
s3_key = build_s3_key(
cardset_id=cardset_id,
player_id=player_id,
variant=variant,
card_type=card_type,
)
s3_client = get_s3_client()
upload_card_to_s3(s3_client, png_bytes, s3_key)
# 3. Build URL with today's date for cache-busting
image_url = build_s3_url(s3_key, render_date=date.today())
# 4. Locate the card row and update image_url
if card_type == "batting":
card = BattingCard.get(
BattingCard.player_id == player_id, BattingCard.variant == variant
)
else:
card = PitchingCard.get(
PitchingCard.player_id == player_id, PitchingCard.variant == variant
)
card.image_url = image_url
card.save()
logger.info(
"backfill_variant_image_url: updated %s card player=%s variant=%s url=%s",
card_type,
player_id,
variant,
image_url,
)
except Exception:
logger.exception(
"backfill_variant_image_url: failed for player=%s variant=%s card_type=%s",
player_id,
variant,
card_type,
)
def upload_apng_to_s3(s3_client, apng_bytes: bytes, s3_key: str) -> None:
"""Upload raw APNG bytes to S3 with the standard animated card headers.
Sets ContentType=image/apng and CacheControl=public, max-age=86400 (1 day),
matching the animated endpoint's own Cache-Control header.
Args:
s3_client: A boto3 S3 client (from get_s3_client).
apng_bytes: Raw APNG image bytes.
s3_key: S3 object key (from build_apng_s3_key).
Returns:
None
"""
s3_client.put_object(
Bucket=S3_BUCKET,
Key=s3_key,
Body=apng_bytes,
ContentType="image/apng",
CacheControl="public, max-age=86400",
)
def upload_variant_apng(
player_id: int,
variant: int,
card_type: str,
cardset_id: int,
apng_path: str,
) -> None:
"""Read a rendered APNG from disk and upload it to S3.
Intended to be called as a background task after a new animated card is
rendered. No DB update is performed (no animated_url column exists yet).
All exceptions are caught and logged; this function is intended to be
called as a background task and must never propagate exceptions.
Args:
player_id: Player ID used for the S3 key.
variant: Variant number (matches the refractor tier variant).
card_type: "batting" or "pitching"; selects the S3 key.
cardset_id: Cardset ID used for the S3 key.
apng_path: Absolute path to the rendered APNG file on disk.
Returns:
None
"""
try:
with open(apng_path, "rb") as f:
apng_bytes = f.read()
s3_key = build_apng_s3_key(
cardset_id=cardset_id,
player_id=player_id,
variant=variant,
card_type=card_type,
)
s3_client = get_s3_client()
upload_apng_to_s3(s3_client, apng_bytes, s3_key)
logger.info(
"upload_variant_apng: uploaded %s animated card player=%s variant=%s key=%s",
card_type,
player_id,
variant,
s3_key,
)
except Exception:
logger.exception(
"upload_variant_apng: failed for player=%s variant=%s card_type=%s",
player_id,
variant,
card_type,
)
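The key and URL conventions documented in this file can be eyeballed without boto3 or the app models. The sketch below re-derives the format standalone (bucket/region are the module defaults; the IDs and date are illustrative):

```python
from datetime import date

# Standalone re-derivation of the card_storage.py key/URL convention.
S3_BUCKET = "paper-dynasty"  # default from the module's env fallback
S3_REGION = "us-east-1"

def build_s3_key(cardset_id: int, player_id: int, variant: int, card_type: str) -> str:
    # cardset ID zero-padded to 3 digits; variant gets its own path segment
    return (
        f"cards/cardset-{cardset_id:03d}/player-{player_id}"
        f"/v{variant}/{card_type}card.png"
    )

key = build_s3_key(cardset_id=27, player_id=42, variant=1, card_type="batting")
url = (
    f"https://{S3_BUCKET}.s3.{S3_REGION}.amazonaws.com/{key}"
    f"?d={date(2026, 4, 6).strftime('%Y-%m-%d')}"
)

assert key == "cards/cardset-027/player-42/v1/battingcard.png"
assert url.endswith("battingcard.png?d=2026-04-06")
```

This matches the assertions in tests/test_card_storage.py further down this diff, which is the contract the module must not break.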


@@ -148,10 +148,11 @@ def evaluate_card(
             strikeouts=sum(r.strikeouts for r in rows),
         )
     else:
-        from app.db_engine import (
+        from app.db_engine import (  # noqa: PLC0415
             BattingSeasonStats,
             PitchingSeasonStats,
-        )  # noqa: PLC0415
+            REFRACTOR_START_SEASON,
+        )

         card_type = card_state.track.card_type
         if card_type == "batter":
@@ -159,6 +160,7 @@ def evaluate_card(
                 BattingSeasonStats.select().where(
                     (BattingSeasonStats.player == player_id)
                     & (BattingSeasonStats.team == team_id)
+                    & (BattingSeasonStats.season >= REFRACTOR_START_SEASON)
                 )
             )
             totals = _CareerTotals(
@@ -175,6 +177,7 @@ def evaluate_card(
                 PitchingSeasonStats.select().where(
                     (PitchingSeasonStats.player == player_id)
                     & (PitchingSeasonStats.team == team_id)
+                    & (PitchingSeasonStats.season >= REFRACTOR_START_SEASON)
                 )
             )
             totals = _CareerTotals(


@@ -0,0 +1,7 @@
-- Drop orphaned RefractorTierBoost and RefractorCosmetic tables.
-- These were speculative schema from the initial Refractor design that were
-- never used — boosts are hardcoded in refractor_boost.py and tier visuals
-- are embedded in CSS templates. Both tables have zero rows on dev and prod.
DROP TABLE IF EXISTS refractor_tier_boost;
DROP TABLE IF EXISTS refractor_cosmetic;


@@ -12,3 +12,5 @@ requests==2.32.3
 html2image==2.0.6
 jinja2==3.1.4
 playwright==1.45.1
+apng==0.3.4
+boto3==1.42.65

scripts/README.md (new file, 55 lines)

@@ -0,0 +1,55 @@
# Scripts
Operational scripts for the Paper Dynasty Database API.
## deploy.sh
Deploy the API by tagging a commit and triggering CI/CD.
```bash
./scripts/deploy.sh dev # Tag HEAD as 'dev', CI builds :dev image
./scripts/deploy.sh prod # Create CalVer tag + 'latest' + 'production'
./scripts/deploy.sh dev abc1234 # Tag a specific commit
./scripts/deploy.sh dev --sync-templates # Deploy + push changed templates to server
```
**Template drift check** runs automatically on every deploy. Compares local `storage/templates/*.html` against the target server via md5sum and warns if any files differ. Templates are volume-mounted (not baked into the Docker image), so code deploys alone won't update them.
**Cached image report** also runs automatically, showing PNG and APNG counts on the target server.
| Environment | SSH Host | Template Path |
|---|---|---|
| dev | `pd-database` | `/home/cal/container-data/dev-pd-database/storage/templates` |
| prod | `akamai` | `/root/container-data/paper-dynasty/storage/templates` |
## clear-card-cache.sh
Inspect or clear cached rendered card images inside the API container.
```bash
./scripts/clear-card-cache.sh dev # Report cache size (dry run)
./scripts/clear-card-cache.sh dev --apng-only # Delete animated card cache only
./scripts/clear-card-cache.sh dev --all # Delete all cached card images
```
Cached images regenerate on demand when next requested. APNG files (T3/T4 animated cards) are the most likely to go stale after template CSS changes. Both destructive modes prompt for confirmation before deleting.
| Environment | SSH Host | Container | Cache Path |
|---|---|---|---|
| dev | `pd-database` | `dev_pd_database` | `/app/storage/cards/` |
| prod | `akamai` | `pd_api` | `/app/storage/cards/` |
## Migration Scripts
| Script | Purpose |
|---|---|
| `migrate_to_postgres.py` | One-time SQLite to PostgreSQL migration |
| `migrate_missing_data.py` | Backfill missing data after migration |
| `db_migrations.py` (in repo root) | Schema migrations |
## Utility Scripts
| Script | Purpose |
|---|---|
| `wipe_gauntlet_team.py` | Reset a gauntlet team's state |
| `audit_sqlite.py` | Audit legacy SQLite database |

scripts/clear-card-cache.sh (new executable file, 89 lines)

@@ -0,0 +1,89 @@
#!/bin/bash
# Clear cached card images from the API container
# Usage: ./scripts/clear-card-cache.sh <dev|prod> [--apng-only|--all]
#
# With no flags: reports cache size only (dry run)
# --apng-only: delete only .apng files (animated cards)
# --all: delete all cached card images (.png + .apng)
set -euo pipefail
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
NC='\033[0m'
declare -A DEPLOY_HOST=([dev]="pd-database" [prod]="akamai")
declare -A CONTAINER=([dev]="dev_pd_database" [prod]="pd_api")
usage() {
echo "Usage: $0 <dev|prod> [--apng-only|--all]"
echo ""
echo " No flag Report cache size (dry run)"
echo " --apng-only Delete only .apng files (animated cards)"
echo " --all Delete all cached card images"
exit 1
}
[[ $# -lt 1 ]] && usage
ENV="$1"
ACTION="${2:-report}"
if [[ "$ENV" != "dev" && "$ENV" != "prod" ]]; then
usage
fi
HOST="${DEPLOY_HOST[$ENV]}"
CTR="${CONTAINER[$ENV]}"
CACHE_PATH="/app/storage/cards"
report() {
echo -e "${CYAN}Card image cache on ${HOST} (${CTR}):${NC}"
ssh "$HOST" "
png_count=\$(docker exec $CTR find $CACHE_PATH -name '*.png' 2>/dev/null | wc -l)
apng_count=\$(docker exec $CTR find $CACHE_PATH -name '*.apng' 2>/dev/null | wc -l)
echo \" PNG: \${png_count} files\"
echo \" APNG: \${apng_count} files\"
echo \" Total: \$((\${png_count} + \${apng_count})) files\"
" 2>/dev/null || {
echo -e "${RED}Could not reach ${HOST}.${NC}"
exit 1
}
}
report
case "$ACTION" in
report)
echo -e "${GREEN}Dry run — no files deleted. Pass --apng-only or --all to clear.${NC}"
;;
--apng-only)
echo -e "${YELLOW}Deleting all .apng files from ${CTR}...${NC}"
read -rp "Proceed? [y/N] " confirm
[[ "$confirm" =~ ^[Yy]$ ]] || {
echo "Aborted."
exit 0
}
deleted=$(ssh "$HOST" "docker exec $CTR find $CACHE_PATH -name '*.apng' -delete -print 2>/dev/null | wc -l")
echo -e "${GREEN}Deleted ${deleted} .apng files.${NC}"
;;
--all)
echo -e "${RED}Deleting ALL cached card images from ${CTR}...${NC}"
read -rp "This will clear PNG and APNG caches. Proceed? [y/N] " confirm
[[ "$confirm" =~ ^[Yy]$ ]] || {
echo "Aborted."
exit 0
}
deleted=$(ssh "$HOST" "docker exec $CTR find $CACHE_PATH -type f \( -name '*.png' -o -name '*.apng' \) -delete -print 2>/dev/null | wc -l")
echo -e "${GREEN}Deleted ${deleted} cached card images.${NC}"
;;
*)
usage
;;
esac

scripts/deploy.sh (new executable file, 203 lines)

@@ -0,0 +1,203 @@
#!/bin/bash
# Deploy Paper Dynasty Database API
# Usage: ./scripts/deploy.sh <dev|prod> [--sync-templates] [commit]
#
# Dev: Force-updates the "dev" git tag → CI builds :dev Docker image
# Prod: Creates CalVer tag + force-updates "latest" and "production" git tags
# → CI builds :<calver>, :latest, :production Docker images
#
# Options:
# --sync-templates Upload changed templates to the target server via scp
#
# Templates are volume-mounted (not in the Docker image). The script always
# checks for template drift and warns if local/remote differ. Pass
# --sync-templates to actually push the changed files.
set -euo pipefail
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
NC='\033[0m'
REMOTE="origin"
SYNC_TEMPLATES=false
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
TEMPLATE_DIR="$SCRIPT_DIR/../storage/templates"
# Server config
declare -A DEPLOY_HOST=([dev]="pd-database" [prod]="akamai")
declare -A TEMPLATE_PATH=(
[dev]="/home/cal/container-data/dev-pd-database/storage/templates"
[prod]="/root/container-data/paper-dynasty/storage/templates"
)
usage() {
echo "Usage: $0 <dev|prod> [--sync-templates] [commit]"
echo ""
echo " dev [commit] Force-update 'dev' tag on HEAD or specified commit"
echo " prod [commit] Create CalVer + 'latest' + 'production' tags on HEAD or specified commit"
echo ""
echo "Options:"
echo " --sync-templates Upload changed templates to the target server"
exit 1
}
[[ $# -lt 1 ]] && usage
ENV="$1"
shift
# Parse optional flags
COMMIT="HEAD"
while [[ $# -gt 0 ]]; do
case "$1" in
--sync-templates)
SYNC_TEMPLATES=true
shift
;;
--*)
echo -e "${RED}Unknown option: $1${NC}"
usage
;;
*)
COMMIT="$1"
shift
;;
esac
done
SHA=$(git rev-parse "$COMMIT" 2>/dev/null) || {
echo -e "${RED}Invalid commit: $COMMIT${NC}"
exit 1
}
SHA_SHORT="${SHA:0:7}"
git fetch --tags "$REMOTE"
if ! git branch -a --contains "$SHA" 2>/dev/null | grep -qE '(^|\s)(main|remotes/origin/main)$'; then
echo -e "${RED}Commit $SHA_SHORT is not on main. Aborting.${NC}"
exit 1
fi
# --- Template drift check ---
check_templates() {
local host="${DEPLOY_HOST[$ENV]}"
local remote_path="${TEMPLATE_PATH[$ENV]}"
echo -e "${CYAN}Checking templates against ${host}:${remote_path}...${NC}"
local local_hashes remote_hashes
local_hashes=$(cd "$TEMPLATE_DIR" && md5sum *.html 2>/dev/null | sort -k2)
remote_hashes=$(ssh "$host" "cd '$remote_path' && md5sum *.html 2>/dev/null | sort -k2" 2>/dev/null) || {
echo -e "${YELLOW} Could not reach ${host} — skipping template check.${NC}"
return 0
}
local changed=()
local missing_remote=()
while IFS= read -r line; do
local hash file remote_hash
hash=$(echo "$line" | awk '{print $1}')
file=$(echo "$line" | awk '{print $2}')
remote_hash=$(echo "$remote_hashes" | awk -v f="$file" '$2 == f {print $1}')
if [[ -z "$remote_hash" ]]; then
missing_remote+=("$file")
elif [[ "$hash" != "$remote_hash" ]]; then
changed+=("$file")
fi
done <<<"$local_hashes"
if [[ ${#changed[@]} -eq 0 && ${#missing_remote[@]} -eq 0 ]]; then
echo -e "${GREEN} Templates in sync.${NC}"
return 0
fi
echo -e "${YELLOW} Template drift detected:${NC}"
for f in "${changed[@]+"${changed[@]}"}"; do
[[ -n "$f" ]] && echo -e " ${YELLOW}CHANGED${NC} $f"
done
for f in "${missing_remote[@]+"${missing_remote[@]}"}"; do
[[ -n "$f" ]] && echo -e " ${YELLOW}MISSING${NC} $f (not on server)"
done
if [[ "$SYNC_TEMPLATES" == true ]]; then
echo -e "${CYAN} Syncing templates...${NC}"
for f in "${changed[@]+"${changed[@]}"}" "${missing_remote[@]+"${missing_remote[@]}"}"; do
[[ -n "$f" ]] && scp "$TEMPLATE_DIR/$f" "${host}:${remote_path}/$f"
done
echo -e "${GREEN} Templates synced to ${host}.${NC}"
else
echo -e "${YELLOW} Run with --sync-templates to push changes.${NC}"
fi
}
check_templates
# --- Cached image report ---
declare -A API_CONTAINER=([dev]="dev_pd_database" [prod]="pd_api")
report_cache() {
local host="${DEPLOY_HOST[$ENV]}"
local container="${API_CONTAINER[$ENV]}"
echo -e "${CYAN}Cached card images on ${host} (${container}):${NC}"
ssh "$host" "
png_count=\$(docker exec $container find /app/storage/cards -name '*.png' 2>/dev/null | wc -l)
apng_count=\$(docker exec $container find /app/storage/cards -name '*.apng' 2>/dev/null | wc -l)
echo \" PNG: \${png_count} files\"
echo \" APNG: \${apng_count} files\"
echo \" Total: \$((\${png_count} + \${apng_count})) files\"
" 2>/dev/null || echo -e "${YELLOW} Could not reach ${host} — skipping cache report.${NC}"
}
report_cache
case "$ENV" in
dev)
echo -e "${YELLOW}Deploying to dev...${NC}"
echo -e " Commit: ${SHA_SHORT}"
git tag -f dev "$SHA"
git push "$REMOTE" dev --force
echo -e "${GREEN}Tagged ${SHA_SHORT} as 'dev' and pushed. CI will build :dev image.${NC}"
;;
prod)
echo -e "${YELLOW}Deploying to prod...${NC}"
YEAR=$(date -u +%Y)
MONTH=$(date -u +%-m)
PREFIX="${YEAR}.${MONTH}."
LAST_BUILD=$(git tag -l "${PREFIX}*" | sed "s/^${PREFIX}//" | sort -n | tail -1)
BUILD=$((${LAST_BUILD:-0} + 1))
CALVER="${PREFIX}${BUILD}"
echo -e " Commit: ${SHA_SHORT}"
echo -e " Version: ${CALVER}"
echo -e " Tags: ${CALVER}, latest, production"
read -rp "Proceed? [y/N] " confirm
[[ "$confirm" =~ ^[Yy]$ ]] || {
echo "Aborted."
exit 0
}
git tag "$CALVER" "$SHA"
git tag -f latest "$SHA"
git tag -f production "$SHA"
git push "$REMOTE" "$CALVER"
git push "$REMOTE" latest --force
git push "$REMOTE" production --force
echo -e "${GREEN}Tagged ${SHA_SHORT} as '${CALVER}', 'latest', 'production' and pushed.${NC}"
echo -e "${GREEN}CI will build :${CALVER}, :latest, :production images.${NC}"
;;
*)
usage
;;
esac
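The CalVer bump in the prod branch above can be sketched with a hypothetical tag list in place of `git tag -l "${PREFIX}*"`. The point of `sort -n` is that a plain lexical sort would rank 2026.4.9 above 2026.4.10; numeric sort of the stripped build numbers gets it right:

```shell
# Hypothetical tag list standing in for `git tag -l "${PREFIX}*"`.
PREFIX="2026.4."
tags="2026.4.1
2026.4.9
2026.4.10"

# Strip the prefix, numeric-sort the build numbers, take the highest.
LAST_BUILD=$(printf '%s\n' "$tags" | sed "s/^${PREFIX}//" | sort -n | tail -1)
BUILD=$((${LAST_BUILD:-0} + 1))
echo "${PREFIX}${BUILD}"
```

With no existing tags for the month, `LAST_BUILD` is empty and `${LAST_BUILD:-0}` starts the sequence at build 1.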


@@ -15,6 +15,7 @@
 } -%}
 {%- set dc = diamond_colors[refractor_tier] -%}
 {%- set filled_bg = 'linear-gradient(135deg, ' ~ dc.highlight ~ ' 0%, ' ~ dc.color ~ ' 50%, ' ~ dc.color ~ ' 100%)' -%}
+<div class="tier-diamond-backing"></div>
 <div class="tier-diamond{% if refractor_tier == 4 %} diamond-glow{% endif %}">
 <div class="diamond-quad{% if refractor_tier >= 2 %} filled{% endif %}" {% if refractor_tier >= 2 %}style="background: {{ filled_bg }};"{% endif %}></div>
 <div class="diamond-quad{% if refractor_tier >= 1 %} filled{% endif %}" {% if refractor_tier >= 1 %}style="background: {{ filled_bg }};"{% endif %}></div>


@@ -6,17 +6,30 @@
 </style>
 {% if refractor_tier is defined and refractor_tier > 0 %}
 <style>
+.tier-diamond-backing,
 .tier-diamond {
     position: absolute;
     left: 597px;
     top: 78.5px;
     transform: translate(-50%, -50%) rotate(45deg);
+    border-radius: 2px;
+    pointer-events: none;
+}
+.tier-diamond-backing {
+    width: 44px;
+    height: 44px;
+    background: rgba(200,210,220,0.9);
+    z-index: 19;
+}
+.tier-diamond {
     display: grid;
     grid-template: 1fr 1fr / 1fr 1fr;
     gap: 2px;
     z-index: 20;
     pointer-events: none;
-    background: rgba(0,0,0,0.75);
+    background: transparent;
     border-radius: 2px;
     box-shadow: 0 0 0 1.5px rgba(0,0,0,0.7), 0 2px 5px rgba(0,0,0,0.5);
 }
@@ -24,7 +37,7 @@
 .diamond-quad {
     width: 19px;
     height: 19px;
-    background: rgba(0,0,0,0.3);
+    background: rgba(0,0,0,0.55);
 }
 .diamond-quad.filled {

@@ -48,8 +48,6 @@ from app.db_engine import (
     PitchingCard,
     RefractorTrack,
     RefractorCardState,
-    RefractorTierBoost,
-    RefractorCosmetic,
     RefractorBoostAudit,
     ScoutOpportunity,
     ScoutClaim,
@@ -81,8 +79,6 @@ _TEST_MODELS = [
     ScoutClaim,
     RefractorTrack,
     RefractorCardState,
-    RefractorTierBoost,
-    RefractorCosmetic,
     BattingCard,
     PitchingCard,
     RefractorBoostAudit,

tests/test_card_storage.py (new file, 350 lines)

@@ -0,0 +1,350 @@
"""
Unit tests for app/services/card_storage.py S3 upload utility.
This module covers:
- S3 key construction for variant cards (batting, pitching, zero-padded cardset)
- Full S3 URL construction with cache-busting date param
- put_object call validation (correct params, return value)
- End-to-end backfill: read PNG from disk, upload to S3, update DB row
Why we test S3 key construction separately:
The key format is a contract used by both the renderer and the URL builder.
Validating it in isolation catches regressions before they corrupt stored URLs.
Why we test URL construction separately:
The cache-bust param (?d=...) must be appended consistently so that clients
invalidate cached images after a re-render. Testing it independently prevents
the formatter from silently changing.
Why we test upload params:
ContentType and CacheControl must be set exactly so that S3 serves images
with the correct headers. A missing header is a silent misconfiguration.
Why we test backfill error swallowing:
The backfill function is called as a background task; it must never raise
exceptions that would abort a card render response. We verify that S3 failures
and missing files are both silently logged, not propagated.
Test isolation:
All tests use unittest.mock; no real S3 calls or DB connections are made.
The `backfill_variant_image_url` tests patch `get_s3_client` and the DB
model classes at the card_storage module level so lazy imports work correctly.
"""
import os
from datetime import date
from unittest.mock import MagicMock, patch
# Set env before importing module so db_engine doesn't try to connect
os.environ.setdefault("DATABASE_TYPE", "postgresql")
os.environ.setdefault("POSTGRES_PASSWORD", "test-dummy")
from app.services.card_storage import (
build_s3_key,
build_s3_url,
upload_card_to_s3,
backfill_variant_image_url,
S3_BUCKET,
S3_REGION,
)
# ---------------------------------------------------------------------------
# TestBuildS3Key
# ---------------------------------------------------------------------------
class TestBuildS3Key:
"""Tests for build_s3_key — S3 object key construction.
The key format must match the existing card-creation pipeline so that
the database API and card-creation tool write to the same S3 paths.
"""
def test_batting_card_key(self):
"""batting card type produces 'battingcard.png' in the key."""
key = build_s3_key(cardset_id=27, player_id=42, variant=1, card_type="batting")
assert key == "cards/cardset-027/player-42/v1/battingcard.png"
def test_pitching_card_key(self):
"""pitching card type produces 'pitchingcard.png' in the key."""
key = build_s3_key(cardset_id=27, player_id=99, variant=2, card_type="pitching")
assert key == "cards/cardset-027/player-99/v2/pitchingcard.png"
def test_cardset_zero_padded_to_three_digits(self):
"""Single-digit cardset IDs are zero-padded to three characters."""
key = build_s3_key(cardset_id=5, player_id=1, variant=0, card_type="batting")
assert "cardset-005" in key
def test_cardset_two_digit_zero_padded(self):
"""Two-digit cardset IDs are zero-padded correctly."""
key = build_s3_key(cardset_id=27, player_id=1, variant=0, card_type="batting")
assert "cardset-027" in key
def test_cardset_three_digit_no_padding(self):
"""Three-digit cardset IDs are not altered."""
key = build_s3_key(cardset_id=100, player_id=1, variant=0, card_type="batting")
assert "cardset-100" in key
def test_variant_included_in_key(self):
"""Variant number is included in the path so variants have distinct keys."""
key_v0 = build_s3_key(
cardset_id=27, player_id=1, variant=0, card_type="batting"
)
key_v3 = build_s3_key(
cardset_id=27, player_id=1, variant=3, card_type="batting"
)
assert "/v0/" in key_v0
assert "/v3/" in key_v3
assert key_v0 != key_v3
# ---------------------------------------------------------------------------
# TestBuildS3Url
# ---------------------------------------------------------------------------
class TestBuildS3Url:
"""Tests for build_s3_url — full URL construction with cache-bust param.
The URL format must be predictable so clients can construct and verify
image URLs without querying the database.
"""
def test_url_contains_bucket_and_region(self):
"""URL includes bucket name and region in the S3 hostname."""
key = "cards/cardset-027/player-42/v1/battingcard.png"
render_date = date(2026, 4, 6)
url = build_s3_url(key, render_date)
assert S3_BUCKET in url
assert S3_REGION in url
def test_url_contains_s3_key(self):
"""URL path includes the full S3 key."""
key = "cards/cardset-027/player-42/v1/battingcard.png"
render_date = date(2026, 4, 6)
url = build_s3_url(key, render_date)
assert key in url
def test_url_has_cache_bust_param(self):
"""URL ends with ?d=<render_date> for cache invalidation."""
key = "cards/cardset-027/player-42/v1/battingcard.png"
render_date = date(2026, 4, 6)
url = build_s3_url(key, render_date)
assert "?d=2026-04-06" in url
def test_url_format_full(self):
"""Full URL matches expected S3 pattern exactly."""
key = "cards/cardset-027/player-1/v0/battingcard.png"
render_date = date(2025, 11, 8)
url = build_s3_url(key, render_date)
expected = (
f"https://{S3_BUCKET}.s3.{S3_REGION}.amazonaws.com/{key}?d=2025-11-08"
)
assert url == expected
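`test_url_format_full` fixes the entire URL shape, so `build_s3_url` reduces to one f-string. A sketch consistent with these tests (`S3_BUCKET` and `S3_REGION` here are placeholder values; the real constants come from configuration):

```python
from datetime import date

S3_BUCKET = "example-bucket"  # placeholder; real value comes from config/env
S3_REGION = "us-east-2"       # placeholder

def build_s3_url(key: str, render_date: date) -> str:
    """Virtual-hosted-style S3 URL with an ISO-date cache-bust query param."""
    return (
        f"https://{S3_BUCKET}.s3.{S3_REGION}.amazonaws.com/"
        f"{key}?d={render_date.isoformat()}"
    )
```

Because the cache-bust segment is the render date, re-rendering a card on a new day yields a new URL even though the S3 key is unchanged.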
# ---------------------------------------------------------------------------
# TestUploadCardToS3
# ---------------------------------------------------------------------------
class TestUploadCardToS3:
"""Tests for upload_card_to_s3 — S3 put_object call validation.
We verify the exact parameters passed to put_object so that S3 serves
images with the correct Content-Type and Cache-Control headers.
"""
def test_put_object_called_with_correct_params(self):
"""put_object is called once with bucket, key, body, ContentType, CacheControl."""
mock_client = MagicMock()
png_bytes = b"\x89PNG\r\n\x1a\n"
s3_key = "cards/cardset-027/player-42/v1/battingcard.png"
upload_card_to_s3(mock_client, png_bytes, s3_key)
mock_client.put_object.assert_called_once_with(
Bucket=S3_BUCKET,
Key=s3_key,
Body=png_bytes,
ContentType="image/png",
CacheControl="public, max-age=300",
)
def test_upload_returns_none(self):
"""upload_card_to_s3 returns None (callers should not rely on a return value)."""
mock_client = MagicMock()
result = upload_card_to_s3(mock_client, b"PNG", "some/key.png")
assert result is None
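The shape these two tests imply is a thin wrapper over the boto3 client's `put_object` (a sketch; the bucket constant is assumed to live alongside the function in `card_storage`):

```python
S3_BUCKET = "example-bucket"  # placeholder; real constant lives in card_storage

def upload_card_to_s3(s3_client, png_bytes: bytes, s3_key: str) -> None:
    """Upload PNG bytes to S3 with the headers the tests assert on."""
    s3_client.put_object(
        Bucket=S3_BUCKET,
        Key=s3_key,
        Body=png_bytes,
        ContentType="image/png",
        # Short TTL; the ?d= cache-bust param on the URL handles re-renders.
        CacheControl="public, max-age=300",
    )
```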
# ---------------------------------------------------------------------------
# TestBackfillVariantImageUrl
# ---------------------------------------------------------------------------
class TestBackfillVariantImageUrl:
"""Tests for backfill_variant_image_url — end-to-end disk→S3→DB path.
The function is fire-and-forget: it reads a PNG from disk, uploads to S3,
then updates the appropriate card model's image_url. All errors are caught
and logged; the function must never raise.
Test strategy:
- Use tmp_path for temporary PNG files so no filesystem state leaks.
- Patch get_s3_client at the module level to intercept the S3 call.
- Patch BattingCard/PitchingCard at the module level (lazy import target).
"""
def test_batting_card_image_url_updated(self, tmp_path):
"""BattingCard.image_url is updated after a successful upload."""
png_path = tmp_path / "card.png"
png_path.write_bytes(b"\x89PNG\r\n\x1a\n fake png data")
mock_s3 = MagicMock()
mock_card = MagicMock()
with (
patch("app.services.card_storage.get_s3_client", return_value=mock_s3),
patch("app.services.card_storage.BattingCard") as MockBatting,
):
MockBatting.get.return_value = mock_card
backfill_variant_image_url(
player_id=42,
variant=1,
card_type="batting",
cardset_id=27,
png_path=str(png_path),
)
MockBatting.get.assert_called_once_with(
MockBatting.player_id == 42, MockBatting.variant == 1
)
assert mock_card.image_url is not None
mock_card.save.assert_called_once()
def test_pitching_card_image_url_updated(self, tmp_path):
"""PitchingCard.image_url is updated after a successful upload."""
png_path = tmp_path / "card.png"
png_path.write_bytes(b"\x89PNG\r\n\x1a\n fake png data")
mock_s3 = MagicMock()
mock_card = MagicMock()
with (
patch("app.services.card_storage.get_s3_client", return_value=mock_s3),
patch("app.services.card_storage.PitchingCard") as MockPitching,
):
MockPitching.get.return_value = mock_card
backfill_variant_image_url(
player_id=99,
variant=2,
card_type="pitching",
cardset_id=27,
png_path=str(png_path),
)
MockPitching.get.assert_called_once_with(
MockPitching.player_id == 99, MockPitching.variant == 2
)
assert mock_card.image_url is not None
mock_card.save.assert_called_once()
def test_s3_upload_called_with_png_bytes(self, tmp_path):
"""The PNG bytes read from disk are passed to put_object."""
png_bytes = b"\x89PNG\r\n\x1a\n real png content"
png_path = tmp_path / "card.png"
png_path.write_bytes(png_bytes)
mock_s3 = MagicMock()
with (
patch("app.services.card_storage.get_s3_client", return_value=mock_s3),
patch("app.services.card_storage.BattingCard") as MockBatting,
):
MockBatting.get.return_value = MagicMock()
backfill_variant_image_url(
player_id=1,
variant=0,
card_type="batting",
cardset_id=5,
png_path=str(png_path),
)
mock_s3.put_object.assert_called_once()
call_kwargs = mock_s3.put_object.call_args.kwargs
assert call_kwargs["Body"] == png_bytes
def test_s3_error_is_swallowed(self, tmp_path):
"""If S3 raises an exception, backfill swallows it and returns normally.
The function is called as a background task; it must never propagate
exceptions that would abort the calling request handler.
"""
png_path = tmp_path / "card.png"
png_path.write_bytes(b"PNG data")
mock_s3 = MagicMock()
mock_s3.put_object.side_effect = Exception("S3 connection refused")
with (
patch("app.services.card_storage.get_s3_client", return_value=mock_s3),
patch("app.services.card_storage.BattingCard"),
):
# Must not raise
backfill_variant_image_url(
player_id=1,
variant=0,
card_type="batting",
cardset_id=27,
png_path=str(png_path),
)
def test_missing_file_is_swallowed(self, tmp_path):
"""If the PNG file does not exist, backfill swallows the error and returns.
Render failures may leave no file on disk; the background task must
handle this gracefully rather than crashing the request.
"""
missing_path = str(tmp_path / "nonexistent.png")
with (
patch("app.services.card_storage.get_s3_client"),
patch("app.services.card_storage.BattingCard"),
):
# Must not raise
backfill_variant_image_url(
player_id=1,
variant=0,
card_type="batting",
cardset_id=27,
png_path=missing_path,
)
def test_db_error_is_swallowed(self, tmp_path):
"""If the DB save raises, backfill swallows it and returns normally."""
png_path = tmp_path / "card.png"
png_path.write_bytes(b"PNG data")
mock_s3 = MagicMock()
mock_card = MagicMock()
mock_card.save.side_effect = Exception("DB connection lost")
with (
patch("app.services.card_storage.get_s3_client", return_value=mock_s3),
patch("app.services.card_storage.BattingCard") as MockBatting,
):
MockBatting.get.return_value = mock_card
# Must not raise
backfill_variant_image_url(
player_id=1,
variant=0,
card_type="batting",
cardset_id=27,
png_path=str(png_path),
)

View File

@@ -72,8 +72,6 @@ from app.db_engine import (
Rarity,
RefractorBoostAudit,
RefractorCardState,
RefractorCosmetic,
RefractorTierBoost,
RefractorTrack,
Roster,
RosterSlot,
@@ -123,8 +121,6 @@ _WP13_MODELS = [
PitchingCardRatings,
RefractorTrack,
RefractorCardState,
RefractorTierBoost,
RefractorCosmetic,
RefractorBoostAudit,
]

View File

@@ -52,8 +52,6 @@ from app.db_engine import (
Rarity,
RefractorBoostAudit,
RefractorCardState,
RefractorCosmetic,
RefractorTierBoost,
RefractorTrack,
Roster,
RosterSlot,
@@ -111,8 +109,6 @@ _BOOST_INT_MODELS = [
PitchingCardRatings,
RefractorTrack,
RefractorCardState,
RefractorTierBoost,
RefractorCosmetic,
RefractorBoostAudit,
]

View File

@@ -0,0 +1,311 @@
"""Tests for image_url field in refractor cards API response.
What: Verifies that GET /api/v2/refractor/cards includes image_url in each card state
item, pulling the URL from the variant BattingCard or PitchingCard row.
Why: The refractor card art pipeline stores rendered card image URLs in the
BattingCard/PitchingCard rows. The Discord bot and website need image_url in
the /refractor/cards response so they can display variant art without a separate
lookup. These tests guard against regressions where image_url is accidentally
dropped from the response serialization.
Test cases:
test_cards_response_includes_image_url -- BattingCard with image_url set; verify
the value appears in the /cards response.
test_cards_response_image_url_null_when_not_set -- BattingCard with image_url=None;
verify null is returned (not omitted).
Uses the shared-memory SQLite TestClient pattern from test_refractor_state_api.py
so no PostgreSQL connection is required.
"""
import os
os.environ.setdefault("API_TOKEN", "test")
import pytest
from fastapi import FastAPI, Request
from fastapi.testclient import TestClient
from peewee import SqliteDatabase
from app.db_engine import (
BattingCard,
BattingSeasonStats,
Card,
Cardset,
Decision,
Event,
MlbPlayer,
Pack,
PackType,
PitchingCard,
PitchingSeasonStats,
Player,
ProcessedGame,
Rarity,
RefractorCardState,
RefractorCosmetic,
RefractorTierBoost,
RefractorTrack,
Roster,
RosterSlot,
ScoutClaim,
ScoutOpportunity,
StratGame,
StratPlay,
Team,
)
AUTH_HEADER = {"Authorization": "Bearer test"}
# ---------------------------------------------------------------------------
# SQLite database + model list
# ---------------------------------------------------------------------------
_img_url_db = SqliteDatabase(
"file:imgurlapitest?mode=memory&cache=shared",
uri=True,
pragmas={"foreign_keys": 1},
)
# Full model list matching the existing state API tests — needed so all FK
# constraints resolve in SQLite.
_IMG_URL_MODELS = [
Rarity,
Event,
Cardset,
MlbPlayer,
Player,
BattingCard,
PitchingCard,
Team,
PackType,
Pack,
Card,
Roster,
RosterSlot,
StratGame,
StratPlay,
Decision,
ScoutOpportunity,
ScoutClaim,
BattingSeasonStats,
PitchingSeasonStats,
ProcessedGame,
RefractorTrack,
RefractorCardState,
RefractorTierBoost,
RefractorCosmetic,
]
@pytest.fixture(autouse=False)
def setup_img_url_db():
"""Bind image-url test models to shared-memory SQLite and create tables.
What: Initialises the in-process SQLite database before each test and drops
all tables afterwards to ensure test isolation.
Why: SQLite shared-memory databases persist between tests in the same
process unless tables are dropped. Creating and dropping around each test
guarantees a clean state without requiring a real PostgreSQL instance.
"""
_img_url_db.bind(_IMG_URL_MODELS)
_img_url_db.connect(reuse_if_open=True)
_img_url_db.create_tables(_IMG_URL_MODELS)
yield _img_url_db
_img_url_db.drop_tables(list(reversed(_IMG_URL_MODELS)), safe=True)
def _build_image_url_app() -> FastAPI:
"""Minimal FastAPI app with refractor router for image_url tests."""
from app.routers_v2.refractor import router as refractor_router
app = FastAPI()
@app.middleware("http")
async def db_middleware(request: Request, call_next):
_img_url_db.connect(reuse_if_open=True)
return await call_next(request)
app.include_router(refractor_router)
return app
@pytest.fixture
def img_url_client(setup_img_url_db):
"""FastAPI TestClient backed by shared-memory SQLite for image_url tests."""
with TestClient(_build_image_url_app()) as c:
yield c
# ---------------------------------------------------------------------------
# Seed helpers
# ---------------------------------------------------------------------------
def _make_rarity():
r, _ = Rarity.get_or_create(
value=10, name="IU_Common", defaults={"color": "#ffffff"}
)
return r
def _make_cardset():
cs, _ = Cardset.get_or_create(
name="IU Test Set",
defaults={"description": "image url test cardset", "total_cards": 1},
)
return cs
def _make_player(name: str = "Test Player") -> Player:
return Player.create(
p_name=name,
rarity=_make_rarity(),
cardset=_make_cardset(),
set_num=1,
pos_1="CF",
image="https://example.com/img.png",
mlbclub="TST",
franchise="TST",
description="image url test",
)
def _make_team(suffix: str = "IU") -> Team:
return Team.create(
abbrev=suffix,
sname=suffix,
lname=f"Team {suffix}",
gmid=99900 + len(suffix),
gmname=f"gm_{suffix.lower()}",
gsheet="https://docs.google.com/iu_test",
wallet=500,
team_value=1000,
collection_value=1000,
season=11,
is_ai=False,
)
def _make_track(card_type: str = "batter") -> RefractorTrack:
track, _ = RefractorTrack.get_or_create(
name=f"IU {card_type} Track",
defaults=dict(
card_type=card_type,
formula="pa",
t1_threshold=100,
t2_threshold=300,
t3_threshold=700,
t4_threshold=1200,
),
)
return track
def _make_batting_card(player: Player, variant: int, image_url=None) -> BattingCard:
return BattingCard.create(
player=player,
variant=variant,
steal_low=1,
steal_high=3,
steal_auto=False,
steal_jump=1.0,
bunting="N",
hit_and_run="N",
running=5,
offense_col=1,
hand="R",
image_url=image_url,
)
def _make_card_state(
player: Player,
team: Team,
track: RefractorTrack,
variant: int,
current_tier: int = 1,
current_value: float = 150.0,
) -> RefractorCardState:
import datetime
return RefractorCardState.create(
player=player,
team=team,
track=track,
current_tier=current_tier,
current_value=current_value,
fully_evolved=False,
last_evaluated_at=datetime.datetime(2026, 4, 1, 12, 0, 0),
variant=variant,
)
# ---------------------------------------------------------------------------
# Tests
# ---------------------------------------------------------------------------
def test_cards_response_includes_image_url(setup_img_url_db, img_url_client):
"""GET /api/v2/refractor/cards includes image_url when the variant BattingCard has one.
What: Seeds a RefractorCardState at variant=1 and a matching BattingCard with
image_url set. Calls the /cards endpoint and asserts that image_url in the
response matches the seeded URL.
Why: This is the primary happy-path test for the image_url feature. If the
DB lookup in _build_card_state_response fails or the field is accidentally
omitted from the response dict, this test will catch it.
"""
player = _make_player("Homer Simpson")
team = _make_team("IU1")
track = _make_track("batter")
expected_url = (
"https://s3.example.com/cards/cardset-001/player-1/v1/battingcard.png"
)
_make_batting_card(player, variant=1, image_url=expected_url)
_make_card_state(player, team, track, variant=1)
resp = img_url_client.get(
f"/api/v2/refractor/cards?team_id={team.id}&evaluated_only=false",
headers=AUTH_HEADER,
)
assert resp.status_code == 200, resp.text
data = resp.json()
assert data["count"] == 1
item = data["items"][0]
assert "image_url" in item, "image_url key missing from response"
assert item["image_url"] == expected_url
def test_cards_response_image_url_null_when_not_set(setup_img_url_db, img_url_client):
"""GET /api/v2/refractor/cards returns image_url: null when BattingCard.image_url is None.
What: Seeds a BattingCard with image_url=None and a RefractorCardState at
variant=1. Verifies the response contains image_url with a null value.
Why: The image_url field must always be present in the response (even when
null) so API consumers can rely on its presence. Returning null rather than
omitting the key is the correct contract; omitting it would break consumers
that check for the key's presence to determine upload status.
"""
player = _make_player("Bart Simpson")
team = _make_team("IU2")
track = _make_track("batter")
_make_batting_card(player, variant=1, image_url=None)
_make_card_state(player, team, track, variant=1)
resp = img_url_client.get(
f"/api/v2/refractor/cards?team_id={team.id}&evaluated_only=false",
headers=AUTH_HEADER,
)
assert resp.status_code == 200, resp.text
data = resp.json()
assert data["count"] == 1
item = data["items"][0]
assert "image_url" in item, "image_url key missing from response"
assert item["image_url"] is None
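The contract both tests enforce: the serializer always emits the `image_url` key, sourcing it from the variant card row when one exists. A hypothetical sketch of that piece of the response-building logic (the helper name and shapes are assumed from the docstrings above, not taken from the real `_build_card_state_response`):

```python
def attach_image_url(item: dict, card) -> dict:
    """Attach image_url to a card-state response item.

    The key is always present; its value is None when no variant card row
    exists or the row has no uploaded image yet.
    """
    item = dict(item)  # don't mutate the caller's dict
    item["image_url"] = card.image_url if card is not None else None
    return item
```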

View File

@@ -5,8 +5,6 @@ Covers WP-01 acceptance criteria:
- RefractorTrack: CRUD and unique-name constraint
- RefractorCardState: CRUD, defaults, unique-(player,team) constraint,
  and FK resolution back to RefractorTrack
- RefractorTierBoost: CRUD and unique-(track, tier, boost_type, boost_target)
- RefractorCosmetic: CRUD and unique-name constraint
- BattingSeasonStats: CRUD with defaults, unique-(player, team, season),
  and in-place stat accumulation
@@ -22,8 +20,6 @@ from playhouse.shortcuts import model_to_dict
from app.db_engine import (
BattingSeasonStats,
RefractorCardState,
RefractorCosmetic,
RefractorTierBoost,
RefractorTrack,
)
@@ -134,115 +130,6 @@ class TestRefractorCardState:
assert resolved_track.name == "Batter Track"
# ---------------------------------------------------------------------------
# RefractorTierBoost
# ---------------------------------------------------------------------------
class TestRefractorTierBoost:
"""Tests for RefractorTierBoost, the per-tier stat/rating bonus table.
Each row maps a (track, tier) combination to a single boost: the
specific stat or rating column to buff and by how much. The four-
column unique constraint prevents double-booking the same boost slot.
"""
def test_create_tier_boost(self, track):
"""Creating a boost row persists all fields accurately.
Verifies boost_type, boost_target, and boost_value are stored
and retrieved without modification.
"""
boost = RefractorTierBoost.create(
track=track,
tier=1,
boost_type="rating",
boost_target="contact_vl",
boost_value=1.5,
)
fetched = RefractorTierBoost.get_by_id(boost.id)
assert fetched.track_id == track.id
assert fetched.tier == 1
assert fetched.boost_type == "rating"
assert fetched.boost_target == "contact_vl"
assert fetched.boost_value == 1.5
def test_tier_boost_unique_constraint(self, track):
"""Duplicate (track, tier, boost_type, boost_target) raises IntegrityError.
The four-column unique index ensures that a single boost slot
(e.g. Tier-1 contact_vl rating) cannot be defined twice for the
same track, which would create ambiguity during evolution evaluation.
"""
RefractorTierBoost.create(
track=track,
tier=2,
boost_type="rating",
boost_target="power_vr",
boost_value=2.0,
)
with pytest.raises(IntegrityError):
RefractorTierBoost.create(
track=track,
tier=2,
boost_type="rating",
boost_target="power_vr",
boost_value=3.0, # different value, same identity columns
)
# ---------------------------------------------------------------------------
# RefractorCosmetic
# ---------------------------------------------------------------------------
class TestRefractorCosmetic:
"""Tests for RefractorCosmetic, decorative unlocks tied to evolution tiers.
Cosmetics are purely visual rewards (frames, badges, themes) that a
card unlocks when it reaches a required tier. The name column is
the stable identifier and carries a UNIQUE constraint.
"""
def test_create_cosmetic(self):
"""Creating a cosmetic persists all fields correctly.
Verifies all columns including optional ones (css_class, asset_url)
are stored and retrieved.
"""
cosmetic = RefractorCosmetic.create(
name="Gold Frame",
tier_required=2,
cosmetic_type="frame",
css_class="evo-frame-gold",
asset_url="https://cdn.example.com/frames/gold.png",
)
fetched = RefractorCosmetic.get_by_id(cosmetic.id)
assert fetched.name == "Gold Frame"
assert fetched.tier_required == 2
assert fetched.cosmetic_type == "frame"
assert fetched.css_class == "evo-frame-gold"
assert fetched.asset_url == "https://cdn.example.com/frames/gold.png"
def test_cosmetic_unique_name(self):
"""Inserting a second cosmetic with the same name raises IntegrityError.
The UNIQUE constraint on RefractorCosmetic.name prevents duplicate
cosmetic definitions that could cause ambiguous tier unlock lookups.
"""
RefractorCosmetic.create(
name="Silver Badge",
tier_required=1,
cosmetic_type="badge",
)
with pytest.raises(IntegrityError):
RefractorCosmetic.create(
name="Silver Badge", # duplicate
tier_required=3,
cosmetic_type="badge",
)
# ---------------------------------------------------------------------------
# BattingSeasonStats
# ---------------------------------------------------------------------------

View File

@@ -64,8 +64,6 @@ from app.db_engine import (
ProcessedGame,
Rarity,
RefractorCardState,
RefractorCosmetic,
RefractorTierBoost,
RefractorTrack,
Roster,
RosterSlot,
@@ -681,8 +679,6 @@ _STATE_API_MODELS = [
ProcessedGame,
RefractorTrack,
RefractorCardState,
RefractorTierBoost,
RefractorCosmetic,
]

View File

@@ -41,8 +41,6 @@ from app.db_engine import (  # noqa: E402
ProcessedGame,
Rarity,
RefractorCardState,
RefractorCosmetic,
RefractorTierBoost,
RefractorTrack,
Roster,
RosterSlot,
@@ -204,8 +202,6 @@ _TRACK_API_MODELS = [
ProcessedGame,
RefractorTrack,
RefractorCardState,
RefractorTierBoost,
RefractorCosmetic,
]