Compare commits

...

19 Commits

Author SHA1 Message Date
57ef9d7f09 Merge pull request 'feat: Phase 0 baseline benchmark script and log (WP-00) (#87)' (#95) from ai/paper-dynasty-database#87 into next-release
All checks were successful
Build Docker Image / build (push) Successful in 8m23s
2026-03-23 04:00:01 +00:00
14218de02f Merge branch 'next-release' into ai/paper-dynasty-database#87 2026-03-23 03:59:53 +00:00
b9faf278f6 Merge pull request 'fix: prevent paperdex timeout on unfiltered requests (#102)' (#103) from ai/paper-dynasty-database#102 into next-release
Some checks are pending
Build Docker Image / build (push) Waiting to run
2026-03-23 03:59:46 +00:00
3d7ff5a86c Merge branch 'next-release' into ai/paper-dynasty-database#102 2026-03-23 03:59:35 +00:00
cal
6caa24e1be Merge pull request 'chore: replace deprecated datetime.utcnow() with datetime.now(UTC) (#114)' (#118) from ai/paper-dynasty-database#114 into next-release
All checks were successful
Build Docker Image / build (push) Successful in 8m43s
Reviewed-on: #118
2026-03-19 18:26:13 +00:00
cal
dc163e0ddd Merge pull request 'fix: sort /teams/{id}/evolutions by current_tier desc, current_value desc (#116)' (#120) from ai/paper-dynasty-database-116 into next-release
Some checks failed
Build Docker Image / build (push) Has been cancelled
Reviewed-on: #120
Reviewed-by: Claude <cal.corum+openclaw@gmail.com>
2026-03-19 18:24:47 +00:00
Cal Corum
0953a45b9f fix: sort /teams/{id}/evolutions by current_tier desc, current_value desc (#116)
Closes #116

The endpoint was returning results in player_id insertion order, causing
/evo status in Discord to show a wall of T0/value-0 cards before any
progressed players. Sort by current_tier DESC, current_value DESC so
the most-evolved cards always appear first.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-19 13:24:14 -05:00
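The new ordering described above is a two-key descending sort. A minimal standalone sketch (card dicts are invented; field names match the endpoint):

```python
# Hypothetical card-state rows, as the endpoint might serialize them.
cards = [
    {"player_id": 1, "current_tier": 0, "current_value": 0},
    {"player_id": 2, "current_tier": 3, "current_value": 120},
    {"player_id": 3, "current_tier": 3, "current_value": 80},
]

# Equivalent of ORDER BY current_tier DESC, current_value DESC:
# sort on the (tier, value) tuple, reversed, so most-evolved cards lead.
cards.sort(key=lambda c: (c["current_tier"], c["current_value"]), reverse=True)

assert [c["player_id"] for c in cards] == [2, 3, 1]
```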
cal
87f46a1bfd Merge pull request 'docs: update list_team_evolutions docstring for player_name and Player join' (#121) from ai/paper-dynasty-database#115 into next-release
Some checks are pending
Build Docker Image / build (push) Waiting to run
Reviewed-on: #121
2026-03-19 18:22:22 +00:00
Cal Corum
8733fd45ad docs: update list_team_evolutions docstring for player_name and Player join
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-19 13:20:57 -05:00
cal
57d8a929fd Merge pull request 'fix: include player_name in /teams/{id}/evolutions response (#115)' (#119) from ai/paper-dynasty-database#115 into next-release
Some checks are pending
Build Docker Image / build (push) Waiting to run
Reviewed-on: #119
Reviewed-by: cal <cal@manticorum.com>
2026-03-19 18:20:18 +00:00
cal
a81bde004b Merge pull request 'fix: season-stats update-game returns 404 for nonexistent game_id' (#117) from ai/paper-dynasty-database#113 into next-release
Some checks failed
Build Docker Image / build (push) Has been cancelled
Reviewed-on: #117
2026-03-19 18:18:39 +00:00
Cal Corum
383fb2bc3f fix: include player_name in /teams/{id}/evolutions response (#115)
JOIN the Player table in the evolutions query so p_name can be
included in each serialized item without N+1 queries.

Closes #115

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-19 13:03:34 -05:00
Cal Corum
9c19120444 chore: replace deprecated datetime.utcnow() with datetime.now(UTC) (#114)
Closes #114

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-19 12:31:43 -05:00
Cal Corum
3fc6721d4d fix: catch DoesNotExist and return 404 for nonexistent game_id
Closes #113

Adds a specific `DoesNotExist` handler before the generic `Exception`
block in `update_game_season_stats`. Peewee's `DoesNotExist` (raised
when `StratGame.get_by_id(game_id)` finds no row) previously bubbled
through to the `except Exception` handler which included raw SQL and
params in the 500 detail string. Now returns a clean 404.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-19 12:01:46 -05:00
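The handler-ordering point matters because `except` clauses are tried top to bottom, and Peewee's `DoesNotExist` subclasses `Exception`. A minimal sketch (the class and function shape here are stand-ins, not the real endpoint):

```python
class DoesNotExist(Exception):
    """Stand-in for peewee's DoesNotExist, which subclasses Exception."""

def update_game(game_id, games):
    try:
        if game_id not in games:
            raise DoesNotExist(f"Game {game_id} not found")
        return {"status": 200, "result": games[game_id]}
    except DoesNotExist:
        # Must precede the generic handler, or it would never run:
        # the first matching except clause wins.
        return {"status": 404, "detail": f"Game {game_id} not found"}
    except Exception as exc:
        # Generic fallback; previously this branch leaked raw SQL
        # and params into the 500 detail string.
        return {"status": 500, "detail": str(exc)}

print(update_game(99, {1: "ok"}))  # → {'status': 404, 'detail': 'Game 99 not found'}
```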
cal
cf0b1d1d1c Merge pull request 'Card Evolution: season stats full recalculation → next-release' (#112) from card-evolution into next-release
All checks were successful
Build Docker Image / build (push) Successful in 8m49s
Reviewed-on: #112
Reviewed-by: Claude <cal.corum+openclaw@gmail.com>
2026-03-19 15:49:10 +00:00
cal
3ff3fd3d14 Merge pull request 'Card Evolution Phase 1: card-evolution → next-release' (#110) from card-evolution into next-release
All checks were successful
Build Docker Image / build (push) Successful in 8m12s
Reviewed-on: #110
2026-03-18 21:29:14 +00:00
Cal Corum
6f10c8775c fix: prevent paperdex timeout on unfiltered requests (#102)
- Remove premature all_dex.count() before filters are applied (was a
  full-table COUNT(*) before any WHERE clauses)
- Add limit parameter (default 500) to cap unbounded result sets
- Materialize queryset into list once and use len() for count, avoiding
  a second COUNT(*) query
- Remove unused all_sets variable (blocked pre-commit ruff check)

Closes #102

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 16:33:57 -05:00
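The materialize-once pattern from this commit can be sketched with a fake query object (Peewee's real `SelectQuery` behaves analogously: every `.count()` is a separate `COUNT(*)` round trip):

```python
class FakeQuery:
    """Stand-in for a lazy ORM query; count() simulates a COUNT(*) round trip."""

    def __init__(self, rows):
        self.rows = rows
        self.count_calls = 0

    def count(self):
        self.count_calls += 1
        return len(self.rows)

    def limit(self, n):
        return FakeQuery(self.rows[:n])

    def __iter__(self):
        return iter(self.rows)

query = FakeQuery(list(range(1000))).limit(500)

# Before the fix: query.count() for the response plus iteration = two queries.
# After: materialize once, then reuse the list for both the count and the items.
items = list(query)
response = {"count": len(items), "paperdex": items}

assert response["count"] == 500
assert query.count_calls == 0  # no COUNT(*) issued at all
```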
Cal Corum
2ab6e71735 fix: address review feedback (#95)
- Fix curl -w override bug: consolidate --write-out/"-w" into single
  -w "%{http_code} %{time_total}" so STATUS and TIMING are both captured
- Add ?nocache=1 to render URL so baseline measures cold render time
- Fix duplicate BASELINE.md Section 2 rows (batting vs pitching)
- Add benchmarks/render_timings.txt to .gitignore

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-13 00:31:35 -05:00
Cal Corum
4f2a66b67e feat: add Phase 0 baseline benchmark script and log (WP-00) (#87)
Closes #87

Add benchmarks/benchmark_renders.sh to time 10 sequential card renders
via curl against any API environment, and benchmarks/BASELINE.md to
record methodology and results for pre/post optimization comparison.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-13 00:04:34 -05:00
7 changed files with 264 additions and 71 deletions

.gitignore vendored

@@ -83,3 +83,6 @@ postgres_data/
README_GAUNTLET_CLEANUP.md
wipe_gauntlet_team.py
SCHEMA.md
# Benchmark output files
benchmarks/render_timings.txt


@@ -8,35 +8,33 @@ from pandas import DataFrame
from ..db_engine import Paperdex, model_to_dict, Player, Cardset, Team, DoesNotExist
from ..dependencies import oauth2_scheme, valid_token
router = APIRouter(
prefix='/api/v2/paperdex',
tags=['paperdex']
)
router = APIRouter(prefix="/api/v2/paperdex", tags=["paperdex"])
class PaperdexModel(pydantic.BaseModel):
team_id: int
player_id: int
created: Optional[int] = int(datetime.timestamp(datetime.now())*1000)
created: Optional[int] = int(datetime.timestamp(datetime.now()) * 1000)
@router.get('')
@router.get("")
async def get_paperdex(
team_id: Optional[int] = None, player_id: Optional[int] = None, created_after: Optional[int] = None,
cardset_id: Optional[int] = None, created_before: Optional[int] = None, flat: Optional[bool] = False,
csv: Optional[bool] = None):
team_id: Optional[int] = None,
player_id: Optional[int] = None,
created_after: Optional[int] = None,
cardset_id: Optional[int] = None,
created_before: Optional[int] = None,
flat: Optional[bool] = False,
csv: Optional[bool] = None,
limit: Optional[int] = 500,
):
all_dex = Paperdex.select().join(Player).join(Cardset).order_by(Paperdex.id)
if all_dex.count() == 0:
raise HTTPException(status_code=404, detail=f'There are no paperdex to filter')
if team_id is not None:
all_dex = all_dex.where(Paperdex.team_id == team_id)
if player_id is not None:
all_dex = all_dex.where(Paperdex.player_id == player_id)
if cardset_id is not None:
all_sets = Cardset.select().where(Cardset.id == cardset_id)
all_dex = all_dex.where(Paperdex.player.cardset.id == cardset_id)
if created_after is not None:
# Convert milliseconds timestamp to datetime for PostgreSQL comparison
@@ -47,61 +45,63 @@ async def get_paperdex(
created_before_dt = datetime.fromtimestamp(created_before / 1000)
all_dex = all_dex.where(Paperdex.created <= created_before_dt)
# if all_dex.count() == 0:
# db.close()
# raise HTTPException(status_code=404, detail=f'No paperdex found')
if limit is not None:
all_dex = all_dex.limit(limit)
if csv:
data_list = [['id', 'team_id', 'player_id', 'created']]
data_list = [["id", "team_id", "player_id", "created"]]
for line in all_dex:
data_list.append(
[
line.id, line.team.id, line.player.player_id, line.created
]
[line.id, line.team.id, line.player.player_id, line.created]
)
return_val = DataFrame(data_list).to_csv(header=False, index=False)
return Response(content=return_val, media_type='text/csv')
return Response(content=return_val, media_type="text/csv")
else:
return_val = {'count': all_dex.count(), 'paperdex': []}
for x in all_dex:
return_val['paperdex'].append(model_to_dict(x, recurse=not flat))
items = list(all_dex)
return_val = {"count": len(items), "paperdex": []}
for x in items:
return_val["paperdex"].append(model_to_dict(x, recurse=not flat))
return return_val
@router.get('/{paperdex_id}')
@router.get("/{paperdex_id}")
async def get_one_paperdex(paperdex_id, csv: Optional[bool] = False):
try:
this_dex = Paperdex.get_by_id(paperdex_id)
except DoesNotExist:
raise HTTPException(status_code=404, detail=f'No paperdex found with id {paperdex_id}')
raise HTTPException(
status_code=404, detail=f"No paperdex found with id {paperdex_id}"
)
if csv:
data_list = [
['id', 'team_id', 'player_id', 'created'],
[this_dex.id, this_dex.team.id, this_dex.player.id, this_dex.created]
["id", "team_id", "player_id", "created"],
[this_dex.id, this_dex.team.id, this_dex.player.id, this_dex.created],
]
return_val = DataFrame(data_list).to_csv(header=False, index=False)
return Response(content=return_val, media_type='text/csv')
return Response(content=return_val, media_type="text/csv")
else:
return_val = model_to_dict(this_dex)
return return_val
@router.post('')
@router.post("")
async def post_paperdex(paperdex: PaperdexModel, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logging.warning('Bad Token: [REDACTED]')
logging.warning("Bad Token: [REDACTED]")
raise HTTPException(
status_code=401,
detail='You are not authorized to post paperdex. This event has been logged.'
detail="You are not authorized to post paperdex. This event has been logged.",
)
dupe_dex = Paperdex.get_or_none(Paperdex.team_id == paperdex.team_id, Paperdex.player_id == paperdex.player_id)
dupe_dex = Paperdex.get_or_none(
Paperdex.team_id == paperdex.team_id, Paperdex.player_id == paperdex.player_id
)
if dupe_dex:
return_val = model_to_dict(dupe_dex)
return return_val
@@ -109,7 +109,7 @@ async def post_paperdex(paperdex: PaperdexModel, token: str = Depends(oauth2_sch
this_dex = Paperdex(
team_id=paperdex.team_id,
player_id=paperdex.player_id,
created=datetime.fromtimestamp(paperdex.created / 1000)
created=datetime.fromtimestamp(paperdex.created / 1000),
)
saved = this_dex.save()
@@ -119,24 +119,30 @@ async def post_paperdex(paperdex: PaperdexModel, token: str = Depends(oauth2_sch
else:
raise HTTPException(
status_code=418,
detail='Well slap my ass and call me a teapot; I could not save that dex'
detail="Well slap my ass and call me a teapot; I could not save that dex",
)
@router.patch('/{paperdex_id}')
@router.patch("/{paperdex_id}")
async def patch_paperdex(
paperdex_id, team_id: Optional[int] = None, player_id: Optional[int] = None, created: Optional[int] = None,
token: str = Depends(oauth2_scheme)):
paperdex_id,
team_id: Optional[int] = None,
player_id: Optional[int] = None,
created: Optional[int] = None,
token: str = Depends(oauth2_scheme),
):
if not valid_token(token):
logging.warning('Bad Token: [REDACTED]')
logging.warning("Bad Token: [REDACTED]")
raise HTTPException(
status_code=401,
detail='You are not authorized to patch paperdex. This event has been logged.'
detail="You are not authorized to patch paperdex. This event has been logged.",
)
try:
this_dex = Paperdex.get_by_id(paperdex_id)
except DoesNotExist:
raise HTTPException(status_code=404, detail=f'No paperdex found with id {paperdex_id}')
raise HTTPException(
status_code=404, detail=f"No paperdex found with id {paperdex_id}"
)
if team_id is not None:
this_dex.team_id = team_id
@@ -151,40 +157,43 @@ async def patch_paperdex(
else:
raise HTTPException(
status_code=418,
detail='Well slap my ass and call me a teapot; I could not save that rarity'
detail="Well slap my ass and call me a teapot; I could not save that rarity",
)
@router.delete('/{paperdex_id}')
@router.delete("/{paperdex_id}")
async def delete_paperdex(paperdex_id, token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logging.warning('Bad Token: [REDACTED]')
logging.warning("Bad Token: [REDACTED]")
raise HTTPException(
status_code=401,
detail='You are not authorized to delete rewards. This event has been logged.'
detail="You are not authorized to delete rewards. This event has been logged.",
)
try:
this_dex = Paperdex.get_by_id(paperdex_id)
except DoesNotExist:
raise HTTPException(status_code=404, detail=f'No paperdex found with id {paperdex_id}')
raise HTTPException(
status_code=404, detail=f"No paperdex found with id {paperdex_id}"
)
count = this_dex.delete_instance()
if count == 1:
raise HTTPException(status_code=200, detail=f'Paperdex {this_dex} has been deleted')
else:
raise HTTPException(status_code=500, detail=f'Paperdex {this_dex} was not deleted')
@router.post('/wipe-ai')
async def wipe_ai_paperdex(token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logging.warning('Bad Token: [REDACTED]')
raise HTTPException(
status_code=401,
detail='Unauthorized'
status_code=200, detail=f"Paperdex {this_dex} has been deleted"
)
else:
raise HTTPException(
status_code=500, detail=f"Paperdex {this_dex} was not deleted"
)
g_teams = Team.select().where(Team.abbrev.contains('Gauntlet'))
@router.post("/wipe-ai")
async def wipe_ai_paperdex(token: str = Depends(oauth2_scheme)):
if not valid_token(token):
logging.warning("Bad Token: [REDACTED]")
raise HTTPException(status_code=401, detail="Unauthorized")
g_teams = Team.select().where(Team.abbrev.contains("Gauntlet"))
count = Paperdex.delete().where(Paperdex.team << g_teams).execute()
return f'Deleted {count} records'
return f"Deleted {count} records"


@@ -52,9 +52,12 @@ async def update_game_season_stats(
raise HTTPException(status_code=401, detail="Unauthorized")
from ..services.season_stats import update_season_stats
from ..db_engine import DoesNotExist
try:
result = update_season_stats(game_id, force=force)
except DoesNotExist:
raise HTTPException(status_code=404, detail=f"Game {game_id} not found")
except Exception as exc:
logger.error("update-game/%d failed: %s", game_id, exc, exc_info=True)
raise HTTPException(


@@ -1049,7 +1049,6 @@ async def team_buy_players(team_id: int, ids: str, ts: str):
detail=f"You are not authorized to buy {this_team.abbrev} cards. This event has been logged.",
)
all_ids = ids.split(",")
conf_message = ""
total_cost = 0
@@ -1542,9 +1541,11 @@ async def list_team_evolutions(
):
"""List all EvolutionCardState rows for a team, with optional filters.
Joins EvolutionCardState to EvolutionTrack so that card_type filtering
works without a second query. Results are paginated via page/per_page
(1-indexed pages); items are ordered by player_id for stable ordering.
Joins EvolutionCardState → EvolutionTrack (for card_type filtering and
threshold context) and EvolutionCardState → Player (for player_name),
both eager-loaded in a single query. Results are paginated via
page/per_page (1-indexed pages); items are ordered by current_tier DESC,
current_value DESC so the most-progressed cards appear first.
Query parameters:
card_type -- filter to states whose track.card_type matches (e.g. 'batter', 'sp')
@@ -1555,20 +1556,26 @@ async def list_team_evolutions(
Response shape:
{"count": N, "items": [card_state_with_threshold_context, ...]}
Each item in 'items' has the same shape as GET /evolution/cards/{card_id}.
Each item in 'items' has the same shape as GET /evolution/cards/{card_id},
plus a ``player_name`` field sourced from the Player table.
"""
if not valid_token(token):
logging.warning("Bad Token: [REDACTED]")
raise HTTPException(status_code=401, detail="Unauthorized")
from ..db_engine import EvolutionCardState, EvolutionTrack
from ..db_engine import EvolutionCardState, EvolutionTrack, Player
from ..routers_v2.evolution import _build_card_state_response
query = (
EvolutionCardState.select(EvolutionCardState, EvolutionTrack)
EvolutionCardState.select(EvolutionCardState, EvolutionTrack, Player)
.join(EvolutionTrack)
.switch(EvolutionCardState)
.join(Player)
.where(EvolutionCardState.team == team_id)
.order_by(EvolutionCardState.player_id)
.order_by(
EvolutionCardState.current_tier.desc(),
EvolutionCardState.current_value.desc(),
)
)
if card_type is not None:
@@ -1581,5 +1588,9 @@ async def list_team_evolutions(
offset = (page - 1) * per_page
page_query = query.offset(offset).limit(per_page)
items = [_build_card_state_response(state) for state in page_query]
items = []
for state in page_query:
item = _build_card_state_response(state)
item["player_name"] = state.player.p_name
items.append(item)
return {"count": total, "items": items}


@@ -19,7 +19,7 @@ and WP-09 (formula engine). Models and formula functions are imported lazily so
this module can be imported before those PRs merge.
"""
from datetime import datetime
from datetime import datetime, UTC
import logging
@@ -170,7 +170,7 @@ def evaluate_card(
new_tier = _tier_from_value_fn(value, track)
# 58. Update card state (no tier regression)
now = datetime.utcnow()
now = datetime.now(UTC)
card_state.current_value = value
card_state.current_tier = max(card_state.current_tier, new_tier)
card_state.fully_evolved = card_state.current_tier >= 4

benchmarks/BASELINE.md Normal file

@@ -0,0 +1,92 @@
# Phase 0 Baseline Benchmarks — WP-00
Captured before any Phase 0 render-pipeline optimizations (WP-01 through WP-04).
Run these benchmarks again after each work package lands to measure improvement.
---
## 1. Per-Card Render Time
**What is measured:** Time from HTTP request to full PNG response for a single card image.
Each render triggers a full Playwright Chromium launch, page load, screenshot, and teardown.
### Method
```bash
# Set API_BASE to the environment under test
export API_BASE=http://pddev.manticorum.com:816
# Run against 10 batting cards (auto-fetches player IDs)
./benchmarks/benchmark_renders.sh
# Or supply explicit player IDs:
./benchmarks/benchmark_renders.sh 101 102 103 104 105 106 107 108 109 110
# For pitching cards:
CARD_TYPE=pitching ./benchmarks/benchmark_renders.sh
```
Prerequisites: `curl`, `jq`, `bc`
Results are appended to `benchmarks/render_timings.txt`.
### Baseline Results — 2026-03-13
| Environment | Card type | N | Min (s) | Max (s) | Avg (s) |
|-------------|-----------|---|---------|---------|---------|
| dev (pddev.manticorum.com:816) | batting | 10 | _TBD_ | _TBD_ | _TBD_ |
| dev (pddev.manticorum.com:816) | pitching | 10 | _TBD_ | _TBD_ | _TBD_ |
> **Note:** Run `./benchmarks/benchmark_renders.sh` against the dev API and paste
> the per-render timings from `render_timings.txt` into the table above.
**Expected baseline (pre-optimization):** ~2.0-3.0s per render
(Chromium spawn ~1.0-1.5s + Google Fonts fetch ~0.3-0.5s + render ~0.3s)
---
## 2. Batch Upload Time
**What is measured:** Wall-clock time to render and upload N card images to S3
using the `pd-cards upload` CLI (in the `card-creation` repo).
### Method
```bash
# In the card-creation repo:
time pd-cards upload --cardset 24 --limit 20
```
Or to capture more detail:
```bash
START=$(date +%s%3N)
pd-cards upload --cardset 24 --limit 20
END=$(date +%s%3N)
echo "Elapsed: $(( (END - START) / 1000 )).$(( (END - START) % 1000 ))s"
```
### Baseline Results — 2026-03-13
| Environment | Cards | Elapsed (s) | Per-card avg (s) |
|-------------|-------|-------------|-----------------|
| dev (batting) | 20 | _TBD_ | _TBD_ |
| dev (pitching) | 20 | _TBD_ | _TBD_ |
> **Note:** Run the upload command in the `card-creation` repo and record timings here.
**Expected baseline (pre-optimization):** ~40-60s for 20 cards (~2-3s each, sequential)
---
## 3. Re-run After Each Work Package
| Milestone | Per-card avg (s) | 20-card upload (s) | Notes |
|-----------|-----------------|-------------------|-------|
| Baseline (pre-WP-01/02) | _TBD_ | _TBD_ | This document |
| After WP-01 (self-hosted fonts) | — | — | |
| After WP-02 (persistent browser) | — | — | |
| After WP-01 + WP-02 combined | — | — | |
| After WP-04 (concurrent upload) | — | — | |
Target: <1.0s per render, <5 min for 800-card upload (with WP-01 + WP-02 deployed).

benchmarks/benchmark_renders.sh Executable file

@@ -0,0 +1,75 @@
#!/usr/bin/env bash
# WP-00: Baseline benchmark — sequential card render timing
#
# Measures per-card render time for 10 cards by calling the card image
# endpoint sequentially and recording curl's time_total for each request.
#
# Usage:
# API_BASE=http://pddev.manticorum.com:816 ./benchmarks/benchmark_renders.sh
# API_BASE=http://localhost:8000 CARD_TYPE=pitching ./benchmarks/benchmark_renders.sh
# API_BASE=http://pddev.manticorum.com:816 ./benchmarks/benchmark_renders.sh 101 102 103
#
# Arguments (optional): explicit player IDs to render. If omitted, the script
# queries the API for the first 10 players in the live cardset.
#
# Output: results are printed to stdout and appended to benchmarks/render_timings.txt
set -euo pipefail
API_BASE="${API_BASE:-http://localhost:8000}"
CARD_TYPE="${CARD_TYPE:-batting}"
OUTFILE="$(dirname "$0")/render_timings.txt"
echo "=== Card Render Benchmark ===" | tee -a "$OUTFILE"
echo "Date: $(date -u +%Y-%m-%dT%H:%M:%SZ)" | tee -a "$OUTFILE"
echo "API: $API_BASE" | tee -a "$OUTFILE"
echo "Type: $CARD_TYPE" | tee -a "$OUTFILE"
# --- Resolve player IDs ---
if [ "$#" -gt 0 ]; then
PLAYER_IDS=("$@")
echo "Mode: explicit IDs (${#PLAYER_IDS[@]} players)" | tee -a "$OUTFILE"
else
echo "Mode: auto-fetch first 10 players from live cardset" | tee -a "$OUTFILE"
# Fetch player list and extract IDs; requires jq
RAW=$(curl -sf "$API_BASE/api/v2/players?page_size=10")
PLAYER_IDS=($(echo "$RAW" | jq -r '.players[].id // .[]?.id // .[]' 2>/dev/null | head -10))
if [ "${#PLAYER_IDS[@]}" -eq 0 ]; then
echo "ERROR: Could not fetch player IDs from $API_BASE/api/v2/players" | tee -a "$OUTFILE"
exit 1
fi
fi
echo "Players: ${PLAYER_IDS[*]}" | tee -a "$OUTFILE"
echo "" | tee -a "$OUTFILE"
# --- Run renders ---
TOTAL=0
COUNT=0
for player_id in "${PLAYER_IDS[@]}"; do
# Bypass cached PNG files; remove ?nocache=1 after baseline is captured to test cache-hit performance.
URL="$API_BASE/api/v2/players/$player_id/${CARD_TYPE}card?nocache=1"
HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code} %{time_total}" "$URL" 2>&1)
STATUS=$(echo "$HTTP_CODE" | awk '{print $1}')
TIMING=$(echo "$HTTP_CODE" | awk '{print $2}')
echo " player_id=$player_id http=$STATUS time=${TIMING}s" | tee -a "$OUTFILE"
if [ "$STATUS" = "200" ]; then
TOTAL=$(echo "$TOTAL + $TIMING" | bc -l)
COUNT=$((COUNT + 1))
fi
done
# --- Summary ---
echo "" | tee -a "$OUTFILE"
if [ "$COUNT" -gt 0 ]; then
AVG=$(echo "scale=3; $TOTAL / $COUNT" | bc -l)
echo "Successful renders: $COUNT / ${#PLAYER_IDS[@]}" | tee -a "$OUTFILE"
echo "Total time: ${TOTAL}s" | tee -a "$OUTFILE"
echo "Average: ${AVG}s per render" | tee -a "$OUTFILE"
else
echo "No successful renders — check API_BASE and player IDs" | tee -a "$OUTFILE"
fi
echo "---" | tee -a "$OUTFILE"
echo "" | tee -a "$OUTFILE"
echo "Results appended to $OUTFILE"