Update remote refs and card generation workflow

- Remove homelab special-case from commit-push command (all repos now use origin)
- Update sync-config to use origin remote instead of homelab
- Enhance card generation with season-pct params, CLI reference, and validation fixes

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Cal Corum 2026-02-15 19:11:19 -06:00
parent 2b6805a250
commit 1e9b52186b
3 changed files with 60 additions and 22 deletions


@@ -13,7 +13,7 @@ If `$ARGUMENTS` contains "pr", also create a pull request after pushing.
3. Run `git log --oneline -5` to see recent commit style
4. If there are no changes, say "Nothing to commit" and stop
5. Determine the remote name and current branch:
- Remote: use `git remote` (if multiple, prefer `origin`; for `~/.claude` use `homelab`)
- Remote: use `git remote` (if multiple, prefer `origin`)
- Branch: use `git branch --show-current`
6. Stage all relevant changed files (prefer specific files over `git add -A` — avoid secrets, .env, credentials)
7. Draft a concise commit message following the repo's existing style (focus on "why" not "what")
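The remote-selection rule in step 5 can be sketched as a small helper. This is illustrative only — `pick_remote` is a hypothetical name, and the command itself simply runs `git remote` and applies the preference:

```python
def pick_remote(remotes: list[str]) -> str:
    """Choose the push remote per step 5: prefer 'origin' when several exist."""
    if "origin" in remotes:
        return "origin"
    # Fall back to the only remote, or empty string if none are configured.
    return remotes[0] if remotes else ""
```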


@@ -8,12 +8,12 @@ Sync my ~/.claude configuration to its Gitea repo.
4. Show me a summary of the changes (new files, modified files, deleted files)
5. Stage all changes with `git add -A` (the .gitignore already excludes secrets, session data, and caches)
6. Create a commit with a descriptive message summarizing what changed (e.g., "Update skills, add new commands" or "Update Paper Dynasty workflows")
7. Push to the `homelab` remote: `git push homelab main`
7. Push to the `origin` remote: `git push origin main`
8. Confirm success with the commit hash and a brief summary
## Important
- The remote is called `homelab`, NOT `origin`
- The remote is called `origin`
- The branch is `main`
- The .gitignore is already configured to exclude secrets, session data, memory, and caches - so `git add -A` is safe
- This command IS explicit approval to commit and push - no need to ask for confirmation


@@ -6,6 +6,7 @@ Ask the user before starting:
1. **Refresh or new date range?** (refresh keeps existing config)
2. **Which environment?** (prod or dev)
3. **Which cardset?** (e.g., 27 for "2005 Live")
4. **Season progress?** (games played or date range for season-pct calculation)
All commands run from `/mnt/NV2/Development/paper-dynasty/card-creation/`.
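The doc does not define how season progress maps to `--season-pct`. One plausible reading, consistent with the `0.728` used in the mid-August example (118 of 162 games), is games played divided by schedule length. A hypothetical helper, assuming that interpretation:

```python
def season_pct(games_played: int, total_games: int = 162) -> float:
    """Fraction of the season completed, rounded to three decimals."""
    return round(games_played / total_games, 3)
```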
@@ -13,10 +14,12 @@ All commands run from `/mnt/NV2/Development/paper-dynasty/card-creation/`.
```bash
# 1. Verify config (dry-run shows settings without executing)
pd-cards retrosheet process <year> -c <cardset_id> -d <description> --dry-run
pd-cards retrosheet process <year> -c <cardset_id> -d <description> \
--start <YYYYMMDD> --end <YYYYMMDD> --season-pct <0.0-1.0> --dry-run
# 2. Generate cards (POSTs player data to API)
pd-cards retrosheet process <year> -c <cardset_id> -d <description> --end <YYYYMMDD>
pd-cards retrosheet process <year> -c <cardset_id> -d <description> \
--start <YYYYMMDD> --end <YYYYMMDD> --season-pct <0.0-1.0>
# 3. Validate positions (DH count MUST be <5; high DH = defense calc failure)
pd-cards retrosheet validate <cardset_id>
@@ -27,17 +30,40 @@ pd-cards upload check -c "<cardset name>"
# 5. CRITICAL: Validate database for negative groundball_b — STOP if errors found
# (see "Bug Prevention" section below)
# 6. Upload to S3 (fast — uses cached images from step 4)
# 6. Upload to S3
pd-cards upload s3 -c "<cardset name>"
# 7. Generate scouting reports
pd-cards scouting all -c <cardset_id>
# 7. Generate scouting reports (ALWAYS run without --cardset-id to cover all cardsets)
pd-cards scouting all
# 8. Upload scouting CSVs to production server
scp scouting/*.csv sba-db:container-data/pd-database/storage/
pd-cards scouting upload
```
**Verify scouting upload**: `ssh sba-db "ls -lh container-data/pd-database/storage/ | grep -E 'batting|pitching'"`
### CLI Parameter Reference
| Parameter | Description | Example |
|-----------|-------------|---------|
| `--start` | Season start date (YYYYMMDD) | `--start 20050403` |
| `--end` | Data cutoff date (YYYYMMDD) | `--end 20050815` |
| `--season-pct` | Fraction of season completed (0.0-1.0) | `--season-pct 0.728` |
| `--min-pa-vl` | Min plate appearances vs LHP (default: 20 Live, 1 PotM) | `--min-pa-vl 20` |
| `--min-pa-vr` | Min plate appearances vs RHP (default: 40 Live, 1 PotM) | `--min-pa-vr 40` |
| `--last-twoweeks-ratio` | Recency bias weight (auto-enabled at 0.2 after May 30) | `--last-twoweeks-ratio 0.2` |
| `--dry-run` / `-n` | Preview without saving to database | |
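The `--last-twoweeks-ratio` weight presumably blends recent performance into full-season rates. A minimal sketch of such a blend, assuming a simple linear mix — the actual pd-cards formula is not documented here and may differ:

```python
def blended_rate(season_rate: float, recent_rate: float, ratio: float = 0.2) -> float:
    """Linearly mix a full-season rate with a last-two-weeks rate.

    ratio weights the recent sample; 0.0 ignores recency entirely.
    """
    return (1 - ratio) * season_rate + ratio * recent_rate
```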
### Example: 2005 Live Series Update (Mid-August)
```bash
pd-cards retrosheet process 2005 -c 27 -d Live --start 20050403 --end 20050815 --season-pct 0.728 --dry-run
pd-cards retrosheet process 2005 -c 27 -d Live --start 20050403 --end 20050815 --season-pct 0.728
pd-cards retrosheet validate 27
pd-cards upload check -c "2005 Live"
# Run groundball_b validation (step 5)
pd-cards upload s3 -c "2005 Live"
pd-cards scouting all
pd-cards scouting upload
```
---
@@ -51,31 +77,42 @@ Card image generation (step 4) can create **negative groundball_b values** that
**Never skip step 5.** Broken cards uploaded to S3 affect all players immediately.
### Step 5 Validation Query
### Step 5 Validation Script
There is no CLI command for this validation yet. Run this Python script via `uv run python -c`:
```python
uv run python -c "
from db_calls import db_get
import asyncio
async def check_cards():
cards = await db_get('battingcards', params=[('cardset', 27)])
result = await db_get('battingcards', params=[('cardset', CARDSET_ID)])
cards = result.get('cards', [])
errors = []
for card in cards:
if card.get('groundball_b', 0) < 0:
errors.append(f"Player {card.get('player_id')}: groundball_b = {card.get('groundball_b')}")
player = card.get('player', {})
pid = player.get('player_id', card.get('id'))
gb = card.get('groundball_b')
if gb is not None and gb < 0:
errors.append(f'Player {pid}: groundball_b = {gb}')
for field in ['gb_b', 'fb_b', 'ld_b']:
val = card.get(field, 0)
if val < 0 or val > 100:
errors.append(f"Player {card.get('player_id')}: {field} = {val}")
val = card.get(field)
if val is not None and (val < 0 or val > 100):
errors.append(f'Player {pid}: {field} = {val}')
if errors:
print('ERRORS FOUND:')
print('\n'.join(errors))
print('\nDO NOT PROCEED — fix data and re-run step 2')
else:
print('Validation passed')
print(f'Validation passed — {len(cards)} batting cards checked, no issues')
asyncio.run(check_cards())
"
```
**Note:** Replace `CARDSET_ID` with the actual cardset ID (e.g., 27). The API returns `{'count': N, 'cards': [...]}` — always use `result.get('cards', [])` to extract the card list.
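The check logic can also be exercised without the API by factoring it into a pure function over already-fetched card dicts. A sketch using the same field names as the script above — `find_card_errors` is a hypothetical helper, and the real step 5 script keeps the `db_get` call:

```python
def find_card_errors(cards: list[dict]) -> list[str]:
    """Flag negative groundball_b and out-of-range gb_b/fb_b/ld_b values."""
    errors = []
    for card in cards:
        # Player id lives on the nested 'player' object; fall back to card id.
        pid = card.get('player', {}).get('player_id', card.get('id'))
        gb = card.get('groundball_b')
        if gb is not None and gb < 0:
            errors.append(f'Player {pid}: groundball_b = {gb}')
        for field in ('gb_b', 'fb_b', 'ld_b'):
            val = card.get(field)
            if val is not None and not (0 <= val <= 100):
                errors.append(f'Player {pid}: {field} = {val}')
    return errors
```

This makes the validation rules unit-testable with sample dicts before pointing the real script at a live cardset.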
---
## Architecture
@@ -147,12 +184,12 @@ pd-cards retrosheet process <year> -c <promo_cardset_id> \
# 3. Validate (expect higher DH count — promo players may lack defense data for short windows)
pd-cards retrosheet validate <promo_cardset_id>
# 4-6. Image validation and S3 upload (same as full cardset)
# 4-5. Image validation (same as full cardset — check, validate groundball_b, then upload)
pd-cards upload check -c "<promo cardset name>"
# Run groundball_b validation (step 5 from main workflow)
pd-cards upload s3 -c "<promo cardset name>"
# 7-8. Scouting reports — ALWAYS regenerate for ALL cardsets
# 6-7. Scouting reports — ALWAYS regenerate for ALL cardsets (no --cardset-id filter)
pd-cards scouting all
pd-cards scouting upload
```
@@ -174,6 +211,7 @@ pd-cards retrosheet process 2005 -c 28 -d "May PotM" --start 20050501 --end 2005
pd-cards retrosheet process 2005 -c 28 -d "May PotM" --start 20050501 --end 20050531
pd-cards retrosheet validate 28
pd-cards upload check -c "2005 Promos"
# Run groundball_b validation
pd-cards upload s3 -c "2005 Promos"
pd-cards scouting all
pd-cards scouting upload
@@ -181,5 +219,5 @@ pd-cards scouting upload
---
**Last Updated**: 2026-02-14
**Version**: 3.1 (Added Players of the Month variant)
**Last Updated**: 2026-02-15
**Version**: 3.2 (Fixed scouting commands to use CLI, fixed groundball_b validation script, added CLI parameter reference and example)