# Plan 004: Initialize Alembic Migrations

**Priority**: CRITICAL
**Effort**: 2-3 hours
**Status**: NOT STARTED
**Risk Level**: HIGH - Schema evolution blocked

---
## Problem Statement
The database schema is currently created via `Base.metadata.create_all()`, so there is no migration history. The only existing migration file (`004_create_stat_materialized_views.py`) covers the stat materialized views, not the rest of the schema.

**Current State**:
- No version control of schema changes
- Cannot rollback to previous schema versions
- No documentation of schema evolution
- Production schema sync is risky
## Impact
- **Operations**: Cannot safely evolve schema
- **Rollback**: No way to revert schema changes
- **Audit**: No history of what changed when
- **Team**: Other developers can't sync schema
## Files to Modify/Create
| File | Action |
|------|--------|
| `backend/alembic/` | Initialize properly |
| `backend/alembic/env.py` | Configure for async SQLAlchemy |
| `backend/alembic/versions/001_initial_schema.py` | Create initial migration |
| `backend/app/database/session.py` | Remove `create_all()` call |
## Implementation Steps
### Step 1: Backup Current Schema (15 min)
```bash
# Export current schema
cd /mnt/NV2/Development/strat-gameplay-webapp/backend
# Dump schema from database
pg_dump --schema-only -d strat_gameplay > schema_backup.sql
# Also save current models
cp app/models/db_models.py db_models_backup.py
```
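If the migration work goes sideways, the backup can be restored with `psql`. A minimal sketch, assuming the database name and backup file from the commands above:
```bash
# Restore the schema-only dump, either into a scratch database for comparison
# or back into strat_gameplay after dropping the broken objects.
psql -d strat_gameplay -f schema_backup.sql
```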
### Step 2: Configure Alembic for Async (30 min)
Update `backend/alembic/env.py`:
```python
import asyncio
from logging.config import fileConfig

from sqlalchemy import pool
from sqlalchemy.engine import Connection
from sqlalchemy.ext.asyncio import async_engine_from_config

from alembic import context

from app.config import settings
from app.models.db_models import Base

# Alembic Config object
config = context.config

# Set database URL from settings. Keep the async driver (+asyncpg) in the URL:
# async_engine_from_config below requires an async dialect.
config.set_main_option("sqlalchemy.url", settings.database_url)

# Interpret the config file for Python logging
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# Model metadata for autogenerate
target_metadata = Base.metadata


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode."""
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def do_run_migrations(connection: Connection) -> None:
    context.configure(connection=connection, target_metadata=target_metadata)

    with context.begin_transaction():
        context.run_migrations()


async def run_async_migrations() -> None:
    """Run migrations in 'online' mode with async engine."""
    connectable = async_engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    async with connectable.connect() as connection:
        await connection.run_sync(do_run_migrations)

    await connectable.dispose()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode."""
    asyncio.run(run_async_migrations())


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
```
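A quick way to sanity-check the `env.py` wiring without touching a database is Alembic's offline (`--sql`) mode, which renders the migration SQL to stdout instead of executing it:
```bash
# Render the upgrade as SQL without connecting; this exercises
# run_migrations_offline() in env.py.
cd /mnt/NV2/Development/strat-gameplay-webapp/backend
alembic upgrade head --sql > migration_preview.sql
```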
### Step 3: Create Initial Migration (30 min)
```bash
# Generate initial migration from existing models
cd /mnt/NV2/Development/strat-gameplay-webapp/backend
alembic revision --autogenerate -m "Initial schema from models"
```
Review the generated migration and confirm it matches the existing schema before committing it.
Create/update `backend/alembic/versions/001_initial_schema.py`:
```python
"""Initial schema from models
Revision ID: 001
Revises:
Create Date: 2025-01-27
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
# revision identifiers
revision: str = '001'
down_revision: Union[str, None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# ### Game table ###
op.create_table('games',
sa.Column('id', postgresql.UUID(as_uuid=True), nullable=False),
sa.Column('league_id', sa.String(10), nullable=False),
sa.Column('home_team_id', sa.Integer(), nullable=False),
sa.Column('away_team_id', sa.Integer(), nullable=False),
sa.Column('home_user_id', sa.Integer(), nullable=True),
sa.Column('away_user_id', sa.Integer(), nullable=True),
sa.Column('current_inning', sa.Integer(), server_default='1'),
sa.Column('current_half', sa.String(10), server_default='top'),
sa.Column('home_score', sa.Integer(), server_default='0'),
sa.Column('away_score', sa.Integer(), server_default='0'),
sa.Column('status', sa.String(20), server_default='pending'),
sa.Column('allow_spectators', sa.Boolean(), server_default='true'),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()')),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()')),
sa.PrimaryKeyConstraint('id')
)
op.create_index('idx_game_status', 'games', ['status'])
op.create_index('idx_game_league', 'games', ['league_id'])
# ### Lineup table ###
op.create_table('lineups',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('game_id', postgresql.UUID(as_uuid=True), nullable=False),
sa.Column('team_id', sa.Integer(), nullable=False),
sa.Column('card_id', sa.Integer(), nullable=True),
sa.Column('player_id', sa.Integer(), nullable=True),
sa.Column('position', sa.String(5), nullable=False),
sa.Column('batting_order', sa.Integer(), nullable=True),
sa.Column('is_active', sa.Boolean(), server_default='true'),
sa.Column('entered_game_at', sa.Integer(), nullable=True),
sa.Column('exited_game_at', sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(['game_id'], ['games.id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id')
)
op.create_index('idx_lineup_game', 'lineups', ['game_id'])
op.create_index('idx_lineup_game_team', 'lineups', ['game_id', 'team_id'])
# ### Play table ###
op.create_table('plays',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('game_id', postgresql.UUID(as_uuid=True), nullable=False),
sa.Column('play_number', sa.Integer(), nullable=False),
sa.Column('inning', sa.Integer(), nullable=False),
sa.Column('half', sa.String(10), nullable=False),
sa.Column('outs_before', sa.Integer(), nullable=False),
sa.Column('outs_after', sa.Integer(), nullable=False),
sa.Column('batter_id', sa.Integer(), nullable=True),
sa.Column('pitcher_id', sa.Integer(), nullable=True),
sa.Column('catcher_id', sa.Integer(), nullable=True),
sa.Column('outcome', sa.String(50), nullable=False),
sa.Column('description', sa.Text(), nullable=True),
# ... additional columns ...
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()')),
sa.ForeignKeyConstraint(['game_id'], ['games.id'], ondelete='CASCADE'),
sa.ForeignKeyConstraint(['batter_id'], ['lineups.id']),
sa.ForeignKeyConstraint(['pitcher_id'], ['lineups.id']),
sa.ForeignKeyConstraint(['catcher_id'], ['lineups.id']),
sa.PrimaryKeyConstraint('id')
)
op.create_index('idx_play_game', 'plays', ['game_id'])
op.create_index('idx_play_game_number', 'plays', ['game_id', 'play_number'])
# ### Additional tables (Roll, GameSession, RosterLink, etc.) ###
# ... (similar patterns for remaining tables)
def downgrade() -> None:
op.drop_table('plays')
op.drop_table('lineups')
op.drop_table('games')
# ... drop remaining tables in reverse order
```
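After the initial migration is in place, it is worth confirming that the models and the migration history agree. On Alembic 1.9+ the `alembic check` command does this by running autogenerate as a dry run against the live database and failing if it would emit any operations; a quick sketch:
```bash
# Exits non-zero if the models contain changes not reflected in the
# migration history. Requires Alembic >= 1.9 and a reachable database.
alembic check
```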
### Step 4: Stamp Existing Database (15 min)
For an existing database that already has these tables, mark it as being at the initial migration instead of re-running it:
```bash
# Mark existing database as having initial schema
alembic stamp 001
```
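To confirm the stamp took effect, check Alembic's bookkeeping. `alembic stamp` only writes the revision id into the `alembic_version` table; it does not run any DDL:
```bash
# Should report 001 as the current (head) revision
alembic current

# The same value is stored in the alembic_version table
psql -d strat_gameplay -c "SELECT version_num FROM alembic_version;"
```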
### Step 5: Remove create_all() (10 min)
Update `backend/app/database/session.py`:
```python
async def init_db() -> None:
    """
    Initialize database connection.

    NOTE: Schema creation is now handled by Alembic migrations.
    Run `alembic upgrade head` to create/update the schema.
    """
    # Removed: await conn.run_sync(Base.metadata.create_all)
    logger.info("Database connection initialized")
```
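With `create_all()` gone, something has to apply migrations before the app serves traffic. One option (not prescribed by this plan) is to run `alembic upgrade head` from the container entrypoint; a sketch, where `uvicorn app.main:app` is a placeholder for however the backend is actually started:
```bash
#!/bin/sh
# docker-entrypoint.sh (hypothetical): apply migrations, then start the API
set -e
alembic upgrade head
exec uvicorn app.main:app --host 0.0.0.0 --port 8000
```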
### Step 6: Update README (15 min)
Add to `backend/README.md`:
````markdown
## Database Migrations
This project uses Alembic for database migrations.
### Initial Setup
```bash
# Apply all migrations
alembic upgrade head
```
### Creating New Migrations
```bash
# Auto-generate from model changes
alembic revision --autogenerate -m "Description of changes"
# Review the generated migration!
# Then apply:
alembic upgrade head
```
### Rolling Back
```bash
# Rollback one migration
alembic downgrade -1
# Rollback to specific revision
alembic downgrade 001
```
### Viewing History
```bash
# Show migration history
alembic history
# Show current revision
alembic current
```
````
### Step 7: Integrate Existing Materialized Views Migration (15 min)
Ensure the existing `004_create_stat_materialized_views.py` migration chains off the new initial revision:
```python
# Update revision to chain properly
revision: str = '004_stat_views'
down_revision: str = '001' # Chain to initial
```
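Once the revisions are chained, `alembic history` should show a single linear path from the initial schema to the materialized views (the output below is illustrative, including the revision messages):
```bash
alembic history
# 001 -> 004_stat_views (head), create stat materialized views
# <base> -> 001, Initial schema from models
```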
### Step 8: Write Migration Tests (30 min)
Create `backend/tests/integration/test_migrations.py`:
```python
import pytest
from alembic import command
from alembic.config import Config


class TestMigrations:
    """Tests for Alembic migrations."""

    @pytest.fixture
    def alembic_config(self):
        config = Config("alembic.ini")
        return config

    def test_upgrade_to_head(self, alembic_config):
        """Migrations can be applied cleanly."""
        command.upgrade(alembic_config, "head")

    def test_downgrade_to_base(self, alembic_config):
        """Migrations can be rolled back."""
        command.upgrade(alembic_config, "head")
        command.downgrade(alembic_config, "base")

    def test_upgrade_downgrade_upgrade(self, alembic_config):
        """Full round-trip migration works."""
        command.upgrade(alembic_config, "head")
        command.downgrade(alembic_config, "base")
        command.upgrade(alembic_config, "head")
```
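These tests apply and drop the full schema, so they must never point at the development database. Assuming `settings.database_url` is populated from a `DATABASE_URL` environment variable (an assumption about the config layer), they can be run against a scratch database like this:
```bash
# DATABASE_URL is assumed; use whatever variable app.config.settings actually reads
export DATABASE_URL="postgresql+asyncpg://postgres:postgres@localhost:5432/strat_gameplay_test"
cd backend
pytest tests/integration/test_migrations.py -v
```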
## Verification Checklist
- [ ] `alembic upgrade head` creates all tables
- [ ] `alembic downgrade base` removes all tables
- [ ] Existing database can be stamped without issues
- [ ] New migrations can be auto-generated
- [ ] Migration tests pass
- [ ] README updated with migration instructions
## CI/CD Integration
Add to CI pipeline:
```yaml
# .github/workflows/test.yml
- name: Run migrations
  run: |
    cd backend
    alembic upgrade head
```
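The migration step needs a reachable Postgres instance in CI. A minimal sketch of a job using a GitHub Actions service container; the image tag, credentials, database name, and `DATABASE_URL` variable are placeholders to be matched to the real workflow:
```yaml
# Sketch: Postgres service container for the migration job (values are placeholders)
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: strat_gameplay_test
        ports:
          - 5432:5432
        options: >-
          --health-cmd "pg_isready -U postgres"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
    steps:
      - uses: actions/checkout@v4
      - name: Run migrations
        env:
          DATABASE_URL: postgresql+asyncpg://postgres:postgres@localhost:5432/strat_gameplay_test
        run: |
          cd backend
          alembic upgrade head
```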
## Rollback Plan
If issues arise:
1. `alembic downgrade -1` to revert last migration
2. Restore from `schema_backup.sql` if needed
3. Re-enable `create_all()` temporarily
## Dependencies
- None (can be implemented independently)
## Notes
- Always review auto-generated migrations before applying
- Test migrations on staging before production
- Keep migrations small and focused
- Future: Add data migrations for complex changes