File cleanup

This commit is contained in:
parent 7ce64a14ea
commit f34c1977a3

.claude/CUSTOM_COMMANDS_FIX.md (new file, 96 lines)
# Custom Commands Fix Summary

**Date**: 2025-10-19

**Issue**: Custom commands endpoint failing with "relation 'customcommand' does not exist" and "'creator_id' not defined" errors

## Problems Fixed

### 1. Peewee Table Name Mismatch

**Problem**: Peewee was auto-generating table names as `customcommand` and `customcommandcreator` (lowercase, no underscores), but the PostgreSQL tables are named `custom_commands` and `custom_command_creators`.

**Fix**: Added `Meta` classes to both models in `app/db_engine.py`:

```python
class CustomCommandCreator(BaseModel):
    # ... fields ...

    class Meta:
        table_name = 'custom_command_creators'


class CustomCommand(BaseModel):
    # ... fields ...

    class Meta:
        table_name = 'custom_commands'
```

### 2. Missing Helper Functions

**Problem**: `app/routers_v3/custom_commands.py` was calling undefined functions such as `get_custom_command_by_name()` and `create_custom_command()`.

**Fix**: Implemented all helper functions using the Peewee ORM (lines 95-185):

- `get_custom_command_by_name(name: str)` - Case-insensitive lookup with creator info
- `get_custom_command_by_id(command_id: int)` - Full command with all creator fields
- `create_custom_command(command_data: dict)` - Create a new command
- `update_custom_command(command_id: int, update_data: dict)` - Update a command
- `delete_custom_command(command_id: int)` - Delete a command
- `get_creator_by_discord_id(discord_id: int)` - Look up a creator
- `create_creator(creator_data: dict)` - Create a new creator

### 3. DATABASE_TYPE References

**Problem**: Code referenced `DATABASE_TYPE` for conditional SQL (SQLite vs PostgreSQL), but the variable wasn't imported.

**Fix**: Removed all `DATABASE_TYPE` checks and used PostgreSQL-specific syntax (`ILIKE`) directly, since the migration is complete.

### 4. Missing creator_id in execute_custom_command

**Problem**: `execute_custom_command()` was failing because `get_custom_command_by_name()` didn't return the `creator_id` field.

**Fix**: Updated `get_custom_command_by_name()` to explicitly include `creator_id` in the returned dict (line 109).

## Files Modified

1. **app/db_engine.py**
   - Added `Meta.table_name = 'custom_command_creators'` to `CustomCommandCreator`
   - Added `Meta.table_name = 'custom_commands'` to `CustomCommand`

2. **app/routers_v3/custom_commands.py**
   - Added 7 helper functions using the Peewee ORM (lines 95-185)
   - Removed `DATABASE_TYPE` checks (lines 206, 951-956)
   - Simplified `execute_custom_command()` to use the helper functions (lines 894-922)
   - Added `creator_id` to the return value of `get_custom_command_by_name()` (line 109)

3. **CLAUDE.md**
   - Updated the Database Configuration section
   - Removed `DATABASE_TYPE` from the environment variables
   - Added Custom Commands to Key Data Models
   - Updated Important Notes to reflect completion of the PostgreSQL migration

## Endpoints Now Working

All custom commands endpoints are now functional.

**Public Endpoints**:

- `GET /api/v3/custom_commands` - List/search commands with pagination
- `GET /api/v3/custom_commands/{command_id}` - Get command by ID
- `GET /api/v3/custom_commands/by_name/{command_name}` - Get command by name
- `GET /api/v3/custom_commands/autocomplete` - Autocomplete for Discord
- `GET /api/v3/custom_commands/stats` - Command statistics
- `GET /api/v3/custom_commands/creators` - List creators

**Private Endpoints** (require API token):

- `POST /api/v3/custom_commands` - Create a new command
- `PUT /api/v3/custom_commands/{command_id}` - Update a command
- `PATCH /api/v3/custom_commands/{command_id}` - Partially update a command
- `PATCH /api/v3/custom_commands/by_name/{name}/execute` - Execute and track usage
- `DELETE /api/v3/custom_commands/{command_id}` - Delete a command
- `POST /api/v3/custom_commands/creators` - Create a new creator

## Testing Results

- ✅ Successfully retrieved creator by Discord ID
- ✅ Command creation returns a proper 409 Conflict for existing commands
- ✅ Command execution successfully updates usage statistics
- ✅ All endpoints return properly formatted JSON responses

## Key Learnings

1. **Always specify table names in Peewee models** - Peewee's auto-generated names don't match the PostgreSQL naming conventions
2. **Use the Peewee ORM over raw SQL** - Better type safety, better maintainability, and less error-prone
3. **PostgreSQL migration is complete** - No need for `DATABASE_TYPE` conditionals anymore
4. **Helper functions should return all necessary fields** - Including foreign-key IDs, not just related object data
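
Learning #1 in one line: Peewee derives a model's default table name by lowercasing the class name without inserting underscores (a simplified stand-in for Peewee's own name generation, consistent with the mismatch described above), which is why the `Meta.table_name` override was needed:

```python
def peewee_style_default_table_name(model_name: str) -> str:
    # Simplified assumption: lowercase the class name, no underscore insertion,
    # matching the "customcommand" vs "custom_commands" mismatch above.
    return model_name.lower()

for model, actual_table in [
    ("CustomCommand", "custom_commands"),
    ("CustomCommandCreator", "custom_command_creators"),
]:
    default = peewee_style_default_table_name(model)
    print(f"{default} != {actual_table}")
```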

.claude/bulk_update_player_images.py (new file, 211 lines)

#!/usr/bin/env python3
"""
One-time bulk update of Player.image values to S3 URLs

Maps player records to: https://sba-cards-2024.s3.us-east-1.amazonaws.com/<year>-cards/<player_name>.png
"""

import logging

from app.db_engine import db, Player

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(f'{__name__}.bulk_update_player_images')

# Season to year mapping
SEASON_TO_YEAR = {
    4: 2020,
    5: 2020,
    6: 2021,
    7: 2021,
    8: 2022,
    9: 2022,
    10: 2023,
    11: 2023,
}

S3_BASE_URL = "https://sba-cards-2024.s3.us-east-1.amazonaws.com"


def generate_image_url(current_url: str, season: int) -> str | None:
    """
    Generate the S3 image URL for a player based on their season.
    Preserves the existing filename from the current URL (including jr/sr suffixes).

    Args:
        current_url: Current image URL (e.g., "https://sombaseball.ddns.net/cards/2020/albert-almora-jr.png")
        season: Season number

    Returns:
        Full S3 URL, or None if the season is not in the mapping
        Example: https://sba-cards-2024.s3.us-east-1.amazonaws.com/2020-cards/albert-almora-jr.png
    """
    year = SEASON_TO_YEAR.get(season)
    if year is None:
        return None

    # Extract filename from current URL (preserves jr/sr/etc designations)
    # "https://sombaseball.ddns.net/cards/2020/albert-almora-jr.png" -> "albert-almora-jr.png"
    filename = current_url.split('/')[-1]

    return f"{S3_BASE_URL}/{year}-cards/{filename}"


def preview_updates(limit: int = 10):
    """Preview what the updates would look like"""
    logger.info("=" * 80)
    logger.info("PREVIEW MODE - Showing first %d updates", limit)
    logger.info("=" * 80)

    query = (Player
             .select(Player.id, Player.name, Player.season, Player.image)
             .where(Player.season.in_(list(SEASON_TO_YEAR.keys())))
             .limit(limit))

    for player in query:
        new_url = generate_image_url(player.image, player.season)
        logger.info(f"Player ID {player.id}: {player.name} (Season {player.season})")
        logger.info(f"  OLD: {player.image}")
        logger.info(f"  NEW: {new_url}")
        logger.info("-" * 80)


def get_update_statistics():
    """Get statistics about what will be updated"""
    logger.info("=" * 80)
    logger.info("GATHERING STATISTICS")
    logger.info("=" * 80)

    # Total players in target seasons
    total_query = (Player
                   .select()
                   .where(Player.season.in_(list(SEASON_TO_YEAR.keys()))))
    total_count = total_query.count()

    # Breakdown by season
    season_counts = {}
    for season in sorted(SEASON_TO_YEAR.keys()):
        count = Player.select().where(Player.season == season).count()
        season_counts[season] = count
        logger.info(f"Season {season} ({SEASON_TO_YEAR[season]}): {count} players")

    logger.info("-" * 80)
    logger.info(f"TOTAL players to update: {total_count}")
    logger.info("=" * 80)

    return total_count, season_counts


def bulk_update_images(batch_size: int = 1000, dry_run: bool = False):
    """
    Bulk update player images in batches

    Args:
        batch_size: Number of records to update per batch
        dry_run: If True, only show what would be updated without committing
    """
    if dry_run:
        logger.info("DRY RUN MODE - No changes will be committed")
        preview_updates(limit=20)
        total_count, season_counts = get_update_statistics()
        return

    logger.info("=" * 80)
    logger.info("STARTING BULK UPDATE")
    logger.info("=" * 80)

    # Get all players that need updates
    target_seasons = list(SEASON_TO_YEAR.keys())
    players_query = (Player
                     .select(Player.id, Player.name, Player.season, Player.image)
                     .where(Player.season.in_(target_seasons)))

    # Build update list
    updates = []
    skipped = 0

    logger.info("Building update list...")
    for player in players_query:
        new_url = generate_image_url(player.image, player.season)
        if new_url:
            updates.append({'id': player.id, 'image': new_url})
        else:
            skipped += 1
            logger.warning(f"Skipped player {player.id} - season {player.season} not in mapping")

    total = len(updates)
    logger.info(f"Prepared {total} updates (skipped {skipped})")

    if total == 0:
        logger.warning("No updates to perform!")
        return

    # Perform batch updates in a single transaction
    try:
        with db.atomic():
            updated_count = 0

            for i in range(0, total, batch_size):
                batch = updates[i:i + batch_size]

                # Build CASE statement for batch update
                # SQL: UPDATE player SET image = CASE id WHEN 1 THEN 'url1' WHEN 2 THEN 'url2' END WHERE id IN (1,2)
                case_statements = " ".join([
                    f"WHEN {item['id']} THEN '{item['image']}'"
                    for item in batch
                ])
                ids = ",".join(str(item['id']) for item in batch)

                query = f"""
                    UPDATE player
                    SET image = CASE id {case_statements} END
                    WHERE id IN ({ids})
                """

                db.execute_sql(query)
                updated_count += len(batch)

                logger.info(f"Progress: {updated_count}/{total} records updated ({updated_count / total * 100:.1f}%)")

        logger.info("=" * 80)
        logger.info(f"SUCCESS! Updated {updated_count} player image values")
        logger.info("=" * 80)

    except Exception as e:
        logger.error(f"ERROR during bulk update: {e}")
        logger.error("Transaction rolled back - no changes were made")
        raise


def main():
    """Main execution function"""
    import sys

    # Check command line arguments
    dry_run = '--dry-run' in sys.argv or '-n' in sys.argv

    if dry_run:
        logger.info("Running in DRY RUN mode (use without --dry-run to apply changes)")
        bulk_update_images(dry_run=True)
    else:
        logger.warning("=" * 80)
        logger.warning("LIVE RUN - This will modify the database!")
        logger.warning("Press Ctrl+C within 5 seconds to cancel...")
        logger.warning("=" * 80)

        import time
        try:
            time.sleep(5)
        except KeyboardInterrupt:
            logger.info("\nCancelled by user")
            sys.exit(0)

        bulk_update_images(batch_size=1000, dry_run=False)

    logger.info("Done!")


if __name__ == "__main__":
    main()
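
The script's CASE-based batch `UPDATE` can be exercised end-to-end against an in-memory SQLite table (a sketch with made-up rows; the real script targets the `player` table in PostgreSQL via Peewee's `db.execute_sql`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE player (id INTEGER PRIMARY KEY, image TEXT)")
conn.executemany("INSERT INTO player VALUES (?, ?)",
                 [(1, 'old-1.png'), (2, 'old-2.png'), (3, 'old-3.png')])

# One batch of prepared updates, shaped like the script's `updates` list.
batch = [{'id': 1, 'image': 'new-1.png'}, {'id': 2, 'image': 'new-2.png'}]

# UPDATE player SET image = CASE id WHEN 1 THEN 'new-1.png' ... END WHERE id IN (1,2)
case_statements = " ".join(f"WHEN {item['id']} THEN '{item['image']}'" for item in batch)
ids = ",".join(str(item['id']) for item in batch)
conn.execute(f"UPDATE player SET image = CASE id {case_statements} END WHERE id IN ({ids})")

print(dict(conn.execute("SELECT id, image FROM player")))
# → {1: 'new-1.png', 2: 'new-2.png', 3: 'old-3.png'}
```

Note that rows outside the `WHERE id IN (...)` list are untouched; the `IN` clause is what keeps the `CASE` from nulling out every other row.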

.claude/migrate_custom_commands_v2.py (new executable file, 321 lines)

#!/usr/bin/env python3
"""
Migration script to transfer custom commands from the old SQLite schema to the new PostgreSQL schema.

This script:
1. Reads data from the old 'creator' and 'command' tables in test-storage/sba_is_fun.db
2. Transforms the data to match the new schema (custom_command_creators and custom_commands)
3. Inserts the data into the target database (dev or production PostgreSQL)

Usage:
    # Migrate to dev database
    python migrate_custom_commands_v2.py --source test-storage/sba_is_fun.db --target dev

    # Migrate to production database
    python migrate_custom_commands_v2.py --source test-storage/sba_is_fun.db --target prod
"""

import sqlite3
import os
import sys
import argparse
from datetime import datetime
import psycopg2


def get_postgres_connection(target):
    """Get a PostgreSQL connection based on the target environment."""
    if target == 'dev':
        return psycopg2.connect(
            host=os.environ.get('POSTGRES_HOST', '10.10.0.42'),
            database=os.environ.get('POSTGRES_DB', 'sba_master'),
            user=os.environ.get('POSTGRES_USER', 'sba_admin'),
            password=os.environ.get('POSTGRES_PASSWORD', 'your_production_password'),
            port=int(os.environ.get('POSTGRES_PORT', '5432'))
        )
    elif target == 'prod':
        # Production credentials should be in environment variables
        return psycopg2.connect(
            host=os.environ.get('PROD_POSTGRES_HOST', 'api.sba.manticorum.com'),
            database=os.environ.get('PROD_POSTGRES_DB', 'sba_master'),
            user=os.environ.get('PROD_POSTGRES_USER', 'sba_admin'),
            password=os.environ.get('PROD_POSTGRES_PASSWORD', 'your_production_password'),
            port=int(os.environ.get('PROD_POSTGRES_PORT', '5432'))
        )
    else:
        raise ValueError(f"Invalid target: {target}. Must be 'dev' or 'prod'")


def read_source_data(source_db_path):
    """Read creators and commands from the source SQLite database."""
    if not os.path.exists(source_db_path):
        raise FileNotFoundError(f"Source database not found: {source_db_path}")

    conn = sqlite3.connect(source_db_path)
    conn.row_factory = sqlite3.Row
    cursor = conn.cursor()

    # Read creators
    cursor.execute("SELECT id, name, discordid FROM creator ORDER BY id")
    creators = [dict(row) for row in cursor.fetchall()]

    # Read commands
    cursor.execute("""
        SELECT id, name, message, creator_id, createtime, last_used, sent_warns
        FROM command
        ORDER BY id
    """)
    commands = [dict(row) for row in cursor.fetchall()]

    conn.close()

    print(f"✓ Read {len(creators)} creators and {len(commands)} commands from source")
    return creators, commands


def transform_data(creators, commands):
    """Transform old-schema data to the new schema format."""
    # Create a mapping of old creator IDs to new data
    creator_map = {}
    transformed_creators = []

    for creator in creators:
        # Transform creator data
        new_creator = {
            'discord_id': creator['discordid'],
            'username': creator['name'],
            'display_name': None,  # Not in old schema
            'created_at': datetime.now().isoformat(),
            'total_commands': 0,
            'active_commands': 0
        }
        transformed_creators.append(new_creator)
        creator_map[creator['id']] = creator['discordid']

    # Transform commands
    transformed_commands = []
    skipped_names = []

    for cmd in commands:
        # Skip commands with names that are too long (max 32 chars)
        if len(cmd['name']) > 32:
            skipped_names.append(cmd['name'])
            continue

        # Calculate use_count from last_used (if used at all, assume at least 1)
        use_count = 1 if cmd['last_used'] else 0

        # warning_sent is True if sent_warns has a value
        warning_sent = bool(cmd['sent_warns'])

        new_command = {
            'name': cmd['name'],
            'content': cmd['message'],
            'creator_discord_id': creator_map.get(cmd['creator_id']),
            'created_at': cmd['createtime'] or datetime.now().isoformat(),
            'updated_at': None,
            'last_used': cmd['last_used'] or cmd['createtime'] or datetime.now().isoformat(),
            'use_count': use_count,
            'warning_sent': warning_sent,
            'is_active': True,
            'tags': '[]'  # Empty JSON array
        }
        transformed_commands.append(new_command)

    if skipped_names:
        print(f"⚠ Skipped {len(skipped_names)} commands with names > 32 characters:")
        for name in skipped_names:
            print(f"  - {name} ({len(name)} chars)")

    print(f"✓ Transformed data for {len(transformed_creators)} creators and {len(transformed_commands)} commands")
    return transformed_creators, transformed_commands


def migrate_to_postgres(pg_conn, creators, commands):
    """Insert transformed data into the PostgreSQL database."""
    cursor = pg_conn.cursor()

    try:
        # Start transaction
        pg_conn.autocommit = False

        # Create a mapping of discord_id to new database ID
        creator_id_map = {}

        # Insert creators
        print(f"\nMigrating {len(creators)} creators...")
        for i, creator in enumerate(creators, 1):
            cursor.execute("""
                INSERT INTO custom_command_creators
                    (discord_id, username, display_name, created_at, total_commands, active_commands)
                VALUES (%(discord_id)s, %(username)s, %(display_name)s, %(created_at)s, %(total_commands)s, %(active_commands)s)
                ON CONFLICT (discord_id) DO UPDATE
                    SET username = EXCLUDED.username
                RETURNING id, discord_id
            """, creator)
            db_id, discord_id = cursor.fetchone()
            creator_id_map[discord_id] = db_id

            if i % 10 == 0:
                print(f"  ✓ Migrated {i}/{len(creators)} creators")

        print(f"✓ All {len(creators)} creators migrated")

        # Insert commands
        print(f"\nMigrating {len(commands)} commands...")
        migrated = 0
        skipped = 0

        for i, cmd in enumerate(commands, 1):
            # Get the new creator_id from the mapping
            creator_id = creator_id_map.get(cmd['creator_discord_id'])
            if not creator_id:
                print(f"  ⚠ Skipping command '{cmd['name']}' - creator not found")
                skipped += 1
                continue

            try:
                # Use a savepoint so an individual command failure doesn't roll back the entire transaction
                cursor.execute("SAVEPOINT cmd_insert")
                cursor.execute("""
                    INSERT INTO custom_commands
                        (name, content, creator_id, created_at, updated_at, last_used,
                         use_count, warning_sent, is_active, tags)
                    VALUES (%(name)s, %(content)s, %(creator_id)s, %(created_at)s, %(updated_at)s,
                            %(last_used)s, %(use_count)s, %(warning_sent)s, %(is_active)s, %(tags)s)
                    ON CONFLICT (name) DO UPDATE
                        SET content = EXCLUDED.content,
                            last_used = EXCLUDED.last_used,
                            use_count = EXCLUDED.use_count
                """, {**cmd, 'creator_id': creator_id})
                cursor.execute("RELEASE SAVEPOINT cmd_insert")
                migrated += 1

                if i % 25 == 0:
                    print(f"  ✓ Migrated {migrated}/{len(commands)} commands")
            except Exception as e:
                cursor.execute("ROLLBACK TO SAVEPOINT cmd_insert")
                print(f"  ⚠ Error migrating command '{cmd['name']}': {e}")
                skipped += 1

        print(f"✓ Migrated {migrated} commands ({skipped} skipped)")

        # Update creator stats
        print("\nUpdating creator statistics...")
        cursor.execute("""
            UPDATE custom_command_creators cc
            SET total_commands = (
                    SELECT COUNT(*) FROM custom_commands WHERE creator_id = cc.id
                ),
                active_commands = (
                    SELECT COUNT(*) FROM custom_commands WHERE creator_id = cc.id AND is_active = TRUE
                )
        """)

        # Commit transaction
        pg_conn.commit()
        print("✓ Transaction committed successfully")

        # Print final stats
        cursor.execute("SELECT COUNT(*) FROM custom_command_creators")
        total_creators = cursor.fetchone()[0]
        cursor.execute("SELECT COUNT(*) FROM custom_commands")
        total_commands = cursor.fetchone()[0]
        cursor.execute("SELECT COUNT(*) FROM custom_commands WHERE is_active = TRUE")
        active_commands = cursor.fetchone()[0]

        print(f"\n{'=' * 60}")
        print("Migration Summary:")
        print(f"  Total creators in database: {total_creators}")
        print(f"  Total commands in database: {total_commands}")
        print(f"  Active commands: {active_commands}")
        print(f"{'=' * 60}")

    except Exception as e:
        pg_conn.rollback()
        print(f"\n✗ Migration failed: {e}")
        raise
    finally:
        cursor.close()


def main():
    parser = argparse.ArgumentParser(
        description='Migrate custom commands from the old SQLite schema to the new PostgreSQL schema'
    )
    parser.add_argument(
        '--source',
        default='test-storage/sba_is_fun.db',
        help='Path to source SQLite database (default: test-storage/sba_is_fun.db)'
    )
    parser.add_argument(
        '--target',
        choices=['dev', 'prod'],
        required=True,
        help='Target database environment (dev or prod)'
    )
    parser.add_argument(
        '--dry-run',
        action='store_true',
        help='Show what would be migrated without actually doing it'
    )
    parser.add_argument(
        '--yes',
        action='store_true',
        help='Skip confirmation prompt for production'
    )

    args = parser.parse_args()

    print(f"{'=' * 60}")
    print("Custom Commands Migration Script")
    print(f"{'=' * 60}")
    print(f"Source: {args.source}")
    print(f"Target: {args.target.upper()} PostgreSQL database")
    print(f"Dry run: {'Yes' if args.dry_run else 'No'}")
    print(f"{'=' * 60}\n")

    # Read source data
    creators, commands = read_source_data(args.source)

    # Transform data
    transformed_creators, transformed_commands = transform_data(creators, commands)

    if args.dry_run:
        print("\n=== DRY RUN MODE ===")
        print(f"\nWould migrate {len(transformed_creators)} creators:")
        for creator in transformed_creators[:5]:
            print(f"  - {creator['username']} (discord_id: {creator['discord_id']})")
        if len(transformed_creators) > 5:
            print(f"  ... and {len(transformed_creators) - 5} more")

        print(f"\nWould migrate {len(transformed_commands)} commands:")
        for cmd in transformed_commands[:5]:
            print(f"  - {cmd['name']} (by discord_id: {cmd['creator_discord_id']})")
        if len(transformed_commands) > 5:
            print(f"  ... and {len(transformed_commands) - 5} more")

        print("\n=== No changes made (dry run) ===")
        return

    # Confirm before proceeding with production
    if args.target == 'prod' and not args.yes:
        print("\n⚠️ WARNING: You are about to migrate to PRODUCTION!")
        response = input("Type 'YES' to continue: ")
        if response != 'YES':
            print("Migration cancelled.")
            return

    # Connect to PostgreSQL and migrate
    print(f"\nConnecting to {args.target.upper()} PostgreSQL database...")
    try:
        pg_conn = get_postgres_connection(args.target)
        print("✓ Connected successfully")

        migrate_to_postgres(pg_conn, transformed_creators, transformed_commands)

        pg_conn.close()
        print("\n✓ Migration completed successfully!")

    except Exception as e:
        print(f"\n✗ Migration failed: {e}")
        sys.exit(1)


if __name__ == '__main__':
    main()
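
The per-row SAVEPOINT pattern used above — let one bad command fail without losing the whole transaction — can be demonstrated with the stdlib `sqlite3`, which also supports savepoints (the one-column table and sample rows are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.isolation_level = None   # manage the transaction manually
cur = conn.cursor()
cur.execute("CREATE TABLE custom_commands (name TEXT UNIQUE NOT NULL)")

cur.execute("BEGIN")
migrated, skipped = 0, 0
for name in ["standings", None, "scores"]:   # None violates NOT NULL
    try:
        cur.execute("SAVEPOINT cmd_insert")
        cur.execute("INSERT INTO custom_commands (name) VALUES (?)", (name,))
        cur.execute("RELEASE SAVEPOINT cmd_insert")
        migrated += 1
    except sqlite3.IntegrityError:
        # Roll back only this row; earlier inserts in the transaction survive.
        cur.execute("ROLLBACK TO SAVEPOINT cmd_insert")
        skipped += 1
cur.execute("COMMIT")

print(migrated, skipped)  # → 2 1
```

Without the savepoint, the failed `INSERT` would leave the transaction aborted (in PostgreSQL) or force a full rollback, losing the rows already migrated.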

.claude/sba_is_fun.db (new binary file, not shown)
migrations/add_custom_commands_tables.sql (new file, 45 lines)

-- Migration: Add custom_commands tables
-- Date: 2025-10-02
-- Description: Creates tables for Discord bot custom command functionality

-- Create custom_command_creators table
CREATE TABLE IF NOT EXISTS custom_command_creators (
    id SERIAL PRIMARY KEY,
    discord_id BIGINT UNIQUE NOT NULL,
    username VARCHAR(32) NOT NULL,
    display_name VARCHAR(32),
    created_at TIMESTAMP NOT NULL,
    total_commands INTEGER DEFAULT 0,
    active_commands INTEGER DEFAULT 0
);

CREATE INDEX IF NOT EXISTS idx_custom_command_creators_discord_id
    ON custom_command_creators(discord_id);

-- Create custom_commands table
CREATE TABLE IF NOT EXISTS custom_commands (
    id SERIAL PRIMARY KEY,
    name VARCHAR(32) UNIQUE NOT NULL,
    content TEXT NOT NULL,
    creator_id INTEGER NOT NULL,
    created_at TIMESTAMP NOT NULL,
    updated_at TIMESTAMP,
    last_used TIMESTAMP,
    use_count INTEGER DEFAULT 0,
    warning_sent BOOLEAN DEFAULT FALSE,
    is_active BOOLEAN DEFAULT TRUE,
    tags TEXT,
    FOREIGN KEY (creator_id) REFERENCES custom_command_creators (id)
);

CREATE INDEX IF NOT EXISTS idx_custom_commands_name
    ON custom_commands(name);

CREATE INDEX IF NOT EXISTS idx_custom_commands_creator_id
    ON custom_commands(creator_id);

CREATE INDEX IF NOT EXISTS idx_custom_commands_is_active
    ON custom_commands(is_active);

CREATE INDEX IF NOT EXISTS idx_custom_commands_last_used
    ON custom_commands(last_used);
migrations/add_help_commands_table.sql (new file, 31 lines)

-- Migration: Add help_commands table
-- Date: 2025-10-10
-- Description: Creates table for Discord bot custom help system

-- Create help_commands table
CREATE TABLE IF NOT EXISTS help_commands (
    id SERIAL PRIMARY KEY,
    name VARCHAR(32) UNIQUE NOT NULL,
    title VARCHAR(200) NOT NULL,
    content TEXT NOT NULL,
    category VARCHAR(50),
    created_by_discord_id TEXT NOT NULL,
    created_at TIMESTAMP NOT NULL,
    updated_at TIMESTAMP,
    last_modified_by TEXT,
    is_active BOOLEAN DEFAULT TRUE,
    view_count INTEGER DEFAULT 0,
    display_order INTEGER DEFAULT 0
);

CREATE INDEX IF NOT EXISTS idx_help_commands_name
    ON help_commands(name);

CREATE INDEX IF NOT EXISTS idx_help_commands_category
    ON help_commands(category);

CREATE INDEX IF NOT EXISTS idx_help_commands_is_active
    ON help_commands(is_active);

CREATE INDEX IF NOT EXISTS idx_help_commands_display_order
    ON help_commands(display_order);