Completed Phase 3E-Final with Redis caching upgrade and WebSocket X-Check
integration for real-time defensive play resolution.
## Redis Caching System
### New Files
- app/services/redis_client.py - Async Redis client with connection pooling (see the sketch after this list)
* Connection pool of 10 connections
* Automatic connect/disconnect lifecycle
* Ping health checks
* Environment-configurable via REDIS_URL
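For reference, a minimal sketch of what such a wrapper can look like, assuming redis-py's asyncio API (`redis.asyncio`); the class and attribute names here are illustrative rather than the project's exact implementation:

```python
# Hedged sketch of app/services/redis_client.py, assuming redis-py's asyncio API.
# Only the behaviours listed above (10-connection pool, connect/disconnect
# lifecycle, ping health check, REDIS_URL configuration) are taken as given.
import os
from typing import Optional

import redis.asyncio as redis


class RedisClient:
    """Thin async wrapper holding a shared connection pool."""

    def __init__(self) -> None:
        self._pool: Optional[redis.ConnectionPool] = None
        self._client: Optional[redis.Redis] = None

    async def connect(self, url: Optional[str] = None) -> None:
        # Environment-configurable via REDIS_URL; pool capped at 10 connections.
        url = url or os.getenv("REDIS_URL", "redis://localhost:6379/0")
        self._pool = redis.ConnectionPool.from_url(
            url, max_connections=10, decode_responses=True
        )
        self._client = redis.Redis(connection_pool=self._pool)

    async def ping(self) -> bool:
        # Health check used at startup and in the integration test.
        return bool(self._client and await self._client.ping())

    async def disconnect(self) -> None:
        if self._client is not None:
            await self._client.close()
        if self._pool is not None:
            await self._pool.disconnect()
        self._client = None
        self._pool = None

    @property
    def client(self) -> redis.Redis:
        if self._client is None:
            raise RuntimeError("Redis client is not connected")
        return self._client


# Module-level singleton exported from app.services
redis_client = RedisClient()
```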
### Modified Files
- app/services/position_rating_service.py - Migrated from in-memory caching to Redis (caching sketch after this list)
* Redis key pattern: "position_ratings:{card_id}"
* TTL: 86400 seconds (24 hours)
* Graceful fallback if Redis unavailable
* Individual and bulk cache clearing (scan_iter)
* 760x performance improvement (0.274s API → 0.000361s Redis)
- app/main.py - Added Redis startup/shutdown events
* Connect on app startup with settings.redis_url
* Disconnect on shutdown
* Warning logged if Redis connection fails
- app/config.py - Added redis_url setting
* Default: "redis://localhost:6379/0"
* Override via REDIS_URL environment variable
- app/services/__init__.py - Export redis_client
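The caching pattern implied by the bullets above can be sketched as follows. This is illustrative only: `_fetch_ratings_from_api` and the plain-dict return type are placeholders for the existing service code, and only the behaviours listed in this changelog (key pattern, 24-hour TTL, graceful fallback, scan_iter clearing) are assumed.

```python
# Hedged sketch of the Redis-backed cache in position_rating_service.py.
# _fetch_ratings_from_api stands in for the existing API call.
import json
import logging
from typing import Optional

from app.services.redis_client import redis_client

logger = logging.getLogger(__name__)

CACHE_TTL_SECONDS = 86400  # 24 hours
KEY_PATTERN = "position_ratings:{card_id}"


async def get_ratings_for_card(card_id: int, league_id: str) -> list[dict]:
    key = KEY_PATTERN.format(card_id=card_id)

    # Try Redis first, but degrade gracefully if it is unavailable.
    try:
        cached = await redis_client.client.get(key)
        if cached is not None:
            return json.loads(cached)
    except Exception as exc:
        logger.warning("Redis unavailable, falling back to API: %s", exc)

    ratings = await _fetch_ratings_from_api(card_id, league_id)  # existing API call

    try:
        await redis_client.client.set(key, json.dumps(ratings), ex=CACHE_TTL_SECONDS)
    except Exception:
        pass  # cache write failures never break the request

    return ratings


async def clear_cache(card_id: Optional[int] = None) -> None:
    client = redis_client.client
    if card_id is not None:
        # Individual clearing by key.
        await client.delete(KEY_PATTERN.format(card_id=card_id))
        return
    # Bulk clearing: scan_iter avoids blocking Redis the way KEYS would.
    async for key in client.scan_iter(match="position_ratings:*"):
        await client.delete(key)
```

The startup/shutdown wiring in app/main.py is along these lines (again a sketch; the project may equally use a lifespan context manager instead of event hooks):

```python
# Hedged sketch of the startup/shutdown wiring described above.
import logging

from fastapi import FastAPI

from app.config import settings
from app.services import redis_client

logger = logging.getLogger(__name__)
app = FastAPI()


@app.on_event("startup")
async def startup() -> None:
    try:
        await redis_client.connect(settings.redis_url)
        await redis_client.ping()
    except Exception as exc:
        # The system still works without Redis; caching falls back to the API.
        logger.warning("Redis connection failed, running without cache: %s", exc)


@app.on_event("shutdown")
async def shutdown() -> None:
    await redis_client.disconnect()
```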
### Testing
- test_redis_cache.py - Live integration test (full script appended at the end of this document)
* 10-step validation: connect, cache miss, cache hit, performance, etc.
* Verified 760x speedup with player 8807 (7 positions)
* Data integrity checks pass
## X-Check WebSocket Integration
### Modified Files
- app/websocket/handlers.py - Enhanced submit_manual_outcome handler (payload logic sketched after this list)
* Serialize XCheckResult to JSON when present
* Include x_check_details in play_resolved broadcast
* Fixed bug: Use result.outcome instead of submitted outcome
* Includes defender ratings, dice rolls, resolution steps
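A hedged sketch of the payload-building part of that handler, assuming a python-socketio style `emit` and a Pydantic XCheckResult model; the function name `broadcast_play_resolved` and the `sio`/`room`/`result` parameters are placeholders for objects that already exist in the handler:

```python
# Illustrative fragment of the play_resolved broadcast in app/websocket/handlers.py.
# The result object and socket server come from the existing handler code.
from typing import Any


async def broadcast_play_resolved(sio: Any, room: str, play_id: str, result: Any) -> None:
    payload: dict[str, Any] = {
        "play_id": play_id,
        # Bug fix noted above: broadcast the resolved outcome,
        # not the outcome the client submitted.
        "outcome": result.outcome,
    }

    # Only X-Check plays carry the extra detail block.
    if getattr(result, "x_check_result", None) is not None:
        # Serialized XCheckResult: defender ratings, dice rolls, resolution steps.
        payload["x_check_details"] = result.x_check_result.model_dump(mode="json")

    await sio.emit("play_resolved", payload, room=room)
```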
### New Files
- app/websocket/X_CHECK_FRONTEND_GUIDE.md - Comprehensive frontend documentation
* Event structure and field definitions
* Implementation examples (basic, enhanced, polished)
* Error handling and common pitfalls
* Test scenarios with expected data
* League differences (SBA vs PD)
* 500+ lines of frontend integration guide
- app/websocket/MANUAL_VS_AUTO_MODE.md - Workflow documentation
* Manual mode: Players read cards, submit outcomes
* Auto mode: System generates from ratings (PD only)
* X-Check resolution comparison
* UI recommendations for each mode
* Configuration reference
* Testing considerations
### Testing
- tests/integration/test_xcheck_websocket.py - WebSocket integration tests (outline sketched after this list)
* Test X-Check play includes x_check_details ✅
* Test non-X-Check plays don't include details ✅
* Full event structure validation
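The two tests roughly follow this shape; it is an outline only, and `connected_game_client` and `submit_outcome` are hypothetical stand-ins for the project's own fixtures and helpers:

```python
# Hedged outline of tests/integration/test_xcheck_websocket.py.
# connected_game_client and submit_outcome are assumed test helpers, not real names.
import pytest


@pytest.mark.asyncio
async def test_xcheck_play_includes_details(connected_game_client):
    # Submit an outcome that triggers an X-Check (exact payload is project-specific).
    event = await submit_outcome(connected_game_client, xcheck=True)
    assert event["event"] == "play_resolved"
    assert "x_check_details" in event["data"]


@pytest.mark.asyncio
async def test_non_xcheck_play_has_no_details(connected_game_client):
    event = await submit_outcome(connected_game_client, xcheck=False)
    assert "x_check_details" not in event["data"]
```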
## Performance Impact
- Redis caching: 760x speedup for position ratings
- WebSocket: No performance impact (x_check_details is an optional field, present only on X-Check plays)
- Graceful degradation: System works without Redis
## Phase 3E-Final Progress
- ✅ WebSocket event handlers for X-Check UI
- ✅ Frontend integration documentation
- ✅ Redis caching upgrade (from in-memory)
- ✅ Redis connection pool in app lifecycle
- ✅ Integration tests (2 WebSocket, 1 Redis)
- ✅ Manual vs Auto mode workflow documentation
Phase 3E-Final: 100% Complete
Phase 3 Overall: ~98% Complete
## Testing Results
All tests passing:
- X-Check table tests: 36/36 ✅
- WebSocket integration: 2/2 ✅
- Redis live test: 10/10 steps ✅
## Configuration
Development:
- REDIS_URL=redis://localhost:6379/0 (Docker Compose)

Production options:
- REDIS_URL=redis://10.10.0.42:6379/0 (DB server)
- REDIS_URL=redis://your-redis-cloud.com:6379/0 (managed Redis)
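The setting behind these values is a single field in app/config.py; a minimal sketch, assuming pydantic-settings (the project may declare it differently):

```python
# Hedged sketch of the redis_url setting in app/config.py (pydantic-settings assumed).
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    # Default for local development; overridden by the REDIS_URL environment
    # variable in Docker Compose or production.
    redis_url: str = "redis://localhost:6379/0"


settings = Settings()
```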
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
## Appendix: test_redis_cache.py
"""
|
|
Test Redis cache integration for position ratings.
|
|
|
|
Tests the complete flow:
|
|
1. Connect to Redis
|
|
2. Fetch ratings from API (cache miss)
|
|
3. Verify cached in Redis
|
|
4. Fetch again (cache hit)
|
|
5. Measure performance improvement
|
|
6. Test cache clearing
|
|
|
|
Author: Claude
|
|
Date: 2025-11-03
|
|
Phase: 3E-Final
|
|
"""
|
|
import asyncio
|
|
import pendulum
|
|
from app.services import redis_client, position_rating_service
|
|
|
|
|
|
async def main():
|
|
print("=" * 60)
|
|
print("Redis Cache Integration Test")
|
|
print("=" * 60)
|
|
|
|
# Step 1: Connect to Redis
|
|
print("\n1. Connecting to Redis...")
|
|
try:
|
|
await redis_client.connect("redis://localhost:6379/0")
|
|
print(f" ✅ Connected to Redis")
|
|
|
|
# Test ping
|
|
is_alive = await redis_client.ping()
|
|
print(f" ✅ Redis ping: {is_alive}")
|
|
except Exception as e:
|
|
print(f" ❌ Failed to connect: {e}")
|
|
return
|
|
|
|
# Step 2: Clear any existing cache
|
|
print("\n2. Clearing existing cache...")
|
|
await position_rating_service.clear_cache()
|
|
print(" ✅ Cache cleared")
|
|
|
|
# Step 3: Fetch ratings from API (should be cache miss)
|
|
print("\n3. First fetch (API call - cache miss)...")
|
|
card_id = 8807 # Test player with 7 positions
|
|
start = pendulum.now('UTC')
|
|
|
|
ratings = await position_rating_service.get_ratings_for_card(
|
|
card_id=card_id,
|
|
league_id="pd"
|
|
)
|
|
|
|
api_duration = (pendulum.now('UTC') - start).total_seconds()
|
|
print(f" ✅ Fetched {len(ratings)} ratings in {api_duration:.4f}s")
|
|
|
|
if ratings:
|
|
print(f" 📋 Positions found:")
|
|
for rating in ratings:
|
|
print(f" - {rating.position}: range={rating.range}, error={rating.error}, innings={rating.innings}")
|
|
|
|
# Step 4: Fetch again (should be cache hit from Redis)
|
|
print("\n4. Second fetch (Redis cache hit)...")
|
|
start = pendulum.now('UTC')
|
|
|
|
cached_ratings = await position_rating_service.get_ratings_for_card(
|
|
card_id=card_id,
|
|
league_id="pd"
|
|
)
|
|
|
|
cache_duration = (pendulum.now('UTC') - start).total_seconds()
|
|
print(f" ✅ Fetched {len(cached_ratings)} ratings in {cache_duration:.6f}s")
|
|
|
|
# Step 5: Calculate performance improvement
|
|
print("\n5. Performance Comparison:")
|
|
if cache_duration > 0:
|
|
speedup = api_duration / cache_duration
|
|
print(f" API call: {api_duration:.4f}s")
|
|
print(f" Cache hit: {cache_duration:.6f}s")
|
|
print(f" ⚡ Speedup: {speedup:.0f}x faster")
|
|
else:
|
|
print(f" API call: {api_duration:.4f}s")
|
|
print(f" Cache hit: < 0.000001s")
|
|
print(f" ⚡ Speedup: > 100,000x faster")
|
|
|
|
# Step 6: Verify data matches
|
|
print("\n6. Data Integrity Check:")
|
|
if len(ratings) == len(cached_ratings):
|
|
print(f" ✅ Same number of ratings ({len(ratings)})")
|
|
|
|
# Compare each rating
|
|
matches = 0
|
|
for i, (r1, r2) in enumerate(zip(ratings, cached_ratings)):
|
|
if (r1.position == r2.position and
|
|
r1.range == r2.range and
|
|
r1.error == r2.error):
|
|
matches += 1
|
|
|
|
if matches == len(ratings):
|
|
print(f" ✅ All {matches} ratings match exactly")
|
|
else:
|
|
print(f" ⚠️ Only {matches}/{len(ratings)} ratings match")
|
|
else:
|
|
print(f" ❌ Different counts: API={len(ratings)}, Cache={len(cached_ratings)}")
|
|
|
|
# Step 7: Test single position lookup
|
|
print("\n7. Single Position Lookup:")
|
|
rating_ss = await position_rating_service.get_rating_for_position(
|
|
card_id=card_id,
|
|
position="SS",
|
|
league_id="pd"
|
|
)
|
|
|
|
if rating_ss:
|
|
print(f" ✅ Found SS rating: range={rating_ss.range}, error={rating_ss.error}")
|
|
else:
|
|
print(f" ❌ SS rating not found")
|
|
|
|
# Step 8: Test cache clearing for specific card
|
|
print("\n8. Clear cache for specific card...")
|
|
await position_rating_service.clear_cache(card_id=card_id)
|
|
print(f" ✅ Cleared cache for card {card_id}")
|
|
|
|
# Verify it's cleared (should be API call again)
|
|
print("\n9. Verify cache cleared (should be API call)...")
|
|
start = pendulum.now('UTC')
|
|
|
|
ratings_after_clear = await position_rating_service.get_ratings_for_card(
|
|
card_id=card_id,
|
|
league_id="pd"
|
|
)
|
|
|
|
duration_after_clear = (pendulum.now('UTC') - start).total_seconds()
|
|
|
|
if duration_after_clear > 0.01: # If it took more than 10ms, likely API call
|
|
print(f" ✅ Cache was cleared (took {duration_after_clear:.4f}s - API call)")
|
|
else:
|
|
print(f" ⚠️ Unexpectedly fast ({duration_after_clear:.6f}s)")
|
|
|
|
# Step 10: Disconnect
|
|
print("\n10. Disconnecting from Redis...")
|
|
await redis_client.disconnect()
|
|
print(" ✅ Disconnected")
|
|
|
|
print("\n" + "=" * 60)
|
|
print("✅ All tests completed successfully!")
|
|
print("=" * 60)
|
|
|
|
|
|
if __name__ == "__main__":
|
|
asyncio.run(main())
|
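The script is meant to be run directly against a local Redis instance (for example the Docker Compose service), e.g. `python test_redis_cache.py`; it exits early with a ❌ message if the connection fails.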