Compare commits


41 Commits

Author SHA1 Message Date
Cal Corum
d43927258a fix: remove import-time derived globals in retrosheet_data.py (#14)
Closes #14

Five globals (MIN_PA_VL, MIN_PA_VR, MIN_TBF_VL, MIN_TBF_VR, CARDSET_ID)
were derived from PLAYER_DESCRIPTION at module load time, creating a
hidden ordering dependency: their values were baked in at import, so
once the CLI overrode PLAYER_DESCRIPTION, any caller relying on the
derived relationship got silently stale values. The CLI explicitly sets
all of them anyway, so replacing the derived expressions with scalar
defaults makes the module self-contained and safe.

Also collapses the dead LAST_WEEK_RATIO ternary (both branches were 0.0).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-23 07:37:46 -05:00
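The import-time hazard this commit removes can be sketched in a few lines; the threshold value below is invented for illustration, only the pattern mirrors the commit:

```python
# Sketch of the import-time hazard. The 50/100 values are hypothetical;
# only the pattern matches retrosheet_data.py.
PLAYER_DESCRIPTION = "Live"
MIN_PA_VL = 50 if PLAYER_DESCRIPTION == "Live" else 100  # baked in at import

def override_description(desc):
    # The CLI later mutates PLAYER_DESCRIPTION, but MIN_PA_VL keeps its
    # import-time value: the derived relationship goes silently stale.
    global PLAYER_DESCRIPTION
    PLAYER_DESCRIPTION = desc

override_description("Historical")
assert PLAYER_DESCRIPTION == "Historical"
assert MIN_PA_VL == 50  # still the "Live"-derived value
```

Replacing the derived expression with a plain `MIN_PA_VL = 50` default removes the stale pairing entirely, since the CLI assigns both names explicitly.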
cal
fd142c27d2 Merge pull request 'fix: replace wildcard import from db_calls_card_creation (#13)' (#34) from ai/paper-dynasty-card-creation-13 into main 2026-03-23 12:37:29 +00:00
Cal Corum
df6e96bc76 fix: replace wildcard import from db_calls_card_creation (#13)
Closes #13

Replace `from db_calls_card_creation import *` with an explicit
`from db_calls_card_creation import PitcherData`. Only PitcherData
is referenced in creation_helpers.py; the wildcard was also
pulling in all Peewee ORM internals via a transitive
`from peewee import *`, polluting the namespace.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-23 07:37:16 -05:00
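Before narrowing a wildcard import like this one, it helps to audit exactly which names `import *` would bind; the helper below is a generic sketch, with the stdlib `json` module standing in for db_calls_card_creation:

```python
# Audit what `from <module> import *` would bind, before replacing it
# with an explicit import list. `json` is a stand-in module here.
import importlib

def wildcard_names(module_name):
    mod = importlib.import_module(module_name)
    if hasattr(mod, "__all__"):  # wildcard imports honor __all__
        return sorted(mod.__all__)
    return sorted(n for n in dir(mod) if not n.startswith("_"))

names = wildcard_names("json")
assert "loads" in names and "dumps" in names
```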
cal
dd42f35674 Merge pull request 'fix: use logger.exception() in calculate_pitcher_ratings error handler' (#33) from ai/paper-dynasty-card-creation#17 into main 2026-03-23 12:35:47 +00:00
Cal Corum
9e48616274 fix: use logger.exception() in calculate_pitcher_ratings error handler
Replaces logger.error() with logger.exception() so the full stack trace
is captured when a pitcher card fails to generate, making it possible to
diagnose the root cause rather than just seeing the error message.

Closes #17

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-23 07:35:31 -05:00
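The difference is easy to demonstrate: inside an `except` block, `logger.exception()` logs at ERROR level *and* appends the active traceback, while `logger.error()` records only the message. A minimal stand-alone demonstration (the failing call is a stand-in for card generation):

```python
# logger.error vs logger.exception: only the latter captures the trace.
import io
import logging

stream = io.StringIO()
logging.basicConfig(stream=stream, level=logging.ERROR, force=True)
logger = logging.getLogger("cards")

try:
    int("verlaju01")  # stand-in for a card-generation failure
except ValueError:
    logger.exception("pitcher card failed")  # message + full stack trace

out = stream.getvalue()
assert "Traceback" in out and "ValueError" in out
```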
cal
deaa43432b Merge pull request 'fix: correct Dict[str, any] to Dict[str, Any] in type annotations (#15)' (#31) from ai/paper-dynasty-card-creation-15 into main 2026-03-23 12:12:28 +00:00
cal
3fd07b6d89 Merge branch 'main' into ai/paper-dynasty-card-creation-15 2026-03-23 12:12:18 +00:00
cal
55f2eda888 Merge pull request 'chore: pin peewee and polars to exact versions (#24)' (#32) from ai/paper-dynasty-card-creation-24 into main 2026-03-23 12:12:07 +00:00
cal
a432d37850 Merge branch 'main' into ai/paper-dynasty-card-creation-24 2026-03-23 12:11:56 +00:00
dde163e2fb Merge pull request 'fix: narrow swallowed exception in get_pitching_peripherals() (#10)' (#35) from ai/paper-dynasty-card-creation#10 into main 2026-03-23 03:53:18 +00:00
f485241dd7 Merge branch 'main' into ai/paper-dynasty-card-creation#10 2026-03-23 03:53:10 +00:00
6d0497431f Merge pull request 'fix: remove dead LAST_WEEK_RATIO ternary — both branches are 0.0 (#19)' (#45) from ai/paper-dynasty-card-creation-19 into main 2026-03-23 03:52:58 +00:00
f5cb72cc26 Merge branch 'main' into ai/paper-dynasty-card-creation-19 2026-03-23 03:52:52 +00:00
f67d111a66 Merge pull request 'fix: remove test_positions_df non-test that always passes (#16)' (#43) from ai/paper-dynasty-card-creation-16 into main 2026-03-23 03:52:48 +00:00
230f3e79ce Merge branch 'main' into ai/paper-dynasty-card-creation-16 2026-03-23 03:52:41 +00:00
ecc62a0521 Merge pull request 'fix: correct get_of() opposite-field direction for switch hitters' (#40) from ai/paper-dynasty-card-creation#5 into main 2026-03-23 03:52:38 +00:00
992feba79e Merge branch 'main' into ai/paper-dynasty-card-creation#5 2026-03-23 03:52:32 +00:00
57c379a8e0 Merge branch 'main' into ai/paper-dynasty-card-creation#10 2026-03-23 03:52:23 +00:00
e413fd5cc8 Merge pull request 'fix: return default 8 on XBT% parse error in running() (#8)' (#37) from ai/paper-dynasty-card-creation#8 into main 2026-03-23 03:52:19 +00:00
6a6767f5d8 Merge branch 'main' into ai/paper-dynasty-card-creation#8 2026-03-23 03:52:13 +00:00
2b955dd8f7 Merge pull request 'fix: resolve unreachable duplicate elif 'DO*' branch in result_string() (#6)' (#39) from ai/paper-dynasty-card-creation#6 into main 2026-03-23 03:51:33 +00:00
0e66ff71e7 Merge branch 'main' into ai/paper-dynasty-card-creation-19 2026-03-23 03:51:06 +00:00
b55820eec8 Merge branch 'main' into ai/paper-dynasty-card-creation-16 2026-03-23 03:51:01 +00:00
b4a3e4b865 Merge branch 'main' into ai/paper-dynasty-card-creation#5 2026-03-23 03:50:56 +00:00
bb546c6ded Merge branch 'main' into ai/paper-dynasty-card-creation#10 2026-03-23 03:50:51 +00:00
5c7c613813 Merge branch 'main' into ai/paper-dynasty-card-creation#8 2026-03-23 03:50:47 +00:00
cbfcba5e26 Merge branch 'main' into ai/paper-dynasty-card-creation#6 2026-03-23 03:50:40 +00:00
006b48e60f Merge pull request 'fix: use player_id instead of key_bbref in create_pit_position() (#7)' (#38) from ai/paper-dynasty-card-creation#7 into main 2026-03-23 03:50:38 +00:00
5e135ff554 Merge branch 'main' into ai/paper-dynasty-card-creation#7 2026-03-23 03:50:35 +00:00
602151fb16 Merge pull request 'Remove hardcoded secrets, load API token from env' (#29) from fix/2-3-security-hardcoded-secrets into main 2026-03-23 03:50:07 +00:00
6c20f93901 Merge branch 'main' into fix/2-3-security-hardcoded-secrets 2026-03-23 03:50:00 +00:00
Cal Corum
937620e2e9 fix: remove dead LAST_WEEK_RATIO ternary — both branches are 0.0 (#19)
Closes #19

The conditional `0.0 if PLAYER_DESCRIPTION == 'Live' else 0.0` is dead
code: both branches evaluate to the same value. Simplified to a direct
assignment.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-21 07:03:38 -05:00
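The deadness is mechanical, as a two-line check shows (reproducing the removed expression verbatim):

```python
def last_week_ratio_before(player_description):
    # the removed ternary: the condition is dead, both arms identical
    return 0.0 if player_description == "Live" else 0.0

LAST_WEEK_RATIO = 0.0  # the simplified direct assignment

assert last_week_ratio_before("Live") == LAST_WEEK_RATIO
assert last_week_ratio_before("Historical") == LAST_WEEK_RATIO
```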
Cal Corum
5b8d027d46 fix: remove test_positions_df non-test that always passes (#16)
Closes #16

Deleted test_positions_df which called an async function synchronously
(returning a coroutine, not a DataFrame) and asserted True == True.
Zero coverage. Also removed the now-unused pd_positions_df import.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-21 02:02:10 -05:00
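The pitfall behind the deleted test: calling an async function without awaiting it yields a coroutine object, not its result, so no production code ever ran. Sketch with an illustrative stub in place of the real pd_positions_df:

```python
# Calling an async function without `await` returns a coroutine object.
import asyncio
import inspect

async def pd_positions_df(cardset_id):  # illustrative stub
    return {"cardset": cardset_id}

result = pd_positions_df(19)        # no await: just a coroutine object
assert inspect.iscoroutine(result)
result.close()                      # avoid an un-awaited coroutine warning

# A real test would drive the coroutine to completion:
assert asyncio.run(pd_positions_df(19)) == {"cardset": 19}
```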
Cal Corum
a2e374cd4f fix: correct get_of() opposite-field direction for switch hitters
Switch hitters batting vs LHP hit right-handed (pull=lf, oppo=rf).
Switch hitters batting vs RHP hit left-handed (pull=rf, oppo=lf).
Copy-paste error had both pull_side branches returning the same value.

Closes #5

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-21 00:02:01 -05:00
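The corrected branch, reconstructed from the commit message (non-switch cases are omitted because the fix doesn't touch them):

```python
# Switch hitters bat with the hand opposite the pitcher's.
def get_of(batter_hand, pitcher_hand, pull_side=True):
    if batter_hand == "S":
        if pitcher_hand == "L":               # bats right-handed
            return "lf" if pull_side else "rf"
        return "rf" if pull_side else "lf"    # bats left-handed vs RHP
    return None  # placeholder for the unchanged non-switch branches

assert get_of("S", "L", pull_side=True) == "lf"
assert get_of("S", "L", pull_side=False) == "rf"
assert get_of("S", "R", pull_side=True) == "rf"
assert get_of("S", "R", pull_side=False) == "lf"
```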
Cal Corum
b52c5418db fix: resolve unreachable duplicate elif 'DO*' branch in result_string() (#6)
The second `elif "DO*" in data_string` was dead code — the first always
matched, so `spaces -= 2` for the DO** variant was silently skipped.
Fix: check "DO**" first (spaces -= 2), then "DO*" (spaces -= 1).

Closes #6

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-20 23:33:06 -05:00
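The fixed ordering isolated to its essence: test the longer token first, because `"DO*" in s` is also true for every string containing "DO**". The base spaces value here is illustrative, not the real one:

```python
# Longest-token-first dispatch; base value of 10 is illustrative.
def space_adjust(data_string, spaces=10):
    if "DO**" in data_string:
        spaces -= 2
    elif "DO*" in data_string:  # now only the single-star variant
        spaces -= 1
    return spaces

assert space_adjust("8-DO**") == 8  # previously unreachable adjustment
assert space_adjust("8-DO*") == 9
```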
Cal Corum
1d96223c78 fix: use player_id instead of key_bbref in create_pit_position() (#7)
Closes #7

The fallback branch of create_pit_position() used `int(df_data["key_bbref"])`
which always raises ValueError for string IDs like 'verlaju01'. The exception
was silently swallowed, causing pitchers without defensive stats to receive no
position record at all.

Fix: use `int(float(df_data["player_id"]))` to match the pattern used in
create_pitching_card() in the same file.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-20 23:03:43 -05:00
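The failure mode is reproducible with a plain dict standing in for the DataFrame row (the player_id value below is hypothetical):

```python
# bbref keys are alphanumeric strings; player_id is numeric, possibly
# float-formatted. Row values are illustrative.
row = {"key_bbref": "verlaju01", "player_id": "12345.0"}

raised = False
try:
    int(row["key_bbref"])           # old code: always ValueError
except ValueError:
    raised = True
assert raised                       # the silently swallowed failure

assert int(float(row["player_id"])) == 12345  # the committed pattern
```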
Cal Corum
8e24b4e686 fix: return default 8 on XBT% parse error in running() (#8)
Closes #8

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-20 21:32:14 -05:00
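A sketch of running() after the fix, reconstructed from the diff hunk later in this compare (the original handler catches bare Exception and logs; the narrower tuple here is a simplification):

```python
# On a parse failure, return the floor rating 8 directly instead of
# continuing with a bogus xb_pct.
def running(extra_base_pct):
    try:
        xb_pct = float(extra_base_pct.strip("%")) / 80
    except (ValueError, AttributeError):  # simplified from `except Exception`
        return 8
    return max(min(round(6 + (10 * xb_pct)), 17), 8)

assert running("not a number") == 8  # parse error: default rating
assert running("40%") == 11          # 6 + 10 * (40 / 80) = 11
```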
Cal Corum
46fdde3d02 fix: narrow swallowed exception in get_pitching_peripherals() (#10)
Closes #10

Replace `except Exception: pass` with `except KeyError: pass` so only
the expected missing-attribute case (`cell["data-append-csv"]` not
present) is silently skipped. Network errors, encoding issues, and
other unexpected exceptions will now propagate instead of being hidden.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-20 20:02:55 -05:00
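The narrowing pattern in miniature: catch only the failure you expect and let everything else propagate. A plain dict stands in for the BeautifulSoup cell here; on a real Tag, subscripting a missing attribute raises KeyError the same way:

```python
# Catch only the expected missing-attribute case; other errors propagate.
def extract_id(cell):
    try:
        return cell["data-append-csv"]
    except KeyError:
        return None  # expected: attribute simply absent on this cell

assert extract_id({"data-append-csv": "verlaju01"}) == "verlaju01"
assert extract_id({}) is None
```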
Cal Corum
09cb942435 chore: pin peewee and polars to exact versions (#24)
Closes #24

Pins the two unpinned dependencies in requirements.txt:
- peewee (unversioned → 3.19.0)
- polars (unversioned → 1.36.1)

All other dependencies were already pinned with ==.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-20 16:33:00 -05:00
Cal Corum
b39d3283fd fix: correct Dict[str, any] to Dict[str, Any] in type annotations (#15)
Closes #15

`any` (lowercase) refers to the builtin function, not `typing.Any`.
Added `Any` to the `typing` imports in both files and updated the
`cardset` parameter annotation accordingly.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-20 16:03:45 -05:00
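The distinction can be checked directly: lowercase `any` is the builtin function, while `typing.Any` is the wildcard type the annotation intends. The signature below is abbreviated from the real function:

```python
# `any` (builtin) vs `typing.Any` (wildcard type) in an annotation.
from typing import Any, Dict, get_type_hints

def post_player_updates(cardset: Dict[str, Any]) -> None:
    pass  # body elided; only the annotation matters here

hints = get_type_hints(post_player_updates)
assert hints["cardset"] == Dict[str, Any]
assert any is not Any  # the builtin and the typing wildcard differ
```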
Cal Corum
e3220bf337 Remove hardcoded secrets, load API token from environment
- Replace hardcoded PD API bearer token in db_calls.py with dotenv/env var
- Delete scripts/supabase_doodling.py (dead scratch file with hardcoded Supabase JWT)
- Add python-dotenv dependency and .env.example template
- Consolidate check_prod_missing_ratings.py to import AUTH_TOKEN from db_calls
- Hard fail if PD_API_TOKEN is missing to prevent silent auth failures

Fixes #2, Fixes #3

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-20 12:38:01 -05:00
11 changed files with 80 additions and 143 deletions


@@ -1,8 +1,14 @@
import asyncio
import sys
from pathlib import Path
import aiohttp
import pandas as pd
AUTH_TOKEN = {"Authorization": "Bearer Tp3aO3jhYve5NJF1IqOmJTmk"}
# Add project root so we can import db_calls
sys.path.insert(0, str(Path(__file__).resolve().parents[2]))
from db_calls import AUTH_TOKEN
PROD_URL = "https://pd.manticorum.com/api"

.env.example (new file)

@@ -0,0 +1,2 @@
# Paper Dynasty API
PD_API_TOKEN=your-bearer-token-here


@@ -573,7 +573,7 @@ def stealing_line(steal_data: dict):
else:
good_jump = "2-12"
return f'{"*" if sd[2] else ""}{good_jump}/- ({sd[1] if sd[1] else "-"}-{sd[0] if sd[0] else "-"})'
return f"{'*' if sd[2] else ''}{good_jump}/- ({sd[1] if sd[1] else '-'}-{sd[0] if sd[0] else '-'})"
def running(extra_base_pct: str):
@@ -583,7 +583,7 @@ def running(extra_base_pct: str):
xb_pct = float(extra_base_pct.strip("%")) / 80
except Exception as e:
logger.error(f"calcs_batter running - {e}")
xb_pct = 20
return 8
return max(min(round(6 + (10 * xb_pct)), 17), 8)
@@ -693,11 +693,11 @@ def get_batter_ratings(df_data) -> List[dict]:
logger.debug(
f"all on base: {vl.hbp + vl.walk + vl.total_hits()} / all chances: {vl.total_chances()}"
f'{"*******ERROR ABOVE*******" if vl.hbp + vl.walk + vl.total_hits() != vl.total_chances() else ""}'
f"{'*******ERROR ABOVE*******' if vl.hbp + vl.walk + vl.total_hits() != vl.total_chances() else ''}"
)
logger.debug(
f"all on base: {vr.hbp + vr.walk + vr.total_hits()} / all chances: {vr.total_chances()}"
f'{"*******ERROR ABOVE*******" if vr.hbp + vr.walk + vr.total_hits() != vr.total_chances() else ""}'
f"{'*******ERROR ABOVE*******' if vr.hbp + vr.walk + vr.total_hits() != vr.total_chances() else ''}"
)
vl.calculate_strikeouts(df_data["SO_vL"], df_data["AB_vL"], df_data["H_vL"])


@@ -3,7 +3,7 @@ import urllib.parse
import pandas as pd
import numpy as np
from typing import Dict
from typing import Any, Dict
from creation_helpers import (
get_all_pybaseball_ids,
sanitize_name,
@@ -158,8 +158,8 @@ async def create_new_players(
{
"p_name": f"{f_name} {l_name}",
"cost": NEW_PLAYER_COST,
"image": f'{card_base_url}/{df_data["player_id"]}/battingcard'
f'{urllib.parse.quote("?d=")}{release_dir}',
"image": f"{card_base_url}/{df_data['player_id']}/battingcard"
f"{urllib.parse.quote('?d=')}{release_dir}",
"mlbclub": CLUB_LIST[df_data["Tm_vL"]],
"franchise": FRANCHISE_LIST[df_data["Tm_vL"]],
"cardset_id": cardset["id"],
@@ -302,7 +302,7 @@ async def calculate_batting_ratings(offense_stats: pd.DataFrame, to_post: bool):
async def post_player_updates(
cardset: Dict[str, any],
cardset: Dict[str, Any],
card_base_url: str,
release_dir: str,
player_desc: str,
@@ -432,8 +432,8 @@ async def post_player_updates(
[
(
"image",
f'{card_base_url}/{df_data["player_id"]}/battingcard'
f'{urllib.parse.quote("?d=")}{release_dir}',
f"{card_base_url}/{df_data['player_id']}/battingcard"
f"{urllib.parse.quote('?d=')}{release_dir}",
)
]
)


@@ -10,7 +10,7 @@ import requests
import time
from db_calls import db_get
from db_calls_card_creation import *
from db_calls_card_creation import PitcherData
from bs4 import BeautifulSoup
# Card Creation Constants
@@ -533,7 +533,7 @@ def get_pitching_peripherals(season: int):
row_data.append(player_id)
if len(headers) == 0:
col_names.append("key_bbref")
except Exception:
except KeyError:
pass
row_data.append(cell.text)
if len(headers) == 0:
@@ -595,21 +595,21 @@ def legal_splits(tot_chances):
def result_string(tba_data, row_num, split_min=None, split_max=None):
bold1 = f'{"<b>" if tba_data["bold"] else ""}'
bold2 = f'{"</b>" if tba_data["bold"] else ""}'
row_string = f'{"<b> </b>" if int(row_num) < 10 else ""}{row_num}'
bold1 = f"{'<b>' if tba_data['bold'] else ''}"
bold2 = f"{'</b>' if tba_data['bold'] else ''}"
row_string = f"{'<b> </b>' if int(row_num) < 10 else ''}{row_num}"
if TESTING:
print(
f'adding {tba_data["string"]} to row {row_num} / '
f"adding {tba_data['string']} to row {row_num} / "
f"split_min: {split_min} / split_max: {split_max}"
)
# No splits; standard result
if not split_min:
return f'{bold1}{row_string}-{tba_data["string"]}{bold2}'
return f"{bold1}{row_string}-{tba_data['string']}{bold2}"
# With splits
split_nums = f'{split_min if split_min != 20 else ""}{"-" if split_min != 20 else ""}{split_max}'
split_nums = f"{split_min if split_min != 20 else ''}{'-' if split_min != 20 else ''}{split_max}"
data_string = (
tba_data["sm-string"] if "sm-string" in tba_data.keys() else tba_data["string"]
)
@@ -618,10 +618,10 @@ def result_string(tba_data, row_num, split_min=None, split_max=None):
spaces -= 3
elif "SI**" in data_string:
spaces += 1
elif "DO**" in data_string:
spaces -= 2
elif "DO*" in data_string:
spaces -= 1
elif "DO*" in data_string:
spaces -= 2
elif "so" in data_string:
spaces += 3
elif "gb" in data_string:
@@ -638,41 +638,39 @@
row_output = "<b> </b>"
if TESTING:
print(f"row_output: {row_output}")
return f'{bold1}{row_output}{data_string}{" " * spaces}{split_nums}{bold2}'
return f"{bold1}{row_output}{data_string}{' ' * spaces}{split_nums}{bold2}"
def result_data(
tba_data, row_num, tba_data_bottom=None, top_split_max=None, fatigue=False
):
ret_data = {}
top_bold1 = f'{"<b>" if tba_data["bold"] else ""}'
top_bold2 = f'{"</b>" if tba_data["bold"] else ""}'
top_bold1 = f"{'<b>' if tba_data['bold'] else ''}"
top_bold2 = f"{'</b>' if tba_data['bold'] else ''}"
bot_bold1 = None
bot_bold2 = None
if tba_data_bottom:
bot_bold1 = f'{"<b>" if tba_data_bottom["bold"] else ""}'
bot_bold2 = f'{"</b>" if tba_data_bottom["bold"] else ""}'
bot_bold1 = f"{'<b>' if tba_data_bottom['bold'] else ''}"
bot_bold2 = f"{'</b>' if tba_data_bottom['bold'] else ''}"
if tba_data_bottom is None:
ret_data["2d6"] = f"{top_bold1}{int(row_num)}-{top_bold2}"
ret_data["splits"] = f"{top_bold1}{top_bold2}"
ret_data["result"] = (
f"{top_bold1}"
f'{tba_data["string"]}{"" if fatigue else ""}'
f"{top_bold2}"
f"{top_bold1}{tba_data['string']}{'' if fatigue else ''}{top_bold2}"
)
else:
ret_data["2d6"] = f"{top_bold1}{int(row_num)}-{top_bold2}\n"
ret_data["splits"] = (
f'{top_bold1}1{"-" if top_split_max != 1 else ""}'
f'{top_split_max if top_split_max != 1 else ""}{top_bold2}\n'
f'{bot_bold1}{top_split_max+1}{"-20" if top_split_max != 19 else ""}{bot_bold2}'
f"{top_bold1}1{'-' if top_split_max != 1 else ''}"
f"{top_split_max if top_split_max != 1 else ''}{top_bold2}\n"
f"{bot_bold1}{top_split_max + 1}{'-20' if top_split_max != 19 else ''}{bot_bold2}"
)
ret_data["result"] = (
f'{top_bold1}{tba_data["sm-string"] if "sm-string" in tba_data.keys() else tba_data["string"]}'
f"{top_bold1}{tba_data['sm-string'] if 'sm-string' in tba_data.keys() else tba_data['string']}"
f"{top_bold2}\n"
f"{bot_bold1}"
f'{tba_data_bottom["sm-string"] if "sm-string" in tba_data_bottom.keys() else tba_data_bottom["string"]}'
f"{tba_data_bottom['sm-string'] if 'sm-string' in tba_data_bottom.keys() else tba_data_bottom['string']}"
f"{bot_bold2}"
)
@@ -688,9 +686,9 @@ def get_of(batter_hand, pitcher_hand, pull_side=True):
if batter_hand == "S":
if pitcher_hand == "L":
return "rf" if pull_side else "rf"
return "lf" if pull_side else "rf"
else:
return "lf" if pull_side else "lf"
return "rf" if pull_side else "lf"
def get_col(col_num):
@@ -729,7 +727,7 @@ def get_position_string(all_pos: list, inc_p: bool):
for x in all_pos:
if x.position == "OF":
of_arm = f'{"+" if "-" not in x.arm else ""}{x.arm}'
of_arm = f"{'+' if '-' not in x.arm else ''}{x.arm}"
of_error = x.error
of_innings = x.innings
elif x.position == "CF":
@@ -744,7 +742,7 @@
elif x.position == "C":
all_def.append(
(
f'c-{x.range}({"+" if int(x.arm) >= 0 else ""}{x.arm}) e{x.error} T-{x.overthrow}(pb-{x.pb})',
f"c-{x.range}({'+' if int(x.arm) >= 0 else ''}{x.arm}) e{x.error} T-{x.overthrow}(pb-{x.pb})",
x.innings,
)
)
@@ -1079,7 +1077,7 @@ def mlbteam_and_franchise(mlbam_playerid):
p_data["franchise"] = normalize_franchise(data["currentTeam"]["name"])
else:
logger.error(
f'Could not set team for {mlbam_playerid}; received {data["currentTeam"]["name"]}'
f"Could not set team for {mlbam_playerid}; received {data['currentTeam']['name']}"
)
else:
logger.error(
@@ -1222,5 +1220,5 @@ def get_hand(df_data):
else:
return "R"
except Exception:
logger.error(f'Error in get_hand for {df_data["Name"]}')
logger.error(f"Error in get_hand for {df_data['Name']}")
return "R"


@@ -1,10 +1,18 @@
import os
import aiohttp
import pybaseball as pb
from dotenv import load_dotenv
from typing import Literal
from exceptions import logger
AUTH_TOKEN = {"Authorization": "Bearer Tp3aO3jhYve5NJF1IqOmJTmk"}
load_dotenv()
_token = os.environ.get("PD_API_TOKEN")
if not _token:
raise EnvironmentError("PD_API_TOKEN environment variable is required")
AUTH_TOKEN = {"Authorization": f"Bearer {_token}"}
DB_URL = "https://pd.manticorum.com/api"
master_debug = True
alt_database = None
@@ -25,7 +33,7 @@ def param_char(other_params):
def get_req_url(
endpoint: str, api_ver: int = 2, object_id: int = None, params: list = None
):
req_url = f'{DB_URL}/v{api_ver}/{endpoint}{"/" if object_id is not None else ""}{object_id if object_id is not None else ""}'
req_url = f"{DB_URL}/v{api_ver}/{endpoint}{'/' if object_id is not None else ''}{object_id if object_id is not None else ''}"
if params:
other_params = False
@@ -39,11 +47,11 @@
def log_return_value(log_string: str):
if master_debug:
logger.info(
f'return: {log_string[:1200]}{" [ S N I P P E D ]" if len(log_string) > 1200 else ""}\n'
f"return: {log_string[:1200]}{' [ S N I P P E D ]' if len(log_string) > 1200 else ''}\n"
)
else:
logger.debug(
f'return: {log_string[:1200]}{" [ S N I P P E D ]" if len(log_string) > 1200 else ""}\n'
f"return: {log_string[:1200]}{' [ S N I P P E D ]' if len(log_string) > 1200 else ''}\n"
)
@@ -183,4 +191,4 @@
def player_desc(this_player) -> str:
if this_player["p_name"] in this_player["description"]:
return this_player["description"]
return f'{this_player["description"]} {this_player["p_name"]}'
return f"{this_player['description']} {this_player['p_name']}"


@@ -1,7 +1,7 @@
import datetime
import urllib.parse
import pandas as pd
from typing import Dict
from typing import Any, Dict
from creation_helpers import (
get_all_pybaseball_ids,
@@ -196,8 +196,8 @@ async def create_new_players(
{
"p_name": f"{f_name} {l_name}",
"cost": NEW_PLAYER_COST,
"image": f'{card_base_url}/{df_data["player_id"]}/'
f'pitchingcard{urllib.parse.quote("?d=")}{release_dir}',
"image": f"{card_base_url}/{df_data['player_id']}/"
f"pitchingcard{urllib.parse.quote('?d=')}{release_dir}",
"mlbclub": CLUB_LIST[df_data["Tm_vL"]],
"franchise": FRANCHISE_LIST[df_data["Tm_vL"]],
"cardset_id": cardset["id"],
@@ -268,7 +268,7 @@ async def calculate_pitching_cards(
def create_pitching_card(df_data):
logger.info(
f'Creating pitching card for {df_data["name_first"]} {df_data["name_last"]} / fg ID: {df_data["key_fangraphs"]}'
f"Creating pitching card for {df_data['name_first']} {df_data['name_last']} / fg ID: {df_data['key_fangraphs']}"
)
pow_data = cde.pow_ratings(
float(df_data["Inn_def"]), df_data["GS"], df_data["G"]
@@ -298,11 +298,13 @@ async def calculate_pitching_cards(
int(df_data["GF"]), int(df_data["SV"]), int(df_data["G"])
),
"hand": df_data["pitch_hand"],
"batting": f'#1W{df_data["pitch_hand"]}-C',
"batting": f"#1W{df_data['pitch_hand']}-C",
}
)
except Exception as e:
logger.error(f'Skipping fg ID {df_data["key_fangraphs"]} due to: {e}')
except Exception:
logger.exception(
f"Skipping fg ID {df_data['key_fangraphs']} due to exception"
)
print("Calculating pitching cards...")
pitching_stats.apply(create_pitching_card, axis=1)
@@ -333,7 +335,7 @@ async def create_position(
def create_pit_position(df_data):
if df_data["key_bbref"] in df_p.index:
logger.debug(f'Running P stats for {df_data["p_name"]}')
logger.debug(f"Running P stats for {df_data['p_name']}")
pit_positions.append(
{
"player_id": int(df_data["player_id"]),
@@ -355,7 +357,7 @@ async def create_position(
try:
pit_positions.append(
{
"player_id": int(df_data["key_bbref"]),
"player_id": int(float(df_data["player_id"])),
"position": "P",
"innings": 1,
"range": 5,
@@ -364,7 +366,7 @@
)
except Exception:
logger.error(
f'Could not create pitcher position for {df_data["key_bbref"]}'
f"Could not create pitcher position for {df_data['key_bbref']}"
)
print("Calculating pitcher fielding lines now...")
@@ -386,7 +388,7 @@ async def calculate_pitcher_ratings(pitching_stats: pd.DataFrame, post_pitchers:
pitching_ratings.extend(cpi.get_pitcher_ratings(df_data))
except Exception:
logger.error(
f'Could not create a pitching card for {df_data["key_fangraphs"]}'
f"Could not create a pitching card for {df_data['key_fangraphs']}"
)
print("Calculating card ratings...")
@@ -400,7 +402,7 @@
async def post_player_updates(
cardset: Dict[str, any],
cardset: Dict[str, Any],
player_description: str,
card_base_url: str,
release_dir: str,
@@ -525,8 +527,8 @@ async def post_player_updates(
[
(
"image",
f'{card_base_url}/{df_data["player_id"]}/pitchingcard'
f'{urllib.parse.quote("?d=")}{release_dir}',
f"{card_base_url}/{df_data['player_id']}/pitchingcard"
f"{urllib.parse.quote('?d=')}{release_dir}",
)
]
)


@@ -23,6 +23,8 @@ dependencies = [
"pydantic>=2.9.0",
# AWS
"boto3>=1.35.0",
# Environment
"python-dotenv>=1.0.0",
# Scraping
"beautifulsoup4>=4.12.0",
"lxml>=5.0.0",


@@ -23,9 +23,9 @@ multidict==6.1.0
numpy==2.1.2
packaging==24.1
pandas==2.2.3
peewee
peewee==3.19.0
pillow==11.0.0
polars
polars==1.36.1
pluggy==1.5.0
propcache==0.2.0
# pyarrow==17.0.0


@@ -1,75 +0,0 @@
from typing import Literal
import requests
from exceptions import logger, log_exception
AUTH_TOKEN = {
"Authorization": "Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6ImNucGhwbnV2aGp2cXprY2J3emRrIiwicm9sZSI6InNlcnZpY2Vfcm9sZSIsImlhdCI6MTc0NTgxMTc4NCwiZXhwIjoyMDYxMzg3Nzg0fQ.7dG_y2zU2PajBwTD8vut5GcWf3CSaZePkYW_hMf0fVg",
"apikey": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6ImNucGhwbnV2aGp2cXprY2J3emRrIiwicm9sZSI6InNlcnZpY2Vfcm9sZSIsImlhdCI6MTc0NTgxMTc4NCwiZXhwIjoyMDYxMzg3Nzg0fQ.7dG_y2zU2PajBwTD8vut5GcWf3CSaZePkYW_hMf0fVg",
}
DB_URL = "https://cnphpnuvhjvqzkcbwzdk.supabase.co/rest/v1"
def get_req_url(endpoint: str, params: list = None):
req_url = f"{DB_URL}/{endpoint}?"
if params:
other_params = False
for x in params:
req_url += f'{"&" if other_params else "?"}{x[0]}={x[1]}'
other_params = True
return req_url
def log_return_value(log_string: str, log_type: Literal["info", "debug"]):
if log_type == "info":
logger.info(
f'return: {log_string[:1200]}{" [ S N I P P E D ]" if len(log_string) > 1200 else ""}\n'
)
else:
logger.debug(
f'return: {log_string[:1200]}{" [ S N I P P E D ]" if len(log_string) > 1200 else ""}\n'
)
def db_get(
endpoint: str,
params: dict = None,
limit: int = 1000,
offset: int = 0,
none_okay: bool = True,
timeout: int = 3,
):
req_url = f"{DB_URL}/{endpoint}?limit={limit}&offset={offset}"
logger.info(f"HTTP GET: {req_url}, params: {params}")
response = requests.request("GET", req_url, params=params, headers=AUTH_TOKEN)
logger.info(response)
if response.status_code != requests.codes.ok:
log_exception(Exception, response.text)
data = response.json()
if isinstance(data, list) and len(data) == 0:
if none_okay:
return None
else:
log_exception(Exception, "Query returned no results and none_okay = False")
return data
# async with aiohttp.ClientSession(headers=AUTH_TOKEN) as session:
# async with session.get(req_url) as r:
# logger.info(f'session info: {r}')
# if r.status == 200:
# js = await r.json()
# log_return_value(f'{js}')
# return js
# elif none_okay:
# e = await r.text()
# logger.error(e)
# return None
# else:
# e = await r.text()
# logger.error(e)
# raise ValueError(f'DB: {e}')


@@ -1,10 +1,4 @@
from creation_helpers import pd_positions_df, mround, sanitize_chance_output
def test_positions_df():
cardset_19_pos = pd_positions_df(19)
assert True == True
from creation_helpers import mround, sanitize_chance_output
def test_mround():