perf: parallelize per-pack card fetches after roll_for_cards (#99) #167

Merged
cal merged 1 commit from issue/99-perf-parallelize-per-pack-card-fetches-after-roll into main 2026-04-12 14:54:14 +00:00
Collaborator

Closes #99

Changes

Replaced the sequential for loop that fetched cards one pack at a time with asyncio.gather() in open_st_pr_packs() (helpers/main.py).

Before: 5 packs = 5 sequential db_get("cards", ...) round-trips.

After: All pack card fetches fire concurrently via a nested _fetch_pack_cards coroutine + asyncio.gather(). The pack_id default injection (card.setdefault("pack_id", p_id)) is preserved per-pack inside the coroutine. Results are flattened into all_cards with a list comprehension.
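The refactor described above can be sketched as follows. This is a minimal, self-contained reconstruction, not the actual diff: the real `db_get("cards", ...)` helper and the surrounding body of `open_st_pr_packs()` are not shown in this PR, so `db_get` here is a stand-in stub.

```python
import asyncio

# Stand-in for the project's async DB helper; the real db_get("cards", ...)
# signature is an assumption based on the PR description.
async def db_get(table, pack_id):
    await asyncio.sleep(0)  # simulate one async round-trip
    return {"cards": [{"name": f"card-{pack_id}"}]}

async def open_packs(pack_ids):
    async def _fetch_pack_cards(p_id):
        result = await db_get("cards", p_id)
        cards = result["cards"]
        for card in cards:
            # Per-pack default injection, preserved from the sequential loop.
            card.setdefault("pack_id", p_id)
        return cards

    # All fetches fire concurrently; gather returns results in input order.
    results = await asyncio.gather(*(_fetch_pack_cards(p) for p in pack_ids))
    # Flatten the per-pack lists into one list of cards.
    return [card for pack_cards in results for card in pack_cards]

cards = asyncio.run(open_packs([1, 2, 3]))
```

With five packs, the sequential version pays five round-trips back to back; here the total wait is roughly the slowest single fetch.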

Files Changed

  • helpers/main.py — lines 1764–1771 (7 insertions, 5 deletions)

Notes

No test suite covers this path. Ruff lint passes.

Claude added 1 commit 2026-04-10 16:32:04 +00:00
perf: parallelize per-pack card fetches after roll_for_cards (#99)
All checks were successful
Ruff Lint / lint (pull_request) Successful in 13s
1b885ca66f
Closes #99

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Claude added the
ai-reviewing
label 2026-04-10 16:49:43 +00:00
Claude reviewed 2026-04-10 16:50:44 +00:00
Claude left a comment
Author
Collaborator

AI Code Review

Files Reviewed

  • helpers/main.py (modified — lines 1764–1777)

Findings

Correctness

  • No issues found.
  • asyncio.gather() preserves result order matching pack_ids input order — flattened all_cards ordering is identical to the original sequential build.
  • card.setdefault("pack_id", p_id) runs inside the coroutine before return result["cards"], so all returned cards carry pack_id. Mutation is correctly scoped per-pack.
  • Error behavior is equivalent: default gather (no return_exceptions=True) propagates the first raised exception to the caller, same as the original sequential loop. Remaining tasks continue running but their results are discarded — no behavioral regression.
  • result["cards"] KeyError risk is identical to the original new_cards["cards"] — not a regression.
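The two behavioral claims above, that `gather` returns results in input order regardless of completion order, and that without `return_exceptions=True` the first raised exception propagates to the caller, can be checked with a toy example (the coroutine names here are illustrative, not from the PR):

```python
import asyncio

async def ok(i):
    # Later inputs finish first, but gather still returns input order.
    await asyncio.sleep(0.01 * (3 - i))
    return i

async def boom(i):
    raise RuntimeError(f"pack {i} failed")

async def main():
    # Ordering: results come back in argument order, not completion order.
    ordered = await asyncio.gather(ok(1), ok(2), ok(3))
    # Error propagation: the default gather re-raises the first failure.
    try:
        await asyncio.gather(ok(1), boom(2), ok(3))
        caught = None
    except RuntimeError as exc:
        caught = str(exc)
    return ordered, caught

ordered, caught = asyncio.run(main())
```

Sibling tasks in the failing gather keep running until the loop shuts down; their results are simply never collected, matching the review's note.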

Security

  • No issues found. No user input involved in the parallelized path.

Style & Conventions

  • No issues found. Pattern is consistent with PR #156 (scout opportunity asyncio.gather) and PR #157 (card embed asyncio.gather) already merged to main.
  • Inner coroutine definition is the correct approach for per-item closures with asyncio.gather.

Suggestions

  • None. The "no test coverage" note in the PR body is acceptable for a 7-line perf change on an already-untested path.

Verdict: APPROVED

Clean parallelization. Ordering, mutation, and error behavior all preserved. Follows established project patterns.

(Posted as COMMENT — Gitea blocks self-approval)


Automated review by Claude PR Reviewer

Claude added
ai-reviewed
and removed
ai-reviewing
labels 2026-04-10 16:50:55 +00:00
Claude removed the
ai-reviewed
label 2026-04-11 19:23:45 +00:00
claude-reviewer added the
ai-reviewing
label 2026-04-11 21:19:37 +00:00
claude-reviewer approved these changes 2026-04-11 21:21:02 +00:00
claude-reviewer left a comment
Collaborator

AI Code Review

Files Reviewed

  • helpers/main.py (modified — lines 1764–1777)

Findings

Correctness

No issues found.

  • asyncio is imported at line 1. ✓
  • pack_id injection via card.setdefault("pack_id", p_id) preserved inside _fetch_pack_cards coroutine. ✓
  • Flatten via [card for pack_cards in results for card in pack_cards] is correct — same card order within each pack, packs in input order (gather preserves order). ✓
  • Empty pack_ids case already guarded by if not pack_ids: raise ValueError before the gather is reached. ✓
  • if not all_cards: guard preserved post-gather. ✓
  • Exception behavior is equivalent to the original: if any db_get call raises, the exception propagates from asyncio.gather. Sibling tasks continue running to completion but their results are discarded — same net outcome as a sequential loop where the first failure exits. ✓
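One detail worth making concrete from the `setdefault` bullet above: `dict.setdefault` only fills in a missing key, so a card that already carries a `pack_id` is left untouched, while cards without one get the pack they were fetched from. A minimal illustration:

```python
p_id = 7
cards = [
    {"name": "a"},               # no pack_id: gets the default
    {"name": "b", "pack_id": 3}, # existing pack_id: preserved
]
for card in cards:
    card.setdefault("pack_id", p_id)
```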

Security

No issues found.

Style & Conventions

No issues found. Pattern matches the asyncio.gather usage already in this same function (line 1789, create_scout_opportunity parallelization from PR #156) and PR #157 (get_card_embeds parallelization).

Suggestions

  • Consider asyncio.gather(..., return_exceptions=True) + explicit re-raise if you want to log which specific pack fetch failed before propagating. Not required — the outer except + error logger in callers covers this adequately. Pure opt-in observability improvement.
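The opt-in observability suggestion above could look roughly like this. It is a sketch only: `fetch_pack` stands in for the PR's `_fetch_pack_cards`, and the logger setup is hypothetical.

```python
import asyncio
import logging

logger = logging.getLogger(__name__)

# Hypothetical per-pack fetch; stands in for _fetch_pack_cards.
async def fetch_pack(p_id):
    if p_id == 2:
        raise RuntimeError("db unavailable")
    return [{"pack_id": p_id}]

async def open_packs(pack_ids):
    # return_exceptions=True turns failures into results so we can
    # attribute each one to its pack before propagating.
    results = await asyncio.gather(
        *(fetch_pack(p) for p in pack_ids), return_exceptions=True
    )
    for p_id, res in zip(pack_ids, results):
        if isinstance(res, BaseException):
            # Log which specific pack fetch failed, then re-raise.
            logger.error("pack %s fetch failed: %s", p_id, res)
            raise res
    return [card for pack_cards in results for card in pack_cards]

try:
    asyncio.run(open_packs([1, 2, 3]))
except RuntimeError as exc:
    msg = str(exc)
```

The trade-off versus the default gather is that all fetches run to completion before the first error surfaces, which is acceptable for a handful of packs.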

Verdict: APPROVED

Clean parallel refactor. Logic is correct, pack_id injection preserved per-pack, flattening is correct, existing error guards are intact. Consistent with the gather pattern used in two adjacent PRs already merged to main.


Automated review by Claude PR Reviewer

claude-reviewer added
ai-reviewed
and removed
ai-reviewing
labels 2026-04-11 21:21:19 +00:00
cal force-pushed issue/99-perf-parallelize-per-pack-card-fetches-after-roll from 1b885ca66f to 6b475ba439 2026-04-12 14:53:41 +00:00 Compare
cal merged commit 3481291259 into main 2026-04-12 14:54:14 +00:00
cal deleted branch issue/99-perf-parallelize-per-pack-card-fetches-after-roll 2026-04-12 14:54:14 +00:00