Conversation
- Add a lean dev entrypoint that uses temporary cache paths and cleans up on exit
- Add targeted and full cleanup scripts for reproducible local artifacts
- Make Vite/Tauri dev port and Vite cache path configurable for safe local overrides
- Document normal vs lean dev flow and cleanup tradeoffs in README

Tests: npm run lint && npm run build
- Install global git and performance baseline payload
- Add codex-os manifest and verification contracts
- Align repo with global policies under /Users/d/.codex/policies

Tests: not run (bootstrap scaffolding only)
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 1a3af60352
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
```yaml
        with:
          node-version: 20
          cache: pnpm
      - run: pnpm install --frozen-lockfile
```
Use lockfile-compatible install mode in perf workflows
This step will fail in this repository: the tree contains package-lock.json but no pnpm-lock.yaml, and pnpm install --frozen-lockfile aborts with ERR_PNPM_NO_LOCKFILE, so the perf jobs stop before any metric collection. The same command path appears in both .github/workflows/perf-foundation.yml and .github/workflows/perf-enforced.yml, and pnpm install --help documents --frozen-lockfile as a fail-fast mode that never generates a lockfile.
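One lockfile-compatible variant of the quoted step is sketched below. This assumes the repo keeps its existing package-lock.json; the alternative would be committing a pnpm-lock.yaml and retaining --frozen-lockfile.

```yaml
        with:
          node-version: 20
          cache: npm          # cache keyed to the committed package-lock.json
      - run: npm ci           # lockfile-strict install for npm lockfiles
```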
```js
  process.exit(2);
}

const ratio = (c - b) / b;
```
Guard against zero baseline before ratio regression check
The regression ratio is computed as (c - b) / b without handling b === 0, and this commit seeds baselines at zero, so the first non-zero measurement produces Infinity/non-finite output and fails the threshold gate immediately. In practice this makes enforced perf checks fail as soon as PERF_PROFILE=production is enabled, even when teams are only trying to establish an initial baseline.
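A zero-safe version of the ratio could look like the sketch below. The function name and the `baseline`/`current` parameters are illustrative, not the repo's actual identifiers; returning `null` to signal "still seeding the baseline" is one possible convention.

```javascript
// Hypothetical guard around the regression ratio from the diff above.
// A zero (or non-finite) baseline means no real measurement exists yet,
// so we skip the ratio instead of emitting Infinity/NaN.
function regressionRatio(baseline, current) {
  if (!Number.isFinite(baseline) || baseline === 0) {
    // Seeding phase: caller should record `current` as the new baseline
    // rather than gate on a threshold.
    return null;
  }
  return (current - baseline) / baseline;
}
```

With this guard, the first non-zero measurement against a seeded-at-zero baseline no longer trips the threshold gate; the enforced check only compares once a finite, non-zero baseline exists.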
What
Why
Testing
Risk / Notes
.codex/bootstrap-conflicts/*.newfiles and merge intentionally