Codex-native prompts, templates, scripts, and agents that bring the neural-claude workflow to the Codex CLI. Everything is file-based, repo-local, and designed for repeatable iteration with clear state.
State lives in plans/ and .codex/.

Prompts (all under the neural.* namespace):

- Loop: neural.loop-start, neural.loop-plan, neural.loop-status, neural.loop-cancel
- Planning: neural.plan, neural.plan-execute
- Memory: neural.memory, neural.recall
- Routing and review: neural.route, neural.question, neural.pv, neural.evolve
- Research: neural.research, neural.gh-learn, neural.yt-learn
- Sync and changelog: neural.sync, neural.changelog-architect
- Todos: neural.todo-new, neural.todo-check
- Meta: neural.meta.agent, neural.meta.skill, neural.meta.prompt, neural.meta.improve, neural.meta.eval, neural.meta.brain
- Output style: neural.output-style (default/concise/table/yaml/html/genui)
- Utilities: neural.skill, neural.profile, neural.test

Project-scoped skills live in .codex/skills/.
Templates:

- plans/prd.json and plans/progress.jsonl
- expertise.template.yaml
- todo-workflow.md

Scripts:

- scripts/ralph-loop.sh and scripts/ralph-once.sh
- scripts/memory_read.py / scripts/memory_write.py
- scripts/youtube-transcript.py
- scripts/setup-global.sh / scripts/setup-project.sh

Agents:

- agents/multi-ai/AGENTS.md
- agents/dispatcher/AGENTS.md
- agents/meta-agent/AGENTS.md

Installation:

1) Run the global install from this repo:
scripts/setup-global.sh
2) Restart Codex so /prompts:neural.* are picked up.
3) In any project, run the project install:
scripts/setup-project.sh
4) Verify prompts:
/prompts:neural.loop-start
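For reference, the whole flow as one shell session (run from a clone of this repo; the project path below is a placeholder):

```bash
# 1) Global install: prompts, templates, skills, and scripts under ~/.codex/
scripts/setup-global.sh

# 2) Restart Codex so /prompts:neural.* are picked up, then seed a project.
#    --path is the documented way to target another directory.
scripts/setup-project.sh --path ~/code/my-project
```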
The Ralph loop requires flock and timeout.
macOS (Homebrew):
brew install util-linux coreutils
export PATH="/opt/homebrew/opt/util-linux/bin:/opt/homebrew/opt/coreutils/libexec/gnubin:$PATH"
Linux:
flock (util-linux) and timeout (coreutils) are usually already available in PATH.

Global install:

scripts/setup-global.sh
This installs:
- ~/.codex/neural-codex/ (prompts, templates, skills, scripts, config stub)
- ~/.codex/prompts/ (so /prompts:neural.* appear)
- ~/.codex/skills/ (optional autodiscovery)

Use --force to overwrite existing files:
scripts/setup-global.sh --force
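A quick sanity check after the global install (this assumes the installed prompt files are named neural.*; adjust the pattern if yours differ):

```bash
# Confirm the prompts were copied where Codex looks for them
ls ~/.codex/prompts | grep '^neural\.'

# Confirm the shared assets landed
ls ~/.codex/neural-codex
```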
Project install:

scripts/setup-project.sh
This seeds a project with:
- .codex/prompts/
- .codex/templates/
- .codex/skills/
- .codex/config.toml (MCP stubs)
- scripts/neural-codex/ (loop + helpers)
- plans/prd.json, plans/progress.jsonl (from templates)

Install into another path:
scripts/setup-project.sh --path /path/to/project
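To confirm the seed took, list the paths described above:

```bash
# Directories and files created by the project install
ls .codex/prompts .codex/templates .codex/skills scripts/neural-codex
cat .codex/config.toml   # MCP stubs
cat plans/prd.json       # plan seeded from the templates
```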
TEST_CMD="npm test" scripts/neural-codex/ralph-loop.sh 5
Notes:
- The plan lives in plans/prd.json.
- Progress is logged to plans/progress.jsonl.

Memory:

- Use /prompts:neural.memory to append notes to plans/progress.jsonl.
- Use /prompts:neural.recall to search the log.
- Both are backed by scripts/memory_write.py and scripts/memory_read.py.

Profiles:

Named configuration sets for different workflows. Switch with codex --profile <name>:
| Profile | Model | Approval | Use Case |
|---|---|---|---|
| default | gpt-5.2-codex | on-failure | Standard development |
| fast | gpt-4.1-mini | on-failure | Quick tasks, low cost |
| autonomous | gpt-5.2-codex | never | Ralph loop, unattended work |
| careful | gpt-5.2-codex | untrusted | Sensitive changes |
Example:
codex --profile autonomous exec "Fix the auth bug"
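Profiles live in config.toml. A minimal sketch of adding one, assuming the standard Codex profile keys (model, approval_policy) and a made-up profile name:

```bash
# Append a hypothetical "review" profile to the global Codex config
cat >> ~/.codex/config.toml <<'EOF'

[profiles.review]
model = "gpt-5.2-codex"
approval_policy = "untrusted"
EOF

# Select it per invocation
codex --profile review exec "Review the diff in HEAD"
```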
Supported MCP servers are stubbed in .codex/config.toml; see that file for the full list.
Set tokens in your shell as needed (e.g., GITHUB_PERSONAL_ACCESS_TOKEN).
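For example, to make a GitHub token available for the current shell session before launching Codex:

```bash
# Value is a placeholder; use your own token
export GITHUB_PERSONAL_ACCESS_TOKEN="<your-token>"
codex
```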
The config file supports advanced options (see .codex/config.toml):
Reference: https://developers.openai.com/codex/config-advanced/
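To see what the stub actually sets, print its non-comment lines (plain shell, nothing Codex-specific):

```bash
# Show the effective (non-comment, non-blank) lines of the project config stub
grep -v '^[[:space:]]*#' .codex/config.toml | sed '/^[[:space:]]*$/d'
```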
Repository layout:

.
├── .codex/
│ ├── prompts/
│ ├── skills/
│ ├── templates/
│ └── config.toml
├── agents/
├── plans/
├── scripts/
└── README.md
Troubleshooting:

Prompts not showing:
- Re-run scripts/setup-global.sh and restart Codex.

Ralph loop fails immediately:

- Check that flock and timeout are in PATH.
- Check that the codex CLI is installed and logged in.

Tests not running:

- Set TEST_CMD explicitly for your project.

The static site lives in docs/. Enable GitHub Pages with source branch main and folder /docs.