Your AI connector for Goose
Goose is an incredible CLI-based AI assistant, but by default, it only sees what’s in front of it. What if your assistant could remember what you did yesterday, reference old PRs, or reuse your previous snippets?
That’s where Pieces comes in.
Pieces is the free AI connector for Goose, enabling deep context and long-term memory directly inside your CLI or desktop chat session.
Powered by the Model Context Protocol (MCP), this integration allows Goose to surface highly personalized answers, grounded in your real development history.
Whether you’re on macOS, Windows, or Linux, this connection unlocks an entirely new layer of productivity.
Why use Pieces as your AI connector for Goose?
Most AI tools offer you suggestions. Pieces offers you memory.
Bring long-term context to every prompt
Once integrated, Goose can access your full development memory, not just your current project. That includes:
Past code you’ve worked on
Bug fixes from teammates
Prompts and completions you’ve used before
Reusable snippets across projects
Instead of retyping everything or hunting through git logs, Goose + Pieces helps you recall and reuse context effortlessly.
Fully local, deeply personalized
Pieces works offline-first
No user data is ever shared without opt-in
Supports secure sharing and optional cloud sync
CLI-native experience with no external dependencies
This makes Pieces a smart, privacy-respecting AI connector for Goose, perfect for developers who prefer transparency and control.
Set up your AI connector for Goose
Step 1: Install PiecesOS
This is the foundation of your long-term memory.
Compatible with macOS (Ventura+), Windows 10+, and Linux
Download PiecesOS
Or install via Snap (Linux):
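A minimal Snap install might look like this (the package name pieces-os is an assumption; verify it against the Snap store listing before running):

```shell
# Install PiecesOS via Snap (package name assumed; confirm on the Snap store)
sudo snap install pieces-os

# Launch it once so it can initialize in the background
pieces-os
```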
Open the PiecesOS Quick Menu and enable the Long-Term Memory Engine (LTM-2.5) to start capturing context.
Step 2: Get your SSE endpoint
You’ll need to pass Goose your current MCP endpoint. You can find this in:
PiecesOS Quick Menu → Model Context Protocol tab
Pieces Desktop App → Settings → MCP
Example:
http://localhost:39300/model_context_protocol/2024-11-05/sse
Adjust the port number if necessary.
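Before configuring Goose, you can sanity-check that the endpoint is live by streaming it directly with curl (the port below is the default from the example above; substitute yours):

```shell
# Stream SSE events from the local PiecesOS MCP server; press Ctrl+C to stop.
# -N disables output buffering so events appear as they arrive.
curl -N http://localhost:39300/model_context_protocol/2024-11-05/sse
```

If the connection is refused, PiecesOS is not running or is on a different port.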
Step 3: Configure Goose
Open your terminal
Run: goose configure
Choose Add Extension → Remote Extension
Name it (e.g. Pieces)
Paste your SSE endpoint
Set a tool timeout (e.g. 300)
Skip optional variables unless needed
You should see:
Added Pieces Extension
To verify:
goose list extensions
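If you prefer editing configuration by hand, goose configure writes the extension into Goose's YAML config file. The path and exact schema below are assumptions that may vary by Goose version; treat this as a sketch of what the resulting entry looks like, not a guaranteed format:

```yaml
# ~/.config/goose/config.yaml (location may differ by platform and version)
extensions:
  pieces:              # name chosen during configuration
    enabled: true
    type: sse          # remote extension over Server-Sent Events
    uri: http://localhost:39300/model_context_protocol/2024-11-05/sse
    timeout: 300       # tool timeout in seconds, as set above
```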
If you're using Goose on macOS and see a PATH error, run:
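One common fix is adding Goose's install directory to your PATH. The directory below is an assumption (a frequent default for user-installed CLIs); substitute wherever the goose binary actually lives, and use your own shell's profile file:

```shell
# Add the assumed install directory to PATH for the current shell
export PATH="$HOME/.local/bin:$PATH"

# Persist it for future sessions (zsh is the macOS default shell)
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.zshrc
```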
Now, you can launch Goose from anywhere.
How to use Goose AI with Pieces MCP
Once connected, Goose becomes far more intelligent because it now remembers what you’ve done.
Start a conversation
goose
You’ll see the Goose CLI open with your configured tools, including Pieces.
Try a real-world prompt
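For example, you might ask something like the following (the wording is purely illustrative; any natural-language question about your recent work will do):

```
What did I work on yesterday? Summarize the projects, files, and snippets I touched.
```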
If LTM is enabled and data has been captured, you’ll get a complete memory-based report.
Manage, toggle, or remove the extension
To remove or toggle the Pieces integration:
Run goose configure
Navigate to Toggle Extensions to enable/disable
Navigate to Remove Extensions to delete it completely
Always confirm via space + return (macOS) or enter (Windows/Linux)
Troubleshooting
Check the port: your SSE endpoint may vary between machines and installs
If anything fails, restart PiecesOS and Goose, then reconfigure with the correct port
If you see JSON errors or Goose replies "Sorry, I can't do this," your MCP server is likely not active
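A quick way to confirm PiecesOS is actually listening (the port 39300 is assumed from the earlier example; substitute yours):

```shell
# Show which process is bound to the assumed MCP port;
# no output means nothing is listening there.
lsof -i :39300
```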
Why developers choose Pieces as their AI connector for Goose
CLI-native experience, no switching windows
Personal memory via Pieces Long-Term Memory
Built-in snippet recall across sessions
Works on macOS, Windows 10/11, and Linux
Fast, local, and secure, ideal for private workflows
Optional cloud sync and sharing on your terms
A powerful Goose AI alternative when used as a contextual layer
Try Pieces with Goose today
If you're exploring Goose AI, looking to improve its memory, or need a smarter way to prompt from the terminal, this is the connection that brings your workflow history into the conversation.