Everything a dev community should know about using Pieces in Cursor
Combine Pieces and Cursor for a powerful AI assistant with Long-Term Memory and agentic workflows to boost productivity
One of the best things about Pieces is that you can access it from whatever tool you are using, so you don't have to jump back and forth between applications.
In this post, I look at how you can use Pieces inside Cursor, the AI code editor, mixing their agentic code workflow with Pieces Long-Term Memory and other Pieces features.
What is Cursor?
Cursor describes itself as “the AI code editor”. It’s an all-in-one IDE that runs on Windows, Linux, and macOS, with an agentic workflow – essentially mixing an AI chat with AI agents that update multiple code files for you, based on the AI’s understanding of what code changes you need.
For example, you can ask the Cursor Composer about a change you want in a base class, and have it show you all the changes it wants to make in all the derived classes, then one-click apply those changes across all files.
As with any AI code updates, Cursor should still be considered a copilot; you need to know enough about the code base to review its changes, as they can often be wrong.
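As a hedged illustration (the type and method names here are made up, not from any real project), adding a method to a Go interface is exactly this kind of change: every implementing type has to be updated too, and an agentic tool can propose all of those edits across files at once.

```go
package main

import "fmt"

// Report is the shared interface. Adding Validate() here means every
// implementing type below must also gain a Validate method - the sort
// of multi-file change an agentic workflow can apply in one go.
type Report interface {
	Render() string
	Validate() error // newly added method
}

// DailyReport is one of several (hypothetical) implementations that need updating.
type DailyReport struct{ Rows int }

func (d DailyReport) Render() string { return fmt.Sprintf("%d rows", d.Rows) }

// Validate is the edit the agent would propose in each implementing file.
func (d DailyReport) Validate() error {
	if d.Rows < 0 {
		return fmt.Errorf("negative row count")
	}
	return nil
}

func main() {
	var r Report = DailyReport{Rows: 3}
	fmt.Println(r.Render(), r.Validate() == nil)
}
```

The value of the one-click apply is that the compiler would otherwise flag every type that no longer satisfies the interface, and you would fix each file by hand.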
Cursor is built around the open-source core of VS Code.
VS Code is sort of open source – the core of the product is open source, but the actual VS Code product itself is not, as it contains some closed-source bits, like copyrighted images, some private telemetry code, and things like that.
It’s probably about 90% open source, and 10% closed source, with enough open source that you can literally build and run a fully-featured IDE with just the open source component.
There are a number of developer tool companies that have leveraged this open-source core to build out their own IDEs, one of these being Cursor (others include Windsurf).
Apart from a small layout change for the activity bar, and the secondary side bar being replaced by an AI pane, it looks and feels just like VS Code.
How to use Pieces from Cursor
At Pieces we are all about giving you a long-term memory to help you as a developer be more productive.
We also want you to have access to that memory in an AI coding assistant wherever you are so you don’t have to constantly context switch and lose focus and productivity.
This is why we have built a range of extensions and plugins to integrate Pieces wherever you are, including a VS Code extension that can be used from Cursor!
Because Cursor is built on VS Code, it is fully compatible with all VS Code extensions.
To use Pieces from Cursor, search for Pieces in the extensions tab and select Install, the same as if you were installing it in VS Code.
This then adds both the Pieces Copilot, and the Pieces Drive Explorer to your side bar.
You can also pin them to always be at the top of the primary side bar if you wish.
This gives you the Pieces experience you are used to – the code lens to comment or explain code, context menu options to ask the copilot about files or folders or code, and so on.
What does Pieces bring to Cursor?
Why do Pieces and Cursor work so well together? Let’s look at a few different ways.
Get answers on everything you have been doing with Long-Term Memory
The biggest limitation of LLMs is the lack of information they have to use to give you the answer you need.
Tools like Cursor will add your current project to the context sent to the LLM, and chats have access to everything discussed in that single chat, but that’s it.
One of the key features of Pieces is the Long-Term Memory, the OS-level context captured from everything you are doing across all activities on your developer machine.
By having Pieces inside Cursor, you can mix conversations with this Long-Term Memory with conversations against your code.
When would you need Long-Term Memory?
Cursor is great at helping you fix a bug, but it can’t help you remember, first thing in the morning, which bug you were assigned the day before.
It can’t use the context of a conversation you had with colleagues about the best approach to a fix – one that is consistent with your existing code base and coding standards, or that aligns with the other systems and applications in your company.
Implementing a ticket powered by Cursor and Pieces
Imagine this scenario – you are working for a large financial services enterprise, building a reporting tool that sends a file to a regulator every night with details of stock trades.
Pretty standard stuff in the fintech world – I’ve built many such services.
You have a ticket assigned to you on a Friday afternoon to add a user's account number to each row.
You jump in a Teams chat with your colleagues to get guidance, and they point you to an accounts microservice that can provide you with the information.
Monday morning comes around, and you want to quickly get up to speed with what you need to do. You fire up Cursor, open the Pieces copilot, and start asking it questions.
“What was the ticket I was looking at last week?”.
Pieces responds with the details, as well as a link to the ticket. “How should I implement this based on the conversations with my team?”.
Pieces responds with a summary of the conversations, showing you the API specification, and details of the accounts microservice, and so on.
You leverage the Long-Term Memory to get the details you need, and you can then feed these to Cursor to let its agentic workflow make code changes for you.
Without this additional context, there is no way Cursor could help – it doesn’t know about the account service or its API as it only has access to the reporting microservice project.
By using Pieces to get the context you need, and giving this to the Cursor Composer, you have sped up your workflow and avoided the constant context switching from Cursor to the ticket, to Teams, and the inevitable loss of productivity as you dive into the rest of your unread chats, email, cat pictures in the #pets channel, and so on.
Ask the copilot about files or folders outside of the current workspace
Most IDE-based AI tools are very heavily focused on the current workspace – the project or folder of code that you have open in the IDE.
This is great when you have a narrow focus, but many times you need to reference multiple projects at the same time.
In the example above, you have the reporting microservice open in your IDE but need to refer to the code of the account microservice to understand the API (if only everyone understood the beauty of a good OpenAPI spec…).
Pieces is different – with Pieces you can choose the context you want for every chat, you are not limited just by what is open in the IDE.
One option is to have two IDEs open and constantly switch between the two.
Been there, done that – very distracting, especially if you are working with a single screen. Another option is to have one project open in your IDE, and the other as context for your copilot chat in Pieces.
You can open the reporting microservice in Cursor, and start a new copilot chat in Pieces with the accounts microservice folder as the context of the chat.
You can then ask questions about the API, and use the results to guide Cursor to create code to connect. No context switching, everything all in one place.
Save and use code snippets
The Pieces Drive is where you can save and manage code snippets, and have them enriched by AI, or manipulated as you need.
You can access these saved snippets from anywhere that you can use Pieces, including from inside Cursor.
This means that as folks share code with you, you can access this code from Cursor, using the Pieces search to find what you need.
Again, back to our reporting microservice.
Someone from the accounts microservice team has provided you with an example of calling their API in Go, but you are writing this service in C#.
No problem, you save the Go code to Pieces, then duplicate and translate it into C# from the Pieces Explorer.
From there you can either add this translated snippet directly to your code or bring it into a copilot chat.
Use different LLMs to get a combination of answers
If you’ve spent any time with LLMs, you’ll know that different LLMs give different qualities of answers.
It not only varies by LLM but also can vary on a day-to-day basis as the LLM providers push out new updates.
As we all should know, these AIs are copilots – designed to guide and assist you, not replace you, as the code they write can be questionable.
If the AI produces code that you are not happy with, then it can be good to ask another AI the same question and compare.
By having the Pieces Copilot open with the Cursor chat you can compare the responses from different LLMs at the same time, or use Pieces to access LLMs that Cursor doesn’t provide access to.
Some developers swear by Claude 3.5 Sonnet, some like GPT-4o, others prefer Gemini, which isn’t available in Cursor.
Different LLMs were trained on data from different points in time, so have access to more or less information than others.
By using Pieces you can expand the range of LLMs, and experiment with the same prompt and context in two LLMs at once, one in Pieces, the other in Cursor.
Use the AI offline
Finally, one of my favorite features of Pieces is the ability to select a local model. This is great when you are travelling and the airplane WiFi is spotty, or in the middle of nowhere.
It’s also useful if you are a student in the developing world, where internet access may be unavailable or too expensive outside of your college.
When you have good internet, you download a local model, then you can work offline.
Cursor relies on an internet connection.
No internet and the chat and composer won’t work. In these situations, you can still use the Pieces copilot. Select a local model and continue your chats!
Conclusion
Pieces and Cursor are two different and complementary tools.
You can combine the strengths of both to get an amazing experience as a developer.
Mix the Long-Term Memory of Pieces with the agentic workflows of Cursor to get a powerful AI assistant that helps you boost your developer productivity.
How do you use Pieces? Do you use the desktop app, use it inside your IDE, or combine it with a tool like Cursor? Let me know on X, Bluesky, LinkedIn, or our Discord. And don’t forget to try Pieces!