Jul 10, 2024

Live Context in IDE Extensions, 10 New LLMs, Pieces OS Popover

Pieces Suite 3.0.0

Pieces OS 10.0.0

Pieces Suite

You may have noticed: Pieces is everywhere. We recently raised a $13.5 million Series A, authors around the world are reviewing the Pieces Suite, Pieces Copilot+ with Live Context is revolutionizing the way developers work, and we’re live on Product Hunt today!

As we welcome new users and continue to serve existing ones, we remain focused on creating the absolute best developer tool on the market. This release puts us further down that path, with massive performance improvements to Live Context, ten new LLMs, brand-new syntax highlighting, a new Pieces OS popover menu, and a whole lot more.

Support for 10 New Large Language Models

Viewing the new LLMs in the Pieces Desktop App.

Based on feedback from the community, we are launching support for ten new large language models, including Llama 3, Gemma, and Granite models from Meta, Google, and IBM, respectively. We now offer 20 LLMs (both cloud and on-device) for you to choose as your Pieces Copilot Runtime in all of our products.

If you want to use Live Context with a local model, we recommend Llama 3 8B or Granite 8B. Microsoft's Phi-3 models are also particularly efficient at this time.

Our 10 new models launched in this release are:

  • Llama 3 8B (On-Device)

  • Phi-3 Mini 4K (On-Device)

  • Phi-3 Mini 128K (On-Device)

  • Granite 3B (On-Device)

  • Granite 8B (On-Device)

  • Gemma 1.1 2B (On-Device)

  • Gemma 1.1 7B (On-Device)

  • Code Gemma 1.1 7B (On-Device)

  • Gemini 1.5 Pro (Cloud)

  • Gemini 1.5 Flash (Cloud)

Intelligent CPU vs GPU Local LLM Routing

In the past, using a local large language model came with the overhead of deciding which runtime was right for your machine: either a CPU-based or a GPU-based model. Our team has been working hard to remove this burden, and Pieces can now detect which model will work best on your machine. Download any On-Device LLM and rest assured that we’ll automatically optimize it for your workspace.
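
For intuition, here’s a minimal sketch of what hardware-aware routing boils down to. This is not Pieces’ actual detection logic; the probe and the model-naming scheme below are illustrative placeholders only.

```python
# Hypothetical sketch of CPU-vs-GPU model routing (not Pieces' implementation).
# The idea: probe the machine once, then pick the model build that matches.
import shutil


def pick_local_runtime(model_name: str) -> str:
    """Return the variant of a local model best suited to this machine."""
    # Crude probe: treat the presence of NVIDIA driver tooling as "has a GPU".
    has_gpu = shutil.which("nvidia-smi") is not None
    return f"{model_name}-{'gpu' if has_gpu else 'cpu'}"


print(pick_local_runtime("llama-3-8b"))
# e.g. "llama-3-8b-gpu" on a machine with an NVIDIA GPU, "llama-3-8b-cpu" otherwise
```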

Your Code, Beautified: Brand-New Syntax Highlighting

A snippet in the Pieces Desktop App with the old syntax highlighter, and the same snippet newly highlighted.

As developers ourselves, we know how important syntax highlighting is. We want the code you save in Pieces to look and feel exactly as you’ve come to expect from your main IDE. We’re thrilled to be launching a big UX upgrade in the Pieces Desktop App and Pieces Copilot today!

In the past, your snippets’ syntax highlighting may have been a little lackluster. This release updates our syntax highlighting system across all of our products to make it easier for you to read and understand your code. The new update is compatible with all 40+ of our supported languages and will improve your experience with Pieces.

Live Context in the IDE Extensions

Using Pieces Copilot+ with Live Context in VS Code.

The team has been grinding to bring you a consistent, polished, and beautiful copilot experience throughout the Pieces Suite. Starting now, the Pieces Copilot Chat experience you’ve come to know and love in the Pieces Desktop App is available in JetBrains, VS Code, and Visual Studio.

This means you’ll enjoy the same Pieces Copilot experience across every Pieces product, with a unified UI and, most importantly, direct access to Live Context within your IDE. Be sure to install one or all of these plugins, check it out, and let us know what you think!

New Pieces OS Popover

The new Pieces OS menu with additional functionality.

Pieces OS powers the entire Pieces Suite, but it does so largely invisibly. We’re bringing more of Pieces OS’ functionality to the forefront with the updated Pieces OS popover menu.

From your taskbar, you can now view important Pieces OS information and take actions that affect every Pieces application and extension in your workstream: check for updates, optimize your memory usage, see whether the Workstream Pattern Engine is on or off, and more. We’re excited to introduce even more Pieces OS actions through this menu in future releases.

General Enhancements, Bug Fixes, and Performance Improvements

As always, this release includes too many enhancements, bug fixes, and performance improvements to mention. A few of the most important updates:

  • You may have previously experienced some bugs when selecting text and code in the desktop app; this is now fixed!

  • The Workstream Pattern Engine is vastly improved: it runs more quickly and smoothly on all operating systems, with reduced CPU usage, lower energy consumption, and less memory usage when active, and it supports a wider selection of LLMs.

  • In the Pieces Web Extensions, we’ve fixed a couple of bugs and made updates that improve the UX, such as launching the side panel when you click the extension’s popup icon.

  • We’ve expanded the input length for Pieces Copilot chats, so you can add much longer messages whenever you need.

  • No more duplicate suggested prompts in Copilot Chats

  • Easily create new Copilot Chats with the Power Menu when in Focus Mode

  • Better navigation between the Global Search & Copilot View

  • Improved folder & file handling when associating Copilot Context

User Support

If you need help, check out our GitHub repo, where you can create issues to get assistance from us and other users, join discussions to request features, show off something you’ve built with Pieces, and generally engage with the rest of the Pieces community.

As always, you can reach out to us for individual assistance by filling out this quick form. Don’t forget to check out our extensive documentation as well!

Pieces ❤️ Open Source

Our team has been hard at work improving the developer experience of building with Pieces SDKs. We’ve created copilot wrappers around the Python and TypeScript Pieces SDKs that provide a more user-friendly interface to the underlying Pieces OS Client SDKs, so it’s now a whole lot easier to add a conversational copilot to your app.
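
As a rough sketch of what building on a wrapper can look like, here is a minimal conversational loop. The package name pieces_copilot_sdk and the PiecesClient / prompt interface below are hypothetical placeholders, not the published API; check the SDK READMEs on GitHub for real usage.

```python
# Illustrative sketch only: `pieces_copilot_sdk`, `PiecesClient`, and `prompt`
# are hypothetical placeholder names, not the exact published API.
from pieces_copilot_sdk import PiecesClient  # hypothetical wrapper import


def main() -> None:
    client = PiecesClient()  # assumed to connect to the local Pieces OS instance
    while True:
        question = input("You: ").strip()
        if not question or question.lower() in {"exit", "quit"}:
            break
        answer = client.prompt(question)  # ask the copilot a single question
        print(f"Copilot: {answer}")


if __name__ == "__main__":
    main()
```

However the real method names end up looking, the goal of the wrappers is the same: keep conversation handling and the underlying Pieces OS Client calls behind one small, friendly client object.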

Check out our GitHub to learn more about our Open Source initiatives and how you can start contributing today!

Join our Discord Server 🎉

Do you love Pieces? Stop sending us carrier pigeons 🐦 and join our Discord Server to chat with our team and other power users, get support, and more.
