The next big LLM trends in 2025 to watch
Discover the next LLM trends set to shape 2025, from advanced AI applications to industry innovations.
It feels like most products we see nowadays have some sort of AI/LLM integration. From tools that write code to apps that teach languages, GenAI is everywhere.
Not only that, but improvements in AI are happening every day.
Especially if you look at things like the 12 Days of OpenAI or the recent releases from Google (for example, Veo 2 and Imagen 3), it feels like there's so much going on that it's almost impossible to keep up.
Nonetheless, I’m going to try to introduce some current AI trends and possible future trends, so get ready!
LLMs are being used in exciting new ways
Remember back in the day when we used ChatGPT to generate some simple code or help us rewrite articles or text?
Well, now you can do so much more. Models like GPT-4 with vision can actually look at images, interpret them, and base their responses on what they see. And we have models like Google Imagen 3, Grok, Sora, and Veo 2 for image and video generation.
I actually just got access to Sora today, and was able to make some short videos of cats:
Some videos were better than others… 😅
There are also some other very cool use cases for LLMs, for example:
DeepScribe uses LLMs to transcribe and analyze medical records, which saves doctors a lot of time (and hopefully, in the future, will save patients money as well, but who knows 🤷♀️).
Legal tools like Harvey AI help lawyers search and summarize large volumes of case files.
Writing tools like Sudowrite and Claude (which you can access through Pieces for free 😉) give writers smart suggestions and can generate text in their preferred tone.
Pieces improves developer productivity by having access to your workflow history via Long-Term Memory. This means that if you forget something important, you don’t have to worry, Pieces will remember it!
Pretty amazing, right? But that’s not all. Let’s talk about how local models and privacy are shaping the future.
Navigating local models and privacy trends
You can now run AI models locally on your own computer. Put simply, that means your data never leaves your device for some cloud provider, so you can feel safe using it even for sensitive work.
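To make this concrete, here's a minimal sketch of talking to a locally hosted model. It assumes you're running something like Ollama, which serves models on a default localhost port; the model name and prompt are just placeholders.

```python
import json
import urllib.request

def build_local_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a request for a model served locally by Ollama.

    Because the server runs on localhost, the prompt never
    leaves your machine.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_local_request("llama2", "Summarize this patient note in one sentence.")
# The request targets localhost only — no data goes to a cloud provider.
print(req.full_url)  # prints http://localhost:11434/api/generate
```

Sending the request (with `urllib.request.urlopen`) would return the model's response as JSON, but everything stays on your machine end to end.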
Plus, privacy isn't only a concern for individual users; it's a huge concern for large organizations, which may be responsible for the data of millions of users.
As Yang et al. (2023) point out in their comprehensive survey, even major tech companies have struggled with data protection.
In 2018, 52.5 million Google+ users had their accounts exposed due to an API bug, and in 2019, hundreds of millions of Facebook users had their private data, including phone numbers and names, exposed.
That’s why it’s important to remember what Uncle Ben said to Peter Parker: with great power comes great responsibility.
If a company suffers a data breach, it not only hurts its own reputation but can also harm the lives of many others, depending on what kind of data was accessed.
Smaller models, big impact
For a long time, bigger models dominated the AI space. But now, smaller models are catching up.
Models like Llama 2 7B show that we don’t always need hundreds of billions of parameters to get the job done. They come close to the performance of massive models while requiring far less computing power and costing far less to run.
This shift is partly thanks to techniques like LoRA (Low-Rank Adaptation), which lets developers fine-tune models by training only a small set of extra weights, without the need for massive infrastructure.
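The core idea behind LoRA can be shown in a few lines of NumPy: instead of updating a full weight matrix, you train two small low-rank factors and add their product to the frozen weights. This is just a toy illustration with arbitrary dimensions, not how any particular library implements it.

```python
import numpy as np

# LoRA in a nutshell: leave the pretrained weight matrix W (d x k) frozen,
# and train only two small factors B (d x r) and A (r x k), with r << d, k.
# The effective weights are W_eff = W + (alpha / r) * B @ A.
d, k, r, alpha = 1024, 1024, 8, 16

rng = np.random.default_rng(0)
W = rng.standard_normal((d, k))         # frozen pretrained weights
B = np.zeros((d, r))                    # starts at zero, so W_eff == W initially
A = rng.standard_normal((r, k)) * 0.01  # trainable low-rank factor

W_eff = W + (alpha / r) * (B @ A)

full_params = d * k            # what a full fine-tune would update
lora_params = d * r + r * k    # what LoRA actually trains
print(f"full fine-tune params: {full_params:,}")
print(f"LoRA params:           {lora_params:,} "
      f"({100 * lora_params / full_params:.1f}% of full)")
```

With these dimensions, LoRA trains about 1.6% of the parameters a full fine-tune would touch, which is exactly why it makes fine-tuning feasible on modest hardware.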
For companies, this means better performance without spending loads of $$ – making AI more accessible to everyone.
AI as Your Copilot
Like we’ve already talked about, LLMs aren't just chatbots anymore.
Now they can collaborate with you on a whole other level. For example, our CEO Tsavo Knott likes to talk about the future of AI, more specifically the future of Pieces being like Jarvis from Iron Man.
Instead of being something you chat with when you have an issue, AI could be your copilot or your super smart, personal assistant. What if there was an AI tool that knew everything about you and your workflow?
That’s our goal with Pieces, creating an AI that acts as a true copilot.
Speaking of tools like Pieces, you might think AI copilots are just for developers, but that is far from the truth.
Just as an example, people learning new languages use apps like Duolingo, which has an AI tutor built in. In that case, AI acts as your learning copilot, or teacher.
The possibilities and applications of AI copilots are endless. And I personally can't wait to see what other types of copilots end up coming out over the next few years.
Challenges that still exist
Of course, AI isn’t perfect. LLMs sometimes reflect biases from their training data, and when they don’t know an answer, they can “hallucinate,” generating responses that sound convincing but are simply wrong.
Power usage is another major problem. Even as models get smaller and more efficient, running LLMs still requires significant computing resources.
For teams or individuals with limited hardware, this can be a real struggle.
What’s next for LLMs?
The future is all about smarter, more context-aware AI.
Features like long-term memory (already part of tools like Pieces 😉) allow tools to remember previous work and context. Instead of starting from scratch every time you begin a new chat, these AI tools know what you were working on, and when.
For instance, with Pieces you could ask “What was I working on recently?” (maybe you’ve had a long weekend and want to pick up where you left off on Friday):
Not only that, but we are also seeing AI tools work together.
Imagine combining a language model like GPT-4 with a visual model like Imagen 3: one understands your instructions, and the other generates images or graphics based on what you said.
On another note, open-source models like Llama give developers more freedom to build AI-powered projects without being held back by the cost of paid APIs like OpenAI’s.
I don’t know about you, but personally, I am super excited for the day when I can chat (or better yet, NOT chat) with Jarvis (or Pieces) at any time, whether it's ordering what I am missing from my refrigerator or helping me debug my code.
What are you looking forward to the most when it comes to AI trends or advancements in AI? Make sure to fill out the State of AI Tools 2024 survey, or tag us on X (@getpieces) to let us know!
This article was first published on October 7th, 2024, and was updated on January 3rd, 2025, to improve your experience and share the latest information.