
LlamaIndex Newsletter 2025-05-20
Hi there, Loyal Llama Listeners! 🦙
Welcome to this week's edition of the LlamaIndex newsletter! We're thrilled to share some exciting updates, including our new Memory API, enhancements to LlamaParse, and the introduction of Tig, a terminal-based AI coding agent.
In-Person Workshop in NYC: Join LlamaIndex CEO Jerry Liu for a workshop on AI in finance, featuring top thinkers and builders. Limited spots available, register here.
Office hours on Discord: You can also join us on Thursday, May 22nd at 8am Pacific/5pm CET for an event-driven agent workflows run-through and live coding session. Join the event on Discord to get notified when it starts!
🤩 The Highlights:
- Big Memory Upgrade in LlamaIndex: Our new flexible Memory API combines short-term chat history and long-term memory with plug-and-play blocks, including StaticMemoryBlock, FactExtractionMemoryBlock, and VectorMemoryBlock. Learn more in our docs.
- Exciting Updates to LlamaParse: We’ve streamlined the interface for a faster user experience, added a Code Snippet button for easy configuration copying, and introduced new presets for various use cases. Get started with LlamaCloud here.
- Citations and Reasoning in LlamaExtract: Discover how to implement citations and reasoning in your LlamaExtract agents with our latest code walkthrough. Watch the full video here.
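The plug-and-play idea behind the new Memory API is that short-term chat history and long-term memory blocks are composed into one context. A minimal stand-alone sketch of that pattern in plain Python (the class names below are illustrative analogues, not the actual LlamaIndex API; see the docs for the real `Memory` class and blocks):

```python
# Illustrative sketch of the plug-and-play memory-block idea. The classes
# here are hypothetical stand-ins for StaticMemoryBlock, FactExtraction-
# MemoryBlock, etc.; they are NOT the LlamaIndex API.
from dataclasses import dataclass, field

@dataclass
class StaticBlock:
    """Always injects the same fixed context (cf. StaticMemoryBlock)."""
    content: str
    def recall(self, query: str) -> str:
        return self.content

@dataclass
class FactBlock:
    """Accumulates extracted facts over time (cf. FactExtractionMemoryBlock)."""
    facts: list = field(default_factory=list)
    def add_fact(self, fact: str) -> None:
        self.facts.append(fact)
    def recall(self, query: str) -> str:
        return "; ".join(self.facts)

@dataclass
class ChatMemory:
    """Combines a short-term history window with pluggable long-term blocks."""
    blocks: list
    history: list = field(default_factory=list)
    def add_message(self, role: str, text: str) -> None:
        self.history.append((role, text))
    def build_context(self, query: str) -> str:
        long_term = [b.recall(query) for b in self.blocks]
        recent = [f"{r}: {t}" for r, t in self.history[-4:]]  # short-term window
        return "\n".join(long_term + recent)

facts = FactBlock()
facts.add_fact("User prefers Python")
mem = ChatMemory(blocks=[StaticBlock("You are a helpful assistant."), facts])
mem.add_message("user", "Hi!")
print(mem.build_context("anything"))
```

The real API additionally handles token limits and vector retrieval (via `VectorMemoryBlock`); the sketch only shows how independent blocks contribute to one assembled context.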
✍️ Community:
- Introducing Tig, the Terminal-Based AI Coding Agent: Built with LlamaIndex workflows, Tig can write, debug, and analyze code across multiple languages. Check out Tig on GitHub and learn how to create your own agentic applications with Workflows here.
- Event-Driven Agent Workflows Walkthrough: Learn how to build a multi-agent Docs Assistant that writes web pages and uses orchestrators for search and aggregation. Check out the full code and guide here.
- Memory Component for AI Agents: Improve your agents' memory with our new Memory component, allowing for context-aware conversations and long-term recall. Read the intro blog here.
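The event-driven pattern behind these workflows routes typed events between steps: each step consumes one event type and emits the next until a stop event ends the run. A minimal stand-alone sketch of that control flow (illustrative only; the real `llama_index` Workflows API uses `@step`-decorated methods on a `Workflow` class):

```python
# Minimal sketch of the event-driven workflow pattern. The classes and the
# dispatch table below are illustrative, not the llama_index Workflows API.
from dataclasses import dataclass

@dataclass
class StartEvent:
    topic: str

@dataclass
class SearchEvent:
    results: list

@dataclass
class StopEvent:
    page: str

class DocsAssistant:
    """Each method handles one event type, mimicking @step-decorated handlers."""
    def search(self, ev: StartEvent) -> SearchEvent:
        # A real agent would call a search tool or orchestrator here.
        return SearchEvent(results=[f"doc about {ev.topic}"])

    def write_page(self, ev: SearchEvent) -> StopEvent:
        # Aggregate search results into a page.
        return StopEvent(page="\n".join(ev.results))

    def run(self, topic: str) -> str:
        ev = StartEvent(topic=topic)
        while not isinstance(ev, StopEvent):
            # Dispatch on event type until a StopEvent is produced.
            handler = {StartEvent: self.search, SearchEvent: self.write_page}[type(ev)]
            ev = handler(ev)
        return ev.page

print(DocsAssistant().run("workflows"))
```

Because steps are connected only by the event types they accept and emit, new steps (e.g. an aggregation orchestrator) can be inserted without rewiring the rest of the workflow.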
Thank you for being part of the LlamaIndex community! We can’t wait to see what you build with these new features. Stay tuned for more updates, and as always, feel free to reach out with any questions or feedback.
Happy coding! 🚀