
Agentic Long-Term Memory for LLMs — Why Not to Rely on LangChain or Letta
In this video, we dive deep into agentic long-term memory for LLM-based applications. We’ll start from the basics and work our way up to designing advanced memory systems. You'll learn how to build a custom chatbot with agentic memory from scratch, and explore two leading frameworks in this space: Letta and LangChain.
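To give a feel for the "basic memory" approach covered in the early chapters, here is a minimal sketch of a chatbot that persists its conversation history in SQLite and replays it on every call. This is an illustration only, not the repo's code; the model name, database file, and table schema are assumptions.

```python
# Minimal sketch: a chatbot with basic (conversation-buffer) memory.
# Assumptions: the `openai` Python package (>=1.0) is installed, the
# OPENAI_API_KEY environment variable is set, and "gpt-4o-mini" plus the
# SQLite file/table names are illustrative choices, not the repo's.
import sqlite3
from openai import OpenAI

client = OpenAI()
db = sqlite3.connect("chat_history.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS messages (id INTEGER PRIMARY KEY, role TEXT, content TEXT)"
)

def load_history(limit: int = 20) -> list[dict]:
    """Return the most recent turns in chronological order."""
    rows = db.execute(
        "SELECT role, content FROM messages ORDER BY id DESC LIMIT ?", (limit,)
    ).fetchall()
    return [{"role": r, "content": c} for r, c in reversed(rows)]

def save_message(role: str, content: str) -> None:
    db.execute("INSERT INTO messages (role, content) VALUES (?, ?)", (role, content))
    db.commit()

def chat(user_input: str) -> str:
    # "Basic memory" here simply means replaying stored turns in the prompt.
    messages = [{"role": "system", "content": "You are a helpful assistant."}]
    messages += load_history()
    messages.append({"role": "user", "content": user_input})

    reply = client.chat.completions.create(
        model="gpt-4o-mini", messages=messages
    ).choices[0].message.content

    save_message("user", user_input)
    save_message("assistant", reply)
    return reply

if __name__ == "__main__":
    print(chat("Hi, my name is Farzad."))
    print(chat("What is my name?"))  # answered from the stored history
```

The video builds on this baseline to show where prompt-replay memory breaks down and how agentic memory (the model deciding what to store and retrieve) addresses it.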
🚀 Full code and materials:
https://github.com/Farzad-R/Agentic-L...
🔗 Connect on LinkedIn:
/ farzad-roozitalab
🎥 Related series:
• Advanced Q&A and RAG series: https://github.com/Farzad-R/Advanced-...
• LLM-Zero-To-Hundred Series: https://github.com/Farzad-R/LLM-Zero-...
🛠 Frameworks and Tools Used:
#openai #langchain #letta #gradio #sqlite #chatbot #rag #llm #agent #python #gpt #memory
📚 Timestamps:
00:00:00 Intro
00:07:21 Importance of Memory
00:14:12 What is Memory?
00:30:43 Custom Chatbot with Basic Memory
00:58:50 Custom Chatbot with Agentic Memory (V1)
01:33:38 Custom Chatbot with Agentic Memory (V2)
01:45:33 Letta
01:48:55 LangChain's First Long-Term Memory Strategy (Vector Database + Knowledge Graph)
02:10:10 LangChain's New Memory Architecture (Semantic, Episodic, and Procedural Memory)