Building a Versatile AI Chatbot with LangGraph: A Step-by-Step Guide

Devendra Bogati
4 min read · Aug 19, 2024


In the rapidly evolving world of AI, building robust, intelligent chatbots that can handle complex tasks, maintain conversation state, and involve human oversight is becoming increasingly essential. LangGraph, a powerful framework built on LangChain, offers a comprehensive set of tools to create sophisticated AI agents with features like state management, tool integration, and even time-travel capabilities.

In this article, we’ll walk through the process of building a versatile AI chatbot using LangGraph. By the end of this guide, you’ll have a chatbot that can answer common questions, manage conversation states, escalate complex queries to humans, and explore alternative conversation paths through time travel.

### Getting Started with LangGraph

LangGraph is a library built on top of LangChain that extends its Runnable interface, making it easier to manage state and routing within complex conversational workflows. Before diving into the code, install the necessary packages:

```bash
pip install -U langgraph langsmith langchain_anthropic
```

Next, set up your API keys:

```python
import getpass
import os

def _set_env(var: str):
    if not os.environ.get(var):
        os.environ[var] = getpass.getpass(f"{var}: ")

_set_env("ANTHROPIC_API_KEY")
_set_env("LANGSMITH_API_KEY")
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_PROJECT"] = "LangGraph Tutorial"
```

### Part 1: Building a Basic Chatbot

We’ll begin by creating a basic chatbot using LangGraph. This chatbot will handle user inputs and generate responses using a large language model (LLM). The core concept here is to define a `StateGraph` object, which will structure our chatbot as a state machine. Nodes represent the chatbot’s functions, and edges define the transitions between these nodes.

Here’s a simple implementation:

```python
from langchain_anthropic import ChatAnthropic
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from typing import Annotated
from typing_extensions import TypedDict

class State(TypedDict):
    messages: Annotated[list, add_messages]

graph_builder = StateGraph(State)
llm = ChatAnthropic(model="claude-3-haiku-20240307")

def chatbot(state: State):
    return {"messages": [llm.invoke(state["messages"])]}

graph_builder.add_node("chatbot", chatbot)
graph_builder.add_edge(START, "chatbot")
graph_builder.add_edge("chatbot", END)

graph = graph_builder.compile()
```

In this setup, the chatbot node processes user messages, and the state is updated by appending new messages rather than overwriting them. This structure allows the bot to maintain a continuous conversation.
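The append behavior of `add_messages` is worth dwelling on. Here is a plain-Python stand-in (illustrative only; the real reducer also merges messages by ID) that shows the idea:

```python
# A minimal stand-in for LangGraph's add_messages reducer: the state channel
# holds a list, and each node's return value is appended to it rather than
# replacing it. (Simplified: no ID-based merging.)

def add_messages_sketch(existing: list, updates: list) -> list:
    """Append new messages to the existing history."""
    return existing + updates

history = [("user", "Hi there!")]
history = add_messages_sketch(history, [("assistant", "Hello! How can I help?")])
history = add_messages_sketch(history, [("user", "What is LangGraph?")])

print(len(history))  # the conversation grows turn by turn
```

Because the reducer appends instead of overwriting, every node sees the full conversation so far, which is what lets the LLM answer follow-up questions in context.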

### Part 2: Enhancing the Chatbot with Tools

To make our chatbot more versatile, we’ll add the ability to perform web searches when it encounters questions outside its knowledge base. We’ll use the Tavily Search Engine as an example tool.

```python
from langchain_community.tools.tavily_search import TavilySearchResults

tool = TavilySearchResults(max_results=2)
tools = [tool]

llm_with_tools = llm.bind_tools(tools)

def chatbot(state: State):
    return {"messages": [llm_with_tools.invoke(state["messages"])]}

# Rebuild the graph with the tool-aware chatbot node
graph_builder = StateGraph(State)
graph_builder.add_node("chatbot", chatbot)

# Add a tool node that executes any tool calls the LLM makes
from langgraph.prebuilt import ToolNode
graph_builder.add_node("tools", ToolNode(tools=[tool]))

# Define conditional routing
from typing import Literal

def route_tools(state: State) -> Literal["tools", "__end__"]:
    # Route to the tool node if the last AI message requested a tool call
    ai_message = state["messages"][-1]
    if getattr(ai_message, "tool_calls", None):
        return "tools"
    return "__end__"

graph_builder.add_conditional_edges("chatbot", route_tools, {"tools": "tools", "__end__": "__end__"})
graph_builder.add_edge("tools", "chatbot")
graph_builder.add_edge(START, "chatbot")

graph = graph_builder.compile()
```
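Because the routing function is just a function of the state, you can sanity-check its logic in isolation. Here is a self-contained sketch with stand-in message objects (the class and names are illustrative, not LangGraph's API):

```python
# Sketch of the routing decision: go to "tools" when the last AI message
# carries tool calls, otherwise end the graph run.
from dataclasses import dataclass, field

@dataclass
class FakeAIMessage:
    content: str
    tool_calls: list = field(default_factory=list)

def route_tools_sketch(messages: list) -> str:
    last = messages[-1]
    if getattr(last, "tool_calls", []):
        return "tools"
    return "__end__"

plain = [FakeAIMessage("The capital of France is Paris.")]
with_call = [FakeAIMessage("", tool_calls=[{"name": "tavily_search_results_json"}])]

print(route_tools_sketch(plain))      # ends the run
print(route_tools_sketch(with_call))  # routes to the tool node
```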

### Part 3: Adding Memory to the Chatbot

Now that our chatbot can perform web searches, let’s give it memory. By checkpointing the state after each interaction, the bot can maintain context across multiple turns of conversation.

```python
from langgraph.checkpoint.memory import MemorySaver

memory = MemorySaver()
graph = graph_builder.compile(checkpointer=memory)

config = {"configurable": {"thread_id": "1"}}
user_input = "Hi there! My name is Will."
events = graph.stream({"messages": [("user", user_input)]}, config, stream_mode="values")
for event in events:
    event["messages"][-1].pretty_print()
```
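Under the hood, the checkpointer keys saved state by `thread_id`, which is why passing a different thread ID starts a fresh conversation. A simplified stand-in (not the real `MemorySaver` API) illustrates the isolation:

```python
# Illustrative sketch of thread-scoped memory: saved state is keyed by
# thread_id, so conversations in different threads never mix.
checkpoints: dict[str, list] = {}

def save(thread_id: str, messages: list) -> None:
    checkpoints.setdefault(thread_id, []).extend(messages)

def load(thread_id: str) -> list:
    return checkpoints.get(thread_id, [])

save("1", [("user", "Hi there! My name is Will.")])
save("1", [("assistant", "Nice to meet you, Will!")])
save("2", [("user", "Do you remember my name?")])

print(len(load("1")))  # thread 1 keeps its own history
print(len(load("2")))  # thread 2 starts fresh, so the name is unknown there
```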

### Part 4: Human-in-the-Loop

There are situations where the chatbot might need human intervention. LangGraph supports human-in-the-loop workflows, allowing you to interrupt the bot’s process and have a human provide input or approve actions before the bot continues.

```python
graph = graph_builder.compile(
    checkpointer=memory,
    interrupt_before=["tools"],
)

user_input = "I'm learning LangGraph. Could you do some research on it for me?"
events = graph.stream({"messages": [("user", user_input)]}, config, stream_mode="values")
for event in events:
    event["messages"][-1].pretty_print()
```
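When the run pauses before the `tools` node, you can resume it by streaming with `None` as the input; LangGraph then continues from the saved checkpoint instead of starting a new turn. Conceptually, the pause-and-resume behaves like a suspended generator (a plain-Python analogy, not LangGraph code):

```python
# Conceptual stand-in for interrupt/resume: execution pauses at the interrupt
# point, and a later call continues from exactly where it stopped.
def graph_run():
    yield "chatbot: proposed a tavily_search tool call"  # paused before "tools"
    yield "tools: search results returned"
    yield "chatbot: final answer"

run = graph_run()
paused_at = next(run)  # the interrupt_before=["tools"] pause
# ... a human reviews the pending tool call here ...
remaining = list(run)  # resuming picks up from the checkpoint

print(paused_at)
print(len(remaining))  # the two remaining steps run to completion
```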

### Part 5: Manually Updating the State

Sometimes, you may want to manually update the chatbot’s state to correct its course or explore alternative paths. LangGraph makes this possible by allowing you to update the state mid-conversation.

```python
from langchain_core.messages import AIMessage

snapshot = graph.get_state(config)
existing_message = snapshot.values["messages"][-1]
# Reusing the existing message's id makes add_messages replace it in place
new_message = AIMessage(content="Updated response based on new context", id=existing_message.id)
graph.update_state(config, {"messages": [new_message]})
```
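Reusing `existing_message.id` is what makes this an in-place correction: the `add_messages` reducer replaces an existing message when an incoming one has the same ID, and appends otherwise. A simplified stand-in of that merge rule:

```python
# Sketch of ID-based merging: same id overwrites, new id appends.
def merge_by_id(existing: list[dict], updates: list[dict]) -> list[dict]:
    merged = list(existing)
    for upd in updates:
        for i, msg in enumerate(merged):
            if msg["id"] == upd["id"]:
                merged[i] = upd  # same id: replace in place
                break
        else:
            merged.append(upd)  # new id: append as usual

    return merged

history = [{"id": "m1", "content": "old answer"}]
history = merge_by_id(history, [{"id": "m1", "content": "Updated response"}])

print(len(history), history[0]["content"])  # one message, now corrected
```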

### Part 6: Customizing State

To make the chatbot even more flexible, we can add custom states. For example, you might want the chatbot to decide whether it needs to ask for human help or proceed autonomously.

```python
class State(TypedDict):
    messages: Annotated[list, add_messages]
    ask_human: bool

# Inside the chatbot function, flag the state for human review when the LLM
# calls the RequestAssistance tool
def chatbot(state: State):
    response = llm_with_tools.invoke(state["messages"])
    ask_human = False
    if response.tool_calls and response.tool_calls[0]["name"] == "RequestAssistance":
        ask_human = True
    return {"messages": [response], "ask_human": ask_human}

graph_builder.add_node("human", human_node)
graph_builder.add_conditional_edges("chatbot", select_next_node, {"human": "human", "tools": "tools", "__end__": "__end__"})
```
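The snippet above references `select_next_node` without defining it. Here is a minimal sketch of the decision it needs to make, using plain dicts in place of real message objects (the names and shapes are assumptions for illustration, not LangGraph's API):

```python
# Route to the human node when ask_human is set; otherwise fall back to the
# same tool-vs-end routing used earlier.
def select_next_node_sketch(state: dict) -> str:
    if state.get("ask_human"):
        return "human"
    last = state["messages"][-1] if state["messages"] else None
    if isinstance(last, dict) and last.get("tool_calls"):
        return "tools"
    return "__end__"

print(select_next_node_sketch({"ask_human": True, "messages": []}))  # escalates
print(select_next_node_sketch({"ask_human": False, "messages": [{"content": "done"}]}))  # ends
```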

### Part 7: Time Travel

LangGraph’s time-travel feature allows you to rewind the chatbot’s state to a previous point in the conversation. This can be useful for debugging, exploring different outcomes, or correcting mistakes.

```python
to_replay = None
for state in graph.get_state_history(config):
    if len(state.values["messages"]) == 6:
        to_replay = state

# Streaming with None replays from the chosen checkpoint
events = graph.stream(None, to_replay.config, stream_mode="values")
for event in events:
    event["messages"][-1].pretty_print()
```
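`get_state_history` returns checkpoints from newest to oldest, each carrying the state as it existed at that step; resuming from one forks the conversation at that point. A plain-Python stand-in of the selection step:

```python
# Sketch of checkpoint selection: walk the history (newest first) and pick
# the snapshot whose message count matches the turn you want to revisit.
from dataclasses import dataclass

@dataclass
class Checkpoint:
    messages: list
    config: dict

history = [
    Checkpoint(messages=list(range(n)), config={"checkpoint_id": n})
    for n in (8, 6, 4, 2)  # newest first, mirroring get_state_history
]

to_replay = next(cp for cp in history if len(cp.messages) == 6)
print(to_replay.config["checkpoint_id"])  # the chosen fork point
```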

### Conclusion

Congratulations! You’ve successfully built a feature-rich chatbot using LangGraph. This bot can maintain conversation context, handle complex queries, involve human oversight, and even explore alternative conversation paths through time travel.

LangGraph is a powerful tool that can help you build sophisticated AI agents tailored to your specific needs. As you continue to explore its capabilities, you’ll find even more ways to customize and enhance your AI applications. Happy coding!
