
As far as I know, Streamlit is the fastest way to get a customizable web-app off the ground. If you’re looking to build an AI agent and deploy it on your own front-end, I couldn’t think of a better option.
The only hold-back is the chat elements library: it's pretty specifically tuned to OpenAI's API and Python client.
Which is great– a couple of lines of code to interact with some of the most prestigious technology available is, well… great.
But it’s not everything.
What if you want some more control over your bot? For instance, you may want a multi-step workflow, or retrieval-augmented generation (RAG). These added layers of functionality generally mean wrangling together libraries with all sorts of dependencies.
Or do they?
In this tutorial, I’ll be building a Streamlit-hosted chatbot client. I’ll show you an interface for quick-iteration and highly-customizable chatbots. Then, you’ll learn how to integrate the chatbot using a custom-built OpenAI-style Python client.
If you’re prototyping, dependencies and technicalities shouldn’t hold you back.
And, in the spirit of quick prototyping, if you want to skip the tutorial and start tinkering, the code is on GitHub.
Bombs away 💣
Step 1: Build the Chatbot Logic
Be it for workflow automation or an appointment booking chatbot, the world really is your oyster here.
I urge you to explore the breadth of use cases for GenAI chatbots if you’re looking for inspiration. For the sake of simplicity, I’ll be coming back at you with my hopefully-now-famous sommelier, Winona.
Our sophisticated, helpful little bot can be achieved in just a few steps. I’ll be brief, but there are many lengthy, ultra helpful tutorials you can browse.
1. Give it Instructions
In the studio, we’ll navigate to Home in the left-hand sidebar.

You should see the Instructions section front-and-center. Click on it to add or modify the instructions in plain text.

This gives our bot directives, personality, and guardrails. Using plain language, you can steer your bot pretty effectively toward the desired behavior: make it sound more human, keep it on topic, and so on.
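As a purely hypothetical sketch (not the actual prompt behind the bot), Winona's instructions might read something like:

```
You are Winona, a friendly and knowledgeable sommelier.
Only recommend wines from the attached inventory, and always include the price.
If you don't know something, say so; never invent a wine.
```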
2. Build the Flow
This is where the nuts and bolts of the bot's behavior live: access to specific information, rigid step-by-step processes, code execution, etc.
Don’t underestimate the power of simplicity. A single autonomous node rivals the functionality of reasoning agents. I have one hooked up to my Knowledge Base (KB).

3. Add the Knowledge Base
If the instructions are about vibes, the KB is about cold, hard facts. In my case, the facts in question are the wines in the Wine Reviews dataset, a list of wines, descriptions, and prices. I'll be treating this as a de facto wine inventory for our bot to sommelier-ize.
I’ll click on Tables in the left panel and hit New Table in the top left of the page, and give it a descriptive name.

Click on the vertical ellipsis (⋮) on the top-right, and hit Import.

Drag your .csv into the modal that pops up and follow the steps on-screen.
To make the table accessible to your bot, navigate to Knowledge Bases in your left-hand sidebar.

Click the little green table icon and select the relevant source. Click Add tables.

Make sure your flow has access to the Knowledge Base and you’re good to go.

Step 2: Add the Chat API Integration
The point of contact between the bot and our local client is the Chat API. To add that to our bot, I’ll scroll to Communication Channels and hit … More.

Peruse the integrations if you'd like. We're after Chat. I had to scroll a bit to find it.

Click the integration and hit Install Integration in the modal that pops up.

Once installed, you’ll see the Chat API ID at the end of the webhook URL. You’ll need that for later.
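For reference, the webhook URL looks something like the line below; the trailing segment (a placeholder here) is the Chat API ID our client will use:

```
https://chat.botpress.cloud/YOUR_CHAT_API_ID
```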
Step 3: Write the Python Client
The Chat API exposes a number of endpoints to perform CRUD operations on users, conversations, and messages. As promised, I'll wrap these into a Python client that can replace an OpenAI client.
1. Add your credentials
# client.py
import json
import os

import requests
import sseclient

from constants import BASE_URI, HEADERS


class BotpressClient:
    def __init__(self, api_id=None, user_key=None):
        self.api_id = api_id or os.getenv("CHAT_API_ID")
        self.user_key = user_key or os.getenv("USER_KEY")
        self.base_url = f"{BASE_URI}/{self.api_id}"
        self.headers = {
            **HEADERS,
            "x-user-key": self.user_key,
        }
You’re free to add your Chat API ID to a .env file– it helps with debugging, but it’s not strictly necessary. We’ll deal with the API ID and user key when we build the Streamlit app.
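If you do go the .env route, the variable names the client falls back to would look like this (values are placeholders; you'd load the file with something like python-dotenv):

```
# .env
CHAT_API_ID=your-chat-api-id
USER_KEY=your-user-key
```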
I keep BASE_URI and HEADERS in a separate constants.py file to cut down on clutter.
# constants.py
BASE_URI = "https://chat.botpress.cloud"

HEADERS = {
    "accept": "application/json",
    "Content-Type": "application/json",
}
2. Create the Request methods
    def _request(self, method, path, json=None):
        url = f"{self.base_url}{path}"
        try:
            response = requests.request(method, url, headers=self.headers, json=json)
            response.raise_for_status()
            return response.json()
        except requests.HTTPError:
            return response.status_code, response.text

    # --- Core API Methods ---
    def get_user(self):
        return self._request("GET", "/users/me")

    def create_user(self, name, id):
        user_data = {"name": name, "id": id}
        return self._request("POST", "/users", json=user_data)

    def set_user_key(self, key):
        self.user_key = key
        self.headers["x-user-key"] = key

    def create_and_set_user(self, name, id):
        new_user = self.create_user(name, id)
        self.set_user_key(new_user["key"])

    def create_conversation(self):
        return self._request("POST", "/conversations", json={"body": {}})

    def list_conversations(self):
        return self._request("GET", "/conversations")

    def get_conversation(self, conversation_id):
        return self._request("GET", f"/conversations/{conversation_id}")

    def create_message(self, message, conversation_id):
        payload = {
            "payload": {"type": "text", "text": message},
            "conversationId": conversation_id,
        }
        return self._request("POST", "/messages", json=payload)

    def list_messages(self, conversation_id):
        return self._request("GET", f"/conversations/{conversation_id}/messages")
As mentioned, nearly all of these map to an endpoint in the API. I’m just wrapping them in a class.
3. Create an SSE Listener
This is the extent of the hackery. To listen for conversation updates and loop into a Streamlit front-end, the client needs a method to listen for– and yield– server-sent events from our bot.
    def listen_conversation(self, conversation_id):
        url = f"{self.base_url}/conversations/{conversation_id}/listen"
        for event in sseclient.SSEClient(url, headers=self.headers):
            print(event.data)
            if event.data == "ping":
                continue
            data = json.loads(event.data)["data"]
            yield {"id": data["id"], "text": data["payload"]["text"]}
This function takes the conversation_id (which will be accessed programmatically within the app) and yields incoming data as it happens.
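To make the parsing concrete, here's a minimal offline sketch of what the listener does with each event payload. The JSON shape is inferred from the code above; the IDs and text are made up:

```python
import json

# A made-up SSE event body shaped like the ones listen_conversation parses
raw = '{"data": {"id": "msg_123", "payload": {"type": "text", "text": "Try the Riesling."}}}'

# Unwrap the outer "data" envelope, then keep only the id and text
data = json.loads(raw)["data"]
event = {"id": data["id"], "text": data["payload"]["text"]}
print(event)  # {'id': 'msg_123', 'text': 'Try the Riesling.'}
```

The "ping" keep-alive events are skipped before this parsing ever runs, so only real messages reach the `yield`.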
Step 4: Create the Streamlit App
With our ducks in a row, it’s time to build the chatbot. Note that I’m following Streamlit’s guide on building an LLM chat app– with some added features.
1. Adapt the boilerplate code
In theory, you can get the app working with minimal changes to the boilerplate in the Streamlit example.
# app.py
from client import BotpressClient
import streamlit as st
from constants import CONVERSATION_ID

st.title("Botpress Front-end for Streamlit")

client = BotpressClient(
    api_id=st.secrets["CHAT_API_ID"], user_key=st.secrets["USER_KEY"]
)

if "messages" not in st.session_state:
    messages = client.list_messages(CONVERSATION_ID)
    next_token = messages["meta"]["nextToken"]
    st.session_state.messages = messages["messages"][::-1]

for message in st.session_state.messages:
    with st.chat_message(message["userId"]):
        st.markdown(message["payload"]["text"])

if prompt := st.chat_input("*wine*-d it up"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)
    client.create_message(prompt, conversation_id=CONVERSATION_ID)
    with st.chat_message("assistant"):
        response_box = st.empty()
        last_rendered = ""
        for message in client.listen_conversation(CONVERSATION_ID):
            message_id = message["id"]
            message_text = message["text"]
            if message_id != last_rendered:
                last_rendered = message_id
                response_box.markdown(message_text)
                st.session_state.messages.append(
                    {"role": "assistant", "content": message_text}
                )
We're reading secret variables here, so create a .streamlit/secrets.toml file and place your variables inside:
CHAT_API_ID = "YOUR_API_ID"
USER_KEY = "YOUR_USER_KEY"
The heavy lifting is happening in:
with st.chat_message("assistant"):
    response_box = st.empty()
    last_rendered = ""
    for message in client.listen_conversation(CONVERSATION_ID):
        message_id = message["id"]
        message_text = message["text"]
        if message_id != last_rendered:
            last_rendered = message_id
            response_box.markdown(message_text)
            st.session_state.messages.append(
                {"role": "assistant", "content": message_text}
            )
where the client is latching to the chat elements to deliver and receive messages.
This works, but it isn't ideal for a few reasons:
- You have to create a conversation separately beforehand.
- Old messages will be formatted differently from new ones, because they don’t have the role designation (user or assistant).
- You can’t toggle conversations.
2. Create conversations dynamically
Starting from scratch, I’ll automatically create a new conversation or open the most recent one:
# app.py
from client import BotpressClient
import streamlit as st

st.title("Botpress Front-end for Streamlit")

client = BotpressClient(
    api_id=st.secrets["CHAT_API_ID"], user_key=st.secrets["users"][0]["key"]
)

# user info
user = client.get_user()
user_id = user["user"]["id"]

conversations = client.list_conversations()["conversations"]
conversation_ids = [conv["id"] for conv in conversations]

# conversation
def create_conversation():
    res = client.create_conversation()
    print(f"Created new conversation: {res}")
    conversation_id = res["conversation"]["id"]
    st.session_state.active_conversation = conversation_id
    st.session_state.messages = []
    st.rerun()

if not conversations:
    create_conversation()

if "active_conversation" not in st.session_state:
    st.session_state["active_conversation"] = conversations[0]["id"]
Note that I've modified the secrets to store multiple users. You'll want to modify your .streamlit/secrets.toml file to reflect that:
[[users]]
key = "your_user_key"
You can repeat this block as much as you’d like, storing users as an array of tables.
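For example, a secrets file holding two users (keys are placeholders) would look like:

```
[[users]]
key = "first_user_key"

[[users]]
key = "second_user_key"
```

Streamlit exposes this as a list of dicts, which is why the app reads `st.secrets["users"][0]["key"]`.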
3. Let users create and toggle between conversations
Like the heading says, this creates a dropdown at the top with a button to let you choose your conversation.
col1, col2 = st.columns([5, 1])
with col1:
    conversation_id = st.selectbox(
        "Select Conversation",
        options=[conv["id"] for conv in conversations],
        index=conversation_ids.index(st.session_state.active_conversation),
    )
with col2:
    st.markdown("<div style='height: 1.9em'></div>", unsafe_allow_html=True)
    if st.button("➕"):
        create_conversation()

selected_conversation = client.get_conversation(conversation_id)
selected_conversation = client.get_conversation(conversation_id)
4. Assign the correct role for past messages
We’ll solve the formatting issue from above by assigning the user or assistant role to each of the past messages:
if (
    "messages" not in st.session_state
    or st.session_state.get("active_conversation") != conversation_id
):
    st.session_state.active_conversation = conversation_id
    st.session_state.messages = []
    messages = client.list_messages(conversation_id)
    next_token = messages["meta"].get("nextToken")
    for message in messages["messages"][::-1]:
        role = "user" if message["userId"] == user_id else "assistant"
        text = message["payload"]["text"]
        st.session_state.messages.append({"role": role, "content": text})
This conforms our code to the structure that Streamlit expects.
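Here's the same role-assignment logic in isolation, with made-up IDs, to show the mapping:

```python
# Messages from our own user id become "user"; everything else "assistant".
user_id = "u_me"  # hypothetical id returned by client.get_user()
raw_messages = [
    {"userId": "u_me", "payload": {"text": "Any good reds under $20?"}},
    {"userId": "u_bot", "payload": {"text": "Plenty. Let me check the inventory."}},
]

history = [
    {"role": "user" if m["userId"] == user_id else "assistant",
     "content": m["payload"]["text"]}
    for m in raw_messages
]
print([m["role"] for m in history])  # ['user', 'assistant']
```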
5. Add the messaging logic
This is more-or-less the same as before, adapted for the new structure.
# display chat history
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

if prompt := st.chat_input("*wine*-d it up"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    client.create_message(prompt, conversation_id=conversation_id)
    with st.chat_message("user"):
        st.markdown(prompt)
    with st.chat_message("assistant"):
        stream = client.listen_conversation(conversation_id=conversation_id)
        response = st.write_stream(stream)
    st.session_state.messages.append({"role": "assistant", "content": response})
6. Create a user
The logic is ready, but you’ll need to create a user to run the app. I chose to add this separately to simulate the experience of signing up for a service. Fortunately for you, I also wrote a script:
# create_user.py
import argparse
from pathlib import Path

from client import BotpressClient

secrets_path = Path(".streamlit") / "secrets.toml"

template = """[[users]]
key = "{}"
"""


def create_user(client, name, id, add_to_secrets=True):
    res = client.create_user(name, id)
    if not add_to_secrets:
        return res
    secrets_path.touch(exist_ok=True)
    with open(secrets_path, "a") as f:
        f.write(template.format(res["key"]))
    return res


if __name__ == "__main__":
    parser = argparse.ArgumentParser(
        description="Create a Botpress user and optionally store secrets."
    )
    parser.add_argument("--name", required=True, help="Display name of the user.")
    parser.add_argument(
        "--id", help="User ID. If omitted, one is generated by the API."
    )
    parser.add_argument(
        "--chat_api_id",
        help="ID for the Botpress Chat API integration. Taken from `.env` file if not provided.",
    )
    parser.add_argument(
        "--no-secrets",
        action="store_true",
        help="Do not append to .streamlit/secrets.toml.",
    )
    args = parser.parse_args()

    client = BotpressClient(api_id=args.chat_api_id)
    print(f"Creating user: {args.name} (ID: {args.id or 'auto-generated'})")
    result = create_user(
        client, name=args.name, id=args.id, add_to_secrets=not args.no_secrets
    )
    print("✅ User created:")
    print(result)
Provided you have your Chat API ID, you can run:
python create_user.py --name YOUR_NAME --id SOME_USER_ID --chat_api_id YOUR_CHAT_API_ID
This will deal with creating the user and adding it to your secrets file.
Step 5: Run the application
With your logic built and your user created, it's time to take this application for a spin. Install the dependencies (streamlit, requests, and the sseclient package the listener imports), and run:
streamlit run app.py
And just like that, you’ll see our bot in all its glory.

Run a Streamlit chatbot today
If you’re prototyping with Streamlit, you know customizability shouldn’t come at the cost of convenience. Chatbots are there to solve problems– not create them.
Botpress comes with a visual drag-and-drop builder, dozens of official integrations, and accessible API endpoints. This way you can build, iterate, and deploy across many communication channels.
Start building today. It’s free.