Building a Roleplay AI Assistant with Gemini

Apr 12, 2025

As an avid fan of pen & paper roleplaying games, I know the immense challenge Game Masters (GMs) face. Beyond crafting compelling worlds and stories, GMs must juggle intricate plot threads, a diverse cast of characters (both player-controlled and non-player characters), and the ever-evolving events of each session. Recalling specific details – who discovered the ancient artifact, what cryptic clue did the mysterious hermit share, which town did the party visit three sessions ago? – can become a significant cognitive load, detracting from the creative aspects of running a game.

This project explores how AI, specifically Google's powerful Gemini models, can serve as a dedicated assistant for GMs. The goal isn't just to store data, but to create an interactive system that understands the game's context, allowing GMs to offload memory tasks and query their campaign world using simple, natural language. This frees up valuable mental bandwidth, enabling GMs to focus on delivering a richer, more immersive experience for their players.

Generating Session Summaries Automatically

The foundation of this assistant is the session summary. Manually writing detailed summaries after each game can be time-consuming. Here, I leveraged Gemini's generative capabilities. By providing the AI with initial context about the game setting, the player characters (like Ulfric Whitebeard, Aeliana Sunwhisper, and Silas Rockwood), and a basic narrative structure, I used few-shot prompting to have it generate coherent session summaries. The AI effectively simulated the GM's role in documenting the game's progression, even introducing relevant NPCs like "Elder Elara" based on the prompt. Each generated summary was saved for later processing.

# filepath: c:\Users\ricca\Downloads\roleplay-ai.ipynb
# Define the basic information for the AI to generate a story.
context = """You are an expert dungeon master that is playing pen & paper fantasy roleplay adventures with his players and likes to write summaries about his game sessions.
Your players and their characters are:
- Marco, with his character Ulfric Whitebeard, called 'the Demon Slayer'. A barbarian with superhuman strength.
- Sabrina, with her character Aeliana Sunwhisper. An elven priestess of the light god Solarian.
- Luca, with his character Silas Rockwood. A mage specialized in earth sorceries.
When asked something do not add any comments, just output what is requested.
Wait for further instructions.
"""

# define the basic prompts for generating sessions
first_request = """Generate the summary of the first roleplaying session you could do with your players and their characters. In this session an NPC named Elder Elara must appear.
Remember to list what session number this is and to cite all the characters present, include in the same list all the NPCs (non-player characters) you make up when writing the story."""
next_requests = """Generate a new roleplay session, continuing the last one. There shouldn't be an NPC named Elder Elara.
Remember to list what session number this is and to cite all the characters present, include in the same list all the NPCs (non-player characters) you make up when writing the story."""

# create a chat and give the necessary context
from google import genai
from google.genai import types

# GOOGLE_API_KEY is defined earlier in the notebook
client = genai.Client(api_key=GOOGLE_API_KEY)
chat = client.chats.create(model='gemini-2.0-flash', history=[])
chat.send_message(context)

# prepare the method to generate sessions and save them into a file
def generate_session_and_write(request: str, index: int) -> None:
    response = chat.send_message(request)

    # Write the new session file in Kaggle's working directory
    filename = f"{index}.txt"
    try:
        with open(filename, 'w') as file:
            # Write the generated summary as a single string
            file.write(response.text)
            print(f"Successfully wrote content to '{filename}' in the /kaggle/working/ directory.")
    except Exception as e:
        print(f"An error occurred while writing to '{filename}': {e}")

# Number of sessions to generate
gen_sessions = 3
for i in range(1, gen_sessions + 1):
    if i == 1:
        generate_session_and_write(first_request, i)
    else:
        generate_session_and_write(next_requests, i)

Structuring the Chaos with JSON

While narrative summaries are great for reading, they aren't easily searchable for specific facts. To make the information machine-readable and queryable, structure is key. I prompted Gemini again, instructing it to analyze each text summary and extract vital information into a structured JSON format. This included a unique session_id, a title for the session, and a list of participants (both PCs and NPCs). This transformation is crucial for efficiently populating a database.

import json

structured_output_prompt = """Given the session summarized in the document,
read the text and generate a JSON record that summarizes the session and its partecipants, including both player characters and NPCs.

OUTPUT EXAMPLE 1:

{
"session_id": "4",
"title": "The Whispers of the Old Mine",
"partecipants": ["Ulfric Whitebeard", "Aeliana Sunwhisper", "Silas Rockwood","Elaria Eidd", "Bron Graidd"]
}

OUTPUT EXAMPLE 2:

{
"session_id": "8",
"title": "Deeper into the Dark",
"partecipants": ["Aeliana Sunwhisper", "Elaria Eidd", "Brunwick Blackforge", "Ulfric Whitebeard"]
}

"""
def generate_JSON(document):
    response = client.models.generate_content(
      model='gemini-2.0-flash',
      contents=[structured_output_prompt, document]
    )
    print("- JSON OUTPUT CALL: " + response.text)
    return json.loads(response.text.strip().removeprefix("```json").removesuffix("```").strip())

# For each session, generate its corresponding JSON and save it in the sessions_JSON list
sessions_JSON = []
for i in range(1, gen_sessions + 1):
    with open(f"{i}.txt", 'r') as file:
        document_text = file.read()
    session = generate_JSON(document_text)
    sessions_JSON.append(session)
    print("- SESSION ADDED: ID " + session["session_id"])
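Model output can occasionally drift from the requested schema, so it is worth validating each record before it reaches the database. A minimal sketch of such a check (the validate_session helper and its key set are my own additions, not part of the notebook):

```python
# Keys requested in the structured-output prompt above.
REQUIRED_KEYS = {"session_id", "title", "partecipants"}

def validate_session(record: dict) -> bool:
    """Check that a generated session record has the expected shape."""
    # All required keys must be present.
    if not REQUIRED_KEYS.issubset(record):
        return False
    # session_id is a string of digits in the prompt's examples.
    if not str(record["session_id"]).isdigit():
        return False
    # partecipants must be a non-empty list of names.
    participants = record["partecipants"]
    return isinstance(participants, list) and len(participants) > 0

print(validate_session({"session_id": "1", "title": "A Start",
                        "partecipants": ["Ulfric Whitebeard"]}))  # True
```

Records that fail the check could be regenerated or skipped instead of silently corrupting the database.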

Enabling Database Interaction via Function Calling

With structured data ready, the next logical step was storing it persistently. I set up a simple SQLite database (roleplay.db) with tables designed to hold character information, session details, and the participation links between them.

-- Create the 'character' table
CREATE TABLE IF NOT EXISTS character (
    character_id INTEGER PRIMARY KEY AUTOINCREMENT,
    name VARCHAR(255) NOT NULL
  );

-- Create the 'session' table
CREATE TABLE IF NOT EXISTS session (
    session_id INTEGER PRIMARY KEY AUTOINCREMENT,
    title VARCHAR(255) NOT NULL
  );

-- Create the 'partecipation' table
CREATE TABLE IF NOT EXISTS partecipation (
    session_id INTEGER,
    character_id INTEGER NOT NULL,
    FOREIGN KEY (session_id) REFERENCES session (session_id),
    FOREIGN KEY (character_id) REFERENCES character (character_id)
    PRIMARY KEY (session_id, character_id) -- Enforce unique partecipation per session
  );

Instead of manually writing SQL INSERT statements, I employed Gemini's Function Calling feature. I defined Python functions (list_tables, describe_table, execute_query, retrieve_document) that acted as tools the AI could use. By providing the AI with the generated JSON data and the database tools, I instructed it to populate the database. The AI intelligently determined the necessary SQL commands and executed them via the provided functions, demonstrating how AI can automate data entry tasks based on high-level instructions.

import sqlite3

# Connection to the SQLite database created above (roleplay.db)
db_conn = sqlite3.connect("roleplay.db")

def list_tables() -> list[str]:
    """Retrieve the names of all tables in the database."""
    print(' - DB CALL: list_tables()')
    cursor = db_conn.cursor()
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table';")
    tables = cursor.fetchall()
    return [t[0] for t in tables]

def describe_table(table_name: str) -> list[tuple[str, str]]:
    """Look up the table schema."""
    print(f' - DB CALL: describe_table({table_name})')
    cursor = db_conn.cursor()
    cursor.execute(f"PRAGMA table_info({table_name});")
    schema = cursor.fetchall()
    return [(col[1], col[2]) for col in schema]

def execute_query(sql: str) -> list[list[str]]:
    """Execute an SQL statement, returning the results."""
    print(f' - DB CALL: execute_query({sql})')
    cursor = db_conn.cursor()
    cursor.execute(sql)
    return cursor.fetchall()

def retrieve_document(session: int) -> str | None:
    """Retrieve the document with the summary of a given session."""
    filename = f"{session}.txt"
    print(f' - DOCUMENT CALL: retrieve_document({session}) - Attempting to read: {filename}')
    try:
        with open(filename, 'r') as f:
            return f.read()
    except FileNotFoundError:
        print(f"Warning: Document file '{filename}' not found.")
        return None
    except Exception as e:
        print(f"Error reading document file '{filename}': {e}")
        return None

# These are the Python functions defined above.
db_tools = [list_tables, describe_table, execute_query, retrieve_document]

instruction = """You are a helpful chatbot that can interact with an SQL database
for an app that helps roleplay game masters organize their session and character data.
You will take the user's requests and turn them into SQL queries using the tools available.
Once you have the information you need, you will answer the user's requests using the data returned.

Use list_tables to see what tables are present, describe_table to understand the
schema, execute_query to issue an SQL SELECT or INSERT query (no CREATE), and retrieve_document to obtain the session documents stored on disk."""

# Start a chat with automatic function calling enabled.
chat = client.chats.create(
    model="gemini-2.0-flash",
    config=types.GenerateContentConfig(
        system_instruction=instruction,
        tools=db_tools,
    ),
)

# Make the model populate the database based on the JSON objects we created
prompt = """Given the JSON objects I'm passing you, insert their data into the database.
Start with characters, then sessions, and lastly partecipations.
Try to execute as few queries as possible by grouping up inserts on the same table."""

for session in sessions_JSON:
    prompt += json.dumps(session) + "\n"

resp = chat.send_message(prompt)
print(f"\n{resp.text}") # AI confirms data insertion

Querying the Game World Naturally

The real power for the GM comes now: the ability to ask questions in plain English. The chat model, armed with the database interaction tools, can now access the structured data to answer specific queries:

  • "How many times did character Elder Elara appear in the sessions?" (Checks the partecipation table)
  • "Give me a summary of what the character Ulfric Whitebeard did in the second session." (Uses retrieve_document to fetch the session 2.txt summary and synthesizes the answer)

This allows the GM to quickly recall facts without manually searching through notes or database tables.

resp = chat.send_message("How many times did character Elder Elara appear in the sessions?")
print(f"\n{resp.text}")

resp = chat.send_message("Give me a summary of what the character Ulfric Whitebeard did in the second session. Retrieve the session summary if needed.")
print(f"\n{resp.text}")

Deeper Insights with Embeddings and RAG

Structured data is excellent for facts, but what about understanding the nuances within the session narratives? Questions like "What is Elder Elara's role in the story?" require understanding the content of the summaries. This is where Retrieval Augmented Generation (RAG) comes in.

I utilized ChromaDB, a vector database, along with Gemini's text-embedding-004 model. Each session summary was converted into a numerical representation (embedding) that captures its semantic meaning and stored in ChromaDB.
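To build intuition for what "semantic similarity" means here: each summary becomes a fixed-length vector, and retrieval ranks documents by the angle between the query vector and each stored vector, typically via cosine similarity. A toy sketch with hand-made three-dimensional vectors (the numbers are illustrative, not real Gemini embeddings, which have 768 dimensions for text-embedding-004):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings": nearby directions stand in for related meanings.
query_vec = [0.9, 0.1, 0.0]   # e.g. "Who is Elder Elara?"
session_1 = [0.8, 0.2, 0.1]   # summary that mentions Elder Elara
session_2 = [0.1, 0.2, 0.9]   # unrelated summary

print(cosine_similarity(query_vec, session_1))  # high: close in meaning
print(cosine_similarity(query_vec, session_2))  # low: far in meaning
```

ChromaDB performs this ranking internally over the stored embeddings, which is why the most relevant summary surfaces first.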

!pip install -qU "chromadb==0.6.3"
import chromadb
from chromadb import Documents, EmbeddingFunction, Embeddings
from google.api_core import retry
from google.genai import errors

# Retry on transient API errors (rate limits / service unavailable)
is_retriable = lambda e: isinstance(e, errors.APIError) and e.code in {429, 503}

class GeminiEmbeddingFunction(EmbeddingFunction):
    # Specify whether to generate embeddings for documents, or queries
    document_mode = True

    @retry.Retry(predicate=is_retriable)
    def __call__(self, input: Documents) -> Embeddings:
        if self.document_mode:
            embedding_task = "retrieval_document"
        else:
            embedding_task = "retrieval_query"

        response = client.models.embed_content(
            model="models/text-embedding-004",
            contents=input,
            config=types.EmbedContentConfig(
                task_type=embedding_task,
            ),
        )
        return [e.values for e in response.embeddings]

DB_NAME = "rpdocumentsdb"

embed_fn = GeminiEmbeddingFunction()
embed_fn.document_mode = True

chroma_client = chromadb.Client()
db = chroma_client.get_or_create_collection(name=DB_NAME, embedding_function=embed_fn)

documents = []
for i in range(1, gen_sessions + 1):
    with open(f"{i}.txt", 'r') as file:
        document_text = file.read()
        documents.append(document_text)

db.add(documents=documents, ids=[str(i) for i in range(1, len(documents) + 1)])  # ids match session numbers

Now, when the GM asks a question like "Who is Elder Elara?", the RAG process efficiently finds the answer:

  1. Retrieve: The question is embedded, and ChromaDB finds the most semantically similar stored documents (here, the session summary where Elder Elara was mentioned).
  2. Augment: These relevant text passages are added to the original question, providing specific context.
  3. Generate: Gemini uses this augmented prompt to generate a comprehensive answer grounded in the retrieved information.

RAG allows the assistant to answer questions based on the meaning and content of the session summaries, going beyond simple keyword matching or structured data lookups.

# Switch to query mode when generating embeddings.
embed_fn.document_mode = False

# Search the Chroma DB using the specified query.
query = "Who is Elder Elara?"

result = db.query(query_texts=[query], n_results=1)
[all_passages] = result["documents"]

# Display retrieved passage (optional)
# Markdown(all_passages[0])

# --- Generation Step ---
query_oneline = query.replace("\n", " ")

# This prompt is where you can specify any guidance on tone, or what topics the model should stick to, or avoid.
prompt = f"""You are a helpful and informative bot that answers questions using text from the reference passage included below.
Be sure to respond in a complete sentence, being comprehensive, including all relevant background information.
If the passage is irrelevant to the answer, you may ignore it.

QUESTION: {query_oneline}
"""

# Add the retrieved documents to the prompt.
for passage in all_passages:
    passage_oneline = passage.replace("\n", " ")
    prompt += f"PASSAGE: {passage_oneline}\n"

# print(prompt) # View the augmented prompt

answer = client.models.generate_content(
    model="gemini-2.0-flash",
    contents=prompt)

# Display the final answer
# Markdown(answer.text)

Conclusion: A Powerful Proof-of-Concept

This project successfully demonstrates how different facets of Google's Gemini AI – few-shot prompting, structured output, function calling, and RAG – can be combined to create a genuinely useful tool for roleplaying Game Masters. It tackles the core challenges of information management and recall, offering a way to automate tedious tasks and access campaign knowledge through natural language. While implemented here within a Jupyter Notebook, the potential is clear.

Next Steps: Towards a Full Application

This notebook serves as a strong foundation, but turning it into a fully-fledged application requires further development:

  1. User Interface: Build a user-friendly web or desktop interface instead of relying on notebook cells. This would allow GMs to easily input data, view summaries, and chat with the AI assistant.
  2. Real-time Input: Allow GMs to dictate or type notes during a session, which the AI could then process and summarize automatically or semi-automatically.
  3. Character Relationship Mapping: Extend the database schema and AI capabilities to track relationships between characters (allies, rivals, family ties) and factions.
  4. Plot Hook Generation: Train the AI on the existing campaign data to suggest potential plot hooks or consequences based on past events and character actions.
  5. Integration with VTTs: Explore integrations with Virtual Tabletops (VTTs) like Roll20 or Foundry VTT to pull character data or session logs directly.
  6. Scalability: Move from SQLite and in-memory ChromaDB to more scalable cloud-based database solutions if handling very large campaigns.
  7. Fine-tuning: Fine-tune a Gemini model specifically on RPG sourcebooks or the GM's own campaign notes for even more context-aware responses.
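As a sketch of next step 3, the schema could grow a table linking two characters with a typed edge. The table and column names below (relationship, character_a, character_b, kind) are hypothetical extensions, shown here against an in-memory stand-in for roleplay.db:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for roleplay.db
conn.executescript("""
CREATE TABLE IF NOT EXISTS character (
    character_id INTEGER PRIMARY KEY AUTOINCREMENT,
    name VARCHAR(255) NOT NULL
);
-- Hypothetical extension: a typed edge between two characters.
CREATE TABLE IF NOT EXISTS relationship (
    character_a INTEGER NOT NULL,
    character_b INTEGER NOT NULL,
    kind TEXT NOT NULL,  -- e.g. 'ally', 'rival', 'family'
    FOREIGN KEY (character_a) REFERENCES character (character_id),
    FOREIGN KEY (character_b) REFERENCES character (character_id),
    PRIMARY KEY (character_a, character_b, kind)
);
""")
conn.execute("INSERT INTO character (name) VALUES ('Ulfric Whitebeard'), ('Elder Elara')")
conn.execute("INSERT INTO relationship VALUES (1, 2, 'ally')")
print(conn.execute("SELECT kind FROM relationship").fetchall())  # [('ally',)]
```

With such a table in place, the same function-calling tools would let the GM ask questions like "Who are Ulfric's allies?" without any new Python code.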

By pursuing these next steps, this concept could evolve into an indispensable tool, significantly enhancing the Game Master's ability to manage complex campaigns and focus on creating unforgettable stories.

You can find the original Jupyter Notebook here.

Riccardo Passacantando