Roleplay Bot chats with local AI characters
REDDIT // 18d ago // OPEN SOURCE RELEASE


Roleplay Bot is a small open-source Python CLI for chatting with local AI characters through Ollama's llama3.2 model. It creates a custom roleplay-model, injects persona and scene context on the first turn only, and the author reports replies taking about a minute on their setup, so it reads more like a proof of concept than a polished chat app.
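The custom-model step maps onto Ollama's standard Modelfile workflow. A sketch of what that looks like from the shell; the model name and system prompt here are illustrative, not taken from the repo, which may do this programmatically instead:

```shell
# Write a minimal Modelfile deriving a character model from llama3.2.
cat > Modelfile <<'EOF'
FROM llama3.2
SYSTEM "Stay in character as the configured persona at all times."
EOF

# Create, exercise, and remove the temporary model -- skipped here
# if the ollama CLI is not installed locally.
if command -v ollama >/dev/null 2>&1; then
  ollama create roleplay-model -f Modelfile
  ollama run roleplay-model "Introduce yourself."
  ollama rm roleplay-model
fi
```

Deriving a named model from a Modelfile is what makes the character setup reproducible: anyone with llama3.2 pulled can recreate the same starting point with one command.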

// ANALYSIS

This is a neat local-AI demo, but it also shows how quickly roleplay bots hit latency and state-management limits once you move beyond a single prompt. The project creates a dedicated roleplay-model from llama3.2, which keeps the character setup self-contained and easy to reproduce. Each turn is sent as a fresh Ollama request with only a one-time persona/scene prefix, so the bot has no real multi-turn memory. The cleanup path relies on a Python destructor hook to delete the custom model, which is clever but brittle if the process exits abruptly. The prompt explicitly tells the model to stay in character and ignore restrictive safety guidance, which makes the output vivid but risky to reuse as-is.

Source links:
- Reddit demo: https://www.reddit.com/r/LocalLLaMA/comments/1s2j6bl/made-a-roleplaying-chatbot-with-python-and-ollama/
- GitHub repo: https://github.com/rakin406/roleplay_bot
- v1.0.0 release: https://github.com/rakin406/roleplay_bot/releases/tag/v1.0.0
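The stateless per-turn pattern described above can be sketched as follows. The persona text, helper function, and model name are illustrative assumptions, not the repo's actual code:

```python
# Each turn is a fresh, stateless request: only the first turn carries
# the persona/scene context, so the model never sees prior exchanges.

PERSONA = "You are Mira, a wandering cartographer."      # illustrative
SCENE = "The chat opens in a rain-soaked mountain inn."  # illustrative

def build_prompt(user_message: str, first_turn: bool) -> str:
    """Prepend the one-time persona/scene prefix on the first turn only."""
    if first_turn:
        return f"{PERSONA}\n{SCENE}\n\nUser: {user_message}"
    return f"User: {user_message}"

# With a local Ollama server running, a turn would then go out as a
# one-shot generate call against the custom model, e.g.:
#   import ollama
#   reply = ollama.generate(model="roleplay-model",
#                           prompt=build_prompt(msg, first_turn))["response"]
```

This keeps each request small, but it is also why later turns lose context: nothing short of re-sending the history (or switching to a chat endpoint with a messages list) gives the model real multi-turn memory.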
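The destructor-based cleanup the analysis flags can be hardened with `atexit`, which runs registered handlers on normal interpreter exit, including after an unhandled exception, unlike `__del__`, which may never fire. A minimal sketch with all names hypothetical:

```python
import atexit

cleaned_up: list = []  # stand-in for the real side effect, for testability

def delete_model(name: str) -> None:
    # In the real app this would remove the temporary Ollama model
    # (e.g. via the ollama client's delete call); here we just record it.
    cleaned_up.append(name)

# Register once, right after the temporary model is created:
atexit.register(delete_model, "roleplay-model")
```

Even `atexit` does not survive a SIGKILL, so a startup-time sweep for stale temporary models is the robust fallback for any app that creates them.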

// TAGS
roleplay-bot · llm · chatbot · cli · self-hosted · open-source

DISCOVERED

2026-03-24

PUBLISHED

2026-03-24

RELEVANCE

7 / 10

AUTHOR

Far_Introduction1824