Building your personal AI assistant from scratch with Python, the OpenAI API, Gradio, and ChromaDB

March 11, 2025

I’m definitely new to a lot of things on the AI side. Luckily, that’s understandable given that AI itself is relatively new.

However, I tend to learn through personal experience: I can read about something, but it does not necessarily stick. It works much better when I spend some time trying things out, running into errors, and cursing myself for making stupid mistakes…

So here is the outcome of one such project – basically, an attempt to create a personal AI assistant from scratch using the following (there is a quick sketch of how the pieces fit together right after this list):

  • OpenAI API
  • ChromaDB for RAG
  • Python
  • Gradio for UI
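To make that a bit more concrete, here is a minimal sketch of how these pieces fit together. It is not the actual code from the repo – the names (kb, respond) and the prompt wording are just my placeholders – but it shows the overall shape: ChromaDB keeps the local knowledge base, the OpenAI API generates the answers, and Gradio provides the chat UI.

```python
# A minimal sketch (not the repo's actual code) of how the pieces fit together.
import chromadb
import gradio as gr
from openai import OpenAI

openai_client = OpenAI()                                # reads OPENAI_API_KEY from the environment
chroma = chromadb.PersistentClient(path="./chroma_db")  # local, file-based vector store
kb = chroma.get_or_create_collection(name="knowledge_base")

def respond(message, history):
    # Retrieve the snippets from the local knowledge base most similar to the question
    results = kb.query(query_texts=[message], n_results=3)
    context = "\n".join(results["documents"][0])

    # Put the retrieved snippets into the prompt so the model can use them as context
    messages = [
        {"role": "system",
         "content": "Answer the user's question. Use the context below when it is relevant.\n\n"
                    "Context:\n" + context},
        {"role": "user", "content": message},
    ]
    completion = openai_client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return completion.choices[0].message.content

gr.ChatInterface(fn=respond).launch(server_port=7855)
```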

All of that was new to me as of three months ago, but I think the results are quite interesting. Of course, one could probably do even more impressive things using, for example, Microsoft Copilot. The difference is that… this way I have full control of what’s happening, it’s almost “free” (OpenAI API costs have been negligible so far, which may have to do with using the “gpt-4o-mini” model), and I have my own local RAG. But, most importantly, I’m now starting to understand how things work (which is different from just knowing them in theory).

Either way, here is a demo that shows how I can add new content to my local knowledge base, and how that content is then incorporated into the prompts to provide contextual responses.
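Under the hood, adding something to the knowledge base can be as simple as an upsert into the ChromaDB collection, which embeds the text with its default embedding model so it can be found later by semantic similarity. Something along these lines (continuing the sketch above; add_to_kb and the sample task are just made-up illustrations):

```python
import uuid

def add_to_kb(text: str):
    # Store a new piece of knowledge in the same collection the chat function queries.
    # ChromaDB embeds the text so it can later be retrieved by semantic similarity
    # rather than by exact keyword match.
    kb.add(documents=[text], ids=[str(uuid.uuid4())])

add_to_kb("Task: renew the SSL certificate before the end of the month.")
```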

You’ll see a couple of tasks added to the list (there is no special formatting there; once they eventually make it into the prompt, the model takes care of formatting its output as needed).

You’ll also see how the model initially does not know that Donald Trump was re-elected (yep, could not help but use that as an example). Then, on the second try (once I’ve added the additional details to the knowledge base), it provides a more accurate response.
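The mechanics behind that second, corrected answer are just the retrieval step at work: once the new fact is in the collection, a similarity search on the question brings it back, and it gets placed into the prompt as context. Roughly (again continuing the earlier sketch; the id is arbitrary):

```python
# The model's training data does not include the new fact, but the knowledge base does.
kb.add(documents=["Donald Trump won the 2024 US presidential election and was re-elected."],
       ids=["election-2024"])

# On the next question, the similarity search surfaces the new fact...
results = kb.query(query_texts=["Was Donald Trump re-elected?"], n_results=3)
print(results["documents"][0])
# ...and, once it is included in the prompt as context, gpt-4o-mini answers accordingly.
```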

This would not be complete without a link to the corresponding Git repo:

https://github.com/ashlega/personal_ai_chat

I’ll add some instructions there shortly, but here is a quick summary for now:

  • There is a docker-compose file there that you can use to build and start the Docker container
  • Once the container is up, you should be able to access your personal AI assistant in the browser using this URL: http://localhost:7855/?userid=123
  • You can replace 123 with any combination of numbers/characters – this essentially identifies your user, and every user gets their own local knowledge base (see the sketch right after this list)
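As for how ?userid=123 can translate into a per-user knowledge base: Gradio can hand the underlying HTTP request to a handler that declares a gr.Request parameter, and the query parameter can then pick a user-specific ChromaDB collection. A sketch of that idea (the collection naming and the "default" fallback are my assumptions, not necessarily what the repo does):

```python
import chromadb
import gradio as gr

chroma = chromadb.PersistentClient(path="./chroma_db")

def respond(message, history, request: gr.Request):
    # Gradio injects the underlying HTTP request into handlers that declare a
    # gr.Request parameter, so the ?userid=... query value is available here.
    userid = request.query_params.get("userid", "default")
    # Each user gets their own ChromaDB collection, i.e. their own knowledge base.
    kb = chroma.get_or_create_collection(name=f"kb_{userid}")
    docs = kb.query(query_texts=[message], n_results=3)["documents"][0]
    # ...the retrieved snippets then go into the prompt exactly as in the first sketch
    return "\n".join(docs)

gr.ChatInterface(fn=respond).launch(server_name="0.0.0.0", server_port=7855)
```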

A word of caution: this is not meant to be production-ready. This is a pet project meant to explore what AI can do, how to build RAG, how to use ChromaDB… without adding any other libraries to the mix (so there is no LangChain or AgenticAI, for example).
