Using RAG and Ollama to Make a Health Bot
I used RAG to make a medical diagnosis bot that uses med school textbooks to answer questions. Here are the 4 steps I took:
1. RAG storage
I first used Dabarqus, our no-code RAG app (coming soon from Electric Pipelines), to store some public medical textbooks; for this demo, mainly a neurology textbook.
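Dabarqus isn't public yet, so as a rough stand-in, here is what the storage step looks like conceptually: split the textbook into chunks and store a vector per chunk. The chunking and the bag-of-words "embedding" below are simplified assumptions for illustration, not Dabarqus internals (a real setup would use learned embeddings).

```python
import re
from collections import Counter

def chunk_text(text, max_words=50):
    """Split a textbook into fixed-size word chunks for retrieval."""
    words = text.split()
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]

def build_index(chunks):
    """Store a bag-of-words vector per chunk (stand-in for real embeddings)."""
    return [(chunk, Counter(re.findall(r"[a-z']+", chunk.lower()))) for chunk in chunks]

# Hypothetical textbook snippet, just to show the shape of the index
textbook = "Peripheral neuropathy often presents with distal numbness and tingling."
index = build_index(chunk_text(textbook))
```

Each index entry pairs the original chunk with its vector, so retrieval later can score chunks without re-reading the source text.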
2. Evaluating the diagnosis
I then found an example neurological assessment online and had Google Gemini summarize it.
3. RAG retrieval
I then fed Gemini's summary of the assessment, plus a little context, to Dabarqus to retrieve relevant passages from the stored textbooks.
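Under the hood, retrieval means ranking stored chunks by similarity to the summary. Here is a minimal sketch using cosine similarity over bag-of-words vectors; the example chunks and the scoring are illustrative assumptions, not how Dabarqus actually ranks.

```python
import math
import re
from collections import Counter

def vectorize(text):
    """Bag-of-words vector (stand-in for a real embedding model)."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=2):
    """Return the k stored chunks most similar to the summarized assessment."""
    qv = vectorize(query)
    return sorted(chunks, key=lambda c: cosine(qv, vectorize(c)), reverse=True)[:k]

# Hypothetical stored chunks and summary, for illustration only
chunks = [
    "Distal symmetric numbness and tingling suggest peripheral neuropathy.",
    "Migraine presents with unilateral throbbing headache and photophobia.",
    "Resting tremor and rigidity are hallmarks of Parkinson's disease.",
]
summary = "Patient reports numbness and tingling in both feet."
top = retrieve(summary, chunks, k=1)
```

With this summary, the neuropathy chunk scores highest because it shares the most distinctive terms with the query.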
4. Interpretation of RAG data
Finally, I passed the retrieved passages to Llama 3 via Ollama to present the answer in a user-friendly way.
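The interpretation step boils down to wrapping the retrieved passages and the question into a chat prompt for Llama 3. A minimal sketch (the prompt wording is my own assumption; the commented-out call uses the `ollama` Python client and requires Ollama running locally with the `llama3` model pulled):

```python
def build_messages(chunks, question):
    """Wrap retrieved textbook passages and the question into a chat prompt."""
    context = "\n\n".join(chunks)
    return [
        {"role": "system",
         "content": "You are a helpful medical assistant. "
                    "Answer using only the textbook excerpts provided."},
        {"role": "user",
         "content": f"Textbook excerpts:\n{context}\n\nQuestion: {question}"},
    ]

messages = build_messages(
    ["Distal symmetric numbness and tingling suggest peripheral neuropathy."],
    "What could explain numbness in both feet?",
)

# Requires a local Ollama server (`ollama pull llama3` first):
# import ollama
# reply = ollama.chat(model="llama3", messages=messages)
# print(reply["message"]["content"])
```

Keeping the system prompt restricted to the provided excerpts is what makes this RAG rather than free-form generation: the model is asked to ground its answer in the retrieved text.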
I’m really happy with how it turned out! There’s still a lot of improvement that could be (and will be) made, but this version works as a proof of concept. I still want to add more health info so that I can build an accessible health bot, but one step at a time.