Guidance, Not Just Data: Redefining Survivor Recovery with AI

Written by sumo2o | Published 2025/05/19
Tech Story Tags: ai | ai-empathy | retrieval-augmented-generation | sanjeevani-ai | healing-architecture | nlp-+-metadata-extraction | agentic-rag-layer | hybrid-recommender-layer

TL;DR: Cancer survivors face a relentless cycle of urgency, bloating, fatigue, and pain. The tools we have today can track. They can inform. But they don’t guide. We need a tool that can help us turn ‘setbacks’ into ‘comebacks’.

This story is about how we built a Retrieval-Augmented Generation system for colorectal cancer survivors who needed daily, human-centered healing guidance.

The Problem

Most RAG systems are built to surface facts, not feelings. They power search, chatbots, and enterprise Q&A. But what if your question isn’t just about “how,” but about how to live again?

Let me tell you a story. A story of a survivor who dreamed, who aspired, who wanted so much from life.

Yes, it’s me.
It’s my story.

I am a survivor. I survived the harsh battle against cancer.

But when the treatment ended and the surgery was done, I thought I could go back to dreaming, aspiring, achieving…

Only to learn that the battle wasn’t over yet.

It had just changed shape.

Most cancer survivors are told they're “done” after treatment. But for almost all of us, that's when the real struggle begins.

As a colorectal cancer survivor living with LARS (Low Anterior Resection Syndrome), I face a relentless cycle of urgency, bloating, fatigue, and pain.

Symptoms that aren’t just unpredictable. They are uncontrollable.

LARS doesn’t just disrupt life. It erases it.

I stopped eating normally. I stopped leaving my home. I stopped trusting my body.

I’m not living. I’m existing.

Dejected and depressed, I wanted to give up. But I was determined to turn ‘setbacks’ into ‘comebacks’. I started researching and reaching out to people sailing in the same boat as me. I heard their stories.

One survivor said: “My legs are shaking now. No strength left. It’s been a week of constipation. I’m angry.”

Another shared: “I’ve had severe, nonstop diarrhea for days. Even my legs shake from the weakness. My BP is 69/54. I’m so exhausted.”

These extremes, incontinence on one end and no stool at all on the other, trap survivors in a constant state of distress.

And through it all, the world assumes we’ve recovered.

Why the Tools We Have Aren’t Enough

I do what most survivors do.

I track symptoms. I log food. I Google. I scroll through Facebook groups. I try meditation apps and fiber supplements.

I even ask ChatGPT questions about bloating, diarrhea, constipation, fatigue.

Sometimes, it answers kindly. Sometimes, it gives generic advice. But mostly, it doesn’t understand what I’m going through…not really.

The tools we have today can track. They can inform. But they don’t guide.

And they definitely don’t hold space for the emotional chaos, the fear, the identity loss that comes with every flare.

I’m not looking for data.
I’m looking for healing.

And if no tool today can do that, then I had to build one that would. As I wrote in my earlier post, that’s how SANJEEVANI AI was born.

From Information Retrieval to Recovery Guidance

In the world of AI, RAG (Retrieval-Augmented Generation) is used to fetch relevant documents so language models can generate more accurate answers.

But survivors don’t need documents.
We don’t need accuracy in the academic sense.

We need resonance.

So we reimagined RAG not as a search tool, but as a healing architecture.

Instead of pulling journal articles, we taught the system to retrieve:

  • Survivor symptom logs
  • Food triggers and recovery routines
  • Reflections, emotional states, and pattern-matched outcomes

We trained it to search for what helped, not just what was written.

And we didn’t stop there. We layered in a reasoning engine that does more than recommend what to eat. For example, if a user says, “I feel constipated,” it doesn’t just suggest “eat more fiber”; it explains why that may not work for someone with post-radiation bowel damage.

We achieved this through a deeply personal, survivor-trained RAG pipeline.
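To make the reasoning idea concrete, here is a toy Python sketch. The rules, intervention names, and explanations are hypothetical illustrations, not SANJEEVANI AI’s actual logic; a production reasoning engine would condition on far richer context.

```python
# Hypothetical sketch: attach an explanation to a recommendation,
# conditioned on the survivor's treatment history. All rules and
# wording below are illustrative assumptions.

def recommend_with_reason(symptom, history):
    if symptom == "constipation":
        if "pelvic radiation" in history:
            return ("soluble fiber + hydration",
                    "Insoluble fiber alone can worsen post-radiation bowel "
                    "irritation; soluble fiber with fluids is gentler.")
        return ("more fiber",
                "Fiber adds bulk and usually eases constipation.")
    return ("log more detail", "Not enough context to reason from yet.")

suggestion, why = recommend_with_reason("constipation", {"pelvic radiation"})
print(suggestion, "-", why)
```

The point is the pairing: every suggestion ships with a reason the survivor can evaluate against their own body.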

How Our Healing Architecture Works

We designed SANJEEVANI AI with a simple but radical idea:
healing isn’t a single answer — it’s an adaptable, daily flow.

Traditional AI systems stop at response generation.
But survivorship requires sensing, retrieving, reasoning, and explaining — all grounded in personal context.

So we built a multi-layered architecture that does exactly that.

1. Smart Input Layer

The journey begins when a survivor logs something:
“Breakfast: oats with banana. Felt bloated. Urgency. Mood: anxious.”

Whether it's voice or text, our interface converts that into 28+ structured data points including:

  • Symptom severity
  • Emotional state
  • Food types
  • Energy levels
  • Bowel behavior (e.g., Bristol scale)

We call this our Smart Logging System. It's voice-enabled, multilingual-ready, and designed for compassionate expression, not just data capture.
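Here’s a minimal sketch of what that conversion step might look like. The field names, keyword lists, and parsing rules are assumptions for demonstration only; the real Smart Logging System extracts 28+ data points from voice or text.

```python
import re
from dataclasses import dataclass, field

# Illustrative sketch of turning a free-text log entry into a few
# structured data points. The schema and vocabularies are hypothetical.

@dataclass
class StructuredLog:
    foods: list = field(default_factory=list)
    symptoms: list = field(default_factory=list)
    mood: str = ""

KNOWN_FOODS = {"oats", "banana", "rice", "yogurt"}           # assumed vocabulary
KNOWN_SYMPTOMS = {"bloated": "bloating", "urgency": "urgency",
                  "fatigue": "fatigue", "pain": "pain"}       # assumed mapping

def parse_log(text: str) -> StructuredLog:
    log = StructuredLog()
    for tok in re.findall(r"[a-z-]+", text.lower()):
        if tok in KNOWN_FOODS:
            log.foods.append(tok)
        if tok in KNOWN_SYMPTOMS:
            log.symptoms.append(KNOWN_SYMPTOMS[tok])
    mood = re.search(r"mood:\s*(\w+)", text.lower())
    if mood:
        log.mood = mood.group(1)
    return log

entry = "Breakfast: oats with banana. Felt bloated. Urgency. Mood: anxious."
print(parse_log(entry))
```

Even this toy version shows the principle: one sentence of lived experience becomes several machine-readable signals.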

2. NLP + Metadata Extraction Layer

Once the log is captured, it flows into a BERT-based NLP layer, fine-tuned on:

  • Colorectal cancer survivorship language
  • Symptom-event-emotion triplets
  • Layperson-to-clinical translation patterns

This layer extracts:

  • Intent (e.g., looking for relief or routine validation)
  • Emotion (fear, frustration, confusion)
  • Temporal markers (e.g., post-meal, post-TAI, post-bowel movement)

We enrich every log with metadata embeddings that describe more than what’s said: they infer what’s felt.
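As a simplified stand-in for that fine-tuned BERT layer, the sketch below uses a tiny lexicon to pull out intent, emotion, and temporal markers. Every word list and rule here is an assumption for illustration; the production layer learns these mappings from survivorship language.

```python
# Toy stand-in for the NLP + metadata extraction layer. In production this
# is a fine-tuned BERT model; here a small lexicon illustrates the outputs.

EMOTION_LEXICON = {                      # assumed word-to-emotion mapping
    "anxious": "fear", "scared": "fear",
    "angry": "frustration", "frustrated": "frustration",
    "confused": "confusion",
}
TEMPORAL_MARKERS = {"post-meal", "post-tai", "post-bowel movement"}

def extract_metadata(text: str) -> dict:
    lowered = text.lower()
    emotions = {EMOTION_LEXICON[w] for w in EMOTION_LEXICON if w in lowered}
    temporal = {m for m in TEMPORAL_MARKERS if m in lowered}
    intent = "seeking_relief" if "?" in text or "help" in lowered else "logging"
    return {"intent": intent,
            "emotions": sorted(emotions),
            "temporal": sorted(temporal)}

print(extract_metadata("Felt bloated post-meal, so anxious. What can help?"))
```

The real model does this probabilistically across thousands of phrasings, but the output shape is the same: a log enriched with what the survivor meant and felt.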

3. Agentic RAG Layer

Here’s where our retrieval engine comes in. But unlike traditional RAG, we don’t pull PDFs or papers.

We retrieve:

  • Curated survivor logs (anonymized)
  • Treatment-to-symptom maps
  • Food-to-impact relationships
  • Peer recommendations + emotional recovery timelines
  • Doctor-validated insights, tagged by symptom patterns

This data lives in a vector store, where each entry is embedded not just by keywords, but by:

  • Symptom clusters
  • Emotional energy
  • Healing success likelihood

When a survivor logs a new entry, the system retrieves other cases that “feel” similar, statistically as well as experientially.
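A bare-bones version of that retrieval step might look like this. The three-dimensional vectors and store contents are toy assumptions (imagine axes like symptom cluster, emotional energy, healing-success likelihood); a real system uses learned embeddings and an approximate-nearest-neighbor index.

```python
import math

# Minimal sketch of retrieval over a vector store of anonymized survivor
# logs. Entries and their 3-d embeddings are invented for illustration.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

VECTOR_STORE = {
    "log_a: bloating after oats, calmed by walking": [0.9, 0.4, 0.7],
    "log_b: constipation for a week, high stress":   [0.1, 0.9, 0.2],
    "log_c: urgency post-meal, soluble fiber helped": [0.8, 0.5, 0.8],
}

def retrieve(query_vec, k=2):
    ranked = sorted(VECTOR_STORE.items(),
                    key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

# A new bloating/urgency log embeds near log_a and log_c, not log_b.
print(retrieve([0.85, 0.45, 0.75]))
```

Similarity here is cosine distance in embedding space; because the embeddings encode emotion and outcome, “nearest” means “lived something like this.”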

4. Hybrid Recommender Layer

After retrieval, the system moves to recommendation. Rather than one-size-fits-all, it takes a hybrid approach:

  • Content-based filtering suggests interventions aligned with the survivor’s past entries and current state
  • Collaborative filtering matches anonymized cohorts of similar survivors (age, symptom pattern, treatment type)

From this, SANJEEVANI AI chooses the most context-aware suggestion:

  • A food change
  • A lifestyle routine
  • A mindset shift
  • Or just a calming “you’re doing okay” moment
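The blending itself can be sketched in a few lines. The candidate interventions, scores, and the 0.6/0.4 weighting below are all assumed values, just to show how the two signals combine.

```python
# Illustrative hybrid recommender: blend a content-based score (from the
# survivor's own history) with a collaborative score (from a matched cohort).
# Weights and scores are assumptions for demonstration.

def hybrid_recommend(content_scores, collab_scores, w_content=0.6):
    blended = {}
    for item in set(content_scores) | set(collab_scores):
        blended[item] = (w_content * content_scores.get(item, 0.0)
                         + (1 - w_content) * collab_scores.get(item, 0.0))
    return max(blended, key=blended.get)

# Hypothetical scores from the survivor's own past entries:
content = {"low-FODMAP lunch": 0.8, "short walk": 0.5, "rest day": 0.3}
# Hypothetical scores from a cohort of similar survivors:
collab = {"low-FODMAP lunch": 0.6, "short walk": 0.9, "breathing break": 0.7}

print(hybrid_recommend(content, collab))
```

An item only the cohort likes can still win, and an item only your history supports can too; the weighting decides whose evidence counts more today.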

5. QoLGPT Layer

Recommendations are one thing, but how they’re explained is everything.

Our QoLGPT layer is a fine-tuned GPT model trained on:

  • Survivor storytelling
  • Peer-to-peer support threads
  • Emotionally regulated, non-clinical language
  • Compassionate, non-directive coaching tones

It explains why a certain meal may help.
It validates fears without making assumptions.
And most importantly, it doesn’t medicalize suffering, it meets it.

6. Feedback Loop

Every interaction is logged, not just for metrics, but so the system can learn and adapt.

If a survivor rejects a recommendation, or logs that something helped or hurt, the system adapts.

Survivors train our model by living rather than by tagging data.
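One simple way to picture that adaptation: nudge an intervention’s weight toward 1 when it helped and toward 0 when it hurt. The update rule and learning rate here are assumptions, not the production learning logic.

```python
# Illustrative feedback-loop update: an exponential nudge of a
# recommendation's per-user weight based on reported outcome.
# The rule and learning rate (lr) are hypothetical.

def update_weight(weight, helped, lr=0.1):
    target = 1.0 if helped else 0.0
    return weight + lr * (target - weight)

weights = {"more fiber": 0.5}
# The survivor reports twice that "more fiber" made things worse:
for _ in range(2):
    weights["more fiber"] = update_weight(weights["more fiber"], helped=False)

print(round(weights["more fiber"], 3))
```

After two “it hurt” reports, the weight has drifted down, so the suggestion surfaces less often for this survivor, with no manual data tagging required.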

A New Kind of Intelligence for a New Kind of Healing

I didn’t set out to build an AI company.

I set out to survive.

And in that process, I discovered something that no spreadsheet, chatbot, or PDF could give me:
the right insight at the right time with the right tone.

Healing isn’t one decision. It’s a thousand tiny ones.
When to eat. What to eat. Whether to walk. Whether to rest. Whether to try again after yesterday failed.

What survivors need is not more content. We need a system that sees us, not just our symptoms.

That’s what SANJEEVANI AI is.

An intelligent, adaptable, empathetic healing system trained on real human lives.

It doesn’t diagnose or treat, because that’s the medical oncologist’s job.
It listens. It adapts. It guides. It’s not built to replace doctors. It’s built to support the space between the hours, days, and months when no one else is there but the pain, the food, the fear… and the survivor.

We used RAG not to search. But to remember.

We used GPT not to answer. But to explain.

And we used our own stories as survivors, not as anecdotes, but as data.

Because the next generation of AI doesn’t need to be just intelligent. It needs to be kind.


Written by sumo2o | I’m Suneeta Modekurty, a Senior Data Scientist and Bioinformatician with a passion for exploring AI in healthcare
Published by HackerNoon on 2025/05/19