
Therapy Optimization

Creating personalized medical treatments with explainability

Imagine we want to check what different patient groups have in common to understand why a therapy does not work for everyone.

How do we define patient groups if we have no idea where to start? We could simply run a batch of experiments. But as you can imagine, people differ on many levels, from blood type to music taste.

So how about we cluster patients automatically based on the data we have already collected, to arrive at an initial, well-targeted hypothesis?

It will save us millions on research resources and speed up the process by years.

In this case, we had:

  • Microscopy images: Tissue samples showing cell structures, originally used to detect the cancer and to assess treatment success
  • Health parameters: Blood test results and other clinical measurements
  • Patient information: Basic parameters such as ethnicity, gender, and diseases other than the one currently treated
  • Treatment details: Type, dosage, and timing of therapy
  • Outcomes: Survival and treatment response (this is the value we predict, called the label; since we have labels, this is supervised learning)

We predicted the likelihood of survival based on the images.
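
To make this concrete, here is a minimal sketch of what such a survival classifier could look like. The original architecture and data layout are not specified, so a pretrained ResNet, a simple folder structure, and the hyperparameters below are assumptions.

```python
# Minimal sketch (assumed setup, not the original code): fine-tune a pretrained
# CNN to predict survival (label 0 = deceased, 1 = survived) from microscopy tiles.
# Folder layout, architecture and hyperparameters are illustrative.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import models, transforms
from torchvision.datasets import ImageFolder

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical layout: tiles/train/<deceased|survived>/*.png
train_set = ImageFolder("tiles/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)        # two classes: deceased / survived

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

torch.save(model.state_dict(), "survival_model.pt")
# The softmax probability of the "survived" class is the predicted likelihood of survival.
```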

We then used explainable AI (xAI) to find the region of interest (ROI): the part of each image the model relied on most for its prediction.
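
The text only says "xAI" and does not name a method; Grad-CAM (here via the captum library) is one common choice for image models and is assumed in the sketch below. The heatmap it produces is thresholded to get a bounding box around the ROI.

```python
# Sketch of the xAI step, assuming Grad-CAM via captum (one common choice).
import torch
import torch.nn as nn
from torchvision import models
from captum.attr import LayerGradCam, LayerAttribution

model = models.resnet18()
model.fc = nn.Linear(model.fc.in_features, 2)
model.load_state_dict(torch.load("survival_model.pt"))   # model from the previous sketch
model.eval()

gradcam = LayerGradCam(model, model.layer4)               # last convolutional block

def extract_roi(image_tensor, threshold=0.5):
    """Bounding box (x_min, y_min, x_max, y_max) of the region most important
    for the 'survived' prediction (class index 1)."""
    attr = gradcam.attribute(image_tensor.unsqueeze(0), target=1)
    heatmap = LayerAttribution.interpolate(attr, image_tensor.shape[1:])  # upsample to image size
    heatmap = heatmap.squeeze().relu()
    heatmap = heatmap / (heatmap.max() + 1e-8)            # normalize to [0, 1]
    ys, xs = torch.where(heatmap > threshold)
    if len(ys) == 0:                                       # nothing salient enough
        return None
    return xs.min().item(), ys.min().item(), xs.max().item(), ys.max().item()
```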

Next, we clustered the patients based on their ROIs (which are basically small images themselves) and looked at their patient information. Then we test whether the survival outcomes differ between the resulting groups.
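
A possible sketch of this step, assuming we already have one ROI embedding per patient. The file names, the number of clusters, and the chi-square test are illustrative choices, not taken from the original project.

```python
# Sketch of the clustering and comparison step.
import numpy as np
from sklearn.cluster import KMeans
from scipy.stats import chi2_contingency

roi_features = np.load("roi_features.npy")    # hypothetical: one embedding per patient's ROI crop
survived = np.load("survived.npy")            # hypothetical: binary outcome per patient

clusters = KMeans(n_clusters=2, random_state=0).fit_predict(roi_features)
np.save("clusters.npy", clusters)

# 2x2 contingency table: cluster membership vs. survival
table = np.array([
    [np.sum((clusters == c) & (survived == s)) for s in (0, 1)]
    for c in (0, 1)
])
chi2, p_value, _, _ = chi2_contingency(table)
print(f"Do survival outcomes differ between ROI clusters? p = {p_value:.4f}")
```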

If so, then we investigate further.

We checked what the patients within each group have in common and looked for an attribute that could explain the difference, for example gender.
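
One simple way to check this is a chi-square test between cluster membership and each categorical patient attribute. The CSV file and column names below are hypothetical.

```python
# Sketch: test which patient attributes are associated with the ROI clusters.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

patients = pd.read_csv("patients.csv")         # hypothetical table, one row per patient
patients["cluster"] = np.load("clusters.npy")  # cluster assignment from the previous sketch

for attribute in ["gender", "ethnicity", "blood_type"]:
    table = pd.crosstab(patients["cluster"], patients[attribute])
    _, p_value, _, _ = chi2_contingency(table)
    print(f"{attribute}: association with cluster, p = {p_value:.4f}")
```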

Based on which subgroup (for example females or males) was more likely to survive, we then optimized the treatment for that specific group, such as reducing or increasing the dosage (e.g. the radiation intensity in radiation therapy).

Then we check if the survival rate improves.
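
A two-proportion z-test is one simple way to do that check; the counts below are placeholders.

```python
# Sketch: compare survival rates before and after the dosage adjustment.
from statsmodels.stats.proportion import proportions_ztest

survivors = [42, 57]            # [before adjustment, after adjustment] -- placeholder numbers
patients_total = [100, 100]

stat, p_value = proportions_ztest(survivors, patients_total, alternative="smaller")
print(f"Survival rate improved? p = {p_value:.4f}")
```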

In the next round, we can subdivide further based on other criteria, such as the medication patients took.

And based on the ROI we found previously, we can train another AI model that predicts whether a similar ROI is present in a new patient's image, and use that to suggest which therapy parameters make the most sense before we even subdivide the groups manually.
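
A minimal sketch of such a second model, assuming we have whole-image embeddings and an ROI-presence label derived from the earlier steps; the file names and the choice of logistic regression are assumptions.

```python
# Sketch of the second model: detect whether the previously found ROI pattern is present.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

image_features = np.load("image_features.npy")  # hypothetical whole-image embeddings
roi_present = np.load("roi_present.npy")        # 1 if an ROI similar to the discovered one was found

X_train, X_test, y_train, y_test = train_test_split(
    image_features, roi_present, test_size=0.2, random_state=0
)
roi_detector = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("ROI-detector accuracy:", roi_detector.score(X_test, y_test))

# The detector's prediction can then be mapped to a suggested therapy parameter
# (e.g. dosage) before any manual subgrouping is done.
```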

But of course we want to understand why, especially in healthcare. So we still put in the work to build an interpretable decision model we can trust, instead of relying on another black-box model without a simple this-then-that logic we can follow.
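
A shallow decision tree over the clinical parameters is one way to get exactly that kind of readable this-then-that logic; the feature names below are hypothetical.

```python
# Sketch of an interpretable decision model printed as plain if/then rules.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

patients = pd.read_csv("patients.csv")            # hypothetical table, one row per patient
features = ["gender_female", "age", "dosage", "roi_present"]

tree = DecisionTreeClassifier(max_depth=3).fit(patients[features], patients["survived"])
print(export_text(tree, feature_names=features))  # human-readable decision rules
```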

This is how we create more personalized therapies using explainability.

