At Piction Health, our mission is to help every patient with a skin disease go from sick to healthy as fast as possible. We also believe in being completely open and transparent about new features on our roadmap that equip doctors with the right set of tools to make the most informed decision the first time.
We build new features after consulting with our internal team of primary care physicians, dermatologists, and clinician users of our app.
What is Explainable AI (XAI)?
The ubiquity of AI models in almost every technology we interact with daily creates a pressing need to understand how these models make decisions. That need is even greater for AI models that operate on medical imaging data and are integrated into clinical decision support tools advising doctors on the next course of action for a patient. Opening the black box of AI decision making is essential to build trust between the doctor and the AI tool, and to help doctors make decisions with confidence.
In the dermatology domain specifically, explainable AI models give clinicians valuable hints about the region of the image the AI focused on most to arrive at a result. Unlike radiology and pathology images, photos taken with a smartphone vary widely. Highlighting the specific region within an image gives the clinician confidence that the AI focused on the right region and the right information to generate an accurate visual search.
When primary care physicians evaluate a skin condition, they often cannot be sure of all the conditions a patient could have. When a dermatologist evaluates a skin condition, they typically build a list of conditions that look similar, then go through them one by one, ranking each as more or less likely based on their experience and the patient’s medical history.
Our app uses AI to build this list of conditions by evaluating the specific features in the photo of the skin condition.
However, it’s not always obvious which area of the rash to focus on. For many diseases, the appearance of the rash varies with severity, body part, and skin tone, which makes it difficult to know which part of the rash to examine to reach a confident conclusion.
How we solved the challenge
Our AI models are trained with a semantic segmentation approach, which identifies the groups of pixels within the image that the model used to come up with a list of visually similar conditions. Highlighting these pixels not only gives physicians confidence in the outcome, but also provides valuable hints about which area of the rash to focus on.
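To illustrate the idea at a high level (a minimal sketch, not our production pipeline; the class indices and function name here are hypothetical), a semantic segmentation model scores every pixel for each class, and the pixels assigned to the lesion class form the region that gets highlighted:

```python
import numpy as np

def lesion_mask(logits, lesion_class=1):
    """Turn per-pixel class scores from a segmentation model into a
    binary mask of the pixels the model attributes to the lesion.

    logits:       (C, H, W) array of per-class scores for each pixel
    lesion_class: index of the lesion class (illustrative; 0 is
                  treated as background here)
    """
    # Assign each pixel the class with the highest score
    labels = np.argmax(logits, axis=0)
    # Keep only the pixels labeled as lesion
    return labels == lesion_class
```

The resulting boolean mask can then be drawn over the original photo so the clinician sees exactly which pixels drove the result.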
Uncovering the black box of AI decision making is critical: doctors need to understand how our AI generates its results so they can confidently interpret those results and make the best decisions possible.
The example below offers a peek into how we help physicians understand AI decision making by producing a heat map of the pixels in the original image that contributed to the differential diagnosis.
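To make the heat map concrete, here is a minimal sketch of how a per-pixel relevance map can be blended onto the original photo (assuming the model exposes such a map; the function and its parameters are illustrative, not our actual interface):

```python
import numpy as np

def overlay_heatmap(image, saliency, alpha=0.4):
    """Blend a per-pixel relevance map onto an RGB photo as a red heat map.

    image:    (H, W, 3) uint8 RGB photo of the skin condition
    saliency: (H, W) float array of per-pixel relevance (any range)
    alpha:    maximum blend weight for the heat-map layer
    """
    # Normalize relevance scores to [0, 1]
    s = saliency.astype(float)
    s = (s - s.min()) / (s.max() - s.min() + 1e-8)

    # Build a simple red heat map with intensity proportional to relevance
    heat = np.zeros_like(image, dtype=float)
    heat[..., 0] = 255.0 * s  # red channel only

    # Alpha-blend, weighting by per-pixel relevance so low-relevance
    # regions keep the original photo fully visible
    w = alpha * s[..., None]
    blended = (1 - w) * image + w * heat
    return blended.astype(np.uint8)
```

Pixels with zero relevance pass through unchanged, while high-relevance pixels are tinted red, giving the clinician an at-a-glance view of where the model looked.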
AI explainability is still a developing concept for AI models used in medical imaging. Regulatory bodies are also weighing in, inviting opinions from industry experts to shape a regulatory framework for software-enabled medical devices that use AI. We will continue to monitor this space and improve our transparency interface. We are just getting started.