TechRepublic’s Karen Roby spoke with Jon Friis, CEO and founder of Miiskin, about how the Miiskin app is helping prevent skin cancer. The following is an edited transcript of their conversation.
Karen Roby: We understand how technology can help change things in medicine, such as robots in the operating room, and we're seeing all kinds of really innovative things going on. Observing the moles on our skin is one of those things I would never have thought technology would play a role in. Before we get to the technology part of this, the augmented reality and the machine learning: How did you come up with this idea? Tell us a little bit more about it.
SEE: 5 Internet of Things (IoT) innovations (free PDF) (TechRepublic)
Jon Friis: I'm an IT consultant. I've been in the IT industry for many years. And I'm living with my partner here, and she's definitely at high risk of developing skin cancer. She's blonde, blue-eyed, and probably did too much sunbathing and tanning in her youth. She's already had 12 moles removed, which was a concern to the doctor. But what they kept repeating to us was that we had to be extremely aware of how the skin, and especially the moles on it, develops. I was kind of frustrated at not being able to do anything.
With these powerful tools and imaging technologies available, I couldn't help thinking that there was something you could do as a patient, or as a concerned user, at home in between your clinical consultations. Because the single most important thing the doctor tells you is that if anything is changing on your skin, whether a mole is changing in size, shape, or color, or anything new appears, you should have it checked out. That's a very easy, understandable recommendation from the clinical side, but it's not trivial to do. That's why I came up with the idea of using images to support these patients and users at home in between their clinical consultations.
Karen Roby: With your tech background, it's very interesting how you've married the two here, especially coming from your own personal experience. I certainly have my fair share of issues from being light-skinned and spending way too much time out in the sun years ago, so I'm paying for that now. So this is very interesting. Talk about the machine learning behind it, whether it's virtual reality or AR. What's involved here, and how did it go into really putting this platform out there?
Jon Friis: The whole technology is a skin-monitoring app that supports users in checking themselves, identifying and documenting the changes they can see over time, and comparing these images. We have several high-fidelity, very advanced technologies, developed over time, that support the user in recognizing these lesions and elements on the skin so they can compare them better. We have a feature where you basically place the phone on a table and stand in front of it. It automatically takes full-body images of you, so you can compare these images over time and see if anything new is coming up.
SEE: Future of 5G: Projections, rollouts, use cases, and more (free PDF) (TechRepublic)
And recently, we released what we call the mole-sizing feature. That is totally new, and it gives you the capability to measure moles, because size is an important factor to the doctor. The recommendation is that when a mole is above six millimeters, that's something you should have checked by your doctor or dermatologist. And if a mole or lesion is changing over time, growing or changing shape or color, that's also something the doctor should know about and have checked. But it's not trivial to make these observations or to take these measurements, and that's why we invented the mole-sizing feature.
Karen Roby: This is really the epitome, I think, Jon, of how technology can help empower people to take control of their health. Rather than just saying, "Well, I'm going to go in once a year and hope the doctor finds something," they're really taking part in their own care.
Jon Friis: I think that's the change you see in the landscape now: active patient involvement and engagement at home, where patients and users invest their own time in collecting these data points and measurements that clinicians can later use for different things, so the clinician can get a better understanding of what has happened. How did it look a year ago? What was the measurement a year ago? And we see that change rapidly ramping up as more and more users adopt our technology.
Karen Roby: Jon, with technology having been your background for years now, would you have thought, years ago, that you would see machine learning used in this way, helping with someone's health, sizing moles, and following changes like that?
Jon Friis: I don't know if I've been thinking in that direction. You see technology, AI, and machine learning supporting businesses everywhere in the world, so also in the health space. But I think it was maybe somewhat overly hopeful to expect that these advanced technologies would be able to diagnose or assess on their own.
SEE: Gaming healthcare: Virtual reality gives surgeons life-like training (TechRepublic)
I actually think the first real use case, and the first successful adoption of this, is to support getting more information at the right time, getting more precise information, and strengthening the relationship between the patient and the doctor. That doesn't mean you won't, at some point, be able to do something more advanced, but where we are in the current scenario, it's about creating convenient technology so the patient-doctor relationship becomes stronger, so to speak.
Karen Roby: Jon, specifically when we talk about monitoring the size of our moles through this, what does that look like?
Jon Friis: With the technology we just recently released, it's actually quite simple to get that support at home in between your clinic consultations. You basically take a quarter, put it next to your mole, and take an image with the app we've developed. Then you'll be able to see the size of the mole, because the technology recognizes the coin with advanced machine learning, augmented reality, and computer vision, then defines the object and calculates the size of the mole. So people can see this, and they have a reference point and a baseline of how things look now. And then, of course, they can come back and see how things develop.
The convenience of mole sizing is that it's basically a very simple tool. You almost always have a quarter with you, and it's quite easy to get one, and then you'll be able to measure the moles and lesions on your skin. That's the new feature we just recently launched.
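The idea behind using a coin as a reference object can be sketched in a few lines. This is not Miiskin's actual implementation, just an illustration of the underlying arithmetic: a US quarter has a known diameter of 24.26 mm, so once computer vision has measured the coin and the mole in pixels, the mole's physical size follows from simple scaling. The detection step and the pixel values below are assumptions for the example.

```python
# Hypothetical sketch of reference-object scaling for mole sizing.
# Assumes a vision step has already measured both objects in pixels.

US_QUARTER_DIAMETER_MM = 24.26  # known physical diameter of a US quarter


def estimate_mole_diameter_mm(quarter_px: float, mole_px: float) -> float:
    """Estimate a mole's diameter from its pixel size in a photo.

    quarter_px: detected pixel diameter of the quarter in the image
    mole_px:    detected pixel diameter of the mole in the same image
    """
    mm_per_pixel = US_QUARTER_DIAMETER_MM / quarter_px
    return mole_px * mm_per_pixel


# Example (assumed values): the quarter spans 485 px, the mole 120 px.
size_mm = estimate_mole_diameter_mm(485, 120)
print(f"Estimated mole diameter: {size_mm:.1f} mm")
# A result near 6 mm would sit right at the "have it checked" threshold
# mentioned above.
```

Because both measurements come from the same photo, the camera's distance and zoom cancel out, which is what makes a coin such a convenient baseline for tracking changes between check-ups.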