
AuraSense: Bringing Haptics-Based Data Collection to Telemedicine

By Mark Ogilbee posted 06-29-2023 03:09 PM

  

AuraSense, an AgeTech Collaborative™ startup participant, builds a remote patient monitoring and evaluation platform that integrates AI, machine vision and digital haptics, enhancing telemedicine with the sense of touch and delivering personalized care whenever and wherever it is needed. 

We spent time with AuraSense co-founder and CEO Amber Ivey to learn more about the company and its cutting-edge tech. 

This interview has been edited for clarity and length. 

 

Can you tell us about AuraSense? 

One of our first products that we’re launching is a monitoring system that uses haptic technology, which is the buzzing sensation you get through the controller when you’re playing a video game, or on your phone when you get a text. We’re bringing that technology to healthcare and telemedicine. 

For example: I live with multiple sclerosis, and I have to physically go see my neurologist to get a quantitative measurement of what’s happening. During COVID, I and many other people had to do these appointments via telemedicine, and we lost the ability to capture a lot of biometric data. Now, our product brings that ability to telemedicine by using AI and mid-air haptic technology, where you can actually touch what we call a sound object. 

 

What’s a “sound object”? 

We can use sound to create objects that someone can interact with, and cameras and other technology let us capture what the person is doing with their hand so we can collect data on it. 

For example, normally I would go into a doctor’s office and do a strength test by squeezing the doctor’s hand and gripping things, and from that she can tell whether I’m getting stronger, among other things. Our technology simulates that touch through interaction with a sound object, which lets you collect different data sets and give the doctor a report so she knows what’s happening with the strength in my hand for my particular disease. 
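
[Editor’s note: AuraSense hasn’t published its implementation, but for readers who want a concrete picture of camera-based hand data capture, here is a minimal sketch in Python using the open-source MediaPipe hand tracker. The “pinch aperture” metric and all parameter choices are illustrative assumptions, not AuraSense’s clinical measures; a real system would calibrate against physical units and track many more joints, but the core loop of detecting landmarks, deriving a metric and logging it is the same shape.]

    # Illustrative sketch only; AuraSense's actual pipeline is not public.
    # Tracks one hand via the open-source MediaPipe hand tracker and logs a
    # simple "pinch aperture" (thumb-tip to index-tip distance) per frame.
    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands

    def log_pinch_aperture(camera_index=0, max_frames=300):
        cap = cv2.VideoCapture(camera_index)
        samples = []
        with mp_hands.Hands(max_num_hands=1,
                            min_detection_confidence=0.5,
                            min_tracking_confidence=0.5) as hands:
            for _ in range(max_frames):
                ok, frame = cap.read()
                if not ok:
                    break
                # MediaPipe expects RGB frames; OpenCV captures BGR.
                results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
                if results.multi_hand_landmarks:
                    lm = results.multi_hand_landmarks[0].landmark
                    thumb, index = lm[4], lm[8]  # MediaPipe fingertip landmark IDs
                    aperture = ((thumb.x - index.x) ** 2 +
                                (thumb.y - index.y) ** 2) ** 0.5
                    samples.append(aperture)  # in normalized image coordinates
        cap.release()
        return samples  # a time series a report could summarize for the clinician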

 

That’s extra cool. 

Imagine you can touch a ball, and it feels just like a ball — but you can’t see it, because it’s sound. But because you can feel the different vibrations based on what you’re doing, it allows you to interact immersively with the technology. It gives you a sense of the real thing. 

 

How is the sound object generated? 

Our prototype is a haptic pad with a monitor attached to it and cameras around it. So if I’m doing a hand examination, it can tell you what’s happening with the person just from the different interactions their hands are making. 
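
[Editor’s note: AuraSense hasn’t detailed its hardware, but in the research literature, mid-air haptics is typically produced by an array of ultrasonic transducers whose signals are phase-delayed so the waves converge at a focal point in the air, where the acoustic pressure is strong enough to feel. Modulating that point at roughly 200 Hz makes it perceptible to skin, and moving the focus traces shapes like the “ball” described above. The sketch below computes those per-transducer phase delays for a hypothetical 8 x 8 array; the geometry and the 40 kHz carrier are common research values, not AuraSense specifics.]

    # Illustrative sketch of the standard mid-air haptics focusing principle,
    # not AuraSense's implementation. Delaying each transducer so all waves
    # peak together at a chosen point creates a pressure spot the hand can feel.
    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s in air
    CARRIER_HZ = 40_000.0   # 40 kHz is a common ultrasonic haptics carrier

    # Hypothetical 8 x 8 transducer grid, 10 mm pitch, in the z = 0 plane.
    pitch = 0.010
    axis = np.arange(8) * pitch
    transducers = np.array([[x, y, 0.0] for x in axis for y in axis])

    def focusing_phases(focal_point):
        """Per-transducer phase offsets so all waves peak together at focal_point."""
        distances = np.linalg.norm(transducers - focal_point, axis=1)
        # A wave travelling distance d accumulates phase 2*pi*f*d/c; emitting
        # each transducer with the negative of that aligns them at the focus.
        return (-2.0 * np.pi * CARRIER_HZ * distances / SPEED_OF_SOUND) % (2.0 * np.pi)

    # Example: place a tactile point 15 cm above the center of the pad.
    phases = focusing_phases(np.array([0.035, 0.035, 0.15]))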

 

Why are you focusing so closely on getting data from the hand, specifically? 

Hands tell us a lot about ourselves, and a lot of diseases — including dementia, cerebral palsy and MS — require hand assessments. You can tell a lot about how a disease is progressing just through a person’s hands. 

 

What’s the origin story behind AuraSense? 

One of my co-founders has a daughter who has special needs. During COVID, they had to move her PT appointments to telemedicine. She was only 3 years old, and a 3-year-old isn’t going to want to interact with people like that. The PT folks kept telling her to pay attention, and they’d get irritated when she wouldn't. Finally, her dad said, “Don’t you have an Xbox controller or something that she can use to interact with, and you can get the data from that?” They didn’t, so he explained how he helps make quantitative tools in the healthcare field to help with measurement. And they said, “You should make this!” 

 

What are some of the obstacles the company has had to navigate? 

One of the hard things is explaining the concept clearly, because not everyone has thought about haptics in this way. People in the virtual reality (VR) gaming world get it, because in VR you can feel and interact with objects and have a sensation associated with that. But it’s often hard to explain to people outside that world. 

 

What’s next on the horizon for AuraSense? 

Our goal is to pilot this summer. We want to make sure that the measurements we’re collecting match the systems doctors are already using, and to fine-tune our metrics so doctors can easily integrate them into their workflows. Mainly, we’re looking to start piloting with neurologists, particularly people working in areas that require hand assessment, so that could include MS, cerebral palsy, dementia and other conditions. 

What we’re doing is novel and has the potential to impact lives — especially for people with mobility issues. They can capture the data they need from the comfort of their own home and see how their condition is progressing. I’m really excited about it, and I hope others are, too. 

 

Learn more about AuraSense at their website. 

#SpotlightOn 
