
AI in AgeTech, Part 3: Collaborative™ Startups Train AIs to Help Alleviate Social Isolation

By Mark Ogilbee posted 07-13-2023 01:52 PM


Although artificial intelligence (AI) has recently surged back into the public imagination, many AgeTech Collaborative™ (ATC) startup participants have been incorporating AI into their solutions for a long time. But using AI isn’t a one-and-done deal; a key to leveraging it successfully is staying nimble and helping the AI evolve so its performance keeps improving.

In this third part of a series, we explore how two ATC startups are incorporating and continually training AI models in their solutions to help alleviate the problem of social isolation among older adults. 

Care.coach is aiming to solve the caregiver crisis and improve health outcomes through a unique combination of AI and human intelligence, all packaged in an easy-to-use, at-home device. “We’ve created an avatar on our device; it looks like a little dog or cat,” says founder and CEO Victor Wang. “It’s a relational agent that combines the empathy and intelligence of our human team working around the world 24/7, with software intelligence and AI-driven automation.”

AI has been instrumental in helping care.coach develop its offerings. “We use AI to create an empathy and conversational intelligence that wouldn’t have been possible a few years ago,” says Wang. Training the AI is a critical part of the process. Care.coach starts with open-source foundational models; then, says Wang, “we use our own training platform to elevate the ability of the foundational model to be more empathetic and conversational, as opposed to just spouting out a bunch of information. And we make sure that it’s safe and effective in a healthcare context, which is unique to what care.coach does.”

Another unique advantage care.coach enjoys is the ability to fine-tune the AI with its proprietary data. “We have these years-long relationships with people carried out through the avatar. Our staff have all kinds of conversations with people about their kids, their medications and so on. That gives us cycles of socially supportive conversations, and we can feed that into the model to fine-tune the AI and make it more empathetic.” 
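For readers curious what fine-tuning on conversational data can look like in practice, here is a minimal, hypothetical sketch using the open-source Hugging Face transformers library. The model name, the toy conversations and the training settings are placeholders chosen purely for illustration; this is not care.coach’s actual pipeline.

```python
# Hypothetical sketch: supervised fine-tuning of an open-source language model
# on de-identified supportive-conversation transcripts. All names and data are
# placeholders for illustration only.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, Trainer, TrainingArguments

MODEL_NAME = "gpt2"  # stand-in for whatever open-source foundation model is used

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# Toy examples standing in for years of avatar conversations.
conversations = [
    {"user": "I haven't talked to my daughter in weeks.",
     "assistant": "That sounds lonely. Would you like to tell me about her?"},
    {"user": "I keep forgetting my afternoon pills.",
     "assistant": "That's frustrating. Would a reminder after lunch help?"},
]

class ConvDataset(torch.utils.data.Dataset):
    """Formats user/assistant pairs for a causal language-modeling objective."""
    def __init__(self, rows):
        texts = [f"User: {r['user']}\nAssistant: {r['assistant']}{tokenizer.eos_token}"
                 for r in rows]
        self.enc = tokenizer(texts, truncation=True, padding="max_length",
                             max_length=64, return_tensors="pt")

    def __len__(self):
        return self.enc["input_ids"].shape[0]

    def __getitem__(self, i):
        item = {k: v[i] for k, v in self.enc.items()}
        item["labels"] = item["input_ids"].clone()  # predict the same sequence
        return item

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned-avatar",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=ConvDataset(conversations),
)
trainer.train()
```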

Care.coach doesn’t stop there, however. They can perform additional rounds of AI training called reinforcement learning with human feedback, empowered by their staff who operate the avatars. For example, if a customer expresses concern about losing their Medicare coverage, the AI will generate a response, and the human behind the avatar can rate the response in terms of its effectiveness. “That’s our reinforcement learning with a human feedback loop,” says Wang. “We're constantly improving our model to do its job better and better.” 
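To make the idea of a human feedback loop concrete, the sketch below shows the simplest possible version: an operator rates candidate replies to a prompt, and the ratings are stored as preference data that could later train a reward model. The prompts, replies and function names are invented for the example and do not represent care.coach’s system.

```python
# Hypothetical illustration of reinforcement learning from human feedback,
# reduced to its first step: collecting operator ratings of model replies.
import json
from dataclasses import dataclass, asdict

@dataclass
class FeedbackRecord:
    prompt: str
    response: str
    rating: int  # e.g., 1 (unhelpful) to 5 (empathetic and accurate)

def generate_candidates(prompt: str) -> list[str]:
    """Stand-in for the avatar's language model proposing replies."""
    return [
        "I understand that's worrying. Let's look at your coverage together.",
        "Medicare rules are available at medicare.gov.",
    ]

def collect_feedback(prompt: str, store_path: str = "feedback.jsonl") -> None:
    """Ask a human operator to rate each candidate reply and log the result."""
    records = []
    for response in generate_candidates(prompt):
        print(f"\nPrompt: {prompt}\nCandidate reply: {response}")
        rating = int(input("Operator rating (1-5): "))
        records.append(FeedbackRecord(prompt, response, rating))
    with open(store_path, "a") as f:
        for rec in records:
            f.write(json.dumps(asdict(rec)) + "\n")

if __name__ == "__main__":
    collect_feedback("I'm scared I'm going to lose my Medicare coverage.")
```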

Thriving.ai is a digital application that helps to integrate health and social care, bringing together people, data and the business of healthcare to deliver a holistic experience for older adults and their “circle of care” caregivers. “We bring the health components and the social components together,” says founder Shain Khoja, “because 80% of our health is a result of social determinants of health.” 

For example, a user’s interaction with the app tells Thriving.ai how active they are on a particular day compared to their daily pattern. If activity is lower, that could be an indicator that they’re feeling sad. “In turn, not being active can reinforce that sad mood,” says Khoja. It’s very multidimensional. Currently, if the app detects that a user is sad, data analytics can generate certain responses or suggest certain content. 
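As a rough illustration of the kind of baseline comparison described above, the sketch below flags a day when a user’s app activity falls well below their own usual pattern. The threshold and the sample numbers are invented for the example and are not Thriving.ai’s actual analytics.

```python
# Hypothetical sketch: flag unusually low activity relative to a user's own baseline.
from statistics import mean, stdev

def is_activity_low(history: list[int], today: int, z_cut: float = -1.5) -> bool:
    """Return True if today's interaction count is unusually low for this user."""
    baseline, spread = mean(history), stdev(history)
    if spread == 0:
        return today < baseline
    z_score = (today - baseline) / spread
    return z_score < z_cut

# Interactions per day over the past two weeks, then today's count.
recent_days = [34, 41, 38, 36, 40, 35, 39, 37, 42, 36, 38, 40, 35, 39]
print(is_activity_low(recent_days, today=12))  # True: suggests checking in
```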

But Thriving.ai is beginning to use AI to deepen the app’s functionality and the user experience. Khoja explains: “The natural language processing that we’re working on will actually say, ‘I’m sorry to hear you’re sad. Since you love cats, here’s a selection of lovely pictures of cats. And may I reach out to a circle of care member to get in touch?’ It then gives the user the dignity of responding to that.” 

“Once we release the deeper version of the AI, it will absorb information about you and your circle of care members, and it will make greater connections,” Khoja says. For example, if a user is feeling sad and one of their caregivers happens to be nearby at that moment, the AI can create a response that is more meaningful and humanlike. “That may result in your caregiver giving you a call, and maybe you go out for pizza,” says Khoja. In turn, this enables the AI to continue to learn. “If you feel better, the AI has now learned something that triggers your mood to improve; it will continue to learn those things and become more equipped to respond to the needs of users.”
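One way to picture that learning loop is the small sketch below: when a suggested action is followed by an improved self-reported mood, the action’s score for that user is nudged upward, so better-performing suggestions surface more often. The class, the actions and the scoring rule are hypothetical and only illustrate the general idea, not Thriving.ai’s implementation.

```python
# Hypothetical sketch of learning which responses tend to improve a user's mood.
from collections import defaultdict

class MoodResponseLearner:
    def __init__(self, learning_rate: float = 0.1):
        self.lr = learning_rate
        self.scores = defaultdict(float)  # action -> learned effectiveness

    def record_outcome(self, action: str, mood_before: int, mood_after: int) -> None:
        """Mood on a 1-10 self-report scale; a positive change reinforces the action."""
        self.scores[action] += self.lr * (mood_after - mood_before)

    def best_action(self, candidates: list[str]) -> str:
        """Pick the candidate action with the highest learned score so far."""
        return max(candidates, key=lambda a: self.scores[a])

learner = MoodResponseLearner()
learner.record_outcome("suggest_caregiver_call", mood_before=3, mood_after=7)
learner.record_outcome("show_cat_photos", mood_before=3, mood_after=5)
print(learner.best_action(["suggest_caregiver_call", "show_cat_photos"]))
```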

See Part 1 of the series here, Part 2 here, Part 4 here, Part 5 here and Part 6 here.

