AI in AgeTech, Part 1: Collaborative™ Startups Capture the Massive Opportunity in Training AIs

By Mark Ogilbee posted 06-29-2023 04:48 PM


Prior to the public release of ChatGPT (at the end of 2022) and similar generative AI tools, the popular understanding of AI and its promise — and potential dangers — was largely formed by Hollywood blockbusters and other pop-culture influences. This popular conception of computers run amok long outstripped the actual capabilities of AI, a field that began in the 1950s, when researchers at universities and government agencies began work on algorithms to accomplish “fuzzy” tasks such as speech recognition. For decades, AI development continued quietly until advances in computing power and other breakthroughs brought the seemingly magical power of AI to everyone’s laptop. 

And, just like that, we’re entering a new reality where AI is destined to be a part of our everyday lives. 

But if most people have been surprised by the sudden appearance of ChatGPT and other AI tools, tech innovators — including many AgeTech Collaborative™ (ATC) startup participants — have long been leveraging the power of AI behind the scenes to build their products and services. In this first installment of a series, we take a look at how two ATC startups are using AI to power devices that help older adults live more independently and safely, and help caregivers become more efficient. 

6Degrees has developed an AI that recognizes and categorizes human motion, which allows people with disabilities or limited motion to interact with technology using the range of motion they do have. After a five-minute onboarding process, users can connect to smart devices via Bluetooth and use their existing range of motion to interact with devices such as phones and tablets. 

The technology is being used to assist people with disabilities, improve communication and enable more independent living. Miri Berger, founder and CEO, explains: “If you have a tremor, we learn what that looks like and the device will turn it into a smooth result on the device. For example, you can connect to an Oculus virtual reality headset to play a game, and your limited motion will be turned into a full range of motion in the virtual world.” 

AI is a critical component of this functionality. 6Degrees is building its own library of human motion, categorizing it and personalizing it for every user. “We collect thousands of data points and analyze those,” says Berger. “AI is an amazing tool to do that.” Down the road, AI will allow 6Degrees to anticipate a user’s motion. For example, by learning the particularities of an individual’s movement when they draw a circle, the AI can learn to anticipate the user’s intentions and complete the circle faster and faster. 
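To make the idea of turning a tremor into a “smooth result” concrete, here is a minimal sketch — not 6Degrees’ actual pipeline, whose models are far more sophisticated — of one classic way a noisy motion signal can be smoothed: an exponential moving average. The signal and smoothing factor below are invented for demonstration.

```python
# Illustrative sketch only: an exponential moving average, one simple
# way a jittery (e.g. tremor-affected) motion signal could be smoothed
# before driving a cursor or VR controller.

def smooth(samples, alpha=0.2):
    """Return an exponentially smoothed copy of a 1-D signal.

    Lower alpha = heavier smoothing; alpha=1 passes the signal through.
    """
    smoothed = []
    value = samples[0]
    for s in samples:
        value = alpha * s + (1 - alpha) * value
        smoothed.append(value)
    return smoothed

# A steady upward movement with a superimposed tremor-like oscillation:
raw = [i + (2 if i % 2 else -2) for i in range(20)]
out = smooth(raw)
```

After smoothing, the sample-to-sample jitter is much smaller than in the raw signal, while the underlying upward trend is preserved — which is the behavior Berger describes when a tremor becomes a smooth on-screen result.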

This translates into allowing people with motion-related disabilities to get back to work more quickly, for example, and giving them a higher quality of life. “AI is an amazing opportunity to fit technology to the person, to fit the way people move individually — rather than users needing to adjust to the technology,” says Berger. 

Tellus is another ATC startup using AI to drive innovation. Tellus has developed a hardware and software solution that uses radar as a sensor to generate point cloud data in real time for analyzing health information. The technology can provide actionable data without the need for wearables or cameras, which is more convenient and helps ensure user privacy. 

“There’s a lot of AI that goes into our product,” says Tania Coke, co-founder of Tellus. “Imagine data points being generated multiple times per second and translating those points into real health information, such as whether somebody is sleeping versus in their bed but awake, or how many hours they’re spending in their room — which is indicative of loneliness and social isolation.” 

Tellus has leveraged recent advances in radar and 3D sensing technology to collect fine-grained information and turn it into health data. “That’s really where AI comes in,” says Coke. “We’re using AI to turn the radar data into information and to make sure that it’s actionable. We’ve developed a large language model AI called Pomfrey, which will be immensely helpful: Caregivers can see information — such as too many wake-ups in the middle of the night — and use Pomfrey to dig into the data to see if it needs further attention.” 

Because Tellus uses AI to analyze data in real time, they can focus on surfacing only the most important information. Coke explains: “For aging in place, we want to focus on the big changes in behavior. If someone sleeps for eight hours and that’s typical, the caregiver doesn’t need to know that. But if there’s a change in that behavior, that could be significant.” 
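The principle Coke describes — surface only meaningful deviations from a person’s own baseline, not routine nights — can be sketched with a toy example. This is purely illustrative; the window size, threshold and data below are invented, and Tellus’ real-time models are far more involved.

```python
# Illustrative sketch only: flag nights whose sleep duration deviates
# markedly from a person's own recent baseline. An eight-hour night
# that is typical raises no alert; a sharp change does.
from statistics import mean

def flag_changes(hours, window=7, threshold=2.0):
    """Return indices of nights deviating from the rolling mean
    of the previous `window` nights by more than `threshold` hours."""
    alerts = []
    for i in range(window, len(hours)):
        baseline = mean(hours[i - window:i])
        if abs(hours[i] - baseline) > threshold:
            alerts.append(i)
    return alerts

# Ten typical ~8-hour nights, then a sharp drop to 4 hours:
nights = [8.1, 7.9, 8.0, 8.2, 7.8, 8.0, 8.1, 7.9, 8.0, 8.1, 4.0]
alerts = flag_changes(nights)  # only the final night stands out
```

Because each user’s baseline is computed from their own history, the same absolute value (say, six hours of sleep) might be routine for one person and an alert-worthy change for another — the personalization the article describes.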

This eases the burden on caregivers and allows Tellus to deliver a highly personalized healthcare experience based on the behaviors of each individual user. “Pomfrey allows a level of interaction with the data that wouldn’t be possible otherwise,” says Coke. 


You can see Part 2 in the series here, Part 3 here, Part 4 here, Part 5 here and Part 6 here.