Part of the Aging and Innovation Special Report
Richard Adler knows a few things about the uneasy relationship between older adults and technology.
It’s something that Adler, a distinguished research fellow at the Institute for the Future in Palo Alto, Calif., has been studying for more than 25 years. One thing, in particular, has struck him.
“The truth is that a lot of older adults are technophobes,” he said. “They tend to be classically late adopters of almost any technology.” That’s true, Adler said, even though they are the ones with the most to gain from embracing the latest technology.
Hesitant About Artificial Intelligence
That wariness may be especially true when it comes to the digital innovation that seems destined to become the next game-changer — artificial intelligence, or AI. The name alone conjures up notions of talking robots and other brainy devices. That can seem creepy to older adults, and the idea of being around thinking machines can make them anxious about losing privacy or, perhaps even worse, being constantly reminded of their own slipping cognitive skills.
Actually, artificial intelligence covers a lot of ground. But put simply, “intelligent” machines, instead of just being programmed to do a task step by step, are able to learn by recognizing patterns, classifying data and adjusting to mistakes they make.
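To make that "learning from mistakes" idea concrete, here is a toy sketch — not any product's actual algorithm — of a perceptron, one of the oldest learning techniques. Instead of following hand-written rules, it nudges its internal weights every time it misclassifies an example, until it has learned the pattern.

```python
# Toy illustration of machine learning: a perceptron adjusts its
# weights whenever it makes a mistake, rather than being programmed
# step by step with explicit rules.

def train_perceptron(examples, epochs=20, lr=0.1):
    """examples: list of ((x1, x2), label) pairs with label -1 or +1."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            prediction = 1 if (w1 * x1 + w2 * x2 + b) > 0 else -1
            if prediction != label:      # a mistake -> adjust the weights
                w1 += lr * label * x1
                w2 += lr * label * x2
                b += lr * label
    return w1, w2, b

def classify(weights, point):
    w1, w2, b = weights
    x1, x2 = point
    return 1 if (w1 * x1 + w2 * x2 + b) > 0 else -1

# Learn a simple pattern from labeled examples (made-up data).
data = [((0, 0), -1), ((1, 0), -1), ((0, 1), -1), ((2, 2), 1), ((1.5, 1), 1)]
model = train_perceptron(data)
```

Real systems like the ones described below use far larger models and datasets, but the principle — improve by correcting errors on examples — is the same.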
Helping Older Adults Stay at Home
It’s that ability to draw conclusions from enormous amounts of data that’s making AI a key component of efforts to help ensure that older adults can remain living in their own homes later in their lives.
Here are five examples:
1. Talking devices are interacting with humans in more engaging ways.
First, there was Apple’s Siri, followed by Amazon Echo and Google Home — a wave of devices with impressive voice recognition skills that can answer questions, play music and help keep the temperature in your home just right.
Now, the next generation of home assistants is being developed. They are being designed to connect more with their housemates, and not simply respond to requests.
A good example is ElliQ, a device created by an Israeli company called Intuition Robotics, and now being tested in the San Francisco area. ElliQ looks more like a desk lamp than an archetypal robot. But while it’s meant to sit on a table or nightstand, it’s hardly static. ElliQ is designed to interact more effectively with humans by mimicking them. In fact, one of its key characteristics is what its designers describe as “body language.”
Not only does ElliQ talk, it also moves. When a person speaks to it, ElliQ leans in his or her direction. Or it might turn to its separate screen component to join its owner looking at new photos of grandchildren. It also uses lights of varying brightness and differing volume levels to get its message across.
That’s part of ElliQ’s learning process. One of its roles is to help owners meet goals they have set, such as taking a daily walk. Through experience, ElliQ determines what kind of “coaching” is more effective for certain people — gentle reminders or more insistent prodding.
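ElliQ's actual algorithms are not public, but the coaching-style idea can be sketched with a simple approach: track how often each style actually gets the owner moving, and prefer the style with the better success rate. Everything below (class name, styles, data) is a hypothetical illustration.

```python
# Hypothetical sketch: learn which coaching style works for a person
# by keeping a success rate per style and preferring the best one.

class CoachingSelector:
    def __init__(self, styles=("gentle", "insistent")):
        self.stats = {s: {"tries": 0, "successes": 0} for s in styles}

    def record(self, style, walked):
        """Log one reminder and whether the person took their walk."""
        self.stats[style]["tries"] += 1
        if walked:
            self.stats[style]["successes"] += 1

    def best_style(self):
        def rate(style):
            st = self.stats[style]
            return st["successes"] / st["tries"] if st["tries"] else 0.0
        return max(self.stats, key=rate)

selector = CoachingSelector()
# Simulated experience: gentle reminders worked 1 of 3 times,
# insistent prodding 2 of 3 times.
for walked in (True, False, False):
    selector.record("gentle", walked)
for walked in (True, True, False):
    selector.record("insistent", walked)
```

A production system would be more careful — for instance, continuing to try the less-favored style occasionally so it can detect when a person's preferences change.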
2. Analysis of data gathered by sensors can help predict problems.
Scientists are also finding they can learn quite a bit about the behaviors of older adults through sensors installed in their homes. And, they say, using AI algorithms to evaluate patterns of activity from the data those sensors collect can help them make predictions about what behavior changes might mean.
“AI can help us create a snapshot of an activity,” explained Guido Pusiol, a researcher at Stanford’s Program in AI-Assisted Care. “Different activities have different duration times and that information can be used to trigger that someone may have an issue.”
For example, if a person takes an unusually long time to leave the bedroom in the morning, it could mean he has fallen, according to Pusiol, who noted that most falls by older people occur near their beds. Sensors might also detect that a person is starting to have difficulty getting into the shower, and that could alert caregivers or family members that an aide may be needed to help in the future.
“Someone can be sent to help them do that one thing,” Pusiol said. “So they can still be at home, they can still feel independent.”
It’s harder to spot changes in a person’s mental or emotional state, but even there the sensors can provide clues. Lack of motion over long periods of time — say a person is watching much more television than in the past — can reflect apathy and perhaps depression.
“You don’t really need to see the person’s face,” said Pusiol. “You can learn a lot by how they move over a period of time.”
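The core of what Pusiol describes — comparing today's activity duration against a person's own baseline — can be sketched in a few lines. The data and threshold below are assumptions for illustration, not Stanford's actual system.

```python
# Minimal sketch of sensor-based anomaly detection: flag a morning
# routine that takes far longer than the person's historical pattern.

import statistics

def is_unusual(history_minutes, today_minutes, z_threshold=3.0):
    """Flag today's duration if it is far above the historical baseline."""
    mean = statistics.mean(history_minutes)
    stdev = statistics.stdev(history_minutes)
    if stdev == 0:
        return today_minutes != mean
    z_score = (today_minutes - mean) / stdev
    return z_score > z_threshold

# Two weeks of mornings: usually 10-14 minutes to leave the bedroom.
baseline = [12, 11, 13, 10, 12, 14, 11, 12, 13, 11, 12, 10, 13, 12]
```

If a morning suddenly takes 45 minutes, the z-score check fires and the system could alert a caregiver; a typical 12-minute morning passes unflagged.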
Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory are using sensors to track movements of older adults. Specifically, they’ve been analyzing how fast or slowly people walk, and say that can be a predictor of falls or cognitive decline.
They’ve developed a device, called WiGait, that bounces wireless signals off a person’s body and, from that, can measure his or her walking speed with 95 percent accuracy, according to a recently published study. Changes in that speed, or in gait, can be early indicators of health or cognitive issues.
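Once a device like WiGait produces walking speeds, spotting a worrying trend is straightforward. Here is a hedged sketch: fit a least-squares line through weekly average speeds and flag a steady decline. The 95 percent figure above refers to WiGait's measurement accuracy, not to this toy code, and the numbers are invented for illustration.

```python
# Sketch of gait-trend monitoring: compute the slope of weekly average
# walking speeds and flag a sustained decline.

def weekly_trend(speeds_m_per_s):
    """Slope (m/s per week) of a least-squares line through the speeds."""
    n = len(speeds_m_per_s)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(speeds_m_per_s) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, speeds_m_per_s))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

def flag_decline(speeds_m_per_s, threshold=-0.01):
    """Flag if speed is dropping faster than `threshold` m/s per week."""
    return weekly_trend(speeds_m_per_s) < threshold

# Eight weeks of average walking speed: one person gradually slowing,
# another holding steady (made-up data).
declining = [1.10, 1.08, 1.07, 1.05, 1.02, 1.00, 0.97, 0.95]
steady = [1.10, 1.11, 1.09, 1.10, 1.10, 1.09, 1.11, 1.10]
```

The declining series would trigger a flag; the steady one would not.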
3. Smart homes are getting smarter.
It should tell you something about the future role of artificial intelligence in homes that IKEA recently rolled out a worldwide survey seeking opinions on the subject. It wants to know such things as what kind of attitude people prefer in an AI device — “autonomous and challenging” or “obedient and assisting” — and what gender its voice should be — male, female or neutral.
Home devices such as Amazon Echo and Google Home already are capable of turning lights on and off, adjusting the temperature and locking doors. At present, they’re voice-activated and follow directions. As impressive as that can seem, it’s still fairly basic AI.
The next phase, though, will bump things up a notch, enabling smart homes to become more proactive by taking actions based on what they’ve learned about your preferences and behavior. For instance, a startup in London called AI Build has developed a product it calls aiPort, which not only responds to voice commands and gestures — it has six cameras — but also adapts its actions as it gets to know you better.
That could result in aiPort turning on the coffee machine based on what it knows about your morning schedule or adjusting the lighting or music in response to what it recognizes in your body language.
This level of smart home sophistication is still in its early stages, but it gives you a sense of what may soon make it easier for older adults to manage their homes.
4. Machines are learning to diagnose diseases.
This spring, Google announced that its research teams have developed algorithms enabling machines both to diagnose an eye disease related to diabetes that can lead to blindness and to detect breast cancer.
After being shown many images of swollen blood vessels in the eye, or cancerous tumors, computers were able to learn what to look for in the future. To train a machine to spot diabetic retinopathy — the fastest-growing cause of preventable blindness — it was provided with 128,000 retinal images to analyze.
Researchers at Stanford University followed a similar process in helping a machine diagnose skin cancer. In that case, the computer was given a database of nearly 130,000 skin disease images, which it used to learn what skin cancer looked like. Ultimately, the algorithm enabled machines to match the performance of experienced dermatologists.
Another Stanford study looked at whether artificial intelligence could be effective at predicting heart attacks. Computers were given the task of analyzing almost 400,000 patient medical records, with the goal of finding patterns associated with future cardiovascular emergencies. Not only were the algorithms able to predict heart attacks based on data from 10 years before they happened, they were able to do so with more accuracy than human doctors.
5. One day you could be able to back up your memory.
Clearly, one of the great challenges of getting older is fading memory. Not being able to remember names, places or events can make taking care of yourself that much harder.
But what if you could have a memory as good as computer memory? That’s a question Siri co-founder Tom Gruber asked during a recent TED talk.
“What if you could remember every person you ever met?” he continued. “How to pronounce their name? Their family details? Their favorite sports? The last conversation you had with them?”
While Gruber conceded that he’s not sure how or when it will happen, he’s convinced that using machines to dramatically enhance our memories is inevitable. Computers and phones already constantly gather data about our day-to-day lives, including photos of friends and family, the news we read and the music we listen to.
It’s only a matter of time, he said, before all of this is stored in a more systematic way to allow for easy memory retrieval. Of course, having a complete digital database of your life would raise some pretty significant privacy and security issues.
Gruber acknowledged that. But he believes a backup memory has enormous potential, particularly for people with conditions like dementia or schizophrenia.
“It’s the difference,” he said, “between a life of isolation and one of dignity and connection.”