As the population of senior citizens grows, so does the burden of age-related health problems, including neurodegenerative diseases such as Parkinson’s and Alzheimer’s.
Recently, researchers have developed a complete information and communication technology system that uses smart devices, the Internet of Things (IoT), and imaging sensors to help people with Parkinson’s and Alzheimer’s live more independent lives.
Analysis of the collected data enables early detection and prevention of health problems.
How the system works
The system combines data capture and multimodal fusion to extract relevant health information, analyze it, and provide useful recommendations.
“The system gathers signals from diverse sources in health monitoring environments, understands the user behavior and context, and triggers proper actions for improving the patient’s quality of life,” say the authors of “Behavior Analysis through Multimodal Sensing for Care of Parkinson’s and Alzheimer’s Patients.”
Testing the system
The researchers implemented and tested the system on eighteen patients over a ten-week period. The patients were equipped with a number of sensors designed to monitor their health, behavior, and movement (a sketch of how their readings might be represented follows the list):
- Multisensory band (bracelet), which provides health and motion data.
- Binary sensors placed on doors or drawers, which detect when they are opened or closed.
- RGB-D camera (Microsoft Kinect v2), which monitors user activity, status, and evolution.
- Zenith panoramic camera, which allows wide coverage of target areas such as living rooms in senior centers.
- Wireless sensor network (WSN) anchors or beacons, which monitor the radio signals from patients’ wearables in a non-intrusive manner.
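To make the data side concrete, here is a minimal sketch, in Python, of how readings from such heterogeneous devices might be represented in a single record format before fusion. The field names and IDs are illustrative assumptions, not taken from the paper.

```python
from dataclasses import dataclass
from typing import Any, Dict

@dataclass
class SensorReading:
    """One timestamped measurement from any of the monitored devices."""
    sensor_id: str           # e.g. "bracelet-03" or "kinect-livingroom" (hypothetical IDs)
    sensor_type: str         # "band", "binary", "rgbd", "panoramic", or "wsn"
    timestamp: float         # seconds, on a clock shared by all devices
    payload: Dict[str, Any]  # sensor-specific fields: heart rate, door state, skeleton joints, RSSI, ...

# A binary door sensor firing and a bracelet sample, a fraction of a second apart:
door_event = SensorReading("door-kitchen", "binary", 1652345678.2, {"open": True})
band_sample = SensorReading("bracelet-03", "band", 1652345678.3,
                            {"heart_rate": 72, "accel": (0.01, -0.02, 0.98)})
```

A shared timestamp field of this kind is what later makes synchronization and spatial calibration of the different devices possible.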
A triple-layered architecture
This system architecture, depicted below, consists of three layers: a services subsystem, a high-level subsystem, and a low-level subsystem. The services layer gathers information from the other subsystems and presents the relevant parts to the actors involved.
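The architecture figure itself is not reproduced here, but the division of responsibilities can be sketched roughly as follows, assuming the low-level subsystem wraps the sensors, the high-level subsystem fuses and interprets their output, and the services layer reports results to the people involved. Class and method names are invented for illustration.

```python
class LowLevelSubsystem:
    """Wraps the physical sensors and produces raw, timestamped readings."""
    def __init__(self, sensors):
        self.sensors = sensors

    def poll(self):
        return [sensor.read() for sensor in self.sensors]


class HighLevelSubsystem:
    """Fuses the raw readings and turns them into behavior events
    (falls, freezing of gait, disorientation, and so on)."""
    def analyze(self, readings):
        events = []
        # ...multimodal fusion and behavior analysis would happen here...
        return events


class ServicesSubsystem:
    """Gathers the other subsystems' output and presents what is relevant
    to the actors involved."""
    def report(self, events):
        for event in events:
            print(f"ALERT: {event}")  # in practice: dashboards, notifications, reports
```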
The illustration below shows the interconnection of all the sensors in the system. The multi-sensorial app integrates real-time data, allowing for synchronization and organization of the sensory measurements.
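One plausible way to perform that synchronization, assuming each sensor produces a time-ordered stream, is to merge the streams by timestamp. The snippet below is a simple illustration of the idea, not the app’s actual implementation.

```python
import heapq

def synchronize(*streams):
    """Interleave per-sensor streams of (timestamp, sensor_id, payload) tuples
    into one time-ordered stream. Each input stream is assumed to be already
    sorted by timestamp, since readings arrive in order from each device."""
    yield from heapq.merge(*streams, key=lambda reading: reading[0])

# Two tiny example streams:
band = [(10.0, "bracelet", {"hr": 71}), (10.5, "bracelet", {"hr": 72})]
door = [(10.2, "door-kitchen", {"open": True})]
for reading in synchronize(band, door):
    print(reading)  # printed in timestamp order: 10.0, 10.2, 10.5
```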

The illustration below shows how all the devices are calibrated in a common spatial coordinate system.
“The process consists of three steps: selecting a common coordinate system (manual), obtaining the rotation and translation matrix for the cameras, and performing the calibration for the WSN using the mentioned reference system,” the authors say.
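In practice, once a camera’s rotation matrix R and translation vector t with respect to the common coordinate system are known, its measurements can be mapped into that shared frame. The sketch below illustrates only that step, with invented example values.

```python
import numpy as np

def to_common_frame(points_cam, R, t):
    """Map an N x 3 array of points from one camera's local frame into the
    common coordinate system, using that camera's rotation matrix R (3 x 3)
    and translation vector t (3,) found during calibration:
    p_common = R @ p_cam + t."""
    return points_cam @ R.T + t

# Toy values: a camera rotated 90 degrees about the vertical axis and
# shifted two meters along x in the common frame.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([2.0, 0.0, 0.0])
print(to_common_frame(np.array([[1.0, 0.0, 0.0]]), R, t))  # -> [[2. 1. 0.]]
```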
The devices monitor a number of behaviors that can signal trouble (a simple detection sketch follows the list):
- Abnormal day and nighttime motion patterns
- Disorientation or confusion
- Signs of apathy such as spending too much time on a couch or in bed
- Number of visits to the bathroom
- Patient disorientation when leaving the house
- Falling down
- Freezing of gait—inability to move
- Festination—fast shuffling caused by a deficiency of dopamine in the basal ganglia circuit
- Loss of balance
- Performance of therapy-related exercises
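As a concrete illustration of how one of these behaviors might be flagged, the sketch below detects prolonged inactivity, a possible sign of apathy, from the bracelet’s motion data. The threshold values are invented and would need clinical tuning; this is not the authors’ detector.

```python
def flag_prolonged_inactivity(samples, max_idle_minutes=120, motion_threshold=0.05):
    """Return time spans in which the wearable's motion magnitude stayed below
    motion_threshold for longer than max_idle_minutes, a possible sign of apathy.
    `samples` is a time-ordered list of (timestamp_seconds, motion_magnitude)."""
    alerts = []
    idle_start = None
    for ts, motion in samples:
        if motion < motion_threshold:          # essentially no movement
            if idle_start is None:
                idle_start = ts
            elif ts - idle_start > max_idle_minutes * 60:
                alerts.append((idle_start, ts))
                idle_start = ts                # start a fresh span to avoid repeat alerts
        else:
            idle_start = None                  # movement resumed
    return alerts
```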
An illustration of “freezing of gait” (FoG) detection is shown below.

“FoG refers to a state in which a patient with Parkinson’s disease experiences a sudden lack of movement despite his or her willingness to move. It typically occurs in specific situations, such as when starting to walk, stepping through a doorway, attempting to turn a corner, or approaching a chair. It typically lasts a few seconds, and it is very important to be detected, because FoG episodes are unpredictable and greatly increase the chance of falling. Furthermore, an increase in FoG episodes needs to be reported to the physician,” say the authors.
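The paper’s FoG detector is not reproduced here, but a widely used idea in the FoG literature is the freeze index: the ratio of accelerometer power in the 3–8 Hz “freeze” band to power in the 0.5–3 Hz locomotion band. The sketch below shows that idea only; the window length and any threshold are illustrative assumptions.

```python
import numpy as np

def freeze_index(accel, fs, window_s=4.0):
    """Compute a freeze index over the most recent window of acceleration:
    spectral power in the 3-8 Hz 'freeze' band divided by power in the
    0.5-3 Hz locomotion band. High values suggest trembling in place rather
    than normal walking. `accel` is a 1-D array sampled at `fs` Hz."""
    n = int(window_s * fs)
    window = accel[-n:] - np.mean(accel[-n:])              # detrended recent window
    power = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    freeze_band = power[(freqs >= 3.0) & (freqs < 8.0)].sum()
    locomotion_band = power[(freqs >= 0.5) & (freqs < 3.0)].sum()
    return freeze_band / (locomotion_band + 1e-9)

# A value well above a per-patient threshold would mark a candidate FoG
# episode to be logged and reported to the physician.
```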
The illustration below shows data recorded while patients walked back and forth in a straight line, walked randomly, and moved into and out of rooms.

“In total, 8 hours and 20 minutes of data was recorded, containing more than 200 recorded freezing incidences,” say the authors.
A highly accurate and versatile system
One of the most important features of the system is that it can analyze patient behavior accurately even with limited information from just a few sensors.
“The multimodal fusion techniques presented here guarantee the modularity of the entire system by allowing its operation even in cases where not all presented sensors are available. This offers the versatility to identify different activities and events related to the evaluation of the patient’s health status. The results are promising compared to other works,” the authors say.
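A rough sketch of that modularity, assuming each available sensor contributes a confidence score for an event and missing sensors are simply skipped, might look like the following. The weighting scheme is an invented simplification, not the paper’s fusion method.

```python
def fuse_scores(scores, weights=None):
    """Combine per-sensor scores for an event (e.g. a suspected fall) into one
    decision score, skipping sensors that produced no reading (None).
    Returns None only if every sensor is missing."""
    available = {name: s for name, s in scores.items() if s is not None}
    if not available:
        return None
    weights = weights or {}
    total = sum(weights.get(name, 1.0) for name in available)
    return sum(weights.get(name, 1.0) * s for name, s in available.items()) / total

# The camera is offline, yet a fall score can still be estimated from the
# bracelet and the WSN beacons:
print(fuse_scores({"rgbd": None, "band": 0.8, "wsn": 0.6}))  # -> 0.7
```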
Read more about eldercare in the Computer Society Digital Library:
- Toward an ElderCare Living Lab for Sensor-Based Health Assessment and Physical Therapy
- Elder Falls Detection Based on Artificial Neural Networks
- Continuous Spine Care Service for the Elderly
- Using mobile application for Long-Term Care system
- Design of elder-friendly auditory signals for microwave ovens
- Using pervasive computing to deliver elder care
- A Bluetooth-Based Device-Free Motion Detector for a Remote Elder Care Support System
- Balancing Priorities: A Field Study of Coordination in Distributed Elder Care
- Stroke Prediction Context-Aware Health Care System
About Lori Cameron
Lori Cameron is a Senior Writer for the IEEE Computer Society and currently writes regular features for Computer magazine, Computing Edge, and the Computing Now and Magazine Roundup websites. Contact her at l.cameron@computer.org. Follow her on LinkedIn.