Researchers from Stanford University have developed a new tool to track patient recovery in a hospital setting: a deep learning-based system that detects whether patients are performing mobility activities, such as getting in and out of bed or sitting in a chair. In an ICU setting, improved mobility is associated with better recovery, quality of life, and overall survival.
In partnership with the Intermountain LDS Hospital ICU in Salt Lake City, Utah, the researchers installed depth sensors in eight patient ICU rooms. The team chose depth sensors because they capture only the shapes and silhouettes of people, not photographs, sound, or any other identifying information, providing a high level of patient privacy. They collected nearly 100,000 frames of depth sensor data and developed a deep learning algorithm to detect mobility activities.
The system detected mobility activities, such as getting in and out of bed or sitting in a chair, with 89% specificity and 87% sensitivity. The researchers also extended their algorithm to count the number of personnel involved in each mobility activity, which can offer deeper insight into how much assistance a patient needs. By tracking mobility in hospital settings, the system could allow staff to intervene earlier and help patients recover faster.
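The reported figures can be read as rates from a per-detection confusion matrix: sensitivity is the fraction of true activities the system flags, and specificity is the fraction of non-activity frames it correctly ignores. A minimal sketch, using illustrative counts chosen only to reproduce the reported percentages (not the study's actual data):

```python
# Hypothetical confusion-matrix counts for one mobility activity.
# These numbers are illustrative, not from the published study.
tp, fn = 87, 13   # activity present: detected / missed
tn, fp = 89, 11   # activity absent:  correctly ignored / falsely flagged

sensitivity = tp / (tp + fn)  # true positive rate
specificity = tn / (tn + fp)  # true negative rate

print(f"sensitivity = {sensitivity:.0%}")  # 87%
print(f"specificity = {specificity:.0%}")  # 89%
```

High specificity matters in an ICU context: a detector that rarely raises false alarms is less likely to be ignored by busy clinical staff.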
Publication in npj Digital Medicine: "A computer vision system for deep learning-based detection of patient mobilization activities in the ICU"