While patient mannequins have been around for a long time, and some even mimic how the body responds to different treatments, they are still essentially inanimate objects. Researchers at the University of Notre Dame have been working on giving such human patient simulators facial expressions, in order to help train clinicians to deal with real patients in distress.
As an initial step, the team used video scans of real patients in pain to analyze how people form expressions of discomfort. These expressions were mapped onto virtual models and then onto a physical facial robot with an array of movable points. To allow training with a variety of patient types, the robot uses interchangeable skins that simulate patients of different ages and skin tones.
As a next step, the researchers are creating expression templates of stroke patients, so that patient robots can be used to train clinicians to spot and respond to someone suffering a stroke.