Pain Algorithm

Researchers are working on a way to automate pain assessment accurately and reliably using a combination of facial recognition, neural networks, and machine learning. The goal is not only to distinguish real pain from feigned pain but also to reduce the use of potentially addictive painkillers by ensuring that appropriate therapies are prescribed.

Past research into automatic pain recognition has focused primarily on generic models to quantify facial expressions. The new algorithm, known as DeepFaceLIFT, uses a two-stage learning model to account for individual differences in facial features and cues. The first stage combines self-reported pain scale data with facial landmark features to determine which expressions are most significant for that particular patient. The second stage estimates pain levels from these personalized features using a multitask learning system.
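
To make the two-stage idea concrete, the sketch below builds a minimal personalized pipeline with scikit-learn on synthetic data: a small neural network learns frame-level scores from facial-landmark features (with the patient's identity appended as a crude personalization cue), and a Gaussian-process regressor maps statistics of those frame scores to a sequence-level pain estimate. The feature sizes, model choices, and variable names are illustrative assumptions, not the published DeepFaceLIFT implementation.

```python
# Hypothetical two-stage sketch of a personalized pain-estimation pipeline.
# All shapes, names, and model choices are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Toy data: per-frame facial-landmark features and per-sequence self-reported scores.
n_sequences, frames_per_seq, n_landmark_feats = 40, 30, 66 * 2  # assumed sizes
X_frames = rng.normal(size=(n_sequences, frames_per_seq, n_landmark_feats))
vas_labels = rng.uniform(0, 10, size=n_sequences)      # self-reported 0-10 pain scale
subject_ids = rng.integers(0, 5, size=n_sequences)     # which patient each clip came from

# Stage 1: weakly supervised frame-level regressor.
# Every frame inherits its sequence's self-reported score, and a small MLP learns
# which landmark configurations track that score for each person.
frames_flat = X_frames.reshape(-1, n_landmark_feats)
subj_flat = np.repeat(subject_ids, frames_per_seq)[:, None]
stage1_X = np.hstack([frames_flat, subj_flat])
stage1_y = np.repeat(vas_labels, frames_per_seq)

stage1 = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
stage1.fit(stage1_X, stage1_y)

# Stage 2: sequence-level estimate from statistics of the frame-level scores.
frame_scores = stage1.predict(stage1_X).reshape(n_sequences, frames_per_seq)
stats = np.column_stack([
    frame_scores.mean(axis=1),
    frame_scores.max(axis=1),
    frame_scores.std(axis=1),
])

stage2 = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
stage2.fit(stats, vas_labels)

print("Estimated pain score for first clip:", stage2.predict(stats[:1])[0])
```

Summarizing the frame scores before the second stage keeps the sequence-level model small and lets it focus on how an individual's expressions relate to their own reported pain, which is the intuition behind the personalized approach described above.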

The system was tested on a dataset containing more than 48,000 image frames from 25 patients with one-sided shoulder pain. DeepFaceLIFT outperformed non-personalized models and also identified the pain-relevant facial regions for each subject, making pain-related facial features easier to interpret.

For information: Dianbo Liu, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, MA 02139; email: dianbo@mit.edu; website: http://web.mit.edu/