It’s not uncommon for patients to hide their true emotions from their caregivers – or even from their own conscious selves. An experimental new facial “sticker” could help by detecting and relaying information on its wearer’s present state of mind.
Currently being developed by Assoc. Prof. Huanyu “Larry” Cheng and colleagues at Pennsylvania State University, the flexible, stretchable device incorporates sensors that measure biaxial mechanical strain, body temperature, sweat-induced humidity, and blood oxygen levels. These sensors are stacked like pancakes, separated by thin sheets of different materials that keep their signals and measurement methods from interfering with one another.
Other components include a printed circuit board, wireless charging coil, 5-volt battery, and Bluetooth chip. All of these bits and pieces are encapsulated within a waterproof silicone covering, with the whole device measuring about 6 cm (2.4 inches) in length.
When the sticker is temporarily adhered to the patient’s face, its strain sensor monitors their skin movements along two axes, with the onboard Bluetooth chip wirelessly relaying that data to an app on a nearby cloud-connected smartphone or tablet.
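To give a sense of what such a telemetry stream might look like, here is a minimal Python sketch that unpacks a hypothetical fixed-layout Bluetooth notification into named sensor readings. The packet format, field order, and scaling factors are all assumptions made for illustration – the researchers haven’t published the device’s actual protocol.

```python
import struct
from dataclasses import dataclass

@dataclass
class StickerReading:
    strain_x: float  # biaxial strain, axis 1 (dimensionless)
    strain_y: float  # biaxial strain, axis 2 (dimensionless)
    temp_c: float    # skin temperature, degrees C
    humidity: float  # sweat-induced relative humidity, %
    spo2: float      # blood oxygen saturation, %

def parse_packet(payload: bytes) -> StickerReading:
    """Decode a hypothetical 10-byte little-endian packet: two signed
    16-bit strain values (scaled x1000), then temperature, humidity,
    and SpO2 as unsigned 16-bit values (scaled x100)."""
    sx, sy, t, h, o = struct.unpack("<hhHHH", payload)
    return StickerReading(sx / 1000, sy / 1000, t / 100, h / 100, o / 100)

# Example: a synthetic packet as the app might receive it over Bluetooth.
demo = struct.pack("<hhHHH", 42, -17, 3421, 5580, 9730)
print(parse_packet(demo))
```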
AI-based algorithms in the software are then able to deduce the person’s current facial expression, which typically reflects their mood. In lab tests, the technology has proven to be over 96% accurate at identifying six common facial expressions: happiness, surprise, fear, sadness, anger, and disgust.
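The coverage doesn’t specify what model the team uses, so purely as an illustration, here is a minimal six-way classifier sketch using scikit-learn, with synthetic feature vectors standing in for real windows of strain data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

EXPRESSIONS = ["happiness", "surprise", "fear", "sadness", "anger", "disgust"]

# Synthetic stand-in: each sample is a feature vector summarizing a short
# window of biaxial strain (e.g. means, peaks, and slopes per axis).
rng = np.random.default_rng(0)
n_per_class, n_features = 200, 8
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, n_features))
               for i in range(len(EXPRESSIONS))])
y = np.repeat(np.arange(len(EXPRESSIONS)), n_per_class)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

print(f"accuracy on held-out windows: {clf.score(X_test, y_test):.2%}")
print("predicted expression:", EXPRESSIONS[clf.predict(X_test[:1])[0]])
```

Any model that maps a strain-feature window to one of six labels would fill the same role; the random forest here is just a simple, self-contained stand-in.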
That said, expressions can be faked, often without the person even realizing it. For that reason, the app also factors in real-time readings from the temperature, humidity, and blood oxygen sensors. Using this combination of data, the system is so far almost 89% accurate at identifying true emotions triggered by the viewing of various video clips.
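How the facial and physiological channels are weighed against each other isn’t detailed, but one common pattern for this kind of problem is late fusion: the expression model’s class probabilities are concatenated with the physiological readings, and a second model makes the final call. A hypothetical sketch along those lines, again with synthetic data standing in for labeled video-clip sessions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

EMOTIONS = ["happiness", "surprise", "fear", "sadness", "anger", "disgust"]

def fuse_features(expr_probs: np.ndarray, temp_c: float,
                  humidity: float, spo2: float) -> np.ndarray:
    """Late fusion: append the physiological readings to the six-way
    probability vector produced by the expression classifier."""
    return np.concatenate([expr_probs, [temp_c, humidity, spo2]])

# Synthetic training data (random labels, for structure only).
rng = np.random.default_rng(1)
n = 600
expr_probs = rng.dirichlet(np.ones(6), size=n)   # per-session expression probabilities
physio = rng.normal([34.0, 55.0, 97.0], [0.5, 5.0, 1.0], size=(n, 3))
X = np.hstack([expr_probs, physio])
y = rng.integers(0, 6, size=n)                   # "true emotion" labels

fusion_model = LogisticRegression(max_iter=1000).fit(X, y)
sample = fuse_features(expr_probs[0], *physio[0])
print("fused emotion estimate:", EMOTIONS[fusion_model.predict([sample])[0]])
```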
That figure should improve as the technology is developed further. And importantly, because the data is processed in the cloud, the device could allow doctors to remotely monitor the psychological wellbeing of their patients.
“This is a new and improved way to understand our emotions by looking at multiple body signals at once,” says Cheng. “People often don’t visibly show how they truly feel, so that’s why we’re combining facial expression analysis with other important physiological signals, which will ultimately lead to better mental health monitoring and support.”
The research is described in a paper that was recently published in the journal Nano Letters.