Facial expressions can say a lot about our tastes. From music to food, the things we like are no longer a secret now that smartphone apps can read our facial micro-expressions. Most of our communication with other people is non-verbal anyway; some studies suggest our words account for less than 10% of it. But what happens when the Facial Action Coding System (FACS) is applied to technology? Well, that's when your smartphone knows you better than you know yourself!
The app from Oxford that chooses food for your mood
Developed by University of Oxford scientists in collaboration with the food delivery service Just Eat, this app suggests foods based on a user's mood. It uses facial mapping to read emotions in real time and matches the right food to each one.
Certain foods can positively affect a person's mood by stimulating the production of serotonin, one of the body's strongest influences on mood, appetite, sleep and pain perception. If the app recognizes anger or stress, for example, it suggests calming foods such as almonds, avocado and chocolate.
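At its simplest, the matching step described above amounts to a lookup from a detected emotion to a list of serotonin-friendly foods. Here is a minimal sketch of that idea; the emotion labels and food lists are illustrative assumptions, not the app's actual data or logic.

```python
# Hypothetical emotion-to-food table. Anger and stress map to the
# calming foods mentioned in the article; other entries are made up.
FOODS_BY_EMOTION = {
    "anger":   ["almonds", "avocado", "chocolate"],
    "stress":  ["almonds", "avocado", "chocolate"],
    "sadness": ["bananas", "oats", "dark chocolate"],
    "joy":     ["berries", "yogurt"],
}

def suggest_foods(emotion: str) -> list[str]:
    """Return mood-boosting food suggestions for a detected emotion."""
    # Fall back to neutral suggestions for emotions we don't cover.
    return FOODS_BY_EMOTION.get(emotion, ["water", "fruit"])

print(suggest_foods("anger"))  # ['almonds', 'avocado', 'chocolate']
```

A production system would of course sit behind a real emotion classifier and a far richer recommendation model; the dictionary stands in for both here.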
MorphCast: videos that change based on your facial expression
Another app that can read facial expressions is MorphCast. Its system recognizes your face as you watch the screen, estimates your age, gender and emotions, and, once it has collected this data, modifies the video you're watching.
Here too the smartphone camera takes center stage, capturing the user's expressions. Interaction is heightened because the viewer becomes the director, steering the video through the expressions they show. It will no doubt appeal to a lot of people, but it could also have future applications in children's safety, as a parental control, or in marketing and advertising.
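The branching logic described above can be pictured as choosing the next video segment from the viewer's dominant emotion. The sketch below assumes a set of pre-authored segments; the segment names and emotion labels are hypothetical, and a real system like MorphCast would feed this from live face analysis of camera frames.

```python
# Hypothetical pre-authored branches keyed by the viewer's dominant
# emotion. A real player would update this choice as the video plays.
BRANCHES = {
    "happy":    "upbeat_scene.mp4",
    "sad":      "comforting_scene.mp4",
    "surprise": "plot_twist.mp4",
}
DEFAULT_SEGMENT = "neutral_scene.mp4"

def next_segment(dominant_emotion: str) -> str:
    """Pick the next video segment based on the viewer's expression."""
    return BRANCHES.get(dominant_emotion, DEFAULT_SEGMENT)

print(next_segment("surprise"))  # plot_twist.mp4
```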
Peekabeat, the app that guesses your favorite music
Peekabeat is an app developed by AQuest that recognizes a smartphone user’s emotions by reading their facial expressions, and then suggests a suitable playlist for how they’re feeling in that moment.
How does it work? The user takes a selfie or uploads a photo from their gallery, and the image is analyzed using the Facial Action Coding System (FACS); the app then suggests a full playlist based on seven recognized emotions.
The system is integrated with the music streaming service Spotify, which provides the songs. And thanks to Microsoft technologies such as Azure Web Apps, Azure Cognitive Services and the Emotion API, it works on desktop too.
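Conceptually, the flow just described takes per-emotion confidence scores from face analysis, picks the strongest of the seven emotions, and maps it to a playlist. Here is a hedged sketch of that last step; the emotion set matches the seven categories Microsoft's Emotion API scored, but the playlist IDs and score values are illustrative assumptions, not Peekabeat's actual data.

```python
# The seven emotion categories scored by Microsoft's Emotion API.
EMOTIONS = ["anger", "contempt", "disgust", "fear",
            "happiness", "sadness", "surprise"]

# Hypothetical playlist mapping; real Spotify URIs would go here.
PLAYLISTS = {
    "happiness": "spotify:playlist:feelgood_hits",
    "sadness":   "spotify:playlist:rainy_day",
    "anger":     "spotify:playlist:heavy_riffs",
}

def pick_playlist(scores: dict[str, float]) -> str:
    """Choose a playlist for the highest-scoring of the seven emotions."""
    # Treat any emotion missing from the analysis result as score 0.
    full = {e: scores.get(e, 0.0) for e in EMOTIONS}
    dominant = max(full, key=full.get)
    return PLAYLISTS.get(dominant, "spotify:playlist:neutral_mix")

scores = {"happiness": 0.81, "sadness": 0.07, "surprise": 0.12}
print(pick_playlist(scores))  # spotify:playlist:feelgood_hits
```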
Faception identifies terrorists through physical traits
It's not just about music and food: facial recognition apps are also being used for national security. One example is Faception, which draws on Cesare Lombroso's controversial theories about identifying criminals by analyzing their physical traits.
The system combines computer vision and computational intelligence, using classifiers built around 15 sets of facial characteristics common across all types of individuals. From the data collected, it claims to infer information about a person's personality from a simple photo.
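Structurally, that amounts to scoring a photo against each of the 15 trait sets and combining the scores into per-profile confidences. The sketch below shows only that combination step; every name and weight is hypothetical, and the scientific validity of inferring personality from facial traits is widely disputed.

```python
# Illustrative only: combine 15 facial-trait scores (from upstream
# classifiers, not shown) into a confidence per candidate profile
# via a weighted sum. Weights and profile names are made up.
def profile_scores(trait_scores: list[float],
                   weights: dict[str, list[float]]) -> dict[str, float]:
    """Weighted sum of the 15 trait scores for each candidate profile."""
    return {
        profile: sum(w * s for w, s in zip(ws, trait_scores))
        for profile, ws in weights.items()
    }
```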
In tests run in collaboration with Israeli national security forces, Faception identified 9 of the 11 perpetrators of the terrorist attacks in Paris, even though only 3 had prior criminal records. According to the company, the app has an 80% success rate in identifying not only someone who plans to commit a terrorist attack, but also someone with the potential to be the next Steve Jobs.
“We know human beings much better than they know each other, because personality is determined by DNA, which is reflected in facial features,” says Shai Gilboa, co-founder of Faception.