Oct 24 2019

Fujitsu Develops AI based Facial Expression Recognition Technology to Accurately Detect Subtle Changes in Expression

Fujitsu announced the development of an AI facial expression recognition technology that detects subtle changes in facial expression with a high degree of accuracy. It is more accurate than existing approaches at tracking complex facial expressions such as awkward giggles, nervousness or confusion.


Following up on Fujitsu’s press release, ZDNet published an article highlighting the new AI tool developed by Fujitsu.

Facial recognition has always been a controversial subject, intriguing some and leaving others sceptical, but its use is undeniably growing. Always in the vanguard, Fujitsu has developed a way to help track emotions better too.

Our laboratories have come up with an AI-based technology that can track subtle changes in expression, like nervousness or confusion. “Companies like Microsoft are already using emotion tools to recognise facial expression, but they are limited to eight ‘core’ states: anger, contempt, fear, disgust, happiness, sadness, surprise or neutral.”

“Fujitsu’s new technology works by identifying various action units (AUs) – that is, certain facial muscle movements we make and which can be linked to specific emotions. For example, if both the AU "cheek raiser" and the AU "lip corner puller" are identified together, the AI can conclude that the person it is analysing is happy.”
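To make the action-unit idea concrete, here is a minimal Python sketch of how detected AUs might be combined into emotion labels with simple rules. The AU codes follow the Facial Action Coding System, but the rule set, threshold and detector output shown here are illustrative assumptions, not Fujitsu’s implementation.

# Illustrative sketch: mapping detected facial action units (AUs) to
# emotion labels with simple co-occurrence rules. The rules and the
# 0.5 activation threshold are hypothetical placeholders.

from typing import Dict, List

# An emotion is reported when all of its required AUs are active.
EMOTION_RULES = {
    "happiness": {"AU06", "AU12"},           # cheek raiser + lip corner puller
    "surprise":  {"AU01", "AU02", "AU26"},   # brow raisers + jaw drop
    "sadness":   {"AU01", "AU04", "AU15"},   # inner brow raiser, brow lowerer, lip corner depressor
}

def infer_emotions(au_scores: Dict[str, float], threshold: float = 0.5) -> List[str]:
    """Return emotions whose required AUs all exceed the activation threshold."""
    active = {au for au, score in au_scores.items() if score >= threshold}
    return [emotion for emotion, required in EMOTION_RULES.items()
            if required <= active]

# Example: AU intensities as they might come from an AU detector.
scores = {"AU06": 0.8, "AU12": 0.9, "AU04": 0.1}
print(infer_emotions(scores))  # ['happiness']

In a real system the AU scores would come from a trained detector; the point of the rules is simply that combinations of AUs, rather than single muscle movements, are what map onto emotions.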

The issue with current technology is that the AI needs to be trained on huge datasets for each AU: it has to learn to recognise an AU from all possible angles and positions. But there are not enough labelled images for that, so in practice accuracy suffers.

Because of the vast “amounts of data needed to train an AI to effectively detect emotions, current technologies struggle to recognise what we feel, particularly so if we are not in a prime position – that is, sitting in front of a camera and looking straight into it.”

However, Fujitsu has now found a work-around for this issue. Instead of creating more images to train the AI, Fujitsu developed a tool to extract more data out of one picture. “Thanks to what it calls a "normalisation process", it can convert pictures taken from a particular angle into images that resemble a frontal shot.”

“After it has been appropriately enlarged, reduced or rotated, the newly-created frontal picture lets the AI detect AUs much more easily – and much more accurately.”
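As an illustration of what such a normalisation step can look like in practice, the following Python sketch warps an off-angle face toward a canonical frontal layout using facial landmarks and OpenCV. The canonical landmark positions and crop size are assumptions, and this is a generic frontal-alignment technique, not Fujitsu’s actual normalisation process.

# Minimal sketch of a landmark-based normalisation step, assuming OpenCV
# and a separate facial landmark detector (not shown).

import cv2
import numpy as np

# Assumed canonical frontal positions (pixels) for five landmarks in a
# 256x256 crop: left eye, right eye, nose tip, left/right mouth corner.
CANONICAL = np.array([
    [ 88.0, 100.0],
    [168.0, 100.0],
    [128.0, 150.0],
    [ 96.0, 196.0],
    [160.0, 196.0],
], dtype=np.float32)

def normalise_face(image: np.ndarray, landmarks: np.ndarray, size: int = 256) -> np.ndarray:
    """Warp a face so its landmarks line up with the canonical frontal layout.

    `landmarks` is a (5, 2) float32 array of the same five points detected in
    `image`. A similarity transform (rotation, scale, translation) is estimated
    and applied, covering the "enlarged, reduced or rotated" adjustments
    described above.
    """
    matrix, _ = cv2.estimateAffinePartial2D(landmarks, CANONICAL, method=cv2.RANSAC)
    return cv2.warpAffine(image, matrix, (size, size), flags=cv2.INTER_LINEAR)

The warped, frontal-looking crop can then be fed to the same AU detector that was trained on frontal images, which is how one picture can be made to yield more usable training and inference data.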

With the same limited dataset, we can detect more AUs, and detect them more reliably, even in pictures taken from an oblique angle, for example. And with more AUs, we can identify complex emotions, which are more subtle than the core expressions currently analysed.

Fujitsu is now able to detect emotional changes as elaborate as nervous laughter with a detection accuracy of 81%, even with limited training data. The technology is also more accurate than existing approaches on established facial expression recognition benchmarks (Facial Expression Recognition and Analysis Challenge 2017).

Fujitsu aims to introduce the technology to practical applications for various use cases, including teleconferencing support, employee engagement measurement, and driver monitoring.

Nuno Costa

 

About the Author:

Nuno Costa

Channel Business Development Associate
