Understanding emotions is not a straightforward process. From laughter to tears and every feeling in between, not everyone expresses or interprets emotions in the same way. Will machines be able to identify and analyze the full spectrum of human emotions? Will they achieve empathy?
At the closing session of the recent SEMI MEMS & Sensors Executive Congress (MSEC 2020), Jens Fabrowsky, executive vice president for automotive electronics at Robert Bosch GmbH, depicted a future where machines can read users’ emotions and predict their intentions — a future where artificial intelligence transitions toward artificial empathy.
Processing data at the edge
In just a few decades, microelectromechanical system (MEMS) devices have moved from small specialty applications to mainstream products used in every aspect of our lives. “MEMS have been researched since the dawn of semiconductors throughout the world, but it is only recently that MEMS are present in all high-volume consumer devices we use every day,” Fabrowsky said at MSEC 2020. According to Yole Développement (Lyon, France), global MEMS revenue is set to grow at a 7.4% CAGR, from US$11.5 billion in 2019 to US$17.7 billion in 2025. The consumer market is and will continue to be the foremost driver for MEMS, with about 60% of the total, followed by the automotive market, at less than 20% of the total.
MEMS designs are driven by the increase in sensing density achieved by orchestrating innovations in four main technology areas: manufacturing, materials, packaging, and computing. In manufacturing, the next frontier of evolution is vertical integration, or 3D MEMS, said Fabrowsky, specifying that today’s state of the art is the use of extra-thick sacrificial layers. Innovation also comes from new materials such as piezoelectric materials for speakers, nitrogen-vacancy centers in diamond for magnetometers, and graphene for chemical sensors. As for packaging innovations, stacking different dies in the same package has improved the form factor, reduced the system cost, and increased the performance by limiting parasitic effects, said Fabrowsky.
These building blocks for MEMS devices are necessary but not sufficient to develop the products customers are asking for. “Most of the innovation that will occur for MEMS in the next decade will be through application-level processing and intelligence at the edge,” Fabrowsky said. Having AI locally, directly in the MEMS device, will yield more relevant and useful applications for users.
Running AI algorithms at the edge offers four main benefits. The first is personalization: calculation is performed locally and optimized for the user based on his or her individual behavior and environment. Second is trust, or privacy of user data; local edge processing gives the user full control of what data is released and for what purpose. The third benefit is real-time feedback: executing at the edge avoids data transfer and reduces latency. Fourth is improved battery life, because processing data locally avoids power-hungry wireless transmission of raw sensor streams.
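The privacy and latency benefits above can be illustrated with a minimal sketch of on-device inference. This is not Bosch's implementation; the function names, thresholds, and the simple magnitude-based rule are illustrative assumptions. The point is architectural: only the derived label, never the raw accelerometer samples, would leave the device.

```python
# Illustrative sketch: classifying activity from 3-axis accelerometer
# samples locally (at the edge). All names and thresholds are
# hypothetical, not a real sensor API.

def magnitude(sample):
    """Euclidean norm of an (x, y, z) accelerometer sample, in g."""
    x, y, z = sample
    return (x * x + y * y + z * z) ** 0.5

def classify_activity(samples):
    """Map the average acceleration magnitude of a sample window to a
    coarse activity label. Only this label (not the raw data) would be
    shared off-device, preserving privacy and avoiding radio traffic."""
    avg = sum(magnitude(s) for s in samples) / len(samples)
    if avg < 1.1:      # near 1 g: device essentially at rest
        return "resting"
    elif avg < 1.8:    # moderate motion
        return "walking"
    return "running"

# Local inference: the device reports "walking", not three raw samples.
window = [(0.2, 0.1, 1.2), (0.4, 0.3, 1.3), (0.1, 0.2, 1.1)]
print(classify_activity(window))  # → walking
```

Because the classification runs in the sensor's own compute path, the feedback loop is bounded by local processing time rather than a network round trip, which is what enables the real-time coaching use cases described below.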
MEMS sensors and actuators are incorporating ever more intelligence, and local sensor data processing will continue to shape the timeline, said Fabrowsky. By capturing and analyzing the wealth of data, sensors bring humans and the digital world together. Human-machine interactions are rapidly evolving, but isn’t it time to deepen this relationship and make the machine identify, infer, interpret, and feel human emotions? A kind of artificial empathy?
Empathetic sensors are already here, said Dimitrios Damianos, technology and market analyst in the photonics and sensing division at Yole Développement, in a discussion with EE Times Europe. But he urged us to think about the meaning and understanding of the word. “What is it to be empathetic? A human quality in the sense that it will understand my emotions based on my voice” and a sensing capability combined with large processing power to read between the lines and decode the true state of a user’s needs, Damianos said.
Achieving artificial empathy
Humans are social creatures, and emotions govern our actions and reactions. The ability to recognize emotional facial and vocal expressions is innate, and when we meet someone for the first time, he or she only needs to speak for seven seconds for us to form an opinion. Reading emotions accurately on the faces of others is the result of complex processes, and neuropsychology studies suggest that intonations and body language are more meaningful than the actual choice of words.
Holistic perception is only possible through multi-sensor data fusion. “Sensors can independently identify and characterize an activity such as walking, cycling, or even sleeping,” said Fabrowsky. “At the sensor level, we are beginning to consolidate multiple information [types] to determine the health, the feelings, and the mood of the user. Sensors can answer questions such as, ‘Is my stress level rising?’ and we can use the information to coach the user.” This is sometimes referred to as the tactile internet, in which all human senses interact with machines.
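A toy sketch can make the fusion idea concrete. The sensors, thresholds, and scoring rule below are entirely hypothetical; what it demonstrates is the principle Fabrowsky describes: an elevated heart rate is ambiguous on its own, but combined with motion data showing the user is at rest, it becomes a plausible stress cue that no single sensor could supply.

```python
# Hypothetical sketch of multi-sensor data fusion answering
# "Is my stress level rising?" Thresholds and the scoring rule are
# illustrative assumptions, not a real product algorithm.

def fuse_stress_signal(heart_rate_bpm, motion_level, skin_temp_c):
    """Fuse three independent readings into a coarse stress cue.
    The key fusion insight: a high pulse *without* matching physical
    activity points to stress rather than exercise."""
    score = 0
    if heart_rate_bpm > 100 and motion_level == "resting":
        score += 2   # high pulse while still: likely stress, not exertion
    if skin_temp_c > 37.5:
        score += 1   # mild temperature rise can accompany stress
    return "stress rising" if score >= 2 else "normal"

# High pulse at rest: the fused view flags stress.
print(fuse_stress_signal(110, "resting", 36.8))  # → stress rising
# The same pulse while walking is unremarkable.
print(fuse_stress_signal(110, "walking", 36.8))  # → normal
```

In a real device, each input would itself be the output of an edge classifier (as with the activity example earlier), so fusion happens over compact labels rather than raw sensor streams.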
“It is only by understanding such emotions that machines can become intentional and helpful,” added Fabrowsky. But to build trust and extend the human-machine relationship, devices must become more intuitive and master values such as caring and compassion. “We will feel closer together with machines that are able to receive and transmit empathetic information. The end purpose is to digitally predict the intent of the user and allow him or her to make decisions in the physical life based on his or her own data and connected data.”