Achieving Artificial Empathy: An AI Challenge

Article by Anne-Françoise Pelé

Understanding emotions is not a straightforward process. From laughter to tears and every feeling in between, not everyone expresses or interprets emotions in the same way. Will machines be able to identify and analyze the full spectrum of human emotions? Will they achieve empathy? At the closing session of the recent SEMI MEMS & Sensors Executive Congress (MSEC 2020), Jens Fabrowsky, executive vice president for automotive electronics at Robert Bosch GmbH, depicted a future where machines can read users’ emotions and predict their intentions — a future where artificial intelligence transitions toward artificial empathy.

Processing data at the edge

In just a few decades, microelectromechanical system (MEMS) devices have moved from small specialty applications to mainstream products used in every aspect of our lives. "MEMS have been researched since the dawn of semiconductors throughout the world, but it is only recently that MEMS are present in all high-volume consumer devices we use every day," Fabrowsky said at MSEC 2020. According to Yole Développement (Lyon, France), global MEMS revenue is set to grow at a 7.4% CAGR, from US$11.5 billion in 2019 to US$17.7 billion in 2025. The consumer market is and will continue to be the foremost driver for MEMS, accounting for about 60% of the total, followed by the automotive market at less than 20%.

MEMS designs are driven by increases in sensing density, achieved by orchestrating innovations in four main technology areas: manufacturing, materials, packaging, and computing. In manufacturing, the next frontier is vertical integration, or 3D MEMS, said Fabrowsky, noting that today's state of the art is the use of extra-thick sacrificial layers. Innovation also comes from new materials, such as piezoelectric materials for speakers, nitrogen-vacancy centers in diamond for magnetometers, and graphene for chemical sensors. As for packaging, stacking different dies in the same package has improved the form factor, reduced system cost, and increased performance by limiting parasitic effects, said Fabrowsky.
Robert Bosch GmbH's Jens Fabrowsky
These building blocks for MEMS devices are necessary but not sufficient to develop the products customers are asking for. "Most of the innovation that will occur for MEMS in the next decade will be through application-level processing and intelligence at the edge," Fabrowsky said. Having AI locally, directly in the MEMS device, will yield more relevant and useful applications for users.

Running AI algorithms at the edge offers four main benefits. The first is personalization: calculation is performed locally and optimized for the user based on his or her individual behavior and environment. The second is trust, or privacy of user data: local edge processing provides full control over what data is released and for what purpose. The third is real-time feedback: executing at the edge avoids data transfer and reduces latency. The fourth is improved battery life, enabled by local processing.

MEMS sensors and actuators are incorporating ever more intelligence, and local sensor data processing will continue to shape the roadmap, said Fabrowsky. By capturing and analyzing a wealth of data, sensors bring humans and the digital world together. Human-machine interactions are rapidly evolving, but isn't it time to deepen this relationship and make the machine identify, infer, interpret, and feel human emotions? A kind of artificial empathy?

Empathetic sensors are already here, said Dimitrios Damianos, technology and market analyst in the photonics and sensing division at Yole Développement, in a discussion with EE Times Europe. But he urged us to think about the meaning of the word. "What is it to be empathetic? A human quality in the sense that it will understand my emotions based on my voice," Damianos said, along with a sensing capability combined with large processing power to read between the lines and decode the true state of a user's needs.
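To make the edge-processing idea concrete, here is a minimal sketch of an always-on loop that classifies motion entirely on-device. It is not Bosch's implementation: the sensor stub, the 50-Hz sampling rate, and the variance thresholds are all illustrative assumptions. The point is structural: raw samples never leave the node, which is where the privacy, latency, and battery benefits come from.

```python
# Illustrative sketch only: the sensor stub, sampling rate, and variance
# thresholds are assumptions, not a real MEMS driver interface.
import random
from collections import deque

WINDOW = 50  # samples per decision (~1 s at an assumed 50-Hz rate)

def read_accel_magnitude() -> float:
    """Stand-in for a real MEMS accelerometer driver (hypothetical)."""
    return random.gauss(1.0, 0.15)  # roughly 1 g plus noise

def classify(window) -> str:
    """Crude on-device rule: variance of |a| separates rest from motion."""
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    if var < 0.005:
        return "resting"
    if var < 0.05:
        return "walking"
    return "running"

buf = deque(maxlen=WINDOW)
for i in range(10 * WINDOW):  # stands in for an infinite sensing loop
    buf.append(read_accel_magnitude())
    if len(buf) == WINDOW and i % WINDOW == 0:
        # Only the label, never the raw samples, leaves the node.
        print(classify(buf))
```

Because the decision is a compact label rather than a raw data stream, the radio can stay off most of the time, and nothing personal needs to be uploaded at all.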

Achieving artificial empathy

Humans are social creatures, and emotions govern our actions and reactions. The ability to recognize emotional facial and vocal expressions is innate, and when we meet someone for the first time, he or she only needs to speak for seven seconds for us to form an opinion. Reading emotions accurately on the faces of others is the result of complex processes, and neuropsychology studies suggest that intonations and body language are more meaningful than the actual choice of words.

Holistic perception is only possible through multi-sensor data fusion. "Sensors can independently identify and characterize an activity such as walking, cycling, or even sleeping," said Fabrowsky. "At the sensor level, we are beginning to consolidate multiple information [types] to determine the health, the feelings, and the mood of the user. Sensors can answer questions such as, 'Is my stress level rising?' and we can use the information to coach the user." This is sometimes referred to as the tactile internet, in which all human senses interact with machines.

"It is only by understanding such emotions that machines can become intentional and helpful," added Fabrowsky. But to build trust and extend the human-machine relationship, devices must become more intuitive and master values such as caring and compassion. "We will feel closer together with machines that are able to receive and transmit empathetic information. The end purpose is to digitally predict the intent of the user and allow him or her to make decisions in the physical life based on his or her own data and connected data."
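As a hedged illustration of the fusion idea behind "Is my stress level rising?", the sketch below combines two simulated signals, heart rate and motion level, into a single trend answer. The intuition is that a heart rate elevated without matching physical activity is more suggestive of stress than the same heart rate during exercise. The features, weights, thresholds, and resting heart rate are invented for this example; they are not Bosch's algorithm.

```python
# Toy multi-sensor fusion: all numbers here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Reading:
    heart_rate_bpm: float   # e.g., from an optical or ECG sensor
    motion_level: float     # 0..1, e.g., an accelerometer-derived feature

def stress_score(r: Reading, resting_hr: float = 60.0) -> float:
    """Heart-rate excess not explained by physical activity."""
    hr_excess = max(0.0, (r.heart_rate_bpm - resting_hr) / resting_hr)
    explained_by_motion = min(hr_excess, r.motion_level)
    return hr_excess - explained_by_motion  # unexplained arousal

def is_stress_rising(history: list[Reading]) -> bool:
    """Compare the recent half of the window against the older half."""
    scores = [stress_score(r) for r in history]
    half = len(scores) // 2
    return sum(scores[half:]) / half > sum(scores[:half]) / half

# Example: calm start, then a rising heart rate while sitting still.
history = [Reading(62, 0.05), Reading(63, 0.05), Reading(64, 0.04),
           Reading(78, 0.05), Reading(85, 0.04), Reading(92, 0.05)]
print(is_stress_rising(history))  # True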
This is already happening in the automotive domain. At the 2019 Frankfurt Motor Show, French automotive software supplier Valeo unveiled Smart Cocoon, which leverages AI and the information captured by various sensors to deliver personalized comfort through a combination of temperature, lighting, sound effects, and fragrance. The interior lighting, for instance, adjusts automatically to reflect the temperature and adds to the feeling of comfort, with warm shades for heat and pale shades for cool. The vehicle becomes empathetic, or emotionally intelligent, in the sense that it considers the state of its driver and passengers by detecting signs of fatigue, distraction, emotion, and stress, the company claims.

Building advanced driver assistance systems "taught us that the only way to have a real-time experience is to process the sensor data locally," Fabrowsky said. "All these capabilities of developing intent are growing in the sensor itself — not in an app, not in the cloud." Such localization will provide better personalization, the highest privacy, the lowest possible power consumption, and minimum latency. Germany's Bosch Sensortec is exploring the promise of neuromorphic sensing and computation by building brain-inspired AI at all levels of the system, from the sensor itself to the application layers. "This helps us mimic the brain hierarchies," said Fabrowsky.

With the emergence of wearable medical devices and biosensors, algorithmic processes help people learn about their personal health and fitness level. Going well beyond step counting, step tracking, and activity monitoring, the next frontier is early-stage medical diagnostics, said Fabrowsky. "We are just at the beginning of what empathetic computing can do, and new opportunities are in our reach," he said. These include "the ability to detect the onset of diseases like Parkinson's or sclerosis well in advance thanks to non-invasive, always-on motion sensors containing AI processing." (A simplified sketch of this kind of tremor analysis appears at the end of this article.)

Eventually, artificial empathy will make devices behave "like a trusted friend that we want to have at our side" and gradually enter all aspects of our lives. Applications could span multiple sectors, including industrial automation, robotics, telepresence, elderly care, remote education, and training.

From artificial strength (AS), we moved to artificial intelligence. We are now undergoing a new transition, from AI to artificial empathy. "In the 1950s, the transistor enabled the roadmap of increasing computational power, augmenting our ability to model a forecast of the world and leading us to AI," Fabrowsky said. "Thanks to MEMS, we will enormously extend and process sensory inputs, and interactions with machines will be more humanlike." As Fabrowsky concluded, "The next decade will be remembered for its transition to artificial empathy."

But just as artificial intelligence differs from human intelligence, artificial empathy will differ from human empathy. Could artificial empathy eliminate cognitive biases? Unlike humans, could it make decisions devoid of cultural and gender stereotypes? That debate, venturing into the realms of philosophy and sociology, is one for another time.
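To close with the promised sketch of always-on motion analysis for tremor screening: parkinsonian rest tremor typically concentrates energy at roughly 4 to 6 Hz, so the fraction of spectral power in that band is a crude but telling feature. The sampling rate, band edges, and synthetic signals below are illustrative assumptions, not a clinical method; real diagnostics would require clinically validated models.

```python
# Illustrative tremor-band analysis on simulated accelerometer data.
import numpy as np

FS = 100.0  # assumed accelerometer sampling rate, Hz

def tremor_band_ratio(signal: np.ndarray, lo=4.0, hi=6.0) -> float:
    """Fraction of (mean-removed) spectral power in the tremor band."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    band = (freqs >= lo) & (freqs <= hi)
    return spectrum[band].sum() / spectrum.sum()

# Synthetic example: a 5-Hz tremor buried in broadband motion noise.
t = np.arange(0, 10, 1.0 / FS)
rng = np.random.default_rng(0)
tremor = 0.3 * np.sin(2 * np.pi * 5.0 * t) + 0.2 * rng.standard_normal(t.size)
calm = 0.2 * rng.standard_normal(t.size)

print(f"tremor-like: {tremor_band_ratio(tremor):.2f}")  # high ratio
print(f"calm:        {tremor_band_ratio(calm):.2f}")    # low ratio
```

Running such a feature continuously in the sensor itself, rather than in the cloud, is exactly the pattern Fabrowsky describes: the device watches for a subtle, slowly emerging signature without ever transmitting the wearer's raw motion data.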
