In 2023, we are witnessing the rise of emotional AI as a dominant application of machine learning. New technologies promise to detect and interact with human emotions, opening opportunities across a range of industries. For instance, Hume AI, founded by former Google researcher Alan Cowen, is developing tools to measure emotions from verbal, facial, and vocal expressions. The Swedish company Smart Eye recently acquired Affectiva, an MIT Media Lab spinoff known for its SoundNet neural network, which can classify emotions such as anger from audio samples in as little as 1.2 seconds. Even widely used platforms like Zoom are introducing features such as Zoom IQ, which offers real-time analysis of emotions and engagement during virtual meetings.
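None of these vendors disclose how their models work, but the basic pipeline they gesture at (convert a short audio clip into acoustic features, then score those features against a fixed set of emotion labels) can be sketched in a few lines. The snippet below is a minimal illustration of that idea, not Affectiva's SoundNet: the `librosa` feature extraction is standard, while the saved `emotion_clf.joblib` classifier and the label set are hypothetical stand-ins.

```python
# Minimal sketch of audio emotion classification (illustrative only;
# not SoundNet). Assumes a pretrained scikit-learn classifier saved
# as emotion_clf.joblib -- a hypothetical artifact.
import joblib
import librosa

EMOTIONS = ["anger", "joy", "sadness", "fear", "neutral"]  # assumed labels

def classify_clip(path: str) -> str:
    # Load ~1.2 seconds of mono audio, echoing the latency quoted above.
    y, sr = librosa.load(path, sr=16000, duration=1.2, mono=True)
    # Summarize the clip as mean MFCCs, a common lightweight acoustic feature.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    features = mfcc.mean(axis=1).reshape(1, -1)
    clf = joblib.load("emotion_clf.joblib")  # hypothetical pretrained model
    return EMOTIONS[int(clf.predict(features)[0])]

print(classify_clip("sample.wav"))  # e.g. "anger"
```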

In the realm of chatbots, 2023 marks the arrival of advanced models designed to mimic human emotions more closely, promising more empathetic connections with users in fields from banking to education and healthcare. Microsoft’s chatbot Xiaoice has already made significant strides in China, where users reportedly converse with it more than 60 times a month on average. Xiaoice is even said to have passed the Turing test, with users failing to recognize it as a bot for up to 10 minutes. The consultancy Juniper Research forecasts that chatbot interactions in healthcare will rise by almost 167 percent between 2018 and 2023, reaching 2.8 billion interactions a year. This is expected to free up valuable medical staff time and potentially save around $3.7 billion for healthcare systems worldwide.

Emotional AI is also making inroads into education. In Hong Kong, some secondary schools have adopted an artificial intelligence program developed by Find Solution AI. It analyzes minute facial muscle movements in students to identify a range of positive and negative emotions, helping teachers track emotional shifts, motivation, and focus so they can intervene early when a student loses interest.
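Find Solution AI has not published its stack, but the frame-by-frame scoring this kind of classroom tool performs can be approximated with open-source components. The sketch below uses the `fer` Python package (an assumption made for illustration, not the company's actual system) to assign an emotion score to each face in each video frame.

```python
# Illustrative per-frame facial emotion scoring, in the spirit of
# classroom-monitoring tools (not Find Solution AI's actual system).
import cv2
from fer import FER

detector = FER(mtcnn=True)  # MTCNN face detection for tighter face boxes
video = cv2.VideoCapture("classroom.mp4")  # hypothetical input video

while True:
    ok, frame = video.read()
    if not ok:
        break
    # Each result pairs a face bounding box with a probability per emotion
    # (angry, disgust, fear, happy, sad, surprise, neutral).
    for face in detector.detect_emotions(frame):
        label, score = max(face["emotions"].items(), key=lambda kv: kv[1])
        print(face["box"], label, round(score, 2))

video.release()
```

Even this toy version makes the limitation discussed below concrete: the output is a single label and a confidence score, with no access to why a student's face looks the way it does.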

However, much of emotional AI rests on flawed science. Even when trained on large and diverse datasets, emotional AI algorithms reduce complex facial and tonal expressions to a handful of emotion labels without considering the social and cultural context of the person and the situation. An algorithm may recognize tears, for instance, but not what caused them or what they mean. Similarly, a frowning face does not necessarily denote anger, yet an algorithm is likely to conclude that it does. The reason is that people adapt their emotional displays to societal and cultural norms, masking their true feelings through “emotion work” and learned responses. Women, for example, often modify their emotional expressions more than men do, especially for emotions with negative connotations such as anger, because of societal expectations.

As a result, AI systems that make assumptions about emotional states are poised to exacerbate gender and racial inequalities. A 2019 UNESCO report, for example, highlighted the harm done by gendered AI technologies, showing how “feminine” voice-assistant systems are designed around stereotypes of emotional passivity and servitude.

Facial recognition AI can likewise perpetuate racial disparities. An analysis of photos of 400 NBA players using two popular emotion-recognition programs, Face++ and Microsoft’s Face API, found that both tended to assign more negative emotions to Black players, even when they were smiling. The result echoes earlier research showing that Black men often have to project more positive emotions in the workplace because they are stereotyped as aggressive and threatening.

While emotional AI technologies are set to become increasingly pervasive in 2023, they raise ethical challenges that must be addressed. Left unchecked and unexamined, these technologies risk reinforcing systemic racial and gender biases, perpetuating existing inequalities, and further marginalizing already vulnerable populations.
