Giants scramble for position and VCs pour in money as companies race to give AI the power to "read minds"


Chaincatcher · 2024/10/17 02:45
By: Industry Express

The first step of AI disrupting humanity: understanding the human heart.

Author: Lexie

Editor: Lu

In the grand debate over AI, people cast it either as our most capable and efficient assistant or as the "machine legion" that will one day overthrow us. Friend or foe, AI must not only complete the tasks humans assign it but also learn to "read" the human heart. This mind-reading ability has been a major focus of the AI field this year.

In its emerging-technology research report on enterprise SaaS released this year, PitchBook singles out "Emotion AI" as a notable technology. The term refers to using affective computing and artificial intelligence to perceive, understand, and interact with human emotions, attempting to gauge how people feel by analyzing text, facial expressions, voice, and other physiological signals. In short, Emotion AI aims for machines to "read" emotions as well as, or even better than, humans do.

Its main technologies include:

  • Facial expression analysis: Detecting micro-expressions and facial muscle movements through cameras, computer vision, and deep learning.

  • Voice analysis: Recognizing emotional states through voiceprints, tone, and rhythm.

  • Text analysis: Interpreting sentences and context using natural language processing (NLP) technology.

  • Physiological signal monitoring: Analyzing heart rate, skin response, etc., using wearable devices to enhance interaction personalization and emotional richness.
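To make the text-analysis channel above concrete, here is a minimal, self-contained sketch of lexicon-based sentiment scoring in Python. The word lists and thresholds are illustrative assumptions only; production Emotion AI systems use trained NLP models rather than hand-written lexicons.

```python
# Toy lexicon-based sentiment scorer illustrating the "text analysis"
# input channel of Emotion AI. Word lists are illustrative, not real data.

POSITIVE = {"happy", "great", "love", "excellent", "calm", "satisfied"}
NEGATIVE = {"angry", "sad", "terrible", "hate", "frustrated", "anxious"}

def sentiment(text: str) -> str:
    """Classify text as positive, negative, or neutral by counting
    matches against the two word lists."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this product, it makes me happy!"))    # positive
print(sentiment("The wait was terrible and I am frustrated.")) # negative
```

Real systems replace the word counts with models that also weigh context, negation, and sarcasm, which is precisely where the NLP component described above comes in.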


Emotion AI evolved from sentiment analysis technology, which primarily analyzes interactions through text, such as extracting user emotions from social media posts. With the support of AI, integrating various input methods like visual and audio, Emotion AI promises more accurate and comprehensive emotional analysis.

01 VC Investment, Startups Secure Huge Funding

Silicon Rabbit observes that the potential of Emotion AI has attracted the attention of numerous investors. Some startups focused on this field, such as Uniphore and MorphCast, have already secured substantial investments in this track.

California-based Uniphore has been exploring automated dialogue solutions for enterprises since 2008 and has developed multiple product lines, including U-Self Serve, U-Assist, U-Capture, and U-Analyze, helping clients deliver more personalized and emotionally aware interactions through voice, text, visual, and emotion AI technologies. U-Self Serve focuses on accurately identifying emotions and tones in conversations, enabling businesses to provide more personalized services that boost user engagement and satisfaction;


U-Assist improves customer service agents' efficiency through real-time guidance and workflow automation; U-Capture provides deep insights into customer needs and satisfaction through automated emotional data collection and analysis; while U-Analyze helps clients identify key trends and emotional changes in interactions, offering data-driven decision support to enhance brand loyalty.

Uniphore's technology is not just about making machines understand language; it aims for them to capture and interpret the emotions hidden behind tone and expression during interactions with humans. This capability allows businesses to move beyond mechanical exchanges and better meet customers' emotional needs. According to the company, using Uniphore can lift user satisfaction to 87% and improve customer-service performance by 30%.

To date, Uniphore has completed over $620 million in funding, with the latest round being a $400 million investment led by NEA in 2022, with existing investors like March Capital also participating, bringing its post-round valuation to $2.5 billion.


Hume AI, founded by former Google scientist Alan Cowen, has launched what it calls the world's first empathic voice AI. Cowen is known for pioneering semantic space theory, which maps the nuances of voice, facial expression, and gesture to understand emotional experience and expression. His research has been published in journals including Nature and Trends in Cognitive Sciences, drawing on the broadest and most diverse range of emotional samples to date.

Driven by this research, Hume has developed a conversational voice API, EVI, which combines large language models with empathy algorithms to deeply understand and analyze human emotional states. It can not only recognize emotions in speech but also respond with more nuanced and personalized reactions during interactions with users, and developers can access these features with just a few lines of code, integrating them into any application.


One of the main limitations of most current AI systems is that their instructions are primarily given by humans, which can lead to errors and fail to tap into the vast potential of artificial intelligence. Hume's empathetic large language model (eLLM) can adjust its choice of words and tone based on context and user emotional expression, providing a more natural and genuine experience in various scenarios, including mental health, education training, emergency calls, and brand analysis.

In March of this year, Hume AI completed a $50 million Series B funding round led by EQT Ventures, with investors including Union Square Ventures, Nat Friedman & Daniel Gross, Metaplanet, and Northwell Holdings.

Another player in this field is Entropik, which specializes in measuring consumer cognition and emotional responses. Through Decode, a product that integrates the capabilities of emotion AI, behavioral AI, generative AI, and predictive AI, it can better understand consumer behavior and preferences and provide more personalized marketing recommendations. Entropik completed a $25 million Series B funding round in February 2023, with investors including SIG Venture Capital and Bessemer Venture Partners.


02 Giants Involved, A Battleground

Tech giants are also making their moves in the field of Emotion AI, leveraging their advantages.

These include Microsoft Azure's Cognitive Services Emotion API, which can identify emotions such as joy, anger, sadness, and surprise in images and videos by analyzing facial expressions;

IBM Watson's Natural Language Understanding API can process large amounts of text data to identify underlying emotional tendencies (such as positive, negative, or neutral) for more accurate interpretation of user intent;

Google Cloud AI's Cloud Vision API has powerful image analysis capabilities, quickly recognizing emotional expressions in images, and supports text recognition and emotional association;

AWS Rekognition can also detect emotions, recognize facial features, and track changes in expressions, and can be combined with other AWS services to create complete social media analysis or emotion AI-driven marketing applications.
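As a concrete illustration of the cloud APIs above, the sketch below shows the shape of an AWS Rekognition `detect_faces` response and how an application might pick out the dominant emotion. The boto3 call is commented out so the snippet runs without AWS credentials, and the sample response is a hand-written assumption that mirrors the documented response structure (`FaceDetails` → `Emotions` → `Type`/`Confidence`).

```python
# Sketch: extracting emotion predictions from an AWS Rekognition
# detect_faces response. The live API call is shown but commented out;
# sample_response imitates the real response shape for illustration.

# import boto3
# client = boto3.client("rekognition")
# response = client.detect_faces(
#     Image={"S3Object": {"Bucket": "my-bucket", "Name": "face.jpg"}},
#     Attributes=["ALL"],  # "ALL" is required to include Emotions
# )

sample_response = {
    "FaceDetails": [{
        "Emotions": [
            {"Type": "HAPPY", "Confidence": 92.1},
            {"Type": "CALM", "Confidence": 5.3},
            {"Type": "SURPRISED", "Confidence": 1.2},
        ]
    }]
}

def top_emotion(response: dict) -> str:
    """Return the highest-confidence emotion label for the first face."""
    emotions = response["FaceDetails"][0]["Emotions"]
    return max(emotions, key=lambda e: e["Confidence"])["Type"]

print(top_emotion(sample_response))  # HAPPY
```

This per-face emotion ranking is the building block that the marketing and social-media analysis applications mentioned above would aggregate over many images.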


Some startups are moving so fast in Emotion AI that tech giants have resorted to "poaching" talent. The unicorn Inflection AI, for instance, caught Microsoft's attention with its AI team and models. After Microsoft, along with Bill Gates, Eric Schmidt, and NVIDIA, invested $1.3 billion in Inflection AI, it extended an olive branch to Mustafa Suleyman, one of the co-founders and a leading figure in AI. Suleyman and over 70 employees subsequently moved to Microsoft, in a deal that cost Microsoft nearly $650 million.

However, Inflection AI quickly regrouped, forming a new team with backgrounds in Google Translate, AI consulting, and AR, continuing to focus on its core product, Pi. Pi is a personal assistant capable of understanding and responding to user emotions. Unlike traditional AI, Pi emphasizes building emotional connections with users, perceiving emotions through analyzing voice, text, and other inputs, and demonstrating empathy in conversations. Inflection AI views Pi as a coach, confidant, listener, and creative partner, rather than just a simple AI assistant. Additionally, Pi has a powerful memory feature that can remember users' multiple conversation histories, enhancing the continuity and personalization of interactions.


03 The Road Ahead: Attention and Doubt Coexist

Although Emotion AI embodies our expectations for more humanized interaction methods, like all AI technologies, its promotion is accompanied by both attention and skepticism. First, can Emotion AI truly interpret human emotions accurately? Theoretically, this technology can enrich the experience of services, devices, and technologies, but from a practical perspective, human emotions are inherently vague and subjective. As early as 2019, researchers questioned this technology, stating that facial expressions do not reliably reflect true human emotions. Therefore, relying solely on machines to simulate human facial expressions, body language, and tone to understand emotions has certain limitations.

Secondly, strict regulatory oversight has always been a stumbling block in the development of AI. For example, the EU's AI Act prohibits the use of computer vision emotion detection systems in fields like education, which may limit the promotion of certain Emotion AI solutions. States like Illinois in the U.S. also have laws prohibiting the collection of biometric data without permission, directly restricting the use of certain technologies in Emotion AI. Additionally, data privacy and protection are significant issues; Emotion AI is often applied in fields like education, health, and insurance, which have stringent data privacy requirements. Therefore, ensuring the security and lawful use of emotional data is a challenge that every Emotion AI company must face.

Finally, communication and emotional interpretation between people from different cultural regions are challenging, posing a test for AI. For example, the understanding and expression of emotions vary across regions, which may affect the effectiveness and completeness of Emotion AI systems. Furthermore, Emotion AI may face considerable difficulties in addressing biases related to race, gender, and gender identity.

Emotion AI promises not only to reduce labor efficiently but also to be considerate in reading human hearts. However, can it truly become a universal solution in human interactions, or will it end up being just another smart assistant like Siri, performing mediocrely in tasks requiring genuine emotional understanding? Perhaps in the future, AI's "mind-reading" capabilities will revolutionize human-machine and even human interactions, but at least for now, truly understanding and responding to human emotions may still require human involvement and caution.


Disclaimer: The content of this article solely reflects the author's opinion and does not represent the platform in any capacity. This article is not intended to serve as a reference for making investment decisions.

