
Emotion AI: Machines augmenting humans in a thoughtful way

Emotion AI involves learning to recognize human emotions and using that knowledge to improve everything, from marketing campaigns to healthcare. Here, Ranjan Kumar, founder and CEO of Entropik Tech, explains how machines are replicating the way humans think.

Excerpts:

Dataquest: Elaborate on Emotion AI. For what purpose can this be used?

Ranjan Kumar: AI is a complex web of logical decision making: an intelligence based on ‘if-else’ rules crafted to decipher an outcome from a vast space of logical permutations of choices.

Emotion AI, however, also understands the emotional context of that logic. It is an AI system that is not just artificially intelligent but also emotionally perceptive.

Avenues for applications of Emotion AI are plentiful, including recruitment, medical diagnosis and assistance, loan evaluation, customer service, automobile passenger safety, optimization of ROI for marketers and many more. Entropik Tech is dedicated to building Emotion AI.

Dataquest: What technologies are you using to illustrate the emotion and expressiveness? Do you also map facial expressions?

Ranjan: For the Emotion AI platform, Affect Lab uses proprietary technologies like brainwave mapping, facial coding and eye tracking to decipher the cognitive and emotional responses of consumers as they watch an ad, experience a product or make a purchase in a retail store.

  • For brainwave mapping, we use special hardware that is worn like a headphone. It monitors the electrical activity of the brain while the user is experiencing a product or watching content. The raw data collected by the EEG (electroencephalogram) headset is then interpreted by algorithms to calculate behavioral and cognitive metrics, such as the user’s attention level, the mental effort applied to completing a task, etc.
  • The facial coding system developed by the team at Entropik tracks users’ facial expressions by identifying facial landmarks. Data generated from the way you grin, roll your eyes or smirk is fed into our deep learning algorithm to produce emotion metrics.
  • Visual activity is monitored using a standard webcam and screen-based eye tracking software. Image processing algorithms calculate the user’s ‘point of gaze’ to create heat maps and gaze plots in real time.
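The heat-map step described above can be sketched simply: bin the stream of point-of-gaze samples into a coarse screen grid and count hits per cell. This is a generic illustration under assumed screen and grid dimensions, not Affect Lab's actual algorithm.

```python
# Toy sketch of aggregating gaze samples into a heat map.
# Screen size and grid resolution are illustrative assumptions,
# not Affect Lab's actual parameters.

SCREEN_W, SCREEN_H = 1920, 1080
GRID_COLS, GRID_ROWS = 8, 4  # coarse bins for the heat map

def gaze_heatmap(samples):
    """Count gaze samples (x, y in pixels) falling into each grid cell."""
    heat = [[0] * GRID_COLS for _ in range(GRID_ROWS)]
    for x, y in samples:
        col = min(int(x / SCREEN_W * GRID_COLS), GRID_COLS - 1)
        row = min(int(y / SCREEN_H * GRID_ROWS), GRID_ROWS - 1)
        heat[row][col] += 1
    return heat

# Example: most gaze samples cluster near the top-left of the screen.
samples = [(100, 100), (120, 90), (110, 105), (1800, 1000)]
heat = gaze_heatmap(samples)
```

A real pipeline would first map webcam eye images to screen coordinates via a calibration step; the sketch starts from already-calibrated gaze points.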

Dataquest: What is Affect Lab? Elaborate.

Ranjan: The online SaaS platform ‘Affect Lab’ is a one-stop Emotion AI platform that combines EEG, facial coding and eye tracking with integrated workflows to support end-to-end consumer research. It is designed to help consumer brands decode consumers’ subconscious emotional responses to watching media content and ad commercials, experiencing a product, using UX/UI (user experience/user interface) platforms, driving an automobile, etc., in order to gain insights into what drives their purchase decisions.

Combining our emotion recognition technology and AI models, we capture cognitive and behavioral parameters such as attention, appreciation and attentiveness, along with emotional parameters such as happiness, boredom and familiarity.

Dataquest: How can the eye tracking and facial coding features help your clients?

Ranjan: As many as 46% of ad launches are unsuccessful, and that’s a huge loss for any company. 82% of the time, the single biggest reason is that brands fail to understand the connection with, and preferences of, their target audience.

Affect Lab helps brands measure consumers’ preferences at a subconscious level, allowing them to optimize every aspect of the product/ad experience using emotion recognition techniques like facial coding and eye tracking.

Media research:

For example, before an ad launch, brands can have the ad, or several ad variants, tested by a group of users that represents their target audience. Using our facial coding and eye tracking software, we capture users’ facial expressions and eye movements while they watch the content.

A second-by-second analysis is provided to the brand, including heat maps, gaze trails and data about which ad segments evoke high attention, engagement, boredom, etc., as well as the overall emotion score and predicted conversion based on industry norms.

Marketers can use this data to optimize their media plans by tailoring them to the audience segments that were most receptive to the content. Eye tracking data can help determine whether the brand was as prominent as expected and whether the product showcase was effective. We have seen brands achieve up to 4x RoI on their marketing spend using our testing platform.

UX research:

The same approach applies to UX testing across mobile apps, websites and chatbots. Emotion and eye tracking help UX designers understand the user’s navigation flow, along with enjoyment, frustration and mental-effort levels on a page-by-page basis.

Knowing how website/app visitors feel while interacting with these digital assets enables brands to optimize their UX across navigation, content, presentation and interaction, resulting in disproportionate RoI. For example, for a consumer app, a 10% improvement in checkout drop-off can lead to a 2x improvement in cost per acquisition (CPA).
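The CPA arithmetic behind that claim can be made concrete: with a fixed ad spend and a fixed number of checkout visitors, CPA is spend divided by completed conversions, so cutting drop-off lifts conversions and lowers CPA. The spend, traffic and the 90% baseline drop-off below are illustrative assumptions, not figures from the interview.

```python
# Worked example of the drop-off -> CPA relationship, with assumed numbers.
# CPA = spend / conversions, where conversions depend on checkout drop-off.

spend = 100_000.0   # assumed marketing spend
visitors = 10_000   # assumed visitors reaching checkout

def cpa(drop_off_pct):
    """Cost per acquisition given a checkout drop-off percentage."""
    conversions = visitors * (100 - drop_off_pct) // 100
    return spend / conversions

before = cpa(90)  # 90% abandon checkout -> 1,000 conversions
after = cpa(80)   # 10-point drop-off improvement -> 2,000 conversions

# CPA falls from 100.0 to 50.0: a 2x efficiency gain.
```

Note that the doubling only holds when the baseline drop-off is already high (here 90%); from a 50% baseline, the same 10-point improvement would cut CPA by just 1.2x.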

Dataquest: Is this based on sentiment analysis? If not, how is this different from sentiment analysis?

Ranjan: Sentiment analysis classically involves:

  • mining the massive amount of chatter generated online by consumers expressing their feelings and attitudes about brands, products or services they have used, across social media sites, review portals, websites, etc.;
  • then using NLP (natural language processing) to read and interpret the emotions expressed and the overall mood around the topic.

Our platform uses:

  • Emotion recognition technologies and AI modules to read the physiological and neural responses of chosen test users in order to measure emotional, behavioral and cognitive data points.
  • Emotion analytics data collected from user responses while the user is experiencing a product or watching content.

While both fall under affective computing, they are very different means of understanding the market and the consumer.
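To make the contrast concrete, classical sentiment analysis in its simplest form scores text against an emotion lexicon. The sketch below is a toy version of that text-mining approach; the word lists are illustrative only, and real NLP systems use trained models, negation handling and much larger vocabularies.

```python
# Minimal lexicon-based sentiment scorer, the classical text-mining
# approach described above. Word lists are illustrative assumptions.

POSITIVE = {"love", "great", "happy", "excellent", "good"}
NEGATIVE = {"hate", "bad", "boring", "terrible", "poor"}

def sentiment_score(text):
    """Return (#positive - #negative) words, normalized by word count."""
    words = text.lower().split()
    if not words:
        return 0.0
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return score / len(words)

review = "great product but terrible battery"
score = sentiment_score(review)  # one positive, one negative word -> 0.0
```

The physiological approach described in the interview works on an entirely different signal: it measures EEG, facial expressions and gaze while the experience happens, rather than mining text written about it afterwards.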

Dataquest: Who are your clients, and how are they using this technology for their business?

Ranjan: Our clientele includes GroupM, HSBC, CITI, CHUBB, Born Group, ITC, Myntra, IPSOS, IMRB, L Brands, Xiaomi, TATA, UB Group, Viacom18 and TAM Media Research, among others, based in India, the USA, Australia, Indonesia and Singapore.

Entropik is a part of several accelerator programs, including Accenture Ventures, Viacom18 VStEP, Intel AI Builders, Oracle Cohort, SAP, and Plug and Play.

We help brands optimize various consumer touchpoints:

  • Brands make their audio/video/digital/print advertising emotionally efficient.
  • Brands test their products before and after launch.
  • Brands improve conversions and user experience on their digital assets via our UI/UX testing module, used for website/app audits and competitive benchmarking.
  • Our C-SAT tracking suite helps brands track customer satisfaction in their retail stores using facial coding.
  • Our human chatbot index helps brands audit their chatbots, and voice- and text-based sentiment analysis is used by brands to improve the efficiency of their call centers.

By Aanchal Ghatak
