Sitting comfortably at home, Karen was looking at an advert on her laptop. The advert was like many others: there was a voice-over, various people playing out a scene, a pack shot, and then the advert ended and the little red light at the top of her screen went off. Karen then clicked a button and answered some short questions about the advert she had seen. Three minutes later she was making a cup of tea and back on Pinterest, looking at ideas for her 3-year-old's birthday party. A casual observer might have concluded that Karen was nonchalant about what she had seen. They would be wrong.

Karen had just completed an advertising pre-test, along with two hundred or more other people that night, for a new advert; some used their tablet, others their smartphone. In the 30 seconds it took to watch the advert, Karen's face was continuously analysed for small changes in her gaze, head direction, and in her zygomatic major and the inferior part of her orbicularis oculi muscles… she smiled a real smile, ever so slightly. Karen knew her face was being analysed while she watched the advert; the little red light showed her webcam was active. Karen was also aware that her privacy was protected: facial coding is not facial recognition, and no personally identifying information is linked to her results.

Along with others', Karen's emotional expressions while watching advertising are helping to create advertising that is distinctive, relevant and memorable, and helping organisations to achieve their goals.

 

In Just 30 Seconds

In the 30 seconds a consumer takes to watch a TVC, a facial coding algorithm makes hundreds of measurements of the muscular movements that form facial expressions and relate to emotional states. Mapping these against the advertisement tells us when a person responded and what they felt.
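To make the idea concrete, the sketch below shows how per-frame emotion scores might be stamped onto the advert's timeline so you can see when a response occurred. This is a minimal illustration only: the sampling rate, score values and detector output format are assumptions, not any vendor's actual API.

```python
# Hypothetical sketch: mapping per-frame emotion scores onto an ad timeline.
# Assumes a detector that returns one score per emotion for each sampled
# frame; the 10 Hz rate and the scores below are illustrative only.

FRAME_RATE = 10  # analysis samples per second (assumed)

def to_timeline(frame_scores):
    """Attach a timestamp (seconds into the advert) to each frame's scores."""
    return [
        {"t": i / FRAME_RATE, **scores}
        for i, scores in enumerate(frame_scores)
    ]

def peak_moment(timeline, emotion):
    """Return the time and score at which a given emotion peaked."""
    best = max(timeline, key=lambda row: row[emotion])
    return best["t"], best[emotion]

# Illustrative scores for a few frames of a 30-second advert
frames = [
    {"happiness": 0.05, "surprise": 0.10},
    {"happiness": 0.20, "surprise": 0.60},
    {"happiness": 0.75, "surprise": 0.30},
]
timeline = to_timeline(frames)
t, score = peak_moment(timeline, "happiness")
print(f"Happiness peaked at {t:.1f}s (score {score:.2f})")
```

The same timeline structure is what lets results be lined up against the advert second by second, rather than summarised into a single retrospective rating.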

The approach was initially based on cross-cultural research by Paul Ekman, who codified emotional expressions into the Facial Action Coding System (FACS), built on observable groups of facial movements. Using machine learning, facial movements are calibrated against the individual so that other factors that can affect our expressions, such as age, gender and culture, do not affect the results. The software, which is continuously improved, is trained on faces from different races, cultures, ages and genders.
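A toy example of how FACS codes translate into an emotional read: the "real smile" mentioned earlier is a Duchenne smile, which engages both AU12 (lip corner puller, the zygomatic major) and AU6 (cheek raiser, the orbicularis oculi). AU6 and AU12 are genuine FACS codes; the detector output format and the 0.2 intensity threshold below are assumptions for illustration.

```python
# Illustrative sketch only: flagging a Duchenne ("real") smile from FACS
# action-unit intensities. AU6 (cheek raiser, orbicularis oculi) and AU12
# (lip corner puller, zygomatic major) are real FACS codes; the dict input
# shape and the threshold are assumptions for this example.

def is_duchenne_smile(aus, threshold=0.2):
    """A smile engaging both the mouth (AU12) and the eyes (AU6)."""
    return aus.get("AU6", 0.0) > threshold and aus.get("AU12", 0.0) > threshold

print(is_duchenne_smile({"AU6": 0.4, "AU12": 0.5}))  # mouth and eyes: genuine
print(is_duchenne_smile({"AU12": 0.5}))              # mouth only: social smile
```

Production systems combine many such action units probabilistically rather than with a simple threshold, but the principle is the same: expressions are read as combinations of observable muscle movements.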


 

Facial coding provides a significant improvement over manual reaction measurement (dials) and survey responses, which are retrospective and, in the case of survey responses, measure only our overall reaction, providing no insight into what created the reactions or how they changed during the advert. Because multiple emotions are measured across all participants, we are able to understand which emotions are dominant, how they interact, and how they vary across segments.
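The aggregation step described above can be sketched in a few lines. This is a hypothetical example: the response data shape, segment labels and scores are all invented for illustration.

```python
# Hypothetical sketch: finding the dominant emotion overall and within a
# segment by averaging scores across participants. Data shape, segment
# names and scores are illustrative assumptions.

from collections import defaultdict

def dominant_emotion(responses, segment=None):
    """Average each emotion across (optionally filtered) participants and
    return the emotion with the highest mean score."""
    totals, count = defaultdict(float), 0
    for r in responses:
        if segment and r["segment"] != segment:
            continue
        count += 1
        for emotion, score in r["scores"].items():
            totals[emotion] += score
    means = {e: s / count for e, s in totals.items()}
    return max(means, key=means.get)

responses = [
    {"segment": "18-34", "scores": {"happiness": 0.7, "surprise": 0.2}},
    {"segment": "18-34", "scores": {"happiness": 0.6, "surprise": 0.5}},
    {"segment": "35-54", "scores": {"happiness": 0.1, "surprise": 0.6}},
]
print(dominant_emotion(responses))           # across all participants
print(dominant_emotion(responses, "35-54"))  # within one segment
```

Running the same calculation per segment is what surfaces the differences between audience groups that a single overall survey rating would hide.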

The example below is from a test study that used a Herron pain reliever advert. What grabbed my attention with this advert was its use of humour, which is not a stock-in-trade technique for over-the-counter medicine. Our analysis showed that the agency nailed it: the humour worked and built throughout the execution, and it coincided with branding, giving this small brand maximum opportunity for success. The humour was true to the category, making the message highly relevant. Note how peaks in humour are generally preceded by fear (anticipation) and moments of surprise.

 

 

There is more to Emotion than Our Face

There are very few measures that can claim to capture some property universally, and none of them are in the social or behavioural sciences. Facial coding gives us one of the best approaches to an accurate, reliable measure that we can use meaningfully and in real-world settings.

In pre-testing we hold many of the context variables constant or allow them to vary randomly, such as social context. How we express our emotions can vary with who we are with at the time: measuring emotion in a relatively private setting, where advertising is often viewed, may give different results from measuring it among friends or strangers in a public place. Knowing the context is important for all consumer and human decision research, and in all research we need to account for it in our design and interpretation.

 

Implementing Emotion Analysis into Research

Facial coding is not just for advertising. The approach can be used wherever we need to measure emotional reactions, and it is used effectively in shopper research, service encounters, movie trailers, speeches and PR releases, and packaging research. With only a camera needed to capture a person's facial movements, we can use the approach out in the field (research speak for "anywhere"). Combined with eye-tracking systems, even greater opportunities exist for understanding how consumers emotionally engage with your communication, product and service offering.

 

 

For those who like to dig deeper. . .

  • Ekman, Paul (1999), "Basic Emotions", in Dalgleish, T. and Power, M. (eds), Handbook of Cognition and Emotion, Sussex, UK: John Wiley & Sons.
  • Russell, James (1980), "A circumplex model of affect", Journal of Personality and Social Psychology, 39: 1161–1178.
  • Intelligent Behaviour Understanding Group (iBUG), Department of Computing, Imperial College London: http://ibug.doc.ic.ac.uk/