You'll Laugh! You'll Cry! Optimizing Video for Emotional Impact

Xuan Liu (Netflix), Savannah Wei Shi (Santa Clara University), Thales Teixeira (Harvard University), and Michel Wedel (University of Maryland)
Key Takeaways

What? Online video content aggregators struggle to connect with consumers who have short attention spans.

So What? Consumers expect more video content across a variety of channels.

Now What? Viewer happiness builds over successive scenes in a conventionally cut trailer, and placing the key scene later in the trailer improves happiness and watching intention.

Online video viewing is exploding. However, online content aggregators struggle to connect with consumers, given their short attention spans. In addition, some channels, such as email and social media, do not auto-play video with sound, meaning that aggregators have fewer tools to market their products. 

Trailers of coming attractions, whether for online movies, TV shows, video games, or books, are becoming shorter than ever. Online content aggregators must edit down trailer content, which may be as long as 2.5 minutes, into snackable digital content of 10, 20, or 30 seconds. At the same time, marketers at these companies are working to elicit a desired emotional response in viewers that will translate into intent to watch full content, driving advertising and product sales. For example, the 30-second version of the Black Panther War TV trailer inspires viewer fear by focusing on the conflict between T’Challa and Erik Killmonger, whereas the longer official 2:19-minute trailer creates surprise and happiness by focusing on the futuristic world of Wakanda and its many heroes. 

We offer a first-of-its-kind, analytics-based methodology to help marketers achieve this goal. Our team analyzed viewers as they watched comedy movie trailers, using facial-tracking software to measure their emotional responses and their intent to watch the movies shown. These data were used to calibrate a model that explains viewing preferences based on the trailers' audio-visual scene sequences. Next, our team collected data on movie ratings and box office sales, validating the role of scene structure in trailers, which is within the control of content aggregators and marketers. Finally, having established how scene structure relates to emotional response, intent to watch, and box office sales, our team optimized the editing of trailers to produce short film clips for digital channels that support sound, such as IMDb, and for those that do not, such as Facebook.

We collected calibration data with the facial-imaging company nViso, which recorded 122 viewers' facial expressions via webcam. Each viewer watched 12 comedy trailers, randomly selected from a set of 100, on their personal computer. nViso's automated algorithm scored viewers' emotional responses, producing a probability for each of six basic emotions (happiness, surprise, fear, disgust, sadness, and anger) for every second of viewing. We then validated the algorithm in two further experiments: a large-scale field experiment and an online campaign with Netflix.
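nViso's classifier is proprietary, but the shape of its output can be sketched: one probability vector over the six basic emotions for each second of viewing. The softmax mapping and the per-emotion scores below are hypothetical, purely to illustrate the data structure.

```python
import math

# The six basic emotions scored each second, per the study.
EMOTIONS = ["happiness", "surprise", "fear", "disgust", "sadness", "anger"]

def emotion_probabilities(scores):
    """Map raw per-emotion scores for one second of video to a
    probability distribution (softmax), keyed by emotion name."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return dict(zip(EMOTIONS, (e / total for e in exps)))

# Hypothetical scores for one second in which happiness dominates.
probs = emotion_probabilities([2.0, 0.5, 0.1, 0.0, 0.0, 0.2])
```

A second-by-second series of such vectors is the raw material for the emotion measures described below.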

Next, we analyzed the trailers' video and audio content. We used image and audio processing to detect scene cuts and derive structural metrics: the total number of scenes, the average scene length, and the location of the longest scene in the trailer, which is typically designed to draw the strongest emotional response. We also analyzed the audio track to obtain total volume and second-by-second volume data.
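As a rough illustration (not the authors' actual pipeline), scene cuts can be detected by thresholding frame-to-frame pixel differences, and the structural metrics above derived from the resulting boundaries. The frame representation, cut threshold, and frame rate here are all assumptions.

```python
import numpy as np

def detect_cuts(frames, threshold=30.0):
    """Return indices of frames that start a new scene. A large jump in
    mean absolute pixel difference between consecutive frames is treated
    as a hard cut."""
    cuts = [0]
    for i in range(1, len(frames)):
        if np.abs(frames[i] - frames[i - 1]).mean() > threshold:
            cuts.append(i)
    return cuts

def scene_metrics(cuts, n_frames, fps=24):
    """Total number of scenes, average scene length in seconds, and the
    (0-based) index of the longest scene."""
    bounds = cuts + [n_frames]
    lengths = [(bounds[i + 1] - bounds[i]) / fps for i in range(len(cuts))]
    return len(cuts), sum(lengths) / len(lengths), int(np.argmax(lengths))
```

On a synthetic two-scene clip (48 dark frames followed by 24 bright ones at 24 fps), this reports 2 scenes averaging 1.5 seconds, with the longest scene first.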

Using these data, we analyzed the intensity of happiness on a second-by-second basis, calculating aggregate measures: happiness at the start of the video, the emotional trend, the scene location of the happiness peak, and the ending intensity. We also incorporated variables readily available to content aggregators, including online ratings from IMDb and Rotten Tomatoes, MPAA ratings, and release times.
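These aggregate measures can be sketched from a per-second happiness series as follows; the exact definitions in the published model may differ, and the linear-slope trend here is an assumption.

```python
import numpy as np

def happiness_summary(h):
    """Aggregate measures from per-second happiness intensities in [0, 1]:
    starting intensity, linear trend, location of the peak, and ending
    intensity."""
    h = np.asarray(h, dtype=float)
    t = np.arange(len(h))
    return {
        "start": float(h[0]),                     # intensity at video start
        "trend": float(np.polyfit(t, h, 1)[0]),   # rise/fall per second
        "peak_second": int(h.argmax()),           # location of the peak
        "end": float(h[-1]),                      # ending intensity
    }
```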

We developed a statistical methodology to model moment-by-moment emotional response jointly with variables of key interest, such as watching intention and box office revenues. 

Key findings include: 

  • Viewer happiness increases with increasing scene sequences for a conventionally cut trailer.

  • Fast-paced trailers with a large number of scenes, on the other hand, decrease happiness – and watching intention. 

  • Placing the key scene later in the trailer improves happiness levels and watching intention. 

  • Moment-to-moment volume has a significant positive instantaneous effect on happiness, but high peak volume and steadily growing volume decrease happiness. 

  • Trailers with rising happiness and high peak and end happiness levels produce higher watching intention.

Using these findings and parameter estimates, we produced an optimal movie clip of about 30 seconds for each of the 50 pairs of trailers, both with and without sound, and measured them against benchmarks (the first 30 seconds of the longer trailer). The optimal clips with sound averaged 3.6 scenes versus the benchmark's 4.8 and achieved a considerably higher predicted watching intent: 3.83 on a seven-point scale versus 2.91 for the benchmark and 3.32 for the full trailer. Since two versions of each clip were produced, we selected the one with the higher watching intention (averaging 4.21 across all movies), translating to a predicted 4.8% increase in box-office revenue. For the best silent clips, the average predicted watching intent was 4.07, translating to a 2.45% box-office boost. The optimized clips were then tested in a field experiment and online with Netflix users, with detailed results provided in the study. 
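The optimization itself is detailed in the paper; as a toy illustration of the idea, one can search ordered subsets of scenes that fit a 30-second budget and maximize a predicted intent score. The scoring function below is entirely hypothetical — it simply rewards total happiness plus a late peak, echoing the findings above — and is not the published model.

```python
from itertools import combinations

def intent_score(clip):
    """Hypothetical watching-intent score for a clip given as a sequence
    of (duration_seconds, happiness) scenes: total happiness plus a bonus
    for placing the happiness peak late in the clip."""
    peak = max(range(len(clip)), key=lambda i: clip[i][1])
    return sum(h for _, h in clip) + peak / len(clip)

def best_clip(scenes, budget=30.0):
    """Exhaustively pick the ordered subset of scenes (original order is
    preserved by itertools.combinations) that fits the time budget and
    scores highest."""
    best, best_score = [], float("-inf")
    for r in range(1, len(scenes) + 1):
        for combo in combinations(scenes, r):
            if sum(d for d, _ in combo) <= budget:
                s = intent_score(combo)
                if s > best_score:
                    best, best_score = list(combo), s
    return best
```

For scenes [(10, 0.2), (15, 0.9), (10, 0.4), (12, 0.3)] and a 30-second budget, the search keeps the opening scene and the high-happiness scene, placing the peak second rather than first.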

The new methodology provides more accurate results than the currently used heuristic approach for producing clips. It can be applied to all genres and automated, making it scalable and usable by aggregators of all sizes.

Read the full article.

Xuan Liu, Savannah Wei Shi, Thales Teixeira, and Michel Wedel (2018), "Video Content Marketing: The Making of Clips," Journal of Marketing, 82 (4), 86-101.



 
ABOUT THE AUTHOR:
Xuan Liu is Senior Data Scientist, Netflix; Savannah Wei Shi is Assistant Professor of Marketing, Santa Clara University; Thales Teixeira is Lumry Family Associate Professor, Harvard University; and Michel Wedel is PepsiCo Professor of Consumer Science, University of Maryland.
