In this month’s research spotlight, COSMOS highlights two studies recently accepted to the 9th International Conference on Multimedia and Image Processing (ICMIP) in Tokyo, Japan. At ICMIP 2024, cosmographers Mert Can Cakmak, Mainuddin Shaik, and Dr. Nitin Agarwal will present their paper “Emotion Assessment of YouTube Videos using Color Theory,” and Niloofar Yousefi, Mert Can Cakmak, and Dr. Nitin Agarwal will present “Examining Multimodal Emotion Assessment and Resonance with Audience on YouTube.” Both studies use color theory to analyze the emotions conveyed in video datasets.

These research studies explore the complex interplay between colors in digital media, specifically YouTube videos, and their emotional impact on viewers. They address the challenge of quantitatively linking color hues with viewer emotions, a complex area due to the subjective nature of emotional experience and cultural differences in color interpretation.

The first paper develops a novel color theory–based method for emotion analysis of videos. This research shows how videos establish emotional themes through color: each video is condensed into a barcode summarizing the colors present throughout the video. “Our research developed the Color-Emotion Baseline Dictionary by combining color theory with emotional psychology studies to link specific colors with emotions,” says Mert. “We used emotion wheels to categorize these relationships clearly, ensuring a consistent approach by normalizing emotional terms.” To further structure the dictionary, “We then quantified how often certain emotions are associated with specific colors,” says Mert. The study draws on Trailers12k, a dataset of movie trailers.
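To illustrate the general idea, here is a minimal sketch of how a color barcode and a color-emotion lookup might work. The frame-averaging step, the tiny `BASELINE` dictionary, and the nearest-color lookup are all hypothetical simplifications for illustration; the paper’s actual Color-Emotion Baseline Dictionary and pipeline are far more detailed.

```python
import numpy as np

def color_barcode(frames):
    """Reduce each frame (HxWx3 uint8 array) to its average RGB color and
    stack the results into an Nx3 strip summarizing the whole video."""
    return np.array([frame.reshape(-1, 3).mean(axis=0) for frame in frames])

# Hypothetical toy stand-in for the Color-Emotion Baseline Dictionary:
# a few anchor colors mapped to illustrative emotion labels.
BASELINE = {
    (255, 0, 0): "anger/excitement",   # red
    (0, 0, 255): "calm/sadness",       # blue
    (255, 255, 0): "joy",              # yellow
    (0, 0, 0): "fear",                 # black
}

def nearest_emotion(rgb):
    """Look up the emotion of the dictionary color closest to rgb
    (squared Euclidean distance in RGB space)."""
    key = min(BASELINE, key=lambda c: sum((a - b) ** 2 for a, b in zip(c, rgb)))
    return BASELINE[key]

# Demo on synthetic solid-color "frames" standing in for real video frames.
frames = [np.full((4, 4, 3), c, dtype=np.uint8)
          for c in [(250, 10, 10), (10, 10, 240)]]
barcode = color_barcode(frames)
emotions = [nearest_emotion(tuple(row)) for row in barcode]
print(emotions)  # → ['anger/excitement', 'calm/sadness']
```

In practice the frames would come from a video decoder (e.g. sampling one frame per second), and the distance would likely be computed in a perceptual color space rather than raw RGB.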

Their findings empirically validated the hypothesis that specific colors in video content significantly influence viewer emotions. Mert commented on this by saying, “The research highlighted the effectiveness of its novel color-emotion mapping framework, demonstrating a clear correlation between color patterns in movie trailers and the emotional responses they elicit.”

The second study builds upon this research by applying not only color barcode-based emotion analysis but also audio-based and text-based emotion analysis to various contexts, such as movie trailers and political news channels. By comparing text-based, audio-based, and color barcode-based emotion analysis of political datasets with a movie trailer dataset, Yousefi, Cakmak, and Dr. Agarwal show how the channels used to convey emotion differ across contexts.
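The cross-modality comparison can be sketched as follows. Assume each modality (text, audio, color) yields an emotion-frequency distribution for a video; normalizing those distributions lets us ask which channel carries most of a given emotion. The function names, the `news`/`trailer` examples, and all numbers below are hypothetical placeholders, not the papers’ data or results.

```python
from collections import Counter

def normalize(counts):
    """Convert raw emotion counts into a probability distribution."""
    total = sum(counts.values())
    return {emotion: n / total for emotion, n in counts.items()}

def dominant_modality(video, emotion):
    """Return the modality whose distribution gives `emotion` the largest share."""
    shares = {m: normalize(c).get(emotion, 0.0) for m, c in video.items()}
    return max(shares, key=shares.get)

# Illustrative numbers only: a news clip where emotion comes through the
# text, and a trailer where it comes through color.
news = {"text": Counter(fear=8, anger=5), "color": Counter(neutral=10)}
trailer = {"text": Counter(joy=2), "color": Counter(fear=9, excitement=6)}

print(dominant_modality(news, "fear"))     # → text
print(dominant_modality(trailer, "fear"))  # → color
```

This mirrors the qualitative finding quoted below: informational content leans on text, while trailers lean on color and audio.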

“Our analysis shows that content like news or incident explanations evokes more emotion through text,” Niloofar explains. “We believe this is likely due to the need to convey important information in a short video, so the creator focuses on clarity rather than visual elements, whereas, on the other hand, movie trailers use color and audio to attract users. For instance, horror movie trailers often use red and black colors to evoke fear.”

Dr. Agarwal said, “These studies’ insights are invaluable for decision-makers, filmmakers, digital marketers, and content creators, enabling them to engage audiences through emotionally charged visual narratives more effectively and better evaluate the impact of content on the audience. Beyond its practical applications, the research enriches our understanding of visual communication, opening new avenues for exploring how visual stimuli can shape human emotion, cognition, and social interaction on a global scale.”