From Culture to Clothing: Discovering the World Events Behind A Century of Fashion Images

Wei-Lin Hsiao

Kristen Grauman

UT Austin

Fashion is intertwined with external cultural factors, but identifying these links remains a manual process limited to only the most salient phenomena. We propose a data-driven approach to identify specific cultural factors affecting the clothes people wear. Using large-scale datasets of news articles and vintage photos spanning a century, we present a multi-modal statistical model to detect influence relationships between happenings in the world and people's choice of clothing. Furthermore, on two image datasets we apply our model to improve the concrete vision tasks of visual style forecasting and photo timestamping. Our work is a first step towards a computational, scalable, and easily refreshable approach to link culture to clothing.

Approach Overview

We first mine cultural factors from news articles spanning 1900 to the 1990s using topic models, then mine clothing styles from vintage photos by clustering clothing-sensitive features. Cultural influences on clothing styles are detected by measuring Granger-causality relations between the respective popularity time series.
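The core of the influence-detection step is a Granger-causality test: a cultural factor's popularity series "Granger-causes" a style's popularity series if its past values improve prediction of the style's future beyond the style's own history. The sketch below, a minimal pure-Python illustration (not the paper's implementation), fits a restricted autoregression on the style's own lags and an unrestricted one that adds the factor's lags, then compares residual sums of squares with an F-statistic. The series, lag count, and coefficient values are illustrative assumptions.

```python
import random

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def ols_rss(X, y):
    """Residual sum of squares of the least-squares fit y ~ X."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    beta = solve(XtX, Xty)
    return sum((yi - sum(b * xi for b, xi in zip(beta, r))) ** 2
               for r, yi in zip(X, y))

def granger_f(style, factor, lags=2):
    """F-statistic for H0: `factor` does not Granger-cause `style`."""
    n = len(style)
    rows = range(lags, n)
    y = [style[t] for t in rows]
    # Restricted model: style's own lags only; unrestricted adds factor lags.
    X_r = [[1.0] + [style[t - l] for l in range(1, lags + 1)] for t in rows]
    X_u = [row + [factor[t - l] for l in range(1, lags + 1)]
           for row, t in zip(X_r, rows)]
    rss_r, rss_u = ols_rss(X_r, y), ols_rss(X_u, y)
    df_num, df_den = lags, len(y) - len(X_u[0])
    return ((rss_r - rss_u) / df_num) / (rss_u / df_den)

# Synthetic illustration: `factor` leads `style` by one time step,
# so the factor's lags should sharply reduce prediction error.
random.seed(0)
factor = [random.gauss(0.0, 1.0) for _ in range(60)]
style = [0.0, 0.0] + [0.8 * factor[t - 1] + 0.1 * random.gauss(0.0, 1.0)
                      for t in range(2, 60)]
F = granger_f(style, factor, lags=2)  # large F => factor helps predict style
```

A large F-statistic (compared against the F-distribution's critical value at the chosen significance level) rejects the null that the cultural factor carries no predictive information about the style's popularity.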

Example Detected Influences

Curves in each subfigure show the popularity trends of visual styles and cultural factors. Call-out boxes for the curves show centroid images and detected attributes/categories for a style (blue boxes), and the top words in a mined textual topic (yellow boxes).

Discovered influences within the time range 1900-1990s:

Discovered influences within the time range 2013-2016:



Wei-Lin Hsiao and Kristen Grauman.
"From Culture to Clothing: Discovering the World Events Behind A Century of Fashion Images".
In IEEE International Conference on Computer Vision (ICCV), 2021.



[txt] README.txt

[tgz] data.tgz: Pre-processed data for Flickr Vintage and NYT articles (see Sec. 3.1 in the paper)


We thank Greg Durrett and Chao-Yuan Wu for helpful discussions. We also thank Ziad Al-Halah for kindly sharing the data and code from their work.