New AI Approach Harnesses Shoppers’ Visual Preferences to Boost Clothing Sales


02/07/2025 Frontiers Journals

Imagine that one day your favorite online store just gets your style, showing you pieces you cannot help but click on and buy. A research team from Nanjing University, Nanjing University of Science and Technology, and Alibaba Group has made this possible with a fresh spin on AI-powered recommendations. By teaching computers to pay attention to the images you click on, they have fine-tuned Taobao’s women’s clothing suggestions, boosting prediction accuracy by 0.46% offline and nudging sales up by 0.88% online.
“We wanted to teach AI to see the same visual details that catch your eye—the cut of a dress, a splash of color, even the drape of the fabric,” says Prof. Yang Yang, lead researcher on the project. “And use that insight to make recommendations as personal as a friend’s suggestion.”
What’s the Big Deal?
We all know the frustration of endless scrolling, only to settle on something “good enough.” Traditional recommendation systems heavily rely on product descriptions and categories, but they overlook a crucial element: visual style. Color palettes, silhouettes, and patterns—these visual cues carry a world of meaning. By tying the AI’s image learning directly to what you click, this new approach streamlines the shopping journey, surfaces items you will love, and keeps you engaged.
Why Retailers Should Care
Online retailers are always on the hunt for easy wins that do not break the bank—and this approach delivers just that. By decoupling the heavy lifting of image representation learning into an offline stage, companies can avoid investing in massive new hardware or ballooning compute budgets. Once the model is pre-trained, it integrates seamlessly into existing recommendation systems, resulting in minimal disruption to live operations. And it’s not limited to one product line: experiments on Amazon’s Sports, Baby, and Clothing review datasets all showed clear boosts in recommendation quality, proving that this method can drive value across a wide range of categories.
Research Results
In offline trials using Taobao’s extensive women’s clothing dataset, the new method lifted the area under the ROC curve by 0.46% compared to a strong baseline—an impressive gain at this scale. When switched on in a live A/B test, Taobao saw a 0.88% uptick in gross merchandise volume, turning subtle improvements in visual understanding into real-world revenue. When the team applied this technique to other multimodal recommenders using Amazon review data, both recall and NDCG metrics increased noticeably, underscoring the broad applicability and business impact of this user-focused image-learning strategy.
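To make the offline metrics concrete, here is a minimal sketch of how a ranking-quality score like NDCG is typically computed for one user's recommendation list. This is a standard textbook formulation, not the paper's own evaluation code; the study's exact cutoff and relevance grading may differ.

```python
import numpy as np

def ndcg_at_k(relevances, k):
    """NDCG@k for a single ranked list.

    `relevances` holds graded relevance in the order the items were
    recommended (e.g. 1 = the user clicked the item, 0 = they did not).
    """
    rel = np.asarray(relevances, dtype=float)[:k]
    # Positions are discounted logarithmically: rank 1 -> 1/log2(2), etc.
    discounts = 1.0 / np.log2(np.arange(2, rel.size + 2))
    dcg = (rel * discounts).sum()
    # The ideal ordering puts the most relevant items first.
    ideal = np.sort(np.asarray(relevances, dtype=float))[::-1][:k]
    idcg = (ideal * discounts[:ideal.size]).sum()
    return dcg / idcg if idcg > 0 else 0.0
```

A perfectly ordered list (clicked item first) scores 1.0; pushing the clicked item down the list lowers the score, which is why even fractional NDCG gains at Taobao's scale are meaningful.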
The ‘Secret Sauce’
At the heart of this innovation lies a clever two-step process that mimics how we humans shop. First, every product photo is run through a visual “backbone” that distills it into a unique digital fingerprint—a high-dimensional embedding that captures color, shape, and style. Next, the model examines the user’s past clicks—each one a snapshot of their personal taste—and uses an attention mechanism to integrate those snapshots when evaluating a new item. By training the system contrastively—pulling representations of clicked items closer and pushing non-clicked ones away—it learns to spotlight the visual cues that truly matter, all without entangling the main recommendation engine.
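The two-step idea above can be sketched in a few lines: attention-weight the user's past-click embeddings by their similarity to a candidate item, then score candidates with a contrastive (InfoNCE-style) loss. This is an illustrative toy with NumPy vectors standing in for the backbone's image embeddings, not the authors' implementation; the function names and the single-negative setup are assumptions for clarity.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(history, candidate):
    """Pool past-click embeddings, weighted by similarity to the candidate.

    history:   (n_clicks, d) array of image embeddings the user clicked
    candidate: (d,) embedding of the item being evaluated
    Returns a (d,) "taste vector" tailored to this candidate.
    """
    weights = softmax(history @ candidate)
    return weights @ history

def contrastive_loss(history, clicked, non_clicked):
    """Lower when the pooled taste matches the clicked item better
    than the non-clicked one (single-negative InfoNCE)."""
    pos = attention_pool(history, clicked) @ clicked
    neg = attention_pool(history, non_clicked) @ non_clicked
    # Pull the clicked item closer, push the non-clicked one away.
    return -np.log(np.exp(pos) / (np.exp(pos) + np.exp(neg)))
```

Because this representation learning happens offline, the serving-time recommender only consumes the resulting embeddings, which is what keeps the integration cost low.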
Why You’ll Love It
For you, the shopper, it means less “meh” browsing and more “Wow, where did you find this?!” For the store, it’s a powerful way to turn the visuals you adore into genuine business growth. As e-commerce gets ever more image-driven, weaving your clicks into the AI’s training loop could become the new standard—making your next online spree faster, friendlier, and unfailingly on-point.
DOI: 10.1007/s11704-024-3939-x
Regions: Asia, China
Keywords: Applied science, Computing

