Online Modeling of Esthetic Communities Using Deep Perception Graph Analytics
2017; Institute of Electrical and Electronics Engineers; Volume: 20; Issue: 6; Language: English
10.1109/tmm.2017.2769799
ISSN: 1941-0077
Authors: Luming Zhang, Maofu Liu, Lei Chen, Lanxin Qiu, Chao Zhang, Yuxing Hu, Roger Zimmermann
Topic(s): Advanced Image and Video Retrieval Techniques
Abstract: Accurately detecting esthetic communities among large numbers of Internet users (e.g., Flickr (www.flickr.com) or Picasa (picasa.google.com) users) is a useful technique that can facilitate several applications, such as image retargeting, visual esthetic assessment, and fashion recommendation. Conventional approaches cannot handle this task appropriately due to the following challenges: first, it is difficult to update the detected esthetic communities online, since photos may be uploaded or removed frequently; second, human visual perception is essential for describing esthetic characteristics, but integrating it into an existing mining algorithm is challenging; and third, flat models cannot encode human visual perception precisely, especially for sophisticated sceneries. To solve these problems, we propose deep perception graph analytics, an incremental pipeline in which the esthetic relations among users are described by their gaze shifting paths (GSPs). Specifically, we first propose an aggregation-based deep network that formulates GSP representation within a unified framework. Afterward, a deep perception graph is constructed in which the esthetic discrepancy between users is measured by their GSP distributions. We then adopt a dense subgraph discovery algorithm that efficiently detects the communities belonging to each esthetic style. Finally, an online Gaussian mixture model (GMM) learning scheme is designed that dynamically updates the GMM parameters to describe esthetic communities as photos are uploaded or removed on the fly. Experiments on a million-scale image set crawled from Flickr demonstrate the efficiency and effectiveness of our method.
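The abstract's final step, updating GMM parameters as photos arrive or leave without re-running full EM, can be illustrated with a generic sufficient-statistics scheme. The sketch below is an assumption-laden illustration, not the paper's algorithm: the class name, the fixed spherical covariance, and the feature inputs are all hypothetical stand-ins for the paper's GSP-derived features.

```python
import numpy as np

class OnlineAestheticGMM:
    """Sketch of an online GMM: mixing weights and component means are
    re-derived from running sufficient statistics (soft counts and feature
    sums), so a photo's feature vector can be added or removed incrementally.
    Covariances are held fixed (shared spherical variance) to keep the
    illustration simple and numerically stable."""

    def __init__(self, init_means, var=1.0):
        self.means = np.asarray(init_means, float)   # (K, D) component means
        self.var = float(var)                        # shared spherical variance
        K = self.means.shape[0]
        self.weights = np.full(K, 1.0 / K)           # mixing weights
        self.n = np.zeros(K)                         # soft counts per component
        self.s = np.zeros_like(self.means)           # per-component feature sums

    def _resp(self, x):
        # Soft-assign a feature vector to components (softmax of log densities).
        sq = np.sum((x - self.means) ** 2, axis=1)
        logp = -0.5 * sq / self.var + np.log(self.weights + 1e-12)
        logp -= logp.max()
        r = np.exp(logp)
        return r / r.sum()

    def _refresh(self):
        # Re-derive weights and means from the accumulated statistics.
        total = self.n.sum()
        if total > 0:
            self.weights = self.n / total
            upd = self.n > 1e-8
            self.means[upd] = self.s[upd] / self.n[upd, None]

    def add(self, x):
        # A photo is uploaded: accumulate its soft-assigned statistics.
        x = np.asarray(x, float)
        r = self._resp(x)
        self.n += r
        self.s += r[:, None] * x
        self._refresh()

    def remove(self, x):
        # A photo is removed: subtract the statistics it would contribute
        # under the current model (an approximation of its original share).
        x = np.asarray(x, float)
        r = self._resp(x)
        self.n = np.maximum(self.n - r, 0.0)
        self.s -= r[:, None] * x
        self._refresh()
```

Because each update touches only K-dimensional statistics, the cost per uploaded or removed photo is independent of how many photos the community already contains, which is the property an online pipeline needs.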