ST-SAGE: A Spatial-Temporal Sparse Additive Generative Model for Spatial Item Recommendation

Publication Type:
Journal Article
ACM Transactions on Intelligent Systems and Technology, 2017, 8(3)
© 2017 ACM 2157-6904/2017/04-ART48 $15.00. With the rapid development of location-based social networks (LBSNs), spatial item recommendation has become an important mobile application, especially when users travel away from home. However, this type of recommendation is very challenging compared to traditional recommender systems. A user may visit only a limited number of spatial items, leading to a very sparse user-item matrix. This matrix becomes even sparser when the user travels to a distant place, as most of the items visited by a user are usually located within a short distance from the user's home. Moreover, user interests and behavior patterns may vary dramatically across different times and geographical regions. In light of this, we propose ST-SAGE, a Spatial-Temporal Sparse Additive Generative model for spatial item recommendation, in this article. ST-SAGE considers both the personal interests of the users and the preferences of the crowd in the target region at the given time by exploiting both the co-occurrence patterns and the content of spatial items. To further alleviate the data-sparsity issue, ST-SAGE exploits geographical correlation by smoothing the crowd's preferences over a well-designed spatial index structure called the spatial pyramid. To speed up the training process of ST-SAGE, we implement a parallel version of the model inference algorithm on the GraphLab framework. We conduct extensive experiments; the experimental results clearly demonstrate that ST-SAGE outperforms state-of-the-art recommender systems in terms of recommendation effectiveness, model training efficiency, and online recommendation efficiency.
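The abstract's key data-sparsity idea is smoothing crowd preferences over a hierarchical spatial index (the "spatial pyramid"). The sketch below is purely illustrative and is not the authors' implementation: it assumes a simple quadtree-style grid over latitude/longitude where each level's cells accumulate item-visit counts, and a query region's preferences mix its own counts with those of its coarser ancestors so that sparse fine-grained cells borrow strength from larger regions. All class, method, and parameter names are hypothetical.

```python
# Hypothetical sketch of spatial-pyramid preference smoothing,
# not the ST-SAGE model itself.
from collections import Counter

class SpatialPyramid:
    def __init__(self, levels=3):
        self.levels = levels
        # counts[level][(cx, cy)] -> Counter of item visits in that cell
        self.counts = [{} for _ in range(levels)]

    def _cell(self, lat, lon, level):
        # A 2^level x 2^level grid over [-90, 90] x [-180, 180].
        n = 2 ** level
        cx = min(int((lat + 90) / 180 * n), n - 1)
        cy = min(int((lon + 180) / 360 * n), n - 1)
        return cx, cy

    def add_visit(self, lat, lon, item):
        # Propagate the observation to every level of the pyramid.
        for lvl in range(self.levels):
            cell = self._cell(lat, lon, lvl)
            self.counts[lvl].setdefault(cell, Counter())[item] += 1

    def smoothed_preferences(self, lat, lon, decay=0.5):
        # Mix fine-grained (local) counts with coarser ancestors:
        # the deepest level gets full weight, each ancestor level
        # is discounted by `decay`, filling in sparse local cells.
        prefs = Counter()
        weight = 1.0
        for lvl in range(self.levels - 1, -1, -1):
            cell = self._cell(lat, lon, lvl)
            for item, c in self.counts[lvl].get(cell, Counter()).items():
                prefs[item] += weight * c
            weight *= decay
        total = sum(prefs.values())
        return {i: v / total for i, v in prefs.items()} if total else {}
```

Under this toy scheme, a user querying a cell with few local visits still receives a non-empty, region-informed preference distribution, which is the smoothing effect the abstract attributes to the spatial pyramid.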