Are We Really Making Recommendations Robust? Revisiting Model Evaluation for Denoising Recommendation
- Publisher:
- Association for Computing Machinery (ACM)
- Publication Type:
- Conference Proceeding
- Citation:
- RecSys '25: Proceedings of the 19th ACM Conference on Recommender Systems, 2025, pp. 706-715
- Issue Date:
- 2025-08-07
This item is open access.
Implicit feedback data has become a fundamental component of modern recommender systems due to its scalability and availability. However, noisy interactions, such as accidental clicks and position bias, can degrade recommendation performance. Denoising recommendation has recently emerged as a popular research topic, aiming to identify noisy samples and mitigate their impact so that robust recommendation models can be trained in the presence of noisy interactions. Although denoising methods are a promising solution, our systematic evaluation reveals critical reproducibility issues in this growing research area. We observe inconsistent performance across different experimental settings and a concerning misalignment between validation metrics and test performance caused by distribution shifts. Through extensive experiments testing 6 representative denoising methods across 4 recommender models and 3 datasets, we find that no single denoising approach consistently outperforms the others, and that simple improvements to evaluation strategies can sometimes match or exceed state-of-the-art denoising methods. Our analysis further raises concerns about denoising recommendation in high-noise scenarios. We identify key factors contributing to these reproducibility failures and propose pathways toward more reliable denoising recommendation research. This work serves as both a cautionary examination of current practices and a constructive guide for developing more reliable evaluation methodologies in denoising recommendation.
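To make concrete the kind of denoising method the abstract refers to, below is a minimal sketch of loss-truncation style denoising in PyTorch: interactions with unusually high training loss are treated as likely noise and excluded from the gradient update. This is a generic illustration, not the paper's implementation; the `model(users, items)` scoring interface and the `drop_rate` hyperparameter are assumptions made for the example.

```python
import torch
import torch.nn.functional as F

def truncated_bce_step(model, users, items, labels, optimizer, drop_rate=0.2):
    """One training step with loss truncation: the highest-loss
    interactions are treated as likely noise and dropped from the update.
    (Illustrative sketch; `model(users, items)` is an assumed interface.)"""
    optimizer.zero_grad()
    logits = model(users, items)  # per-interaction relevance scores (logits)
    per_sample_loss = F.binary_cross_entropy_with_logits(
        logits, labels.float(), reduction="none"
    )
    # Keep only the lowest-loss fraction of samples; the high-loss
    # remainder is presumed noisy and contributes no gradient.
    num_keep = max(1, int(per_sample_loss.numel() * (1.0 - drop_rate)))
    kept_loss, _ = torch.topk(per_sample_loss, num_keep, largest=False)
    loss = kept_loss.mean()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Methods in this family typically anneal `drop_rate` upward from zero over the first few epochs, so the model first fits the (mostly clean) majority of interactions before it begins discarding samples; the fixed rate above is a simplification.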