Multi-level Cross-view Contrastive Learning for Knowledge-aware Recommender System
This paper proposes MCCLK, a novel multi-level cross-view contrastive learning mechanism for KG-aware recommendation. It comprehensively considers three different graph views (a global-level structural view and local-level collaborative and semantic views) and introduces a k-Nearest-Neighbor (kNN) item-item semantic graph construction module.
Abstract
Knowledge graphs (KGs) play an increasingly important role in recommender systems. Recently, graph neural network (GNN)-based models have gradually become the mainstream of knowledge-aware recommendation (KGR). However, GNN-based KGR models suffer from a natural deficiency, the sparse supervised signal problem, which may degrade their actual performance. Inspired by the recent success of contrastive learning in mining supervised signals from the data itself, in this paper we explore contrastive learning for KG-aware recommendation and propose a novel multi-level cross-view contrastive learning mechanism, named MCCLK. Unlike traditional contrastive learning methods, which generate two graph views by uniform data augmentation schemes such as corruption or dropping, we comprehensively consider three different graph views for KG-aware recommendation: a global-level structural view and local-level collaborative and semantic views. Specifically, we treat the user-item graph as the collaborative view, the item-entity graph as the semantic view, and the user-item-entity graph as the structural view. MCCLK then performs contrastive learning across the three views at both the local and global levels, mining comprehensive graph feature and structure information in a self-supervised manner. Besides, in the semantic view, a k-Nearest-Neighbor (kNN) item-item semantic graph construction module is proposed to capture the important item-item semantic relations that are usually ignored by previous work. Extensive experiments conducted on three benchmark datasets show the superior performance of our proposed method over the state-of-the-art. The implementations are available at: https://github.com/CCIIPLab/MCCLK.
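To make the two key ingredients of the abstract concrete, the sketch below illustrates (a) building a kNN item-item semantic graph from item embeddings via cosine similarity, and (b) a cross-view InfoNCE-style contrastive loss that treats the same item's embeddings from two views as a positive pair. This is a minimal illustration of the general techniques, not the paper's exact formulation; the function names, the choice of cosine similarity, and the temperature value are assumptions for the example.

```python
import numpy as np

def knn_semantic_graph(item_emb, k=2):
    """Build a kNN item-item adjacency matrix from item embeddings.

    Each item is connected to its k most cosine-similar items
    (self-loops excluded), giving a sparse semantic graph.
    """
    norm = item_emb / np.linalg.norm(item_emb, axis=1, keepdims=True)
    sim = norm @ norm.T                     # pairwise cosine similarity
    np.fill_diagonal(sim, -np.inf)          # exclude self-similarity
    adj = np.zeros_like(sim)
    top_k = np.argsort(-sim, axis=1)[:, :k] # indices of k nearest neighbors
    rows = np.arange(sim.shape[0])[:, None]
    adj[rows, top_k] = 1.0
    return adj

def cross_view_info_nce(z1, z2, tau=0.5):
    """InfoNCE-style contrastive loss between two embedding views.

    Row i of z1 and row i of z2 are the same item seen from two views
    (a positive pair); all other rows serve as negatives.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                # similarity of every cross-view pair
    pos = np.diag(logits)                   # aligned (positive) pairs
    log_norm = np.log(np.sum(np.exp(logits), axis=1))
    return float(-np.mean(pos - log_norm))  # mean -log softmax of the positive
```

In a full model the two views would come from separate GNN encoders over different graphs (e.g. collaborative vs. semantic), and the kNN graph would be recomputed from learned item embeddings rather than fixed features.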