Abstract:
In personalized recommendation, users often express complex logical requirements involving the intersection of multiple preferences over heterogeneous graphs that contain users, items, and external knowledge. Existing methods for mining logical relations face scalability challenges and often overlook the semantics of relations, which are essential for uncovering higher-order connections and addressing incomplete relations within the graph. To tackle these challenges, we propose RelRec, a novel approach that leverages large language models (LLMs) to mine logical relations and satisfy users' logical requirements in personalized recommendation tasks. Specifically, the framework begins with the extraction of user-driven logical relations, followed by a rule-based logical relation mining module that integrates both semantic and structural information using the capabilities of LLMs. By uncovering higher-order logical relations, our approach effectively refines the heterogeneous graph, improving both reasoning capacity and recommendation accuracy. Extensive experiments on real-world datasets demonstrate that RelRec significantly outperforms existing methods. © 2025 Copyright held by the owner/author(s). Publication rights licensed to ACM.
Year: 2025
Page: 1326-1330
Language: English