TY - JOUR
T1 - MetaCAR
T2 - Cross-Domain Meta-Augmentation for Content-Aware Recommendation
AU - Xu, Hui
AU - Li, Changyu
AU - Zhang, Yan
AU - Duan, Lixin
AU - Tsang, Ivor W.
AU - Shao, Jie
N1 - Publisher Copyright:
© 1989-2012 IEEE.
PY - 2023/8/1
Y1 - 2023/8/1
N2 - Cold-start has become a critical issue for recommendation, especially under sparse user-item interactions. Recent approaches based on meta-learning succeed in alleviating the issue, owing to the fact that these methods generalize well and can quickly adapt to new tasks under cold-start settings. However, meta-learning-based recommendation models trained with single, sparse ratings easily fall into meta-overfitting, since the one and only rating rui for a specific item i cannot reflect a user's diverse interests under various circumstances (e.g., time, mood, age, etc.); that is, even if rui equals 1 in the historical dataset, rui could be 0 in some circumstance. In meta-learning, tasks with such single ratings are called Non-Mutually-Exclusive (Non-ME) tasks, and tasks with diverse ratings are called Mutually-Exclusive (ME) tasks. Fortunately, a meta-augmentation technique has been proposed to relieve meta-overfitting in meta-learning methods by transforming Non-ME tasks into ME tasks, adding noise to labels without changing inputs. Motivated by this meta-augmentation method, in this paper we propose a cross-domain meta-augmentation technique for content-aware recommendation systems (MetaCAR) to construct ME tasks in the recommendation scenario. Our proposed method consists of two stages: meta-augmentation and meta-learning. In the meta-augmentation stage, we first conduct domain adaptation with a dual conditional variational autoencoder (CVAE) under a multi-view information bottleneck constraint, and then apply the learned CVAE to generate ratings for users in the target domain. In the meta-learning stage, we introduce both the true and the generated ratings to construct ME tasks, which enables the meta-learning recommendation models to avoid meta-overfitting. Experiments on real-world datasets show the significant superiority of MetaCAR in coping with the cold-start user issue over competing baselines, including cross-domain, content-aware, and meta-learning-based recommendations.
AB - Cold-start has become a critical issue for recommendation, especially under sparse user-item interactions. Recent approaches based on meta-learning succeed in alleviating the issue, owing to the fact that these methods generalize well and can quickly adapt to new tasks under cold-start settings. However, meta-learning-based recommendation models trained with single, sparse ratings easily fall into meta-overfitting, since the one and only rating rui for a specific item i cannot reflect a user's diverse interests under various circumstances (e.g., time, mood, age, etc.); that is, even if rui equals 1 in the historical dataset, rui could be 0 in some circumstance. In meta-learning, tasks with such single ratings are called Non-Mutually-Exclusive (Non-ME) tasks, and tasks with diverse ratings are called Mutually-Exclusive (ME) tasks. Fortunately, a meta-augmentation technique has been proposed to relieve meta-overfitting in meta-learning methods by transforming Non-ME tasks into ME tasks, adding noise to labels without changing inputs. Motivated by this meta-augmentation method, in this paper we propose a cross-domain meta-augmentation technique for content-aware recommendation systems (MetaCAR) to construct ME tasks in the recommendation scenario. Our proposed method consists of two stages: meta-augmentation and meta-learning. In the meta-augmentation stage, we first conduct domain adaptation with a dual conditional variational autoencoder (CVAE) under a multi-view information bottleneck constraint, and then apply the learned CVAE to generate ratings for users in the target domain. In the meta-learning stage, we introduce both the true and the generated ratings to construct ME tasks, which enables the meta-learning recommendation models to avoid meta-overfitting. Experiments on real-world datasets show the significant superiority of MetaCAR in coping with the cold-start user issue over competing baselines, including cross-domain, content-aware, and meta-learning-based recommendations.
KW - cold-start
KW - content-aware
KW - meta-augmentation
KW - Recommendation systems
UR - http://www.scopus.com/inward/record.url?scp=85139429821&partnerID=8YFLogxK
U2 - 10.1109/TKDE.2022.3209005
DO - 10.1109/TKDE.2022.3209005
M3 - Article
AN - SCOPUS:85139429821
SN - 1041-4347
VL - 35
SP - 8199
EP - 8212
JO - IEEE Transactions on Knowledge and Data Engineering
JF - IEEE Transactions on Knowledge and Data Engineering
IS - 8
ER -