Date of Graduation

8-2025

Document Type

Thesis

Degree Name

Master of Science in Computer Science (MS)

Degree Level

Graduate

Department

Electrical Engineering and Computer Science

Advisor/Mentor

Nakarmi, Ukash

Committee Member

Brajendra Nath Panda

Second Committee Member

Lu Zhang

Abstract

A recommendation system is a bridge between users and products and is widely used by platforms such as Amazon and Netflix. This study investigates the use of Graph Neural Networks (GNNs), specifically the Light Graph Convolutional Network (LightGCN) and Graph Sample and Aggregate (GraphSAGE), for recommendation on two categories of Amazon review datasets ("All Beauty" and "Tools and Home Improvement"). The novelty of this work lies in combining supervised and self-supervised learning through Weighted Approximate-Rank Pairwise (WARP) and Information Noise-Contrastive Estimation (InfoNCE) losses to optimize user and item embeddings for producing a ranked recommendation list. The results show that GraphSAGE outperforms LightGCN; among the GraphSAGE models, the one combining WARP and InfoNCE losses performs best. These improvements can be attributed to several factors. GraphSAGE leverages node features to capture both collaborative and content-based signals, and it samples a fixed number of neighbors rather than all of them, which reduces the impact of noise, particularly in sparse and large datasets. GraphSAGE also uses the ReLU activation function, enabling it to learn non-linear behaviors such as price sensitivity. Beyond the model itself, contrastive learning (InfoNCE) offers robustness and consistency by creating two augmented views of the graph; optimization under InfoNCE pushes the embeddings of these two views toward each other. Overall, this work contributes to recommendation systems through feature engineering, novel loss design, and scalability solutions for large-scale data, and it shows that adding a contrastive loss significantly improves the system's performance.
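The combined objective described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the thesis's implementation: the temperature `tau`, the contrastive weight, and the function names are assumptions, and the ranking term here is a placeholder for the actual WARP loss, which is not reproduced.

```python
import numpy as np

def info_nce(view_a, view_b, tau=0.2):
    """InfoNCE loss between two augmented views of node embeddings.

    Rows at the same index are positive pairs; every other row in the
    batch serves as a negative. `tau` is an assumed temperature value.
    """
    # L2-normalize so dot products are cosine similarities.
    a = view_a / np.linalg.norm(view_a, axis=1, keepdims=True)
    b = view_b / np.linalg.norm(view_b, axis=1, keepdims=True)
    logits = a @ b.T / tau                     # pairwise similarity matrix
    # Row-wise log-softmax; positive pairs sit on the diagonal.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

def combined_loss(ranking_loss, view_a, view_b, weight=0.1):
    """Hypothetical weighted sum of a supervised ranking loss (WARP in
    the thesis) and the self-supervised InfoNCE term."""
    return ranking_loss + weight * info_nce(view_a, view_b)
```

A quick sanity check: embeddings from two nearly identical views incur a much smaller InfoNCE loss than misaligned ones, which is exactly the pressure that pulls the two augmented graphs' embeddings together.

```python
rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 16))
aligned = info_nce(emb, emb + 0.01 * rng.normal(size=emb.shape))
misaligned = info_nce(emb, emb[::-1])
print(aligned < misaligned)  # aligned views score lower
```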
