Paper ID: 2310.05964
Exploring Embeddings for Measuring Text Relatedness: Unveiling Sentiments and Relationships in Online Comments
Anthony Olakangil, Cindy Wang, Justin Nguyen, Qunbo Zhou, Kaavya Jethwa, Jason Li, Aryan Narendra, Nishk Patel, Arjun Rajaram
After the COVID-19 pandemic caused internet usage to grow by 70%, an increasing number of people across the world have turned to social media. Applications like Twitter, Meta Threads, YouTube, and Reddit have become increasingly pervasive, leaving almost no digital space where public opinion is not expressed. This paper investigates sentiment and semantic relationships among comments across various social media platforms, and discusses the importance of shared opinions across these platforms, using word embeddings to analyze components of sentences and documents. This allows researchers, politicians, and business representatives to trace shared sentiment among users across the world. The paper presents multiple approaches for measuring the relatedness of text extracted from user comments on these popular online platforms. By leveraging embeddings, which capture semantic relationships between words and help analyze sentiment across the web, we can uncover connections in public opinion as a whole. The study utilizes pre-existing datasets from YouTube, Reddit, Twitter, and other platforms. We use popular natural language processing models such as Bidirectional Encoder Representations from Transformers (BERT) to analyze sentiment and explore relationships between comment embeddings. Additionally, we apply clustering and KL-divergence to find semantic relationships within these comment embeddings across social media platforms. Our analysis enables a deeper understanding of the interconnectedness of online comments and investigates the notion of the internet functioning as a large interconnected brain.
Submitted: Sep 15, 2023
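
To make the described pipeline concrete, the sketch below illustrates one way the approach in the abstract could be realized: encoding comments with a BERT-based sentence encoder, clustering the pooled embeddings, and comparing per-platform cluster distributions with KL-divergence. This is a minimal illustration under stated assumptions, not the authors' implementation; the sentence-transformers and scikit-learn libraries, the all-MiniLM-L6-v2 model, and the toy comment lists are assumptions introduced for demonstration.

```python
# Minimal sketch (not the authors' exact pipeline), assuming the
# sentence-transformers and scikit-learn libraries and toy comment lists.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans
from scipy.stats import entropy
import numpy as np

# Hypothetical comment samples from two platforms.
youtube_comments = ["Great video, very informative!", "I disagree with this take."]
reddit_comments = ["This thread changed my mind.", "Totally overhyped, honestly."]

# Encode comments into BERT-based sentence embeddings.
model = SentenceTransformer("all-MiniLM-L6-v2")
all_comments = youtube_comments + reddit_comments
embeddings = model.encode(all_comments)

# Cluster the pooled embeddings to find shared semantic groups.
k = 2
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embeddings)

def cluster_distribution(platform_labels, k):
    # Per-platform distribution over clusters, smoothed to avoid zeros.
    counts = np.bincount(platform_labels, minlength=k) + 1e-9
    return counts / counts.sum()

p_youtube = cluster_distribution(labels[: len(youtube_comments)], k)
p_reddit = cluster_distribution(labels[len(youtube_comments):], k)

# KL divergence between the platforms' cluster distributions:
# smaller values suggest more closely shared topics and sentiment.
kl = entropy(p_youtube, p_reddit)
print(f"KL divergence (YouTube || Reddit): {kl:.4f}")
```

In this sketch the embeddings stand in for the comment representations discussed in the abstract, and the KL-divergence between cluster-assignment distributions serves as one possible proxy for how closely opinion across two platforms overlaps.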