Word Embeddings for Domain Specific Semantic Relatedness

Date

2018-11-01

Authors

Tilbury, Kyle

Abstract

Word embeddings are becoming pervasive in natural language processing (NLP), with one of their main strengths being their ability to capture semantic relationships between words. Rather than training their own embeddings, many NLP practitioners elect to use pre-trained word embeddings. These pre-trained embeddings are typically created and evaluated using general corpora, so their performance within technical domains is poorly understood. In this thesis, we explore how the nature of the data used to train embeddings affects their performance when computing semantic relatedness within different domains. The three main contributions are as follows. Firstly, we find that the performance of general pre-trained embeddings is lacking in the biomedical domain. Secondly, we provide key insights that should be considered when working with word embeddings for any semantic task. Finally, we develop new biomedical word embeddings and make them publicly available for use by others.
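The semantic relatedness the abstract refers to is typically computed as the cosine similarity between the words' embedding vectors. The sketch below illustrates the idea with hypothetical three-dimensional toy vectors (real pre-trained embeddings such as word2vec or GloVe usually have hundreds of dimensions); the words and values are illustrative assumptions, not data from the thesis.

import numpy as np

# Toy embedding table standing in for a pre-trained model:
# each word maps to a dense vector (real embeddings are ~100-300 dims).
embeddings = {
    "aspirin":   np.array([0.8, 0.1, 0.3]),
    "ibuprofen": np.array([0.7, 0.2, 0.4]),
    "guitar":    np.array([0.1, 0.9, 0.0]),
}

def relatedness(w1, w2):
    """Cosine similarity between the embedding vectors of two words."""
    v1, v2 = embeddings[w1], embeddings[w2]
    return float(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)))

print(relatedness("aspirin", "ibuprofen"))  # high: semantically related drugs
print(relatedness("aspirin", "guitar"))     # low: unrelated terms

The thesis's observation that general-corpus embeddings underperform in the biomedical domain amounts to saying that, for domain term pairs like the one above, such similarity scores correlate poorly with expert relatedness judgments unless the embeddings are trained on domain text.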

Keywords

word embedding, word vector, semantic relatedness, semantic similarity, biomedical
