Exploring phrase-compositionality in skip-gram models

Authors
  • Peng, Xiaochang
  • Gildea, Daniel
Type
Preprint
Publication Date
Jul 21, 2016
Submission Date
Jul 21, 2016
Identifiers
arXiv ID: 1607.06208
Source
arXiv
Abstract

In this paper, we introduce a variation of the skip-gram model which jointly learns distributed word vector representations and the way they compose to form phrase embeddings. In particular, we propose a learning procedure that incorporates a phrase-compositionality function capturing how phrase vectors are composed from their component word vectors. Our experiments show improvements on word and phrase similarity tasks as well as on syntactic tasks such as dependency parsing using the proposed joint models.
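
For readers unfamiliar with the setup, the sketch below illustrates the general idea of coupling skip-gram training with a phrase-composition step: a phrase vector is built from its component word vectors and trained with negative sampling, so that the composition and the word vectors are learned jointly. This is a minimal sketch, not the paper's actual model; the mean composition, the toy vocabulary, and the hyperparameters are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's model): skip-gram with
# negative sampling where a "phrase" center vector is composed from its
# component word vectors by averaging. Vocabulary, dimensions, learning
# rate, and the averaging composition are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["new", "york", "city", "is", "large", "the", "river", "runs"]
word2id = {w: i for i, w in enumerate(VOCAB)}
V, D = len(VOCAB), 16                        # vocabulary size, embedding dim

W_in = rng.normal(scale=0.1, size=(V, D))    # input (word) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))   # output (context) embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def compose(word_ids):
    """Compose a phrase vector from component word vectors (here: mean)."""
    return W_in[word_ids].mean(axis=0)

def sgns_step(center_ids, context_id, lr=0.05, k=3):
    """One skip-gram-with-negative-sampling update.

    `center_ids` may hold a single word id or several ids forming a phrase;
    the gradient of the mean composition is shared equally among the
    phrase's component word vectors.
    """
    v = compose(center_ids)
    targets = [context_id] + list(rng.integers(0, V, size=k))  # 1 pos + k neg
    labels = np.array([1.0] + [0.0] * k)

    grad_v = np.zeros(D)
    for t, y in zip(targets, labels):
        score = sigmoid(W_out[t] @ v)
        g = score - y                        # gradient w.r.t. the dot product
        grad_v += g * W_out[t]
        W_out[t] -= lr * g * v
    # distribute the phrase gradient over its component word vectors
    for cid in center_ids:
        W_in[cid] -= lr * grad_v / len(center_ids)

# Example: treat "new york" as a phrase predicting the context word "city".
sgns_step([word2id["new"], word2id["york"]], word2id["city"])
```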
