This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
GloVe: Global Vectors for Word Representation
33,427
Citations
3
Authors
2014
Year
Abstract
Recent methods for learning vector space representations of words have succeeded in capturing fine-grained semantic and syntactic regularities using vector arithmetic, but the origin of these regularities has remained opaque. We analyze and make explicit the model properties needed for such regularities to emerge in word vectors. The result is a new global log-bilinear regression model that combines the advantages of the two major model families in the literature: global matrix factorization and local context window methods. Our model efficiently leverages statistical information by training only on the nonzero elements in a word-word co-occurrence matrix, rather than on the entire sparse matrix or on individual context windows in a large corpus. The model produces a vector space with meaningful substructure, as evidenced by its performance of 75% on a recent word analogy task. It also outperforms related models on similarity tasks and named entity recognition.
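The abstract describes a weighted least-squares log-bilinear regression fit only on the nonzero entries of a word-word co-occurrence matrix. The sketch below illustrates that objective on toy data with plain full-batch gradient descent; the co-occurrence counts, embedding dimension, learning rate, and step count are all illustrative assumptions, not values from the paper.

```python
import numpy as np

# Toy 4-word co-occurrence counts (illustrative, not from the paper).
X = np.array([
    [0., 3., 1., 0.],
    [3., 0., 2., 1.],
    [1., 2., 0., 4.],
    [0., 1., 4., 0.],
])

rng = np.random.default_rng(0)
V, d = X.shape[0], 2                      # vocab size, embedding dim
W  = rng.normal(scale=0.1, size=(V, d))   # word vectors
Wc = rng.normal(scale=0.1, size=(V, d))   # context vectors
b  = np.zeros(V)                          # word biases
bc = np.zeros(V)                          # context biases

ii, jj = np.nonzero(X)                    # train on nonzero pairs only
logX = np.log(X[ii, jj])

def f(x, x_max=100.0, alpha=0.75):
    """Weighting function: down-weights rare co-occurrences."""
    return np.where(x < x_max, (x / x_max) ** alpha, 1.0)

fx = f(X[ii, jj])

def loss():
    """Weighted least-squares cost over the nonzero co-occurrences."""
    err = np.sum(W[ii] * Wc[jj], axis=1) + b[ii] + bc[jj] - logX
    return float(np.sum(fx * err ** 2))

loss_before = loss()
lr = 0.05
for _ in range(500):                      # plain gradient descent
    err = np.sum(W[ii] * Wc[jj], axis=1) + b[ii] + bc[jj] - logX
    g = 2.0 * fx * err                    # d(cost)/d(err), per pair
    gW, gWc = np.zeros_like(W), np.zeros_like(Wc)
    gb, gbc = np.zeros_like(b), np.zeros_like(bc)
    np.add.at(gW, ii, g[:, None] * Wc[jj])
    np.add.at(gWc, jj, g[:, None] * W[ii])
    np.add.at(gb, ii, g)
    np.add.at(gbc, jj, g)
    W -= lr * gW; Wc -= lr * gWc; b -= lr * gb; bc -= lr * gbc

loss_after = loss()
```

Note that the published model is trained stochastically with AdaGrad on corpus-scale matrices; the full-batch loop here only demonstrates that the cost decreases as word and context vectors fit the log co-occurrence counts.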
Related Papers
MizAR 60 for Mizar 50
2023 · 74,770 citations
AI-Assisted Pipeline for Dynamic Generation of Trustworthy Health Supplement Content at Scale
2018 · 45,479 citations
2019 · 31,688 citations
Latent dirichlet allocation
2003 · 27,000 citations
Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
2014 · 24,021 citations