Investigating Different Syntactic Context Types and Context Representations for Learning Word Embeddings

Bofang Li, Tao Liu, Zhe Zhao, Buzhou Tang, Aleksandr Drozd, Anna Rogers, Xiaoyong Du

    Publication: Conference article in proceedings › Research › peer review

    Abstract

    The number of word embedding models is growing every year. Most of them are based on the co-occurrence information of words and their contexts. However, what the best definition of context is remains an open question. We provide a systematic investigation of 4 different syntactic context types and context representations for learning word embeddings. Comprehensive experiments are conducted to evaluate their effectiveness on 6 extrinsic and intrinsic tasks. We hope that this paper, along with the published code, will be helpful for choosing the best context type and representation for a given task.
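    As an illustration of the co-occurrence extraction the abstract refers to, the sketch below generates (target, context) pairs for a simple linear-window context, in both a plain word representation and a position-bound representation. This is a minimal illustrative example, not the authors' published code; the function name, window size, and `word_offset` tagging scheme are assumptions.

    ```python
    # Minimal sketch (not the paper's code): extract co-occurrence pairs
    # from a linear window context. With bound=True, each context word is
    # tagged with its relative position ("dog" at offset -1 -> "dog_-1"),
    # one common way to distinguish context representations.

    def context_pairs(tokens, window=2, bound=False):
        """Yield (target, context) pairs from a linear window."""
        pairs = []
        for i, target in enumerate(tokens):
            for offset in range(-window, window + 1):
                j = i + offset
                if offset == 0 or j < 0 or j >= len(tokens):
                    continue  # skip the target itself and out-of-range positions
                ctx = f"{tokens[j]}_{offset}" if bound else tokens[j]
                pairs.append((target, ctx))
        return pairs

    sentence = "the quick brown fox".split()
    print(context_pairs(sentence, window=1))              # unbound contexts
    print(context_pairs(sentence, window=1, bound=True))  # position-bound contexts
    ```

    Dependency-based context types, also studied in the paper, would replace the linear window with arcs from a syntactic parse, but the pair-extraction step has the same shape.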
    Original language: English
    Title: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
    Number of pages: 11
    Place of publication: Copenhagen, Denmark, September 7-11, 2017
    Publication date: 2017
    Pages: 2411-2421
    Status: Published - 2017

    Keywords

    • Word Embedding Models
    • Syntactic Context Types
    • Context Representations
    • Co-occurrence Information
    • Extrinsic and Intrinsic Evaluation

