Investigating Different Syntactic Context Types and Context Representations for Learning Word Embeddings

Bofang Li, Tao Liu, Zhe Zhao, Buzhou Tang, Aleksandr Drozd, Anna Rogers, Xiaoyong Du

    Research output: Article in proceedings · Research · peer-review

    Abstract

    The number of word embedding models is growing every year. Most of them are based on the co-occurrence information of words and their contexts. However, it remains an open question what the best definition of context is. We provide a systematic investigation of 4 different syntactic context types and context representations for learning word embeddings. Comprehensive experiments are conducted to evaluate their effectiveness on 6 extrinsic and intrinsic tasks. We hope that this paper, along with the published code, will be helpful for choosing the best context type and representation for a given task.
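
    To make the notion of "context" concrete, here is a minimal, hypothetical Python sketch (not the authors' published code) contrasting two of the syntactic context types the paper compares: linear window-based contexts and dependency-based contexts. The example sentence and its dependency arcs are hard-coded assumptions for illustration only.

    ```python
    # Minimal sketch: two ways to define the "context" of a target word,
    # corresponding to two of the context types compared in the paper.

    def linear_contexts(tokens, target_index, window=2):
        """Words within a symmetric window around the target word."""
        lo = max(0, target_index - window)
        hi = min(len(tokens), target_index + window + 1)
        return [tokens[i] for i in range(lo, hi) if i != target_index]

    def dependency_contexts(arcs, target):
        """(word, relation) pairs for dependency arcs touching the target.

        Keeping the relation label gives a "bound" context representation;
        dropping it (word only) gives the "unbound" representation, the
        other option the paper evaluates.
        """
        contexts = []
        for head, rel, dep in arcs:
            if head == target:
                contexts.append((dep, rel))
            elif dep == target:
                contexts.append((head, rel + "-inv"))  # inverse direction
        return contexts

    tokens = ["australian", "scientist", "discovers", "star", "with", "telescope"]
    arcs = [  # hypothetical dependency parse: (head, relation, dependent)
        ("scientist", "amod", "australian"),
        ("discovers", "nsubj", "scientist"),
        ("discovers", "dobj", "star"),
        ("discovers", "prep", "with"),
        ("with", "pobj", "telescope"),
    ]

    print(linear_contexts(tokens, tokens.index("discovers")))
    # ['australian', 'scientist', 'star', 'with']
    print(dependency_contexts(arcs, "discovers"))
    # [('scientist', 'nsubj'), ('star', 'dobj'), ('with', 'prep')]
    ```

    The (word, context) pairs produced either way would then feed a co-occurrence-based embedding model; which combination of context type and representation works best is what the paper's 6 extrinsic and intrinsic tasks probe.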
    Original language: English
    Title of host publication: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
    Number of pages: 11
    Place of Publication: Copenhagen, Denmark, September 7-11, 2017
    Publication date: 2017
    Pages: 2411-2421
    Publication status: Published - 2017

    Keywords

    • Word Embedding Models
    • Syntactic Context Types
    • Context Representations
    • Co-occurrence Information
    • Extrinsic and Intrinsic Evaluation
