Probing the Representational Structure of Regular Polysemy via Sense Analogy Questions: Insights from Contextual Word Vectors

Jiangtian Li et al. Cogn Sci. 2024 Mar;48(3):e13416. doi: 10.1111/cogs.13416.

Abstract

Regular polysemes are sets of ambiguous words that all share the same relationship between their meanings, such as CHICKEN and LOBSTER both referring to an animal or its meat. To probe how a distributional semantic model, here exemplified by bidirectional encoder representations from transformers (BERT), represents regular polysemy, we analyzed whether its embeddings support answering sense analogy questions similar to "is the mapping between CHICKEN (as an animal) and CHICKEN (as a meat) similar to that which maps LOBSTER (as an animal) to LOBSTER (as a meat)?" We did so using the LRcos model, which combines a logistic regression classifier of different categories (e.g., animal vs. meat) with a measure of cosine similarity. We found that (a) the model was sensitive to the shared structure within a given regular relationship; (b) the shared structure varies across different regular relationships (e.g., animal/meat vs. location/organization), potentially reflecting a "regularity continuum"; (c) some higher-order latent structure is shared across different regular relationships, suggestive of a similar latent structure across different types of relationships; and (d) there was no evidence that these effects could be explained by meaning overlap. Lastly, we found that both components of the LRcos model made important contributions to accurate responding and that a variation of this method could yield an accuracy boost of 10% in answering sense analogy questions. These findings enrich previous theoretical work on regular polysemy with a computationally explicit theory and methods, and provide evidence for an important organizational principle for the mental lexicon and the broader conceptual knowledge system.
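
To make the LRcos scoring scheme concrete, the following is a minimal sketch in Python (assuming scikit-learn and precomputed contextual BERT embeddings stored as NumPy arrays). The function name, data layout, and training examples are hypothetical illustrations of the general technique, not the authors' implementation.

    # LRcos-style sketch: score each candidate answer by
    # P(target category) * cosine similarity to the query sense.
    # Embeddings are assumed to be precomputed contextual BERT vectors;
    # all names and data here are hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics.pairwise import cosine_similarity

    def lrcos_answer(query_vec, candidate_vecs, candidate_words,
                     train_vecs, train_labels):
        # Logistic regression separates the target category (label 1,
        # e.g., meat senses) from the source category (label 0, e.g., animal senses).
        clf = LogisticRegression(max_iter=1000).fit(train_vecs, train_labels)
        p_target = clf.predict_proba(candidate_vecs)[:, 1]
        # Cosine similarity keeps the answer close to the cue word's embedding
        # (e.g., CHICKEN-as-animal when looking for CHICKEN-as-meat).
        cos = cosine_similarity(candidate_vecs, query_vec.reshape(1, -1)).ravel()
        scores = p_target * cos
        return candidate_words[int(np.argmax(scores))], scores

Multiplying the two terms means a winning candidate must both fit the target category (the classifier component) and remain semantically close to the cue word (the cosine component), which is the intuition behind combining the two measures.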

Keywords: BERT model; Distributional semantic model; Lexical semantics; Regular polysemy; Semantic ambiguity; Word analogy; Word sense analogy.

