Contextual Representations
Contextual representations are representations of words, phrases, or sentences that depend on the surrounding text. Unlike static word embeddings such as Word2Vec, where each word is mapped to a single fixed vector regardless of context, contextual representations capture the meaning of a word based on the words around it in a particular document. The representation of a word therefore varies with its surroundings (for example, "bank" in "river bank" versus "bank account"), allowing a more nuanced handling of meaning in natural language processing tasks.
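The contrast can be sketched with a toy example. Below, a static lookup table (Word2Vec-style) assigns every word one fixed random vector, while a minimal single-head self-attention step, in the spirit of the Transformer cited below, mixes each word's vector with those of its neighbors. All names and the random embeddings are illustrative assumptions, not a real pretrained model.

```python
import numpy as np

np.random.seed(0)
vocab = ["the", "river", "bank", "money", "account"]
d = 8
# Static lookup table: one fixed vector per word, regardless of context.
static = {w: np.random.randn(d) for w in vocab}

def contextual(tokens):
    """Toy single-head self-attention: each output vector is a
    softmax-weighted mixture of all vectors in the sentence, so the
    same word can receive different representations in different
    sentences."""
    X = np.stack([static[t] for t in tokens])      # (n, d)
    scores = X @ X.T / np.sqrt(d)                  # scaled dot-product
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ X                             # (n, d)

s1 = ["the", "river", "bank"]
s2 = ["the", "bank", "account"]
v1 = contextual(s1)[s1.index("bank")]
v2 = contextual(s2)[s2.index("bank")]
# The static vector for "bank" is the same in both sentences,
# but its contextual representation differs with the surroundings.
assert not np.allclose(v1, v2)
```

Real contextual encoders such as BERT stack many such attention layers with learned parameters, but the key property is already visible here: the vector for "bank" changes when its neighbors change.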
References
Vaswani et al., "Attention Is All You Need", Advances in Neural Information Processing Systems (NeurIPS), 2017.