Word Embedding
Definition
A learned representation of text where words with similar meanings are mapped to nearby points in a vector space. Pioneered by Word2Vec and GloVe, word embeddings capture semantic and syntactic relationships and serve as input features for NLP models.
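The nearness of related words can be made concrete with cosine similarity. A minimal sketch in plain Python, using small hand-picked vectors purely for illustration (real embeddings are trained, typically with 100 to 300 dimensions):

```python
import math

# Toy 4-dimensional embeddings; the values are illustrative, not trained weights
embeddings = {
    "king":  [0.80, 0.65, 0.10, 0.05],
    "queen": [0.75, 0.70, 0.15, 0.10],
    "apple": [0.05, 0.10, 0.90, 0.80],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should lie closer together in the vector space
sim_related = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_unrelated = cosine_similarity(embeddings["king"], embeddings["apple"])
print(sim_related > sim_unrelated)
```

With trained embeddings such as Word2Vec or GloVe, the same comparison surfaces genuine semantic neighbors ("king" near "queen", "monarch") rather than these hand-chosen toy values.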