Fundamentals

Embeddings

Quick Answer

Numerical vector representations of text that capture semantic meaning.

An embedding converts text (a word, sentence, or document) into a vector of numbers that captures its semantic meaning: similar texts produce similar embeddings. Embeddings are fundamental to semantic search, clustering, and recommendation systems, and serve as input features for machine learning models, letting you work with text numerically. Modern embedding models are trained to capture meaningful relationships, so synonyms cluster together and distance in vector space correlates with semantic distance. An embedding is a fixed-length vector, typically 384 to 1536 dimensions, regardless of the length of the input text.
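The "similar texts produce similar embeddings" property is usually measured with cosine similarity. The sketch below computes it in pure Python; the 4-dimensional vectors are invented toy values for illustration (real models output hundreds of dimensions), not actual model output.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|).
    # Ranges from -1 to 1; closer to 1 means more semantically similar.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical toy embeddings; a real model would produce 384-1536 dims.
emb_cat = [0.9, 0.8, 0.1, 0.0]
emb_kitten = [0.85, 0.75, 0.2, 0.05]
emb_car = [0.1, 0.0, 0.9, 0.8]

print(cosine_similarity(emb_cat, emb_kitten))  # high: related concepts
print(cosine_similarity(emb_cat, emb_car))     # low: unrelated concepts
```

In a real system you would fetch the vectors from an embedding model or API rather than hard-coding them, then rank documents by their similarity to a query embedding.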

Last verified: 2026-04-08