sourc.dev
Home LLMs Tools SaaS APIs
Claude 3.5 Sonnet input $3.00/1M ↓ -50%
GPT-4o input $2.50/1M
Gemini 1.5 Pro input $1.25/1M
Mistral Large input $2.00/1M ↓ -33%
DeepSeek V3 input $0.27/1M
synced 2026-04-05
#33 of 50

Grounding

The difference between a model that guesses and one that cites

What is grounding

Grounding is the practice of connecting a language model's responses to specific, verifiable source material. Instead of generating answers from its training data alone, a grounded model references provided documents, database records, or search results — and can cite them.

RAG (retrieval-augmented generation) is the most common grounding technique. The model receives relevant documents as context and generates responses anchored to their content.
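A minimal sketch of that retrieval-then-generate loop, using simple word-overlap retrieval in place of a real vector search (all names here are illustrative, not a specific library's API):

```python
import re

def tokens(text: str) -> set[str]:
    """Lowercase a string and split it into a set of words."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(query: str, docs: dict[str, str], k: int = 2) -> list[str]:
    """Return the ids of the k documents sharing the most words with the query."""
    ranked = sorted(docs, key=lambda doc_id: len(tokens(query) & tokens(docs[doc_id])),
                    reverse=True)
    return ranked[:k]

def build_grounded_prompt(query: str, docs: dict[str, str]) -> str:
    """Anchor the model's answer to retrieved sources it can cite by id."""
    hits = retrieve(query, docs)
    context = "\n".join(f"[{doc_id}] {docs[doc_id]}" for doc_id in hits)
    return ("Answer using ONLY the sources below. Cite source ids in brackets.\n"
            f"Sources:\n{context}\n\nQuestion: {query}")

docs = {
    "doc1": "Grounding connects model output to verifiable source material.",
    "doc2": "A context window limits how much text a model can attend to.",
    "doc3": "RAG retrieves documents and passes them to the model as context.",
}
print(build_grounded_prompt("How does RAG ground a model?", docs))
```

In production the word-overlap scorer would be replaced by embedding similarity over a vector store, but the shape is the same: retrieve, inject as context, instruct the model to cite.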

Why it matters

Without grounding, models generate plausible text that may be factually incorrect. With grounding, models generate text that can be traced to specific sources. This is the distinction between a creative writing tool and a reliable information system. sourc.dev itself is a grounding source — every data point is source-linked and citable.
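Traceability can be checked mechanically. A hypothetical sketch (the function and source names are illustrative): a cited claim passes only if its text actually occurs in the source it cites.

```python
def is_traceable(claim: str, source_id: str, sources: dict[str, str]) -> bool:
    """A claim is traceable if the cited source exists and contains its text."""
    return source_id in sources and claim.lower() in sources[source_id].lower()

# Toy corpus standing in for retrieved documents.
sources = {
    "vertex-docs": "Grounded responses are anchored to retrieved documents.",
}

print(is_traceable("anchored to retrieved documents", "vertex-docs", sources))  # True
print(is_traceable("models never hallucinate", "vertex-docs", sources))         # False
```

Real grounding verifiers use fuzzy or entailment-based matching rather than exact substrings, but the contract is the same: every cited span must be attributable to its source.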

Verified March 2026 · Source: Google Vertex AI grounding docs

Related terms
RAG · Hallucination · Context window
← All terms
← System prompt · Prompt engineering →