AI Terminology Glossary
This glossary explains the AI terms you are most likely to see in product pages, research summaries, and business conversations.
AGI
Artificial general intelligence: a hypothetical AI system that can perform a wide range of tasks across domains with human-level flexibility. No current public AI system is generally accepted as AGI.
AI Agent
An AI system that can use tools, execute multi-step plans, and pursue a goal, rather than produce a single response to a single prompt.
Alignment
The work of making AI systems behave according to human intentions, safety expectations, and stated constraints.
Attention
A mechanism that helps a model weigh which parts of the input are most relevant.
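The core idea can be sketched as scaled dot-product attention for a single query. The vectors below are made-up toy values; real models apply this across many attention heads with learned projections.

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query vector."""
    d = len(query)
    # Score each key by its dot product with the query, scaled by sqrt(d)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)
    # Return the weighted sum of the value vectors
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(len(values[0]))]

query = [1.0, 0.0]                       # resembles the first key
keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0]]
out = attention(query, keys, values)     # leans toward the first value vector
```

Because the query is closer to the first key, the first value vector gets more weight in the output.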
Benchmark
A standardized test used to compare AI models. Benchmarks are useful, but they do not always predict real-world performance.
Context Window
The amount of text, code, or other input a model can consider at once.
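One practical consequence: when input exceeds the window, something must be dropped. A minimal sketch, assuming a simple keep-the-most-recent strategy (real systems also use summarization or retrieval):

```python
def fit_to_context(tokens, max_tokens):
    """Keep only the most recent tokens when input exceeds the window."""
    if len(tokens) <= max_tokens:
        return tokens
    return tokens[-max_tokens:]

history = ["t0", "t1", "t2", "t3", "t4", "t5", "t6", "t7", "t8", "t9"]
window = fit_to_context(history, 4)  # → ['t6', 't7', 't8', 't9']
```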
Embedding
A numerical representation of text, images, or other content. Embeddings make similarity search possible.
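Similarity between embeddings is often measured with cosine similarity. A minimal sketch; the three-dimensional vectors are invented for illustration (real embeddings have hundreds or thousands of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy embeddings: semantically close content gets geometrically close vectors
cat = [0.9, 0.1, 0.0]
kitten = [0.8, 0.2, 0.1]
spreadsheet = [0.0, 0.1, 0.9]

assert cosine_similarity(cat, kitten) > cosine_similarity(cat, spreadsheet)
```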
Fine-Tuning
Additional training that adapts a model to a specific task, style, or data pattern.
Generative AI
AI that creates content such as text, images, audio, video, or code.
Hallucination
When an AI produces false or unsupported information while sounding confident.
Inference
Running a trained model on new input to produce an output, as distinct from training the model.
LLM
Large language model. A model trained on large text or code datasets to understand and generate language.
Machine Learning
An AI approach where systems learn patterns from data instead of only following hand-written rules.
Multimodal AI
AI that can process more than one type of input, such as text, images, audio, video, or code.
Prompt
The instruction or input you give to an AI model.
RAG
Retrieval-augmented generation. A pattern where a system retrieves relevant source material before asking a model to answer.
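The pattern can be sketched with a toy keyword-overlap retriever. This is an illustrative assumption: production systems retrieve with embeddings and a vector database, and send the assembled prompt to a real model.

```python
def retrieve(query, documents, top_k=1):
    """Rank documents by naive keyword overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:top_k]

def build_prompt(query, documents):
    """Retrieve source material first, then ask the model to answer from it."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "The context window is the amount of input a model can consider at once.",
    "Embeddings are numerical representations of content.",
]
prompt = build_prompt("What is a context window?", docs)
```

Grounding the prompt in retrieved text is what lets RAG systems answer from sources the model was never trained on.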
RLHF
Reinforcement learning from human feedback. A training method that uses human preference signals to improve model behavior.
Token
A piece of text processed by a model. Tokens can be words, word parts, punctuation, or characters depending on the tokenizer.
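A toy tokenizer shows why token counts differ from word counts. Real tokenizers such as BPE split text into learned subword pieces; this regex-based version is only an illustration.

```python
import re

def naive_tokenize(text):
    """Toy tokenizer: splits into word runs and individual punctuation marks."""
    return re.findall(r"\w+|[^\w\s]", text)

tokens = naive_tokenize("Tokens aren't always whole words.")
# → ['Tokens', 'aren', "'", 't', 'always', 'whole', 'words', '.']
```

Five words become eight tokens here, which is why model pricing and context limits are quoted in tokens, not words.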
Transformer
The neural network architecture behind most modern language models. Transformers use attention to process context efficiently.
Vector Database
A database designed to store embeddings and search by similarity. Commonly used in RAG systems.
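The core operation can be sketched as brute-force nearest-neighbor search over stored embeddings. The class name, document IDs, and two-dimensional vectors are invented for illustration; real vector databases use approximate nearest-neighbor indexes to scale to millions of vectors.

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

class ToyVectorStore:
    """Stores (id, embedding) pairs and searches them by cosine similarity."""
    def __init__(self):
        self.items = []

    def add(self, item_id, vector):
        self.items.append((item_id, vector))

    def search(self, query, top_k=2):
        # Brute force: score every stored vector against the query
        ranked = sorted(self.items, key=lambda it: cosine(query, it[1]), reverse=True)
        return [item_id for item_id, _ in ranked[:top_k]]

store = ToyVectorStore()
store.add("doc-cat", [0.9, 0.1])
store.add("doc-dog", [0.8, 0.3])
store.add("doc-tax", [0.1, 0.9])
results = store.search([1.0, 0.0], top_k=2)  # → ['doc-cat', 'doc-dog']
```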
Zero-Shot
When a model performs a task without being given examples in the prompt.
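The contrast with few-shot prompting (examples included in the prompt) can be shown with two illustrative prompt strings; the review text is invented.

```python
# Zero-shot: the task is described, but no examples are given
zero_shot = (
    "Classify the sentiment of this review as positive or negative.\n"
    "Review: I loved it."
)

# Few-shot: the same task, with worked examples in the prompt
few_shot = (
    "Review: Terrible service. Sentiment: negative\n"
    "Review: Fantastic food. Sentiment: positive\n"
    "Review: I loved it. Sentiment:"
)
```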