Comprehensive definitions of AI terms and concepts
Artificial Intelligence (AI): The simulation of human intelligence in machines that are programmed to think and learn like humans. AI systems can perform tasks such as visual perception, speech recognition, decision-making, and language translation.
Machine Learning (ML): A subset of AI that enables systems to learn and improve from experience without being explicitly programmed. ML algorithms use statistical techniques to enable computers to "learn" from data.
Deep Learning: A subset of machine learning that uses neural networks with multiple layers (hence "deep") to learn increasingly abstract representations of data. It's particularly effective for image recognition, natural language processing, and speech recognition.
Natural Language Processing (NLP): A branch of AI that helps computers understand, interpret, and manipulate human language. NLP enables applications like chatbots, translation services, and sentiment analysis.
Large Language Model (LLM): A type of AI model trained on vast amounts of text data to understand and generate human-like text. Examples include GPT, Claude, and Gemini.
Prompt: The input text or instruction given to an AI model to generate a response. Effective prompting is crucial for getting desired outputs from AI systems.
Prompt Engineering: The practice of designing and refining prompts to get the best results from AI models. It involves crafting clear, specific instructions to guide AI behavior, as in the sketch below.
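A purely illustrative comparison of a vague prompt and a more engineered one for the same task; the wording and constraints shown are made up, not a prescribed template.

```python
# Illustrative only: two prompts for the same task, showing how added role,
# structure, and constraints guide a model toward the desired output.

vague_prompt = "Write about dogs."

refined_prompt = (
    "You are a veterinary writer. Write a 150-word overview of common "
    "health issues in senior dogs. Use plain language, list three issues "
    "as bullet points, and end with one sentence on when to see a vet."
)
```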
Hallucination: When an AI model generates information that is incorrect, nonsensical, or not grounded in its training data or the provided context. A common challenge in AI systems that requires careful validation of outputs.
Multimodal AI: AI systems that can process and understand multiple types of input (text, images, audio, video) simultaneously. Examples include GPT-4 Vision and Gemini.
AI Agent: An autonomous AI system that can perceive its environment, make decisions, and take actions to achieve specific goals. Agents can use tools and interact with external systems; a toy version of that loop is sketched below.
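A minimal sketch of the observe-decide-act loop, assuming a hard-coded decision rule and a toy calculator tool in place of a real model; in practice an LLM chooses which tool to call and when the goal is met.

```python
# Purely illustrative agent loop: decide on an action, call a tool, observe
# the result, and repeat until the goal is reached.

def calculator_tool(expression):
    # Toy tool the agent can call; used here only on a fixed, trusted string.
    return eval(expression, {"__builtins__": {}})

def decide_next_action(goal, observations):
    # Stand-in for the model's reasoning step.
    if not observations:
        return ("calculator", "19 * 23")
    return ("finish", f"The answer is {observations[-1]}")

goal = "Compute 19 * 23"
observations = []
while True:
    action, argument = decide_next_action(goal, observations)
    if action == "finish":
        print(argument)
        break
    observations.append(calculator_tool(argument))  # act, then observe the result
```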
Copilot: An AI assistant that works alongside users to help with tasks in real-time. Examples include GitHub Copilot for coding and Microsoft Copilot for productivity.
GPT (Generative Pre-trained Transformer): A type of large language model developed by OpenAI that uses the transformer architecture to generate human-like text based on prompts.
API (Application Programming Interface): A set of rules and protocols that lets software applications communicate with each other. AI APIs allow developers to integrate AI capabilities into their applications.
Token: A unit of text that an AI model processes. Tokens can be words, parts of words, or characters. Token count affects API costs and model limits.
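A minimal sketch of counting tokens, assuming the tiktoken package is installed (pip install tiktoken); different models use different encodings, so counts vary with the encoding you pick.

```python
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by several OpenAI models
tokens = enc.encode("Tokens can be words, parts of words, or characters.")
print(len(tokens))         # number of tokens the model would process
print(enc.decode(tokens))  # decoding round-trips back to the original text
```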
Fine-tuning: The process of further training a pre-trained AI model on a specific dataset to adapt it for particular tasks or domains. This improves performance on specific use cases.
Embedding: A numerical representation of text, images, or other data that captures semantic meaning. Embeddings enable AI models to understand relationships between different pieces of information.
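A toy illustration of the idea: embeddings are vectors, and semantic closeness is commonly measured with cosine similarity. The 4-dimensional vectors below are made up; real embeddings typically have hundreds or thousands of dimensions.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

cat    = [0.8, 0.1, 0.3, 0.0]  # hypothetical embedding of "cat"
kitten = [0.7, 0.2, 0.4, 0.1]  # hypothetical embedding of "kitten"
car    = [0.1, 0.9, 0.0, 0.5]  # hypothetical embedding of "car"

print(cosine_similarity(cat, kitten))  # high: related meanings
print(cosine_similarity(cat, car))     # lower: unrelated meanings
```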
Retrieval-Augmented Generation (RAG): A technique that combines information retrieval with text generation. RAG systems retrieve relevant documents and use them to generate more accurate and contextual responses.
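A minimal RAG sketch: score stored documents against the question, take the best matches, and build an augmented prompt. Retrieval here uses naive word overlap purely so the example runs on its own; real systems use embeddings and a vector database, and the final model call (not shown) depends on your provider.

```python
documents = [
    "The refund window is 30 days from the date of purchase.",
    "Support is available by email Monday through Friday.",
    "Shipping to most regions takes 3 to 5 business days.",
]

def overlap_score(question, doc):
    # Count words shared between the question and the document.
    return len(set(question.lower().split()) & set(doc.lower().split()))

question = "How many days do I have to request a refund?"

# Retrieve the two most relevant documents.
ranked = sorted(documents, key=lambda d: overlap_score(question, d), reverse=True)
context = "\n".join(ranked[:2])

# Augment the prompt with the retrieved context before sending it to an LLM.
prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"
print(prompt)
```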
Vector Database: A specialized database designed to store and query high-dimensional vectors (embeddings). Used in AI applications for similarity search and retrieval.
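Building on the similarity idea above, a toy brute-force version of what a vector database query does: rank stored vectors by similarity to a query vector and return the closest matches. The vectors are made up, and real vector databases use approximate nearest-neighbor indexes to do this at scale.

```python
import numpy as np

stored = np.array([
    [0.8, 0.1, 0.3],  # hypothetical embedding of document A
    [0.1, 0.9, 0.0],  # document B
    [0.7, 0.2, 0.4],  # document C
])
query = np.array([0.75, 0.15, 0.35])

# Cosine similarity of the query against every stored vector.
sims = stored @ query / (np.linalg.norm(stored, axis=1) * np.linalg.norm(query))
top_k = np.argsort(sims)[::-1][:2]
print(top_k)  # indices of the two most similar stored vectors
```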
Transformer: A deep learning architecture that uses attention mechanisms to process sequences of data. Transformers are the foundation of modern LLMs like GPT and BERT.
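A compact sketch of scaled dot-product attention, the core operation inside the transformer architecture; shapes and values below are arbitrary toy data, not taken from any particular model.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V  # each output is a weighted mix of the values

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 tokens, 8-dimensional queries
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```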
Zero-Shot Learning: The ability of an AI model to perform tasks it wasn't explicitly trained on, using only its general knowledge and understanding from pre-training.
Few-Shot Learning: A technique where an AI model learns to perform a task with only a few examples provided in the prompt, without additional training.
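An illustrative few-shot prompt for sentiment labeling: the examples embedded in the prompt teach the task without any extra training (contrast with the zero-shot case, which would state only the instruction). The reviews shown are invented.

```python
few_shot_prompt = """Classify the sentiment of each review as Positive or Negative.

Review: "The battery lasts all day and the screen is gorgeous."
Sentiment: Positive

Review: "It stopped working after a week and support never replied."
Sentiment: Negative

Review: "Setup took five minutes and everything just worked."
Sentiment:"""
```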
Temperature: A parameter that controls the randomness of AI model outputs. Lower temperature (0-0.5) produces more focused, deterministic responses. Higher temperature (0.7-1.0) produces more creative, varied responses.
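A toy illustration of how temperature reshapes a model's output distribution before sampling: the same scores become near-deterministic at low temperature and flatter (more varied) at high temperature. The scores are made-up stand-ins for a model's logits.

```python
import math

def softmax_with_temperature(scores, temperature):
    scaled = [s / temperature for s in scores]
    exp = [math.exp(s - max(scaled)) for s in scaled]
    total = sum(exp)
    return [e / total for e in exp]

scores = [2.0, 1.0, 0.5]  # hypothetical logits for three candidate tokens

print(softmax_with_temperature(scores, 0.2))  # one token dominates (focused)
print(softmax_with_temperature(scores, 1.0))  # probabilities are more even (varied)
```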
Bias: Systematic errors in AI systems that create unfair outcomes, often reflecting biases present in training data or model design. Addressing bias is crucial for ethical AI.
Freemium: A business model where basic features are free, but advanced features require a paid subscription. Common in AI tool pricing.
Open Source: Software with source code that is freely available for modification and distribution. Many AI models and tools are open source, allowing community contributions.