Artificial Intelligence glossary: the top 15 definitions you need to know

As with any generational shift in technology, the rapidly growing influence of artificial intelligence on day-to-day life brings its own language. Even as we experiment with chatbots like ChatGPT or Google Bard to get a better sense of what they’re capable of, we’re challenged to learn a seemingly ever-expanding list of terms and definitions. Perhaps more than with previous shifts, understanding the subtle and not-so-subtle meanings of this essentially new language is critical to getting the most out of AI. With that in mind, we’ve surveyed some of the most respected sources of computer science thought leadership and compiled an initial glossary of terms. Check it out.

IT’S ALL ABOUT THE DEFINITION

Algorithm

“An algorithm is a procedure used for solving a problem or performing a computation. Algorithms act as an exact list of instructions that conduct specified actions step by step in either hardware- or software-based routines.”

Source: TechTarget
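
To make “an exact list of instructions” concrete, here is a minimal sketch in Python of one classic algorithm, binary search; each line is one explicit step of the procedure:

```python
def binary_search(sorted_items, target):
    """Classic algorithm: repeatedly halve the search range until the target is found."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2           # step 1: look at the midpoint
        if sorted_items[mid] == target:   # step 2: compare it with the target
            return mid                    # found it: report the position
        elif sorted_items[mid] < target:  # step 3: discard the half that can't contain it
            low = mid + 1
        else:
            high = mid - 1
    return -1                             # the target isn't in the list

print(binary_search([2, 5, 8, 12, 16, 23, 38], 16))  # prints 4
```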

Artificial Intelligence (AI)

“At its simplest form, artificial intelligence is a field, which combines computer science and robust datasets, to enable problem-solving. It also encompasses sub-fields of machine learning and deep learning, which are frequently mentioned in conjunction with artificial intelligence. These disciplines are comprised of AI algorithms which seek to create expert systems which make predictions or classifications based on input data.”

Source: IBM

Big Data

“Big data is a voluminous set of structured, unstructured, and semi-structured datasets, which is challenging to manage using traditional data processing tools. It requires additional infrastructure to govern, analyze, and convert into insights.”

Source: Spiceworks

Chatbot

“At the most basic level, a chatbot is a computer program that simulates and processes human conversation (either written or spoken), allowing humans to interact with digital devices as if they were communicating with a real person. Chatbots can be as simple as rudimentary programs that answer a simple query with a single-line response, or as sophisticated as digital assistants that learn and evolve to deliver increasing levels of personalization as they gather and process information.”

Source: Oracle
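
The “rudimentary program” end of that spectrum is easy to picture in code. The sketch below is a toy rule-based chatbot; the keywords and canned replies are purely illustrative, and real assistants layer language models, memory, and personalization on top of this basic idea:

```python
# Toy rule-based chatbot: match a keyword, return a canned reply.
RULES = {
    "hours": "We're open 9am to 5pm, Monday through Friday.",
    "price": "Our plans start at $10 per month.",
    "human": "Sure - connecting you with a support agent now.",
}

def reply(message: str) -> str:
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    return "Sorry, I didn't catch that. Could you rephrase?"

print(reply("What are your hours?"))    # matches "hours"
print(reply("Can I talk to a human?"))  # matches "human"
```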

Data Mining

“Data mining is the process of discovering meaningful correlations, patterns, and trends by sifting through large amounts of data stored in repositories. Data mining employs pattern recognition technologies, as well as statistical and mathematical techniques.”

Source: Gartner
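
One of the simplest “statistical and mathematical techniques” in that toolbox is measuring how strongly two variables move together. The sketch below does exactly that for a made-up dataset; the figures are invented, and this single correlation stands in for the much broader set of techniques real data mining applies at scale:

```python
from statistics import correlation  # available in Python 3.10+

# Toy "repository": monthly ad spend ($k) versus units sold (invented figures).
ad_spend   = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
units_sold = [11,  19,  33,  38,  52,  61]

# Pearson correlation: values near +1 suggest the two move together strongly.
print(round(correlation(ad_spend, units_sold), 3))
```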

Deep Learning

“Deep Learning is a machine learning technique that constructs artificial neural networks to mimic the structure and function of the human brain. In practice, deep learning, also known as deep structured learning or hierarchical learning, uses a large number of hidden layers – typically more than 6 but often much higher – of nonlinear processing to extract features from data and transform the data into different levels of abstraction (representations).”

Source: DeepAI
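
The key idea in that quote (many stacked layers of nonlinear processing that turn raw inputs into progressively more abstract representations) can be sketched in plain Python. The network below is deliberately shrunken, with just three hidden layers and random untrained weights, so it illustrates the structure rather than anything a production system would use:

```python
import random

def relu(values):
    """A common nonlinearity: keep positive values, zero out negative ones."""
    return [max(0.0, v) for v in values]

def dense_layer(inputs, weights, biases):
    """One layer: every output neuron is a weighted sum of all inputs plus a bias."""
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

random.seed(0)
layer_sizes = [4, 8, 8, 8, 2]  # input -> three hidden layers -> output
weights = [[[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]
           for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]
biases = [[0.0] * n_out for n_out in layer_sizes[1:]]

x = [0.5, -1.2, 3.0, 0.1]           # raw input features
for w, b in zip(weights, biases):   # each layer transforms the previous representation
    x = relu(dense_layer(x, w, b))  # into a new, more abstract one
print(x)                            # the final representation / output
```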

Generative AI

“Generative AI refers to a category of artificial intelligence (AI) algorithms that generate new outputs based on the data they have been trained on. Unlike traditional AI systems that are designed to recognize patterns and make predictions, generative AI creates new content in the form of images, text, audio, and more.”

Source: World Economic Forum
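
Real generative models are vastly more capable, but the core loop (learn from training data, then produce new output rather than a label) shows up even in a toy word-level Markov chain. Everything below, from the corpus to the sampling, is purely illustrative:

```python
import random
from collections import defaultdict

# Tiny "training corpus" (illustrative; real models train on vastly more data).
corpus = "the cat sat on the mat the cat saw the dog the dog sat on the rug"
words = corpus.split()

# "Training": record which words follow each word in the data.
follows = defaultdict(list)
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)

# "Generation": build new text by repeatedly sampling a plausible next word.
random.seed(1)
word, output = "the", ["the"]
for _ in range(8):
    candidates = follows[word]
    if not candidates:          # dead end: this word never appeared mid-corpus
        break
    word = random.choice(candidates)
    output.append(word)
print(" ".join(output))
```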

Machine Learning (ML)

“Machine learning is an application of AI that enables systems to learn and improve from experience without being explicitly programmed. Machine learning focuses on developing computer programs that can access data and use it to learn for themselves.”

Source: ExpertAI
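
“Learning from data rather than explicit rules” can be seen in miniature in a nearest-neighbor classifier: nobody writes an “if it weighs more than X grams it’s an apple” rule, because the labeled examples carry that information. The tiny dataset below is invented for illustration:

```python
# (weight in grams, diameter in cm) -> label; invented example data.
examples = [
    ((150, 7.0), "apple"),
    ((170, 7.5), "apple"),
    ((140, 6.5), "apple"),
    ((110, 5.0), "mandarin"),
    ((100, 4.8), "mandarin"),
    ((120, 5.2), "mandarin"),
]

def classify(weight, diameter):
    """1-nearest-neighbor: predict the label of the most similar known example."""
    def distance(features):
        w, d = features
        return ((w - weight) ** 2 + (d - diameter) ** 2) ** 0.5
    nearest = min(examples, key=lambda example: distance(example[0]))
    return nearest[1]

print(classify(145, 6.8))  # closest to the apples
print(classify(105, 5.1))  # closest to the mandarins
```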

Metadata

“The information that describes and explains data. It provides context with details such as the source, type, owner, and relationships to other data sets, thus helping you understand the relevance of a particular data set and guiding you on how to use it.”

Source: AtlanAI
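
In practice, metadata often travels with a dataset as a small structured record. Here is a sketch of what such a record might contain, with purely illustrative field names and values:

```python
# The dataset itself (rows of transactions, say) lives elsewhere; this record
# describes it so people and tools know what it is and how to use it.
dataset_metadata = {
    "name": "monthly_sales_2024",
    "source": "point-of-sale export",        # where the data came from
    "type": "CSV, one row per transaction",   # what shape it takes
    "owner": "finance-team@example.com",      # who is responsible for it
    "last_updated": "2024-06-30",
    "related_to": ["customers_2024", "product_catalog"],  # links to other datasets
}

for field, value in dataset_metadata.items():
    print(f"{field}: {value}")
```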

Natural Language Processing (NLP)

“The ability of a computer program to understand human language as it is spoken and written — referred to as natural language. It is a component of artificial intelligence.”

Source: TechTarget
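
Genuine language understanding takes large models, but most NLP pipelines begin with the same humble steps: normalize the raw text, split it into tokens, and count or compare them. A minimal sketch of that first stage:

```python
import re
from collections import Counter

text = "Natural language processing helps computers read text. Computers read a lot of text!"

# Tokenize: lower-case the text and pull out word tokens.
tokens = re.findall(r"[a-z']+", text.lower())

# A first, crude layer of "understanding": which words dominate the document?
print(Counter(tokens).most_common(3))
```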

Neural Network

 “A series of algorithms that endeavors to recognize underlying relationships in a set of data through a process that mimics the way the human brain operates. In this sense, neural networks refer to systems of neurons, either organic or artificial in nature. Neural networks can adapt to changing input; so the network generates the best possible result without needing to redesign the output criteria.”

Source: Investopedia
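
At the smallest scale, each artificial neuron in such a network is just a weighted sum of its inputs passed through a squashing function. The weights below are hand-picked for illustration; in a real network they are learned from data, and many of these neurons are wired together:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weight each input, add them up, squash into 0..1."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-total))  # sigmoid activation

# Illustrative values only; a real network learns its weights from data.
print(neuron([0.9, 0.1, 0.4], weights=[2.0, -1.5, 0.5], bias=-0.3))
```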

Predictive Analytics

“The use of data to predict future trends and events. It uses historical data to forecast potential scenarios that can help drive strategic decisions.”

Source: Harvard Business School
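
One of the simplest forms this takes is fitting a trend line to historical figures and extending it one period ahead. The quarterly revenue numbers below are invented, and real forecasting models account for far more than a straight line:

```python
from statistics import linear_regression  # available in Python 3.10+

# Historical data: quarterly revenue in $k (invented figures).
quarters = [1, 2, 3, 4, 5, 6]
revenue  = [100, 108, 118, 125, 137, 144]

# Fit a straight-line trend to the history, then project the next quarter.
slope, intercept = linear_regression(quarters, revenue)
forecast_q7 = slope * 7 + intercept
print(f"Projected Q7 revenue: ~{forecast_q7:.0f}k")
```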

Supervised Learning

“Supervised learning, also known as supervised machine learning, is a subcategory of machine learning and artificial intelligence. It is defined by its use of labeled datasets to train algorithms to classify data or predict outcomes accurately. As input data is fed into the model, it adjusts its weights until the model has been fitted appropriately, which occurs as part of the cross validation process. Supervised learning helps organizations solve for a variety of real-world problems at scale, such as classifying spam in a separate folder from your inbox.”

Source: IBM
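
The phrase “it adjusts its weights until the model has been fitted” can be watched happening in a perceptron, one of the oldest supervised algorithms, trained on a tiny labeled “spam” dataset. The features and labels are invented for illustration; real spam filters use far richer features and proper validation:

```python
# Each email is reduced to three illustrative features:
# (mentions "free", contains a link, sender is in the address book); label 1 = spam.
labeled_data = [
    ((1, 1, 0), 1),
    ((1, 0, 0), 1),
    ((0, 1, 0), 1),
    ((0, 0, 1), 0),
    ((1, 0, 1), 0),
    ((0, 1, 1), 0),
]

weights, bias, lr = [0.0, 0.0, 0.0], 0.0, 0.1

def predict(x):
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

# Training loop: whenever a prediction disagrees with its label, nudge the weights.
for _ in range(20):
    for x, label in labeled_data:
        error = label - predict(x)
        weights = [w + lr * error * xi for w, xi in zip(weights, x)]
        bias += lr * error

print([predict(x) for x, _ in labeled_data])  # matches the labels once fitted
```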

Turing Test

“A deceptively simple method of determining whether a machine can demonstrate human intelligence: If a machine can engage in a conversation with a human without being detected as a machine, it has demonstrated human intelligence. The Turing Test was proposed in a paper published in 1950 by mathematician and computing pioneer Alan Turing. It has become a fundamental motivator in the theory and development of artificial Intelligence.”

Source: Investopedia

Unsupervised Learning

“Unsupervised learning uses machine learning algorithms to analyze and cluster unlabeled data sets. These algorithms discover hidden patterns in data without the need for human intervention (hence, they are “unsupervised”).”

Source: IBM 
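
Clustering is the classic example. The sketch below runs a bare-bones k-means on a handful of unlabeled numbers (invented order values in dollars); three groups emerge without anyone labeling a single data point:

```python
# Unlabeled data: order values in dollars (invented for illustration).
data = [12, 15, 14, 13, 95, 102, 99, 101, 50, 52]

# Bare-bones k-means with k=3: repeatedly (1) assign each point to its nearest
# center, then (2) move each center to the mean of the points assigned to it.
centers = [12.0, 50.0, 100.0]  # simple starting guesses
for _ in range(10):
    clusters = [[] for _ in centers]
    for x in data:
        nearest = min(range(len(centers)), key=lambda i: abs(x - centers[i]))
        clusters[nearest].append(x)
    centers = [sum(c) / len(c) if c else centers[i] for i, c in enumerate(clusters)]

print(centers)  # three "hidden" groups found without any labels
```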

THE BOTTOM LINE

Much like the technology it describes, language is constantly evolving to meet the needs of the ever-shifting world around it. This has always been true in the tech space, and it is especially so as AI continues to grow its influence.

We’ll continue to track AI’s growth, and will update these terms in future entries here. In the meantime, if you’re still wondering how AI could – or should – be influencing your own business direction, reach out anytime.