AI From A-Z: The biggest buzzwords around artificial intelligence


Note: A version of this article originally appeared in The Quintessence.

Companies across markets, from consumer electronics to factory floors, are finding power in the introduction of two letters: AI. The world of artificial intelligence is vast, and plenty of terms get thrown around this industry-shifting technology. Here’s a handful of the most important and frequent terms to consider when exploring how you might implement this technology in your business:

Algorithm: An unambiguous, step-by-step description of a sequence of actions to solve a – usually mathematical – problem.
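
A classic illustration is Euclid's algorithm, which spells out an unambiguous sequence of steps for finding the greatest common divisor of two integers:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    until the remainder is zero; the last non-zero value is the GCD."""
    while b != 0:
        a, b = b, a % b
    return a
```

Every run of the same inputs follows the same steps to the same answer, which is exactly what makes it an algorithm rather than a heuristic guess.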

Bias: In the context of AI, this term describes a systematic skew in a system that leads to a tendency, distortion, or even an error in its results. When learning from data sets, an AI system can adopt the cultural stereotypes or prejudices of the people who created or generated those data sets.

Big Data: Data volumes that are too large to be evaluated using conventional data-processing methods. The data volumes involved are measured in zettabytes. The aim of big-data applications is to compare and analyze wide-ranging types of data.

Bluetooth: Internationally standardized wireless data interface. It enables different mobile devices, such as mobile phones or notebook computers, to connect wirelessly: a radio link is established between the devices’ transmitter and receiver units. Depending on the device class, it covers distances from roughly 10 to 100 meters – enough to link on-board devices in a vehicle or to connect a smartphone with accessories. There is also an energy-saving variant, Bluetooth Low Energy, standardized primarily for the transfer of sensor values and control data.

Chatbot: Program that simulates human conversation. Chatbots are not just speech-based, but can also work using text. The chatbot is trained in advance on the answers it can give to questions from the conversation partner.
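
A minimal sketch of the idea, assuming a purely rule-based text chatbot with hypothetical keywords and canned answers (production chatbots are trained on far richer data and use NLP rather than keyword matching):

```python
# Responses are fixed in advance; the bot matches keywords in the
# user's message. The rules below are illustrative placeholders.
RULES = {
    "hello": "Hello! How can I help you?",
    "price": "Our pricing information is available on request.",
    "bye": "Goodbye!",
}

def reply(message: str) -> str:
    """Return the canned answer for the first matching keyword."""
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    return "Sorry, I did not understand that."
```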

Cloud Computing: The provision, use, and billing of IT services over a network such as the Internet, scaled dynamically to demand.

CPU (Central Processing Unit): The core component of a computer. It is a microprocessor that controls the computer and carries out its computing operations.

Cybersecurity: An umbrella term both for the risks posed by connecting to and over the Internet and for the practices and solutions that counter these risks.

Data mining: Processing large data sets (big data) by linking individual pieces of data to one another, revealing new information such as patterns and correlations.

Deep learning: Sub-area of machine learning in which deep neural networks are used. Whereas classical machine learning often works with flat models and hand-crafted features, the algorithms in deep learning are hierarchical, organized in layers of increasing complexity and abstraction.

Embedded system: Hardware and software components integrated into a unified system to implement system-specific functional features.

Industry 4.0: The concept describes the increasing connectivity and digitalization in industrial manufacturing. At its core is the convergence of information technology and production engineering.

Inference: The application phase of Artificial Intelligence. After the system has been trained, it applies what it has learned to new, previously unseen data.

Internet of Things: Linkage of physical objects (things) to virtual replicas of them on the Internet. Everyday objects are provided with their own intelligence and are connected to each other or to the Internet.

Lidar (light detection and ranging): A method of optical distance and speed measurement related to radar. It involves the emission of laser beams. Sensors then detect the reflected light. The distance is calculated from the travel time of the light.
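
The distance calculation itself is simple: the laser pulse travels to the target and back at the speed of light, so the range is half the round-trip time multiplied by that speed. A minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_distance(round_trip_time_s: float) -> float:
    """Distance to a target from a laser pulse's round-trip travel time.
    The pulse covers the distance twice (out and back), hence the /2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```

A reflection arriving after one microsecond, for example, corresponds to a target roughly 150 meters away.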

Machine learning: Procedure by which computer systems acquire knowledge independently and expand it, making them better able to solve a given problem than before. The system extracts the most important patterns and characteristics from large data volumes and can make predictions based on them.
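
As a toy illustration of "extracting a pattern from data," here is a least-squares line fit: the system sees example pairs, recovers the underlying relationship, and can then predict unseen values. Real machine-learning systems do this at vastly larger scale, but the principle is the same.

```python
def fit_line(xs, ys):
    """Least-squares fit of y = w*x + b: the 'pattern' extracted
    from the observed (x, y) examples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    w = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - w * mean_x
    return w, b

# Learn from examples that follow y = 2x + 1, then predict for x = 5
w, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
prediction = w * 5 + b
```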

NLP (natural-language processing): Technology that deals with processing natural language. Using relevant algorithms, computers can understand human language and its meaning, and execute relevant instructions.

Neural networks: Computer programs inspired by the way biological neurons work, capable of learning tasks from examples.
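
A sketch of the smallest possible case, assuming a single artificial neuron (a perceptron) trained on the logical OR function: weighted inputs feed a threshold, and the weights are nudged whenever a prediction is wrong. Real networks stack many such units in layers.

```python
def train_perceptron(samples, epochs=10, lr=0.1):
    """Single artificial neuron: weighted inputs plus a bias feed a
    threshold; weights are adjusted after each wrong prediction."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learn the OR function from its four input/output examples
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
```

After a few passes over the data, the neuron reproduces OR correctly for all four inputs, having "learned" the task rather than being explicitly programmed for it.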

Radar (radio detection and ranging): Detection and ranging technique based on the electromagnetic waves in the radio frequency band.

Sensor fusion: The intelligent convergence and processing of all sensor data required for autonomous processes. The results of sensor fusion are better than those obtained from the interpretation of data from individual sensors.
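
One common way to see why fused results beat any single sensor is inverse-variance weighting: the less noisy sensor gets the larger weight, and the combined estimate has lower variance than either input. A sketch with illustrative numbers:

```python
def fuse(estimate_a, var_a, estimate_b, var_b):
    """Inverse-variance weighted average of two sensor readings.
    The fused variance is always smaller than either input variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical readings: radar says 10.2 m (variance 0.04),
# lidar says 10.0 m (variance 0.01)
distance, variance = fuse(10.2, 0.04, 10.0, 0.01)
```

The fused distance lands close to the more precise lidar reading, and its variance is below both inputs, which is the whole point of combining sensors.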

Singularity: Also called technological singularity – the point in time at which machines are so advanced that they can upgrade themselves.

Wi-Fi: Designation for a consortium of companies issuing certification of devices with wireless interfaces as well as for the associated brand name. The consortium comprises over 300 companies and certifies products from various manufacturers based on the IEEE-802.11 standard.

WLAN (wireless local area network): Local wireless network primarily based on the IEEE 802.11 standard in the 2.4 GHz and 5 GHz frequency ranges. Depending on the standard generation, data transfer rates range from 1 Mbit/s to several hundred Mbit/s and beyond.

Take a deep dive into artificial intelligence in the latest edition of The Quintessence.

