XR, AI, VR, and AR Glossary

Welcome to our comprehensive glossary designed to help you navigate the complex world of Extended Reality (XR), Artificial Intelligence (AI), Virtual Reality (VR), and Augmented Reality (AR). Below you will find clear, concise definitions of key technical terms used across these emerging technologies and their products.

A

Accelerometer
A sensor in XR devices that measures acceleration forces to track device movement and orientation.
Adaptive Bitrate Streaming
A technique that adjusts video quality dynamically based on network conditions for smooth XR streaming.
AI (Artificial Intelligence)
Computer systems capable of performing tasks that require human intelligence, such as perception, learning, and decision-making.
AI Assistant
Software that uses AI to provide voice or gesture-based assistance within XR environments.
Ambient Occlusion
Rendering method that simulates soft shadows where surfaces meet to enhance 3D realism.
AR (Augmented Reality)
Technology that overlays digital content onto the real world through devices like glasses or smartphones.
AR Cloud
Persistent, shared 3D spatial map enabling multi-user AR experiences anchored to real-world locations.
Avatar
Digital representation of a user in virtual or augmented environments.
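
The Adaptive Bitrate Streaming entry above can be sketched as a simple throughput-based ladder selection; the bitrate rungs and safety margin are illustrative assumptions (real players also weigh buffer level):

```python
# Minimal sketch of adaptive-bitrate rung selection (hypothetical ladder).
BITRATE_LADDER_KBPS = [1_500, 4_000, 8_000, 20_000]

def select_bitrate(measured_throughput_kbps: float, safety: float = 0.8) -> int:
    """Return the highest rung that fits within a safety margin of throughput."""
    budget = measured_throughput_kbps * safety
    viable = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return viable[-1] if viable else BITRATE_LADDER_KBPS[0]

print(select_bitrate(12_000))  # 8000: the 20000 rung exceeds 12000 * 0.8
```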

B

Binaural Audio
3D spatial sound technology mimicking human hearing to create immersive audio experiences.
Bounding Box
Virtual frame defining the interactive area or boundaries of objects in 3D space.
Brightness (Nits)
Display luminance measured in nits (candelas per square meter); higher values improve visibility in bright environments.
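
One cue behind the Binaural Audio entry is the interaural time difference (ITD): sound reaches the far ear slightly later than the near ear. A rough sketch, assuming the simple d*sin(theta)/c model and a nominal head width:

```python
import math

# Rough interaural time difference for a source at a given azimuth.
# Head width and the d*sin(theta)/c model are simplifying assumptions.
def itd_seconds(azimuth_deg: float, head_width_m: float = 0.18,
                speed_of_sound: float = 343.0) -> float:
    """Extra travel time to the far ear for a source at this azimuth."""
    return head_width_m * math.sin(math.radians(azimuth_deg)) / speed_of_sound

print(f"{itd_seconds(90) * 1e6:.0f} us")  # ~525 us for a source directly to the side
```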

C

CAD (Computer-Aided Design)
Software used to create precise 3D models for XR content development.
Chromatic Aberration
Optical distortion causing color fringes around objects in XR displays.
Computer Vision
AI technology enabling devices to interpret and process visual data for object recognition and tracking.
Convergence
Inward rotation of both eyes to focus on a nearby object; comfortable XR viewing depends on keeping convergence consistent with the display's focal depth.
Cylindrical Lens
Curved lens used in AR glasses to widen the horizontal field of view.

D

Degrees of Freedom (DoF)
Number of motion axes tracked: 3DoF tracks rotation only (pitch, yaw, roll); 6DoF also tracks position (x, y, z).
Depth Sensor
Hardware that measures distances to objects for 3D mapping and spatial awareness.
Digital Twin
Virtual replica of a physical object or system used for simulation and monitoring.
Diopter Adjustment
Optical correction in XR devices for users with vision impairments.
DLSS (Deep Learning Super Sampling)
AI-powered rendering technique that improves image quality while reducing GPU load.
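
The 3DoF/6DoF distinction can be made concrete with two small pose types; the field names and values are illustrative assumptions:

```python
from dataclasses import dataclass

# Illustrative pose types: 3DoF is rotation only, 6DoF adds position.
@dataclass
class Pose3DoF:
    pitch: float  # rotation about x, degrees
    yaw: float    # rotation about y, degrees
    roll: float   # rotation about z, degrees

@dataclass
class Pose6DoF(Pose3DoF):
    x: float = 0.0  # position in meters
    y: float = 0.0
    z: float = 0.0

head = Pose6DoF(pitch=0.0, yaw=45.0, roll=0.0, x=0.2, y=1.6, z=-0.5)
print(head.yaw, head.y)  # 45.0 1.6
```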

E

Edge Computing
Local data processing near the device to reduce latency in XR applications.
EMG (Electromyography)
Sensors detecting muscle electrical activity used for gesture control.
Eye Tracking
Technology that monitors gaze direction to enable foveated rendering and intuitive interaction.

F

Field of View (FoV)
Angular extent of the visible scene through XR displays, measured in degrees.
Foveated Rendering
Rendering technique that sharpens detail where the user is looking to optimize performance.
Fresnel Lens
Lightweight lens with concentric grooves used in VR optics to focus light efficiently.
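
The Field of View entry can be illustrated with basic geometry: for a flat virtual image of width w viewed at distance d, the horizontal FoV is 2*atan(w/(2d)). A minimal sketch with illustrative numbers:

```python
import math

# Horizontal field of view from virtual image width and viewing distance,
# using fov = 2 * atan(w / (2d)). Inputs are example values, not real specs.
def horizontal_fov_deg(image_width_m: float, distance_m: float) -> float:
    return math.degrees(2 * math.atan(image_width_m / (2 * distance_m)))

print(round(horizontal_fov_deg(2.0, 1.0)))  # 90
```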

G

Gesture Recognition
AI technology interpreting hand or body movements for XR control.
GPU (Graphics Processing Unit)
Hardware responsible for rendering high-quality XR visuals.
Guardian System
Virtual boundary system in VR that alerts users when approaching physical obstacles.

H

Haptic Feedback
Tactile sensations such as vibrations that simulate touch in XR devices.
HDR (High Dynamic Range)
Display technology enhancing contrast and color range for realistic visuals.
HMD (Head-Mounted Display)
Wearable device that displays XR content directly in front of the user’s eyes.

I

IMU (Inertial Measurement Unit)
Sensor combining accelerometer and gyroscope to track motion and orientation.
Inside-Out Tracking
Tracking method using cameras on the headset to determine position without external sensors.
Interpupillary Distance (IPD)
Adjustable distance between lenses to match the user’s pupil spacing for visual comfort.
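
How an IMU combines its two sensors can be sketched with a complementary filter, a common pattern for fusing gyroscope and accelerometer readings (the 0.98 gain is a typical but assumed value):

```python
# Complementary filter: blend the integrated gyro rate (smooth but drifts)
# with the accelerometer angle (noisy but drift-free).
def complementary_filter(angle_deg, gyro_rate_dps, accel_angle_deg,
                         dt_s, alpha=0.98):
    gyro_estimate = angle_deg + gyro_rate_dps * dt_s
    return alpha * gyro_estimate + (1 - alpha) * accel_angle_deg

angle = 0.0
for _ in range(200):  # stationary headset actually tilted 30 degrees, 100 Hz
    angle = complementary_filter(angle, 0.0, 30.0, 0.01)
print(round(angle, 1))  # converges toward 30
```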

K

Keyframe Animation
Predefined animation frames used to create smooth motion in XR content.

L

Laser Beam Scanning (LBS)
Projection technology using lasers for bright, high-resolution AR displays.
LCoS (Liquid Crystal on Silicon)
Reflective display technology used in compact XR optics.
Level of Detail (LOD)
Technique adjusting 3D model complexity based on distance to optimize rendering performance.
LiDAR (Light Detection and Ranging)
Laser-based depth sensing technology for accurate 3D environment mapping.
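
The Level of Detail entry boils down to a distance lookup; a minimal sketch, where the thresholds and tier names are assumptions:

```python
# Distance-based LOD selection with illustrative thresholds.
LOD_THRESHOLDS_M = [(5.0, "high"), (20.0, "medium"), (float("inf"), "low")]

def select_lod(distance_m: float) -> str:
    """Pick the detail tier for the first threshold the distance fits under."""
    for max_dist, lod in LOD_THRESHOLDS_M:
        if distance_m <= max_dist:
            return lod
    return "low"

print(select_lod(3.0), select_lod(12.0), select_lod(50.0))  # high medium low
```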

M

Machine Learning (ML)
Subset of AI focused on training models to recognize patterns and make predictions.
Marker-Based AR
AR experiences triggered by recognizing visual markers such as QR codes.
Metaverse
Interconnected network of 3D virtual worlds enabling social and economic activities.
Micro OLED
High-resolution, self-emissive display technology used in XR devices.
Mixed Reality (MR)
Blending of physical and digital worlds with interactive digital content anchored in real environments.
Multimodal Interaction
Use of multiple input methods (voice, gesture, gaze) to control XR applications.

N

Natural Language Processing (NLP)
AI technology enabling machines to understand and respond to human language.
Neural Network
AI architecture inspired by the human brain for complex data processing.

O

Occlusion
Rendering technique ensuring virtual objects appear correctly behind real-world objects.
OpenXR
Open standard API for cross-platform XR application development.
Optical Waveguide
Transparent component in AR glasses that guides light from microdisplays to the eye.
Outside-In Tracking
Tracking method using external sensors or cameras to determine headset position.

P

Passthrough Camera
Camera system that streams a real-time view of the physical surroundings into a VR headset, allowing virtual content to be blended with the real world.
Photogrammetry
Technique creating 3D models from multiple photographs for realistic XR assets.
Point Cloud
Set of data points representing 3D shapes or environments captured by depth sensors.
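
How a depth sensor's readings become point-cloud data can be sketched with the pinhole camera model: each depth pixel is back-projected into a 3D point. The intrinsics (fx, fy, cx, cy) below are assumed example values:

```python
# Back-project one depth pixel into a 3D camera-space point (pinhole model).
def depth_pixel_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Return (x, y, z) in camera coordinates for pixel (u, v)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at the image center maps straight down the optical axis.
print(depth_pixel_to_point(320, 240, 2.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0))
# (0.0, 0.0, 2.0)
```

Running this over every pixel of a depth frame yields the point cloud itself.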

Q

Quantum Dot Display
Display technology using quantum dots to enhance color accuracy and brightness.

R

Ray Tracing
Rendering technique simulating realistic lighting, shadows, and reflections.
Refresh Rate
Number of times per second a display updates its image (measured in Hz).
RGB-D Camera
Camera capturing both color (RGB) and depth (D) information for 3D mapping.

S

SLAM (Simultaneous Localization and Mapping)
Algorithm enabling devices to map environments and track their location simultaneously.
Spatial Audio
3D sound technology that simulates direction and distance for immersive audio.
Spatial Computing
Computing paradigm where digital content is anchored and interacted with in physical space.
Standalone Headset
Wireless XR device with onboard processing, requiring no external PC or smartphone.
SteamVR
Platform for VR content distribution and hardware support.

T

Teleportation
VR locomotion technique allowing instant movement between points to reduce motion sickness.
Time-of-Flight (ToF)
Depth sensing method measuring light pulse return times to calculate distance.
Tracking Volume
Physical area within which XR devices can accurately track user movement.
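
The Time-of-Flight calculation is a one-liner: a light pulse travels out and back, so distance is d = c * t / 2. A minimal sketch with an illustrative pulse time:

```python
# Time-of-flight distance from a round-trip light pulse: d = c * t / 2.
SPEED_OF_LIGHT_M_S = 299_792_458

def tof_distance_m(round_trip_s: float) -> float:
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2

print(round(tof_distance_m(20e-9), 3))  # a 20 ns round trip is roughly 3 m
```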

U

Unity
Popular game engine used for creating XR applications.
Unreal Engine
High-fidelity 3D engine widely used in XR content development.
User Experience (UX)
Design discipline focused on optimizing user interaction and satisfaction.

V

Varifocal Display
Display that dynamically adjusts focus to simulate realistic depth perception.
Volumetric Capture
Technique recording 3D representations of real-world objects or people for XR use.

W

Waveguide
Optical component that directs light from a projector to the user’s eye in AR glasses.
WebXR
API enabling XR experiences directly in web browsers without additional software.
Wide Field of View
Displays offering a horizontal FoV greater than 90°, enhancing immersion.
Wireless XR
XR devices that operate without physical cables, improving mobility.

X

XR (Extended Reality)
Umbrella term covering AR, VR, and MR technologies.

Z

Z-Buffer
Depth buffer storing distance information to render 3D scenes correctly.
Zero Latency
Ideal state where system response to user input is instantaneous, critical for immersion.
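
The Z-Buffer entry above can be sketched in a few lines: keep the nearest depth seen so far per pixel, and only draw a fragment when it is closer than the stored depth. Buffer size and labels are illustrative assumptions:

```python
# Minimal z-buffer sketch: per-pixel depth test decides which fragment wins.
WIDTH, HEIGHT = 4, 3
depth_buffer = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
color_buffer = [["bg"] * WIDTH for _ in range(HEIGHT)]

def draw_fragment(x, y, depth, color):
    if depth < depth_buffer[y][x]:  # nearer fragments overwrite farther ones
        depth_buffer[y][x] = depth
        color_buffer[y][x] = color

draw_fragment(1, 1, 5.0, "far_wall")
draw_fragment(1, 1, 2.0, "near_cube")   # closer: overwrites the wall
draw_fragment(1, 1, 9.0, "distant_sky") # farther: rejected by the depth test
print(color_buffer[1][1])  # near_cube
```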