Welcome to the Arvello Jobs VR/AR Glossary
Whether you’re a job seeker, recruiter, or XR enthusiast, this glossary will help you navigate the evolving landscape of Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), and Extended Reality (XR).
A–Z Terms (Sample)
A
AR (Augmented Reality) – Technology that overlays digital content onto the real world via smartphones, AR glasses, or headsets. Example: IKEA Place app.
AI (Artificial Intelligence) – Powers NPCs, object detection, and scene understanding in AR/VR applications.
Anchors – Persistent digital points in physical space used in AR for placing virtual objects (a WebXR code sketch follows this section's entries).
Asset Bundle – A package of assets used in game engines like Unity for building XR apps.
ARCore/ARKit – SDKs from Google and Apple, respectively, that enable AR development on Android and iOS.
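For developers, anchors are exposed through SDKs such as ARKit, ARCore, and the WebXR Anchors Module. The TypeScript sketch below is illustrative only: it assumes an active immersive-ar session requested with the 'hit-test' and 'anchors' features, plus the @types/webxr type definitions.

```typescript
// Illustrative sketch: place a WebXR anchor where a hit-test ray meets a
// real surface, then read its pose each frame.

async function placeAnchor(frame: XRFrame, hitTestSource: XRHitTestSource) {
  const hits = frame.getHitTestResults(hitTestSource);
  if (hits.length === 0) return undefined;

  // createAnchor() pins a pose to the detected surface; the runtime keeps
  // the anchor's pose updated as its understanding of the room improves.
  return hits[0].createAnchor?.();
}

// Each frame, fetch the anchor's current pose before rendering content at it.
function poseOf(frame: XRFrame, anchor: XRAnchor, refSpace: XRReferenceSpace) {
  return frame.getPose(anchor.anchorSpace, refSpace); // may be absent while tracking is lost
}
```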
B
Binaural Audio – Two-channel audio technique that simulates how the ears naturally perceive sound in 3D space, enhancing spatial immersion in VR.
Body Tracking – Tracks full-body movements using cameras, sensors, or suits.
Blending (Scene Blending) – Merging of real and virtual content in mixed reality.
Boundary System – Defines the safe movement area in room-scale VR (e.g., Oculus Guardian).
C
Computer Vision – Enables AR devices to recognize and interpret visual input.
Controller Tracking – Tracking position and orientation of handheld input devices.
CV (Curriculum Vitae) – Often tailored in XR careers to include project links, GitHub, or AR demos.
Cloud Anchors – Shared AR anchors enabling multi-user AR experiences.
D
Degrees of Freedom (DoF) – The number of independent ways a device can move: 3DoF tracks rotation only (pitch, yaw, roll), while 6DoF adds positional tracking along the x, y, and z axes (a data-model sketch follows this section).
Digital Twin – A virtual representation of a real-world object or system in XR.
Depth Sensing – Measures distances in real-time for environment mapping and occlusion.
Display Latency – Delay between head movement and screen update; lower is better.
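To make the 3DoF/6DoF distinction concrete, here is an engine-agnostic sketch; the type names are my own illustration, not taken from any particular SDK.

```typescript
// Illustrative data model for tracking degrees of freedom.

interface Quaternion { x: number; y: number; z: number; w: number; }
interface Vector3 { x: number; y: number; z: number; }

// 3DoF: a rotation-only device (e.g., early mobile VR) reports orientation.
interface Pose3DoF {
  orientation: Quaternion; // pitch, yaw, roll encoded as a quaternion
}

// 6DoF: modern headsets add positional tracking along x, y, and z.
interface Pose6DoF extends Pose3DoF {
  position: Vector3; // meters, relative to the tracking origin
}
```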
E
Extended Reality (XR) – Umbrella term for VR, AR, and MR technologies.
Eye Tracking – Detects where users are looking; useful for foveated rendering and UI interactions.
Environment Mapping – Rendering technique used to simulate reflective surfaces.
F
FOV (Field of View) – The extent of the observable world seen at any moment in XR.
Foveated Rendering – Prioritizes high-resolution rendering where the eye is focused, reducing GPU load elsewhere (see the fixed-foveation sketch after this section).
Frame Rate – The number of frames rendered per second (FPS); higher frame rates improve comfort in XR.
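One concrete form is fixed foveated rendering, which needs no eye tracking and which WebXR exposes as a hint on the rendering layer. Browser support varies (it is mainly available on standalone headsets such as Meta Quest), so treat this as a sketch under those assumptions.

```typescript
// Fixed foveated rendering via WebXR: a hint (0 = off, 1 = maximum) that
// lets the runtime render the periphery at reduced resolution.
// Assumes gl was created XR-compatible (xrCompatible: true) and that the
// browser implements the fixedFoveation attribute.

function enableFoveation(session: XRSession, gl: WebGLRenderingContext) {
  const layer = new XRWebGLLayer(session, gl);
  if (layer.fixedFoveation !== undefined) {
    layer.fixedFoveation = 0.5; // trade peripheral sharpness for GPU headroom
  }
  session.updateRenderState({ baseLayer: layer });
}
```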
G
Gesture Recognition – Detects and interprets human gestures as input in AR/VR.
GPU (Graphics Processing Unit) – Critical for rendering immersive 3D environments.
Guardian System – Meta's boundary system that marks the safe play area in physical space (see Boundary System).
H
Haptics – Feedback via vibration or force to simulate touch in XR.
Hand Tracking – Detects and interprets hand movement without controllers (a pinch-detection sketch follows this section).
HMD (Head-Mounted Display) – Wearable device with a screen for immersive visuals.
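As a concrete example, a pinch gesture can be detected with the WebXR Hand Input API by measuring the distance between the thumb tip and index fingertip. The 1.5 cm threshold below is an arbitrary illustration, and the session must have been requested with the 'hand-tracking' feature.

```typescript
// Minimal pinch detection with the WebXR Hand Input API.
// Types from @types/webxr; threshold is an assumed, tunable value.

function isPinching(
  frame: XRFrame,
  hand: XRHand,
  refSpace: XRReferenceSpace,
  thresholdMeters = 0.015
): boolean {
  const thumb = hand.get('thumb-tip');
  const index = hand.get('index-finger-tip');
  if (!thumb || !index) return false;

  const thumbPose = frame.getJointPose?.(thumb, refSpace);
  const indexPose = frame.getJointPose?.(index, refSpace);
  if (!thumbPose || !indexPose) return false; // joints not currently tracked

  const a = thumbPose.transform.position;
  const b = indexPose.transform.position;
  const dist = Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
  return dist < thresholdMeters;
}
```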
I
Immersive Content – Experiences designed to fully engage users in XR environments.
Inside-Out Tracking – Uses onboard sensors to track position without external cameras.
Image Recognition – Enables AR experiences to trigger from visual markers.
J
Jitter – Unwanted small, rapid movement in visuals caused by tracking noise or frame-rate issues (a smoothing sketch follows this section).
Joint Orientation – Used in body tracking to determine limb and joint angles.
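A common mitigation is to low-pass filter the raw tracking data. The exponential moving average below is a minimal illustration; production systems often use adaptive filters such as the One Euro filter.

```typescript
// Exponential moving average over reported positions to damp jitter.

interface Vec3 { x: number; y: number; z: number; }

function smooth(previous: Vec3, measured: Vec3, alpha = 0.2): Vec3 {
  // alpha near 0 = heavy smoothing (more lag); near 1 = little smoothing.
  return {
    x: previous.x + alpha * (measured.x - previous.x),
    y: previous.y + alpha * (measured.y - previous.y),
    z: previous.z + alpha * (measured.z - previous.z),
  };
}
```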
K
Kinect – Microsoft's motion sensing device; an early influence on XR input.
Keyframe Animation – Animation technique widely used in 3D asset creation.
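At its core, keyframe animation stores values at specific times and interpolates between them. A bare-bones linear sampler might look like the following; real engines add easing curves, quaternion interpolation for rotations, and track blending.

```typescript
// Sample a scalar keyframe track at time t, assuming the track is
// non-empty and sorted by ascending time.

interface Keyframe { time: number; value: number; }

function sample(track: Keyframe[], t: number): number {
  if (t <= track[0].time) return track[0].value;
  const last = track[track.length - 1];
  if (t >= last.time) return last.value;

  // Find the two keyframes bracketing t and interpolate linearly.
  for (let i = 0; i < track.length - 1; i++) {
    const a = track[i], b = track[i + 1];
    if (t >= a.time && t <= b.time) {
      const u = (t - a.time) / (b.time - a.time);
      return a.value + u * (b.value - a.value);
    }
  }
  return last.value;
}
```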
L
Latency – Delay between user input and system response; keeping it low is crucial in XR to prevent motion sickness.
Light Estimation – AR feature that mimics real-world lighting on virtual objects.
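On the web, this is exposed through the WebXR Lighting Estimation API. A hedged sketch, assuming an immersive-ar session requested with the 'light-estimation' feature and @types/webxr for the type definitions:

```typescript
// Read WebXR's light estimate so virtual objects can match real lighting.

async function setupLightProbe(session: XRSession) {
  return session.requestLightProbe();
}

function applyLighting(frame: XRFrame, probe: XRLightProbe) {
  const estimate = frame.getLightEstimate(probe);
  if (!estimate) return; // no estimate available this frame

  // Direction and intensity of the dominant real-world light source;
  // feed these into your renderer's directional light.
  const dir = estimate.primaryLightDirection;       // DOMPointReadOnly
  const intensity = estimate.primaryLightIntensity; // RGB as DOMPointReadOnly
  console.log(dir.x, dir.y, dir.z, intensity.x, intensity.y, intensity.z);
}
```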
M
Mixed Reality (MR) – Blends real and virtual worlds with realistic interaction.
Motion Capture (MoCap) – Records body movements for digital animation.
Marker-based AR – Uses images/objects to trigger AR experiences.
N
Navigation Mesh (NavMesh) – Defines walkable areas for NPCs and pathfinding in 3D environments.
Natural User Interface (NUI) – Interface that feels natural to use (e.g., gestures, gaze).
O
Occlusion – Realistic blocking of virtual objects by physical ones in AR.
Optical Flow – Computer-vision technique that estimates the motion of objects or the camera between successive video frames.
P
Passthrough View – Real-world view streamed through cameras in MR/VR devices.
Pose Estimation – Detects human position and orientation in 3D space.
Photogrammetry – Converts real-world photos into 3D models.
Q
Quaternion – Mathematical construct used for smooth 3D rotations without gimbal lock (a worked sketch follows this section).
Quick Look (Apple) – AR viewer on iOS that lets users see 3D models in real space.
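To make this concrete, here is self-contained quaternion math: building a yaw rotation and applying it to a vector via the standard expansion v' = v + 2w(q×v) + 2q×(q×v). The conventions (right-handed, unit quaternions) match most engines.

```typescript
// Self-contained quaternion rotation, no engine dependencies.

interface Quat { x: number; y: number; z: number; w: number; }
interface Vec3 { x: number; y: number; z: number; }

// Build a unit quaternion for a rotation of `radians` about the Y axis (yaw).
function fromYaw(radians: number): Quat {
  const half = radians / 2;
  return { x: 0, y: Math.sin(half), z: 0, w: Math.cos(half) };
}

// Rotate v by q using v' = v + w*t + cross(q.xyz, t), where t = 2*cross(q.xyz, v).
function rotate(q: Quat, v: Vec3): Vec3 {
  const { x, y, z, w } = q;
  const tx = 2 * (y * v.z - z * v.y);
  const ty = 2 * (z * v.x - x * v.z);
  const tz = 2 * (x * v.y - y * v.x);
  return {
    x: v.x + w * tx + (y * tz - z * ty),
    y: v.y + w * ty + (z * tx - x * tz),
    z: v.z + w * tz + (x * ty - y * tx),
  };
}

// Example: rotating (1, 0, 0) by +90° about Y yields approximately (0, 0, -1).
```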
R
Rendering Pipeline – Sequence of steps to render 3D graphics.
Reticle – A fixed UI element (often a dot or ring) used in gaze-based interactions.
Reality Capture – Technology used to digitize the real world (LiDAR, photogrammetry).
S
SLAM (Simultaneous Localization and Mapping) – Core AR technique that builds a map of the environment in real time while simultaneously tracking the device's position within it.
Spatial Audio – Audio that changes based on the user's position and orientation (a Web Audio sketch follows this section).
SDK (Software Development Kit) – Set of tools to build AR/VR apps (e.g., ARKit, Oculus SDK).
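On the web, spatial audio is available today through the Web Audio API's PannerNode. A minimal sketch; the listener-update function would be driven by your head-tracking data, and the listener-position AudioParams may not be available in every browser.

```typescript
// Positional audio with the Web Audio API: HRTF panning makes a source
// sound as if it comes from a 3D position relative to the listener.

const ctx = new AudioContext();

function createSpatialSource(buffer: AudioBuffer, x: number, y: number, z: number) {
  const source = ctx.createBufferSource();
  source.buffer = buffer;

  const panner = new PannerNode(ctx, {
    panningModel: 'HRTF',     // head-related transfer function for 3D cues
    distanceModel: 'inverse', // volume falls off with distance
    positionX: x,
    positionY: y,
    positionZ: z,
  });

  source.connect(panner).connect(ctx.destination);
  return source;
}

// Move the listener as the user's head moves (values from head tracking).
function updateListener(x: number, y: number, z: number) {
  ctx.listener.positionX.value = x;
  ctx.listener.positionY.value = y;
  ctx.listener.positionZ.value = z;
}
```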
T
Tracking Space – The physical space mapped and used for XR movement.
Teleportation (Locomotion) – Movement mechanic that instantly relocates the user to a chosen point, widely used in VR to reduce motion sickness (a sketch follows this section).
Thin Clients – Lightweight XR devices relying on cloud processing.
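In WebXR, teleportation is typically implemented by offsetting the reference space rather than moving a camera. A translation-only sketch (rotation snapping omitted), assuming the viewer position and target are both expressed in the current reference space:

```typescript
// Teleport by offsetting the reference space so the viewer's current
// position maps onto the target point. Call with the latest refSpace
// and a viewer position taken from frame.getViewerPose(refSpace).

function teleport(
  refSpace: XRReferenceSpace,
  viewerPos: DOMPointReadOnly,
  target: { x: number; z: number }
): XRReferenceSpace {
  const offset = new XRRigidTransform({
    x: viewerPos.x - target.x,
    y: 0, // leave floor height unchanged
    z: viewerPos.z - target.z,
  });
  return refSpace.getOffsetReferenceSpace(offset);
}
```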
U
UI (User Interface) – Visual layer of interaction in XR apps.
Unity – Popular real-time 3D engine for XR development.
Unreal Engine – Another leading 3D engine known for photorealism.
V
VR (Virtual Reality) – Fully immersive digital environment accessed through HMDs.
Volumetric Video – Captures 3D scenes that users can move around within.
VPS (Visual Positioning System) – Determines user’s location via visual landmarks.
W
WebXR – Web API for creating AR and VR content that runs directly in browsers (a bootstrap sketch follows this section).
World Anchors – Persistent AR content tied to real-world locations.
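A minimal session bootstrap looks like the following; it must run from a user gesture (e.g., a button click) and assumes @types/webxr for the type definitions.

```typescript
// Minimal WebXR bootstrap: feature-detect, start an immersive VR session,
// and drive the XR frame loop.

async function startVR(gl: WebGLRenderingContext) {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported('immersive-vr'))) {
    throw new Error('immersive-vr is not supported on this device/browser');
  }

  const session = await navigator.xr.requestSession('immersive-vr');
  await gl.makeXRCompatible(); // the GL context must be XR-compatible
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });

  const refSpace = await session.requestReferenceSpace('local-floor');

  // Inside a session, requestAnimationFrame comes from the session itself.
  session.requestAnimationFrame(function onFrame(_time, frame) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // Render one view per eye from pose.views here.
    }
    session.requestAnimationFrame(onFrame);
  });
}
```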
X
XR (Extended Reality) – All immersive tech including AR, VR, and MR.
Y
Yaw – Rotation around the vertical axis; part of 3D orientation (a sketch follows this section).
Y-axis – Vertical axis in 3D space; relevant for movement and positioning.
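In code, yaw is a rotation about the Y axis. An engine-agnostic sketch; the facing convention of -Z as "forward" is an assumption, borrowed from common engine conventions.

```typescript
// Rotating a direction vector about the vertical (Y) axis, and recovering
// the yaw angle of a forward direction projected onto the ground plane.

interface Vec3 { x: number; y: number; z: number; }

function rotateYaw(v: Vec3, radians: number): Vec3 {
  const c = Math.cos(radians);
  const s = Math.sin(radians);
  // Standard rotation matrix about Y: x and z change, height (y) does not.
  return {
    x: c * v.x + s * v.z,
    y: v.y,
    z: -s * v.x + c * v.z,
  };
}

function yawOf(forward: Vec3): number {
  return Math.atan2(-forward.x, -forward.z); // 0 when facing -Z (assumed convention)
}
```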
Z
Z-Depth – The distance of a rendered object from the viewer along the camera's view axis; used for depth sorting and occlusion.
Zero Latency – Ideal state of real-time interaction with no perceptible delay in XR.