Tuesday, 1 December, 2020 UTC


Summary

This list is a compilation of AR/VR terminology from all over the web. I used to refer to it each time I came across a term whose exact meaning I didn't remember. I am sure you will find it useful too.
Accelerometer — measures acceleration; for ARCore-capable smartphones, it helps enable motion tracking.
Anchors — user-defined points of interest upon which AR objects are placed. Anchors are created and updated relative to geometry (planes, points, etc.) that ARCore detects in the environment.
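As a rough Kotlin sketch of the idea (assuming an active ARCore `Session` and a `Pose` obtained from tracking, e.g. from a hit test):

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Pose
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Anchors a point of interest at the given pose. ARCore keeps updating the
// anchor as its understanding of the environment improves, which is what
// keeps attached digital objects locked to the real world.
fun placeObjectAt(session: Session, pose: Pose): Anchor =
    session.createAnchor(pose)

// Before rendering, check that an anchor is still being tracked.
fun isUsable(anchor: Anchor): Boolean =
    anchor.trackingState == TrackingState.TRACKING
```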
Asset — refers to any 3D model.
Augmented Reality (AR) — a direct or indirect live view of a physical, real-world environment whose elements are “augmented”(or enhanced) by computer-generated perceptual information.
Computer Vision — a blend of artificial intelligence and computer science that aims to enable computers (like smartphones) to visually understand the surrounding world like human vision does.
Concurrent Odometry and Mapping (COM) — the motion tracking process used by ARCore; it tracks the smartphone’s position in relation to its surrounding world.
Design Document — a guide for your AR experience that contains all of the 3D assets, sounds, and other design ideas for your team to implement.
Drift — refers to the accumulation of potential motion tracking error. If you walk around digital assets too quickly, eventually the device’s pose may not reflect where you actually are. ARCore attempts to correct for drift over time and updates Anchors to keep digital objects placed correctly relative to the real world.
Edit-time — when edits and changes are made in edit mode (outside of gameplay), before your application or game is deployed.
Environmental understanding — understanding the real world environment by detecting feature points and planes and using them as reference points to map the environment. Also referred to as context awareness.
Extended Reality (XR) — an increasingly popular initialism, XR is used as an umbrella term to refer to both augmented and virtual reality, where the X really acts as a variable to substitute for any flavor of computer-assisted visual modification to reality.
Field of View (FoV) — the visual area in which users can see virtual content in an augmented reality headset. It can also be expressed as a measurement of the angle formed between the bounds of vision to the left and right of a fixed point in space, as seen from the user’s position.
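As a quick illustration of the geometry, here is a minimal Kotlin sketch that computes a horizontal FoV angle for a flat display of width `w` viewed from distance `d` (names and numbers are hypothetical):

```kotlin
import kotlin.math.atan

// Half the display width and the viewing distance form a right triangle;
// doubling that half-angle gives the full horizontal field of view.
fun horizontalFovDegrees(w: Double, d: Double): Double =
    Math.toDegrees(2.0 * atan((w / 2.0) / d))

fun main() {
    // e.g. a 9 cm wide optic viewed from 5 cm away
    println(horizontalFovDegrees(0.09, 0.05)) // ≈ 84 degrees
}
```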
GL Transmission Format (glTF) — run as an open-source project by Khronos, glTF is a royalty-free format for exporting 3D models and scenes from one program and importing them into an application to view in augmented or virtual reality.
Feature Points — visually distinct features in your environment, like the edge of a chair, a light switch on a wall, the corner of a rug, or anything else that is likely to stay visible and consistently placed in your environment. ARCore uses feature points in the captured camera image to compute changes in location, deepen its environmental understanding, and place planes in an AR app.
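ARCore exposes the currently tracked feature points as a per-frame point cloud. A minimal sketch, assuming `frame` comes from `session.update()`:

```kotlin
import com.google.ar.core.Frame

// Counts the feature points ARCore is tracking this frame. The point cloud
// stores four floats per point: x, y, z, and a confidence value.
fun countFeaturePoints(frame: Frame): Int =
    frame.acquirePointCloud().use { pointCloud ->
        pointCloud.points.remaining() / 4
    }
```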
Framing — with regards to mobile AR design, this is the strategic placement of 3D objects in the environment to avoid breaking immersion.
Gaze — The direction in which an experiencer is looking.
Gesture — A form of non-verbal communication through the body (typically the hands or head) that, when tracked by a motion sensing device, can be interpreted as movement and mirrored in virtual reality. Gestures in virtual reality empower the experiencer with the ability to physically influence the experience.
Google Poly — a free repository of 3D assets that can be quickly downloaded and used in your ARCore experience.
GPS — global navigation satellite system that provides geolocation and time information; for ARCore-capable smartphones, it helps enable location-based AR apps.
Gyroscope — measures orientation and angular velocity; for ARCore-capable smartphones, it helps enable motion tracking.
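Both the accelerometer and the gyroscope are exposed through Android’s standard sensor framework. A minimal sketch of listening to the gyroscope (lifecycle wiring omitted for brevity):

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

class GyroListener(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    fun start() {
        val gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)
        sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_GAME)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // Angular velocity around the device's x, y, and z axes, in rad/s.
        val (wx, wy, wz) = event.values
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```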
Haptics — The simulation (or recreation) of the sense of touch through the sensations of applying force, vibration, or motion to the user. Haptics can be used to assist in the creation of virtual objects in a computer simulation, to control such virtual objects, and to enhance the remote control of machines and devices (telerobotics).
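On Android phones, basic haptic feedback is available through the platform Vibrator service. A sketch, assuming API level 26+ and the VIBRATE permission:

```kotlin
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator

// Plays a short 50 ms vibration at medium strength, e.g. to confirm that
// a virtual object was touched or grabbed.
fun hapticTick(context: Context) {
    val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    vibrator.vibrate(VibrationEffect.createOneShot(50, 128))
}
```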
Hit-testing — used to take an (x,y) coordinate corresponding to the phone’s screen (provided by a tap or whatever other interaction you want your app to support) and project a ray into the camera’s view of the world. This returns any planes or feature points that the ray intersects, along with the pose of that intersection in world space. This allows users to select or otherwise interact with objects in the environment.
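In ARCore this maps onto `Frame.hitTest`. A hedged sketch, assuming `frame` comes from `session.update()` and `x`/`y` are tap coordinates in screen pixels:

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane

// Projects a ray from the tapped screen point into the camera's view of the
// world and anchors content at the first plane the tap actually landed on.
fun anchorAtTap(frame: Frame, x: Float, y: Float): Anchor? {
    for (hit in frame.hitTest(x, y)) {
        val trackable = hit.trackable
        if (trackable is Plane && trackable.isPoseInPolygon(hit.hitPose)) {
            return hit.createAnchor()
        }
    }
    return null // the ray intersected nothing suitable
}
```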
HMD — In the context of virtual reality, a head-mounted display (also called an HMD) is either a pair of goggles or a full helmet that users wear to immerse themselves fully in virtual experiences. Inside the HMD are tiny monitors in front of each eye that allow images to appear three-dimensional. In addition, most HMDs include head-tracking sensors so that the system can respond to a user’s head movements.
Immersion — the sense that digital objects belong in the real world. Breaking immersion means that the sense of realism has been broken; in AR this is usually by an object behaving in a way that does not match our expectations.
Inside-Out Tracking — when the device has internal cameras and sensors to detect motion and track positioning.
Latency — The time delay (also called lag) between a change in input from the experiencer and its effect on the display, often creating a mismatch between the motion you feel and the motion you see. In the real world, there is virtually no latency. In virtual worlds, the average latency is 20 milliseconds, which is considered low.
Light estimation — allows the phone to estimate the environment’s current lighting conditions.
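A minimal sketch of reading ARCore’s ambient intensity estimate each frame, again assuming `frame` comes from `session.update()`:

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.LightEstimate

// Returns the estimated average pixel intensity of the camera image
// (roughly 0.0 = dark, 1.0 = bright), or null if no valid estimate yet.
fun ambientIntensity(frame: Frame): Float? {
    val estimate = frame.lightEstimate
    return if (estimate.state == LightEstimate.State.VALID) estimate.pixelIntensity else null
}
```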
Magnetometer — measures cardinal direction and allows ARCore-capable smartphones to auto-rotate digital maps depending on the phone’s physical orientation, which helps enable location-based AR apps.
Motion Tracking — in the basic sense, this means tracking the movement of an object in space. ARCore uses your phone’s camera, internal gyroscope, and accelerometer to estimate its pose in 3D space in real time.
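The result of that estimation is exposed per frame as the camera’s pose and tracking state; a sketch:

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.Pose
import com.google.ar.core.TrackingState

// Reads the phone's current pose in world space, or null while tracking
// is lost (for example, when the camera is covered or moving too fast).
fun devicePose(frame: Frame): Pose? {
    val camera = frame.camera
    return if (camera.trackingState == TrackingState.TRACKING) camera.pose else null
}
```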
Multi-plane detection — ARCore’s ability to detect various surfaces at different heights and depths.
Occlusion — when one 3D object blocks another 3D object. Currently, this can only happen between digital objects; ARCore objects cannot be occluded by a real-world object.
Outside-In Tracking — when the device uses external cameras or sensors to detect motion and track positioning.
Presence — In the context of reality technologies, the state in which enough of one’s senses have been stimulated that the user feels, believes, and accepts that they are physically occupying a new virtual world. Achieving comfortable, sustained presence requires a combination of the proper virtual reality content and hardware.
Phone Camera — supplies a live feed of the surrounding real world upon which AR content is overlaid when using mobile AR.
Placing — when the tracking of a digital object is fixed, or anchored, to a certain point in the real world.
Plane Finding — the smartphone-specific process by which ARCore determines where horizontal and vertical surfaces are in your environment and uses those surfaces to place and orient digital objects.
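Detected planes can be enumerated from the session; the sketch below picks out horizontal, upward-facing surfaces (floors and tabletops) that are currently tracked:

```kotlin
import com.google.ar.core.Plane
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Returns the horizontal surfaces ARCore is currently tracking.
fun trackedFloorsAndTables(session: Session): List<Plane> =
    session.getAllTrackables(Plane::class.java).filter { plane ->
        plane.trackingState == TrackingState.TRACKING &&
            plane.type == Plane.Type.HORIZONTAL_UPWARD_FACING
    }
```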
Pose — the unique position and orientation of any object in relation to the world around it, from your mobile device to the augmented 3D asset that you see on your display.
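In ARCore, a `Pose` bundles a translation vector with a rotation quaternion; a sketch of pulling both out:

```kotlin
import com.google.ar.core.Pose

// Decomposes a pose into its position (in metres, world space) and its
// orientation as a quaternion (x, y, z, w).
fun describe(pose: Pose): String {
    val t = FloatArray(3).also { pose.getTranslation(it, 0) }
    val q = FloatArray(4).also { pose.getRotationQuaternion(it, 0) }
    return "position=(${t[0]}, ${t[1]}, ${t[2]}), " +
        "rotation=(${q[0]}, ${q[1]}, ${q[2]}, ${q[3]})"
}
```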
Raycasting — projecting a ray to help estimate where the AR object should be placed in order to appear on the real-world surface in a believable way; used during hit testing.
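Stripped of any particular SDK, the core of raycasting is a ray-plane intersection. A self-contained sketch against a horizontal plane at height `planeY` (all names here are illustrative):

```kotlin
data class Vec3(val x: Float, val y: Float, val z: Float)

// Intersects a ray (origin + direction) with the horizontal plane y = planeY.
// Returns the hit point, or null if the ray is parallel to the plane or the
// plane lies behind the ray's origin.
fun intersectHorizontalPlane(origin: Vec3, dir: Vec3, planeY: Float): Vec3? {
    if (dir.y == 0f) return null            // parallel: never intersects
    val t = (planeY - origin.y) / dir.y     // distance parameter along the ray
    if (t < 0f) return null                 // intersection is behind the origin
    return Vec3(origin.x + t * dir.x, origin.y + t * dir.y, origin.z + t * dir.z)
}

fun main() {
    // A ray cast from 1.5 m above the floor, pointing down and forward.
    println(intersectHorizontalPlane(Vec3(0f, 1.5f, 0f), Vec3(0f, -1f, -1f), 0f))
    // -> Vec3(x=0.0, y=0.0, z=-1.5)
}
```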
Room Scale — A design paradigm for virtual reality experiences that allows users to move freely within a room-sized environment while partaking in the experience. The experiencer’s physical movements are mirrored within the virtual world, which helps contribute to a greater sense of immersion, with the body being directly engaged.
Runtime — when edits/changes are made during active gameplay mode or while your app is running. For example, you can download Poly assets while your application is in gameplay mode or running.
Scaling — When a placed AR object changes size and/or dimension relative to the AR device’s position; enabled by environment understanding.
SLAM (Simultaneous Localization and Mapping) — motion tracking process that tracks the device in relation to its surrounding world.
Spatial mapping — the ability to create a 3D map of the environment; it helps establish where assets should be posed.
Standalone headset — VR or AR headset that does not require external processors, memory, or power.
Surface detection — allows ARCore to place digital objects on various surface heights, to render different objects at different sizes and positions, and to create more realistic AR experiences in general.
Simulation Sickness — In the context of virtual reality, simulation sickness refers to a feeling of dizziness and nausea. It is different from motion sickness because it can be caused without any movement, by the visually induced perception of movement alone. VR sickness is believed to be caused by a slower-than-required refresh rate of on-screen images: when the refresh rate lags behind what the human brain processes, the discord between the two produces perceived glitches on the screen.
Six Degrees of Freedom (6DoF) Tracking — In AR and VR, 6DoF describes the range of motion that a head-mounted display allows the user to move on an axis in relation to virtual content in a scene. Three of the degrees refer to the rotation of the user’s head — turning left and right (yaw), tilting forwards and backwards (pitch), and tilting side to side (roll) — while the remaining three pertain to movement within the space: left and right, backwards and forwards, and up and down.
Tracking — In augmented reality, tracking is the method by which a computer anchors content to a fixed point in space, allowing users to walk and/or look around it, as defined by the degrees of freedom allowed by the display device. In marker-based tracking, the computer recognizes a two-dimensional image or code on which to anchor the content. In markerless tracking, the computer uses some other mapping technique (usually SLAM) to determine a surface on which to anchor content.
Unity — cross-platform game engine and development environment for both 3D and 2D interactive applications.
User Experience (UX) — the process and underlying framework of enhancing user flow to create products with high usability and accessibility for end users.
User Flow — the journey of your app’s users and how a person will engage, step by step, with your AR experience.
User Interface (UI) — the visuals of your app and everything that a user interacts with.
User interface metaphor — gives the user instantaneous knowledge about how to interact with the user interface, like a QWERTY keyboard or a computer mouse.
Virtual Reality (VR) — the use of computer technology to create a simulated environment, placing the user inside an experience.
