
    Occlusion

    ARKit provides support for occlusion based on depth images it generates every frame.

    There are three types of depth images that ARKit exposes through the provider's XROcclusionSubsystem implementation (a usage sketch follows the list):

    • Environment depth: distance from the device to any part of the environment in the camera field of view.
    • Human depth: distance from the device to any part of a human recognized within the camera field of view.
    • Human stencil: value that designates, for each pixel, whether that pixel is part of a recognized human.
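    The following is a minimal sketch of acquiring each of these images on the CPU through AR Foundation's AROcclusionManager. The component name DepthImageReader is hypothetical; each Try... call returns false when the corresponding image is not available on the current frame.

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    // Hypothetical example component; assumes an AROcclusionManager is assigned in the Inspector.
    public class DepthImageReader : MonoBehaviour
    {
        [SerializeField]
        AROcclusionManager m_OcclusionManager;

        void Update()
        {
            // Environment depth: per-pixel distance from the device to the environment.
            if (m_OcclusionManager.TryAcquireEnvironmentDepthCpuImage(out XRCpuImage environmentDepth))
            {
                using (environmentDepth)
                {
                    // Read or convert the image here; it is only valid until disposed.
                }
            }

            // Human depth: per-pixel distance from the device to recognized people.
            if (m_OcclusionManager.TryAcquireHumanDepthCpuImage(out XRCpuImage humanDepth))
            {
                using (humanDepth) { /* ... */ }
            }

            // Human stencil: per-pixel flag marking which pixels belong to a recognized human.
            if (m_OcclusionManager.TryAcquireHumanStencilCpuImage(out XRCpuImage humanStencil))
            {
                using (humanStencil) { /* ... */ }
            }
        }
    }
    ```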

    Environment Depth

    The occlusion subsystem provides access to two types of environment depth: raw and smoothed. These correspond to the following ARKit APIs:

    • Raw: ARFrame.sceneDepth
    • Smoothed: ARFrame.smoothedSceneDepth
    Note

    You must enable smoothed depth by setting environmentDepthTemporalSmoothingRequested to true. Otherwise, TryAcquireSmoothedEnvironmentDepthCpuImage will return false.
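    The sketch below shows one way this might look in practice, assuming an AROcclusionManager is assigned in the Inspector: request temporal smoothing, then acquire the smoothed depth image from the underlying subsystem.

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    // Hypothetical example component.
    public class SmoothedDepthExample : MonoBehaviour
    {
        [SerializeField]
        AROcclusionManager m_OcclusionManager;

        void OnEnable()
        {
            // Request temporally smoothed environment depth (ARFrame.smoothedSceneDepth on ARKit).
            m_OcclusionManager.environmentDepthTemporalSmoothingRequested = true;
        }

        void Update()
        {
            var subsystem = m_OcclusionManager.subsystem;
            if (subsystem == null)
                return;

            // Returns false if smoothing was not requested or no smoothed depth is available yet.
            if (subsystem.TryAcquireSmoothedEnvironmentDepthCpuImage(out XRCpuImage smoothedDepth))
            {
                using (smoothedDepth)
                {
                    // Use the smoothed depth image here; it is only valid until disposed.
                }
            }
        }
    }
    ```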

    Requirements

    Environment depth requires Xcode 12 or later, and it only works on iOS 14+ devices with a LiDAR scanner, such as the iPad Pro.

    Human depth and human stencil require Xcode 11 or later, and they only work on iOS 13+ devices with an A12 Bionic chip or later.
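    Because support varies by device, you can check what the provider reports at runtime before relying on a given image type. The sketch below reads the occlusion subsystem descriptor; the property names follow recent AR Foundation versions, where support is reported as a Supported enum (older versions expose booleans instead), so treat this as an assumption to verify against your installed package.

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Hypothetical capability check component.
    public class OcclusionSupportCheck : MonoBehaviour
    {
        [SerializeField]
        AROcclusionManager m_OcclusionManager;

        void Start()
        {
            var descriptor = m_OcclusionManager.descriptor;
            if (descriptor == null)
                return;

            // Log what the current device and provider actually support.
            Debug.Log($"Environment depth supported: {descriptor.environmentDepthImageSupported}");
            Debug.Log($"Human depth supported: {descriptor.humanSegmentationDepthImageSupported}");
            Debug.Log($"Human stencil supported: {descriptor.humanSegmentationStencilImageSupported}");
        }
    }
    ```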


    Apple and ARKit are trademarks of Apple Inc., registered in the U.S. and other countries and regions.
