Publication

The OVER SDK is optimized to publish the currently open scene: the scene must contain exactly one OvrAsset and no other root objects. The only other assets that may be present are the map and the land borders, because they are automatically removed upon upload.

To speed up the publication process, in addition to automatically selecting the active scene, the SDK lets you generate the thumbnails used to preview the experience through the Unity Recorder.

Validity Button

The Scene Validity button analyzes the scene for valid scripts and objects and checks whether any object in the prefab extends beyond the bounds of the hexagon or land border. The scene is also checked to ensure that it contains only one OvrAsset and no other external root objects.
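The checks described above can be sketched as a small validation routine. The following Python model is purely illustrative of the rules (the function name `validate_scene`, the simplified point-based bounds test, and the root-object names `"Map"` and `"LandBorders"` are assumptions, not the SDK's actual implementation):

```python
def validate_scene(root_objects, hexagon_contains):
    """Illustrative model of the Scene Validity checks.

    root_objects: list of (name, points) tuples, where points are the
    world-space positions occupied by the object.
    hexagon_contains: predicate telling whether a point lies inside the land.
    Returns a list of error messages (empty if the scene is valid).
    """
    errors = []

    # Rule 1: exactly one OvrAsset, no other external root objects
    # (the map and land borders are tolerated; they are stripped on upload).
    ovr_assets = [name for name, _ in root_objects if name == "OvrAsset"]
    others = [name for name, _ in root_objects
              if name not in ("OvrAsset", "Map", "LandBorders")]
    if len(ovr_assets) != 1:
        errors.append("scene must contain exactly one OvrAsset")
    if others:
        errors.append(f"unexpected root objects: {others}")

    # Rule 2: every object must stay inside the hexagon / land border.
    for name, points in root_objects:
        if any(not hexagon_contains(p) for p in points):
            errors.append(f"{name} extends beyond the land borders")

    return errors
```

A valid scene yields an empty error list; an extra root object or an out-of-bounds point each produces its own message, mirroring the popup described below.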

A popup will inform the user about the result of the check.

AR Layer / Virtual Layer Experience

The AR Layer represents the publication of a geolocalised experience. Geolocalised experiences can also be enjoyed remotely, either by replacing the camera feed with the OVRMap (a real-world Digital Twin) or, alternatively, by positioning the AR experience in the user's current surroundings, previewing it outside its original context.

The Virtual Layer is optimised for fully occlusive experiences. The relationship with the original context is therefore unnecessary, and the background of the experience completely replaces the camera feed and the OVRMap. Because this Layer is entirely virtual, collisions with objects are enabled, so it is possible to touch walls, climb stairs and teleport.

Features

Over Gate Console

Using the OVER GATE Console, it is possible to create your own personalized events. Thanks to this powerful tool, you can transmit audio, show your avatar and monitor all the users connected to the event rooms.

Gated Access

A further feature offered by the OVER SDK is the ability to add gated access to an experience by defining an NFT contract address during publication.

This feature works with the following chains: Ethereum, Polygon, Binance Smart Chain.

A user visiting the published experience will only be able to access it if he or she owns an NFT belonging to the contract declared at the time of publication.
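Conceptually, the gate is an NFT ownership check on the declared contract: access is granted only if the user's balance on that contract is greater than zero. A minimal sketch of this logic follows; the `balance_of` callback stands in for an on-chain ERC-721 `balanceOf` query, and the chain identifiers are assumptions, not the SDK's API:

```python
# Chains named in the documentation; the identifier strings are assumptions.
SUPPORTED_CHAINS = {"ethereum", "polygon", "binance-smart-chain"}

def has_gated_access(chain, contract_address, user_address, balance_of):
    """Grant access only if the user owns at least one NFT from the contract.

    balance_of(contract_address, user_address) -> int stands in for an
    on-chain ERC-721 balanceOf query on the selected chain.
    """
    if chain not in SUPPORTED_CHAINS:
        raise ValueError(f"unsupported chain: {chain}")
    return balance_of(contract_address, user_address) > 0
```

In practice the SDK performs this verification for you at access time; the sketch only makes the ownership rule explicit.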

Remote Settings

Remote Experience

Gives users the ability to access geolocated experiences remotely. This setting is enabled by default for virtual experiences.

Remote OVRMap

Gives the possibility to display the OVRMap remotely. This setting is not present in Virtual Layer experiences.

Walk Mode

Walk Mode allows the user to move inside the experience by moving the phone in real space.

AR Occlusion Settings

Occlusion allows mixed reality content in your app to appear hidden or partially obscured behind objects in the physical environment. Without occlusion, geometry in your scene will always render on top of physical objects in the AR background, regardless of their difference in depth.

Android Occlusion Settings

Activates environment occlusion for Android devices, ensuring that 3D elements are visually "hidden" behind real-world objects when viewed through the camera.

iOS Occlusion Settings

Activates human occlusion on Apple devices, utilizing iOS's capability to recognize people within the camera's view. It allows virtual elements to appear as if they are going behind people, creating a more realistic AR experience.

iOS Lidar Occlusion Settings

Activates environment occlusion for LiDAR-equipped Apple Pro devices, ensuring that 3D elements are visually "hidden" behind real-world objects when viewed through the camera.

Shadow Settings

Shadow Distance

Unity's property for setting the maximum distance from the camera at which shadows are rendered. Because the shadows of game objects become less noticeable the farther the objects are from the camera, distant shadows can be cut off to save performance.

Shadow Projection

There are two different methods for projecting shadows from a directional light. Close Fit renders higher resolution shadows but they can sometimes wobble slightly if the camera moves. Stable Fit renders lower resolution shadows but they don’t wobble with camera movements.

Shadow Resolution

Shadows can be rendered at several different resolutions: Low, Medium, High and Very High. The higher the resolution, the greater the processing overhead.

AR Settings

Main Plane Detection Type

The method for detecting and establishing the main surface, or "plane", in your augmented reality scene. This choice is crucial for accurately placing objects, managing shadows, and ensuring stable tracking within the environment. Depending on your project's requirements, you can choose from three modes: Normal, Fast, and No Detection. Each mode offers a different balance between detection speed, accuracy, and stability.

  • Normal: You will be asked to scan the surface and confirm it. The surface will be automatically selected only if it's reliable and/or framed for an extended period. This ensures the stability of the tracking system and allows you to choose the surface for shadow projection.

  • Fast: You will be asked to scan the surface, but the first detected surface will be selected automatically. This allows a check of the tracking stability, but the position of the surface may be approximate, with the risk that objects and shadows may be rendered in the wrong position, potentially causing visual parallax issues.

  • No Detection: The surface will be automatically positioned at a fixed location beneath the initial viewpoint, simulating the floor position. In this mode, no stability checks for the tracking system are performed, and there is a risk that objects and shadows may be rendered in the wrong position, potentially causing visual parallax issues.
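The behaviour of the three modes can be summarised as a small decision model. The sketch below is illustrative only; the function name, the reliability threshold, and the fixed default height are assumptions, not values used by the SDK:

```python
def select_main_plane(mode, detected_planes, default_height=-1.5):
    """Illustrative model of Main Plane Detection behaviour.

    detected_planes: list of (height, reliability) tuples in detection order.
    Returns (plane_height, tracking_checked).
    """
    if mode == "No Detection":
        # Fixed plane beneath the initial viewpoint; no stability check.
        return default_height, False
    if mode == "Fast":
        # First detected plane wins; tracking is checked but the
        # position may be approximate.
        height, _ = detected_planes[0]
        return height, True
    if mode == "Normal":
        # Wait for a plane that is reliable enough, then let the user confirm.
        for height, reliability in detected_planes:
            if reliability >= 0.9:  # assumed threshold
                return height, True
        raise LookupError("no sufficiently reliable plane detected yet")
    raise ValueError(f"unknown mode: {mode}")
```

The model makes the trade-off visible: Normal waits for reliability, Fast trades accuracy for speed, and No Detection skips tracking checks entirely.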

Show Main Plane

Enables or disables the visibility of the main surface, onto which shadows are projected and which prevents the rendering of everything located beneath it.

AR Image Target Settings

Frame Image Target UI Mode

Determines if and when to display the UI that prompts the user to frame the Image Target.

  • Never: It is never shown.

  • Ever: It becomes visible whenever an Image Target is not being framed.

  • Once: It is shown only once and disappears when an Image Target is detected.
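The three modes amount to a simple visibility rule, sketched below as an assumed helper (the `shown_before` flag tracks whether the prompt has already been displayed; none of these names come from the SDK):

```python
def should_show_frame_prompt(mode, target_detected, shown_before):
    """Decide whether the 'frame the Image Target' UI should be visible."""
    if mode == "Never":
        return False
    if mode == "Ever":
        # Visible whenever no Image Target is currently framed.
        return not target_detected
    if mode == "Once":
        # Visible only until the first time a target is detected.
        return not target_detected and not shown_before
    raise ValueError(f"unknown mode: {mode}")
```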
