3D sensing

ams OSRAM's intellectual property and technological know-how opens the way for the further development of 3D sensing, accelerating the market introduction of mobile applications such as AR/VR and autofocus with which mobile device manufacturers can set themselves apart from their competitors.

World facing sensing is critical for virtual reality (VR) and augmented reality (AR)

How do VR headsets and AR glasses know their position and how the user is moving? Where and how can virtual objects be placed with realistic occlusion? 

World facing sensing is critical to all VR and AR experiences. ams OSRAM offers a comprehensive portfolio of image sensors, micro cameras, IR LEDs, IR VCSEL modules, illuminator drivers, and integrated dToF range sensors for world facing sensing.

Immersive Virtual Reality (VR) experiences require precise 6 Degree of Freedom (6DoF) tracking of the movement of the headset. This enables the right parts of the virtual scene to be rendered and displayed naturally to the user as they move their head. Boundaries must also be set to avoid user collisions.

Augmented Reality (AR) experiences further require a 3D map of the surroundings to realistically place and occlude virtual objects. VR Video Pass Through (VPT) must also re-render the complete scene, and will further benefit from flicker, ambient light and color sensing. 

Hand tracking, with or without hand controllers, is a key user interface, and it is usually achieved using the same world facing tracking sensors.


3D scene reconstruction

Augmented Reality experiences require a real-time 3D depth map of the surroundings. This is needed to place virtual objects in the right locations with correct occlusion, and to re-render the real scene from the user’s perspective for VR Video Pass Through (VPT). Multiple approaches to 3D sensing can be deployed.  
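As a loose illustration of the occlusion step, the sketch below compares a per-pixel depth map of the real scene with the rendered depth of a virtual object, keeping the virtual content only where it is nearer to the viewer. The array values are placeholders, not data from any particular sensor.

    # Per-pixel occlusion test: show the virtual object only where it is
    # nearer than the reconstructed real scene. Depth values are examples.
    import numpy as np

    real_depth = np.full((4, 4), 3.0)       # reconstructed scene depth in metres
    real_depth[1:3, 1:3] = 1.0              # a real object about 1 m away

    virtual_depth = np.full((4, 4), 2.0)    # virtual object rendered at 2 m

    virtual_visible = virtual_depth < real_depth   # True where nothing real is in front
    print(virtual_visible.astype(int))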

Camera-based 3D sensing systems capture images from multiple positions and triangulate common features identified in the scene. Various combinations of multiple cameras (stereovision) and projection of features (structured light) are possible. ams OSRAM offers: 


Time of Flight (ToF) based 3D sensors directly measure optical path length to points in the scene. ams OSRAM offers: 
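The two measurement principles can each be summarized in one line: stereo triangulation recovers depth from the disparity between two views (Z = f * B / d), while direct time of flight converts the round-trip travel time of a light pulse into distance (Z = c * t / 2). The sketch below shows both, with purely illustrative numbers rather than product specifications.

    # Illustrative depth calculations for the two sensing principles above.
    # All parameter values are hypothetical examples.

    SPEED_OF_LIGHT = 299_792_458.0  # metres per second

    def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
        """Depth from a rectified stereo pair: Z = f * B / d."""
        return focal_length_px * baseline_m / disparity_px

    def tof_depth(round_trip_time_s: float) -> float:
        """Depth from direct time of flight: Z = c * t / 2 (out-and-back path)."""
        return SPEED_OF_LIGHT * round_trip_time_s / 2.0

    print(stereo_depth(500.0, 0.05, 10.0))  # 500 px focal length, 5 cm baseline, 10 px disparity -> 2.5 m
    print(tof_depth(16.7e-9))               # a 16.7 ns round trip is also roughly 2.5 m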


Camera enhancement for video pass through

VR headsets create high fidelity AR experiences by reproducing live images of the real world, with added virtual content, on the high resolution displays of the device. This so-called Video Pass Through (VPT) feature, however, is significantly more complex than literally passing through images from the external RGB cameras. The scene has to be captured in 3D and re-rendered to match the actual positions of the user's eyes and to realistically add the virtual content. 

Beyond 3D mapping of the scene using the methods above, a photorealistic rendering also requires high fidelity RGB camera images. ams OSRAM is an industry leading supplier of camera enhancement sensors for mobile phones. These work alongside the image sensor and image signal processor (ISP) to maximize image quality and minimize latency, requirements which also apply to creating the best video pass through. Specifically, we offer sensors for: 

  • Ambient light level and flicker frequencies, using our ambient light sensors. These enable rapid setting of the camera exposure time and frame rate to reliably capture smooth video at optimal brightness across changing conditions, as sketched after this list. 
  • True color of the scene, using our color and spectral sensors, enabling realistic white balance to be maintained across varying environments and scenes. 
  • Distance, using our dToF sensor modules, enabling fast and accurate focusing. 
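A minimal sketch of the anti-banding idea behind the first point above: once the flicker frequency of the lighting is known (typically 100 Hz or 120 Hz for mains-powered lamps), the exposure time can be snapped to a whole number of flicker periods so that every frame integrates the same amount of light. Function and parameter names are illustrative, not part of an ams OSRAM API.

    # Choose a camera exposure time that is an integer multiple of the ambient
    # flicker period, so brightness banding is avoided in the captured video.
    # Illustrative sketch only; names and values are not an ams OSRAM API.

    def anti_banding_exposure(target_exposure_s: float, flicker_hz: float) -> float:
        """Snap the desired exposure to the nearest whole number of flicker periods."""
        if flicker_hz <= 0:
            return target_exposure_s        # no flicker detected, keep the target
        period = 1.0 / flicker_hz
        cycles = max(1, round(target_exposure_s / period))
        return cycles * period

    # Example: auto-exposure wants 12 ms under 100 Hz lighting -> snapped to 10 ms
    print(anti_banding_exposure(0.012, 100.0))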

Hand tracking

Tracking hand movements is crucial to creating a natural user interface for VR and AR experiences. 

Fine motor hand movements are tracked using world facing cameras, typically the same as already present for position and 3D sensing. For enhanced performance, high intensity IR flood illumination may also be added to maximize hand visibility over that of the background scene. ams OSRAM offers: 

  • MIRA global shutter image sensors and wafer level optics, combined to create ultra-small, IR enhanced, low power camera modules with resolutions from 0.16 – 2.2MP. 
  • OSLON BLACK & P1616 Infrared LEDs and BIDOS VCSEL flood illuminators offer powerful and efficient IR flood illumination from highly compact packages. 
  • The AS1170 driver IC drives LED and VCSEL illuminators synchronously with the camera shutter, featuring advanced protection and safety features. 
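One simple way the synchronized flood illumination described above can be exploited is differential imaging: capture one frame with the illuminator strobed on and one with it off, then subtract them, so the nearby hand, which receives most of the IR flood, stands out against the ambient background. The sketch below uses synthetic frames purely for illustration.

    # Differential IR imaging: subtract an unlit frame from a frame captured
    # with the IR flood illuminator strobed on, so the nearby hand stands out
    # against the ambient background. Frames here are synthetic examples.
    import numpy as np

    rng = np.random.default_rng(0)
    ambient = rng.integers(20, 60, size=(120, 160), dtype=np.int16)   # background only
    illuminated = ambient.copy()
    illuminated[40:80, 60:100] += 120        # hand region brightened by the IR flood

    hand_signal = np.clip(illuminated - ambient, 0, 255)
    hand_mask = hand_signal > 50             # simple threshold isolates the hand
    print(hand_mask.sum(), "pixels flagged as hand")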


Large and intentional hand gestures may further be detected at very low power levels using our compact TMF8828 multi-zone dToF sensor module. This enables features such as low power gesture-based wakeup, or simple hand tracking in ultra-lightweight AR glasses. 
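As a loose illustration of low power gesture detection, the sketch below classifies a horizontal swipe from a sequence of multi-zone distance frames, such as an 8x8 zone grid. The frame read-out, thresholds and zone geometry are hypothetical stand-ins, not the actual TMF8828 driver interface.

    # Toy swipe detector over multi-zone depth frames (e.g. an 8x8 zone grid).
    # The frames would come from the sensor driver; here they are plain arrays.
    import numpy as np

    def hand_column(frame: np.ndarray, max_range_mm: float = 400.0) -> float | None:
        """Return the mean column index of zones closer than max_range_mm, if any."""
        rows, cols = np.nonzero(frame < max_range_mm)
        return float(cols.mean()) if cols.size else None

    def detect_swipe(frames: list[np.ndarray], min_travel_cols: float = 3.0) -> str | None:
        """Classify a left/right swipe from the horizontal travel of the hand."""
        positions = [c for c in (hand_column(f) for f in frames) if c is not None]
        if len(positions) < 2:
            return None
        travel = positions[-1] - positions[0]
        if travel >= min_travel_cols:
            return "swipe_right"
        if travel <= -min_travel_cols:
            return "swipe_left"
        return None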


6DoF position tracking for VR headsets

Accurate and fast six degree of freedom (6DoF) position tracking is a crucial function for all VR/AR headsets. Inertial sensors alone are insufficient to give a repeatable absolute position, so optical techniques are typically used. There are two key approaches. 

Inside Out tracking uses cameras on the device to track multiple fixed features identified in the surrounding scene. By tracking the relative movement of the features, from multiple camera positions, the 6DoF position of the device can be calculated using image processing. Because no additional infrastructure in the scene is required, this approach is now the most widely used.
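One common formulation for recovering the 6DoF pose from tracked features is the perspective-n-point (PnP) problem: given the 3D positions of mapped features and their 2D detections in the current camera frame, solve for the camera rotation and translation. The sketch below uses OpenCV's generic solver with synthetic data; it illustrates the principle and is not ams OSRAM tracking firmware.

    # Minimal perspective-n-point (PnP) pose estimate with OpenCV.
    # Feature positions, camera intrinsics and the "true" pose are placeholders.
    import cv2
    import numpy as np

    # 3D feature positions from the device's map (metres).
    object_points = np.array([[-0.5, -0.3, 2.0],
                              [ 0.5, -0.3, 2.2],
                              [-0.5,  0.4, 2.5],
                              [ 0.5,  0.4, 2.1],
                              [ 0.0,  0.0, 3.0],
                              [ 0.2, -0.2, 2.8]], dtype=np.float64)

    camera_matrix = np.array([[600.0, 0.0, 320.0],
                              [0.0, 600.0, 240.0],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.zeros(5)                      # assume a rectified image

    # Synthesize 2D detections by projecting the map with a known pose,
    # then recover that pose with solvePnP, as an inside-out tracker would.
    true_rvec = np.array([0.05, -0.02, 0.01])
    true_tvec = np.array([0.10, -0.05, 0.30])
    image_points, _ = cv2.projectPoints(object_points, true_rvec, true_tvec,
                                        camera_matrix, dist_coeffs)

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs)
    print(ok, rvec.ravel(), tvec.ravel())          # should match the true pose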

Outside In tracking places fixed sensors, typically cameras, around the room in which the device(s) are used. Specific markers, typically infrared LEDs, are placed onto the devices to be tracked. By tracking the relative movement of the markers from different camera positions, a high accuracy position can be triangulated. Whilst this approach offers high reliability and accuracy, the requirement to place hardware around the room means that it is used only for more specialized systems. 
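The underlying geometry is classical two-view triangulation: with known projection matrices for two calibrated, fixed cameras, the rays through each camera's detection of the same marker are intersected to recover its 3D position. A minimal sketch with illustrative numbers, using OpenCV's triangulation routine:

    # Triangulate a marker's 3D position from two fixed, calibrated cameras.
    # Projection matrices and detections below are illustrative values.
    import cv2
    import numpy as np

    # 3x4 projection matrices P = K [R | t] for two wall-mounted cameras
    # (K = identity, i.e. normalized image coordinates, for simplicity).
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                    # reference camera
    P2 = np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])    # 0.5 m baseline

    # Normalized image coordinates of the same IR LED marker in each camera.
    x1 = np.array([[0.10], [0.05]], dtype=np.float64)
    x2 = np.array([[-0.15], [0.05]], dtype=np.float64)

    point_h = cv2.triangulatePoints(P1, P2, x1, x2)      # homogeneous 4x1 result
    point_3d = (point_h[:3] / point_h[3]).ravel()
    print(point_3d)                                      # marker position in metres

With three or more markers per device, the full 6DoF pose of the device follows from the recovered marker positions.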

ams OSRAM offers sensors and emitters for all types of position tracking systems:

  • MIRA global shutter image sensors and wafer level optics can be combined to create ultra-small, IR enhanced, low power camera modules with resolutions from 0.16 – 2.2MP.
  • Firefly IR chip LEDs provide a high efficiency IR point source in a small form factor, ideal for use as a marker on objects to be tracked. 
  • BPW34S and SFH2704 photodiodes 

VR hand controller position tracking

VR hand controllers also require accurate tracking of their movement and position relative to the user. 

Outside in tracking is typically employed for hand controllers: a constellation of IR LEDs is embedded in each controller, and these are tracked using the existing world facing cameras on the headset. This provides a robust and simple solution that works well for the majority of use cases. To implement the IR LED constellation, ams OSRAM Firefly® chip LEDs provide a high efficiency and compact solution.

Inside out tracking can alternatively be implemented by embedding multiple cameras into each hand controller, and tracking their absolute position relative to features in the scene. This has the advantage that it works even when the hands are not in view of the headset, but makes the controllers more complex. ams OSRAM MIRA global shutter image sensors and wafer level optics micro-cameras offer a small and power efficient solution here.


VR collision avoidance

Collision with real world objects is one of the hazards of immersive VR experiences. To avoid this, VR headsets alert the user when they move out of a safe play area.

Typically, VR headsets implement this alert based upon a safety boundary, defined by the user, and tracked using the 6DoF position tracking systems described above.
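A rough sketch of how such a boundary alert could be evaluated: the tracked headset position from the 6DoF system is projected onto the floor plane and tested against the user-defined play area polygon. The ray-casting containment test below is a generic illustration with example coordinates, not an ams OSRAM implementation.

    # Warn when the tracked headset position leaves the user-defined play area.
    # Generic ray-casting point-in-polygon test; boundary coordinates are examples.

    def inside_play_area(x: float, z: float, boundary: list[tuple[float, float]]) -> bool:
        """Return True if floor-plane point (x, z) lies inside the boundary polygon."""
        inside = False
        n = len(boundary)
        for i in range(n):
            x1, z1 = boundary[i]
            x2, z2 = boundary[(i + 1) % n]
            crosses = (z1 > z) != (z2 > z)
            if crosses and x < x1 + (x2 - x1) * (z - z1) / (z2 - z1):
                inside = not inside
        return inside

    play_area = [(-1.5, -1.5), (1.5, -1.5), (1.5, 1.5), (-1.5, 1.5)]  # 3 m x 3 m square
    print(inside_play_area(0.2, 0.4, play_area))   # True: safe
    print(inside_play_area(2.0, 0.0, play_area))   # False: trigger the boundary alert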

In addition, ams OSRAM direct time of flight (dToF) distance sensor modules can be used to detect the presence and distance of nearby objects. Also adopted for collision avoidance in robotics systems, dToF sensors provide a compact and low power solution for accurately measuring range in real time. 
