Powering next-generation eye, face and hand tracking

Enable a new generation of interactive experiences with our comprehensive range of infrared emitters and sensors.

Eye and face tracking enable natural interaction in virtual and augmented reality systems as well as in smart glasses. Eye gaze becomes a user interface and helps drive photorealistic avatars, while foveated rendering reduces compute load and system power. Face tracking transfers facial expressions for social and professional presence. Critical requirements include high accuracy, fast update rates, minimal power consumption, and a tiny form factor.

How eye tracking works (camera-based) 

Camera-based eye tracking combines infrared (IR) illumination (LED or VCSEL) with a CMOS global-shutter image sensor. The illumination operates in the invisible IR spectrum; light reflected from the eye is captured and processed via pupil/glint detection. Global-shutter sensors avoid rolling-shutter artifacts and support ultra-short exposure times, delivering high precision with low motion blur.
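As a concrete illustration, the sketch below shows the pupil/glint step on a single IR frame: the pupil is found as the darkest blob and the LED glints as the brightest specular spots; a gaze vector can then be derived from the pupil-glint offset. The thresholds, helper name, and synthetic test frame are illustrative assumptions, not vendor firmware.

```python
# Minimal pupil/glint detection sketch for a single IR frame.
# Assumptions (illustrative only): 8-bit grayscale global-shutter frame,
# pupil = darkest blob, glints = brightest specular spots from IR LEDs.
import numpy as np
import cv2

def detect_pupil_and_glints(frame: np.ndarray,
                            pupil_thresh: int = 40,
                            glint_thresh: int = 220):
    """Return (pupil_center, glint_centers) in pixel coordinates."""
    # Pupil: dark region -> inverted threshold, largest contour centroid.
    _, dark = cv2.threshold(frame, pupil_thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    pupil = None
    if contours:
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] > 0:
            pupil = (m["m10"] / m["m00"], m["m01"] / m["m00"])

    # Glints: bright specular reflections -> high threshold, small blobs.
    _, bright = cv2.threshold(frame, glint_thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(bright, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    glints = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            glints.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return pupil, glints

# Synthetic test frame: mid-gray eye region, dark pupil, one bright glint.
frame = np.full((120, 160), 128, np.uint8)
cv2.circle(frame, (80, 60), 15, 20, -1)   # pupil
cv2.circle(frame, (88, 52), 2, 255, -1)   # glint
print(detect_pupil_and_glints(frame))
```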

Component examples (camera-based) 

  • MIRA global-shutter image sensors integrated in reflowable, IR-enhanced miniature camera modules.

  • IR LEDs for illumination, e.g., SFH4060 Firefly®.

  • AS1181 multi-channel LED/VCSEL driver enabling ultra-short pulses for global-shutter sensors and single-fault-tolerant safety monitoring for eye safety under all operating conditions.

How eye tracking works (photosensor-based)

For ultra-low power, especially in all-day smart glasses, photodiode-based approaches offer advantages. IR LEDs emit short pulses, and photodiodes detect the reflections from the cornea and sclera. Optical reflection sensors integrate the LED, photodiode, LED driver, and analog front end together with digital control and readout; a minimal sketch of the pulse/sample principle follows the component example below.

  • TMD2636 proximity/optical reflection sensing module. 
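The sketch below simulates that pulse/sample principle: the photodiode is sampled with the IR LED on and off, and the two readings are subtracted to reject ambient light. The numbers and function names are illustrative; this is not the TMD2636 register interface.

```python
# Sketch of the pulsed-reflection scheme used by optical reflection
# sensors: sample with the LED on and off, subtract to cancel ambient.
# All values are made-up illustrations of the principle.
import random

def photodiode_sample(led_on: bool, reflectance: float) -> float:
    ambient = 0.30 + random.uniform(-0.02, 0.02)   # sunlight, displays, ...
    signal = reflectance if led_on else 0.0        # corneal/scleral return
    return ambient + signal

def measure_reflectance(reflectance: float, n_pulses: int = 16) -> float:
    """Average (LED on - LED off) over n short pulses."""
    acc = 0.0
    for _ in range(n_pulses):
        acc += photodiode_sample(True, reflectance)
        acc -= photodiode_sample(False, reflectance)
    return acc / n_pulses

# The cornea and sclera return different amounts of IR depending on gaze
# angle, so a few emitter/detector pairs yield a coarse gaze cue.
print(f"cornea-facing channel: {measure_reflectance(0.20):.3f}")
print(f"sclera-facing channel: {measure_reflectance(0.05):.3f}")
```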

Architecture at a glance 

  • Camera-based: Highest accuracy and full image information; higher power and compute. 

  • Photosensor-based: Very low power, compact, and cost-efficient; sufficient accuracy for many glasses use cases.

Face tracking and avatar reconstruction

Face tracking reconstructs facial geometry and expression to drive photorealistic avatars. In VR/AR, high frame rates, low latency, and robust illumination are key to consistent facial expression transfer for social interaction, telepresence, and collaboration. 
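One common formulation of this reconstruction step fits a linear blendshape model to tracked landmarks: solve for expression weights w so that neutral + B·w best matches the observation. The sketch below uses random stand-in data for the landmarks and basis; real rigs use dozens of curated shapes and regularized solvers.

```python
# Sketch of avatar expression recovery via a linear blendshape fit.
# Basis and landmark data are random stand-ins, not a production rig.
import numpy as np

rng = np.random.default_rng(0)
n_landmarks, n_shapes = 68, 10

neutral = rng.normal(size=3 * n_landmarks)              # rest-pose landmarks
basis = rng.normal(size=(3 * n_landmarks, n_shapes))    # expression deltas

true_w = np.clip(rng.normal(0.3, 0.2, n_shapes), 0, 1)  # ground-truth weights
tracked = neutral + basis @ true_w + rng.normal(0, 0.01, 3 * n_landmarks)

# Least-squares fit, then clamp to the valid [0, 1] blendshape range.
w, *_ = np.linalg.lstsq(basis, tracked - neutral, rcond=None)
w = np.clip(w, 0.0, 1.0)
print("recovered weights:", np.round(w, 2))
```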

Benefits and use cases 

  • Interaction: Eye-gaze UI for precise selection and navigation.

  • Efficiency: Foveated rendering reduces processing and system power (see the sketch after this list).

  • Safety & comfort: Invisible IR illumination; eye-safety drivers prevent limit violations even under fault conditions.

  • Extended functions: Eye movements as a potential source of vital-sign data.

  • Design: Miniaturized modules for compact headsets and glasses. 
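To make the efficiency point concrete, here is a minimal sketch of gaze-driven foveated rendering: the shading rate falls off with a tile's distance from the gaze point. The ring radii and rates are assumptions for illustration, not values from any particular headset.

```python
# Sketch of gaze-driven foveated rendering: coarser shading rates for
# screen tiles farther from the gaze point. Radii/rates are placeholders.
import math

def shading_rate(tile_center, gaze, fovea_px=200, mid_px=500):
    """Return the samples-per-pixel fraction for a screen tile."""
    d = math.dist(tile_center, gaze)
    if d < fovea_px:
        return 1.0       # full resolution at the fovea
    if d < mid_px:
        return 0.25      # 1 shade per 2x2 pixels in the mid-periphery
    return 0.0625        # 1 shade per 4x4 pixels in the far periphery

gaze = (960, 540)
for tile in [(970, 530), (1300, 540), (100, 100)]:
    print(tile, "->", shading_rate(tile, gaze))
```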


Safety & compliance 

IR sources near the eye must operate within defined eye-safety limits. Multi-channel drivers with single-fault-tolerant monitoring ensure compliance even in hardware or software fault scenarios.
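As an illustration of the kind of check such a supervisor enforces, the sketch below rejects any pulse pattern whose time-averaged power exceeds a configured ceiling. The limit value is a placeholder, not a normative figure from IEC 62471, and in a driver like the AS1181 the equivalent monitoring sits in the device itself so that a host fault cannot raise emission above the limit.

```python
# Sketch of an eye-safety limit check: time-averaged radiant power from
# pulsed IR emitters must stay below a ceiling regardless of what the
# host requests. The ceiling below is a placeholder, not a real limit.

AVG_POWER_LIMIT_MW = 2.0  # placeholder ceiling, not a normative value

def pulse_request_allowed(peak_mw: float, pulse_us: float,
                          period_us: float) -> bool:
    """Reject any pulse pattern whose average power exceeds the ceiling."""
    if pulse_us <= 0 or period_us <= 0 or pulse_us > period_us:
        return False                      # malformed request: fail safe
    avg_mw = peak_mw * (pulse_us / period_us)
    return avg_mw <= AVG_POWER_LIMIT_MW

print(pulse_request_allowed(peak_mw=100, pulse_us=50, period_us=10_000))   # True
print(pulse_request_allowed(peak_mw=100, pulse_us=500, period_us=1_000))   # False
```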

Application block diagram 

The following application block diagram illustrates potential hardware architectures for eye and face tracking in XR devices. Camera-based systems combine infrared illumination (LEDs or VCSELs) with a CMOS global-shutter image sensor to capture detailed eye and facial features. Photosensor-based systems, in contrast, use highly integrated optical reflection (proximity) sensors to enable ultra-low-power gaze detection in compact wearable designs.

FAQ

What is eye tracking in VR/AR?

Capturing eye gaze vectors using IR illumination and sensors to enable user interfaces, reduce compute via foveated rendering, and support emerging biosignal applications. 

Why global-shutter sensors for eye tracking?

They eliminate rolling-shutter artifacts, capture fast eye movements precisely, and support ultra-short exposures synchronized with pulsed IR illumination.

Camera vs. photosensor-based—what’s the difference?

Camera-based delivers maximum accuracy and full image context at higher power; photosensor-based minimizes power and size with adequate tracking performance for many glasses designs.

How is eye safety ensured?

Multi-channel drivers with single-fault-tolerant monitoring limit IR output within regulatory thresholds, maintaining compliance even during faults.

Which components are suitable?

MIRA global-shutter sensors, SFH4060 Firefly® IR LEDs, the AS1181 LED/VCSEL driver, and the TMD2636 optical reflection sensor.
