3D Imaging and Sensing: From Enhanced Photography to an Enabling Technology for AR and VR – Webinar
Beginning in late 2017, Apple brought an innovative use case to mobile devices: a structured-light 3D sensing camera module built into the front bezels of its smartphones (and later, tablets) let users rapidly and reliably unlock their devices using only their faces. Android-based mobile device manufacturers responded by adding front depth sensors to their own products, and are now striving to enable applications that additionally leverage rear-mounted depth sensors.
So far, at least, these new applications are predominantly photography-related. In the near future, however, they're expected to expand further into augmented reality, virtual reality and other applications, following in the footsteps of Google's trendsetting Project Tango experiment of a few years ago. And with rumors suggesting that time-of-flight camera modules will also begin appearing on the rear sides of iPhones later this year, it's anyone's guess which platform, Android or iOS, will launch (and achieve widespread success) with this technology first.
In this 40-minute webinar, leading market research firm Yole Développement will describe the application roadmap, market value and cost of these highly anticipated mobile 3D sensing modules, including topics such as CMOS image sensors, optical elements and VCSEL illumination. The webinar will be presented by Yole Développement Principal Analyst Pierre Cambou, who has been active in the imaging industry for more than 20 years, and is organized by the Embedded Vision Alliance. A 15-minute question-and-answer session will follow the presentation.