Continuing our investigation into the potential of the Augmented Reality (AR) market for consumers, Yole Développement is interviewing the leading industry players identified in our recent report “Displays and Optics for AR & VR 2020”. Rumors continue that AR headsets are just around the corner. Is this the consumer dream? Is it the next consumer electronics revolution? These catchy ideas represent billions in investments from all parts of the supply chain, each player trying to build on this momentum. In the meantime, we are seeing major announcements across the industry.
More strategic moves are happening throughout the supply chain, with major consumer electronics brands investing, partnering and acquiring. Meanwhile, the consumer AR headset increasingly looks like a bound-to-happen reality. We discussed the optics challenges and progress in interviews with WaveOptics about its advancements in waveguides and with Oxford Instruments about its newly released equipment specifically targeting the AR market. Now Yole is focusing on another critical building block for AR headsets. Zine Bouhamri, Technology and Market Analyst in Displays at Yole Développement, interviewed Compound Photonics, a leading provider of compact high-resolution microdisplay solutions. Mike Lee, the company’s Head of Business and Corporate Development, kindly shares the company’s vision of the AR market and what Compound Photonics brings to disrupt the driving paradigm of microdisplays.
Yole Développement (YD): Could you please share with our readers a little bit about yourself, your activities and the activities of Compound Photonics?
Mike Lee (ML): I was previously at Plessey Semiconductors focusing on the business of microLED technology development. I am now leading CP Display (CP), also known as Compound Photonics, as we leverage the company’s capabilities in Liquid Crystal on Silicon (LCoS) microdisplay products, with resolutions of 1080p, 2Kx2K and 4K today, while transitioning to microLED-based display technology. We presently have a team of optical and electronic engineers at our two U.S. locations, Chandler, Arizona, and Vancouver, Washington. I lead the business and strategic corporate development activities.
YD: CP has been in the LCoS field for quite some time now. What are the major applications that you target with these devices?
ML: CP is a leading provider of compact high-resolution microdisplay solutions. As AR, together with virtual reality and mixed reality (VR/MR), continues to lead the wave of the mobile technology revolution, CP has been at the forefront of innovation by architecting our microdisplays to optimize the human visual experience.
We started as a high-performance projector system design company based on LCoS technology. As demand for AR/MR display technologies rose, our compact 3µm-pixel microdisplay received a lot of attention, and CP made a strategic move to focus on microdisplay subsystem development. With this background, our team has accumulated a wealth of display design experience and intellectual property (IP) at both the component and system levels.
CP’s microdisplay has been well known in the industry as the world’s smallest full high-definition solution. Our microdisplay specs have enabled a host of compact and wide field-of-view (FOV) engine designs. In today’s AR/MR hardware development activities, most of the smart glasses developers and many global consumer electronic brands have adopted our signature 0.26” 1080p Integrated Display Module (IDM) microdisplay in their AR headset development.
YD: What are your capabilities in terms of manufacturing?
ML: In our Chandler, Arizona, fab, we have been perfecting multiple complex fabrication processes, including Liquid Crystal Assembly (LCA) filling for LCoS and backplane to array bonding for microLED. In addition, we have developed best-in-class capability for device packaging, display testing, as well as light engine reference design.
YD: What do you think could be the limitations of LCoS for future applications?
ML: Projection technologies like Digital Light Processing (DLP) and LCoS need an external light source, and in the case of LCoS, polarizers to control the light and generate images. CP’s existing 1080p and 2Kx2K LCoS display products have achieved industry-leading small 0.26″-0.34″ diagonal sizes, as well as the highest contrast ratio and brightness for AR. But to meet future requirements, the relatively complicated optical design of LCoS-based engines makes it challenging to optimize brightness and contrast while also reducing size and power consumption for next-generation AR.
YD: What are the latest hurdles you have had to manage or that you plan on tackling? What is your roadmap in terms of technology and market development? And why did you get into the microLED business?
ML: The industry continues to have a very strong appetite for reducing size and increasing power efficiency and brightness. We believe that our current technology path will meet or exceed these demands.
CP pioneered a fully digital backplane and drive architecture that is highly scalable and easily made modular. It is therefore a natural path for CP to re-engineer our advanced LCoS backplane and expand it into an all-in-one microdisplay driver platform enabling next-generation light modulation – from near-term microLED-based display devices to future development in phase modulation, when true 3D holography computation emerges.
YD: How do you plan on developing the microLED opportunity?
ML: Currently CP is working with many leading microLED developers, and we plan to make some announcements later this year on strategic partnerships. We consider our backplane to be microLED agnostic, meaning we can work with many leading microLED providers, no matter their wafer technology or their transfer process. Our business model is flexible. We can provide the backplane, or we can provide end-to-end solutions including process integration for bonding, assembly, packaging and testing.
The high-end, high-resolution display market has been heading toward microLED. CP is able to transition our voltage-driven LCoS vDrive backplane to the constant-current iDrive, which is optimized for driving microLED arrays, building on our existing bonding and packaging for the 0.26” diagonal 1080p resolution module and the 0.34” diagonal 2Kx2K module.
In parallel with building this microLED compatibility into our present driver architecture, called Nova, we have also developed a game-changing video pipeline drive architecture – IntelliPix™ – that departs from the frame-based driving paradigm in favor of a design able to control each pixel intelligently. This paradigm shift lets us manage each pixel dynamically based on the application's needs and enables foveated rendering at the display level, for faster speed and lower power in the future.
As of today, CP is the only player in the market to have taped out a functional MIPI specification-based constant-current drive technology platform, enabling global brands and key microLED makers to build their display subsystems and channels.
In order to better communicate our IntelliPix™ technology and platform to the AR/MR/VR technology community, we will kick off a series of branding activities starting from Display Week next month.
YD: Your focus now is more around the driving technology you can provide, rather than the whole device in itself. Can you tell us more about this new driving technology? How do you think it will change the overall technology and market landscapes?
ML: CP’s constant-current iDrive backplane leverages our existing 3.015µm pixel pitch, 0.26” 1080p device, which is fully optimized for uniformity. The rest of the system logic stays the same as in the voltage-driven backplane, allowing for an easy development process.
Our Nova drive architecture has reduced latency for 60Hz video from as much as 16ms to around 1ms, avoiding image lag and swimming. Nova also allows frame-by-frame switching, so the system integrator can use the software toolkit to tailor performance and the associated power consumption to the current application.
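As a back-of-the-envelope illustration (our own arithmetic, using only the figures quoted above): a frame-buffered 60Hz pipeline can add up to one full frame period of latency before a pixel updates, which is where the ~16ms figure comes from.

```python
# Illustrative arithmetic only: relates the quoted latency figures to the
# 60 Hz frame period. The ~1 ms figure is CP's stated number for Nova;
# everything else is simple unit conversion.
refresh_hz = 60
frame_period_ms = 1000 / refresh_hz           # ≈ 16.67 ms per frame
frame_buffered_latency_ms = frame_period_ms   # worst case: wait one full frame
nova_latency_ms = 1.0                         # CP's quoted figure

print(f"{frame_period_ms:.2f} ms frame period at {refresh_hz} Hz")
print(f"~{frame_buffered_latency_ms / nova_latency_ms:.0f}x latency reduction")
```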
Given the fact that AR, in most situations, does not fill the FOV with information, the system is dealing with sparse data. The question then needs to be asked, why transmit the full frames of video and carry the full bandwidth video through the system?
Our new-generation architecture, IntelliPix™, is a single-chip display that can drive waveforms up to 100x faster, consume 4x to 12x less system power, and reduce bandwidth across the video pipeline compared with our present Nova platform.
With its capability to control each pixel dynamically, IntelliPix™ also enables operating selected display regions at different refresh rates adapted to the content type within each region, and it opens the door to foveated rendering, with attendant benefits for optimized processing and power on the host. When applied to emissive displays such as microLED, power is only burned for active pixels, multiplying the power savings.
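The bandwidth argument behind this sparse-data point can be sketched numerically. The following is a hypothetical illustration of our own (not CP's implementation or figures): it compares the pixel traffic of full-frame driving against sending only the pixels an AR overlay actually changes.

```python
# Hypothetical sketch, not CP's implementation: compares the per-frame pixel
# traffic of full-frame driving vs. sparse per-pixel updates when only a
# small AR overlay changes between frames. The 5% active fraction is an
# assumed example value.
WIDTH, HEIGHT = 1920, 1080               # 1080p microdisplay
full_frame_pixels = WIDTH * HEIGHT

active_fraction = 0.05                   # assume the overlay touches 5% of pixels
sparse_pixels = int(full_frame_pixels * active_fraction)

savings = full_frame_pixels / sparse_pixels
print(f"Full frame:  {full_frame_pixels:,} pixels")
print(f"Sparse send: {sparse_pixels:,} pixels (~{savings:.0f}x less traffic)")
```

The sparser the AR content, the larger the savings, which is why per-pixel control pays off most on emissive displays where inactive pixels also burn no power.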
YD: It feels that you are targeting the AR headset opportunity. How do you feel about the opportunity it represents for you and for the consumers, if it ever materializes?
ML: We are at an exciting turning point. All the global technology brands are working to take content from the phone to the eye through AR/MR. Current AR glasses are too big and expensive to become widespread among consumers; sleeker, cheaper ones offer little functionality or distinctive consumer benefit. Although consumer adoption has yet to take off, AR headsets and solutions have been growing steadily in the industrial and commercial segments.
In the past few months, we saw public announcements that Apple acquired VR entertainment platform NextVR, Google bought AR glasses maker North and Facebook got into microLED development. We have no doubt that these consumer technology giants will introduce AR glasses that balance price, design and functionality to hit those all-important market criteria. On the display technology front, microLED offers multiple unique value propositions for enabling consumer AR glasses. Currently, industry attention in microLED is on solving engineering and manufacturing challenges, with effort going into technologies such as display driving, image improvement, light management, bonding and packaging, and the supply chain. We believe CP is well positioned today to be a leading all-in-one solution provider that accelerates time to market for microLED.
YD: It feels like OEMs would rather circumvent the incumbent display engine technologies for AR headsets while waiting for the microLED opportunity to materialize. How can you help with that?
ML: AR devices and applications are complex and diversified, and it takes time for the industry to build up the ecosystem.
We offer market-leading LCoS technology today to allow our OEM customers to nurture that ecosystem. Our Nova and IntelliPix™ drive architectures are based on the same control scheme, so our OEM customers can transition naturally from an LCoS display light engine platform to the next-generation emissive microLED platform.
One of the key advantages of the IntelliPix™ platform is its flexibility in driving microLED pixels. Its scalability in pixel sizes and resolutions expands display development tailored to commercially ready consumer MR applications, such as sports goggles and smart watches.
YD: How do you see the competition with other display engine technologies?
ML: The most likely competition is from OLED technology. However, microLED has certain natural advantages, first and foremost its size reduction. Compared with OLED, microLED has no issues with luminance or decay over time, so total display performance is superior in brightness and contrast, with outstanding battery life.
YD: As you are providing backplanes, are you front plane agnostic in the display engine field? Can you interface with any kind of technology? What do you make of OLED-on-silicon, for example?
ML: Yes, we are front plane agnostic. Our architecture is best suited to premium microdisplays for compact headsets designed for AR near-eye applications. Both Nova and IntelliPix™ are based on advanced silicon design nodes, fully optimized with sub-10 micron pixels for high resolution, high optical efficiency and fast modulation speed. Additionally, IntelliPix™’s unique ability to manage the video pipeline at the individual pixel level is a critical enabler for microLED displays. For leading microLED makers, our platform and our packaging and bonding processes provide a jump start that accelerates their time to market for a premium display device.
With that in mind, we are open to evaluate other front plane technologies and partnerships that make technical and commercial sense.
YD: What is your expectation in terms of market development for AR headsets in the near future?
ML: We are involved in many exciting AR headset projects with some of the largest AR OEMs. We believe this is going to be the next generation of mobile computing. The recent global pandemic has motivated companies and individuals to invest in productivity tools for remote and virtual collaboration. In addition, AR headset projects will be further accelerated by expanding 5G coverage, which will drastically increase the mobile data processing capabilities available to AR headsets.
YD: Apart from AR headsets, have you identified niche pockets of growth for your technology?
ML: We are working with many different types of companies producing simple displays for swimming or ski goggles, scuba diving masks, and even gun scopes. We are also developing solutions for direct-to-eye VR systems and wearable applications such as smart watches. Essentially, our technology and capabilities center on ultra-compact, highly efficient display solutions, where optical performance and battery life are most critical.
YD: Further down the road, there is the computer-generated holography trend, and your partnership with VividQ runs alongside your development of phase-modulation-capable LCoS. We do not think this is something consumers should expect soon, though, because of the huge computing and optical requirements. Could you please share some insights about this partnership and product development?
ML: Fortunately, our current amplitude LCoS technology allows us to transition to microLED-based displays, as mentioned above. It also leads to future development in phase LCoS, which unlocks true 3D holographic imaging.
VividQ is an example of the strategic ecosystem and technology partnerships we will continue to develop over the long term as holographic image presentation becomes more viable. It is still early days, but we are very pleased to have phase-modulation LCoS on our roadmap.
YD: Do you have a final word for our readers?
ML: Thank you for the opportunity to discuss CP’s technology and strategy with the industry experts at Yole as well as the i-Micronews readers. We hope we have given you and your readers a better insight into CP’s full range of capabilities and the roadmap from our 1080p to 2Kx2K amplitude LCoS modulation, leading to microLED integrated display systems, and then carrying on to LCoS phase modulation in the future as holographic computing begins to emerge.
In the meantime, we hope you and your readers stay safe in these very challenging times.
Mike Lee joined Compound Photonics (CP) in early 2020 leading strategic business and corporate development efforts. He has been spearheading CP’s overall strategy as the leading technology provider in backplane and video drive architecture that enables microLED display development for next generation AR/MR/VR.
Mike has over 20 years of experience in the display, electronics and semiconductor industries as an investor, turnaround expert and advisor to private equity and venture capital. He has successfully grown several display and technology companies to valuations of several hundred million dollars. Mike attended the University of Arizona, where he studied Systems Engineering, and London Business School.
As a Technology & Market Analyst, Displays, Zine Bouhamri, PhD is a member of the Photonics, Sensing & Display division at Yole Développement (Yole).
Zine manages the day-to-day production of technology & market reports, as well as custom consulting projects. He is also deeply involved in the business development of the Displays unit at Yole.
Previously, Zine was in charge of numerous R&D programs at Aledia. Over more than three years there, he developed strong technical expertise as well as a detailed understanding of the display industry.
Zine is author and co-author of several papers and patents.
Zine Bouhamri holds an Electronics Engineering Degree from the National Polytechnic Institute of Grenoble (France), one from the Politecnico di Torino (Italy), and a Ph.D. in RF & Optoelectronics from Grenoble University (France).