An article written by OJO & YOSHIDA REPORT in collaboration with Pierre Cambou from Yole Intelligence, part of Yole Group – Finally, event-based image sensors leveraging neuromorphic engineering principles come to smartphones.
Despite its aspirations in the mobile market, Prophesee needed Sony to develop a sample of a stacked event-based sensor chip in late 2019. “This was a very important step for us,” said Luca Verre, Prophesee’s co-founder and CEO. “We managed, through Sony’s manufacturing technology, to significantly reduce the size of the event-based sensor” to match that of image sensors already integrated into smartphones. Verre explained that with Sony’s stacking technology, “we reduced the pixel pitch from 15 µm to 4.86 µm, thus effectively shrinking the silicon area by a factor of 10.”
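The factor-of-ten figure follows directly from the pitch numbers: for square pixels, area scales with the square of the pitch, and (15 / 4.86)² ≈ 9.5. A minimal sketch of that arithmetic (the helper function is illustrative; only the pitch values come from the article):

```python
# Sanity check of the shrink factor quoted in the article:
# pixel area scales with the square of the pixel pitch,
# so the ratio of old area to new area is (old_pitch / new_pitch)^2.

def area_shrink_factor(old_pitch_um: float, new_pitch_um: float) -> float:
    """Ratio of old pixel area to new pixel area for square pixels."""
    return (old_pitch_um / new_pitch_um) ** 2

factor = area_shrink_factor(15.0, 4.86)
print(f"{factor:.1f}")  # ≈ 9.5, roughly the 10x reduction Verre cites
```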
Pierre Cambou, principal analyst at Yole Intelligence, agreed. “Biologically inspired, event-based sensors require many transistors to duplicate the behavior of neurons. It was stacked semiconductor technology (circa 2015) that opened the door to practical event-based sensors, which are now available with high dynamic range (HDR).”
Several years ago, Sony devised a process for physically stacking two or three wafers, with connections at the pixel level, Verre explained. “So, you have the photodiode on top, and then you put more intelligence on the second layer, and potentially even additional intelligence in a third layer.”