We all know that for a camera to work, it needs a stack of lenses to capture light and deliver it to the sensor to produce a sharp image. This is also the main reason why even the thinnest and most powerful smartphones still have to deal with a camera bump. The lens is as essential as the sensor itself, and we can’t change that, at least for now.
Engineers at the California Institute of Technology (Caltech) have developed a camera design that replaces the lenses with an ultra-thin optical phased array (OPA), meaning it uses math as a substitute for lenses.
“Here, like most other things in life, timing is everything. With our new system, you can selectively look in a desired direction and at a very small part of the picture in front of you at any given time, by controlling the timing with femto-second—quadrillionth of a second—precision,” says Ali Hajimiri, an engineering professor at Caltech and principal investigator of the paper.
“We’ve created a single thin layer of integrated silicon photonics that emulates the lens and sensor of a digital camera, reducing the thickness and cost of digital cameras. It can mimic a regular lens, but can switch from a fish-eye to a telephoto lens instantaneously—with just a simple adjustment in the way the array receives light,” professor Hajimiri added.
“What the camera does is similar to looking through a thin straw and scanning it across the field of view. We can form an image at an incredibly fast speed by manipulating the light instead of moving a mechanical object,” says Reza Fatemi, lead author of the paper.
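The idea behind that "straw scanning" can be sketched with a toy delay-and-sum beamformer. This is a minimal illustration of the general phased-array principle, not the Caltech team's actual heterodyne design: the parameters below (an 8-element one-dimensional array, half-wavelength spacing, a single plane wave arriving at 20 degrees) are assumptions chosen for the demo.

```python
import numpy as np

# Hypothetical setup (not from the paper): an 8-element 1-D receive
# array with half-wavelength element spacing, observing one plane wave.
N = 8                          # number of array elements
d = 0.5                        # element spacing in wavelengths
true_angle = np.deg2rad(20)    # direction of the incoming wave

# Far-field phase of the wave at each element.
n = np.arange(N)
received = np.exp(2j * np.pi * d * n * np.sin(true_angle))

# "Look" in each candidate direction by applying compensating phase
# shifts and summing -- the electronic analogue of pointing a straw.
angles = np.deg2rad(np.linspace(-90, 90, 361))
steering = np.exp(-2j * np.pi * d * n[:, None] * np.sin(angles)[None, :])
response = np.abs(steering.T @ received) / N

# The summed response peaks in the direction the wave came from.
estimated = np.rad2deg(angles[np.argmax(response)])
print(round(estimated))  # prints 20
```

Sweeping the compensating phase (or, in the real chip, the timing) across all candidate directions and recording the summed intensity at each one builds up an image point by point, with no moving parts.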
Read the paper here: “An 8X8 Heterodyne Lens-less OPA Camera”
The lensless camera array is only capable of capturing low-resolution photos for now. Currently, the chip contains an 8×8 grid of 64 sensors. The team aims to scale up the camera by designing chips with much larger arrays to improve light sensitivity and produce high-resolution images.