ECE Seminar: CMOS Image Sensors with Multi-Bucket Pixels for Computational Photography
Monday, February 11, 2013 - 1:00pm to 2:00pm
Dr. Gordon Wan, Stanford University
Reducing pixel size to improve spatial resolution has been the main driving force behind image sensor development over the last few decades. As pixel pitch approaches the diffraction limit, however, the benefit of further scaling is diminishing, and alternative directions must be explored to advance image sensor development. One possibility is to add new functionality to the image sensor to enable computational photography. In this talk, I will present new CMOS image sensors with multi-bucket pixels that enable time-interleaved exposure, an alternative imaging approach. When applied to multi-image computational photography, time-interleaved exposure offers a dramatic advantage over capturing and combining a burst of images with different camera settings because it eliminates the need to align the frames after capture. Moreover, all frames share the same hand-shake or object-motion blur, and moving objects appear in the same position in every frame.

The new image sensors incorporate the virtual-phase charge-coupled device (CCD) concept into a standard 4-transistor CMOS imager pixel to minimize area overhead. Two image sensors, with dual- and quad-bucket pixels, have been designed and fabricated. The dual-bucket sensor overcomes the motion artifacts commonly found in flash/no-flash imaging and dual-exposure high-dynamic-range imaging.
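The idea of time-interleaved exposure described above can be illustrated with a minimal simulation. This is a hypothetical sketch, not the speaker's implementation: it assumes a dual-bucket pixel whose incoming charge is steered alternately into two buckets over many short time slices within a single frame, so that both sub-images integrate over the same frame interval (and therefore see the same motion blur and object positions).

```python
import numpy as np

# Hypothetical sketch of dual-bucket time-interleaved exposure.
# Each pixel has two charge buckets; within one frame, photocharge is
# steered alternately into bucket A and bucket B over short time slices,
# rather than capturing two full frames back-to-back.

rng = np.random.default_rng(0)
n_slices = 8                                  # time slices within one frame
scene = rng.uniform(0.2, 1.0, size=(4, 4))    # per-pixel photon flux (arbitrary units)

bucket_a = np.zeros_like(scene)
bucket_b = np.zeros_like(scene)

for t in range(n_slices):
    flux = scene                              # static scene, for simplicity
    if t % 2 == 0:
        bucket_a += flux                      # e.g. "flash on" slices
    else:
        bucket_b += flux                      # e.g. "flash off" slices

# Because both buckets accumulate over the same interleaved frame interval,
# the two resulting images are inherently aligned: no post-capture
# registration is needed, unlike a sequential burst of two exposures.
```

For a static scene the two buckets integrate identical charge; with a moving scene, both buckets would smear identically, which is the property that removes the alignment step in multi-image pipelines such as flash/no-flash or dual-exposure HDR.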