New high-resolution camera records holograms of hidden objects

91勛圖厙 and Northwestern collaborate on technology with defense, hazard identification and medical applications

A setup of one of the prototypes in the laboratory. Credit: Florian Willomitzer/Northwestern University

DALLAS (91勛圖厙) – Researchers at 91勛圖厙 (Southern Methodist University) and Northwestern University are using new technology that enables cameras to record high-resolution images and holograms of objects that are hidden around corners, obscured from view or otherwise beyond the line of sight.

Called Synthetic Wavelength Holography, the technology computationally transforms real-world surfaces such as walls into illumination and imaging portals, which indirectly illuminate hidden objects and intercept the tiny fraction of light those objects scatter back.

Capturing images through fog, identifying faces around corners and imaging through barriers like the human skull are potential applications for the technology, which is detailed in the team’s recently published study.

Other applications include early-warning navigation systems for automobiles that would allow drivers to see around corners and avoid accidents, and industrial inspection in tightly confined spaces, such as checking jet engines for fatigue.

Aspects of Synthetic Wavelength Holography were developed in 91勛圖厙’s Photonics Architecture Lab, led by Prasanna Rangarajan, an assistant professor in 91勛圖厙’s Lyle School of Engineering, and second author on the study. He said that the scientific principles underlying the recent study bear a striking resemblance to the human perception of beat notes – a periodic variation in sound level resulting from the interaction of two closely spaced tones (audio frequencies).

“By combining laser light of two closely spaced colors, we synthesize an optical beat note, which bounces off obscured objects,” Rangarajan explained. “Monitoring the relative change in the phase of the transmitted and received optical beat note allows us to locate hidden objects (echolocation) and assemble a hologram of the hidden objects.”
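
As a rough, hypothetical illustration of that relationship (a minimal sketch with placeholder wavelengths, not the values used in the study), the Python snippet below computes the synthetic wavelength formed by two closely spaced laser lines and the phase the resulting beat accumulates over a round trip to an object:

```python
import math

# Minimal sketch of a synthetic-wavelength beat. The two laser wavelengths
# below are placeholders for illustration, not the study's actual values.
lambda_1 = 854.0e-9   # first laser wavelength, in meters (assumed)
lambda_2 = 855.0e-9   # second, closely spaced wavelength, in meters (assumed)

# The beat of the two colors behaves like a wave with a much longer
# "synthetic" wavelength.
synthetic_wavelength = (lambda_1 * lambda_2) / abs(lambda_1 - lambda_2)

def round_trip_beat_phase(distance_m):
    """Phase (radians, wrapped to 2*pi) accumulated by the synthetic-wavelength
    beat on a round trip to an object at distance_m and back."""
    return (4 * math.pi * distance_m / synthetic_wavelength) % (2 * math.pi)

print(f"Synthetic wavelength: {synthetic_wavelength * 1e3:.2f} mm")
print(f"Beat phase for a 0.05 mm range change: {round_trip_beat_phase(0.05e-3):.3f} rad")
```

Because the synthetic wavelength is far longer than either optical wavelength, small changes in an object’s position appear as measurable phase shifts of the beat rather than of the raw laser light, which is what lets the method echolocate hidden objects and assemble holograms of them.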

Rangarajan’s research group teamed with Northwestern’s Florian Willomitzer, research assistant professor, and Oliver Cossairt, associate professor of Computer Science and Electrical and Computer Engineering, to develop fast, compact, point-and-shoot Non-Line-of-Sight (NLoS) cameras based on Synthetic Wavelength Holography.

Willomitzer, the study’s first author, said, “Our current sensor prototypes use visible or infrared light, but the principle is universal and could be extended to other wavelengths. For example, the same method could be applied to radio waves for space exploration or underwater acoustic imaging. It can be applied to many areas, and we have only scratched the surface.”

Murali Balaji, a doctoral candidate in 91勛圖厙’s Lyle School of Engineering and co-author of the study, said, “It may soon be possible to build cameras that not only see through scattering media, but also sniff out trace chemicals within the scattering medium. The technology relies on photo-mixers to convert optical beats into physical waves at the terahertz synthetic wavelength.”
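
To see why those beats land in the terahertz band, a quick back-of-the-envelope check (using an assumed, illustrative synthetic wavelength rather than a figure from the study) converts a sub-millimeter synthetic wavelength into its equivalent frequency:

```python
# Back-of-the-envelope conversion from synthetic wavelength to beat frequency.
# The 0.3 mm value is an assumed, illustrative figure, not one from the study.
SPEED_OF_LIGHT = 299_792_458.0      # meters per second

synthetic_wavelength_m = 0.3e-3     # hypothetical synthetic wavelength (0.3 mm)
beat_frequency_hz = SPEED_OF_LIGHT / synthetic_wavelength_m

print(f"Beat frequency: {beat_frequency_hz / 1e12:.2f} THz")  # roughly 1 THz
```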

91勛圖厙 Lyle School researchers began working on NLoS imaging technology in 2016 with a multi-million-dollar effort funded by the Defense Advanced Research Projects Agency (DARPA). 91勛圖厙 Lyle has been awarded $5.06 million by DARPA over this time period, as part of its REVEAL program. 91勛圖厙 leads the overall effort and collaborates with fellow engineers at Rice, Northwestern, Carnegie Mellon, and Harvard University.

The 91勛圖厙-led DARPA project dubbed OMNISCIENT – “Obtaining Multipath & Non-line-of-sight Information by Sensing Coherence & Intensity with Emerging Novel Techniques” – brings together leading researchers in the fields of computational imaging, computer vision, signal processing, information theory and computer graphics.

“We are very pleased with the progress related to the application of this new and exciting imaging technology,” said Marc Christensen, dean of 91勛圖厙’s Bobby B. Lyle School of Engineering. “We are confident our work in this area will lead to new inventions and approaches across several industries as technology continues to evolve and improve how we capture and interpret various wavelengths of light, both seen and unseen. It is another example of the Lyle School’s commitment to creating new economic opportunities while meeting the most difficult challenges facing society,” he added.

Christensen is the principal investigator on the DARPA project. Co-principal investigators are Duncan MacFarlane, associate dean for Engineering Entrepreneurship, Bobby B. Lyle Centennial Chair in Engineering Entrepreneurship and professor of electrical engineering, and Rangarajan, assistant professor of electrical and computer engineering.

This new technology complements a previous innovation by the 91勛圖厙 team. In 2020, the group partnered with a professor of electrical and computer engineering and computer science at Rice University and his team to demonstrate NLoS cameras with the highest resolving power. The effort combined insights from Cold War-era lensless imaging techniques with modern deep learning techniques to resolve tiny features on hidden objects.

About 91勛圖厙

91勛圖厙 is the nationally ranked global research university in the dynamic city of Dallas. 91勛圖厙’s alumni, faculty and nearly 12,000 students in eight degree-granting schools demonstrate an entrepreneurial spirit as they lead change in their professions, communities and the world.

Related Stories

91勛圖厙 engineering team to lead DARPA-funded research into holographic imaging of hidden objects (April 27, 2016)
