Document Type

Conference Proceeding

Publication Title

Proceedings of SPIE - the International Society for Optical Engineering


Abstract

The advancement of neural network methods and technologies is finding applications in many fields and disciplines of interest to the defense, intelligence, and homeland security communities. Rapidly reconfigurable sensors for real-time or near-real-time signal or image processing can serve multiple functions, including image compression, target tracking, image fusion, edge detection, thresholding, pattern recognition, and atmospheric turbulence compensation. A neural-network-based smart sensor is described that can accomplish these tasks individually or in combination, in real time or near real time. As a computationally intensive example, the case of optical imaging through volume turbulence is addressed. For imaging systems operating in the visible and near-infrared part of the electromagnetic spectrum, the atmosphere is often the dominant factor limiting the imaging system's resolution and image quality. The neural network approach described in this paper is shown to be a viable means of implementing turbulence compensation techniques for near-field and distributed turbulence scenarios. Representative high-speed neural network hardware is presented. Existing 2-D cellular neural network (CNN) hardware is capable of 3 trillion operations per second, with peta-operations per second possible using current 3-D manufacturing processes; this hardware can be used for high-speed applications that require fast convolutions and deconvolutions. Existing 3-D artificial neural network technology is capable of peta-operations per second and can be used for fast array-processing operations. Methods for optical imaging through distributed turbulence are discussed, simulation results are presented, and computational and performance assessments are provided.
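The abstract itself gives no implementation detail, but the deconvolution step it mentions can be illustrated with a minimal sketch of Wiener deconvolution, one standard linear technique for compensating a known turbulence blur. Everything here (the Gaussian PSF, the noise-to-signal ratio, the `wiener_deconvolve` helper) is a hypothetical illustration, not code from the paper:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=1e-2):
    """Estimate the original image from a blurred observation, given the
    point-spread function (PSF) and an assumed noise-to-signal ratio `nsr`.
    This is a generic Wiener filter, not the paper's neural network method."""
    H = np.fft.fft2(psf, s=blurred.shape)   # transfer function of the blur
    G = np.fft.fft2(blurred)                # spectrum of the observation
    # Wiener filter: conj(H) / (|H|^2 + NSR), applied in the Fourier domain
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(W * G))

# Toy example: blur a synthetic scene with a Gaussian PSF, then deconvolve.
scene = np.zeros((64, 64))
scene[28:36, 28:36] = 1.0                   # bright square "target"
y, x = np.mgrid[:64, :64]
psf = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 3.0 ** 2))
psf /= psf.sum()
psf0 = np.fft.ifftshift(psf)                # move PSF center to the origin
blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(psf0)))
restored = wiener_deconvolve(blurred, psf0, nsr=1e-3)
```

In the hardware context of the abstract, the point is that both the forward blur and the restoration reduce to FFTs and elementwise operations, exactly the kind of fast convolution/deconvolution workload the CNN hardware is said to accelerate.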



Publication Date