A Fast Wavefront Reconstructor for the Nonlinear Curvature Wavefront Sensor

Date: June 2018

Authors: Codona, J.L., Mateen, M., Hart, M.  

Abstract. The Nonlinear Curvature Wavefront Sensor (nlCWFS), first proposed by Guyon,[1] determines wavefront shape from images of a reference beacon in a number of planes between the pupil and focal plane of a telescope. We describe a new algorithm that rapidly recovers the low-order aberrations accurately enough to allow practical use of the nlCWFS in an adaptive optics (AO) system. The algorithm was inspired by refractive strong scintillation in the interstellar medium,[2] which behaves similarly to near-pupil linear curvature focusing, but over larger scales. The refractive component is extracted from the speckled images by binning, with the lowest-order aberrations additionally estimated from first and second distribution moments. The linearity of the refractive scintillation process allows us to use a reconstructor matrix to compute an estimate of the pupil wavefront. The resulting wavefront estimate is then applied in reverse to a deformable mirror (DM), reducing the nonlinearity to the point that a single-update phase-retrieval algorithm, such as a multi-plane version of Gerchberg-Saxton[3] (GS), can be used to estimate the remaining wavefront error (WFE). An AO simulation of a 1.5 m telescope, a 16x16 actuator DM, and four image planes shows that the scintillation algorithm works, reducing ~800 nm rms WFE to ~40 nm, well below the fitting error (~90 nm) in closed loop. Once corrected to this level, the image planes still contain a great deal of information that can then be used with a single-update wavefront retrieval algorithm. A couple of simple variants of GS are suggested, including one that can be parallelized for each camera and run in parallel with the scintillation algorithm. A Monte Carlo study will be required to determine the best approach.
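The multi-plane GS step mentioned above can be sketched compactly: propagate the current field estimate from measurement plane to measurement plane, and at each plane replace the modulus with the measured one while keeping the propagated phase. The following NumPy sketch is illustrative only, not the authors' implementation; the grid, propagator convention, and plane spacings are assumptions.

```python
import numpy as np

def propagate(field, wavelength, dx, dz):
    """Angular-spectrum (Fresnel) propagation of a complex field over distance dz."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    fx2 = fx[:, None]**2 + fx[None, :]**2
    H = np.exp(-1j * np.pi * wavelength * dz * fx2)   # Fresnel transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)

def multiplane_gs(intensities, distances, wavelength, dx, n_iter=10):
    """Multi-plane Gerchberg-Saxton: cycle through the measured planes,
    enforcing each measured modulus while keeping the propagated phase."""
    field = np.sqrt(intensities[0]).astype(complex)   # start with a flat phase
    z = distances[0]
    for _ in range(n_iter):
        for I_meas, z_next in zip(intensities, distances):
            field = propagate(field, wavelength, dx, z_next - z)
            z = z_next
            field = np.sqrt(I_meas) * np.exp(1j * np.angle(field))
    # return the phase estimate back-propagated to the first plane
    return np.angle(propagate(field, wavelength, dx, distances[0] - z))
```

In an AO loop this estimate of the residual phase would be applied to the DM on top of the scintillation-algorithm correction.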

Image Registration for Daylight Adaptive Optics

Date: January 2018

Authors: Hart, M. 

Abstract. Daytime use of adaptive optics (AO) at large telescopes is hampered by shot noise from the bright sky background. Wave-front sensing may use a sodium laser guide star observed through a magneto-optical filter to suppress the background, but the laser beacon is not sensitive to overall image motion. To estimate that, laser-guided AO systems generally rely on light from the object itself, collected through the full aperture of the telescope. Daylight sets a lower limit to the brightness of an object that may be tracked at rates sufficient to overcome the image jitter. Below that limit, wave-front correction on the basis of the laser alone will yield an image that is approximately diffraction limited but that moves randomly. I describe an iterative registration algorithm that recovers high-resolution long-exposure images in this regime from a rapid series of short exposures with very low signal-to-noise ratio. The technique takes advantage of the fact that in the photon noise limit there is negligible penalty in taking short exposures, and also that once the images are recorded, it is not necessary, as in the case of an AO tracker loop, to estimate the image motion correctly and quickly on every cycle. The algorithm is likely to find application in space situational awareness, where high-resolution daytime imaging of artificial satellites is important.

Image Restoration from Limited Data

Date: September 2017

Authors: Hope, D., Hart, M., Jefferies, S. 

Abstract. Ground-based imagery of satellites is a cornerstone of space situational awareness (SSA). The resolution of this imagery is fundamentally limited by turbulence in the atmosphere. Full resolution can be restored by using advanced multi-frame blind deconvolution (MFBD) algorithms which, applied to sequences of short-exposure images, estimate the object scene and the point spread functions (PSFs) that characterize the turbulence. Because there are always more variables to estimate than measurements, MFBD is an ill-posed problem. Furthermore, in the regime of limited data, for example a satellite with a rapidly changing pose, the problem is also ill-conditioned because of the lack of diversity in the PSFs. These challenges typically lead to poor-quality restorations. The Daylight Object Restoration Algorithm (DORA) overcomes this problem by using additional simultaneous measurements from a wave-front sensor, along with a frozen-flow model of the atmosphere, to achieve high-resolution estimates of space objects from limited data sets. The improvement in image resolution achieved by DORA compared to current state-of-the-art MFBD algorithms is demonstrated using real data.
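The value of the wave-front sensor data can be illustrated without reproducing DORA itself: once per-frame PSFs are available, the ill-posed blind problem reduces to multi-frame deconvolution with known PSFs. A minimal Richardson-Lucy sketch of that reduced problem (a stand-in, not the DORA algorithm):

```python
import numpy as np

def multiframe_rl(images, psfs, n_iter=20):
    """Multi-frame Richardson-Lucy deconvolution with *known* per-frame PSFs.

    A single common object estimate is refined against every frame; the PSFs
    are assumed given (e.g. derived from simultaneous wave-front sensor data).
    """
    obj = np.full_like(images[0], images[0].mean(), dtype=float)
    for _ in range(n_iter):
        update = np.zeros_like(obj)
        for img, psf in zip(images, psfs):
            P = np.fft.rfft2(np.fft.ifftshift(psf))     # PSF kernel centered at origin
            model = np.fft.irfft2(np.fft.rfft2(obj) * P, s=obj.shape)
            ratio = img / np.maximum(model, 1e-12)
            # correlate the ratio with the PSF (conjugate = flipped kernel)
            update += np.fft.irfft2(np.fft.rfft2(ratio) * np.conj(P), s=obj.shape)
        obj = obj * update / len(images)
    return obj
```

Averaging the multiplicative updates over frames is one simple way to pool the frame diversity into a single object estimate.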

Remote Acoustic Imaging of Geosynchronous Satellites

Date: September 2017

Authors: Watson, Z., Hart, M.

Abstract. Identification and characterization of orbiting objects that are not spatially resolved are challenging problems for traditional remote sensing methods. Hyper-temporal imaging, enabled by fast, low-noise electro-optical detectors, is a new sensing modality that may allow the direct detection of acoustic resonances on satellites, enabling a new regime of signature and state detection. Detectable signatures may be caused by the oscillations of solar panels, high-gain antennae, or other on-board subsystems driven by thermal gradients, fluctuations in solar radiation pressure, worn reaction wheels, or orbit maneuvers. Herein we present the first hyper-temporal observations of geosynchronous satellites. Data were collected at the Kuiper 1.54-meter telescope in Arizona using an experimental dual-channel imaging instrument that simultaneously measures light in two orthogonally polarized beams at sampling rates extending up to 1 kHz. In these observations, we see evidence of acoustic resonances in the polarization state of satellites. The technique is expected to support object identification and characterization of on-board components and to act as a discriminant between active satellites, debris, and passive bodies.
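A resonance search on such a kHz-sampled time series typically reduces to locating narrow peaks in a windowed power spectrum. The sketch below is a generic illustration of that step, not the instrument's pipeline:

```python
import numpy as np

def resonance_spectrum(flux, dt):
    """Windowed power spectrum of a photometric (or polarimetric) time series.

    Narrow peaks standing well above the noise floor flag candidate
    structural resonances.
    """
    x = np.asarray(flux, dtype=float)
    x = x - x.mean()                  # remove the mean (DC) level
    w = np.hanning(x.size)            # taper to limit spectral leakage
    power = np.abs(np.fft.rfft(x * w))**2
    freqs = np.fft.rfftfreq(x.size, d=dt)
    return freqs, power
```

With 1 kHz sampling the spectrum extends to the 500 Hz Nyquist frequency, covering the low-order structural modes of interest.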

Date: July 2017

Authors: Hart, M., Hope, D.

Abstract. Present space-based optical imaging sensors are expensive. Launch costs are dictated by weight and size, and system design must take into account the low fault tolerance of a system that cannot be readily accessed once deployed. We describe the design and first prototype of the space-based infrared imaging interferometer (SIRII) that aims to mitigate several aspects of the cost challenge. SIRII is a six-element Fizeau interferometer intended to operate in the short-wave and midwave IR spectral regions over a 6 × 6 mrad field of view. The volume is a factor of three smaller than that of a filled-aperture telescope with equivalent resolving power. The structure and primary optics are fabricated from lightweight space-qualified carbon fiber reinforced polymer; they are easy to replicate and inexpensive. The design is intended to permit one-time alignment during assembly, with no need for further adjustment once on orbit. A three-element prototype of the SIRII imager has been constructed with a unit telescope primary mirror diameter of 165 mm and edge-to-edge baseline of 540 mm. The optics, structure, and interferometric signal processing principles draw on experience developed in ground-based astronomical applications designed to yield the highest sensitivity and resolution with cost-effective optical solutions. The initial motivation for the development of SIRII was the long-term collection of technical intelligence from geosynchronous orbit, but the scalable nature of the design will likely make it suitable for a range of IR imaging scenarios.

Daylight Operation of a Sodium Laser Guide Star for Wavefront Sensing

Date: October 2016

Authors: Hart, M., Jefferies, S. M., Murphy, N.

Abstract: We report contrast measurements of a sodium resonance guide star against the daylight sky when observed through a tuned magneto-optical filter (MOF). The guide star was created by projection of a laser beam at 589.16 nm into the mesospheric sodium layer and the observations were made with a collocated 1.5-m telescope. While MOFs are used with sodium light detecting and ranging systems during the day to improve the signal-to-noise ratio of the measurements, they have not so far been employed with laser guide stars to drive adaptive optics (AO) systems to correct atmospherically induced image blur. We interpret our results in terms of the performance of AO systems for astronomy, with particular emphasis on thermal infrared observations at the next generation of extremely large telescopes now being built.

A Comprehensive Approach to High-Resolution Daylight Imaging for SSA

Date: September 2016

Authors: Hart, M., Jefferies, S., Hope, D., Nagy, J.

Abstract: High resolution daytime imaging of resident space objects (RSO) from the ground is presently severely challenging. At visible wavelengths, where diffraction-limited resolution is the highest before the atmosphere becomes opaque in the UV, shot noise from the bright background degrades the information that may be recovered from RSO imagery. Total exposure times must be limited in order to avoid motion blur induced either by the object’s intrinsic rotation or simply by its orbital motion over the site. Fundamentally, then, one cannot collect enough light from the object to achieve adequate signal-to-noise ratio (SNR) in the presence of very high noise before the apparent shape of the object has changed. To overcome this limitation, we propose in this paper a suite of techniques which we believe will collectively enable high-resolution imaging during daylight. The approach, which has yet to be fully implemented, relies on a sequence of short-exposure images from a high-cadence camera together with simultaneous wave-front sensor (WFS) measurements acquired from a filtered sodium laser guide star. We then directly estimate the three-dimensional shape of the RSO using a formalism similar to the concept of deconvolution from wave-front sensing (DWFS). In this way, provided that the intrinsic shape of the RSO does not significantly change during the course of the observations, we can combine data from quite different pose angles in order to achieve a high resolution result with adequate SNR. By adopting this approach, we expect an improvement of 3-4 stellar magnitudes in the faintest satellites that may be characterized independent of the telescope and observing waveband. Furthermore, a model derived from observations by one sensor may be used as the basis for the restoration of data sets from widely disparate telescopes and sensor modalities; data fusion in this sense is a natural feature of the approach.

Atmospheric Tomography for Artificial Satellite Observations with a Single Guide Star

Date: August 2016

Authors: Hart, M., Jefferies, S. M., Hope, D. A.

Abstract. Estimation of wavefront errors in three dimensions is required to mitigate isoplanatic errors when using adaptive optics or numerical restoration algorithms to recover high-resolution images from blurred data taken through atmospheric turbulence. Present techniques rely on multiple beacons, either natural stars or laser guide stars, to probe the atmospheric aberration along different lines of sight, followed by tomographic projection of the measurements. In this Letter, we show that a three-dimensional estimate of the wavefront aberration can be recovered from measurements by a single guide star in the case where the aberration is stratified, provided that the telescope tracks across the sky with nonuniform angular velocity. This is generally the case for observations of artificial Earth-orbiting satellites, and the new method is likely to find application in ground-based telescopes used for space situational awareness.
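In a discrete setting, the stratified-atmosphere idea reduces to a linear system: each measurement samples frozen layers at shifts set by the (nonuniform) tracking geometry, and the layers follow from least squares. The one-dimensional two-layer toy below is an illustrative assumption, not the Letter's formulation; note the inter-layer piston is degenerate, so only zero-mean layers are recoverable.

```python
import numpy as np

def build_shift_matrix(n, shift):
    """Matrix M with M @ v == np.roll(v, shift) for length-n vectors."""
    return np.roll(np.eye(n), -shift, axis=1)

def tomographic_lstsq(measurements, layer_shifts):
    """Least-squares recovery of stratified layers from single-beacon data.

    Model: measurements[t] = sum over layers l of roll(layer_l, layer_shifts[l][t]).
    np.linalg.lstsq returns the minimum-norm solution, which resolves the
    inter-layer piston degeneracy in favor of zero-mean layers.
    """
    T, n = measurements.shape
    L = len(layer_shifts)
    A = np.zeros((T * n, L * n))
    # each block row maps the stacked layer vectors to one measurement epoch
    for t in range(T):
        for l in range(L):
            A[t*n:(t+1)*n, l*n:(l+1)*n] = build_shift_matrix(n, layer_shifts[l][t])
    x, *_ = np.linalg.lstsq(A, measurements.ravel(), rcond=None)
    return x.reshape(L, n)
```

The key requirement mirrors the Letter's condition on nonuniform tracking: the relative shift between layers must change from epoch to epoch, otherwise the per-layer blocks become indistinguishable and the system loses rank.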

High-Resolution Speckle Imaging through Strong Atmospheric Turbulence

Date: May 2016

Authors: Hope, D., Jefferies, S., Hart, M., Nagy, J.

Abstract: We demonstrate that high-resolution imaging through strong atmospheric turbulence can be achieved by acquiring data with a system that captures short exposure (“speckle”) images using a range of aperture sizes and then using a bootstrap multi-frame blind deconvolution restoration process that starts with the smallest aperture data. Our results suggest a potential paradigm shift in how we image through atmospheric turbulence. No longer should image acquisition and post processing be treated as two independent processes: they should be considered as intimately related.
