Using the Stars for Altitude-Azimuth Calibration of HPWREN Cameras

20 September 2024


by Robert Quimby -- Director, Mount Laguna Observatory
Professor of Astronomy, San Diego State University

At night, images from the HPWREN cameras contain dozens to hundreds of stars. As the Earth spins, the x, y positions of the stars change predictably. Using basic astronomy I can calculate the altitude (angle above the horizon) and azimuth (angle east of the north point on the horizon; e.g., due east is azimuth 90 deg., due west is azimuth 270 deg.) of each of these stars at the time each image was taken. I can then create a mapping between the position of a star in image coordinates (x, y) and its sky coordinates (azimuth, altitude). By combining many images taken at different times, this mapping between image and sky coordinates can be made very accurate.
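The spherical-trigonometry core of that calculation can be sketched in a few lines. A full treatment also needs to convert a star's catalogued right ascension to an hour angle via the local sidereal time (libraries such as astropy handle all of this); the minimal sketch below assumes the hour angle is already known, and uses the same azimuth convention as above (east of north, so due east is 90 deg.):

```python
import math

def altaz(ha_deg, dec_deg, lat_deg):
    """Convert a star's hour angle and declination to altitude and
    azimuth for an observer at the given latitude.  Azimuth is measured
    east of north: due east is 90 deg, due west is 270 deg."""
    ha, dec, lat = (math.radians(v) for v in (ha_deg, dec_deg, lat_deg))
    # Altitude from the standard spherical-trig relation
    sin_alt = (math.sin(dec) * math.sin(lat)
               + math.cos(dec) * math.cos(lat) * math.cos(ha))
    alt = math.degrees(math.asin(sin_alt))
    # atan2 form avoids quadrant ambiguity in the azimuth
    az = math.degrees(math.atan2(
        -math.cos(dec) * math.sin(ha),
        math.sin(dec) * math.cos(lat)
        - math.cos(dec) * math.sin(lat) * math.cos(ha)))
    return alt, az % 360.0

# Sanity check: a star crossing the meridian (hour angle 0) at a
# declination equal to the observer's latitude is at the zenith.
```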

The image above provides an example of this process using HPWREN's east-facing monochrome camera at the Mount Laguna Observatory. The red squares mark the positions of stars with known sky positions. Not all stars are used in this image, but by including hundreds of images, the mapping between image and sky coordinates above the horizon can be well constrained. The typical accuracy here is about 0.5 pixels. That is, above the horizon I can predict the locations of known stars to well within one pixel, or, going the other way, I can determine the sky coordinates of a star to within a few hundredths of a degree. I have no points below the horizon to constrain the mapping, so the accuracy there is likely significantly worse.
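To illustrate how such a mapping can be constrained by matched star positions, here is a minimal pure-Python sketch that fits one sky coordinate (say, azimuth) as an affine function of image position by least squares. This is only the idea in miniature: a real camera model needs higher-order distortion terms, and the function names here are hypothetical, not the actual pipeline:

```python
def fit_affine(pts, target):
    """Least-squares fit of target ~= a*x + b*y + c over matched points.
    pts is a list of (x, y) image coordinates; target holds the
    corresponding sky coordinate (e.g. azimuth) for each point.
    Solves the 3x3 normal equations by Gaussian elimination."""
    # Accumulate the normal equations M * [a, b, c]^T = v
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for (x, y), t in zip(pts, target):
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                M[i][j] += row[i] * row[j]
            v[i] += row[i] * t
    # Gaussian elimination with partial pivoting
    for col in range(3):
        p = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        v[col], v[p] = v[p], v[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 3):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
    # Back-substitution
    coef = [0.0] * 3
    for r in (2, 1, 0):
        coef[r] = (v[r] - sum(M[r][c] * coef[c]
                              for c in range(r + 1, 3))) / M[r][r]
    return coef  # (a, b, c)
```

With hundreds of images, the same least-squares machinery simply gets many more matched points, which is what drives the residuals down to the half-pixel level quoted above.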

In addition to determining the precise coordinates of objects in the field, I can resample HPWREN images and display them with different projections. For example, I can show the north and east views of MLO together on the same image (see above). In this case there is some noticeable blurring of the trees where the two images overlap (around azimuth 45 deg.). This may be caused by the inaccurate alignment below the horizon, or possibly by the parallax effect, where nearby objects appear in slightly different positions relative to distant objects when viewed by different cameras (the same thing happens when you hold your thumb in front of your face and close one eye and then the other). This is not, however, a problem for distant celestial objects in the sky. Thus I can reproject images to simulate a single camera panning along the horizon, as I demonstrate in the video below:
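The resampling step works by inverting the calibration: loop over a regular output grid in (azimuth, altitude), ask the fitted mapping which source pixel each sky position corresponds to, and copy that pixel's value. The sketch below assumes a hypothetical sky_to_pixel function standing in for the fitted mapping, with images as plain lists of rows and nearest-neighbor sampling (a real pipeline would interpolate):

```python
def reproject(src, sky_to_pixel, az_range, alt_range, out_w, out_h):
    """Resample a source image (list of pixel rows) onto a regular
    (azimuth, altitude) grid.  sky_to_pixel maps sky coordinates back
    to source-image (x, y); output pixels whose sky position falls
    outside the source frame are left at 0."""
    az0, az1 = az_range
    alt0, alt1 = alt_range
    src_h, src_w = len(src), len(src[0])
    out = [[0] * out_w for _ in range(out_h)]
    for j in range(out_h):
        # Row 0 of the output is the top of the grid (highest altitude)
        alt = alt1 - (alt1 - alt0) * j / (out_h - 1)
        for i in range(out_w):
            az = az0 + (az1 - az0) * i / (out_w - 1)
            x, y = sky_to_pixel(az, alt)
            xi, yi = round(x), round(y)
            if 0 <= xi < src_w and 0 <= yi < src_h:
                out[j][i] = src[yi][xi]  # nearest-neighbor sample
    return out
```

Stitching two cameras together is the same loop with two source images: wherever the azimuth grid crosses from one camera's field into the other's, sky_to_pixel switches to the other camera's fitted mapping, which is exactly where the parallax blurring described above appears for nearby objects.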