Autonomous technology enables unparalleled exploration of the Moon, asteroids, and beyond


OSIRIS-REx at the sample site Nightingale

Overhead view of OSIRIS-REx at sample site Nightingale, with a parking lot shown for scale. Credit: NASA / Goddard / CI Lab / University of Arizona

During the Apollo 11 landing in 1969, the astronauts looked out the window for distinctive features they recognized from lunar maps, steering the lander manually to avoid a catastrophic touchdown in a boulder-strewn area. Now, 50 years later, that process can be automated. Distinctive features, such as known craters, boulders, or other unique surface characteristics, reveal surface hazards so they can be avoided during landing.

NASA scientists and engineers are maturing technology for navigating and landing on planetary bodies by analyzing images during descent – a process called terrain relative navigation (TRN). This optical navigation technology is aboard NASA's newest Mars rover, Perseverance, which will test TRN when it lands on the Red Planet in 2021, paving the way for future crewed missions to the Moon and beyond. TRN was also used during the recent Touch-and-Go (TAG) sample collection event of NASA's Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) mission at asteroid Bennu, which will help scientists better understand the characteristics and movements of asteroids.

After arriving at Bennu in 2018, the OSIRIS-REx spacecraft mapped and studied the asteroid's surface, including its topography and lighting conditions, in preparation for TAG. The Nightingale site was selected from four candidate sites based on its abundance of sampleable material and its accessibility for the spacecraft.


On October 20, the OSIRIS-REx spacecraft briefly touched down on the surface of asteroid Bennu and collected a sample. Credit: NASA’s Goddard Space Flight Center / Scientific Visualization Studio

Engineers regularly used ground-based optical navigation to maneuver the OSIRIS-REx spacecraft near Bennu, comparing new images taken by the spacecraft with three-dimensional topographic maps. During TAG, OSIRIS-REx performed a similar optical navigation process autonomously and in real time, using a TRN system called Natural Feature Tracking. Images of the sampling site captured during the TAG descent were compared with onboard topographic maps, and the spacecraft's trajectory was adjusted to stay on target for the landing site. In the future, similar optical navigation could also reduce the risks of landing in other unfamiliar environments across our solar system.
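The core idea of comparing descent images against onboard maps can be sketched in a few lines. This is a deliberately simplified illustration, not OSIRIS-REx flight software: the landmark coordinates, camera parameters, and the `track_features` helper are all hypothetical, and a real system would solve a full pose estimate rather than a mean pixel offset.

```python
import numpy as np

def track_features(predicted, observed, altitude_m, focal_px):
    """Estimate a lateral position correction from landmark offsets.

    predicted: (N, 2) pixel positions of known landmarks, rendered from the
               onboard map at the spacecraft's *expected* pose.
    observed:  (N, 2) pixel positions of the same landmarks as actually
               matched in the descent image.
    altitude_m, focal_px: convert pixel error to meters using the pinhole
               ground-sample distance (altitude / focal length in pixels).
    Returns the estimated (east, north) offset of the spacecraft in meters.
    """
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    # Average over landmarks to suppress per-feature matching noise.
    pixel_offset = (observed - predicted).mean(axis=0)
    meters_per_pixel = altitude_m / focal_px
    return pixel_offset * meters_per_pixel

# Example: three landmarks all appear shifted about 10 px in the image,
# so the spacecraft is laterally displaced and needs a correction.
predicted = [(100, 200), (350, 180), (220, 400)]
observed = [(110, 200), (360, 181), (230, 399)]
offset = track_features(predicted, observed, altitude_m=500.0, focal_px=1000.0)
print(offset)  # ~[5.0, 0.0] meters
```

The measured offset would then feed a guidance update that retargets the descent trajectory between image frames.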

NASA’s Lunar Reconnaissance Orbiter (LRO) has been imaging the Moon from orbit since 2009. LRO project scientist Noah Petro said one challenge of preparing for a landing mission is the lack of high-resolution Narrow Angle Camera images in every lighting condition for any given landing site. Such images would be useful for autonomous landing systems, which need illumination data for a specific time of the lunar day. However, NASA has been collecting high-resolution topographic data with LRO’s Lunar Orbiter Laser Altimeter (LOLA).

“LOLA data and other topographic data let us shape the Moon and illuminate it for any time in the future or the past, and with that we can predict what the surface will look like,” Petro said.

Artemis astronaut on the moon

Artist’s concept of an Artemis astronaut stepping onto the Moon. Credit: NASA

Using LOLA data, sun angles are overlaid on a three-dimensional elevation map to model the shadows cast by surface features at a specific date and time. NASA scientists know the position and orientation of the Moon and of LRO in space, and have taken billions of lunar measurements. Over time, these measurements are combined into a grid map of the lunar surface. Images taken during descent can be compared against these master maps, giving landers flying as part of the Artemis program another tool for safely navigating the lunar terrain.
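Predicting illumination from an elevation grid and a sun angle is essentially a hillshade computation. The following is a minimal sketch of that idea, not LOLA's actual processing pipeline; the grid, angles, and the `hillshade` function are illustrative assumptions.

```python
import numpy as np

def hillshade(elevation, sun_azimuth_deg, sun_elevation_deg, cell_size=1.0):
    """Predict relative illumination of a terrain grid for a given sun angle.

    elevation: 2-D array of heights (e.g. derived from laser altimetry).
    Returns values in [0, 1]: 0 = facing away from the sun, 1 = facing it.
    """
    az = np.radians(sun_azimuth_deg)
    el = np.radians(sun_elevation_deg)
    # Unit vector pointing toward the sun (x east, y north, z up).
    sun = np.array([np.cos(el) * np.sin(az), np.cos(el) * np.cos(az), np.sin(el)])
    # Surface normals from the elevation gradient.
    dz_dy, dz_dx = np.gradient(elevation, cell_size)
    normals = np.dstack([-dz_dx, -dz_dy, np.ones_like(elevation)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    # Lambertian shading: cosine of the angle between normal and sun.
    return np.clip(normals @ sun, 0.0, 1.0)

flat = np.zeros((5, 5))
print(hillshade(flat, 180.0, 90.0))       # all 1.0: flat ground, sun overhead
print(hillshade(flat, 180.0, 1.5).max())  # near 0: grazing sun at the pole
```

A real pipeline would also trace cast shadows from nearby peaks, which simple surface shading does not capture, but the principle of turning topography plus a sun position into a predicted image is the same.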

The lunar surface is like a fingerprint: no two landscapes are identical, Petro said. Topography can therefore be used to determine a spacecraft's exact position over the Moon, much as a forensic scientist compares a print from a crime scene against known records to match an unknown person to a known one – here, matching the spacecraft's view to its location along its flight path.
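The fingerprint analogy maps directly onto template matching: slide the camera's view across the reference map and keep the best correlation. A toy version, with a random array standing in for a real lunar grid map and an exhaustive search that real systems would replace with something far faster:

```python
import numpy as np

def locate(image_patch, base_map):
    """Find where a camera patch best matches a reference map, using
    exhaustive normalized cross-correlation (the 'fingerprint' match)."""
    ph, pw = image_patch.shape
    patch = image_patch - image_patch.mean()
    best, best_rc = -np.inf, (0, 0)
    for r in range(base_map.shape[0] - ph + 1):
        for c in range(base_map.shape[1] - pw + 1):
            window = base_map[r:r + ph, c:c + pw]
            w = window - window.mean()
            score = (patch * w).sum() / (
                np.linalg.norm(patch) * np.linalg.norm(w) + 1e-12)
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc

rng = np.random.default_rng(0)
terrain = rng.random((64, 64))   # stands in for a lunar grid map
view = terrain[20:36, 30:46]     # the lander "sees" this 16x16 region
print(locate(view, terrain))     # (20, 30)
```

Normalized correlation makes the match insensitive to overall brightness, which matters when the live image and the reference map were captured under different exposures.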

After landing, TRN can also help astronauts drive crewed rovers across the surface. As part of NASA's lunar surface sustainability concept, the agency is considering a habitable mobility platform, similar to an RV, as well as a lunar terrain vehicle (LTV) to help crews travel across the lunar surface.

Astronauts can usually drive short distances of a few miles in an unpressurized rover like the LTV, since they have landmarks for guidance. Traveling longer distances is more challenging, however, and at the Moon's south pole the Sun always sits low on the horizon, adding visibility problems. Driving toward the south pole is like driving straight east first thing in the morning: the light can be blinding, and landmarks can look distorted. With TRN, astronauts will be better able to navigate the south pole despite the lighting conditions, because the computer can better detect hazards.

The main difference between using TRN to land a spacecraft and using it to navigate a crewed rover is speed. For landings, images must be captured and processed quickly, with shorter intervals between them. Between images, onboard processors keep the spacecraft on track for a safe landing.

“When you're moving slowly – like rovers on the surface, or OSIRIS-REx maneuvering around the asteroid – there is time to take an image and process it on the ground,” said Carolina Restrepo, an aerospace engineer at NASA's Goddard Space Flight Center in Maryland who is working to improve current data products for the Moon. “When you're moving fast – during descent and landing – there's no time for that. You need to take images and process them on the spacecraft as quickly as possible, and it all needs to be autonomous.”

Autonomous TRN solutions can address the needs of both human and robotic explorers as they navigate to unique locations across our solar system, such as the optical navigation challenges OSIRIS-REx faced during TAG on Bennu's rocky surface. Thanks to missions like LRO, Artemis astronauts will be able to use TRN algorithms and lunar topography data to supplement surface imagery for landing and navigating safely at the Moon's south pole.

“What we're trying to do is make sure we have high-resolution maps for key locations – future trajectories and landing sites – by combining existing data types to anticipate the needs of future terrain relative navigation systems,” Restrepo said. “In other words, we need high-resolution maps for both scientific purposes and navigation.”