New software lets Curiosity select its own laser targets

NASA’s Curiosity Mars rover autonomously selects some targets for the laser and telescopic camera of its ChemCam instrument. For example, on-board software analyzed the NavCam image on the left, chose the target indicated with a yellow dot, and pointed ChemCam to take laser shots and the image at right. Credit: NASA / JPL-Caltech
NASA’s Curiosity rover can choose targets for examination via laser spectrometer on its own, thanks to new software developed at the space agency’s Jet Propulsion Laboratory in Pasadena, California. The development marks a major milestone: For the first time, a robotic explorer on a planetary mission can autonomously select target rocks for additional study.
Known as AEGIS, or Autonomous Exploration for Gathering Increased Science, the software analyzes images captured by the rover’s stereo Navigation Camera (NavCam) at every location where Curiosity ends a drive and selects specific targets within them for analysis by the rover’s Chemistry and Camera (ChemCam) instrument.

Since Curiosity touched down on the surface of Mars in 2012, it has undertaken an extensive scientific campaign. In addition to the many targets scientists on Earth choose for study, the rover can now select its own targets for the laser spectrometer. Image Credit: NASA / JPL
Rocks are chosen according to criteria set by scientists, such as size and brightness. These criteria are flexible and can be adjusted to suit the rover’s surroundings or particular science goals.
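NASA has not published the exact selection logic, but conceptually this works like a configurable filter-and-rank step over candidate rocks detected in a NavCam image. The sketch below is a minimal illustration only; the class, function, thresholds, and weights are all hypothetical stand-ins for the adjustable criteria described above.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A rock candidate extracted from a NavCam image (hypothetical fields)."""
    size_px: float      # apparent size of the rock in pixels
    brightness: float   # mean reflectance on a 0.0-1.0 scale

def rank_targets(candidates, min_size=20.0, min_brightness=0.3,
                 w_size=0.6, w_brightness=0.4):
    """Filter candidates by adjustable thresholds, then rank by a weighted score.

    The thresholds and weights stand in for the flexible criteria scientists
    can tune for a given environment or science goal; all values are illustrative.
    """
    eligible = [c for c in candidates
                if c.size_px >= min_size and c.brightness >= min_brightness]
    return sorted(eligible,
                  key=lambda c: w_size * c.size_px + w_brightness * c.brightness,
                  reverse=True)

# Example: three detections; the small, bright one is rejected by the size threshold.
rocks = [Candidate(35, 0.5), Candidate(15, 0.9), Candidate(50, 0.35)]
ranked = rank_targets(rocks)
print(ranked[0] if ranked else "no suitable target")
```

Tuning the thresholds and weights, rather than the code itself, is what lets the same selection machinery serve different terrains and science campaigns.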
ChemCam uses a laser and a telescopic camera to analyze selected targets. It fires the laser at a target while its spectrometers record, through the telescope, the light emitted by the resulting flash of plasma.
The spectrum of that glowing plasma lets scientists determine the target’s chemical composition. Images taken by the instrument’s telescope have the highest resolution of any the rover can capture.
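ChemCam’s actual spectral pipeline is far more sophisticated, but the core idea is matching emission-line wavelengths in the plasma spectrum against known elemental lines. The toy sketch below is only an illustration: the reference wavelengths are approximate textbook values, and the matching tolerance and function name are made up.

```python
# Toy illustration of emission-line matching; wavelengths are approximate
# laboratory values and the matching tolerance is arbitrary.
REFERENCE_LINES_NM = {
    "Fe": [404.6, 438.4],
    "Mg": [285.2, 518.4],
    "Si": [288.2, 390.6],
}

def identify_elements(peak_wavelengths_nm, tolerance_nm=0.5):
    """Return elements whose reference lines match observed spectral peaks."""
    found = set()
    for element, lines in REFERENCE_LINES_NM.items():
        for line in lines:
            if any(abs(peak - line) <= tolerance_nm for peak in peak_wavelengths_nm):
                found.add(element)
                break
    return sorted(found)

# Example: peaks detected in a (hypothetical) plasma spectrum.
print(identify_elements([285.3, 390.5, 766.5]))  # -> ['Mg', 'Si']
```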
In some cases, the software works in a second mode: it uses images from ChemCam’s own Remote Micro-Imager, rather than NavCam, to fine-tune the laser pointing, including for targets pre-selected by scientists.
“Due to their small size and other pointing challenges, hitting these targets accurately with the laser has often required the rover to stay in place while ground operators fine tune pointing parameters,” noted Tara Estlin, a robotics engineer and AEGIS development leader. “AEGIS enables these targets to be hit on the first try by automatically identifying them and calculating a pointing that will center a ChemCam measurement on the target.”
ChemCam can analyze rock and soil composition from as far away as 23 feet (7 meters).
While the rover is now selecting multiple targets of its own every week, the majority of samples ChemCam analyzes are selected by mission scientists from images Curiosity captures and sends back to Earth.
Since the rover arrived in 2012, ChemCam has analyzed more than 1,400 targets. It and other Curiosity instruments are currently studying the geological layers on the lower part of Mount Sharp in an effort to determine how the planet’s environment changed from one that could have hosted microbial life billions of years ago to the dry, barren conditions it has today.
Curiosity is not the first rover to use AEGIS. The software has also run on the Opportunity rover, which has been on the Red Planet since 2004. On that rover, the software analyzes images taken by the wide-angle camera to choose rocks to photograph at close range with a narrower-angle camera.
In 2011, NASA’s work on AEGIS earned the agency a Software of the Year award.
Curiosity’s ability to act autonomously is especially beneficial when it is difficult to bring all members of the mission team together in one location, or when the orbital positions of Earth and Mars lengthen the round-trip time for messages between the two planets, Estlin said.
Laurel Kornfeld
Laurel Kornfeld is an amateur astronomer and freelance writer from Highland Park, NJ, who enjoys writing about astronomy and planetary science. She studied journalism at Douglass College, Rutgers University, and earned a Graduate Certificate of Science from Swinburne University’s Astronomy Online program. Her writings have been published online in The Atlantic, Astronomy magazine’s guest blog section, the UK Space Conference, the 2009 IAU General Assembly newspaper, The Space Reporter, and newsletters of various astronomy clubs. She is a member of the Cranford, NJ-based Amateur Astronomers, Inc. Especially interested in the outer solar system, Laurel gave a brief presentation at the 2008 Great Planet Debate held at the Johns Hopkins University Applied Physics Lab in Laurel, MD.