NASA has collected a literal universe of data over its six decades of existence: petabytes upon petabytes that scientists can mine.
Experts at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, described examples of NASA’s work in artificial intelligence and machine learning for National Press Foundation fellows. Their goal is to work with other government agencies and academia on three core areas: infrastructure development (hardware and software), algorithm development and big data analytics.
A few examples of the ongoing work at Goddard:
• Raging wildfires in the West prompted computer engineer James MacKinnon to compile 20 years of data and train a neural network capable of detecting wildfires from space with 99 percent accuracy. “If there’s a fire, we’re going to see it,” he said; that will give firefighters an important jump on the location and strength of fires.
• NASA is building robots that will be able to service the thousands of satellites that now become space junk after their fuel source runs dry. Brian Roberts, robotic technologist, described how robots can be trained to act almost like roadside service for cars. With some ground assistance from humans, robots will be able to refuel and repair satellites, extending their usefulness. “No one’s ever done this before,” Roberts said. Launch is scheduled for 2022.
• NASA has catalogued the galaxy and is using virtual reality to explore 4 million stars. By identifying small clusters of stars moving in the same direction, researchers can figure out when and where they were born. “We let the algorithms grind away,” said Thomas Grubb, augmented reality and virtual reality product development lead, pointing toward the discovery of exoplanets. Closer to home, NASA has created 3-D models of lava flows to aid volcanologists.
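The wildfire-detection idea in the first example, classifying satellite pixels as fire or no-fire from their thermal signature, can be sketched in miniature. This is a toy illustration, not MacKinnon’s actual neural network: it uses synthetic data and two hypothetical features (mid-infrared brightness temperature and its deviation from the local background) with a simple logistic-regression classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data (hypothetical, not NASA's): fire pixels run
# hot in the mid-infrared and stand out from their background.
n = 500
fire = rng.normal(loc=[360.0, 25.0], scale=5.0, size=(n, 2))   # Kelvin
clear = rng.normal(loc=[300.0, 2.0], scale=5.0, size=(n, 2))
X = np.vstack([fire, clear])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Standardize features, then fit a logistic-regression classifier
# with plain gradient descent.
mu, sigma = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sigma
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))
    w -= 0.5 * (Xs.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

def predict(pixels):
    """Return True for pixels classified as fire."""
    z = ((pixels - mu) / sigma) @ w + b
    return z > 0

# Classify one hot pixel and one cool pixel.
print(predict(np.array([[365.0, 30.0], [298.0, 1.0]])))
```

A real system would train a deep network on decades of multispectral imagery rather than two hand-picked features, but the pipeline shape (label historical pixels, fit a classifier, score new imagery) is the same.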
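The star-cluster idea in the last example rests on a simple fact: stars born together drift apart slowly, so they still share nearly the same velocity vector. A hedged sketch of one way to exploit that, grouping stars whose velocities agree within a tolerance (a basic friends-of-friends linkage, run here on synthetic data rather than a NASA catalog):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical co-moving clusters plus scattered field stars,
# represented by 2-D velocity vectors in km/s.
cluster_a = rng.normal([10.0, -5.0], 0.5, size=(30, 2))
cluster_b = rng.normal([-20.0, 15.0], 0.5, size=(30, 2))
field = rng.uniform(-60, 60, size=(40, 2))
velocities = np.vstack([cluster_a, cluster_b, field])

def comoving_groups(v, tol=2.0, min_size=5):
    """Friends-of-friends grouping: link stars whose velocity vectors
    differ by less than `tol`; keep groups with >= min_size members."""
    n = len(v)
    labels = -np.ones(n, dtype=int)   # -1 = unvisited, -2 = field star
    current = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        # Flood-fill over the "velocities agree within tol" relation.
        stack, members = [i], []
        labels[i] = current
        while stack:
            j = stack.pop()
            members.append(j)
            near = np.where((np.linalg.norm(v - v[j], axis=1) < tol)
                            & (labels == -1))[0]
            labels[near] = current
            stack.extend(near.tolist())
        if len(members) < min_size:
            labels[members] = -2      # too small: treat as field stars
        else:
            current += 1
    return labels

labels = comoving_groups(velocities)
```

Stars sharing a label are candidates for a common birthplace; tracing their shared motion backward hints at when and where that birth happened.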
NASA data is available to researchers and the public via the Scientific Visualization Studio at Goddard, founded in 1988.
This program is funded by IBM. NPF is solely responsible for the content.