While we wait for tomorrow’s launch of TESS, delayed “to conduct additional Guidance Navigation and Control analysis”, let’s look at how the data gets from the spacecraft back to researchers. And we are not talking about a little data: the rate could reach 27 gigabytes per day!
A quick word about the delay: everybody is doing their best to make sure everything is perfect for launch. I don’t know the reason for what appears to be a last-minute decision; it could be simple reassurance, or it could be that something was not quite right. Double-check the fairing while you’re at it.
Anyway, as a ham radio operator, I enjoy these bits of communications news.
NASA — A Science Pipeline to New Planet Discoveries
NASA’s ongoing search for life in the universe produces a lot of data. The agency’s new planet-hunting mission, the Transiting Exoplanet Survey Satellite, or TESS, will collect 27 gigabytes per day in its all-sky search for undiscovered planets orbiting 200,000 of the brightest and closest stars in our solar neighborhood. That’s the equivalent of about 6,500 song files beaming down to Earth every two weeks. The music of the stars, however, is not as polished for human ears as the latest Taylor Swift album. To get ready for scientific discovery, the data needs a bit of fine-tuning.
One of the first steps in the data’s journey from deep space to a scientist’s laptop is the Science Processing Operations Center, called SPOC, at NASA’s Ames Research Center in Silicon Valley. Its design is based on the Kepler mission’s Science Operations Center, called the SOC, also at Ames. The SOC has been chugging along for more than a decade, spitting out tens of thousands of possible planet signals from the Kepler space telescope, NASA’s groundbreaking planet-finding mission that’s revolutionized our view of the heavens as a place chock-full of other worlds where life could exist. Among Kepler’s many gifts to TESS is its science data pipeline, which will provide the public’s “data of record” for the mission. About 75 percent of the Kepler pipeline, which took over 150 person-years to develop, remains the same for TESS, giving this new mission a leg up on discoveries.
A data pipeline is like an assembly line where computer algorithms act in stages to refine data and extract types of information — in this case, the possible signals of planets. TESS’s cameras observe the slight dip in the brightness of a star as a planet crosses, or transits, in front of the star. Over time, a pattern emerges as the dips line up across multiple transits, revealing the signal of an orbiting planet.
It’s a simple concept with a history of successful science, but the raw data, appearing as two-minute digital counts of brightness on each pixel, is contaminated with signals from the telescope and the sky when it first arrives here on Earth. SPOC’s science data pipeline does a cleanup job, and paves the way for the mission’s science office branch at the Massachusetts Institute of Technology to pick out the most promising planet candidates. From there, the Harvard-Smithsonian Center for Astrophysics at Harvard University coordinates follow-up observations to determine which candidates are bona fide planets.
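As a toy illustration of the kind of “cleanup job” described above: the actual SPOC calibration is far more sophisticated, but dividing out a running-median baseline shows how a slow instrumental drift can be removed from raw brightness counts so that a shallow transit-like dip stands out. Every value here (drift shape, noise level, dip depth) is made up for the sketch.

```python
import numpy as np

def detrend(flux, window=301):
    """Divide out a running-median baseline to remove slow drift.

    A running median follows gradual systematic trends but is barely
    affected by a short transit dip, so dividing by it leaves the dip
    intact on a flat baseline near 1.0.
    """
    half = window // 2
    padded = np.pad(flux, half, mode="edge")
    baseline = np.array([np.median(padded[i:i + window])
                         for i in range(flux.size)])
    return flux / baseline

rng = np.random.default_rng(1)
n = 2000
t = np.arange(n)

# Raw counts: a slow sinusoidal drift (fake systematic) plus noise...
raw = 10000 * (1 + 0.05 * np.sin(t / 400)) + rng.normal(0, 20, n)
raw[900:960] *= 0.99  # ...and a hypothetical 1% transit dip.

clean = detrend(raw)
# After detrending, the baseline sits near 1.0 and the dip near 0.99.
```

The window just has to be much longer than the dip (so the median ignores it) and much shorter than the drift (so the median tracks it); real pipelines also remove spacecraft pointing jitter, cosmic-ray hits, and detector effects.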
NASA Ames’ Pleiades supercomputer, one of the most powerful systems in the world, has the power to process TESS’s biweekly data deluge of almost 10 billion pixels in three to five days, a cadence that enables SPOC to keep up with the volume of incoming data.
TESS’s mission is to identify the most promising exoplanets for follow-up observations. Future missions and observatories, such as the James Webb Space Telescope, will apply new technologies to study these exoplanets’ atmospheres in search of the chemical signatures of life.
TESS’s first public release of pipeline-processed data is planned for the beginning of 2019. Astronomers will then begin to peer at data from entirely new areas of the sky where we await new discoveries from these singing stars and their quietly humming planets.