On Wednesday, astronomers announced that they’d captured the world’s first image of a black hole - and the internet couldn’t handle it.

No, we’re not talking about black hole Shrek memes or snarky opinion pieces about how this image of an object 55 million light-years away was “so blurry.” We’re talking about how the internet literally couldn’t handle the quantity of data collected by the eight telescopes across five continents that make up the Event Horizon Telescope experiment, which captured this image of the black hole at the center of the galaxy Messier 87. Instead, the massive quantity of data collected by the radio antennae had to be flown on airplanes to central data centers, where it could be cleaned and analyzed.

So in addition to being a massive achievement of human ingenuity and understanding, one that confirmed several theories about black holes, the M87 black hole image was also a Herculean feat of data storage and management. Here’s why and how this one picture required the data equivalent of 1.39 billion copies of “Old Town Road” by Lil Nas X.

Eight Synchronized Telescopes

The ALMA observatory in Chile was one of the eight telescopes that imaged the M87 black hole. (European Space Agency)

Over seven days in April 2017, the EHT experiment turned all eight telescopes toward M87. Synchronized by custom-made atomic clocks, they all started collecting the incoming radio signals from the distant black hole and logging the data on super-fast data recorders that had been built for this very task.

“We had 5 petabytes of data recorded,” Dan Marrone, Ph.D., an associate professor of astronomy at the University of Arizona who specialized in data storage for the EHT experiment, told reporters on Wednesday. “It amounts to more than half a ton of hard drives. Five petabytes is a lot of data: It’s equivalent to 5,000 years of MP3 files.”

The EHT experiment employed a technique called very long baseline interferometry, which used the eight simultaneously recording telescopes to essentially turn the Earth into a single, rotating telescope dish. Each of these telescopes recorded raw incoming radio signals as tons of data. In other words, it’s like if eight people took videos of the same far-away phenomenon from different angles, then put all of their videos together to make one really clear video. In this scenario, though, the object was really far away, and the telescopes were really far apart. The advantage of this long baseline between telescopes is that the rotation of the Earth gave scientists shots of the black hole from eight simultaneous angles.

Once all 1,000 pounds of hard drives were filled with these 5 petabytes of raw data, they were loaded onto airplanes and flown to two centralized “correlators,” located in Massachusetts and Germany. “The fastest way to do that is not over the internet, it’s actually to put them on planes,” said Marrone. “There’s no internet that can compete with 5 petabytes of data on a plane.”

Adding to this challenge, the scientists had to wait until summer to send the hard drives from the South Pole Telescope, as the images were captured during Antarctica’s winter.

The correlators then began the job of syncing up all the data from the telescopes with each other. This means that supercomputers took all the raw observational data collected by the telescopes and used the atomic clock information to line them all up with one another, creating a seamless record of the wavefront of light from the black hole as it reached Earth.

Exchanging Tools With Silicon Valley

Chi-Kwan Chan, Ph.D., a computational astrophysicist at the University of Arizona who dealt with computation for the M87 imaging project, tells Inverse that once the correlators had cleaned the data, the task then got a lot more granular.
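The claim that eight telescopes can act as one Earth-sized dish comes down to diffraction: angular resolution scales as observing wavelength over aperture diameter. A back-of-envelope sketch, assuming the EHT's roughly 1.3 mm observing wavelength and, as a simplification, a baseline equal to Earth's full diameter:

```python
import math

# Diffraction-limited angular resolution of an Earth-sized "dish":
# theta ~ 1.22 * lambda / D (assumed inputs; illustrative only).
wavelength_m = 1.3e-3            # EHT observed at ~1.3 mm (230 GHz)
baseline_m = 1.2742e7            # Earth's diameter, ~12,742 km
theta_rad = 1.22 * wavelength_m / baseline_m

# Convert radians to microarcseconds.
theta_microarcsec = theta_rad * (180 / math.pi) * 3600 * 1e6
print(round(theta_microarcsec, 1))   # ~25.7 microarcseconds
```

A resolution of a few tens of microarcseconds is what makes the M87 black hole's roughly 40-microarcsecond shadow resolvable at all; no single physical dish could come close.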
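Marrone's planes-versus-internet point, and the "Old Town Road" comparison, can be sanity-checked with simple arithmetic. The 1 Gb/s link speed below is an assumed figure, not from the article:

```python
# Why planes beat the internet for 5 PB: transfer-time arithmetic.
data_bytes = 5e15                     # 5 petabytes of raw EHT data
link_bps = 1e9                        # assumed fast 1 Gb/s internet link
seconds_online = data_bytes * 8 / link_bps
print(seconds_online / 86400)         # ~463 days of continuous transfer

# The "1.39 billion copies" comparison implies a file of roughly
# 5e15 / 1.39e9 bytes -- about 3.6 MB, a typical MP3 single.
print(data_bytes / 1.39e9 / 1e6)      # ~3.6 MB per copy
```

Over a year of saturated gigabit transfer versus a day or two of flights: the hard drives win by orders of magnitude.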
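What the correlators do, conceptually, is use the atomic-clock timestamps to line up each station's recording and then cross-correlate the streams. A toy sketch of that alignment step, using a hypothetical white-noise "wavefront" and sample delay (real correlators align radio voltage streams at vastly higher rates):

```python
import numpy as np

rng = np.random.default_rng(42)

# Two stations record the same wavefront, offset by an unknown delay,
# each with its own receiver noise (toy model; values are assumptions).
n = 4096
true_delay = 37                           # samples by which station B lags A
wavefront = rng.standard_normal(n + true_delay)

station_a = wavefront[true_delay:] + 0.8 * rng.standard_normal(n)
station_b = wavefront[:n] + 0.8 * rng.standard_normal(n)  # delayed copy of A

# Full cross-correlation; the peak sits at the relative delay.
xc = np.correlate(station_b, station_a, mode="full")
lags = np.arange(-(n - 1), n)             # lag of B relative to A
estimated_delay = lags[np.argmax(xc)]
print(estimated_delay)                    # recovers 37
```

Once every pair of stations is aligned this way, the combined streams behave like samples of a single wavefront across an Earth-sized aperture, which is what makes the interferometry work.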