Since their downloadable full-size images are hosted on Google, that is probably where the rest of the archive lives as well, while the raw data from the telescope likely goes to hard drives first.
  
  
> When operational, the 3.2 gigapixel detector will capture 15 terabytes of data per night over its 10-year survey as it investigates 37 billion stars and galaxies.

> Therefore, Blackburn is always on the lookout for another method to move petabyte-scale data.
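Just to put numbers on "petabyte-scale", here is a quick back-of-envelope sketch. The 15 TB/night and 10-year figures are from the quote above; the every-night observing schedule and the 10 Gbit/s link speed are my own assumptions for illustration:

```python
# Back-of-envelope: why the survey is "petabyte-scale".
TB = 1e12  # one terabyte in bytes (decimal)

nightly = 15 * TB      # 15 TB per night (quoted)
nights_per_year = 365  # assumes observing every night (an upper bound)
years = 10             # quoted survey length

total_bytes = nightly * nights_per_year * years
print(f"Raw detector output: ~{total_bytes / 1e15:.0f} PB over the survey")

# How long would moving a single petabyte take over a fast link?
link_gbps = 10                             # assumed dedicated 10 Gbit/s link
pb_seconds = 1e15 * 8 / (link_gbps * 1e9)  # seconds to transfer 1 PB
print(f"1 PB over {link_gbps} Gbit/s: ~{pb_seconds / 86400:.1f} days")
```

That works out to roughly 55 PB of raw output, and about nine days per petabyte even on a dedicated 10 Gbit/s link, which is why shipping drives (or looking for "another method") stays attractive at this scale.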
I am sure they are aware of cloud offerings like Wasabi and others that are cheaper than AWS, so those are probably still too expensive or not a good fit for their use case. It was even said that the final data lives on Google's cloud. But what if the offer were that they could build their own data center by running their own satellite? I am sure there are hundreds of thousands of astronomy ent…