EDIT: I should say the analogy for these kinds of events is wine sloshing in a glass. The hot X-ray emitting atmosphere of a galaxy cluster would normally sit calmly in the gravitational potential well. What is likely happening is that a smaller subcluster passes close to the main cluster. This shifts the gravitational potential, so the atmosphere is out of equilibrium and sloshes back and forth for billions of years.
These are fascinating objects. Up to 10^15 times as massive as our Sun, they are mostly made of dark matter (80-90%), and most of the normal baryonic matter is in the form of a hot plasma, heated by shocks as the cluster grew through mergers and accretion of subclusters.
I see lots of mention of long timeframes for the data (20 years), but nothing about how much data that actually is. I'm not very familiar with this kind of data, but I'm curious about the software side: how much data was needed, timeframes for processing it, any special hardware required, etc.
How much data did you start with (gigabytes? terabytes?)?
What does this data actually look like? csv, custom binary format, some open spec maybe?
How much did you end up filtering out for the various reasons in the paper?
Was there anything that surprised you personally while working on this paper? It seems like most of this is confirming existing theory, which is great, but I'm curious if you had any new takeaways.
Does the team want to continue to pursue this? If so, what do they hope to accomplish or maybe there's some odd data / behavior that you would like to continue to look at?
Yes - we analysed a lot of observations to do the calibration work - that's the advantage of a big public archive. After processing it takes up several hundred gigabytes. It would probably be a few times larger, but I threw away quite a lot that we don't use for this analysis (flared time periods and low energies). That doesn't include the raw input datasets, which might be a few TB - I've not checked, as they're on a different system.
The data, as I say above, is in FITS format, which is a standard binary table format. The processed data are event files (lists of photons), spectra (tables of energy vs number of photons), and detector responses (matrices to turn a model spectrum into an observed spectrum). Along the way there are lots of intermediate text and FITS files. I even used HDF5 for part of the code, but that's mainly because it's so easy to use from Python.
How much was filtered? Usually we need to filter out around 40% of the time periods for an average observation due to flares caused by soft protons hitting the detector. In this analysis we also threw away a lot of the data at lower energies, as we were only interested in the high energy emission lines, where we can calibrate the detector. I don't know the number there - maybe we threw away 80% of the total events by filtering the low energies. Finally, we also threw away half of the events to retain those with the best energy resolution (those where a photon hits a single pixel on the detector).
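Putting those rough fractions together, purely as an order-of-magnitude illustration (the cuts are not really independent, so this is only indicative):

```python
# Combined yield if the three cuts above acted independently
# (an approximation - in reality the cuts are correlated).
time_kept   = 1 - 0.40  # flare filtering removes ~40% of exposure
energy_kept = 1 - 0.80  # low-energy cut removes ~80% of events
grade_kept  = 0.50      # keep only single-pixel events

kept = time_kept * energy_kept * grade_kept
print(f"{kept:.0%}")  # roughly 6% of the raw events survive all cuts
```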
Surprises? For the Perseus cluster, it was nice when I made a map of the motions and ended up with something that looked like the simulations of sloshing. For Coma, I was surprised that the gas in the cluster still has the same velocity as the central galaxies - I would have thought that it should have slowed down - it will be interesting to discuss this further with theorists. I was also surprised by the complexity of the detector on the instrument. It seemed a simple idea when I started, but turned out to be rather tricky.
We're planning to pursue this further. We have new deep observations of two other nearby clusters. The aim is to study "feedback" by active galactic nuclei - active black holes affecting their surroundings - in the centres of these clusters. They should be disturbing the gas/plasma, and we hope to measure that, as it hasn't been done before. There are also some things we could do to improve the calibration technique if we have time. For example, we could also use photons which land on multiple pixels.
A long time ago I was doing analysis of X-ray data of Coma and Perseus to infer the dark matter distribution, using the temperature and density variation of the gas to derive the shape of the gravitational well. Among other things, that assumes the gas is in hydrostatic equilibrium. How much do you feel the bulk motion you are detecting will affect dark matter estimates derived from such X-ray measurements of clusters?
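For context, the standard hydrostatic mass estimate I mean is (my notation, not from the paper):

```latex
M(<r) = -\frac{k_B T(r)\, r}{G \mu m_p}
        \left(\frac{d\ln\rho}{d\ln r} + \frac{d\ln T}{d\ln r}\right)
```

where $\rho$ and $T$ are the gas density and temperature and $\mu m_p$ is the mean particle mass. Any bulk or turbulent motion contributes pressure support that this thermal-only estimate misses.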
BTW one thing I think is underappreciated in this kind of analysis is the sheer amount of work you need to do to get believable results. I am guessing you probably spent more time on data reduction and correcting for the endless variety of sources of noise and uncertainty than you did on the actual science.
Yes - there was a lot of work behind this paper, and most of it was on trying to get the calibration procedures we developed to work. It was a long process of trying things and seeing what worked and testing the results for consistency. Initially I thought it would be a lot simpler, but real detectors in space are complex things.
What's the impact of this on the galaxy as a whole? More star formation? A different composition of stars, or a different distribution of stellar types?
You should get star formation in the central massive galaxy, as the atmosphere should cool relatively quickly (in less than a billion years). However, the central black hole appears to regulate cooling to low levels. There are filaments of star formation in many clusters like the Perseus cluster, but these appear to trace the black hole activity (its structures) more than the sloshing. There may be some sloshing imprint, however. The scales the sloshing happens on are also important: these filaments are on small scales compared to the sloshing we see.
I would assume that the sloshing moves the cooling hot gas to one side, which might make cooling more efficient as it's further from the central black hole.
If you can answer it in a short post: how is it established that it is dark matter and not cold gas?
We can measure the shape of the dark matter in the cluster by various techniques (e.g. gravitational lensing). Cold material couldn't exist in a distributed form in the very hot cluster atmosphere for very long. And how would it get there intact? It's also difficult to hide normal matter from being seen, particularly in this hot, energetic environment.
I thought that had passed. Is the requirement lifted only in certain domains, like below some size limit?