One of the highest-resolution maps of dark matter ever created has now been revealed, offering a detailed case for the existence of cold dark matter — sluggish particles that comprise the bulk of matter in the universe.
More…
Tau.Neutrino said:
Astronomy: Dark matter mapped
One of the highest-resolution maps of dark matter ever created has now been revealed, offering a detailed case for the existence of cold dark matter — sluggish particles that comprise the bulk of matter in the universe.
More…
The phrase “One of the highest-resolution maps of dark matter” does not exactly inspire confidence.
> Hubble Space Telescope Frontier Fields data
Probably worth finding out more about.
I’m not at all sure how this dark matter mapping software works, or whether it works at all. At a guess, I’d say it maps each blob of light onto an ellipse, assuming that the blob of light is a galaxy. Further assume that each source galaxy is very much further away than the galaxy cluster in the foreground (is this valid?). For each ellipse, determine the ratio of the axes and the orientation of the major axis, then smooth out the random fluctuations in axis orientation by some (what?) smoothing scheme. The average direction of the minor axis and the ellipse eccentricity together give the direction to, and strength of, the centre of dark matter.
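Here is a very rough sketch of what that ellipse-fitting and orientation-smoothing step might look like in Python, using second brightness moments for the axis ratio and position angle, and averaging doubled angles so the 180-degree degeneracy of the position angle doesn’t bias the smoothing. This is only a guess at the approach; the function names are mine and nothing here comes from an actual lensing pipeline.

import numpy as np

def ellipse_from_moments(image):
    # Fit an effective ellipse to a blob of light using second brightness moments.
    # Returns (axis ratio b/a, position angle of the major axis in radians).
    ny, nx = image.shape
    y, x = np.mgrid[0:ny, 0:nx]
    flux = image.sum()
    xc, yc = (image * x).sum() / flux, (image * y).sum() / flux
    qxx = (image * (x - xc) ** 2).sum() / flux
    qyy = (image * (y - yc) ** 2).sum() / flux
    qxy = (image * (x - xc) * (y - yc)).sum() / flux
    # Eigenvalues of the moment tensor give the squared semi-axes (up to a scale).
    half_trace = 0.5 * (qxx + qyy)
    root = np.sqrt(max(half_trace ** 2 - (qxx * qyy - qxy ** 2), 0.0))
    lam_major, lam_minor = half_trace + root, max(half_trace - root, 1e-12)
    axis_ratio = np.sqrt(lam_minor / lam_major)
    angle = 0.5 * np.arctan2(2.0 * qxy, qxx - qyy)  # major-axis position angle
    return axis_ratio, angle

def smooth_orientations(angles, weights=None):
    # Position angles are only defined modulo 180 degrees, so average the
    # doubled-angle vector (cos 2theta, sin 2theta) rather than the raw angles.
    angles = np.asarray(angles, dtype=float)
    if weights is None:
        weights = np.ones_like(angles)
    c = np.average(np.cos(2.0 * angles), weights=weights)
    s = np.average(np.sin(2.0 * angles), weights=weights)
    return 0.5 * np.arctan2(s, c)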
With an ellipse alone, which way along the minor axis the dark matter lies is ambiguous, since the axis defines a line rather than a direction; relying on a more complicated shape than an ellipse could break that tie, but would be inaccurate. Also unknown is a global scaling factor for the distance along that minor axis to the centre of dark matter.
A poor algorithm would simply add up the inferred source positions, with trial and error on the scaling factor. A better algorithm would interpret the dark matter signal from each ellipse as the gradient of a potential field and then integrate that potential field out from the smoothed local gradient strengths. The 2-D integration, even after smoothing, would be self-inconsistent, so it would need some sort of averaging; possibly by trial and error, but preferably not.
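If that integration is done in a least-squares sense, one standard way to average away the self-inconsistency is to solve a Poisson equation for the divergence of the smoothed gradient field with FFTs. A minimal sketch under that assumption (regular grid, periodic boundaries, illustrative only):

import numpy as np

def integrate_gradient_field(gx, gy, spacing=1.0):
    # Least-squares integration of a possibly inconsistent 2-D gradient field:
    # solve laplacian(psi) = d(gx)/dx + d(gy)/dy in Fourier space, which
    # automatically averages away the curl (the self-inconsistent part).
    ny, nx = gx.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=spacing)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=spacing)
    KX, KY = np.meshgrid(kx, ky)
    k2 = KX ** 2 + KY ** 2
    k2[0, 0] = 1.0  # avoid dividing by zero; the constant level of psi is arbitrary anyway
    div_hat = 1j * KX * np.fft.fft2(gx) + 1j * KY * np.fft.fft2(gy)
    psi_hat = -div_hat / k2  # the laplacian becomes -k^2 in Fourier space
    psi_hat[0, 0] = 0.0
    return np.real(np.fft.ifft2(psi_hat))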
It’s the type of algorithm that would be easy to get wrong.
mollwollfumble said:
It’s the type of algorithm that would be easy to get wrong.
Another variation on this algorithm would be to track along the major-axis direction as if it were a contour line. Coming back to the starting position will give a closure error that can then be distributed along the contour line, in much the same way as is done for surveying traverses. The average ellipticity along each contour line then tells you what the deflection of light is for that contour line.
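A toy version of that traversal, with the closure error spread linearly along the line the way a closed surveying traverse is adjusted. The direction field angle_at is assumed to come from the smoothed ellipse orientations; this is a sketch of the idea, not tested lensing code.

import numpy as np

def trace_contour(angle_at, start, step=1.0, n_steps=400):
    # Walk along the local major-axis direction as if it were a contour line.
    # angle_at(x, y) returns the smoothed major-axis position angle at a point.
    path = [np.asarray(start, dtype=float)]
    for _ in range(n_steps):
        x, y = path[-1]
        theta = angle_at(x, y)
        d = np.array([np.cos(theta), np.sin(theta)])
        # Position angles are only defined modulo 180 degrees, so keep stepping
        # roughly the same way as the previous step to avoid doubling back.
        if len(path) > 1 and np.dot(d, path[-1] - path[-2]) < 0.0:
            d = -d
        path.append(path[-1] + step * d)
    path = np.array(path)
    closure_error = path[0] - path[-1]  # zero for a perfectly closed contour
    fractions = np.linspace(0.0, 1.0, len(path))[:, None]
    adjusted = path + fractions * closure_error  # spread the error along the line
    return path, adjusted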
Then use the deflection of light to recover the distribution of dark matter.
Difficult to apply the last step.
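For what it’s worth, the textbook form of that last step in weak lensing: the deflection angle is the gradient of the lensing potential, and half its divergence is the convergence, i.e. the projected (dark plus luminous) mass in units of the critical surface density. A minimal sketch, assuming the deflection field is already on a regular grid:

import numpy as np

def convergence_from_deflection(alpha_x, alpha_y, spacing=1.0):
    # Convergence kappa = 0.5 * (d(alpha_x)/dx + d(alpha_y)/dy); multiplying
    # kappa by the critical surface density gives a projected mass map.
    dax_dx = np.gradient(alpha_x, spacing, axis=1)
    day_dy = np.gradient(alpha_y, spacing, axis=0)
    return 0.5 * (dax_dx + day_dy)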