Tau.Neutrino said:
Break it down: A new way to address common computing problem
In this era of big data, there are some problems in scientific computing that are so large, so complex and contain so much information that attempting to solve them would be too big of a task for most computers.
Now, researchers at the McKelvey School of Engineering at Washington University in St. Louis have developed a new algorithm for solving a common class of problem—known as linear inverse problems—by breaking them down into smaller tasks, each of which can be solved in parallel on standard computers.
> linear inverse problems
What?
Nonlinear inverse problems are difficult. One I know about is removing the blur from a photograph when the blur kernel is unknown, which is a form of blind deconvolution. Deconvolution also turns up in NMR as a classic case.
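For the non-blind case, where the kernel is known, deconvolution is a linear inverse problem, and a few lines of NumPy show why even that is delicate: dividing by the kernel's spectrum amplifies noise wherever the spectrum is small. Everything here (signal, kernel width, noise level) is invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128
signal = np.zeros(n)
signal[40:60] = 1.0                                  # a simple box "image"

# Gaussian blur kernel, centred then shifted so index 0 is the peak.
kernel = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)
kernel /= kernel.sum()
K = np.fft.fft(np.fft.ifftshift(kernel))             # kernel spectrum

blurred = np.real(np.fft.ifft(np.fft.fft(signal) * K))
noisy = blurred + 1e-3 * rng.standard_normal(n)      # small measurement noise

# Naive deconvolution: divide spectra. High frequencies, where |K| is
# tiny, blow the noise up -- the ill-posedness in action.
naive = np.real(np.fft.ifft(np.fft.fft(noisy) / K))

# Wiener-style regularisation damps those frequencies instead of inverting them.
eps = 1e-3
wiener = np.real(np.fft.ifft(np.fft.fft(noisy) * np.conj(K) / (np.abs(K) ** 2 + eps)))

print("naive error :", np.linalg.norm(naive - signal))
print("wiener error:", np.linalg.norm(wiener - signal))
```

Blind deconvolution is the nonlinear version of this: you have to estimate K as well as the signal, which is why it's so much harder.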
https://en.wikipedia.org/wiki/Inverse_problem#Linear_inverse_problems
> An elementary example: Earth’s gravitational field
If we have N measurements of the Earth’s gravitational field and approximate the interior of the Earth by N masses, then the N measurements uniquely determine those N masses.
Easy peasy. Any matrix solver can do it. So what’s the difficulty?
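In the noiseless square case it really is a one-liner. Here's a toy version of the gravity example, with invented geometry (unit-depth point masses, an inverse-square-style kernel, physical constants dropped):

```python
import numpy as np

# Toy gravity inversion: N buried point masses, N surface measurements.
# G[i, j] is the field at station i due to unit mass j. The geometry is
# made up purely for illustration.
rng = np.random.default_rng(1)
N = 5
stations = np.linspace(0.0, 4.0, N)        # measurement positions on the surface
masses_x = np.linspace(0.5, 3.5, N)        # point masses buried at depth 1
G = 1.0 / ((stations[:, None] - masses_x[None, :]) ** 2 + 1.0)

true_masses = rng.uniform(1.0, 2.0, N)
measurements = G @ true_masses             # forward problem: masses -> data

recovered = np.linalg.solve(G, measurements)   # "any matrix solver"
print(np.allclose(recovered, true_masses))     # True: noiseless square case is easy
```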
The difficulty is that the matrix can be nearly singular, so inverting it amplifies any noise in the measurements. There are several ways to help overcome this, such as imposing constraints or using probabilistic methods. When the matrix is symmetric (positive definite), a simple multidimensional optimisation method such as conjugate gradient can quickly find the optimum. When the matrix is not symmetric it becomes more difficult, but still tractable.
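Same toy model, but push N up and add a whisper of noise and you can watch the amplification. A Tikhonov penalty (one of the constraint tricks) tames it, and since the normal-equation matrix is symmetric positive definite, conjugate gradient applies too. The noise level and penalty here are arbitrary:

```python
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(2)
N = 20
x = np.linspace(0.0, 4.0, N)
G = 1.0 / ((x[:, None] - x[None, :]) ** 2 + 1.0)   # smooth kernel -> nearly singular
print("condition number:", np.linalg.cond(G))      # enormous

true = rng.uniform(1.0, 2.0, N)
data = G @ true + 1e-6 * rng.standard_normal(N)    # tiny measurement noise

naive = np.linalg.solve(G, data)
print("naive error      :", np.linalg.norm(naive - true))   # noise hugely amplified

# Tikhonov regularisation: solve (G^T G + lam I) m = G^T d instead.
# The penalty biases the answer slightly but kills the noise blow-up.
lam = 1e-6
A = G.T @ G + lam * np.eye(N)
b = G.T @ data
tikhonov = np.linalg.solve(A, b)
print("regularised error:", np.linalg.norm(tikhonov - true))

# A is symmetric positive definite, so conjugate gradient applies as well;
# it is iterative and never forms an inverse, which matters at large N.
m_cg, info = cg(A, b, atol=1e-12, maxiter=5000)
print("cg error         :", np.linalg.norm(m_cg - true))
```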
What does the OP link https://techxplore.com/news/2020-08-common-problem.html say?
> Parallel Residual Projection (PRP)
That’s actually a good point. Conjugate gradient optimisation uses residual projection, but that part of the algorithm is not parallelisable. I can envisage turning it into a parallel algorithm by doing multiple residual projections individually and then merging them before scattering the result and starting again, something like the sketch below.
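Here's that merge-and-scatter idea in plain NumPy, essentially a Cimmino-style iteration: every row's residual projection is independent, so each could go to its own worker, and the merge step is just an average before the next sweep. To be clear, this is my guess at the flavour of the scheme, not the published PRP algorithm:

```python
import numpy as np

def parallel_residual_projection(A, b, iters=5000, relax=1.0):
    """Cimmino-style sketch: project the current iterate onto every row's
    hyperplane independently (each projection could run on its own worker),
    then merge the projections by averaging and start the next sweep.
    My guess at the flavour of the scheme -- not the published PRP."""
    m, n = A.shape
    x = np.zeros(n)
    row_norms2 = np.sum(A * A, axis=1)            # ||a_i||^2 for each row
    for _ in range(iters):
        residuals = b - A @ x                     # one residual per row
        # Each row's correction is independent of the others; vectorised
        # here, but this is the embarrassingly parallel part.
        corrections = (residuals / row_norms2)[:, None] * A
        x = x + relax * corrections.mean(axis=0)  # merge, then scatter x again
    return x

# Smoke test on a small consistent system (numbers invented).
rng = np.random.default_rng(3)
A = rng.standard_normal((8, 5))
x_true = rng.standard_normal(5)
b = A @ x_true

x_hat = parallel_residual_projection(A, b)
print(np.linalg.norm(A @ x_hat - b))              # residual driven toward zero
```

In this sketch the averaging line is the only synchronisation point; everything above it could be farmed out to separate machines.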