
'Breakthrough' algorithm exponentially faster than any previous one - optimalsolver
https://www.sciencedaily.com/releases/2018/06/180628131104.htm
======
LyndsySimon
To say this is vague would be an understatement.

As for "adaptive sampling" \- by definition, if some paths are being discarded
without analysis, then there's no guarantee that your selection is truly the
best available.

I'm tentatively labeling this as "hype and marketing" unless I see something
more detailed and accurate.

~~~
Hnrobert42
I think the point is that they are willing to accept the risk of not choosing
the truly best available solution in return for speed. Most practical
solutions for these NP problems make the same sacrifice. The authors are just
claiming to be more efficient in deriving an approximate solution.

------
fhood
Is this satire? What algorithm is faster than what algorithm?!?! They have
names! "State of the art" is not the name of a damn algorithm. Why is the
abstract being coy?

------
sxv
More recent IEEE post [0] links to a different version of the paper [1] than
the arxiv preprint linked in previous comments [2].

[0] [https://spectrum.ieee.org/tech-talk/computing/software/new-o...](https://spectrum.ieee.org/tech-talk/computing/software/new-optimization-algorithm-exponentially-speeds-computation)

[1] [https://scholar.harvard.edu/files/ericbalkanski/files/the-ad...](https://scholar.harvard.edu/files/ericbalkanski/files/the-adaptive-complexity-of-maximizing-a-submodular-function.pdf)

[2] [https://arxiv.org/pdf/1804.06355.pdf](https://arxiv.org/pdf/1804.06355.pdf)

------
pmarreck
So this is basically "early search-tree pruning"? Or, how is it different from
that, exactly? (I believe early search-tree pruning has been used in chess
algorithms for years.)
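
For reference, the "early search-tree pruning" long used in chess engines is
typically alpha-beta pruning. A minimal, self-contained sketch (the toy game
tree and leaf values here are invented for illustration):

```python
def alphabeta(node, depth, alpha, beta, maximizing):
    """Return the minimax value of `node`, skipping (pruning) branches
    that provably cannot affect the final decision."""
    if depth == 0 or not node.get("children"):
        return node["value"]
    if maximizing:
        best = float("-inf")
        for child in node["children"]:
            best = max(best, alphabeta(child, depth - 1, alpha, beta, False))
            alpha = max(alpha, best)
            if alpha >= beta:  # prune: the minimizing opponent will never allow this line
                break
        return best
    else:
        best = float("inf")
        for child in node["children"]:
            best = min(best, alphabeta(child, depth - 1, alpha, beta, True))
            beta = min(beta, best)
            if beta <= alpha:  # prune symmetrically for the minimizer
                break
        return best

# Toy tree: a maximizing root over two minimizing subtrees.
tree = {"children": [
    {"children": [{"value": 3}, {"value": 5}]},
    {"children": [{"value": 2}, {"value": 9}]},
]}
print(alphabeta(tree, 2, float("-inf"), float("inf"), True))  # -> 3
```

The key contrast with the article's approach: alpha-beta discards branches
deterministically and still returns the exact optimum, whereas adaptive
sampling discards candidates probabilistically and only approximates it.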

~~~
sp332
It samples from the data randomly, which can be massively parallelized. As it
runs, it adapts the distribution that it samples from, which is similar to
pruning.
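
A toy sketch of that sample-then-filter flavor, using a coverage objective
(a standard submodular function). This is NOT the authors' exact algorithm -
see the linked paper - and the objective, threshold rule, and parameter names
here are invented for illustration:

```python
import random

def coverage(chosen, universe_sets):
    """Submodular objective: number of items covered by the chosen sets."""
    covered = set()
    for i in chosen:
        covered |= universe_sets[i]
    return len(covered)

def adaptive_sampling_max(universe_sets, k, samples_per_round=50, seed=0):
    """Pick up to k sets by repeated random sampling plus pruning-like filtering."""
    rng = random.Random(seed)
    chosen, candidates = [], list(range(len(universe_sets)))
    while len(chosen) < k and candidates:
        base = coverage(chosen, universe_sets)
        # Evaluate many random candidates; in the real algorithm these
        # independent evaluations are what gets parallelized.
        gains = {}
        for _ in range(samples_per_round):
            c = rng.choice(candidates)
            gains[c] = coverage(chosen + [c], universe_sets) - base
        best = max(gains, key=gains.get)
        if gains[best] == 0:
            break
        chosen.append(best)
        # Pruning-like step: adapt the sampling pool by discarding sampled
        # elements whose marginal gain fell far below the best seen.
        threshold = gains[best] / 2
        candidates = [c for c in candidates
                      if c not in gains or gains[c] >= threshold]
        candidates.remove(best)
    return chosen

sets = [{1, 2, 3}, {3, 4}, {5}, {1, 2}]
print(adaptive_sampling_max(sets, k=2))
```

Note the tradeoff the thread is debating: filtered-out elements are never
analyzed again, so the result is only guaranteed to be approximately optimal.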

~~~
LyndsySimon
> It samples from the data randomly

If this is true, then it's not an "optimization algorithm" at all.

------
IshKebab
This is too dumbed down to be understood by anyone. :/

------
bhouston
What is this? Adaptive Monte Carlo simulation or something similar? Randomized
algorithms are great, but generally they do not offer consistently repeatable
results, which can be problematic.
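
On the repeatability point: a randomized algorithm's output varies run to run,
but fixing the RNG seed makes it deterministic. A minimal illustration with a
Monte Carlo estimate of pi (the function name and sample counts are invented
for illustration):

```python
import random

def mc_pi(n, seed=None):
    """Estimate pi by sampling n points in the unit square and counting
    how many land inside the quarter circle."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4 * hits / n

# Unseeded runs generally differ; seeded runs are identical.
a = mc_pi(10_000, seed=42)
b = mc_pi(10_000, seed=42)
print(a == b)  # -> True
```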

------
Nerdfest
The article claims it's reducing the number of parallelizable steps, but it
sounds like it's actually reducing serial operations and becoming more parallel.

------
HelloNurse
Any link to a paper?

~~~
MaxBarraclough
[https://arxiv.org/pdf/1804.06355.pdf](https://arxiv.org/pdf/1804.06355.pdf)

~~~
LyndsySimon
Excellent, thank you!

> This algorithm therefore achieves an exponential speedup in parallel running
> time for submodular maximization at the expense of an arbitrarily small loss
> in approximation quality.

Based on this, it seems that this isn't so much a better search algorithm as a
means of finding the optimal tradeoff between accuracy and speed in situations
where processing the entire dataset is computationally prohibitive.

I'm no computer scientist, and algorithms in general aren't really my area,
so take this with a large grain of salt.

