Computational Complexity: Threesomes, Degenerates, and Love Triangles

beero1000

The paper "Threesomes, Degenerates, and Love Triangles" was posted to arXiv yesterday (http://arxiv.org/abs/1404.0799). Besides having an awesome title, it disproves a widely held belief in computational complexity theory - that the 3SUM problem has an order n2 lower bound.

The 3SUM problem is this: given a set of n real numbers, determine whether or not there is a subset of three of them that sums to 0. The trivial algorithm just checks every triple and takes around n^3 operations. Being a bit cleverer, we can sort the numbers (n log n operations), choose any of the n^2 pairs, and then binary search in log n time for the value that would complete a zero-sum triple. Altogether that is asymptotically n^2 log n. A bit cleverer still and we can get rid of the log n factor from the binary search entirely, giving an n^2 algorithm.
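To make that last step concrete, here is a minimal sketch of the standard quadratic approach (not the new paper's algorithm): sort once, then for each fixed element sweep the rest of the sorted array with two pointers instead of binary searching, which is what gets rid of the log n factor. The function name is just for illustration.

```python
def has_3sum(nums):
    """Return True if some three distinct elements of nums sum to 0."""
    a = sorted(nums)                      # n log n
    n = len(a)
    for i in range(n - 2):                # fix the smallest element of the triple
        lo, hi = i + 1, n - 1
        while lo < hi:                    # linear two-pointer sweep per i
            s = a[i] + a[lo] + a[hi]
            if s == 0:
                return True
            elif s < 0:
                lo += 1                   # sum too small: raise the low end
            else:
                hi -= 1                   # sum too large: lower the high end
    return False

print(has_3sum([-5, 1, 4, 2, 9]))         # True, since -5 + 1 + 4 = 0
print(has_3sum([1, 2, 3, 4]))             # False
```

Each of the n choices of the fixed element costs only a linear sweep, so the whole thing is on the order of n^2 after the initial sort.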

People have been trying to find the most efficient algorithm for this problem for at least 30 years, but the best anyone had managed until now was an algorithm taking on the order of n^2 time. The (now known to be false) conjecture stated that any algorithm that correctly answers the 3SUM problem requires on the order of n^2 operations in the worst case. This was so widely believed to be true that a class of problems called "3SUM-hard" was created to collect problems that are at least as hard as 3SUM and so were presumed to need quadratic time (completely analogous to the more widely known class of NP-hard problems, although disproving that widely believed conjecture would be a much bigger deal :) ). Hundreds of problems were shown to be 3SUM-hard, many with very serious applications (such as determining whether a point set is degenerate - see the sketch below), so most theorists devoted their time to trying to prove lower bounds. They kept failing to reach the sharp quadratic lower bound, and now we know why.
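To give a flavor of what a 3SUM-hardness reduction looks like, here is a minimal sketch (my own illustration, not taken from the paper) of the classic mapping used for the degeneracy problem: send each number a to the point (a, a^3). Three distinct points on that cubic curve are collinear exactly when the corresponding numbers sum to 0, so any fast test for "some three points are collinear" would immediately give an equally fast 3SUM algorithm. Integer inputs keep the arithmetic exact.

```python
from itertools import combinations

def collinear(p, q, r):
    """Exact collinearity test via the cross product (no divisions)."""
    (x1, y1), (x2, y2), (x3, y3) = p, q, r
    return (x2 - x1) * (y3 - y1) == (x3 - x1) * (y2 - y1)

def is_degenerate_instance(nums):
    """Lift each number a to (a, a^3); for distinct a, b, c the lifted points
    are collinear iff a + b + c == 0, so degeneracy of this point set answers
    the 3SUM instance. (The brute-force check here is cubic; only the mapping
    matters for the reduction.)"""
    pts = [(a, a ** 3) for a in nums]
    return any(collinear(p, q, r) for p, q, r in combinations(pts, 3))

print(is_degenerate_instance([-5, 1, 4, 2, 9]))   # True: -5 + 1 + 4 = 0
```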

The paper shows an (n log log n)^2 / log n algorithm (just a teensy bit better asymptotically than quadratic - for 100,000 numbers it would be about 1% faster), but it is a huge theoretical breakthrough. They also gave an indication (the decision tree complexity) that the algorithm might be improvable down to something like n^(3/2) (log n)^(1/2), which would be a much bigger improvement (for 100,000 numbers it would be something like 99% faster).
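For anyone who wants to sanity-check those percentages, a quick back-of-the-envelope comparison (assuming base-2 logarithms and ignoring the constant factors that the asymptotic bounds hide):

```python
import math

n = 100_000
lg = math.log2(n)                        # ~16.6
lglg = math.log2(lg)                     # ~4.05

quadratic     = n ** 2
new_bound     = (n * lglg) ** 2 / lg     # the paper's running-time bound
decision_tree = n ** 1.5 * lg ** 0.5     # the paper's decision-tree bound

print(f"new bound vs n^2:     {new_bound / quadratic:.3f}")    # ~0.99  -> roughly 1% less work
print(f"decision tree vs n^2: {decision_tree / quadratic:.4f}")  # ~0.013 -> roughly 99% less work
```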
 