Thursday, July 10, 2014

ICALP Day 2 (guest post by Andrew Winslow)

Andrew Winslow was one of the colleagues who kindly answered my call for guest bloggers from ICALP 2014. Here is his guest post on the second conference day. Enjoy it!

Invited Talk: Victor Kuncak

Victor gave a talk entitled "Verifying and Synthesizing Software with Recursive Functions" about work with his students, including Ruzica Piskac, Philippe Suter, Eva Darulova, Etienne Kneuss, and Andrej Spielmann. The work focused on a number of problems in the area of turning a specification into a program that meets it. He did a great job of breaking out four types of problems they're working on:
                                      Run-time                                 Compile-time
Have specification C and program P    Assertion checking                       Prove correctness
                                      (Check P meets C)                        (Prove P always meets C)
Have specification C                  Constraint programming                   Program synthesis
                                      (Carry out execution according to C)     (Create P meeting C)
Their work is done in Scala and implemented in the Leon system. Out of the four types of problems, Victor indicated that the first two (first row of the table) are easier. Of course, easier doesn't mean easy, and he briefly covered a few of the ideas (induction, arithmetic/algebraic reasoning, SAT solvers, integer linear programs) used to achieve good empirical performance -- a few seconds to prove correctness of programs of 100 lines of code or more, and counterexamples in the case of incorrect code.
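In this setting, specifications are typically written as pre- and postconditions on recursive functions. The toy example below is mine (not code from the talk) and is just plain, runnable Scala illustrating the contract style that assertion checking and correctness proving operate on: `require` states a precondition C on the input, `ensuring` a postcondition on the result.

```scala
// A minimal, runnable sketch (example mine) of the contract style such tools check:
// `require` is a precondition, `ensuring` a postcondition.
object ContractSketch {
  sealed abstract class IntList
  case object Empty extends IntList
  case class Cons(head: Int, tail: IntList) extends IntList

  def size(l: IntList): Int = (l match {
    case Empty      => 0
    case Cons(_, t) => 1 + size(t)
  }).ensuring(res => res >= 0) // postcondition: a size is never negative

  def head(l: IntList): Int = {
    require(l != Empty) // precondition: only defined on non-empty lists
    l match { case Cons(h, _) => h; case Empty => sys.error("unreachable") }
  }

  def main(args: Array[String]): Unit =
    println(size(Cons(1, Cons(2, Empty)))) // 2
}
```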
Constraint programming was next, and he gave the example of programming red-black trees via constraints: using as constraints the five properties a red-black tree must have. Insertion, for instance, would be encoded as a combination of the five properties, plus the fact that the new tree must have the elements of the old tree plus the inserted item. Although the performance was not great, he stressed that the applications of constraint programming are more along the lines of prototyping new data structures, test case generation, etc. One cool trick was using the fact that a solution is a counterexample to the negation of the constraints...reducing the problem to one in program verification.
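That last trick can be made concrete with a toy example. Below, a naive "verifier" that just searches a small finite domain for counterexamples is asked to refute the negation of a constraint; the counterexample it returns is a solution of the constraint. This is only my illustration of the idea, not Leon's mechanism (which hands the negated constraint to its real verification engine).

```scala
object NegationTrick {
  // Pretend "verifier": returns a counterexample to `claim` over a finite domain, if any.
  def findCounterexample(domain: Seq[Int], claim: Int => Boolean): Option[Int] =
    domain.find(x => !claim(x))

  def main(args: Array[String]): Unit = {
    // Constraint to solve: find x with x*x == 49 and x > 0.
    val constraint: Int => Boolean = x => x * x == 49 && x > 0
    // Ask the verifier to refute "no x satisfies the constraint";
    // the counterexample it finds is exactly a solution of the constraint.
    val solution = findCounterexample(0 to 100, x => !constraint(x))
    println(solution) // Some(7)
  }
}
```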
Finally, he covered the most extreme problem: writing code based on a specification. Here he had some really nice examples, generating code for converting dates and times, simple data structures like sorted lists, etc. in just a few seconds. Many ideas were used, but I was especially impressed by his mention of their work on floating point representations, synthesizing programs that avoided rounding errors.

Andreas Björklund and Thore Husfeldt, Shortest Two Disjoint Paths in Polynomial Time

Andreas gave the talk for this Best Paper award winner in Track A. The problem considered is a very simple one: given an undirected graph and two pairs of vertices (s1, t1) and (s2, t2), find a pair of disjoint paths, one connecting s1 to t1 and the other s2 to t2, whose total length is minimized. Several problems close to this one are NP-hard (e.g. minimizing the maximum of the two path lengths), so placing the problem into P is the major thrust of the paper -- the algorithms given run in O(n^11) and O(n^22) time for vertex- and edge-disjoint paths, respectively.
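To pin down exactly what is being asked (and nothing more), here is the brute-force reference I would write for toy instances: enumerate simple paths for each terminal pair and keep the vertex-disjoint pair with the fewest total edges. It has nothing to do with the paper's algebraic machinery and is exponential in general; the representation and names are my own.

```scala
object TwoDisjointPathsBruteForce {
  // Undirected graph as adjacency sets; every edge appears in both directions.
  type Graph = Map[Int, Set[Int]]

  // All simple paths from s to t (exponential in general -- toy instances only).
  def simplePaths(g: Graph, s: Int, t: Int): Seq[List[Int]] = {
    def go(cur: Int, visited: Set[Int], path: List[Int]): Seq[List[Int]] =
      if (cur == t) Seq(path.reverse)
      else g.getOrElse(cur, Set.empty[Int]).toSeq
            .filterNot(visited)
            .flatMap(next => go(next, visited + next, next :: path))
    go(s, Set(s), List(s))
  }

  // Vertex-disjoint pair of paths (s1 -> t1, s2 -> t2) with fewest total edges.
  def shortestDisjointPair(g: Graph, s1: Int, t1: Int, s2: Int, t2: Int)
      : Option[(List[Int], List[Int])] = {
    val candidates = for {
      p1 <- simplePaths(g, s1, t1)
      p2 <- simplePaths(g, s2, t2)
      if p1.toSet.intersect(p2.toSet).isEmpty // vertex-disjointness
    } yield (p1, p2)
    candidates.sortBy { case (p1, p2) => (p1.length - 1) + (p2.length - 1) }.headOption
  }
}
```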
The approach of the algorithm is almost entirely matrix-based, focusing on solving a cycle cover problem on the input graph with self-loops added to each vertex...except that doesn't quite work. Andreas did a great job of leading us on a journey where obstacle after obstacle to the approach was eliminated. Many of the issues relate to the complexity of computing the permanent...indeed one of the major technical contributions is proving that computing the permanent of a matrix over a certain type of ring is in P. In the end, the necessary permanent computation is replaced with a polynomial number of determinant computations, giving the large (but polynomial) running time.
In perhaps the most natural course of events, someone asked after the talk whether the technique might work for the shortest three disjoint paths. So natural was the question that a followup slide for the question was already prepared. The conclusion was "probably not", but Andreas posed a first problem to try in this direction: given an undirected graph, compute the parity of the number of cycle covers having c cycles, with c divisible by 4.

Mitsuru Kusumoto and Yuichi Yoshida, Testing Forest-Isomorphism in the Adjacency List Model

Mitsuru started by introducing the notion of property testing on graphs: deciding whether the graph has a given property or is far from having it, answering correctly at least 2/3 of the time while using only a small number of queries to the graph. In this setting, even the representation of the graph as an adjacency matrix or adjacency list matters; for example, checking whether a given edge is present takes too long with an adjacency list representation.
Kusumoto and Yoshida considered the problem of testing whether two forests are isomorphic. It was known that in general graphs this problem requires Omega(sqrt(n)) queries to the graphs, and only O(1) queries if the graphs have bounded degree. For forests, they give an algorithm that uses only O(polylog(n)) queries and prove that Omega(sqrt(log n)) queries are needed. The algorithm works by first performing a special type of partitioning of the input graph, then testing whether each subgraph is isomorphic. The partitioning is done in a way that ensures that if the two input forests are far from being isomorphic, then so is some pair of subgraphs in their partitions.
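For concreteness, the adjacency-list query model these bounds are stated in can be sketched as an oracle that only answers "what is the i-th neighbour of vertex v?", with every answer counted against the query budget. The class below is my own illustration of that model, not the authors' code.

```scala
// The tester never sees the whole graph; it may only ask for the i-th neighbour
// of a vertex, and each such call counts as one query.
class AdjacencyListOracle(adj: Vector[Vector[Int]]) {
  private var used = 0L
  def numVertices: Int = adj.length
  def queriesUsed: Long = used
  // i-th neighbour of v (0-based), or None if v has fewer than i+1 neighbours.
  def neighbour(v: Int, i: Int): Option[Int] = {
    used += 1
    adj(v).lift(i)
  }
}
```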

Rom Aschner, Matthew J. Katz, Bounded-Angle Spanning Tree: Modeling Networks with Angular Constraints

Rom gave a nicely visual talk about a problem in wireless network design in the plane. The input is a set of points, and the output is a minimum-weight (Euclidean) spanning tree subject to an additional constraint: at each point, the smallest angle spanning all of its incident edges is at most some fixed constant b (a b-MST). Another way to think about this constraint is that the largest angular gap between consecutive edges around each point must be at least 2π - b.
For b < π/3, a b-MST may not even exist: three points at the vertices of an equilateral triangle require at least one of the points to have two incident edges with interior angle π/3. For b >= 8π/5, the b-MST is simply the minimum spanning tree, as the upper degree bound of 5 on every node in an MST ensures the smallest angle spanning all of the edges at a node is at most 2π - 2π/5 = 8π/5. Finally, for b < 1/2, there exists a simple family of examples (n-1 points placed collinearly ε distance apart and a final point at distance 1 from the shared line) for which the MST has total weight ε * (n - 1) + 1 while the b-MST has total weight 1 * (n - 1) = n - 1. So the non-trivial values of b are those in the interval [1/2, 8π/5).
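The cone reformulation above is easy to check locally. The sketch below (representation and names are my own, not from the paper) tests whether the edges incident to a point fit inside a cone of angle b by looking at the largest angular gap between consecutive edge directions.

```scala
object AngleConstraint {
  final case class Pt(x: Double, y: Double)

  // True iff the edges from p to its neighbours fit inside some cone of angle b,
  // i.e. the largest angular gap between consecutive edge directions is >= 2π - b.
  def satisfiesAngleBound(p: Pt, neighbours: Seq[Pt], b: Double): Boolean =
    if (neighbours.size <= 1) true
    else {
      val dirs = neighbours.map(q => math.atan2(q.y - p.y, q.x - p.x)).sorted
      val gaps = dirs.zip(dirs.tail).map { case (a1, a2) => a2 - a1 } :+
        (2 * math.Pi - (dirs.last - dirs.head)) // wrap-around gap
      gaps.max >= 2 * math.Pi - b
    }

  def main(args: Array[String]): Unit = {
    // Equilateral-triangle example from the text: the shared vertex of the two
    // tree edges needs a cone of π/3, so b = π/4 fails and b = π/2 is fine.
    val a = Pt(0, 0); val c = Pt(1, 0); val top = Pt(0.5, math.sqrt(3) / 2)
    println(satisfiesAngleBound(a, Seq(c, top), math.Pi / 4)) // false
    println(satisfiesAngleBound(a, Seq(c, top), math.Pi / 2)) // true
  }
}
```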
For b in this range, Aschner and Katz prove both NP-hardness and approximation results. They prove that finding a b-MST is NP-hard for both b = π and b = 2π/3, using reductions from Hamiltonian path in square and hexagonal grid graphs, respectively. They also give constant-factor approximation algorithms for b = π, 2π/3, and π/2. The approximation algorithms use a common sequence of steps:
  1. Compute the MST.
  2. Turn the MST into a Hamiltonian cycle via a tour around the boundary of the MST (see the code sketch below).
  3. Decompose the points into consecutive groups along the cycle.
  4. For each group, compute a set of edges connecting the points and minimizing total edge length.
  5. Connect adjacent groups using the shortest edge possible.
Step 4 requires a careful selection of group sizes: larger groups give greater potential for shorter collections of edges, but depending on b, actually connecting all the points of a large group within the angle constraint may not be possible. The resulting approximation ratios are 16 and lower, and can be further reduced by a neat pigeonhole argument about shifting the groupings along the tour.
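Step 2 of the outline is the classic "walk around the tree" construction also used in the 2-approximation for metric TSP: list the points in a depth-first preorder of the MST and close the cycle; by the triangle inequality the resulting cycle weighs at most twice the MST. Here is a minimal sketch of just that step, assuming the MST is already given as adjacency lists (my own illustration, not the authors' code).

```scala
object TreeToCycle {
  // Vertices 0..n-1; `tree` lists each tree edge in both directions.
  def preorderCycle(tree: Map[Int, Seq[Int]], root: Int): Seq[Int] = {
    val order = scala.collection.mutable.LinkedHashSet[Int]() // preserves insertion order
    def dfs(v: Int): Unit = {
      order += v
      for (w <- tree.getOrElse(v, Seq.empty) if !order.contains(w)) dfs(w)
    }
    dfs(root)
    order.toList :+ root // close the Hamiltonian cycle
  }

  def main(args: Array[String]): Unit = {
    // A small tree: edges 0-1, 0-2, 2-3.
    val mst = Map(0 -> Seq(1, 2), 1 -> Seq(0), 2 -> Seq(0, 3), 3 -> Seq(2))
    println(preorderCycle(mst, 0)) // List(0, 1, 2, 3, 0)
  }
}
```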

Canal Tour and Conference Dinner

The day concluded with a tour through the canals of Copenhagen and dinner at the Odd Fellow Palace. The 2014 Gödel Prize was awarded to Ronald Fagin, Amnon Lotem, and Moni Naor. Moni gave a great historical and technical talk on the work for which they received the award, including the threshold algorithm. Ronald Fagin also spoke briefly about Michael Franklin, the "missing fourth author" of the work, and his role in facilitating the successful collaboration.
