# ALGO 2015 Plenary and Special Event Keynote Talks

Paul Spirakis (University of Liverpool, UK & CTI, University of Patras, GR)

*On the Discrete Dynamics of Probabilistic (Finite) Population Protocols* (Monday 14.09.2015)

**Abstract**: Population Protocols are a recent model of computation that captures the way in which complex behavior of systems can emerge from the underlying local interactions of agents. Agents are usually anonymous and the local interaction rules are scalable (independent of the size, n, of the population). Such protocols can model the antagonism between members of several “species” and relate to evolutionary games.
In the recent past the speaker was involved in joint research studying the discrete dynamics of such protocols for finite populations. Such dynamics are usually probabilistic in nature, either due to the protocol itself or due to the stochastic scheduling of local interactions. Examples are (a) the generalized Moran process (where the protocol is evolutionary because a fitness parameter is crucially involved), (b) the Discrete Lotka-Volterra Population Protocols (and associated Cyclic Games), and (c) the Majority protocols for random interactions.

Such protocols are usually discrete-time transient Markov chains. However, the detailed state description of such chains is exponential in size, and the state equations do not facilitate a rigorous approach. Instead, ideas related to filtering, stochastic domination, and potentials (leading to martingales) help in understanding the dynamics of the protocols.

In the talk we discuss such rigorous approaches and their techniques. We examine the question of fast convergence to an absorbing state (in time polynomial in the population size). We also discuss the question of the “most probable” eventual state of the protocols (and the computation of the probability of such states). Several aspects of such discrete dynamics are wide open, and it seems that algorithmic thinking can contribute to the understanding of this emerging subfield of science.
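As a concrete toy instance of the kind of protocol the abstract discusses, the sketch below simulates the well-known 3-state approximate-majority population protocol under a uniformly random pairwise scheduler. This is an illustration of the model only, not code from the talk; the rule set is one standard variant, and the function name is mine.

```python
import random

def approximate_majority(n_a, n_b, seed=0):
    """Simulate the 3-state approximate-majority population protocol
    under a uniformly random pairwise scheduler.

    States: 'A', 'B', and the undecided blank state 'b'.
    Interaction rules (initiator x meets responder y, y is updated):
      (A, B) -> b and (B, A) -> b   # opposing opinions cancel to blank
      (A, b) -> A and (B, b) -> B   # decided agents recruit blanks
    Returns the consensus opinion and the number of interactions used.
    """
    rng = random.Random(seed)
    pop = ['A'] * n_a + ['B'] * n_b
    steps = 0
    while len(set(pop)) > 1:
        i, j = rng.sample(range(len(pop)), 2)  # a random ordered pair
        x, y = pop[i], pop[j]
        if {x, y} == {'A', 'B'}:
            pop[j] = 'b'
        elif y == 'b' and x in ('A', 'B'):
            pop[j] = x
        steps += 1
    return pop[0], steps
```

Starting from a clear majority (say 60 agents in state 'A' versus 40 in state 'B'), the protocol typically converges to the initial majority opinion; the probabilistic scheduler is exactly the source of the stochastic dynamics the abstract refers to.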

Rasmus Pagh (IT University of Copenhagen, DK)

*Correlated Locality-Sensitive Hashing* (Tuesday 15.09.2015)

**Abstract**:
When designing hashing schemes that use multiple hash functions, it is standard practice to choose these
functions independently. This often makes it possible to show guarantees on correctness or running time that hold with high probability. However, there are situations in which it pays off to intentionally introduce correlation among the hash functions. In this talk we focus on locality-sensitive hashing (LSH), a key tool for high-dimensional similarity search.

A drawback of traditional LSH methods is that they come with the possibility of false negatives, i.e., data points that should have been returned but were “missed” by the search procedure. This has limited the applicability of the method, and many papers considering “exact similarity search”, where there must be certainty about the output, have disregarded the possibility of randomized approaches. Following preliminary work of Arasu et al. (VLDB ’06) we show that carefully correlating hash functions can eliminate false negatives, at least in Hamming space, while achieving performance bounds comparable to those of traditional LSH methods.
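To make the false-negative drawback concrete, here is a minimal sketch of the traditional scheme: bit-sampling LSH for Hamming space with independently chosen tables. A near neighbor may land in no shared bucket of any table, which is exactly the failure mode the talk's correlated construction avoids. The function names and parameters are mine, for illustration only.

```python
import random

def build_lsh(points, k=4, n_tables=8, seed=0):
    """Classic bit-sampling LSH for Hamming space: each table hashes a
    binary vector by concatenating k randomly chosen coordinates.
    With independently chosen tables, a near (but not identical)
    neighbor can miss every bucket of the query -- a false negative."""
    rng = random.Random(seed)
    d = len(points[0])
    tables = []
    for _ in range(n_tables):
        coords = rng.sample(range(d), k)  # the k sampled bit positions
        buckets = {}
        for idx, p in enumerate(points):
            key = tuple(p[c] for c in coords)
            buckets.setdefault(key, set()).add(idx)
        tables.append((coords, buckets))
    return tables

def query(tables, q):
    """Return indices of all points sharing a bucket with q in any table."""
    found = set()
    for coords, buckets in tables:
        key = tuple(q[c] for c in coords)
        found |= buckets.get(key, set())
    return found
```

An identical point always collides in every table, and a point differing in every bit never collides; points at small but nonzero Hamming distance collide only with some probability per table, which is where false negatives arise under independent tables.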

Dimitrios Thilikos (University of Athens, GR & CNRS, LIRMM, FR)

*Bidimensionality and Parameterized Algorithms* (Wednesday 16.09.2015)

**Abstract**: We provide an exposition of the main results of the theory of bidimensionality in parameterized algorithm design.
This theory applies to graph problems that are bidimensional in the sense that (i) their solution value does not increase
when we take minors or contractions of the input graph, and (ii) their solution value for the (triangulated) (k × k)-grid graph
grows as a quadratic function of k. Under certain additional conditions, mainly of a logical and combinatorial nature, such
problems admit subexponential parameterized algorithms and linear kernels when their inputs are restricted to certain
topologically defined graph classes. In this talk we give a formal description of these results in their latest update.

Shlomi Dolev (Ben-Gurion University, IL)

*Communication-less Secure-Multiparty-Computation* (Wednesday 16.09.2015)

**Abstract**:
Several recent results will be overviewed, including: a Turing machine
implementation by secure multiparty computation, a communication-less
implementation of an (accumulating) automaton, a secret-shared database, and
a secret-shared random access machine. These results enable provably
information-theoretically secure, private computations in public clouds.

Burkhard Monien (University of Paderborn, DE)

*The Complexity of Equilibria for Risk-Modeling Valuations* (Wednesday 16.09.2015)

**Abstract**:
We study the complexity of deciding the existence of mixed equilibria for minimization games where
players use valuations other than expectation to evaluate their costs. We consider risk-averse players
seeking to minimize the sum V = E + R of expectation E and a risk valuation R of their costs, where R
is non-negative and vanishes exactly when the cost incurred to a player is constant over all choices
of strategies by the other players. In a V-equilibrium, no player can unilaterally reduce her cost.

Say that V has the Weak-Equilibrium-for-Expectation property if all strategies supported in a player's best-response mixed strategy incur the same conditional expectation of her cost. We introduce E-strict concavity and observe that every E-strictly concave valuation has the Weak-Equilibrium-for-Expectation property. We focus on a broad class of valuations shown to have the Weak-Equilibrium-for-Expectation property, which we exploit to prove a complexity result for the smallest case of two players.

Deciding the existence of a V-equilibrium is NP-hard when R is chosen as (1) the variance Var, or (2) the standard deviation SD, or when V is chosen as (3) a convex combination of E + Var and a certain concave ν-valuation. We present a very general reduction, relying on some natural continuity and monotonicity properties of R.
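For illustration, the following minimal sketch evaluates the valuation V = E + R with the choice R = Var from the list above, for the row player of a two-player cost game under given mixed strategies. It is a direct transcription of the definition, not code from the talk.

```python
from itertools import product

def valuation_V(cost, p, q):
    """V = E + Var of the row player's cost in a 2-player minimization
    game: cost[i][j] is the row player's cost at pure profile (i, j),
    p is the row player's mixed strategy, q the column player's."""
    E = sum(p[i] * q[j] * cost[i][j]
            for i, j in product(range(len(p)), range(len(q))))
    E2 = sum(p[i] * q[j] * cost[i][j] ** 2
             for i, j in product(range(len(p)), range(len(q))))
    return E + (E2 - E * E)  # Var = E[X^2] - (E[X])^2
```

When the cost is constant over all profiles, Var vanishes and V = E, matching the requirement that R be non-negative and vanish exactly on constant costs; a cost matrix like [[0, 2], [2, 0]] under uniform mixing has E = 1 and Var = 1, so V = 2.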

Kurt Mehlhorn (Max Planck Institute for Informatics, DE)

*Computing Equilibrium Prices in Linear Arrow-Debreu Markets* (Wednesday 16.09.2015)

**Abstract**: We report on recent advances in computing equilibrium prices in linear
Arrow-Debreu markets. We discuss new algorithms and an implementation effort.

Ralf Borndoerfer (Zuse Institute Berlin, DE)

*Hypergraphs in Traffic Optimization* (Thursday 17.09.2015)

**Abstract**:
Traffic optimization is intimately related to algorithmic graph theory,
which provides elegant solutions to problems ranging from network design
to vehicle rotation planning. Extending these approaches to a hypergraph
setting is a natural next step that makes it possible to deal, in a
mathematically appealing way, with complex types of constraints beyond
the node-edge level. The talk illustrates the potential of hypergraph
models with two examples from line planning and railway vehicle rotation
planning. Line planning gives rise to the Steiner path connectivity
problem, a generalization of the Steiner tree problem to hypergraphs, while
railway vehicle rotation planning leads to the consideration of
hyperassignment problems. These models, their theory, and algorithmic
solution will be discussed.

Virginia Vassilevska Williams (Stanford University, USA)

*The Strong Exponential Time Hypothesis and Hardness for Easy Problems* (Thursday 17.09.2015)

**Abstract**: The Strong Exponential Time Hypothesis (SETH) is one of the most popular conjectures in complexity theory. It concerns the time complexity of CNF-SAT, roughly stating that the brute-force algorithm is essentially optimal. Recent research has shown that by assuming SETH, one can explain the lack of progress on many problems, both "hard" (e.g. NP-hard) and "easy" (polynomial-time solvable). That is, many problems have a natural algorithm, discovered as early as the 1960s, with only tiny runtime improvements since its discovery; via tight reductions from k-SAT one can show that, for many problems, a nontrivial improvement over this natural algorithm would break SETH. The set of such "SETH-hard" problems has grown to include easy problems such as estimating the diameter of a sparse graph, computing the longest common subsequence or the edit distance of two sequences, dynamically maintaining the strongly connected components of a graph, and many more. The goal of this talk is to provide an introduction to this line of research.
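As one concrete example from the list above, here is the textbook quadratic-time dynamic program for edit distance, the kind of "natural algorithm" the abstract refers to; SETH-hardness results of this flavor say that improving it to strongly subquadratic time would refute SETH. This is an illustrative sketch, not material from the talk.

```python
def edit_distance(a, b):
    """Textbook O(n*m) dynamic program for the edit distance between
    strings a and b (insertions, deletions, substitutions)."""
    n, m = len(a), len(b)
    prev = list(range(m + 1))  # DP row for the empty prefix of a
    for i in range(1, n + 1):
        cur = [i] + [0] * m
        for j in range(1, m + 1):
            cur[j] = min(
                prev[j] + 1,      # delete a[i-1]
                cur[j - 1] + 1,   # insert b[j-1]
                prev[j - 1] + (a[i - 1] != b[j - 1]),  # substitute/match
            )
        prev = cur
    return prev[m]
```

For two strings of length n this runs in Θ(n²) time, essentially unimproved since its discovery, which is exactly what the tight reductions from k-SAT explain.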

Jochen Koenemann (University of Waterloo, CA)

*Recent News for an Old Steiner Tree Formulation* (Friday 18.09.2015)

**Abstract**:
The Steiner tree problem is a fundamental network design problem where
the goal is to compute a minimum-cost tree spanning a collection of
terminals in a given input graph. In this talk I will report on some
recent progress on several variants of the problem, all of which stems
from new insights into an old directed formulation for optimal
branchings due to Edmonds.

Thomas Kesselheim (Max Planck Institute for Informatics, DE)

*Online Packing Beyond the Worst Case* (Friday 18.09.2015)

**Abstract**:
For a number of online optimization problems, standard worst-case
competitive analysis is very pessimistic or even pointless.
Sometimes, even a trivial algorithm might be considered optimal
because an adversary would be able to trick any algorithm.

An interesting way to avoid these pathological effects is to slightly reduce the power of the adversary by introducing stochastic components. For example, the adversary might still define the instance but not the order in which it is presented to the algorithm. This order is drawn uniformly at random from all possible permutations.

In this talk, we consider online packing problems and show that this small transition beyond worst-case analysis can have a big impact. We focus on the online independent-set problem in graph classes motivated by wireless networks and on online packing LPs, which among other applications also play a big role in web advertising.
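The flavor of the random-order model can be seen in the classical secretary problem, arguably the simplest online selection problem of this kind: the adversary fixes the values, but the arrival order is a uniformly random permutation. The sketch below is the textbook observe-then-commit rule, given purely as an illustration of the model, not as an algorithm from the talk.

```python
import math
import random

def secretary(values, seed=0):
    """Classical secretary rule in the random-order model: observe the
    first n/e arrivals without accepting, then accept the first later
    arrival that beats everything seen so far (irrevocably).  If none
    does, we are forced to take the last arrival."""
    rng = random.Random(seed)
    order = list(values)
    rng.shuffle(order)  # the adversary's values in a random arrival order
    n = len(order)
    cutoff = max(1, int(n / math.e))
    best_seen = max(order[:cutoff])
    for v in order[cutoff:]:
        if v > best_seen:
            return v    # accept and stop
    return order[-1]    # forced acceptance of the final arrival
```

Against an adversarial arrival order no such rule can do well, but under a random order this rule picks the overall best value with probability about 1/e; the talk's results show analogous gains for much richer packing problems.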