At ICS 2011, Yuan Zhou from CMU presented a new algorithm for finding bisections in an almost-bipartite graph. He raised an interesting open question to me, which motivated me to go through his paper and two related ones. The topic is mainly as follows: given a graph which is guaranteed to have a bisection cutting a $latex 1-\epsilon$ fraction of the edges, how well can we approximately find some (possibly different) bisection cutting a $latex 1-g(\epsilon)$ fraction of the edges? It has already been proved, based on UGC-hardness, that $latex g(\epsilon)$ cannot be smaller than $latex \Omega(\sqrt{\epsilon})$. Zhou et al.'s paper gives a result of $latex g(\epsilon)=O(\epsilon^{1/3}\log(1/\epsilon))$, while two previous results focus on a different benchmark, namely approximating the maximum bisection value within a constant factor. The latter will be covered in a different post tomorrow.

### Algorithm

The main idea of the algorithm is as follows. After the preprocessing, we can assume that the input graph is already bipartite, where the sizes of the two parts are not necessarily the same. This loses only an $latex O(\sqrt{\epsilon})$ fraction of the edges: it uses the well-known Goemans–Williamson SDP relaxation, which is guaranteed to find a cut of value $latex 1-O(\sqrt{\epsilon})$ in any graph that has a cut of value $latex 1-\epsilon$, and deleting the uncut edges leaves a bipartite graph. The algorithm later makes extensive use of this bipartite structure to find a good bisection.
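To make the preprocessing concrete, here is a minimal sketch of Goemans–Williamson rounding in Python, assuming a dense `numpy` adjacency matrix and using `cvxpy` for the SDP (both are my choices for illustration; the paper only needs the black-box $latex 1-O(\sqrt{\epsilon})$ guarantee):

```python
import numpy as np
import cvxpy as cp

def gw_cut(adj):
    """Goemans-Williamson: solve the MAX-CUT SDP, then round by a random
    hyperplane.  On a graph with a cut of value 1 - eps, the rounded cut has
    value 1 - O(sqrt(eps)); deleting the edges it fails to cut leaves the
    bipartite graph used by the rest of the algorithm."""
    n = adj.shape[0]
    X = cp.Variable((n, n), PSD=True)
    # Relaxed cut value: sum over edges of (1 - <x_u, x_v>) / 2; each edge of
    # the symmetric adjacency matrix is counted twice, hence the division by 4.
    obj = cp.Maximize(cp.sum(cp.multiply(adj, 1 - X)) / 4)
    cp.Problem(obj, [cp.diag(X) == 1]).solve()
    # Recover unit vectors from the PSD solution and cut by a random hyperplane.
    vecs = np.linalg.cholesky(X.value + 1e-8 * np.eye(n))
    side = (vecs @ np.random.randn(n)) >= 0
    return side  # side[v] tells which part of the (near-)bipartition v is in
```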

The first step is the decomposition step. The algorithm partitions the nodes into $latex V_1,\dots,V_k$, where each induced subgraph $latex G[V_i]$ is either a $latex \lambda$-expander or a small enough subgraph s.t. $latex \mathrm{vol}(V_i)\le\epsilon\cdot\mathrm{vol}(V)$. Here $latex \mathrm{vol}(V_i)$ is the total summation of the degrees of the nodes in $latex V_i$. This decomposition ensures that the edges crossing different $latex V_i$'s are on the magnitude of an $latex O(\sqrt{\lambda}\log(1/\epsilon))$ fraction of all edges. To achieve this requires Cheeger's inequality and a subroutine which guarantees that if a graph has small conductance (low $latex \phi(G)$), one can divide it into two parts such that the cross edges are bounded by an $latex O(\sqrt{\lambda})$ fraction of the smaller part's volume. Then we simply need to run this subroutine again and again to achieve the desired decomposition; since a piece stops splitting once its volume drops below $latex \epsilon\cdot\mathrm{vol}(V)$, each node participates in roughly $latex O(\log(1/\epsilon))$ rounds, which gives the $latex \log(1/\epsilon)$ factor. A sketch of the recursion is below.
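Here is a minimal sketch of the recursive splitting, with a Cheeger sweep cut standing in for the low-conductance subroutine; the dense-matrix representation, the conductance test, and the stopping rule are simplifications of mine:

```python
import numpy as np

def sweep_cut(adj):
    """Cheeger sweep: sort nodes by the second eigenvector of the normalized
    Laplacian and return the prefix cut of smallest conductance."""
    deg = adj.sum(axis=1)
    d = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    lap = np.eye(len(deg)) - d[:, None] * adj * d[None, :]
    order = np.argsort(np.linalg.eigh(lap)[1][:, 1])
    total, vol, cut = adj.sum(), 0.0, 0.0
    inside = np.zeros(len(order), dtype=bool)
    best_phi, best = np.inf, None
    for v in order[:-1]:
        inside[v] = True
        vol += deg[v]
        cut += deg[v] - 2 * adj[v, inside].sum()  # update cut as v switches side
        phi = cut / max(min(vol, total - vol), 1e-12)
        if phi < best_phi:
            best_phi, best = phi, inside.copy()
    return best_phi, best

def decompose(adj, nodes, lam, small_vol, pieces):
    """Split recursively until each piece has conductance >= lam (an expander,
    by this proxy test) or small volume; 'pieces' accumulates the V_i's."""
    sub = adj[np.ix_(nodes, nodes)]
    if sub.sum() <= small_vol or len(nodes) == 1:
        pieces.append(nodes)
        return
    phi, side = sweep_cut(sub)
    if phi >= lam:
        pieces.append(nodes)
        return
    decompose(adj, nodes[side], lam, small_vol, pieces)
    decompose(adj, nodes[~side], lam, small_vol, pieces)
```

Driving it with `pieces = []` and `decompose(adj, np.arange(n), lam, small_vol, pieces)` produces the node sets playing the role of the $latex V_i$'s.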

In the second and the third steps, I will assume that if the optimal bisection cut is $latex (S,T)$, then for each subgraph $latex V_i$, the size $latex s_i=|S\cap V_i|$ is known to the algorithm. In the fourth step, I will state that we can use dynamic programming to remove this assumption.

The second step is to process the $latex \lambda$-expanders $latex V_i$. The algorithm is guaranteed to find, in polynomial time, some cut $latex (A_i,B_i)$ of $latex V_i$ such that it is an almost-perfect cut and satisfies $latex |A_i|=s_i$. This is because we eventually need to find an almost-perfect bisection, so we want this partial cut on subgraph $latex V_i$ to be as close to the induced cut $latex (S\cap V_i, T\cap V_i)$ as possible; in particular, they must have the same number of nodes on each side.

Using the definition of an expander graph, one can show that if there are two almost-perfect cuts in some $latex \lambda$-expander graph, then the two cuts differ on a set of nodes of volume at most $latex O(\mathrm{uncut}/\lambda)$, where $latex \mathrm{uncut}$ counts the edges left uncut by either cut: every edge leaving the symmetric difference of the two cuts is uncut by one of them, and expansion converts this edge count into a volume bound. Because we already have a perfect cut for $latex V_i$, namely its bipartition $latex (X_i,Y_i)$, and the global optimal bisection induces another almost-perfect cut $latex (S\cap V_i, T\cap V_i)$, the two cannot be too far away. The major lemma states that we can move some low-degree nodes from $latex X_i$ to $latex Y_i$ (or possibly the other way) to make another almost-perfect cut $latex (A_i,B_i)$ with $latex |A_i|=s_i$, while guaranteeing that $latex \mathrm{uncut}(A_i,B_i)\le O(\mathrm{uncut}(S\cap V_i,T\cap V_i)/\lambda)$.
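A toy version of the node-moving step (greedily moving lowest-degree nodes is my simplification; the lemma's actual selection and accounting are more careful):

```python
def fix_size(A, B, deg, target):
    """Move lowest-degree nodes across an almost-perfect cut of an expander
    piece until |A| == target.  Each move uncuts at most deg[v] edges, so a
    few low-degree moves keep the cut almost perfect."""
    A, B = set(A), set(B)
    while len(A) > target:
        v = min(A, key=deg.__getitem__)   # cheapest node to give up
        A.remove(v); B.add(v)
    while len(A) < target:
        v = min(B, key=deg.__getitem__)
        B.remove(v); A.add(v)
    return A, B
```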

The third step is to process the small subgraphs $latex V_i$. Assume the optimal bisection satisfies $latex \mathrm{uncut}(S\cap V_i,T\cap V_i)=\epsilon_i\cdot\mathrm{vol}(V_i)$; then the algorithm in poly time finds some other cut $latex (A_i,B_i)$ satisfying $latex |A_i|=s_i$ and $latex \mathrm{uncut}(A_i,B_i)\le O(\sqrt{\epsilon_i})\cdot\mathrm{vol}(V_i)+\mathrm{uncut}(S\cap V_i,T\cap V_i)/\Delta$. Notice that the latter bound, summed up over all small $latex V_i$'s, is at most $latex O(\sqrt{\epsilon})\cdot\mathrm{vol}(V)+\mathrm{uncut}(S,T)/\Delta$ by the concavity of the square root.

The technique to obtain this is similar to the original SDP of Goemans–Williamson for MAX-CUT, and the $latex O(\sqrt{\epsilon_i})$ term comes directly from that algorithm. The extra term of $latex \mathrm{uncut}(S,T)/\Delta$ (with $latex \Delta$ a parameter of the rounding) is due to the requirement to make $latex \big||A_i|-s_i\big|$ as small as possible (in fact $latex 0$). Unlike the previous rounding, which puts $latex v$ on one side if $latex \langle x_v,g\rangle\ge 0$ and on the other side otherwise, this new algorithm divides the nodes into a bunch of pairs of sets $latex (P_t,P_{-t})$ according to the real number $latex \langle x_v,g\rangle$. For each pair, the two sets are antipodal, and the algorithm picks the smaller one of them to be in the set $latex A_i$. Details ignored here.
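Here is my rough reading of that rounding as a sketch; the antipodal pairing by $latex |\langle x_v,g\rangle|$ follows the description above, while the interval width is a made-up knob for illustration:

```python
import numpy as np

def antipodal_round(vecs, width=0.1):
    """Project the SDP vectors onto a random Gaussian direction g, group the
    nodes into antipodal interval pairs (t, -t) of the projection, and from
    each pair send the SMALLER set to the A side, which keeps |A_i| from
    overshooting."""
    proj = vecs @ np.random.randn(vecs.shape[1])
    buckets = {}
    for v, t in enumerate(proj):
        pair = buckets.setdefault(int(abs(t) / width), ([], []))
        pair[0 if t < 0 else 1].append(v)   # index 0: t < 0, index 1: t >= 0
    A = []
    for neg, pos in buckets.values():
        A.extend(pos if len(pos) <= len(neg) else neg)  # smaller side -> A
    return set(A)
```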

The fourth step is simply a dynamic programming algorithm. Since we have assumed the exact numbers $latex s_i=|S\cap V_i|$ (or equivalently $latex |T\cap V_i|$) to be known in the previous steps, we actually need a dynamic programming technique to enumerate over all such possibilities. The state stores the maximum number of cut edges if we consider the first $latex i$ subgraphs $latex V_1,\dots,V_i$ and have a cut that places exactly $latex s$ nodes on one side. This DP algorithm is straightforward, and based on our assumptions in the previous two steps, there must exist a state which is very close to the behavior of the optimal bisection $latex (S,T)$.
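A minimal sketch of this DP, assuming each piece reports, for every feasible split size, the best partial cut found in steps two and three (the interface is hypothetical):

```python
def combine(piece_options, half):
    """piece_options[i] maps 'number of nodes of piece i placed on side A' to
    the number of edges that choice cuts inside the piece.  dp[s] is the max
    number of cut edges over the pieces processed so far, with exactly s
    nodes committed to side A."""
    dp = {0: 0}
    for options in piece_options:
        nxt = {}
        for s, cut in dp.items():
            for a, c in options.items():
                if nxt.get(s + a, -1) < cut + c:
                    nxt[s + a] = cut + c
        dp = nxt
    return dp.get(half)  # value of the best bisection assembled from the pieces
```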

### Remark

The final theorem states that the fraction of uncut edges is bounded by $latex O(\sqrt{\lambda}\log(1/\epsilon))+O(\sqrt{\epsilon})+O(\epsilon/\lambda)$. The bottleneck is the first and the third terms, which are introduced by the cross edges between the decomposition pieces and the uncut edges within the expander graphs. Choosing appropriate parameters ends the proof.
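For concreteness, the balancing under the reconstruction of the bound above:

```latex
\underbrace{O\big(\sqrt{\lambda}\,\log(1/\epsilon)\big)}_{\text{cross edges}}
+ \underbrace{O\big(\sqrt{\epsilon}\big)}_{\text{preprocessing, small pieces}}
+ \underbrace{O\big(\epsilon/\lambda\big)}_{\text{uncut inside expanders}}
\;=\; O\big(\epsilon^{1/3}\log(1/\epsilon)\big)
\quad\text{at}\quad \lambda=\epsilon^{2/3}.
```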

The paper also provides an improved algorithm, which states that for any **constant** tolerance $latex \delta$, if we allow our output cut $latex (A,B)$ to be s.t. $latex |A|\in(\frac{1}{2}-\delta,\frac{1}{2}+\delta)\cdot|V|$, then we can obtain an $latex O(\sqrt{\epsilon}\log(1/\epsilon))$ approximation instead of $latex O(\epsilon^{1/3}\log(1/\epsilon))$. The rough idea is to decompose into $latex (\lambda,\delta)$-expanders instead of $latex \lambda$-expanders. This definition roughly requires that every bisection with tolerance $latex \delta$ cuts more than a $latex \lambda$ fraction of the volume. This is a weaker definition because we no longer require expansion of every partition, but only of bisections within the tolerance. Using the traditional SDP (with modifications) directly, one can find a good enough bisection for all such $latex (\lambda,\delta)$-expanders. This technique removes the third term in the result above, but requires a very complicated analysis, which I omit here.