We define a slice in a graph $G = (V, E)$ as a partition of its vertex set $V$ into two disjoint sets $A$ and $B$. An edge of $G$ is said to be sliced if one of its ends is in $A$ and the other is in $B$. Suppose that we want to maximize the number of sliced edges. We use the following approximation algorithm:
Assign every vertex to $A$ or $B$, uniformly and independently at random.
In expectation, the randomized algorithm is no more than $c$ times worse than the optimum. What is $c$?
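A minimal sketch of the randomized assignment in Python (the function name `random_slice` and the toy 4-cycle graph are illustrative assumptions, not part of the problem):

```python
import random

def random_slice(vertices, edges):
    """Assign each vertex to side A or B uniformly at random and
    count how many edges are sliced (one end in each side)."""
    side = {v: random.choice(("A", "B")) for v in vertices}
    sliced = sum(1 for u, v in edges if side[u] != side[v])
    return side, sliced

# Example: a 4-cycle has a maximum slice of 4; a single random
# assignment slices about half of the edges in expectation.
vertices = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
assignment, count = random_slice(vertices, edges)
print(assignment, count)
```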
The claim stated in this problem is wrong. The randomized algorithm presented here guarantees that, in expectation, it achieves at least half of the optimum. More precisely,

$$\mathbb{E}(\text{Cut}) \ge \frac{1}{2}\,\text{Max-Cut},$$

where I use the standard term Cut in place of Slice.

The proof is simple and follows from linearity of expectation. For any edge $(i, j) \in E$, the edge is included in the cut iff the vertices $i$ and $j$ land in different partitions, which happens with probability $\frac{1}{2}$. Thus,

$$\text{Cut} = \sum_{e \in E} \mathbf{1}(e \in \text{Cut}),$$

and therefore

$$\mathbb{E}(\text{Cut}) = \sum_{e \in E} \frac{1}{2} = \frac{1}{2}\,|E| \ge \frac{1}{2}\,\text{Max-Cut},$$

where the last inequality follows because $|E|$ is a trivial upper bound on Max-Cut. This proves the result.

Note: Max-Cut is a famous hard problem for which no polynomial-time algorithm is known that solves it exactly. The best known polynomial-time approximation algorithm for Max-Cut has an approximation factor of $\approx 0.878$ and has very deep connections with complexity theory. See the wiki article.
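As a quick sanity check on the expectation bound, here is a small Monte Carlo estimate (an illustrative sketch; the helper `expected_cut_estimate` and the choice of the complete graph $K_4$ are assumptions, not from the problem). For $K_4$ we have $|E| = 6$, so the estimate should come out near $\frac{1}{2}|E| = 3$:

```python
import random

def expected_cut_estimate(vertices, edges, trials=10000):
    """Estimate E(Cut) for the uniform random assignment by averaging
    the cut size over many independent random assignments."""
    total = 0
    for _ in range(trials):
        side = {v: random.random() < 0.5 for v in vertices}
        total += sum(1 for u, v in edges if side[u] != side[v])
    return total / trials

# Complete graph K_4: |E| = 6, so the estimate should be close to 3.
vertices = list(range(4))
edges = [(i, j) for i in range(4) for j in range(i + 1, 4)]
print(expected_cut_estimate(vertices, edges))  # ~3.0
```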