The PSD Relaxation for the Max-Cut Problem

Approximating Max Cut with SDP

Based on the landmark paper of Goemans and Williamson [GW] on Max Cut.
 
The Max Cut Problem

Given an undirected graph G = (V, E), partition V into S and V − S so as to maximize the number of edges with one endpoint in S and the other in V − S.
This problem is NP-complete.
(The minimization version is polynomial-time solvable.)
The approximation below is due to [GW]. A landmark result.
 
PSD matrices

By the Ellipsoid algorithm, to solve a convex program it suffices to check whether a given fractional solution is feasible, and if not, to exhibit a violated constraint (a separation oracle).
But can we pose A ⪰ 0 (A is PSD) as linear constraints? Fix the fractional entries of A and check whether some vector v gives a violated constraint of the form

        vᵀ · A · v ≥ 0.
 
PSD matrices

A negative eigenvalue gives such a violation: if A · v = αv for some α < 0 (assuming v has norm 1), then

        vᵀ · A · v = vᵀ · αv = α < 0

is a violated constraint.
Thus A ⪰ 0 can be decided in polynomial time, and the program can be solved in polynomial time.
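The eigenvalue argument above can be checked numerically. A minimal sketch (using numpy; the example matrix A is my own illustration, not from the slides):

```python
import numpy as np

def psd_separation_oracle(A, tol=1e-9):
    """Check whether the symmetric matrix A is PSD.

    Returns (True, None) if A is PSD, otherwise (False, v) where v is a
    unit vector with v^T A v < 0, i.e. a witness to a violated constraint.
    """
    eigvals, eigvecs = np.linalg.eigh(A)   # eigh: eigendecomposition for symmetric matrices
    if eigvals[0] >= -tol:                 # smallest eigenvalue is nonnegative: A is PSD
        return True, None
    v = eigvecs[:, 0]                      # unit eigenvector of the most negative
    return False, v                        # eigenvalue: v^T A v = eigvals[0] < 0

# This A has eigenvalues 3 and -1, so it is not PSD.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
ok, v = psd_separation_oracle(A)   # ok is False; v @ A @ v is -1
```

The returned v is exactly the violated linear constraint vᵀ · A · v ≥ 0 that the Ellipsoid algorithm needs.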
 
The approximation for MAX-CUT

Let Y = (y_ij) be a symmetric matrix with one variable per pair of vertices. The PSD program is:

        Maximize  ∑_{(i,j)∈E} (1 − y_ij)/2
        subject to  y_ii = 1 for every i,
                    Y ⪰ 0.

We know that Y ⪰ 0 can be checked in polynomial time, so the program can be solved in polynomial time.
 
The PSD program is a relaxation

First, why is it a relaxation of Max-Cut? Since Y ⪰ 0, there is a matrix B so that Bᵀ · B = Y, and so

        (B_j)ᵀ · B_i = y_ij.

The program

        Maximize  ∑_{(i,j)∈E} (1 − y_ij)/2,   Y ⪰ 0

therefore optimizes over all such vector assignments.
 
Let the optimal cut be S, V − S

Let (B_j)ᵀ = (1, 0, 0, …, 0) if j ∈ S
and (B_j)ᵀ = (−1, 0, 0, …, 0) if j ∈ V − S.

For an edge (i, j), the term (1 − y_ij)/2 is 1 if i and j are on different sides, and 0 otherwise. Hence the optimum of the PSD program is at least the size of the maximum cut.
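This one-dimensional embedding can be verified on a small example (a numpy sketch; the 4-cycle graph and the cut S = {0, 2} are illustrative assumptions, not from the slides):

```python
import numpy as np

# A 4-cycle on vertices 0..3 (illustrative example).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
S = {0, 2}  # this cut happens to cut all 4 edges

# One-dimensional embedding from the slide: +e1 for S, -e1 for V - S.
B = np.array([[1.0 if j in S else -1.0] for j in range(4)])  # shape (4, 1)

# y_ij = B_i . B_j; each cut edge contributes (1 - y_ij)/2 = 1,
# so the objective value equals the size of the cut, here 4.
value = sum((1 - B[i] @ B[j]) / 2 for i, j in edges)
```

The corresponding Y = B · Bᵀ is PSD by construction, so every cut yields a feasible solution of matching value.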
 
 
What do we do with high-dimensional vectors?

Consider the matrix B so that (B_j)ᵀ · B_i = y_ij. This gives a vector B_i for every vertex i. Moreover, for every edge (i, j), the PSD program makes (B_j)ᵀ · B_i as close to −1 as it can.
 
For two vectors B_j, B_i

There is a two-dimensional plane that contains these vectors, hence there is a well-defined angle between them.
The PSD program says: take the vectors of an edge (i, j) and make the angle θ between them as close as possible to 180°.
Now say that we choose a random hyperplane through the origin. Because the angle between B_i and B_j is large whenever (i, j) is an edge, there is high probability that the hyperplane will separate B_i and B_j.
 
Analysis

Choose a random hyperplane.
Exercise: restricted to the two-dimensional plane containing B_i and B_j, it is a random line, and so the probability that i and j are separated is 2θ/2π = θ/π, with θ the angle between B_i and B_j.
We need to pose this probability in terms of the PSD program.
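The random-hyperplane rounding described above can be sketched as follows (a minimal numpy sketch; the function names hyperplane_round and cut_value are my own):

```python
import numpy as np

def hyperplane_round(B, rng):
    """Given unit vectors B[i] (one row per vertex), pick a uniformly
    random hyperplane through the origin and return the induced cut:
    S = vertices whose vector lies on the positive side."""
    n, d = B.shape
    r = rng.standard_normal(d)          # normal vector of a random hyperplane
    signs = np.sign(B @ r)              # which side each vertex vector falls on
    return {i for i in range(n) if signs[i] > 0}

def cut_value(edges, S):
    """Number of edges with exactly one endpoint in S."""
    return sum(1 for i, j in edges if (i in S) != (j in S))

# Illustrative example: two antipodal unit vectors (angle 180 degrees)
# are separated by the hyperplane with probability theta/pi = 1.
B = np.array([[1.0, 0.0],
              [-1.0, 0.0]])
rng = np.random.default_rng(0)
S = hyperplane_round(B, rng)
```

Sampling the hyperplane normal from a spherically symmetric Gaussian is what makes its direction uniform, so the θ/π separation probability applies to every edge.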
 
On the plane

Since we chose the vectors of norm 1, the inner product is just cos(θ), with θ the angle between B_j and B_i. The probability that i, j are separated is 2θ/2π = θ/π, where

        θ = arccos((B_j)ᵀ · B_i) = arccos(y_ij).
 
We use the inequality:

        arccos(y_ij)/π ≥ 0.878 · (1 − y_ij)/2   for all −1 ≤ y_ij ≤ 1.

Summing over the edges and using linearity of expectation, this implies a 0.878 approximation ratio.
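The constant 0.878 can be checked numerically by minimizing the ratio of the two sides over y (a numpy sketch; the grid and its resolution are arbitrary choices of mine):

```python
import numpy as np

# ratio(y) = (arccos(y)/pi) / ((1 - y)/2), the per-edge guarantee.
# We stop short of y = 1, where both sides vanish (the ratio blows up there).
y = np.linspace(-1.0, 0.999, 200001)
ratio = (np.arccos(y) / np.pi) / ((1.0 - y) / 2.0)

# The minimum is the Goemans-Williamson constant, about 0.87856,
# attained near y = -0.689 (angle of roughly 134 degrees).
alpha = ratio.min()
```

Since alpha > 0.878, the inequality holds on the whole range, which is exactly what the approximation guarantee needs.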

Exploring the positive semidefinite (PSD) relaxation approach for the Max-Cut problem, which involves formulating the problem with PSD matrices and creating constraints to approximate the optimal solution efficiently. The relaxation technique, through manipulating high-dimensional vectors and angles, aims to maximize the number of cut edges in a graph partition.





