Algorithms for Convex Problems



Method of Steepest Descent

This method is also called the Gradient method or Cauchy's method. It is based on the following update rule −

$$x_{k+1}=x_k+\alpha_k d_k$$

$d_k=-\bigtriangledown f\left ( x_k \right )$ or $d_k=-\frac{\bigtriangledown f\left ( x_k \right )}{\left \| \bigtriangledown f\left ( x_k \right ) \right \|}$

Let $\phi \left ( \alpha \right )=f\left ( x_k+\alpha d_k \right )$

By differentiating $\phi$ and equating it to zero, we can get $\alpha$.

So the algorithm goes as follows −

  • Initialize $x_0$, $\varepsilon_1$, $\varepsilon_2$ and set $k=0$.

  • Set $d_k=-\bigtriangledown f\left ( x_k \right )$ or $d_k=-\frac{\bigtriangledown f\left ( x_k \right )}{\left \| \bigtriangledown f\left ( x_k \right ) \right \|}$.

  • Find $\alpha_k$ that minimizes $\phi \left ( \alpha \right )=f\left ( x_k+\alpha d_k \right )$.

  • Set $x_{k+1}=x_k+\alpha_k d_k$.

  • If $\left \| x_{k+1}-x_k \right \|<\varepsilon_1$ or $\left \| \bigtriangledown f\left ( x_{k+1} \right ) \right \|\leq \varepsilon_2$, stop; otherwise set $k=k+1$ and repeat from the second step.

  • The optimal solution is $\hat{x}=x_{k+1}$.
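The steps above can be sketched in Python. This is a minimal illustration, not part of the original text: it assumes a quadratic objective $f(x)=\frac{1}{2}x^TQx-b^Tx$, for which the exact line search in the third step has the closed form $\alpha_k=\frac{g_k^Tg_k}{d_k^TQd_k}$; the function name and tolerances are illustrative.

```python
import numpy as np

def steepest_descent(Q, b, x0, eps1=1e-8, eps2=1e-8, max_iter=1000):
    """Minimize f(x) = 0.5 x^T Q x - b^T x by steepest descent.

    For this quadratic, the exact line search minimizing
    phi(alpha) = f(x + alpha d) is alpha = (g^T g) / (d^T Q d).
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        g = Q @ x - b                      # gradient of f at x
        if np.linalg.norm(g) <= eps2:      # gradient already small: stop
            break
        d = -g                             # steepest-descent direction
        alpha = (g @ g) / (d @ (Q @ d))    # exact line search for the quadratic
        x_new = x + alpha * d
        if np.linalg.norm(x_new - x) < eps1:  # step too small: stop
            x = x_new
            break
        x = x_new
    return x
```

For a general (non-quadratic) $f$, the closed-form $\alpha_k$ would be replaced by a one-dimensional line search on $\phi$.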

Newton Method

Newton's Method is based on the following second-order approximation −

$f\left ( x \right )\approx y\left ( x \right )=f\left ( x_k \right )+\left ( x-x_k \right )^T \bigtriangledown f\left ( x_k \right )+\frac{1}{2}\left ( x-x_k \right )^T H\left ( x_k \right )\left ( x-x_k \right )$

$\bigtriangledown y\left ( x \right )=\bigtriangledown f\left ( x_k \right )+H\left ( x_k \right )\left ( x-x_k \right )$

At $x_{k+1}, \bigtriangledown y\left ( x_{k+1} \right )=\bigtriangledown f\left ( x_k \right )+H\left ( x_k \right )\left ( x_{k+1}-x_k \right )$

For $x_{k+1}$ to be an optimal solution, $\bigtriangledown y\left ( x_{k+1} \right )=0$

Thus, $x_{k+1}=x_k-H\left ( x_k \right )^{-1} \bigtriangledown f\left ( x_k \right )$

Here $H\left ( x_k \right )$ must be non-singular.

Hence the algorithm goes as follows −

Step 1 − Initialize $x_0, \varepsilon$ and set $k=0$.

Step 2 − Compute $H\left ( x_k \right )$ and $\bigtriangledown f\left ( x_k \right )$.

Step 3 − Solve the linear system $H\left ( x_k \right )h\left ( x_k \right )=\bigtriangledown f\left ( x_k \right )$ for $h\left ( x_k \right )$.

Step 4 − Set $x_{k+1}=x_k-h\left ( x_k \right )$.

Step 5 − If $\left \| x_{k+1}-x_k \right \|<\varepsilon$, go to Step 6; otherwise set $k=k+1$ and go to Step 2.

Step 6 − The optimal solution is $\hat{x}=x_{k+1}$.
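Steps 1–6 can be sketched in Python as follows. This is a minimal illustration assuming the caller supplies the gradient and Hessian as functions; the function name and the example objective $f(x)=(x_1-1)^4+x_2^2$ are illustrative, not part of the original text.

```python
import numpy as np

def newton_method(grad, hess, x0, eps=1e-8, max_iter=100):
    """Newton iteration: solve H(x_k) h = grad f(x_k), then x_{k+1} = x_k - h."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)                    # Step 2: gradient at x_k
        H = hess(x)                    # Step 2: Hessian at x_k (must be non-singular)
        h = np.linalg.solve(H, g)      # Step 3: solve H(x_k) h = grad f(x_k)
        x_new = x - h                  # Step 4
        if np.linalg.norm(x_new - x) < eps:  # Step 5: stopping test
            return x_new
        x = x_new
    return x

# Illustrative objective f(x) = (x_1 - 1)^4 + x_2^2, minimized at (1, 0)
grad = lambda x: np.array([4 * (x[0] - 1) ** 3, 2 * x[1]])
hess = lambda x: np.array([[12 * (x[0] - 1) ** 2, 0.0], [0.0, 2.0]])
x_opt = newton_method(grad, hess, np.array([3.0, 1.0]))
```

Solving the linear system in Step 3 is preferred over explicitly forming $H\left ( x_k \right )^{-1}$, as it is cheaper and numerically more stable.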

Conjugate Gradient Method

This method is used for solving problems of the following type −

$\min f\left ( x \right )=\frac{1}{2}x^T Qx-b^T x$

where $Q$ is a positive definite $n\times n$ matrix and $b$ is a constant vector.

Given $x_0, \varepsilon,$ compute $g_0=Qx_0-b$

Set $d_0=-g_0$. For $k=0,1,2,\ldots,$

Set $\alpha_k=\frac{g_{k}^{T}g_k}{d_{k}^{T}Q d_k}$

Compute $x_{k+1}=x_k+\alpha_k d_k$

Set $g_{k+1}=g_k+\alpha_k Qd_k$

Compute $\beta_k=\frac{g_{k+1}^{T}g_{k+1}}{g_{k}^{T}g_k}$

Set $d_{k+1}=-g_{k+1}+\beta_k d_k$.
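The recursion above can be sketched directly in Python. This is a minimal illustration (the function name and tolerance are assumptions); in exact arithmetic the method terminates in at most $n$ steps.

```python
import numpy as np

def conjugate_gradient(Q, b, x0, eps=1e-12):
    """Minimize f(x) = 0.5 x^T Q x - b^T x for symmetric positive definite Q."""
    x = x0.astype(float)
    g = Q @ x - b                  # g_0 = Q x_0 - b
    d = -g                         # d_0 = -g_0
    for _ in range(len(b)):        # at most n steps in exact arithmetic
        if np.linalg.norm(g) <= eps:
            break
        alpha = (g @ g) / (d @ (Q @ d))      # alpha_k
        x = x + alpha * d                    # x_{k+1}
        g_next = g + alpha * (Q @ d)         # g_{k+1} = g_k + alpha_k Q d_k
        beta = (g_next @ g_next) / (g @ g)   # beta_k
        d = -g_next + beta * d               # d_{k+1}
        g = g_next
    return x
```

Note that $g_{k+1}$ is updated with a single extra matrix–vector product $Qd_k$, which is reused from the $\alpha_k$ computation; no additional gradient evaluation is needed.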
