Convex Optimization – Direction

Let $S$ be a closed convex set in $\mathbb{R}^n$. A non-zero vector $d \in \mathbb{R}^n$ is called a direction of $S$ if for each $x \in S$, $x+\lambda d \in S, \forall \lambda \geq 0$.

Two directions $d_1$ and $d_2$ of $S$ are called distinct if $d_1 \neq \alpha d_2$ for every $\alpha>0$.

A direction $d$ of $S$ is said to be an extreme direction if it cannot be written as a positive linear combination of two distinct directions, i.e., if $d=\lambda_1 d_1+\lambda_2 d_2$ for $\lambda_1, \lambda_2>0$, then $d_1=\alpha d_2$ for some $\alpha>0$. Any other direction can be expressed as a positive combination of extreme directions.

For a convex set $S$, a direction $d$ such that $x+\lambda d \in S$ for some $x \in S$ and all $\lambda \geq 0$ is called recessive for $S$.

Let $E$ be the set of points at which a function $f:S \rightarrow \mathbb{R}$ attains its maximum over a non-empty convex set $S$ in $\mathbb{R}^n$. Then $E$ is called an exposed face of $S$. The directions of exposed faces are called exposed directions.

A ray whose direction is an extreme direction is called an extreme ray.

Example

Consider the set $S=\left \{ \left ( x,y \right ) \in \mathbb{R}^2:y \geq \left | x \right | \right \}$, the epigraph of $f\left ( x \right )=\left | x \right |$. A vector $d=\left ( d_1,d_2 \right )$ is a direction of $S$ if and only if $d_2 \geq \left | d_1 \right |$, because then $\left ( x,y \right )+\lambda d \in S$ for every $\left ( x,y \right ) \in S$ and every $\lambda \geq 0$. The extreme directions of $S$ are $\left ( 1,1 \right )$ and $\left ( -1,1 \right )$; every other direction of $S$ is a positive combination of these two.
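The direction property can be illustrated numerically. The snippet below is a finite spot-check over a handful of assumed sample points and step sizes, not a proof, for the epigraph set $S=\left \{ \left ( x,y \right ):y \geq \left | x \right | \right \}$:

```python
import numpy as np

# Finite spot-check (not a proof) of the direction property for the
# closed convex set S = {(x, y) : y >= |x|}, the epigraph of f(x) = |x|.

def in_S(p):
    """Membership test for S."""
    return p[1] >= abs(p[0])

def looks_like_direction(d, samples, lambdas):
    """True if x + lam*d stays in S for all sampled x in S and lam >= 0."""
    return all(in_S(x + lam * np.asarray(d)) for x in samples for lam in lambdas)

samples = [np.array([0.0, 0.0]), np.array([1.0, 2.0]), np.array([-2.0, 3.0])]
lambdas = [0.0, 0.5, 1.0, 10.0]

# (1, 1) and (-1, 1) are the two extreme directions of S; (0, 1) is a
# direction but not extreme, since (0, 2) = (1, 1) + (-1, 1).
assert looks_like_direction([1.0, 1.0], samples, lambdas)
assert looks_like_direction([-1.0, 1.0], samples, lambdas)
assert looks_like_direction([0.0, 1.0], samples, lambdas)
assert not looks_like_direction([1.0, 0.0], samples, lambdas)
```

Passing such a sampled check only suggests that a vector is a direction; the "if and only if" condition $d_2 \geq \left | d_1 \right |$ is what certifies it.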
Pseudoconvex Function

Let $f:S \rightarrow \mathbb{R}$ be a differentiable function and $S$ a non-empty convex set in $\mathbb{R}^n$. Then $f$ is said to be pseudoconvex if for each $x_1,x_2 \in S$ with $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )\geq 0$, we have $f\left ( x_2 \right )\geq f\left ( x_1 \right )$; or equivalently, if $f\left ( x_1 \right )>f\left ( x_2 \right )$, then $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )<0$.

Pseudoconcave function

Let $f:S \rightarrow \mathbb{R}$ be a differentiable function and $S$ a non-empty convex set in $\mathbb{R}^n$. Then $f$ is said to be pseudoconcave if for each $x_1,x_2 \in S$ with $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )\leq 0$, we have $f\left ( x_2 \right )\leq f\left ( x_1 \right )$; or equivalently, if $f\left ( x_1 \right )<f\left ( x_2 \right )$, then $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )>0$.

Remarks

If a function is both pseudoconvex and pseudoconcave, then it is called pseudolinear.

A differentiable convex function is also pseudoconvex, but a pseudoconvex function need not be convex. For example, $f\left ( x \right )=x+x^3$ is not convex, since $f''\left ( x \right )=6x<0$ for $x<0$. It is pseudoconvex: whenever $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )=\left ( 1+3x_{1}^{2} \right )\left ( x_2-x_1 \right ) \geq 0$, we have $x_2 \geq x_1$, hence $x_{2}^{3} \geq x_{1}^{3}$, and therefore $f\left ( x_2 \right )-f\left ( x_1 \right )=\left ( x_2-x_1 \right )+\left ( x_{2}^{3}-x_{1}^{3} \right )\geq 0$, i.e., $f\left ( x_2 \right )\geq f\left ( x_1 \right )$.

A pseudoconvex function is strictly quasiconvex. Thus, every local minimum of a pseudoconvex function is also a global minimum.
Strictly pseudoconvex function

Let $f:S \rightarrow \mathbb{R}$ be a differentiable function and $S$ a non-empty convex set in $\mathbb{R}^n$. Then $f$ is said to be strictly pseudoconvex if for each pair of distinct points $x_1,x_2 \in S$ with $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )\geq 0$, we have $f\left ( x_2 \right )> f\left ( x_1 \right )$; or equivalently, if $f\left ( x_1 \right )\geq f\left ( x_2 \right )$, then $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )<0$.

Theorem

Let $f$ be a pseudoconvex function and suppose $\bigtriangledown f\left ( \hat{x} \right )=0$ for some $\hat{x} \in S$. Then $\hat{x}$ is a global optimal solution of $f$ over $S$.

Proof

Let $\hat{x}$ be a critical point of $f$, i.e., $\bigtriangledown f\left ( \hat{x} \right )=0$. Since $f$ is a pseudoconvex function, for each $x \in S$ we have

$$\bigtriangledown f\left ( \hat{x} \right )^T\left ( x-\hat{x} \right )=0\geq 0 \Rightarrow f\left ( \hat{x} \right )\leq f\left ( x \right ), \forall x \in S$$

Hence, $\hat{x}$ is a global optimal solution.

Remark − If $f$ is a strictly pseudoconvex function, $\hat{x}$ is the unique global optimal solution.

Theorem

If $f$ is a differentiable pseudoconvex function over $S$, then $f$ is both strictly quasiconvex and quasiconvex.

Remarks

The sum of two pseudoconvex functions defined on an open set $S$ of $\mathbb{R}^n$ may not be pseudoconvex.

Let $f:S \rightarrow \mathbb{R}$ be a quasiconvex function and $S$ a non-empty convex subset of $\mathbb{R}^n$. Then $f$ is pseudoconvex if and only if every critical point is a global minimum of $f$ over $S$.

Let $S$ be a non-empty convex subset of $\mathbb{R}^n$ and $f:S \rightarrow \mathbb{R}$ a function such that $\bigtriangledown f\left ( x \right )\neq 0$ for every $x \in S$. Then $f$ is pseudoconvex if and only if it is quasiconvex.
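A minimal illustration of the theorem, using $f\left ( x \right )=x^2$ (convex, hence pseudoconvex): its critical point $\hat{x}=0$ is the global minimizer. The check below samples a grid, which is an arbitrary choice for illustration:

```python
import numpy as np

# Illustration of the theorem with f(x) = x^2, which is convex and
# therefore pseudoconvex: the critical point x_hat = 0 is global optimal.

f = lambda x: x ** 2
df = lambda x: 2 * x

x_hat = 0.0
assert df(x_hat) == 0.0                      # x_hat is a critical point
grid = np.linspace(-5, 5, 101)
assert all(f(x_hat) <= f(x) for x in grid)   # global minimum on the grid
```

Since $f$ here is strictly pseudoconvex as well, the remark applies and $\hat{x}=0$ is the unique global optimal solution.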
Convex Optimization – Conic Combination

A point of the form $\alpha_1x_1+\alpha_2x_2+\ldots+\alpha_nx_n$ with $\alpha_1, \alpha_2,\ldots,\alpha_n\geq 0$ is called a conic combination of $x_1, x_2,\ldots,x_n$.

If $x_i$ are in a convex cone $C$, then every conic combination of the $x_i$ is also in $C$. A set $C$ is a convex cone if it contains all conic combinations of its elements.

Conic Hull

The conic hull of a set $S$ is the set of all conic combinations of points of $S$, denoted by $coni\left ( S \right )$. Thus,

$$coni\left ( S \right )=\left \{ \displaystyle\sum\limits_{i=1}^k \lambda_ix_i:x_i \in S,\lambda_i\in \mathbb{R}, \lambda_i\geq 0,i=1,2,\ldots,k \right \}$$

The conic hull is a convex set, and the origin always belongs to the conic hull.
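A small sketch of conic-hull membership: with two linearly independent generators in $\mathbb{R}^2$, the coefficients of a conic combination are unique, so membership reduces to solving a $2\times 2$ linear system and checking that the coefficients are non-negative. The generators below are arbitrary choices for illustration:

```python
import numpy as np

# For generators x1 = (1, 0) and x2 = (1, 2), a point p lies in
# coni({x1, x2}) iff p = l1*x1 + l2*x2 with l1, l2 >= 0.  The generators
# are linearly independent, so l1, l2 are unique.

X = np.array([[1.0, 1.0],
              [0.0, 2.0]])          # columns are the generators x1, x2

def in_conic_hull(p):
    lam = np.linalg.solve(X, p)     # unique conic coefficients
    return bool(np.all(lam >= -1e-12))

assert in_conic_hull(np.array([0.0, 0.0]))   # the origin is always in coni(S)
assert in_conic_hull(np.array([3.0, 2.0]))   # = 2*x1 + 1*x2
assert not in_conic_hull(np.array([-1.0, 0.0]))
```

For more than $n$ generators in $\mathbb{R}^n$ the coefficients are no longer unique, and membership is typically decided with a linear-programming feasibility test instead.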
Convex Optimization – Convex Set

Let $S\subseteq \mathbb{R}^n$. A set $S$ is said to be convex if the line segment joining any two points of $S$ also belongs to $S$, i.e., if $x_1,x_2 \in S$, then $\lambda x_1+\left ( 1-\lambda \right )x_2 \in S$ for every $\lambda \in\left ( 0,1 \right )$.

Note − The union of two convex sets may or may not be convex. The intersection of two convex sets is always convex.

Proof

Let $S_1$ and $S_2$ be two convex sets, and let $S_3=S_1 \cap S_2$. Let $x_1,x_2 \in S_3$. Since $S_3=S_1 \cap S_2$, we have $x_1,x_2 \in S_1$ and $x_1,x_2 \in S_2$. Since $S_i$ is a convex set for each $i \in \left \{ 1,2 \right \}$, $\lambda x_1+\left ( 1-\lambda \right )x_2 \in S_i$ for every $\lambda \in \left ( 0,1 \right )$. Therefore, $\lambda x_1+\left ( 1-\lambda \right )x_2 \in S_1\cap S_2=S_3$. Hence, $S_3$ is a convex set.

A weighted average of the form $\displaystyle\sum\limits_{i=1}^k \lambda_ix_i$, where $\displaystyle\sum\limits_{i=1}^k \lambda_i=1$ and $\lambda_i\geq 0, \forall i \in \left [ 1,k \right ]$, is called a convex combination of $x_1,x_2,\ldots,x_k$.

A weighted average of the form $\displaystyle\sum\limits_{i=1}^k \lambda_ix_i$, where $\displaystyle\sum\limits_{i=1}^k \lambda_i=1$, is called an affine combination of $x_1,x_2,\ldots,x_k$.

A weighted average of the form $\displaystyle\sum\limits_{i=1}^k \lambda_ix_i$ is called a linear combination of $x_1,x_2,\ldots,x_k$.

Examples

Step 1 − Prove that the set $S=\left \{ x \in \mathbb{R}^n:Cx\leq \alpha \right \}$ is a convex set.

Solution

Let $x_1,x_2 \in S$, so $Cx_1\leq \alpha$ and $Cx_2\leq \alpha$.

To show: $y=\lambda x_1+\left ( 1-\lambda \right )x_2 \in S, \forall \lambda \in\left ( 0,1 \right )$

$Cy=C\left ( \lambda x_1+\left ( 1-\lambda \right )x_2 \right )=\lambda Cx_1+\left ( 1-\lambda \right )Cx_2$

$\Rightarrow Cy\leq \lambda \alpha+\left ( 1-\lambda \right )\alpha=\alpha$

$\Rightarrow y\in S$

Therefore, $S$ is a convex set.
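The algebraic argument of Step 1 can be spot-checked by random sampling. This is an illustration, not a proof; the particular $C$ and $\alpha$ below are arbitrary choices:

```python
import numpy as np

# Random spot-check that S = {x : Cx <= alpha} is convex: sample points
# of S and verify that segments between them stay in S.

C = np.array([[1.0, 2.0], [-1.0, 1.0]])   # arbitrary matrix for illustration
alpha = np.array([4.0, 3.0])              # arbitrary right-hand side

def in_S(x):
    return bool(np.all(C @ x <= alpha + 1e-9))

rng = np.random.default_rng(0)
points = [x for x in rng.uniform(-3, 3, size=(200, 2)) if in_S(x)]

# Pair up sampled members of S and test a few convex combinations.
for x1, x2 in zip(points[::2], points[1::2]):
    for lam in (0.25, 0.5, 0.75):
        assert in_S(lam * x1 + (1 - lam) * x2)
```

The check mirrors the proof: $Cy=\lambda Cx_1+\left ( 1-\lambda \right )Cx_2\leq \alpha$, so no sampled segment can leave $S$.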
Step 2 − Prove that the set $S=\left \{ \left ( x_1,x_2 \right )\in \mathbb{R}^2:x_{1}^{2}\leq 8x_2 \right \}$ is a convex set.

Solution

Let $x,y \in S$ with $x=\left ( x_1,x_2 \right )$ and $y=\left ( y_1,y_2 \right )$, so $x_{1}^{2}\leq 8x_2$ and $y_{1}^{2}\leq 8y_2$.

To show − $\lambda x+\left ( 1-\lambda \right )y=\left ( \lambda x_1+\left ( 1-\lambda \right )y_1, \lambda x_2+\left ( 1-\lambda \right )y_2 \right ) \in S$ for every $\lambda \in \left ( 0,1 \right )$

Now, $\left [ \lambda x_1+\left ( 1-\lambda \right )y_1 \right ]^{2}=\lambda^2x_{1}^{2}+\left ( 1-\lambda \right )^2y_{1}^{2}+2\lambda\left ( 1-\lambda \right )x_1y_1$

But $2x_1y_1\leq x_{1}^{2}+y_{1}^{2}$

Therefore, $\left [ \lambda x_1+\left ( 1-\lambda \right )y_1 \right ]^{2}\leq \lambda^2x_{1}^{2}+\left ( 1-\lambda \right )^2y_{1}^{2}+\lambda\left ( 1-\lambda \right )\left ( x_{1}^{2}+y_{1}^{2} \right )$

$\Rightarrow \left [ \lambda x_1+\left ( 1-\lambda \right )y_1 \right ]^{2}\leq \lambda x_{1}^{2}+\left ( 1-\lambda \right )y_{1}^{2}$

$\Rightarrow \left [ \lambda x_1+\left ( 1-\lambda \right )y_1 \right ]^{2}\leq 8\lambda x_2+8\left ( 1-\lambda \right )y_2=8\left [ \lambda x_2+\left ( 1-\lambda \right )y_2 \right ]$

$\Rightarrow \lambda x+\left ( 1-\lambda \right )y \in S$

Step 3 − Show that a set $S \subseteq \mathbb{R}^n$ is convex if and only if for each integer $k$, every convex combination of any $k$ points of $S$ is in $S$.

Solution

Let $S$ be a convex set.
Then, to show: $c_1x_1+c_2x_2+\ldots+c_kx_k \in S$, where $\displaystyle\sum\limits_{i=1}^k c_i=1, c_i\geq 0, \forall i \in \left \{ 1,2,\ldots,k \right \}$

Proof by induction

For $k=1$: $x_1 \in S, c_1=1 \Rightarrow c_1x_1 \in S$

For $k=2$: $x_1,x_2 \in S, c_1+c_2=1$, and since $S$ is a convex set, $c_1x_1+c_2x_2 \in S$.

Suppose every convex combination of $m$ points of $S$ is in $S$, i.e., $c_1x_1+c_2x_2+\ldots+c_mx_m \in S$ whenever $\displaystyle\sum\limits_{i=1}^m c_i=1, c_i \geq 0, \forall i \in \left \{ 1,2,\ldots,m \right \}$

Now, let $x_1,x_2,\ldots,x_m,x_{m+1} \in S$ and let $x=\mu_1x_1+\mu_2x_2+\ldots+\mu_mx_m+\mu_{m+1}x_{m+1}$, where $\displaystyle\sum\limits_{i=1}^{m+1} \mu_i=1$ and $\mu_i\geq 0, \forall i$.

Write $x=\left ( \mu_1+\mu_2+\ldots+\mu_m \right )\frac{\mu_1x_1+\mu_2x_2+\ldots+\mu_mx_m}{\mu_1+\mu_2+\ldots+\mu_m}+\mu_{m+1}x_{m+1}$

Let $y=\frac{\mu_1x_1+\mu_2x_2+\ldots+\mu_mx_m}{\mu_1+\mu_2+\ldots+\mu_m}$

$\Rightarrow x=\left ( \mu_1+\mu_2+\ldots+\mu_m \right )y+\mu_{m+1}x_{m+1}$

Now $y \in S$ by the induction hypothesis, because its coefficients are non-negative and sum to 1.

$\Rightarrow x \in S$, since $S$ is a convex set, $y,x_{m+1} \in S$, and $\left ( \mu_1+\ldots+\mu_m \right )+\mu_{m+1}=1$.

Hence proved by induction.
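The key step of the induction, rewriting an $\left ( m+1 \right )$-point convex combination as a two-point combination of $y$ and $x_{m+1}$, can be verified numerically. The points and weights below are arbitrary choices for illustration:

```python
import numpy as np

# Sketch of the induction step: a convex combination of m+1 = 4 points
# equals a two-point convex combination of y (an m-point convex
# combination) and x_4.

rng = np.random.default_rng(2)
pts = rng.uniform(-1, 1, size=(4, 2))       # x_1 .. x_4 in R^2
mu = np.array([0.1, 0.2, 0.3, 0.4])         # non-negative, sums to 1

x_direct = mu @ pts                         # sum_i mu_i * x_i
s = mu[:3].sum()                            # mu_1 + mu_2 + mu_3
y = (mu[:3] @ pts[:3]) / s                  # m-point convex combination
x_two_step = s * y + mu[3] * pts[3]         # combination of y and x_4

assert np.allclose(x_direct, x_two_step)    # both routes give the same point
```

The decomposition is exactly the identity used in the proof, so the two computations agree up to floating-point rounding.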
Convex Optimization Tutorial

This tutorial introduces various concepts involved in non-linear optimization. Linear programming problems are easy to solve, but most real-world applications involve non-linear constraints, so the scope of linear programming is limited. This tutorial therefore introduces convex sets and convex functions, together with their variants, which can be used to solve many practical optimization problems.

Audience

This tutorial is suited for students who are interested in solving various optimization problems. These concepts are widely used in bioengineering, electrical engineering, machine learning, statistics, economics, finance, scientific computing, computational mathematics, and many other fields.

Prerequisites

The prerequisites for this course are an introduction to linear algebra (matrices, eigenvectors, symmetric matrices), basic calculus, and an introduction to optimization, such as the basics of linear programming.