
Discuss Convex Optimization

This tutorial introduces the various concepts involved in non-linear optimization. Linear programming problems are comparatively easy to solve, but most real-world applications involve non-linear boundaries, so the scope of linear programming is quite limited. This tutorial therefore introduces topics such as convex sets, convex functions and their variants, which can be used to solve most practical optimization problems.


Convex Optimization – Norm

A norm is a function that assigns a non-negative length to a vector and is strictly positive for every non-zero vector. A norm is a function $f:\mathbb{R}^n\rightarrow \mathbb{R}$.

The basic characteristics of a norm are as follows, where $X\subseteq \mathbb{R}^n$ is a vector space, $x,y \in X$ and $\alpha$ is a scalar:

$\left | x \right |\geq 0$

$\left | x \right |= 0 \Leftrightarrow x= 0$

$\left |\alpha x \right |=\left | \alpha \right |\left | x \right |$

$\left | x+y \right |\leq \left | x \right |+\left | y \right |$

$\left | x-y \right |\geq \left | \left | x \right |-\left | y \right | \right |$

By definition, the common norms are calculated as follows:

$\left | x \right |_1=\displaystyle\sum\limits_{i=1}^n\left | x_i \right |$

$\left | x \right |_2=\left ( \displaystyle\sum\limits_{i=1}^n\left | x_i \right |^2 \right )^{\frac{1}{2}}$

$\left | x \right |_p=\left ( \displaystyle\sum\limits_{i=1}^n\left | x_i \right |^p \right )^{\frac{1}{p}}, \; 1 \leq p < \infty$

Norm is a continuous function.

Proof

By definition, if $x_n\rightarrow x$ in $X$ implies $f\left ( x_n \right )\rightarrow f\left ( x \right )$, then $f\left ( x \right )$ is a continuous function.

Let $f\left ( x \right )=\left | x \right |$.

Then, $\left | f\left ( x_n \right )-f\left ( x \right ) \right |=\left | \left | x_n \right | -\left | x \right |\right |\leq \left | x_n-x \right |$

Since $x_n \rightarrow x$, we have $\left | x_n-x \right |\rightarrow 0$.

Therefore, $\left | f\left ( x_n \right )-f\left ( x \right ) \right |\rightarrow 0\Rightarrow f\left ( x_n \right )\rightarrow f\left ( x \right )$

Hence, norm is a continuous function.
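As a quick numerical illustration (added here, not part of the original text), the following Python sketch computes the $\ell_1$, $\ell_2$ and general $\ell_p$ norms with NumPy and checks the triangle and reverse triangle inequalities; the helper name lp_norm is chosen purely for illustration.

```python
import numpy as np

def lp_norm(x, p):
    """General l_p norm for 1 <= p < infinity."""
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

x = np.array([3.0, -4.0, 1.0])
y = np.array([-1.0, 2.0, 5.0])

# l_1 and l_2 agree with NumPy's built-in implementation
print(lp_norm(x, 1), np.linalg.norm(x, 1))   # 8.0   8.0
print(lp_norm(x, 2), np.linalg.norm(x, 2))   # ~5.0990 for both

# Triangle inequality: |x + y| <= |x| + |y|
assert np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y)

# Reverse triangle inequality: |x - y| >= | |x| - |y| |
assert np.linalg.norm(x - y) >= abs(np.linalg.norm(x) - np.linalg.norm(y))
```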


Convex Optimization – Direction

Let $S$ be a closed convex set in $\mathbb{R}^n$. A non-zero vector $d \in \mathbb{R}^n$ is called a direction of $S$ if for each $x \in S$, $x+\lambda d \in S, \forall \lambda \geq 0$.

Two directions $d_1$ and $d_2$ of $S$ are called distinct if $d_1 \neq \alpha d_2$ for every $\alpha>0$.

A direction $d$ of $S$ is said to be an extreme direction if it cannot be written as a positive linear combination of two distinct directions, i.e., if $d=\lambda _1d_1+\lambda _2d_2$ for $\lambda _1, \lambda _2>0$, then $d_1= \alpha d_2$ for some $\alpha>0$. Any other direction can be expressed as a positive combination of extreme directions.

For a convex set $S$, a direction $d$ such that $x+\lambda d \in S$ for some $x \in S$ and all $\lambda \geq 0$ is called a recession direction of $S$.

Let $E$ be the set of points at which a linear function $f:S \rightarrow \mathbb{R}$ attains its maximum over a non-empty convex set $S$ in $\mathbb{R}^n$; then $E$ is called an exposed face of $S$. The directions of exposed faces are called exposed directions.

A ray whose direction is an extreme direction is called an extreme ray.

Example

Consider the convex set $S=\left \{ \left ( x,y \right )\in \mathbb{R}^2 : y\geq \left | x \right | \right \}$, the epigraph of $f\left ( x \right )=\left | x \right |$. Any vector $d=\left ( d_1,d_2 \right )$ with $d_2\geq \left | d_1 \right |$ is a direction of $S$, because $\left ( x,y \right )+\lambda d \in S$ for every $\left ( x,y \right ) \in S$ and every $\lambda \geq 0$; the extreme directions of $S$ are $\left ( 1,1 \right )$ and $\left ( -1,1 \right )$.
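To make the example concrete, here is a small Python sketch (an illustration added here, not from the original text; in_S and looks_like_direction are hypothetical helper names) that empirically tests whether a candidate vector behaves like a direction of $S=\left \{ \left ( x,y \right ) : y\geq \left | x \right | \right \}$ by checking $x+\lambda d \in S$ for sampled points and step sizes. Such a sampling check can only provide evidence, not a proof.

```python
import numpy as np

rng = np.random.default_rng(0)

def in_S(p):
    """Membership test for S = {(x, y) : y >= |x|}."""
    x, y = p
    return y >= abs(x)

def looks_like_direction(d, n_points=100, n_steps=20):
    """Empirically check x + lambda*d in S for random x in S and lambda >= 0."""
    d = np.asarray(d, dtype=float)
    for _ in range(n_points):
        x = rng.uniform(-10, 10)
        y = abs(x) + rng.uniform(0, 10)        # (x, y) lies in S by construction
        p = np.array([x, y])
        for lam in np.linspace(0.0, 50.0, n_steps):
            if not in_S(p + lam * d):
                return False
    return True

print(looks_like_direction([1.0, 1.0]))   # True: an extreme direction of S
print(looks_like_direction([0.0, 1.0]))   # True: an interior direction of S
print(looks_like_direction([1.0, 0.5]))   # False: leaves S for large lambda
```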


Convex Optimization – Convex Set

Let $S\subseteq \mathbb{R}^n$. A set $S$ is said to be convex if the line segment joining any two points of the set $S$ also belongs to $S$, i.e., if $x_1,x_2 \in S$, then $\lambda x_1+\left ( 1-\lambda \right )x_2 \in S$ where $\lambda \in\left ( 0,1 \right )$.

Note − The union of two convex sets may or may not be convex. The intersection of two convex sets is always convex.

Proof

Let $S_1$ and $S_2$ be two convex sets.

Let $S_3=S_1 \cap S_2$ and let $x_1,x_2 \in S_3$.

Since $S_3=S_1 \cap S_2$, we have $x_1,x_2 \in S_1$ and $x_1,x_2 \in S_2$.

Since $S_i$ is a convex set for each $i \in \left \{ 1,2 \right \}$, $\lambda x_1+\left ( 1-\lambda \right )x_2 \in S_i$ where $\lambda \in \left ( 0,1 \right )$.

Therefore, $\lambda x_1+\left ( 1-\lambda \right )x_2 \in S_1\cap S_2$

$\Rightarrow \lambda x_1+\left ( 1-\lambda \right )x_2 \in S_3$

Hence, $S_3$ is a convex set.

A weighted average of the form $\displaystyle\sum\limits_{i=1}^k \lambda_ix_i$, where $\displaystyle\sum\limits_{i=1}^k \lambda_i=1$ and $\lambda_i\geq 0,\forall i \in \left [ 1,k \right ]$, is called a convex combination of $x_1,x_2,....,x_k$.

A weighted average of the form $\displaystyle\sum\limits_{i=1}^k \lambda_ix_i$, where $\displaystyle\sum\limits_{i=1}^k \lambda_i=1$, is called an affine combination of $x_1,x_2,....,x_k$.

A weighted average of the form $\displaystyle\sum\limits_{i=1}^k \lambda_ix_i$ is called a linear combination of $x_1,x_2,....,x_k$.

Examples

Step 1 − Prove that the set $S=\left \{ x \in \mathbb{R}^n:Cx\leq \alpha \right \}$ is a convex set.

Solution

Let $x_1$ and $x_2 \in S$

$\Rightarrow Cx_1\leq \alpha$ and $Cx_2\leq \alpha$

To show: $y=\lambda x_1+\left ( 1-\lambda \right )x_2 \in S, \; \forall \lambda \in\left ( 0,1 \right )$

$Cy=C\left ( \lambda x_1+\left ( 1-\lambda \right )x_2 \right )=\lambda Cx_1+\left ( 1-\lambda \right )Cx_2$

$\Rightarrow Cy\leq \lambda \alpha+\left ( 1-\lambda \right )\alpha$

$\Rightarrow Cy\leq \alpha$

$\Rightarrow y\in S$

Therefore, $S$ is a convex set.

Step 2 − Prove that the set $S=\left \{ \left ( x_1,x_2 \right )\in \mathbb{R}^2:x_{1}^{2}\leq 8x_2 \right \}$ is a convex set.

Solution

Let $x,y \in S$ with $x=\left ( x_1,x_2 \right )$ and $y=\left ( y_1,y_2 \right )$

$\Rightarrow x_{1}^{2}\leq 8x_2$ and $y_{1}^{2}\leq 8y_2$

To show − $\lambda x+\left ( 1-\lambda \right )y\in S$, i.e., $\left [ \lambda x_1+\left ( 1-\lambda \right )y_1 \right ]^{2}\leq 8\left [ \lambda x_2+\left ( 1-\lambda \right )y_2 \right ]$

Now, $\left [\lambda x_1+\left ( 1-\lambda \right )y_1 \right ]^{2}=\lambda ^2x_{1}^{2}+\left ( 1-\lambda \right )^2y_{1}^{2}+2 \lambda\left ( 1-\lambda \right )x_1y_1$

But $2x_1y_1\leq x_{1}^{2}+y_{1}^{2}$

Therefore, $\left [ \lambda x_1 +\left ( 1-\lambda \right )y_1\right ]^{2}\leq \lambda ^2x_{1}^{2}+\left ( 1- \lambda \right )^2y_{1}^{2}+ \lambda\left ( 1- \lambda \right )\left ( x_{1}^{2}+y_{1}^{2} \right )$

$\Rightarrow \left [ \lambda x_1+\left ( 1-\lambda \right )y_1 \right ]^{2}\leq \lambda x_{1}^{2}+\left ( 1- \lambda \right )y_{1}^{2}$

$\Rightarrow \left [ \lambda x_1+\left ( 1-\lambda \right )y_1 \right ]^{2}\leq 8\lambda x_2+8\left ( 1- \lambda \right )y_2$

$\Rightarrow \left [ \lambda x_1+\left ( 1-\lambda \right )y_1 \right ]^{2}\leq 8\left [\lambda x_2+\left ( 1- \lambda \right )y_2 \right ]$

$\Rightarrow \lambda x+\left ( 1- \lambda \right )y \in S$

Hence, $S$ is a convex set. A numerical sanity check of both examples is sketched below.
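The following Python sketch (added here as an illustration, not part of the original text) samples random points from the two sets in Step 1 and Step 2 and confirms that convex combinations of members stay in the set. For simplicity it assumes $C$ is a single row vector and picks arbitrary values for $C$ and $\alpha$; these choices are not from the original tutorial.

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1: S1 = {x in R^3 : Cx <= alpha}, with arbitrary illustrative C and alpha
C = np.array([1.0, -2.0, 0.5])
alpha = 3.0

def in_S1(x):
    return bool(C @ x <= alpha)

# Step 2: S2 = {(x1, x2) : x1^2 <= 8*x2}
def in_S2(p):
    return bool(p[0] ** 2 <= 8.0 * p[1])

def random_point_S1():
    while True:                               # rejection sampling
        x = rng.uniform(-5, 5, size=3)
        if in_S1(x):
            return x

def random_point_S2():
    x1 = rng.uniform(-5, 5)
    x2 = x1 ** 2 / 8.0 + rng.uniform(0, 5)    # in S2 by construction
    return np.array([x1, x2])

for in_set, sample in [(in_S1, random_point_S1), (in_S2, random_point_S2)]:
    for _ in range(1000):
        a, b = sample(), sample()
        lam = rng.uniform(0, 1)
        assert in_set(lam * a + (1 - lam) * b)   # convex combination stays inside
print("All sampled convex combinations stayed in their sets.")
```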
Step 3 − Show that a set $S \subseteq \mathbb{R}^n$ is convex if and only if for each integer $k$, every convex combination of any $k$ points of $S$ is in $S$.

Solution

Let $S$ be a convex set. Then, to show:

$c_1x_1+c_2x_2+.....+c_kx_k \in S$, where $\displaystyle\sum\limits_{1}^k c_i=1, c_i\geq 0, \forall i \in 1,2,....,k$

Proof by induction

For $k=1$: $x_1 \in S, c_1=1 \Rightarrow c_1x_1 \in S$

For $k=2$: $x_1,x_2 \in S$, $c_1+c_2=1$, and since $S$ is a convex set, $c_1x_1+c_2x_2 \in S$.

Assume the convex combination of any $m$ points of $S$ is in $S$, i.e.,

$c_1x_1+c_2x_2+...+c_mx_m \in S$, where $\displaystyle\sum\limits_{1}^m c_i=1, c_i \geq 0, \forall i \in 1,2,...,m$

Now, let $x_1,x_2,....,x_m,x_{m+1} \in S$ and let

$x=\mu_1x_1+\mu_2x_2+...+\mu_mx_m+\mu_{m+1}x_{m+1}$, where $\displaystyle\sum\limits_{1}^{m+1} \mu_i=1, \mu_i\geq 0$

Write $x=\left ( \mu_1+\mu_2+...+\mu_m \right )\frac{\mu_1x_1+\mu_2x_2+...+\mu_mx_m}{\mu_1+\mu_2+...+\mu_m}+\mu_{m+1}x_{m+1}$

Let $y=\frac{\mu_1x_1+\mu_2x_2+...+\mu_mx_m}{\mu_1+\mu_2+...+\mu_m}$

$\Rightarrow x=\left ( \mu_1+\mu_2+...+\mu_m \right )y+\mu_{m+1}x_{m+1}$

Now $y \in S$, because its coefficients are non-negative and sum to 1, so $y$ is a convex combination of $m$ points of $S$.

$\Rightarrow x \in S$, since $S$ is a convex set, $y,x_{m+1} \in S$, and $\left ( \mu_1+\mu_2+...+\mu_m \right )+\mu_{m+1}=1$.

Hence, proved by induction. The converse is immediate, since the case $k=2$ is exactly the definition of convexity. A short numerical illustration follows.
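As a small illustration of Step 3 (added here, not part of the original text), the sketch below draws $k$ random points from the closed unit ball in $\mathbb{R}^3$, forms a random convex combination, and checks that the result stays in the ball; the unit ball is used only as a convenient example of a convex set.

```python
import numpy as np

rng = np.random.default_rng(2)

def random_point_in_unit_ball(n=3):
    """Rejection-sample a point from the closed unit ball in R^n."""
    while True:
        x = rng.uniform(-1, 1, size=n)
        if np.linalg.norm(x) <= 1.0:
            return x

for k in range(1, 10):
    points = np.array([random_point_in_unit_ball() for _ in range(k)])
    c = rng.uniform(0, 1, size=k)
    c = c / c.sum()                      # c_i >= 0 and sum(c_i) = 1
    combo = c @ points                   # convex combination of the k points
    assert np.linalg.norm(combo) <= 1.0 + 1e-12
print("Every sampled convex combination stayed in the unit ball.")
```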


Convex Optimization Tutorial

Audience

This tutorial is suited for students who are interested in solving various optimization problems. These concepts are widely used in bioengineering, electrical engineering, machine learning, statistics, economics, finance, scientific computing, computational mathematics and many other fields.

Prerequisites

The prerequisites for this tutorial are an introduction to linear algebra (matrices, eigenvectors, symmetric matrices), basic calculus, and an introduction to optimization, such as the concepts of linear programming.