Quasiconvex and Quasiconcave Functions

Let $f:S \rightarrow \mathbb{R}$, where $S \subset \mathbb{R}^n$ is a non-empty convex set. The function $f$ is said to be quasiconvex if for each $x_1,x_2 \in S$, we have

$$f\left ( \lambda x_1+\left ( 1-\lambda \right )x_2 \right )\leq \max\left \{ f\left ( x_1 \right ),f\left ( x_2 \right ) \right \}, \lambda \in \left ( 0,1 \right )$$

For example, $f\left ( x \right )=x^{3}$ is quasiconvex.

Let $f:S \rightarrow \mathbb{R}$, where $S \subset \mathbb{R}^n$ is a non-empty convex set. The function $f$ is said to be quasiconcave if for each $x_1,x_2 \in S$, we have

$$f\left ( \lambda x_1+\left ( 1-\lambda \right )x_2 \right )\geq \min\left \{ f\left ( x_1 \right ),f\left ( x_2 \right ) \right \}, \lambda \in \left ( 0,1 \right )$$

Remarks

Every convex function is quasiconvex, but the converse is not true.

A function which is both quasiconvex and quasiconcave is called quasimonotone.

Theorem

Let $f:S \rightarrow \mathbb{R}$, where $S$ is a non-empty convex set in $\mathbb{R}^n$. The function $f$ is quasiconvex if and only if $S_{\alpha}=\left \{ x \in S:f\left ( x \right )\leq \alpha \right \}$ is convex for each real number $\alpha$.

Proof

Let $f$ be quasiconvex on $S$.

Let $x_1,x_2 \in S_{\alpha}$. Then $x_1,x_2 \in S$ and $\max \left \{ f\left ( x_1 \right ),f\left ( x_2 \right ) \right \}\leq \alpha$.

Let $\lambda \in \left ( 0,1 \right )$ and let $x=\lambda x_1+\left ( 1-\lambda \right )x_2$. Since $S$ is convex, $x \in S$.

Thus, $f\left ( \lambda x_1+\left ( 1-\lambda \right )x_2 \right )\leq \max\left \{ f\left ( x_1 \right ),f\left ( x_2 \right ) \right \}\leq \alpha$

Therefore, $x \in S_{\alpha}$, and hence $S_{\alpha}$ is convex.
Converse

Let $S_{\alpha}$ be convex for each $\alpha$.

Let $x_1,x_2 \in S$ and $\lambda \in \left ( 0,1 \right )$, and set $x=\lambda x_1+\left ( 1-\lambda \right )x_2$.

Take $\alpha=\max \left \{ f\left ( x_1 \right ),f\left ( x_2 \right ) \right \}$, so that $x_1,x_2 \in S_{\alpha}$. Convexity of $S_{\alpha}$ gives

$\Rightarrow \lambda x_1+\left ( 1-\lambda \right )x_2 \in S_{\alpha}$

$\Rightarrow f\left ( \lambda x_1+\left ( 1-\lambda \right )x_2 \right )\leq \alpha=\max \left \{ f\left ( x_1 \right ),f\left ( x_2 \right ) \right \}$

Hence $f$ is quasiconvex, and the theorem is proved.

Theorem

Let $f:S \rightarrow \mathbb{R}$, where $S$ is a non-empty convex set in $\mathbb{R}^n$. The function $f$ is quasiconcave if and only if $S_{\alpha}=\left \{ x \in S:f\left ( x \right )\geq \alpha \right \}$ is convex for each real number $\alpha$.

Theorem

Let $f:S \rightarrow \mathbb{R}$, where $S$ is a non-empty convex set in $\mathbb{R}^n$. The function $f$ is quasimonotone if and only if $S_{\alpha}=\left \{ x \in S:f\left ( x \right )=\alpha \right \}$ is convex for each real number $\alpha$.
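As an illustrative sketch (not part of the original text), the quasiconvexity inequality can be spot-checked numerically for the example $f(x)=x^3$; the check passes for every sampled pair because a monotone function of one variable is always quasiconvex:

```python
import random

# Numerical spot-check (not a proof): f(x) = x**3 is quasiconvex on R,
# i.e. f(lam*x1 + (1-lam)*x2) <= max(f(x1), f(x2)) for lam in (0, 1).
def f(x):
    return x ** 3

random.seed(0)
violations = 0
for _ in range(10_000):
    x1, x2 = random.uniform(-5, 5), random.uniform(-5, 5)
    lam = random.random()
    mid = lam * x1 + (1 - lam) * x2
    # small tolerance guards against floating-point rounding
    if f(mid) > max(f(x1), f(x2)) + 1e-12:
        violations += 1

print(violations)  # 0
```

Note that $f(x)=x^3$ is not convex (its second derivative is negative for $x<0$), so this also illustrates the remark that the converse of "convex implies quasiconvex" fails.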
Convex Optimization – Fritz-John Conditions

Necessary Conditions

Theorem

Consider the problem: $\min f\left ( x \right )$ such that $x \in X$, where $X$ is an open set in $\mathbb{R}^n$, subject to $g_i\left ( x \right )\leq 0, \forall i=1,2,\ldots,m$. Let $f:X \rightarrow \mathbb{R}$ and $g_i:X \rightarrow \mathbb{R}$.

Let $\hat{x}$ be a feasible solution, and write $I=\left \{ i:g_i\left ( \hat{x} \right )=0 \right \}$ for the index set of active constraints and $J=\left \{ i:g_i\left ( \hat{x} \right )<0 \right \}$ for the inactive ones. Let $f$ and $g_i, i \in I$ be differentiable at $\hat{x}$ and $g_i, i \in J$ be continuous at $\hat{x}$.

If $\hat{x}$ solves the above problem locally, then there exist $u_0,u_i \in \mathbb{R}, i \in I$ such that

$$u_0 \bigtriangledown f\left ( \hat{x} \right )+\displaystyle\sum\limits_{i \in I} u_i \bigtriangledown g_i\left ( \hat{x} \right )=0$$

where $u_0,u_i \geq 0, i \in I$ and $\left ( u_0,u_I \right )\neq \left ( 0,0 \right )$.

Furthermore, if $g_i, i \in J$ are also differentiable at $\hat{x}$, then the above conditions can be written as:

$$u_0 \bigtriangledown f\left ( \hat{x} \right )+\displaystyle\sum\limits_{i=1}^m u_i \bigtriangledown g_i\left ( \hat{x} \right )=0$$

$$u_ig_i\left ( \hat{x} \right )=0$$

$$u_0,u_i \geq 0, \forall i=1,2,\ldots,m$$

$$\left ( u_0,u \right )\neq \left ( 0,0 \right ), u=\left ( u_1,u_2,\ldots,u_m \right ) \in \mathbb{R}^m$$

Remarks

$u_i$ are called Lagrangian multipliers.

The condition that $\hat{x}$ be feasible to the given problem is called the primal feasibility condition.

The requirement $u_0 \bigtriangledown f\left ( \hat{x} \right )+\displaystyle\sum\limits_{i=1}^m u_i \bigtriangledown g_i\left ( \hat{x} \right )=0$ is called the dual feasibility condition.

The condition $u_ig_i\left ( \hat{x} \right )=0, i=1,2,\ldots,m$ is called the complementary slackness condition. This condition requires $u_i=0, i \in J$.

Together, the primal feasibility condition, the dual feasibility condition and complementary slackness are called the Fritz-John conditions.
Sufficient Conditions

Theorem

If there exists an $\varepsilon$-neighbourhood of $\hat{x}$, $N_\varepsilon\left ( \hat{x} \right ), \varepsilon>0$, such that $f$ is pseudoconvex over $N_\varepsilon\left ( \hat{x} \right )\cap S$ and $g_i, i \in I$ are strictly pseudoconvex over $N_\varepsilon\left ( \hat{x} \right )\cap S$, then $\hat{x}$ is a local optimal solution to the problem described above. If $f$ is pseudoconvex at $\hat{x}$ and if $g_i, i \in I$ are both strictly pseudoconvex and quasiconvex at $\hat{x}$, then $\hat{x}$ is a global optimal solution to the problem described above.

Example

$\min f\left ( x_1,x_2 \right )=\left ( x_1-3 \right )^2+\left ( x_2-2 \right )^2$

such that $x_{1}^{2}+x_{2}^{2}\leq 5, x_1+2x_2\leq 4, x_1,x_2\geq 0$, and $\hat{x}=\left ( 2,1 \right )$

Let $g_1\left ( x_1,x_2 \right )=x_{1}^{2}+x_{2}^{2}-5$,

$g_2\left ( x_1,x_2 \right )=x_1+2x_2-4$,

$g_3\left ( x_1,x_2 \right )=-x_1$ and $g_4\left ( x_1,x_2 \right )=-x_2$

Thus the above constraints can be written as:

$g_1\left ( x_1,x_2 \right )\leq 0, g_2\left ( x_1,x_2 \right )\leq 0, g_3\left ( x_1,x_2 \right )\leq 0$ and $g_4\left ( x_1,x_2 \right )\leq 0$

Only $g_1$ and $g_2$ are active at $\hat{x}$, so $I=\left \{ 1,2 \right \}$ and therefore $u_3=0, u_4=0$.

$\bigtriangledown f\left ( \hat{x} \right )=\left ( -2,-2 \right ), \bigtriangledown g_1\left ( \hat{x} \right )=\left ( 4,2 \right )$ and $\bigtriangledown g_2\left ( \hat{x} \right )=\left ( 1,2 \right )$

Putting these values in the first Fritz-John condition, we get:

$u_0=\frac{3}{2}u_2, u_1=\frac{1}{2}u_2$, and letting $u_2=1$ gives $u_0=\frac{3}{2}, u_1=\frac{1}{2}$

Thus the Fritz-John conditions are satisfied.

Example

$\min f\left ( x_1,x_2 \right )=-x_1$
such that $x_2-\left ( 1-x_1 \right )^3\leq 0$, $-x_2\leq 0$ and $\hat{x}=\left ( 1,0 \right )$

Let $g_1\left ( x_1,x_2 \right )=x_2-\left ( 1-x_1 \right )^3$, $g_2\left ( x_1,x_2 \right )=-x_2$

Thus the above constraints can be written as:

$g_1\left ( x_1,x_2 \right )\leq 0$ and $g_2\left ( x_1,x_2 \right )\leq 0$

Both constraints are active at $\hat{x}$, so $I=\left \{ 1,2 \right \}$

$\bigtriangledown f\left ( \hat{x} \right )=\left ( -1,0 \right )$

$\bigtriangledown g_1\left ( \hat{x} \right )=\left ( 0,1 \right )$ and $\bigtriangledown g_2\left ( \hat{x} \right )=\left ( 0,-1 \right )$

Putting these values in the first Fritz-John condition, we get:

$u_0=0, u_1=u_2=a>0$

Thus the Fritz-John conditions are satisfied.
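The stationarity condition of the first example above (with $\hat{x}=(2,1)$) can be verified mechanically. The sketch below (an illustrative addition, not from the original text) recomputes the gradients at $\hat{x}$ and checks that the multiplier combination vanishes:

```python
# Spot-check of the Fritz-John stationarity condition at x_hat = (2, 1)
# for min (x1-3)^2 + (x2-2)^2 with active constraints g1 and g2.
def grad_f(x1, x2):
    return (2 * (x1 - 3), 2 * (x2 - 2))   # gradient of the objective

def grad_g1(x1, x2):
    return (2 * x1, 2 * x2)               # g1 = x1^2 + x2^2 - 5

def grad_g2(x1, x2):
    return (1, 2)                         # g2 = x1 + 2*x2 - 4

u0, u1, u2 = 1.5, 0.5, 1.0               # multipliers derived in the text
gf, g1, g2 = grad_f(2, 1), grad_g1(2, 1), grad_g2(2, 1)
combo = tuple(u0 * a + u1 * b + u2 * c for a, b, c in zip(gf, g1, g2))
print(combo)  # (0.0, 0.0)
```

Since all multipliers are nonnegative and $(u_0,u)\neq(0,0)$, the Fritz-John conditions hold at $\hat{x}$.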
Convex Optimization – Polar Cone

Let $S$ be a non-empty set in $\mathbb{R}^n$. Then, the polar cone of $S$, denoted by $S^*$, is given by $S^*=\left \{ p \in \mathbb{R}^n:p^Tx\leq 0, \forall x \in S \right \}$.

Remark

The polar cone is always convex, even if $S$ is not convex.

If $S$ is the empty set, $S^*=\mathbb{R}^n$.

Polarity may be seen as a generalisation of orthogonality. Let $C \subseteq \mathbb{R}^n$. Then the orthogonal space of $C$ is $C^\perp=\left \{ y \in \mathbb{R}^n:\left \langle x,y \right \rangle=0, \forall x \in C \right \}$.

Lemma

Let $S,S_1$ and $S_2$ be non-empty sets in $\mathbb{R}^n$. Then the following statements are true:

$S^*$ is a closed convex cone.

$S \subseteq S^{**}$, where $S^{**}$ is the polar cone of $S^*$.

$S_1 \subseteq S_2 \Rightarrow S_{2}^{*} \subseteq S_{1}^{*}$.

Proof

Step 1 − $S^*=\left \{ p \in \mathbb{R}^n:p^Tx\leq 0, \forall x \in S \right \}$

Let $x_1,x_2 \in S^*$. Then $x_{1}^{T}x\leq 0$ and $x_{2}^{T}x\leq 0, \forall x \in S$

For $\lambda \in \left ( 0,1 \right )$,

$$\left [ \lambda x_1+\left ( 1-\lambda \right )x_2 \right ]^Tx=\left [ \lambda x_{1}^{T}+\left ( 1-\lambda \right )x_{2}^{T} \right ]x=\lambda x_{1}^{T}x+\left ( 1-\lambda \right )x_{2}^{T}x\leq 0, \forall x \in S$$

Thus $\lambda x_1+\left ( 1-\lambda \right )x_2 \in S^*$, therefore $S^*$ is a convex set.

For $\lambda \geq 0$ and $p \in S^*$, we have $p^Tx\leq 0, \forall x \in S$

Therefore, $\lambda p^Tx\leq 0 \Rightarrow \left ( \lambda p \right )^Tx\leq 0 \Rightarrow \lambda p \in S^*$

Thus, $S^*$ is a cone.

To show $S^*$ is closed, i.e., to show that if $p_n \rightarrow p$ as $n \rightarrow \infty$, then $p \in S^*$:

$\forall x \in S, p_{n}^{T}x-p^Tx=\left ( p_n-p \right )^Tx$

As $p_n \rightarrow p$ as $n \rightarrow \infty$, we have $\left ( p_n-p \right )\rightarrow 0$

Therefore $p_{n}^{T}x \rightarrow p^{T}x$. But $p_{n}^{T}x\leq 0, \forall x \in S$

Thus, $p^Tx\leq 0, \forall x \in S \Rightarrow p \in S^*$

Hence, $S^*$ is closed.
Step 2 − $S^{**}=\left \{ q \in \mathbb{R}^n:q^Tp\leq 0, \forall p \in S^* \right \}$

Let $x \in S$. Then $\forall p \in S^*, p^Tx\leq 0 \Rightarrow x^Tp\leq 0 \Rightarrow x \in S^{**}$

Thus, $S \subseteq S^{**}$

Step 3 − $S_2^*=\left \{ p \in \mathbb{R}^n:p^Tx\leq 0, \forall x \in S_2 \right \}$

Since $S_1 \subseteq S_2$, every $x \in S_1$ also belongs to $S_2$.

Therefore, if $\hat{p} \in S_2^*$, then $\hat{p}^Tx\leq 0, \forall x \in S_2$

$\Rightarrow \hat{p}^Tx\leq 0, \forall x \in S_1$

$\Rightarrow \hat{p} \in S_1^*$

$\Rightarrow S_2^* \subseteq S_1^*$

Theorem

Let $C$ be a non-empty closed convex cone. Then $C=C^{**}$

Proof

$C \subseteq C^{**}$ by the previous lemma.

To prove: $C^{**} \subseteq C$

Let $x \in C^{**}$ and suppose $x \notin C$

Then by the fundamental separation theorem, there exists a vector $p \neq 0$ and a scalar $\alpha$ such that $p^Ty\leq \alpha, \forall y \in C$ and $p^Tx>\alpha$

Since $\left ( y=0 \right ) \in C$ and $p^Ty\leq \alpha, \forall y \in C$, we get $\alpha\geq 0$ and $p^Tx>0$

If $p \notin C^*$, then there exists some $\bar{y} \in C$ such that $p^T\bar{y}>0$, and $p^T\left ( \lambda \bar{y} \right )$ can be made arbitrarily large by taking $\lambda$ sufficiently large (note $\lambda \bar{y} \in C$ because $C$ is a cone). This contradicts the fact that $p^Ty\leq \alpha, \forall y \in C$

Therefore, $p \in C^*$

Since $x \in C^{**}=\left \{ q:q^Tp\leq 0, \forall p \in C^* \right \}$, we have $x^Tp\leq 0 \Rightarrow p^Tx\leq 0$

But $p^Tx>\alpha\geq 0$, which is a contradiction.

Thus, $x \in C$

Hence $C=C^{**}$.
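As a concrete illustration (an added sketch, assuming the standard example where $C$ is the nonnegative quadrant of $\mathbb{R}^2$, whose polar cone is the nonpositive quadrant), the defining inequality $p^Tx\leq 0$ can be spot-checked by sampling:

```python
import random

# Spot-check of the polar-cone inequality p^T x <= 0 for
# C  = nonnegative quadrant of R^2 (sampled x),
# C* = nonpositive quadrant of R^2 (sampled p).
random.seed(1)
ok = True
for _ in range(1000):
    x = (random.uniform(0, 5), random.uniform(0, 5))     # x in C
    p = (random.uniform(-5, 0), random.uniform(-5, 0))   # p in C*
    if p[0] * x[0] + p[1] * x[1] > 1e-12:
        ok = False

print(ok)  # True
```

The same sampling, run with $p$ drawn from the nonnegative quadrant instead, would immediately produce violations, consistent with $C^*$ being exactly the nonpositive quadrant here.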
Extreme point of a convex set

Let $S$ be a convex set in $\mathbb{R}^n$. A vector $x \in S$ is said to be an extreme point of $S$ if $x=\lambda x_1+\left ( 1-\lambda \right )x_2$ with $x_1,x_2 \in S$ and $\lambda \in \left ( 0,1 \right )\Rightarrow x=x_1=x_2$.

Example

Step 1 − $S=\left \{ \left ( x_1,x_2 \right ) \in \mathbb{R}^2:x_{1}^{2}+x_{2}^{2}\leq 1 \right \}$

Extreme points, $E=\left \{ \left ( x_1,x_2 \right ) \in \mathbb{R}^2:x_{1}^{2}+x_{2}^{2}=1 \right \}$

Step 2 − $S=\left \{ \left ( x_1,x_2 \right ) \in \mathbb{R}^2:x_1+x_2\leq 2, -x_1+2x_2\leq 2, x_1,x_2\geq 0 \right \}$

Extreme points, $E=\left \{ \left ( 0,0 \right ),\left ( 2,0 \right ),\left ( 0,1 \right ),\left ( \frac{2}{3},\frac{4}{3} \right ) \right \}$

Step 3 − $S$ is the polytope made by the points $\left \{ \left ( 0,0 \right ),\left ( 1,1 \right ),\left ( 1,3 \right ),\left ( -2,4 \right ),\left ( 0,2 \right ) \right \}$

Extreme points, $E=\left \{ \left ( 0,0 \right ),\left ( 1,1 \right ),\left ( 1,3 \right ),\left ( -2,4 \right ) \right \}$

Remarks

Any point of a closed and bounded convex set $S$ in $\mathbb{R}^n$ can be represented as a convex combination of its extreme points. This may fail for unbounded sets.

k extreme points

A point in a convex set $S$ is called k-extreme if and only if it is an interior point of a k-dimensional convex set within $S$, and it is not an interior point of a (k+1)-dimensional convex set within $S$. Basically, for a convex set $S$, the k-extreme points make up k-dimensional open faces.
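The point $(0,2)$ is excluded from the extreme points in Step 3 because it lies strictly between two other vertices. A short sketch (an illustrative addition, using exact rational arithmetic to avoid rounding) verifies the witness combination $(0,2)=\frac{2}{3}(1,1)+\frac{1}{3}(-2,4)$:

```python
from fractions import Fraction

# (0, 2) is NOT an extreme point of the polytope in Step 3: it is a
# convex combination of the vertices (1, 1) and (-2, 4) with lambda = 2/3.
x1, x2 = (1, 1), (-2, 4)
lam = Fraction(2, 3)
point = tuple(lam * a + (1 - lam) * b for a, b in zip(x1, x2))

print(point == (0, 2))  # True
```

Since $\lambda=\frac{2}{3}\in(0,1)$ and $x_1\neq x_2$, the defining implication $x=x_1=x_2$ fails, so $(0,2)$ is not extreme.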
Convex Optimization – Cones

A non-empty set $C$ in $\mathbb{R}^n$ is said to be a cone with vertex 0 if $x \in C\Rightarrow \lambda x \in C, \forall \lambda \geq 0$.

A set $C$ is a convex cone if it is convex as well as a cone.

For example, $y=\left | x \right |$ is not a convex cone because it is not convex.

But $y\geq \left | x \right |$ is a convex cone because it is convex as well as a cone.

Note − A cone $C$ is convex if and only if for any $x,y \in C, x+y \in C$.

Proof

First, suppose $x+y \in C$ for all $x,y \in C$. Since $C$ is a cone, $\lambda x \in C$ and $\left ( 1-\lambda \right )y \in C, \forall \lambda \in \left ( 0,1 \right )$, and hence $\lambda x+\left ( 1-\lambda \right )y \in C$, so $C$ is convex.

Conversely, if $C$ is convex, then for $x,y \in C$ we have $\frac{1}{2}x+\frac{1}{2}y \in C$, and since $C$ is a cone, $x+y=2\left ( \frac{1}{2}x+\frac{1}{2}y \right ) \in C$.

In general, if $x_1,x_2 \in C$, then $\lambda_1x_1+\lambda_2x_2 \in C, \forall \lambda_1,\lambda_2\geq 0$

Examples

The set of conic combinations of an infinite set of vectors in $\mathbb{R}^n$ is a convex cone.

The empty set is a convex cone.

Any linear subspace, such as the zero set of a linear function, is a convex cone. Since a hyperplane through the origin is such a set, it is also a convex cone.

Closed half spaces whose boundary passes through the origin are also convex cones.

Note − The intersection of two convex cones is a convex cone, but their union may or may not be a convex cone.
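The two closure properties in the note (addition and nonnegative scaling) can be spot-checked for the example set $C=\left \{ \left ( x,y \right ):y\geq \left | x \right | \right \}$. This sketch is an illustrative addition, not from the original text:

```python
import random

# Numerical spot-check that C = {(x, y): y >= |x|} behaves like a convex
# cone: it is closed under addition and under nonnegative scaling.
def in_C(x, y):
    return y >= abs(x)

random.seed(2)
ok = True
for _ in range(1000):
    x1, x2 = random.uniform(-3, 3), random.uniform(-3, 3)
    y1 = abs(x1) + random.uniform(0, 2)      # (x1, y1) in C
    y2 = abs(x2) + random.uniform(0, 2)      # (x2, y2) in C
    lam = random.uniform(0, 5)
    ok = ok and in_C(x1 + x2, y1 + y2)       # closed under addition
    ok = ok and in_C(lam * x1, lam * y1)     # closed under scaling

print(ok)  # True
```

Repeating the scaling test with a negative $\lambda$ would fail, which is why the cone definition restricts to $\lambda\geq 0$.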
Convex Optimization – Jensen's Inequality

Let $S$ be a non-empty convex set in $\mathbb{R}^n$ and $f:S \rightarrow \mathbb{R}$. Then $f$ is convex if and only if for each integer $k>0$ and $x_1,x_2,\ldots,x_k \in S$ with $\displaystyle\sum\limits_{i=1}^k \lambda_i=1, \lambda_i\geq 0, \forall i=1,2,\ldots,k$, we have

$$f\left ( \displaystyle\sum\limits_{i=1}^k \lambda_ix_i \right )\leq \displaystyle\sum\limits_{i=1}^k \lambda_if\left ( x_i \right )$$

Proof

By induction on $k$.

$k=1$: $x_1 \in S$ and $\lambda_1=1$, so $f\left ( \lambda_1x_1 \right )=\lambda_1f\left ( x_1 \right )$ trivially.

$k=2$: $\lambda_1+\lambda_2=1$ and $x_1,x_2 \in S$, therefore $\lambda_1x_1+\lambda_2x_2 \in S$.

Hence by the definition of convexity, $f\left ( \lambda_1x_1+\lambda_2x_2 \right )\leq \lambda_1f\left ( x_1 \right )+\lambda_2f\left ( x_2 \right )$

Suppose the statement is true for $k=n$, i.e.,

$$f\left ( \lambda_1x_1+\lambda_2x_2+\ldots+\lambda_nx_n \right )\leq \lambda_1f\left ( x_1 \right )+\lambda_2f\left ( x_2 \right )+\ldots+\lambda_nf\left ( x_n \right )$$

$k=n+1$: Let $x_1,x_2,\ldots,x_n,x_{n+1} \in S$ and $\displaystyle\sum\limits_{i=1}^{n+1}\mu_i=1$

Therefore $\mu_1x_1+\mu_2x_2+\ldots+\mu_nx_n+\mu_{n+1}x_{n+1} \in S$

Thus, $f\left ( \mu_1x_1+\mu_2x_2+\ldots+\mu_nx_n+\mu_{n+1}x_{n+1} \right )$

$$=f\left ( \left ( \mu_1+\mu_2+\ldots+\mu_n \right )\frac{\mu_1x_1+\mu_2x_2+\ldots+\mu_nx_n}{\mu_1+\mu_2+\ldots+\mu_n}+\mu_{n+1}x_{n+1} \right )$$

$$=f\left ( \mu y+\mu_{n+1}x_{n+1} \right )$$

where $\mu=\mu_1+\mu_2+\ldots+\mu_n$ and $y=\frac{\mu_1x_1+\mu_2x_2+\ldots+\mu_nx_n}{\mu_1+\mu_2+\ldots+\mu_n}$, and also $\mu+\mu_{n+1}=1, y \in S$

By the $k=2$ case,

$$\Rightarrow f\left ( \mu_1x_1+\mu_2x_2+\ldots+\mu_nx_n+\mu_{n+1}x_{n+1} \right )\leq \mu f\left ( y \right )+\mu_{n+1}f\left ( x_{n+1} \right )$$

Applying the induction hypothesis to $f\left ( y \right )$,

$$\leq \left ( \mu_1+\mu_2+\ldots+\mu_n \right )\left [ \frac{\mu_1}{\mu_1+\mu_2+\ldots+\mu_n}f\left ( x_1 \right )+\ldots+\frac{\mu_n}{\mu_1+\mu_2+\ldots+\mu_n}f\left ( x_n \right ) \right ]+\mu_{n+1}f\left ( x_{n+1} \right )$$

$$\Rightarrow f\left ( \mu_1x_1+\mu_2x_2+\ldots+\mu_nx_n+\mu_{n+1}x_{n+1} \right )\leq \mu_1f\left ( x_1 \right )+\mu_2f\left ( x_2 \right )+\ldots+\mu_{n+1}f\left ( x_{n+1} \right )$$

Hence proved.
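The inequality is easy to observe numerically. The sketch below (an illustrative addition, using the convex function $f(x)=x^2$ with $k=4$ random points and random convex weights) compares the two sides:

```python
import random

# Numerical illustration of Jensen's inequality for the convex function
# f(x) = x**2: f(sum_i w_i * x_i) <= sum_i w_i * f(x_i).
def f(x):
    return x * x

random.seed(3)
xs = [random.uniform(-10, 10) for _ in range(4)]
ws = [random.random() for _ in range(4)]
total = sum(ws)
ws = [w / total for w in ws]          # normalise so the weights sum to 1

lhs = f(sum(w * x for w, x in zip(ws, xs)))
rhs = sum(w * f(x) for w, x in zip(ws, xs))
print(lhs <= rhs + 1e-12)  # True
```

The gap $rhs-lhs$ is exactly the weighted variance of the sample here, which is why it is strictly positive unless all $x_i$ coincide.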
Convex Optimization – Direction

Let $S$ be a closed convex set in $\mathbb{R}^n$. A non-zero vector $d \in \mathbb{R}^n$ is called a direction of $S$ if for each $x \in S, x+\lambda d \in S, \forall \lambda \geq 0$.

Two directions $d_1$ and $d_2$ of $S$ are called distinct if $d_1\neq \alpha d_2$ for any $\alpha>0$.

A direction $d$ of $S$ is said to be an extreme direction if it cannot be written as a positive linear combination of two distinct directions, i.e., if $d=\lambda_1d_1+\lambda_2d_2$ for $\lambda_1,\lambda_2>0$, then $d_1=\alpha d_2$ for some $\alpha$. Any other direction can be expressed as a positive combination of extreme directions.

For a convex set $S$, a direction $d$ such that $x+\lambda d \in S$ for some $x \in S$ and all $\lambda \geq 0$ is called recessive for $S$.

Let $E$ be the set of points where a certain function $f:S \rightarrow \mathbb{R}$ over a non-empty convex set $S$ in $\mathbb{R}^n$ attains its maximum; then $E$ is called an exposed face of $S$. The directions of exposed faces are called exposed directions.

A ray whose direction is an extreme direction is called an extreme ray.

Example

Consider the epigraph of $f\left ( x \right )=\left | x \right |$, i.e., the closed convex set $S=\left \{ \left ( x,y \right ):y\geq \left | x \right | \right \}$. Any vector $d=\left ( d_1,d_2 \right )$ with $d_2\geq \left | d_1 \right |$ is a direction of $S$, because for each $\left ( x,y \right ) \in S$ and $\lambda \geq 0$, $y+\lambda d_2\geq \left | x \right |+\lambda \left | d_1 \right |\geq \left | x+\lambda d_1 \right |$, so $\left ( x,y \right )+\lambda d \in S$.
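A direction can also be checked mechanically. The sketch below is an illustrative addition; it assumes the closed convex set $S=\left \{ \left ( x_1,x_2 \right ):x_2\geq x_1^2 \right \}$ (not from the original text) and compares a genuine direction with a failing candidate:

```python
# For S = {(x1, x2): x2 >= x1^2}, d = (0, 1) is a direction of S
# (x + lambda*d stays in S for every lambda >= 0), while d = (1, 0) is not.
def in_S(x1, x2):
    return x2 >= x1 ** 2

x = (1.0, 1.0)                               # a point of S
d_good, d_bad = (0.0, 1.0), (1.0, 0.0)

good = all(in_S(x[0] + t * d_good[0], x[1] + t * d_good[1]) for t in range(100))
bad = all(in_S(x[0] + t * d_bad[0], x[1] + t * d_bad[1]) for t in range(100))
print(good, bad)  # True False
```

For this set, $(0,1)$ is in fact the only direction up to positive scaling, since any horizontal component eventually leaves the parabola's epigraph.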
Pseudoconvex Function

Let $f:S \rightarrow \mathbb{R}$ be a differentiable function and $S$ a non-empty convex set in $\mathbb{R}^n$. Then $f$ is said to be pseudoconvex if for each $x_1,x_2 \in S$ with $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )\geq 0$, we have $f\left ( x_2 \right )\geq f\left ( x_1 \right )$; or equivalently, if $f\left ( x_1 \right )>f\left ( x_2 \right )$ then $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )<0$

Pseudoconcave function

Let $f:S \rightarrow \mathbb{R}$ be a differentiable function and $S$ a non-empty convex set in $\mathbb{R}^n$. Then $f$ is said to be pseudoconcave if for each $x_1,x_2 \in S$ with $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )\leq 0$, we have $f\left ( x_2 \right )\leq f\left ( x_1 \right )$; or equivalently, if $f\left ( x_1 \right )<f\left ( x_2 \right )$ then $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )>0$

Remarks

If a function is both pseudoconvex and pseudoconcave, then it is called pseudolinear.

A differentiable convex function is also pseudoconvex, but a pseudoconvex function may not be convex. For example, $f\left ( x \right )=x+x^3$ is not convex. Still, if $x_1\leq x_2$, then $x_{1}^{3}\leq x_{2}^{3}$

Thus, $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )=\left ( 1+3x_{1}^{2} \right )\left ( x_2-x_1 \right )\geq 0$

And, $f\left ( x_2 \right )-f\left ( x_1 \right )=\left ( x_2-x_1 \right )+\left ( x_{2}^{3}-x_{1}^{3} \right )\geq 0$

$\Rightarrow f\left ( x_2 \right )\geq f\left ( x_1 \right )$

Thus, it is pseudoconvex.

A pseudoconvex function is strictly quasiconvex. Thus, every local minimum of a pseudoconvex function is also a global minimum.
Strictly pseudoconvex function

Let $f:S \rightarrow \mathbb{R}$ be a differentiable function and $S$ a non-empty convex set in $\mathbb{R}^n$. Then $f$ is said to be strictly pseudoconvex if for each pair of distinct $x_1,x_2 \in S$ with $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )\geq 0$, we have $f\left ( x_2 \right )>f\left ( x_1 \right )$; or equivalently, if $f\left ( x_1 \right )\geq f\left ( x_2 \right )$ then $\bigtriangledown f\left ( x_1 \right )^T\left ( x_2-x_1 \right )<0$

Theorem

Let $f$ be a pseudoconvex function and suppose $\bigtriangledown f\left ( \hat{x} \right )=0$ for some $\hat{x} \in S$. Then $\hat{x}$ is a global optimal solution of $f$ over $S$.

Proof

Let $\hat{x}$ be a critical point of $f$, i.e., $\bigtriangledown f\left ( \hat{x} \right )=0$

Then for every $x \in S$, $\bigtriangledown f\left ( \hat{x} \right )^T\left ( x-\hat{x} \right )=0\geq 0$, so since $f$ is pseudoconvex,

$$f\left ( x \right )\geq f\left ( \hat{x} \right ), \forall x \in S$$

Hence, $\hat{x}$ is a global optimal solution.

Remark

If $f$ is a strictly pseudoconvex function, $\hat{x}$ is the unique global optimal solution.

Theorem

If $f$ is a differentiable pseudoconvex function over $S$, then $f$ is both a strictly quasiconvex and a quasiconvex function.

Remarks

The sum of two pseudoconvex functions defined on an open set $S$ of $\mathbb{R}^n$ may not be pseudoconvex.

Let $f:S \rightarrow \mathbb{R}$ be a quasiconvex function and $S$ a non-empty convex subset of $\mathbb{R}^n$. Then $f$ is pseudoconvex if and only if every critical point is a global minimum of $f$ over $S$.

Let $S$ be a non-empty convex subset of $\mathbb{R}^n$ and $f:S \rightarrow \mathbb{R}$ a function such that $\bigtriangledown f\left ( x \right )\neq 0$ for every $x \in S$. Then $f$ is pseudoconvex if and only if it is a quasiconvex function.
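The pseudoconvexity of $f(x)=x+x^3$ discussed above can be spot-checked by sampling the defining implication. This sketch is an illustrative addition, not from the original text:

```python
import random

# Numerical spot-check that f(x) = x + x**3 (not convex on R) satisfies
# the pseudoconvexity condition:
#   f'(x1) * (x2 - x1) >= 0   ==>   f(x2) >= f(x1)
def f(x):
    return x + x ** 3

def df(x):
    return 1 + 3 * x ** 2      # derivative, strictly positive everywhere

random.seed(4)
violations = 0
for _ in range(10_000):
    x1, x2 = random.uniform(-4, 4), random.uniform(-4, 4)
    if df(x1) * (x2 - x1) >= 0 and f(x2) < f(x1) - 1e-12:
        violations += 1

print(violations)  # 0
```

No violations occur because $f'(x)=1+3x^2>0$ makes $f$ strictly increasing, so $f'(x_1)(x_2-x_1)\geq 0$ forces $x_2\geq x_1$ and hence $f(x_2)\geq f(x_1)$.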
Convex Optimization – Conic Combination

A point of the form $\alpha_1x_1+\alpha_2x_2+\ldots+\alpha_nx_n$ with $\alpha_1,\alpha_2,\ldots,\alpha_n\geq 0$ is called a conic combination of $x_1,x_2,\ldots,x_n$.

If $x_i$ are in a convex cone $C$, then every conic combination of $x_i$ is also in $C$.

A set $C$ is a convex cone if it contains all the conic combinations of its elements.

Conic Hull

A conic hull is defined as the set of all conic combinations of a given set $S$ and is denoted by coni(S).

Thus, $coni\left ( S \right )=\left \{ \displaystyle\sum\limits_{i=1}^k \lambda_ix_i:x_i \in S,\lambda_i \in \mathbb{R},\lambda_i\geq 0,i=1,2,\ldots \right \}$

The conic hull is a convex set.

The origin always belongs to the conic hull.
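A small worked instance (an illustrative addition, using the nonnegative quadrant of $\mathbb{R}^2$ as the convex cone) shows a conic combination staying inside the cone:

```python
# Illustration: a conic combination of vectors that lie in the nonnegative
# quadrant (a convex cone) stays in that quadrant. Taking a coefficient of
# zero also shows why the origin always belongs to the conic hull.
vectors = [(1, 0), (1, 2), (0, 3)]
alphas = [0.5, 2.0, 0.0]            # nonnegative coefficients

combo = [sum(a * v[i] for a, v in zip(alphas, vectors)) for i in range(2)]
in_cone = all(c >= 0 for c in combo)
print(combo, in_cone)  # [2.5, 4.0] True
```

Setting every $\alpha_i=0$ yields the origin, matching the final remark above.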
Convex Optimization – Polyhedral Set

A set in $\mathbb{R}^n$ is said to be polyhedral if it is the intersection of a finite number of closed half spaces, i.e.,

$$S=\left \{ x \in \mathbb{R}^n:p_{i}^{T}x\leq \alpha_i, i=1,2,\ldots,m \right \}$$

For example,

$\left \{ x \in \mathbb{R}^n:Ax=b \right \}$

$\left \{ x \in \mathbb{R}^n:Ax\leq b \right \}$

$\left \{ x \in \mathbb{R}^n:Ax\geq b \right \}$

Polyhedral Cone

A set in $\mathbb{R}^n$ is said to be a polyhedral cone if it is the intersection of a finite number of half spaces that contain the origin, i.e., $S=\left \{ x \in \mathbb{R}^n:p_{i}^{T}x\leq 0, i=1,2,\ldots,m \right \}$

Polytope

A polytope is a polyhedral set which is bounded.

Remarks

A polytope is the convex hull of a finite set of points.

A polyhedral cone is generated by a finite set of vectors.

A polyhedral set is a closed set.

A polyhedral set is a convex set.
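The half-space description $Ax\leq b$ translates directly into a membership test. The sketch below (an illustrative addition; the unit square written as four half-spaces is an assumed example, not from the original text) checks two points against such a system:

```python
# Membership test for a polyhedral set S = {x : A x <= b}, here the unit
# square [0, 1]^2 in R^2 written as the intersection of four half-spaces:
#   x1 <= 1, -x1 <= 0, x2 <= 1, -x2 <= 0
A = [(1, 0), (-1, 0), (0, 1), (0, -1)]
b = [1, 0, 1, 0]

def in_polyhedron(x):
    # x is in S iff every half-space inequality p_i^T x <= b_i holds
    return all(sum(a_i * x_i for a_i, x_i in zip(row, x)) <= b_k
               for row, b_k in zip(A, b))

print(in_polyhedron((0.5, 0.5)), in_polyhedron((1.5, 0.5)))  # True False
```

Replacing $b$ with the zero vector turns the same test into a membership check for a polyhedral cone.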