Nonlinear Programming in Convex Regions and Functions

Basic Concepts and Principles

There are two objectives in this section. On the one hand, we want to find necessary and sufficient conditions for a certain point x to be a solution of a nonlinear programming problem.

On the other hand, we would like to find an algorithm that locates those points, beginning at any given initial point x0.

Extended Theory

Properties of Convex Functions
1. If f is convex, then for every k ∈ R the level set

    Ck = {x ∈ S : f(x) ≤ k}

is convex too.

2. If f is convex on S, then it is continuous in the interior of S.

3. If f is convex on S, an open set of Rn, then f has directional derivatives everywhere, i.e.

    ∀ x ∈ S, d ∈ Rn

    f'(x; d) = lim ( f(x + kd) - f(x) ) / k
                   k → 0+

exists and is finite.

4. f is convex if and only if its epigraph

    Epi(f) = { (x, y) ∈ S x R : y ≥ f(x) }

is a convex set of Rn+1.

5. f is concave if and only if its hypograph

    Hypo(f) = { (x, y) ∈ S x R : y ≤ f(x) }

is a convex set of Rn+1.
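Property 1 can be checked numerically on a hypothetical example of our choosing, f(x) = x², which is convex: the midpoint of any two points of a level set Ck must stay in Ck.

```python
# A minimal sketch, assuming the illustrative convex function f(x) = x**2.
# Property 1 says every level set C_k = {x : f(x) <= k} is convex, so the
# midpoint of two points of C_k must remain in C_k.
import random

def f(x):
    return x * x

k = 4.0  # level set C_k = [-2, 2] for this f
random.seed(0)
for _ in range(1000):
    # draw two points of the level set
    x, y = random.uniform(-2, 2), random.uniform(-2, 2)
    mid = 0.5 * (x + y)
    assert f(mid) <= k  # midpoint stays in the level set
print("level-set midpoints all stayed inside C_k")
```

The same midpoint test applied to a non-convex function (e.g. f(x) = -x²) would fail for suitable k, which is one quick way to see a function cannot be convex.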

Convexity is not a sufficient condition for differentiability. For example, the function

    f(x) = |x|

is not differentiable at x = 0; however, it is convex at that point.

From point 3, we know that f has directional derivatives at x = 0.
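A small sketch makes this concrete for f(x) = |x|: the one-sided limit in point 3 gives f'(0; d) = |d|, which is finite for every direction d even though the ordinary derivative does not exist at 0.

```python
# Sketch: one-sided directional derivative of f(x) = |x| at x = 0.
# f'(0; d) = lim_{k -> 0+} (|0 + k*d| - |0|) / k = |d|, finite for every d,
# even though f is not differentiable at 0.

def directional_derivative(f, x, d, k=1e-8):
    # forward-difference approximation of the one-sided limit k -> 0+
    return (f(x + k * d) - f(x)) / k

f = abs
for d in (1.0, -1.0, 3.0):
    approx = directional_derivative(f, 0.0, d)
    assert abs(approx - abs(d)) < 1e-6
print("f'(0; d) = |d| for f(x) = |x|")
```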
Obviously, if a function is differentiable we can define the gradient of f. For non-differentiable functions we define the subgradient, which plays the role of the gradient for those functions that are not differentiable.
Theorem (Existence of the subgradient for convex functions)
Let f: S ⊂ Rn → R be convex. Then

f has a subgradient at every interior point of S.
Theorem (Characterization of convexity for differentiable functions)
Let f: S ⊂ Rn → R be differentiable, with S a convex, open and non-empty set. Then f is convex if and only if

    f(x) ≥ f(y) + ∇f(y)t (x - y) ∀ x, y ∈ S
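Geometrically, the inequality says the graph of a convex f lies above every tangent plane. A sketch on an illustrative example of our choosing, f(x1, x2) = x1² + 2·x2² with gradient (2·x1, 4·x2):

```python
# Sketch of the first-order characterization on the assumed example
# f(x1, x2) = x1**2 + 2*x2**2, gradient (2*x1, 4*x2). Convexity means
# f(x) >= f(y) + grad_f(y) . (x - y) for all x, y.
import random

def f(x1, x2):
    return x1 ** 2 + 2 * x2 ** 2

def grad_f(x1, x2):
    return (2 * x1, 4 * x2)

random.seed(1)
for _ in range(1000):
    x = (random.uniform(-5, 5), random.uniform(-5, 5))
    y = (random.uniform(-5, 5), random.uniform(-5, 5))
    g = grad_f(*y)
    tangent = f(*y) + g[0] * (x[0] - y[0]) + g[1] * (x[1] - y[1])
    assert f(*x) >= tangent - 1e-9  # graph lies above every tangent plane
print("first-order inequality held at all sampled pairs")
```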

The following theorem, another characterization of convexity for differentiable functions, generalizes to several variables the idea that a convex function of one variable has an increasing derivative.
Theorem (Characterization 2 of convexity for differentiable functions)
Let f: S ⊂ Rn → R be differentiable, with S a convex, open and non-empty set. Then f is convex if and only if

    (∇f(x) - ∇f(y))t (x - y) ≥ 0 ∀ x, y ∈ S

Note that in the one-dimensional case this reduces to the condition that f' is increasing, which for twice-differentiable f is the familiar f'' ≥ 0 criterion for convexity.
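In one dimension the condition reads (f'(x) - f'(y))·(x - y) ≥ 0, i.e. f' is increasing. A sketch on the assumed example f(x) = eˣ, whose derivative is again eˣ:

```python
# Sketch: the 1-D monotone-gradient condition (f'(x) - f'(y))*(x - y) >= 0
# on the assumed example f(x) = exp(x), with f'(x) = exp(x).
import math
import random

fprime = math.exp  # the derivative of exp is exp itself

random.seed(2)
for _ in range(1000):
    x, y = random.uniform(-3, 3), random.uniform(-3, 3)
    # exp is increasing, so both factors always have the same sign
    assert (fprime(x) - fprime(y)) * (x - y) >= 0
print("f' = exp is increasing, so f = exp is convex")
```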
Theorem (Necessary and sufficient condition for a point to be a solution of a nonlinear programming problem)
Let f: S ⊂ Rn → R be convex, with S convex and non-empty. Consider the nonlinear programming problem

     Min(f(x))
    Subject to x∈S

x0 is an optimal solution if and only if f has a subgradient ξ at x0 such that

    ξt (x - x0) ≥ 0, ∀ x ∈ S

The subgradient at a point is not unique, but the next corollary says that if there is one with value zero at the point x0, then this point is the optimal solution of the problem.
Corollary. Under the same theorem hypotheses, if S is open,

x0 is an optimal solution if and only if 0 is a subgradient of f at x0.
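The corollary can be illustrated on the earlier non-differentiable example f(x) = |x|: a number ξ is a subgradient of |·| at 0 exactly when |x| ≥ ξ·x for all x (i.e. |ξ| ≤ 1), so ξ = 0 is among the subgradients and x0 = 0 is the global minimizer.

```python
# Sketch of the corollary on the example f(x) = |x|: xi is a subgradient
# of |.| at 0 iff |x| >= xi * x for all x, which holds exactly when
# |xi| <= 1. Since xi = 0 qualifies, x0 = 0 is the optimal solution.
import random

def is_subgradient_at_zero(xi, samples):
    # subgradient inequality at x0 = 0: f(x) >= f(0) + xi * (x - 0)
    return all(abs(x) >= xi * x for x in samples)

random.seed(3)
samples = [random.uniform(-10, 10) for _ in range(1000)]
assert is_subgradient_at_zero(0.0, samples)      # 0 is a subgradient
assert is_subgradient_at_zero(0.5, samples)      # so is any |xi| <= 1
assert not is_subgradient_at_zero(1.5, samples)  # |xi| > 1 fails
assert min(abs(x) for x in samples) >= abs(0.0)  # x0 = 0 is optimal
print("0 is a subgradient of |x| at x0 = 0, so x0 = 0 is optimal")
```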




