Examples of the use of convex functions in English and their translations into Portuguese
This is possible for convex functions because they have directional derivatives.
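A worked statement of the underlying fact, using the standard definition: for a convex function f and a direction d, the one-sided directional derivative
\[
  f'(x; d) \;=\; \lim_{t \downarrow 0} \frac{f(x + t\,d) - f(x)}{t}
\]
exists, because convexity makes the difference quotient nondecreasing in t, so the limit is well defined (possibly infinite at boundary points of the domain).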
We consider an optimization problem for which the objective function is the sum of convex functions, not necessarily differentiable.
Keywords: Convex functions; Integral inequalities; h-Convex functions.
We also present a generalized proximal linearized method for the difference of convex functions which uses a quasi-distance as regularization.
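A minimal sketch of a proximal linearized iteration for a DC objective f = g − h, with a hypothetical quasi-distance q standing in for the usual squared norm; the exact scheme in the cited work may differ:
\[
  w_k \in \partial h(x_k), \qquad
  x_{k+1} \in \operatorname*{arg\,min}_x \Bigl\{ g(x) - \langle w_k,\, x - x_k \rangle + \tfrac{1}{2\lambda_k}\, q(x, x_k)^2 \Bigr\}.
\]
When q is the Euclidean distance, each subproblem is convex, since g is convex and the linearization of h is affine.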
In the context of this work we will focus on the special case of a convex differentiable function possessing an isolated point of non-differentiability, and on its generalization, the zh convex functions.
In this work, we study subdifferential calculus rules for the sum of convex functions, infimal convolution, pre-composition, and the marginal function.
This dissertation addresses the optimization of a physical arrangement problem by formulating it as a signomial geometric programming problem and solving it through functions representable as a difference of convex functions in its standard form.
This work presents the concept and applications of convex functions of one real variable. The main goal is to show their use in problems involving inequalities.
The branch of mathematics devoted to the study of properties of convex sets and convex functions is called convex analysis.
We develop an approach based on the difference of two convex functions (DC) in order to build a convex relaxation to be used in a spatial branch-and-bound procedure.
We present extensions to Hadamard manifolds of the proximal point method for the difference of convex functions and of the steepest descent method for continuously differentiable functions which satisfy the Kurdyka-Łojasiewicz property.
The main objective of this work is to demonstrate, rigorously, important numerical inequalities through right convex functions and left concave functions.
In this context, a detailed proof of global convergence for (not necessarily) convex functions is developed, together with a study of the linear convergence rate for the case of strongly convex functions.
The necessary conditions are sufficient for optimality if the objective function f of a maximization problem is a concave function, the inequality constraints g_j are continuously differentiable convex functions, and the equality constraints h_i are affine functions.
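A compact restatement of that setting, under the standard conventions for the Karush-Kuhn-Tucker conditions:
\[
  \max_x \; f(x) \quad \text{s.t.} \quad g_j(x) \le 0,\; j = 1, \dots, m, \qquad h_i(x) = 0,\; i = 1, \dots, p,
\]
with f concave, each g_j convex and continuously differentiable, and each h_i affine; under these hypotheses any point satisfying the KKT conditions is a global maximizer.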
In this work we will study the concept, the main properties, and some applications of convex functions of a real variable; our main objective is to show their use in problems involving inequalities, especially those seen in basic education.
In this work, we present the theory based on the notion of proximity operators, used to study the problem of minimizing the sum of two convex functions with certain regularity properties in Hilbert spaces.
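For reference, the proximity operator of a proper lower semicontinuous convex function g on a Hilbert space H is defined, in the standard way, by
\[
  \operatorname{prox}_{\lambda g}(v) \;=\; \operatorname*{arg\,min}_{x \in H} \Bigl\{ g(x) + \tfrac{1}{2\lambda}\, \lVert x - v \rVert^2 \Bigr\}, \qquad \lambda > 0,
\]
and methods of this type, such as forward-backward splitting, minimize a sum f + g by alternating a gradient step on f with a proximal step on g.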
Inequalities, in particular those involving right convex functions and left concave functions, deserve attention because they involve sequences of mathematical procedures to be demonstrated, even if each step to be shown is, in principle, not very complex.
In mathematics, the subderivative, subgradient, and subdifferential generalize the derivative to convex functions which are not necessarily differentiable.
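The standard definition behind this statement: a vector v is a subgradient of a convex function f at x if
\[
  f(y) \;\ge\; f(x) + \langle v,\, y - x \rangle \quad \text{for all } y,
\]
and the subdifferential ∂f(x) is the set of all such v. For example, for f(x) = |x| one has ∂f(0) = [−1, 1].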
This work considers convex optimization problems with a separable structure, i.e., problems of minimizing a sum of convex functions subject to constraints on each independent variable.
In this work an approach is presented that combines the theory of geometric programming problems with optimization theory for difference-of-convex (DC) functions: a class of geometric programming problems, known as signomial problems, can be written as a difference of convex functions, and further, a DC problem can be rewritten as a CDC problem, the canonical form of the DC problem.
The additive inverse of a convex function is a concave function.
A convex function of a martingale is a submartingale, by Jensen's inequality.
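The one-line argument, using the conditional form of Jensen's inequality for a convex function φ and a martingale (X_n) with respect to a filtration (F_n), assuming the relevant integrability:
\[
  \mathbb{E}\bigl[\varphi(X_{n+1}) \mid \mathcal{F}_n\bigr] \;\ge\; \varphi\bigl(\mathbb{E}[X_{n+1} \mid \mathcal{F}_n]\bigr) \;=\; \varphi(X_n),
\]
which is exactly the submartingale property for φ(X_n).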
The Phragmén-Lindelöf theorem implies that μ is a convex function.
The right derivative f'_+ of any convex function f defined on an open interval is an increasing càdlàg function.
Let"f":"I"→R be a real-valued convex function defined on an open interval of the real line.
In this work we deal with optimal methods for optimizing a differentiable convex function with and without constraints.
Continuing, a theorem was proved which guarantees the maximality of the subdifferential of a lower semicontinuous proper convex function.
If the objective function is a ratio of a concave and a convex function (in the maximization case) and the constraints are convex, then the problem can be transformed to a convex optimization problem using fractional programming techniques.
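One standard route, sketched under the usual assumptions (f concave and nonnegative, g convex and positive on the convex feasible set C), is Dinkelbach's parametric reformulation: the optimal value λ* of
\[
  \max_{x \in C} \; \frac{f(x)}{g(x)}
\]
is the unique value of λ for which
\[
  \max_{x \in C} \; \bigl\{ f(x) - \lambda\, g(x) \bigr\} \;=\; 0,
\]
and for each λ ≥ 0 the inner problem is a concave maximization, hence a convex optimization problem.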