Nondifferentiable mathematical programming involving $(G,\beta)$-invexity
 Dehui Yuan^{1},
 Xiaoling Liu^{1},
 Shengyun Yang^{2} and
 Guoming Lai^{2}
https://doi.org/10.1186/1029-242X-2012-256
© Yuan et al.; licensee Springer 2012
Received: 30 June 2012
Accepted: 16 October 2012
Published: 1 November 2012
Abstract
In this paper, we define a new vector generalized convexity, namely nondifferentiable vector $({G}_{f},{\beta}_{f})$-invexity, for a given locally Lipschitz vector function f. Based on this new nondifferentiable vector generalized invexity, we deal with nondifferentiable nonlinear programming problems under suitable assumptions. First, we present G-Karush-Kuhn-Tucker necessary optimality conditions for nonsmooth mathematical programming problems. Under the new vector generalized invexity assumption, we also obtain G-Karush-Kuhn-Tucker sufficient optimality conditions for the same programming problems. Moreover, we establish duality results for this kind of multiobjective programming problem. Finally, a suitable example illustrates that the new optimality results are more useful for some classes of optimization problems than the optimality conditions with invex functions.
MSC: 90C26.
Keywords
1 Introduction
Convexity plays a central role in many aspects of mathematical programming, including the analysis of stability, sufficient optimality conditions and duality. Based on convexity assumptions, nonlinear programming problems can be solved efficiently. In order to treat many practical problems, there have been many attempts to weaken the convexity assumptions, and many concepts of generalized convex functions have been introduced and applied to mathematical programming problems in the literature [1–4]. One of these concepts, invexity, was introduced by Hanson in [1]. He showed that invexity shares a key property with convexity in mathematical programming: the Karush-Kuhn-Tucker conditions are sufficient for global optimality of nonlinear programming under invexity assumptions. Ben-Israel and Mond [2] also introduced the concept of preinvex functions, a special case of invexity. Many researchers, such as Mordukhovich [5], Mishra [6, 7], Ahmad [8, 9], Soleimani-Damaneh [10] and others, have devoted themselves to this active topic. Furthermore, Ansari and Yao [11] edited a book which provides a good review of different variants of invexity. With generalized convexity, sufficiency and duality results can be obtained; we refer to [12–14] and the references therein for more research results.
In [3], Antczak introduced new definitions of a p-invex set and a $(p,r)$-preinvex function, which generalize the concepts in [2]. He also discussed differentiable and nondifferentiable nonlinear programming problems involving $(p,r)$-invexity-type functions in [15]. With respect to fixed functions η and b, Antczak extended $(p,r)$-invexity to $B$-$(p,r)$-invexity and generalized $B$-$(p,r)$-invexity in [16]. Ahmad et al. [8] derived sufficient conditions for an optimal solution to the minimax fractional problem and then established weak, strong, and strict converse duality theorems for the problem and its dual under $B$-$(p,r)$-invexity assumptions. Antczak [4] considered a special kind of $(p,r)$-invexity, namely $(0,r)$-invexity, which is called r-invexity in both the differentiable and the nondifferentiable case. Later, Antczak [17] generalized the concept of (scalar) differentiable r-invex functions to the vectorial case and defined a class of V-r-invex functions. In [18], Antczak further generalized the notion of V-r-invexity to the nondifferentiable case. Note that other researchers have also studied mathematical programming involving V-r-invex functions; see [6, 7, 9] and the references therein.
To further enlarge the class of mathematical models for which the theoretical tools hold, Antczak extended invexity to G-invexity [19] for scalar differentiable functions. In a natural way, he extended the definition of G-invexity to differentiable vector-valued functions. He [20] also applied this vector G-invexity to develop optimality conditions for differentiable multiobjective programming problems with both inequality and equality constraints, and established the so-called G-Karush-Kuhn-Tucker necessary optimality conditions for this kind of programming under the Kuhn-Tucker constraint qualification. With vector G-invexity, he proved new duality results for nonlinear differentiable multiobjective programming problems, and a number of new vector dual problems to the primal one, such as the G-Mond-Weir, G-Wolfe and G-mixed dual vector problems, were defined in [21]. Further, Kim et al. [22] considered a special kind of nondifferentiable multiobjective programming with G-invexity.

In some cases, choosing G suitably can simplify the computation of the Clarke derivative of f; see Examples 1 and 2.

The concept of $({G}_{f},{\beta}_{f})$-invexity not only unifies but also extends the concepts of α-invexity and G-invexity; see Example 3. Moreover, $({G}_{f},{\beta}_{f})$-invexity, together with Lemma 1, makes the choice of a vector-valued function η easy; see Example 3.
Based on the new nondifferentiable vector generalized invexity, we deal with nonlinear programming problems under suitable assumptions. The rest of the paper is organized as follows. In Section 2, we present the concept of nondifferentiable vector $({G}_{f},{\beta}_{f})$-invexity pertaining to a given locally Lipschitz vector function f. For a given function f, we discuss the relation between $({G}_{f},{\beta}_{f})$-invexity and $(b,{G}_{f})$-preinvexity in Section 3. In Section 4, we present the G-Karush-Kuhn-Tucker necessary optimality conditions for nondifferentiable mathematical programming problems. Moreover, under this nondifferentiable vector generalized invexity assumption, we prove the G-Karush-Kuhn-Tucker sufficient optimality conditions for the same problems. In Section 5, we establish duality results for this kind of nonsmooth multiobjective programming problem as applications of the new generalized invexity. In Section 6, we give our conclusion and present a suitable example which illustrates that the optimality results in this paper are more useful for some classes of optimization problems than the optimality conditions with existing invexity; see Example 6.
2 Notations and definitions
For any function f defined on a nonempty set $X\subset {\mathbb{R}}^{n}$, ${I}_{f}(X)$ denotes the range of f or the image of X under f. Moreover, let $K=\{1,\dots ,k\}$ and $M=\{1,2,\dots ,m\}$.
is called the Clarke subdifferential of f at x.
We give a direct proof for the following useful lemma, which can also be deduced from Theorem 2.3.9 in [24].
Lemma 1 (Chain rule)
Thus, we obtain the desired result. □
With the above chain rule, we can compute the Clarke derivative of a real-valued function f more easily than by using the definition of the Clarke derivative itself; see Examples 1 and 2 below.
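The statements of Lemma 1 and of Examples 1 and 2 are elided above. The following worked instance, with illustrative choices of f and G that are not taken from the paper's examples, shows the kind of simplification the chain rule provides.

```latex
% Illustrative instance (not Example 1 or 2 of the paper):
% f(x) = |x| is locally Lipschitz with Clarke subdifferential
% \partial f(0) = [-1, 1], and G(t) = e^t is differentiable and
% strictly increasing.  The chain rule of Lemma 1 then gives
\partial (G \circ f)(0) \;=\; G'\bigl(f(0)\bigr)\,\partial f(0)
  \;=\; e^{0}\,[-1,1] \;=\; [-1,1],
% whereas computing \partial(e^{|x|})(0) directly would require
% evaluating the generalized directional derivative
(G \circ f)^{\circ}(0; v) \;=\; \limsup_{y \to 0,\; t \downarrow 0}
  \frac{e^{|y + t v|} - e^{|y|}}{t}
% from first principles.
```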
For differentiable functions, Antczak introduced G-invexity in [20]. Note from Example 2 that the function $\phi (f)$ may not be differentiable even if the function φ is differentiable. Thus, it is necessary to introduce the following vector $({G}_{f},{\beta}_{f})$-invexity concept for a given nondifferentiable function f.
holds for all $x\in X$ ($x\ne u$) and $i\in K$, then f is said to be (strictly) nondifferentiable vector $({G}_{f},{\beta}_{f})$-invex at u on X (with respect to η) (or, shortly, $({G}_{f},{\beta}_{f})$-invex at u on X), where ${G}_{f}=({G}_{{f}_{1}},\dots ,{G}_{{f}_{k}})$ and ${\beta}_{f}:=({\beta}_{1}^{f},{\beta}_{2}^{f},\dots ,{\beta}_{k}^{f})$. If f is (strictly) nondifferentiable vector $({G}_{f},{\beta}_{f})$-invex at u on X (with respect to η) for all $u\in X$, then f is (strictly) nondifferentiable vector $({G}_{f},{\beta}_{f})$-invex on X with respect to η.
Remark 1 In order to define (strictly) nondifferentiable vector $({G}_{f},{\beta}_{f})$-incave functions with respect to η for a given f, the direction of inequality (2) in Definition 2 should be reversed.
 (2)
If $f:X\to \mathbb{R}$ is differentiable $({G}_{f},{\beta}_{f})$-invex and ${G}_{f}(a)=a$ for $a\in \mathbb{R}$, then f is α-invex as defined in [23], where $\alpha ={\beta}_{f}$.
 (3)
If $f=({f}_{1},\dots ,{f}_{k})$ is differentiable vector $({G}_{f},{\beta}_{f})$-invex and ${\beta}_{i}^{f}(x,u)=1$ for all $x,u\in X$ ($i\in K$), then f is vector G-invex as defined in [20]. Further, if $k=1$, then f is G-invex as defined in [19].
Hence, the concept of $({G}_{f},{\beta}_{f})$-invexity defined in this paper not only unifies but also extends the concepts of α-invexity and G-invexity. Example 3 illustrates that there exists a function which is neither α-invex as defined in [23] nor G-invex as defined in [20], but is $({G}_{f},{\beta}_{f})$-invex as defined in this paper. Moreover, Definition 2 together with Lemma 1 can help us choose a vector-valued function η easily; see again Example 3.
Then, by Definition 2, f is nondifferentiable vector $({G}_{f},{\beta}_{f})$-invex with respect to η. Since f is nondifferentiable, it is neither α-invex as defined in [23] nor G-invex as defined in [20].
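The data of Example 3 are elided above. As a purely illustrative sketch of how a $({G}_{f},{\beta}_{f})$-invexity inequality can be checked numerically, consider the hypothetical scalar instance $f(x)=|x|$ with ${G}_{f}(t)={e}^{t}$, ${\beta}_{f}\equiv 1$ and $\eta (x,u)=x-u$ (these choices are illustrations, not the paper's example). The inequality tested is ${G}_{f}(f(x))-{G}_{f}(f(u))\ge {\beta}_{f}(x,u)\,{G}_{f}^{\prime}(f(u))\,\xi \,\eta (x,u)$ for every Clarke subgradient $\xi \in \partial f(u)$ — the natural scalar reading of Definition 2, whose display is elided above.

```python
import math

def f(x):                      # nonsmooth test function (illustrative)
    return abs(x)

def clarke_subdiff(u):         # Clarke subdifferential of |x| at u
    if u > 0:
        return [1.0]
    if u < 0:
        return [-1.0]
    return [-1.0, 1.0]         # extreme points of [-1, 1] suffice,
                               # since the inequality is linear in xi

G = math.exp                   # hypothetical G_f: strictly increasing, C^1
dG = math.exp                  # its derivative G_f'
beta = lambda x, u: 1.0        # hypothetical beta_f
eta = lambda x, u: x - u       # hypothetical eta

def invex_ok(x, u, tol=1e-12):
    """Check G(f(x)) - G(f(u)) >= beta * G'(f(u)) * xi * eta(x, u)
    for every extreme subgradient xi of the Clarke subdifferential."""
    lhs = G(f(x)) - G(f(u))
    return all(lhs + tol >= beta(x, u) * dG(f(u)) * xi * eta(x, u)
               for xi in clarke_subdiff(u))

# Grid check over [-2, 2] x [-2, 2]
grid = [i / 10 for i in range(-20, 21)]
assert all(invex_ok(x, u) for x in grid for u in grid)
print("invexity inequality holds on the grid")
```

The check passes because $e^{t}-1\ge t$: for instance, at $u=0$ the inequality reduces to $e^{|x|}-1\ge \pm x$.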
3 Relations between $(b,{G}_{f})$-preinvexity and $({G}_{f},{\beta}_{f})$-invexity
In this section, we present the concept of $(b,{G}_{f})$-preinvexity and discuss its relations with the $({G}_{f},{\beta}_{f})$-invexity introduced in the previous section.
X is said to be an α-invex set with respect to η if X is α-invex at each $u\in X$. If $\alpha (x,u)=1$ for all $x,u\in X$, then the α-invex set X with respect to η is called an invex set with respect to η.
hold for all $x\in X$ ($x\ne u$), then $f=({f}_{1},\dots ,{f}_{k})$ is said to be (strictly) vector b-preinvex at u on X with respect to η, where $b=({b}_{1},\dots ,{b}_{k})$. If f is (strictly) vector b-preinvex at u on X with respect to η for each $u\in X$, then f is (strictly) vector b-preinvex on X with respect to η.
hold for all $x\in X$ ($x\ne u$), then $f=({f}_{1},\dots ,{f}_{k})$ is said to be (strictly) vector $(b,{G}_{f})$-preinvex at u on X with respect to η, where ${G}_{f}=({G}_{{f}_{1}},\dots ,{G}_{{f}_{k}})$ and $b=({b}_{1},\dots ,{b}_{k})$. If f is (strictly) vector $(b,{G}_{f})$-preinvex at u on X for all $u\in X$, then f is (strictly) vector $(b,{G}_{f})$-preinvex on X with respect to η.
Example 4 above illustrates that there exists a function which is not b-preinvex but is $(b,G)$-preinvex. Next, we give another useful lemma, whose proof is omitted.
Lemma 2 Let φ be an increasing function defined on $A\subset \mathbb{R}$. Then ${\phi}^{-1}$ exists and is increasing on ${I}_{\phi}(A)$.
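A one-line sketch of the omitted argument (reading "increasing" as strictly increasing, which is what invertibility requires):

```latex
% Injectivity: for a_1, a_2 \in A with a_1 < a_2, strict monotonicity gives
\varphi(a_1) < \varphi(a_2),
% so \varphi is one-to-one and \varphi^{-1} exists on I_\varphi(A).
% Monotonicity of the inverse: if y_1 < y_2 in I_\varphi(A) but
% \varphi^{-1}(y_1) \geq \varphi^{-1}(y_2), then applying the increasing
% \varphi would give y_1 \geq y_2, a contradiction; hence
y_1 < y_2 \;\Longrightarrow\; \varphi^{-1}(y_1) < \varphi^{-1}(y_2).
```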
Theorem 1 Let X be an invex set (with respect to η) in ${\mathbb{R}}^{n}$ and $f=({f}_{1},\dots ,{f}_{k})$ be a function defined on X; let ${G}_{f}=({G}_{{f}_{1}},\dots ,{G}_{{f}_{k}})$ be a function such that ${G}_{{f}_{i}}:{I}_{{f}_{i}}(X)\to \mathbb{R}$ is strictly increasing on ${I}_{{f}_{i}}(X)$ for $i\in K$; let $b:=({b}_{1},\dots ,{b}_{k})$, where ${b}_{i}:X\times X\times [0,1]\to {\mathbb{R}}_{+}$ ($i\in K$). Then f is (strictly) vector $(b,{G}_{f})$-preinvex at u on X with respect to η if and only if ${G}_{f}\circ f=({G}_{{f}_{1}}\circ {f}_{1},\dots ,{G}_{{f}_{k}}\circ {f}_{k})$ is (strictly) vector b-preinvex at u on X with respect to the same η.
By Definition 5, we deduce that f is (strictly) vector $(b,{G}_{f})$-preinvex at u on X with respect to the same η.
Moreover, the above steps are reversible, so the result follows. □
Theorem 2 Let X be an invex set (with respect to η) in ${\mathbb{R}}^{n}$; let $f=({f}_{1},\dots ,{f}_{k})$ be (strictly) vector $(b,{G}_{f})$-preinvex on X with respect to η; assume that ${G}_{{f}_{i}}(\cdot )$ is differentiable and strictly increasing on ${I}_{{f}_{i}}(X)$ and that ${b}_{i}(x,u;\lambda )$ is continuous on $X\times X\times [0,1]$ for each $i\in K$. Moreover, assume $\limsup_{\lambda \downarrow 0}{b}_{i}(x,u;\lambda )>0$ for any $x,u\in X$. Then f is vector $({G}_{f},{\beta}_{f})$-invex on X with respect to η, where ${\beta}_{i}^{f}(x,u)=\frac{1}{\limsup_{\lambda \downarrow 0}{b}_{i}(x,u;\lambda )}$ for $i\in K$.
Thus, the result follows. □
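The body of the proof is elided above. Assuming the (elided) $(b,{G}_{f})$-preinvexity inequality of Definition 5 takes the standard Bector-type form $G_{f_i}(f_i(u+\lambda\eta(x,u)))\le \lambda b_i(x,u;\lambda)\,G_{f_i}(f_i(x))+(1-\lambda b_i(x,u;\lambda))\,G_{f_i}(f_i(u))$, the key step can be sketched as follows; the passage from the directional difference quotient to the Clarke generalized derivative uses further technical facts not reproduced here.

```latex
% Rearrange the preinvexity inequality and divide by \lambda \in (0,1]:
\frac{G_{f_i}\!\bigl(f_i(u+\lambda\eta(x,u))\bigr)
      - G_{f_i}\!\bigl(f_i(u)\bigr)}{\lambda}
  \;\leq\; b_i(x,u;\lambda)\,
  \Bigl[G_{f_i}\!\bigl(f_i(x)\bigr) - G_{f_i}\!\bigl(f_i(u)\bigr)\Bigr].
% Taking \limsup_{\lambda\downarrow 0} on both sides, bounding the
% left-hand side via the generalized derivative of G_{f_i}\circ f_i at u
% in direction \eta(x,u), and dividing by the positive
% \limsup_{\lambda\downarrow 0} b_i(x,u;\lambda) yields, for every
% \zeta_i \in \partial f_i(u),
G_{f_i}\!\bigl(f_i(x)\bigr) - G_{f_i}\!\bigl(f_i(u)\bigr)
  \;\geq\; \beta_i^f(x,u)\, G_{f_i}'\!\bigl(f_i(u)\bigr)\,
  \bigl\langle \zeta_i,\, \eta(x,u) \bigr\rangle,
\qquad
\beta_i^f(x,u) \;=\; \frac{1}{\limsup_{\lambda\downarrow 0} b_i(x,u;\lambda)},
% which is the (G_f, \beta_f)-invexity inequality of Definition 2.
```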
hold for any ${\zeta}_{i}\in \partial {f}_{i}(u)$ and for each $i\in K$. Thus, f is exactly locally Lipschitz V-r-invex with respect to η on X, or r-invex.
Remark 3 By Definition 2 and Example 5, we know that both V-r-invex functions and r-invex functions are nondifferentiable vector $({G}_{f},{\beta}_{f})$-invex.
where X is a nonempty subset of ${\mathbb{R}}^{n}$, and ${f}_{i}$ ($i\in K$) and ${g}_{j}$ ($j\in M$) are real-valued Lipschitz functions on X.
Let ${E}_{\mathrm{CVP}}=\{x\in X:{g}_{j}(x)\leqq 0,j\in M\}$ be the set of all feasible solutions for the problem (CVP). Further, denote by $J(\overline{x}):=\{j\in M:{g}_{j}(\overline{x})=0\}$ the set of constraint indices active at $\overline{x}\in {E}_{\mathrm{CVP}}$.
The multiobjective programming problem (CVP) above is widely used in the applied sciences. Recently, this kind of programming has been used to solve problems arising in fields such as bioinformatics, computational biology, molecular biology, wastewater treatment, drug discovery, and food processing.
where ${G}_{g}(0):=({G}_{{g}_{1}}(0),\dots ,{G}_{{g}_{m}}(0))$. Denote ${E}_{G\text{-}\mathrm{CVP}}:=\{x\in X:{G}_{g}\circ g(x)\leqq {G}_{g}(0)\}$ and ${J}^{\prime}(\overline{x}):=\{j\in M:{G}_{{g}_{j}}\circ {g}_{j}(\overline{x})={G}_{{g}_{j}}(0)\}$. Then it is easy to see that ${E}_{\mathrm{CVP}}={E}_{G\text{-}\mathrm{CVP}}$ and $J(\overline{x})={J}^{\prime}(\overline{x})$. So, the set of all feasible solutions and the set of active constraint indices for either (CVP) or (G-CVP) are denoted by E and $J(\overline{x})$, respectively.
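The identity ${E}_{\mathrm{CVP}}={E}_{G\text{-}\mathrm{CVP}}$ rests only on the fact that a strictly increasing ${G}_{{g}_{j}}$ preserves the ordering ${g}_{j}(x)\le 0\iff {G}_{{g}_{j}}({g}_{j}(x))\le {G}_{{g}_{j}}(0)$. A small numerical sketch, with hypothetical constraint functions and transforms (not those of the paper):

```python
import math

# Hypothetical constraints g_j and strictly increasing transforms G_{g_j}
g = [lambda x: abs(x) - 0.5, lambda x: x**2 - 1.0]
G = [lambda t: math.exp(t), lambda t: t**3 + t]

def feasible_cvp(x):
    # x in E_CVP: all g_j(x) <= 0
    return all(gj(x) <= 0 for gj in g)

def feasible_gcvp(x):
    # x in E_G-CVP: all G_{g_j}(g_j(x)) <= G_{g_j}(0)
    return all(Gj(gj(x)) <= Gj(0.0) for gj, Gj in zip(g, G))

def active_set(x, tol=1e-9):
    """Indices j with g_j(x) = 0; coincides with J'(x) computed via G."""
    return [j for j, gj in enumerate(g) if abs(gj(x)) <= tol]

xs = [i / 100 for i in range(-200, 201)]
assert all(feasible_cvp(x) == feasible_gcvp(x) for x in xs)
print("E_CVP == E_G-CVP on the sample;",
      "active set at x=0.5:", active_set(0.5))
```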
Before studying optimality in multiobjective programming, we must clearly define the concepts of optimality and solutions for a multiobjective programming problem. Note that in vector optimization there is a multitude of competing definitions and approaches; one of the dominant ones is (weak) Pareto optimality, which associates the concept of a solution with a property that seems intuitively natural.
Lemma 3 Let ${G}_{{f}_{i}}$ be strictly increasing on ${I}_{{f}_{i}}(X)$ for each $i\in K$ and ${G}_{{g}_{j}}$ be strictly increasing on ${I}_{{g}_{j}}(X)$ for each $j\in M$. Further, let $0\in {I}_{{g}_{j}}(X)$, $j\in M$. Then $\overline{x}$ is a (weakly) efficient solution for (CVP) if and only if $\overline{x}$ is a (weakly) efficient solution for (G-CVP).
4 Optimality conditions in nondifferentiable multiobjective programming
The first necessary conditions for the inequality-constrained problem were presented in 1948 by Fritz John, while stronger necessary conditions for the same problem were obtained in 1951 by Kuhn and Tucker. Since then, optimality conditions of Fritz John and Karush-Kuhn-Tucker type for differentiable or nondifferentiable nonconvex multiobjective programming problems have been established under various assumptions. For example, optimality conditions of Fritz John and Karush-Kuhn-Tucker type for nondifferentiable convex multiobjective programming problems were established by Kanniappan. Later, Craven proved these conditions for nondifferentiable multiobjective programming problems involving locally Lipschitz functions. Also, under some constraint qualifications, Lee proved the Karush-Kuhn-Tucker necessary optimality conditions for multiobjective programming problems involving Lipschitz functions. Moreover, Soleimani-Damaneh characterized the weak Pareto-optimal solutions of nonsmooth multiobjective programs in Asplund spaces under locally Lipschitz and generalized convexity conditions. Further, he established sufficient conditions for optimality and proper optimality for multiple-objective programs in Banach spaces after extending the concept of vector invexity.
Recently, Antczak [19] introduced the so-called G-Karush-Kuhn-Tucker necessary optimality conditions for a differentiable mathematical programming problem. In a natural way, he [20] extended these conditions to the vectorial case for differentiable multiobjective programming problems. Given the discussion in the sections above, it is interesting to consider nondifferentiable nonlinear programming. Hence, we present not only G-Karush-Kuhn-Tucker necessary optimality conditions but also G-Karush-Kuhn-Tucker sufficient optimality conditions for this kind of nondifferentiable mathematical programming problem.
Theorem 3 (G-Fritz John necessary optimality condition)
Hence, by Lemma 1, we get the desired result. □
The G-Karush-Kuhn-Tucker necessary optimality conditions for $\overline{x}$ to be (weakly) Pareto optimal are obtained from the above Fritz John necessary optimality conditions under some constraint qualifications.
Now, we give a generalized Slater-type constraint qualification. Under this regularity condition, we establish the G-Karush-Kuhn-Tucker necessary optimality conditions for the considered nonsmooth multiobjective programming problem (CVP).
Definition 7 The program (CVP) is said to satisfy the generalized Slater-type constraint qualification at $\overline{x}$ if there exists ${x}^{0}\in E$ such that ${g}_{J}({x}^{0})<0$ and ${g}_{J}$ is $({G}_{{g}_{J}},{\beta}_{{g}_{J}})$-invex with respect to η at $\overline{x}$ on E, where $J\triangleq J(\overline{x})$.
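For comparison (this observation is not part of the paper), the classical Slater condition for a convex program is recovered as the special case ${G}_{{g}_{j}}(t)=t$, ${\beta}_{{g}_{j}}\equiv 1$, $\eta (x,u)=x-u$ of Definition 7:

```latex
% Classical Slater condition: existence of x^0 with
g_j(x^0) < 0 \quad \text{for all } j \in J(\overline{x}),
% together with convexity of g_J.  Convexity of g_j is exactly
% (G_{g_J}, \beta_{g_J})-invexity with respect to \eta(x,u) = x - u
% when G_{g_j} = \mathrm{id} and \beta_{g_j} \equiv 1, since it gives
g_j(x) - g_j(u) \;\geq\; \bigl\langle \zeta_j,\; x - u \bigr\rangle
\qquad \text{for all } \zeta_j \in \partial g_j(u).
```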
Theorem 4 (G-Karush-Kuhn-Tucker necessary optimality condition)
which contradicts (11). □
Now, under the assumption of generalized invexity defined in Section 2, we can establish sufficient optimality conditions for nonsmooth multiobjective programming problems involving locally Lipschitz functions.
Theorem 5 (G-Karush-Kuhn-Tucker sufficient optimality conditions)
Let $\overline{x}$ be a feasible point for (CVP); let ${G}_{{f}_{i}}$ be differentiable and strictly increasing on ${I}_{{f}_{i}}(X)$ for each $i\in K$, and let ${G}_{{g}_{j}}$ be differentiable and strictly increasing on ${I}_{{g}_{j}}(X)$ for each $j\in M$. Moreover, suppose the G-Karush-Kuhn-Tucker necessary optimality conditions (8)-(10) are satisfied at $\overline{x}$. If f is nondifferentiable vector $({G}_{f},{\beta}_{f})$-invex at $\overline{x}$ on X with respect to η and g is nondifferentiable vector $({G}_{g},{\beta}_{g})$-invex at $\overline{x}$ on X with respect to the same η, then $\overline{x}$ is a (weakly) efficient solution for (CVP).
which contradicts the G-Karush-Kuhn-Tucker necessary optimality condition (8). Hence, $\overline{x}$ is a weakly efficient solution for (CVP), and the proof is complete. □
Theorem 6 (G-Karush-Kuhn-Tucker sufficient optimality conditions)
Let $\overline{x}$ be a feasible point for (CVP); let ${G}_{{f}_{i}}$ be differentiable and strictly increasing on ${I}_{{f}_{i}}(X)$ for each $i\in K$, and let ${G}_{{g}_{j}}$ be differentiable and strictly increasing on ${I}_{{g}_{j}}(X)$ for each $j\in M$. Moreover, suppose the G-Karush-Kuhn-Tucker necessary optimality conditions (8)-(10) are satisfied at $\overline{x}$. If f is strictly nondifferentiable vector $({G}_{f},{\beta}_{f})$-invex at $\overline{x}$ on X with respect to η and g is nondifferentiable vector $({G}_{g},{\beta}_{g})$-invex at $\overline{x}$ on X with respect to the same η, then $\overline{x}$ is an efficient solution for (CVP).
Proof The proof is similar to that of Theorem 5. □
5 Duality
Let W denote the set of all feasible solutions for the dual problem (MWD). Further, denote by Y the set $Y=\{y\in X:(y,\lambda ,\mu )\in W\}$.
Theorem 7 (Weak duality)
Let x and $(y,\lambda ,\mu )$ be feasible solutions for (CVP) and (MWD), respectively. Moreover, assume that ${f}_{I}$ and ${g}_{J}$ are $({G}_{{f}_{I}},{\beta}_{{f}_{I}})$-invex and $({G}_{{g}_{J}},{\beta}_{{g}_{J}})$-invex at y on $E\cup Y$ with respect to the same η, respectively, where $I\triangleq I(y)$ and $J\triangleq J(y)$. Then $f(x)\nless f(y)$.
holds for all ${\zeta}_{i}^{f}\in \partial {f}_{i}(y)$, $i\in I$, ${\zeta}_{j}^{g}\in \partial {g}_{j}(y)$, $j\in J$. This contradicts (16). □
Theorem 8 (Strong duality)
Let $\overline{x}$ be a (weakly) efficient solution of (CVP). Then there exist $\overline{\lambda}\in {\mathbb{R}}^{k}$, $\overline{\lambda}\geqslant 0$, and $\overline{\mu}\in {\mathbb{R}}^{m}$, $\overline{\mu}\geqq 0$, such that $(\overline{x},\overline{\lambda},\overline{\mu})$ is feasible in (MWD). If, in addition, the weak duality theorem holds for problems (CVP) and (MWD), then $(\overline{x},\overline{\lambda},\overline{\mu})$ is a (weakly) efficient solution of (MWD) and the optimal values of the two problems coincide.
But the above inequality contradicts the weak duality theorem. Thus, $(\overline{x},\overline{\lambda},\overline{\mu})$ is a (weakly) efficient solution of (MWD), and the optimal values of the two problems coincide. □
Theorem 9 (Converse duality)
Let $(\overline{y},\overline{\lambda},\overline{\mu})$ be a (weakly) efficient solution for (MWD) such that $\overline{y}\in E$. Moreover, assume that ${f}_{I}$ and ${g}_{J}$ are (strictly) $({G}_{{f}_{I}},{\beta}_{{f}_{I}})$-invex and (strictly) $({G}_{{g}_{J}},{\beta}_{{g}_{J}})$-invex at $\overline{y}$ on $E\cup Y$ with respect to the same η, respectively, where $I\triangleq I(\overline{y})$ and $J\triangleq J(\overline{y})$. Then $\overline{y}$ is a (weakly) efficient solution of (CVP).
holds for all ${\zeta}_{i}^{f}\in \partial {f}_{i}(\overline{y})$, $i\in K$, ${\zeta}_{j}^{g}\in \partial {g}_{j}(\overline{y})$, $j\in M$, which contradicts the feasibility of $(\overline{y},\overline{\lambda},\overline{\mu})$ in (MWD). □
6 Conclusion
This paper presents a new type of generalized invexity, namely nondifferentiable $({G}_{f},{\beta}_{f})$-invexity for a given locally Lipschitz function f defined on $X\subset {\mathbb{R}}^{n}$. This new invexity not only unifies but also extends the existing G-invexity and α-invexity presented in the literature. We have constructed the auxiliary mathematical programming problem (G-CVP) and discussed the relations between (G-CVP) and (CVP). With (G-CVP), we have proved the G-Karush-Kuhn-Tucker necessary optimality conditions for (CVP). Our statement of the so-called G-Karush-Kuhn-Tucker necessary optimality conditions established in this paper is more general than the classical Kuhn-Tucker necessary optimality conditions found in the literature. Also, we have proved the sufficiency of the introduced G-Karush-Kuhn-Tucker necessary optimality conditions for (CVP) under the new nondifferentiable vector invexity assumption. More precisely, this result has been proved for multiobjective programming problems in which the objective functions and the constraints are nondifferentiable vector generalized invex with respect to the same η defined in Section 2, but not necessarily with respect to the same G; see the following example. As applications of our new generalized invexity, we establish duality results for (CVP) via the Mond-Weir dual program. Note that many researchers have been interested in studying minimax programming or fractional programming with different generalized invexities; see [6, 8, 10, 15]. As pointed out by an anonymous referee, we will study minimax programming and fractional programming under the invexity proposed here in future work.
To illustrate the approach to optimality considered in this paper, we now give an example of a nonsmooth multiobjective programming problem involving nondifferentiable vector generalized invex functions with respect to the same function η defined in Section 2.
It is not difficult to see that ${f}_{1}$, ${f}_{2}$, g are locally Lipschitz functions and, moreover, the set of all feasible solutions is $E=X=[-1/2,1/2]\subset \mathbb{R}$. Note also that the feasible solution $\overline{x}=0$ is efficient in the considered nonsmooth vector optimization problem. Then, from Example 3, f and g are nondifferentiable vector $({G}_{f},{\beta}_{f})$-invex and $({G}_{g},{\beta}_{g})$-invex with respect to the same η, respectively, where η, ${\beta}_{f}$, and ${\beta}_{g}$ are defined in Example 3. Also, it can be established that the G-Karush-Kuhn-Tucker necessary optimality conditions (8)-(10) are satisfied at $\overline{x}$. Since all the hypotheses of Theorem 6 are fulfilled, $\overline{x}$ is an efficient solution of the considered multiobjective programming problem. Further, note that the sufficient optimality Theorem 20 in [19] for efficiency is not applicable to the considered multiobjective programming problem (CVP), since all functions involved are nondifferentiable.
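The example's data are elided above. A hypothetical instance with the same flavor (nonsmooth objectives and a constraint on $X=[-1/2,1/2]$; these functions are stand-ins, not the paper's actual ${f}_{1}$, ${f}_{2}$, g) can be checked numerically for the efficiency of $\overline{x}=0$:

```python
# Hypothetical instance (the paper's f_1, f_2, g are elided):
# minimize (f1, f2) over E = {x in [-1/2, 1/2] : g(x) <= 0}.
f1 = lambda x: abs(x)
f2 = lambda x: abs(x) + x**2
g  = lambda x: abs(x) - 0.5

def dominates(x, xbar):
    """True if feasible x Pareto-dominates xbar: no objective worse,
    at least one strictly better."""
    fx, fb = (f1(x), f2(x)), (f1(xbar), f2(xbar))
    return (g(x) <= 0 and all(a <= b for a, b in zip(fx, fb))
            and any(a < b for a, b in zip(fx, fb)))

xbar = 0.0
grid = [i / 1000 for i in range(-500, 501)]
assert not any(dominates(x, xbar) for x in grid)
print("no feasible grid point dominates x̄ = 0: consistent with efficiency")
```

A finite grid can only support, not prove, efficiency; here it is consistent with the analytical fact that both objectives of this stand-in instance attain their unique minimum at 0.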
Declarations
Acknowledgements
The authors are grateful to the referees for their valuable suggestions that helped to improve the paper in its present form. This research is supported by the Science Foundation of Hanshan Normal University (LT200801).
Authors’ Affiliations
References
 1. Hanson MA: On sufficiency of the Kuhn-Tucker conditions. J. Math. Anal. Appl. 1981, 80: 545–550. doi:10.1016/0022-247X(81)90123-2
 2. Ben-Israel A, Mond B: What is invexity? J. Aust. Math. Soc. Ser. B 1986, 28: 1–9. doi:10.1017/S0334270000005142
 3. Antczak T: $(p,r)$-invex sets and functions. J. Math. Anal. Appl. 2001, 263: 355–379. doi:10.1006/jmaa.2001.7574
 4. Antczak T: r-preinvexity and r-invexity in mathematical programming. Comput. Math. Appl. 2005, 50(3–4): 551–566. doi:10.1016/j.camwa.2005.01.024
 5. Mordukhovich BS: Multiobjective optimization problems with equilibrium constraints. Math. Program. 2009, 117(1): 331–354. doi:10.1007/s10107-007-0172-y
 6. Mishra SK, Shukla K: Nonsmooth minimax programming problems with V-r-invex functions. Optimization 2010, 59(1): 95–103. doi:10.1080/02331930903500308
 7. Mishra SK, Singh V, Wang SY, Lai KK: Optimality and duality for nonsmooth multiobjective optimization problems with generalized V-r-invexity. J. Appl. Anal. 2010, 16: 49–58.
 8. Ahmad I, Gupta SK, Kailey N, Agarwal RP: Duality in nondifferentiable minimax fractional programming with $B$-$(p,r)$-invexity. J. Inequal. Appl. 2011, 2011: Article ID 1.
 9. Ahmad I, Gupta SK, Jayswal A: On sufficiency and duality for nonsmooth multiobjective programming problems involving generalized V-r-invex functions. Nonlinear Anal., Theory Methods Appl. 2011, 74(17): 5920–5928. doi:10.1016/j.na.2011.05.058
 10. Soleimani-Damaneh M: Optimality for nonsmooth fractional multiple objective programming. Nonlinear Anal., Theory Methods Appl. 2008, 68(10): 2873–2878. doi:10.1016/j.na.2007.02.033
 11. Ansari QH, Yao JC: Recent Developments in Vector Optimization. Springer, Berlin; 2012.
 12. Li J, Gao Y: Non-differentiable multiobjective mixed symmetric duality under generalized convexity. J. Inequal. Appl. 2011, 2011: Article ID 23. doi:10.1186/1029-242X-2011-23
 13. Gao Y: Higher-order symmetric duality for a class of multiobjective fractional programming problems. J. Inequal. Appl. 2012, 2012: Article ID 142. doi:10.1186/1029-242X-2012-142
 14. Gupta SK, Dangar D, Kumar S: Second-order duality for a nondifferentiable minimax fractional programming under generalized α-univexity. J. Inequal. Appl. 2012, 2012: Article ID 187. doi:10.1186/1029-242X-2012-187
 15. Antczak T: Minimax programming under $(p,r)$-invexity. Eur. J. Oper. Res. 2004, 158: 1–19. doi:10.1016/S0377-2217(03)00352-7
 16. Antczak T: Generalized $B$-$(p,r)$-invexity functions and nonlinear mathematical programming. Numer. Funct. Anal. Optim. 2009, 30(1–2): 1–22. doi:10.1080/01630560802678549
 17. Antczak T: V-r-invexity in multiobjective programming. J. Appl. Anal. 2005, 11(1): 63–80.
 18. Antczak T: Optimality and duality for nonsmooth multiobjective programming problems with V-r-invexity. J. Glob. Optim. 2009, 45(2): 319–334. doi:10.1007/s10898-008-9377-8
 19. Antczak T: New optimality conditions and duality results of G-type in differentiable mathematical programming. Nonlinear Anal., Theory Methods Appl. 2007, 66: 1617–1632. doi:10.1016/j.na.2006.02.013
 20. Antczak T: On G-invex multiobjective programming. Part I. Optimality. J. Glob. Optim. 2009, 43(1): 97–109. doi:10.1007/s10898-008-9299-5
 21. Antczak T: On G-invex multiobjective programming. Part II. Duality. J. Glob. Optim. 2009, 43(1): 111–140. doi:10.1007/s10898-008-9298-6
 22. Kim HJ, Seo YY, Kim DS: Optimality conditions in nondifferentiable G-invex multiobjective programming. J. Inequal. Appl. 2010, 2010: Article ID 172059. doi:10.1155/2010/172059
 23. Noor M: On generalized preinvex functions and monotonicities. J. Inequal. Pure Appl. Math. 2004, 5(4): 1–9.
 24. Clarke FH: Optimization and Nonsmooth Analysis. Wiley-Interscience, New York; 1983.
Copyright
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.