A projection method for bilevel variational inequalities

Abstract

A fixed point iteration algorithm is introduced to solve bilevel monotone variational inequalities. The algorithm uses simple projection sequences. Strong convergence of the sequences generated by the algorithm to a solution is established in a real Hilbert space under suitable assumptions.

MSC: 65K10, 90C25.

1 Introduction

Let C be a nonempty closed convex subset of a real Hilbert space $\mathcal{H}$ with the inner product $\langle\cdot,\cdot\rangle$ and the norm $\|\cdot\|$. We denote weak convergence and strong convergence by the notations ⇀ and →, respectively. The bilevel variational inequalities, shortly (BVI), are formulated as follows:

Find $x^* \in Sol(G,C)$ such that $\langle F(x^*), x - x^*\rangle \ge 0$ for all $x \in Sol(G,C)$,

where $G : \mathcal{H} \to \mathcal{H}$, $Sol(G,C)$ denotes the set of all solutions of the variational inequalities

Find $y^* \in C$ such that $\langle G(y^*), z - y^*\rangle \ge 0$ for all $z \in C$,

and $F : C \to \mathcal{H}$. We denote the solution set of problem (BVI) by Ω.

Bilevel variational inequalities are special classes of quasivariational inequalities (see [1–4]) and of equilibrium problems with equilibrium constraints considered in [5]. At the same time, they cover some classes of mathematical programs with equilibrium constraints (see [6]), bilevel minimization problems (see [7]), variational inequalities (see [8–13]), minimum-norm problems over the solution set of variational inequalities (see [14, 15]), bilevel convex programming models (see [16]) and bilevel linear programming (see [17]).

Suppose that $f : \mathcal{H} \to \mathbb{R}$. It is well known in convex programming that if f is convex and differentiable on $Sol(G,C)$, then $x^*$ is a solution to

$\min\{f(x) : x \in Sol(G,C)\}$

if and only if $x^*$ is a solution to the bilevel variational inequalities $VI(\nabla f, Sol(G,C))$, where $\nabla f$ is the gradient of f. Then the bilevel variational inequalities (BVI) can be written in the form of a mathematical program with equilibrium constraints as follows:

$\left\{\begin{array}{l}\min f(x),\\ x \in \{y^* : \langle G(y^*), z - y^*\rangle \ge 0, \forall z \in C\}.\end{array}\right.$

If f, g are two convex and differentiable functions, then problem (BVI) (where $F := \nabla f$ and $G := \nabla g$) becomes the following bilevel minimization problem (see [7]):

$\left\{\begin{array}{l}\min f(x),\\ x \in \operatorname{argmin}\{g(y) : y \in C\}.\end{array}\right.$

In the special case $F(x) = x$ for all $x \in C$, problem (BVI) becomes the minimum-norm problem over the solution set of variational inequalities:

Find $x^* \in Sol(G,C)$ such that $\|x^*\| = \min\{\|x\| : x \in Sol(G,C)\}$, i.e., $x^* = Pr_{Sol(G,C)}(0)$,

where $Pr_{Sol(G,C)}(0)$ is the projection of 0 onto $Sol(G,C)$. A typical example is the least-squares solution to the constrained linear inverse problem in [18]. For solving this problem under the assumptions that the subset $C \subseteq \mathcal{H}$ is nonempty closed convex, $G : C \to \mathcal{H}$ is α-inverse strongly monotone and $Sol(G,C)$ is nonempty, Yao et al. in [14] introduced the following extended extragradient method:

$\left\{\begin{array}{l}x^0 \in C,\\ y^k = Pr_C(x^k - \lambda G(x^k) - \alpha_k x^k),\\ x^{k+1} = Pr_C\big[x^k - \lambda G(x^k) + \mu(y^k - x^k)\big], \quad \forall k \ge 0.\end{array}\right.$

They showed that, under certain conditions on the parameters, the sequence $\{x^k\}$ converges strongly to $\hat{x} = Pr_{Sol(G,C)}(0)$.

Recently, Anh et al. in [19] introduced an extragradient algorithm for solving problem (BVI) in the Euclidean space $\mathbb{R}^n$. Roughly speaking, the algorithm consists of two loops. At each iteration k of the outer loop, they applied the extragradient method to the lower variational inequality problem. Then, starting from the iterate obtained in the outer loop, they computed an $\epsilon_k$-solution of problem $VI(G,C)$. The convergence of the algorithm crucially depends on the starting point $x^0$ and on the parameters chosen in advance.

Under the assumptions that F is strongly monotone and Lipschitz continuous, G is pseudomonotone and Lipschitz continuous on C, and the parameter sequences are chosen appropriately, they showed that the two iterative sequences $\{x^k\}$ and $\{z^k\}$ converge to the same point $x^*$, which is a solution of problem (BVI). However, each iteration of the outer loop requires computing an approximate solution to a variational inequality problem.

There exist some other solution methods for bilevel variational inequalities when the cost operator satisfies certain monotonicity properties (see [16, 19–21]). All of these methods require solving auxiliary variational inequalities. To avoid this requirement, we combine the projected gradient method in [10] for solving variational inequalities with the fixed point property that $x^*$ is a solution to problem $VI(F,C)$ if and only if it is a fixed point of the mapping $x \mapsto Pr_C(x - \lambda F(x))$, where $\lambda > 0$. The strong convergence of the proposed sequences is then established in a real Hilbert space.

In this paper, we are interested in finding a solution to bilevel variational inequalities (BVI), where the operators F and G satisfy the following usual conditions:

(A1) G is η-inverse strongly monotone on $\mathcal{H}$ and F is β-strongly monotone on C.

(A2) F is L-Lipschitz continuous on C.

(A3) The solution set Ω of problem (BVI) is nonempty.

The purpose of this paper is to propose an algorithm for directly solving bilevel monotone variational inequalities by using the projected gradient method and fixed point techniques.

The rest of this paper is organized as follows. In Section 2, we recall some monotonicity properties and basic facts about the metric projection onto a closed convex set, and introduce in detail a new algorithm for solving problem (BVI). Section 3 is devoted to the convergence analysis of the algorithm.

2 Preliminaries

We list some well-known definitions and properties of the metric projection in a Hilbert space which will be used in our analysis.

Definition 2.1 Let C be a nonempty closed convex subset in $\mathcal{H}$. We denote the projection onto C by $Pr_C(\cdot)$, i.e.,

$Pr_C(x) = \operatorname{argmin}\{\|y - x\| : y \in C\}, \quad \forall x \in \mathcal{H}.$
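For intuition, the metric projection has a closed form on simple sets. The following sketch (assuming C is a Euclidean ball, an illustrative choice not taken from the paper; all names are ours) computes $Pr_C$ and checks numerically that the projection is nonexpansive.

```python
import numpy as np

def proj_ball(x, center, radius):
    """Pr_C for the closed ball C = {y : ||y - center|| <= radius}."""
    d = x - center
    dist = np.linalg.norm(d)
    if dist <= radius:
        return x.copy()            # x already lies in C
    return center + (radius / dist) * d

# Nonexpansiveness: ||Pr_C(x) - Pr_C(y)|| <= ||x - y||
x, y = np.array([3.0, 4.0]), np.array([0.5, -1.0])
px = proj_ball(x, np.zeros(2), 1.0)   # -> [0.6, 0.8]
py = proj_ball(y, np.zeros(2), 1.0)
assert np.linalg.norm(px - py) <= np.linalg.norm(x - y) + 1e-12
```

For more general sets C the projection has no closed form, which is why the methods below only require projections onto the "simple" constraint set of the lower problem.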

The operator $\varphi : C \to \mathcal{H}$ is said to be

1. (i)

γ-strongly monotone on C if for each $x, y \in C$,

$\langle \varphi(x) - \varphi(y), x - y\rangle \ge \gamma\|x - y\|^2;$
2. (ii)

η-inverse strongly monotone on C if for each $x, y \in C$,

$\langle \varphi(x) - \varphi(y), x - y\rangle \ge \eta\|\varphi(x) - \varphi(y)\|^2;$
3. (iii)

Lipschitz continuous with constant $L > 0$ (shortly L-Lipschitz continuous) on C if for each $x, y \in C$,

$\|\varphi(x) - \varphi(y)\| \le L\|x - y\|.$

If $\varphi : C \to C$ and $L = 1$, then φ is called nonexpansive on C.

We know that the projection $Pr_C(\cdot)$ has the following well-known basic properties.

Property 2.2

1. (a)

$\|Pr_C(x) - Pr_C(y)\| \le \|x - y\|$, $\forall x, y \in \mathcal{H}$.

2. (b)

$\langle x - Pr_C(x), y - Pr_C(x)\rangle \le 0$, $\forall y \in C, x \in \mathcal{H}$.

3. (c)

$\|Pr_C(x) - Pr_C(y)\|^2 \le \|x - y\|^2 - \|Pr_C(x) - x + y - Pr_C(y)\|^2$, $\forall x, y \in \mathcal{H}$.

To prove the main theorem of this paper, we need the following lemma.

Lemma 2.3 (see [21])

Let $A : \mathcal{H} \to \mathcal{H}$ be β-strongly monotone and L-Lipschitz continuous, $\lambda \in (0,1]$ and $\mu \in (0, \frac{2\beta}{L^2})$. Then the mapping $T(x) := x - \lambda\mu A(x)$ for all $x \in \mathcal{H}$ satisfies the inequality

$\|T(x) - T(y)\| \le (1 - \lambda\tau)\|x - y\|, \quad \forall x, y \in \mathcal{H},$

where $\tau = 1 - \sqrt{1 - \mu(2\beta - \mu L^2)} \in (0,1]$.
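Lemma 2.3 is easy to check numerically. The sketch below uses an assumed linear operator $A(x) = Bx$ (our choice, not from the paper) with $\langle Bx, x\rangle = \|x\|^2$, so $\beta = 1$ and $L = \sqrt{2}$, and verifies the contraction estimate on random points.

```python
import numpy as np

rng = np.random.default_rng(0)
B = np.array([[1.0, 1.0], [-1.0, 1.0]])   # <Bx, x> = ||x||^2, hence beta = 1
beta, L = 1.0, np.sqrt(2.0)               # L = spectral norm of B
mu, lam = 0.5, 1.0                        # mu in (0, 2*beta/L**2) = (0, 1)
tau = 1.0 - np.sqrt(1.0 - mu * (2.0 * beta - mu * L**2))

T = lambda x: x - lam * mu * (B @ x)      # T(x) = x - lam*mu*A(x)

for _ in range(100):
    x, y = rng.standard_normal(2), rng.standard_normal(2)
    lhs = np.linalg.norm(T(x) - T(y))
    rhs = (1.0 - lam * tau) * np.linalg.norm(x - y)
    assert lhs <= rhs + 1e-10             # ||T(x)-T(y)|| <= (1 - lam*tau)||x-y||
```

For this particular B the bound holds with equality, so the contraction factor $1 - \lambda\tau$ in the lemma cannot be improved in general.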

Lemma 2.4 (see [22])

Let $\mathcal{H}$ be a real Hilbert space, C be a nonempty closed and convex subset of $\mathcal{H}$ and $S : C \to \mathcal{H}$ be a nonexpansive mapping. Then $I - S$ (I is the identity operator on $\mathcal{H}$) is demiclosed at $y \in \mathcal{H}$, i.e., for any sequence $(x^k)$ in C such that $x^k \rightharpoonup \bar{x} \in C$ and $(I - S)(x^k) \to y$, we have $(I - S)(\bar{x}) = y$.

Lemma 2.5 (see [19])

Let $\{a_n\}$ be a sequence of nonnegative real numbers such that

$a_{n+1} \le (1 - \gamma_n)a_n + \delta_n, \quad \forall n \ge 0,$

where $\{\gamma_n\} \subset (0,1)$ and $\{\delta_n\}$ is a sequence in $\mathbb{R}$ such that

1. (a)

$\sum_{n=0}^{\infty}\gamma_n = \infty$,

2. (b)

$\limsup_{n\to\infty}\frac{\delta_n}{\gamma_n} \le 0$ or $\sum_{n=0}^{\infty}|\delta_n| < +\infty$.

Then $\lim_{n\to\infty}a_n = 0$.
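Lemma 2.5 can be illustrated numerically. The choice $\gamma_n = 1/(n+2)$, $\delta_n = \gamma_n^2$ below is an assumed toy instance (ours) satisfying conditions (a) and (b):

```python
a = 1.0
for n in range(200_000):
    gamma = 1.0 / (n + 2)        # sum of gamma_n diverges, so (a) holds
    delta = gamma * gamma        # delta_n / gamma_n -> 0, so (b) holds
    a = (1.0 - gamma) * a + delta
assert a < 1e-3                  # a_n -> 0, as the lemma predicts
```

Unrolling the recursion shows $a_n = O(\log n / n)$ for this choice, consistent with the lemma's conclusion.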

Now we are in a position to describe an algorithm for problem (BVI). The proposed algorithm can be considered as a combination of the projected gradient and fixed point methods. Roughly speaking, the algorithm consists of two steps. First, we use the well-known projected gradient method for solving the variational inequality $VI(G,C)$: $x^{k+1} = Pr_C(x^k - \lambda G(x^k))$ ($k = 0, 1, \dots$), where $\lambda > 0$ and $x^0 \in C$. This method generates a sequence $(x^k)$ converging strongly to the unique solution of problem $VI(G,C)$ under the assumptions that G is L-Lipschitz continuous and α-strongly monotone on C with step size $\lambda \in (0, \frac{2\alpha}{L^2})$. Next, we use the Banach contraction-mapping fixed point principle for finding the unique fixed point of the contraction mapping $T_\lambda = I - \lambda\mu F$, where F is β-strongly monotone and L-Lipschitz continuous, I is the identity mapping, $\mu \in (0, \frac{2\beta}{L^2})$ and $\lambda \in (0,1]$. The algorithm is presented in detail as follows.

Algorithm 2.6 (Projection algorithm for solving (BVI))

Step 0. Choose $x^0 \in C$, set $k = 0$, and choose a positive sequence $\{\alpha_k\}$ and parameters λ, μ such that

$\left\{\begin{array}{l}0 < \alpha_k \le \min\{1, \frac{1}{\tau}\}, \qquad \tau = 1 - \sqrt{1 - \mu(2\beta - \mu L^2)},\\ \lim_{k\to\infty}\alpha_k = 0, \qquad \lim_{k\to\infty}\big|\frac{1}{\alpha_{k+1}} - \frac{1}{\alpha_k}\big| = 0,\\ \sum_{k=0}^{\infty}\alpha_k = \infty, \qquad 0 < \lambda \le 2\eta, \qquad 0 < \mu < \frac{2\beta}{L^2}.\end{array}\right.$
(2.1)

Step 1. Compute

$\left\{\begin{array}{l}y^k := Pr_C(x^k - \lambda G(x^k)),\\ x^{k+1} = y^k - \mu\alpha_k F(y^k).\end{array}\right.$

Update $k:=k+1$, and go to Step 1.
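A minimal sketch of Algorithm 2.6 on a toy instance (all choices below are ours for illustration: C is a box so that $Pr_C$ is a clip, $\alpha_k = 1/(k+2)$ is one admissible sequence, and F, G are simple affine maps satisfying (A1)-(A2) with $\beta = L = \eta = 1$):

```python
import numpy as np

def proj_box(x, lo, hi):
    # Pr_C for the box C = [lo, hi]^n (an assumed feasible set for this sketch)
    return np.clip(x, lo, hi)

def bvi_projection_method(F, G, x0, lam, mu, n_iters=2000):
    """Sketch of Algorithm 2.6 with the admissible choice alpha_k = 1/(k+2)."""
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        alpha = 1.0 / (k + 2)
        y = proj_box(x - lam * G(x), -1.0, 1.0)   # projected gradient step
        x = y - mu * alpha * F(y)                 # damped fixed point step
    return x

# Toy instance: C = [-1,1]^2, G(x) = x - c (1-inverse strongly monotone),
# F(x) = x - d (1-strongly monotone, 1-Lipschitz). Here Sol(G,C) = {Pr_C(c)}
# is a singleton, so the (BVI) solution is Pr_C(c) = [0.3, 1.0] itself.
c, d = np.array([0.3, 2.0]), np.array([1.0, 1.0])
x_star = bvi_projection_method(lambda x: x - d, lambda x: x - c,
                               np.zeros(2), lam=1.0, mu=1.0)
```

Note that only one projection onto C is needed per iteration; no auxiliary variational inequality is solved.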

Note that in the case $F(x) = 0$ for all $x \in C$, Algorithm 2.6 reduces to the projected gradient algorithm:

$x^{k+1} := Pr_C(x^k - \lambda G(x^k)).$

3 Convergence results

In this section, we state and prove our main results.

Theorem 3.1 Let C be a nonempty closed convex subset of a real Hilbert space $\mathcal{H}$. Let the two mappings $F : C \to \mathcal{H}$ and $G : \mathcal{H} \to \mathcal{H}$ satisfy assumptions (A1)-(A3). Then the sequences $(x^k)$ and $(y^k)$ generated by Algorithm 2.6 converge strongly to the same point $x^* \in \Omega$.

Proof Under conditions (2.1), we consider the mapping $S_k : \mathcal{H} \to \mathcal{H}$ defined by

$S_k(x) := Pr_C(x - \lambda G(x)) - \mu\alpha_k F\big[Pr_C(x - \lambda G(x))\big], \quad \forall x \in \mathcal{H}.$

Using Property 2.2(a), the η-inverse strong monotonicity of G and conditions (2.1), for each $x, y \in \mathcal{H}$ we have

$\begin{array}{rcl}\|Pr_C(x - \lambda G(x)) - Pr_C(y - \lambda G(y))\|^2 &\le& \|x - \lambda G(x) - y + \lambda G(y)\|^2\\ &=& \|x - y\|^2 + \lambda^2\|G(x) - G(y)\|^2 - 2\lambda\langle x - y, G(x) - G(y)\rangle\\ &\le& \|x - y\|^2 + \lambda(\lambda - 2\eta)\|G(x) - G(y)\|^2\\ &\le& \|x - y\|^2.\end{array}$
(3.1)

Combining this and Lemma 2.3, we get

$\begin{array}{rl}\|S_k(x) - S_k(y)\| =& \big\|Pr_C(x - \lambda G(x)) - \mu\alpha_k F[Pr_C(x - \lambda G(x))] - Pr_C(y - \lambda G(y))\\ & {}+ \mu\alpha_k F[Pr_C(y - \lambda G(y))]\big\|\\ \le& (1 - \alpha_k\tau)\|Pr_C(x - \lambda G(x)) - Pr_C(y - \lambda G(y))\|\\ \le& (1 - \alpha_k\tau)\|x - y\|,\end{array}$
(3.2)

where $\tau := 1 - \sqrt{1 - \mu(2\beta - \mu L^2)}$. Thus $S_k$ is a contraction on $\mathcal{H}$. By the Banach contraction principle, there is a unique fixed point $\xi^k$ such that $S_k(\xi^k) = \xi^k$. For a fixed $\hat{x} \in Sol(G,C)$, set

$\hat{C} := \left\{x \in \mathcal{H} : \|x - \hat{x}\| \le \frac{\mu\|F(\hat{x})\|}{\tau}\right\}.$

From this and Property 2.2(a), we get that the mapping $S_k Pr_{\hat{C}}$ is contractive on $\mathcal{H}$. Hence there exists a unique point $z^k$ such that $S_k[Pr_{\hat{C}}(z^k)] = z^k$. Set $\bar{z}^k = Pr_{\hat{C}}(z^k)$. It follows from (3.2) that

$\begin{array}{rcl}\|z^k - \hat{x}\| &=& \|S_k(\bar{z}^k) - \hat{x}\|\\ &\le& \|S_k(\bar{z}^k) - S_k(\hat{x})\| + \|S_k(\hat{x}) - \hat{x}\|\\ &=& \|S_k(\bar{z}^k) - S_k(\hat{x})\| + \|S_k(\hat{x}) - Pr_C(\hat{x} - \lambda G(\hat{x}))\|\\ &\le& (1 - \alpha_k\tau)\|\bar{z}^k - \hat{x}\| + \mu\alpha_k\big\|F\big[Pr_C(\hat{x} - \lambda G(\hat{x}))\big]\big\|\\ &\le& (1 - \alpha_k\tau)\frac{\mu\|F(\hat{x})\|}{\tau} + \mu\alpha_k\|F(\hat{x})\|\\ &=& \frac{\mu\|F(\hat{x})\|}{\tau}.\end{array}$

Thus $z^k \in \hat{C}$, $S_k[Pr_{\hat{C}}(z^k)] = S_k(z^k) = z^k$ and hence $\xi^k = z^k \in \hat{C}$. Since $\hat{C}$ is bounded, the sequence $(\xi^k)$ is bounded, and therefore there exists a subsequence $(\xi^{k_i})$ of $(\xi^k)$ such that $\xi^{k_i} \rightharpoonup \bar{\xi}$. Combining this and the assumption $\lim_{k\to\infty}\alpha_k = 0$, we get

$\lim_{i\to\infty}\big\|\xi^{k_i} - Pr_C\big(\xi^{k_i} - \lambda G(\xi^{k_i})\big)\big\| = 0.$
(3.3)

It follows from (3.1) that the mapping $Pr_C(\cdot - \lambda G(\cdot))$ is nonexpansive on $\mathcal{H}$. Using Lemma 2.4, (3.3) and $\xi^{k_i} \rightharpoonup \bar{\xi}$, we obtain $Pr_C(\bar{\xi} - \lambda G(\bar{\xi})) = \bar{\xi}$, which implies $\bar{\xi} \in Sol(G,C)$. Now we prove that $\lim_{j\to\infty}\xi^{k_j} = x^* \in \Omega$.

Set $\bar{z}^k = Pr_C(\xi^k - \lambda G(\xi^k))$, $v^* = (\mu F - I)(x^*)$ and $v^k = (\mu F - I)(\bar{z}^k)$, where I is the identity mapping. Since $S_{k_j}(\xi^{k_j}) = \xi^{k_j}$ and $x^* = Pr_C(x^* - \lambda G(x^*))$, we have

$(1 - \alpha_{k_j})(\xi^{k_j} - \bar{z}^{k_j}) + \alpha_{k_j}(\xi^{k_j} + v^{k_j}) = 0$

and

$(1 - \alpha_{k_j})\big[I - Pr_C(\cdot - \lambda G(\cdot))\big](x^*) + \alpha_{k_j}(x^* + v^*) = \alpha_{k_j}(x^* + v^*).$

Then

$\begin{array}{rcl}-\alpha_{k_j}\langle x^* + v^*, \xi^{k_j} - x^*\rangle &=& (1 - \alpha_{k_j})\langle \xi^{k_j} - x^* - (\bar{z}^{k_j} - x^*), \xi^{k_j} - x^*\rangle\\ && {}+ \alpha_{k_j}\langle \xi^{k_j} - x^* + v^{k_j} - v^*, \xi^{k_j} - x^*\rangle.\end{array}$
(3.4)

By the Cauchy-Schwarz inequality, we have

$\begin{array}{rcl}\langle \xi^{k_j} - x^* - (\bar{z}^{k_j} - x^*), \xi^{k_j} - x^*\rangle &\ge& \|\xi^{k_j} - x^*\|^2 - \|\bar{z}^{k_j} - x^*\|\,\|\xi^{k_j} - x^*\|\\ &\ge& \|\xi^{k_j} - x^*\|^2 - \|\xi^{k_j} - x^*\|^2\\ &=& 0\end{array}$
(3.5)

and

$\begin{array}{rcl}\langle \xi^{k_j} - x^* + v^{k_j} - v^*, \xi^{k_j} - x^*\rangle &\ge& \|\xi^{k_j} - x^*\|^2 - \|v^{k_j} - v^*\|\,\|\xi^{k_j} - x^*\|\\ &\ge& \|\xi^{k_j} - x^*\|^2 - (1 - \tau)\|\xi^{k_j} - x^*\|^2\\ &=& \tau\|\xi^{k_j} - x^*\|^2.\end{array}$
(3.6)

Combining (3.4), (3.5) and (3.6), we get

$\begin{array}{rcl}-\tau\|\xi^{k_j} - x^*\|^2 &\ge& \langle x^* + v^*, \xi^{k_j} - x^*\rangle\\ &=& \mu\langle F(x^*), \xi^{k_j} - x^*\rangle\\ &=& \mu\langle F(x^*), \xi^{k_j} - \bar{\xi}\rangle + \mu\langle F(x^*), \bar{\xi} - x^*\rangle\\ &\ge& \mu\langle F(x^*), \xi^{k_j} - \bar{\xi}\rangle.\end{array}$

Then we have

$\tau\|\xi^{k_j} - x^*\|^2 \le \mu\langle F(x^*), \bar{\xi} - \xi^{k_j}\rangle.$

Letting $j \to \infty$ and using $\xi^{k_j} \rightharpoonup \bar{\xi}$, the right-hand side tends to 0, and hence the sequence $\{\xi^{k_j}\}$ converges strongly to $x^*$. Since every weakly convergent subsequence of $(\xi^k)$ has the same strong limit $x^*$, the whole sequence $\{\xi^k\}$ converges strongly to $x^*$.

On the other hand, by using (3.2), we have

$\begin{array}{rcl}\|x^k - \xi^k\| &\le& \|x^k - \xi^{k-1}\| + \|\xi^{k-1} - \xi^k\|\\ &=& \|S_{k-1}(x^{k-1}) - S_{k-1}(\xi^{k-1})\| + \|\xi^{k-1} - \xi^k\|\\ &\le& (1 - \alpha_{k-1}\tau)\|x^{k-1} - \xi^{k-1}\| + \|\xi^{k-1} - \xi^k\|.\end{array}$
(3.7)

Moreover, by Lemma 2.3, we have

$\begin{array}{rl}\|\xi^{k-1} - \xi^k\| =& \|S_{k-1}(\xi^{k-1}) - S_k(\xi^k)\|\\ =& \|(1 - \alpha_k)\bar{z}^k - \alpha_k v^k - (1 - \alpha_{k-1})\bar{z}^{k-1} + \alpha_{k-1}v^{k-1}\|\\ =& \|(1 - \alpha_k)(\bar{z}^k - \bar{z}^{k-1}) - \alpha_k(v^k - v^{k-1}) + (\alpha_{k-1} - \alpha_k)(\bar{z}^{k-1} + v^{k-1})\|\\ \le& (1 - \alpha_k)\|\bar{z}^k - \bar{z}^{k-1}\| + \alpha_k\|v^k - v^{k-1}\| + |\alpha_{k-1} - \alpha_k|\,\mu\|F(\bar{z}^{k-1})\|\\ \le& (1 - \alpha_k)\|\bar{z}^k - \bar{z}^{k-1}\| + \alpha_k\sqrt{1 - \mu(2\beta - \mu L^2)}\,\|\xi^k - \xi^{k-1}\|\\ & {}+ |\alpha_{k-1} - \alpha_k|\,\mu\|F(\bar{z}^{k-1})\|\\ \le& (1 - \alpha_k)\|\xi^k - \xi^{k-1}\| + \alpha_k\sqrt{1 - \mu(2\beta - \mu L^2)}\,\|\xi^k - \xi^{k-1}\|\\ & {}+ |\alpha_{k-1} - \alpha_k|\,\mu\|F(\bar{z}^{k-1})\|.\end{array}$

This implies that

$\alpha_k\tau\|\xi^{k-1} - \xi^k\| \le |\alpha_{k-1} - \alpha_k|\,\mu\|F(\bar{z}^{k-1})\|$

and hence

$\|\xi^k - \xi^{k-1}\| \le \frac{\mu|\alpha_{k-1} - \alpha_k|\,\|F(\bar{z}^{k-1})\|}{\alpha_k\tau}.$

So, we have

$\|x^k - \xi^k\| \le (1 - \alpha_{k-1}\tau)\|x^{k-1} - \xi^{k-1}\| + \frac{\mu|\alpha_{k-1} - \alpha_k|\,\|F(\bar{z}^{k-1})\|}{\alpha_k\tau}.$

Let

$\delta_k := \frac{\mu|\alpha_k - \alpha_{k+1}|\,\|F(\bar{z}^k)\|}{\alpha_k\alpha_{k+1}\tau^2}, \quad k \ge 0.$

Then

$\|x^k - \xi^k\| \le (1 - \alpha_{k-1}\tau)\|x^{k-1} - \xi^{k-1}\| + \alpha_{k-1}\tau\delta_{k-1}, \quad \forall k \ge 1.$

Since $\{F(\bar{z}^k)\}$ is bounded, say $\|F(\bar{z}^k)\| \le K$ for all $k \ge 0$, we have

$\lim_{k\to\infty}\delta_k = \lim_{k\to\infty}\frac{\mu|\alpha_k - \alpha_{k+1}|\,\|F(\bar{z}^k)\|}{\alpha_k\alpha_{k+1}\tau^2} \le \frac{\mu K}{\tau^2}\lim_{k\to\infty}\left|\frac{1}{\alpha_{k+1}} - \frac{1}{\alpha_k}\right| = 0.$

Applying Lemma 2.5, we obtain $\lim_{k\to\infty}\|x^k - \xi^k\| = 0$. Combining this with the strong convergence of $\{\xi^k\}$ to $x^*$, the sequence $\{x^k\}$ also converges strongly to $x^*$, the unique solution of problem (BVI). Finally, since $y^k = Pr_C(x^k - \lambda G(x^k))$ and $x^* = Pr_C(x^* - \lambda G(x^*))$, inequality (3.1) gives $\|y^k - x^*\| \le \|x^k - x^*\| \to 0$, so $\{y^k\}$ converges strongly to $x^*$ as well. □

Now we consider the special case $F(x) = x$ for all $x \in \mathcal{H}$. It is easy to see that F is Lipschitz continuous with constant $L = 1$ and strongly monotone with constant $\beta = 1$ on $\mathcal{H}$. Problem (BVI) then becomes the minimum-norm problem over the solution set of the variational inequalities.

Corollary 3.2 Let C be a nonempty closed convex subset of a real Hilbert space $\mathcal{H}$, and let $G : \mathcal{H} \to \mathcal{H}$ be η-inverse strongly monotone. Let the iteration sequence $(x^k)$ be defined by

$\left\{\begin{array}{l}y^k := Pr_C(x^k - \lambda G(x^k)),\\ x^{k+1} = (1 - \mu\alpha_k)y^k.\end{array}\right.$

The parameters satisfy the following:

$\left\{\begin{array}{l}0 < \alpha_k \le \min\{1, \frac{1}{\tau}\}, \qquad \tau = 1 - |1 - \mu|,\\ \lim_{k\to\infty}\alpha_k = 0, \qquad \lim_{k\to\infty}\big|\frac{1}{\alpha_{k+1}} - \frac{1}{\alpha_k}\big| = 0,\\ \sum_{k=0}^{\infty}\alpha_k = \infty, \qquad 0 < \lambda \le 2\eta, \qquad 0 < \mu < 2.\end{array}\right.$

Then the sequences $\{x^k\}$ and $\{y^k\}$ converge strongly to the same point $\hat{x} = Pr_{Sol(G,C)}(0)$.
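The corollary's iteration is easy to try on a toy problem. The instance below is an assumed illustration (ours, not from the paper): take $G \equiv 0$, which is trivially η-inverse strongly monotone, so $Sol(G,C) = C$ and the scheme should recover the minimum-norm element $Pr_C(0)$.

```python
import numpy as np

# Assumed toy instance: C = [1,2] x [-1,1] and G = 0, so Sol(G,C) = C and the
# minimum-norm solution is Pr_C(0) = [1, 0].
lo, hi = np.array([1.0, -1.0]), np.array([2.0, 1.0])
G = lambda x: np.zeros_like(x)      # trivially eta-inverse strongly monotone
mu, lam = 1.0, 1.0                  # here tau = 1 - |1 - mu| = 1
x = np.zeros(2)
for k in range(2000):
    alpha = 1.0 / (k + 2)           # alpha_k -> 0 and sum alpha_k = infinity
    y = np.clip(x - lam * G(x), lo, hi)   # y^k = Pr_C(x^k - lam*G(x^k))
    x = (1.0 - mu * alpha) * y            # x^{k+1} = (1 - mu*alpha_k) y^k
```

The damping factor $(1 - \mu\alpha_k)$ pulls the iterates toward the origin, which is what selects the minimum-norm element among all solutions of the lower problem.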

References

1. Anh PN, Muu LD, Hien NV, Strodiot JJ: Using the Banach contraction principle to implement the proximal point method for multivalued monotone variational inequalities. J. Optim. Theory Appl. 2005, 124: 285–306. 10.1007/s10957-004-0926-0

2. Baiocchi C, Capelo A: Variational and Quasivariational Inequalities: Applications to Free Boundary Problems. Wiley, New York; 1984.

3. Facchinei F, Pang JS: Finite-Dimensional Variational Inequalities and Complementarity Problems. Springer, New York; 2003.

4. Xu MH, Li M, Yang CC: Neural networks for a class of bi-level variational inequalities. J. Glob. Optim. 2009, 44: 535–552. 10.1007/s10898-008-9355-1

5. Moudafi A: Proximal methods for a class of bilevel monotone equilibrium problems. J. Glob. Optim. 2010, 47: 287–292. 10.1007/s10898-009-9476-1

6. Luo ZQ, Pang JS, Ralph D: Mathematical Programs with Equilibrium Constraints. Cambridge University Press, Cambridge; 1996.

7. Solodov M: An explicit descent method for bilevel convex optimization. J. Convex Anal. 2007, 14: 227–237.

8. Anh PN: An interior-quadratic proximal method for solving monotone generalized variational inequalities. East-West J. Math. 2008, 10: 81–100.

9. Anh PN: An interior proximal method for solving pseudomonotone non-Lipschitzian multivalued variational inequalities. Nonlinear Anal. Forum 2009, 14: 27–42.

10. Anh PN, Muu LD, Strodiot JJ: Generalized projection method for non-Lipschitz multivalued monotone variational inequalities. Acta Math. Vietnam. 2009, 34: 67–79.

11. Daniele P, Giannessi F, Maugeri A: Equilibrium Problems and Variational Models. Kluwer Academic, Dordrecht; 2003.

12. Giannessi F, Maugeri A, Pardalos PM: Equilibrium Problems: Nonsmooth Optimization and Variational Inequality Models. Kluwer Academic, Dordrecht; 2004.

13. Konnov IV: Combined Relaxation Methods for Variational Inequalities. Springer, Berlin; 2000.

14. Yao Y, Marino G, Muglia L: A modified Korpelevich's method convergent to the minimum-norm solution of a variational inequality. Optimization 2012, iFirst: 1–11.

15. Zegeye H, Shahzad N, Yao Y: Minimum-norm solution of variational inequality and fixed point problem in Banach spaces. Optimization 2013. 10.1080/02331934.2013.764522

16. Trujillo-Cortez R, Zlobec S: Bilevel convex programming models. Optimization 2009, 58: 1009–1028. 10.1080/02331930701763330

17. Glackin J, Ecker JG, Kupferschmid M: Solving bilevel linear programs using multiple objective linear programming. J. Optim. Theory Appl. 2009, 140: 197–212. 10.1007/s10957-008-9467-2

18. Sabharwal A, Potter LC: Convexly constrained linear inverse problems: iterative least-squares and regularization. IEEE Trans. Signal Process. 1998, 46: 2345–2352. 10.1109/78.709518

19. Anh PN, Kim JK, Muu LD: An extragradient method for solving bilevel variational inequalities. J. Glob. Optim. 2012, 52: 627–639. 10.1007/s10898-012-9870-y

20. Kalashnikov VV, Kalashnikova NI: Solving two-level variational inequality. J. Glob. Optim. 1996, 8: 289–294. 10.1007/BF00121270

21. Iiduka H: Strong convergence for an iterative method for the triple-hierarchical constrained optimization problem. Nonlinear Anal. 2009, 71: e1292-e1297. 10.1016/j.na.2009.01.133

22. Suzuki T: Strong convergence of Krasnoselskii and Mann's type sequences for one parameter nonexpansive semigroups without Bochner integrals. J. Math. Anal. Appl. 2005, 305: 227–239. 10.1016/j.jmaa.2004.11.017

Acknowledgements

The authors are very grateful to the anonymous referees for their helpful and constructive comments, which helped us greatly in improving the paper.

Author information


Corresponding author

Correspondence to Tran TH Anh.

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

The main idea of this paper was proposed by TTHA. The revision was made by LBL and TVA. All authors read and approved the final manuscript.


Anh, T.T., Long, L.B. & Anh, T.V. A projection method for bilevel variational inequalities. J Inequal Appl 2014, 205 (2014). https://doi.org/10.1186/1029-242X-2014-205