Open Access

Finite-Step Relaxed Hybrid Steepest-Descent Methods for Variational Inequalities

Journal of Inequalities and Applications 2008, 2008:598632

DOI: 10.1155/2008/598632

Received: 22 August 2007

Accepted: 13 March 2008

Published: 8 April 2008


The classical variational inequality problem with a Lipschitzian and strongly monotone operator on a nonempty closed convex subset of a real Hilbert space is studied. A new finite-step relaxed hybrid steepest-descent method for this class of variational inequalities is introduced, and strong convergence of the method is established under suitable assumptions on the algorithm parameters.

1. Introduction

Let $H$ be a real Hilbert space with inner product $\langle\cdot,\cdot\rangle$ and norm $\|\cdot\|$. Let $C$ be a nonempty closed convex subset of $H$ and let $F : C \to H$ be an operator. The classical variational inequality problem $\mathrm{VI}(F, C)$ is to find $u^* \in C$ such that
$$\langle F(u^*), v - u^* \rangle \ge 0 \quad \text{for all } v \in C;$$
it was initially studied by Kinderlehrer and Stampacchia [1]. It is also known that $\mathrm{VI}(F, C)$ is equivalent to the fixed-point equation
$$u^* = P_C\bigl(u^* - \mu F(u^*)\bigr),$$
where $P_C$ is the (nearest-point) projection from $H$ onto $C$, that is, $P_C x = \arg\min_{y \in C} \|x - y\|$ for each $x \in H$, and where $\mu > 0$ is an arbitrarily fixed constant. If $F$ is strongly monotone and Lipschitzian on $C$ and $\mu > 0$ is small enough, then the mapping determined by the right-hand side of this equation is a contraction. Hence the Banach contraction principle guarantees that the Picard iterates converge in norm to the unique solution of $\mathrm{VI}(F, C)$. Such a method is called the projection method. However, Zeng and Yao [2] point out that the fixed-point equation involves the projection $P_C$, which may not be easy to compute, owing to the complexity of the convex set $C$. To reduce this complexity, a class of hybrid steepest-descent methods for solving $\mathrm{VI}(F, C)$ has been introduced and studied recently by many authors (see, e.g., [3, 4]). In particular, Zeng and Yao [2] established a two-step relaxed hybrid steepest-descent method for variational inequalities. A naturally arising question is whether there exists a general relaxed hybrid steepest-descent algorithm with more than two steps for finding approximate solutions of $\mathrm{VI}(F, C)$. Motivated and inspired by recent research in this direction, we introduce the following finite-step relaxed hybrid steepest-descent algorithm for finding approximate solutions of $\mathrm{VI}(F, C)$, aiming to unify the convergence results for this class of methods.
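The projection method described above can be run directly when $P_C$ is cheap. The sketch below is an illustration, not part of the paper: it assumes an affine operator $F(x) = Ax + b$ with $A$ symmetric positive definite (hence strongly monotone and Lipschitzian) and takes $C = [0,1]^n$, a box whose projection is coordinate-wise clipping, so the Picard iterates $x_{k+1} = P_C(x_k - \mu F(x_k))$ are trivial to compute.

```python
import numpy as np

# Illustrative sketch (assumptions: affine F, box constraint set): the
# projection method x_{k+1} = P_C(x_k - mu * F(x_k)) for F(x) = A x + b
# with A symmetric positive definite, over the box C = [0, 1]^n, whose
# projection is coordinate-wise clipping.

def project_box(x, lo=0.0, hi=1.0):
    return np.clip(x, lo, hi)

def projection_method(F, project, x0, mu, tol=1e-10, max_iter=10_000):
    x = x0
    for _ in range(max_iter):
        x_new = project(x - mu * F(x))
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # SPD => F strongly monotone and Lipschitzian
b = rng.standard_normal(n)
F = lambda x: A @ x + b

eta = np.linalg.eigvalsh(A).min()    # strong monotonicity constant
kappa = np.linalg.norm(A, 2)         # Lipschitz constant
mu = eta / kappa**2                  # any mu in (0, 2*eta/kappa**2) gives a contraction

x_star = projection_method(F, project_box, np.zeros(n), mu)
# x_star should satisfy the fixed-point equation x* = P_C(x* - mu * F(x*))
residual = np.linalg.norm(x_star - project_box(x_star - mu * F(x_star)))
```

Here the box makes $P_C$ trivial; the hybrid steepest-descent methods studied in this paper address precisely the situation where $P_C$ is expensive to evaluate.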

Algorithm 1.1

Let $\{\lambda_n^{(i)}\}_{n \ge 1} \subset [0, 1)$ for $i = 1, \dots, N$, and take fixed numbers $\mu$ and $\lambda$. Starting with arbitrarily chosen initial points $x_0^{(1)}, \dots, x_0^{(N)}$, compute the sequences $\{x_n^{(1)}\}, \dots, \{x_n^{(N)}\}$ such that

We will prove a strong convergence result for Algorithm 1.1 under suitable restrictions imposed on the parameters.
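The displayed recursion of Algorithm 1.1 is not reproduced above, so as a hedged illustration the sketch below runs its one-step ancestor, Yamada's hybrid steepest-descent iteration $x_{n+1} = T x_n - \lambda_{n+1} \mu F(T x_n)$ from [3], under illustrative assumptions: $T$ is projection onto the unit ball (a nonexpansive map whose fixed-point set is the constraint set), $F$ is affine and strongly monotone, and $\lambda_n = 1/n$, which satisfies the typical requirements $\lambda_n \to 0$ and $\sum_n \lambda_n = \infty$.

```python
import numpy as np

# Hedged sketch of the one-step hybrid steepest-descent iteration
#   x_{n+1} = T x_n - lam_{n+1} * mu * F(T x_n)
# (Yamada [3]); the finite-step Algorithm 1.1 generalizes this scheme.
# T projects onto the closed unit ball, so Fix(T) = C = unit ball.

def T(x):                            # nonexpansive, Fix(T) = unit ball
    nrm = np.linalg.norm(x)
    return x if nrm <= 1.0 else x / nrm

n = 4
A = np.diag(np.arange(1.0, n + 1))   # SPD: eta = 1 (min eigenvalue), kappa = n
b = np.full(n, 2.0)
F = lambda x: A @ x + b

eta, kappa = 1.0, float(n)
mu = eta / kappa**2                  # mu in (0, 2*eta/kappa**2)

x = np.zeros(n)
for k in range(1, 100_001):
    lam = 1.0 / k                    # lam_n -> 0, sum of lam_n diverges
    y = T(x)
    x = y - lam * mu * F(y)

# Fixed-point residual of the equivalent projection equation at T(x)
y = T(x)
residual = np.linalg.norm(y - T(y - mu * F(y)))
```

With $\lambda_n = 1/n$ convergence is slow, which is one motivation for studying relaxed and multi-step variants.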

2. Preliminaries

The following lemmas will be used in proving the main result of the paper in the next section.

Lemma 2.1 (See [5]).

Let $\{s_n\}$ be a sequence of nonnegative real numbers satisfying the inequality
$$s_{n+1} \le (1 - \alpha_n) s_n + \alpha_n \beta_n + \gamma_n, \quad n \ge 0,$$
where $\{\alpha_n\}$, $\{\beta_n\}$, and $\{\gamma_n\}$ satisfy the following conditions:

(i) $\{\alpha_n\} \subset [0, 1]$ and $\sum_{n=0}^{\infty} \alpha_n = \infty$, or equivalently, $\prod_{n=0}^{\infty} (1 - \alpha_n) = 0$;

(ii) $\limsup_{n \to \infty} \beta_n \le 0$;

(iii) $\gamma_n \ge 0$ for all $n \ge 0$ and $\sum_{n=0}^{\infty} \gamma_n < \infty$.

Then $\lim_{n \to \infty} s_n = 0$.
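A quick numerical illustration of this lemma (a hedged sketch with sample sequences, not part of the paper): take $\alpha_n = 1/(n+1)$ (so $\sum_n \alpha_n = \infty$), $\beta_n = 1/(n+1)$ (so $\limsup_n \beta_n \le 0$), and $\gamma_n = 1/(n+1)^2$ (summable); driving the recursion with equality, the worst case, should still send $s_n$ to $0$.

```python
# Hedged numerical illustration of Xu's lemma with sample sequences:
# alpha_n = 1/(n+1), beta_n = 1/(n+1), gamma_n = 1/(n+1)**2 satisfy
# conditions (i)-(iii), so s_n should tend to 0 even from a large start.

s = 5.0                              # arbitrary nonnegative starting value
for n in range(100_000):
    alpha = 1.0 / (n + 1)
    beta = 1.0 / (n + 1)
    gamma = 1.0 / (n + 1) ** 2
    s = (1.0 - alpha) * s + alpha * beta + gamma   # equality: worst case
```

After $10^5$ steps $s$ is driven far below its starting value, consistent with the conclusion $s_n \to 0$.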


Lemma 2.2 (See [6]).

Demiclosedness principle: assume that $T$ is a nonexpansive self-mapping on a nonempty closed convex subset $C$ of a Hilbert space $H$. If $T$ has a fixed point, then $I - T$ is demiclosed; that is, whenever $\{x_n\}$ is a sequence in $C$ weakly converging to some $x \in C$ and the sequence $\{(I - T)x_n\}$ strongly converges to some $y$, it follows that $(I - T)x = y$. Here $I$ is the identity operator of $H$.

The following lemma is an immediate consequence of the properties of the inner product.

Lemma 2.3.

In a real Hilbert space $H$, there holds the inequality
$$\|x + y\|^2 \le \|x\|^2 + 2\langle y, x + y \rangle \quad \text{for all } x, y \in H.$$

Lemma 2.4.

Let $C$ be a nonempty closed convex subset of $H$. For any $x, y \in H$ and $z \in C$, the following statements hold:

(i) $z = P_C x$ if and only if $\langle x - z, w - z \rangle \le 0$ for all $w \in C$;

(ii) $\|P_C x - P_C y\| \le \|x - y\|$.
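The two standard projection facts around Lemma 2.4, the variational characterization of $P_C$ and its nonexpansiveness, can be sanity-checked numerically. The sketch below (illustrative, not from the paper) uses the closed unit ball, whose projection is $P_C x = x / \max(1, \|x\|)$, and samples random points.

```python
import numpy as np

# Hedged check of two standard projection facts used around Lemma 2.4,
# with C the closed unit ball and P_C x = x / max(1, ||x||):
#   (i)  <x - P_C x, w - P_C x> <= 0 for every w in C,
#   (ii) ||P_C x - P_C y|| <= ||x - y||   (nonexpansiveness).

def P(x):
    return x / max(1.0, np.linalg.norm(x))

rng = np.random.default_rng(1)
viol_i = viol_ii = 0.0
for _ in range(1000):
    x = rng.standard_normal(3) * 3.0
    y = rng.standard_normal(3) * 3.0
    w = P(rng.standard_normal(3) * 3.0)          # an arbitrary point of C
    viol_i = max(viol_i, float(np.dot(x - P(x), w - P(x))))
    viol_ii = max(viol_ii, float(np.linalg.norm(P(x) - P(y)) - np.linalg.norm(x - y)))
```

Both recorded "violations" stay at floating-point noise level, as the exact inequalities predict.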

3. Convergence Theorem

Let $H$ be a real Hilbert space and let $C$ be a nonempty closed convex subset of $H$. Let $F : C \to H$ be an operator such that, for some constants $\kappa > 0$ and $\eta > 0$, $F$ is $\kappa$-Lipschitzian and $\eta$-strongly monotone on $C$; that is, $F$ satisfies the conditions
$$\|F(x) - F(y)\| \le \kappa \|x - y\|, \qquad \langle F(x) - F(y), x - y \rangle \ge \eta \|x - y\|^2 \quad \text{for all } x, y \in C,$$
respectively. Since $F$ is $\eta$-strongly monotone and $\kappa$-Lipschitzian, the variational inequality problem $\mathrm{VI}(F, C)$ has a unique solution $u^* \in C$ (see, e.g., [7]).

Assume that $T : H \to H$ is a nonexpansive mapping with the fixed-point set $\mathrm{Fix}(T) = C$. Note that obviously $\mathrm{Fix}(P_C) = C$. For any given numbers $\lambda \in [0, 1)$ and $\mu \in (0, 2\eta/\kappa^2)$, we define the mapping $T^\lambda : H \to H$ by
$$T^\lambda x := T x - \lambda \mu F(T x) \quad \text{for all } x \in H.$$

Lemma 3.1 (See [3]).

$T^\lambda$ is a contraction provided that $0 < \mu < 2\eta/\kappa^2$ and $0 \le \lambda < 1$. Indeed,
$$\|T^\lambda x - T^\lambda y\| \le (1 - \lambda \tau) \|x - y\| \quad \text{for all } x, y \in H,$$
where $\tau = 1 - \sqrt{1 - \mu(2\eta - \mu\kappa^2)} \in (0, 1]$.
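The contraction estimate in Lemma 3.1 can be checked numerically. The sketch below (an illustration under stated assumptions, not the paper's proof) takes $T$ to be the identity, which is nonexpansive, and $F(x) = Ax + b$ with $A$ symmetric positive definite; it then verifies $\|T^\lambda x - T^\lambda y\| \le (1 - \lambda\tau)\|x - y\|$ over random samples.

```python
import numpy as np

# Hedged numerical check of the contraction bound of Lemma 3.1 with
# T = identity (nonexpansive) and F(x) = A x + b, A symmetric positive
# definite, so T^lam x = x - lam * mu * F(x). The bound to verify is
#   ||T^lam x - T^lam y|| <= (1 - lam * tau) * ||x - y||,
#   tau = 1 - sqrt(1 - mu * (2*eta - mu * kappa**2)).

rng = np.random.default_rng(2)
n = 4
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)              # SPD
b = rng.standard_normal(n)
F = lambda x: A @ x + b

eta = np.linalg.eigvalsh(A).min()    # strong monotonicity constant
kappa = np.linalg.norm(A, 2)         # Lipschitz constant
mu = eta / kappa**2                  # in (0, 2*eta/kappa**2)
tau = 1.0 - np.sqrt(1.0 - mu * (2.0 * eta - mu * kappa**2))

worst = -np.inf                      # largest observed bound violation
for _ in range(500):
    lam = rng.uniform(0.0, 1.0)
    x, y = rng.standard_normal(n), rng.standard_normal(n)
    lhs = np.linalg.norm((x - lam * mu * F(x)) - (y - lam * mu * F(y)))
    worst = max(worst, lhs - (1.0 - lam * tau) * np.linalg.norm(x - y))
```

The bound holds because $\lambda \mapsto \sqrt{1 - 2\lambda\mu\eta + \lambda^2\mu^2\kappa^2}$ is convex, equals $1$ at $\lambda = 0$ and $1 - \tau$ at $\lambda = 1$, and hence lies below the chord $1 - \lambda\tau$.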


We now state and prove the main result of this paper.

Theorem 3.2.

Let $H$ be a real Hilbert space and let $C$ be a nonempty closed convex subset of $H$. Let $F : C \to H$ be an operator such that, for some constants $\kappa > 0$ and $\eta > 0$, $F$ is $\kappa$-Lipschitzian and $\eta$-strongly monotone on $C$. Assume that $T$ is a nonexpansive mapping with the fixed-point set $\mathrm{Fix}(T) = C$, and that the real parameter sequences $\{\lambda_n^{(i)}\}$, $i = 1, \dots, N$, in Algorithm 1.1 satisfy the following conditions:

(i) , for ;

(ii) and for ;

(iii) ;

(iv) , for all .

Then the sequences $\{x_n^{(1)}\}, \dots, \{x_n^{(N)}\}$ generated by Algorithm 1.1 converge strongly to $u^*$, the unique solution of $\mathrm{VI}(F, C)$.


Proof. Since $F$ is $\eta$-strongly monotone, by [7] the problem $\mathrm{VI}(F, C)$ has the unique solution $u^*$. Next we divide the rest of the proof into several steps.

Step 1.

We first show that the sequence $\{x_n^{(i)}\}$ is bounded for each $i$. Indeed, let us denote
where . Moreover, we also have
where , and, for ,
where , and

where .

Thus we obtain
for . In particular,
Hence, substituting (3.8) into (3.3) and using condition (iv), we obtain
By induction, it is easy to see that
where . Indeed, for , from (3.9) we obtain
Suppose that the estimate holds for some $n$. We claim that it also holds for $n + 1$. Indeed,
Therefore the estimate holds for all $n$ and all $i$, and, from (3.8), it follows that

Step 2.

We next show that the differences between consecutive iterates tend to zero. Indeed, by Step 1, $\{x_n^{(i)}\}$ is bounded for each $i$, and hence so are the associated image sequences under $F$ and $T$. Thus, from the parameter conditions, we have, for each $i$,
and so

Step 3.

We show that $\|x_{n+1}^{(i)} - x_n^{(i)}\| \to 0$ as $n \to \infty$. Indeed, we observe that
and, for $i = 1, \dots, N$,
Hence it follows from the inequalities (3.17)–(3.19) that
Substituting (3.19) into (3.20), we then have
We put
Then and

From (ii)–(iv), we obtain as $n \to \infty$. Furthermore, from (i), . By Lemma 2.1, we deduce that as $n \to \infty$.

Step 4.

We show that $\|x_n^{(i)} - T x_n^{(i)}\| \to 0$ as $n \to \infty$. From Steps 2 and 3, we have

as $n \to \infty$.

Step 5.

We show that $\limsup_{n \to \infty} \langle -F(u^*), x_n^{(i)} - u^* \rangle \le 0$ for $i = 1, \dots, N$. Let $\{x_{n_j}\}$ be a subsequence of $\{x_n^{(i)}\}$ such that
$$\lim_{j \to \infty} \langle -F(u^*), x_{n_j} - u^* \rangle = \limsup_{n \to \infty} \langle -F(u^*), x_n^{(i)} - u^* \rangle.$$
Without loss of generality, we may assume that $x_{n_j} \to \bar{x}$ weakly for some $\bar{x}$. By Step 4, we derive $T x_{n_j} \to \bar{x}$ weakly. Moreover, by Lemma 2.2 and Step 4, we have $\bar{x} \in \mathrm{Fix}(T) = C$. Since $u^*$ is the unique solution of $\mathrm{VI}(F, C)$, we obtain
$$\limsup_{n \to \infty} \langle -F(u^*), x_n^{(i)} - u^* \rangle = \langle -F(u^*), \bar{x} - u^* \rangle \le 0.$$
From the proof of Step 2, the same estimate carries over,
for $i = 1, \dots, N$. Then

Step 6.

We finally show that $\{x_n^{(i)}\}$ converges to $u^*$ in norm, and so do the remaining sequences. Indeed, using Lemma 2.3 and (3.7), we get
From (ii), (iii), and Step 5, we have for and for , and the sequence is bounded; by Lemma 2.4, we conclude that

Consequently, from Lemma 2.1, we obtain $\|x_n^{(i)} - u^*\| \to 0$, and hence strong convergence holds for all $i = 1, \dots, N$. This completes the proof.



Acknowledgment

This research was partially supported by Grant no. NSC95-2115-M-039-001 from the National Science Council of Taiwan.

Authors’ Affiliations

Department of Occupational Safety and Health, General Education Center, China Medical University


References

1. Kinderlehrer D, Stampacchia G: An Introduction to Variational Inequalities and Their Applications, Pure and Applied Mathematics, Vol. 88. Academic Press, New York, NY, USA; 1980.
2. Zeng LC, Yao J-C: Two-step relaxed hybrid steepest-descent methods for variational inequalities. To appear in Applied Mathematics and Mechanics.
3. Yamada I: The hybrid steepest-descent method for the variational inequality problem over the intersection of fixed point sets of nonexpansive mappings. In: Butnariu D, Censor Y, Reich S (eds.) Inherently Parallel Algorithms in Feasibility and Optimization and Their Applications, Studies in Computational Mathematics, Vol. 8. North-Holland, Amsterdam, The Netherlands; 2001:473–504.
4. Zeng LC, Wong NC, Yao J-C: Convergence analysis of modified hybrid steepest-descent methods with variable parameters for variational inequalities. Journal of Optimization Theory and Applications 2007, 132(1):51–69. doi:10.1007/s10957-006-9068-x
5. Xu H-K: Iterative algorithms for nonlinear operators. Journal of the London Mathematical Society 2002, 66(1):240–256. doi:10.1112/S0024610702003332
6. Goebel K, Kirk WA: Topics in Metric Fixed Point Theory, Cambridge Studies in Advanced Mathematics, Vol. 28. Cambridge University Press, Cambridge, UK; 1990.
7. Yao J-C: Variational inequalities with generalized monotone operators. Mathematics of Operations Research 1994, 19(3):691–705. doi:10.1287/moor.19.3.691


© Yen-Cherng Lin. 2008

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.