
Some new bounds for the minimum eigenvalue of the Hadamard product of an M-matrix and an inverse M-matrix

Abstract

Let A and B be nonsingular M-matrices. Several new lower bounds on the minimum eigenvalue of the Hadamard product of B and the inverse matrix of A are given. These bounds considerably improve some previous results.

MSC: 15A42, 15B34.

1 Introduction

Let $\mathbb{C}^{n\times n}$ ($\mathbb{R}^{n\times n}$) denote the set of all $n\times n$ complex (real) matrices, $A=(a_{ij})\in\mathbb{C}^{n\times n}$, and $N=\{1,2,\ldots,n\}$. We write $A\ge 0$ if $a_{ij}\ge 0$ for all $i,j\in N$. If $A\ge 0$, $A$ is called a nonnegative matrix. The spectral radius of $A$ is denoted by $\rho(A)$.

We denote by $Z_n$ the class of all $n\times n$ real matrices whose off-diagonal entries are nonpositive. A matrix $A=(a_{ij})\in Z_n$ is called a nonsingular M-matrix if there exist a nonnegative matrix $B$ and a nonnegative real number $s$ such that $A=sI-B$ with $s>\rho(B)$, where $I$ is the identity matrix. $M_n$ will be used to denote the set of all $n\times n$ nonsingular M-matrices. Let us denote $\tau(A)=\min\{\operatorname{Re}(\lambda):\lambda\in\sigma(A)\}$, where $\sigma(A)$ denotes the spectrum of $A$.

The Hadamard product of two matrices $A=(a_{ij})\in\mathbb{C}^{n\times n}$ and $B=(b_{ij})\in\mathbb{C}^{n\times n}$ is the matrix $A\circ B=(a_{ij}b_{ij})\in\mathbb{C}^{n\times n}$. If $A,B\in M_n$, then $B\circ A^{-1}$ is also a nonsingular M-matrix (see [1]).
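As a quick numerical illustration (not part of the original paper), here is a minimal NumPy sketch of the Hadamard product $B\circ A^{-1}$ and of $\tau(\cdot)$ computed as the smallest real part of the spectrum; the small matrices used here are hypothetical examples.

```python
import numpy as np

def tau(C):
    """tau(C): the minimum of the real parts of the eigenvalues of C."""
    return np.min(np.linalg.eigvals(C).real)

# Small hypothetical nonsingular M-matrices (off-diagonal entries nonpositive).
A = np.array([[ 4.0, -1.0],
              [-2.0,  5.0]])
B = np.array([[ 3.0, -1.0],
              [-1.0,  2.0]])

hadamard = B * np.linalg.inv(A)   # entrywise (Hadamard) product B o A^{-1}
print(tau(hadamard))              # nonnegative, since B o A^{-1} is again an M-matrix
```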

Let $A=(a_{ij})$ be an $n\times n$ matrix with all diagonal entries nonzero throughout. For $i,j,k\in N$, $j\ne i$, denote

$$R_i=\sum_{j\ne i}|a_{ij}|,\qquad d_i=\frac{R_i}{|a_{ii}|};\qquad r_{ji}=\frac{|a_{ji}|}{|a_{jj}|-\sum_{k\ne j,i}|a_{jk}|},\qquad r_i=\max_{j\ne i}\{r_{ji}\};$$

$$m_{ji}=\frac{|a_{ji}|+\sum_{k\ne j,i}|a_{jk}|\,r_i}{|a_{jj}|},\qquad m_i=\max_{j\ne i}\{m_{ij}\};\qquad u_{ji}=\frac{|a_{ji}|+\sum_{k\ne j,i}|a_{jk}|\,m_{ki}}{|a_{jj}|},\qquad u_i=\max_{j\ne i}\{u_{ij}\}.$$
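For readers who wish to experiment with these quantities, the following NumPy sketch (an illustrative assumption of this presentation, not code from the paper; the name `row_quantities` is hypothetical) evaluates $R_i$, $d_i$, $r_{ji}$, $r_i$, $m_{ji}$, $m_i$, $u_{ji}$ and $u_i$ directly from the definitions, assuming $A$ is strictly row diagonally dominant with nonzero diagonal.

```python
import numpy as np

def row_quantities(A):
    """R_i, d_i, r_{ji}, r_i, m_{ji}, m_i, u_{ji}, u_i from the definitions above.

    Assumes A has nonzero diagonal and is strictly row diagonally dominant,
    so the denominators |a_jj| - sum_{k != j,i} |a_jk| are positive.
    """
    n = A.shape[0]
    absA = np.abs(A)
    R = absA.sum(axis=1) - np.diag(absA)          # R_i = sum_{j != i} |a_ij|
    d = R / np.diag(absA)                         # d_i = R_i / |a_ii|

    r_ji = np.zeros((n, n))
    for j in range(n):
        for i in range(n):
            if i != j:
                r_ji[j, i] = absA[j, i] / (absA[j, j] - (R[j] - absA[j, i]))
    r = np.array([max(r_ji[j, i] for j in range(n) if j != i) for i in range(n)])

    m_ji = np.zeros((n, n))
    for j in range(n):
        for i in range(n):
            if i != j:
                m_ji[j, i] = (absA[j, i] + (R[j] - absA[j, i]) * r[i]) / absA[j, j]
    m = np.array([max(m_ji[i, j] for j in range(n) if j != i) for i in range(n)])

    u_ji = np.zeros((n, n))
    for j in range(n):
        for i in range(n):
            if i != j:
                s = sum(absA[j, k] * m_ji[k, i] for k in range(n) if k not in (j, i))
                u_ji[j, i] = (absA[j, i] + s) / absA[j, j]
    u = np.array([max(u_ji[i, j] for j in range(n) if j != i) for i in range(n)])

    return R, d, r_ji, r, m_ji, m, u_ji, u
```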

In 2013, Zhou et al. [2] obtained the following result: if $A=(a_{ij})\in M_n$ is a strictly row diagonally dominant matrix, $B=(b_{ij})\in M_n$ and $A^{-1}=(\alpha_{ij})$, then

$$\tau(B\circ A^{-1})\ge\min_{i\in N}\left\{\frac{b_{ii}-m_i\sum_{j\ne i}|b_{ji}|}{a_{ii}}\right\}.\tag{1}$$

In 2013, Cheng et al. [3] presented the following result: if $A=(a_{ij})\in M_n$ and $A^{-1}=(\alpha_{ij})$ is a doubly stochastic matrix, then

$$\tau(A\circ A^{-1})\ge\min_{1\le i\le n}\left\{\frac{a_{ii}-u_i\sum_{j\ne i}|a_{ji}|}{1+\sum_{j\ne i}u_{ji}}\right\}.\tag{2}$$

In this paper, we present some new lower bounds for $\tau(B\circ A^{-1})$ and $\tau(A\circ A^{-1})$, which improve (1) and (2).

2 Main results

In this section, we present our main results. Firstly, we give some lemmas.

Lemma 1 [4]

Let $A=(a_{ij})\in\mathbb{R}^{n\times n}$. If $A$ is a strictly row diagonally dominant matrix, then $A^{-1}=(\alpha_{ij})$ satisfies

$$|\alpha_{ji}|\le d_j|\alpha_{ii}|,\quad j,i\in N,\ j\ne i.$$

Lemma 2 Let $A=(a_{ij})\in\mathbb{R}^{n\times n}$. If $A$ is a strictly row diagonally dominant M-matrix, then $A^{-1}=(\alpha_{ij})$ satisfies

$$\alpha_{ji}\le w_{ji}\alpha_{ii},\quad j,i\in N,\ j\ne i,$$

where

$$w_{ji}=\frac{|a_{ji}|+\sum_{k\ne j,i}|a_{jk}|\,m_{ki}h_i}{|a_{jj}|},\qquad h_i=\max_{j\ne i}\left\{\frac{|a_{ji}|}{|a_{jj}|m_{ji}-\sum_{k\ne j,i}|a_{jk}|\,m_{ki}}\right\}.$$

Proof The proof is similar to that of Lemma 2.2 in [3]. □
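A hedged sketch of how $h_i$, $w_{ji}$ and $w_i=\max_{j\ne i}\{w_{ij}\}$ could be evaluated numerically, reusing the hypothetical `row_quantities` helper from the sketch in Section 1; it assumes the denominators in the definition of $h_i$ are nonzero.

```python
import numpy as np

def w_quantities(A):
    """h_i, w_{ji} and w_i = max_{j != i} w_{ij}, as defined in Lemma 2 and Theorem 1.

    Reuses row_quantities(A) from the earlier sketch; assumes the denominators
    |a_jj| m_{ji} - sum_{k != j,i} |a_jk| m_{ki} appearing in h_i are nonzero.
    """
    n = A.shape[0]
    absA = np.abs(A)
    _, _, _, _, m_ji, _, _, _ = row_quantities(A)

    h = np.zeros(n)
    for i in range(n):
        vals = []
        for j in range(n):
            if j != i:
                s = sum(absA[j, k] * m_ji[k, i] for k in range(n) if k not in (j, i))
                vals.append(absA[j, i] / (absA[j, j] * m_ji[j, i] - s))
        h[i] = max(vals)

    w_ji = np.zeros((n, n))
    for j in range(n):
        for i in range(n):
            if i != j:
                s = sum(absA[j, k] * m_ji[k, i] for k in range(n) if k not in (j, i))
                w_ji[j, i] = (absA[j, i] + s * h[i]) / absA[j, j]
    w = np.array([max(w_ji[i, j] for j in range(n) if j != i) for i in range(n)])
    return h, w_ji, w
```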

Lemma 3 If $A=(a_{ij})\in M_n$ and $A^{-1}=(\alpha_{ij})$ is a doubly stochastic matrix, then

$$\alpha_{ii}\ge\frac{1}{1+\sum_{j\ne i}w_{ji}},\quad i\in N,$$

where $w_{ji}$ is defined as in Lemma 2.

Proof The proof is similar to that of Lemma 3.1 in [3]. □

Lemma 4 [4]

If $A=(a_{ij})\in\mathbb{R}^{n\times n}$ is a strictly row diagonally dominant M-matrix, then $A^{-1}=(\alpha_{ij})$ satisfies

$$\alpha_{ii}\ge\frac{1}{a_{ii}},\quad i\in N.$$

Lemma 5 [5]

If $A=(a_{ij})\in\mathbb{C}^{n\times n}$ and $x_1,x_2,\ldots,x_n$ are positive real numbers, then all the eigenvalues of $A$ lie in the region

$$\bigcup_{i\in N}\left\{z\in\mathbb{C}:|z-a_{ii}|\le x_i\sum_{k\ne i}\frac{1}{x_k}|a_{ki}|\right\}.$$

Lemma 6 [6]

If $A=(a_{ij})\in\mathbb{C}^{n\times n}$ and $x_1,x_2,\ldots,x_n$ are positive real numbers, then all the eigenvalues of $A$ lie in the region

$$\bigcup_{i,j\in N,\ i\ne j}\left\{z\in\mathbb{C}:|z-a_{ii}|\,|z-a_{jj}|\le\Bigl(x_i\sum_{k\ne i}\frac{1}{x_k}|a_{ki}|\Bigr)\Bigl(x_j\sum_{k\ne j}\frac{1}{x_k}|a_{kj}|\Bigr)\right\}.$$

Theorem 1 If $A=(a_{ij}),B=(b_{ij})\in M_n$ and $A^{-1}=(\alpha_{ij})$, then

$$\tau(B\circ A^{-1})\ge\min_{1\le i\le n}\left\{\frac{b_{ii}-w_i\sum_{j\ne i}|b_{ji}|}{a_{ii}}\right\},\tag{3}$$

where $w_i=\max_{j\ne i}\{w_{ij}\}$ and $w_{ij}$ is defined as in Lemma 2.

Proof It is evident that the result holds with equality for $n=1$.

We next assume that $n\ge 2$.

Since $A$ is a nonsingular M-matrix, there exists a positive diagonal matrix $D$ such that $D^{-1}AD$ is a strictly row diagonally dominant M-matrix, and

$$\tau(B\circ A^{-1})=\tau\bigl(D^{-1}(B\circ A^{-1})D\bigr)=\tau\bigl(B\circ(D^{-1}AD)^{-1}\bigr).$$

Therefore, for convenience and without loss of generality, we assume that A is a strictly row diagonally dominant matrix.

(i) First, we assume that $A$ and $B$ are irreducible matrices. Then, for any $i\in N$, we have $0<w_i<1$. Since $\tau(B\circ A^{-1})$ is an eigenvalue of $B\circ A^{-1}$, by Lemma 2 and Lemma 5 there exists an index $i$ such that

$$\bigl|\tau(B\circ A^{-1})-b_{ii}\alpha_{ii}\bigr|\le w_i\sum_{j\ne i}\frac{1}{w_j}|b_{ji}\alpha_{ji}|\le w_i\sum_{j\ne i}\frac{1}{w_j}|b_{ji}|\,w_{ji}|\alpha_{ii}|\le w_i\sum_{j\ne i}\frac{1}{w_j}|b_{ji}|\,w_j|\alpha_{ii}|=w_i|\alpha_{ii}|\sum_{j\ne i}|b_{ji}|.$$

By Lemma 4, the above inequality and $0\le\tau(B\circ A^{-1})\le b_{ii}\alpha_{ii}$ for any $i\in N$, we obtain

$$\bigl|\tau(B\circ A^{-1})\bigr|\ge b_{ii}\alpha_{ii}-w_i|\alpha_{ii}|\sum_{j\ne i}|b_{ji}|\ge\frac{b_{ii}-w_i\sum_{j\ne i}|b_{ji}|}{a_{ii}}\ge\min_{1\le i\le n}\left\{\frac{b_{ii}-w_i\sum_{j\ne i}|b_{ji}|}{a_{ii}}\right\}.$$

(ii) Now, assume that at least one of $A$ and $B$ is reducible. It is well known that a matrix in $Z_n$ is a nonsingular M-matrix if and only if all its leading principal minors are positive (see [7]). Let $T=(t_{ij})$ denote the $n\times n$ permutation matrix with $t_{12}=t_{23}=\cdots=t_{n-1,n}=t_{n1}=1$ and all remaining entries zero. Then both $A-\epsilon T$ and $B-\epsilon T$ are irreducible nonsingular M-matrices for any positive real number $\epsilon$ chosen small enough that all the leading principal minors of $A-\epsilon T$ and $B-\epsilon T$ remain positive. Substituting $A-\epsilon T$ and $B-\epsilon T$ for $A$ and $B$, respectively, in the previous case and letting $\epsilon\to 0$, the result follows by continuity. □
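As an illustration only, the right-hand side of (3) can be evaluated with a short NumPy sketch that reuses the hypothetical `w_quantities` helper from the sketch after Lemma 2.

```python
import numpy as np

def theorem1_bound(A, B):
    """Right-hand side of (3): min_i (b_ii - w_i * sum_{j != i} |b_ji|) / a_ii."""
    _, _, w = w_quantities(A)                    # w_i from the sketch after Lemma 2
    absB = np.abs(B)
    col_off = absB.sum(axis=0) - np.diag(absB)   # sum_{j != i} |b_ji| for each i
    return np.min((np.diag(B) - w * col_off) / np.diag(A))
```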

From Lemma 3 and Theorem 1, we can easily obtain the following corollaries.

Corollary 1 If $A=(a_{ij}),B=(b_{ij})\in M_n$ and $A^{-1}=(\alpha_{ij})$ is a doubly stochastic matrix, then

$$\tau(B\circ A^{-1})\ge\min_{1\le i\le n}\left\{\frac{b_{ii}-w_i\sum_{j\ne i}|b_{ji}|}{1+\sum_{j\ne i}w_{ji}}\right\}.$$

Corollary 2 If $A=(a_{ij})\in M_n$ and $A^{-1}=(\alpha_{ij})$ is a doubly stochastic matrix, then

$$\tau(A\circ A^{-1})\ge\min_{1\le i\le n}\left\{\frac{a_{ii}-w_i\sum_{j\ne i}|a_{ji}|}{1+\sum_{j\ne i}w_{ji}}\right\}.\tag{4}$$

Remark 1 We next compare (3) with (1) and (4) with (2). Since $m_{ji}h_i\le m_{ji}\le r_i$ and $0\le h_i\le 1$ for all $j,i\in N$, $j\ne i$, we have $w_{ji}\le m_{ji}$, $w_i\le m_i$ and $w_{ji}\le u_{ji}$, $w_i\le u_i$ for any $j,i\in N$, $j\ne i$. Therefore,

$$\tau(B\circ A^{-1})\ge\min_{1\le i\le n}\left\{\frac{b_{ii}-w_i\sum_{j\ne i}|b_{ji}|}{a_{ii}}\right\}\ge\min_{1\le i\le n}\left\{\frac{b_{ii}-m_i\sum_{j\ne i}|b_{ji}|}{a_{ii}}\right\},$$

$$\tau(A\circ A^{-1})\ge\min_{1\le i\le n}\left\{\frac{a_{ii}-w_i\sum_{j\ne i}|a_{ji}|}{1+\sum_{j\ne i}w_{ji}}\right\}\ge\min_{1\le i\le n}\left\{\frac{a_{ii}-u_i\sum_{j\ne i}|a_{ji}|}{1+\sum_{j\ne i}u_{ji}}\right\}.$$

Hence the bound in (3) is at least as large as the bound in (1), and the bound in (4) is at least as large as the bound in (2).

Theorem 2 If $A=(a_{ij}),B=(b_{ij})\in M_n$ and $A^{-1}=(\alpha_{ij})$, then

$$\tau(B\circ A^{-1})\ge\min_{i\ne j}\frac{1}{2}\Bigl\{\alpha_{ii}b_{ii}+\alpha_{jj}b_{jj}-\Bigl[(\alpha_{ii}b_{ii}-\alpha_{jj}b_{jj})^{2}+4\Bigl(w_i\sum_{k\ne i}|b_{ki}|\alpha_{ii}\Bigr)\Bigl(w_j\sum_{k\ne j}|b_{kj}|\alpha_{jj}\Bigr)\Bigr]^{\frac{1}{2}}\Bigr\},$$

where $w_i$ ($i\in N$) is defined as in Theorem 1.

Proof It is evident that the result holds with equality for $n=1$.

We next assume that $n\ge 2$. For convenience and without loss of generality, we assume that $A$ is a strictly row diagonally dominant matrix.

(i) First, we assume that $A$ and $B$ are irreducible matrices. Let $R_j^{\sigma}=\sum_{k\ne j}|a_{jk}|m_{ki}h_i$, $j,i\in N$, $j\ne i$. Then, for any $j,i\in N$, $j\ne i$, we have

$$R_j^{\sigma}=\sum_{k\ne j}|a_{jk}|m_{ki}h_i\le|a_{ji}|+\sum_{k\ne j,i}|a_{jk}|m_{ki}h_i\le R_j<a_{jj}.$$

Therefore, there exists a real number $z_{ji}$ ($0\le z_{ji}\le 1$) such that

$$|a_{ji}|+\sum_{k\ne j,i}|a_{jk}|m_{ki}h_i=z_{ji}R_j+(1-z_{ji})R_j^{\sigma},\quad j,i\in N,\ j\ne i.$$

Hence,

$$w_{ji}=\frac{z_{ji}R_j+(1-z_{ji})R_j^{\sigma}}{a_{jj}},\quad j\in N.$$

Let $z_j=\max_{i\ne j}z_{ji}$. Obviously, $0<z_j\le 1$ (if $z_j=0$, then $A$ is reducible, which is a contradiction). Let

$$w_j=\max_{i\ne j}\{w_{ji}\}=\frac{z_jR_j+(1-z_j)R_j^{\sigma}}{a_{jj}},\quad j\in N.$$

Since $A$ is irreducible, we have $R_j>0$, $R_j^{\sigma}\ge 0$, and $0<w_j<1$. Let $\tau(B\circ A^{-1})=\lambda$. By Lemma 6, there exist $i_0,j_0\in N$, $i_0\ne j_0$, such that

$$|\lambda-\alpha_{i_0i_0}b_{i_0i_0}|\,|\lambda-\alpha_{j_0j_0}b_{j_0j_0}|\le\Bigl(w_{i_0}\sum_{k\ne i_0}\frac{1}{w_k}|\alpha_{ki_0}b_{ki_0}|\Bigr)\Bigl(w_{j_0}\sum_{k\ne j_0}\frac{1}{w_k}|\alpha_{kj_0}b_{kj_0}|\Bigr).$$

And by Lemma 2, we have

$$\Bigl(w_{i_0}\sum_{k\ne i_0}\frac{1}{w_k}|\alpha_{ki_0}b_{ki_0}|\Bigr)\Bigl(w_{j_0}\sum_{k\ne j_0}\frac{1}{w_k}|\alpha_{kj_0}b_{kj_0}|\Bigr)\le\Bigl(w_{i_0}\sum_{k\ne i_0}|b_{ki_0}|\alpha_{i_0i_0}\Bigr)\Bigl(w_{j_0}\sum_{k\ne j_0}|b_{kj_0}|\alpha_{j_0j_0}\Bigr).$$

Therefore,

$$|\lambda-\alpha_{i_0i_0}b_{i_0i_0}|\,|\lambda-\alpha_{j_0j_0}b_{j_0j_0}|\le\Bigl(w_{i_0}\sum_{k\ne i_0}|b_{ki_0}|\alpha_{i_0i_0}\Bigr)\Bigl(w_{j_0}\sum_{k\ne j_0}|b_{kj_0}|\alpha_{j_0j_0}\Bigr).$$

Furthermore, we obtain

$$\lambda\ge\frac{1}{2}\Bigl\{\alpha_{i_0i_0}b_{i_0i_0}+\alpha_{j_0j_0}b_{j_0j_0}-\Bigl[(\alpha_{i_0i_0}b_{i_0i_0}-\alpha_{j_0j_0}b_{j_0j_0})^{2}+4\Bigl(w_{i_0}\sum_{k\ne i_0}|b_{ki_0}|\alpha_{i_0i_0}\Bigr)\Bigl(w_{j_0}\sum_{k\ne j_0}|b_{kj_0}|\alpha_{j_0j_0}\Bigr)\Bigr]^{\frac{1}{2}}\Bigr\},$$

that is,

$$\begin{aligned}\tau(B\circ A^{-1})&\ge\frac{1}{2}\Bigl\{\alpha_{i_0i_0}b_{i_0i_0}+\alpha_{j_0j_0}b_{j_0j_0}-\Bigl[(\alpha_{i_0i_0}b_{i_0i_0}-\alpha_{j_0j_0}b_{j_0j_0})^{2}+4\Bigl(w_{i_0}\sum_{k\ne i_0}|b_{ki_0}|\alpha_{i_0i_0}\Bigr)\Bigl(w_{j_0}\sum_{k\ne j_0}|b_{kj_0}|\alpha_{j_0j_0}\Bigr)\Bigr]^{\frac{1}{2}}\Bigr\}\\&\ge\min_{i\ne j}\frac{1}{2}\Bigl\{\alpha_{ii}b_{ii}+\alpha_{jj}b_{jj}-\Bigl[(\alpha_{ii}b_{ii}-\alpha_{jj}b_{jj})^{2}+4\Bigl(w_i\sum_{k\ne i}|b_{ki}|\alpha_{ii}\Bigr)\Bigl(w_j\sum_{k\ne j}|b_{kj}|\alpha_{jj}\Bigr)\Bigr]^{\frac{1}{2}}\Bigr\}.\end{aligned}$$

(ii) Now, assume that at least one of $A$ and $B$ is reducible. Substituting $A-\epsilon T$ and $B-\epsilon T$ for $A$ and $B$, respectively, in the previous case and letting $\epsilon\to 0$, the result follows by continuity. □

Corollary 3 If $A=(a_{ij})\in M_n$ and $A^{-1}=(\alpha_{ij})$, then

$$\tau(A\circ A^{-1})\ge\min_{i\ne j}\frac{1}{2}\Bigl\{\alpha_{ii}a_{ii}+\alpha_{jj}a_{jj}-\Bigl[(\alpha_{ii}a_{ii}-\alpha_{jj}a_{jj})^{2}+4\Bigl(w_i\sum_{k\ne i}|a_{ki}|\alpha_{ii}\Bigr)\Bigl(w_j\sum_{k\ne j}|a_{kj}|\alpha_{jj}\Bigr)\Bigr]^{\frac{1}{2}}\Bigr\}.$$
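Similarly, a sketch (again illustrative, reusing the hypothetical `w_quantities` helper from the sketch after Lemma 2) of the pairwise bound in Theorem 2; Corollary 3 corresponds to the call with $B=A$.

```python
import numpy as np
from itertools import combinations

def theorem2_bound(A, B):
    """Right-hand side of the bound in Theorem 2 (Corollary 3 is the case B = A)."""
    n = A.shape[0]
    _, _, w = w_quantities(A)                    # w_i from the sketch after Lemma 2
    alpha = np.linalg.inv(A)
    absB = np.abs(B)
    col_off = absB.sum(axis=0) - np.diag(absB)   # sum_{k != i} |b_ki| for each i
    diag = np.diag(alpha) * np.diag(B)           # alpha_ii * b_ii
    best = np.inf
    for i, j in combinations(range(n), 2):       # the expression is symmetric in i, j
        gi = w[i] * col_off[i] * alpha[i, i]
        gj = w[j] * col_off[j] * alpha[j, j]
        best = min(best, 0.5 * (diag[i] + diag[j]
                   - np.sqrt((diag[i] - diag[j]) ** 2 + 4.0 * gi * gj)))
    return best
```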

Example 1 Let

$$A=\begin{pmatrix}
39 & -16 & -2 & -3 & -2 & -5 & -2 & -3 & -5 & 0\\
-26 & 44 & -2 & -4 & -2 & -1 & 0 & -2 & -3 & -3\\
-1 & -9 & 29 & -3 & -4 & 0 & -5 & -4 & -1 & -1\\
-2 & -3 & -10 & 36 & -12 & 0 & -5 & -1 & -2 & 0\\
0 & -3 & -1 & -9 & 44 & -16 & -3 & -4 & -4 & -3\\
-3 & -4 & -3 & -4 & -12 & 48 & -18 & -1 & 0 & -2\\
-2 & -1 & -4 & -3 & -4 & -16 & 45 & -9 & -4 & -1\\
-1 & -2 & -2 & -2 & -3 & -1 & -5 & 38 & -20 & -1\\
-2 & -1 & 0 & -3 & -4 & -5 & -2 & -10 & 47 & -19\\
-1 & -4 & -4 & -4 & 0 & -3 & -4 & -3 & -7 & 31
\end{pmatrix},$$

$$B=\begin{pmatrix}
90 & -3 & -2 & -7 & -4 & -7 & -6 & -3 & -9 & -3\\
-4 & 100 & -5 & -4 & -8 & -7 & -1 & -9 & -8 & -8\\
-5 & -9 & 62 & -4 & -7 & -9 & -9 & -1 & -4 & -8\\
-8 & -8 & -10 & 99 & 0 & -6 & -8 & -9 & -3 & -6\\
-3 & -8 & -10 & -6 & 62 & -3 & -6 & -7 & -5 & -1\\
-2 & -3 & -5 & -10 & -6 & 55 & -5 & -1 & -3 & -10\\
-8 & -5 & -8 & -8 & -3 & -3 & 52 & -6 & -1 & -4\\
-4 & -5 & -8 & -4 & -1 & -1 & -6 & 57 & -7 & -7\\
-2 & -1 & -6 & -10 & -2 & -6 & -5 & -9 & 86 & -5\\
-5 & -7 & -3 & -9 & -5 & -7 & -9 & -5 & -9 & 72
\end{pmatrix}.$$

It is easy to verify that $A$ and $B$ are nonsingular M-matrices and that $A^{-1}$ is a doubly stochastic matrix.

(i) If we apply Theorem 4.8 of [2], we have

$$\tau(B\circ A^{-1})\ge\min_{1\le i\le n}\left\{\frac{b_{ii}-m_i\sum_{j\ne i}|b_{ji}|}{a_{ii}}\right\}=0.0027.$$

If we apply Theorem 2.4 of [8], we have

$$\tau(B\circ A^{-1})\ge\bigl(1-\rho(J_A)\rho(J_B)\bigr)\min_{1\le i\le n}\frac{b_{ii}}{a_{ii}}=0.3485.$$

But, if we apply Theorem 1, we have

$$\tau(B\circ A^{-1})\ge\min_{1\le i\le n}\left\{\frac{b_{ii}-w_i\sum_{j\ne i}|b_{ji}|}{a_{ii}}\right\}=0.0435.$$

If we apply Corollary 1, we have

$$\tau(B\circ A^{-1})\ge\min_{1\le i\le n}\left\{\frac{b_{ii}-w_i\sum_{j\ne i}|b_{ji}|}{1+\sum_{j\ne i}w_{ji}}\right\}=0.2172.$$

If we apply Theorem 2, we have

$$\tau(B\circ A^{-1})\ge\min_{i\ne j}\frac{1}{2}\Bigl\{\alpha_{ii}b_{ii}+\alpha_{jj}b_{jj}-\Bigl[(\alpha_{ii}b_{ii}-\alpha_{jj}b_{jj})^{2}+4\Bigl(w_i\sum_{k\ne i}|b_{ki}|\alpha_{ii}\Bigr)\Bigl(w_j\sum_{k\ne j}|b_{kj}|\alpha_{jj}\Bigr)\Bigr]^{\frac{1}{2}}\Bigr\}=0.7212.$$
(ii) If we apply Theorem 3.2 of [3], we get

$$\tau(A\circ A^{-1})\ge\min_{1\le i\le n}\left\{\frac{a_{ii}-u_i\sum_{j\ne i}|a_{ji}|}{1+\sum_{j\ne i}u_{ji}}\right\}=0.3269.$$

But, if we apply Corollary 2, we get

$$\tau(A\circ A^{-1})\ge\min_{1\le i\le n}\left\{\frac{a_{ii}-w_i\sum_{j\ne i}|a_{ji}|}{1+\sum_{j\ne i}w_{ji}}\right\}=0.3605.$$

If we apply Corollary 3, we get

$$\tau(A\circ A^{-1})\ge\min_{i\ne j}\frac{1}{2}\Bigl\{\alpha_{ii}a_{ii}+\alpha_{jj}a_{jj}-\Bigl[(\alpha_{ii}a_{ii}-\alpha_{jj}a_{jj})^{2}+4\Bigl(w_i\sum_{k\ne i}|a_{ki}|\alpha_{ii}\Bigr)\Bigl(w_j\sum_{k\ne j}|a_{kj}|\alpha_{jj}\Bigr)\Bigr]^{\frac{1}{2}}\Bigr\}=0.4072.$$
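These reported values can be sanity-checked numerically. The sketch below (an assumption of this presentation, requiring only NumPy) compares the true minimum eigenvalue of $B\circ A^{-1}$ with any list of claimed lower bounds, once $A$ and $B$ from Example 1 are entered as arrays with negative off-diagonal entries.

```python
import numpy as np

def check_lower_bounds(A, B, bounds):
    """Compare the true minimum eigenvalue of B o A^{-1} with claimed lower bounds."""
    tau_true = np.min(np.linalg.eigvals(B * np.linalg.inv(A)).real)
    for name, value in bounds:
        ok = value <= tau_true + 1e-12
        print(f"{name}: bound {value:.4f} <= tau {tau_true:.4f}? {ok}")

# With A and B entered as the 10x10 arrays of Example 1 (negative off-diagonals):
# check_lower_bounds(A, B, [("Theorem 1", 0.0435), ("Corollary 1", 0.2172), ("Theorem 2", 0.7212)])
# check_lower_bounds(A, A, [("Corollary 2", 0.3605), ("Corollary 3", 0.4072)])
```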

References

  1. Fiedler M, Markham T: An inequality for the Hadamard product of an M-matrix and inverse M-matrix. Linear Algebra Appl. 1988, 101: 1–8.


  2. Zhou DM, Chen GL, Wu GX, Zhang XY: On some new bounds for eigenvalues of the Hadamard product and the Fan product of matrices. Linear Algebra Appl. 2013, 438: 1415–1426. 10.1016/j.laa.2012.09.013


  3. Cheng GH, Tan Q, Wang ZD: Some inequalities for the minimum eigenvalue of the Hadamard product of an M-matrix and its inverse. J. Inequal. Appl. 2013, 2013(65):1–9.


  4. Yong XR, Wang Z: On a conjecture of Fiedler and Markham. Linear Algebra Appl. 1999, 288: 259–267.


  5. Varga RS: Minimal Gerschgorin sets. Pac. J. Math. 1965, 15: 719–729. 10.2140/pjm.1965.15.719


  6. Horn RA, Johnson CR: Matrix Analysis. Cambridge University Press, Cambridge; 1985.


  7. Berman A, Plemmons RJ: Nonnegative Matrices in the Mathematical Sciences. SIAM, Philadelphia; 1994.


  8. Zhou DM, Chen GL, Wu GX, Zhang XY: Some inequalities for the Hadamard product of an M-matrix and an inverse M-matrix. J. Inequal. Appl. 2013, 2013(16):1–10.



Acknowledgements

The authors are very indebted to the referees for their valuable comments and corrections, which improved the original manuscript of this paper. This work was supported by the National Natural Science Foundation of China (11361074), IRTSTYN and Foundation of Yunnan University (2012CG017).

Author information


Corresponding author

Correspondence to Yao-tang Li.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

All authors contributed equally to this work. All authors read and approved the final manuscript.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Li, Yt., Wang, F., Li, Cq. et al. Some new bounds for the minimum eigenvalue of the Hadamard product of an M-matrix and an inverse M-matrix. J Inequal Appl 2013, 480 (2013). https://doi.org/10.1186/1029-242X-2013-480

