School of Mathematics and Statistics, Southwest University, Chongqing 400715, China
dongwendxq@sina.com
Abstract
In this paper, optimality conditions for a class of variational inequalities with cone constraints are established by means of image space analysis. By virtue of the nonlinear scalarization function known as the Gerstewitz function, three nonlinear weak separation functions, two nonlinear regular weak separation functions, and a nonlinear strong separation function are introduced. Based on these nonlinear separation functions, some optimality conditions of the weak and strong alternative type are derived for variational inequalities with cone constraints.
Wen DONG, Junrong ZHANG, Yiyun WANG, La HUANG.
Optimality conditions for a class of variational inequalities with cone constraints.
Front. Math. China, 2025, 20(1): 39-53. DOI: 10.3868/s140-DDD-025-0004-x
As is well known, variational inequalities have been widely applied in control theory, equilibrium problems, complementarity problems, fixed-point problems, and game theory, among others. In recent years, the theory and methods of variational inequalities have attracted increasing attention [5, 6, 11]. Castellani and Giannessi [2] first introduced image space analysis, which is a powerful tool for studying constrained extremum problems, classical variational inequalities, and Ky Fan inequalities, and which was further developed in [8]. Image space analysis studies an optimization problem by equivalently expressing it as the infeasibility of a parametric system and then separating two suitable sets in the image space associated with the constrained problem. With the help of image space analysis, Li and Huang [12] studied optimality conditions for variational inequality problems and applied them to traffic equilibrium problems. Mastroeni et al. [14] studied saddle points and gap functions for vector quasi-equilibrium problems with cone constraints. Guu and Li [10] investigated optimality conditions and error bounds for vector quasi-equilibrium problems by means of linear separation functions. Based on image space analysis, Chen et al. [4] studied vector variational-like inequalities with cone constraints and derived optimality conditions and the continuity of gap functions. Xu [15] employed image space analysis to study nonlinear separation for inverse variational inequalities with cone constraints. However, [15] does not successfully transform the variational inequality problem into the infeasibility of a parametric system, nor separate the corresponding sets in its image space, which is the core of studying the problem by means of image space analysis. To resolve this issue, the approach of [15] is improved in this paper.
Some symbols and definitions are reviewed first. The closure, relative interior, interior, and boundary of a set M are denoted by clM, riM, intM and bdM, respectively. Let be a closed convex cone with nonempty interior.
Given a function , consider the following variational inequality with cone constraints: find such that
where is a vector-valued mapping and is a closed convex cone with nonempty interior. In [15], the variational inequality (1.1) is also called the inverse variational inequality.
The following shows the main characteristics of the image in the variational inequality (1.1). Given , define the map
Consider the set
is called the image of the variational inequality (1.1), and is referred to as the image space. Clearly, is a solution to the variational inequality (1.1) if and only if and the generalized system
is infeasible, or equivalently, and .
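For orientation, the standard image-space construction for a variational inequality with a cone constraint of the form g(y) ∈ D can be sketched as follows; the symbols u, v, A, K and H below are illustrative choices following [2, 9, 12], not necessarily the paper's exact notation:

```latex
% Sketch of the standard construction, assuming the feasible set is
% K = \{ y \in X : g(y) \in D \} and x^{*} \in K:
u(y) := \langle F(x^{*}),\, x^{*} - y \rangle, \qquad
v(y) := g(y), \qquad
\mathcal{A}_{x^{*}}(y) := \bigl(u(y),\, v(y)\bigr),
\]
\[
\mathcal{K}_{x^{*}} := \bigl\{ (u, v) :
  u = \langle F(x^{*}), x^{*} - y \rangle,\; v = g(y),\; y \in X \bigr\},
\qquad
\mathcal{H} := \bigl\{ (u, v) : u > 0,\; v \in D \bigr\}.
```

Under this construction, x* solves the variational inequality exactly when H and K are disjoint, which is the set-separation viewpoint exploited in the sequel.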
For a function h defined on a set and , the sets
are called the non-negative level set and the positive level set of the function h, respectively.
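In the usual image-space-analysis notation (a presumed convention, following [9]), for a function h defined on a set Z these two level sets read:

```latex
\operatorname{lev}_{\geq 0} h := \{\, z \in Z : h(z) \geq 0 \,\}, \qquad
\operatorname{lev}_{> 0} h := \{\, z \in Z : h(z) > 0 \,\}.
```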
Definition 1.1 [9] Given a function , the function is called a weak separation function class, denoted as , if it satisfies the following conditions:
(i) ,
(ii) ,
where is a certain parameter set.
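In Giannessi's framework [9], and in the form used by Li and Huang [12], conditions (i)–(ii) for a weak separation function w(·; π) commonly take the following shape; this is a hedged reconstruction, and the exact parameterization in the present paper may differ:

```latex
% One common formulation of (i)-(ii), following [9, 12], where
% \operatorname{lev}_{\geq 0} w(\cdot\,;\pi) := \{ z : w(z;\pi) \geq 0 \}:
\text{(i)}\quad \mathcal{H} \subseteq
  \operatorname{lev}_{\geq 0} w(\cdot\,;\pi)
  \quad \forall\, \pi \in \Pi,
\qquad
\text{(ii)}\quad \bigcap_{\pi \in \Pi}
  \operatorname{lev}_{\geq 0} w(\cdot\,;\pi)
  \subseteq \operatorname{cl} \mathcal{H}.
```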
Definition 1.2 [9] Given a function , the function ω is called a regular weak separation function class, denoted as , if it satisfies the condition: , where П is a certain parameter set.
Definition 1.3 [9] Given a function , the function s is called a strong separation function class, denoted as , if it satisfies the following conditions:
(i)
(ii)
where П is a certain parameter set.
Note 1.1 When , Thus, when , there is
Lemma 1.1 [1] Let Y be a normed space, let , and let E be a convex cone with nonempty interior. Then,
Given , define the Gerstewitz function as .
Proposition 1.1 [3, 7, 13] For any given , , and :
(i)
(ii)
(iii) The function is decreasing on , that is,
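To make the scalarization concrete, assume the standard Gerth–Weidner definition ξ_e(y) = min{t ∈ ℝ : y ∈ te − C} [7]. For the illustrative choice C = ℝⁿ₊ and e = (1, …, 1) (assumptions for this sketch, not the paper's general cone), ξ_e reduces to the componentwise maximum, since y ∈ te − ℝⁿ₊ means y_i ≤ t for every i. A minimal Python sketch:

```python
# Gerstewitz (Gerth-Weidner) function in the illustrative case
# C = R^n_+ and e = (1, ..., 1): here
#   xi_e(y) = min{ t : y_i <= t for all i } = max_i y_i.

def gerstewitz_pos_orthant(y):
    """xi_e(y) for C = R^n_+, e = (1, ..., 1)."""
    return max(y)

# Monotonicity (cf. Proposition 1.1(iii)): y1 <= y2 componentwise
# implies xi_e(y1) <= xi_e(y2).
assert gerstewitz_pos_orthant([0.5, -1.0]) <= gerstewitz_pos_orthant([1.0, 0.0])

# Translation property: xi_e(y + s*e) = xi_e(y) + s.
y, s = [0.5, -1.0], 2.0
assert gerstewitz_pos_orthant([yi + s for yi in y]) == gerstewitz_pos_orthant(y) + s
```

The translation property along the direction e, together with monotonicity with respect to C, is what makes ξ_e a useful nonlinear scalarization.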
2 Separation functions
This section introduces four nonlinear functions and discusses their properties.
where
where
where is a positive real number, and the expansion function is upper semicontinuous with
Proposition 2.1 (i) The nonlinear function is a weak separation function.
(ii) When , the nonlinear function is a regular weak separation function.
(iii) The nonlinear function is a weak separation function.
(iv) If , then the nonlinear function s is a strong separation function.
(v) The nonlinear function is a weak separation function.
(vi) If , then the nonlinear function is a regular weak separation function.
Proof (i) For any , there is . From Proposition 1.1(ii), we know
Thus,
Now, it is required to prove
To prove this, it is sufficient to show that if , then , i.e., when , there exists such that . Let . The following discusses the three cases.
Case 1: If , and , then there exists such that .
Case 2: If , and , then there exists . According to Proposition 1.1(ii),
Case 3: If , and , then there exists . According to Proposition 1.1(ii),
According to Equations (2.5) and (2.6),
(ii) From the proof of (i), it is known that
Now, it is required to prove
If , there exists such that . Let . The following discusses the three cases:
Case 1: If , and , assuming , .
Case 2: If , and , assuming . According to Proposition 1.1(ii), , and
Case 3: If , and , we can assume . According to Proposition 1.1(ii), , and
From Equations (2.7) and (2.8), it is concluded that
(iii) For any and , there is . Moreover, according to Proposition 1.1(ii), it is known that , which implies
Now, it is required to prove
In fact, let . According to Proposition 1.1(i), it is known
Furthermore, from and Equation (2.11), it is concluded that Equation (2.10) holds. Therefore, according to Equations (2.9) and (2.10),
(iv) From the proof of (iii), it is known that
Now, it is required to prove
First, it is required to prove
In fact, take any . Then there exists such that According to Proposition 1.1(i), Then, by Lemma 1.1, there is Therefore, Equation (2.14) holds.
Next, it is required to prove
In fact, let From Proposition 1.1(i), it is known that Thus,
From Equations (2.14) and (2.15), it is concluded that Equation (2.13) holds. Therefore, based on Equations (2.12) and (2.13),
(v) For any , and , there is . From Proposition 1.1(ii), it is known that and , and it is deduced that
which implies
Now, it is required to prove
To do this, if , there exists such that . Let . The following discusses the three cases:
Case 1: If , and , let Since , it follows that for all . Also, since and , there is
Therefore, it is obtained that .
Case 2: If , and , let . From Proposition 1.1(ii), it is known that Furthermore, by Proposition 1.1(iii) and since for all , it is deduced that
Case 3: If , and , let . Based on Proposition 1.1(ii), it is known that Additionally, by Proposition 1.1(iii) and since for all , there is
From Equations (2.16) and (2.17), it is concluded that
(vi) The proof for (vi) is similar to (v). □
Note 2.1 (i) From the definitions of and , it is known that . In fact, by Proposition 1.1(iii) and since for all , there is
(ii) Let , and define
where , and . Then the function is a regular weak separation function.
3 Optimality conditions
This section establishes some theorems of the weak and strong alternative, and discusses the optimality conditions for the variational inequality (1.1).
Theorem 3.1 (i) System (1.2) with respect to the variable y is not simultaneously feasible with the system
(ii) System (1.2) with respect to the variable y is not simultaneously feasible with the system
(iii) System (1.2) with respect to the variable y is not simultaneously feasible with the system
(iv) System (1.2) with respect to the variable y is not simultaneously feasible with the system
Proof (i) Assume that System (1.2) is feasible. Then, there exists such that From Proposition 2.1(i), it is known that
Thus, System (3.1) is infeasible.
Conversely, assume that the system (3.1) is feasible. Then,
That is,
which implies From Proposition 2.1(i), it is known that , which means that for any , Hence, the system (1.2) with respect to the variable y is infeasible.
(ii) By combining Proposition 2.1(iii) and the proof of part (i), the conclusion follows easily.
(iii) By combining Proposition 2.1(v) and the proof of part (i), the conclusion follows easily.
(iv) Assume that System (1.2) is infeasible. Then . Therefore, from Proposition 2.1(iv), for any , there exists such that , i.e.,
Thus, System (3.4) is feasible.
Conversely, assume that System (3.4) is infeasible. Then, for any , there exists such that
i.e., . From Proposition 2.1(iv), it is known that . Therefore, System (1.2) with respect to the variable y is feasible. □
Theorem 3.2 (i) System (1.2) with respect to the variable y is not simultaneously feasible with the system
(ii) System (1.2) with respect to the variable y is not simultaneously feasible with the system
Proof The proof of this theorem is similar to the proof of Theorem 3.1(i). □
According to Theorems 3.1 and 3.2, the necessary and sufficient optimality conditions for the variational inequality (1.1) are derived.
Theorem 3.3 (i) Let and there exist such that
Then x* is a solution to the variational inequality (1.1).
(ii) Let and there exist such that
Then x* is a solution to the variational inequality (1.1).
(iii) If x* is a solution to the variational inequality (1.1) and , then there exists such that
Proof By combining Theorem 3.1 and Theorem 3.2, the conclusion can be drawn. □
The following example illustrates the applicability of Theorem 3.3.
Example 3.1 Let , and . Define
, so that .
Let , then and . Let . By the definition of and Note 2.1(i), it is known that
Thus, Equations (3.7) and (3.8) hold. By calculation, is a solution to the variational inequality (1.1).
Let , then
Based on Proposition 1.1(ii), it is known
which implies that Equation (3.9) holds.
Using the nonlinear regular weak separation functions (3.1) and (3.4), the necessary and sufficient optimality conditions for the variational inequality (1.1) can be obtained.
Theorem 3.4 Let . Then is a solution to the variational inequality (1.1) if and only if and
or
Proof We only need to prove Equation (3.10); the proof for Equation (3.11) is similar. Let
First, it is required to prove
In fact, by Note 2.1(ii), it is known that is a regular weak separation function, so the first equality in Equation (3.12) holds. The second inclusion is obviously valid. Now it is required to prove
For any , by Proposition 1.1(ii) and , there is
This implies that
From Equation (3.12), it is known that
If is a solution to the variational inequality (1.1), then . According to Equation (3.13), there is
which means
In particular, when , there is
On the other hand, since , there is
Thus, there is
By combining this with Equation (3.14), it is concluded that Equation (3.10) holds.
Conversely, assume that Equation (3.10) holds. Then, there is
At the same time, from Equation (3.12), it is known that
Therefore, , and since , it is concluded that is a solution to the variational inequality (1.1). □
Next, the definition of nonlinear separation is given in order to further investigate the optimality conditions for the variational inequality (1.1).
Definition 3.1 The sets and satisfy nonlinear separation if and only if there exist such that
Additionally, if , the separation is called regular.
Proposition 3.1 If and satisfy regular nonlinear separation and , then is a solution to the variational inequality (1.1).
Proof The proof is similar to that of Proposition 4.1 in [15]. □
Assume . Define the generalized Lagrange function as
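In image space analysis the generalized Lagrange function is commonly built from a separation function evaluated along the image map; the following is a hedged sketch in that spirit, following [9, 12], and not necessarily the paper's exact definition (sign conventions also vary):

```latex
% One common choice: w is a (weak) separation function with
% parameter \pi \in \Pi, evaluated along the image map:
\mathcal{L}(y;\, \pi) := w\bigl(\langle F(x^{*}),\, x^{*} - y \rangle,\;
  g(y);\; \pi\bigr),
\]
\[
% and the generic two-sided saddle-point condition then reads
\mathcal{L}(y^{*};\, \pi) \;\le\; \mathcal{L}(y^{*};\, \pi^{*})
  \;\le\; \mathcal{L}(y;\, \pi^{*}),
\qquad \forall\, y \in X,\ \forall\, \pi \in \Pi.
```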
Theorem 3.5 Let . The following two statements are equivalent:
(i) and satisfy nonlinear separation.
(ii) There exist such that is a saddle point of on , i.e.,
Proof (i) ⇒ (ii). Assume that and satisfy nonlinear separation. Then there exist such that Equation (3.15) holds. In Equation (3.15), let . Then
Since , there is , and thus
Therefore,
Since
there is
Next, it is required to prove the second inequality of Equation (3.17). In fact, since Equation (3.15) holds, there is
(ii) ⇒ (i). Assume that is a saddle point of on , i.e., for all , and ,
In the first inequality of Equation (3.18), let . Then
From the second inequality of Equation (3.18), it is obtained
Therefore, Equation (3.15) holds, and and satisfy nonlinear separation. □
Corollary 3.1 Let and suppose there exist such that is a saddle point of on . Then is a solution to the variational inequality (1.1).
Proof By combining Theorem 3.5 and Proposition 3.1, the conclusion follows. □
[1]
Breckner, W. W., Kassay, G. A systematization of convexity concepts for sets and functions. J. Convex Anal. 1997; 4(1): 109–127
[2]
Castellani, G., Giannessi, F. Decomposition of mathematical programs by means of theorems of alternative for linear and nonlinear systems. In: Survey of Mathematical Programming (Proc. Ninth Internat. Math. Programming Sympos., Budapest, 1976), Vol. 2. Amsterdam: North-Holland, 1979, 423–439
[3]
Chen, G. Y., Huang, X. X., Yang, X. Q. Vector Optimization: Set-Valued and Variational Analysis. Lecture Notes in Economics and Math. Systems, Vol. 541. Berlin: Springer-Verlag, 2005
[4]
Chen, J. W., Li, S. J., Wan, Z. P., Yao, J. C. Vector variational-like inequalities with constraints: separation and alternative. J. Optim. Theory Appl. 2015; 166(2): 460–479
[5]
Facchinei, F., Pang, J. S. Finite-Dimensional Variational Inequalities and Complementarity Problems. Springer Ser. Oper. Res. Financ. Eng. New York: Springer-Verlag, 2003
[6]
Ferris, M. C., Pang, J. S. Engineering and economic applications of complementarity problems. SIAM Rev. 1997; 39(4): 669–713
[7]
Gerth, C., Weidner, P. Nonconvex separation theorems and some applications in vector optimization. J. Optim. Theory Appl. 1990; 67(2): 297–320
[8]
Giannessi, F. Theorems of the alternative and optimality conditions. J. Optim. Theory Appl. 1984; 42(3): 331–365
[9]
Giannessi, F. Constrained Optimization and Image Space Analysis, Vol. 1: Separation of Sets and Optimality Conditions. Mathematical Concepts and Methods in Science and Engineering, Vol. 49. New York: Springer Science+Business Media, 2005
[10]
Guu, S. M., Li, J. Vector quasi-equilibrium problems: separation, saddle points and error bounds for the solution set. J. Global Optim. 2014; 58(4): 751–767
[11]
Harker, P. T., Pang, J. S. Finite-dimensional variational inequality and nonlinear complementarity problems: a survey of theory, algorithms and applications. Math. Program. 1990; 48: 161–220
[12]
Li, J., Huang, N. J. Image space analysis for variational inequalities with cone constraints and applications to traffic equilibria. Sci. China Math. 2012; 55(4): 851–868
[13]
Luc, D. T. Theory of Vector Optimization. Lecture Notes in Economics and Math. Systems, Vol. 319. Berlin: Springer-Verlag, 1989
[14]
Mastroeni, G., Panicucci, B., Passacantando, M., Yao, J. C. A separation approach to vector quasi-equilibrium problems: saddle point and gap function. Taiwanese J. Math. 2009; 13(2): 657–673