Nonnegative tensor (matrix) factorization finds an increasing number of applications in various disciplines, including machine learning, data mining, and blind source separation. Computationally, the optimization problem involved is solved by alternately minimizing over one factor while the others are held fixed. To solve each subproblem efficiently, we first introduce a variable regularization term that keeps the subproblem well conditioned. Second, an augmented Lagrangian alternating direction method is employed to solve this convex, well-conditioned regularized subproblem, and two acceleration techniques are also implemented. Preliminary numerical experiments are presented to show the improvements achieved by the new method.
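For rough orientation, the following sketch illustrates the matrix case: alternating minimization where each factor update solves a regularized least-squares system and projects onto the nonnegative orthant. The function and parameter names (`regularized_anls_nmf`, `mu`) are hypothetical, the regularization weight is fixed rather than variable, and the simple solve-and-project step merely stands in for the augmented Lagrangian alternating direction subproblem solver described in the abstract.

```python
import numpy as np

def regularized_anls_nmf(X, r, mu=1e-3, iters=50, seed=0):
    """Sketch of alternating minimization for X ~ W @ H with W, H >= 0.
    A proximal term (mu/2)*||.-previous iterate||_F^2 is added to each
    subproblem so the normal-equation matrix stays well conditioned.
    The solve-then-project step below is a simplification, not the
    ADMM solver of the paper."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(iters):
        # Update H: regularized least squares, then project onto H >= 0.
        G = W.T @ W + mu * np.eye(r)      # mu*I keeps G invertible
        H = np.maximum(np.linalg.solve(G, W.T @ X + mu * H), 0.0)
        # Update W: same step applied to the transposed problem.
        G = H @ H.T + mu * np.eye(r)
        W = np.maximum(np.linalg.solve(G, H @ X.T + mu * W.T), 0.0).T
    return W, H
```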
Nonnegative tensor decomposition allows us to analyze data in their 'native' form and to present results as a sum of rank-1 tensors that does not nullify any part of the factors. In this paper, we propose a geometric structure of a basis-vector frame for sum-of-rank-1 decompositions of real-valued nonnegative tensors. The proposed decomposition reinterprets the orthogonality property of the singular vectors of matrices as a geometric constraint on the rank-1 matrix bases, which leads to a geometrically constrained singular-vector frame. Relaxing the orthogonality requirement, we develop a set of structured bases that can be used to decompose any tensor into a similarly constrained sum-of-rank-1 form. The proposed approach is essentially a reparametrization and yields an upper bound on the tensor rank. We first describe the general case of tensor decomposition and then extend it to its nonnegative form. Finally, we present numerical results that conform to the proposed tensor model and apply it to nonnegative data decomposition.
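For concreteness, a minimal sketch of the sum-of-rank-1 (CP-style) form that such decompositions produce for a third-order tensor; the geometric basis-frame constraints proposed in the paper are not modeled here, and the factor matrices are arbitrary nonnegative placeholders.

```python
import numpy as np

def rank1_sum(A, B, C):
    """Reconstruct a third-order tensor as a sum of rank-1 terms
    T = sum_r A[:, r] (outer) B[:, r] (outer) C[:, r].
    Illustrates the sum-of-rank-1 form only; the constrained
    singular-vector frame of the paper is not encoded here."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Nonnegative factors give a nonnegative sum-of-rank-1 tensor.
rng = np.random.default_rng(0)
A, B, C = (rng.random((d, 3)) for d in (4, 5, 6))
T = rank1_sum(A, B, C)
assert T.shape == (4, 5, 6) and T.min() >= 0
```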
Real rectangular tensors arise from the strong ellipticity condition problem in solid mechanics and the entanglement problem in quantum physics. In this paper, we study the singular value/vector problem of real nonnegative partially symmetric rectangular tensors. We first introduce the concept of l^{k,s}-singular values/vectors of real partially symmetric rectangular tensors. Then, based on the presented properties of l^{k,s}-singular values/vectors, some properties of the related l^{k,s}-spectral radius are discussed. Furthermore, we prove analogs of the Perron-Frobenius theorem and the weak Perron-Frobenius theorem for real nonnegative partially symmetric rectangular tensors.
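For orientation, the commonly used baseline singular-value equations for a real rectangular tensor are sketched below; this recap follows the standard convention rather than the paper's exact notation, and the stated relation to the l^{k,s} variant is an assumption.

```latex
% Baseline singular-value equations for a (p,q)-th order (m x n)-dimensional
% real rectangular tensor A = (a_{i_1 \cdots i_p j_1 \cdots j_q}), N = p + q:
\begin{align*}
\bigl(\mathcal{A}\, x^{p-1} y^{q}\bigr)_i
  &= \sum_{i_2,\dots,i_p=1}^{m}\ \sum_{j_1,\dots,j_q=1}^{n}
     a_{i\, i_2 \cdots i_p\, j_1 \cdots j_q}\, x_{i_2}\cdots x_{i_p}\, y_{j_1}\cdots y_{j_q}
   \;=\; \lambda\, x_i^{\,N-1},\\
\bigl(\mathcal{A}\, x^{p} y^{q-1}\bigr)_j
  &= \sum_{i_1,\dots,i_p=1}^{m}\ \sum_{j_2,\dots,j_q=1}^{n}
     a_{i_1 \cdots i_p\, j\, j_2 \cdots j_q}\, x_{i_1}\cdots x_{i_p}\, y_{j_2}\cdots y_{j_q}
   \;=\; \lambda\, y_j^{\,N-1}.
\end{align*}
% Assumption: the l^{k,s} variant studied in the paper generalizes the
% exponents N-1 on the right-hand sides via the parameters k and s.
```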
The signless Laplacian tensor and its H-eigenvalues for an even uniform hypergraph are introduced in this paper, and some of their fundamental properties are established. In particular, the smallest and the largest H-eigenvalues of the signless Laplacian tensor of an even uniform hypergraph are discussed, and their relationships to hypergraph bipartition, minimum degree, and maximum degree are described. As an application, bounds on the edge cut and the edge connectivity of the hypergraph in terms of the smallest and the largest H-eigenvalues are presented.
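As a small illustration, the signless Laplacian tensor Q = D + A of a k-uniform hypergraph can be assembled as a dense array under the usual convention that the adjacency tensor assigns 1/(k-1)! to every permutation of each edge and D carries the vertex degrees on its diagonal; the convention, function name, and example hypergraph are assumptions for this sketch, not the paper's code.

```python
import numpy as np
from itertools import permutations
from math import factorial

def signless_laplacian_tensor(n, edges, k):
    """Build Q = D + A for a k-uniform hypergraph on vertices 0..n-1.
    A: entry 1/(k-1)! at every permutation of each edge's vertices.
    D: diagonal entry Q[v,...,v] equal to the degree of vertex v."""
    Q = np.zeros((n,) * k)
    for e in edges:
        for p in permutations(e):
            Q[p] = 1.0 / factorial(k - 1)   # adjacency contribution
        for v in e:
            Q[(v,) * k] += 1.0              # degree contribution
    return Q

# Example: a 4-uniform hypergraph with two edges on 5 vertices.
Q = signless_laplacian_tensor(5, [(0, 1, 2, 3), (1, 2, 3, 4)], 4)
```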
Consider the problem of computing the largest eigenvalue of a nonnegative tensor. In this paper, we establish the Q-linear convergence of a power-type algorithm for this problem under a weak irreducibility condition. Moreover, we present a convergent algorithm for calculating the largest eigenvalue of any nonnegative tensor.
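A minimal sketch of a generic power-type iteration of this kind for a nonnegative tensor stored as a dense numpy array is given below; it assumes the iterates stay strictly positive (e.g., a weakly irreducible tensor), and the paper's algorithm and convergence analysis may differ in details such as a diagonal shift that guarantees convergence for every nonnegative tensor.

```python
import numpy as np

def largest_eigenvalue_power(A, tol=1e-10, max_iter=1000):
    """Power-type iteration for the largest H-eigenvalue of a nonnegative
    tensor A of order m, stored as an n x n x ... x n array.  Assumes
    A x^{m-1} remains strictly positive along the iteration."""
    m, n = A.ndim, A.shape[0]
    x = np.ones(n)
    lam = 0.0
    for _ in range(max_iter):
        # y_i = sum over i2..im of A[i, i2, .., im] * x_{i2} * ... * x_{im}
        y = A.copy()
        for _ in range(m - 1):
            y = y @ x                     # contract the last mode with x
        ratios = y / x ** (m - 1)
        lo, hi = ratios.min(), ratios.max()   # bracket the eigenvalue
        lam = 0.5 * (lo + hi)
        if hi - lo < tol:
            break
        x = y ** (1.0 / (m - 1))
        x /= x.sum()                      # renormalize the iterate
    return lam
```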