Although deep learning-based approximation algorithms have been applied very successfully to numerous problems, the reasons for their performance are at present not entirely understood from a mathematical point of view. Recently, estimates for the convergence of the overall error have been obtained in the setting of deep supervised learning, but with an extremely slow rate of convergence. In this note, we partially improve on these estimates. More specifically, we show that the depth of the neural network only needs to grow much more slowly in order to obtain the same rate of approximation. The results hold in the case of an arbitrary stochastic optimization algorithm with i.i.d. random initializations.
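The abstract's setting, a stochastic optimization method restarted from i.i.d. random initializations, can be illustrated with a toy sketch (this is not the paper's construction; the network width, learning rate, and target function below are arbitrary choices for illustration): train a small one-hidden-layer ReLU network several times from independent random initializations and keep the run with the smallest empirical risk.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 64).reshape(-1, 1)
y = np.sin(np.pi * x)                        # target function to approximate

def train_once(width=16, steps=500, lr=0.05):
    """One gradient-descent run from an i.i.d. random initialization."""
    # i.i.d. Gaussian initialization of all parameters
    W1 = rng.normal(0, 1, (1, width)); b1 = rng.normal(0, 1, width)
    W2 = rng.normal(0, 1, (width, 1)); b2 = rng.normal(0, 1, 1)
    for _ in range(steps):
        h = np.maximum(x @ W1 + b1, 0.0)     # hidden ReLU layer
        err = (h @ W2 + b2) - y              # residual for the squared loss
        # backpropagation by hand
        gW2 = h.T @ err / len(x); gb2 = err.mean(0)
        dh = (err @ W2.T) * (h > 0)
        gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    h = np.maximum(x @ W1 + b1, 0.0)
    return float(np.mean((h @ W2 + b2 - y) ** 2))

# minimum of the empirical risk over i.i.d. restarts -- the quantity whose
# expectation such overall-error analyses control
best_risk = min(train_once() for _ in range(5))
print(best_risk)
```

Taking the minimum over independent restarts is what makes the random initialization enter the error analysis: the optimization error is controlled in expectation over the restarts rather than for a single run.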
This paper proposes a novel method to compute the diffeomorphic registration of 3D surfaces with point and curve feature landmarks. First, the surfaces are mapped to canonical domains by a curve-constrained harmonic map, in which the landmark curves are straightened to line segments whose positions and inclination angles are determined intrinsically by the surface geometry and its curve landmarks. Then, the canonical domains are registered by aligning the corresponding points and straightened line segments using the dynamic quasiconformal map (DQCM), which introduces combinatorial diagonal switches into the quasiconformal optimization so that the resulting map is diffeomorphic. The end points of the source landmark curves are mapped to their corresponding points on the target surface, while the interior points of the source curves may slide along the corresponding target curves, which gives the surface registration more freedom than point-based registration methods. Experiments on real surfaces with point and curve landmarks demonstrate the efficiency, efficacy and robustness of the proposed method.
The method of moving surfaces is an effective tool to implicitize rational parametric surfaces, and it has been extensively studied in the past two decades. An essential step in surface implicitization using the method of moving surfaces is to compute a
Assessing the influence of individual observations in functional linear models is important and challenging, especially when the observations are subject to missingness. In this paper, we introduce three case-deletion diagnostic measures to identify influential observations in functional linear models when the covariate is functional and observations on the scalar response are subject to nonignorable missingness. The nonignorable missing-data mechanism is modeled via an exponential tilting semiparametric functional model. A semiparametric imputation procedure is developed to mitigate the effects of missing data. Valid estimation of the functional coefficients is based on functional principal component analysis of the imputed dataset. A smoothed bootstrap sampling method is introduced to estimate the diagnostic probability for each proposed diagnostic measure, which helps unveil which observations have a larger influence on estimation and prediction. Simulation studies and a real-data example illustrate the finite-sample performance of the proposed methods.
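The idea behind case-deletion diagnostics can be sketched in the simplest scalar setting (this is only an illustration of the general principle, not the paper's functional-data procedure; the functional case would first reduce the covariate to FPCA scores): Cook's distance measures how much the fitted values change when one observation is deleted, computed here in closed form from the hat matrix of an ordinary linear model with a planted outlier.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 3
# design matrix with intercept; coefficients chosen arbitrarily
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, 2.0, -1.0])
y = X @ beta + rng.normal(scale=0.5, size=n)
y[0] += 5.0                                  # plant an influential outlier

H = X @ np.linalg.inv(X.T @ X) @ X.T         # hat (projection) matrix
resid = y - H @ y                            # ordinary residuals
s2 = resid @ resid / (n - p)                 # residual variance estimate
lev = np.diag(H)                             # leverages h_ii
# Cook's distance: squared change in fit when case i is deleted
cooks = resid**2 / (p * s2) * lev / (1 - lev) ** 2

print(int(np.argmax(cooks)))                 # the planted case stands out
```

The closed-form expression avoids refitting the model n times; the same leave-one-out logic underlies the three functional diagnostic measures, with the smoothed bootstrap then calibrating how large a diagnostic value must be to flag a case as influential.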
In this paper, the structure of finite groups in which maximal subgroups of some Sylow subgroups have a