Estimation of Time-Varying Parameters in Statistical Models: An Optimization Approach
Authors: Dimitris Bertsimas, David Gamarnik, John N. Tsitsiklis
Abstract
We propose a convex optimization approach to solving the nonparametric regression estimation problem when the underlying regression function is Lipschitz continuous. This approach is based on the minimization of the sum of empirical squared errors, subject to the constraints implied by Lipschitz continuity. The resulting optimization problem has a convex objective function and linear constraints, and as a result, is efficiently solvable. The estimated function computed by this technique is proven to converge to the underlying regression function uniformly and almost surely as the sample size grows to infinity, thus providing a very strong form of consistency. We also propose a convex optimization approach to the maximum likelihood estimation of unknown parameters in statistical models, where the parameters depend continuously on some observable input variables. For a number of classical distributional forms, the objective function in the underlying optimization problem is convex and the constraints are linear. These problems are, therefore, also efficiently solvable.
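The sketch below illustrates the first formulation described in the abstract: a least-squares fit of the function values at the sample points, subject to the linear constraints implied by Lipschitz continuity. It is a minimal illustration only, assuming one-dimensional inputs, a known Lipschitz constant L, and the use of cvxpy as the solver; the data, variable names, and solver choice are assumptions of this sketch, not the authors' implementation.

```python
# Minimal sketch of a Lipschitz-constrained least-squares estimator.
# Assumptions: 1-D inputs, known Lipschitz constant L, cvxpy as solver.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

# Synthetic data: noisy observations of a Lipschitz regression function.
n = 100
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(n)

L = 2 * np.pi  # assumed Lipschitz constant of the underlying function

# Decision variables: fitted values f_i = f(x_i) at the sample points.
f = cp.Variable(n)

# Objective: sum of empirical squared errors (convex quadratic).
objective = cp.Minimize(cp.sum_squares(f - y))

# Linear constraints implied by Lipschitz continuity; with sorted 1-D
# inputs, constraints on consecutive pairs imply all pairwise constraints.
constraints = [
    cp.abs(f[i + 1] - f[i]) <= L * (x[i + 1] - x[i]) for i in range(n - 1)
]

problem = cp.Problem(objective, constraints)
problem.solve()

fitted = f.value  # estimated regression function values at the sample points
```

Because the objective is a convex quadratic and the Lipschitz constraints are linear, the problem is a quadratic program and can be solved efficiently, which is the computational point made in the abstract.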
Keywords: nonparametric regression, VC dimension, convex optimization
DOI: https://doi.org/10.1023/A:1007586831473