Robust Estimation with Exponentially Tilted Hellinger Distance
This paper is concerned with estimation of parameters defined by moment equalities. In this context, Kitamura, Otsu and Evdokimov (2013a) have introduced the minimum Hellinger distance (HD) estimator, which is asymptotically semiparametrically efficient when the model is correctly specified and achieves optimal minimax robustness under small deviations from the model (local misspecification). This paper evaluates the performance of inference procedures under two complementary types of misspecification, local and global. After showing that HD is not robust to global misspecification, we introduce, in the spirit of Schennach (2007), the exponentially tilted Hellinger distance (ETHD) estimator, which combines the Hellinger distance with the Kullback-Leibler information criterion. Our estimator shares the same desirable asymptotic properties as HD under correct specification and local misspecification, and remains well-behaved under global misspecification. ETHD therefore appears to be the first estimator that is efficient under correct specification and robust to both local and global misspecification.
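To fix ideas, the display below gives a minimal sketch of one natural way to write such a combination, assuming the standard moment condition setup E[g(X_i, theta)] = 0 with exponential tilting (Kullback-Leibler minimizing) implied probabilities; the notation (g, p_i(theta), lambda(theta)) and the exact normalization are illustrative rather than taken from the paper itself.

\[
  p_i(\theta)
  = \frac{\exp\{\lambda(\theta)' g(X_i,\theta)\}}
         {\sum_{j=1}^{n} \exp\{\lambda(\theta)' g(X_j,\theta)\}},
  \qquad
  \lambda(\theta) \;=\; \arg\min_{\lambda}\; \frac{1}{n}\sum_{i=1}^{n}
      \exp\{\lambda' g(X_i,\theta)\},
\]
\[
  \hat{\theta}_{\mathrm{ETHD}}
  \;=\; \arg\min_{\theta}\;
    \sum_{i=1}^{n}\Bigl(\sqrt{p_i(\theta)} - \sqrt{1/n}\Bigr)^{2}.
\]

In words: for each candidate theta, the Kullback-Leibler projection supplies exponentially tilted implied probabilities p_i(theta), and the estimator then minimizes the Hellinger distance between those implied probabilities and the empirical weights 1/n, which is the sense in which the two criteria are combined.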