Check yourself before you wreck yourself: Assessing discrete choice models through predictive simulations
Discrete choice modelers typically focus on developing ever more advanced models and estimation methods. Compared to this impressive progress in model development and estimation, model-checking techniques have lagged behind: choice modelers often rely on crude methods to assess how well an estimated model represents reality. Such methods usually stop at checking parameter signs, model elasticities, and ratios of model coefficients. In this paper, I greatly expand the discrete choice modelers' assessment toolkit by introducing model-checking procedures based on graphical displays of predictive simulations. My contributions are as follows. Methodologically, I introduce a general, 'semi-automatic' algorithm for checking discrete choice models via predictive simulations. By combining new graphical displays with existing plots, I introduce methods for checking one's data against one's model in terms of the model's predicted distributions of P(Y), P(Y|X), and P(X|Y). Empirically, I demonstrate the proposed methods by checking the models from Brownstone and Train (1998). Through this case study, I show that the proposed methods can point out lack of model fit and suggest concrete model improvements that substantively change the results of one's policy analysis. Moreover, the case study highlights a practical trade-off between precision and robustness in model checking.
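To make the idea of checking a model against the observed data via predictive simulations concrete, the sketch below illustrates one such check for P(Y): simulate many choice vectors from an estimated multinomial logit model and compare the simulated distribution of market shares against the observed shares. This is a minimal illustration of the general idea, not the paper's specific algorithm; the data, coefficients, and model form here are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N observations, J alternatives, K covariates.
N, J, K = 1000, 3, 2
X = rng.normal(size=(N, J, K))   # alternative-specific covariates (illustrative)
beta = np.array([0.8, -0.5])     # "estimated" coefficients (illustrative)

def choice_probs(X, beta):
    """Multinomial logit choice probabilities P(Y = j | X)."""
    utilities = X @ beta  # (N, J) systematic utilities
    # Subtract the row max before exponentiating for numerical stability.
    expu = np.exp(utilities - utilities.max(axis=1, keepdims=True))
    return expu / expu.sum(axis=1, keepdims=True)

probs = choice_probs(X, beta)
# Stand-in for the observed choices one would have in a real dataset.
observed = np.array([rng.choice(J, p=p) for p in probs])

# Predictive simulation: draw S simulated choice vectors from the model,
# recording the simulated market share of each alternative.
S = 200
sim_shares = np.empty((S, J))
for s in range(S):
    # Inverse-CDF sampling of one choice per observation.
    sim_y = (rng.random(N)[:, None] < probs.cumsum(axis=1)).argmax(axis=1)
    sim_shares[s] = np.bincount(sim_y, minlength=J) / N

obs_shares = np.bincount(observed, minlength=J) / N
# A graphical check would plot the simulated share distributions (e.g. as
# histograms or intervals) with the observed shares overlaid; here we just
# compute a 95% simulation interval per alternative.
lo, hi = np.percentile(sim_shares, [2.5, 97.5], axis=0)
```

If an observed share falls far outside its simulated interval, the model fails to reproduce that aspect of the data, which flags a lack of model fit. Analogous checks condition the comparison on covariates (for P(Y|X)) or compare covariate distributions within each chosen alternative (for P(X|Y)).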