Feature Importance for Time Series Data: Improving KernelSHAP

10/05/2022
by Mattia Villani, et al.

Feature importance techniques have enjoyed widespread attention in the explainable AI literature as a means of determining how trained machine learning models make their predictions. We consider Shapley-value-based approaches to feature importance, applied in the context of time series data. We present closed-form solutions for the SHAP values of a number of time series models, including VARMAX. We also show how KernelSHAP can be applied to time series tasks, and how the feature importances produced by this technique can be combined to perform "event detection". Finally, we explore the use of Time Consistent Shapley values for feature importance.
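To make the Shapley-value notion of feature importance concrete, the sketch below computes exact Shapley values for a toy autoregressive-style linear model over three lagged features, by averaging marginal contributions over all feature coalitions. The weights, baseline, and instance are purely illustrative (not from the paper); for linear models the result reduces to the closed form w_i * (x_i - baseline_i), which is the kind of closed-form solution the abstract refers to.

```python
from itertools import combinations
from math import factorial

# Hypothetical AR(3)-style linear model: the prediction is a weighted
# sum of the three most recent lags. All numbers are illustrative.
weights = [0.5, 0.3, -0.2]      # assumed lag coefficients
background = [1.0, 1.0, 1.0]    # baseline values (e.g. per-lag means)
x = [2.0, 0.5, 1.5]             # instance (lag vector) to explain

def f(z):
    """Model prediction for a lag vector z."""
    return sum(w * v for w, v in zip(weights, z))

def shapley(i):
    """Exact Shapley value of lag i, enumerating all coalitions of the
    other lags; absent features are replaced by their baseline value."""
    n = len(x)
    others = [j for j in range(n) if j != i]
    total = 0.0
    for size in range(n):
        for S in combinations(others, size):
            z_with = [x[j] if (j in S or j == i) else background[j]
                      for j in range(n)]
            z_without = [x[j] if j in S else background[j]
                         for j in range(n)]
            coeff = factorial(size) * factorial(n - size - 1) / factorial(n)
            total += coeff * (f(z_with) - f(z_without))
    return total

phis = [shapley(i) for i in range(len(x))]
# For a linear model the exact values reduce to w_i * (x_i - baseline_i):
closed = [w * (xi - b) for w, xi, b in zip(weights, x, background)]
```

KernelSHAP approximates the same quantity by sampling coalitions and solving a weighted regression, which is what makes it tractable when the number of (lagged) features is large; the Shapley values also satisfy efficiency, summing to f(x) - f(background).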
