James Ming Chen (Michigan State University – College of Law) has posted “Linear Beta Coefficients Alongside Feature Importance in Machine Learning” on SSRN. Here is the abstract:
Machine-learning regression models lack the interpretability of their conventional linear counterparts. Tree- and forest-based models offer feature importances: a vector of nonnegative scores, summing to one, that indicate the impact of each predictive variable on a model’s results. This brief note describes how to interpret the beta coefficients of the corresponding linear model so that they may be compared directly to feature importances in machine learning.
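The abstract's core idea can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's own method: it assumes scikit-learn, standardizes both predictors and response so the linear coefficients become beta coefficients, then rescales their absolute values to sum to one so they sit on the same footing as a random forest's feature importances. The dataset and model settings are arbitrary choices for the demonstration.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

# Synthetic regression problem (arbitrary sizes, chosen for the demo)
X, y = make_regression(n_samples=500, n_features=4, noise=10.0, random_state=0)

# Standardize predictors and response: the resulting linear
# coefficients are beta (standardized) coefficients
X_std = StandardScaler().fit_transform(X)
y_std = (y - y.mean()) / y.std()
betas = LinearRegression().fit(X_std, y_std).coef_

# Rescale absolute betas to sum to one, mirroring the convention
# that feature importances form a probability-like vector
beta_weights = np.abs(betas) / np.abs(betas).sum()

# Forest-based feature importances for the same data
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
importances = forest.feature_importances_

for j, (b, f) in enumerate(zip(beta_weights, importances)):
    print(f"x{j}: beta weight {b:.3f}  |  forest importance {f:.3f}")
```

Both vectors are nonnegative and sum to one, so each entry can be read as a feature's share of the model's explanatory weight, which is what makes the side-by-side comparison meaningful.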