Understanding how changes in features affect the unconditional distribution of outcomes is crucial for many applications. Despite their predictive accuracy, existing black-box models offer little help in answering such questions. In this work, we propose a novel approximation method for computing feature importance curves, which quantify the changes across the quantiles of the outcome distribution induced by shifts in features. Our approach leverages pre-trained black-box models, combining their predictive strength with interpretability. Through extensive simulations and real-world data applications, we show that our method delivers sparse, reliable results while remaining computationally efficient, making it a practical tool for model interpretation.