The increasing complexity of data, models, and methods often renders parameter fitting computationally expensive. This presentation will illustrate how statistical theory can reduce these computational costs. Specifically, we will interweave gradient-based optimization algorithms with statistical guarantees (“oracle inequalities”) for lasso-type estimators to derive fast and trustworthy algorithms.
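A minimal sketch of this idea: run a proximal gradient method (ISTA) on the lasso objective, but stop iterating once the per-step objective decrease falls below a statistical tolerance `delta`. Here `delta` is a hypothetical stand-in for the precision an oracle inequality would guarantee; the regularization level `lam` and the stopping rule are illustrative choices, not the presentation's specific method.

```python
import numpy as np

def soft_threshold(v, t):
    """Entrywise soft-thresholding, the proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(X, y, lam, delta, max_iter=10_000):
    """ISTA for min_b 0.5*||y - X b||^2 + lam*||b||_1.

    Stops once the objective decrease per step drops below `delta`:
    the rationale is that optimizing beyond the estimator's statistical
    precision costs computation without improving accuracy.
    """
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2            # Lipschitz constant of the smooth part's gradient
    b = np.zeros(p)
    obj = 0.5 * np.sum((y - X @ b) ** 2) + lam * np.sum(np.abs(b))
    for _ in range(max_iter):
        grad = X.T @ (X @ b - y)             # gradient of the least-squares term
        b = soft_threshold(b - grad / L, lam / L)
        new_obj = 0.5 * np.sum((y - X @ b) ** 2) + lam * np.sum(np.abs(b))
        if obj - new_obj < delta:            # early stop at statistical precision
            break
        obj = new_obj
    return b
```

On sparse synthetic data, the early-stopped iterate already recovers the true signal up to the statistical noise level, so further iterations would be wasted effort.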