When modeling data, we would like our models to extract facts about the data itself, not about something arbitrary, such as the order of the factors used in the modeling. Formally, this means we want the model to be invariant under certain transformations. Here we examine different models and the nature of their invariants. We find that regression, maximum likelihood estimation (MLE), and Bayesian estimation are all invariant under linear transformations, whereas regularized regressions have a far more limited set of invariants. As a result, regularized regressions produce results that depend less on the data itself and more on how it is parameterized. To correct this, we propose an alternative formulation of regularization, which we call functional regularization. Ridge regression and the lasso can be recast in terms of functional regularization, as can Bayesian estimation. But functional regularization preserves model invariance, whereas ridge and lasso do not. It is also more flexible, easier to understand, and applicable even to non-parametric models.
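The contrast between invariant and non-invariant estimators can be sketched numerically. The example below (a minimal illustration with made-up data, not taken from the paper) applies an invertible linear reparameterization to the predictors: ordinary least squares produces the same fitted values either way, because its solution depends only on the column space of the design matrix, while ridge regression's penalty depends on the coordinates and so its fitted values change.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=n)

# An invertible (non-orthogonal) linear reparameterization of the predictors.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])
XA = X @ A

def ols_fitted(X, y):
    # Least-squares fitted values: the projection of y onto col(X).
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta

def ridge_fitted(X, y, lam):
    # Ridge fitted values: the penalty lam * ||beta||^2 is coordinate-dependent.
    p = X.shape[1]
    beta = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
    return X @ beta

# OLS is invariant: col(XA) = col(X) when A is invertible.
print(np.allclose(ols_fitted(X, y), ols_fitted(XA, y)))            # True

# Ridge is not: the same data, reparameterized, yields different fits.
print(np.allclose(ridge_fitted(X, y, 1.0), ridge_fitted(XA, y, 1.0)))  # False
```

This is the sense in which a ridge fit reflects the chosen parameterization rather than the data alone: an orthogonal change of basis would leave it unchanged, but a general invertible one does not.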