A rescaling-invariant Lipschitz bound based on path-metrics for modern ReLU network parameterizations
Abstract
Lipschitz bounds of neural networks with respect to their parameters are important for establishing generalization, quantization, and pruning guarantees, as they control the robustness of the network to parameter changes. Yet, few such Lipschitz bounds exist in the literature, and the existing ones only apply to simple feedforward architectures while failing to capture the intrinsic rescaling symmetries of ReLU networks. This paper proves a new Lipschitz bound expressed in terms of the so-called path-metrics of the parameters. Since this bound is intrinsically invariant under the rescaling symmetries of the network, it sharpens previously known Lipschitz bounds. It is also, to the best of our knowledge, the first bound of its kind that is broadly applicable to modern networks such as ResNets, VGGs, U-nets, and many more.
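To give a concrete feel for the rescaling symmetry mentioned above, here is a minimal sketch in Python. It assumes a toy two-layer ReLU network and uses the classical path-norm (the sum over input-to-output paths of the products of absolute weights along each path) as a stand-in for the paper's more general path-metrics; the network shape, the `path_norm` helper, and the check below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Illustrative sketch: for a two-layer ReLU network x -> W2 @ relu(W1 @ x),
# the path-norm sums, over every input->hidden->output path, the product of
# the absolute weights along that path.
def path_norm(W1, W2):
    # |W2| @ |W1| accumulates, for each (output, input) pair, the sum over
    # hidden units h of |W2[o, h]| * |W1[h, i]|; summing all entries gives
    # the total path-norm.
    return np.sum(np.abs(W2) @ np.abs(W1))

rng = np.random.default_rng(0)
W1 = rng.standard_normal((5, 3))
W2 = rng.standard_normal((2, 5))

# ReLU rescaling symmetry: scaling hidden unit h's incoming weights by
# lam[h] > 0 and its outgoing weights by 1/lam[h] leaves the network
# function unchanged (since relu(lam * z) = lam * relu(z) for lam > 0),
# and it also leaves every per-path weight product unchanged.
lam = rng.uniform(0.5, 2.0, size=5)
W1_resc = lam[:, None] * W1
W2_resc = W2 / lam[None, :]

print(path_norm(W1, W2))            # same value before and after
print(path_norm(W1_resc, W2_resc))  # rescaling: path-norm is invariant
```

By contrast, a naive product of layer-wise operator norms is not invariant under this rescaling, which is why path-based quantities can yield sharper, symmetry-aware bounds.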
Domains
Machine Learning [cs.LG]
Origin: Files produced by the author(s)