In-Person Poster presentation / poster accept
Tuning Frequency Bias in Neural Network Training with Nonuniform Data
Annan Yu · Yunan Yang · Alex Townsend
MH1-2-3-4 #157
Keywords: [ neural networks ] [ training ] [ frequency bias ] [ nonuniform ] [ Sobolev norms ] [ neural tangent kernel ] [ Theory ]
Small generalization errors of over-parameterized neural networks (NNs) can be partially explained by the frequency-bias phenomenon, where gradient-based algorithms minimize the low-frequency misfit before reducing the high-frequency residuals. Using the Neural Tangent Kernel (NTK), one can provide a theoretically rigorous analysis for training where data are drawn from constant or piecewise-constant probability densities. Since most training datasets are not drawn from such distributions, we use the NTK model and a data-dependent quadrature rule to theoretically quantify the frequency bias of NN training given fully nonuniform data. By replacing the loss function with a carefully selected Sobolev norm, we can further amplify, dampen, counterbalance, or reverse the intrinsic frequency bias in NN training.
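The abstract's key mechanism, replacing the usual loss with a Sobolev norm to reweight frequencies, can be illustrated with a minimal sketch. The function below (a hypothetical helper, not code from the paper) computes a discrete H^s-type loss on a uniform 1-D grid by weighting the Fourier coefficients of the residual by (1 + k^2)^s: choosing s > 0 amplifies the high-frequency misfit, s < 0 dampens it, and s = 0 weighs all frequencies equally, recovering a plain L^2-type misfit.

```python
import numpy as np

def sobolev_loss(pred, target, s=1.0):
    """Discrete Sobolev H^s-norm loss on a uniform 1-D grid.

    s > 0 amplifies the high-frequency misfit, s < 0 dampens it,
    and s = 0 weighs all frequencies equally (an L^2-type misfit).
    This is an illustrative sketch, not the paper's implementation.
    """
    n = len(pred)
    r = np.asarray(pred) - np.asarray(target)       # residual on the grid
    r_hat = np.fft.rfft(r) / n                      # Fourier coefficients of residual
    k = np.fft.rfftfreq(n, d=1.0 / n)               # integer wavenumbers 0, 1, ..., n/2
    weights = (1.0 + k**2) ** s                     # Sobolev frequency weights
    return float(np.sum(weights * np.abs(r_hat) ** 2))
```

For example, with s > 0 a purely high-frequency residual incurs a much larger loss than a low-frequency residual of the same amplitude, which is the lever the paper uses to tune the training dynamics.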