TY  - JOUR
TI  - A note on regularised NTK dynamics with an application to PAC-Bayesian training
A1  - Clerico, Eugenio
A1  - Guedj, Benjamin
JF  - Transactions on Machine Learning Research
Y1  - 2024///
VL  - 2024
IS  - 04
SP  - 1
EP  - 20
UR  - https://openreview.net/forum?id=2la55BeWwy
ID  - discovery10196071
AV  - public
N1  - This version is the version of record. For information on re-use, please refer to the publisher's terms and conditions.
N2  - We establish explicit dynamics for neural networks whose training objective has a regularising term that constrains the parameters to remain close to their initial value. This keeps the network in a lazy training regime, where the dynamics can be linearised around the initialisation. The standard neural tangent kernel (NTK) governs the evolution during the training in the infinite-width limit, although the regularisation yields an additional term that appears in the differential equation describing the dynamics. This setting provides an appropriate framework to study the evolution of wide networks trained to optimise generalisation objectives such as PAC-Bayes bounds, and hence contribute to a deeper theoretical understanding of such networks.
ER  -
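
A minimal sketch of the "additional term" mentioned in the abstract above, reconstructed from the abstract's description alone rather than from the paper's own notation: the symbols (regularisation strength \lambda, initial parameters \theta_0, empirical loss \hat{L}, empirical NTK \Theta_0, training inputs X) are all assumptions for illustration. If the training objective is \hat{L}(\theta) + \frac{\lambda}{2}\,\|\theta - \theta_0\|^2, gradient flow gives

\[
\dot{\theta}_t \;=\; -\,\nabla_\theta \hat{L}(\theta_t) \;-\; \lambda\,(\theta_t - \theta_0),
\]

and in the lazy (linearised) regime, where f_t(x) \approx f_{\theta_0}(x) + \nabla_\theta f_{\theta_0}(x)^\top (\theta_t - \theta_0) and \Theta_0(x, x') = \nabla_\theta f_{\theta_0}(x)^\top \nabla_\theta f_{\theta_0}(x'), the function-space dynamics pick up the extra regularisation term:

\[
\dot{f}_t(x) \;=\; -\,\Theta_0(x, X)\,\nabla_f \hat{L}\big(f_t(X)\big) \;-\; \lambda\,\big(f_t(x) - f_{\theta_0}(x)\big).
\]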