Title: Distance-Based Regularisation of Deep Networks for Fine-Tuning
Authors: Gouk, Henry; Hospedales, Timothy M.; Pontil, Massimiliano
Published in: Proceedings of the International Conference on Learning Representations (ICLR 2021). ICLR, 2021.
Type: Proceedings paper
Language: English
Rights: Open access
Full text: https://discovery.ucl.ac.uk/id/eprint/10164239/1/2216_distance_based_regularisation_.pdf
Repository record: https://discovery.ucl.ac.uk/id/eprint/10164239/

Abstract:
We investigate approaches to regularisation during fine-tuning of deep neural networks. First, we provide a neural network generalisation bound based on Rademacher complexity that uses the distance the weights have moved from their initial values. This bound has no direct dependence on the number of weights and compares favourably to other bounds when applied to convolutional networks. Our bound is highly relevant for fine-tuning, because providing a network with a good initialisation based on transfer learning means that learning can modify the weights less, and hence achieve tighter generalisation. Inspired by this, we develop a simple yet effective fine-tuning algorithm that constrains the hypothesis class to a small sphere centred on the initial pre-trained weights, thus obtaining provably better generalisation performance than conventional transfer learning. Empirical evaluation shows that our algorithm works well, corroborating our theoretical results. It outperforms both state-of-the-art fine-tuning competitors and penalty-based alternatives that we show do not directly constrain the radius of the search space.
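The core idea described in the abstract — constraining the hypothesis class to a small sphere centred on the pre-trained weights — can be realised with a projected update: after each gradient step, project the weights back onto an L2 ball around their initial values. The sketch below is illustrative only (it is not the authors' implementation); the function name, radius, and toy vectors are assumptions for demonstration.

```python
import numpy as np

def project_to_ball(w, w0, radius):
    """Project w onto the L2 ball of radius `radius` centred at w0.

    If w already lies within distance `radius` of w0 it is returned
    unchanged; otherwise it is rescaled along the direction w - w0
    so that ||w - w0|| == radius.
    """
    delta = w - w0
    norm = np.linalg.norm(delta)
    if norm <= radius:
        return w
    return w0 + delta * (radius / norm)

# Hypothetical toy usage: pre-trained weights w0, fine-tuned weights w
# that have drifted distance 5 away; project back onto a radius-1 ball.
w0 = np.zeros(4)
w = np.array([3.0, 4.0, 0.0, 0.0])
w_proj = project_to_ball(w, w0, radius=1.0)  # lies on the ball boundary
```

In a fine-tuning loop, this projection would be applied after each optimiser step, which (unlike an L2-to-initialisation penalty) enforces a hard constraint on how far the search can move from the initialisation.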