Title: A Non-Asymptotic Analysis for Stein Variational Gradient Descent

Authors: Korba, A.; Salim, A.; Arbel, M.; Luise, G.; Gretton, A.

Abstract: We study the Stein Variational Gradient Descent (SVGD) algorithm, which optimises a set of particles to approximate a target probability distribution π ∝ e^{−V} on ℝ^d. In the population limit, SVGD performs gradient descent in the space of probability distributions on the KL divergence with respect to π, where the gradient is smoothed through a kernel integral operator. In this paper, we provide a novel finite-time analysis for the SVGD algorithm. We provide a descent lemma establishing that the algorithm decreases the objective at each iteration, and rates of convergence for the averaged Stein Fisher divergence (also referred to as Kernel Stein Discrepancy). We also provide a convergence result of the finite-particle system, corresponding to the practical implementation of SVGD, to its population version.

Publisher: Neural Information Processing Systems Conference
Editors: Larochelle, H.; Ranzato, M.; Hadsell, R.; Balcan, M.F.; Lin, H.
Date: 2020-12-06
Type: Proceedings paper
Language: English

Source: In: Larochelle, H., Ranzato, M., Hadsell, R., Balcan, M.F. and Lin, H. (eds.) NIPS'20: Proceedings of the 34th International Conference on Neural Information Processing Systems. Neural Information Processing Systems Conference: Vancouver, Canada (2020).

Identifiers:
- https://discovery.ucl.ac.uk/id/eprint/10166658/1/NeurIPS-2020-a-non-asymptotic-analysis-for-stein-variational-gradient-descent-Paper.pdf
- https://discovery.ucl.ac.uk/id/eprint/10166658/

Rights: open
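To make the algorithm described in the abstract concrete, the following is a minimal illustrative sketch of one SVGD iteration, not the paper's own code: it assumes an RBF kernel with a fixed bandwidth `h` (the paper's analysis is kernel-generic, and practical implementations often use a median-heuristic bandwidth instead), and the helper names `svgd_step` and `grad_log_pi` are hypothetical.

```python
import numpy as np

def svgd_step(X, grad_log_pi, step=0.1, h=1.0):
    """One SVGD update with RBF kernel k(x, y) = exp(-||x - y||^2 / h).

    X: (n, d) array of particles; grad_log_pi maps (n, d) -> (n, d)
    and returns the score of the (unnormalised) target, grad log pi.
    """
    diff = X[:, None, :] - X[None, :, :]          # diff[j, i] = x_j - x_i
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)   # K[j, i] = k(x_j, x_i)
    # Driving term: kernel-smoothed score, pulls particles toward high density.
    drive = K.T @ grad_log_pi(X)
    # Repulsive term: sum_j grad_{x_j} k(x_j, x_i), keeps particles spread out.
    repulse = np.sum(-2.0 / h * diff * K[:, :, None], axis=0)
    n = X.shape[0]
    return X + step * (drive + repulse) / n
```

For a standard Gaussian target the score is `grad_log_pi = lambda x: -x`; iterating `svgd_step` on an initially offset particle cloud drives it toward mean 0 and roughly unit variance, while the repulsive term prevents the particles from collapsing onto the mode.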