- To see my recent publications, visit Google Scholar. The text below is outdated.
- Three Deep Learning + Optimization papers with Frank Hutter:
Online batch selection for faster training of neural networks: up to 5 times faster than Adam/Adadelta on MNIST (a sampling sketch follows this list).
CMA-ES for Hyperparameter Optimization of Deep Neural Networks: CMA-ES is competitive in the regime of parallel evaluations (see the CMA-ES sketch after this list).
SGDR: Stochastic Gradient Descent with Restarts: new state-of-the-art results of 3.74% error on CIFAR-10 and 18.70% on CIFAR-100 (see the schedule sketch below).
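For flavour, here is a minimal NumPy sketch of rank-based online batch selection in the spirit of the first paper: examples are ranked by their most recently observed loss and sampled with a probability that decays exponentially with rank. The selection_pressure value and the handling of stale losses are illustrative simplifications, not the paper's exact settings.

```python
import numpy as np

def rank_probabilities(n, selection_pressure=100.0):
    # Exponential decay over ranks: the top-ranked (highest-loss) example is
    # roughly selection_pressure times more likely to be picked than the bottom one.
    decay = np.exp(np.log(selection_pressure) / n)
    p = 1.0 / decay ** np.arange(n)
    return p / p.sum()

def select_batch(latest_losses, batch_size, selection_pressure=100.0, rng=None):
    # latest_losses[i] is the most recently observed loss of training example i;
    # stale values are acceptable and get refreshed whenever the example is used.
    rng = rng or np.random.default_rng()
    order = np.argsort(latest_losses)[::-1]               # highest loss first
    probs = rank_probabilities(len(latest_losses), selection_pressure)
    return rng.choice(order, size=batch_size, replace=False, p=probs)

# Toy usage: 1000 examples, draw a batch of 64 biased toward hard examples.
latest_losses = np.random.rand(1000)
batch_indices = select_batch(latest_losses, batch_size=64)
```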
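The CMA-ES hyperparameter-optimization setup can be sketched with Nikolaus Hansen's pycma package. The objective validation_error below is a hypothetical stand-in for an expensive train-and-evaluate run, and the search space (log10 learning rate and weight decay) is chosen purely for illustration.

```python
# pip install cma   (Nikolaus Hansen's pycma package)
import cma

def validation_error(x):
    # Hypothetical objective: decode hyperparameters from log10 space, train a
    # network, and return its validation error.  A synthetic quadratic stands in
    # for the expensive training run so that this sketch is runnable on its own.
    log_lr, log_weight_decay = x
    return (log_lr + 2.5) ** 2 + (log_weight_decay + 4.0) ** 2

# Search in log10 space, starting from lr=1e-3, weight_decay=1e-4, step size 1.0.
es = cma.CMAEvolutionStrategy([-3.0, -4.0], 1.0, {'maxfevals': 500})
while not es.stop():
    candidates = es.ask()                     # one generation of candidate settings
    es.tell(candidates, [validation_error(c) for c in candidates])
es.result_pretty()
```

Because each generation returned by ask() is a batch of independent evaluations, the candidates can be trained in parallel, which is exactly the regime in which the paper finds CMA-ES competitive.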
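The SGDR schedule itself is compact enough to sketch: the learning rate is cosine-annealed from eta_max to eta_min within each cycle and reset ("warm-restarted") when the cycle ends, with cycle lengths growing by a factor t_mult. The constants below are illustrative, not the settings used in the paper.

```python
import math

def sgdr_learning_rate(epoch, eta_min=0.0, eta_max=0.1, t_0=10, t_mult=2):
    # Find the position t_cur inside the current cycle of length t_i.
    t_i, t_cur = t_0, epoch
    while t_cur >= t_i:
        t_cur -= t_i
        t_i *= t_mult
    # Cosine annealing within the cycle; the rate jumps back to eta_max at each restart.
    return eta_min + 0.5 * (eta_max - eta_min) * (1.0 + math.cos(math.pi * t_cur / t_i))

# Schedule over the first 30 epochs: a restart at epoch 10, then a 20-epoch cycle.
print([round(sgdr_learning_rate(e), 4) for e in range(30)])
```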
- Black Box Optimization Competition (BBComp'2016)
Together with Tobias Glasmachers, I am co-organizing the second edition of BBComp. Last year we had 53 participants/submissions. This year there are 2 single-objective tracks and 3 new multi-objective tracks, with 1000 problems in each track.
- Anytime Bi-Objective Optimization with a Hybrid Multi-Objective CMA-ES (HMO-CMA-ES)
Together with Tobias Glasmachers, I designed HMO-CMA-ES, an algorithm for anytime multi-objective optimization. We are glad to announce that HMO-CMA-ES is the best multi-objective optimizer according to the biobj-BBOB framework: it is roughly 10 times faster than the closest competitor on 55 diverse multi-objective problems.
- LM-CMA: an Alternative to L-BFGS for Large Scale Black-box Optimization (source code)
A sole-author paper, to appear in the Evolutionary Computation journal (MIT Press, 2016). We further improve LM-CMA to compete with L-BFGS in gradient-free settings; the results on large-scale nonsmooth problems look promising.