
Please see my Google Scholar profile; the list below may be incomplete or outdated.

Refereed Papers

  • Ilya Loshchilov
    To appear in: "Evolutionary Computation", MIT Press, 2016.
    source code
    Median (out of 11 runs) number of function evaluations required to reach f(x) = 1e-10 for LM-CMA, L-BFGS with exact gradients, CL-BFGS (L-BFGS with central differencing), active CMA-ES and VD-CMA. Dotted lines depict extrapolated results.
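
    The central-differencing baseline above approximates gradients numerically at a cost of 2n function evaluations per gradient, which is why the methods are compared on a function-evaluation budget. A minimal sketch (the sphere function and step size are illustrative):

        import numpy as np

        def central_difference_gradient(f, x, h=1e-6):
            """Estimate the gradient of f at x by central differencing.
            Costs 2 * len(x) function evaluations per gradient."""
            grad = np.empty(len(x))
            for i in range(len(x)):
                e = np.zeros(len(x))
                e[i] = h
                grad[i] = (f(x + e) - f(x - e)) / (2.0 * h)
            return grad

        # Illustrative check: the gradient of f(x) = sum(x_i^2) is 2x.
        sphere = lambda x: float(np.dot(x, x))
        print(central_difference_gradient(sphere, np.array([1.0, -2.0, 0.5])))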

  • Ilya Loshchilov, Marc Schoenauer, Michele Sebag and Nikolaus Hansen
    In: "Parallel Problem Solving from Nature (PPSN XIII)", ACM Press : p. 70-79. September 2014.
    source code
    Evolution of the learning rates c_1, c_mu and c_c (lines with markers, left y-axis) and of log10(objective function) (plain line, right y-axis) for self-CMA-ES on the 10- and 20-dimensional Sphere and Rosenbrock functions. The medians of 15 runs are shown.

  • (nominated for the best paper award at GECCO'14)
    Ilya Loshchilov
    In: "Genetic and Evolutionary Computation Conference (GECCO-2014)", ACM Press : p. 397-404. July 2014.
    source code
    Timing results of LM-CMA-ES on the separable Ellipsoid function compared to sep-CMA-ES and Cholesky-CMA-ES. The results were computed using at most 10^5 function evaluations for sep-CMA-ES and LM-CMA-ES, and at most 10^4 for Cholesky-CMA-ES.
    The median of 11 runs on the separable Ellipsoid function for different problem dimensions. The dotted lines correspond to results extrapolated by preserving the same scaling as between the last two actual measurements. Note that sep-CMA-ES is not rotation invariant: while it can solve the separable Ellipsoid, it cannot solve the non-separable one in a reasonable number of function evaluations. LM-CMA-ES and Cholesky-CMA-ES are both rotation invariant, but the latter is limited to roughly 10^3-10^4 variables due to its quadratic time and space complexity per function evaluation (at n = 10^4, a full n x n factor already holds 10^8 doubles, about 0.8 GB).

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Genetic and Evolutionary Computation Conference (GECCO-2013)", ACM Press : p. 439-446. July 2013.
    source code
    Comparison of the proposed surrogate-assisted versions of IPOP-CMA-ES on the 20-dimensional Rotated Ellipsoid function. The trajectories show the median of 15 runs.
    The proposed algorithms outperform BIPOP-aCMA-ES and demonstrate the best overall performance among the roughly 100 optimization algorithms tested during BBOB-2009, 2010 and 2012. The performance of the proposed HCMA (a hybrid CMA-ES) is comparable to that of the 'best2009' portfolio (an artificial algorithm combining the best results shown by a set of algorithms). See the BBOB-2013 workshop paper for details.

  • Ilya Loshchilov
    In: "Congress on Evolutionary Computation (CEC)", IEEE : p. 369-376. June 2013.
    original source code    source code with bound handling    a note on bound handling
    Alternative restart strategies for CMA-ES with active covariance matrix update (NBIPOP-aCMA-ES and NIPOP-aCMA-ES) demonstrate the overall best performance on the 28 noiseless functions in dimensions 10, 30 and 50. The figure shows the empirical cumulative distribution of all function-target pairs solved over all functions, dimensions and runs (428,400 pairs in total).
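
    As a rough illustration of the restart idea (the exact NBIPOP/NIPOP schedules differ), a minimal IPOP-style loop that doubles the population size after each restart. The inner cma_es is a hypothetical stand-in, implemented here as plain random sampling only so the sketch runs:

        import numpy as np

        def cma_es(f, x0, sigma0, popsize, budget):
            """Hypothetical stand-in for one CMA-ES run (random sampling
            around the best point); any real CMA-ES implementation could
            replace this function."""
            best_x, best_f, evals = x0, f(x0), 1
            while evals < min(budget, 100 * popsize):
                y = best_x + sigma0 * np.random.randn(len(x0))
                fy = f(y); evals += 1
                if fy < best_f:
                    best_x, best_f = y, fy
            return best_x, best_f, evals

        def ipop_restarts(f, dim, max_evals, target=1e-10):
            popsize = 4 + int(3 * np.log(dim))     # default CMA-ES population size
            evals, best_f, best_x = 0, np.inf, None
            while evals < max_evals and best_f > target:
                x0 = np.random.uniform(-5, 5, dim)   # fresh random restart point
                x, fx, used = cma_es(f, x0, 2.0, popsize, max_evals - evals)
                evals += used
                if fx < best_f:
                    best_x, best_f = x, fx
                popsize *= 2        # IPOP: double the population size per restart
            return best_x, best_f

        print(ipop_restarts(lambda z: float(np.dot(z, z)), 10, 20000))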

  • Ilya Loshchilov
    In: "Conference sur l'Apprentissage Automatique (CAP)", ArXiv e-prints, 1308.2655. June 2013.
               
    Ellipsoidal 95% confidence regions of the Gaussian distribution P_a (black thin line) and three other Gaussian distributions P_b (color marked lines) with same Kullback-Leibler divergence KL(P_a||P_b). In surrogate-assisted optimization an allowed KL-divergence from the original distribution when optimizing the surrogate can be linked with the estimated surrogate model error. The KL-divergence can be used to define some trust-region for local optimization.   
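
    The KL divergence between two Gaussians has a closed form, which is what makes such a trust region cheap to evaluate. A minimal sketch (NumPy assumed; the example distributions are illustrative):

        import numpy as np

        def kl_gaussians(mu_a, cov_a, mu_b, cov_b):
            """KL(P_a || P_b) for multivariate Gaussians:
            0.5 * [tr(S_b^-1 S_a) + (m_b - m_a)^T S_b^-1 (m_b - m_a)
                   - d + ln(det S_b / det S_a)]"""
            d = len(mu_a)
            cov_b_inv = np.linalg.inv(cov_b)
            diff = mu_b - mu_a
            _, logdet_a = np.linalg.slogdet(cov_a)
            _, logdet_b = np.linalg.slogdet(cov_b)
            return 0.5 * (np.trace(cov_b_inv @ cov_a) + diff @ cov_b_inv @ diff
                          - d + logdet_b - logdet_a)

        # A distribution P_b can spend its KL budget on a mean shift, on a
        # covariance change, or on both, as in the figure.
        mu, cov = np.zeros(2), np.eye(2)
        print(kl_gaussians(mu, cov, mu + 1.0, cov))    # mean shift: KL = 1.0
        print(kl_gaussians(mu, cov, mu, 2.0 * cov))    # covariance scaling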

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Parallel Problem Solving from Nature (PPSN XII)", Springer Verlag, LNCS 7491 : p. 296-305. September 2012.
    source code
    Restart performance of CMA-ES in the 2D hyper-parameter space (population size and initial mutation step-size, in log coordinates). For each objective function (20-dimensional Rastrigin, top-left; Gallagher 21 peaks, top-right; Katsuuras, bottom-left; Lunacek bi-Rastrigin, bottom-right), the median best function value out of 15 runs is shown. Legends indicate whether the optimum up to precision f(x) = 1e-10 is found always (+), sometimes (⊕) or never (◦). Black regions are better than white ones.

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Genetic and Evolutionary Computation Conference (GECCO-2012)", ACM Press : p. 321-328. July 2012.
    source code    erratum
    A comparison of the proposed saACM algorithms with more than 40 classical (BFGS, GLOBAL, NEWUOA, ...) and evolutionary optimization algorithms on the 20-dimensional problems of the BBOB framework. The figure shows the proportion of functions solved to a given precision (y-axis) within a given number of function/model evaluations (x-axis).

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Genetic and Evolutionary Computation Conference (GECCO-2011)", ACM Press : p. 885-892. July 2011.
    source code
    A CMA-like Adaptive Encoding update (b), largely based on Principal Component Analysis (a), is used to extend the Coordinate Descent method (c) to the optimization of non-separable problems (d). The resulting Fast Adaptive Coordinate Descent has linear time complexity and is suitable for large-scale (D >> 100) non-linear optimization.
    Adapting an appropriate coordinate system allows Adaptive Coordinate Descent to outperform Coordinate Descent on non-separable functions. The figure illustrates the convergence of both algorithms on the 2-dimensional Rosenbrock function up to a target function value of 10^-10, starting from the initial point (-3,-4). Adaptive Coordinate Descent reaches the target after only 325 function evaluations (about 70 times faster than Coordinate Descent), which is comparable to gradient-based methods.
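
    A minimal sketch of the idea, not the paper's exact algorithm: coordinate descent whose search directions are periodically re-estimated by PCA over recent iterates. The success/failure step-size rule and the PCA window below are illustrative simplifications of the CMA-like adaptive encoding update:

        import numpy as np

        def adaptive_coordinate_descent(f, x0, max_evals=20000, target=1e-10):
            n = len(x0)
            x, fx, evals = x0.copy(), f(x0), 1
            B = np.eye(n)        # current coordinate system (columns = directions)
            step = np.ones(n)    # one step size per search direction
            archive = [x.copy()]
            while evals < max_evals and fx > target:
                for i in range(n):
                    for s in (step[i], -step[i]):     # try both signs
                        y = x + s * B[:, i]
                        fy = f(y); evals += 1
                        if fy < fx:
                            x, fx = y, fy
                            step[i] *= 2.0            # success: enlarge step
                            break
                    else:
                        step[i] *= 0.5                # failure: shrink step
                archive.append(x.copy())
                if len(archive) > n + 1:              # re-estimate directions by PCA
                    A = np.array(archive[-(n + 1):])
                    _, _, Vt = np.linalg.svd(A - A.mean(axis=0))
                    B = Vt.T
            return x, fx

        rosenbrock = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
        print(adaptive_coordinate_descent(rosenbrock, np.array([-3.0, -4.0])))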

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Evolutionary Multi-Criterion Optimization 2011 (EMO 2011)", Springer Verlag, LNCS 6576 : p. 31-45. April 2011.
              
    Multi-Armed Bandit (MAB) paradigm applied to parental selection in evolutionary multiobjective optimization allows to quickly catch the fruitful directions of search.   
    Plots of all 10 populations found after 100,000 function evaluations by original (mu + 1)-MO-CMA (left), tournament-based (mu +1)-MOCMA (center) and MAB-based (mu + 1rank)-MO-CMA (right) in the x1-x2-x3 space on LZ09-2 and LZ09-3 30-dimensional multi-objective problems with complicated Pareto front in decision space (gray curve).   
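
    To illustrate the bandit view (the arm set and reward definition here are illustrative, not the paper's): a standard UCB1 selector where each arm is a parent and the reward records whether its offspring improved the population:

        import numpy as np

        class UCB1Selector:
            """UCB1 multi-armed bandit: balances exploiting arms with a high
            average reward against exploring rarely tried arms."""
            def __init__(self, n_arms, c=1.0):
                self.counts = np.zeros(n_arms)
                self.rewards = np.zeros(n_arms)
                self.c = c

            def select(self):
                untried = np.flatnonzero(self.counts == 0)
                if untried.size:                  # play every arm once first
                    return int(untried[0])
                t = self.counts.sum()
                ucb = (self.rewards / self.counts
                       + self.c * np.sqrt(np.log(t) / self.counts))
                return int(np.argmax(ucb))

            def update(self, arm, reward):        # reward: 1 if offspring improved
                self.counts[arm] += 1
                self.rewards[arm] += reward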

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Simulated Evolution and Learning (SEAL 2010)", Springer Verlag, LNCS 6457 : p. 230-239. December 2010.
              
    In multi-criteria decision-making and multiobjective optimization it might be difficult to quantify the score of solutions according to given criteria/objectives. However, the comparison/preference relations is a sufficient source of information to build an efficient surrogate model of the problem.   
    An approach to build comparison-based surrogate models of multi-objective problems using Ranking Support Vector Machine (Rank-SVM) is proposed. It learns the surrogate model using primary comparison constraints (<, =, >), then iteratively adds and learns the most violated secondary constraints. The model can take into account i). Pareto dominance relations ii). Quality indicators (Hypervolume) iii). Preference relations, defined by Decision-Maker.   
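
    A minimal linear sketch of the comparison-based reduction, assuming scikit-learn; the paper uses a kernelized Rank-SVM with iterative selection of the most violated constraints, which this simplification omits:

        import numpy as np
        from sklearn.svm import LinearSVC

        def fit_comparison_surrogate(X, pairs):
            """Classic RankSVM reduction: each ordered pair (i, j), meaning
            'x_i is preferred to x_j', becomes a classification example on
            the difference vector x_i - x_j."""
            diffs, labels = [], []
            for i, j in pairs:
                diffs.append(X[i] - X[j]); labels.append(+1)
                diffs.append(X[j] - X[i]); labels.append(-1)
            clf = LinearSVC(C=10.0, max_iter=10000)
            clf.fit(np.array(diffs), np.array(labels))
            return lambda x: float(clf.decision_function(x.reshape(1, -1))[0])

        # Toy usage: preferences induced by f(x) = ||x||^2 (smaller is better).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(20, 3))
        f = (X ** 2).sum(axis=1)
        pairs = [(i, j) for i in range(20) for j in range(20) if f[i] < f[j]]
        score = fit_comparison_surrogate(X, pairs)   # higher score = preferred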

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Parallel Problem Solving from Nature (PPSN XI)", Springer Verlag, LNCS 6238 : p. 364-373. September 2010.
    source code    erratum
    Surrogate models may predict f(x) very well but 1000*f(x) very poorly due to parametrization bias, even though the two functions have exactly the same structure. We propose comparison-based surrogate models that are invariant to rank-preserving transformations of the optimized function and, by reusing the covariance matrix adapted by the state-of-the-art continuous optimizer CMA-ES, also invariant to rotations of the search space.
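
    A tiny illustration of the invariance argument (the transformations are illustrative): any rank-preserving transformation leaves the ordering of candidate points, and hence the training data of a comparison-based surrogate, unchanged:

        import numpy as np

        # Comparison-based learning only sees the ordering of candidates,
        # so f, 1000*f, exp(f), ... all induce identical training data.
        f_vals = np.array([3.2, 0.1, 7.5, 1.4])
        for g in (lambda y: 1000 * y, np.exp, np.sqrt):
            assert (np.argsort(g(f_vals)) == np.argsort(f_vals)).all()
        print("rank-preserving transformations leave the ordering unchanged")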

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Genetic and Evolutionary Computation Conference (GECCO-2010)", ACM Press : p. 471-478. July 2010.
              
    A mixture of One-Class Ranking Support Vector Machine (SVM) for dominated points and Regression SVM for non-dominated points can be used to build a mono surrogate for multiobjective optimization.   
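
    A hedged sketch of the mixture idea, assuming scikit-learn; the combination rule in score below is a hypothetical simplification, not the paper's aggregated surrogate model:

        import numpy as np
        from sklearn.svm import OneClassSVM, SVR

        def fit_mono_surrogate(X_dominated, X_nondominated, quality):
            """One-Class SVM models the region of dominated points; a
            regression SVM scores the non-dominated ones; both are merged
            into a single scalar score (higher = better)."""
            occ = OneClassSVM(nu=0.2).fit(X_dominated)
            reg = SVR().fit(X_nondominated, quality)
            def score(x):
                x = x.reshape(1, -1)
                d = float(occ.decision_function(x)[0])
                if d > 0:            # looks dominated: penalize by inlier margin
                    return -d
                return float(reg.predict(x)[0])
            return score

        # Toy usage with illustrative clusters of dominated/non-dominated points.
        rng = np.random.default_rng(1)
        X_dom = rng.normal(loc=2.0, size=(30, 2))
        X_nd = rng.normal(loc=-2.0, size=(10, 2))
        score = fit_mono_surrogate(X_dom, X_nd, -np.abs(X_nd).sum(axis=1))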

Papers in Workshop Proceedings

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Genetic and Evolutionary Computation Conference (GECCO-2013 Companion)", ACM Press : p. 1177-1184. July 2013.
    4th GECCO Workshop for Real-Parameter Optimization (Black-Box Optimization Benchmarking, BBOB-2013)
    source code

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Probabilistic Numerics Workshop of NIPS 2012". December 2012.
        

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Genetic and Evolutionary Computation Conference (GECCO-2012)", ACM Press : 269-276. July 2012.
    3rd GECCO Workshop for Real-Parameter Optimization (Black-Box Optimization Benchmarking (BBOB), 2012
                   source code

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Genetic and Evolutionary Computation Conference (GECCO-2012)", ACM Press : 175-182. July 2012.
    3rd GECCO Workshop for Real-Parameter Optimization (Black-Box Optimization Benchmarking (BBOB), 2012
              

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Genetic and Evolutionary Computation Conference (GECCO-2012)", ACM Press : 261-268. July 2012.
    3rd GECCO Workshop for Real-Parameter Optimization (Black-Box Optimization Benchmarking (BBOB), 2012
              

  • Ilya Loshchilov, Marc Schoenauer, Michèle Sebag
    In: "Genetic and Evolutionary Computation Conference (GECCO-2010)", ACM Press : p. 1979-1982. July 2010.
    1st GECCO Workshop on Theoretical Aspects of Evolutionary Multiobjective Optimization, 2010
