International Conference on Continuous Optimization 2019

The International Conference on Continuous Optimization (ICCOPT) is one of the flagship conferences of the Mathematical Optimization Society (MOS) and takes place every three years.

With its international scope, the event provides a forum for exchange on the mathematical optimization of continuous problems and aims to offer fresh impulses for both research and practice. Key topic areas include Big Data and Machine Learning, Nonlinear/Global Optimization, and Derivative-free and Simulation-based Optimization.

ICCOPT 2019

In 2019, ICCOPT is being held for the first time in the German capital Berlin, running from August 3 to August 8, 2019. The conference is organized by the Weierstraß-Institut für Angewandte Analysis und Stochastik (WIAS). Talks and seminars take place on the premises of the Technische Universität Berlin (TU Berlin).

Our Areas of Interest

We will be attending the following events, talks, and workshops on site and look forward to the technical exchange:

Monday
Keynote
  • Coordinate Descent Methods for Nonconvex Optimization
Session 1: Applications of Multi-Objective Optimization
  • A multi-objective optimal design of experiments framework for online model identification platforms
  • Inverse multiobjective optimization: Inferring decision criteria from data
  • A Generalized Nash Game for Computation Offloading
Session 2: The Interface of Generalization and Optimization in Machine Learning
  • Training on the Test Set and Other Heresies
  • New Thoughts on Adaptivity, Generalization and Interpolation
Semi-Keynote
  • On Statistical Inference for Optimization Problems with Composite Risk Functionals
Best Paper Session
  • Directional differentiability of quasi-variational inequalities and related optimization problems
  • Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
  • Stochastic Subgradient Method Converges on Tame Functions
  • Exploiting Second Order Sparsity in Big Data Optimization
Tuesday
Keynote
  • Addressing Uncertainty in Equilibrium Problems
Session 1: The Interface of Generalization and Optimization in Machine Learning
  • Optimization for machine learning: from training to test error
  • Towards demystifying over-parameterization in deep learning
  • Interpolation and its implications
Session 2: Advances in Data-driven and Robust Optimization
  • Data-Pooling in Stochastic Optimization
  • Toward a Genomic Liquid Biopsy
  • Decision Forests: A Nonparametric Model for Irrational Choice
Session 3: Emerging Trends in Derivative-Free Optimization
  • Global optimization of noisy multimodal functions with RBF surrogates
  • Diagonal acceleration for covariance matrix adaptation evolution strategies
  • The evolution of RBFOpt: improving the RBF method, with one eye on performance
Session 4: Techniques in Global Optimization
  • Gaining and losing perspective
  • Handling separable non-convexities with disjunctive cuts
  • Random projections for quadratic optimization
Wednesday
Keynote
  • Randomized Methods for Low-Rank Tensor Decomposition
Semi-Keynote
  • Proximal Algorithms for Nonsmooth Nonconvex Optimization
Session 1: Emerging Trends in Derivative-free Optimization
  • COMO-CMA-ES: a linearly convergent derivative free multi-objective solver
  • MADMS: Mesh adaptive direct multisearch for blackbox constrained multiobjective optimization
  • On the use of quadratic polynomial models in multiobjective directional direct search
Session 2: Global Optimization
  • Solving Optimization over the Efficient Set of a Multiobjective Linear Programming Problem as a Mixed Integer Problem
  • Exploiting recursive non-linear transformations of the variables space in global optimization problems
  • A global optimization algorithm by listing KKT points for a quadratic reverse convex programming problem
Session 3: Generalized Distances and Envelope Functions
  • The Fitzpatrick Distance
  • A universal majorization-minimization framework for the convergence analysis of nonconvex proximal algorithms
  • Polar envelope and polar proximal map
Semi-Keynote
  • Structured sparsity in semidefinite programming
Thursday
Session 1: Big Data and Machine Learning
  • Human Activity Recognition using Intuitionistic Fuzzy Proximal Support Vector Machines
  • Online Optimization for Time Series of Parametrizable Objective Functions
  • A data-driven model-based method for quantitative MRI
Session 2: Recent Advances in Derivative-free Optimization
  • A Surrogate for Local Optimization using Delaunay Triangulations
  • Detecting model uncertainty via parameter estimation and optimal design of experiments
  • Convergence properties of line search strategies for multi-modal functions
Session 3: Dynamic Optimization under Data Uncertainty
  • From Robust Optimization to Online Inverse Optimization
  • Robust Periodic-Affine Policies for Multiperiod Dynamic Problems
  • Monitoring With Limited Information
Semi-Keynote
  • New Quasi-Newton Ideas for (Non)smooth Optimization
Keynote
  • Nonsmoothness in PDE-Constrained Optimization
