Three researchers from the Mathematics and Computer Science (MCS) division at Argonne National Laboratory – Sungho Shin, François Pacaud, and Mihai Anitescu – have been named COIN-OR Cup winners for their open-source software packages ExaModels and MadNLP.

The COIN-OR project is building an open-source community for operations research software in order to speed the development and deployment of models, algorithms, and cutting-edge computational research. COIN-OR also provides a forum for peer review of software similar to that provided by archival journals for theoretical research.

The two winning packages, both written in the Julia programming language, were recognized as a significant breakthrough in efficiently solving large-scale nonlinear programs (NLPs) by leveraging the capabilities of modern graphics processing units (GPUs).

“Our packages dramatically improve performance in solving real-world nonlinear optimization problems at scale,” said Shin, a postdoctoral appointee in Argonne’s MCS division. Indeed, the research team recently succeeded in accelerating alternating current optimal power flow problems by up to a factor of 10 compared with state-of-the-art tools.

ExaModels is an algebraic modeling and automatic differentiation tool specialized for the single-instruction, multiple-data (SIMD) abstraction of nonlinear programs, aimed at efficiently instantiating optimization models on exascale architectures. ExaModels compiles derivative evaluation code tailored to each computational pattern in a model and uses that code to carry out reverse-mode automatic differentiation.
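As a rough illustration of that abstraction, the sketch below builds a small test problem in which the objective and constraints are written as iterators over a repeated expression pattern. It assumes the ExaCore/variable/objective/constraint/ExaModel interface described in the ExaModels documentation; the problem itself (a Luksan-Vlcek-style example) is purely illustrative.

```julia
using ExaModels

# Objective and constraints are written as iterators over a repeated
# expression pattern -- the SIMD abstraction described above.
N = 1000
c = ExaCore()                      # collects the model data
x = variable(c, N; start = (mod(i, 2) == 1 ? -1.2 : 1.0 for i = 1:N))
objective(c, 100 * (x[i-1]^2 - x[i])^2 + (x[i-1] - 1)^2 for i = 2:N)
constraint(
    c,
    3x[i+1]^3 + 2x[i+2] - 5 + sin(x[i+1] - x[i+2]) * sin(x[i+1] + x[i+2]) +
    4x[i+1] - x[i] * exp(x[i] - x[i+1]) - 3 for i = 1:N-2
)
m = ExaModel(c)                    # derivative code is specialized to the patterns above
```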

“ExaModels goes beyond traditional boundaries of algebraic modeling systems, enabling derivative evaluation on GPU accelerators,” said Pacaud, a postdoctoral appointee in Argonne’s MCS division.

MadNLP is a solver for nonlinear programming. MadNLP seeks to streamline the development of modeling and algorithmic paradigms in order to exploit problem structure and make efficient use of high-performance computers. The code is interfaced with several modeling packages as well as with non-Julia sparse and dense linear solvers.
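A brief sketch of how the solver might be called, both on the ExaModels model built above and through JuMP. It assumes the madnlp entry point and the MadNLP.Optimizer interface from the MadNLP documentation; the options shown are illustrative only.

```julia
using MadNLP

# Solve the ExaModels model `m` constructed in the previous sketch.
result = madnlp(m; print_level = MadNLP.INFO)
println(result.objective)          # final objective value reported by the solver

# MadNLP also plugs into JuMP as an optimizer backend.
using JuMP
jm = Model(() -> MadNLP.Optimizer(print_level = MadNLP.INFO, max_iter = 100))
@variable(jm, y >= 0)
@objective(jm, Min, (y - 2)^2)
optimize!(jm)
```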

“The principal advantages of both ExaModels and MadNLP are their flexibility and portability,” said Anitescu, a senior computational mathematician in Argonne’s MCS division. Anitescu is the Argonne lead for the Exascale Computing Project’s EXASGD activity, which supported this research. “Using abstractions, we can wrap the different vendor APIs in Julia behind a single API. As a result, users can run the same code on CPUs as well as on Intel, AMD, and NVIDIA GPUs. And, since Julia is a compiled language, this flexibility does not compromise performance.”
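A rough sketch of what that portability can look like in practice, under the assumption that ExaCore accepts a KernelAbstractions-style backend argument (as in the ExaModels documentation) and that GPU solves go through the companion MadNLPGPU package; the helper function and backend choices below are illustrative, not part of either package.

```julia
using ExaModels, MadNLP

# Hypothetical helper: build the same small model for whichever backend is given.
function build_model(; backend = nothing)
    c = backend === nothing ? ExaCore() : ExaCore(; backend = backend)
    x = variable(c, 100; start = (0.5 for i = 1:100))
    objective(c, (x[i] - 1)^2 for i = 1:100)
    return ExaModel(c)
end

madnlp(build_model())                          # CPU run

# On an NVIDIA GPU (assuming CUDA.jl and MadNLPGPU are available):
# using CUDA, MadNLPGPU
# madnlp(build_model(backend = CUDABackend()))

# On an AMD GPU (assuming AMDGPU.jl and MadNLPGPU are available):
# using AMDGPU, MadNLPGPU
# madnlp(build_model(backend = ROCBackend()))
```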