JuMP.jl
Modeling language for Mathematical Optimization (linear, mixed-integer, conic, semidefinite, nonlinear)
Top Related Projects
Optim.jl: Optimization functions for Julia
Quick Overview
JuMP.jl is an open-source modeling language for mathematical optimization embedded in Julia. It allows users to express optimization problems in a high-level, algebraic syntax while maintaining the performance of commercial optimization software. JuMP supports a wide range of problem classes, including linear, mixed-integer, quadratic, conic, and general nonlinear programming.
Pros
- High-level, intuitive syntax for expressing optimization problems
- Excellent performance due to Julia's just-in-time compilation
- Supports a wide range of solvers and problem types
- Extensible architecture allowing for user-defined constraints and objectives (see the sketch below)
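As a sketch of the extensibility point above (the function, bound, and solver choice are illustrative, and register/@NLobjective is JuMP's legacy nonlinear interface, which still works in JuMP 1.x):
using JuMP, Ipopt
# A user-defined Julia function made available inside nonlinear expressions.
my_f(x) = exp(-x) + x^2
model = Model(Ipopt.Optimizer)
@variable(model, x >= 0)
# register() exposes my_f to the @NL macros; autodiff supplies derivatives.
register(model, :my_f, 1, my_f; autodiff = true)
@NLobjective(model, Min, my_f(x))
optimize!(model)
println("x = ", value(x))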
Cons
- Steeper learning curve for users unfamiliar with Julia programming
- Documentation can be overwhelming for beginners
- Some advanced features may require in-depth knowledge of optimization theory
- Dependency on external solvers for certain problem types
Code Examples
- Linear Programming Example:
using JuMP, HiGHS
model = Model(HiGHS.Optimizer)
@variable(model, x >= 0)
@variable(model, y >= 0)
@objective(model, Max, 5x + 3y)
@constraint(model, x + 5y <= 3)
optimize!(model)
println("Optimal solution: x = ", value(x), ", y = ", value(y))
- Quadratic Programming Example:
using JuMP, Ipopt
model = Model(Ipopt.Optimizer)
@variable(model, x)
@variable(model, y)
@objective(model, Min, x^2 + y^2)
@constraint(model, x + y >= 1)
optimize!(model)
println("Optimal solution: x = ", value(x), ", y = ", value(y))
- Mixed-Integer Programming Example:
using JuMP, GLPK
model = Model(GLPK.Optimizer)
@variable(model, 0 <= x <= 2, Int)
@variable(model, 0 <= y <= 30)
@objective(model, Max, 5x + 3y)
@constraint(model, x + 5y <= 3.5)
optimize!(model)
println("Optimal solution: x = ", value(x), ", y = ", value(y))
Getting Started
To get started with JuMP, first install Julia, then install JuMP and a solver:
using Pkg
Pkg.add("JuMP")
Pkg.add("HiGHS") # or any other supported solver
using JuMP, HiGHS
# Create a simple model
model = Model(HiGHS.Optimizer)
@variable(model, x >= 0)
@objective(model, Max, x)
@constraint(model, x <= 10)
optimize!(model)
println("Optimal value: ", objective_value(model))
This example sets up a basic linear programming model, solves it, and prints the optimal value.
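After optimize! returns, it is good practice to confirm that the solver actually reported an optimal solution before reading values. Continuing the model above (MOI is the MathOptInterface module that JuMP re-exports):
if termination_status(model) == MOI.OPTIMAL
    println("Solved to optimality: x = ", value(x))
else
    println("Solver stopped with status ", termination_status(model))
end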
Competitor Comparisons
Optim.jl: Optimization functions for Julia
Pros of Optim.jl
- Focused on unconstrained and box-constrained optimization problems
- Simpler API for straightforward optimization tasks
- Includes derivative-free optimization methods (see the sketch after this list)
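A minimal sketch of the derivative-free point above, using Optim.jl's NelderMead method (the quadratic objective is only illustrative):
using Optim
# Nelder-Mead needs only function values, no gradients.
f(x) = (x[1] - 1.0)^2 + (x[2] + 2.0)^2
result = optimize(f, [0.0, 0.0], NelderMead())
println(Optim.minimizer(result))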
Cons of Optim.jl
- Limited support for constrained optimization problems
- Fewer solver options compared to JuMP.jl
- Less flexibility in problem formulation
Code Comparison
Optim.jl:
using Optim
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
result = optimize(rosenbrock, [0.0, 0.0], BFGS())
JuMP.jl:
using JuMP, Ipopt
model = Model(Ipopt.Optimizer)
@variable(model, x[1:2])
@NLobjective(model, Min, (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2)
optimize!(model)
Optim.jl provides a more concise syntax for simple optimization problems, while JuMP.jl offers greater flexibility and support for complex constrained optimization problems. Optim.jl is better suited for unconstrained or box-constrained problems, whereas JuMP.jl excels in handling a wide range of optimization problem types and integrating with various solvers.
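To make the box-constrained case concrete, here is a sketch using Optim.jl's Fminbox wrapper (the bounds are illustrative; gradients are obtained by finite differencing since none are supplied):
using Optim
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
lower = [-1.0, -1.0]
upper = [2.0, 2.0]
# Fminbox wraps an inner first-order method so iterates respect the bounds.
result = optimize(rosenbrock, lower, upper, [0.0, 0.0], Fminbox(BFGS()))
println(Optim.minimizer(result))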
README
JuMP is a domain-specific modeling language for mathematical optimization embedded in Julia. You can find out more about us by visiting jump.dev.
| | Latest release | Development version |
|---|---|---|
| Install | import Pkg; Pkg.add("JuMP") | import Pkg; Pkg.pkg"add JuMP#master" |
| Get help | Ask a question on the Community forum | Join the Developer chatroom |
| Source code | release-1.0 | master |
Need help?
Use the Community forum to search for answers to previously asked questions, or ask a new question.
The post "Please read: make it easier to help you" describes best practices for asking a question.
Bug reports
Please report any issues via the GitHub issue tracker. All types of issues are welcome and encouraged; this includes bug reports, documentation typos, feature requests, etc.
Citing JuMP
If you find JuMP useful in your work, we kindly request that you cite the following paper (journal, preprint):
@article{Lubin2023,
author = {Miles Lubin and Oscar Dowson and Joaquim {Dias Garcia} and Joey Huchette and Beno{\^i}t Legat and Juan Pablo Vielma},
title = {{JuMP} 1.0: {R}ecent improvements to a modeling language for mathematical optimization},
journal = {Mathematical Programming Computation},
volume = {15},
pages = {581--589},
year = {2023},
doi = {10.1007/s12532-023-00239-3}
}
For earlier works, see:
- Our paper in SIAM Review (journal, pdf):

@article{DunningHuchetteLubin2017,
author = {Iain Dunning and Joey Huchette and Miles Lubin},
title = {{JuMP}: {A} {M}odeling {L}anguage for {M}athematical {O}ptimization},
journal = {SIAM Review},
volume = {59},
number = {2},
pages = {295-320},
year = {2017},
doi = {10.1137/15M1020575},
}
- Our paper in IJOC (journal, preprint):

@article{LubinDunningIJOC,
author = {Miles Lubin and Iain Dunning},
title = {{C}omputing in {O}perations {R}esearch {U}sing {J}ulia},
journal = {INFORMS Journal on Computing},
volume = {27},
number = {2},
pages = {238-248},
year = {2015},
doi = {10.1287/ijoc.2014.0623},
}
JuMP is a Sponsored Project of NumFOCUS, a 501(c)(3) nonprofit charity in the United States. NumFOCUS provides JuMP with fiscal, legal, and administrative support to help ensure the health and sustainability of the project. Visit numfocus.org for more information.
You can support JuMP by donating.
Donations to JuMP are managed by NumFOCUS. For donors in the United States, your gift is tax-deductible to the extent provided by law. As with any donation, you should consult with your tax adviser about your particular tax situation.
JuMP's largest expense is the annual JuMP-dev workshop. Donations will help us provide travel support for JuMP-dev attendees and take advantage of other opportunities that arise to support JuMP development.