Improve access to LpProblem.variables() #776
Hello! Thanks for flagging. I haven't analyzed this logic much in the past, mainly because it's never been the bottleneck in my use cases. Is it in yours? How are you using it? The relevant logic is in pulp/pulp/pulp.py, lines 1574 to 1605 in 6af3801.

And I'm guessing that even if you just added one or two variables and re-sorted, the sort would go really fast, because it starts from an almost-sorted list.
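The "almost sorted" intuition holds because CPython's `list.sort()` uses Timsort, which is adaptive: it runs in near-linear time when the input consists of long already-sorted runs. A minimal sketch (the variable names here are illustrative, not PuLP's):

```python
# Re-sorting after appending a couple of items to an already-sorted
# list: Timsort detects the long sorted run and finishes quickly.
names = sorted(f"x_{i:06d}" for i in range(100_000))

# Simulate adding two new variables, then re-sorting by name.
names.append("a_new_var")
names.append("z_new_var")
names.sort()

assert names[0] == "a_new_var"
assert names[-1] == "z_new_var"
assert all(names[i] <= names[i + 1] for i in range(len(names) - 1))
```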
My use case is implementing reduced-cost variable fixing while doing some lexicographical multi-objective optimisation on a pseudo-Boolean problem (all variables are binary). This means that for each of my lexicographic objectives, I'd do something like:

```python
deactivated = set()
while True:
    for variable in problem.variables():
        if gap < reducedCost[variable]:
            variable.upBound = 0
            deactivated.add(variable)
    problem.solve()
    if problem.objective.value() > target:
        target += 1
        for variable in deactivated:
            variable.upBound = 1
        deactivated = set()
    else:
        break
```

It wasn't a huge bottleneck, but I was able to go from ~10 seconds to ~6 seconds on my overall problem by avoiding that repeated call. For the record, I have ~200k-300k variables and 250 constraints in test instances, and significantly more in the real instances I want to solve.
Ok, that's indeed a niche way to solve a problem many, many times. In your example, you could hoist the call out of the loop:

```python
deactivated = set()
my_variables = problem.variables()
while True:
    for variable in my_variables:
        if gap < reducedCost[variable]:
            variable.upBound = 0
            deactivated.add(variable)
    problem.solve()
    if problem.objective.value() > target:
        target += 1
        for variable in deactivated:
            variable.upBound = 1
        deactivated = set()
    else:
        break
```
That's exactly what I did, in the end. Once I get to this step, I am no longer adding variables or constraints. As for the technique itself, it's only useful if the objective value of the linear relaxation is often close (within an integer or two) to the objective value of the integer problem itself. If you want more background, our paper using this technique (and the code I'm re-implementing) is at https://pubsonline.informs.org/doi/full/10.1287/opre.2022.2374
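For readers unfamiliar with reduced-cost fixing, here is a toy numeric sketch of the condition used in the loop above (all numbers are made up for illustration): in a minimisation problem, if raising a variable above 0 would add more than the gap between the LP relaxation bound and the target integer objective, that variable can safely be fixed to 0.

```python
# Illustrative numbers only (not from the issue).
lp_bound = 10.4                 # objective of the LP relaxation (minimising)
target = 11.0                   # integer objective we still hope to reach
gap = target - lp_bound         # ~0.6: slack between bound and target

reduced_cost = {"x1": 0.8, "x2": 0.3}

# Any variable whose reduced cost exceeds the gap would, if raised
# above 0, push the LP bound past the target -- so it can be fixed to 0.
fixable = sorted(v for v, rc in reduced_cost.items() if rc > gap)
print(fixable)  # -> ['x1']
```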
Describe the new feature

For what seems like historical reasons, accessing `LpProblem.variables()` calls `addVariables()` internally (pulp/pulp/pulp.py, line 1593 in 6af3801); `addConstraint` calls the same code when a constraint is added. Additionally, after doing this, `LpProblem.variables()` sorts the variables by name. For me, this is not necessary and just a waste of cycles. I tried to check the history, and it seems it's just been this way for at least 12 years.

My first question is whether there's a need to call `addVariables()` at all inside `variables()`. If not, can we simplify `LpProblem.variables()` to just return `self._variables`?

For what it's worth, I'm currently bypassing `variables()` in my code and accessing (and caching) `LpProblem._variables` for now. However, I realise this is bad practice.

Additional info
Please answer these questions before submitting your feature request.
Is your feature request related to an issue? Please include the issue number.
I don't think so.
Does this feature exist in another product or project? Please provide a link.
I'm guessing it probably does.
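One way the proposed simplification could keep the sorted-by-name behaviour without paying for it on every call: cache the sorted list and invalidate the cache whenever the problem is modified. This is only a sketch under assumed names (`Problem`, `add_variable`), not PuLP's actual code:

```python
class Variable:
    def __init__(self, name):
        self.name = name

class Problem:
    """Toy stand-in for LpProblem, showing one caching strategy."""
    def __init__(self):
        self._variables = []
        self._sorted_cache = None

    def add_variable(self, v):
        self._variables.append(v)
        self._sorted_cache = None  # invalidate: the problem changed

    def variables(self):
        # Pay the sorting cost only after a modification.
        if self._sorted_cache is None:
            self._sorted_cache = sorted(self._variables, key=lambda v: v.name)
        return self._sorted_cache

p = Problem()
p.add_variable(Variable("b"))
p.add_variable(Variable("a"))
first = p.variables()
assert [v.name for v in first] == ["a", "b"]
assert p.variables() is first          # cached: no re-sort
p.add_variable(Variable("c"))
assert p.variables() is not first      # invalidated after mutation
```

Tight inner loops like the one above would then hit the cache on every iteration, matching the manual hoisting workaround without requiring callers to know about it.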