[Bug]: optimize_acqf_mixed does not respect parameter constraint with fixed features #2708
Thanks for raising this issue. I'm having some trouble with the reproducible example, though.
Hi, my bad. Here is the right code, without using the prediction model. I modified my code to test each element of `fixed_features_list` individually, in order to eliminate fixed features that are incompatible with the constraints. In this code I kept just one fixed feature (one that does not respect the constraints on the S variables). With `optimize_acqf_mixed`, if a fixed feature violated a constraint, the optimization would raise an error, allowing me to identify and filter it out. I also noticed a bug related to this behavior (see the pending PR: #2614), but my current code does not yet incorporate this fix.
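As an alternative to probing each fixed-features dict by catching optimization errors, infeasible combinations can be screened analytically against linear inequality constraints in BoTorch's `(indices, coefficients, rhs)` convention, meaning `sum_i coefficients[i] * x[indices[i]] >= rhs`. A minimal sketch (the bounds and the example constraint below are hypothetical stand-ins, not the constraints from this issue):

```python
def fixed_features_feasible(fixed_features, inequality_constraints, lower, upper):
    """Screen a fixed_features dict against linear inequality constraints.

    Each constraint is (indices, coefficients, rhs), read as
    sum_i coefficients[i] * x[indices[i]] >= rhs (BoTorch's convention).
    Feasible iff the largest achievable left-hand side -- pinning the
    fixed dims and pushing free dims to their favorable bound -- reaches rhs.
    """
    for indices, coefficients, rhs in inequality_constraints:
        best = 0.0
        for idx, coef in zip(indices, coefficients):
            if idx in fixed_features:
                best += coef * fixed_features[idx]
            else:
                best += coef * (upper[idx] if coef >= 0 else lower[idx])
        if best < rhs - 1e-8:
            return False
    return True

lower, upper = [0.0] * 5, [1.0] * 5
# Hypothetical constraint: x1 + x4 >= 0.5
constraints = [([1, 4], [1.0, 1.0], 0.5)]
candidates = [{0: 0.0, 1: 0.0}, {1: 0.0, 4: 0.0}]
feasible = [ff for ff in candidates
            if fixed_features_feasible(ff, constraints, lower, upper)]
# {0: 0.0, 1: 0.0} survives (x4 can still reach 1.0); {1: 0.0, 4: 0.0} cannot.
```

This avoids running the optimizer once per fixed-features dict just to see whether it errors out.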
@ysellai unfortunately your code is still not fully runnable - the `lhs` function is undefined.
This request generally makes sense to me. I don't think it'll be easy to make any changes on the scipy side here, but we could incorporate the validation on the BoTorch side. It will be easier to investigate and pinpoint this if you share a fully runnable repro.
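One way such validation could look (a sketch only, not BoTorch's implementation): run the scipy optimizer with the fixed features encoded as degenerate bounds, then explicitly re-check the returned point against the constraints before accepting it. The objective, bounds, and constraint below are hypothetical stand-ins:

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Hypothetical smooth objective standing in for the acquisition value.
    return float(np.sum((x - 0.3) ** 2))

fixed = {0: 0.0, 3: 0.0}  # fixed features encoded as pinned dimensions
bounds = [(fixed[i], fixed[i]) if i in fixed else (0.0, 1.0) for i in range(5)]
cons = [{"type": "ineq", "fun": lambda x: x[1] + x[4] - 0.5}]  # x1 + x4 >= 0.5
x0 = np.array([fixed.get(i, 0.5) for i in range(5)])

res = minimize(objective, x0, method="SLSQP", bounds=bounds, constraints=cons)

# The key step: do not trust res.x blindly -- validate it before accepting,
# since SLSQP can report success (or only a warning) on infeasible output.
feasible = res.success and all(c["fun"](res.x) >= -1e-6 for c in cons)
```

The re-check is cheap relative to the optimization itself, which is why doing it on the caller's side seems plausible.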
My bad, I forgot to include the import for `lhs` in that code. To make it fully runnable, simply add this import at the beginning: `from pyDOE import lhs`.
This should resolve the issue with `lhs_samples = lhs(n_dims, samples=num_points)`; I am working with pyDOE==0.3.8. Also, do you think that using Ax could work with constraints, for the permutation constraints I simulate with fixed features (e.g. one or two null variables each time)? I've tried to use non-linear constraints with BoTorch, but the batch initialization makes it too complex.
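For readers without pyDOE installed, the call `lhs(n_dims, samples=num_points)` can be approximated with a few lines of NumPy (a sketch of plain Latin hypercube sampling, not pyDOE's exact implementation or any of its `criterion` options):

```python
import numpy as np

def latin_hypercube(n_dims, samples, rng=None):
    """Minimal Latin hypercube sampler: exactly one point per stratum in
    each dimension, with strata independently permuted per dimension."""
    rng = np.random.default_rng(rng)
    # A random permutation of the strata 0..samples-1 for each dimension,
    # plus a random offset within each stratum, scaled to [0, 1).
    offsets = rng.random((samples, n_dims))
    strata = np.argsort(rng.random((samples, n_dims)), axis=0)
    return (strata + offsets) / samples

lhs_samples = latin_hypercube(5, 20, rng=0)  # shape (20, 5), values in [0, 1)
```

Each column covers all 20 strata exactly once, which is the defining property of a Latin hypercube design.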
Hi everyone, I am facing an issue using inequality constraints and fixed features.
Issue Description
When using `optimize_acqf_mixed` in BoTorch with inequality constraints and fixed features, I encounter inconsistent behavior:
1. Error with `CandidateGenerationError`: In some cases, I get an error indicating that candidate generation failed because no valid point could satisfy both the constraints and the fixed features.
2. Warning from `scipy.optimize.minimize`: In other cases, I see a warning from `scipy.optimize.minimize`. Despite the warning, a candidate is generated.
3. No warning but invalid candidates: Occasionally, no warning or error is raised, but the generated candidate violates one or more of the inequality constraints. For example, a generated candidate violates constraint 3 (`S >= 15.4`).
Another weird behaviour:
With fixed_features_list =
[{0: 0.0, 1: 0.0},
{0: 0.0, 2: 0.0},
{0: 0.0, 3: 0.0},
{0: 0.0, 4: 0.0},
{1: 0.0, 2: 0.0},
{1: 0.0, 3: 0.0},
{1: 0.0, 4: 0.0},
{2: 0.0, 3: 0.0},
{2: 0.0, 4: 0.0},
{3: 0.0, 4: 0.0}]
I get: CandidateGenerationError: Inequality constraint 2 not met with fixed_features.
In the first case, with this full fixed_features list, at least one of the fixed features ({0: 0.0, 4: 0.0}) should work, but BoTorch returns the error.
In the second case, with only the fixed feature {1: 0.0, 4: 0.0} in the list, it finds a point that respects the constraints.
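A post-hoc check like the following can at least flag the silent failures in case 3 by re-evaluating each returned candidate against the constraints. The candidate values and the constraint below are hypothetical; representing `S >= 15.4` as a weighted sum over the S variables is an assumption about its form:

```python
import numpy as np

def violated_constraints(candidate, inequality_constraints, atol=1e-6):
    """Return indices of linear inequality constraints that fail.

    Each constraint is (indices, coefficients, rhs), read as
    sum_i coefficients[i] * x[indices[i]] >= rhs.
    """
    bad = []
    for k, (idx, coef, rhs) in enumerate(inequality_constraints):
        lhs = float(np.dot(candidate[idx], coef))
        if lhs < rhs - atol:
            bad.append(k)
    return bad

# Hypothetical candidate and constraint: sum of the 5 variables >= 15.4.
candidate = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
constraints = [(np.arange(5), np.ones(5), 15.4)]
print(violated_constraints(candidate, constraints))  # [0], since 15.0 < 15.4
```

Running this on every candidate returned by the optimizer makes the inconsistent behaviors above observable rather than silent.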
Expected Behavior
The optimization should raise an error (`CandidateGenerationError`) only when no valid candidate can be generated.
Actual Behavior
BoTorch reports a warning from `scipy.optimize.minimize` but proceeds to generate candidates, and sometimes silently returns candidates that violate the constraints. I understand that back in 2022 BoTorch could return a point that doesn't respect the constraints because of scipy, but I don't know whether this has since been fixed, and I can't explain the other behaviors.
Your help will be much appreciated.
Please provide a minimal, reproducible example of the unexpected behavior.
Steps to Reproduce
Here is a code to reproduce this case:
With fixed_features_list =
[{0: 0.0, 1: 0.0},
{0: 0.0, 2: 0.0},
{0: 0.0, 3: 0.0},
{0: 0.0, 4: 0.0},
{1: 0.0, 2: 0.0},
{1: 0.0, 3: 0.0},
{1: 0.0, 4: 0.0},
{2: 0.0, 3: 0.0},
{2: 0.0, 4: 0.0},
{3: 0.0, 4: 0.0}]
the error is: CandidateGenerationError: Inequality constraint 2 not met with fixed_features.
In this case, at least one of the fixed features ({0: 0.0, 4: 0.0}) should work, but BoTorch returns the error.
In the second case, with only the fixed feature {1: 0.0, 4: 0.0} in the list, it finds a point that respects the constraints.
I call `optimize_acqf_mixed` with a set of fixed features, using the `qExpectedImprovement` acquisition function and parameters like the ones above.
Please paste any relevant traceback/logs produced by the example provided.
BoTorch Version
0.12.0
Python Version
3.12
Operating System
Linux
Code of Conduct