
Question regarding convergence of integer optimization #542

Open
yolking opened this issue Jan 17, 2025 · 7 comments


yolking commented Jan 17, 2025

Hello! I am using the pre-release version because I am interested in integer optimization. On the one hand, it does get me fairly close to the global optimum quickly.

from itertools import count

from bayes_opt import BayesianOptimization

pbounds = {
    "x1": (1, 1000, int),
    "x2": (1, 1000, int),
    "x3": (1, 1000, int),
}
optimizer = BayesianOptimization(
    f=opt_func,
    pbounds=pbounds,
    verbose=2,
)

best_target_ind = 0
best_target = float("-inf")
for i in count():
    next_point = optimizer.suggest()
    target = opt_func(**next_point)
    optimizer.register(params=next_point, target=target)
    if target > best_target:
        best_target = target
        best_target_ind = i
    if i - best_target_ind > 150:  # stop after 150 iterations without improvement
        print(f"\t{next_point}")
        break
print("Best result:")
print(optimizer.max)

But BO seems to make very little effort to improve the found optimum, and continues to look far away from the found optimal point. I checked the docs for ways to make it search closer to the found optimum. SequentialDomainReductionTransformer may be one of them, but it isn't currently available for integer optimization. Changing the acquisition function after N unsuccessful iterations to
acquisition.ExpectedImprovement(xi=0.0)
seems like another possible solution, but it doesn't seem to do much.
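For context on what xi does: it enters expected improvement as EI(x) = (μ − f_best − ξ)·Φ(z) + σ·φ(z) with z = (μ − f_best − ξ)/σ, so a larger ξ discounts the posterior mean and biases toward exploration, while ξ = 0 is the greediest setting. A standalone sketch of that formula (plain Python, not bayes_opt's own implementation):

```python
import math

def expected_improvement(mu, sigma, f_best, xi=0.0):
    """EI of a Gaussian posterior N(mu, sigma^2) over incumbent f_best,
    with exploration offset xi (maximization convention)."""
    if sigma == 0.0:
        return max(mu - f_best - xi, 0.0)
    z = (mu - f_best - xi) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal density
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # standard normal CDF
    return (mu - f_best - xi) * cdf + sigma * pdf
```

Since dEI/dξ = −Φ(z) < 0, increasing xi strictly lowers EI at any candidate, which is why xi=0.0 is already the most exploitative choice available through that knob.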
Here is a chunk of the log from my optimization. The best value, 0.5739394489614946 at (129, 2, 740), was found on evaluation 270, and for the following 150 iterations no better point was found. The unreached global optimum is 0.5751567341679235 at (128, 2, 711).

  1. 0.5739394489614946 : ( 129, 2, 740)
  2. 0.562122923291349 : ( 141, 433, 755)
  3. 0.5637633205086482 : ( 133, 591, 706)
  4. 0.5671965653573532 : ( 137, 75, 682)
  5. 0.5648119388511881 : ( 135, 384, 718)
  6. 0.5556753869790383 : ( 168, 94, 804)
  7. 0.5391487427464875 : ( 201, 750, 693)
  8. 0.5620342532315347 : ( 135, 575, 757)
  9. 0.5094327951926114 : ( 441, 6, 710)
  10. 0.49312577749028136 : ( 325, 854, 978)
  11. 0.5717251849734983 : ( 129, 45, 700)
  12. 0.4993160246208407 : ( 389, 388, 934)
  13. 0.5616862869471217 : ( 112, 19, 756)
  14. 0.5668709566752652 : ( 141, 94, 728)
  15. 0.5412575931496266 : ( 120, 235, 856)
  16. 0.5260654968073734 : ( 293, 224, 772)
  17. 0.5607839319758643 : ( 164, 1, 808)
  18. 0.5570700592187176 : ( 154, 426, 727)
  19. 0.5679625492297719 : ( 132, 2, 789)
  20. 0.4135498252513754 : (1000, 646, 641)
  21. 0.5578181264865069 : ( 145, 265, 795)
  22. 0.4923969331624663 : ( 559, 1, 709)
  23. 0.5713064661872675 : ( 132, 29, 701)
  24. 0.5496693450599132 : ( 127, 287, 581)
  25. 0.5608862509741278 : ( 134, 475, 781)
  26. 0.5653586598894028 : ( 123, 395, 676)
  27. 0.4899158653866104 : ( 399, 95, 999)
  28. 0.5738924692628588 : ( 125, 1, 699)
  29. 0.5649884129475095 : ( 150, 6, 754)
  30. 0.5403166987731776 : ( 198, 545, 821)
  31. 0.5048640345902715 : ( 481, 590, 863)
  32. 0.5152179966179371 : ( 420, 8, 849)
  33. 0.5695567422933128 : ( 122, 76, 700)
  34. 0.5526048705799629 : ( 153, 687, 682)
  35. 0.551958875136273 : ( 189, 3, 680)
  36. 0.5220407644276102 : ( 329, 136, 877)
  37. 0.5624521509865348 : ( 139, 479, 704)
  38. 0.5624314035303094 : ( 133, 123, 790)
  39. 0.5726957026706154 : ( 132, 3, 743)
  40. 0.5574865183309355 : ( 147, 601, 733)
  41. -0.003711854927608495: ( 991, 338, 14)
  42. 0.5717020699778077 : ( 133, 22, 703)
  43. 0.5659367577295359 : ( 135, 5, 633)
  44. 0.5712659572231308 : ( 125, 29, 678)
  45. 0.5639185645465573 : ( 146, 2, 798)
  46. 0.5677979749663973 : ( 119, 70, 674)
  47. 0.5721542636991069 : ( 129, 32, 692)
  48. 0.5728171813402892 : ( 124, 19, 694)
  49. 0.5727085645868546 : ( 137, 2, 727)
  50. 0.5733057101998734 : ( 128, 20, 693)
  51. 0.5622654692128538 : ( 117, 186, 651)
  52. 0.5730132343477814 : ( 132, 7, 716)
  53. 0.5713460763264467 : ( 131, 23, 684)
  54. 0.5722638791551792 : ( 122, 2, 741)
  55. 0.4259971144997654 : ( 997, 265, 991)
  56. 0.5712120332543718 : ( 139, 6, 709)
  57. 0.5669250140427018 : ( 139, 91, 716)
  58. 0.4380931357144683 : ( 543, 1, 495)
  59. 0.5702820490066987 : ( 136, 39, 720)
  60. 0.5714631035555393 : ( 119, 1, 726)
  61. 0.48784666007730465 : ( 539, 827, 739)
  62. 0.5711345693363578 : ( 132, 27, 693)
  63. 0.5640471909903313 : ( 134, 229, 767)
  64. 0.5713302396633932 : ( 136, 20, 718)
  65. 0.5705349537264771 : ( 121, 29, 674)
  66. 0.5685407764156462 : ( 122, 10, 769)
  67. 0.5709470591129177 : ( 130, 31, 748)
  68. 0.5717805631344037 : ( 127, 37, 711)
  69. 0.5706775254772627 : ( 133, 1, 767)
  70. 0.5654006178672313 : ( 127, 161, 760)
  71. 0.5696959478895336 : ( 115, 2, 706)
  72. 0.56588772894393 : ( 121, 203, 728)
Is this expected behavior? Other algorithms like DE and CMA become much more focused on converging around the optimal point after some time. Is it currently possible, after, say, 100 unsuccessful iterations, to search for the optimum close to the best point found so far? Modifying the bounds by hand, or restarting with smaller bounds, is probably one possible solution.
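That manual bound-shrinking idea can be sketched as a small helper. Everything below is hypothetical (not part of bayes_opt), though the optimizer does expose a set_bounds method that the result could be passed to:

```python
def shrink_bounds(pbounds, best_params, factor=0.1):
    """Return new integer bounds centered on best_params, with each range
    shrunk to roughly `factor` of its original width and clipped to the
    original bounds. Hypothetical helper, not part of bayes_opt."""
    new_bounds = {}
    for name, (lo, hi, *rest) in pbounds.items():
        half = max(1, int((hi - lo) * factor / 2))  # half-width of the new range
        center = best_params[name]
        new_lo = max(lo, center - half)
        new_hi = min(hi, center + half)
        new_bounds[name] = (new_lo, new_hi, *rest)  # keep the type tag, e.g. int
    return new_bounds

# e.g. shrink_bounds({"x1": (1, 1000, int)}, {"x1": 129}) -> {"x1": (80, 178, int)}
```

One would then call something like optimizer.set_bounds(shrink_bounds(pbounds, optimizer.max["params"])) after a stretch of unsuccessful iterations, at the cost of ruling out the region outside the shrunken box.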

till-m commented Jan 17, 2025

Hey @yolking,

would it be possible for you to post a complete snippet, including the function you're optimizing, so that I can run the code and check for myself?


till-m commented Jan 24, 2025

I assume the problem is fixed. Feel free to re-open if you encounter it again.

till-m closed this as completed Jan 24, 2025

yolking commented Jan 27, 2025

The issue isn't fixed; it's just that this function is evaluated on specific dataframes and I can't easily share it all publicly. I'll reopen the issue if I come up with a simple example. For now I have focused on CMA libraries, where this issue doesn't happen, though they have other, smaller issues.


yolking commented Jan 27, 2025

I don't have permissions to reopen the issue. I hope you'll see this comment.
Here is a simple example where it is unable to find the obvious global optimum, and I also don't see any signs of convergence:

import numpy as np
from bayes_opt import BayesianOptimization
from itertools import count
from termcolor import colored

np.random.seed(42)

class Problem:
    def gen_weights(self, x):
        return x[0] * x[1] + x[2]

# Objective function (negated, since BayesianOptimization maximizes)
def opt_func(x1, x2, x3):
    weights = Problem().gen_weights(x=[x1, x2, x3])
    return -1 * weights

# Define the bounds for the parameters
pbounds = {
    "x1": (1, 1000, int),  # Example bounds for integer variable x1
    "x2": (1, 1000, int),
    "x3": (1, 1000, int)
}

# Initialize Bayesian Optimization
optimizer = BayesianOptimization(
    f=opt_func,
    pbounds=pbounds,
    verbose=2,  # Verbosity: 1 prints only warnings, 0 is silent, 2 prints all
    random_state=42,  # For reproducibility
    allow_duplicate_points=True
)
best_target_ind = 0
best_target = float("-inf")
for i in count():
    next_point = optimizer.suggest()
    target = opt_func(**next_point)
    optimizer.register(params=next_point, target=target)
    if target > best_target:
        best_target = target
        best_target_ind = i
        print(colored(f"\t{i:<2} {target:<20}: ({next_point['x1']:>4}, {next_point['x2']:>4}, {next_point['x3']:>4})", 'blue'))
    else:
        print(f"\t{i:<2} {target:<20}: ({next_point['x1']:>4}, {next_point['x2']:>4}, {next_point['x3']:>4})")
    if i - best_target_ind > 250:  # stop after 250 iterations without improvement
        print(f"\t{next_point}")
        break
print("Best result:")
print(optimizer.max)

Maybe it cannot find the optimum because it lies on the boundary, but it also continues looking very far from it even 300 iterations later.

0  -45769  : ( 103,  436,  861)
1  -52310  : ( 459,  112,  902)
2  -53903  : ( 108,  491,  875)
3  -64057  : ( 182,  349,  539)
4  -38637  : (  83,  455,  872)
5  -41202  : (  90,  448,  882)
6  -21256  : (  67,  311,  419)
7  -37134  : (  79,  459,  873)
8  -12833  : (  47,  264,  425)
9  -21208  : (  99,  210,  418)
10 -4699   : (  17,  256,  347)
11 -692: (   2,  160,  372)
12 -18435  : (  87,  207,  426)
13 -635: (   4,  105,  215)
14 -6407   : (  26,  244,   63)
15 -67 : (   7,8,   11)
16 -8118   : ( 208,   39,6)
17 -13513  : (  20,  674,   33)
18 -111903 : ( 112,  999,   15)
19 -20465  : (  95,  211,  420)
20 -831: (   2,   24,  783)
21 -17322  : ( 961,   18,   24)
22 -453103 : ( 985,  460,3)
23 -3849   : ( 980,3,  909)
24 -3505   : ( 771,4,  421)
25 -8843   : (   8,  985,  963)
26 -946645 : ( 982,  963,  979)
27 -3097   : (   3,  886,  439)
28 -3316   : ( 657,5,   31)
29 -1326   : ( 166,2,  994)
30 -1674   : ( 999,1,  675)
31 -1597   : ( 497,2,  603)
32 -3792   : ( 700,4,  992)
33 -7967   : (   8,  905,  727)
34 -2294   : ( 799,2,  696)
35 -735: ( 423,1,  312)
36 -1244   : (   1,  718,  526)
37 -1365   : (  17,   23,  974)
38 -4691   : ( 899,5,  196)
39 -3330   : (   3,  784,  978)
40 -1475   : (   2,  633,  209)
41 -1406   : ( 271,3,  593)
42 -548: (  11,2,  526)
43 -1244   : ( 419,1,  825)
44 -1200   : ( 392,3,   24)
45 -946: ( 661,1,  285)
46 -9975   : ( 889,   11,  196)
47 -537: (   1,  512,   25)
48 -623: ( 193,2,  237)
49 -2338   : (   7,  192,  994)
50 -3402   : ( 997,3,  411)
51 -1869   : ( 445,2,  979)
52 -310: (   8,3,  286)
53 -1509   : (   4,  203,  697)
54 -1569   : (   1,  991,  578)
55 -1003   : (   1,  426,  577)
56 -953: (   2,  400,  153)
57 -1214   : ( 177,2,  860)
58 -608: ( 485,1,  123)
59 -2190   : ( 698,2,  794)
60 -2276   : (   2,  733,  810)
61 -1583   : (   1,  591,  992)
62 -734: ( 285,1,  449)
63 -1103   : (   1,  843,  260)
64 -4913   : (   5,  787,  978)
65 -978: ( 537,1,  441)
66 -985: (   1,  560,  425)
67 -892: (   5,  177,7)
68 -2506   : ( 850,2,  806)
69 -2344   : (   2,  998,  348)
70 -732: (  54,1,  678)
71 -889: ( 800,1,   89)
72 -736: ( 147,2,  442)
73 -1914   : ( 917,1,  997)
74 -1284   : (   1,  581,  703)
75 -456: ( 344,1,  112)
76 -653: (   1,   67,  586)
77 -248: (  16,5,  168)
78 -967: (   1,  109,  858)
79 -1128   : ( 641,1,  487)
80 -187: ( 164,1,   23)
81 -1363   : (   1,  370,  993)
82 -1499   : ( 928,1,  571)
83 -1012   : (   1,  675,  337)
84 -2732   : (   2,  999,  734)
85 -796: (   2,  385,   26)
86 -476: ( 171,2,  134)
87 -759: (   1,  484,  275)
88 -839: ( 829,1,   10)
89 -1768   : (   1,  899,  869)
90 -686: (   1,  584,  102)
91 -400: (   4,   88,   48)
92 -263: ( 249,1,   14)
93 -1266   : ( 358,1,  908)
94 -760: ( 592,1,  168)
95 -968: ( 258,1,  710)
96 -1103   : (   1,  382,  721)
97 -572: ( 349,1,  223)
98 -880: (   2,  324,  232)
99 -931: (   1,  772,  159)
100 -358: (   1,  203,  155)
101 -793: (   1,  390,  403)
102 -1121   : (   1,  289,  832)
103 -633: ( 313,1,  320)
104 -565: ( 547,1,   18)
105 -1226   : ( 537,1,  689)
106 -438: ( 117,1,  321)
107 -409: (   2,8,  393)
108 -795: (   1,  245,  550)
109 -1151   : (   1,  611,  540)
110 -1238   : ( 916,1,  322)
111 -894: ( 411,1,  483)
112 -118: (   9,5,   73)
113 -201: (  72,1,  129)
114 -1453   : ( 547,1,  906)
115 -647: (  98,1,  549)
116 -420: (   1,  162,  258)
117 -74 : (  67,1,7)
118 -1314   : ( 662,1,  652)
119 -1092   : ( 403,1,  689)
120 -642: (   1,  118,  524)
121 -514: (   2,   66,  382)
122 -11 : (   4,2,3)
123 -857: ( 565,1,  292)
124 -352: (   1,  273,   79)
125 -963: (  46,2,  871)
126 -680: (   7,2,  666)
127 -306: (  67,1,  239)
128 -1289   : ( 292,1,  997)
129 -716: ( 709,1,7)
130 -71 : (   2,   35,1)
131 -83 : (  14,2,   55)
132 -1445   : (   1,  997,  448)
133 -258: (   1,  199,   59)
134 -976: (   1,  739,  237)
135 -659: ( 447,1,  212)
136 -229: (   2,  113,3)
137 -214: (   2,   47,  120)
138 -701: (   1,  669,   32)
139 -1373   : (   1,  742,  631)
140 -828: ( 167,1,  661)
141 -1092   : ( 807,1,  285)
142 -390: (   2,   59,  272)
143 -808: (   1,  666,  142)
144 -623: ( 232,1,  391)
145 -274: (   1,  250,   24)
146 -236: (  75,2,   86)
147 -360: ( 240,1,  120)
148 -730: (   1,  405,  325)
149 -1110   : ( 992,1,  118)
150 -1451   : (   1,  581,  870)
151 -1785   : ( 837,1,  948)
152 -1054   : (  16,4,  990)
153 -63 : (   6,3,   45)
154 -890: (   3,1,  887)
155 -270: (   1,  169,  101)
156 -262: (   1,   65,  197)
157 -486: (  45,2,  396)
158 -85 : (   1,   18,   67)
159 -325: (  96,1,  229)
160 -241: (   1,  118,  123)
161 -794: ( 677,1,  117)
162 -201: (   5,2,  191)
163 -450: (   1,  341,  109)
164 -42 : (   1,   21,   21)
165 -1334   : ( 789,1,  545)
166 -162: ( 105,1,   57)
167 -909: (   1,  324,  585)
168 -324: (   1,2,  322)
169 -310: ( 308,1,2)
170 -229: (   2,   35,  159)
171 -221: ( 112,1,  109)
172 -168: (  35,3,   63)
173 -331: ( 270,1,   61)
174 -503: (   1,  465,   38)
175 -230: (   5,7,  195)
176 -190: ( 146,1,   44)
177 -553: (   2,   55,  443)
178 -183: (  18,1,  165)
179 -182: (   3,   36,   74)
180 -258: ( 126,2,6)
181 -82 : (   1,   81,1)
182 -528: (   1,  237,  291)
183 -140: (  15,9,5)
184 -179: (  13,3,  140)
185 -311: (   1,  290,   21)
186 -61 : (   1,   20,   41)
187 -198: (   1,   42,  156)
188 -119: (  23,1,   96)
189 -87 : (  39,1,   48)
190 -97 : (   3,   24,   25)
191 -24 : (   1,4,   20)
192 -265: ( 171,1,   94)
193 -443: (   1,  430,   13)
194 -149: (   1,   13,  136)
195 -96 : (  76,1,   20)
196 -104: (  85,1,   19)
197 -165: (   1,  132,   33)
198 -204: (   1,   99,  105)
199 -819: (   1,   85,  734)
200 -151: (   2,   34,   83)
201 -288: ( 121,1,  167)
202 -367: (  57,1,  310)
203 -78 : (   3,   21,   15)
204 -132: (   8,   10,   52)
205 -999: ( 632,1,  367)
206 -114: (   2,   21,   72)
207 -643: (   1,  611,   32)
208 -127: (   4,5,  107)
209 -275: (   1,  123,  152)
210 -112: (   7,3,   91)
211 -189: (   2,   41,  107)
212 -106: (  18,3,   52)
213 -82 : (   4,2,   74)
214 -90 : (   1,   41,   49)
215 -103: (  44,2,   15)
216 -902: ( 121,1,  781)
217 -80 : (   1,   40,   40)
218 -652: ( 577,1,   75)
219 -102: (  35,2,   32)
220 -1177   : ( 972,1,  205)
221 -82 : (  20,2,   42)
222 -423: (   1,  215,  208)
223 -278: (   3,   16,  230)
224 -212: (   1,  132,   80)
225 -62 : (  33,1,   29)
226 -388: (   1,   65,  323)
227 -182: (  20,6,   62)
228 -226: ( 213,1,   13)
229 -247: ( 105,2,   37)
230 -161: (  12,   12,   17)
231 -501: ( 264,1,  237)
232 -1592   : (   1,  687,  905)
233 -186: ( 146,1,   40)
234 -234: (   6,   18,  126)
235 -649: (   1,  516,  133)
236 -387: ( 198,1,  189)
237 -151: (  43,2,   65)
238 -88 : (   1,3,   85)
239 -104: (  22,4,   16)
240 -96 : (   8,5,   56)
241 -258: (   1,  105,  153)
242 -192: (  51,2,   90)
243 -223: ( 208,1,   15)
244 -697: ( 187,1,  510)
245 -61 : (   4,   10,   21)
246 -231: (   1,   93,  138)
247 -169: (   4,2,  161)
248 -128: (  85,1,   43)
249 -219: (   3,   27,  138)
250 -138: (   9,8,   66)
251 -159: (   1,   71,   88)
252 -130: (   1,5,  125)
253 -126: ( 117,1,9)
254 -97 : (   1,   89,8)
255 -162: (   1,   87,   75)
256 -103: (   1,   11,   92)
257 -278: ( 193,1,   85)
258 -180: (   6,   11,  114)
259 -136: (  51,2,   34)
260 -97 : (  41,2,   15)
261 -87 : (   2,8,   71)
262 -147: (  10,5,   97)
263 -149: (   7,3,  128)
264 -106: (  40,2,   26)
265 -175: ( 168,1,7)
266 -78 : (   1,   35,   43)
267 -34 : (   1,   17,   17)
268 -47 : (  13,3,8)
269 -767: (   1,  334,  433)
270 -138: (  20,4,   58)
271 -253: ( 125,1,  128)
272 -157: ( 131,1,   26)
273 -383: (  46,1,  337)
274 -178: (   6,   17,   76)
275 -310: (   1,  173,  137)
276 -240: (   1,  139,  101)
277 -199: (   1,  168,   31)
278 -186: (   1,  142,   44)
279 -90 : (  17,5,5)
280 -122: (  52,2,   18)
281 -161: (   2,   77,7)
282 -479: ( 474,1,5)
283 -88 : (   1,   67,   21)
284 -452: (   1,  271,  181)
285 -322: (  61,2,  200)
286 -144: (   3,4,  132)
287 -179: (  25,6,   29)
288 -165: (   1,  135,   30)
289 -202: (   6,   32,   10)
290 -140: (  77,1,   63)
291 -219: (   1,  206,   13)
292 -332: (   1,  288,   44)
293 -272: (  47,2,  178)
294 -284: (  31,2,  222)
295 -142: (   7,2,  128)
296 -99 : (   1,   56,   43)
297 -1007   : ( 917,1,   90)
298 -206: ( 203,1,3)
299 -30 : (  14,1,   16)
300 -74 : (   3,6,   56)
301 -181: (   1,   74,  107)
302 -139: (   3,   12,  103)
303 -204: (   3,   48,   60)
304 -359: (   1,7,  352)
305 -117: (   1,   47,   70)
306 -265: (   1,  253,   12)
307 -250: (  15,   16,   10)
308 -186: (  45,1,  141)
309 -54 : (   2,   17,   20)
310 -115: (  21,2,   73)
311 -67 : (   1,   34,   33)
312 -510: (  50,1,  460)
313 -215: (  11,2,  193)
314 -221: (  64,2,   93)
315 -425: (   1,  105,  320)
316 -219: (   1,   14,  205)
317 -161: (   5,   27,   26)
318 -24 : (   7,2,   10)
319 -277: (   1,   46,  231)
320 -178: (   2,   57,   64)
321 -627: ( 625,1,2)
322 -258: (   2,   39,  180)
323 -217: (   1,  170,   47)
324 -776: (   1,  139,  637)
325 -293: (  32,2,  229)
326 -133: (  28,4,   21)
327 -65 : (   5,   11,   10)
328 -642: (   1,  320,  322)
329 -273: (   2,  134,5)
330 -162: (   4,   36,   18)
331 -185: (  49,3,   38)
332 -18 : (  11,1,7)
333 -173: (   7,   13,   82)
334 -260: (   1,  239,   21)
335 -196: (  21,8,   28)
336 -108: (  75,1,   33)
337 -240: (   1,  188,   52)
338 -309: (   1,  197,  112)
339 -407: ( 290,1,  117)
340 -151: (   4,   34,   15)
341 -184: (   2,   36,  112)
342 -181: (   1,  163,   18)
343 -507: ( 209,1,  298)
344 -106: (   5,9,   61)
345 -164: (   2,   60,   44)
346 -921: (   1,  596,  325)
347 -100: (  31,3,7)
348 -241: (  18,   11,   43)
349 -68 : (  33,1,   35)
350 -57 : (  22,1,   35)
351 -154: (   1,   28,  126)
352 -185: (   1,   76,  109)
353 -159: ( 112,1,   47)
354 -317: (   1,  119,  198)
355 -126: (  11,8,   38)
356 -84 : (   4,5,   64)
357 -509: ( 435,1,   74)
358 -88 : (   3,3,   79)
359 -186: (   5,   21,   81)
360 -242: (  30,6,   62)
361 -117: (  66,1,   51)
362 -140: (  63,1,   77)
363 -74 : (   4,   17,6)
364 -200: (  28,7,4)
365 -345: (   1,  327,   18)
366 -88 : (   5,   12,   28)
367 -245: (   1,  186,   59)
368 -117: (   1,   56,   61)
369 -74 : (   3,   21,   11)
370 -86 : (   1,   24,   62)
371 -245: ( 104,1,  141)
372 -33 : (   3,1,   30)
373 -168: ( 108,1,   60)
{'x1': 108, 'x2': 1, 'x3': 60}
Best result:
{'target': -11.0, 'params': {'x1': 4.0, 'x2': 2.0, 'x3': 3.0}}

I tried looking at the code for specific reasons why integers might not work, but watching the values during the run, I have no idea what the acquisition function should actually be returning without a deep dive into the theory.

till-m reopened this Jan 27, 2025

yolking commented Jan 27, 2025

I tried completely dropping the integers; with floats it takes many more iterations while exploring what look like completely random points, so I guess it's not really an integer problem.


till-m commented Jan 29, 2025

Hi @yolking,

I had a bit of a look at your problem. Let me start by saying that purely non-continuous optimization is not the intended, and definitely not the ideal, use case for this package.

The problem is essentially the optimization of the acquisition function, which, for non-continuous parameters, is based on random sampling. This means that while a guess close to the optimal solution might cause the optimal solution to be rated highly by the acquisition function (though this is by no means guaranteed), if the optimal point is never produced by the random sampling, it will simply never be suggested.

One way to deal with the problem would be to simply suggest all possible points at every step, but in the case of your problem that is 1000^3 = 1 billion candidate points per step, which is massive (though it might still be feasible depending on your machine...).
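For a small enough integer space, that exhaustive variant is trivial to sketch with itertools.product; the scoring lambda below is a toy stand-in for the acquisition function, and for the 1000×1000×1000 problem above the grid would have the 10^9 points mentioned:

```python
from itertools import product

def best_candidate(bounds, score):
    """Enumerate every integer point in `bounds` (a list of inclusive
    (lo, hi) ranges) and return the point with the highest score."""
    grids = [range(lo, hi + 1) for lo, hi in bounds]
    return max(product(*grids), key=score)

# Toy acquisition stand-in: prefer points close to (2, 3)
point = best_candidate([(1, 5), (1, 5)],
                       lambda p: -((p[0] - 2) ** 2 + (p[1] - 3) ** 2))
# point == (2, 3)
```

The cost is the product of the range sizes, so this only makes sense when that product is a few million at most.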

Depending on how much time you want to invest, you could override the suggest step of the acquisition function to use a different optimization method for finding its maximum.
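One shape such an override could take, assuming you can supply your own candidate pool: mix globally random integer candidates with candidates sampled near the incumbent, then return the acquisition argmax. This is a self-contained sketch; `acq` is any callable scoring a point, and none of these names come from bayes_opt itself:

```python
import random

def suggest_local(acq, best_params, bounds, n_global=500, n_local=500, width=20):
    """Maximize acquisition `acq` over a mixed candidate pool: uniform
    random points plus points within `width` of the incumbent
    `best_params` (hypothetical replacement for a random-only pool)."""
    rng = random.Random(0)  # seeded for reproducibility
    candidates = []
    for _ in range(n_global):  # global exploration candidates
        candidates.append(tuple(rng.randint(lo, hi) for lo, hi in bounds))
    for _ in range(n_local):   # local candidates around the incumbent, clipped to bounds
        candidates.append(tuple(
            min(hi, max(lo, b + rng.randint(-width, width)))
            for (lo, hi), b in zip(bounds, best_params)))
    return max(candidates, key=acq)
```

Because the local half of the pool guarantees candidates within a small box around the best point so far, an optimum a few integer steps away (like (128, 2, 711) next to (129, 2, 740)) has a realistic chance of being proposed, which pure uniform sampling over a 10^9-point space effectively never gives.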

Sorry I couldn't be of more help.


yolking commented Jan 29, 2025

Okay, thank you. That answers half of my question. But what about the continuous case, meaning

pbounds = {
    "x1": (1, 1000), 
    "x2": (1, 1000),
    "x3": (1, 1000)
}

Running the same code with continuous bounds and watching the printed guesses, it looks like the algorithm has only a very rough idea of where the global optimum is. Do you consider this behavior normal?
For example, CMA's parameter guesses after ~500 iterations look like this:

[1.17936865 1.01478383 2.28821332]
[1.16048908   1.18377871 1.4734578 ]
[1.20250959   1.08793453 1.63387184]
[1.3322617  1.00167885 1.11437587]
[1.00118322   1.00748533 1.20751108]
[1.02609631   1.06017618 1.13131181]
[1.52868413   1.12628887 1.55163662]
[1.45220963   1.20445465 1.25900217]
[1.23980059   1.13537257 1.00922053]
[1.12756339   1.03537426 1.49343499]
[1.45922633   1.0010981  1.2306204 ]
[1.1560473  1.01304744 2.12473375]
[1.18598119   1.0430322  1.00000044]
[1.21477937   1.16322498 1.00989137]
[1.40001542   1.09094171 1.48538542]
[1.10216811   1.08643813 1.01434249]
[1.02372171   1.10919709 1.73179119]
[1.33009068   1.00108415 1.06899792]
[1.3169501  1.07735381 1.00792969]
[1.10144338   1.0498856  1.20987253]
[1.0076468  1.02101221 1.32336584]

Bayesian optimization produces this after 700 iterations:

705 -10.688311119039415 : (1.7636214472775185, 3.0447432889223056, 5.3185365532417475)
706 -17.81982427519412  : (2.2905342454039537, 5.872777646787187, 4.368025959585223)
707 -13.258516660991045 : (1.1973474698198106, 2.1128983837629276, 10.728643127206137)
708 -8.221384357138588  : (3.5171902422179215, 1.4627818278327294, 3.0765023857916156)
709 -16.76687619478646  : (7.440573951962644, 1.4464320065032017, 6.004591883913675)
710 -14.521568701759328 : (4.3340269595549925, 1.5412134813447738, 7.841907923181473)
711 -17.501982351082017 : (2.9519298070885887, 4.407004396408562, 4.492814713353129)
712 -8.930100553159178  : (4.910468712797913, 1.5172967534533122, 1.479462317296841)
713 -9.577463358622195  : (2.1170858159501047, 3.78436819554267, 1.5656311295061163)
714 -7.162072524980228  : (2.7133258739827726, 1.3397051659304866, 3.527015834752655)
715 -17.67870318918449  : (1.0844577267415971, 11.974682153374406, 4.692666602682908)
716 -14.391141322290288 : (1.3144067304370282, 8.500990898833948, 3.217381669479026)
717 -10.91248831228271  : (4.088547577688113, 2.173681389731967, 2.025288531628346)

Since it behaves like this even in the continuous case, I would either consider BO inefficient at converging to a single global optimum after reaching some close proximity of it, or suspect a bug.
