Merge pull request #16 from UBC-CS/bp
objective function with known lipschitz constant
bradleypick authored Jul 25, 2018
2 parents cce84fb + 2a3a710 commit 69ceff3
Showing 11 changed files with 128 additions and 32 deletions.
28 changes: 0 additions & 28 deletions README.md
@@ -34,31 +34,3 @@ cd lipo-python
docker build . -t lipo-python
docker run -v <path-to-cloned-repo>/:/home/ -it lipo-python
```

## Resources

[Global optimization of Lipschitz functions](https://arxiv.org/abs/1703.02628).

* C. Malherbe and N. Vayatis. "Global optimization of Lipschitz functions." ICML, pp. 2314-2323 (2017).

[BayesOpt](https://arxiv.org/abs/1405.7430)

* R. Martinez-Cantin. BayesOpt: A Bayesian Optimization Library for Nonlinear Optimization, Experimental Design and Bandits. CoRR, abs/1405.7430 (2014).

[CMA-ES - Covariance Matrix Adaptation Evolution Strategy](https://www.researchgate.net/publication/227050324_The_CMA_Evolution_Strategy_A_Comparing_Review)

* N. Hansen. The CMA Evolution Strategy: A Comparing Review. In J.A. Lozano, P. Larrañaga, I. Inza and E. Bengoetxea (Eds.). Towards a new evolutionary computation. Advances in estimation of distribution algorithms. Springer, pp. 75-102 (2006).

**IS ABOVE POINTING AT THE CORRECT PAPER?**

[CRS - Controlled Random Search with Local Mutation](https://link.springer.com/article/10.1007/s10957-006-9101-0)

* P. Kaelo and M. M. Ali, "Some variants of the controlled random search algorithm for global optimization," J. Optim. Theory Appl. 130 (2), 253-264 (2006).

[DIRECT](https://link.springer.com/article/10.1007/BF00941892)

* D. R. Jones, C. D. Perttunen, and B. E. Stuckman, "Lipschitzian optimization without the Lipschitz constant," J. Optimization Theory and Applications, vol. 79, p. 157 (1993).

[MLSL - Multi-Level Single-Linkage](https://link.springer.com/article/10.1007/BF02592071)

* A. H. G. Rinnooy Kan and G. T. Timmer, "Stochastic global optimization methods," Mathematical Programming, vol. 39, pp. 27-78 (1987). (This is actually two papers: part I, on clustering methods, starts at p. 27; part II, on multilevel methods, at p. 57.)
2 changes: 2 additions & 0 deletions requirements.txt
@@ -2,4 +2,6 @@ numpy == 1.14.2
scipy == 0.19.1
scikit-learn == 0.19.1
pandas == 0.23.1
matplotlib == 2.1.2
tqdm == 4.23.4

27 changes: 27 additions & 0 deletions resources.md
@@ -0,0 +1,27 @@
## Resources

[Global optimization of Lipschitz functions](https://arxiv.org/abs/1703.02628).

* C. Malherbe and N. Vayatis. "Global optimization of Lipschitz functions." ICML, pp. 2314-2323 (2017).

[BayesOpt](https://arxiv.org/abs/1405.7430)

* R. Martinez-Cantin. BayesOpt: A Bayesian Optimization Library for Nonlinear Optimization, Experimental Design and Bandits. CoRR, abs/1405.7430 (2014).

[CMA-ES - Covariance Matrix Adaptation Evolution Strategy](https://www.researchgate.net/publication/227050324_The_CMA_Evolution_Strategy_A_Comparing_Review)

* N. Hansen. The CMA Evolution Strategy: A Comparing Review. In J.A. Lozano, P. Larrañaga, I. Inza and E. Bengoetxea (Eds.). Towards a new evolutionary computation. Advances in estimation of distribution algorithms. Springer, pp. 75-102 (2006).

**IS ABOVE POINTING AT THE CORRECT PAPER?**

[CRS - Controlled Random Search with Local Mutation](https://link.springer.com/article/10.1007/s10957-006-9101-0)

* P. Kaelo and M. M. Ali, "Some variants of the controlled random search algorithm for global optimization," J. Optim. Theory Appl. 130 (2), 253-264 (2006).

[DIRECT](https://link.springer.com/article/10.1007/BF00941892)

* D. R. Jones, C. D. Perttunen, and B. E. Stuckman, "Lipschitzian optimization without the Lipschitz constant," J. Optimization Theory and Applications, vol. 79, p. 157 (1993).

[MLSL - Multi-Level Single-Linkage](https://link.springer.com/article/10.1007/BF02592071)

* A. H. G. Rinnooy Kan and G. T. Timmer, "Stochastic global optimization methods," Mathematical Programming, vol. 39, pp. 27-78 (1987). (This is actually two papers: part I, on clustering methods, starts at p. 27; part II, on multilevel methods, at p. 57.)
Binary file added results/parabola-k.png
Binary file added results/parabola-test.pkl
Binary file added results/paraboloid2d-k.png
Binary file added results/paraboloid2d-test.pkl
18 changes: 17 additions & 1 deletion src/objective_functions.py
@@ -47,12 +47,28 @@ def deb_one(x):

deb_one_bounds = [(-5,5)]*5

def parabola(x):
    if x.shape[0] != 1:
        raise ValueError('Input array first dimension should be of size 1')
    return -x**2

parabola_bounds = [(-1,1)]

def paraboloid2d(x):
    if x.shape[0] != 2:
        raise ValueError('Input array first dimension should be of size 2')
    return -(x[0]**2 + x[1]**2)

paraboloid2d_bounds = [(-1,1)]*2

synthetic_functions = {
    'Holder Table' : {'func': holder_table, 'bnds': holder_bounds},
    'Rosenbrock': {'func': rosenbrock, 'bnds': rosenbrock_bounds},
    'Linear Slope': {'func': linear_slope, 'bnds': linear_slope_bounds},
    'Sphere': {'func': sphere, 'bnds': sphere_bounds},
    'Deb N.1': {'func': deb_one, 'bnds': deb_one_bounds}
    'Deb N.1': {'func': deb_one, 'bnds': deb_one_bounds},
    'Parabola': {'func': parabola, 'bnds': parabola_bounds},
    'Paraboloid2d': {'func': paraboloid2d, 'bnds': paraboloid2d_bounds}
}

def get_data(path):
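The two objectives added above are useful precisely because their Lipschitz constants are known in closed form on the stated bounds: for `parabola`, f(x) = -x^2 on [-1, 1], the constant is sup|f'(x)| = 2, and for `paraboloid2d` on [-1, 1]^2 it is sup||∇f|| = 2√2 ≈ 2.83. A minimal sketch of an empirical sanity check, not part of this commit (`empirical_lipschitz` is a hypothetical helper):

```python
import numpy as np

def empirical_lipschitz(func, bounds, n_samples=2000, seed=0):
    """Largest observed |f(a) - f(b)| / ||a - b|| over random point pairs in bounds."""
    rng = np.random.RandomState(seed)
    lows = np.array([b[0] for b in bounds])
    highs = np.array([b[1] for b in bounds])
    a = rng.uniform(lows, highs, size=(n_samples, len(bounds)))
    b = rng.uniform(lows, highs, size=(n_samples, len(bounds)))
    fa = np.array([float(np.squeeze(func(p))) for p in a])
    fb = np.array([float(np.squeeze(func(p))) for p in b])
    return np.max(np.abs(fa - fb) / np.linalg.norm(a - b, axis=1))

# e.g., assuming the functions above are importable:
# from objective_functions import parabola, parabola_bounds
# empirical_lipschitz(parabola, parabola_bounds)   # approaches 2.0 from below
```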
3 changes: 2 additions & 1 deletion src/optimize.py
@@ -49,7 +49,8 @@ def main(num_sim, num_iter, optimizers, objectives):
    parser.add_argument('--objective', type=str,
                        help='type of objective functions to optimize',
                        choices=['Holder Table', 'Rosenbrock', 'Sphere',
                                 'Linear Slope', 'Deb N.1', 'Housing', 'Yacht'])
                                 'Linear Slope', 'Deb N.1', 'Housing', 'Yacht',
                                 'Parabola', 'Paraboloid2d'])
    parser.add_argument('--synthetic', default=False, action='store_true')
    args = parser.parse_args()

70 changes: 70 additions & 0 deletions src/plot_k.py
@@ -0,0 +1,70 @@
#!/usr/bin/env python

# Script to plot the Lipschitz constant estimates
# Usage:
#   python plot_k.py inputfile --K 1 --filename myplot

import argparse
import pickle
import numpy as np
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt

def main(results, lipschitz_constant, filename,
         optimizer='AdaLIPO', q=(5,95), figsize=(10,5)):

    if len(results.keys()) > 1:
        raise RuntimeError('Inputfile must be simulation using single objective function!')

    func_name = list(results.keys())[0]
    d = results[func_name][optimizer][0]['x'].shape[1]
    num_sim = len(results[func_name][optimizer])
    num_iter = len(results[func_name][optimizer][0]['k'])

    # one row of Lipschitz constant estimates per simulation
    k_results = np.zeros((num_sim, num_iter))

    for sim in range(num_sim):
        k_results[sim,:] = results[func_name][optimizer][sim]['k']

    # median and q-th percentiles of the estimate across simulations
    median_loss = np.median(a=k_results, axis=0)
    upper_loss = np.percentile(a=k_results, q=q[1], axis=0)
    lower_loss = np.percentile(a=k_results, q=q[0], axis=0)
    yerr = np.abs(np.vstack((lower_loss, upper_loss)) - median_loss)

    fig, ax = plt.subplots()
    fig.set_size_inches(figsize[0], figsize[1])

    # true Lipschitz constant as a red reference line
    ax.plot(range(1,num_iter+1), [lipschitz_constant]*num_iter, color='red')

    ax.plot(range(1,num_iter+1), median_loss)
    ax.errorbar(
        x=range(1,num_iter+1),
        y=median_loss,
        yerr=yerr,
        linestyle='None',
        alpha=0.5,
        capsize=200/num_iter
    )
    ax.set(xlabel='Iteration Number', ylabel='Lipschitz Constant')

    plt.legend(['True', 'Estimated', '90 % Error bars'])
    plt.title('Convergence of Lipschitz Constant Estimate of {}-d Paraboloid'.format(d))
    if filename:
        fig = ax.get_figure()
        fig.savefig(filename)
    else:
        plt.show()

if __name__ == '__main__':

    parser = argparse.ArgumentParser()
    parser.add_argument('inputfile', type=str)
    parser.add_argument('--K', type=float, default=None)
    parser.add_argument('--filename', type=str, default=None)
    args = parser.parse_args()

    with open(args.inputfile, 'rb') as f:
        results = pickle.load(f)

    main(results, args.K, args.filename)
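Combined with the new `Parabola` objective and the pickled results added in this commit, a plausible invocation (assuming the script is run from the repository root) is:

```
python src/plot_k.py results/parabola-test.pkl --K 2 --filename results/parabola-k.png
```

Here `--K 2` is the known Lipschitz constant of -x^2 on [-1, 1]; the script draws it as the red reference line, plots the median AdaLIPO estimate across simulations, and adds 5th-95th percentile error bars. Note that the input pickle must contain results for exactly one objective function, otherwise the script raises a RuntimeError.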
12 changes: 10 additions & 2 deletions src/sequential.py
@@ -71,6 +71,7 @@ def adaptive_lipo(func,
    y = np.zeros(n) - np.Inf
    x = np.zeros((n, d))
    loss = np.zeros(n)
    k_arr = np.zeros(n)

    # preallocate the distance arrays
    # x_dist = np.zeros((n * (n - 1)) // 2)
@@ -87,10 +88,12 @@
    x_prop = u * (bound_maxs - bound_mins) + bound_mins
    x[0] = x_prop
    y[0] = func(x_prop)
    k_arr[0] = k

    upper_bound = lambda x_prop, y, x, k: np.min(y+k*np.linalg.norm(x_prop-x,axis=1))

    for t in np.arange(1, n):
        print('Iteration {}'.format(t))

        # draw a uniformly distributed random variable
        u = np.random.rand(d)
@@ -99,6 +102,7 @@
        # check if we are exploring or exploiting
        if np.random.rand() > p: # enter to exploit w/ prob (1-p)
            # exploiting - ensure we're drawing from potential maximizers
            print('Into the while...')
            while upper_bound(x_prop, y[:t], x[:t], k) < np.max(y[:t]):
                u = np.random.rand(d)
                x_prop = u * (bound_maxs - bound_mins) + bound_mins
@@ -108,7 +112,7 @@
                # print(upper_bound(x_prop, y[:t], x[:t], k))
                # print(np.max(y[:t]))
                # print('------')
            # print("BREAK")
            print('Out of the while')
        else:
            pass
            # we keep the randomly drawn point as our next iterate
@@ -147,10 +151,14 @@
        # print(k)
        i_t = np.ceil(np.log(k_est)/np.log(1+alpha))
        k = (1+alpha)**i_t
        print('Lipschitz Constant Estimate: {}'.format(k))
        print('\n')
        # print(k)
        # print("----")
        k_arr[t] = k


    output = {'loss': loss, 'x': x, 'y': y}
    output = {'loss': loss, 'x': x, 'y': y, 'k': k_arr}
    return output

optimizers = {
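For reference, the `k` recorded in `k_arr` above is the AdaLIPO estimate of the Lipschitz constant: the largest observed slope between evaluated points, rounded up onto the geometric grid (1 + alpha)^i (Malherbe and Vayatis, 2017). Below is a hedged sketch of that update; the pairwise `k_est` computation is reconstructed from the paper rather than taken from this hunk, and `snapped_lipschitz_estimate` is a hypothetical name:

```python
import numpy as np

def snapped_lipschitz_estimate(x, y, alpha=0.01):
    """Largest |y_i - y_j| / ||x_i - x_j|| over evaluated points, snapped
    upward onto the grid (1 + alpha)**i as in the diff above.
    Assumes at least two distinct points with differing function values."""
    dy = np.abs(y[:, None] - y[None, :])                         # pairwise |f differences|
    dx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=2)   # pairwise distances
    k_est = np.max(dy[dx > 0] / dx[dx > 0])
    i_t = np.ceil(np.log(k_est) / np.log(1 + alpha))
    return (1 + alpha) ** i_t

# Two evaluations of the parabola objective, x = [[0.0], [1.0]] and y = [0.0, -1.0],
# give k_est = 1.0 and a snapped estimate of (1 + alpha)**0 = 1.0; the estimate
# then climbs toward the true constant 2 as points accumulate near the boundary.
```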
