GSoC 2011
Dear Pygmoers, after the big disappointment of not making it into GSoC 2011, we are still planning to carry out the projects proposed. Should any of you want to contribute and take one of these on as a personal project, we are still able to provide close mentoring.
Before starting to contribute to PyGMO, in case you are a new developer, we suggest that you:
- Install the PyGMO module in your Python installation
- Get familiar with the classes problem.base, algorithm.base and topology.base
- Instantiate some archipelagos with homogeneous and heterogeneous algorithms
- Graphically analyze the results of some optimizations
- Contact the mentors via the IRC channel to ask for more details on what we have in mind for the particular project you are interested in.
You will find mentors and senior Pygmoers in our mailing list.
To lower the entry barrier for new users of PaGMO, we wish to provide a Graphical User Interface that allows as much of PaGMO's power as possible to be tuned and deployed, and its results analysed, from the comfort of an intuitive graphical application.
The student will be mentored to implement in Python a modular and extensible graphical application that will provide as many as possible of the following functionalities:
- selection and tuning of optimization algorithms exposed in PyGMO;
- graphical configuration of the multiple sub-populations (islands) to undergo optimization. This includes setting up the topology of the archipelago, the multiple algorithms controlling each island, and the migrations between islands;
- possibility to generate a Python file with the sequence of PyGMO instructions needed to achieve the setup configured in the graphical application, either for loading back in future optimizations, or for the user to then manually extend the setup according to his/her needs;
- managing/monitoring of computing resources available through the network (visual interface to the MPI parallelization developed last year in GSoC);
- plots updated in real-time with statistics from ongoing optimizations (at the level of the archipelago, but also of individual islands);
- visualization and inspection of solutions along the Pareto front in optimizations with two and three objectives;
- Parallel coordinates visualization showing relationships between performance and values in the multiple variables under optimization (ideally with a degree of interactivity similar to the one provided by parvis).
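As a concrete starting point for the last item, the data preparation behind a parallel-coordinates plot can be sketched in plain Python: each decision vector becomes a polyline, with every variable rescaled to [0, 1] on its own axis. This is an illustrative sketch only (the actual plotting would use the chosen GUI toolkit, e.g. via matplotlib), not PyGMO code.

```python
def parallel_coordinates(vectors):
    """Rescale each variable (column) of `vectors` to [0, 1],
    producing one polyline per decision vector."""
    n_vars = len(vectors[0])
    lows = [min(v[i] for v in vectors) for i in range(n_vars)]
    highs = [max(v[i] for v in vectors) for i in range(n_vars)]
    lines = []
    for v in vectors:
        line = []
        for i, x in enumerate(v):
            span = highs[i] - lows[i]
            # constant variables are drawn at mid-axis
            line.append((x - lows[i]) / span if span else 0.5)
        lines.append(line)
    return lines

# three individuals, two decision variables
population = [[1.0, 10.0], [3.0, 20.0], [2.0, 15.0]]
print(parallel_coordinates(population))
# → [[0.0, 0.0], [1.0, 1.0], [0.5, 0.5]]
```

Interactivity (brushing, axis reordering, as in parvis) would then operate on these normalized polylines.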
Implementation Details: The GUI will essentially serve to instantiate and explore interactively an object of the class PyGMO.archipelago. This is a rather complex object containing (essentially) a PyGMO.topology and a number of PyGMO.islands. Each PyGMO.island contains a PyGMO.algorithm and a PyGMO.population. Each PyGMO.population contains a set of individuals and a PyGMO.problem. PyQt4 could be the underlying tool to achieve this goal, but other equivalent modules can also be proposed (such as wxWidgets and PyGTK; see the book Matplotlib for Python Developers for a good comparison between all of these). A static, non-interactive visualization of archipelago topologies is already provided through the NetworkX library (see PyGMO/topology/__init__.py). The student may build on top of this structure, or reimplement a more suited visualization.
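The containment hierarchy described above can be mocked in plain Python to make the GUI's data model concrete. These are NOT the real PyGMO classes (the names and fields are illustrative only); they merely mirror the nesting the GUI must expose: archipelago → topology + islands → algorithm + population → individuals + problem.

```python
class Problem:
    def __init__(self, name):
        self.name = name

class Population:
    def __init__(self, problem, individuals):
        self.problem = problem
        self.individuals = individuals  # list of decision vectors

class Island:
    def __init__(self, algorithm, population):
        self.algorithm = algorithm  # e.g. the name of a PyGMO.algorithm
        self.population = population

class Archipelago:
    def __init__(self, topology, islands):
        self.topology = topology  # e.g. a NetworkX graph of migration routes
        self.islands = islands

# a two-island setup of the kind the GUI would build interactively
prob = Problem("cassini")
archi = Archipelago("ring", [
    Island("de", Population(prob, [[0.1, 0.2]])),
    Island("pso", Population(prob, [[0.3, 0.4]])),
])
print(len(archi.islands))  # → 2
```

A tree/inspector widget over such a structure, plus the ability to emit the equivalent PyGMO script, would cover most of the configuration features listed above.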
The student is asked to write clean and well documented code.
Skills Required: Knowledge of Python
Skills that help: An artistic sense for the beautiful & functional!
Mentors: Luís F. Simões (luis.felismino.simoes AT esa.int), Francesco Biscani (Francesco.Biscani AT esa.int)
General Description: Once again, we try to put order in the immense sea of algorithms and papers claiming outperformance and fighting each other to prove their value. For this project, the idea is to build, in PyGMO, a set of tools able to help compare the performances of the different algorithms implemented (Differential Evolution, Particle Swarm Optimization, Bee Colony, Nelder-Mead, etc.) [1].
a. First, one would fix a problem P (say, the interplanetary transfer of the Cassini probe to Jupiter) and implement a comparison between an algorithmic pair A and B (say, Differential Evolution and Particle Swarm Optimization). This has to be based on statistical tests, and ideally one would want to implement at least the Welch t-test, tests based on resampling statistics [2] and a racing method [3,4].
b. Using the developed capability, a separate part of the toolbox would work on a list of PyGMO.algorithms and a problem P. A preorder is then built using the statistical tools developed above.
c. The third and final step of the testing strategy would work on a list of algorithms and a list of problems, build preorders of algorithms on each problem and use Kendall tau to compare the obtained rankings.
Before any test starts, its expected duration should be estimated and the user given the possibility to choose whether or not to perform the test.
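As a minimal, self-contained illustration of step (a), Welch's t statistic can be computed directly from two samples of best fitness values obtained from algorithms A and B (a full test would also derive a p-value from the t-distribution, e.g. via scipy.stats.ttest_ind with equal_var=False; the function below is a hand-rolled sketch, not PyGMO code).

```python
import math

def welch_t(a, b):
    """Welch's t statistic and approximate degrees of freedom
    for two independent samples a and b of possibly unequal variance."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # unbiased sample variances
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = se2 ** 2 / (va ** 2 / (na ** 2 * (na - 1)) +
                     vb ** 2 / (nb ** 2 * (nb - 1)))
    return t, df

# identical samples: t is 0, df reduces to na + nb - 2
t, df = welch_t([1.0, 2.0, 3.0, 4.0], [1.0, 2.0, 3.0, 4.0])
print(t, df)  # → 0.0 6.0
```

Steps (b) and (c) would then apply such pairwise tests repeatedly to build preorders, and compare the per-problem rankings with Kendall tau.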
A simple GUI has to be provided to perform the statistical tests and visually explore the produced statistical data (samples, preorders, etc.). It should be compatible with, and possibly integrated into, the PyGMO GUI developed in the project above.
References
- [1] Ruciński, M., Izzo, D., & Biscani, F. (2010). On the impact of the migration topology on the Island Model. Parallel Computing, 36(10-11), 555-571.
- [2] Cohen, P., 1995. Empirical Methods for Artificial Intelligence. MIT Press, Cambridge, Massachusetts.
- [3] V. Mnih, C. Szepesvari, J.-Y. Audibert, Empirical Bernstein Stopping, in: 25th International Conference on Machine Learning (ICML 2008), Vol. 307, 2008, pp. 672-679.
- [4] V. Heidrich-Meisner and C. Igel, (2009), Hoeffding and Bernstein races for selecting policies in evolutionary direct policy search, ICML '09 Proceedings of the 26th Annual International Conference on Machine Learning
Skills Required: Python
Skills that help: Familiarity with Global Optimization Algorithms, Statistics, Sphinx documentation, Magic the Gathering
Mentors: Dario Izzo (dario.izzo AT esa.int), Guido de Croon (guido.de.croon AT esa.int)
General Description: This project is a follow-up to last year's very successful "SoC2010/Meta" project. Its purpose is, once again, to bring into PaGMO algorithms which are among the success stories and state-of-the-art in different domains of Evolutionary Computing. Within PaGMO they will then benefit from its powerful parallel computing architecture, and from the heterogeneous interaction it provides with different algorithms during optimization. The student will be mentored in the implementation of at least two algorithms of his or her choice from among those listed below, with the restriction that at least one should come from the area of multi-objective optimization. The student is free to suggest in the application form alternative algorithms in these areas, of identical standing in the scientific community.
Evolutionary Multi-Objective Optimization:
In many real-world optimization problems we have multiple objectives, often conflicting ones, that we would like to optimize. Among the most successful approaches for dealing with this type of problem we find multiple algorithms coming from the field of Evolutionary Multi-Objective Optimization [1]:
- NSGA-II (Non-dominated Sorting Genetic Algorithm II) [2]
- SPEA2 (Strength Pareto Evolutionary Algorithm 2) [3]
- MOGA (Multi-Objective Genetic Algorithm) [4]
The student is asked to extend PaGMO's abstraction layer for handling multiple objective functions during optimization (population.cpp, population.h), and to implement on top of it one or more of the algorithms listed above. At the level of the abstraction layer, efficient algorithms should be implemented for tasks such as non-dominated sorting and Pareto ranking (see [5] for more details). In last year's project, NSGA-II was one of the 5 implemented heuristics (nsga2.cpp, nsga2.h). Since that implementation failed to reproduce some of the published results, we leave to the student the choice of whether to build on top of it, or to write a new one from scratch.
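To make the abstraction-layer task concrete, here is a compact non-dominated sorting routine of the kind needed (minimization assumed on every objective). It follows the O(MN²) bookkeeping of Deb et al. [2]; this is a Python sketch for illustration, whereas the actual implementation would live in C++ in population.cpp.

```python
def dominates(p, q):
    """True if objective vector p Pareto-dominates q (minimization)."""
    return (all(a <= b for a, b in zip(p, q)) and
            any(a < b for a, b in zip(p, q)))

def non_dominated_sort(points):
    """Return a list of fronts, each a list of indices into `points`."""
    n = len(points)
    dominated_by = [[] for _ in range(n)]  # indices each point dominates
    counts = [0] * n                       # how many points dominate i
    for i in range(n):
        for j in range(n):
            if dominates(points[i], points[j]):
                dominated_by[i].append(j)
            elif dominates(points[j], points[i]):
                counts[i] += 1
    fronts = [[i for i in range(n) if counts[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]  # drop the trailing empty front

pts = [(1, 4), (2, 2), (4, 1), (3, 3), (5, 5)]
print(non_dominated_sort(pts))  # → [[0, 1, 2], [3], [4]]
```

NSGA-II's crowding distance and SPEA2's strength values would then be computed per front on top of this ranking.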
References
- [1] Coello Coello, C. A. (2006). Evolutionary multi-objective optimization: a historical view of the field. IEEE Computational Intelligence Magazine, 1(1), 28-36.
- [2] Deb, K., Pratap, A., Agarwal, S., & Meyarivan, T. (2002). A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2), 182-197.
- [3] Zitzler, E., Laumanns, M., & Thiele, L. (2001). SPEA2: Improving the Strength Pareto Evolutionary Algorithm. Technical Report 103, Computer Engineering and Networks Laboratory (TIK), Swiss Federal Institute of Technology (ETH). Zurich, Switzerland.
- [4] Fonseca, C. M., & Fleming, P. J. (1993). Genetic Algorithms for Multiobjective Optimization: Formulation, Discussion and Generalization. In S. Forrest (Ed.), Proceedings of the 5th International Conference on Genetic Algorithms (pp. 416-423). San Francisco, CA, USA: Morgan Kaufmann Publishers Inc.
- [5] Jensen, M. T. (2003). Reducing the run-time complexity of multiobjective EAs: The NSGA-II and other algorithms. IEEE Transactions on Evolutionary Computation, 7(5), 503-515.
For introductions and further references on some of these algorithms, check the chapters on NSGA and SPEA in the book Clever Algorithms: Nature-Inspired Programming Recipes by Jason Brownlee (2011).
State-of-the-art algorithms for continuous optimization:
The Evolutionary Computing community has for many years now organized yearly competitions, where multiple optimization algorithms are applied to common sets of benchmark problems. For this year's CEC competition, our very own Messenger-Full and Cassini2 interplanetary trajectory optimization problems were included in the set of benchmark problems (you can obviously find them both in PaGMO already). A student pursuing this line of work will contribute to our goal of bringing into PaGMO algorithms which have proven themselves in those competitions to be among the state-of-the-art approaches for continuous optimization. We have selected for that purpose a few algorithms, listed below:
- CMA-ES (Covariance Matrix Adaptation Evolution Strategy) [6]
- a constant presence among the top ranking algorithms in many competitions over the years
- εDEg (ε Constrained Differential Evolution with an Archive and Gradient-Based Mutation) [7]
- Winner of the 2010 competition on Constrained Real-Parameter Optimization
- Dynamic Multi-Swarm Particle Swarm Optimizer with Subregional Harmony Search [8]
- Third place in the 2010 competition on Large Scale Global Optimization
References
- [6] http://www.lri.fr/~hansen/cmaesintro.html
- [7] Takahama, T., & Sakai, S. (2010). Constrained optimization by the ε constrained differential evolution with an archive and gradient-based mutation. 2010 IEEE Congress on Evolutionary Computation (CEC).
- [8] Zhao, S.-Z., Suganthan, P. N., & Das, S. (2010). Dynamic multi-swarm particle swarm optimizer with sub-regional harmony search. 2010 IEEE Congress on Evolutionary Computation (CEC).
Implementation Details: The student is asked to write clean and well documented code, as well as to provide Unit Tests for validating the implementations against results published in the literature. To allow the student to gradually build an understanding of how PaGMO works, we will mentor him/her in the implementation of a very simple optimization algorithm, such as Population-Based Incremental Learning.
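For a feel of what that warm-up entails, here is a toy Population-Based Incremental Learning run on OneMax (maximize the number of ones in a bit string). This is an illustrative Python sketch under simple assumptions (elitist update, fixed learning rate), not PaGMO code; the actual implementation would be in C++ against PaGMO's algorithm base class.

```python
import random

def pbil(n_bits=20, pop_size=30, lr=0.1, generations=200, seed=42):
    """Evolve a probability vector towards bit strings that maximize
    the number of ones (OneMax)."""
    rng = random.Random(seed)
    probs = [0.5] * n_bits  # probability of each bit being 1
    best = None
    for _ in range(generations):
        # sample a population from the current probability vector
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        elite = max(pop, key=sum)
        if best is None or sum(elite) > sum(best):
            best = elite
        # shift the probability vector towards the best sample
        probs = [(1 - lr) * p + lr * b for p, b in zip(probs, elite)]
    return best

solution = pbil()
print(sum(solution))  # converges to (near) the optimum of 20 ones
```

Validating such an implementation against published results on standard benchmarks is exactly the kind of Unit Test asked for above.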
Skills Required: Knowledge of Python and C/C++.
Skills that help: Sense of adventure!
Mentors: Luís F. Simões (luis.felismino.simoes AT esa.int), Dario Izzo (dario.izzo AT esa.int)
General Description: During last year's GSoC, our brilliant pygmoer John Glover, tutored by Martin and Dario, managed to achieve the impossible in the project named Alife on Asteroids (web page and Final Report available). Overall the project delivered to PyGMO a new PyGMO.problem: a stochastic optimization problem concerning the evolution of locomotion of a four-legged creature on a low-gravity asteroid. The idea is that low gravity will result in a different morphology and control than the Earth-based evolution of walking robots [1,2,3]. The optimization problem could be instantiated via a graphical interface that would also instantiate an optimizer and start the evolution. The project was not inserted straight away into the PyGMO master branch for the following reasons:
- At that time (i.e. last year), using problems written in Python, rather than in C++, meant losing parallelization.
- The creature morphology was far too simple to result in credible walking gaits (http://en.wikipedia.org/wiki/Gait).
- The many dependencies needed by this PyGMO.problem resulted in an incomplete build system that was not deemed reliable enough for inclusion in the master branch.
The project we propose this year for GSoC 2011, called SETAL: Search for Extra-Terrestrial Artificial Life, aims at overcoming these three points and going beyond the simple evolution implemented so far, to achieve truly artificial life on an imaginary planet!
Implementation Details: Thanks to the great efforts of Francesco, PyGMO problems can now be implemented directly in Python while preserving parallelization. The new structure of PyGMO requires some changes with respect to previously written code.
a. The first task in this project is to take the code developed last year and port it to the new PyGMO structure. This requires writing the serialization code for a few classes of the ODE package. This first part of the project will help the student become familiar with PaGMO, PyGMO and the Alife problem.
b. Once the code is integrated and tested to work in the main PaGMO build system, the existing virtual world and GUI would have to be worked on. Ideas include: extending the morphology of the creature (e.g. legs made of two segments); adding vision (virtual 'ommatidia' that provide visual inputs to the simulated robot, which can be used for evolution too!); adding flexibility to the asteroid shape (can the shape be randomized by interfacing with basic Blender commands?); and separating out the problem graphical instantiator, the algorithm instantiator (actually not necessary within this project) and a chromosome visualizer to inspect the evolved gaits (http://en.wikipedia.org/wiki/Gait).
c. Some credible walking gaits would then have to be evolved for different gravity levels.
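The serialization task in step (a) typically boils down to the pattern below: a problem class wrapping a non-picklable simulation handle exposes __getstate__/__setstate__ so that PyGMO can ship it between processes. The class OdeWorld and its fields here are hypothetical stand-ins, not the actual Alife or ODE classes.

```python
import pickle

class OdeWorld:
    """Hypothetical wrapper around a physics-engine handle."""

    def __init__(self, gravity=0.001):
        self.gravity = gravity
        self.engine = object()  # stand-in for a non-picklable ODE handle

    def __getstate__(self):
        # keep only the parameters needed to rebuild the simulation;
        # the raw engine handle cannot (and need not) cross processes
        state = self.__dict__.copy()
        del state["engine"]
        return state

    def __setstate__(self, state):
        self.__dict__.update(state)
        self.engine = object()  # recreate the handle on the receiving side

world = pickle.loads(pickle.dumps(OdeWorld(gravity=0.002)))
print(world.gravity)  # → 0.002, with a freshly rebuilt engine
```

Once every class reachable from the Alife problem serializes this way, it can be used from any PyGMO island type.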
References
- [1] Beer, R.D., Chiel, H.J. and Gallagher, J.C. (1999). Evolution and analysis of model CPGs for walking II. General principles and individual variability. J. Computational Neuroscience 7(2):119-147.
- [2] Chiel, H.J., Beer, R.D. and Gallagher, J.C. (1999). Evolution and analysis of model CPGs for walking I. Dynamical modules. J. Computational Neuroscience 7:(2):99-118.
- [3] A. Omer, H. Lim, A. Takanishi, (2010) Simulation Study of a Bipedal Robot Jumping Motion Approach on Moon Gravity, IEEE ROBIO conference 2010
Skills Required: Python, PyBrain, C++
Skills that help: Not being a creationist, knowing what serialization is, neural networks
Mentors: Guido de Croon, Eduardo Martin Moraud
General Description: One of the successful projects undertaken during PaGMO's GSoC2010 was the implementation of SoC2010/MPI, which extended PaGMO's multi-thread parallel architecture to MPI clusters. In the months following the completion of the project, PaGMO was further extended to take advantage of IPython's clustering technology. The next logical step on this path from single-machine parallelism to massively parallel multi-machine architectures is the interfacing of PaGMO with BOINC, the famous open-source platform for volunteer computing and grid computing on which projects such as SETI@home are based. The ultimate goal is to empower PaGMO's users with the possibility to tap into the huge computing potential of volunteer-based parallel architectures.
Implementation Details: At the core of this project is the creation of a new island class that exploits the BOINC infrastructure to offload optimization tasks to the computers of volunteers. This new functionality can be implemented in C++ or Python, but we are inclined to suggest an implementation in Python, both for ease of programming and future extensibility.
The student would be asked to follow roughly the following steps:
a. Set up a minimal BOINC server/infrastructure.
b. Code a 'BOINC island' class that sends/receives optimization tasks to/from the BOINC infrastructure. In PaGMO's terminology, an 'island' is a class that runs a given optimization algorithm on a given optimization problem. The student might take inspiration from one of the already implemented island classes (i.e., the local thread island class, the MPI island class and the IPython island class).
c. Code and package a BOINC screensaver application which volunteers can download and run to contribute computing power to PaGMO's BOINC islands.
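The control flow of step (b) can be mocked in-process to show the intended division of labour: the island submits an (algorithm, problem) work unit and later collects the evolved result. All class and method names below are hypothetical; a real implementation would talk to an actual BOINC server over its work-unit API rather than an in-process queue.

```python
import queue

class MockBoincServer:
    """Stand-in for a BOINC server: holds a queue of work units."""

    def __init__(self):
        self.work = queue.Queue()

    def submit(self, work_unit):
        self.work.put(work_unit)

    def run_volunteer(self):
        # in reality, a volunteer's machine would fetch and crunch this
        algorithm, problem = self.work.get()
        return "%s solved %s" % (algorithm, problem)

class BoincIsland:
    """Hypothetical island that offloads evolution to the server."""

    def __init__(self, server, algorithm, problem):
        self.server = server
        self.algorithm = algorithm
        self.problem = problem

    def evolve(self):
        # non-blocking: ship the work unit out instead of computing locally
        self.server.submit((self.algorithm, self.problem))

server = MockBoincServer()
island = BoincIsland(server, "de", "cassini")
island.evolve()
print(server.run_volunteer())  # → de solved cassini
```

The existing MPI and IPython island classes follow the same submit/collect shape, which is why they are good starting points.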
Skills Required: Python and C++
Skills that help: Experience with the BOINC infrastructure is a strong asset.
Mentors: Francesco Biscani (bluescarni AT gmail.com), Luís F. Simões (luis.f.m.simoes AT gmail.com)