Framework for Optimization using Metaheuristics

Labs Tool

Over the past decades, a large amount of research and development has been devoted to heuristics and metaheuristics. Most of this effort has been directed at solving classical OR problems, improving on the results obtained with traditional techniques and showing that these alternative methods are relevant and advantageous. As a result, a substantial body of documentation, algorithms and research now exists on these techniques as applied and adapted to such problems.

What is FOM?

Until now, applying these techniques required a significant effort to develop and adapt them to the specific problem at hand. Added to the inherent complexity of some of the techniques, this often leads users to choose approaches based on more declarative paradigms, such as mathematical programming or constraint-based solvers. To address this, we have developed a tool aimed at making these techniques applicable in real production environments with minimal development and adaptation effort.

The tool is flexible and extensible enough to be used in heterogeneous environments, and it lets users tune and adjust the relevant components to fit their needs. In this sense, our tool avoids ad-hoc implementations for specific problems, which minimize computational time only to obtain a marginally better solution. Instead, it aims to make these methods applicable to a growing number of problems in an easy, simple and fast way, while remaining open to the numerous variants and possibilities these methods offer. It also supports efficient implementations that can be adapted to the particular characteristics a problem may have.

Why a Framework?

The developed tool is a framework in the software-engineering sense. A framework is similar to a software library or API; however, a framework promotes not only functional reuse but also the reuse of a design, which can be adapted and completed to fit the needs of the user. A framework defines an entire structure for an application. Although the object-oriented paradigm has been in use for many years, the production of mainstream, quality frameworks is a comparatively new phenomenon, and the application of frameworks to combinatorial optimization is even more recent. We can therefore say that this is a new area of research and development, one that can contribute many advantages when approaching optimization problems. There are precedents for frameworks of this type; relevant developments include ECJ, HeuristicLab, EO, ParadisEO, JCLEC, etc. However, we believe that our framework, FOM, has enough innovations in both its design and implementation to be considered an advance in this area. Finally, a relevant reference on the state of the art in these developments, including a compendium of the relevant references, can be found in [1].


Supported Metaheuristics

Local Search
  • Steepest Descent / Hill Climbing
  • Simulated Annealing
    • Various Cooling Schemes (Linear, Geometric, etc.)
  • Tabu Search
    • Recency Based Memory
    • Frequency Based Memory
    • Aspiration Criteria
  • VNS (Variable Neighbourhood Search)
Partial Solution Components & Solution Building
Population Based
  • Evolutionary Algorithms
    • Genetic Algorithms
    • Evolution Strategies (ES)
  • Ant Systems
    • Classical Ant System
    • Ant Colony Optimization
    • Min-Max Ant System

Advanced Metaheuristic Features

  • HyperHeuristics creation support
  • Hooks for creating Hybrid techniques

Optimization Process Support

  • Various predefined execution end criteria
    • Number of iterations
    • Number of fitness function evaluations
    • Execution time
    • Minimum improvement within a given span (execution time, number of iterations, etc.)
  • Multithreaded optimization task execution
  • Support for Experiment design, modelling and execution
  • Statistical analysis of experiment results
  • Basic User Interface
  • Flexible Statistics and Charting

Architecture & Design

Before designing the framework, we analysed the relationships between the optimization techniques, their components, the abstract problems, and the specific problem instances we want to solve. We delimited and formally expressed the interfaces between these elements, which are required to solve the optimization problems. Starting from this study, we designed a structure that lets us apply these techniques to the problems, and we implemented them using those interfaces. As a result, we introduced an abstraction layer between problems and solutions (which encapsulates nearly all domain-dependent elements) and the metaheuristic techniques used to solve them, decoupling those elements.

Problems and Solutions

The class diagram below depicts the final structure of some of these interfaces. It expresses the relationships between the solutions and their corresponding problems, as well as the characteristics our solutions must have in order to apply the different metaheuristics.

The simplest interface, called Solution, only defines the requirements an object must satisfy to be a solution to one of our problems: it must be able to be evaluated, and to provide a new random solution to the problem. The design also defines a class implementing this interface, called AbstractSolution, and an interface representing the problem we want to solve, called Problem; every AbstractSolution is associated with a Problem, and through this relationship the problem can evaluate the actual data of the solution and determine whether it is valid. Finally, the diagram defines some interfaces that provide the functionality required to apply the different techniques. For example, we define an interface called Neighbourhood that defines the neighbourhood of a specific solution and gives us the operations required to move through it. Another interface, called ExplorableSolution, extends both Solution and Neighbourhood. Using an object that implements this interface, we can explore the neighbourhood of a solution and "jump" from one solution to another, moving around the solution space of the problem. This lets us apply techniques based on local search.
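As an illustration, the interface hierarchy described above might be sketched as follows. This is our own simplified Java sketch, not FOM's actual source: method names such as getFitness, getRandomSolution and getRandomNeighbour are assumptions, and the toy BitString class stands in for a concrete Problem/AbstractSolution pair.

```java
import java.util.Random;

// Hypothetical sketch of the interface hierarchy described above;
// names and signatures are illustrative, not FOM's actual API.
interface Solution {
    double getFitness();            // evaluate this solution
    Solution getRandomSolution();   // produce a fresh random solution
}

interface Neighbourhood {
    Solution getRandomNeighbour();  // a move within the neighbourhood
}

// A solution that also exposes its neighbourhood supports local search.
interface ExplorableSolution extends Solution, Neighbourhood { }

// Toy concrete solution: a bit string whose fitness is the number of ones
// (a stand-in for a real problem-specific AbstractSolution subclass).
class BitString implements ExplorableSolution {
    final boolean[] bits;
    static final Random RNG = new Random(42);

    BitString(boolean[] bits) { this.bits = bits; }

    public double getFitness() {
        int ones = 0;
        for (boolean b : bits) if (b) ones++;
        return ones;
    }

    public Solution getRandomSolution() {
        boolean[] r = new boolean[bits.length];
        for (int i = 0; i < r.length; i++) r[i] = RNG.nextBoolean();
        return new BitString(r);
    }

    // Flip one random bit: a classic bit-string neighbourhood move.
    public Solution getRandomNeighbour() {
        boolean[] copy = bits.clone();
        int i = RNG.nextInt(copy.length);
        copy[i] = !copy[i];
        return new BitString(copy);
    }
}
```

Any local-search metaheuristic can then be written against ExplorableSolution alone, without knowing whether the underlying problem is SAT, TSP or anything else.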

Techniques for optimization

In the same way, the interfaces needed to apply the different techniques are defined, together with the elements used by each technique and their functionality.

Figure 2 shows the definitions of, and relationships between, the metaheuristics. A class called Metaheuristic is defined in this diagram; it defines the interface that every element we want to use as an optimization technique in the framework must have. It declares a single public method, the optimization algorithm, which takes a solution as a parameter (the initial solution) and returns another solution (the best solution found). Any optimization method that can be expressed and invoked in this way can be used in the framework to find an optimal solution to a problem.

The same diagram defines another class, called LocalSearch; it simply inherits from the previous class, with the particularity that the initial solution must be an ExplorableSolution, as defined in Figure 1. In this way we can explore the neighbourhood of a solution and implement several techniques based on local search.

Figure 3 shows a diagram that defines the behaviour of Simulated Annealing. It defines the class SimulatedAnnealing, which implements LocalSearch. This class uses an interface called Cooler to decrease the value of its actualTemperature attribute. The Cooler interface defines a method called toCool, which returns the value of the temperature parameter for the next iteration. Therefore, what determines the cooling schedule of the algorithm is the concrete Cooler object to which the SimulatedAnnealing object is connected. As a result, techniques such as Genetic Algorithms, Tabu Search and Simulated Annealing are implemented in an abstract way, independently of the details of the problem we want to solve, based only on the specifications defined by our interfaces. The framework also defines a set of elements that are useful for tuning the concrete parameters and variants of the metaheuristic techniques used to solve a specific problem.
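The Cooler collaboration can be sketched as follows. The names toCool and actualTemperature come from the text; everything else (the geometric schedule class, the coolFor helper) is our own simplified illustration, not FOM's actual code.

```java
// Hypothetical sketch of the Cooler/SimulatedAnnealing collaboration.
interface Cooler {
    double toCool(double actualTemperature); // temperature for the next iteration
}

// Geometric schedule: T <- alpha * T with 0 < alpha < 1.
class GeometricCooler implements Cooler {
    private final double alpha;
    GeometricCooler(double alpha) { this.alpha = alpha; }
    public double toCool(double t) { return alpha * t; }
}

class SimulatedAnnealingSketch {
    // Run a number of cooling steps and return the final temperature,
    // making the effect of swapping one Cooler for another observable.
    static double coolFor(Cooler cooler, double t0, int steps) {
        double t = t0;
        for (int i = 0; i < steps; i++) t = cooler.toCool(t);
        return t;
    }
}
```

Plugging in a different Cooler implementation (linear, logarithmic, etc.) changes the cooling schedule without touching the annealing algorithm itself.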

Optimization Helpers

It is widely accepted that correct tuning of the parameters controlling the behaviour of a metaheuristic has a big impact on its success in solving problems. To adapt and tune those parameters, it is very useful to follow the value of certain data along the execution of the technique: we can determine the best initial values for a metaheuristic, and even which variant of a technique should be chosen for a specific problem.

This functionality is built into our framework through elements called "observers". An observer traces the value of the property it follows at every iteration of the algorithm, letting us follow the evolution of any parameter of a metaheuristic: the objective value of the best solution found, the average objective value of the population of a Genetic Algorithm at each iteration, the evolution of the temperature parameter in Simulated Annealing, etc. Moreover, observers need to be implemented only once and can then be used with any problem and any metaheuristic technique.

We have also developed a set of elements, called "visualizators", that process the information obtained by the observers in different ways. Using them we can, for example, save to a file all the values collected by the observers, or show charts of their evolution. These visualizators can be applied to any observer, including those defined for a specific problem by the user. As an example, all charts shown later in the computational experiments were obtained using a charting visualizator.
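A minimal sketch of an observer of this kind, under our own simplified API (FOM's actual observer classes may look different):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of an "observer" that traces one value per iteration.
class TraceObserver {
    private final List<Double> trace = new ArrayList<>();

    // Called once per iteration of the algorithm being observed.
    void observe(double value) { trace.add(value); }

    // The full trace, e.g. for a "visualizator" to chart or save to a file.
    List<Double> getTrace() { return trace; }

    // Best (maximum) value seen so far.
    double best() {
        double b = Double.NEGATIVE_INFINITY;
        for (double v : trace) b = Math.max(b, v);
        return b;
    }
}
```

Because the observer sees only a stream of numbers, the same class can trace a GA's average fitness, an annealer's temperature, or any user-defined quantity.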

Additional Features & Capabilities

Finally, using our framework we can experiment with different variants of the metaheuristic techniques without any coding effort. For example, as seen in Figure 3, the cooling schedule used by Simulated Annealing is determined by the Cooler object associated with it. By swapping one Cooler implementation for another, we change the cooling schedule; in this way we can try different cooling schedules for any problem defined for the framework, simply by changing the implementation of the Cooler object attached to our SimulatedAnnealing object. In the same way, we can change the type of memory used by Tabu Search, etc.

Another relevant point of the framework is the reuse of functional elements across different metaheuristic techniques: thanks to abstraction and the division of responsibilities, functional elements can be reused. For example, Genetic Algorithms need criteria to choose individuals: the individuals to mutate or invert, the individuals to cross in order to obtain offspring, and the individuals that survive to become members of the population in the next iteration. To address this, we have defined a set of elements called "selectors". We show a class diagram with their structure:

The Selector class defines the selection method, which takes a set of candidate solutions and returns the subset that is chosen. We also define different classes implementing different selection criteria, such as tournaments of n elements, random selection, best-first, etc. These elements can likewise be used by the Ant System metaheuristic to choose the ants that will update the pheromone trail. Here again we see the advantages of a correct abstraction and definition of the responsibilities of functional elements.
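A minimal sketch of a tournament selector, using our own simplified types (it operates directly on fitness values rather than full individuals; FOM's actual Selector hierarchy may differ):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Hypothetical sketch of a Selector: given candidate fitness values,
// return the chosen subset.
interface Selector {
    List<Double> select(List<Double> candidates, int howMany);
}

// Tournament of n: each pick samples n candidates (with replacement)
// and keeps the best of them.
class TournamentSelector implements Selector {
    private final int n;
    private final Random rng;

    TournamentSelector(int n, long seed) { this.n = n; this.rng = new Random(seed); }

    public List<Double> select(List<Double> candidates, int howMany) {
        List<Double> chosen = new ArrayList<>();
        for (int k = 0; k < howMany; k++) {
            double winner = Double.NEGATIVE_INFINITY;
            for (int i = 0; i < n; i++) {
                double c = candidates.get(rng.nextInt(candidates.size()));
                winner = Math.max(winner, c);   // keep the best of the tournament
            }
            chosen.add(winner);
        }
        return chosen;
    }
}
```

The same object can serve crossover-partner selection in a GA or ant selection in an Ant System, since both only need "pick some candidates according to a criterion".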

Applications and Usage of FOM

With the aim of proving the efficiency and applicability of the framework, we have approached two classical problems in the area of Operational Research: the SAT problem and the TSP. We used standard, publicly available instance libraries for both problems: SATLIB for the SAT problem and TSPLIB for the TSP.

We will show some results obtained by applying the metaheuristic techniques to one instance of each selected problem, in order to illustrate the capabilities and possibilities our framework gives us. We will show the obtained solutions and, where relevant, a chart showing the evolution of interesting parameters. Because of space limitations we show only the results for techniques we consider interesting. Moreover, detailed values of the parameters used for each technique are not listed; we have roughly used standard values from the literature. However, we keep a detailed description of these values and a more exhaustive study of the results obtained with all the techniques, as well as applications of the techniques in the framework to other instances of these problems.

The SAT problem

For our tests we have chosen the SAT problem instance defined in the file JNH207.SAT of SATLIB. We show the results obtained by applying different techniques.

Tabu Search: Results: We first executed the algorithm with a small tabu memory and few iterations, and could not find an optimal solution. We then executed it again with a bigger memory size and more iterations, and found an optimal solution. Finally, to determine the critical factor, memory size or number of iterations, we ran it once more with the same memory size as in the first run but a large number of iterations, and again could not find an optimal solution; according to the results in the next figures, an even bigger number of iterations would have been necessary. Interesting facts: We show two charts with the evolution of the current solution at every iteration, first with a small memory size and then with a bigger one; in the second chart we used a higher number of iterations.

Genetic Algorithms: Results: We found an optimal solution every time we applied this technique. Interesting facts: It is interesting to look at the charts showing the evolution of the number of unsatisfied clauses for every individual, and the figure showing the evolution of the diversity of the population. The diversity of the population is calculated by comparing all pairs of individuals, evaluating the equality of each pair with a value between 0 and 1, and finally dividing the sum by the number of individuals. We can see the population evolving towards better solutions while, at the same time, its diversity decreases.
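The text defines this diversity measure only loosely, so the following is one plausible reading, not FOM's exact formula: we take pair equality as the fraction of matching positions, average it over all pairs (normalizing by the number of pairs so the result stays in [0, 1]), and report one minus that average, so that a converging population shows decreasing diversity.

```java
// Hypothetical sketch of the population-diversity measure described above.
// Assumptions (ours, not FOM's): individuals are bit strings; pair equality
// is the fraction of positions at which two individuals agree; we normalize
// by the number of pairs and report 1 - average equality as the diversity.
class Diversity {
    static double pairEquality(boolean[] a, boolean[] b) {
        int same = 0;
        for (int i = 0; i < a.length; i++) if (a[i] == b[i]) same++;
        return (double) same / a.length;
    }

    static double populationDiversity(boolean[][] pop) {
        double sum = 0;
        int pairs = 0;
        for (int i = 0; i < pop.length; i++)
            for (int j = i + 1; j < pop.length; j++) {
                sum += pairEquality(pop[i], pop[j]);
                pairs++;
            }
        return 1.0 - sum / pairs;   // high average equality => low diversity
    }
}
```

Under this reading, a population of identical individuals has diversity 0 and a population of complementary bit strings has diversity 1.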

The TSP problem

In order to carry out our tests we have chosen the TSP problem instance defined in the file att48.tsp from TSPLIB. We show the results obtained for some techniques. Tabu Search: Results: We obtained a solution with objective value 37711.35, the best solution in the tests shown. It should not be forgotten, however, that we used the highest iteration number of all the tests shown in this article. Interesting facts: We show the evolution of the evaluated solution at every iteration.

Simulated Annealing: Results: We obtained a solution with objective value 54500.51 (the number of iterations was smaller than for Tabu Search). Interesting facts: We chose a cooling schedule that makes the temperature decrease geometrically; Figure 10 shows the descending temperature curve. Moreover, a chart showing the value of the current solution allows us to discern how the algorithm operates.

Genetic Algorithms: Results: We executed two variants of this technique to compare their efficiency. On the one hand, we applied the technique with no optimization mechanism other than mutation, inversion and crossover of individuals, and obtained a solution with objective value 81012.90. On the other hand, using an advanced variant, a steepest-descent algorithm applied to all individuals, we obtained better results: the solution found was 45817.78. Interesting facts: It is interesting to compare Figures 12 and 13, which show the evolution of the individuals in the population. The evolution in Figure 13 is more "jagged" and shows a bigger initial improvement of the population, due to the effect of the local search.

Useful Links

Other metaheuristic Optimization Frameworks:

How to cite

If you need to cite FOM please use the following reference:

J. A. Parejo, J. Racero, F. Guerrero, T. Kwok, K. A. Smith. FOM: A Framework for Metaheuristic Optimization. Computational Science — ICCS 2003, Lecture Notes in Computer Science, Volume 2660, 2003, pp. 886-895.

You can download the BibTeX citation here

Framework Documentation

Sample Problems

We have solved some classical optimization problems using FOM:

FOM Laboratory GUI

You can download a basic GUI laboratory here.


[1] José Antonio Parejo, Antonio Ruiz-Cortés, Sebastián Lozano, Pablo Fernandez. Metaheuristic optimization frameworks: a survey and benchmarking. Soft Computing, March 2012, Volume 16, Issue 3, pp. 527-561.