# GECCO '19- Proceedings of the Genetic and Evolutionary Computation Conference

Full Citation in the ACM Digital Library

## SESSION: Ant colony optimization and swarm intelligence

### AM_{clr}: an improved ant-miner to extract comprehensible and diverse classification rules

Ant-Miner, a rule-based classifier, is a powerful algorithm for extracting classification rules. It can classify complex data accurately, but it suffers from high selective pressure and from premature or very slow convergence. A further limitation, not yet addressed, is rule evaluation based on correct prediction. In this paper, we propose a new ant-miner (named AM_{clr}) based on a new quality function, rank-based term selection, and rule rejection thresholds. The new quality function is based on rule length, coverage, and correct prediction; the rule rejection thresholds are based on the coverage and quality of the rule. The proposed algorithm has been tested on various benchmark datasets and compared with other state-of-the-art algorithms. The experimental results show that the proposed approach achieves better results in terms of accuracy, convergence speed, and comprehensibility.
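The abstract names the three ingredients of the new quality function but not its exact form. A minimal sketch of one plausible composite, assuming an equally weighted average and the names `tp`/`fp` for covered cases predicted correctly/incorrectly (the actual formula in the paper may differ):

```python
def rule_quality(tp, fp, covered, total, rule_len, max_len):
    """Illustrative composite quality combining the three factors the
    abstract lists: correct prediction, coverage, and rule length.
    The equal weighting and averaging are assumptions, not the paper's
    actual formula."""
    correct = tp / (tp + fp) if (tp + fp) else 0.0  # fraction of covered cases predicted correctly
    coverage = covered / total if total else 0.0    # fraction of training cases the rule covers
    brevity = 1.0 - rule_len / max_len              # shorter rules score higher
    return (correct + coverage + brevity) / 3.0
```

A rule rejection threshold, as described, would then simply discard any candidate rule whose coverage or quality falls below a preset minimum before it can enter the discovered rule list.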

### A stable hybrid method for feature subset selection using particle swarm optimization with local search

The determination of a small set of biomarkers to make a diagnostic call can be formulated as a feature subset selection (FSS) problem to find a small set of genes with high relevance for the underlying classification task and low mutual redundancy. However, repeated application of a heuristic, evolutionary FSS technique usually fails to produce consistent results. Here, we introduce COMB-PSO-LS, a novel hybrid (wrapper-filter) FSS algorithm based on Particle Swarm Optimization (PSO) that features a local search strategy to select the least dependent and most relevant feature subsets. In particular, we employ a Randomized Dependence Coefficient (RDC)-based filter technique to guide the search process of the particle swarm, allowing the selection of highly relevant and consistent features. Classifying cancer samples through patient gene expression profiles, we found that COMB-PSO-LS provides highly stable and non-redundant gene subsets that are relevant for the classification process, outperforming standard PSO methods.

### Control parameter sensitivity analysis of the multi-guide particle swarm optimization algorithm

This paper conducts a sensitivity analysis of the recently proposed multi-objective optimization algorithm, the multi-guide particle swarm optimization algorithm (MGPSO). The MGPSO uses subswarms to explore the search space, where each subswarm optimizes one of the multiple objectives. A bounded archive is used to share previously found non-dominated solutions among the subswarms. A third term, the archive guide, is added to the velocity update equation and represents a randomly selected solution from the archive. The influence of the archive guide on a particle is controlled by the archive balance coefficient, relative to that of the social guide. The original implementation of the MGPSO used static values for the archive balance coefficient, randomly sampled from a uniform distribution over [0, 1]. This paper investigates a number of approaches to dynamically adjust this control parameter, evaluated on a variety of multi-objective optimization problems. It is shown that a linearly increasing strategy and stochastic strategies outperform the standard approach to initializing the archive balance coefficient on two-objective and three-objective optimization problems.
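The velocity update described above can be sketched as follows for a one-dimensional particle. The coefficient names (`w`, `c1`, `c2`, `c3`) and their default values are standard PSO conventions, and the exact placement of the archive balance coefficient `lam` weighting the social guide against the archive guide is an assumption based on the abstract, not the paper's verified equation:

```python
import random

def mgpso_velocity(v, x, pbest, gbest, archive_guide,
                   w=0.72, c1=1.49, c2=1.49, c3=1.49, lam=0.5):
    """Illustrative MGPSO-style velocity update (1-D): inertia, cognitive
    and social components, plus an archive-guide component. lam is the
    archive balance coefficient trading off the social guide against the
    randomly selected archive solution."""
    r1, r2, r3 = (random.random() for _ in range(3))
    return (w * v
            + c1 * r1 * (pbest - x)               # cognitive component
            + lam * c2 * r2 * (gbest - x)         # social component
            + (1.0 - lam) * c3 * r3 * (archive_guide - x))  # archive guide
```

Under this sketch, the dynamic strategies the paper studies would simply vary `lam` over iterations (e.g. linearly increasing, or resampled stochastically) rather than fixing it at initialization.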

### A peek into the swarm: analysis of the gravitational search algorithm and recommendations for parameter selection

The Gravitational Search Algorithm is a swarm-based optimization metaheuristic that has been successfully applied to many problems. However, to date little analytical work has been done on this topic.

This paper performs a mathematical analysis of the formulae underlying the Gravitational Search Algorithm. From this analysis, it derives key properties of the algorithm's expected behavior and recommendations for parameter selection. It then confirms through empirical examination that these recommendations are sound.
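The abstract does not reproduce the formulae it analyzes; for orientation, here is a simplified textbook-style sketch of one Gravitational Search Algorithm step in one dimension (full-population attraction, fixed gravitational constant `G`; the algorithm typically also decays `G` over time and restricts attraction to a best-so-far subset, which this sketch omits):

```python
import random

def gsa_step(pos, vel, mass, G):
    """One simplified 1-D GSA update: each particle is attracted toward
    every other particle with force scaled by G and the attractor's mass,
    divided by their distance; velocity is the randomly weighted old
    velocity plus the resulting acceleration."""
    n = len(pos)
    eps = 1e-9  # avoids division by zero for coincident particles
    new_pos, new_vel = [], []
    for i in range(n):
        acc = 0.0
        for j in range(n):
            if i != j:
                r = abs(pos[i] - pos[j]) + eps
                acc += random.random() * G * mass[j] * (pos[j] - pos[i]) / r
        v = random.random() * vel[i] + acc
        new_vel.append(v)
        new_pos.append(pos[i] + v)
    return new_pos, new_vel
```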

### Simultaneous meta-data and meta-classifier selection in multiple classifier system

In ensemble systems, the predictions of base classifiers are aggregated by a combining algorithm (meta-classifier) to achieve better classification accuracy than a single classifier. Experiments show that the performance of an ensemble depends significantly on the choice of meta-classifier. Classifier selection methods for ensembles usually remove all the predictions of a classifier if that classifier is not selected for the final ensemble. Here we present an idea to remove only a subset of each classifier's predictions, thereby introducing a simultaneous meta-data and meta-classifier selection method for ensemble systems. Our approach uses cross-validation on the training set to generate meta-data as the predictions of the base classifiers. We then use Ant Colony Optimization to search for the optimal subset of meta-data and the optimal meta-classifier for the data. Considering each column of meta-data, we construct a configuration consisting of a subset of these columns and a meta-classifier. Specifically, the columns are selected according to their corresponding pheromones, and the meta-classifier is chosen at random. The classification accuracy of each configuration is computed by cross-validation on the meta-data. Experiments on UCI datasets show the advantage of the proposed method over several classifier and feature selection methods for ensemble systems.
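The construction step described above (pheromone-driven column selection plus a randomly chosen meta-classifier) can be sketched as follows. Interpreting each pheromone value as a per-column selection probability in [0, 1] is an assumption; the paper's actual selection rule may normalize or combine pheromones differently:

```python
import random

def build_configuration(pheromone, meta_classifiers, rng):
    """Illustrative ACO construction step: keep meta-data column i with
    probability equal to its pheromone level (assumed to lie in [0, 1]),
    and pick the meta-classifier uniformly at random, as in the abstract."""
    columns = [i for i, tau in enumerate(pheromone) if rng.random() < tau]
    meta_clf = rng.choice(meta_classifiers)
    return columns, meta_clf
```

Each configuration produced this way would then be scored by cross-validated accuracy on the meta-data, and the pheromones of columns appearing in good configurations reinforced.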

### Scaling techniques for parallel ant colony optimization on large problem instances

Ant Colony Optimization (ACO) is a nature-inspired optimization metaheuristic which has been successfully applied to a wide range of different problems. However, a significant limiting factor in terms of its scalability is memory complexity; in many problems, the *pheromone matrix* which encodes trails left by ants grows quadratically with the instance size. For very large instances, this memory requirement is a limiting factor, making ACO an impractical technique. In this paper we propose a restricted variant of the pheromone matrix with linear memory complexity, which stores pheromone values only for members of a candidate set of next moves. We also evaluate two selection methods for moves outside the candidate set. Using a combination of these techniques we achieve, in a reasonable time, the best solution qualities recorded by ACO on the *Art TSP* Traveling Salesman Problem instances, and the first evaluation of a parallel implementation of *MAX-MIN* Ant System on instances of this scale (≥ 10^{5} vertices). We find that, although ACO cannot yet achieve the solutions found by state-of-the-art genetic algorithms, we rapidly find approximate solutions within 1 -- 2% of the best known.
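The restricted pheromone matrix can be sketched as a map keyed only by candidate-list edges, with a fixed fallback value for all other edges. The class and method names here are illustrative, and the fixed-default fallback is only one possible policy for moves outside the candidate set (the paper evaluates two selection methods, whose details are not given in the abstract):

```python
class RestrictedPheromone:
    """Pheromone storage restricted to each city's candidate list.

    Memory is O(n * k) for n cities and candidate lists of size k,
    instead of O(n^2) for a full matrix. Reads of non-candidate edges
    return a fixed default; writes to them are dropped."""

    def __init__(self, candidates, initial, default):
        # candidates: dict mapping city -> list of candidate next cities
        self.tau = {(i, j): initial
                    for i, nbrs in candidates.items() for j in nbrs}
        self.default = default

    def get(self, i, j):
        return self.tau.get((i, j), self.default)

    def set(self, i, j, value):
        if (i, j) in self.tau:  # only candidate edges persist updates
            self.tau[(i, j)] = value
```

For an Art TSP instance with 10^5 vertices and candidate lists of size 20, this stores 2 x 10^6 entries rather than the 10^10 a dense matrix would require.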

### On the selection of the optimal topology for particle swarm optimization: a study of the tree as the universal topology

In this paper, we deal with the problem of selecting the best topology in Particle Swarm Optimization. Unlike most state-of-the-art papers, where a statistical analysis of a large number of topologies is carried out, in this work we formalize the problem mathematically: the problem is to find the best topology in the set of all simple connected graphs on *n* nodes. To determine the best topology, each graph in this set must be measured with a function that evaluates its quality. We introduce the concepts of equivalent neighborhood and equivalent topology to prove that for any simple connected graph there is an equivalent tree. Equivalence between two topologies means that each particle has the same local best in both. Therefore, the problem can be reduced to finding the best tree in the set of all trees with *n* nodes. Finally, we give some examples of equivalent topologies and discuss the applicability of the obtained result.