GECCO '19: Proceedings of the Genetic and Evolutionary Computation Conference
SESSION: Real world applications
Optimising bus routes with fixed terminal nodes: comparing hyper-heuristics with NSGA-II on realistic transportation networks
The urban transit routing problem (UTRP) is concerned with finding efficient travel routes for public transportation systems. This problem is highly complex, and the development of effective algorithms to solve it is very challenging. Furthermore, realistic benchmark data sets are lacking, making it difficult for researchers to compare their problem-solving techniques with those of others. In this paper we contribute a new set of benchmark instances generated by a procedure that scales down a real-world transportation network while preserving the vital characteristics of its layout, including "terminal nodes" at which buses are required to start and end their journeys. In addition, we present a hyper-heuristic solution approach specially tailored to solving instances with defined terminal nodes. We use our hyper-heuristic technique to optimise the generalised costs for passengers and operators, and compare the results with those produced by an NSGA-II implementation on the same data set. We provide a set of competitive results that improve on the current bus routes used by bus operators in Nottingham.
Deep neural networks are vulnerable to adversarial examples, even in the black-box setting, where the attacker is restricted solely to query access. Existing black-box approaches to generating adversarial examples typically require a significant number of queries, either for training a substitute network or performing gradient estimation. We introduce GenAttack, a gradient-free optimization technique that uses genetic algorithms for synthesizing adversarial examples in the black-box setting. Our experiments on different datasets (MNIST, CIFAR-10, and ImageNet) show that GenAttack can successfully generate visually imperceptible adversarial examples against state-of-the-art image recognition models with orders of magnitude fewer queries than previous approaches. Against MNIST and CIFAR-10 models, GenAttack required roughly 2,126 and 2,568 times fewer queries, respectively, than ZOO, the prior state-of-the-art black-box attack. In order to scale the attack up to large, high-dimensional ImageNet models, we perform a series of optimizations that further improve the query efficiency of our attack, leading to 237 times fewer queries against the Inception-v3 model than ZOO. Furthermore, we show that GenAttack can successfully attack some state-of-the-art ImageNet defenses, including ensemble adversarial training and non-differentiable or randomized input transformations. Our results suggest that evolutionary algorithms open up a promising area of research into effective black-box attacks.
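To make the query-only setting concrete, the core idea of a genetic black-box attack can be sketched as follows. This is a minimal illustration, not the authors' GenAttack implementation: `model_score` is a hypothetical query function returning the target-class score, and all parameter values are assumptions.

```python
import random

def genetic_attack(x, model_score, eps=0.1, pop_size=8, generations=50, seed=0):
    """Black-box genetic attack sketch (an illustration, not GenAttack itself).
    `model_score(x_adv)` is a hypothetical query returning the target-class
    score; only query access is assumed, no gradients."""
    rng = random.Random(seed)
    dim = len(x)
    # Population of additive perturbations inside an L-infinity ball of radius eps.
    pop = [[rng.uniform(-eps, eps) for _ in range(dim)] for _ in range(pop_size)]

    def fitness(delta):
        return model_score([xi + di for xi, di in zip(x, delta)])

    for _ in range(generations):
        ranked = sorted(pop, key=fitness, reverse=True)
        children = [ranked[0]]                              # elitism
        while len(children) < pop_size:
            p1, p2 = rng.sample(ranked[:pop_size // 2], 2)  # parents from fitter half
            child = [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]
            child = [min(eps, max(-eps, d + rng.gauss(0.0, eps / 10)))  # mutate + clip
                     for d in child]
            children.append(child)
        pop = children
    return max(pop, key=fitness)
```

Because the loop only calls `model_score`, the same skeleton works against any classifier exposed solely through queries; the attack budget is simply `pop_size * generations` evaluations.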
Community energy systems (CESs) are shared energy systems in which multiple communities generate and consume energy from renewable resources. At regular time intervals, each participating community decides whether to self-supply, store, trade, or sell their energy to others in the scheme or back to the grid according to a predefined policy which all participants abide by. The objective of the policy is to maximise average satisfaction across the entire CES while minimising the number of unsatisfied participants. We propose a multi-class, multi-tree genetic programming approach to evolve a set of specialist policies that are applicable to specific conditions, relating to abundance of energy, asymmetry of generation, and system volatility. Results show that the evolved policies significantly outperform a default handcrafted policy. Additionally, we evolve a generalist policy and compare its performance to specialist ones, finding that the best generalist policy can equal the performance of specialists in many scenarios. We claim that our approach can be generalised to any multi-agent system solving a common-pool resource allocation problem that requires the design of a suitable operating policy.
After the Big Bang, it took about 200 million years before the very first stars would form - now more than 13 billion years ago. Unfortunately, we will not be able to observe these stars directly. Instead, we can observe the 'fossil' records that these stars have left behind, preserved in the oldest stars of our own galaxy. When the first stars exploded as supernovae, their ashes were dispersed and the next generation of stars formed, incorporating some of the debris. We can now measure the chemical abundances in those old stars, which is similar to a genetic fingerprint that allows us to identify the parents.
In this paper, we develop a Genetic Algorithm (GA) for identifying the 'parents' of these old stars in our galaxy. The objective is to study the now extinct first stars in the universe - what their properties were, how they lived and died, how many they were, and even how different or alike they were. The GA is evaluated on its effectiveness in finding the right combination of ashes from theoretical models. The solutions found by the GA are compared to observational data. The aim is to find out which theoretical data, i.e., abundances of chemical elements, best matches the current observations.
Carsharing services have successfully established their presence and are now growing steadily in many cities around the globe. Carsharing helps to ease traffic congestion and reduce city pollution. To be efficient, carsharing fleet vehicles need to be located on city streets in high-population-density areas, taking into account demographics, parking restrictions, traffic, and other relevant local information in order to satisfy travel demand. This work formulates the initial placement of a fleet of cars for a round-trip carsharing service as a multi-objective optimization problem. The performance of state-of-the-art metaheuristic algorithms, namely SPEA2, NSGA-II, and NSGA-III, is evaluated on a novel benchmark composed of synthetic and real-world instances built from real demographic data and street networks. Inverted generational distance (IGD), spread, and hypervolume metrics are used to compare the algorithms. Our findings demonstrate that NSGA-II yields significantly lower IGD and higher hypervolume than the other algorithms, while SPEA2 achieves significantly better diversity than NSGA-II and NSGA-III.
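Of the three comparison metrics named above, inverted generational distance is the simplest to state. A textbook sketch of the metric (not the authors' evaluation code), assuming both fronts are given as lists of objective vectors:

```python
import math

def igd(reference_front, approx_front):
    """Inverted generational distance: the mean Euclidean distance from each
    reference-front point to its nearest neighbour in the approximation front
    (lower is better). A textbook sketch of the metric, not the authors' code."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return sum(min(dist(r, a) for a in approx_front)
               for r in reference_front) / len(reference_front)
```

For example, an approximation front identical to the reference front scores exactly 0, while dropping points from it increases the score, which is why lower IGD indicates both better convergence and better coverage.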
Multiobjective shape design in a ventilation system with a preference-driven surrogate-assisted evolutionary algorithm
We formulate and solve a real-world shape design optimization problem of an air intake ventilation system in a tractor cabin by using a preference-based surrogate-assisted evolutionary multi-objective optimization algorithm. We are motivated by practical applicability and focus on two main challenges faced by practitioners in industry: 1) meaningful formulation of the optimization problem reflecting the needs of a decision maker and 2) finding a desirable solution based on a decision maker's preferences when solving a problem with computationally expensive function evaluations. For the first challenge, we describe the procedure of modelling a component in the air intake ventilation system with commercial simulation tools. The problem to be solved involves time-consuming computational fluid dynamics simulations. Therefore, for the second challenge, we extend a recently proposed Kriging-assisted evolutionary algorithm, K-RVEA, to incorporate a decision maker's preferences. Our numerical results indicate efficient use of the computing resources available, and the solutions obtained reflect the decision maker's preferences well. Notably, two of the solutions dominate the baseline design (the design provided by the decision maker before the optimization process). The decision maker was satisfied with the results and eventually selected one of them as the final solution.
Relative Expression Analysis (RXA) focuses on finding interactions among a small group of genes and studies the relative ordering of their expression rather than their raw values. Algorithms based on this idea play an important role in biomarker discovery and gene expression data classification. We propose a new evolutionary approach and a paradigm shift for RXA applications in data mining, as we redefine the inter-gene relations using the concept of a cluster of co-expressed genes. The global hierarchical classification allows finding various sub-groups of genes, unifies the main variants of RXA algorithms, and explores a much larger solution space than current solutions based on exhaustive search. Finally, the multi-objective fitness function, which includes accuracy, the discriminative power of genes, and cluster consistency, as well as specialized variants of genetic operators, improves evolutionary convergence and reduces model underfitting. Importantly, patterns in the predictive structures are kept comprehensible and may have direct applicability. Experiments carried out on 8 cancer-related gene expression datasets show that the proposed approach finds interesting patterns and significantly improves the accuracy of predictions.
The Brain-Computer Interface (BCI) is an emerging technology with a wide range of applications. For instance, it can be used for post-stroke motor rehabilitation in order to restore part of the motor control of someone injured in an accident. The BCI process involves signal acquisition and preprocessing, extraction and selection of features, and classification. Thus, in order to classify movements correctly, several filters are commonly used to handle the signal data. Here we propose the use of Differential Evolution (DE) with cross-entropy as the objective function to find an appropriate filter. Computational experiments are performed using 2 datasets from BCI competitions for motor imagery, with bipolar and monopolar electroencephalography signals. These problems involve both two-class and multiclass classification. The results show that the proposed DE obtained mean results 9.85% better than the well-known Filter Bank Common Spatial Pattern approach for BCIs with 2 classes and bipolar signals, and it allows a reduction in the number of electrodes for BCIs with 4 classes.
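The specific DE variant is not detailed in the abstract; a minimal DE/rand/1/bin sketch shows the general shape of such an optimizer. In the paper's setting the objective would be the cross-entropy of the classifier under the filter encoded by each vector; here it is a placeholder, and the bounds, population size, and control parameters are assumptions.

```python
import random

def differential_evolution(objective, bounds, pop_size=20, F=0.5, CR=0.9,
                           generations=100, seed=0):
    """DE/rand/1/bin sketch. `objective` is any function to minimise; in the
    paper's setting it would be the classifier's cross-entropy under the
    filter encoded by the candidate vector (placeholder here)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Three distinct vectors, all different from the target i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantees at least one mutant gene
            trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     if (rng.random() < CR or k == jrand) else pop[i][k]
                     for k in range(dim)]
            trial = [min(hi, max(lo, v)) for v, (lo, hi) in zip(trial, bounds)]
            s = objective(trial)
            if s < scores[i]:  # greedy one-to-one replacement
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]
```

The greedy one-to-one replacement is what makes DE attractive for noisy objectives like classifier loss: a trial vector only survives if it strictly improves on its own parent.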
Clinical scorecards of risk factors associated with disease severity or mortality outcome are used by clinicians to make treatment decisions and optimize resources. This study develops an automated tool, or framework, based on evolutionary algorithms for the derivation of scorecards from clinical data. The techniques employed are based on the NSGA-II multi-objective genetic algorithm (GA), which optimizes the Pareto front of two clinically relevant scorecard objectives: size and accuracy. Three automated methods are presented which improve on previous manually derived scorecards. The first is a hybrid algorithm which uses the GA for feature selection and a decision tree for scorecard generation. In the second, the GA generates the full scorecard. The third is an extended full scoring system in which the GA also generates the scorecard scores. In this system, combinations of features and thresholds for each scorecard point are selected by the algorithm, and the evolutionary process is used to discover near-optimal Pareto fronts of scorecards for exploration by expert decision makers. This is shown to produce scorecards that improve upon a human-derived example for C. difficile, an important infection found globally in communities and hospitals, although the methods described are applicable to any disease where the required data is available.
The extraction of 3D models represented by Constructive Solid Geometry (CSG) trees from point clouds is a common problem in reverse engineering pipelines as used by Computer Aided Design (CAD) tools. We propose three independent enhancements on state-of-the-art Genetic Algorithms (GAs) for CSG tree extraction: (1) A deterministic point cloud filtering mechanism that significantly reduces the computational effort of objective function evaluations without loss of geometric precision, (2) a graph-based partitioning scheme that divides the problem domain in smaller parts that can be solved separately and thus in parallel and (3) a 2-level improvement procedure that combines a recursive CSG tree redundancy removal technique with a local search heuristic, which significantly improves GA running times. We show in an extensive evaluation that our optimized GA-based approach provides faster running times and scales better with problem size compared to state-of-the-art GA-based approaches.
Network virtualization has enabled new business models by allowing infrastructure providers to lease or share their physical infrastructure. A fundamental network management problem that infrastructure providers face in supporting customized virtual network services is Virtual Network Embedding (VNE): the (NP-hard) problem of mapping constrained virtual networks onto the physical network. In this paper, we propose EVA, an architecture that solves the virtual network embedding problem with evolutionary algorithms. By tuning the fitness function and several other policies and parameters, EVA adapts to different types of network topologies and virtualization environments. We compared a few representative policies of EVA with recent virtual network embedding and virtual network function chain allocation solutions; our findings show that EVA obtains higher acceptance ratios as well as quicker convergence times. We release our implementation code to allow researchers to experiment with evolutionary policy programmability.
Effective soil sampling is essential for the construction of prescription maps used in Precision Agriculture for Variable Rate Application of nutrients. In practice, designing a field sampling plan is subject to hard limitations, mainly due to the associated expense, so only a few sample points are taken for evaluation. The accuracy of the constructed maps is affected by the number of sampling points, their geographical dispersion, and their coverage of the feature space. To improve accuracy, ancillary data in the form of low-cost, high-resolution field scans can be used to infer statistical measures for devising the high-cost sampling plan. The current study targets algorithmically guided sampling plans using available ancillary data. We propose possible models for quantifying spatial coverage and diversity with respect to the ancillary data. We investigate these models as objective functions, devise Pareto optimization problems, and solve them using NSGA-II. We analyze the obtained sampling plans in an agricultural field, and suggest statistical tools for sample-size determination and for ranking plans according to additional information criteria. We argue that our approach succeeds in attaining a practical sampling plan that constitutes a fine trade-off between the objectives and is free of discrepancies.
Towards better generalization in WLAN positioning systems with genetic algorithms and neural networks
The most widely used positioning system today is the GPS (Global Positioning System), which has many commercial, civil, and military applications and is present in most smartphones. However, this system does not perform well indoors, which poses a constraint for the positioning task in environments like shopping malls, office buildings, and other public places. In this context, WLAN positioning systems based on fingerprinting have attracted a lot of attention as a promising approach for indoor localization that uses the existing infrastructure. This paper contributes to this field by presenting a methodology for developing WLAN positioning systems using genetic algorithms and neural networks. The fitness function of the genetic algorithm is based on the generalization capabilities of the network for test points that are not included in the training set. Using this approach, we have achieved state-of-the-art results with few parameters, and our method has been shown to be less prone to overfitting than other techniques in the literature, achieving better generalization at points that are not recorded on the radio map.
Augmented evolutionary intelligence: combining human and evolutionary design for water distribution network optimisation
Evolutionary Algorithms (EAs) have been employed for the optimisation of both theoretical and real-world problems for decades. These methods, although capable of producing near-optimal solutions, often fail to meet real-world application requirements due to considerations which are hard to define in an objective function. One solution is to employ an Interactive Evolutionary Algorithm (IEA), involving an expert human practitioner in the optimisation process to help guide the algorithm to a solution better suited to real-world implementation. This approach requires the practitioner to make thousands of decisions during an optimisation, potentially leading to user fatigue and diminishing the algorithm's search ability. This work proposes a method for capturing engineering expertise through machine learning techniques and integrating the resultant heuristic into an EA through its mutation operator. The human-derived heuristic-based mutation is assessed on a range of water distribution network design problems from the literature and shown to often outperform traditional EA approaches. These developments open up the potential for more effective interaction between human experts and evolutionary techniques, with potential application to a much larger and more diverse set of problems beyond the field of water systems engineering.
One of the most promising applications of near-term quantum computing is the simulation of quantum systems, a classically intractable task. Quantum simulation requires computationally expensive matrix exponentiation; Trotter-Suzuki decomposition of this exponentiation enables efficient simulation to a desired accuracy on a quantum computer. We apply the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) algorithm to optimise the Trotter-Suzuki decompositions of a canonical quantum system, the Heisenberg Chain; we reduce simulation error by around 60%. We introduce this problem to the computational search community, show that an evolutionary optimisation approach is robust across runs and problem instances, and find that optimisation results generalise to the simulation of larger systems.
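CMA-ES itself maintains a full covariance matrix; as a much simpler stand-in, a (1+1) evolution strategy with the classic 1/5th-success rule illustrates the gradient-free loop such an optimisation runs. The `error` function is a placeholder for the Trotter-Suzuki simulation error as a function of the decomposition parameters; this is an illustrative sketch, not the paper's method.

```python
import random

def one_plus_one_es(error, x0, sigma=0.3, iters=500, seed=0):
    """(1+1) evolution strategy with the 1/5th-success rule -- a simplified
    stand-in for CMA-ES. `error(x)` is a placeholder for the simulation error
    of a decomposition with parameter vector x (any function to minimise)."""
    rng = random.Random(seed)
    x, fx = list(x0), error(x0)
    for _ in range(iters):
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = error(y)
        if fy <= fx:
            x, fx = y, fy
            sigma *= 1.5           # success: widen the search
        else:
            sigma *= 1.5 ** -0.25  # failure: shrink, targeting ~1/5 success rate
    return x, fx
```

CMA-ES additionally adapts the full covariance of the mutation distribution, which is what makes it robust on ill-conditioned landscapes like the one reported in the paper; the step-size adaptation shown here is the simplest member of that family.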
High-Altitude Pseudo-Satellites (HAPS) are long-endurance, fixed-wing, lightweight Unmanned Aerial Vehicles (UAVs) that operate in the stratosphere and offer a flexible alternative for ground activity monitoring/imaging at specific time windows. As their missions must be planned ahead (to let them operate in controlled airspace), this paper presents a Genetic Algorithm (GA)-guided Hierarchical Task Network (HTN)-based planner for multiple HAPS. The HTN makes it possible to compute plans that conform to airspace regulations and operation protocols. The GA copes with the combinatorial complexity of the problem, which grows exponentially with the number of monitoring locations and HAPS involved, in searching for an optimal task decomposition that considers the time-dependent mission requirements and the time-varying environment. In addition, the GA offers a flexible way to handle the problem constraints and optimization criteria: the former encode the airspace regulations, while the latter measure client satisfaction, operational efficiency, and the normalized expected mission reward (which accounts for the effect of wind on the uncertainty of arrival times at the monitoring locations). Finally, by integrating the GA into the HTN planner, the new approach efficiently finds good overall task decompositions, leading to satisfactory task plans that can be executed reliably even in tough environments, as the results in the paper show.
Safety assurance for automotive products is crucial and challenging. It becomes even more difficult when the variability in automotive products is considered. Recently, the notion of automotive multi-product lines (multi-PL) has been proposed as a unified framework to accommodate different sources of variability in automotive products. In the context of automotive multi-PL, we propose a stability analysis for safety, motivated by our industrial collaboration, where we observed that under certain operation scenarios, safety varies drastically with small fluctuations in production parameters, environmental conditions, or driving inputs. To characterize instability, we formulate a multi-objective optimization problem and solve it with a search-based approach. The proposed technique is applied to an industrial automotive multi-PL, and experimental results show its effectiveness in spotting instability. Moreover, based on information gathered during the search, we provide some insights on both the testing and the quality engineering of automotive products.
Structured Grammatical Evolution (SGE) is a recent grammar-based genetic programming variant that tackles the main drawbacks of Grammatical Evolution by relying on a one-to-one mapping between each gene and a non-terminal symbol of the grammar. It has been applied with success in previous works to a set of classical benchmark problems. However, its performance on hard real-world problems has not yet been assessed. In this paper, we fill this gap by analyzing the performance of SGE when generating predictive models for the glucose levels of diabetic patients. Our algorithm uses features that take into account past glucose values, insulin injections, and the amount of carbohydrate ingested by a patient. The results show that SGE can evolve models that predict glucose levels more accurately than previous grammar-based approaches used for the same problem. Additionally, we show that the models tend to be more robust, since their behavior on the training and test data is very similar, with small variance.
Evolutionary learning of link allocation algorithms for 5G heterogeneous wireless communications networks
Wireless communications networks are operating at breaking point during an era of relentless traffic growth. Network operators must utilize scarce and expensive wireless spectrum efficiently in order to satisfy demand. Spectrum on the links between cells and user equipments ('users': smartphones, tablets, etc.) frequently becomes congested. Capacity can be increased by transmitting data packets via multiple links. Packets can be routed through multiple Long Term Evolution (LTE) links in existing fourth generation (4G) networks. In future 5G deployments, users will be equipped to receive packets over LTE, WiFi, and millimetre wave links simultaneously.
How can we allocate spectrum on links, so that all customers experience an acceptable quality of service? Building effective schedulers for link allocation requires considerable human expertise. We automate the design process through the novel application of evolutionary algorithms. Evolved schedulers boost downlink rates by over 150% for the worst-performing users, relative to a single-link baseline. The proposed techniques significantly outperform a benchmark algorithm from the literature. The experiments illustrate the promise of evolutionary algorithms as a paradigm for managing 5G software-defined wireless communications networks.
This paper studies a bi-objective optimization problem in which the goal is to optimize connections between entities in the market in order to make the entire system resilient to shocks of varying magnitudes. As observed in the literature concerning contagion on the inter-bank market, no system structure maximizes resilience to shocks of all sizes. For larger shocks more connections between entities make the crisis worse, and for smaller shocks more connected systems turn out to be more resilient than less connected ones.
In order to benefit from the information about the system connectivity, a machine learning model is used to estimate the values of the objectives attained by a given solution. The paper presents a comparison of a multiobjective optimization algorithm that evaluates solutions through simulation with a surrogate-based algorithm that uses a machine learning model. In the experiments, the surrogate-based method outperformed the simulation-based one. This observation, along with the analysis of the dependence between system connectivity and resilience to shocks of different magnitudes presented in the paper, allows us to conclude that information on system connectivity can be used to improve the performance of optimization methods aimed at making the system less susceptible to financial contagion.
The analysis of electroencephalogram (EEG) waves is of critical importance for the diagnosis of sleep disorders, such as sleep apnea and insomnia, as well as seizures, epilepsy, head injuries, dizziness, headaches, and brain tumors. In this context, one important task is the identification of visible structures in the EEG signal, such as sleep spindles and K-complexes. The identification of these structures is usually performed by visual inspection by human experts, a process that can be error-prone and susceptible to biases. Therefore, there is interest in developing technologies for the automated analysis of EEG. In this paper, we propose a new Genetic Programming (GP) framework for feature construction and dimensionality reduction from EEG signals. We use these features to automatically identify spindles and K-complexes in data from the DREAMS project. Using 5 different classifiers, the set of attributes produced by GP obtained better AUC scores than those obtained from PCA or the full set of attributes. The results obtained from the proposed framework also achieved a better balance of specificity and recall than other models recently proposed in the literature. Analysis of the features most used by GP also suggested improvements to data acquisition protocols for future EEG examinations.
To control fluid flow in underground geologic formations, the placement of injection/production wells must be optimized while taking the geological characteristics of the reservoir into account. However, the optimum solution may not perform as well in the real world as the simulation suggests, because a reservoir model generally contains considerable geological uncertainty owing to the limited information available about the deep subsurface. Optimizing an objective function that integrates the response values from different models is one approach to this difficulty. Such objective functions were proposed in a previous study, but their applicability has not been evaluated in depth. We therefore examine their applicability through well placement optimization for Carbon dioxide Capture and Storage (CCS) under geological statistical uncertainty as a case study. The case study considers the optimization of multiple CO2 injection wells in a heterogeneous reservoir whose geological uncertainty is represented by 50 statistically independent reservoir models. The optimum solution obtained using all models showed high applicability against the uncertainty when compared, via sensitivity analysis, with nominal solutions optimized independently for single models. In addition, one of the proposed objective functions yielded superior results.
A hybrid evolutionary algorithm framework for optimising power take-off and placements of wave energy converters
Ocean wave energy is a source of renewable energy that has gained much attention for its potential to contribute significantly to meeting global energy demand. In this research, we investigate the problem of maximising the energy delivered by farms of wave energy converters (WECs). We consider state-of-the-art fully submerged three-tether converters deployed in arrays. The goal of this work is to use heuristic search to optimise the power output of arrays in a size-constrained environment by configuring WEC locations and the power take-off (PTO) settings for each WEC. Modelling the complex hydrodynamic interactions in wave farms is expensive, which constrains search to only a few thousand model evaluations. We explore a variety of heuristic approaches, including cooperative and hybrid methods. The effectiveness of these approaches is assessed in two real wave scenarios (Sydney and Perth) with farms of two different scales. We find that a combination of symmetric local search and Nelder-Mead simplex direct search, together with a backtracking optimization strategy, outperforms previously defined search techniques by up to 3%.
Mission planning for Earth Observation Satellite operators typically implies dynamically altering how requests from different customers are prioritised in order to meet expected deadlines. A request corresponds to a given area of interest to capture on Earth. This exercise is challenging for different reasons. First, satellites are limited by maneuvers and power consumption constraints resulting in a limited surface that can be covered at each orbit. Consequently, many requests are in competition with each other and so all of them cannot be treated at each orbit. When a request priority is boosted, it may incidentally penalise surrounding requests. Second, there are several uncertain factors such as weather (in particular cloud cover) and future incoming requests that can impact the completion progress of the requests. With order books of increasing size and the planned operations of a growing number of satellites in a close future, there is a clear need for a decision support method.
In this paper, we investigate the potential of Evolutionary Algorithms (EAs) for this problem and propose several approaches to optimise request priorities based on Local Search and Population-Based Incremental Learning (PBIL). Using a certified algorithm to decode request priorities into satellite actions and a satellite simulator developed by Airbus, we are able to realistically evaluate the potential of these methods and benchmark them against operator baselines.
Experiments on several scenarios and order books show that EAs can outperform baselines and significantly improve operations both in terms of delay reduction and successful image capture. While black-box approaches yield significant improvements over the baseline when there are many delayed requests in the order books, introducing domain knowledge is required to handle cases with fewer delayed requests.
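PBIL is the simplest of the approaches named above to sketch: it maintains a probability vector over binary decisions and nudges it toward the best sampled solution each generation. The bit-string encoding here (one "boost this request's priority?" bit per request) and all parameter values are illustrative assumptions, not the authors' representation.

```python
import random

def pbil(fitness, n_bits, pop_size=30, lr=0.1, generations=60, seed=0):
    """Population-Based Incremental Learning sketch: sample bit-strings from a
    probability vector and move the vector toward the best sample. Each bit
    could encode a 'boost this request's priority?' decision (an illustrative
    encoding, not the authors' representation)."""
    rng = random.Random(seed)
    p = [0.5] * n_bits  # start with maximum uncertainty
    for _ in range(generations):
        pop = [[1 if rng.random() < pi else 0 for pi in p] for _ in range(pop_size)]
        best = max(pop, key=fitness)
        p = [(1.0 - lr) * pi + lr * bi for pi, bi in zip(p, best)]
    return p
```

Because only the final probability vector is carried between generations, PBIL needs no crossover or explicit population memory, which keeps it cheap when each fitness call involves an expensive decoding step such as a satellite action simulator.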
Employee scheduling problems are of critical importance to large businesses. These problems are hard to solve due to the large number of conflicting constraints. While many approaches address a subset of these constraints, no single approach addresses all of them simultaneously. We hybridise the 'Evolutionary Ruin & Stochastic Recreate' and 'Variable Neighbourhood Search' metaheuristics to solve a real-world instance of the employee scheduling problem to near optimality. We compare this with Simulated Annealing, exploring the algorithm configuration space using the irace software package to ensure a fair comparison. The hybrid algorithm generates schedules that reduce unmet demand by over 28% compared to the baseline. Where possible, all data used comes directly from a real-world engineer scheduling operation of around 25,000 employees; otherwise it is synthesised from a related distribution.
A novel hybrid scheme using genetic algorithms and deep learning for the reconstruction of Portuguese tile panels
This paper presents a novel scheme, based on a unique combination of genetic algorithms (GAs) and deep learning (DL), for the automatic reconstruction of Portuguese tile panels, a challenging real-world variant of the jigsaw puzzle problem (JPP) with important national heritage implications. Specifically, we introduce an enhanced GA-based puzzle solver whose integration with a novel DL-based compatibility measure (DLCM) yields state-of-the-art performance on this application. Current compatibility measures typically consider (the chromatic information of) edge pixels between adjacent tiles, and help achieve high accuracy for the synthetic JPP variant. However, such measures perform rather poorly when applied to Portuguese tile panels, which are subject to various real-world effects, e.g., monochromatic panels, non-square tiles, edge degradation, etc. To overcome these difficulties, we have developed a novel DLCM that extracts high-level texture/color statistics from the entire tile information.
Integrating this measure with our enhanced GA-based puzzle solver, we have demonstrated, for the first time, how to deal most effectively with large-scale real-world problems, such as the Portuguese tile problem. Specifically, we have achieved 82% accuracy for the reconstruction of Portuguese tile panels with unknown piece rotation and puzzle dimension (compared to merely 3.5% average accuracy achieved by the best method known for solving this problem variant). The proposed method outperforms even human experts in several cases, correcting their mistakes in the manual tile assembly.
Sentiment analysis consists of evaluating opinions or statements based on text analysis. Among the methods used to estimate the degree to which a text expresses a certain sentiment are those based on Gaussian Processes. However, traditional Gaussian Process methods use predefined kernels with hyperparameters that can be tuned but whose structure cannot be adapted. In this paper, we propose the application of Genetic Programming for the evolution of Gaussian Process kernels that are more precise for sentiment analysis. We use a very flexible representation of kernels combined with a multi-objective approach that simultaneously considers two quality metrics and the computational time required to evaluate those kernels. Our results show that the algorithm can outperform Gaussian Processes with traditional kernels for some of the sentiment analysis tasks considered.
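A flexible kernel representation of the kind described above can be sketched as an expression tree over a small grammar. Because sums and products of valid kernels are themselves valid kernels, Genetic Programming can recombine subtrees freely. The grammar and function names below are illustrative assumptions, not the paper's actual encoding; inputs are scalar for simplicity.

```python
import math
import random

# Grammar: ('rbf', lengthscale) | ('lin',) | ('add', k1, k2) | ('mul', k1, k2)
def kernel_eval(expr, x, y):
    """Evaluate a kernel expression tree at a pair of scalar inputs."""
    op = expr[0]
    if op == 'rbf':
        return math.exp(-(x - y) ** 2 / (2 * expr[1] ** 2))
    if op == 'lin':
        return x * y
    if op == 'add':
        return kernel_eval(expr[1], x, y) + kernel_eval(expr[2], x, y)
    if op == 'mul':
        return kernel_eval(expr[1], x, y) * kernel_eval(expr[2], x, y)
    raise ValueError(op)

def random_kernel(rng, depth=2):
    """Grow a random kernel tree, terminating at base kernels."""
    if depth == 0 or rng.random() < 0.4:
        return ('rbf', rng.uniform(0.5, 2.0)) if rng.random() < 0.5 else ('lin',)
    op = rng.choice(['add', 'mul'])
    return (op, random_kernel(rng, depth - 1), random_kernel(rng, depth - 1))
```

Closure under addition and multiplication guarantees every evolved tree is symmetric and positive semi-definite, so each candidate can be plugged straight into a Gaussian Process regressor for fitness evaluation.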
In this study, a hybrid Multi-Objective Evolutionary Algorithm (MOEA) is proposed for the Multi-Objective Short-Term Unit Commitment (MO-STUC) problem, considering the minimization of operation cost and emissions as objectives. The proposed algorithm is based on a real-coded Differential Evolution (DE) and a two-step function to simultaneously deal with both the scheduling of the state of the units and the dispatching of the power among the committed units. Moreover, a local search technique, which combines two distinct local search procedures based on Pareto dominance and scalar fitness function, is hybridized with the proposed MOEA. In addition, a heuristic repair mechanism and a problem-specific mutation operator are adopted to enhance the algorithm's performance. The method is tested on two frequently studied test systems comprising 10 and 20 units, respectively. The non-dominated fronts provided by the proposed method adequately approximate the Pareto fronts of the test instances. Simulation results reveal that the proposed algorithm outperforms two standard MOEAs, based on DE and Genetic Algorithm, with respect to the employed quality indicators. Furthermore, the beneficial impact of combining the two local search procedures is demonstrated.
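One plausible reading of the real-coded scheme above is a per-unit decoding in which a single gene in [0, 1] determines both commitment state and dispatch level, combined with a standard DE/rand/1 mutation. The exact two-step function is not given in the abstract, so the threshold and mapping below are assumptions for illustration only.

```python
import random

def decode_unit(x, p_min, p_max, threshold=0.5):
    """Two-step decoding of one gene: below the threshold the unit is
    off; above it, the value maps linearly onto [p_min, p_max]."""
    if x < threshold:
        return 0.0
    frac = (x - threshold) / (1.0 - threshold)
    return p_min + frac * (p_max - p_min)

def de_rand_1(pop, i, f, rng):
    """DE/rand/1 mutant vector for individual i, clipped back to [0, 1]."""
    a, b, c = rng.sample([j for j in range(len(pop)) if j != i], 3)
    return [min(1.0, max(0.0, xa + f * (xb - xc)))
            for xa, xb, xc in zip(pop[a], pop[b], pop[c])]
```

A repair mechanism of the kind the paper adopts would then adjust the decoded dispatch so that total generation meets demand and minimum up/down-time constraints hold.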
An increasing emphasis on reducing pollution and congestion in city centres combined with an increase in online shopping is changing the ways in which logistics companies address vehicle routing problems (VRP). We introduce the micro-depot-VRP, in which a single supply vehicle is used to supply a set of micro-depots distributed across a city; deliveries are then made from the micro-depots by couriers using electric vehicles, bicycles and on foot. We present a formal definition of the problem, and propose a representation that can be used with an optimisation algorithm to minimise the total cost associated with delivering packages. Using five instances created from real data obtained from delivery companies operating within the City of Frankfurt, we apply an illumination algorithm in order to obtain a set of results that minimise costs but have differing characteristics in terms of emissions, distance travelled and number of couriers used. Results show that solutions can be obtained that have equivalent costs to the baseline standard VRP solution, but considerably improve on this in terms of minimising the secondary criteria relating to emissions, couriers and distance.
Generative mechanism-based models of social systems, such as those represented by agent-based simulations, require that intra-agent equations (or rules) be specified. However there are often many different choices available for specifying these equations, which can still be interpreted as falling within a particular class of mechanisms. Whilst it is important for a generative model to reproduce historically observed dynamics, it is also important for the model to be theoretically enlightening. Genetic programs (our own included) often produce concatenations that are highly predictive but are complex and hard to interpret theoretically. Here, we develop a new method - based on multi-objective genetic programming - for automating the exploration of both objectives simultaneously. We demonstrate the method by evolving the equations for an existing agent-based simulation of alcohol use behaviors based on social norms theory, the initial model structure for which was developed by a team of human modelers. We discover a trade-off between empirical fit and theoretical interpretability that offers insight into the social norms processes that influence the change and stasis in alcohol use behaviors over time.
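The trade-off between empirical fit and interpretability described above is navigated by Pareto dominance over the two objectives. A minimal sketch, assuming both objectives are minimised (e.g. prediction error and an expression-complexity score; the pairing is illustrative):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated (error, complexity) pairs."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]
```

Each point on the resulting front is a candidate rule set: one end reproduces the observed alcohol-use dynamics closely, the other remains simple enough to interpret against social norms theory.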
Using a genetic algorithm with histogram-based feature selection in hyperspectral image classification
Optical sensing has the potential to be an important tool in the automated monitoring of food quality. Specifically, hyperspectral imaging has enjoyed success in a variety of tasks ranging from plant species classification to ripeness evaluation in produce. Although effective, hyperspectral imaging is prohibitively expensive to deploy at scale in a retail setting. With this in mind, we develop a method to assist in designing a low-cost multispectral imager for produce monitoring by using a genetic algorithm (GA) that simultaneously selects a subset of informative wavelengths and identifies effective filter bandwidths for such an imager. Instead of selecting the single fittest member of the final population as our solution, we fit a univariate Gaussian mixture model to the histogram of the overall GA population, selecting the wavelengths associated with the peaks of the distributions as our solution. By evaluating the entire population, rather than a single solution, we are also able to specify filter bandwidths by calculating the standard deviations of the Gaussian distributions and computing the full-width at half-maximum values. In our experiments, we find that this novel histogram-based method for feature selection is effective when compared to both the standard GA and partial least squares discriminant analysis.
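The histogram-based step above can be sketched as follows: pool the wavelengths selected across the whole GA population, fit a one-dimensional Gaussian mixture, read the peaks as centre wavelengths, and convert each standard deviation to a filter bandwidth via the full-width at half-maximum, FWHM = 2√(2 ln 2)·σ ≈ 2.355σ. The plain-EM fitter and variable names below are illustrative stand-ins for whatever mixture-model implementation the authors used.

```python
import math

FWHM_FACTOR = 2.0 * math.sqrt(2.0 * math.log(2.0))  # ≈ 2.355

def fwhm(sigma):
    """Full width at half maximum of a Gaussian -> candidate filter bandwidth."""
    return FWHM_FACTOR * sigma

def em_gmm_1d(data, k, iters=200):
    """Plain EM for a k-component 1-D Gaussian mixture (no external deps)."""
    n = len(data)
    lo, hi = min(data), max(data)
    means = [lo + (hi - lo) * (j + 0.5) / k for j in range(k)]
    mu = sum(data) / n
    sigma0 = math.sqrt(sum((x - mu) ** 2 for x in data) / n)
    sigmas = [sigma0] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        resp = []
        for x in data:  # E-step: per-sample component responsibilities
            ps = [w * math.exp(-((x - m) ** 2) / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
                  for w, m, s in zip(weights, means, sigmas)]
            tot = sum(ps) or 1e-300
            resp.append([p / tot for p in ps])
        for j in range(k):  # M-step: re-estimate each component
            nj = sum(r[j] for r in resp) + 1e-12
            means[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            sigmas[j] = math.sqrt(
                sum(r[j] * (x - means[j]) ** 2 for r, x in zip(resp, data)) / nj + 1e-9)
            weights[j] = nj / n
    return means, sigmas, weights
```

Given the selected wavelengths of every individual in the final population as `data`, the fitted `means` become the filter centres and `fwhm(sigma)` the corresponding bandwidths for the low-cost imager design.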
Consumer-grade 3D printers have made the fabrication of aesthetic objects and static assemblies easier, opening the door to automate the design of such objects. However, while static designs are easily produced with 3D printing, functional designs, with moving parts, are more difficult to generate: The search space is high-dimensional, the resolution of the 3D-printed parts is not adequate, and it is difficult to predict the physical behavior of imperfect, 3D-printed mechanisms. An example challenge for automating the design of functional, 3D-printed mechanisms is producing a diverse set of reliable and effective gear mechanisms that could be used after production without extensive post-processing. To meet this challenge, an indirect encoding based on a Recurrent Neural Network (RNN) is proposed and evolved using Novelty Search. The elite solutions of each generation are 3D printed to evaluate their functional performance in a physical test platform. The proposed RNN model successfully discovers sequential design rules that are difficult to discover with other methods. Compared to a direct encoding of gear mechanisms evolved with Genetic Algorithms (GAs), the designs produced by the RNN are geometrically more diverse and functionally more effective, thus forming a promising foundation for the generative design of 3D-printed, functional mechanisms.
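Novelty Search, as used above, rewards behavioural difference rather than objective quality: an individual's novelty is typically its mean distance to the k nearest behaviours seen so far, and sufficiently novel behaviours join an archive. The sketch below uses a hypothetical one-dimensional behaviour descriptor (a real gear-mechanism descriptor would be a vector, e.g. of output speeds and torques).

```python
def novelty(behavior, archive, k=3):
    """Mean distance to the k nearest behaviours already in the archive."""
    if not archive:
        return float('inf')
    dists = sorted(abs(behavior - b) for b in archive)
    kk = min(k, len(dists))
    return sum(dists[:kk]) / kk

def maybe_archive(behavior, archive, threshold):
    """Add the behaviour to the archive only if it is novel enough."""
    if novelty(behavior, archive, k=3) > threshold:
        archive.append(behavior)
        return True
    return False
```

Selection then favours high-novelty individuals, which is what pushes the RNN encoding toward geometrically diverse gear designs instead of converging on a single solution.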
Cloud-based dynamic distributed optimisation of integrated process planning and scheduling in smart factories
In smart factories, process planning and scheduling need to be performed every time a new manufacturing order is received or a factory state change has been detected. A new plan and schedule need to be determined quickly to increase the responsiveness of the factory and its profit. Simultaneous optimisation of manufacturing process planning and scheduling leads to better results than a traditional sequential approach but is computationally more expensive and thus difficult to apply to real-world manufacturing scenarios. In this paper, a working approach for cloud-based distributed optimisation of process planning and scheduling is presented. It executes a multi-objective genetic algorithm on multiple subpopulations (islands). The number of islands is automatically decided based on the current optimisation state. A number of test cases based on two real-world manufacturing scenarios are used to show the applicability of the proposed solution.
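The island model underlying the approach above can be sketched in a few lines. This is a generic illustration, not the paper's system: the island count is fixed here (the paper adapts it to the optimisation state), the islands run sequentially rather than on cloud nodes, and a OneMax bit-counting objective stands in for the multi-objective planning-and-scheduling fitness.

```python
import random

def island_ga(n_islands=4, pop_size=20, genome_len=20, generations=40,
              migrate_every=5, seed=0):
    rng = random.Random(seed)
    fitness = lambda g: sum(g)  # OneMax stand-in for the real objective
    islands = [[[rng.randint(0, 1) for _ in range(genome_len)]
                for _ in range(pop_size)] for _ in range(n_islands)]
    for gen in range(1, generations + 1):
        for pop in islands:
            new = [max(pop, key=fitness)[:]]  # elitism
            while len(new) < pop_size:
                # Tournament selection, one-point crossover, bit-flip mutation.
                a = max(rng.sample(pop, 2), key=fitness)
                b = max(rng.sample(pop, 2), key=fitness)
                cut = rng.randrange(1, genome_len)
                child = a[:cut] + b[cut:]
                if rng.random() < 0.3:
                    i = rng.randrange(genome_len)
                    child[i] ^= 1
                new.append(child)
            pop[:] = new
        if gen % migrate_every == 0:
            # Ring migration: each island's best replaces a random
            # individual on the next island.
            bests = [max(pop, key=fitness)[:] for pop in islands]
            for i, pop in enumerate(islands):
                pop[rng.randrange(pop_size)] = bests[i - 1]
    return max((max(pop, key=fitness) for pop in islands), key=fitness)
```

In a cloud deployment each island would run in its own worker process or container, with migration implemented as message passing, which is what makes the dynamic adjustment of the island count practical.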