Trends in Artificial Intelligence

The goal of this work is to suggest a new hybrid algorithm for solving integer programming problems by incorporating the bat algorithm with direct search methods. The suggested algorithm is named the hybrid bat direct search algorithm (HBDS). In HBDS, global diversification and local intensification are balanced during the search. The bat algorithm has a good capability for both intensification and diversification; the intensification ability of the suggested algorithm is further increased by employing the pattern search method as a local search method instead of the random walk used in the classic bat algorithm. In the final stage of the algorithm, the Nelder-Mead method is used to improve the best solution found by the bat and pattern search stages, instead of running the algorithm for more iterations without any enhancement in the fitness value. The performance of HBDS is examined on 7 integer programming problems and compared with 10 benchmark algorithms for solving integer programming problems. The computational results show that HBDS is a promising algorithm and outperforms the other algorithms in most cases.


Introduction
Integer programming (IP) appears in nearly every research area of applied operations research and mathematical programming. Many real-life applications, such as scheduling, VLSI (very large scale integration) circuit design, engineering design, warehouse location, and robot path planning problems [1][2][3], can be formulated as IP problems.
On one hand, traditional integer programming methods such as dynamic programming or branch and bound have a high computational cost, because they examine a search tree that can have hundreds or more nodes when large-scale real-life problems are considered. On the other hand, heuristic and metaheuristic methods can be applied to solve integer programming problems. Swarm intelligence (SI) algorithms are novel metaheuristic algorithms inspired by the behavior of groups of social organisms. These algorithms are used to solve global optimization problems and their applications; examples include ant colony optimization (ACO) [4,5], artificial bee colony [6], particle swarm optimization (PSO) [7][8][9], bacterial foraging [10], bat algorithm [11,12], bee colony optimization (BCO) [13], wolf search [14], cat swarm [15], cuckoo search [16], firefly algorithm [17], and fish swarm/school [18].
Bat algorithm (BA) is a recent population-based algorithm inspired by the echolocation behavior of microbats [12]. BA is capable of balancing global diversification and local intensification during the search process. Because of its powerful performance, BA has been used by many researchers to solve diverse applications. For example, Lin, et al. [34] performed parameter estimation in dynamic biological systems using a chaotic bat algorithm integrating Levy flights and chaotic maps. Zhang and Wang [35] improved the diversity of solutions by using mutation with the bat algorithm for image matching. Yang [36] applied BA to solve multi-objective optimization and benchmark engineering problems. Komarasamy and Wahi [37] integrated K-means and the bat algorithm (KMBA) for efficient clustering. Nakamura, et al. [38] developed a discrete version of the bat algorithm to solve classification and feature selection problems. Xie, et al. [39] presented a variant of the bat algorithm integrating Levy flights and a differential operator to solve function optimization problems. In addition, Wang and Guo [40] integrated harmony search with the bat algorithm, producing a hybrid bat algorithm for the numerical optimization of benchmark functions.
The purpose of this paper is to avoid the slow convergence of the BA and to avoid trapping in local minima. In order to address these two issues, we suggest a new hybrid bat algorithm with direct search methods to solve integer programming problems [41]. The suggested algorithm is named the hybrid bat direct search algorithm (HBDS). In HBDS, the pattern search is employed as a local search method to exploit the region around the best found solution at each iteration, and in the final stage of the algorithm, the Nelder-Mead method is called to enhance the best solution obtained from the bat and pattern search stages. Using the Nelder-Mead method can hasten the search and avoid running the algorithm for more iterations without any enhancement in the results.
The rest of this paper is structured as follows. In Section 4, the integer programming problems, the applied direct search methods, and the classic BA are described. In Section 5, the main concepts of the suggested HBDS algorithm are given. The numerical experiments and results are shown in Section 6. Finally, the conclusion and future work are presented in Section 7.

Definition of the Problems and an Overview of the Applied Algorithms
In this section and its subsections, the definitions of the integer programming problems are presented and an overview of the BA and the pattern search method is given as follows.

The integer programming problem definition
An integer programming problem is a mathematical optimization problem in which all of the variables are restricted to be integers. The unconstrained integer programming problem can be defined as follows:

min f(x), x ∈ S ⊆ Z^n,   (1)

where Z^n is the set of n-dimensional integer vectors and S is a not necessarily bounded set.
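To make the definition concrete, the following sketch enumerates all integer points of a small box-constrained instance by brute force. The objective and the bounds are invented for illustration; this is not one of the paper's test problems FI_1-FI_7.

```python
from itertools import product

def f(x):
    # Illustrative convex quadratic whose integer minimizer is (2, -1).
    return (x[0] - 2) ** 2 + (x[1] + 1) ** 2

def brute_force_ip(f, bounds):
    """Enumerate every integer point in the box `bounds` and return the best."""
    grids = [range(lo, hi + 1) for lo, hi in bounds]
    return min(product(*grids), key=f)

best = brute_force_ip(f, [(-5, 5), (-5, 5)])
print(best, f(best))
```

Exhaustive enumeration is only feasible for tiny boxes; its cost grows exponentially with the dimension, which is why the paper turns to metaheuristics.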

Pattern search method
The authors in [42] introduced the pattern search (PS) method. Pattern search is a direct search method applied to solve global optimization problems. In a direct search method, no information about the gradient of the objective function is needed. The PS method has two types of moves: exploratory moves and pattern moves. In an exploratory move, a coordinate search is performed around a chosen solution with a step length of ∆, as in Algorithm 1. The exploratory move is considered successful if the function value of the new solution is better than that of the current solution; otherwise, the step length is reduced. If the exploratory move is successful, a pattern move is then used to produce the iterate solution. If the iterate solution is better than the current solution, an exploratory move is applied to the iterate solution and the iterate solution is accepted as the new solution. Otherwise, if the exploratory move is unsuccessful, the pattern move is rejected and the step length ∆ is decreased. The operation is repeated until the stopping criteria are satisfied. The Hooke and Jeeves (HJ) pattern search algorithm and its main steps are outlined in Algorithm 2. The parameters in Algorithms 1 and 2 are outlined in Table 1.
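The exploratory/pattern-move loop described above can be sketched as follows. This is a minimal illustrative implementation of the Hooke and Jeeves idea, not the paper's exact Algorithms 1-2; the parameter names and the shrink factor are our assumptions.

```python
def hooke_jeeves(f, x0, step=1.0, reduction=0.5, tol=1e-6, max_iter=1000):
    """Sketch of the Hooke-Jeeves pattern search (derivative-free)."""
    def exploratory(base, delta):
        # Coordinate search: try +/- delta along each axis, keep improvements.
        x = list(base)
        for i in range(len(x)):
            for d in (delta, -delta):
                trial = list(x)
                trial[i] += d
                if f(trial) < f(x):
                    x = trial
                    break
        return x

    x = list(x0)
    for _ in range(max_iter):
        if step < tol:
            break
        y = exploratory(x, step)
        if f(y) < f(x):
            # Pattern move: extrapolate along the improving direction,
            # then explore around the extrapolated point.
            pattern = [2 * yi - xi for yi, xi in zip(y, x)]
            z = exploratory(pattern, step)
            x = z if f(z) < f(y) else y
        else:
            step *= reduction   # exploratory move failed: shrink the step
    return x

sphere = lambda x: sum(v * v for v in x)
print(hooke_jeeves(sphere, [3.2, -1.7]))
```

On a smooth unimodal function the step length shrinks only when no axis direction improves, so the final point is accurate to roughly half the final step.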

Overview of the Bat algorithm
In this section, an overview of the main concepts and structure of the BA is given.

Main concepts:
Bat algorithm (BA) is a population-based metaheuristic algorithm developed by Xin-She Yang in 2010 [12]. BA is based on the echolocation of microbats, which use a type of sonar (echolocation) to detect prey and avoid obstacles in the dark. The main advantage of the BA is that it can provide fast convergence at a very early stage by switching from diversification to intensification; however, switching from diversification to intensification too quickly may lead to stagnation after some initial stage.
The rules of the BA: Based on the bat characteristics, Xin-She Yang developed the bat algorithm with the following rules.

• All bats can distinguish between prey and barriers/obstacles by using echolocation to sense distance.

• Each bat randomly moves with velocity v_i at a position x_i with a fixed frequency f_min, varying loudness A_0 and pulse emission rate r.

• The loudness is assumed to vary from a large value A_0 to a minimum value A_min.
Bat movement: The BA is a population-based method, where the population consists of bats (solutions). Each bat (solution) in the population moves randomly with a velocity v_i and a location x_i. Each bat is also assigned a frequency drawn uniformly from [f_min, f_max]. The position of each bat in the population is updated as shown in the following equations:

f_i = f_min + (f_max - f_min) β,   (2)
v_i^t = v_i^(t-1) + (x_i^(t-1) - x*) f_i,   (3)
x_i^t = x_i^(t-1) + v_i^t,   (4)

where β ∈ [0, 1] is a random vector drawn from a uniform distribution and x* is the current global best solution.
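Equations (2), (3) and (4) can be written as a short routine. The frequency range [0, 5] follows the parameter setting reported later in the paper; the helper name move_bat is ours.

```python
import random

def move_bat(x, v, x_best, f_min=0.0, f_max=5.0):
    """One bat movement step following Eqs. (2)-(4):
    draw a frequency, update the velocity relative to the global best,
    then update the position."""
    beta = random.random()                        # beta ~ U[0, 1]
    freq = f_min + (f_max - f_min) * beta         # Eq. (2)
    v = [vi + (xi - bi) * freq for vi, xi, bi in zip(v, x, x_best)]   # Eq. (3)
    x = [xi + vi for xi, vi in zip(x, v)]         # Eq. (4)
    return x, v

x, v = move_bat([4.0, -3.0], [0.0, 0.0], [0.0, 0.0])
print(x, v)
```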

Variation of loudness and pulse emission rates:
The loudness A_i and the pulse emission rate r_i are very important for letting the algorithm switch between the diversification and intensification processes. When a bat has found its prey, the loudness decreases and the rate of pulse emission increases. The BA starts with an initial loudness A_0 and an initial rate of pulse emission r_0; these values are then updated as shown in the following equations:

A_i^(t+1) = α A_i^t,   (5)
r_i^(t+1) = r_0 [1 - exp(-γt)],   (6)

where α ∈ [0, 1] and γ > 0 are constants; the parameter α plays a similar role to the cooling factor in the simulated annealing algorithm.
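A small sketch of the schedule in (5) and (6). The constants α = 0.9, γ = 0.9 and r_0 = 0.5 follow the parameter settings reported later in the paper; the function name is ours.

```python
import math

def update_loudness_pulse(A, r0, t, alpha=0.9, gamma=0.9):
    """Eqs. (5)-(6): once a bat improves, its loudness decays
    (A^(t+1) = alpha * A^t) while its pulse emission rate grows
    toward r0 (r^(t+1) = r0 * (1 - exp(-gamma * t)))."""
    return alpha * A, r0 * (1.0 - math.exp(-gamma * t))

A, r = 1.0, 0.0
for t in range(1, 6):
    A, r = update_loudness_pulse(A, 0.5, t)
print(A, r)   # loudness shrinks toward A_min, pulse rate rises toward r0
```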

Bat algorithm:
The main steps of the classic bat algorithm are shown in Algorithm 4 and can be summarized in the following steps.
Step 1: The algorithm starts by setting the initial values of its parameters and setting the main iteration counter to zero. (Lines 1-2)
Step 2: The initial population is randomly generated by generating the initial position x_i^0 and the initial velocity v_i^0 for each bat (solution) in the population. An initial frequency f_i, drawn uniformly from [f_min, f_max], is assigned to each solution. The initial population is evaluated by calculating the objective function f(x_i^0) for each solution, and each bat is assigned a pulse rate r_i ∈ [0, 1] and an initial loudness A_i, where A_i varies from a large A_0 to A_min. (Lines 3-9)
Step 3: The new population is generated by adjusting the position x_i and the velocity v_i of each solution in the population as in (2), (3), and (4). (Lines 12-13)
Step 4: The new population is evaluated by calculating the objective function for each solution, and the best solution x* is selected from the population. (Lines 14-15)
Step 5: A local search method is applied in order to refine the best found solution at each iteration. (Lines 16-19)
Step 6: The new solution is accepted with a probability depending on the parameter A_i; the rate of pulse emission is increased and the loudness is decreased. The values of A_i and r_i are updated as in (5) and (6).
Step 7: The new population is evaluated and the best solution is selected from the population. The operations are repeated until the termination criteria are satisfied, and the overall best solution is produced. (Lines 25-28)
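Steps 1-7 can be condensed into the following sketch of the classic BA. This is a simplified reading of Algorithm 4, not the paper's exact pseudocode: it uses greedy acceptance and a uniform random walk scaled by the average loudness, and the parameter values follow the settings reported later.

```python
import math, random

def bat_algorithm(f, dim, bounds, n_bats=20, n_iter=200,
                  f_min=0.0, f_max=5.0, A0=1.0, r0=0.5, alpha=0.9, gamma=0.9):
    """Compact sketch of the classic BA: frequency-driven moves,
    a random walk around the best solution, and the loudness/pulse schedule."""
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_bats)]
    V = [[0.0] * dim for _ in range(n_bats)]
    A = [A0] * n_bats
    r = [r0] * n_bats
    best = min(X, key=f)
    for t in range(1, n_iter + 1):
        for i in range(n_bats):
            freq = f_min + (f_max - f_min) * random.random()        # Eq. (2)
            V[i] = [v + (x - b) * freq for v, x, b in zip(V[i], X[i], best)]
            cand = [x + v for x, v in zip(X[i], V[i])]              # Eqs. (3)-(4)
            if random.random() > r[i]:
                # Local random walk around the current best solution.
                avg_A = sum(A) / n_bats
                cand = [b + avg_A * random.uniform(-1, 1) for b in best]
            cand = [min(hi, max(lo, c)) for c in cand]
            if f(cand) < f(X[i]) and random.random() < A[i]:
                X[i] = cand
                A[i] *= alpha                                        # Eq. (5)
                r[i] = r0 * (1 - math.exp(-gamma * t))               # Eq. (6)
            if f(X[i]) < f(best):
                best = list(X[i])
    return best

sphere = lambda x: sum(v * v for v in x)
print(bat_algorithm(sphere, 3, (-10, 10)))
```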

The Suggested HBDS Algorithm
The main steps of the suggested HBDS algorithm are given in Algorithm 5, and the description of the suggested algorithm can be summarized as follows.
Step 1: The algorithm starts by setting the initial values of its parameters. (Lines 1-2)
Step 2: The initial population is randomly generated by generating the initial position x_i^0 and the initial velocity v_i^0 for each bat (solution) in the population. An initial frequency f_i^0 is assigned to each solution in the population. The initial population is evaluated by calculating the objective function f(x_i^0) for each solution, along with the values of the pulse rate r_i and the initial loudness A_i. (Lines 3-9)
Step 3: The new population is generated by adjusting the position x_i and the velocity v_i of each solution in the population as in (2), (3), and (4). (Lines 12-13)
Step 4: The new population is evaluated by calculating the objective function for each solution, and the best solution x* is selected from the population. (Lines 14-15)
Step 5: The pattern search method of Algorithm 2 is used as a local search method in order to improve the best found solution at each iteration. (Lines 16-19)
Step 6: The new solution is accepted with a probability depending on the parameter A_i; the rate of pulse emission is increased and the loudness is decreased as in (5) and (6). (Lines 21-24)
Step 7: The new population is evaluated, and the best solution is chosen from the population. The operations are repeated until the stopping criteria are satisfied.
Step 8: The Nelder-Mead method is applied to the best solution found in the previous stage as a final intensification process, in order to speed up the search and avoid running the algorithm for more iterations without any enhancement. (Line 28)
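The HBDS steps above can be sketched as follows. This is a simplified illustration, not the paper's Algorithm 5: a single shared loudness is used, the pulse-rate branch is omitted, and the final Nelder-Mead polish is stood in for by a finer pattern search so the sketch stays self-contained.

```python
import random

def exploratory(f, x, delta):
    """One coordinate-wise exploratory move of the pattern search."""
    x = list(x)
    for i in range(len(x)):
        for d in (delta, -delta):
            trial = list(x)
            trial[i] += d
            if f(trial) < f(x):
                x = trial
                break
    return x

def pattern_search(f, x, delta, sigma=0.5, m=5):
    """m repetitions of explore-or-shrink, used as the local search."""
    for _ in range(m):
        y = exploratory(f, x, delta)
        if f(y) < f(x):
            x = y
        else:
            delta *= sigma
    return x

def hbds(f, dim, bounds, n_bats=20, n_iter=40, f_max=5.0, alpha=0.9):
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_bats)]
    V = [[0.0] * dim for _ in range(n_bats)]
    A = 1.0                      # simplified: one shared loudness
    best = min(X, key=f)
    for _ in range(n_iter):
        for i in range(n_bats):
            freq = f_max * random.random()
            V[i] = [v + (x - b) * freq for v, x, b in zip(V[i], X[i], best)]
            cand = [min(hi, max(lo, x + v)) for x, v in zip(X[i], V[i])]
            if f(cand) < f(X[i]) and random.random() < A:
                X[i] = cand
                A *= alpha
        best = min(X + [best], key=f)
        # Step 5: pattern search intensifies around the best solution.
        best = pattern_search(f, best, delta=(hi - lo) / 3)
    # Step 8: a finer pattern search stands in for the Nelder-Mead polish.
    return pattern_search(f, best, delta=1e-2, m=60)

sphere = lambda x: sum(v * v for v in x)
print(hbds(sphere, 2, (-5, 5)))
```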

Numerical Experiments
In this section, the efficiency of the HBDS algorithm is investigated on a set of benchmark integer programming problems.

The parameters of the minimum frequency f_min, the maximum frequency f_max, the population size P, the loudness constant α, the rate of pulse emission constant γ, the initial loudness A_0, the minimum loudness A_min, the initial rate of pulse emission r_0, and the maximum number of iterations are set before the run. The settings listed in Table 2 include:

A_0   The initial loudness                            1
r_0   The initial pulse rate                          0.5
α     The loudness constant                           0.9
γ     The rate of pulse emission constant             0.9
ε     Step size for checking for descent directions   10^-3
m     Local PS repetition number                      5

Pulse emission rate r: The value of the rate of pulse emission parameter r is very important for applying the local search method in the algorithm. The experimental tests show that the best value of r is 0.9 and the rate of pulse emission constant is γ = 0.9.
Pattern search parameters: HBDS uses PS as a local search method in order to enhance the best solution obtained from the BA at each iteration. In PS, the mesh size is initialized as ∆_0; in our experiments, ∆_0 = (U_i - L_i)/3, and when no enhancement is accomplished in the diversification search process, the mesh size is reduced using the reduction factor σ. The experimental results show that the best value of σ is 0.01. The PS steps are repeated m times in order to enhance the intensification process of the algorithm; in our experiments, m = 5 is used as the pattern search repetition number (Table 3).

Stopping condition parameters:
HBDS stops the search when the number of iterations reaches the desired maximum number of iterations, or according to any other termination criterion used for comparison with other algorithms. In our experiments, the value of the maximum number of iterations is determined by examining the general performance of the suggested algorithm on various benchmark functions and comparing its results with those of other algorithms. In the following subsections, the parameter settings of the suggested algorithm and the properties of the applied test functions are outlined. In addition, the performance analysis of the suggested algorithm and the comparative results against the other algorithms are given (Table 2).

Parameter setting
The parameters of the HBDS algorithm and their designated values are outlined in Table 2. Note that the parameter values are based on common settings in the literature.

Population size P:
The experimental tests show that the best population size is P = 20; increasing this number does not improve the results but does increase the number of function evaluations.
Frequency parameter f: Bat movement relies on the value of the frequency parameter f. In the HBDS algorithm, the quality of the solution is associated with the value of the parameter f. The experimental tests show that the best minimum value is f_min = 0 and the best maximum value is f_max = 5.

Loudness parameters A, α:
Loudness parameter A is one of the most essential parameters in the BA, since the acceptance of newly generated solutions is based on the value of A. The parameter α plays a role comparable to the cooling factor in the simulated annealing algorithm. The initial value of A is set to 1 and the value of α is set to 0.9.

Integer programming optimization test problems
The efficiency of the HBDS algorithm has been examined on 7 benchmark integer programming problems (FI_1-FI_7). The properties of the benchmark functions (the global optimum of each problem, the function number, the problem bounds, and the dimension of the problem) are outlined in Table 3, and the functions with their definitions are presented in Table 4.
The maximum number of iterations is Max_itr = 2d, where d is the dimension of the problem.

Final intensification:
The best solutions obtained from the BA and the pattern search method are recorded in a list so that the Nelder-Mead method can be applied to them; the number of solutions in this list is called N_elit. In order to avoid increasing the number of function evaluations, N_elit = 1. The results in Table 5 show that invoking the Nelder-Mead method in the final stage can hasten the search and help the algorithm reach the optimal or a near-optimal solution faster than the suggested algorithm without the Nelder-Mead method (Table 5).

HBDS and other algorithms
The HBDS algorithm is compared with four benchmark algorithms (namely, particle swarm optimization and its variants) in order to verify the efficiency of the suggested algorithm. Before presenting the comparison results, a brief description of the four comparative algorithms [24] is given.

RWMPSOg:
RWMPSOg is the Random Walk Memetic Particle Swarm Optimization (global variant), which incorporates particle swarm optimization with a random walk as direction exploitation.

RWMPSOl: RWMPSOl is the Random Walk Memetic Particle Swarm Optimization (local variant), which incorporates particle swarm optimization with a random walk as direction exploitation.
PSOg: PSOg is the standard particle swarm optimization (global variant) without a local search method.
PSOl: PSOl is the standard particle swarm optimization (local variant) without a local search method.
Comparison between RWMPSOg, RWMPSOl, PSOg, PSOl and HBDS for integer programming problems: In this subsection, the comparison results between our HBDS algorithm and the other algorithms are given in order to validate the efficiency of HBDS. The four comparative algorithms are examined on the 7 benchmark functions, and their results are taken from the original paper [24]. The minimum (Min), maximum (Max), average (Mean), standard deviation (St. D) and success rate (% Suc) of the function evaluation counts over 50 runs are outlined in Table 6. A run is regarded as successful if the algorithm reaches the global minimum of the solution within an error of 10^-6 before 20,000 function evaluations. The best results among the comparative algorithms are shown in boldface. The results in Table 6 show that the suggested HBDS algorithm succeeds in all runs and reaches the objective value of each function faster than the other algorithms on 5 of the 7 functions.
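The reported statistics can be computed as below. The run counts used here are hypothetical illustration numbers, not values from Table 6.

```python
import statistics

def summarize(evals, target_reached, limit=20000):
    """Min/Max/Mean/St.D of function-evaluation counts and % success over runs,
    mirroring the reporting style of Table 6."""
    ok = [e for e, hit in zip(evals, target_reached) if hit and e <= limit]
    return {
        "min": min(evals),
        "max": max(evals),
        "mean": statistics.mean(evals),
        "st_d": statistics.stdev(evals),
        "suc": 100.0 * len(ok) / len(evals),
    }

runs = [1625, 1780, 1500, 2100, 1950]   # hypothetical evaluation counts
stats = summarize(runs, [True] * 5)
print(stats)
```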

Comparison of HBDS without NM and HBDS with NM
The classic BA is compared with the suggested HBDS algorithm without the final intensification process (Nelder-Mead method) in order to verify the efficiency of the suggested HBDS. The parameter values are set the same for both algorithms in order to have a fair comparison. The functions FI_1, FI_2 and FI_3 have been chosen to show the efficiency of the suggested algorithm by plotting the function values against the number of iterations, as shown in Figure 1. In Figure 1, the solid line refers to the suggested HBDS results, while the dotted line refers to the classic BA results after 50 iterations. Figure 1 shows that the function values decrease more rapidly with the number of iterations for HBDS than for the classic BA. From Figure 1, it can be deduced that combining the classic BA with the pattern search method can enhance the performance of the classic BA and speed up the convergence of the suggested algorithm (Table 4).
The general performance of the suggested algorithm on the integer programming problems has been investigated by plotting the function values against the number of iterations, as shown in Figure 2 for the four test functions FI_4, FI_5, FI_6 and FI_7. The results in Figure 2 are those of the suggested algorithm without the Nelder-Mead method in its final stage, after 50 iterations. From Figure 2, it can be deduced that the function values of the suggested HBDS decrease rapidly as the number of iterations increases, and that the hybridization between the BA and the pattern search method can hasten the search and help the algorithm reach the optimal or a near-optimal solution in reasonable time.
The Nelder-Mead (NM) method is used in the final stage of the suggested HBDS algorithm in order to speed up its convergence and avoid running the algorithm for more iterations without any enhancement in the obtained results. The results in Table 5 show the mean function evaluation values of the suggested HBDS without and with the Nelder-Mead method, respectively.

The HBDS algorithm is also compared with various metaheuristic and swarm intelligence (SI) algorithms, such as the genetic algorithm (GA) [44], standard particle swarm optimization (PSO) [8], firefly (FF) [17], cuckoo search (CS) [16] and grey wolf optimization (GWO) [45]. For a fair comparison, the population size is set to 20 for all algorithms, and the stopping criteria are the same for all of them: the algorithm reaches the global minimum of the solution within an error of 10^-4 before 20,000 function evaluations. For the GA, the probability of crossover is set to PC = 0.8, the probability of mutation to PM = 0.01, and roulette wheel selection is used. For the other SI algorithms, the standard parameter setting of each algorithm is considered. The average (Avg) and standard deviation (SD) of all algorithms over 50 runs are outlined in Table 7.
It can be concluded from Table 7 that the suggested algorithm reaches the desired optimum values faster than the other SI algorithms.

HBDS and the branch and bound method
In order to investigate the power of the HBDS algorithm, it is compared with another famous method, the branch and bound (BB) method [46][47][48][49]. Before discussing the comparative results between the proposed algorithm and the BB method, the BB method and the main steps of its algorithm are presented.
Branch and bound method: The branch and bound method (BB) is one of the most widely used methods for solving optimization problems. The main idea of the BB method is that the feasible region of the problem is successively partitioned into several subregions; this operation is called branching. Lower and upper bounds of the function can be determined over these partitions; this operation is called bounding. The main steps of the BB method are reported in Algorithm 6 and can be summarized in the following steps.
Step 1: The algorithm starts with a relaxed feasible region M_0 ⊃ S, where S is the feasible region of the problem. This feasible region M_0 is partitioned into finitely many subsets M_i.
Step 2: For each subset M_i, lower and upper bounds α and β on the optimal function value are determined.
Step 3: The algorithm terminates if the bounds are equal or very close, i.e., α = β (or α - β ≤ ε), where ε is a predefined positive constant.
Step 4: Otherwise, if the bounds are not equal or very close, some of the subsets M_i are selected and partitioned in order to obtain a more refined partition of M_0.
Step 5: The procedure is repeated until the termination criteria are satisfied.
Comparison between the BB method and the HBDS algorithm for integer programming problems: In this subsection, the HBDS algorithm is tested against the BB method. The results of the BB method are taken from its original paper [?]. In [?], the BB algorithm transforms the initial integer programming problem into a continuous problem; for bounding, BB uses the sequential quadratic programming method to solve the generated subproblems, while for branching, it uses depth-first traversal with backtracking. In Table 8, the comparative results between the BB method and the proposed algorithm are reported: the average (Mean), standard deviation (St. D) and success rate (Suc) after 30 runs. The best mean evaluation values between the two algorithms are marked in boldface. The results in Table 8 show that the proposed algorithm outperforms the BB method, although the BB method is better on function FI_2. The overall results in Table 8 and Figure 3 show that the proposed algorithm is faster and more efficient than the BB method.
From the comparison tests between the proposed HBDS algorithm and the benchmark algorithms, it can be concluded that the proposed algorithm is a promising algorithm and can obtain the optimal or near-optimal function values for most of the test functions (Table 8).
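The branching/bounding loop above can be sketched on a toy separable quadratic whose continuous relaxation has a closed-form solution. Unlike the reference BB cited by the paper (SQP bounding, depth-first traversal with backtracking), this sketch uses a best-first queue, so it only illustrates the bounding and pruning idea.

```python
import math, heapq

def branch_and_bound(c, bounds):
    """Minimize sum_i (x_i - c_i)^2 over integer x in the box `bounds`.
    The continuous relaxation is solved in closed form (clip c to the box),
    giving a lower bound; branching splits on a fractional coordinate."""
    def relax(box):
        # Optimal continuous point within the box and its objective value.
        x = [min(hi, max(lo, ci)) for ci, (lo, hi) in zip(c, box)]
        return x, sum((xi - ci) ** 2 for xi, ci in zip(x, c))

    best_x, best_val = None, math.inf
    heap = [(relax(bounds)[1], bounds)]
    while heap:
        lb, box = heapq.heappop(heap)
        if lb >= best_val:
            continue                    # bound: subregion cannot improve
        x, _ = relax(box)
        frac = [i for i, xi in enumerate(x) if abs(xi - round(xi)) > 1e-9]
        if not frac:                    # relaxation is integral: new incumbent
            val = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
            if val < best_val:
                best_x, best_val = [int(round(v)) for v in x], val
            continue
        i = frac[0]                     # branch: x_i <= floor vs x_i >= ceil
        lo, hi = box[i]
        for new in ((lo, math.floor(x[i])), (math.ceil(x[i]), hi)):
            if new[0] <= new[1]:
                nb = list(box)
                nb[i] = new
                heapq.heappush(heap, (relax(nb)[1], nb))
    return best_x, best_val

print(branch_and_bound([1.4, -2.6], [(-5, 5), (-5, 5)]))
```

Because branching removes the open interval between floor and ceiling of each fractional coordinate, the tree depth is bounded by the dimension and the search terminates quickly on this toy instance.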

Conclusion and Future Work
In this paper, a new hybrid algorithm is suggested by incorporating the bat algorithm with direct search methods to solve integer programming problems. The suggested algorithm is named the hybrid bat direct search algorithm (HBDS). The HBDS algorithm is intensively tested on 7 integer programming problems and compared with 10 other algorithms to evaluate its performance. The numerical results illustrate that the suggested HBDS algorithm is a promising algorithm and can obtain the optimal or near-optimal function values for most of the test functions.