Article

An Improved Gray Wolf Optimization Algorithm to Solve Engineering Problems

1 Institute of Management Science and Engineering, and School of Business, Henan University, Kaifeng 475004, China
2 School of Business, Henan University, Kaifeng 475004, China
3 Institute of Intelligent Network Systems, and Software School, Henan University, Kaifeng 475004, China
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(6), 3208; https://doi.org/10.3390/su13063208
Submission received: 27 December 2020 / Revised: 4 March 2021 / Accepted: 11 March 2021 / Published: 15 March 2021

Abstract:
With the rapid development of the economy, the disparity between the supply and demand of resources is becoming increasingly prominent in engineering design. In this paper, an improved gray wolf optimization algorithm (IGWO) is proposed to optimize engineering design problems. First, a tent map is used to generate the initial positions of the gray wolf population, which distributes the population evenly and lays the foundation for a diversified global search. Second, Gaussian mutation perturbation is applied to the current optimal solution to keep the algorithm from falling into local optima. Finally, a cosine control factor is introduced to balance the global and local exploration capabilities of the algorithm and to improve the convergence speed. The IGWO algorithm is applied to four engineering optimization problems of different typical complexity: a pressure vessel design, a tension spring design, a welded beam design and a three-bar truss design. The experimental results show that the IGWO algorithm is superior to the comparison algorithms in optimal performance, solution stability, applicability and effectiveness, and can better address the problem of resource waste in engineering design. The IGWO is also applied to 23 different types of function problems, and the results are verified with the Wilcoxon rank-sum test and the Friedman test. The results show that the IGWO algorithm achieves higher convergence speed, convergence precision and robustness than the other algorithms.

1. Introduction

Many countries recognize the importance of resource optimization and sustainable development and consequently direct research toward saving resources and maintaining sustainable development. As an important part of the entire life cycle of an engineering construction project, engineering design is essential for transforming science and technology into actual production and is also the key stage for determining and controlling engineering costs. Engineering design directly affects the construction of the project.
In the process of engineering construction, an unreasonable construction process consumes large amounts of raw materials and produces large amounts of construction waste, leading to natural resource shortages and environmental pollution [1]. Reasonable engineering structural design can save materials and costs, thus saving resources and alleviating the disparity between resource supply and economic and social development. It is also important, in the long run, for achieving sustainable economic and social development and protecting national resources, security and the environment. Therefore, the engineering optimization design problem has attracted increasing attention from scholars.
As a new class of optimization method, intelligent optimization algorithms can solve engineering design optimization problems without global information and have gradually become a new way of solving engineering optimization problems [2]. They are based on the collective behavior of creatures living in swarms or colonies. Common intelligent optimization algorithms include particle swarm optimization (PSO) [3], the moth–flame optimization algorithm (MFO) [4], ant lion optimization (ALO) [5], the sine–cosine algorithm (SCA) [6], the bat algorithm (BA) [7], the flower pollination algorithm (FPA) [8] and the salp swarm algorithm (SSA) [9].
In recent years, many scholars have applied intelligent optimization algorithms to resource conservation, green low-carbon development and other problems. Afshar used constrained particle swarm optimization to solve the multi-reservoir cogeneration scheduling problem [10]. Kim studied the relationship between road transportation cost and carbon dioxide emissions, identified the main factors affecting carbon dioxide emissions during transportation, optimized the cargo transportation mode, and reduced carbon dioxide emissions within reasonable cost and time limits [11]. Lin et al. proposed an integrated model and a teaching–learning-based optimization (TLBO) algorithm for machining parameter optimization and flow shop scheduling to minimize the maximum completion time and carbon emissions [12]. He et al. proposed an energy-saving optimization method: they used machine tool selection to reduce machining energy consumption and adjusted the operation sequence to reduce energy waste while machines were idle [13]. Du et al. explored a new solution to the optimal allocation of water resources to alleviate the contradiction between supply and demand and improve the utilization rate of water resources; the objective function integrates social, economic and ecological benefits, and an optimal allocation model based on simulated annealing particle swarm optimization was designed [14]. Huang et al. proposed a multipollutant cost-effective optimization system based on a genetic algorithm (GA) to provide a cost-effective air quality control strategy for large-scale applications [2].
Gray wolf optimization (GWO) was proposed by Mirjalili in 2014, inspired by the leadership hierarchy and hunting mechanism of gray wolves in nature [15]. The optimization process mainly includes social hierarchy, encircling the prey and attacking the prey. The GWO algorithm is a new type of swarm intelligence optimization algorithm and shares many characteristics of this family: it imposes no special requirements on the objective function, does not depend on rigorous mathematical properties of the optimization problem, and its mechanism is easy to implement. In terms of implementation, the gray wolf algorithm can be adapted to specific problems, so it is highly versatile. At present, the algorithm has been applied to practical problems such as workshop scheduling [16], control systems [17], thermal power systems [18,19], image segmentation [20], mechanical design [21] and neural networks [22], and has achieved good optimization results.
As a new optimization method, the gray wolf algorithm has both strengths and weaknesses. It has good optimization potential but suffers from low exploration ability and a slow convergence rate. With continued research on this algorithm, many scholars have analyzed these problems and put forward improvements [23]. Jagdish Chand Bansal et al. proposed an exploration equation to search a larger area of the search space, addressing the insufficient exploration ability of the gray wolf algorithm and its tendency to fall into local optima; in addition, opposition-based learning was added to enrich the initial population diversity and improve the learning, search and optimization capabilities of the GWO [23]. Wang et al. proposed an improved gray wolf optimization algorithm with an evolutionary elimination mechanism: by adding the survival-of-the-fittest (SOF) principle of biological evolution and a differential evolution algorithm, their method avoids falling into local optima, accelerates the convergence of the GWO and improves its convergence accuracy [24]. Hi Jun T et al. proposed a GWO algorithm combined with a PSO algorithm, which retains the optimal position information of individuals and avoids falling into local optima; this algorithm was tested on 18 benchmark functions and achieved better results than other algorithms [25]. Zhang et al. added elite opposition-based learning and a simplex method to address the poor population diversity and slow convergence of the GWO, which increased the population diversity and improved the exploration ability of the gray wolves [26].
In the basic gray wolf algorithm, the initialization and evolution of the gray wolf population are random, which introduces a certain blindness into the position updates of the wolves, such as a tendency toward local optima and premature convergence. To solve these problems, an improved gray wolf optimization algorithm (IGWO) is proposed. Tent chaos is used to increase population diversity, Gaussian perturbation is used to expand the search range, and a cosine control factor is used to balance global exploration and local exploitation, so as to keep the basic gray wolf algorithm from falling into local optima and to improve solution accuracy. The IGWO is applied to four engineering problems: pressure vessel design, spring design, welded beam design and three-bar truss design. The results show that the cost of the engineering designs found by the IGWO algorithm is effectively reduced compared with other algorithms. The IGWO is also tested on seven unimodal, six multimodal and ten fixed-dimension multimodal functions, and the results are verified in a comparative study with SCA, MFO, PSO, BA, FPA and SSA. The proposed modification is validated on this set of 23 standard benchmark problems using the Wilcoxon rank-sum and Friedman tests. The results show that the IGWO algorithm achieves higher convergence speed, convergence precision and robustness than the other algorithms.

2. Gray Wolf Optimization

GWO is inspired by the hierarchy and hunting behavior of gray wolf packs. The algorithm achieves optimization by mathematically simulating the tracking, encircling and attacking behavior of a gray wolf pack. The hunting process involves three steps: social hierarchy stratification, encircling the prey and attacking the prey.

2.1. Social Hierarchy

Gray wolves are social canids at the top of the food chain and follow a strict hierarchy of social dominance. The best solution is marked as α, the second-best as β, the third-best as δ, and the remaining solutions as ω. The dominance hierarchy is shown in Figure 1.

2.2. Encircling the Prey

Gray wolves encircle prey during the hunt. To model the encircling behavior mathematically, the following equations are used:
$X(t+1) = X_p(t) - A \cdot \lvert C \cdot X_p(t) - X(t) \rvert$ (1)
$A = 2a \cdot r_1 - a$ (2)
$C = 2 \cdot r_2$ (3)
$a = 2 - \frac{2t}{Max\_iter}$ (4)
where $X$ is the position vector of a gray wolf, $X_p$ is the position vector of the prey, $t$ is the current iteration, $A$ and $C$ are coefficient vectors, $r_1$ and $r_2$ are random vectors in $[0, 1]^n$, $a$ is the distance control parameter, whose value decreases linearly from 2 to 0 over the course of the iterations, and $Max\_iter$ is the maximum number of iterations.
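As a concrete illustration, the encircling update of Equations (1)–(4) can be sketched in a few lines of NumPy. This is a minimal sketch, not the authors' implementation; the function and variable names are illustrative.

```python
import numpy as np

def encircle_step(X, X_p, t, max_iter, rng):
    """One encircling-prey update, Equations (1)-(4)."""
    a = 2 - 2 * t / max_iter              # Eq. (4): decreases linearly from 2 to 0
    A = 2 * a * rng.random(X.shape) - a   # Eq. (2): coefficient vector in [-a, a]
    C = 2 * rng.random(X.shape)           # Eq. (3): coefficient vector in [0, 2]
    D = np.abs(C * X_p - X)               # distance term inside Eq. (1)
    return X_p - A * D                    # Eq. (1): new position

rng = np.random.default_rng(0)
X_new = encircle_step(np.zeros(3), np.ones(3), t=0, max_iter=500, rng=rng)
```

At t = 0 the parameter a equals 2, so the step can overshoot the prey widely; as t approaches Max_iter the step shrinks toward the prey position.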

2.3. Attacking the Prey

Gray wolves can recognize the location of potential prey, and the search process is mainly guided by the α, β and δ wolves. In each iteration, the best three wolves (α, β, δ) in the current population are retained, and the positions of the other search agents are then updated according to their position information. The following formulas are used:
$X_1 = X_\alpha - A_1 \cdot \lvert C_1 \cdot X_\alpha - X \rvert$ (5)
$X_2 = X_\beta - A_2 \cdot \lvert C_2 \cdot X_\beta - X \rvert$ (6)
$X_3 = X_\delta - A_3 \cdot \lvert C_3 \cdot X_\delta - X \rvert$ (7)
$X(t+1) = \frac{X_1(t) + X_2(t) + X_3(t)}{3}$ (8)
In the above equations, $X_\alpha$, $X_\beta$ and $X_\delta$ are the position vectors of the α, β and δ wolves, respectively; $A_1$, $A_2$ and $A_3$ are computed like $A$, and $C_1$, $C_2$ and $C_3$ are computed like $C$. $D_\alpha = \lvert C_1 \cdot X_\alpha - X \rvert$, $D_\beta = \lvert C_2 \cdot X_\beta - X \rvert$ and $D_\delta = \lvert C_3 \cdot X_\delta - X \rvert$ represent the distances between the current candidate wolf and the best three wolves. As can be seen from Figure 2, the candidate solution finally falls within the random circle defined by α, β and δ. Then, under the guidance of the current best three wolves, the other candidates randomly update their positions near the prey. They first search for the prey's position in a scattered way and then concentrate to attack it.
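The leader-guided update described above (three pulls toward the α, β and δ wolves, then their average) can be sketched as follows. This is an illustrative NumPy sketch under the same conventions as the encircling step, not the authors' implementation.

```python
import numpy as np

def leader_guided_update(X, X_alpha, X_beta, X_delta, t, max_iter, rng):
    """Update one wolf from the three best wolves (attacking-prey step)."""
    a = 2 - 2 * t / max_iter
    new_positions = []
    for X_l in (X_alpha, X_beta, X_delta):
        A = 2 * a * rng.random(X.shape) - a
        C = 2 * rng.random(X.shape)
        D = np.abs(C * X_l - X)            # distance to this leader
        new_positions.append(X_l - A * D)  # one leader-guided candidate
    return sum(new_positions) / 3.0        # average of the three pulls

rng = np.random.default_rng(1)
X_new = leader_guided_update(np.full(3, 0.5), np.ones(3), np.zeros(3),
                             -np.ones(3), t=100, max_iter=500, rng=rng)
```

Averaging the three candidates is what places the new position inside the region bounded by the three best wolves.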
The flowchart of the GWO is given in Figure 3.

3. Improved Gray Wolf Optimization Algorithm (IGWO)

3.1. Tent Chaos Initialization

When solving practical problems, GWO usually uses randomly generated data as the initial population information, which makes it difficult to maintain population diversity and leads to poor results. Chaotic motion, however, has the characteristics of randomness, ergodicity and regularity [27], and exploiting these characteristics in the search can maintain population diversity and improve the global search ability. In general, a random-looking motion state produced by a deterministic equation is called chaos, and a variable representing the chaotic state is called a chaotic variable. Chaos is a common phenomenon in nonlinear systems [28].
Many scholars have introduced chaotic maps and chaotic search into the GWO. Existing chaotic maps include the logistic map, the tent map and others; however, the choice of chaotic map has a great impact on the chaos optimization process. Most chaotic maps used in the literature are logistic maps, and the inhomogeneity of logistic traversal affects the optimization speed, thereby reducing the efficiency of the algorithm.
Reference [29] pointed out that the tent map has better ergodicity and regularity and is faster than the logistic map, and proved by strict mathematical reasoning that the tent map satisfies the prerequisites for generating chaotic sequences for optimization algorithms. The tent map is a piecewise linear map, so named because its graph resembles a tent [28]. It is a one-dimensional chaotic map that is widely used in chaotic encryption systems (such as image encryption), chaotic spread-spectrum code generation, and the implementation of chaotic optimization algorithms.
Figure 4 shows the distributions of a logistic chaotic sequence and a tent chaotic sequence. It can be seen from Figure 4 that the logistic map takes values in the intervals [0, 0.05] and [0.95, 1] with greater probability than in other subintervals, while the tent map covers each part of the feasible region with relatively uniform probability. The uniformity of the chaotic sequence generated by the tent map is clearly better than that of the logistic map.
In this paper, the tent chaotic map is used to exploit randomness, ergodicity and regularity in the search, which effectively maintains population diversity, suppresses the tendency of the algorithm to fall into local optima, and improves the global search ability.
In recent years, the tent chaotic map has been applied to various algorithms with good results. Li et al. proposed a new image encryption scheme based on the tent chaotic map; they analyzed the performance and security of the scheme with known methods and demonstrated its effectiveness and security through fault security analysis [30]. Indu et al. proposed an improved tent-map particle swarm optimization algorithm (ITM-CPSO), which solved the nonlinear congestion management cost problem and further reduced the deviation of the timing pulse generator output from the predetermined level, thereby reducing overall unloading and cost [31]. Gokhale et al. proposed a tent chaos firefly algorithm (CFA) to optimize relay time coordination and tested it on a number of systems, obtaining better results [32]. Petrovic et al. added chaotic maps to the initialization of the fruit fly optimization algorithm and studied 10 different chaotic maps; the statistical results showed that the FOA was improved in terms of convergence speed and overall performance [33].
The tent mapping model generates a chaotic sequence for population initialization; its mathematical expression is:
$y_{i+1} = \begin{cases} y_i/\alpha, & 0 \le y_i < \alpha \\ (1 - y_i)/(1 - \alpha), & \alpha \le y_i \le 1 \end{cases}$ (9)
Theoretical research shows that the tent chaotic map can be represented by a Bernoulli shift transformation, as follows [29]:
$y_{i+1} = (2 y_i) \bmod 1$ (10)
where $y_i^j \in (0, 1)$ is a chaotic variable, i = 1, 2, …, n indexes the chaotic variables, and j = 1, 2, …, n indexes the population. In this paper, α = 0.7.
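A minimal sketch of tent-map population initialization follows. It assumes each chaotic sequence is iterated a few times from random seeds and then scaled into the search bounds; the number of warm-up iterations (10) is an illustrative choice, not taken from the paper.

```python
import numpy as np

def tent_init(n, d, lb, ub, alpha=0.7, seed=0):
    """Build an n x d initial population from tent-map chaotic sequences."""
    rng = np.random.default_rng(seed)
    y = rng.random((n, d))                # seeds in (0, 1)
    for _ in range(10):                   # iterate the tent map to mix the values
        y = np.where(y < alpha, y / alpha, (1 - y) / (1 - alpha))
    return lb + y * (ub - lb)             # scale chaotic values into [lb, ub]

pop = tent_init(n=30, d=3, lb=-100.0, ub=100.0)
```

The resulting positions stay within the box constraints while being spread more evenly than plain uniform sampling tends to be for small populations.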
Figure 5 and Figure 6, respectively, show the initial population of the GWO and IGWO when solving the sphere function. Among them, n = 30, d = 3, and the search space range is [−100, 100]. The red circle represents the global optimal point, and the blue circle represents the gray wolf population. In the initialization phase of the algorithm, gray wolves are randomly dispersed in the search space, and the more uniform the population is, the stronger the global exploration ability will be. It can be seen from Figure 5 and Figure 6 that the tent map makes the initial population distribution more uniform, enriches the diversity of the population, reduces the probability of the algorithm falling into local optima and improves the convergence accuracy.

3.2. Gaussian Perturbation

According to Equation (1), the coefficient C plays a decisive role in the GWO. Equation (3) shows that C is a random vector in [0, 2]; C provides random weights for the prey, which can increase (C > 1) or reduce (C < 1) the distance between the gray wolves and the prey. This randomness helps the GWO display random search behavior during optimization and thus avoid falling into local optima [16]. The position of the leading wolf plays an important role in guiding the pack toward the best solution; if the leading wolf's position falls into a local optimum, the search easily stagnates and the pack loses diversity. During the movement of the pack, the leading positions are generated randomly and are unpredictable. To avoid premature convergence and balance the global and local exploration capabilities, the originally random coefficient C is replaced by a Gaussian perturbation, which perturbs the leader and maintains the diversity of the population.
Figure 7 shows the random values generated by the original C and by the Gaussian distribution. It can be clearly seen that the random numbers generated by the Gaussian distribution cover a wider range, which more accurately captures the random walk behavior of the gray wolves in the GWO algorithm.
$X(t+1) = X_p(t) - A \cdot \lvert \mathrm{Gaussian}(\delta) \cdot X_p(t) - X(t) \rvert$ (11)
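The Gaussian-perturbed update above can be sketched as follows. The standard deviation sigma = 1.0 is an illustrative assumption, since the text does not specify the parameters of the Gaussian distribution.

```python
import numpy as np

def gaussian_encircle_step(X, X_p, t, max_iter, rng, sigma=1.0):
    """Encircling update with the random C replaced by a Gaussian sample."""
    a = 2 - 2 * t / max_iter
    A = 2 * a * rng.random(X.shape) - a
    G = rng.normal(0.0, sigma, X.shape)   # Gaussian perturbation instead of C
    return X_p - A * np.abs(G * X_p - X)

rng = np.random.default_rng(2)
X_new = gaussian_encircle_step(np.zeros(3), np.ones(3), t=250, max_iter=500, rng=rng)
```

Because Gaussian samples are unbounded, occasional large weights push the wolves farther from the current leader than C ∈ [0, 2] ever could, which is the diversity-preserving effect described above.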

3.3. Cosine Control Factor

In the GWO, the coefficient vector A balances the global and local search capabilities. When |A| > 1, the gray wolves expand the search scope, i.e., global search; when |A| < 1, the gray wolves shrink the search scope to attack their prey, i.e., local search. In the early stage of optimization, gray wolf individuals should be widely distributed over the entire search space; in the late stage, they should converge to the global optimum using the collected information. According to Equation (2), the attenuation factor a affects the coefficient vector A, which in turn affects the balance between the GWO's exploration and exploitation capacity. When a > 1 (the exploration stage), the gray wolves engage in both searching and hunting, and when a < 1 (the exploitation stage), the gray wolves engage only in hunting.
In the GWO, the control parameter a decreases linearly from 2 to 0 with the number of iterations, but an individual gray wolf does not move linearly while searching for prey, so a linearly declining a cannot fully reflect the actual search process. As can be seen from Figure 8 and Figure 9, the improved a and A better reflect the actual search process.
Therefore, inspired by reference [34], the linearly changing control factor a is replaced by a cosine-changing control factor, expressed as follows:
$a = 2\cos\left(\frac{\pi}{2} \cdot \frac{t}{Max\_iter}\right)$ (12)
The inertia weight factor is a very important parameter [35]. When the inertia weight is large, the algorithm has a strong global searching ability, which can expand the search range. When the inertia weight is small, the local search ability of the algorithm is strong, and it can search around the optimal solution and accelerate the convergence speed.
Combined with the cosine change of the control parameter a above, this paper introduces a weight cosine control factor B(t), which changes synchronously with a, into the position update of the GWO to further enhance the global exploration capability. As the iterations increase, the adjustment step becomes smaller, the global search ability gradually weakens, and the local search ability gradually strengthens. When B(t) is very small, the positions of the population are fine-tuned rather than pulled toward the origin.
$B(t) = \cos\left(\frac{\pi}{2} \cdot \frac{t}{Max\_iter}\right)$ (13)
$X_1 = X_\alpha - B(t) \cdot A_1 \cdot \lvert C_1 \cdot X_\alpha - X \rvert$ (14)
$X_2 = X_\beta - B(t) \cdot A_2 \cdot \lvert C_2 \cdot X_\beta - X \rvert$ (15)
$X_3 = X_\delta - B(t) \cdot A_3 \cdot \lvert C_3 \cdot X_\delta - X \rvert$ (16)
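A small sketch of the cosine control factor a and the synchronized weight B(t) described above, showing their values at the start and end of a run (names are illustrative):

```python
import numpy as np

def cosine_factors(t, max_iter):
    """Cosine-decaying control factor a and synchronized weight B(t)."""
    ratio = t / max_iter
    a = 2 * np.cos(np.pi / 2 * ratio)   # decays nonlinearly from 2 to 0
    B = np.cos(np.pi / 2 * ratio)       # weight multiplying each leader pull
    return a, B

a0, B0 = cosine_factors(0, 500)         # start of the run: a = 2, B = 1
aT, BT = cosine_factors(500, 500)       # end of the run: both tend to 0
```

Compared with the linear schedule, the cosine decay keeps a (and hence the step size) large for longer early on and shrinks it faster near the end, which matches the intended exploration-then-exploitation behavior.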

4. The Simulation Results

4.1. Optimization Function and Experimental Environment

In this section, to analyze the performance of the IGWO algorithm, 7 unimodal test functions, 6 multimodal functions and 10 fixed-dimension multimodal functions from reference [15] are selected. Table 1, Table 2 and Table 3 show the names, expressions, search spaces and optimal values of these functions. Among them, F1–F7 are unimodal functions, F8–F13 are multimodal functions and F14–F23 are fixed-dimension multimodal functions. The unimodal functions are mainly used to assess the solution accuracy and convergence speed of the IGWO, and the multimodal functions are mainly used to assess its global search ability.
To ensure a fair comparison, the 8 selected algorithms all use the same experimental parameters: swarm size n = 30, dimension d = 30 and maximum iterations Max_iter = 500; each algorithm is run 30 times independently and the results are recorded. In PSO, c1 = c2 = 2 and w = 0.7294; in BA, A = 0.5 and r = 0.5; in FPA, p = 0.2. In the simulation experiment, the hardware configuration is a Windows 10 Professional operating system with an Intel® Celeron® CPU N3060 @ 1.60 GHz processor and 4 GB of memory, and the software environment is MATLAB R2016a.

4.2. Analysis of Different Strategies

Different improvement strategies have different effects on the optimization results, and their contributions to the whole improved algorithm are also different. In order to analyze the effectiveness of the improved strategy of the IGWO algorithm, three strategies of Tent map, Gaussian distribution and cosine control factor are combined with the basic GWO, respectively, to study the influence of each strategy on the effectiveness of the algorithm. GWO1 means the combination of the GWO algorithm and tent map, GWO2 means the combination of the GWO algorithm and Gaussian distribution, and GWO3 means the combination of the GWO algorithm and cosine control factor. The test results are shown in Table 4 below.
From Table 4, it can be seen that different strategies can improve the performance of the GWO algorithm, and GWO2 plays a greater role in improving the performance of the GWO algorithm. Overall, the combination of different strategies makes the performance of the IGWO algorithm greatly improved.

4.3. Analysis of Experimental Results

4.3.1. Compared with Other Algorithms

To further evaluate the performance of the IGWO, the IGWO algorithm was tested on 23 benchmark functions, and the results were compared with 7 algorithms: GWO, SCA, MFO, PSO, BA, FPA and SSA. The average values and standard deviations over 30 runs of each algorithm are used as the evaluation criteria, and the most accurate solution is bolded in each table. The experimental results are shown in Table 5, Table 6 and Table 7.
The average values and standard deviations in Table 5, Table 6 and Table 7 reflect the convergence accuracy and optimization capability of the IGWO. For the 7 unimodal functions, the IGWO performs better in accuracy and standard deviation when solving F1, F2, F3, F4 and F6, although the optimization accuracy does not reach the theoretical optimum of 0. For the 6 multimodal functions, the optimization accuracy reaches the theoretical optimum of 0 with a standard deviation of 0 when solving F9 and F11, which fully demonstrates the algorithm's solution accuracy and strong robustness. Meanwhile, compared with the other optimization algorithms, the IGWO also achieves a better value when solving F10. For the 10 fixed-dimension multimodal functions, the IGWO obtains a better value than the other algorithms on F15, and the results on the remaining functions differ little from those of the comparison algorithms. It can be seen that the IGWO has advantages in solving unimodal, multimodal and fixed-dimension multimodal functions.

4.3.2. Convergence Analysis

The convergence rate is an important index of algorithm performance, and convergence curves allow a direct comparison of convergence behavior. Figure 10a–w shows the fitness convergence curves of the 8 algorithms (IGWO, GWO, SCA, MFO, PSO, BA, FPA and SSA) when solving the 23 different types of test functions. The convergence curves are plotted from the values of a single random run with dimension d = 30.
According to Figure 10a–w, when solving the unimodal test functions, the convergence curve of the IGWO algorithm descends toward the lower right corner as the iterations increase, and its convergence accuracy is higher than that of the other algorithms. Among the multimodal test functions, the IGWO performs better on every function except F8; in particular, on F9 and F11 it quickly finds the optimal solution with fast convergence and high precision. For the fixed-dimension multimodal functions, most of the curves drop in a stepped way: the algorithm keeps exploring during the iterations, jumps out of local optima and looks for the global optimum, and its convergence accuracy is comparatively good.
In conclusion, the IGWO algorithm combining tent maps, Gaussian distribution and control factor can fully expand the search range in the early iteration, avoid the algorithm falling into local optima, and better balance the global and local search capabilities.

4.3.3. Numerical Result Test

In this section, the Wilcoxon rank-sum test [36] is carried out on the results of 30 independent runs, and the Friedman test is carried out on the average values in Table 5, Table 6 and Table 7. The results were analyzed in IBM SPSS Statistics 21 with the significance level set at 0.05. Table 8 shows the Wilcoxon rank-sum test results for the IGWO algorithm; bold data indicate no significant difference between a comparison algorithm and the IGWO. Table 9 shows the ranks generated by the Friedman test from the mean values in Table 5, Table 6 and Table 7.
From Table 8, it can be concluded that the proposed IGWO significantly outperforms algorithms GWO, SCA, MFO, PSO, BA, FPA and SSA.
According to the results in Table 9, the rank means of the 8 algorithms are 2.39 (IGWO), 3.22 (GWO), 3.93 (SCA), 5.70 (MFO), 4.96 (PSO), 5.73 (BA), 5.61 (FPA) and 4.37 (SSA), giving the priority order IGWO > GWO > SCA > SSA > PSO > FPA > MFO > BA. The pairwise comparisons of the IGWO with SCA (p = 0.011), MFO (p = 0.001), PSO (p = 0.002), BA (p < 0.001), FPA (p = 0.01) and SSA (p = 0.005) all yield p < 0.05, indicating statistically significant differences between the IGWO and each of these algorithms.

5. Application to Solve Engineering Optimization Problem

In this section, four constrained structural design problems in mechanical engineering are optimized to reduce the cost of engineering design and save resources. To handle the constraints, the penalty function method is used to construct the fitness function. For the objective problem f(x) with constraints g_i(x) ≤ 0 and h_j(x) = 0, a new objective function f̃(x) is constructed as shown in Equation (17). Both β1 and β2 are sufficiently large positive numbers (e.g., 10^12); m and l are the numbers of inequality and equality constraints, and the parameters α1 and α2 are set to 2 and 1, respectively. This method excludes from the candidates all solutions that do not satisfy the constraints; if a solution satisfies the constraints, then f̃(x) = f(x). When solving these engineering optimization problems, the population size is n = 30 and the maximum number of iterations is Max_iter = 500, and the results in the tables are the best values obtained over 30 independent runs. Bold values indicate the best among all methods.
$\tilde{f}(x) = f(x) + \sum_{i=1}^{m} \beta_1 \cdot \left[\max\left(0, g_i(x)\right)\right]^{\alpha_1} + \sum_{j=1}^{l} \beta_2 \cdot \lvert h_j(x) \rvert^{\alpha_2}$ (17)
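A minimal sketch of this penalty construction on a toy one-dimensional problem (the toy problem itself is illustrative, not from the paper):

```python
def penalized_fitness(f, g_list, h_list, x,
                      beta1=1e12, beta2=1e12, alpha1=2, alpha2=1):
    """Static penalty of Equation (17): infeasible points pay a large surcharge."""
    penalty = sum(beta1 * max(0.0, g(x)) ** alpha1 for g in g_list)
    penalty += sum(beta2 * abs(h(x)) ** alpha2 for h in h_list)
    return f(x) + penalty

# Toy problem: minimize f(x) = x^2 subject to g(x) = 1 - x <= 0.
f = lambda x: x ** 2
g = lambda x: 1 - x
feasible = penalized_fitness(f, [g], [], 2.0)     # g(2) <= 0, so value is f(2)
infeasible = penalized_fitness(f, [g], [], 0.0)   # g(0) = 1 > 0, heavily penalized
```

Feasible points keep their raw objective value, while any constraint violation dominates the fitness, so the swarm is steered back into the feasible region.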

5.1. Pressure Vessel Design Problem

Figure 11 shows that the objective of the pressure vessel design problem is to minimize the sum of the material, forming and welding costs while satisfying constraints on the design variables: shell thickness Ts (x1), head thickness Th (x2), inner radius R (x3) and cylinder length L (x4). Equation (18) gives the mathematical model.
$\min f(x) = 0.6224 x_1 x_3 x_4 + 1.7781 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3$ (18)
Subjected to:
$g_1(x) = -x_1 + 0.0193 x_3 \le 0$
$g_2(x) = -x_2 + 0.00954 x_3 \le 0$
$g_3(x) = -\pi x_3^2 x_4 - \frac{4}{3}\pi x_3^3 + 1{,}296{,}000 \le 0$
$g_4(x) = x_4 - 240 \le 0$
$x_1, x_2 \in [0.0625, 99 \times 0.0625]$
$x_3, x_4 \in [10, 200]$
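For reference, the objective and constraints can be coded directly. The sketch below uses the coefficients of the standard pressure vessel benchmark (0.00954 in g2 and 19.84·x1²·x3 in the cost); the test design is a near-optimal point commonly reported in the literature for this benchmark, not a result from this paper.

```python
import math

def pv_cost(x):
    """Pressure vessel cost of Equation (18); x = (Ts, Th, R, L)."""
    x1, x2, x3, x4 = x
    return (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3 ** 2
            + 3.1661 * x1 ** 2 * x4 + 19.84 * x1 ** 2 * x3)

def pv_constraints(x):
    """Constraint values; a design is feasible when every entry is <= 0."""
    x1, x2, x3, x4 = x
    return [-x1 + 0.0193 * x3,
            -x2 + 0.00954 * x3,
            -math.pi * x3 ** 2 * x4 - (4.0 / 3.0) * math.pi * x3 ** 3 + 1296000.0,
            x4 - 240.0]

# A near-optimal design often cited for the continuous version of this benchmark.
x = (0.778169, 0.384649, 40.3196, 200.0)
cost = pv_cost(x)
```

Plugging such a point into both functions is a quick sanity check of any reimplementation before running the optimizer.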
In order to objectively demonstrate the performance of the IGWO, the MVO in [37], GSA algorithm in [38], PSO algorithm in [39], MSCA algorithm in [40], GA (Coello) algorithm in [41], GA (Coello and Montes) algorithm in [42], GA (Deb et al.) algorithm in [43], ES algorithm in [44], DE (Huang et al.) algorithm in [45], ACO (Kaveh et al.) algorithm in [46], IHS algorithm in [47] and WOA algorithm in [48] are selected. Table 10 suggests that the IGWO finds a design with the minimum cost for this problem.
According to the results of Table 10, the IGWO, obtaining the optimum cost of 5888.6000, found the best feasible optimal design among all these algorithms. Therefore, it can be concluded that the IGWO is excellent in solving the pressure vessel design problem.

5.2. Spring Design Problem

This subsection tests the optimization of the spring design problem. When a spring is applied in engineering, its weight must be minimized to reduce the waste of materials, subject to conditions on factors such as deflection, shear stress, surge frequency and outer diameter. A schematic diagram of the spring design structure is shown in Figure 12; the decision variables are the wire diameter d (x1), the mean coil diameter D (x2) and the number of active windings P (x3). For this problem, the mathematical model of the objective function and constraint functions is given in Equation (19).
\mathrm{Min}\; f(x) = x_1^2 x_2 (x_3 + 2)    (19)
Subjected to:
g_1(x) = 1 - \dfrac{x_2^3 x_3}{71{,}785\, x_1^4} \le 0
g_2(x) = \dfrac{4 x_2^2 - x_1 x_2}{12{,}566\,(x_2 x_1^3 - x_1^4)} + \dfrac{1}{5108\, x_1^2} - 1 \le 0
g_3(x) = 1 - \dfrac{140.45\, x_1}{x_2^2 x_3} \le 0
g_4(x) = \dfrac{x_1 + x_2}{1.5} - 1 \le 0
x_1 \in [0.05,\; 2], \quad x_2 \in [0.25,\; 1.3], \quad x_3 \in [2,\; 15]
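The spring model can likewise be sketched as a small evaluation routine; the function name and the sample design are ours, and the coefficients follow the standard statement of this benchmark:

```python
def spring(x):
    """Weight and constraints of the tension/compression spring, Eq. (19).
    x = [d, D, P] = wire diameter, mean coil diameter, active windings;
    a design is feasible when every g_i(x) <= 0."""
    x1, x2, x3 = x
    weight = (x3 + 2.0) * x2 * x1 ** 2
    g = [
        1.0 - x2 ** 3 * x3 / (71785.0 * x1 ** 4),              # deflection
        (4.0 * x2 ** 2 - x1 * x2) / (12566.0 * (x2 * x1 ** 3 - x1 ** 4))
            + 1.0 / (5108.0 * x1 ** 2) - 1.0,                  # shear stress
        1.0 - 140.45 * x1 / (x2 ** 2 * x3),                    # surge frequency
        (x1 + x2) / 1.5 - 1.0,                                 # outer diameter
    ]
    return weight, g

# A near-optimal design frequently quoted for this benchmark:
weight, g = spring([0.051749, 0.358179, 11.203763])
```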
In order to objectively demonstrate the performance of the IGWO, the mathematical optimization method (Belegundu) algorithm in [49], GSA algorithm in [38], GSA algorithm in [46], SCA algorithm in [6], and MVO algorithm in [37] are selected. The comparison results are shown in Table 11.
As can be seen from Table 11, the IGWO outperforms other comparison algorithms except for the GWO. However, the difference between IGWO and GWO is small.

5.3. Welded Beam Design Problem

In practical engineering applications, a loaded structural member can be described as a welded beam design problem: one end of the beam is fixed to the wall, with no axial displacement, no rotation and no vertical displacement, while the other end is free. As shown in Figure 13, this problem is very widespread in engineering design. The mathematical model is given in Equation (20), where τ(x) is the shear stress at the joint, σ(x) the bending stress of the member, P_c(x) the buckling load of the member and δ(x) the deflection at the end of the member. The optimization goal of the welded beam design problem is to minimize the total manufacturing cost while the constraints are met, that is, to reduce the waste of resources.
\mathrm{Min}\; f(x) = 1.10471\, x_1^2 x_2 + 0.04811\, x_3 x_4 (14 + x_2)    (20)
Subjected to:
g_1(x) = \tau(x) - \tau_{\max} \le 0
g_2(x) = \sigma(x) - \sigma_{\max} \le 0
g_3(x) = x_1 - x_4 \le 0
g_4(x) = 0.125 - x_1 \le 0
g_5(x) = \delta(x) - 0.25 \le 0
g_6(x) = P - P_c(x) \le 0
g_7(x) = 0.10471\, x_1^2 x_2 + 0.04811\, x_3 x_4 (14 + x_2) - 5 \le 0
0.1 \le x_1, x_4 \le 2; \quad 0.1 \le x_2, x_3 \le 10
where
\tau(x) = \sqrt{\tau_1^2 + 2\tau_1 \tau_2 \dfrac{x_2}{2R} + \tau_2^2}
\tau_1 = \dfrac{P}{\sqrt{2}\, x_1 x_2}
\tau_2 = \dfrac{MR}{J}
M = P\left(L + \dfrac{x_2}{2}\right)
J(x) = 2\left\{\sqrt{2}\, x_1 x_2 \left[\dfrac{x_2^2}{12} + \left(\dfrac{x_1 + x_3}{2}\right)^2\right]\right\}
R = \sqrt{\dfrac{x_2^2}{4} + \left(\dfrac{x_1 + x_3}{2}\right)^2}
\sigma(x) = \dfrac{6PL}{x_4 x_3^2}
\delta(x) = \dfrac{4PL^3}{E x_3^3 x_4}
P_c(x) = \dfrac{4.013\, E \sqrt{x_3^2 x_4^6 / 36}}{L^2}\left(1 - \dfrac{x_3}{2L}\sqrt{\dfrac{E}{4G}}\right)
G = 12 \times 10^6, \quad E = 30 \times 10^6, \quad P = 6000, \quad L = 14
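The full evaluation can be sketched as below; the function name and the sample design are ours, and the stress/deflection limits (τ_max = 13,600, σ_max = 30,000, δ_max = 0.25) are the values commonly used with this benchmark rather than constants stated explicitly in the text:

```python
import math

def welded_beam(x, P=6000.0, L=14.0, E=30e6, G=12e6,
                tau_max=13600.0, sigma_max=30000.0, delta_max=0.25):
    """Cost and constraints of the welded beam problem, Eq. (20).
    x = [h, l, t, b]; a design is feasible when every g_i(x) <= 0."""
    x1, x2, x3, x4 = x
    cost = 1.10471 * x1 ** 2 * x2 + 0.04811 * x3 * x4 * (14.0 + x2)

    tau1 = P / (math.sqrt(2.0) * x1 * x2)                  # primary shear stress
    M = P * (L + x2 / 2.0)                                 # bending moment
    R = math.sqrt(x2 ** 2 / 4.0 + ((x1 + x3) / 2.0) ** 2)
    J = 2.0 * (math.sqrt(2.0) * x1 * x2
               * (x2 ** 2 / 12.0 + ((x1 + x3) / 2.0) ** 2))
    tau2 = M * R / J                                       # secondary shear stress
    tau = math.sqrt(tau1 ** 2
                    + 2.0 * tau1 * tau2 * x2 / (2.0 * R) + tau2 ** 2)
    sigma = 6.0 * P * L / (x4 * x3 ** 2)                   # bending stress
    delta = 4.0 * P * L ** 3 / (E * x3 ** 3 * x4)          # end deflection
    Pc = (4.013 * E * math.sqrt(x3 ** 2 * x4 ** 6 / 36.0) / L ** 2
          * (1.0 - x3 / (2.0 * L) * math.sqrt(E / (4.0 * G))))

    g = [tau - tau_max, sigma - sigma_max, x1 - x4, 0.125 - x1,
         delta - delta_max, P - Pc,
         0.10471 * x1 ** 2 * x2 + 0.04811 * x3 * x4 * (14.0 + x2) - 5.0]
    return cost, g

# A frequently cited near-optimal design for this problem:
cost, g = welded_beam([0.205730, 3.470489, 9.036624, 0.205730])
```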
The IGWO was used to solve the welded beam problem. The comparison algorithms include the NGS-WOA [50], WOA [48], RO [51], MVO [37], CPSO [52], CPSO [53], HS [54], GSA [38], GA [55], GA [56], Coello [40], and Coello and Montes [41] algorithms. The comparison results are shown in Table 12.
Table 12 reports the results of each algorithm on the welded beam design problem. Compared with the basic GWO algorithm, the IGWO obtains a better solution. Judging from the results of the other algorithms, the IGWO is superior to all of them except the MVO.

5.4. Three Truss Design Problem

The three-truss design problem is a nonlinear optimization problem with three nonlinear inequality constraints and two continuous decision variables, with the goal of minimizing the cost. The variables to be optimized are the cross-sectional area of rods 1 and 3 (x1) and the cross-sectional area of rod 2 (x2). The mathematical model is given in Equation (21), and the schematic diagram is shown in Figure 14.
\mathrm{Min}\; f(x) = (2\sqrt{2}\, x_1 + x_2) \times l    (21)
Subjected to:
g_1(x) = \dfrac{\sqrt{2}\, x_1 + x_2}{\sqrt{2}\, x_1^2 + 2 x_1 x_2}\, p - \delta \le 0
g_2(x) = \dfrac{x_2}{\sqrt{2}\, x_1^2 + 2 x_1 x_2}\, p - \delta \le 0
g_3(x) = \dfrac{1}{\sqrt{2}\, x_2 + x_1}\, p - \delta \le 0
0 \le x_1, x_2 \le 1, \quad l = 100\ \mathrm{cm}, \quad p = 2\ \mathrm{kN/cm^2}, \quad \delta = 2\ \mathrm{kN/cm^2}
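Because the model is so compact, it is easy to verify candidate designs directly; the function name and the sample design below are ours:

```python
import math

def three_bar_truss(x, l=100.0, p=2.0, sigma=2.0):
    """Objective and constraints of the three-truss problem, Eq. (21).
    x = [x1, x2]; a design is feasible when every g_i(x) <= 0."""
    x1, x2 = x
    volume = (2.0 * math.sqrt(2.0) * x1 + x2) * l
    denom = math.sqrt(2.0) * x1 ** 2 + 2.0 * x1 * x2
    g = [
        (math.sqrt(2.0) * x1 + x2) / denom * p - sigma,  # stress constraint 1
        x2 / denom * p - sigma,                          # stress constraint 2
        1.0 / (math.sqrt(2.0) * x2 + x1) * p - sigma,    # stress constraint 3
    ]
    return volume, g

# The best design reported in the literature for this benchmark is close to:
volume, g = three_bar_truss([0.78867, 0.40825])
```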
In order to objectively demonstrate the performance of the IGWO, the PSO-DE in [39], MBA algorithm in [57], DEDS algorithm in [58], CS algorithm in [59], Ray and Saini algorithm in [60], and Tsai algorithm in [61] are selected. Table 13 shows the comparison of solution results between the IGWO algorithm and the other algorithms.
It can be seen from Table 13 that the solution of the IGWO algorithm is slightly inferior to that of the Tsai algorithm. However, compared with the other algorithms, the IGWO algorithm still has some advantages. Therefore, the IGWO algorithm has certain practicability in the engineering design of three-truss structures.

6. Conclusions and Future Work

To save materials and cost in the process of engineering construction, an improved gray wolf optimization algorithm termed IGWO is proposed. It combines tent chaos, Gaussian perturbation and a cosine control factor to better promote the conservation and rational use of resources and to alleviate the contradiction between resource supply and socioeconomic development. Twenty-three unimodal, multimodal and fixed-dimensional multimodal functions with different characteristics are tested, and the results are compared with seven algorithms. The experimental results show that the optimization performance and stability of the IGWO are superior to those of the seven different types of comparison algorithms. Then four challenging engineering constrained optimization problems with different objective functions, constraint conditions and properties, including pressure vessel design, spring design, welded beam design and truss design, are solved. Meanwhile, the Wilcoxon rank-sum test and the Friedman test were used to evaluate the results of the IGWO algorithm. The experimental results show that the IGWO is more competitive than the other comparison algorithms and can be an effective tool to solve engineering design problems and save resources.
For the study of this paper, the results show that the IGWO was successfully implemented to solve the practical constraints of engineering design problems. Additionally, the IGWO is able to provide very competitive results in terms of minimizing total cost. It was observed that the IGWO has the ability to converge to a better quality near-optimal solution and possesses better convergence characteristics than other prevailing techniques reported in the literature. It is also clear from the results obtained by different functions that the IGWO shows a good balance between exploration and exploitation that results in high local optima avoidance. It can be concluded that the IGWO is an effective and efficient algorithm.
However, the IGWO algorithm still has some shortcomings in some aspects, for example, it cannot be effectively applied to all problems, and the convergence accuracy can be further improved. Therefore, research on the IGWO still has much room for improvement. Our future work is divided into two parts: (1) to apply the proposed method to more practical cases to exert its full effective merits, especially in the areas of engineering optimization control; and (2) to reconstruct the proposed method as a fusion procedure with other metaheuristics to further investigate its performance thoroughly.

Author Contributions

J.L.: methodology. X.L.: resources, writing—original draft. Y.L.: writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This study is supported by the National Natural Science Foundation of China (No. 71601071), the Science & Technology Program of Henan Province, China (No. 182102310886 and 162102110109), and an MOE Youth Foundation Project of Hu-manities and Social Sciences (No. 15YJC630079).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The study did not report any data.

Acknowledgments

This study is supported by the National Natural Science Foundation of China (No. 71601071), the Science and Technology Program of Henan Province, China (No. 182102310886 and 162102110109), and a MOE Youth Foundation Project of Humanities and Social Sciences (No. 15YJC630079).

Conflicts of Interest

There are no conflicts to declare.

References

  1. Begum, R.A.; Siwar, C.; Pereira, J.J.; Jaafar, A.H. A benefit–cost analysis on the economic feasibility of construction waste minimisation: The case of Malaysia. Resour. Conserv. Recycl. 2006, 48, 86–98. [Google Scholar] [CrossRef]
  2. Huang, J.; Zhu, Y.; Kelly, J.T.; Jang, C.; Wang, S.; Xing, J.; Yu, L. Large-scale optimization of multi-pollutant control strategies in the Pearl River Delta region of China using a genetic algorithm in machine learning. Sci. Total Environ. 2020, 722, 137701. [Google Scholar] [CrossRef] [PubMed]
  3. Guo-Chu, C.; Jin-Shou, Y.U. Particle swarm optimization algorithm. Inf. Control 2005, 186, 454–458. [Google Scholar]
  4. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl. Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  5. Mirjalili, S. The Ant Lion Optimizer. Adv. Eng. Softw. 2015, 83, 80–98. [Google Scholar] [CrossRef]
  6. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl. Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  7. Gandomi, A.H.; Yang, X.-S. Chaotic bat algorithm. J. Comput. Sci. 2014, 5, 224–232. [Google Scholar] [CrossRef]
  8. Yang, X.-S. Flower Pollination Algorithm for Global Optimization. In International Conference on Unconventional Computing and Natural Computation; Springer: Berlin, Heidelberg, 2012; Volume 7445, pp. 240–249. [Google Scholar]
  9. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  10. Afshar, M.H. Extension of the constrained particle swarm optimization algorithm to optimal operation of mul-ti-reservoirs system. Int. J. Electr. Power Energy Syst. 2013, 51, 71–81. [Google Scholar] [CrossRef]
  11. Kim, N.S.; Janic, M.; Van Wee, B. Trade-off between carbon dioxide emissions and logistics costs based on multi-objective optimization. Transp. Res. Rec. 2009, 2139, 107–116. [Google Scholar] [CrossRef] [Green Version]
  12. Lin, W.; Yu, D.Y.; Zhang, C.; Liu, X.; Zhang, S.; Tian, Y.; Xie, Z. A multi-objective teaching−learning-based opti-mization algorithm to scheduling in turning processes for minimizing makespan and carbon footprint. J. Clean. Prod. 2015, 101, 337–347. [Google Scholar] [CrossRef]
  13. He, Y.; Li, Y.; Wu, T.; Sutherland, J.W. An energy-responsive optimization method for machine tool selection and operation sequence in flexible machining job shops. J. Clean. Prod. 2015, 87, 245–254. [Google Scholar] [CrossRef]
  14. Du, B.; Zhang, J.F.; Gao, Z.H.; Li, T.; Huang, Z.Q.; Zhang, N. Based on simulated annealing particle swarm algorithm of optimal allocation of water resources research. J. Drain. Irrig. Mach. Eng. 2020, 1–10. Available online: http://kns.cnki.net/kcms/detail/32.1814.th.20200927.0952.002.html (accessed on 1 February 2021).
  15. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  16. Maharana, D.; Kotecha, P. Optimization of Job Shop Scheduling Problem with Grey Wolf Optimizer and JAYA Al-gorithm. In Smart Innovations in Communication and Computational Sciences; Springer: Singapore, Singapore, 2019; Volume 669, pp. 47–58. [Google Scholar]
  17. Precup, R.-E.; David, R.-C.; Petriu, E.M. Grey Wolf Optimizer Algorithm-Based Tuning of Fuzzy Control Systems With Reduced Parametric Sensitivity. IEEE Trans. Ind. Electron. 2016, 64, 527–534. [Google Scholar] [CrossRef]
  18. Sharma, Y.; Saikia, L.C. Automatic generation control of a multi-area ST—Thermal power system using Grey Wolf Optimizer algorithm based classical controllers. Int. J. Electr. Power Energy Syst. 2015, 73, 853–862. [Google Scholar] [CrossRef]
  19. Shakarami, M.; Davoudkhani, I.F. Wide-area power system stabilizer design based on Grey Wolf Optimization algorithm considering the time delay. Electr. Power Syst. Res. 2016, 133, 149–159. [Google Scholar] [CrossRef]
  20. Yao, X.; Li, Z.; Liu, L.; Cheng, X. Multi-Threshold Image Segmentation Based on Improved Grey Wolf Optimization Algorithm. IOP Conf. Series: Earth Environ. Sci. 2019, 252, 042105. [Google Scholar] [CrossRef] [Green Version]
  21. Yang, J.C.; Long, W. Improved Grey Wolf Optimization Algorithm for Constrained Mechanical Design Problems. Appl. Mech. Mater. 2016, 851, 553–558. [Google Scholar] [CrossRef]
  22. Chandar, S.K. Grey Wolf optimization-Elman neural network model for stock price prediction. Soft Comput. 2021, 25, 649–658. [Google Scholar] [CrossRef]
  23. Bansal, J.C.; Singh, S. A better exploration strategy in Grey Wolf Optimizer. J. Ambient. Intell. Humaniz. Comput. 2021, 12, 1099–1118. [Google Scholar] [CrossRef]
  24. Wang, J.-S.; Li, S.-X. An Improved Grey Wolf Optimizer Based on Differential Evolution and Elimination Mechanism. Sci. Rep. 2019, 9, 1–21. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Teng, Z.-J.; Lv, J.-L.; Guo, L.-W. An improved hybrid grey wolf optimization algorithm. Soft Comput. 2019, 23, 6617–6631. [Google Scholar] [CrossRef]
  26. Zhang, S.; Luo, Q.; Zhou, Y. Hybrid Grey Wolf Optimizer Using Elite Opposition-Based Learning Strategy and Simplex Method. Int. J. Comput. Intell. Appl. 2017, 16, 1750012. [Google Scholar] [CrossRef]
  27. Tian, D.P. Particle Swarm Optimization Based on Tent Chaotic Sequences. Comput. Eng. 2010, 4, 180–182. [Google Scholar]
  28. Shan, L.; Qiang, H.; Li, J.; Wang, Z. Chaos optimization algorithm based on Tent mapping. Control Decis. 2005, 2, 179–182. [Google Scholar]
  29. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995. [Google Scholar]
  30. Li, C.; Luo, G.; Qin, K.; Li, C. An image encryption scheme based on chaotic tent map. Nonlinear Dyn. 2017, 87, 127–133. [Google Scholar] [CrossRef]
  31. Batra, I.; Ghosh, S. An Improved Tent Map-Adaptive Chaotic Particle Swarm Optimization (ITM-CPSO)-Based Novel Approach Toward Security Constraint Optimal Congestion Management. Iran. J. Sci. Technol. Trans. Electr. Eng. 2018, 42, 261–289. [Google Scholar] [CrossRef]
  32. Gokhale, S.; Kale, V. An application of a tent map initiated Chaotic Firefly algorithm for optimal overcurrent relay coordination. Int. J. Electr. Power Energy Syst. 2016, 78, 336–342. [Google Scholar] [CrossRef]
  33. Mitić, M.; Vuković, N.; Petrović, M.; Miljković, Z. Chaotic fruit fly optimization algorithm. Knowl. Based Syst. 2015, 89, 446–458. [Google Scholar] [CrossRef]
  34. Huang, Q.; Li, J.; Song, C.; Xu, C.; Lin, X. A whale optimization algorithm based on cosine control factor and polynomial variation. Control Decis. 2020, 35, 50–59. [Google Scholar]
  35. Chatterjee, A.; Siarry, P. Nonlinear inertia weight variation for dynamic adaptation in particle swarm optimization. Comput. Oper. Res. 2006, 33, 859–871. [Google Scholar] [CrossRef]
  36. Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2011, 1, 3–18. [Google Scholar] [CrossRef]
  37. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2016, 27, 495–513. [Google Scholar] [CrossRef]
  38. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  39. Chen, H.; Wang, M.; Zhao, X. A multi-strategy enhanced sine cosine algorithm for global optimization and con-strained practical engineering problems. Appl. Math. Comput. 2020, 369, 124872. [Google Scholar]
  40. Coello, C.A.C.; Montes, E.M. Constraint-handling in genetic algorithms through the use of dominance-based tournament selection. Adv. Eng. Inform. 2002, 16, 193–203. [Google Scholar] [CrossRef]
  41. Coello, C.A.C. Use of a self-adaptive penalty approach for engineering optimization problems. Comput. Ind. 2000, 41, 113–127. [Google Scholar] [CrossRef]
  42. Deb, K. GeneAS: A Robust Optimal Design Technique for Mechanical Component Design. In Evolutionary Algorithms in Engineering Applications; Springer: Berlin, Heidelberg, 1997; pp. 497–514. [Google Scholar]
  43. Mezura-Montes, E.; Coello, C.A.C. An empirical study about the usefulness of evolution strategies to solve con-strained optimization problems. Int. J. Gen. Syst. 2008, 37, 443–473. [Google Scholar] [CrossRef]
  44. Huang, F.-Z.; Wang, L.; He, Q. An effective co-evolutionary differential evolution for constrained optimization. Appl. Math. Comput. 2007, 186, 340–356. [Google Scholar] [CrossRef]
  45. Kaveh, A.; Talatahari, S. An improved ant colony optimization for constrained engineering design problems. Eng. Comput. 2010, 27, 155–182. [Google Scholar] [CrossRef]
  46. Mahdavi, M.; Fesanghary, M.; Damangir, E. An improved harmony search algorithm for solving optimization problems. Appl. Math. Comput. 2007, 188, 1567–1579. [Google Scholar] [CrossRef]
  47. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  48. Belegundu, A.D.; Arora, J.S. A study of mathematical programming methods for structural optimization. Part I: Theory. Int. J. Numer. Methods Eng. 1985, 21, 1583–1599. [Google Scholar] [CrossRef]
  49. Zhang, J.; Wang, J.S. Improved Whale Optimization Algorithm Based on Nonlinear Adaptive Weight and Golden Sine Operator. IEEE Access 2020, 8, 77013–77048. [Google Scholar] [CrossRef]
  50. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray Optimization. Comput. Struct. 2012, 112–113, 283–294. [Google Scholar] [CrossRef]
  51. Krohling, R.A.; Coelho, L.D.S. Coevolutionary Particle Swarm Optimization Using Gaussian Distribution for Solving Constrained Optimization Problems. IEEE Trans. Syst. Man, Cybern. Part B 2006, 36, 1407–1416. [Google Scholar] [CrossRef]
  52. He, Q.; Wang, L. An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng. Appl. Artif. Intell. 2007, 20, 89–99. [Google Scholar] [CrossRef]
  53. Sun, W.-Z.; Wang, J.-S.; Wei, X. An Improved Whale Optimization Algorithm Based on Different Searching Paths and Perceptual Disturbance. Symmetry 2018, 10, 210. [Google Scholar] [CrossRef] [Green Version]
  54. Coello, C.A.C. Constraint-handling using an evolutionary multiobjective optimization technique. Civ. Eng. Environ. Syst. 2000, 17, 319–346. [Google Scholar] [CrossRef]
  55. Deb, K. An efficient constraint handling method for genetic algorithms. Comput. Methods Appl. Mech. Eng. 2000, 186, 311–338. [Google Scholar] [CrossRef]
  56. Coello, C.A.C. Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: A survey of the state of the art. Comput. Methods Appl. Mech. Eng. 2002, 191, 1245–1287. [Google Scholar] [CrossRef]
  57. Sadollah, A.; Bahreininejad, A.; Eskandar, H.; Hamdi, M. Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems. Appl. Soft Comput. 2013, 13, 2592–2612. [Google Scholar] [CrossRef]
  58. Zhang, M.; Luo, W.; Wang, X. Differential evolution with dynamic stochastic selection for constrained optimization. Inf. Sci. 2008, 178, 3043–3074. [Google Scholar] [CrossRef]
  59. Gandomi, A.H.; Yang, X.-S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35. [Google Scholar] [CrossRef]
  60. Ray, T.; Saini, P. Engineering design optimization using a swarm with an intelligent information sharing among individuals. Eng. Optim. 2001, 33, 735–748. [Google Scholar] [CrossRef]
  61. Tsai, J.-F. Global optimization of nonlinear fractional programming problems in engineering design. Eng. Optim. 2005, 37, 399–409. [Google Scholar] [CrossRef]
Figure 1. Hierarchy of wolves.
Figure 2. Position updating in the gray wolf optimization (GWO).
Figure 3. GWO algorithm optimization process.
Figure 4. Chaotic sequence distribution.
Figure 5. Random initialization.
Figure 6. Tent chaotic initialization.
Figure 7. Values of Gaussian perturbations.
Figure 8. Convergence curve of parameter a.
Figure 9. Value of control parameter A.
Figure 10. Evolution curve of the function (d = 30).
Figure 11. Pressure vessel design problem.
Figure 12. Spring design problem.
Figure 13. Welding beam problem.
Figure 14. The three truss design problem.
Table 1. Unimodal optimization function.
Number | Name | Benchmark | Dim | Range | f_min
F1 | Sphere | f1(x) = Σ_{i=1}^n x_i^2 | 30 | [−100, 100] | 0
F2 | Schwefel's problem 2.22 | f2(x) = Σ_{i=1}^n |x_i| + Π_{i=1}^n |x_i| | 30 | [−10, 10] | 0
F3 | Schwefel's problem 1.2 | f3(x) = Σ_{i=1}^n (Σ_{j=1}^i x_j)^2 | 30 | [−100, 100] | 0
F4 | Schwefel's problem 2.21 | f4(x) = max_i {|x_i|, 1 ≤ i ≤ n} | 30 | [−100, 100] | 0
F5 | Rosenbrock | f5(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − x_i^2)^2 + (x_i − 1)^2] | 30 | [−30, 30] | 0
F6 | Step | f6(x) = Σ_{i=1}^n (⌊x_i + 0.5⌋)^2 | 30 | [−100, 100] | 0
F7 | Noise | f7(x) = Σ_{i=1}^n i·x_i^4 + random[0, 1) | 30 | [−1.28, 1.28] | 0
Table 2. Multimodal optimization function.
Number | Name | Benchmark | Dim | Range | f_min
F8 | Generalized Schwefel's problem | f8(x) = Σ_{i=1}^n −x_i sin(√|x_i|) | 30 | [−500, 500] | −12,569.5
F9 | Rastrigin | f9(x) = Σ_{i=1}^n [x_i^2 − 10 cos(2πx_i) + 10] | 30 | [−5.12, 5.12] | 0
F10 | Ackley | f10(x) = 20 + e − 20 exp(−0.2 √((1/n) Σ_{i=1}^n x_i^2)) − exp((1/n) Σ_{i=1}^n cos(2πx_i)) | 30 | [−32, 32] | 0
F11 | Griewank | f11(x) = (1/4000) Σ_{i=1}^n x_i^2 − Π_{i=1}^n cos(x_i/√i) + 1 | 30 | [−600, 600] | 0
F12 | Generalized penalized function 1 | f12(x) = (π/n) {10 sin^2(πy_1) + Σ_{i=1}^{n−1} (y_i − 1)^2 [1 + 10 sin^2(πy_{i+1})] + (y_n − 1)^2} + Σ_{i=1}^n u(x_i, 10, 100, 4), where y_i = 1 + (x_i + 1)/4 and u(x_i, a, k, m) = k(x_i − a)^m if x_i > a; 0 if −a ≤ x_i ≤ a; k(−x_i − a)^m if x_i < −a | 30 | [−50, 50] | 0
F13 | Generalized penalized function 2 | f13(x) = 0.1 {sin^2(3πx_1) + Σ_{i=1}^{n−1} (x_i − 1)^2 [1 + sin^2(3πx_{i+1})] + (x_n − 1)^2 [1 + sin^2(2πx_n)]} + Σ_{i=1}^n u(x_i, 5, 100, 4) | 30 | [−50, 50] | 0
Table 3. Fixed dimensional multimodal optimization function.
Number | Name | Benchmark | Dim | Range | f_min
F14 | Shekel's foxholes function | f14(x) = [1/500 + Σ_{j=1}^{25} 1/(j + Σ_{i=1}^{2} (x_i − a_ij)^6)]^{−1} | 2 | [−65.536, 65.536] | 1
F15 | Kowalik's function | f15(x) = Σ_{i=1}^{11} [a_i − x_1(b_i^2 + b_i x_2)/(b_i^2 + b_i x_3 + x_4)]^2 | 4 | [−5, 5] | 0.0003
F16 | Six-hump camelback | f16(x) = 4x_1^2 − 2.1x_1^4 + x_1^6/3 + x_1 x_2 − 4x_2^2 + 4x_2^4 | 2 | [−5, 5] | −1.0316
F17 | Branin | f17(x) = (x_2 − (5.1/4π^2)x_1^2 + (5/π)x_1 − 6)^2 + 10(1 − 1/8π) cos(x_1) + 10 | 2 | [−5, 10] × [0, 15] | 0.39788
F18 | Goldstein–Price function | f18(x) = [1 + (x_1 + x_2 + 1)^2 (19 − 14x_1 + 3x_1^2 − 14x_2 + 6x_1 x_2 + 3x_2^2)] × [30 + (2x_1 − 3x_2)^2 (18 − 32x_1 + 12x_1^2 + 48x_2 − 36x_1 x_2 + 27x_2^2)] | 2 | [−2, 2] | 3
F19 | Hartmann 1 | f19(x) = −Σ_{i=1}^{4} c_i exp(−Σ_{j=1}^{3} a_ij (x_j − p_ij)^2) | 3 | [0, 1] | −3.86
F20 | Hartmann 2 | f20(x) = −Σ_{i=1}^{4} c_i exp(−Σ_{j=1}^{6} a_ij (x_j − p_ij)^2) | 6 | [0, 1] | −3.32
F21 | Shekel 1 | f21(x) = −Σ_{i=1}^{5} [(X − a_i)(X − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.1532
F22 | Shekel 2 | f22(x) = −Σ_{i=1}^{7} [(X − a_i)(X − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.4028
F23 | Shekel 3 | f23(x) = −Σ_{i=1}^{10} [(X − a_i)(X − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.5363
Table 4. Test the results of different strategies.
Function | GWO (ave / std) | GWO1 (ave / std) | GWO2 (ave / std) | GWO3 (ave / std) | IGWO (ave / std)
F1 | 1.55 × 10^−27 / 2.95 × 10^−27 | 1.18 × 10^−30 / 2.15 × 10^−30 | 1.72 × 10^−38 / 6.97 × 10^−38 | 3.25 × 10^−28 / 4.78 × 10^−28 | 8.34 × 10^−40 / 2.35 × 10^−39
F2 | 9.89 × 10^−17 / 9.40 × 10^−17 | 8.14 × 10^−17 / 4.84 × 10^−18 | 9.97 × 10^−19 / 1.92 × 10^−18 | 1.03 × 10^−17 / 6.89 × 10^−18 | 9.96 × 10^−24 / 1.23 × 10^−23
F3 | 2.47 × 10^−5 / 6.57 × 10^−5 | 3.03 × 10^−5 / 7.30 × 10^−5 | 3.19 × 10^−10 / 1.70 × 10^−9 | 3.74 × 10^−5 / 7.15 × 10^−5 | 5.95 × 10^−8 / 9.72 × 10^−8
F4 | 7.54 × 10^−7 / 1.16 × 10^−6 | 8.47 × 10^−9 / 9.94 × 10^−9 | 5.21 × 10^−15 / 1.04 × 10^−14 | 7.45 × 10^−8 / 5.55 × 10^−7 | 1.78 × 10^−11 / 3.00 × 10^−11
F5 | 2.73 × 10^1 / 7.53 × 10^−1 | 2.98 × 10^2 / 2.46 × 10^−1 | 2.98 × 10^2 / 2.39 × 10^−1 | 2.98 × 10^2 / 2.09 × 10^−1 | 2.72 × 10^1 / 7.30 × 10^−1
F6 | 7.37 × 10^−1 / 3.06 × 10^−1 | 7.26 × 10^−1 / 3.53 × 10^−1 | 1.72 / 5.35 × 10^−1 | 6.56 × 10^−1 / 3.95 × 10^−1 | 1.68 / 4.28 × 10^−1
F7 | 2.40 × 10^−3 / 1.40 × 10^−3 | 2.20 × 10^−3 / 9.42 × 10^−4 | 7.55 × 10^−4 / 4.52 × 10^−4 | 2.00 × 10^−3 / 1.60 × 10^−3 | 8.87 × 10^−4 / 5.52 × 10^−4
F8 | 1.84 × 10^−14 / 8.25 × 10^2 | −6.09 × 10^3 / 6.96 × 10^2 | −4.44 × 10^3 / 1.37 × 10^3 | −5.98 × 10^3 / 9.15 × 10^2 | −4.64 × 10^3 / 1.26 × 10^3
F9 | 2.41 / 3.30 | 1.99 / 3.53 | 0 / 0 | 3.51 × 10^−2 / 1.40 × 10^−1 | 0 / 0
F10 | 1.03 × 10^−13 / 1.84 × 10^−14 | 1.06 × 10^−15 / 2.26 × 10^−14 | 9.30 × 10^−15 / 2.38 × 10^−15 | 1.66 × 10^−13 / 5.28 × 10^−14 | 1.88 × 10^−14 / 4.01 × 10^−15
F11 | 3.20 × 10^−3 / 6.70 × 10^−3 | 1.20 × 10^−4 / 5.00 × 10^−3 | 0 / 0 | 3.50 × 10^−6 / 9.40 × 10^−6 | 0 / 0
F12 | 4.59 × 10^−2 / 2.13 × 10^−2 | 5.43 × 10^−2 / 3.12 × 10^−2 | 1.11 × 10^−1 / 4.30 × 10^−2 | 3.84 × 10^−2 / 2.28 × 10^−2 | 1.02 × 10^−1 / 3.95 × 10^−2
F13 | 6.41 × 10^−1 / 2.37 × 10^−1 | 6.48 × 10^−1 / 2.23 × 10^−1 | 1.10 / 2.32 × 10^−1 | 5.21 × 10^−1 / 1.92 × 10^−1 | 1.05 / 2.24 × 10^−1
F14 | 4.55 / 4.36 | 5.02 / 4.11 | 4.98 / 4.23 | 5.04 / 4.49 | 5.08 / 4.31
F15 | 6.50 × 10^−3 / 9.30 × 10^−3 | 4.40 × 10^−3 / 8.10 × 10^−3 | 5.19 × 10^−4 / 1.86 × 10^−4 | 5.10 × 10^−3 / 8.60 × 10^−3 | 4.94 × 10^−4 / 1.22 × 10^−4
F16 | −1.03 / 4.65 × 10^−8 | −1.03 / 1.69 × 10^−8 | −1.03 / 1.04 × 10^−5 | −1.03 / 1.86 × 10^−11 | −1.03 / 4.84 × 10^−8
F17 | 3.98 × 10^−1 / 3.88 × 10^−6 | 3.98 × 10^−1 / 1.63 × 10^−6 | 3.98 × 10^−1 / 1.63 × 10^−5 | 3.98 × 10^−1 / 1.16 × 10^−4 | 3.98 × 10^−1 / 3.83 × 10^−4
F18 | 3.00 / 2.99 × 10^−5 | 5.70 / 1.48 × 10^1 | 3.00 / 3.17 × 10^−5 | 3.00 / 4.14 × 10^−5 | 3.00 / 3.15 × 10^−5
F19 | 4.56 / 2.90 × 10^−3 | −3.86 / 2.40 × 10^−3 | −3.86 / 2.90 × 10^−3 | −3.86 / 2.70 × 10^−3 | −3.86 / 2.90 × 10^−3
F20 | −3.27 / 7.48 × 10^−2 | −3.27 / 6.75 × 10^−2 | −3.19 / 8.67 × 10^−2 | −3.23 / 8.52 × 10^−2 | −3.23 / 9.19 × 10^−2
F21 | −9.39 / 2.00 | −8.47 / 2.68 | −8.28 / 2.49 | −9.81 / 1.29 | −7.87 / 2.69
F22 | −1.04 × 10^1 / 9.22 × 10^−4 | −1.02 × 10^1 / 9.70 × 10^−1 | −9.78 / 1.90 | −1.00 × 10^1 / 1.35 | −9.69 / 1.83
F23 | −1.04 × 10^1 / 9.87 × 10^−1 | −1.05 × 10^1 / 7.63 × 10^−4 | −1.03 × 10^1 / 1.07 | −1.05 × 10^1 / 2.36 × 10^−7 | −1.02 × 10^1 / 1.37
Table 5. Simulation results of unimodal functions.
Function | Index | IGWO | GWO | SCA | MFO | PSO | BA | FPA | SSA
F1 | ave | 8.34 × 10^−40 | 1.55 × 10^−27 | 2.70 × 10^−12 | 2.34 × 10^3 | 8.76 × 10^−3 | 4.64 | 4.22 × 10^1 | 2.67 × 10^−7
F1 | std | 2.35 × 10^−39 | 2.95 × 10^−27 | 7.91 × 10^−12 | 5.04 × 10^3 | 1.32 × 10^−2 | 1.58 | 1.61 × 10^1 | 3.31 × 10^−7
F2 | ave | 9.96 × 10^−24 | 9.89 × 10^−17 | 9.26 × 10^−10 | 2.92 × 10^1 | 1.12 × 10^−1 | 1.05 × 10^1 | 8.40 | 2.72
F2 | std | 1.23 × 10^−23 | 9.40 × 10^−17 | 2.51 × 10^−9 | 1.78 × 10^1 | 1.01 × 10^−1 | 2.10 | 1.65 × 10^1 | 1.94
F3 | ave | 5.95 × 10^−8 | 2.47 × 10^−5 | 5.49 × 10^−1 | 2.03 × 10^4 | 5.61 × 10^−3 | 7.05 | 6.02 × 10^1 | 1.38 × 10^3
F3 | std | 9.72 × 10^−8 | 6.57 × 10^−5 | 3.00 | 1.02 × 10^4 | 6.75 × 10^−3 | 2.42 | 3.33 × 10^1 | 6.63 × 10^2
F4 | ave | 1.78 × 10^−11 | 7.54 × 10^−7 | 1.00 × 10^−3 | 6.78 × 10^1 | 7.66 × 10^−2 | 1.23 | 2.61 | 1.14 × 10^1
F4 | std | 3.00 × 10^−11 | 1.16 × 10^−6 | 2.60 × 10^−3 | 1.02 × 10^1 | 8.77 × 10^−2 | 1.36 × 10^−1 | 5.22 × 10^−1 | 3.18
F5 | ave | 2.72 × 10^1 | 2.73 × 10^1 | 7.36 | 2.68 × 10^6 | 5.38 × 10^−3 | 4.98 × 10^2 | 2.48 × 10^4 | 3.11 × 10^2
F5 | std | 7.30 × 10^−1 | 7.53 × 10^−1 | 3.81 × 10^−1 | 1.46 × 10^7 | 7.82 × 10^−3 | 1.65 × 10^2 | 2.02 × 10^4 | 4.69 × 10^2
F6 | ave | 1.68 | 7.37 × 10^−1 | 4.53 × 10^−1 | 1.35 × 10^3 | 8.57 × 10^−3 | 6.07 | 5.04 × 10^1 | 2.11 × 10^−7
F6 | std | 4.28 × 10^−1 | 3.06 × 10^−1 | 1.55 × 10^−1 | 4.37 × 10^3 | 1.44 × 10^−2 | 1.72 | 1.54 × 10^1 | 3.85 × 10^−7
F7 | ave | 8.87 × 10^−4 | 2.40 × 10^−3 | 2.50 × 10^−3 | 2.83 | 1.12 × 10^−1 | 1.52 × 10^2 | 2.99 × 10^3 | 1.74 × 10^−1
F7 | std | 5.52 × 10^−4 | 1.40 × 10^−3 | 2.00 × 10^−3 | 4.45 | 1.03 × 10^−1 | 3.51 × 10^1 | 2.15 × 10^3 | 7.68 × 10^−2
Table 6. Simulation results of multimodal functions.
Function | Index | IGWO | GWO | SCA | MFO | PSO | BA | FPA | SSA
F8 | ave | −4.64 × 10^3 | 1.84 × 10^−14 | −2.16 × 10^3 | −8.61 × 10^3 | −5.57 × 10^2 | −Inf | −4.73 × 10^1 | −7.40 × 10^3
F8 | std | 1.26 × 10^3 | 8.25 × 10^2 | 1.77 × 10^2 | 8.71 × 10^2 | 4.79 × 10^−3 | --- | 8.83 | 6.39 × 10^2
F9 | ave | 0 | 2.41 | 5.52 × 10^−1 | 1.64 × 10^2 | 5.70 × 10^−1 | 4.22 × 10^1 | 1.90 × 10^2 | 5.43 × 10^1
F9 | std | 0 | 3.30 | 2.92 | 2.98 × 10^1 | 6.01 × 10^−1 | 7.28 | 3.08 × 10^1 | 2.24 × 10^1
F10 | ave | 1.88 × 10^−14 | 1.03 × 10^−13 | 5.50 × 10^−6 | 1.59 × 10^1 | 3.96 | 3.44 | 5.60 | 2.46
F10 | std | 4.01 × 10^−15 | 1.84 × 10^−14 | 2.76 × 10^−5 | 6.56 | 7.34 | 2.51 × 10^−1 | 6.66 × 10^−1 | 8.54 × 10^−1
F11 | ave | 0 | 3.20 × 10^−3 | 8.05 × 10^−2 | 3.70 × 10^1 | 3.36 × 10^−3 | 2.32 × 10^−1 | 8.64 × 10^−1 | 1.95 × 10^−2
F11 | std | 0 | 6.70 × 10^−3 | 1.45 × 10^−1 | 5.06 × 10^1 | 3.83 × 10^−3 | 8.29 × 10^−2 | 1.12 × 10^−1 | 1.67 × 10^−2
F12 | ave | 1.02 × 10^−1 | 4.59 × 10^−2 | 9.27 × 10^−2 | 1.55 × 10^2 | 2.27 × 10^−1 | 2.33 × 10^−1 | 2.06 | 6.16
F12 | std | 3.95 × 10^−2 | 2.13 × 10^−2 | 4.55 × 10^−2 | 4.58 × 10^2 | 5.16 × 10^−1 | 9.85 × 10^−2 | 7.12 × 10^−1 | 2.26
F13 | ave | 1.05 | 6.41 × 10^−1 | 3.00 × 10^−1 | 1.37 × 10^7 | 1.38 × 10^−2 | 2.93 | 8.20 | 1.20 × 10^1
F13 | std | 2.24 × 10^−1 | 2.37 × 10^−1 | 1.08 × 10^−1 | 7.49 × 10^7 | 1.19 × 10^−2 | 7.01 × 10^−1 | 2.79 | 1.54 × 10^1
Table 7. Simulation results of the fixed dimensional multimodal function.
Function | Index | IGWO | GWO | SCA | MFO | PSO | BA | FPA | SSA
F14 | ave | 5.08 | 4.55 | 1.94 | 3.13 | 9.98 × 10^−1 | 1.26 × 10^1 | 1.27 × 10^1 | 1.26
F14 | std | 4.31 | 4.36 | 9.97 × 10^−1 | 2.51 | 8.07 × 10^−6 | 3.48 × 10^−1 | 1.88 × 10^−14 | 6.35 × 10^−1
F15 | ave | 4.94 × 10^−4 | 6.50 × 10^−3 | 1.00 × 10^−3 | 1.06 × 10^−3 | 5.30 × 10^−3 | 3.70 × 10^−3 | 6.31 × 10^−3 | 2.20 × 10^−3
F15 | std | 1.22 × 10^−4 | 9.30 × 10^−3 | 3.66 × 10^−4 | 4.24 × 10^−4 | 5.47 × 10^−3 | 6.10 × 10^−3 | 1.01 × 10^−2 | 4.90 × 10^−3
F16 | ave | −1.03 | −1.03 | −1.03 | −1.03 | 1.22 × 10^1 | −1.03 | −1.03 | −1.03
F16 | std | 4.84 × 10^−8 | 4.65 × 10^−8 | 4.28 × 10^−5 | 6.78 × 10^−16 | 3.29 × 10^1 | 2.11 × 10^−9 | 4.99 × 10^−7 | 1.68 × 10^−14
F17 | ave | 3.98 × 10^−1 | 3.98 × 10^−1 | 4.01 × 10^−1 | 3.98 × 10^−1 | 3.20 | 3.99 × 10^−1 | 3.98 × 10^−1 | 3.81
F17 | std | 3.83 × 10^−4 | 3.88 × 10^−6 | 3.00 × 10^−3 | 0 | 3.30 | 2.90 × 10^−3 | 3.54 × 10^−10 | 3.57 × 10^−2
F18 | ave | 3.00 | 3.00 | 3.00 | 3.00 | 8.14 × 10^3 | 1.83 × 10^1 | 6.60 | 3.00
F18 | std | 3.15 × 10^−5 | 2.99 × 10^−5 | 1.54 × 10^−4 | 2.23 × 10^−15 | 1.88 × 10^4 | 2.53 × 10^1 | 1.54 × 10^1 | 2.45 × 10^−13
F19 | ave | −3.86 | 4.56 | −3.85 | −3.86 | −1.08 | −3.42 | −3.86 | −3.86
F19 | std | 2.90 × 10^−3 | 2.90 × 10^−3 | 3.10 × 10^−3 | 2.71 × 10^−15 | 8.23 × 10^−1 | 9.85 × 10^−1 | 2.86 × 10^−7 | 5.68 × 10^−9
F20 | ave | −3.23 | −3.27 | −2.93 | −3.24 | −5.03 × 10^−1 | −3.16 | −2.93 | −3.22
F20 | std | 9.19 × 10^−2 | 7.48 × 10^−2 | 2.17 × 10^−1 | 7.08 × 10^−2 | 5.34 × 10^−1 | 3.74 × 10^−1 | 2.17 × 10^−1 | 6.12 × 10^−2
F21 | ave | −7.87 | −9.39 | −1.93 | −7.72 | −2.49 × 10^−1 | −5.23 | −1.02 × 10^1 | −7.15
F21 | std | 2.69 | 2.00 | 1.57 | 3.12 | 3.82 × 10^−1 | 9.31 × 10^−1 | 3.92 × 10^−4 | 3.56
F22 | ave | −9.69 | −1.04 × 10^1 | −4.11 | −7.46 | −1.52 × 10^−1 | −5.09 | −1.04 × 10^1 | −8.92
F22 | std | 1.83 | 9.22 × 10^−4 | 1.66 | 3.49 | 1.20 × 10^−1 | 5.23 × 10^−7 | 5.50 × 10^−3 | 2.78
F23 | ave | −1.02 × 10^1 | −1.04 × 10^1 | −3.82 | −7.58 | −2.45 × 10^−1 | −5.30 | −1.05 × 10^1 | −8.34
F23 | std | 1.37 | 9.87 × 10^−1 | 1.59 | 3.74 | 2.23 × 10^−1 | 9.38 × 10^−1 | 3.26 × 10^−3 | 3.46
Table 8. p-values of the Wilcoxon rank-sum test on the test functions.

Function | IGWO vs. GWO | IGWO vs. SCA | IGWO vs. MFO | IGWO vs. PSO | IGWO vs. BA | IGWO vs. FPA | IGWO vs. SSA
F1 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11
F2 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11
F3 | 1.78e-10 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11
F4 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.01e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11
F5 | 4.13e-2 | 2.58e-11 | 2.58e-11 | 2.58e-11 | 8.26e-6 | 2.57e-11 | 1.26e-10
F6 | 2.97e-11 | 2.50e-2 | 2.70e-2 | 2.97e-11 | 2.97e-11 | 4.89e-11 | 2.97e-11
F7 | 7.27e-6 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11 | 3.02e-11
F8 | 3.01e-11 | 1.81e-1 | 3.01e-11 | 2.99e-11 | 3.01e-11 | 3.02e-11 | 1.21e-10
F9 | 1.19e-12 | 1.21e-12 | 1.21e-12 | 1.21e-12 | 1.21e-12 | 1.20e-12 | 1.21e-12
F10 | 1.58e-11 | 1.68e-11 | 1.68e-11 | 5.65e-13 | 1.63e-11 | 1.68e-11 | 1.68e-11
F11 | 1.10e-2 | 1.21e-12 | 1.21e-12 | 1.21e-12 | 1.21e-12 | 1.21e-12 | 1.21e-12
F12 | 1.06e-7 | 3.01e-11 | 3.01e-11 | 3.01e-11 | 9.12e-1 | 1.61e-10 | 3.01e-11
F13 | 4.50e-11 | 3.01e-11 | 3.01e-11 | 3.01e-11 | 8.14e-5 | 4.07e-11 | 1.86e-6
F14 | 3.26e-1 | 8.26e-1 | 6.58e-1 | 4.76e-4 | 5.12e-11 | 2.30e-12 | 1.79e-4
F15 | 1.84e-2 | 7.08e-8 | 7.64e-8 | 5.07e-10 | 1.10e-4 | 9.50e-3 | 8.84e-7
F16 | 1.06e-11 | 2.36e-12 | 1.61e-1 | 2.36e-12 | 6.50e-14 | 2.49e-12 | 1.61e-1
F17 | 8.44e-7 | 2.04e-9 | 8.59e-7 | 2.42e-11 | 1.20e-7 | 8.59e-7 | 3.71e-7
F18 | 3.79e-1 | 1.76e-1 | 1.21e-12 | 3.01e-11 | 1.21e-12 | 5.96e-11 | 1.21e-12
F19 | 8.00e-3 | 1.20e-8 | 1.20e-12 | 1.20e-12 | 3.71e-7 | 7.35e-11 | 1.20e-12
F20 | 2.23e-1 | 1.01e-8 | 1.50e-3 | 1.21e-12 | 6.25e-4 | 5.53e-8 | 1.40e-2
F21 | 5.11e-1 | 1.95e-10 | 2.48e-1 | 3.01e-11 | 2.55e-1 | 3.79e-1 | 1.02e-1
F22 | 5.07e-10 | 3.68e-11 | 5.73e-2 | 3.01e-11 | 7.73e-11 | 5.57e-10 | 6.35e-2
F23 | 3.01e-11 | 4.50e-11 | 1.70e-1 | 3.01e-11 | 8.82e-10 | 3.01e-11 | 1.23e-9
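For readers who want to reproduce p-values like those in Table 8, the two-sided Wilcoxon rank-sum test can be sketched with the standard library alone. This is an illustrative sketch (normal approximation, no tie correction), not the authors' code; the function name and sample data are invented for the example.

```python
import math

def rank_sum_p(x, y):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation
    (no tie correction) -- a minimal sketch, not the authors' implementation."""
    combined = sorted(list(x) + list(y))

    def avg_rank(v):
        # 1-based rank; tied values share the average rank of their group
        lo = combined.index(v)
        hi = lo + combined.count(v) - 1
        return (lo + hi + 2) / 2

    n, m = len(x), len(y)
    w = sum(avg_rank(v) for v in x)           # rank sum of the first sample
    mu = n * (n + m + 1) / 2                  # mean of W under H0
    sigma = math.sqrt(n * m * (n + m + 1) / 12)
    z = (w - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2))   # 2 * (1 - Phi(|z|))

# two clearly separated samples of 10 runs each give a very small p-value
print(rank_sum_p(list(range(10)), list(range(100, 110))))
```

With 30 independent runs per algorithm, as in the paper, a p-value below 0.05 indicates a statistically significant difference between IGWO and the compared algorithm on that function.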
Table 9. Ranking results of the 8 algorithms on the test functions.

Function | IGWO | GWO | SCA | MFO | PSO | BA | FPA | SSA
F1 | 1.5 | 1.5 | 3 | 8 | 5 | 6 | 7 | 4
F2 | 1.5 | 1.5 | 3 | 8 | 4 | 7 | 6 | 5
F3 | 1 | 2 | 4 | 8 | 3 | 5 | 6 | 7
F4 | 1 | 2 | 3 | 8 | 4 | 5 | 6 | 7
F5 | 3 | 4 | 2 | 8 | 1 | 6 | 7 | 5
F6 | 5 | 4 | 3 | 8 | 2 | 6 | 7 | 1
F7 | 1 | 2 | 3 | 6 | 4 | 7 | 8 | 5
F8 | 3 | 7 | 4 | 1 | 5 | -- | 6 | 2
F9 | 1 | 4 | 2 | 7 | 3 | 5 | 8 | 6
F10 | 1 | 2 | 3 | 8 | 6 | 5 | 7 | 4
F11 | 1 | 2 | 5 | 8 | 3 | 6 | 7 | 4
F12 | 3 | 1 | 2 | 8 | 4 | 5 | 6 | 7
F13 | 4 | 3 | 2 | 8 | 1 | 5 | 6 | 7
F14 | 6 | 5 | 3 | 4 | 1 | 7 | 8 | 2
F15 | 1 | 8 | 2 | 3 | 6 | 5 | 7 | 4
F16 | 1.5 | 5 | 5 | 5 | 8 | 5 | 5 | 1.5
F17 | 2.5 | 2.5 | 6 | 2.5 | 7 | 5 | 2.5 | 8
F18 | 3 | 3 | 3 | 3 | 8 | 7 | 6 | 3
F19 | 2 | 8 | 5 | 3.5 | 7 | 6 | 3.5 | 1
F20 | 3 | 1 | 6.5 | 2 | 8 | 5 | 6.5 | 4
F21 | 3 | 2 | 7 | 4 | 8 | 6 | 1 | 5
F22 | 3 | 1.5 | 7 | 5 | 8 | 6 | 1.5 | 4
F23 | 3 | 2 | 7 | 5 | 8 | 6 | 1 | 4
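The per-function rankings in Table 9 follow the usual convention of giving tied algorithms the average of the ranks they span (hence entries like 1.5 or 2.5). A small helper (a sketch, not the authors' code) reproduces, for example, the F17 row from the F17 average results:

```python
def average_ranks(values):
    """1-based ranks over values (ascending); ties get the group's average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j across the group of equal values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j + 2) / 2  # sorted positions i..j hold ranks i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

# F17 averages for IGWO, GWO, SCA, MFO, PSO, BA, FPA, SSA
f17_ave = [3.98e-1, 3.98e-1, 4.01e-1, 3.98e-1, 3.20, 3.99e-1, 3.98e-1, 3.81]
print(average_ranks(f17_ave))  # [2.5, 2.5, 6.0, 2.5, 7.0, 5.0, 2.5, 8.0]
```

The four algorithms tied at the optimum 3.98e-1 share rank (1+2+3+4)/4 = 2.5, exactly as in the F17 row of Table 9.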
Table 10. Comparison results of the pressure vessel design problem.

Algorithm | Ts | Th | R | L | Optimum cost
MVO [37] | 0.8125 | 0.4375 | 42.0907382 | 176.738690 | 6060.8066
GSA [38] | 1.125000 | 0.6250 | 55.988659 | 84.4542025 | 8538.8359
PSO [39] | 0.812500 | 0.437500 | 42.091266 | 176.746500 | 6061.0777
MSCA [40] | 0.776256 | 0.399600 | 40.325450 | 199.9213 | 5935.7161
GA (Coello) [41] | 0.812500 | 0.4345 | 40.323900 | 200.0000 | 6288.7445
GA (Coello and Montes) [42] | 0.812500 | 0.4375 | 42.097397 | 176.654050 | 6059.9463
GA (Deb et al.) [43] | 0.937500 | 0.50000 | 48.329000 | 112.679000 | 6410.3811
ES [44] | 0.812500 | 0.437500 | 42.098087 | 176.640518 | 6059.745605
DE (Huang et al.) [45] | 0.8125 | 0.4375 | 42.098411 | 176.637690 | 6059.7340
ACO (Kaveh et al.) [46] | 0.8125 | 0.4375 | 42.103624 | 176.572656 | 6059.0888
HIS [47] | 1.125000 | 0.625000 | 58.29015 | 43.69268 | 7197.7300
MFO | 0.8125 | 0.4375 | 42.098445 | 176.636596 | 6059.7143
WOA [48] | 0.812500 | 0.437500 | 42.098209 | 176.638998 | 6059.7410
IGWO [21] | 0.8125 | 0.4375 | 42.0984456 | 176.636596 | 6059.7143
GWO | 0.7852686 | 0.3891504 | 40.67564 | 195.6436 | 5913.8838
IGWO | 0.7784458 | 0.3854034 | 40.33393 | 199.8019 | 5888.6000
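The costs in Table 10 can be sanity-checked against the pressure vessel cost function in its standard benchmark formulation (assumed here; the formula is not quoted in this excerpt):

```python
def vessel_cost(ts, th, r, l):
    """Standard pressure vessel cost (welding, forming and material terms):
    ts = shell thickness, th = head thickness, r = inner radius, l = length."""
    return (0.6224 * ts * r * l
            + 1.7781 * th * r ** 2
            + 3.1661 * ts ** 2 * l
            + 19.84 * ts ** 2 * r)

# GWO solution from Table 10; prints a value matching 5913.88 to two decimals
print(round(vessel_cost(0.7852686, 0.3891504, 40.67564, 195.6436), 2))
```

Evaluating the tabulated GWO and MFO variables reproduces their reported costs to within rounding of the printed decimals, which is a convenient check that a row has been read correctly.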
Table 11. Comparison results for the spring design problem.

Algorithm | d | D | N | Optimum cost
Mathematical optimization method (Belegundu) [49] | 0.053396 | 0.3177 | 14.0260 | 0.0127303
GSA (Kaveh) [46] | 0.050000 | 0.317312 | 14.22867 | 0.0128739
GSA (Rashedi) [38] | 0.050276 | 0.323680 | 13.525410 | 0.0127022
SCA [6] | 0.050780 | 0.334779 | 12.72269 | 0.0127097
MVO [37] | 0.05000 | 0.315956 | 14.22623 | 0.0128169
GWO | 0.05000 | 0.31739 | 14.0351 | 0.012699
IGWO | 0.05159 | 0.354337 | 11.4301 | 0.012700
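The spring weights in Table 11 can likewise be checked with the standard tension/compression spring objective f(d, D, N) = (N + 2) D d² (the common benchmark formulation, assumed rather than quoted from this excerpt):

```python
def spring_weight(d, coil_d, n):
    """Weight of the tension/compression spring: (N + 2) * D * d^2,
    with wire diameter d, mean coil diameter D and N active coils."""
    return (n + 2) * coil_d * d ** 2

# MVO solution from Table 11
print(round(spring_weight(0.05, 0.315956, 14.22623), 7))  # → 0.0128169
```

The MVO and SCA rows reproduce their tabulated costs essentially exactly, confirming the formulation; the small residual differences on other rows are consistent with rounding of the printed variables.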
Table 12. Comparison results for the welding beam problem.

Algorithm | h | l | t | b | Optimum cost
NGS-WOA [50] | 0.202369 | 3.544214 | 9.048210 | 0.205723 | 1.72802
WOA [48] | 0.2053963 | 3.484293 | 9.037426 | 0.206276 | 1.730499
RO [51] | 0.203687 | 3.528467 | 9.004233 | 0.207241 | 1.735344
MVO [37] | 0.20722744 | 3.393969312 | 9.018874001 | 0.207225774 | 1.7250
CPSO [52] | 0.205463 | 3.473193 | 9.044502 | 0.205695 | 1.72645
CPSO [53] | 0.202369 | 3.544214 | 9.048210 | 0.205723 | 1.73148
HS [54] | 0.2442 | 6.2231 | 8.2915 | 0.2433 | 2.3807
GSA [38] | 0.182129 | 3.856979 | 10.0000 | 0.202376 | 1.87995
GA [55] | 0.1829 | 4.0483 | 9.3666 | 0.2059 | 1.82420
GA [56] | 0.2489 | 6.1730 | 8.1789 | 0.2533 | 2.43312
Coello [40] | 0.208800 | 3.420500 | 8.997500 | 0.2100 | 1.74831
Coello and Montes [41] | 0.205986 | 3.471328 | 9.020224 | 0.206480 | 1.72822
GWO | 0.20527 | 3.4819 | 9.0389 | 0.20583 | 1.7269
IGWO | 0.20496 | 3.4872 | 9.0366 | 0.20573 | 1.7254
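The fabrication costs in Table 12 follow the standard welded beam objective f = 1.10471 h² l + 0.04811 t b (14.0 + l) (weld material plus bar stock; the standard benchmark formulation, assumed rather than quoted here):

```python
def beam_cost(h, l, t, b):
    """Fabrication cost of the welded beam:
    weld term 1.10471*h^2*l plus bar term 0.04811*t*b*(14.0 + l)."""
    return 1.10471 * h ** 2 * l + 0.04811 * t * b * (14.0 + l)

# GSA solution from Table 12; reproduces the tabulated 1.87995
print(round(beam_cost(0.182129, 3.856979, 10.0, 0.202376), 5))
```

Evaluating the GSA row reproduces its reported cost exactly, and the GWO and IGWO rows agree with their tabulated costs to within rounding of the printed variables.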
Table 13. Comparison results for the three-bar truss design problem.

Algorithm | X1 | X2 | Optimum cost
PSO-DE [39] | 0.7886751 | 0.4082482 | 263.8958433
MBA [57] | 0.7885650 | 0.4085597 | 263.8958522
DEDS [58] | 0.78867513 | 0.40824828 | 263.8958434
CS [59] | 0.78867 | 0.40902 | 263.9716
Ray and Sain [60] | 0.795 | 0.395 | 264.3
Tsa [61] | 0.788 | 0.408 | 263.68
WOA [48] | 0.789050544 | 0.407187512 | 263.8959474
MFO | 0.788244770931922 | 0.409466905784741 | 263.895979682
GWO | 0.78769 | 0.41108 | 263.9011
IGWO | 0.78846 | 0.40884 | 263.8959
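The truss costs in Table 13 correspond to the standard three-bar truss volume objective f = (2√2·x1 + x2)·l with l = 100 cm (the common benchmark formulation, assumed rather than quoted from this excerpt):

```python
import math

def truss_volume(x1, x2, l=100.0):
    """Three-bar truss objective: (2*sqrt(2)*x1 + x2) * l, l = 100 cm,
    where x1 and x2 are the cross-sectional areas of the bars."""
    return (2 * math.sqrt(2) * x1 + x2) * l

# DEDS solution from Table 13; matches the tabulated 263.8958434 closely
print(round(truss_volume(0.78867513, 0.40824828), 4))
```

Plugging in any row of Table 13 reproduces its tabulated cost to the printed precision, so the table can be verified line by line.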
Share and Cite

Li, Y.; Lin, X.; Liu, J. An Improved Gray Wolf Optimization Algorithm to Solve Engineering Problems. Sustainability 2021, 13, 3208. https://doi.org/10.3390/su13063208