Query: optimization of process
Stochastic optimization algorithms are effective approaches to addressing optimization challenges. In this article, a new optimization algorithm called the Election-Based Optimization Algorithm (EBOA) is developed that mimics the voting process used to select a leader. The fundamental inspiration of EBOA is the voting process, the selection of the leader, and the impact of the public awareness level on that selection. The EBOA population is guided through the search space by the elected leader. EBOA's process is mathematically modeled in two phases: exploration and exploitation. The efficiency of EBOA has been investigated on thirty-three objective functions spanning unimodal, high-dimensional multimodal, fixed-dimensional multimodal, and CEC 2019 types. The results on these objective functions show EBOA's high exploration ability in global search, its exploitation ability in local search, and its ability to strike a proper balance between the two, which makes the proposed approach effective at finding appropriate solutions. Our analysis shows that EBOA provides an appropriate balance between exploration and exploitation and therefore performs better than, and remains competitive with, the ten other algorithms to which it was compared.
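The two-phase search described in this abstract can be pictured with a generic leader-guided sketch. The update rules, bounds, and parameters below are illustrative assumptions, not the published EBOA equations:

```python
import random

def sphere(x):
    # Simple unimodal test objective: f(x) = sum(x_i^2), minimum 0 at the origin
    return sum(v * v for v in x)

def election_style_optimize(f, dim=5, pop=20, iters=200, seed=1):
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    for t in range(iters):
        leader = min(X, key=f)  # "elect" the best member as leader
        step = 1.0 - t / iters  # exploration step shrinks over time
        for i, x in enumerate(X):
            if rng.random() < 0.5:
                # exploration: random move in the search space
                cand = [v + rng.uniform(-step, step) for v in x]
            else:
                # exploitation: move toward the elected leader
                cand = [v + rng.random() * (l - v) for v, l in zip(x, leader)]
            if f(cand) < f(x):  # keep only improving moves
                X[i] = cand
    return min(X, key=f)

best = election_style_optimize(sphere)
```

The greedy acceptance rule makes each member's objective value monotone non-increasing, so the returned solution is at least as good as the best initial candidate.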
Process planning optimization is a well-known NP-hard combinatorial problem extensively studied in the scientific community. Its main components include operation sequencing, selection of manufacturing resources, and determination of appropriate setup plans. These problems require metaheuristic-based approaches in order to be solved effectively and efficiently. Therefore, to optimize the complex process planning problem, a novel hybrid grey wolf optimizer (HGWO) is proposed. The traditional grey wolf optimizer (GWO) is improved by employing genetic strategies such as selection, crossover, and mutation, which enhance the global search ability and convergence of the traditional GWO. Precedence relationships among machining operations are taken into account, and precedence constraints are modeled using operation precedence graphs and adjacency matrices. A constraint-handling heuristic procedure is adopted to move infeasible solutions into the feasible domain. Minimization of the total weighted machining cost of a process plan is adopted as the objective, and three experimental studies considering three different prismatic parts are conducted. Comparative analysis of the obtained cost values, as well as convergence analysis, shows that the HGWO approach is effective and flexible in finding optimal and near-optimal process plans. On the other hand, comparative analysis of computational times and of the execution times of certain MATLAB functions showed that the HGWO has good but limited time efficiency, since it requires more time than the hybrid and traditional algorithms considered. Potential directions for improving the efficiency and performance of the proposed approach are given in the conclusions.
- Keywords
- constraint handling, crossover, grey wolf optimizer, mutation, precedence constraints, process planning optimization, selection
- Publication type
- journal articles MeSH
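The precedence modeling and constraint handling described in the abstract above can be sketched as follows. The adjacency matrix, the four-operation example, and the repair rule are illustrative assumptions, not the paper's exact procedure:

```python
# prec[i][j] == 1 means operation i must precede operation j.
prec = [
    [0, 1, 1, 0],   # op 0 before ops 1 and 2
    [0, 0, 0, 1],   # op 1 before op 3
    [0, 0, 0, 1],   # op 2 before op 3
    [0, 0, 0, 0],
]

def is_feasible(seq, prec):
    # Every required predecessor must appear earlier in the sequence.
    pos = {op: k for k, op in enumerate(seq)}
    return all(pos[i] < pos[j]
               for i in range(len(prec))
               for j in range(len(prec))
               if prec[i][j])

def repair(seq, prec):
    # Generic repair heuristic: rebuild the sequence in feasible order,
    # always emitting the "ready" operation (all predecessors already
    # scheduled) that appears earliest in the original sequence.
    n = len(prec)
    remaining = list(seq)
    out = []
    while remaining:
        for op in remaining:
            if all(pred in out for pred in range(n) if prec[pred][op]):
                out.append(op)
                remaining.remove(op)
                break
    return out

fixed = repair([3, 1, 2, 0], prec)
```

Because the precedence graph is acyclic, some operation is always ready, so the repair loop terminates and returns a feasible permutation close to the original.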
Multienzyme processes represent an important area of biocatalysis. Their efficiency can be enhanced by optimization of the stoichiometry of the biocatalysts. Here we present a workflow for maximizing the efficiency of a three-enzyme system catalyzing a five-step chemical conversion. Kinetic models of pathways with wild-type or engineered enzymes were built, and the enzyme stoichiometry of each pathway was optimized. Mathematical modeling and one-pot multienzyme experiments provided detailed insights into pathway dynamics, enabled the selection of a suitable engineered enzyme, and afforded high efficiency while minimizing biocatalyst loadings. Optimizing the stoichiometry in a pathway with an engineered enzyme reduced the total biocatalyst load by an impressive 56 %. Our new workflow represents a broadly applicable strategy for optimizing multienzyme processes.
- Keywords
- biocatalysis, biotransformations, kinetic modeling, multienzyme reaction, stoichiometry optimization
- MeSH
- algorithms MeSH
- biocatalysis * MeSH
- chemical models MeSH
- enzymes chemistry MeSH
- kinetics MeSH
- protein engineering MeSH
- workflow MeSH
- Publication type
- journal articles MeSH
- research support MeSH
- Substance names
- enzymes MeSH
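As a toy illustration of stoichiometry optimization, the sketch below searches for the enzyme loadings that reach a target pathway rate at minimal total biocatalyst load. The rate law (the slowest step limits flux) and all numbers are made-up assumptions for illustration, not the article's kinetic model:

```python
def pathway_rate(e1, e2, e3, k=(2.0, 1.0, 4.0)):
    # Toy serial-pathway model: overall rate is limited by the slowest
    # enzymatic step, each proportional to its enzyme loading.
    return min(k[0] * e1, k[1] * e2, k[2] * e3)

def optimal_loading(target=10.0, step=0.5, limit=20.0):
    # Exhaustive grid search over loadings, minimizing the total load
    # subject to meeting the target rate.
    best = None
    grid = [i * step for i in range(1, int(limit / step) + 1)]
    for e1 in grid:
        for e2 in grid:
            for e3 in grid:
                if pathway_rate(e1, e2, e3) >= target:
                    total = e1 + e2 + e3
                    if best is None or total < best[0]:
                        best = (total, e1, e2, e3)
    return best

total, e1, e2, e3 = optimal_loading()
```

For this toy model the optimum simply sets each loading to the minimum that sustains the target flux; in practice a full kinetic model makes the trade-offs between steps far less trivial.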
At the battalion level, the NATO ROLE1 medical treatment facility focuses on the provision of primary health care, being the first point at which a physician and more advanced medical equipment intervene in casualty treatment. ROLE1 is of paramount importance in reducing casualties and represents a complex system in current operations. This study deals with an experiment on the optimization of ROLE1 with respect to three key parameters: the number of physicians, the number of ambulances, and the distance between ROLE1 and the current battlefield. The first step in this study is to design and implement a model of current battlefield casualties. The model uses friction data generated from an already executed computer-assisted exercise (CAX), employing a constructive simulation to produce offense and defense scenarios for the flow of casualties. The next step is to design and implement a model representing the transportation to ROLE1, together with its structure and behavior. The deterministic model of ROLE1, built on a system dynamics simulation paradigm, uses the previously generated casualty flows as inputs, representing human decision-making processes through the recorded CAX events. A factorial experimental design for the ROLE1 model revealed the recommended variants of the ROLE1 structure for both offensive and defensive operations. The overall recommendation is for the internal structure of ROLE1 to have three ambulances and three physicians for any kind of current operation and any distance between ROLE1 and the current battlefield within the limit of 20 min. This study provides novelty in the methodology of casualty estimation involving human decision-making factors, as well as in the optimization of medical treatment processes through experimentation with the process model.
- Keywords
- casualty treatment optimization, complex systems, system dynamics
- Publication type
- journal articles MeSH
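A full-factorial screening like the one described can be sketched as below. The backlog model, factor levels, and rates are made-up stand-ins for the system-dynamics simulation, so the winning design here is purely illustrative and need not match the study's recommendation:

```python
from itertools import product

def backlog(physicians, ambulances, minutes, arrivals=6.0):
    # Toy steady-state model: casualties served per hour is limited by
    # treatment capacity, ambulance round trips, and the arrival rate.
    treat = physicians * 1.5                       # treated per hour
    deliver = ambulances * (60.0 / (2 * minutes))  # round trips per hour
    served = min(treat, deliver, arrivals)
    return arrivals - served                       # untreated per hour

# Full factorial: physicians x ambulances x one-way transport time (min).
designs = list(product([1, 2, 3], [1, 2, 3], [10, 20]))
# Pick the design with the lowest backlog, breaking ties by resource cost.
best = min(designs, key=lambda d: (backlog(*d), d[0] + d[1]))
```

Enumerating every factor combination is exactly what makes a full factorial design attractive when each run (here a cheap formula, in the study a simulation) is affordable.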
This paper presents a novel design methodology that validates and utilizes the results of topology optimization as the final product shape. The proposed methodology aims to streamline the design process by eliminating the need for remodeling and minimizing printing errors through process simulation. It also eliminates the repeated export and import of data between software tools. The study includes a case study involving the steering column housing of a racing car, where Siemens NX Topology Optimization was used for optimization, and verification analysis was conducted using the NX Nastran solver. The final solution was fabricated using AlSi10Mg via direct metal laser sintering on a 3D printer and successfully validated under real conditions. In conclusion, this paper introduces a comprehensive design methodology for the direct utilization of topology optimization, which was validated through a case study, yielding positive results.
- Keywords
- L-PBF, additive manufacturing, design methodology, numerical simulation, process simulation, topology optimization
- Publication type
- journal articles MeSH
The purpose of this study was to find and optimize the process parameters for producing 1.2709 tool steel at a layer thickness of 100 μm by DMLS (Direct Metal Laser Sintering). HPDC (High Pressure Die Casting) tools are printed from this material. To date, only layer thicknesses of 20-50 μm have been used, and according to the state of the art, parameters for 100 μm were an undescribed area. Increasing the layer thickness could reduce build time and improve economic efficiency. The study methodology was divided into several steps. The first step was research into single-track 3D printing parameters to support a subsequent, more accurate description of the process parameters. In the second step, volume samples were produced in two campaigns, and their porosity was evaluated by metallographic and CT (computed tomography) analysis. The main requirement for the process parameters was a relative density of the printed material of at least 99.9%, which was achieved and confirmed using the parameters for the production of the tensile test samples. The results of this article could therefore serve as a methodological procedure for optimizing parameters to streamline the 3D printing process, and the developed parameters may be used for productive, high-quality 3D printing of 1.2709 tool steel.
- Keywords
- 3D printing, 3D printing parameters optimization, Direct Metal Laser Sintering (DMLS), additive manufacturing, energy density, layer thickness
- Publication type
- journal articles MeSH
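A key quantity behind such parameter development is the volumetric energy density delivered to the powder bed, E = P / (v · h · t), a standard L-PBF/DMLS metric. The parameter values below are illustrative, not the study's settings:

```python
def energy_density(power_w, speed_mm_s, hatch_mm, layer_mm):
    # Volumetric energy density E = P / (v * h * t) in J/mm^3, with
    # laser power P [W], scan speed v [mm/s], hatch distance h [mm],
    # and layer thickness t [mm].
    return power_w / (speed_mm_s * hatch_mm * layer_mm)

# Doubling the layer thickness from 50 um to 100 um halves E unless
# power, speed, or hatch distance are re-optimized accordingly.
e_50 = energy_density(250.0, 1000.0, 0.1, 0.05)
e_100 = energy_density(250.0, 1000.0, 0.1, 0.10)
```

This is why 100 μm layers need their own parameter window: keeping the 20-50 μm settings would starve the thicker layer of energy and raise porosity.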
The objective of this investigation was to study the influence of process variables on the responses during the drilling of LM6/B4C composite materials. Stir casting was employed to produce the LM6/B4C composites. A Vertical Machining Center (VMC) with a dynamometer was used to drill the holes and record the thrust force. An L27 orthogonal array was used to carry out the experimental work. Grey relational analysis (GRA) was employed to perform the optimization, aiming for the lowest Thrust Force (TF), Surface Roughness (SR), and Burr Height (BH). The optimum levels of the process variables, namely the feed rate (F), spindle speed (S), drill material (D), and reinforcement percentage (R), were determined for minimal responses. Confirmation experiments verified that the process variables for drilling the LM6/B4C composites were indeed optimized: the predicted Grey Relational Grade (GRG) was 0.846, whereas the experimental GRG was 0.865, a 2.2% error, indicating that the optimization was valid.
- Keywords
- ANOVA, composites, drilling, optimization, parameters
- Publication type
- journal articles MeSH
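The grey relational grade computation behind GRA can be sketched for smaller-the-better responses as follows. The three trial rows and the distinguishing coefficient ζ = 0.5 are illustrative assumptions, not the experimental data:

```python
def gra_grades(trials, zeta=0.5):
    # Each trial is a tuple of smaller-the-better responses
    # (e.g. thrust force, surface roughness, burr height).
    cols = list(zip(*trials))
    # Smaller-the-better normalization of each response column to [0, 1].
    norm = [[(max(c) - v) / (max(c) - min(c)) for v in c] for c in cols]
    # Deviation of each normalized value from the ideal (1.0).
    dev = [[1.0 - v for v in c] for c in norm]
    dmin = min(min(c) for c in dev)
    dmax = max(max(c) for c in dev)
    # Grey relational coefficient with distinguishing coefficient zeta.
    coef = [[(dmin + zeta * dmax) / (d + zeta * dmax) for d in c] for c in dev]
    # Grey relational grade = mean coefficient across responses per trial.
    rows = list(zip(*coef))
    return [sum(r) / len(r) for r in rows]

trials = [(120.0, 3.2, 0.40), (95.0, 2.1, 0.25), (140.0, 4.0, 0.55)]
grades = gra_grades(trials)
best_trial = grades.index(max(grades))
```

The trial with the highest grade is closest to the ideal on all responses simultaneously, which is how GRA collapses a multi-response problem into a single ranking.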
This research paper develops a novel hybrid approach, called hybrid particle swarm optimization-teaching-learning-based optimization (hPSO-TLBO), which combines two metaheuristic algorithms to solve optimization problems. The main idea in the hPSO-TLBO design is to integrate the exploitation ability of PSO with the exploration ability of TLBO. Here, the "exploitation ability of PSO" refers to its capacity to manage local search, refining solutions near those already obtained and in promising areas of the search space; the "exploration ability of TLBO" refers to its capacity to manage global search and prevent the algorithm from getting stuck in poor local optima. The hPSO-TLBO design methodology is as follows: in the first step, the teacher phase of TLBO is combined with the velocity equation of PSO; in the second step, the learner phase of TLBO is improved so that each student learns from a selected student with a better objective function value. The algorithm is presented in detail, accompanied by a comprehensive mathematical model. A group of benchmarks is used to evaluate the effectiveness of hPSO-TLBO, covering various types such as unimodal, high-dimensional multimodal, and fixed-dimensional multimodal functions. In addition, CEC 2017 benchmark problems are utilized for evaluation. The optimization results demonstrate that hPSO-TLBO performs remarkably well on the benchmark functions, exhibiting a strong ability to explore and exploit the search space while maintaining a balanced approach throughout the optimization process. Furthermore, a comparative analysis is conducted to evaluate the performance of hPSO-TLBO against twelve widely recognized metaheuristic algorithms.
The evaluation of the experimental findings illustrates that hPSO-TLBO consistently outperforms the competing algorithms across various benchmark functions, showcasing its superior performance. The successful deployment of hPSO-TLBO in addressing four engineering challenges highlights its effectiveness in tackling real-world applications.
- Keywords
- exploitation, exploration, hybrid-based algorithm, metaheuristic, optimization, particle swarm optimization, teaching–learning-based optimization
- Publication type
- journal articles MeSH
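One way to picture the first step above (combining the TLBO teacher phase with the PSO velocity equation) is the sketch below. The coefficients, the extra teacher term, the velocity clamp, and the greedy personal-best scheme are assumptions for illustration, not the published hPSO-TLBO equations:

```python
import random

def sphere(x):
    # Test objective: minimum 0 at the origin.
    return sum(v * v for v in x)

def hybrid_velocity(x, v, pbest, gbest, mean, rng,
                    w=0.7, c1=1.5, c2=1.5, c3=1.0, vmax=4.0):
    # PSO velocity (inertia + cognitive + social) extended with a
    # TLBO-style teacher term pulling from the population mean toward
    # the best solution; tf is the TLBO teaching factor (1 or 2).
    tf = rng.choice([1, 2])
    new_v, new_x = [], []
    for d in range(len(x)):
        vd = (w * v[d]
              + c1 * rng.random() * (pbest[d] - x[d])
              + c2 * rng.random() * (gbest[d] - x[d])
              + c3 * rng.random() * (gbest[d] - tf * mean[d]))
        vd = max(-vmax, min(vmax, vd))  # clamp velocity for stability
        new_v.append(vd)
        new_x.append(x[d] + vd)
    return new_x, new_v

def hpso_tlbo_sketch(f, dim=4, pop=15, iters=150, seed=2):
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    V = [[0.0] * dim for _ in range(pop)]
    P = [list(x) for x in X]          # personal bests
    g = min(P, key=f)                 # global best
    for _ in range(iters):
        mean = [sum(col) / pop for col in zip(*X)]
        for i in range(pop):
            X[i], V[i] = hybrid_velocity(X[i], V[i], P[i], g, mean, rng)
            if f(X[i]) < f(P[i]):
                P[i] = list(X[i])
        g = min(P, key=f)
    return g

best = hpso_tlbo_sketch(sphere)
```

The personal bests make the global best monotone, so the hybrid step can only improve on the best initial sample even when individual particles wander.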
The clustering technique involves creating clusters and selecting a cluster head (CH) to which the sensor nodes, known as cluster members (CMs), connect. The CH receives data from its CMs, removes unnecessary data to conserve energy, compresses the remaining data, and transmits it to the base station over multi-hop routes to reduce network load. Since CMs only communicate with their CH and have a limited range, they avoid redundant information. However, the CH's routing, compression, and aggregation functions consume power quickly compared to other protocols, such as TPGF, LQEAR, MPRM, and P-LQCLR. To address energy usage in wireless sensor networks (WSNs), heterogeneous high-power nodes (HPNs) are used to balance energy consumption, and CHs close to the base station require effective algorithms for improvement. The cluster-based glow-worm optimization technique utilizes random clustering, distributed cluster leader selection, and link-based routing. Each cluster head routes data to the next group leader, balancing energy utilization in the WSN. This algorithm reduces energy consumption through multi-hop communication, cluster construction, and cluster head election, and the glow-worm optimization technique allows for faster convergence and improved multi-parameter selection. By combining these methods, a new routing scheme is proposed to extend the network's lifetime and balance energy in various environments. However, the proposed model consumes more energy than TPGF and other protocols for packets with a retransmission count of 0 or 1 in a 260-node network, mainly due to the short INFO packets sent during the neighbor discovery period and the increased hop count of the derived pathways. Simulations are conducted to evaluate the technique's throughput and energy efficiency.
- Keywords
- cluster head, glow-worm, multi-parameters, optimization, heterogeneous, retransmission ratio
- Publication type
- journal articles MeSH
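A toy sketch of glow-worm-style cluster-head election: each node carries a "luciferin" value updated from its fitness (here, residual energy), and each node joins the brightest node within its communication range. The update constants, node positions, and radius are illustrative assumptions, not the paper's protocol:

```python
import math

def luciferin_update(l, fitness, rho=0.4, gamma=0.6):
    # Classic glow-worm-style update: decay the old luciferin and
    # deposit an amount proportional to the node's fitness.
    return (1 - rho) * l + gamma * fitness

def elect_heads(nodes, radius=30.0):
    # Each node follows the brightest node within communication range
    # (possibly itself, in which case it becomes a cluster head).
    heads = {}
    for nid, (pos, _) in nodes.items():
        best = max(
            (m for m, (p, _) in nodes.items()
             if math.dist(pos, p) <= radius),
            key=lambda m: nodes[m][1],
        )
        heads[nid] = best
    return heads

# Three nodes: position and luciferin derived from residual energy.
nodes = {
    "a": ((0.0, 0.0), luciferin_update(1.0, 0.9)),
    "b": ((10.0, 0.0), luciferin_update(1.0, 0.3)),
    "c": ((100.0, 0.0), luciferin_update(1.0, 0.8)),
}
heads = elect_heads(nodes)
```

Node "a" outshines its neighbor "b" and heads their cluster, while the isolated node "c" leads itself; distributed election like this needs only local neighborhood information.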
This article introduces a new metaheuristic algorithm called the Serval Optimization Algorithm (SOA), which imitates the natural behavior of the serval. The fundamental inspiration of SOA is the serval's hunting strategy, which attacks selected prey and then pursues it in a chasing process. The steps of SOA implementation are mathematically modeled in two phases: exploration and exploitation. The capability of SOA to solve optimization problems is tested on thirty-nine standard benchmark functions from the CEC 2017 and CEC 2019 test suites. The proposed SOA approach is compared with twelve well-known metaheuristic algorithms for further evaluation. The optimization results show that the proposed SOA approach, owing to its appropriate balance between exploration and exploitation, provides better solutions for most of the mentioned benchmark functions and has superior performance compared to the competing algorithms. SOA implementation on the CEC 2011 test suite and four engineering design challenges shows the high efficiency of the proposed approach in handling real-world optimization applications.
- Keywords
- bio-inspired, engineering systems, exploitation, exploration, metaheuristic, optimization, serval
- Publication type
- journal articles MeSH
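The attack-then-chase idea in the abstract above can be pictured with the sketch below: an "attack" move toward the current best member (the prey), followed by a "chase" within a neighborhood that shrinks over time. The formulas, objective, and parameters are generic assumptions, not the published SOA model:

```python
import random

def f(x):
    # Separable test objective: minimum 0 at the origin.
    return sum(abs(v) for v in x)

def serval_style_step(x, prey, t, iters, rng):
    # "Attack": jump toward the selected prey (exploration).
    attack = [xi + rng.random() * (pi - rng.choice([1, 2]) * xi)
              for xi, pi in zip(x, prey)]
    # "Chase": refine within a neighborhood that shrinks over time
    # (exploitation).
    radius = 1.0 - t / iters
    return [ai + rng.uniform(-radius, radius) for ai in attack]

rng = random.Random(3)
dim, iters = 4, 150
pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(20)]
for t in range(iters):
    prey = min(pop, key=f)                # best member plays the prey
    for i, x in enumerate(pop):
        cand = serval_style_step(x, prey, t, iters, rng)
        if f(cand) < f(x):                # keep improving moves only
            pop[i] = cand
best = min(pop, key=f)
```

The shrinking chase radius mirrors the exploration-to-exploitation handover that the abstract credits for SOA's balance: large early jumps scan the space, late small moves polish the solution.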