Glossary

Computer Aided Engineering (CAE) - Analyzing an engineering design to calculate its performance. Analysis programs can calculate mechanical properties such as stress, fluid flow, vibration, and thermal dissipation, or electrical properties such as conductivity, resistance, voltage, and electric flux. Optimization frequently uses CAE analysis tools to calculate, and thereby optimize, these properties.

Computational Fluid Dynamics (CFD) - Modeling fluid or airflow in an engineering design.

Component Object Model (COM) interface - The interface that Microsoft Windows applications use to communicate with one another. Since eArtius creates only optimization products, Visual Optimizer is completely agnostic as to which program performs the analysis; eArtius focuses solely on optimization.

Constraints - Define allowable ranges of stress, deflection, frequency, and so on. Both minimum and maximum values can be specified.

Design of Experiments (DOE) - The process of determining which sample points to generate when creating response surfaces.
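
For illustration, the following minimal Python sketch shows two common DOE strategies for choosing sample points, a full factorial design and Latin hypercube sampling; the function names and parameters are assumptions made for this example and do not describe any particular tool.

    import itertools, random

    def full_factorial(levels_per_var):
        # Full factorial DOE: every combination of the chosen levels.
        # levels_per_var is a list of value lists, one per design variable.
        return list(itertools.product(*levels_per_var))

    def latin_hypercube(bounds, n_samples):
        # Latin hypercube DOE: spread n_samples points so that the range of
        # each design variable is covered evenly (one sample per stratum).
        dim = len(bounds)
        strata = [random.sample(range(n_samples), n_samples) for _ in range(dim)]
        samples = []
        for i in range(n_samples):
            point = []
            for d, (lo, hi) in enumerate(bounds):
                cell = strata[d][i]
                point.append(lo + (hi - lo) * (cell + random.random()) / n_samples)
            samples.append(point)
        return samples

    # Example: nine sample points over two design variables
    points = latin_hypercube([(0.0, 10.0), (1.0, 2.0)], 9)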

Design space - The entire range of all independent variables; that is, the measures or amounts that are directly controllable.

Design Variables - Define what can be changed in the model, such as the wall thickness, hole diameter, fillet radius, and so on.

Dynamically Dimensioned Response Surface Method (DDRSM) - allows direct optimization of expensive models with dozens to thousands of design variables. DDRSM is almost equally efficient for both low-dimensional and high-dimensional tasks.

Efficiency - In optimization, a measurement of how efficient the optimization algorithm is in finding Pareto Optimal solutions. It is generally expressed as the ratio of the number of evaluations performed to the number of Pareto Optimal points found. For example, eArtius's Pareto Navigator will generally need only 2-3 evaluations for each Pareto Optimal point found, while genetic algorithms may need 100 evaluations per Pareto Optimal point.

Evaluation - A call to evaluate a formula or call an analysis program to calculate stress or some other engineering value. Depending on the type of evaluation it can take anywhere from milliseconds to hours or even days. Reducing the number of evaluations is key to speeding up the optimization process.

Execute in parallel - This option controls how the design points will be executed during optimization. (This option is not available for all techniques.) If selected, design points will be executed in parallel, in small batches. The size of the batch depends on the number of selected Design Variables for numerical techniques or the size of the population for exploratory techniques.
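
The batching idea can be sketched in Python as follows; the use of a process pool, the stand-in evaluate function, and the fixed batch size are illustrative assumptions only, since the actual batch size is chosen as described above.

    from concurrent.futures import ProcessPoolExecutor

    def evaluate(design_point):
        # Stand-in for the (typically expensive) analysis call.
        return sum(x * x for x in design_point)

    def run_in_batches(design_points, batch_size=4):
        # Evaluate design points in parallel, in small batches,
        # rather than one at a time.
        results = []
        with ProcessPoolExecutor(max_workers=batch_size) as pool:
            for start in range(0, len(design_points), batch_size):
                batch = design_points[start:start + batch_size]
                results.extend(pool.map(evaluate, batch))
        return results

    if __name__ == "__main__":
        points = [[float(i), float(i) + 1.0] for i in range(10)]
        print(run_in_batches(points))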

Finite Element Analysis (FEA) - A technique used in engineering analysis that breaks the component being evaluated into small pieces (typically triangles), called finite elements, and performs the analysis by solving each of the small pieces.

Fair Evolutionary Multiobjective Optimizer (FEMO) - A genetic optimization algorithm.

Genetic Algorithm - Genetic algorithms attempt to find solutions to problems by mimicking biological evolutionary processes, with a cycle of random mutations yielding successive generations of "solutions". In optimization, Genetic Algorithms are currently the most popular optimization algorithms. They work by taking a large selection of random points that cover the full ranges of all the input parameters and evaluating each of them. The best points are then selected, a new generation of points is created from them by some heuristic, and that generation is evaluated in turn. Since the new points are chosen from the best of the previous generation, these algorithms improve with each generation of points. The algorithms generally run for a fixed number of generations, and the best points will generally be close to the Pareto Surface. Genetic Algorithms have three major weaknesses. First, the analysis must be done many times; if the analysis takes a long time, the total time required can be enormous, even years or hundreds of years. Second, although the heuristics find many Pareto optimal points and do not get stuck in local minima or maxima, there is no certainty that they find all the minima or maxima that might exist. Third, only a small percentage of the points found by Genetic Algorithms are Pareto optimal, and the rest can be located far away from the Pareto Frontier.
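
The generational loop described above can be sketched in a few lines of Python; this is a minimal single-objective illustration with assumed population size, selection, crossover, and mutation rules, not a description of any particular genetic algorithm implementation.

    import random

    def genetic_optimize(evaluate, bounds, pop_size=40, generations=50):
        # Random initial population covering the ranges of all input parameters.
        dim = len(bounds)
        pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        for _ in range(generations):
            scored = sorted(pop, key=evaluate)       # evaluate and rank each point
            parents = scored[:pop_size // 2]         # keep the best half
            children = []
            while len(parents) + len(children) < pop_size:
                a, b = random.sample(parents, 2)
                child = [random.choice(genes) for genes in zip(a, b)]   # crossover
                i = random.randrange(dim)                               # mutation
                lo, hi = bounds[i]
                child[i] = min(hi, max(lo, child[i] + random.gauss(0.0, 0.1 * (hi - lo))))
                children.append(child)
            pop = parents + children                 # next generation
        return min(pop, key=evaluate)

    # Example: minimize a simple two-variable function
    best = genetic_optimize(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2,
                            bounds=[(-5.0, 5.0), (-5.0, 5.0)])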

Global Pareto Frontier - A surface representing the global optimum in the area being evaluated, for example the highest peak in an area.

Gradient Analysis - In optimization, a method of finding an optimum by following the slope of the parameter or parameters being evaluated. Single-gradient methods have fallen into disfavor because they tend to get stuck in a local optimum and will not continue on to find the global optimum. eArtius's Concurrent Gradient Method is the first (and we believe the only) algorithm to use multiple gradients simultaneously.
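
A single-gradient search can be sketched as follows (illustrative Python, assuming one objective and a user-supplied gradient); a run started near a local optimum will stall there, which is the weakness noted above.

    def gradient_descent(grad_f, x0, step=0.05, iterations=200):
        # Single-gradient search: repeatedly step opposite the gradient,
        # i.e. in the direction of steepest descent of one objective.
        x = list(x0)
        for _ in range(iterations):
            g = grad_f(x)
            x = [xi - step * gi for xi, gi in zip(x, g)]
        return x

    # Example: minimize f(x, y) = (x - 3)^2 + y^2 with gradient (2(x - 3), 2y)
    x_opt = gradient_descent(lambda x: [2.0 * (x[0] - 3.0), 2.0 * x[1]], [0.0, 0.0])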

Heuristic - in optimization, a technique that uses a series of rules to determine how to find the optimal solution.

Infeasible design - A design that violates one or more design constraints.

Kriging - A response surface technique used to define a surface from limited sample data and to estimate values at unsampled locations. It is a two-stage process: (1) studying the gathered data to establish the predictability of values from place to place in the study area; this study results in a graph that models the difference between the value at one location and the value at another according to the distance and direction between them; (2) estimating values at those locations that have not been sampled. The basic technique, 'ordinary Kriging', uses a weighted average of neighboring samples to estimate the 'unknown' value at a given location.
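
The weighted-average estimate of ordinary Kriging can be sketched in Python as below; the exponential variogram model and its parameters are assumptions standing in for the model fitted in stage (1).

    import numpy as np

    def semivariogram(h, sill=1.0, rng=1.0, nugget=0.0):
        # Assumed exponential variogram model; in practice this is fitted
        # to the sampled data in stage (1).
        return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

    def ordinary_kriging(X, z, x0, **vario):
        # Estimate the value at the unsampled location x0 as a weighted
        # average of the sampled values z at locations X.
        n = len(X)
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
        A = np.zeros((n + 1, n + 1))
        A[:n, :n] = semivariogram(d, **vario)
        A[:n, n] = 1.0   # Lagrange multiplier row/column enforces
        A[n, :n] = 1.0   # that the weights sum to one
        b = np.append(semivariogram(np.linalg.norm(X - x0, axis=1), **vario), 1.0)
        w = np.linalg.solve(A, b)[:n]
        return w @ z     # Kriging estimate at x0

    # Example: estimate the centre of a unit square from its four corners
    X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    z = np.array([1.0, 2.0, 3.0, 4.0])
    estimate = ordinary_kriging(X, z, np.array([0.5, 0.5]), sill=1.0, rng=2.0)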

Local Pareto Frontier - A surface representing a local optimum, for example a peak that is higher than the area around it but is not the highest peak in the model being evaluated.

Monte Carlo Methods - In optimization, using random evaluations and simply picking the best one. There is no way to know whether the solution is optimal or not. A minimal sketch is shown below.

Multi-Objective Optimization - Optimizing multiple objectives to find the best tradeoffs. In multi-objective optimization there is not one best solution, but instead an infinite number of optimal solutions that lie on the Pareto Frontier. For example, there is a tradeoff between driving fast and getting good gas mileage; once the optimal solutions are found, one needs to pick the best tradeoff for the current situation.
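
The Monte Carlo approach above can be sketched in a few lines of Python (minimization and the function names are assumptions for this example): evaluate random points and keep the best one found.

    import random

    def monte_carlo_search(evaluate, bounds, n_evals=1000):
        # Evaluate random points and remember the best; there is no
        # guarantee that the result is optimal.
        best_x, best_f = None, float("inf")
        for _ in range(n_evals):
            x = [random.uniform(lo, hi) for lo, hi in bounds]
            fx = evaluate(x)
            if fx < best_f:
                best_x, best_f = x, fx
        return best_x, best_f

    # Example: search a simple two-variable function
    best_x, best_f = monte_carlo_search(lambda x: x[0] ** 2 + x[1] ** 2,
                                        bounds=[(-5.0, 5.0), (-5.0, 5.0)])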

Multi-Disciplinary Optimization (MDO) - Running optimization across multiple disciplines - for example optimizing stress, airflow, and vibration all at the same time.

Multi-Gradient Explorer (MGE) - The first (and we believe only) algorithm that can mathematically calculate how to move closer to the Pareto surface. The result is a smaller number of iterations and evaluations, as well as the ability to generate a solution within a specified tolerance. Since evaluations are typically the most time-consuming aspect of optimization, this is a huge advantage for Pareto Explorer.

Multi-Gradient Pathfinder (MGP) - allows the user to improve certain design variables while still maintaining an optimal design. This provides a quick and easy way to select which optimal design variant to use.

Normal Boundary Intersection (NBI) - An optimization algorithm, published in 1998, whose strength is its ability to produce an even spread of points on the Pareto surface.

Nondominated Sorting Genetic Algorithm (NSGA) - A genetic algorithm proposed in 1994, based on the concept of fitness sharing.

Nondominated Sorting Genetic Algorithm 2 (NSGA2) - A more advanced version of NSGA that does a better job of finding points that are spread across the Pareto Surface.

Objective - Also called the optimization criterion or optimization goal, the objective defines the goals of the optimization.

Objective space - The entire range of all dependent variables; not known until the design space has been explored.

Pareto efficiency - Also called Pareto optimality; an important concept originally defined in economics, now applied to engineering as well as game theory and the social sciences. The term is named after Vilfredo Pareto, an Italian economist who used the concept in his studies of economic efficiency and income distribution in the 1800s. Given a set of tradeoffs between goals such as performance and cost, or alternative allocations of, say, goods or income for a set of individuals, a movement from one allocation to another that makes at least one criterion or individual better off without making any other worse off is called a Pareto improvement. An allocation is Pareto efficient, or Pareto optimal, when no further Pareto improvements can be made.

Pareto Frontier (also called the Pareto Front or Pareto Surface) - The set of solutions that are all Pareto efficient; that is, no individual parameter can be improved without another parameter being made worse. Finding Pareto frontiers is particularly useful in engineering: by calculating a range of potentially optimal solutions, a designer can make focused tradeoffs within this constrained set of parameters, rather than needing to consider the full ranges of the parameters.
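
The dominance test that defines the frontier can be sketched in Python (assuming every objective is to be minimized; the function names are illustrative):

    def dominates(a, b):
        # True if design a is at least as good as b in every objective
        # and strictly better in at least one (minimization assumed).
        return (all(ai <= bi for ai, bi in zip(a, b))
                and any(ai < bi for ai, bi in zip(a, b)))

    def pareto_frontier(points):
        # Keep only the non-dominated (Pareto efficient) points.
        return [p for p in points
                if not any(dominates(q, p) for q in points if q is not p)]

    # Example: objective vectors (cost, weight) for four candidate designs
    frontier = pareto_frontier([(1.0, 9.0), (2.0, 7.0), (3.0, 8.0), (4.0, 6.0)])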

Pareto Optimal Solution - A solution where, in order to improve one criterion, other criteria must get worse. In product design, the Pareto surface is the mathematical embodiment of all possible optimal solutions; thus, the mathematics of optimization focuses on finding solutions close to this Pareto surface.

Pareto Surface - see Pareto Frontier

Process Integration and Design Optimization (PIDO) - A framework that runs an optimization process by connecting it to multiple CAD and CAE systems to change and evaluate the design. The design is modified and analyzed under the control of the PIDO.

Response Surface - A method utilizing a simple formula that can be used instead of a full evaluation. It is created by running a set of sample evaluations at the beginning and then using them to create the formula. A frequently used technique is to always use the same simple polynomial and calculate the coefficients that make the formula match the initial evaluations. This model is easy to estimate and apply, but it is an approximation whose accuracy is unknown: sometimes it will be a good approximation, other times it will be far off, and there is no way to tell which.
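
As an illustration, the sketch below fits a simple quadratic polynomial to a set of sample evaluations by least squares and returns it as a cheap surrogate; the quadratic basis and the function names are assumptions for this example, not the formula used by any particular tool.

    import numpy as np

    def fit_quadratic_surface(X, y):
        # Fit y ~ c0 + sum_i c_i*x_i + sum_{i<=j} c_ij*x_i*x_j from sample
        # evaluations (X: n x d design points, y: n responses).
        X = np.asarray(X); y = np.asarray(y)
        n, d = X.shape
        cols = [np.ones(n)]
        cols += [X[:, i] for i in range(d)]
        cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
        coeffs, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
        def surrogate(x):
            x = np.asarray(x, dtype=float)
            feats = [1.0] + list(x) + [x[i] * x[j] for i in range(d) for j in range(i, d)]
            return float(np.dot(coeffs, feats))
        return surrogate

    # Example: build the surrogate from 30 sample evaluations, then query it
    X = np.random.uniform(-1.0, 1.0, size=(30, 2))
    y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1] ** 2
    surface = fit_quadratic_surface(X, y)
    approx = surface([0.5, -0.2])   # cheap evaluation instead of a full analysis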

Simple Evolutionary Multiobjective Optimizer2 (SEMO2) - A genetic algorithm published in 2003 that uses an archive of variable size that stores all non-dominated individuals.

Sequential quadratic programming (SQP) - An early optimization algorithm that finds a step away from the current point by minimizing a quadratic model of the problem.

Strength Pareto Evolutionary Algorithm 2 (SPEA2) - A genetic optimization algorithm that incorporates a fine-grained fitness assignment strategy, a density estimation technique, and an enhanced archive truncation method.

Stochastic - Involving a random variable.

Taguchi Method - A simple optimization method that uses a fixed number of evaluations and a summation technique to calculate an improved solution.

Utopia Point - The theoretical optimum in objective space that is not achievable; it combines the best value of each objective function into a single point.
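
Assuming every objective is minimized, the utopia point can be computed from a table of evaluated objective values as in this small Python sketch (illustrative only):

    import numpy as np

    def utopia_point(objective_values):
        # Rows are evaluated designs, columns are objectives; take the best
        # (minimum) achieved value of each objective independently.
        return np.min(np.asarray(objective_values), axis=0)

    # Example: two designs evaluated on (weight, cost)
    print(utopia_point([[10.0, 300.0], [12.0, 250.0]]))   # -> [ 10. 250.]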

Weighted Sum Method - A method of approximating a multi-objective optimization problem with a single-objective optimization. It uses an artificial scalarization technique that substitutes a single-objective task for the multi-objective one, allowing traditional gradient-based methods, which are restricted to single-objective optimization, to attempt multi-objective optimization tasks. It is an approximation technique that sometimes produces excellent results but frequently does not, and it is not possible to know how close the approximation is.
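
A minimal Python sketch of the scalarization (the weights and function names are illustrative assumptions):

    def weighted_sum(objectives, weights):
        # Scalarize several objective functions into one:
        #   f(x) = sum_i w_i * f_i(x)
        def scalar_objective(x):
            return sum(w * f(x) for f, w in zip(objectives, weights))
        return scalar_objective

    # Example: trade off two competing objectives with a 70/30 weighting,
    # then hand the single objective to any single-objective optimizer.
    f = weighted_sum([lambda x: x[0] ** 2, lambda x: (x[0] - 2.0) ** 2], [0.7, 0.3])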
