
# Data Analysis Method: Mathematical Optimization for Decision Making

Optimization is the problem of finding the best decision, one that is both effective and efficient, whether the goal is a maximum or a minimum, by determining a satisfactory solution.

Optimization is not a new science. It has been developing ever since Newton, in the 17th century, discovered a method for finding roots. The science of optimization is still evolving today, both in its techniques and in its applications. Many everyday problems involve optimization in their solution, and new solution techniques continue to emerge. To mention a few: conic programming, semidefinite programming, semi-infinite programming, and various metaheuristic techniques.

Optimization plays an important role in the process of designing a system. With optimization, the design of a system can achieve lower cost, shorter processing time, and so on. Today, software support is often needed to solve the problems encountered, so that an optimal solution can be found without excessive computation time. Applications of optimization techniques have spread rapidly across many fields.

Successful application of optimization techniques requires at least three things: the ability to build a mathematical model of the problem at hand, knowledge of optimization techniques, and knowledge of computer programming. Optimization itself can be described as a set of mathematical formulations and numerical methods for finding and identifying the best candidate from a set of alternatives, without having to explicitly enumerate and evaluate every possible alternative.

Optimization is the process of maximizing or minimizing an objective function while respecting the existing constraints. A function is defined as a rule that assigns to each choice of value x a unique value y = f(x). Here x is the independent variable and y is the dependent variable. Mathematically, suppose we have a set S ⊂ R, where R is the set of all real numbers. We can define a transformation that assigns a numerical value to each x ∈ S. This relationship is called a scalar function f defined on the set S.
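As a minimal sketch of these definitions (the function and the set S are illustrative choices, not from the article):

```python
# A scalar function f defined on an illustrative set S = [0, 10].
def f(x):
    """Assigns to each x in S a unique value y = f(x)."""
    return (x - 3) ** 2 + 1

# x is the independent variable, y = f(x) the dependent variable.
y = f(2.0)  # here y == 2.0
```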

Optimization problems can be divided according to several categories:

1. Unconstrained optimization: if the function f is defined on all of S = R, then we have an unconstrained function, whether of one variable or of several.
2. Constrained optimization: if S is a proper subset of R, then we have a function defined on a constrained (feasible) region.
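The difference between the two categories can be sketched with the same objective minimized twice; the objective, grid, and feasible region below are illustrative assumptions:

```python
# Illustrative objective with unconstrained minimum at x = 3.
def f(x):
    return (x - 3.0) ** 2

# Unconstrained case: search a wide grid approximating S = R.
xs = [i * 0.001 for i in range(-10000, 10001)]  # x in [-10, 10]
x_unc = min(xs, key=f)   # lands near x = 3

# Constrained case: the same objective, restricted to S = [0, 2].
feasible = [x for x in xs if 0.0 <= x <= 2.0]
x_con = min(feasible, key=f)  # lands near x = 2, the boundary of S
```

Note how the constraint moves the optimum from the interior point x = 3 to the boundary of the feasible region, which is typical of constrained problems.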

In addition, optimization problems can also be grouped by the number of variables:

1. Optimization of one variable: the optimization problem with one variable is the simplest form of the optimization problem. Functions of one variable are central to optimization, both in theory and in practice, because this form is the one engineers face most often. In addition, a function of one variable usually appears as a sub-problem in the iterative procedures that solve multi-variable optimization problems. Given this important role, it is not surprising that many algorithms have been developed specifically for functions of one variable, even though everyday optimization problems are frequently multi-variable.
2. Multi-variable optimization: An optimization problem involving more than one variable.

Optimization problems can also be classified by the values the variables may take:

1. Continuous optimization problems: the value of x can be anything in the feasible region. Linear programming and quadratic programming are examples of continuous optimization.
2. Discrete optimization problems: the solution values are restricted to certain values, usually integers. In general, discrete optimization problems are harder to solve than continuous ones; integer programming is an example of discrete optimization. When no exact polynomial algorithm is known (one whose computation time is proportional to N^n, where N is the number of parameters sought and n an integer constant), the problem is said to be NP-hard: for an NP-hard problem there is no value of n for which the computation time is bounded by a polynomial of degree n.
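A tiny illustration of why discrete problems are harder: a 0/1 knapsack (an integer programming problem) solved by brute-force enumeration. The item data are made up for the example; the point is that the search space doubles with every extra item, which is the exponential growth behind the NP-hard remark above.

```python
from itertools import product

# Hypothetical data: (value, weight) of each item, and knapsack capacity.
items = [(60, 10), (100, 20), (120, 30)]
capacity = 50

best_value, best_choice = 0, None
# Enumerate all 2^n subsets: fine for 3 items, hopeless for thousands.
for choice in product([0, 1], repeat=len(items)):
    weight = sum(c * w for c, (v, w) in zip(choice, items))
    value = sum(c * v for c, (v, w) in zip(choice, items))
    if weight <= capacity and value > best_value:
        best_value, best_choice = value, choice

print(best_value, best_choice)  # -> 220 (0, 1, 1)
```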

More specifically, optimization problems can be categorized by the nature of the decision variables and of the objective and constraint functions, as follows:

1. Linear programming (LP): the objective and constraint functions are linear; the decision variables are continuous.
2. Nonlinear programming (NLP): the objective and/or constraint functions are nonlinear; the decision variables are continuous.
3. Integer programming (IP): the decision variables are integers.
4. Mixed integer linear programming (MILP): the objective and constraint functions are linear; the decision variables are a mix of integer and real.
5. Mixed integer nonlinear programming (MINLP): a nonlinear programming problem with both integer and continuous decision variables.
6. Discrete optimization: problems with discrete (integer) decision variables. These include integer programming, mixed integer linear programming, and mixed integer nonlinear programming.
7. Optimal control: the decision variables are vectors.
8. Stochastic programming or stochastic optimization: also often called optimization under uncertainty. Here the objective and constraint functions involve random variables, and therefore uncertainty.
9. Multi-objective optimization: an optimization problem with more than one objective; the objectives and constraints may be linear or nonlinear. This is the most difficult class to solve, because in practice it can involve hundreds or even thousands of constraints and variables, so a suitable computer program is usually needed to obtain an optimal result in terms of size, type, and quality of the output.
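One common way to approach the multi-objective case is to scalarize: combine the objectives into a single weighted sum and solve an ordinary single-objective problem. The two objectives and the weight below are illustrative assumptions, not from the article:

```python
# Two competing objectives over a design variable x in [0, 4] (illustrative).
def cost(x):
    return (x - 1.0) ** 2   # cheapest design near x = 1

def time_taken(x):
    return (x - 3.0) ** 2   # fastest design near x = 3

def weighted(x, w=0.5):
    # Weighted-sum scalarization of the two objectives.
    return w * cost(x) + (1 - w) * time_taken(x)

xs = [i * 0.001 for i in range(0, 4001)]
x_star = min(xs, key=weighted)
print(round(x_star, 3))  # -> 2.0, a compromise between the two optima
```

Varying the weight w traces out different trade-offs between the objectives; with equal weights the optimum here sits halfway between the two individual optima.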

For more on this topic, see: https://en.wikipedia.org/wiki/Multi-objective_optimization

I hope this article is useful for all of us.

Thanks (^_^)


Comment by Norberto J. Sanchez on July 16, 2018 at 4:06am

Thank you for the reply, Jeefri.

Comment by Jeefri A. Moka on July 15, 2018 at 7:55pm

Thank you for reading my article.

Regarding the chart above, I apologize that of course I cannot discuss it through this medium.
For further understanding, perhaps you can read my book.

and my website: https://exa.000webhostapp.com/

Comment by Norberto J. Sanchez on July 15, 2018 at 8:14am

Nice introduction to optimization. However, you used an image of charts that are not discussed in your blog. It would have been nice to see your interpretation of these charts. Maybe you will consider doing so in a future posting.