C# Numerical Computation: An Introduction to Simulated Annealing (1)

zhaozj2021-02-16

Summary

This article introduces the basic idea of simulated annealing and how to choose its main parameters. It then presents a concrete problem, finding the extremum of a two-dimensional function, together with its solution and the C# source code.

Overview

Combinatorial optimisation problems arise in large numbers in management science, computer science, molecular physics and biology, VLSI design, code design, image processing, electronic engineering, and other fields. For many of these problems, such as the travelling salesman problem, the graph colouring problem, the facility layout problem, and the wiring problem, no efficient polynomial-time algorithm is known; these problems have been proved to be NP-complete.

In 1982, Kirkpatrick introduced the idea of annealing into the field of combinatorial optimisation and proposed an algorithm for solving large-scale combinatorial optimisation problems, one that is particularly effective for NP-complete problems. The method is derived from the annealing process of solids: the temperature is first raised high and then lowered slowly (annealing), so that the solid reaches its lowest-energy state; if it is cooled rapidly instead (quenching), the lowest point is not reached.

The definition given in [1] is: "Simulated annealing is a technique which can be applied to any minimisation or learning process based on successive update steps (either random or deterministic) where the update step length is proportional to an arbitrarily set parameter which can play the role of a temperature. Then, in analogy with the annealing of metals, the temperature is made high in the early stages of the process for faster minimisation or learning, then is reduced for greater stability."

In other words, simulated annealing is a technique that can be applied to any minimisation or learning process based on successive update steps (random or deterministic), in which the update step length is proportional to a parameter that plays the role of a temperature. Then, by analogy with the annealing of metals, the temperature is kept high in the early stages so that minimisation or learning proceeds faster, and is then (slowly) lowered for greater stability.

The main idea of the simulated annealing algorithm

Taking the minimisation of a function of two variables as an example, the main idea of simulated annealing is this: perform a random walk over the search range (a 2D plane), randomly selecting new points, and let the random walk gradually converge towards a (near-)optimal solution. Temperature is the key control parameter of the Metropolis algorithm; its value can be thought of as controlling how quickly the walk moves towards a local or the global optimum.
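As a concrete sketch of this random-walk step (an illustration only, not the code of the next article), a candidate point on the 2D plane can be produced by perturbing the current point; in the spirit of the definition quoted above, the step length here is scaled by the temperature T. The class and method names are assumptions made for this sketch.

using System;

static class RandomWalk
{
    // Generate a candidate point near (x, y). The step length is scaled by the
    // temperature t, so the walk ranges widely at high temperature and only
    // locally at low temperature (an illustrative scaling choice).
    public static (double X, double Y) Neighbor(double x, double y, double t, Random rng)
    {
        double dx = (rng.NextDouble() * 2.0 - 1.0) * t;   // uniform in [-t, t]
        double dy = (rng.NextDouble() * 2.0 - 1.0) * t;
        return (x + dx, y + dy);
    }
}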

The cooling schedule, the neighbourhood structure with its new-solution generator, and the acceptance criterion with its random number generator (i.e. the Metropolis algorithm) together constitute the three pillars of the algorithm.
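In C#, these three pillars could be modelled as three delegates against which the annealing driver is written (a structural sketch only; the delegate names are assumptions, not taken from the article's source code):

using System;

// The three pillars of the algorithm, expressed as delegates (illustrative names).
delegate (double X, double Y) NeighborGenerator(double x, double y, double t, Random rng); // new-solution generator
delegate bool AcceptanceCriterion(double eOld, double eNew, double t, Random rng);         // Metropolis acceptance
delegate double CoolingStep(double t);                                                     // returns the next, lower temperature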

Importance sampling and the Metropolis algorithm

The Metropolis algorithm is an effective importance-sampling method. When the system changes from one energy E1 to another energy E2, the transition is accepted with probability p = exp[-(E2 - E1) / (kT)], where k is Boltzmann's constant and T is the temperature. If E2 < E1, i.e. the move lowers the energy, the new state is always accepted; otherwise it is accepted with probability p.

Under importance sampling, a new state that lowers the energy is always accepted (a downhill move, i.e. local search), while a state that raises the energy is accepted only with a certain probability (an uphill move, i.e. global search). Simulated annealing starts from an initial solution; after a large number of solution transformations at a given value of the control parameter T, a relatively good approximate solution of the combinatorial optimisation problem is obtained. The value of the control parameter T is then reduced and the Metropolis algorithm is repeated; when T finally tends to zero, the global optimum of the combinatorial optimisation problem is obtained. The value of the control parameter must decay slowly. Temperature is the key control parameter of the Metropolis algorithm, and simulated annealing can be viewed as iterating the Metropolis algorithm over a decreasing sequence of control-parameter values. At the initial value of T, fairly large deteriorations of the solution may be accepted; as T decreases, only smaller deteriorations are accepted; finally, as T tends to 0, no deterioration is accepted at all.
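A minimal C# sketch of this acceptance rule (the class and method names are assumptions of this sketch; Boltzmann's constant is folded into the temperature t):

using System;

static class Metropolis
{
    // Decide whether to accept a move from energy eOld to energy eNew at temperature t.
    // Downhill moves are always accepted; uphill moves are accepted with
    // probability exp(-(eNew - eOld) / t).
    public static bool Accept(double eOld, double eNew, double t, Random rng)
    {
        if (eNew < eOld)
            return true;                              // local search: always go downhill
        double p = Math.Exp(-(eNew - eOld) / t);      // deterioration accepted with probability p
        return rng.NextDouble() < p;                  // global search: sometimes go uphill
    }
}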

At an infinitely high temperature the system reaches its distribution at once and accepts every proposed transformation. The smaller the decrement of T, the longer it takes T to reach its end point; on the other hand, the Markov chain at each temperature can then be shorter, since less time is needed to reach the quasi-equilibrium distribution.

Parameter selection

The set of parameters that governs the progress of the simulated annealing method is called the cooling schedule. It controls the initial value of the control parameter T, its decay function, the length of the corresponding Markov chains, and the stopping condition; choosing these well is very important.

A cooling schedule should specify the following parameters (a short C# sketch of these four items follows the list):

1. The initial value T0 of the control parameter T;

2. The decay function of the control parameter T;

3. The length Lk of each Markov chain (i.e. how many iterations each random walk at a fixed temperature needs before it approaches the quasi-equilibrium distribution, a local convergence point);

4. The termination condition.
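As a sketch, these four items can be gathered into a single C# record (the type and field names are illustrative assumptions, not the article's own):

// The four cooling-schedule parameters in one place (illustrative sketch).
record CoolingSchedule(
    double T0,           // 1. initial value of the control parameter T
    double DecayFactor,  // 2. decay factor k of the decay function of T
    int ChainLength,     // 3. Lk, the length of each Markov chain
    double Tolerance);   // 4. termination: stop when successive best values differ by less than this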

Criteria for an effective cooling schedule:

1. The convergence of the algorithm: this depends mainly on the choice of the decay function, the length of the Markov chains, and the stopping criterion.

2. The experimental performance of the algorithm: the quality of the final solution and the CPU time required.

Parameter selection:

1) Selection of the initial value T0 of the control parameter

In general, the initial value T0 should be chosen sufficiently large, i.e. the system starts in a high-temperature state in which the Metropolis acceptance rate is close to 1.
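One way to realise "T0 large enough that the acceptance rate is about 1" in code (a heuristic sketch; the target rate of 0.98 and the sampling scheme are assumptions, not prescribed by the article): sample a number of random uphill moves, average their energy increase, and solve exp(-dE / T0) = targetRate for T0.

using System;

static class InitialTemperature
{
    // Estimate T0 so that an average uphill move of size meanDeltaE is accepted
    // with probability targetRate (close to 1): exp(-meanDeltaE / T0) = targetRate.
    public static double Estimate(Func<Random, double> sampleUphillDelta, Random rng,
                                  int samples = 100, double targetRate = 0.98)
    {
        double sum = 0.0;
        for (int i = 0; i < samples; i++)
            sum += sampleUphillDelta(rng);   // caller supplies |E2 - E1| for a random uphill move
        double meanDeltaE = sum / samples;
        return -meanDeltaE / Math.Log(targetRate);   // T0 = -dE / ln(targetRate)
    }
}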

2) Selection of the decay function

The decay function controls the speed at which the temperature is lowered. A commonly used function is T(n+1) = k * T(n), where k is a constant slightly less than 1 (i.e. very close to 1).
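For illustration, the temperature trajectory produced by this decay function is just a short loop (the values T0 = 100, k = 0.95, and the floor 0.01 are illustrative choices, not taken from the article):

using System;

static class CoolingDemo
{
    static void Main()
    {
        double t = 100.0;          // illustrative T0
        const double k = 0.95;     // decay factor, a constant slightly less than 1
        while (t > 0.01)           // a small floor standing in for "T tends to 0"
        {
            Console.WriteLine($"T = {t:F4}");
            t = k * t;             // the decay function T(n+1) = k * T(n)
        }
    }
}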

3) Selection of the length Lk of the Markov chain

The guiding principle is that, once the decay function of the control parameter T has been chosen, Lk should be large enough for the system to restore quasi-equilibrium at each value of the control parameter.

4) Termination condition

There are many possible termination conditions, and the choice has a large influence on both the performance of the algorithm and the quality of the solution. This article uses only one common termination condition: when the difference between the best solution found on the previous Markov chain and the best solution found on the current one is smaller than a given tolerance, the iteration over Markov chains can be stopped.
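Putting the pieces together, here is a minimal, self-contained C# sketch of simulated annealing for minimising a two-dimensional function, using exactly this tolerance-based termination test. It is not the source code promised for the next article: the test function f(x, y) = x^2 + y^2, all numeric constants, and the names are assumptions chosen only to illustrate the cooling schedule, the Markov chain of length Lk at each temperature, and the stopping rule.

using System;

static class SimulatedAnnealingDemo
{
    // Stand-in objective to minimise (global minimum 0 at the origin).
    static double F(double x, double y) => x * x + y * y;

    static void Main()
    {
        var rng = new Random(0);
        double t = 100.0;               // T0: high enough that nearly every move is accepted
        const double k = 0.95;          // decay factor of the cooling schedule
        const int chainLength = 200;    // Lk: moves per Markov chain (per temperature)
        const double tolerance = 1e-6;  // termination: change in best value between chains

        double x = 5.0, y = 5.0;        // arbitrary starting point
        double bestX = x, bestY = y, best = F(x, y);
        double prevBest = double.MaxValue;

        // Outer loop: one Markov chain per temperature value, then decay T.
        while (Math.Abs(prevBest - best) > tolerance)
        {
            prevBest = best;
            for (int i = 0; i < chainLength; i++)
            {
                // New-solution generator: step length scaled by the temperature.
                double nx = x + (rng.NextDouble() * 2.0 - 1.0) * t;
                double ny = y + (rng.NextDouble() * 2.0 - 1.0) * t;

                // Metropolis acceptance criterion.
                double dE = F(nx, ny) - F(x, y);
                if (dE < 0 || rng.NextDouble() < Math.Exp(-dE / t))
                {
                    x = nx; y = ny;
                    if (F(x, y) < best) { best = F(x, y); bestX = x; bestY = y; }
                }
            }
            t *= k;                      // T(n+1) = k * T(n)
        }

        Console.WriteLine($"Approximate minimum {best:E3} at ({bestX:F4}, {bestY:F4})");
    }
}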

Example

The explanation above may seem too abstract; the next article works through a concrete example, for which all of the source code will be given and from which many details can be learned.

References

1. Computer Dictionary Online, "simulated annealing": http://www.computer-dictionary-online.org/index.asp?q=simulated annealing

2. Numerical Recipes in C

3. Computational Methods Series: Non-Numerical Parallel Algorithms (Vol. 1), Simulated Annealing Algorithm
