SA (Simulated Annealing)

zhaozj, 2021-02-16

The concept of simulated annealing was proposed in the early 1980s in the study of combinatorial optimization problems. Its starting point is the similarity, rooted in physics, between the annealing of solids and general combinatorial optimization problems. When a solid is annealed, it is usually heated first so that its particles can move freely, and then cooled slowly, so that the particles gradually settle into a low-energy crystalline state. If the cooling near the freezing point is slow enough, the solid is certain to reach its lowest-energy state. Simulated annealing imitates this process in order to obtain the global (approximate) optimum of a combinatorial optimization problem.

Let E[{xi}] denote the internal energy of a system in the microscopic state {xi} ({xi} is a set of state variables: velocities, positions, etc.). For a given temperature T, if the system is in thermal equilibrium, then E[{xi}] follows the Boltzmann distribution, whose distribution function is:

f = C(T) · e^(−E[{xi}]/kT),   C(T) = 1 / ( e^(−E[{x1}]/kT) + e^(−E[{x2}]/kT) + … + e^(−E[{xn}]/kT) )

where k is the Boltzmann constant.
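
As a concrete illustration, here is a minimal sketch in Python that computes the normalized weights C(T) · e^(−E/kT) for a small set of states (the three energy levels and the choice kT = 1 are made-up example values, not from the original):

import math

def boltzmann_weights(energies, kT):
    """Normalized Boltzmann probabilities for a list of energy levels."""
    # Unnormalized weights e^(-E_j / kT)
    weights = [math.exp(-e / kT) for e in energies]
    # C(T) is the reciprocal of the sum of the unnormalized weights
    total = sum(weights)
    return [w / total for w in weights]

# Example: three states with energies 1.0, 2.0 and 3.0, at kT = 1.0 (arbitrary units)
print(boltzmann_weights([1.0, 2.0, 3.0], kT=1.0))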

As T falls, the internal energy E falls as well; if the rate of cooling is slow enough, the system can stay in thermal equilibrium and reach the lowest energy attainable at each temperature. When T = 0 (absolute zero), the energy reaches its minimum. Such a cooling process is an annealing process.

The Metropolis algorithm is often used during annealing; it simulates the thermal equilibrium of the system at temperature T.

Randomly select an initial state {xi}, then apply a random perturbation {Δxi}; the resulting change in internal energy is:

ΔE = E[{xi + Δxi}] − E[{xi}]

If ΔE < 0, the perturbation is accepted; otherwise it is accepted with probability e^(−ΔE/kT). If the perturbation is accepted, {xi + Δxi} replaces the original {xi}; otherwise a new perturbation is generated, and so on.

If this is repeated, {xi} gradually comes to follow the Boltzmann distribution described above.
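
A minimal sketch of one Metropolis step in Python, under the assumptions that the state is a plain list of numbers, that the perturbation is a small random change to one component, and that k is absorbed into T (these choices and the step size are illustrative, not from the original):

import math
import random

def metropolis_step(state, energy, energy_fn, T, step_size=0.5):
    """One Metropolis update: perturb the state and accept or reject the move."""
    # Apply a random perturbation {Δxi} to one randomly chosen component
    candidate = list(state)
    i = random.randrange(len(candidate))
    candidate[i] += random.uniform(-step_size, step_size)

    # ΔE = E[{xi + Δxi}] − E[{xi}]
    delta_e = energy_fn(candidate) - energy

    # Accept if ΔE < 0, otherwise accept with probability e^(−ΔE/T)
    if delta_e < 0 or random.random() < math.exp(-delta_e / T):
        return candidate, energy + delta_e
    return state, energy

# Example usage with a simple quadratic energy and T = 1.0
f = lambda x: sum(v * v for v in x)
print(metropolis_step([1.0, 2.0], f([1.0, 2.0]), f, T=1.0))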

If T is lowered gradually from a sufficiently large value, and the system reaches thermal equilibrium at each T, then by the time T = 0 is reached, E[{xi}] has attained its minimum. This is simulated annealing.

The idea of simulated annealing on a computer is to treat each possible combinatorial state as {xi}, the objective function as E, and T as a control parameter; letting T decrease gradually to 0 then yields the optimal value of the objective function.

The basic steps are:

Initialization: choose an initial state {xi}, set the initial temperature T(0), and compute the objective function E[{xi}].

1. Generate a random perturbation and compute ΔE = E[{xi + Δxi}] − E[{xi}].
2. If ΔE < 0, go to 4; otherwise generate a uniformly distributed random number Y.
3. If e^(−ΔE/T) <= Y, go to 1.
4. Replace {xi} with {xi + Δxi}, and replace E with E + ΔE.
5. Test whether the Metropolis sampling has become stable; if not, go to 1.
6. Decrease T.
7. If the stopping criterion is met, stop; otherwise go to 1.
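
A compact Python sketch of the loop above; the geometric cooling schedule, the fixed number of sweeps per temperature standing in for the stability test, and the stopping temperature are all illustrative assumptions, not part of the original description:

import math
import random

def simulated_annealing(energy_fn, initial_state, t0=10.0, t_min=1e-3,
                        cooling=0.95, sweeps_per_t=100, step_size=0.5):
    """Minimize energy_fn over a list-valued state with simulated annealing."""
    state = list(initial_state)
    energy = energy_fn(state)              # initialization: E[{xi}]
    t = t0                                 # initial temperature T(0)
    while t > t_min:                       # stop once T is low enough
        for _ in range(sweeps_per_t):      # crude stand-in for the stability test
            # 1. random perturbation and ΔE
            candidate = list(state)
            i = random.randrange(len(candidate))
            candidate[i] += random.uniform(-step_size, step_size)
            delta_e = energy_fn(candidate) - energy
            # 2-4. accept if ΔE < 0, otherwise with probability e^(−ΔE/T)
            if delta_e < 0 or random.random() < math.exp(-delta_e / t):
                state, energy = candidate, energy + delta_e
        t *= cooling                       # 6. decrease T
    return state, energy

# Example: minimize a simple quadratic with its minimum at (1, -2)
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
print(simulated_annealing(f, [5.0, 5.0]))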

Whether simulated annealing can reach the minimum of E depends on whether the initial temperature T(0) is sufficiently high, whether the final T is sufficiently low, and whether the Metropolis sampling is sufficiently stable at each T.

The typical feature of simulated annealing is that, besides accepting improvements in the objective function, it also accepts deteriorations within a limit. When T is large, large deteriorations are accepted; as T gradually decreases, only smaller deteriorations are accepted; and when T reaches 0, no deterioration is accepted at all. This feature means that, in contrast to plain local search, simulated annealing can escape local minima while retaining the generality and simplicity of local search. It is worth noting that when T is 0, simulated annealing degenerates into a special case of local search.
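
As a worked example, for a fixed deterioration ΔE = 1 (arbitrary units) the acceptance probability e^(−ΔE/T) is about e^(−1/10) ≈ 0.90 at T = 10, e^(−1) ≈ 0.37 at T = 1, and e^(−10) ≈ 4.5 × 10⁻⁵ at T = 0.1, dropping to 0 at T = 0.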

Please cite the original source when reposting: https://www.9cbs.com/read-17748.html
