In the winter of 1963, the MIT meteorologist Edward Lorenz was running a computer simulation of the Earth's atmosphere built from a set of thirteen equations. As usual, he was working at the meteorological computer in his office. Normally, he would simply enter temperature, humidity, pressure, and other meteorological data, and the computer would use the built-in differential equations to calculate the likely values at the next moment, simulating how the weather map would evolve. On this day, wanting to examine one result more closely and follow the subsequent changes in a particular record, he restarted a calculation partway through, typing in an intermediate value as it appeared on the printout: 0.506, rounded off from the fourth decimal place onward, instead of the 0.506127 the machine had actually used, and then set the computation running again. Computers in those days did not process data quickly; before the results came out, there was enough time for him to have a cup of coffee and chat with friends.

An hour later, when he came back from his coffee, he was astonished by what he saw: results that should have differed only slightly had diverged enormously, and the two curves, nearly identical at first, soon lost all resemblance to each other. He checked the computer again and found nothing wrong with it; the problem lay in the input data, a difference of just 0.000127, and that subtle discrepancy had produced a world of difference in the results. Lorenz realized that because such errors grow exponentially, a small error can, with the continued passage of time, lead to enormous consequences. He later raised this issue in a lecture. He argued that in the motion of the atmosphere, even very small errors and uncertainties of every kind can accumulate and be amplified step by step until they reshape large-scale atmospheric motion. For that reason, it is impossible to predict the weather over the long term.
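As a rough illustration of this exponential divergence (not a reconstruction of Lorenz's original thirteen-equation weather model), the sketch below integrates the well-known three-equation Lorenz system from two starting points whose first coordinate differs by exactly 0.000127. The parameter values (sigma = 10, rho = 28, beta = 8/3), the step size, and the initial state are the conventional textbook choices, picked here only for the sake of the example.

```python
# Sketch: sensitivity to initial conditions in the three-equation Lorenz system.
# Illustrative only; parameters, step size, and starting state are the usual
# textbook values, not Lorenz's original 1963 weather-model settings.

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(state, dt):
    """Advance the Lorenz equations by one fourth-order Runge-Kutta step."""
    k1 = lorenz(state)
    k2 = lorenz(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = lorenz(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = lorenz(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Two runs whose initial x-values differ by 0.000127, echoing the rounded input.
a = (0.506127, 1.0, 1.0)
b = (0.506,    1.0, 1.0)
dt = 0.01

for step in range(1, 4001):
    a = rk4_step(a, dt)
    b = rk4_step(b, dt)
    if step % 500 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {step * dt:5.1f}   separation = {gap:.6f}")
```

Running this, the separation between the two trajectories grows roughly exponentially at first and eventually becomes as large as the attractor itself: the two "forecasts" end up bearing no resemblance to one another, which is exactly the behavior Lorenz observed after his coffee break.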