Anon05/15/26, 07:27No.16976389
Since (1 + x/n) gives a first order change, say you want to rotate the complex number a + ib. Rotate it counterclockwise by an infinitesimal angle [math] \Delta/n [/math]: [math] (1+\tfrac{x}{n})(a+ib) = (a - \tfrac{\Delta}{n} b) + i(b + \tfrac{\Delta}{n} a) [/math]. Matching terms, x must equal [math] i\Delta [/math]. That's a first order change. To get infinite order, compound it n times and let n grow: the object that rotates a + ib by the angle [math] \Delta [/math] (usually written theta) is [math] (1 + \tfrac{i\Delta}{n})^n \to e^{i\Delta} [/math].
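A quick numerical sketch of that limit (the values Δ = 0.5 and z = 3 + 4i are arbitrary picks, not from the post):

```python
import cmath

delta = 0.5        # rotation angle in radians (arbitrary example value)
z = 3 + 4j         # the complex number a + ib to rotate
n = 1_000_000

step = 1 + 1j * delta / n          # the first order rotation (1 + i*delta/n)
rotated = z * step ** n            # compound the infinitesimal rotation n times
exact = z * cmath.exp(1j * delta)  # the infinite order rotation e^{i*delta}

print(abs(rotated - exact))        # tiny: (1 + i*delta/n)^n approaches e^{i*delta}
```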
Instead of the complex number a + ib, you could use the vector [a,b]^T, and you'd find that in (1 + x/n), the 1 is the 2x2 identity matrix I, and x is [math] \theta [/math] times the 2x2 matrix that squares to -I (the matrix playing the role of i).

If you independently sample from (almost) any distribution, then the larger the random sample, the more gaussian the sampling distribution of the mean becomes (look up the central limit theorem). The end of the proof relies on [math] (1 - \tfrac{ t^2/2}{n} )^n [/math], which goes to the gaussian [math] e^{-t^2/2} [/math]. Relatedly, among all distributions with a fixed variance, gaussians maximize entropy.

If you want to solve the differential equation [math] \tfrac{d^n}{dx^n} f(x) = f(x),\ f(0) = 1 [/math], you could guess that the n basic solutions satisfy [math] \tfrac{d}{dx} f(x) = a_k f(x) [/math] with [math] a_k^n = 1 [/math], meaning you'd like to find the various n-th roots of unity. The solutions of course are [math] e^{a_k x} [/math] where k runs from 0 to n-1. Lots of differential equations will often involve e^x in some way then: solutions to the heat/diffusion equation, solutions to wave equations, etc.

When n = 4, two of the roots are ±i, giving [math] e^{\pm ix} [/math], whose linear combinations are sin and cos (equivalently, sin and cos solve f'' = -f), meaning these two can be written in terms of e^x in some way. Fourier's theorem tells us that periodic functions can be written in terms of sines and cosines, meaning they can also be written in terms of e^x. Extend the period to infinite size and you can get the Fourier transform. Instead of sticking to only i in the Fourier transform, you can try any complex number s, and now you've got the Laplace transform.
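The matrix version can be checked the same way (a sketch; calling the matrix that squares to -I "J", and reusing arbitrary values theta = 0.5, v = [3, 4]^T):

```python
import numpy as np

I2 = np.eye(2)
J = np.array([[0.0, -1.0],
              [1.0,  0.0]])        # plays the role of i: J @ J == -I2

theta = 0.5
n = 1_000_000
v = np.array([3.0, 4.0])           # the vector [a, b]^T

step = I2 + (theta / n) * J        # first order rotation I + (theta/n) J
rotated = np.linalg.matrix_power(step, n) @ v
exact = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]]) @ v

print(J @ J)                       # -I2
print(np.max(np.abs(rotated - exact)))  # tiny: the limit is the rotation matrix
```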
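The last step of that CLT proof is easy to watch converge (t = 1.5 is an arbitrary test point):

```python
import math

# (1 - (t^2/2)/n)^n approaches the gaussian e^{-t^2/2} as n grows
t = 1.5
for n in (10, 100, 10_000):
    approx = (1 - (t * t / 2) / n) ** n
    print(n, approx, math.exp(-t * t / 2))
```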
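The roots-of-unity picture can also be verified numerically (a sketch using n = 4, where the roots include ±i):

```python
import cmath

n = 4
# the n-th roots of unity: e^{2*pi*i*k/n} for k = 0..n-1
roots = [cmath.exp(2j * cmath.pi * k / n) for k in range(n)]

for a in roots:
    # f(x) = e^{a x} has n-th derivative a^n * e^{a x},
    # so it solves the ODE exactly when a^n = 1
    print(a, a ** n)
```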
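Fourier's theorem in miniature: a square wave rebuilt from sines alone (the function name and the choice of a square wave are mine; its series is the classic (4/pi) * sum of sin(kx)/k over odd k):

```python
import math

def square_partial(x, terms=200):
    """Partial Fourier series of a square wave that is +1 on (0, pi)."""
    return (4 / math.pi) * sum(math.sin((2 * m + 1) * x) / (2 * m + 1)
                               for m in range(terms))

x = 1.0                        # a point where the square wave equals +1
print(square_partial(x))       # close to 1
```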