Fourier Analysis

Fourier analysis—encompassing the Fourier transform and the Fourier series—is one of the most powerful and important mathematical concepts ever conceived. It was invented in the early 1800s by Joseph Fourier to model heat flow through materials, but physicists and mathematicians soon realized that its implications were much further-reaching. It has revolutionized fields as diverse as acoustics, signal processing, image processing, econometrics, data compression, and the one we care about: optics.

Unless you’re feeling really confident, it’s probably a good idea to start with an intuitive description of Fourier analysis before you dive into the math. The key principle behind both the transform and the series is that any arbitrary function can be expressed as an infinite sum of sinusoidal waves. If this seems far-fetched, just watch the first few minutes of this video.

Because Fourier analysis is such an important concept, people have made many tools to help you understand it. This Java app allows you to play with the various sines and cosines in order to make all kinds of interesting waveforms, and it has a feature that very intuitively bridges the gap between the series and the transform. YouTuber 3Blue1Brown takes a totally different approach to teaching the Fourier transform than what I’ve described here.

= The Fourier Series =

Periodic functions can be represented by a Fourier series of the form

$$ f(x) = \sum_{n=0}^{\infty} a_n \cos (nkx) + b_n \sin (nkx), $$

where $a_n$ and $b_n$ are constants and $k$ is the fundamental wavenumber of oscillation. The method for finding $a_n$ and $b_n$ is remarkably clever, but beyond the scope of this paper. I highly recommend you read more about it here. The important thing is that, for each $n$, there’s an associated sinusoid.
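To make this concrete, here is a small NumPy sketch that recovers the coefficients of a square wave by projecting it onto each sinusoid. The square-wave example, grid size, and period-$2\pi$ normalization are my own choices for illustration, not anything canonical:

```python
import numpy as np

# Example: a square wave with period 2*pi, so the fundamental wavenumber k = 1.
N = 20000
dx = 2 * np.pi / N
x = np.arange(N) * dx
f = np.where(x < np.pi, 1.0, -1.0)

def coeffs(n):
    """Project f onto cos(nx) and sin(nx) with a plain Riemann sum."""
    a_n = np.sum(f * np.cos(n * x)) * dx / np.pi
    b_n = np.sum(f * np.sin(n * x)) * dx / np.pi
    return a_n, b_n

# Theory for this square wave: b_n = 4/(n*pi) for odd n, and ~0 for everything else.
for n in (1, 2, 3):
    print(n, coeffs(n))
```

Summing just the first few of these sinusoids already produces a recognizably square waveform, which is exactly the effect the interactive tools above let you explore.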

The Fourier series can also represent a finite region of a non-periodic function by assuming that the function simply repeats outside the region in question. In this case, the fundamental wavelength is twice the length of the function domain. That is, if you only care about the function $f(x)$ within the domain $0<x<L$, then the fundamental wavelength is $2L$ and the fundamental wavenumber is $\pi/L$. As the domain gets arbitrarily large, the fundamental wavenumber gets arbitrarily small, and the sum starts to look suspiciously like an integral. Enter the Fourier transform.

= The Fourier Transform =

The Fourier transform belongs to a family of operations called integral transforms. An ordinary function takes in a number and outputs another number; an integral transform takes in a function of one variable and outputs a function of a different variable. In the Fourier transform, the input is a function of space (or time), and the output is a function of wavenumber (or frequency) that represents the spectrum of the input function. It is defined as

$$ F(k) = \mathcal{F}[f(x)]_{x\rightarrow k} = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty}f(x)e^{ikx}\;dx. $$

Likewise, the original function is the inverse Fourier transform of the spectrum function:

$$ f(x) = \mathcal{F}^{-1}[F(k)]_{k\rightarrow x} = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty}F(k)e^{-ikx}\;dk. $$

The Fourier transform maps a function from its original space (often called "function space" or "real space") into a new space (often called "Fourier space," "frequency space," "reciprocal space," or "k-space"). In a way, $f(x)$ and $F(k)$ really are the same function, just viewed from different perspectives. Consequently, the total amount of "function-energy" is the same in either space. This principle is known as Parseval’s theorem, and can be expressed as

$$\int_{-\infty}^{\infty}|f(x)|^2\;dx = \int_{-\infty}^{\infty}|F(k)|^2\;dk.$$
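As a sanity check, Parseval’s theorem can be verified numerically. The sketch below (the Gaussian test function, grid, and brute-force evaluation of the transform integral are my own choices) computes $F(k)$ directly from the definition above and compares the two energies:

```python
import numpy as np

# Test function: a Gaussian, whose transform under this convention is
# F(k) = exp(-k**2 / 2), so both energies should equal sqrt(pi) ~ 1.7725.
x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2)

# Brute-force evaluation of F(k) = (1/sqrt(2*pi)) * integral of f(x) e^{ikx} dx.
k = np.linspace(-10, 10, 2001)
dk = k[1] - k[0]
F = np.array([np.sum(f * np.exp(1j * kk * x)) * dx for kk in k]) / np.sqrt(2 * np.pi)

energy_x = np.sum(np.abs(f)**2) * dx   # integral of |f(x)|^2
energy_k = np.sum(np.abs(F)**2) * dk   # integral of |F(k)|^2
print(energy_x, energy_k)
```

The two printed numbers agree to several decimal places, as Parseval’s theorem demands.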

In my opinion, the most fascinating aspect of the Fourier transform is when it shows up in physical systems. Over a century ago, Albert Michelson built a machine that performs a Fourier transform mechanically (i.e. using gears, springs, and pulleys). You can learn all about the "Harmonic Analyzer" here and I highly recommend you do. But even more awesome is the fact that a far-field diffraction pattern just so happens to be the 2-dimensional Fourier transform of the aperture that causes it. At first glance, there seems to be no reason at all that this should be the case. How about that!
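That diffraction fact is easy to play with numerically. The toy sketch below (the aperture shape and grid size are arbitrary choices of mine) treats a 2-D array as an opaque mask with a rectangular slit cut out, and takes its 2-D FFT as a stand-in for the far-field pattern:

```python
import numpy as np

# A 256x256 opaque mask with a rectangular slit cut out of the middle.
aperture = np.zeros((256, 256))
aperture[120:136, 96:160] = 1.0

# Far-field (Fraunhofer) pattern ~ 2-D Fourier transform of the aperture.
# fftshift moves the zero-frequency component to the center of the array.
far_field = np.fft.fftshift(np.fft.fft2(aperture))
intensity = np.abs(far_field)**2       # what a screen would record
```

A slice through the center of `intensity` shows the familiar sinc-squared single-slit fringes, with the bright central maximum at the zero-frequency point.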

= The Discrete Fourier Transform =

When I introduced the Fourier transform, I described the process of making the fundamental wavenumber smaller and smaller until the sum becomes an integral. If we were to stop right before that point, we would end up with something called a discrete Fourier transform (DFT). It’s technically a Fourier series, but the spacing between wavenumbers is small enough to create a reasonably sampled function in Fourier space.

One interesting property of the DFT is that the domain in Fourier space (i.e. the range of $k$ for which $F(k)$ is defined) is determined by the sample rate in real space, and the sample rate in Fourier space (i.e. the spacing between $k_n$ and $k_{n+1}$) is determined by the signal domain in real space. The reason for this is related to the Nyquist limit (see Signal Processing).
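This reciprocal relationship is baked into NumPy’s FFT helpers. A quick sketch (the values of `N` and `dx` are arbitrary choices of mine):

```python
import numpy as np

# Sample on N points with spacing dx; the DFT then lives on the
# wavenumbers k_n = 2*pi*fftfreq(N, dx).
N, dx = 256, 0.1
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

dk = k[1] - k[0]
print(dk)        # k-spacing is 2*pi/(N*dx): set by the real-space domain length
print(k.max())   # largest k sits just below the Nyquist wavenumber pi/dx: set by the sample rate
```

Shrinking `dx` (sampling faster) stretches the Fourier-space domain, while growing `N*dx` (a longer signal) packs the wavenumbers closer together—exactly the trade-off described above.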

Discrete Fourier transforms can be performed extremely quickly, thanks to methods such as the Cooley-Tukey algorithm. These algorithms are referred to as fast Fourier transforms (FFTs). They’re so fast, in fact, that many signal processors (such as the music visualizer shown here) can perform them in real time.
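To see what the FFT is accelerating, here is a naive $O(N^2)$ DFT written straight from the definition (using NumPy’s sign and normalization conventions, which differ from the symmetric $1/\sqrt{2\pi}$ convention above), checked against `np.fft.fft`, which computes the identical result in $O(N \log N)$:

```python
import numpy as np

def naive_dft(f):
    """O(N^2) DFT straight from the definition; an FFT gives the same answer in O(N log N)."""
    N = len(f)
    n = np.arange(N)
    # NumPy's convention: X_k = sum_n f_n * exp(-2j*pi*k*n/N)
    return np.array([np.sum(f * np.exp(-2j * np.pi * k * n / N)) for k in range(N)])

rng = np.random.default_rng(0)
f = rng.standard_normal(512)
assert np.allclose(naive_dft(f), np.fft.fft(f))   # identical spectra, wildly different cost
```

Even at modest sizes the speed gap is dramatic, and it only widens as $N$ grows—which is why real-time spectral displays are possible at all.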