Abstract: The Discrete Fourier Transform (DFT) is a fundamental component of numerous computational techniques in signal processing and scientific computing. The most popular means of computing the DFT is the Fast Fourier Transform (FFT). However, with the emergence of big data problems, in which the size of the processed data sets can easily exceed terabytes, the "Fast" in Fast Fourier Transform is often no longer fast enough. In addition, in many big data applications it is hard to acquire enough data to compute the desired Fourier transform in the first place. The Sparse Fourier Transform (SFT) addresses the big data setting by computing a compressed Fourier transform using only a subset of the input data, in time sub-linear in the data set size. The goal of this talk is to survey these recent developments, to explain the basic techniques with examples and applications in big data, to demonstrate trade-offs in the empirical performance of the algorithms, and to discuss the connection between the SFT and other techniques for massive data analysis such as streaming algorithms and compressive sensing.
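To make the "subset of the input data, sub-linear time" idea concrete, here is a minimal sketch (not from the talk) of one standard SFT building block: aliasing plus a time-shift phase estimate. Assuming a noiseless signal whose spectrum is 1-sparse at a single frequency `f_true`, the exact frequency of a length-`N` signal can be recovered from just `2*B` samples with `B << N`; all names (`f_true`, `B`, `L`) are illustrative choices, not notation from the abstract.

```python
import numpy as np

# Hypothetical 1-sparse example: recover the single active frequency of a
# length-N signal from only 2*B of its N samples.
N = 2**20          # nominal full signal length
f_true = 123_456   # the one active frequency (1-sparse, noiseless spectrum)
B = 64             # subsampled FFT size, B << N
L = N // B         # subsampling stride

n1 = np.arange(B) * L                               # evenly spaced positions
n2 = n1 + 1                                         # same positions, shifted by one
x = lambda n: np.exp(2j * np.pi * f_true * n / N)   # sample "oracle" for the signal

# Pass 1: a B-point FFT of the subsampled signal aliases frequency f into
# bin (f mod B), locating the frequency coarsely from B samples.
Y1 = np.fft.fft(x(n1))
b = int(np.argmax(np.abs(Y1)))          # equals f_true mod B

# Pass 2: the one-sample time shift multiplies the peak by exp(2*pi*1j*f/N),
# so the phase of the ratio pins down f exactly in the noiseless case.
Y2 = np.fft.fft(x(n2))
phase = np.angle(Y2[b] / Y1[b])
f_est = int(round(phase / (2 * np.pi) * N)) % N
print(f_est)   # recovers 123456 from 128 samples instead of 2**20
```

Real SFT algorithms extend this picture to k-sparse, noisy spectra by randomly permuting and filtering the spectrum so that, with high probability, each significant frequency lands alone in its bucket; the two-pass locate-and-estimate step above is then applied per bucket.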