Abstract: The Kullback–Leibler divergence (also called relative entropy) and its Rényi versions are measures of the difference between two probability distributions. In quantum information theory, a recently defined Rényi relative entropy for density matrices has found applications in problems such as strong converses and recoverability. I will present a recent work of Berta, Scholz and Tomamichel, in which they generalize the Rényi relative entropy from finite-dimensional matrices to arbitrary von Neumann algebras. The construction goes back to the noncommutative $L_p$ spaces for type III von Neumann algebras developed in the 1980s by Araki–Masuda, Haagerup, Hilsum, Kosaki and others. We will see that the basic structure of noncommutative $L_p$ spaces implies the data processing inequality for the generalized Rényi divergence.
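For concreteness, the finite-dimensional quantity being generalized is, presumably, the sandwiched Rényi relative entropy; the abstract does not state a formula, but the standard definition for density matrices $\rho, \sigma$ and $\alpha \in (0,1) \cup (1,\infty)$ reads:

$$
\widetilde{D}_{\alpha}(\rho \,\|\, \sigma)
  \;=\; \frac{1}{\alpha - 1}
  \log \operatorname{tr}\!\left[
    \left( \sigma^{\frac{1-\alpha}{2\alpha}} \, \rho \, \sigma^{\frac{1-\alpha}{2\alpha}} \right)^{\alpha}
  \right],
$$

and the data processing inequality asserts that for every quantum channel (completely positive, trace-preserving map) $\mathcal{N}$ and $\alpha \geq 1/2$,

$$
\widetilde{D}_{\alpha}\big( \mathcal{N}(\rho) \,\|\, \mathcal{N}(\sigma) \big)
  \;\leq\;
  \widetilde{D}_{\alpha}(\rho \,\|\, \sigma).
$$

In the von Neumann algebra setting, the powers of $\sigma$ appearing here are replaced by the structure of the noncommutative $L_p$ spaces, which is what makes the generalization in the talk possible.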