Abstract: Learning on graphs is an important problem in machine learning, computer vision, and data mining. Traditional algorithms for learning on graphs account only for low-order connectivity patterns described at the level of individual vertices and edges. However, in many applications, high-order relations among vertices are necessary to properly model a real-life problem. In contrast to the low-order case, in-depth algorithmic and analytic studies of high-order relations among vertices are still lacking. To address this problem, we introduce a new family of mathematical models, termed inhomogeneous hypergraphs, which captures high-order relations among vertices in a flexible and fine-grained manner. Specifically, whereas classic hypergraphs treat all vertices within a high-order structure uniformly, inhomogeneous hypergraphs allow one to model the fact that different subsets of vertices within a high-order relation may have different structural importance. We present a series of algorithmic and analytic results for this new model, including inhomogeneous hypergraph clustering, spectral hypergraph theory, and novel applications ranging from food-web and ranking analysis to subspace segmentation. All proposed algorithms come with provable performance guarantees and are evaluated on real datasets; the results demonstrate significant performance improvements compared to classical learning algorithms.
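As an illustration of the modeling difference (using notation introduced here for exposition only, not necessarily that of the main text), a classic hypergraph charges a single scalar weight $w_e$ whenever a hyperedge $e$ is cut by a vertex partition, whereas an inhomogeneous hyperedge carries a set function that may charge different amounts for different ways of splitting $e$:
\[
\text{classic:}\quad \mathrm{cost}(e,S) \;=\; w_e \cdot \mathbf{1}\{\emptyset \neq S\cap e \neq e\},
\qquad
\text{inhomogeneous:}\quad \mathrm{cost}(e,S) \;=\; w_e(S\cap e),\;\; w_e : 2^{e}\to\mathbb{R}_{\ge 0},
\]
where $S$ denotes one side of the partition. A clustering objective of the form $\min_{S}\sum_{e}\mathrm{cost}(e,S)$ then recovers the classic hypergraph cut in the special case where every $w_e(\cdot)$ is constant on all proper nonempty subsets of $e$.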