Sketching and random projection methods are a powerful set of techniques for speeding up computations in numerical linear algebra, statistics, machine learning, optimization, and data science. In this talk, we will discuss some of our work on developing a "big data" asymptotic perspective on sketching in the fundamental problems of linear regression and principal component analysis. This perspective can lead to remarkably clean and elegant mathematical results, which yield powerful insights into the performance of various sketching methods. As one highlight, orthogonal sketches such as the Subsampled Randomized Hadamard Transform (SRHT) are provably better than iid sketches such as Gaussian sketching. These results are obtained using deep recent tools from asymptotic random matrix theory and free probability, including asymptotically liberating sequences (Anderson & Farrell, 2014). This is based on joint work with Jonathan Lacotte, Sifan Liu, Mert Pilanci, David P. Woodruff, and Fan Yang.
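As a minimal illustration of the comparison described above, the sketch below solves a least-squares problem with two sketching matrices: an iid Gaussian sketch and an SRHT-style sketch (random signs, a normalized Hadamard transform, then uniform row subsampling). The dimensions, noise level, and sketch size are arbitrary assumptions for demonstration, not the setup analyzed in the talk, and the Hadamard transform is applied densely rather than via a fast recursive algorithm.

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed): n must be a power of 2 for the Hadamard matrix.
n, d, m = 1024, 20, 128  # rows, columns, sketch size
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# Exact least-squares solution, for reference.
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# (1) iid Gaussian sketch: S has iid N(0, 1/m) entries.
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_gauss, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# (2) SRHT-style sketch: S = sqrt(n/m) * P H D, where D is a random
# diagonal sign matrix, H the orthonormal Hadamard transform, and
# P a uniform row subsample.
D = rng.choice([-1.0, 1.0], size=n)
H = hadamard(n) / np.sqrt(n)  # orthogonal n x n Hadamard matrix
rows = rng.choice(n, size=m, replace=False)
SA = np.sqrt(n / m) * (H @ (D[:, None] * A))[rows]
Sb = np.sqrt(n / m) * (H @ (D * b))[rows]
x_srht, *_ = np.linalg.lstsq(SA, Sb, rcond=None)

print("Gaussian sketch error:", np.linalg.norm(x_gauss - x_exact))
print("SRHT sketch error:    ", np.linalg.norm(x_srht - x_exact))
```

Both sketched solves work with an m x d system instead of the original n x d one; the asymptotic theory discussed in the talk quantifies how the resulting errors compare as the dimensions grow proportionally.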