In high-dimensional statistics, estimation and inference often exploit the underlying signal structure. We consider sparse, low-rank, and shape-restricted signal structures across a variety of problems, and propose new approaches for estimating related quantities and conducting valid inference. The methods we develop apply to many real-world problems, such as gene expression analysis in genetics, portfolio management in finance, and A/B testing in industry.

Chapter 2 discusses selective inference for group-sparse linear models. We develop tools to construct confidence intervals and p-values for testing selected groups of variables in a linear model with group sparsity.

Chapter 3 studies one-dimensional isotonic regression, an example of shape-restricted nonparametric regression. We characterize the contractive property of the isotonic projection with respect to arbitrary norms and use it to analyze the convergence properties of isotonic regression.

Chapter 4 considers variable ranking in high-dimensional sparse linear regression with rare and weak signals. We propose a two-step approach that ranks variables so that signal variables tend to rank higher than noise variables.

Chapter 5 considers the problem of decomposing a large covariance matrix into a low-rank part plus a diagonally dominant part. We propose several algorithms for this task and demonstrate their usefulness in estimating large covariance matrices for high-dimensional data.

Chapter 6 discusses estimation and inference for zero-inflated semi-continuous data. We propose several machine learning approaches to estimate related quantities in both the one-sample and two-group settings.
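To make the isotonic projection of Chapter 3 concrete, the sketch below implements the classical pool adjacent violators algorithm (PAVA), which computes the Euclidean projection of a sequence onto the cone of nondecreasing sequences. This is the standard textbook algorithm, not the specific analysis developed in the chapter; the function name is ours.

```python
def isotonic_regression(y):
    """L2 isotonic projection via the pool adjacent violators algorithm.

    Returns the nondecreasing sequence closest to y in Euclidean norm.
    """
    # Each block stores [sum of pooled values, number of pooled points].
    blocks = []
    for v in y:
        blocks.append([v, 1])
        # Pool adjacent blocks while their means violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            s, n = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += n
    # Expand each block back to its pooled mean.
    fit = []
    for s, n in blocks:
        fit.extend([s / n] * n)
    return fit
```

For example, `isotonic_regression([3, 1, 2])` pools all three points into a single block with mean 2, illustrating how violations of monotonicity are averaged out.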
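As a rough illustration of the decomposition problem in Chapter 5, the sketch below alternates between a truncated eigendecomposition and a diagonal update to split a covariance matrix into a low-rank positive semidefinite part plus a diagonal part. This is a generic alternating heuristic in the spirit of minimum-residual factor analysis, under our own naming; the algorithms proposed in the chapter, which target a diagonally dominant (not merely diagonal) part, differ.

```python
import numpy as np

def low_rank_plus_diagonal(S, rank, n_iter=50):
    """Heuristic decomposition S ~ L + D with L low-rank PSD and D diagonal."""
    p = S.shape[0]
    d = np.zeros(p)
    for _ in range(n_iter):
        # Best rank-r PSD approximation of S - diag(d): keep the top
        # eigenpairs and clip negative eigenvalues to zero.
        w, V = np.linalg.eigh(S - np.diag(d))
        w_top = np.clip(w[-rank:], 0, None)
        L = (V[:, -rank:] * w_top) @ V[:, -rank:].T
        # Update the diagonal to absorb the residual on the diagonal.
        d = np.diag(S - L)
    return L, np.diag(d)
```

Alternating the two closed-form updates monotonically decreases the Frobenius-norm residual, which is why this style of scheme is a common baseline for covariance decomposition.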