The Bayesian approach to inference has become a central topic in statistics, owing to its ability to express uncertainty directly in probability space; the main computational hurdle is sampling from the posterior distribution. Markov chain Monte Carlo (MCMC) is the most popular technique, yet its theoretical properties are not fully understood in many settings. In this work, we study and propose several new sampling algorithms for three regimes: sampling discrete variables, sampling when the prior distribution is non-smooth or even unbounded, and sampling when the likelihood itself is analytically unavailable and computationally intractable. The methods developed here apply to many real-world problems, such as community detection and image uncertainty quantification.

Chapter 2 studies the computational complexity of a Metropolis-Hastings algorithm for Bayesian community detection. We first establish a posterior strong consistency result for a natural prior distribution on stochastic block models under the optimal signal-to-noise-ratio condition in the literature, and then give a set of conditions that guarantee rapid mixing of a simple Metropolis-Hastings Markov chain.

Chapter 3 proposes a novel sampling algorithm for the non-smooth posterior sampling problem, motivated by the unadjusted Langevin algorithm and Tweedie's posterior-mean formula. We provide a rigorous non-asymptotic convergence analysis and show that the algorithm outperforms existing alternatives in several respects.

Chapter 4 constructs an approximate Bayesian distribution via approximate Bayesian computation (ABC) that naturally incorporates prior information through a loss function when the likelihood is not accessible. Asymptotic contraction results and a Bernstein-von Mises-type property are proved for the proposed distribution under certain conditions.
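To fix ideas for the Chapter 2 setting, the following is a minimal sketch of single-site Metropolis-Hastings over community labels in a two-parameter stochastic block model (within-community edge probability `p`, between-community probability `q`, uniform prior over binary labelings). These modeling choices are illustrative only and are not the thesis's exact prior or signal-to-noise regime.

```python
import numpy as np

def log_posterior(z, A, p, q):
    """Log posterior (up to an additive constant) of labels z under a simple
    two-parameter SBM with a uniform prior over labelings (illustrative)."""
    n = len(z)
    lp = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            prob = p if z[i] == z[j] else q
            lp += np.log(prob) if A[i, j] else np.log(1.0 - prob)
    return lp

def mh_labels(A, p, q, n_iters=2000, seed=0):
    """Single-site Metropolis-Hastings: propose flipping one node's label,
    accept with the usual acceptance ratio (symmetric proposal)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    z = rng.integers(0, 2, size=n)
    lp = log_posterior(z, A, p, q)
    for _ in range(n_iters):
        i = rng.integers(n)
        z_new = z.copy()
        z_new[i] = 1 - z_new[i]
        lp_new = log_posterior(z_new, A, p, q)
        if np.log(rng.random()) < lp_new - lp:  # MH accept/reject step
            z, lp = z_new, lp_new
    return z, lp
```

The mixing-time question studied in the chapter is precisely how many such flip proposals are needed before the chain's law is close to this posterior.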
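For the Chapter 3 idea, a generic plug-and-play Langevin sketch shows how Tweedie's formula can stand in for an unavailable prior score: the residual of an MMSE denoiser at noise level `eps` equals the score of the `eps`-smoothed prior. A Gaussian N(0, tau^2) prior is used here only because its MMSE denoiser is known in closed form, which makes the identity checkable; a non-smooth prior would use a learned or proximal denoiser instead. This is a sketch under those assumptions, not the thesis's exact algorithm.

```python
import numpy as np

def ula_tweedie(y, tau, sigma, eps, gamma, n_steps, seed=0):
    """Unadjusted Langevin iteration where the prior score is the Tweedie
    residual (D_eps(x) - x) / eps of an MMSE denoiser D_eps.  For a Gaussian
    N(0, tau^2) prior, D_eps(x) = tau^2 / (tau^2 + eps) * x exactly."""
    rng = np.random.default_rng(seed)
    x = 0.0
    xs = []
    for _ in range(n_steps):
        denoised = tau**2 / (tau**2 + eps) * x   # closed-form MMSE denoiser
        prior_score = (denoised - x) / eps       # Tweedie's formula
        lik_score = (y - x) / sigma**2           # Gaussian likelihood score
        x = x + gamma * (lik_score + prior_score) \
            + np.sqrt(2 * gamma) * rng.standard_normal()
        xs.append(x)
    return np.array(xs)
```

In this conjugate toy model the posterior is N(tau^2 y / (tau^2 + sigma^2), ...), so the empirical mean of the chain can be compared against the exact answer.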
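For the Chapter 4 setting, plain rejection ABC illustrates the likelihood-free mechanism: draw a parameter from the prior, simulate data, and keep the draw when a summary statistic lands within a tolerance of the observed one, so a discrepancy (loss) replaces the exact likelihood. This generic sketch conveys the spirit of the chapter but is not its exact loss-based construction; all model choices in the usage below (Normal mean model, prior, tolerance) are illustrative.

```python
import numpy as np

def abc_rejection(y_obs, prior_sample, simulate, summary, tol, n_draws, seed=0):
    """Plain rejection ABC: keep prior draws whose simulated summary lies
    within tol of the observed summary (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    s_obs = summary(y_obs)
    kept = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        s_sim = summary(simulate(theta, rng))
        if abs(s_sim - s_obs) < tol:   # loss/discrepancy in place of likelihood
            kept.append(theta)
    return np.array(kept)
```

In a conjugate Normal mean model (y_i ~ N(theta, 1), prior theta ~ N(0, 1)), the accepted draws concentrate near the exact posterior mean n*ybar/(n+1), which is how the asymptotic contraction results of the chapter can be sanity-checked in simulation.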