Markov processes play an important role in reliability analysis, particularly in modeling the stochastic evolution of the survival/failure behavior of systems. The probability law of a Markov process is described by its generator, or transition rate matrix. In this paper, we suppose that the process is doubly stochastic in the sense that the generator itself is stochastic: the entries of the generator change with the state of another Markov process, which represents the random environment in which the system operates. The result is a Markov modulated Markov process, which can be formulated as a bivariate Markov process and analyzed probabilistically by standard Markovian methods. In this setting, however, we are interested in Bayesian inference on the model parameters. We present a computationally tractable approach based on Gibbs sampling and demonstrate it with numerical illustrations. We also discuss the cases of complete and partial data on both processes.
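As a minimal sketch of the kind of conjugate update that makes such a Gibbs scheme tractable (assuming, for illustration only, complete observation of both the system and the environment process, and independent Gamma priors on the modulated rates), suppose the system makes $n_{ij}(e)$ transitions from state $i$ to state $j$ while the environment is in state $e$, and accumulates total sojourn time $T_i(e)$ in state $i$ under environment $e$. The likelihood contribution of the rate $q_{ij}(e)$ is then proportional to $q_{ij}(e)^{n_{ij}(e)} \exp\{-q_{ij}(e)\, T_i(e)\}$, so that under a $\mathrm{Gamma}(\alpha_{ij}(e), \beta_{ij}(e))$ prior the full conditional is
$$
q_{ij}(e) \mid \text{data} \;\sim\; \mathrm{Gamma}\big(\alpha_{ij}(e) + n_{ij}(e),\; \beta_{ij}(e) + T_i(e)\big),
$$
from which each rate can be sampled directly within a Gibbs iteration; the counts and sojourn times here are illustrative sufficient statistics, not the paper's notation.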