    Bayesian Regression

    Bayesian linear regression is an approach to linear regression in which the statistical analysis is carried out within the framework of Bayesian inference. When the regression errors are normally distributed and a conjugate prior is assumed, the posterior probability distributions of the model's parameters are available in closed form.
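
    As a concrete illustration of the conjugate case (the notation below is supplied here and is not part of the original excerpt), write the model and prior as

    y = X\beta + \varepsilon, \qquad \varepsilon \sim \mathcal{N}(0, \sigma^2 I_n), \qquad
    \beta \mid \sigma^2 \sim \mathcal{N}(\mu_0, \sigma^2 \Lambda_0^{-1}), \qquad
    \sigma^2 \sim \mathrm{Inv\text{-}Gamma}(a_0, b_0).

    The posterior then stays in the same family, with hyperparameters

    \Lambda_n = X^\top X + \Lambda_0, \qquad
    \mu_n = \Lambda_n^{-1}(\Lambda_0 \mu_0 + X^\top y), \qquad
    a_n = a_0 + \tfrac{n}{2}, \qquad
    b_n = b_0 + \tfrac{1}{2}\bigl(y^\top y + \mu_0^\top \Lambda_0 \mu_0 - \mu_n^\top \Lambda_n \mu_n\bigr).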

    In the Bayesian approach, the data are supplemented with additional information in the form of a prior probability distribution. The prior belief about the parameters is combined with the data's likelihood function according to Bayes' theorem to yield the posterior belief about the parameters β and σ. The prior can take different functional forms depending on the domain and the information that is available a priori.
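
    A minimal sketch of this prior-to-posterior update, assuming the normal-inverse-gamma prior written out above (the function name posterior_nig and the hyperparameter names mu0, Lambda0, a0, b0 are illustrative and not from the original text):

    import numpy as np

    def posterior_nig(X, y, mu0, Lambda0, a0, b0):
        """Posterior hyperparameters for (beta, sigma^2) under the conjugate prior
        beta | sigma^2 ~ N(mu0, sigma^2 * inv(Lambda0)), sigma^2 ~ Inv-Gamma(a0, b0)."""
        n = len(y)
        # Combine the prior precision with the information in the data (X'X).
        Lambda_n = X.T @ X + Lambda0
        # Posterior mean: precision-weighted blend of the prior mean and the data fit.
        mu_n = np.linalg.solve(Lambda_n, Lambda0 @ mu0 + X.T @ y)
        # Updated inverse-gamma shape and scale for sigma^2.
        a_n = a0 + n / 2.0
        b_n = b0 + 0.5 * (y @ y + mu0 @ Lambda0 @ mu0 - mu_n @ Lambda_n @ mu_n)
        return mu_n, Lambda_n, a_n, b_n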

    The model evidence is the probability of the data given the model m. It is also known as the marginal likelihood and as the prior predictive density. The model is defined by the likelihood function and the prior distribution on the parameters. The model evidence captures in a single number how well such a model explains the observations. The model evidence of the Bayesian linear regression model presented in this section can be used to compare competing linear models by Bayesian model comparison. These models may differ in the number and values of the predictor variables as well as in their priors on the model parameters.
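
    Under the same conjugate model the evidence has a closed form, so a comparison of two candidate designs can be sketched as follows (again assuming the hypothetical posterior_nig helper from the previous sketch):

    import numpy as np
    from scipy.special import gammaln

    def log_evidence_nig(X, y, mu0, Lambda0, a0, b0):
        """Log marginal likelihood log p(y | m) for the conjugate linear model."""
        n = len(y)
        mu_n, Lambda_n, a_n, b_n = posterior_nig(X, y, mu0, Lambda0, a0, b0)
        _, logdet0 = np.linalg.slogdet(Lambda0)
        _, logdet_n = np.linalg.slogdet(Lambda_n)
        return (-0.5 * n * np.log(2.0 * np.pi)
                + 0.5 * (logdet0 - logdet_n)
                + a0 * np.log(b0) - a_n * np.log(b_n)
                + gammaln(a_n) - gammaln(a0))

    # Example: the model with the larger log evidence is preferred.
    rng = np.random.default_rng(0)
    x = rng.normal(size=50)
    y = 1.5 * x + rng.normal(scale=0.5, size=50)
    X1 = np.column_stack([np.ones(50), x])                   # intercept + x
    X2 = np.column_stack([X1, rng.normal(size=50)])          # plus an irrelevant predictor
    for X in (X1, X2):
        k = X.shape[1]
        print(log_evidence_nig(X, y, np.zeros(k), np.eye(k), 1.0, 1.0))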
