The Linear Ballistic Accumulator (LBA; Brown and Heathcote, 2008) model is used as a measurement tool to answer questions in applied psychology. Analyses based on the model depend on the model variant selected and on its estimated parameters. Modern approaches use hierarchical Bayesian models and Markov chain Monte Carlo (MCMC) methods to estimate the posterior distribution of the parameters. Although several approaches to model selection are available, they all rely on the posterior samples produced via MCMC, so model selection inference inherits the properties of the MCMC sampler. To improve on current approaches to LBA inference, we propose two methods based on recent advances in particle MCMC methodology; they are qualitatively different from existing approaches and from each other. The first is particle Metropolis-within-Gibbs; the second is density tempered sequential Monte Carlo. Both provide very efficient sampling and can be used to estimate the marginal likelihood, which yields Bayes factors for model selection. The first approach is usually faster. The second provides a direct estimate of the marginal likelihood, uses the first approach in its Markov move step, and parallelises very efficiently on high performance computers. The new methods are illustrated by applying them to simulated and real data, and through pseudo code. The code implementing the methods is freely available.
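To make the model concrete, the following is a minimal sketch of simulating a single LBA trial under the standard parameterisation (threshold, uniform start points, normally distributed drift rates, non-decision time). The function and parameter names (`b`, `A`, `v`, `s`, `t0`) are illustrative assumptions for this sketch, not the paper's code.

```python
import numpy as np

def simulate_lba_trial(b, A, v, s, t0, rng):
    """Simulate one LBA trial.

    b  : response threshold (common to all accumulators)
    A  : upper bound of the uniform start-point distribution
    v  : array of mean drift rates, one per accumulator
    s  : standard deviation of the drift-rate distribution
    t0 : non-decision time added to the winning accumulator's time
    """
    k = rng.uniform(0.0, A, size=len(v))      # random start points in [0, A)
    d = rng.normal(v, s)                      # trial-specific drift rates
    while np.all(d <= 0):                     # redraw if no accumulator moves upward
        d = rng.normal(v, s)
    # Time for each accumulator to travel from its start point to the threshold;
    # accumulators with non-positive drift never finish.
    t = np.where(d > 0, (b - k) / d, np.inf)
    choice = int(np.argmin(t))                # first accumulator to reach threshold
    rt = t0 + t[choice]                       # observed response time
    return choice, rt

rng = np.random.default_rng(1)
choice, rt = simulate_lba_trial(b=1.0, A=0.5, v=np.array([3.0, 1.0]),
                                s=1.0, t0=0.2, rng=rng)
```

Repeating such draws gives the joint choice/response-time data the hierarchical model is fit to; the marginal likelihoods the two proposed methods estimate then compare competing parameterisations via Bayes factors.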