• STATISTICS SEMINAR
  • Speaker: Dr Barrie Stokes, School of Medicine and Public Health, The University of Newcastle
  • Title: Nested Sampling and its "Central Problem"
  • Location: Room V105, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Time and Date: 3:00 pm, Fri, 29th Apr 2016
  • Abstract:

    Nested Sampling (NS) is a numerical algorithm for fitting models to data in the Bayesian setting, put forward by John Skilling in 2004. It has some advantages over Markov chain Monte Carlo algorithms: no starting-point issues, no burn-in, no proposal distributions.

    Nested Sampling calculates the evidence Pr[data | I] directly; posterior samples are, in some sense, a by-product.

    The "central problem" is the drawing of a likelihood-restricted prior sample at each compression step.

    Consideration of new methods for this sampling step has led to some work on equidistribution testing.
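
    As a concrete illustration of the "central problem", the simplest (and least efficient) way to draw from the prior subject to L(θ) > L* is brute-force rejection. The sketch below is in Python rather than Mathematica, with a made-up toy prior and likelihood assumed purely for illustration; it shows why rejection becomes impractical as the constraint tightens, which is what motivates the search for better sampling methods.

```python
import math
import random

random.seed(0)

def restricted_prior_draw(log_l, log_l_star, prior_draw, max_tries=100_000):
    """Draw from the prior subject to log L(theta) > log L*, by rejection.

    Brute-force rejection illustrates the 'central problem': the expected
    number of tries is 1 / (restricted prior mass), which grows without
    bound as the likelihood threshold L* rises at each compression step.
    """
    for _ in range(max_tries):
        theta = prior_draw()
        if log_l(theta) > log_l_star:
            return theta
    raise RuntimeError("no prior draw exceeded L* within max_tries")

# Toy setup (assumed for illustration): uniform prior on [0, 1],
# Gaussian-shaped log-likelihood centred at 0.5 with width 0.1.
log_l = lambda t: -0.5 * ((t - 0.5) / 0.1) ** 2
theta = restricted_prior_draw(log_l, -0.5, random.random)
```

    Here the constraint log L > -0.5 accepts only about 20% of prior draws; later compression steps accept exponentially fewer.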



  • STATISTICS SEMINAR
  • Speaker: Dr Barrie Stokes, School of Medicine and Public Health, The University of Newcastle
  • Title: Nested Sampling: How it Works and Why it's Good
  • Location: Room V101, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Time and Date: 2:00 pm, Fri, 6th Jun 2014
  • PhD Confirmation
  • Abstract:

    [Supervisors: Professor Irene Hudson, Dr Frank Tuyl, School of Mathematical and Physical Sciences]

    Nested Sampling (NS) is a computationally-intensive algorithm for fitting parametric statistical models to data in a Bayesian setting, first announced by its originator, John Skilling, in 2004.

    NS has several features that distinguish it from the large class of algorithms with the same purpose that fall under the heading of MCMC (Markov chain Monte Carlo).

    NS has as its principal aim the evaluation of the evidence Z, the denominator in the Bayes' Theorem expression for the posterior probability. Posterior samples are in a sense by-products of the process. NS requires no burn-in period and poses no starting-point problem. In principle it can deal with multimodal likelihoods and very large datasets.
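
    The core loop can be sketched in a few lines. The following is a minimal illustrative sketch in Python (not the Mathematica package the project develops), using the common deterministic approximation X_i ≈ exp(-i/N) for the remaining prior mass, and brute-force rejection for the restricted draw; the uniform-prior/Gaussian-likelihood toy problem is assumed for illustration.

```python
import math
import random

def nested_sampling(log_l, prior_draw, n_live=100, n_iter=600):
    """Minimal nested-sampling sketch: estimate the evidence Z."""
    live = [prior_draw() for _ in range(n_live)]
    log_ls = [log_l(t) for t in live]
    z = 0.0
    x_prev = 1.0
    for i in range(1, n_iter + 1):
        worst = min(range(n_live), key=lambda j: log_ls[j])
        log_l_star = log_ls[worst]
        x_i = math.exp(-i / n_live)                  # deterministic shrinkage X_i
        z += math.exp(log_l_star) * (x_prev - x_i)   # quadrature weight w_i = X_{i-1} - X_i
        x_prev = x_i
        # Replace the worst point with a likelihood-restricted prior draw
        # (brute-force rejection; practical implementations do this cleverly).
        while True:
            theta = prior_draw()
            if log_l(theta) > log_l_star:
                break
        live[worst] = theta
        log_ls[worst] = log_l(theta)
    # The surviving live points contribute the last sliver of prior mass.
    z += x_prev * sum(math.exp(l) for l in log_ls) / n_live
    return z

# Toy problem: uniform prior on [0, 1], Gaussian-shaped likelihood; the
# true evidence is the integral of L over the prior, about 0.2507.
random.seed(1)
log_l = lambda t: -0.5 * ((t - 0.5) / 0.1) ** 2
z = nested_sampling(log_l, random.random)
```

    Z accumulates the quadrature sum Σ L_i (X_{i-1} - X_i) as the likelihood constraint compresses the prior mass; any posterior samples would be read off from the discarded points, weighted by L_i w_i / Z.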

    The NS algorithm will be explained with the aid of Mathematica animated graphics, and some current applications of NS will be briefly mentioned.

    One of the aims of the project is to produce a general NS package written in Mathematica, which will be used for investigating the behaviour of NS. The package should be useful in furthering the use of NS in a variety of applications.

    All algorithm development, testing, and implementation, as well as thesis writing, are being done in Mathematica. Reasons for this will be offered.



  • STATISTICS SEMINAR
  • Speaker: Dr Barrie Stokes, School of Medicine and Public Health, The University of Newcastle
  • Title: Maximum Entropy Alternatives to Pearson Family Distributions
  • Location: Room V101, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Time and Date: 3:15 pm, Fri, 5th Aug 2011
  • Abstract:

    In the spirit of E. T. Jaynes' Maximum Entropy Principle, a (Bayesian) prior distribution constrained, say, by known moments should have the maximum entropy of all distributions satisfying those constraints. At a previous MaxEnt conference [MaxEnt 2009, Oxford, Mississippi] a method of obtaining MaxEnt univariate distributions under a variety of constraints was presented. The Mathematica function Interpolation[], normally used with numerical data, can also process "semi-symbolic" data, and the Lagrange multiplier equations were solved for a set of symbolic ordinates describing the required MaxEnt probability density function. We apply a developed version of this approach to finding MaxEnt distributions having prescribed beta1 (skewness squared) and beta2 (kurtosis) values, and compare the entropies of the MaxEnt distributions to those of the Pearson family distributions having the same beta1 and beta2. (This work was presented in poster form at the recent MaxEnt 2011 conference in Waterloo, Canada.)
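
    The seminar's method works symbolically in Mathematica via Interpolation[] and Lagrange multipliers. As a much simpler numerical illustration of the same Maximum Entropy Principle, the Python sketch below (with an assumed discrete support and a single mean constraint, rather than the beta1/beta2 constraints above) finds the MaxEnt pmf on {0, 1, ..., 10} with a prescribed mean: the Lagrange-multiplier argument gives the Gibbs form p_k ∝ exp(λ x_k), and λ is found by bisection.

```python
import math

def maxent_fixed_mean(support, target_mean, lo=-5.0, hi=5.0, tol=1e-12):
    """Max-entropy pmf on a finite support with a prescribed mean.

    The Lagrange-multiplier stationarity condition gives the exponential
    (Gibbs) form p_k proportional to exp(lam * x_k); lam is the unique
    value matching the mean constraint, found here by bisection (the
    mean is strictly increasing in lam).
    """
    def mean_for(lam):
        w = [math.exp(lam * x) for x in support]
        return sum(x * wi for x, wi in zip(support, w)) / sum(w)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

def entropy(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

support = list(range(11))            # assumed toy support {0, 1, ..., 10}
p = maxent_fixed_mean(support, 3.0)  # MaxEnt pmf with mean 3
```

    Any other pmf on the same support with the same mean, for example the uniform distribution on {0, ..., 6}, has strictly lower entropy than the Gibbs-form solution.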
