• CARMA RHD MEETING
  • Speaker: Prof David Bailey, Berkeley, California
  • Title: Discovery of large Poisson polynomials using a new arbitrary precision package
  • Location: Room V205, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Date: Tue, 10th Nov 2015


  • CARMA RHD MEETING
  • Speaker: Prof David Bailey, Berkeley, California
  • Title: A Parallelizable, High-Performance Arbitrary Precision Package
  • Location: Room V205, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Date: Thu, 12th Mar 2015


  • AMSI ACCESS GRID SEMINAR
  • Speaker: Prof David Bailey, Berkeley, California
  • Title: Fooling the masses: Reproducibility in high-performance computing
  • Location: Room V205, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Access Grid Venue: UNewcastle
  • Time and Date: 3:00 pm, Tue, 15th Jul 2014
  • Download: Flyer (416 KB)
  • Abstract:

    Reproducibility is emerging as a major issue for highly parallel computing, in much the same way (and for many of the same reasons) that it is emerging as an issue in other fields of science, technology and medicine, namely the growing number of cases where other researchers cannot reproduce published results. This talk will summarize a number of these issues, including the need to carefully document computational experiments, the growing concern over numerical reproducibility and, once again, the need for responsible reporting of performance. Have we learned the lessons of history?
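
    As a small illustration of the numerical reproducibility concern raised above (my sketch, not an example from the talk), the following Python snippet shows how merely reordering a floating-point summation, which routinely happens between parallel runs, can change the computed result:

        import math
        import random

        random.seed(1)
        # Mix widely varying magnitudes so rounding depends on summation order.
        data = [random.uniform(-1, 1) * 10 ** random.randint(0, 12)
                for _ in range(100_000)]

        s_forward = sum(data)            # one summation order
        s_reverse = sum(reversed(data))  # the same numbers, reversed
        s_exact = math.fsum(data)        # correctly rounded reference

        # The two orderings typically disagree in the trailing digits.
        print(s_forward - s_reverse, s_forward - s_exact)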



  • CARMA OANT SEMINAR
  • Speaker: Prof David Bailey, Berkeley, California
  • Title: Big data computing: Science and pseudoscience
  • Location: Room V205, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Access Grid Venue: UNewcastle
  • Time and Date: 3:00 pm, Fri, 11th Jul 2014
  • Download: Flyer (416 KB)
  • Abstract:

    The relentless advance of computer technology, a gift of Moore's Law, together with the data deluge available via the Internet and other sources, has been a boon to both scientific research and business/industry. Researchers in many fields are hard at work exploiting this data. The discipline of "machine learning," for instance, attempts to automatically classify, interpret and find patterns in big data. It has applications as diverse as supernova astronomy, protein molecule analysis, cybersecurity, medicine and finance. However, with this opportunity comes the danger of "statistical overfitting," namely attempting to find patterns in data beyond prudent limits, thus producing results that are statistically meaningless.

    The problem of statistical overfitting has recently been highlighted in mathematical finance. A just-published paper by the present author, Jonathan Borwein, Marcos Lopez de Prado and Jim Zhu, entitled "Pseudo-Mathematics and Financial Charlatanism," calls into question the present practice of using historical stock market data to "backtest" a new proposed investment strategy or exchange-traded fund. We demonstrate that it is in fact very easy to overfit stock market data, given the powerful computer technology available, and, further, that without disclosure of how many variations were tried in the design of a proposed investment strategy, it is impossible for potential investors to know if the strategy has been overfit. Hence, many published backtests are probably invalid, and this may explain why so many proposed investment strategies, which look great on paper, fall flat when actually deployed.
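
    The ease of overfitting can be seen in a toy simulation (a hedged sketch of the paper's point, not code from it): generate many purely random "strategies" and keep the best; its in-sample Sharpe ratio looks impressive even though every strategy is noise.

        import random
        import statistics

        random.seed(0)
        DAYS, STRATEGIES = 252, 1000  # one year of daily returns, 1000 trials

        def sharpe(returns):
            # Annualized Sharpe ratio of a daily return series.
            return statistics.mean(returns) / statistics.stdev(returns) * 252 ** 0.5

        # Each "strategy" is pure noise: zero-mean Gaussian daily returns.
        best = max(sharpe([random.gauss(0, 0.01) for _ in range(DAYS)])
                   for _ in range(STRATEGIES))
        print("Best in-sample Sharpe among noise strategies:", round(best, 2))
        # Typically near 3, purely by chance; out of sample it would collapse toward 0.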

    In general, we argue that not only do those who directly deal with "big data" need to be better aware of the methodological and statistical pitfalls of analyzing this data, but also that those who observe problems of this sort arising in their profession need to be more vocal about them. Otherwise, to quote our "Pseudo-Mathematics" paper, "Our silence is consent, making us accomplices in these abuses."



  • CARMA OANT SEMINAR
  • Speaker: Prof David Bailey, Berkeley, California
  • Title: Hand to hand combat with thousand-digit integrals
  • Location: Room V205, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Access Grid Venue: UNewcastle
  • Time and Date: 3:00 pm, Mon, 3rd Sep 2012
  • Abstract:

    A frequent theme of 21st century experimental math is the computer discovery of identities, typically done by computing some mathematical entity (a sum, limit, integral, etc.) to very high numeric precision, then using the PSLQ algorithm to identify the entity in terms of well-known constants.

    Perhaps the most successful application of this methodology has been to identify integrals arising in mathematical physics. This talk will present numerous examples of this type, including integrals from quantum field theory, Ising theory, random walks, 3D lattice problems, and even mouse brains. In some cases, it is necessary to compute these integrals to 3000-digit precision, and developing techniques to do such computations is a daunting technical challenge.
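
    As a small-scale illustration of this methodology (a sketch using the mpmath Python library, not the speaker's own tools), one can evaluate an integral to high precision and then let PSLQ recognize it:

        from mpmath import mp, quad, log, pi, pslq

        mp.dps = 50  # 50-digit working precision (the talk's examples reach 3000)

        # A classic example: integral_0^1 log(1+x)/(1+x^2) dx = pi*log(2)/8.
        val = quad(lambda x: log(1 + x) / (1 + x**2), [0, 1])

        # Ask PSLQ for an integer relation between the integral and pi*log(2).
        print(pslq([val, pi * log(2)]))  # [8, -1], i.e. 8*val - pi*log(2) = 0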



  • CARMA SEMINAR
  • Speaker: Prof David Bailey, Berkeley, California
  • Title: Normality and non-normality of mathematical constants
  • Location: Room V129, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Time and Date: 4:00 pm, Thu, 30th Aug 2012
  • Abstract:

    Given a positive integer b, we say that a mathematical constant alpha is "b-normal" or "normal base b" if every m-long string of digits appears in the base-b expansion of alpha with precisely the limiting frequency 1/b^m. Although it is well known from measure theory that almost all real numbers are b-normal for every integer b > 1, proving normality (or nonnormality) for specific constants, such as pi, e and log(2), has been very difficult.
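
    To make the definition concrete, here is a short sketch (mine, using the mpmath Python library) that tallies single-digit frequencies in the first 10,000 decimal digits of pi; for a 10-normal number each frequency should tend to 1/10:

        from collections import Counter
        from mpmath import mp

        N = 10_000       # number of decimal digits to examine
        mp.dps = N + 10  # working precision with some slack

        pi_str = mp.nstr(mp.pi, N + 2, strip_zeros=False)  # "3.14159..."
        digits = pi_str[2:N + 2]  # the first N digits after the decimal point

        freq = Counter(digits)
        for d in sorted(freq):
            print(d, freq[d] / N)  # each ratio should be close to 0.1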

    In the 21st century, a number of different approaches have been attempted on this problem. For example, a recent study employed a Poisson model of normality to conclude that, based on the first four trillion hexadecimal digits of pi, it is exceedingly unlikely that pi is not normal. In a similar vein, graphical techniques, in most cases based on digit-generated "random" walks, have been successfully employed to detect nonnormality in certain cases.

    On the analytical front, it was shown in 2001 that the normality of certain reals, including log(2) and pi (or any other constant given by a BBP formula), could be reduced to a question about the behavior of certain specific pseudorandom number generators. Subsequently, normality was established for an uncountable class of reals (the "Stoneham numbers"), the simplest of which is alpha_{2,3} = Sum_{n >= 0} 1/(3^n 2^(3^n)), which is provably normal base 2. Just as intriguing is a recent result that alpha_{2,3} is provably NOT normal base 6. These results have now been generalized to some extent, although many open cases remain.
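
    The base-6 misbehavior is easy to observe numerically. This sketch (mine, again via mpmath) computes alpha_{2,3} and prints a few hundred of its base-6 digits, in which long runs of zeros are visible:

        from mpmath import mp, mpf

        mp.dps = 300  # enough precision for a few hundred base-6 digits

        # alpha_{2,3} = Sum_{n >= 0} 1/(3^n 2^(3^n)); terms beyond n = 7
        # lie far below the working precision.
        alpha = sum(mpf(1) / (3**n * mpf(2) ** (3**n)) for n in range(8))

        # Peel off base-6 digits of the fractional part one at a time.
        x, digits = alpha, []
        for _ in range(240):
            x *= 6
            d = int(x)
            digits.append(str(d))
            x -= d
        print("".join(digits))  # note the growing blocks of zeros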

    In this talk I will present an introduction to the theory of normal numbers, including brief mention of new graphical and statistical techniques. I will then sketch a proof of the normality base 2 (and nonnormality base 6) of Stoneham numbers, and suggest some additional lines of research. Various parts of this research were conducted in collaboration with Richard Crandall, Jonathan and Peter Borwein, Francisco Aragon, Cristian Calude, Michael Dinneen, Monica Dumitrescu and Alex Yee.



  • CARMA COLLOQUIUM
  • Speaker: Prof David Bailey, Berkeley, California
  • Title: Hand-to-Hand Combat with Thousand-Digit Integrals
  • Location: Room V129, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Time and Date: 12:00 pm, Fri, 22nd Jul 2011
  • Abstract:

    One of the most effective avenues in recent experimental mathematics research is the computation of definite integrals to high precision, followed by the identification of the resulting numerical values as compact analytic formulas involving well-known constants and functions. In this talk we summarize several applications of this methodology in the realm of applied mathematics and mathematical physics, in particular Ising theory, "box integrals", and the study of random walks.
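
    As one tiny instance (a sketch with the mpmath Python library, not the talk's actual computations): the mean distance traveled in a two-step planar random walk evaluates numerically to 4/pi, and PSLQ confirms the identification.

        from mpmath import mp, quad, exp, fabs, pi, pslq

        mp.dps = 40

        # W_2 = integral_0^1 |1 + exp(2*pi*i*t)| dt, the expected distance after
        # two unit steps in uniformly random directions.  Split the interval at
        # t = 1/2, where the integrand has a kink.
        W2 = quad(lambda t: fabs(1 + exp(2j * pi * t)), [0, 0.5, 1])

        print(W2)                # 1.2732395...
        print(pslq([W2, 1/pi]))  # [1, -4], i.e. W2 = 4/pi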



  • CARMA ANALYSIS AND NUMBER THEORY SEMINAR
  • Speaker: Prof David Bailey, Berkeley, California
  • Title: The PSLQ Algorithm: Techniques for Efficient Computation
  • Location: Room V205, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Time and Date: 3:30 pm, Wed, 25th Aug 2010
  • Abstract:

    The PSLQ algorithm finds integer relations among a set of real numbers. In particular, if (x1, ..., xn) is a vector of real numbers, then PSLQ finds integers (a1, ..., an), not all zero, such that a1*x1 + a2*x2 + ... + an*xn = 0, if such integers exist. In practice, PSLQ constructs a sequence of matrices B_n such that if x is the original vector, then the reduced vector y = x * B_n tends to have smaller and smaller entries, until one entry is zero (or a very small number commensurate with the precision), at which point an integer relation has been detected. PSLQ also produces a sequence of bounds on the size of any possible integer relation; these bounds grow until either precision is exhausted or a relation has been detected.
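
    For example, given the numerical values of pi, arctan(1/5) and arctan(1/239), PSLQ recovers Machin's formula. Here is a minimal sketch using the pslq routine in the mpmath Python library (not the multi-level implementation discussed in the talk):

        from mpmath import mp, mpf, atan, pi, pslq

        mp.dps = 50  # ample precision for so small a relation

        x = [pi, atan(mpf(1)/5), atan(mpf(1)/239)]
        print(pslq(x))  # [1, -16, 4] (up to sign):
                        # pi - 16*atan(1/5) + 4*atan(1/239) = 0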

  • Download: Talk slides (1.4 MB)


  • CARMA ANALYSIS AND NUMBER THEORY SEMINAR
  • Speaker: Prof David Bailey, Berkeley, California
  • Title: High-Precision Numerical Integration and Experimental Mathematics
  • Location: Room V205, Mathematics Building (Callaghan Campus) The University of Newcastle
  • Time and Date: 3:30 pm, Wed, 18th Aug 2010
  • Abstract:

    Computation of definite integrals to high precision (typically several hundred digits) has emerged as a particularly fruitful tool for experimental mathematics. In many cases, integrals with no known analytic evaluations have been experimentally evaluated (pending subsequent formal proof) by applying techniques such as the PSLQ integer relation algorithm to the output numerical values. In other cases, intriguing linear relations have been found among a class of related integrals, relations which have subsequently been proven as instances of more general results. In this lecture, Bailey will introduce the two principal algorithms used for high-precision integration, namely Gaussian quadrature and tanh-sinh quadrature, with some details on efficient computer implementations. He will also present numerous examples of new mathematical results obtained, in part, by using these methods.
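
    As a brief sketch of the two methods in practice (using the mpmath Python library, which implements both, rather than the speaker's own package), consider an integral whose integrand blows up at an endpoint:

        from mpmath import mp, quad, log

        mp.dps = 100

        # integral_0^1 (log x)^2 dx = 2 exactly; the integrand is singular at
        # x = 0, a situation tanh-sinh quadrature is designed to handle.
        f = lambda x: log(x)**2
        ts = quad(f, [0, 1], method='tanh-sinh')
        gl = quad(f, [0, 1], method='gauss-legendre')

        print(ts - 2)  # tanh-sinh error: essentially zero at 100 digits
        print(gl - 2)  # Gauss-Legendre typically loses accuracy here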

    In a subsequent lecture, Bailey will discuss the PSLQ algorithm and give the details of efficient multi-level and parallel implementations.

  • Download: Seminar slides