
A probabilistic programming approach to the accuracy and stability of numerical algorithms


If you have a question about this talk, please contact George A Constantinides.

(Practice talk for Asilomar 2019)

We propose adopting the framework of probabilistic programming to quantify the accuracy and stability of low-precision numerical algorithms. Specifically, we model rounding errors as noise terms; that is, we treat each arithmetic operation as a probabilistic operation and reason about numerical algorithms as probabilistic programs. In contrast with traditional approaches to error analysis, which focus on worst-case errors, this framework provides methods to reason about the confidence in the computation, which often validates the use of low-precision computations.
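As a minimal illustration of this idea (a sketch under stated assumptions, not the implementation presented in the talk), each arithmetic operation can be given a multiplicative noise term following the standard rounding model fl(x op y) = (x op y)(1 + delta), with delta drawn uniformly from [-u, u]; sampling the resulting probabilistic program yields an empirical error distribution:

    import numpy as np

    U = 2.0 ** -11  # unit roundoff of a hypothetical low-precision format

    def noisy(x, rng):
        # Model rounding as a probabilistic operation: fl(x) = x * (1 + delta).
        return x * (1.0 + rng.uniform(-U, U))

    def horner_prob(coeffs, x, rng):
        # Horner evaluation of a polynomial, with every * and + rounded noisily.
        acc = coeffs[0]
        for c in coeffs[1:]:
            acc = noisy(noisy(acc * x, rng) + c, rng)
        return acc

    rng = np.random.default_rng(0)
    coeffs = [1.0, -0.5, 1.0 / 6.0]          # arbitrary example polynomial
    exact = np.polyval(coeffs, 0.3)
    errors = np.array([horner_prob(coeffs, 0.3, rng) for _ in range(100_000)]) - exact

    lo, hi = np.quantile(errors, [0.005, 0.995])   # empirical 99% interval
    print(f"99% of rounding errors lie in [{lo:.3e}, {hi:.3e}]")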

For programs involving a small number of operations, such as implementations of transcendental functions or simple ray-tracing algorithms, we can compute the probability density function of the output explicitly and provide confidence intervals on the error by direct integration. For larger programs, such as classification algorithms or model predictive controllers, we aim to provide confidence intervals based on concentration of measure inequalities.
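As a hedged illustration of the second approach, a standard concentration bound such as Hoeffding's inequality (used here only as a stand-in for the specific inequalities discussed in the talk) already gives a distribution-free confidence interval: if a program accumulates n independent, zero-mean rounding errors, each bounded in magnitude by b, then with probability at least 1 - eta the total error is at most b * sqrt(2 n ln(2/eta)). A quick numerical check of that bound:

    import numpy as np

    n, b, eta = 1_000, 2.0 ** -11, 0.01   # hypothetical error count, bound, and risk level

    # Hoeffding: P(|sum of errors| >= t) <= 2 * exp(-t**2 / (2 * n * b**2)),
    # so the accumulated error stays below t with probability at least 1 - eta.
    t = b * np.sqrt(2.0 * n * np.log(2.0 / eta))

    rng = np.random.default_rng(1)
    totals = rng.uniform(-b, b, size=(20_000, n)).sum(axis=1)
    print(f"Hoeffding bound: {t:.3e}")
    print(f"empirical {100 * (1 - eta):.0f}th percentile of |error|: "
          f"{np.quantile(np.abs(totals), 1 - eta):.3e}")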

This talk is part of the CAS Talks series.
