My research focuses on computational methods for Bayesian inference and machine learning. My most recent project analyses the optimality of a class of Monte Carlo algorithms called parallel tempering (PT), also known as replica exchange (REM). PT was developed by physicists in the 1980s to efficiently simulate spin glasses. It was later adapted as a general framework for improving the mixing of MCMC algorithms sampling from complicated multimodal distributions by exploiting parallel computing architectures. PT is still considered a state-of-the-art technique in the toolbox of computational scientists, statisticians, and data scientists.
I am currently analysing the scaling properties of reversible and non-reversible variants of PT and using this analysis to determine their efficiency. We have established tight upper bounds on the optimal performance of these algorithms, as well as a black-box approach to hyperparameter tuning.
We are currently preparing a manuscript outlining this work for submission. If you are interested in learning more, please feel free to contact me!
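As a toy illustration of the general idea (not the algorithm or implementation from the manuscript), here is a minimal sketch of a reversible PT sampler in Python. The one-dimensional bimodal target, the temperature schedule, and the proposal scale are all illustrative assumptions:

```python
import math
import random

random.seed(1)

def log_target(x):
    # Illustrative bimodal target: equal mixture of unit-variance
    # Gaussians centred at -3 and +3 (an assumption for this sketch).
    return math.log(0.5 * math.exp(-0.5 * (x - 3.0) ** 2)
                    + 0.5 * math.exp(-0.5 * (x + 3.0) ** 2))

# Annealing schedule: beta = 1 is the target; smaller beta flattens it.
betas = [1.0, 0.5, 0.25, 0.1]
chains = [0.0] * len(betas)

def pt_step(chains):
    # Exploration: one random-walk Metropolis move per tempered chain.
    for i, beta in enumerate(betas):
        prop = chains[i] + random.gauss(0.0, 1.0)
        log_acc = beta * (log_target(prop) - log_target(chains[i]))
        if random.random() < math.exp(min(0.0, log_acc)):
            chains[i] = prop
    # Communication: propose swapping the states of adjacent temperatures.
    for i in range(len(betas) - 1):
        log_acc = (betas[i] - betas[i + 1]) * (
            log_target(chains[i + 1]) - log_target(chains[i]))
        if random.random() < math.exp(min(0.0, log_acc)):
            chains[i], chains[i + 1] = chains[i + 1], chains[i]
    return chains

samples = []
for _ in range(5000):
    chains = pt_step(chains)
    samples.append(chains[0])  # record only the beta = 1 chain
```

The hot (small-beta) chains cross between modes easily, and the swap moves propagate those crossings down to the beta = 1 chain, which is what lets PT mix across modes that would trap a single Metropolis chain.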
Master's Thesis
My master's research was in probability theory and stochastic analysis. In particular, I studied a class of stochastic partial differential equations that arise when studying the limiting behaviour of evolutionary systems with spatial interactions.
Projects
I wrote some expository papers as projects for various graduate courses I have taken. I post them here in the hope that someone might find them interesting.
- Analysis of continuous time vs discrete time modelling for mammal tracking.
[2018] Submitted for STAT 548: PhD Qualifying Course
In this report we summarize the paper "When to be discrete: the importance of time formulation in understanding animal movement" by McClintock et al. and compare and contrast the roles of discrete- versus continuous-time modelling of mammal tracking.
- Self Concordance for Empirical Likelihood.
[2017] Submitted for STAT 548: PhD Qualifying Course
In this report we introduce the empirical likelihood function in non-parametric statistics, and discuss its properties as well as the state-of-the-art methods used to numerically approximate it.
- Toward genetically generated ensembles of neural networks.
[2016] Submitted for CPSC 540: Machine Learning
In this paper we develop a genetic algorithm that finds an optimal structure and fine-tunes the hyperparameters of a neural network, constructing an ensemble classifier in the process. The project and Matlab implementation can be found at:
github.com/s-syed/Genetic-NN.git
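As a language-agnostic illustration of the search strategy (the actual implementation linked above is in Matlab), here is a minimal Python sketch. The search space and the stand-in fitness function are hypothetical; a real run would train and validate a network for each genome:

```python
import random

random.seed(0)

# Hypothetical search space: each genome encodes one network's hyperparameters.
SPACE = {"layers": [1, 2, 3], "units": [8, 16, 32, 64], "lr": [1e-3, 1e-2, 1e-1]}

def random_genome():
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(g):
    # Stand-in for validation accuracy; a real run would train the network.
    return g["layers"] * 0.1 + g["units"] / 64 - abs(g["lr"] - 1e-2)

def mutate(g):
    # Resample one randomly chosen hyperparameter.
    child = dict(g)
    key = random.choice(list(SPACE))
    child[key] = random.choice(SPACE[key])
    return child

def evolve(generations=20, pop_size=10, keep=3):
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:keep]  # survivors double as the ensemble members
        pop = elite + [mutate(random.choice(elite)) for _ in range(pop_size - keep)]
    return sorted(pop, key=fitness, reverse=True)[:keep]

ensemble = evolve()
```

The elite individuals retained at the end are diverse, well-performing configurations, which is what makes them natural members of an ensemble classifier.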
- An introduction to, wait for it... the Renewal Process.
[2014] Submitted for AMATH 777: Stochastic processes in the physical sciences
This paper provides an introduction to the renewal process, its properties, the renewal theorem, and some applications.
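For context, the elementary renewal theorem the paper builds toward can be stated as follows (standard notation, not taken from the paper itself):

```latex
% Renewal process driven by i.i.d. positive inter-arrival times X_1, X_2, ...
% with finite mean \mu = \mathbb{E}[X_1]:
%   S_n = X_1 + \cdots + X_n, \qquad N(t) = \max\{ n \ge 0 : S_n \le t \}.
% Elementary renewal theorem:
\[
  \lim_{t \to \infty} \frac{\mathbb{E}[N(t)]}{t} = \frac{1}{\mu}.
\]
```

Informally: in the long run, renewals occur at rate one per mean inter-arrival time.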
- Mathematical methods to capture shape.
[2013] Submitted for AMATH 875: General relativity for cosmology
We give an overview of how one can use the metric space structure, curvature (sectional, Gauss, Ricci, Weyl), and the holonomy group to determine information about the shape of a pseudo-Riemannian manifold. We focus on building the geometric intuition over technical details.
- Numerical explorations of the dynamics of FRW cosmologies.
[2013] Submitted for AMATH 875: General relativity for cosmology
After deriving the Friedmann equations, we study how the scale factor evolves in different epochs and in the presence of curvature.
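For reference, the Friedmann equations in question, in their standard form for an FRW metric with curvature parameter k (units with c = 1, no cosmological constant), are:

```latex
\[
  \left( \frac{\dot a}{a} \right)^2 = \frac{8 \pi G}{3}\,\rho - \frac{k}{a^2},
  \qquad
  \frac{\ddot a}{a} = - \frac{4 \pi G}{3} \left( \rho + 3p \right),
\]
% where a(t) is the scale factor, \rho the energy density, and p the pressure.
```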
- Statistical methods for quantum state tomography.
[2013] Submitted for AMATH 876: Open quantum systems
We provide a brief introduction to quantum state tomography and the commonly used frequentist approach based on maximum likelihood estimators (MLE). We then outline the more sophisticated Bayesian framework with Bayesian mean estimators (BME) and analyze its properties and shortcomings.
- Characteristic functions and the central limit theorem.
[2013] Submitted for PMATH 800: Topics in real and complex analysis: The probabilistic method
This is an expository paper on the central limit theorem and its generalizations. We build up the theory of characteristic functions, provide a rigorous proof of the Lindeberg-Lévy-Feller central limit theorem, and present some applications.
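The key hypothesis of that theorem is the Lindeberg condition. In standard notation (not taken from the paper), for independent centred random variables X_k with variances \sigma_k^2 and s_n^2 = \sigma_1^2 + \cdots + \sigma_n^2, it reads:

```latex
\[
  \forall \varepsilon > 0: \quad
  \frac{1}{s_n^2} \sum_{k=1}^{n}
  \mathbb{E}\!\left[ X_k^2 \, \mathbf{1}\{ |X_k| > \varepsilon s_n \} \right]
  \xrightarrow[n \to \infty]{} 0
  \quad \Longrightarrow \quad
  \frac{1}{s_n} \sum_{k=1}^{n} X_k \xrightarrow{d} \mathcal{N}(0, 1).
\]
```

The condition rules out any single summand contributing a non-negligible share of the total variance, which is exactly what Gaussian limiting behaviour requires.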
Presentations
Here are notes from some of the presentations I have given in the last few years.