My research involves studying how large datasets from cosmic microwave background (CMB) experiments and galaxy surveys can constrain fundamental physics. I use numerical computation, data analysis, and statistical methods such as Bayesian inference and Monte Carlo techniques.
Extreme data compression and super-fast parameter estimation
Using Karhunen-Loève methods, I have developed a data compression algorithm that estimates model parameters (marginalized likelihoods) in under a minute, much faster than Markov Chain Monte Carlo (MCMC) methods. The compression automatically marginalizes over all other parameters: instead of evaluating the full likelihood over the whole parameter space, we evaluate it only for the parameter of interest. We therefore achieve extreme data compression (EC) by i) compressing the entire dataset into just a few numbers, and ii) reducing the dimensionality of the parameter space that needs to be explored. You can find the paper here.
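The idea can be illustrated with a minimal sketch of Karhunen-Loève (MOPED-style) compression on a toy model; the model, noise covariance, and parameter values below are purely illustrative, not taken from the paper:

```python
# Toy Karhunen-Loève / MOPED-style compression: a straight line with
# unknown slope `a`, observed with Gaussian noise (hypothetical example).
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(0.0, 1.0, 50)           # design points
C = 0.1 * np.eye(50)                    # assumed noise covariance
Cinv = np.linalg.inv(C)

def mean(a):
    """Model prediction for slope parameter a."""
    return a * x

dmu = x                                 # derivative of the mean w.r.t. a

# Compression weight vector: b = C^{-1} mu_,a / sqrt(mu_,a^T C^{-1} mu_,a),
# normalized so the compressed statistic has unit variance
b = Cinv @ dmu / np.sqrt(dmu @ Cinv @ dmu)

# Simulate data at true slope a = 2 and compress the 50 numbers into one
data = mean(2.0) + rng.multivariate_normal(np.zeros(50), C)
y = b @ data

# The compressed likelihood depends only on the slope, so a 1-D scan
# (rather than a full multi-dimensional exploration) recovers it
grid = np.linspace(1.0, 3.0, 201)
logL = -0.5 * (y - np.array([b @ mean(a) for a in grid])) ** 2
a_hat = grid[np.argmax(logL)]
print(a_hat)
```

The compressed statistic carries (to linear order) the same Fisher information about the slope as the full dataset, which is why the scan over a single parameter suffices.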
Constraining neutrino masses with galaxy surveys
As part of my PhD thesis, I studied the effects of massive neutrinos and different dark energy models on large-scale structure, in particular on the angular clustering of galaxies in bins of photometric redshift in the Dark Energy Survey (DES). I developed a likelihood analysis pipeline to test how well DES will be able to constrain the sum of the neutrino masses, and carried out a joint analysis combining mock data from DES and the Planck CMB experiment. You can find the paper here.
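The backbone of such a pipeline is sampling a posterior over cosmological parameters given mock data. The sketch below shows the pattern with a toy Gaussian model and a simple Metropolis-Hastings sampler; the data model and numbers are invented for illustration, not the DES or Planck likelihoods:

```python
# Toy likelihood-analysis sketch: constrain one parameter from mock data
# with Metropolis-Hastings MCMC (illustrative Gaussian model only).
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mock "observation": 100 noisy measurements of a parameter
true_value, sigma = 0.3, 0.5
mock = rng.normal(true_value, sigma, size=100)

def log_like(theta):
    """Gaussian log-likelihood of the mock data given parameter theta."""
    return -0.5 * np.sum((mock - theta) ** 2) / sigma**2

# Metropolis-Hastings chain with a Gaussian proposal
chain, theta = [], 0.0
lp = log_like(theta)
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.1)
    lp_prop = log_like(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    chain.append(theta)

samples = np.array(chain[5000:])               # discard burn-in
post_mean, post_std = samples.mean(), samples.std()
print(post_mean, post_std)
```

Combining datasets (e.g. DES and Planck mocks) amounts to summing their log-likelihoods inside `log_like`, since the experiments are independent.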
I am interested in:
- Neutrino effects on structure formation
- Lossless data compression algorithms
- Physics of the cosmic microwave background
- Nature of dark matter and dark energy
- Primordial gravitational waves as signatures of inflation
- Cosmic reionization in the 21 cm line