Many parameters, many problems.
Statistical inference is the process by which the properties of a population are inferred from a sample. In cosmology, this translates to learning the properties of the Universe from a handful of galaxies. The difficulty of this exercise depends greatly on the number of properties we want to learn, i.e. the number of parameters of our model. In this talk I will motivate why future cosmological models will inevitably possess hundreds, if not thousands, of parameters, describe how the traditional techniques we currently use to constrain our models become inefficient in this number of dimensions, and propose possible solutions. In particular, I will present our recent work (2301.11895 & 2301.11978) developing analytical marginalization schemes for “nuisance” parameters in the context of 3x2pt analyses. I will also discuss our ongoing work developing gradient-based marginalization techniques to solve the problem numerically.