I work in computational statistics, focusing on the computational and theoretical properties of estimators computed using numerical approximations, which is a lot of estimators. Right now I am working on mixed models and additive models.
I supervise students in the Department of Statistics and Actuarial Science at the University of Waterloo. Here's how to join my group:
Doctoral: do one of the following two things:
Apply to the PhD program in our department. Say in your statement of purpose that you're interested in working with me. If your application passes the initial committee review, I will see it, and contact you if there is a suitable opportunity.
Or, if you're already a master's student in our department, ask me to supervise your essay. Indicate as early as possible that you are interested in a PhD, so we can make sure we're on the same page. This does not guarantee your admission, nor does it commit you to working with me; however, it is an excellent way to assess mutual fit, since by then we will already know each other and have worked together a bit. Both my current students were admitted this way.
Master's: apply to the master's program in our department. Once you have been admitted and started your classes, you can email me directly to say you're interested in having me supervise your research essay. We will meet and chat, and if it's a good fit, we'll go ahead. Any time up to and including February is fine. I take no more than 3 students per year, so starting early improves your chances.
Undergraduate: apply for MURA and/or NSERC funding; see the department website for the procedure. You can email me directly to ask me to supervise, or I may contact you if the department sends me your resume.
Publications related to what I would describe as the "core" of my research program, without student trainees. An asterisk (*) indicates alphabetical authorship.
B. Bilodeau*, A. Stringer*, Y. Tang* (2022). Stochastic Convergence Rates and Applications of Adaptive Quadrature in Bayesian Inference. Journal of the American Statistical Association. Accepted.
A. Stringer, P. Brown, J. Stafford (2023). Fast, Scalable Approximations to Posterior Distributions in Extended Latent Gaussian Models. Journal of Computational and Graphical Statistics. 32(1):84–98.
A. Stringer, P. Brown, J. Stafford (2021). Approximate Bayesian Inference for Case Crossover Models. Biometrics. 77(3):785–795.
An underlined name indicates a student or other trainee, including student collaborators.
Z. Zhang, A. Stringer, P. Brown, J. Stafford (2023). Bayesian Inference for Cox Proportional Hazard Models with Partial Likelihoods, Nonlinear Covariate Effects and Correlated Observations. Statistical Methods in Medical Research. 32(1):165–180.
Z. Zhang, A. Stringer, P. Brown, J. Stafford (2023). Model-based Smoothing with Integrated Wiener Processes and Overlapping Splines. Journal of Computational and Graphical Statistics. Accepted.
An asterisk (*) indicates alphabetical authorship.
G. McGee*, A. Stringer* (2023). Marginal additive models for population-averaged inference in longitudinal and cluster-correlated data. Scandinavian Journal of Statistics. Accepted.
A. Stringer (2023). Identifiability constraints in generalized additive models. Canadian Journal of Statistics. Accepted.
A. Stringer (2022). Automatic differentiation of Box-Cox transformations with application to random effects models for continuous non-normal response. Statistics and Computing. 32:89.
B. Bilodeau, Y. Tang, A. Stringer (2023). On the tightness of the Laplace approximation for statistical inference. Statistics and Probability Letters. 198:109839.
A. Stringer (2023). Approximate Marginal Likelihood Inference in Mixed Models for Grouped Data. arXiv:2310.01589 [stat.ME]
A. Stringer, T. Akkaya Hocagil, R. Cook, L. Ryan, S.W. Jacobson, J.L. Jacobson (2023). Semi-parametric benchmark dose analysis with monotone additive models. arXiv:2311.09935 [stat.ME]
A. Stringer (2021). Implementing Adaptive Quadrature for Bayesian Inference: the aghq Package. arXiv:2101.04468 [stat.CO]
Note: please view the presentations below in "presentation" mode. I make (way too) heavy use of Beamer's "pause" feature, which only displays properly in that mode, and the "handout" option, which removes the pauses, obscures some figures. Sorry!
A. Stringer (2021). Implementing approximate Bayesian inference using adaptive quadrature: the aghq package
A. Stringer (with P. Brown and J. Stafford) (2020). Bayesian inference for Extended Latent Gaussian Models.
École Polytechnique Fédérale de Lausanne. Lausanne, Switzerland (invited research presentation) (November 2021)
Statistical Society of Canada annual conference (June 2021)
A. Stringer (2020). Smooth estimation of nonlinear rate curves using Bayesian inference.
A. Stringer (with P. Brown and J. Stafford) (2020). Approximate Bayesian inference for Case Crossover models.
Canadian Statistics Student Conference. Ottawa, Canada (research presentation).