
References

Published online by Cambridge University Press:  27 July 2023

Daniel Sanz-Alonso, University of Chicago
Andrew Stuart, California Institute of Technology
Armeen Taeb, University of Washington

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2023



Aanonsen, S. I., Nævdal, G., Oliver, D. S., Reynolds, A. C., and Vallès, B. 2009. The ensemble Kalman filter in reservoir engineering – a review. SPE Journal, 14(03), 393–412.
Abarbanel, H. 2013. Predicting the Future: Completing Models of Observed Complex Systems. Springer.
Agapiou, S., Larsson, S., and Stuart, A. M. 2013. Posterior contraction rates for the Bayesian approach to linear ill-posed inverse problems. Stochastic Processes and their Applications, 123(10), 3828–3860.
Agapiou, S., Papaspiliopoulos, O., Sanz-Alonso, D., and Stuart, A. M. 2017a. Importance sampling: Intrinsic dimension and computational cost. Statistical Science, 32(3), 405–431.
Agapiou, S., Burger, M., Dashti, M., and Helin, T. 2017b. Sparsity-promoting and edge-preserving maximum a posteriori estimators in non-parametric Bayesian inverse problems. Inverse Problems, 34(4), 045002.
Agrawal, S., Kim, H., Sanz-Alonso, D., and Strang, A. 2022. A variational inference approach to inverse problems with gamma hyperpriors. SIAM/ASA Journal on Uncertainty Quantification, 10(4), 1533–1559.
Akyildiz, Ö. D., and Míguez, J. 2021. Convergence rates for optimised adaptive importance samplers. Statistics and Computing, 31(2), 1–17.
Al Ghattas, O., and Sanz-Alonso, D. 2022. Non-asymptotic analysis of ensemble Kalman updates: effective dimension and localization. arXiv preprint arXiv:2208.03246.
Albers, D. J., Blancquart, P. A., Levine, M. E., Seylabi, E. E., and Stuart, A. M. 2019. Ensemble Kalman methods with constraints. Inverse Problems, 35(9), 095007.
Anderson, B., and Moore, J. B. 1979. Optimal Filtering. Prentice-Hall Information and System Sciences Series. Prentice Hall.
Anderson, E. C. 2014. Monte Carlo methods and importance sampling. Unpublished lecture notes for Statistical Genetics, available at https://ib.berkeley.edu/labs/slatkin/eriq/classes/guest_lect/mc_lecture_notes.pdf
Anderson, J. L. 2001. An ensemble adjustment Kalman filter for data assimilation. Monthly Weather Review, 129(12), 2884–2903.
Asch, M., Bocquet, M., and Nodet, M. 2016. Data Assimilation: Methods, Algorithms, and Applications. Vol. 11 of Fundamentals of Algorithms. Society for Industrial and Applied Mathematics.
Ayanbayev, B., Klebanov, I., Lie, H. C., and Sullivan, T. J. 2021. Γ-convergence of Onsager–Machlup functionals: I. With applications to maximum a posteriori estimation in Bayesian inverse problems. Inverse Problems, 38(2), 025005.
Bain, A., and Crisan, D. 2008. Fundamentals of Stochastic Filtering. Vol. 60 of Stochastic Modelling and Applied Probability. Springer Science & Business Media.
Bal, G. 2012. Introduction to Inverse Problems. Lecture notes, Department of Applied Physics and Applied Mathematics, Columbia University, New York. Available at www.stat.uchicago.edu/guillaumebal/PAPERS/IntroductionInverseProblems.pdf
Bissiri, P. G., Holmes, C. C., and Walker, S. G. 2016. A general framework for updating belief distributions. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 78(5), 1103–1130.
Bayes, T. 1763. An essay towards solving a problem in the doctrine of chances. Philosophical Transactions of the Royal Society of London, 53, 370–418.
Bell, B. M. 1994. The iterated Kalman smoother as a Gauss–Newton method. SIAM Journal on Optimization, 4(3), 626–636.
Bell, B. M., and Cathey, F. W. 1993. The iterated Kalman filter update as a Gauss–Newton method. IEEE Transactions on Automatic Control, 38(2), 294–297.
Beskos, A., Jasra, A., Law, K. J. H., Tempone, R., and Zhou, Y. 2017. Multilevel sequential Monte Carlo samplers. Stochastic Processes and their Applications, 127(5), 1417–1440.
Bickel, P., Li, B., and Bengtsson, T. 2008. Sharp failure rates for the bootstrap particle filter in high dimensions. Pages 318–329 of: Pushing the Limits of Contemporary Statistics: Contributions in Honor of Jayanta K. Ghosh. Institute of Mathematical Statistics.
Bishop, C. M. 2006. Pattern Recognition and Machine Learning. Vol. 128 of Information Science and Statistics. Springer.
Bishop, C. H., Etherton, B. J., and Majumdar, S. J. 2001. Adaptive sampling with the ensemble transform Kalman filter. Part I: Theoretical aspects. Monthly Weather Review, 129(3), 420–436.
Blei, D. M., Kucukelbir, A., and McAuliffe, J. D. 2017. Variational inference: A review for statisticians. Journal of the American Statistical Association, 112(518), 859–877.
Blömker, D., Schillings, C., and Wacker, P. 2018. A strongly convergent numerical scheme from ensemble Kalman inversion. SIAM Journal on Numerical Analysis, 56(4), 2537–2562.
Blömker, D., Schillings, C., Wacker, P., and Weissmann, S. 2019. Well posedness and convergence analysis of the ensemble Kalman inversion. Inverse Problems, 35(8), 085007.
Bocquet, M., Brajard, J., Carrassi, A., and Bertino, L. 2020. Bayesian inference of chaotic dynamics by merging data assimilation, machine learning and expectation maximization. Foundations of Data Science, 2(1), 55–80.
Bottou, L., Curtis, F. E., and Nocedal, J. 2018. Optimization methods for large-scale machine learning. SIAM Review, 60(2), 223–311.
Boyd, S., and Vandenberghe, L. 2004. Convex Optimization. Cambridge University Press.
Brajard, J., Carrassi, A., Bocquet, M., and Bertino, L. 2020. Combining data assimilation and machine learning to emulate a dynamical model from sparse and noisy observations: a case study with the Lorenz 96 model. Journal of Computational Science, 44, 101171.
Branicki, M., Majda, A. J., and Law, K. J. H. 2018. Accuracy of some approximate Gaussian filters for the Navier–Stokes equation in the presence of model error. Multiscale Modeling & Simulation, 16(4), 1756–1794.
Brett, C., Lam, K., Law, K. J. H., McCormick, D., Scott, M., and Stuart, A. M. 2013. Accuracy and stability of filters for dissipative PDEs. Physica D: Nonlinear Phenomena, 245(1), 34–45.
Bröcker, J. 2017. Existence and uniqueness for four-dimensional variational data assimilation in discrete time. SIAM Journal on Applied Dynamical Systems, 16(1), 361–374.
Brooks, S., Gelman, A., Jones, G., and Meng, X. 2011. Handbook of Markov Chain Monte Carlo. CRC Press.
Bugallo, M. F., Elvira, V., Martino, L., Luengo, D., Miguez, J., and Djuric, P. M. 2017. Adaptive importance sampling: The past, the present, and the future. IEEE Signal Processing Magazine, 34(4), 60–79.
Bui-Thanh, T., Ghattas, O., Martin, J., and Stadler, G. 2013. A computational framework for infinite-dimensional Bayesian inverse problems Part I: The linearized case, with application to global seismic inversion. SIAM Journal on Scientific Computing, 35(6), A2494–A2523.
Caflisch, R. E. 1998. Monte Carlo and quasi-Monte Carlo methods. Acta Numerica, 7, 1–49.
Calvello, E., Reich, S., and Stuart, A. M. 2022. Ensemble Kalman methods: A mean field perspective. arXiv preprint arXiv:2209.11371.
Calvetti, D., and Somersalo, E. 2007. An Introduction to Bayesian Scientific Computing: Ten Lectures on Subjective Computing. Vol. 2 of Surveys and Tutorials in the Applied Mathematical Sciences. Springer Science & Business Media.
Carrassi, A., Bocquet, M., Bertino, L., and Evensen, G. 2018. Data assimilation in the geosciences: An overview of methods, issues, and perspectives. Wiley Interdisciplinary Reviews: Climate Change, 9(5).
Carrillo, J. A., Hoffmann, F., Stuart, A. M., and Vaes, U. 2022. The ensemble Kalman filter in the near-Gaussian setting. arXiv preprint arXiv:2212.13239.
Chada, N. K. 2018. Analysis of hierarchical ensemble Kalman inversion. arXiv preprint arXiv:1801.00847.
Chada, N., and Tong, X. 2022. Convergence acceleration of ensemble Kalman inversion in nonlinear settings. Mathematics of Computation, 91(335), 1247–1280.
Chada, N. K., Iglesias, M. A., Roininen, L., and Stuart, A. M. 2018. Parameterizations for ensemble Kalman inversion. Inverse Problems, 34(5), 055009.
Chada, N. K., Schillings, C., and Weissmann, S. 2019. On the incorporation of box constraints for ensemble Kalman inversion. Foundations of Data Science, 1(4), 433.
Chada, N. K., Stuart, A. M., and Tong, X. T. 2020. Tikhonov regularization within ensemble Kalman inversion. SIAM Journal on Numerical Analysis, 58(2), 1263–1294.
Chada, N. K., Chen, Y., and Sanz-Alonso, D. 2021. Iterative ensemble Kalman methods: A unified perspective with some new variants. Foundations of Data Science, 3(3), 331–369.
Chatterjee, S., and Diaconis, P. 2018. The sample size required in importance sampling. The Annals of Applied Probability, 28(2), 1099–1135.
Chen, Y., and Oliver, D. 2012. Ensemble randomized maximum likelihood method as an iterative ensemble smoother. Mathematical Geosciences, 44(1), 1–26.
Chen, Y., Sanz-Alonso, D., and Willett, R. 2022. Auto-differentiable ensemble Kalman filters. SIAM Journal on Mathematics of Data Science, 4(2), 801–833.
Chopin, N., and Papaspiliopoulos, O. 2020. An Introduction to Sequential Monte Carlo. Springer.
Cotter, S., Dashti, M., and Stuart, A. M. 2010. Approximation of Bayesian inverse problems for PDEs. SIAM Journal on Numerical Analysis, 48(1), 322–345.
Cotter, S. L., Roberts, G. O., Stuart, A. M., and White, D. 2013. MCMC methods for functions: modifying old algorithms to make them faster. Statistical Science, 424–446.
Crisan, D., and Doucet, A. 2002. A survey of convergence results on particle filtering methods for practitioners. IEEE Transactions on Signal Processing, 50(3), 736–746.
Crisan, D., and Rozovskii, B. 2011. The Oxford Handbook of Nonlinear Filtering. Oxford University Press.
Crisan, D., Del Moral, P., and Lyons, T. 1998. Discrete filtering using branching and interacting particle systems. Université de Toulouse, Laboratoire de Statistique et Probabilités (LSP).
Dashti, M., and Stuart, A. M. 2017. Bayesian approach to inverse problems. Pages 311–428 of: Handbook of Uncertainty Quantification. Springer.
Dashti, M., Law, K. J. H., Stuart, A. M., and Voss, J. 2013. MAP estimators and their consistency in Bayesian nonparametric inverse problems. Inverse Problems, 29(9), 095017.
De Finetti, B. 2017. Theory of Probability: A Critical Introductory Treatment. Vol. 6 of Wiley Series in Probability and Statistics. John Wiley & Sons.
Del Moral, P. 2004. Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications. Springer Science & Business Media.
Del Moral, P., Doucet, A., and Jasra, A. 2006. Sequential Monte Carlo samplers. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 68(3), 411–436.
Deniz Akyildiz, Ö. 2022. Global convergence of optimized adaptive importance samplers. arXiv preprint arXiv:2201.00409.
Dennis Jr., J. E., and Schnabel, R. B. 1996. Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Society for Industrial and Applied Mathematics.
Dick, J., Kuo, F. Y., and Sloan, I. H. 2013. High-dimensional integration: the quasi-Monte Carlo way. Acta Numerica, 22, 133–288.
Ding, Z., and Li, Q. 2021. Ensemble Kalman sampler: mean-field limit and convergence analysis. SIAM Journal on Mathematical Analysis, 53(2).
Doob, J. L. 1949. Application of the theory of martingales. Pages 23–27 of: Le calcul des probabilités et ses applications. Éditions du Centre National de la Recherche Scientifique (CNRS).
Doucet, A., Godsill, S., and Andrieu, C. 2000. On sequential Monte Carlo sampling methods for Bayesian filtering. Statistics and Computing, 10(3), 197–208.
Doucet, A., de Freitas, N., and Gordon, N. 2001. An introduction to sequential Monte Carlo methods. Pages 3–14 of: Sequential Monte Carlo Methods in Practice. Springer.
Dunlop, M. M. 2019. Multiplicative noise in Bayesian inverse problems: Well-posedness and consistency of MAP estimators. arXiv preprint arXiv:1910.14632.
Emerick, A., and Reynolds, A. 2013. Investigation of the sampling performance of ensemble-based methods with a simple reservoir model. Computational Geosciences, 17(2), 325–350.
Engl, H., Hanke, M., and Neubauer, A. 1996. Regularization of Inverse Problems. Springer Science and Business Media.
Ernst, O., Sprungk, B., and Starkloff, H. 2015. Analysis of the ensemble and polynomial chaos Kalman filters in Bayesian inverse problems. SIAM/ASA Journal on Uncertainty Quantification, 3(1), 823–851.
Evensen, G. 1994. Sequential data assimilation with a nonlinear quasi-geostrophic model using Monte Carlo methods to forecast error statistics. Journal of Geophysical Research: Oceans, 99(C5), 10143–10162.
Evensen, G. 2009. Data Assimilation: The Ensemble Kalman Filter. Springer Science and Business Media.
Evensen, G., and van Leeuwen, P. J. 1996. Assimilation of Geosat altimeter data for the Agulhas current using the ensemble Kalman filter with a quasigeostrophic model. Monthly Weather Review, 124(1), 85–96.
Evensen, G., and van Leeuwen, P. J. 2000. An ensemble Kalman smoother for nonlinear dynamics. Monthly Weather Review, 128(6), 1852–1867.
Evensen, G., Vossepoel, F. C., and van Leeuwen, P. J. 2022. Data Assimilation Fundamentals: A Unified Formulation of the State and Parameter Estimation Problem. Springer.
Farchi, A., and Bocquet, M. 2018. Comparison of local particle filters and new implementations. Nonlinear Processes in Geophysics, 25(4), 765–807.
Fienberg, S. E. 2006. When did Bayesian inference become “Bayesian”? Bayesian Analysis, 1(1), 1–40.
Fisher, M., Nocedal, J., Trémolet, Y., and Wright, S. 2009. Data assimilation in weather forecasting: a case study in PDE-constrained optimization. Optimization and Engineering, 10(3), 409–426.
Franklin, J. 1970. Well-posed stochastic extensions of ill-posed linear problems. Journal of Mathematical Analysis and Applications, 31(3), 682–716.
Frei, M., and Künsch, H. R. 2013. Bridging the ensemble Kalman and particle filters. Biometrika, 100(4), 781–800.
Gamerman, D., and Lopes, H. 2006. Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference. CRC Press.
Garbuno-Inigo, A., Nüsken, N., and Reich, S. 2020a. Affine invariant interacting Langevin dynamics for Bayesian inference. SIAM Journal on Applied Dynamical Systems, 19(3), 1633–1658.
Garbuno-Inigo, A., Hoffmann, F., Li, W., and Stuart, A. M. 2020b. Interacting Langevin diffusions: Gradient structure and ensemble Kalman sampler. SIAM Journal on Applied Dynamical Systems, 19(1), 412–441.
Garcia Trillos, N., and Sanz-Alonso, D. 2018. Continuum limits of posteriors in graph Bayesian inverse problems. SIAM Journal on Mathematical Analysis, 50(4), 4020–4040.
Garcia Trillos, N., and Sanz-Alonso, D. 2020. The Bayesian update: variational formulations and gradient flows. Bayesian Analysis, 15(1), 29–56.
Garcia Trillos, N., Kaplan, Z., and Sanz-Alonso, D. 2019. Variational characterizations of local entropy and heat regularization in deep learning. Entropy, 21(5), 511.
Garcia Trillos, N., Kaplan, Z., Samakhoana, T., and Sanz-Alonso, D. 2020. On the consistency of graph-based Bayesian semi-supervised learning and the scalability of sampling algorithms. Journal of Machine Learning Research, 21(28), 1–47.
Gelb, A., Kasper, J. F., Nash, R. A., Price, C. F., and Sutherland, A. A. 1974. Applied Optimal Estimation. MIT Press.
Gelman, A., Carlin, J. B., Stern, H. S., Dunson, D. B., Vehtari, A., and Rubin, D. B. 2013. Bayesian Data Analysis. Chapman and Hall/CRC.
Ghil, M., Cohn, S., Tavantzis, J., Bube, K., and Isaacson, E. 1981. Applications of estimation theory to numerical weather prediction. Pages 139–224 of: Dynamic Meteorology: Data Assimilation Methods. Springer.
Gibbs, A., and Su, F. 2002. On choosing and bounding probability metrics. International Statistical Review, 70(3), 419–435.
Giles, M. 2015. Multilevel Monte Carlo methods. Acta Numerica, 24, 259–328.
Giné, E., and Nickl, R. 2015. Mathematical Foundations of Infinite-dimensional Statistical Models. Cambridge University Press.
Giordano, M., and Nickl, R. 2020. Consistency of Bayesian inference with Gaussian process priors in an elliptic inverse problem. Inverse Problems, 36(8), 085001.
Le Gland, F., Monbet, V., and Tran, V.-D. 2009. Large sample asymptotics for the ensemble Kalman filter. Research report, INRIA.
Goodfellow, I., Bengio, Y., and Courville, A. 2016. Deep Learning. MIT Press.
Goodman, J., and Weare, J. 2010. Ensemble samplers with affine invariance. Communications in Applied Mathematics and Computational Science, 5(1), 65–80.
Gottwald, G. A., and Majda, A. J. 2013. A mechanism for catastrophic filter divergence in data assimilation for sparse observation networks. Nonlinear Processes in Geophysics, 20(5), 705–712.
Gottwald, G. A., and Reich, S. 2021. Supervised learning from noisy observations: Combining machine-learning techniques with data assimilation. Physica D: Nonlinear Phenomena, 423, 132911.
Gu, Y., and Oliver, D. S. 2007. An iterative ensemble Kalman filter for multiphase fluid flow data assimilation. SPE Journal, 12(04), 438–446.
Guth, P. A., Schillings, C., and Weissmann, S. 2020. Ensemble Kalman filter for neural network based one-shot inversion. arXiv preprint arXiv:2005.02039.
Haber, E., Lucka, F., and Ruthotto, L. 2018. Never look back – A modified EnKF method and its application to the training of neural networks without back propagation. arXiv preprint arXiv:1805.08034.
Hairer, M., Stuart, A. M., and Voss, J. 2011. Signal processing problems on function space: Bayesian formulation, stochastic PDEs and effective MCMC methods. Pages 833–873 of: The Oxford Handbook of Nonlinear Filtering. Oxford University Press.
Hairer, M., Stuart, A. M., Voss, J., and Wiberg, P. 2005. Analysis of SPDEs arising in path sampling. Part I: The Gaussian case. Communications in Mathematical Sciences, 3(4), 587–603.
Hairer, M., Stuart, A. M., and Vollmer, S. J. 2014. Spectral gaps for a Metropolis–Hastings algorithm in infinite dimensions. The Annals of Applied Probability, 24(6), 2455–2490.
Hammersley, J., and Handscomb, D. 1964. Percolation processes. Pages 134–141 of: Monte Carlo Methods. Springer.
Hanke, M. 1997. A regularizing Levenberg-Marquardt scheme, with applications to inverse groundwater filtration problems. Inverse Problems, 13(1), 79–95.
Harlim, J., Sanz-Alonso, D., and Yang, R. 2020. Kernel methods for Bayesian elliptic inverse problems on manifolds. SIAM/ASA Journal on Uncertainty Quantification, 8(4), 1414–1445.
Harvey, A. 1989. Forecasting, Structural Time Series Models and the Kalman Filter. Cambridge University Press.
Hastings, W. K. 1970. Monte Carlo sampling methods using Markov chains and their applications. Biometrika, 57(1), 97–109.
Hayden, K., Olson, E., and Titi, E. 2011. Discrete data assimilation in the Lorenz and 2D Navier–Stokes equations. Physica D: Nonlinear Phenomena, 240(18), 1416–1425.
Helin, T., and Burger, M. 2015. Maximum a posteriori probability estimates in infinite-dimensional Bayesian inverse problems. Inverse Problems, 31(8), 085009.
Herty, M., and Visconti, G. 2019. Kinetic methods for inverse problems. Kinetic & Related Models, 12(5), 1109.
Hosseini, B. 2017. Well-posed Bayesian inverse problems with infinitely divisible and heavy-tailed prior measures. SIAM/ASA Journal on Uncertainty Quantification, 5(1), 1024–1060.
Hosseini, B., and Nigam, N. 2017. Well-posed Bayesian inverse problems: priors with exponential tails. SIAM/ASA Journal on Uncertainty Quantification, 5(1), 436–465.
Houtekamer, P. L., and Derome, J. 1995. Methods for ensemble prediction. Monthly Weather Review, 123(7), 2181–2196.
Houtekamer, P. L., and Mitchell, H. 1998. Data assimilation using an ensemble Kalman filter technique. Monthly Weather Review, 126(3), 796–811.
Huang, D. Z., and Huang, J. 2021. Unscented Kalman inversion: efficient Gaussian approximation to the posterior distribution. arXiv preprint arXiv:2103.00277.
Huang, D. Z., Schneider, T., and Stuart, A. M. 2021. Unscented Kalman inversion. arXiv preprint arXiv:2102.01580.
Huang, D. Z., Huang, J., Reich, S., and Stuart, A. M. 2022a. Efficient derivative-free Bayesian inference for large-scale inverse problems. arXiv preprint arXiv:2204.04386.
Huang, D. Z., Schneider, T., and Stuart, A. M. 2022b. Iterated Kalman methodology for inverse problems. Journal of Computational Physics, 463, 111262.
Iglesias, M. A. 2016. A regularizing iterative ensemble Kalman method for PDE-constrained inverse problems. Inverse Problems, 32(2), 025002.
Iglesias, M. A., and Yang, Y. 2021. Adaptive regularisation for ensemble Kalman inversion. Inverse Problems, 37(2), 025008.
Iglesias, M. A., Law, K. J. H., and Stuart, A. M. 2014a. Ensemble Kalman methods for inverse problems. Inverse Problems, 29(4), 045001.
Iglesias, M. A., Lin, K., and Stuart, A. M. 2014b. Well-posed Bayesian geometric inverse problems arising in subsurface flow. Inverse Problems, 30(11), 114001.
Jazwinski, A. 2007. Stochastic Processes and Filtering Theory. Courier Corporation.
Johansen, A., and Doucet, A. 2008. A note on auxiliary particle filters. Statistics and Probability Letters, 78(12), 1498–1504.
Jordan, M. I., Ghahramani, Z., Jaakkola, T. S., and Saul, L. K. 1999. An introduction to variational methods for graphical models. Machine Learning, 37(2), 183–233.
Kahn, H. 1955. Use of Different Monte Carlo Sampling Techniques. Rand Corporation.
Kahn, H., and Marshall, A. W. 1953. Methods of reducing sample size in Monte Carlo computations. Journal of the Operations Research Society of America, 1(5), 263–278.
Kaipio, J., and Somersalo, E. 2006. Statistical and Computational Inverse Problems. Vol. 160 of Applied Mathematical Sciences. Springer Science & Business Media.
Kalman, R. 1960. A new approach to linear filtering and prediction problems. Journal of Basic Engineering, 82(1), 35–45.
Kalman, R., and Bucy, R. 1961. New results in linear filtering and prediction theory. Journal of Basic Engineering, 83(1), 95–108.
Kalnay, E. 2003. Atmospheric Modeling, Data Assimilation and Predictability. Cambridge University Press.
Kantas, N., Beskos, A., and Jasra, A. 2014. Sequential Monte Carlo methods for high-dimensional inverse problems: a case study for the Navier–Stokes equations. SIAM/ASA Journal on Uncertainty Quantification, 2(1), 464–489.
Kawai, R. 2017. Adaptive importance sampling Monte Carlo simulation for general multivariate probability laws. Journal of Computational and Applied Mathematics, 319, 440–459.
Kelly, D., and Stuart, A. M. 2019. Ergodicity and accuracy of optimal particle filters for Bayesian data assimilation. Chinese Annals of Mathematics, Series B, 40(5), 811–842.
Kelly, D., Law, K. J. H., and Stuart, A. M. 2014. Well-posedness and accuracy of the ensemble Kalman filter in discrete and continuous time. Nonlinearity, 27(10), 2579.
Kiefer, J., and Wolfowitz, J. 1952. Stochastic estimation of the maximum of a regression function. The Annals of Mathematical Statistics, 23(3), 462–466.
Kim, H., Sanz-Alonso, D., and Strang, A. 2023. Hierarchical ensemble Kalman methods with sparsity-promoting generalized gamma hyperpriors. Foundations of Data Science, 5(3), 366–388.
Knapik, B., van der Vaart, A., and van Zanten, J. 2011. Bayesian inverse problems with Gaussian priors. Annals of Statistics, 39(5), 2626–2657.
Kovachki, N. B., and Stuart, A. M. 2019. Ensemble Kalman inversion: A derivative-free technique for machine learning tasks. Inverse Problems, 35(9), 095005.
Krishnan, R., Shalit, U., and Sontag, D. 2017. Structured inference networks for nonlinear state space models. In: Proceedings of the AAAI Conference on Artificial Intelligence, 31(1), 2101–2109.
Kwiatkowski, E., and Mandel, J. 2015. Convergence of the square root ensemble Kalman filter in the large ensemble limit. SIAM/ASA Journal on Uncertainty Quantification, 3(1), 1–17.
Lalley, S. P. 1999. Beneath the noise, chaos. The Annals of Statistics, 27(2), 461–479.
Lasanen, S. 2012a. Non-Gaussian statistical inverse problems. Part I: Posterior distributions. Inverse Problems & Imaging, 6(2), 215–266.
Lasanen, S. 2012b. Non-Gaussian statistical inverse problems. Part II: Posterior convergence for approximated unknowns. Inverse Problems & Imaging, 6(2), 267.
Latz, J. 2020. On the well-posedness of Bayesian inverse problems. SIAM/ASA Journal on Uncertainty Quantification, 8(1), 451–482.
Law, K. J. H., and Zankin, V. 2022. Sparse online variational Bayesian regression. SIAM/ASA Journal on Uncertainty Quantification, 10(3), 1070–1100.
Law, K. J. H., Shukla, A., and Stuart, A. M. 2014. Analysis of the 3DVAR filter for the partially observed Lorenz '63 model. Discrete and Continuous Dynamical Systems, 34(3), 1061–1078.
Law, K. J. H., Stuart, A. M., and Zygalakis, K. 2015. Data Assimilation. Springer.
Law, K. J. H., Sanz-Alonso, D., Shukla, A., and Stuart, A. M. 2016. Filter accuracy for the Lorenz 96 model: Fixed versus adaptive observation operators. Physica D: Nonlinear Phenomena, 325, 1–13.
Lee, Y. 2021. ℓp regularization for ensemble Kalman inversion. SIAM Journal on Scientific Computing, 43(5), A3417–A3437.
van Leeuwen, P. J., Cheng, Y., and Reich, S. 2015. Nonlinear Data Assimilation. Springer.
Lehtinen, M. S., Paivarinta, L., and Somersalo, E. 1989. Linear inverse problems for generalised random variables. Inverse Problems, 5(4), 599.
Levine, M., and Stuart, A. 2022. A framework for machine learning of model error in dynamical systems. Communications of the American Mathematical Society, 2(7), 283–344.
Li, G., and Reynolds, A. C. 2007. An iterative ensemble Kalman filter for data assimilation. In: SPE Annual Technical Conference and Exhibition. Society of Petroleum Engineers.
Lieberman, C., Willcox, K., and Ghattas, O. 2010. Parameter and state model reduction for large-scale statistical inverse problems. SIAM Journal on Scientific Computing, 32(5), 2535–2542.
Lindvall, T. 2002. Lectures on the Coupling Method. Springer.
Liu, J. S. 2008. Monte Carlo Strategies in Scientific Computing. Springer Science & Business Media.
Lorenc, A. 1986. Analysis methods for numerical weather prediction. Quarterly Journal of the Royal Meteorological Society, 112(474), 1177–1194.
Lorenc, A. C., Ballard, S. P., Bell, R. S., Ingleby, N. B., Andrews, P. L. F., Barker, D. M., Bray, J. R., Clayton, A. M., Dalby, T., Li, D., et al. 2000. The Met. Office global three-dimensional variational data assimilation scheme. Quarterly Journal of the Royal Meteorological Society, 126(570), 2991–3012.
Lorentzen, R., Fjelde, R., Frøyen, J., Lage, A., Naevdal, G., and Vefring, E. 2001. Underbalanced and low-head drilling operations: Real time interpretation of measured data and operational support. SPE Annual Technical Conference and Exhibition.
Lu, Y., Stuart, A. M., and Weber, H. 2017. Gaussian approximations for probability measures on ℝ^d. SIAM/ASA Journal on Uncertainty Quantification, 5(1), 1136–1165.
MacKay, D. 2003. Information Theory, Inference and Learning Algorithms. Cambridge University Press.
Majda, A. J., and Harlim, J. 2012. Filtering Complex Turbulent Systems. Cambridge University Press.
Mandel, J., Cobb, L., and Beezley, J. D. 2011. On the convergence of the ensemble Kalman filter. Applications of Mathematics, 56(6), 533–541.
Martin, J., Wilcox, L., Burstedde, C., and Ghattas, O. 2012. A stochastic Newton MCMC method for large-scale statistical inverse problems with application to seismic inversion. SIAM Journal on Scientific Computing, 34(3), A1460–A1487.
Martino, L., Elvira, V., and Louzada, F. 2017. Effective sample size for importance sampling based on discrepancy measures. Signal Processing, 131, 386–401.
Marzouk, Y., and Xiu, D. 2009. A stochastic collocation approach to Bayesian inference in inverse problems. Communications in Computational Physics, 6(4), 826847.Google Scholar
Mattingly, J., Stuart, A.M., and Higham, D. 2002. Ergodicity for PDE’s and approximations: locally Lipschitz vector fields and degenerate noise. Stochastic Processes and Their Applications, 101(2), 185232.Google Scholar
Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., and Teller, E. 1953. Equation of state calculations by fast computing machines. The Journal of Chemical Physics, 21(6), 10871092.Google Scholar
Meyn, S., and Tweedie, R. 2012. Markov Chains and Stochastic Stability. Springer Science & Business Media.
Miller, E. L., and Karl, W. C. 2003. Fundamentals of Inverse Problems. Unpublished lecture notes, available at https://ece.northeastern.edu/fac-ece/elmiller/eceg398f03/notes.pdf.
Minka, T. P. 2013. Expectation propagation for approximate Bayesian inference. arXiv preprint arXiv:1301.2294.
Moodey, A., Lawless, A., Potthast, R., and van Leeuwen, P. 2013. Nonlinear error dynamics for cycled data assimilation methods. Inverse Problems, 29(2), 025002.
Morzfeld, M., Hodyss, D., and Snyder, C. 2017. What the collapse of the ensemble Kalman filter tells us about particle filters. Tellus A: Dynamic Meteorology and Oceanography, 69(1), 1283809.
Nickl, R. 2020. Bernstein–von Mises theorems for statistical inverse problems I: Schrödinger equation. Journal of the European Mathematical Society, 22(8), 2697–2750.
Nickl, R. 2022. Bayesian Non-linear Statistical Inverse Problems. Lecture Notes, Department of Mathematics, ETH Zürich.
Nickl, R., and Paternain, G. 2021. On some information-theoretic aspects of non-linear statistical inverse problems. arXiv preprint arXiv:2107.09488.
Nickl, R., and Söhl, J. 2019. Bernstein–von Mises theorems for statistical inverse problems II: compound Poisson processes. Electronic Journal of Statistics, 13(2), 3513–3571.
Nickl, R., van de Geer, S., and Wang, S. 2020. Convergence rates for penalised least squares estimators in PDE-constrained regression problems. SIAM/ASA Journal on Uncertainty Quantification, 8(1), 374–413.
Nielsen, F., and Garcia, V. 2009. Statistical exponential families: A digest with flash cards. arXiv preprint arXiv:0911.4863.
Nocedal, J., and Wright, S. 2006. Numerical Optimization. Springer Science & Business Media.
Oliver, D., Reynolds, A., and Liu, N. 2008. Inverse Theory for Petroleum Reservoir Characterization and History Matching. Cambridge University Press.
Oljaca, L., Brocker, J., and Kuna, T. 2018. Almost sure error bounds for data assimilation in dissipative systems with unbounded observation noise. SIAM Journal on Applied Dynamical Systems, 17(4), 2882–2914.
Owhadi, H., Scovel, C., Sullivan, T. J., McKerns, M., and Ortiz, M. 2013. Optimal uncertainty quantification. SIAM Review, 55(2), 271–345.
Owhadi, H., Scovel, C., and Sullivan, T. J. 2015a. Brittleness of Bayesian inference under finite information in a continuous world. Electronic Journal of Statistics, 9(1), 1–79.
Owhadi, H., Scovel, C., and Sullivan, T. J. 2015b. On the brittleness of Bayesian inference. SIAM Review, 57(4), 566–582.
Paulin, D., Jasra, A., Crisan, D., and Beskos, A. 2018. On concentration properties of partially observed chaotic systems. Advances in Applied Probability, 50(2), 440–479.
Paulin, D., Jasra, A., Crisan, D., and Beskos, A. 2019. Optimization based methods for partially observed chaotic systems. Foundations of Computational Mathematics, 19(3), 485–559.
Pavliotis, G. A. 2014. Stochastic Processes and Applications: Diffusion Processes, the Fokker-Planck and Langevin Equations. Vol. 60 of Texts in Applied Mathematics. Springer.
Pecora, L. M., and Carroll, T. L. 1990. Synchronization in chaotic systems. Physical Review Letters, 64(8), 821.
Petersen, K., and Pedersen, M. 2008. The Matrix Cookbook. Technical University of Denmark.
Petra, N., Martin, J., Stadler, G., and Ghattas, O. 2014. A computational framework for infinite-dimensional Bayesian inverse problems, Part II: Stochastic Newton MCMC with application to ice sheet flow inverse problems. SIAM Journal on Scientific Computing, 36(4), A1525–A1555.
Pidstrigach, J., and Reich, S. 2023. Affine-invariant ensemble transform methods for logistic regression. Foundations of Computational Mathematics, 23(2), 675–708.
Pinski, F., Simpson, F., Stuart, A. M., and Weber, H. 2015a. Algorithms for Kullback–Leibler approximation of probability measures in infinite dimensions. SIAM Journal on Scientific Computing, 37(6), A2733–A2757.
Pinski, F., Simpson, F., Stuart, A. M., and Weber, H. 2015b. Kullback–Leibler approximation for probability measures on infinite dimensional spaces. SIAM Journal on Mathematical Analysis, 47(6), 4091–4122.
Pitt, M., and Shephard, N. 1999. Filtering via simulation: Auxiliary particle filters. Journal of the American Statistical Association, 94(446), 590–599.
Rauch, H., Striebel, C., and Tung, F. 1965. Maximum likelihood estimates of linear dynamic systems. AIAA Journal, 3(8), 1445–1450.
Rawlins, F., Ballard, S. P., Bovis, K. J., Clayton, A. M., Li, D., Inverarity, G. W., Lorenc, A. C., and Payne, T. J. 2007. The Met Office global four-dimensional variational data assimilation scheme. Quarterly Journal of the Royal Meteorological Society: A Journal of the Atmospheric Sciences, Applied Meteorology and Physical Oceanography, 133(623), 347–362.
Rebeschini, P., and van Handel, R. 2015. Can local particle filters beat the curse of dimensionality? Annals of Applied Probability, 25(5), 2809–2866.
Reich, S. 2011. A dynamical systems framework for intermittent data assimilation. BIT Numerical Mathematics, 51(1), 235–249.
Reich, S. 2019. Data assimilation: the Schrödinger perspective. Acta Numerica, 28, 635–711.
Reich, S., and Cotter, C. 2015. Probabilistic Forecasting and Bayesian Data Assimilation. Cambridge University Press.
Reynolds, A. C., Zafari, M., and Li, G. 2006. Iterative forms of the ensemble Kalman filter. Pages cp–23 of: ECMOR X – 10th European Conference on the Mathematics of Oil Recovery. European Association of Geoscientists & Engineers.
Robbins, H., and Monro, S. 1951. A stochastic approximation method. The Annals of Mathematical Statistics, 22(3), 400–407.
Robert, C., and Casella, G. 2013. Monte Carlo Statistical Methods. Springer Science & Business Media.
Roberts, G. O., and Rosenthal, J. S. 2001. Optimal scaling for various Metropolis–Hastings algorithms. Statistical Science, 16(4), 351–367.
Ryu, E. K., and Boyd, S. P. 2014. Adaptive importance sampling via stochastic convex programming. arXiv preprint arXiv:1412.4845.
Sakov, P., Oliver, D. S., and Bertino, L. 2012. An iterative EnKF for strongly nonlinear systems. Monthly Weather Review, 140(6), 1988–2004.
Sanz-Alonso, D. 2018. Importance sampling and necessary sample size: An information theory approach. SIAM/ASA Journal on Uncertainty Quantification, 6(2), 867–879.
Sanz-Alonso, D., and Stuart, A. M. 2015. Long-time asymptotics of the filtering distribution for partially observed chaotic dynamical systems. SIAM/ASA Journal on Uncertainty Quantification, 3(1), 1200–1220.
Sanz-Alonso, D., and Stuart, A. M. 2017. Gaussian approximations of small noise diffusions in Kullback–Leibler divergence. Communications in Mathematical Sciences, 15(7), 2087–2097.
Sanz-Alonso, D., and Wang, Z. 2021. Bayesian update with importance sampling: Required sample size. Entropy, 23(1), 22.
Särkkä, S. 2013. Bayesian Filtering and Smoothing. Vol. 3 of Institute of Mathematical Statistics Textbooks. Cambridge University Press.
Savage, L. J. 1972. The Foundations of Statistics. Courier Corporation.
Schillings, C., and Stuart, A. M. 2017. Analysis of the ensemble Kalman filter for inverse problems. SIAM Journal on Numerical Analysis, 55(3), 1264–1290.
Schillings, C., and Stuart, A. M. 2018. Convergence analysis of ensemble Kalman inversion: the linear, noisy case. Applicable Analysis, 97(1), 107–123.
Schneider, T., Stuart, A. M., and Wu, J.-L. 2022. Ensemble Kalman inversion for sparse learning of dynamical systems from time-averaged data. Journal of Computational Physics, 470, 111559.
Skjervheim, J.-A., Evensen, G., Hove, J., and Vabø, J. G. 2011. An ensemble smoother for assisted history matching. In: SPE Reservoir Simulation Symposium. OnePetro.
Sloan, I. H., and Woźniakowski, H. 1998. When are quasi-Monte Carlo algorithms efficient for high dimensional integrals? Journal of Complexity, 14(1), 1–33.
Smith, R. C. 2013. Uncertainty Quantification: Theory, Implementation, and Applications. Vol. 12 of Computational Science and Engineering. Society for Industrial and Applied Mathematics.
Snyder, C. 2011. Particle filters, the “optimal” proposal and high-dimensional systems. Pages 161–170 of: Proceedings of the ECMWF Seminar on Data Assimilation for Atmosphere and Ocean, European Centre for Medium-Range Weather Forecasts.
Snyder, C., Bengtsson, T., and Morzfeld, M. 2015. Performance bounds for particle filters using the optimal proposal. Monthly Weather Review, 143(11), 4750–4761.
Snyder, C., Bengtsson, T., Bickel, P., and Anderson, J. L. 2008. Obstacles to high-dimensional particle filtering. Monthly Weather Review, 136(12), 4629–4640.
Stordal, A. S., Karlsen, H. A., Nævdal, G., Skaug, H. J., and Vallès, B. 2011. Bridging the ensemble Kalman filter and particle filters: the adaptive Gaussian mixture filter. Computational Geosciences, 15(2), 293–305.
Stuart, A. M. 2010. Inverse problems: a Bayesian perspective. Acta Numerica, 19, 451–559.
Stuart, A. M., and Humphries, A. R. 1998. Dynamical Systems and Numerical Analysis. Vol. 2 of Cambridge Monographs on Applied and Computational Mathematics. Cambridge University Press.
Sullivan, T. J. 2015. Introduction to Uncertainty Quantification. Vol. 63 of Texts in Applied Mathematics. Springer.
Tarantola, A. 2005. Inverse Problem Theory and Methods for Model Parameter Estimation. Society for Industrial and Applied Mathematics.
Worthen, J., Stadler, G., Petra, N., Gurnis, M., and Ghattas, O. 2014. Towards adjoint-based inversion for rheological parameters in nonlinear viscous mantle flow. Physics of the Earth and Planetary Interiors, 234, 23–34.
Tikhonov, A. N., and Arsenin, V. Y. 1977. Solutions of Ill-Posed Problems. Washington, DC: Winston & Sons.
Tippett, M. K., Anderson, J. L., Bishop, C. H., Hamill, T. M., and Whitaker, J. S. 2003. Ensemble square root filters. Monthly Weather Review, 131(7), 1485–1490.
Tokdar, S. T., and Kass, R. E. 2010. Importance sampling: a review. Wiley Interdisciplinary Reviews: Computational Statistics, 2(1), 54–60.
Tong, X. T., Majda, A. J., and Kelly, D. 2016. Nonlinear stability of the ensemble Kalman filter with adaptive covariance inflation. Communications in Mathematical Sciences, 14(5), 1283–1313.
Tong, X. T., Majda, A. J., and Kelly, D. 2016. Nonlinear stability and ergodicity of ensemble based Kalman filters. Nonlinearity, 29(2), 657.
Ungarala, S. 2012. On the iterated forms of Kalman filters using statistical linearization. Journal of Process Control, 22(5), 935–943.
Van der Vaart, A. 1998. Asymptotic Statistics. Cambridge University Press.
Vogel, C. R. 2002. Computational Methods for Inverse Problems. Society for Industrial and Applied Mathematics.
Wainwright, M. J., and Jordan, M. I. 2008. Graphical Models, Exponential Families, and Variational Inference. Foundations and Trends in Machine Learning, 1(1–2), 1–305.
