Publikationsserver der Universitätsbibliothek Marburg

Title: Numerical Methods of Optimum Experimental Design Based on a Second-Order Approximation of Confidence Regions
Author: Nattermann, Max
Other contributors: Kostina, Ekaterina (Prof. Dr.)
Published: 2014
URI: https://archiv.ub.uni-marburg.de/diss/z2015/0054
URN: urn:nbn:de:hebis:04-z2015-00540
DOI: https://doi.org/10.17192/z2015.0054
DDC: Mathematics
Title (trans.): Numerische Methoden der Optimalen Versuchsplanung auf Basis Quadratisch Approximierter Konfidenzgebiete
Publication date: 2015-02-23
License: https://rightsstatements.org/vocab/InC-NC/1.0/

Keywords:
Sensitivity Analysis, Robuste Versuchsplanung, DoE, Konfidenzgebiete, Sensitivitätsanalyse, Parameter Estimation, Parameterschätzung, Robustification, Confidence Regions, Optimum Experimental Design, Optimale Versuchsplanung

Summary:
A successful application of model-based simulation and optimization of dynamic processes requires an accurate calibration of the underlying mathematical models. A fundamental task here is the estimation of unknown, nature-given model coefficients from real observations. After an appropriate numerical treatment of the differential systems, the parameters can be estimated as the solution of a finite-dimensional, nonlinearly constrained parameter estimation problem. Since the measurements always contain errors, the resulting parameter estimate cannot be regarded as a definitive solution, and a sensitivity analysis is required to quantify its statistical accuracy. The goal of the design of optimal experiments is to identify those measurement times and experimental conditions which allow a parameter estimate of maximal statistical accuracy. The design of optimal experiments problem can itself be formulated as an optimization problem whose objective function is a suitable quality criterion based on the sensitivity analysis of the parameter estimation problem.

In this thesis, we develop a quadratic sensitivity analysis to enable a better assessment of the statistical accuracy of a parameter estimate in the case of highly nonlinear model functions. The newly introduced sensitivity analysis is based on a quadratically approximated confidence region, which extends the commonly used linearized confidence region. The quadratically approximated confidence region is analyzed extensively and adequate bounds are established. It is shown that exact bounds on the quadratic components can be obtained by solving symmetric eigenvalue problems. One main result of this thesis is that the quadratic part is essentially bounded by two Lipschitz constants, which also characterize the convergence properties of the Gauss-Newton method. This bound can further be used to estimate the approximation error and thus the range of validity of the linearized confidence regions. Furthermore, we compute a quadratic approximation of the covariance matrix, which provides another means for the statistical assessment of the solution of a parameter estimation problem. The good approximation properties of the newly introduced sensitivity analysis are illustrated in several numerical examples.

In order to robustify the design of optimal experiments, we develop a new objective function, the Q-criterion, based on the introduced sensitivity analysis. In addition to the trace of the linear approximation of the covariance matrix, the Q-criterion contains the above-mentioned Lipschitz constants; here, we focus in particular on the numerical computation of adequate approximations of these constants. The robustness of the new objective function with respect to parameter uncertainties is investigated and compared with a worst-case formulation of the design of optimal experiments problem. It turns out that the Q-criterion covers the worst-case approach to the design of optimal experiments problem based on the A-criterion. Moreover, the properties of the new objective function are examined in several examples, where it becomes evident that the Q-criterion leads to a drastic improvement of the Gauss-Newton convergence rate in the subsequent parameter estimation.

Furthermore, this thesis presents efficient and numerically stable methods of parameter estimation and design of optimal experiments for the treatment of multiple-experiment parameter estimation problems. For parameter estimation and sensitivity analysis, we propose a parallel computation of the Gauss-Newton increments and the covariance matrix based on orthogonal decompositions. For the design of optimal experiments, we develop a parallel approach to computing the trace of the covariance matrix and its derivative.
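
For orientation, the linearized quantities that this sensitivity analysis builds on can be written in common notation; the following is a generic sketch of the standard linearized objects, not a verbatim excerpt from the thesis. For an unconstrained least-squares problem min_p 1/2 ||F(p)||^2 with Jacobian J = dF/dp(\hat{p}) at the estimate \hat{p}, the linearized covariance matrix, the linearized confidence region, and the A-criterion read

    C = (J^T J)^{-1},
    G_L(\alpha) = \{ p : (p - \hat{p})^T J^T J (p - \hat{p}) \le \gamma^2(\alpha) \},
    \phi_A(C) = (1/n_p) \operatorname{trace}(C),

where \gamma^2(\alpha) is the quantile belonging to the confidence level 1 - \alpha and n_p is the number of parameters. The quadratic sensitivity analysis developed in the thesis augments G_L(\alpha) by second-order terms and bounds these terms via symmetric eigenvalue problems and the two Lipschitz constants mentioned above.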
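
The parallel treatment of multiple-experiment problems mentioned above exploits the fact that each experiment's contribution can be reduced independently by an orthogonal (QR) decomposition. The following minimal Python sketch illustrates this idea for an unconstrained multiple-experiment least-squares problem with shared parameters; the function names, the use of numpy, and the thread pool are illustrative assumptions, not the thesis implementation, and constraints as well as the derivative of the trace are omitted.

import numpy as np
from concurrent.futures import ThreadPoolExecutor

def reduce_experiment(J_i, F_i):
    # QR-reduce one experiment's Jacobian and residual to a small triangular
    # block; this step touches only that experiment's data and can run in parallel.
    Q_i, R_i = np.linalg.qr(J_i)          # economy-size QR: J_i = Q_i R_i
    return R_i, Q_i.T @ F_i

def gauss_newton_step_and_covariance(jacobians, residuals):
    # Reduce all experiments (here via a thread pool) and stack the triangular blocks.
    with ThreadPoolExecutor() as pool:
        reduced = list(pool.map(reduce_experiment, jacobians, residuals))
    R_stack = np.vstack([R for R, _ in reduced])
    b_stack = np.concatenate([b for _, b in reduced])
    # One final QR of the small stacked matrix couples the experiments.
    Q, R = np.linalg.qr(R_stack)
    # Gauss-Newton increment: minimizer of sum_i ||F_i + J_i dp||^2.
    dp = np.linalg.solve(R, -Q.T @ b_stack)
    # Linearized covariance matrix C = (J^T J)^{-1} = R^{-1} R^{-T}.
    R_inv = np.linalg.inv(R)
    C = R_inv @ R_inv.T
    return dp, C

Because the per-experiment reductions are independent, the expensive part of the computation scales over the experiments, and only a small stacked triangular system remains for the final factorization; this is the structure that also carries over to the parallel evaluation of the trace of the covariance matrix and its derivative.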

Bibliography / References

  1. S. Körkel. Numerische Methoden für Optimale Versuchsplanungsprobleme bei nichtlinearen DAE-Modellen. PhD thesis, Ruprecht-Karls-Universität Heidelberg, 2002.
  2. R. I. Oliveira. Sums of random Hermitian matrices and an inequality by Rudelson. Electronic Communications in Probability, 15:203 – 212, 2010.
  3. F. Pukelsheim. Optimal Design of Experiments. John Wiley & Sons, Inc., New York, 1993.
  4. E. S. Keeping. Introduction to Statistical Inference. Dover books on mathematics. Dover Publications, 1962.
  5. J. F. Bonnans, J. C. Gilbert, C. Lemaréchal, and C. A. Sagastizábal. Numerical Optimization: Theoretical and Practical Aspects (Universitext). Springer-Verlag New York, Inc., Secaucus, NJ, USA, 2006.
  6. M. Kot. Elements of Mathematical Ecology. Cambridge University Press, Cambridge, 2001.
  7. L. S. Pontryagin and V. G. Boltyanskii. The Mathematical Theory of Optimal Processes. 1962.
  8. C. Geiger and C. Kanzow. Theorie und Numerik restringierter Optimierungsaufgaben. Springer-Verlag, Berlin Heidelberg, 2002.
  9. A. E. Bryson. Applied Optimal Control: Optimization, Estimation and Control. Halsted Press book. Taylor & Francis, 1975.
  10. A. Griewank. Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation. Number 19 in Frontiers in Applied Mathematics. SIAM, Philadelphia, PA, 2000.
  11. E. L. Lehmann. Testing Statistical Hypotheses. John Wiley & Sons, New York, 1959.
  12. M. J. D. Powell. The convergence of variable metric methods for nonlinearly constrained optimization calculations. In O. L. Mangasarian, R. R. Meyer, and S. M. Robinson, editors, Nonlinear Programming, volume 3. Academic Press, New York, 1978.
  13. Y. Saad. Numerical Methods for Large Eigenvalue Problems. Manchester University Press, Manchester, UK, 1992.
  14. D. M. Bates and D. G. Watts. Nonlinear Regression Analysis and Its Applications. Series in Probability and Mathematical Statistics. John Wiley & Sons, 1988.
  15. G. L. Nemhauser and L. A. Wolsey. Integer and Combinatorial Optimization. John Wiley & Sons, 1988.
  16. J. Nocedal and S. J. Wright. Numerical Optimization. Springer Science and Business Media, 2006.
  17. N. Strigul, H. Dette, and V. B. Melas. A practical guide for optimal designs of experiments in the Monod model. Environmental Modelling & Software, 24:1019 – 1026, 2009.
  18. V. H. Schulz. Reduced SQP methods for large-scale optimal control problems in DAE with application to path planning problems for satellite mounted robots. PhD thesis, Universität Heidelberg, 1996.
  19. G. A. F. Seber and C. J. Wild. Nonlinear Regression. Series in Probability and Mathematical Statistics. John Wiley & Sons, New York, 1989.
  20. A. Cervantes and L. T. Biegler. Large-scale DAE optimization using a simultaneous NLP formulation. American Institute of Chemical Engineers Journal, 44:1038 – 1050, 1998.
  21. W. C. Rooney and L. T. Biegler. Design for model parameter uncertainty using nonlinear confidence regions. American Institute of Chemical Engineers Journal, 47(8):1794–1804, 2001.
  22. H. G. Bock, E. Kostina, and J. P. Schlöder. On the role of natural level functions to achieve global convergence for damped Newton methods. In M. J. D. Powell and S. Scholtes, editors, System Modelling and Optimization, volume 46 of IFIP — The International Federation for Information Processing, pages 51–74. Springer US, 2000.
  23. K. H. Borgwardt. Optimierung, Operations Research, Spieltheorie. Birkhäuser Verlag, 2001.
  24. H. G. Bock, S. Körkel, E. Kostina, and J. P. Schlöder. Robustness aspects in parameter estimation, optimal design of experiments and optimal control. In W. Jäger, R. Rannacher, and J. Warnatz, editors, Reactive Flows, Diffusion and Transport, pages 117–146. Springer Berlin Heidelberg, 2007.
  25. H. Wolkowicz and G. P. H. Styan. Bounds for eigenvalues using traces. Linear Algebra and its Applications, 29:471 – 506, 1980.
  26. P. A. Vanrolleghem, M. V. Daele, and D. Dochain. Practical identifiability of a biokinetic model of activated sludge respiration. Water Research, 29(11):2561 – 2570, 1995.
  27. P. J. Ossenbruggen, H. Spanjers, and A. Klapwijk. Assessment of a two-step nitrification model for activated sludge. Water Research, 30(4):939 – 953, 1996.
  28. W. Merkel, A. Schwarz, S. Fritz, M. Reuss, and K. Krauth. New strategies for estimating kinetic parameters in anaerobic wastewater treatment plants. Water Science and Technology, 34(5–6):393 – 401, 1996.
  29. M. Schwaab, E. C. Biscaia, J. L. Monteiro, and J. C. Pinto. Nonlinear parameter estimation through particle swarm optimization. Chemical Engineering Science, 63(6):1542–1552, March 2008.
  30. G. Franceschini and S. Macchietto. Model-based design of experiments for parameter precision: State of the art. Chemical Engineering Science, 63(19):4846 – 4872, 2008.
  31. J. Donaldson and R. Schnabel. Computational experience with confidence regions and confidence intervals for non-linear least squares. Technometrics, 29(1):67 – 82, 1987.
  32. N. R. Draper and W. G. Hunter. Design of experiments for parameter estimation in multiresponse situations. Biometrika, 53(3/4):525 – 533, 1966.
  33. H. G. Bock, E. Kostina, and O. Kostyukova. Covariance matrices for parameter estimation of constrained parameter estimation problems. SIAM Journal on Matrix Analysis and Applications, 29:626–642, March 2007.
  34. U. Ascher. Collocation for two-point boundary value problems revisited. SIAM Journal on Numerical Analysis, 23(3):596 – 609, 1986.
  35. C. Daniel and F. S. Wood. Fitting Equations to Data: Computer Analysis of Multifactor Data. John Wiley & Sons, Inc., New York, NY, USA, 2nd edition, 1999.
  36. S. F. Walter. Structured Higher-Order Algorithmic Differentiation in the Forward and Reverse Mode with Application in Optimum Experimental Design. PhD thesis, Humboldt-Universität zu Berlin, 2011.
  37. H. G. Bock and K. Plitt. A multiple shooting algorithm for direct solution of optimal control problems. In Proceedings of the 9th IFAC world congress, Budapest. Pergamon Press, 1984.
  38. R. Hettich and K. O. Kortanek. Semi-infinite programming: Theory, methods, and applications. SIAM Review, 35(3):380–429, 1993.
  39. J. Kiefer. Optimum experimental designs. Journal of the Royal Statistical Society. Series B (Methodological), 21(2):272–319, 1959.
  40. R. Potocký and T. V. Ban. Confidence regions in nonlinear regression models. Applications of Mathematics, 37:29 – 39, 1992.
  41. J. Stoer and R. Bulirsch. Introduction to Numerical Analysis. Springer, New York, 1980.
  42. N. R. Draper and H. Smith. Applied Regression Analysis. Series in Probability and Mathematical Statistics. John Wiley & Sons, New York, 2nd edition, 1981.
  43. H. G. Bock, S. Körkel, and J. P. Schlöder. Model Based Parameter Estimation: Theory and Applications, volume 4 of Contributions in Mathematical and Computational Sciences, chapter Parameter estimation and optimum experimental design for nonlinear differential equation models, pages 1 – 30. Springer, 2013.
  44. K. Schmidt and G. Trenkler. Einführung in die Moderne Matrix-Algebra. Springer Berlin-Heidelberg, 2006.
  45. P. A. Vanrolleghem and K. J. Keesman. Identification of biodegradation models under model and data uncertainty. Water Science and Technology, 33(2):91 – 105, 1996.
  46. A. L. Koch. The Monod Model and Its Alternatives. In A. L. Koch, J. A. Robinson, and G. A. Milliken, editors, Mathematical Modeling in Microbial Ecology, Chapman & Hall Microbiology Series, pages 62–93. Springer US, 1998.
  47. M. J. D. Powell. A fast algorithm for nonlinearly constrained optimization calculations. Lecture Notes in Mathematics 630, Springer-Verlag, Berlin, pages 144–157, 1978.
  48. S. Körkel, I. Bauer, H. G. Bock, and J. P. Schlöder. A sequential approach for nonlinear optimum experimental design in DAE systems. In F. Keil, W. Mackens, H. Voss, and J. Werther, editors, Scientific Computing in Chemical Engineering II, pages 338 – 345. Springer Verlag, Berlin, Heidelberg, 1999.
  49. L. B. Rall. Automatic Differentiation: Techniques and Applications, volume 120 of Lecture Notes in Computer Science. Springer, Berlin, 1981.
  50. D. Marske. Biochemical oxygen demand data interpretation using sum of squares. Master's thesis, University of Wisconsin-Madison, 1967.
  54. G. E. P. Box and H. L. Lucas. Design of experiments in non-linear situations. Biometrika, 46(1/2):77 – 90, 1959.
  55. R. Bulirsch. Die Mehrzielmethode zur numerischen Lösung von nichtlinearen Randwertproblemen und Aufgaben der optimalen Steuerung. Technical report, Carl-Cranz-Gesellschaft, 1971.
  56. V. H. Schulz. Ein effizientes Kollokationsverfahren zur numerischen Behandlung von Mehrpunktrandwertaufgaben in der Parameteridentifizierung und Optimalen Steuerung. Master's thesis, Universität Augsburg, 1990.
  57. V. Bär. Ein Kollokationsverfahren zur numerischen Lösung allgemeiner Mehrpunktrandwertaufgaben mit Schalt- und Sprungbedingungen mit Anwendungen in der Optimalen Steuerung und Parameteridentifizierung. Master's thesis, Universität Bonn, 1983.
  58. T. W. Lohmann. Ein numerisches Verfahren zur Berechnung optimaler Versuchspläne für beschränkte Parameteridentifizierungsprobleme. Reihe Informatik. Verlag Shaker, Aachen, 1993.
  59. G. E. P. Box and N. R. Draper. Empirical model-building and response surfaces. Wiley series in probability and mathematical statistics: Applied probability and statistics. Wiley, 1987.
  61. D. G. Luenberger and Y. Ye. Linear and Nonlinear Programming. Kluwer Academic Publishers, second edition, 2003.
  62. F. W. Scholz. Maximum Likelihood Estimation. John Wiley & Sons, Inc., 2004.
  63. E. F. Camacho and C. Bordons. Model Predictive Control. Advanced Textbooks in Control and Signal Processing. Springer London, 2004.
  64. M. Morari, C.E. Garcia, and D. M. Prett. Model predictive control: Theory and practice—A survey. Automatica, 25(3):335–348, 1989.
  65. W. L. Brogan. Modern control theory. QPI series. Quantum Publishers, 1974.
  66. F. Allgöwer and A. Z. Zheng. Nonlinear Model Predictive Control: Assessment and Future Directions for Research. Birkhäuser, Progress in Systems and Control Series, Basel, 2004.
  67. A. Gifi. Nonlinear Multivariate Analysis. University of Leiden, 1981.
  68. I. Bauer, H. G. Bock, S. Körkel, and J. P. Schlöder. Numerical methods for initial value problems and derivative generation for DAE models with application to optimum experimental design of chemical processes. In F. Keil, W. Mackens, H. Voß, and J. Werther, editors, Scientific Computing in Chemical Engineering II, pages 282–289. Springer, 1999.
  69. E. Kostina and G. Kriwet. Numerical methods of robust optimum experimental design for model discrimination. Technical report, Philipps-Universität Marburg, 2014.
  70. U. M. Ascher, Robert M. M. Mattheij, and R. D. Russell. Numerical Solution of Boundary Value Problems for Ordinary Differential Equations. Classics in Applied Mathematics. Society for Industrial and Applied Mathematics, 1995.
  71. A. Dieses. Numerische Verfahren zur Diskriminierung nichtlinearer Modelle für dynamische chemische Prozesse. Master's thesis, Interdisziplinäres Zentrum für Wissenschaftliches Rechnen der Universität Heidelberg, 1997.
  72. M. Athans and P. L. Falb. Optimal control : an introduction to the theory and its applications. Lincoln Laboratory publications. McGraw-Hill, New York, Saint Louis, San Francisco, 1966.
  73. F. Tröltzsch. Optimal Control of Partial Differential Equations: Theory, Methods, and Applications. Graduate Studies in Mathematics. American Mathematical Society, 2010.
  74. D.E. Kirk. Optimal Control Theory: An Introduction. Dover Books on Electrical Engineering Series. Dover Publications, 2004.
  75. M. Hinze, R. Pinnau, M. Ulbrich, and S. Ulbrich. Optimization with PDE Constraints. Mathematical Modelling: Theory and Applications. Springer, 2010.
  76. A. C. Atkinson and A. N. Donev. Optimum Experimental Design. Oxford University Press, 1992.
  77. J. V. Gallitzendörfer. Parallele Algorithmen für Optimierungsrandwertprobleme. Forsch.-Ber. VDI, VDI Verlag, 10(514), 1997.
  78. E. Kostina and M. Nattermann. Parallele Berechnung der Kovarianzmatrix und deren Ableitung bei Mehrfachexperimenten. Technical report, Philipps-Universität Marburg, 2011.
  79. A. C. Atkinson and D. R. Cox. Planning experiments for discriminating between models. Journal of the Royal Statistical Society. Series B, 36:321 – 334, 1971.
  80. J. T. Betts. Practical Methods for Optimal Control Using Nonlinear Programming. Advances in Design and Control. Society for Industrial and Applied Mathematics, 2001.
  81. S. J. Pirt. Principles of microbe and cell cultivation. Halsted Press book. Wiley, 1975.
  82. H. G. Bock. Randwertproblemmethoden zur Parameteridentifizierung in Systemen nichtlinearer Differentialgleichungen. Bonner Mathematische Schriften, 183, 1987.
  83. H. G. Bock, M. Diehl, J. P. Schlöder, F. Allgöwer, R. Findeisen, and Z. Nagy. Real-time optimization and nonlinear model predictive control of processes governed by differential-algebraic equations. In ADCHEM 2000 - International Symposium on Advanced Control of Chemical Processes, volume 2, pages 695–703, Pisa, 2000.
  84. E. Kostina. Robust parameter estimation in dynamic systems. Optimization and Engineering, 5:461 – 484, 2004.
  85. V. V. Fedorov. Theory of Optimal Experiments. Probability and Mathematical Statistics. Academic Press, London, 1972.
  86. B. N. Parlett. The Symmetric Eigenvalue Problem. Society for Industrial and Applied Mathematics, 1998.
  87. H. L. Van Trees. Detection, Estimation, and Modulation Theory. Wiley, 2004.
  88. W. Wiechert, C. Siefke, A. de Graaf, and A. Marx. Bidirectional Reaction Steps in Metabolic Networks: II. Flux Estimation and Statistical Analysis. Biotechnology and Bioengineering, 55(1):118–135, 1997.
  89. H. G. Bock, E. Kostina, and J. P. Schlöder. Numerical methods for parameter estimation in nonlinear differential algebraic equations. GAMM-Mitteilungen, 30(2):376 – 408, 2007.
  90. C. F. Gauss. Theoria motus corporum coelestium in sectionibus conicis solem ambientium. F. Perthes & I. H. Besser, Hamburg, 1809.
  91. K. Jordán. Calculus of Finite Differences. AMS Chelsea Publishing Series. Chelsea Publishing Company, 1965.
  92. P. Chaudhuri and P. A. Mykland. Nonlinear experiments: Optimal design and inference based on likelihood. Journal of the American Statistical Association, 88(422):538–546, 1993.
  93. J. R. Magnus and H. Neudecker. Matrix Differential Calculus with Applications in Statistics and Econometrics. John Wiley & Sons Ltd, 1999.
  94. A. Pázman. Foundations of Optimum Experimental Design. Reidel, Dordrecht, 1986.
  95. A. R. Gallant. Nonlinear Statistical Models. Wiley Series in Probability and Mathematical Statistics. Wiley, New York, 1987.
  96. G. E. P. Box, J. S. Hunter, and W. G. Hunter. Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building. John Wiley & Sons, 1978.
  97. Y. Bard. Nonlinear Parameter Estimation. Academic Press, Inc., 1974.
  98. H. T. Banks, S. L. Ernstberger, and S. L. Grove. Standard errors and confidence intervals in inverse problems: Sensitivity and associated pitfalls. Journal of Inverse and Ill-Posed Problems, 10:1 – 8, 2007.
  99. H. T. Banks, S. Dediu, and S. L. Ernstberger. Sensitivity functions and their uses in inverse problems. Journal of Inverse and Ill-posed Problems, 15(7):1–32, 2007.
  100. E. M. L. Beale. Confidence regions in non-linear estimation. Journal of the Royal Statistical Society. Series B (Methodological), 22(1):41–88, 1960.
  101. R. Horn. Statistical methods for model discrimination. Applications to gating kinetics and permeation of the acetylcholine receptor channel. Biophysical Journal, 51(2):255 – 263, 1987.
  102. R. A. Fisher. The Design of Experiments. Oliver and Boyd, Edinburgh, 1935.
  103. R. Findeisen and F. Allgöwer. An introduction to nonlinear model predictive control. In 21st Benelux Meeting on Systems and Control, Veldhoven, pages 1–23, 2002.
  104. E. E. Leamer. Model choice and specification analysis. In Z. Griliches and M. D. Intriligator, editors, Handbook of Econometrics, volume 1, chapter 5, pages 285–330. Elsevier, 1 edition, 1983.
  105. D. Birkes and Y. Dodge. Alternative Methods of Regression. John Wiley & Sons, 1993.
  106. D. Q. Mayne, J. B. Rawlings, C. V. Rao, and P. O. M. Scokaert. Constrained model predictive control: Stability and optimality. Automatica, 36(6):789–814, June 2000.
  107. I. Bauer, H. G. Bock, S. Körkel, and J. P. Schlöder. Numerical methods for opti- mum experimental design in DAE systems. Journal of Computational and Applied Mathematics, 120(1-2):1–15, 2000.
  108. W. H. Kwon, A. M. Bruckstein, and T. Kailath. Stabilizing state-feedback design via the moving horizon method. International Journal of Control, 37(3):631–643, 1983.
  109. E. Kostina and M. Nattermann. A higher order sensitivity analysis of parameter estimation problems. Technical report, Philipps-Universität Marburg, 2013.


* The document is freely accessible on the Internet - information on usage rights