Nonlinear Regression Modeling for Engineering Applications – Modeling, Model Validation, and Enabling Design of Experiments
John Wiley & Sons Inc (Publisher)
978-1-118-59797-2 (ISBN)
This book helps both academic and industrial practitioners properly classify a system, choose among the available modeling options and regression objectives, design experiments that capture critical system behaviors, fit the model coefficients to those data, and statistically characterize the resulting model. The author has used the material in an undergraduate unit-operations laboratory course and in advanced control applications.
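The workflow this description outlines (choose a model form, collect data, fit coefficients by minimizing a sum of squared deviations, then examine the residuals) can be illustrated with a short, generic sketch. This is not code from the book: the first-order model, the synthetic data, and the simple cyclic direct search (one of the optimizer families the contents cover) are all illustrative assumptions.

```python
# Illustrative sketch: simulate data from a nonlinear model, fit its two
# coefficients by minimizing the vertical sum of squared deviations (SSD)
# with a cyclic direct search, then summarize the residuals.
import math
import random

def model(x, a, b):
    # first-order response: y = a * (1 - exp(-b * x))
    return a * (1.0 - math.exp(-b * x))

random.seed(1)
xs = [0.5 * i for i in range(1, 21)]
ys = [model(x, 3.0, 0.8) + random.gauss(0, 0.05) for x in xs]  # "true" a=3.0, b=0.8

def ssd(a, b):
    return sum((y - model(x, a, b)) ** 2 for x, y in zip(xs, ys))

# cyclic direct search: perturb one coefficient at a time,
# halve the step size whenever no move improves the objective
a, b, step = 1.0, 1.0, 0.5
while step > 1e-6:
    improved = False
    for da, db in ((step, 0), (-step, 0), (0, step), (0, -step)):
        if ssd(a + da, b + db) < ssd(a, b):
            a, b = a + da, b + db
            improved = True
    if not improved:
        step *= 0.5

residuals = [y - model(x, a, b) for x, y in zip(xs, ys)]
bias = sum(residuals) / len(residuals)  # average residual, ideally near zero
print(round(a, 2), round(b, 2), round(bias, 4))
```

A real application would follow the fit with the statistical checks the book describes (bias tests, residual patterns, uncertainty propagation) rather than stopping at the coefficient values.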
R. Russell Rhinehart, Oklahoma State University, USA. Professor Rhinehart earned his Ph.D. in Chemical Engineering from North Carolina State University, USA, in 1985. His research interests include process improvement (modeling, optimization, and control) and product improvement (modeling and design). In 2004 he was named one of InTECH's 50 most influential industry innovators of the past 50 years, and in 2005 he was inducted into the Automation Hall of Fame for the Process Industries. He has published extensively in refereed journals.
Contents

Preface
Acknowledgment
Nomenclature
Symbols

Section 1: Introduction (Chapters 1 and 2)
Section 2: Preparation for Underlying Skills (Chapters 3-6)
Section 3: Regression, Validation, Design (Chapters 7-20)
Section 4: Case Studies and Data (Chapters 21-23)

1. Introductory Concepts: 1.1 Illustrative Example of Regression; 1.2 How Models Are Used; 1.3 Nonlinear Regression; 1.4 Variable Types; 1.5 Simulation; 1.6 Issues; 1.7 Takeaway; 1.8 Exercises
2. Model Types: 2.1 Model Terminology; 2.2 A Classification of Mathematical Model Types; 2.3 Steady-State and Dynamic Models (2.3.1 Steady-State Models; 2.3.2 Dynamic (Time-Dependent) Models); 2.4 Pseudo-First-Principles: Appropriated First Principles; 2.5 Pseudo-First-Principles: Pseudo-Components; 2.6 Empirical Models with Theoretical Grounding (2.6.1 Empirical Steady-State; 2.6.2 Empirical Time-Dependent); 2.7 Empirical Models with No Theoretical Grounding; 2.8 Partitioned Models; 2.9 Empirical or Phenomenological?; 2.10 Ensemble Models; 2.11 Simulators; 2.12 Stochastic and Probabilistic Models; 2.13 Linearity; 2.14 Discrete or Continuous; 2.15 Constraints; 2.16 Model Design (Architecture, Functionality, Structure); 2.17 Takeaway; 2.18 Exercises
3. Propagation of Uncertainty: 3.1 Introduction; 3.2 Sources of Error and Uncertainty; 3.3 Significant Digits; 3.4 Rounding Off; 3.5 Estimating Uncertainty on Values; 3.6 Propagating Uncertainty: Overview (Two Types, Two Ways Each) (3.6.1 Maximum Uncertainty (3.6.1.1 Numerical Approximation; 3.6.1.2 Analytical Approximation); 3.6.2 Probable Uncertainty (3.6.2.1 Propagation of Variance, Analytical Approximation; 3.6.2.2 Propagation of Variance, Numerical Approximation); 3.6.3 Generality); 3.7 Which to Report? Maximum or Probable Uncertainty; 3.8 Bootstrapping; 3.9 Bias and Precision; 3.10 Takeaway; 3.11 Exercises
4. Essential Probability and Statistics: 4.1 Variation and Its Role in Topics; 4.2 Histogram and Its pdf and CDF Views; 4.3 Constructing a Data-Based View of CDF and pdf; 4.4 Parameters That Characterize the Distribution; 4.5 Some Representative Distributions (4.5.1 Gaussian; 4.5.2 Log-Normal; 4.5.3 Logit; 4.5.4 Exponential; 4.5.5 Binomial); 4.6 Confidence Interval; 4.7 Central Limit Theorem; 4.8 Hypothesis and Testing; 4.9 Type-I and Type-II Errors, Alpha and Beta; 4.10 Essential Statistics for This Text (4.10.1 t-Test for Bias; 4.10.2 Wilcoxon Signed Rank Test for Bias; 4.10.3 r-lag-1 Autocorrelation Test; 4.10.4 Runs Test; 4.10.5 Test for Steady-State in a Noisy Signal; 4.10.6 Chi-Square Contingency Test; 4.10.7 Kolmogorov-Smirnov Distribution Test; 4.10.8 Test for Proportion; 4.10.9 F-Test for Equal Variance); 4.11 Takeaway; 4.12 Exercises
5. Simulation: 5.1 Introduction; 5.2 Three Sources of Deviations: Measurements, Inputs, Coefficients; 5.3 Two Types of Perturbations: Noise (Independent) and Drifts (Persistence); 5.4 Two Types of Influence: Additive and Scaled with Level; 5.5 Using the Inverse CDF to Generate n and u from UID(0,1); 5.6 Takeaway; 5.7 Exercises
6. Steady and Transient State Identification: 6.1 Introduction (6.1.1 General Applications; 6.1.2 Concepts and Issues in Detecting Steady-State; 6.1.3 Approaches and Issues to SSID and TSID); 6.2 Method (6.2.1 Conceptual Model; 6.2.2 Equations; 6.2.3 Coefficient, Threshold, and Sample Frequency Values; 6.2.4 Noiseless Data); 6.3 Applications (6.3.1 Applications for Process Monitoring; 6.3.2 Applications for Determining Regression Convergence); 6.4 Takeaway; 6.5 Exercises
7. Regression Target: Objective Function: 7.1 Introduction; 7.2 Experimental and Measurement Uncertainty: Static and Continuous-Valued; 7.3 Likelihood; 7.4 Maximum Likelihood; 7.5 Estimating Sigma Values; 7.6 Vertical SSD: A Limiting Consideration of Variability; 7.7 r-square as a Measure of Fit; 7.8 Normal, Total, Perpendicular SSD; 7.9 Akaho's Method; 7.10 Using the Model Inverse in Regression; 7.11 Choosing the Dependent Variable; 7.12 Model Prediction with Dynamic Models; 7.13 Model Prediction with Classification Models; 7.14 Model Prediction with Rank Models; 7.15 Probabilistic Models; 7.16 Stochastic Models; 7.17 Takeaway; 7.18 Exercises
8. Constraints: 8.1 Introduction; 8.2 Constraint Types; 8.3 Expressing Hard Constraints in the Optimization Statement; 8.4 Expressing Soft Constraints in the Optimization Statement; 8.5 Equality Constraints; 8.6 Takeaway; 8.7 Exercises
9. The Distortion of Linearizing Transforms: 9.1 Linearizing Coefficient Expression in Nonlinear Functions; 9.2 The Associated Distortion; 9.3 Sequential Coefficient Evaluation; 9.4 Takeaway; 9.5 Exercises
10. Optimization Algorithms: 10.1 Introduction; 10.2 Optimization Concepts; 10.3 Gradient-Based Optimization (10.3.1 Numerical Derivative Evaluation; 10.3.2 Steepest Descent: The Gradient; 10.3.3 Cauchy's Method; 10.3.4 Incremental Steepest Descent (ISD); 10.3.5 Newton-Raphson (NR); 10.3.6 Levenberg-Marquardt (LM); 10.3.7 Modified LM; 10.3.8 Generalized Reduced Gradient (GRG); 10.3.9 Work Assessment; 10.3.10 Successive Quadratic; 10.3.11 Perspective); 10.4 Direct Search Optimizers (10.4.1 Cyclic-Heuristic Direct Search; 10.4.2 Multi-Player Direct Search Algorithms; 10.4.3 Leapfrogging); 10.5 Takeaway; 10.6 Exercises
11. Multiple Optima: 11.1 Introduction; 11.2 Quantifying the Probability of Finding the Global Best; 11.3 Approaches to Find the Global Optimum; 11.4 Best-of-N Rule for Regression Starts; 11.5 Interpreting the CDF; 11.6 Takeaway; 11.7 Exercises
12. Regression Convergence Criterion: 12.1 Introduction; 12.2 Convergence vs. Stopping; 12.3 Traditional Criteria for Claiming Convergence; 12.4 Combining DV Influence on OF; 12.5 Use Relative Impact as Convergence Criterion; 12.6 Steady-State Convergence Criterion; 12.7 NN Validation; 12.8 Takeaway; 12.9 Exercises
13. Model Design: Desired and Undesired Model Characteristics and Effects: 13.1 Introduction; 13.2 Redundant Coefficients; 13.3 Coefficient Correlation; 13.4 Asymptotic and Uncertainty Effects When Inverted; 13.5 Irrelevant Coefficients; 13.6 Poles and Sign Flips w.r.t. the DV; 13.7 Too Many Adjustable Coefficients or Too Many Regressors; 13.8 Irrelevant Model Coefficients (13.8.1 Standard Error of the Estimate; 13.8.2 Backward Elimination; 13.8.3 Logical Tests; 13.8.4 Propagation of Uncertainty; 13.8.5 Bootstrapping); 13.9 Scale-Up or Scale-Down: Transition to New Phenomena; 13.10 Takeaway; 13.11 Exercises
14. Data Pre- and Post-Processing: 14.1 Introduction; 14.2 Pre-Processing Techniques (14.2.1 Steady and Transient State Selection; 14.2.2 Internal Consistency; 14.2.3 Truncation; 14.2.4 Averaging and Voting; 14.2.5 Data Reconciliation; 14.2.6 Real-Time Filtering for Noise Reduction (MA, FOF, STF); 14.2.7 Real-Time Filtering for Outlier Removal (Median Filter); 14.2.8 Real-Time Noise Filtering: Statistical Process Control; 14.2.9 Imputation of Input Data); 14.3 Post-Processing (14.3.1 Outliers and Criterion for Rejection; 14.3.2 Bi-Modal Residual Distribution; 14.3.3 Imputation of Response Data); 14.4 Takeaway; 14.5 Exercises
15. Incremental Model Adjustment: 15.1 Introduction; 15.2 Choosing the Adjustable Coefficient in Phenomenological Models; 15.3 Simple Approach; 15.4 Alternate Approach; 15.5 Other Approaches; 15.6 Takeaway; 15.7 Exercises
16. Model Experimental Validation: 16.1 Introduction (16.1.1 Concepts; 16.1.2 Deterministic Models; 16.1.3 Stochastic Models; 16.1.4 Reality!); 16.2 Logic-Based Validation Criteria; 16.3 Data-Based Validation Criteria and Statistical Tests (16.3.1 Continuous-Valued, Deterministic, SS or End-of-Batch (16.3.1.1 Data Patterns That Lead to Rejecting a Model; 16.3.1.2 Test for Bias; 16.3.1.3 Test for Skew or Curvature; 16.3.1.4 Test for Variance Expectation; 16.3.1.5 Test Outline); 16.3.2 Continuous-Valued, Deterministic, Transient; 16.3.3 Class/Discrete/Rank-Valued, Deterministic, Batch or Steady-State; 16.3.4 Continuous-Valued, Stochastic, Batch or Steady-State; 16.3.5 Test for Normally Distributed Residuals; 16.3.6 Experimental Procedure Validation); 16.4 Model Discrimination (16.4.1 Mechanistic Models; 16.4.2 Purely Empirical Models); 16.5 Procedure Summary; 16.6 Alternate Validation Approaches; 16.7 Takeaway; 16.8 Exercises
17. Model Prediction Uncertainty: 17.1 Introduction; 17.2 Bootstrapping; 17.3 Takeaway; 17.4 Exercises
18. Design of Experiments for Model Development and Validation: 18.1 Concept: Plan and Data; 18.2 Sufficiently Small Experimental Uncertainty Methodology; 18.3 Screening Designs: A Good Plan for an Alternate Purpose; 18.4 Experimental Design: A Plan for Validation and Discrimination (18.4.1 Continually Redesign; 18.4.2 Experimental Plan); 18.5 EHS&LP; 18.6 Visual Examples of Undesired Designs; 18.7 Sequence in an Experimental Plan; 18.8 Takeaway; 18.9 Exercises
19. Utility vs. Perfection: 19.1 Competing and Conflicting Measures of Goodness; 19.2 Attributes for Model Utility Evaluation; 19.3 Takeaway; 19.4 Exercises
20. Troubleshooting: 20.1 Introduction; 20.2 Bimodal and Multimodal Residuals; 20.3 Trends in the Residuals; 20.4 Parameter Correlation; 20.5 Convergence Criterion Too Tight, Too Loose; 20.6 Overfitting (Memorization); 20.7 Solution Procedure Encounters EXE Errors; 20.8 Not a Crisp CDF; 20.9 Outliers; 20.10 Average Residual Not Zero; 20.11 Irrelevant Model Coefficients; 20.12 Data Workup After the Trials; 20.13 Too Many rs!; 20.14 Propagation of Uncertainty Doesn't Match Residuals; 20.15 Multiple Optima; 20.16 Very Slow Progress; 20.17 All Residuals Are Zero; 20.18 Takeaway; 20.19 Exercises
21. Case Studies: 21.1 Valve Characterization on Absorber Air FCV; 21.2 Orifice Q-dot vs. i Model; 21.3 Enrollment Trends; 21.4 Algal Response to Sunlight; 21.5 Batch Reaction Kinetics
22. References
23. Appendix: 23.1 VBA Primer; 23.2 Leapfrogging Optimizer Regression Code for Steady-State Models; 23.3 Bootstrapping Code for Steady-State Model Uncertainty
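Bootstrapping recurs throughout the contents above (Sections 3.8 and 17.2, and Appendix 23.3) as a workhorse for coefficient and prediction uncertainty. A minimal, generic sketch of the idea, assuming a simple linear model and made-up data (this is not the book's code): refit the model to many resamples of the data, and read the uncertainty from the spread of the refit coefficients.

```python
# Bootstrapping sketch: estimate the uncertainty of a fitted slope by
# refitting on resamples of the data drawn with replacement.
import random

random.seed(2)
xs = list(range(1, 16))
ys = [2.0 * x + 1.0 + random.gauss(0, 1.0) for x in xs]  # "true" slope is 2.0

def fit_slope(pairs):
    # ordinary least-squares slope through the given (x, y) points
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    num = sum((x - mx) * (y - my) for x, y in pairs)
    den = sum((x - mx) ** 2 for x, _ in pairs)
    return num / den

data = list(zip(xs, ys))
slopes = []
for _ in range(500):
    sample = [random.choice(data) for _ in data]  # resample with replacement
    slopes.append(fit_slope(sample))

slopes.sort()
lo, hi = slopes[12], slopes[487]  # roughly a 95% bootstrap interval
print(round(lo, 2), round(hi, 2))
```

The same resample-and-refit loop applies unchanged to nonlinear models; only the inner fitting step changes, which is why the technique pairs naturally with the optimizers of Chapter 10.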
| Place of publication | New York |
|---|---|
| Language | English |
| Dimensions | 150 x 250 mm |
| Weight | 666 g |
| Subject area | Mathematics / Computer Science ► Mathematics ► Applied Mathematics |
| | Engineering ► Mechanical Engineering |
| ISBN-10 | 1-118-59797-4 / 1118597974 |
| ISBN-13 | 978-1-118-59797-2 / 9781118597972 |
| Condition | New |