Design of Experiments: Improved Experimental Methods in Aerospace Testing


Instructed by Dr. Drew Landman, Professor of Aerospace Engineering, Old Dominion University

Overview

Aerospace researchers with considerable subject-matter expertise but little formal training in the design of experiments can improve their research quality and productivity by learning statistically based experiment design. A formal approach to experiment design ensures empirical model adequacy, quantifies variability in predictions, and identifies all possible independent-variable interactions. Built-in safeguards protect against the influence of unwanted variability. Examples are drawn from specific studies that illustrate the resource savings, quality improvements, and enhanced insights gained from well-designed experiments. The class features lectures with in-class problem solving using software. Students receive note sets containing all slides presented, along with exercises and solutions.

Learning Objectives

  •  Key advantages of Design of Experiments (DOE) over traditional experiment design methods
  •  How to specify the proper volume of data to enhance the probability of success, including the concepts of inferential risk
  •  Full and fractional factorial designs to efficiently quantify main effects and interactions
  •  Experimental tactics to minimize and quantify unexplained variance
  •  Introduction to Response Surface Methods for higher order modeling
  •  Experience with experiment design software (Design Expert) including in-class exercises
  • [Detailed Outline Below]

Who Should Attend

This is a course in experimental methods that is applicable to multiple disciplines. It is intended for scientists, engineers, and technical program, project, and line managers involved in the design and execution of experimental aerospace research, or in product/process improvement. Undergraduate and graduate students in engineering and scientific disciplines would also benefit from exposure to the concepts presented in this course.

Course Requirements

The software for the course is Design Expert. Analysis is presented throughout the lectures and exercises using Design Expert; students will be sent instructions for downloading and installing the software before the course starts. Students will also need Adobe Reader and Microsoft Office (Word and Excel) to work the exercises. The textbook Design and Analysis of Experiments by D. Montgomery is strongly recommended (the 10th edition is preferred, but the 8th or 9th will also work).

 
Course Information:

Type of Course: Instructor-Led Short Course
Course Length: 2 days
AIAA CEUs available: Yes

 
 
Outline
  • Introduction
    • Overview
    • History and need for improved experimental methods in Aerospace Testing
    • Introduction to Factorial Experiments
    • Example of Factorial Experiment
  • Statistics Fundamentals
    • Basic Descriptive Statistics
    • The t and F distributions
    • Confidence Intervals
    • Simple Comparative Experiments: Two-sample t-test
    • Power and Sample size
    • Analysis ToolPak in Excel
    • Introduction to ANOVA
    • Examples with Excel
  • Single Factor Experiments
    • Why ANOVA?
    • Partitioning Variance
    • The Effects Model
    • Mean Squares
    • Significance Testing
    • ANOVA Table
    • Treatment Comparisons: Least Significant Difference
    • Residual Analysis
    • Introduction to Regression Modeling
    • Example
  • Introduction to Factorial designs
    • Main and Interaction Effects
    • 2^k Design and Analysis
    • Coded Variables
    • Regression Modeling
    • ANOVA
    • 2^3 Design for force transducer calibration
    • Example using Design Expert
  • Experiment Planning
    • Screening
    • Characterization
    • Comparison
    • Optimization
    • Use of a Design Guide
  • Wind Tunnel Testing Example
    • Factor Choices
    • Factorial Design Build in Design Expert
    • Model Building and Interpretation
    • Model Adequacy Testing
    • Summary Statistics
    • Analysis in Design Expert
  • Additional Examples
  • 2^k Design and Analysis Details
    • Center points and replication: pure error
    • Tests for curvature and model lack of fit
    • Regression models
    • Sequential Assembly
  • Second Order Modeling
    • Introduction to Response Surfaces
    • Augmenting Designs to Second Order
    • ANOVA and Regression models
    • Model Adequacy
    • Residual Analysis and Summary Statistics
    • Wind Tunnel Test Example using Design Expert
  • Fractional Factorial Designs
    • Why Use Fractional Designs?
    • Screening Designs
    • Efficient Characterization
    • Aliasing and Design Resolution
    • Foldovers
    • Ground Vehicle Aerodynamic Characterization Example
    • Wind Tunnel External Balance Calibration Case Study
  • Blocking
    • Reasons to Block
    • Blocking Examples
  • Response Surface Methods
    • Central Composite Designs
    • Box-Behnken Designs
    • Regression and Use of ANOVA
    • Wind Tunnel Case Study
  • Assign Optional At-Home Experiment
  • Review and Discuss At-Home Experiment Results
  • Introduction to Experiments with Restricted Randomization
    • Split Plot Designs: Two Levels of Variance
    • Split Plot Factorial Designs
    • ANOVA with REML
    • Model Interpretation
    • Effects on Inference
    • Second Order Example
  • Case Studies
    • Balance Calibration with Temperature Case Study
    • Mars Parachute Wind Tunnel Test Case Study
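The in-class exercises use Design Expert, but the core 2^k ideas in the outline above (coded -1/+1 variables, main effects, and interactions) can be sketched in a few lines of plain Python. The three factors A, B, C and the response values below are invented purely for illustration; this is not the course's Design Expert workflow.

```python
# Minimal 2^3 full factorial sketch in coded (-1/+1) variables.
# Hypothetical data for illustration only.
from itertools import product

# Build the 8-run design matrix in standard order, coded -1/+1.
runs = [dict(zip("ABC", levels)) for levels in product((-1, 1), repeat=3)]

# Invented noise-free responses: y = 10 + 3*A + 2*B + 1.5*A*B (C is inert).
y = [10 + 3*r["A"] + 2*r["B"] + 1.5*r["A"]*r["B"] for r in runs]

def effect(contrast):
    """Average response at the +1 level minus average at the -1 level."""
    hi = [yi for yi, c in zip(y, contrast) if c > 0]
    lo = [yi for yi, c in zip(y, contrast) if c < 0]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

main_A = effect([r["A"] for r in runs])           # 6.0 (twice the coefficient)
inter_AB = effect([r["A"] * r["B"] for r in runs])  # 3.0
main_C = effect([r["C"] for r in runs])           # 0.0, since C does not appear in y
```

Because the design is orthogonal, each effect estimate uses all eight runs, which is the efficiency argument for factorial designs over one-factor-at-a-time testing made in the course outline.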
Instructors

Dr. Drew Landman, an AIAA Associate Fellow, is Professor of Aerospace Engineering at Old Dominion University, where he has developed graduate courses in applied statistical engineering, including Design of Experiments (DOE) and Response Surface Methods. He served as Chief Engineer at the Langley Full-Scale Tunnel, where he developed DOE-based wind tunnel test programs and force measurement system calibrations. Dr. Landman has served as a consultant in DOE and DOE training to NASA, the US Navy, AEDC, the Institute for Defense Analyses, and industry.

 