Technology & Engineering

Polynomial Regression

Polynomial regression is a form of regression analysis that models the relationship between an independent variable and a dependent variable as a polynomial. Fitting a polynomial equation to the data points yields a best-fitting curve, which captures non-linear relationships between the variables and makes the technique a valuable tool in predictive modeling and data analysis.
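As a quick illustration of the idea, a quadratic can be fitted by least squares in a few lines of NumPy. The data-generating model, noise level, and use of `np.polyfit` below are our own assumptions for illustration, not drawn from the excerpts that follow.

```python
# A minimal sketch: fit a quadratic curve to noisy data by least squares.
# The true coefficients (1.0, 2.0, -0.5) and noise level are invented.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = 1.0 + 2.0 * x - 0.5 * x**2 + rng.normal(scale=0.3, size=x.size)

coeffs = np.polyfit(x, y, deg=2)   # returns [b2, b1, b0], highest power first
y_hat = np.polyval(coeffs, x)      # fitted curve at the observed x values
```

With enough points and modest noise, the estimated coefficients land close to the true values used to generate the data.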

Written by Perlego with AI-assistance

5 Key excerpts on "Polynomial Regression"

  • Statistical Methods for Engineers and Scientists
    • Robert M. Bethea(Author)
    • 2018(Publication Date)
    • CRC Press
      (Publisher)
9 Regression Analysis

    9.1 Introduction

    Regression is a highly useful statistical technique for developing a quantitative relationship between a dependent variable and one or more independent variables. It utilizes experimental data on the pertinent variables to develop a numerical relationship showing the influence of the independent variables on a dependent variable of the system.
    Throughout engineering, regression may be applied to correlating data in a wide variety of problems ranging from the simple correlation of physical properties to the analysis of a complex industrial system. If nothing is known from theory about the relationship among the pertinent variables, a function may be assumed and fitted to experimental data on the system. Frequently, a linear function is assumed. In other cases where a linear function does not fit the experimental data properly, the engineer might try a polynomial or exponential function.

    9.2 Simple Linear Regression

In the simplest case the proposed functional relationship between two variables is

Y = β0 + β1X + ε.     (9.1)

In this model Y is the dependent variable, X the independent variable, and ε a random error (or residual), which is the amount of variation in Y not accounted for by the linear relationship. The parameters β0 and β1, called the regression coefficients, are unknown and are to be estimated. The variable X is not a random variable but takes on fixed values. It will be assumed that the errors ε are independent and have a normal distribution with mean 0 and variance σ², regardless of what fixed value of X is being considered. Taking the expectation of both sides of Eq. (9.1), we have

E(Y) = β0 + β1X,     (9.2)
    where we note that the expected value of the errors is zero.
In the simple linear regression model, the variable X can be taken to be a random variable, in which case Eq. (9.2) is written as

E(Y|X) = β0 + β1X.     (9.3)

In this representation E(Y|X) is the mean or expected value of Y given a fixed value of X. The mean E(Y|X) is a conditional mean, that is, the mean of Y given X. This conditional mean can be written as

E(Y|X) = μY|X.

Equation (9.3) is called the regression of Y on X.
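The least-squares estimates of β0 and β1 in the model above have a simple closed form, sketched below. The data values are hypothetical, chosen only to show the computation.

```python
# Sketch of the closed-form least-squares estimates for Y = beta0 + beta1*X + eps.
# The x and y values are hypothetical illustration data.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# Slope: Sxy / Sxx; intercept: chosen so the fitted line passes through the means
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
residuals = y - (b0 + b1 * x)      # estimates of the random errors eps
```

Note that the residuals of a least-squares line always sum to zero, mirroring the assumption that the errors ε have mean 0.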
  • Introduction to Linear Regression Analysis
    • Douglas C. Montgomery, Elizabeth A. Peck, G. Geoffrey Vining(Authors)
    • 2021(Publication Date)
    • Wiley
      (Publisher)
CHAPTER 7 POLYNOMIAL REGRESSION MODELS

    7.1 INTRODUCTION

The linear regression model y = Xβ + ε is a general model for fitting any relationship that is linear in the unknown parameters β. This includes the important class of polynomial regression models. For example, the second-order polynomial in one variable

y = β0 + β1x + β2x² + ε

and the second-order polynomial in two variables

y = β0 + β1x1 + β2x2 + β11x1² + β22x2² + β12x1x2 + ε

are linear regression models.
    Polynomials are widely used in situations where the response is curvilinear, as even complex nonlinear relationships can be adequately modeled by polynomials over reasonably small ranges of the x’s. This chapter will survey several problems and issues associated with fitting polynomials.

    7.2 POLYNOMIAL MODELS IN ONE VARIABLE

    7.2.1 Basic Principles

As an example of a polynomial regression model in one variable, consider

y = β0 + β1x + β2x² + ε.     (7.1)

This model is called a second-order model in one variable. It is also sometimes called a quadratic model, since the expected value of y is

E(y) = β0 + β1x + β2x²,

which describes a quadratic function. A typical example is shown in Figure 7.1 (an example of a quadratic polynomial). We often call β1 the linear effect parameter and β2 the quadratic effect parameter. The parameter β0 is the mean of y when x = 0 if the range of the data includes x = 0. Otherwise β0 has no physical interpretation.
In general, the kth-order polynomial model in one variable is

y = β0 + β1x + β2x² + ⋯ + βk x^k + ε.     (7.2)

If we set xj = x^j, j = 1, 2, …, k, then Eq. (7.2) becomes a multiple linear regression model in the k regressors x1, x2, …, xk. Thus, a polynomial model of order k may be fitted using the techniques studied previously.
    Polynomial models are useful in situations where the analyst knows that curvilinear effects are present in the true response function. They are also useful as approximating functions to unknown and possibly very complex nonlinear relationships. In this sense, the polynomial model is just the Taylor series expansion of the unknown function. This type of application seems to occur most often in practice.
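The substitution described above — setting xj = x^j so that the polynomial becomes an ordinary multiple linear regression — can be sketched directly. The data and the choice k = 2 below are illustrative assumptions.

```python
# Sketch: fit a kth-order polynomial as a multiple linear regression by
# building a design matrix whose columns are the successive powers of x.
# The data-generating coefficients (1.0, 0.5, 2.0) and k = 2 are invented.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 2.0, 40)
y = 1.0 + 0.5 * x + 2.0 * x**2 + rng.normal(scale=0.1, size=x.size)

k = 2
X = np.column_stack([x**j for j in range(k + 1)])   # columns: 1, x, x^2
beta, *_ = np.linalg.lstsq(X, y, rcond=None)        # beta = [b0, b1, b2]
```

Nothing new is needed beyond ordinary least squares: once the powers of x are treated as separate regressors, the standard machinery applies unchanged.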
  • A First Course in the Design of Experiments
    • John H. Skillings, Donald Weber(Authors)
    • 2018(Publication Date)
    • CRC Press
      (Publisher)
3. We can then obtain the prediction model

ŷ = b0 + b1x1 + b2x2 + b3x3,

which can be used to predict electric usage costs for given values of x1, x2 and x3.
    We close this section by noting that the simple linear regression model is the multiple regression model with k = 1. This fact will be utilized in the following manner. Whenever we obtain results for the multiple regression model, we have also obtained results for the simple case.
    2.5 Polynomial Regression
    When using the simple linear regression model, we assumed that a dependent variable Y and a single independent variable x had a linear relationship. While this is a reasonable assumption in many problems, there are situations where the relationship between Y and x is not linear in nature. The following example is an illustration of such a situation.
    Example 2.5.1. A large department store chain has twelve stores of approximately equal size in a district. In past years some stores stocked more Christmas trees than they could sell, and as a consequence, these stores had a lower profit on tree sales than some other stores. A study was undertaken to determine the optimal number of trees that a store should stock. For the study, each store stocked a different number of trees and the profit obtained from Christmas tree sales was recorded. The results are given in Table 2.5.1 and these data are plotted in Figure 2.5.1 .
Table 2.5.1: Christmas Tree Profit Study
Figure 2.5.1: Number of Christmas Trees Stocked
    It is clear that we cannot expect a straight line to fit the data in Figure 2.5.1 very well. Instead we need to assume some curvilinear relationship.
A common, easy-to-use, curvilinear relationship between Y and x is a polynomial. Recall that a polynomial of degree k has the form
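The profit-versus-stock situation described above is the classic case for a quadratic fit: profit rises, peaks, then falls as stock grows, so the fitted squared term should be negative and the vertex of the parabola estimates the optimal stock. The numbers below are invented for illustration, since Table 2.5.1 is not reproduced in this excerpt.

```python
# Sketch of a quadratic fit for a rise-then-fall profit curve.
# The (trees, profit) values are invented stand-ins for Table 2.5.1.
import numpy as np

trees = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
profit = np.array([95.0, 180.0, 230.0, 245.0, 225.0, 170.0])

b2, b1, b0 = np.polyfit(trees, profit, deg=2)   # highest power first
optimal_stock = -b1 / (2.0 * b2)                # vertex of the fitted parabola
```

Because the fitted b2 is negative, the parabola opens downward and its vertex gives the stock level at which predicted profit is maximized.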
  • Statistical Methods in Medical Research
    • Peter Armitage, Geoffrey Berry, J. N. S. Matthews(Authors)
    • 2013(Publication Date)
    • Wiley-Blackwell
      (Publisher)

12 Further regression models for a continuous response

    12.1 Polynomial Regression

Reference was made in §11.6 to the possibility of creating new predictor variables defined as the squares of existing variables, to cope with non-linear or curvilinear relationships. This is an important idea, and is most easily studied in situations in which there is originally only one predictor variable x. Instead of the linear regression equation

E(y) = β0 + β1x     (12.1)

introduced in §7.2, we consider the polynomial model

E(y) = β0 + β1x + β2x² + ⋯ + βp x^p.     (12.2)

The highest power of x, denoted here by p, is called the degree of the polynomial. Some typical shapes of low-degree polynomial curves are shown in Fig. 12.1. The curve for p = 2, when the term in x² is added, is called quadratic; that for p = 3 cubic; and that for p = 4 quartic. Clearly, a wide variety of curves can be represented by polynomials. The quadratic curve has one peak or trough; the cubic has at most two peaks or troughs; and so on. A particular set of data may be fitted well by a portion of a low-degree polynomial even though no peaks or troughs are present. In particular, data showing a moderate amount of curvature can often be fitted adequately by a quadratic curve.
The general principle of polynomial regression analysis is to regard the successive powers of x as separate predictor variables. Thus, to fit the p-degree polynomial (12.2), we could define x1 = x, x2 = x², …, xp = x^p.
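Choosing the degree p in practice often comes down to comparing fits of successive degrees, in the spirit of the low-degree shapes discussed above. The sketch below uses invented data; note that raising the degree can never increase the residual sum of squares, since the lower-degree model is a special case of the higher-degree one.

```python
# Sketch: compare quadratic and cubic fits by residual sum of squares (RSS).
# The data-generating model (a quadratic plus noise) is invented.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-2.0, 2.0, 30)
y = 1.0 - x + 0.8 * x**2 + rng.normal(scale=0.2, size=x.size)

def rss(p):
    """Residual sum of squares for a degree-p polynomial fit."""
    c = np.polyfit(x, y, deg=p)
    return float(np.sum((y - np.polyval(c, x)) ** 2))

rss2, rss3 = rss(2), rss(3)   # rss3 <= rss2, always
```

When the true curve really is quadratic, the cubic's improvement in RSS is small relative to the extra parameter, which is the informal signal to stop at p = 2.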
  • Correlation and Regression
    Applications for Industrial Organizational Psychology and Management

IX EXPANDING REGRESSION REPERTOIRE: POLYNOMIAL AND INTERACTION TERMS

CHAPTER OBJECTIVES

After reading this chapter, you should be able to:
    • Understand why (or how) there are some nonlinear relationships between X and Y that can be analyzed using a general “linear” model.
• Define polynomial regression and suggest situations appropriate for its use.
• Explain why the linear term (e.g., X) typically remains in an analysis when using polynomial regression.
• Explain what an interaction is and suggest situations in which it would be appropriate to use an interactive regression.
• Understand why the statistical power for detecting interactions can be lower than one might have hoped for.
• Explain how changes in coding affect regressions containing interaction terms.

As in the previous chapter, we are still going to consider the multiple regression model

Y = B0 + B1X1 + B2X2 + e
However, our focus in this chapter will be on the question, How can we algebraically modify the X's to fit our theoretical requirements? For example, we will soon discuss (among other possibilities) the idea of replacing X2 by X1². Then, the underlying regression model becomes

Y = B0 + B1X1 + B2X1² + e     (IX.B)

Such a model will be handy-dandy if we believe that Y and X are related in nonlinear ways. Specifically, Equation IX.B suggests that Y is related to the square of X as well as to X itself.
    Before we begin, a comment is in order about what we mean by “linear models.” Note that we will be able to analyze some relationships between Y and X that appear to be “nonlinear.” For example, we can use the above trick of including X 2 as a predictor in the regression. On the other hand, statisticians still label the model in Equation IX.B a “general linear model.” What they mean is that Y is assumed to be a linear combination of the predictors, plus error. That is,
    Y = B 0 + B 1 × predictor#1 + B 2 × predictor#2 + ··· + e
In other words, it doesn't matter what the predictor looks like. It can be X, X², or sin X, as long as Y is a constant (B-weight) times a predictor plus another constant (B
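The point above — that the model stays "linear" because Y is a linear combination of the B-weights, however nonlinear the predictors themselves are — can be sketched with a design matrix whose columns are transforms of X. The data and true coefficients below are invented.

```python
# Sketch of a "general linear model" with nonlinear transforms of X as
# predictors: columns 1, X^2, and sin(X). True B-weights (2.0, 1.5, 3.0)
# and the noise level are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
X = np.linspace(0.0, 4.0, 60)
Y = 2.0 + 1.5 * X**2 + 3.0 * np.sin(X) + rng.normal(scale=0.2, size=X.size)

D = np.column_stack([np.ones_like(X), X**2, np.sin(X)])   # 1, X^2, sin X
B, *_ = np.linalg.lstsq(D, Y, rcond=None)                 # estimated B-weights
```

Ordinary least squares recovers the B-weights without any special machinery, because the estimation problem is linear in B even though the response surface in X is not.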