Mathematics

Probability Generating Function

A probability generating function (PGF) is a mathematical tool used to describe the probability distribution of a discrete random variable. It is defined as a power series whose coefficients are the probabilities of the individual outcomes, so it provides a compact way to calculate those probabilities; it is particularly useful for analyzing counts such as the number of successes in a fixed number of trials. The function is widely used in the study of probability and statistics.
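As a quick illustrative sketch (not taken from any excerpt below), the PGF of a Binomial(n, p) variable is G(z) = (q + pz)^n, and two standard identities — G(1) = 1 (probabilities sum to one) and G′(1) = ℰ[X] — can be checked numerically. The helper name `pgf` is ours:

```python
import math

# PGF of X ~ Binomial(n, p): G(z) = E[z^X] = (q + p z)^n, with q = 1 - p
def pgf(z, n, p):
    return ((1 - p) + p * z) ** n

n, p = 4, 0.5

# Probabilities sum to 1: G(1) = 1
assert abs(pgf(1.0, n, p) - 1.0) < 1e-12

# G'(1) = E[X] = n p, checked with a central finite difference
h = 1e-6
mean = (pgf(1 + h, n, p) - pgf(1 - h, n, p)) / (2 * h)
assert abs(mean - n * p) < 1e-6
```

The same coefficient-extraction idea underlies the excerpts that follow.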

Written by Perlego with AI-assistance

4 Key excerpts on "Probability Generating Function"

  • Probably Not

    Future Prediction Using Probability and Statistical Inference

    • Lawrence N. Dworsky (Author)
    • 2019 (Publication Date)
    • Wiley (Publisher)
    2 Probability Distribution Functions and Some Math Basics

    The Probability Distribution Function

    (Aside) This section includes a primer/review of the mathematical notations introduced in this chapter. The mathematics is mostly just combinations of adding, subtracting, multiplying, and dividing lists of numbers. Since – in many cases – we will be dealing with very many numbers at a time (hundreds or even thousands of them), it is impractical to write out all of the numbers involved in a given calculation. What we do instead is introduce a summarizing notation that includes things such as subscripted variables and summation signs.
    The probability distribution function (PDF) is a powerful tool for studying and understanding probabilities. However, before discussing it, it is important to introduce the idea of a mathematical function. A mathematical function, or more simply, a function, is the mathematical equivalent of a food processor: you pour in one or more numbers, push the “grind” button for a few seconds, and pour out the resulting concoction.1 Three important facts were just presented:
    1. One or more numbers, usually called variables or independent variables, go into the function, depending upon the particular function “recipe” we're dealing with.
    2. The function somehow processes these numbers to produce a result (a number) which is usually called the value of the function for the particular variable(s) that went into the function.
    3. The function produces exactly one value (result) for any given input variable(s).
    Fact 1 is true in the general case, but the cases we will be dealing with have only one number going into the function – so from now on we'll limit our discussion to this case. As an example, consider the function “Double the number coming in and then add 3 to it.” If the number coming in is 5, the result is 13; if the number coming in is 2, the result is 7, etc.
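The "double the number and add 3" recipe above can be written as a one-line function; a minimal sketch:

```python
def f(x):
    # "Double the number coming in and then add 3 to it."
    return 2 * x + 3

# Exactly one value out for each value in, as the three facts state
assert f(5) == 13
assert f(2) == 7
```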
  • Elements of Simulation
    An alternative generating function is the probability generating function, defined by G(z) = ℰ[z^X]. M.g.f.’s for some of the distributions considered earlier in this chapter are given in Table 2.1. For the distributions of Table 2.1, the m.g.f. may be used to check the values of means and variances given earlier, since M′(0) = ℰ[X] and M″(0) = ℰ[X²], illustrating why the m.g.f. is so named. A glance at the m.g.f.’s of Table 2.1 shows that binomial, negative-binomial, and gamma random variables can be expressed as convolutions of identically distributed random variables. We see why this is so as follows:

    Table 2.1 Common distributions and associated moment generating functions

        geometric: Pr(X = i) = q^(i−1) p
            m.g.f.: p e^θ (1 − q e^θ)^(−1), for q e^θ < 1
        binomial B(n, p): Pr(X = i) = C(n, i) p^i q^(n−i)
            m.g.f.: (q + p e^θ)^n
        negative-binomial: Pr(X = n + i) = C(n + i − 1, i) q^i p^n
            m.g.f.: p^n e^(nθ) (1 − q e^θ)^(−n), for q e^θ < 1
        Poisson: Pr(X = i) = e^(−λ) λ^i / i!
            m.g.f.: e^(λ(e^θ − 1))
        normal N(0, 1): f(x) = e^(−x²/2) / √(2π)
            m.g.f.: e^(θ²/2)
        exponential: f(x) = λ e^(−λx)
            m.g.f.: λ / (λ − θ), for θ < λ
        gamma Γ(n, λ): f(x) = e^(−λx) λ^n x^(n−1) / Γ(n)
            m.g.f.: (λ / (λ − θ))^n, for θ < λ

    Let S = Σ_{i=1..n} X_i; then

        M_S(θ) = ℰ[exp(θ Σ_{i=1..n} X_i)] = ℰ[Π_{i=1..n} exp(θ X_i)],

    and if the {X_i} are mutually independent, then

        M_S(θ) = Π_{i=1..n} ℰ[exp(θ X_i)] = Π_{i=1..n} M_{X_i}(θ).

    Furthermore, if the {X_i} have the common m.g.f. M_X(θ), say, then

        M_S(θ) = (M_X(θ))^n    (2.5)

    Thus, for example, a random variable X with the Γ(n, λ) distribution can be written as X = Σ_{i=1..n} E_i, where the E_i are independent, identically distributed exponential random variables with parameter λ (cf
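Equation (2.5) can be checked numerically in the binomial case: a B(n, p) variable is a sum of n independent Bernoulli(p) variables, so its m.g.f. should equal the nth power of the Bernoulli m.g.f. q + pe^θ. A small sketch (the helper names `bernoulli_mgf` and `binomial_mgf` are ours):

```python
import math

def bernoulli_mgf(theta, p):
    # M_X(theta) = E[e^(theta X)] = q + p e^theta for a Bernoulli(p) variable
    return (1 - p) + p * math.exp(theta)

def binomial_mgf(theta, n, p):
    # Direct expectation over the B(n, p) pmf: sum_i C(n,i) p^i q^(n-i) e^(theta i)
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) * math.exp(theta * i)
               for i in range(n + 1))

n, p = 10, 0.3
for theta in (-0.5, 0.0, 0.7):
    lhs = binomial_mgf(theta, n, p)
    rhs = bernoulli_mgf(theta, p) ** n   # Equation (2.5): M_S = (M_X)^n
    assert abs(lhs - rhs) < 1e-12 * max(1.0, abs(lhs))
```

The same check works for the negative-binomial (sum of geometrics) and gamma (sum of exponentials) rows of Table 2.1.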
  • Acceptance Sampling in Quality Control
    3

    Probability Functions

    Many sampling situations can be generalized to the extent that specific functions have proved useful in computing the probabilities associated with the operating characteristic curve and other sampling characteristics.
    These are functions of a random variable X that take on specific values x at random with a probability evaluated by the function. Such functions are of two types:
    Frequency function: It gives the relative frequency (or density) for a specific value of the random variable X. It is represented by the function f(x).
    Distribution function: It gives the cumulative probability of the random variable X up to and including a specific value of the random variable. It can be used to obtain probability over a specified range by appropriate manipulation. It is represented by F(x).
    In the case of a discrete, go/no-go, random variable

        f(x) = P(X = x)

    and the distribution function is simply the sum of the values of the frequency function up to and including x:

        F(x) = Σ_{i=0..x} f(i),    X discrete

    When X is continuous, that is, a measurement variable, it is the integral from the lowest possible value of X, taken here to be −∞, up to x:

        F(x) = ∫_{−∞}^{x} f(t) dt,    X continuous

    where the notation

        ∫_{a}^{b} f(t) dt

    may be thought of as representing the cumulative probability of f(t) from a lower limit of a to an upper limit of b. In either case, these functions provide a tool for assessment of sampling plans and usually have been sufficiently well tabulated to avoid extensive mathematical calculation.
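The discrete case can be sketched for the fair six-sided die used as an illustration below (function names are ours; exact arithmetic via fractions keeps the probabilities readable):

```python
from fractions import Fraction

# Frequency function f(x) = P(X = x) for a fair six-sided die
def f(x):
    return Fraction(1, 6) if x in range(1, 7) else Fraction(0)

# Distribution function F(x): sum of f over all values up to and including x
def F(x):
    return sum(f(i) for i in range(1, x + 1))

assert F(3) == Fraction(1, 2)  # P(X <= 3) = 3/6
assert F(6) == 1               # cumulative probability reaches 1
```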
    The probability functions can be simply illustrated by a single toss of a six-sided die. Here, the random variable X
  • Statistics

    A Concise Mathematical Introduction for Students, Scientists, and Engineers

    • David W. Scott (Author)
    • 2020 (Publication Date)
    • Wiley (Publisher)
    These are dimensionless quantities.

    3.3.6 Moment Generating Function

    Calculating all of these moments can become quite tedious. Fortunately, in this era of easy symbolic computation, there is an alternative. As the name suggests, the moment generating function (MGF), which is defined as

        M(t) = E[e^(tX)],    (3.27)

    allows us to extract the kth non‐central moment via differentiation, and finally the central moments via manipulations such as in Equation (3.17). Assuming the MGF is a “nice” function, the kth non‐central moment, which we denote by μ′_k, is given by

        μ′_k = M^(k)(0).    (3.28)

    Example: Consider the uniform density on (0, 1), for which f(x) = 1. From Equation (3.27), the MGF of the Unif(0, 1) density is M(t) = (e^t − 1)/t. Next, we compute M′(t), which in the limit as t → 0 equals 1/2. Likewise, the second derivative M″(t) equals 1/3 in the limit as t → 0. Thus, we have shown that μ′_1 = 1/2 and μ′_2 = 1/3; hence, using Equation (3.17), the variance is 1/3 − (1/2)² = 1/12. Admittedly, it would have been much easier to compute these directly without the MGF, but for other examples forthcoming, the MGF has its advantages. In particular, the MGF has other important applications in analyzing sums of random variables.
    Aside 1: Mathematica (Wolfram Research Inc. (2018)) was used to compute these formulae.
    Aside 2: The moment generating function is essentially the Laplace transform of the density. The Laplace transform is unique and invertible; however, it is not well defined for all values of t. A more general transformation is the characteristic function (CF), defined as the expectation of e^(itX), which exists for all values of t, where i = √(−1). The CF is essentially the Fourier transform. We have no need for the extra generality, and will rely on the MGF only, rather than the CF.

    3.3.7 Measurement Scales and Units of Measurement

    The primary distinction we have drawn in the definition of a random variable is whether it is discrete or continuous
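The uniform example can be checked numerically: with M(t) = (e^t − 1)/t, central finite differences at t = 0 should recover the first two non-central moments 1/2 and 1/3. A minimal sketch (helper name `mgf_uniform` is ours):

```python
import math

def mgf_uniform(t):
    # M(t) = (e^t - 1)/t for Unif(0,1); expm1 avoids cancellation near t = 0,
    # and the removable singularity at t = 0 has limit M(0) = 1
    return math.expm1(t) / t if t != 0 else 1.0

h = 1e-3
# First two non-central moments via central finite differences at t = 0
m1 = (mgf_uniform(h) - mgf_uniform(-h)) / (2 * h)          # ~ M'(0) = 1/2
m2 = (mgf_uniform(h) - 2 * mgf_uniform(0) + mgf_uniform(-h)) / h**2  # ~ M''(0) = 1/3

assert abs(m1 - 1/2) < 1e-6
assert abs(m2 - 1/3) < 1e-5
```

The variance then follows as m2 − m1² = 1/3 − 1/4 = 1/12, matching the direct calculation.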
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.