Mathematics

Discrete Random Variable

A discrete random variable is a variable that can take on a countable number of distinct values. These values are typically the result of counting or enumerating, such as the number of students in a class or the outcomes of rolling a die. The probability distribution of a discrete random variable can be described using a probability mass function.
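As a concrete sketch of the definition above (using a fair six-sided die as an illustrative choice, and Python purely for demonstration), a probability mass function can be written as a plain mapping from values to probabilities:

```python
from fractions import Fraction

# Probability mass function of a fair six-sided die:
# each of the six outcomes gets probability 1/6.
pmf = {face: Fraction(1, 6) for face in range(1, 7)}

# A pmf must assign non-negative mass that sums to 1.
assert all(p >= 0 for p in pmf.values())
assert sum(pmf.values()) == 1

# The probability of an event is the sum of the masses of its outcomes,
# e.g. P(roll is even) = 1/6 + 1/6 + 1/6 = 1/2.
p_even = sum(p for face, p in pmf.items() if face % 2 == 0)
print(p_even)  # 1/2
```

Using exact fractions rather than floats keeps the check `sum == 1` exact, which is convenient for small finite pmfs like this one.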

Written by Perlego with AI-assistance

9 Key excerpts on "Discrete Random Variable"

  • Introduction to Bayesian Statistics
    • William M. Bolstad, James M. Curran (Authors)
    • 2016 (Publication Date)
    • Wiley (Publisher)
    CHAPTER 5 DISCRETE RANDOM VARIABLES
    In the previous chapter, we looked at random experiments in terms of events. We also introduced probability defined on events as a tool for understanding random experiments. We showed how conditional probability is the logical way to change our belief about an unobserved event given that we observed another related event. In this chapter we introduce discrete random variables and probability distributions.
    A random variable describes the outcome of the experiment in terms of a number. If the only possible outcomes of the experiment are distinct numbers separated from each other (e.g., counts), then we say that the random variable is discrete. There are good reasons why we introduce random variables and their notation:
    1. It is quicker to describe an outcome as a random variable having a particular value than to describe that outcome in words. Any event can be formed from outcomes described by the random variable using union, intersection, and complements.
    2. The probability distribution of the discrete random variable is a numerical function. It is easier to deal with a numerical function than with probabilities defined as a function on sets (events). The probability of any possible event can be found from the probability distribution of the random variable using the rules of probability. So instead of having to know the probability of every possible event, we only have to know the probability distribution of the random variable.
    3. It becomes much easier to deal with compound events made up from repetitions of the experiment.

    5.1 Discrete Random Variables

    A number that is determined by the outcome of a random experiment is called a random variable. Random variables are denoted by uppercase letters, e.g., Y. The value the random variable takes is denoted by lowercase letters, e.g., y. A discrete random variable, Y, can only take on distinct values y_k. The number of possible values can be finite; for example, the random variable defined as “number of heads in n tosses of a coin” has possible values 0, 1, …, n
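The “number of heads in n tosses of a coin” example can be sketched numerically. Assuming a fair coin (an assumption for illustration; the excerpt does not fix the coin's bias), the distribution of Y over its n + 1 possible values is binomial:

```python
from fractions import Fraction
from math import comb

def pmf_heads(n: int, k: int) -> Fraction:
    """P(Y = k) for Y = number of heads in n fair-coin tosses:
    C(n, k) / 2**n."""
    return Fraction(comb(n, k), 2 ** n)

n = 4
# Y takes the distinct values 0, 1, ..., n, and the masses sum to 1.
assert sum(pmf_heads(n, k) for k in range(n + 1)) == 1
print(pmf_heads(n, 2))  # 3/8
```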
  • Probability Theory and Mathematical Statistics for Engineers
    • V. S. Pugachev (Author)
    • 2014 (Publication Date)
    • Pergamon (Publisher)
    CHAPTER 2 RANDOM VARIABLES

    Publisher Summary

    This chapter describes random variables. A random variable is a variable that assumes, as a result of a trial, only one of a set of possible values, and with which is connected some field of events representing its occurrences in given sets, contained in the main field of events δ. Random variables may be both scalar and vector. In correspondence with the general definition of a vector, one can call any ordered set of scalar random variables a vector random variable, or a random vector. A random variable with a countable or finite set of possible values is called a discrete random variable. The distribution of a discrete random variable is completely determined by the probabilities of all of its possible values. It is impossible to determine the distribution of a random variable with an uncountable set of possible values by the probabilities of its values; therefore, another approach to such random variables is necessary.

    2.1 General definitions. Discrete random variables

    2.1.1 Definition of a random variable

    In Section 1.2.1 an intuitive definition of a random variable was given based on experimentally observable facts, and it was shown that with every random variable may be connected some events, its occurrences in different sets. For studying random variables it is necessary that the probabilities be determined for some set of such events, i.e. that this set of events belongs to the field of events δ connected with a trial. Furthermore, it is expedient to require that this set of events be itself a field of events (a subfield of the field δ)
  • Designing High Availability Systems

    DFSS and Classical Reliability Techniques with Practical Real Life Examples

    • Zachary Taylor, Subramanyam Ranganathan (Authors)
    • 2013 (Publication Date)
    • Wiley-IEEE Press (Publisher)
    A discrete random variable has a finite or countably infinite number of values. A continuous random variable has an uncountably infinite number of values. Let us consider discrete random variables first. A discrete random variable X is a variable whose value is taken from a set of possible discrete values we have defined. For the finite case, those values are: (x_0, x_1, x_2, …, x_n, …, x_N). These values must be real numbers in order to characterize the resulting probability distributions. We will use a lowercase letter to denote the value of a random variable and an uppercase letter to denote the random variable. The set of events we are interested in (φ_1, φ_2, …, φ_M) must be mapped to these discrete values (x_0, x_1, x_2, …, x_n, …, x_N) by the random variable X. The set of discrete values associated with this mapping is referred to as the value of X(φ). The random variable represents a set of random values that are possible for a particular sample space. Typically, X(φ) is abbreviated to X (the sample point parameter is omitted for brevity), and X represents all possible values in the sample set. The sample space S is called the domain of the random variable, and the set of all values of X is called the range of the random variable. Note that two or more different sample values may give the same value X(φ), but two different numbers in the range cannot be assigned to the same sample value. We can write the general relation:

        X(φ_m) = x_n    (4.3)

    Once we have this relationship, we are interested in the probability that the events (φ_1, φ_2, …, φ_M) occur. For each x_n, we associate a probability that the random variable X is equal to x_n, that is:

        P(X = x_n) = P({φ : X(φ) = x_n})    (4.4)

    As an example, when a coin is flipped, we have two possibilities: it lands on one side of the coin, heads (H), or the other side of the coin, tails (T). The total sample space we have to choose from is {H, T}. Our random variable X thus maps these two sample values to a real number
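The mapping from sample points φ to real values x_n can be sketched directly. The specific assignment below (heads → 1, tails → 0, fair coin) is an illustrative choice, not fixed by the text:

```python
from fractions import Fraction

# The random variable X(phi) as an explicit mapping from sample
# points to real values: heads -> 1, tails -> 0.
X = {"H": 1, "T": 0}
P_sample = {"H": Fraction(1, 2), "T": Fraction(1, 2)}  # fair coin assumed

def p_X_equals(x: int) -> Fraction:
    """P(X = x) = sum of P(phi) over all phi with X(phi) = x."""
    return sum((P_sample[phi] for phi in X if X[phi] == x), Fraction(0))

print(p_X_equals(1))  # 1/2
```

Because the mapping may be many-to-one, the sum over sample points is the general rule; here each value happens to come from exactly one sample point.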
  • Applied Medical Statistics
    • Jingmei Jiang (Author)
    • 2022 (Publication Date)
    • Wiley (Publisher)

    Definition 4.1

    Assume that E is a random experiment with sample space Ω = {ω}. If for each ω ∈ Ω, there is one and only one numeric value X(ω) corresponding to it, then a function X(ω) defined on Ω is obtained, and X(ω) is called a random variable.
    We typically use capital letters X, Y, Z, … to denote random variables.
    For example, in the experiment of flipping one coin, the sample space is Ω = {ω_1, ω_2}, where ω_1 denotes heads and ω_2 denotes tails. According to Definition 4.1, the random variable X is a function of ω defined on Ω:

        X(ω) = 1 if ω = ω_1 (heads), and X(ω) = 0 if ω = ω_2 (tails)

    For the experiment of flipping 50 coins and counting the number k of “heads” events, the sample space is Ω_3 = {ω_i; i = 0, 1, 2, …, 50} and the random variable X can be expressed as

        X(ω_i) = i, for i = 0, 1, 2, …, 50
    An important characteristic of random variables is that each value of the variable has a corresponding probability. Therefore, to fully understand a random variable, we need to know:
    1. every possible value, or the interval of values of the random variable.
    2. the probabilities corresponding to the values or value ranges.
    Two classes of random variables exist: discrete and continuous. A discrete random variable can assume only certain values, either finite or countably infinite. A continuous random variable can assume values that cannot be enumerated or may be expressed within intervals of real numbers. In the following, we discuss the discrete random variable and the continuous random variable separately because these two classes are handled somewhat differently. We first examine the pattern of behavior and predictability of discrete random variables in this chapter. Then we introduce the continuous random variable in Chapter 5 .

    4.2 Probability Distribution of the Discrete Random Variable

    4.2.1 Probability Mass Function

    In statistics, the pattern of behavior of a discrete random variable is described by the probability mass function (pmf). We illustrate this concept using Example 4.1
  • Statistical Theory
    • Bernard Lindgren (Author)
    • 2017 (Publication Date)
    • Routledge (Publisher)
    3Random Variables
    In selecting a population member at random, or performing any experiment of chance for which a probability structure has been postulated, it is common for one to focus on and observe some quantity or quality associated with the outcome. Since what is observed depends on the outcome of an experiment of chance, we call it a random variable. Thus, the random variable X is simply a function X defined on a probability space Ω.
    What one observes about an outcome ω may be a number, or possibly a vector of numbers. Some reserve the terms “random variable” and “random vector” for these cases, respectively. However, qualitative aspects of ω are also variable, and there is no reason not to think of them as random variables. Thus, when ω denotes an individual drawn at random from some population, the age of that individual is a numerical random variable, and eye color is a qualitative or categorical random variable.

    3.1 Discrete Random Variables

    Consider first the case of discrete Ω, in which all subsets are “events” and have assigned probabilities. The value-space of a function X can only be discrete, and we can define probabilities in this space by assigning probabilities to the individual possible “values.” (These can be either numbers or category labels.) Thus, the probability of [X = a] is the probability of the set of ω's such that X(ω) = a:

        P(X = a) = P({ω : X(ω) = a}) = Σ_{X(ω) = a} P(ω).

    With probabilities of individual values so defined, the value space of X is a discrete probability space.
    When Ω is not discrete, the value space of a function X may or may not be discrete. When the value-space of X is discrete (whether Ω itself is discrete or not), we say that X is a discrete random variable. The possible values are countable: {x_1, x_2, …}. This value-space is a discrete probability space when we define the probability of a set of values as the sum of the probabilities of its elements. The distribution of probability in this space, also referred to as the distribution of X, is defined by the probability function (p.f.)
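The rule that P(X = a) sums P(ω) over all ω with X(ω) = a can be sketched on a small discrete Ω. The experiment below (the sum of two fair dice) is an illustrative choice, not an example from the excerpt:

```python
from fractions import Fraction
from itertools import product

# Discrete Omega: ordered pairs of two fair dice, each with mass 1/36.
omega = list(product(range(1, 7), repeat=2))
P = Fraction(1, 36)

def p_sum_equals(a: int) -> Fraction:
    """P(X = a) for X(w) = w[0] + w[1]: sum P(w) over all w with X(w) = a."""
    return sum((P for w in omega if w[0] + w[1] == a), Fraction(0))

# The value space of X is itself a discrete probability space:
# the masses of the possible values 2, ..., 12 sum to 1.
assert sum(p_sum_equals(a) for a in range(2, 13)) == 1
print(p_sum_equals(7))  # 1/6
```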
  • Handbook of Probability
    • Ionut Florescu, Ciprian A. Tudor (Authors)
    • 2013 (Publication Date)
    • Wiley (Publisher)
    Chapter Four Random Variables: The Discrete Case

    4.1 Introduction/Purpose of the Chapter

    This chapter treats discrete random variables. After having introduced the general notion of a random variable, we discuss specific cases. Discrete random variables are presented next, and continuous random variables are left to the next chapter. In this chapter we learn about calculating simple probabilities using a probability mass function. Several probability functions for discrete random variables warrant special mention because they arise frequently in real-life situations. These are the probability functions for, among others, the so-called geometric, hypergeometric, binomial, and Poisson distributions. We focus on the physical assumptions underlying the application of these functions to real problems. Although we can use computers to calculate probabilities from these distributions, it is often convenient to use special tables, or even approximate methods in which one probability function can be approximated quite closely by another. We introduce the concepts of distribution, cumulative distribution function, expectation, and variance for discrete random variables. We also discuss higher-order moments of such variables.
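One instance of the approximation the authors mention is the Poisson approximation to the binomial for large n and small p. A minimal sketch (the particular n, p below are chosen only for illustration):

```python
from math import comb, exp

def binom_pmf(n: int, p: float, k: int) -> float:
    """Binomial probability P(X = k) with parameters n, p."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(lam: float, k: int) -> float:
    """Poisson probability P(X = k) = e**(-lam) * lam**k / k!,
    built up multiplicatively to avoid overflow in lam**k / k!."""
    term = exp(-lam)
    for i in range(1, k + 1):
        term *= lam / i
    return term

# For large n and small p, Binomial(n, p) is close to Poisson(n * p).
n, p = 1000, 0.003
for k in range(5):
    assert abs(binom_pmf(n, p, k) - poisson_pmf(n * p, k)) < 1e-3
```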

    4.2 Vignette/Historical Notes

    Historically, discrete random variables were the first type of random outcomes studied in practice. The documented exchange of letters in 1654 between Pascal and Fermat was prompted by a game of dice which essentially dealt with discrete random outcomes (faces of the dice). Even earlier, the 16th-century Italian mathematician and physician Cardano wrote “On Casting the Die,” a study dealing with discrete random variables; however, it was not published until 1663, 87 years after the death of Cardano. He introduced concepts of combinatorics into calculations of probability and defined probability as “the number of favorable outcomes divided by the number of possible outcomes.”
  • Probability, Statistics, and Data

    A Fresh Approach Using R

    3 Discrete Random Variables
    DOI: 10.1201/9781003004899-3

    A statistical experiment produces an outcome in a sample space, but frequently we are more interested in a number that summarizes that outcome. For example, if we randomly select a person with a fever and provide them with a dosage of medicine, the sample space might be the set of all people who currently have a fever, or perhaps the set of all possible people who could currently have a fever. However, we are more interested in the summary value of “how much did the temperature of the patient decrease.” This is a random variable.

    Definition 3.1. Let S be the sample space of an experiment. A random variable is a function from S to the real line.

    Random variables are usually denoted by a capital letter. Many times we will abbreviate the words random variable with rv. Suppose X is a random variable. The events of interest for X are those that can be defined by a set of real numbers. For example, X = 2 is the event consisting of all outcomes s ∈ S with X(s) = 2. Similarly, X > 8 is the event consisting of all outcomes s ∈ S with X(s) > 8. In general, if U ⊂ ℝ, then X ∈ U is the event {s ∈ S | X(s) ∈ U}.

    Example 3.1. Suppose that three coins are tossed. The sample space is S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}, and all eight outcomes are equally likely, each occurring with probability 1/8. A natural random variable here is the number of heads observed, which we will call X. As a function from S to the real numbers, X is given by:

        X(HHH) = 3
        X(HHT) = X(HTH) = X(THH) = 2
        X(TTH) = X(THT) = X(HTT) = 1
        X(TTT) = 0

    The event X = 2 is the set of outcomes {HHT, HTH, THH}, and so:

        P(X = 2) = P({HHT, HTH, THH}) = 3/8.

    It is often easier, both notationally and for doing computations, to hide the sample space and focus only on the random variable. We will not always explicitly define the sample space of an experiment
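The three-coin example can be checked by enumerating the sample space; this is a direct transcription of Example 3.1, with Python as an illustrative choice (the book itself uses R):

```python
from fractions import Fraction
from itertools import product

# Sample space of three coin tosses: all 8 outcomes, equally likely.
S = ["".join(t) for t in product("HT", repeat=3)]
P = Fraction(1, 8)

def X(s: str) -> int:
    """X = number of heads observed in outcome s."""
    return s.count("H")

# P(X = 2) = P({HHT, HTH, THH}) = 3/8.
p_two_heads = sum((P for s in S if X(s) == 2), Fraction(0))
print(p_two_heads)  # 3/8
```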
  • Practitioner's Guide to Statistics and Lean Six Sigma for Process Improvements
    • Mikel J. Harry, Prem S. Mann, Ofelia C. De Hodgins, Richard L. Hulbert, Christopher J. Lacke (Authors)
    • 2011 (Publication Date)
    • Wiley (Publisher)
    Many other variables are continuous, especially those that involve some measure of weight, volume, distance, or area. Often continuous random variables are used to approximate distributions that involve money, even though money is a discrete random variable. This occurs when there is a large set of possible values, such as the price of a house. Following are a few more examples of continuous random variables:
    1. The weight of an outgoing shipment
    2. The amount of gas dispensed when the fuel pump states one gallon
    3. The distance traveled by a delivery truck in a single day
    4. The price of a gallon of gas

    This chapter is limited to discussion of discrete random variables. Chapter 12 contains discussion of continuous random variables.

    11.5 PROBABILITY DISTRIBUTIONS OF A DISCRETE RANDOM VARIABLE

    Let x be a discrete random variable. The probability distribution of x describes how the probability is distributed over the possible values of x. Example 11.3 demonstrates the concept of a probability distribution of a discrete random variable by extending Example 11.2.
    Example 11.3 Recall from Example 11.2 the outcomes and values of x corresponding to the number of customers out of four who make a deposit. This information is reproduced in Table 11.3 . In Example 11.2, we did not make any assumptions regarding how likely a customer is to make a deposit, but we will provide an example here. In addition, we will make a common assumption that customers arriving at a bank make deposits independently of each other.
    Suppose that there is a probability of .50 that any given customer makes a deposit. On the basis of this information, Table 11.3 lists the various values of x, the corresponding outcomes, and the probabilities of various values of x . Note that this example contains 16 outcomes and all of them are equally likely (because of .50 probability of deposit), so that the probability of any specific value of x
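Table 11.3 is not reproduced in this excerpt, but the distribution it tabulates can be sketched. Assuming, as the example states, independent deposits each with probability .50, all 16 outcomes are equally likely, so P(x = k) reduces to a count of outcomes:

```python
from fractions import Fraction
from math import comb

# x = number of customers out of four who make a deposit.
# With deposit probability 1/2 and independence, all 2**4 = 16
# outcomes are equally likely, so P(x = k) = C(4, k) / 16.
dist = {k: Fraction(comb(4, k), 16) for k in range(5)}

# The masses over the possible values 0..4 sum to 1.
assert sum(dist.values()) == 1
print(dist[2])  # 3/8
```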
  • Probability, Statistics, and Stochastic Processes
    • Peter Olofsson, Mikael Andersson (Authors)
    • 2012 (Publication Date)
    • Wiley (Publisher)
    Chapter 2 Random Variables

    2.1 Introduction

    We saw in the previous chapter that many random experiments have numerical outcomes. Even if the outcome itself is not numerical, as is the case in Example 1.4, where a coin is flipped twice, we often consider events that can be described in terms of numbers, for example, {the number of heads equals 2}. It would be convenient to have some mathematical notation to avoid the need to spell out all events in words. For example, instead of writing {the number of heads equals 1} and {the number of heads equals 2}, we could start by denoting the number of heads by X and consider the events {X = 1} and {X = 2}. The quantity X is then something whose value is not known before the experiment but becomes known after.
    Definition 2.1. A random variable is a real-valued variable that gets its value from a random experiment.
    There is a more formal definition that defines a random variable as a real-valued function on the sample space. If X denotes the number of heads in two coin flips, we would thus, for example, have X (HH ) = 2. In a more advanced treatment of probability theory, this formal definition is necessary, but for our purposes, Definition 2.1 is enough.
    A random variable X is thus something that does not have a value until after the experiment. Before the experiment, we can only describe the set of possible values, that is, the range of X and the associated probabilities. Let us look at a simple example.
    Example 2.1. Flip a coin twice and let X denote the number of heads. Then X has range {0, 1, 2} and the associated probabilities are

        P(X = 0) = 1/4,  P(X = 1) = 1/2,  P(X = 2) = 1/4

    and we refer to these probabilities as the distribution of X.
    In the last example, any three numbers between 0 and 1 that sum to 1 form a possible distribution (recall Section 1.4), and the particular choice in the example indicates that the coin is fair. Let us next restate some of the examples from Section 1.2 in terms of random variables. In each case, we define X
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.