class: center, middle, inverse, title-slide .title[ # Discrete random variables ] .author[ ###
MACS 33000
University of Chicago ]

---

`$$\newcommand{\E}{\mathrm{E}} \newcommand{\Var}{\mathrm{Var}} \newcommand{\Cov}{\mathrm{Cov}}$$`

# Misc

* CALCULATORS CLAIMED (need your own if you want one!)
* Survey: https://forms.gle/CyQnBsq7hbLtmpfX6
    * Please BE SPECIFIC so I can address concerns!
    * Instead of 'more examples', try 'lecture X was challenging. more examples on Y'
* **Upcoming dates:**
    * career services (9/10) (tentative: 11-1)
    * MACSS Econ orientation (9/20)
    * orientation (9/23) + rest of week
    * pub night (tbd)

---

# Learning objectives

* Define random variables
* Distinguish between discrete and continuous random variables
* Identify discrete random variable distributions relevant to social science
* Review measures of central tendency and dispersion
* Define expected value and variance
* Define cumulative mass functions (CMFs) for discrete random variables

---

# Random variable

* A random process or variable with a numerical outcome
* A random variable `\(X\)` is a function of the sample space
    * Number of incumbents who win
    * An indicator for whether a country defaults on a loan (1 if a default, 0 otherwise)
    * Number of casualties in a war (rather than all possible outcomes)

`$$X:\text{Sample Space} \rightarrow \Re$$`

---

# Treatment assignment

* `\(3\)` units, flipping a fair coin `\(\frac{1}{2}\)` to assign each unit
* Assign to `\(T=\)` treatment or `\(C=\)` control
* `\(X\)` = number of units that received treatment
* Defining the function

`$$X = \left \{ \begin{array} {ll}0 \text{ if } (C, C, C) \\1 \text{ if } (T, C, C) \text{ or } (C, T, C) \text{ or } (C, C, T) \\2 \text{ if } (T, T, C) \text{ or } (T, C, T) \text{ or } (C, T, T) \\3 \text{ if } (T, T, T) \end{array} \right.$$`

`$$\begin{aligned}X( (C, C, C) ) & = 0 \\X( (T, C, C)) & = 1 \\X((T, C, T)) & = 2 \\X((T, T, T)) & = 3 \end{aligned}$$`

---

# Examples

* `\(X\)` = number of calls into a congressional office in some period `\(p\)`
    * `\(X(c) = c\)`
* Outcome of an election
    * Define `\(v\)` as the proportion of the vote the candidate receives
    * Define `\(X = 1\)` if `\(v > 0.50\)`
    * Define `\(X = 0\)` if `\(v \leq 0.50\)`
    * For example, if `\(v = 0.48\)`, then `\(X(v) = 0\)`
* An indicator for whether a country defaults on a loan (1 if a default, 0 otherwise)
* Not all are experimental; most are observational

---

# Discrete random variables

* A random variable with a finite or countably infinite range is **discrete**
* A random variable with an uncountably infinite number of values is **continuous**

---

# Probability mass functions

* The probability of each value that our random variable can take
* FOR DISCRETE VALUES ONLY!!
* The operator is `\(\sum\)` (sum) because we are ADDING PIECES
* Each value needs a corresponding probability (remember what we said re: probability: `\(0 \leq p \leq 1\)`)

---

# Probability mass function (PMF)

Define these for our domain `\(x\)`. NOTE: little `\(x\)` is a particular value being taken on, while big `\(X\)` is the random variable itself.

--

So, `\(P(X=x)\)` refers to the probability that our random variable takes on the value `\(x\)`. You can also see `\(P_X(X=x)\)` and similar variants.
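---

# Sketch: a PMF in R

A quick sanity check, not part of the derivation: we can enumerate the sample space of the three-unit experiment in base R and tabulate the PMF directly. The object names here are ours.

```r
# all 2^3 equally likely treatment assignments of 3 units
assignments <- expand.grid(unit1 = c("T", "C"),
                           unit2 = c("T", "C"),
                           unit3 = c("T", "C"),
                           stringsAsFactors = FALSE)

# X = number of units assigned to treatment in each outcome
x <- rowSums(assignments == "T")

# each outcome has probability 1/8, so the PMF is the tabulated share
table(x) / nrow(assignments)
#> x
#>     0     1     2     3
#> 0.125 0.375 0.375 0.125
```

These are exactly the `\(\frac{1}{8}, \frac{3}{8}, \frac{3}{8}, \frac{1}{8}\)` probabilities derived on the next slide.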
---

### Intuition

* `\(\Pr(C, T, C) = \Pr(C)\Pr(T)\Pr(C) = \frac{1}{2}*\frac{1}{2}*\frac{1}{2} = \frac{1}{8}\)` (by independence)

`$$\begin{aligned}p(X = 0) & = \Pr(C, C, C) = \frac{1}{8}\\p(X = 1) & = \Pr(T, C, C) + \Pr(C, T, C) + \Pr(C, C, T) = \frac{3}{8} \\p(X = 2) & = \Pr(T, T, C) + \Pr(T, C, T) + \Pr(C, T, T) = \frac{3}{8} \\p(X = 3) & = \Pr(T, T, T) = \frac{1}{8}\end{aligned}$$`

* `\(p(X = a) = 0\)`, for all `\(a \notin \{0, 1, 2, 3\}\)`

---

# Probability mass function

<img src="10-discrete-random-vars_files/figure-html/pmf-1.png" width="864" style="display: block; margin: auto;" />

---

# Probability mass function

* For a discrete random variable `\(X\)`

`$$p_X(x) = \Pr(X = x)$$`

* Note that

`$$\sum_x p_{X}(x) = 1$$`

--

* Can also add probabilities for smaller sets `\(S\)` of possible values of `\(X \in S\)`

`$$\Pr(X \in S) = \sum_{x \in S} p_X (x)$$`

* In the treatment example:

`$$\Pr (X > 0) = \sum_{x=1}^3 p_X (x) = \frac{3}{8} + \frac{3}{8} + \frac{1}{8} = \frac{7}{8}$$`

--

*remind you of anything* **::cough::** *integrals*

---

# Calculating our PMF!

To calculate the PMF of a random variable `\(X\)`

* For each possible value `\(x\)` of `\(X\)`, collect all possible outcomes that give rise to the event `\(\{ X=x \}\)`
* Add their probabilities to obtain `\(p_X (x)\)`

---

# Topic model example

* Topics: distinct concepts (war in Ukraine, national debt, fire department grants)
* Mathematically: a PMF on words
* Set of words `\(\{ \text{ukraine, fire, department, soldier, troop, war, grant}\}\)`

--

### Topic 1

`\(\Pr(\text{ukraine}) = 0.3\)`; `\(\Pr(\text{fire}) = 0.0001\)`; `\(\Pr(\text{department}) = 0.0001\)`; `\(\Pr(\text{soldier}) = 0.2\)`; `\(\Pr(\text{troop}) = 0.2\)`; `\(\Pr(\text{war})=0.2997\)`; `\(\Pr(\text{grant})=0.0001\)`

--

### Topic 2

`\(\Pr(\text{ukraine}) = 0.0001\)`; `\(\Pr(\text{fire}) = 0.3\)`; `\(\Pr(\text{department}) = 0.3999\)`; `\(\Pr(\text{soldier}) = 0.0001\)`; `\(\Pr(\text{troop}) = 0.0001\)`; `\(\Pr(\text{war})=0.0001\)`; `\(\Pr(\text{grant})=0.2997\)`

(Each topic's probabilities sum to 1, as any PMF must.)

--

* Topic models: take a set of documents and estimate topics

---

# Cumulative mass function

* For a random variable `\(X\)`

`$$F(x) = \Pr(X \leq x)$$`

--

* Characterizes how probability **CUMULATES** as `\(X\)` gets larger
* `\(F(x) \in [0,1]\)`
* `\(F(x)\)` is non-decreasing

---

# Three person experiment

Consider the three person experiment: `\(\Pr(T) = \Pr(C) = 1/2\)`

--

What is `\(F(2)\)`?

`$$\begin{aligned}F(2) & = \Pr(X = 0) + \Pr(X = 1) + \Pr(X = 2) \\& = \frac{1}{8} + \frac{3}{8} + \frac{3}{8} \\& = \frac{7}{8} \end{aligned}$$`

--

What is `\(F(2) - F(1)\)`?

`$$\begin{aligned} F(2) - F(1) & = [\Pr(X = 0) + \Pr(X = 1) + \Pr(X = 2)] \nonumber \\& \quad -[\Pr(X = 0) + \Pr(X = 1)] \\F(2) - F(1) & = \Pr(X = 2)\end{aligned}$$`

---

class: middle, center

# PMFs and CMFs

The PMF plays the role of the original function (the integrand); the CMF plays the role of its integral

---

# PMFs and CMFs

<img src="10-discrete-random-vars_files/figure-html/cmf-1.png" width="864" style="display: block; margin: auto;" />

* Similar to integration, but over a discrete set of values
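---

# Sketch: CMF via `cumsum()`

A minimal base R sketch (our own variable names): the CMF of a discrete random variable is a running sum of its PMF, so `cumsum()` reproduces the `\(F(2)\)` calculation above.

```r
# PMF of X = number of treated units, from the three person experiment
x <- 0:3
pmf <- c(1, 3, 3, 1) / 8

# CMF: F(x) = Pr(X <= x) is the cumulative sum of the PMF
cmf <- cumsum(pmf)
cmf
#> [1] 0.125 0.500 0.875 1.000

cmf[3]           # F(2) = 7/8
cmf[3] - cmf[2]  # F(2) - F(1) = Pr(X = 2) = 3/8
```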
---

class: middle, center

# Distributions and the functions we love

<img src="https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcSbEyaYCtKvH69f_7DIShQ_Kc67Me-j5knH0A&usqp=CAU" style="display: block; margin: auto;" />

---

# Function & Distribution overview

So far, we've looked at how to add up values. However, the values we get depend upon the underlying function. There are a few well-known distributions that we'll cover, as these will be increasingly useful as you move through your coursework.

* Bernoulli
* Binomial
* Geometric
* Poisson

--

This is only a selection of distributions; each has properties that can be useful for modeling outcomes and events, but there are many, many more that may be useful to you as well.

---

# Bernoulli

Suppose `\(X\)` is a random variable, with `\(X \in \{0, 1\}\)` and `\(\Pr(X = 1) = \pi\)`. Then we say that `\(X\)` is a **Bernoulli** random variable,

`$$p_X(k)= \pi^{k} (1- \pi)^{1 - k}$$`

for `\(k \in \{0,1\}\)` and `\(p_X(k) = 0\)` otherwise

--

`$$X \sim \text{Bernoulli}(\pi)$$`

---

# Bernoulli

Suppose we flip a fair coin and `\(Y = 1\)` if the outcome is Heads

$$
`\begin{aligned} Y & \sim \text{Bernoulli}(1/2) \nonumber \\ p_Y(1) & = (1/2)^{1} (1- 1/2)^{1 - 1} = 1/2 \nonumber \\ p_Y(0) & = (1/2)^{0} (1- 1/2)^{1 - 0} = 1 - 1/2 \nonumber \end{aligned}`
$$

---

# Bernoulli PMF

<div class="figure" style="text-align: center">
<img src="10-discrete-random-vars_files/figure-html/bernoulli-1.png" alt="Example Bernoulli probability mass functions" width="864" />
<p class="caption">Example Bernoulli probability mass functions</p>
</div>

---

# Binomial

A model to count the number of successes across `\(N\)` trials.

--

Suppose `\(X\)` is a random variable that counts the number of successes in `\(N\)` independent and identically distributed Bernoulli trials. Then `\(X\)` is a **Binomial** random variable,

--

`$$p_X(k) = {N\choose{k}}\pi^{k} (1- \pi)^{N-k}$$`

for `\(k \in \{0, 1, 2, \ldots, N\}\)` and 0 otherwise

`$$\binom{N}{k} = \frac{N!}{(N-k)! k!}$$`

--

To say that a variable `\(Y\)` has or follows a binomial distribution, we write:

`$$Y \sim \text{Binomial}(N, \pi)$$`

---

# Binomial PMF: trials

--

<img src="10-discrete-random-vars_files/figure-html/binomial-1.png" width="864" style="display: block; margin: auto;" />

---

# Binomial

Recall our experiment example

* `\(\Pr(T) = \Pr(C) = 1/2\)`
* `\(Z =\)` number of units assigned to treatment

--

$$
`\begin{aligned} Z & \sim \text{Binomial}(3, 1/2)\\ p_Z(0) & = {3\choose0} (1/2)^0 (1- 1/2)^{3-0} = 1 * \frac{1}{8}\\ p_Z(1) & = {3\choose1} (1/2)^1 (1 - 1/2)^{2} = 3 * \frac{1}{8} \\ p_Z(2) & = {3\choose2} (1/2)^2 (1- 1/2)^1 = 3 * \frac{1}{8} \\ p_Z(3) & = {3\choose3} (1/2)^3 (1 - 1/2)^{0} = 1 * \frac{1}{8} \end{aligned}`
$$

---

# Geometric

* A model to **count the number of trials** of a Bernoulli outcome **until success occurs for the FIRST time** (the successful trial itself is counted)

--

* Suppose `\(X\)` is a random variable that counts the number of tosses needed for a head to come up the first time. Its PMF is

`$$p_X(k) = (1 - p)^{k-1}p, \quad k = 1, 2, \ldots$$`

--

We can then add up all these possibilities to ensure it satisfies our criteria for probability (using the geometric series):

$$
`\begin{aligned} \sum_{k=1}^{\infty} p_X(k) &= \sum_{k=1}^{\infty} (1 - p)^{k-1}p \\ &= p \sum_{k=1}^{\infty} (1 - p)^{k-1} \\& = p * \frac{1}{1 - (1-p)} \\ &= 1 \end{aligned}`
$$

---

# Geometric PMF

<img src="10-discrete-random-vars_files/figure-html/geometric-1.png" width="864" style="display: block; margin: auto;" />
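---

# Sketch: the geometric PMF in R

A small check in base R. One caveat worth knowing: `dgeom()` parameterizes the geometric distribution as the number of *failures before* the first success, so the slide's "number of trials" PMF corresponds to `dgeom(k - 1, p)`.

```r
p <- 0.5
k <- 1:10  # trial on which the first success occurs

# PMF as written on the slide: (1 - p)^(k - 1) * p
pmf_manual <- (1 - p)^(k - 1) * p

# same thing via dgeom(), shifting by one for the convention difference
pmf_dgeom <- dgeom(k - 1, prob = p)
all.equal(pmf_manual, pmf_dgeom)
#> [1] TRUE

# the first 10 terms already capture nearly all the probability
sum(pmf_manual)
#> [1] 0.9990234
```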
---

# Poisson

* Often interested in counting the number of events that occur
    * Number of wars started
    * Number of speeches made
    * Number of bribes offered
    * Number of people waiting for licenses
* Generally referred to as **event counts**

---

# Poisson

Suppose `\(X\)` is a random variable that takes on values `\(X \in \{0, 1, 2, \ldots\}\)` and that `\(\Pr(X = k) = p_X(k)\)` is

`$$p_X(k) = e^{-\lambda} \frac{\lambda^{k}}{k!}, \quad k = 0,1,2,\ldots$$`

and `\(p_X(k) = 0\)` otherwise.

(Here, `\(\lambda > 0\)` is the mean number of events per interval of interest, e.g., per year.)

--

`$$X \sim \text{Poisson}(\lambda)$$`

---

# Poisson PMF

<img src="10-discrete-random-vars_files/figure-html/poisson-1.png" width="864" style="display: block; margin: auto;" />

---

# Example

* Suppose the number of threats a president makes in a term is given by `\(X \sim \text{Poisson}(5)\)`
* What is the probability the president will make exactly ten threats?

--

`$$p_X(10) = e^{-5} \frac{5^{10}}{10!} \approx 0.018$$`

<img src="10-discrete-random-vars_files/figure-html/poisson-president-1.png" width="864" style="display: block; margin: auto;" />

---

# Example: working backwards

* Suppose we have an event which we believe follows a Poisson distribution, say the number of students who forget their calculator for the exam. Suppose we observed that last year, all students remembered their calculator about `\(4.979\%\)` of the time. How could we figure out the likelihood that exactly one student forgets theirs in the upcoming exam?

--

`$$p_X(k) = e^{-\lambda} \frac{\lambda^{k}}{k!}$$`

--

`$$p_X(0) = e^{-\lambda} \frac{\lambda^{0}}{0!} = e^{-\lambda} = 0.04979$$`

--

`$$\ln(0.04979) = - \lambda \, \text{ therefore } \lambda \approx 3.0$$`

--

`$$p_X(1) = e^{-3} \frac{3^{1}}{1!} \approx 0.149$$`

---

# Poisson plot

<img src="10-discrete-random-vars_files/figure-html/poisson-student-1.png" width="864" style="display: block; margin: auto;" />

---

# Approximating a binomial RV

The Poisson PMF with parameter `\(\lambda\)` is a good approximation for a binomial PMF with parameters `\(N\)` and `\(\pi\)`

`$$e^{-\lambda} \frac{\lambda^{k}}{k!} \approx {N\choose{k}}\pi^{k} (1- \pi)^{N-k}, \quad \text{if } k \ll N$$`

* Assumes
    * `\(\lambda = N\pi\)`
    * `\(N\)` is very large
    * `\(\pi\)` is very small

---

# Approximating a binomial RV

* `\(N = 100\)` and `\(\pi = 0.01\)`

--

#### Using the binomial PMF

`$$\frac{100!}{95! \, 5!} * 0.01^5 (1 - 0.01)^{95} = 0.00290$$`

--

#### Using the Poisson PMF

* `\(\lambda = N\pi = 100 * 0.01 = 1\)`

`$$e^{-1} \frac{1}{5!} = 0.00306$$`
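---

# Sketch: checking the approximation in R

`dbinom()` and `dpois()` are base R's binomial and Poisson PMFs, so the comparison above takes two lines.

```r
# exact binomial probability of k = 5 successes in N = 100 trials, pi = 0.01
dbinom(5, size = 100, prob = 0.01)   # ~0.00290

# Poisson approximation with lambda = N * pi = 1
dpois(5, lambda = 1)                 # ~0.00306
```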
---

# Function recap:

* We're dealing here ONLY with **DISCRETE** functions
* The random variable takes on a countable set of values
* Functions offer a form for calculating probability
* Different assumptions for different types of functions

---

class: center, middle, inverse

# More fun with random variables

---

# Functions of random variables

Given a random variable `\(X\)`, you may wish to create a new random variable `\(Y\)` using transformations of `\(X\)`

`$$Y = g(X) = aX + b$$`

`$$g(X) = \log(X)$$`

--

* If `\(Y = g(X)\)` is a function of a random variable `\(X\)`, then `\(Y\)` is also a random variable
* All outcomes in the sample space defined with a numerical value `\(x\)` for `\(X\)` also have a numerical value `\(y = g(x)\)` for `\(Y\)`

---

# Expectation, mean, and variance

* The PMF of a random variable `\(X\)` provides probabilities of all possible values of `\(X\)`
* Often desirable to summarize this information into a single representative number
* The **expectation** of `\(X\)`

---

# Motivation

* Consider spinning a wheel of fortune many times
* At each spin, one of the numbers `\(m_1, m_2, \ldots, m_n\)` comes up with corresponding probability `\(p_1, p_2, \ldots, p_n\)`, and this is your monetary reward from that spin
* What is the amount of money that you "expect" to get "per spin"?

--

* Spin the wheel `\(k\)` times
* Let `\(k_i\)` be the number of spins on which `\(m_i\)` comes up
* Total reward: `\(m_1 k_1 + m_2 k_2 + \ldots + m_n k_n\)`
* Reward per spin:

`$$M = \frac{m_1 k_1 + m_2 k_2 + \ldots + m_n k_n}{k}$$`

* For large `\(k\)`, relative frequencies approximate probabilities:

`$$\frac{k_i}{k} \approx p_i, \quad i = 1, \ldots,n$$`

`$$M = \frac{m_1 k_1 + m_2 k_2 + \ldots + m_n k_n}{k} \approx m_1p_1 + m_2p_2 + \ldots + m_np_n$$`

---

# Expectation

Random variable `\(X\)`, with PMF `\(p_X\)`

`$$\E[X] = \sum_{x:p_X(x)>0} x \, p_X(x)$$`

---

# Example of expected value

Suppose `\(X\)` is the number of units assigned to treatment

`$$X = \left \{ \begin{array} {ll}0 \text{ if } (C, C, C) \\1 \text{ if } (T, C, C) \text{ or } (C, T, C) \text{ or } (C, C, T) \\2 \text{ if } (T, T, C) \text{ or } (T, C, T) \text{ or } (C, T, T) \\3 \text{ if } (T, T, T) \end{array} \right.$$`

What is `\(\E[X]\)`?

--

`$$\begin{aligned}\E[X] & = 0* \frac{1}{8} + 1 * \frac{3}{8} + 2 * \frac{3}{8} + 3 * \frac{1}{8} \\& = 1.5 \end{aligned}$$`

--

* A weighted average
* A measure of central tendency
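---

# Sketch: expected value in R

A one-line check of the calculation above (a sketch with our own variable names): the expectation is just the probability-weighted sum of the values.

```r
# PMF of X = number of treated units
x <- 0:3
p <- c(1, 3, 3, 1) / 8

# E[X] = sum of x * p(x), a weighted average
sum(x * p)
#> [1] 1.5
```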
---

# A single person poll

Suppose that there is a group of `\(N\)` people

* Suppose `\(M < N\)` people approve of the president's performance
* `\(N - M\)` disapprove of the president's performance

--

Draw one person `\(i\)`, with `\(\Pr(\text{Draw } i ) = \frac{1}{N}\)`

`$$X = \left \{ \begin{array} {ll} 1 \text{ if person Approves} \\ 0 \text{ if Disapproves} \\\end{array} \right.$$`

--

What is `\(\E[X]\)`?

--

`$$\begin{aligned}\E[X] & = 1 * \Pr(\text{Approve}) + 0 * \Pr(\text{Disapprove}) \\ & = 1 * \frac{M}{N} \\ & = \frac{M}{N} \end{aligned}$$`

---

# Indicator variables and probabilities

Suppose `\(A\)` is an event. Define random variable `\(I\)` such that `\(I = 1\)` if an outcome in `\(A\)` occurs and `\(I = 0\)` if an outcome in `\(A^{c}\)` occurs. Then,

`$$\E[I] = \Pr(A)$$`

--

`$$\begin{aligned}\E[I] & = 1 * \Pr(A) + 0 * \Pr(A^{c}) \\ & = \Pr(A) \end{aligned}$$`

---

# Moments

* 1st moment: `\(\E[X^1] = \E[X]\)`
* 2nd moment: `\(\E[X^2]\)`
* `\(n\)`th moment: `\(\E[X^n]\)`

---

# Variance

* Expected value is a measure of **central tendency**
* What about **spread** or **dispersion**?

--

For each value, measure its squared distance from the center. Recall that `\(\E[X]\)` is a FIXED NUMBER, not a random quantity.

`$$d(x, \E[X])^{2} = (x - \E[X])^2$$`

--

Take the weighted average of these distances:

`$$\begin{aligned} \E[(X - \E[X])^2] & = \sum_{x:p_X(x)>0} (x - \E[X])^2p_X(x)\end{aligned}$$`

--

Expanding the square:

`$$\sum_{x:p_X(x)>0} \left(x^2 p_X(x)\right)- 2 \E[X]\sum_{x:p_X(x)>0} \left(x p_X(x)\right) + \E[X]^2\sum_{x:p_X(x)>0} p_X(x)$$`

--

`$$= \E[X^2] - 2\E[X]^2 + \E[X]^2 \\ = \E[X^2] - \E[X]^2 \\= \text{Var}(X)$$`

---

# Variance

The expected value of the random variable `\((X - \E[X])^2\)`

`$$\begin{aligned}\Var(X) &= \E[(X - \E[X])^2] \\&= \E[X^2] - \E[X]^2\end{aligned}$$`

* A measure of **dispersion** of `\(X\)` around its mean
* Standard deviation of `\(X\)`: `\(\sigma_X = \sqrt{\Var(X)}\)`

---

# Calculating variance of a random variable

* Generate the PMF of the random variable `\((X - \E[X])^2\)`
* Calculate the expectation of this function

--

### Expected value rule for functions of random variables

Let `\(X\)` be a random variable with PMF `\(p_X\)`, and let `\(g(X)\)` be a function of `\(X\)`. Then, the expected value of the random variable `\(g(X)\)` is given by

`$$\E[g(X)] = \sum_{x} g(x) p_X(x)$$`

--

`$$\begin{align}\Var(X) &= \E[(X - \E[X])^2] \\\Var(X) &= \E[X^2] - \E[X]^2\end{align}$$`

---

# Bernoulli variable

* Suppose `\(Y \sim \text{Bernoulli}(\pi)\)`

`$$\begin{aligned} \E[Y] & = 1 * \Pr(Y = 1) + 0 * \Pr(Y = 0) \nonumber \\& = \pi + 0 (1 - \pi) \nonumber = \pi \\\Var(Y) & = \E[Y^2] - \E[Y]^2 \nonumber \\\E[Y^2] & = 1^{2} \Pr(Y = 1) + 0^{2} \Pr(Y = 0) \nonumber \\& = \pi \nonumber \\ \Var(Y) & = \pi - \pi^{2} \nonumber \\& = \pi(1 - \pi ) \nonumber\end{aligned}$$`

--

* `\(\E[Y] = \pi\)`
* `\(\Var(Y) = \pi(1- \pi)\)`
* What is the maximum variance?

--

`$$\begin{aligned} \Var(Y) & = \pi(1- \pi) \nonumber \\& = 0.5(1 - 0.5 ) \\& = 0.25\end{aligned}$$`

---

# Binomial

`$$Z = \sum_{i=1}^N Y_{i} \text{ where the } Y_{i} \text{ are independent } \text{Bernoulli}(\pi)$$`

--

`$$\begin{aligned}\E[Z] & = \E[Y_{1} + Y_{2} + Y_3 + \ldots + Y_N ] \\& = \sum_{i=1}^N \E[Y_{i} ] \\& = N \pi \\\Var(Z) & = \sum_{i=1}^N \Var(Y_{i}) \\& = N \pi (1-\pi)\end{aligned}$$`

(The variance step uses the independence of the `\(Y_i\)`.)
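---

# Sketch: mean and variance from a PMF in R

A quick numeric check of the treatment example against the binomial formulas (a sketch; the names are ours).

```r
# PMF of X = number of treated units
x <- 0:3
p <- c(1, 3, 3, 1) / 8

EX  <- sum(x * p)    # E[X]   = 1.5
EX2 <- sum(x^2 * p)  # E[X^2] = 3
EX2 - EX^2           # Var(X) = 0.75

# matches N * pi * (1 - pi) with N = 3, pi = 0.5
3 * 0.5 * (1 - 0.5)
#> [1] 0.75
```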
---

# Decision making using expected values

* Optimizes the choice between several candidate decisions that result in random rewards
* View the expected reward of a decision as its average payoff over a large number of trials
* Choose a decision with maximum expected reward

---

# Example: going to war

* Suppose country `\(1\)` is engaged in a conflict and can either win or lose
* Define `\(Y = 1\)` if the country wins and `\(Y = 0\)` otherwise

`$$Y \sim \text{Bernoulli}(\pi)$$`

--

* Suppose country `\(1\)` is deciding whether to fight a war
* Engaging in the war will cost the country `\(c\)`
* If they win, country `\(1\)` receives `\(B\)`
* What is `\(1\)`'s expected utility from fighting a war?

--

`$$\begin{aligned}\E[U(\text{war})] & = U(\text{war} | \text{win}) * \Pr(\text{win}) + U(\text{war} | \text{lose}) * \Pr(\text{lose}) \\& = (B - c) \Pr(Y = 1) + (- c) \Pr(Y = 0 ) \\& = B * \Pr(Y = 1) - c(\Pr(Y = 1) + \Pr(Y = 0)) \\& = B * \pi - c \end{aligned}$$`

---

# Cumulative mass function

* Defines the cumulative probability `\(F_X(x)\)` up to the value of `\(x\)`
* For a discrete random variable `\(X\)`

`$$F_X(x) = \Pr (X \leq x) = \sum_{k \leq x} p_X(k)$$`

* All random variables with a PMF have a CMF

---

# Cumulative mass function

* `\(F_X\)` is monotonically non-decreasing: if `\(x \leq y\)`, then `\(F_X(x) \leq F_X(y)\)`
* `\(F_X(x)\)` tends to `\(0\)` as `\(x \rightarrow -\infty\)`, and to `\(1\)` as `\(x \rightarrow \infty\)`
* `\(F_X(x)\)` is a piecewise constant function of `\(x\)`
* If `\(X\)` is discrete and takes integer values, the PMF and the CMF can be obtained from each other by summing or differencing:

`$$F_X(k) = \sum_{i = -\infty}^k p_X(i),$$`

`$$p_X(k) = \Pr (X \leq k) - \Pr (X \leq k-1) = F_X(k) - F_X(k-1)$$`

for all integers `\(k\)`

---

# Bernoulli CMF

<img src="10-discrete-random-vars_files/figure-html/bernoulli-cmf-1.png" width="864" style="display: block; margin: auto;" />

---

# Binomial CMF

<img src="10-discrete-random-vars_files/figure-html/binomial-cmf-1.png" width="864" style="display: block; margin: auto;" />

---

# Binomial CMF

<img src="10-discrete-random-vars_files/figure-html/binomial-cmf-2-1.png" width="864" style="display: block; margin: auto;" />

---

# Binomial CMF

<img src="10-discrete-random-vars_files/figure-html/binomial-cmf-3-1.png" width="864" style="display: block; margin: auto;" />

---

# Binomial CMF

<img src="10-discrete-random-vars_files/figure-html/binomial-cmf-4-1.png" width="864" style="display: block; margin: auto;" />

---

# Geometric CMF

<img src="10-discrete-random-vars_files/figure-html/geometric-cmf-1.png" width="864" style="display: block; margin: auto;" />

---

# Poisson CMF

<img src="10-discrete-random-vars_files/figure-html/poisson-cmf-1.png" width="864" style="display: block; margin: auto;" />

---

# RECAP:

* Random variables: numerical outcomes, with `\(P(X=x)\)` giving the probability of each value
* DISCRETE VARIABLES: **different from continuous**
* PMF: probability mass function
    * Probability of a particular value
* CMF: cumulative mass function (the discrete CDF)
    * ADD UP VALUES (the discrete analogue of integrating)
* **Distributions**:
    * Bernoulli:
        * pmf: `\(1-p\)` for `\(k = 0\)`; `\(p\)` for `\(k = 1\)`
        * mean `\(p\)`, var `\(p(1-p)\)`
    * Binomial:
        * pmf: `\(p_X(k) = {N\choose{k}}\pi^{k} (1- \pi)^{N-k}\)`
        * mean `\(N\pi\)`, var `\(N\pi(1-\pi)\)` (also written `\(npq\)`, with `\(q = 1-p\)`)
    * Geometric:
        * pmf: `\(p_X(k) = (1 - p)^{k-1}p, \quad k = 1, 2, \ldots\)`
        * mean `\(\frac{1}{p}\)`, var `\(\frac{(1-p)}{p^2}\)`
    * Poisson:
        * pmf: `\(p_X(k) = e^{-\lambda} \frac{\lambda^{k}}{k!}, \quad k = 0,1,2,\ldots\)`
        * mean `\(\lambda\)`, var `\(\lambda\)`
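---

# Sketch: checking the recap by simulation

As a closing sanity check (a sketch, not part of the derivations): simulate draws in base R and compare sample means and variances to the formulas above. Note the `rgeom()` failures-before-success convention again.

```r
set.seed(33000)

# Poisson: mean and variance should both be ~lambda
draws <- rpois(1e5, lambda = 4)
mean(draws)  # ~4
var(draws)   # ~4

# Geometric: rgeom() counts failures before the first success,
# so add 1 to match the "number of trials" convention in the recap
g <- rgeom(1e5, prob = 0.25) + 1
mean(g)      # ~4,  i.e. 1/p
var(g)       # ~12, i.e. (1-p)/p^2
```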