class: center, middle, inverse, title-slide .title[ # Logarithms, sequences, and limits ] .author[ ###
MACS 33000
University of Chicago ]

---

# Learning objectives

* Define logarithmic and exponential functions
* Practice simplifying power, logarithmic, and exponential functions
* Define sequences
* Distinguish convergence and divergence
* Define limits
* Define continuity
* Calculate limits of sequences and functions

---

# Logarithms and exponential functions

* Important component of many mathematical and statistical methods in social science
* Exponents
* Logarithms

---

# Functions with exponents

`$$f(x) = x \times x = x^2$$`

`$$f(x) = x \times x \times x = x^3$$`

<img src="02-sequences-limits_files/figure-html/functions-with-exp-1.png" width="864" style="display: block; margin: auto;" />

---

# Common rules of exponents

* `\(x^0 = 1\)`
* `\(x^1 = x\)`
* `\(\left ( \frac{x}{y} \right )^a = \left ( \frac{x^a}{y^a} \right ) = x^a y^{-a}\)`
* `\((x^a)^b = x^{ab}\)`
* `\((xy)^a = x^a y^a\)`
* `\(x^a \times x^b = x^{a+b}\)`

---

# Logarithms

* Class of functions
* `\(\log_{b}(x) = a \Rightarrow b^a = x\)`
* What number `\(a\)` solves `\(b^a = x\)`?

---

# Base 10

`$$\log_{10}(100) = 2 \Rightarrow 10^2 = 100$$`

`$$\log_{10}(0.1) = -1 \Rightarrow 10^{-1} = 0.1$$`

<img src="02-sequences-limits_files/figure-html/log-base-10-1.png" width="864" style="display: block; margin: auto;" />

---

# Base 2

`$$\log_{2}(8) = 3 \Rightarrow 2^3 = 8$$`

`$$\log_{2}(1) = 0 \Rightarrow 2^0 = 1$$`

<img src="02-sequences-limits_files/figure-html/log-base-2-1.png" width="864" style="display: block; margin: auto;" />

---

# Base `\(e\)`

* Natural logarithm

`$$\log_{e}(e) = 1 \Rightarrow e^1 = e$$`

<img src="02-sequences-limits_files/figure-html/log-base-e-1.png" width="864" style="display: block; margin: auto;" />

* Natural logarithms are incredibly useful in math
* Often `\(\log()\)` is assumed to be a natural log
* Also written as `\(\ln()\)`

---

# Rules of logarithms

* `\(\log_b(1) = 0\)`
* `\(\log(x \times y) = \log(x) + \log(y)\)`
* `\(\log(\frac{x}{y}) = \log(x) - \log(y)\)`
* `\(\log(x^y) = y \log(x)\)`

---
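# Checking the rules in code

The exponent and logarithm rules above are easy to sanity-check numerically. A minimal sketch in Python (not part of the original deck; the values `x, y, a, b` are arbitrary, and `math.isclose` absorbs floating-point error):

```python
import math

x, y, a, b = 2.0, 5.0, 3.0, 4.0

# Rules of exponents
assert math.isclose((x / y) ** a, x ** a * y ** (-a))
assert math.isclose((x ** a) ** b, x ** (a * b))
assert math.isclose(x ** a * x ** b, x ** (a + b))

# Rules of logarithms (math.log defaults to the natural log, base e)
assert math.log(1) == 0
assert math.isclose(math.log(x * y), math.log(x) + math.log(y))
assert math.isclose(math.log(x / y), math.log(x) - math.log(y))
assert math.isclose(math.log(x ** y), y * math.log(x))

print("all rules check out")
```

Any other positive values for `x`, `y`, `a`, `b` work the same way.

---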
# Sequence

* A sequence is a function whose domain is the set of positive integers
* We'll write a sequence as

`$$\left\{u_{n} \right\}_{n=1}^{\infty} = (u_{1} , u_{2}, \ldots, u_{N}, \ldots )$$`

---

# Sequence

`$$\left\{\frac{1}{n} \right\} = (1, 1/2, 1/3, 1/4, \ldots, 1/N, \ldots )$$`

<img src="02-sequences-limits_files/figure-html/seq-1-1.gif" style="display: block; margin: auto;" />

---

# Sequence

`$$\left\{\frac{1}{n^2} \right\} = (1, 1/4, 1/9, 1/16, \ldots, 1/N^2, \ldots )$$`

<img src="02-sequences-limits_files/figure-html/seq-2-1.gif" style="display: block; margin: auto;" />

---

# Sequence

`$$\left\{\frac{1 + (-1)^n}{2} \right\} = (0, 1, 0, 1, \ldots, 0, 1, 0, 1, \ldots )$$`

<img src="02-sequences-limits_files/figure-html/seq-3-1.gif" style="display: block; margin: auto;" />

---

# Arithmetic progression

* A sequence `\(\{ u_n \}\)` with the property that the difference between each pair of successive terms is the same
* `\(u_{n+1} - u_n\)` is the same for all `\(n\)`
* The arithmetic progression with first term `\(a\)` and common difference `\(d\)` is

`$$a, a + d, a + 2d, a + 3d, \ldots$$`

* The `\(n\)`th term is given by

`$$u_n = a + (n-1)d$$`

---

# Arithmetic progression

<img src="02-sequences-limits_files/figure-html/arithmetric-progression-1.png" width="864" style="display: block; margin: auto;" />

---

# Geometric progression

* A sequence `\(\{ u_n \}\)` in which each term is obtained from the preceding one by multiplication by the same number
* The ratio `\(\frac{u_{n+1}}{u_n}\)` is the same for all `\(n\)`
* The geometric progression with first term `\(a\)` and common ratio `\(x\)` is

`$$a, ax, ax^2, ax^3, \ldots$$`

* The `\(n\)`th term is given by

`$$u_n = ax^{n-1}$$`

---

# Geometric progression

<img src="02-sequences-limits_files/figure-html/geometric-progression-1.png" width="864" style="display: block; margin: auto;" />

---

# Convergence

`$$\left\{\frac{(-1)^{n} }{n} \right \} = (-1, \frac{1}{2}, \frac{-1}{3}, \frac{1}{4}, \frac{-1}{5}, \frac{1}{6}, \frac{-1}{7}, \frac{1}{8}, \ldots )$$`

<img src="02-sequences-limits_files/figure-html/seq-convergence-1.gif" style="display: block; margin: auto;" />

---

# Convergence

A sequence `\(\left\{u_{n} \right\}_{n=1}^{\infty}\)` converges to a real number `\(A\)` if for each `\(\epsilon > 0\)` there is a positive integer `\(N\)` such that for all `\(n \geq N\)` we have `\(|u_{n} - A| < \epsilon\)`

--

* If a sequence converges, it converges to **one** number `\(A\)`
* `\(\epsilon > 0\)` is some **arbitrary** real-valued number
* `\(N\)` will depend upon `\(\epsilon\)`
* Implies that, beyond `\(N\)`, the sequence never gets further than `\(\epsilon\)` away from `\(A\)`

--

## Divergence and boundedness

* If a sequence `\(\left\{u_{n} \right\}\)` converges, we'll call it **convergent**
* If it doesn't, we'll call it **divergent**
* If there is some number `\(M\)` such that `\(|u_{n}| < M\)` for all `\(n\)`, we'll call it **bounded**

---

# An unbounded sequence

`$$\left\{ n \right \} = (1, 2, 3, 4, \ldots, N, \ldots )$$`

<img src="02-sequences-limits_files/figure-html/seq-unbounded-1.png" width="864" style="display: block; margin: auto;" />

---

# A bounded sequence that doesn't converge

`$$\left\{\frac{1 + (-1)^n}{2} \right\} = (0, 1, 0, 1, \ldots, 0, 1, 0, 1, \ldots )$$`

<img src="02-sequences-limits_files/figure-html/seq-bounded-1.png" width="864" style="display: block; margin: auto;" />

--

* All convergent sequences are bounded
* If a sequence is **constant**, `\(\left\{C \right \}\)`, it converges to `\(C\)`

---

# Algebra of sequences

Suppose `\(\left\{a_{n} \right \}\)` converges to `\(A\)` and `\(\left\{b_{n} \right\}\)` converges to `\(B\)`. Then,

* `\(\left\{a_{n} + b_{n} \right\}\)` converges to `\(A + B\)`
* `\(\left\{a_{n} b_{n} \right\}\)` converges to `\(A \times B\)`
* Suppose `\(b_{n} \neq 0\)` for all `\(n\)` and `\(B \neq 0\)`.
Then `\(\left\{\frac{a_{n}}{b_{n}} \right\}\)` converges to `\(\frac{A}{B}\)`

---

# Algebra of sequences

* Consider the sequence `\(\left\{\frac{1}{n} \right\}\)` - what does it converge to?
* Consider the sequence `\(\left\{\frac{1}{2n} \right \}\)` - what does it converge to?

--

<img src="02-sequences-limits_files/figure-html/converge-similar-denominator-1.png" width="864" style="display: block; margin: auto;" />

---

# Challenge questions

* What does `\(\left\{3 + \frac{1}{n}\right\}\)` converge to?
* What about `\(\left\{ (3 + \frac{1}{n} ) (100 + \frac{1}{n^4} ) \right\}\)`?
* Finally, `\(\left\{ \frac{ 300 + \frac{1}{n} }{100 + \frac{1}{n^4}} \right\}\)`?

---

# Limits

* Sequences `\(\leadsto\)` limits of functions
* Calculus/Real Analysis: study of functions on the **real line**
* Limit of a function: how does a function behave as it gets close to a particular point?

--

## Uses of limits

* Derivatives
* Asymptotics
* Game Theory

---

# Limit of a function

<img src="02-sequences-limits_files/figure-html/lim-sin-x-1.gif" style="display: block; margin: auto;" />

---

# Limit of a function

* Suppose `\(f: \Re \rightarrow \Re\)`
* We say that `\(f\)` has a limit `\(L\)` at `\(x_{0}\)` if, for each `\(\epsilon > 0\)`, there is a `\(\delta > 0\)` such that

`$$|f(x) - L| < \epsilon \, \text{for all} \, x \, \text{such that} \, 0 < |x - x_0 | < \delta$$`

--

* Limits are about the behavior of functions at **points**.
Here, `\(x_{0}\)`

* As with sequences, we let `\(\epsilon\)` define an **error rate**
* `\(\delta\)` defines an area around `\(x_{0}\)` where `\(f(x)\)` is going to be within our error rate

---

# Examples of limits

* The function `\(f(x) = x + 1\)` has a limit of `\(1\)` at `\(x_{0} = 0\)`

<img src="02-sequences-limits_files/figure-html/limit-xplus1-1.png" width="864" style="display: block; margin: auto;" />

---

# Proof

* **Without loss of generality** (WLOG) choose `\(\epsilon > 0\)`
* Show that there is `\(\delta_{\epsilon}\)` such that

`$$|f(x) - L| < \epsilon \, \text{for all} \, x \, \text{such that} \, 0 < |x - x_0 | < \delta$$`

`$$\begin{aligned}|(x + 1) - 1| < \epsilon \, \text{for all} \, x \, &\text{such that} \, 0 < |x - 0 | < \delta \\ |x| < \epsilon \, \text{for all} \, x \, &\text{such that} \, 0 < |x | < \delta \\ \end{aligned}$$`

--

* But if `\(\delta_{\epsilon} = \epsilon\)`, then this holds

---

# `\(f(x_{0} ) \neq L\)`

* A function can have a limit of `\(L\)` at `\(x_{0}\)` even if `\(f(x_{0} ) \neq L\)`

--

* The function `\(f(x) = \frac{x^2 - 1}{x - 1}\)` has a limit of `\(2\)` at `\(x_{0} = 1\)`

<img src="02-sequences-limits_files/figure-html/limit-discontinuous-1.png" width="864" style="display: block; margin: auto;" />

---

# `\(f(x_{0} ) \neq L\)`

* For all `\(x \neq 1\)`,

`$$\begin{aligned}\frac{x^2 - 1}{x - 1} & = \frac{(x + 1)(x - 1) }{x - 1} \\ & = x + 1 \end{aligned}$$`

* Choose `\(\epsilon > 0\)` and set `\(x_{0} = 1\)`. Then, we're looking for `\(\delta_{\epsilon}\)` such that

`$$\begin{aligned}|(x + 1) - 2 | < \epsilon \, \text{for all} \, x \, &\text{such that} \, 0 < |x - 1 | < \delta \\ |x - 1 | < \epsilon \, \text{for all} \, x \, &\text{such that} \, 0 < |x - 1 | < \delta \\ \end{aligned}$$`

* If `\(\delta_{\epsilon} = \epsilon\)`, then this is satisfied

---

# Not all functions have limits

Consider `\(f:(0,1) \rightarrow \Re\)`, `\(f(x) = \frac{1}{x}\)`.
`\(f(x)\)` does not have a limit at `\(x_{0} = 0\)`

<img src="02-sequences-limits_files/figure-html/limit-none-1.png" width="864" style="display: block; margin: auto;" />

---

# Not all functions have limits

* Choose `\(\epsilon > 0\)`. We need to show that there **does not** exist a `\(\delta > 0\)` such that

`$$\begin{aligned}|\frac{1}{x} - L| < \epsilon \, \text{for all} \, x \, &\text{such that} \, 0 < |x - 0 | < \delta \\ |\frac{1}{x} - L| < \epsilon \, \text{for all} \, x \, &\text{such that} \, 0 < |x| < \delta \\ \end{aligned}$$`

* But there is a problem

`$$\begin{aligned}\frac{1}{x} - L & < \epsilon \\ \frac{1}{x} & < \epsilon + L \\ x & > \frac{1}{L + \epsilon} \end{aligned}$$`

* This implies that there **can't** be a `\(\delta\)`: any `\(\delta\)`-neighborhood of `\(0\)` contains points `\(x < \frac{1}{L + \epsilon}\)`, which violate the bound

---

# When there are no limits!

<img src="https://media.giphy.com/media/7JvlHfd7C2GDr7zfZF/giphy.gif" style="display: block; margin: auto;" />

--

When there is no limit, this has a lot of implications with respect to how we can expect the function(s) to behave and what we can do with the function(s) mathematically.

---

# Intuitive definition of a limit

`$$\lim_{x \rightarrow x_{0}} f(x) = L$$`

--

## Right-hand limits

`$$\lim_{x \rightarrow x_{0}^{+} } f(x) = L$$`

--

## Left-hand limits

`$$\lim_{x \rightarrow x_{0}^{-} } f(x) = L$$`

---

# Algebra of limits

Suppose `\(f:\Re \rightarrow \Re\)` and `\(g: \Re \rightarrow \Re\)` with limits `\(A\)` and `\(B\)` at `\(x_{0}\)`. Then,

$$ `\begin{aligned} \text{i.) } \lim_{x \rightarrow x_{0} } (f(x) + g(x) ) & = \lim_{x \rightarrow x_{0}} f(x) + \lim_{x \rightarrow x_{0}} g(x) = A + B \\ \text{ii.) 
}\lim_{x \rightarrow x_{0} } f(x) g(x) & = \lim_{x \rightarrow x_{0}} f(x) \lim_{x\rightarrow x_{0}} g(x) = A B \end{aligned}` $$ -- Suppose `\(g(x) \neq 0\)` for all `\(x \in \Re\)` and `\(B \neq 0\)` then `\(\frac{f(x)}{g(x)}\)` has a limit at `\(x_{0}\)` and `$$\lim_{x \rightarrow x_{0}} \frac{f(x)}{g(x)} = \frac{\lim_{x\rightarrow x_{0} } f(x) }{\lim_{x \rightarrow x_{0} } g(x) } = \frac{A}{B}$$` --- # Continuity <img src="02-sequences-limits_files/figure-html/continuity-1.png" width="864" style="display: block; margin: auto;" /> * Limit exists at 1 * But hole in function * Fails the **pencil** test, **discontinuous** at 1 --- # Defining continuity * Suppose `\(f:\Re \rightarrow \Re\)` and consider `\(x_{0} \in \Re\)` * `\(f\)` is continuous at `\(x_{0}\)` if for each `\(\epsilon>0\)` there is a `\(\delta>0\)` such that if, `$$\begin{aligned}|x - x_{0} | & < \delta \text{ for all } x \in \Re \text{ then } \nonumber \\|f(x) - f(x_{0})| & < \epsilon \nonumber \end{aligned}$$` -- * Previously `\(f(x_{0})\)` was replaced with `\(L\)` * Now: `\(f(x)\)` has to converge on itself at `\(x_{0}\)` * **Continuity is more restrictive than limit** --- # Examples of continuity <img src="02-sequences-limits_files/figure-html/continuity-abs-1.png" width="864" style="display: block; margin: auto;" /> --- # Examples of continuity <img src="02-sequences-limits_files/figure-html/continuity-cos-1.png" width="864" style="display: block; margin: auto;" /> --- # Examples of continuity <img src="02-sequences-limits_files/figure-html/continuity-x-sq-1.png" width="864" style="display: block; margin: auto;" /> --- # Measuring incumbency advantage * Incumbency advantage * U.S. 
House of Representatives

* Win rate of incumbent parties `\(>90\%\)`
* Win rate of incumbent candidates `\(\approx 90\%\)` (conditional on running for reelection)
* `\(88\%\)` probability of running for reelection

--

* Runners-up
    * `\(3\%\)` chance of winning the next election
    * Only `\(20\%\)` chance of running in the next election
* Is there an electoral advantage to incumbency?
* Can this be proven through observational study?

---

# Ideal experiment

* Randomly assign incumbent parties in a district between Democrats and Republicans
* Keep all other factors constant
* Corresponding increase in Democratic electoral success in the next election would represent the overall electoral benefit due to being the incumbent party in the district
* Realistic?

---

# Regression discontinuity design

* Dichotomous treatment is a deterministic function of a single, continuous covariate
* Treatment is assigned to those individuals whose score crosses a known threshold
* If you know the score, you can reverse-engineer the treatment assignment
* Assumes as-if random assignment in the local neighborhood around the `\(50\%\)` vote-share threshold

---

# RDD + incumbency advantage

<img src="images/rdd-incumbency-advantage.png" width="269" style="display: block; margin: auto;" />

Source: [Lee, D. S. (2008). Randomized experiments from non-random selection in US House elections.
*Journal of Econometrics*, 142(2), 675-697.](https://www.sciencedirect.com/science/article/pii/S0304407607001121) --- # Continuity and limits * Let `\(f: \Re \rightarrow \Re\)` with `\(x_{0} \in \Re\)` * Then `\(f\)` is continuous at `\(x_{0}\)` if and only if `\(f\)` has a limit at `\(x_{0}\)` and that `\(\lim_{x \rightarrow x_{0} } f(x) = f(x_{0})\)` -- * Suppose `\(f\)` is continuous at `\(x_{0}\)` * This implies that `\(|f(x) - f(x_0)| < \epsilon \, \text{for all} \, x \, \text{such that} \, |x - x_0 | < \delta\)` * Definition of a **limit**, with `\(L = f(x_{0})\)` -- * Suppose `\(f\)` has a limit at `\(x_{0}\)` and that limit is `\(f(x_{0})\)` * This implies that `\(|f(x) - f(x_0)| < \epsilon \, \text{for all} \, x \, \text{such that} \, |x - x_0 | < \delta\)` * This is the definition of **continuity** --- # Algebra of continuous functions Suppose `\(f:\Re \rightarrow \Re\)` and `\(g:\Re \rightarrow \Re\)` are continuous at `\(x_{0}\)`. Then, 1. `\(f(x) + g(x)\)` is continuous at `\(x_{0}\)` 1. `\(f(x) g(x)\)` is continuous at `\(x_{0}\)` 1. If `\(g(x_0) \neq 0\)`, then `\(\frac{f(x) } {g(x) }\)` is continuous at `\(x_{0}\)` --- class: center, middle, inverse # RECAP --- # Today's concepts: * Sequences * arithmetic * geometric * logs * convergence * Limits * continuity * what they tell us * how to find them --- # NEXT UP! * DERIVATIVES!!!!!!!! * Thursday: Academic integrity * Check in with any questions! (clipperton@uchicago.edu) ### TA TEAM! * Alejandro Sarria: asarria@uchicago.edu * Huanrui Chen: hchen0628@uchicago.edu * Nalin Bhatt: nalinb@uchicago.edu * Charlotte Zhou: jialez@uchicago.edu
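---

# Appendix: checking convergence and limits numerically

A short sketch (not in the original deck) of how today's convergence and limit claims can be spot-checked in Python: the sequence `\(\left\{3 + \frac{1}{n}\right\}\)` from the challenge questions should approach `\(3\)`, and `\(f(x) = \frac{x^2 - 1}{x - 1}\)` should approach `\(2\)` near `\(x_{0} = 1\)` even though `\(f(1)\)` is undefined:

```python
# Sequence {3 + 1/n}: terms settle near 3 as n grows
seq = [3 + 1 / n for n in range(1, 10001)]
assert abs(seq[-1] - 3) < 1e-3

# Limit of f(x) = (x^2 - 1)/(x - 1) at x0 = 1: f(1) is undefined,
# but values at nearby points approach L = 2 from both sides
f = lambda x: (x ** 2 - 1) / (x - 1)
for h in (1e-2, 1e-4, 1e-6):
    assert abs(f(1 + h) - 2) < 1e-1
    assert abs(f(1 - h) - 2) < 1e-1

print("numerical checks passed")
```

Numerical checks like these illustrate, but do not replace, the `\(\epsilon\)`-`\(\delta\)` proofs from the slides.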