Instructor: Tim Chumley
Office: Clapp 423
Phone: 413-538-2525
e-mail: tchumley
Office Hours: Mondays & Wednesdays 1:00-2:00; Tuesdays & Thursdays 2:00-3:00 and 4:00-5:00; additional availability by appointment

Textbook: Introduction to Stochastic Processes with R by Robert P. Dobrow, ISBN: 9781118740651;
on library reserve under QC20.7.S8 D63 2016;
available as a free e-text


Announcements

Announcements will be posted on the Course Announcements Moodle forum throughout the semester, but essentially all other materials will be posted on this page.

Syllabus

Check the syllabus for all the important class policies (grades, attendance, etc.).

Homework

There will be weekly homework assignments to turn in throughout the semester. Please read these guidelines for writing in our class.

  • General information. A selection of problems will be assigned to be written up individually and turned in each week.
    • These problems will be due Fridays at 5 pm.
    • You may work with others but the writing should be done on your own.
  • Gradescope. Homework will be turned in through Gradescope.
    • You should be enrolled automatically. Please let me know if you have any issues logging in.
    • Gradescope has made a short tutorial on submitting homework.
  • Collaboration. I want you to work together on the homework! The process of explaining your ideas to one another really helps in learning math. However, you must write up the assignments on your own and avoid copying others’ work directly. Also, please only write what you understand so that I know where to help, and please avoid using online forums like Math StackExchange, solutions manuals, or similar resources. A huge part of learning in this class comes from figuring out how to get unstuck by spending time thinking on your own or talking to me and classmates; these other resources can end up being counter-productive in the long term.
  • Rewrites. Homework is for practice, and you are not expected to write perfect answers from the start!
    • You will be allowed to submit revisions of most problems for full credit each week.
    • Your revisions will be due on Fridays at 5 pm.
    • Please resubmit (only the problems you’re revising) on Gradescope.
Assignment    Due
Homework 0    Jan 28
Homework 1    Feb 4
Homework 2    Feb 11
Homework 3    Feb 18
Homework 4    Feb 25
Homework 5    Mar 4
Homework 6    Mar 25
Homework 7    Apr 1
Homework 8    Apr 8
Homework 9    Apr 15
Homework 10   Apr 22

Exams

There will be two midterm exams. The dates for the exams are subject to change slightly.

Exam     Due Date   Format      Material
Exam 1   Mar 11     Take home   Chapters 1-3
Exam 2   May 9      Take home   Chapters 6-7

Project

We’ll devote the last week of the semester to a mini-symposium of short group presentations. Since the field of stochastic processes is rich with interesting examples and topics, more than we could cover in a single semester, each group of students will choose a topic/application that we might otherwise not have time for in class.

Assignments

  • Project notes related to your topic are due in advance of class on April 19.
  • Presentation on your topic will be April 28 or May 3.
  • Project report on your topic is due May 3, although it will be accepted any time until May 9.

Presentations

Course plan

Our plan is to cover most of chapters 1-3 and 6-7 in the textbook, and possibly some of 4, 5, or 8, time permitting. Below is a rough outline of what is to be covered week by week through the semester. Please check back regularly for precise details on what is covered, as well as postings of class materials like lecture notes.

Introduction


Tuesday
  • Topic: Sections 1.1, 1.2: Introduction. We start to get familiar with the notion of a stochastic process and discuss the terms state space, index set, Markov chain, transition state diagram, and transition matrix.
  • Class materials: Lecture notes, Introduction to Markov chains
  • After class: Work on Homework 0.
Thursday
  • Topic: Sections 2.1, 2.3: \(n\)-step transitions. We give a formal definition of a discrete time, discrete state Markov chain and begin to discuss how the transition matrix of a Markov chain can be used to understand multi-step transition probabilities using the conditional law of total probability.
  • Class materials: Lecture notes, Transition probabilities
  • After class: Work on Homework 1 and make sure you can run RStudio in class Tuesday.

Chapter 2


Tuesday
  • Topic: Sections 2.2, 2.3, 2.4: Random walks. We continue our discussion on using matrix powers to understand \(n\)-step transition probabilities. Our main example will be a random walk on a graph, but the ideas hold for all kinds of Markov chains.
  • Class materials: Lecture notes, Matrix powers
  • After class: Read about the distribution of \(X_n\) and joint distributions in Section 2.3.
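To make the matrix-power idea concrete, here is a small sketch in R. The 4-cycle chain and the `mat_pow` helper below are my own illustration, not from the textbook:

```r
# Illustrative chain: random walk on a 4-cycle, states 1-4,
# stepping to each neighbor with probability 1/2.
P <- matrix(0, nrow = 4, ncol = 4)
for (i in 1:4) {
  P[i, (i %% 4) + 1]       <- 1/2  # clockwise neighbor
  P[i, ((i - 2) %% 4) + 1] <- 1/2  # counterclockwise neighbor
}

# n-step transition probabilities are the entries of the n-th matrix power.
mat_pow <- function(M, n) Reduce(`%*%`, replicate(n, M, simplify = FALSE))

P2 <- mat_pow(P, 2)
P2[1, ]  # from state 1, two steps: back to 1 or across to 3, each with probability 1/2
```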
Thursday
  • Topic: Section 2.3: Joint distributions. We discuss the (unconditional) distribution of \(X_n\) and joint distributions of a Markov chain at multiple time steps.
  • Class materials: Lecture notes, Distributions
  • After class: Read about a random walk on a cycle in Section 2.4 and about simulation in Example 2.19 of Section 2.5.
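In the spirit of the simulation reading, here is a hedged sketch of simulating a sample path; the function and argument names are my own, not the textbook's:

```r
# Simulate n steps of a Markov chain with transition matrix P and
# initial distribution alpha (illustrative helper, names are mine).
simulate_chain <- function(P, alpha, n) {
  k <- nrow(P)
  path <- numeric(n + 1)
  path[1] <- sample(1:k, 1, prob = alpha)               # initial state ~ alpha
  for (t in 1:n) {
    path[t + 1] <- sample(1:k, 1, prob = P[path[t], ])  # step using current row
  }
  path
}

P <- matrix(c(0.9, 0.1,
              0.2, 0.8), nrow = 2, byrow = TRUE)
set.seed(1)
path <- simulate_chain(P, alpha = c(1, 0), n = 50)
```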

Chapter 3


Tuesday
  • Topic: Section 3.1: Limiting distributions. We begin our discussion of the notion of limiting distributions. Our aim in the chapter will be to understand the long term behavior of Markov chains theoretically, but we start with some numerical work.
  • Class materials: Lecture notes, Limiting distributions
  • After class: Make sure to bring a laptop to class next time and download the rref.Rmd file in advance.
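The numerical work can be sketched as follows; the two-state chain is my own example, not from the text:

```r
# Numerical experiment: raising P to a high power makes the rows agree,
# and the common row is the limiting distribution.
P <- matrix(c(0.7, 0.3,
              0.4, 0.6), nrow = 2, byrow = TRUE)
Pn <- diag(2)
for (i in 1:100) Pn <- Pn %*% P
Pn  # both rows are approximately (4/7, 3/7)
```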
Thursday
  • Topic: Section 3.2: Stationary distributions. We begin our discussion of how to turn the problem of understanding limiting distributions from a question about limits to a question about algebra. This entails introducing the idea of a stationary distribution and solving linear systems of equations.
  • Class materials: Lecture notes, Stationary distributions
  • After class: Post a question on Piazza about Problem 3 in today’s worksheet or some other piece of class material so far. Feel free to respond to others’ questions. For Tuesday, read Examples 3.8 and 3.11 and the definition of communication class.
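The algebraic approach can be sketched in R: solve \(\pi P = \pi\) together with the normalization \(\sum_i \pi_i = 1\) by swapping one redundant equation for the normalization condition. The helper name and example chain below are my own:

```r
# Solve pi P = pi with sum(pi) = 1 as a linear system.
stationary <- function(P) {
  k <- nrow(P)
  A <- t(diag(k) - P)          # A %*% pi = 0  encodes  pi P = pi
  A[k, ] <- rep(1, k)          # swap in the equation sum(pi) = 1
  solve(A, c(rep(0, k - 1), 1))
}

P <- matrix(c(0.5,  0.5,  0,
              0.25, 0.5,  0.25,
              0,    0.5,  0.5), nrow = 3, byrow = TRUE)
stationary(P)  # (0.25, 0.5, 0.25)
```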

Chapter 3


Tuesday
  • Topic: Section 3.3: Communication classes. We discuss what it means for states to communicate and get introduced to the ideas of recurrence and transience. This discussion, while separate from limiting and stationary distributions, is the first piece to developing an understanding of the Limit Theorem for Regular Matrices and more broadly the question of which Markov chains have a limiting distribution.
  • Class materials: Lecture notes, Communication classes
  • After class: Read pages 96-98 on recurrence and transience.
Thursday
  • Topic: Section 3.3: Recurrence and transience. Our aim is to deepen our understanding of recurrent and transient states. We discuss the related ideas of excursions, regeneration, and the geometric distribution in order to give a quantitative criterion for showing a state is recurrent or transient.
  • Class materials: Lecture notes, Excursions
  • After class: Finish Problem 2 of today’s worksheet. Read Theorem 3.3, Corollary 3.4, and Lemma 3.5.
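A quick numerical illustration of the series criterion (the two-state chain is my own example): state 1 is transient exactly because \(\sum_n P^n_{11}\) converges.

```r
# From state 1, stay with probability 1/2 or jump to the absorbing state 2,
# so P^n[1,1] = (1/2)^n and the return series sums to a finite value.
P <- matrix(c(0.5, 0.5,
              0,   1), nrow = 2, byrow = TRUE)
total <- 0
Pn <- diag(2)
for (n in 1:200) {
  Pn <- Pn %*% P
  total <- total + Pn[1, 1]
}
total  # approaches 1, a finite value, so state 1 is transient
```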

Chapter 3


Tuesday
  • Topic: Section 3.3: Canonical decomposition. We discuss how the recurrence and transience series criterion from last time’s worksheet can be used to show that states in a communication class are either all recurrent or all transient. We then aim to understand the structure of the canonical form of the transition matrix.
  • Class materials: Lecture notes, Canonical decomposition
  • After class: Read about first step analysis in Section 3.4, focusing on Example 3.17.
Thursday
  • Topic: Section 3.4: Irreducible Markov chains. We discuss finite state irreducible Markov chains and a theorem stating that such chains always have a unique stationary distribution whose components can be found from the expected length of an excursion. We also introduce first step analysis, a computational method for finding that expected length.
  • Class materials: Lecture notes
  • After class: Read about periodicity in Section 3.5. Put your focus on Lemma 3.7 and Example 3.18.
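First step analysis can be sketched in R as a linear system: the expected hitting times \(h_i\) of a target state satisfy \(h_i = 1 + \sum_j P_{ij} h_j\) for \(i\) not the target, with \(h_{\text{target}} = 0\). The function name and example chain below are mine, not the textbook's:

```r
# Expected time to hit `target`, by first step analysis.
expected_hitting_time <- function(P, target) {
  k <- nrow(P)
  others <- setdiff(1:k, target)
  Q <- P[others, others, drop = FALSE]
  h <- solve(diag(length(others)) - Q, rep(1, length(others)))
  out <- numeric(k)
  out[others] <- h
  out
}

P <- matrix(c(0,   1,   0,
              0.5, 0,   0.5,
              0,   1,   0), nrow = 3, byrow = TRUE)
h <- expected_hitting_time(P, target = 1)  # (0, 3, 4)
# The expected return time to state 1 is 1 + sum_j P[1,j] h[j] = 4,
# whose reciprocal 1/4 is the stationary component pi_1.
```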

Chapter 3


Tuesday
  • Topic: Sections 3.5, 3.6: Periodicity and ergodicity. Through examples we have seen two main obstructions for the existence of a limiting distribution with positive components: multiple communication classes (which obstructs the limiting matrix from having equal rows) or states having a periodic structure (which prevents \(\lim_{n \to\infty} P^n\) from existing like in the random walk on the 6-cycle). We discuss periodicity in more depth and a new theorem, the Limit Theorem for Ergodic Markov chains, which tells us that once these obstructions are overcome we must have a limiting distribution: any finite state, irreducible Markov chain whose states have period 1 must have a limiting distribution with positive components.
  • Class materials: Lecture notes, Ergodicity
  • After class: Read the opening discussion in Section 3.8 up to Example 3.27.
Thursday
  • Topic: Section 3.8: Absorbing chains. Our final aim in our study of discrete time Markov chains is to understand in detail the limiting behavior of a chain that starts in a transient state. We begin with the special case of absorbing Markov chains. These chains consist of transient states and states \(i\) which satisfy the property \(P_{ii} = 1\).
  • Class materials: Lecture notes, Absorbing chains, R code
  • After class: Read Theorem 3.11 and the second proof given.

Chapter 3


Tuesday
  • Topic: Section 3.8: Absorbing chains. We discuss the expected number of steps until absorption. We also present the idea of using absorption probabilities to understand the limiting matrix of a Markov chain with transient states and two recurrent communication classes.
  • Class materials: Lecture notes, More on absorption
  • After class: Work on the exam!
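The absorption computations can be sketched with the fundamental matrix. The example below, gambler's ruin on \(\{0,1,2,3\}\) with a fair coin, is my own illustration: with the transition matrix in canonical form, transient block \(Q\) and absorption block \(R\), the matrix \(N = (I - Q)^{-1}\) yields both expected steps to absorption and absorption probabilities.

```r
Q <- matrix(c(0,   0.5,
              0.5, 0), nrow = 2, byrow = TRUE)   # transient states 1, 2
R <- matrix(c(0.5, 0,
              0,   0.5), nrow = 2, byrow = TRUE) # absorbing states 0, 3
N <- solve(diag(2) - Q)        # fundamental matrix
steps <- N %*% rep(1, 2)       # expected steps to absorption: (2, 2)
absorb <- N %*% R              # absorption probabilities; from state 1,
                               # ruin (hitting 0) has probability 2/3
```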
Thursday
  • Topic: Exam work day, no class.
  • After class: Enjoy spring break!

Spring Break


Tuesday
  • Topic: Spring break, no class.
Thursday
  • Topic: Spring break, no class.

Chapter 6


Tuesday
  • Topic: Section 6.1: Poisson process. We introduce our first example of a continuous time Markov chain, the Poisson process, which models the occurrence of arrivals of events (for example, telephone calls or customers entering a store) over continuous time.
  • Class materials: Lecture notes, Poisson process
  • After class: Review your probability theory notes on the exponential distribution. Respond to the mid-semester feedback form.
Thursday
  • Topic: Section 6.2: Arrival times. We continue our study of the Poisson process, discussing the inter-arrival times in a Poisson process. We show that these times are exponentially distributed. The gamma distribution is also used to model the time of the \(n\)th arrival.
  • Class materials: Lecture notes, Arrival times
  • After class: Please respond to the mid-semester feedback form by Monday at 9 am. Also, take a look at Homework 7 and start thinking about a topic for the group project (see Problem 2) and feel free to come to me with questions.
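The exponential inter-arrival description gives an immediate way to simulate the process; the function name below is my own sketch:

```r
# Simulate the arrival times of a rate-lambda Poisson process on [0, tmax]
# by summing i.i.d. exponential inter-arrival times.
simulate_poisson <- function(lambda, tmax) {
  arrivals <- numeric(0)
  t <- rexp(1, rate = lambda)
  while (t <= tmax) {
    arrivals <- c(arrivals, t)
    t <- t + rexp(1, rate = lambda)
  }
  arrivals
}

set.seed(42)
arr <- simulate_poisson(lambda = 3, tmax = 10)
# length(arr) has the Poisson(lambda * tmax) distribution, so about 30 on average
```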

Chapter 6


Tuesday
  • Topic: Community day, no class.
Thursday
  • Topic: Section 6.4: Thinning and superposition. We spend the day discussing what kinds of statements and questions can be addressed about independent Poisson processes and independent exponentially distributed random variables.
  • Class materials: Lecture notes, Thinning and superposition
  • After class: Try the worksheet problems above, come ask questions about them, and read the introductory paragraphs of Sections 7.1 and 7.2.

Chapter 7


Tuesday
  • Topic: Sections 6.3, 7.1: Introduction to continuous-time Markov chains. We introduce the definition of a Markov chain with finite or countable state space that evolves in continuous time. This involves the interrelated notions of hold times (the exponentially distributed amount of time until a transition to another state occurs), the embedded chain (the discrete time Markov chain that determines which state to transition to), and the idea that in a certain sense a Poisson process can have at most one arrival in an infinitesimal time interval.
  • Class materials: Lecture notes, Hold times and embedded chains
  • After class: Read Example 7.5.
Thursday
  • Topic: Section 7.2: Alarm clocks and transition rates. We work on more examples of continuous-time Markov chains and further discuss the idea that a CTMC can be defined by its transition rates.
  • Class materials: Lecture notes, Transition rates
  • After class: Read the introductory discussion of Section 7.3 and the proof of the Kolmogorov forward equation.

Chapter 7


Tuesday
  • Topic: Section 7.3: Infinitesimal generator. We introduce the matrix \(Q\) of transition rates between states and its relationship with the transition probability function \(P(t)\). The Kolmogorov forward equation establishes a differential equation that relates these ideas, and the matrix exponential \(e^{tQ}\) enables us to solve the Kolmogorov forward equation.
  • Class materials: Lecture notes, Infinitesimal generator
  • After class: Read Theorem 7.2 and the proposition and proof on the following page, titled Stationary Distribution and Generator Matrix.
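A hedged sketch of solving the forward equation with the matrix exponential: the truncated-series helper below is my own (packages such as `expm` compute \(e^{tQ}\) more robustly), and the two-state generator is illustrative.

```r
mat_exp <- function(M, terms = 50) {
  # Truncated power series e^M = I + M + M^2/2! + ... (fine for small matrices)
  out <- diag(nrow(M))
  term <- diag(nrow(M))
  for (n in 1:terms) {
    term <- term %*% M / n
    out <- out + term
  }
  out
}

Q <- matrix(c(-1,  1,
               2, -2), nrow = 2, byrow = TRUE)  # generator: rows sum to 0
Pt <- mat_exp(0.5 * Q)   # transition probability function P(t) at t = 0.5
rowSums(Pt)              # each row sums to 1, as a transition matrix should
```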
Thursday
  • Topic: Section 7.4: Stationary distribution. We discuss limiting and stationary distributions in the regime of continuous time Markov chains. Our main goal is to establish an algebraic equation that can be solved to find stationary distributions analogous to the equation \(\pi P = \pi\) from the discrete time case.
  • Class materials: Lecture notes, Limiting distribution
  • After class: Read and take notes on your project topic in preparation for next Tuesday’s class meeting.
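In continuous time the stationary equation becomes \(\pi Q = 0\) with \(\sum_i \pi_i = 1\), and it can be solved as a linear system just as in the discrete case. The helper name and generator below are my own illustration:

```r
ctmc_stationary <- function(Q) {
  k <- nrow(Q)
  A <- t(Q)                          # A %*% pi = 0  encodes  pi Q = 0
  A[k, ] <- rep(1, k)                # replace one equation by sum(pi) = 1
  solve(A, c(rep(0, k - 1), 1))
}

Q <- matrix(c(-1,  1,
               2, -2), nrow = 2, byrow = TRUE)
ctmc_stationary(Q)  # (2/3, 1/3)
```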

Group work time


Tuesday
  • Topic: Group work time. We spend the day working in groups learning about the group project topic, discussing questions, and beginning to think about presentations.
  • Class materials: Presentation rubric
  • After class: Spend some time working on the ideas discussed in your group today. Check in with your group about outside-of-class meeting time.
Thursday
  • Topic: Group work time. We continue our group work time.
  • After class: Check in with your group about outside-of-class meeting time. Begin preparing ideas for presentation. Think about what will be presented and how the presentation will be divided between group mates.

Group work time, presentations


Tuesday
  • Topic: Group work time. We spend our last day on group work time before the presentations begin.
  • After class: Meet with your group. Do practice runs of your presentation, particularly if you’re presenting Thursday.
Thursday
  • Topic: Group presentations.

Presentations


Tuesday
  • Topic: Group presentations.

Getting help

Here are a few ways to get help:

  • Office Hours: Mondays & Wednesdays 1:00-2:00; Tuesdays & Thursdays 2:00-3:00 and 4:00-5:00; additional availability by appointment
  • Study groups: Other students in the class are a wonderful resource. I want our class to feel like a community of people working together. Please get in touch if you’d like me to help you find study partners, and please reach out if you’d like others to join your group. You may work together on homework and explain and check answers with one another, but make sure what you write reflects your own understanding so that you get useful feedback.
  • Piazza: I’ve set up an online forum for asking and answering questions about class material, including homework problems. My plan is to check the forum every day and answer questions as they come up, but I hope everyone in the class will pitch in and answer others’ questions when possible.

Resources

  • Everyone is invited to join DataCamp, which provides an introductory R tutorial. It’s a convenient way to gain some familiarity with R, a useful tool for our course and beyond. Our textbook also provides a thorough tutorial of some R basics in the appendix.
  • Our textbook also has a useful collection of R scripts available; contained there are all the R code snippets you’ll notice interspersed in the text.
  • I’ve collected some resources to help you with some basics of RMarkdown.
    • Here is an RMarkdown/LaTeX template file for writing nicely formatted documents, along with its pdf output.
    • A LaTeX quick reference is available for commonly used symbols.
    • RStudio Server is a cloud service that lets you edit and compile R and RMarkdown files through a web browser so that no local installation is needed. The server is hosted on the MHC network and you need to be on the VPN to access it if you’re away from campus.
    • You can also install R and RStudio locally on your personal computer (you must install R before RStudio), or you can use RStudio Cloud, a commercial RStudio cloud service with a free tier.
  • Here are some useful RMarkdown documents that have code snippets for ideas that will be used through the class.