Program Info
Program Title | Program Evaluation and Data Analytics |
Course Info
Course Title | Foundations of Program Evaluation Part III |
Course Number | CPP 525 |
Canvas Shell | https://canvas.asu.edu/courses/115725 |
Course Level | Graduate |
Course Start-End | March 13th to May 3rd, 2022 |
Course Prerequisites | CPP 523 and CPP 526 |
Class Meeting Times | Asynchronous |
Class Location |
Course Instructors
David Selby, PhD | Professor |
Office Location: | virtual |
Office Hours
David Selby, PhD | By appointment | Zoom | SCHEDULE |
Lab Sessions
Discussion Session Time | Thursdays at 7:30pm |
Discussion Session Location | https://asu.zoom.us/j/6829300585 |
Assignment Discussion Board | SUBMIT A QUESTION |
Textbooks
Impact Evaluation in Practice | Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B., & Vermeersch, C. M. J. | 2011 | Free online |
Real Stats: Using Econometrics for Political Science and Public Policy | Bailey, M. A. | 2016 | Recommended Reference |
I. Course Description, Course Goal and Course Learning Objectives:
Regression serves as the foundation for modern quantitative program evaluation techniques. It is a powerful set of tools used to examine relationships in data and test hypotheses about the significance of those relationships. Regression can be applied to observational data, in which case it identifies correlational relationships that predict when events occur together. In the program evaluation context we are specifically interested in causal analysis, which allows us to determine whether a management practice, a nonprofit or government program, or a specific public policy has a positive impact. When certain conditions are met, regression analysis produces an imperfect but reasonable estimate of the causal impact of a policy or program.
This course helps you expand your program evaluation toolkit by demonstrating how to estimate several common regression models that leverage unique data and counterfactual specifications. The previous course on research design (CPP 524) covered a collection of experimental and quasi-experimental approaches to estimating program impact. This course extends that material by translating each form of the counterfactual – pre-post with comparisons, reflexive design, and the post-test only design – into the specific regression models that leverage it. In short, this course teaches you how to estimate program effects using a given research design. We will cover the following models, organized by counterfactual:
Pre-post with comparison group:
(1) difference-in-difference regression
(2) panel models using fixed effects
Reflexive design:
(3) time series analysis
Post-test only design:
(4) propensity score matching
(5) regression discontinuity design
(6) instrumental variables
We also cover (7) logistic regression, a common technique to use when your outcome is binary (1/0).
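To give a concrete sense of what these models look like in practice, here is a minimal sketch of a difference-in-difference regression estimated in R. It uses simulated data, and the variable names and effect size are invented purely for illustration; it is not an assignment or a course template.

```r
# A minimal difference-in-difference sketch on simulated data (illustrative only).
set.seed(123)

n      <- 400
treat  <- rep(c(0, 1), each = n / 2)    # 1 = treatment group, 0 = comparison group
post   <- rep(c(0, 1), times = n / 2)   # 1 = post-intervention period, 0 = pre
effect <- 3                             # the "true" program impact we simulate

# Outcome combines a group difference, a common time trend, and the program effect
y <- 10 + 2 * treat + 1.5 * post + effect * treat * post + rnorm(n)

# The coefficient on the treat:post interaction estimates the program impact
m <- lm(y ~ treat * post)
summary(m)
```

The estimated coefficient on the `treat:post` term should land close to the simulated impact of 3; reading and interpreting coefficients like this one is the kind of task the weekly labs focus on.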
The main learning objectives for the course are:
- Gaining comfort with hypothesis-testing using regression models
- Developing the ability to select the best model to evaluate a specific program
- Identifying the implicit counterfactual used in each model
- Understanding the assumptions of each model
Course Prerequisites:
This course will build upon material presented in Foundations of Program Evaluation II.
To be successful in this program you need foundational knowledge about regression models covered in CPP 523 or a similar class. Basic R knowledge is assumed, including the ability to use R Markdown documents for labs.
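As a quick self-check on the assumed R background, you should be comfortable with tasks of roughly this kind: loading a data set, inspecting it, and fitting a simple multiple regression. The example below uses a built-in R data set and is purely illustrative, not course material.

```r
# Illustrative self-check of assumed R skills (uses a built-in data set, so it
# runs without any downloads or course files).
data(mtcars)

head(mtcars)                  # inspect the first few rows
summary(mtcars$mpg)           # summarize the outcome variable

m <- lm(mpg ~ wt + hp, data = mtcars)   # fit a multiple regression
summary(m)                               # read coefficients, p-values, and R-squared
```

If this feels unfamiliar, plan to review introductory R materials before the first lab.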
Math
This course utilizes algebra and some geometry, specifically the slope-intercept equation of a line:
y = mx + b
We will use basic probability, logarithms, and exponents, all at a high school level. We will NOT be using calculus, matrix algebra, or proofs for this course.
We will rely heavily on visual reasoning with the data, an intuitive understanding of regression mechanics, and a strong grasp of how to interpret results; for the advanced mathematical calculations we will rely on software.
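For example, the slope-intercept form maps directly onto the simple regression equation, and R does the estimation for us. The short sketch below uses simulated numbers chosen only for illustration.

```r
# y = mx + b has the same structure as the regression equation y = b0 + b1*x,
# where b0 is the intercept (b) and b1 is the slope (m). Simulated data, for
# illustration only.
set.seed(42)
x <- runif(100, min = 0, max = 10)
y <- 2 + 0.5 * x + rnorm(100)   # true intercept b = 2, true slope m = 0.5

coef(lm(y ~ x))                 # returns the estimated intercept and slope
```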
II. Assessment of Student Learning Performance & Proficiency: Keys to Student Success
Assessment of student performance in this course is based on indications that the course learning objectives stated above have been achieved. Several areas of measurement will be used to produce a final student performance rating. These areas of performance assessment include the following:
- Knowledge of key concepts associated with regression models, the interpretation of program impact in quantitative studies, the mechanics of control variables, and the differentiation between correlational and causal analysis.
- Ability to run and interpret program evaluation models by correctly specifying a multiple regression equation, diagnosing problems, and presenting findings to stakeholders.
- Completed assignments are measured and assessed based on a demonstrated understanding of core regression concepts and the ability to clearly and accurately interpret model results.
Students will demonstrate competency in understanding, producing and communicating results of their analyses through weekly labs.
The course grade is a direct reflection of performance on labs. Students should take stated expectations seriously regarding preparation, conduct, and academic honesty in order to receive a grade reflective of outstanding performance.
Students should be aware that merely completing assigned work in no way guarantees an outstanding grade in the course. To receive an outstanding course grade, all assigned work should be completed on time with careful attention to assignment details.
III. Course Structure and Operations; Performance Expectations
A. Format and Pedagogical Theory
Mastering advanced analytical techniques is like learning a language. You start by mastering basic vocabulary that is specific to statistics. Through your coursework you will become conversant in the domains of regression analysis, research design, and data analysis.
Progress might be slow at first as you work to master core concepts, integrate the building blocks into a coherent mental model of real-world problems, learn to translate technical results into clear narratives for non-technical audiences, and become comfortable with data programming skills. Over time you will find that your thought processes change as you approach problem-solving in a more structured and evidence-based manner, you apply counter-factual reasoning to performance problems, and you start reading the news and viewing scientific evidence differently. You begin to think and speak like a program evaluator.
By the end of this degree you will be conversant in statistics, research design, and data programming. Fluency takes time and will be developed through professional experience. It requires you to practice these skills to develop muscle memory. You can do this through participating in evaluations on the job and gaining experience building and cleaning data sets from scratch. Understand, though, that this degree focuses on building foundations for your career. Don't be nervous if it feels like it's impossible to master all of the material in this program – it is impossible to learn everything in this field in a year.
Similar to immersion in a language, the best way to learn the material is to be consistent in doing course work each day. The more frequently you revisit concepts and practice data programming the more you will absorb. The curriculum has been designed around this approach. Lectures are split into small units, and each unit includes questions to test your understanding of the material. Weekly labs allow you to spend some time applying the material to a specific problem. The final exam at the end of the semester is designed to help you make connections between concepts and consolidate knowledge. You will be much better off spending a small amount of time each day on the material instead of trying to cram everything into a couple of days a week.
Online discussion boards, when used, are designed to accomplish three things: (1) allow students to interact with their peers and share ideas and interpretations of the assigned material, (2) help students build professional relationships with potential future colleagues in the field, and (3) permit the instructor to assess student engagement with the assigned material.
The online discussions are explicitly intended to meet the objectives stated above. They are not intended as another form of "lecture" where the instructor provides commentary and students simply react to that. Rather, the discussions are a chance for peer-to-peer interaction and proactive engagement by each individual student.
The purpose of all exams and assigned written work is also threefold: (1) the assignments and written exam afford students the opportunity to demonstrate substantive understanding of materials covered in course readings, lectures and online discussion, (2) the assignments and exam permit students to develop and demonstrate research, analytic and written communication skills, and (3) the written work permits the instructor to assess student knowledge, skills and ability within this subject domain.
B. Assigned Reading Materials
There is one required text for this course, and it is available online for free:
- Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B., & Vermeersch, C. M. J. (2011). Impact Evaluation in Practice. The World Bank. Washington. Available free online.
Reference Texts
Each author approaches material in a slightly different way, so different textbooks work for different people. The following texts are recommended as good resources if you would like additional references:
- Field, A., Miles, J., & Field, Z. (2012). Discovering statistics using R. Sage publications.
- Bailey, M. A. (2016). Real Stats: Using Econometrics for Political Science and Public Policy. Oxford University Press.
- Bingham, R., & Felbinger, C. (2002). Evaluation in Practice: A Methodological Approach. CQ Press.
- Fox, J. (1991). Regression diagnostics: An introduction (Vol. 79). Sage.
- Berry, W. D., & Feldman, S. (1985). Multiple regression in practice (No. 50). Sage.
- Cohen, J., Cohen, P., West, S. G., & Aiken, L. S. (2013). Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences. Routledge.
- Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Wadsworth Cengage Learning.
- Cumming, G. (2013). Understanding the New Statistics: Effect Sizes, Confidence Intervals, and Meta-Analysis. Routledge.
- Stock, J. H., & Watson, M. W. (2007). Introduction to Econometrics.
- Wooldridge, J. M. (2015). Introductory Econometrics: A Modern Approach. Nelson Education.
In addition to the required textbooks, the instructor will supplement the assigned unit readings with various journal articles, policy reports, or other related material. These will be made available in the course shell.
C. Course Grading System for Assigned Work, including Final Exam:
Your grade will be based on your performance in the following areas:
- Weekly labs
Letter grades comport with a traditional set of intervals:
- 100 – 99% = A+
- 98 – 94% = A
- 93 – 90% = A-
- 89 – 87% = B+
- 86 – 84% = B
- 83 – 80% = B-
The assigned work for the term is described below:
- Weekly Labs (100%): Each week you will be given lecture notes and a lab covering a new regression model. The labs require a substantial amount of work preparing the raw data for analysis, running models, and interpreting results. Your grade is derived from performance on these six labs. You are allowed one round of corrections for each lab, due at the same time as the lab for the next week.
D. General Grading Rubric for Written Work
In general, any submitted written work (assignments and/or exams) is assessed on these evaluative criteria:
- Assignment completeness – all elements of the assignment are addressed
- Quality of analysis – substantively rigorous in addressing the assignment
- Demonstrated synthesis of core concepts from lecture notes and ability to apply to new problems
Most assignments in this course are labs that are graded pass-fail based upon completeness and correctness of responses (every attempt must be made to complete labs, and they must be more than 50% correct to receive credit). Discussion boards accumulate points through each activity on the board.
The final project will be accompanied by a rubric describing the allocation of points and criteria for evaluation.
E. Late and Missing Assignments
Grades for the course are largely based on weekly labs. Assigned work is accompanied by detailed instructions, adequate time for completion, and opportunities to consult the instructor with questions. As a result, each assignment in the course is expected to be completed in a timely fashion by the due date. Labs must be completed by the due date in order to receive credit.
F. Course Communications and Instructor Feedback:
Course content is hosted on this website. Lecture files, assignments and other course communications will be transmitted via this site and/or through the class email list. All assignment submissions will be made through the Canvas shell.
Please post lab questions on the Get Help page on this site, schedule individual office hours using the Calendly link provided above, and email the instructor directly instead of using the Canvas system.
The course instructor will attempt to respond to any course-related email as quickly as possible; students are asked to allow 24 to 48 hours as a reasonable time for replies to questions or other issues posed in a direct email. Additionally, the general timeline for instructor grading or other feedback on assignments, either written work or online discussion work, is 5 to 10 working days.
I. Student Learning Environment: Accommodations
Disability Accommodations: Students should be fully aware that Arizona State University, the Program Evaluation and Data Analytics program, and all program course instructors are committed to providing reasonable accommodation and access to programs and services to persons with disabilities. Students with disabilities who wish to seek academic accommodations must contact the ASU Disability Resources Center directly. Information on the Center's procedures, resources and how to contact its staff can be found here: https://eoss.asu.edu/drc/. The Disability Resources Center is responsible for reviewing any student's requests; once that review has taken place, the Center will provide the student with appropriate information on academic accommodations, which in turn will be provided to the course instructor.
Religious accommodations: Students will not be penalized for missing an assignment due solely to a religious holiday/observance, but as this class operates with a fairly flexible schedule, all efforts should be made to complete work within the required timeframe. If this is not possible, students must notify the instructor as far in advance as possible in order to make an alternative arrangement.
Military Accommodations: A student who is a member of the National Guard, Reserve, or other branch of the armed forces and is unable to complete classes because of military activation may request complete or partial unrestricted administrative withdrawals or incompletes depending on the timing of the activation. For more information see ASU policy USI 201-18.
IV. Course Schedule and Unit-specific Learning Objectives
A. Schedule: Overview of Readings and Assignments
ASU Online courses are typically offered on a seven-and-a-half-week schedule. A schedule for each week of the term is outlined here; the course is divided into seven units with specific learning objectives for each unit.
Please note: the course instructor may from time to time adjust assigned readings or the due dates for assignments. The basic course content approach and learning objectives will not change, but slight modifications are possible if circumstances warrant an adjustment.
Course Schedule
Unit 1 - Interrupted Time Series
Unit 2 - Difference-in-Difference Models
Unit 3 - Panel Data with Fixed Effects
Unit 4 - Instrumental Variables
Unit 5 - Regression Discontinuity Design
Unit 6 - Logistic Regression
Unit 7 - Propensity Score Matching