
Session 4

February 1, 2024

PMAP 8521: Program evaluation
Andrew Young School of Policy Studies


Plan for today

Regression FAQs

Regression with R

Measuring outcomes

(Maybe) DAGs

Regression FAQs


How was the 0.05 significance
threshold determined?

Could we say something is significant
if p > 0.05, but just note that it is at
a higher p-value?
Or does it have to fall under 0.05?


Why all this convoluted
logic of null worlds?

Oatmeal ratings

Why does this matter for evaluation?

Statistical power!
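
Power is about whether a study is large enough to detect a real program effect at all. As a rough illustration (all numbers hypothetical, not from the slides), base R's power.t.test() shows the trade-off between sample size, effect size, and power:

# How many people per group to detect a 0.3 SD effect
# with 80% power at the 0.05 threshold?
power.t.test(delta = 0.3, sd = 1, sig.level = 0.05, power = 0.80)

# With only 50 people per group, how much power is left
# to detect that same effect?
power.t.test(n = 50, delta = 0.3, sd = 1, sig.level = 0.05)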


Do we care about the actual coefficients
or just whether or not they're significant?

How does significance relate to causation?

If we can't use statistics to assert causation, how are we going to use this information in program evaluation?


What counts as a "good" R²?

Euler diagram

R² prediction

R² estimation

R² estimation vs. prediction

Simpson's Paradox


Regression with R
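
A minimal sketch of fitting and inspecting a regression in R (the data and variable names here are hypothetical, not the example used in class):

# Hypothetical data: hours in a job-training program and hourly wages
training <- data.frame(
  hours = c(10, 25, 5, 40, 30, 15, 50, 20),
  wage  = c(11, 14, 10, 18, 15, 12, 21, 13)
)

# Fit wage = beta0 + beta1 * hours + error
model <- lm(wage ~ hours, data = training)

# Coefficients, standard errors, t-statistics, and p-values
coef(summary(model))

# R-squared for the model
summary(model)$r.squared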


Measuring outcomes

The paradox of evaluation

Evaluation is good, but expensive

"Evaluation thinking"

Too much evaluation is bad

Taming programs

Outcomes and programs

Outcome variable

Thing you're measuring

Outcome change

∆ in thing you're measuring over time

Program effect

∆ in thing you're measuring over time because of the program

Outcomes and program effect
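
A toy calculation (all numbers hypothetical) of why the outcome change can overstate the program effect when the outcome would have improved anyway:

# Outcome levels before and after the program
outcome_before <- 50
outcome_after  <- 80

# Counterfactual: what the outcome would have been without the program
outcome_without_program <- 70

outcome_change <- outcome_after - outcome_before           # 30
program_effect <- outcome_after - outcome_without_program  # 10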

Abstraction


DAGs


Causal thinking is necessary—
even for descriptive work!

Supercentenarians

Necessity of causal thinking: Mention the McElreath tweet on birth certificate introduction and death ages: https://twitter.com/rlmcelreath/status/1427564280744976384

https://www.biorxiv.org/content/10.1101/704080v2

"Every time I get a haircut, I become more mature!"

Benjamin haircut

"Every time I get a haircut, I become more mature!"

E[Maturity | do(Get haircut)]


Getting older opens a backdoor path


But what does that mean,
"opening a backdoor path"?

How does statistical association
get passed through paths?
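
A minimal simulation of the haircut example (all numbers hypothetical): age is a common cause of haircuts and maturity, so association flows through that open backdoor until we adjust for age:

set.seed(1234)
n <- 1000

# Age drives both how many haircuts you've had and how mature you are
age <- runif(n, 10, 60)
haircuts <- 2 * age + rnorm(n, sd = 10)
maturity <- 0.5 * age + rnorm(n, sd = 5)   # haircuts have no real effect

# Naive model: a spurious haircut "effect" shows up via the open backdoor
coef(lm(maturity ~ haircuts))

# Adjusting for age closes the backdoor; the haircut coefficient collapses toward 0
coef(lm(maturity ~ haircuts + age))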


How do I know which of these is which?

DAG associations
Switch and slider

d-separation

Except for the one arrow between X and Y,
no statistical association can flow between X and Y

This is identification: all alternative stories are ruled out and the relationship is isolated
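
As a sketch, the dagitty package can list every path in a toy DAG (a direct X → Y arrow plus a backdoor through a hypothetical confounder Z) and confirm which adjustment set leaves only that direct arrow open:

library(dagitty)

# Toy DAG: X -> Y, plus a backdoor path X <- Z -> Y
dag <- dagitty("dag { X -> Y; Z -> X; Z -> Y }")

# Which variables must we adjust for to close every non-causal path?
adjustmentSets(dag, exposure = "X", outcome = "Y")   # should report { Z }

# With Z adjusted for, only the direct X -> Y path stays open
paths(dag, from = "X", to = "Y", Z = "Z")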


How exactly do colliders
mess up your results?

It looks like you can
still get the effect of X on Y


Facebook collider

Does niceness improve appearance?


Collider distorts the true effect!
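
A minimal sketch (not the simulation behind the slides; everything here is hypothetical) of how conditioning on a collider manufactures a niceness-to-appearance relationship that doesn't exist in the population:

set.seed(1234)
n <- 5000

# Niceness and appearance are unrelated in the population
people <- data.frame(
  niceness   = rnorm(n),
  appearance = rnorm(n)
)

# But you only date people who are nice enough or good-looking enough,
# so "dated" is a collider of both
people$dated <- (people$niceness + people$appearance) > 1

# Full population: essentially no relationship, as built in
coef(lm(appearance ~ niceness, data = people))

# Among people you dated (conditioning on the collider):
# a spurious negative relationship appears
coef(lm(appearance ~ niceness, data = subset(people, dated)))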


Effect of race on police use of force
using administrative data

