Poor causal thinking about education research

Disclaimer: nothing about what I’m saying below is new. I have been influenced by the ideas of many excellent people that I’ve encountered along the way (thank you twitter). These people include Julia Rohrer whose work on causal inference in psychology is excellent, Richard McElreath whose online videos are seriously entertaining (if you like statistics), Shadish, Cook & Campbell (whose book is noted below), and several other articles I’ve read over the years mainly in the realm of behaviour genetics and psychology. Oh and of course the hundreds of hours of Quantitude I’ve listened to 🙂

I feel like the online algorithms have been sending me more articles recently about ways to improve educational outcomes. This is a pretty broad area, so let me try to narrow it down with some examples. What I’m talking about are articles which claim, for example, that a given intervention in a school leads to better academic results for students. Or perhaps it’s the other direction: that circumstances within schools cause unhappiness amongst young people. Or perhaps it’s something about students’ mindsets that explains improved outcomes for some students relative to others – thanks to @Steven_Kolber and the #edureading research group for this gem of an example:

“PISA findings support the idea that instilling a growth mindset in students could result in better academic performance.”

Did you notice it? That word? Cause.

We think we know what we’re talking about when we make claims about one thing causing another.

Here’s another one that arises annually in Australia: NAPLAN tests cause students to be anxious. People who think standardized tests are the devil’s work love this one. “Ha!” they yell (digitally), “Told you so! Tests are bad for children because they make children anxious.”

But how do we define what a cause is? And are these claims defensible?

I find thinking about causal inference very interesting, and it’s not something that was explicitly covered in my teacher training. Causes exist, and we somehow absorb them just by observation. But personal observations can be wildly inaccurate and biased. Our views are always coloured by the perspective from which we view a thing.

My first introduction to thinking systematically about causal inference was via a classic of social science research methods, Experimental and Quasi-Experimental Designs for Generalized Causal Inference by Shadish, Cook and Campbell (2002). The book was given to me by one of my PhD supervisors, Dr Callie Little.

In the initial chapters the authors identify three key conditions that must hold if a phenomenon is to be considered a cause:

1) The cause must be related to the effect

2) The cause must precede the effect

3) There must be no other plausible alternative explanations (for the effect).

I think what happens a lot of the time in the articles, blogs, tweets etc. that I’ve been reading, is that the causal claims are supported ONLY by the first requirement: that the cause is related to the effect. The fact is, in the world of education and student achievement, many things are related to many other things. Many aspects of student behaviour, psychology, academic achievement, motivation, context, home life and so on, are interrelated. To make a claim that one thing causes the other, however, we can’t simply rely on this fact. For a cause to be logically defensible the additional two conditions must also hold.

If the cause must precede the effect, then we have a big problem with claims made from cross-sectional data (looking at you, every single report on correlations between PISA test outcomes and questionnaire data). Even if your survey is enormous – thousands and thousands of students! – and you have rigorously measured the attributes of interest, if the data are cross-sectional there is simply no way of telling whether the claimed cause really did precede the effect of interest.
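To make this concrete, here’s a toy simulation (all the variable names and numbers are made up for illustration, not taken from any real study): in one simulated world a “mindset” score drives achievement, and in the other the causal arrow is reversed. A single cross-sectional snapshot produces essentially the same correlation either way.

```python
import random

random.seed(1)
n = 10_000

def correlation(xs, ys):
    """Pearson correlation, computed by hand to keep this self-contained."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / m
    sx = (sum((x - mx) ** 2 for x in xs) / m) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / m) ** 0.5
    return cov / (sx * sy)

# World A: mindset causes achievement
mindset_a = [random.gauss(0, 1) for _ in range(n)]
achieve_a = [0.5 * m + random.gauss(0, 1) for m in mindset_a]

# World B: achievement causes mindset (arrow reversed, same strength)
achieve_b = [random.gauss(0, 1) for _ in range(n)]
mindset_b = [0.5 * a + random.gauss(0, 1) for a in achieve_b]

r_a = correlation(mindset_a, achieve_a)
r_b = correlation(mindset_b, achieve_b)

# One cross-sectional survey cannot distinguish the two worlds:
print(f"World A (mindset -> achievement): r = {r_a:.2f}")
print(f"World B (achievement -> mindset): r = {r_b:.2f}")
```

The two correlations come out (statistically) identical, which is exactly the problem: the data alone carry no information about which variable came first.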

Things get so much more complicated when we look at the development of children over any time span: two things might be related, but they may develop together, in a reciprocal fashion, with improvements in one area pushing along improvements in another and vice versa in a complex system. It becomes very difficult to pin down what specific thing is the cause and what thing is the effect – and this is just an example with two variables, let alone a system with many variables. Many causes and many effects, perhaps.
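What that “developing together” might look like can also be sketched with a fabricated two-variable system (the skills and coefficients here are invented purely for illustration): each of two skills nudges the other a little at every time step, and by the end they are strongly correlated even though neither one is “the” cause.

```python
import random

random.seed(2)

def correlation(xs, ys):
    """Pearson correlation, computed by hand to keep this self-contained."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / m
    sx = (sum((x - mx) ** 2 for x in xs) / m) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / m) ** 0.5
    return cov / (sx * sy)

def grow(waves=8):
    """One student: 'reading' and 'motivation' each boost the other each wave."""
    reading, motivation = random.gauss(0, 1), random.gauss(0, 1)
    for _ in range(waves):
        # Both updates use last wave's values (tuple assignment), so the
        # influence genuinely runs in both directions at once.
        reading, motivation = (
            0.7 * reading + 0.3 * motivation + random.gauss(0, 0.5),
            0.7 * motivation + 0.3 * reading + random.gauss(0, 0.5),
        )
    return reading, motivation

students = [grow() for _ in range(5_000)]
r = correlation([s[0] for s in students], [s[1] for s in students])
print(f"Correlation after 8 waves of mutual influence: r = {r:.2f}")
```

If you then asked, at the final wave, “does reading cause motivation, or the reverse?”, the correlation has no answer: in this toy system the honest answer is “both”.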

Which leads to the third requirement: there must be no other plausible explanation. If students are unhappy in schools for example, how can we claim that it is something about schools that is causing this unhappiness? Unless, that is, we have collected information about every other plausible cause, and discounted them all. You see how this becomes monumentally difficult? As far as I can see, there are always plausible alternative explanations. Acknowledging this point is potentially the first step towards talking in a more moderate way about causes and effects in educational domains.

One way around this, of course, is to set up a randomized trial, whereby the random allocation of students to intervention and control groups means that external factors can’t explain differences between the groups (because, if your randomization has worked correctly, those factors apply equally to each group). But these kinds of trials aren’t always the panacea that some seem to think: we can’t answer every research question of interest with a randomized trial.
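A quick sketch of why the coin flip does so much work (again, a made-up example: the program below has zero true effect, and “resources” is a stand-in for any external factor you haven’t measured): letting students self-select into a program produces a large apparent “effect”, while random assignment correctly finds next to nothing.

```python
import random

random.seed(3)
n = 20_000

# A hypothetical tutoring program with ZERO true effect on scores.
# "resources" is a confounder: it drives both program uptake and scores.
resources = [random.gauss(0, 1) for _ in range(n)]
scores = [r + random.gauss(0, 1) for r in resources]

# Observational world: better-resourced students opt in far more often.
opted_in = [random.random() < (0.8 if r > 0 else 0.2) for r in resources]

# Randomized trial: a coin flip decides who gets the program.
randomized = [random.random() < 0.5 for _ in range(n)]

def mean_gap(treated_flags):
    """Difference in mean scores: 'treated' group minus 'control' group."""
    treated = [s for s, t in zip(scores, treated_flags) if t]
    control = [s for s, t in zip(scores, treated_flags) if not t]
    return sum(treated) / len(treated) - sum(control) / len(control)

obs_gap = mean_gap(opted_in)     # confounded comparison: clearly positive
rct_gap = mean_gap(randomized)   # randomized comparison: near zero
print(f"Observational 'effect': {obs_gap:.2f}")
print(f"Randomized 'effect':   {rct_gap:.2f}")
```

The observational comparison “finds” a sizeable benefit that is entirely driven by who chose to sign up, while randomization balances the confounder across groups and recovers the true (null) effect.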

Furthermore, the types of interventions that can be evaluated in randomized trials may be limited, because schools are real-world settings replete with all the contradictions and difficulties of the people who inhabit them. Plus, theories of developmental change aren’t always considered in these designs. A group mean difference on an outcome between intervention and control groups doesn’t tell us a lot about how students’ skills grow and develop over time: we need different research designs for those questions.

So you see why I get slightly annoyed when I come across yet another article making sweeping causal claims in the educational space. Most of the time each piece of research is just that: one piece in a much larger puzzle that we are trying to figure out one section at a time. Of course different research designs allow for different types of inferences, with some allowing stronger causal claims, or more rigorous generalizability, than others. Each piece of research might give us new insights, help us to think about a problem in a different way, or add to existing evidence about phenomena in the world. Writing about these studies as though they’ve found out ‘the truth’ about the concept of interest, though, is misleading and something that we should all push back against.

