Sunday, 30 August 2015

Opportunity cost: A new red flag for evaluating interventions for neurodevelopmental disorders

Back in 2012, I wrote a blogpost offering advice to parents who were trying to navigate their way through the jungle of alternative interventions for children with dyslexia. I suggested a set of questions that should be asked of any new intervention, and identified a set of 'red flags', i.e., things that should make people think twice before embracing a new treatment.

The need for an update came to mind as I reflected on the Arrowsmith program, an educational approach that has been around in Canada since the 1980s, but has recently taken Australia and New Zealand by storm. Despite credulous press coverage in the UK, Arrowsmith has not, as far as I know, taken off here. Australia, however, is a different story, with Arrowsmith being taken up by the Catholic Education Office in Sydney after they found 'dramatic results' in a pilot evaluation.

For those who remember the Dore programme, this seems like an action replay. Dore was big in both the UK and Australia around 2007-2008. Like Arrowsmith, it used the language of neuroscience, claiming that its approach treated the underlying brain problem rather than the symptoms of conditions such as dyslexia and ADHD. Parents clamoured for it, it was widely promoted in the media, and many people signed up for long-term payment plans to cover a course of treatment. People like me, who worked in the area of neurodevelopmental disorders, were unimpressed by the small amount of published data on the program, and found the theoretical account of brain changes unconvincing (see this critique). However, we were largely ignored until the Australian ABC made a Four Corners documentary featuring critics as well as advocates of Dore. Soon after, the company collapsed, leaving both Dore's employees and many families who had signed up to long-term financial deals high and dry. It was a thoroughly dismal episode in the history of intervention for children with neurodevelopmental problems.

With Arrowsmith, we seem to be at the start of a similar cycle in Australia. Parents, hearing about the wondrous results of the program, are lobbying for it to be made more widely available. There are even stories of parents moving to Canada so that their child can reap the benefits of Arrowsmith. Yet Arrowsmith raises many of the 'red flags' that I blogged about, lacks any scientific evidence of efficacy, and has attracted criticism from mainstream experts in children's learning difficulties. As with Dore, the Arrowsmith people seem to have learned that if you add some sciency-sounding neuroscience terms to justify what you do, people will be impressed. It is easy to give the impression that you are doing something much more remarkable than just training skills through repetition.

They also miss the point that, as Rabbitt (2015, p 235) noted regarding brain-training in general: "Many researchers have been frustrated to find that ability on any particular skill is surprisingly specific and often does not generalise even to other quite similar situations." There's little point in training children to type numbers into a computer rapidly if all that happens is that they get better at typing numbers into a computer. For this to be a viable educational strategy, you'd need to show that this skill had knock-on effects on other learning. That hasn't been done, and all the evidence from mainstream psychology suggests it would be unusual to see such transfer of training effects.

Having failed to get a reply to a request for more information from the Catholic Education Office in Sydney, I decided to look at the evidence for the program that was cited by Arrowsmith's proponents. An ongoing study by Dr Lara Boyd of the University of British Columbia features prominently on their website, but, alas, Dr Boyd was unresponsive to an email request for more information. It would seem that in the thirty-five years Arrowsmith has been around, there have been no properly conducted trials of its effectiveness, but there are a few reports of uncontrolled studies looking at children's cognitive scores and attainments before and after the intervention. One of the most comprehensive reviews is in the PhD thesis of Debra Kemp-Koo from the University of Saskatchewan in 2013. In her introduction, Dr Kemp-Koo included an account of a study of children attending the private Arrowsmith school in Toronto:
All of the students in the study completed at least one year in the Arrowsmith program with most of them completing two years and some of them completing three years. At the end of the study many students had completed their Arrowsmith studies and left for other educational pursuits. The other students had not completed their Arrowsmith studies and continued at the Arrowsmith School. Most of the students who participated in the study were taking 6 forty minute modules of Arrowsmith programming a day with 1 forty minute period a day each of English and math at the Arrowsmith School. Some of the students took only Arrowsmith programming or took four modules of Arrowsmith programming with the other half of their day spent at the Arrowsmith school or another school in academic instruction (p. 34-35; my emphasis).
Two of my original red flags concerned financial costs, but I now realise it is important to consider opportunity costs as well: if you enlist your child in this intervention, what opportunities are they going to miss out on as a consequence? For many of the interventions I've looked at, the time investment is not negligible, but Arrowsmith seems in a league of its own. The cost of spending one to three years on unevidenced, repetitive exercises is to miss out on substantial parts of a regular academic curriculum. As Kemp-Koo (2013) remarked:
The Arrowsmith program itself does not focus on academic instruction, although some of these students did receive some academic instruction apart from their Arrowsmith programming. The length of time away from academic instruction could increase the amount of time needed to catch up with the academic instruction these students have missed. (p. 35; my emphasis).

Kemp-Koo, D. (2013). A case study of the Learning Disabilities Association of Saskatchewan (LDAS) Arrowsmith Program. PhD thesis, University of Saskatchewan, Saskatoon.

Rabbitt, P. M. A. (2015). The aging mind. London and New York: Routledge.


  1. I would have thought that those figures concerning time spent on this program would be of considerable interest to the local schools and/or curriculum inspection authorities. This sounds more like a cult than a treatment (for anything).

  2. Several research studies are described on the Arrowsmith website. All but one are uncontrolled, i.e., they assess change in a single group of children pre and post an Arrowsmith programme. As Dorothy has said repeatedly, these are useless because it's not clear that any improvement over time is an effect of the programme. Just one study (Lara Boyd's, mentioned by Dorothy) has a control group. It was scheduled to collect data in Jan and June 2014, and Jan 2015. The fact that Dr Boyd didn't respond to a recent request for information suggests that the results might not have been favourable.

  3. Re Boyd, it may depend on when Dorothy wrote the email. I live in Canada, and the last 3-4 weeks would be prime vacation time for a prof. On top of that, school is restarting in many places this week, most everything from kindergarten to university, so she might be getting kids of her own off to school and dealing with a new influx of students into her lab. But it does not look all that good.

    1. Just to say that I didn't actually request details of the results, as the video I saw suggested the study was still ongoing. I asked two questions that should be easy to answer: (a) was the study registered and (b) who was funding it. Registration of clinical trials is pretty standard these days, and helps give confidence in the results as it means that the measures and hypotheses are prespecified.

    2. (This is from an anonymous commenter who had their comment eaten by Blogger)
      Following on from jrkrideau's comment (6 Sep) and Dorothy's reply (11 Sep): from a Research Initiatives document on the Arrowsmith website (see below), I had learned that Lara Boyd's study was scheduled to complete initial data collection in Jan 2015. So I thought it worth emailing her myself, to ask whether any data were yet available.

      She replied by return as follows:
      '... Our study is still underway - we were fortunate enough to receive additional funding which has allowed us to continue to recruit and test a larger number of individuals. As a result we do not yet have data to release.... If you would like to be added to a mailing list, email We will update you once we have published our results.'

      I asked to join the mailing list, and was told that data collection from the increased sample is due to finish in autumn 2016.

      The 'Research Initiatives' document dated Mar 2014 is available, along with other information docs and videos, at:
      Info about Dr Boyd's study as originally designed is on pp.7-9. It gives the numbers of participants in each group (two different versions) and some details of the measures and research plan. It does not answer the specific questions raised by Dorothy on 11 Sep.

      Scanning this doc again, I found one other study (apart from Dr Boyd's) that included a control group, by W.J. Lancee (Toronto) in 2003. The results are reported as favouring the Arrowsmith group, but the groups are small, the data are complex, and the statistical analyses rest on varying selections of participants. Judging from Google Scholar, they have never been published in a peer-reviewed journal.

      So the scientific case for benefit from the Arrowsmith program remains unclear. Dr Boyd's study, when finally completed, may cast more light.