Over the last few weeks, thanks to the power of Twitter (and particularly tweeter @LMarryat), I became aware of some journal articles and blog posts about the Triple P parenting support programme. I must confess to a possibly not entirely unbiased interest in this: before my current academic position I worked as a health visitor in an authority which invested heavily in the Triple P programme, and over a period of a few years we were all trained up and expected to deliver the programme at every possible opportunity.
Triple P is marketed as an evidence-based programme which provides support for parents of 0-18s who have issues with various aspects of behaviour. It is provided at several levels, and the support can be delivered to individual families or as group work. As a practitioner, I was trained at level 3, the basic parenting support package for individual families. This meant I was trained to give advice and support on all sorts of parenting issues – food refusal, tantrums, toilet training, home safety, whining, hitting, etc – through the structured programme, supported by tipsheets and a DVD (although I must confess to never using the DVD myself). For some families it seemed to be really helpful, for others less so – to be honest the materials seemed as good as anything else I’d seen, but not especially better, and there were some aspects of it that really jarred.

Triple P is based in Australia, so all the materials are developed there, and part of the deal is that the purchasing authority does not deviate from them, photocopy them or adapt them in any way. In particular I got a right bee in my bonnet about the Home Safety tipsheet – working in an inner-city area of multiple deprivation and high-rise flats, it really got on my wick that I had to hand families a sheet telling them to make sure their children didn’t go near the swimming pool in the garden unattended. That’s the most extreme example, but I think it typifies the issues that many of us had with the actual materials. The other thing that made me cross was that we weren’t allowed to get the sheets translated for our families who spoke minimal or no English. There were some sheets in a few languages, but not enough for everyone who needed them, and not in all the languages we came across in our practice – and this in one of the major reception cities for dispersed asylum seekers.
This meant that it was really hard to offer the same level of support to families who were just as much in need of help as the English-speaking population. We also had other issues which in a way were not Triple P’s fault, and were more to do with the way management basically forced us to deliver the programme whether we wanted to or not – it became an exercise in numbers, in filling in forms and ticking boxes, and is one of the main reasons why I finally decided it was time to leave. It got to the point where I felt I had no choice but to offer families a Triple P intervention, regardless of whether I thought it was what they needed, because of the pressure (and there really was pressure) to meet targets and figures. What also irritated me was that Triple P was rolled out across the city at the same time as all the staff had to endure a two-year pay freeze, and when we complained about the pressure to deliver Triple P we were basically told that as so much money had been spent bringing it in, tough – we had to do it anyway. So when I say that I’m not entirely unbiased, perhaps it would be more accurate to say that I’m really quite bitter about it! (grrrr!)
One of the things that we were told, again and again, was that Triple P was so great because it was “evidence-based”. I didn’t have access to university libraries at the time, so due to paywalls it wasn’t easy to read full articles, only abstracts, but I did a Google Scholar search and noticed that pretty much every article about Triple P featured the name of Dr Matt Sanders, the person at the University of Queensland who developed Triple P and who is very involved in the materials, promotion and training development for the programme. This discovery left me with some questions about the evidence and the risk of conflicts of interest.
More recently, studies have started to emerge which call the evidence base for Triple P into question. Mostly they highlight that the claims made to date can be questioned because of the small sample sizes of the underlying research. There is also a recent, larger trial that did not involve the Triple P team, which compared Triple P level 4 with two other parenting programmes in Birmingham (Little et al 2012) and found no discernible effects from Triple P. Following this, a systematic review and meta-analysis by Scottish researchers in BMC Medicine (Wilson et al 2012) raised questions about the current evidence base (as outlined above – small sample sizes, potential conflicts of interest), which was also blogged about at PLOSOne by Dr James Coyne here. There then followed a response from Dr Matt Sanders et al here, and Dr James Coyne and Dr Linda Kwakkenbos responded to both the article and the response here. [And as an aside, hooray for open access!] There is also a blog post by Dr Pedro De Bruyckere here which summarises the discussion and issues, and provides a link (in Dutch, not one of my languages sadly) to a recently defended PhD thesis which found no significant effect of Triple P interventions. From my experience as a practitioner I absolutely agree with his concern about the amount of money involved, which I do think affected how we as professionals were expected to push the programme.
What is worrying about that final blog post is the discussion of the difficulty of publishing null or negative research results. This seems like the ideal point to publicise the AllTrials campaign, which is highlighting how much publicly funded clinical research is not reported at all, meaning that trial results are lost to future researchers. Please do sign the petition if you have not already done so! There are all sorts of reasons why research might be buried, but regardless, I agree that it is vital that all clinical trial results are published, so that their findings are available not only to inform future research but also to inform current clinical practice. There can be no excuse, particularly in these straitened times, for publicly funded research to be buried. What has shocked me, though – I guess I must be naive; it hadn’t occurred to me before that this could happen – is the issue of publishers declining null or negative results. That is a really worrying development, and one that must be resisted.
[Edited to add]
One thing which saddened me as a practitioner was that, in discussions with my health visiting colleagues, it emerged that there was another parenting programme (whose name escapes me at the moment) that they had been trained in and had been using before the drive to adopt Triple P. All of them were unanimous about how much they liked it and how much difference they thought it was making, but when the decision was made to plump for a programme to roll out across the city, it lost out as it did not have the requisite “evidence base”. I just think that is such a lost opportunity. I appreciate that in a city where parenting and anti-social behaviour are often issues on a large scale, a community-wide approach needs to be taken – but why could they not have done some good-quality research, perhaps even comparing this programme with Triple P, to start to build an evidence base? I wish they had been brave enough to do that – they might well have kept the goodwill of their staff (which was sorely lacking in the way Triple P was introduced) and seen some great results. I can think of one researcher (not a million miles away) who would have loved to have been involved with the qualitative aspect of that sort of study.