“So, how do you know that what you do works, Barrett?”
When I hear this I always pause a bit. I know that this is one of those questions that should generate a clear and concise answer from me, but I don’t really have one. And I feel after all these years that I may never.
I begin by saying that it is not my intention to convince anyone that my methods are more successful at producing a therapeutic result than anyone else’s. I’m not interested in competing with other therapists in this way, and I don’t see how a study could be designed to prove that in any case. I say, “The purpose of science is to make sense of things. The scientific method is not designed so much to assure us that something is true as it is to demonstrate that one set of circumstances will lead to another, or that a certain observation can be explained by considering what we currently know about physical reality. If I provoke the body in a certain way, what follows can at times be predicted, and at times not. This depends upon the nature of the provocation and the tissue provoked. The geometry of the nervous system, for instance, is of the sort that…”
Along about now there are two major reactions to my little speech: deep regret that the question was asked, and/or boredom. Some, I suspect, are certain that I’m just dodging the question. I’m not. Believe it or not, I don’t think that the issue of what “works” is all that compelling or that prolonged discussions about it ever produce much.
Let me quote the editor of the APTA journal, Jules Rothstein, in a recent discussion about what constitutes evidence in evidence-based practice: “…at best, we have an argument that a treatment makes sense, that is, a case for ‘biological plausibility.’ This is not evidence of effectiveness, and it proves nothing other than the treatment is derived from an idea.” (Physical Therapy, Vol. 80, No. 1, Jan. 2000) He goes on to say that others might not find the idea “reasonable.” Fair enough.
But I see here a massive distinction between evidence of effectiveness and evidence of reasonableness. I find discussions of the reasonable nature of our ideas infinitely more productive than this question of effectiveness. I can always point to literature to defend my thinking and how it is manifest as technique. How that eventually shows up in outcome studies is research that, as recently described in the same issue of the journal, is “not for the faint of heart.” (“Expert Practice in Physical Therapy,” Jensen et al.)
Consider this from Richard Di Fabio, PhD, PT: “Clinical research often produces contradictory findings regarding the effectiveness of care. (It is clinical expertise that) guides the application of clinical research.” (“Myth of Evidence-Based Practice,” JOSPT, Vol. 29, No. 11, Nov. 1999) Again, I see a distinction between reasoning about what a method of practice can plausibly do and measuring the results it obtains. When someone asks me why I do what I do, I can talk endlessly, and I can cite excellent literature. When they ask me if it “works,” I simply don’t know, and I probably never will. If they stop listening at that point, all the reasoning will be lost, and no real progress toward more effective care can be made.
Does it work? That’s the wrong question.