It occurred to me recently that it's a bit odd that most of my "real world" exposure to research comes in the form of the variety of clinical trials that go on around me on a regular basis, and yet I rarely comment on clinical trials.
This is probably because most clinical trials are a little dense to get through, the results tend to be less interesting to people (it turns out the reuptake limitations actually weren't as dramatic as they made them out to be!), and there's rarely much media involvement to mix things up.
Anyway, I heard a tidbit recently about when to be suspicious of results of clinical trials that I thought I'd pass along.
In any trial assessing a new treatment/drug/etc. against a placebo, you would expect to see more dropouts in the "treated" arm of the study. This, of course, is because most drugs and treatments have very real side effects that bother people and cause them to drop out. Therefore, if you see a trial where the dropout rate is higher in the placebo arm, you should be suspicious. Placebo-controlled studies should almost always be blinded for the patients (and ideally for the providers), but if significantly more of those in the placebo arm drop out, you know the blinding has gone wrong. Patients don't keep showing up if they know they're not actually getting treated with anything...and once we've established that the patients know which arm of the study they're in, the results become much less reliable.
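To make the heuristic concrete, here is a minimal sketch of the comparison you'd run on reported dropout counts. The numbers are entirely hypothetical, and a simple two-proportion z-test stands in for whatever analysis a real trial would use; it just asks whether the placebo arm lost noticeably more participants than the treated arm.

```python
from math import sqrt, erf

def dropout_check(treated_dropouts, treated_n, placebo_dropouts, placebo_n):
    """Two-proportion z-test on dropout rates (hypothetical example).

    A markedly higher placebo-arm dropout rate is the red flag discussed
    above: it suggests patients could tell which arm they were in.
    """
    p1 = treated_dropouts / treated_n   # dropout rate, treated arm
    p2 = placebo_dropouts / placebo_n   # dropout rate, placebo arm
    pooled = (treated_dropouts + placebo_dropouts) / (treated_n + placebo_n)
    se = sqrt(pooled * (1 - pooled) * (1 / treated_n + 1 / placebo_n))
    z = (p2 - p1) / se  # positive z: more dropouts in the placebo arm
    # One-sided p-value for placebo dropout exceeding treated dropout,
    # using the standard normal CDF written in terms of erf.
    p_value = 0.5 * (1 - erf(z / sqrt(2)))
    return p1, p2, p_value

# Hypothetical trial: 12/100 treated vs. 25/100 placebo participants drop out
p_treat, p_plac, p = dropout_check(12, 100, 25, 100)
print(f"treated dropout {p_treat:.0%}, placebo dropout {p_plac:.0%}, p = {p:.3f}")
```

With these made-up counts, the placebo arm's excess dropout would be hard to explain by chance, which is exactly the pattern that should make you wonder whether the blinding held.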
I thought that was an interesting tidbit to keep in mind.
I don't know how it is with other medications, but with psych meds, the presence of side effects tips everyone off that Patient E isn't getting the placebo. This may not be a terrible cost when the results are entirely objective, such as a lab result. But our stuff tends to be more subjective (even though we are pretty good at phrasing questions to get around that somewhat), and awareness can change the results.
I don't have a solution to that.
That's an interesting thought, thank you for sharing. I've had exposure to a lot of clinical trials in Salt Lake City, UT, and it is interesting how dependent we are as researchers on the data.