Monday, October 5, 2009

Mediterranean diet is good for your brain

In this month's Archives of General Psychiatry, Sanchez-Villegas et al. published a study showing a strong association between consuming a Mediterranean diet (lots of vegetables, fruits, nuts, whole grains, and fish; low intake of meat; moderate intake of alcohol and dairy; and a high ratio of monounsaturated to saturated fatty acids) and lower rates of depression. Data were gathered prospectively over a period averaging more than 4 years, by following about 10 000 initially healthy students in Spain who reported their food intake on questionnaires.

I'll have to look closely at the full text of the article. I'm interested in the question of whether the results strongly suggest causation, or whether they could be due to non-causal association. That is, perhaps people in Spain with a higher tendency to become depressed tend to choose non-Mediterranean diets. Another issue is cultural: the study was done in Spain, where a Mediterranean diet may be associated with certain--perhaps more traditional--cultural or subcultural features, and this cultural factor may then mediate the association with depressive risk.

In any case, in the meantime, given the preponderance of other data showing health benefits from a Mediterranean-style diet, I wholeheartedly (!) recommend consuming more nuts, vegetables, olive oil, fish, whole grains, and fruit; and less red meat.

The need for CME

Here's another article from "the last psychiatrist" on CME:
http://thelastpsychiatrist.com/2009/07/who_should_pay_for_continuing.html#more

Another insightful article, but pretty cynical!

But here are some of my opinions on this one:

1) I think that, without formalized CME documentation requirements, there would be some doctors who would fall farther and farther behind in understanding current trends of practice, current research evidence, etc.
2) In the education of intelligent individuals, I have long felt that process is much more important than content. A particular article with an accompanying quiz is bound to convey a certain biased perspective. It is my hope that most professionals are capable of recognizing and resisting such biases; in this modern age, I do think most of us have a greater awareness of bias, of being "sold" something. In any case, working through such an article provides a structure for contemplating a particular subject, and perhaps for raising certain questions or an internal debate about it, to reflect upon or research further later on. Yet I agree that there are many psychiatrists who might be swayed, in a non-critical manner, by a biased presentation of information. The subsequent quiz, and the individual's high marks on it, then become reinforcers for learning biased information.
3) After accurately critiquing a problem, we should then move on and try to work together to make more imaginative, creative educational programs which are stimulating, enjoyable, fair, and as free of bias as possible.

I think this concludes my little journey through this other blog. While interesting, I find it excessively cynical. It reminds me of someone in the back seat of my car continuously telling me--accurately, and perhaps even with some insightful humour--all the things I'm doing wrong. Maybe I need to hear this kind of feedback periodically--but small doses are preferable! Actually, I find my own writing at this moment becoming more cynical than I want it to be.

Opinions on mistakes psychiatrists make

Here's another interesting link from "the last psychiatrist" blog:

http://thelastpsychiatrist.com/2006/11/post_2.html#more


I agree with many of his points.

But here are a few counterpoints, in order:

1) I think some psychiatrists talk too little. There's a difference between nervous or inappropriate chatter that dilutes or interrupts a patient's opportunity to speak, and an engaged dialog focusing on the process or content of a problem. There is a tradition in psychiatric practice, rooted in or emphasized by psychoanalysis, that the therapist is to be nearly silent. Sometimes I think these silences are unhelpful, unnecessary, inefficient, even harmful. I can think of some patients for whom silence in a social context is extremely uncomfortable, and certainly not an opportunity to learn in therapy. Therapy in some settings can be an exercise in meaningful dialog, active social skills practice, or simply a chance to converse or laugh spontaneously.

I probably speak too much, myself--and I need to keep my mouth shut a little more often. I have to keep an eye on this one.

It is probably better for most psychiatrists to err on the side of speaking too little, I would agree. An inappropriately overtalkative therapist is probably worse than an inappropriately undertalkative one. But I think many of us have been taught to be so silent that we cannot be fully present, intuitively, personally, intellectually, to help someone optimally. In these cases, sometimes the tradition of therapeutic silence can suppress healthy spontaneity, positivity, and humour in a way which only delays or obstructs a patient's therapy experience.

2) I agree strongly with this one--especially when history details are ruminated about interminably during the first few sessions.
However, I do think that a framework for being comprehensive is important. And sometimes it is valuable, in my opinion, to review the entire history again after seeing a patient for a year, or for many years. There is so much focus on comprehensive history-taking during the first few sessions, or the first hour, that we forget to revisit or deepen this understanding later on, after knowing a patient much better. Whole elements of a patient's history can sometimes be forgotten, because they were only talked about once, during the first session.

There is a professional standard of doing a "comprehensive psychiatric history" in a single interview of no longer than 55 minutes. There may even be a certain bravado among residents, or an admiration for someone who can "get the most information" in that single hour. I object to this being a dogmatic standard. A psychiatric history, as a personal story, may take years to understand well, and even then the story is never complete. It can be quite arrogant to assume that a single brief interview (which, if optimal exchange of "facts" is to take place, can sound like an interrogation) can lead to a comprehensive understanding of a patient.

I do believe, though, that certain elements of comprehensiveness should be aimed for, and aimed for early. For example, it is very important to ask about someone's medical ailments, about substance use, about various symptoms the person may be too embarrassed to mention unless asked directly, etc. Otherwise an underlying problem could be entirely missed, and the ensuing therapy could be very ineffective or even deleterious.

Also, some individual patients may feel a benefit or relief to go through a very comprehensive historical review in the first few sessions, with the structure of the dialog supplied mainly from the therapist. Other individual patients may feel more comfortable, or find it more beneficial, to supply the structure of their story themselves. So maybe it's important not to make strong imperative statements on this question: as with so many other things in psychiatry, a lot depends on the individual situation.

3) I think it's important not to ignore ANY habitual behavior that could be harmful. Yet perhaps some times are better than others to address or push for things like smoking or soft-drink cessation: a person with a chronically unstable mood disorder may require improved mood stability (some of which may actually come from cigarette smoking, at least in the short term) before being able to embark on a quit-smoking plan.

4) Not much to add here.
5) Well, point taken. I've written a post about psychiatry and politics before, suggesting a kind of detached, "monastic role." The article here suggests, basically, that political and social policy are none of psychiatry's business. Maybe not. But any person or group may have a certain influence, and the fact is, psychiatry does have some influence to effect social change. In my opinion, it is obvious that social and political dynamics are driven by forces similar to those which operate in a single family, or in an individual's mind. So, if there is any wisdom in psychiatry, it could certainly be applicable to the political arena. Unfortunately, it appears to me that the psychiatrists I have seen getting involved in politics or other group dynamics are just as swept up in dysfunctional conflict as anyone else.
But if there's something that psychiatry can do to help with war or world hunger, etc. -- why not? In some historic situations an unlikely organized group has come to the great aid of a marginalized or persecuted group in need of relief or justice, even though the organized group didn't necessarily have any specialized knowledge of the matter they were dealing with.

6) I strongly agree. I prefer to offer therapy to most people I see, and I think most people do not have adequate opportunities to experience therapy. Yet I also observe that many individuals could be treated with a medication prescribed by a GP, and simply experience resolution of their symptoms. Subsequent "therapy" is done by the individual in their daily life, and does not require a "therapist." In these cases, the medication may no longer be needed after a year or so. Sometimes therapists may end up offering something that isn't really needed, or may aggrandize the role or importance of "therapy" (we studied all those years to learn to be therapists, after all--so a therapist's view on the matter may be quite biased), when occasionally the best therapy of all could simply be self-provided. Of course, many situations are not so simple at all, and that's where a therapy experience can be very, very important. I support the idea of respecting the patient's individual wishes on this matter, after providing the best possible presentation of the benefits and risks of different options. Of course, we're all biased in how we understand this benefit/risk profile.
7) Some interesting points here... but subject to debate. Addressing these complex subjects in an imperative manner makes me uncomfortable.
8) Polypharmacy should certainly not be a norm, though intelligent use of combination therapies, in conjunction with a clear understanding of side-effect risks, can sometimes be helpful. Some of the claims made in this section have actually not been studied well: for example, the claim that it makes no pharmacological sense to combine two different SSRI antidepressants at the same time. There is no body of research data PROVING that such a combination is in fact ineffectual. Therefore, before we scoff at the practitioner who prescribes two SSRIs at once, I think we should look at the empirical result--since there are no prospective randomized studies, the best we can do is see whether the individual patient is feeling better, or not.
9) I'm not a big fan of "diagnosis", but sometimes, and for some individuals, being able to give a set of problems a name can be part of a very helpful therapy experience. This name, this category, may lead the person to understand more about causes & solutions. Narrative therapy, I think, makes good use of "naming" (a variant of "diagnosing") as a therapeutic construct.

10) There isn't a number 10 here, but the comments at the end of this article were good.

Biased presentation of statistical data: LOCF vs. MMRM

This is a brief posting about biostatistics.

In clinical trials, some subjects drop out.

The quality of a study is best if there are few drop-outs, and if data continues to be collected on those who have dropped out.

LOCF ("last observation carried forward") and MMRM ("mixed-effects model for repeated measures") are two different statistical approaches to analyzing study populations in which some of the subjects have dropped out.

One technique or the other may generate different conclusions, different numbers to present.
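To make this concrete, here is a minimal sketch in Python (with invented scores, not data from any real trial) of what LOCF actually does to a dropout's data, and why a model-based approach such as MMRM can arrive at a different number from the same raw data:

```python
# Hypothetical weekly depression-scale scores; None marks visits after dropout.
# This subject was improving only slowly, then dropped out at week 2.
dropout_scores = [30, 28, 26, None, None]

def locf(scores):
    """Last Observation Carried Forward: freeze the last observed value."""
    filled, last = [], None
    for s in scores:
        if s is not None:
            last = s
        filled.append(last)
    return filled

print(locf(dropout_scores))  # -> [30, 28, 26, 26, 26]
# LOCF analyzes this subject as if stuck at a score of 26 until the end of
# the trial. An MMRM-style analysis instead fits a longitudinal model to the
# visits that were actually observed, letting each subject's fitted
# trajectory inform the endpoint estimate -- so the two methods can report
# different final results from exactly the same raw data.
```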

The following article illustrates how these techniques can skew the presentation of data, and therefore change our conclusions about an issue, despite nothing "dishonest" taking place:

http://thelastpsychiatrist.com/2009/06/its_not_a_lie_if_its_true.html#more

While I agree with the general point of the above article, I find that the specific example it refers to is not necessarily an instance of greater bias: as I research the subject myself, I find that LOCF is not necessarily superior to MMRM, although LOCF is the most commonly used method for dealing statistically with drop-outs. The following references make a case that MMRM is less biased than LOCF most of the time. (It should be noted, though, that whenever there are drop-outs who are lost to follow-up, the absence of data on these subjects weakens the study results--an issue worth considering closely when reading a paper.)
http://www.stat.tamu.edu/~carroll/talks/locfmmrm_jsm_2004_rjc.pdf
http://www3.interscience.wiley.com/journal/114177424/abstract?CRETRY=1&SRETRY=0

In conclusion, I can only encourage readers of studies to be more informed about statistics. And, if you are looking at a study which could change your treatment of an illness, then it is important to read the whole study, in detail, if possible (not just the abstract).

Which is better, a simple drug or a complex drug?

Here is another critique of medication marketing trends in psychiatry:

http://thelastpsychiatrist.com/2009/04/how_dangerous_is_academic_psyc_1.html#more

I agree quite strongly that there has been a collusion between:
- psychiatrists who eagerly yearn to meaningfully apply their knowledge of psychopharmacology, pharmacokinetics, neurotransmitter receptor binding profiles, etc. (to justify all those years of study)
- and pharmaceutical company sales reps

I can remember attending many academic rounds presentations in which a new drug would be discussed--for example, a newly released SSRI. During the talk, there would be boasting about how the new drug had the highest "receptor specificity", or the lowest activity at receptors other than those for serotonin (e.g. those for histamine or acetylcholine).

These facts that I was being shown, while enjoying my corporate-sponsored lunch, were true. But they were used as sales tactics, bypassing clear scientific thought. Just because something is more "receptor-specific" doesn't mean that it works better! It may in some cases be related to a difference in side effects. Yet sometimes those very side effects may be related to the efficacy of the drug.

By way of counter-example, I would cite the most effective of all antipsychotic medications, clozapine. This drug has very little "receptor specificity." It interacts with all sorts of different receptors. And it has loads of side effects too. Perhaps this is part of the reason it works so well. Unfortunately, this does not sit well with those of us who yearn to explain psychiatric medication effects using simple flow charts.

Similarly, the pharmacokinetic differences between medications are often used as instruments of persuasion--yet oftentimes they are clinically irrelevant, of unproven clinical relevance, or even clinically inferior. For example, newer SSRI antidepressants have short half-lives, which can be advantageous in some regards; but plain old Prozac, with its very long half-life, can be an excellent choice, because individuals taking it can safely skip a dose without a big change in serum level or ensuing side effects.

I should not be too cynical here -- it is important to know the scientific facts that can be known about something. Receptor binding profiles and half-lives, etc. are important. And it can be useful to find medications that have fewer side-effects, because of fewer extraneous receptor effects. The problem is when we use facts spuriously, or allow them to persuade us as part of someone's sales tactic.

So, coming back to the question in the title, I would say it is not necessarily relevant whether a drug works in a simple or complex way. It is relevant whether it works empirically, irrespective of the complexity of its pharmacologic effects.

Pregnancy & Depressive Relapse

I was looking at an article in JAMA from 2006, which was about pregnant women taking antidepressants. They were followed through pregnancy, and depressive relapses were related to changes in antidepressant dose. Here's a link to the abstract:

http://www.ncbi.nlm.nih.gov/pubmed/16449615

The study is too weakly designed to allow strong conclusions. Yet the abstract makes a statement about "pregnancy not being protective" which--while possibly true--is not directly supported by the findings of the study. This criticism was astutely made by the author of "The Last Psychiatrist" blog:
http://thelastpsychiatrist.com/2006/10/jama_deludes.html

Yet the JAMA study is not uninformative.

And the criticism mentioned above goes a bit too far, in my opinion. The critique itself makes overly strong statements in its own title & abstract.

It appears quite clear that pregnant women with a history of depressive illness, who are taking antidepressants, but decrease or discontinue their medication during the pregnancy, have a substantially higher risk of depressive relapse.

Because the study was not randomized, we cannot know for sure that this association is causal. But causation is reasonably suggested. It does not seem likely that this large effect was produced by women whose "unstable" depressive symptoms led them to discontinue their antidepressants (i.e. it does not seem likely to me that "reverse causation" is a prominent cause of this finding). I think this could happen in some cases, but not frequently. Nor does it seem likely to me that a woman already taking an antidepressant, who becomes more depressed during the pregnancy, would therefore stop taking her medication. This, too, could happen (I can think of clinical examples), but I don't think it would be common. It seems most likely to me that the causation is quite simple: stabilized depressive illness during pregnancy is likely to become less stable, and more prone to relapse, if antidepressant medication is discontinued.
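To make this reasoning concrete, here is a toy simulation (a sketch only: every number is invented for illustration, none is taken from the JAMA study). It shows that if only a modest fraction of discontinuations were symptom-driven, reverse causation alone would produce only a small gap in relapse rates between the groups--much smaller than the large difference the study reported:

```python
# Toy model: relapse risk is fixed; stopping has NO causal effect here.
# Any observed gap between groups is produced purely by reverse causation.
import random

random.seed(1)
N = 100_000
BASE_RELAPSE = 0.25        # underlying relapse risk, independent of stopping
P_STOP_OTHER = 0.30        # discontinuation for unrelated reasons
P_STOP_IF_SYMPTOMS = 0.10  # extra chance of stopping, driven by early symptoms

counts = {"maintained": [0, 0], "discontinued": [0, 0]}
for _ in range(N):
    relapse = random.random() < BASE_RELAPSE
    stopped = random.random() < P_STOP_OTHER
    if relapse and not stopped:
        stopped = random.random() < P_STOP_IF_SYMPTOMS
    group = "discontinued" if stopped else "maintained"
    counts[group][0] += relapse
    counts[group][1] += 1

for group, (k, n) in counts.items():
    print(f"{group}: relapse rate {k / n:.2f}")
# Prints roughly 0.23 for maintainers vs 0.29 for discontinuers: a small,
# purely artifactual gap, nothing like the large difference in the study.
```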

The critique of this article also discusses the fact that women in the study who increased their doses of medication had higher rates of depressive relapse as well, a fact not mentioned much in the abstract or conclusion. This finding is also not surprising--what reason would a pregnant woman have to increase the dose of a medication she was already taking, other than an escalation of symptoms? In this case, depressive relapse (which can happen despite medication treatment) is likely the cause of the increased dose; the increased dose is unlikely to have caused the relapse. This is a classic example of confounding by indication.

Yet, as I said above, the study only allows us to infer these conclusions, as it was not randomized. And I agree that the authors overstate their conclusions in the abstract. In order to more definitively answer these questions, a randomized prospective study would need to be done.

Tuesday, September 29, 2009

Astronomical Photographs

For something completely different--

Have a look at NASA's "astronomy picture of the day" site: http://apod.nasa.gov/apod/

It's interesting, awe-inspiring--and I hope therapeutic--to be reminded of things much larger than ourselves.

Here are some of my favourite pictures from the NASA site:

the sun:
http://antwrp.gsfc.nasa.gov/apod/ap030418.html
http://antwrp.gsfc.nasa.gov/apod/ap021114.html
http://antwrp.gsfc.nasa.gov/apod/ap061204.html
http://antwrp.gsfc.nasa.gov/apod/ap000928.html
http://antwrp.gsfc.nasa.gov/apod/ap080924.html

galaxies:
http://antwrp.gsfc.nasa.gov/apod/ap081012.html
http://antwrp.gsfc.nasa.gov/apod/ap080927.html
http://antwrp.gsfc.nasa.gov/apod/ap050112.html
http://antwrp.gsfc.nasa.gov/apod/ap090701.html

jupiter:
http://antwrp.gsfc.nasa.gov/apod/ap090106.html

N-Acetylcysteine for treatment of compulsive disorders

N-acetylcysteine is an antioxidant which modulates the glutamate system in the brain. Glutamate is actually the most prevalent neurotransmitter in the brain, and generally has strongly activating effects on nerve cells.

A recent study in Archives of General Psychiatry described groups of individuals with compulsive hair-pulling behavior (trichotillomania), randomized to receive either placebo, or N-acetylcysteine 1200 mg/day, then up to 2400 mg/day, over 12 weeks:
http://www.ncbi.nlm.nih.gov/pubmed/19581567

The N-acetylcysteine group had about a 50% reduction in hair-pulling behaviour, with no change in the placebo group. Interestingly, the only side effects reported were among those in the placebo group.

The same author published a study in 2008 showing a substantial improvement in compulsive gambling behavior in a group given NAC at an average dose of about 1500 mg/d:
http://www.ncbi.nlm.nih.gov/pubmed/17445781

A very preliminary study showed that NAC may have some promise in treating cocaine addiction:
http://www.ncbi.nlm.nih.gov/pubmed/17113207

NAC has also shown some promise as an adjunctive treatment for chronic schizophrenia; in this study the dose was 1000 mg twice daily, over 24 weeks. Once again, there were no side effects. As I look at the body of the paper, I see a definite favorable effect of NAC compared to placebo in several domains, but the size of the effect seemed clinically modest:
http://www.ncbi.nlm.nih.gov/pubmed/18436195

So NAC appears to be an appealing therapy for a variety of common, and often difficult-to-treat, psychiatric symptoms. There do not appear to be problems with side effects.

At this point, NAC can be obtained from health food stores in Canada, as a nutritional supplement.  It is also on the prescription formulary in an injectable form for treating acetaminophen toxicity. 

Friday, September 25, 2009

Randomized Controlled Trials in psychiatry

There is a good debate presented in the September 2009 issue of the Canadian Journal of Psychiatry (pp. 637-643), about the importance of randomized controlled trials in psychiatric research and clinical practice.

Steven Hollon presents a strong case supporting the philosophical foundations of RCT research, while Bruce Wampold presents many good points about the limitations and weaknesses prevalent in current psychiatric RCT studies. In particular, Wampold points out that much evidence exists regarding the relevance of the individual therapist (and, I might add, of the individual sense of patient-therapist alliance or connection) in determining therapeutic outcomes, and that this individual factor may have a stronger influence on outcome than the particular "treatment" being offered (whether it be CBT, psychoanalysis, a medication combination, etc.).

My own view of much of the evidence resonates with these ideas. I strongly support the importance of randomized controlled trials in medicine and psychiatry. Yet it often seems to me that many variables are not accounted for. The impact of the individual therapist is one specific factor. If the patient is more comfortable with one therapist than another, then this factor alone may greatly outweigh the effect of the particular style of therapy being offered. Interestingly, this factor may not necessarily depend on the therapist's length of experience--sometimes a trainee may have a more positive therapeutic impact than a therapist with decades of experience. This fact is not surprising to me: much of psychotherapy has to do with the capacity of the therapeutic relationship to grow and be healthy, which may depend substantially on very personal factors in the therapist. This may be humbling to those of us who revere the notion of psychotherapeutic theory being of paramount importance.

The whole of psychiatric theory may, at least in some cases, be less important than the goodness of a single interpersonal connection.

But I do also believe that certain therapeutic techniques are more effective than others. I think that strategies which promote daily, long-term psychological work just have to be more effective (along the lines of the language-learning metaphor again). Also, I think that strategies which encourage and help a person to face their fears, or to move away from destructive habits, are more likely to be helpful than strategies which do not look at these issues.

Many other factors are often not controlled (or examined at all) in present psychiatric RCTs, including nutrition, exercise, other self-care activities, supportive relationship involvement, community involvement, altruistic activity, etc.

Another factor that I have considered is the heterogeneity of many studied psychiatric populations. Different individuals with so-called "major depressive disorder" may in fact have different underlying causes for their symptoms; some of these individuals may respond well to one type of treatment, others may respond to something else. I suppose the RCT design remains appropriate in this situation, yet a powerful focus in research, in my opinion, needs to be to examine why some people respond to something, while others don't.

This erratic pattern of response doesn't just happen with individuals in a particular study. There are whole studies in which a well-proven psychiatric treatment (such as an antidepressant) doesn't end up differing from placebo. I don't think such studies show that antidepressants (or other treatments) are ineffective, but I do think they strongly suggest that the current criteria for psychiatric diagnoses are insufficient to predict treatment response as consistently as we need.
Oftentimes, these negative studies are dismissed automatically. In many cases, such studies have been poorly designed, and that is the main problem. But in other cases, I think we need to examine such negative studies very carefully, to understand why they were negative.

This is consistent with another type of scientific rigor (different from the RCT empirical approach): in mathematics, a single counterexample is sufficient to disprove a conjectured theorem. If such a counterexample is found, it can be extremely fruitful to examine why it occurred--in this way a new and more accurate theorem can be conceived. The process of generating the disproven conjecture was not a waste of time; it can be understood as part of the process of finding the accurate theorem. Such examples abound in other fields, such as computer programming: a program or algorithm may work quite well, yet generate errors or break down in certain situations. Careful examination of why the errors take place is the only way to improve the program, and perhaps also to understand more deeply the problem the program was supposed to solve.
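To illustrate the programming analogy with a toy example (hypothetical code, not from any real system), here is a small function that works for typical inputs, fails on a single counterexample, and is improved precisely by examining why it fails:

```python
def percent_improvement(baseline: float, followup: float) -> float:
    """Percent reduction in a symptom score (hypothetical helper)."""
    return 100.0 * (baseline - followup) / baseline

print(percent_improvement(20, 10))  # 50.0 -- works for typical inputs

# A single counterexample breaks it:
#   percent_improvement(0, 5) raises ZeroDivisionError.
# Examining *why* it fails (percent change is meaningless when the baseline
# score is zero) deepens our understanding of the problem and suggests a
# better formulation, e.g. reporting absolute change instead:
def improvement(baseline: float, followup: float) -> float:
    return baseline - followup
```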

Wednesday, September 16, 2009

Perils of Positive Thinking?

Joanne Wood et al. published an article in Psychological Science in June 2009: a study in which subjects with low self-esteem felt worse after doing various "positive thinking" exercises, while subjects with higher self-esteem felt better after making self-affirming statements.

Here is a link to the abstract: http://www.ncbi.nlm.nih.gov/pubmed/19493324


So the study seems to suggest that it could be detrimental to engage in "positive thinking" if you are already having depressive thoughts, or negative thoughts about yourself or your situation. The authors theorize that if you have a negative view of yourself, forcing yourself to make a positive statement about yourself may simply draw more attention in your mind to your own negative self-view. The positive statement may simply seem ridiculous, unrealistic, unattainable--perhaps a reminder of something you don't have, or feel that you cannot ever have.

However, the study is weak, and demonstrates something most of us could see to be obviously true. It is cross-sectional, and looks at the effect of a single episode of forced "positive thinking." This is like measuring the effect of marathon training after one single workout, and finding that those already in good shape really enjoyed their workout, while those who hadn't run before felt awful afterward.

Any exercise to change one's mind has to be practiced and repeated over a period of months or years. A single bout of exercise will usually accomplish very little. In fact, it will probably lead to soreness or injury, especially if the exercise is too far away from your current fitness level. I suppose if the initial "exercise" is a gentle and encouraging introduction, without overdoing it, then much more could be accomplished, as it could get one started into a new habit, and encourage hope.

"Positive thinking" exercises would, in my opinion, have to feel realistic and honest in order to be helpful. They may feel somewhat contrived, but I think this is also normal, just as phrases in a new language may initially feel contrived as you practice them.

And, following a sort of language-learning or athletic metaphor again, I think that "positive thinking" exercises cannot simply be repeating trite phrases such as "I am a good person!" Rather, they need to be dialogs in your mind, or with other people -- in which you challenge yourself to generate self-affirming statements, perhaps then listen to your mind rail against them, then generate a new affirming response. It becomes an active conversation in your mind rather than bland repetition of statements you don't find meaningful. This is just like how learning a language requires active conversation.

Self-affirmation may initially be yet another tool which at times helps you get through the hour or the day. But I believe that self-affirming language will gradually become incorporated deeply into your identity, as you practice daily, over a period of years. Actually, I think the "language" itself is not entirely the source of identity change; I think such language acts as a catalyst which resonates with a core of positive identity which already exists within you, and allows it to develop and grow with greater ease. This core of positivity may have been suppressed due to years of depression, environmental adversity, or other stresses.

Monday, September 14, 2009

A list of individuals who developed talents later in life

This is a follow-up to my language-learning metaphor entry.

One comment was about the unlikelihood of mastering a "new language" (literally or metaphorically) if you only start learning beyond childhood or adolescence.

This seems to be a common view.

I always like to look for counterexamples (it's my mathematical way coming out in me):

1) The first one that leapt to my mind is Joseph Conrad, one of the greatest authors in the history of the English language. Conrad did not speak a word of English until he was 21. He began writing in English at age 32, and his first published works came out when he was about 37. In order to learn English, he did not attend language classes or read grammar books, but chose to live and work in an English-speaking environment (immersion!).

2) I don't know much about rock musicians, but my research led me to a biography of Tom Scholz, from the group Boston. He started playing musical instruments at 21.

3) Here's a link to someone else's list:
http://creativejourneycafe.com/2008/04/09/10-creative-late-bloomers/

4) Here's another list, which is part of a review of a book called Defying Gravity: A Celebration of Late-Blooming Women:
http://www.bookpleasures.com/Lore2/idx/28/2190/Womens_Issues/article/Defying_Gravity.html

5) Another link with good examples:
http://en.wikipedia.org/wiki/Late_bloomer
(I'm the one who added Joseph Conrad to this list).

...I invite other suggestions to expand my list!

Friday, September 11, 2009

Making it through a difficult day or night

It can be hard to make it through the next hour, if you are feeling desperately unhappy, agitated, empty, worthless, or isolated, especially if you also feel disconnected from love, meaning, community, "belongingness," or relationships with others.

Such desperate places of mind can yet be familiar places, and a certain set of coping tactics may evolve. Sometimes social isolation or sleep can help the time pass; other times there can be addictive or compulsive behaviours of different sorts. These tactics may either be distractions from pain or distress, or may serve to anesthetize the symptoms in some way, to help the time pass.

Time can become an oppressive force to be battled continuously, one minute after the next.

I'd like to work on a set of ideas to help with situations like this. I realize a lot of these ideas may be things that are already very familiar, or that may seem trite or irrelevant. Maybe things that are much easier said than done. But I'd like to just sort of brainstorm here for a moment:

1) One of the most important things, I think, is to be able to hold onto something positive or good (large or small), in your mind, to focus on it, to rehearse it, to nurture its mental image, even if that good thing is not immediately present. The "good thing" could be anything -- a friend or loved one, a song, a place, a memory, a sensation, a dream, a goal, an idea. In the darkest of moments we are swept into the immediacy of suffering, and may lose touch with the internalized anchors which might help us to hold on, or to help us direct our behaviour safely through the next 24 hours.

In order to practice "holding on," I guess one would first have to get over the skepticism--which many would have--that such a tactic could actually help.

In order to address that, I would say that "covert imagery" is a well-established technique, with an evidence base in such areas as the treatment of phobias, learning new physical activities, practicing skills, even athletic training (imagining doing reps will actually strengthen muscles). The pianist Glenn Gould used covert imagery to practice the piano, and preferred to do much of his practice and rehearsal away from any keyboard; he preferred to learn new pieces entirely away from the piano. There is nothing mystical about the technique -- it is just a different way of exercising your brain, and therefore your body (which is an extension of your brain).

In order for covert imagery to work, it really does help to believe in it though (skepticism is highly demotivating).

Relationships can be "covertly imagined" as well -- and I think this is a great insight from the psychoanalysts. An internalized positive relationship can stay with us, consciously or unconsciously, even when we are physically alone. If you have not had many positive relationships, or your relationships have not been trustworthy, safe, or stable, then you may not have a positive internalized relationship to comfort you when you are in distress. You may feel comforted in the moment, if the situation is right, but when alone, you may be right back to a state of loneliness or torment.

The more trust and closeness that develops in your relationship life, the easier it will be to self-soothe, as you "internalize" these relationships.

Here are some ways to develop these ideas in practical ways:

-journaling, not just about distress, but about any healthy relationship or force in your life which helps soothe you and comfort you

-using healthy "transitional objects" which symbolize things which are soothing or comforting, without those things literally being present. These objects may serve to cue your memory, and help interrupt a cycle of depressive thinking or action.

-if there is a healthy, positive, or soothing relationship with someone in your life, imagine what that person might say to comfort or guide you in the present moment; and "save up" or "put aside" some of your immediate distress to discuss with that person when you next meet.

2) Healthy distraction.
e.g. music (listening or performing); reading (silently or aloud, or being read to); exercise (in healthy moderation); hobbies (e.g. crafts, knitting, art); baking
-consider starting a new hobby (e.g. photography)

3) Planning healthy structured activities
e.g. with community centres, organized hikes, volunteering, deliberately and consciously phoning friends

4) Creating healthy comforts
e.g. hot baths, aromatherapy, getting a massage, preparing or going out for a nice meal

5) Recognizing and blocking addictive behaviours
-there may be a lot of ambivalence about this, as the addictive behaviours may have a powerful or important role in your life; but freeing oneself from an addiction, or from recurrent harmful behaviour patterns, can be one of the most satisfying and liberating of therapeutic life changes.
An addictive process often "convinces" one that its presence is necessary and helpful, and that its absence would cause even worse distress.

6) Humour
-can anyone or anything make you laugh?
-can you make someone laugh?

7) Meditation
-takes a lot of practice, but can be a powerful tool for dealing safely with extreme pain
-could start with a few Kabat-Zinn books & tapes, or consider taking a class or seminar (might need to be patient to find a variety of meditation which suits you)

8) Being with animals (dogs, cats, horses, etc.). If you don't or can't have a pet, then volunteering with animals (e.g. at the SPCA) could be an option.

9) Caring for other living things (e.g. pets, plants, gardens)

10) Arranging for someone else to take care of you for a while (e.g. by friends, family, or in hospital if necessary)

11) Visiting psychiatry blogs
-(in moderation)


...I'm just writing this on the spur of the moment, I'll have to do some editing later, feel free to comment...

Tuesday, September 8, 2009

When your therapist makes a mistake

Sometimes your therapist will make a mistake:
- an insensitive or clumsy comment
- an intrusive line of questioning
- a failure to notice, attend to, or take seriously, something important in the session
- unwelcome or way-off-base advice.

If such problems are recurrent and severe, it may be a sign that you don't have a very good therapist, and that it is important to seek a referral to someone else.

Some problems could be forms of malpractice (e.g. being given dangerous medications inappropriately), and could be pursued through legal channels.

I think that a healthy therapy frame is one in which the therapist will be open to discussing any problems or mistakes.

The therapist should sincerely apologize for all mistakes, and be open to making a plan to prevent similar mistakes from happening again.

You deserve to feel safe, respected and cared for in therapy.

There are other types of conflicts that can arise in therapy, when one person or the other feels hurt, frustrated, or misunderstood. I can think of situations over the past ten years in which there have been tense conflicts, and in which my patient chose not to continue seeing me. In some of these cases, I have felt that there was a conflict--a problem in the relationship--which needed to be resolved. Sometimes these conflicts were made more likely by my own character style or behavioral quirks; other times I think these conflicts were at least partly "transferential," in that my actions triggered memories associated with conflicts from previous relationships (such as with parents growing up). In a few cases, I think the conflict was influenced by active mood symptoms (e.g. severe irritability). I think many conflicts have a mixture of different causes, and are not necessarily caused by just one thing.

In any case, I do strongly believe that resolving conflict in therapy is very important. And I believe a therapist must gently and empathically invite a dialog about conflicts, in a manner which is open, non-defensive, and "non-pushy." Such a moment of conflict-resolution, if it occurs, could be one of the most valuable parts of a therapy experience, a source of peace and freedom.

Monday, August 31, 2009

Language Learning Metaphor


I have often compared psychological change to language learning.

This could be appreciated on a metaphorical level, but I think that neurologically the processes are similar.

Many people approach psychological change as they would approach something like learning Spanish. Reasons for learning Spanish could be very practical (e.g. benefits at work, moving to a Spanish-speaking country, etc.), or could be more whimsical or esthetic (e.g. always enjoying Spanish music or movies). There is a curiosity and desire to learn and change, and steps are taken to begin changing. A Spanish language book would be acquired. An initial vigorous burst of energy would be spent learning some Spanish vocabulary.

This process often might last a few weeks or months. There might be a familiarity with certain phrases, an intellectual appreciation of the grammatical structure, and perhaps the ability to ask for something in a coffee shop.

Then the Spanish book would sit on the shelf, and never be opened again.

Another pathway could be like the French classes I remember during elementary school. We must have had some French lessons every week for eight years. I did well academically, and had high grades in French.

But I never learned to speak French.

And most people don't learn to speak Spanish either, despite their acquisition of instructional books.

So, there is a problem here: motivation exists to change or learn something new. There is a reasonable plan for change. Effort is invested into changing. But change doesn't really happen. Or the change only happens in a very superficial way.

Here is what I think is required to really learn a language:

1) Immersion is the optimal process. That is, you have to use only the new language, constantly, for weeks, months, or years at a time. This constrains one's mind to function in the new language. Without such a constraint, the mind shifts back automatically to the old language most of the time, and the process of change is much slower, or doesn't happen at all.
2) Even without immersion, there must be daily participation in the learning task, for long periods of time.
3) The process must include active participation. It is helpful to listen quietly, to read, to understand grammar intellectually -- but the most powerful acts of language learning require you to participate actively in conversation using the new language.
4) Perhaps 1000 hours of active practice are required for fluency; 100 hours of practice will help you get by on a very basic level. 6-10 hours of work per week is a reasonable minimum--at that pace, 1000 hours takes roughly two to three years.
5) Along the way, you have to be willing to function at what you believe is an infantile level of communication, and stumble through, making lots of mistakes, possibly being willing to embarrass yourself. It will feel awkward and slow at first.
6) It is probably necessary to have fellow speakers of the new language around you, to converse with during your "immersion" experience.
7) Part of the good news is that once you get started, even with a few hours' practice, there will be others around you to help you along enthusiastically.

I think that psychological change requires a similar approach. The brain is likely to change in a similar way. I am reminded of Taub's descriptions of constraint-induced rehabilitation from strokes: recovery of function, and neuroplastic brain change, can take place much more effectively if the person is in a state of physiologic "immersion."

Many people acquire books about psychological change (e.g. self-help books, CBT manuals, etc.) in the same way one might acquire a book about learning Spanish. People might read them through, learn a few things, then the books would sit unopened for the next five years.

Or many people might participate in psychotherapy the way they would a weekly language lesson: it might be familiar, educational--if there were an exam to write, they might get high grades--but often the "new language" fluency never really develops.

So I encourage the idea of finding ways to create an "immersion" experience, with respect to psychological change. This requires daily work, preferably in an environment where you can set the "old language" aside completely. This work may feel artificial, slow, contrived, or superficial. But this is just like practicing phrases in a new language for the first time. Eventually, the work will feel more natural, spontaneous, and easy.

I think the greatest strength of cognitive-behavioural therapy is its emphasis on "homework," which calls upon people to focus every day on constructive psychological change. And the different columns of a CBT-style homework page remind me of the "columns" one might use to translate phrases from one language into another. In both cases, in order for this homework to work, it has to be practiced, not just on paper, but spoken out loud, or spoken inside your mind, with sincerity and repetition, and preferably also with other people in dialogs.

There's some interesting academic work out there on language acquisition--but for a start, here's a reference from a language-learning website (particularly the summary on the bottom half of this webpage):
http://www.200words-a-day.com/language-learning-reviews.html

Monday, August 17, 2009

ADHD questions

Here are some great questions about ADHD, submitted by a reader:

1) You write here that long-term use of stimulants has NOT been shown to improve long-term academic outcomes. Why do you think this is, given that symptoms of ADHD improve on medication? (It actually really depresses me to think that individual symptoms can improve, yet no real change takes place... though I know that this might not apply to all patients.)

2) What are some effective non-drug treatments for ADHD? I am particularly interested in dietary measures, and also EEG biofeedback.

3) I have read about prescribing psychostimulants as a way of basically diagnosing ADHD... i.e., the diagnosis is based on your response to the medication. I am just wondering how precise this would be, given that stimulants would probably (?) improve most people's concentration, etc. Or is there any role for neuropsychological testing in trying to establish a diagnosis? Is there any way of definitively establishing this kind of diagnosis?

4) I have read that there are many differences between ADD and ADHD, i.e. not just in symptom presentation but in the underlying brain pathology. Is that true? I'm not sure how to phrase it, it seemed like the suggestion was that ADD was more "organic", although maybe that doesn't make sense. Does that have implications for prognosis or treatment strategies?

5) I have read that one red flag that suggests ADD in the context of MDD treatment is a good response to bupropion. If a patient did not have a really good response to bupropion-- or if the response was only partial-- does this usually mean that treatments with psychostimulants like Ritalin, Adderall, etc. will be ineffective (or only partially effective) also?

6) If ADD is not diagnosed/treated until adulthood, is it usually more difficult to treat than if it is diagnosed/ treated in early childhood? Is the response to stimulant treatment just as good? I guess I am wondering if there are certain structural changes that occur in the brain that result from untreated ADD-- kind of like long-term depression and hippocampal atrophy?

7) Is there a certain type of patient who usually does poorly on psychostimulants, or who experiences severe side effects on psychostimulants?



I don't know the answers to a lot of these, but I am interested to keep trying to learn more. Here are the best responses I can come up with for now:

1) First of all, the bottom line of whether something is helpful or not may not be some specific thing, like academic performance. Perhaps "well-being" in a broad, general sense is a more reasonable goal. Yet things like academic performance are important in life. Perhaps stimulants or other treatments for ADHD are "necessary but not sufficient" to help with ADHD-related academic problems over the longer term. It appears to me from the data that stimulants actually are helpful for academic problems; it's just that the size of the effect is much smaller than most people would hope for.

2) I wrote a post about zinc supplementation before. Adequate iron stores are probably important, as is a generally healthy diet. I've encountered some people with ADHD who have reduced tolerance for irritation or frustration, and may be particularly bothered or distracted by hunger; yet they may not be organized enough to have meals prepared regularly through the day. It can help them manage their ADHD to make sure they always have snacks with them, so that they are never in a hungry state. Other than that, I think there are a lot of nutritional claims out there with a poor evidence base. The link between sugar intake and hyperactivity is poorly substantiated--I've written a post about that.

Food additives and dyes could play a role in exacerbating ADHD symptoms. Based on the evidence below, it makes sense to me to limit food dyes and sodium benzoate in the diet, since such changes do not compromise quality of life in any way, and may lead to improved symptoms. Here are a few references:

http://www.ncbi.nlm.nih.gov/pubmed/17825405
(this is the best of the references: it is from Lancet in 2007)

http://www.ncbi.nlm.nih.gov/pubmed/15613992
http://www.ncbi.nlm.nih.gov/pubmed/15155391

I once attended a presentation on EEG biofeedback. I think it is a promising modality. Harmless to give it a try, but probably expensive. It will be interesting once the technology is available to use EEG biofeedback in front of your own home computer, at low cost.

A few of the self-help books about ADHD are worth reading. There are a lot of practical suggestions about managing symptoms. Some of the books may contain a strongly biased agenda for or against things like stimulants or dietary changes, so you need to be prepared for that possibility.

3) The ADHD label is an artificial, semantic creation, a representation of symptoms or traits which exist on a continuum. Even those who do not officially satisfy symptom-checklist criteria for ADHD could benefit substantially from ADHD treatments, if some component of these symptoms is at play neurologically. Many people with apparent disorders of mood, personality, learning, conduct, etc. may have some component of ADHD as well; in some cases ADHD treatments are remarkably helpful for the other problems. So I think careful trials of stimulants could be helpful diagnostically for some people, provided there are no significant contraindications.

4) I've always thought about the ADHD label as just a semantic updating of the previous ADD label. Subtypes of ADHD which are predominantly inattentive rather than hyperactive may differ in terms of comorbidities and prognosis.

5) Hard to say. Many people think of bupropion as a "dopaminergic" drug, whereas bupropion and its relevant metabolites probably act mainly on the norepinephrine system in humans (its dopaminergic activity is more significant in dogs). But perhaps bupropion response could correlate with stimulant response. I haven't seen a good study to show this, nor do I have a case series myself to comment one way or the other based on personal experience.

6) I don't know about that. Comorbidities (e.g. substance use, relationship, or conduct problems) may have accumulated in adults who have not had help during childhood. Yet I have often found it to be the case that the core symptoms of most anything can improve with treatment, at any age.

7) Patients with psychotic disorders (i.e. having a history of hallucinations, delusions, or severely disorganized thinking) often seem to do poorly on stimulants. Patients who are using stimulants primarily to increase energy or motivation often are disappointed with stimulants after a few months, since tolerance develops for effects on energy. Patients with eating disorders could do poorly, since stimulant use may become yet another dysfunctional eating behaviour used to control appetite. And individuals who are trying to use stimulants as part of thrill-seeking behaviour, who are using more than prescribed doses, or who are selling their medication, are worse off for receiving stimulant prescriptions.

Wednesday, July 29, 2009

Twin Studies & Behavioral Genetics

The field of behavioral genetics is of great interest to me.

A lot of very good research has been done in this area for over 50 years.

One of the strongest methods of research in behavioral genetics is the "twin study", in which pairs of identical twins are compared with pairs of non-identical twins, looking at symptoms, traits, behaviors, disease frequencies, etc.

I would like to explore this subject in much greater detail in the future, but my very brief summary of the data is this:
1) most human traits, behaviors, and disease frequencies are strongly affected by hereditary (genetic) factors. Typically, about 50% of the variability in these measures is caused by variability of inherited genes. That is, the "heritability" is typically 50%, sometimes much higher. (A sketch of how twin studies yield such estimates appears after this list.)
2) The remaining variability is mostly due to so-called "non-shared environmental factors". This fact is jarring to those of us who have believed that the character of one's family home (a "shared environmental variable") is a major determinant of future character traits, etc.
3) Hereditary factors tend to become more prominent, rather than less prominent, with advancing age. One might have thought that, as one grows older, environmental events would play an ever-increasing role in "sculpting" our personalities or other traits. This is not the case.
4) Some of the "environmental variation" may in fact be random. Basically, good or bad luck. Getting struck by lightning, or winning the lottery, or not. Such "luck-based" events are mostly (though not entirely) outside our control.
5) All of these facts may lead to a kind of fatalism, a resignation about our traits being determined by factors outside our control. (Mind you, being "lucky" or "unlucky" may be determined more by attitudinal factors such as openness than by random events alone: see the following article--http://www.scientificamerican.com/article.cfm?id=as-luck-would-have-it)
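For readers curious how such heritability numbers are estimated, here is a minimal sketch (in Python) of Falconer's classic twin-study decomposition. The correlations used are invented for illustration, not taken from any particular study:

```python
def falconer_ace(r_mz: float, r_dz: float) -> dict:
    """Estimate variance components from twin correlations.

    Identical (MZ) twins share ~100% of their genes; fraternal (DZ)
    twins share ~50% on average. Under the simple ACE model:
      heritability            a2 = 2 * (r_mz - r_dz)
      shared environment      c2 = 2 * r_dz - r_mz
      non-shared env + error  e2 = 1 - r_mz
    """
    return {
        "heritability": round(2 * (r_mz - r_dz), 2),
        "shared_env": round(2 * r_dz - r_mz, 2),
        "non_shared_env": round(1 - r_mz, 2),
    }

# e.g. a trait where MZ twins correlate 0.55 and DZ twins 0.30:
print(falconer_ace(r_mz=0.55, r_dz=0.30))
# -> {'heritability': 0.5, 'shared_env': 0.05, 'non_shared_env': 0.45}
# i.e. roughly 50% heritability, a small shared-environment effect, and a
# large non-shared-environment component -- the pattern described above.
```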


Here is some of my critical response to the above statements:

1) Statements about heritability are in fact dependent upon the average environmental conditions experienced by the population being studied. For example, if we were to measure the heritability of becoming the leader of a large country, we would find heritabilities of nearly 100% in times or places where there are hereditary monarchies, and much lower heritabilities for democracies (mind you, the case of the Bush family shows that the heritability has been non-zero in the U.S.).
2) Non-shared environmental factors are extremely important. This does not mean that the family environment is unimportant. Part of an individual's non-shared environmental experience is that person's unique experience of the family environment. The lesson in this is that families need to pay close attention to how each individual family member is adapting to the family situation, and to also pay close attention to a child's peer and school environment.
3) The influence of shared environmental factors is small, but rarely zero. Usually there is some small percentage of variability accounted for by shared factors. Often this percentage is larger in childhood, and declines towards zero during adult maturation. But it is not zero. Just because an influence is small does not mean that it is unimportant. We have limited control over our genetics, after all, but we do have more substantial control over shared and non-shared environmental variables.
4) Most studies look at the general effect of genetic & environmental factors in populations. Compelling examples are frequently cited of individual twins, separated at birth: perhaps one twin is adopted into a wealthy, privileged home with access to multiple educational resources, while the other grows up in a more impoverished setting. The story typically is that the twins both end up with similar funds of knowledge or intelligence: the first twin reads books available at home, while the other twin develops her inherited interest in knowledge by going out of her way to acquire a library card, and spending all day reading at the local library. Such case examples illustrate how inherited factors can prevail despite environmental differences.

But I'm interested to see counterexamples: examples in which differences in environment between twins did lead to substantial differences in traits later on. It is this type of example that has the most practical value, in my opinion.

5) I have considered the following idea:
For any trait or characteristic having any heritability, there may be environmental variables that can change the outcome of the trait for a given individual--even for highly, obviously heritable traits. Consider eye color, for example, which seems obviously purely genetic. But suppose there were a medication that could change eye color. This would be a purely environmental factor (though, of course, the tendency to use such a drug might itself be partially inherited). Most people would not use the drug, so measures of heritability for eye color would remain very high. Yet, despite this high heritability, there may well be simple, direct environmental changes which, for a given individual, could completely change the trait. Such environmental changes would have to be very different from average environmental conditions: the higher the heritability, the farther from average the environmental change would have to be in order to effect a change in the trait.

We could say that the tendency to kill and devour wildebeest is heritable among the different wild creatures of the African savanna. The genetic differences between lions and giraffes would completely determine the likelihood of such creatures devouring a wildebeest or not: lions inherit a tendency to eat wildebeest, while giraffes do not. Yet I suppose we could train and/or medicate lions (and also keep them well-fed with a vegetarian diet!) so that wildebeest are totally safe around them. In this way, we would be introducing a set of environmental changes causing a radical change in lion behavior. This does not change the fact that the heritability of lions' killing wildebeest is extremely high; it just means that the environmental change necessary to change the trait would have to be something radically different from the environmental experience of the average lion (most lions are not trained to be non-predatory!).


The clinical applications I have based on these observations are the following:

1) Many psychological phenomena are highly heritable. This does not mean that these phenomena are unchangeable, though. It does mean that, in order to change the trait or behavior, an environmental change needs to occur which is substantially different from the environmental experiences of most people, or of the "average person". This may help us to use our efforts most efficiently. So, for example, it would be inefficient to merely provide everybody with a typical, average, 2-parent family living in a bungalow. The evidence shows that such "average" environmental changes have minimal impact on psychological or behavioral traits. It would be important to make sure each individual is not deprived or harmed, and has access to those basic environmental elements required to realize their potential. If there are problems, then addressing those problems may require a substantial, unique, or even radical environmental change.
2) The most influential environmental variables are those which are unique to the individual, not the ones which are shared in a family. This does not mean that family experiences are unimportant, but that a child's unique experience of his or her own family environment is much more important than the overall atmosphere of the home. A chaotic household may be a pleasure, a source of boisterous social stimulation, for one child, but an injurious, disruptive, irritating source of stress for another. A calm household may allow one child to grow and develop, while it may cause another child to become bored or restless.
3) The higher the heritability of a trait, the more pronounced the environmental (or therapeutic) change must be, compared to the average environment in the population, in order to change that trait.
4) The motivation to have a certain style of home, or parenting, etc. should logically not primarily be to "sculpt" the personality of your child, but to allow for joyous long-term memories, to be shared and recounted as stories by parent and child, and to pay attention to the unique nature of each individual child, providing for any healthy needs along the way.


Some references:

Segal, Nancy L. (2000). Entwined Lives: Twins and What They Tell Us About Human Behavior. New York: Plume.

http://www.ncbi.nlm.nih.gov/pubmed/19378334
{a 2009 review including a look at "epigenetics", the notion that gene expression can change over time, so that identical twins are not truly "identical" in a functional genetic sense}

http://www.ncbi.nlm.nih.gov/pubmed/18412098
{genetics of PTSD}

http://www.ncbi.nlm.nih.gov/pubmed/17176502
{a look at how genetic factors influence environmental experience}

http://www.ncbi.nlm.nih.gov/pubmed/17679640
{a look at how choice of peers is influenced by heredity, more so as a child grows up}

http://www.ncbi.nlm.nih.gov/pubmed/18391130
{some of the research showing different genetic influences coming "on line" during different stages of childhood and young adult development}

http://www.ncbi.nlm.nih.gov/pubmed/19634053
{a recent article by TJ Bouchard, one of the world's leading experts in twin studies}

Low-dose atypical antipsychotics for treating non-psychotic anxiety or mood symptoms

Atypical antipsychotics are frequently prescribed to treat symptoms of anxiety and depression. They can be used in the treatment of generalized anxiety, panic disorder, OCD, major depressive disorder, PTSD, bipolar disorder, personality disorders, etc. At this point, such use could be considered "off-label", since the primary use of antipsychotics is treating schizophrenia or major mood disorders with psychotic features.

But there is an expanding evidence base showing that atypicals can be useful in "off-label" situations. Here is a brief review of some of the studies:

http://www.ncbi.nlm.nih.gov/pubmed/19470174
{this is a good recent study comparing low-dose risperidone--about 0.5 mg--with paroxetine, for treating panic disorder over 8 weeks. The risperidone group did well, with equal or better symptom relief, and possibly faster onset. But 8 weeks is very brief--it would be important to look at results over a year or more, and to assess the possibility of withdrawal or rebound symptoms if the medication is stopped. Also, it would be important to determine whether the medication is synergistic with psychological therapies, or whether it could undermine them (there is some evidence that benzodiazepines may undermine the effectiveness of psychological therapies)}

http://www.ncbi.nlm.nih.gov/pubmed/16649823
{an open study from 2006 showing significant improvements in anxiety when low doses of risperidone, of about 1 mg, were added to an antidepressant, over an 8 week trial}

http://www.ncbi.nlm.nih.gov/pubmed/18455360
{this 2008 study shows significant improvement in generalized anxiety with 12 weeks of adjunctive quetiapine. It was not "low-dose" though -- the average dose was almost 400 mg per day. There is potential bias in this study due to conflict-of-interest, also there was no adjunctive placebo group}

http://www.ncbi.nlm.nih.gov/pubmed/16889446
{in this 2006 study, patients with a borderline personality diagnosis were given quetiapine 200-400 mg daily, for a 12-week trial. As I look at the results in the article itself, I see that the most substantial improvement was in anxiety symptoms, without much change in other symptom areas. The authors state that patients with prominent impulsive or aggressive symptoms responded best}

http://www.ncbi.nlm.nih.gov/pubmed/17110817
{in this large 2006 study (the BOLDER II study), quetiapine alone was used to treat bipolar depression. Doses were 300 mg/d, 600 mg/d, or placebo. There was significant, clinically relevant improvement in the quetiapine groups, with the 300 mg group doing best. Improvements were in anxiety symptoms, depressive symptoms, suicidal ideation, sleep, and overall quality of life.}

Here's a reference to a lengthy and detailed report from the FDA about quetiapine safety when used to treat depression or anxiety:
http://www.fda.gov/ohrms/dockets/ac/09/briefing/2009-4424b2-01-FDA.pdf


In summary, I support the use of atypical antipsychotics as adjuncts for treating various symptoms including anxiety, irritability, etc. But as with any treatment (or non-treatment), there needs to be a close review of benefits vs. risks. The risks of using antipsychotics to treat anxiety are probably underestimated, because the existing studies are of such short duration; nor are the benefits of long-term use clearly established.

For risk data, it would be relevant to look at groups who have taken antipsychotics for long periods of time. In such groups, antipsychotic use is associated with reduced mortality rates (see the following 2009 reference from Lancet: http://www.ncbi.nlm.nih.gov/pubmed/19595447, which looks at a cohort of over 60 000 schizophrenic patients, showing reduced mortality rates in those who took antipsychotics long-term, compared to those taking shorter courses of antipsychotics, or none at all; the mortality rate was most dramatically reduced in those taking clozapine. Overall, the life expectancy of schizophrenic patients was shown to have increased over a 10-year period, alongside substantial increases in atypical antipsychotic use).

It is certainly clear to me that all other treatments for anxiety (especially behavioural therapies, lifestyle changes, and other forms of psychotherapy) should be optimized, in an individualized way, before medication adjuncts are used.

But I recognize that suffering from anxiety or other psychiatric symptoms can be severely debilitating, and can delay or obstruct progress in relationships, work, school, quality of life, etc. The risks of non-treatment should be taken very seriously. My view of the existing evidence is that adjunctive low-dose antipsychotics can have significant benefits, which can outweigh risks for many patients with non-psychotic disorders. As with any medical treatment decision, it is important for you and your physician to regularly monitor and discuss the risks vs. benefits of ongoing medication therapies, and to be open to discussing new evidence as it comes out.

Wednesday, July 15, 2009

Benefits and Risks of Zinc Supplementation in Eating Disorders, ADHD, and Depression

Zinc supplementation may help treat anorexia nervosa, ADHD, and treatment-resistant depression.

Zinc is a metallic element involved in multiple aspects of human cellular function, metabolism, growth, and immune function. It is required for the function of about 100 human enzymes. The human body contains about 2000-3000 mg of zinc, of which about 2-3 mg are lost daily through kidneys, bowel, and sweat glands. The biologic half-life of zinc in the body is about 9 months, so it can take months or years for changes in dietary habits to substantially change zinc status, unless the intake is very high for short periods.
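Just to make that time-scale concrete, here is a back-of-envelope sketch in Python (my own illustration, assuming simple first-order kinetics with the 9-month half-life quoted above):

    # How far along is the body toward a *new* zinc steady state after a
    # lasting dietary change, assuming first-order kinetics with a
    # 9-month biologic half-life?
    HALF_LIFE_MONTHS = 9.0

    def fraction_adjusted(months):
        """Fraction of the way to the new steady state after `months`."""
        return 1.0 - 0.5 ** (months / HALF_LIFE_MONTHS)

    for t in (3, 9, 18, 27):
        print(f"{t:2d} months: {fraction_adjusted(t):.0%} adjusted")
    # -> roughly 21% at 3 months, 50% at 9, 75% at 18, 88% at 27

In other words, under this simple model, even a year and a half after a sustained dietary change, zinc status has made only about three quarters of its eventual adjustment.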

Red meat is a particularly rich source of zinc. Vegetarians may have a harder time getting an adequate amount from the diet. The prevalence of zinc deficiency may be as high as 40% worldwide.

When referring to zinc dosage, it is best to refer to "elemental zinc". Different types of zinc preparations (e.g. zinc gluconate or zinc sulphate) have different amounts of elemental zinc. For example, 100 mg of zinc gluconate contains about 14 mg of elemental zinc. 110 mg of zinc sulphate contains about 25 mg of elemental zinc.
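To make this arithmetic easy to apply, here is a small Python sketch; the mass fractions are simply back-calculated from the figures above (the sulphate figure is consistent with the common heptahydrate form):

    # Approximate elemental-zinc mass fractions, back-calculated from the
    # figures in the paragraph above:
    ELEMENTAL_FRACTION = {
        "zinc gluconate": 0.14,  # 100 mg compound -> ~14 mg elemental zinc
        "zinc sulphate": 0.23,   # 110 mg compound -> ~25 mg elemental zinc
    }

    def elemental_zinc_mg(preparation, compound_mg):
        """Estimate the elemental zinc (mg) in a dose of a given preparation."""
        return compound_mg * ELEMENTAL_FRACTION[preparation]

    print(elemental_zinc_mg("zinc gluconate", 100))  # ~14
    print(elemental_zinc_mg("zinc sulphate", 110))   # ~25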

Here are references to articles written by a Vancouver eating disorders specialist between 1994 and 2006, advising supplementation of 14 mg elemental zinc daily (corresponding to 100 mg zinc gluconate daily) for 2 months in all anorexic patients:
http://www.ncbi.nlm.nih.gov/pubmed/17272939
http://www.ncbi.nlm.nih.gov/pubmed/11930982
http://www.ncbi.nlm.nih.gov/pubmed/8199605

Here's a 1987 article from a pediatrics journal, showing improvement in depression and anxiety following 50 mg/d elemental zinc supplementation in anorexic adolescents:
http://www.ncbi.nlm.nih.gov/pubmed/3312133

In this 1990 open study, anorexic patients were treated with 45-90 mg of elemental zinc daily; most had significant improvement in their eating disorder symptoms over 2 years of follow-up.
http://www.ncbi.nlm.nih.gov/pubmed/2291418

Here's a 1992 case report of substantial improvement in severe anorexia following zinc supplementation:
http://www.ncbi.nlm.nih.gov/pubmed/1526438

Zinc depletion may lead to an abnormal sense of taste (hypogeusia or dysgeusia). This sensory abnormality improves with zinc supplementation. Here's a reference:
http://www.ncbi.nlm.nih.gov/pubmed/8835055

Here's a randomized, controlled 2009 Turkish study showing that 10 weeks of 15 mg/day zinc supplementation led to improvement in ADHD symptoms in children. However, a close look at the study shows a bizarre lack of statistical analysis comparing the supplemented group directly with the placebo group. Looking at the data in the article, both groups improved to a modest degree on most measures, with perhaps slightly more improvement in the zinc group. The analysis here was insufficient; I'm surprised a journal would accept this.
http://www.ncbi.nlm.nih.gov/pubmed/19133873

Here's a 2004 reference to a study showing that 6 weeks of 15 mg elemental zinc daily as an adjunct to stimulant therapy improved ADHD symptoms in children, compared to stimulant therapy plus placebo. In this case, there was a valid statistical analysis:
http://www.ncbi.nlm.nih.gov/pubmed/15070418

Here's a 2009 study showing that zinc supplementation improves the response to antidepressants in treatment-resistant depression. The dose they used was 25 mg elemental zinc daily, over 12 weeks.
http://www.ncbi.nlm.nih.gov/pubmed/19278731

Here's an excellent 2008 review article about zinc deficiency, and about the potential role of zinc supplementation in a wide variety of diseases (e.g. infections ranging from the common cold, to TB, to warts; arthritis; diarrhea; mouth ulcers). The review shows that zinc may have benefit for some of these conditions, but the evidence is a bit inconsistent:
http://www.ncbi.nlm.nih.gov/pubmed/18221847

Here is a warning about zinc toxicity:

http://www.ncbi.nlm.nih.gov/pubmed/12368702 {hematological toxicity from taking 50-300 mg zinc daily for 6-7 months. The toxicity was thought to be due to zinc-induced copper malabsorption leading to sideroblastic anemia}

Here is a nice website from NIH summarizing the role of zinc in the diet and in the body, some of the research about health effects, and toxicity. It recommends a daily intake of 10-15 mg elemental zinc for adults, or about 5 mg for young children. It states that the maximum tolerable daily intake levels are about 5-10 mg for young children, 20-30 mg for adolescents, and 40 mg for adults:
http://ods.od.nih.gov/FactSheets/Zinc.asp

Here is a reference to another excellent review of zinc requirements, benefits, and risks. It makes more cautious recommendations about zinc supplementation, advising no more than 20 mg/day of zinc intake in adults. In order to prevent copper deficiency, it also advises that the ratio of zinc intake to copper intake not exceed 10.
http://www.ncbi.nlm.nih.gov/pubmed/16632171

So, were I to make a recommendation about a zinc supplementation trial, I would advise sticking to amounts under 20 mg (elemental) per day for adults, and ensuring that about 2 mg of copper per day is taken with it.
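For what it's worth, those two cautionary limits are easy to check for any proposed regimen. Here is a small Python sketch of that check (my own illustration of the numbers above, not something taken from the cited review):

    # Two cautionary limits suggested above: under ~20 mg/day of elemental
    # zinc for adults, and a zinc:copper intake ratio of 10 or less.
    MAX_ZINC_MG = 20.0
    MAX_ZN_CU_RATIO = 10.0

    def regimen_ok(zinc_mg, copper_mg):
        """True if a daily regimen satisfies both cautionary limits."""
        if copper_mg <= 0:
            return False  # with no copper intake, the ratio limit cannot be met
        return zinc_mg < MAX_ZINC_MG and zinc_mg / copper_mg <= MAX_ZN_CU_RATIO

    print(regimen_ok(15, 2))  # True: ratio 7.5, zinc under 20 mg
    print(regimen_ok(15, 1))  # False: ratio of 15 exceeds 10
    print(regimen_ok(25, 3))  # False: zinc exceeds 20 mg/day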

Wednesday, July 8, 2009

Prazosin and other treatments for PTSD-related nightmares

Nightmares are a common symptom of post-traumatic stress disorder (PTSD).

Various psychotherapeutic approaches can help people to deal with nightmares, both to be more psychologically prepared for them, and to be able to let them pass with a smaller amount of distress. Techniques include simply keeping a written record of the nightmares, with or without doing some cognitive therapy exercises based on this record; practicing relaxation techniques; doing exposure therapy during the daytime (by evoking the imagery of the nightmares, possibly "rescripting" the sequence of events); or planning for a "rescripting" of the nightmare during the nightmare itself. Here is a reference to a review article about psychotherapy for nightmares: http://www.ncbi.nlm.nih.gov/pubmed/18853707

Sedative drugs can change dreaming activity, but oftentimes these medications are problematic: tolerance or oversedation may develop, or sometimes the nightmares continue despite other improvements in sleep.

Prazosin is a cardiovascular drug which blocks alpha-1 adrenergic receptors, and is commonly used to treat high blood pressure. Alpha receptors are stimulated by adrenaline and noradrenaline, causing constriction of blood vessels and therefore increased blood pressure. In the brain, increased stimulation of alpha-receptors may be one of the mechanisms driving PTSD-related sleep disturbances such as nightmares. Prazosin has been shown to help reduce PTSD-related nightmares. Here are a few references:

http://www.ncbi.nlm.nih.gov/pubmed/18447662 {a good review article}

http://www.ncbi.nlm.nih.gov/pubmed/17069768 {a 2007 randomized, controlled, crossover study published in Biological Psychiatry, showing pronounced reduction in PTSD-related nightmares with 10-15 mg bedtime doses of prazosin}

http://www.ncbi.nlm.nih.gov/pubmed/12562588 {a 2003 randomized study published in The American Journal of Psychiatry showing substantial benefit in PTSD-related sleep symptoms with prazosin at an average of 10 mg/d}

There is the suggestion in these studies that prazosin, if dosed in the daytime as well, could help treat other PTSD symptoms.

Prazosin has been used for over 35 years in the treatment of hypertension. Interestingly, it is also one of the treatments of choice in the medical management of severe scorpion stings. It may also be a promising option in the treatment of alcoholism (reference: http://www.ncbi.nlm.nih.gov/pubmed/18945226).

Prazosin is well-tolerated by the majority of people taking it. It appears to have minimal psychiatric side-effects. Sedation does not seem to be common. If the dose is too high, too soon, it can cause excessive postural blood pressure drops, with dizziness and a risk of fainting (syncope). It may cause nasal congestion or headache. Priapism (a medically dangerous sexual side-effect) is possible but very rare.