Tuesday, August 11, 2015

Fear of Alzheimer's

The Arboretum at Penn State 2015

Steve went to an informative talk at the American Psychological Association's meeting in Toronto, given by Glenn Smith, Ph.D., formerly of the Mayo Clinic and now at the University of Florida.  Dr. Smith gave a very balanced talk about the current fad diagnosis, MCI, or Mild Cognitive Impairment.  We have both felt that this is not a diagnosis, but rather the measured effects of normal age-related memory changes.  However, the pharmaceutical companies are heavily invested in this diagnosis because it allows them to give their very expensive experimental drugs to younger people for longer periods of time (pardon my cynicism).  Research has now shown that the drugs make no measurable difference in cases that are already diagnosable, that is, cases in which the disease is much further along.  So the thinking is that if the drugs are given earlier, they may delay cognitive decline (something, by the way, that is impossible to prove).  However, Dr. Smith's research shows that among those given a diagnosis of MCI, as many got better as got worse, and many stayed the same.  The drug studies describe the percentage who get worse as "converting" to Alzheimer's, but in fact they may be the minority who actually had a degenerative process in the first place.

There is a much longer and more complicated argument that I won't get into here, but it did get us thinking about how much public awareness of Alzheimer's and other dementias has changed since we began our careers 35 years ago.  And I think you'll find that things are both better and worse.

In the late 1970s, when we started out in this field, no one had heard of Alzheimer's disease.  When families came to the clinic, they may have been told that their loved one had "hardening of the arteries to the brain," had "chronic brain syndrome," or, in the worst cases, was mentally ill.  Once they received the diagnosis, family members were immediately told that they would have to institutionalize the patient, usually in a locked mental hospital.  At Steve's clinic, we gave them the information we had at the time, but mostly we listened and did some problem-solving with them.  Later, when we started the support groups, individuals were able to meet other caregivers for the first time, which was often instrumental in their being able to continue providing care at home.  In the beginning, the groups were called Alzheimer's and Related Disorders, and we felt that providing information was part of our mission.

As we look back on it now, in those days there were probably many, many people aging in their homes and communities who were never diagnosed.  They were old, and they were failing in a variety of ways, and their families and communities expected them to be forgetful and no longer "with it."  It wasn't labeled, and it was considered a normal part of growing older.  As long as they were able to take care of themselves (what we call ADLs, or Activities of Daily Living, like feeding, dressing, and bathing themselves), they stayed at home.  You could say nobody diagnosed it because there was no treatment, but frankly, that is not very different today.  And even then, about every six months there would be a press release announcing the latest "memory cure."  With the internet, the frequency of such announcements has increased exponentially, but the efficacy hasn't.

I think the biggest difference is that people were not nearly as fearful of the implications of memory changes.  It was not yet a disease, and the media hadn't started their relentless refrain about the incidence and prevalence of Alzheimer's disease, with the dramatic images and stories that have now become part of our literature.  In fact, most people expected that if they lived long enough they would be allowed some memory lapses.

In truth, most people we have seen who live to an advanced age, or who have a serious medical illness, have some changes in their memory and cognitive functioning compared with healthy young adults.  That's normal and expected.  Are there plaques and tangles?  Maybe, but we all have some.  Are there tiny pinpoint strokes?  Probably yes again, since those are normal and expected with age.  Do we need to worry about it?  Probably a lot less than we do now.  I think the important question to ask is, can I still do the things I want to do?  If you can, you might be better off focusing on the things you enjoy doing and the people you like spending time with, rather than pursuing a diagnosis that at the very most will start you on the path of taking some fairly toxic medications that have not been shown to be effective.  This is, in fact, the direction Dr. Smith is taking in his research.

Now, more than three decades later, we have seen much more research about what goes on inside the brain.  One of the striking things is that the number of problems a person has does not correlate well with the amount of change in the brain.  The research, especially the tests and imaging studies, has focused on finding and counting those changes.  But if the changes don't correspond with the actual behavioral problems, how helpful is that?  Especially since we still don't know which treatments, if any, will actually make a difference.

We all know someone, either in our own family or among our acquaintances, who has been given a dementia diagnosis.  And we can't help but read or hear about dramatic cases in the media.  Are we better off for knowing all this?  I'm really not so sure.

Friday, August 7, 2015

Steve: Does Knowledge Become Obsolete?

Park Monceau in Paris

One of the more dispiriting aspects of growing older is the widely held assumption that the knowledge we have accumulated over a lifetime of learning has become obsolete.  This belief is most pronounced when it comes to new technology.  Who among us doesn’t struggle with various electronic devices?  But technology aside, is our knowledge really obsolete?

I am currently at the annual meeting of the American Psychological Association and ran into a friend I have known since graduate school.  We were both attending the same talk, given by a leading figure in the psychology of aging.  My friend Dave and I sat together, and as the speaker laid out her theories, we started poking each other.  Although some of the terms the speaker used were different, these were all ideas we had learned in graduate school.  I said, “Isn’t that what Bob Havighurst wrote?”  Havighurst was a pioneer in studying the human life span who described the different challenges of each period of life and how people adapted (or failed to adapt) to them.  And here was his theory, wrapped up in new terms.  Then Dave poked me: “This is what Bernice used to say.”  Bernice was Bernice Neugarten, another pioneer of the study of the adult years, who terrorized us as graduate students but gave us a broad intellectual base for understanding aging.  Here were her concepts, and even a study that largely replicated one Neugarten had done 50 years ago.  The speaker, who is a highly regarded scholar, was simply unaware that much of what she was talking about had been described earlier by Havighurst, Neugarten, and others.

There is a lot in research that is new and exciting, but it is not uncommon to come across ideas where the researcher has unknowingly repackaged something from the past.  There is a lot of lip service paid to knowing history so that we don’t repeat it, but it has become all too common for researchers not to go back in the literature more than 5 years, or 10 years at most.  As a result, they sometimes repeat what people have already done, rather than building on prior work and extending it in new directions.  They could benefit from the perspectives of those of us who remember the past.

Later that day, I went to a poster session, where mostly graduate students present their work.  The nice thing about poster sessions is that there is an opportunity to talk with the students about what they did.  One poster, very nicely done, was on mindfulness and how it was associated with emotional well-being in middle and late life.  In case you haven’t heard of it, mindfulness is the flavor of the month when it comes to psychological interventions.  It involves teaching people to be more aware of their own thoughts and feelings and those of the people they are interacting with.  Many studies now demonstrate that mindfulness helps us function more effectively in a variety of situations, from raising children to dealing with caregiving stress.  Yet I couldn’t help thinking about all the forerunners of mindfulness that were also fads in their day: getting in touch with one’s feelings, relaxation training, focusing, and of course meditation in all its varieties, which is a major source of mindfulness.  All of these approaches had positive value but have fallen by the wayside.  Instead of believing they have discovered the next big thing, mindfulness researchers should be asking why these earlier approaches are no longer widely used.  Too many of them have no idea that anything preceded mindfulness.  The answer, by the way, is that while such techniques help people feel better in the short run, they do not usually help people solve the chronic and enduring challenges in their lives.  Certainly, a caregiver assisting someone with dementia may feel better in the short run after learning mindfulness, but is that going to be enough over the long run, with the cascade of problems that fill each day?  That’s the dilemma mindfulness researchers might ponder, if they were aware of the past.

So does knowledge become obsolete?  Later in the day, I saw a friend who gave me an update on a research proposal on which I had been listed as a consultant.  The proposal had not been funded, but she was planning to resubmit it and was hopeful it would be successful.  She had heard I was retiring and wondered if I would still serve as a consultant on her project.  I assured her I would.  Afterward, I realized I should have asked for twice as much money.  After all, I have knowledge from the past that is not obsolete.  I can help her steer the project away from the pitfalls that plagued earlier research and toward extending and improving what we already know.