Marjie Popkin thought she had chemo brain, that fuzzy-headed forgetful state that she figured was a result of her treatment for ovarian cancer. She was not thinking clearly — having trouble with numbers, forgetting things she had just heard.
One doctor after another dismissed her complaints. Until recently, that might have been the end of her quest for an explanation, since she was, at age 62, functioning well and having no trouble taking care of herself.
Last year, though, Ms. Popkin, still troubled by what was happening to her mind, went to Dr. Michael Rafii, a neurologist at the University of California, San Diego, who not only gave her a thorough neurological examination but administered new tests, like an M.R.I. that assesses the volume of key brain areas and a spinal tap.
Then he told her there was something wrong. And it was not chemo brain. It most likely was Alzheimer's disease. She seemed to be in the very earliest stages, but all the indicators pointed in that direction.
Until recently, the image of Alzheimer's was the clearly demented person with the sometimes vacant stare, unable to follow a conversation or remember a promise to meet a friend for lunch.
Ms. Popkin is nothing like that. To a casual observer, articulate and well-groomed, she seems perfectly fine. She is in the vanguard of a new generation of Alzheimer's patients, given a diagnosis after tests found signs of the disease years before actual dementia sets in.
But the new diagnostic tests are leading to a moral dilemma. Since there is no treatment for Alzheimer's, is it a good thing to tell people, years earlier, that they have this progressive degenerative brain disease or have a good chance of getting it?
"I am grappling with that issue," Dr. Rafii said. "I give them the diagnosis — we are getting pretty good at diagnosis now. But it's challenging because what do we do then?"
It is a quandary that is emblematic of major changes in the practice of medicine, affecting not just Alzheimer's patients. Modern medicine has produced new diagnostic tools, from scanners to genetic tests, that can find diseases or predict disease risk decades before people would notice any symptoms.
At the same time, many of those diseases have no effective treatments. Does it help to know you are likely to get a disease if there is nothing you can do?
"This is the price we pay" for the new knowledge, said Dr. Jonathan D. Moreno, a professor of medical ethics and the history and sociology of science at the University of Pennsylvania.
"I think we are going to go through a really tough time," he added. "We have so much information now, and we have to try to learn as a culture what information we do not want to have."
Some doctors, like Dr. John C. Morris of Washington University in St. Louis, say they will not offer the new diagnostic tests for Alzheimer's — like M.R.I.'s and spinal taps — to patients because it is not yet clear how to interpret them. Dr. Morris uses the tests in research studies but does not tell subjects the results.
"We don't know for certain what these results mean," Dr. Morris said. "If you have amyloid in your brain, we don't know for certain that you will become demented, and we don't have anything we can do about it."
But many people want to know anyway and say they can handle the uncertainty.
That issue is facing investigators in a large federal study of early signs of Alzheimer's. The researchers, who include Dr. Morris, have been testing and following hundreds of people aged 55 to 90, some with normal memories, some with memory problems and some with dementia. So far, only investigators know the results. Now, the question is, should those who want to learn what their tests show be told?
"We are just confronting this," said Dr. Richard J. Hodes, director of the National Institute on Aging. "Bioethicists are talking with scientists and the public about what is the right thing to do."
More ...
http://www.nytimes.com/2010/12/18/health/18moral.html?_r=1&ref=health