Sponsored by an educational grant from Ortho Clinical Diagnostics

Infectious diseases, the No. 1 cause of deaths worldwide and the third leading cause of deaths in the United States, are on the rise. From 1980 to 1992, the U.S. mortality rate from infectious diseases increased by 39 percent.

The never-ending microbial assault comes on all fronts: food, blood, air and water. The names — tuberculosis, HIV, hepatitis, E. coli, malaria, mad cow, Lyme — make newspaper headlines and are featured on the nightly newscast.

It’s only natural that these unwelcome predators would intimately inform our daily lives. Is that cough TB or just a cold? Should I cancel my hiking trip during the height of deer tick season? Does that delicious-looking cheeseburger harbor something more dangerous than just a spike in my cholesterol level? Can I trust the blood they give me when I’m in the hospital?

Before 1958, our nation’s blood centers tested for just blood type and syphilis. Today, 10 tests are needed to keep blood transfusions safe from virulent strains of hepatitis, HIV and other pathogens.

At the same time, our antibiotic arsenal is not quite up to the task. Mutating microbes have developed resistance to over-prescribed drugs, resulting in super germs that healthcare officials have dubbed “emergent” or “treatment-resistant.”

Addressing the problem has become a public health priority. The National Center for Infectious Diseases at the Centers for Disease Control has put new, re-emerging and drug-resistant infections at the top of its disease prevention priority list.

As Jonathan Briggs reports in our first story, the emergence and re-emergence of these and other disease agents is fueled largely by human behaviors such as increased international travel, worldwide transport of animals and food, unprecedented population growth and human encroachment on wilderness habitats that are reservoirs for insects and animals that harbor infectious agents.

Whatever the cause, the resurgence and emergence of both old and new microbes pose interesting challenges for healthcare professionals in general and laboratorians in particular.

— Coleen Curran

Emerging infectious diseases come on strong — are we doing something wrong?
A few years ago it was hantavirus in the Southwest. This summer it was West Nile virus in the Northeast. Deadly infectious diseases are on the rise, and our approach to battling them has only left us with more antibiotic-resistant bacteria. Public health officials refer to both newly identified infectious diseases and treatment-resistant forms of familiar infectious ills as “emergent.” Despite the success of modern vaccines and antibiotics, emergent diseases continue to challenge humankind in general, and the healthcare community in particular. How big is the challenge? Consider this: between 1980 and 1992, the U.S. mortality rate from infectious diseases increased by 39 percent.

Moreover, the Institute of Medicine (IOM), part of the National Academy of Sciences, predicts that despite our best efforts, many emerging communicable diseases will persist, and even increase, in the near future. As much as we might like to blame the bugs that cause these illnesses, most of the IOM’s unsettling prediction actually points toward the behavior of humans rather than microbes. Just how are we helping to arm our microbial nemeses? The IOM lists the following:
• Increasing human population density
• More people with suppressed or compromised immune systems
• Changing agricultural practices, settling in previously unpopulated areas and contributing to global warming
• Engaging in worldwide air travel and trade
• Allowing public health systems to deteriorate

Ironically, modern medical care is part of the problem. For example, the risk of acquiring an infection is increased by:
• Hospitalization for any reason
• Invasive procedures that break natural barriers
• Treatment with immunosuppressive drugs
• Antibiotics that alter the body’s bacterial balance
• Treatment with antibiotics that lead to the development of resistant infections

These factors, particularly the last three, indicate a need to change our approach to the treatment of infectious disease, according to Arturo Casadevall, M.D., associate professor of medicine in the division of infectious diseases at Albert Einstein College of Medicine in New York. Casadevall is referring to a change from our current emphasis on non-pathogen-specific therapy, as seen with broad-spectrum antibiotics, to pathogen-specific approaches, such as those used in cancer and other non-infectious diseases. “Cancer chemotherapy has to be very specific because the drugs can be very toxic. Therefore, you want the drug to have the best effect and do the least harm possible. You also want to be sure that you are using the most appropriate chemotherapy for the tumor in question,” Casadevall said. However, since the advent of antibiotics, treatment of infectious disease has used a somewhat different paradigm.

“Before antibiotics were available, the focus of infectious disease treatment was on the host,” said Casadevall. Treatments used the patient’s own immune system to help fight infection. One such treatment, dating back to the 1890s, was animal-derived serum therapy, which required accurate identification of the infecting microbe and was most effective when used prophylactically or to treat early-stage bacterial infections. The advent of sulfa drugs in the mid-1930s started a move away from serum therapy, a move hastened by the therapy’s difficulty of administration, lot-to-lot variation, toxicity and expense.

Early in the antibiotic era, most pathogens were susceptible to existing drugs, reducing the importance of a specific diagnosis. However, the spread of resistant organisms has reduced the usefulness of many antimicrobial drugs. Consequently, identification, speciation and determination of microbial susceptibility profiles are often needed to select appropriate drug regimens. “It is ironic,” said Casadevall, “that susceptibility testing is once again often essential to the selection of appropriate antimicrobial drugs.”

We need to change our approach to infectious diseases, particularly emergent diseases, because the treatment of microbial infections is complex. There is always the possibility of a paradoxical treatment effect. The paradox of antibiotics, for example, is that their safety, ease of use and broad spectrum of effectiveness are exactly what led to their indiscriminate use. “Every time you use an antibiotic, there is a probability you will select for a resistant organism,” cautions Casadevall.

Another paradox associated with antibiotics comes from the idea that treatment is simply a matter of getting rid of the offending organism. In the early days of antibiotics, it was assumed that the drug alone eradicated the offending organism. Now we know this is not true, especially not for patients with compromised immune systems. “We have learned that drugs often do not work by themselves. For example, because of their compromised immune systems, it is much more difficult to get rid of infections in AIDS patients, even with agents that are very active against those infections,” noted Casadevall. Ironically, the number of people with compromised or suppressed immune systems is much greater today precisely because of the large numbers of patients living with AIDS or undergoing therapy for cancer or autoimmune disease, or organ transplantation.

Another initial assumption about antibiotics was that they had few consequential side effects. Certainly, when used appropriately most antibiotics are safe and have few serious side effects. However, some side effects, initially considered inconsequential, have been found to be very important. For example, some antibiotics kill beneficial microbes in the human gut, making the individual more susceptible to other microbes.

Fortunately, another paradigm shift in the treatment of infectious disease may already be happening. “There are indications,” said Casadevall, “that the change to a pathogen-specific approach to infectious disease is already happening.” Recent advances in monoclonal antibody technology and recombinant DNA techniques make it possible to generate human antibody reagents that do not have the drawbacks associated with early serum therapy. “In theory, there would be advantages to combinations of antibody therapy and antimicrobial chemotherapy. Combination therapy could reduce the amount of either agent needed to achieve a therapeutic effect,” he continued. Another potential advantage of combination therapy is a reduction in the likelihood that treatment will result in selection of resistant organisms. As for the laboratory role, “The use of antibody-based therapy may also require antigenic screening in the clinical laboratory, a practice similar to current antimicrobial susceptibility testing,” Casadevall concluded.

So, perhaps it’s not so much that we are doing something wrong, when it comes to emergent infectious diseases, but that we can be doing so much more that is right.

— Jonathan Briggs


New detection and treatment methods offer hope for reducing the incidence and prevalence of Hepatitis C virus
Hepatitis C virus (HCV), one of the six known viruses that account for a majority of the cases of viral hepatitis, is one of the leading causes of chronic liver disease and the primary indication for liver transplantation in this country. An estimated four million Americans are infected with HCV, which causes 8,000 to 10,000 deaths annually. Research suggests that up to 30,000 new infections occur each year, 70 to 75 percent of which remain undiagnosed. The natural course of HCV infection can range from no symptoms to end-stage liver disease. Because the immune system does not easily clear HCV, the disease process spans decades, and patients must be followed over the course of a lifetime. Despite these challenges for clinicians and laboratorians, new methods for HCV detection and continued research into new treatment options offer hope for reducing the incidence and prevalence of HCV infection here and abroad.

Transmission and infection
The major route of HCV transmission is through parenteral contact with blood or blood products from infected individuals. Routine blood donor screening for HCV began in May 1990. With the advent of more sensitive assays in June 1992, the risk of transmission from transfusion has been reduced to less than one infection per 100,000 products infused. The most common source of new U.S. infections is IV drug abuse, although many patients without a history of exposure to blood or drug use are also infected. According to the CDC, the following put people at high risk for HCV: IV drug use, hemodialysis, healthcare work, multiple sex partners, sexual contact with HCV-infected individuals, blood transfusion prior to July 1992, taking clotting factors made prior to 1987 and being born to an infected woman.

Infection has peaked
The rate of new infection has decreased, mainly due to blood screening implemented in the early 1990s. However, according to Michael Fried, M.D., director of clinical hepatology at the University of North Carolina at Chapel Hill, we are likely to see an increase in the prevalence of HCV-related liver disease, impacting healthcare well into the future. “We may have seen the peak of infection, but we haven’t seen the peak of disease burden. That is going to take another 10 to 20 years to manifest,” he said.

Fried recommends that clinicians question all their patients to determine who should be tested for HCV, and that the guidelines developed by the CDC be implemented in daily practice. Currently, HCV diagnosis involves a series of serological evaluations, including tests for serum alanine aminotransferase (ALT) levels; enzyme immunoassay (EIA) to detect antibodies to HCV (anti-HCV); recombinant immunoblot assay (RIBA), a supplemental test to confirm the presence of anti-HCV; and qualitative polymerase chain reaction (PCR) assay to determine the presence of HCV RNA. According to Fried, the use of these assays can vary between high- and low-risk populations. “In the high-risk group, meaning that they have an identifiable risk factor, a positive anti-HCV test indicates that the patient has hepatitis C, and you could probably stop there with about 95 percent certainty. But in the low risk group, as with a volunteer blood donor who has no identifiable risk factors, a positive anti-HCV test may be a false positive in as many as 60 percent of patients.” In low-risk populations, positive anti-HCV screening tests are confirmed with supplemental assays, usually RIBA and qualitative PCR. If these tests are positive, the patient may then undergo liver biopsy to assess disease stage and fibrotic changes in the liver.
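Fried’s two-track workup can be summarized as a simple decision procedure. The sketch below is purely illustrative; the function name and its return strings are invented for this example, not drawn from any published algorithm:

```python
def hcv_workup(anti_hcv_positive, high_risk, riba_positive=None, pcr_positive=None):
    """Illustrative sketch of the serological workup described above.

    high_risk: patient has an identifiable risk factor (e.g., IV drug use).
    riba_positive / pcr_positive: supplemental results, if already performed.
    """
    if not anti_hcv_positive:
        return "no evidence of HCV infection"
    if high_risk:
        # In high-risk patients, a positive EIA alone is ~95% predictive.
        return "HCV infection; consider biopsy for staging"
    # Low-risk patients: confirm with supplemental assays before diagnosing,
    # since a majority of screening positives may be false positives.
    if riba_positive and pcr_positive:
        return "HCV infection confirmed; consider biopsy for staging"
    if riba_positive is None or pcr_positive is None:
        return "order supplemental RIBA and qualitative PCR"
    return "screening result likely false positive"
```

In practice the branching is clinical judgment, not code; the point is that the confirmatory step applies only to the low-risk arm.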

Viral load predictions
Once diagnosis is confirmed, patients are evaluated for both viral genotype and viral load. HCV is an enveloped, single-stranded RNA virus with at least six known genotypes and more than 50 subtypes. The virus exists in the patient as a population of various quasispecies, which may explain why the host cannot mount an effective immune response. Genotypes are geographically distributed, the most prevalent being types 1a and 1b in the United States. Recent studies indicate that genotype 1 infections have a lower response rate to antiviral therapy compared with others.

Although viral load does not seem to correlate with disease severity or prognosis, it can help predict the response to antiviral therapy. Viral load is also assessed during treatment to guide therapy cessation, and post-treatment to evaluate for a sustained response. Two methods commonly used to detect low levels of HCV RNA are targeted amplification using quantitative PCR and signal amplification using branched DNA; the assays differ in sensitivity and reproducibility.

Several issues exist with the currently available serological assays for HCV in both screening and diagnostic applications. Although it boasts a sensitivity of about 97 percent for anti-HCV, EIA alone cannot distinguish among acute, chronic and resolved infection. False-positive rates are low in low-risk populations (0.04 percent), and false negatives can occur prior to seroconversion and in patients with suppressed immunity. Because of the perceived need to increase analytical sensitivity, qualitative PCR-based nucleic acid amplification technology (NAT) has been used to detect HCV RNA in serum for both screening and diagnosis. Although commercial test systems are available, they are expensive and time-consuming, and they lack standardization, which can lead to inconsistent results.
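The 60 percent false-positive figure Fried cites for low-risk patients reflects predictive value rather than test accuracy. A quick Bayes’ rule calculation shows how an accurate test still yields many false positives when prevalence is low; only the 97 percent sensitivity comes from the text, while the specificity and prevalence values below are illustrative assumptions:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Fraction of positive results that are true positives (Bayes' rule)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative values: 97% sensitivity (from the text), with assumed
# 99.5% specificity and 0.3% prevalence in a low-risk donor population.
ppv = positive_predictive_value(0.97, 0.995, 0.003)
# Under these assumptions, only about a third of positives are true
# positives, which is why supplemental testing matters in this group.
```

Raise the prevalence to a high-risk level and the same formula returns a predictive value near Fried’s 95 percent figure, which is the entire rationale for the two-track workup.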

NAT testing research
In 1999, U.S. blood centers began research to assess the sensitivity and specificity of NAT for both HIV and HCV using pooled donor samples. Pooling samples can significantly reduce the number, time and cost of tests per donation, but may decrease sensitivity. Two pooling strategies have been implemented: a pool of 24 donor samples, in which a reactive result requires further testing of each individual sample; and a second strategy using intermediate pools that are combined into one master pool, where a reactive result requires testing of the intermediate pools and then the individual samples. The two test systems under evaluation, developed by Roche and Gen-Probe, use PCR and transcription-mediated amplification, respectively. The semi-automated procedures take from five to six hours.
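The 24-donor strategy is classic two-stage group testing. The sketch below, with invented sample data, shows how pooling cuts the number of NAT runs when reactive units are rare; it deliberately ignores the dilution effect on sensitivity noted above:

```python
def minipool_nat(samples, pool_size=24):
    """Two-stage group testing as described above (illustrative sketch).

    samples: list of booleans, True = unit contains detectable viral RNA.
    A pool is reactive if any member is positive (dilution effects, which
    in practice can reduce sensitivity, are ignored here).
    Returns (indices of positive units, total NAT runs performed).
    """
    tests = 0
    positives = []
    for start in range(0, len(samples), pool_size):
        pool = samples[start:start + pool_size]
        tests += 1                      # one NAT run on the pooled sample
        if any(pool):                   # reactive pool: resolve individually
            for offset, reactive in enumerate(pool):
                tests += 1
                if reactive:
                    positives.append(start + offset)
    return positives, tests

# 480 donations with one true positive: 20 pool runs plus 24 resolution
# runs, i.e. 44 NAT runs instead of 480 individual tests.
donations = [False] * 480
donations[100] = True
found, runs = minipool_nat(donations)
```

The master-pool variant adds one more stage of the same logic, trading a few extra resolution rounds for even fewer first-pass runs.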

Assays for HCV RNA on the horizon
Preliminary data suggest that minipool NAT testing for HIV and HCV RNA has successfully identified preseroconversion blood units. According to Frederick Nolte, M.D., associate professor of pathology and laboratory medicine at Emory University, this research will likely lead to FDA-cleared assays for HCV RNA detection. “Having an FDA-cleared set of reagents will introduce standardization into the field. And in the future, automation in nucleic acid diagnostics is going to be important as the demand for testing grows. Transference of high-throughput technology … to the clinical lab is hopefully something we have to look forward to in the near future.” Fried agrees, adding that standardization will also have an impact at the clinical level. “One of the big problems with PCR assays is that until now, they have been poorly standardized, and their level of sensitivity varies greatly according to where the tests are done. Having a reproducible FDA-approved assay will be of great benefit in standardizing the diagnosis as well.”

Quantitative PCR and branched DNA assays to determine viral load are also currently not standardized and are prone to high levels of laboratory variability as well. “Quantitative HCV viral load determinations may become increasingly important as the data clarifies how well viral load may predict response to therapy,” said Nolte. “So the next step is to have some sort of FDA-cleared quantitative test for HCV RNA.”

Despite the move toward standardization, nucleic acid amplification testing remains expensive and time intensive, especially in the screening setting. According to David Miller, director of testing and production at the Charlotte, N.C., branch of the American Red Cross National Testing Laboratory, NAT is currently performed under an IND at two of their eight regional facilities. The organization plans to expand NAT to new sites in the future, but until then, samples from all regions are submitted to these two facilities for pooled testing.

“Eventually we hope to have the ability to analyze individual samples with automated processing. In the meantime, there are other assays being developed that may be less expensive to implement,” Miller said.

Europe studies HCV core antigen ELISA

To that end, researchers in Europe are evaluating a murine monoclonal antibody-based enzyme-linked immunosorbent assay (ELISA) to detect HCV core antigen in serum. Developed by Ortho-Clinical Diagnostics, the method may be of particular use in detecting HCV prior to seroconversion, especially in the screening setting. The assay, which is compatible with existing ELISA equipment, analyzes individual samples in about three hours. Researchers are comparing the sensitivity of the assay to NAT and examining its potential in quantitative applications for treatment monitoring and long-term follow-up. Preliminary studies by French and British researchers suggest that the assay can detect HCV infection an average of two days after HCV RNA becomes detectable and approximately 49 days earlier than anti-HCV. The assay is approved for use in France and Hungary, and U.S. clinical evaluations are scheduled to begin shortly.

Research regarding treatment for HCV continues as well. According to Fried, antiviral treatment regimens have been slowly improving, and new approaches to therapy are under evaluation at sites throughout the world. Currently, sustained response rates to combination therapy with interferon and ribavirin have reached 35 to 40 percent. “That means that 60 percent of the patients we treat still don’t have a beneficial response to these agents. One of the big initiatives for new research is the use of the longer-acting pegylated interferons, either alone or in combination with ribavirin.” Other avenues for treatment include the use of inhibitors that target specific enzymes used during HCV replication. As for prevention, a universal vaccine for this rapidly mutating virus is probably a long way off.

Diagnosis and treatment in the future
What does the future hold for HCV diagnosis and patient monitoring during treatment? Because not all patients who test positive for HCV proceed to end-stage liver disease, more research needs to be done to help determine prognosis. “The biggest question we face as clinicians is predicting the natural course of hepatitis C. None of the available assays, such as PCR testing or genotyping, tell us who is going to progress with the disease,” said Fried. “In addition, we need to develop noninvasive ways to detect fibrosis and the presence of cirrhosis. And ultimately, we need to determine the best way to monitor patients while on therapy.” With the increase in demand for HCV testing and with the potential for new diagnostic methods, the role of the clinical laboratorian in HCV disease management will expand well into the future.

— Louise Lazear


Microbes in the wardrobe

Three years ago, when he was first given the gift of herpes, Roger Freeman had an unusual response: he was pleased. The gift, an artistic rendering of the herpes virus on a silk necktie, was a perfect conversation starter for someone in the infection control business. At the time, Freeman was producing infection control videos for consumers and healthcare workers.

In fact, Freeman liked his necktie so much, he located its manufacturer and bought the business. But what began as a sideline to video production has become a full time pursuit. For about two years, Infectious Awareables of Encino, Calif., has sold a line of silk ties and scarves that showcase some of nature’s nastiest pathogens as works of art. In addition to fashion, there’s substance; on the flip-side of each tie is an educational message about the disease.

Among the infectious bugs represented are: Ebola, staphylococcus, influenza-C, plague, giardia, E. coli, dental plaque, tuberculosis, chlamydia, syphilis and gonorrhea. One of the latest releases, a breast cancer necktie, features one cell on the front and a backside message that says, “Breast Cancer is the leading cause of mortality in women ages 35-54. Early detection and optimal treatment are the keys to long-term survival.”

Freeman’s ties have been mentioned in GQ, Jane and People, but he has been careful to keep his marketing efforts in the scientific and healthcare community. The ties are in gift shops at the Centers for Disease Control and the National Institutes of Health, but not in upscale department stores. “These are serious issues. We’re not trivializing them,” Freeman said. “Our customer base reads like a ‘Who’s Who in Science.’ I do not want to drop these in the middle of Nordstrom’s and have them taken out of context.”

After 25 years as a dentist, Freeman turned in his drill. “In the early ’90s, OSHA hammered dentistry with infection control issues because of a Florida case where the dentist allegedly passed AIDS on to six of his patients,” Freeman said. “So dentists were at the forefront of infection control. I got involved in dental office infection control consulting and then infection control on the consumer level.”

Next on the agenda for Infectious Awareables is a line of boxer shorts targeted at teenagers. With a fall lineup that includes dust mites, gonorrhea, testosterone, the plague and anthrax, how can they miss?

Infectious Awareables gives a portion of the proceeds from every purchase to research and education. Check out their Web site at: www.iawareables.com or call 800-388-1237.

— Coleen Curran


Safe blood requires strong process controls

Ensuring a safe supply of blood and blood products to our nation’s hospitals is a daunting task for the approximately 150 non-hospital U.S. blood centers that safeguard one of our most precious resources.

Testing blood for dangerous microbes has always been important, but never more so than today. Prior to 1958, each donor unit received just two tests: one for blood type (ABO group and Rh factor) and one for syphilis. Today, most blood centers perform 10 tests on every donor unit: ABO group and Rh (blood type); syphilis; an antibody screen; ALT; HIV-1/2; HIV-1 p24 antigen; HTLV-I/II; HCV; HBsAg; and hepatitis B core antibody.

While the number of marauding microbes on the horizon continues to grow, so does the battalion of diagnostic tests that seek them out. Just as important as those tests is the process control that ensures they are performed correctly. There is plenty of evidence that great strides in diagnostic testing have secured the integrity of the nation’s blood supply. For example, in the 1980s, when AIDS emerged, the percentage of AIDS cases attributed to blood transfusions was 30 percent; today, it’s 0.03 percent.

Similar leaps in technology and process control have occurred in sample identification, where about 90 percent of positive sample identification mistakes can be attributed to data entry or other human error.

“That’s the reason everything is barcoded,” said Steve Negin, lab manager of the Central California Blood Center in Fresno, Calif. “Not just in our business, but in every clinical lab. Every one of our instruments reads sample ID and control ID barcodes, so there is little manual entry. We’ve done everything we can to take the human factor out of positive sample identification. Most instruments have barcode scanners to read the label on tubes, but if the label on the tube isn’t placed high enough to be read automatically, we use a handheld barcode scanner as a backup. If we have to do a manual entry, we do a double-blind manual entry.”

The Central California Blood Center, a medium-size facility, runs the 10 tests on about 70,000 units a year. One of four extra tubes drawn during donation goes for EIA viral marker testing on the Ortho Summit processor.

FDA requires that each of the many detailed steps in the microplate process for checking viral markers be completely documented and validated. The Summit system converts all the manual steps to an automated process and delivers a detailed report of every step down to the second.

“That level of information would be almost impossible to capture manually,” Negin said. “The documentation now is much more complete than just two years ago before this system came out. We’ve had a couple of inspectors come through and look at it. It makes their job much easier, because they can tell at a glance if things have been done right. Eventually, as this automated system becomes more widespread, inspectors are going to expect this level of detail from everyone. They’re going to be asking, ‘what are you doing manually to compensate for what’s being done on an automatic system? It’s going to be almost impossible to do.’”

— Coleen Curran


Eat dessert first — a not-so-savory update on TSE
“Mad-cow” disease, or bovine spongiform encephalopathy (BSE), and its rare human counterpart, Creutzfeldt-Jakob disease (CJD), continue to make front-page news. In the UK, the rising incidence of new-variant (nv) CJD triggered a government investigation into what has been termed a possible epidemic. In a controversial action, the USDA recently ordered the slaughter of two sheep flocks in Vermont on suspicion that they were infected with an agent similar to BSE. Meanwhile, a recent report by EU scientists indicated that BSE is probably present in cattle in Germany, Spain and Italy, and that the presence of BSE-infected cattle in the United States and Canada, while unlikely, cannot be excluded.

Underlying these stories are far-reaching implications for world health, ranging from safety of food products to blood supplies. Scientists and researchers worldwide are racing to discover more about these ultimately fatal, degenerative, neurological diseases known as transmissible spongiform encephalopathies (TSEs).

TSEs are a family of diseases that attack the central nervous system, leaving microscopic sponge-like holes in the brain matter of infected individuals. TSEs have been identified in humans and animals. In most cases, the disease leads to involuntary muscle jerks, hence the moniker “mad-cow” disease for the bovine variant.

Initially, scientists suspected a slow-acting virus or other organism caused the disease. However, several characteristics of the elusive infectious agent, including its resistance to conventional antiviral techniques and its apparent lack of DNA and RNA, led researchers to believe the culprit is a misshapen protein molecule known as a prion, short for proteinaceous infectious particle. Prions exist in cells in both normal and infectious forms; infectious forms are shaped differently from normal prions. It is believed that infectious prions can arise spontaneously and replicate by altering normal prions in other cells via a chain reaction. These abnormal prion proteins clump together into fibers or plaques that give infected brain tissue its spongiform appearance under microscopic evaluation.

Some scientists remain unconvinced that an agent lacking a genome can replicate and cause disease, as researchers have been unable to duplicate prion replication in vitro in mammalian cells. In a recent Science article, however, researchers reported that prions in yeast cells replicated by spontaneously converting normal proteins to the abnormal prion state. Scientists hope this research will lead to further understanding of prion-related disease, including a way to block the mechanism of prion replication in human cells.

TSE infectivity concentrates mainly in the central nervous system of victims. Experiments with animals suggest that the infectious agent can cross the species barrier, pointing to a possible link between exposure to diseased animals and the development of CJD in humans. BSE, first identified in the UK in 1986, is thought to have arisen from the use of cattle-feed supplements made from rendered portions of TSE-infected sheep. Since 1996, reported cases of BSE in the UK have declined significantly, due in large part to feed bans restricting the use of rendered proteins in feed supplements. Recently, the EU announced an upcoming ban of “specified risk material,” including cattle eyes, brains and spinal cords, from the food chain in all member states. Although there have been no reported U.S. cases of BSE, in June 1997 the FDA initiated similar regulations prohibiting certain mammalian proteins in feed products for ruminant animals, including cows, sheep and goats. The FDA positioned the action as a precautionary measure to prevent the transmission of TSEs between animals and to reduce any potential risk of human infection.

In humans, TSE diseases include CJD, kuru, Gerstmann-Straussler-Scheinker disease and fatal familial insomnia. While CJD is the most common human TSE, it remains very rare, affecting about one person per million annually worldwide. In the United States, CJD occurs at a rate of about 200 cases per year. The disease, which can have a clinical latency of up to 30 years, usually manifests itself after age 60. Rapidly progressive dementia, severe muscle jerking (myoclonus), blindness and coma characterize CJD. Ninety percent of patients die within a year of symptom onset. Although researchers have experimented with antiviral therapies, steroids and antibiotics, there is no known cure.

Several variants of CJD are known to exist, differing in both clinical course and symptoms. In 1996, new-variant (nv) CJD was identified in Great Britain and France. NvCJD differs from classic CJD in that it affects younger people, has a longer clinical course, and its symptoms often begin with psychological disturbances. Many scientists believe that nvCJD is linked to eating meat from BSE-infected cattle. At a recent meeting of the UK’s Spongiform Encephalopathy Advisory Committee, scientists reported a total of 76 cases of nvCJD, noting an increase in incidence of 20 to 30 percent per year. Four of these cases, including two teenagers, occurred in Leicestershire County, where an emergency investigation is underway. The investigation, to determine the number of people infected with the disease, will include examination of tonsil and appendix specimens removed since 1985.

Researchers believe humans acquire CJD in three ways. In 85 percent of cases, acquisition appears to be sporadic; no known source of infection can be identified. In 10 to 15 percent of cases, CJD has been linked to a hereditary mechanism: patients have a familial history of the disease, test positive for genetic mutations related to the prion protein, or both. Lastly, in less than 1 percent of cases, CJD may be acquired through unintentional exposure to infected tissue during medical procedures, known as iatrogenic acquisition. Documented cases of CJD transmission have occurred in procedures involving dura mater grafts and corneal transplants from infected patients, implantation of contaminated brain electrodes and exposure via contaminated surgical instruments. CJD also has been acquired through injections of human-derived pituitary growth hormone. Although it is generally believed that CJD cannot be transmitted through casual contact with infected individuals, normal sterilization techniques do not destroy prions. Therefore, when interacting with CJD patients, healthcare workers must follow special precautions.

At present, there is no single diagnostic test for CJD. Typically, adults presenting with rapid dementia and myoclonus are evaluated to rule out other forms of dementia. EEG studies can identify an abnormal wave pattern specific to CJD. CT may be used to rule out other organic diseases, while MRI may help to identify patterns of brain degeneration consistent with CJD. The most definitive confirmation of diagnosis is brain biopsy, which is highly invasive, costly and can produce false-negative results if the biopsy is taken from an unaffected portion of the brain. Other methods include evaluation for the presence of the prion gene mutation, and immunohistochemistry, Western blotting or ELISA to detect the presence of abnormal prion protein. In a recent article, German scientists reported developing a highly sensitive method to detect femtomolar concentrations of abnormal prion protein aggregates in the CSF of patients with CJD. This dual-color fluorescence correlation spectroscopy method may lead to a reproducible, standardized assay for CJD and other prion-related diseases.

The eventual development of diagnostic assays for TSEs may play an important role in maintaining the safety and availability of blood and blood products. Although research suggests that TSEs may be transmissible via animal blood, there has been no documented human infection from blood transfusion. Nonetheless, in 1987, the FDA recommended that persons who received pituitary-derived human growth hormone be permanently deferred as blood donors. Since then, the agency has issued a series of revised and new recommendations for donor deferral, product disposition and recipient notification in order to reduce the possibility of transmission of CJD and nvCJD through blood and blood products. In late 1999, the FDA recommended the withdrawal of all nvCJD donor materials from the blood supply. It also recommended that any donor spending six months or more on a cumulative basis in the UK from 1980 through 1996 be indefinitely deferred as a donor. An organization representing U.S. blood centers estimates that this new deferral recommendation has the potential to reduce the blood supply by about 2.2 percent.

Although much headway has been made regarding the mechanism of TSE infection and detection, many questions remain unresolved. In an effort to coalesce international expertise, the FDA and NIH are sponsoring a three-day workshop for representatives in science, industry and government who share a common desire for validated TSE diagnostic assays. Information on the workshop is available at www.fda.gov/cber/meetings/tse092000.htm.

In the event that conclusive, standardized assays for CJD and other TSEs are developed, the question remains — who really wants to know if they have the disease? With a potential latency of 30 years and no known cure, the prospect of disease detection among the seemingly healthy presents complex medical, ethical and social dilemmas. In one respect, knowledge of the extent of TSE infectivity in humans and animals may spur additional public awareness and demand for new therapies, some of which could impact other prion-related diseases. On the other hand, perhaps the conventional wisdom — “Eat Dessert First” — will become the new mantra of the fast food generation.

— Louise Lazear


Some HIV patients may interrupt drug therapy when viral loads are down
Laboratory tests to determine viral load and CD4 counts in patients infected with HIV have been in use for some time. With the advent of new drugs, drug regimens and treatment strategies, these tests are becoming more important as indicators of patient response to therapy.

Left: HIV budding out of a T-cell
Below: Computer-generated image of HIV

Standard antiretroviral regimens used to treat patients infected with HIV usually consist of at least three agents: one or two protease inhibitors or a non-nucleoside reverse transcriptase inhibitor, plus two nucleoside analogs. Indications for treatment with these regimens include a viral load of 1,000-5,000 copies/mL and/or a CD4+ T-lymphocyte count of less than 500/mm3. The goal of these regimens is to reduce measurable viral load to undetectable levels and allow native immune system components such as CD4+ T lymphocytes to increase. Consequently, changes in viral load and CD4+ T-lymphocyte counts are also used as surrogate markers of disease progression in patients infected with HIV.
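For readers who want to see the two laboratory thresholds above side by side, here is a minimal sketch in Python that encodes only the article’s stated cutoffs. The function name and structure are illustrative, not part of any clinical guideline; real treatment decisions weigh many more factors than these two numbers.

```python
def treatment_indicated(viral_load_copies_per_ml: float,
                        cd4_count_per_mm3: float) -> bool:
    """Illustrative check of the two thresholds cited in the article.

    Indications for a standard antiretroviral regimen include a viral
    load of 1,000-5,000 copies/mL (or higher) and/or a CD4+
    T-lymphocyte count below 500/mm3. Either criterion alone suffices
    ("and/or" in the text).
    """
    high_viral_load = viral_load_copies_per_ml >= 1000
    low_cd4_count = cd4_count_per_mm3 < 500
    return high_viral_load or low_cd4_count


# A patient whose viral load crosses the cutoff qualifies even with a
# normal CD4 count, and vice versa.
print(treatment_indicated(4000, 600))   # viral load criterion met
print(treatment_indicated(500, 400))    # CD4 criterion met
print(treatment_indicated(500, 600))    # neither criterion met
```

Because the two markers are combined with “and/or,” crossing either threshold is enough to indicate therapy under this simplified reading.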

Although these tests are indirect markers of patient status and improvements in them do not represent a “cure,” they are indicative of outcomes. For example, a recent Duke University Medical Center study of more than 1,000 patients with HIV-1 found that reduced viral load and increased CD4 counts are associated with improved quality of life for AIDS patients. An additional benefit of viral load testing is that it makes it possible to individualize therapy and determine when to initiate or change it.

Viral Load Testing’s Role in New Intermittent Therapy

HIV-infected patients have to take antiretroviral medications on a strict schedule to avoid drug-drug interactions and reduce other side effects. It is also important that patients adhere to the therapy for long periods to minimize emergence of viral mutants that would require regimens that are even more demanding. However, the complexity of these multi-drug regimens makes it difficult for patients to comply with therapy for long periods. One strategy being tested to help patients comply is to give them a break from the regimens. An essential element of this intermittent therapy is viral load testing.

If a patient’s regimen reduces viral load, then taking a break may be appropriate. And the lower the viral load, the longer the break could be, at least in theory. In addition to reducing side effects, this approach also allows the patients’ own immune systems to recover. Moreover, by using this intermittent approach, researchers hope to extend the period during which combination therapies are effective and the time before the virus mutates and becomes resistant to treatment.

This type of intermittent therapy is being tested in about 10 clinical trials in the United States and Europe. To date, patients who have stopped therapy for up to a month have still experienced increases in viral load, but these were reduced to unmeasurable levels when the patients went back on antiretroviral therapy. Researchers hope these trials will provide more information on the effects of AIDS drugs and regimens, so that patients may eventually be allowed to take breaks from therapy of six months or more.

— Jonathan Briggs