HIV disease is a continuum of progressive damage to the immune system, extending from the time of infection to the manifestation of severe immunologic damage by the opportunistic infections (OI), neoplasms, wasting, or low CD4 lymphocyte count that define AIDS. The time it takes to traverse this spectrum varies greatly, from 1 year or less in some persons to an as yet unknown upper limit that has reached nearly 20 years in a few individuals. The period from infection to the development of AIDS is known as the incubation period; the period from an AIDS diagnosis to death has been studied separately as AIDS survival time. The epidemiology of HIV disease progression has attempted to characterize the distributions of the incubation period and the AIDS survival period, to identify laboratory tests useful for prognosis and treatment decisions, and to determine which cofactors accelerate or retard the rate of disease progression. These topics are reviewed in this chapter.
The incubation period of an infectious disease characterizes the natural history from infection to initial manifestations of the disease in the absence of treatment. For most diseases with short incubation periods (e.g., the common cold), treatment does not intervene until disease symptoms appear, but the very long incubation period of HIV infection allows time to treat the infection prior to the appearance of clinical disease. As a result, studies of the natural history of HIV infection had been conducted in the absence of treatment only for the few years before antiretroviral therapy and prophylaxis of Pneumocystis carinii pneumonia (PCP) became standard treatment in developed countries. What is perhaps the most important question about the natural history of HIV infection is still in doubt: Will all infected persons eventually succumb to AIDS? The eventual fate is unknown of those individuals who have not manifested AIDS or progressive immunosuppression after nearly two decades with HIV infection. Longitudinal studies of HIV-infected persons have shown, however, that nearly all infected persons have a CD4 lymphocyte count below the mean for seronegative persons and show a progressive loss of these cells over time.(1-3) Persons who are exceptions to this general rule, sometimes characterized as long-term nonprogressors or long-term survivors, are of particular interest to an understanding of the natural history of HIV infection.(4,5) In data from the era prior to the advent of therapy with protease inhibitors in combination with other antiretroviral drugs, retrospective testing of serum samples stored from the hepatitis B vaccine trial in 1978 and subsequent follow-up identified a cohort of 621 men with well-characterized seroconversion. By 17 years after infection, 87% had developed AIDS. 
When nonprogression was defined as a CD4 lymphocyte count greater than 500/µl, 12% of the cohort were nonprogressors at 10 years of follow-up, but only 3% remained nonprogressors at 16 years of follow-up.(6)
Other observations made in the era prior to the advent of combination therapy also suggest it is likely that nearly all HIV-infected persons will eventually lose CD4 lymphocytes and progress to AIDS in the absence of effective treatment. For example, in about 10 years of follow-up of 288 men HIV-seropositive at baseline in the San Francisco General Hospital Study (1983-1993), only one (0.3%) maintained a CD4 lymphocyte count above 700/µl throughout follow-up and nearly all showed some worsening of other laboratory values predictive of AIDS, such as anti-p24 antibody levels, beta2 microglobulin, or neopterin.(7) Most HIV-positive persons, even with near-normal CD4 lymphocyte counts, show functional lymphocyte abnormalities that suggest their long-term immune functioning will be impaired.(8,9) There are no clearly documented instances of persons who appear to have cleared a longstanding HIV infection, although there is a report of transient infection in an infant.(10)
These observations on the natural history of HIV infection may no longer hold in an era of more effective antiretroviral therapy. The effectiveness of combination antiretroviral therapy that includes a protease inhibitor in slowing HIV disease progression appears to be altering the disease course significantly. There has been speculation about whether very early initiation of therapy, especially during the period of the syndrome associated with acute primary infection, could control or even eradicate HIV. Although therapy can now reduce HIV RNA to undetectable levels in peripheral blood for long periods, early hopes that HIV might be eradicated in an infected person by therapy have been dampened by the detection of reservoirs of HIV-infected cells outside of the peripheral blood. The possibility of strengthening the initial immune response with early treatment to control infection indefinitely holds out more promise, although it may depend on whether the immune system can control HIV even if therapy is withdrawn at some point.
At this time in the history of the AIDS epidemic, questions about survival time past an AIDS diagnosis and the impact of the new combination therapies on the incubation period cannot be addressed with any certainty, because their effects are just being seen in population-based data, such as the number of annual AIDS cases and AIDS deaths. It is unclear for how many years the benefit of these therapies can be sustained or whether recent encouraging results can be built on with additional treatments to bring HIV infection under control indefinitely. Data on the incubation period presented in the next section derive from the period before the advent of the new combination therapies. Although zidovudine came into widespread use during the period when some of these data were collected, its effect on the incubation period and survival is now believed to have been modest, and many of the modelers of the incubation period distribution attempted to correct for the effect of zidovudine treatment.
The median incubation period from HIV infection until development of AIDS is estimated at approximately 10 years for young adults.(11) The estimate varies with the age at which infection occurs and is significantly shorter in infants and in older adults and varies even between infection at age 20 and infection at age 40.(12) Whether the incubation period varies by mode of HIV acquisition has been more difficult to determine, but the preponderance of evidence now indicates that, after adjustment for age, the incubation period is similar in injecting drug users, those infected sexually, and hemophiliacs, whereas incubation time in transfusion recipients is shorter, probably because of the large HIV inoculum in infected blood transfusions. The incubation period does not appear to vary significantly in men and women or in different racial groups.
The estimate of 10 years for most adults proved to be much longer than the estimates of the incubation period published in the first years of the AIDS epidemic.(13,14) The earliest estimates were based on individuals who had been diagnosed with AIDS for whom a date of infection could be established. These data were most readily obtained from transfusion recipients or hemophiliacs. Because the data used were contingent on AIDS having appeared within a specified number of years since the beginning of the epidemic, the data sets were over-represented by persons with more rapidly progressing disease, and the incubation estimates were therefore contingent on AIDS developing within that number of years. This bias resulted in shorter estimates of the incubation period in the first published papers.
The incubation period distribution is most directly determined by longitudinal follow-up of a cohort of infected individuals with known dates of infection. Because the incubation period is so long, following all of a sizable cohort from infection to disease or death could take decades. Some cohorts have now been followed long enough to observe the time at which half of them have been diagnosed with AIDS, the median of the incubation distribution, but the shape of the distribution for the remaining half is still to be determined. In the cohort of men who participated in the hepatitis B vaccine trials in San Francisco mentioned previously, 51% had been diagnosed with AIDS at 10 years of follow-up.(15) The development of more effective antiviral therapies means that the characteristics of the full distribution will only be known as modified by treatment.
The incubation period distribution can be estimated more efficiently by statistical models, either parametric or nonparametric, but at the price of trusting the validity of the chosen model. Parametric estimates have usually used a Weibull or a gamma distribution. These distributions allow a changing rate of progression to AIDS over time, unlike an exponential distribution, which would imply a constant rate of progression (hazard). Data from prospective studies of seroconverters clearly support an increasing hazard for several years following infection.
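The contrast between a Weibull and an exponential hazard can be sketched numerically. The parameters below are illustrative only, not fitted to any cohort data; they were chosen so that the Weibull median falls near the 10-year figure discussed in the text.

```python
import math

def weibull_hazard(t, shape, scale):
    """Weibull hazard h(t) = (k/lam) * (t/lam)**(k - 1).
    A shape k > 1 gives a hazard that increases with time since
    infection; k = 1 reduces to the constant hazard of an
    exponential distribution."""
    return (shape / scale) * (t / scale) ** (shape - 1)

# Illustrative parameters only (not fitted to any cohort): shape 2.5
# and scale 12 years give a median incubation time near 10 years.
shape, scale = 2.5, 12.0

# Weibull median: lam * ln(2)**(1/k)
median = scale * math.log(2) ** (1 / shape)

# Hazard at 2, 5, and 8 years after infection rises steadily,
# unlike the flat hazard of an exponential model.
hazards = [weibull_hazard(t, shape, scale) for t in (2, 5, 8)]
```

With shape fixed at 1.0, the same function returns an identical hazard at every time point, which is why the exponential distribution cannot represent the increasing post-infection hazard the prospective data show.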
Direct estimates of the median time to AIDS in cohorts of homosexual men are in close agreement with estimates from statistical models. In 1989, a nonparametric estimate by Bacchetti and Moss from San Francisco data found a median of 9.8 years,(11) and several other estimates for homosexual men range between 8.6 and 10.2 years.(16-19) Brookmeyer and Goedert(20) gave an estimate of 10 years from data on hemophiliacs, similar to the estimates in homosexual men. Shorter incubation periods have been estimated for persons infected through blood transfusion and for pediatric patients.(21,22) For pediatric patients, Auger et al.(21) described an incubation distribution with rapid progression in the first 2 years followed by a slower rate of progression to AIDS, which increases again after 4 or 5 years.
Table 1 shows two estimates of the incubation period distribution, giving the cumulative annual probability of progressing to AIDS for the first 10 years after infection and, for homosexual men, the annual hazard or rate of progression to AIDS. The estimates of cumulative probabilities and the annual hazard for homosexual men are by Bacchetti and Moss,(11) and the estimates of cumulative probabilities for hemophiliacs are by Brookmeyer and Goedert.(20) Although there are minor differences, the two estimates are quite similar.
Rates of progression to AIDS are very low in the first 2 years after infection and increase thereafter. Although patients infected by transfusion, especially infants, have developed AIDS in the first year following infection, progression to clinical AIDS in healthy adults is rare within 2 years of seroconversion. The annual hazard estimates of Table 1 show progression rates increasing for the first 7 years, after which they level off or even drop slightly. These estimates are consistent with observed rates of annual progression to AIDS in a number of prospective studies from the United States and other countries.
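The relationship between the annual hazards and the cumulative probabilities in a table such as Table 1 is a simple product over yearly survival. The annual hazard values below are hypothetical, chosen only to reproduce the qualitative shape described above (near-zero risk in the first 2 years, rising through year 7, then leveling off), not the published estimates themselves.

```python
def cumulative_probability(annual_hazards):
    """Cumulative probability of AIDS by the end of each year, given
    annual hazards h_i: P(n) = 1 - prod_{i<=n}(1 - h_i)."""
    surviving = 1.0          # fraction still AIDS-free
    cumulative = []
    for h in annual_hazards:
        surviving *= (1.0 - h)
        cumulative.append(1.0 - surviving)
    return cumulative

# Hypothetical annual hazards (years 1-10 after infection) with the
# qualitative shape described in the text: near zero in years 1-2,
# rising through year 7, then leveling off.
hazards = [0.00, 0.01, 0.03, 0.06, 0.09, 0.12, 0.14, 0.14, 0.13, 0.13]
cum = cumulative_probability(hazards)
# cum[-1] is the cumulative probability of AIDS by year 10
```

With these illustrative hazards, just over half the cohort would reach AIDS by year 10, consistent in shape with the roughly 10-year median discussed earlier.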
|AIDS Survival Time|
The time from first diagnosis of AIDS to death has been characterized separately from the incubation time from infection to AIDS as AIDS survival time. Incubation time has been altered by the introduction of antiviral therapies and prophylaxis for OI, but AIDS survival time was affected by treatment earlier: antiviral medications were initially used primarily in persons with AIDS, and prophylaxis for OI is still used primarily among persons with AIDS. AIDS survival time is a mean or median value of survival times among the large number of disease diagnoses that define AIDS, so knowledge of median AIDS survival time may not be prognostic for survival in an individual patient. AIDS-defining diagnoses have a wide range of average survival times: in the Multicenter Hemophilia Cohort Study, median survival after a single AIDS-defining condition ranged from 3 to 51 months for the 10 most common conditions.(23) The addition of a CD4 lymphocyte count less than 200/µl as an AIDS-defining condition in 1993 further broadened the range of AIDS survival times, because most of the AIDS-defining disease diagnoses occur at lower CD4 lymphocyte counts. The time from HIV infection to a CD4 lymphocyte count less than 200/µl is on average nearly 2 years less than the time to manifestation of an AIDS-defining OI.(16,24) Unless specified otherwise, the data described subsequently apply to survival after AIDS-defining OI or neoplasms and do not include time from a CD4 lymphocyte count less than 200/µl.
The rate of long-term survival after an initial AIDS diagnosis has been very low. Some persons, nearly all with a diagnosis of Kaposi's sarcoma (KS), have survived for more than 5 years after diagnosis, but survival rates are significantly lower in patients with an OI or a neoplasm other than KS. Studies conducted early in the epidemic on persons diagnosed before 1986 showed a median survival time past an initial AIDS diagnosis of 10 to 13 months. An early study of the first 505 AIDS patients in San Francisco captured all reported cases from a city where surveillance was thought to be excellent (estimated at the time by the San Francisco Department of Health as 94% complete).(25) Mortality follow-up was obtained for 98% of these patients, so there was little possibility of bias from loss to follow-up. Of these 505 patients, 99% were homosexual or bisexual males. Overall median survival was 11 months. For patients with OIs, median survival was 9 months. Median survival after an initial KS diagnosis was 16 months. Survival time was significantly shorter for later KS cases, but no difference over time was seen in survival after OIs.
The San Francisco study provides a good estimate of survival among homosexual men prior to the advent of effective therapies for HIV infection and prophylaxis for OIs. The homogeneity of the population in the San Francisco study might raise doubts that its results can be extrapolated to other risk groups, but other survival studies conducted in the same time period that included other HIV transmission groups largely confirm the San Francisco estimates. An analysis of U.S. hemophilia cases showed a median survival of 11.7 months.(26) All cases in Australia diagnosed before July 1987 had a median survival of 10.4 months.(27) A study from Barcelona, Spain, showed a median survival of 12.7 months (including cases diagnosed in 1986 and 1987) and a longer survival in IDUs than in other risk groups.(28)
Age is a cofactor for shorter survival. In the San Francisco study of the first 505 cases, survival was significantly shorter among all patients over 40 years old. An association between older age and shorter survival was also observed in the study of U.S. hemophiliacs with AIDS, those aged 13 to 29 years having the longest survival and those over age 60 years having the shortest survival.
Improved median AIDS survival time was observed in diverse geographic locations in the mid-1980s to early 1990s, but long-term survival remained poor.(24,29) In two San Francisco cohorts of homosexual men, median survival improved from 14.7 months in the period from 1983 to 1986 to 19.1 months in the period from 1986 to 1989.(24) An analysis of survival in 36,847 U.S. AIDS cases associated with PCP (using the pre-1987 definition) diagnosed between January 1984 and September 1987 estimated 1-year survival for those diagnosed in 1984 and 1985 at 43% and for those diagnosed in 1986 and 1987 at 55%.(30) The increase was observed in homosexual men and IDUs, in both sexes, and in all racial groups.
Some of the early improvement in AIDS survival occurring before effective therapies were in use was probably due to artifacts, such as the earlier diagnosis of AIDS in the course of HIV disease resulting from greater awareness of symptoms and increased use of HIV testing ("lead-time" bias), and the tendency of patients with more rapid disease progression to be disproportionately represented among the early AIDS cases ("frailty selection").(31) These effects are hypothetical, but they may be significant in comparisons with historical survival data. The 1987 expansion of the AIDS case definition resulted in earlier AIDS diagnoses for those conditions not previously AIDS-defining and therefore in a shortening of the incubation time and a corresponding lengthening of survival time for some individuals. There is evidence that hospital experience with treating HIV disease has a significant effect on survival and could account for some improved survival in the pre-AZT era.(32)
Much of the increase in AIDS survival that was attributed at the time to the introduction of zidovudine may have been due to these factors and the impact of prophylaxis for PCP that began to be used widely shortly after AZT was licensed. Prophylaxis of PCP with trimethoprim-sulfamethoxazole, dapsone, or aerosolized pentamidine greatly reduced the frequency of PCP as a presenting AIDS diagnosis and increased survival time. Data from the early 1990s indicated that median survival for patients with PCP, the most frequent AIDS-indicator disease, had reached the range of 18 to 20 months.(24)
Fewer data are available concerning the time from a first CD4 lymphocyte count below 200/µl to death. An analysis of this interval among two cohorts of homosexual men in San Francisco found that the median survival time was 38 months and had increased by about 12 months over the median time in the period from 1983 to 1986.(24) Comparable estimates were reported from the Multicenter AIDS Cohort Study: 53% of subjects with a CD4 count in the range 101 to 200 cells/µl survived 30 months in the period from 1985 to 1988, and 71% in the period from 1989 to 1993.(29)
Median survival times reported from clinical trials have been 6 to 18 months longer than the times cited here from cohort studies of HIV-infected persons and from AIDS case registries. Clinical trial participants usually have to meet inclusion criteria that may tend to result in a healthier population than other HIV-infected persons with comparable CD4 lymphocyte counts, and trials necessarily do not include persons who may avoid treatment or may not even be aware of their HIV infection.
In 1996, combination antiretroviral therapy that includes use of a protease inhibitor was widely adopted. From 1996 to 1997, the percentage of HIV-infected persons receiving this therapy increased rapidly. The use of the new combination therapy is having a significant effect in lengthening survival time. The size of this latest treatment effect on survival is still uncertain, but surveillance data showing rapidly declining death rates suggest the short-term effect is dramatic (see Epidemiology of HIV/AIDS in the United States). More direct evidence of longer AIDS survival associated with use of combination antiretroviral therapy is emerging from longitudinal studies. In the Multicenter AIDS Cohort Study, the estimated median time from seroconversion to death for a person infected at age 30 was associated with use of combined therapy and increased nearly 2 years from the 1993 to 1995 period to the 1995 to 1997 period (11.4 years versus 13.3 years).(33) In Ontario province, Canada, where health care access is universal and all treatment is government subsidized, median survival time increased from 19 to 30 months among all AIDS patients over age 35 years with a CD4 lymphocyte count of less than 100 cells/µl and receiving antiretroviral therapy.(34) These early estimates of the impact of the new treatments include individuals diagnosed before they were available and individuals who had previously been treated with monotherapy and were therefore less likely to obtain the full benefit of combination therapy. Although it is still too early for more definitive quantitative estimates of the changes in AIDS survival time, they appear to be significantly greater than the previous increase from the early pre-antiviral treatment era of the epidemic.
|Laboratory Markers of Disease Progression|
The length of the AIDS incubation period means that laboratory tests to identify persons at high risk of disease progression are needed to guide clinical decisions in asymptomatic seropositive persons, such as when to begin antiviral therapy and prophylaxis against OI. Because depletion of CD4+ T lymphocytes is the hallmark and the apparent source of the central immune defect of HIV disease, determination of the CD4 lymphocyte count (or percentage) has been the most important laboratory marker of disease progression. Absolute CD4 lymphocyte count or percentage correlates strongly with AIDS-defining disease, has been included in the surveillance case definition of AIDS since 1993, and has been used to set indications for therapy. Figure 1 shows the median CD4 lymphocyte count declining from a normal value of around 1000 cells/µl to an AIDS-defining level of 200/µl in three groups of HIV-infected persons: seroconverters, HIV-positives without AIDS, and persons diagnosed with AIDS. As indicated in the first panel in Figure 1, there is an initial drop in CD4 lymphocyte count of 200 to 300 cells/µl in the first few months following infection, followed by a slower decline (Figure 1, panel 2), estimated variously at 50 to 90 cells/µl per year on average in asymptomatic seropositive persons, and an accelerated decline in the late stages of AIDS-defining disease (Figure 1, panel 3), although the evidence for an acceleration in late-stage disease is mixed.(35) There is a great deal of individual variability in this general pattern; persons progressing rapidly to disease and death lose CD4 lymphocytes at a much more rapid rate, and long-term nonprogressors retain near-normal counts. Attempts to fit parametric models to the rate of CD4 decline have found that, apart from the drop at seroconversion, it is well approximated by a linear model if the count is first transformed by taking the square root.(36,37)
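The square-root transformation mentioned above can be illustrated with a minimal least-squares sketch. The CD4 counts below are hypothetical, constructed only to mimic a post-seroconversion decline of roughly 60 cells/µl per year, within the 50 to 90 cells/µl range cited in the text; they are not data from any of the studies discussed.

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical post-seroconversion counts: a decline from about
# 700 cells/ul at roughly 60 cells/ul per year.
years = [0, 1, 2, 3, 4, 5, 6]
cd4 = [700, 640, 575, 515, 450, 390, 330]

# On the square-root scale the trajectory is close to linear, as the
# parametric modeling studies cited in the text found.
slope, intercept = fit_line(years, [math.sqrt(c) for c in cd4])

# Back-transform the fitted line to the original cell-count scale.
predicted = [(intercept + slope * t) ** 2 for t in years]
```

A straight line on the square-root scale corresponds to a gently accelerating decline on the raw-count scale, which is one reason this transform fits observed trajectories better than a linear model on untransformed counts.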
Tracking the course of the virus itself by accurate measurement of the quantity of viral RNA in the peripheral blood has become as important a laboratory marker as the CD4 lymphocyte count and is now the primary marker for antiretroviral treatment decisions. The measurement of the number of viral copies per milliliter of peripheral blood (commonly known as "viral load") has been made possible by the development of sensitive assays that amplify either the viral nucleic acid itself, using polymerase chain reaction (PCR) or nucleic acid sequence-based amplification (NASBA), or the detection signal, using branched DNA (bDNA) amplification. These assays can detect virus down to a few hundred copies per milliliter in the most commonly used tests and down to a few copies per milliliter in the latest ultrasensitive tests. It was known from earlier viral detection tests, such as assays for p24 antigen and quantitative assays for the antibodies to the p24 antigen, that measures of viral activity were strong predictors of AIDS in asymptomatic HIV-infected persons independent of the CD4 lymphocyte count.(7,38,39) The newer measures of viral quantity are even stronger predictors of disease and provide a clinically useful range of values that can monitor the effectiveness of antiviral therapies in controlling viral replication (Figure 2).
Their prognostic usefulness has been demonstrated in prospective studies by associating levels of viral quantity in peripheral blood and changes in viral quantity with subsequent development of AIDS and death.(40) The association has been shown in both recent seroconverters and asymptomatic HIV-infected persons and in subjects from different HIV transmission groups.(41) Plasma HIV RNA levels are orders of magnitude lower in long-term nonprogressors than in subjects with progressive disease.(4) Undetectable HIV RNA in peripheral blood is associated with stable CD4 lymphocyte counts and increases in HIV RNA correlate with rate of CD4 lymphocyte cell decline.(42,43) Peripheral blood viral load is changed by antiretroviral therapy, often dropping below the level of assay detection in persons who begin receiving combination therapy that includes a protease inhibitor.(44)
A number of studies have shown the prognostic strength of the association of HIV RNA quantity with risk of disease by examining viral quantity in peripheral blood in multivariate analysis with CD4 lymphocyte count and other laboratory tests previously associated with developing disease. In a study of 62 HIV-seroconverting homosexual men, 18 of whom developed AIDS during follow-up, plasma HIV RNA level was a stronger predictor of AIDS than CD4 lymphocyte count, beta2 microglobulin blood level, or neopterin blood level.(41) In a cohort of 165 hemophiliacs, serum HIV RNA level and CD4 lymphocyte count were independently associated with developing AIDS, and the highest relative hazard (6.4) was for HIV RNA > 10,000 copies/ml.(40)
The concentration of HIV in peripheral blood is very high during the primary infection/seroconversion phase, but it stabilizes shortly after seroconversion and changes relatively slowly thereafter. This observation has been made for the concentration of proviral DNA in peripheral blood(45) and for HIV RNA copy number in serum samples.(46) This finding suggests that host immune response to initial infection is critical in determining the extent of early viral dissemination and the subsequent course of HIV disease.
The result of these analyses has been to make HIV RNA blood levels the primary surrogate marker for the effectiveness of therapy in controlling HIV disease. Although most studies have measured HIV RNA in plasma, levels in serum have also been shown to have comparable prognostic usefulness, even though absolute values are lower in serum.(40) HIV RNA measured in either plasma or serum appears to yield equivalent biologic information.(47)
In addition to the CD4 lymphocyte count and the quantity of HIV in peripheral blood, other laboratory tests, primarily measures of generalized immune activation, have been shown to predict AIDS in asymptomatic HIV-infected persons. Serum levels of beta2 microglobulin, serum and urine levels of neopterin, soluble CD8, soluble interleukin-2 receptor, interferon-alpha, and serum levels of IgA predict development of AIDS according to a number of prospective studies.(38,48,49) Beta2 microglobulin and neopterin, which are the best studied, correlate highly with each other and are very strong predictors of the risk of AIDS, comparable to and independent of the CD4 lymphocyte count.(50) The combined strength as prognostic tests of HIV RNA blood level and CD4 lymphocyte count appears from the multivariate analyses cited above to have replaced the usefulness of these immune activation markers as independent predictors of subsequent disease course.
|Cofactors for Disease Progression|
Endogenous biologic or psychologic factors, other infections, behaviors, or other environmental factors that alter the natural history of HIV infection may be cofactors for disease progression. Understanding the cofactors for HIV disease may shed light on the pathogenesis of HIV and help identify potential interventions that could slow disease progression. Many potential cofactors for HIV have been investigated, including genetic factors, age, gender, route of HIV infection, drug use, smoking, nutrition, and other infectious diseases. To date, the only cofactors for which the evidence is strong are age and genetic differences in the chemokine receptors required by HIV to infect cells. Other genetic differences in HLA molecules, smoking, and nutrition may also be cofactors, but the data are less compelling for these associations. Individual AIDS-defining OI require infection with the agent that is the necessary cause of the opportunistic disease, and so in that sense such agents could be called cofactors, but here they are considered cofactors for HIV disease only if they accelerate the course of immunosuppression or shorten time to death.